Formative Assessment Overview: Specific Assessment Tools to Measure Student Literacy Skills and Behavior
Jim Wright, www.interventioncentral.org

Presentation Transcript


  1. Formative Assessment Overview: Specific Assessment Tools to Measure Student Literacy Skills and Behavior. Jim Wright, www.interventioncentral.org

  2. Effective Formative Evaluation: The Underlying Logic… • What is the relevant academic or behavioral outcome measure to be tracked? • Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students? • What method(s) should be used to measure the target academic skill or behavior? • What goal(s) are set for improvement? • How does the school check up on progress toward the goal(s)?

  3. Summative data is static information that provides a fixed ‘snapshot’ of the student’s academic performance or behaviors at a particular point in time. School records are one source of data that is often summative in nature—frequently referred to as archival data. Attendance data and office disciplinary referrals are two examples of archival records, data that is routinely collected on all students. In contrast to archival data, background information is collected specifically on the target student. Examples of background information are teacher interviews and student interest surveys, each of which can shed light on a student’s academic or behavioral strengths and weaknesses. Like archival data, background information is usually summative, providing a measurement of the student at a single point in time.

  4. Formative assessment measures are those that can be administered or collected frequently—for example, on a weekly or even daily basis. These measures provide a flow of regularly updated information (progress monitoring) about the student’s progress in the identified area(s) of academic or behavioral concern. Formative data provide a ‘moving picture’ of the student; the data unfold through time to tell the story of that student’s response to various classroom instructional and behavior management strategies. Examples of measures that provide formative data are Curriculum-Based Measurement probes in oral reading fluency and Daily Behavior Report Cards.

  5. Formative Assessment Defined “Formative assessment [in academics] refers to the gathering and use of information about students’ ongoing learning by both teachers and students to modify teaching and learning activities. …. Today…there are compelling research results indicating that the practice of formative assessment may be the most significant single factor in raising the academic achievement of all students—and especially that of lower-achieving students.” p. 7 Source: Harlen, W. (2003). Enhancing inquiry through formative assessment. San Francisco, CA: Exploratorium. Retrieved on September 17, 2008, from http://www.exploratorium.edu/ifi/resources/harlen_monograph.pdf

  6. Academic or Behavioral Targets Are Stated as ‘Replacement Behaviors’ “A problem solution is defined as one or more changes to the instruction, curriculum, or environment that function(s) to reduce or eliminate a problem.” p. 159 Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176).

  7. School Instructional Time: The Irreplaceable Resource “In the average school system, there are 330 minutes in the instructional day, 1,650 minutes in the instructional week, and 56,700 minutes in the instructional year. Except in unusual circumstances, these are the only minutes we have to provide effective services for students. The number of years we have to apply these minutes is fixed. Therefore, each minute counts and schools cannot afford to support inefficient models of service delivery.” p. 177 Source: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193).

  8. Formative Assessment: Essential Questions… 1. What is the relevant academic or behavioral outcome measure to be tracked? Problems identified for formative assessment should be: • Important to school stakeholders. • Measurable & observable. • Stated positively as ‘replacement behaviors’ or goal statements rather than as general negative concerns (Batsche et al., 2008). • Based on a minimum of inference (T. Christ, 2008). Sources: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193). Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176).

  9. Academic or Behavioral Targets Are Stated as ‘Replacement Behaviors’ “The implementation of successful interventions begins with accurate problem identification. Traditionally, the student problem was stated as a broad, general concern (e.g., impulsive, aggressive, reading below grade level) that a teacher identified. In a competency-based approach, however, the problem identification is stated in terms of the desired replacement behaviors that will increase the student’s probability of successful adaptation to the task demands of the academic setting.” p. 178 Source: Batsche, G. M., Castillo, J. M., Dixon, D. N., & Forde, S. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 177-193).

  10. Inference: Moving Beyond the Margins of the ‘Known’ “An inference is a tentative conclusion without direct or conclusive support from available data. All hypotheses are, by definition, inferences. It is critical that problem analysts make distinctions between what is known and what is inferred or hypothesized….Low-level inferences should be exhausted prior to the use of high-level inferences.” p. 161 Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176).

  11. Examples of High vs. Low Inference Hypotheses. Known: The results of grade-wide benchmarking in reading show that a target 2nd-grade student can read aloud at approximately half the rate of the median child in the grade. Low-Inference Hypothesis (stays close to what is known): The student needs to build reading fluency skills to become more proficient in decoding. High-Inference Hypothesis (reaches into the unknown): The student has an auditory processing issue that prevents success in reading; the student requires a multisensory approach to reading instruction to address reading deficits.

  12. Adopting a Low-Inference Model of Reading Skills • 5 Big Ideas in Beginning Reading • Phonemic Awareness • Alphabetic Principle • Fluency with Text • Vocabulary • Comprehension Source: Big ideas in beginning reading. University of Oregon. Retrieved September 23, 2007, from http://reading.uoregon.edu/index.php

  13. Formative Assessment: Essential Questions… 2. Is the focus the core curriculum or system, subgroups of underperforming learners, or individual struggling students? Apply the ‘80-15-5’ Rule (T. Christ, 2008): • If less than 80% of students are successfully meeting academic or behavioral goals, the formative assessment focus is on the core curriculum and general student population. • If no more than 15% of students are not successful in meeting academic or behavioral goals, the formative assessment focus is on small-group ‘treatments’ or interventions. • If no more than 5% of students are not successful in meeting academic or behavioral goals, the formative assessment focus is on the individual student. Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176).
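To make the ‘80-15-5’ decision rule concrete, here is a minimal sketch in Python. The function name, the use of a single "percent of students meeting goals" input, and the exact cut-points chosen to separate the overlapping ranges are illustrative assumptions, not part of Christ (2008).

```python
# A minimal sketch of the '80-15-5' rule, assuming the only input is the
# percentage of students currently meeting academic or behavioral goals.
# Function name and cut-points are illustrative, not from Christ (2008).

def formative_assessment_focus(percent_meeting_goals: float) -> str:
    """Suggest where formative assessment should focus, given the percentage
    (0-100) of students successfully meeting goals."""
    if percent_meeting_goals < 80:
        # Fewer than 80% successful: examine the core curriculum and system.
        return "core curriculum and general student population"
    elif percent_meeting_goals < 95:
        # Roughly 5-20% of students unsuccessful: small-group 'treatments' or interventions.
        return "small-group interventions"
    else:
        # No more than about 5% unsuccessful: individual struggling students.
        return "individual struggling students"

print(formative_assessment_focus(72))  # core curriculum and general student population
print(formative_assessment_focus(88))  # small-group interventions
print(formative_assessment_focus(97))  # individual struggling students
```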

  14. Using Local Norms in Coordination with Benchmark Data

  15. LOCAL NORMS EXAMPLE: Baylor Elementary School, Grade 4 Norms, Correctly Read Words Per Minute (Book 4-1), Sample Size: 23 Students. Raw data: 31 34 34 39 41 43 52 55 59 61 68 71 74 75 85 89 102 108 112 115 118 118 131. Twenty-three 4th-grade students were administered oral reading fluency Curriculum-Based Measurement passages at the 4th-grade level in their school. • In their current number form, these data are not easy to interpret. • So the school converts them into a visual display, a box-plot, to show the distribution of scores and to convert the scores to percentile form. • When Billy, a struggling reader, is screened in CBM reading fluency, he shows a SIGNIFICANT skill gap when compared to his grade peers.

  16. Group Norms: Converted to Box-Plot. Baylor Elementary School, Grade 4 Norms, Correctly Read Words Per Minute (Book 4-1), January benchmarking, Sample Size: 23 Students. [Box-plot, scale 0-160 correctly read words: Low Value = 31, 1st Quartile = 43, Median (2nd Quartile) = 71, 3rd Quartile = 108, Hi Value = 131; Billy = 19; National Reading Norm = 112 CRW per minute.] Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement (Technical report #33). Eugene, OR: University of Oregon.
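The arithmetic behind the box-plot can be reproduced with the Python standard library alone. The sketch below uses the 23 raw scores from the previous slide, recovers the quartile values shown above, and places Billy within the local distribution; the variable names and the percentile-rank convention are illustrative choices.

```python
# A minimal sketch of summarizing the local norms above and locating a target
# student within them. The percentile-rank convention (percent of norm-group
# scores at or below the student's score) is one common choice among several.
import statistics

scores = [31, 34, 34, 39, 41, 43, 52, 55, 59, 61, 68, 71, 74, 75,
          85, 89, 102, 108, 112, 115, 118, 118, 131]  # 23 grade-4 students, CRW/min
billy = 19

q1, median, q3 = statistics.quantiles(scores, n=4)  # default 'exclusive' method
print(f"Low={min(scores)}  Q1={q1}  Median={median}  Q3={q3}  High={max(scores)}")
# Low=31  Q1=43.0  Median=71.0  Q3=108.0  High=131

pct_rank = 100 * sum(s <= billy for s in scores) / len(scores)
print(f"Billy: {billy} CRW/min, local percentile rank = {pct_rank:.0f}")
# Billy: 19 CRW/min, local percentile rank = 0 (below every student in the norm group)
```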

  17. Team Activity: Formative Assessment and Your Schools. At your tables, discuss: • What kinds of formative measures your schools tend to collect most often. • How ‘ready’ your schools are to collect, interpret, and act on formative assessment data.

  18. Formative Assessment: Essential Questions… 3. What method(s) should be used to measure the target academic skill or behavior? Formative assessment methods should be as direct a measure as possible of the problem or issue being evaluated. These assessment methods can: • Consist of General Outcome Measures or Specific Sub-Skill Mastery Measures • Include existing (‘extant’) data from the school system Curriculum-Based Measurement (CBM) is widely used to track basic student academic skills. Daily Behavior Report Cards (DBRCs) are increasingly used as one source of formative behavioral data. Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

  19. Making Use of Existing (‘Extant’) Data

  20. Extant (Existing) Data (Chafouleas et al., 2007) • Definition: Information that is collected by schools as a matter of course. • Extant data comes in two forms: • Performance summaries (e.g., class grades, teacher summary comments on report cards, state test scores). • Student work products (e.g., research papers, math homework, PowerPoint presentation). Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

  21. Advantages of Using Extant Data (Chafouleas et al., 2007) • The information already exists and is easy to access. • Students will not show ‘reactive’ effects when data is collected, because the information collected is part of the normal routine of schools. • Extant data is ‘relevant’ to school data consumers (such as classroom teachers, administrators, and members of problem-solving teams). Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

  22. Drawbacks of Using Extant Data (Chafouleas et al., 2007) • Time is required to collate and summarize the data (e.g., summarizing a week’s worth of disciplinary office referrals). • The data may be limited and not reveal the full dimension of the student’s presenting problem(s). • There is no guarantee that school staff are consistent and accurate in how they collect the data (e.g., grading policies can vary across classrooms; instructors may have differing expectations regarding what types of assignments are given a formal grade; standards may fluctuate across teachers for filling out disciplinary referrals). • Little research has been done on the ‘psychometric adequacy’ of extant data sources. Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

  23. Grades as a Classroom-Based ‘Pulse’ Measure of Academic Performance

  24. Grades & Other Teacher Performance Summary Data (Chafouleas et al., 2007) • Teacher test and quiz grades can be useful as a supplemental method for monitoring the impact of student behavioral interventions. • Other data about student academic performance (e.g., homework completion, homework grades, etc.) can also be tracked and graphed to judge intervention effectiveness. Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.

  25. [Chart: performance summary data for student Marc Ripley plotted across grading intervals: 2-Wk 9/23/07, 4-Wk 10/07/07, 6-Wk 10/21/07, 8-Wk 11/03/07, 10-Wk 11/20/07, 12-Wk 12/05/07 (from Chafouleas et al., 2007).] Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press.
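As a sketch of how such a grade ‘pulse’ chart could be built, the snippet below plots quiz/test averages across the same six grading intervals. The numbers, the variable names, and the intervention start point are hypothetical placeholders, not Marc Ripley’s actual data.

```python
# A minimal sketch of charting report-period grades to judge intervention
# response. All values are hypothetical placeholders; requires matplotlib.
import matplotlib.pyplot as plt

intervals = ["2-Wk", "4-Wk", "6-Wk", "8-Wk", "10-Wk", "12-Wk"]
quiz_avg = [58, 55, 62, 70, 74, 79]   # hypothetical percent scores
intervention_start = 2                # index of the first post-intervention interval

plt.plot(intervals, quiz_avg, marker="o")
plt.axvline(intervention_start - 0.5, linestyle="--", label="Intervention begins")
plt.xlabel("Grading interval")
plt.ylabel("Quiz/test average (%)")
plt.title("Classroom grades as a formative 'pulse' measure")
plt.legend()
plt.show()
```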

  26. Online Grading Systems

  27. Academic Measures Can Serve As Indicators of Improved Student Behavior Academic measures (e.g., grades, CBM data) can be useful as part of the progress-monitoring ‘portfolio’ of data collected on a student because: • Students with problem behaviors often struggle academically, so tracking academics as a target is justified in its own right. • Improved academic performance generally correlates with reduced behavioral problems. • Individualized interventions for misbehaving students frequently contain academic components (as the behavior problems can emerge in response to chronic academic deficits). Academic progress-monitoring data helps the school to track the effectiveness of the academic interventions.

  28. Curriculum-Based Measurement: Assessing Basic Academic Skills

  29. Curriculum-Based Assessment: Advantages Over Commercial, Norm-Referenced Achievement Tests

  30. Commercial Tests: Limitations • Compare child to ‘national’ average rather than to class or school peers • Have unknown overlap with student curriculum, classroom content • Can be given only infrequently • Are not sensitive to short-term student gains in academic skills

  31. Curriculum-Based Evaluation

  32. Curriculum-Based Evaluation: Definition “Whereas standardized commercial achievement tests measure broad curriculum areas and/or skills, CBE measures specific skills that are presently being taught in the classroom, usually in basic skills. Several approaches to CBE have been developed. Four common characteristics exist across these models: • The measurement procedures assess students directly using the materials in which they are being instructed. This involves sampling items from the curriculum. • Administration of each measure is generally brief in duration (typically 1-5 mins.) • The design is structured such that frequent and repeated measurement is possible and measures are sensitive to change. • Data are usually displayed graphically to allow monitoring of student performance.” SOURCE: CAST Website: http://www.cast.org/publications/ncac/ncac_curriculumbe.html

  33. SOURCE: CAST Website: http://www.cast.org/publications/ncac/ncac_curriculumbe.html

  34. Curriculum-Based Measurement/Assessment: Defining Characteristics • Assesses preselected objectives from local curriculum • Has standardized directions for administration • Is timed, yielding fluency, accuracy scores • Uses objective, standardized, ‘quick’ guidelines for scoring • Permits charting and teacher feedback Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf
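The fluency and accuracy scores that timed CBM probes yield come from simple arithmetic. The sketch below assumes a read-aloud probe recorded as words attempted, errors, and elapsed seconds; the function name and the input values are illustrative.

```python
# A minimal sketch of the standard CBM oral-reading-fluency arithmetic:
# words read correctly per minute (WCPM) and percent accuracy from a timed probe.
# Input values are illustrative.

def orf_scores(words_attempted: int, errors: int, seconds: float) -> tuple[float, float]:
    """Return (words correct per minute, percent accuracy) for a timed read-aloud."""
    words_correct = words_attempted - errors
    wcpm = words_correct * 60 / seconds
    accuracy = 100 * words_correct / words_attempted
    return wcpm, accuracy

wcpm, accuracy = orf_scores(words_attempted=62, errors=5, seconds=60)
print(f"{wcpm:.0f} words correct per minute, {accuracy:.0f}% accuracy")
# 57 words correct per minute, 92% accuracy
```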

  35. CBM Student Reading Samples: What Difference Does Fluency Make? • 3rd Grade: 19 Words Per Minute • 3rd Grade: 70 Words Per Minute • 3rd Grade: 98 Words Per Minute

  36. CBM Techniques have been developed to assess: • Reading fluency • Reading comprehension • Math computation • Writing • Spelling • Phonemic awareness skills • Early math skills

  37. Measuring General vs. Specific Academic Outcomes • General Outcome Measures: Track the student’s increasing proficiency on general curriculum goals such as reading fluency. An example is CBM-Oral Reading Fluency (Hintze et al., 2006). • Specific Sub-Skill Mastery Measures: Track short-term student academic progress with clear criteria for mastery (Burns & Gibbons, 2008). An example is Letter Identification. Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge. Hintze, J. M., Christ, T. J., & Methe, S. A. (2006). Curriculum-based assessment. Psychology in the Schools, 43, 45-56.

  38. Example of Curriculum-Based Assessment Reading Probe

  39. DIBELS Reading Probe: Level 2.1

  40. 57 WPM

  41. Assessing Basic Academic Skills: Curriculum-Based Measurement Reading: These 3 measures all proved ‘adequate predictors’ of student performance on content-area reading tasks: • Reading aloud (Oral Reading Fluency): Passages from content-area texts: 1 minute. • Maze task (every 7th word replaced with a multiple-choice item: the correct word plus 2 distracters): Passages from content-area texts: 2 minutes. • Vocabulary matching: 10 vocabulary items and 12 definitions (including 2 distracters): 10 minutes. Source: Espin, C. A., & Tindal, G. (1998). Curriculum-based measurement for secondary students. In M. R. Shinn (Ed.), Advanced applications of curriculum-based measurement. New York: Guilford Press.
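A maze probe of the kind described above can be generated mechanically. The sketch below replaces every 7th word of a passage with a choice set made up of the correct word plus two distracters; the passage, the distracter pool, and the function name are illustrative, and real probes would control distracter difficulty more carefully.

```python
# A minimal sketch of constructing a maze passage: every 7th word becomes a
# multiple-choice item containing the correct word and two distracters.
# Passage and distracter pool are illustrative placeholders.
import random

def build_maze(passage: str, distracter_pool: list[str], every: int = 7) -> str:
    words = passage.split()
    out = []
    for i, word in enumerate(words, start=1):
        if i % every == 0:
            choices = [word] + random.sample(distracter_pool, 2)
            random.shuffle(choices)
            out.append("(" + " / ".join(choices) + ")")
        else:
            out.append(word)
    return " ".join(out)

passage = ("The class walked to the pond behind the school to collect water "
           "samples for the science fair project they had planned all month")
print(build_maze(passage, ["chair", "quickly", "seven", "blue", "under", "song"]))
```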

  42. Assessing Basic Academic Skills: Curriculum-Based Measurement Writing: CBM/ Word Sequence is a ‘valid indicator of general writing proficiency’. It evaluates units of writing and their relation to one another. Successive pairs of ‘writing units’ make up each word sequence. The mechanics and conventions of each word sequence must be correct for the student to receive credit for that sequence. CBM/ Word Sequence is the most comprehensive CBM writing measure. Source: Espin, C. A., & Tindal, G. (1998). Curriculum-based measurement for secondary students. In M. R. Shinn (Ed.) Advanced applications of curriculum-based measurement. New York: Guilford Press.
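Scoring word sequences ultimately rests on a human judgment about each ‘writing unit,’ so only the tallying step lends itself to code. In the sketch below, the scorer supplies a True/False flag per unit (correct spelling, usage, and mechanics) and the code counts adjacent pairs in which both units are correct; scoring conventions at the sentence margins are omitted for simplicity.

```python
# A minimal sketch of tallying correct word sequences from per-unit scorer
# judgments. Each flag marks whether the scorer judged that writing unit
# correct in spelling, usage, and mechanics.

def correct_word_sequences(unit_correct: list[bool]) -> int:
    """Count adjacent pairs of writing units in which both members are correct."""
    return sum(a and b for a, b in zip(unit_correct, unit_correct[1:]))

# Example: a 7-unit sentence in which the scorer marked units 3 and 6 incorrect.
flags = [True, True, False, True, True, False, True]
print(correct_word_sequences(flags))  # 2 correct word sequences
```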

  43. Curriculum-Based Evaluation: Math Vocabulary. Format Option 1: 20 vocabulary terms appear alphabetically in the right column; items are drawn randomly from a ‘vocabulary pool.’ Randomly arranged definitions appear in the left column. The student writes the letter of the correct term next to each matching definition. The student receives 1 point for each correct response. Each probe lasts 5 minutes. 2-3 probes are given in a session. Source: Howell, K. W. (2008). Best practices in curriculum-based evaluation and advanced reading. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 397-418).

  44. Curriculum-Based Evaluation: Math Vocabulary. Format Option 2: 20 randomly arranged vocabulary definitions appear in the right column; items are drawn randomly from a ‘vocabulary pool.’ The student writes the name of the correct term next to each matching definition. The student is given 0.5 point for each correct term and another 0.5 point if the term is spelled correctly. Each probe lasts 5 minutes. 2-3 probes are given in a session. Source: Howell, K. W. (2008). Best practices in curriculum-based evaluation and advanced reading. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 397-418).
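Scoring the two probe formats is straightforward arithmetic. The sketch below assumes Format 1 responses are recorded as the letters the student wrote and Format 2 responses as per-item scorer judgments (correct term? spelled correctly?); the data structures, names, and example values are illustrative.

```python
# A minimal sketch of scoring the two math-vocabulary probe formats above.
# Answer keys, responses, and scorer judgments are illustrative.

def score_format1(student_letters: list[str], answer_letters: list[str]) -> int:
    """Format 1: 1 point for each definition matched with the correct term's letter."""
    return sum(s == a for s, a in zip(student_letters, answer_letters))

def score_format2(item_flags: list[tuple[bool, bool]]) -> float:
    """Format 2: each item is (correct term?, spelled correctly?); 0.5 point for the
    correct term plus another 0.5 if it is also spelled correctly."""
    return sum(0.5 * term + 0.5 * (term and spelled) for term, spelled in item_flags)

print(score_format1(["B", "D", "A", "C"], ["B", "C", "A", "C"]))      # 3 points
print(score_format2([(True, True), (True, False), (False, False)]))  # 1.5 points
```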

  45. Monitoring Student Academic Behaviors: Daily Behavior Report Cards

  46. Daily Behavior Report Cards (DBRCs) Are… • brief forms containing student behavior-rating items. The teacher typically rates the student daily (or even more frequently) on the DBRC. The results can be graphed to document student response to an intervention.
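As a sketch of how DBRC ratings might be summarized for graphing, the snippet below converts one week of daily item ratings into a percentage of possible points per day. The items, the 0-3 rating scale, and the ratings themselves are assumptions for illustration, since actual DBRC forms vary.

```python
# A minimal sketch of summarizing a Daily Behavior Report Card: each day's item
# ratings are converted to a percent of possible points so days can be charted
# on one scale. Items, scale, and ratings are hypothetical.

ITEMS = ["On-task behavior", "Work completion", "Complies with adult requests"]
MAX_PER_ITEM = 3  # e.g., 0 = never ... 3 = consistently

def daily_percent(ratings: list[int]) -> float:
    """Convert one day's item ratings (in ITEMS order) into a percent of possible points."""
    return 100 * sum(ratings) / (MAX_PER_ITEM * len(ratings))

week = {"Mon": [1, 1, 2], "Tue": [2, 1, 2], "Wed": [2, 2, 3],
        "Thu": [3, 2, 3], "Fri": [3, 3, 3]}
for day, ratings in week.items():
    print(f"{day}: {daily_percent(ratings):.0f}% of possible points")
```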

  47. Daily Behavior Report Cards Can Monitor… • Hyperactivity • On-Task Behavior (Attention) • Work Completion • Organization Skills • Compliance With Adult Requests • Ability to Interact Appropriately With Peers

  48. Daily Behavior Report Card: Daily Version [Sample card for student Jim Blalock, May 5, Mrs. Williams, Rm 108.]

  49. Daily Behavior Report Card: Weekly Version [Sample card for student Jim Blalock, Mrs. Williams, Rm 108, covering 05/05/07 through 05/09/07, with charted daily values of 40, 0, 60, 60, and 50.]
