
Monitoring Student Academics: Curriculum-Based Measurement


Presentation Transcript


  1. ‘But Is This Student Benefiting From Special Education Services?’: Reliable Methods to Measure Student Academic Progress. Jim Wright, www.interventioncentral.org

  2. Monitoring Student Academics: Curriculum-Based Measurement

  3. Models in Reading & Math
  5 Strands of Mathematical Proficiency: • Understanding • Computing • Applying • Reasoning • Engagement
  5 Big Ideas in Beginning Reading: • Phonemic Awareness • Alphabetic Principle • Fluency with Text • Vocabulary • Comprehension
  Source: Big ideas in beginning reading. University of Oregon. Retrieved September 23, 2007, from http://reading.uoregon.edu/index.php
  Source: National Research Council. (2002). Helping children learn mathematics. Mathematics Learning Study Committee, J. Kilpatrick & J. Swafford, Editors, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.

  4. Curriculum-Based Evaluation: Definition “Whereas standardized commercial achievement tests measure broad curriculum areas and/or skills, CBE measures specific skills that are presently being taught in the classroom, usually in basic skills. Several approaches to CBE have been developed. Four common characteristics exist across these models: • The measurement procedures assess students directly using the materials in which they are being instructed. This involves sampling items from the curriculum. • Administration of each measure is generally brief in duration (typically 1-5 mins.) • The design is structured such that frequent and repeated measurement is possible and measures are sensitive to change. • Data are usually displayed graphically to allow monitoring of student performance.” SOURCE: CAST Website: http://www.cast.org/publications/ncac/ncac_curriculumbe.html

  5. SOURCE: CAST Website: http://www.cast.org/publications/ncac/ncac_curriculumbe.html

  6. Curriculum-Based Measurement: Advantages as a Set of Tools to Monitor RTI/Academic Cases • Aligns with curriculum goals and materials • Is reliable and valid (has ‘technical adequacy’) • Is criterion-referenced: sets specific performance levels for specific tasks • Uses standard procedures to prepare materials, administer, and score • Samples student performance directly to give objective, observable ‘low-inference’ information • Has decision rules to help educators interpret student data and make appropriate instructional decisions • Is efficient to implement in schools (e.g., training can be done quickly; the measures are brief and feasible for classrooms) • Provides data that can be converted into visual displays for ease of communication Source: Hosp, M.K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM. New York: Guilford.

  7. Standards for Judging Academic Measures for RTI (National Center for Student Progress Monitoring)

  8. RTI: Assessment & Progress-Monitoring To measure student ‘response to instruction/intervention’ effectively, the RTI Literacy model measures students’ reading performance and progress on schedules matched to each student’s risk profile and intervention Tier membership. • Benchmarking/Universal Screening. All children in a grade level are assessed at least 3 times per year on a common collection of literacy assessments. • Strategic Monitoring. Students placed in Tier 2 (supplemental) reading groups are assessed 1-2 times per month to gauge their progress with this intervention. • Intensive Monitoring. Students who participate in an intensive, individualized Tier 3 reading intervention are assessed at least once per week. Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.
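
The monitoring schedules above map cleanly onto a tier lookup. A minimal sketch in Python, assuming a 10-month school year; the function and field names are illustrative and not drawn from any published RTI tool:

```python
# Minimal sketch: mapping RTI tiers to the monitoring schedules described
# above. Names and the 10-month year assumption are illustrative only.

from dataclasses import dataclass

@dataclass
class MonitoringPlan:
    label: str
    assessments_per_month: float  # approximate frequency

def monitoring_plan(tier: int) -> MonitoringPlan:
    """Return the approximate CBM assessment schedule for an RTI tier."""
    if tier == 1:
        # Benchmarking/universal screening: all students, ~3 times per year.
        return MonitoringPlan("Benchmarking/Universal Screening", 3 / 10)
    if tier == 2:
        # Strategic monitoring: 1-2 times per month.
        return MonitoringPlan("Strategic Monitoring", 1.5)
    if tier == 3:
        # Intensive monitoring: at least weekly (~4 times per month).
        return MonitoringPlan("Intensive Monitoring", 4)
    raise ValueError(f"Unknown tier: {tier}")

if __name__ == "__main__":
    for t in (1, 2, 3):
        plan = monitoring_plan(t)
        print(f"Tier {t}: {plan.label} (~{plan.assessments_per_month:g} probes/month)")
```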

  9. Evaluating the ‘RTI Readiness’ of School Assessments. Jim Wright, www.interventioncentral.org

  10. Interpreting the Results of This Survey… • YES to Items 1-3. Background. The measure gives valid general information about the student’s academic skills and performance. While not sufficient, the data can be interpreted as part of a larger collection of student data. • YES to Items 4-5. Baseline. The measure gives reliable results when given by different people and at different times of the day or week. Therefore, the measure can be used to collect a current ‘snapshot’ of the student’s academic skills prior to starting an intervention. • YES to Items 6-7. Goal-Setting. The measure includes standards (e.g., benchmarks or performance criteria) for ‘typical’ student performance (e.g., at a given grade level) and guidelines for estimating rates of student progress. Schools can use the measure to assess the gap in performance between a student and grade level peers—and also to estimate expected rates of student progress during an intervention. • YES to Items 8-11. Progress Monitoring. The measure has the appropriate qualities to be used to track student progress in response to an intervention.
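
A minimal sketch of how the YES/NO survey responses group into the four readiness areas described above; the item ranges follow the slide, while the function name and sample answers are invented for illustration:

```python
# Minimal sketch: grouping the survey's YES/NO answers into the four
# 'RTI readiness' areas described above. Item numbers follow the slide.

READINESS_AREAS = {
    "Background":          range(1, 4),    # items 1-3
    "Baseline":            range(4, 6),    # items 4-5
    "Goal-Setting":        range(6, 8),    # items 6-7
    "Progress Monitoring": range(8, 12),   # items 8-11
}

def readiness_summary(answers: dict[int, bool]) -> dict[str, bool]:
    """An area counts as 'ready' only if every item in its range was answered YES."""
    return {
        area: all(answers.get(item, False) for item in items)
        for area, items in READINESS_AREAS.items()
    }

# Example: YES on items 1-8, 10, 11, but NO on item 9.
answers = {i: True for i in range(1, 12)} | {9: False}
print(readiness_summary(answers))
# Background, Baseline, and Goal-Setting are True; Progress Monitoring is False.
```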

  11. Background: Validity • Content Validity. Does the measure provide meaningful information about the academic skill of interest? • Convergent Validity. Does the measure yield results that are generally consistent with other well-regarded tests designed to measure the same academic skill? • Predictive Validity. Does the measure predict student success on an important future test, task, or other outcome?

  12. Baseline: Reliability • Test-Retest/Alternate-Form Reliability. Does the measure have more than one version or form? If two alternate, functionally equivalent versions of the measure are administered to the student, does the student perform about the same on both? • Interrater Reliability. When two different evaluators observe the same student’s performance and independently use the measure to rate that performance, do they come up with similar ratings?
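
Both reliability checks above boil down to asking whether two sets of paired scores agree. A minimal sketch using the standard library (Python 3.10+); the score data are invented for illustration, and high correlations would only suggest, not prove, adequate reliability:

```python
# Minimal sketch: estimating the two reliability checks named above as
# correlations between paired scores. Sample data are invented.

from statistics import correlation  # Python 3.10+

# Alternate-form reliability: the same students take two equivalent forms.
form_a = [42, 55, 61, 70, 88, 95]
form_b = [45, 52, 64, 73, 85, 99]
print(f"Alternate-form reliability: r = {correlation(form_a, form_b):.2f}")

# Interrater reliability: two evaluators independently score the same
# student performances.
rater_1 = [71, 75, 79, 82, 90]
rater_2 = [70, 76, 78, 84, 91]
print(f"Interrater reliability: r = {correlation(rater_1, rater_2):.2f}")
```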

  13. Benchmarks & Goal-Setting • Performance Benchmarks. Does the measure include benchmarks or other performance criteria that indicate typical or expected student performance in the academic skill? • Goal-Setting. Does the measure include guidelines for setting specific goals for improvement?

  14. Progress-Monitoring and Instructional Impact • Repeated Assessments. Does the measure have sufficient alternative forms to assess the student weekly for at least 20 weeks? • Equivalent Alternate Forms. Are the measure’s repeated assessments (alternative forms) equivalent in content and level of difficulty? • Sensitive to Short-Term Student Gains. Is the measure sensitive to short-term improvements in student academic performance? • Positive Impact on Learning. Does research show that the measure gives teachers information that helps them to make instructional decisions that positively impact student learning?

  15. Team Activity: Evaluating the ‘RTI Readiness’ of School Assessments • At your table: • Review the handout Evaluate the RTI Readiness of Your School’s Academic Measures. • Discuss how your school or district might use such a form to evaluate common classroom academic measures.

  16. Example of Curriculum-Based Assessment Reading Probe

  17. DIBELS Reading Probe: Benchmark 2.1

  18. 57 WPM

  19. CBM Student Reading Samples: What Difference Does Fluency Make? • 3rd Grade: 19 Words Per Minute • 3rd Grade: 70 Words Per Minute • 3rd Grade: 98 Words Per Minute

  20. Assessing Basic Academic Skills: Curriculum-Based Measurement Reading: These 3 measures all proved ‘adequate predictors’ of student performance on reading content tasks: • Reading aloud (Oral Reading Fluency): Passages from content-area texts: 1 minute. • Maze task (every 7th word replaced with a multiple-choice item: the correct word plus 2 distracters): Passages from content-area texts: 2 minutes. • Vocabulary matching: 10 vocabulary items and 12 definitions (including 2 distracters): 10 minutes. Source: Espin, C. A., & Tindal, G. (1998). Curriculum-based measurement for secondary students. In M. R. Shinn (Ed.), Advanced applications of curriculum-based measurement. New York: Guilford Press.
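
The maze construction rule above is mechanical enough to sketch in code. A minimal illustration, assuming distracters are simply drawn from other words in the passage; published maze procedures use more careful distracter-selection rules:

```python
# Minimal sketch of building a maze-task probe as described above: every
# 7th word is replaced with a bracketed choice of the correct word plus
# two distracters taken from elsewhere in the passage.

import random

def make_maze(passage: str, every: int = 7, seed: int = 0) -> str:
    rng = random.Random(seed)
    words = passage.split()
    out = []
    for i, word in enumerate(words, start=1):
        if i % every == 0:
            # Pick two distracters that differ from the correct word.
            pool = sorted({w.lower() for w in words} - {word.lower()})
            choices = [word] + rng.sample(pool, 2)
            rng.shuffle(choices)
            out.append("[" + " / ".join(choices) + "]")
        else:
            out.append(word)
    return " ".join(out)

sample = ("The class walked to the library after lunch and every student "
          "chose one new book to read quietly at the back table")
print(make_maze(sample))
```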

  21. Assessing Basic Academic Skills: Curriculum-Based Measurement Mathematics: Single-skill basic arithmetic combinations are an ‘adequate measure of performance’ for low-achieving middle school students. Websites to create CBM math computation probes: • www.interventioncentral.org • www.superkids.com Source: Espin, C. A., & Tindal, G. (1998). Curriculum-based measurement for secondary students. In M. R. Shinn (Ed.), Advanced applications of curriculum-based measurement. New York: Guilford Press.
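
A minimal sketch of the kind of single-skill computation probe those websites generate, here limited to 2-digit addition; the item count, layout, and skill choice are illustrative only, and real CBM probes follow standardized formats:

```python
# Minimal sketch of a single-skill math computation probe generator.

import random

def single_skill_probe(n_items: int = 20, seed: int = 1) -> list[str]:
    """Generate one single-skill probe: 2-digit plus 2-digit addition."""
    rng = random.Random(seed)
    items = []
    for _ in range(n_items):
        a, b = rng.randint(10, 99), rng.randint(10, 99)
        items.append(f"{a} + {b} = ____")
    return items

for row in single_skill_probe():
    print(row)
```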

  22. Assessing Basic Academic Skills: Curriculum-Based Measurement Writing: CBM/Word Sequence is a ‘valid indicator of general writing proficiency’. It evaluates units of writing and their relation to one another. Successive pairs of ‘writing units’ make up each word sequence. The mechanics and conventions of each word sequence must be correct for the student to receive credit for that sequence. CBM/Word Sequence is the most comprehensive CBM writing measure. Source: Espin, C. A., & Tindal, G. (1998). Curriculum-based measurement for secondary students. In M. R. Shinn (Ed.), Advanced applications of curriculum-based measurement. New York: Guilford Press.
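
A deliberately simplified sketch of that pairwise scoring idea: it credits each adjacent pair of words only when both are on a small spelling list, whereas a trained scorer also judges the grammar and mechanics of every pair:

```python
# Simplified sketch of Correct Word Sequence (CWS) style scoring. Real CWS
# scoring credits each adjacent pair of writing units only when spelling,
# grammar, and mechanics are all acceptable; this sketch checks spelling
# only, against a small word list supplied by the caller.

def correct_word_sequences(sample: str, known_words: set[str]) -> int:
    words = sample.lower().replace(".", "").replace(",", "").split()
    ok = [w in known_words for w in words]
    # A sequence is a pair of adjacent units; credit it when both are correct.
    return sum(1 for left, right in zip(ok, ok[1:]) if left and right)

known = {"the", "dog", "ran", "fast", "down", "road"}
# The misspelling "rann" breaks the two sequences it touches: score 4, not 6.
print(correct_word_sequences("The dog rann fast down the road.", known))
```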

  23. Example: Using Local Reading Norms in Coordination with Research Norms

  24. LOCAL NORMS EXAMPLE: Twenty-three 4th-grade students were administered oral reading fluency Curriculum-Based Measurement passages at the 4th-grade level in their school. Baylor Elementary School Grade Norms: Correctly Read Words Per Min, Book 4-1 (Sample Size: 23 Students). Raw Data: 31 34 34 39 41 43 52 55 59 61 68 71 74 75 85 89 102 108 112 115 118 118 131 • In their current number form, these data are not easy to interpret. • So the school converts them into a visual display (a box-plot) to show the distribution of scores and to convert the scores to percentile form. • When Billy, a struggling reader, is screened in CBM reading fluency, he shows a SIGNIFICANT skill gap when compared to his grade peers.
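
A minimal sketch of that conversion, using the 23 raw scores from the slide; the quartiles are computed as the medians of the lower and upper halves, which is the convention that reproduces the box-plot values on the next slide (other quartile conventions give slightly different numbers):

```python
# Minimal sketch: summarizing the local norm group as box-plot values and
# converting a struggling reader's score to a local percentile.

from statistics import median

scores = sorted([31, 34, 34, 39, 41, 43, 52, 55, 59, 61, 68, 71, 74,
                 75, 85, 89, 102, 108, 112, 115, 118, 118, 131])

n = len(scores)
med = median(scores)
q1 = median(scores[: n // 2])        # median of the lower half
q3 = median(scores[(n + 1) // 2:])   # median of the upper half

def local_percentile(score: float, norms: list[int]) -> float:
    """Percent of the local norm group scoring at or below this score."""
    return 100 * sum(s <= score for s in norms) / len(norms)

print(f"Q1={q1}  Median={med}  Q3={q3}  Low={min(scores)}  High={max(scores)}")
print(f"Billy (19 CRW/min) is at the {local_percentile(19, scores):.0f}th local percentile")
```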

  25. Group Norms: Converted to Box-Plot. [Box-plot of Correctly Read Words, Book 4-1, Baylor Elementary School Grade Norms, January benchmarking (23 students), plotted on a 0-160 scale: Low Value = 31, 1st Quartile = 43, Median (2nd Quartile) = 71, 3rd Quartile = 108, Hi Value = 131. Billy = 19 correctly read words per minute; National Reading Norms = 112 CRW per minute.] Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33]. Eugene, OR: University of Oregon.

  26. Interpreting Data: The Power of Visual Display

  27. Sample Peer Tutoring Chart

  28. Sample Peer Tutoring Chart

  29. Single-Subject (Applied) Research Designs “Single-case designs evolved because of the need to understand patterns of individual behavior in response to independent variables, and more practically, to examine intervention effectiveness. Design use can be flexible, described as a process of response-guided experimentation…, providing a mechanism for documenting attempts to live up to legal mandates for students who are not responding to routine instructional methods.” p. 71 Source: Barnett, D. W., Daly, E. J., Jones, K. M., & Lentz, F.E. (2004). Response to intervention: Empirically based special service decisions from single-case designs of increasing and decreasing intensity. Journal of Special Education, 38, 66-79.

  30. Single-Subject (Applied) Research Designs: Steps “The basic methods [of single-case designs] are • selecting socially important variables as dependent measures or target behaviors • taking repeated measures until stable patterns emerge so that participants may serve as their own controls (i.e., baseline) • implementing a well-described intervention or discrete intervention trials • continuing measurement of both the dependent and independent variables within an acceptable pattern of intervention application and/or withdrawal to detect changes in behavior and make efficacy attributions • graphically analyzing the results to enable ongoing comparisons of the student’s performance under baseline and intervention conditions, and • replicating the results to reach the ultimate goal of the dissemination of effective practices.” Source: Barnett, D. W., Daly, E. J., Jones, K. M., & Lentz, F.E. (2004). Response to intervention: Empirically based special service decisions from single-case designs of increasing and decreasing intensity. Journal of Special Education, 38, 66-79.
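
The core of those steps is a phase-by-phase comparison of repeated measurements for one student. A minimal sketch with invented data, summarizing level by phase medians; real single-case analysis also weighs trend and variability from the graphed data:

```python
# Minimal sketch of the basic A-B comparison in a single-case design:
# repeated baseline measurements vs. repeated intervention measurements,
# with the student serving as his or her own control. Data are invented.

from statistics import median

baseline = [38, 41, 40, 39, 42]          # A phase: pre-intervention probes
intervention = [44, 47, 51, 50, 55, 58]  # B phase: probes after intervention starts

print(f"Baseline median:     {median(baseline)}")
print(f"Intervention median: {median(intervention)}")
print(f"Change in level:     {median(intervention) - median(baseline)}")
```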

  31. Jared: Intervention Phase 1: Weeks 1-6. [Progress-monitoring chart of correctly read words (CRW) by date: W 1/22 = 71 CRW, W 1/29 = 77 CRW, M 2/3 = 75 CRW, Th 2/13 = 75 CRW, Th 2/27 = 79 CRW, F 3/7 = 82 CRW.]

  32. Formative Assessment: Donald: Grade 3

  33. Formative Assessment: Donald: Grade 3

  34. IEP Goal Statements for CBA/CBM

  35. IEP Goals for CBA/CBM: Reading. In [number of weeks until Annual Review], when given a randomly selected passage from [level and name of reading series] for 1 minute, Student will read aloud at [number] correctly read words with no more than [number] decoding errors.

  36. IEP Goals for CBA/CBM: Written Expression. In [number of weeks until Annual Review], when given a story starter or topic sentence and 3 minutes in which to write, Student will write a total of: [number] of words, or [number] of correctly spelled words, or [number] of correct word/writing sequences.

  37. IEP Goals for CBA/CBM: Spelling. In [number of weeks until Annual Review], when dictated randomly selected words from [level and name of spelling series or description of spelling word list] for 2 minutes, Student will write [number of correct letter sequences].
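
The bracketed [number] fields in templates like these are typically filled by projecting forward from baseline data. A minimal sketch of one common approach (baseline median plus an expected weekly growth rate multiplied by the weeks until Annual Review); the slides themselves do not give this formula, and the growth rate and scores below are placeholders, not norms:

```python
# Minimal sketch: projecting an annual CBM goal from baseline probes.
# The weekly growth rate is an illustrative placeholder.

from statistics import median

def annual_goal(baseline_scores: list[int],
                weekly_growth: float,
                weeks_to_review: int) -> int:
    """Projected score: baseline median plus expected growth per week."""
    return round(median(baseline_scores) + weekly_growth * weeks_to_review)

# Example: reading baseline of 42/45/48 correctly read words per minute,
# an assumed growth of 1.0 CRW per week, and 30 weeks until Annual Review.
print(annual_goal([42, 45, 48], weekly_growth=1.0, weeks_to_review=30))  # 75
```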

  38. IEP Goal Statements for CBA/CBM

  39. Writing CBM Goals in Student IEPs (Wright, 1992) Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf

  40. Writing CBM Goals in Student IEPs (Wright, 1992) Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf

  41. Writing CBM Goals in Student IEPs (Wright, 1992) Source: Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved on September 4, 2008, from http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf

  42. IEP Goals for CBA/CBM: Reading. In [number of weeks until Annual Review], when given a randomly selected passage from [level and name of reading series] for 1 minute, Student will read aloud at [number] correctly read words with no more than [number] decoding errors.

  43. IEP Goals for CBA/CBM: Written Expression. In [number of weeks until Annual Review], when given a story starter or topic sentence and 3 minutes in which to write, Student will write a total of: [number] of words, or [number] of correctly spelled words, or [number] of correct word/writing sequences.

  44. IEP Goals for CBA/CBM: Spelling. In [number of weeks until Annual Review], when dictated randomly selected words from [level and name of spelling series or description of spelling word list] for 2 minutes, Student will write [number of correct letter sequences].

  45. Formative Assessment: Essential Questions… 5. How does the school check up on progress toward the goal(s)? The school periodically checks the formative assessment data to determine whether the goal is being attained. Examples of this progress evaluation process include the following: • System-Wide: A school-wide team meets on a monthly basis to review the frequency and type of office disciplinary referrals to judge whether those referrals have dropped below the acceptable threshold for student behavior. • Group Level: Teachers at a grade level assemble every six weeks to review CBM data on students receiving small-group supplemental instruction to determine whether students are ready to exit (Burns & Gibbons, 2008). • Individual Level: A building problem-solving team gathers every eight weeks to review CBM data on a student’s response to an intensive reading fluency plan. Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge. Shinn, M. R. (1989). Curriculum-based measurement: Assessing special children. New York: Guilford.
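
A minimal sketch of the kind of data review those teams run, using one widely cited CBM rule of thumb (consider changing the intervention when the most recent consecutive data points all fall below the goal aimline); teams differ on the exact rule, and the weekly scores here are invented:

```python
# Minimal sketch: comparing recent progress-monitoring data to the goal
# aimline as part of a periodic team data review.

def aimline_value(baseline: float, goal: float, total_weeks: int, week: int) -> float:
    """Expected score at a given week on a straight line from baseline to goal."""
    return baseline + (goal - baseline) * week / total_weeks

def review(scores_by_week: dict[int, float], baseline: float, goal: float,
           total_weeks: int, window: int = 4) -> str:
    recent = sorted(scores_by_week)[-window:]
    below = [w for w in recent
             if scores_by_week[w] < aimline_value(baseline, goal, total_weeks, w)]
    if len(below) == len(recent):
        return "All recent points below aimline: consider changing the intervention."
    return "Progress is on track: continue the current intervention."

scores = {1: 46, 2: 47, 3: 49, 4: 50, 5: 50, 6: 51}  # weekly CRW data (invented)
print(review(scores, baseline=45, goal=75, total_weeks=30))
```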

  46. Team Activity: RTI & Assessment/Progress-Monitoring • At your table: • Discuss the challenge of creating an array of screening and progress-monitoring tools to support RTI. • Talk about possible solutions to these challenges. • Be prepared to share the ‘high points’ of your conversation.
