
Curriculum Based Measurement (CBM) Training


Presentation Transcript


  1. Curriculum Based Measurement (CBM) Training: Middle School CBM Training

  2. Characteristics of General Outcome Measures (GOMs) • Powerful measures that are: • Simple: easier to obtain data (less time, good data) • Accurate: very specific data • Efficient: only a few minutes to administer • Generalizable • Reliable: can compare and contrast student performance across schools, districts, and the country Adapted from www.aimsweb.com

  3. General Outcome Measures (GOMs) from Other Fields Medicine measures height, weight, temperature, and/or blood pressure. Federal Reserve Board measures the Consumer Price Index. Wall Street measures the Dow-Jones Industrial Average. Companies report earnings per share. McDonald’s measures how many hamburgers they sell. In Education, Curriculum Based Measurement is a General Outcome Measure Adapted from www.aimsweb.com

  4. Using Curriculum Based Measures as General Outcome Measures • It’s about using General Outcome Measures (GOMs) for formative assessment/evaluation to: • Inform teaching AND • ensure accountability. • It’s different from, but related to, summative high-stakes testing/evaluation, which: • Doesn’t inform teaching. • Mostly used for accountability/motivation. Adapted from www.aimsweb.com

  5. Using Curriculum Based Measurement as a General Outcome Measure Universal (school-wide) screening using CBMs allows us to add systematic Formative Evaluation to current practice. • For Teachers (and Students) • Early Identification of At Risk Students • Instructional Planning • Monitoring Student Progress • For Parents • Opportunities for Communication/Involvement • Accountability • For Administrators • Resource Allocation/Planning and Support • Accountability Adapted from www.aimsweb.com

  6. Using Curriculum Based Measurement as a General Outcome Measure: Research • Curriculum-Based Measurement (CBM) was developed more than 20 years ago by Stanley Deno at the University of Minnesota through a federal contract to develop a reliable and valid measurement system for evaluating basic skills growth. • CBM is supported by more than 25 years of school-based research by the US Department of Education. • Supporting documentation can be found in 100s of articles, book chapters, and books in the professional literature describing the use of CBM to make a variety of important educational decisions. Adapted from www.aimsweb.com

  7. Summary of Research Validating Curriculum Based Measurement • Reliable and valid indicator of student achievement • Simple, efficient, and of short duration to facilitate frequent administration by teachers • Provides assessment information that helps teachers plan better instruction • Sensitive to the improvement of students’ achievement over time • Easily understood by teachers and parents • Improves achievement when used to monitor progress Adapted from www.aimsweb.com

  8. Curriculum Based Measurement: Advantages • Direct measure of student performance • Helps target specific areas of instructional need for students • Quick to administer • Provides visual representation (reports) of individual student progress and how classes are acquiring essential reading skills • Sensitive to even small improvements in performance • Capable of having many forms • Monitoring frequently enables staff to see trends in individual and group performance—and compare those trends with targets set for their students. • Correlates strongly with “best practices” for instruction and assessment, and research-supported methods for assessment and intervention.

  9. Curriculum Based Measurement: Things to Remember • Designed to serve as “indicators” of general reading achievement: CBM probes don’t measure everything, but measure the important things. • Standardized tests to be given, scored, and interpreted in a standard way. • Researched with respect to psychometric properties to ensure accurate measures of learning. • Are sensitive to improvement in brief intervals of time. • Tell us how students earned their scores (qualitative information). • Designed to be as short as possible to ensure “do-ability.” • Are linked to decision making for promoting positive achievement and problem-solving. Adapted from www.aimsweb.com

  10. Curriculum Based Measurement • CBM has been shown to possess high levels of reliability • Reliability: the extent to which the measurements of a test remain consistent over repeated tests of the same subject under identical conditions • 42 one-minute CBM-type assessments in reading, math, and written expression for grades K-5 were found to have reliability coefficients between .90 and .99 with just three one-minute administrations (Jenkins, 2002)
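
As a rough illustration of what a reliability coefficient of .90-.99 means in practice, the sketch below estimates alternate-form reliability as the Pearson correlation between two one-minute administrations of the same probe type. The scores and the function are a minimal, hypothetical sketch, not data or software from the Jenkins (2002) study.

```python
# Minimal sketch: alternate-form reliability estimated as the Pearson
# correlation between two one-minute administrations of the same probe type.
# The scores below are hypothetical, not data from Jenkins (2002).
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Words read correctly per minute by the same students on two parallel forms
form_a = [42, 88, 65, 120, 97, 55, 73, 101]
form_b = [45, 85, 70, 118, 94, 58, 75, 99]

print(f"Estimated reliability coefficient: {pearson_r(form_a, form_b):.2f}")
```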

  11. Curriculum Based Measurement • Discriminant validity: does it measure what it is supposed to measure, without associating with constructs that should be unrelated? • Several studies have demonstrated the ability of CBM to differentiate between students receiving special education services, students receiving Chapter 1 services, and students not receiving any of those services (Deno, Marston, Shinn, and Tindal, 1983; Marston and Deno, 1982; Shinn and Marston, 1985; and Shinn, Tindal, Spira, and Marston, 1987).

  12. What is Curriculum Based Measurement? • Curriculum-based measurement • Data collection tools derived directly from the curriculum the student is expected to learn • CBM – assessment tools created by the teacher (material pulled from the class curriculum) • CBA – assessments pulled from a published package (e.g., Skill Builders, DIBELS, AIMSweb)

  13. Curriculum Based Measurement • CBM is believed to reduce the gap between assessment and instruction • Aids teachers in generating superior student achievement • Improved communication • Higher level of sensitivity • Enhancement of the database • Administration time is shorter • More cost effective

  14. Why Fluency Measures? • A lot of good data can be obtained in a small amount of time • Fluency measures are significantly related to longer tests • Not just what you know, but how well you know it

  15. Student Driver

  16. Correlation Studies Looking at EOG and CBM Assessments • EOG (End-of-Grade test) and ORF (Oral Reading Fluency) correlation coefficients • 3rd grade: .69 • 4th grade: .59 • 5th grade: .53 • EOG and Maze Fluency correlation coefficients • 3rd grade: .61 • 4th grade: .63 • 5th grade: .63

  17. Correlation Studies looking at District performance on EOG and CBM Assessments • EOG and Skill Builder Word Problem probes correlation coefficients • 3rd grade: .64 • 4th grade: .49 • 5th grade: .60

  18. Cleveland County Schools EOG/CBM Data 2007

  19. Local Norms for Both Elementary and Middle Schools • Currently have CBM norms for K-5th grade • Currently gathering data from all middle schools for middle school norms • Norm sheet handouts
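
As a hedged sketch of how local norm cut points like those on the norm sheet handouts might be summarized once the middle school data are gathered, the snippet below computes quartiles from a set of screening scores. The scores and the choice of 25th/50th/75th percentiles are illustrative assumptions, not the district's actual norming procedure.

```python
# Minimal sketch: turning a set of screening scores into local norm cut points.
# The scores and the 25th/50th/75th percentile choices are illustrative only.
import statistics

# Hypothetical winter ORF scores (words correct per minute) for one grade level
scores = [38, 52, 61, 67, 70, 74, 79, 83, 88, 91, 95, 102, 110, 118, 131, 140]

# statistics.quantiles with n=4 returns the 25th, 50th, and 75th percentiles
q25, q50, q75 = statistics.quantiles(scores, n=4, method="inclusive")

print(f"25th percentile (possible at-risk cut score): {q25:.0f}")
print(f"50th percentile (typical performance):        {q50:.0f}")
print(f"75th percentile:                              {q75:.0f}")
```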

  20. 5th Grade End of Year Norms

  21. Curriculum Based Measurements • Any skill can be measured with a curriculum based measure • Example

  22. Curriculum Based Measurements at the Middle School Level • Oral Reading Fluency • Maze Fluency • Math Computation • Math Word Problems • Written Expression

  23. MAZE Fluency (Comprehension) • Students read silently for 3 minutes from AIMSweb Standard Reading MAZE Passages • Determine the number of correct answers • Record the total number of correct answers followed by the total number of errors (e.g., 35/2, 45/0)

  24. Student Copy

  25. Examiner Copy

  26. Administering the MAZE Probes • MAZE is a standardized test. • Procedures and directions must be uniform. • Once students are familiar with the test directions, the shortened “familiar” directions may be used.

  27. Important Points • Administer a simple practice test to familiarize the student with the procedure. • Attach a cover sheet to the student’s probe so that the student does not begin the test prematurely. • Monitor the student to ensure that he/she is circling the answers instead of writing them. • Discard the MAZE passage and administer another if there are any interruptions.

  28. Scoring MAZE • Score MAZE probes using the answer key: put a slash (/) through incorrect words. • Determine the number of correct answers by subtracting the number of incorrect answers from the total number of items attempted. • Record the total number of correct answers and the total number of errors (e.g., 20/4, 15/0).
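
The arithmetic on this scoring slide is small but easy to rush during administration; the following sketch applies it directly. The function name and example values are illustrative, not part of the AIMSweb materials.

```python
# Minimal sketch of the MAZE scoring arithmetic described above:
# correct = items attempted - items incorrect, recorded as "correct/errors".
# The function and example values are illustrative, not AIMSweb software.

def score_maze(items_attempted: int, items_incorrect: int) -> str:
    """Return the recorded MAZE score string, e.g. '20/4'."""
    if items_incorrect > items_attempted:
        raise ValueError("Errors cannot exceed items attempted.")
    correct = items_attempted - items_incorrect
    return f"{correct}/{items_incorrect}"

print(score_maze(24, 4))   # -> 20/4
print(score_maze(15, 0))   # -> 15/0
```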

  29. Threats to Validity Patterns of responses that may suggest the student’s performance on a MAZE probe may be invalid: • High number of correct responses with a high number of errors • Correct beginning responses followed by many errors • Suspected cheating

  30. Oral Reading Fluency • Many ways to obtain data • DIBELS is convenient because it has standardized directions • DIBELS also has standardized passages by grade level • DIBELS only goes up to 6th grade, so we use AIMSweb standardized passages in middle school

  31. DIBELS® Oral Reading Fluency (DORF) • The examiner shows the reading passage to the student. • The student reads the passage aloud. • Score: number of words read correctly in 1 minute.

  32. Oral Reading Fluency Probes: Example Examiner Copy Student Copy

  33. Materials • Administrator copy • Student passage • Clipboard • Stopwatch • Pen or pencil

  34. Directions for Administration • Place the scoring booklet on the clipboard and position so that the student cannot see what you record. • Place the reading passage in front of the student.

  35. Directions • Say these specific directions to the student: Please read this (point) out loud. If you get stuck, I will tell you the word so you can keep reading. When I say “stop,” I may ask you to tell me about what you read, so do your best reading. Start here (point to first word of the passage). Begin.

  36. Directions • Start your stopwatch after the student says the first word of the passage. • Follow along on the examiner scoring page. • Put a slash (/) over words read incorrectly. • If the student hesitates on a word for 3 seconds, supply the word for the student. • At the end of 1 minute, place a bracket (]) after the last word read, say “Stop,” and stop your stopwatch. • Record the total number of words read correctly at the bottom of the scoring page.

  37. Timing Rule for DORF: Continuous for 1 Minute • Start your stopwatch after the student says the first word. • At the end of 1 minute, place a bracket (]) after the last word read, say “Stop,” and stop your stopwatch.

  38. Wait Rule for DORF: 3 Seconds • Maximum time for each word is 3 seconds. • If the student does not read a word within 3 seconds, say the word and mark the word as incorrect. • If necessary, indicate for the student to continue with the next word.

  39. Discontinue Rule, Part I: Zero (0) Words in the First Row • If the student does not read any words correctly in the first row of the first passage, discontinue administering the passage and record a score of zero (0).

  40. Discontinue Rule, Part II: Fewer Than Ten (10) Words in the First Passage • If the student reads fewer than 10 words per minute in the first passage, do not administer the next two passages. Record the score from the first passage.
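
A compact way to see how the two discontinue rules interact is the decision sketch below. The function name and its return values are illustrative assumptions, not part of the published DIBELS procedures.

```python
# Minimal sketch of the two DORF discontinue rules on the preceding slides.
# The function name and return values are illustrative assumptions.

def check_discontinue(first_row_words_correct, first_passage_words_correct):
    """Apply the Part I and Part II discontinue rules to a first-passage result."""
    # Part I: no words read correctly in the first row of the first passage
    if first_row_words_correct == 0:
        return {"score": 0, "administer_remaining_passages": False}
    # Part II: fewer than 10 words read correctly per minute on the first passage
    if first_passage_words_correct < 10:
        return {"score": first_passage_words_correct,
                "administer_remaining_passages": False}
    # Otherwise, continue with the remaining passages as usual
    return {"score": first_passage_words_correct,
            "administer_remaining_passages": True}

print(check_discontinue(0, 0))    # Part I applies: record a zero
print(check_discontinue(3, 8))    # Part II applies: record 8 and stop
print(check_discontinue(6, 54))   # Neither rule applies: keep testing
```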

  41. Directions for Scoring • Put a slash (/) over any word read incorrectly or omitted. • Do not mark words read correctly or any words added or repeated.

  42. Scoring Examples Mispronounced Words • A word is scored as correct if it is pronounced correctly in the context of the sentence. • If the word is mispronounced in the context, it is scored as an error.

  43. Scoring Examples Numerals • Numerals must be read correctly in the context of the sentence.

  44. Scoring Examples Repeated Words • Words that are repeated are ignored in scoring.

  45. Scoring Examples Inserted Words • Inserted words are ignored and not counted as errors. • The student does not get additional credit for inserted words.

  46. Scoring Examples Omitted Words • Omitted words are scored as incorrect.

  47. Scoring Examples Word Order • All words that are read correctly but in the wrong order are scored as incorrect.
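
To tie the scoring rules from the preceding slides together, the sketch below tallies a marked examiner copy: every word up to the one-minute bracket is either unmarked (correct) or slashed (mispronounced, omitted, supplied after a 3-second hesitation, or read out of order), while inserted and repeated words are ignored and never appear in the tally. The data format and function names are a hypothetical convenience, not a DIBELS form.

```python
# Minimal sketch of DORF scoring: count the words read correctly in 1 minute.
# Each entry represents one word on the examiner copy up to the bracket (]):
# True = left unmarked (read correctly), False = slashed (mispronounced,
# omitted, supplied after a 3-second hesitation, or read out of order).
# Inserted and repeated words are ignored, so they never appear in the list.

def words_correct_per_minute(marks):
    """Total unmarked (correct) words within the one-minute timing."""
    return sum(marks)

def errors(marks):
    """Total slashed words within the one-minute timing."""
    return len(marks) - sum(marks)

# Example: a marked passage with 57 words attempted and 4 slashed words
attempted = [True] * 30 + [False, True, True, False] + [True] * 20 + [False, True, False]

print(f"Words correct: {words_correct_per_minute(attempted)}")  # 53
print(f"Errors:        {errors(attempted)}")                    # 4
```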
