
Curriculum-based Measures: Math


Presentation Transcript


  1. Curriculum-based Measures: Math Kat Nelson, M.Ed., University of Utah

  2. Objectives • You will be able to define a CBM and articulate the big ideas of using math CBM with the CCSS and the MTSS model. • You will be able to administer and score screener and progress monitoring probes. • You will be able to use the problem solving process to interpret the data produced from the math CBM.

  3. CBM: Big Ideas (Kelly, Hosp, & Howell, 2008) • “CBM is a quick and reliable method for gathering information about student performance and progress.” • CBM is… • Aligned with curriculum • Valid and reliable • Standardized • A source of low-inference information

  4. CBM: Big Ideas (Kelly, Hosp, & Howell, 2008) • CBM probes are repeated measures that are efficient and sensitive to growth. • Sensitivity to growth = informing your instruction frequently. • Information about performance and growth can be easily shared with stakeholders. • CBM is an indicator of future reading and math achievement.

  5. Curriculum-Based Measurement and the Common Core State Standards: Big Ideas

  6. Common Core & CBM (Shinn, 2012) • The Common Core State Standards (CCSS) provide sets of college- and career-focused outcomes and annual criterion-referenced tests to measure student learning as a summative evaluation. • The assessment implications of the CCSS are clearly related to summative evaluation and accountability. • No single test is sufficient for all of the data-based decisions (screening, intervention planning/diagnosis, progress monitoring, and accountability/program evaluation) that schools make in their attempts to identify student learning needs.

  7. Common Core & CBM (Shinn, 2012) • Assessment of the CCSS need not involve separate items or tests for each standard, but may include “rich tasks” that address a number of separate standards. • AIMSweb’s Curriculum-Based Measurement (CBM) tests typically are based on these rich tasks, which are validated as “vital signs” or “indicators” of general basic skill outcomes.

  8. Common Core & CBM (Shinn, 2012) • AIMSweb’s CBM tests are consistent with the CCSS; they are content valid. • AIMSweb’s CBM tests are complementary to the assessment requirements for attaining proficiency on the CCSS.

  9. Curriculum-Based Measurement and the Multi-Tiered System of Support: Big Ideas

  10. Multi-Tiered System of Support • Schools identify students at risk for poor learning outcomes • Monitor student progress • Provide evidence-based interventions and adjust the intensity and nature of those interventions depending on a student’s responsiveness. (NCRtI, 2010)

  11. Key Features of MTSS (Sugai, 2008) • Universal Design • Data-based decision making and problem solving • Continuous progress monitoring • Focus on successful student outcomes • Continuum of evidence-based interventions • A core curriculum is provided for all students • A modification of this core is arranged for students who are identified as non-responsive • A specialized and intensive curriculum for students with intensive needs • Focus on fidelity of implementation

  12. Problem Solving Process

  13. Using CBM within MTSS • Tier 1: Universal Screening • Establishes benchmarks three times throughout the school year • Tier 2: Progress Monitoring • Monitoring students at risk by assessing monthly • Tier 3: Intensive Progress Monitoring • Frequent assessment for students at risk or with significant needs

  14. Conducting a Math CBM: Directions and Scoring Procedures

  15. Selecting the Measure • At Kindergarten or Grade 1 • Oral Counting • Quantity Array • Number Identification • Quantity Discrimination • Missing Number • At Grades 1-8 • Computation (Mixed and/or Facts) • Concepts & Applications • As appropriate (Grade 9?) • Algebra

  16. Let’s Take a Look • Early Numeracy Measures

  17. Let’s Take a Look • Concepts and Applications, or M-CAP

  18. Let’s Take a Look • Computation

  19. Administration of Computation Probe • The score is the number of correctly written digits in 2 minutes on problems drawn from the end-of-year curriculum • Correct digits • Not correct problems or answers • Why? • 2 minutes • Depends on grade and publisher

  20. Computation • Student(s) are given a sheet of math problems and pencil • Student(s) complete as many math problems as they can in 2 minutes • At the end of 2 minutes the number of correctly written digits is counted

  21. Directions for Computation • Give the child(ren) a math sheet(s) and pencil • Say “The sheet on your desk is math facts. There are several types of problems on the sheet. Some are (insert types of problems on sheet). Look at each problem carefully before you answer it. When I say ‘please begin’, start answering the problems. Begin with the first problem and work across the page. Then go to the next row. If you cannot answer the problem, mark an ‘X’ through it and go to the next one. If you finish a page, turn the page and continue working. Are there any questions?”

  22. Directions – Your Turn • “The sheet on your desk is math facts. There are several types of problems on the sheet. Some are (insert types of problems on sheet). Look at each problem carefully before you answer it. When I say ‘please begin’, start answering the problems. Begin with the first problem and work across the page. Then go to the next row. If you cannot answer the problem, mark an ‘X’ through it and go to the next one. If you finish a page, turn the page and continue working. Are there any questions?”

  23. Directions Continued • Say “Please begin” and start your timer • Make sure students are not skipping problems in rows and do not skip around or answer only the easy problems • Say “Please stop” at the end of 2 minutes

  24. Scoring • If the answer is correct, the student earns the score equivalent to the number of correct digits written using the “longest method” taught to solve the problem, even if the work is not shown • If a problem has been crossed out, credit is given for the correct digits written • If the problem has not been completed, credit is earned for any correct digits written

  25. Scoring Continued • Reversed digits (e.g., 3 written as E) or rotated digits, with the exception of 6 and 9, are counted as correct • Parts of the answer above the line (carries or borrows) are not counted as correct digits • In multiplication problems, a “0”, “X”, or <blank> counts as a place holder and is scored as a correct digit (CD)

  26. Scoring Continued • A division BASIC FACT is one in which both the divisor and the quotient are 9 or less; if the answer is correct, the total CD always equals 1 • In division problems, remainder zeroes (r 0) are not counted as correct digits • In division problems, place holders are not counted as correct digits
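  The rules above reduce each problem to a count of correct digits (CD). As a rough illustration only (this is not AIMSweb's official scoring; the longest-method, place-holder, and reversed-digit rules still require a human scorer), a minimal Python sketch that compares a student's written answer to the answer key digit by digit, aligned from the ones place, might look like this:

```python
def correct_digits(student_answer: str, answer_key: str) -> int:
    """Rough correct-digit (CD) count for one computation problem.

    Compares the student's written answer to the answer key digit by digit,
    aligned from the rightmost (ones) place, so partial credit is given for
    the digits that match. This is a screening approximation, not the full
    CBM scoring rule set described on the slides above.
    """
    student = student_answer.strip()
    key = answer_key.strip()
    score = 0
    # Walk both answers from the right so place values line up.
    for s_digit, k_digit in zip(reversed(student), reversed(key)):
        if s_digit == k_digit:
            score += 1
    return score

# Example: the key is 136 and the student wrote 126.
# The ones digit (6) and hundreds digit (1) match, so CD = 2.
print(correct_digits("126", "136"))  # 2
```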

  27. Scoring

  28. Computation Scoring – Your Turn

  29. Put It Into Practice: Benchmarking, Survey Level Assessment, and Progress Monitoring

  30. Tier 1 – Universal Screening Big Ideas (Hosp, Hosp, & Howell, 2007) • Provides a reliable and valid way to identify • Students who are at risk for failure • Students who are not making adequate progress • Students who need additional diagnostic evaluation • Students’ instructional level • 3 times a year for the entire school • 3 probes are given and you take the median score

  31. What Is Proficient? How Much Progress Can We Expect? (Hosp, Hosp, & Howell, 2007) • Benchmarks – Use standards for level of performance that are empirically validated by researchers. • Norms – Compare a student’s score to the performance of others in her grade or instructional level.

  32. Proficiency Levels or Benchmarks for Math CBM (Burns, VanDerHeyden, & Jiban, 2006)

  33. Norms for Math CBM: Correct Digits (AIMSweb, 2006)

  34. Spring Benchmark Data for 2nd Grade: Making Informed Data-Based Decisions

  35. Making Informed Data-Based Decisions: Spring Benchmark Data for 2nd Grade

  36. Survey Level Assessment (Hosp, 2012) • Purposes • To determine the appropriate instructional placement level for the student • The highest level of materials in which the student can be expected to benefit from instruction • To provide baseline data, or a starting point, for progress monitoring • In order to monitor progress toward a future goal, you need to know how the student is currently performing

  37. Survey Level Assessment (Hosp, 2012) • Start with grade level passages/worksheets (probes) • Administer 3 separate probes (at same difficulty level) using standard CBM procedures • Calculate the median (i.e., find the middle score) • Is the student’s score within instructional range? • Yes: this is the student’s instructional level • No: if above level (too easy), administer 3 probes at next level of difficulty • No: if below level (too hard), administer 3 probes at previous level of difficulty
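  The decision rule on the previous slide can be expressed as a short Python sketch. The function name, the data structures, and the example numbers below are illustrative only; the instructional-range criteria come from published tables (e.g., Burns, VanDerHeyden, & Jiban, 2006), not from this code.

```python
from statistics import median

def sla_instructional_level(scores_by_level, instructional_ranges, start_level):
    """Minimal sketch of the Survey Level Assessment decision rule.

    scores_by_level: {grade_level: [three probe scores]} already administered
    instructional_ranges: {grade_level: (low, high)} correct-digit criteria
    start_level: the student's current grade level

    Moves down a level when the median falls below the instructional range
    (frustration) and up a level when it falls above (too easy), returning
    the first level whose median lands inside the range. That median is the
    baseline for progress monitoring.
    """
    level, visited = start_level, set()
    while level in scores_by_level and level not in visited:
        visited.add(level)
        med = median(scores_by_level[level])
        low, high = instructional_ranges[level]
        if med < low:
            level -= 1          # below range (frustration): try an easier level
        elif med > high:
            level += 1          # above range (too easy): try the next level up
        else:
            return level, med   # instructional level found
    return level, None          # no administered level fell within range

# Illustrative data only (the ranges are hypothetical, not published criteria):
scores = {2: [2, 7, 12], 1: [25, 23, 27]}
ranges = {2: (14, 31), 1: (20, 40)}
print(sla_instructional_level(scores, ranges, start_level=2))  # (1, 25)
```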

  38. Survey Level Assessment • [Example SLA record sheet, dated 6/13/13: the student’s grade-level probe scores fall in the frustration (F) range, while probes of 25, 23, and 27 at a lower level (median = 25) fall in the instructional (I) range.]

  39. Progress Monitoring Big Ideas: Tiers 2 & 3 • Purpose (Hosp, Hosp, & Howell, 2007): • To ensure that instruction is working • To signal when a change is needed • To guide adjustments in the program • Frequency: • Tier 2: Monthly – to show progress and to inform instruction • Tier 3: Weekly to bi-weekly – to ensure that students who are the most treatment-resistant are making progress.

  40. Progress Monitoring: Determine the Goal – Calculating the Aim Line with Weekly Growth Rates for Math CBM (Correct Digits) • Median Score from SLA or Benchmark + (Number of Weeks x Rate of Improvement) = Goal • Student 4 • 25 + (20 x .50) = 35 • Goal = 35 correct digits in 20 weeks (Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993)
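  The same aim-line arithmetic as a small Python sketch (the function name and layout are my own; the formula and the Student 4 numbers come from the slide above):

```python
def cbm_goal(baseline_median: float, weeks: int, weekly_roi: float) -> float:
    """Aim-line endpoint: baseline median + (weeks of instruction x weekly rate of improvement)."""
    return baseline_median + weeks * weekly_roi

# Student 4: baseline of 25 correct digits, 20 weeks of instruction,
# expected growth of 0.5 correct digits per week.
print(cbm_goal(25, 20, 0.5))  # 35.0 -> goal of 35 correct digits in 20 weeks
```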

  41. Your Turn: Calculate the Goal for Student 1 • 2nd grade: Spring Benchmark Scores • Median Score from SLA or Benchmark + (Number of Weeks x Rate of Improvement) = Goal

  42. Making Informed Data-Based Decisions • Is our intervention working? • What changes should we make?

  43. Progress Monitoring: Another Look

  44. CBM and Web-Based Data Management Resources • AIMSweb • easyCBM • EdCheckup • Intervention Central • iSTEEP • Yearly Progress Pro • NCRtI • enumeracy • PM Focus
