
Making Data Driven Decisions: Cut Points, Curve Analysis, and Odd Balls








  1. Making Data Driven Decisions: Cut Points, Curve Analysis, and Odd Balls Robert Rosenthal, David Lillenstein, Jason Pedersen, Laura Lent, Richard Hall, Joe Kovaleski, and Edward Shapiro

  2. Agenda
  • To hear how intervention and evaluation decisions are made in schools across Pennsylvania that implement Response to Intervention and Instruction models
  • To learn about some outcomes that resulted from these decision strategies

  3. Overlook Elementary

  4. Instructional Programs '06-'10

  5. Grade Level Team Meetings
  • Examine data every 6 weeks
  • Include all data on an Excel spreadsheet
  • Use DIBELS progress monitoring charts
  • Calculate slope (rate of progress; see the sketch after this list)
  • Generally follow DIBELS-recommended instructional levels
  • Must present data to justify not following the recommended levels
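  The deck does not show how slope is computed; a common approach for DIBELS progress monitoring data is an ordinary least-squares line fit over weekly probe scores, reported as words correct per minute (WCPM) gained per week. A minimal sketch in Python, with illustrative sample scores that are not from the presentation:

      # Ordinary least-squares slope over weekly progress-monitoring probes.
      # Illustrative data: oral reading fluency scores in WCPM for six weeks.
      weeks = [1, 2, 3, 4, 5, 6]
      wcpm = [42, 44, 47, 46, 51, 53]

      n = len(weeks)
      mean_x = sum(weeks) / n
      mean_y = sum(wcpm) / n

      # slope = covariance(weeks, wcpm) / variance(weeks)
      slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, wcpm)) / sum(
          (x - mean_x) ** 2 for x in weeks
      )

      print(f"Rate of progress: {slope:.2f} WCPM per week")  # ~2.14

  The resulting slope can then be compared against the target and typical rates the team uses for tier decisions.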

  6. Data Examined at Team Meetings
  • Universal screening (DIBELS)
  • Unit (curriculum) test scores
  • Unit (curriculum) weekly assessments
  • 4-Sight scores (administered 3 times per year)
  • PSSA (annual state assessment)
  • Rate of progress (slope of progress monitoring data)
  • Length of time at a tier level
  • Instructional program at Tiers 2 and 3
  • Behavior infractions

  7. Tier Assignment Decisions
  • First look at the DIBELS recommendation
  • Grades K-2:
    • Then examine Unit/Weekly test scores
    • For students in Tier 2 or 3: sub-group by decoding/fluency/comprehension
  • Grades 3-6:
    • Then examine Unit/Weekly tests, PSSA, and 4-Sight
    • Sub-group by fluency/decoding/comprehension/writing

  8. Making Sub-Groups
  Every 6 weeks, once the tier-level decision is made:
  • Group by high vs. low performance
  • Group by decoding vs. fluency vs. comprehension
  • Group by program (Fundations)

  9. K-2nd Teacher Perceptions: What Influences Tier Placement

  10. 3rd-6th Teacher Perceptions: What Influences Tier Placement

  11. Percent of Time We Followed DIBELS Instructional Recommendations

  12. When We Didn't Follow the Instructional Recommendation
  • 43 times (10% of total students) we gave more support than indicated
  • 31 times (8%) we gave less support
  • Reasons:
    • Unit test scores
    • Behavior/emotional issues (gave more)
    • Borderline scores: looked at other data
    • Not a fluency problem (gave more)
    • A fluke (gave less; other indicators were OK)
    • Resources: grouped students with similar needs

  13. Decision to Evaluate
  • Rate of progress is below both the target and typical rates, unless the problem is not fluency (a sketch of this comparison follows the list)
  • History of failure in the curriculum
  • In targeted instructional support for at least 6 months, with multiple data-driven changes using research-proven techniques and programs
  • Progress monitoring shows performance significantly below peers
  • Below Basic (BB) or Basic (B) on the PSSA
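  Slides 13 and 15 both key off how a student's slope compares with the target and typical rates, with slopes below half the target rate treated as especially strong support. A minimal sketch of that comparison; the function name, threshold factor placement, and example numbers are illustrative, not taken from the presentation:

      def flag_for_evaluation(student_slope, target_slope, typical_slope):
          """Return (flag, note) for a student's rate of progress."""
          below_both = student_slope < target_slope and student_slope < typical_slope
          # Slide 15 treats slopes below half the target rate as especially telling.
          well_below = student_slope < 0.5 * target_slope
          if below_both:
              note = "well below target" if well_below else "below target"
              return True, note
          return False, "meets at least one benchmark"

      # Example: a 0.6 WCPM/week gain vs. a 1.5 target and 1.2 typical rate
      print(flag_for_evaluation(0.6, 1.5, 1.2))  # (True, 'well below target')

  In practice this check would sit alongside the other criteria on the slide (history of failure, months of support, PSSA level), not replace them.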

  14. Teacher Perceptions: What Influences Decision to Evaluate

  15. Eligibility Decisions
  • The LEA decided to use the discrepancy model
  • Augment the Evaluation Report (ER) with RtI data
  • Slope scores can help support the decision (especially when below half the target rate)
  • Helps in making recommendations (type and quantity of program)
  • Sometimes the data are conflicting:
    • It used to be that the discrepancy ruled
    • Now any sign of success makes eligibility difficult to establish
  • Always helps with ED classification

  16. Evaluations Across Years

  17. Placements Across Years

  18. Average T/A: Differences between Referral Sources

  19. State Testing Across Years

  20. Conclusions
  • Must include special education students
  • Teachers need more training
  • We see a reduction in testing, with school referrals becoming more accurate
  • At team meetings, staff now ask about interventions rather than evaluations
  • Must continually remind staff to look at data when making decisions; we need to move students to less support more often
