
Tom Jenkins, Ed.D. Educational Consultation Services, LLC Wilmington, NC






Presentation Transcript


  1. The Formative Evaluation Component of Your MTIM: Data-Based Instructional Decision Making from the Building Level to the Individual Student Level. Tom Jenkins, Ed.D. Educational Consultation Services, LLC, Wilmington, NC

  2. Formative Evaluation • Collecting data to make determinations about the effectiveness of your MTIM implementation • Collecting data to make determinations about the effectiveness of instruction with students – and everything in between

  3. Analyzing Your MTIM Implementation • Building consensus is necessary in three stages • pre-implementation • implementation • maintenance

  4. Analyzing Your MTIM Implementation • Pre-implementation tasks • Identify the purpose of implementation and the critical data • whose performance will improve as a result of implementation? • what aspect of performance will improve? • how will you know? • take some time to think about this

  5. Analyzing Your MTIM Implementation • Pre-implementation tasks continued • Identify the mechanisms that will cause student performance to change • enhanced delivery of Core • screening • intervening • progress monitoring

  6. Analyzing Your MTIM Implementation • Pre-implementation tasks continued • Present plan to stakeholders • allow for stakeholder input • create forms of communication for stakeholder groups • don’t forget your school board • disagreement is normal and should not be overlooked

  7. Analyzing Your MTIM Implementation • Implementation tasks • Consensus is most easily gained through success! • implement where leadership is strong • assess treatment integrity • start small • provide support • communicate success to stakeholders

  8. Analyzing Your MTIM Implementation • Maintenance tasks • Build RTI/TIM into the school culture • set up team meetings that are on the calendar • present success stories and current status data at staff meetings

  9. Analyzing Your MTIM Implementation • Maintenance tasks continued • Training is required each and every year • assessment tool training • intervention training • new staff • survey your staff activity

  10. Analyzing Your MTIM Implementation • Maintenance tasks continued • maintain communication with and present results to stakeholders • show connection between implementation and state accountability measures

  11. Analyzing Your MTIM Implementation • Things that reduce consensus and support for RTI implementation • lack of identifiable team or individual who is in charge • lack of focus on what RTI is about • asking classroom teachers to do everything • not communicating to critical stakeholders • not identifying how or when students move between tiers

  12. Data For Each Tier - Where Do They Come From? • Tier 1: universal screening, accountability assessments, grades, classroom assessments, referral patterns, discipline referrals • Tier 2: universal screening, group-level diagnostics (maybe), systematic progress monitoring, formative assessment, large-scale assessment data, and classroom assessment • Tier 3: universal screening, individual diagnostics, intensive and systematic progress monitoring, formative assessment, other informal assessments

  13. Building MTIM Infrastructure • Formative evaluation process • Informed by data • Highly involved school-based leadership team (SBLT) • School-based MTIM coach • provides technical assistance on the interpretation and use of data • facilitates regular data meetings for building and grade levels

  14. Formative Evaluation Component of Infrastructure • What we need: • Screening system for identifying students at risk • Diagnostic assessment tools for identifying the specific needs of students identified by screening • Systematic, explicit, research-based instructional strategies – differentiated instruction • Progress monitoring plan • Evaluation of whether instruction is effective

  15. Data For Each Tier – Where Do They Come From? • Step 1 – Screen All Students: grades, classroom assessments, yearly assessments • Step 2 – Instruction: Core (80-90%) – continue with core instruction, with differentiation; Supplemental (5-10%) – small group intervention by skill, standard protocol (behavior and academics); Intensive (1-5%) – individualized intensive intervention • Step 3 – Additional Diagnostic Assessment: Core – none; Intensive – individual diagnostic • Step 4 – Results Monitoring: Core – benchmark assessment, annual testing, monthly screening; Supplemental – 2 times/month; Intensive – weekly

  16. Tier I - Core/Benchmark • Universal Screening • Academics: Screen all students, begin in kindergarten; 3 times per year with appropriate early literacy and math measures • More intense instruction and monitoring within classroom for students below cut scores • See worksheet

  17. Cut Score Worksheet • Step One: Put all student scores on the universal screening measure on a histogram-type chart. • Step Two: Calculate the typical Growth Rate of specific skills. Three formulas can be used here. • (EOYBM – BOYBM) / 36 weeks = GR • Or • (EOYBM – MOYBM) / 18 weeks = GR • Or • (MOYBM – BOYBM) / 18 weeks = GR • Step Three: Determine the Targeted Growth Rate for students. Two formulas can be used here, depending on the desired amount of ambitiousness. • GR * 1.5 = TGR • Or • GR * 2.0 = TGR

  18. Cut Score Worksheet • Step Four: Calculate the Growth Goal for the instructional period. • TGR * NWI (18 or 36) = GG • Step Five: Calculate the Cut Score for determination of level of instruction. Two formulas can be used here, depending on the length of the instructional period used in Step Four. • MOYBM – GG = CS • Or • EOYBM – GG = CS • Step Six: Using the Cut Score, place a line of demarcation on the histogram created in Step One. Any students above the Cut Score should obtain the GG via Core instruction. Any students below the Cut Score may need Strategic instruction to obtain the TGR and GG. Students in need of Intensive instruction should be identified using progress monitoring data during Strategic instruction implementation. Progress monitoring within all three tiers allows for student movement between the tiers during the instructional period.

  19. Cut Score Worksheet Activity • Knowing that your MOYBM is 40 and your BOYBM is 20, what would be the cut score using an accelerator of 1.5? • BOYBM = 20 • MOYBM = 40 • 18 weeks of instruction/intervention • Accelerator of 1.5

  20. Cut Score Worksheet Activity • Step Two: (MOYBM – BOYBM) / 18 = GR; (40 – 20) / 18 = 1.11 • Step Three: GR * 1.5 = TGR; 1.11 * 1.5 = 1.67 • Step Four: TGR * NWI = GG; 1.67 * 18 = 29.97 (using the unrounded TGR of 1.665) • Step Five: MOYBM – GG = CS; 40 – 29.97 = 10.03 • Step Six: All students above the score of 10.03 should be able to meet the expected Growth Goal via Core instruction. All students below the score of 10.03 would probably be initially placed in Strategic instruction to obtain the Targeted Growth Rate and necessary Growth Goal.
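The worksheet arithmetic above is easy to script. A minimal sketch for an 18-week middle-of-year calculation; the function and argument names are illustrative, not part of the worksheet:

```python
def cut_score(boybm, moybm, weeks=18, accelerator=1.5):
    """Cut Score Worksheet, Steps Two through Five.

    boybm / moybm: beginning- and middle-of-year benchmark scores.
    accelerator: 1.5 (slightly ambitious) or 2.0 (more ambitious).
    """
    gr = (moybm - boybm) / weeks   # Step Two: typical growth rate
    tgr = gr * accelerator         # Step Three: targeted growth rate
    gg = tgr * weeks               # Step Four: growth goal
    cs = moybm - gg                # Step Five: cut score
    return gr, tgr, gg, cs
```

With the activity's numbers (BOYBM = 20, MOYBM = 40), this returns a cut score of exactly 10; the slide's 10.03 comes from rounding the growth rate to 1.11 before the later steps.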

  21. Universal Screening Results – Fidelity Check: Are you doing the right thing? • Assess success of instructional program • Percent of students at or above benchmarks • If necessary, examine curriculum, instruction, or both • Identify students below benchmarks • Interventions within general education classroom • Assess progress and consider need for more intensive interventions

  22. Decision Rules: What is a “Good” Response to Intervention? • Positive Response: gap is closing; can extrapolate point at which target student(s) will “come in range” of target – even if this is long range; level of “risk” lowers over time • Questionable Response: rate at which gap is widening slows considerably, but gap is still widening; gap stops widening but closure does not occur • Poor Response: gap continues to widen with no change in rate

  23. Positive Response to Intervention [Graph: student performance over time – the observed trajectory closes the gap with the expected trajectory]

  24. Decision Rules: Linking RtI to Intervention Decisions • Positive • Continue intervention with current goal • Continue intervention with goal increased • Fade intervention to determine if student(s) have acquired functional independence

  25. Decision Rules: What is a “Questionable” Response to Intervention? • Positive Response: gap is closing; can extrapolate point at which target student(s) will “come in range” of target – even if this is long range • Questionable Response: rate at which gap is widening slows considerably, but gap is still widening; gap stops widening but closure does not occur; level of “risk” remains the same over time • Poor Response: gap continues to widen with no change in rate

  26. Questionable Response to Intervention [Graph: student performance over time – the observed trajectory runs roughly parallel to the expected trajectory; the gap stops widening but does not close]

  27. Decision Rules: Linking RtI to Intervention Decisions • Questionable • Was intervention implemented as intended? • If no – employ strategies to increase implementation integrity • If yes – increase intensity of current intervention for a short period of time and assess impact; if rate improves, continue; if rate does not improve, return to problem solving

  28. Decision Rules: What is a “Poor” Response to Intervention? • Positive Response: gap is closing; can extrapolate point at which target student(s) will “come in range” of target – even if this is long range • Questionable Response: rate at which gap is widening slows considerably, but gap is still widening; gap stops widening but closure does not occur • Poor Response: gap continues to widen with no change in rate; level of “risk” worsens over time
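The three response categories above can be sketched as a rule applied to the gap (expected minus observed performance) over time. A rough sketch only: the half-series split and the 0.5 slow-down threshold are illustrative assumptions, not part of the decision rules, and it assumes the gap was widening at baseline.

```python
def classify_response(gaps):
    """Classify an RtI response from a time series of gap values
    (expected performance minus observed performance).

    Needs at least three data points. Returns "positive" if the gap
    is closing, "questionable" if widening has stopped or slowed
    considerably without closure, "poor" if it widens at an
    unchanged rate.
    """
    # Compare how fast the gap changed early vs. late in the series.
    half = len(gaps) // 2
    early = (gaps[half] - gaps[0]) / half
    late = (gaps[-1] - gaps[half]) / (len(gaps) - 1 - half)
    if late < 0:
        return "positive"      # gap is closing
    if late == 0 or late < 0.5 * early:
        return "questionable"  # widening stopped or slowed considerably
    return "poor"              # gap still widening at an unchanged rate
```

In practice a team would eyeball this on the graphed aimline rather than automate it; the sketch just makes the decision rules concrete.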

  29. Poor Response to Intervention [Graph: student performance over time – the observed trajectory continues to diverge from the expected trajectory; the gap keeps widening]

  30. Decision Rules: Linking RtI to Intervention Decisions • Poor • Was intervention implemented as intended? • If no – employ strategies to increase implementation integrity • If yes – Is intervention aligned with the verified hypothesis? (Intervention Design) • Are there other hypotheses to consider? (Problem Analysis) • Was the problem identified correctly? (Problem Identification)

  31. Building Your MTIM

  32. Thoughts, fleshing out your pyramid, and action plan activity • Is your district/school ready for a universal screening process? Consider… • Do you have a person with the ability to be a data coach? • Who would be your school leadership team/data analysis team? • What are your current screening assessment tools? • Can your staff analyze data and make instructional decisions? • Do you have consensus?

  33. Data For Each Tier – Where Do They Come From? • Step 1 – Screen All Students: grades, classroom assessments, yearly assessments • Step 2 – Instruction: Core (80-90%) – continue with core instruction, with differentiation; Supplemental (5-10%) – small group intervention by skill, standard protocol (behavior and academics); Intensive (1-5%) – individualized intensive intervention • Step 3 – Additional Diagnostic Assessment: Core – none; Intensive – individual diagnostic • Step 4 – Results Monitoring: Core – benchmark assessment, annual testing, ODRs, monthly behavior screening; Supplemental – 2 times/month; Intensive – weekly

  34. Need for CBM Type Assessments • All reading probes are scored as corrects (words read correctly) per minute

  35. Need for CBM Type Assessments • Math computations are scored by correct digits per minute
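Both CBM scores above are simple rate calculations: a raw count of corrects from a timed probe, converted to a per-minute rate. A sketch (function and argument names are illustrative):

```python
def corrects_per_minute(correct_count, seconds):
    """Convert a timed probe's raw count of corrects (words read
    correctly, or correct digits) into a per-minute rate."""
    return correct_count * 60 / seconds
```

For example, 24 correct digits on a two-minute computation probe is 12 correct digits per minute.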

  36. Need for CBM Type Assessments • Correct Sequences for written expression • Two adjacent correct words form a sequence; a correct word and its adjacent punctuation also form a sequence • Most words and punctuation marks therefore count in two sequences (once with the unit before, once with the unit after) • Three minutes to brainstorm, write, and edit
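Once a scorer has marked each unit (word or punctuation mark) of the sample as correct or incorrect in context, the sequence tally itself is mechanical: each adjacent pair of correct units counts as one sequence. A sketch under that assumption; the correctness judgments are left to the human scorer, and the data shape is illustrative:

```python
def correct_writing_sequences(units):
    """Count correct writing sequences in a scorer-marked sample.

    `units` is the sample split into words and punctuation marks in
    order, each paired with True/False for 'correct in context'.
    An adjacent pair of correct units counts as one sequence, which
    is why most units participate in two sequences.
    """
    return sum(1 for (_, a), (_, b) in zip(units, units[1:]) if a and b)
```

For example, a four-unit sample with every unit correct yields three sequences; one incorrect unit in the middle knocks out both sequences it touches.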

  37. Tier II Strategic and Tier III Intensive Progress Monitoring • Progress monitoring is essential for four reasons • There is no guarantee that interventions will be successful, thus the intervention must be “tested” to evaluate effectiveness • With increased emphasis on specific outcomes for students, a data base must be generated to guide intervention decision making • Pre/post testing has been shown to be unreliable (small amount of data) and provides too little data to allow for instructional decision making – progress monitoring allows for evaluation of both level of performance and rate of learning • Research has shown that progress monitoring is associated with improved educational outcomes

  38. Research has shown that it works! “If we use research validated reading practices, monitor students’ progress and make changes to instruction based on what we find, between 95 and 100 percent of children can become proficient readers.” – Torgesen (2000), “Individual Differences in Response to Early Intervention in Reading: The Lingering Problem of Treatment Resisters,” Learning Disabilities Research and Practice

  39. Tier II Strategic and Tier III Intensive Progress Monitoring • Essential components that must be in place for successful progress monitoring within each tier • A well-defined behavior • Identification of student’s current level of performance (baseline) • Instruction/Intervention • Goal • A measurement strategy • Graph • Decision-making plan

  40. Tier II Strategic and Tier III Intensive Progress Monitoring • Target behaviors that are observable and measurable • Skill specific assessments • Focus on enabling skills • Skills that are prerequisites for more complex skills • Deficiencies in enabling skills often adversely affect performance on global assessments

  41. Tier II Strategic and Tier III Intensive Progress Monitoring • Enabling skills for reading • Phonemic awareness • Alphabetic understanding • Fluency • Sight words • Comprehension

  42. Tier II Strategic and Tier III Intensive Progress Monitoring • Enabling skills for math • Number sense • Facts • Computation • Applications • Problem solving • Enabling skills for written expression • Mechanics • Expression

  43. Tier II Strategic and Tier III Intensive Progress Monitoring • Enabling skills for behavior • Social skills • Work completion • Compliance • Problem solving skills

  44. Tier II Strategic and Tier III Intensive Progress Monitoring • Goal setting • Standard against which progress can be compared • Allows for an aimline to be established • Possible goals • Level of behavior that is expected – several ways to establish this • Most frequently used are accelerated growth rates for academics and percent-of-time expectations for behavior

  45. Tier II Strategic and Tier III Intensive Progress Monitoring • To identify an accelerated growth rate • Take the growth rate that is calculated and multiply it by 1.5 to obtain a slightly ambitious growth rate or • Take the growth rate that is calculated and multiply it by 2 to obtain a more ambitious growth rate • Then multiply by the number of weeks of intervention
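The accelerated-growth calculation above can be sketched in a few lines. One assumption is made explicit here: multiplying the accelerated rate by the number of weeks gives the expected growth, and adding that to the student's baseline gives the goal level against which the aimline is drawn. Names are illustrative:

```python
def intervention_goal(baseline, growth_rate, weeks, accelerator=1.5):
    """Progress-monitoring goal level for an intervention period.

    growth_rate: typical weekly growth rate for the skill.
    accelerator: 1.5 (slightly ambitious) or 2.0 (more ambitious).
    Returns baseline plus the accelerated growth over the period.
    """
    return baseline + growth_rate * accelerator * weeks
```

For example, a student at a baseline of 20 with a typical growth rate of 1.0 per week, on a nine-week plan with a 1.5 accelerator, would have a goal of 33.5.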

  46. Tier II Strategic and Tier III Intensive Progress Monitoring • For behavior, research indicates that a 75% level of performance can be used for non-threatening behaviors • For behaviors that are threatening or dangerous, a 100% level of performance should be used

  47. Tier II Strategic and Tier III Intensive Progress Monitoring • Goal calculation activity • Using the data below what would Nicole’s goal be, in each area, for an intensive intervention plan that was implemented for nine weeks using a 1.5 accelerator?
