
Systematic Evaluation Model to Ensure the Integrity of MTSS Implementation in Florida


Presentation Transcript


  1. Systematic Evaluation Model to Ensure the Integrity of MTSS Implementation in Florida 2011 FASP Conference November 4th, 2011 Kevin Stockslager Kelly Justice Beth Hardcastle

  2. Advance Organizer • Accountability and Evaluation • MTSS and Program Evaluation in the Schools • Example of an MTSS Evaluation Model • Review of Potential Data Sources • Surveys • Self-Assessments • Permanent Product Reviews

  3. PS/RtI vs. MTSS

  4. Accountability and Evaluation

  5. What does… • Accountability mean to you? • Evaluation mean to you?

  6. Accountability in Florida • Increasing accountability focus over the last decade • Examples include: • School grading • AYP • Special education rules • DA • FEAPs & teacher evaluation systems

  7. Impact of Accountability • Positives: Establishes and maintains standards for performance; Reinforces use of data to monitor student outcomes; Reinforces need to examine resource use; Student outcome rather than process focus; Success stories • Criticisms: Lack of educator involvement; Controversy; Consequence driven; Compliance driven; Conflicting requirements; "Duck and cover" approach

  8. [Figure omitted] (Hall & Hord)

  9. Accountability & Evaluation Issues • Compliance driven versus informative evaluation • Evaluation often done to meet accountability requirements • Evaluation can serve to help integrate and improve school and district services • Evaluation is fundamental to MTSS • MTSS has the potential to: • Be viewed as one more thing we have to do OR • Help address accountability & evaluation demands through the multi-tier framework

  10. MTSS and Program Evaluation in the Schools

  11. Important MTSS Evaluation Issues • Stakeholders should be involved in all aspects of planning and carrying out the evaluation process, as well as in decision-making • Goals established through planning should drive the process • Information is obtained to: • Determine where you currently are (needs) • Take ongoing looks at how things are working • Make decisions about what to keep doing and what to change or eliminate

  12. MTSS Evaluation Issues cont. • The data you collect should be driven by the evaluation questions you want to answer • Are students meeting expectations? Academically? Behaviorally? Social-emotionally? • Are we implementing MTSS with fidelity? • Do we have the capacity to implement successfully? • Do staff buy into implementing MTSS? *Example questions

  13. Table Top Activity • Brainstorm and discuss some additional evaluation questions that you might want to answer at your schools • (2-3 minutes then report out)

  14. How Are Students Performing? Examples of data sources • Academics • FCAT • FAIR • Core K-12 • End of Course Exams • Behavior • Attendance • Tardies • Suspensions • Discipline referrals • Global Outcomes • Graduation Rates

  15. Are Schools Implementing MTSS with Fidelity? Examples of data sources • Curriculum and Instruction/Intervention • Principal walkthroughs • Lesson plans • Intervention Documentation Worksheets • Components of MTSS and Data-Based Problem-Solving* • BOQ, PIC, BAT • SAPSI, Tier I & II CCCs, Tier III CCCs * See http://flpbs.fmhi.usf.edu/ and http://floridarti.usf.edu for more information

  16. Do We Have the Capacity to Implement MTSS with Fidelity? Examples of data sources • Leadership Team structure and functioning • Organizational charts • Minutes/meeting summaries • SAPSI, BOQ, PIC • Staff knowledge and skills • FEAPs & teacher evaluation system • Staff development evaluations • Work samples • Resources allocated to match needs • SIP, DIP • Master calendar/schedule • School rosters • Resource maps

  17. Do Staff Buy Into Implementing MTSS? Examples of data sources • Leadership vision and commitment • SAPSI, BOQ, PIC • Required and non-required plans • Staff buy-in • SAPSI, BOQ, PIC • District/school staff and climate surveys • Dialogue • Brief interviews with key personnel

  18. Example of an MTSS Evaluation Model

  19. Table Top Activity • Mock Small-Group Planning and Problem-Solving Process

  20. Small-Group Planning and Problem-Solving Process • What is our desired goal? • Brainstorm the resources and barriers to achieving our goal • Select a barrier/group or related barriers to address first • Brainstorm strategies to reduce or eliminate our selected barrier • Develop an action plan to reduce or eliminate our selected barrier • Include who, what, when (Be specific!) • Develop a follow-up plan for each action • Include who, what, when • Develop a plan to evaluate the reduction or elimination of our chosen barrier • Develop a plan to evaluate progress towards achieving our goal from Step 1

  21. Mock Small-Group Planning and Problem-Solving • Goal: Develop and implement a data-based evaluation system in my school and/or district • Brainstorm the resources and barriers to achieving our goal • Select a barrier/group or related barriers to address first • Brainstorm strategies to reduce or eliminate our selected barrier

  22. Potential Data Sources

  23. Perceptions of RtI Skills Survey: Assessing Perceptions of Skills Integral to PS/RtI Practices

  24. Briefly… • Role of survey data • Beliefs Survey • Perceptions of Practices Survey

  25. Perceptions of Skills The likelihood of embracing new practices increases when: • Educators understand the need for the practice • Educators perceive they either have the skills to implement the practice or will be supported in developing the required skills (Showers, Joyce, & Bennett, 1987)

  26. Description and Purpose Perceptions of RtI Skills Survey

  27. Perceptions of Skills—Description and Purpose • Theoretical Background: • Assess educators’ perceptions of the skills they possess to implement PS/RtI • Understand perceptions of skills and how perceptions change as a function of professional development to facilitate PS/RtI implementation

  28. Description of Survey • Assesses skills/amount of support needed for: • Applying PS/RtI practices to academic content • Applying PS/RtI practices to behavior content • Data manipulation and technology use • 20 items; 5-point Likert scale • 1 = I do not have the skill at all (NS) … 5 = I am highly skilled in this area and could teach others (VHS)

  29. Purpose of Instrument Purpose of the Perceptions of RtI Skills Survey: • Assess impact of professional development • Identify “comfort level” with PS/RtI practices to inform PD; allocate resources

  30. Administration Procedures & Scoring Perceptions of RtI Skills Survey

  31. Administration Procedures: Intended Audience • Who should complete? • SBLT members • Instructional staff • Who should use results? • SBLTs • DBLTs

  32. Directions for Administration • Methods for administration/dissemination • Completed individually • Anonymity • Opportunity for questions • Role of school principal—explain the “why” • Role of RtI coach/coordinator/SBLT member • Frequency of use: resources, rationale, recommendations

  33. Scoring Two techniques to analyze survey responses: • Mean rating for each item calculated to determine average perceived skill level • Frequency of each response option selected calculated for each item

  34. Calculating Item Mean • Overall assessment of perceived skills of educators within a school/district • Can be done at the domain (factor) and/or individual item level • Domain level: examine patterns in perceived skills re: academic content, behavior content, data manipulation/technology use • Item level: identify specific skills staff perceive themselves as possessing vs. skills in need of support
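A minimal sketch of the two mean calculations described above, assuming survey responses have been exported to a CSV with one row per respondent and one 1–5 rating per item. The file name, item column names, and domain groupings are illustrative placeholders, not the survey's actual item labels.

```python
import pandas as pd

# Hypothetical export: one row per respondent, columns are survey items
# rated 1 ("I do not have the skill at all", NS) to 5 (VHS).
responses = pd.read_csv("skills_survey_responses.csv")

# Illustrative domain (factor) groupings; the real survey maps its 20 items
# to academic content, behavior content, and data manipulation/technology use.
domains = {
    "academic_content": ["item_01", "item_02", "item_03"],
    "behavior_content": ["item_04", "item_05", "item_06"],
    "data_technology": ["item_07", "item_08"],
}

# Item-level means: average perceived skill level for each individual item.
item_means = responses.mean(numeric_only=True).round(2)
print(item_means)

# Domain-level means: average each respondent's ratings within a domain,
# then average across respondents.
for domain, items in domains.items():
    print(f"{domain}: {responses[items].mean(axis=1).mean():.2f}")
```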

  35. Calculating Frequency of Response Options • Provides information on range of perceived skill levels • Can be used to determine what percentage of staff may require little, some, or high levels of support to implement PS/RtI • Informs professional development decisions
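Continuing the same hypothetical data layout, a short sketch of the frequency technique: count how often each response option (1–5) was selected per item and convert the counts to percentages, which can then be grouped into rough support levels. The cut points shown are illustrative assumptions, not thresholds from the survey documentation.

```python
import pandas as pd

# Same hypothetical export as above: one row per respondent, 1-5 ratings per item.
responses = pd.read_csv("skills_survey_responses.csv")

# Count how many respondents selected each response option (1-5) for each item.
option_counts = responses.apply(
    lambda col: col.value_counts().reindex(range(1, 6), fill_value=0)
)

# Express counts as the percentage of respondents selecting each option.
option_pcts = option_counts.div(option_counts.sum(axis=0), axis=1) * 100
print(option_pcts.round(1))

# Illustrative grouping: ratings of 1-2 may signal high support needs,
# 3 some support, and 4-5 little support to implement PS/RtI practices.
needs_high_support = (responses <= 2).mean() * 100
print(needs_high_support.round(1))
```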

  36. Answering Evaluation Questions • Use data to inform evaluation questions • Use data to answer broad/specific questions • Align analysis and data display with evaluation questions • Consider available technology resources to facilitate analyses of data—online administration, automatic analysis, knowledge and skill of personnel

  37. Technical Adequacy Perceptions of RtI Skills Survey

  38. Technical Adequacy Content validity: • Item set developed to represent perceived skills important to implementing PS/RtI • Reviewed by Educator Expert Validation Panel (EEVP) Construct validity: • Factor analysis conducted using sample of 2,184 educators • Three resultant factors

  39. Technical Adequacy (cont.) Internal Consistency Reliability: • Factor 1 (Perceptions of RtI skills applied to academic content): α = .97 • Factor 2 (Perceptions of RtI skills applied to behavior content): α = .97 • Factor 3 (Perceptions of Data Manipulation and Technology Use Skills): α = .94
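For readers who want to run a comparable internal-consistency check on their own data, a minimal sketch of Cronbach's alpha for one factor follows, using the standard formula α = (k / (k − 1)) · (1 − Σ item variances / variance of the total score). The item columns are hypothetical placeholders; this is not the project's actual analysis code.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical columns belonging to one factor, e.g. the academic-content items.
responses = pd.read_csv("skills_survey_responses.csv")
academic_items = responses[["item_01", "item_02", "item_03"]]
print(f"Factor 1 alpha = {cronbach_alpha(academic_items):.2f}")
```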

  40. Interpretation and use of data Perceptions of RtI Skills Survey

  41. Interpretation & Use of Data • Three domains: • Perceptions of skills applied to academic content • Perceptions of skills applied to behavior content • Perceptions of data manipulation and technology use skills • Three methodologies: • Calculate mean at domain level • Calculate mean at item level • Frequency/percentage of respondents who selected each response option • Identify specific skills/skill sets for PS/support

  42. Interpretation & Use of Data (cont.) • Sharing data with stakeholders: • DBLTs, SBLTs, instructional staff • Use data to: • Develop/adjust PD goals • Design training/coaching activities • Facilitate consensus-building discussions re: rationale for PD, patterns, barriers

  43. Facilitating Discussions Sample guiding questions… • To what extent do you believe your school possesses the skills to use school-based data to evaluate core instruction (Tier 1)? Supplemental instruction (Tier 2)? • Based on what staff has learned about data-based decision-making, how consistent are those skills with PS/RtI practices (i.e., to what degree do teams evaluate the effectiveness of core and supplemental instruction)?
