
Assessing the Degree of Implementation when Evaluating a Statewide Initiative


Presentation Transcript


  1. Assessing the Degree of Implementation when Evaluating a Statewide Initiative Patricia Mueller, Ed.D., Patricia Noonan, Ph.D., Amy Gaumer Erickson, Ph.D. & Julie Morrison, Ph.D. SIG/SPDG Evaluators

  2. Purpose of the Session • Explore current trends in assessing implementation of large-scale statewide initiatives. • Examine the extent to which the initiatives use evaluation data to guide their work. • Engage in a dialogue with Project Directors and Evaluators to assess the degree to which initiatives are being implemented as planned.

  3. Understanding Implementation Adherence • Intervention-Level: treatment integrity, intervention fidelity, procedural adherence. Traditional applications: procedural checklists, direct observation, teacher self-report, permanent products. • Implementation-Level: implementation analysis, formative assessment, process evaluation. Traditional applications: compliance in implementing program components.

  4. Trends in Measuring the Degree of Implementation Adherence • Quality, not compliance • Living forms & processes • Electronic, real-time data collection • Triangulation of data (multi-method; multi-informant) • High versus low implementer comparison
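
A minimal sketch, in Python, of the high- versus low-implementer comparison named in the last bullet above: schools are split at the median of an overall fidelity score and mean outcomes are compared across the two groups. The school names, fidelity scores, and outcome measure are hypothetical placeholders, not project data.

    # Minimal sketch of a high- vs. low-implementer comparison, assuming each
    # school has an overall fidelity score (e.g., percent of essential features
    # in place) and an outcome measure (e.g., percent proficient).
    # All names and numbers below are hypothetical.
    from statistics import mean, median

    schools = [
        {"name": "School A", "fidelity": 92, "outcome": 71},
        {"name": "School B", "fidelity": 55, "outcome": 58},
        {"name": "School C", "fidelity": 78, "outcome": 66},
        {"name": "School D", "fidelity": 40, "outcome": 52},
    ]

    # Split at the median fidelity score into high and low implementers.
    cut = median(s["fidelity"] for s in schools)
    high = [s for s in schools if s["fidelity"] >= cut]
    low = [s for s in schools if s["fidelity"] < cut]

    # Compare average outcomes between the two groups.
    print(f"High implementers (n={len(high)}): mean outcome {mean(s['outcome'] for s in high):.1f}")
    print(f"Low implementers  (n={len(low)}): mean outcome {mean(s['outcome'] for s in low):.1f}")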

  5. Innovation Implementation Across Five States • 4 of the 5 projects focus on integrated RtI models (e.g., academics, behavior, differentiated instruction/Understanding by Design); 1 project focuses on building capacity of leadership. • 1 is in its last year of funding; 4 are halfway through the funding cycle. • All are utilizing demonstration sites as the vehicle to build statewide capacity & scale up the initiatives.

  6. Missouri Integrated Model (MIM) • 11 Essential Features of Systems Change • Driven by Building and District-level Teams • Professional Development with Coaching • 14 Pilot Districts • 5 Implementation Facilitators • Management Team • Implementation Team • Advisory Group

  7. www.MIMschools.org/cop

  8. Essential Features Faculty Survey Summary Report

  9. Self-Study Summary

  10. Results of Implementation Facilitator Interviews: How the MIM Supports IFs • Long and short reports summarizing information: Successes Observed, Observed Challenges and Responses, Anticipated Challenges, Essential Features, Supporting the IFs • Informs Management Team discussion and planning • Included in Annual Report to Stakeholders and Pilot Schools

  11. Collaboration Survey (RPDC and State) — Online survey, each area scored 1–5 • Knowledge • Support • Participation • Collaboration • Results compared and discussed
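
A minimal sketch, in Python, of how the 1–5 collaboration survey results could be summarized by area and compared across the RPDC and state respondent groups. The group labels, items, and ratings shown are hypothetical, not the project's actual survey data.

    # Minimal sketch of summarizing a 1-5 collaboration survey by area and
    # respondent group (RPDC vs. state) so the two sets of results can be
    # compared side by side. Responses below are hypothetical.
    from collections import defaultdict
    from statistics import mean

    areas = ["Knowledge", "Support", "Participation", "Collaboration"]

    # Each response: respondent group plus one 1-5 rating per area.
    responses = [
        {"group": "RPDC",  "Knowledge": 4, "Support": 3, "Participation": 4, "Collaboration": 5},
        {"group": "RPDC",  "Knowledge": 5, "Support": 4, "Participation": 3, "Collaboration": 4},
        {"group": "State", "Knowledge": 3, "Support": 4, "Participation": 2, "Collaboration": 3},
        {"group": "State", "Knowledge": 4, "Support": 5, "Participation": 3, "Collaboration": 4},
    ]

    # Collect ratings per (group, area) and report the means for comparison.
    ratings = defaultdict(list)
    for r in responses:
        for a in areas:
            ratings[(r["group"], a)].append(r[a])

    for a in areas:
        rpdc = mean(ratings[("RPDC", a)])
        state = mean(ratings[("State", a)])
        print(f"{a:<14} RPDC: {rpdc:.1f}   State: {state:.1f}")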

  12. Comparison School Data

  13. Ohio’s SPDG: Developing Essential Leadership Practices at the District Level to Improve Instruction for All Students • The Ohio Improvement Process is a structured process for strategic planning based on the use of a connected set of web-based tools. • Participants include teams of administrators from 16 school districts each year (48 school districts in total over 3 years).

  14. Ohio Example: Assessing Implementation on Two Levels • State/Regional Level: Measuring the degree to which staff selection, recruitment, and training of a statewide network of Regional Facilitators were implemented as planned. • District Level: Measuring the degree to which these same implementation drivers are being realized among the participating school districts to achieve the project’s goals.

  15. Ohio Example: Using Data • Two constructs that research has shown to predict implementation have been incorporated into self-assessment measures: • Self-efficacy: District Leadership Team members’ belief in their own ability to perform newly acquired essential leadership practices. • Social validity: The degree to which District Leadership Team members judge the project’s goals, procedures, and outcomes to be acceptable.

  16. Ohio Example: Triangulation • The degree of implementation is being assessed at the district level through a triangulated process involving: • Analysis of functional products • Self-assessments of essential leadership practices (web-based for real-time analysis and feedback) • Interviews of key informants

  17. Assessing Implementation Drivers (Fixsen & NIRN body of work) • Staff selection, recruitment, training, coaching and consultation • Performance assessment • Using data to support decision-making • Facilitative administration supports • Systems interventions

  18. Staff Selection, Recruitment, Training, Coaching & Consultation • State coach team members conduct interviews for regional coaches. • Regional coach job description includes a prescriptive set of knowledge, skills & dispositions. • Six-day training schedule in place for new coaches. • Recognition that replicable training modules need to be developed in order for scaling up to occur.

  19. Performance Assessments • Use of online PD/TA Log • Baseline student-level data collected and trends tracked (e.g., achievement, discipline referrals). • Annual Participating Personnel Survey at the LEA level to assess skill/knowledge acquisition & application. • TBD: How will the effectiveness of the coaches be assessed? How will the effectiveness of the coaching processes be assessed? Fidelity measures? (Fixsen, 5/09)
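
A minimal sketch, in Python, of the kind of baseline-and-trend tracking described above, assuming yearly counts of office discipline referrals per building (hypothetical buildings, years, and counts): the change from baseline and a simple least-squares slope indicate whether referrals are trending up or down.

    # Minimal sketch of trend tracking against a baseline year, using
    # hypothetical yearly discipline-referral counts per building.
    def slope(xs, ys):
        """Ordinary least-squares slope of ys regressed on xs."""
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        den = sum((x - mean_x) ** 2 for x in xs)
        return num / den

    years = [2007, 2008, 2009]  # baseline year first
    referrals = {"Building 1": [310, 265, 240], "Building 2": [150, 160, 155]}

    for building, counts in referrals.items():
        change = counts[-1] - counts[0]  # change from baseline
        trend = slope(years, counts)     # referrals per year
        print(f"{building}: {change:+d} from baseline, trend {trend:+.1f}/year")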

  20. Using Data to Support Decisions • Quarterly formative evaluation data collected; annual process & outcome data reported. • Assessment of organizational functioning (i.e., RtI Blueprint District Self-Assessment): baseline data collected; frequency of administration under discussion. • Increased emphasis on buy-in at all levels of the organization; goal is to use data to assess readiness & commitment. • TBD: How to tie quarterly & annual reports to organizational processes & outcomes? (Fixsen, 5/09)

  21. Facilitative Administration Supports • Learning curve from Year 1 Cohort to Year 2 Cohort. Lesson learned: without full administrative support, initiatives will fail. • TBD: Need to determine the types of supports administrators need to ensure success of the innovation and develop assessment tools. How will the effectiveness of the administrative processes be assessed? (Fixsen, 5/09)

  22. Systems Interventions • All projects identify partnerships with IHEs (institutions of higher education) and family support organizations. • Involvement of the partners is limited. It’s a challenge to bring partners to the table when much of the effort focuses on field work, but without them, the innovations won’t be sustainable. • TBD: Assessment of the partnerships with external systems to ensure success of the initiatives. What strategies are in place or will need to be created for the innovation to work with external partners? (Fixsen, 5/09)

  23. Discussion Questions • To what degree does evaluation of your initiative tend to focus on compliance measures? • To what degree does evaluation of your initiative address depth of implementation and attend to implementation drivers? • How has evaluation practice assisted with data-based decision-making or mid-course corrections in your initiatives?

  24. Contact Information • Pat Mueller • Evergreen Educational Consulting, LLC • eec@gmavt.net; website: www.eecvt.com • Pattie Noonan • University of Kansas • pnoonan@ku.edu; website: www.MIMschools.org • Amy Gaumer Erickson • University of Kansas • aerickson@ku.edu • Julie Morrison • julie.morrison@uc.edu
