
Evaluation Tools As Implementation Drivers



Presentation Transcript


  1. Evaluation Tools As Implementation Drivers An Example from the California SPDG’s ERIA: Effective Reading Interventions Academy

  2. Early ERIA: Effective Reading Interventions Academy • ERIA established in 2003-04 • before the Era of RTI • upper-elementary and middle school focus • a diversity of approaches at sites in regional cohorts • recently developed ERIA 2.0, with a middle and high school focus • Frequently Asked Questions • “What is ERIA?” • “How does this relate to RtI2?” • “What is ‘intervention’ and where do we get it?” • “How do we evaluate this?”

  3. How?

  4. How? Evaluation Tools as Implementation Drivers

  5. Two Integrative Evaluation Tools Serve as Implementation Drivers • Program Guide articulates PD model • introduces and illustrates • contextualizes the training • gets away from “you had to be there” • Implementation Rubric operationalizes PD model • drives ongoing implementation • enables fidelity checks • is possible to evaluate • Everyone is on the same page • Sustainability (beyond funding, staff turnover) • Scale-up (recruit new sites/districts, beyond SPDG) • Diversity of approaches enabled

  6. Evaluation Drives ERIA’s Evidence-based Practices • The Program Guide, a 16-page booklet, explicitly addresses both implementation and intervention practices to guide the design of a site-based program. • The Implementation Rubric is a 10-item instrument which provides a framework for trainers, coaches, site team members, and teachers to evaluate and discuss implementation, fidelity, and next steps. • Some additional tools include: • end-of-event training surveys and three-month follow-ups • feedback and support from cohort coaches and site team • fidelity observations • student data

  7. ERIA’s Evidence-based Practices • The Program Guide articulates a comprehensive set of practices for all stakeholders. Implementation Practices Intervention Practices • Initial Training • Team-based Site-level Practice and Implementation • Implementation Rubric facilitates self-eval • Ongoing Coaching • Booster Trainings • Implementation Rubric reflection on next steps • The 5 Steps of ERIA • Data-informed Decision-making • Screening and Assessment • Progress Monitoring • Tiered Interventions and Learning Supports • Enhanced Literacy Instruction

  8. SPDG Evaluators Li Walter and Alan Wood synthesized content expert input and worked to make it readily accessible to a variety of stakeholders. “5 Steps” Step 1: Identify Step 2: Assess Step 3: Deliver Step 4: Monitor Step 5: Improve Evaluation Tool: The Program Guide

  9. Stakeholders from every level were included to provide input both on implementation and intervention practices. A variety of existing documents and other resources were also synthesized into the Program Guide.

  10. There is a Table of Contents for ease of use.

  11. Implementation practices are outlined over a three-year schedule.

  12. Implementation practices are described in detail.

  13. The Site Team is key to implementing ERIA. It and its supporting structures are detailed.

  14. An expansive list of key roles is described in detail.

  15. While the first half of the Program Guide focuses on implementation practices, the second half focuses on intervention practices and the 5 Steps: Step 1: Identify Step 2: Assess Step 3: Deliver Step 4: Monitor Step 5: Improve

  16. Step 1: Identify “Identify struggling readers through universal literacy screening early in the school year using statewide English-language Arts test scores.”

  17. Step 2: Assess (1 of 2) “Assess the decoding, reading fluency, and comprehension skills of struggling readers to guide intervention placement and instruction.”

  18. Step 2: Assess (2 of 2) Presents a variety of assessment tools and strategies, both for basic and advanced implementation.

  19. Step 3: Deliver (1 of 2) “Deliver interventions to address specific skill needs for success in the core curriculum using evidence-based programs and practices with fidelity.”

  20. Step 3: Deliver (2 of 2) Presents a variety of intervention topics, programs, and models and how they may be appropriate for both basic and advanced implementation.

  21. Step 4: Monitor “Monitor the progress of struggling students to ensure that interventions are helping students improve and to adjust intervention placements accordingly.”

  22. Step 5: Improve “Improve content literacy instructional practices to actively and effectively engage all students in the core curriculum.”

  23. Student Outcomes Past successes in increasing English-Language Arts proficiency, inclusive of Students with Disabilities, are detailed.

  24. Evaluation Tool: Implementation Rubric • The 10 items focus mostly on intervention practices, with site team and fidelity items • The overall tool, and the process of how the rubric is used, drive the implementation practices • Self-evaluate and reflect on learning and implementation • Shared with coaches and trainers to guide activities • Evaluates the fidelity of implementation of both the PD model and the interventions • The former 26-item, 3-point ERIA Checklist lacked the specificity to be meaningful and useful.

  25. Implementation Rubric, Adapted from “Goal Attainment Scales” • Amy Gaumer Erickson and Monica Ballay presented “goal attainment scales” on a June 17 SIG Network webinar: http://www.signetwork.org/content_pages/78 • Rubric explicitly describes 5 implementation levels for each of 10 items: • Levels 1, 2, and 3 reflect the “Not started,” “In progress,” and “Achieved” implementation levels of the former checklist. • Levels 4 and 5 detail concrete steps towards optimal implementation, beyond the basics. • Each implementation level for each item is explicitly described, building more meaning into the tool than our previous checklist format allowed.
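The 10-item, 5-level structure described above can be sketched in code. This is a hypothetical illustration only: the item names, level labels for levels 4-5, scores, and the `summarize` helper are invented for the example and are not taken from the actual ERIA rubric.

```python
# Hypothetical sketch of a 10-item, 5-level implementation rubric.
# Level labels 1-3 mirror the former 3-point checklist; labels for
# levels 4-5 are placeholders, not the rubric's actual wording.

LEVEL_LABELS = {
    1: "Not started",
    2: "In progress",
    3: "Achieved",
    4: "Advanced",   # concrete steps beyond the basics
    5: "Optimal",
}

def summarize(scores):
    """Summarize one site's rubric entry: one 1-5 score per item."""
    if any(s not in LEVEL_LABELS for s in scores.values()):
        raise ValueError("each item must be scored 1-5")
    achieved = sum(1 for s in scores.values() if s >= 3)
    return {
        "items_scored": len(scores),
        "items_at_achieved_or_above": achieved,
        "mean_level": round(sum(scores.values()) / len(scores), 2),
    }

# Example entry for a hypothetical site (item names are placeholders):
entry = {f"item_{i}": level for i, level in
         enumerate([3, 2, 4, 3, 5, 1, 3, 2, 4, 3], start=1)}
print(summarize(entry))
# → {'items_scored': 10, 'items_at_achieved_or_above': 7, 'mean_level': 3.0}
```

Describing each level explicitly per item, as the rubric does, is what allows a summary like this to be meaningful: a "3" on any item always denotes the same described practice, not a rater's impression.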

  26. Implementation Rubric Excel File: Multi-year Tracking and Automated Reports • The same file is used in all three years of ERIA, reporting both the trend and most-recent entries.
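The slide's multi-year reporting idea, one rubric entry per year with a per-item trend and most-recent score, can be sketched as follows. The year labels, item names, scores, and the `trend_report` helper are invented for illustration; the actual tool is an Excel file, not Python.

```python
# Hypothetical sketch of multi-year rubric tracking: one entry per
# year, reported as a per-item trend plus the most recent score.

def trend_report(entries_by_year):
    """entries_by_year maps a year label to {item: 1-5 score}."""
    years = sorted(entries_by_year)        # e.g. "Year 1" .. "Year 3"
    latest_year = years[-1]
    report = {}
    for item in entries_by_year[latest_year]:
        history = [entries_by_year[y][item] for y in years]
        report[item] = {
            "history": history,                 # the trend across years
            "latest": history[-1],              # the most recent entry
            "change": history[-1] - history[0], # growth since year one
        }
    return report

# Invented example data for a hypothetical site:
entries = {
    "Year 1": {"screening": 2, "site_team": 1},
    "Year 2": {"screening": 3, "site_team": 3},
    "Year 3": {"screening": 4, "site_team": 3},
}
print(trend_report(entries))
# → {'screening': {'history': [2, 3, 4], 'latest': 4, 'change': 2},
#    'site_team': {'history': [1, 3, 3], 'latest': 3, 'change': 2}}
```

Keeping all years in one file, as the slide describes, is what makes the trend view possible without any manual re-entry of earlier scores.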

  27. Evaluating Professional Development: Guskey’s Five Critical Levels • All five levels are addressed through various tools, structures, and practices over ERIA’s three-year professional development schedule. • Level 1: Participants’ Reactions • End-of-Event Evaluations • 3-month Follow-up Surveys • Level 2: Participants’ Learning • Enhanced Field-based Training • Implementation Rubric • Coaching Feedback/Support

  28. Evaluating Professional Development:Guskey’s Five Critical Levels • Level 3: Organizational Support and Change • Team-based Implementation • Implementation Rubric • Coaching Feedback and Support • Level 4: Participants’ Use of New Knowledge and Skills • Fidelity Observations Integrated with Coaching • Implementation Rubric’s advanced implementation levels • Level 5: Student Learning Outcomes • Student Data guides Coaching and Booster Training content to address gaps and capitalize on successes

  29. Evaluation Tools as Implementation Drivers

  30. ERIA on the Web: http://calstat.org/effectivereading.html • Li Walter: li@sonic.net • Alan Wood: alan.wood@calstat.org • (707) 287-0054
