
Making Program Evaluation “Visible”






Presentation Transcript


  1. Making Program Evaluation “Visible” October 1, 2014 MAS/FPS Fall Directors’ Institute Oakland Schools & Wayne RESA Collaborative

  2. Let’s Connect • Table Introductions…30 seconds each • Name • District/Role • What is something about program evaluation you hope to learn more about?

  3. “POP” for Today • Purpose: Deepen understanding of the program evaluation process through the use of visual tools. • Objectives: Participants will … • Understand how to use visual tools to plan for program evaluation. • Understand how to select “what” & “when” to monitor to inform the evaluation process. • Describe how the program evaluation process is linked to continuous school improvement. • Products: • “Tri-Fold” of an instructional program • Strategic Map of DIP/SIP goal area

  4. Burning Questions • Write any burning questions NOT addressed in today’s “POP” on an index card. • Hold up when ready.

  5. Setting the Stage: Purpose Matters

  6. Ultimate Evaluation Question Is your “program” meeting its intended purpose?

  7. ESEA, Sect. 1001: Purpose of Title I “The purpose of this title is to ensure that all children have a fair, equal, and significant opportunity to obtain a high-quality education and reach, at a minimum, proficiency on challenging State academic achievement standards and state academic assessments.”

  8. State and Federal Requirements Related to Program Evaluation • MICHIGAN: Annual evaluation of the implementation and impact of the School Improvement Plan; modification of the plan based on evaluation results. • FEDERAL: Annual evaluation of all federal programs (effectiveness & impact on student achievement, including subgroups); modification of the plan based on evaluation results. ISDs/RESAs are required by PA 25 to provide technical assistance to schools and districts to develop annual evaluations. ESEA requires annual evaluations of programs funded by federal programs such as Title I, Parts A, C, and D; Title II; and Title III.

  9. Title I Program-Specific Requirements: Supplemental Services • Targeted Assistance: needs assessment of eligible students; supplemental services provided only to eligible students; research-based strategies; ongoing review of student progress; provide additional support, if needed; revise Title I TA program; provide training for educators. • Schoolwide: measures to ensure students’ difficulties are identified; timely & additional assistance to students having difficulty; research-based strategies; annual evaluation.

  10. MDE Program Evaluation Roll Out • Feb-Mar 2014: Conduct Train the Trainer workshop on Program Evaluation to include representatives from each of ISD/SIFN, OFS, OEII, AdvancED, MICSI, and LEAs. • Mar-Aug 2014: ISD/MDE trainers conduct regional workshops for LEAs. • Spring 2014 (DIP/SIP 2014-15): Include program evaluation activities in the SIP/DIP to support Program Evaluation as part of the Continuous Improvement Process. • 2014-2015 school year: Implement Program Evaluation activities. • June 30, 2015: Report on the evaluation of ONE program using the MDE Program Evaluation Diagnostic (submit in ASSIST); required for approval of the 2015-2016 Consolidated Application. • Summer 2015+: Sustain professional learning by reconvening trainers to discuss successes and challenges and develop the required follow-up training materials and support systems.

  11. Best Evidence Is Collected… when the Right questions are asked. http://www.youtube.com/watch?v=IGQmdoK_ZfY

  12. MDE’s Questions for Evaluation 1. Readiness? 2. Knowledge and skills? 3. Opportunity? 4. Implemented as intended? 5. Impact on students?

  13. ASSIST: Program Evaluation Diagnostic • DESCRIPTION of the program/strategy/initiative • What was the READINESS for implementing the strategy/program/initiative? • Did participants have the KNOWLEDGE AND SKILLS to implement the program? • Was there OPPORTUNITY for implementation? • Was the program IMPLEMENTED AS INTENDED? • IMPACT: What was the impact of the strategy/program/initiative on STUDENT ACHIEVEMENT? • CONCLUSIONS: continue, adjust, or discontinue
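To make the diagnostic’s structure concrete, here is a minimal sketch of one program’s diagnostic record as a simple Python data class. The field names are illustrative only (they are not the ASSIST schema), and the example entries are invented, loosely based on the sample program in the later slides.

```python
# Minimal sketch of one Program Evaluation Diagnostic record.
# Field names are illustrative only; they are not the ASSIST schema.
# The example entries are invented, loosely based on the sample
# Title I Before School Math Program described in later slides.

from dataclasses import dataclass

@dataclass
class ProgramEvaluationDiagnostic:
    description: str              # the program/strategy/initiative being evaluated
    readiness: str                # Q1: what was the readiness for implementation?
    knowledge_and_skills: str     # Q2: did participants have the knowledge and skills?
    opportunity: str              # Q3: was there opportunity for implementation?
    implemented_as_intended: str  # Q4: was the program implemented as intended?
    impact_on_students: str       # Q5: impact on student achievement
    conclusion: str               # continue, adjust, or discontinue

diagnostic = ProgramEvaluationDiagnostic(
    description="Title I Before School Math Program (grades 4-6)",
    readiness="Staff, schedule, and space confirmed before launch.",
    knowledge_and_skills="Interventionist trained in Concrete-Pictorial-Abstract strategies.",
    opportunity="30 min., 2-3x per week before school, all year.",
    implemented_as_intended="Attendance and lesson logs reviewed monthly.",
    impact_on_students="Benchmark and NWEA growth reviewed for participating students.",
    conclusion="adjust",
)
```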

  14. Linking Program Evaluation to Program Planning

  15. Michigan Continuous School Improvement “Plan”: Get Ready, Implement, Monitor, and Evaluate a Strategy/Initiative/Program • Gather & study process data: Q1: Readiness to implement? Q2: Knowledge & skills of implementers? Q3: Opportunity to implement? • Gather & study achievement & process data: Q4: Implemented with fidelity? Q5: Impact on students?

  16. “Evaluation” Begins with Planning • Planning: What will we do to ensure…? • “GR-IM-E”: Getting Ready (1. Readiness, 2. Knowledge/Skills, 3. Opportunity); Implementation/Monitoring (4. Implementing with Fidelity, 5. Impact on Students); Evaluation (Conclusions: Program Effectiveness)

  17. “Body of Evidence” → Program Evaluation • Monitor & adjust DURING implementation. • Draw conclusions: data on adult implementation AND impact on students.

  18. Sources of Evidence Q4: Program Implementation (What adults are doing) • Lesson Plans • Student work, artifacts • Activity Logs • Observation Data, Checklists • Self assessments of practice • Surveys • Staffing; Job Descriptions • Agendas, Minutes • Schedules • Policies & Procedures • Others Q5: Impact on Students • Student Work • Assessment Results • Universal Screening • Benchmark Assessments • Diagnostic Assessments • Observations of Students • Surveys • Others

  19. Debrief: “Take 3” • What is making sense? • What questions might you have?

  20. Using a “Tri-Fold” to ‘See’ Program Components Guiding Question: What did we say we would do?

  21. “Tri-Fold” • Builds shared understanding of the program, strategy or initiative that will be implemented • Articulates critical program components • Purpose • Evidence of need • Expected student outcomes • Expected adult actions • Resources needed • Helps us “see” what to monitor

  22. “Tri-Fold” Case Study: Select a Program* • Select a supplemental program/service from your DIP or SIP. • It should meet as many of the following criteria as possible: • It is an instructional program for students (Tier 2 or 3 program/service). • It has been implemented for at least a semester…a year is preferable. • The process for selecting students to participate in the program is clear to you. • You understand the program/service and can describe it to others. *Program = program, strategy, or initiative

  23. Team Turn and Talk • Be able to answer two questions: 1) WHAT…is the name of the program or service? 2) WHY…does it exist?

  24. Activity: Create “Tri-Fold” • Use blank paper (landscape) • Draw a line near the top to create a “header” • Draw a line near the bottom to create a “footer” • See example on next slide.

  25. “Name of Program” Purpose Statement

  26. Title I Before School Math Program To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards.

  27. Title I Before School Math Program To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards. Fold into thirds.

  28. Tri-fold layout: Header: “Program” Purpose Statement; Panels: Key Components | Eligibility Criteria | Exit Criteria; Footer: Resources

  29. Team Turn and Talk 3) WHAT…are the Key Components or “Critical Features” of the program or service? (Refer to or find them in your DIP/SIP.) • What “defines” this program or service? • What would observers expect to see if the program/service is implemented as intended (based on research)? • What do students “get”?

  30. Tri-fold: Header: “Program” Purpose Statement. Key Components panel: List the key components or “critical features”. What does the program/service look like when implemented well? What do students “get”? Other panels: Eligibility Criteria | Exit Criteria; Footer: Resources

  31. “Title I Before School Math Program”: To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards. Key Components: • 30 min., 2-3x week, a.m. • Breakfast snack • 10:1 student-to-teacher ratio • Students grouped by need • Pre-teaching of skills to support access to grade-level curriculum • Aligned to daily classroom instruction • Use of research-based strategies (i.e., Concrete-Pictorial-Abstract) • Teacher & interventionist plan weekly. Other panels: Eligibility Criteria | Exit Criteria; Footer: Resources/PD

  32. Team Turn & Talk 4) WHAT…are the eligibility and exit criteria for the program? 5) WHAT…are the resources (including PD) needed to implement the program?

  33. Tri-fold: Header: “Program” Purpose Statement. Eligibility Criteria panel: What criteria are used to identify eligible children? Consider the purpose of the program and the needs it was designed to address. Exit Criteria panel: What criteria are used to determine when students exit the program? Consider the eligibility criteria and how you know when a student’s needs have been met. For both panels, include the data source as well as the “cut score” and timeframe. Other panel: Key Components; Footer: Resources/PD

  34. Tri-fold: Eligibility Criteria: • MEAP: Level 3 or 4 for two consecutive years • District Benchmark Assessment: below 65% on 2 of last 3 assessments • NWEA: 0-25th percentile on most recent assessment. Exit Criteria: • MEAP: Level 1 or 2 on most recent assessment • District Benchmark: at least 70% on 2 of last 3 assessments • NWEA: 26th percentile or higher on last assessment • C or better semester grade in math. Other panel: Key Components; Header: “Program” Purpose Statement; Footer: Resources/PD
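The cut scores on slide 34 amount to a decision rule. The sketch below encodes them under stated assumptions: a student is treated as eligible if ANY eligibility criterion is met and as ready to exit only when ALL exit criteria are met. The record fields, the letter-grade handling, and those and/or choices are illustrative assumptions, not part of the slide.

```python
# Illustrative encoding of the slide-34 cut scores as a decision rule.
# Assumptions (not stated on the slide): a student is ELIGIBLE if ANY
# eligibility criterion is met, and EXITS only when ALL exit criteria
# are met. Field names and the grade handling are hypothetical.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class MathRecord:
    meap_levels: List[int]              # most recent first, e.g. [3, 4]
    benchmark_pcts: List[float]         # last three district benchmarks, most recent first
    nwea_percentile: Optional[int]      # most recent NWEA national percentile
    semester_math_grade: Optional[str]  # most recent semester letter grade

def is_eligible(r: MathRecord) -> bool:
    """Eligible if: MEAP Level 3/4 for two consecutive years, OR below 65%
    on two of the last three benchmarks, OR NWEA at or below the 25th percentile."""
    meap_low = len(r.meap_levels) >= 2 and all(lvl in (3, 4) for lvl in r.meap_levels[:2])
    benchmark_low = sum(p < 65 for p in r.benchmark_pcts[:3]) >= 2
    nwea_low = r.nwea_percentile is not None and r.nwea_percentile <= 25
    return meap_low or benchmark_low or nwea_low

def meets_exit(r: MathRecord) -> bool:
    """Exit when: MEAP Level 1/2 on the most recent assessment, AND at least 70%
    on two of the last three benchmarks, AND NWEA at or above the 26th percentile,
    AND a semester math grade of C or better (simplified letter-grade set)."""
    meap_ok = bool(r.meap_levels) and r.meap_levels[0] in (1, 2)
    benchmark_ok = sum(p >= 70 for p in r.benchmark_pcts[:3]) >= 2
    nwea_ok = r.nwea_percentile is not None and r.nwea_percentile >= 26
    grade_ok = r.semester_math_grade in ("A", "A-", "B+", "B", "B-", "C+", "C")
    return meap_ok and benchmark_ok and nwea_ok and grade_ok

student = MathRecord(meap_levels=[3, 3], benchmark_pcts=[60, 72, 58],
                     nwea_percentile=22, semester_math_grade="C-")
print(is_eligible(student), meets_exit(student))  # True False
```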

  35. Tri-fold: Header: “Program” Purpose Statement; Panels: Key Components | Eligibility Criteria | Exit Criteria; Footer: What resources are needed? (materials, PD, food…?)

  36. Completed tri-fold: (1) Name / (2) Purpose: “Title I Before School Math Program”: To improve the foundational math skills of students in grades 4-6 so they can access and achieve grade-level standards. (3) Key Components: • 30 min., 2-3x week, a.m. • Breakfast snack • 10:1 student-to-teacher ratio • Students grouped by need • Pre-teaching of skills to support access to grade-level curriculum • Aligned to daily classroom instruction • Use of research-based strategies (i.e., Concrete-Pictorial-Abstract) • Teacher & interventionist plan weekly. (4) Eligibility Criteria: • MEAP: Level 3 or 4 for two consecutive years • District Benchmark Assessment: below 65% on 2 of last 3 assessments • NWEA: 0-25th percentile on most recent assessment. (4) Exit Criteria: • MEAP: Level 1 or 2 on most recent assessment • District Benchmark: at least 70% on 2 of last 3 assessments • NWEA: 26th percentile or higher on last assessment • C or better semester grade in math. (5) Resources/PD: Manipulatives, snacks, intervention teacher (360 hrs), PD for intervention teacher (registration, sub costs), others.

  37. Debrief: “Take 3” • What is making sense? • What questions might you have?

  38. Tri-fold mapped to the evaluation questions: Getting Ready: 1. Readiness? 2. Knowledge and skills? 3. Opportunity? 4. Implemented as intended? 5. Impact on students? (overlaid on the “Program” Purpose Statement, Key Components, Eligibility Criteria, Exit Criteria, and Resources/PD panels)

  39. Strategic Management: Introduction. Randel Josserand, Chicago Public Schools

  40. Strategic Management: Ensuring School & Student Success. Selected slides from webinar, Feb. 20, 2013. Randel Josserand, Chief of Schools

  41. Typical School Improvement Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8% points) Reading Recovery Program (GR K-2) Guided Reading Program (GR K-2) SS & Reading / Lit Circles (GR 6-8) After School Tutoring (GR 3-8) Balanced Literacy Program (GR 3-5) Extended Small Group (GR K-2) Lunch Bunch Program (GR 3-8) Writer’s Workshop Program (GR 6-8) Lunch Bunch Program (GR 3-8)

  42. Typical School Improvement Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8% points) New Grade Card (GR 6-8) SSR (GR K-2) Reading Recovery Program (GR K-2) Guided Reading Program (GR K-2) SS & Reading / Lit Circles (GR 6-8) After School Tutoring (GR 3-8) Before School Tutoring (GR 3-8) Math Literacy Program (GR 6-8) Computer Program (GR K-2) After School Tutoring (GR 3-8) Balanced Literacy Program (GR 3-5) Extended Small Group (GR K-2) Lunch Bunch Program (GR 3-8) Writer’s Workshop Program (GR 6-8) Reading Workshop (GR K-2) Silent Reading (GR 3-5) Breakfast Club (GR 3-8) Small Group Reading (GR K-2) Lunch Bunch Program (GR 3-8)

  43. Typical School Improvement Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8% points) Running Record (GR K-2) Running Record (GR 3-5) DIBELS Assessment (GR K-1) Interim Assessment (GR 3-5) Interim Assessment (GR 6-8) New Grade Card (GR 6-8) SSR (GR K-2) Reading Recovery Program (GR K-2) Guided Reading Program (GR K-2) SS & Reading / Lit Circles (GR 6-8) After School Tutoring (GR 3-8) Before School Tutoring (GR 3-8) Math Literacy Program (GR 6-8) Reading Recovery Program (GR K-2) Guided Reading Program (GR K-2) SS & Reading / Lit Circles (GR 6-8) After School Tutoring (GR 3-8) Computer Program (GR K-2) After School Tutoring (GR 3-8) Balanced Literacy Program (GR 3-5) Extended Small Group (GR K-2) Lunch Bunch Program (GR 3-8) Writer’s Workshop Program (GR 6-8) Extended Small Group (GR K-2) Lunch Bunch Program (GR 3-8) Balanced Literacy Program (GR 3-5) Writer’s Workshop Program (GR 6-8) Reading Workshop (GR K-2) Silent Reading (GR 3-5) Breakfast Club (GR 3-8) Small Group Reading (GR K-2) Lunch Bunch Program (GR 3-8) Lunch Bunch Program (GR 3-8)

  44. Strategically Managing School Improvement Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8% points) GR-IM-E Running Record (GR K-2) Running Record (GR 3-5) DIBELS Assessment (GR K-1) Interim Assessment (GR 3-5) Interim Assessment (GR 6-8) Develop Project Plans: Each New Intervention MUST Have a Fully Developed Project Plan. Reading Recovery Program (GR K-2) Guided Reading Program (GR K-2) SS & Reading / Lit Circles (GR 6-8) After School Tutoring (GR 3-8) Extended Small Group (GR K-2) Lunch Bunch Program (GR 3-8) Balanced Literacy Program (GR 3-5) Writer’s Workshop Program (GR 6-8)

  45. Strategically Managing School Improvement Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8% points) Running Record (GR K-2) Running Record (GR 3-5) DIBELS Assessment (GR K-1) Interim Assessment (GR 3-5) Interim Assessment (GR 6-8) Identify Fidelity Metrics for Each Intervention: Each New Intervention MUST Have One or More Fidelity Metrics. Reading Recovery Program (GR K-2) Guided Reading Program (GR K-2) SS & Reading / Lit Circles (GR 6-8) After School Tutoring (GR 3-8) Fidelity Metrics Fidelity Metrics Fidelity Metrics Fidelity Metrics Extended Small Group (GR K-2) Lunch Bunch Program (GR 3-8) Balanced Literacy Program (GR 3-5) Writer’s Workshop Program (GR 6-8) Fidelity Metrics Fidelity Metrics Fidelity Metrics Fidelity Metrics

  46. Fidelity METRICS TELL YOU… Are the RIGHT PEOPLE… Doing the RIGHT THINGS… In the RIGHT WAY… At the RIGHT TIME… …for the benefit of Students?

  47. Strategically Managing School Improvement Student Outcome Metric: Improve State Reading Assessment from 56.2% Meet/Exceed to 64.2% M/E (+8% points) Running Record (GR K-2) Running Record (GR 3-5) DIBELS Assessment (GR K-1) Interim Assessment (GR 6-8) Interim Assessment (GR 3-5) Reading Recovery Program (GR K-2) Guided Reading Program (GR K-2) SS & Reading / Lit Circles (GR 6-8) After School Tutoring (GR 3-8) Fidelity Metrics Fidelity Metrics Fidelity Metrics Fidelity Metrics Balanced Literacy Program (GR 3-5) Writer’s Workshop Program (GR 6-8) Extended Small Group (GR K-2) Lunch Bunch Program (GR 3-8) Fidelity Metrics Fidelity Metrics Fidelity Metrics Fidelity Metrics

  48. Progress Monitoring 1 & 2: Fidelity Metric (Process Data) and Outcome Metric (Impact Data)
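Slide 45 requires one or more fidelity metrics per intervention, and slide 48 pairs each fidelity (process) metric with an outcome (impact) metric. A minimal sketch of how a team might record that pairing follows; the intervention and assessment names come from the earlier slides, while the specific fidelity-metric descriptions are hypothetical placeholders.

```python
# Minimal sketch of progress-monitoring records that pair fidelity
# (process) metrics with outcome (impact) metrics per intervention.
# Intervention and assessment names come from the preceding slides;
# the fidelity-metric descriptions are hypothetical placeholders.

from dataclasses import dataclass
from typing import List

@dataclass
class ProgressMonitoringRecord:
    intervention: str
    fidelity_metrics: List[str]   # process data: are the right people doing the right things?
    outcome_metrics: List[str]    # impact data: what are students showing?

monitoring_plan = [
    ProgressMonitoringRecord(
        intervention="Guided Reading Program (GR K-2)",
        fidelity_metrics=["Weekly small-group schedule followed (observation checklist)"],
        outcome_metrics=["Running Record (GR K-2)", "DIBELS Assessment (GR K-1)"],
    ),
    ProgressMonitoringRecord(
        intervention="After School Tutoring (GR 3-8)",
        fidelity_metrics=["Sessions held vs. sessions planned (attendance log)"],
        outcome_metrics=["Interim Assessment (GR 3-5)", "Interim Assessment (GR 6-8)"],
    ),
]

# Slide 45's rule: every intervention carries at least one fidelity metric.
assert all(rec.fidelity_metrics for rec in monitoring_plan)
```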

  49. CPA (Concrete-Pictorial-Abstract) Strategy Implementation Data: “Level of Use” rating scale 0-4; Fidelity = 3.0. (Leadership and Learning Center, 2010)
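The slide reports a single fidelity value on the 0-4 “Level of Use” scale. Assuming (the slide does not say) that the summary is a simple mean of individual ratings, the arithmetic looks like the sketch below; the observation scores are invented for illustration.

```python
# Sketch of the fidelity summary on slide 49, assuming the reported value
# is the mean of individual 0-4 "Level of Use" ratings. The sample ratings
# below are invented for illustration only.

from typing import List

def fidelity_score(ratings: List[float]) -> float:
    """Average Level-of-Use rating across observed implementers (0-4 scale)."""
    if not ratings:
        raise ValueError("No Level-of-Use ratings collected yet.")
    return sum(ratings) / len(ratings)

observed_ratings = [3.5, 2.5, 3.0, 3.0]  # hypothetical classroom observations
print(f"Fidelity = {fidelity_score(observed_ratings):.1f}")  # Fidelity = 3.0
```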
