
Christopher R. Gareis, Ed.D . The College of William & Mary crgare@wm.edu






Presentation Transcript


  1. Principles of Program Evaluation: A Workshop for the Southwest Virginia Professional Education Consortium Christopher R. Gareis, Ed.D. The College of William & Mary crgare@wm.edu

  2. Pre-Session Understandings

  3. Perspective on Our Time Together Today • The Profession of Education The regular and expert exercise of learned judgment and skills in service of one of an individual’s life needs and one of society’s greater purposes. • Zen and the Art of Professional Development • The Principle of Collective Wisdom

  4. Let’s graph ‘em

  5. Shared Understandings • What is a “program”? A planned, multi-faceted series of activities leading to intended outcomes for individuals, groups, and/or a community. • What is “evaluation”? The use of valid and reliable information/data to make judgments.

  6. Definitions of “Program Evaluation” • Judging the worth of a program. (Scriven, 1967) • The process of systematically determining the quality of a school program and how the program can be improved. (Sanders, 2000) • The deliberate process of making judgments and decisions about the effectiveness, direction, and value of educational programs in our schools. (me)

  7. DO YOU KNOW OF…? • …a program that you thought was effective but was discontinued without explanation? • …a program that you believe is likely a waste of time, money, and effort, and yet it continues?

  8. Random Acts of Improvement (diagram contrasting Our Intentions and Our Efforts)

  9. Aligned Acts of Improvement (diagram aligning Our Intentions and Our Efforts)

  10. Perspective

  11. The Role of Evaluative Thinking: a cycle of Planning, Implementing, and Redesigning, with Evaluative Thinking at its center (Frechtling, 2007)

  12. > 20 major models of program evaluation • Seminal Model: Daniel Stufflebeam’s CIPP model (1971) Our focus: • Creating a Logic Model • Focusing the Evaluation • Identifying Performance Indicators • EVALUATIVE THINKING

  13. Our Objectives Today • Understand basic concepts and principles of program evaluation at the school level • Create a logic model that accurately depicts the essential components of and relationships among input, process, and outcome variables of a school-based educational program • Select the appropriate focus for the evaluation of a program • Identify appropriate data sources aligned to the intended educational outcomes of a selected school-level program • Appreciate the importance of engaging in intentional program evaluation as a teacher leader

  14. Talk About Your Program • What is your program? • What does your program do? • Who participates in your program? • Whom does your program serve? • How do they benefit? • How do you know, or how would you know, that your program is a success?

  15. Think Visually • Sketch a visual metaphor of your program.

  16. Every educational program: • Uses resources or INPUTS, such as… • Materials • Funds • Teachers • Students • Engages in activities or PROCESSES, such as… • Instruction • Interventions • Intends to produce certain results or OUTCOMES, such as… • Learning • Achievement • Engendering of certain attitudes or values • Eligibility for a next step in life • Bringing about some change

  17. Visualizing an Educational Program: The Logic Model

  18. What is a logic model? A diagram with text that illustrates and describes the reasoned relationships among program elements and intended outcomes to be attained. A visual theory of change. (Diagram labels: How, Why.) Modified from a guest lecture by John A. McLaughlin (November 16, 2009) at The College of William & Mary

  19. Simple Logic Model

  20. Everyday Logic: Get an antihistamine → Take the antihistamine → Feel better
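The everyday chain above can be sketched as a tiny data structure. This is an illustrative sketch only: the `LogicModel` class and its field names are assumptions for demonstration, not part of the workshop materials.

```python
# A minimal sketch of a logic model as a Python data structure.
# The stages mirror the everyday antihistamine example; the class
# and field names are illustrative, not from the workshop.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)
    processes: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)

    def chain(self) -> str:
        """Render the model as a left-to-right 'theory of change'."""
        return " -> ".join(self.inputs + self.processes + self.outcomes)


everyday = LogicModel(
    inputs=["Get an antihistamine"],
    processes=["Take the antihistamine"],
    outcomes=["Feel better"],
)
print(everyday.chain())
```

The same three-column shape (inputs, processes, outcomes) scales to the school-program examples in the following slides by adding more items per stage.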

  21. Try This • You’re hungry. • Sketch a logic model to address your need. • Oh, but one twist: You have no food in the house. Assumptions

  22. Greenlawn Middle School Common Math Planning Initiative (logic model) • Inputs: Core teachers (math); resource teachers (special education, gifted education) • Processes: Common planning by teachers (frequency, enthusiasm) • Initial Outcomes: Student affect for instruction; student engagement in instruction • Intermediate Outcomes: Student achievement (quarterly class grades in math) • Ultimate Outcomes: Student achievement (6th grade Math SOL; 7th grade Math SOL)

  23. The Reach of Intended Outcomes

  24. Now Try This • The Great Recession has had a broad and, in many places, deep impact on Americans. • One “solution” to our economic morass has been a call from policymakers for improved financial literacy of our citizens. • K-12 schools are seen as the logical place to teach young citizens financial literacy. • Sketch a logic model that depicts this “theory of change” Causality

  25. Source: STEM Education Symposium (retrieved 9/16/12).

  26. A Simple Logic Model: 9th Grade Transition Program

  27. Millett, S. M., & Zelman, S. T. (2005). Scenario analysis and a logic model of public education in Ohio. Strategy & Leadership, 33(2), 33-40.

  28. Source: SUNY NGLC Grant Proposal (retrieved 9/16/12)

  29. A Model for Improving Assessment Literacy (logic model) • Context & Inputs: Experience & expertise of professional developers; experience & expertise of participants (pre-service teachers, in-service teachers, instructional leaders, administrators); psychometric principles translated into practical terms & skills for teachers; state standardized assessments (de facto curriculum?); state &/or school district curriculum; district goals, initiatives, or imperatives • Processes of Professional Development: Explore alignment (C=Ia=A); unpack curriculum (content, cognitive levels); create a Table of Specifications (ToS) or “blueprint”; critique an assessment for validity & reliability; explore uses of a ToS (create an assessment; critique & improve an assessment; create a unit plan assessment; plan instruction; analyze assessment results); apply principles to the spectrum of assessment types & practices • Outcomes for Teachers: Understand role of C=Ia=A alignment; understand relationship between assessment & evaluation; understand & apply concepts of validity & reliability; create assessments that yield valid & reliable data; use a ToS to create, critique, & improve assessments, create unit plan assessments, plan instruction, and analyze assessment results; use assessment data to make decisions (about student learning; for student learning; about assessments; about instruction); contribute to the design & development of common assessments; create good assessment items, prompts, assignments, & rubrics; provide “opportunity to learn” through aligned instruction • Impact on Student Learning: Increased student achievement; exposure to the full curriculum (e.g., depth & skill development); deeper, more meaningful learning

  30. TRY THIS Task: Create a logic model for your program Materials: • Chart paper (in “landscape layout”) • Post-It notes • Markers • Answers to the earlier questions about your program

  31. Gallery Walk (with Docents) • Can you read the logic model (without the docent’s assistance)? • Questions that the logic model raises for you? • Feature(s) you want to “borrow”? • Suggestions to strengthen the logic model?

  32. Why bother with logic modeling? • Seems like a lot of work. • Logic models are too complex—how could I realistically ever know all the variables at play in a complex program! • I don’t “think” this way. • Even if we created a logic model, what would we do with it?!

  33. Limits of Logic Models • Represent intentions not reality • Focus on expected outcomes, which may mean missing out on beneficial unintended outcomes • Challenging to know causal relationships • Does not address whether what we’re doing is right

  34. Benefits of Logic Modeling (McLaughlin, J. A., & Jordan, G. B. (1999). Logic models: A tool for telling your program’s performance story. Evaluation and Program Planning, 22(1), 65-72.) • Builds a shared understanding of the program and expectations for inputs, processes, and outcomes • Helpful for program design (e.g., identifying critical elements for goal attainment and plausible linkages among elements) • Points to a balanced set of key performance indicators

  35. Are you going to New York or by train?

  36. Your question IS your FOCUS • Implementation Fidelity Question: “Are we implementing the program as designed?” Focus: Inputs & Processes • Goal Attainment Question: “Are we seeing evidence that we are achieving our intended outcomes?” Focus: Initial, Intermediate, and/or Ultimate Outcomes

  37. ESL Dual-endorsement Program (logic model) • Inputs: Approved teacher preparation programs (Elem/Sec/SPED program); MDLL courses; MDLL TESOL/ESL courses; VDOE regulations for ESL prep; ESL summer school in LEAs • Processes: Recruitment of candidates; coursework scheduling; field experiences; advising candidates; arranging field experiences (orientation, transportation, supervision); locating field sites; program approval • Outcomes: Dually-endorsed teachers; satisfaction with preparation; impact on student learning

  38. Once you have a logic model and performance indicators, what do you do? Where would implementation fidelity (IF) evaluation occur, and where would goal attainment (GA) evaluation occur? • Determine the intent (and “audience”): • Formative evaluation (e.g., implementation fidelity and improvement—an assumption of continuation) • Summative evaluation (e.g., to continue or discontinue) • Articulate a limited number of relevant evaluation questions • Identify who is needed to conduct the evaluation • Determine how the evaluation will be conducted (time, money, data collection & analysis, compilation & reporting)
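The planning steps above can be sketched as a small evaluation-plan structure that tags each question with its focus and intent. The field names, questions, and indicators below are illustrative assumptions (loosely echoing the Greenlawn example), not a prescribed format.

```python
# A hedged sketch of an evaluation plan: each question is tagged with a
# focus (implementation fidelity vs. goal attainment) and an intent
# (formative vs. summative). All names and values here are illustrative.
evaluation_plan = [
    {
        "question": "Are we implementing common planning as designed?",
        "focus": "implementation_fidelity",  # looks at inputs & processes
        "intent": "formative",               # assumes continuation
        "indicators": ["meeting agendas", "observation notes"],
    },
    {
        "question": "Are quarterly math grades improving?",
        "focus": "goal_attainment",          # looks at outcomes
        "intent": "summative",               # informs continue/discontinue
        "indicators": ["quarterly class grades in math"],
    },
]

# Group questions by focus so each part of the logic model has coverage.
by_focus: dict[str, list[str]] = {}
for item in evaluation_plan:
    by_focus.setdefault(item["focus"], []).append(item["question"])
print(by_focus)
```

Grouping by focus makes it easy to check that an evaluation is not, say, all goal-attainment questions with no fidelity check on the processes that are supposed to produce those goals.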

  39. What is the focus of the program evaluation that would answer each of the following questions (i.e., “Implementation Fidelity” or “Goal Attainment”)? • Do methods used by teachers at Smith High School correspond to the principles of block scheduling introduced last fall? • Did the new reading curriculum result in improved reading abilities among 2nd graders? Increased interest in reading? • How consistently are Instructional Teams developing standards-aligned units of instruction for each subject and grade level (IIA01)? • Which recruitment strategies (of the three that we used) were most effective in involving fathers in Early Head Start? • How effective was the character education program in our middle school? • Did our new guidance program help new ESL students transition socially to our school? • How many AVID teachers have we trained during the past three years? Adapted from Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson (pp. 373-374).

  40. What Information Do You Use to Answer Your Questions?

  41. Examples of Assessment Sourcesin Schools • Student attendance rates • Teacher attendance • PTA membership • Achievement gap among subgroups • Inequity of class enrollment (e.g., proportionally fewer minority students in AP classes) • Special education referral rates • Behavior referrals • Reading levels • SOL scores • PALS scores • DIBELS scores • Benchmark test scores • SOL recovery data • SAT scores • AP scores • Surveys • Class grades • Grade distributions • Algebra I enrollment / completion • College acceptances • Summer school rates • Dropout rates • Retention rates • Acceleration rates • Identification for gifted services • Athletic championships • Debate team competitions • Student demographics • Staff demographics (e.g., years of experience, licensure status) • Family demographics (e.g., income, educational levels) • Financial resources (budget) • Per pupil costs

  42. What assessment sources could you use to gather relevant, accurate, dependable information/data to answer each question? • Do methods used by teachers at Smith High School correspond to the principles of block scheduling introduced last fall? • Did the new reading curriculum result in improved reading abilities among 2nd graders? Increased interest in reading? • How consistently are Instructional Teams developing standards-aligned units of instruction for each subject and grade level (IIA01)? • Which recruitment strategies (of the three that we used) were most effective in involving fathers in Early Head Start? • How effective was the character education program in our middle school? • Did our new guidance program help new ESL students transition socially to our school? • How many AVID teachers have we trained during the past three years? Adapted from Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson (pp. 373-374).

  43. Avoid DRIP (Data Rich, Information Poor): Align Your Data Sources to Your Logic Model

  44. Aligning Focus and Assessment Sources: Process (P) or Outcome (O) Indicator? • Using a classroom observation protocol to determine % of student engagement • Advanced pass rate on high school end-of-course SOL test • Questionnaire of 4th grade math students to determine degree of “mathphobia” • Review of teacher-made lesson plans over the course of a 9-week period to determine % including explicit phonics instruction • Graduation rates • Number of “leveled” books available in the media center tracked over a 3-year period • Survey of students’ attitudes about participating in an after-school tutorial program (e.g., “On a scale of 1-to-5, how much did you enjoy attending the after-school program? How helpful was your after-school tutor?”) • 3rd grade SOL pass rates • % enrollment of minority students in foreign language courses in the high school • Implementation of an advisory period at the middle school level • Change from a standard 7-period bell schedule to an alternating-day block schedule • Average AP scores • Student attendance rates • Teacher attendance rates • Review of committee meeting agendas to determine % that focus on discussion of achievement data • Budget allocation per student • Grade Point Averages
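The sorting exercise above can be sketched as a simple keyed list: tag each indicator "P" (process: what the program does) or "O" (outcome: what results). Only a few of the clearer-cut items are keyed below, and the answers shown are one plausible keying for illustration, not the workshop's official answer key.

```python
# Illustrative process/outcome keying for a subset of the indicators
# on the slide. "P" = process indicator, "O" = outcome indicator.
# These classifications are the author-of-this-sketch's assumption.
indicators = {
    "% of lesson plans including explicit phonics instruction": "P",
    "Change to an alternating-day block schedule": "P",
    "Advanced pass rate on end-of-course SOL test": "O",
    "Graduation rates": "O",
}

# Filtering by tag shows whether the data sources cover both sides
# of the logic model, not just one.
outcome_indicators = [name for name, tag in indicators.items() if tag == "O"]
process_indicators = [name for name, tag in indicators.items() if tag == "P"]
print(len(process_indicators), len(outcome_indicators))
```

A plan whose indicator list filters down to only one of the two tags is a warning sign: all process and no outcome measures (or vice versa) cannot answer both implementation fidelity and goal attainment questions.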

  45. AVID: Advancement Via Individual Determination

  46. (Retrieved 9/16/12)

  47. TRY THIS Tasks: • Imagine undertaking a Goal Attainment evaluation of your program. • Articulate at least 2 evaluation questions • Identify 1-2 performance indicators (aka data sources) that could help answer each question • If time allows, try doing the same for an Implementation Fidelity evaluation. • Materials: • Your logic model

  48. Sage Advice Remember that not everything that matters can be measured, and not everything that can be measured matters. Know your means from your ends.
