
Evaluation: It’s Not Just ‘at the End’



Presentation Transcript


  1. Evaluation: It’s Not Just ‘at the End’. Terry Uyeki, MSEd, Director of Evaluation & Community Services, Terry.Uyeki@humboldt.edu. Program Evaluation Symposium, Sept. 10, 2010

  2. CCRP “Hats”: program evaluation; meeting design & facilitation (including graphic facilitation); qualitative data analysis; community-based participatory research (CBPR)

  3. Collaboration between communities & researchers: CBPR, evaluation, and meeting facilitation

  4. Evaluation: it’s not just for the report to the funder, and it doesn’t just happen at the end. Evaluation is the “systematic investigation of the worth or merit of an object.” Why would you develop an evaluation plan when you design a program or intervention? (Who is evaluation for?)

  5. Why? • Formative/process evaluation: how can we improve as we develop? • Summative/outcome evaluation: what happened? How effective was it?

  6. Project / Program Evaluation: thinking about evaluation (benchmarks) as you develop a project. Project stages: program/project design & delivery; program/project outcomes for participants; program/project dissemination. Evaluation types: process evaluation (what did you do?); outcome evaluation (what did they do?); impact evaluation (how did it affect the problem?)

  7. Some Models for Program Design • W.K. Kellogg Foundation Logic Model • A systematic and visual way to represent the resources and activities of your program and proposed results • Aspen Institute Theory of Change Outcome Framework • Specifying outcomes and assumptions, and backwards-mapping to connect outcomes to your activities • Intervention Mapping (Bartholomew et al., 2000) • Health Promotion program planning: Design, adoption, implementation & evaluation

  8. Kellogg Foundation Logic Model

  9. Logic Model: Family Visitation Program. Activities: family advisors hired and trained as coaches; portfolio of fun activities compiled; PACT program publicized; family advisors plan and conduct visits; annual clinics held; group activities held. Outputs: 40 families participate; enrolled families complete 10 PACT visits. Outcomes: positive change in mediators; increased levels of physical activity and increased fruit & vegetable consumption. Impact: healthy weight in children & families; reduced incidence of Type 2 diabetes.
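The logic model above can be sketched as a plain data structure. This is a hypothetical illustration, not part of the original slides: the stage names follow the standard Kellogg sequence, and the grouping of the PACT items into stages is an assumption made for the sketch.

```python
# Hypothetical sketch: the PACT logic model as a dictionary keyed by
# Kellogg-style stages. The assignment of items to stages is assumed
# for illustration, not taken from the original diagram layout.
logic_model = {
    "activities": [
        "Family advisors hired, trained as coaches",
        "Portfolio of fun activities compiled",
        "PACT program publicized",
        "Family advisors plan, conduct visits",
        "Annual clinics held",
        "Group activities held",
    ],
    "outputs": [
        "40 families participate",
        "Enrolled families complete 10 PACT visits",
    ],
    "outcomes": [
        "Positive change in mediators",
        "Increased levels of physical activity",
        "Increased fruit & vegetable consumption",
    ],
    "impact": [
        "Healthy weight in children & families",
        "Reduced incidence of Type 2 diabetes",
    ],
}

# Reading the stages left to right gives the causal chain the model asserts.
for stage in ("activities", "outputs", "outcomes", "impact"):
    print(stage, "->", len(logic_model[stage]), "items")
```

Writing the model down as data (rather than only as a diagram) makes it easy to attach benchmarks or indicators to each stage later.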

  10. Making Project Objectives SMART. How can evaluation help shape project objectives? • S = Specific • M = Measurable • A = Attainable, Actionable • R = Relevant, Results-focused • T = Time framed

  11. Making Project Objectives SMART • Specific: define “healthy eating” as increased consumption of fruits & veggies • Measurable: grams of fruits and vegetables reported consumed • Attainable: target of 60% of participants will improve F/V consumption; Actionable: parents learn new ways of preparing F/V meals • Relevant: consumption of soda decreases; Results-focused: 75% of families complete at least 8 visits • Time framed: families complete 10 visits within 9 months
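A measurable target like “at least 60% of participants will improve fruit & vegetable consumption” can be checked mechanically once data come in. The sketch below is hypothetical: the paired baseline/follow-up numbers (grams per day) are invented for illustration.

```python
# Hypothetical sketch: checking the 60% improvement target against
# paired (baseline, follow_up) grams/day reports -- invented data.
reports = [(120, 180), (200, 190), (90, 150), (160, 160), (110, 140)]

improved = sum(1 for baseline, follow_up in reports if follow_up > baseline)
share = improved / len(reports)

target = 0.60
print(f"{share:.0%} improved; target met: {share >= target}")
# prints "60% improved; target met: True"
```

The point of the SMART framing is exactly that this check is possible: a vague objective (“eat healthier”) cannot be reduced to a comparison like `share >= target`.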

  12. From “PACT Program participants will eat healthier” to: at least 60% of PACT Program participants will increase consumption of fruits and vegetables. Implications for evaluation: • self-reported meal recall • weight of reported fruits and vegetables consumed • perceived mediators in preparation of meals and consumption of fruits and vegetables • participation rates of program families • satisfaction with PACT family visits • competence of family advisors (adherence to visit protocol; assessment of adults relative to stage of change; coaching skills)
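The results-focused objective elsewhere in the deck (75% of families complete at least 8 visits) likewise reduces to a simple participation-rate computation. This is a hypothetical sketch; the family names and visit counts are invented for illustration.

```python
# Hypothetical sketch: participation rate against the results-focused
# objective that 75% of families complete at least 8 visits.
visits_completed = {"family_01": 10, "family_02": 8, "family_03": 5,
                    "family_04": 9, "family_05": 10, "family_06": 7}

completers = [f for f, n in visits_completed.items() if n >= 8]
rate = len(completers) / len(visits_completed)

print(f"Completion rate: {rate:.0%} (objective met: {rate >= 0.75})")
# prints "Completion rate: 67% (objective met: False)"
```

Tracking this during delivery, rather than only at the end, is what lets a program make mid-course adjustments when participation falls short.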

  13. Logic Model: Family Visitation Program. Activities: family advisors hired and trained as coaches; portfolio of fun activities compiled; PACT program publicized; family advisors plan and conduct visits; annual clinics held; group activities held. Outputs: 40 families participate; enrolled families complete 10 PACT visits. Outcomes: positive change in mediators; increased levels of physical activity and increased fruit & vegetable consumption. Impact: healthy weight in children & families; reduced incidence of Type 2 diabetes.

  14. Aspen Institute Theory of Change Outcome Framework • A graphic representation of the change process, mapping the pathways of change that will be brought about by an intervention/program • From outcomes (long-term goals and the assumptions behind them), backwards-map to the preconditions required to cause the desired change • Connected outcomes = pathways of change
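Backwards mapping from a long-term outcome to its preconditions is, in effect, a reverse walk over a graph. The sketch below is a hypothetical toy example (the precondition graph and its entries are invented); it is not taken from the Aspen Institute materials.

```python
# Hypothetical sketch: backwards-mapping an outcome to every
# precondition it rests on, via a depth-first walk over an
# assumed toy precondition graph.
preconditions = {
    "healthy weight": ["increased F/V consumption", "increased activity"],
    "increased F/V consumption": ["parents learn meal preparation"],
    "increased activity": ["families attend group activities"],
}

def pathway(outcome, graph):
    """Collect every precondition reachable from `outcome`, depth-first."""
    found = []
    for pre in graph.get(outcome, []):
        found.append(pre)
        found.extend(pathway(pre, graph))
    return found

print(pathway("healthy weight", preconditions))
```

Each chain the walk uncovers is one “pathway of change” in the framework’s terms: a connected sequence of preconditions leading up to the long-term goal.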

  15. Project “Superwomen” as a Logic Model

  16. Project “Superwomen” as a Theory of Change Outcome Framework

  17. From “Intervention Mapping”: Defining Performance Objectives for Health Behavior • Negotiate the use of a condom with a partner (based on negotiation theory, Fisher & Ury, 1991) • State mutual goals such as prevention of pregnancy or AIDS • State clearly the intention of using a condom as a prerequisite for intercourse • Listen to partner’s concerns • Pose solutions to partner’s concerns that reference mutual goals & personal requirements

  18. Why Build an Evaluation Plan as Part of Program/Intervention Design? • Not just for the funder… • Keeps objectives realistic • Ensures that objectives are measurable • Helps with QA/QC • Enables you to make program adjustments • What happens if you do wait until the end? • Other reasons?

  19. Contact us at California Center for Rural Policy Humboldt State University 707-826-3400 ccrp@humboldt.edu www.humboldt.edu/~ccrp/
