
Measuring Program Results for Abstinence Education Grantees


Presentation Transcript


  1. Measuring Program Results for Abstinence Education Grantees Vijaya ChannahSorah, Ph.D. Independent Consultant, Results Management

  2. RESULTS
• What do we mean by measuring results?
> Measuring Outputs and Outcomes
> Program Evaluation
> Logic Modeling (as a tool)
• The purpose is to improve program outcomes and to achieve our vision . . . to give our youth knowledge and skills to live a rich, grounded life . . .

  3. Why are Results Important?
• They enable you to know what works well (in what populations, under which conditions)
• They enable you to take corrective actions
• Other states/projects can implement effective program elements and test them in their own areas

  4. TRAINING APPROACH
Walk systematically through:
• Performance Measurement (outputs, outcomes, efficiency, etc.)
• Program Evaluation
• Logic Modeling
• Tying Everything Together

  5. PERFORMANCE MEASUREMENT
• Definitions: outputs, outcomes, etc.
• Setting targets
• How it differs from research / program evaluation

  6. Performance Measurement: Definitions
• Outcomes: the ultimate purpose of the program
• Outputs: intermediate results of activities
• Activities: your processes
• Inputs: resources
• Efficiency: outcomes or outputs over costs (usually)

  7. Performance Measurement: Definitions / Examples
Outcome example:
• Decrease the rate of births to unmarried teenage girls ages 15 to 19. (The 2002 baseline was 35.4%.)
Output example:
• Number of teachers trained in abstinence education (by 200X).
• Activities: developing curriculum, conducting training, etc.
• Inputs: funds, x number of abstinence experts, ideas, etc.
• Efficiency: number of teachers trained over training dollars (a PROCESS efficiency measure).
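The efficiency measure named above is just an output divided by its cost. A minimal sketch of the arithmetic, using invented figures (the real values would come from your program's records):

```python
# Hypothetical figures for illustration only.
teachers_trained = 120          # output: teachers trained this period
training_cost_dollars = 60_000  # input: total training dollars spent

# Process efficiency measure: output over cost, scaled per $1,000.
teachers_per_thousand_dollars = teachers_trained / (training_cost_dollars / 1000)
print(f"Teachers trained per $1,000: {teachers_per_thousand_dollars:.1f}")
# prints: Teachers trained per $1,000: 2.0
```

Tracking this ratio across reporting periods makes it easy to see whether the same training dollars are producing more or fewer trained teachers over time.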

  8. Challenges in Performance Measurement
• Defining "efficiency" for human services programs (e.g., abstinence education)
• Defining outcomes (e.g., family violence prevention)
• Obtaining timely, reliable data

  9. Performance Measurement vs. Program Evaluation: Definitions
The difference between performance measurement and program evaluation:
Performance measurement shows:
• Trends over time
• Comparisons between actual and desired results/outcomes
Program evaluation shows:
• Outcomes relative to what they would have been in the absence of the program
• The program's causal contribution to the observed outcome

  10. PROGRAM EVALUATION
• Definition: program evaluations are systematic, scientifically based studies conducted to assess the impact of a program.
A program evaluation typically applies scientific techniques to determine how confident we can be that a particular OUTCOME was CAUSED BY the intervention(s).
Evaluations examine achievement of program objectives in the context of environmental and external factors, and take those factors into account.

  11. Program Evaluation
Basic approaches:
• Randomized controlled trial
  ________________
  O1   X   O2
  - - - - - - - -
  O1        O2
  ________________
Random assignment to control and experimental groups

  12. Program Evaluation
Basic approaches:
• Quasi-experimental design
  ________________
  O1   X   O2
  - - - - - - - -
  O1        O2
  ________________
No random assignment to control and experimental groups
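One common way to read the two-group O1 X O2 design above is a difference-in-differences comparison: the treatment group's change, minus the change the comparison group showed over the same period. A minimal sketch with invented numbers (the outcome, groups, and values are hypothetical):

```python
# Hypothetical pre/post scores (e.g., % of surveyed youth reporting
# intent to abstain) for a treatment group and a non-random comparison
# group. All numbers are invented for illustration.
treat_pre, treat_post = 40.0, 52.0   # O1, O2 for the group receiving X
comp_pre, comp_post = 41.0, 45.0     # O1, O2 for the comparison group

# Each group's raw change over time.
treat_change = treat_post - treat_pre   # 12.0
comp_change = comp_post - comp_pre      # 4.0

# Difference-in-differences: change in the treatment group beyond the
# trend shown by the comparison group.
did = treat_change - comp_change
print(f"Estimated program effect: {did:.1f} percentage points")
# prints: Estimated program effect: 8.0 percentage points
```

Without the comparison group (the single-group design on a later slide), only the raw 12-point change would be visible, and there would be no way to tell how much of it would have happened anyway.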

  13. Program Evaluation
Basic approaches:
• Longitudinal quasi-experimental design
  _________________________________
  O1   X   O2   X   O3   X   O4
  _________________________________
Observations and treatments (interventions) over time

  14. Program Evaluation
Basic approaches:
• Single-group pre-/post-design
  ________________
  O1   X   O2
  ________________
No comparison/control group. Not recommended: conclusions about causality will be uncertain.

  15. Program Evaluation
Key things to look for (and discuss with your evaluator):
• Type and rigor of the study design
• Are the hypotheses addressed?

  16. Program Evaluation
Key things to look for, continued:
• Timing between pre- and post-tests (if applicable)
• Data collected and methods of collection
• Frequency of data collection
• Are demographics and external factors collected?
• Internal/external validity of the study

  17. Program Evaluation
Using milestones/interim results to take corrective actions:
• Ensure that the evaluation, data gathering, and outcomes measurement allow for interim reporting
• Find out about the data lag (and its implications)
• Determine how interim results can be interpreted (will you know what to change about the program elements/activities?)

  18. Program Evaluation
Reminders:
• Consider evaluating relatively small chunks of the program (you may not be able to conduct an evaluation of the entire project)
• Build the use of interim results into your workplan

  19. Program Evaluation
Reminders (continued):
• Tap into partnerships: are existing data (or other information) available that might be useful?
• Check in "early and often" with your third-party evaluator; stay in close contact
• Design the formats for evaluation reports (interim and final) at the start

  20. Program Evaluation
Awareness of analysis techniques:
• Multiple regression
• Non-response analysis
• Correlation (which is NOT causality!)
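These techniques are usually your evaluator's job, but the warning that correlation is not causality can be made concrete. A minimal sketch (all data invented for illustration) that computes a Pearson correlation from scratch: even a near-perfect correlation between instruction hours and test scores says nothing, by itself, about whether the instruction caused the scores.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Invented data: hours of instruction vs. knowledge-test score.
hours = [2, 4, 6, 8, 10]
scores = [50, 55, 63, 70, 74]

r = pearson_r(hours, scores)
print(f"r = {r:.2f}")
# A very strong correlation, but correlation alone cannot tell us the
# instruction CAUSED the scores; a third factor (e.g., more motivated
# students attend more hours) could drive both. Separating those
# explanations is exactly what the evaluation designs above are for.
```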

  21. Program Evaluation
Feedback loop:
• Learn from the performance measurement and program evaluation process and results
• Incorporate program evaluation into the logic model (discussed next)

  22. LOGIC MODELS
A logic model is a diagram, chart, or picture of all the major elements of your entire project.
The idea is to make it user-friendly to you and your group, and applicable to your purpose.
You will get basic building blocks here and can then tailor the logic model to your needs and preferred ways of thinking.

  23. Overview of Logic Models
• A logic model tracks how we get from our challenges to our solutions and desired long-term outcomes:
  CHALLENGE → INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES → LONG-TERM OUTCOMES (GOALS)
  (in the context of DEMOGRAPHICS, EXTERNAL FACTORS, CONSTRAINTS, and ASSUMPTIONS)

  24. Logic Model Defined
What is a logic model?
• A graphic representation of a program.
• It shows what the program is designed to accomplish, including the services it delivers, the expected results of those services, and the linkages between services and program goals.

  25. Logic Models …can go both ways
  X --------------------------------> Y (e.g., how can we make a raft float?)
  X <-------------------------------- Y (e.g., why did the raft sink?)

  26. Logic Model Uses
Use for program:
• Design
• Budgeting
• Implementation
• Evaluation
• Communication
• Marketing
• Workplanning
So logic models can be used to help plan and manage the whole program.

  27. Logic Model Construction Process
Brainstorming the draft model:
• Establish the scope and context
• Determine challenge(s), outcomes, inputs, activities, outputs, and measurement components

  28. Logic Model Construction Process
Continued:
• Create a model draft
• Express the relationships among/between key components
• Determine evaluation needs/points using dotted-line arrows (solid arrows show known relationships)
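One way to keep a brainstormed draft manageable before drawing the picture is to capture it in a simple data structure. A minimal sketch in Python, where every entry is invented for illustration; the category names follow these slides, and "plausible" corresponds to a dotted arrow (evaluation needed) while "known" corresponds to a solid arrow:

```python
# Hypothetical logic model draft; categories follow the slides,
# entries are invented examples.
logic_model = {
    "challenge": ["High rate of births to unmarried teens ages 15-19"],
    "inputs": ["Grant funds", "Abstinence education experts", "Curriculum ideas"],
    "activities": ["Develop curriculum", "Train the trainers"],
    "outputs": ["Number of teachers trained"],
    "outcomes": ["Decrease rate of births to unmarried teens ages 15-19"],
    "external_factors": ["Media messages", "Peer norms"],
}

# Relationships between components: "known" maps to a solid arrow,
# "plausible" to a dotted arrow marking an evaluation point.
relationships = [
    ("Train the trainers", "Number of teachers trained", "known"),
    ("Number of teachers trained",
     "Decrease rate of births to unmarried teens ages 15-19", "plausible"),
]

for src, dst, kind in relationships:
    arrow = "---->" if kind == "known" else "- - ->"
    print(f"{src} {arrow} {dst}")
```

Listing the relationships explicitly makes the evaluation points easy to spot: every "plausible" arrow is a place where program evaluation is needed to establish the causal link.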

  29. Logic Model Construction Process
Brainstorming during model development:
• Use flipcharts and colored post-its
• Create the model draft step by step
• No idea is "wrong"
• Think creatively!

  30. Logic Model Development
1. Identify the CHALLENGE/social ill: what do we want to improve in the population?
• Out-of-wedlock births
• Relationships before marriage
• Diseases (STDs)
Challenges are often expressed as statements of fact, based on empirical data/statistics. There may be multiple challenges addressed by a single program.

  31. Logic Model Development
2. Identify LONG-TERM OUTCOMES: the ultimate end goals of your program, that is, how your service population will look after your interventions have taken place.
• Decrease out-of-wedlock births
• Increase the proportion of abstinent youths
• Decrease preventable disease (STDs)
Long-term outcomes/goals (like all other outcomes) are usually expressed as changes: you will use words such as "improved," "increased," or "decreased." Ultimate long-term outcomes/goals are sometimes "pie in the sky" or utopian.

  32. Logic Model Development
The remainder of the logic model elaborates how we get from the CHALLENGE to the LONG-TERM OUTCOMES:
  CHALLENGE → INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES → LONG-TERM OUTCOMES (GOALS)
  (in the context of DEMOGRAPHICS, EXTERNAL FACTORS, CONSTRAINTS, and ASSUMPTIONS)

  33. Logic Model Development
3. Identify inputs/resources (personnel, funds, laws/regulations, creative ideas, etc.)
4. Identify activities:
• Training the trainers
• Delivering abstinence education in [churches, community centers, schools]

  34. Logic Model Development
5. Identify outputs, such as the number of trainers trained and the number of training courses developed and administered.
6. Identify/develop key outcomes and measures, and set targets:
• Decrease the rate of births to unmarried teenage girls ages 15-19 (35.4% in 2002). Target: 35% in 2003.
• Decrease the proportion of youth ages 15-19 who have engaged in sexual intercourse (46.7% in 2003). Target: 45.5% for 2004.
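Once a baseline and target are set, performance reporting reduces to comparing the actual value against both. A minimal sketch using the first measure above, with an invented "actual" value for illustration (the baseline and target come from the slide):

```python
# Baseline and target from the slide; the "actual" value is invented
# for illustration only.
baseline_2002 = 35.4   # percent, rate of births to unmarried teens 15-19
target_2003 = 35.0     # percent
actual_2003 = 34.6     # hypothetical measured value

change = actual_2003 - baseline_2002
met_target = actual_2003 <= target_2003  # lower is better for this measure
print(f"Change from baseline: {change:+.1f} points; target met: {met_target}")
# prints: Change from baseline: -0.8 points; target met: True
```

Note that the comparison direction depends on the measure: for a "decrease" outcome the target is met when the actual value is at or below the target, while for an "increase" outcome the test flips.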

  35. Logic Model Development
7. Determine where program evaluation/research needs to take place (and depict the arrows accordingly):
  - - - - - >  = plausible causal relationship (or desired effects/results)
  ---------->  = known causal relationship (based on scientific research/program evaluation)

  36. Logic Model Development
Reminders:
• Logic modeling is a continuous (not static) process
• Incorporate activities/processes into more detailed project workplans
• Do not restrict your thinking: jot down ideas for the workplan or other areas as you think of them while brainstorming
• Use the abstinence-until-marriage (Abstinence Education) logic model template as a starting point and to assist your thinking

  37. Logic Model Development
Complementary tools:
• Workplans
• Strategic plans
• Flowcharts
• Process diagrams
• Related logic models
• Performance budgets containing "global" agency performance measure information

  38. LINKING: How does all this relate?
• Performance measurement and reporting (represented in the boxes of the logic model): the focus is on outcomes/ultimate results
• Program evaluation (represented in the arrows of the logic model): the focus is on impacts
• Logic modeling gives a clear picture of which outcome measures and program evaluations are needed

  39. In Conclusion
• We need to focus on producing results
• We have lots of information! How do we assemble it to help us manage?
• Use logic modeling to track and improve performance (at the federal, state, and grantee levels)

  40. Questions / Discussion
• Questions on any aspect of measuring program results?
• Is it clear how all the aspects relate to one another, and how logic modeling can be used as a powerful tool?

  41. Go Forth, Produce Results, and Measure! . . . for our youth

  42. Questions &amp; Answers
• You may submit questions pertaining to today's webcast until 5:00 p.m. EDT, Wednesday, June 14, 2006, to the following address: cbaeta@youthdevelopment.org
• Answers will be posted at www.pal-tech.com/web/cbaewebcast/ as soon as they are available.
