
How to Plan a Local Evaluation and Lessons Learned


Presentation Transcript


  1. How to Plan a Local Evaluation and Lessons Learned Philip Rodgers, Ph.D. Evaluation Scientist American Foundation for Suicide Prevention 2013 Garrett Lee Smith Combined Annual Grantee Meeting June 11-13, 2013, Washington DC

  2. Acknowledgements U.S. Department of Health & Human Services, Substance Abuse and Mental Health Services Administration; Howard Sudak, MD, AFSP; Katrina Bledsoe, PhD, SPRC

  3. www.sprc.org

  4. Defining Evaluation

  5. Why is evaluation important? • Required by funding agencies. • Improves performance. • Demonstrates effectiveness. • Advances knowledge. "As not everything can be done, there must be a basis for deciding which things are worth doing. Enter evaluation." (M. Q. Patton) Source for Patton quote: U.S. Department of Health and Human Services, Public Health Service. (2001). National Strategy for Suicide Prevention: Goals and Objectives for Action. Rockville, MD: U.S. Department of Health and Human Services, Public Health Service.

  6. What are the types of evaluation? Participatory, formative, summative, responsive, goal-free, empowerment, advisory, accreditation, adversary, utilization, consumer, and theory-driven evaluation. We will address: process, outcome, and impact evaluations.

  7. What is process evaluation? • Process "evaluation assesses the extent to which a program is operating as it was intended." Source: Government Accountability Office. (2005). Performance Measurement and Evaluation: Definitions and Relationships (GAO-05-739SP). Washington, DC: United States Government Accountability Office.

  8. What is outcome evaluation? • Outcome "evaluation assesses the extent to which a program achieves its outcome-oriented objectives." Source: Government Accountability Office. (2005). Performance Measurement and Evaluation: Definitions and Relationships (GAO-05-739SP). Washington, DC: United States Government Accountability Office.

  9. What is impact evaluation? • "Impact evaluation…assesses the net effect of a program by comparing program outcomes with an estimate of what would have happened in the absence of the program." Source: Government Accountability Office. (2005). Performance Measurement and Evaluation: Definitions and Relationships (GAO-05-739SP). Washington, DC: United States Government Accountability Office.
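As a minimal arithmetic sketch of "net effect," with entirely hypothetical numbers: subtract the counterfactual estimate (here, a control group's rate) from the outcome observed in the program group.

```python
# Hypothetical illustration: share of trained gatekeepers who refer at-risk
# youth, versus a control group's rate as the counterfactual estimate.
program_outcome = 0.30   # observed referral rate in the program group
counterfactual = 0.22    # estimated rate had the program not run

net_effect = program_outcome - counterfactual
print(round(net_effect, 2))  # 0.08
```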

  10. I. Logic Models & Evaluation

  11. Logic models can drive evaluations. [Diagram: Generic Gatekeeper Training Logic Model, with process, outcome, and impact evaluation (vs. a control group) mapped onto its stages.]

  12. II. How to Develop Evaluation Questions

  13. Where do evaluation questions come from? • Generally, from the goals listed in a logic model. • More specifically, defined by stakeholders through a collaborative process. • Depending upon circumstances, stakeholders can be funders, participants, trainers, evaluators, and others, or a combination of these.

  14. Divergent, convergent process… • Stakeholders meet with the relevant materials (grant application, logic model, etc.). • After reviewing the materials, they engage in a divergent process: a free association of candidate evaluation questions. • After the divergent process, stakeholders collectively narrow the list of questions to manageable proportions through a convergent process.

  15. III. How to Answer Evaluation Questions

  16. What is measurement? • Measurement is the means you use to collect data. • It includes how you collect data and what data you collect (and how well you collect data).

  17. How will you collect data? • Questionnaires (in-person, mail, email, phone) • Psychological Tests • Interviews • Health Records • Health Statistics • Observations • Logs

  18. Where do you find measures? • Create your own (pilot test!) • Borrow from other evaluations (with permission!) • Search the literature (see Additional Resources) • Use standardized measures (may cost) • Brown (adult) and Goldston (adolescent) reviews • Use existing data sources and records

  19. IV. How to Develop an Evaluation Plan

  20. What evaluation design do you need? • What is your purpose? • Performance assessment? • Evidence of concept? • Evidence of effectiveness?

  21. There are four basic evaluation designs (X = program delivered, O = observation):
  Posttest only: X O
  Pre- and posttest: O X O
  Posttest only w/ control: X O (control: O)
  Pre- and posttest w/ control: O X O (control: O O)
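The X/O notation (X = program delivered, O = observation) can be written out as a small data structure, one timeline per group; this is an illustrative sketch, with names of my own choosing, not part of the presentation.

```python
# X = program/intervention delivered; O = observation (data collection).
# Each design maps to a list of group timelines: the first row is the
# program group, the second row (if present) is the control group.
BASIC_DESIGNS = {
    "posttest only":                [["X", "O"]],
    "pre- and posttest":            [["O", "X", "O"]],
    "posttest only w/ control":     [["X", "O"],
                                     ["O"]],
    "pre- and posttest w/ control": [["O", "X", "O"],
                                     ["O", "O"]],
}

def has_control(design):
    """A design has a control group when some row never receives the X."""
    return any("X" not in row for row in design)
```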

  22. Additional data collection points can be added to the basic designs:
  Posttest only: X O O
  Pre- and posttest: O X O O
  Posttest only w/ control: X O O (control: O O)
  Pre- and posttest w/ control: O X O O (control: O O O)

  23. Best evidence comes when subjects are randomly assigned to groups. [Diagram: a pool of subjects or groups is split at random into an experimental group and a control group.] Random assignment increases the likelihood that subjects in both groups are equivalent with regard to factors that may be related to outcomes.
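A minimal sketch of random assignment using only Python's standard library (the function name is my own, for illustration):

```python
import random

def randomly_assign(pool, seed=None):
    """Shuffle the pool of subjects (or groups) and split it in half:
    the first half becomes the experimental group, the second the
    control group. On average, randomization balances the groups on
    outcome-related factors, whether measured or not."""
    subjects = list(pool)
    random.Random(seed).shuffle(subjects)
    mid = len(subjects) // 2
    return subjects[:mid], subjects[mid:]

experimental, control = randomly_assign(range(20), seed=42)
```

Seeding the generator makes the assignment reproducible for record-keeping; omit `seed` for a fresh randomization.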

  24. If random assignment is not possible… • Compare groups that are similar. • Use a pretest so that group differences—to some extent—are accounted for.
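One common way the pretest is used is a gain-score comparison: subtract each group's pretest mean from its posttest mean, then compare the gains. A sketch with made-up scores (the function is illustrative, not from the presentation):

```python
def mean(xs):
    return sum(xs) / len(xs)

def gain_score_effect(prog_pre, prog_post, comp_pre, comp_post):
    """Estimate the program effect as the program group's pre-to-post
    gain minus the comparison group's gain; subtracting each group's
    own pretest accounts, to some extent, for baseline differences."""
    prog_gain = mean(prog_post) - mean(prog_pre)
    comp_gain = mean(comp_post) - mean(comp_pre)
    return prog_gain - comp_gain

# Made-up scores for illustration: program group gains 5, comparison
# group gains 1, so the estimated effect is 4.
effect = gain_score_effect([10, 12, 11], [15, 17, 16],
                           [11, 13, 12], [12, 14, 13])
print(effect)  # 4.0
```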

  25. Evaluation Planning Forms

  26. Philip Rodgers, PhD American Foundation for Suicide Prevention prodgers@afsp.org 802-446-2446
