
Measuring and Evaluating Impact



Presentation Transcript


  1. Measuring and Evaluating Impact. Prepared for the Iowa Network for Community and School Partnerships, September 5, 2013

  2. Overview • What is Evaluation? • Why Evaluate? • Types of Evaluation • Neglected Evaluation • Power in Evaluation • Evaluation and Research • Measuring Impact • Is Impact “Good Enough”? • When Impact Isn’t “Good Enough” • Evaluation in Continuous Improvement

  3. What is Evaluation? • Systematic Process • Collecting Data • Enhances Knowledge and Decision Making (Russ-Eft & Preskill)

  4. Why Evaluate? • Ensures Quality • Increases Knowledge • Prioritize Resources • Accountability • Demonstrate need and effectiveness • Marketable (Russ-Eft & Preskill)

  5. Types of Evaluation • Developmental • Formative • Summative • Monitoring and Auditing • Outcome Evaluation • Impact Evaluation • Performance Measurement (Russ-Eft & Preskill)

  6. Top 10 Reasons Evaluation is Neglected • Misunderstanding the purpose and role of evaluation • Fear • Lack of Skills (Real or Perceived) • No one has asked for it • It is not used • Time-consuming • Perceived costs vs. Perceived benefit • Leaders already know what works • Prior experience • Lack of Value (Russ-Eft & Preskill)

  7. Power in Evaluation Who has the power in a program evaluation? • The Funder • The Board of Directors • The Executive Director • The staff member conducting the evaluation • The evaluation participants

  8. Evaluation and Research • A continuum • Purpose • Evaluation: Informs the organization • Research: Generates new knowledge – Truth • Audience • Evaluation: clients, funders, internal use • Research: Other researchers • Focus • Evaluation: key evaluation questions, purpose of evaluation • Research: Literature Review, Problem Statement, Research Questions and Hypotheses, Variables to be studied

  9. Evaluation and Research • Design • Evaluation: Bound by Organization • Research: Bound by research and funding • Collecting Data • Same methods • Reliability and Validity • Evaluation: Rooted in values and politics; not concerned with generalizing findings • Research: Attempts to be value free and objective; seeks to generalize findings • Reporting results • Evaluation: Evaluative conclusions, makes recommendations, rarely published • Research: Empirical Conclusions, Suggestions for further research, published

  10. Moving along the continuum • Consider what is required of you in program evaluation • Report of Results vs. Proof • Where are you along the continuum? • Where do you want to be? • How do you get there?

  11. Measures of Impact • Measures of Change • Attitude, Knowledge and Behavior • Comparison • Pre- and Post-program • Calibration • Is the primary impact what is measured? • Is the program driving the measure, or is the measure driving the program?
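
As a concrete illustration of the pre- and post-program comparison above, the following Python sketch computes mean change scores for the three domains named on the slide (attitude, knowledge, behavior). The scores, the survey scale, and the number of participants are assumptions made for the example, not data from the presentation.

    # Hypothetical pre/post change-score calculation; participant scores and the
    # 1-5 survey scale are invented for illustration only.
    from statistics import mean

    pre  = {"attitude": [3.1, 2.8, 3.4], "knowledge": [2.5, 2.9, 3.0], "behavior": [2.2, 2.6, 2.4]}
    post = {"attitude": [3.8, 3.5, 3.9], "knowledge": [3.6, 3.4, 3.8], "behavior": [2.9, 3.1, 2.8]}

    for domain in pre:
        change = mean(post[domain]) - mean(pre[domain])
        print(f"{domain}: pre {mean(pre[domain]):.2f} -> post {mean(post[domain]):.2f} (change {change:+.2f})")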

  12. Is the Impact Good Enough? • Goals • Program established goals • Funder established goals • Comparison • Population level data • Are the target and general population the same? • The continuum of evaluation and research
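
One way to ask whether impact is “good enough” is to set the program’s result beside a population-level benchmark, bearing in mind the slide’s caution that the target and general population may not be comparable. The sketch below uses invented rates; neither figure comes from the presentation.

    # Hypothetical comparison of a program outcome rate to a population-level benchmark.
    # Both figures are invented; a real comparison would also ask whether the target
    # and general population are actually comparable, as the slide notes.
    program_rate = 0.72      # share of participants meeting the outcome
    population_rate = 0.65   # published rate for the general population

    gap = program_rate - population_rate
    direction = "exceeds" if gap >= 0 else "trails"
    print(f"Program {direction} the population benchmark by {abs(gap):.0%}")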

  13. When Impact isn’t “Good Enough” • Re-calibrate Measures • What is impact? • Participant focus groups • What best measures impact? • Refine measurement tools • Staff Training • Re-Pilot surveys • Incorporate Measures of Program Quality • Look at HOW programs and services are offered • Explore field-specific measures of quality • Youth Program Quality Assessment • Quality Rating System

  14. Continuous Improvement Evaluation

  15. Results: Maximizing Evaluation • What is the purpose of the evaluation? • Funder Required? • Is there a requirement to “Act” on the results? • How are the results communicated to all stakeholders? • How are results used in planning?

  16. Contact Information Jennifer Farley Censeo Solutions jennifer@censeosolutions.com 515.371.1754 www.censeosolutions.com

  17. Works Cited Russ-Eft, D., & Preskill, H. (2009). Evaluation in organizations: A systematic approach to enhancing learning, performance, and change. New York: Basic Books.
