
The Evaluation Plan

Understand how evaluation helps improve program effectiveness, informs decision-making, and supports accountability. Learn to focus on implementation and outcomes through logic models. Explore principles and methods for evaluating activities, outputs, and outcomes to enhance program impact and communicate results effectively.


Presentation Transcript


  1. The Evaluation Plan

  2. Session Purpose • To understand how evaluation can be useful • To understand how your logic model helps to focus an evaluation • To understand both implementation and outcome evaluation

  3. What is Evaluation? The systematic collection of information about a program in order to enable stakeholders to better understand the program, to improve program effectiveness, and/or to make decisions about future programming.

  4. What’s in it for you? • Understand and improve your program • Test the theory underlying your program • Tell your program’s story • Be accountable • Inform the field • Support fundraising efforts

  5. Evaluation Principles Evaluation is most effective when it: • Is connected to program planning and delivery • Involves the participation of stakeholders • Supports an organization’s capacity to learn and reflect • Respects the community served by the program • Enables the collection of the most information with the least effort

  6. Logic Model • Program Goals: overall aims or intended impacts • Resources: the resources dedicated to or consumed by the program • Activities: the actions that the program takes to achieve desired outcomes • Outputs: the tangible, direct results of a program’s activities • Outcomes: the benefits to clients, communities, systems, or organizations • External Factors: what else affects the program
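
  The slide describes the logic model as a set of linked components. As a minimal illustrative sketch (not part of the original deck), the same structure can be written down as a small data record; the field names simply mirror the terms on the slide, and the tutoring program is a hypothetical example.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """One program's logic model, using the components named on the slide."""
    goals: List[str]             # overall aims or intended impacts
    resources: List[str]         # resources dedicated to or consumed by the program
    activities: List[str]        # actions the program takes to achieve desired outcomes
    outputs: List[str]           # tangible, direct results of the program's activities
    outcomes: List[str]          # benefits to clients, communities, systems, or organizations
    external_factors: List[str] = field(default_factory=list)  # what else affects the program

# Hypothetical example: a small tutoring program
tutoring = LogicModel(
    goals=["Improve reading proficiency among participating students"],
    resources=["Two staff tutors", "Donated classroom space"],
    activities=["Weekly one-on-one tutoring sessions"],
    outputs=["Number of sessions delivered", "Number of students served"],
    outcomes=["Students' reading scores improve"],
    external_factors=["School curriculum changes"],
)
```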

  7. Putting Your Plans Together • Logic Model: Resources → Activities → Outputs → Outcomes • Evaluation Plan (Implementation): Activities | Outputs | Data Collection Method | Effort • Evaluation Plan (Outcomes): Outcomes | Indicators | Data Collection Method | Effort
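
  To make the mapping on this slide concrete, here is a hedged sketch of the two evaluation-plan tables as plain Python records. The column names come from the slide; the row contents are hypothetical and carry over the tutoring example above.

```python
# Implementation plan rows: Activities, Outputs, Data Collection Method, Effort
implementation_plan = [
    {
        "activity": "Weekly one-on-one tutoring sessions",
        "output": "Number of sessions delivered",
        "data_collection_method": "Attendance log",
        "effort": "Low",
    },
]

# Outcomes plan rows: Outcomes, Indicators, Data Collection Method, Effort
outcomes_plan = [
    {
        "outcome": "Students' reading scores improve",
        "indicator": "Change in reading assessment score over the school year",
        "data_collection_method": "School assessment records",
        "effort": "Medium",
    },
]
```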

  8. Implementation and Outcomes • Evaluating Outcomes: What changes occurred as a result of your work? • Evaluating Implementation: What did you do? How well did you do it?

  9. Evaluating Outcomes • Outcomes: the changes you expect to see as a result of your work • Indicators: the specific, measurable characteristics or changes that represent achievement of an outcome. They answer the question: How will I know it?

  10. Evaluating Outcomes: Indicators = What to Measure • Meaningful • Direct • Useful • Practical

  11. Evaluating Outcomes: Direct v. Indirect Indicators

  12. Evaluating Outcomes: Template

  13. Evaluating Outcomes: Example

  14. Evaluating Implementation • Activities and Outputs: The “what” (the work you did, and the tangible results of that work) • Additional Questions: The “why” (understanding how well you did, and why)

  15. Evaluating Implementation: Understanding how well you did What information will help you understand your program implementation? Think about: • Participation • Quality • Satisfaction • Context

  16. Evaluating Implementation: Template

  17. Evaluating Implementation: Example

  18. Data Collection Determine what methods you will use to collect the information you need: • Choose the method • Decide which people or records will be the source of the information • Determine the level of effort involved in using that method with that population

  19. Data Collection Methods • Review documents • Observe • Talk to people • Collect written information • Pictorial/multimedia

  20. Issues to Consider • Resist pressure to “prove” • Start with what you already collect • Consider the level of effort it will take to gather the data. • Prioritize. What do you need to collect now, and what can wait until later?

  21. Data Collection: Level of Effort • Instrument development • Cost/practicality of actually collecting data • Cost of analyzing and presenting data

  22. Qualitative Data • Usually in narrative form—not using numbers • Collected through focus groups, interviews, open-ended questionnaire items, but also poetry, stories, diaries, and notes from observations

  23. Quantitative Data • Pieces of information that can be expressed in numerical terms, counted, or compared on a scale • Collected in surveys, attendance logs, etc.
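
  As an illustrative aside (not from the original deck), quantitative data of this kind can be summarized in a few lines of code; the attendance figures and satisfaction ratings below are entirely hypothetical.

```python
# Hypothetical attendance log (sessions attended out of 10) and 1-5 satisfaction ratings
sessions_attended = [10, 8, 9, 4, 10, 7]
satisfaction = [5, 4, 4, 3, 5, 4]

attendance_rate = sum(sessions_attended) / (10 * len(sessions_attended))
average_satisfaction = sum(satisfaction) / len(satisfaction)

print(f"Average attendance rate: {attendance_rate:.0%}")        # 80% for this sample
print(f"Average satisfaction (1-5): {average_satisfaction:.1f}")
```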

  24. Both Types of Data are Valuable • Qualitative information can provide depth and insight about quantitative data • Some information can only be collected and communicated qualitatively • Both methods require a systematic approach

  25. What do your data tell you? • Are there patterns that emerge? • Patterns for sub-groups of your population? • Patterns for different components of your program? • What questions do the data raise? • What is surprising? What stands out? • What are other ways the data should be analyzed? • What additional information do you need to collect?
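
  One common way to look for the sub-group and program-component patterns this slide asks about is a simple group-by summary. The sketch below is illustrative only, with made-up columns and data, and assumes pandas is available; it is not a method prescribed by the deck.

```python
import pandas as pd

# Hypothetical participant records: sub-group, program component, and change in an outcome score
records = pd.DataFrame({
    "subgroup":     ["youth", "youth", "adult", "adult", "adult", "youth"],
    "component":    ["tutoring", "mentoring", "tutoring", "tutoring", "mentoring", "tutoring"],
    "score_change": [12, 5, 8, 10, 3, 15],
})

# Average change by sub-group, and by program component, to surface patterns worth exploring
print(records.groupby("subgroup")["score_change"].mean())
print(records.groupby("component")["score_change"].mean())
```

  Results like these raise rather than answer questions: a gap between sub-groups is a prompt to collect more information, not proof of cause.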

  26. Communicating Findings • Who is the information for? • How will you tell them what you know?

  27. Communicating Findings “Information that is not effectively shared with others will not be effectively used.” (Source: Building a Successful Evaluation, Center for Substance Abuse Prevention)

  28. Audience: Who needs the findings, and what do they need? Who are the audiences for your results, and which results? • Staff • Board • Funders • Partners • Other agencies • Public

  29. Different Ways to Communicate Decide what format is appropriate for different audiences: • Written report • Short summaries • Film or videotape • Pictures, displays • PowerPoint presentations • Graphs and other visuals
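
  For the “graphs and other visuals” option, a basic bar chart is often enough. The sketch below uses matplotlib with hypothetical before-and-after indicator values; it is an illustration of the idea, not a format recommended by the original deck.

```python
import matplotlib.pyplot as plt

# Hypothetical indicator values before and after the program
labels = ["Baseline", "End of year"]
avg_reading_score = [62, 74]

fig, ax = plt.subplots()
ax.bar(labels, avg_reading_score, color=["#999999", "#2a7ab9"])
ax.set_ylabel("Average reading score")
ax.set_title("Reading scores before and after tutoring (hypothetical data)")
plt.savefig("reading_scores.png")  # save the image for a report or slide
```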

  30. Whatever communication strategy you choose: • Link the findings to the program’s desired outcomes • Include the “good, the bad, and the ugly” • Be sure you can support your claims • Acknowledge knowledge gaps

  31. Continuous Learning Cycle: Logic Model → Evaluation Planning → Data Collection → Reflection/Improvement → back to the Logic Model

  32. Thanks for Your Participation! Measure results. Make informed decisions. Create lasting change. Innovation Network, Inc., 1625 K Street, NW, 11th Floor, Washington, DC 20006. Phone: (202) 728-0727. Website: www.innonet.org. Robin: extension 104, rkane@innonet.org. Veena: extension 107, vpankaj@innonet.org.
