
Setting the Stage: Workshop Framing and Crosscutting Issues


Presentation Transcript


  1. Setting the Stage: Workshop Framing and Crosscutting Issues Simon Hearn, ODI Evaluation Methods for Large-Scale, Complex, Multi-National Global Health Initiatives, Institute of Medicine 7-8 January, London, UK

  2. What do we mean by complex interventions? • The nature of the intervention: • Focus of objectives • Governance • Consistency of implementation • How it works: • Necessity • Sufficiency • Change trajectory Photo: Les Chatfield - http://www.flickr.com/photos/elsie/

  3. What are the challenges of evaluating complex interventions? • Describing what is being implemented • Getting data about impacts • Attributing impacts to a particular programme Photo: Les Chatfield - http://www.flickr.com/photos/elsie/

  4. Why a framework is needed Wikipedia: Evaluation Methods

  5. Why a framework is needed Image: Simon Kneebone. http://simonkneebone.com/

  6. The Rainbow Framework

  7. DEFINE what is to be evaluated

  8. Why do we need to start with a clear definition? Photo: Hobbies on a Budget / Flickr

  9. Develop initial description • Develop program theory or logic model • Identify potential unintended results

  10. Options for representing logic models

  11. FRAME what is to be evaluated

  12. Make Decision • Frame Decision • Design Evaluation • Frame Evaluation Source: Hobbies on a Budget / Flickr

  13. Identify primary intended users • Decide purpose(s) • Specify key evaluation questions • Determine what ‘success’ looks like

  14. DESCRIBE what happened

  15. Sample • Use measures, indicators or metrics • Collect and/or retrieve data • Manage data • Combine qualitative and quantitative data • Analyze data • Visualize data

  16. Combine qualitative and quantitative data • Enrich • Examine • Explain • Triangulate • Parallel • Sequential • Component • Integrated

  17. UNDERSTAND CAUSES of outcomes and impacts

  18. Impacts Outcomes

  19. As a profession, we often either oversimplify causation or we overcomplicate it!

  20. “In my opinion, measuring attribution is critical, and we can't do that unless we use control groups to compare them to.” Comment in an expert discussion on The Guardian online, May 2013

  21. Check that the results support causal attribution • Compare results to the counterfactual • Investigate possible alternative explanations

  22. SYNTHESIZE data from one or more evaluations

  23. Was it good? Did it work? Was it effective? For whom did it work? In what ways did it work? Was it value for money? Was it cost-effective? Did it succeed in terms of the Triple Bottom Line?

  24. How do we synthesize diverse evidence about performance?

  25. Synthesize data from a single evaluation • Synthesize data across evaluations • Generalize findings

  26. REPORT and SUPPORT USE of findings

  27. “I can honestly say that not a day goes by when we don’t use those evaluations in one way or another.”

  28. Identify reporting requirements • Develop reporting media • Ensure accessibility • Develop recommendations • Support use

  29. MANAGE your evaluation

  30. Understand and engage with stakeholders • Establish decision-making processes • Decide who will conduct the evaluation • Determine and secure resources • Define ethical and quality evaluation standards • Document management processes and agreements • Develop evaluation plan or framework • Review evaluation • Develop evaluation capacity

  31. Making decisions: look at the type of question • DESCRIBE – descriptive questions: Was the policy implemented as planned? • UNDERSTAND CAUSES – causal questions: Did the policy change contribute to improved health outcomes? • SYNTHESIZE – synthesis questions: Was the policy overall a success? • REPORT & SUPPORT USE – action questions: What should we do?

  32. Making decisions: compare pros and cons

  33. Making decisions: create an evaluation matrix
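An evaluation matrix can be kept as simple structured data: one row per key evaluation question, mapped to the data sources and methods intended to answer it. The sketch below is a minimal, hypothetical Python illustration (the data sources and methods listed are assumptions for illustration, not taken from the workshop); the question texts reuse examples from slide 31.

```python
# Hypothetical evaluation matrix: each key evaluation question (KEQ)
# is paired with the data sources and analysis methods that address it.
# Sources and methods here are illustrative placeholders.
evaluation_matrix = [
    {
        "question": "Was the policy implemented as planned?",  # descriptive
        "sources": ["programme records", "site observations"],
        "methods": ["document review", "descriptive statistics"],
    },
    {
        "question": "Did the policy change contribute to improved health outcomes?",  # causal
        "sources": ["routine health data", "key-informant interviews"],
        "methods": ["comparison to a counterfactual", "checking alternative explanations"],
    },
]

def questions_without_sources(matrix):
    """Flag KEQs that no planned data source currently addresses."""
    return [row["question"] for row in matrix if not row["sources"]]

print(questions_without_sources(evaluation_matrix))  # prints []
```

Keeping the matrix as data makes gaps easy to audit: a question with an empty `sources` list is a question the evaluation design cannot yet answer.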

  34. www.betterevaluation.org

  35. [Diagram: the BetterEvaluation website links documenting and sharing activities – R & D, events, descriptions, comments, examples, guides and tools]

  36. Founding Partners Financial Supporters

  37. For more information: www.betterevaluation.org/start_here
