
VIII Konferencja Ewaluacyjna Using Mixed Methods of Evaluation for Policy Making Purposes

dr Philip Davies, 8th Evaluation Conference, Evaluation in the System of Public Policies, Warsaw, Poland, 12-13 November 2012.



Presentation Transcript


  1. VIII Konferencja Ewaluacyjna Using Mixed Methods of Evaluation for Policy Making Purposes dr Philip Davies

  2. 8th Evaluation Conference, Evaluation in the System of Public Policies, Warsaw, Poland, 12-13 November 2012. The Challenges Ahead for Impact Evaluation Studies in Poland: Using Mixed Methods of Evaluation for Policy Making Purposes. Philip Davies, International Initiative for Impact Evaluation [3ie]

  3. What Is To Be Evaluated? • Intervention effectiveness - what works? • Implementation effectiveness – how/why it works? • Diversity of effectiveness across different groups – what works for whom and when? • Experiential effectiveness - users’ views • Resource effectiveness - at what cost/benefit?

  4. Which translates into: • Theory-based evaluation – How is a policy/intervention supposed to work? • Impact (or summative) evaluation – Does the policy (programme, intervention) work? • Process (or formative) evaluation – How, why, and under what conditions does the policy (programme, intervention) work? • Developmental evaluation – Supporting innovation and adaptation: providing feedback, generating learning, supporting direction or affirming changes of direction in real time; building the evidence base over time.

  5. The key to the challenges ahead is: What’s the Question?

  6. Evidence for Policy • How is the policy supposed to work? – Logic Model / Theories of Change

  7. Evaluation: Theory of Change/Logic Model • How is a policy/programme supposed to work? • What activities, mechanisms, people have to be in place? • And in what sequence – what is the causal chain? • What resources are required – and are available? • What data are required – and are available? • Is the policy/programme feasible/achievable?

  8. Evidence for Policy • How is the policy supposed to work? – Logic Model / Theories of Change • What is already known about this issue? – Research Synthesis (Harness Existing Evidence)

  9. Types of Research Synthesis • Statistical Meta-Analyses (6-18 months) • Narrative Systematic Reviews (6-12 months) • Rapid Evidence Assessments (1-3 months) • Evidence Maps and Gap Maps (1 month) • Meta-Ethnography / Qualitative Synthesis (6-12 months)
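The statistical meta-analysis listed above can be illustrated with a minimal sketch. The fixed-effect, inverse-variance method below is one standard pooling approach; the effect sizes and variances are hypothetical, purely for illustration.

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted pooled effect and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical standardised mean differences and variances from three studies
pooled, se = fixed_effect_meta([0.30, 0.10, 0.20], [0.01, 0.02, 0.04])
print(f"pooled effect = {pooled:.3f}, "
      f"95% CI = ({pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f})")
# → pooled effect = 0.229, 95% CI = (0.080, 0.377)
```

Note that more precise studies (smaller variances) pull the pooled estimate towards them; a random-effects model would additionally allow for between-study heterogeneity.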

  10. Evidence for Policy • How is the policy supposed to work? – Logic Model / Theories of Change • What is already known about this issue? – Research Synthesis (Harness Existing Evidence) • What is the nature, size and dynamics of the problem? – Descriptive and Experiential Evidence (Administrative Data, Surveys, Census Data, Qualitative Research)

  11. Evidence for Policy • How is the policy supposed to work? – Logic Model / Theories of Change • What is already known about this issue? – Research Synthesis (Harness Existing Evidence) • What is the nature, size and dynamics of the problem? – Descriptive and Experiential Evidence (Administrative Data, Surveys, Census Data, Qualitative Research) • What has been shown to work elsewhere? – Evidence of Proven Effectiveness (Experimental and Quasi-Experimental Evidence)

  12. Experimental and Quasi-Experimental Evidence • Randomised Controlled Trials • Matched Comparison Designs - Propensity Score Matching • Difference-in-Differences • Regression Discontinuity Designs • Interrupted Time Series Designs • Instrumental Variable Analysis
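Of the designs listed above, difference-in-differences is perhaps the simplest to illustrate: the change in the treated group's outcome is compared against the change in a comparison group over the same period. The sketch below uses hypothetical enrolment figures; a real analysis would normally use a regression framework with covariates and appropriate standard errors.

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """DiD estimate: (treated group's change) minus (control group's change)."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

# Hypothetical school-enrolment rates (%) before/after a cash-transfer pilot
treated_before = [70, 72, 68]   # mean 70
treated_after  = [82, 80, 78]   # mean 80
control_before = [71, 69, 70]   # mean 70
control_after  = [74, 73, 72]   # mean 73

print(diff_in_diff(treated_before, treated_after,
                   control_before, control_after))  # → 7.0
```

The estimate nets out the trend common to both groups (+3 points) from the treated group's change (+10 points), which is valid only under the parallel-trends assumption.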

  13. Evidence for Policy • How is the policy supposed to work? – Logic Model / Theories of Change • What is already known about the problem/policy? – Research Synthesis (Harness Existing Evidence) • What is the nature, size and dynamics of the problem? – Descriptive Analysis and Experiential Evidence (Administrative Data, Surveys, Census Data, Qualitative Research) • What has been shown to work elsewhere? – Evidence of Proven Effectiveness (Experimental and Quasi-Experimental Evidence) • How do we make the policy work? – Implementation Evidence (Case Studies, Interviews, Focus Groups, Ethnography, Operations Research)

  14. Impact Evaluations • Evaluations of Outcome Attainment (Have targets been met?) • Theory of Change • Data Point 1, Data Point 2, Data Point 3 • Case Studies, Interviews, Focus Groups, Ethnography • Admin Data, Surveys, Operations Analysis • Priority Review

  15. Evidence for Policy • How is the policy supposed to work? – Logic Model / Theories of Change • What is already known about the problem/policy? – Research Synthesis (Harness Existing Evidence) • What is the nature, size and dynamics of the problem? – Descriptive Analysis and Experiential Evidence (Administrative Data, Surveys, Census Data, Qualitative Research) • What has been shown to work elsewhere? – Evidence of Proven Effectiveness (Experimental and Quasi-Experimental Evidence) • How do we make the policy work? – Implementation Evidence (Case Studies, Interviews, Focus Groups, Ethnography, Operations Research) • What are the costs and benefits of the policy? – Economic and Econometric Evidence (Cost-Benefit / Cost-Effectiveness / Cost-Utility Analysis)

  16. Economic Appraisal • No policy, programme or project should be adopted without first having the answer to these questions: • Are there better ways to achieve this objective? [Cost-Effectiveness Analysis] • Are there better uses for these resources? [Cost-Benefit Analysis]
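The two appraisal questions map onto two simple calculations: cost per unit of outcome (cost-effectiveness) and discounted net benefit (cost-benefit). A minimal sketch with hypothetical figures; the 3.5% discount rate is an illustrative assumption, not a prescribed value.

```python
def cost_effectiveness_ratio(cost, effect):
    """Cost per unit of outcome achieved (e.g. per extra school-year gained)."""
    return cost / effect

def net_benefit(benefits, costs, rate=0.035):
    """Net present value of yearly benefit/cost streams, discounted at `rate`.

    Year 0 is undiscounted; later years are divided by (1 + rate) ** t.
    """
    return sum((b - c) / (1 + rate) ** t
               for t, (b, c) in enumerate(zip(benefits, costs)))

# Hypothetical programme: total cost vs school-years gained
print(cost_effectiveness_ratio(500_000, 2_000))  # → 250.0 per school-year

# Hypothetical 3-year benefit and cost streams (thousands of currency units)
print(round(net_benefit([0, 300, 400], [500, 100, 50]), 2))
```

A positive net benefit means the resources generate more value than they consume at the chosen discount rate; comparing cost-effectiveness ratios across alternative programmes answers the "better ways to achieve this objective" question.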

  17. Evidence for Policy • How is the policy supposed to work? – Logic Model / Theories of Change • What is already known about the problem/policy? – Research Synthesis (Harness Existing Evidence) • What is the nature, size and dynamics of the problem? – Descriptive and Experiential Evidence (Administrative Data, Surveys, Census Data, Qualitative Research) • What has been shown to work elsewhere? – Evidence of Proven Effectiveness (Experimental and Quasi-Experimental Evidence) • How do we make the policy work? – Implementation Evidence (Case Studies, Interviews, Focus Groups, Ethnography, Operations Research) • What are the costs and benefits of the policy? – Economic and Econometric Evidence (Cost-Benefit / Cost-Effectiveness / Cost-Utility Analysis) • What are the ethical implications of the policy? – Ethical Evidence (Social Ethics, Public Consultation)

  18. The Evaluation Process • Theorise – Theory of Change • Specify – Activities, Mechanisms, People, Resources • Test – Impact and Process Evaluations • Review – Systematic Review of All Evaluations • Refine, Re-Test, Accumulate

  19. An Example: Conditional Cash Transfers • Theorise (Theory of Change) – CCTs incentivise parents to send children to school; improve enrolment and attendance; improve learning outcomes • Specify (Activities, Mechanisms, People, Resources) – by enforcing the conditionality of schooling; changing the behaviour of parents and children; altering the relative costs and benefits of schooling versus other uses of children's time • Test (Impact and Process Evaluations) – >100 impact evaluations and many process evaluations, testing CCTs in Latin America, Asia and Africa • Review (Systematic Review of All Evaluations) – Petrosino et al. (2012), Quality Education For All Children, a systematic review of 23 studies: CCTs increase enrolments and attendance, but show no overall impact on learning outcomes.

  20. Refine, Re-Test, Accumulate • Refine – deadweight issues; levels of payment; payment recipient; reward performance; measure learning outcomes; supply-side interventions needed; teacher quality a mediating factor; phased transfers? • Re-Test – need to re-test refinements/mechanisms; focus on contextual specificity • Accumulate – build an evidence base over time; establish generalisability and context specificity; avoid Type I/II errors

  21. Thank you Philip Davies Email: pdavies@3ieimpact.org +44 (0)207 958 8350 Visit www.3ieimpact.org
