
Methodological Lessons: Joint Evaluation of Conflict Prevention and Peace-Building in DRC, 2002-2010
Day 2, Presentation in Oslo, 'What Have We Learnt From the Application Phase?', 17 February 2011


Presentation Transcript


  1. Methodological Lessons: Joint Evaluation of Conflict Prevention and Peace-Building in DRC, 2002-2010. Day 2, Presentation in Oslo, 'What Have We Learnt From the Application Phase?', 17 February 2011

  2. The General Feedback on the Draft Working Guidance
  • Feedback received from the Congo evaluation team, but also from recent work with USAID-OTI and AusAID, and from training given by Channel Research on the evaluation of CPPB.
  • The Guidance is a useful introduction, especially for practitioners with limited experience of evaluation. Particularly useful are the overview of key steps and the development of the evaluation questions.
  • Users are intrigued by Theories of Change and the importance of conflict analysis, and find Theories of Change better suited to the project level.
  • However, the Guidance does not tackle the "how to" questions: conflict analysis (which, when, how), testing theories of change, baselines.
  Channel Research

  3. Mandate & Steering The Evaluative Framework Used for the Congo Evaluation 1. Evaluation Questions & Criteria Effects on the Conflict Outcomes 2. Sample of Projects 4. Research Hypotheses and Themes 3. Conflict Analysis

  4. Mandate & Steering The Evaluative Framework Used for the Congo Evaluation Good to have simple questions but tricky to analyse by “results”. 1. Evaluation Questions & Criteria Effects on the Conflict Outcomes 2. Sample of Projects 4. Research Hypotheses and Themes 3. Conflict Analysis

  5. Mandate & Steering The Evaluative Framework Used for the Congo Evaluation Issue N° 1: No Theories of Change Confused Strategies 1. Evaluation Questions & Criteria Effects on the Conflict Outcomes 2. Sample of Projects 4. Research Hypotheses and Themes 3. Conflict Analysis

  6. Reasons for the Weakening of Theories of Change & Strategies as Reference
  1. Seizing Opportunities
  • Implementation requires a response to constraints and opportunities as they arise.
  • These cannot be planned for in advance, so plans and objectives are not a good evaluation point of reference.
  2. Compartmentalisation
  • Too many levels and too many instruments to carry out an analysis.
  • Severe lack of documentation, issues of disclosure, many "indirect" effects.
  3. Problems of Reconstruction
  • Theories of Change were conceived as an elicitive tool in planning.
  • Retrospective interpretation opens up real chances of contested findings, with no clear data sets.
  4. End State?
  • Problems in defining terms such as peace and conflict.
  • Usually depends on the identification of an end-state.
  • This may not be explicit, or shared by all stakeholders, or even by those within a single programme.
  Channel Research

  7. Mandate & Steering The Evaluative Framework Used for the Congo Evaluation 1. Evaluation Questions & Criteria Effects on the Conflict Outcomes 2. Sample of Projects 4. Research Hypotheses and Themes Issue N° 2: No Conflict Analyses Used, and No Conflict Sensitivity Analysis 3. Conflict Analysis

  8. Our understanding of peace depends on where we are

  9. Mandate & Steering The Evaluative Framework Used for the Congo Evaluation 1. Evaluation Questions & Criteria Issue N° 3: Finding the right balance in the sample is not easy Effects on the Conflict Outcomes 2. Sample of Projects 4. Research Hypotheses and Themes 3. Conflict Analysis

  10. The Question of Sampling
  • An excessive number of projects.
  • A database selected before the evaluation, and great difficulty in accessing information for these projects.
  • Consequence: 80% of the team's time was spent obtaining information (to avoid focusing only on the more transparent projects).
  • The sample may not represent the full range of interventions.
  • Strategic dimensions were lost, and the case study approach was weakened.
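  The balance problem described on this slide lends itself to stratified sampling: drawing from each intervention type separately, so the sample is not dominated by the best-documented projects. The sketch below is illustrative only and is not part of the evaluation's own method; the projects records, the "type" field, and the stratified_sample helper are all hypothetical.

    import random
    from collections import defaultdict

    def stratified_sample(projects, key, per_stratum, seed=42):
        # Group projects into strata (e.g. by intervention type), then draw
        # the same number from each stratum so no single, well-documented
        # category dominates the evaluation sample.
        rng = random.Random(seed)
        strata = defaultdict(list)
        for project in projects:
            strata[key(project)].append(project)
        sample = []
        for group in strata.values():
            sample.extend(rng.sample(group, min(per_stratum, len(group))))
        return sample

    # Hypothetical project records with an intervention-type label.
    projects = [
        {"name": "P1", "type": "security sector reform"},
        {"name": "P2", "type": "security sector reform"},
        {"name": "P3", "type": "community reconciliation"},
        {"name": "P4", "type": "elections support"},
    ]
    print(stratified_sample(projects, key=lambda p: p["type"], per_stratum=1))

  A fixed seed keeps the draw reproducible, which matters if the sample has to be defended to a steering committee.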

  11. Are We on the Right Track? Channel Research

  12. Evaluation Communities Must Become More Closely Engaged
  [Chart: evaluation practices — results reporting, regular reporting, public workshops, donors only, good consultation, Theories of Change, many field visits, drivers-based, desk-bound, simple evaluation questions, internal & static conflict analysis, resource-conscious Steering Committee — rated from negative to positive change (--, -, 0, +, ++) against engagement / public uptake.]
  Key factor A: a strong Steering Committee, willing to arbitrate between approaches and to track the evaluation team in country(ies).
  Key factor B: a highly structured approach, with detailed phasing and a willingness to communicate.
  Channel Research

  13. Specific Recommendation for the Final Guidance: Recommend Care in 3 Key Stages and Outline the Unique Nature of Conflict Interventions
  [Diagram: three key stages requiring care — Conflict Analysis, Case Studies, Drawing Conclusions (from outcome to factors; optimal scenario) — and their sequencing.]
  Channel Research

  14. Specific Recommendation for the Final Guidance: Recommend Care in 3 Key Stages and Outline the Unique Nature of Conflict Interventions
  Conflict analysis must precede sampling.
  [Diagram: the same three stages sequenced across the evaluation phases — ToR, Preparation, Collection, Analysis.]
  Channel Research
