
Experiences, Trends, and Challenges in Management of Development Assistance Operators for Effective Monitoring & Evaluation of Results
Linda Morra Imas, Independent Evaluation Group, World Bank Group






Presentation Transcript


  1. Experiences, Trends, and Challenges in Management of Development Assistance Operators for Effective Monitoring & Evaluation of Results. Linda Morra Imas, Independent Evaluation Group, World Bank Group

  2. WHY EVALUATE? • It’s a burden • Ancient history department • No one wants to read those reports • We know the problems and are already fixing them

  3. We Evaluate Because … • Officials are accountable for their use of public funds • We are learning organizations “Many development schemes and dreams have failed. This is not a reason to quit trying. It is cause to focus continually and rigorously on results and on the assessment of effectiveness.” Robert B. Zoellick, President of the World Bank Group

  4. The Power of Measuring Results • If you do not measure results, you cannot tell success from failure. • If you cannot see success, you cannot reward it. • If you cannot reward success, you are probably rewarding failure. • If you cannot see success, you cannot learn from it. • If you cannot recognize failure, you cannot correct it. From Osborne and Gaebler (1992), Reinventing Government

  5. How do you tell the difference between success and failure? • We need a monitoring system to track our key indicators so we know whether we are getting the change we anticipated. • We need an evaluation system to tell us: - Are we doing the right things? - Are we doing things right? - Are there better ways of doing them?
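(Illustrative aside, not part of the original slides: a minimal Python sketch of what tracking key indicators against anticipated change can look like in practice. The indicator names, values, and the 50% "on track" threshold are all invented for illustration.)

```python
# Hypothetical sketch: compare each indicator's current value with its
# baseline and target to flag whether the anticipated change is materializing.
# All names, numbers, and the 50% "on track" threshold are illustrative only.

indicators = {
    # name: (baseline, current, target)
    "primary_enrollment_rate": (0.62, 0.68, 0.80),
    "households_with_clean_water": (0.40, 0.41, 0.60),
}

for name, (baseline, current, target) in indicators.items():
    planned = target - baseline
    achieved = current - baseline
    progress = achieved / planned if planned else 0.0
    status = "on track" if progress >= 0.5 else "off track"
    print(f"{name}: {progress:.0%} of planned change achieved ({status})")
```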

  6. For Credible Evaluation… • Adequate resources for evaluation • People with the right skills • Consultation with partners, beneficiaries, and other key stakeholders • Understanding of the theory of change • Sound evaluation design • Valid methods • Relevant and accurate data collection, including baseline data • Communication • Lessons identification • Recommendations and a management action tracking system

  7. “The golden rule of evaluation is that it should be protected from capture by program managers and donors.” Robert Picciotto, 2008 • There are trade-offs between internal and external evaluation • Using consultants is no guarantee of independence • Independence: organizational, behavioral, absence of conflict of interest, freedom from external influences • Independent evaluation and self-evaluation are complementary and synergistic

  8. Trends: The New Impact Evaluation Ingredients: • A skeptical public not convinced that aid makes a difference • Media questioning the ability to provide evidence of attribution, e.g. NYT: “…only 2% of WB projects properly evaluated.” → Demand for rigorous evaluations = RCTs, aka “the Gold Standard,” and, if not possible, quasi-experimental designs → Campbell Collaboration (2000), Global Development Center (2001), Mexico legislation (2001), Poverty Action Lab (2003)
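(Illustrative aside, not from the slides: the core of the RCT logic behind "rigorous evaluation" in slide 8 can be sketched in a few lines of Python. With random assignment, the simple difference in mean outcomes between treatment and control groups estimates the average effect, which is what supports attribution claims. The outcome data below are invented.)

```python
# Minimal sketch of the basic RCT estimate, using invented outcome data
# (e.g. household income after a hypothetical cash-transfer pilot).
import random
import statistics

random.seed(0)

treatment = [random.gauss(105, 10) for _ in range(500)]  # randomly assigned to the program
control = [random.gauss(100, 10) for _ in range(500)]    # randomly assigned to no program

# Random assignment makes the groups comparable on average, so the
# difference in means can be read as the average treatment effect.
effect = statistics.mean(treatment) - statistics.mean(control)
print(f"Estimated average treatment effect: {effect:.2f}")
```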

  9. Trends: Continued • Initiatives: Development Impact Evaluation Initiative (DIME), IFC & J-PAL conducting 32 TA impact evaluations, Spanish WB Trust Fund of €10.4 million, Africa Impact Evaluation Initiative, 3IE, etc. • But these are most useful for discrete interventions rather than broad, multi-component, multi-participant country programs

  10. Evolution of Development Evaluation At the same time: • globalization / global public goods / MDGs = global partnership • demand for country evaluations rather than project evaluations • shift from attribution to relative contribution and additionality at the country level • growth of national evaluation associations and the beginning of country reviews of donor assistance

  11. Challenges for Management of Development Evaluation • Promoting a mixed-methods approach to project evaluation, supporting RCTs where appropriate (e.g. EES) • Determining the role in impact evaluations and the skills mix • Shifting resources toward more joint evaluations • Measuring additionality • Mapping of country efforts • Methodology for aggregation in joint country, sector, or thematic evaluations

  12. “Which road shall I take?” Alice asked the Cheshire Cat. “Where do you want to get to?” the cat asked helpfully. “I don’t know,” admitted Alice. “Then,” advised the cat, “any road will take you there.” Lewis Carroll, Alice in Wonderland
