
Presentation Transcript


  1. Arianna Legovini, Head, Development Impact Evaluation Initiative (DIME), World Bank. Impact Evaluation and Gender in Finance and Private Sector & Agriculture and Rural Development: What, How and for Whom

  2. Two new DIME programs • AADAPT: Impact Evaluation in Agricultural Adaptations • DIME-FPD: Impact Evaluation in Finance and Private Sector. These programs aim to: • Improve the quality of ARD and FPD operations • Build capacity for evidence-based policy in relevant government agencies • Generate knowledge on ARD and FPD policies and interventions

  3. …mainstreaming gender

  4. With global outreach • AADAPT: Africa, Latin America, MENA, South Asia • DIME-FPD: Africa, Europe, Latin America, MENA, South Asia

  5. DIME new business model • Strengthen the analytical content of operations • The Bank's research team works with operations throughout the project cycle • The project develops a set of operational and testable options • Options are tested and selected for scale-up • Improve the quality of operations from design to completion • Use results-based operations to demonstrate to governments how evidence-based policy-making works in practice

  6. How does it work?

  7. The evaluative process is not one-shot • The impact evaluation product delivers advice to clients at many points in the policy decision cycle. • Training and sharing of evidence during project preparation ensures that prior evidence is included in the design. • Availability of data improves reporting and strengthens the monitoring function. • Testing operational alternatives guides implementation. • Measures of program effectiveness validate the assumptions in the results framework.

  8. Align local incentives • Importance of local knowledge and the need for tailored solutions • Through a process of capacity development and facilitated discussion, each evaluation responds to a client-defined learning agenda. • This aligns the incentives for knowledge generation with local needs by focusing on client ownership and operational relevance. • Local knowledge feeds into the improvement of local programs.

  9. Capacity is built through • Formal training, • Networking with a larger community of practitioners, and • Learning by doing through joint evaluations with the Bank. This helps policy makers understand the tools put at their disposal and take ownership.

  10. Communities of practice • Clients become members of a club of peers from multiple countries, with access to international experts. • Periodic cross-country activities provide clients and Bank operations with a forum to compare and benchmark their results and learn from the experience of others. • Here, some general lessons can be drawn from local experience.

  11. To go from local to global knowledge

  12. Teasing out global lessons • Once a significant body of work is completed, DIME works to synthesize results. • The synthesis is facilitated when the individual evaluations share a common analytical and measurement framework

  13. What is Impact Evaluation? • Impact evaluation measures the effect of an intervention on outcomes of interest relative to a counterfactual (what would have happened in the absence of the intervention) • It identifies the causal effect of an intervention on an outcome separately from the effect of other time-varying conditions
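
In the standard potential-outcomes notation (added here as an illustration; it is not part of the original slides), the counterfactual comparison described above can be written as:

```latex
% Y_i(1): outcome of unit i if it receives the intervention; Y_i(0): outcome if it does not.
% The average treatment effect compares each unit with its own counterfactual:
\[
  \mathrm{ATE} = \mathbb{E}\bigl[\,Y_i(1) - Y_i(0)\,\bigr]
\]
% Only one of the two outcomes is ever observed for a given unit, so the missing
% counterfactual is supplied by a comparable control group:
\[
  \widehat{\mathrm{ATE}} = \bar{Y}_{\mathrm{treated}} - \bar{Y}_{\mathrm{control}}
  \qquad \text{(unbiased under random assignment)}
\]
```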

  14. To do this we need a good counterfactual • Treated and control groups have identical observed and unobserved characteristics • The only reason for the difference in outcomes is the intervention • How? • Assign the intervention to some eligible populations and not others, either at random or on the basis of clear and measurable criteria • Obtain a treatment and a control group • Measure and compare outcomes in those groups over time
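
A minimal sketch of this recipe in Python, using simulated data; the sample size, effect size, and variable names are hypothetical and purely illustrative:

```python
# Random assignment and a difference-in-means impact estimate (simulated data).
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Eligible population with a baseline outcome (e.g., yields or revenues).
baseline = rng.normal(loc=100, scale=15, size=n)

# Randomly assign half of the eligible units to the intervention.
treated = rng.permutation(np.repeat([1, 0], n // 2))

# Simulated follow-up outcome: a common time trend plus a true effect of 5 for treated units.
followup = baseline + 10 + 5 * treated + rng.normal(scale=10, size=n)

# Because assignment was random, the control group provides the counterfactual,
# and a simple difference in mean outcomes estimates the average impact.
ate_hat = followup[treated == 1].mean() - followup[treated == 0].mean()
print(f"Estimated impact: {ate_hat:.2f}")  # close to the true effect of 5
```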

  15. What, How and for Whom? • What? Impact evaluation measures the effect of a program (accountability): does it work? • How? It can guide the implementation of a program by comparing alternatives: manage for results on the basis of rigorous evidence • For whom? It can help understand for whom the program works

  16. The decision process is complex • A few big decisions are taken during design, but many more are taken during roll-out and implementation • This leaves a large number of operational choices

  17. How can it be made to work? • Experimentally compare operational alternatives side by side • Scale up better options and discard others • Understanding how a policy can be made to work in a specific context is the key to success
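
A minimal sketch of such a side-by-side comparison, again with simulated data; the arm names and effect sizes are hypothetical:

```python
# Multi-arm comparison: estimate each operational option against a common control arm.
import numpy as np

rng = np.random.default_rng(1)
true_effects = {"control": 0.0, "option_A": 3.0, "option_B": 6.0}  # unknown in practice
n_per_arm = 1000

# Simulated outcomes for units randomly assigned to each arm.
outcomes = {
    name: 50 + effect + rng.normal(scale=8, size=n_per_arm)
    for name, effect in true_effects.items()
}

# Estimate each option's impact relative to the control arm; the better-performing
# option would be kept for scale-up and the others discarded.
control_mean = outcomes["control"].mean()
for name in ("option_A", "option_B"):
    print(f"{name}: estimated impact vs control = {outcomes[name].mean() - control_mean:+.2f}")
```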

  18. Does one size fit all? • Impact evaluation measures average effects • Effects differ for different people (young/old, rich/poor, educated/non-educated, male/female) • When we incorporate these dimensions in the analysis, we learn how to shape interventions for different populations
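
A minimal sketch of what incorporating one such dimension looks like, here a female/male split with simulated data and hypothetical effect sizes:

```python
# Heterogeneity analysis: the same experiment, with impacts estimated by subgroup.
import numpy as np

rng = np.random.default_rng(2)
n = 4000
female = rng.integers(0, 2, size=n)    # 1 = female, 0 = male (illustrative)
treated = rng.integers(0, 2, size=n)   # random assignment, independent of gender

# Simulated outcome: the true impact is 2 for men and 8 for women.
outcome = 40 + treated * (2 + 6 * female) + rng.normal(scale=10, size=n)

def impact(mask):
    """Difference in mean outcomes between treated and control units within a subgroup."""
    return outcome[mask & (treated == 1)].mean() - outcome[mask & (treated == 0)].mean()

print(f"Average impact:   {impact(np.ones(n, dtype=bool)):.2f}")
print(f"Impact for women: {impact(female == 1):.2f}")
print(f"Impact for men:   {impact(female == 0):.2f}")
```

The average hides the gap between the two subgroups; reporting both is what lets interventions be shaped for different populations.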

  19. Why gender? • Incorporate gender dimensions in policy interventions and learn from impact evaluation how to make gender policy work for development • Much evidence suggests that gender matters for development

  20. Avenues for gender to affect development • Women have different preferences and take different decisions than men, at home and for policy • A position of weakness in the household may reduce overall household productivity through unequal sharing of resources • Rules, constraints and disadvantages may reduce productivity in the economy

  21. Gender factors that can be addressed through policy • Perceptions • Differential access to land, inputs, capital and output markets • Traditional rules on duties, movement and household decisions • Different formal or informal property rights

  22. How impact evaluation can help • Hypothesize the factors that may induce inefficiencies in the context of your program • Think about what policy interventions may address them • Test policy alternatives rigorously • Impact evaluation will isolate the effect of a particular intervention from that of other interventions or factors • There is currently little impact evaluation evidence on gender-differentiated program effects • AADAPT and DIME-FPD, in collaboration with the GAP, will support governments in building this evidence

  23. How to measure gender-differentiated effects • Measure differential effects of the same interventions on men and women. This requires: • Larger samples • A different data collection strategy • Additional indicators • For each type of intervention, measure spillover effects on the targeted individual as well as on other members of the household who may be affected (e.g., the wife of the household head, daughters) • Or, target men and women with different interventions and measure the effects on each
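
One reason the slide above calls for larger samples: a gender-differentiated effect is a smaller, subgroup-level quantity than the overall average effect. A back-of-envelope sketch using the standard two-sample size formula, with hypothetical numbers:

```python
# Sample size per group needed to detect an effect with a two-sided test.
from scipy.stats import norm

def n_per_group(effect, sd, alpha=0.05, power=0.80):
    """Standard two-sample formula: n = 2 * ((z_alpha/2 + z_power) * sd / effect)^2."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * (z * sd / effect) ** 2

sd = 10.0  # hypothetical outcome standard deviation
print(f"Detect an average impact of 5:         ~{n_per_group(5, sd):.0f} per group")
print(f"Detect a gender gap in impacts of 2.5: ~{n_per_group(2.5, sd):.0f} per group")
```

Halving the effect of interest roughly quadruples the required sample, which is why measuring differential effects for men and women drives up sample sizes and data collection costs.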

  24. Conclusions • It is not enough to know whether a policy works on average • We must also know how to make it work better and for whom • Impact evaluation can help on all three counts • The DIME-GAP collaboration is a step in this direction and is available to help you do it
