
Re-thinking the Impact of Humanitarian Aid


Presentation Transcript


  1. Re-thinking the Impact of Humanitarian Aid
24th Biannual Meeting, Berlin, 2nd December 2008

  2. ALNAP 24th Biannual Aims
• To help clarify key issues around use of humanitarian impact assessments
• To move towards a shared understanding of the limits and possibilities of humanitarian impact assessment
• To use this understanding to outline a practical vision for future work in this area

  3. Overview
• Humanitarianism ‘transformed’ and the emergence of impact assessment
• Challenges for assessing impact
• Conclusions

  4. Humanitarianism Transformed
Three broad trends since the 1980s contribute to the current interest in and debate around humanitarian impact assessment:
• Humanitarian aid expanded and politicised
• Humanitarian aid institutionalised and professionalised
• Changing nature of vulnerability and human suffering

  5. Increase in initiatives, across and within agencies
• Cross-agency efforts include SMART; HNTS; TRIAMS; the Fritz Humanitarian Impact Project; the CDA Listening Project; ALNAP HPP; SPHERE
• The Quality Compass; the ECB ‘Good Enough’ Guide to impact measurement; the DEC’s new Accountability Framework, etc.
• WFP, ECHO and UNICEF all include impact in their evaluation guidelines; ActionAid’s ALPS; Save the Children UK’s GIM

  6. But despite considerable progress, problems remain. The Biannual background paper identifies six broad challenges:
• Defining impact assessment
• Diverse stakeholders and interests
• Indicators, baselines and data
• Methodologies
• Collective interpretation and analysis
• Capacities and incentives

  7. 1. Defining ‘impact’ and impact assessment

  8. Ideal picture of impact: the Millennium Development Goal of universal primary education by 2015

  9. Reality of impact is rather different…
“…Universal primary education would be achieved at present rates of progress in 2079 in sub-Saharan Africa and in 2036 in the Middle East and North Africa…” (Social Watch, 2007)
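The arithmetic behind an “at present rates of progress” projection is simple linear extrapolation. A minimal sketch in Python, using invented enrolment figures (the actual Social Watch data are not reproduced in the slides):

```python
# Hypothetical illustration of the "at present rates of progress"
# arithmetic: extrapolate a net enrolment rate (%) linearly and ask
# when it reaches 100%. Figures are invented, not Social Watch data.

def year_target_reached(rate_then, year_then, rate_now, year_now, target=100.0):
    """Return the year a linearly extrapolated rate reaches the target."""
    progress_per_year = (rate_now - rate_then) / (year_now - year_then)
    return year_now + (target - rate_now) / progress_per_year

# A region moving from 50% (1990) to 65% (2005) gains one percentage
# point a year and reaches universal enrolment only around 2040,
# well beyond the 2015 MDG deadline.
print(round(year_target_reached(50.0, 1990, 65.0, 2005)))  # -> 2040
```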

  10. Real-world impact is complex
[Diagram: the results chain, Activity → Output → Outcome → Impact]

  11. … and hard to discern, even a long time after the fact

  12. Attribution or Contribution?
[Diagram: actors surrounding A.N. NGO: other NGOs, the private sector, community and family, local partners, religious organisations, developing country governments, civil society]

  13. A widely recognised definition of impact assessment
“…Impact assessment is the systematic analysis of the lasting or significant changes – positive or negative, intended or not – in people’s lives brought about by a given action or series of actions…” (Novib/Oxfam research project, reported in C. Roche, Impact Assessment for Development Agencies)
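One standard way to make “changes brought about by a given action” precise, borrowed from the wider impact evaluation literature rather than from the slides themselves, is the counterfactual definition:

Impact = Y₁ − Y₀

where Y₁ is the outcome observed with the intervention and Y₀ is the outcome the same population would have experienced without it. Y₀ can never be observed directly and must be estimated, which is exactly the attribution problem illustrated on the previous slide.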

  14. Humanitarian action has its own challenges
• Lack of clarity on IA definition and purpose
• Rapidly changing humanitarian contexts
• No consensus on objectives of humanitarian aid
• Intended impacts of interventions often unclear or overambitious

  15. Contextual differences between normal development aid and humanitarian aid situations
• Development: considerable lead time | Humanitarian: sudden onset
• Development: deliberate and pro-active | Humanitarian: reactive
• Development: IA can take time and be thorough and extensive, with comprehensive data collection | Humanitarian: IA may need to be partial in coverage
• Development: location chosen | Humanitarian: unpredictable location
• Development: duration planned | Humanitarian: uncertain duration
• Development: beneficiary population identifiable and static | Humanitarian: beneficiary population heterogeneous and dynamic
• Development: IA goals may be made compatible with socio-economic ones | Humanitarian: priority given to “life saving” activities, sometimes difficult to reconcile with IA goals

  16. 2. Diverse stakeholders, interests and objectives

  17. [Image-only slide]

  18. The challenge of stakeholders (illustrative)
[Diagram: UN agencies, national and local partners, donors, media, affected population, international NGOs, private sector, Red Cross / Red Crescent, military, political authorities]

  19. Different stakeholders have different perceptions of and interests in impact
• Enabling different stakeholders to express divergent views of impact is crucial to successful impact assessment
• IA findings are more likely to be used if they meet the interests of the end users

  20. Different needs may not be reconcilable and achievable in a single impact assessment: Accountability or Learning?
• “The purpose of most impact assessments is to demonstrate past impact and to improve future practice, and there may be tension between the two.”
• Often, too much is expected: “If we continue to expect evaluation to cover most of the accountability needs of the sector, we will be disappointed” (Sandison, P. 2006)

  21. 3. Indicators, baselines and data

  22. Indicators, Baselines and Data for humanitarian IA
Identifying impact indicators involves value judgements about what kinds of changes are significant for whom (Roche, C. 2000)

  23. ‘The familiar adage [“you can lead a horse to water, but you can’t make it drink”] illuminates the challenge of committing to outcomes. The desired outcome is that the horse drinks the water. Longer-term outcomes are that the horse stays healthy and works effectively. But because program staff know they can’t make a horse drink water, they focus on the things they can control: leading the horse to water, making sure the tank is full, monitoring the quality of the water, and keeping the horse within drinking distance of the water. In short, they focus on the processes of water delivery rather than the outcome of water drunk.’ (Patton, M. 1997: 157-8)

  24. “Reports were so consistent in their criticism of agency monitoring and evaluation practices that a standard sentence could almost be inserted into all reports along the lines of: It was not possible to assess the impact of this intervention because of the lack of adequate indicators, clear objectives, baseline data and monitoring.” (ALNAP, 2003)

  25. Issues include…
• Weak or non-existent baselines
• Data is often unavailable or unreliable
• Data collected is mainly quantitative
• Monitoring systems focus on process and outputs

  26. 4. Methodologies

  27. A wealth of tools, techniques and approaches is available
• Documentary analysis
• Interviews
• Questionnaires (including recipient perceptions surveys)
• Monitoring
• Ex-post evaluation
• Case studies
• Participatory Rapid Appraisal (PRA)
• Experimental / quasi-experimental designs (see the sketch below)
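To make the experimental / quasi-experimental item concrete, here is a minimal difference-in-differences sketch, one common quasi-experimental design; all figures are hypothetical and purely illustrative:

```python
# Minimal difference-in-differences sketch of a quasi-experimental
# impact estimate. All figures are hypothetical, for illustration only.
# Outcome: e.g. a household food-consumption score, measured before and
# after an intervention in assisted and comparison communities.
assisted_before, assisted_after = 42.0, 55.0
comparison_before, comparison_after = 41.0, 47.0

change_assisted = assisted_after - assisted_before          # +13.0
change_comparison = comparison_after - comparison_before    # +6.0

# The comparison group's change proxies what would have happened anyway
# (the counterfactual); the difference between the two changes is the
# impact estimate attributable to the intervention.
impact_estimate = change_assisted - change_comparison       # +7.0
print(f"Estimated impact: {impact_estimate:+.1f} points")
```

Note that the estimate depends on exactly what slide 25 flags as often missing in humanitarian settings: baseline data and a credible comparison.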

  28. Qualitative versus quantitative
• Any research design is shaped by both opportunities and constraints
• Quantitative methods are able to tackle ‘what’ and ‘where’ questions
• Qualitative methods are able to answer ‘why’ and ‘how’ questions, and are good at capturing process

  29. [Image-only slide]

  30. Mixed-methods approaches can take into account, rather than dismiss, the complexity of assessing humanitarian impact

  31. 5. Collective interpretation and analysis

  32. Improved interpretation and analysis of data through engagement with affected populations and other stakeholders
• Humanitarian impact assessment should not only be about providing more and better information, but also about making sure that findings are used in ways that improve the lives of affected populations
• Wider stakeholders should also be engaged in this process
• “Learning partnerships” for impact assessment

  33. To date, participation by and accountability to affected populations have not been key features of impact assessments. Attempts to improve this include:
• The ECB ‘Good Enough Guide’
• The Quality Compass
• Feinstein International Center Participatory Impact Assessments (PIA)

  34. 6. Capacities and incentives

  35. Capacities and Incentives for improved humanitarian impact assessment
• Lack of individual and organisational capacity to do good impact assessments
• TORs are often unclear:
  • objectives are not defined clearly within the context of the intervention
  • stakeholder analysis is limited
  • timing relates to institutional priorities rather than humanitarian need
• Skills relating to impact assessment methodologies are lacking
• Contributing factors: high staff turnover; lack of a learning culture; inadequate investment and resources

  36. There is a considerable lack of incentives
• Institutional incentives can override humanitarian ones; there are too few incentives to conduct good impact assessments; results-based approaches can create perverse incentives
• A number of cultural barriers and biases hinder good-quality humanitarian impact assessment

  37. Recap: six challenges
• Defining impact assessment
• Diverse stakeholders and interests
• Indicators, baselines and data
• Methodologies
• Collective interpretation and analysis
• Capacities and incentives

  38. Conclusions
“…Taken as a whole, the humanitarian system has been poor at measuring or analysing impact, and the introduction of results-based management systems in headquarters has yet to feed through into improved analysis of impact in the field… it is arguable that there has been significant under-investment in evaluation and impact analysis…” (Hofmann, C.A. et al., 2004)
• Our review gives little indication that there has been much movement from the position above, articulated in 2004
• Can we move forward? How?
