
Peer reviews: who is evaluating the evaluators?

Rob D. van den Berg, GEF Evaluation Office. Mini-workshop, IPDET, June 14, 2013.


Presentation Transcript


  1. Peer reviews: who is evaluating the evaluators? Rob D. van den Berg, GEF Evaluation Office. Mini-workshop IPDET, June 14, 2013

  2. Overview • Who is evaluating the evaluators? • What should they be evaluated on? • What are the criteria on which they should be judged? • How should they be evaluated? • How to do a peer review?

  3. Who is evaluating the evaluators? • Why has this question become important? • Increasing independence of evaluators (their performance is no longer assessed by management) • Increasing professionalization (who checks whether they are actually good professionals?) • Increasing coverage of evaluations (every program is judged, but who judges the evaluators?) • What should they be evaluated on? • Conduct: transparency, lack of bias, no conflict of interest, ethics • Quality of work: design, implementation, reporting • Effectiveness and usefulness of evaluations • Etc.

  4. What are the criteria? • OECD/Development Assistance Committee (DAC) Principles for Evaluation of Development Assistance (1991) • Evaluation Cooperation Group (ECG) Good Practice Standards (ongoing) • UNEG Norms and Standards (2005) • Various Ethical Guidelines (UNEG, AEA) • International best practice? • Some communities of practice may have specific norms and standards: ALNAP? • See references at the end of the presentation

  5. Common elements (1) • DAC, ECG and UNEG agree on the importance of independence, credibility and usefulness of evaluations • Independence: • Best international practice: ECG paper ("Template for Assessing the Independence of the Evaluation Function") • Independence has building blocks: • First and foremost: independent evaluations – "functional" independence • Secondly: structural independence – reporting lines, budget, conflict of interest issues, etc. • Thirdly: administrative/logistical independence • ECG evaluation offices tend to have all three; bilateral units and UN units tend to have the first but varying degrees of independence at the structural and administrative levels

  6. Common elements (2) • Credibility: • Agreement on the need for high quality, but specific quality norms and standards are relatively new • Transparency • Accountability for what has been done • Quality assurance • Usefulness: • Focus on strategic issues • Evaluation coverage • Links between central and decentralized evaluations • Knowledge products / knowledge sharing

  7. How should they be evaluated? • Self-assessment (done by many) • Independent Review (UNESCO, WHO, WB, ADB) • Quality Assurance (many) • Evaluation (FAO/2007) • Advisory / Oversight Panels (several) • Professional Peer Reviews (several) • A combination of the above (many)

  8. Peer reviews • UNEG Peer Reviews: • UNDP (2005) • UNICEF (2006) • WFP (2007) • OIOS (2008) • GEF (2009) • UNIDO (2010) • UN-Habitat (2011) • UNEP (2011) • FAO (2012) • UNDP (2012) • ECG Peer Reviews: • IFAD (2011) • Bilateral Peer Reviews: • Belgium, Germany

  9. UNEG Framework for peer reviews • UNDP and UNICEF were pilot peer reviews, leading to: • DAC/UNEG Framework for Peer Reviews (2006) • UNEG Framework for Peer Reviews (2011) • Shift in ownership: peer reviews are now perceived as a UNEG instrument • UNEG has established a peer review task force and is willing to provide funds to smaller UN agencies • Collaboration with DAC continues

  10. How to do a peer review (1) • The governing body or senior management needs to ask for a peer review • The UNEG Task Force assembles a panel of peers in consultation with the organization to be peer reviewed • Early panels were big and aimed for representation of stakeholders and the professional community • The panel writes the TORs and assembles the budget • The TORs follow from the peer review framework but need to focus on the specifics of the organization (mandate, role, modalities, etc.) • The budget divides into a budget for the organization and a budget for the peer panel • Internal budget of the organization: staff time, meeting rooms, etc. • External budget: panel members and advisors/consultants that need to be funded • Funding of the external budget is shifting from bilateral donors to the organization being peer reviewed • Budgets are coming down from about $200k to less than $100k

  11. How to do a peer review (2) • The panel writes a "normative framework" against which the evaluation function will be reviewed • The evaluation function performs a self-assessment on the basis of the normative framework • The panel studies the self-assessment and visits the organization for interviews • Some early peer reviews conducted field visits; these were abandoned due to costs and issues of representativeness (GEF: the peer review made statements on interactions with stakeholders on the basis of two country visits, whereas the GEF is active in more than 150 countries)

  12. How to do a peer review (3) • The panel prepares further work on the basis of the self-assessment, the first visit and evaluation reports • Second visit: further interviews to fine-tune findings and fill gaps • Since 2011: a "peer exchange" session of the peers with staff of the evaluation unit • The usefulness of this was confirmed in the FAO and UNDP peer reviews • The draft peer review report is sent to the organization to check for factual errors and errors of analysis • The final peer review report is presented to the organization • Management response and follow-up action

  13. Findings: independence • Functional independence is universally recognized • Many organizations feel it is sufficient to externalize evaluations: outsource them, establish independent teams • This poses problems for both the credibility and the usefulness of evaluations: • Coherence between evaluations • Links between central and decentralized evaluations

  14. Findings: independence (2) • Structural independence is problematic • UN governance forbids full structural independence (?!?) • The next best option is for the CEO to delegate authority to the evaluation function • UN agencies have been moving in the direction of structural independence thanks to peer reviews • Administrative/logistical independence: the details… continue to cause problems…

  15. Findings: credibility • The level of professionalism should be further improved • Half of the staff of evaluation units do not have evaluation training • Many evaluations still follow the "expert" model • Final responsibility for evaluations should shift to the evaluation function • Theory-based evaluations are increasing • Comparability between evaluations is problematic • Evaluations at the central level should build on findings from decentralized evaluations

  16. Usefulness • The strategic focus of evaluations can be improved • Expert evaluations always end up with recommendations to increase the priority of the area/sector evaluated • The link to the organization's RBM system is often weak • Stakeholder involvement has increased over time but could be further strengthened • New ways to share knowledge are not yet fully incorporated

  17. Lessons learned • In 2012 UNEG and DAC commissioned a "lessons learned" study on the peer reviews • The report was presented and discussed at the 2013 Annual General Meeting of UNEG • Main finding: peer reviews have been successful in strengthening the evaluation function • Independence further strengthened • Credibility increased • Usefulness: reviews have been agenda-setting • Refocus programming to increase the strategic value of evaluations • Better linkages with RBM and decentralized evaluations

  18. Issues for the future • Peer reviews are voluntary: they need to become obligatory • Funding from a common pool in the UN? • Peer reviews should be part of a larger system to promote professionalism • It is one instrument amongst many (self-assessment, reviews, evaluation, quality assurance; for the future: accreditation?) • Emerging new "best practice" on how to do peer reviews

  19. Towards a new model for Peer Reviews • Smaller panels: early panels had 6 to 8 members; now 4 is seen as the ideal size • More emphasis on the self-assessment: a higher investment for the evaluation office, but it creates better understanding at the start • One peer review visit instead of two, with extra emphasis on the "peer exchange" session • Shorter reports focusing on strategic issues

  20. References • OECD/DAC Principles for Evaluation of Development Assistance (1991): http://www.oecd.org/development/evaluation/2755284.pdf • Evaluation Cooperation Group (ECG) Good Practice Standards (ongoing): www.ecgnet.org – go to "key documents" • UN Evaluation Group (UNEG) Norms and Standards (2005): www.uneval.org – go to "norms standards"; for peer reviews, go to "Papers and Publications" and "UNEG-DAC Peer Reviews" • AEA: www.eval.org • ALNAP: www.alnap.org • IFAD peer review: www.ecgnet.org – go to "ECG evaluations"
