
The Effectiveness Agenda: AusAID's emerging response


Presentation Transcript


1. The Effectiveness Agenda: AusAID's emerging response

2. Policy Context
• PM's pledge to double aid by 2010
• White Paper
• Regional context
• AusAID 2010
• International agenda: MDG 8; Paris Declaration; MfDR (Managing for Development Results)
• Social accountability agenda
• Policy coherence
• A strong emphasis on performance

  3. AusAID’s response: new policies • Upgrade country strategies • Align programs with partner systems • Strengthen link between performance and allocations • Strengthen evaluation

4. AusAID response: New Systems
• Performance Assessment Framework
• Annual Review of Development Effectiveness
• Annual strategy reviews at program level (APPUs)
• New Quality Reporting System at activity level
• Backward and forward linkages and management reporting

5. AusAID response: New People…
• Senior performance advisers from the World Bank and DFID
• New Performance and Quality Network across country offices
• New sector advisers in country offices
• Increased internal contestability through thematic groups
…and New Systems…

6. PAF: Performance Assessment Framework
• The PAF measures the effectiveness of all Australian aid
• The PAF focuses on three sets of questions:

RESULTS
1. How is the country performing in terms of its development objectives?
2. Is the country strategy on track to achieve its objectives?
3. For each of these major program objectives, what have been the main achievements over the year?
4. What are the main outputs that have been delivered by the program over the year?
5. What evaluations have been conducted and what results did they report?
6. Should the objectives of the country strategy be changed?

QUALITY
7. How did initiatives score against quality ratings at entry, implementation and completion?
8. List up to five main issues which prevent initiatives recording higher quality scores.

WHITE PAPER IMPLEMENTATION
9. What progress has been made against the White Paper's gender equality principle?
10. What progress has been made against the strategy's Anti-Corruption Plan?
11. How much is spent by sector, sub-sector and form of aid?
12. How many joint donor missions and analyses are undertaken?
13. What is the proportion of spending on technical assistance?
14. How does the program make use of national systems?

• The questions are answered through the PAF reports:
  • ARDE
  • APPU
  • SOTS
  • QRS: QAE, QAI, QAC
• Guidance: refer to Home - Office of Development Effectiveness

7. Performance Assessment Framework
PERFORMANCE ASSESSMENT FOCUS and TOOLS AND PROCESSES:
• Agency: Annual Review of Development Effectiveness
• Program / Country Strategy: Annual Program Performance Update
• Sector: State of the Sector Reports
• Initiative / Activity (Initiative Manager + Supervisor): Quality Reporting System (QRS)
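
As a reading aid, the following is a minimal illustrative sketch, not an AusAID artefact, of the mapping this slide implies between the level of performance assessment and the tool that reports on it. The tool names come from the slides; the data structure, function name and example call are assumptions.

```python
# Hypothetical sketch: which PAF tool reports at which level of the aid program.
PAF_TOOLS_BY_LEVEL = {
    "agency": "Annual Review of Development Effectiveness (ARDE)",
    "country strategy / program": "Annual Program Performance Update (APPU)",
    "sector": "State of the Sector Report (SOTS)",
    "initiative / activity": "Quality Reporting System (QRS: QAE, QAI, QAC)",
}

def reporting_tool(level: str) -> str:
    """Return the PAF reporting tool used at a given assessment level."""
    return PAF_TOOLS_BY_LEVEL[level]

print(reporting_tool("sector"))  # State of the Sector Report (SOTS)
```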

8. ARDE: The Annual Review of Development Effectiveness
• Focus:
  • Whole of the aid program, including other GoA agencies
  • High-level results
  • Thematic and strategic
• Independent review; AusAID management responds
• Reporting:
  • To the Development Effectiveness Steering Committee (DESC)
• Can focus on key issues
  • Could include key countries or regions, key sectors

9. APPU: Annual Program Performance Update
• Production: by each country program
• Focus: the Country Strategy, not individual initiatives
• Scope: summarises performance over the year, based on the 14 performance questions
• Report format. Three main sections:
  • Results: what has been achieved?
  • Quality: good practice followed?
  • Commitments: achievements against White Paper objectives?

10. SOTS: State of the Sector Report
• Objective: to examine performance against sectoral and thematic policy objectives (e.g. a new Gender or Education policy)
• Production: Thematic Groups, sector and cross-cutting
• Focus:
  • Sector performance in the aid program
  • Sector performance in country programs
• Scope:
  • Assessment of the appropriateness of policies and strategies
  • Identification of ways of strengthening program outcomes
• Report format. Three sections:
  • Assessment of progress against high-level outcomes
  • Assessment against policy objectives identified in sector and thematic strategies
  • Discussion of lessons learned (positive and negative)

11. The Quality Reporting System (QRS)
MEASURING EFFECTIVENESS AT THE ACTIVITY LEVEL

12. QRS – Key Concepts
• Aid Quality Principles: five principles
• Aid Quality Ratings: six-point scale

13. Five Aid Quality Principles
All Australian aid initiatives are expected to:
• Achieve clearly stated objectives that contribute to higher-level objectives in the program strategy
• Effectively measure progress towards meeting objectives
• Continually manage risks
• Appropriately address sustainability, with due account of partner government systems, stakeholder ownership and phase-out
• Be based on sound technical analysis and continuous learning

14. QRS Ratings
The performance of each initiative is rated on a six-point scale:
• At Entry
• During Implementation
• At Completion
The QRS provides the data for AusAID to answer two key performance questions:
• How many initiatives are satisfactory at entry, implementation and completion?
• What action is needed to lift performance to the next rating?

15. Aid Quality Ratings
Rating scale 1–6, with a clear distinction between Satisfactory and Not Satisfactory.
Satisfactory (above the line):
6. Very high quality; needs ongoing management and monitoring only
5. Good quality; needs minor work to improve in some areas
4. Adequate quality; needs some work to improve
"The Line"
Not Satisfactory (below the line):
3. Less than adequate; needs to improve in core areas
2. Poor quality; needs major work to improve
1. Very poor quality; needs major overhaul
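
To make the scale concrete, here is a minimal sketch, assuming the six descriptions and the 4-and-above "satisfactory" cut-off shown on the slide; the function and dictionary names are hypothetical, not part of AusAID's systems.

```python
# Minimal illustrative sketch (not AusAID's actual system) of the six-point
# aid quality scale and the satisfactory / not satisfactory split.
RATING_DESCRIPTIONS = {
    6: "Very high quality; needs ongoing management and monitoring only",
    5: "Good quality; needs minor work to improve in some areas",
    4: "Adequate quality; needs some work to improve",
    3: "Less than adequate; needs to improve in core areas",
    2: "Poor quality; needs major work to improve",
    1: "Very poor quality; needs major overhaul",
}

def is_satisfactory(rating: int) -> bool:
    """Ratings 4-6 sit above 'the line'; ratings 1-3 fall below it."""
    if rating not in RATING_DESCRIPTIONS:
        raise ValueError("rating must be between 1 and 6")
    return rating >= 4

# Example: a rating of 3 falls below the line, a rating of 5 sits above it.
assert not is_satisfactory(3)
assert is_satisfactory(5)
```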

16. Quality Reporting System
• Applies to all initiatives >$3m or of strategic importance
• Consists of reports on:
  • Quality at Entry (QAE)
  • Quality at Implementation (QAI)
  • Quality at Completion (QAC)
• Timeline for QRS reporting:
  • Design/appraisal: QAE
  • Implementation (YR1, YR2, YR3): annual QAI
  • Completion (YR4+): QAC, alongside the QAE prepared at design/appraisal for the next activity
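
The following sketch illustrates the two rules on this slide: the coverage threshold and the report due at each stage. The $3m threshold and the QAE/QAI/QAC sequence are taken from the slide; the function names, stage labels and example values are assumptions made for illustration.

```python
# Illustrative sketch only: not an AusAID tool.
def qrs_applies(budget_aud: float, strategically_important: bool = False) -> bool:
    """The QRS covers initiatives over $3m or of strategic importance."""
    return budget_aud > 3_000_000 or strategically_important

def qrs_report_due(stage: str) -> str:
    """Map an initiative stage to the QRS report prepared at that point."""
    reports = {
        "design_appraisal": "QAE",  # Quality at Entry, finalised before approval
        "implementation": "QAI",    # Quality at Implementation, reviewed annually
        "completion": "QAC",        # Quality at Completion
    }
    return reports[stage]

print(qrs_applies(5_000_000))            # True: over the $3m threshold
print(qrs_report_due("implementation"))  # QAI
```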

17. Quality at Entry (QAE)
The QAE provides a performance assessment before the initiative commences: a reference point during implementation.
The QAE report:
• Mandatory for all initiatives >$3m or of strategic importance
• Integral to design and appraisal
• The last step before approval for implementation
• Should focus the Peer Review on key issues
• A tool for risk management, monitoring and work planning across the program
The QAE report should be considered during design and appraisal, and finalised during the Peer Review.
Refer to Instruction: How do I Conduct a Peer Review and Complete a Quality at Entry Rating? (Shared Sites/DPAG)

18. Quality at Implementation (QAI)
• Mandatory quality reporting:
  • Reviewed annually
  • Part of the APPU process
• Updates prepared:
  • At key times/events as part of program management (Annual Plans, TAGs, Mid Term Reviews, etc.)
  • In February/March each year as part of the APPU process
  • If status significantly changes, improves or gets much worse
• Prepared by program teams, but contested in AusAID
• Needs to be planned for:
  • In Post work programs
  • In portfolio management plans based on assessed risk
• Reporting to management on ratings of performance and risk
Reference: Home - Operations Policy and Management Unit

19. Quality at Implementation Report: header fields
• Initiative Name
• Start date
• End date
• Total amount (FMA 9)
• Amount spent to date
Reference: Home - Operations Policy and Management Unit

20. QAI data – what happens to it?
• Compiled into QAI Summaries: an ongoing reference for activity and initiative managers
• Country Program Summaries: used in development of the APPU
• Sector Summaries: for development of the State of the Sector Reports
• Other Summaries: White Paper themes (gender, anti-corruption, partnerships); multilaterals
• Reports containing summaries of QAI data: from September 2007, QAI reports will be located on Aidworks
Reference: Home - Operations Policy and Management Unit
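
A hypothetical sketch of how QAI ratings might be rolled up into the country and sector summaries described above. The countries, sectors, ratings and field names below are invented examples, not data from Aidworks or any AusAID system.

```python
from collections import defaultdict
from statistics import mean

# Invented example QAI records; real records live in Aidworks.
qai_reports = [
    {"country": "Country A", "sector": "Education", "rating": 5},
    {"country": "Country A", "sector": "Health", "rating": 3},
    {"country": "Country B", "sector": "Education", "rating": 4},
]

def summarise(reports, key):
    """Group QAI ratings by the given key and report counts, average rating
    and how many initiatives are satisfactory (rated 4 or higher)."""
    groups = defaultdict(list)
    for report in reports:
        groups[report[key]].append(report["rating"])
    return {
        group: {
            "initiatives": len(ratings),
            "average_rating": round(mean(ratings), 1),
            "satisfactory": sum(1 for r in ratings if r >= 4),
        }
        for group, ratings in groups.items()
    }

print(summarise(qai_reports, "country"))  # country program summaries feed the APPU
print(summarise(qai_reports, "sector"))   # sector summaries feed the SOTS
```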

21. Year One – preliminary results
• Strategy level (APPUs):
  • Annual Program Performance Updates for key country programs
  • Big leap forward, but very resource intensive
  • High-level review of APPUs with external participation: Australian Government central agencies, NGO representatives, senior reps from donor partners
  • Formal PAF review to assess outcomes

22. Year One – preliminary results
• Activity level (QRS):
  • Big lift in discussion of quality
  • Good coverage across program types
  • Country and sector quality 'maps' will be a valuable management tool
  • Lessons: tougher objectives, better M&E, sustainability and relevance are the key issues
  • Quality at Entry needs more attention
  • Better evidence and more independent review to strengthen confidence in QAI assessments
  • Share assessments with partners
  • Formal PAF review to assess outcomes

23. Implications for partners
• Strategy level:
  • Engagement with the strategy process and consultations
  • Link activities to higher-level strategy outcomes
• Activity level:
  • Stronger results focus
  • Tighter objectives and key indicators
  • Contributing to ongoing assessments
  • Linking reporting to templates

24. Conclusion
• New drive on the effectiveness agenda
• Want our aid to be all that it can be
• Good initial results:
  • New systems getting traction
  • Creating space for discussion of effectiveness
  • Identifying weaknesses (and strengths)

25. Conclusion
• More to do:
  • Strengthen coverage and rigour
  • More involvement from partners
  • More independent review
  • Stronger internal and external contestability
• From systems to culture
• Development is not linear
• Lifting our gaze from design and contract implementation to development impact
• Ongoing senior-level engagement by all partners
• A clear shared vision of objectives: what success (and failure) would look like and how we track progress
Questions/comments? …Thank you
