
Presentation Transcript


  1. USAID’s Experience and Lessons Learned in Approaches Used in Monitoring and Evaluating Capacity Building Activities Duane Muller, USAID November 7, 2008 UNFCCC Meeting on Experiences with Performance Indicators for Monitoring and Evaluation of Capacity Building, Rio de Janeiro, Brazil

  2. USG COMMITMENT TO CAPACITY BUILDING • Integral to development programs • Country driven approach • Useful lessons at the project level

  3. Managing for Results: USAID’s Experiences

  4. PERFORMANCE MANAGEMENT

  5. PERFORMANCE MANAGEMENT The systematic process of: • Monitoring the results of activities • Collecting and analyzing performance information • Evaluating program performance • Using performance information • Communicating results

  6. STEPS IN DEVELOPING A PERFORMANCE MANAGEMENT PLAN (PMP)

  7. MANAGING FOR RESULTS • Performance Indicators: the scale or dimension of results being measured • Standard Indicators: a combination of output and outcome indicators that measure direct, intended results • Custom Indicators: meaningful outcome measures

  8. CHARACTERISTICS OF GOOD INDICATORS • Direct measures • Objective • Plausible attribution • Practical • Disaggregated • Quantitative

  9. Monitoring & Evaluation Different but complementary roles at USAID

  10. MONITORING AND EVALUATION
Monitoring: • Clarifies program objectives • Links project activities to their resources and objectives • Translates objectives into measurable indicators and sets targets • Collects data on indicators • Reports on progress
Evaluation: • Analyzes why and how intended results were or were not achieved • Assesses contributions of activities to results • Examines results not easily measured • Explores unintended results • Provides lessons learned and recommendations

  11. EXPERIENCES WITH MONITORING: USAID’s Lessons Learned

  12. EIGHT-STEP PROCESS TO COLLECT MONITORING DATA 1) Indicators and their definitions 2) Data source 3) Data collection method 4) Data collection frequency 5) Responsibilities for acquiring data 6) Data analysis plans 7) Plans for evaluations 8) Plans for reporting and using performance information (one way to record these elements is sketched below)
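The eight elements above map naturally onto a per-indicator record in a Performance Management Plan. The Python sketch below is illustrative only: the class and field names are hypothetical, not a USAID schema or tool.

from dataclasses import dataclass

@dataclass
class IndicatorDataPlan:
    # Hypothetical record covering the eight elements of the
    # monitoring data collection process described above.
    indicator: str               # 1) indicator and its definition
    data_source: str             # 2) source of the data
    collection_method: str       # 3) method of data collection
    collection_frequency: str    # 4) frequency of data collection
    responsible_party: str       # 5) who is responsible for acquiring the data
    analysis_plan: str           # 6) data analysis plans
    evaluation_plan: str         # 7) plans for evaluations
    reporting_plan: str          # 8) plans for reporting/using performance information

# Example entry (illustrative values only):
plan = IndicatorDataPlan(
    indicator="Number of officials trained in emissions inventory methods",
    data_source="Training attendance records",
    collection_method="Implementing-partner reports",
    collection_frequency="Quarterly",
    responsible_party="Activity manager",
    analysis_plan="Compare actuals against annual targets",
    evaluation_plan="Participatory mid-term evaluation",
    reporting_plan="Annual performance report",
)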

  13. EXPERIENCES WITH EVALUATION: USAID’s Lessons Learned

  14. EVALUATION: A POWERFUL LEARNING TOOL • Identifies lessons learned • Improves the quality of capacity building efforts • Critical to understanding performance • Retrospective

  15. THE ANALYTICAL SIDE OF PROJECT MANAGEMENT • Analyzes why and how intended results were or were not achieved • Assesses contributions of activities to results • Examines results not easily measured • Explores unintended results • Provides lessons learned and recommendations

  16. TYPES OF EVALUATION USED BY USAID

  17. TRADITIONAL EVALUATION • Donor-focused, with donor ownership of the evaluation • Stakeholders often don’t participate • Focus is on accountability • Predetermined design • Formal evaluation methods • Independent/third-party evaluators

  18. PARTICIPATORY EVALUATION • Participant-focused, with participant ownership of the evaluation • Broad range of stakeholders participate • Flexible design • Focus is on learning

  19. ASSESSMENTS • Quick and flexible • Trends and dynamics • Broader than evaluations

  20. METHODOLOGIES FOR EVALUATIONS • Scope of Work (SOW) • Interviews • Documentation Reviews • Field Visits • Key informant interviews • Focus group interviews • Community group interviews • Direct observation • Mini surveys • Case studies • Village imaging

  21. SUCCESSFUL EVALUATIONS: LESSONS LEARNED • Making the decision to evaluate • Ensuring the Scope of Work is well thought out • Finding the appropriate team • Ensuring the results are used

  22. PROGRAM ASSESSMENT RATING TOOL (PART)

  23. PART: Goals, Procedures and Results • A standard questionnaire (PART) used to review the performance of US government programs in four areas: program purpose and design, strategic planning, program management, and results • Each review results in an assessment and a plan for improvement

  24. PART RATINGS
Performing: • Effective • Moderately effective • Adequate
Not performing: • Ineffective • Results not demonstrated
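As a compact illustration of the two-tier rating structure above, the five PART ratings can be grouped with a simple lookup. This is a hypothetical sketch, not an OMB tool.

# Hypothetical mapping of PART ratings to their performance tier.
PART_TIERS = {
    "Effective": "Performing",
    "Moderately effective": "Performing",
    "Adequate": "Performing",
    "Ineffective": "Not performing",
    "Results not demonstrated": "Not performing",
}

def tier(rating: str) -> str:
    # Look up which tier a given rating falls into.
    return PART_TIERS[rating]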

  25. Conclusions • Lessons learned and best practices for M&E • Project-level experience: be cost effective, be timely, and ensure the data are used • National-level experience: PART • A country-driven approach to capacity building • Paris Declaration on Aid Effectiveness

  26. ADDITIONAL RESOURCES
Development Experience Clearinghouse • http://dec.usaid.gov
Performance Management • A Guide to Developing and Implementing Performance Management Plans: http://www.usaid.gov/policy/ads/200/200sbn.doc
Evaluation Documents • Preparing an Evaluation Scope of Work: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby207.pdf • Conducting a Participatory Evaluation: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs539.pdf • Constructing an Evaluation Report: http://pdf.usaid.gov/pdf_docs/PNADI500.pdf • Conducting Key Informant Interviews: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs541.pdf
PART • http://www.whitehouse.gov/omb/part/ • http://www.whitehouse.gov/omb/expectmore/

  27. For further information: Duane Muller, USAID EGAT/ESP/GCC • Tel: 1-202-712-5304 • Fax: 1-202-216-3174 • Email: dmuller@usaid.gov • Website: www.usaid.gov (keyword: climate change)
