PA546 23 February 2009

Presentation Transcript


  1. PA546, 23 February 2009 • Evaluation: A View from the Field • Monitoring & Evaluation • Their relations • Designing a monitoring system

  2. Critical Terms • Process Evaluation • Monitoring • Process Monitoring • Outcome Monitoring • M&E • Monitoring (where a program is) & Evaluation (why it is there) • Implementation-focused vs. results-focused

  3. Questions • Why should program evaluators know about M&E? • What should they know? • Come up with a plan as if you were building an M&E system for a food bank • Serving two masters – potential conflict between providing feedback for program improvement and deciding program worth

  4. Who wants to know about M&E? • Internal (?) evaluators • Who work with program managers and • Who help design an MIS • External evaluators • Who are responding to RFPs to design an M&E system or • Who may recommend creating an MIS

  5. Rossi’s perspective on M&E • Explains why programs achieve their outcomes • Needed for external accountability • What was done • To whom • With what result

  6. Kusek & Rist Perspective • Implementation-focused M&E answers questions of compliance • Results-focused M&E helps answer • What are the goals of the organization? • Are they being achieved? • How can the achievement be demonstrated?

  7. Kusek & Rist’s Steps 1 & 2 – Highlights • Readiness for M&E • Who wants the M&E system built? Why? • What is the capacity to measure, maintain & monitor? • Agreeing on outcomes • Who should be involved? • Make the statement positive; one outcome per statement • Is there a plan for achieving the outcome?

  8. Kusek & Rist: Step 4 – Selecting Indicators • Basic questions to be answered • How will we know we are successful? • How will we know if we are moving in the right direction? • Remember outcome indicators are not the same as outcomes • Should measure all levels, e.g., inputs, activities and so on • Getting quality indicators: CREAM (Clear, Relevant, Economic, Adequate, Monitorable) – see the sketch below
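
The CREAM test is easiest to apply when each candidate indicator is written down as a structured record. Below is a minimal Python sketch of that idea, assuming an invented food-bank indicator and invented field names; it is an illustration, not Kusek & Rist's own tooling.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str           # e.g., "Meals distributed per month"
    level: str          # input, activity, output, or outcome
    data_source: str    # where the measure would come from
    collection_cost: str

    def cream_checklist(self) -> dict:
        """The CREAM questions a review team would still answer by hand."""
        return {
            "Clear": f"Is '{self.name}' precise and unambiguous?",
            "Relevant": f"Does it fit the {self.level}-level result it tracks?",
            "Economic": f"Is collecting from {self.data_source} affordable ({self.collection_cost})?",
            "Adequate": "Together with the other indicators, does it cover the result?",
            "Monitorable": "Can it be measured repeatedly and validated independently?",
        }

meals = Indicator("Meals distributed per month", "output",
                  "food bank MIS", "low; already logged")
for criterion, question in meals.cream_checklist().items():
    print(f"{criterion}: {question}")
```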

  9. Kusek & Rist: Setting Baselines • Take > 3 measures to establish baseline & trends • The basic questions • What are the data sources & collection methods? • Who will collect the data? How often? • What is the cost and difficulty to collect the data? • Who will analyze the data? Report the data? • Who will use the data? • Pages 83-86 discuss data sources & collection methods
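
A minimal sketch of the "more than three measures" advice, using only the standard library and invented food-bank counts: average the pre-program observations for the baseline, and fit a line to expose any underlying trend (statistics.linear_regression needs Python 3.10+).

```python
from statistics import mean, linear_regression

months = [1, 2, 3, 4]                  # four pre-program observation points
clients_served = [410, 425, 431, 446]  # hypothetical monthly counts

baseline = mean(clients_served)
slope, intercept = linear_regression(months, clients_served)

print(f"Baseline (mean of {len(months)} measures): {baseline:.0f} clients/month")
print(f"Underlying trend: {slope:+.1f} clients/month")
# The trend matters: a post-program gain must beat what the pre-existing
# trend alone would have produced before it says anything about the program.
```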

  10. Kusek & Rist: Results Targets • The number, timing and location of what is to be realized • What should be considered • Previous performance • Changes in capacity, e.g., funding & other resources • A limited time frame (the further out, the more likely external events will have an impact)
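
One way to make the slide's three considerations concrete is a back-of-the-envelope calculation: project the baseline trend forward over a deliberately short horizon and adjust for an assumed capacity change. Every number here is hypothetical.

```python
baseline = 428          # clients/month, mean of the baseline measures
trend = 11.5            # clients/month already being gained pre-program
capacity_factor = 1.10  # assumed 10% boost from new funding/resources
periods = 6             # keep the horizon short, per the slide

target = (baseline + trend * periods) * capacity_factor
print(f"{periods}-month results target: {target:.0f} clients/month")
```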

  11. Kusek & Rist: Monitoring Performance • Needs of an M&E system • Ownership – to assure sound data are generated, shared, and reported • Management & maintenance • Credibility • The data triangle: reliability, validity & timeliness

  12. Kusek & Rist: Evaluation (at last) • Relationship to monitoring • Monitoring can raise evaluation questions (or the reverse) • Uses monitoring data, but asks different questions • Used by managers to • Decide on resource allocation or on competing alternatives • Rethink a problem; identify emerging problems • Answers management questions • Are the right things being done? • Are they being done right? • Are there better ways to reach our goals?

  13. USAID Perspective • The results framework process is meant to • Have a customer focus • Manage for results • Involve teamwork • Promote empowerment & accountability

  14. Assumed value of the Results Framework • Customer focus is achieved because the strategic objective benefits the customer • Managing for results, because staff constantly focus on results rather than on activities • Requires effort by stakeholders to identify results & the steps needed to achieve them • Empowers the team to achieve results & holds them accountable for what is achieved

  15. A set of results: NGO • Strategic Objective: NGOs use a systematic approach to service delivery • Intermediate result: improve management & administrative capabilities • 4 lower-level (prior to intermediate) results – intended to achieve the intermediate result

  16. Getting the data • 6 data sources • Creation of 3 databases to monitor activities • Consultant database • NGO database • MIS – management, finances, sales • Plus each project keeps its own data
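
A minimal sketch of how the three databases the slide names could be laid out, using SQLite from the Python standard library. The slide gives only the database names and purposes; every column below is an assumption for illustration.

```python
import sqlite3

conn = sqlite3.connect("monitoring.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS consultants (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    specialty TEXT,
    days_delivered INTEGER DEFAULT 0
);
CREATE TABLE IF NOT EXISTS ngos (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    services_offered TEXT,
    mgmt_capacity_score INTEGER          -- tracks the intermediate result
);
CREATE TABLE IF NOT EXISTS mis_records (
    id INTEGER PRIMARY KEY,
    ngo_id INTEGER REFERENCES ngos(id),
    period TEXT,                         -- e.g., '2009-02'
    revenue REAL,                        -- finances & sales, per the slide
    expenses REAL
);
""")
conn.commit()
conn.close()
```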

  17. Self Assessment • Positives • Keeps staff from getting sidetracked • Less stress on stakeholders about evaluation outcomes (than with a traditional evaluation) • Less time consuming over time • Creates a longitudinal design • Negatives • May miss important factors • May not meet program monitoring needs • The evaluation process may dominate the project • Staff involvement may distort data • Loss of evaluator objectivity

  18. Next Class: March 9 • Randomized field studies • Read: Rossi et al., chapter 8; Morris, chapter 5 • React to Morris chapter 5, scenarios 1 & 2, including being able to • Describe the design • Assess the methodology • Identify & assess the problems in implementation
