PA546, 23 February 2009 • Evaluation: A View from the Field • Monitoring & Evaluation, their relations, and designing a monitoring system
Critical Terms • Process Evaluation • Monitoring • Process Monitoring • Outcome Monitoring • M&E • Monitoring (where a program is) & Evaluation (why it is there) • Implementation-focused vs. results-focused
Questions • Why should program evaluators know about M&E? • What should they know? • Come up with a plan as if you were building an M&E system for a food bank (a starting-point sketch follows below) • Serving two masters: the potential conflict between providing feedback for program improvement and deciding program worth
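As a starting point for the food bank question, here is a minimal Python sketch of a results chain; every level, indicator, and figure in it is a hypothetical assumption rather than course material, and it only illustrates the implementation-focused vs. results-focused distinction from the previous slide.

```python
# Hypothetical results chain for a food bank M&E system (illustrative only;
# the indicators and groupings are assumptions, not taken from the course).
food_bank_results_chain = {
    "inputs":     ["donated food (lbs)", "volunteer hours", "grant funding ($)"],
    "activities": ["sort and store donations", "distribute food boxes"],
    "outputs":    ["food boxes distributed per month", "households served"],
    "outcomes":   ["reduction in self-reported food insecurity among clients"],
}

# Implementation-focused monitoring tracks inputs, activities, and outputs;
# results-focused monitoring asks whether the outcome is actually changing.
for level, indicators in food_bank_results_chain.items():
    print(f"{level}: {', '.join(indicators)}")
```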
Who wants to know about M&E? • Internal (?) evaluators • Who work with program managers and help design the MIS • External evaluators • Who respond to RFPs to design an M&E system or • Who may recommend creating an MIS
Rossi's perspective on M&E • Explains why programs achieve their outcomes • Needed for external accountability • What was done • To whom • With what result
Kusek & Rist Perspective • Implementation-focused M&E answers questions of compliance • Results-focused M&E helps answer • What are the goals of the organization? • Are they being achieved? • How can the achievement be demonstrated?
Kusek & Rist's Steps 1 & 2: Highlights • Readiness for M&E • Who wants the M&E system built? Why? • What is the capacity to measure, maintain & monitor? • Agreeing on outcomes • Who should be involved? • Make each statement positive; one outcome per statement • Is there a plan for achieving the outcome?
Kusek & Rist: Step 4 – Selecting Indicators • Basic questions to be answered • How will we know we are successful? • How will we know if we are moving in the right direction? • Remember that outcome indicators are not the same as outcomes • Should measure all levels, e.g., inputs, activities and so on • Getting quality indicators: CREAM (a checklist sketch follows below)
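To make the indicator screen concrete, here is a minimal sketch of a CREAM checklist in Python. Kusek & Rist's CREAM criteria are Clear, Relevant, Economic, Adequate, and Monitorable; the class name, field names, and the example indicator below are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class IndicatorReview:
    """Hypothetical record for screening one candidate indicator."""
    name: str
    level: str            # e.g. "input", "activity", "output", "outcome"
    clear: bool           # precise and unambiguous
    relevant: bool        # appropriate to the stated outcome
    economic: bool        # obtainable at reasonable cost
    adequate: bool        # sufficient basis to assess performance
    monitorable: bool     # open to independent validation

    def passes_cream(self) -> bool:
        return all([self.clear, self.relevant, self.economic,
                    self.adequate, self.monitorable])

review = IndicatorReview(
    name="households served per month", level="output",
    clear=True, relevant=True, economic=True, adequate=False, monitorable=True)
print(review.passes_cream())  # False: revise the indicator before adopting it
```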
Kusek & Rist: Setting Baselines • Take > 3 measures to establish baseline & trends • The basic questions • What are the data sources & collection methods? • Who will collect the data? How often? • What is the cost and difficulty to collect the data? • Who will analyze the data? Report the data? • Who will use the data? • Pages 83-86 discuss data sources & collection methods
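A small sketch of the arithmetic implied by taking three or more repeated measures before declaring a baseline: average the series for the baseline level and look at period-to-period change for a crude trend. The data values and function name are invented for illustration.

```python
from statistics import mean

def baseline_and_trend(measures):
    """Return (baseline level, average change per period) for a series of measures."""
    if len(measures) < 3:
        raise ValueError("need at least three measures to judge baseline and trend")
    baseline = mean(measures)
    # Average period-to-period change as a crude trend estimate.
    changes = [later - earlier for earlier, later in zip(measures, measures[1:])]
    return baseline, mean(changes)

monthly_households_served = [410, 430, 455]     # hypothetical data
level, trend = baseline_and_trend(monthly_households_served)
print(level, trend)                             # about 431.7 and +22.5 per month
```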
Kusek & Rist: Results Targets • The number, timing and location of what is to be realized • What should be considered • Previous performance • Changes in capacity, e.g., funding & other resources • Limited time frame (the further out, the more likely external events will have an impact) • A one-line calculation sketch follows below
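The target-setting logic above reduces to a short calculation: start from the baseline, add the improvement judged feasible from previous performance and expected capacity, and keep the time frame short. A sketch with made-up numbers:

```python
# All figures are hypothetical; they only illustrate the arithmetic.
baseline = 430             # e.g. households served per month today
feasible_improvement = 70  # judged from previous performance and new funding
time_frame_months = 12     # keep targets near-term; the further out the target,
                           # the more exposure to external events

target = baseline + feasible_improvement
print(f"Target: {target} households/month within {time_frame_months} months")
```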
Kusek & Rist: Monitoring Performance • Needs of an M&E system • Ownership – to assure sound data are generated, shared, and reported • Management & Maintenance • Credibility • The data triangle: reliability, validity & timeliness
Kusek & Rist: Evaluation (at last) • Relationship to monitoring • Monitoring can raise evaluation questions (or the reverse) • Uses monitoring data, but asks different questions • Used by managers to • Decide on resource allocation or among competing alternatives • Rethink a problem; identify emerging problems • Answers management questions • Are the right things being done? • Are they being done right? • Are there better ways to reach our goals?
USAID Perspective • The results framework process is meant to • Have a customer focus • Manage for results • Involve teamwork • Empowerment & accountability
Assumed value of Results Framework • Customer focus achieved because the strategic objective benefits the customer • Managing for results because staff are constantly focused on results rather than on activities • Requires effort by stakeholders to identify results & the steps needed to achieve them • Empowers the team to achieve results & holds them accountable for what is achieved
A set of results: NGO • Strategic Objective: NGOs use a systematic approach to service delivery • Intermediate result: Improve management & administrative capabilities • 4 lower-level results (prior to the intermediate result), intended to achieve the intermediate result
Getting the data • 6 data sources • Creation of 3 databases to monitor activities (illustrative record layouts are sketched below) • Consultant database • NGO database • MIS – management, finances, sales • Plus each project keeps its own data
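As an illustration only, records in the three databases might be organized along these lines; the field names are assumptions, not drawn from the source project.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ConsultantRecord:              # consultant database
    consultant_id: str
    specialty: str
    assignments_completed: int

@dataclass
class NGORecord:                     # NGO database
    ngo_id: str
    services_delivered: int
    reporting_period: date

@dataclass
class MISRecord:                     # MIS: management, finances, sales
    ngo_id: str
    revenue: float
    expenses: float
    reporting_period: date
```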
Self Assessment • Positives • Keeps staff from getting sidetracked • Less stress on stakeholders about evaluation outcomes (than with a traditional evaluation) • Less time consuming over time • Creates a longitudinal design • Negatives • May miss important factors • May not meet program monitoring needs • The evaluation process may dominate the project • Staff involvement may distort data • Loss of evaluator objectivity
Next Class: March 9 • Randomized field studies • Read: Rossi et al., chapter 8; Morris, chapter 5 • React to Morris chapter 5, scenarios 1 & 2, including being able to • Describe the design • Assess the methodology • Identify & assess the problems in implementation