
Evaluation Planning and Management John Wooten ~ Lynn Keeys


Presentation Transcript


  1. WELCOME Evaluation Planning and Management John Wooten ~ Lynn Keeys June 2012

  2. Session 4: Evaluation Planning – Use Best Methods
Evaluation Road Map: Stage 1
Objectives:
• Overview highlights of performance management – conceptual and analytical frameworks
• Set baselines and targets
• Determine evaluation design
• Explore data collection/analysis options

  3. USAID Policy Requires/Recommends: Use BEST methods~
• Clear, Sound "Theory of Change" (Development Hypothesis, Results/Logical Frameworks)
• Adequate Performance Indicators, Baseline Data and Targets
• Few Key Evaluation Questions
• Appropriate Evaluation (Research) Design
• Relevant/Affordable Data Collection Methods
• Appropriate Data Analysis Methods

  4. 1. Clear, Sound "Theory of Change" … Development Hypothesis
…the cornerstone of performance management: planning, organizing, assessing and reporting empirical evidence adequate to test a development hypothesis.

  5. Clear, Sound "Theory of Change" … Development Hypothesis …a review
• A narrative description of the theory of change: the logic and causal linkages between a hierarchy of results
• Based on sound development theory, knowledge, experience, and plausible linkages; not statistically accurate relationships
• Considers relevant risks and key assumptions
• May reflect other relevant development initiatives (e.g., those of other development actors)
• Examined and evaluated to assess, learn, adapt

  6. Development Hypothesis
If "A" … then "B", assuming/presuming X, Y and Z
Where: "A" is within our "manageable interests", and B, X, Y and Z are not

  7. Development Hypothesis Example
If you attend, proactively engage in and read the distributed materials for this course, then you will develop a good understanding of USAID's EPM system, and be better prepared to perform your related roles and responsibilities, assuming we encounter no major, unforeseen circumstances.

  8. Development Hypothesis
As a Question: Would "B" likely result … if "A" occurred, assuming/presuming X, Y and Z?
As a Counterfactual: B would likely not have resulted without A occurring, given X, Y and Z
…where the counterfactual is the likely set of conditions that would have existed without the project intervention, i.e., what would have likely happened in the absence of the project intervention.
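The if-A-then-B-assuming-X,Y,Z structure, its question form, and its counterfactual form can be captured in a few lines of code. A minimal illustrative sketch follows (Python; the class, field and example strings are invented for this course context, not USAID tooling):

```python
# Minimal sketch (not USAID tooling): a development hypothesis as an
# "if A then B, assuming X, Y, Z" record, restated as a question and as
# a counterfactual. All names and example strings are illustrative.
from dataclasses import dataclass, field


@dataclass
class DevelopmentHypothesis:
    intervention: str            # "A" -- within our manageable interests
    expected_result: str         # "B" -- the intended change
    assumptions: list[str] = field(default_factory=list)  # X, Y, Z -- outside our control

    def as_question(self) -> str:
        """The hypothesis reformulated as an evaluation question."""
        return (f"Would '{self.expected_result}' likely result if "
                f"'{self.intervention}' occurred, assuming "
                f"{', '.join(self.assumptions)}?")

    def as_counterfactual(self) -> str:
        """The same logic stated as a counterfactual."""
        return (f"'{self.expected_result}' would likely not have resulted "
                f"without '{self.intervention}', given "
                f"{', '.join(self.assumptions)}.")


hypothesis = DevelopmentHypothesis(
    intervention="farmers adopt improved harvesting techniques",
    expected_result="crop productivity increases in target areas",
    assumptions=["normal rainfall", "stable input prices"],
)
print(hypothesis.as_question())
print(hypothesis.as_counterfactual())
```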

  9. Good development hypotheses? If Government is not accountable and responsive, it will be more difficult to achieve stability and prosperity. ~~~~~ By proactively working with the Government and other donors and USG partners to mitigate conflict and increase security, the country will avoid that pitfall and increase the prosperity of the target region. A major development hypothesis underlying the SpO is that peace and security are necessary pre-conditions for sustainable development.

  10. Objective Hierarchy of Results
(Diagram: Specific Objective at the top, Activities at the base, with "Assuming…" annotations alongside the links)
• Highest Level Result (Impact)
• Intermediate Level Result (Outcome)
• Project Level Results (Outputs)
• Activity Level Results (Inputs)

  11. Hierarchy of Results Assumptions derived from… Risk analysis In response to risks, project designers either… • Modify the project design to account for risks, or • Postulate plausible assumptions regarding these risks Our job during implementation…. • Carefully, closely monitor known risks and assumptions • Help identify new, emerging risks and assumptions

  12. Hierarchy of Results
Result statement… A significant, intended, and measurable change in the condition of a customer, or a change in the host country, institutions, or other entities that will affect the customer directly or indirectly.
Derived from:
• Rigorous investigation and analysis of the development condition to be improved
• Applying multi-disciplinary expertise
• Relying upon well-informed participatory insights from target beneficiaries/stakeholders

  13. What makes a good results statement? Use past tense… Quality Standards… • Measurable and “objectively verifiable” * • Meaningful and realistic • Focused on USAID’s strategic commitments • Customer or stakeholder focused and driven • Can be materially affected by the Operating Unit and its partners • Statement of results – not an activity or process • Uni-dimensional – not a combination of results; not stacked *

  14. Good results statements?
• % satisfied with social and medical services
• Standardized approach to prevention of HIV transmission related to intravenous drug users institutionalized and consolidated
• Capacity of the national TB system to manage TB and multi-drug resistant tuberculosis strengthened, and trans-border transmission reduced
• Systems of support for vulnerable children and families strengthened through the development of enhanced tools and the consolidation of best practices
• Delivery guidelines and tools for abandonment prevention and family-based care services compiled, reviewed and field-tested
• Package adopted at regional level and federal level and awareness raised on related child welfare issues

  15. Conveying the Development Hypothesis …a review
The development hypothesis is rigorously analyzed and eventually conveyed in two strategic planning frameworks which serve as the basis for both our project design and preliminary evaluation design:
• The Results Framework
• The Logical Framework

  16. Results Framework
A planning, communications, and strategic management tool that conveys the development hypothesis, illustrating the cause-and-effect linkages between a hierarchy of results to be achieved with the assistance provided.
• Includes the Intermediate Results (IRs) and sub-Intermediate Results (Sub-IRs) necessary to achieve the highest level result (whether funded by USAID or its partners)
• Includes critical assumptions that must hold for the development hypothesis to lead to the relevant outcome
• Typically laid out graphically and narrated in a strategy or project design document

  17. Results Framework
(Diagram: the Highest Level Result targeted – the problem to be impacted – at the top; prerequisite results (outcomes) linked upward through "IF … THEN" logic; implementing mechanisms at the base producing the project outputs; with prompts for assumptions, risks, "How? Why?", "What else?", timeframe, and whether the goal will be achieved.)

  18. What makes a good results framework? Quality standards… and 2 questions per standard
• Vertical "causal logic" among the results (cause and effect, bottom-up and top-down)
• Horizontal logic – consideration of the most significant "necessary and sufficient" conditions
• Adequate consideration of external risks and assumptions re: factors outside our "manageable interests" or control (analyze per result, per set and per group of results, and ask the risk assessment questions (ADS 220)):
• Modify the project design to account for risks?
• Postulate plausible assumptions regarding risks?

  19. Results Framework Analysis… “Results” in: • Graphical basis for narrating project design • Identification of the most relevant contextual baseline data required • Specification of performance indicators at each level of the hierarchy • Decisions on the collection of all performance indicator baselines • Setting of performance indicator targets and interim targets …inputs for Performance Management Plan (PMP)

  20. Logical Framework
Log Frame…
• A rigorous methodology used for project design that focuses on the causal linkages between project inputs, outputs, purpose, and desired outcome (or goal)
• Complements the results framework
• Where the results framework examines a project design in a hierarchical manner, the logical framework examines the design from a matrix perspective

  21. LogFrame Template

  22. Cross-walk between Program RF and LogFrame
(Diagram mapping Mission RF elements – the DO as the problem to be solved by a broader set of contributors, prerequisite results as USAID's highest contribution, IRs and Sub-IRs (project purpose) with their indicators – onto the Project LogFrame rows: Goal, Purpose/EOPS, Outputs, Inputs.)

  23. Cross-walk between Project RF and LogFrame

  24. What makes a good LogFrame?
Sample: http://www.fao.org/WAIRdocs/x5405e/x5405e0p.htm
Quality standards… the integrity and internal consistency of the information included in the rows and columns:
• Do row level entries accurately relate to their row? (no spill-over)
• Is the column cause-effect logic respected? (bottom-up, top-down)
• Are all entries sufficiently clear that an objective person (e.g., future evaluation team members) would understand the skeletal project design?
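The completeness side of these checks can be made mechanical, even though the causal logic itself still needs human judgment. A minimal sketch, assuming a simple four-row, four-column matrix in the classic template style; the field names and example entries are illustrative assumptions, not an official schema:

```python
# Illustrative sketch only: a LogFrame as a simple matrix plus a completeness
# check echoing the slide's quality standards (every row/column cell filled).
# Row and column names follow the classic template; this is not an official schema.
ROWS = ["goal", "purpose", "outputs", "inputs"]
COLUMNS = ["narrative", "indicators", "means_of_verification", "assumptions"]

logframe = {
    "goal":    {"narrative": "Rural incomes increased", "indicators": "...",
                "means_of_verification": "...", "assumptions": "..."},
    "purpose": {"narrative": "Crop productivity raised", "indicators": "...",
                "means_of_verification": "...", "assumptions": "..."},
    "outputs": {"narrative": "Farmers trained in new techniques", "indicators": "...",
                "means_of_verification": "...", "assumptions": "..."},
    "inputs":  {"narrative": "Trainers, materials, budget", "indicators": "...",
                "means_of_verification": "...", "assumptions": "..."},
}

def check_completeness(frame: dict) -> list[str]:
    """Flag any empty cell; the cause-effect logic still needs human review."""
    problems = []
    for row in ROWS:
        for col in COLUMNS:
            if not frame.get(row, {}).get(col, "").strip():
                problems.append(f"Missing {col} at {row} level")
    return problems

print(check_completeness(logframe) or "All cells filled -- now review the if/then logic by hand.")
```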

  25. Results Framework and Logical Framework
Limitations? When completed together… serve as a clear guide to:
• Narrate and specify the project design
• Guide project procurement documents/actions
• Guide selection of key evaluation questions *
• Facilitate quick grasp of the crux of the design by: new project managers, implementing partners, stakeholders, evaluation teams *, and program auditors

  26. Results Framework and Logical Framework Implications for evaluation? What happens if / when… • DO team forgets its development hypothesis? • Has divergent perspectives? • Experiences significant staff turn-over? • Pursues impact evaluation questions not clearly or directly related to its development hypothesis? • Does not convey its development hypothesis throughout all planning/implementation docs? • IPs do not know, do not use or develop a different development hypothesis?

  27. 2. Adequate Performance Indicators, Baselines and Targets… …a review
Performance Indicator: A particular characteristic or dimension used to measure, at any RF level, intended changes defined in a results statement.
• Used to observe progress and measure actual vs. planned results
• Helps answer how or whether a project is progressing towards its objective(s), not why progress is or is not being made (Why not?)

  28. Performance Indicator Types • Proxy • Rating Scale • Contextual • Standard • Custom • Quantitative • Qualitative • Milestone • Index

  29. Good Performance Indicator? Increase in the percentage of children ages 12 to 24 months in the target area who are fully vaccinated

  30. What is this?

  31. Performance Indicators with…
• No baseline/target → 'Not dressed'
• Baseline only → 'Half dressed'
• Target only → 'Half dressed'
But with…
• Baselines/targets/interim targets → 'Fully dressed, ready for work'!

  32. Performance Indicators… …when fully dressed, carry the directional and magnitudinal signal of targets
Neutral Indicator → Fully Dressed Indicator
• Incidence of crop infestation → Cumulative 25% decrease by yr. 3 of 5
• Number of micro-loans → 100 loans of $X to 10 new groups
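One way to picture a "fully dressed" indicator is as a small record carrying its baseline, interim targets, and final target, so actual values can be compared against planned ones. A hedged sketch with invented figures (the micro-loan example loosely mirrors the slide; the class and method names are assumptions):

```python
# Minimal sketch (illustrative names): a "fully dressed" indicator bundles the
# indicator statement with its baseline, interim targets, and final target,
# enabling an actual-vs-planned comparison at any year.
from dataclasses import dataclass


@dataclass
class Indicator:
    statement: str
    baseline: float
    interim_targets: dict[int, float]  # year -> planned value
    final_target: float

    def on_track(self, year: int, actual: float) -> bool:
        """Compare actual vs. planned for a year (assumes 'higher is better')."""
        planned = self.interim_targets.get(year, self.final_target)
        return actual >= planned


micro_loans = Indicator(
    statement="Number of micro-loans disbursed to new groups",
    baseline=0,
    interim_targets={1: 20, 2: 55, 3: 100},
    final_target=100,
)
print(micro_loans.on_track(year=2, actual=48))  # False -- behind the interim target
```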

  33. Performance Indicators…
To formulate indicators for a result, ask:
• How do we know it is a problem? → Data; i.e., what data "evidences" the problem? (Where to find it?)
• What phenomenon are we observing? → Indicator; i.e., what will we observe to ascertain whether the problem is mitigated or solved?
Examples from your sectors?

  34. Performance Indicator Quality Standards
• Objective
• Practical
• Useful for Management
• Direct
• Attributable to USAID efforts
• Timely
• Adequate
• Uni-dimensional?

  35. Beyond Performance Indicator Data
Ponder: What happens when performance management is limited to tracking primarily performance indicator data? Find time to think reflectively~
• Technical or management? (Technology adoption rates, technology variability and adaptability, emerging best practices)
• Financial? (Per unit / per component costs, spread-effect, cost-sharing, financial sustainability factors and progress, etc.)
• Contextual? (Beyond traditional context indicators)
• Others?

  36. 2. Adequate Performance Indicators, Baselines and Targets… • Baseline: Value of a performance indicator before the implementation of USAID-supported projects or activities that contribute to the achievement of the relevant result. When/how to determine baselines? • Target: Specific, planned level of result to be achieved within an explicit timeframe. How to set targets/interim targets?

  37. Imagine... Life without baselines…
Race with no start line~ Baseball diamond with no home plate~ Marriage with no ceremony~ YOU with no birthday~ Journey with no starting point~ Test with no minimum score~ Year with no New Year's Day~ Country with no Independence Day~ Retirement with no retirement date~ Vacation with no start date~ Day without a sunrise~ Night without a moonrise~
(Graphic: "No Start Zone!")

  38. Baselines, Ending the Mystery…
Baselines are either:
• Set to zero
• Already known or determined – preliminary during project design, a previous project, sector assessment or pre-feasibility analysis
• Collected pre-implementation – to verify preliminary data or establish new data
• Collected "on the fly" as implementation unfolds – on a rolling basis, e.g., for phased projects
• Reconstructed ex post facto – during project evaluations
• …or forgotten or ignored altogether!

  39. Baselines, Where do they come from?
• Sector assessments, strategy development exercises
• Past projects (follow-on)
• Other donor projects
• Stakeholder/beneficiary perspectives/experiences
• Sector extension agencies and agents
• Special baseline studies
• Early implementation output
• Ex post facto evaluation teams' guesstimates
• Host government reports and plans
• Comparable regional data
• Expert panel opinions

  40. Special Baseline Studies
Examine a set of characteristics of a target population at a particular time or under a particular set of conditions to establish a reference point ("baseline") from which future change can be measured. A special baseline study should:
• Not be overly complex
• Have a justifiable budget
• Be replicable by local partners and use local expertise *
• Not duplicate existing data; check established protocols
• Focus on the information required for future assessment
• Identify key risks and monitoring requirements *
• Supply useful data for management and local partners
• Address different gender perspectives
• Be participative and locally-owned *
• Provide an adequate basis for judging development results

  41. Special Baseline Studies
Require as much pre-planning and careful implementation as an evaluation:
• Detailed statement of work and budget
• Careful selection of (local) analysts
• Selection/training of enumerators, data entry persons, field supervisors
• Thorough review of available data / collection of new data
• Quality control of field work
• Conduct of institutional assessment(s)
• Use of data to construct socio-economic-organizational profiles
• Fully documenting methodological approaches (Why x 2?)
• Highlighting known data limitations
• Drafting baseline report with recommendations for repeat study

  42. Targets and Interim Targets • Specific, planned level of result to be achieved within an explicit timeframe • Express the desired direction and magnitude of change in the targeted development condition How do you express your targets? (7 ways)

  43. Targets and Interim Targets Quality standards…

  44. Targets and Interim Targets Sounds familiar? Sources… • Insight from sector assessments/strategy dev. • Lessons from past projects (follow-on) • Lessons from other donor experience • Stakeholder/beneficiary past experiences • Sector extension agencies/agents • Established during early implementation • Government plans • Regional data • Expert opinions

  45. Setting Targets Know well… No short-cuts… …without a price~

  46. Targets and Interim Targets Tips… • Participative approaches increase corporate realism; minimize risks of harmful divisiveness • Estimating “without project” trend lines based on historical data could provide useful insight into the likely “bump up” the project may trigger • Careful performance monitoring suggests indicative “output production” rate for early target adjustments • Insight about extension/uptake factors suggests indicative “output-to-outcomes rate” per environment
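The "without project" trend-line tip can be illustrated with a simple least-squares fit to pre-project history; the gap between an observed value and the projected trend is the indicative "bump up". A sketch with invented data (requires Python 3.10+ for statistics.linear_regression):

```python
# Illustrative sketch: fit a "without project" trend line to pre-project
# historical data, then read the project's likely "bump up" as the gap
# between an observed value and the projected trend. All figures invented.
from statistics import linear_regression

years =  [2006, 2007, 2008, 2009, 2010, 2011]   # pre-project history
yields = [1.9,  2.0,  2.0,  2.1,  2.2,  2.2]    # e.g., tons/hectare

slope, intercept = linear_regression(years, yields)

def without_project(year: int) -> float:
    """Projected yield had the project never happened (the trend line)."""
    return slope * year + intercept

observed_2013 = 2.7  # hypothetical mid-project measurement
bump_up = observed_2013 - without_project(2013)
print(f"Estimated bump up in 2013: {bump_up:.2f} tons/hectare")
```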

  47. Targets and Interim Targets Tips…
• IP proposals are instruments of cost-based competition; risky as a basis for target-setting (~"bait and switch")
• A straight line may be the shortest distance between two points, but straight-lining interim targets is unrealistic in development; consider the "S-curve" approach instead
• Setting of performance indicators, baselines and targets is subject to audit! Audit-proof all key decisions and actions by judicious documenting and filing
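To make the straight-line vs. S-curve contrast concrete, the sketch below derives interim targets two ways for an invented five-year, 100-loan final target: equal annual increments versus a logistic curve (slow start, fast middle, taper). The steepness parameter and figures are illustrative assumptions:

```python
# Illustrative sketch: straight-line vs. S-curve interim targets.
import math

def straight_line(baseline: float, final: float, years: int) -> list[float]:
    """Naive interim targets: equal increments each year."""
    step = (final - baseline) / years
    return [round(baseline + step * y, 1) for y in range(1, years + 1)]

def s_curve(baseline: float, final: float, years: int, k: float = 1.5) -> list[float]:
    """Interim targets along a logistic curve: slow start, fast middle, taper.
    The curve is rescaled so the final year lands exactly on the final target."""
    mid = (years + 1) / 2
    raw = [1 / (1 + math.exp(-k * (y - mid))) for y in range(0, years + 1)]
    lo, hi = raw[0], raw[-1]
    norm = [(r - lo) / (hi - lo) for r in raw[1:]]
    return [round(baseline + (final - baseline) * n, 1) for n in norm]

# 100 micro-loans over 5 years, starting from zero (invented figures):
print(straight_line(0, 100, 5))  # [20.0, 40.0, 60.0, 80.0, 100.0]
print(s_curve(0, 100, 5))        # slow years 1-2, catch-up in years 3-4
```

Straight-lining expects 20 loans in year 1; the S-curve expects only a handful, which better reflects start-up realities.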

  48. More Key Evaluation Terms and Definitions
• Noise
• Reliability
• Credibility
• Trustworthiness
• Bias
• Validity
• Internal validity
• Generalizability (external validity)

  49. 3. Few Key Evaluation Questions
General factors to consider…
• Reformulate the development hypothesis as the principal/over-arching evaluation question
• Formulate key questions based on evaluation triggers and "S-curve" timing
• Identify appropriate sub-questions
• Consider/use all three categories of evaluation questions: descriptive questions, normative questions, and cause-and-effect questions

  50. Few Key Evaluation Questions
• Descriptive questions: Seek to determine a "what is" snapshot. Ex: What are current levels of crop productivity and production? Which farmers in target areas report the highest levels of productivity and production increases?
• Normative questions: Compare "what is" with "what should be". Ex: How do reported crop productivity and production levels compare with targeted/expected levels? (Inputs-to-outputs-to-outcomes focus)
• Cause-and-effect questions: Determine what difference a project intervention makes – attribution. Ex: Did introduced farming and harvesting techniques significantly increase crop productivity and production levels? (Outcomes-to-impact focus)
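For the cause-and-effect category, one common way to approximate the counterfactual is a difference-in-differences comparison, using a similar non-participating group to estimate what would have happened anyway. This is a worked-arithmetic sketch with invented figures, not a prescribed USAID method:

```python
# Illustrative arithmetic only (invented figures): a difference-in-differences
# comparison for a cause-and-effect question, approximating the counterfactual
# with a similar non-participating comparison group.
participants = {"before": 2.0, "after": 2.8}   # mean yield, tons/hectare
comparison   = {"before": 2.1, "after": 2.4}   # similar farmers, no project

participant_change = participants["after"] - participants["before"]  # 0.8
comparison_change  = comparison["after"]  - comparison["before"]     # 0.3

# The comparison group's change stands in for "what would have happened anyway".
attributable_effect = participant_change - comparison_change          # 0.5
print(f"Estimated effect attributable to the project: {attributable_effect:.1f} t/ha")
```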
