
Evaluating the Options




Presentation Transcript


  1. Evaluating the Options The analyst’s job is to gather the best evidence possible in the time allowed to compare the potential impacts of policies

  2. [Diagram: evaluation questions along the program timeline, from Activities, Policies, and Programs to Outcomes] • Before (What should we do?): Policy analysis, including benefit-cost analysis, cost-effectiveness analysis, and needs assessment • During (What are we doing? How does it work?): Process evaluation and performance measurement • After (What was the impact?): Outcome evaluation

  3. WSIPP (Washington State Institute for Public Policy): “Return on Investment: Evidence-Based Options to Improve Statewide Outcomes” • What information do we get from the WSIPP study? • How did they create it? • What principles can we take away for our predictions?

  4. Benefits: To whom? For what period?

  5. Costs: What’s included? How can they be positive?

  6. Summary stats: What are they? How are they different?

  7. Net present value (WSIPP 2011, appendix II, p. 6): NPV = Σ_y (Q_y × P_y - C_y) / (1 + Dis)^y, where Q_y is how much of the outcome you get with the program in year y, P_y is the value of the outcome, C_y is the cost of the program, and Dis is the discount rate
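
  As a rough illustration of the formula above, here is a minimal Python sketch. All numbers, and the variable names quantities/prices/costs, are invented for this example; they are not WSIPP estimates.

```python
# Minimal NPV sketch following the variable definitions above.
# All numbers are hypothetical, not WSIPP estimates.

def net_present_value(quantities, prices, costs, discount_rate):
    """NPV = sum over years y of (Q_y * P_y - C_y) / (1 + Dis)**y."""
    return sum(
        (q * p - c) / (1 + discount_rate) ** year
        for year, (q, p, c) in enumerate(zip(quantities, prices, costs))
    )

# Example: a program that costs $1,000 per participant in year 0
# and avoids 0.02 units of a bad outcome valued at $20,000 per unit
# in each of the next five years.
Q = [0.0] + [0.02] * 5          # outcome units avoided each year
P = [20_000.0] * 6              # dollar value per outcome unit
C = [1_000.0] + [0.0] * 5       # program cost each year
print(round(net_present_value(Q, P, C, discount_rate=0.03), 2))
```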

  8. Benefit-cost ratio (WSIPP 2011, appendix II, p. 6): the present value of benefits divided by the present value of costs. Internal rate of return: the discount rate at which NPV is zero
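
  A companion sketch for both summary statistics, reusing the hypothetical cash flows from the NPV example above. The bisection search is just one simple way to find the rate where NPV crosses zero; it is not WSIPP’s procedure.

```python
# Benefit-cost ratio and IRR sketch (hypothetical numbers, not WSIPP's).

def discounted(values, rate):
    return sum(v / (1 + rate) ** year for year, v in enumerate(values))

benefits = [0.0, 400.0, 400.0, 400.0, 400.0, 400.0]   # Q_y * P_y each year
costs    = [1000.0, 0.0, 0.0, 0.0, 0.0, 0.0]          # C_y each year

# Benefit-cost ratio: PV of benefits / PV of costs at a chosen discount rate.
bc_ratio = discounted(benefits, 0.03) / discounted(costs, 0.03)

# IRR: the discount rate at which NPV equals zero, found by bisection.
def npv(rate):
    return discounted(benefits, rate) - discounted(costs, rate)

lo, hi = 0.0, 1.0          # assumes NPV(lo) > 0 > NPV(hi)
for _ in range(100):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if npv(mid) > 0 else (lo, mid)

print(f"B/C ratio: {bc_ratio:.2f}, IRR: {lo:.1%}")
```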

  9. WSIPP study elements: • Meta-analysis: averages results across multiple studies to get program impacts, Q (appendix I) • Estimates the monetary private gains to participants, the public value of avoiding outcomes like abuse and crime, and the private value of not being a victim, P • Puts these together to get the impact on outcomes over a lifetime, with discounting • Benefit-cost analysis adds up the benefits and costs
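
  One common way to average results across studies is an inverse-variance weighted (fixed-effect) mean. The sketch below uses made-up effect sizes and standard errors; WSIPP’s actual meta-analytic adjustments are described in its appendix I.

```python
# Fixed-effect meta-analysis sketch: pool effect sizes across studies
# by weighting each study by the inverse of its variance.
# Effect sizes and standard errors below are made up for illustration.

studies = [
    # (effect size, standard error)
    (-0.15, 0.05),
    (-0.22, 0.08),
    (-0.05, 0.10),
]

weights = [1 / se**2 for _, se in studies]
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")
```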

  10. From the WSIPP case: • What outcomes will you account for? • Impacts on whom? (state or local budget, participants, bystanders) • What time frame will you use? (discounting and NPV) • How will you weight multiple sources of evidence given their quality? • How will you add it all up? • Sensitivity analysis of results? • What criteria “count” other than monetized ones?

  11. General starting points for predictions: • Need detailed descriptions of the alternatives (but not TOO detailed) • Focus on key impacts and the most important costs • Common metrics (dollars, DALYs, etc.) are useful if they capture key outcomes • May need to adjust estimates for your scale or context • Get the best possible evidence; you won’t get perfect information • Need to understand the strengths and weaknesses of your evidence and communicate them

  12. Where to get program evaluation evidence on costs and impacts (from Hatry): • Previous experience with similar changes • A pilot study in your organization • Information from other organizations that implemented similar policies (program evaluations) • Academic or think tank studies (academic journal search and web search) • Modeled or “engineered” estimates • Theories and logical inference about causal connections (tragically, this often yields only “high,” “medium,” or “low”!) WEAKEST!

  13. Does the evidence from elsewhere apply to your organization (external validity)? • Is the policy or political context different in important ways? • Are the economic conditions different? • Is the target of the policy (e.g., client population or location) different in critical ways? • Would the policy or program be implemented in the same way? To the same scale? You must assess the severity of the differences and predict their impacts on your outcomes.

  14. Sources of uncertainty in estimates: • Validity of the comparison and study methodology (see the WSIPP report) • Statistical uncertainty (randomness) • Uncertainty in how the policy would be implemented in the new context • Possible changes in other policies or conditions (e.g., economic or social)

  15. What do you do with uncertainty? • Give explicit range estimates for costs or impacts • Perform sensitivity analysis and discuss effects on trade-offs (e.g., Monte Carlo; see the sketch below) • Use worst-case/best-case scenarios • Give best-guess estimates with caveats • Build resilience into your policy options But… • Clients like certainty • There is limited time/space to explain details • Decisions must be made in the face of uncertainty
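
  A toy Monte Carlo sensitivity sketch: draw the uncertain inputs many times (here an effect size and a cost, with invented distributions), recompute NPV on each draw, and report a range instead of a single number.

```python
# Monte Carlo sensitivity sketch: propagate uncertainty in an effect
# estimate and a cost estimate through to a distribution of NPVs.
# Distributions and parameters are invented for illustration.
import random

random.seed(0)
draws = []
for _ in range(10_000):
    effect = random.gauss(0.02, 0.005)     # uncertain outcome units avoided/year
    cost = random.gauss(1_000, 150)        # uncertain up-front program cost
    value_per_unit = 20_000
    npv = -cost + sum(
        effect * value_per_unit / 1.03 ** year for year in range(1, 6)
    )
    draws.append(npv)

draws.sort()
print("median NPV:", round(draws[len(draws) // 2]))
print("90% interval:", round(draws[500]), "to", round(draws[9_500]))
```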

  16. How do you add it all up? • Most likely you don’t! Any adding-up scheme inherently weights the criteria • Can use cost-benefit analysis (monetize) • Use go/no-go (minimum threshold) for each criterion and pick the policy that meets all of them, then maximize one (see the sketch below) • You don’t have to recommend one policy, but you MUST point out KEY trade-offs across policy options
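
  A small sketch of the go/no-go rule: screen each option against minimum thresholds on every criterion, then maximize one chosen criterion among the survivors. Option names, criteria, and scores are hypothetical.

```python
# Go/no-go screening sketch: drop options that miss any minimum threshold,
# then maximize a single criterion among the survivors.
# Options, criteria, and scores are hypothetical.

options = {
    "Option A": {"net_benefit": 800, "equity": 3, "feasibility": 4},
    "Option B": {"net_benefit": 1200, "equity": 2, "feasibility": 5},
    "Option C": {"net_benefit": 950, "equity": 4, "feasibility": 3},
}
thresholds = {"equity": 3, "feasibility": 3}   # minimum acceptable scores

survivors = {
    name: scores for name, scores in options.items()
    if all(scores[c] >= t for c, t in thresholds.items())
}
best = max(survivors, key=lambda name: survivors[name]["net_benefit"])
print("passes thresholds:", list(survivors), "| pick:", best)
```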

  17. Your mandate: • Find at least one quantitative outcome criterion that you can find evidence to estimate • You must provide cost estimates for your options • Use at least one academic or think tank study as evidence for at least one outcome (and preferably more) • I challenge you to find the most informative quantitative and qualitative evidence from the broadest sources.
