

Evaluating Government R&D Programs with Quantitative Measures: What Works Best
AEA 2006 Annual Conference, November 2, 2006
Session Chair: Bruce McWilliams; First Presenter: Mary Beth Zimmerman; Second Presenter: Brian Zuckerman; Third Presenter: Bruce McWilliams; Discussant: George Teather





Presentation Transcript


  1. Evaluating Government R&D Programs with Quantitative Measures: What Works Best
  AEA 2006 Annual Conference, November 2, 2006
  Session Chair: Bruce McWilliams
  First Presenter: Mary Beth Zimmerman
  Second Presenter: Brian Zuckerman
  Third Presenter: Bruce McWilliams
  Discussant: George Teather
  Cooperative State Research, Education, and Extension Service, http://www.csrees.usda.gov

  2. Why Quantitative Measures?
  • The "holy grail" of evaluation metrics
  • When metrics have cardinal differences:
    - "Apples to apples" comparisons can be made
    - Statistics can be generated
    - Correlation or causal chains can be supported with statistical evidence

  3. Government R&D: Unique Evaluation Context
  • Government R&D characteristics
  • Federal management initiatives
  • Institutional characteristics & incentives

  4. Characteristics of Government R&D
  • Outcomes are typically "public goods"
    - Disciplinary or fundamental knowledge
    - Non-rivalrous in use and/or non-excludable
    - Insufficient appropriability leads to market failure
  • Serendipitous discovery can occur
  • Significant uncertainties in the flows of expected costs & benefits in an R&D pipeline
  • Wide range of motives and incentives
    - Quest for understanding vs. quest for usefulness

  5. Recent Federal Management Initiatives & Legislation
  • GPRA (1993): Government Performance & Results Act
    - Must submit strategic plans to OMB and Congress every three years
    - Must submit an annual performance plan to Congress along with the budget request
    - Performance (& Accountability) Plan must be based on performance elements established in the plan from the previous year

  6. Recent Federal Management Initiatives & Legislation
  • President's Management Agenda (2001)
    - Budget Performance Integration (BPI)
    - OMB's Program Assessment Rating Tool (PART)

  7. Institutional Characteristics
  • Very different!
  • Scale & budget (capabilities differ)
  • Intramural vs. extramural (monitoring differs)
  • Focus: exogenously vs. endogenously defined (planning horizons differ, measures differ)
  • Other differences

  8. Institutional Incentives
  • Competition over budgets
  • Potentially overlapping objectives
  • Differing incentives to:
    - drop underperforming research activities
    - transition technologies out of the public sector
  • Others

  9. Implications
  So what impact do these characteristics & incentives have on:
  • Achievable top-level aggregate comparability?
  • Managerial decision rules?
  • Performance improvements?
  • Budget performance integration?

  10. Questions
  What is the best way to quantitatively evaluate R&D programs?
  • Should anything be done differently to evaluate:
    - basic knowledge generation;
    - the development of applied knowledge; or
    - the implementation of research knowledge?
  • Does the customer of R&D outcomes make a difference?
  • Do an organization's characteristics (size, structure, ...) impact a quantitative evaluation?
  • Is there, or should there be, a core evaluation policy that spans all federal agencies?
