
Performance Measures, Outcomes, and Impacts






Presentation Transcript


  1. Performance Measures, Outcomes, and Impacts Sharon Stout Office of Planning and Accountability April 24, 2007

  2. Objectives • To discuss CSREES’ use of performance measures in the recent past and in the present; • To review what CSREES OPA hopes to find in future performance measures, outputs, outcomes, and impacts; and • To encourage ongoing dialogue on performance measurement, measures, and indicators within the CSREES Land-Grant University System.

  3. Fundamental Principles • Axiom 1: What is measured can be improved. • Axiom 2: The above statement is true – only if we are measuring the right things.

  4. Critical Elements for Success • Select the right things to measure – both controllable and important to achieving success; • Measure those things in the right ways – with useable indicators and targets; • Employ measures in a framework to improve performance; and • Create an organizational culture valuing and encouraging use of measures.

  5. Outline • CSREES Past Use of Performance Data • Using a Performance Measures Framework, including Logic Models, to Improve • CSREES Examples • Developing and Using Performance Indicators and Targets • Using and Valuing Measures

  6. CSREES Past Use of Performance Data

  7. CSREES Program Leadership and Showcasing Results CSREES and the Land-grant System are • Addressing the national problems identified in the Strategic Plan; • Choosing the right actions to respond; • Achieving results; and • Communicating with stakeholders, including funders, the public, and the Land-grant System.

  8. CSREES Accountability • Defined as “the state of being accountable; liability to be called on to render an account; the obligation to bear the consequences for failure to perform as expected.” • CSREES is accountable to the American people, the Administration, and Congress for the use of public funds. Webster’s Dictionary, 1913

  9. Outputs, Outcomes and Impacts State Plans of Work and Annual Reports were used to provide examples for: • Budget justification and narrative; • PART self-assessments; • USDA Reports of Accomplishments; • Communication with Partners, other agencies, and the public.

  10. Success Extracting Results? • CSREES has not been as successful as we would like to be in presenting a systematic report of results from State POW annual reports. • External panels brought in to review CSREES accomplishments for the Portfolio Review Expert Process noted the lack of an adequate data system – and results.

  11. Representative Comments by Experts in Reports on PREP… • “The panel was disappointed with the evidence provided…” • “The portfolio failed to present a complete picture of all the inputs, outputs and outcomes.” • “There is a strong need to improve accountability showing measurable impacts, not just in CSREES, but throughout the system and down to individual investigators.”

  12. Improved POW System is Raising Panelists’ Expectations • “… Plan of Work and One Solution are expected to improve the documentation of significant findings.” • “In 5 years, [the panel] expects to see the consistent information across knowledge areas necessary to evaluate the portfolio properly.”

  13. Expectations of Panelists for CSREES OPA and NPLs • CSREES OPA will work “… to improve data collection, performance measurement, and reporting.” • “CSREES and NPLs should have better communication with state partners in order to get significant evidence of outputs and impacts.”

  14. What is Happening With CSREES POW and 1Solution? • Exciting developments – as reported in other presentations here by Bart Hewitt and by CSREES ISTM. • “Management dashboards” will enable CSREES deputies, NPLs, and P&A to examine trends – and drill down to specific projects. • States can access information from other states.

  15. Present Status, Future Hopes • Exciting developments refers to changes in our information systems • CSREES OPA, ISTM, and Budget Office work together to make better use of our information • How can we jointly improve the information in our systems?

  16. Reducing Burden, Increasing Use • CSREES is no longer collecting “impacts” through a separate system. • CSREES is relying on Plans of Work and annual reports for both “stories and statistics” for planning, reporting, and budget justification.

  17. Using Performance Measurement Frameworks to Improve

  18. Performance Measurement Definition and Use • Ongoing, regular collection of information for monitoring how a policy, program or initiative is doing at any point in time. • Designed and used to report on the level of attainment of planned results and on performance trends over time.

  19. Performance Measurement Framework • Begins with design of a policy, program or initiative and evolves over time. • Always is engaged as part of the ongoing management of a policy, program or initiative. • Continues from the initial choices of performance measures and indicators, through performance monitoring to formative and summative evaluation.

  20. Performance Measurement Framework Cycle (diagram): (1) Program Profile / Logic Model → (2) Specify Measures → (3) Establish Data Collection Procedures → (4) Develop Information System → (5) Measure & Report Performance → (6) Formative Evaluation → (7) Summative Evaluation → (8) Review and Modify; feedback at every step builds performance measurement understanding.

  21. Criteria for Assessing PM Frameworks • Useful for results management and accountability • Shared ownership • Transparent • Decision- and action-oriented • Credible and realistic • Flexible

  22. CSREES Use of PM Framework • Planning tools (‘strategic thinking’) -- to focus attention on desired outcomes • Model to identify and track measures of interest -- inputs, outputs, and outcomes. • Evaluability assessment – does this program on its face make sense? (Identify questions to be addressed in an implementation plan)

  23. PM Framework in CSREES Context … Performance measurement framework is designed with organization • Mission; • Vision; • Strategic Goals; • Objectives; and • Organizational and functional linkages.

  24. CSREES Mission and Vision… Mission • To advance knowledge for agriculture, the environment, human health and well-being, and communities Vision • To improve the lives of people worldwide through an agricultural knowledge system sustained by the innovation of scientists and educators.

  25. Types of Strategies (diagram): organizational stimuli shape an Intended Strategy; the Deliberate Strategy that survives, together with Emergent Strategy, becomes the Realized Strategy, while the remainder is Unrealized Strategy. Source: Mintzberg, H., Ahlstrand, B., Lampel, J. (1998)

  26. CSREES Strategic Goals • Enhance international competitiveness of American agriculture; • Enhance competitiveness and sustainability of Rural and Farm economies; • Support increased economic opportunity and improved quality of life in rural America; • Enhance protection and safety of the Nation’s agriculture and food supply; • Improve the Nation’s nutrition and health; and • Protect and enhance the Nation’s Natural Resource Base and Environment.

  27. CSREES Logic Model as “Roadmap” • Situation – describe the problem, challenge, and opportunities; • Inputs – $$$, other partners (NIH, ARS); • Activities – what we do in research, extension, and education; • Outputs – describe what comes out from funded activities; • Outcomes – Knowledge (occurs when there is a change in knowledge, or the participants actually learn), Actions (occurs when there is a change in behavior, or the participants act on what they have learned), and Conditions (occurs when a societal condition is improved due to a participant action taken in the previous column). Assumptions and external factors frame the entire model.

  28. Water Quality Program: Logic Model • Situation/Priorities – farmers at risk of overfeeding phosphorus; • Inputs – staff, money, materials, partners; • Outputs (activities and participants) – educational workshops, on-farm visits; • Outcomes–Impact – Knowledge: increased knowledge of the link between cattle diet and water quality, increased understanding of recommended phosphorus levels, increased knowledge of tracking phosphorus levels; Actions: monitor phosphorus levels in feed, manure, and soil, set up record-keeping systems to track phosphorus, make appropriate adjustments to cattle feed; Conditions: reductions in phosphorus use, feed cost savings, improved water quality.
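The columns above can be held in a simple data structure so that each link in the knowledge–actions–conditions chain can be traversed when choosing measures. This is an illustrative sketch only; the dictionary layout is an assumption, not a CSREES system:

```python
# Illustrative sketch (not from the presentation): the water quality
# logic model as a plain Python structure, one key per model column.
water_quality_logic_model = {
    "situation": ["Farmers at risk of overfeeding phosphorus"],
    "inputs": ["Staff", "Money", "Materials", "Partners"],
    "activities": ["Educational workshops", "On-farm visits"],
    "outcomes": {
        "knowledge": [
            "Increased knowledge of link between cattle diet and water quality",
            "Increased understanding of recommended phosphorus levels",
            "Increased knowledge of tracking phosphorus levels",
        ],
        "actions": [
            "Monitor phosphorus levels in feed, manure, and soil",
            "Set up record-keeping systems to track phosphorus",
            "Make appropriate adjustments to cattle feed",
        ],
        "conditions": [
            "Reductions in phosphorus use",
            "Feed cost savings",
            "Improved water quality",
        ],
    },
}

# Walk the outcome chain in its causal order: knowledge -> actions -> conditions.
for stage in ("knowledge", "actions", "conditions"):
    for outcome in water_quality_logic_model["outcomes"][stage]:
        print(f"{stage}: {outcome}")
```

Listing the stages in causal order mirrors how the later slides attach indicators column by column.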

  29. Water Quality Program: Performance Measurement, Evaluation Questions (diagram) – evaluation questions are posed for each logic model column: inputs; outputs (activities and participants); and outcomes (knowledge, actions, conditions).

  30. Water Quality Program: Indicators (diagram) – “Indicators: How will you know it?” Indicators are specified for each logic model column: inputs; outputs (activities and participants); and outcomes (knowledge, actions, conditions).

  31. Uses of the Logic Model • Clarifies the linkages between activities, outputs and expected outcomes of the policy, program or initiative • Communicates externally about the rationale, activities and expected results of the policy, program or initiative;

  32. Uses of the Logic Model • Tests whether the policy, program or initiative "makes sense" from a logical perspective; and • Provides the fundamental framework on which the performance measurement and evaluation strategies are based (i.e., determining what would constitute success).

  33. Performance Measures vs. Performance Indicators • Performance measures are conceptual, and need to be operationalized • Performance indicators are operationalized, to specify actual data (qualitative or quantitative) to be collected and used

  34. What is a Performance Indicator? An indicator is a quantitative or qualitative value used to determine the extent to which progress is made toward an outcome. • Quantitative Example: percent of firms exporting 10 percent of their produce • Qualitative Example: distance between mother and child pairs in video of play activity
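The quantitative example above reduces to a simple calculation once the underlying data exist. A minimal sketch, using made-up survey records (the firm names and field names are assumptions for illustration only):

```python
# Hypothetical sketch: computing the slide's quantitative indicator
# ("percent of firms exporting 10 percent of their produce") from
# invented survey records. This is not actual CSREES data.
firms = [
    {"name": "Firm A", "export_share": 0.25},
    {"name": "Firm B", "export_share": 0.05},
    {"name": "Firm C", "export_share": 0.10},
    {"name": "Firm D", "export_share": 0.00},
]

# Count firms that export at least 10% of their produce.
exporting = [f for f in firms if f["export_share"] >= 0.10]
indicator = 100.0 * len(exporting) / len(firms)
print(f"Percent of firms exporting >= 10%: {indicator:.1f}%")  # 50.0%
```

The point is the operationalization step: the conceptual measure ("export activity") becomes a concrete number only after the data source, threshold, and formula are pinned down.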

  35. Outputs Products, services and events that are: • Intended to lead to outcomes; and • Linked to problems or issues to be addressed – through the logical or causal chain of events depicted in the model.

  36. Performance Measures Examples – Outputs • Number of training sessions held • Number of participants trained • Number of instructional hours • Number of publications • Number of patents

  37. Outcomes Planned results or changes for individuals, groups, communities, organizations or systems, including: • Changes in knowledge; • Changes in behavior; and • Changes in conditions (impacts) -- resulting in solution of the original problem or issue.

  38. Examples of Outcomes • Knowledge: Change in level of knowledge regarding plant production • Behaviors: Change in farming practices • Conditions: Increased food security

  39. CSREES Examples Performance Measures and Indicators

  40. Examples … • Soybean Rust • CEAP • Cryptosporidium • Nutrition and Obesity • Family Financial Security

  41. What are We Seeking? Quantitative and qualitative evidence that • Researchers are identifying possible solutions to national problems; • Educators and extension faculty and staff are helping the public learn; and • The public is applying new knowledge and addressing national problems, thus • Producing changes in conditions.

  42. Soybean Rust • Devastating disease worldwide – some regions lost 60-80% of their soybean crop • First found in November 2004 by an Extension Specialist at Louisiana State University trained by a CSREES program, SPDN • Rapid response by USDA agencies and Land-Grant partners • CSREES and its National Plant Diagnostic Network (NPDN) provided disease recognition and pathogen diagnostic tools • Helped save $11 to $299 million in 2005 …

  43. Penn State Soybean Rust Model

  44. Soybean Rust in 2005 – SBR observations as of 12-14-2005 (map)

  45. Zooming to a state brings up county-level resolution and specific guidelines for that state.

  46. Producer Access to Information

  47. Soybean Rust Outputs • Research papers modeling spread of soybean rust; • Training for extension agents in identifying and treating soybean rust; • Development of the Internet Pest Information Platform.

  48. Soybean Rust Output Indicators • # of research papers modeling spread of soybean rust; • # of training hours for extension agents; • # of participants trained; • Completed development of the Pest Information Platform.
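Output indicators like these are simple tallies over activity records. A hedged sketch, using an invented record format (the fields below are assumptions, not the actual POW reporting schema):

```python
# Hypothetical sketch: rolling raw activity records up into the
# output indicators listed on this slide. The record layout is
# invented for illustration only.
from collections import Counter

activity_records = [
    {"type": "research_paper", "topic": "SBR spread model"},
    {"type": "training", "hours": 3, "participants": 25},
    {"type": "training", "hours": 2, "participants": 40},
    {"type": "research_paper", "topic": "SBR spore transport"},
]

# Tally records by type, then sum the training-specific fields.
counts = Counter(r["type"] for r in activity_records)
training = [r for r in activity_records if r["type"] == "training"]

indicators = {
    "research_papers": counts["research_paper"],
    "training_hours": sum(r["hours"] for r in training),
    "participants_trained": sum(r["participants"] for r in training),
}
print(indicators)
# {'research_papers': 2, 'training_hours': 5, 'participants_trained': 65}
```

Counting is the easy half; the harder half, as the next slides note, is tying these outputs to outcome indicators such as changed fungicide use.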

  49. Soybean Rust Outcomes • Knowledge – producers aware of spread of soybean rust and access the Pest Information Platform for advice on treatment; • Behavior – producers reduce fungicide use (cease treating in advance of spread); • Conditions – crops appropriately treated, cost savings, less fungicide in ground water.

  50. Indicators Outcomes: • Knowledge – number of producer consultations with Extension agents; • Behavior – amounts and locations of applied fungicide; claims filed for crop damage • Conditions – TBD – ERS model of cost savings
