
Results-Based Accountability


Presentation Transcript


  1. Results-Based Accountability Discovery Communities TA Institute April 1, 2008

  2. Results Accountability Decision-making and Strategic Planning. Fiscal Policy Studies Institute, Santa Fe, New Mexico • WEBSITES • www.resultsaccountability.com • www.raguide.org • www.charteroakgroup.com • BOOK ORDERS • www.trafford.com • www.amazon.com

  3. Why Are We Here?

  4. SIMPLE • COMMON SENSE • PLAIN LANGUAGE • MINIMUM PAPER • USEFUL

  5. Concepts to Take Away Today • Population accountability v. performance accountability • Ends v. means • How to choose indicators and measures • The importance of a data development agenda • How to move from talk to action

  6. Results Accountability is made up of two parts: • Population Accountability: about the well-being of WHOLE POPULATIONS. For communities, cities, counties, states, and nations. • Performance Accountability: about the well-being of CLIENT POPULATIONS. For programs, agencies, and service systems.

  7. Results and Performance Accountability: COMMON LANGUAGE, COMMON SENSE, COMMON GROUND

  8. THE LANGUAGE TRAP: Too many terms. Too few definitions. Too little discipline. Terms: benchmark, outcome, result, indicator, goal, measure, objective, target. Modifiers: measurable, core, urgent, qualitative, priority, programmatic, targeted, performance, incremental, strategic, systemic. (Lewis Carroll Center for Language Disorders)

  9. The Humpty Dumpty Approach to Language • “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.” “The question is,” said Alice, “whether you can make words mean so many different things.” “The question is,” said Humpty Dumpty, “which is to be master—that’s all.” • LEWIS CARROLL, Through the Looking-Glass, chapter 6, p. 205 (1934). First published in 1872.

  10. DEFINITIONS (Population and Performance) • RESULT: A condition of well-being for children, adults, families or communities. Examples: children born healthy, children succeeding in school, safe communities, clean environment, prosperous economy. • INDICATOR: A measure which helps quantify the achievement of a result. Examples: rate of low-birthweight babies, rate of high school graduation, crime rate, air quality index, unemployment rate. • PERFORMANCE MEASURE: A measure of how well a program, agency or service system is working. Three types: 1. How much did we do? 2. How well did we do it? 3. Is anyone better off? (= customer outcomes)

  11. From Ends to Means, From Talk to Action. Population side: RESULT and INDICATOR are ends. Performance side: PERFORMANCE MEASURES, where customer outcome = ends and service delivery = means.

  12. Connecticut Glossary of RBA Terms • The Appropriations Committee standardized the terms we use in Connecticut. You have a copy of the glossary. • We could have chosen other terms consistent with Friedman’s RBA approach so this list is somewhat arbitrary. • Its virtue is that everyone in Connecticut doing this work – both executive branch and legislative branch – is using the same vocabulary and meaning the same thing by it. At least we should be able to understand each other.

  13. Some Connecticut Early Childhood Indicators • % Infants born at low birth weight • % Births to mothers without a high school degree • % of kindergartners with all or most pre-literacy and personal skills • % 4th grade reading scores at mastery or above

  14. IS IT A RESULT, INDICATOR OR PERFORMANCE MEASURE? • 1. Safe Community • 2. Crime Rate • 3. Average Police Dept response time • 4. A community without graffiti • 5. % of surveyed buildings without graffiti • 6. People have living wage jobs and income • 7. % of people with living wage jobs and income • 8. % of participants in job training who get living wage jobs

  15. IS IT A RESULT, INDICATOR OR PERFORMANCE MEASURE? • 9. % HS graduates enrolling in college • 10. Traffic-related death rate • 11. Clean environment • 12. Air pollutants in parts per million • 13. % participating cities fixing treatment plants • 14. All children eat healthy • 15. % students eligible for free lunch participating in free lunch

  16. POPULATION ACCOUNTABILITY • For Whole Populations in a Geographic Area

  17. Maryland Results for Child Well-Being ● Babies born healthy ● Healthy children ● Children enter school ready to learn ● Children successful in school ● Children safe in their families and communities ● Stable and economically independent families ● Communities that support family life

  18. Connecticut Results Statements • A clean and healthy Long Island Sound • All children healthy and ready to learn by age 5 • All children ready by five and fine by nine

  19. Fixing a Leaking Roof (results thinking in everyday life) • Experience: not OK • Measure: inches of water • Baseline: turning the curve • Story behind the baseline (causes) • Partners • What works • Action plan

  20. Seven Population Accountability Questions 1. What are the quality of life conditions we want for the children, adults and families who live in our community? 2. What would these conditions look like if we could see them? 3. How can we measure these conditions? 4. How are we doing on the most important of these measures? 5. Who are the partners that have a role to play in doing better? 6. What works to do better, including no-cost and low-cost ideas? 7. What do we propose to do?

  21. “We haven’t got the money, so we’ve got to think.” (Lord Rutherford, 1871-1937)

  22. Connecticut Population Template

  23. Criteria for Choosing Indicators as Primary vs. Secondary Measures • Communication Power: Does the indicator communicate to a broad range of audiences? • Proxy Power: Does the indicator say something of central importance about the result? Does the indicator bring along the data HERD? • Data Power: Is quality data available on a timely basis?

  24. Choosing Indicators Worksheet (Outcome or Result: Safe Community). Each candidate indicator (Measure 1 through Measure 8) is rated High / Medium / Low on Communication Power, Proxy Power, and Data Power. Candidates that rate high on communication and proxy power but low on data power go on the Data Development Agenda.

  25. Three Part Indicator List for each Result • Part 1: Primary Indicators: 2, 3 or 4 “headline” indicators; what this result “means” to the community; meets the public square test. • Part 2: Secondary Indicators: everything else that’s any good (nothing is wasted); used later in the story behind the baseline. • Part 3: Data Development Agenda: new data; data in need of repair (quality, timeliness, etc.)
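The worksheet-to-three-part-list sorting described in slides 23-25 can be mimicked in a few lines of code. This is an illustrative sketch only: the H/M/L ratings are human judgments, and the indicator names and ratings below are hypothetical examples, not actual Connecticut measures. The sorting rule (high on everything = primary; weak data = data development agenda; everything else = secondary) is one plausible reading of the worksheet.

```python
def sort_candidates(candidates):
    """Split candidate indicators into primary, secondary, and
    data-development lists from H/M/L ratings for communication
    power, proxy power, and data power."""
    primary, secondary, data_agenda = [], [], []
    for name, comm, proxy, data in candidates:
        if data == "L":
            # Strong idea, weak data: repair the data, don't discard the idea.
            data_agenda.append(name)
        elif comm == "H" and proxy == "H" and data == "H":
            primary.append(name)    # "headline" indicator
        else:
            secondary.append(name)  # kept for the story behind the baseline
    return primary, secondary, data_agenda

# Hypothetical candidates for the result "Safe Community":
candidates = [
    ("crime rate",                "H", "H", "H"),
    ("% buildings with graffiti", "M", "M", "H"),
    ("perceived safety survey",   "H", "H", "L"),
]
primary, secondary, agenda = sort_candidates(candidates)
print(primary)    # ['crime rate']
print(secondary)  # ['% buildings with graffiti']
print(agenda)     # ['perceived safety survey']
```

Note that nothing is wasted: a low-data-power candidate is routed to the data development agenda rather than dropped, matching Part 3 of the slide above.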

  26. The Matter of Baselines. Baselines have two parts: history and forecast. Ask of the forecast: OK or not OK? The goal is not point-to-point comparison but turning the curve against the baseline forecast.
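The two-part baseline can be sketched numerically: fit a trend to the historical values, project it forward as the forecast, and then judge later actuals against that forecast rather than against a single earlier point. The data below are made up for illustration; real baselines would use the indicators chosen earlier.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Part 1: history (e.g. a rate per 1,000 over years 0-4, trending the wrong way).
history = [10.0, 11.0, 12.5, 13.0, 14.5]
slope, intercept = linear_fit(range(len(history)), history)

# Part 2: forecast -- where the curve goes if nothing changes (years 5-7).
forecast = [slope * year + intercept for year in range(5, 8)]

# "Turning the curve" means actuals coming in below the baseline forecast,
# even if the raw numbers are still worse than the starting point.
actuals = [14.8, 14.2, 13.5]
turned = [a < f for a, f in zip(actuals, forecast)]
print(turned)  # [True, True, True]
```

This also shows why point-to-point comparison misleads: the first actual (14.8) is worse than every historical value, yet it already sits below the forecast, so the curve is turning.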

  27. Performance Accountability • For Programs, Agencies and Service Systems

  28. Results Accountability is made up of two parts: • Population Accountability: about the well-being of WHOLE POPULATIONS. For communities, cities, counties, states, and nations. • Performance Accountability: about the well-being of CLIENT POPULATIONS. For programs, agencies, and service systems.

  29. “All Performance Measures that have ever existed for any program in the history of the universe involve answering two sets of interlocking questions.”

  30. Program Performance Measures • Quantity: How much did we do? (#) • Quality: How well did we do it? (%)

  31. Program Performance Measures • Effort: How hard did we try? • Effect: Is anyone better off?

  32. Program Performance Measures: crossing Effort vs. Effect with How Much vs. How Well gives the four quadrants.

  33. Program Performance Measures • Effort (input/output): Quantity: How much service did we deliver? Quality: How well did we deliver it? • Effect: Quantity: How much change / effect did we produce? Quality: What quality of change / effect did we produce?

  34. Program Performance Measures • Effort: Quantity (#): How much did we do? Quality (%): How well did we do it? • Effect: Is anyone better off? (# and %)

  35. Education • How much did we do? Number of students • How well did we do it? Student-teacher ratio • Is anyone better off? Number of high school graduates (#); percent of high school graduates (%)

  36. Education • How much did we do? Number of students • How well did we do it? Student-teacher ratio • Is anyone better off? Number and percent of 9th graders who graduate on time and enter college or employment after graduation

  37. Health Practice • How much did we do? Number of patients treated • How well did we do it? Percent of patients treated in less than 1 hour • Is anyone better off? Incidence of preventable disease (#) and rate of preventable disease (%) in the practice

  38. General Motors • How much did we do? # of production hrs, # tons of steel, # of cars produced • How well did we do it? Employees per vehicle produced • Is anyone better off? # of cars sold, $ amount of profit, $ car value after 2 years; % market share, profit per share, % car value after 2 years (Source: USA Today 9/28/98)

  39. What Kind of PERFORMANCE MEASURE? • # of people served: Upper Left • % participants who got jobs: Lower Right • staff turnover rate: Upper Right • # participants who got jobs: Lower Left • % of children reading at grade level: Lower Right • cost per unit of service: Upper Right • # applications processed: Upper Left • % patients who fully recover: Lower Right
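The quadrant answers above follow mechanically from two yes/no judgments, which a tiny sketch can make explicit. This is a hypothetical helper, not part of RBA itself: the judgments (effort vs. effect, quantity vs. quality) remain human calls; the code only names the resulting quadrant.

```python
def quadrant(is_effect, is_quality):
    """Name the RBA quadrant for a performance measure.
    is_effect:  False = effort (how hard did we try?), True = effect (is anyone better off?)
    is_quality: False = quantity (#), True = quality (%)."""
    row = "Lower" if is_effect else "Upper"   # effect measures sit on the bottom row
    col = "Right" if is_quality else "Left"   # quality (%) measures sit on the right
    return f"{row} {col}"

# Checked against the quiz slide above:
print(quadrant(is_effect=False, is_quality=False))  # '# of people served' -> Upper Left
print(quadrant(is_effect=True,  is_quality=True))   # '% participants who got jobs' -> Lower Right
print(quadrant(is_effect=False, is_quality=True))   # 'staff turnover rate' -> Upper Right
print(quadrant(is_effect=True,  is_quality=False))  # '# participants who got jobs' -> Lower Left
```

The same two questions resolve every example on the slide, which is the point of the "two sets of interlocking questions" claim earlier in the deck.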

  40. RBA Categories Account for All Performance Measures (in the history of the universe). Common measurement vocabularies map onto the quadrants: • Effort side: efficiency, admin overhead, unit cost, staffing ratios, staff turnover, staff morale, access, waiting time, waiting lists, worker safety (TQM: input, process) • Effect side: client results or client outcomes, benefit value, cost/benefit ratio, return on investment, effectiveness, value-added, productivity (TQM: product, output, impact) • Customer satisfaction spans both (quality service delivery & customer benefit).

  41. The Matter of Control. Programs have the most control over How much did we do? and How well did we do it? (the effort quadrants) and the least control over Is anyone better off? (the effect quadrants), which is why PARTNERSHIPS matter.

  42. The Matter of Use • 1. The fundamental purpose is to improve performance, as a contribution to improving results. • 2. Avoid the performance-measurement-equals-punishment trap: acknowledge the experience as real; work to create a healthy organizational environment; start small; build bottom-up and top-down simultaneously.

  43. Program Performance Measures • Effort: Quantity (#): How much did we do? Quality (%): How well did we do it? • Effect: Is anyone better off? (# and %)

  44. Connecticut Program Template

  45. How Population & Performance Accountability FIT TOGETHER

  46. THE LINKAGE Between POPULATION and PERFORMANCE • Population Accountability (population results): Healthy Births: rate of low birth-weight babies; Stable Families: rate of child abuse and neglect; Children Succeeding in School: percent graduating from high school on time • Performance Accountability (Child Welfare Program): # foster children served; % with multiple placements; customer outcomes: # and % repeat abuse/neglect • The program takes appropriate responsibility for a contribution relationship to the population results, with alignment of measures.

  47. THE LINKAGE Between POPULATION and PERFORMANCE • Population Accountability (population results): Healthy Births: rate of low birth-weight babies; Children Ready for School: percent fully ready per K-entry assessment; Self-sufficient Families: percent of parents earning a living wage • Performance Accountability (Job Training Program): # persons receiving training; unit cost per person trained; customer outcomes: # and % who get living wage jobs • The program takes appropriate responsibility for a contribution relationship to the population results, with alignment of measures.
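The linkage slides above can be expressed as a small data model: program measures live in Performance Accountability, and a program claims only a contribution to, never sole ownership of, the population results it aligns with. This is an illustrative sketch; the class names and the figures are hypothetical, not Connecticut data structures.

```python
from dataclasses import dataclass, field

@dataclass
class PopulationResult:
    statement: str        # e.g. "Self-sufficient Families"
    indicators: list      # population-level measures

@dataclass
class Program:
    name: str
    how_much: list        # effort / quantity measures
    how_well: list        # effort / quality measures
    better_off: list      # customer outcomes (effect measures)
    contributes_to: list = field(default_factory=list)  # contribution, not ownership

families = PopulationResult(
    "Self-sufficient Families",
    ["% of parents earning a living wage"],
)
job_training = Program(
    name="Job Training Program",
    how_much=["# persons receiving training"],
    how_well=["unit cost per person trained"],
    better_off=["# who get living wage jobs", "% who get living wage jobs"],
    contributes_to=[families],
)

# Alignment of measures: the customer outcome echoes the population indicator.
print(job_training.contributes_to[0].statement)  # Self-sufficient Families
```

Keeping the link as a `contributes_to` list, rather than making the program "own" the indicator, mirrors the slide's distinction between appropriate responsibility and the broader contribution relationship.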

  48. What is Happening in the Legislature? Institutionalizing RBA in the legislature: • The Appropriations Committee has created a new RBA sub-committee. • The Early Childhood Cabinet must develop an accountability plan and make other recommendations for changes necessary to ensure coordination, service integration, and accountability. • The 2007 budget requires new and expanded programs to report to the General Assembly and OPM using an approved RBA framework, an effort led by the Office of Fiscal Analysis (OFA) and OPM. • The co-chairs and ranking members of the other sub-committees of Appropriations are identifying particular public policy issues (result statements) for RBA development. • All this month, the early childhood agencies that were part of the RBA Phase II initiative last session reported back to the legislature on the progress they have made.
