Performance Counts! County of Los Angeles Performance Measurement Framework
July 23, 2003
Christina Altmayer, President, Altmayer Consulting, Inc.
Performance Measurement • What is Performance Measurement? • The regular systematic measurement of work performed by an organization • What is Performance Counts!? • The County of Los Angeles’ common framework for collecting and reporting performance information
Why Performance Measurement? • If you don't know where you're going, any road will take you there. • If you can't see success, you can't reward it. • If you can't see success, you can't learn from it. • Show me the money! • If you don't measure results, you can't tell success from failure. • What gets measured gets done.
Making Valuable Investments in Performance Measurement How do we develop a meaningful process for evaluating program results, recognizing the fiscal and program realities of the County of Los Angeles?
Performance Counts! • Performance Counts! represents the County of Los Angeles’ framework to report: • What resulted from the County’s efforts (program results) • How well the service was provided (operational measures) • Benefits and improvements achieved for Los Angeles County (outcomes).
Why Performance Counts!? • Increase accountability to the public for the County’s accomplishments in a common, unified, accessible format • Enable strategic business decisions, planning and investments • Reinforce department planning and performance measurement efforts by promoting a culture which focuses on results
Challenge: Breaking the Terminology Trap • Need to overcome terminology and language as a barrier to Countywide progress • 4 Departments indicate use of the Balanced Scorecard • 6 Departments use RBDM (Results-Based Decision Making) • 9 Departments report use of other methodologies • The trap in practice: “How do you improve customer satisfaction?” “Depends, is that a result, an indicator, or a performance measure?”
Building a System to Plan and Measure Results… • Define Our Mission: Enrich Lives through Effective and Caring Service • Set Plans to Achieve: County Strategic Plan • Report Results: Performance Counts! • …for the County, Departments, and Managers
What We Have Done So Far
• County — Define Who We Are: County Mission • Set Plans to Achieve: County Strategic Plan (Four Program Goals and Four Operational Goals); annual resource planning through budget • Measure Results: Outcomes associated with goals measured through indicators
• Departments — Define Who We Are: Department Mission • Set Plans to Achieve: Department Strategic Plans identify goals and strategies to achieve their mission • Measure Results: Performance Counts! program results measured through program indicators and operational measures
• Management — Set Plans to Achieve: Develop MAPP goals to support achievement of County outcomes, program results, and operational measures • Measure Results: Evaluate achievement of MAPP goals
Performance Counts! Framework
County of Los Angeles Mission Statement: Enrich Lives through Effective and Caring Service
• What did we achieve? (the population/community of Los Angeles County) • Countywide Outcomes – measured by indicators of achievement of County Strategic Plan Goals • Program Results – measured by indicators of achievement of department programs that support department and County Strategic Plan Goals
• How was the service provided? (the County of Los Angeles as an organization) • Operational Measures – reflect the quantity, cost, efficiency and/or quality of services provided, as well as specific priorities of an organization or other ways to determine programs are meeting specific operational targets
Performance Counts! A Brief History • November 2002 • Guiding Coalition and department heads endorse County framework concept in Strategic Plan Update and approve pilot effort (Goal 3, Strategy 1) • December 2002 • Updated Strategic Plan approved by Board of Supervisors • Pilot Departments Selected and Attend Orientation Session (HR, ISD, PW, CSS) • January – April 2003 • Pilot Departments work to test and validate framework • May 2003 • Guiding Coalition endorses framework and roll-out plan • June 2003 • Executive Strategic Planning Conference to address Performance Counts! roll-out and implementation
Pilot Objectives • Develop/test/validate performance measurement framework within which different departmental methodologies may co-exist • Ensure consistency with the County Strategic Plan and related measurement efforts (e.g., the County Progress Report, the Proposed Budget, the Children and Families’ Budget, and Management Appraisal and Performance Plan) • Identify and address implementation challenges and assess impact of proposed changes • Develop recommendations for Countywide deployment and implementation
Pilot Lessons Learned • General acceptance and support for development of a common methodology and terminology, but a common methodology will require some degree of change in every department and agency to enable common County reporting • Broad range of experience with, and implementation of, performance measurement across departments • Provide flexibility within a common framework that reinforces internal management initiatives while still supporting common County reporting • Integrate with the budget cycle and determine how performance results will drive the priorities set in department strategic plans and resource allocation decisions (long-term issue)
What We Know From the Pilot Efforts • Performance measurement is a skill that will get easier over time as we become more comfortable with the language and terms • Performance information must be meaningful to the managers who will use the information or nothing will improve • Performance Counts! is not a comprehensive approach for performance measurement and management, but the experience may prompt managers to seek one
Performance Counts! Terminology • Program Result – A statement of the intended result from the services or interventions provided. The program result defines the change that occurs in the clients served as a result of the intervention. • Program Indicator – A measure, for which data are available, that quantifies achievement of the program result. The indicator tells us how many or what portion of the clients served underwent the change from the intervention. • Operational Measure – A measure of how well a program, agency or service is working in terms of: • Input/Output • Efficiency • Quality (including Customer Satisfaction)
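These three terms map naturally onto a small data model. The Python sketch below is purely illustrative — the framework prescribes terminology, not tooling, and every class and field name here is a hypothetical of ours — populated with the Adult Protective Services example that appears later in the deck:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProgramIndicator:
    """Quantifies achievement of a program result (external to the organization)."""
    description: str              # e.g. "% of clients whose risk was reduced"
    data_source: str              # must be an available, reliable data source
    value: Optional[float] = None

@dataclass
class OperationalMeasure:
    """Reflects how well the service is delivered (internal to the organization)."""
    description: str              # e.g. "Cost per client served"
    category: str                 # "input/output", "efficiency", or "quality"
    value: Optional[float] = None

@dataclass
class ProgramResult:
    """The intended change in the specific clients served by an intervention."""
    program_name: str
    client: str                   # the specific client served
    intended_change: str          # the change that should take place
    indicators: List[ProgramIndicator] = field(default_factory=list)
    measures: List[OperationalMeasure] = field(default_factory=list)

# Example drawn from the Adult Protective Services slide:
aps = ProgramResult(
    program_name="Adult Protective Services",
    client="Elder and dependent adults with physical and/or mental limitations",
    intended_change="Reduced risk of abuse, neglect, and exploitation",
    indicators=[ProgramIndicator(
        "Percentage of elder and dependent adults whose risk was reduced",
        data_source="CSS independent ranking")],
    measures=[OperationalMeasure("Number of clients served", category="input/output")],
)
```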
Performance Counts! Answers Two Fundamental Questions • Are we doing the right stuff? • Program Results and Indicators • Is the intervention appropriate for the solution we want? • Example: Immunizations Prior to School Entry – Does it reduce incidence of communicable disease? • Are we doing it in the right way? • Operational Measures • What was the cost to provide that service? Did we find the right provider? • Example: Cost per Immunization; Number of immunizations administered at County clinics
Results and Indicators vs. Operational Measures • By evaluating program indicators, we will know if we need to modify the intervention or make a policy change to achieve the result we want • Program Indicators are EXTERNAL to the organization • % of Clients Employed Six Months after Receiving Services • Operational measures will help us understand where to focus improvement efforts • Operational Measures are INTERNAL to the organization • Cost per client served • Both are important, but for different reasons
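To make the contrast concrete, here is a small arithmetic sketch (Python; the caseload figures are invented for illustration) that computes one external program indicator and one internal operational measure from the same program data:

```python
# Hypothetical caseload data for one reporting period (illustrative numbers only).
clients_served = 1_200
clients_employed_after_6_months = 780   # external outcome, tracked by follow-up
program_cost_dollars = 3_600_000        # internal figure: total program spending

# Program Indicator (EXTERNAL): did the intended change occur in the clients served?
pct_employed = 100 * clients_employed_after_6_months / clients_served
print(f"% of clients employed six months after receiving services: {pct_employed:.1f}%")

# Operational Measure (INTERNAL): how efficiently was the service delivered?
cost_per_client = program_cost_dollars / clients_served
print(f"Cost per client served: ${cost_per_client:,.0f}")
```

A falling indicator alongside a steady operational measure suggests modifying the intervention or policy; a healthy indicator alongside a rising cost per client points to internal improvement efforts instead.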
Guidance and Criteria Performance Information must be clear, concise, understandable, and logical • Program Result Statements • Identify the specific client served • Stipulate the change to take place • Program Indicators • Intuitive based on Program Result • Rely on available and reliable data sources • Operational Measures • Measure an essential aspect of performance • Compare to a point of reference
Measuring the Right Stuff • Four important factors to consider in selecting data • Reliability – Are the suggested data sources reliable for current and future reporting? • Credibility – Is the data source credible and recognized as a basis for program evaluation and reporting? • Controllability – Will this data help me as a manager to improve the performance of my program and services? • Comparability – What will this data be compared to as a means of measuring progress?
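As an illustration only — the slide poses these factors as questions, not as a scoring formula, so the checklist structure below is our assumption — a department could screen a candidate data source against the four factors like this:

```python
# Hypothetical screening of one candidate data source against the four factors.
# The yes/no answers are assumptions chosen for illustration.
candidate = "Annual Customer Survey ratings"
checklist = {
    "Reliability":     True,   # reliable for current and future reporting?
    "Credibility":     True,   # recognized as a basis for evaluation and reporting?
    "Controllability": True,   # can a manager act on it to improve the program?
    "Comparability":   False,  # is there a baseline or benchmark to compare against?
}

failed = [factor for factor, answer in checklist.items() if not answer]
if failed:
    print(f"{candidate}: resolve {', '.join(failed)} before adopting")
else:
    print(f"{candidate}: meets all four selection factors")
```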
Community and Senior Services • Program Name: Adult Protective Services • Program Result: • Elder (aged 65+) and dependent adults (aged 18-64) with physical and/or mental limitations restricting the ability to carry on normal activities have reduced risk for abuse, neglect (including self-neglect), and exploitation • Program Indicator: • Percentage of elder and dependent adults whose risk was reduced based on CSS independent ranking. • Operational Measure: • Number of clients served
Public Works • Program Name: Unincorporated County Roads • Program Result: • Roadways and appurtenances within the unincorporated County area are safe, smooth, well maintained, and can be traveled in a reasonable time • Program Indicators: • Accidents per million vehicle miles traveled • Pavement Ride Quality – % of pavement area with smooth ride • Operational Measures: • Maintenance cost per lane mile
Internal Services Department • Program Name: Acquisition Services • Program Result: • County departments are provided with procured goods and contracted services in a cost-efficient and timely manner and in accordance with the County Charter • Program Indicator: • Average rating on Annual Customer Survey for timeliness of Acquisition Services delivery • Operational Measures: • Percent of on-time service delivery • Percent of benchmarked expenditures within 10% of market
Human Resources • Program Name: Los Angeles County Training Academy • Program Result: • Employees are prepared to meet current and future operational needs of the County • Program Indicators: • Percent of program participants who report using knowledge and skills acquired in the program on their jobs • Percent of academy programs for which statistically significant knowledge gain is demonstrated • Operational Measures: • Number of employees trained • Number of training hours delivered • Number of training programs conducted
2004/05 Budget Expectations • Departments will begin efforts to cross-walk current performance information or develop new performance information consistent with Performance Counts! in Summer 2003 • Departments will complete initial efforts prior to 2004/05 budget process • Departments will have updated performance information consistent with Performance Counts! framework for 2004/05 Proposed Budget
Roll-Out Strategy • Build on the success and lessons learned from the Pilot effort • Limit “classroom” training to an orientation session • Provide hands-on training through group learning while doing • Bring uncommon functions together around a common purpose • Organize County departments into six cross-functional groups • Meet together to critique and evaluate proposed measures for programs • Participate in collaborative discussion sessions, followed up with individual department coaching sessions led by a trained consultant • Support each department group with the following: • Consultant coach • Pilot department mentor(s) • CAO Budget Analyst • Internal department resources