
Managing for Results in America’s Great City Schools: A Report of the Performance Measurement & Benchmarking Project


Presentation Transcript


  1. Managing for Results in America’s Great City Schools: A Report of the Performance Measurement & Benchmarking Project
  Council of the Great City Schools, Chief Operating Officers Conference, April 2007

  2. Project Purpose
  • Establish a common set of key performance measures for the Operations, Finance, Human Resources, and Information Technology functions,
  • Benchmark district performance in these operational areas among CGCS member districts across the United States, and
  • Inventory effective management practices used by top-performing districts, so other member districts can adopt them where applicable and viable.

  3. Benchmarking Performance
  • The goal of benchmarking is to:
    • Examine how districts compare to their peers,
    • Learn what effective practices top-performing districts are using, and
    • Improve performance by studying those districts’ circumstances and approaches.
  • The four major steps in the benchmarking process:
    • Select and compile data to compare districts,
    • Identify the factors in which performance leads and lags,
    • Study the gaps between districts, and
    • Provide information districts can use to develop recommendations and propose actions to close the gaps (a minimal gap-analysis sketch follows this slide).
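
To make the gap-analysis step concrete, here is a minimal sketch in Python. It is not the project's actual methodology; it simply compares each district's value for a single measure against the peer median and flags whether the district leads or lags. The district names and figures are hypothetical.

```python
from statistics import median

def gap_analysis(district_values: dict[str, float], higher_is_better: bool = True) -> dict:
    """Compare each district's value for one measure against the peer median."""
    benchmark = median(district_values.values())
    report = {}
    for district, value in district_values.items():
        gap = value - benchmark
        leads = gap >= 0 if higher_is_better else gap <= 0
        report[district] = {
            "value": value,
            "benchmark": benchmark,
            "gap": round(gap, 2),
            "status": "leads" if leads else "lags",
        }
    return report

# Hypothetical on-time-performance percentages for three districts
print(gap_analysis({"District A": 92.5, "District B": 87.1, "District C": 95.0}))
```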

  4. Value for Operational Leadership
  • A national perspective on how urban schools are managing their resources within administrative functions.
  • Information about the management practices, disciplines, culture, accountability, and other factors that distinguish best-practice districts.
  • Executives can quickly spot gaps in cost, quality, and timeliness compared to their peers.
  • Knowledge development for general managers across a broad array of operational responsibilities.

  5. Value for Instructional Leadership
  • Ensure effective service of the highest quality for students and staff.
  • Maximize operational efficiency so financial resources can be prioritized toward instruction.
  • Reduce the operational burden on school sites.

  6. Value for Boards & Superintendents
  • Accountability: moving to outcome-based discussions and away from process-based discussions
  • Clarity: prioritizing efforts that drive the core mission, based on data-driven decisions
  • Progress: assessing where we are relative to district-wide goals and objectives
  • Context: benchmarking with other districts to better understand performance and the practices leading to improvements
  • Transparency: reporting outcomes that enhance public trust

  7. Project Outcomes
  • The benchmark data will provide districts with a focused view of where they perform well and where they may have an opportunity for improvement.
  • The “gap analysis” will provide districts with an inventory of effective practices they may be able to adopt to enhance their performance.

  8. Overview and Status Report

  9. Project Stages
  • Stage I: Operations (launched April 2006)
  • Stage II: Finance (launched November 2006)
  • Stage III: Human Resources (launched March 2007)
  • Stage IV: Information Technology (launches June 2007)

  10. Project Phases
  • Identify functional areas for study
  • Establish technical teams with expertise in each function
  • Assess the components of each functional area to determine what needs to be commonly measured
  • Establish a clearinghouse of performance measures
  • Select key performance measures for study
  • Establish standardized definitions and methodology
  • Survey CGCS member districts
  • Collect, analyze, and report data
  • Conduct a gap analysis between districts
  • Inventory the effective management practices that produce the strongest performance

  11. Status Report: Operations

  12. Operations: Pilot
  • Sample study of 20 school districts
  • Areas: Food Services & Transportation
  • Lessons learned from the sample benchmarking:
    • Some districts are not measuring critical areas of performance
    • Some realized they were not as good as they thought
    • Some realized they are better than they thought
    • Measures are not standardized in our industry
    • Data can be unreliable

  13. Operations: Pilot (April Chief Business & Operating Officers Conference)
  Sample KPI: Yellow Bus On-Time Performance
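
As a rough illustration of how an on-time KPI could be calculated, the sketch below computes the percentage of bus runs arriving within a fixed window of the scheduled time. The ten-minute window, data layout, and figures are assumptions for illustration, not the project's standardized definition.

```python
def on_time_performance(delays_minutes: list[float], window_minutes: float = 10.0) -> float:
    """Percent of bus runs arriving within `window_minutes` of the scheduled time."""
    if not delays_minutes:
        return 0.0
    on_time = sum(1 for d in delays_minutes if d <= window_minutes)
    return 100.0 * on_time / len(delays_minutes)

# Hypothetical delays (in minutes) for a sample of morning runs
print(f"{on_time_performance([0, 3, 12, 7, 15, 2]):.1f}% on time")
```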

  14. Operations: Functional Areas
  • Five operational areas were selected for inclusion by COOs from around the country:
    • Transportation
    • Food Services
    • Maintenance & Operations (M&O)
    • Procurement/Supply Chain
    • Safety & Security
  • Technical team leads are from Buffalo, Indianapolis, Jackson, Miami, and St. Paul, with 34 other technical team members from 22 districts.

  15. Operations: Technical Team Leads
  • Transportation: John Fahey, Buffalo Public Schools
  • Food Services: Jean Ronnei, St. Paul Public Schools
  • Maintenance & Operations: Steve Young, Indianapolis Public Schools
  • Procurement/Supply Chain: Joseph Gomez, Miami-Dade County Public Schools
  • Safety & Security: Michael Thomas, Jackson Public Schools
  • Six Sigma Methodologies: Deb Ware, Fort Worth Independent School District
  • Project Data Analysis: Katherine Blasik, Broward County Public Schools

  16. Operations: Performance Measures
  • A clearinghouse of 208 potential measures was established
  • A total of 135 were selected for inclusion
  • 50 were identified for initial reporting
  • Six Sigma methods (Dallas ISD) were used to ensure valid and reliable data, documenting for each measure (a brief sketch follows this slide):
    • The definition of the measure
    • The calculation of the measure
    • The source of the data
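
A minimal sketch of what a standardized measure record might capture is shown below. The structure, field names, and the cost-per-meal example are hypothetical; they simply illustrate the three documented elements (definition, calculation, data source) attached to each measure.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class PerformanceMeasure:
    name: str
    definition: str         # what the measure means
    calculation: Callable   # how it is computed from the raw data points
    data_source: str        # where the underlying data come from

# Hypothetical example of one food-services measure
cost_per_meal = PerformanceMeasure(
    name="Food Services: Cost per Meal",
    definition="Total food services expenditure divided by total meals served",
    calculation=lambda expenditure, meals_served: expenditure / meals_served,
    data_source="District food services financial and meal-count records",
)
print(cost_per_meal.calculation(12_500_000, 5_000_000))  # 2.5 dollars per meal
```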

  17. Operations: Survey Development
  • CGCS partnered with K12 Insight to establish an online survey tool for data collection
  • Scope:
    • The survey included 65 CGCS member districts
    • A total of 311 questions were asked
    • A total of 624 data points were collected
  • Only raw data were collected, to ensure the integrity of the calculations
  • Districts did not perform their own calculations (a brief sketch of this design follows the slide)
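
The sketch below illustrates that design choice, assuming invented field names and figures: districts report only the underlying data points, and one shared formula is applied centrally to every district so the resulting measure stays comparable.

```python
# Districts submit only raw data points; the ratio is computed centrally
# with one shared formula rather than by each district independently.
raw_submissions = [
    {"district": "District A", "custodial_cost": 18_000_000, "square_feet": 12_000_000},
    {"district": "District B", "custodial_cost": 9_500_000, "square_feet": 7_600_000},
]

for row in raw_submissions:
    cost_per_sq_ft = row["custodial_cost"] / row["square_feet"]  # same formula for every district
    print(f"{row['district']}: ${cost_per_sq_ft:.2f} custodial cost per square foot")
```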

  18. Status Report: Finance
  • Project management: Fred Schmitt, Chief Financial Officer, Norfolk, Va.
  • The Finance stage was launched at the Chief Financial Officers Conference in November 2006
  • Functional areas for review will include Financial Management and General Accounting
  • Technical teams were identified, led by chief financial officers and staffed by directors from the selected functional areas
  • Fred Schmitt will provide a detailed update Thursday

  19. Status Report: Human Resources
  • Project management: Dan Cochran, Chief HR Officer (Ret.), Broward County, Fla.
  • The HR stage was launched at the Human Resource & Development Officers Conference, March 1-3, 2007
  • Functional areas for review will include Employee Relations, Human Resources Operations, and Recruitment & Staffing
  • Technical team leads were identified, and HR officers with expertise in each of the areas above have volunteered for the technical teams
  • HR officers have mapped initial concept areas for measurement, established measures for many of them, and have begun to define measurement methodologies

  20. Status Report: Information Technology
  • Project management: Mike Casey, Chief Technology Officer, San Diego, Calif.
  • The IT stage will be launched at the Management Information Symposium in Albuquerque in June 2007
  • Technical support is being formulated at this time; technical teams and functional review areas will be identified in June

  21. Next Steps
  • Refine the program approach
  • Stabilize the current measures
  • Differentiate Key Performance Indicators (“KPIs”) from subordinate measures
  • Examine other functional areas
  • Review critical dates for gathering current data
  • Establish a senior advisory team and add new members to the technical teams
  • Ownership by member districts, supported by the Council of the Great City Schools

  22. Contact Information
  • Robert Carlson, Director of Management Services, Council of the Great City Schools, rcarlson@cgcs.org
  • Michael Eugene, Business Manager, Los Angeles Unified School District, Michael.eugene@lausd.net
  • Heidi Hrowal, Principal Administrative Analyst, Los Angeles Unified School District, Heidi.Hrowal@lausd.net
