This text explores the significance of rankings and benchmarks for universities seeking to enhance their quality management processes. It delves into the comparison between measurable rankings and actionable benchmarks, providing insights on how institutions can leverage this information to improve their overall performance. The discussion emphasizes the importance of using multidimensional metrics and linking cause-and-effect relationships to drive institutional excellence. It also highlights the role of benchmarking in determining key inputs and outputs to optimize efficiency and effectiveness. Furthermore, it examines the European University Association's Institutional Evaluation Program and the UNICA Observatory on Administrative Excellence as platforms for benchmarking and promoting best practices in higher education management.
Which way to quality? Rankings vs Benchmarks. Stavros A. Zenios, Professor of Management Science, Rector, University of Cyprus, UNICA President. UNICA Rectors’ Seminar, Dubrovnik, July 2009
OUTLINE • About quality • Rankings vs Benchmarks • UNICA Administrative Excellence Observatory
On Rankings • They tell you where you are • They don’t tell you how you got there • They don’t tell you how to move up • They describe a complex situation in one number • Why use them? • Governments like them • Students like them • Funding agencies like them • Halo effect • Rectors should use them as “the way to quality” map
On Quality • Multidimensional • Link cause and effect • Observe “non-actionable” variables • Rankings • Number of publications, citations, faculty with Nobel Prizes • Measure “actionable” variables • Salaries, autonomy, hiring policies, facilities • Benchmark with similar institutions • Identify best practices • Innovation makes perfect
Linking Shanghai rankings with actionable items • Budget per student +0.61 • University governance • Public status -0.35 • Budget autonomy +0.16 • Building autonomy -0.01 • Hiring autonomy +0.20 • Wage setting autonomy +0.27 • Hiring own PhDs -0.08
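Correlations like those above can be computed directly from institutional data. The sketch below shows the mechanics with NumPy; the budget and score figures are made-up illustrations, not the study's data.

```python
# Sketch: Pearson correlation between an actionable variable
# (budget per student) and a ranking score.
# All figures below are hypothetical, for illustration only.
import numpy as np

budget_per_student = np.array([8, 12, 15, 20, 25], dtype=float)   # kEUR, hypothetical
shanghai_score     = np.array([20, 28, 30, 41, 45], dtype=float)  # hypothetical

# np.corrcoef returns the 2x2 correlation matrix; the off-diagonal
# entry is the Pearson correlation coefficient.
r = np.corrcoef(budget_per_student, shanghai_score)[0, 1]
```

A positive coefficient, as with budget per student (+0.61) above, suggests the actionable variable moves with the ranking; it does not by itself establish cause and effect.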
Benchmarking • Determine key inputs (actionable) • Decide on key outputs (non-actionable) • Use Linear Programming to fit a multidimensional envelope
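Fitting a multidimensional envelope with linear programming is the idea behind Data Envelopment Analysis (DEA). Below is a minimal sketch of the input-oriented CCR model using `scipy.optimize.linprog`; the universities and figures are hypothetical, chosen only to illustrate the computation.

```python
# Minimal sketch of input-oriented CCR Data Envelopment Analysis (DEA):
# for each unit, find the smallest input contraction theta such that a
# convex-cone combination of peers matches its outputs.
# All data below are hypothetical illustrations.
import numpy as np
from scipy.optimize import linprog

# inputs (actionable): faculty count, budget (M EUR) -- one row per university
X = np.array([[100, 50],
              [120, 70],
              [ 80, 40],
              [150, 90]], dtype=float)
# outputs (non-actionable): publications, graduates
Y = np.array([[300,  900],
              [330,  950],
              [280,  700],
              [400, 1200]], dtype=float)

def ccr_efficiency(X, Y, o):
    """Efficiency of unit o: min theta s.t. the peer combination uses
    at most theta * inputs_o and produces at least outputs_o."""
    n, m = X.shape          # units, inputs
    s = Y.shape[1]          # outputs
    c = np.r_[1.0, np.zeros(n)]            # minimise theta
    # input rows:  sum_j lam_j x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o], X.T]
    b_in = np.zeros(m)
    # output rows: -sum_j lam_j y_rj <= -y_ro
    A_out = np.c_[np.zeros(s), -Y.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

scores = [ccr_efficiency(X, Y, o) for o in range(len(X))]
```

Units on the envelope score 1; units inside it score below 1, and the optimal peer weights identify which institutions they should benchmark against.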
Benchmarking: efficiency or effectiveness • Efficiency: doing things right • Number of students per faculty • Publications per faculty • Average years to degree completion • Effectiveness: doing the right things • Quality of education, ability to learn • Citations per faculty • Contribution to knowledge • Multi-dimensional
Holistic • EUA Institutional Evaluation Program • Bird’s eye view • Leadership • Policy and strategy • Stakeholders’ view • Capacity for change • Departmental Peer review • Customers’ view • Internal customers’ view • Stakeholders’ view • Critical indicators • Procedures? Personnel Management? Resources?
The UNICA Observatory on Administrative Excellence • Procedures • Personnel management • Facilities management • Decentralization • Resources
Performance benchmarking UCY - AUEB • Determine key performance indicators • Collect data from European Universities • Benchmark UCY vis-à-vis other European Universities
Performance benchmarking UCY - AUEB • Survey of 506 Universities • HUMANE 168 • AUEB 103 • EUA (rest) 134 • US (facilities planning) 17 • US (top Universities) 47 • UNICA 37 • 52 respondents
Online questionnaire • Standardized • Colour coded • Definitions, precision • Technical specification • Control questions
The UNICA Observatory on Administrative Excellence • Online system for benchmarking • Calibrate implicit relations • Address the PROCESSES and RESOURCES aspects of quality management • Supplement EUA Institutional Evaluation Programs • UNICA Policy: Learn from each other