Benchmarking for improvement
Presentation Transcript

  1. Benchmarking for improvement
  Liz Hart, Director of Information Services, Staffordshire University

  2. Benchmarking for improvement
  • Brief overview of purpose and the current situation in the UK
  • Benchmarking consortia
  • Examples of benchmarking in practice

  3. Benchmarking in libraries
  • Growing importance of quality management
  • Accountability
  • The three "E"s:
    • Effectiveness
    • Efficiency
    • Economy

  4. Purpose?
  "Measurement and benchmarking are not separate sciences or unique themes of quality management, but rather strategic approaches to getting the best out of people, processes, plant and programmes"
  John Oakland (1993)

  5. UK experience
  • 40% of the HE sector involved in some way
  • Variable involvement: structured to "unstructured"
  • SCONUL pilot projects
  • Manual of methods

  6. Issues from UK experience
  • Time and effort balanced by willingness to engage
  • Ethics, confidentiality and honesty
  • Staff / user involvement essential
  • Process mapping / Activity Based Costing
  • Comparative measures are difficult!
  • Good project management essential

  7. Issues from UK experience
  • Embedding benchmarking in an institutional or service quality framework
  • Methodologies from the literature do work in practice

  8. Methodology
  • Oakland's 15 stages: Plan and Action
  • Plan: 7 stages
    • Select processes
    • Identify appropriate benchmarks
    • Collect environmental data
    • Identify the team
    • Decide on data collection methodology

  9. Methodology
  • Project planning and timescales
  • Useful example: Advice Desk project
    • Critical success factor: satisfied customers
    • A "delightful" process: help; situation; environment
    • Balanced against the "right" information
  • 7th stage in Plan: implement data collection

  10. Methodology
  • Oakland's second group, Action:
    • Compare data
    • Organise and catalogue it for retrieval
    • Understand enabling processes
    • Set new objectives and standards
    • Develop action plans for the new standards
    • Implement and embed changes

  11. Methodology
  • Monitor and evaluate outcomes and improvements
  • Review your measures to ensure usability
  • Sounds obvious, but it is a discipline

  12. Benchmarking consortia and other models
  • Motivation?
    • Individual
    • Step change in the organisation
    • Conceptual review
    • Developmental and experimental
  • Commonwealth University Management Benchmarking Club
  • Consortia

  13. Benchmarking agreements
  • Framework for operation
  • Define clarity of purpose:
    • To produce beneficial cross-university analysis of processes, statistical information and service outcomes
    • To produce comparative data
  • Voluntary grouping
  • Equal partnership

  14. Benchmarking agreements
  • Executive Group
  • Operations Group
    • To ensure consistency, comparative outcomes and methodological enhancements
  • Sub-groups led by one partner institution
  • Clear financial and resource basis
    • Ensures work is evenly allocated and shared

  15. Benchmarking agreements
  • "Get out" clauses
    • Non-participation due to internal developments
    • 3 months' notice
  • Confidentiality
    • Critical and key to success
  • Openly share finance, staffing and process information and data

  16. Benchmarking agreements
  • Why is this so important?
    • An open environment facilitates working relationships
    • Mutual support in the process of assessment
    • Shared objectives
    • And finally, it is politically advantageous

  17. Learning from experience
  • Good project management
  • External marketing
    • Users and non-users
    • Marketing largely passive to date
    • Changing in the 2002/2003 cycle

  18. Tools and techniques
  • Mystery shopper
    • Used extensively in the commercial sector
    • Dependent on a robust and open relationship between partner institutions
    • Essential to agree and clarify the criteria for measurement
    • Sensitivity regarding outcomes

  19. Tools and techniques - mystery shopper
  • Mystery shopper for website access evaluation
  • Set questions assessing 3 variables:
    • Ease of access
    • Success in finding information
    • Time taken
  • 9 out of 10 questions the same
  • Shoppers were from other institutions

  20. Tools and techniques - mystery shopper
  • Shoppers accessed the sites in a predetermined order
  • Outcomes?
    • Time: much longer than anticipated
    • 1 site much more successful than the other 3
    • Basic navigation OK
    • Access via username and password reduced success by a minimum of 25%
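
The scoring behind results like these can be sketched in a few lines. This is a hypothetical illustration only: the site names, answers and timings below are invented, and the slides specify just the three variables (ease of access, success in finding information, time taken).

```python
# Hypothetical sketch: aggregating mystery-shopper results per site.
# Each record is (question_answered_successfully, seconds_taken);
# all data values here are invented for illustration.
from statistics import mean

results = {
    "Site A": [(True, 40), (True, 55), (False, 120)],
    "Site B": [(True, 35), (True, 50), (True, 60)],
}

for site, answers in results.items():
    success_rate = mean(1 if ok else 0 for ok, _ in answers)
    avg_time = mean(t for _, t in answers)
    print(f"{site}: success {success_rate:.0%}, avg time {avg_time:.0f}s")
```

Comparing the per-site success rates and average times across partners is what surfaces findings such as "1 site much more successful than the other 3".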

  21. Tools and techniques - exit interviews
  • Undertaken outside the library environment, on "neutral" territory
  • Example from the Advice Desk project
  • Difference between a questionnaire approach and a facilitated approach
    • Facilitators gained more comprehensive and open responses
    • Costly, but more realistic views obtained

  22. Tools and techniques - behavioural study
  • Unobtrusive observation used in the Advice Desk project
  • Rejected as a method following a trial
    • Sensitivity and influence on staff behaviour
  • Method valuable for "people flows"
    • Ability to predict demand and the physical use of space/s

  23. Tools and techniques - measuring process times
  • Shelving project
    • Length of time to re-shelve
    • Tidiness and accuracy of items on shelves
    • Environmental and costing data
  • Changed timescales between 2000 and 2001 based on experience gained
  • Tracking slips
  • Improved shelving times by up to 50%
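
A tracking-slip measurement of this kind reduces to simple timestamp arithmetic. The sketch below is hypothetical: the timestamps and the year-2000 baseline figure are invented, and it assumes a slip records when an item reaches the return shelf and when it is re-shelved.

```python
# Hypothetical sketch: tracking-slip timings for the shelving project.
# Each slip: (time item hit the return shelf, time it was re-shelved).
# All timestamps and the baseline value are invented for illustration.
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

slips = [
    ("2001-05-14 09:00", "2001-05-14 11:30"),
    ("2001-05-14 09:15", "2001-05-14 10:05"),
]

def hours_to_shelve(returned: str, shelved: str) -> float:
    delta = datetime.strptime(shelved, FMT) - datetime.strptime(returned, FMT)
    return delta.total_seconds() / 3600

avg_2001 = sum(hours_to_shelve(r, s) for r, s in slips) / len(slips)

# Improvement against a hypothetical year-2000 baseline average (hours):
baseline_2000 = 3.2
improvement = (baseline_2000 - avg_2001) / baseline_2000
print(f"2001 average: {avg_2001:.2f}h, improvement: {improvement:.0%}")
```

Running the same calculation over each year's slips is how a claim like "improved shelving times by up to 50%" can be evidenced rather than asserted.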

  24. Tools and techniques - measuring process times
  • Improvement via practical action:
    • Targeting new staff appointments
    • Moving stock
    • Creating a core team of shelving staff
    • More effective use of return shelves
    • Better trained / longer-serving staff undertaking initial sorting for floors

  25. Tools and techniques - measuring process times
  • Improvement via staff motivation:
    • Greater motivation / self-starting
    • Team planning
    • Willingness to reassign duties - flexibility
    • Local analysis to pinpoint problems
  • Shelving is (for now) the base of the pillar

  26. Tools and techniques - measuring process times
  • Shelf tidiness
    • Counting samples across classmark ranges
    • Carried out over one block week (2000), then 1 day per week over 5 weeks (2001)
  • Outcomes?
    • 2 sites with fast shelving had untidy shelves
    • 1 site with the quickest shelving had tidy shelves!
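
The sampling described above amounts to a simple proportion estimate. This sketch is hypothetical: the classmark ranges and counts are invented, and it assumes each sample records how many items were checked and how many were out of sequence or misplaced.

```python
# Hypothetical sketch: estimating shelf tidiness from counts sampled
# across classmark ranges, as in the 2000/2001 exercises. The ranges
# and counts below are invented for illustration.
samples = {
    "000-099": (120, 6),   # (items checked, items out of place)
    "300-399": (200, 22),
    "600-699": (150, 9),
}

checked = sum(n for n, _ in samples.values())
misplaced = sum(m for _, m in samples.values())
tidiness = 1 - misplaced / checked
print(f"Checked {checked} items; tidiness {tidiness:.1%}")
```

Comparing this tidiness figure against each site's shelving speed is what exposed the counter-intuitive result that the fast sites were not always the tidy ones.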

  27. Tools and techniques - measuring process times
  • Outcomes?
    • Will repeat, as results were not entirely clear
    • The most significant factor seems to be the priority given to shelving compared with other duties
    • Staff responses emphasised the importance of speed and accuracy, particularly in relation to other processes (reservations, for example)

  28. Outcomes and benefits?
  • Provision of shared management information
  • Establishing best practice
  • Identifying and implementing positive change
  • Evaluating the opinions, views and needs of customers
  • Beginning to identify trends

  29. Outcomes and benefits?
  • Networking - "invaluable"
    • Exchange of ideas and views
  • Staff development
  • Staff ownership and flexibility
  • Perspectives on roles:
    • Challenging the established or expected outcome or view

  30. Outcomes and benefits?
  • Institutional recognition
  • The 4 Universities Benchmarking Consortia has proved benchmarking is:
    • Achievable
    • Repeatable
    • Valuable
  • In future it will be part of our routine...