Contents: • A brief definition of benchmarking • How benchmarking is applied in the commercial world • How benchmarking can be applied to the public sector • The main stages in a benchmarking project • Designing a simple benchmarking exercise for your own library.
What is “benchmarking”? Some basic definitions: Benchmark: a reference standard against which other things can be compared. Benchmarking: the continuous process of identifying, understanding and adapting practices from other organisations to improve the performance of your own organisation. Benchmarking involves continuously assessing the quality of goods or services against standards of best practice. It is often included as part of a commitment to total quality management (TQM). Like many other quality tools, benchmarking was devised by Japanese industrialists in the 1950s to identify and copy “best practice” in Western manufacturing.
Commercial uses of benchmarking Commercial organisations use benchmarking to compare standards of performance at various levels: • Internal: comparing how different departments within the same organisation (e.g. finance and marketing) perform. • External: comparing performance with other organisations, either those in the same market or those fulfilling similar functions in a different market. It is notoriously difficult to obtain performance figures for your competitors. • Global: may involve comparison with organisations with completely different functions, but again the object remains to identify and assess best practice. Such activities are essentially about gaining a competitive edge, in terms of cost-cutting, value for money, supply chains and building market share.
Examples of benchmarking in the public sector: • Since the 1980s, benchmarking has become more common in the public sector, with the rise of: • Competitive tendering • Emphasis on measuring performance • Pressure on funding • Management models moving closer to the commercial sector • Focus on customer satisfaction (i.e. outputs, rather than processes). • Public services (e.g. libraries and leisure centres) can use benchmarking to explore the quality of the services they offer. Performance indicators must reflect the viewpoint of their customers. • Some see benchmarking as an aid to controlling cost while preserving the quality of service provision. This is achieved by establishing closer links between resources and outputs.
Quality in libraries • Many information professionals view techniques that emphasise profit and performance as “alien” and against the traditions of public service. • Quality improvement programmes were often seen as paper exercises, which librarians had to complete to show willingness to change, although in practice they often made little difference. • In an era of citizens’ charters and charter marks for public libraries and with academic libraries obliged to justify their funding, both have had little option but to engage more fully in quality programmes.
Levels of benchmarks Benchmarks can be set at differing levels, such as: • Level 1: provide a “library.” This could be anything which meets the dictionary definition (perhaps just a collection of books with some public access). • Level 2: provide an appropriate library: this suggests at least some thought about the function of the library, e.g. a healthcare library should include key journals and access to the Medline health database. • Level 3: provide a quality library to meet users’ needs: this may mean tailoring provision to preferences such as opening hours, training in using databases, and stocking the journals that most users wish to consult.
Four key stages in benchmarking • Planning for benchmarking • Setting targets for the quality of various services • Measuring: audit or benchmark the quality of each service using a checklist (this may include a cost-benefit analysis) • Action plan: compare the benchmarks with the targets and produce an action plan to bridge any deficits.
We shall illustrate these four steps with reference to the Australian National Resource Sharing Working Group Interlibrary Loan and Document Delivery (ILL/DD) Benchmarking Study.
Step 1: Planning for Benchmarking You may choose to use a panel rather than an individual, to introduce an element of objectivity. The panel should carry sufficient gravitas to command respect, but not have so many members that it becomes unwieldy. You should consult with all the main parties involved, including those who will undergo benchmarking and the customers of the service. You should define the topics for benchmarking and the processes to be used. This may involve clarifying who your customers are and which outputs you wish to benchmark.
Step 1 - Planning for Benchmarking • The key aims of the ILL/DD study were to: • Identify characteristics of high performing ILL/DD operations, • Raise awareness and help to change ILL/DD practices, and • Assist individual libraries to benchmark their operations against a standard set of data.
Step 2: Set targets for the quality of specific services You can set targets to assess quality across many areas: • Design and quality of facilities • Staffing (levels, expertise) • Health and safety (probably organisation-wide rather than departmental, but may include issues such as cleanliness) • Accessibility (opening hours, disabled access) • Use of services • Marketing Begin with standards for the key processes that the organisation performs and that are critical success factors in running it effectively. Identify appropriate targets from comparisons with similar organisations, or from accepted standards of “best practice” in the wider literature.
Step 2 - Set targets • Turnaround time, fill rate and unit cost were the main performance measures for interlibrary loan requests; fill rate and unit cost were the measures for document supply. • Data on the turnaround time for supplying an item were also collected. Each measure was clearly defined.
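Two of the study's measures, fill rate and unit cost, are simple ratios. A minimal sketch of how they might be computed (the function names and all figures below are illustrative, not taken from the study):

```python
# Illustrative definitions of two ILL/DD performance measures.
# Figures are made up for demonstration.

def fill_rate(requests_filled, requests_completed):
    """Proportion of completed requests that were successfully supplied."""
    return requests_filled / requests_completed

def unit_cost(total_operating_cost, requests_processed):
    """Average cost of processing one request."""
    return total_operating_cost / requests_processed

print(fill_rate(850, 1000))       # 0.85
print(unit_cost(16_500.0, 1000))  # 16.5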
Sample definition for Turnaround time • Requesting: the number of calendar days (not working days) between the patron’s initiation of an ILL/DD request and the library’s notification to the patron of the final outcome (receipt of the item or a negative response). • Weekends are included. All completed requests (i.e. filled and unfilled requests, but not those still ‘in process’) are used to calculate the average turnaround time for the survey period.
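The definition above amounts to a calendar-day average over completed requests. A minimal sketch, using made-up request dates:

```python
from datetime import date

# Hypothetical completed requests: (date initiated, date patron notified).
# Per the study's definition, filled AND unfilled requests count, but
# requests still 'in process' are excluded; simple date subtraction
# yields calendar days, so weekends are included automatically.
completed_requests = [
    (date(2001, 3, 1), date(2001, 3, 8)),   # filled
    (date(2001, 3, 2), date(2001, 3, 5)),   # filled
    (date(2001, 3, 5), date(2001, 3, 19)),  # unfilled (negative response)
]

turnaround_days = [(notified - initiated).days
                   for initiated, notified in completed_requests]
average_turnaround = sum(turnaround_days) / len(turnaround_days)
print(average_turnaround)  # (7 + 3 + 14) / 3 = 8.0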
Step 3: Measuring • This involves data collection and analysis, culminating in comparison with external standards or “best practice.” • You will already have some data from statistics gathered as part of routine activity (e.g. book issue and return). Other data may need specific collection from this point onwards (e.g. a survey of customer satisfaction with specific services). • Sometimes measurement may involve site visits or interviews with stakeholders.
Step 3 - Measuring • Data from ninety libraries, drawn from all states, territories and sectors, were included in the main analysis. • Data were analysed by the Australian Bureau of Statistics. • Findings were corroborated by detailed analysis of operations identified as high performing for ILL requesting and DD supply.
Performance averages were broken down by sector, including times, costs and the percentage of the standard met.
Stakeholder Involvement • Patrons commented that the service they received was ‘valuable’, ‘useful’, ‘efficient’ and ‘excellent’. Staff were often described as ‘helpful’, ‘efficient’, ‘professional’ and ‘obliging’. • Patrons who had not paid a fee were asked to estimate the amount they would have been willing to pay for the service. While the service was valued highly, the most common response was $0. The average amount was $4.09.
Step 4: Action plan • The action plan addresses gaps between “best practice” (what the organisation should be doing) and current practice (what it is currently doing). • An early (and crucial) step may be to share conclusions with all who will be involved in the action plan. • The action plan needs to be reviewed at regular intervals to see if it is working at closing each gap.
Step 4 – Action Plan • Examine workflows to ensure there are as few steps as possible • Implement an automation package • Ensure ILL/DD staff are well-trained in resources and systems • Add and maintain holdings on union catalogues • Investigate cooperative agreements with key libraries
If you were an ILL/DD benchmarked library you might • Compare your performance figures against the averages for your sector • Investigate reasons for below-average performance • Have a “team think-in” on the ILL/DD process • Investigate the potential for automation or further automation • Your thoughts?
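The first two bullets above can be sketched as a small comparison against sector averages. All figures here are hypothetical; the gap is defined so that a positive value flags below-average performance (a higher fill rate is better, while lower turnaround and unit cost are better):

```python
# Hypothetical benchmarking comparison: our library vs. sector averages.
sector_average = {"turnaround_days": 9.0, "fill_rate": 0.85, "unit_cost": 17.50}
our_library    = {"turnaround_days": 12.5, "fill_rate": 0.88, "unit_cost": 21.00}

higher_is_better = {"fill_rate"}  # for the rest, lower values are better

statuses = {}
for measure, benchmark in sector_average.items():
    ours = our_library[measure]
    # Positive gap means we are performing below the sector average.
    gap = (benchmark - ours) if measure in higher_is_better else (ours - benchmark)
    statuses[measure] = "investigate" if gap > 0 else "ok"
    print(f"{measure}: ours={ours}, sector={benchmark} -> {statuses[measure]}")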
Problems along the way… Unfortunately, benchmarking exercises do not always run smoothly. Problems for which you might want to plan ahead include: • Lack of support at senior management level • Uncertainty about which processes to prioritise for benchmarking • Justifying the cost (in terms of money and staff time) that the benchmarking exercise will consume.
….and some other issues • Benchmarking often involves teamwork and developing a consensus about what the organisation can and should do. • As the process may involve disclosure of sensitive details of performance, you may need to establish ground rules about confidentiality. • Benchmarking will have costs in terms of staff and time, but well-planned benchmarking exercises yield a good return on investment in terms of improved efficiency and economy. • You need to avoid a “blame culture”, i.e. your organisation’s response to mistakes should be: “What lessons can we learn for next time?” NOT: “Whose fault was it?”
What we have covered • Simple definitions of benchmarking • Some applications in both the commercial and public sectors • The main stages in a benchmarking exercise, with an example that operates at both national and local levels.
Further reading: • Bendell, T. (1997) Benchmarking for Competitive Advantage. 2nd rev. ed. London: Pitman. • Balm, G. (1992) Benchmarking: A Practitioner’s Guide for Becoming and Staying the Best of the Best. Schaumburg, Illinois: QPMA Press. • Brockman, J. (ed.) (1997) Quality Management and Benchmarking in the Information Sector. London: Bowker Saur. • Brophy, P. (2006) Measuring Library Performance: Principles and Techniques. London: Facet Publishing. • National Resource Sharing Working Group (2001) Interlibrary Loan and Document Delivery Benchmarking Study. National Library of Australia.
That’s all folks!