
Value, Impact and Performance: the SCONUL VAMP Programme


Presentation Transcript


  1. Value, Impact and Performance: the SCONUL VAMP Programme. Stephen Town, Cranfield University. Open University, Wednesday 27th June, 2007

  2. Summary • An introduction to the issues • The Value & Impact Measurement Programme (“VAMP”) • The VAMP Deliverables • A Community of Practice in Library Measurement • The “Performance Portal”

  3. The University Context (from the Library Assessment Conference, Charlottesville, Va, September 2006) Universities have two “bottom lines” • Financial (as in business) • Academic, largely through reputation in • Research (the priority in “leading” Universities) • Teaching (& maybe Learning)

  4. Library Pressures for Accountability The need is therefore to demonstrate the Library contribution in these two dimensions: • Financial, through “value for money” or related measures • Impact on research, teaching and learning This also implies that “competitive” data will be highly valued

  5. Cautions for measurement The Aim & Role of Universities & their Libraries • Research • ‘Mode 1’ Research & impact ‘transcendental’ • ‘Mode 2’ Research & impact ‘instrumental’ • Reductionism • Value, Price & ‘Mandarinisation’ of research and its support • Libraries as symbols and ‘transcendent’ services

  6. The SCONUL Experience: the SCONUL Working Group on Performance Improvement • Ten years of “toolkit” development to assist in performance measurement and improvement • The SCONUL ‘Top concern survey’ (2005) suggested an inability to prove value, impact or worth, leading to VAMP

  7. Examples of tools developed (1) • Areas: Integration; Efficiency & Comparability; Quality assurance • Tools: Guidelines; SCONUL Statistics & interactive service; HELMS national performance indicators; E-measures project; Benchmarking Manual

  8. Examples of tools developed (2) • Areas: Satisfaction; Impact • Tools: SCONUL Satisfaction Survey; SCONUL LibQUAL+ Consortium; LIRG/SCONUL Impact Initiative; Information Literacy Success Factors

  9. VAMP Objectives • New instruments & frameworks to fill gaps in measurement • A full, coherent framework for performance, improvement and innovation • Persuasive data for University Senior Managers, to prove value, impact, comparability, and worth

  10. Missing methods? • An impact tool or tools, for both teaching & learning and research (from the LIRG/SCONUL initiative?) • A robust Value for Money/Economic Impact tool • Staff measures • Process & operational costing tools

  11. Benefits? • Attainment & retention of Library institutional income • Proof of value and impact on education and research • Evidence of comparability with peer institutions • Justification of a continuing role for libraries and their staff • Meeting national costing requirements for separating spend on teaching and research

  12. VAMP Project Structure • Phase 1 (March-June 2006) • Critical review • SCONUL Members Survey • Gap analysis & synthesis • SCONUL Conference Workshops • Phases 2 & 3 (July 2006 - June 2007) • Development of new measures & techniques • Review and re-branding of existing tools • Web site development • Dissemination & maintenance strategy

  13. Critical Review Method Review of: • SCONUL initiated or promoted services • Other UK & European initiatives • Initiatives from other UK library sectors • International initiatives starting from the perspective of ‘The Effective Academic Library’, 1995

  14. Review Findings • The Impact Initiative work will be key, but needs to be solidified and embedded • Eight JISC/EU projects in the last eight years relevant to the assessment of value and impact, many relating to e-resources • Significant work in the USA, Australia and South Africa

  15. Review Conclusions • A vast amount of relevant work, but without wide take-up • More critical analysis required of most products and tools • Further development and simplification required to create credible and applicable instruments for SCONUL members

  16. Member Survey Findings • 38 respondents (27% of the population) • 70% had undertaken value or impact measurement • Main rationales are advocacy, service improvement and comparison • Half used in-house methodologies; half used standard techniques • Main barrier is lack of tools, making time an issue • Buy-in of stakeholders is an issue

  17. Member Survey Conclusions • There is a need to demonstrate value and that libraries make a difference • Measurement needs to show ‘real’ value • Need to link to University mission • Libraries are, and intend to be, ahead of the game • Impact may be difficult or impossible to measure • All respondents welcomed the programme, and the prospect of an available toolkit

  18. Synthesis • Terminological confusion? • Is ‘impact’ equivalent to ‘effect’ or to ‘outcome’? • ‘Higher order effects’ and level: individual, course, institutional, vocational, societal, national, international • Are ‘value’ and ‘impact’ a single item? • ‘Value’, ‘adding value’, ‘value for money’, ‘cost-effectiveness’

  19. Expert Comment • Return to why? • Advocacy or management? • Critical gap in measurement • Effect of the Library on educational attainment • Effect of the Library on research attainment • Robust and simple tools • Only a few existing tools effective, so simplify and focus the range of offerings

  20. SCONUL Conference Workshops • Accountability to a variety of structures and individuals … therefore a range of approaches required • SCONUL Statistics heavily used • Directors want help with all VAMP lines • Pedagogic “Big project” needed? • Re-engineer processes rather than measure!

  21. Overall conclusions • Wider sectoral involvement? • Health: evidence-based methods • National MLA Measures • British Library Contingent Valuation • Not only new measures … but also supporting & directing processes

  22. Deliverable Plan 1 “Content” Products 2.1 Value & Impact Guidelines 2.1.1 Institutional Value (eg VFM & Economic Impact) 2.1.2 Impact on Teaching & Learning 2.1.3 Impact on Research

  23. Deliverable Plan 2 “Content” Products 2.2 Staffing & Operational Measures Guidelines 2.2.1 Staff Costing 2.2.2 Staff Added Value measures 2.2.3 Other operational costing methods 2.3 Re-branding & packaging of existing tools

  24. Deliverable Plan 3 “Process” Products 3.1 Web Site 3.2 Community of practice establishment 3.3 Maintenance & sustainability strategy

  25. Progress on Content 1 • 2.1.1 Institutional Value (eg VFM & Economic Impact): VFM tool in negotiation now (with 2.2); Contingent Valuation methods available • 2.1.2 Impact on Teaching & Learning and 2.1.3 Impact on Research: tool for both areas delivered (Information Management Associates)

  26. Progress on Content 2 • 2.2.1 Staff Costing, 2.2.2 Staff Added Value measures, 2.2.3 Other operational costing methods: costing & value method in negotiation; ‘Transparency’ meeting May 2007 • 2.3 Re-branding & packaging of existing tools: mainly included within the Web Site development

  27. Progress on Process • 3.1 Web Site: portal to be launched in June 2007 • 3.2 Community of practice establishment: invitations for beta testing of the site, May 2007

  28. Communities of Practice “groups of people who share a passion for something that they know how to do, and who interact regularly to learn how to do it better”; “coherence through mutual engagement” (Etienne Wenger, 1998 & 2002)

  29. Community of Practice Techniques • Members’ Forum (Blog? Chat?) • VAMP Home Page • Simple Introductions • Detailed Techniques • Techniques in Use (Wiki?)

  30. The ‘Performance Portal’ • A Wiki of library performance measurement containing a number of ‘approaches’, each (hopefully) with: • A definition • A method or methods • Some experience of their use in libraries (or links to this) • The opportunity to discuss use

  31. The Ontology of Performance • ‘Frameworks’ • ‘Impact’ • ‘Quality’ • ‘Statistics’ • ‘Value’
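To make the portal's structure more concrete, the following is a minimal, hypothetical sketch of how a single wiki "approach" entry could be modelled, using the fields listed on slide 30 (definition, methods, experience of use, discussion) and the five ontology headings on slide 31. The class, field names and example values are assumptions for illustration only; they are not part of the actual VAMP site.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical model of one Performance Portal "approach" entry, following the
# fields listed on slide 30 and the five ontology headings on slide 31.
# Class, field names and example values are illustrative assumptions only.

ONTOLOGY = ["Frameworks", "Impact", "Quality", "Statistics", "Value"]

@dataclass
class PortalApproach:
    name: str                # e.g. "LibQUAL+"
    category: str            # one of the five ontology headings
    definition: str          # what the approach is
    methods: List[str] = field(default_factory=list)     # how to apply it
    experience: List[str] = field(default_factory=list)  # use in libraries, or links to it
    discussion_url: str = "" # where members can discuss its use

    def __post_init__(self) -> None:
        # Keep entries within the portal's five top-level headings
        if self.category not in ONTOLOGY:
            raise ValueError(f"Unknown category: {self.category}")

# Example (illustrative) entry filed under the 'Quality' heading
libqual = PortalApproach(
    name="LibQUAL+",
    category="Quality",
    definition="A survey instrument measuring users' perceptions of library service quality.",
    methods=["Run the standard web survey", "Compare gap scores year on year"],
    experience=["Link to SCONUL LibQUAL+ Consortium experience (placeholder)"],
)
print(libqual.name, "filed under", libqual.category)
```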

  32. Frameworks • Mounted: European Foundation for Quality Management (EFQM) model • Desired: Key Performance Indicators; The Balanced Scorecard; Critical Success Factors; The Effective Academic Library

  33. Impact • Mounted: Impact tools • Desired: Detailed UK experience from LIRG/SCONUL Initiatives; Outcome-based evaluation; Information Literacy measurement; More on research impact

  34. Quality • Mounted: Charter Mark; Customer Surveys; LibQUAL+; SCONUL Survey; Priority Research; Investors in People • Desired: Benchmarking; Quality Assurance; ISO 9000s; ‘Investors in People’ experience; Opinion meters; Quality Maturity Model

  35. Statistics • Mounted: SCONUL Statistics & interactive service; HELMS statistics • Desired: Institutional experience of using SCONUL statistics for local advocacy; COUNTER E-resource tools

  36. Value • Mounted: (none listed) • Desired: Contingent valuation; ‘Transparency’ costing; Staff & process costing, value & contribution; E-resource value tools

  37. Discussion Tools • An experiment in social networking & Web 2.0 technologies

  38. Acknowledgments • Angela Conyers, Evidence Base, UCE • Claire Creaser & Suzanne Lockyer, LISU, Loughborough University • Professor Peter Brophy, Manchester Metropolitan University • The VAMP Subgroup of SCONUL WGPI: Maxine Melling, Philip Payne, Rupert Wood • The Cranfield VAMP Team: Darien Rossiter, Michael Davis, Selena Lock, Heather Regan
