
Supported self-evaluation in assessing the impact of HE Libraries



  1. Supported self-evaluation in assessing the impact of HE Libraries. Sharon Markless, King’s College London, and David Streatfield, Information Management Associates

  2. Why supported self-evaluation? • No established tradition in HE libraries of evaluating impact • Problems of engaging busy people with a difficult process, and of keeping them engaged for the long term. How can both these challenges be overcome effectively?

  3. Developing the approach: Stage 1 The Effective College Library Project: case studies in 6 colleges to develop and evaluate specific aspects of practice. Contribution to our approach: • production of a prototype model of the process of impact evaluation (key steps); • importance of understanding aims of the library service; • value of researcher/librarian partnership.

  4. Developing the approach: Stage 2 School self-evaluation materials: generic materials based on research and development. Sets of performance and impact indicators plus data-collection guidance and tools. Contribution to our approach: • workshops vital to support use of materials and get people started; • use of research to guide generation of performance indicators (PIs); • need to provide tools for data collection

  5. Developing the approach: Stage 3 Health and public library research and development initiatives: cycles of workshops to introduce the model, supplemented by on-line support and a growing range of materials. Contribution to our approach: • refining the model to work in, and be relevant to, different contexts; • visible power of the supported action research to motivate and enable change.

  6. The Impact [Implementation] Initiative (LIRG/SCONUL) • 22 university teams – 2 annual cycles • focus on information literacy, supporting research, providing electronic services • 18 finished the cycle • 3 workshops per year + distance support • Visits offered • Structured reports from each site

  7. The Supported Self-evaluation approach • Use of impact model: coherent and systematic approach • Workshops • Materials, especially examples and data collection • E-support between workshops • Teams within each participating library • Self-evaluation: libraries’ own objectives, impact indicators and data gathering • Range and changes in facilitator roles

  8. Underpinning principles • capacity for enhancing work/the service • owned/adapted by practitioners (empowerment) • practitioner-formulated approaches within a coherent framework • tapping research across different disciplines to help get at impact • work within a supportive team • a real initiative with no extra time or money provided; to be sustained, the work has to fit into already busy lives

  9. An approach at three levels • Action research undertaken by each team within each participating HE library • Sharing/reviewing impact indicators, data gathering tools and problems across participating libraries • Evaluating the impact model together with the approach as an experimental programme of change

  10. [Cycle diagram: Introductory event → Start evaluation → Progress check → Review → End evaluation]

  11. Review of the approach/lessons learned 1 Power of supported self-evaluation: • Re-focussed practitioners away from process and towards impact • Effected real development/change • Enabled practitioners to demonstrate impact

  12. Review of the approach/lessons learned 2 Participants recognised: • Collaboration/networking is critical • Need to focus on one aspect of provision in depth • Importance of a framework and structure • Value of examples, especially research tools • Problems of academic cooperation, particularly in data collection • Challenging and stressful nature of engaging with impact

  13. Review of the approach/lessons learned 3 Facilitators learned: • Critical role of the workshops in the process • Need a range of facilitator skills and roles (research; facilitation; change management) and ability to shift between them • Hard to negotiate effective levels and types of support (coercion v empowerment!) • Need to offset low uptake of offered support

  14. Organisational and Structural factors • When to evaluate impact? Problems of the planning cycle; impact may take time! • Sustaining the work; what might be needed for institutionalisation? “Influencing academics and getting change at Academic Boards was harder to do than the evaluation.”

  15. General issues to consider if adopting this approach • Importance of framework and structure • Cross-site collaboration: timing; type and focus • Reporting the process and the outcomes (deadlines, ownership) • May increase uncertainty/cognitive dissonance for participants, since engaging with impact poses a deep challenge

  16. Issues to consider if adopting this approach 2 What do we sacrifice by enabling teams to ‘do their own thing’, albeit within a framework? • Consistency, validity + rigour versus real development + empowerment • Benchmarking/comparability of outputs versus local context • Facilitating versus enforcing

  17. Project process and materials: see the VAMP website
