Performance Metrics and Value Indicators

Presentation Transcript


  1. Intro - Jan • AskAway – Brandon • askON – Jan • Conclusion – Jan • Q&A – you!

  2. [http://www.slideshare.net/webbmedia/key-performance-indicator-for-libraries-presentation]

  3. Performance Metrics and Value Indicators • Everything is based on user needs. Start with metrics that demonstrate satisfaction of user needs. Photo by Amanda Etches

  4. Pearl-grow your metrics and keep developing them. • Link them together to tell the bigger story and show the value • More than proving value: • Optimize workflow • Engage the VR community: partners, staff, & visitors

  5. Brandon Weigel, AskAway Coordinator brandonw@eln.bc.ca

  6. AskAway Background • Launched in 2006 as public/post-secondary collaborative service • Became post-secondary only in 2010 • 30 post-secondary libraries across BC • Collaboratively staffed by all partners • 7 days a week • 67 service hours per week • 39 weeks per year • 235 staff hours per week • 3-5 staff per shift

  7. Funding – Today • Formerly funded by BCcampus, until that funding dried up • Now transitioning to fully participant-funded • Result: an ever-greater need to prove value!

  8. Why do we measure? To prove our value… but what constitutes “value”?

  9. Value – Libraries’ perspective: Getting a return on investment: • Students are using the service • Service provided is high quality • Professional development for staff • Worth the time and resources put into it

  10. Value – Consortial perspective • Provide consistent, quality service • Limit wait times and costs with appropriate staffing levels • Improvements to cost, staff resources, and overall service by working collaboratively

  11. Value – Patrons’ perspective • Contributes to learning and information literacy • Connects researchers with information they need • Leaves patrons satisfied and willing to come back

  12. Tools for measuring • Usage statistics (traffic) • Question form fields • Transcripts • Patron exit surveys • Library commitments and cost data • Institution data

  13. Usage Stats/Traffic • Counts of all chat sessions: • Initiated by patrons, per patron library • Total handled by each library • Number of own patrons handled, per library • Number of other libraries’ patrons handled, per library • The most commonly-used measure… And the most problematic
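
A minimal sketch of how these counts could be tallied from raw session records. The field names (`patron_library`, `answering_library`) are hypothetical, not the chat platform's actual export schema:

```python
from collections import Counter

# Hypothetical session records; real data would come from the chat platform's export.
sessions = [
    {"patron_library": "UBC", "answering_library": "UBC"},
    {"patron_library": "UBC", "answering_library": "TRU"},
    {"patron_library": "TRU", "answering_library": "UBC"},
]

# Initiated by patrons, per patron library
initiated = Counter(s["patron_library"] for s in sessions)
# Total handled by each library
handled = Counter(s["answering_library"] for s in sessions)
# Own patrons handled, per library
own = Counter(s["answering_library"] for s in sessions
              if s["answering_library"] == s["patron_library"])
# Other libraries' patrons handled, per library
other = Counter(s["answering_library"] for s in sessions
                if s["answering_library"] != s["patron_library"])

print(initiated, handled, own, other)
```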

  14. Usage Stats/Traffic: Uses What it’s good for: • Indicates patron awareness of the service • Measuring effect of promotional activities or website changes • Red flag for technical problems What it’s not good for: Measuring value!

  15. Usage Stats/Traffic: Misuses • Assessing service value • “Return on investment” • “If they’re not using it, they don’t need it” • Comparisons with other institutions • “We’re using it less, so we should pay less” • Says nothing about quality of interaction, type of question, or student outcomes

  16. Usage Stats/Traffic: Value Why is usage a poor indicator of value? • No correlation between usage and patron value indicators • Usage primarily correlates with: • Visibility on library website • Promotional activities
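
That "no correlation" claim is checkable with a simple Pearson correlation across libraries. A sketch with placeholder numbers, not AskAway data:

```python
from statistics import correlation  # Python 3.10+

# Placeholder per-library figures: annual chat volume vs. mean survey satisfaction.
usage = [1200, 450, 300, 900, 150]
satisfaction = [4.4, 4.6, 4.3, 4.5, 4.6]

r = correlation(usage, satisfaction)  # Pearson's r
print(f"r = {r:.2f}")  # a value near zero supports the slide's claim
```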

  17. Usage Stats: Visibility Examples Qwidget added to EBSCO search results

  18. Usage Stats: Visibility Examples Website redesign cuts Qwidgets and links

  19. Usage Stats: Visibility Examples Qwidget added to Discovery Layer search results Discovery Layer launched

  20. Usage Stats/Traffic Takeaway: Usage is under your control! (So it can’t tell you much about value.)

  21. Question Form Fields • Detailed data about individual questions • Key fields: • Timestamp • Patron’s institution • Referring URL • Session length • Wait time • Resolution code • Descriptive codes
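
One way to model a question form record for analysis. The field names mirror the slide's list and are illustrative, not the platform's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class QuestionRecord:
    timestamp: datetime
    patron_institution: str
    referring_url: str          # page the patron chatted from
    session_length_min: float
    wait_time_sec: int
    resolution_code: str        # e.g. "answered", "followup", "referred"
    descriptive_codes: list[str] = field(default_factory=list)  # added by librarian post-chat
```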

  22. Form Fields: Uses What it’s good for: • Identifying and tracking problems • Estimating the depth of questions • Finding referral hotspots • Top referrers: database search results, research guides, catalogue, database lists, discovery layer results… • Essentially, pages where people are actively researching • Determining staffing needs for each shift
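
Finding referral hotspots is essentially a tally of referring URLs grouped by page type. A rough sketch, assuming records shaped like the `QuestionRecord` above:

```python
from collections import Counter
from urllib.parse import urlparse

def hotspots(records, top=10):
    """Count referring pages by host plus first path segment."""
    counts = Counter()
    for r in records:
        parsed = urlparse(r.referring_url)
        path = parsed.path.strip("/")
        segment = path.split("/")[0] if path else "(home)"
        counts[f"{parsed.netloc}/{segment}"] += 1
    return counts.most_common(top)
```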

  23. Form Fields: Uses [chart slide; data not captured in the transcript]

  24. Form Fields: Uses Descriptive Codes: added by librarian post-chat • Used to identify the type(s) of question • Core codes: research, ready reference, directional, citation, technical, e-resource access, prank • Plus 16 additional codes • Useful for: debunking myths! (And demonstrating value)
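
Debunking myths (say, "chat is mostly directional questions") comes down to tallying those codes. A sketch over the same hypothetical records; note that a session can carry several codes:

```python
from collections import Counter

def code_breakdown(records):
    """Share of coded sessions carrying each descriptive code."""
    coded = [r for r in records if r.descriptive_codes]
    if not coded:
        return {}
    counts = Counter(code for r in coded for code in r.descriptive_codes)
    return {code: n / len(coded) for code, n in counts.most_common()}
```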

  25. Form Fields: Cautions • Good data, but incomplete: • Descriptive codes not always applied by busy librarians • Codes currently applied to 70% of sessions (not bad) • Not much user-submitted information captured • Difficult to read – which discourages use

  26. Form Fields: What I wish we could do • Compare data with other reference venues • Collect more data! • Get more people using it

  27. Transcripts • Useful for: • Examining problematic interactions • Examples for training • Potentially excellent for measuring service quality (But we don’t do that)

  28. Transcripts: Cautions • Strong potential, but untapped • 26/27 polled libraries do not use transcripts for staff evaluation • General sense: “It’s creepy” • Sense of being watched could hurt staff support • Comparisons harm collaboration • Result: Opinions on quality are based on feelings, not on data

  29. Transcripts: In an ideal world… The impossible dream: • Consistent reference standards, with library buy-in • Non-invasive transcript analysis (impossible?) • Work with coordinators to apply those standards More realistic: • Annual anonymized random sample analysis
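
The annual anonymized random sample could be as simple as scrubbing obvious identifiers and drawing a fixed-size, reproducible sample. The e-mail regex below is a naive assumption and would need review before use on real transcripts:

```python
import random
import re

EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")

def anonymize(transcript: str) -> str:
    """Replace e-mail addresses with a placeholder; real scrubbing would go further."""
    return EMAIL.sub("[email]", transcript)

def annual_sample(transcripts: list[str], n: int = 50, seed: int = 1) -> list[str]:
    """Fixed seed makes the draw reproducible for reviewers."""
    rng = random.Random(seed)
    return [anonymize(t) for t in rng.sample(transcripts, min(n, len(transcripts)))]
```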

  30. Exit Survey • Began in 2008 to measure patron satisfaction, and to collect qualitative data and useful quotes • Rewritten in 2013 to measure learning outcomes

  31. Exit Survey: What it tells us • How they discovered AskAway • Why they’re using AskAway • Satisfaction level • Likelihood of returning • What could be improved • Learning outcomes • Demographic info • General comments

  32. Exit Survey: Why it’s useful • Satisfaction level: rough measure of service quality • 2014: 90% high satisfaction; 93% likely to return • Demographics: Tells us who our patrons really are, and how different groups use AskAway
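
Those headline figures are simple proportions over survey responses. A sketch with made-up answers, assuming a 5-point satisfaction scale where 4-5 counts as "high":

```python
# Hypothetical survey responses.
satisfaction = [5, 4, 5, 3, 5, 4, 4, 5, 2, 5]        # 1-5 scale
will_return = [True, True, True, False, True,
               True, True, True, True, True]

high = sum(1 for s in satisfaction if s >= 4) / len(satisfaction)
likely = sum(will_return) / len(will_return)
print(f"{high:.0%} high satisfaction; {likely:.0%} likely to return")
```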

  33. “Super fantastic help. Friendly, positive and informative. Exactly what I needed and didn't take much longer than a phone call. Thank you!” - TRU grad student “I was astounded at the ease with which the librarian found a specific paper for me, based on an incomplete name, an erroneous date, and context. This is a great service!” - UBC Faculty

  34. "Thank you so much for your help once again. I really appreciate how you 'teach' us students strategies to use rather than just giving us the answer. Proves to be really effective in my learning!” - UFV Student

  35. Exit Survey: What I wish we could do • Link survey responses to transcripts • Improve response rates: • Show the patron a dialog box when they close the browser window • Other ways?? • Post monthly survey analyses, not just tables

  36. Inputs and Outputs: Return on Investment • NOT ABOUT HOW MANY CHATS YOU GET • Inputs vs. Outputs • Staffing: • Commitment hours per week: 3 to 34 (based on size) • Majority contribute 3 or 5 hours weekly • Return on investment: 235 staff hours per week • 510% to 6830% return on staff time • PLUS extended reference hours
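
One plausible reading of that return figure: every hour a library commits buys access to the whole pool of staffed hours. The formula below is an assumption about how the percentages were derived; the actual calculation may weight hours differently, so it won't exactly reproduce the slide's 510%-6830% range:

```python
POOLED_HOURS = 235  # total staff hours per week across all partners

def staff_time_return(committed_hours: float) -> float:
    """Staffed hours available per hour contributed, as a percentage."""
    return POOLED_HOURS / committed_hours * 100

for h in (3, 5, 34):  # commitment range from the slide
    print(f"{h} h/week committed -> {staff_time_return(h):,.0f}% return")
```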

  37. Inputs and Outputs: Return on Investment – Financial [chart slide; figures not captured in the transcript]

  38. What I wish we could measure • Larger student impact • Assignment scores • Long-term learning • Reasons people don’t use AskAway • Overall awareness of the service • Quality of chats • Librarian performance

  39. Takeaways • The easiest tool to use is rarely the best for the job • Choose metrics that tell you what you want to know • Modify your tools to fit the problem • Usage is under your control • Think broadly about the investment
