
LibQUAL+ Canada Consortium Survey Results Webinar, Oct. 5, 2010

LibQUAL+ Canada Consortium Survey Results Webinar, Oct. 5, 2010. Organized by Sam Kalb, LibQUAL+ Canada Coordinator. Sponsored by the Canadian Association of Research Libraries. Moderated by Katherine McColgan, CARL.



Presentation Transcript


  1. LibQUAL+ Canada Consortium Survey Results Webinar, Oct. 5, 2010 Organized by Sam Kalb, LibQUAL+ Canada Coordinator Sponsored by the Canadian Association of Research Libraries Moderated by Katherine McColgan, CARL

  2. Program • Sam Kalb (Queen’s University). Using the LibQUAL+® notebooks and other LibQUAL+ services to analyze and present your results (50 minutes) • Eun-ha Hong (Wilfrid Laurier University). Using statistical tools to further analyze your LibQUAL+® data (15 minutes) • Questions/discussion (25 minutes)

  3. Understanding & Using Your LibQUAL+® Results Sam Kalb Assessment & Scholarly Communication Services Coordinator Queen’s University Library

  4. Privacy & LibQUAL+® Results • Any LibQUAL+ participant can access the results of all other libraries and consortia that participated in the same year • “In an example of collaboration, LibQUAL+® participants are sharing their results within the LibQUAL+® community with an openness that nevertheless respects the confidentiality of each institution and its users.” Martha Kyrillidou, 2010 LibQUAL+® survey, introduction.

  5. Desired Outcomes • Use the LibQUAL+® Notebooks & analytic utilities to perform some simple analyses of your LibQUAL+® survey results • Present the results to your stakeholders • Use the data to target areas for improvement From: presentation by R. Bowlby and M. Kyrillidou, LibQUAL+® Canada Workshop, Ottawa, Ontario, Canada, October 24-25, 2007

  6. LibQUAL+ Canada 2010 Survey Format Distribution ¾ of members chose the Lite format

  7. Completion Rates • 2010 LibQUAL+® Lite: 61.7% • 2010 LibQUAL+® full: 54.3% • 2007 LibQUAL+® full: 48.8% A completed survey is one in which the user has supplied a rating for all items on the survey.

  8. Relative Indicators • There are no absolute high or low scores; scores are relative indicators. • Scores are only meaningful in comparison with other scores in the same survey, with your survey from another year, with other individual libraries, and with the consortial totals.

  9. Three Interpretation Frameworks • Zone of tolerance • Perceptions vs. expectations • Meeting users’ minimum expectations (Adequacy Gap) • Approaching users’ desired expectations (Superiority Gap) • My scores over time (longitudinal) • Am I doing better or worse compared with the last time I measured my performance? • Peer comparisons From presentation by M. Kyrillidou, ALA, June 2007

  10. What Do the Ratings Signify? User Assigned: • Desired. How highly do I value it? • Minimum. What is the least I expect? • Perceived. My actual rating Calculated Scores: • Adequacy Gap. Perceived – Minimum • Superiority Gap. Perceived – Desired
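The two calculated scores can be sketched in a few lines of Python; the ratings below are illustrative values on the 9-point LibQUAL+ scale, not actual survey results:

```python
def gap_scores(minimum, desired, perceived):
    """LibQUAL+ gap scores derived from the three user-assigned ratings."""
    adequacy = perceived - minimum     # positive: minimum expectation is met
    superiority = perceived - desired  # usually negative: desired level not reached
    return adequacy, superiority

# Illustrative ratings for a single survey item
adequacy, superiority = gap_scores(minimum=6.2, desired=8.1, perceived=7.0)
print(round(adequacy, 2))     # 0.8  -> service exceeds the minimum expectation
print(round(superiority, 2))  # -1.1 -> service falls short of the desired level
```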

  11. Which Numbers? • Desired. How important is a service relative to other services, and relative to the same services at other libraries • Adequacy Gap. How far above or below Minimum expectation the service is delivered.

  12. User Groups & Disciplines

  13. User Groups Focus mainly on your results by User Group. The major academic user groups (faculty, graduate and undergraduate students) have quite different perspectives on and expectations of library services, so overall totals from different libraries or different years can be skewed by variations in the ratios of user groups. • Look at the 5 most and the 5 least desired questions for each user group • Look at the services with the 5 highest and the 5 lowest gap scores (bear in mind that Superiority Gap scores are usually negative) • Look for any correlation between the high/low Desired and the high/low Gap scores • Look at the average (mean) scores for each service dimension by user group
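The ranking steps above can be sketched as follows; the question codes and scores are hypothetical examples, not actual LibQUAL+ results:

```python
# Hypothetical per-question mean scores for one user group.
questions = {
    "AS-1": {"desired": 7.8, "adequacy_gap": 0.4},
    "IC-2": {"desired": 8.2, "adequacy_gap": -0.3},
    "LP-1": {"desired": 6.9, "adequacy_gap": 0.7},
    "IC-4": {"desired": 8.0, "adequacy_gap": -0.1},
    "AS-5": {"desired": 7.1, "adequacy_gap": 0.5},
    "LP-3": {"desired": 6.5, "adequacy_gap": 0.9},
}

def rank(scores, key, n=5, lowest=False):
    """Return the n question codes with the highest (or lowest) score."""
    order = sorted(scores, key=lambda q: scores[q][key], reverse=not lowest)
    return order[:n]

print(rank(questions, "desired"))                    # 5 most desired questions
print(rank(questions, "adequacy_gap", lowest=True))  # 5 lowest gap scores
```

In this made-up data the two rankings happen to coincide, which is exactly the kind of high-Desired / low-Gap correlation the checklist above suggests looking for.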

  14. 5 highest Desired Services

  15. Academic Discipline Groups • In addition to analyzing your results by User Group, you can analyze your results by Discipline using ARL’s Analytics utility (covered later in the presentation). • Focus on LibQUAL+®’s Standard Disciplines • Facilitates comparison with other libraries • Most useful for the Information Control dimension

  16. Where to Get Your Results • Report - results notebook (PDF) • Comments - respondents’ free-text comments (CSV or TXT) • Consortial Notebooks – Consortium, CARL, CREPUQ, OCUL (PDF) • Raw Data – individual records of every respondent’s survey in CSV format (spreadsheet) • Data Keys & SPSS Syntax – description of the raw data elements or fields (MS Word) http://www.libqual.org/SurveyInstruments/LibQual/DataRepository.aspx

  17. Consortial Results Notebooks • http://library.queensu.ca/webir/canlibqual/results-e.htm

  18. The Results Notebooks

  19. Results Notebooks - Content • Sections for Overall, Undergraduates, Graduates, Faculty, Staff, Library Staff include: • Demographic Summary • Core Questions Summary • Dimensions Summary • Local Questions • General Satisfaction Questions • Information Literacy Outcomes Questions • Library Use Summary • Appendix describing changes in the dimensions and the questions included in each dimension.

  20. Representativeness by User Group & Discipline

  21. Representativeness - Standard Disciplines: An Academic Library* * Queen’s University 2010 LibQUAL+® Survey

  22. LibQUAL+ Canada 2010 University Response Rates

  23. Core Survey Questions • Tables & Charts • Individual questions – average scores & standard deviations • Dimensions summary

  24. Standard Deviations [This slide was omitted from the Oct. 5, 2010 presentation] How closely does a mean score in a notebook represent all of the individual respondents’ scores for a particular item? If all respondents rated AS-1 Desired as 7.64, the SD would be 0.
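A quick illustration of that point, using made-up ratings:

```python
from statistics import mean, pstdev

# Made-up Desired ratings for one item from five respondents
ratings = [8, 7, 8, 9, 7]
print(round(mean(ratings), 2))    # 7.8
print(round(pstdev(ratings), 2))  # 0.75 -- moderate spread around the mean

# If every respondent gave the identical rating, the SD would be 0,
# and the mean would represent each respondent exactly.
print(pstdev([7.64] * 5))  # 0.0
```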

  25. Understanding Your Thermometer Charts [Chart legend: Desired, Perceived, Target, Minimum, Zone of tolerance]

  26. Charting the Survey Dimensions • Affect of Service – Customer/client service • Information Control – Collections & access to collections • Library as Place – Physical facilities Range of Mean scores is relatively narrow (6.27 – 8.03)

  27. LibQUAL+ Canada 2010 Faculty Mean Scores

  28. Negative Adequacy Gap Scores • Potential areas for improvement or further investigation

  29. When might a lower score not be a bad sign? Always check the Minimum mean score when evaluating a rise or fall in your Adequacy Gap score. A lower Gap score may simply result from rising expectations.
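A sketch of that check, with made-up figures for one item across two surveys:

```python
# Made-up two-survey comparison for a single item.
scores = {
    "2007": {"minimum": 6.0, "perceived": 6.8},
    "2010": {"minimum": 6.7, "perceived": 7.0},
}

for year, s in scores.items():
    gap = s["perceived"] - s["minimum"]
    print(year, round(gap, 2))
# 2007 0.8
# 2010 0.3
# The Adequacy Gap fell even though the perceived score rose:
# the drop reflects a rising Minimum expectation, not worse service.
```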

  30. Understanding Your Individual Results: Radar Charts From presentation by M. Kyrillidou, ALA, June 2007

  31. Key to Radar Charts From presentation by M. Kyrillidou, ALA, June 2007

  32. Other Notebook Elements • Local Questions • General Satisfaction Questions • Information Literacy Outcomes Questions • Library Use Summary

  33. Local Questions • You can compare your individual local question results with: • Your past surveys • Consortial or peer library results (assuming a sufficient number of respondents, e.g. 100+) LibQUAL+ Canada 2010. Item 3.4. Local Questions Summary

  34. General Satisfaction & Information Literacy Outcomes Questions • These are simple indicators without the context of minimum and desired ratings

  35. Library Use Summary From presentation by M. Kyrillidou, ALA, June 2007

  36. Qualitative Analysis: User Comments • Almost ½ of respondents fill in the Comments box • Provide valuable insights and suggestions for improvements • Essential component in understanding the reasons behind the survey scores. • Available on LibQUAL+® Web site Data Repository www.libqual.org

  37. Comments Records • Download the file in CSV (Excel) or TXT (plain text) format • Record content: • ID • UserGroup • Discipline • Branch • Age • Sex • Comment

  38. Using Respondents’ Comments • Skim the comments • Load csv file into text analysis program, e.g. ATLAS.ti, InMagic, etc. • http://db.library.queensu.ca/libqual/index2010.htm • Conduct analysis • Identify major themes, e.g. study space, library catalogue, noise, etc. • http://library.queensu.ca/webir/libqual-2007/issues&actions.html
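As a lightweight alternative to a dedicated text-analysis package, the theme-tagging step can be sketched in plain Python. The column names match the comments download described above; the keyword-to-theme map is hypothetical and would normally be built after skimming the comments:

```python
import csv
from collections import Counter

# Hypothetical coding scheme: keyword -> theme
THEMES = {
    "noise": "noise",
    "quiet": "study space",
    "study": "study space",
    "catalogue": "library catalogue",
}

def count_themes(path):
    """Tally themes across the LibQUAL+ comments download (CSV format)."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            text = row["Comment"].lower()
            # Count each theme at most once per comment
            matched = {theme for kw, theme in THEMES.items() if kw in text}
            for theme in matched:
                counts[theme] += 1
    return counts
```

From the resulting tallies, the major themes (study space, noise, catalogue, etc.) can then be ranked with `counts.most_common()`.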

  39. Other LibQUAL+®Analytical Tools • Analytics • Norms Tables www.libqual.org

  40. LibQUAL+® Analytics: User Group and Discipline Analysis • Institutional Explorer (peer comparison) • Representativeness graphs • Radar chart • Library Use chart • Thermometer chart • Cumulative percentile distribution • Longitudinal Analysis (by survey year) • Cumulative percentile distribution • http://www.libqual.org/SurveyInstruments/LibQual/Analytics.aspx

  41. LibQUAL+® Analytics: Institutional Explorer • 2010 Queen’s LibQUAL+® results, Undergraduate Business students

  42. Longitudinal Analysis: Queen’s University, 2007 & 2010, Health Sciences Graduate Students

  43. LibQUAL+® Chart Templates • ARL provides templates to assist you in preparing customized radar and thermometer charts for your own analyses http://www.libqual.org/SurveyInstruments/LibQual/Resources.aspx www.libqual.org

  44. LibQUAL+® Norms Tables • Identify your score • Compare it to a relevant norms table • Year • Subgroup • Dimension • Norms are stable and are no longer calculated annually (last calculated 2005) • Norm conversion tables facilitate the interpretation of observed scores using norms created for a large and representative sample. http://people.cehd.tamu.edu/~bthompson/libq2005.htm From: presentation by R. Bowlby and M. Kyrillidou, LibQUAL+® Canada Workshop, Ottawa, Ontario, Canada, October 24-25, 2007

  45. Additional LibQUAL+® Analysis Services • Customized Discipline Analysis • Library Branch Analysis • Other Customized Analyses (upon request) • Print Copies http://www.libqual.org/about/about_lq/fee_schedule www.libqual.org

  46. Presenting the Results to Your Stakeholders

  47. Stakeholders • Identify all of the stakeholders or constituents who want and need to know about the survey results • Consider the “stake” of each of the above: what specific aspect of LibQUAL+® will be of most interest / concern • Determine how to communicate with each identified stakeholder From: presentation by R. Bowlby and M. Kyrillidou, LibQUAL+® Canada Workshop, Ottawa, Ontario, Canada, October 24-25, 2007

  48. Communicate with your Customers (faculty, students, other) • 1st Priority. Particularly those whom you asked to participate in the survey. As soon as you can: • Announce incentive award winners • Inform users of highlights of the survey results • Present weak areas as challenges and opportunities, not as negatives • Most importantly, describe what the library intends to do: action items begun and planned Example: http://library.queensu.ca/libqual%202010 From: presentation by R. Bowlby and M. Kyrillidou, LibQUAL+® Canada Workshop, Ottawa, Ontario, Canada, October 24-25, 2007

  49. Comparing Your Library’s Results • Compare your results with the corresponding consortial results and those of peer Canadian libraries • Compare your results over multiple LibQUAL+® surveys (longitudinal analysis) • Look at results to determine if users are not aware of what the library already does • Explore one question by discipline and user group • Probe the questions that had meaningful gaps between perceived results and minimum expectations (Adequacy Gap) From presentation by M. Kyrillidou, ALA, June 2007

  50. Targeting Incremental Improvements • From all of the data, determine what can and should be addressed • Prioritize some action items • Align with the mission, vision and goals of the parent organization • Address users’ top priorities, by user group • Improve areas of strong user dissatisfaction • Build on strengths, if they are truly user needs and priorities • Identify work that can be de-emphasized and resources that can be reallocated From: presentation by R. Bowlby and M. Kyrillidou, LibQUAL+® Canada Workshop, Ottawa, Ontario, Canada, October 24-25, 2007
