
Technical Implementation of the MINES Survey Methodology


Presentation Transcript


  1. MINES for Libraries Technical Implementation of the MINES Survey Methodology Terry Plum Assistant Dean, Simmons GSLIS Association of Research Libraries MINES – www.minesforlibraries.org Terry Plum (terry.plum@simmons.edu) Brinley Franklin (brinley.franklin@uconn.edu) ACRL 2005 Minneapolis, MN April 7, 2005 www.arl.org/stats/

  2. Issues with web surveys • Non-probability • Entertainment surveys • Self selected surveys • Volunteer panels • Probability • Intercept (every nth) • Surveys that obtain respondents from an e-mail request. • Mixed-mode surveys where one of the options is a Web survey. • Pre-recruited panels of a particular population as a probability sample www.minesforlibraries.org

  3. Issues with web surveys • Research design • Coverage error • Unequal access to the Internet • Internet users are different from non-users • Response rate • Response representativeness • Random sampling and inference • Non-respondents • Data security www.minesforlibraries.org

  4. Networked electronic resources and services - assessment environment - • Networked electronic resources are accessible from many different web pages and web servers. • Patrons bookmark networked electronic resources locally on their own workstations. • Academic departments, librarian liaisons, and anyone else with a web page copy and paste library links into their own sites. • The survey data must be collected, and must be commensurable, for all networked electronic resources, including e-journals, e-books, online databases, and traditional library request services offered in the online environment, such as interlibrary loan. • The results of the survey have to be uninfluenced by caching, both local web-browser caching and proxy-server or Internet Service Provider caching (a header sketch follows this slide). • The survey has to be meaningful for networked electronic resources, no matter how they are implemented. • Different authentication methods have to be accommodated, whether the institution uses IP, password, referring URL, or an authentication and access gateway. • Remote usage has to be measured, regardless of the channel of communication, whether a locally implemented proxy server, modem pool, or other institutional service. www.minesforlibraries.org
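
A minimal sketch of the caching point above: PHP headers a survey or gateway script could send so that neither the browser cache nor an intermediary (proxy server or ISP) interferes with survey delivery. The script name is hypothetical; the header values are standard HTTP.

<?php
// no_cache.php - hypothetical helper, not the MINES production code. Sends
// standard HTTP headers so neither the local browser cache nor an
// intermediary (proxy server or ISP cache) serves a stale copy of the
// survey page or silently skips the survey redirect.
header('Cache-Control: no-store, no-cache, must-revalidate, max-age=0');
header('Pragma: no-cache');                        // for HTTP/1.0 proxies
header('Expires: Thu, 01 Jan 1970 00:00:00 GMT');  // a date in the past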

  5. MINES strategy • A representative sampling plan, including sample size, is determined at the outset. Typically, there are 48 hours of surveying over 12 months at a medical library and 24 hours a year at a main library (a sketch of drawing these random periods follows this slide). • Random-moment, web-based surveys are employed at each site. • Participation is usually mandatory, negating non-respondent bias, and is based on actual use in real time. • Libraries with database-to-web gateways or proxy rewriters offer the most comprehensive solution for surveying all networked-services users during survey periods. www.minesforlibraries.org
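
As a sketch of how such a sampling plan might be drawn in advance, the following hypothetical PHP picks one random two-hour block per month, giving the 24 hours a year described for a main library (two blocks per month would yield the 48-hour medical library plan). The service-hour range is an assumption.

<?php
// pick_survey_blocks.php - hypothetical sketch, not the MINES production
// code. Draws random two-hour survey blocks, one per month by default.
function pickSurveyBlocks(int $year, int $blocksPerMonth = 1): array
{
    $blocks = [];
    for ($month = 1; $month <= 12; $month++) {
        $daysInMonth = (int) date('t', mktime(0, 0, 0, $month, 1, $year));
        for ($i = 0; $i < $blocksPerMonth; $i++) {
            $day  = mt_rand(1, $daysInMonth);
            $hour = mt_rand(8, 20);   // assumed service hours: 8:00-22:00
            $blocks[] = sprintf('%04d-%02d-%02d %02d:00-%02d:00',
                                $year, $month, $day, $hour, $hour + 2);
        }
    }
    return $blocks;
}

print_r(pickSurveyBlocks(2005));      // 12 randomly chosen two-hour blocks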

  6. MINES strategy • Placement • Point of use • Not remembered, predicted, or critical incident • Usage rather than user • What about multiple usages? • A time-out? • A cookie or other mechanism with auto-population (see the cookie sketch after this slide) • Distinguish patron association with libraries. • For example, medical library v. main library. • But what if the resources are purchased campus-wide for everyone? Then how do we get patron affiliation? www.minesforlibraries.org
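
One hedged sketch of the cookie-with-auto-population idea raised above: after a patron completes the survey once, the answers are stored in a cookie for a time-out window, so later uses within the window are logged as usages without re-presenting every question. The cookie name, window length, and field names are assumptions.

<?php
// survey_window.php - hypothetical sketch of the time-out/cookie mechanism.
$window = 30 * 60;   // assumed 30-minute time-out; a site would tune this

if (isset($_COOKIE['mines_answers'])) {
    // Repeat use inside the window: count a usage and auto-populate the
    // demographic answers from the cookie instead of re-asking them.
    $answers = json_decode($_COOKIE['mines_answers'], true);
    // ... log $answers with a timestamp and the requested resource ...
} elseif ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // First completed survey: remember the answers for the window.
    $answers = [
        'status'   => $_POST['status']   ?? '',
        'location' => $_POST['location'] ?? '',
    ];
    setcookie('mines_answers', json_encode($answers), time() + $window);
}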

  7. Web Survey Design Guidelines • Web survey design guidelines that MINES followed (a minimal form sketch follows this slide): • Presentation • Simple text for different browsers – no graphics • Different browsers render web pages differently • Few questions per screen, or simply few questions • Easy to navigate • Short and plain • No scrolling • Clear and encouraging error or warning messages • Every question answered in a similar way – consistent • Radio buttons, drop-downs • ADA compliant • Introduction page or paragraph • Easy to read • Users must be able to see the definition of sponsored research. • Can present questions in response to answers – for example, if sponsored research is chosen, a follow-up survey can be presented. Dillman, D.A. 2000 (December). Mail and Internet Surveys: The Tailored Design Method. 2nd ed. New York: John Wiley & Sons. www.minesforlibraries.org
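
A minimal sketch of one survey screen built to these guidelines: plain text, radio buttons, one short question per screen, no scrolling. The question wording, answer options, and script names are illustrative, not the MINES instrument.

<?php
// survey_form.php - illustrative only; the options echo the kinds of
// purpose-of-use categories MINES asks about, but the wording is made up.
$purposes = ['Sponsored research', 'Instruction', 'Coursework',
             'Patient care', 'Other'];
?>
<form method="post" action="survey_submit.php">
  <p>What is the primary purpose of your use of this resource?</p>
  <?php foreach ($purposes as $i => $p): ?>
    <label><input type="radio" name="purpose" value="<?= $i ?>"> <?= $p ?></label><br>
  <?php endforeach; ?>
  <p><input type="submit" value="Continue"></p>
</form>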

  8. Example of presentations: first fork www.minesforlibraries.org

  9. Example of presentations: survey www.minesforlibraries.org

  10. Example of presentations: sponsored research follow-up www.minesforlibraries.org

  11. Example of presentations www.minesforlibraries.org

  12. Example of presentations www.minesforlibraries.org

  13. Quality checks • Target population is the population frame – the patrons who were supposed to be surveyed were surveyed – except in libraries with outstanding open digital collections. • Check usage against IP. Here, big numbers may not be good: one IP generating many surveys may be seeing the survey too often. • Alter the order of questions and answers, particularly sponsored research and instruction. • Spot check IP against self-identified location (see the spot-check sketch after this slide). • Spot check undergraduates choosing sponsored research – measurement error. • Check self-identified grant information against actual grants. • Content validity – discussed with librarians and pre-tested. • Turn-aways – the number who elected not to fill out the survey. • Library information architecture – gateway v. HTML pages – there is a substantial difference in results. www.minesforlibraries.org
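
The IP spot check above can be partly automated. A hypothetical sketch that compares a respondent's self-identified location against known campus address prefixes; the prefixes (documentation ranges) and field names are made up for illustration.

<?php
// spot_check.php - hypothetical quality-check sketch, not production code.
$campusPrefixes = ['192.0.2.', '198.51.100.'];   // assumed campus ranges

function looksOnCampus(string $ip, array $prefixes): bool
{
    foreach ($prefixes as $prefix) {
        if (strpos($ip, $prefix) === 0) {
            return true;
        }
    }
    return false;
}

$ip = $_SERVER['REMOTE_ADDR'];
$saysInLibrary = (($_POST['location'] ?? '') === 'in_library');
if ($saysInLibrary && !looksOnCampus($ip, $campusPrefixes)) {
    error_log("MINES spot check: $ip claims in-library use from off campus");
}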

  14. Issues with web surveys: brief bibliography • Cook, Colleen; Heath, Fred; and Thompson, Russell L. 2000 (December). "A Meta-Analysis of Response Rates in Web- or Internet-Based Surveys." Educational and Psychological Measurement 60(6): 821-836. • Couper, Mick P.; Traugott, Michael W.; and Lamias, Mark J. 2001. "Web Survey Design and Administration." Public Opinion Quarterly 65(2): 230-253. • Covey, Denise Troll. 2002. Usage and Usability Assessment: Library Practices and Concerns. CLIR Publication 105. Washington, DC: Council on Library and Information Resources. • http://www.clir.org/pubs/reports/pub105/contents.html • Dillman, D.A. 2000 (December). Mail and Internet Surveys: The Tailored Design Method. 2nd ed. New York: John Wiley & Sons. • Gunn, Holly. 2002. "Web-based Surveys: Changing the Survey Process." First Monday 7(12). • http://firstmonday.org/issues/issue7_12/gunn/index.html • Cook, Colleen, and others. 2004. LibQUAL+™ Spring 2004 Survey. • http://www.libqual.org/documents/admin/ARL_Notebook2004.pdf • Schonlau, Matthias; Fricker, Ronald D., Jr.; and Elliott, Marc N. 2002. Conducting Research Surveys via E-Mail and the Web. Santa Monica, CA: RAND. • Tenopir, Carol, with the assistance of Brenda Hitchcock and Ashley Pillow. 2003 (August). Use and Users of Electronic Library Resources: An Overview and Analysis of Recent Research Studies. Washington, DC: Council on Library and Information Resources. • http://www.clir.org/pubs/reports/pub120/contents.html • Thomas, Susan J. 2004. Using Web and Paper Questionnaires for Data-Based Decision Making: From Design to Interpretation of the Results. Thousand Oaks, CA: Corwin Press. • Thompson, Bruce; Cook, Colleen; and Thompson, Russell L. 2002. "Reliability and Structure of LibQUAL+™ Scores: Measuring Perceived Library Service Quality." portal: Libraries and the Academy: 3-12. www.minesforlibraries.org

  15. How to implement web surveys on library web sites • Because of the point-of-use requirement, libraries with a virtual gateway in their web architecture succeeded best (a gateway sketch follows this slide). • Rewriting proxy server • Database-to-web solutions • Serials Solutions • Interestingly, OpenURL resolvers also function as gateways. www.minesforlibraries.org
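
The pass-through gateway idea, visible in the URL examples in the University 1 scenario below, reduces to a short redirect script. A minimal sketch, assuming the link-rewriting convention gateway.php?<target-url>; the host name, survey-period check, and cookie name are placeholders, not the production script.

<?php
// gateway.php - minimal sketch of the pass-through gateway pattern. Library
// links are rewritten to http://redir.library.example.edu/gateway.php?<url>
// so every click on a networked resource passes through one survey point.
$target = $_SERVER['QUERY_STRING'] ?? '';      // everything after the "?"
if (!preg_match('#^https?://#', $target)) {
    exit('No valid target URL supplied.');
}

$surveyActive = false;   // a real site would consult the sampling schedule
if ($surveyActive && !isset($_COOKIE['mines_answers'])) {
    // Survey period, patron not yet surveyed: present the survey first and
    // carry the target along so the patron is redirected after submitting.
    header('Location: /survey_form.php?target=' . urlencode($target));
} else {
    header('Location: ' . $target);            // pass straight through
}
exit;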

  16. Library web architecture www.minesforlibraries.org

  17. Library web architecture www.minesforlibraries.org

  18. Variations on the web architecture theme • Online catalog • 856 field • Serials Solutions • List of e-journals • Referring server • Create a pass-through gateway • Mirrored web server • Drop in a mirrored HTML page with survey links during survey periods • Mirrored HTML pages enabled by scripts (see the link-rewriting sketch after this slide) www.minesforlibraries.org
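
The mirrored-page variation can be scripted: take the regular resource-list page and rewrite its outbound links through the gateway, producing the copy that is dropped in during survey periods. A hypothetical sketch; the file names and gateway host are assumptions.

<?php
// make_mirror.php - hypothetical sketch of "mirrored HTML pages enabled by
// scripts": rewrite each external link on a resource list so it passes
// through the survey gateway, then publish the rewritten copy.
$gateway  = 'http://redir.library.example.edu/gateway.php?'; // assumed host
$html     = file_get_contents('databases.html');
$mirrored = preg_replace('#href="(https?://[^"]+)"#',
                         'href="' . $gateway . '$1"',
                         $html);
file_put_contents('databases_survey.html', $mirrored);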

  19. Variations on the web architecture theme • DD/ILL • ILLiad – enable at the ILLiad logon screen • Ask Reference • Enable at the Ask Reference page or icon • Digital libraries • Represent an enormous investment • Primary clientele is outside the library. • Introduce a non-authenticated group. www.minesforlibraries.org

  20. Scenarios – University 1 • Situation • Three libraries: main and two branches • One virtual library • University authentication is NetID, but the library uses IP. • Off-site access is provided through a rewriting proxy server • List of databases generated with PHP and MySQL • List of e-journals generated with Serials Solutions • Library catalog – electronic journals are cataloged. Links point to the proxy server, not to Serials Solutions. Uses a metasearch ILS feature. • Interlibrary loan, ILLiad logon • Digital libraries available through CONTENTdm www.minesforlibraries.org

  21. Scenarios – University 1 • Survey Solution • Elected not to use the proxy server • Ran all requests except for e-journals through a gateway • http://redir.library.xxx.edu/gateway.php?http://0-search.epnet.com.yyy.xxx.edu/login.asp?profile=agr • Serials Solutions • http://redir.library.xxx.edu/gateway.php?http://secret.search.serialssolutions.com/log?L=MW8XT6BJ7R&D=RMI&&J=DAEDCA&U=http://0-mitpress.mit.edu.yyy.sss.edu/catalog/item/default.asp?sid=22065875-6E38-4AAA-BB11-E00879BDE665&ttype=4&tid=61 • Good coverage, except for e-journals reached through the catalog www.minesforlibraries.org

  22. Scenarios – University 2 • Situation • Two libraries: health sciences and everything else (main library plus numerous branches) • One virtual library, with many of the sources linked by both libraries, but the survey still needed to distinguish health sciences from main. • Off-site authentication uses a proxy server (mechanical, not rewriting), VPN (Cisco), and a Health Sciences VPN. • Services have been focused on the online catalog environment (Sirsi) • Public access to the online catalog has a dummy logon. • Lists of databases and e-journals require authentication using a Library ID. • One-third of e-journals are cataloged • Implementing an OpenURL server, which can link to journal articles. • Health Sciences uses ColdFusion to generate an A-Z database list • There are large digital library collections. www.minesforlibraries.org

  23. Scenarios – University 2 • Survey solution • One survey for both health sciences and main, with a forking rewrite. • Main • Globally changed 856 links • OpenURL server • Sirsi environment • Subject guides – top level • DD/ILL • Ask a librarian • All digital collections • Health Sciences • DD/ILL • Databases A-Z • Databases by topic – top level www.minesforlibraries.org

  24. Exploratory analysis • Mandatory v. optional • Usage v. user • Non-respondents • By resource – researchers and cost • Information seeking • Number of uses (surveys) by user (IP), after eliminating public and shared (lab) workstations • Paper v. web survey • In the library v. out of the library – run concurrently • Location – different populations, different purposes for use. www.minesforlibraries.org

  25. Mandatory v. optional • Study of a mandatory survey v. an optional survey • Same survey • Randomly chosen two-hour time periods each month • Only three months into the study (six hours of surveying) www.minesforlibraries.org

  26. Mandatory – University X (3 months) www.minesforlibraries.org

  27. Optional – University X (3 months) www.minesforlibraries.org

  28. What is MINES? • Action research • Set of recommendations for research design • Set of recommendations for web survey presentation • Set of recommendations for information architecture in libraries • Plan for continual assessment of networked electronic resources www.minesforlibraries.org

  29. What is the future of assessment of networked electronic services? • This seems like a lot of work. Can't we just use LibQUAL+™? • MINES – assessment of networked electronic resources at the point of use. • There are other assessments (vendor data, transaction logs, etc.) • E-metric Instruction System (EMIS) • http://www.ii.fsu.edu/EMIS/ • Subscription services • Can relate cost back to usage through subscription cost • Access services • DD/ILL • Online catalog • Ask Reference • Importance of gateways in library web architecture • But what about open access? • The next challenge of networked electronic services assessment. www.minesforlibraries.org
