
LIB115 - WEEK 13 – PPT


Presentation Transcript


  1. LIB115 - WEEK 13 – PPT ASSESSMENT (EVANS CHAPTER 16)

  2. Guide to User-centered Outcomes (Durrance and Fisher, 2005) Librarians have failed to explain to those outside the field what contributions they and their institutions actually make to society at large. Libraries need to publicize their worth to the greater public in supporting learning and developing an informed citizenry. SOURCE: http://www.flickr.com/ (uploaded 2/23/2010 by Jess Gambacurta)

  3. Performance Measurement (Brophy, 2006) Performance measurement is central to library management: without a firm grasp on what is actually being achieved, it is impossible to move forward to improved service, or even to maintain the status quo.

  4. Statistics are collected on inputs and outputs relating to services delivered: • Number of reference questions answered • Number of items checked out • Number of information literacy lectures delivered • Number of ILLs obtained for library patrons and loaned to other libraries

  5. Libraries are accountable for showing that spending on educational activities is well used, to: • School and college administrators • Federal, state, and local government authorities • Parents • General public SOURCE: http://www.flickr.com/ (reserves held near customer service point, by annethelibrarian, 10/23/08)

  6. Accountability and Assessment • Libraries are attempting to address accountability with assessments that go beyond quantifying inputs and outputs to demonstrate the library's support of the parent institution's goals and objectives. • Assessment is defined as documenting "observed, reported, or otherwise quantified changes in attitudes and skills of students (or patrons) on an individual basis because of contact with library services, programs, or instructions." (Dugan and Hernon, 2002)

  7. Internal Focus: to improve library efficiency and effectiveness: • Describe current performance and allow staff to see where improvements to services are needed. • Allocate resources and plan operations and services. • Assess the success of new programs and services. SOURCE: http://www.flickr.com/ (Omaha Library Reference Desk, by herzogbr, 10/17/2007)

  8. Weiss (1982) described internal decisions informed by evaluation data: • To continue a program. Should purchasing in a middle school library curriculum subject area continue if the material is rarely used? • To institute similar programs elsewhere. Should a public library filter all public terminals at all branch libraries, or just those with no separate children's room? • To improve practices and procedures. Can corporate or special library staff improve recall rates for online searching without overloading researchers with irrelevant citations?

  9. Weiss (1982) described internal decisions informed by evaluation data: • To add, drop, or change specific program strategies and techniques. Should an academic library expand evening hours when requested to do so by student government? • To allocate resources among competing programs. Should the budget allocation for books be reduced in order to increase the amount spent on online databases? • To accept or reject a program approach or theory. Would students be more likely to read and view reserve materials available through a digital reserve system than through a traditional reserve service?

  10. There was no analysis of statistical information in terms of what it meant for the quality of library programs. SOURCE: http://www.flickr.com/ (Reference Desk, by annethelibrarian, 4/7/2008)

  11. User Focus involves users in the assessment process. Ranganathan (1931): Books are for use; every reader his book; every book its reader; save the time of the reader; a library is a growing organism. • Surveys distributed to students, faculty, and other library users are used to find out about programs and services. • Benchmarking data is a means to judge quality based on comparison of output data with that of carefully selected institutions. SOURCE: http://www.flickr.com/ (Terry, the Creative Force, by Lester Public Library, 6/9/2008)

  12. Inputs Raw materials of a library's services: • budget • amount of available space • size of collections • amounts and kinds of equipment • number of staff

  13. Outputs Quantifiable products of inputs: • number of books circulated • reference questions answered • number of children attending story hours • ILLs borrowed for library patrons Outputs indicate library and resource usage and are used to decide budgets, staffing levels, and hours of service.

  14. Benchmarking • Involves a library assessing its operations by detailed comparisons with the same operations in other libraries or against professional standards. • "Aspirant" libraries perceived as following "best practices" are selected for comparison when evaluating a given library's operations. • "Professional standards" developed by the library profession suggest comparison points for inputs and outputs.

  15. ACRL academic library professional standards using inputs for benchmark comparisons: • Ratio of volumes to combined total student/faculty FTE • Ratio of volumes added per year to combined total student/faculty FTE • Ratio of material/information resource expenditures to combined total student and faculty FTE • Percentage of total library budget expended in the following categories: materials/information resources (subdivided by print, microform, and electronic); staff resources (subdivided by librarians, full/part-time staff, student assistants, federal contributions, outsourcing costs)

  16. ACRL academic library professional standards using inputs for benchmark comparisons: • Operating expenses, network infrastructure, equipment • Ratio of FTE library staff to combined student/faculty FTE • Ratio of usable library space (square feet) to combined student/faculty FTE • Ratio of number of students attending library instructional sessions to total number of students in specified target groups • Ratio of library seating to combined student/faculty FTE • Ratio of computer workstations to combined student/faculty FTE
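The per-FTE input ratios in the two slides above are simple divisions by combined student/faculty FTE. A minimal sketch of how such ratios might be computed and set against an "aspirant" peer library follows; every figure is hypothetical and only illustrates the arithmetic, not any real benchmark.

# Minimal sketch: computing per-FTE input ratios for benchmark comparison.
# All figures below are hypothetical, not drawn from any real library.

def per_fte(value: float, student_fte: float, faculty_fte: float) -> float:
    """Return the ratio of `value` to combined student/faculty FTE."""
    return value / (student_fte + faculty_fte)

# Hypothetical local figures
student_fte, faculty_fte = 4800, 320
volumes = 410_000
volumes_added = 9_500
materials_budget = 1_250_000  # dollars

ratios = {
    "volumes_per_fte": per_fte(volumes, student_fte, faculty_fte),
    "volumes_added_per_fte": per_fte(volumes_added, student_fte, faculty_fte),
    "materials_dollars_per_fte": per_fte(materials_budget, student_fte, faculty_fte),
}

# Hypothetical figures for an "aspirant" peer library
peer = {
    "volumes_per_fte": 95.0,
    "volumes_added_per_fte": 2.4,
    "materials_dollars_per_fte": 310.0,
}

for name, value in ratios.items():
    gap = value - peer[name]
    print(f"{name}: local {value:.1f}, peer {peer[name]:.1f}, difference {gap:+.1f}")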

  17. ACRL academic library professional standards using outputs for benchmark comparisons: • Ratio of circulation (excluding reserve) to combined student/faculty FTE • Ratio of interlibrary loan requests (photocopy and book) to combined student/faculty FTE • Ratio of interlibrary loan lending to borrowing

  18. ACRL academic library professional standards using outputs for benchmark comparisons: • Interlibrary loan/document delivery borrowing turnaround time, fill rate, and unit cost • Interlibrary loan/document delivery lending turnaround time, fill rate, and unit cost • Ratio of reference questions (sample week) to combined student/faculty FTE
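The interlibrary loan measures named above (fill rate, turnaround time, unit cost) reduce to a few divisions. A minimal sketch using hypothetical counts, not data from any actual library:

# Minimal sketch of the interlibrary loan output measures named above.
# All counts and costs are hypothetical.

requests_received = 1_840        # borrowing requests submitted by patrons
requests_filled = 1_620          # requests actually supplied
total_ill_cost = 14_500.00       # staff time, fees, and postage (hypothetical)
turnaround_days = [3, 5, 2, 7, 4, 6, 3]   # sample of days per filled request

fill_rate = requests_filled / requests_received
unit_cost = total_ill_cost / requests_filled
mean_turnaround = sum(turnaround_days) / len(turnaround_days)

print(f"Fill rate:        {fill_rate:.1%}")
print(f"Unit cost:        ${unit_cost:.2f} per filled request")
print(f"Mean turnaround:  {mean_turnaround:.1f} days")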

  19. Web-based comparison tools Public Libraries Survey (IMLS, 2005) http://nces.ed.gov/surveys/libraries/ • The database allows library staff to choose comparison libraries based on city, state, collection size, and income per capita • Comparisons cover operating revenue, FTE staff, collection size, reference transactions per capita, and children's program attendance per 1,000 population Academic Library Survey: National Center for Education Statistics http://nces.ed.gov/surveys/libraries/Academic.asp • Benchmark comparisons against peer institutions (staffing, collection size, and budget per FTE student) are used for regional accrediting agency institutional review.
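The per-capita measures used in these survey comparisons are straightforward to compute. A minimal sketch with hypothetical service-area figures, only to show the arithmetic behind "per capita" and "per 1,000 population":

# Minimal sketch of per-capita comparison measures; all figures are hypothetical.

service_area_population = 62_000
reference_transactions = 48_500
childrens_program_attendance = 9_300

ref_per_capita = reference_transactions / service_area_population
attendance_per_1000 = childrens_program_attendance / service_area_population * 1000

print(f"Reference transactions per capita: {ref_per_capita:.2f}")
print(f"Children's program attendance per 1,000 population: {attendance_per_1000:.0f}")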

  20. Patron-centered Outcomes Assessment Measures the ways in which library users are changed as a result of their contact with the library's resources and programs. Evidence of contribution to the teaching and learning mission of the parent institution: • Satisfaction/dissatisfaction of users with library programs and services • Academic performance of students improved through contact with the library • Students' improved chances of successful careers • Students' improved success in graduate school • Bibliographic instruction resulting in a high level of "information literacy" among students • Collaboration with library staff making faculty members more likely to view the library as an integral part of their courses • Students using the library being more likely to lead fuller and more satisfying lives

  21. Outcomes assessment • Good outcomes measures are based on specific library goals and objectives • Assessment plan identifies what changes in patron skills, values, perceptions are outcomes to be achieved • Modifications based on assessments show progress toward goals (Hernon and Dugan, 2002)

  22. Academic Libraries • Regional accrediting agencies standards seek evidence to assess effectiveness of student learning • Funding authorities seek evidence that higher education is preparing students for careers and to participate in society • IMLS defines outcomes as benefits or changes for individuals or populations during or after participating in program activities: new knowledge, increased skills, changed attitudes or values, modified behavior, improved condition.

  23. Academic Libraries Libraries continue to measure inputs/outputs: • to assess programs and services • to benchmark outputs against peer institutions as a measure of quality • to relate these measures to changes in users

  24. Public Libraries Public libraries define "service responses" as what a library does for, or offers to, the public in an effort to address a well-defined set of community needs: • Basic Literacy: the need to read and to perform essential daily tasks • Business and Career Information: the need for information related to business, careers, work, entrepreneurship, personal finances, and employment opportunities • Commons: the need of people to meet and interact with others in their community and participate in public discourse about community issues • Community Referral: the need for information related to services provided by community agencies and organizations • Consumer Information: the need for information to make informed consumer decisions and to help residents become more self-sufficient • Cultural Awareness: the need for community residents to gain an understanding of their own cultural heritage and the cultural heritage of others

  25. Public Libraries • Current Topics and Titles helps fulfill community residents' interest in information about popular culture and social trends, and in recreational experiences • Formal Learning Support helps students enrolled in schooling programs attain educational goals • General Information addresses the need for information and answers to questions on a broad array of topics related to work, school, and personal life • Government Information addresses the need for information about elected officials and government agencies that enables people to participate in the democratic process • Information Literacy addresses the need for skills related to finding, evaluating, and using information effectively • Lifelong Learning addresses the desire for self-directed personal growth and development opportunities • Local History and Genealogy addresses the desire of community residents to know and better understand personal or community heritage.

  26. Browse the Counting on Results PDF, Outcomes-Based Evaluation of Public Libraries (Lance, 2001): http://www.lrs.org/documents/cor/CoR_FullFinalReport.pdf

  27. School Libraries and media centers: current national guidelines • Information Power: Building Partnerships for Learning (1998; AASL, American Association of School Librarians, and AECT, Association for Educational Communications and Technology) • The focus shifts from providing resources to creating a community of lifelong learners • The curriculum extends beyond traditional location and retrieval skills to skills in evaluating, synthesizing, and interpreting information

  28. School Libraries and media centers: current national guidelines • Requires library media specialists to expand their data collection beyond quantitative statistics on collection size to measures of actual student learning. • Positive correlations are found between student learning measures and stronger library programs, particularly the effectiveness of the school's information literacy instruction. • See the Library Research Impact Studies website: http://www.lrs.org/impact.php

  29. http://www.islma.org/resources.htm • American Association of School Librarians (AASL) Standards for the 21st Century Learner • ACRL Institute for Information Literacy • Big6 Skills to Information Problem Solving • INFOLIT: The Building Blocks of Research: An Overview of Design, Process and Outcomes • 21st Century Information Fluency Project • S.O.S. for Information Literacy • I-SAIL Wiki

  30. Special libraries • Outcomes assessment demonstrates a library's value to its parent organization • There is increasing demand for specialized services • ROI (return on investment) provides a cost-benefit analysis: benefits are identified by clients, while the costs of providing library services are known from the library's budget.
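A minimal sketch of that ROI-style cost-benefit calculation follows. The benefit and cost figures are hypothetical placeholders; in practice the benefit figure would come from what clients themselves report (time saved, purchases avoided) and the cost figure from the library's budget.

# Minimal sketch of a special-library ROI (cost-benefit) calculation.
# Benefit and cost figures are hypothetical, not real data.

benefits_reported_by_clients = 420_000.00  # e.g., estimated staff time saved, avoided purchases
library_service_costs = 150_000.00         # from the library's budget

net_benefit = benefits_reported_by_clients - library_service_costs
roi = net_benefit / library_service_costs
benefit_cost_ratio = benefits_reported_by_clients / library_service_costs

print(f"Net benefit:         ${net_benefit:,.2f}")
print(f"ROI:                 {roi:.0%}")
print(f"Benefit-cost ratio:  {benefit_cost_ratio:.1f} : 1")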

  31. Assessment Evidence needs: • To be multi-faceted • To cover a range of indicators • To utilize multiple methods • To assess the effectiveness of library programs

  32. Assessment Evidence needs: Quantitative methods are designed to find out the extent of particular indicators and involve collecting numerical data: • number of hits on the library web site • number of patrons entering the library in a given time period • number of classes using the media center or library

  33. Assessment Evidence needs: Qualitative methods are designed to find out why something happens, e.g., the reasons patrons come to the library or what students use the media center or library for. They involve surveying or interviewing patrons about satisfaction with the service of the library or media center to determine how well information needs are met.

  34. FYI Here is the assessment laundry list… (no need to memorize) • Citation analysis of student papers assesses the use of higher-level references as an outcome of information literacy presentations: more scholarly resources, less reliance on the web, fewer incomplete citations, and higher grades from instructors. • Transaction log analysis of OPAC records shows patron mistakes, which are used to improve help prompts, and suggests frequently searched authors, titles, and subjects for purchase. • Logs, Diaries, Journals: Patrons are sometimes asked to keep a record of their information-searching practices in order to review the search process and improve the patron's searching experience. Patrons reflect on the search experience and comment on coping with the process. Also referred to as a personal search process. • Think-Aloud Protocol (continued on the next slide)
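As a rough illustration of the transaction log analysis bullet above, here is a minimal sketch that counts zero-hit OPAC searches. The log format (search type, query, hit count) is assumed for illustration; real OPAC logs differ from system to system.

# Minimal sketch of OPAC transaction log analysis, assuming a hypothetical
# log format of (search_type, query, hit_count) tuples.
from collections import Counter

log = [
    ("title", "guide to user-centered outcomes", 0),
    ("author", "durrance", 3),
    ("title", "guide to user-centered outcomes", 0),
    ("subject", "information literacy", 42),
    ("title", "guide to user-centered outcomes", 0),
]

# Count searches that returned no hits; frequent zero-hit queries may point
# to titles worth purchasing or to places where better help prompts are needed.
zero_hit = Counter(query for _, query, hits in log if hits == 0)

for query, count in zero_hit.most_common():
    print(f"{count} failed searches for: {query!r}")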

  35. FYI Here is the assessment laundry list… (once again, no need to memorize) • Think-Aloud Protocol (continued): Patrons articulate their experiences into a recorder while accomplishing a particular task; the recordings are later transcribed for analysis. • Standardized Tests: National Survey of Student Engagement (NSSE) or College Senior Survey (CSS); results are used to see whether student satisfaction improves over time or how it compares with peer institutions. • LibQUAL+ (ARL) measures user satisfaction with libraries, gathered via a Web-based survey; "gap analysis" identifies shortfalls between the level of service received and the level expected. • Counting data collection: number of reference questions, level of complexity, date and time of day, and whether asked in person, by phone, email, text, or IM; these indicate library use patterns.
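A minimal sketch of LibQUAL+-style gap analysis follows. LibQUAL+ asks respondents for minimum, desired, and perceived service levels; the gaps are computed roughly as below. The items and scores here are hypothetical survey means, not real results.

# Minimal sketch of gap analysis on hypothetical 1-9 survey ratings.

items = {
    # item: (minimum, desired, perceived) mean ratings
    "Employees who are consistently courteous": (6.2, 8.1, 7.4),
    "Print and electronic journal collections":  (6.8, 8.4, 6.5),
    "Library space that inspires study":         (5.9, 7.6, 7.0),
}

for item, (minimum, desired, perceived) in items.items():
    adequacy_gap = perceived - minimum      # negative => below minimum expectations
    superiority_gap = perceived - desired   # negative => short of the desired level
    print(f"{item}: adequacy {adequacy_gap:+.1f}, superiority {superiority_gap:+.1f}")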

  36. FYI Here is the assessment laundry list… (once again, no need to memorize) • Surveys or questionnaires require instrument design, sample selection, and statistical analysis; they may be web-based, mailed, emailed, or distributed in person to library patrons. Administered as a pretest/posttest to measure outcomes, e.g., around web-based tutorials as a measure of information literacy instruction. • Interviews are administered to participants individually; questions are predetermined to limit data to evaluative areas. • Focus groups are interviews involving open-ended, in-depth discussions with small groups to obtain data about a single topic or limited range of topics; used to determine patrons' perceptions of library programs and reasons for satisfaction. • Observation of patron or staff behavior involves a "secret shopper" posing as a patron, asking typical questions in a reference or other service setting, then judging the quality of staff members' responses against predetermined answers.

  37. Assessments based on goals and objectives of individual libraries: Example 1: Reference Services Measures Frequency of reference use, estimated by counting questions and keeping records of reference questions; number of reference questions asked per person in the reference community; or proportion of reference transactions successfully completed.

  38. Assessments based on goals and objectives of individual libraries: Example 1: Reference Services Measures Quality standards in assessing digital reference services (Lankes 2003) • Courtesy of library staff • Accuracy of answers • User satisfaction • Number of repeat users • Awareness among users that service exists • Cost per digital reference transaction • Information literacy instruction

  39. Example 2: Information Literacy Instruction Measure the effect of instruction on the quality of student work and core competencies: • Citation analysis • Pre/post testing and surveys • Portfolios collecting examples of students' work and self-reflections show a student's progress and achievements in selected areas (analog, digital, or both) • Online tutorials reach distance students and allow staff to accommodate more students, with pre/post tests
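The pre/post testing bullet above boils down to comparing scores before and after instruction. A minimal sketch with hypothetical scores; a real assessment would also check whether the gain is statistically significant.

# Minimal sketch of pre/post-test scoring for an information literacy session.
# Scores are hypothetical.

pre_scores  = [55, 62, 48, 70, 66, 59, 73, 61]
post_scores = [68, 71, 60, 78, 72, 70, 80, 69]

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)
improved = sum(1 for g in gains if g > 0)

print(f"Mean score gain: {mean_gain:.1f} points")
print(f"Students who improved: {improved} of {len(gains)}")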

  40. Example 2: Information Literacy Instruction Measure the effect of instruction on the quality of student work and core competencies with standardized tests: • iSkills: scenario-based tasks measure the ability to navigate, critically evaluate, and make sense of digital information; http://www.iskills.com/intro.html • iCritical Thinking (new; replaces iSkills) http://www.ets.org/portal/site/ets/menuitem.1488512ecfd5b8849a77b13bc3921509/?vgnextoid=159f0e3c27a85110VgnVCM10000022f95190RCRD&vgnextchannel=e5b2a79898a85110VgnVCM10000022f95190RCRD • SAILS: a pre-test with specific instruction based on weaknesses; a post-test measures progress and the effectiveness of instruction; I-SAIL is the Illinois Standards Aligned Instruction for Libraries • TRAILS (Tools for Real-time Assessment of Information Literacy Skills): tests information literacy skills and concepts based on Information Power
