Demonstrating the Value of Academic Libraries: Challenges and Opportunities

Presentation Transcript


  1. Demonstrating the Value of Academic Libraries: Challenges and Opportunities Lessons Learned from the Assessment Practices of a Sample of Libraries Presented by: Laura Gil-Trejo, MPH & MA Social Science Research Center Presented to: The Statewide California Electronic Library Consortium

  2. Culture of Assessment Growing recognition of its importance in higher education • What is the value of education? • What is the library’s place in contributing to that value? WASC accreditation has created an external push for assessment observed across all campuses • Variation exists in how strongly libraries are pushed compared with academic departments

  3. Culture of Assessment Within Libraries There are diverse attitudes toward assessment within and across campuses Among those surveyed, there is a belief that the culture of assessment has become more positive as administrators have become convinced of its importance.

  4. Culture of Assessment Some believe it is important because: • They have to do it: it has become critical for survival or someone is asking for the information directly • They genuinely see assessment as part of ensuring student success • “…we’re very student focused. We want to ensure that [they] are getting the educational resources they need, and that’s the only way you can do that is through some kind of assessment.”

  5. Culture of Assessment Some are less enthusiastic or hesitant because they: • fear change • fear what the evaluation will tell them • “…but also the people who you say the word and immediately you can see the strain on their face and the stress because they think it’s going to be some sort of negative reaction to them personally. They don’t want to be evaluated personally.” • think it is unnecessary • don’t have the resources, skills, and/or time to do it

  6. Culture of Assessment Factors that shape attitudes toward assessment • Perceived meaningfulness of assessment activities • Consequences for not doing assessment activities • Not knowing exactly what is meant by assessment • “I don’t think anyone in the library questions the importance of [assessment]. Resistance and challenge comes from ignorance, and we don’t have a magic bullet to close the loop. It is never an easy solution—it is always complex and iterative.”

  7. Approaches to Assessment There is also diversity as to how assessment is approached. Those who: • Get it done • Get it done, and continue to look for better ways to get it done • Want to get it done, but don’t know how • Are not sure it can be done rigorously, so avoid it altogether

  8. Definitions of Assessment Main Themes • Lack of a formal definition shared by libraries • Use outside entity standards to inform definitions • Difference between what should be and what is • “…we want to know that the services we offer in the form of collections and our reference services and our interlibrary loan…are helping students to learn and ultimately to graduate. But that’s always been really difficult to make those connections. We end up sticking with satisfaction…we really don’t have a good assessment program.” • Discrepancy/misunderstanding between and within libraries as to what is considered assessment • “Within the library there is not agreement as to what is assessment, with one librarian saying all data collection efforts are assessment and another indicating that assessment is more confined to ‘outcomes.’”

  9. Definition of Assessment • A tool to fit various needs • Encompassing a variety of activities • “I wouldn’t tie to any one form of assessment. Our definition is broad on purpose because we do all kinds of assessment in the building, so it’s any feedback we can get for a service we’ve provided…to both meet the students’ needs and do all the stuff…as well as assessing our services. But we see assessment fulfilling both of those needs.”

  10. Approaches to Assessment Assessment is happening at multiple layers • University • Library • Some as part of university-wide efforts • Some coordinated library-wide • Some individual • The majority of libraries do not have repositories or keep track of their assessment activities, while others are publishing the results of their efforts

  11. Potential List of Services Being Assessed The following services were identified:

  12. Assessment Activities

  13. Assessment as Output Most common • Usage statistics (quantitative) • Question: To what extent are our services being used? • Circulation, E-Resources/digital resources, ILL, Reference, Laptop loan program, Study room reservation, Information literacy instruction • Mandated for several service points • Critical to assessment • If done over time using the same definitions and data collection systems, it can be used to establish change over time or measure the impact of an outreach program* • Example: introduction of rapid ILL • Caveat: changes in the way data are collected or use is defined
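
Where definitions and collection systems stay consistent, building the usage time series described above is straightforward. A minimal sketch in Python/pandas, assuming a hypothetical circulation export; the file name, column name, and cutoff date are illustrative, not any particular ILS schema:

```python
# A sketch of usage-over-time analysis. "circulation_log.csv" and its
# "checkout_date" column are hypothetical stand-ins for an ILS export.
import pandas as pd

log = pd.read_csv("circulation_log.csv", parse_dates=["checkout_date"])

# Monthly transaction counts, computed the same way every period so a
# trend reflects real change in use, not a shifting definition.
monthly = log.set_index("checkout_date").resample("MS").size()

# Compare average monthly use before and after a service change,
# e.g., the introduction of rapid ILL (the cutoff date is invented).
cutoff = pd.Timestamp("2012-09-01")
print("Mean monthly, before:", monthly[monthly.index < cutoff].mean())
print("Mean monthly, after: ", monthly[monthly.index >= cutoff].mean())
```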

  14. Assessment as Process Very common • Question: Are our services being delivered/received as intended? • Services: IL instruction, study room, circulation, ILL, etc. • Question: Is the library as a whole, or its various components, giving students what they need to achieve SLOs? • Most common is the satisfaction survey (LibQual, NSSE, and other independent paper/pencil and web surveys) • Comment boxes where students can provide feedback • Focus group interviews • Other qualitative methods • Samples of varying degrees of representativeness, from convenience samples to attempts at random sampling • Both faculty and students included in samples • Very often non-users are not included in samples
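
Moving from convenience samples toward random sampling can start with drawing invitees from the full enrollment roster, which also brings non-users into the sampling frame. A minimal sketch, with a hypothetical roster file and column names:

```python
# A sketch of simple random sampling for a satisfaction survey.
# "roster.csv" and its "email" column are hypothetical.
import pandas as pd

roster = pd.read_csv("roster.csv")

# Sampling the whole roster, not just people who enter the building,
# keeps non-users in the sampling frame.
invitees = roster.sample(n=500, random_state=42)
invitees["email"].to_csv("survey_invitations.csv", index=False)
```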

  15. Assessment as Outcome/Impact Least common, but viewed as most critical • Some have made steps in that direction with varying degrees of rigor • Question: Does Student X demonstrate some skill at the end of an IL session/workshop/course? • Posttest-only design (without comparison group) • Rubrics applied to student samples (without control group) • Typically students sampled from X number of sessions out of Y total number of sessions • Or, pre- and posttest around online IL modules • Looking at percent correct on some measure • Some efforts to incorporate comparison groups have been made
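
For the pre- and posttest design around online IL modules, the usual first analysis is a paired comparison of percent-correct scores. A minimal sketch, assuming a hypothetical scores file; as the slide cautions, without a comparison group this still cannot attribute any gain to the module itself:

```python
# A sketch of a pre/post comparison. "il_module_scores.csv" with
# "pre_score" and "post_score" columns (percent correct) is hypothetical.
import pandas as pd
from scipy import stats

scores = pd.read_csv("il_module_scores.csv")

print("Mean percent correct, pre: ", scores["pre_score"].mean())
print("Mean percent correct, post:", scores["post_score"].mean())

# Paired t-test: did the same students score higher after the module?
t, p = stats.ttest_rel(scores["post_score"], scores["pre_score"])
print(f"t = {t:.2f}, p = {p:.4f}")
```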

  16. Assessment as Outcome/Impact • Question: What impact has this service had on students? • How did it impact you? • If you didn’t have the library, how would your academic experience have been altered? For the better? For the worse? • Survey • Anecdotally • Focus group interviews (student and faculty samples) • One-on-one interviews

  17. Assessment as Outcome/Impact Challenges experienced: • Difficulty obtaining control groups • Control and condition groups are typically not randomly assigned • Some tutorials require students to pass before producing a certificate of completion • Rubrics are difficult to develop and time-consuming to apply to student products • Challenges finding instructors to participate • For surveys, reliance on self-reports • Does not incorporate all library services

  18. Assessment as Impact/Output: A Simplified Model of Student Learning Where does the library fit in?

  19. Where Does the Library Fit In? [Diagram: Library Services and Resources → Mechanisms → Student Success]

  20. Assessment as Outcome/Impact Other potential methodology: Merging student outcome data with use data • Can control for confounding factors • Can examine moderating factors • Becomes stronger when you start with a cohort and add a longitudinal component to it (tracking students over time) • Allows you to estimate the value of the library on student outcomes • Permits large sample sizes • Avoids the costs of doing surveys • The downside? • Start-up is potentially TIME-CONSUMING and EXPENSIVE • Relies on statistical expertise that many librarians do not possess • Safeguards must be taken to protect student confidentiality
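
A minimal sketch of this merge-and-model approach, assuming two hypothetical extracts keyed on a student ID; the files, columns, and covariates are illustrative, not a real library-system or institutional-research schema:

```python
# A sketch of merging library usage with institutional outcome data and
# modeling retention while controlling for confounders. All names invented.
import pandas as pd
import statsmodels.formula.api as smf

usage = pd.read_csv("library_usage.csv")    # student_id, checkouts, logins
outcomes = pd.read_csv("ir_outcomes.csv")   # student_id, retained (0/1),
                                            # hs_gpa, pell (0/1)

# Left-join on the outcomes file so non-users stay in the analysis
# with zero recorded use instead of silently dropping out.
df = outcomes.merge(usage, on="student_id", how="left")
df[["checkouts", "logins"]] = df[["checkouts", "logins"]].fillna(0)

# Logistic regression: one-year retention as a function of library use,
# controlling for prior achievement and financial need.
model = smf.logit("retained ~ checkouts + logins + hs_gpa + pell",
                  data=df).fit()
print(model.summary())
```

With a defined cohort and repeated extracts, the same merge becomes longitudinal; the statistical expertise the slide mentions mostly enters at this modeling step.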

  21. Merging Student Outcome and Usage Statistics Other downsides: • Data collected may not get at actual usage • Sometimes usage gets missed • Group study rooms • Accessing databases inside the library • Checking out a book • Doesn’t tell us how use is linked to student success • Even if a correlation is present, it doesn’t tell us anything about the mechanisms

  22. Merging Student Outcome and Usage Statistics Being piloted at two universities within the CSU. Already done at the University of Minnesota Twin Cities • Potential problem lies with the accuracy of the data and assumptions that are made • Should validate student-use data as collected through library systems with self-reported data to make sure that it approximates reality; otherwise these data are useless • Relies on knowing which student demographic data to include • Relies on being able to bring non-users into the data file • Should conduct follow-up qualitative studies to focus on mechanisms • Also relies heavily on statistical expertise
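
A minimal sketch of the validation step the slide calls for, with student IDs hashed before the join as one confidentiality safeguard; the files and columns are hypothetical:

```python
# A sketch of checking system-logged use against self-reported use.
# "system_logins.csv" and "self_report.csv" are invented file names.
import hashlib
import pandas as pd

def hash_id(student_id: str) -> str:
    # One-way hash so the merged file never holds raw IDs
    # (in practice a salted/keyed hash is safer still).
    return hashlib.sha256(student_id.encode()).hexdigest()

logged = pd.read_csv("system_logins.csv")   # student_id, logins_logged
survey = pd.read_csv("self_report.csv")     # student_id, logins_reported

for df in (logged, survey):
    df["student_id"] = df["student_id"].astype(str).map(hash_id)

merged = logged.merge(survey, on="student_id")

# If logged and self-reported use barely correlate, the system data do
# not approximate reality and should not anchor the analysis.
print(merged["logins_logged"].corr(merged["logins_reported"]))
```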

  23. Good Assessment Plans: Leverage the standards we do possess as a starting point (WASC, ACRL) Leverage existing campus resources • Almost all campus respondents indicated having an office of institutional research • IT departments are always helpful • Each other • Publications • Best practices/good models (e.g., assessment team) Must be meaningful Must not add to existing workloads Methodological rigor must match the assessment question • Do not let a method be your master

  24. A Tool Box Approach? Starts with the assessment question driving the effort: • Who is using our services? • Who is not using our services and why? • How are our services being used? • Are users satisfied with the services they receive? • How do we best deliver services to users? Does this differ by user? • Do students have basic competency in some skill that is required of them at the end of IL instruction? • Does exposure to our services/resources have a measurable impact on users?

  25. A Tool Box Approach • Selected: Does exposure to our services/resources have a measurable impact on students? • Quantitative data • Qualitative data • Both

  26. A Tool Box Approach • Selected: Does exposure to our services/resources have a measurable impact on students? • Selected: Both quantitative and qualitative • What impact/outcome would you like to document? • GPA • Persistence • Time to graduation • Faculty scholarship • Instruction of faculty • Self-efficacy • Academic achievement (other) • Etc….

  27. A Tool Box Approach • Selected: Does exposure to our services/resources have a measurable impact on students? • Type of data selected: Both quantitative and qualitative • Impact/outcome selected: One-year retention • Potential methods • Quantitative: • Survey: Two time points vs. one time point • Correlating usage statistics with student outcome statistics • Samples: First-time freshmen • Qualitative: • Focus group interviews • Key informant/face-to-face interviews • Samples: First-time freshman users and non-users • Other samples: Student advisors or other educators who work directly with students
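
For the quantitative option of correlating usage statistics with retention, a first look can be as simple as crossing library use with one-year retention in the first-time-freshman cohort. A sketch, assuming a hypothetical cohort file:

```python
# A sketch of a use-by-retention crosstab. "ftf_cohort.csv" with
# "used_library" (0/1) and "retained" (0/1) columns is hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

cohort = pd.read_csv("ftf_cohort.csv")
table = pd.crosstab(cohort["used_library"], cohort["retained"])
print(table)

# A chi-square test flags an association; as slide 21 warns, it says
# nothing about the mechanisms linking use to success.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```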

  28. A Tool Box Approach • For every method… • Limitations of the method • How to set up your assessment • Sample survey/interview guides • Different sampling approaches • Examples of what data files should look like • Sample data analysis • How to report your data • Highlighting new methods of assessing similar questions at libraries both nationally and locally • Publications using similar methods • Places to submit newly discovered methods that address similar questions • Potentially: Online workshops or training for particular topics • Potentially: a moderated community chat room • Potentially: access to technical assistance

  29. Reactions, Questions, or Ideas? • Contact Information Laura Gil-Trejo Director, Social Science Research Center California State University, Fullerton lgil-trejo@exchange.fullerton.edu
