This presentation explores the Cabrillo College Library's journey in embracing assessment of services, highlighting the emotional stages of denial, anger, bargaining, depression, and acceptance. It illustrates key decision points regarding student learning and effective assessment approaches, such as narrative descriptions, statistical measures, student self-assessments, focus groups, and surveys. The outcomes showcase improved student self-assessments of competencies, successful accreditation, and enhanced teamwork among staff. This pragmatic approach to assessment emphasizes practical benefits over complex studies.
Cabrillo’s SLO Journey
Georg Romero, Library Director
Cabrillo College Library
December 3, 2010
(libwww.cabrillo.edu/staff/slo/carldig2010.ppt)
A trip through Cabrillo’s SLO journey
Or, How we eagerly embraced service assessments from the start…
Denial
• They’re not serious about assessing services!
• This will only apply to the classroom, right?
• Another fad – it’ll pass…
Anger
• How can they possibly expect us to assess our services?
• It’s impossible – we can’t do it!
• Those meddling dunderheads at WASC…
Bargaining
• What if we just say we’re going to do it?
• Couldn’t we just describe how much we benefit students?
• Could I at least use all these wonderful library statistics, somehow?
Depression
• We’re never going to figure this out.
• We’re going to lose our accreditation, and I’m going to lose my job…
Acceptance
• OK. Fine.
• So, how could we assess our services?
This is when it got interesting…
Two key early decision points
• What kinds of student learning could we definitely claim a causative role in?
• How much time do we want to spend on this?
What kinds of student learning could we definitely claim?
• How do we attribute specific learning outcomes to transaction services?
• Can we separate what we teach outside the classroom from what classroom faculty teach?
• Do library service users succeed because of the library, or do successful students simply know the benefits of the library?
How much time do we want to spend on this?
• Is it worth the time and effort to produce potentially very tenuous findings?
• Should we focus on simple, practical approaches, but risk not meeting the requirements?
What we decided
• Leave the detailed studies for another day
• Streamline, and focus on the practical and immediately relevant
• We will make this useful for us
Community college librarians are a pragmatic bunch!
Assessment options considered
• Narrative descriptions
• Statistical measures
• Student self-assessment
• Focus groups
• Post-transaction sampling surveys/interviews
• Surveys
Narrative descriptions
• Easy to write – we know this stuff
• Widely used among academic institutions (example)
• Descriptive, not usually very measurable
• Tend to be global, and not as relevant to individual transactions
• Useful for internal communication and mindset
Statistical measures
• We have lots of these…
• Reflect quantity and usage, not quality or effectiveness
• Most likely useful statistics would need to be created and cross-correlated (see the sketch below):
  • Track reference service users, compare GPA or semester success to non-users
  • Compare users vs non-users on a required bibliography for a specific class research project
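To make the cross-correlation idea concrete, below is a minimal sketch of the first comparison (reference-service users vs. non-users on semester GPA), written in Python with pandas and SciPy. It assumes a hypothetical anonymized export linking service use to GPA; the file name and column names are illustrative, not from the presentation. And as the decision-point slides caution, a GPA gap shows correlation, not a causative role.

```python
import pandas as pd
from scipy import stats

# Hypothetical anonymized export: one row per student, a 0/1 flag for
# reference-service use, and that student's semester GPA. The file and
# column names are illustrative only.
students = pd.read_csv("student_service_use.csv")

users = students.loc[students["used_reference"] == 1, "semester_gpa"].dropna()
non_users = students.loc[students["used_reference"] == 0, "semester_gpa"].dropna()

# Welch's t-test: does mean GPA differ between the two groups?
# A significant gap is still only a correlation -- successful students
# may simply be the ones who already use the library.
t_stat, p_value = stats.ttest_ind(users, non_users, equal_var=False)

print(f"Users:     n={len(users)}, mean GPA={users.mean():.2f}")
print(f"Non-users: n={len(non_users)}, mean GPA={non_users.mean():.2f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```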
Student self-assessment
• Easy to fold into a survey, interview, or focus group
• But – do students really know what they know?
• Perceived value is informative, especially in an information void
Focus groups
• Potentially rich source of detailed information
• Examples: Austin Peay S.U., Univ. of Pittsburgh
• Small sample size
• Most often used for specific goals: assess effectiveness of a catalog redesign, etc.
• Heavily dependent upon personalities, both interviewers and interviewees
• Possible focus group: How does the library assist your learning processes?
Post-transaction sampling surveys/interviews
• Very “fresh” assessment
• Somewhat intrusive
• Heavily dependent upon student perceptions
• Potentially small sample size
• Home-grown, e.g. Cuyamaca, Linscheid
• Or, professionally available, e.g. WOREP
• Influenced by student’s mood and the “feel-good” aspects
Surveys
• Familiar
• Many models out there; can fold almost anything into a survey
• Home-grown, e.g. Cabrillo, Southern Illinois (survey of IM service)
• Professionally available, e.g. LibQUAL
• Multi-purpose
Surveys (cont’d)
• Paper or online, each with merits & drawbacks
• Dependent upon student self-assessment
• Typically very actionable results (a tabulation sketch follows below)
• Can have multiple surveys for different population groups
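As an illustration of how self-assessment surveys yield actionable numbers, here is a minimal tabulation sketch in Python, assuming a hypothetical survey export with one 1–5 Likert column per campus core competency; the file and column names are placeholders, not Cabrillo’s actual instrument.

```python
import pandas as pd

# Hypothetical survey export: one row per respondent, one 1-5 Likert
# column per campus core competency. Column names are placeholders.
responses = pd.read_csv("library_survey.csv")

competencies = [
    "communication",
    "critical_thinking",
    "global_awareness",
    "personal_responsibility",
]

# Per-competency mean rating and share of positive (4-5) responses --
# the kind of per-item summary a library can act on year over year.
summary = pd.DataFrame({
    "mean_rating": responses[competencies].mean().round(2),
    "pct_positive": ((responses[competencies] >= 4).mean() * 100).round(1),
})
print(summary)
```

Comparing a table like this across annual survey rounds is one simple way to turn findings into action, as the outcomes slide describes.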
What we decided
• Multiple approaches:
  • Some narrative descriptions, used in our accreditation self-study and program plan
  • Annual survey, incorporating student self-assessment on campus “core competencies”
• No specific assessment for any specific service
• Leave the door open for different future approaches
Our outcomes
• Students self-assessed positively on all four campus core competencies
• Established a process of collecting survey data and discussing it annually, then acting upon any key findings
• Passed accreditation in 2007, with a commendation for the library
Some unanticipated benefits
• Focused Circulation staff more on teaching and learning, less on punishing
• Increased team mindset across the board
• Increased attention to action and experimentation, not just measurement
Looking back
• Don’t be afraid to try – if it doesn’t work out, try something else
• Most important: do something with your findings
• Use the new requirements to help meet old goals:
  • Service improvements
  • Staff training and evaluations
  • Awareness building across all campus groups
  • Mentoring for a ubiquitous service-mindset
Links
• “Assessment of student learning from Reference Service,” G. Gremmels & K. Lehmann, Wartburg College (crl.acrl.org/content/68/6/488.full.pdf)
• CSU Northridge, Oviatt Library, Objectives for Library Services (library.csun.edu/kdabbour/assessment.html#services)
• Community college survey on library SLOs, J. Turner, Palo Verde College (pages.paloverde.edu/staff/library/slosurvey.doc)
• Conducting Focus Groups in Libraries, Sara Aerni, Special Projects Librarian, Univ. of Pittsburgh, 8 April 2005 (www.lib.whu.edu.cn/dzpx/files/5Focus_Groups.ppt)
• Cuyamaca College Library Questionnaire (www.cuyamaca.edu/slo/PDF/Ref%20Card/RefDeskCard_Fall2010.pdf)
• “Instruction via Instant Messaging Reference: What’s Happening?” C. Desai & S. Graves, Southern Illinois University (opensiuc.lib.siu.edu/cgi/viewcontent.cgi?article=1025&context=morris_articles)
Links (cont’d)
• Linscheid Library, East Central University, Reference Assessment Plan (www.ecok.edu/siteContent/1/documents/library/reference/reference_assessment_plan.pdf)
• “Use of focus groups in a library’s strategic planning process,” M. L. Higa-Moore et al., J Med Libr Assoc 90(1), 2002 (www.ncbi.nlm.nih.gov/pmc/articles/PMC64762/pdf/i0025-7338-090-01-0086.pdf)
• “What do students want? A focus group study of students at a mid-sized public university,” M. A. Weber & R. K. Flatley, Library Philosophy & Practice, 2008 (www.webpages.uidaho.edu/~mbolin/weber-flatley2.pdf)
• “What do they know? Assessing the Library’s contribution to student learning,” B. Fister, Library Issues 19.1, Sept. 1998 (homepages.gac.edu/~fister/LIassessment.html)
• “What WOREP results say about reference service, patron success and satisfaction,” J. A. Gedeon et al., RUSA New Reference Research Forum, ALA Annual Conference, 2009 (worep.library.kent.edu/Summary_of_the_Study.pdf)
Thank You! (libwww.cabrillo.edu/staff/slo/carldig2010.ppt)