
Cabrillo’s SLO Journey


Presentation Transcript


  1. Cabrillo’s SLO Journey Georg Romero, Library Director, Cabrillo College Library, December 3, 2010 (libwww.cabrillo.edu/staff/slo/carldig2010.ppt)

  2. A trip through Cabrillo’s SLO journey Or, How we eagerly embraced service assessments from the start…

  3. Denial • They’re not serious about assessing services! • This will only apply to the classroom, right? • Another fad – it’ll pass…

  4. Anger • How can they possibly expect us to assess our services? • It’s impossible – we can’t do it! • Those meddling dunderheads at WASC…

  5. Bargaining • What if we just say we’re going to do it? • Couldn’t we just describe how much we benefit students? • Could I at least use all these wonderful library statistics, somehow?

  6. Depression • We’re never going to figure this out. • We’re going to lose our accreditation, and I’m going to lose my job…

  7. Acceptance • OK. Fine. • So, how could we assess our services? This is when it got interesting…

  8. Two key early decision points • What kinds of student learning could we definitely claim a causative role in? • How much time do we want to spend on this?

  9. What kinds of student learning could we definitely claim? • How to attribute specific learning outcomes to transaction services? • Can we separate what we teach outside the classroom from what classroom faculty teach? • Do library service users succeed because of the library, or do successful students simply know the benefits of the library?

  10. How much time do we want to spend on this? • Is it worth the time and effort to produce potentially very tenuous findings? • Should we focus on simple, practical approaches, but risk not meeting the requirements?

  11. What we decided • Leave the detailed studies for another day • Streamline, and focus on the practical and immediately relevant • We will make this useful for us. Community college librarians are a pragmatic bunch!

  12. Assessment options considered • Narrative descriptions • Statistical measures • Student self-assessment • Focus groups • Post-transaction sampling surveys/interviews • Surveys

  13. Narrative descriptions • Easy to write – we know this stuff • Widely used among academic institutions (example) • Descriptive, not usually very measurable • Tend to be global, and not as relevant to individual transactions • Useful for internal communication and mindset

  14. Statistical measures • We have lots of these… • Reflect quantity and usage, not quality or effectiveness • Most likely useful statistics would need to be created and cross-correlated: • Track reference service users, compare GPA or semester success to non-users • Compare users vs non-users on a required bibliography for a specific class research project
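  The cross-correlation idea in slide 14 is, at heart, a two-group comparison. Below is a minimal sketch of what that analysis could look like, assuming a hypothetical students.csv with a boolean used_reference flag (e.g., built from reference-desk sign-in logs) and a semester gpa column; the file, column names, and tooling (Python with pandas and SciPy) are illustrative assumptions, not anything described in the presentation.

    # Hypothetical illustration only: compare semester GPA of
    # reference-service users vs. non-users (Welch's t-test).
    import pandas as pd
    from scipy import stats

    # Assumed input: one row per student, with a boolean
    # used_reference flag and the student's semester GPA.
    students = pd.read_csv("students.csv")

    users = students.loc[students["used_reference"], "gpa"].dropna()
    non_users = students.loc[~students["used_reference"], "gpa"].dropna()

    # Welch's t-test: does mean GPA differ between the two groups?
    t_stat, p_value = stats.ttest_ind(users, non_users, equal_var=False)
    print(f"users mean GPA:     {users.mean():.2f} (n={len(users)})")
    print(f"non-users mean GPA: {non_users.mean():.2f} (n={len(non_users)})")
    print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
    # As slide 9 warns, any difference found this way is a
    # correlation; it does not prove the library caused it.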

  15. Student self-assessment • Easy to fold into a survey, interview, or focus group • But – do students really know what they know? • Perceived value is informative, especially in an information void

  16. Focus groups • Potentially rich source of detailed information • Examples: Austin Peay S.U., Univ. of Pittsburgh • Small sample size • Most often used for specific goals: assess effectiveness of a catalog redesign, etc. • Heavily dependent upon personalities, both interviewers and interviewees • Possible focus group: How does the library assist your learning processes?

  17. Post-transaction sampling surveys/interviews • Very “fresh” assessment • Somewhat intrusive • Heavily dependent upon student perceptions • Potentially small sample size • Home-grown, e.g. Cuyamaca, Linscheid • Or, professionally available, e.g. WOREP • Influenced by student’s mood and the “feel-good” aspects

  18. Surveys • Familiar • Many models out there, can fold almost anything into a survey • Home-grown, e.g. Cabrillo, Southern Illinois (survey of IM service) • Professionally available, e.g. LibQUAL • Multi-purpose

  19. Surveys (cont’d) • Paper or online, each with merits & drawbacks • Dependent upon student self-assessment • Typically very actionable results • Can have multiple surveys for different population groups

  20. What we decided • Multiple approaches: • Some narrative descriptions, used in our accreditation self-study and program plan • Annual survey, incorporating student self-assessment on campus “core competencies” • No specific assessment for any specific service • Leave the door open for different future approaches

  21. Our outcomes • Students self-assessed positively on all four campus core competencies • Established a process of collecting survey data and discussing it annually, then acting upon any key findings • Passed accreditation in 2007, with a commendation for the library

  22. Some unanticipated benefits • Focused Circulation staff more on teaching and learning, less on punishing • Increased team mindset across the board • Increased attention to action and experimentation, not just measurement

  23. Looking back • Don’t be afraid to try – if it doesn’t work out, try something else • Most important: do something with your findings • Use the new requirements to help meet old goals: • Service improvements • Staff training and evaluations • Awareness building across all campus groups • Mentoring for a ubiquitous service-mindset

  24. Links • “Assessment of student learning from Reference Service,” G. Gremmels & K. Lehmann, Wartburg College (crl.acrl.org/content/68/6/488.full.pdf) • CSU Northridge, Oviatt Library, Objectives for Library Services (library.csun.edu/kdabbour/assessment.html#services) • Community college survey on library SLOs, J. Turner, Palo Verde College (pages.paloverde.edu/staff/library/slosurvey.doc) • Conducting Focus Groups in Libraries, Sara Aerni, Special Projects Librarian, Univ. of Pittsburgh, 8 April 2005 (www.lib.whu.edu.cn/dzpx/files/5Focus_Groups.ppt) • Cuyamaca College Library Questionnaire (www.cuyamaca.edu/slo/PDF/Ref%20Card/RefDeskCard_Fall2010.pdf) • “Instruction via Instant Messaging Reference: What’s Happening?” C. Desai & S. Graves, Southern Illinois University (opensiuc.lib.siu.edu/cgi/viewcontent.cgi?article=1025&context=morris_articles)

  25. Links (cont’d) • Linscheid Library, East Central University, Reference Assessment Plan (www.ecok.edu/siteContent/1/documents/library/reference/reference_assessment_plan.pdf) • “Use of focus groups in a library’s strategic planning process,” M. L. Higa-Moore et al., J Med Libr Assoc 90(1) 2002 (www.ncbi.nlm.nih.gov/pmc/articles/PMC64762/pdf/i0025-7338-090-01-0086.pdf) • “What do students want? A focus group study of students at a mid-sized public university,” M. A. Weber & R. K. Flatley, Library Philosophy & Practice, 2008 (www.webpages.uidaho.edu/~mbolin/weber-flatley2.pdf) • “What do they know? Assessing the Library’s contribution to student learning,” B. Fister, Library Issues 19.1 (Sept. 1998) (homepages.gac.edu/~fister/LIassessment.html) • “What WOREP results say about reference service, patron success and satisfaction,” J. A. Gedeon et al., RUSA New Reference Research Forum, ALA Annual Conference, 2009 (worep.library.kent.edu/Summary_of_the_Study.pdf)

  26. Thank You! (libwww.cabrillo.edu/staff/slo/carldig2010.ppt)
