

  1. Creating An Architecture of Assessment: using benchmarks to measure library instruction progress and success Candice Benjes-Small Eric Ackermann Radford University

  2. “So, Candice, how many library sessions have we taught this year?”

  3. Look at all these instruction librarians!

  4. But… • Curricular changes • Librarian burnout • Students reported bibliographic instruction (BI) overload

  5. On the other hand • University administration wants to see progress

  6. Looking for alternatives • Number of sessions had plateaued • Scoured the literature • Attended conferences • Networked with colleagues

  7. Our environment • Public university • 9000+ students • Courses not sequenced • Instruction built on one-shots

  8. Macro look at program • Focus on us, not students • Search for improvements over time • Student evaluations as basis

  9. A little bit about our evaluation form

  10. Goals • Provide data to satisfy three constituents • Instruction librarians: immediate feedback • Instruction team leader: annual evaluations • Library Admin: justify instruction program

  11. Background • Began in 2005 • Iterative process

  12. Development • 4-point Likert scale • Originally had a comment box at the end • Major concern: linking comments to scale responses

  13. Solution: linked score and comment responses • Q1. I learned something useful from this workshop. • Q2. I think this librarian was a good teacher.
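A minimal sketch of what "linked" responses might look like as data, assuming each Likert-scored question carries its own comment field so a comment can always be traced back to the score it explains. The field names and sample values are illustrative assumptions, not taken from the actual form.

```python
# Hypothetical record pairing a 4-point Likert score with its own comment,
# so comments stay linked to the question and score they refer to.
from dataclasses import dataclass

@dataclass
class LinkedResponse:
    question: str   # the evaluation item, e.g. Q2 on teaching
    score: int      # 1-4 Likert rating
    comment: str    # free-text comment tied to this specific score

resp = LinkedResponse(
    question="Q2. I think this librarian was a good teacher.",
    score=3,
    comment="Clear demo, but the pace was too fast.",  # invented example
)
```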

  14. Inspiration for benchmarks • University of Virginia library system's use of metrics to determine success • Targets outlined • We would do one department rather than the entire library To learn more about UVA’s efforts, visit http://www.lib.virginia.edu/bsc/

  15. Benchmark baby steps • Look at just one small part of the instruction program • Begin with a single benchmark • Identify one area to assess • Decided to focus on one particular class

  16. Introduction to Psychology • Taught fall and spring, beginning in 2006 • 14 sections of 60+ students • Shared script and PPT • Everyone teaches over 2 days To see our shared PPT, visit http://lib.radford.edu/instruction/intropsych.ppt

  17. Developing benchmarks • Selected a comment-based metric for the Instruction Team • Chose a class of comments: “What did you dislike about the teaching?” (Question #2)

  18. Current benchmarks • Partial success: 5%–10% of total comments for Question 2 are negative • Total success: < 5% of total comments for Question 2 are negative
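To make the thresholds concrete, here is a minimal sketch of the benchmark check. Only the 5% and 10% cutoffs come from the presentation; the function name, labels, and sample counts are assumptions for illustration.

```python
# Classify Question 2 results against the slide's benchmark thresholds:
# under 5% negative = total success, 5%-10% = partial success.
def classify_benchmark(negative_count: int, total_count: int) -> str:
    pct_negative = 100 * negative_count / total_count
    if pct_negative < 5:
        return "total success"
    elif pct_negative < 10:
        return "partial success"
    return "benchmark not met"

# Hypothetical semester: 6 negative comments out of 80 total.
print(classify_benchmark(6, 80))  # 7.5% negative -> "partial success"
```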

  19. How did we do?

  20. Results

  21. Success? • Reached our desired benchmark for partial success, but never quite went below 5% • Tweaking the script again • Continuous improvement

  22. Scaling for your program • Adjust the benchmark levels • Look only at score responses (quantitative) instead of comments (qualitative), as sketched below • Adjust the number of benchmarks used
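For programs that prefer the quantitative route, a hedged sketch of benchmarking the 4-point Likert scores directly rather than coding comments. The 3.5 target mean and the sample scores are assumptions, not values from the presentation.

```python
# Check whether the mean 4-point Likert score meets a locally chosen target.
def meets_score_benchmark(scores: list[int], target_mean: float = 3.5) -> bool:
    return sum(scores) / len(scores) >= target_mean

q1_scores = [4, 3, 4, 4, 2, 4, 3, 4]  # hypothetical Q1 responses
print(meets_score_benchmark(q1_scores))  # mean is 3.5 -> True
```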

  23. Sharing with administrators • Team annual reports • Stress evidence-based nature • Use percentages, not a 4-point scale

  24. Disadvantages • Time-intensive • Follow-through required • Evaluation forms not easy to change

  25. More disadvantages • Labor-intensive to analyze comments • Results may reveal your failures

  26. Advantages • Flexibility to measure what you want to know • Provides a structured goal • Evidence-based results are more convincing

  27. More advantages • Continuous evaluation results over time • Data-driven decisions about the instruction program • Doable

  28. Contact Candice Benjes-Small cbsmall@radford.edu Eric Ackermann egackerma@radford.edu
