
Applying Student Learning Data to Enhance Instruction: An Anteater Initiative

This presentation explores the application of student learning data to enhance instructional methods. It covers program overview, incentives/challenges, methodology, results, lessons learned, and future plans.


Presentation Transcript


  1. Applying Student Learning Data to Enhance Instruction: An Anteater Initiative • Cathy Palmer, Head, Education and Outreach, UC Irvine Libraries • LOEX 2007

  2. Outline • Background • Program Overview • Incentives/Challenges • Methodology • Results • Lessons Learned • What next?

  3. Writing Program/Library Collaboration • Lower-division writing requirement • Fulfilled in part by successful completion of Writing 39C: Argument and Research • Course has included an integrated library research assignment since 1986 • Library research assignment has evolved as the course curriculum evolved

  4. Library Incentives for Evaluation • Major investment of library resources • Personnel • Space • Availability of numeric data • Number of students reached • Number of sessions taught • Number of librarian hours spent • Were our resources being put to the best use?

  5. Yearly Statistics • 1996/1997: 1727 students, 76 sessions, 82 librarian hours • 2005/2006: 2504 students, 126 sessions, 150 librarian hours

  6. Session Outline • Libraries Homepage (Lecture/Demo) • Ask a Librarian • Connect from off-campus • ANTPAC catalog (Lecture/Demo) • Expanded Academic (Lecture/Demo) • Hands-on exploration of assigned databases • Sociological Abstracts • L/N Congressional Universe • United States Newspapers • Review of each database (Discussion) • Evaluations

  7. Evaluation 1.0: Overall Value of Session to Student

  8. Student Rating of Librarian Presentation

  9. Choose Databases (student self-rating)

  10. Select Search Terms (student self-rating)

  11. Evaluate Results of Search (student self-rating)

  12. Interpret Citations / Find Material (student self-rating)

  13. But... What were the students REALLY learning in the session?

  14. The "Eureka!" moment (Esther Grassian workshop): Ask the students questions about the content of the session as part of the evaluation.

  15. The BIG IDEA Explicit Objective: Students will understand that they have a variety of information resources available to them and that each resource has strengths and weaknesses. They will be challenged to answer the question "Why use this resource rather than that resource?"

  16. The Invisible BIG IDEAs Implicit Objectives: Students will leave knowing that the librarians are there to help them. Involve students in their own learning by making the session as hands-on as possible.

  17. Learning Outcomes At the end of the 50 minute session, students will: • Recognize the characteristics of generalized and specialized information resources and be able to differentiate between them. • Learn and apply basic search strategies that will help them locate information • Select, explore and use 3-5 library databases to find information on their topic.

  18. Session Outline • Libraries Homepage (Lecture/Demo) • Ask a Librarian • Connect from off-campus • ANTPAC catalog (Lecture/Demo) • Expanded Academic (Lecture/Demo) • Hands-on exploration of assigned databases • Sociological Abstracts • L/N Congressional Universe • United States Newspapers • Review of each database (Discussion) • Evaluations

  19. Early Results

  20. Read each statement and indicate the best resource to use by placing a check in the appropriate column. (Fall 2003, 383 responses) Per-statement results: 62%, 38%, 53%, 77%, 62%

  21. Another Eureka Moment Let's use Sociological Abstracts as the demonstration database instead of Expanded Academic ASAP.

  22. Read each statement and indicate the best resource to use by placing a check in the appropriate column. (Spring 2004, 809 evaluations) Per-statement results: 73.9%, 48.3%, 52.7%, 55.8%, 74.7%, 64.4%
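
  The per-statement figures on slides 20 and 22 are simple proportions: the number of evaluations that checked the best resource for a given statement, divided by the total evaluations returned that quarter. A minimal sketch of that arithmetic in Python, using an invented raw count (the actual counts behind the published percentages do not appear in this transcript):

      def percent_best(best_checks, total_evaluations):
          # Share of evaluations that checked the best resource for one statement.
          return round(100 * best_checks / total_evaluations, 1)

      # Invented count measured against the Spring 2004 total of 809 evaluations:
      print(percent_best(598, 809))  # 73.9, matching the first figure on slide 22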

  23. Student Performance Comparison

  24. Bonuses • Ability to look for themes in student comments. What parts of the presentation helped you the most? • Following along step by step (71) • Visual aid/demonstration (73) • Group work (83)

  25. Student comments cont. What helped you the least? • Group work (59) • Not enough time (38) • Nothing/everything helped (202) What could improve the session? • Longer session (184) • Slower and more detail about resources (58) • Speak louder/microphone

  26. Still more student comments I still need to learn more about: • What each database is used for and when to use it for certain searches (78) • The other databases we didn't get to explore (59) • How to find the actual copies of books and articles in the library (34)

  27. Problems with Paper

  28. Problems with Paper

  29. Problems with Paper

  30. Benefits of Automating Evaluation Form • Students could select only one response • More comments/longer comments • Results available immediately • Results could be sorted in a variety of ways

  31. Numeric Results • Individual Librarian/Individual Session • Individual Librarian/All Sessions • Overall evaluations by quarter • Overall evaluations by academic year
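
  Slides 30 and 31 together describe what automation made possible: every evaluation becomes a record that can be regrouped on demand, by librarian, session, quarter, or academic year. A minimal sketch of that grouping in Python; the field names, ratings, and export format are hypothetical, not taken from the UCI system:

      from collections import defaultdict
      from statistics import mean

      def average_rating(records, group_field):
          # Average the "rating" field, grouped by any other field in the record.
          groups = defaultdict(list)
          for record in records:
              groups[record[group_field]].append(record["rating"])
          return {key: round(mean(values), 2) for key, values in groups.items()}

      # Hypothetical records exported from the automated evaluation form.
      evaluations = [
          {"librarian": "A", "session": 1, "quarter": "Fall",   "year": "2006-2007", "rating": 4},
          {"librarian": "A", "session": 2, "quarter": "Winter", "year": "2006-2007", "rating": 5},
          {"librarian": "B", "session": 3, "quarter": "Fall",   "year": "2006-2007", "rating": 3},
      ]

      print(average_rating(evaluations, "librarian"))  # individual librarian, all sessions
      print(average_rating(evaluations, "quarter"))    # overall by quarter
      print(average_rating(evaluations, "year"))       # overall by academic year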

  32. Individual Performance Improvements: Assessment/Feedback methodology • Use of Observation Rubric • Presentation style (use of projected image, can be heard) • Session content (outlines session, pacing) • Classroom management (interaction with students, execution of learning techniques) • Comparison between Observation and Evaluations

  33. Results: Clear, concise presentation + hands-on work + review of experience = best student performance on selection of the best resource

  34. Example

  35. Another Example

  36. Librarians: All • Academic year: 2006-2007 • Total evaluations: 1733

  37. Librarians: All • Academic year: 2006-2007 • Total evaluations: 1733

  38. Future Wishlist • Pre- and post-testing • Ability to share results of evaluation with students • More nuanced testing of ability to apply concepts learned • Consistent evaluation of learning outcomes from other library research skills sessions

  39. Questions?

  40. Contact Me Cathy Palmer cpalmer@uci.edu 949-824-4972
