Learn about the journey of developing and implementing the LORLS system for academic reading lists at Loughborough University, key features, ongoing promotion strategies, and measures of success.
Online Reading Lists at Loughborough University • Gary Brewerton, Library Systems Manager
Contents • The System • Background • Basic requirements • Development of LORLS • Current redevelopment • Implementation and promotion • Implementation • Issues raised • Ongoing promotion • Measure of success
Background • Project initiated by the University's Learning and Teaching Committee in February 1999 • The basic aim of the project was to provide an online version of the academic's own reading list for each module • The Library was initially approached to provide a "quality" (?) role in the process, but a year later took over development of the system
Basic requirements • Allow academics to create and maintain their own reading lists via the web • Including annotating the list with comments and indicating suggested holdings to the Library • Display the reading list in an order determined by the academic • Link to the Library catalogue • Alert Library staff to new or changed items • Supply data to the campus bookshop (a data-model sketch follows below)
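These requirements suggest a small data model: list entries with annotations, suggested holdings, a catalogue link, an academic-chosen order, and change alerts for Library staff. The Python sketch below is one hypothetical way to represent that; LORLS itself is not written in Python, and every name here (`ReadingListItem`, `suggested_holdings`, the module code `09MAB101`, and so on) is an invented illustration, not the system's actual schema.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ReadingListItem:
    """One entry on a module's reading list, as the requirements imply."""
    title: str
    author: str
    note: str = ""                             # academic's annotation for students
    suggested_holdings: Optional[int] = None   # copies suggested to the Library
    catalogue_url: Optional[str] = None        # link into the Library catalogue
    rank: int = 0                              # display order chosen by the academic

@dataclass
class ReadingList:
    module_code: str
    owner: str                                 # the academic who maintains the list
    items: list[ReadingListItem] = field(default_factory=list)

    def add(self, item: ReadingListItem) -> None:
        """Append an item and flag the change so Library staff can be alerted."""
        item.rank = len(self.items)
        self.items.append(item)
        print(f"ALERT Library staff: new item on {self.module_code}: {item.title}")

# Example: an academic builds and orders their own list.
rl = ReadingList(module_code="09MAB101", owner="Dr Example")
rl.add(ReadingListItem("Engineering Mathematics", "K. A. Stroud",
                       note="Core text", suggested_holdings=10))
```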
Development of LORLS • Web-based system developed in 2000, ready for the start of the new academic year • The University of Nottingham expressed interest in using the system in 2002, so it was made available under an open source licence • This finally gave the system a name – LORLS (Loughborough Online Reading List System) • Since then there have been three minor releases of the software
Current redevelopment • After nine years, the decision was made to redevelop LORLS based on past experience • Key features: • Separation of user interface and backend functionality (easier to embed in other systems) • Ability to mimic other institutional structures, not just Loughborough University's • Customised metadata for differing material such as articles, books, journals, websites, etc. • Intend to pilot at Loughborough in 2010 (see the sketch below)
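One way to picture the first two key features together: a backend that only serves data, so that any interface (a VLE plugin, a portal page, a mobile app) can render it, with metadata shaped per material type rather than one fixed schema. The minimal Python sketch below is purely illustrative; it is not the real LORLS backend, and the module code, field names, and port are invented for the example.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical store: each item carries metadata fields suited to its
# material type (book, article, website, ...), not one fixed schema.
LISTS = {
    "09MAB101": [
        {"type": "book", "title": "Engineering Mathematics",
         "author": "K. A. Stroud", "isbn": "978-1137031204"},
        {"type": "article", "title": "An example paper",
         "journal": "Some Journal", "volume": 12, "pages": "1-10"},
        {"type": "website", "title": "Course homepage",
         "url": "https://example.ac.uk/09MAB101"},
    ],
}

class Api(BaseHTTPRequestHandler):
    """Backend-only JSON endpoint: the user interface lives elsewhere
    and simply renders whatever this returns."""
    def do_GET(self):
        module = self.path.strip("/")
        body = json.dumps(LISTS.get(module, [])).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Api).serve_forever()
```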
Implementation (1/2) • Advertised the new system, including demonstrating it to interested parties • Blamed the University for everything bad! • Before the launch of the system we asked all academics for a copy of their reading lists • Hired temporary staff to assist with inputting • Once completed, responsibility for updating was passed (back) to the academics
Implementation (2/2) • Lots of training sessions provided • Some in the Library, some in departments • Mostly group sessions, but some one-on-one training • Workshops organised for the less enthusiastic, and issues raised were used to inform further development of the system • We still provide training for new staff
Issues raised • Differing types of reading lists • Engineering lists typically very short (3-4 items) • Science lists usually an A4 page (up to 25 items) • Humanities lists can be huge (up to 1,396 items) • Who owns a reading list? • Is a reading list a publication in its own right? • Is it owned by the academic or the institution? • Who should be able to access it?
Ongoing promotion • Use every opportunity to promote the system (e.g. any form of assessment) • If appropriate, play departments/faculties off against one another • Periodically "nag" (via email) academics to maintain their reading lists • Continue to provide training (maybe online) • Advocacy remains an ongoing process!
Measure of success • Prior to having an online system, the Library received around 400 reading lists a year • This represented 22% of the modules running each year (although not all modules have reading lists) • Some lists were photocopied from students when they came to the enquiry desk • In the first year of having an online system we received 857 lists, and we currently have 2,416 lists (some of which are potential or dummy lists)
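As a back-of-envelope check on these figures (treating the 22% as exact and the module count as roughly constant, both of which are assumptions, not data from the talk), the numbers imply around 1,800 modules:

```python
# Rough arithmetic from the figures above.
lists_before = 400
coverage_before = 0.22
modules = lists_before / coverage_before   # about 1,818 modules
print(round(modules))                      # 1818

# Coverage can exceed 100% because some modules have several lists
# and, as noted above, some entries are potential or dummy lists.
for stage, n in [("pre-launch", 400), ("year 1", 857), ("now", 2416)]:
    print(f"{stage}: {n} lists, ~{n / modules:.0%} of modules")
```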