
Evaluation of Digital Libraries






Presentation Transcript


  1. Evaluation of Digital Libraries. Anna Maria Tammaro, University of Parma. University of Tbilisi, 5-15 July 2010

  2. Outline • What is the digital library? • Why evaluate? • Evaluation cycle • What? • How? • Good practices

  3. What is a digital library? • What is encompassed? • Visions of the library • Which elements to consider? • What is critical?

  4. Why evaluate? • Evaluation is fact finding, evidence-based value measuring, integrated in the management process of digital libraries • Accountability: evidence of resources spent • Effectiveness: understanding basic phenomena (such as information seeking) • Impact: increased learning, research, dissemination

  5. 4 major questions for evaluation • What actually occurred? • How can it be improved? • Did it accomplish the objectives? • What impact did it have?

  6. What actually occurred? Documentation evaluation, MIS. Description.

  7. How can it be improved? Formative evaluation. Improvement.

  8. Did it accomplish its objectives? Effectiveness evaluation. Fit for purpose?

  9. Impact of the digital library • What impact did it have? • The ultimate question for evaluation is: “How are digital libraries transforming research, education, learning and living?” (Saracevic 2002, p. 368)

  10. What to evaluate? • Content • Services/system • Users and uses

  11. Content evaluation • Content quality (subject coverage, relevance) • Content scope (what is included: online journals, e-books) • Content organisation (metadata, bibliographic organisation, indexing) • Effectiveness (management, user support) • Efficiency (cost)

  12. System interface • Interface (usability, design, accessibility) • System performance (interactivity, algorithms for searching, processing time) • System configuration (networks, security, authentication)
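
Of the performance criteria above, processing time is the most straightforward to measure directly. The sketch below is a minimal illustration in Python: it times repeated search requests and reports the median latency per query. The search URL and the `q` query parameter are hypothetical placeholders, not part of any particular digital library system.

```python
import statistics
import time

import requests

# Hypothetical search endpoint and query parameter; replace with the
# digital library's actual search URL before use.
SEARCH_URL = "https://example.org/digital-library/search"


def measure_response_times(queries, runs=3):
    """Time repeated search requests and return the median latency per query."""
    results = {}
    for query in queries:
        timings = []
        for _ in range(runs):
            start = time.perf_counter()
            requests.get(SEARCH_URL, params={"q": query}, timeout=30)
            timings.append(time.perf_counter() - start)
        results[query] = statistics.median(timings)
    return results


if __name__ == "__main__":
    for query, seconds in measure_response_times(["digital preservation", "metadata"]).items():
        print(f"{seconds:.2f}s  {query}")
```

Repeating each query and taking the median smooths out network jitter, so the figures reflect the system rather than a single slow request.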

  13. Outcomes • The ways in which library users are changed as a result of their contact with the library resources and programs (ARL 1998)

  14. Outcomes-based evaluation • Have audiences been sufficiently identified? • Are outcomes clearly written? • Are outcomes sufficient to describe what you hope will happen? • Are data collection methods cost-efficient? • Do they provide the data you want and need?

  15. Users • Who are they? (researchers, students, remote users, etc.; what is their context?) • How do they access the digital library? (information-seeking behaviour, usability) • Why do they need the digital library? (activities, expectations) • What type of resources do they need? (subject, etc.) • What is the value of the digital library? (impact, outcomes, potential for community building)

  16. European Minerva Project • Minerva • Handbook on cultural web user interaction • http://www.minervaeurope.org/publications/handbookwebusers.htm

  17. How to evaluate? • Surveys • Focus groups • Interviews • Transaction logs • Observation • Ethnographic evaluation • Usability testing • Combined methods • Longitudinal studies • Cross-cultural assessment • Benchmarking
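
Among these methods, transaction-log analysis lends itself to simple automation. Below is a minimal sketch, assuming a hypothetical tab-separated search log with a timestamp, a session identifier, and the query string on each line; the file name and column layout are illustrative, not a standard format.

```python
import csv
from collections import Counter


def summarize_search_log(path):
    """Summarize a hypothetical tab-separated search log:
    timestamp <TAB> session_id <TAB> query
    Returns total searches, distinct sessions, and the most frequent queries."""
    sessions = set()
    queries = Counter()
    with open(path, newline="", encoding="utf-8") as log_file:
        for row in csv.reader(log_file, delimiter="\t"):
            if len(row) < 3:
                continue  # skip malformed lines
            _timestamp, session_id, query = row[0], row[1], row[2].strip().lower()
            sessions.add(session_id)
            queries[query] += 1
    return {
        "searches": sum(queries.values()),
        "sessions": len(sessions),
        "top_queries": queries.most_common(10),
    }


if __name__ == "__main__":
    stats = summarize_search_log("search.log")  # hypothetical file name
    print(stats["searches"], "searches in", stats["sessions"], "sessions")
    for query, count in stats["top_queries"]:
        print(f"{count:5d}  {query}")
```

Even counts this simple (searches per session, most frequent queries) already feed the descriptive "what actually occurred?" question from slide 6.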

  18. Standards – COUNTER, SUSHI (NISO Standardized Usage Statistics Harvesting Initiative) • No benchmarking or longitudinal studies yet (needed to track the rate of change)
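
COUNTER defines the usage report formats and SUSHI defines how they are harvested automatically. As an illustration only, the sketch below requests a journal usage report from a SUSHI-style REST endpoint and sums the reported counts per title. The base URL, credential parameters, and the JSON layout (loosely modelled on the later COUNTER_SUSHI REST style) are assumptions; the original SUSHI standard (ANSI/NISO Z39.93) is SOAP-based, so consult the provider's documentation for the actual interface.

```python
import requests

# Hypothetical SUSHI-style REST endpoint and credentials; real providers
# publish their own base URLs and require registered requestor/customer IDs.
BASE_URL = "https://example.org/sushi/reports/tr_j1"
PARAMS = {
    "customer_id": "YOUR_CUSTOMER_ID",
    "requestor_id": "YOUR_REQUESTOR_ID",
    "begin_date": "2010-01-01",
    "end_date": "2010-06-30",
}


def harvest_journal_usage():
    """Fetch a COUNTER-style journal report and sum usage per title.
    The Report_Items / Performance / Instance structure is an assumed
    JSON layout, not guaranteed by any specific provider."""
    response = requests.get(BASE_URL, params=PARAMS, timeout=30)
    response.raise_for_status()
    report = response.json()
    usage = {}
    for item in report.get("Report_Items", []):
        title = item.get("Title", "unknown")
        for period in item.get("Performance", []):
            for instance in period.get("Instance", []):
                usage[title] = usage.get(title, 0) + instance.get("Count", 0)
    return usage


if __name__ == "__main__":
    for title, count in sorted(harvest_journal_usage().items(), key=lambda t: -t[1]):
        print(f"{count:6d}  {title}")
```

Harvesting the same report at regular intervals is what makes the benchmarking and longitudinal comparisons mentioned above feasible.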

  19. Good practice • DigiQual – http://www.digiqual.org/ • PEAK – http://www.dlib.org/dlib/june99/06bonn.html • eVALUEd – http://www.evalued.uce.ac.uk

  20. Bad news • There is no single, easy to administer, inexpensive, reliable, and valid approach to evaluating interactive learning from digital libraries.

  21. Good news • There are practical strategies for documenting the development and use of interactive learning, improving it, and building a case for its effectiveness and impact.

  22. Questions? • Thanks for your attention! Annamaria.tammaro@unipr.it
