
Kristiina Hormia-Poutanen Head of National Electronic Library Services (FinELib)


Presentation Transcript


  1. Kristiina Hormia-Poutanen, Head of National Electronic Library Services (FinELib) Selection and evaluation – the Finnish model http://www.lib.helsinki.fi/finelib/

  2. Contents • Selection of resources • The role of evaluation in the development of FinELib services • The user and usage of digital resources • Conclusions

  3. Selection of resources • subject-specific groups – priority listing • by subject • by sector (universities, polytechnics, research institutes, public libraries) • FinELib – acquisition proposals (hankintaehdotukset) 2003 • FinELib – summary, analysis and proposal • consortium group – decision

  4. Why evaluate digital library services? • to meet stakeholders’ needs also in the future – insurance for the future • to raise the status of the programme • to show the impact of services • to help with fundraising • to support selection of resources / re-allocation of funding • to develop the services based on user needs

  5. How does FinELib evaluate? • evaluation of the national programme in 2002 • http://www.kka.fi/pdf/julkaisut/KKA_403.pdf • usage statistics collected since 1998 • user surveys run since 1998 • consortium surveys run since 2001 • user/customer feedback collected constantly in various formats

  6. Evaluation of the programme – why? • to safeguard the future • to raise status • to develop the information content and access to information on the net • to prioritise activities • to develop services based on user needs • to develop tools and to develop and evaluate working methods

  7. The evaluation process • run by the Finnish Higher Education Evaluation Council (FINHEEC) in 2002 • funded by the Ministry of Education and the National Library • self-evaluation • FinELib • 16 consortium members • international evaluation

  8. Results • FinELib has been very successful in its activities • good timing of the evaluation • high strategic level • recommendations to • the Ministry of Education • FinELib • consortium member organisations and their libraries • recommendations • related to research and library policy • practical

  9. Recommendations to FinELib • a medium term strategy needed • connection to national knowledge society strategies • e-learning/the role of the citizen • number and expertise of staff needed • roles / work division within the country • quality assurance mechanisms at FinELib • flexibility and packages • cooperation • surveys, statistics, feedback mechanisms • know-how help • improvement of communication process • increase transparency of budgeting

  10. What has happened due to the evaluation? • the preparation of the FinELib strategy has started • the plan to serve ”small disciplines” has been drafted • cooperation has been intensified • brainstorming seminars (communication, statistics 2003) • active provision of training (e-books, portal 2003) • communication plan and its practical implementation • consortium surveys in 2001 and 2004 • user surveys also for research institutes

  11. Usage data • collected centrally since 1998 • published on the FinELib web site • data collected: sessions, searches, downloaded articles by member organisation and by consortium • growth in use • problems: comparability, format, not always available, delivery • goal: COUNTER-compliant usage data
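The slide only states which measures are collected (sessions, searches, downloaded articles) and that they are reported per member organisation and for the consortium as a whole. The sketch below illustrates that kind of aggregation; it is not FinELib's actual tooling, and the CSV layout and field names (organisation, sessions, searches, downloaded_articles) are hypothetical. COUNTER-compliant reports would make such per-vendor inputs comparable, which is the stated goal.

```python
import csv
from collections import defaultdict

def aggregate_usage(path):
    """Sum usage measures per member organisation from a hypothetical vendor export."""
    totals = defaultdict(lambda: {"sessions": 0, "searches": 0, "articles": 0})
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            org = row["organisation"]
            totals[org]["sessions"] += int(row["sessions"])
            totals[org]["searches"] += int(row["searches"])
            totals[org]["articles"] += int(row["downloaded_articles"])
    return totals

def consortium_total(totals, measure):
    # Consortium-level figure = sum over all member organisations.
    return sum(t[measure] for t in totals.values())
```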

  12. Usage statistics – why? • indicator of the importance of a resource • shows the trend of usage • parameter in pricing models / cost division models • indicator of price level • price/article
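The price/article indicator mentioned above is simply the annual licence cost divided by the number of downloaded articles; the figures below are invented for illustration, not FinELib data.

```python
def price_per_article(annual_licence_cost_eur: float, downloaded_articles: int) -> float:
    # Price level indicator: cost of the resource divided by its use.
    return annual_licence_cost_eur / downloaded_articles

# Hypothetical resource: 50,000 EUR/year and 25,000 downloaded articles -> 2.0 EUR per article.
print(price_per_article(50_000, 25_000))
```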

  13. Downloaded articles, examples

  14. Cost division principles (general principles and the corresponding practical guidelines) • Fairness: cost division has to be as fair as possible for all consortium members; the costs are defined by criteria which have been clearly determined, are based on facts and are transparent • Based on facts: the model must be based on facts which are easy to verify; FinELib collects and maintains the facts on its web pages • Simplicity and transparency: the model has to be simple and easy to understand and explain • New cost division is applied when renewing contracts (2–4 cases annually): the new cost division is applied when renewing contracts, not during a license term; the consortium agrees on the resources where the new cost division is applied • New cost division is applied in new contracts: with new agreements the model can be applied from the very beginning • Minimum and maximum prices: a minimum price is defined for each resource using the parameters of the model and cannot be lower than the price given by the publisher; a maximum price may not exceed the price the organisation would pay alone, outside the consortium

  15. Cost division model within the consortium • includes universities, polytechnics and research institutes • parameters: usage (0.6), FTE researchers (0.3), FTE students (0.1) • minimum price & maximum price • price increase limiters (+90% / -25%)
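Slides 14–15 give enough detail to sketch the arithmetic: a member's share is a weighted combination of its usage share (0.6), FTE researcher share (0.3) and FTE student share (0.1) of the total consortium price, bounded by the minimum and maximum prices and by the +90% / -25% limiters on year-on-year change. The sketch below is an interpretation of those slides with made-up inputs, not FinELib's actual implementation; in particular, how the limiters interact with the minimum/maximum bounds is assumed.

```python
def member_share(total_price, usage_share, researcher_share, student_share,
                 min_price, max_price, previous_price=None):
    """Sketch of a FinELib-style cost division for one member organisation.

    usage_share / researcher_share / student_share are the member's fractions
    of the consortium totals (each between 0 and 1).
    """
    share = total_price * (0.6 * usage_share
                           + 0.3 * researcher_share
                           + 0.1 * student_share)

    # Price increase limiters (+90% / -25%): assumed to cap the change
    # relative to what the member paid under the previous cost division.
    if previous_price is not None:
        share = min(share, previous_price * 1.90)
        share = max(share, previous_price * 0.75)

    # Clamp to the minimum price per resource and the stand-alone maximum price.
    return min(max(share, min_price), max_price)

# Hypothetical member: 8% of usage, 5% of FTE researchers, 4% of FTE students.
print(member_share(total_price=200_000, usage_share=0.08,
                   researcher_share=0.05, student_share=0.04,
                   min_price=1_500, max_price=25_000, previous_price=10_000))
```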

  16. Case Emerald

  17. Resource price per downloaded article

  18. User surveys • run since 1998 • web-based surveys linked to SAS • 3 master’s theses in preparation

  19. Who is the user?

  20. Working environment?

  21. Print or electronic versions? Willingness to cancel print versions if the corresponding e-version is available

  22. Print or electronic versions?Sample – university professionals

  23. Preferences between print and electronic?Sample – university professionals

  24. How often are e-resources used?

  25. How does the content meet the user’s needs? (satisfaction, % of respondents)
              2000   2001   2002
  Very well     10     11     12
  Well          47     48     68
  Some          35     34     17
  Poorly         8      7      3
  None           0      0      0

  26. How does the content meet user’s needs?Sample – university professionals

  27. Factors affecting the use of e-resources • gender • men use more than women and are more willing to cancel print versions • age • people under 35 years of age use more than older ones and are more willing to cancel print versions

  28. Factors affecting the use of e-resources • position/status • post-graduate students and researchers use more than lecturers and professors • discipline

  29. Which disciplines use most e-resources? • use is high: • life sciences, economics • use varies: • technology, health sciences ?? • use is low: • humanities and social sciences

  30. Conclusions • different evaluation methods needed • to show impact • to collect feedback • to develop services • to improve status • to help justify fundraising
