
Presentation Transcript


  1. Evidence Based Library and Information Practice: The Impossible Will Take a Little While (*Lo imposible se llevará un poco de tiempo*) Andrew Booth Reader in Evidence Based Information Practice

  2. Puerto Rico & United Kingdom

  3. Outline • What is Evidence Based Library and Information Practice (EBLIP)? • Why is EBLIP so Important? • How is EBLIP carried out? • What are the Barriers? • How might these Barriers be Overcome? • EBLIP - a practical and feasible tool to improve all areas of library and information practice.

  4. What is Evidence Based Library and Information Practice (EBLIP)?

  5. Evidence based library and information practice is … • “Evidence Based Library and Information Practice (EBLIP) seeks to improve library and information services and practice by bringing together the best available evidence and insights derived from working experience, moderated by user needs and preferences”. • “EBLIP involves asking answerable questions, finding, critically appraising and then utilising research evidence from relevant disciplines in daily practice. It thus attempts to integrate user-reported, practitioner-observed and research-derived evidence as an explicit basis for decision-making”. (Booth, 2006)

  7. Three Cogs? User-Reported Practitioner-Observed Research-Derived

  8. Why librarians? • “As a profession which has the ability to manage the literature of research, librarianship is uniquely placed to model the principles of evidence-based practice, not only as they apply to other disciplines which we serve, but also as they apply to our own professional practice” (Ritchie, 1999)

  9. Aren’t we doing it already? • What are we doing well? • What might we do better?

  10. [Who Wants to Be a Millionaire? prize ladder, £100 up to £1 Million, with a 50:50 lifeline] How do librarians typically make their day-to-day decisions about their services? A: Go 50:50? B: Phone a Friend? C: Ask the Audience? D: Conduct an Evidence Based Search?

  11. Why is EBLIP so Important?

  12. Is it worth it? • “I blogged about the Norwegian tutorial Search and Write…At the Creating Knowledge conference, Therese Skagen (University of Bergen Library, Norway) talked about this project in the context of, and use of projects to develop, EBP. She identified some challenges to EBP including time allocation, dissemination (within and outside the library), competences (in research and planning), and resources. • Management needs to be supportive, and you need to believe that research can be of value to the organisation. She suggested trying out EBP in a small area of the library service to begin with. Therese saw gains from EBP in terms of, for example, your own learning, strategic understanding or (organisationally) increased quality of service and better morale”. Information Literacy Weblog

  13. How is EBLIP carried out?

  14. The 5As • Ask a focused question • Acquire the evidence • Appraise the studies • Apply the findings • Assess the impact… ...and your own development

  15. Ask a focused question

  16. SPICE (Setting – Perspective – Intervention – Comparison – Evaluation)

  17. SPICE: Tell Me What You Want, What You Really Really Want!

  18. Six domains of EBL (Crumley & Koufogiannakis) • Reference/Enquiries – providing service and access to information that meets the needs of library users. • Education – finding teaching methods and strategies to educate users about library resources and how to improve their research skills. • Collections – building a high-quality collection of print and electronic materials that is useful, cost-effective and meets the users' needs. • Management – managing people and resources within an organization. • Information Access & Retrieval – creating better systems and methods for information retrieval and access. • Marketing/Promotion – promoting the profession, the library and its services to both users and non-users.

  19. Acquire the evidence

  20. Searching the literature • LIS databases • Library and Information Science Abstracts (LISA) • Library, Information Science and Technology Abstracts (LISTA) – free via www.libraryresearch.com (EBSCO Publishing) • Together these cover librarianship, classification, cataloguing, bibliometrics, information retrieval and information management
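The slides name the databases but do not give a worked search. As an illustrative sketch only (not from the original presentation), a Boolean strategy for finding EBLIP-related studies in LISA or LISTA might combine phrase searching with truncation:

```
("evidence based" OR "evidence-based")
AND (librar* OR "information practice" OR "information service*")
AND ("decision making" OR "critical appraisal" OR evaluat*)
```

Here `*` is the truncation wildcard (so `librar*` matches library, libraries, librarianship) and quoted strings are searched as phrases; exact operator syntax varies by platform, so check the database's own help pages before running it.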

  21. Optimal Combinations of Library Databases Sampson et al (2008)

  22. Appraise the studies

  23. Appraise Evidence • Checklists for appropriate studies • Information needs analyses (CRiSTAL) • User studies (CRiSTAL) • Interventions Addressing the Need for education and Training (RELIANT) • EBL Critical Appraisal Tool (Glynn)

  24. Evidence Based Library and Information Practice ejournals.library.ualberta.ca/index.php/EBLip • 6-10 Evidence Summaries per issue: "provide a critical appraisal synthesis for a specific research article, so that practitioners may more readily determine if the evidence in that research study is valid and reliable, and whether they can apply it to their own practice."  Koufogiannakis, 2006

  25. Apply the findings

  26. How do we APPLY the evidence? • Ideally we want Evidence that is Directly Applicable. • More commonly we encounter Evidence that needs to be Locally Validated (e.g. through a survey or audit of local services). • In our general reading we encounter Evidence that Improves Understanding. (Koufogiannakis and Crumley, 2004) • The final category is Evidence that may inform our Choice of Methodologies, Tools or Instruments. (Booth, 2004)

  27. Four point plan • Demand evidence – best way to get organization to become evidence-based is for leaders to ask for evidence supporting decisions and recommendations. • Examine logic and critically evaluate any evidence presented. • Treat organization as unfinished prototype - Try something new in a limited way, gather evidence and then adapt, revise and retry as needed. • Cultivate attitude of wisdom throughout the organization - act on best available evidence at the time, keep questioning what we know, see if new evidence comes to light and be open to any new evidence. Fisher & Richardson EBLIP4 (2007) (after Pfeffer and Sutton)

  28. As Koufogiannakis and Crumley state: • "When using research to help with a question, look for high quality studies, but do not be too quick to dismiss everything as irrelevant. Try to take what does apply from the research and use it to resolve the problem at hand" (Koufogiannakis and Crumley, 2004)

  29. When considering Applicability think SCOPE • Severity – How urgent/important is the problem? • Clients – Does the planned intervention fit with the values, needs and preferences of my users? • Opportunity – Is now the time to apply this? Has the situation changed since the evidence was produced? • Politics – Is there local support for this intervention? • Economics – Can we afford this intervention? Will this be at the expense of something else?

  30. Assess the impact… ...and your own development

  31. Evaluate impact of intervention • Identify what you need to measure, what data you need to measure it, and appropriate methods for measuring • Do changes support organisational goals and objectives? • Gather the data you need, not the data that is easiest to gather • Compare changes to predicted deliverables/outcomes in the original project plan • Quantify the extent of changes • Have changes resulted in customer service improvement? • Have any further questions arisen?

  32. Evaluate your EBLIP performance • Have I followed the stages of the EBLIP process? • Have I improved my professional knowledge and skills? • Has the process revealed any personal or professional strengths or weaknesses?

  33. What are the Barriers?

  34. Possible barriers • time constraints • lack of knowledge about sources of research evidence • limited access to the literature • lack of training in critical appraisal skills • emphasis on practical rather than intellectual knowledge • work environment (structural barriers)

  35. Challenges to searching the LIS literature • problematic indexing • getting access can be difficult • not comprehensive • full text not always easy to obtain • multiple study designs • unhelpful abstracts • limited coverage of publication types

  36. How might these Barriers be Overcome?

  37. Ten Steps for Practical EBLIP 1. Integrate EBLIP into recruitment and development 2. Practice Evidence Based Project Management 3. Incorporate Evidence Review into Existing Meetings 4. Utilise Evidence Based Standards and Guidelines 5. Implement Evidence Based Webpages HILJ, Mar 2009

  38. Ten Steps for Practical EBLIP 6. Develop Evidence Based Questionnaires 7. Practise Evidence Based Collection Management 8. Evaluate Information Literacy Instruction 9. Manage Change using Evidence Based Methods 10. Evaluate Evidence Based Strategies HILJ, Mar 2009

  39. 1 - Integrate EBLIP into recruitment and development • Integrate into job descriptions • Integrate into Staff appraisal and performance review • Highlight in Promotions and Revalidation

  40. 2. Practice Evidence Based Project Management Brooks S et al (2007). What, So What, Now what. In Connor E (2007). Evidence-based librarianship: case studies and active learning exercises. Oxford: Chandos.

  41. 3. Incorporate Evidence Review into Existing Meetings • Don’t initiate EXTRA evidence based meetings • Introduce an evidence based component into staff meetings, project meetings, team meetings etcetera

  42. 4. Utilise Evidence Based Standards and Guidelines

  43. 5. Implement Evidence Based Webpages

  44. Booth, A. (2006) Australian supermodel? – A practical example of evidence-based library and information practice (EBLIP). Health Information and Libraries Journal; 23 (1): 69-72. Cotter, L., Harije, L., Lewis, S. & Tonnison, I. (2005) Adding SPICE to our library intranet site: a recipe to enhance usability [online]. Available from: http://conferences.alia.org.au/ebl2005/Cotter.pdf [Accessed February 2007]

  45. 6. Develop Evidence Based Questionnaires • “How often, though, do those researchers use the same questionnaire or at least the same or similar (enough) questions (after getting proper permissions and giving proper attributions, of course)? How often, when selecting survey participants, do they try to control for the same factors as the studies they are using as examples? How often, in other words, do they approach their project from the standpoint of gathering results that will be directly comparable to the work they are using as models? Not having studied this systematically myself, I cannot say for sure, but my impression is that the answer would have to be: not very often”. (Plutchak, 2005)
