
Rapid Searches for Rapid Reviews






Presentation Transcript

  1. Rapid Searches for Rapid Reviews Liz Dennett, Dagmara Chojecki June 24, 2012

  2. Outline • What are rapid reviews and why do them? • What does the lit say? • Methodological issues • Rapid review survey of IRG members • IHE & rapid reviews

  3. Rapid Reviews 101 • Increasing demand for rapid access to current research • Clinical urgency/intense demand for public uptake of a technology • Evidence-informed decision making/policy • RRs accelerate/streamline traditional SR practices • Methodological concessions that can introduce bias • No standard methodologies

  4. What Does the Lit Say? • Little empirical evidence comparing RRs vs. SRs • Lack of transparency in RRs (Ganann et al. 2010) • Variety of types of RRs and methods (Cameron 2007 ASERNIP-S report) • Leave out economic factors, social issues, clinical outcomes • Fewer details than SRs • Only published lit (no grey lit) • No quality assessment (Moher 1998) • Date, language, study type limits on search

  5. What Does the Lit Say? • RRs not appropriate for all technology assessments (Cameron 2007, Watt et al. 2008) • Standardization of methods may not even be appropriate (Ganann 2010, Cameron 2007) • RR conclusions may be harder to generalize and less certain (Watt et al. 2008)

  6. Possible Biases Introduced • Selection bias (Butler et al. 2005, Egger 1998) • Publication bias • Language of publication bias • Database bias (Egger 1998, Sampson 2003) • Medline, Embase, or both?

  7. Searching for Rapid Reviews at IHE • Checklist of databases and grey literature sources that we consider searching • Differs for each rapid review product at IHE (based on project time more than anything else) • Employ other strategies to reduce results and time: use of an existing strategy, title searching, restricted date, pub type, geography, language
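The time-saving limits listed above (title-only searching, restricted dates, publication type, language) can be illustrated as a query string in PubMed's field-tag syntax. The sketch below is a minimal Python example assuming PubMed's standard tags ([ti], [tiab], [pt], [la], [dp]); the topic terms, dates, and limits are hypothetical illustrations, not IHE's actual strategy.

```python
def build_rapid_query(topic_terms, date_from=None, pub_types=None,
                      languages=None, title_only=False):
    """Combine topic terms with optional rapid-review limits into one
    PubMed query string (hypothetical example, not an IHE strategy)."""
    # Title-only searching trades recall for a much smaller result set
    field = "ti" if title_only else "tiab"
    parts = ["(" + " OR ".join(f"{t}[{field}]" for t in topic_terms) + ")"]
    if date_from:
        # Restricted date range, PubMed's open-ended date syntax
        parts.append(f'("{date_from}"[dp] : "3000"[dp])')
    if pub_types:
        # Publication type limit, e.g. reviews or RCTs only
        parts.append("(" + " OR ".join(f"{p}[pt]" for p in pub_types) + ")")
    if languages:
        # Language limit
        parts.append("(" + " OR ".join(f"{l}[la]" for l in languages) + ")")
    return " AND ".join(parts)

query = build_rapid_query(
    ["telehealth", "telemedicine"],              # hypothetical topic
    date_from="2007",
    pub_types=["review", "randomized controlled trial"],
    languages=["english"],
    title_only=True,
)
print(query)
```

Each limit is optional, mirroring the survey finding later in the deck that organizations mix and match these tactics depending on the question and the timeline.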

  8. How are searches altered for rapid reviews? • Very little information available on how searches are being altered for rapid reviews • Wanted more details (e.g. which strategies are most common? What limits were used? Any strategies that we were not using that we could be?) • Decided to survey other HTA organizations in order to identify trends

  9. Survey on adaptation of search strategies for Rapid HTAs Methods: • Created a list of strategies for reducing the time it takes to do a search and/or to reduce the number of results the search returns • Created a survey instrument (FluidSurveys) and pretested it with former HTA searchers • Received ethics approval • Sent email out on the IRG listserv

  10. Definitions for Survey Full HTA: An HTA report that starts with a comprehensive search, involving numerous databases and grey literature sources, in order to find all relevant research. The included primary studies are critically appraised and synthesized, and meta-analysis is performed if possible. (Because of the methodological rigour involved, these reviews often take longer than six months to produce, but may take less depending on available resources or the nature of the question.) Rapid HTA: An HTA report where methodological compromises are made in order to meet shorter timelines. (These reports would generally take at least 1 week but no more than 6 months to produce.)

  11. Results • Received 17 completed responses from 16 unique organizations • Responses from Canada, England, Scotland, Germany, Sweden, Norway, the Netherlands, Malaysia, Spain, Argentina, Poland • 11 produce both full HTAs and rapid HTAs, 2 only rapid HTAs, 3 only full HTAs • The 13 rapid HTA producers combine to produce 32 different rapid HTA products • Timelines range from 1 day to 9 months

  12. Number of Databases • 10 out of 12 (83%) provided a lower minimum or maximum number of databases for at least one of their rapid HTA products as compared to their full HTA report (e.g. 5-7 for rapid HTA; 8-11 for full HTA) • However, 10 out of 12 had at least one rapid HTA product whose range for the number of databases searched overlapped with the range for full HTAs • While many organizations may search fewer databases for rapid HTAs, they do not always do so

  13. Grey Literature • 9 of 12 (75%) of information specialists (IS) reduce the number of grey literature sources they search • However, 32 out of 32 (100%) of the rapid HTA products included some grey literature • 31 out of 32 (97%) include a search for publications from other HTA agencies

  14. Other strategies • 7 of 12 (58%) of IS try to make their search strategy more precise • 5 of 12 (42%) use a more precise version of a methodological filter • 9 of 14 (64%) use an unmodified (or only slightly modified) pre-existing search strategy if one is available and appropriate

  15. Main points • All respondents use some strategies to speed up the search or reduce results for a rapid HTA • Great variability between organizations (and possibly between different IS from the same organization) • It appears clear that other factors (such as the nature of the question) affect the choice of time-saving strategies

  16. Unanswered questions • Should best practices for searching for rapid reviews be established? • What impact does each of the different strategies have on the comprehensiveness of the search? (Evidence for some, but not for others) • What are the most effective strategies (i.e. those that save the most time but lose the fewest relevant articles)?

  17. Next steps for IHE Review rapid review protocols for all types of rapid HTAs in light of survey results and evidence

  18. References
  Ganann R, Ciliska D, Thomas H. Expediting systematic reviews: methods and implications of rapid reviews. Implement Sci 2010;5:56. doi: 10.1186/1748-5908-5-56.
  Watt A, Cameron A, Sturm L, Lathlean T, Babidge W, Blamey S, Facey K, Hailey D, Norderhaug I, Maddern G. Rapid reviews versus full systematic reviews: an inventory of current methods and practice in health technology assessment. Int J Technol Assess Health Care 2008;24:133-139.
  Butler G, Hodgkinson J, Holmes E, Marshall S. Evidence based approaches to reducing gang violence. West Midlands, UK: Government Social Research Unit; 2004.
  Cameron A. Rapid versus full systematic reviews: an inventory of current methods and practice in Health Technology Assessment. Australia: ASERNIP-S; 2007:1-119.
  Moher D, Pham B, Jones A, Cook DJ, Jadad AR, Moher M, Tugwell P, Klassen TP. Does quality of reports of randomised trials affect estimates of intervention efficacy reported in meta-analyses? Lancet 1998;352:609-613.
  Watt A, Cameron A, Sturm L, Lathlean T, Babidge W, Blamey S, Facey K, Hailey D, Norderhaug I, Maddern G. Rapid versus full systematic reviews: validity in clinical practice? ANZ J Surg 2008;78:1037-1040.
  Egger M, Smith GD. Bias in location and selection of studies. BMJ 1998;316:61-66.
  Sampson M, Barrowman NJ, Moher D, Klassen TP, Pham B, Platt R, St John PD, Viola R, Raina P. Should meta-analysts search Embase in addition to Medline? J Clin Epidemiol 2003;56:943-955.

  19. Thank you very much for your attention! www.ihe.ca dchojecki@ihe.ca ldennett@ihe.ca
