
Reporting search methodology in HTA


Presentation Transcript


  1. Reporting search methodology in HTA. IRG Workshop, Montreal, July 6, 2008

  2. Why report search methodology Users of any HTA need confidence in: • Transparency: exactly what was done/not done – informs judgment about accuracy, reliability, generalisability of findings • Replicability: for updating, how, where and when the search was done is important • Adequate weight given to reporting the search: demonstrates recognition that a comprehensive and systematic search is fundamental to identifying the available evidence

  3. 2005 research on reporting search methodology • 42 HTAs in English from 23 research groups, published between 2002 and 2005, were examined • All but 2 reported something about the search process • The number of sources of information ranged from 2 to more than 50 • 34 (81%) reported at least one full strategy to show how terms were combined • The dates the search covered and the date the search ended were reported by 35 (83%) • 39 (92%) showed enough detail to demonstrate an adequate or better search process, but • Only 33 (78%) reported enough detail to enable replication of the search by an experienced searcher

  4. Update 2008 Aims: • Examine current practice and trends • Compare with previous study • Include “rapid” reviews/assessments (excluded from previous study) to compare search methods and reporting

  5. Inclusion criteria • Systematic reviews: full text in English (or a full English translation of a report in another language); described as a systematic review or health technology assessment; published 2006 or later • Rapid/short reviews: described as “rapid”, “brief” or any word indicating speed or a short time frame, OR <30 pages; horizon scans excluded; published 2004 or later (difficult to locate as few are clearly identified)

  6. Systematic reviews (SRs) • 25 systematic reviews examined • 22 different HTA groups or collaborations between groups • 10 countries (Aus, Bel, Can, Fin, Neth, Nor, NZ, Swe, UK, USA) • 20 (80%) were done by INAHTA agencies • 2005 (1); 2006 (2); 2007 (12); 2008 (7)

  7. “Rapid” reports (RRs) • 13 reports from 9 agencies in 8 countries (Aus, Bel, Can, Ger, NZ, Spa, UK, USA) • 12 done by INAHTA agencies • 2004 (2); 2006 (6); 2007 (4); 2008 (1) • Page count range: 9-210 pages • Variably described as: • rapid report, rapid assessment, rapid review, rapid response • brief overview, technical overview, technical brief • abridged report, accelerated review, evidence request • rapid and systematic review

  8. Data collected • Sources of information reported • Were search terms described? • Were subject headings used? • Were full strategies shown? • Were any date and/or language restrictions detailed? • Was the end date of the search shown? • Placement of information within the report (text/appendix) • Number of references located • Was there enough information to replicate the search? • RRs only: page count and document type

  9. Reporting of information sources - SRs • All but 1 reported source information • The only one not reporting sources had clearly used MEDLINE but did not state it • Major sources ranged from 2-15 databases (includes the Cochrane Library) • 19 (76%) reported “other” sources – range 1 to more than 25 • Difficult/impossible to count “other” sources as some were generic categories, e.g. “references of retrieved papers” or “HTA agency websites” • “Other” sources tended to be topic specific • Several very complex multi-question reviews tended to search fewer resources

  10. Reporting of search terms and strategies - SRs • All SRs reported something about search terms • 6 (24%) gave “search terms” - not presented as subject headings • 19 (76%) gave subject headings or headings plus terms • 19 gave at least one full strategy • 13 (52%) gave additional full strategies (range 1-16 full strategies)
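To show what a “full strategy” of this kind typically contains, here is a minimal, purely illustrative example in Ovid MEDLINE syntax (the topic and terms are hypothetical and not taken from any of the reviewed reports), combining a subject heading with free-text terms and showing the limits applied:

    1  exp Asthma/
    2  asthma.ti,ab.
    3  1 or 2
    4  exp Adrenergic beta-Agonists/
    5  (salbutamol or albuterol).ti,ab.
    6  4 or 5
    7  3 and 6
    8  limit 7 to (english language and yr="2002 - 2008")

Reporting at this level of detail (the numbered search lines, how they were combined, and the limits applied) is what allows an experienced searcher to replicate the search.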

  11. Other details - SRs Language and date information • All reported language or date limits, or enough detail was provided in a complete strategy to identify any limits • 23 (92%) reported the end date for the search Number of results from search • 18 (72%) reported the total number of results from the search

  12. Placement of information reporting the search - SRs • 1 report showed search information in an appendix only – nothing in the text • 10 (40%) reports showed details only in the main text • 14 (56%) had information in both the main text and an appendix • The appendix was most often used for showing complete strategies or long lists of websites searched

  13. Sources of information - RRs • All reported sources of information • The number of sources ranged from 3 to more than 20 • Trend towards reporting major sources and listing others generically • Three reports had completed a fully exhaustive and systematic search • MEDLINE and the Cochrane Library were used by all

  14. Search terms - RRs • 2 gave no information about search terms at all • 2 stated that the search strategy was available in an appendix but it was not included in the online version • 2 gave search terms only • 7 (53%) reported a full strategy • 6 (46%) reported more than one complete strategy (range 2-12)

  15. Other details - RRs Of the 13 reports: • 10 reported language and date restriction details • 8 reported the search completion date • 9 reported the number of references located in the search • 5 reported search details in the main part of the report only • 8 had brief details in the text and included an appendix • The missing appendix in 2 reports may have contained these details but could not be located

  16. General comments on reporting of sources • 37 of 38 documents reported some source information • The only one not reporting source information had identifiable MEDLINE strategies - a possible oversight? • Cochrane Library resources are now standard rather than “other” • Trend towards reporting generic sources and away from long lists of itemised HTA websites that become outdated quickly • Some HTA websites listed were out of date – had they really been searched, or was the list included as a standard part of every report? • The HTA database is still being referred to incorrectly as the INAHTA database in some reports • Variable reporting of some sources affected the count, e.g. “Cochrane Library” vs itemising the different sections of the Cochrane Library

  17. Rapid vs systematic: sources • 3 so-called “rapid reviews” were also fully systematic, with >20 sources reported and complex strategies • Several SRs had >25 sources • Generally RRs searched and itemised fewer sources, but the lowest number was 3 sources in both SRs and RRs (except for the SR not reporting any source information) • Several very complex SRs reported fewer sources than most RRs

  18. Rapid vs systematic: search terms and strategies • All SRs (100%) gave some details of search terms used. • 9 (69%) of RRs gave some details of search terms • 19 SRs (76%) gave at least one full strategy and 13 (52%) gave additional strategies • 7 RRs (54%) gave one full strategy; 6 (46%) gave additional strategies

  19. Other details • 100% of SRs included date range & language restrictions vs 10 (77%) of RRs • 23 SRs (92%) gave an end date for the search vs 8 (62%) of RRs • 10 (40%) of SRs reported details in the main text only vs 5 (38%) of RRs • Appendix plus text was used in 14 (56%) SRs vs 8 (61%) RRs • Similar rate of reporting the number of references: 30% of SRs vs 28% of RRs

  20. Replicability • 20 (80%) SRs were judged to be broadly replicable, based on reporting of at least one full strategy and adequate source information, vs 7 (54%) of RRs • 2 more RRs may have been replicable if the missing appendices had been available

  21. Differences from 2005 • Most HTAs have adequate or better reporting • 100% of SRs gave details about search terms and the limits placed – up from 95% and 83% respectively in 2005 • Improvement in reporting the date the search ended – 92% vs 83% in 2005 • The number clearly indicating use of subject headings is up slightly, from 73% in 2005 to 76% • The range (least to most) of sources searched has not changed, but the average number per review is higher (NB: difficulty of quantifying generic sources reported) • Other reporting appears to be more detailed, though the percentages are very similar to 2005

  22. Differences from 2005 • A trend towards separating source and strategy information between the main text and an appendix respectively has become apparent • Only one SR banished all details to an appendix • The general impression is that the search process is being taken more seriously and is now a standard part of the methodology section • Very complex multi-question SRs are being commissioned and carried out; this can necessitate limiting the search to major sources • More non-English speaking countries are publishing full text in English – 11 of the 38 documents examined (29%) compared to 4 of 42 (9.5%) in 2005

  23. SRs vs RRs • Reporting in most, but not all, RRs is less detailed and less replicable • RRs are highly variable in length and methodology (anything from full SRs to overviews of a few pages) • Topics tend to be somewhat different for RRs: • more tightly defined (one intervention vs several; review only vs review plus economic evaluation), • more “medical” (drugs, procedures, devices) compared to SRs now branching into social topics (spousal violence, alcohol offenders) • newer (robotic surgery, inhaled insulin) vs SR topics such as screening, asthma drugs, chronic pain

  24. SRs vs RRs • The ASERNIP-S report* on rapid vs systematic reviews did not recommend trying to standardise a “rapid HTA” document type, but instead recommended increasing the full reporting of methods used and clarifying the scope and purpose of the document • Reporting the approximate number of person-hours spent would also assist under-resourced groups experiencing pressure for timeliness and accuracy from their funding body • *Rapid versus full systematic reviews: an inventory of current methods and practice in health technology assessment (July 2007)

  25. Limitations of this update • Quick overview – small convenience sample • Not possible to read all documents thoroughly • Difficult to locate enough RRs to compare with SRs – they may not be listed on the HTA database • Major language groups were left out, especially Spanish HTAs • Comparisons with 2005 may indicate a trend but are not statistically sound because of variability in selecting the included reports and the difference in the number of documents included in the 2005 vs 2008 overviews

  26. Final conclusions • Reporting of search methodology has improved and keeps improving in SRs • Information specialists need to remain vigilant with SRs: • ensure reporting gives sufficient detail for replicability • ensure balance between text and appendix • ensure reporting is in proportion to the size of the full document • The number of sources reported should not be a competition – some topics need wider information gathering than others • Reporting in RRs needs attention – as the format is so variable, reporting of the search methodology is even more important • Search methods could be provided more fully in a supplementary online file if reporting in full would be too unbalanced for very short RRs

  27. Susan Bidwell susan.bidwell@otago.ac.nz susanbidwell@xtra.co.nz
