
ATRAS



Presentation Transcript


  1. ATRAS Work Package 2 Published Evidence

  2. AT Definition: An AT is a mechanical or electrical device used in a functional, task-orientated training process which will have a systemic or rehabilitative effect on the person.

  3. WP Progress • Searched databases • Initial screening audited • Developed website forms • Allocated papers • Inter-library loans • SharePoint data imported into Access/Excel • Spreadsheets for agreeing scores

  4. Databases searched: PEDro, COMPENDEX, INSPEC, National Research Register (clinical trials), reports from professional bodies (RCP guidelines, CSP), RECAL Legacy, CIRRIE, REHABDATA, AMED, CINAHL, the Cochrane Library (including DARE), CSA Illumina, EMBASE, MEDLINE, PsycINFO, Web of Science

  5. Papers for scoring: 2424 titles found; 1361 failed the selection criteria; 763 screening decisions checked by a 2nd reviewer; 299 rejected; 464 accepted (95 review papers, 369 research papers)
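
  The counts in this flow can be cross-checked. A minimal sketch in Python, assuming the branches read as 763 screened papers splitting into 299 rejected and 464 accepted, with the accepted set splitting into 95 review and 369 research papers (variable names are illustrative, not from the project database):

      # Minimal sketch (assumed flow structure): tally the screening counts
      # from this slide and check the branch totals are internally consistent.
      screened_by_2nd_reviewer = 763
      rejected = 299
      accepted = 464
      review_papers = 95
      research_papers = 369

      # Accepted papers split into review papers and research papers.
      assert review_papers + research_papers == accepted        # 95 + 369 == 464
      # Screened papers split into rejected and accepted.
      assert rejected + accepted == screened_by_2nd_reviewer    # 299 + 464 == 763
      print(f"{accepted} accepted ({review_papers} review, {research_papers} research)")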

  6. Initial Screening

  7. Study Designs

  8. RCTs & Quasi-Controlled studies

  9. ATs, Joints & Design

  10. Review Paper Assessment

  11. Review Paper Assessment: CRD guidance, PEDro, DARE, PRISMA

  12. Review Paper Assessment
      1. Is the purpose of the review clearly defined?
      2. Type of review paper
      3. Peer reviewed?
      4. Were search terms reported?
      5. Were search terms comprehensive?
      6. Was the search strategy reported?
      7. Was the search strategy satisfactory?
      8. Were relevant databases searched?
      9. No papers excluded on the basis of language?
      10. Unlimited search timeframe?
      11. How many reviewers independently reviewed each paper?
      12. How was the validity of studies assessed?
      13. Name the lowest quality of studies retained for data extraction.
      14. Was raw data extracted from papers and/or the research team?
      15. Was a meta-analysis carried out?
      16. Were study details synthesised and summarised?

  13. RPA Results: 95 review papers; 1 duplicate; 2 non-English; 15 experimental (not review) papers, scored with van Tulder; 77 review papers scored; 24 meta-analyses

  14. Meta-analysis: 77 review papers scored

  15. van Tulder Scoring

  16. van Tulder Scoring
      Form fields: Reviewer Name *, Reviewer Order *, Author Name *, Title of Article *, Year *, RefWorks ID *
      1. Were the eligibility criteria specified?
      2. Was a method of randomization performed?
      3. Was treatment allocation concealed?
      4. Were prognostic indicators similar for the groups at baseline?
      5. Were the index and control interventions explicitly described?
      6. Was the care provider blinded to the intervention?
      7. Were co-interventions avoided or comparable?
      8. Was compliance reported in all groups?
      9. Was the patient blinded to the intervention?
      10. Was the outcome assessor blinded to the intervention?
      11. Were the outcome measures relevant?
      12. Were adverse effects described?
      13. Was the withdrawal/drop-out rate described?
      14. Was a short-term follow-up measurement performed?
      15. Was a long-term follow-up measurement performed?
      16. Was the timing of the outcome assessment comparable in both groups?
      17. Was the sample size for each group described?
      18. Did the analysis include an intention-to-treat analysis?
      19. Was the variability given for the primary outcome measures?
      Reviewer's Remarks *
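
  The 19 criteria are answered per paper on the web form. A minimal sketch of summarising one reviewer's answers as a count of "yes" responses (how the project weights the score or sets a quality threshold is not stated on this slide, so none is assumed):

      # Illustrative sketch only: summarise a van Tulder-style assessment as the
      # number of criteria answered "yes"; no quality cut-off is asserted here.
      def van_tulder_score(answers: list[str]) -> int:
          """answers: one 'yes' / 'no' / 'unclear' entry per criterion (19 here)."""
          return sum(1 for a in answers if a.lower() == "yes")

      # Example: a hypothetical paper meeting 12 of the 19 criteria.
      example = ["yes"] * 12 + ["no"] * 5 + ["unclear"] * 2
      print(van_tulder_score(example))  # -> 12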

  17. van Tulder Update: 369 + 15 papers require 384 agreed scores; 542 completed forms; 45 with the same score without discussion; 41 + 33 = 74 agreed scores; 154 single reviews
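
  As a rough sketch of how the "same score without discussion" and "agreed score" cases can be separated, the two reviewers' totals per paper are compared; matching totals pass straight through and the rest go to a consensus discussion (paper IDs and scores below are invented):

      # Hypothetical data: paper_id -> (reviewer 1 score, reviewer 2 score).
      scores = {
          "P001": (12, 12),
          "P002": (9, 11),
          "P003": (15, 15),
      }

      same_without_discussion = [p for p, (r1, r2) in scores.items() if r1 == r2]
      needs_discussion = [p for p, (r1, r2) in scores.items() if r1 != r2]

      print(len(same_without_discussion), "same score without discussion:", same_without_discussion)
      print(len(needs_discussion), "need an agreed score:", needs_discussion)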

  18. Tasks • Complete van Tulder screening • Screen and review updates • Prepare data extraction form • WP2 data extraction familiarization meeting (week beginning May 17th, at Keele)

  19. Data Extraction: Treatment related; Stroke related (Time, Severity, L:R) • Impairment or Activity • Odds Ratio (OR) • Effect Size • Adverse events (OR) • Complexity • Duration (set-up / treatment) • Frequency • Expertise
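
  Two of the quantities listed, the odds ratio and the effect size, follow standard textbook formulas. A brief sketch with made-up inputs (Cohen's d is used here as one common effect-size definition; it is not necessarily the definition the project adopted):

      import math

      def odds_ratio(events_tx, no_events_tx, events_ctrl, no_events_ctrl):
          """OR = (a/b) / (c/d) from a 2x2 outcome table."""
          return (events_tx / no_events_tx) / (events_ctrl / no_events_ctrl)

      def cohens_d(mean_tx, sd_tx, n_tx, mean_ctrl, sd_ctrl, n_ctrl):
          """Effect size: difference in means divided by the pooled standard deviation."""
          pooled_sd = math.sqrt(((n_tx - 1) * sd_tx**2 + (n_ctrl - 1) * sd_ctrl**2)
                                / (n_tx + n_ctrl - 2))
          return (mean_tx - mean_ctrl) / pooled_sd

      print(odds_ratio(20, 10, 12, 18))              # (20/10) / (12/18) = 3.0
      print(cohens_d(24.0, 6.0, 30, 20.0, 5.5, 30))  # ~0.70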

  20. Tasks 2 • Audit the van Tulder process • Check the Cochrane database and clinical trials register • Complete data extraction (July/August) • Prepare report/matrix for the September meeting

  21. Join us in May. Thanks for reviewing.
