
Presentation Transcript


  1. WHAT IS SYSTEMATIC REVIEWING. ESRC Methods Festival, Oxford, 2.00-2.45 pm, 19th July 2006. David Gough, EPPI-Centre, Social Science Research Unit, Institute of Education, University of London

  2. Today • SSRU and its EPPI-Centre • Building on what we know • Research and decision making • Systematic reviews • Making a difference • Issues for researchers

  3. Social Science Research Unit 1. Five streams of work: • Childhood Studies • Evaluation of Social Interventions • Sexual Health, Reproduction and Social Exclusion • Evidence for Policy and Practice Information and Co-ordinating Centre • Perspectives, Participation and Research

  4. Evidence for Policy and Practice Information and Co-ordinating (EPPI) Centre • Conducting reviews since 1993 in health promotion, education, transport • Support and tools for review groups: education (25 groups, 70+ reviews), criminology, employment, speech and language, social care • Formal links with the Cochrane and Campbell Collaborations • Methodological work, e.g. Methods for Research Synthesis project, ESRC National Centre for Research Methods • Short courses and Masters course in evidence for public policy and practice • On-line libraries of research evidence

  5. Why be a researcher? 2. • A job (‘I do it for the pay’) • To make a difference: • To develop how we understand and conceptualise the world • To aid decision making

  6. Improve our knowledge and understanding • A new piece of research: how do we know we made a difference? • What did we know before and what do we know now = step change • But is this what we do? (cf. MRC requirements that new studies build on reviews of existing evidence) • So, how do we know what we know (both before and after completing research)?

  7. How do we do this? Methods of study • Primary study: explicit rigorous methods within accepted rules of evidence for that discipline • Secondary study: exactly the same!

  8. Decision making: Research evidence for policy and practice 3. “…policy makers and practitioners who intervene in the lives of other people not infrequently do more harm than good” — Chalmers I (2003) Trying to do more good than harm in policy and practice: the role of rigorous, transparent, up to date, replicable evaluation. Paper commissioned for the Annals of the American Academy of Political and Social Science.

  9. ‘Reduce the Risk’ Campaign in the early 1990s in the UK “The risk of cot death is reduced if babies are NOT put on the tummy to sleep. Place your baby on the back to sleep. …Healthy babies placed on their backs are not more likely to choke.”

  10. Research for practice “Teaching is not at present a research-based profession.” — Hargreaves D (1996) Teaching as a research-based profession: possibilities and prospects. Teacher Training Agency (TTA) Annual Lecture. London: TTA. Procedural (craft/apprentice) vs. declarative (e.g. research-based) knowledge

  11. Public debate “We are, through the media, as ordinary citizens, confronted daily with controversy and debate across a whole spectrum of public policy issues. But typically, we have no access to any form of a systematic ‘evidence base’ – and therefore no means of participating in the debate in a mature and informed manner”. — AFM Smith (1996) Mad cows and ecstasy: chance and choice in an evidence-based society. Journal of the Royal Statistical Society 159: 367–383.

  12. SYSTEMATIC REVIEWS 4. • Bring together and ‘pool’ the findings of primary research (i.e. clarify what we know) • Are pieces of research, following principled methods and a research question, with a protocol ‘up-front’, reflecting on their own strengths and limitations • Take steps to reduce hidden ‘bias’ and ‘error’ • Are accountable, ‘replicable’ and updateable • Are driven by review users (including researchers) to answer relevant questions in relevant ways

  13. Question-led reviews: synthesis is to answer the review question, so synthesis starts at the beginning of a review. Decisions are made about: • The question that is being answered • The conceptual framework to structure the synthesis • The inclusion criteria determining what studies and findings will be included • The approach to reviewing used

  14. A review’s scope matters: 6 reviews of older people and accident prevention • Total studies included: 137 • Common to at least two reviews: 33 • Common to all six reviews: 2 • Treated consistently in all reviews: 1 — Oliver S (1999) Users of health services: following their agenda. In: Hood S, Mayall B, Oliver S (eds) Critical Issues in Social Research: Power and Prejudice. Buckingham: Open University Press.

  15. Key decision-making stages in SRs • Form review team • Formulate review question and develop protocol • Define studies to be considered (inclusion criteria) • Search for and screen studies (search strategy) • Describe studies (systematic map of research) • Assess study quality (and relevance) • Synthesise findings (answering the review question) • Communicate and engage

  16. Building on what we know • What do we want to know? • What do we know already? • How do we know it? • What more do we want to know? • How can we know it? Not just a linear process: research questions overlap, and questions, concepts and methods will change

  17. What did we know before and what do we know now = step change • Review what we know • Do a further piece of primary research • New study understood as just one tentative piece of knowledge/understanding (may be misleading) to build on what we know • Rarely decisive single studies (e.g. all swans are white) • Knowledge and understanding (including reviews) is not static but develops and changes

  18. Synthesis • ‘The process or result of building up separate elements, especially ideas, into a connected whole, especially a theory or system’ (OED) • Not just a report of the findings of the individual studies in a review • Involves a transformation of the data from primary studies in some way

  19. Question-led reviews: synthesis is to answer the review question, so synthesis starts at the beginning of a review. Decisions are made about: • The question that is being answered • The conceptual framework to structure the synthesis • The inclusion criteria determining what studies and findings will be included • The approach to reviewing used • All of these shape the possibilities for data transformation

  20. Approaches to reviewing and types of data and synthesis. The nature of data and analysis in the review is predominantly: • A priori specified method of review or iterative approach (e.g. review question, bases for assessment of quality of studies, and/or framework for synthesis emerging during the review) • Numerical or narrative data • ‘Empirical’ or ‘conceptual’ data • Relatively homogeneous or heterogeneous data • Numerical or narrative analysis of data in synthesis • Integrative (meta-empirical) or interpretative (meta-conceptual) synthesis of data

  21. Approaches to reviewing: examples. Statistical meta-analysis¹ • A priori or iterative procedures of review • Numerical or narrative data • ‘Empirical’ or ‘conceptual’ data • Relatively homogeneous or heterogeneous data • Numerical or narrative analysis of data in synthesis • Integrative or interpretative synthesis of data ¹ e.g. many statistical meta-analysis reviews addressing ‘what works’ questions

  22. Quantitative analysis for synthesis. E.g. statistical meta-analysis of effect sizes from experimental studies of the effects of interventions • Product = graphical display of: results for individual studies on a common scale; variation between studies; pooled estimate from all studies and confidence around that estimate • Methods: effect sizes weighted, because smaller studies are more subject to chance; assumptions made about the source of variability; sensitivity analysis to assess robustness to assumptions and inclusion criteria (see the sketch below)
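
To make the pooling step concrete, here is a minimal sketch (not from the original slides; the study data are invented and a fixed-effect, inverse-variance model is assumed): each study's effect size is weighted by the inverse of its variance, so smaller, noisier studies count for less, and the pooled estimate is reported with a confidence interval; Cochran's Q and I² give a rough check on between-study heterogeneity.

```python
import math

# Hypothetical studies: (name, effect size, standard error). Invented data.
studies = [
    ("Study A", 0.30, 0.10),
    ("Study B", 0.15, 0.20),
    ("Study C", 0.45, 0.15),
]

# Inverse-variance weights: smaller studies (larger SEs) get less weight.
weights = [1 / se ** 2 for _, _, se in studies]
total_w = sum(weights)

# Pooled (fixed-effect) estimate, its standard error and 95% CI.
pooled = sum(w * es for w, (_, es, _) in zip(weights, studies)) / total_w
pooled_se = math.sqrt(1 / total_w)
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

# Cochran's Q and I-squared as a rough heterogeneity check.
q = sum(w * (es - pooled) ** 2 for w, (_, es, _) in zip(weights, studies))
df = len(studies) - 1
i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

for name, es, se in studies:
    print(f"{name}: {es:+.2f} [{es - 1.96 * se:+.2f}, {es + 1.96 * se:+.2f}]")
print(f"Pooled: {pooled:+.2f} [{ci_low:+.2f}, {ci_high:+.2f}]  I^2 = {i_squared:.0f}%")
```

A real synthesis would add the sensitivity analyses the slide mentions, for example re-pooling under a random-effects model or under different inclusion criteria, and would display the per-study and pooled results as a forest plot.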

  23. Meta-analysis example: does sex education improve the use of contraception amongst young people? From DiCenso et al. (2002) Interventions to reduce unintended pregnancies among adolescents: a systematic review of randomised controlled trials. British Medical Journal 324: 1426-1434

  24. Approaches to reviewing: examples. Narrative reviews of quantitative studies¹ • A priori or iterative procedures of review • Numerical or narrative data • ‘Empirical’ or ‘conceptual’ data • Relatively homogeneous or heterogeneous data • Numerical or narrative analysis of data in synthesis • Integrative or interpretative synthesis of data ¹ e.g. many narrative reviews addressing ‘what works’ questions

  25. Quantitative narrative example: secondary school size review. Garrett Z, Newman M, Elbourne D, Bradley S, Noden P, Taylor J, West A (2004) Secondary school size: a systematic review. In: Research Evidence in Education Library. London: EPPI-Centre, Social Science Research Unit, Institute of Education.

  26. Approaches to reviewing: examples. Thematic reviews of views studies¹ and some meta-ethnography² • A priori or iterative procedures of review • Numerical or narrative data • ‘Empirical’ or ‘conceptual’ data • Relatively homogeneous or heterogeneous data • Numerical or narrative analysis of data in synthesis • Integrative or interpretative synthesis of data ¹ e.g. Thomas et al. (2004); ² e.g. Campbell et al. (2003)

  27. Qualitative analysis for synthesis. E.g. meta-ethnographic analysis • Products: new interpretative constructions • Methods: vary amongst the small number of studies reported so far; key concepts used as data; concepts translated within and across studies; looking for reciprocal and refutational translations and lines of argument; the role of quality and sampling varies (a toy illustration follows below). E.g. Britten N, Campbell R, Pope C, Donovan J, Morgan M, Pill R (2002) Using meta ethnography to synthesise qualitative research: a worked example. Journal of Health Services Research & Policy 7(4): 209-215
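
As a loose illustration only (meta-ethnography is interpretative work, not an algorithm, and all study names and concepts below are invented), the "key concepts used as data" bookkeeping can be sketched: record which concepts each study reports, then surface concepts that recur across studies as candidates for reciprocal translation.

```python
# Toy bookkeeping for a meta-ethnography: which concepts appear in which
# studies, and which concepts recur across studies (candidates for
# "reciprocal translation"). Invented data; the interpretative work of
# actually translating concepts cannot be automated.
from collections import defaultdict

study_concepts = {
    "Study A": {"stigma", "self-management", "trust in clinician"},
    "Study B": {"stigma", "lay knowledge"},
    "Study C": {"self-management", "lay knowledge", "trust in clinician"},
}

concept_to_studies = defaultdict(set)
for study, concepts in study_concepts.items():
    for concept in concepts:
        concept_to_studies[concept].add(study)

# Concepts reported by two or more studies are starting points for
# translating one study's account into another's terms.
for concept, reporting in sorted(concept_to_studies.items()):
    if len(reporting) >= 2:
        print(f"{concept}: {sorted(reporting)}")
```

Judging whether the accounts are in fact reciprocal or refutational, and building any line-of-argument synthesis from them, remains the reviewer's interpretative task.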

  28. Meta-ethnographic synthesis (figure), adapted from Britten et al. (2003)

  29. Approaches to reviewing: examples. Some meta-ethnography¹, critical interpretative synthesis², realist synthesis³ • A priori or iterative procedures of review • Numerical or narrative data • ‘Empirical’ or ‘conceptual’ data • Relatively homogeneous or heterogeneous data • Numerical or narrative analysis of data in synthesis • Integrative or interpretative synthesis of data ¹ Noblit & Hare (1988); ² Dixon-Woods et al. (2005); ³ Pawson (2005)

  30. Iteration: a challenge for replicability • E.g. realist synthesis • Developing and examining evidence for mid-range theories from multiple contexts • Articulate theory and check empirical evidence – use of ‘nuggets’ • Emphasis on relevance. Pawson R, Greenhalgh T, Harvey G, Walshe K (2005) Realist review – a new method of systematic review designed for complex policy interventions. Journal of Health Services Research & Policy 10(S1): 21-34.

  31. Narrative iterative example: realist synthesis • Realist reviews may have multiple purposes and be directed at all or one of the following • What is it about this kind of intervention that works, for whom, in what circumstances, in what respects, and why? • Developing and examining evidence for mid-range theories from multiple contexts, e.g. the programme theory for a policy of public disclosure of performance data • Articulate what the theory underpinning the intervention is and check it against empirical evidence • The process is iterative, with purposive selection of evidence, and appears more reliant on individual reviewers’ knowledge and expertise. Pawson R, Greenhalgh T, Harvey G, Walshe K (2005) Realist review – a new method of systematic review designed for complex policy interventions. Journal of Health Services Research & Policy 10(S1): 21-34.

  32. Approaches to reviewing: examples. Some Bayesian meta-analysis¹ • A priori or iterative procedures of review • Numerical or narrative data • ‘Empirical’ or ‘conceptual’ data • Relatively homogeneous or heterogeneous data • Numerical or narrative analysis of data in synthesis • Integrative or interpretative synthesis of data ¹ e.g. Roberts et al. (2002)

  33. Background to ‘mixed methods’ approach • Policy and practice concerns often precede, or go beyond, questions of effectiveness. • Different types of questions require different combinations of study types to be included. • Different combinations of study types demand different methods of synthesis. • However, key principles of systematic reviews are not compromised.

  34. Case example of a ‘mixed methods’ synthesis What is known about the barriers to, and facilitators of, healthy eating amongst children?* *The full report of this review is available at the EPPI-Centre website: Thomas J, Sutcliffe K, Harden A, Oakley A, Oliver S, Rees R, Brunton G, Kavanagh J (2003a) Children and Healthy Eating: A systematic review of barriers and facilitators. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London.

  35. Review process. SCOPING AND MAPPING (of 272 reports of 193 studies). Review question, e.g. What is known about the barriers to, and facilitators of, fruit and veg intake amongst children aged 4 to 10 years? • ‘Views’ studies (N=8): 1. application of inclusion criteria; 2. quality assessment; 3. data extraction; 4. thematic synthesis • Trials (N=33): 1. application of inclusion criteria; 2. quality assessment; 3. data extraction; 4. synthesis using statistical meta-analysis • Trials and ‘views’ together: mixed-methods synthesis

  36. Other aspects of the extensiveness of systematic reviews. Each stage of a review can vary in its relative breadth, depth, exhaustiveness, and analysis: • Question: broad or narrow • Searching: exhaustive, purposive or scoping • Map: analytic or descriptive • Synthesis breadth: broad or narrow • Extent of evidence: adequate, unless specified as minimal • Synthesis type: synthetic, unless specified as descriptive

  37. Some reasons why research does not make a difference 5. • Research only one of many factors • Instrumental use / avoidance of research • Questions about the quality of research or the perspective being ‘sold’ • Access to research • Interpretation and use and relation to other knowledge and contexts • Relevance of the research

  38. Research often just one small factor among many (figure, adapted from Davies 2004): professional experience & expertise; pragmatics & contingencies; political judgement; research evidence; lobbyists & pressure groups; resources; habits & tradition; values

  39. ii. Instrumental use of research • Other (non-research) factors are important (e.g. school ‘no parking’ safety notices) • Naïve rational model (Weiss 1979) • These are not reasons for not advocating rational approaches to research and its use • Changing the culture to make rational use of research the expectation

  40. iii. Questions about quality or the perspective being ‘sold’ • Overclaiming about individual studies not contextualised within other research • Different studies with different conclusions • Different experts with different views • One particular perspective / product being ‘sold’

  41. iv. Access to all the research “…[which is] presented in a form or medium which is largely inaccessible to a non-academic audience; and lack(s) interpretation for a policy-making or practitioner audience”. — Hillage J, Pearson R, Anderson A, Tamkin P (1998) Excellence in Research in Schools. London: Department for Education and Employment/Institute of Employment Studies.

  42. Systematic reviews and the communication of their findings • Reviews provide access, but need to be systematic and to reflect critically on their strengths, limitations and relationship to other research and theory • One size fits all? Rapid Evidence Assessments and the EPPI-Centre’s 6 basic types of short or long map and synthesis reviews • Maximising accessibility: graded-entry review reports and access to review data, with varying degrees of user implications / conclusions

  43. v. Interpretation & use, and relation to other knowledge & contexts • Interpretation in the light of other knowledge • Application in practice in different contexts

  44. Interpretation of findings • Specification of the basis of the generic findings of reviews • Context specific interpretation • Application of practice knowledge and other forms of knowledge • Formality of processes to identify and use such ‘other’ knowledge. For example: primary or secondary research on contexts and/or on practice knowledge

  45. Application of interpreted findings. Interpreted findings are affected by other factors, and may also require procedures, in order to be applied to decision making • Contexts, demand, perspectives and other pragmatic implementation issues • Formality of procedures to identify and make use of such implementation issues, for example primary or secondary research on implementation contexts and issues • Intermediary processes, procedures, organisations and products

  46. (Figure) Communication of review findings feeds into decision making

  47. vi. Relevance: demand- rather than product-led approaches. (Figure: the review cycle) Funders, other review-users and researchers ask: what do we know (and how do we know it)? what do we want to know? → review question → apply systematic review methods → review findings → communication, interpretation, application → is there more that we want to know, and how could we know it?
