What is meta-analysis?
ESRC Research Methods Festival, Oxford, 8th July 2010
Professor Steven Higgins, Durham University
s.e.higgins@durham.ac.uk
Key points
• Understanding 'effect size'
• Why do we need it?
• What are its limitations?
• What is its potential?
What is an "effect size"?
• A standardised way of looking at a difference
• Different methods of calculation:
  • Odds ratio
  • Correlation (Pearson's r)
  • Standardised mean difference
• The difference between the control and intervention groups as a proportion of the dispersion of scores (see the sketch below)
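To make the last bullet concrete, here is a minimal sketch (not from the presentation) of one common standardised mean difference, Cohen's d with a pooled standard deviation; the function name and the scores are purely illustrative.

```python
# A sketch of a standardised mean difference (Cohen's d with a pooled SD).
# The data and the function name are hypothetical, for illustration only.
from statistics import mean, stdev

def standardised_mean_difference(intervention, control):
    """(mean of intervention - mean of control) / pooled standard deviation."""
    n_i, n_c = len(intervention), len(control)
    s_i, s_c = stdev(intervention), stdev(control)
    pooled_sd = (((n_i - 1) * s_i ** 2 + (n_c - 1) * s_c ** 2)
                 / (n_i + n_c - 2)) ** 0.5
    return (mean(intervention) - mean(control)) / pooled_sd

intervention_scores = [34, 38, 41, 45, 47, 50, 52]   # illustrative test scores
control_scores = [30, 33, 36, 40, 42, 44, 46]
print(round(standardised_mean_difference(intervention_scores, control_scores), 2))
```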
Examples of effect sizes
ES = 0.2
• "Equivalent to the difference in heights between 15 and 16 year old girls"
• 58% of the control group below the mean of the experimental group
• Probability you could guess which group a person was in = 0.54
• Change in the proportion above a given threshold: from 50% to 58%, or from 75% to 81%
ES = 0.5
• "Equivalent to the difference in heights between 14 and 18 year old girls"
• 69% of the control group below the mean of the experimental group
• Probability you could guess which group a person was in = 0.60
• Change in the proportion above a given threshold: from 50% to 69%, or from 75% to 88%
ES = 0.8
• "Equivalent to the difference in heights between 13 and 18 year old girls"
• 79% of the control group below the mean of the experimental group
• Probability you could guess which group a person was in = 0.66
• Change in the proportion above a given threshold: from 50% to 79%, or from 75% to 93%
(The sketch below shows the arithmetic behind these three examples.)
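The three example slides can be reproduced with a few lines of arithmetic. The sketch below (not from the presentation) assumes normally distributed scores with equal spread in both groups; 0.674 is the z-value of the 75th percentile of the control distribution.

```python
# Sketch of the arithmetic behind the ES = 0.2 / 0.5 / 0.8 examples,
# assuming normally distributed scores with equal spread in both groups.
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

for es in (0.2, 0.5, 0.8):
    below = normal_cdf(es)               # control group below the experimental mean
    guess = normal_cdf(es / 2)           # chance of guessing a person's group from a score
    above_50 = normal_cdf(es)            # threshold at the control median (50% of controls above)
    above_75 = normal_cdf(es + 0.674)    # threshold at the control 25th percentile (75% above)
    print(f"ES={es}: {below:.0%} below mean, guess={guess:.2f}, "
          f"50% -> {above_50:.0%}, 75% -> {above_75:.0%}")
```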
Significance versus effect size
• The traditional test is of statistical 'significance'
  • The difference is unlikely to have occurred by chance
• However, it may not be:
  • Large
  • Important, or even
  • Educationally 'significant'
The rationale for using effect sizes
• Traditional reviews focus on statistical significance testing
  • Highly dependent on sample size
  • A null finding does not carry the same "weight" as a significant finding
• Meta-analysis focuses on the direction and magnitude of the effects across studies
  • From "Is there a difference?" to "How big is the difference?" and "How consistent is the difference?"
  • Direction and magnitude are represented by the "effect size" (see the sketch below)
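As an illustration of the sample-size point, the sketch below (not part of the original talk) fixes the effect size at d = 0.3 and shows how the two-sided p-value shrinks as n grows; it uses a normal approximation to the t distribution to keep the code dependency-free.

```python
# Same effect size, different sample sizes: significance changes, the effect does not.
# Uses a normal approximation to the t distribution; illustrative only.
from math import erf, sqrt

def normal_cdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

d = 0.3                                     # identical effect size in every "study"
for n_per_group in (20, 80, 200):
    t = d * sqrt(n_per_group / 2)           # two-sample t statistic with equal group sizes
    p = 2 * (1 - normal_cdf(abs(t)))        # approximate two-sided p-value
    print(f"n = {n_per_group} per group: d = {d}, t = {t:.2f}, p ~ {p:.3f}")
```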
Meta-analysis
• Synthesis of quantitative data
  • Cumulative
  • Comparative
  • Correlational
• "Surveys" educational research (Lipsey and Wilson, 2001)
Forest plots
• An effective way of presenting results
  • Studies, effect sizes, confidence intervals
• Provide an overview of the consistency of effects
• Summarise an overall effect (with a confidence interval)
• A useful visual model of a meta-analysis
Anatomy of a forest plot…
[Diagram: an annotated forest plot labelling the line of no effect, the individual studies and the N of each study, each study's effect size with its confidence interval, the weighting of each study in the meta-analysis, and the pooled effect size with its confidence interval.]
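A minimal sketch (assuming a fixed-effect model and made-up study results) of the pooling a forest plot summarises: each study is weighted by the inverse of its variance, and the pooled effect gets its own confidence interval.

```python
# Fixed-effect, inverse-variance pooling of hypothetical study results.
from math import sqrt

# (effect size, standard error) for three made-up studies
studies = [(0.30, 0.12), (0.45, 0.20), (0.15, 0.10)]

weights = [1 / se ** 2 for _, se in studies]        # inverse-variance weights
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = sqrt(1 / sum(weights))
low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled ES = {pooled:.2f} (95% CI {low:.2f} to {high:.2f})")
```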
Issues and challenges in meta-analysis
• Conceptual
  • Reductionist - the answer is 42
  • Comparability - apples and oranges
  • Atheoretical - 'flat-earth'
• Technical
  • Heterogeneity
  • Publication bias
  • Methodological quality
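One way the heterogeneity point is commonly examined, sketched here with the same hypothetical studies as above (not from the presentation): Cochran's Q and the I-squared statistic.

```python
# Cochran's Q and I-squared for the hypothetical studies used above.
studies = [(0.30, 0.12), (0.45, 0.20), (0.15, 0.10)]
weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)

q = sum(w * (es - pooled) ** 2 for (es, _), w in zip(studies, weights))
df = len(studies) - 1
i_squared = max(0.0, (q - df) / q) if q > 0 else 0.0
print(f"Q = {q:.2f} on {df} df, I^2 = {i_squared:.0%}")
```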
Schulze, R. (2007) The state and the art of meta-analysis. Zeitschrift für Psychologie / Journal of Psychology, 215, pp. 87-89.
RDI in Quantitative Synthesis
• A collaboration between the Universities of Durham and Birmingham and the Institute of Education, University of London
  • Rob Coe, Mark Newman, James Thomas and Carole Torgerson
• Levels 1 and 2: introductory and practical training
  • Durham, Edinburgh, London, Belfast, York, Cardiff
• Level 3: expert workshops
  • Prof Mark Lipsey, Prof Larry Hedges
• Doctoral support through BERA
References, further reading and information

Books and articles
Borenstein, M., Hedges, L.V., Higgins, J.P.T. & Rothstein, H.R. (2009) Introduction to Meta-Analysis (Statistics in Practice). Oxford: Wiley-Blackwell.
Chambers, E.A. (2004) An introduction to meta-analysis with articles from the Journal of Educational Research (1992-2002). Journal of Educational Research, 98, pp. 35-44.
Cooper, H.M. (1982) Scientific guidelines for conducting integrative research reviews. Review of Educational Research, 52, p. 291.
*Cooper, H.M. (2009) Research Synthesis and Meta-Analysis: A Step-by-Step Approach (4th edition). London: SAGE Publications.
Cronbach, L.J., Ambron, S.R., Dornbusch, S.M., Hess, R.O., Hornik, R.C., Phillips, D.C., Walker, D.F. & Weiner, S.S. (1980) Toward Reform of Program Evaluation: Aims, Methods, and Institutional Arrangements. San Francisco, CA: Jossey-Bass.
Glass, G.V. (2000) Meta-analysis at 25. Available at: http://glass.ed.asu.edu/gene/papers/meta25.html (accessed 9/9/08).
Lipsey, M.W. & Wilson, D.B. (2001) Practical Meta-Analysis. Applied Social Research Methods Series (Vol. 49). Thousand Oaks, CA: SAGE Publications.
Torgerson, C. (2003) Systematic Reviews and Meta-Analysis (Continuum Research Methods). London: Continuum Press.

Websites
What is an effect size?, by Rob Coe: http://www.cemcentre.org/evidence-based-education/effect-size-resources
The meta-analysis of research studies: http://echo.edres.org:8080/meta/
The Meta-Analysis Unit, University of Murcia: http://www.um.es/metaanalysis/
The PsychWiki: Meta-analysis: http://www.psychwiki.com/wiki/Meta-analysis
Meta-Analysis in Educational Research: http://www.dur.ac.uk/education/meta-ed/