Using meta-analyses in your literature review
BERA Doctoral Workshop, 3rd September 2008
Professor Steven Higgins, Durham University


Acknowledgements
This presentation is an outcome of the work of the ESRC-funded Researcher Development Initiative: “Training in the Quantitative Synthesis of Intervention Research Findings in Education and Social Sciences”. The training was designed by Steve Higgins and Rob Coe (Durham University), Carole Torgerson (Birmingham University), and Mark Newman and James Thomas (Institute of Education, London University). The team acknowledges the support of Mark Lipsey, David Wilson and Herb Marsh in the preparation of some of the materials, particularly Lipsey and Wilson’s (2001) “Practical Meta-analysis” and David Wilson’s slides (accessed 9/3/11). The materials are offered to the wider academic and educational community under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported Licence. You should only use the materials for educational, not-for-profit purposes, and you should acknowledge the source in any use.

Aims
– To support understanding of meta-analysis of intervention research findings in education
– To extend understanding of reviewing quantitative research literature
– To describe the techniques and principles involved in meta-analysis, and so support understanding of its benefits and limitations
– To provide references and examples to support further work

ESRC Researcher Development Initiative
Quantitative synthesis of intervention research findings in education
– A collaboration between Durham University, York University and the Institute of Education, London

Why review?
– Ask the person next to you what the purpose of the literature review is in their thesis
– See how many different purposes you can think of
– Join another pair and identify the three you think are most important

Why review?
Summarise existing knowledge: what we know, and how we know it. For what purpose?
– Expectation
– Scenery
– State of the art (summary)
– Positioning (conceptual)
– Progressing knowledge (logic)

The PhD literature review
– Narrative summary of the area
– Grand tour of the concepts and terminology
– Synthesis of empirical findings
– Background to the study

A systematic review
– is usually more comprehensive;
– is normally less biased, being the work of more than one reviewer;
– is transparent and replicable (Andrews, 2005)

Examples of systematic reviews
– EPPI-Centre: UK based, wide range of educational topics
– The Campbell Collaboration: 5 education reviews
– Best Evidence Encyclopedia: Johns Hopkins, aimed at practice

Systematic reviewing
– Key question
– Search protocol
– Inclusion/exclusion criteria
– Coding and mapping
– In-depth review (sub-question)
– Techniques for systematic synthesis

Systematic reviews
Research and policy: specific reviews to answer particular questions
– ‘What works?’ – impact and effectiveness research, with a tendency to focus on quantitative and experimental designs

Literature reviewing – conceptual relations
– Systematic review
– Meta-analysis
– Narrative review

Meta-analysis
Synthesis of quantitative data
– Cumulative
– Comparative
– Correlational
“Surveys” educational research (Lipsey and Wilson, 2001)

Origins
1952: Hans J. Eysenck concluded that there were no favourable effects of psychotherapy, starting a raging debate which 25 years of evaluation research and hundreds of studies failed to resolve.
1978: To prove Eysenck wrong, Gene V. Glass statistically aggregated the findings of 375 psychotherapy outcome studies. Glass (and his colleague Smith) concluded that psychotherapy did indeed work: “the typical therapy trial raised the treatment group to a level about two-thirds of a standard deviation on average above untreated controls; the average person receiving therapy finished the experiment in a position that exceeded the 75th percentile in the control group on whatever outcome measure happened to be taken” (Glass, 2000). Glass called the method “meta-analysis”. (Adapted from Lipsey & Wilson, 2001)

Historical background
The underpinning ideas can be identified earlier:
– K. Pearson (1904): averaged correlations for typhoid mortality after inoculation across 5 samples
– R. A. Fisher (1944): “When a number of quite independent tests of significance have been made … although few or none can be claimed individually as significant, yet the aggregate gives an impression that the probabilities are on the whole lower than would often have been obtained by chance” (p. 99) – the source of the idea of cumulating probability values
– W. G. Cochran (1953): discusses a method of averaging means across independent studies; set out much of the statistical foundation for meta-analysis (e.g. inverse variance weighting and homogeneity testing)
(Adapted from Lipsey & Wilson, 2001)
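Fisher's idea of cumulating probability values has a direct computational form. A minimal sketch in Python (the five p-values are invented for illustration): under the joint null hypothesis, −2 Σ ln(pᵢ) follows a chi-squared distribution with 2k degrees of freedom.

```python
import numpy as np
from scipy import stats

def fishers_method(p_values):
    """Combine independent p-values via Fisher's method."""
    p = np.asarray(p_values, dtype=float)
    statistic = -2.0 * np.sum(np.log(p))            # -2 * sum of log p-values
    combined_p = stats.chi2.sf(statistic, df=2 * len(p))
    return statistic, combined_p

# Five studies, none individually significant at 0.05...
stat, p = fishers_method([0.08, 0.11, 0.06, 0.09, 0.10])
print(round(stat, 2), round(p, 4))  # ...yet the aggregate is unlikely under chance
```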

Significance versus effect size
The traditional test is of statistical ‘significance’: the difference is unlikely to have occurred by chance. However, a significant difference may not be:
– large
– important, or even
– educationally ‘significant’

The rationale for using effect sizes
Traditional reviews focus on statistical significance testing
– Highly dependent on sample size
– A null finding does not carry the same “weight” as a significant finding
Meta-analysis focuses on the direction and magnitude of the effects across studies
– From “Is there a difference?” to “How big is the difference?”
– Direction and magnitude are represented by the “effect size”

Comparison of impact
– Same AND different measures
– Significance vs effect size: ‘Does it work?’ vs ‘How well does it work?’
– Effect size

‘Effect size’
A standardised way of looking at gain scores. Different methods of calculation, e.g.:
(experimental group mean − control group mean) / standard deviation

What is “effect size”?
A standardised way of looking at difference. Different methods of calculation:
– Odds ratio
– Correlational (Pearson’s r)
– Standardised mean difference: the difference between the control and intervention groups as a proportion of the dispersion of scores

Calculating effect size
Experimental group gain minus control group gain, divided by the (pooled) standard deviation of the groups
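A minimal sketch of this calculation in Python, using the pooled standard deviation as the denominator (one common choice of denominator; the gain scores below are invented for illustration):

```python
import numpy as np

def standardised_mean_difference(experimental, control):
    """Standardised mean difference between two sets of gain scores,
    using the pooled standard deviation as the denominator."""
    e = np.asarray(experimental, dtype=float)
    c = np.asarray(control, dtype=float)
    n_e, n_c = len(e), len(c)
    # Pooled SD: weighted average of the two sample variances
    sd_pooled = np.sqrt(((n_e - 1) * e.var(ddof=1) + (n_c - 1) * c.var(ddof=1))
                        / (n_e + n_c - 2))
    return (e.mean() - c.mean()) / sd_pooled

# Invented gain scores for an intervention class and a control class
print(round(standardised_mean_difference([12, 15, 14, 18, 11, 16, 13],
                                         [10, 12, 9, 14, 11, 10, 13]), 2))
```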

Effect size and impact
From: Marzano, R. J. (1998) A Theory-Based Meta-Analysis of Research on Instruction. Aurora, Colorado: Mid-continent Regional Educational Laboratory (accessed 2/9/08).

Interpreting effect sizes
– Relative effects: the average effect is about 0.4 (Sipe and Curlette, 1997; Hattie, Biggs and Purdie, 1996)
– Doing something different makes a difference
– Visualising the difference

How much is the impact?
– 0.1 = a percentile gain of 4 points, i.e. a class ranked 50th in a league table of 100 schools would move to about 46th place
– 0.5 = a percentile gain of 20 points, i.e. a move from 50th to 30th place
– 1.0 = a percentile gain of 34 points, i.e. a move from 50th to 16th place
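These conversions assume normally distributed outcomes: the percentile gain is Φ(d) × 100 − 50, where Φ is the standard normal CDF. A short sketch (the figures match the slide to within rounding):

```python
from scipy.stats import norm

def percentile_gain(effect_size):
    """Percentile-point gain for someone at the control-group median,
    assuming normally distributed outcomes."""
    return 100 * norm.cdf(effect_size) - 50

for es in (0.1, 0.5, 1.0):
    print(es, round(percentile_gain(es)))  # 0.1 -> 4, 0.5 -> 19, 1.0 -> 34
```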

Other interpretations (Cohen, 1969)
– 0.2 “small” = the difference in height between 15 and 16 year olds
– 0.5 “medium” = the difference in height between 14 and 18 year olds
– 0.8 “large” = the difference in height between 13 and 18 year olds

Meta-analysis
– Key question
– Search protocol
– Inclusion/exclusion criteria
– Coding
– Statistical exploration of findings: mean, distribution, sources of variance (the pooling step is sketched below)
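One common way the ‘mean’ step is computed is a fixed-effect, inverse-variance weighted average: each study's effect is weighted by the inverse of its variance, so more precise studies count for more. A minimal sketch with invented study data:

```python
import numpy as np

def pool_fixed_effect(effects, variances):
    """Fixed-effect meta-analysis: inverse-variance weighted mean
    of study effect sizes, with its standard error."""
    d = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)  # precise studies weigh more
    pooled = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, se

# Invented effect sizes (d) and variances for five studies
effects = [0.30, 0.55, 0.10, 0.42, 0.25]
variances = [0.02, 0.05, 0.01, 0.04, 0.03]
pooled, se = pool_fixed_effect(effects, variances)
print(round(pooled, 3), round(se, 3))
```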

Some findings from meta-analysis
– Pearson et al. (2005): research articles with 89 effects ‘related to digital tools and learning environments to enhance literacy acquisition’. A positive weighted effect size, indicating technology can have a positive impact on reading comprehension.
– Bernard et al. (2004): distance education and classroom instruction studies, 688 effects. A wide range of effects (‘heterogeneity’); asynchronous DE more effective than synchronous.

More findings
Hattie and Timperley (2007) ‘The Power of Feedback’: a synthesis of other meta-analyses on feedback to provide a conceptual review. 196 studies, 6,972 effects; the average effect of feedback on learning was 0.79.

Rank (or guess) some effect sizes…
– Formative assessment
– CASE (Cognitive Acceleration through Science Education)
– Individualised instruction
– ICT
– Homework
– Direct instruction

Rank order of effect sizes
– 1.04 CASE (Cognitive Acceleration through Science Education) (boys’ science GCSE – Adey & Shayer, 1991)
– 0.6 Direct instruction (Sipe & Curlette, 1997)
– 0.43 Homework (Hattie, 1999)
– 0.32 Formative assessment (KMOFAP)
– 0.31 ICT (Hattie, 1999)
– 0.1 Individualised instruction (Hattie, 1999)

‘Super-syntheses’
Syntheses of meta-analyses: the relative effects of different interventions. Either assumes that variation evens out across studies with a large enough dataset (Marzano/Hattie), or attempts to control for the variation statistically (Sipe & Curlette).

Synthesis of study skills interventions
Hattie, Biggs and Purdie (1996): a meta-analysis of 51 studies of study skills interventions. Categorised the interventions using the SOLO model (Biggs & Collis, 1982), classifying studies into four hierarchical levels of structural complexity and as either ‘near’ or ‘far’ transfer. The results support situated cognition: training for anything other than simple mnemonic tasks should be in context, use tasks within the same domain as the target content, and promote a high degree of learner activity and metacognitive awareness. (Average effect: 0.4)

Sipe and Curlette (1997)
“A metasynthesis of factors relating to educational achievement”: testing Walberg’s ‘educational productivity’ model through a synthesis of 103 meta-analyses.

‘Theory driven’ (Marzano, 1998)
Self system – metacognition – cognition/knowledge
– Self
– Metacognitive 0.72
– Cognitive 0.55

Discussion
– Work with a colleague to put the statements in order of how comparable you think the research findings are
– Join another pair (or pairs) and decide how comfortable you would be with comparing the findings

Issues and challenges in meta-analysis
Conceptual
– Reductionist: “the answer is 42”
– Comparability: apples and oranges
– Atheoretical: ‘flat-earth’
Technical
– Heterogeneity
– Publication bias
– Methodological quality

Reductionist or ‘flat earth’ critique
The “flat earth” criticism is based on Lee Cronbach’s assertion that a meta-analysis looks at the “big picture” and provides only a crude average. According to Cronbach: “… some of our colleagues are beginning to sound like a Flat Earth Society. They tell us that the world is essentially simple: most social phenomena are adequately described by linear relations; one-parameter scaling can discover coherent variables independent of culture and population; and inconsistencies among studies of the same kind will vanish if we but amalgamate a sufficient number of studies… The Flat Earth folk seek to bury any complex hypothesis with an empirical bulldozer…” (Cronbach, 1982, in Glass, 2000).

Comparability
Apples and oranges?
– Same test
– Different measures of the same construct
– Different measures of different constructs
– What question are you trying to answer?
– How strong is the evidence for this?
“Of course it mixes apples and oranges; in the study of fruit, nothing else is sensible; comparing apples and oranges is the only endeavor worthy of true scientists; comparing apples to apples is trivial” (Glass, 2000).

Empirical not theoretical?
What is your starting point? The conceptual/theoretical critique:
– Marzano
– Hattie
– Sipe and Curlette

Technical issues
– Interventions
– Publication bias
– Methodological quality
– Sample size
– Homogeneity/heterogeneity

Interventions
“Super-realisation bias” (Cronbach et al., 1980)
– Small-scale interventions tend to get larger effects
– Enthusiasm, attention to detail, quality of personal relationships

Publication bias
– Statistically significant (positive) findings are more likely to be published
– Smaller studies need a larger effect size to reach significance, so the published record tends towards larger effects
– A ‘funnel plot’ is sometimes used to explore this: a scatterplot of the effects from individual studies (horizontal axis) against a measure of study size or precision (vertical axis), as sketched below
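A minimal sketch of a funnel plot using simulated studies (all values invented). With no publication bias the points form a symmetric funnel around the true effect; asymmetry, especially missing points among the small, imprecise studies, suggests bias:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Simulated studies: true effect 0.3; smaller studies give noisier estimates
n = rng.integers(20, 400, size=60)   # per-group sample sizes
se = np.sqrt(2.0 / n)                # rough standard error of d for equal groups
d = rng.normal(0.3, se)              # each study's observed effect

plt.scatter(d, se)
plt.gca().invert_yaxis()             # large, precise studies plotted at the top
plt.axvline(0.3, linestyle="--")     # the true effect
plt.xlabel("Effect size (d)")
plt.ylabel("Standard error")
plt.title("Funnel plot")
plt.show()
```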

Methodological quality
Traditional reviews privilege methodological rigour, but the evidence on quality and effect size is mixed:
– Low quality studies show higher effect sizes (Hattie, Biggs & Purdie, 1996)
– No difference (Marzano, 1998)
– High quality studies show higher effect sizes (Lipsey & Wilson, 1993)
It depends on your definition of quality.

Sample size
“Median effect sizes for studies with sample sizes less than 250 were two to three times as large as those of larger studies.” (Slavin & Smith, 2008)

Heterogeneity
– Variation in effect sizes across studies
– Pooling rests on the assumption that the effect will be consistent
– Investigate to find clusters (moderator variables); the usual test is sketched below
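The conventional check is Cochran's Q (does the observed variation exceed what chance alone would produce?) together with the I² statistic (the share of variation beyond chance). A minimal sketch, reusing the invented studies from the pooling example:

```python
import numpy as np
from scipy import stats

def heterogeneity(effects, variances):
    """Cochran's Q test and I-squared for a set of study effect sizes."""
    d = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(w * d) / np.sum(w)
    q = np.sum(w * (d - pooled) ** 2)      # Cochran's Q
    df = len(d) - 1
    p = stats.chi2.sf(q, df)               # chance of this much variation
    i2 = 100 * max(0.0, (q - df) / q)      # % of variation beyond chance
    return q, p, i2

effects = [0.30, 0.55, 0.10, 0.42, 0.25]
variances = [0.02, 0.05, 0.01, 0.04, 0.03]
print(heterogeneity(effects, variances))
```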

Questions and reactions
– With a colleague, see if you can identify a question arising from the presentation so far
– What is your reaction to the technique?
– How useful is it, generally and to your own work?

Strengths of meta-analysis
– Uses explicit rules to synthesise research findings
– Can find relationships across studies which may not emerge in qualitative reviews
– Does not (usually) exclude studies on methodological quality to the same degree as traditional methods
– Statistical data are used to determine whether relationships between constructs need clarifying
– Can cope with large numbers of studies which would overwhelm traditional methods of review

Summary
– A “replicable and defensible” method for synthesising findings across studies (Lipsey & Wilson, 2001)
– Identifies gaps in the literature, providing a sound basis for further research
– Indicates the need for replication in education
– Facilitates identification of patterns in the accumulating results of individual evaluations
– Provides a frame for theoretical critique

Other approaches to synthesis
– Narrative
– Quantitative (meta-analysis)
– Best-evidence synthesis (Slavin)
– Realist (Pawson)
– Meta-ethnography (Noblit & Hare)
– Thematic synthesis (Thomas & Harden)
– Grounded theory

Suggestions
– Be explicit about your rationale
– Be systematic (or at least methodical)
– Be transparent
– Describe
– Analyse (content and methodology)
– Synthesise

A (narrative) metaphor…
The literature review as rhetoric: an act of persuasion. Introduce your study…

Some useful websites
– EPPI-Centre, Institute of Education, London
– The Campbell Collaboration
– Best Evidence Encyclopedia, Johns Hopkins
– Best Evidence Synthesis (BES), NZ
– Institute for Effective Education (York)

Further training
ESRC RDI in quantitative synthesis: one-day training sessions
– Introduction (for interpretation)
– Methods training (for application)
– Issues seminars (methodological issues)
Venues: Durham, London, Edinburgh, Bristol, Belfast, York

References
Bernard, R.M., Abrami, P.C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P.A., Fiset, M. & Huang, B. (2004) How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3).
Chambers, E.A. (2004) An introduction to meta-analysis with articles from the Journal of Educational Research. Journal of Educational Research, 98.
Cronbach, L.J., Ambron, S.R., Dornbusch, S.M., Hess, R.O., Hornik, R.C., Phillips, D.C., Walker, D.F. & Weiner, S.S. (1980) Toward Reform of Program Evaluation: Aims, Methods, and Institutional Arrangements. San Francisco, CA: Jossey-Bass.
Glass, G.V. (2000) Meta-analysis at 25. (Accessed 9/9/08)
Hattie, J.A. (1987) Identifying the salient facets of a model of student learning: a synthesis of meta-analyses. International Journal of Educational Research, 11.
Hattie, J.A. (1992) Measuring the effects of schooling. Journal of Education, 36, pp. 5-13.
Hattie, J., Biggs, J. & Purdie, N. (1996) Effects of learning skills interventions on student learning: a meta-analysis. Review of Educational Research, 66(2).
Hattie, J. & Timperley, H. (2007) The power of feedback. Review of Educational Research, 77(1), pp. 81-112.
Lipsey, M.W. & Wilson, D.B. (2001) Practical Meta-Analysis. Applied Social Research Methods Series (Vol. 49). Thousand Oaks, CA: SAGE Publications.
Marzano, R.J. (1998) A Theory-Based Meta-Analysis of Research on Instruction. Aurora, CO: Mid-continent Regional Educational Laboratory. (Accessed 2/9/08)
Pearson, D.P., Ferdig, R.E., Blomeyer, R.L. & Moran, J. (2005) The Effects of Technology on Reading Performance in the Middle-School Grades: A Meta-Analysis with Recommendations for Policy. Naperville, IL: University of Illinois/North Central Regional Educational Laboratory.
Sipe, T. & Curlette, W.L. (1997) A meta-synthesis of factors related to educational achievement: a methodological approach to summarizing and synthesizing meta-analyses. International Journal of Educational Research.
Slavin, R.E. & Smith, D. (2008) Effects of sample size on effect size in systematic reviews in education. Paper presented at the annual meetings of the Society for Research on Effective Education, Crystal City, Virginia, March 3-4, 2008.

Acknowledgements
This presentation is an outcome of the work of the ESRC-funded Researcher Development Initiative: “Training in the Quantitative Synthesis of Intervention Research Findings in Education and Social Sciences”. The training was designed by Steve Higgins and Rob Coe (Durham University), Carole Torgerson (Birmingham University), and Mark Newman and James Thomas (Institute of Education, London University). The team acknowledges the support of Mark Lipsey, David Wilson and Herb Marsh in the preparation of some of the materials, particularly Lipsey and Wilson’s (2001) “Practical Meta-analysis” and David Wilson’s slides (accessed 9/3/11). The materials are offered to the wider academic and educational community under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported Licence. You should only use the materials for educational, not-for-profit purposes, and you should acknowledge the source in any use.