Presentation on theme: "Writing Reviews Worth Reading"— Presentation transcript:
1 Writing Reviews Worth Reading
Tina Jones, Gavin Leslie, Andrea Marshall
2 The Traditional Literature Review versus a Systematic Review
Dr Tina Jones, Manager, Australian Centre for Evidence Based Clinical Practice, FMC; Senior Lecturer, Flinders University
3 An Evidence Based Approach to Clinical Practice
- The Evidence Based Practice (EBP) framework emerged in the early 1970s as a means of improving clinical practice
- It shifted decision-making from a culture of delivering care based on tradition, intuition and authority
- to a framework where clinical decisions are based on the best available evidence
4 Where do nurses find the evidence?
- 38.7% use nursing journals (Estabrooks 1998); more than 400 nursing journals are listed
- Access barriers:
  - Lack of time and easy access
  - Lack of evidence related to nursing
  - Lack of confidence in the ability to locate, understand and properly evaluate research
  - Lack of organisational infrastructure support (Thompson et al 2001; Nagy et al 2001)
5 A Traditional Literature Review
- Narrative in style
- Written by clinical experts or individuals who have read widely on a particular topic
- Provides a qualitative summary of the literature
- Uses an informal and subjective approach to identifying and appraising the literature
6 Davies H. & Leslie G. Maintaining the CRRT circuit: non-anticoagulant alternatives. ACC; 19(4):
7 Limitations of the Traditional Review
- Traditional reviews have been criticised as haphazard and biased, subject to the idiosyncratic impressions of the individual reviewer (Mulrow 1987)
- Potential sources of bias:
  - Lack of explicit inclusion or exclusion criteria for the studies cited
  - Reviewer bias towards particular views/practices
  - No mention of the methods used to assess studies, or of their variable quality
  - Lack of rigour in both data extraction and synthesis of findings
8 Limitations of the Traditional Review
- Mulrow (1987) identified 50 reviews in 4 major medical journals (Ann of Int Med, Arch of Int Med, JAMA, NEJM)
- Criteria: purpose of review, search methods and sources, reasons for inclusions and exclusions, methodological assessment, explication of data inconsistencies, integration of findings, summary of pertinent findings, directives for new research
- 'Current medical reviews do not routinely use (such) scientific methods to identify, assess, and synthesise information' (Mulrow 1987)
- The more authoritative the expert writing the review, the lower the quality of the review (Oxman & Guyatt 1995)
9 Systematic Reviews
A review of the relevant literature on a focused clinical question that employs explicit methods to minimise bias in:
- the conduct of the literature search
- the appraisal of the individual studies
- the methods for pooling and summarising data
10 Gardner A. et al. (2005) Best practice in stabilisation of oral endotracheal tubes: a systematic review. ACC; 18(4):
11 Purpose of Systematic Reviews
"The purpose of a systematic literature review is to evaluate and interpret all available research evidence relevant to a particular question" (NHMRC, 2000)
12 Why do we need Systematic Reviews?
- Invaluable to health care providers, researchers and policy makers, who are inundated with unmanageable amounts of information
- A powerful means of integrating the best research evidence
- Provide a rational basis for health care decisions
- Have the potential to improve healthcare outcomes
13 Why do we need Systematic Reviews?
- To avoid the personal bias of an author
- 'How to Live Longer and Feel Better' by Linus Pauling (1986) claimed that large doses of vitamin C prevent colds
- A systematic review showed that vitamin C, in doses as high as 1 g/day over several winter months, had no consistent beneficial effect on the incidence of the common cold (Douglas et al 2003)
16 Why Systematic Reviews Matter
- Allow data to be 'pooled' and summarised, increasing statistical 'power'
- A meta-analysis is a statistical technique for combining the findings of two or more studies in order to provide a single quantitative estimate of the overall treatment effect
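To make the pooling idea concrete, here is a minimal sketch of a fixed-effect (inverse-variance) meta-analysis in Python. Each study is weighted by the inverse of its variance, so larger, more precise studies contribute more to the combined estimate. The study effects and variances below are hypothetical, purely for illustration:

```python
import math

def pooled_estimate(effects, variances):
    """Fixed-effect (inverse-variance) pooled estimate.

    Weights each study effect by 1/variance and returns the pooled
    effect with its 95% confidence interval. This is a sketch of the
    simplest meta-analytic model, not a full meta-analysis package.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of pooled effect
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical log risk ratios and their variances from three trials
effects = [-0.35, -0.10, -0.22]
variances = [0.04, 0.09, 0.02]
pooled, (lo, hi) = pooled_estimate(effects, variances)
```

Note how the pooled confidence interval is narrower than any single study's would be: this is the gain in statistical power that the slide refers to.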
17 The Power of Systematic Reviews
- In 1972 the first RCT of prenatal corticosteroids given to women at risk of preterm birth was published
- Within 10 years, 7 more trials had been completed, all showing benefit
- Because no-one had collated these findings until 1989, there was limited use of prenatal corticosteroids until the early 1990s
20 NHMRC Levels of Evidence
Level I     Systematic review of Level II studies
Level II    Randomised controlled trial
Level III-1 Pseudo-randomised controlled trial
Level III-2 Comparative study with concurrent controls
Level III-3 Comparative study without concurrent controls
Level IV    Case series
(NHMRC, 1999)
21 Who conducts systematic reviews & where to find them?
Basic steps in the systematic review process: question identification, literature review, and compilation of data.
Assoc Prof Gavin D Leslie, Editor, Australian Critical Care
22 Who conducts systematic reviews & where to find them?
- Anyone can conduct a review
- Numerous EBP 'centres' specialise in reviews, e.g.:
  - Cochrane Collaboration (medical)
  - Joanna Briggs Institute (nursing, midwifery)
  - PEDro (physiotherapy)
- Remember: not all reviews are the same!
24 Cochrane Collaboration http://www.cochrane.org.au/
- Best known (est 1996) international collaboration
- Over 2000 reviews available
- Predominantly based on RCTs
- Medically orientated
- Strict criteria and process
26 Joanna Briggs Institute http://www.joannabriggs.edu.au/about/home.php
- Australian-based international group
- Nursing & midwifery focussed
- Different criteria and approach to Cochrane, although the principles are the same
- Well known for 'Practice Summaries'
33 Basic steps in the systematic review process – literature review
Search strategy:
- Use a thorough search strategy
- Describe your search strategy and list key words
- List databases accessed, reference lists, other literature sources, and languages
Eliminate inclusion bias:
- Devise a selection process prospectively, with inclusion and exclusion criteria
- Specify study designs and levels of evidence (RCTs etc.)
35 Basic steps in the systematic review process – literature review
Process of critical appraisal, data extraction & synthesis:
- Minimise errors (use checklists), identify tools (objective criteria or measures), and use 2 independent reviewers (with a third as arbiter)
- Provide details of all studies reviewed, both included and excluded; this is important in judging the integrity of the review
- These details are provided in the full review, not usually in the 'article'-length version
36 Basic steps in the systematic review process – compilation of data
- Evaluate studies cumulatively using statistical assessment
- Forest plots, risk ratios, CIs
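As a sketch of what that statistical assessment involves, the risk ratio and 95% confidence interval from a single trial's 2x2 table can be computed as follows. Each study's RR and CI would form one row of a forest plot; the event counts here are hypothetical, chosen only for illustration:

```python
import math

def risk_ratio(events_t, n_t, events_c, n_c):
    """Risk ratio with 95% CI from one trial's 2x2 table.

    The CI is computed on the log scale using the standard
    approximation for the variance of log(RR), then exponentiated.
    """
    rr = (events_t / n_t) / (events_c / n_c)
    se_log = math.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Hypothetical trial: 20/100 events in treatment arm, 30/100 in control
rr, lo, hi = risk_ratio(20, 100, 30, 100)
```

In this example the confidence interval crosses 1, so on its own the trial would not show a statistically significant effect; pooling it with similar trials in a meta-analysis might.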
37 New developments in systematic reviews
Andrea Marshall, Sesqui Senior Lecturer Critical Care, The University of Sydney; Associate Editor, Australian Critical Care
38 Advantages of systematic reviews
- Limit bias
- Conclusions are more reliable and accurate
- Assimilate large amounts of information
- Studies are formally compared, establishing the generalisability of findings and consistency of results
- Identify inconsistencies in results and generate new hypotheses
- Increase the precision of the overall result
39 Limitations of Systematic Reviews
"Meta-analyses are often used to recover something from poorly designed studies, studies of insignificant statistical power, studies that give erratic results and those resulting in apparent contradictions. If a medical treatment has an effect so recondite and obscure as to require meta-analysis to establish it, I would not be happy to have it used on me. It would seem better to improve treatment, and the theory underlying the treatment."
HJ Eysenck (1995) in: Chalmers I. & Altman D.G. (eds) Systematic Reviews. BMJ Publishing Group, London, p. 73
40 Limitations of systematic reviews
- Combine studies of varying quality, from the excellent to the appalling
- May be inconsistent with high-quality RCTs
- Important distinctions between primary studies may be lost (inclusion/exclusion criteria or intervention)
- May reach dissimilar conclusions depending on the 'review question'
- Over-generalisation may make it difficult for practitioners to apply the results
41 Bias in systematic reviews
- Publication bias: negative studies tend not to get published; data from one study can be available in multiple formats
- Selection bias: inclusion criteria
- Language bias: predominantly English-language studies are included
42 When systematic reviews disagree with each other
Reviews can differ in their:
- Aim
- Methods
- Judgement of the quality of studies
- Summing up of the evidence
Systematic reviews include an element of judgement, whatever method is used
43 Quality of Reporting of Meta-analyses (QUOROM)
Encourages authors to provide readers with information regarding:
- searches
- selection
- validity assessment
- data abstraction
- study characteristics
- quantitative data synthesis
- trial flow
45 Qualitative research & SRs
- Cochrane Qualitative Research Methods Group
  - Aims to provide guidance on methodological standards
  - Aims to publish a protocol for conducting SRs of qualitative evidence
- There is a place for qualitative research in reviews of evidence, but how can it be incorporated?
46 Qualitative Systematic Reviews
- Searching the qualitative literature
- Appraising the qualitative literature
- Synthesizing the qualitative evidence
- Linking with existing quantitative evidence
47 Appraising qualitative research evidence
- Should we?
- How is it done?
- Criteria, or is this imposing a positivist framework?
- Ultimately, a means of determining the quality of qualitative research is needed
48 Quality criteria
- Hundreds of quality criteria exist, but there is huge variation among them and little agreement (Dixon-Woods et al 2004; NHS CRD 2001)
- Some criteria enforce values of the positivist paradigm: Seale & Silverman (1997) emphasise detailed transcription, supporting data with counts of events (quasi-statistics) and computer software
- But Popay et al (1998) prioritise subjectivity, flexibility and adequate description, arguing that computer software and quasi-statistics are neither necessary nor sufficient for rigorous qualitative research
- Some criteria argue qualitative research should aim to be reproducible and that multiple coding (e.g. kappa statistics) is a good way of assessing its quality (Engel & Kuzel 1992; Strauss & Corbin 1998), while others deem such criteria meaningless (Morse 1994; Yardley 2000)
- On what grounds are these criteria based?
49 Quality
- Philosophical underpinnings impact on criteria (some are incompatible)
- Procedural aspects should be considered
50 Good quality: indicators based on our findings
- Clear reporting of methods (sampling, data collection and analysis): transparency
- Adequate presentation of the data: grounded in the data
- In-depth, interpretative analysis of the data
- Clarity throughout, with integration of research questions, methods, data, results and conclusions drawn: trustworthiness
51 Good quality
- Analysis: trustworthiness
- Grounded in the data
- Integration of research question, methods, data, results, and conclusions
52 Further work to develop SRs of qualitative studies
Further development is needed:
- to devise criteria that recognise the diversity of study designs and theoretical perspectives in qualitative research (theoretical perspectives are hugely influential)
- to distinguish between fatal flaws and minor errors
- to enable the application of high-quality qualitative research in teaching and practice
54 Cochrane or cock-eyed? How should we conduct systematic reviews of qualitative research?
Andrew Booth, Senior Lecturer in Evidence Based Healthcare Information, School of Health and
Paper presented at the Qualitative Evidence-based Practice Conference, Taking a Critical Stance. Coventry University, May

Abstract: The quantitative versus qualitative debate has taken significant steps towards reconciliation within the wider field of evidence based practice. Nevertheless, far more insidious discrimination remains. Systematic review methodology exhibits all the characteristics of "institutionalised quantitativism", in that criteria for a "good" review are almost entirely determined by the quantitative methods promoted and perpetuated by the Cochrane Collaboration. Nobody who understands qualitative research would insist that its primary studies demonstrate alien concepts such as "sample size" or "statistical power". Yet comparably fundamental absurdities persist with regard to qualitative syntheses. Why should systematic reviewers of qualitative research pursue a "gold standard" comprehensive literature search when concepts such as "data saturation" have an established pedigree? Why should they apologise for an absence of meta-analysis when little-known techniques such as meta-ethnography could be included in a reviewer's toolbox? Why shouldn't they apply systematic, explicit and reproducible principles of thematic or concept analysis to create syntheses that advance our understanding of qualitative issues and highlight research gaps? The author draws on experience of a dozen systematic reviews, a third of them qualitative, to suggest how systematic reviews of qualitative research might acquire a methodology that is more sympathetic to the paradigm within which they are conducted.