Transparency and accuracy in reporting health research
Doug Altman, The EQUATOR Network, Centre for Statistics in Medicine, Oxford, UK


Transparency and value
• Research only has value if
  – Study methods have validity
  – Research findings are published in a usable form

Research article
• A published research article is often the only permanent record of a research study
  – Some readers might be satisfied with scanning an article, or a brief summary
  – Others will study it in detail for possible inclusion in a systematic review or to influence a clinical practice guideline
• Only an adequately reported research study can be fully appraised and used appropriately
  – to assess reliability and relevance

Research article
• Readers need a clear understanding of exactly what was done and what was found
  – Clinicians, researchers, systematic reviewers, policy makers, …
• The goals should be transparency and accuracy
  – Should allow replication (in principle)
  – Can be included in systematic review and meta-analysis
  – Should not mislead

What do we mean by poor reporting?
• Key information is missing, incomplete or ambiguous
  – methods and findings
• Misrepresentation of the study
• Misleading interpretation

Of particular concern:
• Non-publication of whole studies
• Selective reporting of methods or findings

Taxonomy of poor reporting
• Non-reporting: failure to publish a report of a completed study (even if it was presented at a conference)
• Selective reporting: biased reporting of data within a published report
• Incomplete reporting: key information is missing
• Misleading presentation: e.g. claiming a study is an RCT when it isn't; post hoc change of focus ("spin")
• Inconsistencies between sources: e.g. the publication conflicts with the protocol

All are very common.

Incomplete reporting of research is very common
• Hundreds of published reviews show that key elements of trial methods and findings are commonly missing from journal reports
• We often cannot tell exactly how the research was done
• These problems are generic
  – not specific to randomised trials
  – not specific to studies of medicines
  – not specific to commercially sponsored research

Incomplete reporting of research is very common
"In 37% of papers patient numbers were inadequately reported; 20% of papers introduced new statistical methods in the 'results' section not previously reported in the 'methods' section, and 23% of papers reported no measurement of error with the main outcome measure." [Parsons et al, J Bone Joint Surg Br 2011]

5/228 trials (2%) met all 7 CONSORT criteria reviewed [J Am Coll Surg 2013]:
• 52% specified a primary outcome
• 43% reported attrition (loss to follow-up)
• 36% reported information about blinding
• 28% described the randomization scheme
• 27% described allocation concealment
• 22% described an adequate power calculation
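The compliance figures above are simple proportions over a set of audited trials. As a minimal sketch of how such a CONSORT-compliance audit could be tabulated (the criteria names and trial data below are hypothetical, not the actual J Am Coll Surg 2013 dataset):

```python
# Hypothetical CONSORT-compliance audit. Each trial is represented by the
# set of criteria it reported adequately; data are purely illustrative.
from collections import Counter

CRITERIA = [
    "primary_outcome", "attrition", "blinding",
    "randomization", "allocation_concealment", "power_calculation",
]

trials = [
    {"primary_outcome", "attrition", "blinding"},
    {"primary_outcome"},
    set(CRITERIA),  # one fully compliant trial
    {"attrition", "power_calculation"},
]

def compliance(trials, criteria):
    """Return per-criterion reporting rates and the fully-compliant rate."""
    n = len(trials)
    counts = Counter(c for t in trials for c in t if c in criteria)
    per_criterion = {c: counts[c] / n for c in criteria}
    fully_compliant = sum(1 for t in trials if set(criteria) <= t) / n
    return per_criterion, fully_compliant

rates, full = compliance(trials, CRITERIA)
print(f"fully compliant: {full:.0%}")
for name, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {rate:.0%}")
```

The same per-criterion/fully-compliant distinction explains why the headline figure (2%) can be so much lower than any single criterion's rate: a trial must satisfy every criterion at once to count as fully compliant.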

Ecological studies
The quality of modern cross-sectional ecologic studies: a bibliometric review [Dufault & Klar, Am J Epidemiol 2011] (N=125)
"• Most investigators who adjusted their outcomes for age or sex did so improperly (64%)
• Statistical validity was a potential issue for 20% of regression models
• Many authors omitted important information when discussing the ecologic nature of their study (31%), the choice of study design (58%), and the susceptibility of their research to the ecological fallacy (49%)."

"Spin"
Review of breast cancer trials:
"… spin was used frequently to influence, positively, the interpretation of negative trials, by emphasizing the apparent benefit of a secondary end point. We found bias in reporting efficacy and toxicity in 32.9% and 67.1% of trials, respectively, with spin and bias used to suggest efficacy in 59% of the trials that had no significant difference in their primary endpoint." [Vera-Badillo et al, Ann Oncol 2013]

Inconsistency between sources
Comparison of content of RCT reports in surgical journals and trial registry entries (n=51) [Rosenthal & Dwan, Ann Surg 2013]

                                          Primary   Secondary
  No discrepancy                            55%       33%
  Complete omission                          8%       31%
  New introduction                           8%       39%
  Change in definition                      10%        6%
  Downgrading from primary to secondary     22%        –
  Upgrading from secondary to primary         –       14%
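The categories in this comparison follow mechanically from matching each outcome in the registry entry against the publication. A small sketch of that classification logic (the function name and the example outcomes are hypothetical, for illustration only):

```python
# Illustrative sketch: classify discrepancies between outcomes declared in a
# trial registry entry and those in the publication, in the spirit of the
# Rosenthal & Dwan comparison. Data and names are invented for illustration.
def classify(registry, publication):
    """registry / publication: dicts mapping outcome name -> "primary"/"secondary".
    Returns a dict mapping discrepancy type -> set of outcome names."""
    result = {
        "no_discrepancy": set(), "omitted": set(), "newly_introduced": set(),
        "downgraded": set(), "upgraded": set(),
    }
    for name, level in registry.items():
        pub_level = publication.get(name)
        if pub_level is None:
            result["omitted"].add(name)            # registered but never reported
        elif pub_level == level:
            result["no_discrepancy"].add(name)
        elif level == "primary" and pub_level == "secondary":
            result["downgraded"].add(name)
        else:
            result["upgraded"].add(name)
    for name in publication:
        if name not in registry:
            result["newly_introduced"].add(name)   # appears only in the paper
    return result

registry = {"mortality": "primary", "pain": "secondary", "cost": "secondary"}
publication = {"mortality": "secondary", "pain": "primary", "qol": "primary"}
d = classify(registry, publication)
```

In this fictional example, "mortality" is downgraded, "pain" is upgraded, "cost" is omitted, and "qol" is newly introduced — exactly the discrepancy types tallied in the table above.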

Consequences of inadequate reporting
• Assessing the reliability of published articles is seriously impeded by inadequate reporting
  – Clinicians cannot judge whether to use a treatment
  – Data cannot be included in a systematic review
• Serious consequences for clinical practice, research, policy making, and ultimately for patients

Poor reporting is a serious problem for systematic reviews and clinical guidelines
• "Risk of bias assessment was hampered by poor reporting of trial methods." [Meuffels et al. Computer assisted surgery for knee ligament reconstruction, CDSR 2011]
• "Poor reporting of interventions impeded replication." [Gordon and Findlay. Educational interventions to improve handover in health care: a systematic review. Med Educ 2011]
• "15 trials met the inclusion criteria for this review but only 4 could be included as data were impossible to use in the other 11." [Nolte et al. Amphetamines for schizophrenia. CDSR 2004]
• "Poor reporting of data meant that individual effect size could not be calculated for any of these studies." [Bleakley et al. Some conservative strategies are effective when added to controlled mobilisation with external support after acute ankle sprain: a systematic review. Aust J Physiother 2008]

We need research we can rely on
• "Assessment of reliability of published articles is a necessary condition for the scientific process." [Ziman. Reliable Knowledge, 1978]
• "… clinical research involving human participants can only be justified ethically when such experiments are done to produce generalizable knowledge." [Korn & Ehringhaus. PLoS Clin Trials 2006]
• Authors (and journals) have an obligation to ensure that research is reported adequately.

Reporting research is not a new concern, but it is a relatively neglected one
"… incompleteness of evidence is not merely a failure to satisfy a few highly critical readers. It not infrequently makes the data that are presented of little or no value." [Mainland. The treatment of clinical and laboratory data, 1938]

[Altman and Moher, BMJ 2013]
