Common Statistical Issues — Andy Vail, MDSG meeting, 8th July 2013


Outline
MDSG templates
Protocol stage
−Literature review, Structure of comparisons, Outcomes, Risk of Bias, Heterogeneity, Subgroup & Sensitivity analyses
Review/Update stage
−Exclusions, Description, Unit of Analysis, Risk of Bias justification, Departures from protocol, When to pool, Interpretation

Good News!
Structure and recognition
−Primary outcomes, Adverse events, RCT designs, Randomisation & Risk of Bias, Summary statistics & Analysis, Confidence
MDSG templates
−Methods & Results
−Much improved standard and standardisation

Templates
Please use them or cover the same issues
−Understand them first
−Copy only relevant parts
−Do what you copied!

Systematic review
Explicit, unambiguous, repeatable, formulaic, BUT...
...source trials are far from ideal
−Require interpretation
−Need methodological & clinical insight

PROTOCOL STAGE

Literature review
Avoid the temptation to cite RCTs
−Selective citation prejudges the review!

Structure for comparisons
Aim is exhaustive and mutually exclusive
Comparison X+Y vs X+P
−Not a trial of X
−Doesn't belong in a review of treatment X
−Just a trial of Y vs P
Why exclude specific comparisons?
−Consider network analysis
−Consider readership!

Primary outcome
Live birth
Ongoing pregnancy
−How 'ongoing': 13 wks? 20 wks?
Cumulative pregnancy
−Cumulative over a course of treatment?
−"the one piece of information that a woman or a couple really want is the likelihood of having a baby at the end of a course of treatment" (Farquhar 2006)

Explicit outcome definition
Pregnancy
−Biochemical, sac, heartbeat,...?
Miscarriage
−Of a clinical pregnancy?
−How to count partial miscarriage?
−Per woman or per qualifying pregnancy?

Does blinding matter?
Subjective & process outcomes
−Not much dispute
Objective outcomes
−Enthusiasm for follow-up? (clinician or participant)
What if blinding is not possible?
−Not a judgement of author or research 'quality'
−Risk of bias is unaffected by the ability to avoid it

Other risks?
Within-study concerns (mainly)
Not 'quality' or 'trustworthiness'
−Funding source
−Baseline imbalance
−Prospective power calculation
More from Vivienne to follow

Heterogeneity
Interpretation categories for I-squared
−Deliberately overlapping to prevent blind copying!
−Not really sensible on their own
−Size & direction of effects matter
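The I-squared categories the slide refers to are derived from Cochran's Q. A minimal sketch of an inverse-variance fixed-effect pool with Q and I² (function name and example numbers are illustrative, not from the slides):

```python
import math

def pool_fixed(estimates, ses):
    """Inverse-variance fixed-effect pool of effect estimates
    (e.g. log odds ratios) with Cochran's Q and I-squared.

    Returns (pooled_estimate, pooled_se, q, i_squared_percent).
    """
    weights = [1 / se**2 for se in ses]
    pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    # Cochran's Q: weighted squared deviations from the pooled estimate
    q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, estimates))
    df = len(estimates) - 1
    # I^2 = max(0, (Q - df) / Q), expressed as a percentage
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, pooled_se, q, i2
```

As the slide warns, the resulting percentage bands overlap deliberately: the size and direction of the individual effects matter more than the band label.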

Subgroup or Sensitivity analysis?
Sensitivity
−Is the conclusion affected by arbitrary choices made?
−Analysis method: fixed effect, OR, imputation
−Eligibility criteria: risk of bias, clinical criteria
Subgroup
−Is the effect evident in a subgroup of studies?

Subgroup or Sensitivity analysis?
Sensitivity can be by subgroup analysis
−Motive rather than method
Subgroup
−Is the effect evident in a subgroup of studies?

Bad subgroup analysis
By patient characteristics
−Participant age (or average age)
−Diagnostic category
−Studies likely to include a mix
By post-randomisation characteristics
−'Improper' subgroups
Subgroup versus subgroup
−Not a question for the trial; use stratification

Bad sensitivity analysis
Under-defined subgroups
−Outlying results
−Dominant studies
−High risk of bias

REVIEW (& UPDATE!) STAGE

Justifying exclusion of studies
By eligibility criteria
Not by
−Study quality
−Reporting quality
−Available outcomes

Description of included studies
Please check consistency
−If giving specific numbers, they should sum to the total
−Distinguish 'trials' from 'reports'
−Ensure patients contribute once only
"One fresh cycle of DET compared with one fresh cycle of triple embryo transfer (TET) (Komori 2004; Heijnen 2006)
"Two fresh cycles of DET compared to two fresh cycles of TET (Heijnen 2006)
"Three fresh cycles of DET compared to three fresh cycles of TET (Heijnen 2006)"
Helpful if in the order described under Methods
Consider structuring by comparison

Unit of analysis
Repetition of participant
−Cross-over trials, multiple cycles
Dependence between participants
−Cluster trials, surgical/therapy & group interventions
Repetition within participant
−Bilateral condition, fertilisation 'rate'
Post-randomisation exclusion
−Mean oocyte retrieval excluding zeros
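For the cluster-trial case, one standard correction (the approach described in the Cochrane Handbook) inflates the variance by a design effect rather than analysing participants as if independent. A sketch, where the function name and the example ICC and cluster size are hypothetical:

```python
def effective_sample_size(n, avg_cluster_size, icc):
    """Shrink a cluster trial's sample size by the design effect.

    Design effect = 1 + (m - 1) * ICC, where m is the average
    cluster size and ICC the intracluster correlation coefficient.
    Using n / design_effect in the meta-analysis approximates the
    information actually contributed by the clustered design.
    """
    design_effect = 1 + (avg_cluster_size - 1) * icc
    return n / design_effect
```

For example, 200 participants in clusters averaging 11 with an ICC of 0.1 carry the information of only about 100 independent participants.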

Risk of Bias
Justify all decisions explicitly
−"Sealed opaque envelopes" not enough for 'low'
Other domains
−Internal validity only, e.g. unadjusted interim analyses
Report efforts to obtain information
− to arrange a phone call
−Use a methodologist

Describing results
Avoid ambiguity
−"no cases were reported"
Beware whacky SDs
−Check relative weight is in line with relative size
−Trial authors or journal mislabelled the SE?
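A quick plausibility check for the mislabelled-SE problem: if a reported "SD" is really a standard error, multiplying by the square root of the arm size recovers the SD. A sketch (function name is mine, not from the slides):

```python
import math

def sd_if_se_mislabelled(reported_value, n):
    """If a trial reported the standard error but labelled it 'SD',
    the implied SD is SE * sqrt(n).  Compare this against SDs from
    comparable trials: if the reported value only looks plausible
    after rescaling, suspect a mislabelled SE."""
    return reported_value * math.sqrt(n)
```

An implausibly small "SD" gives a trial an implausibly large weight in the meta-analysis, which is why the slide suggests checking relative weight against relative size.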

Choosing not to pool data
Aim to be systematic
−Sometimes turns out to be silly!
−"We should be prepared more often to assemble trials but not perform a formal meta-analysis"
−"We should acknowledge the difficulties and not pretend that a systematic review is simpler or more objective than it is" (Doug Altman, 2004)
Justify any departure from the protocol explicitly

Reporting analyses
Please be systematic
−Order of outcomes
−Same terminology
−Be repetitive!
Reporting scale
−Translate the analysis statistic for typical controls

Translation
"The studies do not indicate that there is a statistically significant difference... (OR 0.97, 95% CI 0.74 to 1.27)"
Prefer to see:
"This means that for women with a 25% chance of [outcome] using [control intervention], the corresponding chance using [experimental] would be between 20% and 30%."
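The translation works by converting the odds ratio back to an absolute risk at an assumed control-group risk. A sketch (function name is illustrative):

```python
def risk_from_or(odds_ratio, control_risk):
    """Convert an odds ratio into the corresponding risk in the
    experimental group, given an assumed control-group risk."""
    control_odds = control_risk / (1 - control_risk)
    experimental_odds = odds_ratio * control_odds
    return experimental_odds / (1 + experimental_odds)
```

Applying this to the confidence limits 0.74 and 1.27 at a 25% control risk gives roughly 20% and 30%, the kind of range a reader can act on.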

Inference
Fixation with significance
−Please interpret via the confidence interval
−'Significant' does not mean 'important'
Logic of subgroup comparison
−Need a stratified analysis to explore differences
Absence of evidence
−"We found no effect of..." is always unhelpful
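The stratified comparison the slide asks for tests the difference between subgroup effects directly, rather than noting that one subgroup is 'significant' and the other is not. A minimal sketch for two independent subgroups (function name and numbers are illustrative):

```python
import math

def subgroup_difference_z(log_or_a, se_a, log_or_b, se_b):
    """Z statistic for the difference between two independent
    subgroup log odds ratios.  A small |z| means the data do not
    distinguish the subgroups, however the separate p-values fall."""
    difference = log_or_a - log_or_b
    se_difference = math.sqrt(se_a**2 + se_b**2)
    return difference / se_difference
```

For example, log ORs of 0.5 (SE 0.3) and 0.1 (SE 0.4) give z = 0.8: no real evidence that the subgroups differ, even if one interval excludes zero and the other does not.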

Interpreting partial results
Beware outcome reporting bias
−Does the subset reporting live birth have typical pregnancy data?
−Do those reporting both have similar ORs for each?

Interpretation
Precision versus accuracy
−Sample size gives precision
−Bias affects accuracy
Remember your risk of bias assessment
−Meta-analysis results are often precisely wrong!

Summary
Understand & use the templates
−Don't start without a methodologist
Give explicit justification of all decisions
−Could others repeat your work from the detail given?
Report with painstaking monotony
−No prizes for literature!
Resist the urge to 'spin'