Systematic Review Module 7: Rating the Quality of Individual Studies Meera Viswanathan, PhD RTI-UNC EPC.



Learning Objectives
– Define the concept of quality assessment
– What are the reasons for quality assessment?
– What are the stages in quality assessment?
– How do we report quality assessment?

CER Process Overview

Reasons for Quality Assessment
Quality assessment is required for
– Interpreting results
– Grading the body of evidence
Quality assessment may also be used for
– Selecting studies for inclusion
– Selecting studies for pooling

What is Quality Assessment?
Quality can be defined as “the extent to which all aspects of a study’s design and conduct can be shown to protect against systematic bias, nonsystematic bias, and inferential error” (Lohr, 2004)
– Considered synonymous with internal validity
– Relevant for individual studies
– Distinct from assessment of risk of bias for a body of evidence
Lohr KN. Rating the Strength of Scientific Evidence: Relevance for Quality Improvement Programs. International Journal for Quality in Health Care 2004;16(1):9-18.

What Are the Components of Quality Assessment?
– Systematic errors include selection bias and confounding, in which values tend to be inaccurate in a particular direction
– Nonsystematic errors are attributable to chance
– Inferential errors result from problems in data analysis and interpretation, such as choosing the wrong statistical measure or wrongly rejecting the null hypothesis
Lohr KN, Carey TS. Assessing 'best evidence': issues in grading the quality of studies for systematic reviews. Joint Commission Journal on Quality Improvement 1999 Sep;25(9).

Consider the Contribution of an Individual Study to a Body of Evidence
Characteristics of an individual study:
– Internal validity of results
– Size of study (random error)
– Direction and degree of results
– Relevance of results (applicability)
– Type of study
– Limitations in study design and conduct
These feed into the domains used to grade a body of evidence: risk of bias, precision, directness, consistency, and applicability.

What Are the Stages in Quality Assessment?
– Classify the study design
– Apply predefined criteria for quality and critical appraisal
– Arrive at a summary judgment of the study’s quality

Questions to Consider When Classifying Study Design
– Is a control group present?
– Is there concurrent assessment of intervention or exposure status?
– Do investigators have control over allocation and timing?
– Do investigators randomly allocate interventions?
– Is there more than one group?
– Is there concurrent assessment of outcomes?
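As an illustration only, answers to screening questions like these can be strung into a simple decision rule. The ordering and design labels below are my assumptions for the sketch, not an official AHRQ classification algorithm:

```python
def classify_design(control_group, concurrent_assessment,
                    investigator_allocates, random_allocation):
    """Return a rough study-design label from yes/no screening answers.

    Illustrative only: real classification weighs more questions
    (e.g., number of groups, timing of outcome assessment).
    """
    if not control_group:
        return "case series / descriptive study"
    if random_allocation:
        return "randomized controlled trial"
    if investigator_allocates:
        return "nonrandomized controlled trial"
    if concurrent_assessment:
        return "cohort study"
    return "case-control study"
```

For example, a study with a control group and randomly allocated interventions would be labeled a randomized controlled trial; one with a control group assessed concurrently but without investigator-controlled allocation would fall under a cohort design.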

Apply Predefined Criteria
Apply one of several available tools that consider
– Similarity of groups at baseline in terms of baseline characteristics and prognostic factors
– Extent to which valid primary outcomes were described
– Blinding of subjects and providers
– Blinded assessment of the outcome
– Intention-to-treat analysis
– Differential loss to followup between the compared groups, or overall high loss to followup
– Conflict of interest

Additional Criteria for Trials
– Methods used for randomization
– Allocation concealment

Additional Criteria for Observational Studies
– Sample size
– Methods for selecting participants (inception cohort, methods to avoid selection bias)
– Methods for measuring exposure variables
– Methods to deal with any design-specific issues such as recall bias and interviewer bias
– Analytical methods to control confounding

Arrive at a Summary Judgment of Quality
– Assign ratings of good, fair, or poor
– Ratings may vary across outcomes for an individual study
– Ratings should be based on the assessed impact of individual criteria on overall internal validity, rather than on summary scores
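A minimal sketch of this rating logic, assuming each criterion has been judged on a three-level flaw severity (an illustrative simplification of mine, not the module's actual procedure):

```python
def summary_rating(criterion_flaws):
    """Map per-criterion flaw judgments to an overall quality rating.

    criterion_flaws: dict mapping criterion name to a severity,
    one of 'none', 'minor', or 'fatal'. The rating depends on the
    impact of each flaw on internal validity, not a numeric score.
    """
    severities = set(criterion_flaws.values())
    if "fatal" in severities:
        return "poor"   # at least one flaw likely to cause major bias
    if "minor" in severities:
        return "fair"   # some deficiencies, but no major bias expected
    return "good"
```

Note the deliberate absence of point totals: a single fatal flaw (say, badly broken randomization) yields a poor rating regardless of how many other criteria are met, which is the point of judging impact rather than summing scores.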

Attributes of Good Studies
– A formal randomized controlled study
– Clear description of the population, setting, interventions, and comparison groups
– Appropriate measurement of outcomes
– Appropriate statistical and analytic methods and reporting
– No reporting errors
– Low dropout rate
– Clear reporting of dropouts

Attributes of Fair Studies
– Fair studies have some deficiencies and so do not meet all the criteria required for a rating of good quality
– No flaw is likely to cause major bias
– Missing information often drives the rating

Attributes of Poor Studies
Significant biases, including
– Errors in design, analysis, or reporting
– Large amounts of missing information
– Discrepancies in reporting

Reporting Quality Assessment
Overall assessments of quality must be accompanied by a statement of
– Flaws in the design or execution of a study
– The potential consequences of those flaws
Poor studies may be excluded or included
– Decisions should be guided by gaps in current evidence
– Selective inclusion of poor studies for subgroups should be justified

Key Messages
Transparency of process
– Full reporting on all elements of quality for each individual study
– Clear instructions on how abstractors scored quality
– Description of the reconciliation process
Transparency of judgment
– Explanation of the final score

Key Source
Draft AHRQ Methods Guide, Chapter 6, AHRQ, epFiles/2007_10DraftMethodsGuide.pdf