September 26, 2012 DATA EVALUATION AND ANALYSIS IN SYSTEMATIC REVIEW.


Stages of Systematic Review
1. Define the Problem
2. Literature Search
3. Data Evaluation
4. Data Analysis
5. Interpretation of Results
6. Presentation of Results

Data Evaluation
- What retrieved research should be included in or excluded from your review?
- Are the methods used in the retrieved literature suitable for studying your research question?
- Are there problems in research implementation?
Topics: evaluating the quality of retrieved literature; coding literature for inclusion and exclusion; coder reliability and avoiding coding error.

Data Evaluation: Study Quality What makes a high quality study?

Data Evaluation: Study Quality
What makes a high-quality study?
- Validity
- Relevance
- Study design
- Reporting quality

Data Evaluation: Validity
Internal validity (experimental validity): the validity of a causal inference. Does the cause lead to the effect? The randomized controlled experiment is the gold standard.
Requirements:
- Cause precedes effect
- Cause and effect are related
- Other plausible explanations are ruled out

Data Evaluation: Validity
Threats to internal validity:
- Ambiguous timeline of cause and effect
- Other plausible explanations
- Uncontrolled circumstances
- Confounding
- Bias

Data Evaluation: Validity
Example: organic dairy. Does organic dairy farming (cause) produce higher-quality milk (effect)?
What are the threats to internal validity? Other plausible explanations: differences in diet, climate, breed.

Data Evaluation: Validity
External validity: the degree to which a causal inference can be generalized. How would you be wrong to make a generalization? Does the experiment resemble the real world? Does it apply in other populations? Other regions?
Example: organic dairy. Can a cause-and-effect relationship between organic dairy farming and milk quality found in a study apply more generally?
Ecological validity: are study conditions like those found in natural settings?

Data Evaluation: Validity
Construct validity: the degree to which operational definitions represent concepts. Does the study measure the variable in a valid way?
Example: organic dairy. What is the concept of milk quality, and how is it measured?

Data Evaluation: Validity
Statistical conclusion validity: the validity of the statistical inferences used to assess the strength of the relationship between cause and effect. Do the data meet the assumptions of the statistical tests used in the study?
Examples:
- The study uses a t test, but the data are not normally distributed
- The study uses linear regression, but the variables do not have a linear relationship
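To make the t-test example concrete, here is a minimal sketch of Welch's t statistic in plain Python. The milk-protein figures are hypothetical, invented purely for illustration; the point is that the resulting inference is only valid if the data roughly satisfy the test's assumptions (approximate normality, independent samples).

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples.
    Valid inference assumes roughly normal data in each group."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical milk protein percentages (not real data)
organic = [3.4, 3.6, 3.5, 3.7, 3.6]
conventional = [3.3, 3.2, 3.4, 3.3, 3.1]
t = welch_t(organic, conventional)
print(round(t, 2))  # prints 4.16
```

If a normality check (e.g. a Q-Q plot or Shapiro-Wilk test) failed here, a rank-based test such as Mann-Whitney would be the safer choice.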

Data Evaluation: Study Design
Which study designs should be included in your review? Design influences validity.
- Randomized vs non-randomized
- Cohort
- Case-control
- Cross-sectional
- Case reports

Data Evaluation: Relevance
The degree to which a retrieved study applies to the review question. High quality ≠ relevant.
Does a retrieved study have features that make it irrelevant to the review?
- Population studied
- Methods used
- Definitions of variables (construct validity)
Determine criteria for relevance to your question.

Data Evaluation: Reporting Quality
How a study is reported affects its inclusion in the review. Poor reporting quality makes analysis difficult:
- Incomplete data
- Missing information
- Space restrictions in publishing

Data Evaluation: Strategies
A priori: rules for which studies will be included are determined ahead of time, before the data are examined or the outcomes are known. You need to consider the implications of every rule, place specific values on the rules, and provide reasons why they remove bias from the review.
Post hoc: determines the impact of study quality on the review in order to make inclusion decisions. How will inclusion of certain studies affect the results of the review? Does not rely on arbitrary rules.
Many systematic reviews use a blend of both strategies.

Data Evaluation: Strategies
A priori:
- Exclusion: deciding to exclude all studies that do not meet a certain criterion, e.g. excluding all non-randomized studies.
- Quality scales: the criteria included in a scale must be adapted to the field and research question. Scales are often not based on empirical evidence, and there is often no evidence base for predicting bias from quality indicators.

Data Evaluation: Strategies
Post hoc: quality is handled as an empirical question, avoiding the problems of a priori assignments. How will the review results be influenced if certain studies are included? You can compare the bias introduced by certain types of studies.
Example: inclusion of non-randomized studies.
- A priori: include only randomized studies.
- Post hoc: does inclusion of the non-randomized studies influence the results? If not, include them.
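The post hoc decision above can be sketched as a small sensitivity check. Everything below is hypothetical (invented effect sizes, an arbitrary 0.05 tolerance): pool all retrieved studies, pool only the randomized subset, and retain the non-randomized studies only if dropping them barely moves the estimate.

```python
# Hypothetical study records; "effect" values are invented for illustration.
studies = [
    {"effect": 0.40, "randomized": True},
    {"effect": 0.35, "randomized": True},
    {"effect": 0.38, "randomized": False},  # the non-randomized study in question
]

def mean_effect(rows):
    """Unweighted mean effect across a set of studies (a deliberately crude pool)."""
    return sum(r["effect"] for r in rows) / len(rows)

all_studies = mean_effect(studies)
rct_only = mean_effect([r for r in studies if r["randomized"]])

# Tolerance is a judgment call the reviewer must justify, not a standard value.
include_nonrandomized = abs(all_studies - rct_only) < 0.05
print(include_nonrandomized)  # prints True
```

A real review would use weighted pooling and report both estimates rather than a single yes/no flag; the sketch only shows the shape of the comparison.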

Data Evaluation: Coding the Literature
Once you have a set of retrieved studies, code them for inclusion in the final review.
Coding components: eligibility criteria. Study features defined:
- Eligible study designs
- Eligible methods
- Sampling criteria
- Statistical criteria
- Geographical and linguistic criteria
- Time frame

Data Evaluation: Coding the Literature
Develop a coding protocol, much like a questionnaire: clearly define what you want to measure (concepts and study characteristics). You may need multiple coding questions to evaluate each concept.
Develop a matrix of all studies after retrieval: a way to organize post hoc that reflects many characteristics of the retrieved studies. This may help if you are unsure what to code before examining the studies.

Data Evaluation: Coding the Literature
Develop a coding form that all reviewers will use. It allows efficient record keeping by the whole team. Include a report identification: assign each study a number, etc.
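One way to make a coding form machine-checkable is to represent each coded study as a structured record. The field names below are illustrative only, loosely echoing the relevance-screen items from the Handbook tables; a real protocol would define its own fields.

```python
from dataclasses import dataclass

@dataclass
class CodingRecord:
    """One row of a hypothetical coding form, filled in per retrieved study."""
    study_id: int                     # report identification number
    first_author: str
    journal: str
    empirical_expectancy_study: bool  # inclusion criterion (illustrative)
    iq_outcome: bool                  # inclusion criterion (illustrative)

    def eligible(self):
        # A study is retained only if it meets every inclusion criterion.
        return self.empirical_expectancy_study and self.iq_outcome

rec = CodingRecord(1, "Smith", "J. Educ. Psychol.", True, False)
print(rec.eligible())  # prints False: fails the IQ-outcome criterion
```

Keeping the eligibility logic in one method means every coder's decisions are reproducible from the recorded fields, rather than living in each coder's head.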

Relevance Screen
Report Characteristics
1. First author
2. Journal
3. Volume
4. Pages
Inclusion Criteria
5. Is this study an empirical investigation of the effects of teacher expectancies? 0. No 1. Yes
6. Is the outcome a measure of IQ? 0. No 1. Yes
7. Are the study participants in grades 1-5 at the start of the study? 0. No 1. Yes
Table 7.2, Handbook of Research Synthesis and Meta-Analysis

Coding for Internal Validity
9. Sampling strategy
   1. Randomly sampled from a defined population
   2. Stratified sampling from a defined population
   3. Cluster sampling
   4. Convenience sample
   5. Can't tell
10. Group assignment mechanism
   1. Random assignment
   2. Haphazard assignment
   3. Other nonrandom assignment
   4. Can't tell
11. Assignment mechanism
   1. Self-selected into groups
   2. Selected into groups by others on a basis related to outcome
   3. Selected into groups by others not known to be related to outcome
   4. Can't tell
Table 7.3, Handbook of Research Synthesis and Meta-Analysis

Coding Construct Validity
18. IQ measure used in study
   1. Stanford-Binet 5
   2. Wechsler (WISC) III
   3. Woodcock-Johnson III
   4. Other
   5. Can't tell
19. Score reliability for given IQ measure: ________________
20. Metric for score reliability
   1. Internal consistency
   2. Split-half
   3. Test-retest
   4. Can't tell
   5. None given
21. Source of score reliability
   1. Current sample
   2. Citation from another study
   3. Can't tell
   4. None given
22. Is the validity of the IQ measure mentioned? 0. No 1. Yes
Table 7.4, Handbook of Research Synthesis and Meta-Analysis

[Figure from BMC Medical Research Methodology 2008, 8:21]

Data Evaluation: Coding the Literature
Assess coding reliability: intra- and inter-coder reliability.
- Intra: consistency of a single coder over time (avoid coder drift)
- Inter: consistency between coders
Sources of error in coding decisions:
- Deficient reporting in the study
- Judgments made by coders
- Coder bias
- Coder mistakes
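Inter-coder reliability is commonly quantified with percent agreement corrected for chance, i.e. Cohen's kappa. A minimal pure-Python sketch, using hypothetical inclusion decisions from two coders on the same six studies:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders
    who each coded the same items."""
    n = len(coder_a)
    observed = sum(x == y for x, y in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Agreement expected by chance from each coder's marginal frequencies
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical inclusion decisions for six retrieved studies
a = ["include", "include", "exclude", "include", "exclude", "exclude"]
b = ["include", "exclude", "exclude", "include", "exclude", "exclude"]
print(round(cohens_kappa(a, b), 2))  # prints 0.67
```

Here raw agreement is 5/6 ≈ 0.83, but kappa drops to 0.67 once chance agreement is removed, which is exactly why percent agreement alone overstates reliability.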

Data Analysis
What procedures should be used to summarize and integrate the research results?
- Quantitative analysis (meta-analysis): statistical techniques to synthesize data from studies
- Qualitative analysis: allows synthesis and interpretation of non-numerical data
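A minimal sketch of the quantitative route: a fixed-effect meta-analysis weights each study's effect size by the inverse of its within-study variance, so precise studies count for more. The effect sizes and variances below are hypothetical, invented for illustration.

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance weighted pooled effect and its standard error
    (fixed-effect model: assumes one common true effect)."""
    weights = [1 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, se

effects = [0.40, 0.35, 0.55]    # hypothetical standardized mean differences
variances = [0.04, 0.02, 0.09]  # hypothetical within-study variances
pooled, se = fixed_effect_pool(effects, variances)
print(round(pooled, 3), round(se, 3))  # prints 0.39 0.108
```

Note the pooled estimate sits closest to the second study, which has the smallest variance and therefore the largest weight; a random-effects model would additionally add a between-study variance component to each weight.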

Data Analysis: Qualitative
Methods of qualitative analysis:
1. Content analysis: synthesis of the content of studies; organization of content with keywords or concepts
2. Meta-ethnography: ethnography is the study of a whole culture; focuses on the culture as a system, understanding the whole
3. Grounded theory: formation of a theory from a synthesis of the data; the reverse of most hypothesis-driven research

Content Analysis
Defining categories with keywords from studies.
Elo S. & Kyngäs H. (2008) The qualitative content analysis process. Journal of Advanced Nursing 62(1), 107–115
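A toy version of keyword-based category coding: tag each study with every category whose keywords appear in its text. The categories, keywords, and abstract snippets below are all hypothetical, and real content analysis would also handle stemming, synonyms, and context, which this sketch ignores.

```python
# Hypothetical keyword categories (illustrative only)
categories = {
    "animal_welfare": {"welfare", "grazing", "housing"},
    "milk_quality": {"protein", "fat", "somatic"},
}

# Hypothetical abstract fragments keyed by study ID
abstracts = {
    "study1": "organic herds showed higher milk protein and fat content",
    "study2": "grazing time and housing conditions differed between systems",
}

# Tag each study with every category sharing at least one keyword with its text
coded = {
    sid: sorted(cat for cat, words in categories.items()
                if words & set(text.split()))
    for sid, text in abstracts.items()
}
print(coded)  # prints {'study1': ['milk_quality'], 'study2': ['animal_welfare']}
```

The resulting study-by-category assignments are the raw material for the kind of category matrix described in Elo & Kyngäs's account of the content analysis process.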

Meta-Ethnography
[Figure from BMC Medical Research Methodology 2008, 8:21]

Grounded Theory