
1 Conducting systematic reviews of public health and health promotion interventions Nicki Jackson Senior Training and Support Officer Cochrane Health Promotion and Public Health Field

2 Overview: Background to systematic reviews; International systematic review initiatives; Resources required; Setting the scope of your review; Asking an answerable question; Searching for studies; Data abstraction; Principles of critical appraisal; Synthesis of evidence; Interpretation of results; Writing the systematic review

3 Objectives This workshop will enable you to: 1. Be familiar with some of the key challenges of conducting systematic reviews of health promotion and public health interventions 2. Formulate an answerable question about the effectiveness of interventions 3. Identify primary studies, including developing strategies for searching electronic databases

4 Objectives cont. 4. Evaluate the quality of an individual health promotion or public health study 5. Synthesise the evidence from primary studies 6. Formulate conclusions and recommendations from the body of evidence 7. Evaluate the quality of a systematic review

5 Acknowledgement The Public Health Education and Research Program (PHERP) “Promoting and facilitating evidence-based policy and practice in Public Health and Health Promotion” Sydney Health Projects Group, School of Public Health, University of Sydney School of Public Health, La Trobe University Cochrane Health Promotion and Public Health Field

6 Background to systematic reviews

7 Meta-analysis Systematic reviews Reviews (narrative/literature/ traditional) Types of reviews

8 Narrative reviews Usually written by experts in the field Use informal and subjective methods to collect and interpret information Usually narrative summaries of the evidence Read: Klassen et al. Guides for Reading and Interpreting Systematic Reviews. Arch Pediatr Adolesc Med 1998;152:

9 What is a systematic review? A review of the evidence on a clearly formulated question that uses systematic and explicit methods to identify, select and critically appraise relevant primary research, and to extract and analyse data from the studies that are included in the review* *Undertaking Systematic Reviews of Research on Effectiveness. CRD’s Guidance for those Carrying Out or Commissioning Reviews. CRD Report Number 4 (2nd Edition). NHS Centre for Reviews and Dissemination, University of York. March 2001.

10 High quality Structured, systematic process involving several steps: 1. Plan the review 2. Formulate the question 3. Comprehensive search 4. Unbiased selection and abstraction process 5. Critical appraisal of data 6. Synthesis of data (may include meta-analysis) 7. Interpretation of results. All steps described explicitly in the review

11 Systematic vs. narrative reviews. Systematic: scientific approach to a review article; criteria determined at outset; comprehensive search for relevant articles; explicit methods of appraisal and synthesis; meta-analysis may be used to combine data. Narrative: depends on authors' inclination (bias); author gets to pick any criteria; search any databases; methods not usually specified; vote count or narrative summary; can't replicate the review

12 Advantages of systematic reviews Reduce bias Replicability Resolve controversy between conflicting studies Identify gaps in current research Provide reliable basis for decision making

13 Increased interest in systematic reviews Government interest in health costs Variations in practice Public want information Facilitated by computer developments

14 Experience, evidence and opinions sit alongside competing factors and pressures: financial pressures, time pressures, expectations

15 Who benefits? Practitioners – current knowledge to assist with decision making; Researchers – reduced duplication, identification of research gaps; Community – recipients of evidence-based interventions; Funders – identification of research gaps/priorities; Policy makers – current knowledge to assist with policy formulation

16 Limitations Results may still be inconclusive There may be no trials/evidence The trials may be of poor quality The intervention may be too complex to be tested by a trial Practice does not change just because you have the evidence of effect/effectiveness

17 Clinical vs. public health interventions. Clinical: individuals; single interventions; outcomes only (generally); often limited consumer input; quantitative approaches to research and evaluation. Public health: populations and communities; combinations of strategies; processes as well as outcomes; involve community members in design and evaluation; qualitative and quantitative; health promotion theories and beliefs

18 International systematic review initiatives

19 Sources of systematic reviews in HP/PH Cochrane Collaboration Guide to Community Preventive Services (The Guide), US The Effective Public Health Practice Project, Canada Health Development Agency, UK The Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI- Centre), UK Centre for Reviews and Dissemination, UK The Campbell Collaboration

20 Cochrane Collaboration Named in honour of Archie Cochrane, a British researcher In 1979: “It is surely a great criticism of our profession that we have not organised a critical summary, by specialty or subspecialty, adapted periodically, of all relevant randomised controlled trials”

21 Cochrane Collaboration International non-profit organisation that prepares, maintains, and disseminates systematic up-to-date reviews of health care interventions

22 The Cochrane Library

23 The Cochrane Library. Cochrane Systematic Reviews: Cochrane reviews and protocols. Database of Reviews of Effects: other systematic reviews appraised by the Centre for Reviews and Dissemination. Cochrane Central Register of Controlled Trials: bibliography of controlled trials (some not indexed in MEDLINE). Cochrane Database of Methodology Reviews: Cochrane reviews of methodological studies. The Cochrane Methodology Register: bibliography of studies relating to methodological aspects of research synthesis. About the Cochrane Collaboration: information about review groups, Fields, Centres, etc.; contact details provided. Health Technology Assessment Database: HTA reports. NHS Economic Evaluation Database: economic evaluations of health care interventions.

24

25 Organisation Steering group Review groups Centres Methods groups Fields Consumer network

26 Collaborative Review Groups - Focused around health problems (50) - Produce reviews - Editorial base facilitates the review process - International and multidisciplinary. Eg. Airways Group, Drugs and Alcohol Group, Heart Group, Injuries Group, Skin Group, Breast Cancer Group

27 Cochrane Centres - Support review groups and reviewers within their area (13) - Promote the Cochrane Collaboration - Link to Government and other agencies - Not a production house for reviews. Eg. Australasian Cochrane Centre, South African Cochrane Centre, Italian Cochrane Centre

28 Cochrane Health Promotion and Public Health Field - Registered field of the Cochrane Collaboration, administered from Melbourne and funded by VicHealth (co-directors at EPPI-Centre, UK) - 330 members on the contact database across 33 countries. Aims: - Promoting the conduct of reviews on HP/PH topics - Educating HP/PH practitioners about the Cochrane Collaboration and encouraging use of systematic reviews - Referring professionals to other databases

29 Reviews in HP/PH - Primary prevention of alcohol misuse in young people - Parent-training programmes for improving maternal psychosocial health - Interventions for preventing childhood obesity - Interventions for preventing eating disorders in children and adolescents - Supported housing for people with severe mental disabilities

30 For further information The Cochrane Collaboration The Cochrane Health Promotion and Public Health Field The Australasian Cochrane Centre The Cochrane Library

31 Other sources The Guide to Community Preventive Services

32 Other sources Effective Public Health Practice Project (EPHPP)

33 Other sources Health Development Agency

34 Other sources Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre)

35 Other sources Centre for Reviews and Dissemination

36 Other sources The Campbell Collaboration

37 Resources required

38 Conduct of systematic reviews Topic of relevance or interest Team of co-authors Training and support Access to/understanding of stakeholders or likely users Funding and time (at least 6 months) Access to databases of published and unpublished literature Statistical software, if appropriate Bibliographic software

39 Review manuals Cochrane Collaboration Reviewers’ Handbook Cochrane Collaboration Open Learning Materials NHS Centre for Reviews and Dissemination Guidance for those Carrying Out or Commissioning Reviews The Methods of the Community Guide A Schema for Evaluating Evidence on Public Health Interventions EPPI-Centre Reviewers’ Manual

40 Guidelines for HP/PH reviews Cochrane Health Promotion and Public Health Field website /activities/guidelines.htm

41 Setting the scope of your review

42 Advisory Groups Policy makers, funders, practitioners, recipients/consumers Make or refine the review question Provide background material Help interpret the findings Assist with dissemination Formal (role descriptions) or informal

43 LUMP OR SPLIT?

44 Lump or split? Depends on users' needs. Policy makers – broad reviews to answer questions when there is a range of options; practitioners – more specific interventions or approaches. Lumping – informs which interventions to implement, but is more time-consuming; splitting – gives a yes/no answer on whether to implement, and takes less time

45 Lumping or splitting Interventions to modify drug-related behaviours for preventing HIV infection in drug users Interventions to modify sexual risk behaviours for preventing HIV infection Interventions to modify sexual risk behaviours for preventing HIV infection in men who have sex with men. Interventions for preventing HIV infection in street youth Interventions for preventing HIV infection in young people in developing countries Counselling and testing for preventing HIV infection

46 Writing your protocol 1) Background Why is it important? How important is the problem? Is there uncertainty? What is the reasoning as to why the intervention(s) might work? (include theoretical frameworks) Other similar reviews?

47 Writing your protocol 2) Objectives What are the questions/hypotheses? 3) Selection criteria PICO(T) Population(s) Intervention(s) Comparison(s) Outcomes (Primary / Secondary) Types of studies

48 Writing your protocol 4) Planned search strategy Databases and terms 5) Planned data extraction Processes and outcomes? More than one reviewer? Planned quality appraisal (incl. checklists) 6) Method of synthesis Tabulate Narrative/qualitative synthesis or meta-analysis

49 Asking an answerable question

50 Importance A clearly framed question will guide: the reader, in their initial assessment of relevance; and the reviewer, on how to collect studies, how to check whether studies are eligible, and how to conduct the analysis

51 Questions of interest Effectiveness: Does the intervention work/not work? Who does it work/not work for? Other important questions: How does the intervention work? Is the intervention appropriate? Is the intervention feasible? Is the intervention and comparison relevant?

52 Answerable questions EFFECTIVENESS A description of the populations P An identified intervention I An explicit comparison C Relevant outcomes O

53 A PICO question Time-consuming question: What is the best strategy to prevent smoking in young people?

54 An answerable question Q. Are mass media (or school-based or community-based) interventions effective in preventing smoking in young people? Choose to look at mass media interventions ………

The PICO(T) chart – Problem/population: young people under 25 years of age. Intervention: a) television b) radio c) newspapers d) billboards e) posters f) leaflets g) booklets. Comparison: a) school-based interventions b) no intervention. Outcome: a) objective measures of smoking (saliva thiocyanate levels, alveolar CO) b) self-reported smoking behaviour c) intermediate measures (intentions, attitude, knowledge, skills) d) media reach. Types of studies: a) RCT b) controlled before and after studies c) time series designs

56 Types of study designs Randomised controlled trial Quasi-randomised/pseudo-randomised controlled trial/controlled clinical trial Controlled before and after study/cohort analytic (pre and post-test)/concurrently controlled comparative study Uncontrolled before and after study/cohort study Interrupted time series Qualitative research See handbook

57 Inequalities as an outcome. Health inequalities: "the gap in health status, and in access to health services, between different social classes and ethnic groups and between populations in different geographical areas." 1 Other factors used in classifying health inequalities (PROGRESS) 2: Place of residence, Race/ethnicity, Occupation, Gender, Religion, Education, Socio-economic status, Social capital. 1 Public Health Electronic Library. 2 Evans T, Brown H. Injury Control and Safety Promotion 2003;10(2):11-12.

58 Inequalities reviews First Cochrane review Effectiveness of school feeding programs for improving the physical, psychological, and social health of disadvantaged children and for reducing socio-economic inequalities in health Defining effectiveness: More effective for disadvantaged than advantaged Potentially effective: Equally effective for both groups (prevalence of health problems greater in disadvantaged groups) If intervention is only allocated to disadvantaged and is effective

59 Incorporating inequalities into a review Reviews rarely present information on differential effects of interventions Systematic review methods for distilling and using information on relative effectiveness are underdeveloped. Difficult to locate studies – need broad search Need original data from authors Low power to detect subgroup differences Complexity and variety of study designs

60 Finding the evidence

61 Systematic review process 1. Well formulated question 2. Comprehensive data search 3. Unbiased selection and abstraction process 4. Critical appraisal of data 5. Synthesis of data 6. Interpretation of results

62 A good search Clear research question Comprehensive search All domains, no language restriction, unpublished and published literature, up-to-date Document the search (replicability)

63

64 1. Electronic searching Database choice should match area of interest: Medical: Medline, EMBASE, CINAHL Social Science: PsycINFO, Social Science Citation Index, Sociological Abstracts Educational: ERIC Other: AGRIS (agricultural), SPORTSDiscus (sports), EconLit (economics) Other registers: CENTRAL (Cochrane), BiblioMap (EPPI-Centre), HealthPromis (HDA)

65 MeSH / subject headings

66 Textwords

67 Components of electronic searching 1. Describe each PICO component 2. Start with primary concept 3. Find synonyms a) Identify MeSH / descriptors / subject headings b) Add textwords 4. Add other components of PICO question to narrow citations (may use study filter) 5. Examine abstracts 6. Use search strategy in other databases (may need adapting)

68 Example Mass media interventions to prevent smoking in young people P= Young people STEP ONE: Find MeSH and textwords to describe young people

69 Tick ‘map term’ to find MeSH

70 Click on i to find other suitable terms

71

72

73

74 Example Mass media interventions to prevent smoking in young people P= Young people MeSH: Adolescent Child Minors

75

76 Example Mass media interventions to prevent smoking in young people P = Young people Textwords: Adolescent, Girl, Child, Boy, Juvenile, Teenager, Young people, Young adult, Student, Youth

77 BOOLEAN OPERATORS - OR. OR is more! Combine similar terms (MeSH and textwords) for each PICO element to broaden results, eg. Adolescent OR Student retrieves records indexed with either term

78 Textwords Truncation $: to pick up various forms of a word. Teen$.tw finds teen, teens, teenage, teenager, teenagers; Smok$.tw finds smoke, smoking, smokes, smoker, smokers

79 Textwords Wild cards ? and #: to pick up different spellings. Colo?r.tw (? stands for one or no characters) finds colour, color; Wom#n.tw (# stands for exactly one character) finds woman, women

80 Textwords Adjacency ADJn: retrieves two or more query terms within n words of each other, in any order. Useful when you are not sure of the phraseology. Eg sport adj1 policy finds "sport policy" and "policy for sport"; mental adj2 health finds "mental health" and "mental and physical health"
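These operators behave much like ordinary pattern matching. As a rough analogy only (this is Python regular-expression syntax, not database search syntax, and the sample phrases are invented), the sketch below mimics truncation, the ?/# wild cards and adjacency:

```python
import re

# Truncation: teen$ in the search syntax ~ "teen" followed by any further word characters
truncation = re.compile(r"\bteen\w*", re.IGNORECASE)
print(truncation.findall("Teen and teenager smokers; teens were surveyed"))

# Wild cards: colo?r (one or no characters) and wom#n (exactly one character)
optional_char = re.compile(r"\bcolou?r\b", re.IGNORECASE)   # colour / color
single_char = re.compile(r"\bwom.n\b", re.IGNORECASE)       # woman / women
print(optional_char.findall("colour and color"))
print(single_char.findall("woman or women"))

# Adjacency: sport adj1 policy ~ the two words within one word of each other, any order
adj1 = re.compile(r"\bsport\W+(?:\w+\W+)?policy\b|\bpolicy\W+(?:\w+\W+)?sport\b",
                  re.IGNORECASE)
for phrase in ["sport policy", "policy for sport", "policy on school sport"]:
    print(phrase, "->", bool(adj1.search(phrase)))   # last one is too far apart
```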

81 Example continued Mass media interventions to prevent smoking in young people I = Mass media interventions STEP TWO: Find MeSH and textwords to describe mass media interventions

82

83

84 Example continued MeSH Mass media Audiovisual aids Television Motion pictures Radio Telecommunications Newspapers Videotape recording Advertising

85

86 Example continued Mass media interventions to prevent smoking in young people O = Prevention of smoking STEP THREE: Find MeSH and textwords to describe prevention of smoking

87

88 BOOLEAN OPERATORS – AND. AND gives fewer records: combine the different concepts (each element of PICO) for focused results, eg. Smoking AND Adolescent retrieves only records indexed with both terms

89

90 Example of search (worksheet). P = YOUNG PEOPLE: MeSH terms OR textwords. I = MASS MEDIA: MeSH terms OR textwords. C = (if required). O = PREVENTION OF SMOKING: MeSH terms OR textwords. Then combine the blocks: P AND I AND C AND O
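OR and AND act like set union and intersection over the records each block of the strategy retrieves. A minimal illustration with invented record IDs, not real database output:

```python
# Hypothetical record IDs retrieved by each block of the strategy (invented numbers)
population   = {101, 102, 103, 104}   # adolescent/ OR teen$.tw OR youth.tw ...
intervention = {102, 104, 105, 106}   # mass media/ OR television/ OR radio/ ...
outcome      = {103, 104, 106, 107}   # smoking/ OR smok$.tw ...

# OR within a block = union (broader); AND across blocks = intersection (narrower)
combined = population & intervention & outcome   # P AND I AND O
print(sorted(combined))                           # [104]
```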

91 Different bibliographic databases Databases use different types of controlled vocabulary Same citations indexed differently on different databases Medline and EMBASE use a different indexing system for study type PsycINFO and ERIC do not have specific terms to identify study types Need to develop search strategy for each database

92 CINAHL

93 Medline

94 PsycINFO

95 Compare subject headings across databases. MEDLINE: Adolescent; Child; Mass media. CINAHL: Adolescence; Communications media. PsycINFO: Adolescent attitudes; a mass media subject heading is not used – Pamphlets, Radio and Television are used instead

96 Study design filters. RCTs: see Cochrane Reviewers' Handbook. Non-RCTs: not yet developed, research in progress. Qualitative research: specific subject headings used in CINAHL, 'qualitative research' used in Medline; CINAHL filter: Edward Miner Library s/Cinahl_eb_filters.pdf. Systematic reviews/meta-analyses: CINAHL as above; Medline s/OVID_eb_filters.pdf; Medline and Embase; PubMed

97 Other sources of primary research

98

99

100

101 2. Unpublished literature Only 30-80% of all known published trials are identifiable in Medline (depending on the topic). Only 25% of all medical journals are indexed in Medline. Non-English language articles (and articles from developing countries) are under-represented in Medline. Publication bias – the tendency of investigators to submit manuscripts, and of editors to accept them, based on the strength and direction of results (Olsen 2001)

102 2. Unpublished literature Hand searching of key journals and conference proceedings Scanning bibliographies/reference lists of primary studies and reviews Contacting individuals/agencies/ academic institutions Neglecting certain sources may result in reviews being biased

103 Examples of search strategies HEALTH - Cochrane Injuries Group Specialised Register - Cochrane Library databases - MEDLINE - EMBASE - National Research Register EDUCATIONAL/PSYCHOLOGICAL - PsycInfo - ERIC (Educational Resources Information Center) - SPECTR (The Campbell Collaboration's Social, Psychological, Educational and Criminological Trials Register) TRANSPORT - NTIS - TRIS - ITRD - RANSDOC - Road Res (ARRB) - ATRI (Australian Transport Index) GENERAL - Zetoc (the British Library conference proceedings database) - SIGLE (System for Information on Grey Literature in Europe) - Science (and Social Science) Citation Index Ker K, Roberts I, Collier T, Beyer F, Bunn F, Frost C. Post-licence driver education for the prevention of road traffic crashes. The Cochrane Database of Systematic Reviews 2003, Issue 3. Art. No.: CD DOI: / CD There was no language restriction. In addition we undertook a general Internet search focusing on the websites of relevant road safety organisations. Reference lists of all potentially eligible studies were examined for other relevant articles and experts in the field were contacted for additional information. The database and website searches were performed during the early months of 2002.

104 Examples of search strategies Project CORK BIDS ISI (Bath Information and Data Services) Conference proceedings on BIDS Current contents on BIDS PSYCLIT ERIC (U.S.A.) ASSIA MEDLINE FAMILY RESOURCES DATABASE EMBASE Health Periodicals Database Dissertation Abstracts SIGLE DRUG INFO SOMED (Social Medicine) Social Work Abstracts National Clearinghouse on Alcohol and Drug Information Mental Health Abstracts DRUG INFO. DRUG database Alcohol and Alcohol Problems Science Database - ETOH Foxcroft DR, Ireland D, Lister-Sharp DJ, Lowe G, Breen R. Primary prevention for alcohol misuse in young people. The Cochrane Database of Systematic Reviews 2002, Issue 3. Art. No.: CD DOI: / CD

105 Librarians are your friends!

106 Data abstraction

107 DATA ABSTRACTION Example forms: Effective Public Health Practice Project reviews; The Community Guide bstractionform.pdf; Effective Practice and Organisation of Care Review Group; NHS CRD Report Number 4.

108 Details to collect Publication details Study design Population details (n, characteristics) Intervention details Theoretical framework Provider Setting Target group Study details (date, follow-up) Consumer involvement Process measures – adherence, exposure, training, etc Context details Outcomes and findings Pilot test on a sub-sample of studies
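To keep abstraction consistent between reviewers, the details above are usually captured on a structured form. A minimal sketch of such a form as a data structure; the field names are illustrative, not a published template:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ExtractionRecord:
    """One row of a data-abstraction form; fields mirror the list above."""
    citation: str
    study_design: str                                   # e.g. RCT, CBA, interrupted time series
    population: str                                     # n, characteristics, setting
    intervention: str
    comparison: str
    theoretical_framework: Optional[str] = None
    provider: Optional[str] = None
    setting: Optional[str] = None
    follow_up: Optional[str] = None
    process_measures: List[str] = field(default_factory=list)   # adherence, exposure, training
    outcomes_and_findings: List[str] = field(default_factory=list)
    context_notes: str = ""

record = ExtractionRecord(
    citation="Hypothetical et al. 2003",
    study_design="Cluster RCT",
    population="582 grade 7 students in 12 schools",
    intervention="School-based media literacy lessons",
    comparison="Usual curriculum",
)
print(record.study_design)
```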

109 Example 1 - RCT Table 3. Weighted mean difference in BMI standard deviation score and vegetable intake between the five intervention schools and their control schools. Each row of the table gives the weighted mean difference (95% CI) and % weight for one school; overall: BMI 0 (-0.1 to 0.1), vegetable intake 0.3 (0.2 to 0.4)

110 Example 2 - CBA Table 2. Estimated differences in daily dietary intake based on repeated 24-hour recalls at follow-up for children in intervention (n=173) vs control (n=163) schools, controlling for baseline measures. Number of fruits and vegetables per 4184 kJ: control 1.41, intervention 1.78

111 Principles of critical appraisal

112 Systematic review process 1. Well formulated question 2. Comprehensive data search 3. Unbiased selection and abstraction process 4. Critical appraisal of data 5. Synthesis of data 6. Interpretation of results

113 Critical appraisal: the process of systematically examining research evidence to assess its validity, results and relevance before using it to inform a decision. Alison Hill, Critical Appraisal Skills Programme, Institute of Health Sciences, Oxford

114 Critical appraisal I: Quantitative studies

115 Why appraise validity? Not all published and unpublished literature is of satisfactory methodological rigour; just because it is in a journal does not mean it is sound! The onus is on you to assess validity! Quality may be used as an explanation for differences in study results. Guides the interpretation of findings and aids in determining the strength of inferences

116 Why appraise validity? Poor quality affects trial results by exaggerating the intervention effect: inadequate allocation concealment exaggerated treatment effects by 35-41% (Moher 1998, Schulz 1995); lack of blinding of subjects exaggerated treatment effect by 17% (Schulz 1995); open outcome assessment exaggerated treatment effect by 35% (Juni 1999, Moher 1998)

117 “The medical literature can be compared to a jungle. It is fast growing, full of dead wood, sprinkled with hidden treasure and infested with spiders and snakes.” Peter Morgan, Scientific Editor, Canadian Medical Association

118 Bias 1. Selection bias 2. Allocation bias 3. Confounding 4. Blinding (detection bias) 5. Data collection methods 6. Withdrawals and drop-outs 7. Statistical analysis 8. Intervention integrity

119 Trial flow and where bias can enter: recruit participants (selection bias) → allocation to intervention or control (concealment of allocation, confounding) → exposed / not exposed to intervention (integrity of intervention) → follow-up (withdrawals, intention-to-treat) → outcome (blinding of outcome assessment, data collection methods) → analysis (statistical analysis)

120 Selection bias Recruiting the study population: differences in the way patients are accepted or rejected for a trial, and the way in which interventions are assigned to individuals. Difficult in public health studies

121 Question One: checklist a) Are the individuals selected to participate in the study likely to be representative of the target population? b) What percentage of the selected individuals/schools, etc agreed to participate?

122 Allocation bias. An allocation schedule may be generated by randomisation (coin toss, computer) or by non-random methods (alternation, day of the week, record number); in both cases the schedule assigns participants to intervention or control

123 Allocation bias Need comparable groups Randomisation = similar groups at baseline Allocation schedule should not be administered by person who is responsible for the study to prevent manipulation

124 Concealed allocation? Lancet 2002; 359:

125 Allocation bias Reduced by: centralised randomisation on-site computer system with group assignments in a locked file sequentially numbered, sealed, opaque envelopes any statement that provides reassurance that the person who generated the allocation scheme did not administer it  Not: alternation, dates of birth, day of week.
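A computer-generated schedule held by someone outside the recruitment process is one common way to achieve this. A minimal sketch of a permuted-block sequence; the block size, labels and seed are illustrative, not a prescribed method:

```python
import random

def permuted_block_sequence(n_participants: int, block_size: int = 4, seed: int = 2024):
    """Computer-generated allocation list in balanced blocks."""
    rng = random.Random(seed)
    sequence = []
    while len(sequence) < n_participants:
        block = ["intervention"] * (block_size // 2) + ["control"] * (block_size // 2)
        rng.shuffle(block)          # random order within each balanced block
        sequence.extend(block)
    return sequence[:n_participants]

# The list would be held centrally, or placed in sequentially numbered, sealed,
# opaque envelopes, by someone not involved in recruiting participants.
print(permuted_block_sequence(12))
```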

126 Question Two: checklist Allocation bias: Type of study design RCT Quasi-experimental Uncontrolled study

127 Confounding Need similar groups at baseline. Determine which factors could confound the association of the intervention and outcome. Non-randomised studies can never adjust for unknown confounding factors (and there are difficulties in measuring known confounding factors). If confounding is present, it should be adjusted for in the analysis

128 Question Three: checklist Confounders: prior to the intervention, were differences between groups in important confounders reported? Were the confounders adequately managed in the analysis? Were there important confounders that were not reported?

129 Blinding outcome assessors Detection bias – Blinding of outcome assessors to prevent systematic differences between groups in the outcome assessment

130 Question Four: checklist Blinding Were the outcome assessors blind to the intervention status of participants? Yes No Not applicable (if self-reported) Not reported

131 Data collection methods More often subjective outcomes in health promotion Require valid and reliable tools

132 Question Five: checklist Data collection methods Were data collection methods shown or known to be valid and reliable for the outcome of interest?

133 Withdrawals from study Attrition bias - Systematic differences between groups in losses of participants from the study Look at withdrawals, drop-outs

134 Question Six: checklist Withdrawals and drop-outs What is the percentage of participants completing the study?

135 Statistical analysis: power / sample size calculation; intention-to-treat; cluster studies – allocation is by school/community etc. but analysis is generally at the individual level, producing unit-of-analysis errors; appropriate sample size determination

136 Question Seven: checklist Is there a sample size calculation? Is there a statistically significant difference between groups? Are the statistical methods appropriate? Unit of allocation and analysis? Was a cluster analysis done? Intention to treat analysis
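For the sample size item, a standard approximation for comparing two proportions is n per group = (z for alpha + z for power)^2 x [p1(1-p1) + p2(1-p2)] / (p1-p2)^2, inflated by the design effect for cluster trials. A sketch with illustrative prevalences; the 20% and 14% figures, cluster size and ICC are invented for the example:

```python
import math
from statistics import NormalDist

def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate sample size per arm for comparing two proportions."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    numerator = (z_alpha + z_beta) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))
    return math.ceil(numerator / (p1 - p2) ** 2)

# e.g. expecting smoking uptake of 20% in controls vs 14% with the intervention
n = n_per_group(0.20, 0.14)                 # about 612 per arm
# Cluster designs need inflating by the design effect: 1 + (cluster size - 1) * ICC
design_effect = 1 + (30 - 1) * 0.02         # illustrative cluster size and ICC
print(n, math.ceil(n * design_effect))
```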

137 Integrity of intervention = Fidelity = Implementation = Delivery of intervention as planned

138 Integrity of intervention PH/HP are complex – multiple components Integrity – delivery of intervention Adherence to specified program Exposure – no. of sessions, length, frequency Quality of delivery Participant responsiveness Potential for contamination RELATED TO FEASIBILITY

139 School-based AIDS program 19-lesson comprehensive school based program Unit 1: Basic information on HIV/AIDS Unit 2: Responsible behaviour: delaying sex Unit 3: Responsible behaviour: protected sex Unit 4: Caring for people with AIDS

140 School-based AIDS program The program had no effect on knowledge, attitudes or intended behaviour; no contamination. Focus groups found the program was not implemented: role plays and condoms were not covered, teachers taught only the topics they preferred, there was a shortage of class time, condoms were a controversial topic, and teachers left or died

141 Gimme 5 Fruit, Juice and Vegetables School-based intervention; the curriculum included components to be delivered at the school and newsletters with family activities and instructions for intervention at home. Small changes in fruit, juice and vegetable consumption. All teachers were observed at least once during the 6-week intervention: only 51% and 46% of the curriculum activities were completed in the 4th and 5th grade years; in contrast, teacher self-reported delivery was 90%. Davis M, Baranowski T, Resnicow K, Baranowski J, Doyle C, Smith M, Wang DT, Yaroch A, Hebert D. Gimme 5 fruit and vegetables for fun and health: process evaluation. Health Educ Behav Apr;27(2):

142 Question Eight: Checklist What percentage of participants received the allocated intervention? Was the consistency of the intervention measured? Is contamination likely?

143 Different study designs. Non-randomised studies: allocation concealment bias; confounding – uneven baseline characteristics. Uncontrolled studies: cannot determine the size of the effect – that is, the effect relative to what might have occurred in the absence of any intervention

144 Example – allocation bias Non-randomised study “Randomisation was not possible because of the interests of the initial participating schools in rapidly receiving intervention materials”

145 Bias cont.. Rivalry bias ‘I owe him one’ bias Personal habit bias Moral bias Clinical practice bias Territory bias Complementary medicine bias ‘Do something’ bias ‘Do nothing’ bias Favoured/disfavoured design bias Resource allocation bias Prestigious journal bias Non-prestigious journal bias Printed word bias ‘Lack of peer-review’ bias Prominent author bias Unknown or non-prominent author bias Famous institution bias Large trial bias Multicentre trial bias Small trial bias ‘Flashy title’ bias Substituted question bias Esteemed professor bias Geography bias Bankbook bias Belligerence bias Technology bias ‘I am an epidemiologist’ bias

146 Quality of reporting ≠ quality of study It may be necessary to contact the authors for further information about aspects of the study or to collect raw data

147 Schema for Evaluating Evidence on Public Health Interventions Record the scope of the review and review question Appraise each article or evaluation report Formulate summary statement on the body of evidence

148 Five sections 1. Recording the purpose and scope of your review 2. Evaluating each article in the review 3. Describing the results 4. Interpreting each paper 5. Summarising the body of evidence

149 Critical appraisal II: Qualitative studies

150 Qualitative research … explores the subjective world. It attempts to understand why people behave the way they do and what meaning experiences have for people. Undertaking Systematic Reviews of Research on Effectiveness. CRD’s Guidance for those Carrying Out or Commissioning Reviews. CRD Report Number 4 (2nd Edition). NHS Centre for Reviews and Dissemination, University of York. March 2001.

151 Uses of qualitative research Qualitative research relevant to systematic reviews of effectiveness may include: Qualitative studies of experience Process evaluation Identifying relevant outcomes for reviews Help to frame the review question

152 Fundamentals Accepts that there are different ways of making sense of the world Study is ‘led’ by the subjects’ experiences, not researcher led (open) No one qualitative approach: different questions may require different methods or combinations of methods Findings may translate to a similar situation, but are not usually generalisable or totally replicable

153 CASP appraisal checklist 1. Clear aims of research (goals, why it is important, relevance) 2. Appropriate methodology 3. Sampling strategy 4. Data collection 5. Relationship between researcher and participants 6. Ethical issues 7. Data analysis 8. Findings 9. Value of research (context dependent)

154 1. Aim of research Describes why the research is being carried out Goal Importance Relevance

155 2. Appropriate methodology Does it address any of the following? What is happening? How does it happen? Why does it happen? Eg, why women choose to breastfeed, why is there miscommunication, how did the intervention work?

156 3. Qualitative research methods Observation – non-verbal and verbal behaviour by notes, audio, video Interviews – semi- or unstructured Text – diaries, case notes, letters Focus groups – semi- or unstructured

157 4. Recruitment method How were participants selected? Eg. Maximum variation approach? Why was this method chosen?

158 5. Data collection How was data collected? Did the researchers discuss saturation of data? Are the methods explicit?

159 6. Reflexivity: the researcher shapes the area being studied, the venue, the types of interview, the questions asked and the meaning given to the data

160 7. Ethical issues Consent Confidentiality Professional responsibility Advice Reporting

161 8. Data analysis Interpretations are made by the researcher Often uses thematic analysis Transcribe data, re-reads it and codes it into themes/categories Is there a description of the analysis? How were the themes derived?

162 Credibility of the analysis Method of analysis Clarity of approach Use of all of the data Triangulation Respondent validation

163 9. Statement of findings Findings are explicit Quality of the argument Replicability by another researcher Alternative explanations explored

164 10. Value of research Contribution to knowledge Potential new areas of research or interventions Applicability of results

165 Other qualitative checklist See NHS CRD Report Number 4 Quality framework Government Chief Social Researcher’s Office, UK mework.pdf mework.pdf

166 Synthesising the evidence

167

168 Steps 1. Table of study data 2. Check for heterogeneity a. No – meta-analysis b. Yes – identify factors, subgroup analysis or narrative synthesis 3. Sensitivity analyses 4. Explore publication bias

169 Steps Step One: 1. Table of study data Year Setting Population details (including any baseline differences) Study design Intervention details (including theory) Control group details Results Study quality

Study / setting / sample size and characteristics / unit of randomisation and analysis / theory / intervention / length of follow-up / outcome / baseline characteristics: Aarons et al – junior high schools, Washington DC; 582 grade 7 students, mean age 12.8 years, 52% female, 84% African-American, 13% low socioeconomic status; randomisation by school, analysis by individual; Social Cognitive Theory; 3 reproductive health lessons taught by health professionals plus 5 sessions of a postponing sexual involvement curriculum, control group received the conventional programme; 3 months, 96.4% followed; intercourse, use of birth control at last intercourse; baseline characteristics favour intervention. Coyle et al – urban high schools, Texas and California; 3869 grade 9 students, mean age 15 years, 53% female, 31% white, 27% Hispanic, 16% African American; randomisation by school, analysis adjusted; Social Learning Theory; Safer Choices, 10 lessons for grade 9 and 10 lessons for grade 10 on knowledge and skills, led by trained peers and teachers; 31 months, 79% followed; intercourse, use of birth control at last intercourse; baseline characteristics favour control. DiCenso A, Guyatt G, Willan A, Griffith L. Interventions to reduce unintended pregnancies among adolescents: systematic review of randomised controlled trials. BMJ 2002;324:

171 Steps Step Two: check for heterogeneity – are the results consistent? If yes: meta-analysis. If no: narrative synthesis or subgroup analysis, and explain the causes of heterogeneity

172 Terminology Homogeneity = similar. Studies are homogeneous if their results vary no more than might be expected by the play of chance (opposite = heterogeneity)

173 Investigating heterogeneity Graphically: if studies are homogeneous, the point estimates lie on the same side of the line of unity, the CIs overlap to a large extent, and there are no outliers

174 Heterogeneity - the eyeball test

175 Investigating heterogeneity Statistically: p < 0.1 would indicate heterogeneity, but the test has low power when there are few studies; lack of statistical significance does not imply homogeneity
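The usual statistics behind such a test (not named on the slide) are Cochran's Q, compared with a chi-squared distribution on k-1 degrees of freedom, and the derived I-squared. A minimal sketch with invented study results:

```python
def heterogeneity(effects, standard_errors):
    """Cochran's Q (compare to chi-squared on k-1 df) and I-squared."""
    weights = [1 / se ** 2 for se in standard_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, df, i_squared

# Illustrative log odds ratios and standard errors, not data from any real review
q, df, i2 = heterogeneity([-0.7, -0.3, -0.5, 0.1], [0.25, 0.30, 0.20, 0.35])
print(f"Q = {q:.2f} on {df} df, I-squared = {i2:.0f}%")
```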

176 Sources of heterogeneity Examples: Populations Interventions Outcomes Study designs Study quality Need to identify which factors contribute to heterogeneity

177 Identifying and dealing with heterogeneity Subgroup analyses By gender, age group, quality, type of intervention…..but keep analyses to a minimum!

178 Dealing with heterogeneity

179 Not all systematic reviews are meta-analyses “…it is always appropriate and desirable to systematically review a body of data, but it may sometimes be inappropriate, or even misleading, to statistically pool results from separate studies. Indeed, it is our impression that reviewers often find it hard to resist the temptation of combining studies even when such meta-analysis is questionable or clearly inappropriate.” Egger et al. Systematic reviews in health care. London: BMJ Books, 2001:5

180 Yes – heterogeneity present Narrative synthesis Describes studies Assesses whether quality is adequate in primary studies to trust their results Demonstrates absence of data for planned comparisons Demonstrates degree of heterogeneity Stratify by – populations, interventions, settings, context, outcomes, validity, etc Adapted from NHS CRD Report No. 4, 2 nd Ed.

181 No – heterogeneity not present Meta-analysis What comparisons should be made? What study results should be used in each comparison? Are the results of studies similar within each comparison? What is the best summary of effect for each comparison? Cochrane Reviewers’ Handbook

182 Meta-analysis Weighted average of effect sizes Weighted by study size, events
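One common weighting scheme is the fixed-effect, inverse-variance model, where each study is weighted by one over the variance of its effect estimate (an assumption here, since the slide does not name a model). A minimal sketch with invented log odds ratios, not data from any real review:

```python
import math

def fixed_effect_pool(effects, standard_errors):
    """Inverse-variance weighted average with a 95% CI (effects on the log scale)."""
    weights = [1 / se ** 2 for se in standard_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

log_odds_ratios = [-0.7, -0.4, -0.2]     # invented study results
standard_errors = [0.30, 0.25, 0.20]
pooled, (lo, hi) = fixed_effect_pool(log_odds_ratios, standard_errors)
print(f"Pooled OR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f} to {math.exp(hi):.2f})")
```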

183 Study results For dichotomous/binary outcomes (Y/N) use: relative risk or odds ratio. Risk = number of events / total number of observations. Odds = number of events / number without the event

184 Relative risk and odds ratio. RR – risk of the event in one group divided by the risk in the other group. OR – odds of the event occurring in one group divided by the odds of it occurring in the other group

185 Let's try! 2 x 2 table: intervention group – 2 with ABC (a), 62 without ABC (b), 64 in total; control group – 4 with ABC (c), 59 without ABC (d), 63 in total; overall – 6 with ABC, 121 without, 127 in total

186 Calculation Relative risk = [a/(a+b)] / [c/(c+d)] = [2/(2+62)] / [4/(4+59)] = 0.49. The risk of developing ABC in the intervention group was 49% of the risk in the control group; the intervention reduced the risk by 51% of what it was in the control group

187 Calculation Odds ratio = (a/b) / (c/d) = (2/62) / (4/59) = 0.48. The intervention reduced the odds of having ABC by about 50% of what they were in the control group
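The same arithmetic written out in code, using the counts from the worked table above:

```python
a, b = 2, 62   # intervention group: developed ABC / did not
c, d = 4, 59   # control group: developed ABC / did not

risk_intervention = a / (a + b)
risk_control = c / (c + d)
relative_risk = risk_intervention / risk_control       # ~0.49

odds_ratio = (a / b) / (c / d)                          # ~0.48

print(f"RR = {relative_risk:.2f}, OR = {odds_ratio:.2f}")
```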

188 Odds ratio graph: results are plotted either side of a vertical line at 1 (values less than 1 on one side, more than 1 on the other); the line at 1 is the line of no significance (no effect)

189 Odds ratio: each study appears as a best (point) estimate with its confidence interval, plotted relative to the line at 1

190 Odds ratio – with pooled effect size: the individual point estimates and confidence intervals are shown together with the pooled summary estimate and its confidence interval

191 Confidence Interval (CI) Is the range within which the true size of effect (never exactly known) lies, with a given degree of assurance (usually 95%)

192 Confidence Interval (CI) How "confident" are we that the results are a true reflection of the actual effect/phenomena? The shorter the CI, the more certain we can be about the results; if it crosses the line of unity (no treatment effect), the intervention might not be doing any good and could be doing harm
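For a single 2 x 2 table, the 95% CI for an odds ratio is usually built on the log scale, with SE(log OR) = sqrt(1/a + 1/b + 1/c + 1/d); this is a standard approximation rather than something stated on the slides. Applied to the earlier worked example:

```python
import math

a, b, c, d = 2, 62, 4, 59                       # the 2x2 table from the worked example
log_or = math.log((a / b) / (c / d))
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lower = math.exp(log_or - 1.96 * se_log_or)
upper = math.exp(log_or + 1.96 * se_log_or)
print(f"OR = {math.exp(log_or):.2f}, 95% CI {lower:.2f} to {upper:.2f}")
# Here the CI spans 1, so the data are also consistent with no effect (or even harm).
```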

193 The p-value in a nutshell: the p-value asks, could the result have occurred by chance? p < 0.05 is a statistically significant result; p = 0.05 means a 1 in 20 chance, so the result is fairly unlikely to be due to chance. p > 0.05 is not a statistically significant result; p = 0.5 means a 1 in 2 chance, so the result is quite likely to be due to chance

194 Continuous data Data that are normally presented with means and SDs (e.g. height, BMI). For each study you need means and SDs to calculate the difference. Difficult if continuous data arise from different scales
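The usual workaround for different scales (not stated on the slide) is the standardised mean difference: the difference in means divided by the pooled standard deviation. A minimal sketch with invented group summaries:

```python
import math

def standardised_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """Difference in means divided by the pooled standard deviation (Cohen's d style)."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Two hypothetical studies measuring the same construct on different scales
print(standardised_mean_difference(24.1, 5.2, 80, 21.3, 4.8, 78))   # scale A
print(standardised_mean_difference(3.4, 1.1, 60, 3.0, 1.2, 62))     # scale B
```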

195 Statisticians are your best friend!

196 Statistical software for Meta-analysis

197 The meta-analysis

198 Steps Step Three: Sensitivity analysis How sensitive are the results of the analysis to changes in the way it was done?

199 Sensitivity analysis How sensitive are the results of the analysis to changes in the way it was done? Changing the inclusion criteria for types of studies; including or excluding studies where there is ambiguity; reanalysing the data, imputing a reasonable range of values for missing data; reanalysing the data using different statistical approaches
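One simple sensitivity check is to re-run the pooled estimate leaving each study out in turn (leave-one-out). A sketch with invented study results, using the same inverse-variance pooling as the earlier sketch:

```python
import math

def pooled_log_or(effects, standard_errors):
    """Fixed-effect (inverse-variance) pooled log odds ratio."""
    weights = [1 / se ** 2 for se in standard_errors]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

log_ors = [-0.7, -0.4, -0.2, 0.3]      # invented study results
ses     = [0.30, 0.25, 0.20, 0.35]

print(f"All studies: pooled OR = {math.exp(pooled_log_or(log_ors, ses)):.2f}")
for i in range(len(log_ors)):
    kept = [(e, s) for j, (e, s) in enumerate(zip(log_ors, ses)) if j != i]
    effects, errors = zip(*kept)
    print(f"Omitting study {i + 1}: pooled OR = "
          f"{math.exp(pooled_log_or(effects, errors)):.2f}")
```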

200 Steps Step Four: Explore publication bias Is there a possibility I have missed some studies?

201 Publication bias Funnel plot. Studies with significant results are more likely to be: published; published in English; cited by others

202 Funnel plots: effect size plotted against sample size. No publication bias = a symmetrical inverted funnel; if smaller studies without statistically significant effects remain unpublished, there is a gap in the bottom corner of the graph
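A funnel plot is simply each study's effect size plotted against its precision (or sample size). A minimal matplotlib sketch using simulated studies, not real review data:

```python
import random
import matplotlib.pyplot as plt

rng = random.Random(1)
# Simulated studies: smaller studies have larger standard errors, so more scatter
standard_errors = [rng.uniform(0.05, 0.5) for _ in range(40)]
log_odds_ratios = [rng.gauss(-0.3, se) for se in standard_errors]   # true log OR = -0.3

plt.scatter(log_odds_ratios, [1 / se for se in standard_errors], s=15)
plt.axvline(-0.3, linestyle="--")
plt.xlabel("Log odds ratio (effect size)")
plt.ylabel("Precision (1 / standard error)")
plt.title("Funnel plot: a missing bottom corner suggests publication bias")
plt.show()
```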

203 Synthesis of qualitative research In its infancy. Widely varying theoretical perspectives. Unit of analysis is the concept/theme. Secondary summary of research. Most developed method is meta-ethnography. Help is around the corner – many research projects in progress!!

204 Interpretation of results

205 Formulating conclusions and recommendations VERY IMPORTANT! Many people prefer to go directly to the conclusions before looking at the rest of the review. Conclusions must reflect the findings in the review and follow from its objectives and the method of the review

206 Issues to consider Conclusions should be based on: Strength of evidence Biases/limitations of review Applicability and sustainability of results Trade-offs between benefits and harms Implications for public health and future research

207 Strength and biases Strength How good is the quality of evidence? How large are the effects? Consistent results? Biases / limitations of review Comprehensive search? Quality assessment? Appropriate analysis? Publication bias?

208 Applicability – relates to: study population characteristics; validity of the studies; relevant outcomes (incl. efficiency), interventions, comparisons; integrity of intervention – details of intervention (provider, adherence, medium, setting, access, infrastructure); maintenance of intervention/sustainability

209 Factors relating to the interpretation of effectiveness Theoretical frameworks Integrity of the intervention Influence of context

210 Theory Change in behaviour at individual, community, organisational, policy level Examine the impact of the theoretical framework on effectiveness Group studies according to theory Assists in determining implementation (integrity of interventions) Discuss theoretical frameworks used (all single level?)

211

212 Context VERY IMPORTANT IN HP/PH REVIEWS Influences effectiveness of intervention Social/cultural, political, organisational Affects ability to pool results Affects applicability

213 Context Time and place of intervention Local policy environment, incl. management support for intervention Broader political and social environment, concurrent social changes Structural, organisational (aspects of system), physical environment Training, skills, experience of those implementing the intervention Characteristics of the target populations (eg. culture, literacy, SES) DATA OFTEN NOT PROVIDED!!

214 Trade-offs Adverse effects / potential for harm; costs of intervention

215 Sustainability Sustainability of outcomes and/or interventions - consider: Economic and political variables Strength of the institution Full integration of activities into existing programs/curricula/services, etc Whether program involves a strong training component Community involvement/participation

216 Implications for PH/HP Not good enough to simply say “more research is needed” State what type of research should be done and why What specific study design or quality issue should be addressed in future research?

217 Writing the systematic review

218 Writing your systematic review Useful review manuals and guidelines for publication: Cochrane Reviewers' Handbook; Cochrane Open-Learning Materials; NHS CRD Report; QUOROM statement (Lancet 1999;354(9193): ); MOOSE guidelines (JAMA 2000;283: )

219 Appraisal of a systematic review 10 questions 1. Clearly-focused question 2. The right type of study included 3. Identifying all relevant studies 4. Assessment of quality of studies 5. Reasonable to combine studies 6. What were the results 7. Preciseness of results 8. Application of results to local population 9. Consideration of all outcomes 10. Policy or practice change as a result of evidence

