Evidence Review Methodological Framework: The Case of Teen Pregnancy Prevention Research April 2011 Presentation to the Inter-American Development Bank.


1 Evidence Review Methodological Framework: The Case of Teen Pregnancy Prevention Research. April 2011 Presentation to the Inter-American Development Bank. Larissa Campuzano and Christopher Trenholm

2 Agenda
 Methodological Framework: Methods and Considerations for Conducting a Systematic Review
 Case Study: The Teen Pregnancy Prevention Research Evidence Review (PPRER)

3 Methodological Framework

4 Key Decisions to Be Made Before the Review Starts
 Determine eligibility criteria for the review
 Determine evidence standards
 Determine a process for reporting findings and assessing effectiveness
 Determine a method for combining findings from multiple studies

5 Eligibility Criteria Should Be Specified
 The type of relevant interventions
 The time frame for eligible studies
 The target population
 The eligible study designs
 The outcomes of interest, along with the domains in which they will be classified
 The subgroups of interest

6 Evidence Standards Should Be Defined
 Determine design-specific standards
–Standards for RCT, QED, RD, and SC designs
 Determine other factors that could affect the ability of a study to demonstrate causal evidence
–Confounds
–Quality of outcome measures
–Data collection issues
 The standards determine the quality rating of each study (high, moderate, or low)

7 Standards for RCTs
 Establish that the assignment process was random
 Attrition could affect the ability of the study to provide causal evidence
–Determine the level of attrition we are willing to accept
 Baseline differences could affect the ability of the study to provide causal evidence
–Determine whether we require baseline equivalence or statistical adjustments in case of differences (statistically significant or large in magnitude)
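The attrition check described above can be sketched as a simple function. This is a minimal illustration, not the actual What Works Clearinghouse attrition boundary: the 20% overall and 7% differential thresholds are placeholder values a review team would replace with its own standards, and the function name is hypothetical.

```python
def attrition_acceptable(assigned_t, analyzed_t, assigned_c, analyzed_c,
                         max_overall=0.20, max_differential=0.07):
    """Check overall and differential attrition against review thresholds.

    The 20% / 7% thresholds are illustrative placeholders, not the
    actual WWC attrition boundary.
    """
    attr_t = 1 - analyzed_t / assigned_t          # treatment-group attrition
    attr_c = 1 - analyzed_c / assigned_c          # comparison-group attrition
    overall = 1 - (analyzed_t + analyzed_c) / (assigned_t + assigned_c)
    differential = abs(attr_t - attr_c)
    return overall <= max_overall and differential <= max_differential
```

A review protocol would typically also specify how attrition interacts with the study design rating, e.g. demoting a high-attrition RCT to the moderate tier.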

8 Standards for QEDs
 Baseline differences could affect the ability of the study to provide causal evidence
–Determine how to establish baseline equivalence (BE):
  Specify the variables that require BE
  Specify whether non-equivalence in one measure invalidates others in the same domain
  Specify whether the focus is on statistical significance or on a magnitude measure (effect size)
  Specify the levels of acceptable differences
–Specify the range in which statistical adjustments are required in case of initial differences
–Specify the statistical methods accepted to control for initial differences
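A magnitude-based BE rule of the kind described above can be sketched as follows. The 0.05 and 0.25 standard-deviation bounds follow the WWC convention (equivalent below 0.05 SD, acceptable with statistical adjustment between 0.05 and 0.25 SD, failing above 0.25 SD), but treat them and the function name as illustrative assumptions rather than the PPRER's actual rule.

```python
import statistics

def baseline_equivalence(treat, comp, equiv_bound=0.05, adjust_bound=0.25):
    """Classify a baseline difference by its standardized magnitude.

    Bounds follow the WWC-style 0.05 / 0.25 SD convention (illustrative).
    """
    # Standard deviation pooled over both groups' baseline values
    sd = statistics.pstdev(treat + comp)
    es = abs(statistics.mean(treat) - statistics.mean(comp)) / sd
    if es <= equiv_bound:
        return "equivalent"
    if es <= adjust_bound:
        return "adjust"          # acceptable only with statistical adjustment
    return "not equivalent"
```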

9 Outcome Characteristics Should Be Specified
 Face validity
 Reliability (internal consistency, inter-rater reliability, test-retest reliability)
 Overalignment
 Multiple follow-up measures
 Relevant domains
 Acceptable measures within domains
 Aggregate vs. subscale measures

10 Determine a Process for Assessing Evidence
 Once the quality of a study has been established, we need a process for reporting relevant findings
 Findings are reported only for studies able to provide causal evidence
 The reporting process determines the degree of comparability across studies
 The reporting process also determines the options for combining findings

11 Options for Reporting Findings
 Report in a common metric (effect size)
–Continuous vs. dichotomous outcomes
–The level of analysis affects comparability
 Report statistical significance
–Multiple-comparison adjustment
–Cluster adjustment
–Disregard adjustments
 Report the sign of effects
–Allows for different metrics
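The common-metric option above can be illustrated for both outcome types. For continuous outcomes the standardized mean difference is the usual effect size; for dichotomous outcomes the Cox index (log odds ratio divided by 1.65) is a standard way to put proportions on roughly the same scale. Function names are illustrative; the slide does not specify which conversion the PPRER used.

```python
import math

def smd(mean_t, mean_c, sd_pooled):
    """Standardized mean difference for a continuous outcome."""
    return (mean_t - mean_c) / sd_pooled

def cox_index(p_t, p_c):
    """Effect size for a dichotomous outcome (Cox index).

    The log odds ratio divided by 1.65 approximates a standardized
    mean difference, making binary and continuous outcomes comparable.
    """
    log_odds_t = math.log(p_t / (1 - p_t))
    log_odds_c = math.log(p_c / (1 - p_c))
    return (log_odds_t - log_odds_c) / 1.65
```

For example, a rise in contraceptive use from 40% to 60% corresponds to a Cox index of about 0.49, comparable in size to a half-standard-deviation gain on a continuous scale.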

12 Categorizing Evidence of Effectiveness
 A framework for categorizing each program’s evidence of effectiveness needs to be determined
 The framework can take into account:
–Quality of the study
–Magnitude, sign, or statistical significance of impacts
–Duration of effects
–Effects on the full sample or on subgroups
–Number of studies (replicability) and consistency of results
Mathematica® is a registered trademark of Mathematica Policy Research.

13 Combining Evidence Across Studies
 Evidence reviews do not necessarily combine evidence across studies
 A meta-analysis is one option for combining evidence
 If the main goal of the review is to combine studies, that goal may affect how the review protocol is defined
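The meta-analysis option mentioned above can be sketched with the simplest standard approach, inverse-variance (fixed-effect) pooling, assuming each study reports an effect size and its standard error. This is a minimal illustration, not the PPRER's method; a real review would also consider random-effects models and heterogeneity statistics.

```python
import math

def fixed_effect_pool(effects, std_errors):
    """Inverse-variance-weighted (fixed-effect) pooled effect size.

    Each study's weight is 1 / SE^2, so more precise studies
    contribute more to the pooled estimate.
    """
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se
```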

14 Case Study: The Teen Pregnancy Prevention Research Evidence Review (PPRER)

15 Review: A Four-Step Process
1. Define inclusion criteria: which studies to review
2. Identify studies meeting the inclusion criteria
3. Assess the quality of studies
4. Assess evidence of effectiveness among studies meeting the quality “bar”

16 Step 1: Issues in Defining Inclusion Criteria
 Fundamental questions
–How far back do we go?
–Do we care only about causal, statistically based evidence?
–What outcomes do we care about? (e.g., behavioral mediators?)
 What interventions do we care about?
–E.g., is a study of job training “in” if it measures pregnancy impacts?
 Answers reflect input of evaluator and policymaker
–Value of evidence (does impact on attitudes matter enough?)
–Practical considerations (purpose, time, resources)

17 Step 1: Inclusion Criteria as Defined
 The intervention had to focus on preventing teen pregnancy (among U.S. youth under age 20)
–Excludes interventions like job training and dropout prevention
 The study had to estimate impact on sexual risk behavior or its health consequence(s)
–E.g., sexual initiation, contraception, STIs, pregnancies, births
 Estimates had to be based on statistical analysis of quantitative data
 Findings had to be published/reported in the past 20 years

18 Step 2: Process for Identifying Studies
 Reviewed existing syntheses of studies related to adolescent pregnancy prevention
–Fortunate to have seven relevant, extensive literature reviews or research syntheses available
 Searched websites of relevant research and policy organizations
 Distributed a “call for studies” to 43 research organizations, professional associations, etc.
 Conducted keyword searches of electronic databases

19 Step 2: Studies Identified for Review
 The process uncovered more than 1,000 possible studies for the evidence review
–The majority were picked up in the keyword search
 A quick review of these studies (e.g., review of abstracts) reduced the list to 430 possible studies
 Review of the full papers reduced the list to 199 studies meeting all inclusion criteria

20 Step 3: Considerations in Quality Assessment
 Critical decisions (for assessment)
–How many “quality grades” are there? How fine do we need to be?
–What criteria define how randomized controlled trials (RCTs) are assigned to these grades?
–What about quasi-experimental studies?
 Consideration: finer distinctions increase debate, so quality grades should be only as fine as the review’s purpose demands
 Consideration: draw on existing expertise in developing quality criteria
–Example: the U.S. Department of Education’s What Works Clearinghouse (WWC)

21 Step 3: Study Quality Criteria

1. Study design
–High: randomized controlled trial (RCT) or functionally random assignment
–Moderate: quasi-experiment with comparison group; RCT with high attrition or reassignment
–Low: does not meet criteria for a high or moderate rating
2. Attrition
–High: meets What Works Clearinghouse standards for attrition
–Moderate: no requirement
–Low: does not meet criteria for a high or moderate rating
3. Baseline equivalence
–High: must control for statistically significant baseline differences
–Moderate: must establish baseline equivalence and control for baseline outcome measures
–Low: does not meet criteria for a high or moderate rating
4. Reassignment
–High: analysis must be based on original assignment to research groups
–Moderate: no requirement
–Low: does not meet criteria for a high or moderate rating
5. Confounding factors
–High/Moderate: must have 2+ units in each research group and no systematic differences between them
–Low: does not meet criteria for a high or moderate rating
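The quality criteria above can be operationalized roughly as a decision rule. This is a deliberately simplified sketch under assumed boolean inputs, not the actual PPRER rating logic (which distinguishes more cases, e.g. controlling only for statistically significant differences in high-rated RCTs); the function and parameter names are hypothetical.

```python
def rate_study(design, meets_wwc_attrition, baseline_equivalent,
               controls_for_baseline, original_assignment, confound_free):
    """Assign a high/moderate/low rating in the spirit of the criteria table.

    Simplified sketch: all inputs are booleans except `design`
    ("RCT" or "QED").
    """
    # High: RCT meeting attrition, assignment, baseline, and confound criteria
    if (design == "RCT" and meets_wwc_attrition and controls_for_baseline
            and original_assignment and confound_free):
        return "high"
    # Moderate: QEDs and high-attrition/reassigned RCTs that establish
    # baseline equivalence and are free of confounds
    if (design in ("RCT", "QED") and baseline_equivalent
            and controls_for_baseline and confound_free):
        return "moderate"
    return "low"
```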

22 Step 3: Results of Assessing Study Quality
 Of the 199 studies assessed:
–61 received a high rating
–32 received a moderate rating
–106 received a low rating
 Only studies rated high or moderate quality were considered for the final step, the evidence review

23 Step 4: Assessing Evidence
 Four dimensions of evidence (for an intervention):
1. Quality of the study or studies
2. Importance of the measure(s) evaluated
3. Evidence of impact for measure(s) within each study (attribution, breadth, duration, magnitude, main/subgroup effect)
4. Consistency of evidence across studies
 Major question: how do we parameterize these?
–E.g., how do we define duration? How many categories are there? What are the thresholds for each category?
–Even the most basic parameterization yields a massive number of categories (dichotomizing eight sub-dimensions = 256 categories!)
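The category explosion noted above is simple combinatorics: with eight sub-dimensions each split into two levels, every combination is a distinct category.

```python
from itertools import product

# Dichotomizing eight evidence sub-dimensions (e.g. high vs. moderate
# quality, full-sample vs. subgroup impact, ...) yields 2**8 combinations.
categories = list(product([0, 1], repeat=8))
print(len(categories))  # 256
```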

24 Step 4: Process for Assessing Evidence
 Like quality grades, fine distinctions in evidence “levels” give rise to reasonable disagreement
–I.e., 256 categories are not helpful for policymaking
 In this context, DHHS ultimately developed eight categories to characterize the current evidence base
 These categories formed “Tier 1” of the Teen Pregnancy Prevention (TPP) grant announcement, recently released by the Office of Adolescent Health

25 Step 4: Categories of Evidence

High-quality study categories (study quality: high):
–Replicated Impact: full sample; impacts lasting a year or more; replicated (0 programs)
–Sustained Impact: full sample; impacts lasting a year or more; not replicated (6 programs)
–Short-Term Impact: full sample; impacts lasting less than a year; replicated or not (10 programs)
–Subgroup Impact: subgroup only; any duration; replicated or not (3 programs)

Moderate-quality study categories (study quality: moderate):
–Replicated Impact: full sample; impacts lasting a year or more; replicated (0 programs)
–Sustained Impact: full sample; impacts lasting a year or more; not replicated (2 programs)
–Short-Term Impact: full sample; impacts lasting less than a year; replicated or not (3 programs)
–Subgroup Impact: subgroup only; any duration; replicated or not (4 programs)

26 Result: 28 Programs Met “Tier I” Standards
–Aban Aya Youth Project
–Adult Identity Mentoring
–All4You!
–Assisting in Rehabilitating Kids
–Be Proud! Be Responsible!
–Be Proud! Be Responsible! Be Protective!
–Becoming a Responsible Teen
–Children’s Aid Society—Carrera Program
–¡Cuídate!
–Draw the Line/Respect the Line
–FOCUS
–HIV Risk Reduction Among Detained Adolescents
–Horizons
–It’s Your Game: Keep it Real
–Making a Difference!
–Making Proud Choices!
–Project TALC
–Promoting Health Among Teens! Abstinence-Only Intervention
–Promoting Health Among Teens! Comprehensive Intervention
–Raising Healthy Children
–Reducing the Risk
–Rikers Health Advocacy Program
–Safer Sex
–SiHLE
–Sisters Saving Sisters
–Teen Health Project
–Teen Outreach Program
–What Could You Do?

27 Strength of Supporting Evidence
 Quality rating of the supporting study:
–High = 19 programs
–Moderate = 9 programs
 Analysis sample showing impacts:
–Full sample = 21 programs
–Subgroup only = 7 programs
 Duration of impacts:
–Less than 12 months = 14 programs
–12 months or more = 14 programs

28 Strength of Supporting Evidence (Continued)
 Impacts replicated in more than one high- or moderate-quality study:
–Yes = 1 program
–No = 27 programs
 Number of programs showing impacts on:
–Initiation of sexual activity = 5 programs
–Other measures of sexual activity (frequency, number of partners, etc.) = 17 programs
–Contraceptive use = 9 programs
–STIs = 4 programs
–Pregnancy or birth = 5 programs

29 Discussion
 Questions/comments?
 For more information, please contact:
–Larissa Campuzano LCampuzano@mathematica-mpr.com
–Chris Trenholm CTrenholm@mathematica-mpr.com


