Getting Inside the “Black Box” – Capitalizing on Natural and Random Variation to Learn from the HPOG Impact Study
Presenters: Alan Werner, co-Principal Investigator; Laura Peck, co-Principal Investigator
Project Director: Gretchen Locke
APPAM Conference, November 2013

Presentation Overview
• What we know and don’t know about what works
• The HPOG Impact Study
• Strategies to “Get Inside the Black Box”
• Q&A

Effects of Training: Progress to Date
• It is well established that vocational training and employment support programs for low-income individuals work
 – Strong experimental impact research supports claims of effectiveness for specific approaches
 – Evidence is strongest when a program model is tested in multiple sites and/or when program models are tested head-to-head (e.g., HCD vs. LFA in NEWWS)
 – But new program models within Career Pathways framework-based programs, and…

Challenges in Evaluating Training
• The challenge has been to get inside the “black box” of a program to develop findings more useful for policy and program design, for example:
 – What “parts” of a tested program are most responsible for impacts?
 – What accounts for variation in impacts across multiple program realizations?
 – What design and implementation strategies work better than others?
 – What works best for whom?

HPOG and Its Impact Evaluation
• Career Pathways framework-based training for TANF recipients and other low-income individuals to pursue healthcare sector careers
• HPOG-Impact is part of a rich research “portfolio” at ACF
• The Impact Evaluation uses an experimental design, with randomization of eligible applicants to treatment and control groups, and with randomization to an enhanced treatment in some locations
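To make the assignment structure concrete, the following minimal sketch implements the three-arm logic described above. It is an illustration under assumed names and ratios, not the study’s actual procedure; the 2:1 treatment-to-control split simply mirrors the sample sizes on the next slide.

import random

def assign_arms(eligibles, site_has_enhancement,
                p_treatment=2/3, p_enhanced=0.5, seed=42):
    """Randomize eligible applicants to control (C) or treatment; where a site
    offers a randomized enhancement, split treatment into basic (T1) and
    enhanced (T2) arms."""
    rng = random.Random(seed)
    arms = {}
    for person_id in eligibles:
        if rng.random() >= p_treatment:
            arms[person_id] = "C"    # control: no access to the HPOG program
        elif site_has_enhancement and rng.random() < p_enhanced:
            arms[person_id] = "T2"   # basic program plus the enhancement
        else:
            arms[person_id] = "T1"   # basic program only
    return arms

# Example: a site testing a hypothetical peer-support enhancement
print(assign_arms(["id-001", "id-002", "id-003"], site_has_enhancement=True))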

Study Sample and Data Collection
• Sample size
 – Individuals: about 10,500 overall: 7,000 treatment; 3,500 control
 – Study sites: 38 study sites (programs) across 20 grantees
 – Planned variation sample (TBD): peer support; emergency financial assistance; non-cash incentives
• Data collection
 – At baseline (before random assignment), from the Performance Reporting System (PRS) and a baseline supplement
 – Quarterly wage data from the National Directory of New Hires (NDNH)
 – Follow-up surveys at 15 months post-randomization
 – Implementation study site visits
 – Grantee and other surveys from the National Implementation Evaluation (NIE)

Research Questions
HPOG-Impact will address the following questions:
1. What impacts do the HPOG programs as a group have on the outcomes of participants and their families?
2. To what extent do those impacts vary across selected subpopulations?
3. Which locally adopted program components influence average impacts?
4. To what extent does participation in a particular component (or components) change the impacts experienced by individual trainees?

Sources of Variation
Generating new evidence on the role of program components in producing impacts from HPOG sites will draw on:
• Natural variation in program features across sites
• Planned variation in certain sites, where subsets of participants are randomized to gain access to a specific program enhancement in addition to the basic program
• Person-to-person variation in components of the offered intervention actually received by individuals
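One common way to exploit the first source, natural cross-site variation, is to relate site-level experimental impact estimates to indicators of the components each site offers. The specification below is a sketch of that idea in my own notation, not necessarily the study’s estimator:

\[
\hat{\Delta}_s = \gamma_0 + \sum_{k} \gamma_k \, C_{ks} + \varepsilon_s ,
\]

where \(\hat{\Delta}_s\) is the experimental impact estimate for site \(s\), \(C_{ks} = 1\) if site \(s\) offers component \(k\), and \(\gamma_k\) describes how average impacts covary with the presence of component \(k\). Because components are not randomly assigned across sites, the \(\gamma_k\) are associational rather than causal.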

Intervention Components by Site and RA Sample: An Illustration

Site                                          P      Q           R        S        N
Type of Program Component:
  Naturally occurring, universal              F      F           F        F        F
  Naturally occurring, varied                 –      G           E        H        –
  Randomized enhancement                      –      E           –        –        E
Components Provided for Each Random Assignment Sample:
  C group sample                              A      B           C        D        L
  T1 sample                                   A, F   B, F, G     C, F, E  D, F, H  L, F
  T2 sample                                   –      B, F, G, E  –        –        L, F, E

Reading site Q as an example: the control (C) sample receives only B, the T1 sample receives B plus the universal component F and the site-varied component G, and the T2 sample additionally receives the randomized enhancement E.

Sources of Variation
Extracting the best information on the role of program components in generating impacts from HPOG sites will encompass:
• Natural variation in program features across sites
• Planned variation in certain sites, where subsets of participants are randomized to gain access to a specific program enhancement in addition to the basic program
• Person-to-person variation in components of the offered intervention actually received by individuals

Conceptually…
• A treatment group member, when exposed to treatment…
 – used a program component (e.g., emergency assistance)
 – achieved an interim outcome (e.g., a recognized credential)
• A control group member, if exposed to treatment, would have…
 – used a program component (e.g., emergency assistance)
 – achieved an interim outcome (e.g., a recognized credential)
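In standard potential-outcomes shorthand (my notation, not the slides’), the parallel phrasing above says that component use under treatment, \(S_i(1)\), is observed for treatment group members but latent for controls:

\[
S_i(1) =
\begin{cases}
\text{observed} & \text{if } T_i = 1 \\
\text{latent} & \text{if } T_i = 0
\end{cases}
\qquad \text{with} \qquad S(1) \perp T .
\]

Because assignment \(T\) is random, \(S(1)\) has the same distribution in both experimental arms, which is what makes it legitimate to define the same subgroup on both sides of the experiment, as the next slide does by prediction.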

Practically… Step 1
• Step 1: Use baseline (exogenous) characteristics to predict subgroup membership
 – To capitalize on the internal validity of the experimental design, use symmetric identification of treatment and control subgroups
• HPOG application
 – Use baseline variables… demographics, plus supplemental baseline questions on efficacy, work preferences, and barriers/needs
 – …to predict participation in various program components: used emergency assistance or other supports, used child care support/services, accessed a majority of available supports
 – …or to predict selected short-term achieved outcomes (mediators): earned a recognized credential, found a healthcare job
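A minimal sketch of this step, with hypothetical column names and an illustrative cutoff rule (the study’s actual specification may differ):

import pandas as pd
from sklearn.linear_model import LogisticRegression

BASELINE_X = ["age", "efficacy_score", "work_pref", "barrier_count"]  # hypothetical names

def predict_subgroups(df: pd.DataFrame) -> pd.DataFrame:
    """Fit a component-use model on treatment group members (the only people
    for whom use is observed), then score treatment AND control members with
    the same model, so that subgroup membership depends only on baseline data."""
    model = LogisticRegression(max_iter=1000)
    treated = df[df["treated"] == 1]
    model.fit(treated[BASELINE_X], treated["used_component"])
    out = df.copy()
    out["p_use"] = model.predict_proba(out[BASELINE_X])[:, 1]
    # Illustrative cutoff: classify as predicted users the share of the sample
    # that matches the observed use rate among treatment group members.
    share_users = treated["used_component"].mean()
    cutoff = out["p_use"].quantile(1 - share_users)
    out["predicted_user"] = out["p_use"] >= cutoff
    return out

Because the same fitted model scores both experimental arms, comparing predicted users in the treatment group to predicted users in the control group preserves the experiment’s internal validity.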

ASPES (Analysis of Symmetrically Predicted Endogenous Subgroups): Steps 2 & 3
• Step 2: Estimate impacts on predicted subgroups
 – These impact estimates are unbiased
• Step 3: Convert estimated impacts for predicted subgroups to represent actual subgroups
 – The conversion rests (1) on an assumption of the homogeneity of impacts among those predicted to be in a subgroup; and (2) on the foundation of an experimental impact estimate
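As a sketch of how the Step 3 conversion can work under the homogeneity assumption (my algebra, not reproduced from the slides): let \(\hat{\Delta}_p\) and \(\hat{\Delta}_n\) be the experimental impact estimates for predicted users and predicted non-users, and let \(\pi_p\) and \(\pi_n\) be the actual component-use rates observed among treatment group members within those two predicted subgroups. Then

\[
\begin{aligned}
\hat{\Delta}_p &= \pi_p \, \Delta_A + (1 - \pi_p) \, \Delta_N \\
\hat{\Delta}_n &= \pi_n \, \Delta_A + (1 - \pi_n) \, \Delta_N
\end{aligned}
\]

is a system of two linear equations in two unknowns. Provided the prediction has some power (\(\pi_p \neq \pi_n\)), solving it recovers \(\Delta_A\), the impact on actual users, and \(\Delta_N\), the impact on actual non-users, each anchored in an experimental estimate.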

Further Information
Molly Irwin, Federal Project Officer, HPOG, HHS/ACF/OPRE
Alan Werner & Laura Peck, Co-Principal Investigators, Abt Associates Inc.