Presentation transcript:

The Value of Random Assignment Impact Evaluations for Youth-Serving Interventions? Notes from Career Academy Research and Practice

James Kemple, Senior Fellow, MDRC

2. What Can Career Academy Research and Practice Offer Evidence-Based Policy?

Practice:
• A 34-year track record of implementation, planned expansion, and efforts at continuous improvement
• An intervention whose goals and core features align with important problems in high schools and with prominent policy options

Research:
• 25 years of non-experimental research and a commitment to learning what works
• A 10-year random assignment field experiment involving 9 sites, over 1,700 students, and 8 years of follow-up
• Positive effects on labor market outcomes without compromising academic goals

3. Context for Impact Evaluations
• Learning "what works" is a long-term and cumulative process
• Questions drive methodology, not the reverse
• Multiple questions require multiple methods
• Research ambition must be balanced against operational and political realities
• Knowledge-building should be an integral part of policy development and continuous improvement, not an add-on or afterthought

4. Why Conduct Impact Evaluations?
• Outcomes vs. impacts:
  – Outcome: a measure of individual or group behavior, attitudes, achievement, labor market participation, and so on
  – Impact: the effect of an intervention on an outcome, i.e., the difference between the outcome for the program group and the outcome under the counterfactual (formalized below)
• Outcome-focused studies risk getting the wrong answer to the right question
• Outcome standards risk rewarding programs:
  – based on who they serve, rather than what they do
  – that operate under promising conditions, rather than use promising practices
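Written out, the definition above takes the standard potential-outcomes form; this notation is a conventional formalization, not anything from the original slides:

\[
\widehat{\text{Impact}} = \bar{Y}_{\text{program}} - \bar{Y}_{\text{control}},
\qquad
\mathbb{E}\big[\bar{Y}_{\text{control}}\big] = \mathbb{E}\big[\bar{Y}_{\text{counterfactual}}\big] \ \text{under random assignment.}
\]

Because the counterfactual outcome for program participants is never directly observed, random assignment matters precisely because it makes the control-group mean an unbiased stand-in for it.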

5. Judging Program Impacts: High Outcomes / No Impact

[Figure: percent graduating on time, shown for the evaluation sample (Academy vs. Non-Academy groups) alongside national averages for similar students in similar schools (Career/Tech., General, and Academic programs); the chart's values are not preserved in the transcript.]

Note: National average estimates are adjusted to represent a sample with the same background characteristics as those in the evaluation sample.
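A small numeric sketch makes the slide's point concrete. The graduation rates below are hypothetical, chosen only to show how an outcome can clear a benchmark while the impact is zero:

```python
# Hypothetical illustration of "high outcomes / no impact".
# None of these numbers come from the Career Academies study.

academy_grad_rate = 0.85      # outcome for the program (Academy) group
non_academy_grad_rate = 0.85  # outcome for the randomized control group
national_average = 0.70       # assumed benchmark for similar students

outcome_vs_benchmark = academy_grad_rate - national_average  # +0.15: looks impressive
impact = academy_grad_rate - non_academy_grad_rate           # 0.00: no added value

print(f"Outcome vs. national average:   {outcome_vs_benchmark:+.2f}")
print(f"Impact (program minus control): {impact:+.2f}")
```

Judged against the national benchmark, the program looks effective; judged against its own randomized control group, it adds nothing. That is exactly the trap an outcome standard can fall into.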

6. Judging Program Impacts: Adding Value vs. Starting with a Strong Context

7. Career Academy Impacts on Average Earnings

[Figure: estimated impacts on average earnings; the chart's values are not preserved in the transcript.]

8. Guiding Principles for Impact Evaluations
• Random assignment may be the "gold standard," but it is not the "Philosopher's Stone" (i.e., it will not extend life or answer every important question)
• Questions drive methodology, not the reverse
• Because evaluations involve multiple questions, they require multiple methods
• Implementing methodologies requires balancing research ambition against operational realities
• Strong research designs cannot compensate for weak treatments

9. Conditions for Random Assignment
• Priority question: What is the impact?
• Ethical and legal standards:
  – No denial of services to which applicants are otherwise entitled
  – No reduction in expected service levels
  – Informed consent and data confidentiality
• Operational realities:
  – Collaboration between researchers and program managers
  – A structured process for program entry or access to resources
  – Excess demand: more eligible applicants than available program slots or resources
  – A fair method for allocating scarce resources (see the lottery sketch after this slide)
  – Opportunity for a "fair test" of the intervention
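When demand exceeds capacity, the fair allocation method and the random assignment mechanism can be one and the same lottery. A minimal sketch follows; the function and applicant names are illustrative assumptions, not MDRC's actual procedures:

```python
import random

def run_admission_lottery(applicants, n_slots, seed=2008):
    """Randomly allocate scarce program slots among eligible applicants.

    Applicants not offered a slot form the control group, so the lottery
    is both a fair rationing device and the basis for unbiased impact
    estimates.
    """
    rng = random.Random(seed)  # fixed seed keeps the draw auditable
    shuffled = applicants[:]   # copy so the input list is left untouched
    rng.shuffle(shuffled)
    return shuffled[:n_slots], shuffled[n_slots:]

# Example: 12 eligible applicants competing for 5 Academy slots.
applicants = [f"applicant_{i:02d}" for i in range(1, 13)]
program_group, control_group = run_admission_lottery(applicants, n_slots=5)
print("Offered a slot:", program_group)
print("Control group: ", control_group)
```

Publishing the seed, or drawing it in front of stakeholders, is one simple way to make the allocation verifiably fair to applicants and program staff alike.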

10. Conditions for a "Fair Test"
• Strong contrast with the status quo:
  – Implementation of the program being tested
  – Participant exposure to program services
  – A well-understood alternative to program services
• High-quality methods for answering questions about why programs are effective (or not) and for whom
• Dissemination of findings about what works and what does not work

11. Implications of the Career Academies Evaluation
• Random assignment provided findings that could not have been obtained with other designs
• Increased investments in career-related experiences during high school can improve post-secondary labor market prospects
• Career Academies serve as a viable pathway to post-secondary education, though not necessarily a better one than other opportunities
• Career Academies demonstrate that the goals of school-to-career and career technical education can be accomplished without compromising academic goals