Bringing Evidence-Driven Progress To Education MSP Regional Conference - Boston Jon Baron The Coalition for Evidence-Based Policy March 30, 2006.

Presentation transcript:

Bringing Evidence-Driven Progress To Education MSP Regional Conference - Boston Jon Baron The Coalition for Evidence-Based Policy March 30, 2006

Coalition for Evidence-Based Policy Board of Advisors
Robert Boruch – Co-Chair, Campbell Collaboration
Jonathan Crane – Sr Fellow, Progressive Policy Institute
David Ellwood – Dean, Harvard's JFK School
Judith Gueron – fmr President, MDRC
Ron Haskins – Sr Advisor to President for Welfare Policy
Robert Hoyt – Founder, Jennison Associates
Blair Hull – Founder, Hull Trading Co
David Kessler – fmr FDA Commissioner
Jerry Lee – President, Jerry Lee Foundation
Dan Levy – Researcher, Mathematica
Diane Ravitch – fmr Asst Secretary of Education
Laurie Robinson – fmr Asst Attorney General, DOJ
Howard Rolston – fmr Director of Research, HHS/ACF
Isabel Sawhill – fmr Associate Director of OMB
Martin Seligman – fmr President, American Psychological Assn
Robert Slavin – Co-Director of CRESPAR at Johns Hopkins Univ
Robert Solow – Nobel Laureate in Economics, MIT
Nicholas Zill – Vice-President, Westat

The Problem: Little progress in many areas of social policy
- The U.S. has made no significant progress against substance abuse.
- The U.S. has made very limited progress in raising K-12 achievement over the past 30 years.
- The U.S. poverty rate today is higher than it was in 1973.

Randomized trials have identified a few highly effective interventions:
- SMART tutoring program for at-risk readers in grades 1-2: at 2-year followup, increases students' national percentile ranking in reading ability from roughly the 20th percentile to roughly the 30th percentile.
- LifeSkills Training: reduces smoking by 20% and serious levels of substance abuse by roughly 30% by the end of high school.
- Good Behavior Game: at ages 19-21, reduces substance abuse by 30-60%; at ages 11-13, reduces school suspensions, conduct disorder, and smoking/hard-drug use by 25-60%.

Randomized trials have identified a few interventions that are ineffective or harmful:
- DARE: ineffective in preventing substance use, according to randomized trials (now being redesigned).
- Scared Straight: causes a small increase in subsequent criminal activity by participating youth.

We seek to advance a major federal/state strategy to:
1. Fund rigorous studies – particularly randomized trials – to build the knowledge base of research-proven interventions.
2. Spur widespread use of such interventions by recipients of federal and state funds.

Challenges: How one measures an intervention's effectiveness is extremely important
- Randomized trials are considered the "gold standard" in other fields for establishing what works.
- Nonrandomized designs – even the best designs – can produce erroneous conclusions.
- Well-matched comparison-group studies can produce "possible" evidence, but results need to be confirmed in randomized trials wherever feasible.
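The slide's point about why nonrandomized comparisons can mislead can be shown with a toy simulation (not from the presentation; all names and numbers here are invented for illustration). When motivated students both enroll more often and score higher, a naive enrolled-vs-not comparison conflates motivation with the program's effect; a coin-flip assignment breaks that link:

```python
import random

random.seed(0)

TRUE_EFFECT = 5.0  # hypothetical true program effect, in test-score points

def outcome(motivation, gets_program):
    """Score depends on motivation (a confounder) plus the program effect."""
    return 50 + 10 * motivation + (TRUE_EFFECT if gets_program else 0) + random.gauss(0, 5)

def simulate(n=20000):
    # Self-selected comparison: motivated students opt in more often.
    treated, untreated = [], []
    for _ in range(n):
        m = random.gauss(0, 1)
        enrolls = random.random() < (0.7 if m > 0 else 0.3)
        (treated if enrolls else untreated).append(outcome(m, enrolls))
    naive = sum(treated) / len(treated) - sum(untreated) / len(untreated)

    # Randomized trial: a coin flip decides who gets the program.
    t, c = [], []
    for _ in range(n):
        m = random.gauss(0, 1)
        assigned = random.random() < 0.5
        (t if assigned else c).append(outcome(m, assigned))
    rct = sum(t) / len(t) - sum(c) / len(c)
    return naive, rct

naive, rct = simulate()
print(f"naive comparison estimate: {naive:.1f}")  # overstates the effect
print(f"randomized estimate:       {rct:.1f}")    # close to the true 5.0
```

With these made-up parameters the naive comparison roughly doubles the true effect, while the randomized estimate recovers it; this is the generic selection-bias mechanism, not a result from any study cited in the talk.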

[Chart] Job Training Partnership Act: Impact on Earnings of Male Youth (Non-arrestees) – earnings of the program group versus the control group.

[Chart] Impact of Career Academies on High School Graduation Rates – percent graduating (on time and late) for Career Academy students versus the randomized control group, and versus a non-randomized comparison group.*
Source: Data provided by James Kemple, MDRC.
*The comparison group is similar students in similar schools nationwide who are enrolled in the general (as opposed to career) curriculum. Their estimated graduation rates are statistically adjusted to control for observed differences in their background characteristics versus the program group.

Evidence-Based Education “Help Desk” Launched January 2006 by the What Works Clearinghouse of the U.S. Education Department’s Institute of Education Sciences

Recent federal efforts to advance evidence-based education:
- Education Sciences Reform Act of 2002.
- "Scientifically-based research" provisions in No Child Left Behind.
- ED's competitive priority for grant applications that include a rigorous evaluation.
- Increased Congressional funding for rigorous education research.

We held focus groups of potential Help Desk users: Focus groups included (i) federal/state/local officials, (ii) researchers, and (iii) program providers. They asked: What are the concrete, practical steps we need to take to 1. Advance rigorous evaluations; and 2. Identify and implement evidence-based programs and practices?

Illustrative areas of researcher need for assistance: Can we randomly assign individual students, or do we need to randomly assign whole classrooms or schools? How do we gain the cooperation of school officials in (i) random assignment, and (ii) access to student educational records? What are the key items to get right when conducting a randomized controlled trial?
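The first question above – randomizing individual students versus whole classrooms or schools – can be sketched in code (a hypothetical illustration, not part of the presentation; the roster and function names are invented). Cluster assignment is used when the intervention, such as a new curriculum, cannot vary within a classroom:

```python
import random

random.seed(1)

def assign_individuals(student_ids):
    """Individual-level random assignment: each student is a separate coin flip."""
    return {s: ("treatment" if random.random() < 0.5 else "control")
            for s in student_ids}

def assign_clusters(classrooms):
    """Cluster random assignment: one coin flip per classroom, so every
    student in a classroom shares the same condition."""
    arms = {}
    for room, students in classrooms.items():
        arm = "treatment" if random.random() < 0.5 else "control"
        for s in students:
            arms[s] = arm
    return arms

# Hypothetical roster: three classrooms of four students each.
rooms = {"A": [1, 2, 3, 4], "B": [5, 6, 7, 8], "C": [9, 10, 11, 12]}

by_student = assign_individuals([s for kids in rooms.values() for s in kids])
by_room = assign_clusters(rooms)

# Under cluster assignment, classmates always share an arm.
assert all(len({by_room[s] for s in kids}) == 1 for kids in rooms.values())
```

The trade-off, in brief: individual assignment gives more statistical power for a given number of students, while cluster assignment avoids contamination between treated and untreated students in the same room and requires analyzing results at the cluster level.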

Illustrative areas of publisher/program provider need for assistance: We'd like to sponsor a rigorous evaluation of our program that will meet WWC standards – how can we find researchers capable of conducting a rigorous evaluation? Are there conditions in which a randomized controlled trial can be conducted at low cost (e.g., $50,000-$75,000)? What are the key items I should ask the research team to include in its study report?

Illustrative areas of federal/state/local officials' need for assistance: Federal or State official: How do we incentivize our grantees to do a randomized controlled evaluation, or effectively implement evidence-based practices? (e.g., what language should we include in our RFP?) School district superintendent: What are the key items to get right when replicating an evidence-based program in our school district?

Help Desk seeks to provide answers to many of these questions from the user's desktop:
- Through practical, easy-to-use resources (e.g., the WWC and user-friendly "how-to" guides).
- Users access these tools through a web site managed by knowledgeable "moderators" immediately available by phone or e-mail.

Illustrative examples of Help Desk "how-to" resources:
- Key Items to Get Right When Conducting a Randomized Controlled Trial in Education
- Reporting the Results of Your Study: A User-Friendly Guide
- Identifying and Implementing Educational Practices Supported By Rigorous Evidence: A User-Friendly Guide
- How To Conduct Rigorous Evaluations of Math and Science Partnership (MSP) Projects: A User-Friendly Guide
- Future guides: finding a capable evaluator, interpreting and applying What Works Clearinghouse reports, conducting low-cost randomized controlled trials

Jon Baron The Coalition for Evidence-Based Policy