Strategies for Effective Program Evaluations
U.S. Department of Education
The contents of this presentation were produced by the Coalition for Evidence-Based Policy, in partnership with the National Opinion Research Center (NORC) at the University of Chicago, under a contract with the Institute of Education Sciences.

Step 1: Find a researcher with expertise in conducting rigorous impact evaluations to include on the study team.
 Contact authors of previous, well-designed impact evaluations.
 Ask the evaluator to provide a plan that describes, in non-technical language, how they would conduct the study.
 Check references.

Step 2: Decide what research question(s) the study seeks to answer.
 Evaluate a specific, well-defined MSP approach.
 Measure the effect of the program on teachers’ content knowledge.
 Measure the impact on student knowledge.
 If resources permit, ask questions about the long-term effects of the MSP project.

Step 3: Decide on the study design.
 Decide on the overall design: preferably a randomized controlled trial or, if that is not possible, a well-matched comparison-group study.
 Decide on the sample size needed to measure teacher content knowledge: at least 90 teachers.
 Decide on the sample size needed to measure student achievement: at least 60 teachers.
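The teacher counts above are the presentation’s rules of thumb. As a rough illustration only, a minimal power calculation for the teacher-knowledge outcome might look like the sketch below; the 0.50 standard-deviation effect size, 80 percent power, and 5 percent significance level are illustrative assumptions, not figures from the presentation.

# A minimal power-analysis sketch (illustrative, not from the presentation): it shows
# how an evaluator might check a proposed sample size for the teacher content-knowledge
# outcome. The effect size, power, and significance level below are assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Teachers needed per group to detect a 0.50 SD difference in content knowledge
# with 80% power at a 5% two-sided significance level.
n_per_group = analysis.solve_power(
    effect_size=0.50,        # assumed standardized effect on teacher content knowledge
    alpha=0.05,              # two-sided significance level
    power=0.80,              # desired statistical power
    ratio=1.0,               # equal-size program and control groups
    alternative="two-sided",
)
print(f"Teachers needed per group: {n_per_group:.0f} (total: {2 * n_per_group:.0f})")

A calculation for the student-achievement outcome would also need to account for the clustering of students within teachers, which is why the presentation’s two sample-size targets differ.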

Step 3 continued:
 Decide how to recruit and allocate teachers.
 The simplest way to allocate teachers in an RCT is to apply random assignment to the entire sample. However, if there are large differences across the sample (e.g., high-achieving versus low-achieving teachers), group the teachers into blocks first and then randomly assign within each block.
 In a comparison-group study the teachers must be very closely matched: first by their achievement levels, and second by their students’ achievement.
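The blocking idea can be made concrete with a short sketch. Everything in it is hypothetical (the teacher IDs, the achievement-level labels, the 50/50 split within each block); it simply shows random assignment carried out within blocks rather than across the whole sample at once.

# A sketch of blocked (stratified) random assignment for a hypothetical teacher roster.
import random

random.seed(20110321)  # fixed seed so the evaluator can reproduce the allocation

teachers = [
    {"id": "T01", "block": "high-achieving"},
    {"id": "T02", "block": "high-achieving"},
    {"id": "T03", "block": "low-achieving"},
    {"id": "T04", "block": "low-achieving"},
    # ... remaining teachers
]

# Group teachers by block, then randomize within each block so that the program and
# control groups are balanced on achievement level.
blocks = {}
for t in teachers:
    blocks.setdefault(t["block"], []).append(t["id"])

assignments = {}
for block, ids in blocks.items():
    random.shuffle(ids)
    half = len(ids) // 2
    for tid in ids[:half]:
        assignments[tid] = "program"
    for tid in ids[half:]:
        assignments[tid] = "control"

print(assignments)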

Step 3 continued:
 Decide how to measure the MSP project’s outcomes for student achievement. Three conditions must be met: the study must obtain scores for individual students; it must obtain scores before students enter the teacher’s class and again at the end of the study; and it must be able to convert the scores so that they enable comparisons across grade levels.
 Decide how to measure teacher content knowledge.
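One common way to satisfy the third condition (scores that can be compared across grade levels) is to standardize each student’s score within grade. The sketch below assumes a hypothetical data file with student ID, grade, and math-score columns; a state assessment that already provides vertically scaled scores would not need this step.

# A minimal sketch of within-grade standardization: convert each score to a z-score
# relative to its grade's mean and standard deviation. The data are hypothetical.
import pandas as pd

scores = pd.DataFrame({
    "student_id": ["S1", "S2", "S3", "S4"],
    "grade":      [4, 4, 5, 5],
    "math_score": [210.0, 230.0, 305.0, 295.0],
})

# Standardize within grade: (score - grade mean) / grade standard deviation.
scores["math_z"] = (
    scores.groupby("grade")["math_score"]
          .transform(lambda s: (s - s.mean()) / s.std(ddof=0))
)
print(scores)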

Step 3 Example: [slide diagram comparing the program group and the control group]

Step 4: Gain the cooperation of teachers and school officials.
 Identify one or more senior-level advocates for the study.
 Gain teachers’ cooperation by explaining the benefits of the research design.
 Satisfy privacy (FERPA) concerns in your design.
 Give special consideration to a matched comparison group of teachers drawn from another school district.

Step 5: Allocate teachers to the program and control (or matched comparison) groups.
 The evaluator, rather than the project director, should conduct the random assignment of teachers.
 Ask participating teachers not to share their materials.
 Ensure that students are assigned to classes using the normal procedures.

Step 6: Collect the data needed to measure the MSP project’s effectiveness.
 Make the process as short and streamlined as possible.
 Provide a unique personal identifier for each participant.
 At the time of assignment, obtain pre-program performance data.
 Make every effort to collect data from at least 80 percent of the original sample of teachers.
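A short, purely illustrative sketch of how an evaluator might monitor the 80 percent data-collection target during the study; the teacher IDs and group labels are hypothetical.

# Track how many of the originally assigned teachers have complete data, by group,
# against the 80 percent target.
def retention_report(assigned: dict[str, str], collected: set[str]) -> None:
    """assigned maps teacher ID -> group; collected holds IDs with complete data."""
    for group in ("program", "control"):
        ids = [tid for tid, g in assigned.items() if g == group]
        rate = sum(tid in collected for tid in ids) / len(ids)
        flag = "" if rate >= 0.80 else "  <-- below 80% target, follow up"
        print(f"{group}: {rate:.0%} of {len(ids)} teachers{flag}")

# Hypothetical usage:
assigned = {"T01": "program", "T02": "program", "T03": "control", "T04": "control"}
collected = {"T01", "T03", "T04"}
retention_report(assigned, collected)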

Step 7: Analyze and report the study’s results.
 Obtain regression-adjusted estimates of the MSP project’s effect on student achievement.
 Use the students’ pre-program test score as a covariate.
 Use data from all of the original participants.
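As an illustration of a regression-adjusted estimate, the sketch below regresses students’ post-test scores on a treatment indicator and their pre-program scores. The file name and column names (posttest, pretest, treated, teacher_id) are assumptions, not part of the presentation; the cluster-robust standard errors account for students being grouped within teachers.

# A minimal regression-adjustment sketch, assuming a hypothetical student-level file
# with columns: posttest, pretest, treated (0/1), and teacher_id.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_scores.csv")  # hypothetical file name

# Post-test regressed on the treatment indicator, with the pre-program score as a
# covariate; the coefficient on `treated` is the adjusted estimate of the effect.
model = smf.ols("posttest ~ treated + pretest", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["teacher_id"]}
)
print(model.summary())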

[Slide diagram linking Improved Intervention and Improved Evaluation]

Help Desk
Website:
Phone: WWC-9799 (8am-8pm ET Mon-Fri)
The Help Desk’s mission is to provide federal, state, and local education officials, researchers, program providers, and educators with practical, easy-to-use tools to (i) advance rigorous evaluations of educational interventions (i.e., programs, products, practices, and policies), and (ii) identify and implement evidence-based interventions.