Regional Educational Laboratory Southwest: The Effects of Teacher Professional Development on Student Achievement: Findings from a Systematic Review of Evidence

Presentation transcript:

Regional Educational Laboratory Southwest
The Effects of Teacher Professional Development on Student Achievement: Findings from a Systematic Review of Evidence
Presented by: Kwang Suk Yoon (AIR), Teresa Duncan (AIR), Sylvia Wen-Yu Lee (National Taiwan U), Kathy Shapley (Edvance Research, Inc.)
American Educational Research Association, New York, March 27, 2008

Background: Teacher Professional Development
- One of the key policy strategies for standards-based reform efforts
- No Child Left Behind Act: Teacher Quality provisions; "high-quality" professional development; the expectation that PD activities be regularly evaluated for their impact on teacher effectiveness and improved student achievement
- Until recently, little systematic effort to vet the effectiveness of PD

Objective
AIR completed a fast-turnaround study, sponsored by the Regional Educational Laboratory Southwest (RELSW) and funded by IES, to conduct a systematic review of research-based evidence on the effects of teacher professional development (PD) on student achievement.

Research-Based Evidence on the Effects of PD on Student Achievement
Challenges in demonstrating research-based evidence:
- Quality of the professional development
  - Workable theory of action (logic model)
  - Sufficient implementation
- Quality of the empirical study
  - Valid causal inferences
  - Rigorous study design

Overview of Methodology
- Systematic review: uses explicit and transparent methods; follows a set of standards; is accountable, replicable, and updatable
- Review protocol: aligned with What Works Clearinghouse (WWC) evidence standards; study selection criteria; a multi-stage, multi-coder review process (screening, coding, and reconciliation; see the sketch below); Evidence Review Tool (ERT)
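As a purely illustrative picture of the multi-coder step, the sketch below shows double coding with reconciliation of disagreements. The data fields and the reconciliation rule are assumptions; this is not the actual Evidence Review Tool.

```python
# Hypothetical sketch of double coding plus reconciliation (not the actual ERT).
from dataclasses import dataclass, field

@dataclass
class Coding:
    study_id: str
    coder: str
    codes: dict = field(default_factory=dict)  # e.g., {"design": "RCT"}

def reconcile(a: Coding, b: Coding) -> dict:
    """Keep items both coders agree on; flag disagreements for a third reviewer."""
    final, disputed = {}, []
    for key in a.codes.keys() | b.codes.keys():
        if a.codes.get(key) == b.codes.get(key):
            final[key] = a.codes.get(key)
        else:
            disputed.append(key)
    return {"final": final, "needs_reconciliation": disputed}

c1 = Coding("study_001", "coder_A", {"design": "RCT", "attrition_ok": True})
c2 = Coding("study_001", "coder_B", {"design": "RCT", "attrition_ok": False})
print(reconcile(c1, c2))  # "attrition_ok" is flagged for reconciliation
```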

Study Selection Criteria (sketched as a simple filter below)
- Topic: inservice teacher professional development
- Population: K-12 students and their teachers
- Subject: reading/English/language arts, mathematics, or science
- Study design: randomized controlled trial (RCT), or quasi-experimental design (QED) with a matched comparison group
- Outcome: student achievement
  - Measures and their psychometric properties
- Time:
- Country: Australia, Canada, the United Kingdom, or the United States
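A minimal sketch of how these criteria act as a screen, assuming hypothetical field names (the actual review applied them through the ERT protocol, not through code):

```python
# Toy screening filter for the selection criteria; field names are hypothetical.
SUBJECTS = {"reading/English/language arts", "mathematics", "science"}
DESIGNS = {"RCT", "QED with matched comparison group"}
COUNTRIES = {"Australia", "Canada", "United Kingdom", "United States"}

def passes_screen(study: dict) -> bool:
    """True only if a study satisfies every selection criterion."""
    return (
        study.get("topic") == "inservice teacher PD"
        and study.get("population") == "K-12 students and their teachers"
        and study.get("subject") in SUBJECTS
        and study.get("design") in DESIGNS
        and study.get("has_student_achievement_outcome", False)
        and study.get("country") in COUNTRIES
    )

candidate = {
    "topic": "inservice teacher PD",
    "population": "K-12 students and their teachers",
    "subject": "mathematics",
    "design": "RCT",
    "has_student_achievement_outcome": True,
    "country": "United States",
}
print(passes_screen(candidate))  # True
```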

Overview of the Review Process

Literature Search & Prescreening
- Literature searches: keyword searches of 7 major electronic databases; contacted key researchers; identified 1,343 citations potentially addressing the effects of PD on student achievement
- Prescreening: quickly scanned abstracts against a few selection criteria (e.g., is it an empirical study?); narrowed the pool to 132 relevant studies

Stage-1 Coding
- Reasons for failing the Stage-1 screening criteria: of the 132 relevant studies, about two-thirds failed on the "study design" criterion

Stage-2 Coding
Determining the study quality ratings (N = 27 relevant studies eligible for rating; a simplified sketch of the rating logic follows below):
- 9 met WWC evidence standards
  - 5 met evidence standards with reservations
  - 4 met evidence standards without reservations
- 18 failed to meet the evidence standards
  - RCTs: randomization, attrition, disruption, etc.
  - QEDs: baseline equivalence, attrition, disruption, etc.
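A simplified sketch of the three-way rating logic, paraphrasing the checks named on the slide (the full WWC standards involve many more checks than this):

```python
# Simplified Stage-2 rating logic; not the full WWC rubric.
def wwc_rating(design: str, randomization_ok: bool, attrition_ok: bool,
               baseline_equivalent: bool) -> str:
    if design == "RCT" and randomization_ok and attrition_ok:
        return "meets evidence standards without reservations"
    # A compromised RCT or a matched QED may still qualify with reservations
    # if baseline equivalence is demonstrated and attrition is acceptable.
    if baseline_equivalent and attrition_ok:
        return "meets evidence standards with reservations"
    return "does not meet evidence standards"

print(wwc_rating("RCT", True, True, True))    # without reservations
print(wwc_rating("QED", False, True, True))   # with reservations
print(wwc_rating("QED", False, True, False))  # does not meet standards
```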

Nine Studies
- Carpenter et al., 1989 (RCT)
- Cole, 1992 (RCT)
- Duffy et al., 1986 (RCT)
- Marek & Methven, 1991 (QED)
- McCutchen et al., 2002 (QED)
- McGill-Franzen et al., 1999 (RCT)
- Saxe et al., 2001 (QED)
- Sloan, 1993 (RCT)
- Tienken, 2003 (RCT with group equivalence problems)

Stage-3 Coding
Documentation of studies:
- Effect size (see the standard formulation below)
- Characteristics of the PD
  - Form
  - Duration, contact hours
  - Content (Kennedy's 1998 classification)
  - Provider
  - Participants (volunteers?)
- Information about the implementation of the PD
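For reference, the documented effect sizes are standardized mean differences; one standard formulation (our addition, not quoted from the slides) is Hedges' g with its small-sample correction:

```latex
g \;=\; \left(1 - \frac{3}{4(n_T + n_C) - 9}\right)
\cdot \frac{\bar{Y}_T - \bar{Y}_C}
           {\sqrt{\dfrac{(n_T - 1)S_T^2 + (n_C - 1)S_C^2}{n_T + n_C - 2}}}
```

where \(\bar{Y}_T\) and \(\bar{Y}_C\) are the treatment and comparison group means, \(S_T\) and \(S_C\) their standard deviations, \(n_T\) and \(n_C\) the group sizes, and the leading factor is the small-sample correction.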

Results (1)
- Paucity of rigorous studies: only 9 studies met evidence standards; mostly small-scale, underpowered efficacy trials
- Distribution of the 9 studies
  - By study design: 5 RCTs, 4 QEDs
  - By content area: concentrated in reading/English/language arts
  - By grade level: all focused on the elementary school level

Results (2)
- Overall effect size: the average of 20 effect sizes (drawn from the 9 studies) = .54; of the 20 effect sizes, 12 were not statistically significant
- Effects by subject area: fairly consistent across the three subject areas
- Effects by form, duration, and intensity of PD: little variability in form (all workshops or summer institutes); some evidence of an effect for intensive PD
- Effects by content of PD: no consistent pattern; failed to replicate Kennedy's (1998) finding
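To give the .54 estimate an intuitive scale, a standardized effect can be converted to a percentile gain under a normality assumption (the conversion below is our illustration, not a figure from the slides):

```latex
100 \times \big(\Phi(0.54) - 0.50\big) \;\approx\; 100 \times (0.705 - 0.500) \;\approx\; 21 \text{ percentile points}
```

That is, the average student of a teacher who received the PD would score about 21 percentile points above the average comparison-group student.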

Conclusion
There is some evidence of positive effects of PD on student achievement.
- Caveat: the limited number of studies and the variability in their professional development approaches preclude detailed conclusions about the effectiveness of particular professional development programs, or about the effectiveness of professional development by features such as form and content.
- Research on the impact of PD is still in its developmental stage.

Suggestions (1)
- Conduct more well-designed efficacy trials, replications, and effectiveness trials
- Improve the design of PD impact studies
  - Redress common reasons for failing WWC evidence standards (e.g., lack of baseline equivalence in QEDs)
  - Increase statistical power to detect effects (see the illustration below)
  - Align outcome measures with the PD
  - Examine mediation effects
  - Address the potential confounding of PD with other important instructional factors (e.g., curriculum)
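To illustrate the power point, the sketch below uses statsmodels to ask how many units per group a simple two-group comparison needs to detect a given effect size. It treats units as independently randomized and ignores the clustering of students within teachers and schools, so real designs need considerably more.

```python
# Rough power illustration; ignores clustering, so real requirements are larger.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()

# Units per group to detect the review's average effect (.54)
# with 80% power at alpha = .05:
print(round(power.solve_power(effect_size=0.54, alpha=0.05, power=0.8)))  # ~55

# A smaller, arguably more typical effect of .20 demands far more:
print(round(power.solve_power(effect_size=0.20, alpha=0.05, power=0.8)))  # ~394
```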

Suggestions (2)
- Adequately document the PD and the study
  - PD: theory of action, implementation
  - Sample and clusters, if any
  - Group assignment
  - Baseline equivalence
  - Effect size (ES)
- Use structured abstracts to facilitate research synthesis
  - Structured abstract (Mosteller et al., 2004)
  - Claim-based structured abstract (Kelly & Yin, 2007)

Contact Info & Links
- Kwang Suk Yoon ( )
- Link to our RELSW report on the IES website: id=70
- Link to our presentation material
Thank you!

Message
A "stainless steel" law of systematic reviews may be operating, namely: "the more rigorous the review, the less evidence there will be that the intervention is effective." (Peter Rossi)

Logic Model