Regional Educational Laboratory - Southwest
Review of Evidence on the Effects of Teacher Professional Development on Student Achievement: Findings & Suggestions for Future Evaluation Designs

Presentation transcript:

Regional Educational Laboratory - Southwest
Review of Evidence on the Effects of Teacher Professional Development on Student Achievement: Findings & Suggestions for Future Evaluation Designs
Presented by: Kwang Suk Yoon (AIR), Teresa Duncan (AIR), Sylvia Lee (Nat’l Taiwan U), Kathy Shapley (Edvance Research, Inc.)
American Evaluation Association, Washington, DC, November 8, 2007

Objective
To conduct a systematic review of research-based evidence on the effects of teacher professional development (PD) on student achievement.

Overview of Methodology
Systematic review
- Using explicit & transparent methods
- Following a set of standards
- Being accountable, replicable, and updatable
Review protocol
- Aligned with What Works Clearinghouse (WWC) standards
- Study selection criteria
- Rigorous evidence standards
- Multi-coder, multi-stage review process: screening, coding & reconciliation
- Evidence Review Tool (ERT)

Overview of the coding process

Literature search
- Electronic searches by keywords

Study selection criteria
Relevance of study by:
- Topic
- Population
- Subject
- Study design
  - Randomized controlled trial (RCT)
  - Quasi-experimental design (QED) with matched comparison group
- Student achievement outcome
  - Measures and their psychometric properties
- Time
- Country
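In practice these criteria are applied as a sequential screen over coded study records. The sketch below only illustrates that kind of screen; the field names, accepted values, time window, and the include_study helper are hypothetical and are not taken from the actual Evidence Review Tool or review protocol.

```python
# Hypothetical illustration of applying study selection criteria as a screen.
# Field names and accepted values are assumptions, not the review's actual protocol.

ELIGIBLE_DESIGNS = {"RCT", "QED_matched"}  # randomized trial or matched quasi-experiment

def include_study(study: dict) -> bool:
    """Return True if a coded study record passes all selection criteria."""
    return (
        study.get("topic") == "teacher_pd"                 # relevance of topic
        and study.get("population") == "k12_teachers"      # relevant population
        and study.get("design") in ELIGIBLE_DESIGNS        # acceptable study design
        and study.get("outcome") == "student_achievement"  # student achievement outcome
        and study.get("year", 0) >= 1986                   # time window (assumed value)
        and study.get("country") == "US"                   # country restriction (assumed)
    )

studies = [
    {"topic": "teacher_pd", "population": "k12_teachers", "design": "RCT",
     "outcome": "student_achievement", "year": 2002, "country": "US"},
    {"topic": "teacher_pd", "population": "k12_teachers", "design": "pre_post_only",
     "outcome": "student_achievement", "year": 1999, "country": "US"},
]
print([include_study(s) for s in studies])  # [True, False]
```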

Prescreening stage
- N = 1,343 “potentially relevant” studies
Reasons for failing selection criteria (1)

Stage-1 coding
- N = 132 relevant studies
Reasons for failing selection criteria (2)

Stage-2 coding
- N = 27 relevant studies eligible for quality ratings
- 18 failed to meet WWC evidence standards
  - RCT: randomization, attrition, disruption, etc.
  - QED: baseline equivalence, attrition, disruption, etc.
- Inter-rater reliability (see the sketch below)
- 4 met evidence standards without reservations
- 5 met evidence standards with reservations
Reasons for failing evidence standards
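The slides mention inter-rater reliability for the multi-coder review but do not say which statistic was computed. A common choice for two coders making include/exclude judgments is Cohen's kappa; the sketch below is a minimal, generic implementation with made-up ratings, not the review's actual reliability procedure.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items (illustrative only)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement rate
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical include/exclude decisions from two coders on ten studies
a = ["inc", "inc", "exc", "inc", "exc", "exc", "inc", "exc", "inc", "exc"]
b = ["inc", "inc", "exc", "exc", "exc", "exc", "inc", "exc", "inc", "inc"]
print(round(cohens_kappa(a, b), 2))  # 0.6
```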

Stage-3 coding
- Effect size (see the sketch below)
- Characteristics of PD
  - Form, duration, contact hours
  - Content
  - Provider
  - Participants (volunteers?)
- Information about the implementation of PD
  - Replicability issue
- Documentation of studies
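Stage-3 coding extracts an effect size from each eligible study. The slides do not show the formula used; a standard choice in WWC-aligned reviews is the standardized mean difference with Hedges' small-sample correction, sketched below with invented group statistics rather than data from any reviewed study.

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with Hedges' small-sample correction (illustrative)."""
    # Pooled within-group standard deviation
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    # Small-sample correction factor (common approximation)
    correction = 1 - 3 / (4 * (n_t + n_c) - 9)
    return d * correction

# Made-up example: PD group vs. comparison group on a student achievement test
print(round(hedges_g(mean_t=52.0, mean_c=48.0, sd_t=10.0, sd_c=9.0, n_t=60, n_c=60), 2))
```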

Results (1)
Paucity of rigorous studies
- Only 9 studies met evidence standards
Distribution of the 9 studies
- By study design: 5 RCT, 4 QED
- By content area: concentrated in reading
- By grade level: all focused on the elementary school level

Results (2)
Overall effect size
- Average of 20 effect sizes drawn from the 9 studies = 0.54 (see the sketch below)
- Of the 20 effect sizes, 12 were not statistically significant; nine of those 12, however, are substantively important according to WWC conventions.
Effects by subject area
- Fairly consistent across the three subject areas
Effects by form, duration, and intensity of PD
- Some evidence of an effect of intensive PD
Effects by content of PD
- No consistent pattern
- Failed to replicate Kennedy’s (1998) finding
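Under WWC conventions an effect size of at least 0.25 standard deviations is treated as "substantively important" even when it is not statistically significant, which is how 9 of the 12 non-significant effects are characterized above. The sketch below shows the averaging-and-flagging logic with invented effect sizes, not the 20 actually coded in the review.

```python
# Invented effect sizes standing in for the 20 coded in the review
effect_sizes = [0.81, 0.12, 0.40, 0.27, 0.66, 0.19, 0.55, 0.31]

average = sum(effect_sizes) / len(effect_sizes)
substantively_important = [es for es in effect_sizes if es >= 0.25]  # WWC convention

print(f"average effect size: {average:.2f}")
print(f"substantively important (ES >= 0.25): {len(substantively_important)} of {len(effect_sizes)}")
```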

Suggestions (1)
- Matching units of assignment and analysis
- Increasing statistical power to detect effects (see the design-effect sketch below)
- Considering potential confounding of PD with other important instructional factors (e.g., curriculum)
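One reason matching units of assignment and analysis matters is statistical power: when PD is assigned to teachers or schools but outcomes are student scores, clustering shrinks the effective sample size by the design effect 1 + (m - 1) * ICC. The sketch below illustrates that adjustment; the cluster size and intraclass correlation are illustrative assumptions, not values from the reviewed studies.

```python
def effective_sample_size(n_students, cluster_size, icc):
    """Student sample size discounted by the clustering design effect (illustrative)."""
    design_effect = 1 + (cluster_size - 1) * icc
    return n_students / design_effect

# e.g., 1,000 students in classes of 25 with an assumed intraclass correlation of 0.15
print(round(effective_sample_size(1000, 25, 0.15)))  # about 217 "effective" students
```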

Suggestions (2)
Adequate documentation of PD & study
- PD implementation
- Sample and clusters, if any
- Group assignment
- Baseline equivalence
- Calculating and reporting effect sizes (ES): weighting ES, adjusting ES for multiple comparisons, and correcting ES for clustering (see the sketch below)
Use of structured abstracts to facilitate research synthesis
- Structured abstract (Mosteller et al., 2004)
- Claim-based structured abstract (Kelly & Yin, 2007)
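WWC reporting guidance includes a Benjamini-Hochberg correction when a study tests many outcomes; the exact adjustment used in this review is not shown on the slide. The sketch below is a minimal, generic Benjamini-Hochberg step with made-up p-values, offered only to illustrate what "adjusting for multiple comparisons" involves.

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Indices of findings that stay significant after the Benjamini-Hochberg
    false discovery rate correction (illustrative, not the review's procedure)."""
    m = len(p_values)
    ranked = sorted(range(m), key=lambda i: p_values[i])  # indices by ascending p-value
    cutoff_rank = 0
    for rank, idx in enumerate(ranked, start=1):
        # Largest rank k with p_(k) <= (k / m) * alpha determines the rejection set
        if p_values[idx] <= rank / m * alpha:
            cutoff_rank = rank
    return sorted(ranked[:cutoff_rank])

# Made-up p-values for several outcomes within one study
print(benjamini_hochberg([0.003, 0.021, 0.040, 0.280, 0.620]))  # [0]
```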

Contact info & links
- Kwang Suk Yoon ( )
- Link to our RELSW report on the IES website: id=70
- Link to our presentation material
Thank you!