What Works Clearinghouse
Susan Sanchez, Institute of Education Sciences

Purpose: To promote education decision making through a web-based dissemination system featuring comprehensive, systematic, high-quality reviews of studies on the effectiveness of educational interventions (programs, products, practices, and policies).

The WWC Does Not:
- Endorse educational interventions
- Conduct field studies of the effects of interventions
Rather, the WWC reports on the effectiveness of educational interventions as measured by available evidence.

Challenges
- The research literature is too abundant for individual selection. Where to begin?
- It is difficult to sift the literature and know what to trust and what to use. There are many claims of effectiveness out there, but which can you trust?
- The process has to be transparent if it is to be believed.
- Complex, technical issues can take a very long time to resolve.

Challenges (continued)
- Oops: we got the answer, but now there are new advances in methodology.
- Consumers need a fast response for evidence on what works.
- This is pretty wonky stuff. How do you create user-friendly web-based reports and products that meet your customers' needs?

WWC Systematic Review Process A systematic review is a review of the evidence on a clearly formulated question that uses systematic and explicit methods to identify, select and critically appraise relevant research, and to extract and analyze data from the studies that are included in the review.

WWC Systematic Review Process
- Select the topic
- Develop the review protocol:
  - Research questions
  - Search parameters
  - Inclusion/exclusion criteria
- Conduct a literature search and identify the research
- Screen and code studies for relevance and methodological quality
- Analyze the results of eligible studies
- Summarize the results

Study Review Process (flowchart): Studies enter the process through literature searches and through submissions from the public and intervention developers. After the review protocol is developed, each study is checked against the screening standards. Studies that do not pass the screen are excluded; studies that pass are reviewed against the WWC Evidence Standards and rated as Meets WWC Evidence Standards, Meets WWC Evidence Standards with Reservations, or Does Not Meet WWC Evidence Screens.

Three Stages of Review
1. Screen studies for relevance: select only relevant evidence.
2. Assess the strength of the evidence and sort by rigor: select only credible evidence, and sort credible evidence by whether there are any reservations.
3. Identify other important study characteristics.

Stage 1: Screening for Relevance
All potentially relevant studies are identified through an extensive search. The screening question: does the study provide evidence that, if credible, would be a valuable addition to the knowledge base regarding the effectiveness of the focal intervention?

Six Essential Screeners
1. Relevant time frame
2. Relevant intervention
3. Relevant sample
4. Relevant outcome
5. Adequate outcome measures
6. Adequate reporting
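Conceptually, Stage 1 is a simple conjunction: a study passes only if every one of the six screeners is satisfied. A minimal sketch in Python, assuming illustrative flag names of my own (the WWC does not publish screening logic as code):

```python
# Hypothetical sketch of Stage 1 screening as a conjunction of six checks.
# Flag names are illustrative; they are not taken from actual WWC tooling.

SCREENERS = [
    "relevant_time_frame",
    "relevant_intervention",
    "relevant_sample",
    "relevant_outcome",
    "adequate_outcome_measures",
    "adequate_reporting",
]

def passes_screen(study: dict) -> bool:
    """A study passes Stage 1 only if all six screeners are satisfied."""
    return all(study.get(flag, False) for flag in SCREENERS)
```

For example, a study with all six flags set to True passes the screen, while flipping any single flag (or omitting it) excludes the study from further review.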

Stage 2: Assessing Strength of Evidence
The goal is to apply consistent criteria to sort evidence by its credibility into three buckets:
- Meets Evidence Standards
- Meets Evidence Standards with Reservations
- Does Not Meet Evidence Screens

The WWC Evidence Standards (applied to individual studies)
- Meets Evidence Standards: RCTs without severe design or implementation flaws.
- Meets Evidence Standards with Reservations: RCTs with severe design or implementation flaws; QEDs with equating and without severe design or implementation flaws.
- Does Not Meet Evidence Screens: studies not relevant to the review; studies with fatal design or implementation flaws.
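The three-bucket logic above can be sketched as a decision rule. This is a hypothetical simplification: the real WWC rating is applied by trained reviewers against the full standards, and the parameter names below are my own shorthand for the slide's criteria:

```python
# Hypothetical sketch of the Stage 2 three-bucket rating. The actual WWC
# rating is a reviewer judgment, not code; parameters are illustrative.

MEETS = "Meets Evidence Standards"
RESERVATIONS = "Meets Evidence Standards with Reservations"
NOT_MET = "Does Not Meet Evidence Screens"

def rate_study(design: str, relevant: bool, fatal_flaws: bool,
               severe_flaws: bool, equated: bool = False) -> str:
    # Irrelevant or fatally flawed studies do not meet evidence screens.
    if not relevant or fatal_flaws:
        return NOT_MET
    # RCTs meet standards outright unless they have severe flaws.
    if design == "RCT":
        return MEETS if not severe_flaws else RESERVATIONS
    # QEDs can at best meet standards with reservations, and only
    # with equating and without severe flaws.
    if design == "QED" and equated and not severe_flaws:
        return RESERVATIONS
    return NOT_MET
```

Note the asymmetry the slide encodes: a well-executed QED tops out at "with reservations", while only a clean RCT reaches the top bucket.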

Five Factors Govern Judgments
- Method for forming intervention and comparison groups
- Evidence on baseline equivalence
- Sample attrition
- Possible contamination of study conditions
- Threat of teacher confound

Stage 3: Other Study Characteristics
- Variations in the people, settings, and outcomes represented: how generalizable are the findings?
- Results reported by subgroups, settings, and outcomes: what is the breadth of the outcomes reported?
- Statistical results available: how complete is the reporting? Do the estimates reflect statistical controls for baseline characteristics?

Evidence Base of Character Education Programs
- Over 70 programs were submitted to or identified by the WWC.
- 41 school-based programs met the WWC definition and were eligible for review.
- 93 studies on the 41 programs were collected.
- 13 programs had at least one study meeting evidence standards, either with or without reservations.
- 27 programs had no studies passing evidence screens.
- 1 program is under review.

13 Character Education Programs with at Least One Study Meeting Evidence Standards, either with or without Reservations
1. An Ethics Curriculum for Children
2. Building Decision Skills
3. Caring School Communities
4. Connect with Kids
5. Facing History and Ourselves
6. Lessons in Character
7. Positive Action
8. Skills for Action
9. Skills for Adolescence
10. Too Good for Drugs
11. Too Good for Violence
12. Too Good for Violence and Drugs
13. Voices Literature and Character Education Program

Distribution of 92 Studies on 40 Programs across Evidence Categories
[Chart showing the share of studies in each category: meets evidence standards; meets evidence standards with reservations; does not meet evidence screens.]

Common Reasons for Studies Failing to Meet Evidence Screens
- Did not meet relevance screens
- Lack of a comparison group
- Severe overall or differential attrition (for QEDs)
- Lack of baseline equivalence (for QEDs)
- Confound in one-teacher/school-per-condition studies
- Inadequate statistical reporting for effect size computation

For More Information on the WWC
- Visit our website at
- Subscribe to WWCUpdate, the WWC's electronic news alert, through our website
- Phone: WWC