
An Expanded Model of Evidence-based Practice in Special Education
Randy Keyworth, Jack States, Ronnie Detrich
Wing Institute

Evidence-based: Why Now?
- No Child Left Behind (NCLB) calls for interventions to be based on scientific research; there are over 100 references to scientific research in the language of NCLB.
- As with many federal policies, there is no clear definition of what constitutes "evidence-based."
- The term "evidence-based" is not well defined in the professional literature.
- There is no consensus on what constitutes evidence.

An Expanded Model of Evidence-based Practice
The purpose of this paper is to define the primary components of an evidence-based culture, their functions, and how they relate to each other.

[Figure: Model diagram linking Research, Replicability, Sustainability, and Practice, with evidence-based education bridging research to practice.]

Efficacy Research (What Works?)
- The primary concern is demonstrating a relationship between the independent and dependent variables (causal relations).
- Precision is key to an unambiguous statement of causation.
- Studies are experimentally controlled so that threats to internal validity are minimized.
- Currently, this is the most common form of published educational research.

Characteristics of Efficacy Research
- Conducted in highly controlled settings.
- Implemented by well-trained change agents.
- Relatively small N.
- Not always easy to translate immediately into practice.

Effectiveness Research (When Does It Work?)
- The overall goal is taking interventions to scale and evaluating their robustness when implemented in more typical practice settings.
- Primarily concerned with answering questions of external validity, or the generality of effects:
  - For whom does the intervention work?
  - In what settings can it work?
  - What are the necessary minimum conditions for an intervention to be effective?
  - What are the limitations and constraints on its impact?

Effectiveness Research (When Does It Work?)
- Efficacy research informs effectiveness research.
  - It suggests specific dimensions to examine:
    - Parameters of the independent variable.
    - Extensions to different subject populations.
    - Extensions to different settings.
    - Extensions to different change agents.
- Rigor is still critical.
  - Precision in measuring the impact of the independent variable may be reduced as evaluation methods and the unit of analysis change (e.g., impact on a classroom rather than an individual student).
  - Efficacy research will have established the power of the independent variable more precisely than may be possible with effectiveness research.

Establishing an Evidence Base
- No single study is sufficient to demonstrate that an intervention is effective; science depends on both direct and systematic replication.
- Currently, there is no consensus about the quantity or quality of evidence necessary to establish that an intervention is evidence-based.
  - The What Works Clearinghouse suggests the following criteria (see the sketch below):
    - Randomized trials in at least two settings.
    - Many subjects (150 per condition).
    - Settings similar to the decision maker's school.
    - If these conditions are met, the intervention meets the criteria for strong evidence.
    - If they are not met, it may still meet the criteria for possible evidence.
    - If it does not meet the criteria for possible evidence, conclude that the intervention is not supported by meaningful evidence.
    - There is no mention of single-subject research methods.
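To make the tiered logic concrete, here is a minimal sketch of how criteria like these could be expressed as decision rules. The function name, parameters, and the "possible evidence" cutoff are illustrative assumptions, not an official What Works Clearinghouse algorithm.

```python
# Hypothetical sketch of tiered evidence classification.
# Thresholds mirror the criteria listed above; the "possible
# evidence" cutoff is purely illustrative, not a WWC standard.

def classify_evidence(randomized_trials: int,
                      settings: int,
                      subjects_per_condition: int,
                      settings_similar: bool) -> str:
    """Sort an intervention's evidence base into rough tiers."""
    if (randomized_trials >= 2 and settings >= 2
            and subjects_per_condition >= 150 and settings_similar):
        return "strong evidence"
    # A weaker showing may still count as "possible evidence".
    if randomized_trials >= 1 and subjects_per_condition >= 30:
        return "possible evidence"
    return "not supported by meaningful evidence"

print(classify_evidence(2, 2, 160, True))    # strong evidence
print(classify_evidence(1, 1, 40, False))    # possible evidence
print(classify_evidence(0, 1, 15, False))    # not supported by meaningful evidence
```

Note that a rule set like this has no branch for single-subject research designs, which is exactly the omission the slide flags.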

Implementation (How Do We Make It Work?)
- The primary question is how to place an intervention within a specific context.
- There is very little research on how to move evidence-based interventions into practice settings.
- Until questions around implementation are answered, the ultimate promise of evidence-based education will go unfulfilled.
- One of the most important tasks of implementation is analyzing the contingencies operating on the various stakeholders in practice settings and how they influence the adoption and sustainability of an intervention.

Implementation: Some of the Questions
- How do we increase interest in evidence-based interventions in practice settings?
- What organizational features are necessary to support evidence-based interventions?
- What steps are necessary to move practitioners to an evidence-based intervention after a history with other interventions?
- How do we write policy and regulations so that it is possible to implement evidence-based interventions at a broad level?

Performance Monitoring (Is It Working?)
- Effectiveness research guides us to implement specific interventions in specific settings to solve specific problems.
  - Generalizing from effectiveness research to a particular setting is always a logical leap; our confidence is less than 1.0.
- To assure that the intervention is actually effective, we must monitor its impact in the setting (practice-based evidence).
- If it is ineffective, we must change one or more components of the intervention until positive effects are achieved.

Performance Monitoring (Is It Working?)
- Performance monitoring is informed by efficacy and effectiveness research, but it also informs both in turn.
  - It identifies areas where a new intervention is required because the current one is not effective under some conditions.
  - It identifies populations, settings, and conditions that may be boundary conditions for an intervention unless it is modified in some way.

Performance Monitoring
- Ultimately, in special education the unit of analysis is the individual student, so it is fundamental that data reflect performance at this level rather than aggregate measures.
- Performance measures should meet acceptable criteria so that we can have reasonable confidence in the data.
- We must choose measures that reflect important outcomes and can be linked to other important outcomes.
  - Curriculum-based measurement (CBM) is a well-established method for sampling the academic performance of individual students (see the sketch below).
- Currently, the IEP defines important goals and what is to be measured.
  - There is little evidence that goals are selected through a systematic process or that the measures are reliable and valid.
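As an illustration of practice-based evidence at the level of the individual student, the sketch below compares weekly CBM scores to an aim line and flags when the intervention may need to change. The four-consecutive-points decision rule is a common convention in CBM progress monitoring, assumed here for illustration; all names and data are hypothetical.

```python
# Hypothetical CBM progress-monitoring sketch: compare weekly scores
# (e.g., words read correctly per minute) against a linear aim line.

def aim_line(baseline: float, goal: float, total_weeks: int) -> list:
    """Expected score for each week under a straight-line path to the goal."""
    step = (goal - baseline) / total_weeks
    return [baseline + step * week for week in range(1, total_weeks + 1)]

def needs_change(scores, expected, run_length: int = 4) -> bool:
    """Flag the intervention if the last `run_length` observed scores
    all fall below the aim line (a common CBM decision rule)."""
    recent = list(zip(scores, expected))[-run_length:]
    return len(recent) == run_length and all(s < e for s, e in recent)

# Example: baseline of 40 WCPM, goal of 70 WCPM over 30 weeks.
expected = aim_line(baseline=40, goal=70, total_weeks=30)
scores = [42, 43, 42, 43, 44, 44]  # six weeks of observed scores

if needs_change(scores, expected[:len(scores)]):
    print("Scores are below the aim line; consider modifying the intervention.")
else:
    print("Progress is on track; continue the current intervention.")
```

The point of the sketch is the decision rule, not the particular numbers: monitoring only pays off when the data trigger a defined response, such as modifying a component of the intervention.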

Building an Evidence-based Culture through a Gap Analysis
- The function of a gap analysis is to identify the gap between current performance and desired performance, the contingencies that account for the gap, and the interventions that will close the gap.
- Through this analysis we can develop a scope and sequence of interventions so that an evidence-based culture can develop.

Example of Gap Analysis

Goal: All interventions in special education are evidence-based.

Current Conditions: Many interventions have no empirical basis. Interventions are selected by the teachers and other professionals responsible for implementing them.

Contingencies: Selections reflect training and preferences rather than the empirical basis. There is no systematic process to inform decision makers of the evidence-based interventions in a particular area. Parents can influence the nature of an intervention through advocacy. There is no quality control system.

Intervention: Establish a database of evidence-based interventions (best available evidence). Work with parents to select from an array of evidence-based procedures. Establish a QA system that reviews interventions.
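If a team wanted to maintain such an analysis as a working document rather than a static table, one simple representation is a record per goal. This is a hypothetical sketch of that bookkeeping, not part of the authors' model.

```python
# Hypothetical record structure for one row of a gap analysis.
from dataclasses import dataclass, field

@dataclass
class GapAnalysisRow:
    goal: str
    current_conditions: list = field(default_factory=list)
    contingencies: list = field(default_factory=list)
    interventions: list = field(default_factory=list)

row = GapAnalysisRow(
    goal="All interventions in special education are evidence-based.",
    current_conditions=["Many interventions have no empirical basis."],
    contingencies=["Selections reflect training and preferences.",
                   "No quality control system."],
    interventions=["Establish a database of evidence-based interventions.",
                   "Establish a QA system that reviews interventions."],
)
print(f"Goal: {row.goal} ({len(row.interventions)} planned interventions)")
```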

Summary
- Evidence-based education is more than simply having the evidence.
- Not all of the necessary components of an evidence-based culture are well established.
- Powerful contingencies are in place that militate against moving toward evidence-based education.
- Analyzing these contingencies through a gap analysis can provide some guidance as to how to proceed.
- Active intervention will be required.