Bridging the Research to Practice Gap: Perspectives from the Practice Side. Ronnie Detrich, Wing Institute.


Questions
What do you think when you hear the phrase "research to practice gap"?
How often do you interact with a researcher or practitioner in a collaborative relationship?
As a practitioner, what keeps you from being more research-based?
As a practitioner, how often do you give feedback to researchers about practices?

Is There a Research to Practice Gap?
The concern is that research-based practices are not showing up in practice.

Scope of the Problem
There are 550 named interventions for children and adolescents (behavioral, cognitive-behavioral). How many have been empirically evaluated? Evidence-based interventions are less likely to be used than interventions for which there is no evidence, or for which there is evidence of a lack of impact (Kazdin, 2000).

Are We Training Educators to Be Evidence-Based?
A survey of school psychology directors of training (Shernoff, Kratochwill, & Stoiber, 2003): 29% of programs reported knowledge of evidence-based interventions; 41% reported training in them.

Changing the Conversation

Why Not a Practice to Research Gap?
Out of necessity, practitioners are innovating in almost every case or project. These innovations are rarely communicated to researchers, creating a practice to research gap.

Why Is There a Gap?
Technical challenges in getting research to practitioners. Cultural differences between research and practice.

Technical Challenges
A dissemination problem: researchers' primary forms of dissemination are professional journals and conferences. Research is published across diverse journals, and journals are mechanisms for researchers to talk to each other. Practitioners don't read professional journals often enough to keep informed; data suggest practitioners read them very infrequently. This is often a problem of access, and certainly a problem of time.

Cultural Differences: Assumptions, Values, Goals
Researchers: knowledge creation; identifying relations among variables; research will lead to social benefit someday; high value on "formal" knowledge; high value on rigor.
Practitioners: "doing good," helping others; practice results in benefit now; "craft" skills highly valued; high value on relevance.

Clash of Cultures for Researchers and Practitioners
Researchers respond to: publication, promotion and tenure, and funding cycles.
Practitioners respond to: legal, regulatory, policy, and budget pressures.

Influences on Researchers
Publication: original, experimentally well-controlled research.
Promotion and tenure: a high level of productivity; publication rate is one of the primary measures.
Funding cycles: typically three-year cycles.

Influences on Researchers
Greater emphasis on efficacy than effectiveness. Journals and promotion and tenure rules place greater value on original research, and effectiveness research is essentially replication research. Short funding cycles also encourage efficacy research: it typically requires less time and fewer resources than effectiveness research, resulting in higher rates of published research. As a consequence, questions related to generality (external validity, relevance) are slow to be answered.

Dissemination of Research and the Practitioner
The sources of influence associated with promotion and tenure encourage dissemination through professional journals. Studies published across a large number of journals make it difficult for practitioners to have breadth of knowledge in a particular area. There are no direct requirements on researchers to make results available to practitioners; interaction is largely between researchers rather than between researchers and practitioners.

Dissemination of Research and the Practitioner
The What Works Clearinghouse and the Campbell Collaboration are recent attempts to make evidence available to practitioners in usable form. Current limitations of the clearinghouse approach center on standards for the quantity and quality of research: practitioners need data now, or they will use other criteria to select interventions; there is not always a large body of evidence; and high-quality data are not always available. The approach must address the tension between the most rigorous evidence and the best available evidence.
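The "best available evidence" idea can be sketched as a simple decision rule. This is an illustrative sketch only; the tier labels and intervention names are hypothetical, not What Works Clearinghouse categories. The point is that when no option meets the top evidentiary standard, the practitioner falls back to the strongest evidence on hand rather than abandoning evidence altogether.

```python
# Hypothetical evidence tiers, strongest first (labels are illustrative
# assumptions, not clearinghouse categories).
TIERS = ["strong", "moderate", "emerging", "anecdotal"]

def best_available(interventions):
    """Return the strongest evidence tier present and the interventions at it.

    `interventions` is a list of (name, evidence_tier) pairs. If nothing
    meets the top standard, settle for the best tier actually available.
    """
    for tier in TIERS:
        matches = [name for name, evidence in interventions if evidence == tier]
        if matches:
            return tier, matches
    return None, []

# No "strong" evidence exists for these options, so the rule settles
# for the "moderate" tier rather than rejecting all of them.
options = [("Intervention A", "emerging"), ("Intervention B", "moderate")]
tier, picks = best_available(options)
```

The design choice mirrors the slide's argument: a fixed preference ordering over evidence quality, applied to whatever body of evidence exists, rather than a single pass/fail standard.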

Relevance and Rigor
Researchers' concern: internal validity (rigor). Practitioners' concern: external validity (relevance).

Relevance by Rigor
High relevance, low rigor: practice-based evidence.
High relevance, high rigor: effectiveness research.
Low relevance, low rigor: pseudo-science.
Low relevance, high rigor: efficacy research.
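The relevance-by-rigor matrix can be expressed as a small lookup table, which makes the quadrant classification explicit. A minimal sketch; the function name and string keys are illustrative, not from the slides:

```python
# Illustrative sketch: the relevance-by-rigor matrix as a lookup table.
# Keys are (relevance, rigor) pairs; values are the slide's quadrant labels.
QUADRANTS = {
    ("high", "high"): "Effectiveness Research",
    ("high", "low"): "Practice-Based Evidence",
    ("low", "high"): "Efficacy Research",
    ("low", "low"): "Pseudo-Science",
}

def classify(relevance, rigor):
    """Map a study's relevance and rigor ratings to its quadrant."""
    return QUADRANTS[(relevance, rigor)]

# A tightly controlled analog study: high rigor but low relevance to practice.
label = classify("low", "high")  # "Efficacy Research"
```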


Efficacy Research (What Works?)
Largely concerned with the integrity of the independent variable (internal validity). Research is often conducted in analog settings where all relevant variables can be controlled. Studies are conducted by well-trained graduate students and research assistants, with very close oversight to assure integrity, and are funded by research grants.

Effectiveness Research (When Does It Work?)
Concerned with the "robustness" of an intervention when implemented in typical practice settings by usual-care staff. Answers questions related to external validity, or the generalizability of effects. Typically finds smaller magnitudes of effect, likely a function of poor treatment integrity.

Impact of Efficacy Research on Practitioners
Often seen as: irrelevant, because of the analog nature of the work ("not the real world"); impractical, because of the level of training required; impossible, because of the resources required. Researchers are engaged in behavior that is not always highly valued by practitioners.

Influences on Practitioners
Practitioners are responsible for providing educational services for all children (ESEA/IDEIA). Eligibility categories for special education are very broad and are not diagnostic categories, while researchers often narrow the characteristics of students for the purposes of research. For example, "learning disabilities" covers a very broad range of characteristics and is not particularly meaningful when developing an intervention for a specific student. Students in special education programs often have co-morbid conditions that may limit the effects of a particular intervention, and data on effective interventions for these students are not readily available because researchers exclude them from studies.

Influences on Practitioners
An insufficient number of practitioners have the training necessary to implement evidence-based interventions; without well-trained, qualified staff, something other than the evidence-based intervention will be implemented. Even with well-trained staff, the necessary resources may not be in place to support evidence-based interventions. An insufficient number of decision makers have the skills necessary to evaluate research and translate it into practice.

Influences on Practitioners
Empirically supported interventions may be more costly than categorical, "generic" services (e.g., early intensive behavioral intervention for children with autism), resulting in decisions to provide the more costly services only following litigation. Evidence-based practice may also be seen as a fad and discounted without examination.

Direction of Influence
Unidirectional: Research → Practice.
Bidirectional (feedback loop): Research ↔ Practice.

Closing the Gap
Requiring practitioners to be researchers will not be effective. Practitioners should be consumers of research, but not of primary-source journals; resources such as the What Works Clearinghouse and the Best Evidence Encyclopedia are better suited, and will be useful only to the extent that they solve a problem for the practitioner.

Closing the Gap
Create occasions for researchers and practitioners to interact, such as communities of practice: a context for learning each other's differences in values and assumptions, and for collaboration once common ground is found.

Closing the Gap
Universities should teach interventions that have empirical support.

Do We Have the Necessary Conditions for Closing the Gap?
Not yet. A shared perspective between researchers and practitioners is not readily apparent, and there is very little research on bridging research and practice (Schoenwald & Hoagwood, 2001; Ringeisen, Henderson, & Hoagwood, 2003).

Where to Start?
Address the motivation of practitioners to implement empirically supported interventions (e.g., No Child Left Behind). Conduct programmatic research on the "goodness of fit" between an intervention and the setting in which it is to occur (one size does not fit all). Identify practices that promote sustainability. Establish empirically supported methods for assuring treatment integrity, and the levels of integrity necessary to have an effect. Create conditions that support effectiveness research (funding, publication guidelines, etc.).

Where to Start?
Emphasize evidence-based interventions in pre-service training. Use multiple methods and multiple groups to evaluate research, with transparent standards. Distinguish between best evidence and best available evidence, and establish a continuum of rigor that reflects the current contingencies influencing practitioners. Increase interaction between researchers and practitioners.

Summary
Researchers and practitioners are responding to different assumptions, values, goals, and pressures. Research will not be important to practitioners until it solves problems that are important to them. The gap will not be bridged without direct contact with the variables that influence each group's behavior. A continuous feedback loop between researchers and practitioners increases the chance of success.

Thank You
Copies available at winginstitute.org.
