Ensuring RtI’s Successful Implementation

Ensuring RtI’s Successful Implementation
Virginia RtI Meeting, 2007
Dean L. Fixsen, Karen A. Blase, Sandra F. Naoom, Melissa Van Dyke, Frances Wallace
National Implementation Research Network
Louis de la Parte Florida Mental Health Institute
(c) Dean Fixsen and Karen Blase, 2004

A Functional System
Policies
Bureaucracy
Agencies
Practitioners

Evidence-Based Movement
The “evidence-based movement” is an international experiment to make better use of research findings in typical service settings. The purpose is to produce greater benefits to students and society.

Education
65 million kids
6 million teachers and staff
140,000 schools
3,143 counties
60 states & U.S. jurisdictions

Science to Service
[Diagram: the GAP between SCIENCE and SERVICE, bridged by IMPLEMENTATION]

Science to Service Gap and Implementation Gap

Science to Service Gap: what is known is not what is adopted to help students, families, and communities.

Implementation Gap:
What is adopted is not used with fidelity and good outcomes for consumers.
What is used with fidelity is not sustained for a useful period of time.
What is sustained is not used on a scale sufficient to impact societal outcomes.

We talk about EBPs as though they are taking over the service sectors, from mental health intervention to primary prevention and from older adults to infants. In fact, what is known is not what is generally adopted to help children, families, and adults. Along with the Science to Service Gap there is the Implementation Gap: there are no clear pathways from a series of rigorous scientific studies to widespread use of a practice or program. When adoption does occur, what is adopted is often not done with fidelity, and benefits to consumers are not realized. And often, with staff turnover, the practice or program walks out the door with the exiting staff.

Making Use of Science
Letting it happen / Helping it happen: recipients are accountable
Making it happen: implementation teams are accountable
Based on Greenhalgh, Robert, MacFarlane, Bate, & Kyriakidou, 2004

Teaching–Family Replications
Fixsen, Blase, Timbers, & Wolf (2001)
[Figure: cumulative Teaching–Family homes and cumulative couples, ≤1972 through 1982]

Follow Through Programs
The purpose of the Follow Through planned variation experiment was to identify effective educational methods. However, there is little utility in identifying effective methods if they are not then made accessible to school districts. The Joint Dissemination Review Panel (JDRP) and the National Diffusion Network (NDN) were created to validate and disseminate effective educational programs. In 1977, Follow Through sponsors submitted programs to the JDRP. “Effectiveness” was, however, broadly interpreted. For example, according to the JDRP, the positive impact of a program need not be directly related to academic achievement. In addition, a program could be judged effective if it had a positive impact on individuals other than students. As a result, programs that had failed to improve academic achievement in Follow Through were rated as “exemplary and effective.” And, once a program was validated, it was packaged and disseminated to schools through the National Diffusion Network.

The JDRP’s validation practices did not go unchallenged. According to former Commissioner of Education Ernest Boyer, “Since only one of the sponsors (Direct Instruction) was found to produce positive results more consistently than any of the others, it would be inappropriate and irresponsible to disseminate information on all the models...” (quoted in Carnine, 1984, p. 87). However, Commissioner Boyer’s concerns could not prevent the widespread dissemination of ineffective instructional approaches. The JDRP apparently felt that to be “fair” it had to represent the multiplicity of methods in education. Not only did this practice make it virtually impossible for school districts to distinguish between effective and ineffective programs, it defeated the very purpose for which the JDRP and NDN were established.

Quoted from: Watkins, C. L. (1995). Follow Through: Why didn’t we? Effective School Practices, 15(1).

[Figure 1: average effects of nine Follow Through models on measures of basic skills (word knowledge, spelling, language, and math computation), cognitive-conceptual skills (reading comprehension, math concepts, and math problem solving), and self-concept. Adapted from Engelmann, S., & Carnine, D. (1982). Theory of Instruction: Principles and Applications. New York: Irvington Press.]

School-Wide PBS
4.2% of all schools

Implementation Reviews
Human service prevention and treatment programs (e.g., education, substance abuse, adult/children’s MH, justice, health)
Advanced manufacturing technologies
AMA clinical guidelines
Engineering: bridge maintenance
Hotel service management
National franchise operations
Cancer prevention & treatment

Ineffective Methods
Excellent experimental evidence for what does not work:
Diffusion/dissemination of information by itself does not lead to successful implementation (research literature, mailings, promulgation of practice guidelines)
Training alone, no matter how well done, does not lead to successful implementation

Ineffective Methods
Excellent evidence for what does not work:
Implementation by edict by itself does not work
Implementation by “following the money” by itself does not work
Implementation without changing supporting roles and functions does not work
Nutt, P. C. (1986). Tactics of implementation. Academy of Management Journal, 29(2), 230-261.
Nutt, P. C. (2002). Why Decisions Fail.

What Works
Effective intervention practices + Effective implementation practices = Good outcomes for consumers

Implement Innovations
[2×2 matrix: INTERVENTION (effective / not effective) crossed with IMPLEMENTATION (effective / not effective); only the cell where both are effective yields student benefits, and the remaining cells amount to a placebo]
PLACEBO: something of no intrinsic remedial value that is used to appease or reassure another
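
To make the matrix concrete, here is a minimal sketch (an interpretation, not part of the original deck): the 2×2 reads as a logical AND, in that student benefits require both an effective intervention and effective implementation. The function name and output strings are illustrative only.

```python
# Minimal sketch of the 2x2 intervention-by-implementation matrix:
# benefits appear only when BOTH factors are effective; every other
# cell is a placebo-like result. All names here are illustrative.

def outcome(intervention_effective: bool, implementation_effective: bool) -> str:
    """Return the expected result for one cell of the 2x2 matrix."""
    if intervention_effective and implementation_effective:
        return "student benefits"
    return "placebo-like result (no real benefit)"

for intervention in (True, False):
    for implementation in (True, False):
        print(f"intervention={intervention}, implementation={implementation}"
              f" -> {outcome(intervention, implementation)}")
```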

Implementation
An effective intervention is one thing.
Implementation of an effective intervention is a very different thing.

EBPs & Implementation
From an implementation perspective, what do we need to know about innovations such as evidence-based programs?

EBPs & Implementation
The usability of a program has little to do with the quality or weight of the evidence regarding that program.
Evidence on intervention effectiveness for specific populations helps us choose what to implement.
Evidence on the effectiveness of the intervention does not help implement the program or practice successfully.

EBPs & Implementation
Core intervention components:
Clearly described (who/what)
Practical measure of fidelity
Fully operationalized (do/say)
Field tested (recursive revision)
Contextualized (org./systems fit)
Effective (worth the effort)

Nielsen, J. (2000). Why you only need to test with 5 users. Retrieved April 22, 2007, from http://www.useit.com/alertbox/20000319.html
Rubin, J. (1994). Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. New York: John Wiley & Sons.
Frick, T., Elder, M., Hebb, C., Wang, Y., & Yoon, S. (2006). Adaptive usability evaluation of complex web sites: How many tasks? Unpublished manuscript, Indiana University, Bloomington, IN.
Allen, B. L. (1996). Information Tasks: Toward a User-Centered Approach to Information Systems. New York: Academic Press.

Implementation
What do we need to know about successful implementation methods?

Stages of Implementation
Implementation is not an event.
A mission-oriented process involving multiple decisions, actions, and corrections.

Stages of Implementation
Implementation occurs in stages:
Exploration
Installation
Initial Implementation
Full Implementation
Innovation
Sustainability
2 – 4 Years
Fixsen, Naoom, Blase, Friedman, & Wallace, 2005

Stages of Implementation
Implementation occurs in stages: Exploration, Installation, Initial Implementation, Full Implementation, Innovation, Sustainability
[Figure: the stages annotated with implementation outcomes (0% to 100%) and intervention outcomes]
Fixsen, Naoom, Blase, Friedman, & Wallace, 2005

Implementation Drivers
[Diagram: integrated and compensatory drivers: Recruitment and Selection, Preservice Training, Consultation & Coaching, Staff Performance Evaluation, Decision Support Data Systems, Facilitative Administrative Supports, and Systems Interventions]

Practitioner selection. Who is qualified to carry out the evidence-based practices and programs? What are the methods for recruiting and selecting those practitioners? Beyond academic qualifications or experience factors, certain practitioner characteristics are difficult to teach in training sessions, so they must be part of the selection (S) criteria (e.g., knowledge of the field, common sense, social justice, ethics, willingness to learn, willingness to intervene, good judgment). Staff selection also represents the intersection with a variety of larger system variables: general workforce development issues, the overall economy, organizational financing, the demands of the evidence-based program in terms of time and skill, and so on impact the availability of staff for human service programs.

Innovations such as evidence-based practices and programs represent new ways of providing treatment and support. Practitioners (and others) at an implementation site need to learn when, where, how, and with whom to use new approaches and new skills. Preservice and inservice training (T) are efficient ways to provide knowledge of background information, theory, philosophy, and values; introduce the components and rationales of key practices; and provide opportunities to practice new skills and receive feedback in a safe training environment.

Most skills needed by successful practitioners can be introduced in training but really are learned on the job (C) with the help of a consultant/coach (e.g., craft information, engagement, treatment planning, teaching to concepts, clinical judgment). Implementation of evidence-based practices requires behavior change at the practitioner, supervisory, and administrative support levels. Training and coaching are the principal ways in which behavior change is brought about for carefully selected staff in the beginning stages of implementation and throughout the life of evidence-based practices and programs.

Staff performance evaluation (E) is designed to assess the use and outcomes of the skills that are reflected in the selection criteria, are taught in training, and are reinforced and expanded in consultation and coaching processes. Assessments of practitioner performance and measures of fidelity also provide useful feedback to managers and purveyors regarding the progress of implementation efforts and the usefulness of training and coaching.

Decision support data systems (e.g., quality improvement information, organizational fidelity measures) assess key aspects of the overall performance of the organization to help assure continuing implementation of the core intervention components over time.

Facilitative administration (A) provides leadership and makes use of a range of data inputs to inform decision making, support the overall processes, and keep staff organized and focused on the desired clinical outcomes.

Finally, systems interventions (SI) are strategies to work with external systems to ensure the availability of the financial, organizational, and human resources required to support the work of the practitioners.

Training Components and Outcomes
OUTCOMES (% of participants who demonstrate knowledge, demonstrate new skills in a training setting, and use new skills in the classroom)

TRAINING COMPONENTS                      Knowledge   Skill Demonstration   Use in the Classroom
Theory and Discussion                       10%             5%                    0%
...+ Demonstration in Training              30%            20%                    0%
...+ Practice & Feedback in Training        60%            60%                    5%
...+ Coaching in the Classroom              95%            95%                   95%

Joyce and Showers, 2002

The 2002 meta-analysis of training and coaching data by Joyce and Showers makes a compelling case for the need for skillful coaching. Only when training was accompanied by coaching in the service setting (in this case, a classroom) was there substantial implementation in the practice setting. These findings move supervision from systems that monitor units of service, react to crises, and advise around case specifics to active coaching systems that monitor adherence to evidence-based practices, are purposeful in developing practitioner skills, and offer support in trying out new approaches during that “awkward stage” just after training.
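
To make the pattern in the table easy to query, here is a minimal sketch (not from the original deck) that encodes the Joyce and Showers averages as data; the dictionary keys and function name are illustrative assumptions.

```python
# Minimal sketch: the Joyce & Showers (2002) averages from the table above,
# encoded as data so the coaching effect can be computed directly.
# All identifiers here are illustrative, not from the source.

TRANSFER = {
    # component: (knowledge, skill_demo, classroom_use), in percent
    "theory_and_discussion":      (10,  5,  0),
    "plus_demonstration":         (30, 20,  0),
    "plus_practice_and_feedback": (60, 60,  5),
    "plus_coaching":              (95, 95, 95),
}

def classroom_use(component: str) -> int:
    """Percent of participants expected to use the new skills in the classroom."""
    return TRANSFER[component][2]

gain = classroom_use("plus_coaching") - classroom_use("plus_practice_and_feedback")
print(f"Adding coaching raises classroom use by {gain} percentage points.")  # 90
```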

Who Does the Work? Implementation Teams
Develop effective, flexible, adaptable capacity to initiate and manage continual change
Requires new roles, functions, and skill sets that do not exist currently

Implementation Team
A group that knows the innovation very well (formal and practice knowledge)
A group that knows how to implement that innovation with fidelity and good effect
A group that accumulates data & experiential knowledge, becoming more effective and efficient over time (information economics, K. Arrow)

An advantage of having a well organized and persistent approach to implementation of evidence-based practices and programs may be that the purveyor can accumulate knowledge over time (Fixsen & Blase, 1993; Fixsen, Phillips, & Wolf, 1978; Winter & Szulanski, 2001). Each attempted implementation of the program reveals barriers that need to be overcome and their (eventual) solutions. Problems encountered later on may be preventable with different actions earlier in the implementation process.

The Toyota Supplier and Support Center (TSSC) is a purveyor of the Toyota Production System for manufacturing automobiles. MST Services, Inc. is the purveyor of the Multisystemic Therapy (MST) program for serious and chronic juvenile offenders. These are clear-cut examples of purveyors, and each has a set of activities designed to help new organizations (“implementation sites”) implement their respective programs. In other cases, the “purveyor” is not so readily identified, nor are the activities well described. For example, the Assertive Community Treatment program and the Wraparound approach seem to have several individuals who act as consultants to communities and agencies interested in adopting those programs. The Wraparound group has recognized the problem of multiple definitions of their approach being used by different purveyors and has formed a national association to develop a common definition of the approach and a common set of processes for assessing the fidelity of new implementation sites (Bruns, Suter, Leverentz-Brady, & Burchard, 2004).

The literature is not always clear about the activities of a purveyor. For example, the Quantum Opportunity Program (Maxfield, Schirm, & Rodriguez-Planas, 2003) was implemented in several sites in a major, multi-state test of the program. The report of the findings simply noted that the originators of the program had received funding to provide technical assistance to the implementation sites. Given the uneven results, it is unfortunate that there was no link back to purveyor activities.

Implementation Team
Policy members (change policy, barrier busters, facilitators)
Practice members (do the innovation, test policies, feedback)
Families and stakeholders
Management members (roles and functions)
Daily / Weekly / Monthly Meetings (urgent, unfiltered, goal focused)

Implementation Team: Simultaneous, Multi-Level Interventions
[Diagram: within the state and community context, a District Implementation Team works across levels: school management (leadership, policy), administration (HR, structure), supervision (nature, content), and teacher]

Purveyors also quickly learn that the sphere of influence is critical to the success of the implementation effort, and over time they take on a very active, simultaneous, multi-level intervention role to help increase the likelihood that such meta-contingencies as funding, licensing, referral mechanisms, regulations, and reporting requirements are aligned to support the new way of work.

Implementation Team
Change the behavior of adult education professionals: “Systems don’t change, people do” (J.W.)
Change organizational structures, cultures, and climates
Change the thinking of system directors and policy makers
Successful and sustainable implementation of evidence-based programs always requires organization and systems change.

RtI

What is RtI?
Core intervention components:
Clearly described (who/what)
Practical measure of fidelity
Fully operationalized (do/say)
Field tested (recursive revision)
Contextualized (org./systems fit)
Effective (worth the effort)

Response to Intervention: Philosophy
Core Principles (NASDSE, 2006):
We can effectively teach all children. We need to identify the curricular, instructional, and environmental conditions for learning.
Intervene early. Solving small problems early is both more efficient and more successful.
Use a multi-tier model of service delivery: needs-driven resource deployment systems to match instructional resources with student need.
Dona Meinders, Silvia DeRuvo; WestEd, California Comprehensive Center

RtI Definition: Components
RtI is the practice of providing:
high-quality instruction and intervention matched to student need,
monitoring progress frequently to make decisions about change in instruction or goals, and
applying child response data to important educational decisions. (NASDSE, 2005)
Dona Meinders, Silvia DeRuvo; WestEd, California Comprehensive Center

RtI: Components (NASDSE, 2006)
Monitor student progress to inform instruction
Use assessment to collect information on how the student is progressing
Use data to make decisions: ongoing data collection systems in place and used to make informed decisions (a decision-rule sketch follows below)
Use assessment for three different purposes: screening, diagnostics, and progress monitoring
Dona Meinders, Silvia DeRuvo; WestEd, California Comprehensive Center
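
As an illustration of what a data-based decision rule can look like in practice, here is a minimal sketch assuming one common RtI progress-monitoring convention: flag instruction for change when several consecutive data points fall below the student's aimline. The function names, four-point threshold, and scores are illustrative assumptions, not from the slides.

```python
# Minimal sketch of a progress-monitoring decision rule in RtI:
# if `run_length` consecutive observed scores fall below the aimline,
# the current instruction is flagged for a change. The four-point
# threshold and all data below are illustrative assumptions.

def aimline(baseline: float, goal: float, weeks: int):
    """Expected score for each week, interpolated from baseline to goal."""
    step = (goal - baseline) / (weeks - 1)
    return [baseline + step * w for w in range(weeks)]

def needs_instructional_change(scores, expected, run_length: int = 4) -> bool:
    """True if `run_length` consecutive observed scores fall below the aimline."""
    run = 0
    for observed, target in zip(scores, expected):
        run = run + 1 if observed < target else 0
        if run >= run_length:
            return True
    return False

# Example: weekly oral-reading-fluency scores vs. an aimline from 40 to 60 wcpm.
weekly_scores = [41, 43, 42, 44, 45, 46, 46, 47]
expected = aimline(baseline=40, goal=60, weeks=8)
print(needs_instructional_change(weekly_scores, expected))  # True -> adjust support
```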

Response to Intervention
Shifts accountability for outcomes from the children to the teachers and their supporting schools & education systems

Stages of Implementation
Implementation occurs in stages:
Exploration
Installation
Initial Implementation
Full Implementation
Innovation
Sustainability
2 – 4 Years
Fixsen, Naoom, Blase, Friedman, & Wallace, 2005

RtI: Oregon Group
District-level entry point; system change focus
Information: what it is, what it is not
Stakeholder buy-in; application/selection process
Informed agreement: understand and defend the initiative
The Tigard-Tualatin District; Carol Sadler et al.

RtI: Oregon Group
District-level system change
Give lots of rationales
Not a project, not patchwork
Focus on RtI functions
Establish a common vocabulary to ease communication
Build on what folks are doing already – help them get ready for change

RtI: Oregon Group
Demand far exceeds capacity
Select the willing
Be overly strict at the start
Work with others to help them become “RtI ready”
Develop capacity: Year 3 sites now helping with newbies

Implementation Drivers
[Repeat of the earlier Implementation Drivers slide and notes]

RtI: Oregon Group
Guided development
Leadership involvement (will require more in the future)
Year of training with on-going coaching (will require more on-site visits in the future)
Include leaders in the training (well informed, able to explain and defend, willing to do what is required)

RtI: Oregon Group
Infrastructure development
DOs need information to guide the changes
Top-down and bottom-up approach
Never done, always changing: 3-ring binders and updated websites

Systems Change
Organizational changes (schools and districts)
System changes (state and federal)

RtI: Oregon Group
Infrastructure development
Designate funds to support implementation efforts right from the start (costs associated with making changes)

RtI: Oregon Group
Issues
Teacher education does not support RtI work (philosophy, values, skills)
Teacher certification may need to change

A Sobering Observation
“All organizations [and systems] are designed, intentionally or unwittingly, to achieve precisely the results they get.”
R. Spencer Darling, business expert

Systems Change
New practices do not fare well in old organizational structures and systems
Develop new position descriptions and job functions in state departments and districts focused on implementation (effective use) of policies and innovations

Multi-Tier Model: Academic and Behavioral Systems
Intensive, Individual Interventions (1–5% of students): individual students; assessment-based; high intensity and of longer duration (academic) / intense, durable procedures (behavioral)
Targeted Group Interventions (5–10%): some students (at-risk); high efficiency; rapid response
Universal Interventions (80–90%): all students; preventive, proactive; all settings

Although three tiers are the ones most often seen, an RtI model can have any number of tiers. One misinterpretation to guard against is that tier 1 is general education, tier 2 is Title I, and tier 3 is special education. This is a common misunderstanding and could lead to simply keeping the historical system and calling it RtI. General education, Title I, and special education are resources for providing universal interventions, supplemental interventions, and intensive interventions. There are students, for example, who need intensive intervention who do not qualify for special education (ELL, gifted and talented, students who have missed a lot of school). The focus of this model is primarily on the nature and intensity of instruction that students need.

Dona Meinders, Silvia DeRuvo; WestEd, California Comprehensive Center

Multi-Tier Model
[Repeat of the multi-tier model slide, annotated “Attention, Effort, Precision” along the tiers; a worked sizing example follows below]
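
To see what the tier percentages imply for planning, here is a minimal sketch (not from the original deck) that converts them into expected student counts; the 500-student enrollment and tier labels are illustrative assumptions, while the percentage ranges come from the slide above.

```python
# Minimal sketch: turn the multi-tier percentages into expected student
# counts for a hypothetical school. The 500-student enrollment is an
# illustrative assumption; the ranges come from the slide above.

TIER_RANGES = {
    "needs met by universal interventions":     (0.80, 0.90),
    "needs targeted group interventions":       (0.05, 0.10),
    "needs intensive individual interventions": (0.01, 0.05),
}

def expected_counts(enrollment: int) -> dict:
    """Low/high expected student counts per tier for a given enrollment."""
    return {tier: (round(enrollment * lo), round(enrollment * hi))
            for tier, (lo, hi) in TIER_RANGES.items()}

for tier, (lo, hi) in expected_counts(500).items():
    print(f"{tier}: {lo}-{hi} students")
# -> roughly 400-450 universal, 25-50 targeted, 5-25 intensive
```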

“Making it happen”
Implementation: active involvement of implementation teams that work at the intersection of practices, programs, systems, communities, & scientists
Implementation teams are accountable for assuring use of innovations with fidelity and good outcomes
Based on Greenhalgh, Robert, MacFarlane, Bate, & Kyriakidou, 2004

Implementation Team
[Diagram: the Implementation Team prepares communities, districts, and schools (faculty, staff); works with researchers; assures implementation; assures student benefits]

Systems Change
Teachers / staff impact students
It is the job of administrators, managers, and funders to align policies and structures to facilitate effective teacher / staff practices
There is no such thing as an “administrative decision” – they are all education decisions

Systems Change
[Diagram: Implementation Teams create ALIGNMENT from federal departments, the state department, and districts down to schools and teachers/staff using effective practices; FORM FOLLOWS FUNCTION]

Creating Capacity for Competent Change
New OSEP Center: State Implementation and Scaling-up of Evidence-based Practices (SISEP)

Creating Capacity for Competent Change
State Transformation Team that learns complex skills related to creating and sustaining RITs
Regional Implementation Teams (RITs) that learn complex skills related to system, school, and teacher change
Capacity = knowledge, skills, and experience; self-correcting, self-sustaining knowledge utilization

Functional Education
Policies
Bureaucracy
Schools
Teachers

Thank You
We thank the following for their support:
Annie E. Casey Foundation (EBPs and cultural competence)
William T. Grant Foundation (implementation literature review)
Substance Abuse and Mental Health Services Administration (implementation strategies grants; NREPP reviews; SOC analyses of implementation; national implementation awards)
Centers for Disease Control & Prevention (implementation research contract)
National Institute of Mental Health (research and training grants)
Juvenile Justice and Delinquency Prevention (program development and evaluation grants)
Office of Special Education Programs (Capacity Development Center contract)
Agency for Children and Families (Child Welfare Leadership Development contract)

For More Information
Karen A. Blase, 813-974-4463, kblase@fmhi.usf.edu
Dean L. Fixsen, 813-974-4446, dfixsen@fmhi.usf.edu
National Implementation Research Network
At the Louis de la Parte Florida Mental Health Institute, University of South Florida
http://nirn.fmhi.usf.edu

For More Information
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).
Download all or part of the monograph at: http://nirn.fmhi.usf.edu/resources/publications/Monograph/index.cfm