
Robyn Mildon, PhD Executive Director Centre for Evidence and Implementation

Key messages Public policy decisions should be based on evaluations of programs that have been implemented with quality. Otherwise, the relative value and cost-effectiveness of alternative programs cannot be determined. Implementation is important for all programs and services, and increasing the quality of implementation increases the chances that they will yield their intended outcomes. Durlak (2013)

Key messages It is possible to adapt an evidence-based program to fit local circumstances and needs as long as the program’s core components, established by theory or preferably through empirical research, are retained and not modified. High quality implementation is the joint responsibility of multiple stakeholders who typically include funders/policy makers, program developers/researchers, local practitioners, and local administrators. Durlak (2013)

People cannot benefit from something they do not receive

Only 9% of those who completed all 3 phases of training followed through to implementation.

Effective interventions and services + Effective implementation = Positive outcomes

Implementation science addresses the “know-do” gap in social services and health care: the disparity between what scientific research identifies as the best evidence-based practices and what is actually done in the community (Merrert et al., 2016). Studies suggest that it takes an average of 17 years for 14% of original research to be integrated into physician practice, and that only 54% of US adults receive care that meets indicators of high quality (McGlynn et al., 2003).

Implementation science is the study of methods to promote the use and integration of research evidence into policy and practice (Lobb and Colditz, 2013). Implementation is the use of strategies to adopt and integrate evidence-based health, mental health, and social care interventions and to change practice patterns within specific settings.

Evidence to practice gap Widespread implementation with sustainment has been difficult to achieve across human services. What is implemented often disappears with time and staff turnover. Moving effective interventions from development and research settings to the practice setting, with fidelity and good effect, involves far more than making the interventions available (Fixsen et al., 2005; Kauffman Foundation Best Practices Project; Rogers, 1995).

Alone, these will not do:
Identification and cataloguing of “evidence-based” practices and programs
Making laws and policy directives
Providing funding
One-way dissemination of information (tip sheets, websites, lectures like this one...)
Implementation without changing supporting roles and functions
Training alone
(Embry & Biglan, 2008; Fixsen et al., 2005)

Implementation matters

“Evidence” on effectiveness helps you select what to implement for whom.
“Evidence” on these outcomes does not help you implement the program or practice.
Fixsen & Blase (2008)

Implementation Matters (from Fixsen et al., 2005)
Intervention (The What) x Implementation (The How):
Effective intervention, effective implementation: actual benefits
Effective intervention, ineffective implementation: inconsistent; not sustainable; poor outcomes
Ineffective intervention, effective implementation: poor outcomes; sometimes harmful
Ineffective intervention, ineffective implementation: poor outcomes
(Institute of Medicine, 2000; 2001; 2009; New Freedom Commission on Mental Health, 2003; National Commission on Excellence in Education, 1983; Department of Health and Human Services, 1999)

Durlak & DuPre (2008) reviewed over 500 implementation studies in the field of prevention and health promotion on programs for children and youth:
Mean effect sizes were at least two to three times higher when programs were implemented well, with few or no problems in the implementation.
Full implementation of programs was associated with better outcomes, particularly when fidelity and dosage were used to measure levels of implementation.

”… in some analyses, the quality with which the intervention is implemented has been as strongly related to recidivism effects as the type of program, so much so that a well-implemented intervention of an inherently less efficacious type can outperform a more efficacious one that is poorly implemented. …” Lipsey (2009)

Letting it happen – Diffusion; networking; communication
Helping it happen – Dissemination; manuals; websites
Making it happen (Implementation Science) – Purposeful and proactive use of implementation practice and science
Based on Hall & Hord (1987); Greenhalgh, Robert, MacFarlane, Bate, & Kyriakidou (2004); Fixsen, Blase, Duda, Naoom, & Van Dyke (2010)

Implementation frameworks

Quality Implementation Framework

Phase One: Initial Considerations Regarding the Host Setting (elements)
1. Conducting a Needs and Resources Assessment
2. Conducting a Fit Assessment
3. Conducting a Capacity/Readiness Assessment
4. Possibility for Adaptation
5. Obtaining Explicit Buy-in from Critical Stakeholders and Fostering a Supportive Organizational Climate
6. Building General/Organisational Capacity
7. Staff Recruitment/Maintenance
8. Effective Pre-innovation Staff Training

Phase Two: Creating a Structure for Implementation
9. Creating Implementation Teams
10. Developing an Implementation Plan
Phase Three: Ongoing Structure Once Implementation Begins
11. Technical Assistance/Coaching/Supervision
12. Process Evaluation
13. Supportive Feedback Mechanism
Phase Four: Improving Future Applications
14. Learning from Experience

Implementation frameworks

Staged and phased approach to implementation
Exploration: Assess needs; examine fit and feasibility; operationalise model; make decisions; stakeholder buy-in; shared vision
Installation: Implementation supports developed; practitioners trained in new processes; make necessary structural and instrumental changes; identify coordinated yet differentiated roles
Operational implementation: Service delivery initiated; data used to drive decision-making and continuous improvement; rapid-cycle problem solving; feedback loops among stakeholders; shared authority and decision-making
Sustained implementation: Skillful implementation; system and organisational changes institutionalised; child and family outcomes measurable; shared resources; shared data; culture and values shift
PRC to lead. This diagram shows the implementation stages that are based on the ‘implementation science’ methodology successfully employed by PRC in similar projects. It emphasises the critical importance of the exploration and installation phases for ensuring a rigorous and sustainable implementation. The implementation process for complex change takes around two years with appropriate support.

Implementation strategies

Implementation Strategies “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice” Proctor, Powell & McMillen (2013)

Implementation Drivers: improved outcomes for children and families rest on effective implementation (fidelity), supported by Competency Drivers (e.g., training, coaching), Organization Drivers (e.g., systems intervention), and Leadership, which are integrated and compensatory.
So let’s try to put this into context. We start with what we’re trying to achieve – our outcome. To do that, we need to use what we think will work – the evidence-based practices that make up the IFSS Practice Model. To make sure we’re using the Practice Model well, and that the workers are supported to do that job, we put infrastructure in place to make that happen as easily as possible for the workers. What we have here is the core components or building blocks of the infrastructure needed to support practice, organisational and systems change. They are drawn from research on successfully implemented programs and practices.
There are three categories of Implementation Drivers: Competency, Organization, and Leadership. When these core components are in place they provide the support to establish and maintain a successful implementation of an evidence-based practice or other innovation.
Competency Drivers are mechanisms that help to develop, improve, and sustain practitioners’ and supervisors’ ability to implement an intervention to benefit the client. Competency Drivers include staff selection, training, coaching, and performance assessment.
Organization Drivers are mechanisms to create and sustain hospitable organizational and systems environments for effective child welfare services. Organization Drivers include the decision support data system, facilitative administration, and systems intervention.
Leadership Drivers are methods to manage technical problems, where there are high levels of agreement about problems and high levels of certainty about solutions, and to constructively deal with adaptive challenges, where problems are not clear and solutions are elusive.

Accountability Questions (with relevant literature)
1. Needs/Resources: What are the underlying needs & conditions that must be addressed? (Needs/Resource Assessment)
2. Goals: What are the goals, target population, & objectives? (Goal Setting)
3. Best Practice: What evidence-based models & best practice programs can be used in reaching the goals? (Consult Literature on Science-Based & Best Practice Programs)
4. Fit: What actions need to be taken so the selected program “fits” the community context? (Feedback on Comprehensiveness & Fit of Program)
5. Capacities: What organizational capacities are needed to implement the program? (Assessment of Organizational Capacities)
6. Plan: What is the plan for this program? (Planning)
7. Process: Is the program being implemented with quality? (Process Evaluation)
8. Outcome Evaluation: How well is the program working? (Outcome and Impact Evaluation)
9. Improve: How will continuous quality improvement strategies be included? (Continuous Quality Improvement)
10. Sustain: If the program is successful, how will it be sustained? (Sustainability and Institutionalization)

Toward an Evidence-Informed System for Innovation Support
Current Level of Capacity + innovation support (Training + QI/QA + Tools + TA) + GTO Steps = Actual Outcomes Achieved (to achieve desired outcomes)
GTO Steps: (1) Needs & Resources; (2) Goals & Desired Outcomes; (3) Science-based practices; (4) Fit; (5) Capacity; (6) Plan; (7) Implementation & Process Evaluation; (8) Outcome Evaluation; (9) Continuous Quality Improvement; and (10) Sustainability
From Barbee et al., 2011.

Thank you
For more information please contact:
Associate Professor Robyn Mildon, Executive Director
+61 488 188 072
Robyn.mildon@cei.org.au
www.cei.org.au