
Improving Implementation of TFCO Through Evidence-Informed Implementation Assessment and Feedback
Lisa Saldana and Rena Gold
Blueprints for Healthy Youth Development, Denver, Colorado
April 12, 2016

Goals
- Define implementation
- Briefly define the TFCO model
- Describe research on measuring implementation that has come out of the TFCO model
- Describe how this research has led to minor changes in the TFCO implementation process
- Describe how this research has led to the potential for providing empirical, data-driven assessment and feedback to stakeholders attempting to adopt TFCO

What Happens to Innovation? Good ideas spread on their own! The fundamental question driving implementation science is this: what happens to good ideas? Let's take an example from industry, the iPad. How many people in here own an iPad? An iPhone? In 2010, the first year the iPad was released, 25 million iPads were sold. Two years later, 11 million were sold in a single quarter, meaning that in just two years Apple had nearly doubled its sales, reaching over 44 million people in 2011. 92% of Fortune 500 companies are testing or deploying iPads, and they are now distributed in over 90 countries. The iPhone is another example: an estimated 13% of US mobile phone users use an iPhone. So do good ideas just spread on their own? Certainly, some do, especially ones such as the iPhone and iPad (although I would argue there is a significant marketing strategy driving sales, as well as a socially reinforcing benefit of using the devices).

Defining Implementation
I want to start by presenting some definitions, as the field has been hampered by inconsistent terminology. I know that many people have done so, but I'd like to frame my presentation within the lens of these terms.
Diffusion = passive spread ("let it happen"); for example, a new innovation spread between nurses and doctors by word of mouth and the influence of social networks.
Dissemination = targeted distribution of information and intervention materials to a specific audience with the intent of spreading knowledge ("help it happen"). An example might be the dissemination of evidence-based practice guidelines, manuals, or continuing education workshops.
Implementation = use of strategies to adopt and integrate evidence-based health interventions and change practice patterns within specific settings ("make it happen"). An example might include auditing doctors and providing feedback on their use of a new innovation, such as a new medicine or patient care technology.
Thought of as a continuum, these activities range from letting it happen, to helping it happen, to making it happen. In full disclosure, my bias is that we need to make it happen, especially in healthcare, and I'll explain why as we move forward. (Greenhalgh et al., 2004; Lomas, 1993)

Implementation Process

Defining Implementation
Spans the continuum from pre-implementation to the development of competency and independence.
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38, 4-23.

The Challenge of Measuring Implementation
- Implementation of an EBP entails extensive planning, training, and quality assurance
- Involves a complex set of interactions between developers, system leaders, front-line staff, and consumers
- A recursive process of well-defined stages or steps that are not necessarily linear

Agents in Implementation
- System
- Agency Leadership
- Program Champion
- Developer
- Purveyor
- Practitioners
- Client/Patient

Natural Tension
- Methodology of intervention development and real-world pace, resources, and needs for implementation
- Rigor of science and adaptation to fit contexts

But There Is Agreement
Implementation components rated as important by both purveyors and implementers.
Blase, K., Naoom, S., Wallace, F., & Fixsen, D. (2004). Understanding Purveyor and Implementer Perceptions of Implementing Evidence-Based Programs.

Treatment Foster Care Oregon

TFCO Basics
- EBP for youth who otherwise would be in congregate care (juvenile justice and child welfare systems)
- Youth placed in well-supported foster homes
- Backed by multiple randomized clinical trials
- Currently being implemented in over 70 sites domestically and internationally

TFCO Basics
Objective: change the trajectory of negative behavior by improving social adjustment across settings.
How is this achieved? Simultaneous and well-coordinated treatments in multiple settings: home, school, community, and peer group.

TFCO Basics
- Youth are placed individually in foster homes
- Treatment in a family setting focusing on the youth and the family
- Intensive support and treatment in a setting that closely mirrors normative life
- Intensive parent management training
- Youth attend public schools

Known Risk and Protective Factors From TFCO Research
Effects mediated by:
- Supervision
- Relationship with a mentoring adult
- Consistent, non-harsh discipline
- Less association with delinquent peers
- Homework completion

Implementation of TFCO
- Many moving parts
- Agents involved to varying degrees during different parts of the process
- Well defined, yet individualized
- Multiple decision points
- Agency driven and purveyor supported

The Role of Measurement to Inform Successful Implementation
- Can we use measurement to inform feedback within these interactions?
- How can we develop metrics and benchmarks to create data-driven feedback?

The Stages of Implementation Completion (SIC)™
- Initially developed to measure the TFCO implementation process as part of a large-scale implementation trial to scale up TFCO (Chamberlain PI, NIMH R01MH076158)
- Measures implementation activities from Engagement to Competency
- Spans three phases: Pre-Implementation, Implementation, Sustainment
- Has been adapted for multiple EBPs in different service sectors, including juvenile justice, child welfare, HIV prevention, housing, substance use, primary care, mental health, and schools

Operationalizing Implementation
- Activities that are thought to be necessary for implementation
- Activities that are conducted as usual practice during implementation (even without evidence of being necessary)
- Date-driven
- Must define what "completed" means
- Must define what missing data means:
  - not completed
  - not completed because not relevant in this implementation context
  - completed, but within a previous implementation
  - completed, but not certain when it happened
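Because the SIC is date-driven, each activity record needs an explicit status code so that a missing date remains interpretable. Below is a minimal Python sketch of how the completion and missing-data categories above might be encoded; the enum and field names are illustrative assumptions, not the SIC's actual coding scheme.

from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional

class ActivityStatus(Enum):
    """Illustrative codes mirroring the missing-data categories above."""
    COMPLETED = "completed"                                  # completion date known
    NOT_COMPLETED = "not completed"
    NOT_RELEVANT = "not relevant in this implementation context"
    PRIOR_IMPLEMENTATION = "completed in a previous implementation"
    DATE_UNKNOWN = "completed, but date uncertain"

@dataclass
class ActivityRecord:
    stage: int                    # SIC stage, 1-8
    name: str                     # e.g., "Date of stakeholder meeting #1"
    status: ActivityStatus
    completed_on: Optional[date]  # set only when status == COMPLETED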

SIC Stages
8 stages, with the agents involved in each:
Pre-Implementation
1. Engagement (System Leader)
2. Consideration of Feasibility (System Leader, Agency)
3. Readiness Planning (System Leader, Agency)
Implementation
4. Staff Hired and Trained (Agency, Practitioner)
5. Adherence Monitoring Established (Practitioner, Client)
6. Services and Consultation (Practitioner, Client)
7. Ongoing Services, Consultation, Fidelity, Feedback (Practitioner, Client)
Sustainment
8. Competency (certification) (System Leader, Agency, Practitioner, Client)

TFCO SIC Activities Within Stage
Example activity for each stage:
1. Engagement: date agreed to consider implementation
2. Consideration of Feasibility: date of stakeholder meeting #1
3. Readiness Planning: date of cost calculator/funding plan review
4. Staff Hired and Trained: date of initial supervisor training
5. Adherence Monitoring Established: date fidelity technology set up
6. Services and Consultation Begin: date of first client served
7. Ongoing Services, Consultation, Fidelity, Feedback: date of Implementation Review #1
8. Competency (certification): date of first certification application submitted
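For illustration only, the stage-to-milestone mapping above can be written down as a simple lookup table, using the Pre-Implementation (Stages 1-3), Implementation (Stages 4-7), and Sustainment (Stage 8) phase split shown earlier. This is a hypothetical representation for exposition, not the SIC instrument itself.

# Hypothetical lookup: stage number -> (phase, stage name, one example milestone).
SIC_STAGES = {
    1: ("Pre-Implementation", "Engagement", "Date agreed to consider implementation"),
    2: ("Pre-Implementation", "Consideration of Feasibility", "Date of stakeholder meeting #1"),
    3: ("Pre-Implementation", "Readiness Planning", "Date of cost calculator/funding plan review"),
    4: ("Implementation", "Staff Hired and Trained", "Date of initial supervisor training"),
    5: ("Implementation", "Adherence Monitoring Established", "Date fidelity technology set up"),
    6: ("Implementation", "Services and Consultation Begin", "Date of first client served"),
    7: ("Implementation", "Ongoing Services, Consultation, Fidelity, Feedback", "Date of Implementation Review #1"),
    8: ("Sustainment", "Competency (certification)", "Date of first certification application submitted"),
}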

Yields THREE Scores: Duration, Proportion, Stage Score
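As a rough sketch of how these three scores could be computed from dated activity records, under simplified assumed definitions (Duration = days from first to last completed activity in a stage; Proportion = share of a stage's activities completed; Stage Score = highest stage with any completed activity). The published SIC scoring is more involved; this only conveys the idea.

from datetime import date

def stage_scores(records):
    """records: list of (stage, completed_on or None) tuples for one site."""
    by_stage = {}
    for stage, completed_on in records:
        by_stage.setdefault(stage, []).append(completed_on)

    duration = {}    # days spent within each stage
    proportion = {}  # fraction of each stage's activities completed
    for stage, dates in by_stage.items():
        done = [d for d in dates if d is not None]
        proportion[stage] = len(done) / len(dates)
        duration[stage] = (max(done) - min(done)).days if done else None

    # Highest stage in which the site completed at least one activity.
    stage_score = max((s for s, p in proportion.items() if p > 0), default=0)
    return duration, proportion, stage_score

# Illustrative usage: two Stage 1 activities completed, one Stage 2 activity missing.
recs = [(1, date(2016, 1, 4)), (1, date(2016, 2, 15)), (2, None)]
print(stage_scores(recs))  # ({1: 42, 2: None}, {1: 1.0, 2: 0.0}, 1)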

TFCO-SIC Outcomes
- Reliably distinguishes good from poor performers
- Reliably distinguishes between implementation strategies
- Meaningful prediction of implementation milestones:
  - Pre-implementation SIC behavior predicts successful program start-up
  - Pre-implementation SIC behavior predicts program discontinuation
  - Pre-implementation and implementation behavior combined predict development of Competency (Stage 8)
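To make the predictive claim concrete, here is a hedged sketch of one way such an analysis could be set up: pre-implementation Duration and Proportion as features, program start-up as the outcome. The numbers are made up for illustration, and this is not the model or data from the actual trial.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-site features from Stages 1-3:
# [days in pre-implementation, proportion of pre-implementation activities completed].
X = np.array([[45, 0.9], [160, 0.4], [80, 0.7], [300, 0.3], [60, 1.0]])
y = np.array([1, 0, 1, 0, 1])  # 1 = program start-up achieved (illustrative labels)

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[120, 0.6]])[0, 1])  # estimated start-up probability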

What Is Pre-Implementation?
Stage 1: Engagement
- Date site informed/learned of TFCO services
- Date of interest indicated
- Date agreed to consider implementing
Stage 2: Consideration of Feasibility
- Date of first response to planning contact
- Date stakeholder meeting #1 held
- Date Feasibility Questionnaire completed

What Is Pre-Implementation?
Stage 3: Readiness Planning
- Date of cost calculator/funding plan review
- Date of staff sequencing, timeline, hiring plan
- Date of Foster Parent recruitment review
- Date of referral criteria and liaison review
- Date of communication plan review
- Date of stakeholder meeting #2
- Date of written implementation plan
- Date provider selected

Which Aspects Are Most Influential?
- Qualitative interviews with actively implementing sites
- Encouraged bi-directional developer-agency dialogue
- Provided insights into common challenges
- Provided insights into the rigor-flexibility tension

Combining Qualitative and Quantitative Data
Stage 3: Readiness Planning
- Date of cost calculator/funding plan review
- Date of staff sequencing, timeline, hiring plan
- Date of Foster Parent recruitment review
- Date of referral criteria and liaison review
- Date of communication plan review
- Date of stakeholder meeting #2
- Date of written implementation plan
- Date provider selected

Using Data to Guide Success
Turning lessons learned into implementation feedback and support tools.

TIP SHEETS
- Foster Parent Recruitment
- Referral/Eligibility Criteria

Web-Based SIC
- Developed with funding from NIMH as part of an administrative supplement to a larger SIC evaluation grant (Saldana, PI; R01 MH097748-S1)
- User-friendly data entry to address the challenges of being date-driven
- Builds a repository of SIC data across multiple practices
- Develops a method for providing feedback to improve the chance of implementation success
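A repository of dated records across sites is what makes benchmark feedback possible. Below is a minimal sketch of one plausible feedback computation, comparing a site's stage duration to the repository distribution; the function is hypothetical, not the web-based SIC's actual feedback engine.

def duration_percentile(site_days, repository_days):
    """Percent of repository sites that completed this stage at least as fast.

    site_days: this site's duration (days) for one stage.
    repository_days: durations for the same stage across prior sites.
    """
    faster_or_equal = sum(1 for d in repository_days if d <= site_days)
    return 100.0 * faster_or_equal / len(repository_days)

# Illustrative: site took 90 days in Readiness Planning vs. six repository sites.
print(duration_percentile(90, [30, 45, 60, 75, 120, 150]))  # about 66.7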

Enhancing Consultation

Combining Implementation and Program Delivery Data

Future Directions
Theory, research, practice:
- Develop a protocol for using the implementation feedback system
- Evaluate the effectiveness of implementation feedback:
  - in TFCO
  - equally effective across EBPs?
- If demonstrated to be effective, then ultimately another implementation challenge

SIC TEAM
- Patricia Chamberlain, PhD, Co-I
- Jason Chapman, PhD, Co-I, Analyst
- John Landsverk, PhD, Co-I
- Mark Campbell, MS, Research Economist
- Holle Schaper, MS, Statistician
- Courtenay Padgett, MS, Project Coordinator
- Vanessa Ewen, Data Manager
- Katie Lewis, Editorial Assistant
Off-site investigative team: Sonja Schoenwald, PhD, Co-I; David Bradford, PhD, Co-I; Larry Palinkas, PhD, Co-I
Advisory board: Greg Aarons, Sarah Horwitz, Lonnie Snowden, Lynne Marsenich

THANK YOU
PI: Saldana (NIMH R01 MH097748; NIMH R01 MH097748-S1)
PI: Chamberlain (NIMH R01MH076158; NIDA R01MH076158-05S1; NIDA P50 DA035763-01)

THANK YOU ! lisas@oslc.org