Developing, Measuring, and Improving Program Fidelity: Achieving positive outcomes through high-fidelity implementation. SPDG National Conference, Washington, DC, March 5, 2012

Presentation transcript:

Developing, Measuring, and Improving Program Fidelity: Achieving positive outcomes through high-fidelity implementation
SPDG National Conference, Washington, DC, March 5, 2012
Allison Metz, PhD, Associate Director, NIRN
Frank Porter Graham Child Development Institute, University of North Carolina

Program Fidelity: 6 Questions
1. What is it?
2. Why is it important?
3. When are we ready to assess fidelity?
4. How do we measure fidelity?
5. How can we produce high-fidelity use of interventions in practice?
6. How can we use fidelity data for program improvement?

"PROGRAM FIDELITY"
"The degree to which the program or practice is implemented 'as intended' by the program developers and researchers."
"Fidelity measures detect the presence and strength of an intervention in practice."

What is fidelity? (Question 1)
Three components:
- Context: structural aspects that encompass the framework for service delivery
- Compliance: the extent to which the practitioner uses the core program components
- Competence: process aspects that encompass the level of skill shown by the practitioner and the "way in which the service is delivered"

Why is fidelity important? (Question 2)
- Interpret outcomes: is this an implementation challenge or an intervention challenge?
- Detect variations in implementation
- Replicate consistently
- Ensure compliance and competence
- Develop and refine interventions in the context of practice
- Identify the "active ingredients" of the program

Why is fidelity important? (Question 2)
Effective Interventions (the "WHAT") combined with Effective Implementation (the "HOW") produce Positive Outcomes for Children.

Implementation Science
The intervention-by-implementation matrix (Institute of Medicine, 2000, 2001, 2009; New Freedom Commission on Mental Health, 2003; National Commission on Excellence in Education, 1983; Department of Health and Human Services, 1999):
- Effective INTERVENTION, effective IMPLEMENTATION: actual benefits
- Effective INTERVENTION, not effective IMPLEMENTATION: inconsistent, not sustainable, poor outcomes
- NOT effective INTERVENTION, effective IMPLEMENTATION: poor outcomes, sometimes harmful
- NOT effective INTERVENTION, not effective IMPLEMENTATION: unpredictable or poor outcomes
From Mark Lipsey's 2009 meta-analytic overview of the primary factors that characterize effective juvenile offender interventions: "... in some analyses, the quality with which the intervention is implemented has been as strongly related to recidivism effects as the type of program, so much so that a well-implemented intervention of an inherently less efficacious type can outperform a more efficacious one that is poorly implemented."

When are we ready to assess fidelity? (Question 3)
Operationalize (verb): to define a concept or variable so that it can be measured or expressed quantitatively (Webster's New Millennium™ Dictionary of English, Preview Edition v 0.9.7, © Lexico Publishing Group, LLC)
The "it" must be operationalized, whether it is:
- an evidence-based practice or program
- a best practice initiative or new framework
- a systems change initiative or element

How developed is your WHAT?
- Does this approach involve the implementation of an evidence-based program or practice that has been effectively implemented in other locations? Does it involve purveyor or other "expert" support? How well defined are the critical components of the approach?
- Or does it involve the implementation of an evidence-informed approach that hasn't been implemented often, or ever? To what extent is the approach still being developed or fine-tuned? How clearly defined are the critical components of the approach?

Developing Practice Profiles
Each critical component is a heading. For each critical component, identify:
- "gold standard" practice, i.e., "expected" practice
- developmental variations in practice
- ineffective and undesirable practices
(Adapted from work of the Heartland Area Education Agency 11, Iowa)

Developing Practice Profiles (example adapted from work of the Heartland Area Education Agency 11, Iowa)
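
As a minimal sketch only, here is one way a practice profile could be captured in code, assuming a simple Python dictionary layout rather than an official NIRN or Heartland format. The expected practice is taken from the Partnering example later in this deck; the variation entries are hypothetical illustrations.

```python
# A minimal sketch (not an official NIRN or Heartland schema): each critical
# component carries its expected ("gold standard") practice, developmental
# variations, and ineffective or undesirable practices.
practice_profile = {
    "Partnering with parents": {
        "expected": "Encourage and include parent involvement in educational decision-making",
        # Hypothetical examples of developmental and ineffective variations:
        "developmental_variations": ["Parents are informed of decisions after they are made"],
        "ineffective_practices": ["Decisions are made without contacting parents"],
    },
    # ...one entry per critical component heading
}

for component, levels in practice_profile.items():
    print(component, "->", levels["expected"])
```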

How do we measure fidelity? (Question 4)
Establish fidelity criteria if not yet developed:
1. Identify critical components, operationalize them, and determine indicators
   a. Describe data sources
   b. Make indicators as objective as possible (e.g., anchor points for rating scales)
2. Collect data to measure these indicators, "preferably through a multi-method, multi-informant approach" (Mowbray, 2003); see the sketch after this list
3. Examine the measures in terms of reliability and validity
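
As a minimal sketch of steps 1 and 2, the snippet below scores one operationalized indicator on a 1-3 anchored rating scale across multiple informants and checks crude agreement. The anchor labels, informant roles, and ratings are hypothetical, and this is not a validated instrument.

```python
# A rough sketch, not a validated fidelity instrument: score one indicator
# across informants and check how closely they agree.
from statistics import mean, mode

ANCHORS = {1: "not in place", 2: "partially in place", 3: "fully in place"}  # assumed anchors

# Hypothetical multi-informant ratings for a single indicator
ratings = {"observer": 3, "coach": 3, "teacher_self_report": 2}

indicator_score = mean(ratings.values())     # average across informants
modal_rating = mode(ratings.values())        # most common rating
agreement = sum(r == modal_rating for r in ratings.values()) / len(ratings)

print(f"Indicator score: {indicator_score:.2f} ({ANCHORS[round(indicator_score)]})")
print(f"Informant agreement with modal rating: {agreement:.0%}")
```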

How do we measure fidelity? (Question 4)
Staff performance assessments serve as a mechanism to begin identifying "process" aspects of fidelity for newly operationalized programs.
Contextual (or structural) aspects of fidelity are "in service to" adherence and competence:
- length, intensity, and duration of service (dosage), and roles and qualifications of staff
- training and coaching procedures
- case protocols and procedures
- administrative policies
- data collection requirements
- inclusion/exclusion criteria for the target population

Performance Assessment
- Start with the Expected/Proficient column
- Develop an indicator for each Expected/Proficient activity
- Identify "evidence" that this activity has taken place
- Identify "evidence" that this activity has taken place with high quality
- Identify potential data source(s)

Fidelity Criteria: Parent Involvement and Leadership Practice Profile (Partnering)
- Expected/Proficient: Encourage and include parent involvement in educational decision-making
- Indicator that the activity is happening (Adherence): Parent/Teacher meetings take place to develop goals and plans for child progress
  Potential data sources: Observation; Documentation
- Indicator that the activity is happening well (Competence): Parent feels included and respected
  Potential data source: Parent Partnering Survey
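
The same criterion row could be held as a plain data record, which makes it easy to pair each Expected/Proficient activity with its adherence and competence indicators and their data sources. This is a minimal sketch with hypothetical field names, not an official NIRN schema; the values come from the example above.

```python
# A minimal sketch of the fidelity criterion above as a data record
# (hypothetical field names, content taken from the Partnering example).
fidelity_criterion = {
    "expected_activity": "Encourage and include parent involvement in educational decision-making",
    "adherence": {
        "indicator": "Parent/Teacher meetings take place to develop goals and plans for child progress",
        "data_sources": ["Observation", "Documentation"],
    },
    "competence": {
        "indicator": "Parent feels included and respected",
        "data_sources": ["Parent Partnering Survey"],
    },
}
```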

How do we measure fidelity? (Question 4)
If fidelity criteria are already developed:
1. Understand the reliability and validity of the instruments
   a. Are we measuring what we thought we were?
   b. Is fidelity predictive of outcomes?
   c. Does the fidelity assessment discriminate between programs?
2. Work with program developers or purveyors to understand the detailed protocols for data collection
   a. Who collects the data (expert raters, teachers)?
   b. How often are data collected?
   c. How are data scored and analyzed?
3. Understand the issues (reliability, feasibility, cost) in collecting different kinds of fidelity data
   a. Process data vs. structural data

How do we measure fidelity? (Question 4)
If adapting an approach…
- How well "developed" is the program or practice being adapted? (Winter & Szulanski, 2001)
- Have core program components been identified?
- Do adaptations change function or form?
- How will adaptation affect fidelity criteria and assessments?

How do we measure fidelity? (Question 4)
Steps to measuring fidelity (new or established criteria):
1. Ensure fidelity assessors are available, understand the program or innovation, and are well versed in the education setting
2. Develop a schedule for conducting fidelity assessments
3. Ensure adequate preparation for the teachers/practitioners being assessed
4. Report results of the fidelity assessment promptly
5. Enter results into a decision-support data system (see the sketch below)
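
As a minimal sketch of steps 4 and 5 (reporting results promptly and entering them into a decision-support data system), the snippet below uses hypothetical practitioner names, dates, and an assumed 80% criterion; real programs set their own cutoff and data system.

```python
# A minimal sketch: enter fidelity assessment results into a simple
# decision-support summary and flag scores that need follow-up.
FIDELITY_CRITERION = 0.80  # assumed cutoff, not a published standard

assessments = [  # hypothetical results from the assessment schedule
    {"practitioner": "Teacher A", "date": "2012-02-01", "score": 0.92},
    {"practitioner": "Teacher B", "date": "2012-02-03", "score": 0.68},
]

for a in assessments:
    status = ("meets criterion" if a["score"] >= FIDELITY_CRITERION
              else "below criterion: plan coaching follow-up")
    print(f"{a['practitioner']} ({a['date']}): {a['score']:.0%}, {status}")
```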

How can we produce high-fidelity implementation in practice?
- Build, improve, and sustain practitioner competency
- Create hospitable organizational and systems environments
- Use appropriate leadership strategies

"IMPLEMENTATION DRIVERS": common features of successful supports that help make full and effective use of a wide variety of innovations

Implementation Drivers (© Fixsen & Blase, 2008): integrated and compensatory drivers that support effective education strategies and improved outcomes for children and youth
- Competency Drivers: Selection, Training, Coaching, Performance Assessment (Fidelity)
- Organization Drivers: Decision Support Data System, Facilitative Administration, Systems Intervention
- Leadership: Technical and Adaptive

How can we produce high-fidelity implementation? (Question 5)
Fidelity is an implementation outcome:
- Implementation Drivers influence how well or how poorly a program is implemented
- The full and integrated use of the Implementation Drivers supports practitioners in consistent, high-fidelity implementation of the program
- Staff performance assessments are designed to assess the use and outcomes of the skills required for high-fidelity implementation of a new program or practice

How can we produce high-fidelity implementation? (Question 5)
Competency Drivers:
- Demonstrate knowledge, skills, and abilities
- Practice to criteria
- Coach for competence and confidence
Organizational Drivers:
- Use data to assess fidelity and improve program operations
- Administer policies and procedures that support high-fidelity implementation
- Implement needed systems interventions
Leadership Drivers:
- Use appropriate leadership strategies to identify and solve challenges to effective implementation

How can we use fidelity data for program improvement? (Question 6)
Program Review: a process to create a sustainable improvement cycle for the program
- Process and outcome data: measures, data sources, data collection plan
- Detection systems for barriers: roles and responsibilities
- Communication protocols: accountable, moving information up and down the system
Questions to ask (see the sketch below):
- What formal and informal data have we reviewed?
- What is the data telling us?
- What barriers have we encountered?
- Would improving the functioning of any Implementation Driver help address a barrier?
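
As a minimal sketch of one review cycle, assuming a simple record layout (the findings, barrier, and field names below are hypothetical, not NIRN materials), each barrier can be logged alongside the Implementation Driver that might help address it so the information can move up and down the system.

```python
# A minimal sketch with hypothetical content: log one program review cycle and
# route each barrier to the Implementation Driver that might address it.
review = {
    "data_reviewed": ["fidelity assessments", "coaching logs"],  # formal and informal data
    "findings": "Adherence indicators met; competence ratings declining",
    "barriers": [
        {"description": "New staff have not yet received coaching on core components",
         "related_driver": "Coaching"},
    ],
}

for barrier in review["barriers"]:
    print(f"Barrier: {barrier['description']} -> "
          f"consider strengthening the {barrier['related_driver']} driver")
```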

Summary: Program Fidelity
- Fidelity has multiple facets and is critical to achieving outcomes
- Fully operationalized programs are prerequisites for developing fidelity criteria
- Valid and reliable fidelity data need to be collected carefully, with guidance from program developers or purveyors
- Fidelity is an implementation outcome; effective use of the Implementation Drivers can increase our chances of high-fidelity implementation
- Fidelity data can and should be used for program improvement

Resources
Examples of fidelity instruments:
- Teaching Pyramid Observation Tool for Preschool Classrooms (TPOT), Research Edition, Mary Louise Hemmeter and Lise Fox
- The PBIS fidelity measure (the SET), described at IS_ResourceID=222
Articles:
- Sanetti, L., & Kratochwill, T. (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38(4), 445–459.
- Mowbray, C.T., Holter, M.C., Teague, G.B., & Bybee, D. (2003). Fidelity criteria: Development, measurement and validation. American Journal of Evaluation, 24(3).
- Hall, G.E., & Hord, S.M. (2011). Implementing Change: Patterns, Principles and Potholes (3rd ed.). Boston: Allyn and Bacon.

Stay Connected! nirn.fpg.unc.edu