Converting a LOCAL Program to a Valid EBP: Fidelity Management (Meadowcroft & Associates and Wesley Spectrum Services)

Presentation transcript:

Converting a LOCAL Program to a Valid EBP: Fidelity Management
Meadowcroft & Associates and Wesley Spectrum Services
For more information, please contact Pamela Meadowcroft, Ph.D.

Levels of Confidence
◦ Evidence-based practice (rigorously evaluated; most often proven via RCT)
◦ Evidence-informed practice / research-based (existing research support)
◦ Best practices (expert opinion)
◦ Promising practice (acceptable treatments, anecdotal)
◦ Innovations
◦ Intuition, "the way it's always done"

We Know a Lot About What Works!
◦ Meta-analyses on thousands of studies
◦ Many programs ARE using research-based practices
◦ They just have not MEASURED and TRACKED their work!

Mark Lipsey, "Evidence-Based Practice: More than One Approach." MST and FFT (two brand-name programs) show positive results (the dark boxes in the chart), but even "generic" interventions showed better results.

Wesley Spectrum In-Home: History of Tracking Outcomes

But… Why the Good Outcomes? An easier population? OR something we are DOING (our interventions/program model)? In other words: TRACKING OUTCOMES IS NOT ENOUGH.

Ideal Results: High "fidelity" to the model leads to the best outcomes.

Steps for Building a Local EBP: Fidelity Management
1. Define the program
2. Verify key program elements with existing research
3. Develop and track model fidelity (outputs)
4. Develop and monitor outcomes
5. Validate the locally developed program model (link outputs to outcomes)
6. Build in CQI

Define the Program
Logic model: key program components
◦ Specific population
◦ Staff selection and training
◦ What the staff does
◦ How they are supervised
◦ Expected outputs and outcomes
◦ Suggested measures
Example of a draft IRT Logic Model
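
As a rough illustration only, the logic-model components listed above could be captured as structured data so they can later drive checklists and reports. The field names and entries below are hypothetical placeholders in Python, not the actual draft IRT logic model.

```python
# Illustrative sketch: logic-model components recorded as data.
# All entries are hypothetical placeholders, not the actual IRT logic model.
logic_model = {
    "population": "families referred for intensive in-home treatment",
    "staff_selection_training": ["master's-level therapists", "pre-service training"],
    "key_activities": ["weekly in-home sessions", "caregiver skill coaching"],
    "supervision": "weekly supervisor case review",
    "expected_outputs": ["session frequency", "checklist completion rate"],
    "expected_outcomes": ["child well-being score", "placement stability"],
    "suggested_measures": ["therapist/supervisor checklist", "consumer satisfaction survey"],
}
```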

Verify Program Elements with Existing Research
◦ Literature review
◦ Eliminate from tracking anything that doesn't have existing research support
◦ Examples

Develop and Track Model Fidelity and Outcomes
Therapist and Supervisor Checklist scores (intake, monthly, discharge):
◦ Who we are serving (population assessments)
◦ What we are doing (outputs related to key activities, intensity of services)
◦ How we did (client outcomes)
Consumer Satisfaction Survey scores:
◦ Items relate to key program activities; additional output measures
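
A minimal sketch, in Python, of how checklist items might be rolled up into a single fidelity score. The item names and the 0-2 rating scale are assumptions for illustration, not the actual Wesley Spectrum instrument.

```python
# Sketch: turn a monthly therapist/supervisor checklist into a percent-fidelity score.
# Item names and the 0-2 rating scale are illustrative assumptions.
def fidelity_score(checklist, max_per_item=2):
    """Percent of possible fidelity points earned across checklist items."""
    earned = sum(checklist.values())
    possible = max_per_item * len(checklist)
    return 100.0 * earned / possible

monthly = {
    "weekly_in_home_session_held": 2,
    "caregiver_present_at_session": 2,
    "skills_practiced_between_sessions": 1,
    "supervisor_case_review_completed": 2,
}
print(f"Model fidelity this month: {fidelity_score(monthly):.0f}%")  # 88%
```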

Build in CQI: Model Fidelity Comparison of Two Sites
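
One way such a two-site comparison could be run, sketched with made-up monthly fidelity scores; an independent-samples t-test via SciPy is an assumed, not prescribed, choice of test.

```python
# Sketch: compare average model fidelity between two sites (illustrative data only).
from scipy import stats

site_a = [88, 92, 85, 90, 94, 87]   # monthly model-fidelity scores (%), Site A
site_b = [72, 78, 70, 81, 75, 69]   # monthly model-fidelity scores (%), Site B

t_stat, p_value = stats.ttest_ind(site_a, site_b)
print(f"Site A mean = {sum(site_a)/len(site_a):.1f}%, "
      f"Site B mean = {sum(site_b)/len(site_b):.1f}%, p = {p_value:.3f}")
```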

Higher Model Fidelity Improved Child Well-Being: Strong Relationship Between Outputs and Outcomes
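
A minimal sketch of the validation step (linking outputs to outcomes): correlate each case's fidelity score with its pre/post change on a well-being measure. The numbers are illustrative, not Wesley Spectrum data.

```python
# Sketch: correlate per-case fidelity (output) with well-being gain (outcome).
from scipy import stats

fidelity = [60, 72, 80, 85, 91, 95]      # per-case model-fidelity score (%)
wellbeing_gain = [2, 4, 5, 7, 8, 10]     # per-case pre/post well-being change

r, p = stats.pearsonr(fidelity, wellbeing_gain)
print(f"Fidelity vs. well-being gain: r = {r:.2f}, p = {p:.3f}")
```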

Locally Developed vs. Proprietary EBP

Brand-Name EBP vs. Locally Developed EBP

Purchased (brand-name) EBP:
◦ Millions of dollars for research and evaluation
◦ Many decades of research and development
◦ Highly prescribed
◦ Low adaptability
◦ High effort
◦ Ongoing high program cost (e.g., recertification)

Locally developed EBP:
◦ Low-cost research and evaluation in a short time
◦ Moderate level of program requirements
◦ Lower program cost
◦ Greater utility across populations
◦ Embedded in CQI
◦ Tools for incorporating new practices
◦ Staff commitment

Key Conclusions
◦ Evidence-based models pose limitations that our model-building process does not
◦ Our model-building process is replicable, so other programs could do the same
◦ The process gives programs supervision and monitoring tools for continuous improvement AND for making the case for value to stakeholders