Strategies for Constructing & Scaling Up Evidence-Based Practices
Patricia Chamberlain, PhD
March 2010



The Focus
- How are evidence-based practices constructed (what goes into them, and why)?
- How can child outcomes, and the factors that predict (or drive) those outcomes, be measured within "real world" settings?
- How can evidence-based practice models fit into existing public service systems such as juvenile justice and child welfare?
- How can evidence-based models be scaled up?

Create the Blueprint: Carefully Visualize and Define the Outcome
- Specificity: "arrests" and "days incarcerated" versus "delinquency"
- Make it measurable: observable, drawn from multiple sources, not only self-reports
- Parsimony

Constructing an EBP: Develop the Plan
What do we want to make happen, and for whom? Define specific desired outcomes and how they can be measured. Resist the temptation to focus on too many outcomes; keep the plan clean and focused.

Who is the focus? (inclusion/exclusion)
- Adolescent girls in JJS: decrease crime, drug use, pregnancy; increase school attendance, positive peer relations
- Children in foster care: decrease placement disruptions, behavior problems; increase stability, reunification
- Foster parents: decrease dropping out of providing care, stress; increase parenting skills, support

What Goes into the Plan?
We look for high-quality studies that identify risk and protective factors that predict, or have strong associations with, the outcomes of interest:
- Randomized controlled trials are the strongest for inferring causality.
- Longitudinal studies that examine development over the lifespan are helpful because they provide information on when to intervene (developmental sensitivity).
- Multiple studies constitute a strong evidence base.
Which of the risk and protective factors found in the studies are potentially malleable (by us/you)?

Structural Plan
Malleable Risk Factors 1 and 2 and Malleable Protective Factors 1 and 2 feed into the sample outcomes:
- Decrease: criminal offending, drug use, pregnancy
- Increase: positive peer relations, school attendance

Engineering the Intervention
Intervention components are aimed at:
- Decreasing Risk Factor #1
- Decreasing Risk Factor #2
- Increasing Protective Factor #1
- Increasing Protective Factor #2
These risk and protective factors, in turn, drive the outcomes:
- Decrease: criminal offending, drug use, pregnancy
- Increase: positive peer relations, school attendance

Testing the Impact of the Intervention
The risk and protective factors act as mediators: the intervention components change the risk and protective factors, which in turn change the outcomes. In the absence of the specific intervention components, significantly less change in outcomes is expected.

Logic Model for Intervention Effects
Intervention components:
- Foster parent training and support to implement tracking of specific behavior
- Reinforcement for being where you are supposed to be
- Daily point-and-level system
- School card
- Supervision
- Delinquent peer groups
- Positive adult mentoring
- Reinforcement for positive school and home behaviors
Existing factors expected to moderate outcomes:
- Number of CWS placements
- Age at first placement
- Number of changes in caregivers
- Number of previous arrests
- Age
- Gender
Outcomes:
- Decrease: criminal offending, drug use, pregnancy
- Increase: positive peer relations, school attendance

[Figure: outcome comparison, MTFC versus Group Care; group difference significant at p < .05]

[Figure: additional outcome comparison, MTFC versus Group Care]

Fitting Research into "Real World" Settings
Ask whether the outcomes being addressed, and the measures of those outcomes:
- Are feasible (do not increase burden)
- Are meaningful (fit their agenda)
- Capitalize on their existing system data

Example in Child Welfare
Placement disruptions: between one-third and one-half of children disrupt within the first 12 months of care.
- Feasible: already tracked in CFSRs
- Meaningful: rates are high, and decreasing them is desirable
- Capitalizes on their data and is easy to count
- Costs increase exponentially as the number of disruptions increases

Using System Data to Predict Risk Level

Research-Based Risk & Protective Factors for Disruption
Risk factors:
- Child behavioral problems
- Foster parent stress
Protective factors:
- Foster parent support
- Behaviorally based parenting skills

Example of a Measure of a Risk Factor: Parent Daily Report
A daily snapshot of risk and protective factors. [Figure: weekly PDR data from 7/2000 through 1/2001, plotting total PDR score and number of behaviors, each with a linear trend line.]

PDR Scores at Baseline Predict Placement Disruption
A threshold effect: after six behaviors, every additional behavior on the PDR increases the probability of disruption by 17%.
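The threshold effect above can be sketched numerically. This is an illustrative calculation only: it assumes the reported 17% increase applies multiplicatively per behavior above the six-behavior threshold, and the function name is invented for the example, not drawn from the study.

```python
def relative_disruption_risk(pdr_behaviors: int, per_behavior_increase: float = 0.17) -> float:
    """Relative risk of placement disruption versus a child at or below the
    six-behavior threshold, assuming (as a simplification) that each extra
    behavior on the PDR multiplies the risk by (1 + per_behavior_increase)."""
    excess = max(0, pdr_behaviors - 6)
    return (1 + per_behavior_increase) ** excess

# A child with six or fewer daily problem behaviors is at baseline risk.
print(relative_disruption_risk(6))   # 1.0
print(relative_disruption_risk(10))  # 1.17**4, roughly 1.87
```

Under this reading, a child with ten daily problem behaviors would be nearly twice as likely to disrupt as one at the threshold.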

Research on Uptake and Scaling Up of Evidence-Based Practices
- In the US, 90% of child-serving agencies do not use EBPs.
- The agencies that tend to innovate do so repeatedly.
- The rich get richer, and most fall behind.
- The needs/innovations paradox: the systems that are most in need are least likely to innovate.

Scaling Up MTFC in California & Ohio

Who | Where | Discipline
Patti Chamberlain | CR2P, Oregon | Psychology
Hendricks Brown | U of Miami | Biostatistics
Lynne Marsenich | CA Institute for M.H. | Social Work
Todd Sosna | CA Institute for M.H. | Psychology
Larry Palinkas | U of Southern CA | Anthropology
Lisa Saldana | CR2P, Oregon | Psychology
Peter Sprengelmeyer | CR2P, Oregon | Psychology
Gerry Bouwman | TFCC Inc, Oregon | Business
Wei Wang | U of South Florida | Biostatistics
Patrick Kanary | CIP, Ohio | Social Work
Courtenay Padgett | CR2P, Oregon | Coordinator

Study Design
40 non-early-adopting counties are randomized to:
- 2 implementation conditions (CDT or IND)
- 1 of 3 time frames (a research resource issue: Cohorts #1, #2, #3)
Quantitative and qualitative measures:
- Assess stable, non-malleable factors (population density, number of placements, % minority)
- Assess "dynamic" malleable factors expected to mediate implementation success (organizational factors, attitudes toward EBPs)
- Clinical team factors (fidelity, competence, willingness)
- Child and family factors (behavior change, placement outcomes)
Implementation success/failure: Stages of Implementation Completion (SIC)

Design
Included: (1) no existing MTFC; (2) placed 6 or more youth (N = 40)
Excluded: (1) existing MTFC; (2) placed fewer than 6 youth (N = 19)
Matched on: (1) population size (urban/rural); (2) percentage minority; (3) number placed; (4) poverty
Randomly assigned to time & condition: Cohort 1 (2007), Cohort 2 (2008), Cohort 3 (2009)
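The eligibility screen and random assignment described above can be sketched in code. This is a simplified illustration, not the study's actual randomization procedure: the county records and field names are hypothetical, and the matching step (on population, percentage minority, number placed, and poverty) is omitted for brevity.

```python
import random

# Hypothetical county records; the real study also matched counties on
# population size (urban/rural), percentage minority, number placed, and poverty.
counties = [{"name": f"County {i}", "placed": random.randint(2, 60)} for i in range(59)]

def assign(counties, seed=0):
    """Screen counties on the inclusion rule (6 or more youth placed),
    then randomly assign each eligible county to an implementation
    condition (CDT vs. IND) and one of three cohorts (2007-2009)."""
    rng = random.Random(seed)
    eligible = [c for c in counties if c["placed"] >= 6]  # exclusion rule
    rng.shuffle(eligible)
    for i, county in enumerate(eligible):
        county["condition"] = "CDT" if i % 2 == 0 else "IND"
        county["cohort"] = 2007 + (i % 3)
    return eligible

assigned = assign(counties)
```

In the actual trial, assignment was done within matched groups rather than by simple shuffling, so that the CDT and IND arms stayed balanced on the four matching variables.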

The Stages of Implementation Completion (SIC): Theoretical Premise
Includes steps that have been identified as essential to the successful adoption, implementation, and sustainability of MTFC. The protocol is developed to measure achievement of a model-adherent program aimed at obtaining outcomes similar to those in the RCTs. SIC stages are operationalized and sequential:
- Engagement: the fit between community needs and the goals of MTFC
- Procuring fiscal resources
- Developing a feasible timeline
- Analyzing the impact of staff recruitment on the organization (readiness)
- Assessment of long-term sustainability

Stages of Implementation Completion (SIC) Measures
Multiple levels: system, practitioner, child/family.

Stage | Who is involved?
1. Engagement | System
2. Considering feasibility | System
3. Planning/readiness | System, practitioner
4. Staff hired and trained | Practitioner
5. Fidelity monitoring process in place | Practitioner, child/family
6. Services and consultation begin | Practitioner, child/family
7. Fidelity, competence, & adherence | Practitioner, child/family
8. Sustainability (certification) | System, practitioner

Activities Within the 8 SIC Stages

Stage 1: Engagement
1.1 Date site is informed services/program available
1.2 Date of interest indicated
1.3 Date agreed to consider implementation
1.4 Date declined to consider implementation; Stage 1 discontinued

Stage 3: Readiness planning
3.1 Date of cost/funding plan review
3.2 Date of staff sequence, timeline, hire plan review
3.3 Date of FP recruitment plan review
3.4 Date of referral criteria plan review
3.5 Date written implementation plan completed
3.6 Date Stage 3 discontinued

Stage 4: Staff hired & trained
4.1 Date service provider selected
4.2 Date 1st staff hired
4.3 Date clinical training scheduled
4.4 Date clinical training held (count of # of staff trained)
4.5 Date FP training scheduled/held
4.6 Date Stage 4 discontinued

Stage 6: Services and consultation to services begin
6.1 Date of first placement
6.2 Date of first consult call
6.3 Date of first clinical meeting video review (count of number of videos)
6.4 Date of first foster parent meeting video review (count of number of videos)
6.5 Date Stage 6 discontinued

Two Scales on the SIC
- Quantity: performance-date driven; tracks completion of activities
- Quality: performance-ratings driven; relies on ratings by sites & trainers
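Because the quantity scale is date driven, it reduces to durations between completed activities. A minimal sketch, assuming each SIC activity is logged as a (stage, activity code) key with a completion date; the dates and log format here are invented for illustration.

```python
from datetime import date

# Hypothetical activity log for one site: (stage, activity) -> completion date.
log = {
    (1, "1.1"): date(2007, 1, 15),  # site informed program is available
    (1, "1.2"): date(2007, 2, 1),   # interest indicated
    (1, "1.3"): date(2007, 3, 10),  # agreed to consider implementation
}

def stage_duration_days(log, stage):
    """Quantity score for one stage: days elapsed from its first to its
    last completed activity; None if the stage has no logged activities."""
    dates = [d for (s, _), d in log.items() if s == stage]
    return (max(dates) - min(dates)).days if dates else None

print(stage_duration_days(log, 1))  # 54
```

The same log supports the quantity measures shown on the next slides, such as Stage 1 "time to consent" and "time to decline."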

Example of Measuring Quantity (days)
[Table: Stage 1 time variables (time to decline, time to consent) with mean and range in days.]

SIC Progress by County
[Figure: per-county SIC progress chart. Black = Cohort 1, Blue = Cohort 2, Yellow = Cohort 3, Red = Discontinued, beige shading = discontinue activity.]

Examples of Quality Measures (with rater)

Stage 2: Consideration of Feasibility
- Ratings of system leaders' interest (MTFC trainers)
- Stakeholder feedback (system leaders)
Stage 3: Planning and Readiness
- Planning meeting impressions (MTFC trainers)
- Ratings of helpfulness of planning activities (site participants)
Stage 4: Staff Hired and Trained
- Pre-training ratings of MTFC (clinical team)
- Trainer impressions (MTFC trainers)
- Trainee impressions (clinical team)
- PS, FP, team, org ratings (MTFC trainers)

Examples of Quality Items (with rater)
- "How strongly do you subscribe to the MTFC model?" (clinical team)
- "How much support for the MTFC program is there organizationally?" (MTFC trainers)
- "Rate team members' ability to engage well with others." (MTFC trainers)

Next Steps on the SIC
- See sites through Stage 8
- Finalize the most appropriate scale scores
- Assess whether implementation condition (CDT vs. IND) affects the quantity and/or quality scales
- Assess how quantity and quality are related
- Use other study measures to validate the measure and assess its ability to predict successful implementation
- Validate with non-study MTFC sites
- Validate with other EBPs

What It Takes to Scale Up Evidence-Based Practices
- Top-down and bottom-up buy-in
- Mapping the "fit" between the intervention and the mission of the agency/system
- Assessing how the activities/structures of the intervention disrupt daily duties & requirements (paperwork, court appearances, home visits, on-call)
- Planning for change and instability (leadership turnover, funding ends)

Early Results on Predictors of Implementation
- Densely populated counties that placed the largest numbers of youth were the fastest to consent.
- System leaders who had the largest social networks were the "fence sitters."
- Systems with a positive organizational climate and high motivational readiness to change were the most likely to implement.

References
- Chamberlain, P., Brown, C. H., Saldana, L., Reid, J., Wang, W., Marsenich, L., Sosna, T., Padgett, C., & Bouwman, G. (2008). Engaging and recruiting counties in an experiment on implementing evidence-based practice in California. Administration and Policy in Mental Health and Mental Health Services Research, 35(4).
- Chamberlain, P., Saldana, L., Brown, H., & Leve, L. D. (in press). Implementation of multidimensional treatment foster care in California: A randomized control trial of an evidence-based practice. In M. Roberts-DeGennaro & S. J. Fogel (Eds.), Empirically supported interventions for community and organizational change. Chicago: Lyceum.
- Hoagwood, K., & Olin, S. (2002). The NIMH blueprint for change report: Research priorities in child and adolescent mental health. Journal of the American Academy of Child and Adolescent Psychiatry, 41.
- NIMH (2004). Treatment research in mental illness: Improving the nation's public mental health care through NIMH-funded interventions research. Washington, DC: Author.