MARCH, 2010 PATRICIA CHAMBERLAIN, PHD Strategies for Constructing & Scaling Up Evidence-Based Practices.

Presentation transcript:

1 MARCH, 2010 PATRICIA CHAMBERLAIN, PHD Strategies for Constructing & Scaling Up Evidence-Based Practices

2 The Focus How are evidence-based practices constructed (what goes in to them and why)? How can the child outcomes and factors that predict (or drive) those outcomes be measured within “real world” settings? How can evidence-based practice models fit into existing public service systems like juvenile justice and child welfare? How can evidence-based models be scaled up?

3 Create the Blueprint: Carefully Visualize and Define the Outcome
- Specificity: "arrests" and "days incarcerated" versus "delinquency"
- Make it measurable (observable, from multiple sources, not only self-reports)
- Parsimony

4 Constructing an EBP: Develop the Plan
What do we want to make happen for whom? Define specific desired outcomes and how they can be measured. Resist the temptation to focus on too many outcomes. Keep the plan clean and focused.
Who is the focus? (exclusion/inclusion)
- Adolescent girls in JJS: decrease crime, drug use, and pregnancy; increase school attendance and positive peer relations
- Children in foster care: decrease placement disruptions and behavior problems; increase stability and reunification
- Foster parents: decrease dropping out of providing care and stress; increase parenting skills and support

5 What Goes into the Plan?
We look for high-quality studies that identify risk and protective factors that predict or have strong associations with the outcomes of interest.
 Randomized controlled trials are the strongest for inferring causality.
 Longitudinal studies that examine development over the lifespan are helpful because they provide information on when to intervene (developmental sensitivity).
Multiple studies constitute a strong evidence base. Which of the risk and protective factors found in the studies are potentially malleable (by us/you)?

6 Structural Plan
Sample Outcomes: < Criminal Offending, < Drug Use, < Pregnancy, > Positive Peer Relations, > School Attendance
Malleable Risk Factor 1, Malleable Risk Factor 2, Malleable Protective Factor 1, Malleable Protective Factor 2

7 Engineering the Intervention
Outcomes: < Criminal Offending, < Drug Use, < Pregnancy, > Positive Peer Relations, > School Attendance
Risk Factor 1, Risk Factor 2, Protective Factor 1, Protective Factor 2
Intervention Components: aimed at decreasing Risk Factor #1, decreasing Risk Factor #2, increasing Protective Factor #1, and increasing Protective Factor #2

8 Testing the Impact of the Intervention
Outcomes: < Criminal Offending, < Drug Use, < Pregnancy, > Positive Peer Relations, > School Attendance
Intervention Components: aimed at decreasing Risk Factors #1 and #2 and increasing Protective Factors #1 and #2
Mediators: Risk Factor 1, Risk Factor 2, Protective Factor 1, Protective Factor 2
Absence of specific intervention components yields significantly less change in outcomes.

9 Logic Model for Intervention Effects
Intervention Components: foster parent training and support to implement tracking of specific behavior; reinforcement of being where you are supposed to be; daily point and level system; school card; supervision; delinquent peer groups; positive adult mentoring; reinforcement for positive school and home behaviors
Existing Factors Expected to Moderate Outcomes: number of CWS placements, age at first placement, number of changes in caregivers, number of previous arrests, age, gender
Outcomes: < Criminal Offending, < Drug Use, < Pregnancy, > Positive Peer Relations, > School Attendance

10 [Figure: outcome comparison, MTFC versus Group Care, p < .05]

11 [Figure: MTFC versus Group Care]

12 Fitting Research into "Real World" Settings
Ask if the outcomes being addressed and the measures of those outcomes are:
 Feasible (do not increase burden)
 Meaningful (fit their agenda)
 Able to capitalize on their existing system data

13 Example in Child Welfare
Placement disruptions: between one-third and one-half of children disrupt within the first 12 months of care.
 Feasible: already tracked in CFSRs
 Meaningful: rates are high and desirable to decrease
 Capitalizes on their data and is easy to count
 Costs increase exponentially as the number of disruptions increases

14 Using System Data to Predict Risk Level

15 Research-based Risk & Protective Factors for Disruption Risk Factors  Child behavioral problems  Foster parent stress Protective Factors  Foster Parent support  Behaviorally based parenting skills

16 Example of a measure of a risk factor: Parent Daily Report (PDR), a daily snapshot of risk and protective factors. [Figure: weekly Total PDR scores and number of behaviors from 7/20/00 through 1/18/01, each with a linear trend line]

17 PDR Scores at Baseline Predict Placement Disruption  A threshold effect: After 6 behaviors, every additional behavior on the PDR increases the probability of disruption by 17%.
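The threshold effect on this slide can be sketched numerically. This is a minimal illustration, not the study's actual model: the baseline disruption rate of 25% is an invented placeholder (the slide does not state one), and the 17% increase is treated as multiplicative per behavior above the threshold of six.

```python
THRESHOLD = 6          # behaviors on the PDR, from the slide
PER_BEHAVIOR_INCREASE = 0.17  # +17% per additional behavior, from the slide


def disruption_probability(n_behaviors, baseline=0.25):
    """Illustrative threshold model of placement-disruption risk.

    Below or at the threshold, risk stays at the (hypothetical)
    baseline; each behavior past the threshold multiplies risk
    by 1.17. Capped at 1.0 since it is a probability.
    """
    extra = max(0, n_behaviors - THRESHOLD)
    return min(1.0, baseline * (1 + PER_BEHAVIOR_INCREASE) ** extra)
```

For example, under these assumptions a child with 7 PDR behaviors would have 1.17 times the baseline risk, and risk compounds with each behavior beyond that.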

18 Research on Uptake and Scaling Up Evidence-Based Practices
In the US, 90% of child-serving agencies do not use EBPs. The agencies that tend to innovate do so repeatedly: the rich get richer and most fall behind. The Needs/Innovations paradox: the systems that are most in need are least likely to innovate.

19 Scaling Up MTFC in California & Ohio
- Patti Chamberlain, CR2P, Oregon (Psychology)
- Hendricks Brown, U of Miami (Biostatistics)
- Lynne Marsenich, CA Institute for M.H. (Social Work)
- Todd Sosna, CA Institute for M.H. (Psychology)
- Larry Palinkas, U of Southern CA (Anthropology)
- Lisa Saldana, CR2P, Oregon (Psychology)
- Peter Sprengelmeyer, CR2P, Oregon (Psychology)
- Gerry Bouwman, TFCC Inc, Oregon (Business)
- Wei Wang, U of South Florida (Biostatistics)
- Patrick Kanary, CIP, Ohio (Social Work)
- Courtenay Padgett, CR2P, Oregon (Coordinator)

20 Study Design
40 non-early-adopting counties are randomized to:
 2 implementation conditions (CDT or IND)
 1 of 3 time frames (a research resource issue: Cohorts #1, #2, #3)
Quantitative and qualitative measures:
- Assess stable non-malleable factors (population density, number of placements, percent minority)
- Assess "dynamic" malleable factors expected to mediate implementation success (organizational factors, attitudes toward EBPs)
- Clinical team factors (fidelity, competence, willingness)
- Child and family factors (behavior change, placement outcomes)
 Implementation success/failure: Stages of Implementation Completion (SIC)

21 Design
Included: 1. No existing MTFC; 2. Placed 6 or more youth (N = 40)
Excluded: 1. Existing MTFC; 2. Placed fewer than 6 (N = 19)
Matched on: 1. Population size (urban/rural); 2. Percentage minority; 3. Number placed; 4. Poverty
Randomly assigned to time and condition: Cohort 1: 2007; Cohort 2: 2008; Cohort 3: 2009
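The design above can be sketched as a small simulation. The county names, the pre-sorted matched pairs, and the within-pair coin flip are illustrative assumptions; the slides specify only that 40 eligible counties were matched and randomized to two implementation conditions (CDT or IND) and three cohort start years.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical county labels standing in for the 40 eligible counties.
eligible = [f"county_{i}" for i in range(40)]

# Assume counties have already been sorted into matched pairs on
# population size, percentage minority, number placed, and poverty.
pairs = [eligible[i:i + 2] for i in range(0, len(eligible), 2)]

# Within each matched pair, a coin flip assigns condition.
assignments = {}
for pair in pairs:
    random.shuffle(pair)
    assignments[pair[0]] = "CDT"  # one implementation condition
    assignments[pair[1]] = "IND"  # the other implementation condition

# Spread counties across the three cohort start years (2007-2009),
# which the slides describe as a research-resource staging decision.
cohorts = {county: 2007 + i % 3 for i, county in enumerate(eligible)}
```

Matching before randomization keeps the two conditions balanced on the listed county characteristics even with only 40 units.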

22

23 The Stages of Implementation Completion (SIC)
Theoretical premise: includes steps that have been identified as essential to the successful adoption, implementation, and sustainability of MTFC. The protocol is developed to measure the achievement of a model-adherent program aimed at obtaining outcomes similar to RCTs. SIC stages are operationalized and sequential:
 Engagement: the fit between community needs and the goals of MTFC
 Procuring fiscal resources
 Developing a feasible timeline
 Analyzing the impact of staff recruitment on the organization (readiness)
 Assessment of long-term sustainability

24 Stages of Implementation Completion (SIC)
Measures implementation at multiple levels: system, practitioner, child/family.
8 stages (and who is involved):
1. Engagement (System)
2. Considering feasibility (System)
3. Planning/readiness (System, Practitioner)
4. Staff hired and trained (Practitioner)
5. Fidelity monitoring process in place (Practitioner, Child/Family)
6. Services and consultation begin (Practitioner, Child/Family)
7. Fidelity, competence, and adherence (Practitioner, Child/Family)
8. Sustainability (certification) (System, Practitioner)

25 Activities Within the 8 SIC Stages
Stage 1: Engagement
1.1 Date site is informed services/program available
1.2 Date of interest indicated
1.3 Date agreed to consider implementation
1.4 Date declined to consider implementation; Stage 1 discontinued
Stage 3: Readiness planning
3.1 Date of cost/funding plan review
3.2 Date of staff sequence, timeline, hire plan review
3.3 Date of FP recruitment plan review
3.4 Date of referral criteria plan review
3.5 Date written implementation plan completed
3.6 Date Stage 3 discontinued

26 Stage 4: Staff hired and trained
4.1 Date Service Provider selected
4.2 Date 1st staff hired
4.3 Date clinical training scheduled
4.4 Date clinical training held (count of number of staff trained)
4.5 Date FP training scheduled/held
4.6 Date Stage 4 discontinued
Stage 6: Services and consultation to services begin
6.1 Date of first placement
6.2 Date of first consult call
6.3 Date of first clinical meeting video review (count of number of videos)
6.4 Date of first foster parent meeting video review (count of number of videos)
6.5 Date Stage 6 discontinued
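The date-stamped activities above are what drive the SIC's quantity scale: elapsed days between activities. A minimal sketch with invented dates (the activity numbering follows the slides; the dates and site log structure are illustrative assumptions):

```python
from datetime import date

# Hypothetical per-site log of SIC activity completion dates,
# keyed by activity number as labeled on the slides.
site_log = {
    "1.1": date(2007, 1, 10),  # site informed program is available
    "1.2": date(2007, 2, 1),   # interest indicated
    "1.3": date(2007, 4, 20),  # agreed to consider implementation
}


def days_between(log, start_activity, end_activity):
    """Quantity score for a span: elapsed days between two dated activities."""
    return (log[end_activity] - log[start_activity]).days


# e.g. a "time to consent" figure for this hypothetical site
time_to_consent = days_between(site_log, "1.1", "1.3")
```

Aggregating such per-site durations across counties is what produces summary statistics like the means and ranges reported later in the deck.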

27 Two Scales on the SIC
 Quantity: performance-date driven; tracks completion of activities
 Quality: performance-ratings driven; relies on ratings by sites and trainers

28 Example of Measuring Quantity (days), Stage 1
Time to Decline: mean 100.4, range 3-1020
Time to Consent: mean 70.7, range 0-533

29 SIC Progress by County [Figure: per-county progress chart; black = Cohort 1, blue = Cohort 2, yellow = Cohort 3, red = discontinued, beige shading = discontinued activity]

30 Examples of Quality Measures
Stage 2: Consideration of Feasibility
- Ratings of system leaders' interest (rated by MTFC trainers)
- Stakeholders' feedback (rated by system leaders)
Stage 3: Planning and Readiness
- Planning meeting impressions (MTFC trainers)
- Ratings of helpfulness of planning activities (site participants)
Stage 4: Staff Hired and Trained
- Pre-training ratings of MTFC (clinical team)
- Trainer impressions (MTFC trainers)
- Trainee impressions (clinical team)
- PS, FP, Team, Org ratings (MTFC trainers)

31 Examples of Quality Items
- How strongly do you subscribe to the MTFC model? (clinical team)
- How much support for the MTFC program is there organizationally? (MTFC trainers)
- Rate the team member's ability to engage well with others. (MTFC trainers)

32 Next Steps on the SIC
 See sites through Stage 8
 Finalize the most appropriate scale scores
 Assess whether implementation condition (CDT vs. IND) affects the quantity and/or quality scales
 Assess how quantity and quality are related
 Use other study measures to validate the SIC and assess its ability to predict successful implementation
 Validate with non-study MTFC sites
 Validate with other EBPs

33 What Does It Take to Scale Up Evidence-Based Practices?
 Top-down and bottom-up buy-in
 Mapping the "fit" between the intervention and the mission of the agency/system
 Assessing how the activities/structures of the intervention disrupt daily duties and requirements (paperwork, court appearances, home visits, on-call)
 Planning for change and instability (leadership turnover, funding ends)

34 Early Results on Predictors of Implementation
Densely populated counties that placed the largest numbers of youth were the fastest to consent. System leaders who had the largest social networks were the "fence sitters." Systems with a positive organizational climate and high motivational readiness to change were the most likely to implement.

35 References
- Chamberlain, P., Brown, C. H., Saldana, L., Reid, J., Wang, W., Marsenich, L., Sosna, T., Padgett, C., & Bouwman, G. (2008). Engaging and recruiting counties in an experiment on implementing evidence-based practice in California. Administration and Policy in Mental Health and Mental Health Services Research, 35(4), 250-260.
- Chamberlain, P., Saldana, L., Brown, H., & Leve, L. D. (in press). Implementation of multidimensional treatment foster care in California: A randomized control trial of an evidence-based practice. In M. Roberts-DeGennaro & S. J. Fogel (Eds.), Empirically supported interventions for community and organizational change. Chicago: Lyceum.
- Hoagwood, K., & Olin, S. (2002). The NIMH blueprint for change report: Research priorities in child and adolescent mental health. Journal of the American Academy of Child and Adolescent Psychiatry, 41, 760-767.
- NIMH (2004). Treatment research in mental illness: Improving the nation's public mental health care through NIMH-funded interventions research. Washington, DC: Author.

