Triple P Outcomes in California
Arizona Child Trauma Summit, April 9, 2013
Cricket Mitchell, PhD, Senior Associate, CiMH


Summary of Breakout Session
Overview of California’s Triple P outcome evaluation data
–Counties supported by CiMH
Options to consider in developing outcome evaluation protocols
Outcome evaluation for Arizona’s Triple P implementation – facilitated discussion

What is CiMH? And How is It Related to Triple P?
The California Institute for Mental Health (CiMH) is a statewide non-profit that provides training, technical assistance, research, evaluation, and policy support to publicly funded agencies
–Supports the dissemination and implementation of 12 evidence-based practices
Program performance and outcome evaluation is a critical implementation support
Triple P was selected for dissemination by CiMH and promoted to county agencies in 2006
–Some agencies contract with CiMH, and some do not

Triple P Implementation Sites Across California Counties
Mendocino, Alameda, Shasta, Nevada, Sonoma, Marin, San Francisco, Contra Costa, Santa Cruz, Santa Clara, San Joaquin, Ventura, Los Angeles, Riverside, Orange, San Diego
Also: Tri-cities Area

Overview of California’s Triple P Outcome Evaluation Data
Summer 2012 Triple P data submission to CiMH:
–Four counties: Los Angeles, Shasta, Sonoma, Ventura
–74 implementation sites
–5,292 unique child clients served

Overview of California’s Triple P Outcome Evaluation Data
Outcome evaluation protocols within each county vary:
–Data elements collected (demographics, service delivery information)
–Outcome measures used
–Applications/software used for data entry

Overview of California’s Triple P Outcome Evaluation Data
CiMH’s Program Performance and Outcome Evaluation Reports cover three primary domains:
–Characteristics of clients served
–Description of services provided
–Outcomes achieved, with a two-pronged approach to outcome measurement: target-specific symptoms and general mental health functioning

Overview of California’s Triple P Outcome Evaluation Data
Today’s presentation will highlight select data elements from the Summer 2012 data submission:
–Triple P levels and types
–Child client demographics: age, gender, ethnicity, primary language spoken in the home, and primary Axis I DSM-IV diagnosis
–Triple P outcomes: Eyberg Child Behavior Inventory (ECBI), Parenting Scale, and Youth Outcome Questionnaire (YOQ)

Overview of California’s Triple P Outcome Evaluation Data – Level and Type of Triple P

Overview of California’s Triple P Outcome Evaluation Data – Age
Range: .01 – years
–Some counties serve Transition Age Youth (15-26)
Mean: 7.7 (standard deviation: 4.1)
Mode: 4.0
Frequency distribution is positively skewed
–25th percentile: 4.6
–50th percentile: 7.2
–75th percentile: 10.7
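As an aside, descriptive statistics like these (range, mean, standard deviation, mode, and quartiles) can be reproduced from raw client ages with Python's standard library. The ages below are made up for illustration; they are not the CiMH data set.

```python
import statistics

def describe_ages(ages):
    """Summarize an age distribution the way the slides report it:
    range, mean, standard deviation, mode, and quartiles."""
    # quantiles(n=4) returns the 25th, 50th, and 75th percentiles
    q1, median, q3 = statistics.quantiles(ages, n=4)
    return {
        "range": (min(ages), max(ages)),
        "mean": statistics.mean(ages),
        "sd": statistics.stdev(ages),
        "mode": statistics.mode(ages),
        "p25": q1, "p50": median, "p75": q3,
    }

# Illustrative (made-up) ages in years
ages = [2, 3, 4, 4, 4, 5, 6, 7, 9, 12, 14, 17]
summary = describe_ages(ages)
print(summary)
```

A mean well above the mode, with the upper quartile stretched out, is the positive skew described above.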

Overview of California’s Triple P Outcome Evaluation Data – Gender

Overview of California’s Triple P Outcome Evaluation Data – Ethnicity

Overview of California’s Triple P Outcome Evaluation Data – Primary Language

Overview of California’s Triple P Outcome Evaluation Data – Primary Axis I DSM-IV Diagnosis*
*Two of the four counties track mental health diagnoses

CiMH Outcome Indicators
Percent Improvement
–Percent improvement from the average pre-score to the average post-score
–A paired t-test is conducted to examine whether the difference is likely to be due to chance (p<.01); if not, the percent change is asterisked (*) to indicate a statistically significant improvement
Effect Size Estimate: Cohen’s d
–A standardized measure that estimates the magnitude, or strength, of the observed change
–Conventional interpretation: .8 ≈ “large” effect; .5 ≈ “moderate” effect; .2-.3 ≈ “small” effect
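The two indicators above can be sketched in code. This is a generic illustration, not CiMH's exact computation: the Cohen's d variant shown divides the mean change by the pre-score standard deviation (other variants use a pooled SD or the SD of difference scores), and the paired t statistic would be compared against the t distribution to obtain the p-value. All scores below are hypothetical.

```python
import statistics

def percent_improvement(pre, post):
    """Percent change from the average pre-score to the average post-score
    (positive = improvement when lower scores are better)."""
    mean_pre, mean_post = statistics.mean(pre), statistics.mean(post)
    return 100.0 * (mean_pre - mean_post) / mean_pre

def cohens_d(pre, post):
    """Pre-post effect size: mean change divided by the pre-score SD
    (one common variant among several)."""
    return (statistics.mean(pre) - statistics.mean(post)) / statistics.stdev(pre)

def paired_t(pre, post):
    """Paired t statistic: mean of the difference scores over its
    standard error; compare |t| to a critical value for significance."""
    diffs = [b - a for a, b in zip(pre, post)]
    return statistics.mean(diffs) / (statistics.stdev(diffs) / len(diffs) ** 0.5)

# Illustrative (made-up) paired scores on a lower-is-better measure
pre  = [140, 150, 132, 160, 128, 145]
post = [110, 120, 100, 130, 118, 112]
print(round(percent_improvement(pre, post), 1))
print(round(cohens_d(pre, post), 2))
print(round(paired_t(pre, post), 2))
```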

CiMH Outcome Indicators
Reliable Change
–The amount of change observed in an outcome measure that can be considered an actual change, not likely to be due to the passage of time or measurement error (p<.05)
–A statistical formula that takes the measure’s reliability into consideration, as well as the variability observed among scores
–Once the formula is applied, clients can be grouped into one of three categories: reliable positive change, reliable negative change, and no reliable change
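The formula referred to here is commonly operationalized as the Jacobson-Truax reliable change index (RCI); a minimal sketch, assuming the measure's pre-score standard deviation and test-retest reliability are known. The score, SD, and reliability values used are hypothetical.

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson-Truax RCI: the pre-to-post difference divided by the
    standard error of the difference, which depends on the measure's
    reliability and the spread of scores."""
    se_measurement = sd_pre * math.sqrt(1 - reliability)
    s_diff = math.sqrt(2 * se_measurement ** 2)
    return (post - pre) / s_diff

def classify(pre, post, sd_pre, reliability, lower_is_better=True, crit=1.96):
    """Group a client into one of the three reliable-change categories
    (crit=1.96 corresponds to p < .05)."""
    rci = reliable_change_index(pre, post, sd_pre, reliability)
    if lower_is_better:
        rci = -rci  # a drop in score counts as improvement
    if rci > crit:
        return "reliable positive change"
    if rci < -crit:
        return "reliable negative change"
    return "no reliable change"

# Illustrative values on a lower-is-better measure with assumed
# SD = 30 and reliability = .90
print(classify(pre=150, post=110, sd_pre=30, reliability=0.9))
```

With these assumed values, a 40-point drop clears the reliable-change criterion, while a 5-point drop does not.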

Overview of California’s Triple P Outcome Evaluation Data
Target-specific outcome measure focused on child disruptive behaviors:
–Eyberg Child Behavior Inventory (ECBI)
–Parent/caregiver report of the intensity and problem extent of child behavior problems; 36 items
–Intensity Score range 36 – 252 (clinical cutpoint: 131 and higher)
–Problem Score range 0 – 36 (clinical cutpoint: 15 and higher)
–Used by two of the four counties

Overview of California’s Triple P Outcome Evaluation Data – Outcomes: ECBI
Eyberg Child Behavior Inventory (ECBI)
–Intensity Raw Score: 28.9%* improvement from the average pre-score (135.3) to the average post-score (n=726); reliable change from pre- to post-: 63.4% positive change (n=460), 31.5% no change (n=229), 5.1% negative change (n=37)
–Problem Raw Score: 51.0%* improvement from the average pre-score (17.7) to the average post-score (n=744); reliable change from pre- to post-: 65.5% positive change (n=487), 29.5% no change (n=220), 5.0% negative change (n=37)
*A statistically significant improvement, p <.01

Overview of California’s Triple P Outcome Evaluation Data – Outcomes: ECBI Intensity (solid line indicates clinical cutpoint)

Overview of California’s Triple P Outcome Evaluation Data – Outcomes: ECBI Problem (solid line indicates clinical cutpoint)

Overview of California’s Triple P Outcome Evaluation Data
Target-specific outcome measure focused on parenting:
–Parenting Scale
–Parent/caregiver report that assesses parenting and disciplinary styles related to the development and/or maintenance of child disruptive behavior problems; 30 items
–Total Score is a mean item response ranging from 1 – 7 (clinical cutpoint: 2.8 and higher)
–Used by two of the four counties

Overview of California’s Triple P Outcome Evaluation Data – Outcomes: Parenting Scale
–Total Score: 28.0%* improvement from the average pre-score (3.6) to the average post-score (n=154); reliable change from pre- to post-: 48.7% positive change (n=75), 49.4% no change (n=76), 1.9% negative change (n=3)
*A statistically significant improvement, p <.01

Overview of California’s Triple P Outcome Evaluation Data – Outcomes: Parenting Scale (solid line indicates clinical cutpoint)

Overview of California’s Triple P Outcome Evaluation Data
General outcome measure of mental health functioning:
–Youth Outcome Questionnaire (YOQ)
–Parent/caregiver report that assesses multiple dimensions of child/youth mental health functioning; 64 items
–Total Score range -16 – 240 (clinical cutpoint: 47 and higher)
–Used by one of the four counties

Overview of California’s Triple P Outcome Evaluation Data – Outcomes: YOQ Total
Youth Outcome Questionnaire (YOQ)
–Total Score: 36.3%* improvement from the average pre-score (63.9) to the average post-score (n=638); reliable change from pre- to post-: 57.5% positive change (n=367), 34.0% no change (n=217), 8.5% negative change (n=54)
*A statistically significant improvement, p <.01

Overview of California’s Triple P Outcome Evaluation Data – Outcomes: YOQ Total (solid line indicates clinical cutpoint)

Overview of California’s Triple P Outcome Evaluation Data – Outcomes: Reliable Change

Overview of California’s Triple P Outcome Evaluation Data
Follow-up analyses of aggregate data indicate no differences by gender or ethnicity in:
–change in ECBI Intensity Score outcomes;
–change in ECBI Problem Score outcomes;
–change in Parenting Scale outcomes; or
–change in YOQ Total Score outcomes

Options to Consider in Developing Outcome Evaluation Protocols
Data elements to track/collect:
–Parsimony
–Utility
Outcome measures:
–Relevance to treatment targets/goals
–Psychometric characteristics (validity, reliability)
–Cost
–Time (administration, scoring, data entry)
–Training and technical assistance
Thoughtful and thorough planning is the key!

Options to Consider in Developing Outcome Evaluation Protocols
Application/software used for data entry:
–Is a system already in place that can be modified? (e.g., EHR, county- or state-level information system)
–Cost
–Skill level required to use
–Utility of data elements for analysis and reporting
–Training and technical assistance

Options to Consider in Developing Outcome Evaluation Protocols
Frequency of analysis and reporting:
–Multiple stakeholders; different reports for different audiences
Processes for maximizing the utility of data:
–Clinical utility
–Program improvement
–Systems-level decisions
Feedback is Essential

Additional Considerations for Telling the Whole Story
Collect minimal data on all clients referred:
–Determine entry rate
–Determine additional need (waiting lists)
Collect completion status (yes/no):
–Determine dropout rate
–May provide the opportunity to examine dose-response relationships
Track clients who are served by more than one level/type of Triple P (within and across providers)
Track population-level indicators (substantiated child maltreatment cases, out-of-home placements, emergency room visits for unexplained child injuries)
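The entry and dropout rates suggested above are simple ratios over the minimal counts collected on referred clients; a sketch with hypothetical numbers:

```python
def service_flow_rates(referred, entered, completed):
    """Entry and dropout rates, as percentages, from minimal counts
    collected on all referred clients (counts here are hypothetical)."""
    entry_rate = entered / referred          # share of referrals that enter services
    dropout_rate = (entered - completed) / entered  # share of entrants who do not complete
    return round(100 * entry_rate, 1), round(100 * dropout_rate, 1)

# e.g., 400 families referred, 300 entered services, 240 completed
entry_pct, dropout_pct = service_flow_rates(referred=400, entered=300, completed=240)
print(entry_pct, dropout_pct)
```

The gap between referrals and entries also bounds the unmet need (e.g., waiting lists) noted above.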

Evaluation for Arizona’s Triple P Implementation – Discussion Questions
Who should be included in decision-making?
Is there an overarching evaluation framework?
What data are currently being collected?
–What additional data elements are of interest?
What outcome measures will be used?
–How will they be obtained, distributed, and used?
–Who will provide training and technical assistance?
How will data be tracked/collected from individual Triple P providers?

Evaluation for Arizona’s Triple P Implementation – Discussion Questions
How will population-level indicators be tracked? With what frequency?
Who is responsible for data analysis and reporting, and with what frequency?
How will data be used to inform decisions?
–Client-level
–Program-level
–System-level

Discussion Summary

The End
Contact Information
Cricket Mitchell, PhD
Cell phone: