
1 Approaches to Linking Process and Outcome Data in a Cross-Site Evaluation
Laura Elwyn, Ph.D.
Kristin Stainbrook, Ph.D.
American Evaluation Association Annual Meeting, Washington, DC, October 18, 2013
This presentation was prepared with preliminary data collected under SAMHSA Contract HSS283200700038.

2 Overview of Presentation
Briefly describe the JDTR Program
Describe the cross-site evaluation goals, data sources, and challenges
Present an analytic approach to linking qualitative and quantitative data
Provide an example of the approach
Discuss the implications, limitations, and next steps

3 CMHS Jail Diversion and Trauma Recovery (JDTR) Program
SAMHSA service grant program to support the development of local programs and statewide expansion of diversion and services to address the needs of individuals with trauma-related problems
– In recognition of the high rates of trauma among service members returning from Operation Enduring Freedom (OEF), Operation Iraqi Freedom (OIF), and Operation New Dawn (OND), veterans are a priority population
JDTR has a multi-level approach
– Local pilot program
– State-level involvement to create structural changes that facilitate veteran access and services, expand and replicate the pilot, and focus on trauma-informed services
Thirteen states funded in 2008-09 for 5 years each

4 JDTR Cross-Site Evaluation Goals
There are two overarching goals
– Conduct an outcome evaluation to determine the extent to which behavioral health treatment and supports result in improved client outcomes, particularly for veterans
– Conduct a process evaluation to document grantee implementation of pilot and statewide programs, as well as changes in practice and policies to support sustainability and expansion of pilot screening and treatment strategies

5 Outcome Evaluation Data
Local evaluators are responsible for client data collection at each site
Client interviews are standardized across sites (baseline, 6, and 12 months) using validated instruments collecting:
– Demographics
– U.S. military experience
– Trauma history
– Criminal justice history
– PTSD/mental health (e.g., PCL, BASIS-24)
– Substance use/abuse (e.g., CAGE, BASIS-24)
– Functioning/quality of life (e.g., REE)
– Treatment history
– Service receipt
Prior-year and post-year secondary data on arrests and incarcerations
Post-year secondary data on services received

6 Process Evaluation Data
Two in-person site visits (in years 2 and 4)
– Explore program implementation, service delivery, challenges, and accomplishments through in-depth discussions with a range of state and local stakeholders
– Qualitative data are coded by general agreement and integrated into an Implementation Rating Scale and a Core Program Components scale; both scales were developed based on SAMHSA requirements and expectations
Semi-annual progress reports
– Twice-yearly self-assessment reports that document state- and pilot-level program goals; describe the project environment and project spending; describe activities and progress on state-level infrastructure change components; describe progress on the pilot project; and document project accomplishments, including any policy changes

7 Evaluation Challenges
Differences between programs
Differences in client populations between sites (more homogeneity within sites)
Variations in numbers served and data collected

8 Differences in Programs
Programs vary on a number of dimensions:
– Criminal justice intercepts
– Program structures, such as:
   Court involvement/judicial oversight
   Service models/types of services
   Partner relationships
– Level of implementation of components, such as:
   Trauma screening
   Peer involvement
– Stability (leadership and staff turnover)

9 Differences in Site Population Characteristics

10 Variations in Data Submission

11 Linking the Outcome and Process Data to Develop More Meaningful Findings

12 First Step: Coding
Code program-level dimensions such as:
– Judicial oversight
– Service models
– Program component implementation
Calculate population-level characteristics by site, such as:
– Mean age
– Proportion with U.S. military experience
– Diversion characteristics (post-booking, felony)
– Proportion with mental health or substance abuse problems
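
As an illustration of the site-level aggregation this coding step implies, the sketch below collapses hypothetical client-level records into population characteristics per site using pandas. The column names (site, age, military, mh_problem, sa_problem) are assumptions for illustration, not the evaluation's actual variable names.

```python
import pandas as pd

# Hypothetical client-level records; real JDTR variable names will differ.
clients = pd.DataFrame({
    "site": ["A", "A", "B", "B", "B"],
    "age": [34, 41, 29, 52, 38],
    "military": [1, 1, 0, 1, 0],    # 1 = U.S. military experience
    "mh_problem": [1, 0, 1, 1, 1],  # 1 = mental health problem
    "sa_problem": [0, 1, 1, 0, 1],  # 1 = substance abuse problem
})

# Collapse client records into one row of population characteristics per site.
site_chars = clients.groupby("site").agg(
    mean_age=("age", "mean"),
    prop_military=("military", "mean"),
    prop_mh=("mh_problem", "mean"),
    prop_sa=("sa_problem", "mean"),
)
print(site_chars)
```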

13 Second Step: Statistical Approach
Evaluate descriptive and other statistics on a site-by-site basis and compare with the pooled average (e.g., tests of change over time)
Analyze using mixed-effects models:
– Test whether there is significant variance between sites in outcomes
– Test whether program dimensions explain this variance, controlling for individual and population characteristics
Advantage: correctly models clustering effects within sites
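
A minimal sketch of this two-step mixed-effects strategy, assuming a pandas DataFrame df with hypothetical columns change (outcome change score), baseline, services, judicial, and site; statsmodels' MixedLM is used here for illustration and is not necessarily the software the evaluators used.

```python
import statsmodels.formula.api as smf

# Step 1: random-intercept model with no program dimension -- does the
# outcome vary significantly across sites?
null_model = smf.mixedlm("change ~ baseline + services",
                         data=df, groups=df["site"]).fit(reml=False)
print(null_model.summary())   # inspect the site-level variance component

# Step 2: add a program-level dimension (e.g., judicial oversight) and check
# whether it absorbs the between-site variance.
program_model = smf.mixedlm("change ~ baseline + services + judicial",
                            data=df, groups=df["site"]).fit(reml=False)
print(program_model.summary())
```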

14 Example
Research question:
– Is judicial oversight (court involvement in the program) associated with participant improvement in the area of substance abuse?
Preliminary findings are based on 8 sites with sufficient qualitative and quantitative follow-up data (n = 612)

15 Substance Abuse Outcomes

16 Differences in Substance Abuse by Site

17 Multi-level Model
Dependent variable:
– Change on the BASIS-24 substance abuse subscale between baseline and six months
Level 1 covariates:
– Baseline score
– Services received
– Earlier models controlled for demographics, education, U.S. military experience, diversion intercept, etc. (no association and do not affect findings)

18 Multi-level (Mixed-Effects) Model
Level 2: judicial oversight (program-level qualitative index measure from the process study):
– 1 = program affiliated with a court (5 sites)
– 0 = program not affiliated with a court (3 sites)
Level 2 covariates:
– Site population race indicator (% white)
– Earlier models controlled for other site population characteristics such as mean age, % U.S. military, % male, etc. (no association and do not affect findings)
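
Putting the two slides together, the specification can be written as a standard two-level random-intercept model. The notation below is a reconstruction from the covariates named on these slides, not the evaluators' published equations.

```latex
% Level 1 (client i in site j): change in BASIS-24 substance abuse score
\begin{aligned}
\Delta SA_{ij} &= \beta_{0j} + \beta_{1}\,\text{Baseline}_{ij}
                 + \beta_{2}\,\text{Services}_{ij} + e_{ij},
  & e_{ij} &\sim N(0,\sigma^{2}) \\
% Level 2 (site j): random intercept predicted by program dimensions
\beta_{0j} &= \gamma_{00} + \gamma_{01}\,\text{JudicialOversight}_{j}
              + \gamma_{02}\,\text{PctWhite}_{j} + u_{0j},
  & u_{0j} &\sim N(0,\tau^{2})
\end{aligned}
```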

19 Results without Process Indicator
Model without judicial oversight:
When the substance abuse outcome measure is allowed to vary by site (random effects):
– The variance of the grand-mean-centered change score is significant
– A likelihood ratio test comparing the model with OLS (no random effect) gives p < .001
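
A likelihood ratio test of this kind can be sketched, under the same assumptions as the earlier code (hypothetical DataFrame df and column names), by comparing the maximum-likelihood fits of the OLS and random-intercept models:

```python
import statsmodels.formula.api as smf
from scipy import stats

# OLS model: no site-level random effect.
ols_fit = smf.ols("change ~ baseline + services", data=df).fit()

# Random-intercept model, fit by ML (not REML) so the likelihoods are comparable.
mixed_fit = smf.mixedlm("change ~ baseline + services",
                        data=df, groups=df["site"]).fit(reml=False)

# Likelihood ratio statistic; the chi-square(1) p-value is conservative because
# the null hypothesis (zero site variance) lies on the boundary of the parameter space.
lr_stat = 2 * (mixed_fit.llf - ols_fit.llf)
p_value = stats.chi2.sf(lr_stat, 1)
print(f"LR statistic = {lr_stat:.2f}, p = {p_value:.4f}")
```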

20 Results with Process Indicator
Model with the judicial oversight indicator:
Judicial oversight predicts an average improvement of about a third of a point (0.3) on the BASIS-24 substance abuse subscale from baseline to six months, controlling for initial score, substance abuse service receipt, and percent white
Judicial oversight explains the between-site variance: this model no longer has a significant level of site variance and is no different from the OLS model

21 Conclusions
Court involvement/oversight is important for substance use outcomes
Multi-level mixed-effects modeling is a useful strategy for disentangling the influence of program dimensions from individual characteristics, services received, and site-level population characteristics
Data on the context and implementation of programs add value to understanding client outcomes

22 Limitations
No comparison group (descriptive rather than causal associations)
Preliminary findings are limited to a subset of sites (data collection is ongoing)
Statistical power is a consideration (e.g., three-level models: time within person within site)

23 Next Steps
Continue data collection in the final year
Continue analysis with additional qualitative and quantitative data
Further dissemination will help to inform the expansion of jail diversion programs and veteran and behavioral health courts, and to improve participant outcomes

