Performance measurement: Finding our way from outputs to outcomes.

Finding our way from outputs to outcomes How do we know we are having an impact on children, youth and their families? How can we come together to ensure that the services we offer are appropriate and effective?

Panel members Dr. Melanie Barwick: The role of CAFAS in measuring organizational and system outcomes for children and youth mental health Roger Rolfe: The CTYS experience with CAFAS Samantha Yamada: Building research and evaluation capacity at PRI

Overview: Performance measurement WHAT is it? WHY is it necessary? WHO benefits? HOW is it done?

What is performance measurement? The regular collection of information for monitoring how a policy, program or initiative is doing at any point in time. It can be used to report on the level of attainment of planned results and on performance trends over time. - Treasury Board Secretariat of Canada

Why is it necessary? Serves as a descriptive tool for how a project, policy or program is doing Serves as an early warning when a program, policy or project is not going as planned

Who is the audience? End users: clients, families, caregivers Service providers, educators, program staff Organization or network Health system Public at large

How is it different from evaluation? PM: provides regular snapshots of how a program or policy is doing; focuses on what the outcome is Client satisfaction surveys are often a key indicator in PM Evaluation: can provide insight into how and why an outcome is occurring Client satisfaction is part of the process evaluation and can influence outcomes

Performance measurement and evaluation Logic model: SITUATION → INPUTS (resources of a program) → ACTIVITIES → OUTPUTS (quantity of work, products or participants) → OUTCOMES (change in target audience)

How is it done? Program logic models Balanced scorecards Strategy maps

Balanced scorecard approach to health promotion Source: ICES, 2004

Addiction System Strategy Map – Ontario Federation of Community Mental Health & Addictions Programs: Improve health outcomes at the individual and population level; Reduce risk through influencing the broader determinants of health; Improve healthy behaviours, health promotion and disease prevention; Improve linkages and transitions between addiction, mental health, health, education, social and justice systems; Increase linkages, transition & integration within addiction services; Improve client focus of addiction services; Improve access to appropriate addiction treatment; Ensure the continuum of interventions includes prevention, health promotion, early intervention, harm reduction and treatment services; Increase productive use & appropriate allocation of resources across the system; Increase sustainability and equity of the addiction & health systems; Increase availability and retention of qualified human resources; Ensure evidence-informed practices are developed, implemented and maintained across the province; Further develop & increase equitable resources and capacity; Ensure quality assurance within the addiction system

Child welfare performance measurement Source: OACAS QA Framework, 2004

Types of performance measures Outcome measures Intermediate outcome measures Process measures Output measures Input measures

Strategic priorities by the Select Committee

Challenges and Opportunities Selecting measures Valid and reliable Relevant, feasible, sensitive to changes Developmentally and culturally appropriate Information management capacity within agencies and in the government Collaboration, buy-in and cultures that foster learning

Questions? Dr. Evangeline Danseco Head of Evaluation and Research Ext

The role of CAFAS in measuring organizational and system outcomes for children and youth mental health Melanie Barwick, PhD, C.Psych. Associate Scientist, Hospital for Sick Children Lead Implementer, CAFAS in Ontario (c) Barwick

Overview (c) Barwick 1 Evidence Base for Outcome Measurement 2 CAFAS Measure 3 CAFAS Implementation 4 Ontario Outcomes

When clinicians are given feedback about how clients are responding to treatment (as expected, normally functioning, failure to respond), they have an opportunity to improve outcomes and reduce deterioration. Lambert, Whipple, Smart, et al. (2001) found that they could identify potential treatment failure based on initial level of disturbance and early negative response to treatment. Providing feedback to therapists enhanced outcomes and reduced deterioration. Those identified as potential treatment failures stayed in therapy longer and had better outcomes when feedback was provided to their clinician. Patient outcomes can be improved if therapists are alerted to treatment response. This is called outcome management. (c) Barwick 1 Evidence Base for Outcome Measurement

Utility of Outcome Measurement (c) Barwick Using outcomes to MODIFY treatment if necessary Assessing the outcome of treatment or service Assessing outcomes DURING treatment to track client progress

Benefits of Outcome Measurement Promotes a comprehensive approach to treatment Provides a balanced view of strengths, weaknesses, goals Complements symptom information Identifies risk behaviours systematically Provides information that is useful for formulation Assists in the development of a treatment plan Informs treatment direction and discharge planning Provides service providers with a common language (CYMH, child welfare, juvenile justice, education) Provides a tool for advocacy and benchmarking (c) Barwick

Elements of a Successful System of Care in CYMH? (c) Barwick business practices, human resources, practice change, implementation, CAFAS, BCFPI, EBP, best practices

(c) Barwick Provincially mandated use of the Child and Adolescent Functional Assessment Scale (Hodges, 2002) to measure level-of-functioning outcomes among 6-17 year old children and youth receiving mental health services in Ontario Begun in 2000 with training of over 3,000 practitioners over 3 years; now reaching 6,000 practitioners! 120 CYMH organizations selected by MCYS to participate in the initiative CMHCs also participate in use of a systematic intake screening interview called the Brief Child and Family Phone Interview; oversight and training for the BCFPI is provided by Children's Mental Health Ontario 2 Overview of the CAFAS Measure

AGE GUIDELINES CAFAS – Child and Adolescent Functional Assessment Scale (children ages 6-17) PECFAS – Preschool and Early Childhood Functional Assessment Scale (children ages 4-7)

CAFAS Subscales (c) Barwick

Levels of Functional Impairment Severe (30) Moderate (20) Mild (10) None (0) (c) Barwick

CAFAS Caregiver Subscales (c) Barwick

Caregiver Resources, Material Needs… Caregiver difficulties in providing for the child's material needs - housing, food, clothes - with a negative impact on level of functioning Child's needs for food, clothing, housing, medical attention are not being met, causing severe risk Insufficient material resources lead to frequent negative impacts on the child An occasional negative impact due to this deprivation

Caregiver Resources, Social Support Caregiver difficulties in providing a home setting that is free of known risk factors (abuse, parental alcoholism) or in providing for the child's emotional & social needs Caregiver is hostile, rejecting, or does not want the child to return to the home Family members are insensitive, angry and/or resentful toward the youth Family not able to provide warmth, security, & sensitivity

Score Interpretation
Total Score of: / Corresponds to clients who are:
0-30 / Likely referred to a qualified health professional
40-70 / Likely requires outpatient services
Likely requires outpatient care with additional services of a supportive or intensive nature
Likely requires intensive, community-based services, although some youths may need acute residential services at some point
> 140 / Very intensive services would be required; possibly in residential or inpatient settings at some point
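
As a reading aid only, here is a minimal Python sketch of triaging by CAFAS total score using the cut-offs that survive in this transcript (0-30, 40-70, >140); the function name is invented, and because the two intermediate score ranges are not captured above, they are grouped under a single combined label.

```python
# Minimal sketch (not from the original slides): map a CAFAS total score to
# the service-level wording shown on the slide. Only the cut-offs visible in
# the transcript are used; scores between 71 and 140 get one combined label.
def interpret_cafas_total(total_score: int) -> str:
    if total_score <= 30:
        return "Likely referred to a qualified health professional"
    if total_score <= 70:
        return "Likely requires outpatient services"
    if total_score > 140:
        return "Very intensive services; possibly residential or inpatient at some point"
    return "Outpatient plus additional supports, or intensive community-based services"


print(interpret_cafas_total(160))
```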

Milestones in Implementation and Uptake 1999 Ministry Mandate 2000 Rater Reliability Begins for 120 Organizations 2002 Software Training and Train-the-Trainer Begins 2004 First Annual Report 2007 Advisory Expands 2008 CoPs Reach ~ Focus on Wiki Supports for Practitioners 2010 Possible Implementation of CAFAS v6.0 Endless Possibilities… (c) Barwick

(c) Barwick 3 CAFAS Implementation

Implementation Supports (c) Barwick

Annual Reports (c) Barwick

4 Ontario Outcomes (c) Barwick Table 2.1 Analyzable cases in period - columns: annual report year, analyzable cases, ratio (2008 : other years), ratio (current : previous year); the 2008 fiscal year had 26,974 analyzable cases.

(c) Barwick Flow of cases from export to analysis: Total cases submitted by the export deadline: N = 52,423. Restrict to the exporting time-frame: N = 151 cases outside the admission-date interval required by the last export (01/04/2008 - 31/03/2009) or with erroneous admission dates were excluded. Restrict to the reporting time-frame: N = 24,144 cases closed prior to 01/04/2008 or with a T1 evaluation after 31/03/2009 were excluded, leaving N = 28,128. Restrict to the 6-18 yrs old age range: N = 516 cases outside the age range were excluded, leaving N = 27,612. Restrict to cases with a pre-treatment evaluation (T1): N = 638 cases without a T1 were excluded (N = 419 had neither a T1 nor a T14 (Exit) CAFAS evaluation; N = 219 had no T1 but did have a T14 - of these, 170 had just a T14 and 49 had a T14 plus at least one other CAFAS evaluation). Total analyzable cases: N = 26,974.
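
For illustration only, here is a rough pandas sketch of the same exclusion sequence; the file name and the column names (admission_date, close_date, t1_date, age) are assumptions for the example, not the actual CAFAS export field names.

```python
# Sketch of the case-exclusion steps described above (illustrative column
# names; not the real CAFAS export schema).
import pandas as pd

START, END = pd.Timestamp("2008-04-01"), pd.Timestamp("2009-03-31")

cases = pd.read_csv("cafas_export.csv",
                    parse_dates=["admission_date", "close_date", "t1_date"])

# 1. Exporting time-frame: keep admissions inside the export window.
cases = cases[cases["admission_date"].between(START, END)]

# 2. Reporting time-frame: drop cases closed before the window opened or
#    whose first (T1) evaluation falls after the window closed.
cases = cases[~(cases["close_date"] < START) & ~(cases["t1_date"] > END)]

# 3. Age range: keep 6-18 year olds.
cases = cases[cases["age"].between(6, 18)]

# 4. Require a pre-treatment (T1) evaluation.
cases = cases[cases["t1_date"].notna()]

print(f"Analyzable cases: {len(cases)}")
```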

(c) Barwick Number and Regional Distribution of Mandated Agencies Submitting Data: for each region (Central East, Central West, Eastern, Hamilton-Niagara, North East, Northern, South East, South West), the table lists participant agencies and agencies submitting data in each reporting year, plus the number of analyzable cases submitted; total analyzable cases N = 26,974.

Gender Distribution of Children and Youth Receiving CMH Services and CAFAS Rating (c) Barwick

Age Distribution (c) Barwick

Children with Complex Needs (c) Barwick

Complex Needs (2) (c) Barwick

Severity at Entry to Treatment for Ontario and Regions (2005: N=9,065; 2006: N=18,255; 2007: N=23,566; 2008: N=26,974) (c) Barwick Chart shows the distribution of cases at entry across the 0-30 (some need for service), outpatient needs, and outpatient plus extra supports score bands, by region (CE, CW, E, HN, NE, N, SE, SW, TO).

Severity at Entry to Treatment (2) (c) Barwick Chart shows cases at entry in the intensive needs and 140+ (very intensive supports) score bands, by region (CE, CW, E, HN, NE, N, SE, SW, TO).

Severe Impairment on CAFAS Subscales at Entry to Treatment – years 2005 to 2008 (c) Barwick

Average CAFAS Subscale Score at Entry to Treatment (T1) by Sex (N for Boys varies between 15,371 and 15,394 and N for Girls varies between 11,437 and 11,455 due to missing subscale scores) (c) Barwick

Exit from Services: flow of cases used to determine a Last CAFAS. Starting from the N = 26,974 total analysable cases, cases with only a T1 evaluation (N = 12,217) were set aside, leaving N = 14,757 with at least one evaluation after the Entry CAFAS; of these, N = 11,038 had a T14 (Exit) evaluation (N = 9,650 closed with a T14 and N = 1,388 with a T14 but still open) and N = 3,719 did not have a T14 and were retained for further investigation. For cases with a T14, the Last CAFAS is the T14; for cases without a T14, after excluding open cases and cases where all evaluations are dated before T1 or after the close date (N = 3,531), the Last CAFAS is the most recent evaluation dated before or at the closing date (N = 188). Total cases with a Last CAFAS (candidates for Outcome Reports): N = 11,226.
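
As an illustration of the Last CAFAS rule sketched in this flowchart (take the exit T14 rating where one exists, otherwise the most recent rating dated on or before the case's close date), here is a small Python sketch; the column names (case_id, eval_point, eval_date, close_date) are invented for the example.

```python
# Illustrative sketch of the "Last CAFAS" selection rule; made-up columns.
import pandas as pd

def last_cafas(evals: pd.DataFrame) -> pd.DataFrame:
    """One row per case: the T14 (Exit) rating if present, otherwise the
    most recent rating dated on or before the case's close date."""
    picked = []
    for _, grp in evals.groupby("case_id"):
        t14 = grp[grp["eval_point"] == "T14"]
        if not t14.empty:
            picked.append(t14.sort_values("eval_date").iloc[-1])
            continue
        eligible = grp[grp["eval_date"] <= grp["close_date"]]
        if not eligible.empty:
            picked.append(eligible.sort_values("eval_date").iloc[-1])
    return pd.DataFrame(picked)
```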

Time between Entry to Treatment and Last CAFAS (c) Barwick

Percentage of CAFAS Evaluations (c) Barwick "Look at all the missed opportunities to help me have better outcomes!"

Change in Average CAFAS Total Score from Treatment Entry to Last CAFAS (N=9,663) (c) Barwick

Change in Average Score on CAFAS Subscales from Treatment Entry to Last CAFAS (N varies between 11,098 and 11,113 because of missing subscale scores) (c) Barwick

Absolute Change in Level of Functioning (N=6,721 for 2006; N=9,663 for 2007 and N=10,955 for 2008) (c) Barwick Not Improved - 2006: 26.1%; 2007: 25.6%; 2008: 25.8% Improved - 2006: 73.9%; 2007: 74.4%; 2008: 74.2% No Change

Severity of Child Functioning for Various Jurisdictions (c) Barwick
Author / Source; Sample Description; CAFAS total score Mean (SD) at Entry and Exit; Difference; Effect Size
2008: N=10,955 children and youth served in community and hospital based mental health organizations
2007: N=9,462 children and youth served in community and hospital based mental health organizations
2006: N=6,721 children and youth served in community and hospital based mental health organizations
2005: N=2,164 children and youth served in community and hospital based mental health organizations
2004: N=964 children and youth served in community and hospital based mental health organizations
Hodges, 2003: N=11,815 youth referred to public mental health in the fiscal year; of these, N=2,501 had an intake and discharge CAFAS; means/SDs not reported in manual; effect size 0.66
MATCH: N=678 children served by Georgia's Multi-Agency Team for Children who have severe emotional disturbances requiring mental health treatment in a residential setting, 64% male and 36% female, 54% Caucasian; results are for those with an intake and discharge rating; no standard deviations reported, hence no effect size calculation
Hodges, Xue & Wotring 2004: N=5,638 youths with serious emotional disturbance (score above 50) ages 7-17 years served by community mental health service providers
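
The transcript does not show which effect-size formula the table uses; a common choice for entry-versus-exit comparisons is the standardized mean difference (Cohen's d), sketched below with hypothetical numbers rather than figures from the report.

```python
# Standardized mean difference (Cohen's d) for an entry-vs-exit comparison.
# Whether the report used exactly this formula is an assumption.
import math

def cohens_d(mean_entry: float, mean_exit: float,
             sd_entry: float, sd_exit: float) -> float:
    pooled_sd = math.sqrt((sd_entry ** 2 + sd_exit ** 2) / 2)
    return (mean_entry - mean_exit) / pooled_sd

# Hypothetical illustration only:
print(round(cohens_d(60.0, 40.0, 32.0, 30.0), 2))  # ~0.64
```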

(c) Barwick Summary

Benefits for the Kids (c) Barwick Practitioner use of the CAFAS provides a common language and common metric for the CMH system in Ontario (system integration) Systematic assessment of functioning in multiple areas of the client's life is imperative for comprehensive assessment and formulation The systematic measurement of a client's response to treatment over the course of formulation and treatment has been shown to improve outcomes Patient outcomes can be improved if therapists are alerted to treatment response

Benefits for Practitioners (c) Barwick Triaging for level of risk Periodic assessment of treatment response leads to improved outcomes Outcome data for service planning Increased receptivity and awareness of EBPs and outcome management; Capacity building for EBP implementation and evaluation; Common language & metric across children's sector services; Advancing knowledge about how to roll out EBPs and support practice change

Benefits for the Provider Organization Knowing who they serve Matching their client populations with appropriate services Hiring staff that meets client needs Becoming learning organizations Improving capacity to implement other changes Demonstrating their impact Building accountability and pride in service delivery (c) Barwick

Benefits for the CYMH system (c) Barwick Determine – for the first time ever – if Ontario children improve as a function of the services they receive System-wide use of CAFAS builds accountability for the quality of the services we provide Access to services is only meaningful if services are effective Provides an evidence-base from which to develop system and organization-level service delivery improvements

Thank you (c) Barwick

Utilizing CAFAS Exports & Reports at CTYS

The CAFAS in Ontario Quarterly Report An excellent resource for performance measurement At 40+ pages, it's long and not readily accessible for management & staff consumption Capturing trends over time requires another tool

Task: Abridge the Quarterly Report Pull key variables & performance indicators from the Report each quarter Enable comparison over time Present data in a simple spreadsheet Append charts to aid interpretation

Solution: Agency CAFAS Performance since 2006.xls

Features of the Spreadsheet Key variables are the rows, shown on left-hand side Each column represents data from one Quarterly Report Data is copied from the Report to the spreadsheet Trend symbols added as final column Charts appended as separate sheets
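
A minimal sketch of that layout in Python/pandas follows, with invented indicator names and values (the real spreadsheet is maintained directly in Excel); it shows key variables as rows, one column per Quarterly Report, and a trend symbol derived from the two most recent quarters.

```python
# Sketch of the tracking-sheet layout: indicators as rows, quarters as
# columns, trend symbol as the final column. All names/values are invented.
import pandas as pd

data = {
    "2009Q1": {"Analyzable cases": 410, "% improved": 72.0, "% with exit CAFAS": 55.0},
    "2009Q2": {"Analyzable cases": 438, "% improved": 74.5, "% with exit CAFAS": 58.0},
}
tracker = pd.DataFrame(data)  # rows = key variables, columns = quarterly reports

latest, previous = tracker.columns[-1], tracker.columns[-2]
tracker["Trend"] = ["↑" if a > b else "↓" if a < b else "→"
                    for a, b in zip(tracker[latest], tracker[previous])]

tracker.to_excel("Agency_CAFAS_Performance.xlsx")  # charts added as separate sheets in Excel
```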

Sample Chart

How the spreadsheet is used Presented to senior management each quarter Highlights and key trends noted Required actions taken Spreadsheet is accessible enough to be useful as a tool for teaching management & staff how to interpret quantitative performance data

Reprocessing CAFAS Exports for Program-level Evaluation Agency-wide views don't reveal what's happening at the program level Program-level results are required for effective evaluation and QI Program-level results are needed to explain agency-wide trends

Solution: CAFAS Program Results.xls Split the CAFAS Export data set into program sub-samples Reprocess each sub-sample to create program-level reports Assemble results in one spreadsheet to allow comparison

CAFAS Program Results 2009Q3 for 36 mo. period ending Sep 30/09

How we do it.... Program-level analysis is run every second quarter (six-month intervals) Each spreadsheet (2008Q1, 2008Q3...) is placed in the same Excel workbook Program spreadsheets are created capturing performance history for each program Each program's spreadsheet is copied into its own workbook and charts are appended there.

Program-level Performance History: GROUPWORK Results since 2008

Program-level Chart: GW Male/Female Improvement Scores

Techie stuff.... Client ID#2 field in CAFAS is used to enter program codes (CO, GW,....) Re-run the CAFAS Export, modifying the standard filter to include the Client ID#2 field Use SPSS to split the export sample into program sub-samples Run reports in SPSS for each program sub-sample A big thank-you to Cristina Vlad and the staff at CAFAS in Ontario for sharing their syntax files and training us on their use!
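
The split itself is done at CTYS with SPSS syntax files; purely as an illustration of the same idea, here is a pandas sketch that assumes a hypothetical export file with a client_id2 column holding the program codes (CO, GW, ...) and writes one sub-sample file per program for separate reporting.

```python
# Illustrative alternative to the SPSS split: group an export file by the
# program code stored in Client ID#2 (file and column names are assumed).
import pandas as pd

export = pd.read_csv("cafas_export_with_clientid2.csv")

for program_code, subsample in export.groupby("client_id2"):
    subsample.to_csv(f"cafas_{program_code}.csv", index=False)
    print(program_code, len(subsample), "cases")
```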

Value-added: some examples 1. Monitoring staff data input (compliance) 2. Face validity check on risk profile of program 3. Gender differences in improvement scores in two programs (CO, ERSP) - what do these mean?

Questions? Roger Rolfe, MEd, RMFT Research & QA Dept ext/238

Measuring Hope Building a Culture of Research and Evaluation at Pine River Institute Samantha Yamada, MEd Co-Founder

Pine River Institute year olds Treatment for substance abuse Family-based Outdoor Leadership Experience Residential Treatment Transition Aftercare

Objectives 1. Describe the Process 2. Findings 3. Lessons Learned

Measuring Hope The importance of Mission Defining Success Asking The Right Questions Being Curious

Year 1
Goals: Establish research as a priority for Pine River; data collection. Challenges: low enrollment, unstable work environment, lack of resources, competing priorities.
Activities: Development of data collection tools; sporadic data collection; focus groups post-program; first outcomes report (internal document shared at staff retreat); networking with key partners.
Key Lessons: Feedback to staff is critical for buy-in; qualitative data collection is useful early in program development.

Year 2
Goals: Consistent data collection; build resources; research network. Challenges: admin changeovers, competing priorities, lack of resources.
Activities: Research Coordinator established; two outcome reports; quarterly feedback of results to staff; partnership with York University PhD class; grant application and award.
Key Lessons: Collaborate with a university; have a staff member responsible for research; support from leadership is critical.

Year 3
Goals: Improve data collection; knowledge translation; research network. Challenges: competing priorities.
Activities: Joined collaborative NATSAP research study; hired Research Assistant; two outcome reports; monthly meetings with staff for professional development; online data collection; expanding collaborations to additional research projects.
Key Lessons: Partnerships are KEY; regular contact with staff is essential; even imperfect research can help with resource gathering and policy impacts; personal connections (e.g. phone calls or in-person meetings) are best for data collection.

Decreased Problem Substance Use

Increased Academic Functioning

Improved Quality of Life and Future Orientation

Qualitative Data
"I can say now that I no longer want to be part of my old lifestyle but in my last treatment I still did. I can see good things about me and I have goals and believe I can have a future." - Student 2007
"My wilderness experience was the most incredible six weeks of the past decade." - Student 2008
"I have learned much more about the meaning of my family and friends as well [as] more about myself and others around me." - Student 2008
"My child arrived home some weeks ago from treatment at the Pine River Institute… The first few days went well and I pinched myself... Two weeks passed. School was good… at six weeks we're in a rhythm. 'Life is good' - my child's phrase, not mine. I see a future, I believe it may be bright." - PRI parent

Lessons Learned Qualitative data collection is great early on Buy-in from leadership is critical Have regular meetings to share findings with staff Partnerships with universities or other agencies are great ways to increase research capacity Explicitly allocate resources toward research and evaluation Reporting findings aids in obtaining resources for the organization and impacting policy