Performance measurement: Finding our way from outputs to outcomes.


1 Performance measurement: Finding our way from outputs to outcomes

2 Finding our way from outputs to outcomes How do we know we are having an impact on children, youth and their families? How can we come together to ensure that the services we offer are appropriate and effective?

3 Panel members. Dr. Melanie Barwick: The role of CAFAS in measuring organizational and system outcomes for children and youth mental health. Roger Rolfe: The CTYS experience with CAFAS. Samantha Yamada: Building research and evaluation capacity at PRI.

4 Overview: Performance measurement. WHAT is it? WHY is it necessary? WHO benefits? HOW is it done?

5

6 What is performance measurement? The regular collection of information for monitoring how a policy, program or initiative is doing at any point in time. It can be used to report on the level of attainment of planned results and on performance trends over time. - Treasury Board Secretariat of Canada

7 Why is it necessary? Serves as a descriptive tool showing how a project, policy or program is doing. Serves as an early warning when a program, policy or project is not going as planned.

8 Who is the audience? End users: clients, families, caregivers. Service providers, educators, program staff. The organization or network. The health system. The public at large.

9 How is it different from evaluation? PM provides regular snapshots of how a program or policy is doing and focuses on what the outcome is; client satisfaction surveys are often a key PM indicator. Evaluation can provide insight into how and why an outcome is occurring; client satisfaction is part of process evaluation and can influence outcomes.

10 Performance measurement and evaluation. The logic model runs SITUATION → INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES, where inputs are the resources of a program, outputs are the quantity of work, products or participants, and outcomes are the change in the target audience.

11 How is it done? Program logic models; balanced scorecards; strategy maps.

12 Balanced scorecard approach to health promotion. Source: ICES, 2004

13 Addiction System Strategy Map (Ontario Federation of Community Mental Health & Addictions Programs):
- Improve health outcomes at the individual and population level
- Reduce risk through influencing the broader determinants of health
- Improve healthy behaviours, health promotion and disease prevention
- Ensure the continuum of interventions includes prevention, health promotion, early intervention, harm reduction and treatment services
- Improve client focus of addiction services
- Improve access to appropriate addiction treatment
- Improve linkages and transitions between addiction, mental health, health, education, social and justice systems
- Increase linkages, transition and integration within addiction services
- Increase sustainability and equity of the addiction and health systems
- Increase productive use and appropriate allocation of resources across the system
- Further develop and increase equitable resources and capacity
- Increase availability and retention of qualified human resources
- Ensure evidence-informed practices are developed, implemented and maintained across the province
- Ensure quality assurance within the addiction system

14 Child welfare performance measurement Source: OACAS QA Framework, 2004

15 Types of performance measures Outcome measures Intermediate outcome measures Process measures Output measures Input measures

16

17 Strategic priorities identified by the Select Committee

18 Challenges and opportunities. Selecting measures that are valid and reliable; relevant, feasible and sensitive to change; developmentally and culturally appropriate. Information management capacity within agencies and in government. Collaboration, buy-in, and cultures that foster learning.

19

20 Questions? Dr. Evangeline Danseco Head of Evaluation and Research Ext

21 The role of CAFAS in measuring organizational and system outcomes for children and youth mental health Melanie Barwick, PhD, C.Psych. Associate Scientist, Hospital for Sick Children Lead Implementer, CAFAS in Ontario (c) Barwick

22 Overview: 1. Evidence base for outcome measurement; 2. The CAFAS measure; 3. CAFAS implementation; 4. Ontario outcomes.

23 Evidence Base for Outcome Measurement. When clinicians are given feedback about how clients are responding to treatment (as expected, normally functioning, failure to respond), they have an opportunity to improve outcomes and reduce deterioration in the patient. Lambert, Whipple, Smart, et al. (2001) found that they could identify potential treatment failure based on initial level of disturbance and early negative response to treatment. Providing feedback to therapists enhanced outcomes and reduced deterioration: those identified as potential treatment failures stayed in therapy longer and had better outcomes when feedback was provided to their clinician. Patient outcomes can be improved if therapists are alerted to treatment response. This is called outcome management.

24 Utility of Outcome Measurement: using outcomes to MODIFY treatment if necessary; assessing the outcome of treatment or service; assessing outcomes DURING treatment to track client progress.

25 Benefits of Outcome Measurement: promotes a comprehensive approach to treatment; provides a balanced view of strengths, weaknesses and goals; complements symptom information; identifies risk behaviours systematically; provides information useful for formulation; assists in the development of a treatment plan; informs treatment direction and discharge planning; provides service providers with a common language (CYMH, child welfare, juvenile justice, education); provides a tool for advocacy and benchmarking.

26 Elements of a successful system of care in CYMH: business practices; human resources; practice change; implementation; CAFAS; BCFPI; EBPs/best practices.

27 Overview of the CAFAS Measure. Provincially mandated use of the Child and Adolescent Functional Assessment Scale (Hodges, 2002) to measure level-of-functioning outcomes among 6-17 year old children and youth receiving mental health services in Ontario. Begun in 2000 with training of over 3,000 practitioners over 3 years; now reaching 6,000 practitioners. 120 CYMH organizations were selected by MCYS to participate in the initiative. CMHCs also participate in use of a systematic intake screening interview called the Brief Child and Family Phone Interview (BCFPI); oversight and training for the BCFPI are provided by Children's Mental Health Ontario.

28 Age guidelines. CAFAS (Child and Adolescent Functional Assessment Scale): children ages 6-17. PECFAS (Pre-school and Early Childhood Functional Assessment Scale): children ages 4-7.

29 CAFAS Subscales

30 Levels of Functional Impairment: Severe (30), Moderate (20), Mild (10), None (0).
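The four bands above lend themselves to a direct lookup. A minimal sketch, assuming nothing about the official CAFAS software (the function name and error handling are illustrative):

```python
# Subscale scoring bands from the slide: each CAFAS subscale is rated
# 0, 10, 20 or 30. The mapping itself is the only fact taken from the deck.

IMPAIRMENT_LEVELS = {0: "None", 10: "Mild", 20: "Moderate", 30: "Severe"}

def impairment_level(subscale_score):
    """Translate one CAFAS subscale score into its impairment label."""
    if subscale_score not in IMPAIRMENT_LEVELS:
        raise ValueError("CAFAS subscale scores are 0, 10, 20 or 30")
    return IMPAIRMENT_LEVELS[subscale_score]

print(impairment_level(20))  # Moderate
```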

31

32 CAFAS Caregiver Subscales

33 Caregiver Resources, Material Needs: caregiver difficulties in providing for the child's material needs (housing, food, clothes) with a negative impact on level of functioning. The child's needs for food, clothing, housing and medical attention are not being met, causing severe risk; insufficient material provision leads to frequent negative impact on the child; or there is an occasional negative impact due to this deprivation.

34 Caregiver Resources, Social Support: caregiver difficulties in providing a home setting that is free of known risk factors (abuse, parental alcoholism) or in providing for the child's emotional and social needs. The caregiver is hostile, rejecting, or does not want the child to return to the home; family members are insensitive, angry and/or resentful toward the youth; the family is not able to provide warmth, security and sensitivity.

35 Score Interpretation. Total score, and the clients it corresponds to:
- 0-30: likely referred to a qualified health professional
- 40-70: likely requires outpatient services
- likely requires outpatient care with additional services of a supportive or intensive nature
- likely requires intensive, community-based services, although some youths may need acute residential services at some point
- > 140: very intensive services would be required, perhaps in residential or inpatient settings at some point

36 Milestones in Implementation and Uptake: 1999, Ministry mandate; 2000, rater reliability begins for 120 organizations; 2002, software training and train-the-trainer begins; 2004, first annual report; 2007, advisory expands; 2008, CoPs reach ~, focus on wiki supports for practitioners; 2010, possible implementation of CAFAS v6.0. Endless possibilities…

37 CAFAS Implementation

38 Implementation Supports

39 Annual Reports

40 Ontario Outcomes. Table 2.1: Analyzable cases in period, by annual report year, with ratios comparing 2008 to other years and each year to the previous year; for the 2008 fiscal year, N = 26,974 analyzable cases.

41 From submitted to analyzable cases:
- Total cases submitted by export deadline: N = 52,423.
- Restrict to the exporting time-frame: exclude N = 151 cases outside the admission date interval required by the last export (01/04/2008 to 31/03/2009) or with erroneous admission dates, retaining N = 52,272.
- Restrict to the reporting time-frame: exclude N = 24,144 cases closed prior to 01/04/2008 or with a T1 evaluation after 31/03/2009, retaining N = 28,128.
- Restrict to the 6-18 year age range: exclude N = 516 cases outside the range, retaining N = 27,612.
- Require a pre-treatment (T1) evaluation: exclude N = 638 cases without a T1, leaving N = 26,974 total analyzable cases.
- Of the 638 excluded: N = 419 had neither a T1 nor a T14 (Exit) CAFAS evaluation; N = 219 had a T14 but no T1 (N = 170 with just a T14, N = 49 with a T14 and at least one other CAFAS evaluation).
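The filtering steps above can be sketched in plain Python. The record field names (admission, closed, t1, age) and the list-of-dicts layout are assumptions for illustration, not the actual CAFAS export schema:

```python
from datetime import date

# The window matches the 2008 fiscal year used in the flow above.
START, END = date(2008, 4, 1), date(2009, 3, 31)

def analyzable_cases(cases):
    # 1. Exclude cases admitted outside the export window or with bad dates
    cases = [c for c in cases if c["admission"] and START <= c["admission"] <= END]
    # 2. Exclude cases closed before the window or first assessed after it
    cases = [c for c in cases
             if not (c["closed"] and c["closed"] < START)
             and not (c["t1"] and c["t1"] > END)]
    # 3. Keep the mandated 6-18 year age range
    cases = [c for c in cases if 6 <= c["age"] <= 18]
    # 4. Require a pre-treatment (T1) evaluation
    return [c for c in cases if c["t1"] is not None]

sample = [
    {"admission": date(2008, 6, 1), "closed": None, "t1": date(2008, 6, 15), "age": 12},
    {"admission": date(2008, 6, 1), "closed": None, "t1": None, "age": 12},            # no T1
    {"admission": date(2007, 1, 1), "closed": None, "t1": date(2008, 6, 15), "age": 12},  # admitted too early
]
print(len(analyzable_cases(sample)))  # 1
```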

42 Number and Regional Distribution of Mandated Agencies Submitting Data: participant agencies and agencies submitting data, by region and reporting year, for Central East, Central West, Eastern, Hamilton-Niagara, North East, Northern, South East and South West; total analyzable cases submitted: N = 26,974.

43 Gender Distribution of Children and Youth Receiving CMH Services and CAFAS Rating

44 Age Distribution

45 Children with Complex Needs

46 Complex Needs (2)

47 Severity at Entry to Treatment for Ontario and Regions (2005: N=9,065; 2006: N=18,255; 2007: N=23,566; 2008: N=26,974). Chart bands: 0-30, some need for service; outpatient needs; outpatient plus extra supports. Regions: CE, CW, E, HN, NE, N, SE, SW, TO.

48 Severity at Entry to Treatment (2). Chart bands: intensive needs; 140+, very intensive supports. Regions: CE, CW, E, HN, NE, N, SE, SW, TO.

49 Severe Impairment on CAFAS Subscales at Entry to Treatment – years 2005 to 2008

50 Average CAFAS Subscale Score at Entry to Treatment (T1) by Sex (N for Boys varies between 15,371 and 15,394 and N for Girls varies between 11,437 and 11,455 due to missing subscale scores)

51 Exit from Services: deriving the Last CAFAS.
- Total analyzable cases: N = 26,974.
- Exclude cases with just a T1 evaluation: N = 12,217; retain cases with evaluations subsequent to the Entry CAFAS: N = 14,757.
- Cases with a T14 (Exit) evaluation: N = 11,038, retained for further investigation; for these, Last CAFAS = T14 (N = 1,388 with a T14 but still open; N = 9,650 closed with a T14).
- Cases without a T14: N = 3,719, retained for further investigation. Exclude open cases and cases where all evaluations are dated before T1 or after the close date: N = 188; for the remaining N = 3,531, Last CAFAS = the most recent evaluation dated at or before the closing date.
- Total cases with a Last CAFAS (candidates for Outcome Reports): N = 11,226.
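The Last CAFAS rule above (use the Exit rating when present, otherwise the most recent evaluation dated at or before closing) can be sketched as a small helper. The field names and dict layout are assumptions for illustration:

```python
from datetime import date

def last_cafas(case):
    """Return the evaluation date to use as the Last CAFAS, or None."""
    # An Exit (T14) evaluation, when present, is the Last CAFAS.
    if case.get("t14") is not None:
        return case["t14"]
    # An open case without an Exit rating has no Last CAFAS yet.
    if case.get("closed") is None:
        return None
    # Otherwise: most recent evaluation dated at or before the closing date.
    in_range = [d for d in case["evaluations"] if d <= case["closed"]]
    return max(in_range, default=None)

case = {"t14": None, "closed": date(2009, 1, 31),
        "evaluations": [date(2008, 5, 1), date(2008, 11, 1), date(2009, 3, 1)]}
print(last_cafas(case))  # 2008-11-01
```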

52 Time between Entry to Treatment and Last CAFAS

53 Percentage of CAFAS Evaluations. "Look at all the missed opportunities to help me have better outcomes!"

54 Change in Average CAFAS Total Score from Treatment Entry to Last CAFAS (N=9,663)

55 Change in Average Score on CAFAS Subscales from Treatment Entry to Last CAFAS (N varies between 11,098 and 11,113 because of missing subscale scores)

56 Absolute Change in Level of Functioning (N=6,721 for 2006; N=9,663 for 2007; N=10,955 for 2008). Not improved: 2006: 26.1%; 2007: 25.6%; 2008: 25.8%. Improved: 2006: 73.9%; 2007: 74.4%; 2008: 74.2%.
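The improved / not-improved split above can be sketched as a small helper. The decision rule (a Last CAFAS total strictly lower than the Entry total counts as improved, ties as not improved) is an assumption inferred from the chart, not the report's published definition:

```python
def improvement_rates(pairs):
    """pairs: (entry_total, last_total) tuples.
    Returns (% improved, % not improved); lower CAFAS totals mean less impairment."""
    improved = sum(1 for entry, last in pairs if last < entry)
    n = len(pairs)
    return round(100 * improved / n, 1), round(100 * (n - improved) / n, 1)

# Four illustrative cases: two improved, one unchanged, one worse.
print(improvement_rates([(80, 50), (60, 60), (120, 90), (40, 70)]))  # (50.0, 50.0)
```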

57 Severity of Child Functioning for Various Jurisdictions. The table compares mean (SD) CAFAS total scores at entry and exit, their difference, and effect sizes across samples:
- Ontario, 2008: N = 10,955 children and youth served in community and hospital based mental health organizations.
- Ontario, 2007: N = 9,462 children and youth served in community and hospital based mental health organizations.
- Ontario, 2006: N = 6,721 children and youth served in community and hospital based mental health organizations.
- Ontario, 2005: N = 2,164 children and youth served in community and hospital based mental health organizations.
- Ontario, 2004: N = 964 children and youth served in community and hospital based mental health organizations.
- Hodges, 2003: N = 11,815 youth referred to public mental health in the fiscal year; of these, N = 2,501 had an intake and discharge CAFAS (standard deviations not reported in the manual; effect size 0.66).
- MATCH: N = 678 children served by Georgia's Multi-Agency Team for Children, who have severe emotional disturbances requiring mental health treatment in a residential setting; 64% male, 36% female; 54% Caucasian. Results are for those with an intake and discharge rating. No standard deviations reported, hence no effect size calculation.
- Hodges, Xue & Wotring, 2004: N = 5,638 youths with serious emotional disturbance (score above 50), ages 7-17 years, served by community mental health service providers.

58 Summary

59 Benefits for the Kids. Practitioner use of the CAFAS provides a common language and common metric for the CMH system in Ontario (system integration). Systematic assessment of functioning in multiple areas of the client's life is imperative for comprehensive assessment and formulation. The systematic measurement of a client's response to treatment over the course of formulation and treatment has been shown to improve outcomes. Patient outcomes can be improved if therapists are alerted to treatment response.

60 Benefits for Practitioners: triaging for level of risk; periodic assessment of treatment response leads to improved outcomes; outcome data for service planning; increased receptivity and awareness of EBPs and outcome management; capacity building for EBP implementation and evaluation; a common language and metric across children's sector services; advancing knowledge about how to roll out EBPs and support practice change.

61 Benefits for the Provider Organization: knowing who they serve; matching their client populations with appropriate services; hiring staff that meet client needs; becoming learning organizations; improving capacity to implement other changes; demonstrating their impact; building accountability and pride in service delivery.

62 Benefits for the CYMH system: determine, for the first time ever, whether Ontario children improve as a function of the services they receive. System-wide use of CAFAS builds accountability for the quality of the services we provide; access to services is only meaningful if services are effective. Provides an evidence base from which to develop system- and organization-level service delivery improvements.

63 Thank you

64 Utilizing CAFAS Exports & Reports at CTYS

65 The CAFAS in Ontario Quarterly Report: an excellent resource for performance measurement. At 40+ pages, it's long and not readily accessible for management and staff consumption, and capturing trends over time requires another tool.

66 Task: Abridge the Quarterly Report Pull key variables & performance indicators from the Report each quarter Enable comparison over time Present data in a simple spreadsheet Append charts to aid interpretation

67 Solution: Agency CAFAS Performance since 2006.xls

68 Features of the Spreadsheet: key variables are the rows, shown on the left-hand side; each column represents data from one Quarterly Report; data is copied from the Report to the spreadsheet; trend symbols are added as the final column; charts are appended as separate sheets.
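The layout described here (indicator rows, one column per Quarterly Report, a trend symbol in the final column) can be sketched with the standard csv module; the indicator names and numbers below are invented for illustration, not taken from any actual report:

```python
import csv
import io

quarters = ["2009Q1", "2009Q2", "2009Q3"]
indicators = {
    "Analyzable cases": [310, 325, 340],
    "Mean entry CAFAS total": [78.2, 76.5, 74.9],
}

def trend(values):
    """Compare the last two quarters to pick a trend symbol."""
    if values[-1] > values[-2]:
        return "up"
    if values[-1] < values[-2]:
        return "down"
    return "flat"

# Write one row per indicator: name, quarterly values, trend symbol.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Indicator", *quarters, "Trend"])
for name, values in indicators.items():
    writer.writerow([name, *values, trend(values)])
print(buf.getvalue())
```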

69 Sample Chart

70 How the spreadsheet is used Presented to senior management each quarter Highlights and key trends noted Required actions taken Spreadsheet is accessible enough to be useful as a tool for teaching management & staff how to interpret quantitative performance data

71 Reprocessing CAFAS Exports for Program-level Evaluation: agency-wide views don't reveal what's happening at the program level; program-level results are required for effective evaluation and QI; program-level results are needed to explain agency-wide trends.

72 Solution: CAFAS Program Results.xls. Split the CAFAS Export data set into program sub-samples; reprocess each sub-sample to create program-level reports; assemble the results in one spreadsheet to allow comparison.

73 CAFAS Program Results 2009Q3 for 36 mo. period ending Sep 30/09

74 How we do it: program-level analysis is run every second quarter (six-month intervals); each spreadsheet (2008Q1, 2008Q3, ...) is placed in the same Excel workbook; program spreadsheets are created capturing the performance history of each program; each program's spreadsheet is copied into its own workbook and charts are appended there.

75 Program-level Performance History: GROUPWORK Results since 2008

76 Program-level Chart: GW Male/Female Improvement Scores

77 Techie stuff: the Client ID#2 field in CAFAS is used to enter program codes (CO, GW, ...). Re-run the CAFAS Export, modifying the standard filter to include the Client ID#2 field. Use SPSS to split the export sample into program sub-samples, then run reports in SPSS for each program sub-sample. A big thank-you to Cristina Vlad and the staff at CAFAS in Ontario for sharing their syntax files and training us on their use!
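The slide does the split in SPSS; the same grouping step can be sketched in plain Python. The dict-based record layout and the client_id2 field name are assumptions for illustration, not the export's real schema:

```python
from collections import defaultdict

def split_by_program(records, code_field="client_id2"):
    """Group exported CAFAS records into per-program sub-samples,
    keyed by the program code stored in the Client ID#2 field."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[code_field]].append(rec)
    return dict(groups)

export = [{"case": 1, "client_id2": "CO"},
          {"case": 2, "client_id2": "GW"},
          {"case": 3, "client_id2": "CO"}]
by_program = split_by_program(export)
print(sorted(by_program))  # ['CO', 'GW']
```

Each sub-sample can then be fed to the same report logic that normally runs over the whole agency export.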

78 Value-added: some examples. 1. Monitoring staff data input (compliance). 2. Face validity check on the risk profile of a program. 3. Gender differences in improvement scores in two programs (CO, ERSP): what do these mean?

79 Questions? Roger Rolfe, MEd, RMFT Research & QA Dept ext/238

80 Measuring Hope Building a Culture of Research and Evaluation at Pine River Institute Samantha Yamada, MEd Co-Founder

81 Pine River Institute: family-based treatment for substance abuse. Program phases: Outdoor Leadership Experience, Residential Treatment, Transition, Aftercare.

82 Objectives 1. Describe the Process 2. Findings 3. Lessons Learned

83 Measuring Hope: the importance of mission; defining success; asking the right questions; being curious.

84 Year 1.
Goals: establish research as a priority for Pine River; data collection.
Challenges: low enrollment; unstable work environment; lack of resources; competing priorities.
Activities: development of data collection tools; sporadic data collection; focus groups post-program; first outcomes report (internal document shared at staff retreat); networking with key partners.
Key lessons: feedback to staff is critical for buy-in; qualitative data collection is useful early in program development.

85 Year 2.
Goals: consistent data collection; build resources; research network.
Challenges: admin changeovers; competing priorities; lack of resources.
Activities: Research Coordinator established; two outcome reports; quarterly feedback of results to staff; partnership with a York University PhD class; grant application and award.
Key lessons: collaborate with a university; have a staff member responsible for research; support from leadership is critical.

86 Year 3.
Goals: improve data collection; knowledge translation; research network.
Challenges: competing priorities.
Activities: joined collaborative NATSAP research study; hired a Research Assistant; two outcome reports; monthly meetings with staff for professional development; online data collection; expanding collaborations to additional research projects.
Key lessons: partnerships are KEY; regular contact with staff is essential; even imperfect research can help with resource gathering and policy impacts; personal connections (e.g. phone calls or in-person meetings) are best for data collection.

87 Decreased Problem Substance Use

88 Increased Academic Functioning

89 Improved Quality of Life and Future Orientation

90 Qualitative Data. "I can say now that I no longer want to be part of my old lifestyle but in my last treatment I still did. I can see good things about me and I have goals and believe I can have a future." (Student, 2007) "My wilderness experience was the most incredible six weeks of the past decade." (Student, 2008) "I have learned much more about the meaning of my family and friends as well [as] more about myself and others around me." (Student, 2008) "My child arrived home some weeks ago from treatment at the Pine River Institute… The first few days went well and I pinched myself... Two weeks passed. School was good… at six weeks we're in a rhythm. 'Life is good', my child's phrase, not mine. I see a future, I believe it may be bright." (PRI parent)

91 Lessons Learned: qualitative data collection is great early on; buy-in from leadership is critical; have regular meetings to share findings with staff; partnerships with universities or other agencies are great ways to increase research capacity; explicitly allocate resources toward research and evaluation; reporting findings aids in obtaining resources for the organization and impacting policy.

