
1 Session 3

2 Implementing Outcome Evaluation (Part Two): Practical Implementation of Your Evaluation Framework
At this workshop, we will help you design usable measurement tools and address the practical issue of developing an organizational structure that allows you to implement and sustain your evaluation.

3 What to Expect from this Series
1. Learn about the steps required to implement outcome evaluation
2. Design a program logic model
3. Identify questions and indicators
4. Identify appropriate evaluation/measurement tools
5. Develop an implementation plan

4 From outcomes, to questions, to indicators, to methods
Did our program have an impact?
- What were we trying to change? (Long-Term Outcome Objectives)
- What particular contribution were we going to make to that change? (Short-Term Outcome Objectives)
- How were we planning to make that contribution? (Activities)
- What sorts of things would we see if the expected change was happening? (Indicators)
- How can we document those observations in a systematic way? (Methods)

5 Today:
FIRST, figure out what the implications of your indicator exercise are for you;
SECOND, prioritize the possible actions; and
THIRD, come up with a viable action plan.

6 A Framework for Outcome Measurement (We will go through the elements of this table one by one throughout the day)

7 Analyzing Your Indicator Exercise Data: Figuring Out What Your First Steps Ought to Be

8 Typical Situations
- We’re having a hard time coming up with evaluation questions
- We don’t have a lot of existing indicators: we don’t gather much data now
- We gather lots of data, but we don’t really analyze it

9 Typical Situations (continued)
- We gather data and we use it, but it isn’t telling us much that’s new
- We don’t make good use of our evaluation findings when promoting our programs
- We’ve got so many programs, ideas and needs, we don’t know where to start. Different programs are at very different stages.

10 Pulling out the specific implications (example)
The measurement possibilities that emerged out of the exercise:
- Extracting more information from existing data, particularly around trends over time in attendance and access, and around participant satisfaction.
- Making contact with partner agencies that run similar programs and/or programs which refer to or from yours, in order to get their impressions of how your clients are doing and to find out what their experiences have been with attendance rates.
- Creating some type of new tool to gather feedback from clients: some combination of a pre/post knowledge test, a more detailed feedback form, and/or more exploratory focus group discussions.
- More ambitious and sophisticated ideas, like videotaping parent/child interactions and tracking your clients after they leave your program to see what happens with future pregnancies or housing choices.

11 Pulling out the specific implications (example)
The next step for you seems to be an intensive and focused analysis of your rich existing data. You may also want to think about new ways of communicating these findings to outside audiences. In addition, you are thinking of adding a very in-depth and unstructured qualitative component to the mix through some case studies.

12 Pulling out the specific implications (example)
Possible Next Steps:
- Doing some intensive analysis of the data you have already gathered
- Revamping your existing client feedback survey to ask more behavioural questions, and perhaps switching to a pre-test/post-test format
- Developing a tool that would allow you to seek feedback from caregivers or family members

13 Discussion: What are the emerging first steps for you?

14 Keep in Mind We will talk this afternoon about estimating resource implications of these ideas.

15 The Basic Options for Action
Situation: We’re having a hard time coming up with evaluation questions.
Strategy: We need to clarify our theory before measurement can help us (maybe).

16 The Basic Options for Action
Situation: We don’t have a lot of existing indicators: we don’t gather much data now.
Strategy: We need to come up with some basic tools to get us started.

17 A quick overview of basic evaluation tools

18 Pros and cons of popular measurement techniques (ordered from less to more rigour):
- Retrospective reflections or stories
- Self report (interview or survey)
- Peer/family/worker report
- Direct observation
- File review
- Clinical assessments

19 The “Big Three”
- Surveys
- Focus groups
- Individual interviews

20 When to Use Focus Groups: The Advantages
- Quick, cheap, and easy to assemble
- Direct interaction between researcher and respondents
- Flexible: can be used for many purposes in many settings
- Good for obtaining data from children or those who are not literate
- Provide an opportunity to involve people in data analysis through open recording
- Produce results which are easy to understand
- Often useful and enjoyable for participants

21 Limitations of Focus Groups
- Convenience sampling and small sample size limit the ability to generalize
- Responses of participants are not independent
- A few members can dominate the results
- Focus groups require a fair bit of skill to lead well
- The data which result, though rich, can sometimes be difficult to analyze because they are so unstructured

22 Tips for Writing Good Survey Questions
- Stay focused: know why a survey is the right tool for the job, know the questions you want the survey to answer, and know how you hope to act on the findings.
- Provide feedback: if you let people know about the findings generated from a survey, they are more likely to participate next time.
- Pilot test!

23 Tips for Writing Good Survey Questions
Include a good mix of question types. Open-ended questions which provide some direction for people (such as “What are some of the things you like best about the program?”) tend to elicit more responses than completely open questions like “Please include any additional comments here.”

24 Tips for Writing Good Survey Questions
- Keep the text of the survey short, but lay it out in an easy-to-read way.
- Avoid leading questions: “How much do you think we should increase our evaluation budget?”
- Avoid double negatives: “Would it be better not to avoid increasing our evaluation budget, yes or no?”
- Avoid “War and Peace” questions: “Please describe your idea for a new management and accountability structure for our organization.”

25 Organizing a good satisfaction survey
- Provide an introduction that explains the purpose.
- Start by asking for basic descriptive information about people’s experience with the program (e.g., “How long have you been coming?” “How did you find out about us?”).
- Move to questions about changes in knowledge (“What did you learn?” “How much do you feel you know about x?”).

26 Organizing a good satisfaction survey
- Proceed to questions about behaviour and about whether they have used what they have learned.
- Satisfaction and critical reflection questions work well next (“What was most useful?” “Would you do it again?”).
- Ask for personal information at the end.

27 The Basic Options for Action
Situation: We gather lots of data, but we don’t really analyze it.
Strategy: We need to learn how to analyze and interpret.

28 Suggested Steps in Data Analysis (Centre for Research & Education in Human Services)
1. Organize data
2. Review original questions
3. Summarize and code

29 Steps in Data Analysis (continued)
4. Generate themes
5. Begin writing
6. Provide and receive feedback
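The slides describe steps 3 and 4 abstractly; here is a minimal sketch of what summarizing, coding, and tallying themes might look like in practice, assuming open-ended responses live in a CSV file. The file name, column name, and keyword codebook are purely illustrative, not part of the workshop materials.

```python
# A minimal sketch of steps 3-4 (summarize and code, generate themes),
# assuming open-ended responses in a CSV with a free-text "comments" column.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical file

# A simple codebook: theme -> keywords that suggest it. In practice the
# codes usually emerge from reading the data, not from a fixed keyword list.
codebook = {
    "staff_support": ["staff", "counsellor", "listened"],
    "skills_gained": ["learned", "skill", "coping"],
    "access_barriers": ["transport", "schedule", "wait"],
}

def code_response(text: str) -> list[str]:
    """Return every theme whose keywords appear in the response."""
    text = str(text).lower()
    return [theme for theme, words in codebook.items()
            if any(w in text for w in words)]

responses["themes"] = responses["comments"].apply(code_response)

# Tally how often each theme appears, to see which ones dominate.
theme_counts = responses["themes"].explode().value_counts()
print(theme_counts)
```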

30 Creative ideas for extracting the most out of data
- Natural time series or comparison group designs
- Data from similar organizations
- Don’t forget the qualitative!
- Be selective and speak to the questions you can speak to
- The power of triangulation
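The natural time-series idea in the list above can often be implemented with nothing more than an attendance log. A minimal sketch, assuming a CSV with one row per visit and a "visit_date" column; the file name and the program-change date are hypothetical:

```python
# Compare average monthly attendance before and after a program change,
# using the "before" period as a rough, naturally occurring comparison group.
import pandas as pd

visits = pd.read_csv("attendance_log.csv", parse_dates=["visit_date"])
monthly = visits.set_index("visit_date").resample("MS").size()

change_date = pd.Timestamp("2024-01-01")  # hypothetical program change
before = monthly[monthly.index < change_date]
after = monthly[monthly.index >= change_date]
print(f"Mean monthly visits before: {before.mean():.1f}, after: {after.mean():.1f}")
```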

31 Rapport Youth Services: A Case Study in Analysis of Existing Data

32 The Context
- Large youth counselling agency
- Long history (11 years) of using a client tracking database to keep track of who was served, what kind and how much service was provided, etc.
- Also included data from a client satisfaction form

33 Rapport Youth & Family Services: Details of Program Activities (logic model, slide 1)
Program Supports: Community Relations
- Community Connections: network with the community; market and promote services; promote partnerships
- Referral: refer clients to appropriate services; provide information about other services
- Intake/Assessment: assess clients’ needs; determine appropriateness of cases to Rapport’s mandate (goodness of fit)
Programming: Program Activities
- Counseling/Therapy: develop therapeutic relationships; help youth to build their strengths; promote protective factors; address risk factors; teach skills (e.g., stress reduction, anger management, communication); support parents; develop healthy relations within social networks (rapport); assist youth with school difficulties; enable youth to make good choices
- Groups: provide a safe setting for youth to meet together; promote protective factors; address risk factors; teach skills (e.g., stress reduction, anger management, communication); provide information; support parents; develop healthy relations within social networks (rapport); enable youth to make good choices
- ECLYPSE: provide multi-service under one roof (drop-in); reach out to hard-to-reach students; introduce youth to services; identify crises; provide technical support; promote mental health, supportive relations, harm reduction (drug/alcohol use), and safe sexuality; provide employment support
Administration and Management
- Agency Management/Administration: recruiting staff; training staff; supervising staff; gauging community needs; developing programs; regulating case flow; securing resources; developing policies
- Clinical Administration: case preparation; research; session planning; case documentation

34 Rapport Youth & Family Services: Linking Activities to Short-term and Long-term Objectives (logic model, slide 2)
Program Activities: Community Connections; Referral; Intake/Assessment; Counseling/Therapy; Groups; ECLYPSE
Short-term Objectives/Outcomes: outreach to the community; build credibility; raise awareness of services; connect clients to appropriate services; reduce service duplication; understand clients’/family situations; address risk factors; promote protective factors; connect/reconnect youth to community services; increased communication skills; increased stress reduction skills; increased anger management skills; increased use of healthy coping mechanisms; increased healthy relations within families; increased individual self-worth; increased pro-social relations
Long-term Outcomes: empowered youth and families. Behavioural outcomes: youth making better choices; increased healthy behaviors; increased school success among youth; decreased CAS/YJS involvement. Social outcomes: healthier communities; youth who are successful in life; engaged, happy youth; increased social/emotional well-being

35 Our approach
- Developed a logic model
- Spent time cleaning and organizing data and moving it from various formats into one spreadsheet
- Conducted basic descriptive analyses
- Presented these to staff, and went back to do more in-depth analyses
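A minimal sketch of what the cleaning-and-consolidation step might look like, assuming the yearly database exports were saved as CSV files. The file names, column names, and specific clean-up rules are illustrative; Rapport’s actual schema is not shown in the slides.

```python
# Consolidate yearly exports, normalize inconsistencies that accumulate
# over 11 years of database use, and produce basic descriptive summaries.
import glob
import pandas as pd

frames = [pd.read_csv(path) for path in glob.glob("exports/cases_*.csv")]
cases = pd.concat(frames, ignore_index=True)

# Standardize column names and dates, drop exact duplicate rows.
cases.columns = cases.columns.str.strip().str.lower().str.replace(" ", "_")
cases["intake_date"] = pd.to_datetime(cases["intake_date"], errors="coerce")
cases = cases.drop_duplicates()

# Basic descriptive analyses to present back to staff.
print(cases["presenting_issue"].value_counts(normalize=True).round(2))
print(cases.groupby(cases["intake_date"].dt.year).size())
```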

36 Some of the Challenges
- Who counts as a “case”? What if someone comes back to the organization for more support 4 years later? Or participates in 4 different programs at once? (One possible rule is sketched below.)
- How do you deal with changes in how the staff made use of the database over time (e.g., geography fields)?
- What to do when the satisfaction form (the only source of “outcome” data) isn’t filled out by very many people?
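One way to make the “who counts as a case” question concrete is to pick an episode window and apply it consistently. This sketch treats contacts from the same client as one case unless more than 12 months pass between them; the window, file name, and column names are all assumptions, not Rapport’s actual rule.

```python
# Derive case counts from a contact log by grouping each client's contacts
# into episodes separated by gaps longer than a chosen window.
import pandas as pd

contacts = pd.read_csv("contacts.csv", parse_dates=["contact_date"])
contacts = contacts.sort_values(["client_id", "contact_date"])

# Flag a new case whenever the gap since the client's previous contact
# exceeds the window (or it is their first recorded contact).
gap = contacts.groupby("client_id")["contact_date"].diff()
contacts["new_case"] = gap.isna() | (gap > pd.Timedelta(days=365))

# A cumulative sum of the flags yields a case number per client.
contacts["case_id"] = contacts.groupby("client_id")["new_case"].cumsum()
print("Distinct cases:", contacts.groupby(["client_id", "case_id"]).ngroups)
```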

37 Findings About Presenting Issues
- About 41% of all cases presented conduct issues, 26% family, peer or relational issues, and 16% anxieties, depression or emotional issues.
- Male clients were more likely to present conduct issues.
- Female clients were more likely to present family, peer and relational issues, and anxieties, depression and emotional issues.
- An analysis of presenting issues by geographical area revealed a similar pattern across Rapport’s three main geographical areas.
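Breakdowns like these are straightforward once the data are consolidated. A minimal sketch using a crosstab, with illustrative column names; the percentages in the slide come from Rapport’s actual data, not from this code:

```python
# Share of each presenting issue overall, then within each gender,
# expressed as row percentages.
import pandas as pd

cases = pd.read_csv("cases.csv")  # hypothetical consolidated file

overall = cases["presenting_issue"].value_counts(normalize=True)
by_gender = pd.crosstab(cases["gender"], cases["presenting_issue"],
                        normalize="index")
print((overall * 100).round(1))
print((by_gender * 100).round(1))
```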

38 Findings About Outcomes
- Clients who completed the follow-up survey experienced improvements in family dynamics, fighting in the home, and interactions at school.
- 99.5% of clients who completed the client satisfaction questionnaire said they were satisfied with Rapport’s services, 93% said they received the services they needed, and 95% said the services they received helped them to better deal with their problems.
- 90% of clients rated Rapport’s services as good or excellent, 93% indicated that they would return to Rapport if they needed help, and 95% said they would recommend Rapport if a friend needed help.

39 The Basic Options for Action
Situation: We gather data and we use it, but it isn’t telling us much that’s new.
Strategy: We need to revamp our existing tools and gather some new, complementary data. We need to get a bit more ambitious and probing with our questions. We need to dig into the research literature for findings to build on.

40 Easy ways to “beef up” methodological rigour
- Strategy: Incorporate more descriptive, behavioural questions. Rationale: Less vulnerable to social desirability bias; easier to understand.
- Strategy: Incorporate a greater variety of data types. Rationale: Allows you to “back up” and strengthen findings.
- Strategy: Narrow your focus with research questions. Rationale: Measure a few things well.
- Strategy: Measure at regular intervals. Rationale: Create your own “comparison groups”.
- Strategy: Add follow-up measures to time-of-intervention measures. Rationale: Allows for tracking of a greater range of outcomes; opens up the possibility of failure.
- Strategy: Find a comparison group. Rationale: Isolate the contribution of your program.
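For the follow-up-measures strategy in the list above, the analysis itself can stay simple: compare each client’s intake score with their follow-up score. A minimal sketch, assuming a table with hypothetical "score_pre" and "score_post" columns:

```python
# Compute the average pre-to-post change and test whether it could
# plausibly be zero, using a paired t-test on clients with both scores.
import pandas as pd
from scipy import stats

scores = pd.read_csv("pre_post_scores.csv")  # hypothetical file
scores = scores.dropna(subset=["score_pre", "score_post"])

change = scores["score_post"] - scores["score_pre"]
print(f"Mean change: {change.mean():.2f} (n={len(change)})")

t_stat, p_value = stats.ttest_rel(scores["score_post"], scores["score_pre"])
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```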

41 Easy ways to make your design more practical & efficient
- Narrow your focus with research questions.
- Compare the “return” on different measurement choices, and use indicator lists to minimize the length and intrusiveness of surveys.
- Stop gathering data you don’t use.
- Invest in buy-in, and save time on data collection: get input from stakeholders at each stage.
- Pilot test!
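“Stop gathering data you don’t use” can be made concrete by auditing completion rates: questions that almost no one answers are candidates to cut. A minimal sketch, with a hypothetical file name and an arbitrary threshold:

```python
# Report the completion rate of every survey question and flag the ones
# answered by fewer than 30% of respondents (threshold is illustrative).
import pandas as pd

survey = pd.read_csv("client_survey.csv")

completion = survey.notna().mean().sort_values()
print("Completion rate per question:")
print((completion * 100).round(1))

print("Candidates to cut:", list(completion[completion < 0.30].index))
```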

42 Criteria for Judging the Trustworthiness of an Evaluation
- Accurate recording in the field
- Prolonged engagement: knowing the context takes time
- Peer checks: members of a research team debrief and challenge one another
- Saturation: qualitative information is rich and detailed enough to ensure that key themes have not been missed

43 Criteria for Judging the Trustworthiness of an Evaluation (continued)
- A well-established “audit trail” to track where you drew information from to reach conclusions
- Triangulation: findings are supported by multiple methods and multiple stakeholders
- Endorsement of participants: findings are fed back to and validated by stakeholders
- A clear program description and statement of objectives
- Links between exploration and confirmation

44 The Basic Options for Action
Situation: We don’t make good use of our evaluation findings when promoting our programs.
Strategy: Link back to the logic model to come up with strong core messages, then “pin” the data to those messages.

45 Communicating and Using Results

46 Communication: Questions to Consider
- How much interpretation do you want to do (and how much should others do)?
- Who should act after learning about the findings?
- What do you want them to do with the findings?
- How do you package information in order to facilitate this?

47 The Basic Options for Action
Situation: We’ve got so many programs, ideas and needs, we don’t know where to start. Different programs are at very different stages.
Strategy: We need to think about our evaluation priorities at an organizational level. We may need to pick a program to pilot evaluation strategies.

48 Discussion (continued): What are the emerging first steps for you?

49 The Evaluation Action Plan

50 Where are we going to find the time for all this?

51 Key questions for discussion
- Exactly what kinds of resources are missing?
- What’s going to take a lot of time or expertise?
- What kinds of supports might make it easier for you?

