Anonymous Services, Hard to Reach Participants, and Issues around Outcome Evaluation Centre for Community Based Research United Way of Peel Region Roundtable.


Presentation transcript:

1 Anonymous Services, Hard to Reach Participants, and Issues around Outcome Evaluation Centre for Community Based Research United Way of Peel Region Roundtable Discussion For more information, contact: Jason Newberry jason@communitybasedresearch.ca

2 Purpose of Today's Session
- To identify and discuss challenges associated with measuring the outcomes of anonymous services, or services whose clients are hard to reach.
- To provide agencies with some strategies and tips for improving their outcome evaluation for anonymous services.
- To engage United Way staff and service providers in a round table discussion of the possibilities and limitations of measuring outcomes for anonymous services.

3 A ROUGH AGENDA
- Who you are… the types of agencies present and the services offered.
- Evaluation that you have done in the past.
- What you are expected to demonstrate through evaluation (by funders, by your board, by the community, etc.).
- Common evaluation problems.
- Generating solutions – what are reasonable expectations for evaluating these types of services?

4 INTRODUCTIONS
- Who you are…
- Who you serve…
- What you provide, or do…
- What you expect to achieve immediately with participants…
- What you expect to achieve in the longer term with participants…

5 Canadian Mental Health Assoc.; Family Association for Mental Health Everywhere; Peel Housing & Property; Peel Literacy Guild; Peel Social Services; Foodpath; Caledon/Dufferin Victim Services; Victim Services Peel; Ontario Victims Services Secretariat; Ontario Works in Peel; Telecare Brampton; Distress Centre Peel; United Way of Oakville

What we have in common…
- Providing services in which the organization does not know, exactly, who is being served.
- Providing services in which confidentiality is guaranteed and where sensitive information is exchanged.
- Providing single-time services (no formal follow-up contact).
- Providing low-dose services; many other social and personal factors contribute to outcomes.
- Providing services to a mobile, transient population, often with very difficult needs.
- Providing services that are crisis-oriented and preventive; the focus is on maintenance and may not be on improvement.
- Program types are very common; however, quality evaluation of such programs is scarce.
- Success of services depends on other agencies.

6 Expectations of Evaluation
Funding bodies, boards, the community, and government expect…
- Data about people served (who they are, what they are like, how many of them, etc.)
- Data about impact (how did people improve as a result of services)

7 Characteristic of service → Implications for evaluation

Providing services in which the organization does not know, exactly, who is being served:
- Can only collect data at time of service
- Can only collect limited demographic data
- Attempts at data collection compromise rapport and program theory

Providing services in which confidentiality is guaranteed and where sensitive information is exchanged:
- Confidentiality threatened by evaluation
- Attempts at data collection compromise rapport and program theory

Providing single-time services (no formal follow-up contact):
- No opportunity for follow-up and measures of change

Providing low-dose services; many other social and personal factors contribute to outcomes:
- Difficult to determine the unique impact of the program; can't control other factors

Providing services to a mobile, transient population, often with very difficult needs:
- Very high attrition from program
- Low participation in evaluation
- Difficult to find and track people

Providing services that are crisis-oriented and preventive; focus is on maintenance, may not focus on improvement:
- If not designed to improve, then difficult to (and nonsensical to try to) measure change

Program types are very common; however, quality evaluation of programs is scarce:
- Difficult to trailblaze and problem-solve without a base from which to work

Success of services depends on other agencies:
- Success of evaluation depends on the performance and evaluation of others

8 Why do you think your organization is making a difference?
- Evidence from others (other services, other evaluations, research literature)
- Evidence from ourselves (informal and formal evaluation)
- Logic, reason, intuition

9 Given these circumstances, challenges, and expectations, what resources and strategies are available to agencies so that they can speak to program impact?
- Needs assessment
- Program logic and theory
- Evidence from the literature
- Detailed process evaluation
- Theoretically important immediate outcomes
- Strategic use of qualitative data
- Innovative, program-specific ideas about outcome evaluation

10 Needs assessment
- Ongoing demonstration of community need suggests that the community believes in the importance of the program and that it carries benefits.
- Key informant interviews, focus groups, and community surveys help warrant the program by gaining buy-in from potential service users.

11 Program logic and theory
- Even if you cannot collect data on outcomes, you can still comprehensively describe the logic of your program – the links between what you do and the subsequent impacts on people.
- Create a program logic model linking activities to outcomes.
- Provide a list of validity assumptions that support all the links made in your model.

12 Program logic and theory (cont.) Your program guarantees anonymity, guarantees confidentiality, and/or serves people that are difficult to reach BECAUSE this is crucial to the purpose, logic, or success of the program. For example, anonymity is a validity assumption that, if violated, compromises the program theory. Therefore, an evaluation that requires breaking anonymity is not an evaluation of the program as it was designed. If your program's theory does not rely on anonymity, then it need not be anonymous for the purposes of outcome evaluation.

13 Evidence from the literature Evidence from the literature (academic, research, government, best practices) will help demonstrate that your program follows a theoretical rationale that is empirically supported. Often the research cited is not something you could actually do.

14 Detailed process evaluation A detailed process evaluation can take the place of outcome evaluation by demonstrating the theoretical conditions under which an outcome would be expected. Structure a process evaluation around testing the validity assumptions that link activities to short-term outcomes:
- Are we serving the right people?
- Are the services being delivered as planned?

15 Theoretically important immediate outcomes If you are engaged in direct service, there is always the theoretical possibility of observing very immediate outcomes. Where possible, data on these can be gathered and assessed against process information.

16 Strategic use of qualitative data Qualitative data is often readily accessible and can be used strategically to complement quantitative data:
- Testimonials from staff, volunteers, and clients
- Journals, observations, media, etc.

17 Innovative, program-specific ideas about assessing outcomes Even though outcome data may be difficult to gather… are there still creative ways to find out about outcomes?

18 Central Focus of Anonymous (or Other Problematic) Evaluations

Logic model chain: Activities → (validity assumptions) → Immediate outcomes → (validity assumptions) → Short-term outcomes → Long-term outcomes

- Main focus of evaluation: process and implementation, with direct examination of validity assumptions; theoretically important immediate outcomes are assessed (if this does not compromise the service).
- Secondary focus of evaluation: short-term outcomes (only possible if practical/ethical constraints are addressed; requires creative innovation).
- Likely not evaluable: long-term outcomes (unlikely to have resources for systematic investigation; probable violation of program theory; ethical considerations; program theory is weak, i.e. diluted, at this point).

19 The Role of Funding Bodies What does this mean for funders and their expectations of outcome evaluation?

