Anonymous Services, Hard to Reach Participants, and Issues around Outcome Evaluation Centre for Community Based Research United Way of Peel Region Roundtable.

Presentation transcript:

Anonymous Services, Hard to Reach Participants, and Issues around Outcome Evaluation
Centre for Community Based Research
United Way of Peel Region Roundtable Discussion
For more information, contact: Jason Newberry

Purpose of Today's Session
- To identify and discuss challenges associated with measuring the outcomes of anonymous services or services whose clients are hard to reach
- To provide agencies with some strategies/tips for improving their outcome evaluation for anonymous services
- To engage United Way staff and service providers in a round table discussion of the possibilities and limitations of measuring outcomes for anonymous services

A ROUGH AGENDA
- Who you are… the types of agencies present and the services offered.
- Evaluation that you have done in the past.
- What you are expected to demonstrate through evaluation (by funders, by your board, by the community, etc.).
- Common evaluation problems.
- Generating solutions – what are reasonable expectations for evaluating these types of services?

INTRODUCTIONS
- Who you are…
- Who you serve…
- What you provide, or do…
- What you expect to achieve immediately with participants…
- What you expect to achieve in the longer term with participants…

- Canadian Mental Health Association
- Family Association for Mental Health Everywhere
- Peel Housing & Property
- Peel Literacy Guild
- Peel Social Services
- Foodpath
- Caledon/Dufferin Victim Services
- Victim Services of Peel
- Ontario Victims Services Secretariat
- Ontario Works in Peel
- Telecare Brampton
- Distress Centre Peel
- United Way of Oakville

What we have in common…
- Providing services in which the organization does not know, exactly, who is being served.
- Providing services in which confidentiality is guaranteed and where sensitive information is exchanged.
- Providing single-time services (no formal follow-up contact).
- Providing low-dose services; many other social & personal factors contribute to outcomes.
- Providing services to a mobile, transient population, often with very difficult needs.
- Providing services that are crisis-oriented and preventive; the focus is on maintenance and may not be on improvement.
- Program types are very common; however, quality evaluation of programs is scarce.
- Success of services depends on other agencies.

Expectations of Evaluation
Funding bodies, boards, the community, and government expect…
- Data about people served (who they are, what they are like, how many of them, etc.)
- Data about impact (how did people improve as a result of services?)

Characteristics of service and their implications for evaluation:

Providing services in which the organization does not know, exactly, who is being served.
- Can only collect data at time of service
- Can only collect limited demographic data
- Attempts at data collection compromise rapport and program theory

Providing services in which confidentiality is guaranteed and where sensitive information is exchanged.
- Confidentiality threatened by evaluation
- Attempts at data collection compromise rapport and program theory

Providing single-time services (no formal follow-up contact).
- No opportunity for follow-up & measures of change

Providing low-dose services; many other social & personal factors contribute to outcomes.
- Difficult to determine the unique impact of the program; can't control other factors

Providing services to a mobile, transient population, often with very difficult needs.
- Very high attrition from program
- Low participation in evaluation
- Difficult to find and track people

Providing services that are crisis-oriented and preventive; the focus is on maintenance and may not be on improvement.
- If not designed to improve, then difficult to (and nonsensical to try to) measure change

Program types are very common; however, quality evaluation of programs is scarce.
- Difficult to trail-blaze & problem-solve without a base from which to work

Success of services depends on other agencies.
- Success of evaluation depends on the performance & evaluation of others

Why do you think your organization is making a difference?
- Evidence from others (other services, other evaluations, research literature)
- Evidence from ourselves (informal and formal evaluation)
- Logic, reason, intuition

Given these circumstances, challenges, and expectations, what resources & strategies are available to agencies so that they can speak to program impact?
- Needs assessment
- Program logic and theory
- Evidence from the literature
- Detailed process evaluation
- Theoretically important immediate outcomes
- Strategic use of qualitative data
- Innovative, program-specific ideas about outcome evaluation

Needs assessment
- Ongoing demonstration of community need suggests that the community believes in the importance of the program and that it carries benefits
- Key informant interviews, focus groups, and community surveys help warrant the program by gaining buy-in from potential service users

Program logic and theory
- Even if you cannot collect data on outcomes, you can still comprehensively describe the logic of your program – the links between what you do and the subsequent impacts on people
- Create a program logic model linking activities to outcomes
- Provide a list of validity assumptions that support all the links made in your model
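As an illustration only (not part of the original presentation), a logic model can be written down as a simple data structure so that every activity-to-outcome link carries explicit validity assumptions an evaluation could test. The service, activities, outcomes, and assumptions below are hypothetical examples, loosely modelled on a distress-line style program.

```python
# Minimal sketch of a program logic model as data.
# Assumption: a hypothetical anonymous distress-line service; all entries are illustrative.
from dataclasses import dataclass, field

@dataclass
class Link:
    activity: str                                   # what the program does
    outcome: str                                    # the change the activity is expected to produce
    validity_assumptions: list = field(default_factory=list)  # conditions that must hold for the link to work

logic_model = [
    Link(
        activity="24-hour anonymous phone support",
        outcome="Immediate reduction in caller distress",
        validity_assumptions=[
            "Callers feel safe because anonymity is guaranteed",
            "Volunteers follow the active-listening protocol",
        ],
    ),
    Link(
        activity="Referral to community services",
        outcome="Caller connects with longer-term support",
        validity_assumptions=[
            "Referral information is current",
            "Partner agencies have capacity to accept referrals",
        ],
    ),
]

# A process evaluation can then be framed as checking each assumption,
# rather than tracking individual (anonymous) clients over time.
for link in logic_model:
    print(f"{link.activity} -> {link.outcome}")
    for assumption in link.validity_assumptions:
        print(f"  assumption to test: {assumption}")
```

Laying the model out this way also makes visible which assumptions (for example, guaranteed anonymity) would be violated by a conventional outcome study, which is the point developed on the next slide.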

Program logic and theory (cont.)
Your program guarantees anonymity, confidentiality, and/or serves people who are difficult to reach BECAUSE it is crucial to the purpose, logic, or success of the program. For example, anonymity is a validity assumption that, if violated, compromises the program theory. Therefore, an evaluation that requires breaking anonymity is not an evaluation of the program as it was designed. If your program's theory does not rely on anonymity, then it need not be anonymous for the purposes of outcome evaluation.

Evidence from the literature
- Evidence from the literature (academic, research, government, best practices) will help demonstrate that your program follows a theoretical rationale that is empirically supported.
- Often the research cited is not something you could actually do yourselves.

Detailed process evaluation
- A detailed process evaluation can take the place of outcome evaluation by demonstrating the theoretical conditions under which an outcome would be expected.
- Structure a process evaluation around testing the validity assumptions that link activities to short-term outcomes:
  - Are we serving the right people?
  - Are the services being delivered as planned?

Theoretically important immediate outcomes
If you are engaged in direct service, there is always the theoretical possibility of observing very immediate outcomes. Where possible, data on these can be gathered and assessed against process information.

Strategic use of qualitative data
- Qualitative data is often readily accessible and can be strategically used
- Use qualitative data to complement quantitative data
- Testimonials from staff, volunteers, and clients
- Journals, observations, media, etc.

Innovative, program-specific ideas about assessing outcomes
Even though outcomes may be difficult to gather… are there still creative ways to find out about outcomes?

Central Focus of Anonymous (or Other Problematic) Evaluations
The logic model chain runs: Activities → Immediate outcomes → Short-term outcomes → Long-term outcomes, with validity assumptions underlying each link.
- Main focus of evaluation (activities and immediate outcomes): focus is on process and implementation and direct examination of validity assumptions; theoretically important immediate outcomes are assessed (if it does not compromise the service).
- Secondary focus of evaluation (short-term outcomes): only possible if practical/ethical constraints are addressed; requires creative innovation.
- Likely not evaluable (long-term outcomes): unlikely to have resources for systematic investigation; probable violation of program theory; ethical considerations; program theory is weak (diluted) at this point.

The Role of Funding Bodies
What does this mean for funders and their expectations of outcome evaluation?