Community-Based Research Workshop Series CBR 304: A Participatory Approach to Programme Evaluation

Presentation transcript:

Community-Based Research Workshop Series CBR 304 A Participatory Approach to Programme Evaluation

2 Things we will cover Why program evaluation? Review lots of jargon. Creating program logic models. Using PLMs as a tool for evaluation design.

3 Things we will not cover Specific evaluation designs Qualitative/quantitative methods

4 Introduction Your name Organization you are associated with What is the first word that comes to mind when you hear evaluation?

5 Introduction continued What is evaluation? Why should we evaluate? Who should do evaluations? How often should evaluation be done? Who uses evaluation findings?

6 What is Evaluation? "Program evaluation is a collection of methods, skills and sensitivities necessary to determine whether a human service is needed and likely to be used, whether the service is sufficiently intensive to meet the unmet needs identified, whether the service is offered as planned, and whether the service actually does help people… at a reasonable cost without unacceptable side effects" (Posavac & Carey, 1997, p. 2)

7 Why Evaluate? To assess the needs of the community To devote resources to unmet needs To verify that a program is providing intended services To determine which services provide the best results To assess what processes are effective in delivering and managing programs To provide the information needed to maintain and improve the quality of the program

8 Who Evaluates? Program staff Independent / external consultants Academics Program users Funders

9 How often? In a regular program cycle: Too early – there is not enough to evaluate; Too late – evaluation becomes crisis management

10 Who Uses Evaluation Findings? Program planners Program staff Program management Program funders Policy makers Legislators Service users

11 Traditional vs CBR Evaluation
Traditional: Outside expert | CBR: Team of stakeholders
Traditional: Expert defines problems and solutions | CBR: Stakeholders collectively decide the focus of the evaluation
Traditional: Report may or may not be used for change | CBR: Early buy-in from stakeholders increases the likelihood of uptake
Traditional: Capacities leave with the expert | CBR: Capacity is built internally

12 Participatory Continuum The participatory nature depends on where the questions originate and where decision-making power lies: Researcher/Funder → Researcher consults community → Community-based

13 Small Group What freaks you out about evaluation? What makes it challenging? What makes it exciting?

14 D2D CASE STUDY D2D is a street-youth-serving agency that wants to respond to the needs of street-involved youth in the community. D2D offers an all-night drop-in with meals, access to health care workers, computer classes, and a mentorship program. They run from 7 pm to 7 am, 5 days a week. They have 10 staff and serve 500 kids a week. Their funding comes from the Ministry of Children and Family, drug prevention money, and a youth resiliency and empowerment grant from the Z Foundation. They want to evaluate their services.

15 D2D CASE STUDY 1) Who do you think should be on your evaluation team? (Why?) 2) What are some immediate questions you have for D2D? 3) Where would you start?

Where to start? A good programme plan!

17 Planning & Evaluation Cycle Establish Need → Plan Programme → Implement Programme → Assess Results → Act on Findings → (back to Establish Need)

18 Establishing Need Walking tours Interviews with formal and informal leaders Community forums Voting with your feet Visioning process Photovoice Literature review Client data

The first step in evaluation: Articulating what you are doing and why… (in other words, clarifying your goals, objectives and activities) What are you doing? (practice) Why do you think it should work? (theory) What will change as a result of your efforts? (evaluation)

20 Essential Components of Programme Plans Goals: broad visioning statements – e.g. To promote the birth of healthy babies Objectives: specific things you would like to see changed – e.g. To reduce substance use among pregnant women Activities: what you will do to make your goals and objectives happen – e.g. Provide a substance use treatment program for pregnant women

21 D2D How would you help D2D articulate its Goals? Objectives? Activities?

22 Program Logic Model A flow chart which depicts the logical relationships between program activities and the changes expected to occur as a result of these activities. - United Way PEOD

23 Program Logic Models – Elements
INPUTS: Resources dedicated to the program, e.g. money, staff, volunteers, facilities, supplies
ACTIVITIES: What the program does with the inputs, e.g. sheltering, feeding, training, education
OUTPUTS: Direct products of program activities, e.g. # of youth accessing the centre, hours of contact, meals served
OUTCOMES: Benefits (changes) for participants – 1) immediate, 2) short term, 3) long term

24 Teen Sexual Health Information Program
INPUTS: 2 staff; $130,000/year; training space; web server; phone lines
ACTIVITIES: Train 100 peer sexual health counsellors; provide face-to-face peer counselling; host a peer web site; host a peer phone line
OUTPUTS: Meet with 100 youth per week face to face; field 100 calls/night; field 1,000 online questions/month
OUTCOMES: Goal – To empower teens to make healthy sexual decisions.

25 Outcomes Immediate: youth get advice they need, youth learn new things (knowledge) Short term: greater self-esteem, increased condom use Long term: Fewer STIs, fewer pregnancies, youth empowered to make healthy sexual decisions
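For teams that keep their program plans in digital form, the same four elements can also be written down as a small structured record. The sketch below is purely illustrative Python (the LogicModel class and its field names are assumptions, not part of the workshop materials); it captures the teen sexual health program from the two slides above as an inputs → activities → outputs → outcomes chain.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class LogicModel:
    """Minimal sketch of a program logic model; field names are hypothetical."""
    inputs: List[str] = field(default_factory=list)      # resources dedicated to the program
    activities: List[str] = field(default_factory=list)  # what the program does with the inputs
    outputs: List[str] = field(default_factory=list)     # direct products of program activities
    outcomes: Dict[str, List[str]] = field(default_factory=dict)  # immediate / short-term / long-term changes

# Values taken from the teen sexual health example above.
teen_program = LogicModel(
    inputs=["2 staff", "$130,000/year", "training space", "web server", "phone lines"],
    activities=["train 100 peer sexual health counsellors", "face-to-face peer counselling",
                "peer web site", "peer phone line"],
    outputs=["100 youth seen face to face per week", "100 calls fielded per night",
             "1,000 online questions per month"],
    outcomes={
        "immediate": ["youth get the advice they need", "youth learn new things"],
        "short term": ["greater self-esteem", "increased condom use"],
        "long term": ["fewer STIs", "fewer pregnancies",
                      "youth empowered to make healthy sexual decisions"],
    },
)

print(teen_program.outputs)
```

Writing the model down this way is only a convenience for keeping the chain explicit; a paper or whiteboard version works just as well for the D2D exercise on the next slide.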

26 Create a logic model for D2D! INPUTS: ACTIVITIES: OUTPUTS: OUTCOMES: 1) Immediate 2) Short term 3) Long term CHANGES!

What to evaluate? So many options…

28 Aspects of a program that can be evaluated Effort – resources available and used Execution – adequacy of delivery Efficacy – benefits to clients Effectiveness – attainment of outcome Efficiency – achievement/costs
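Of these five aspects, efficiency is the one that is explicitly a ratio: achievement divided by cost. As a purely illustrative calculation (the success figure is invented, not from the workshop materials): if a program with the teen sexual health budget above spends $130,000 in a year and, say, 100 participants achieve the intended outcome, its efficiency can be expressed as $130,000 / 100 = $1,300 per successful outcome, a figure that can then be compared across programs or years.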

29 3 Types of Evaluation Formative or Process Evaluation Outcome or Impact Evaluation Economic Evaluation

30 Relationship between Types of Evaluation: Quit Program
PROCESS (practice/programme) – e.g. What happened? Did people like it? Why?
IMPACT, short term (effect) – e.g. Did people stop smoking?
OUTCOME, long term (benefits) – e.g. Lower rates of smoking-related disease?
ECONOMIC (cheaper?) – e.g. Is prevention cheaper than treatment?

31 Process? Outcome? Number of people attending the sessions Level of satisfaction with sessions Behaviour change (short and long term) Number of clients that come back to a session Fewer illnesses resulting

32 Making decisions If we only focus on process – we will never know about outcomes If we only focus on outcomes – we will never know why a programme works or doesn't A good evaluation should have elements of both that inform each other!

33 Deciding on D2D Will you focus on process? Why? Will you focus on outcome? Why? Which elements of process or outcome are you interested in zeroing in on?

34 From model to indicators INPUTS: ACTIVITIES: OUTPUTS: OUTCOMES: 1) Immediate 2) Short term 3) Long term CHANGES!

35 From model to indicators Indicator Definition: Indicators are ways of phrasing your evaluation strategies… Indicators should be directly related to your expected outcomes Indicators should be measurable Indicators should have a time element You can have both process and outcome indicators!

36 From model to indicators
PROGRAMME: Homework programme | OUTCOME: Students perform at grade level | INDICATOR: % of participants who earn passing marks on the next report card
PROGRAMME: Prenatal care for substance-abusing women | OUTCOME: Reduction in alcohol consumption | INDICATOR: % of participants who report no alcohol consumption in the 3rd trimester
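Because the indicators in the table above are simple proportions, turning collected data into an indicator value is straightforward arithmetic. The snippet below is a minimal, hypothetical sketch (the record layout, the participant labels, and the percent_meeting helper are all invented for illustration, not part of the workshop): it computes the homework-programme indicator, "% of participants who earn passing marks on the next report card".

```python
# Hypothetical report-card records; in a real evaluation these would come
# from the program's own files (names and fields are invented for illustration).
report_cards = [
    {"participant": "A", "passing": True},
    {"participant": "B", "passing": False},
    {"participant": "C", "passing": True},
    {"participant": "D", "passing": True},
]

def percent_meeting(records, key):
    """Percentage of records (0-100) for which the given field is true."""
    if not records:
        return 0.0
    return 100.0 * sum(1 for r in records if r[key]) / len(records)

# 3 of 4 participants earned passing marks -> 75%
print(f"{percent_meeting(report_cards, 'passing'):.0f}% of participants earned passing marks")
```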

37 TYA Indicators Try to create some indicators for D2D (remember: they should come directly from your outputs and outcomes) How will you collect them? What resources will you need to put in place?

38 Wrap-up Outstanding Questions

39 Workshop Evaluation Your feedback is extremely important! Please complete the workshop evaluation… Thank you!