1 Community-Based Research Workshop Series CBR 304 A Participatory Approach to Programme Evaluation

2 Things we will cover: Why program evaluation? Review lots of jargon. Creating program logic models. Using PLMs as a tool for evaluation design.

3 Things we will not cover: specific evaluation designs; qualitative/quantitative methods.

4 Introduction: Your name. Organization you are associated with. What is the first word that comes to mind when you hear "evaluation"?

5 Introduction continued: What is evaluation? Why should we evaluate? Who should do evaluations? How often should evaluation be done? Who uses evaluation findings?

6 What is Evaluation? "Program evaluation is a collection of methods, skills and sensitivities necessary to determine whether a human service is needed and likely to be used, whether the service is sufficiently intensive to meet the unmet needs identified, whether the service is offered as planned, and whether the service actually does help people… at a reasonable cost without unacceptable side effects" (Posavac & Carey, 1997, p. 2).

7 Why Evaluate?
- To assess the needs of the community
- To devote resources to unmet needs
- To verify that a program is providing intended services
- To determine which services provide the best results
- To assess what processes are effective in delivering and managing programs
- To provide information needed to maintain and improve quality of program

8 Who Evaluates?
- Program staff
- Independent / external consultants
- Academics
- Program users
- Funders

9 How often? In a regular program cycle: Too early – not enough to evaluate. Too late – evaluation becomes crisis management.

10 Who Uses Evaluation Findings?
- Program planners
- Program staff
- Program management
- Program funders
- Policy makers
- Legislators
- Service users

11 Traditional vs CBR Evaluation
- Traditional: Outside expert. CBR: Team of stakeholders.
- Traditional: Expert defines problems and solutions. CBR: Stakeholders collectively decide the focus of the evaluation.
- Traditional: Report may or may not be used for change. CBR: Early buy-in from stakeholders increases the likelihood of uptake.
- Traditional: Capacities leave with the expert. CBR: Capacity is built internally.

12 Participatory Continuum: The participatory nature of an evaluation depends on where the questions originate and where decision-making power lies: Researcher/Funder → Researcher consults community → Community based.

13 Small Group: What freaks you out about evaluation? What makes it challenging? What makes it exciting?

14 D2D CASE STUDY: D2D is a street-youth-serving agency that wants to respond to the needs of street-involved youth in the community. D2D offers an all-night drop-in with meals, access to health care workers, computer classes, and a mentorship program. They run from 7 pm to 7 am, 5 days a week. They have 10 staff and serve 500 kids a week. Their funding comes from the Ministry of Children and Family, drug prevention money, and a youth resiliency and empowerment grant from the Z Foundation. They want to evaluate their services.

15 D2D CASE STUDY: 1) Who do you think should be on your evaluation team? (Why?) 2) What are some immediate questions you have for the D2D? 3) Where would you start?

16 Where to start? A good programme plan!

17 Planning & Evaluation Cycle: Establish Need → Plan Programme → Implement Programme → Assess Results → Act on Findings → (back to Establish Need).

18 Establishing Need
- Walking tours
- Interviews with formal and informal leaders
- Community forums
- Voting with your feet
- Visioning process
- Photovoice
- Literature review
- Client data

19 The first step in evaluation: articulating what you are doing and why… (in other words, clarifying your goals, objectives and activities). What are you doing? (practice) Why do you think it should work? (theory) What will change as a result of your efforts? (evaluation)

20 Essential Components of Programme Plans
- Goals: broad visioning statements – e.g. To promote the birth of healthy babies
- Objectives: specific things you would like to see changed – e.g. To reduce substance use among pregnant women
- Activities: what you will do to make your goals and objectives happen – e.g. Provide substance use treatment program for pregnant women

21 D2D: How would you help the D2D articulate its Goals? Objectives? Activities?

22 Program Logic Model: A flow chart which depicts the logical relationships between program activities and the changes expected to occur as a result of these activities. – United Way PEOD

23 Program Logic Models – Elements
- INPUTS: resources dedicated to the program, e.g. money, staff, volunteers, facilities, supplies
- ACTIVITIES: what the program does with inputs, e.g. sheltering, feeding, training, education
- OUTPUTS: direct products of program activities, e.g. # of youth accessing the centre, hours of contact, meals served
- OUTCOMES: benefits (changes) for participants – 1) immediate, 2) short term, 3) long term

24 Teen Sexual Health Information Program
- INPUTS: 2 staff; $130,000/year; training space; web server; phone lines
- ACTIVITIES: train 100 peer sexual health counsellors; provide face-to-face peer counselling; host peer web site; host peer phone line
- OUTPUTS: meet with 100 youth per week face to face; field 100 calls/night; field 1,000 online questions/month
- OUTCOMES – Goal: To empower teens to make healthy sexual decisions
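Teams that keep their evaluation planning in a spreadsheet or script sometimes find it helpful to record a logic model as structured data, so that indicators can later be tied back to specific outputs and outcomes. The sketch below is only an illustration, assuming Python; the LogicModel class and its field names are assumptions made for this example, not part of the workshop materials or any standard evaluation toolkit.

    # A minimal sketch of the logic model above as structured data.
    # The class and field names are illustrative assumptions only.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class LogicModel:
        goal: str
        inputs: List[str] = field(default_factory=list)
        activities: List[str] = field(default_factory=list)
        outputs: List[str] = field(default_factory=list)
        outcomes_immediate: List[str] = field(default_factory=list)
        outcomes_short_term: List[str] = field(default_factory=list)
        outcomes_long_term: List[str] = field(default_factory=list)

    # The Teen Sexual Health Information Program from the slide above,
    # recorded as a LogicModel instance.
    teen_health = LogicModel(
        goal="Empower teens to make healthy sexual decisions",
        inputs=["2 staff", "$130,000/year", "training space", "web server", "phone lines"],
        activities=["Train 100 peer sexual health counsellors",
                    "Provide face-to-face peer counselling",
                    "Host peer web site",
                    "Host peer phone line"],
        outputs=["Meet with 100 youth per week face to face",
                 "Field 100 calls per night",
                 "Field 1,000 online questions per month"],
        outcomes_immediate=["Youth get advice they need", "Youth learn new things"],
        outcomes_short_term=["Greater self-esteem", "Increased condom use"],
        outcomes_long_term=["Fewer STIs", "Fewer pregnancies"],
    )

    print(teen_health.goal)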

25 Outcomes
- Immediate: youth get advice they need; youth learn new things (knowledge)
- Short term: greater self-esteem; increased condom use
- Long term: fewer STIs; fewer pregnancies; youth empowered to make healthy sexual decisions

26 Create a logic model for D2D! INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES (immediate, short term, and long term changes).

27 What to evaluate? So many options…

28 Aspects of a program that can be evaluated
- Effort – resources available and used
- Execution – adequacy of delivery
- Efficacy – benefits to clients
- Effectiveness – attainment of outcome
- Efficiency – achievement/costs

29 3 Types of Evaluation: Formative or Process Evaluation; Outcome or Impact Evaluation; Economic Evaluation.

30 Relationship between Types of Evaluation: a smoking Quit Program
- PROCESS (practice/programme) – e.g. What happened? Did people like it? Why?
- IMPACT (short-term outcome / effect) – e.g. Did people stop smoking?
- OUTCOME (long-term benefits) – e.g. Lower rates of smoking-related disease?
- ECONOMIC (cheaper?) – e.g. Is prevention cheaper than treatment?

31 Process? Outcome?
- Number of people attending the sessions
- Level of satisfaction with sessions
- Behaviour change (short and long term)
- Number of clients that come back to a session
- Fewer illnesses resulting

32 Making decisions: If we only focus on process, we will never know about outcomes. If we only focus on outcomes, we will never know why a programme works or doesn't. A good evaluation should have elements of both that inform each other!

33 Deciding on D2D: Will you focus on process? Why? Will you focus on outcome? Why? Which elements of process or outcome are you interested in zeroing in on?

34 From model to indicators: INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES (immediate, short term, and long term changes).

35 From model to indicators – Indicator definition: indicators are ways of phrasing your evaluation strategies…
- Indicators should be directly related to your expected outcomes
- Indicators should be measurable
- Indicators should have a time element
- You can have both process and outcome indicators!

36 From model to indicators
- Programme: Homework programme. Outcome: Students perform at grade level. Indicator: % of participants who earn passing marks in the next report card.
- Programme: Prenatal care for substance-abusing women. Outcome: Reduction in alcohol consumption. Indicator: % of participants who report no alcohol consumption in the 3rd trimester.
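Because indicators like the ones above are measurable, they usually reduce to a simple calculation over programme records. A minimal sketch, assuming Python; the percent_passing function, the passing mark of 50, and the sample marks are illustrative assumptions, not data from the workshop.

    # Illustrative calculation of the homework-programme indicator:
    # % of participants who earn passing marks on the next report card.
    def percent_passing(marks, passing_mark=50):
        """Share of participants at or above the passing mark, as a percentage."""
        if not marks:
            return 0.0
        passed = sum(1 for mark in marks if mark >= passing_mark)
        return 100.0 * passed / len(marks)

    # Hypothetical report-card marks for homework-programme participants.
    report_card_marks = [62, 48, 71, 55, 80, 44]
    print(f"{percent_passing(report_card_marks):.0f}% of participants earned passing marks")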

37 TYA Indicators: Try and create some indicators for D2D (remember: they should come directly from your outputs and outcomes). How will you collect them? What resources will you need to put in place?

38 Wrap-up: Outstanding questions

39 Workshop Evaluation: Your feedback is extremely important! Please complete the workshop evaluation… Thank you!

