Presentation on theme: "Community program evaluation school"— Presentation transcript:

1 Community program evaluation school
Final Review: Ensuring Use and Sharing Lessons Learned
May 19, 2016

2 REVIEW TOPICS: Evaluation Types/Models, Logic Models, Sample Size/Sampling, Reporting, Others??
We’re going to touch on some of the issues you raised last week, when some of you said you wanted more information about sampling, sample sizes, and the quality of data. Then we’ll look a little at the basic steps involved in analyzing quantitative and qualitative data, doing some practice along the way. Review the difference between quantitative and qualitative data, with examples …

3 Evaluation Activities by Program Stage
[Slide diagram: program stages (Program Design & Initiation; Mid-course Program Corrections; Steady State Program Implementation) matched to evaluation activities (Needs Assessment; Logic Model/Theory of Change; Formative Evaluation; Process Evaluation; Implementation Evaluation; Outcome Evaluation) along a Formative-to-Summative continuum, for simple and complicated programs]

4 Quantitative surveys
Since you usually cannot interview the whole target group, you have to create a sample (a smaller group drawn from the whole). Ideally your sample is a random sample. In statistics, a simple random sample is a group of individuals (a sample) chosen from a larger population; each individual is chosen randomly and entirely by chance.
Asking all your friends in RP is NOT a random sample
Asking everyone who comes to the TD centre is NOT a random sample
Asking everyone who goes to the pool is NOT a random sample
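
(Not from the slides: a minimal sketch of drawing a simple random sample in Python, assuming you already have a list of everyone, or every household, in the target group; the household list and sample size below are made-up placeholders.)

    import random

    # Hypothetical sampling frame: every household on a neighbourhood list.
    households = [f"Household {i}" for i in range(1, 501)]

    # Simple random sample: each household has an equal chance of being chosen.
    sample = random.sample(households, k=50)

    print(sample[:5])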

5 Quantitative surveys: Some ideas for (easy) random samples
For a random sample of RP residents:
Random digit dialing
Randomly choose households
Randomly pick blocks, and on those blocks randomly pick houses
In one study, we randomly chose a first house, then sampled every fourth house after that. On each block we interviewed no more than 10 houses, or stopped when we reached the end of the block.

6 Quantitative surveys: How many people do I need to approach???
You can do a formal sample size calculation, or you can informally estimate that you need at least 10 persons per variable in your data set.
Example: Have swimming skills of RP residents increased? Swim skills (yes/no or degree of skill: 4-5 variables), background such as prior swimming lessons or experience (3-4 variables), learned to swim at the RP pool (3-4 variables), age (1), gender (1), confidence about the pool (1). That is roughly 15 variables, so you need at least 150 persons.
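
(Not from the slides: the slide mentions a formal sample size calculation without giving one. One common choice is the formula for estimating a proportion, n = z^2 * p * (1 - p) / e^2, sketched below alongside the 10-persons-per-variable rule of thumb; the confidence level, margin of error, and p = 0.5 are illustrative assumptions.)

    import math

    def sample_size_proportion(p=0.5, margin=0.05, z=1.96):
        # Sample size needed to estimate a proportion p within +/- margin
        # at roughly a 95% confidence level (z = 1.96).
        return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

    def sample_size_rule_of_thumb(num_variables, persons_per_variable=10):
        # Informal estimate from the slide: at least 10 persons per variable.
        return num_variables * persons_per_variable

    print(sample_size_proportion())        # 385 for p = 0.5 and a 5% margin
    print(sample_size_rule_of_thumb(15))   # 150, matching the slide's example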

7 Evaluation Logic Model
[Slide diagram: Inputs → Outputs (Activities, Participation) → Outcomes/Impacts (Short, Medium, Long), with Evaluation Questions for Planning, Process, and Outcomes, plus Indicators and examples at each stage]

8 Review where we are – note that the step is called “justify conclusions”
As central as data analysis is to evaluation, evaluators know that the evidence gathered for an evaluation does not necessarily speak for itself. Conclusions become justified when the analyzed findings (“the evidence”) are interpreted through the prism of the values and standards that stakeholders bring, and are then judged accordingly. Justification of conclusions is fundamental to utilization-focused evaluation: when agencies, communities, and other stakeholders agree that the conclusions are justified, they will be more inclined to use the evaluation results for program improvement.

9 Ensure Use of Evaluation Findings
Don’t wait until the end! Build it into every stage. Collaborative, participatory. Dissemination: consider your audience. Full disclosure and impartial reporting.
Don’t wait until the end: involve the stakeholders during the planning stages to think through how potential findings will influence decision-making and planning.
Build it into every stage of the evaluation process: during every stage of the evaluation, you’ll want to think about who will benefit from the evaluation findings and how best to communicate them. For example, you may want to hold regular meetings with stakeholders to share findings in real time, or, if that’s not feasible, send regular notices/newsletters about what’s been learned. This keeps everybody engaged and focused, and most apt to use the findings.
The more collaborative and participatory you are in the evaluation process, the more apt the findings are to be used.
When you’re figuring out how to disseminate the findings, consider your audience: match the timing, style, tone, and format of your findings to the audience. For example, …

10 PRACTICE: Match Stakeholders to Format
Evaluation Course Funders (University of Toronto, Learning Centre)
Course Participants
Regent Park Community

11 Your turn
Pool Staff (e.g. lifeguards, swimming instructors)
Pool users (e.g. families, seniors)
Regent Park Community

12 Evaluation Report – Sample Outline
Executive Summary; Background and Purpose; Evaluation Methods; Results; Discussion and Recommendations
Executive Summary: a 1-page summary with the most important findings (for the audience) and lessons learned. Not technical or jargony; be brief, clear, and impartial.
Background and Purpose: program background, evaluation rationale, stakeholder identification and engagement, program description, key evaluation questions/focus.
Methods: design, sampling procedures, measures/performance indicators, data collection and processing procedures, analysis, limitations.
Results
Discussion and Recommendations

