Making it Count! Program Evaluation For Youth-Led Initiatives.

What Is Evaluation? "Evaluation is a systematic determination of a subject's merit, worth and significance, using criteria governed by a set of standards" (Wikipedia). In plain language, evaluation measures the value of something against the goals we identified. In community development, this often means measuring whether we are making the changes we hoped for in our participants and our communities.

Why Evaluate?
For our learning!
- We learn what works and what doesn't, and can revise our programs and services accordingly.
- This is your time to experiment, practice, learn and grow; this kind of learning will help prepare you for your future!
To be accountable!
- Funders (and the public) want to know what was done and accomplished with the money provided.
- You can show your community what has been achieved and the progress that's been made.
- You know you're making a difference, and you can prove it!

MAKING CHANGE: Defining Your Outcomes

Outcomes What are outcomes? Outcomes are changes in behaviour, attitudes, skills, conditions, or knowledge. We can see outcomes in our community, in our participants or clients, and even in ourselves.

Questions to Consider To identify your outcomes, think about:
- What difference do you WANT to make in your community with your programs and services?
- What changes do you hope or expect to see in your clients or participants?
- What changes do you hope or expect to see in yourself, your staff, and your volunteers?

Examples of Outcomes
- Youth feel a sense of increased safety
- Increased participation of youth in community events
- Youth experience increased access to opportunities
- Youth increase participation in civic activities (e.g. voting)
- Decrease in violence involving youth

Measuring Outcomes Outcomes, like change, take time; some of the outcomes you identify may not happen for a long time and may therefore be difficult to measure, manage, or track. Think about shorter-term outcomes you might be able to measure more easily, like changes in knowledge:
- The youth participants have an increased understanding of sexual health
- The youth participants feel an increased sense of self-confidence

Outputs are different from outcomes. While outcomes describe the changes we want to see in participants and community, outputs describe hard numbers and deliverables. Output: the immediate results of our program activities and services, usually numeric, or the deliverables (hard products) that we create as a result of our activities.

Examples of Outputs
Activity: Host employment workshops
Outputs: # of workshops; # of attendees per workshop; # of total attendees; 1 toolkit; 1 workshop guide
Activity: Host budgeting/financial management workshops
Outputs: # of workshops; # of attendees per workshop; # of total attendees; 1 toolkit; 1 workshop guide

Measuring Outputs Measuring outputs is generally far easier than measuring outcomes. Often outputs are measured through simple tracking methods, like attendance records or the completion of a deliverable.
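For groups comfortable with a spreadsheet or a little scripting, the output tracking described above can be as simple as tallying attendance records. Below is a minimal sketch in Python; the workshop names and attendance numbers are hypothetical, purely for illustration.

```python
# Minimal sketch of output tracking from simple attendance records.
# The data below is hypothetical example data, not from the program.
attendance = [
    {"workshop": "Employment 101", "attendees": 12},
    {"workshop": "Budgeting Basics", "attendees": 9},
    {"workshop": "Resume Writing", "attendees": 14},
]

# Outputs: # of workshops, # of total attendees, average per workshop
num_workshops = len(attendance)
total_attendees = sum(w["attendees"] for w in attendance)
average = total_attendees / num_workshops

print(f"Workshops held: {num_workshops}")        # 3
print(f"Total attendees: {total_attendees}")     # 35
print(f"Average per workshop: {average:.1f}")    # 11.7
```

The same tallies can of course be kept on paper or in a spreadsheet; the point is that outputs reduce to counts of activities and attendees.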

Indicators/Measures Indicators are the measurements you'll track to determine whether you've achieved your outcomes. There are generally two types:
- Qualitative: these measures do not involve numbers; instead they focus on descriptions of changes, testimonials, and stories of change
- Quantitative: these are numerical measures (numbers! numbers! numbers!) that show the changes that have been made
Aim to have indicators that capture both kinds of information.

Indicators
Qualitative
- Purpose: to provide context, detail, and description around the changes your program or services are creating.
- Benefit: creates a human element that allows people to connect to the story; paints a picture and tells a story that's easily understood.
- Disadvantage: can be seen as an individual success that doesn't show the whole story.
Quantitative
- Purpose: to provide clear data (proof) that demonstrates the changes your program or services are creating.
- Benefit: demonstrates change beyond one person or one story; shows the bigger picture.
- Disadvantage: misses the human impact element; doesn't tell a story or create empathy.

Indicators Indicators can be hard to develop or select. Try thinking about:
- What measurement will prove that we've achieved our outcome?
- What information do we already collect, or what information is collected elsewhere, that we can use?
- How can you collect the information you need?

Examples of Indicators
Outcome: The youth participants have an increased understanding of sexual health
Indicators:
- increased # of youth can identify sexual health resources in their communities
- increased # of youth report using condoms with other forms of protection
- …
Outcome: The youth participants feel an increased sense of self-confidence
Indicators:
- increased # of youth indicate they have more hope about their future
- increased # of youth are entering/using spaces that they wouldn't have before
- …

Program Planning In order to evaluate at the end of a program, you need to think about evaluation right from the beginning! Program planning is when you outline your outcomes, outputs, and indicators, among other things. When you begin to plan a program, take into consideration:
- Your community's likes, wants, needs, and assets
- Your organization's mission and strengths
- The outcomes you want to achieve
- The activities that will allow you to achieve your outcomes

Types of Evaluation Evaluation can be simple or very complex. This may depend on:
- The type of program
- The length of the program
- The amount of funding/resources you have
- The outcomes you identified
- The activities or services you're offering

The Fundamentals At its core, evaluation is about collecting feedback from your participants about their experience using your programs or services, and about the changes that may (or may not) have come about because of them.

Even with minimal resources, you can do the following:
1. Get "baseline" data: This means knowing where your participants are at when they start coming to your program or using your services. This can be a simple intake process, a survey on the very first day, or asking the group a few key questions to gauge what they already know about the program topic.
2. Take attendance: Know how many program activities were held or services offered, how many people used them, and who used them.

3. Get feedback: After every program activity or service, ask the participants what they thought, whether through a survey or a few questions to the group. Ask about the quality of the program or service, but also ask what they learned, what they still don't understand, and how the program could be better.
4. Find out what's changed: At the end of the program or service offering, ask the same questions you asked to get your baseline data. Make note of what's changed in the knowledge, attitudes, and behaviours of your participants. Host focus groups or ask for testimonials to get stories. Make sure your participants know how you're going to use what they say (e.g. posting it on the website, sharing it in a grant application…).
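The baseline-and-follow-up comparison in steps 1 and 4 can be sketched in a few lines of Python. The survey question and the yes/no answers below are hypothetical example data, only there to show the arithmetic: ask the same question at the start and the end, then compare the share of "yes" answers.

```python
# Minimal sketch of a baseline vs. end-of-program comparison.
# Hypothetical question: "Can you name a sexual health resource
# in your community?" asked on day one and again at the end.
baseline = ["no", "no", "yes", "no", "no", "no", "yes", "no"]
endline  = ["yes", "yes", "yes", "no", "yes", "yes", "yes", "no"]

def share_yes(answers):
    """Fraction of participants answering 'yes'."""
    return answers.count("yes") / len(answers)

change = share_yes(endline) - share_yes(baseline)

print(f"Baseline: {share_yes(baseline):.0%}")        # 25%
print(f"End of program: {share_yes(endline):.0%}")   # 75%
print(f"Change: {change:+.0%}")                      # +50%
```

A before/after percentage like this is exactly the kind of quantitative indicator described earlier; pairing it with a few testimonials covers the qualitative side.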

RBA: Population Accountability Results-based accountability talks about some BIG outcomes, which it calls "population accountability." It refers to changes in the well-being of WHOLE populations (beyond participants) over time. The challenge is that you need to be able to (a) measure over a longer period of time and (b) demonstrate that your programs and services made the change. While this is important, it can be difficult to apply.

RBA: Performance Accountability RBA also talks about the value of a program or service to participants through something called "performance accountability." It's based on two simple questions: How much did we do? How well did we do it?

The performance accountability grid sorts measures by effort and effect:
- Effort, Quantity: How much did we do? Counts (#) of activities and outputs.
- Effort, Quality: How well did we do it? Percentages (%) reflecting the response to activities and processes.
- Effect: Is anyone better off? Numbers, percentages, and stories of change (outcomes).

Example: Young Women's Sexual Health Program
- Quantity: How much did we do? (Effort): # of workshops; # of field trips; # of new sexual health resources shared
- Quality: How well did we do it? (Effort): participants' feedback on the value of workshops, field trips, and shared resources
- Is anyone better off? (Effect): # of participants accessing new sexual health resources; # of participants who feel more confident about sexual health; # or % of participants who utilize protection/contraception; participants feel empowered about their own sexual health

Tools for Measurement Once you identify what you want to measure, you need to identify, then create or find, the tools to measure it.
- What is the best way to gather the information you need?
- What tools are appropriate for your participants and community?
- How will you get the best and richest information?

Tools for Measurement:
- Surveys
- Focus Groups
- Interviews
- Art Projects
- Census Data (to identify broader changes)

An Evaluation Plan lays out, for each activity, output, or outcome:
- Indicator: what you're going to measure
- Baseline: where your target population is at the beginning
- Targets: how much you expect they will change or achieve
- Data source and collection methods: what you will use to collect the data you need
- Frequency: when and how you will collect it
- Responsibility: who is in charge of implementing the plan
Example row:
- Indicator: # of youth accessing sexual health resources
- Baseline: 2 of 15 participants accessed sexual health resources
- Target: 10 of 15 participants accessing sexual health resources
- Data source: youth participants will share resource usage with us
- Collection method: survey
- Frequency: start of program, midway point, end of program
- Responsibility: program coordinator

Quiz Time! What did you learn?
1. What is the purpose of evaluation in community development?
2. What is the difference between outcomes and outputs?
3. What kinds of indicators exist? What's the value of each?
4. What does results-based accountability's performance accountability model measure?
5. What are some tools you can use to collect measurements?