How Do You Know When Your Programs Really Work? Evaluation Essentials for Program Managers Session 1: EVALUATION BASICS Anita M. Baker, Ed.D. Evaluation.

How Do You Know When Your Programs Really Work? Evaluation Essentials for Program Managers Session 1: EVALUATION BASICS Anita M. Baker, Ed.D. Evaluation Services Hartford Foundation for Public Giving, Nonprofit Support Program: BEC Bruner Foundation

These materials are for the benefit of any 501(c)(3) organization. They MAY be used in whole or in part, provided that credit is given to the Bruner Foundation. They may NOT be sold or redistributed, in whole or in part, for a profit. Copyright © 2012 by the Bruner Foundation. * Please see the supplementary materials for a sample agenda, activities, and handouts. Bruner Foundation, Rochester, New York

How to Use the Bruner Foundation Evaluation Essentials for Program Managers PowerPoint Slides

The Evaluation Essentials for Program Managers slides were developed as part of a Bruner Foundation special project by evaluation trainer Anita Baker (Evaluation Services) and jointly sponsored by the Hartford Foundation for Public Giving. They were tested initially with a single organization in Rochester, NY (Lifespan) as part of the Evaluation Support Project. The materials were then revised and re-tested with three nonprofit organizations as part of the Anchoring Evaluation project. The slides, intended for use in organizations that have already participated in comprehensive evaluation training, include key basic information about evaluation planning, data collection and analysis in three separate presentations. Organization officials or evaluation professionals working with nonprofit organization managers are encouraged to review the slides, modify the order, and add or remove content according to training needs. (Please note that the first session begins with a presentation of "results" as a framework to help trainees see the overall relevance of evaluative capacity, i.e., what they are working toward. There is an ancillary file with multiple "results" slides which can be substituted depending on trainee organization program focus.)

Additional Materials: To supplement these slides there are sample agendas, supporting materials for activities, and other handouts. There are "placeholder" slides, showing just a picture of a target with an arrow in the bullseye, that signify places where activities can be undertaken. Be sure to move or eliminate these depending on the planned agenda. Other, more detailed versions of the Evaluation Essentials materials are also available in Participatory Evaluation Essentials: An Updated Guide for Nonprofit Organizations and Their Evaluation Partners and the accompanying 6-session slide presentation. These materials are also available on the Bruner Foundation and Evaluation Services websites free of charge.

Whether you are an organization leader or an evaluation professional working to assist nonprofit organization staff, we hope that the materials provided here will support your efforts. When you have finished using the Evaluation Essentials for Program Managers series, have trainees take our survey. Bruner Foundation, Rochester, New York

What if you saw results like these?

Or results like these?
• More than 90% of case managers at all sites but location C indicated they had fully adopted the Program model (PM).
• Two-thirds or more of clients at all sites but location C reported improved quality of life.

Site | % of clients reporting improved quality of life since PM initiated
A | 69%
B | 73%
C | 40%
D | 71%
E | 66%

Or these?

What if you saw results like these?

Desired Outcome | Results
65% of clients show slowed or prevented disease progression at 6 and 12 months | 83%, 87%
75% of clients are fully engaged in HIV primary medical care | 96%
80% of clients show progress in 2 or more areas of service plan | 90%, 94%
50% of clients with mental health issues show improvement in mental health function by 6 months | 97%
75% of clients enrolled in SA treatment decrease use of drugs/alcohol after accessing services | 93%, 92%
90% of clients show improved or maintained oral health at 6 and 12 months | 92%, 94%
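To make the comparison of results against expectations concrete, here is a minimal sketch in Python of how actual results can be checked against desired-outcome targets. The outcome labels and percentages are invented placeholders loosely modeled on the table above, not actual program data.

```python
# Minimal sketch: compare actual results against pre-set desired-outcome targets.
# All outcome names and percentages are illustrative placeholders, not real data.

targets = {
    "slowed or prevented disease progression": 0.65,
    "fully engaged in HIV primary medical care": 0.75,
    "progress in 2 or more service plan areas": 0.80,
}

actuals = {
    "slowed or prevented disease progression": 0.83,
    "fully engaged in HIV primary medical care": 0.96,
    "progress in 2 or more service plan areas": 0.90,
}

for outcome, target in targets.items():
    actual = actuals[outcome]
    status = "target met" if actual >= target else "target NOT met"
    print(f"{outcome}: target {target:.0%}, actual {actual:.0%} ({status})")
```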

Logical Considerations for Planning
1. Think about the results you want.
2. Decide what strategies will help you achieve those results.
3. Think about what inputs you need to conduct the desired strategies.
4. Specify outcomes, identify indicators and targets.**
   ** Decide, in advance, how good is good enough.
5. Document how services are delivered.
6. Evaluate actual results (outcomes).

Outcomes and Indicators
• Outcomes: Changes in behavior, skills, knowledge, attitudes, condition, or status.
• Indicators: Specific, measurable characteristics or changes that represent achievement of an outcome.

Indicator: Reminders
• Many outcomes have more than one indicator.
• Identify the set of indicators that accurately signals achievement of an outcome (get stakeholder input).

Targets
Specify the amount or level of outcome attainment that is expected, hoped for, or required. Targets can be set...
• Relative to external standards (when available)
• Relative to past performance or similar programs
• Based on professional hunches

Target: Reminders
• Targets should be specified in advance; this requires buy-in.
• Word targets carefully so they are neither over- nor under-ambitious, make sense, and are in sync with time frames.
• If a target indicates a change in magnitude, be sure to specify initial levels and what counts as positive change.

Outcome, Indicator, Target - EXAMPLE
Outcome: Participants will be actively involved in afterschool activities.
Indicators (with targets):
• At least 500 students will participate each month.
• Students will attend 70% or more of all available sessions.
• At least half of participants will participate for 100 or more hours per semester.
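As a hedged illustration of how indicators and targets like these can be checked against raw program data, the sketch below uses a handful of invented attendance records; the record layout, names, and the tiny data set are assumptions for this example only.

```python
# Hypothetical sketch: check the afterschool indicators above against attendance data.
# Records, field layout, and values are invented for illustration.
from collections import defaultdict

# Each record: (student_id, month, sessions_attended, sessions_available, hours)
records = [
    ("s001", "2024-09", 10, 12, 28.0),
    ("s001", "2024-10", 11, 12, 30.5),
    ("s002", "2024-09", 6, 12, 15.0),
    ("s002", "2024-10", 5, 12, 12.5),
]

# Indicator 1: at least 500 students participate each month.
students_per_month = defaultdict(set)
for student, month, *_ in records:
    students_per_month[month].add(student)
meets_monthly_target = all(len(ids) >= 500 for ids in students_per_month.values())

# Roll up per-student attendance and hours for indicators 2 and 3.
attended, available, hours = defaultdict(int), defaultdict(int), defaultdict(float)
for student, month, att, avail, hrs in records:
    attended[student] += att
    available[student] += avail
    hours[student] += hrs

# Indicator 2: share of students attending 70% or more of available sessions.
share_above_70 = sum(attended[s] / available[s] >= 0.70 for s in attended) / len(attended)

# Indicator 3: do at least half of participants log 100 or more hours per semester?
meets_hours_target = sum(h >= 100 for h in hours.values()) / len(hours) >= 0.5

print(f"500+ students every month: {meets_monthly_target}")
print(f"Share attending >=70% of sessions: {share_above_70:.0%}")
print(f"At least half with 100+ hours: {meets_hours_target}")
```

The same pattern applies to any indicator: decide the threshold in advance, then compute it the same way each reporting period.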

Outcome, Indicator, Target - EXAMPLE
Outcome: Participants will learn important skills.
Indicators (with targets):
• 75% of campers' parents will report their child learned something new at camp.
• Two-thirds of campers enrolled in swimming will demonstrate competency in 3 basic strokes.
• Most campers (85%) will demonstrate mastery of all performance dance moves.

Outcome, Indicator, Target - EXAMPLE
Outcome: 65% of clients show slowed or prevented disease progression at 6 and 12 months.
Indicators: Sustained CD4 counts within 50 cells; viral loads below a specified level.

Outcome: A target percentage of clients with MH issues show improvement at 3 months, by 6 months, or at program end.
Indicator: Maintaining or decreasing mental health distress symptoms from baseline to follow-up, using the SDS.

Indicator Examples with Time References

Initial outcome: Teens are knowledgeable of prenatal nutrition and health guidelines.
• Indicator: Program participants are able to identify food items that are good sources of major dietary requirements.

Intermediate outcome: Teens follow proper nutrition and health guidelines.
• Indicators: Participants are within proper ranges for prenatal weight gain; participants abstain from smoking; participants take prenatal vitamins.

Longer-term outcome: Teens deliver healthy babies.
• Indicator: Newborns weigh at least 5.5 pounds and score 7 or above on the APGAR scale.

Outcomes, indicators and targets activity

How do you know when your programs really work?.... EVALUATION

Program Evaluation: Thoughtful, systematic collection and analysis of information about activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties and inform decisions.

What do you need to do to conduct evaluation?
• Specify key questions
• Specify an approach (develop an evaluation design)
• Apply evaluation logic
• Collect and analyze data
• Summarize and share findings

Key Questions
• Focus and drive the evaluation.
• Should be carefully specified and agreed upon in advance of other evaluation work.
• Generally represent a critical subset of the information that is desired.

Evaluation Question Criteria
• It is possible to obtain data to address the questions.
• There is more than one possible "answer" to the question.
• The information to address the questions is wanted and needed.
• It is known how the resulting information will be used internally (and externally).
• The questions are aimed at changeable aspects of activity.

Participants identify questions using criteria

How do you know when your programs really work?.... EVALUATION

Program Evaluation: Thoughtful, systematic collection and analysis of information about activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties and inform decisions.

Types, Focuses and Timing of Evaluation

Type | Focus | Timing
Monitoring | Compliance with terms of a grant, or program design | Period of the grant or program duration
Formative | Implementation | While the program is operating
Formative | Short/mid-term outcomes | While the program is operating, at certain key junctures
Summative | Long-term outcomes | As or after the program ends

Evaluators

Characteristics of Effective Evaluators
• Basic knowledge of the substantive area being evaluated
• Knowledge about and experience with program evaluation
  - The field is unregulated
  - The first graduate-level training programs in evaluation are recent
• Good references from sources you trust
• Personal style and approach fit (MOST IMPORTANT)

Evaluation Strategy Clarification
• All evaluations are:
  - Partly social
  - Partly political
  - Partly technical
• Both qualitative and quantitative data can be collected and used, and both are valuable.
• There are multiple ways to address most evaluation needs.
• Different evaluation needs call for different designs, data, and data collection strategies.

Evaluation Purposes
Evaluations are conducted to:
• Render judgment
• Inform decision-making
• Facilitate improvements
• Generate knowledge
Specify purposes at the earliest stages of evaluation planning, and obtain input from stakeholders.

Who are Evaluation Stakeholders, and Why Do They Matter?
• Decision-makers
• Information-seekers
• Those directly involved with the evaluation subject
• Most programs/strategies have multiple stakeholders: organization managers, clients and/or their caregivers, program staff, program funders, and partner organizations.
• Stakeholders have diverse, often competing interests related to programs and evaluation.
• Certain stakeholders are the primary intended users of evaluation.