Fundamentals of Evaluation for Public Health Programs
ROBERT FOLEY, M.ED.
NIHB TRIBAL PUBLIC HEALTH SUMMIT
MARCH 31, 2014


Objectives
By the end of the workshop, participants will be able to:
Define common evaluation terms
Explain the difference between outcome and process evaluation
Describe methods for different types of evaluation
Construct evaluation indicators
Identify resources to assist with evaluation

Definition of Evaluation
The systematic application of methods to collect and analyze information about the activities, characteristics, and outcomes of programs, with the intent of furthering their development and improvement.

Reasons to Evaluate
Accountability to the funder, to the staff, to the clients, and to the community
Evaluation can tell us if the most vulnerable populations are receiving appropriate and effective services
Demonstrate effectiveness to funders, administration, community stakeholders, community leadership, and clients
Improve implementation and effectiveness of programs
Better manage limited resources
Document program accomplishments
Justify current program funding
Document program development and activities to help ensure successful replication and future planning

It Is All Connected
Program Development, Program Implementation, and Program Evaluation are all connected in an ongoing cycle.

Relationship Between Planning, Implementation, and Outcomes
Planning → Implementation → Outcomes

Implications of Not Knowing How an Intervention Was Implemented
Planning → ??? → Outcomes

Formative Evaluation
◦Collects data describing the needs of the population and the factors that put them at risk
Answers questions such as:
How should the intervention be designed or modified to address population needs?
Is this standardized program appropriate for the needs of our community?

Process Monitoring
◦Collects data describing the characteristics of the population served, the services provided, and the resources used to deliver those services
Answers questions such as:
What services were delivered?
Who was served?
What resources were used?

Process Evaluation
◦Collects and analyzes detailed data about how the program was delivered, differences between the intended population and the population served, and access to the intervention
Answers questions such as:
Was the intervention implemented as intended?
Did the program reach the intended audience?
What barriers did clients experience in accessing the program?
How did the population perceive the quality of the program?

Outcome Monitoring
◦Collects information about client outcomes before and after the intervention, such as knowledge, attitudes, skills, or behaviors
Answers questions such as:
What changes took place?
What changes did our participants experience?
Did the expected outcomes occur?
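To make the pre/post comparison concrete, here is a minimal, hypothetical sketch of how outcome-monitoring data might be summarized; the participant IDs and scores are invented for illustration and are not from the presentation.

```python
# Hypothetical outcome-monitoring summary: compare pre- and post-intervention
# knowledge scores for each participant and report the average change.
pre_scores = {"P01": 55, "P02": 60, "P03": 45, "P04": 70}   # pre-test scores (invented)
post_scores = {"P01": 75, "P02": 80, "P03": 65, "P04": 85}  # post-test scores (invented)

# Only participants with both a pre-test and a post-test are compared.
changes = {pid: post_scores[pid] - pre_scores[pid]
           for pid in pre_scores if pid in post_scores}

average_change = sum(changes.values()) / len(changes)
improved = sum(1 for delta in changes.values() if delta > 0)

print(f"Matched pre/post tests: {len(changes)}")
print(f"Average change in knowledge score: {average_change:+.1f} points")
print(f"Participants who improved: {improved} of {len(changes)}")
```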

Outcome Evaluation
◦Collects data about outcomes before and after the intervention, both for clients and for a similar group that did not participate in the intervention being evaluated
Answers questions such as:
Did the program cause the outcomes?
How replicable is this?

Impact Evaluation
◦Collects data about health conditions at the reservation, state, regional, and/or national levels
Answers questions such as:
What long-term effects does all of the related programming have on the health condition?

Evaluation Related to Planning, Implementation, and Outcomes
Planning: Formative Evaluation
Implementation: Process Monitoring, Process Evaluation
Outcomes: Outcome Monitoring, Outcome Evaluation, Impact Evaluation

Building Upon One Another
Formative Evaluation: planning effective programs and interventions
Process Monitoring: determining what services were delivered to whom
Process Evaluation: determining if the program was implemented as intended
Outcome Monitoring: determining if the program achieved its outcome objectives
Outcome Evaluation: determining if the program caused the outcomes
Impact Evaluation: determining broader impacts

Data Collection
Process Monitoring: sign-in sheets; demographic & participant information; participant counts
Process Evaluation: satisfaction surveys & questionnaires; focus groups; comparison of what happened vs. what was intended
Outcome Monitoring: pre/post tests on knowledge, attitudes, etc.; case reviews; long-term follow-up
Outcome Evaluation: experimental and control groups; replication using the same outcome monitoring
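As a simple illustration of tallying process-monitoring data, here is a minimal sketch; the sign-in records, names, and age groupings below are hypothetical and only show the kind of counts a sign-in sheet can yield.

```python
# Hypothetical process-monitoring tally built from sign-in sheet records:
# sessions delivered, unique participants served, and a simple age breakdown.
from collections import Counter

sign_ins = [  # one record per person per session attended (invented data)
    {"name": "Participant A", "age": 15, "session": 1},
    {"name": "Participant A", "age": 15, "session": 2},
    {"name": "Participant B", "age": 14, "session": 1},
    {"name": "Participant C", "age": 17, "session": 2},
]

sessions_delivered = len({record["session"] for record in sign_ins})
unique_people = {record["name"]: record for record in sign_ins}  # de-duplicate by name
age_groups = Counter("13-15" if r["age"] <= 15 else "16-18" for r in unique_people.values())

print(f"Sessions delivered: {sessions_delivered}")
print(f"Unique participants served: {len(unique_people)}")
print(f"Participants by age group: {dict(age_groups)}")
```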

Measures
◦How and what do we measure so that we can show if we were successful?
◦The number of people enrolling in treatment
◦The knowledge of community members
Indicators
◦What indicates that we were successful in our program?
◦An increase in the number of people enrolling in methamphetamine treatment
◦An increase in the knowledge of community members regarding the impact of methamphetamine on the body

Kinds of Indicators
Process Indicators
Output Indicators
Outcome Indicators (Short Term and Long Term)

CASE STUDY
We are designing a project that will target methamphetamine prevention through the implementation of a structured 8-session intervention. We will recruit youth to attend the intervention sessions and go through all 8 sessions. The intervention is designed to teach them about methamphetamine and raise their confidence to say no to peer pressure to use. We will have a series of 3 total cohorts (for a total of youth).

Evaluation Planning
Example:
◦Process Indicator: An increase in the number of people enrolling in the intervention
◦Process Objective: By the end of the first year, there will be a 10% increase in the number of youth enrolling in the intervention
Example:
◦Outcome Indicator: A decrease in the self-reported use of methamphetamine by youth on the Reservation
◦Outcome Objective: Within two years after the end of the intervention, there will be a 20% decrease in the self-reported use of methamphetamine by youth between the ages of 13 and 18 on X Reservation
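A minimal sketch of how the process objective above could be checked against monitoring data; the baseline and year-one enrollment counts are hypothetical placeholders, not figures from the case study.

```python
# Hypothetical check of the process objective: at least a 10% increase in
# enrollment by the end of the first year, relative to a baseline count.
baseline_enrollment = 40   # youth enrolled before the project began (invented)
year_one_enrollment = 46   # youth enrolled by the end of year one (invented)
target_increase = 0.10     # the 10% increase stated in the process objective

percent_change = (year_one_enrollment - baseline_enrollment) / baseline_enrollment
objective_met = percent_change >= target_increase

print(f"Enrollment change: {percent_change:.1%}")
print(f"Process objective (at least {target_increase:.0%} increase) met: {objective_met}")
```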

Activity: Let’s Evaluate this Workshop! 22 Outcome Evaluation Outcome Monitoring Process Evaluation Process Monitoring Formative Evaluation Activities or tools Questions to ask

Evaluation Planning

Strategies to Promote Utilization
Develop buy-in among evaluation stakeholders
Clearly identify the intended users of the evaluation data
Identify evaluation questions meaningful to the intended users
Decide how the data will be used before the evaluation is conducted
Present data in a user-friendly format
Define clear responsibilities for data collection, analysis, storage, and reporting

Evaluation Technical Assistance Resources
National Indian Health Board
NCUIH
Tribal Epidemiology Centers
Websites
Volunteers (e.g., skilled board members)
TCU and university faculty
Program officers
Evaluation consultants

Working with an Evaluation Consultant
Select a consultant who knows the topic
Select a consultant who is culturally competent and can communicate clearly with different stakeholders (e.g., program managers, front-line staff, community members)
Clarify the roles and responsibilities of the consultant and stakeholders
Establish a workplan and timeline with deliverables for the consultant
Meet regularly with the consultant to monitor progress
Use the consultant to build internal capacity

Take Home Messages
Evaluation is only valuable if it is used
Evaluation should be part of program implementation
All team members should be involved in evaluation efforts
Planning is key
Evaluation is FUN!

Questions?

Thank you!
ROBERT FOLEY