Evelyn Gonzalez Program Evaluation

AR Cancer Coalition Summit XIV March 12, 2013 MAKING A DIFFERENCE Evaluating Programmatic Efforts

 Overview of evaluation  Define SMART objectives for your goals  Know how to use different methods of evaluation  Be more willing to evaluate your efforts OBJECTIVES

…the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming. (Patton, Utilization Focused Evaluation, 1997) WHAT IS PROGRAM EVALUATION?

 Did the program/intervention work?  Was it worth it?  What worked; what didn’t?  Who did we reach?  Did we get our money’s worth? WHY EVALUATE?

WHEN SHOULD WE BEGIN EVALUATION?

An evaluation plan is the “blueprint” for the evaluation:  What will be evaluated  What information will be collected  When it will be collected  What will be done with the results EVALUATION PLAN

CDC FRAMEWORK FOR EVALUATION

[Diagram: the program cycle in three phases, with the community/audience and stakeholders involved throughout. Planning Phase: involve stakeholders, establish goals and objectives, establish a baseline, identify an evidence-based program (EBP). Implementation Phase: implement the program, gather data as you go, monitor. Evaluation Phase: collect and analyze data at the end of the program, share results with the community and stakeholders.]

 The “grand reason” for engaging in your public health effort  Span 3 or more years  State the desired end result of the program. GOALS: DEFINITION

 More specific than goals.  They state how the goals will be achieved in a certain timeframe.  Well-written objectives are SMART:  Specific  Measurable  Achievable  Realistic and Relevant  Time-framed OBJECTIVES: DEFINITION

 Specific  Who are you reaching (priority audience)?  What intervention will you use?  Where (setting)? S.M.A.R.T.

 Measurable  Dosing: how many times will you deliver the intervention?  What is the expected outcome?  Increase of X% following the intervention  Decrease in smoking of X% S.M.A.R.T.

 Attainable  Is your intervention feasible?  Realistic and Relevant  Does the objective match the goal?  Is it an evidence-based program (EBP)? S.M.A.R.T.

 Time-framed  By when do you anticipate the change?  End of the session  3, 6, or 9 months  5 years S.M.A.R.T.

 You are working on an intervention that will increase awareness about breast cancer risk  Objective 1: Participants will be aware of the major risk factors for developing breast cancer.  How can this be re-written to be SMART? SMART OBJECTIVE EXERCISE

 Original:  Participants will be aware of the major risk factors for developing breast cancer.  SMART Objective:  On a post-test following the intervention, participants will be able to identify 3 major risk factors for developing breast cancer. SMART OBJECTIVE EXERCISE

 Original:  This program will increase screening for colorectal cancer in Arkansas.  SMART:  Colorectal cancer screening will increase by 5% over the prior year among age-appropriate males in Arkansas. RE-WRITTEN:
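Because the rewritten objective states a measurable change against a baseline, it can be checked directly once the two years of screening figures are available. Below is a minimal, illustrative Python sketch; the rates shown are placeholders, not Arkansas data.

```python
# Illustrative only: check a "5% increase over the prior year" objective.
def percent_change(prior_rate, current_rate):
    """Relative change in a screening rate, expressed as a percentage."""
    return (current_rate - prior_rate) / prior_rate * 100

# Placeholder screening rates (proportion of age-appropriate males screened).
prior_rate = 0.52
current_rate = 0.55

change = percent_change(prior_rate, current_rate)
print(f"Screening rate changed by {change:.1f}% over the prior year")
print("Objective met" if change >= 5 else "Objective not met")
```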

 Objective 1: Public Education for Breast Cancer Screening – Increase knowledge and improve attitudes of all women regarding the importance of breast cancer screening  Strategy 1 – Promote campaigns to educate the public about the importance of mammography.  Action 1 – Increase awareness among all women 40 and older of the importance of regular breast cancer screening GOAL: PROMOTE AND INCREASE THE APPROPRIATE UTILIZATION OF HIGH-QUALITY BREAST CANCER SCREENING

 Planning—Develop the questions, consult with the program stakeholders or resources, make a timeline  Data Collection—Pilot testing. How will the questions be asked? Who will ask them?  Data Analysis—Who will analyze the data and how?  Reporting—Who will report and how? Who will receive the data and when? How will it affect the program?  Application—How could your results be applied in other places? THE EVALUATION PROCEDURE

 Look at the evaluation methods used in the original EBP.  When discussing evaluation, think about these questions:  What is important to know?  What do you need to know versus what is nice to know?  What will be measured and how?  How will this information be used? PLANNING FOR EVALUATION

 Indicators or measures are the observable and measurable data that are used to track a program’s progress in achieving its goals.  Monitoring (program or outcome monitoring, for example) refers to ongoing measurement activity. SOME DEFINITIONS…

 Process evaluation can find problems early on in the program.  It includes an assessment of the staff, budget review, and how well the program is doing overall.  For this kind of evaluation, it may be useful to keep a log sheet to record each of your activities. From Windsor et al., 1994 PROCESS EVALUATION
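A log sheet like the one mentioned above can be as simple as one structured record per activity. The sketch below is illustrative only; the field names and the example entry are placeholders, not part of the original program.

```python
import csv
import os
from datetime import date

# Illustrative log-sheet fields for process evaluation; adjust to your own program.
FIELDS = ["date", "activity", "staff", "attendance", "materials_used", "notes"]

def log_activity(path, **entry):
    """Append one activity record to a CSV log sheet, adding a header row if the file is new."""
    write_header = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow(entry)

# Hypothetical example entry.
log_activity("activity_log.csv",
             date=str(date.today()),
             activity="Community education session",
             staff="Health educator",
             attendance=18,
             materials_used="Slide deck, pre/post survey",
             notes="Session ran 15 minutes long")
```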

 Impact evaluation can tell if the program has a short-term effect on the behavior, knowledge, and attitudes of your population.  It also measures the extent to which you have met your objectives. From Green and Kreuter, 1991 IMPACT EVALUATION

 Outcome evaluation looks to see if the long-term program goals were met.  These goals could be changes in rates of illness or death, as well as in the health status of your population. From McKenzie & Smeltzer, 1997 OUTCOME EVALUATION

 Identify Program Goals  For each goal:  Identify Process Objectives  Identify Outcome Objectives  For each objective:  Identify Indicators  Identify Data Source  Plan Data Collection  Plan Data Analysis APPLICATION TO YOUR PROGRAM:
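One way to keep this outline manageable is to capture it as a simple nested structure, one entry per objective. The sketch below is a Python illustration; the goal, objectives, indicators, and data sources are placeholders, not the coalition's actual plan.

```python
# Illustrative evaluation-plan skeleton; every value here is a placeholder.
evaluation_plan = {
    "goal": "Promote and increase appropriate utilization of breast cancer screening",
    "objectives": [
        {
            "type": "process",
            "statement": "Deliver 12 community education sessions within 12 months",
            "indicator": "Number of sessions delivered",
            "data_source": "Activity log sheets",
            "data_collection": "Staff log each session as it is delivered",
            "data_analysis": "Count sessions and compare to the target of 12",
        },
        {
            "type": "outcome",
            "statement": "Participants identify 3 major breast cancer risk factors on a post-test",
            "indicator": "Percent of participants naming 3 or more risk factors",
            "data_source": "Pre/post surveys",
            "data_collection": "Surveys administered at each session",
            "data_analysis": "Compare pre- and post-test scores",
        },
    ],
}

for obj in evaluation_plan["objectives"]:
    print(f"{obj['type'].upper()} objective: {obj['statement']} -> indicator: {obj['indicator']}")
```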

DATA COLLECTION METHODS Surveys Interviews Focus Groups Observation Document Review

 You may develop a way to compare the baseline data from the needs assessment with the final outcome of your program.  Pre/Post survey in an education session.  This will let you see if you have achieved your objectives. PRE- AND POST-EVALUATION
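As a minimal sketch of that pre/post comparison, assuming each participant has a numeric knowledge score before and after the session (the scores below are made up), the analysis can be as simple as an average change, with a paired t-test added if a significance check is wanted.

```python
from statistics import mean
from scipy import stats  # optional; used only for the paired t-test

# Hypothetical knowledge scores (0-10) for the same participants before and after a session.
pre = [4, 5, 3, 6, 4, 5, 2, 5]
post = [7, 6, 5, 8, 6, 7, 5, 6]

print(f"Mean pre-test score:  {mean(pre):.1f}")
print(f"Mean post-test score: {mean(post):.1f}")
print(f"Average change:       {mean(b - a for a, b in zip(pre, post)):.1f}")

# Paired t-test: is the change larger than chance alone would suggest?
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```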

 Primary sources  Quantitative: Surveys/questionnaires  Qualitative: Focus groups, public meetings, direct observation  Qualitative: In-depth interviews with community leaders, interviews with other program planners. INFORMATION COLLECTION

 Will depend on which EBP/intervention is selected  Answer these questions:  What specific behaviors do I want my audience to acquire or enhance?  What information or skills do they need to learn to act in a new way?  What resources do I need to carry out the program?  What methods would best help me meet my objectives? STRATEGIES

USING MIXED DATA SOURCES/METHODS  Involves using more than one data source and/or data collection method.

 Your objectives should be measurable so that they can be evaluated.  The evaluation should be in line with your objectives.  Try not to make up new things to evaluate. PROGRAM OBJECTIVES AND EVALUATION

 You may want to do a pilot test in order to evaluate the effect of your program.  A pilot test is a practice run using a small group who are similar to your target audience. PILOT TESTING

 Evidence-based programs have already done some type of evaluation.  Look to see how the program was evaluated before. Try to use the same methods.  You do not have to evaluate everything! REPLICATING THE EVALUATION

MONITORING PROGRESS

NOW THAT YOU’VE COLLECTED THE DATA, WHAT DO YOU DO WITH IT?  Analyzing data  Who  When  How  Interpretation of results and sharing findings

 Must be able to answer this!  Do not just look for the good outcomes  Learn from what didn’t work  Share both the positive and negative outcomes SO WHAT?

DEVELOPING RECOMMENDATIONS Your evaluation’s recommendations should be:  Linked to the original goals/SMART objectives.  Based on answers to your evaluation questions.  Informed by stakeholder input.  Tailored to the end users of the evaluation results to increase ownership and motivation to act.

SHARING RECOMMENDATIONS Community  Executive Summary  Final Report  Newsletter article(s)  Website article  Town hall meeting(s)  Radio interviews  Local newspapers Institution & Yourself  Executive Summary  Final Report  Journal articles  Professional conferences  Poster sessions  Meetings with colleagues

TIPS & CONSIDERATIONS  Consult with partners with evaluation experience  Budget 10-15% for evaluation  Staffing  Build a database  Analysis  Consider pilot testing your program  Pilot test your evaluation method & tool(s)

Trust yourself. You know more than you think you do! (Benjamin Spock)