Evaluating Tobacco Control/Policy Programs Kimber Richter, PhD, MPH University of Kansas Medical Center Departments of Preventive Medicine and Public Health.

Evaluating Tobacco Control/Policy Programs
Kimber Richter, PhD, MPH
University of Kansas Medical Center, Departments of Preventive Medicine and Public Health (KU-MPH)
For more info, or for the worksheets included in this talk, contact me.
Adapted from: Basic Guide to Program Evaluation, written by Carter McNamara, MBA, PhD, Authenticity Consulting, LLC; copyright Carter McNamara.

2 Objectives
Understand basic concepts of program evaluation
Learn how to be a smart consumer if you hire consultants
Build a realistic, pragmatic program evaluation
Much of this talk is taken from links on Betty Jung's Evaluation Website:

3 What it is, isn't
Program evaluation is carefully getting information to make decisions about programs: needs assessments, accreditation, cost/benefit analysis, effectiveness, efficiency, formative, summative, goal-based, process, outcomes
You don't have to be an expert to design or implement one
Many evaluations generate data that is unimportant or irrelevant
Bust some myths:
–It is not always, or solely, about the success or failure of a program
–It is not so complicated that it requires expert knowledge of statistics, reliability, and validity
It does require thinking about what information you need to make decisions about program issues or needs
It does require a commitment to understanding what is really going on

4 What can it do
Help you understand what the program is really about
Produce data that is helpful with PR or promoting services
Understand impact of program on clients
Help compare programs to decide which should be kept
Examine, describe programs for dissemination elsewhere

5 Planning your evaluation
Everybody wants to know everything about the program
Usually resources are limited and force you to choose what to evaluate
Need to decide what you need to know in order to make important decisions – how are we going to act on this data?
Like physicians – don't do a test unless it will alter the course of treatment
The more focused the evaluation, the more efficient it can be
Better to do a good job evaluating 1-2 things than a sloppy job on 6
Trade-offs in breadth/depth of evaluation

6 Who’s doing an evaluation soon? Or writing a grant application?

7 Basic ingredients: organization and programs
Need an organization:
Well-defined mission and goals
Board and staff working well
No major crises that should be dealt with instead of the evaluation
Evaluate programs in light of the organization's mission and resources
Need programs:
Need to know what your clients need (needs assessment)
Useful to define programs in "logic model" terms, to facilitate evaluation:
–Inputs – what is needed to run the program (money, facilities, customers, staff)
–Process – how the program is carried out (counseling, case management, daycare)
–Outputs – units of service (# customers served, clients counseled, rallies held)
–Outcomes – impacts on customers/clients (increased mental health, richer artistic appreciation)

8 Three major types of evaluations
Goals-based evaluation
Programs are established to meet one or more specific goals, often described in the original program plans. Evaluate the extent to which programs meet these predetermined goals or objectives.
Process-based evaluation
Fully understanding how a program works – how does it produce results? Useful if:
–programs are long-standing and have changed over the years
–employees or customers report a large number of complaints about the program
–there appear to be large inefficiencies in delivering program services
–you need to accurately portray how a program truly operates (e.g., for replication elsewhere)
Outcomes-based evaluation
Does the program produce outcomes/benefits for participants? The United Way ( has an excellent site for this

9 Describe program: ____________________________
Worksheet grid: for each logic-model component (Inputs, Process, Outputs, Outcomes), fill in a row for Description, Evaluation Question, and Measure.
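One way to work with this worksheet is to hold it as a simple data structure that the evaluation team fills in and revisits. The sketch below is illustrative only: the hypothetical tobacco quitline program and every entry in it are assumptions added for this example, not content from the talk.

# Minimal sketch of the slide-9 worksheet as a Python dictionary.
# The program (a hypothetical state tobacco quitline) and all entries
# are illustrative assumptions, not material from the original slides.
worksheet = {
    "Inputs": {
        "description": "Funding, phone counselors, NRT vouchers",
        "evaluation_question": "Were the planned resources actually available?",
        "measure": "Budget spent; counselor FTEs on staff",
    },
    "Process": {
        "description": "Proactive telephone counseling sessions",
        "evaluation_question": "Was counseling delivered as planned?",
        "measure": "Mean number of completed calls per enrollee",
    },
    "Outputs": {
        "description": "Callers enrolled and counseled",
        "evaluation_question": "How many smokers did the service reach?",
        "measure": "Number of enrollees per quarter",
    },
    "Outcomes": {
        "description": "Enrollees quit smoking",
        "evaluation_question": "Did participants quit and stay quit?",
        "measure": "Self-reported 30-day abstinence at 6-month follow-up",
    },
}

# Print one row of the worksheet for a quick review.
for component, row in worksheet.items():
    print(f"{component} measure: {row['measure']}")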

10 Questions to answer before starting
1. For what purposes is the evaluation being done, i.e., what do you want to be able to decide as a result of the evaluation?
2. Who are the audiences for the information from the evaluation, e.g., customers, bankers, funders, board, management, staff, clients?
3. What kinds of information are needed to make the decision you need to make and/or enlighten your intended audiences, e.g., information to really understand the process of the product or program (its inputs, activities, and outputs), the customers or clients who experience the product or program, strengths and weaknesses of the product or program, benefits to customers or clients (outcomes), how the product or program failed and why?
4. From what sources should the information be collected, e.g., employees, customers, clients, groups of customers or clients and employees together, program documentation?
5. How can that information be collected in a reasonable fashion, e.g., questionnaires, interviews, examining documentation, observing customers or employees, conducting focus groups among customers or employees?
6. When is the information needed (so, by when must it be collected)?
7. What resources are available to collect the information?

11 Goals-based evaluation
1. How were the program goals (and objectives, if applicable) established? Was the process effective?
2. What is the status of the program's progress toward achieving the goals?
3. Will the goals be achieved according to the timelines specified in the program implementation or operations plan? If not, then why?
4. Do personnel have adequate resources (money, equipment, facilities, training, etc.) to achieve the goals?
5. How should priorities be changed to put more focus on achieving the goals? (Depending on the context, this question might be viewed as a program management decision more than an evaluation question.)
6. How should timelines be changed? (Be careful about making these changes – know why efforts are behind schedule before timelines are changed.)
7. How should goals be changed? (Be careful about making these changes – know why efforts are not achieving the goals before changing the goals.) Should any goals be added or removed? Why?
8. How should goals be established in the future?

12 Process-based evaluation
1. How do customers or clients come into the program?
2. What is required of customers or clients?
3. Was the program carried out according to your original plans? (Michael Patton, a prominent researcher, writer, and consultant in evaluation, suggests that the most important type of evaluation to carry out may be this implementation evaluation, to verify that your program ended up being implemented as you originally planned.)
4. How do employees select which products or services will be provided to the customer or client?
5. What is the general process that customers or clients go through with the product or program?
6. What do customers or clients consider to be strengths of the program?
7. What do staff consider to be strengths of the product or program?
8. What typical complaints are heard from employees and/or customers?
9. What do employees and/or customers recommend to improve the product or program?

13 Outcomes-based evaluation
1. Identify the major outcomes that you want to examine or verify for the program under evaluation.
2. Choose the outcomes that you want to examine, prioritize them, and, if your time and resources are limited, pick the top two to four most important outcomes to examine for now.
3. For each outcome, specify what observable measures, or indicators, will suggest that you're achieving that key outcome with your clients. This is often the most important and enlightening step in outcomes-based evaluation, but it is also often the most challenging and confusing step.
4. Specify a "target" goal for clients, i.e., what number or percent of clients you commit to achieving specific outcomes with, e.g., "increased self-reliance (an outcome) for 70% of adult African American women living in the inner city of Minneapolis, as evidenced by the following measures (indicators)..."
5. Identify what information is needed to show these indicators, e.g., you'll need to know how many clients in the target group went through the program, how many of them reliably undertook their own transportation to work and stayed off drugs, etc.
6. Decide how that information can be efficiently and realistically gathered (see "Selecting which methods to use" below). Consider program documentation, observation of program personnel and clients in the program, questionnaires and interviews about clients' perceived benefits from the program, case studies of program failures and successes, etc.
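The target-setting in step 4 and the information needs in step 5 reduce to a simple calculation once indicator data are in hand. The sketch below is a minimal, hypothetical illustration: the client records and the 70% target are made-up assumptions, not data from any program described in this talk.

# Minimal sketch: checking an outcome "target goal" (step 4) against
# indicator data (step 5). All records and the target value are made up.
clients = [
    {"id": 1, "completed_program": True,  "achieved_outcome": True},
    {"id": 2, "completed_program": True,  "achieved_outcome": False},
    {"id": 3, "completed_program": True,  "achieved_outcome": True},
    {"id": 4, "completed_program": False, "achieved_outcome": False},
    {"id": 5, "completed_program": True,  "achieved_outcome": True},
]
target = 0.70  # e.g., "70% of clients achieve the outcome"

# Limit the calculation to clients who actually went through the program.
completers = [c for c in clients if c["completed_program"]]
achievers = [c for c in completers if c["achieved_outcome"]]
rate = len(achievers) / len(completers) if completers else 0.0

print(f"{len(achievers)} of {len(completers)} program completers achieved "
      f"the outcome ({rate:.0%}); target was {target:.0%}")
print("Target met" if rate >= target else "Target not met")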

14 Four levels of evaluation
Usually, the farther down, the more useful:
Reactions and feelings – poor measures of outcome, but important for other types of evaluation
Knowledge or learning – changes in attitudes/perceptions or knowledge
Skills – clients have applied the learning to enhance behavior
Effectiveness – improved performance/health status

15 Methods to collect information
Questionnaires, surveys, checklists
Overall purpose: when you need to quickly and/or easily get lots of information from people in a non-threatening way
Advantages: can be completed anonymously; inexpensive to administer; easy to compare and analyze; can administer to many people; can get lots of data; many sample questionnaires already exist
Challenges: might not get careful feedback; wording can bias the client's responses; impersonal; in surveys, may need a sampling expert; doesn't get the full story
Interviews
Overall purpose: when you want to fully understand someone's impressions or experiences, or learn more about their answers to questionnaires
Advantages: get the full range and depth of information; develops a relationship with the client; can be flexible with the client
Challenges: can take much time; can be hard to analyze and compare; can be costly; the interviewer can bias the client's responses
Documentation review
Overall purpose: when you want an impression of how the program operates without interrupting it; drawn from a review of applications, finances, memos, minutes, etc.
Advantages: get comprehensive and historical information; doesn't interrupt the program or the client's routine in the program; information already exists; few biases about the information
Challenges: often takes much time; information may be incomplete; need to be quite clear about what you are looking for; not a flexible means to get data (data restricted to what already exists)

16 Methods to collect information, cont'd
Observation
Overall purpose: to gather accurate information about how a program actually operates, particularly about processes
Advantages: view the operations of a program as they are actually occurring; can adapt to events as they occur
Challenges: can be difficult to interpret observed behaviors; can be complex to categorize observations; can influence the behavior of program participants; can be expensive
Focus groups
Overall purpose: to explore a topic in depth through group discussion, e.g., reactions to an experience or suggestion, understanding common complaints, etc.; useful in evaluation and marketing
Advantages: quickly and reliably get common impressions; can be an efficient way to get much range and depth of information in a short time; can convey key information about programs
Challenges: can be hard to analyze responses; need a good facilitator for safety and closure; difficult to schedule 6-8 people together
Case studies
Overall purpose: to fully understand or depict a client's experience in a program, and to conduct comprehensive examination through cross-comparison of cases
Advantages: fully depicts the client's experience in program input, process, and results; powerful means to portray the program to outsiders
Challenges: usually quite time-consuming to collect, organize, and describe; represents depth of information rather than breadth

17 Overall goals in selecting methods
1. What information is needed to make current decisions about a product or program?
2. Of this information, how much can be collected and analyzed in a low-cost and practical manner, e.g., using questionnaires, surveys, and checklists?
3. How accurate will the information be (see the table above for the disadvantages of each method)?
4. Will the methods get all of the needed information?
5. What additional methods should and could be used if additional information is needed?
6. Will the information appear credible to decision makers, e.g., to funders or top management?
7. Will the nature of the audience conform to the methods, e.g., will they fill out questionnaires carefully, engage in interviews or focus groups, let you examine their documentation, etc.?
8. Who can administer the methods now, or is training required?
9. How can the information be analyzed?

18 Analyzing data
Start by reviewing your evaluation goals.
Analyzing quantitative information:
Store the master copy of the data away; use a copy for making edits, cutting and pasting, etc.
Tabulate the information, i.e., add up the number of ratings, rankings, yes's, and no's for each question.
Compute means or frequencies.
Analyzing qualitative information:
1. Read through all the data.
2. Organize comments into similar categories, e.g., concerns, suggestions, strengths, weaknesses, similar experiences, program inputs, recommendations, outputs.
3. Label the categories or themes, e.g., concerns, suggestions, etc.
4. Attempt to identify patterns, or associations and causal relationships, in the themes.
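As a concrete illustration of the tabulation step, the sketch below counts the ratings and computes a mean for one questionnaire item, then tallies open-ended comments that have already been coded into themes. The responses, comments, and theme labels are made-up assumptions for this example only.

# Minimal sketch of the quantitative and qualitative analysis steps above.
# The ratings and coded comments are made-up data for illustration only.
from collections import Counter
from statistics import mean

# Quantitative: responses to one questionnaire item (1 = poor ... 5 = excellent)
ratings = [4, 5, 3, 4, 2, 5, 4, 4, 3, 5]
frequencies = Counter(ratings)  # tabulate: how many of each rating
print("Frequencies:", dict(sorted(frequencies.items())))
print("Mean rating:", round(mean(ratings), 2))

# Qualitative: open-ended comments already coded into themes (steps 2-3 above)
coded_comments = [
    ("Wish sessions were offered in the evening", "suggestion"),
    ("The counselor really listened to me", "strength"),
    ("Hard to find parking at the clinic", "concern"),
    ("Add more follow-up calls", "suggestion"),
]
theme_counts = Counter(theme for _comment, theme in coded_comments)
print("Comments per theme:", dict(theme_counts))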

19 Outline of report
1. Title page (name of the organization that is being evaluated; date)
2. Table of contents
3. Executive summary (one-page, concise overview of findings and recommendations)
4. Purpose of the report (what type of evaluation(s) was conducted, what decisions are being aided by the findings of the evaluation, who is making the decisions, etc.)
5. Background about the organization and the product/service/program being evaluated
6. Overall evaluation goals (e.g., what questions are being answered by the evaluation)
7. Methodology
8. Interpretations and conclusions (from analysis of the data/information)
9. Recommendations (regarding the decisions that must be made about the product/service/program)
10. Appendices: the content of the appendices depends on the goals of the evaluation report, e.g.:

20 Avoid common pitfalls
1. Don't balk at evaluation because it seems far too "scientific." It's not.
2. There is no "perfect" evaluation design. Don't worry about the plan being perfect. It's far more important to do something than to wait until every last detail has been tested.
3. Work hard to include some interviews in your evaluation methods. Questionnaires don't capture "the story," and the story is usually the most powerful depiction of the benefits of your services.
4. Don't interview just the successes. You'll learn a great deal about the program by understanding its failures, dropouts, etc.
5. Don't throw away evaluation results once a report has been generated. Results don't take up much room, and they can provide precious information later when trying to understand changes in the program.