Program Evaluation and Measurement Janet Myers

Objectives for today… To define and explain concepts and terms used in program evaluation. To understand the role of evaluation in planning and implementing health programs. To explore some quantitative and qualitative methods used to collect data for program evaluations.

What is Program Evaluation? Program evaluation is the use of social research methods to systematically investigate the effectiveness of programs in ways that are adapted to their political and organizational environments. Key components (Rossi et al.):
- Application of social research methods
- Effectiveness of social programs
- Adaptation to political and organizational context
- Informing social action to improve social conditions

Why Evaluate?
- Ensure program effectiveness and appropriateness
- Demonstrate accountability
- Contribute to the public health knowledge base
- Improve program operations and service delivery

Components of Program Evaluation There are 4 general components to comprehensive program evaluation:
- Formative evaluation: What’s necessary to carry out the program in accord with the desired goals and objectives?
- Process evaluation: How was the program implemented?
- Outcome evaluation: Did the program meet its objectives?
- Impact evaluation: Was the ultimate goal of the program achieved?

Every program has goals, objectives, and activities; every program evaluation should have corresponding indicators:
- Goals → Impact indicators
- Objectives → Outcome indicators
- Activities → Process indicators
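As a rough illustration of this pairing (not from the original slides), the relationship can be sketched as a simple lookup table; the example indicators below are hypothetical.

```python
# Hypothetical sketch: pairing program elements with the indicator type an
# evaluation would track for each. Names and examples are illustrative only.
program_element_to_indicator = {
    "goal": {
        "indicator_type": "impact",
        "example": "HIV prevalence among clinic patients",
    },
    "objective": {
        "indicator_type": "outcome",
        "example": "% of eligible patients tested within 12 months",
    },
    "activity": {
        "indicator_type": "process",
        "example": "number of staff trainings delivered",
    },
}

for element, info in program_element_to_indicator.items():
    print(f"{element:>9} -> {info['indicator_type']} indicator, e.g., {info['example']}")
```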

Formative Evaluation Used during the development of the project to test ideas and concepts on the target population, or before a new phase. Provides information for improvement by identifying aspects of the existing program that are successful and areas in need of improvement. Generally focuses on the content and design of the program, with results useful to program staff.

Process Evaluation Assesses the extent to which the program has been implemented as planned. Assesses participant and stakeholder experience and satisfaction with the program. It can help to…
- Create a better learning environment
- Improve presentation skills
- Show accountability to the funder
- Reflect the target populations
- Track service units

Process Evaluation Identify how a product or outcome is produced. Create a detailed description of the program. Identify strengths & weaknesses of a program. In the case of negative outcomes, process data is important for understanding whether the outcome is due to the intervention design (design failure) or to a failure to implement the intervention as intended (implementation failure).

Process Evaluation Questions Key questions in process evaluation:
- What are we doing?
- Are we doing it right?
- Are we implementing the program as planned?
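A minimal sketch of how these questions can be made measurable, assuming a hypothetical activity log of delivered sessions; the sites, counts, and targets are invented for illustration.

```python
# Hypothetical process-indicator tracking: compare delivered service units
# against the implementation plan. All data and targets are illustrative only.
from collections import Counter

planned_sessions_per_site = {"Clinic A": 12, "Clinic B": 12, "Clinic C": 8}

# One record per delivered session (e.g., pulled from an activity log).
activity_log = [
    {"site": "Clinic A", "attendees": 9},
    {"site": "Clinic A", "attendees": 11},
    {"site": "Clinic B", "attendees": 7},
    {"site": "Clinic C", "attendees": 14},
]

delivered = Counter(record["site"] for record in activity_log)

for site, planned in planned_sessions_per_site.items():
    done = delivered.get(site, 0)
    fidelity = 100 * done / planned
    print(f"{site}: {done}/{planned} planned sessions delivered ({fidelity:.0f}%)")
```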

Outcome Evaluation
- Gauges the extent to which a program produces the improvements it intends
- Examines effectiveness
- Can measure unintended outcomes
- In simple terms: “what is different as a result of your efforts?”

Outcomes at different stages…
- Initial outcomes: The first benefits or changes experienced by participants, usually involving changes in knowledge, skills, or attitudes.
- Intermediate outcomes: Occur after the initial outcomes and link them to the longer-term outcomes desired for clients. Often, they involve behavior change.
- Longer-term outcomes: Measurable results that take longer to achieve, such as changes in participants’ conditions, clinical health status, or quality of life.

Outcome Evaluation Questions
- Were the desired changes attained?
- To what degree did the desired changes occur?
- Is the program working to make a difference?
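To make “to what degree” concrete, here is a small, hypothetical before-and-after comparison of an outcome indicator; the survey counts are invented for illustration.

```python
# Hypothetical outcome comparison: proportion of participants reporting the
# desired behavior before vs. after the program. Numbers are illustrative only.
baseline = {"with_behavior": 42, "total": 200}   # pre-program survey
followup = {"with_behavior": 87, "total": 190}   # post-program survey

pre_pct = 100 * baseline["with_behavior"] / baseline["total"]
post_pct = 100 * followup["with_behavior"] / followup["total"]

print(f"Baseline:  {pre_pct:.1f}% reported the desired behavior")
print(f"Follow-up: {post_pct:.1f}% reported the desired behavior")
print(f"Change:    {post_pct - pre_pct:+.1f} percentage points")
```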

Impact Evaluation Impact is sometimes used to mean “outcome.” Impact is perhaps better defined as a longer-term or collective outcome. For clinical training programs, impacts may be improved patient outcomes; in global M&E, the incidence or prevalence of disease.

A note about impact…
- Most program evaluations focus on measuring the process and outcomes.
- Measuring impact requires significant resources that most programs don’t have.
- It’s also difficult to link the more immediate effects of a program to broad, often community-level, impacts.

Conducting an Evaluation CDC Framework for Program Evaluation: Steps and Standards

Steps in Evaluation (CDC Framework)
- Engage stakeholders: Those involved, those affected, primary intended users
- Describe the program: Need, expected effects, activities, resources, stage, context, logic model
- Focus the evaluation design: Purpose, users, uses, questions, methods, agreements
- Gather credible evidence: Indicators, sources, quality, quantity, logistics
- Justify conclusions: Standards, analysis/synthesis, interpretation, judgment, recommendations
- Ensure use and share lessons learned: Design, preparation, feedback, follow-up, dissemination

Standards for “Effective” Evaluation
- Utility: Serve the information needs of intended users
- Feasibility: Be realistic, prudent, diplomatic, and frugal
- Propriety: Behave legally, ethically, and with due regard for the welfare of those involved and those affected
- Accuracy: Reveal and convey technically accurate information

Designing an Evaluation (1)
- Figure out your questions: What will this be used for?
- Determine your resources: staffing, time, materials, $$$
- Consider methods: quantitative vs. qualitative; in-depth or quick and dirty

Designing an Evaluation (2)
- Guided by objectives, select process and outcome indicators that are relevant, measurable, and improvable (see the sketch below)
- Instrument/tool development: don’t reinvent the wheel!
- Analysis: get answers to your questions
- Reporting: formal & informal
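One way to keep indicator selection concrete is to record each indicator with a numerator, denominator, data source, and target. This is a hypothetical sketch; the fields and values are illustrative, not prescribed by the slides.

```python
# Hypothetical indicator definitions for an evaluation plan. Every field
# (numerator, denominator, source, target) is illustrative only.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    kind: str          # "process" or "outcome"
    numerator: str
    denominator: str
    data_source: str
    target_pct: float

indicators = [
    Indicator("Training coverage", "process",
              "staff who completed training", "all clinical staff",
              "training sign-in sheets", 90.0),
    Indicator("Testing offer rate", "outcome",
              "patients offered an HIV test", "eligible patients seen",
              "medical record abstraction", 100.0),
]

for ind in indicators:
    print(f"[{ind.kind}] {ind.name}: {ind.numerator} / {ind.denominator} "
          f"(source: {ind.data_source}, target {ind.target_pct:.0f}%)")
```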

Ways to collect Evaluation Data
- Use existing documents/data
- Quantitative methods
- Qualitative methods
Some questions to ask:
- Primary v. secondary data?
- Qualitative v. quantitative?

Research Design
- Qualitative methods: interviews, focus groups, observation, document analysis
- Quantitative methods: surveys, medical record abstraction, pre-test, post-test
This is another course…

Analysis Evaluation is not clinical trials research; analysis can be straightforward. Simple statistics are often more useful, depending on the audience.
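As an illustration of what “simple statistics” might look like in practice: frequency counts and percentages, broken out by site or subgroup, often answer evaluation questions on their own. The survey data below are invented for the example.

```python
# Hypothetical "easy stats": frequency counts and percentages from survey
# responses, broken out by site. All data are invented for illustration.
from collections import defaultdict

responses = [
    {"site": "Clinic A", "satisfied": True},
    {"site": "Clinic A", "satisfied": False},
    {"site": "Clinic A", "satisfied": True},
    {"site": "Clinic B", "satisfied": True},
    {"site": "Clinic B", "satisfied": True},
]

totals = defaultdict(int)
satisfied = defaultdict(int)
for r in responses:
    totals[r["site"]] += 1
    satisfied[r["site"]] += r["satisfied"]

for site in sorted(totals):
    pct = 100 * satisfied[site] / totals[site]
    print(f"{site}: {satisfied[site]}/{totals[site]} satisfied ({pct:.0f}%)")
```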

An Example…Routine Testing Evaluation in CHCs
Goal: Improve access to HIV testing in community health center (CHC) settings.
Objectives: By one year after initial training, offer routine testing to 100% of patients aged 13 to 64; link new positives to care.
Activities: Develop a testing algorithm for each site; conduct all-staff training; establish a mechanism for documenting the offer and receipt of testing; support implementation.

Evaluation Components
For the goal: Improve access to HIV testing in community health center settings.
Identify appropriate impact indicator(s).

Evaluation Components
For the objectives: 1. By one year after initial training, offer routine testing to 100% of patients aged 13 to 64; 2. Link new positives to care.
Identify appropriate outcome indicator(s).

Evaluation Components
For the activities: 1. Develop a testing algorithm for each site; 2. Conduct all-staff training; 3. Establish a mechanism for documenting the offer and receipt of testing; 4. Support implementation.
Identify appropriate process indicator(s).
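As a hedged sketch of how indicators for this example might be computed from routinely collected data, the code below assumes visit-level records with fields for test offer, result, and linkage, plus a simple training log; none of these records or field names come from the original slides.

```python
# Hypothetical calculation of indicators for the routine HIV testing example.
# All records, counts, and field names are invented for illustration.

# Process indicator (from the activities): share of staff trained.
staff_trained, staff_total = 18, 20
training_coverage = 100 * staff_trained / staff_total

# Outcome indicators (from the objectives): offer rate and linkage to care.
visits = [
    {"age": 34, "offered": True,  "positive": False, "linked": None},
    {"age": 51, "offered": True,  "positive": None,  "linked": None},
    {"age": 17, "offered": False, "positive": None,  "linked": None},
    {"age": 45, "offered": True,  "positive": True,  "linked": True},
]
eligible = [v for v in visits if 13 <= v["age"] <= 64]
offer_rate = 100 * sum(v["offered"] for v in eligible) / len(eligible)

positives = [v for v in eligible if v["positive"]]
linkage = 100 * sum(bool(v["linked"]) for v in positives) / len(positives) if positives else 0.0

print(f"Training coverage (process): {training_coverage:.0f}%")
print(f"Offer rate (outcome):        {offer_rate:.0f}%  (target: 100%)")
print(f"Linked to care (outcome):    {linkage:.0f}%")
```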

Dissemination
- Planning for it is important
- Framing is important
- Think about a broad audience (consumers, stakeholders, policymakers)
See:

Influence of Evaluation Findings
- Policy change: Achieving state funding for universal preschool
- Program change: Improved service delivery
- Change in individual behavior: Reducing consumer purchases of a certain type of seafood
- Change in practice: Having pediatricians add screening and provide information about childhood obesity to routine interactions with patients
- Structural change: Developing a strong service delivery organization where there has not been one before

More Resources … Comments/Questions: Janet Myers