Program Evaluation ED 740 Study Team Project



Program Evaluation
A systematic method for collecting, analyzing, and using information to answer questions about projects, policies, and programs, particularly about their effectiveness and efficiency.

Building Blocks
Assessing needs
Assessing program theory
Assessing implementation
Determining causation
Reliability, Validity and Sensitivity
The Shoestring Approach
Methodological challenges

Needs Assessment and Program Theory Assessment

Needs Assessment
Examines the target population to determine whether the need specified in the program is in fact a problem and, if so, how that problem might best be addressed.

Needs Assessment Includes…
Identifying and diagnosing the actual problem the program is trying to address
Who or what is affected by the problem?
How widespread is the problem?
What are the measurable effects that are caused by the problem?

Example…
Housing program aimed at mitigating homelessness.

Program Theory Assessment
An assumption, implicit in the way the program is designed, about how the program’s actions are supposed to achieve the outcomes it intends. Assesses whether this logic is plausible. It is important to review existing research that has been done in the area.

Example…
HIV Prevention Program

Assessing Implementation

Assessing Implementation
Process Evaluation
Impact Evaluation
Cost-Benefit Analysis

Process Evaluation
Describes the functions of the program: who received services, what were the services, what were the environmental factors of the program, and when were the services delivered?

Examples of Data Collection
Demographic information
Number of persons served in the program
Description of the intake and assessment process, number of interviews or interactions, description of the number of training activities, specific curriculum delivered
Time and length of specific intervention activities
Context in which activities were delivered
Examination of how long services were delivered
Descriptors of who delivered services
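The categories above can be sketched as a simple record type for capturing one service-delivery event. The field names below are illustrative assumptions mirroring the slide's list, not a standard schema.

```python
from dataclasses import dataclass


@dataclass
class ProcessRecord:
    """One service-delivery event for process evaluation.

    Fields mirror the data-collection categories above;
    the names themselves are illustrative, not prescribed.
    """
    participant_demographics: dict  # demographic information
    service_delivered: str          # what service was delivered
    curriculum: str                 # specific curriculum delivered
    delivered_by: str               # descriptor of who delivered services
    context: str                    # context in which the activity occurred
    duration_minutes: int           # time and length of the activity


# A hypothetical record for a single intake interview.
record = ProcessRecord(
    participant_demographics={"age_group": "18-24"},
    service_delivered="intake interview",
    curriculum="HIV prevention module 1",
    delivered_by="case manager",
    context="community center",
    duration_minutes=45,
)
print(record.service_delivered)  # intake interview
```

Collecting events in this structured form makes it straightforward to later count persons served, total contact time, and services per staff member.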

Impact Evaluation
Attempts to answer the question of what behaviors, attitudes, community changes, or other hypothesized improvements occurred as a result of the program’s activities.

Cost-Benefit Analysis
An assessment of different alternatives in order to see whether the benefits outweigh the costs. A monetary value or other measurement unit is placed on both tangible and intangible factors related to implementation of a specific activity. Cost-benefit analysis attempts to assess whether the program’s benefits exceed the program’s implementation costs.

Example
$8,000 cost for installation of energy-efficient windows by homeowner
-$1,200 rebate = $6,800 net cost paid by owner
$1,000 annual electric bill before installation of windows; assuming a $200 annual saving on that bill:
$6,800 / $200 = 34 years before return on investment
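The arithmetic above is a simple payback-period calculation, sketched below in Python. The $200 annual saving is an assumption implied by the slide's division, not stated on the slide itself.

```python
def payback_years(install_cost: float, rebate: float, annual_savings: float) -> float:
    """Years until cumulative savings equal the owner's net cost."""
    net_cost = install_cost - rebate  # cost actually paid by the owner
    return net_cost / annual_savings


# Figures from the window-retrofit example above.
years = payback_years(8_000, 1_200, 200)
print(years)  # 34.0
```

A payback period this long suggests the benefits may not exceed the costs within the windows' useful life, which is exactly the kind of judgment a cost-benefit analysis supports.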

The Shoestring Approach

The Shoestring Approach
Program evaluators conduct impact evaluations when working under budget, time, or data constraints.

The Shoestring Approach
The evaluator is not called in until the project is already well advanced
A tight deadline for completing the evaluation, combined with a limited budget and no access to baseline data
For budget, political, or methodological reasons, it is not possible to collect baseline data on a control group
Dealing with constraints related to cost, time, and data

The Shoestring Approach cont.
Identifying the strengths and weaknesses of the evaluation design
Taking measures to address threats and strengthen the evaluation design and conclusions
When necessary, many of these corrective measures can be introduced at a very late stage
Although the approach was designed to assist evaluators working in developing countries, the principles are equally applicable in industrialized nations

Determining Causation

Chicken or the Egg
Causation
Establishing direct causation is very difficult
Without randomly assigned test and control groups, proving causation is impossible
Conclusively showing that an intervention produced an observed result requires longitudinal studies

Chicken or the Egg
Causation
Multiple resources interact in a setting and may or may not impact results
Self-selection into the programs being evaluated contributes to the problem
Another consideration must be unintended results, both positive and negative

Reliability, Validity and Sensitivity

Reliability, Validity and Sensitivity
Credible evaluations rely on the use of reliable, valid and sensitive instruments.

Reliability is… the “extent to which the measure produces the same results when used repeatedly to measure the same thing” (Rossi et al., 2004, p. 218).

Validity is… the “extent to which it measures what it is intended to measure” (Rossi et al., 2004, p. 219).

Sensitivity
The instrument must be able to discern potential changes in the social problem. The instrument is insensitive if it contains items measuring outcomes which the program couldn’t possibly affect (Rossi et al., 2004).

Challenges

Challenges
The challenges of program evaluation vary from organization to organization, region to region, and country to country.

Challenges
Cost: evaluation can be expensive
Expertise: requires a high level of expertise to verify the goals of the program
Measurement: must ensure the performance measures will answer the questions that determine the effectiveness of the program
Time: can be time-consuming and could divert staff from day-to-day functioning

Challenges when conducting program evaluations in developing countries: Culture
“Culture can influence many facets of the evaluation process, including data collection, evaluation of the program implementation, and the analysis and understanding of the results of the evaluation” (Ebbutt, 1998).

Challenges when conducting program evaluations in developing countries: Language
“Language can be a major barrier to communicating concepts which the evaluator is trying to access, and translation is often required” (Ebbutt, 1998).