Program Evaluation In A Nutshell
Jonathan Brown, M.A.

Two Types of Evaluation
At the most basic level there are two types of evaluation:
- Formative: the primary purpose is to provide information to improve a program.
- Summative: concerned with providing information about program adoption, continuation, or expansion.

Two Types of Evaluation
[Figure: the relative emphasis of formative and summative evaluation shifts across the program life, with formative evaluation typically emphasized earlier and summative evaluation later.]

What Are You Assessing?
The evaluation plan must consider the situation or context of the evaluation.
The purpose of the evaluation:
- Improvement (formative evaluation).
- Accountability (summative evaluation).
- Knowledge generation.
- Hidden agendas.

What Are You Assessing?
The evaluation plan must consider the situation or context of the evaluation.
The program's structure and context:
- The stage of program development.
- The administrative and political context.
- The conceptual and organizational structure of the program.

What Are You Assessing?
The evaluation plan must consider the situation or context of the evaluation.
The resources available for the evaluation:
- Personnel.
- Equipment.
- Facilities to support data collection, analysis, and reporting.
- Is specialized expertise needed?

What Are You Assessing?
Typically, evaluation involves assessing one or more of the following five areas:
1. The need for a program.
2. The program's design.
3. The program's implementation or process.
4. The program's impacts or outcomes.
5. The efficiency of the program.

Needs Assessment
The purpose of a social program is to alleviate a social problem.
A needs assessment examines:
- The nature, extent, and distribution of the social problem.
- How these features will shape the design of an intervention program.

Needs Assessment
These assessments can be conducted when:
- Planning a new program.
- Restructuring an existing program.
- Determining whether an existing program is responsive to the current needs of the target population.

Assessing Program Theory
Program theory evaluation is guided by a program's explicit theory of how it causes its intended outcomes.
Program theory outlines the following:
- Target population
- Resources
- Activities
- Outcomes
A program's theory is typically represented by Theory of Change Models (TCMs) and Program Logic Models (PLMs).
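To make the logic model components concrete, here is a minimal sketch (not part of the original slides) of how they might be recorded as structured data; the smoking-cessation program details are invented purely for illustration.

    # Hypothetical sketch: recording a program logic model's core components
    # as plain data. The program, targets, and outcomes are invented examples,
    # not taken from the presentation.
    logic_model = {
        "target_population": "adult smokers in the service area",
        "resources": ["counselors", "clinic space", "media budget"],
        "activities": ["group counseling sessions", "anti-smoking media campaign"],
        "outcomes": {
            "short_term": "increased quit attempts",
            "long_term": "reduced smoking prevalence",
        },
    }

    # Simple check that every component named on the slide is present.
    for component in ("target_population", "resources", "activities", "outcomes"):
        assert component in logic_model, f"logic model is missing: {component}"
    print("Logic model components:", ", ".join(logic_model))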

Assessing Program Theory
[Figure: conditions before the project flow into the program, depicted as a "black box," which produces the conditions after the project.]
With no information about the program, it remains a mysterious black box, and the evaluation will not be able to attribute the final conditions to specific aspects of the program.

Assessing Program Theory
Program theory can be assessed by applying SMART principles (Knowlton & Phillips, 2009):
- Specific: the program is clear enough to implement and evaluate.
- Measurable: indicators can be qualified and/or quantified.
- Action oriented: activities will provoke the desired change in targets.
- Realistic: the program is plausible and feasible.
- Timed: the duration of activities and intended outcomes are specified.

Process Evaluation
Even given a valid theory of how to intervene in a diagnosed social problem, the program must be implemented well to be successful.
Two forms:
- Process evaluation
- Program monitoring

Process Evaluation
Process evaluation looks at service utilization and program organization.
Service utilization: the extent to which the intended targets receive services.
- Especially important when participation is voluntary or when participants must learn new procedures or habits.
- Key concerns: coverage and bias.
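As a rough sketch of how coverage and bias are often quantified (the formula and all figures below are illustrative assumptions, not from the presentation): coverage can be expressed as the share of the population in need that actually receives services, and bias as the difference in coverage between subgroups.

    # Illustrative sketch with an assumed formula and invented numbers:
    # quantifying coverage and bias in service utilization.

    def coverage(served: int, in_need: int) -> float:
        """Share of the population in need that actually receives services."""
        return served / in_need

    # Invented figures for two subgroups of the target population.
    urban = coverage(served=400, in_need=1000)    # 0.40
    rural = coverage(served=150, in_need=1000)    # 0.15

    overall = coverage(served=550, in_need=2000)  # 0.275
    bias = urban - rural                          # 0.25: rural residents are under-served

    print(f"overall coverage = {overall:.2f}, coverage bias (urban - rural) = {bias:.2f}")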

Process Evaluation
Process evaluation looks at service utilization and program organization.
Program organization: how well the program manages its efforts and uses its resources to accomplish its activities.
Common service-delivery problems:
- No intervention, or not enough of it.
- The wrong intervention.
- An unstandardized intervention.

Outcome Evaluation
- Also known as an impact assessment.
- Assesses the extent to which a program produces the outcomes it intends.
- The desired changes could also be caused by factors unrelated to the program, so the evaluation is interested in the changes produced by the program above and beyond any external influences.

Outcome Evaluation
Program outcomes: the states or conditions of the target that change as a result of a program.
1. They are observed characteristics of the target population or social conditions.
2. They occur to different degrees.

Outcome Evaluation
Program outcomes:
- Outcome level: the status of an outcome at a point in time.
- Outcome change: the difference between outcome levels at different points in time.
- Program effect: the unique portion of an outcome that can be attributed only to the program.

Outcome Evaluation
[Figure (Rossi, Lipsey, & Freeman, 2004): an outcome variable tracked before, during, and after the program; the outcome change is the difference between the pre-program and post-program outcome levels, and the program effect is the gap between the outcome status with the program and the outcome status without the program.]
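A small worked example with hypothetical numbers (not from the slides) illustrates the distinction between outcome change and program effect: the outcome change is simply post minus pre, while the program effect subtracts out the estimated outcome without the program.

    # Hypothetical numbers to illustrate outcome change vs. program effect.
    pre_program_level = 30.0        # outcome level before the program
    post_with_program = 55.0        # observed outcome level after the program
    post_without_program = 40.0     # estimated counterfactual (e.g., from a comparison group)

    outcome_change = post_with_program - pre_program_level      # 25.0
    program_effect = post_with_program - post_without_program   # 15.0

    # Part of the observed change (40 - 30 = 10) would have happened anyway;
    # only the remaining 15 units can be attributed to the program itself.
    print(f"outcome change = {outcome_change}, program effect = {program_effect}")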

Efficiency Assessment
Because resources are limited, program accomplishments must be judged against program costs.
Two types of efficiency assessment:
- Cost-benefit analysis
- Cost-effectiveness analysis
Both require placing a dollar value on program activities; cost-benefit analysis also places a dollar value on program benefits, while cost-effectiveness analysis measures outcomes in their natural units.
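A back-of-the-envelope illustration, with all dollar figures and outcome counts invented, of how the two efficiency measures differ in practice.

    # Invented figures for illustration only.
    program_cost = 200_000.0         # total program cost in dollars
    monetized_benefits = 260_000.0   # dollar value assigned to program benefits
    outcome_units = 80               # e.g., number of participants who quit smoking

    # Cost-benefit analysis: net benefit and benefit-cost ratio.
    net_benefit = monetized_benefits - program_cost          # 60,000
    benefit_cost_ratio = monetized_benefits / program_cost   # 1.3

    # Cost-effectiveness analysis: cost per unit of outcome (no dollar value on benefits).
    cost_per_outcome = program_cost / outcome_units          # 2,500 per successful quit

    print(f"net benefit = ${net_benefit:,.0f}, B/C ratio = {benefit_cost_ratio:.2f}")
    print(f"cost-effectiveness = ${cost_per_outcome:,.0f} per outcome unit")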

Typical Evaluation Questions
Needs Assessment
- What is the nature and magnitude of the problem?
- What are the characteristics of the population in need?
- What are the needs of the population?
- What services are needed?
- How much service is needed, and over what period of time?
- What service delivery arrangements are needed to provide services to the population?

Typical Evaluation Questions
Program Theory Assessment
- Is the program doing the right thing?
- How could the program better address the needs of participants?
- How should the program be organized?
- What are the best delivery systems for the services?
- Is the program specific enough to measure?

Typical Evaluation Questions
Process Evaluation
- How many persons are receiving services?
- Are administrative and service objectives being met?
- Are the intended targets receiving services?
- Are the necessary program functions being performed adequately?
- Are resources being used effectively and efficiently?

Typical Evaluation Questions
Outcome Evaluation
- Are the goals and outcomes being achieved?
- Do the services have adverse side effects?
- Are some participants affected more by the services than others?
- Has the problem the services are intended to address improved?

Typical Evaluation Questions
Efficiency Assessment
- Are resources used efficiently?
- Is the cost reasonable in relation to the magnitude of the benefits?
- Would alternative approaches yield equivalent benefits at less cost?

QUESTIONS?
Jonathan Brown, M.A.