Process Evaluation: Considerations and Strategies
CHSC 433, Module 4/Chapter 8
L. Michele Issel, PhD, UIC School of Public Health

Making Widgets: Count not only the widgets, but who gets the widgets and what goes into making the widgets.

Definitions
- Systematic examination of coverage and delivery
- Measuring inputs into the program
- Finding out whether the program has all its parts, and whether those parts are functional and operational

Program Components
Discrete interventions, or groups of interventions, within the overall program that are designed to affect recipients independently or synergistically. Objectives are written per component.

Types of Implementation Evaluations
- Effort: quantity of inputs
- Monitoring: use of management information system (MIS) data
- Process: internal dynamics, strengths, and weaknesses
- Component: assesses distinct program parts
- Treatment: what was supposed to have an effect

Purpose of Process Evaluation
- Assurance and accountability
- Understanding of outcomes
- Mid-course corrections
- Program replicability

From the Perspective of:
- The evaluator
- Funders (accountability)
- Program management

Stakeholders and Expectations
- Focus on the explicit:
  - Objectives
  - Program descriptions
- Uncover the implicit:
  - Review program theory
  - Review objectives
  - Role-play possible outcomes

Program Theory Components

Organizational Plan
How to garner, configure, and deploy resources, and how to organize program activities, so that the intended service is developed and maintained.

Service Utilization Plan
How the intended target population receives the intended amount of the intended intervention through interaction with the program's service delivery system.

Rigor in Process Evaluation
- Appropriateness of the method
- Sampling strategy
- Validity and reliability
- Timing of data collection

Key Decision 1: How Much Effort to Expend
What data are needed to accurately describe the program? Choose based on:
- Expected across-site variation
- What will make the report credible
- The literature about the intervention

Key Decision 2: What to Look For
Which program features are the most critical and valuable to describe? Choose based on:
- What is cited most often in the program proposal
- The budget
- What may be related to program failure

Go Back to Objectives
- Write process objectives for each program component, specifying:
  - How much
  - Of what
  - Will be done
  - By whom
  - By when

Components and Objectives

Possible Foci of Process Evaluation
- Place: site, program
- People:
  - Practitioner/provider
  - Recipient/participant
- Processes:
  - Activities
  - Structure
- Policy

Levels of Analysis
- Individuals: program participants, program providers
- Programs: the program as a whole
- Geographic locations: regions and states

Types of Questions
- What was done, and by whom?
- How well was it done, and how much was done?
- What contributed to success or failure?
- How much of what resources were used?
- Is there program drift?

Sources of Program Variability
- Staff preferences and interests
- Availability and appropriateness of materials
- Participants' expectations, receptivity, etc.
- Site physical environment and organizational support

Roots of Program Failure

Causes of Program Failure
- Non-program: no participants, or no program delivered
- Wrong intervention: not appropriate for the problem
- Unstandardized intervention: across-site and within-program variations

Causes of Program Failure (continued)
- Mismanagement of program operations
- Wrong recipients
- Barriers to the program
- Program components unevenly delivered or monitored

Data Sources from the Program
- Resources used
- Participant-provided data:
  - Quality
  - Match with the process evaluation
- Existing records:
  - Sampling of records
  - Validity and reliability issues

Data Sources from the Evaluator
- Surveys and interviews of participants
- Observation of interactions
- Surveys and interviews of staff

Evaluating Structural Inputs
- Organizational structure:
  - Supervision of staff
  - Place in the organizational hierarchy
- Facilities and equipment
- Human resources:
  - Leadership
  - Training

Measures of Delivery
- Measures of program delivery
- Measures of coverage
- Measures of effectiveness

Measures of Implementation
- Measures of volume (outputs): number of services provided
- Measures of workflow:
  - Client time
  - Staff work time

Targets, Recipients, and Coverage

Measures of Coverage
- Undercoverage = (# of recipients in need) / (total # in need)
- Overcoverage = (# of recipients not in need) / (total # of recipients)
- Coverage efficiency = (undercoverage − overcoverage) × 100
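A minimal sketch of these coverage calculations in Python (the function name and all counts are hypothetical, for illustration only):

```python
def coverage_measures(recipients_in_need, total_in_need,
                      recipients_not_in_need, total_recipients):
    """Compute the three coverage measures defined above."""
    undercoverage = recipients_in_need / total_in_need
    overcoverage = recipients_not_in_need / total_recipients
    coverage_efficiency = (undercoverage - overcoverage) * 100
    return undercoverage, overcoverage, coverage_efficiency

# Hypothetical example: 300 of the 500 people in need were reached,
# and 60 of the 360 total recipients were not actually in need.
under, over, efficiency = coverage_measures(300, 500, 60, 360)
print(f"Undercoverage: {under:.2f}")             # 0.60
print(f"Overcoverage: {over:.2f}")               # 0.17
print(f"Coverage efficiency: {efficiency:.1f}")  # 43.3
```

A coverage efficiency of 100 would mean everyone in need was reached and no one not in need was served; values near zero or below signal a poor match between recipients and need.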

Measures of Effectiveness
- Effectiveness index = % reached relative to the program standard, per program component
- Program effectiveness index = (sum of the component effectiveness indexes) / (# of program components)
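A small worked sketch, assuming each component's index is simply the percentage of its standard that was reached (the component counts are hypothetical):

```python
def effectiveness_index(reached, standard):
    """Percent of the program standard reached for one component."""
    return reached / standard * 100

def program_effectiveness_index(indexes):
    """Average the per-component indexes across the whole program."""
    return sum(indexes) / len(indexes)

# Hypothetical three-component program: (number reached, standard).
components = [(90, 100), (40, 80), (150, 200)]
indexes = [effectiveness_index(r, s) for r, s in components]
print(indexes)                               # [90.0, 50.0, 75.0]
print(program_effectiveness_index(indexes))  # 71.666... (about 71.67)
```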

Bias in Participation
- Arises from self-selection
- Results in undercoverage or overcoverage
- May be related to recruitment
- Can be identified with good data collection (monitoring)

Measures of Efficiency
- Ratio of inputs to outputs
- Productivity per staff member, per cost, per hour
- Cost per participant, per intervention
- Etc.
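A minimal sketch of such input-to-output ratios (all figures hypothetical):

```python
# Hypothetical monthly figures for one program site.
total_cost = 24_000.00       # dollars of input
staff_hours = 480.0          # hours of staff work
participants_served = 160    # outputs
interventions_delivered = 640

print(f"Cost per participant: ${total_cost / participants_served:.2f}")        # $150.00
print(f"Cost per intervention: ${total_cost / interventions_delivered:.2f}")   # $37.50
print(f"Interventions per staff hour: "
      f"{interventions_delivered / staff_hours:.2f}")                          # 1.33
```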

Evaluating Costs
- Payments by the agency
- Payments by secondary funders
- Payments by participants (note: payments versus charges!)

Monitoring and CQI (Continuous Quality Improvement)
- Similar types of data presentation: control charts, fishbone diagrams, flow charts, Gantt charts, etc.
- Overlapping purposes
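As an illustration of one such presentation, here is a minimal sketch of 3-sigma control limits for a proportion (p) chart, such as the weekly fraction of visits delivered per protocol (the data and sample size are hypothetical):

```python
import math

def p_chart_limits(proportions, n):
    """3-sigma control limits for a p (proportion) chart
    with a constant subgroup size n."""
    p_bar = sum(proportions) / len(proportions)   # center line
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    lower = max(0.0, p_bar - 3 * sigma)
    upper = min(1.0, p_bar + 3 * sigma)
    return lower, p_bar, upper

# Hypothetical weekly proportions of visits delivered per protocol,
# based on 50 sampled records per week.
weekly = [0.82, 0.78, 0.85, 0.80, 0.76, 0.84]
lcl, center, ucl = p_chart_limits(weekly, n=50)
print(f"LCL={lcl:.3f}  center={center:.3f}  UCL={ucl:.3f}")
# Weeks falling outside (LCL, UCL) suggest special-cause variation
# worth investigating before making a mid-course correction.
```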

Reaching Conclusions
- Compare data to the objectives
- Compare data to the needs assessment data
- Compare data to other sites or other programs

Worksheet Exercise
For each program objective, determine:
- The focus and level of the process evaluation
- What data sources are needed
- Who collects the data

References
- Rossi, P. H., Freeman, H. E., & Lipsey, M. W. (1999). Evaluation: A Systematic Approach. Sage Publications.
- Patton, M. Q. (1997). Utilization-Focused Evaluation. Sage Publications.
- King, J. A., Morris, L. L., & Fitz-Gibbon, C. T. (1987). How to Assess Program Implementation. Sage Publications.
- Weiss, C. H. (1972). Evaluation Research. Prentice Hall.