
Lessons Learned from OVC Evaluations for Future Public Health Evaluations
Siân Curtis, PhD
OVC Evaluation Dissemination Meeting, September 3, 2009, Washington, DC

Growing Emphasis on Evaluation
- IOM PEPFAR evaluation report and reauthorization legislation
- Global Fund five-year impact evaluation and operations research (OR) initiative
- CGD "When Will We Ever Learn?" report
- 3ie initiative
- IHP M&E Working Group – common evaluation framework initiative
- USAID evaluation revitalization efforts

Ideal Impact Assessment
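[The content of this slide did not come through in the transcript. As a reading aid only, one standard way to state the ideal comparison an impact assessment aims for, where Y(1) and Y(0) denote a child's outcome with and without the program:

  \text{Impact} = E\big[\,Y(1) - Y(0) \mid \text{program participants}\,\big]

Because Y(0) is never observed for participants, an evaluation must construct a credible counterfactual – the source of the design challenges discussed in the slides that follow.]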

Challenges to Implementing Rigorous Impact Evaluation
- Evaluation needs to be planned at the beginning, not the end, but it is hard to attract attention to it at that point
- Timing – projects are often already underway, making it hard to incorporate a strong evaluation design
- Scale – many projects are too small to be expected to demonstrate impact
- Pressure for rapid results to inform programs now
- Expectations of multiple stakeholders – scope, competing objectives, multiple or unclear research questions
- Political will – someone in a position of authority must buy in and advocate for evaluation

Methodological Constraints to Rigorous Impact Evaluation
- Non-random placement of programs – intervention and control areas are often not comparable
- Suitable control areas may not exist – other programs operate in control areas, or interventions cross over into them
- Need (and ability) to control for factors beyond the program that might affect outcomes (Victora, Black, and Bryce 2009)
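[To make the first and third constraints concrete, here is a minimal illustrative sketch – not the method used in these OVC evaluations – of a difference-in-differences regression, one standard way to adjust for pre-existing differences between intervention and control areas when baseline and follow-up survey rounds exist. All file, variable, and covariate names (ovc_survey.csv, program_area, post, cluster_id, etc.) are hypothetical.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical child-level data: one row per child per survey round, with an
    # outcome, a program-area indicator, a pre/post indicator, and covariates
    # that may also affect the outcome.
    df = pd.read_csv("ovc_survey.csv")

    # Outcome regressed on area, time, and their interaction; the interaction
    # coefficient is the difference-in-differences impact estimate.
    model = smf.ols(
        "outcome ~ program_area * post + hh_size + caregiver_education",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["cluster_id"]})

    print(model.params["program_area:post"])  # DiD estimate of program impact

The design still assumes that, absent the program, outcomes in the two sets of areas would have followed parallel trends – exactly the kind of assumption that is hard to defend when placement is purposive.]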

OVC Evaluation Experience
- Timing – programs were already underway: no baseline, post-test-only design
- Length and intensity of exposure – short duration of exposure (i.e., less than 2 years), but impact is likely to be longer term
- Scale – coverage low in intervention areas; quality of beneficiary lists; some programs small

OVC Evaluation Experience (continued)
- Pressure for rapid results – post-test-only design; short program exposure
- Multiple stakeholders – supports data use; managing expectations regarding the scope and coverage of the study
- Political will/leadership – needs to be strong to facilitate buy-in from all stakeholders

OVC Evaluation Experience (continued)
- Program participation was non-random – purposive selection of intervention areas; self-selection into (some) programs; controls different from beneficiaries
- Control areas – contamination between program and control areas, i.e., some children in control areas reported receiving interventions
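[One way such self-selection is often handled analytically – again an illustrative sketch rather than the approach used in these evaluations, and one that only corrects for observed differences – is inverse-probability-of-treatment weighting. Names (ovc_survey.csv, beneficiary, outcome, the covariates) are hypothetical.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    df = pd.read_csv("ovc_survey.csv")  # hypothetical post-test survey
    covariates = ["hh_size", "caregiver_education", "orphan_status"]

    # Estimate each child's probability of being a program beneficiary
    # from observed household characteristics.
    ps = (
        LogisticRegression(max_iter=1000)
        .fit(df[covariates], df["beneficiary"])
        .predict_proba(df[covariates])[:, 1]
    )

    # Inverse-probability weights: beneficiaries by 1/ps, non-beneficiaries by 1/(1 - ps).
    w = df["beneficiary"] / ps + (1 - df["beneficiary"]) / (1 - ps)

    treated = df["beneficiary"] == 1
    impact = (
        (df.loc[treated, "outcome"] * w[treated]).sum() / w[treated].sum()
        - (df.loc[~treated, "outcome"] * w[~treated]).sum() / w[~treated].sum()
    )
    print(impact)  # weighted mean difference; bias from unobserved factors remains

Contamination of control areas is a separate problem that weighting does not solve; it dilutes the measured contrast between groups.]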

Additional Issues for OVC Evaluation
- Multiple outcome domains – what to focus on?
- Measurement tools for outcome domains vary in how widely they have been tested and how well they work
- Measurement and comparison of cost-effectiveness across multiple domains is new territory
- Lack of standardized interventions; variable intensity and quality
- Wide variation in the combination of interventions offered and the way programs are implemented

Data Use
- Critical to think about data use throughout the evaluation process, not just at the end
- Engagement of stakeholders is critical to understanding the evaluation questions from different perspectives and to creating ownership and demand
- Proactive and explicit data-use activities help stakeholders understand and apply findings – recommendations coming from stakeholders carry more weight than those from the research team

Conclusions
- Developing pragmatic evaluation designs that meet rigorous scientific standards within field realities remains a challenge – an ongoing area of research
- Recognize the long-term benefits of evaluations for future programs – a "public good"
- It takes time for programs to scale up and have an effect – evaluations need to be ongoing
- More work is needed to test measures of OVC outcomes, which are often multidimensional
- Attention to data use (both short and long term) is needed throughout the process

MEASURE Evaluation is funded by the U.S. Agency for International Development through Cooperative Agreement GHA-A-00-08-00003-00 and is implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill, in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. The views expressed in this presentation do not necessarily reflect the views of USAID or the United States government. Visit us online at http://www.cpc.unc.edu/measure.