Evaluating the Quality and Impact of Reproductive Health Research
Jane T. Bertrand, FRONTIERS/Tulane
Southampton, Jan. 23, 2001

Why Evaluate?
– To determine whether OR studies have the desired impact of changing service delivery or policy
– To identify factors influencing utilization
– To highlight to the researchers involved the importance of utilizing the results
– To apply lessons learned to other OR studies
– To be accountable to donors

What are we evaluating?
Interventions:
– What has been the impact of the intervention on the target population?
– Example: a teen pregnancy program in England
Research:
– What has been the impact of the research on service delivery and policy?

Advantages of operations research to government officials/policy makers
– Allows them to test controversial interventions on a small scale at lower political risk
– If successful, take credit and scale up; if unsuccessful, “that was just a trial.”

Increased emphasis on evaluation in USAID-funded projects
The EVALUATION Project (1991):
– Improve the state of the art in program evaluation
MEASURE Evaluation (1997 to present):
– Apply improved evaluation methods in the field
USAID switched from the logframe approach to a results framework:
– Strategic objectives, intermediate results
– Emphasis on RESULTS, not on ACTIVITIES
– Based on tracking of indicators

Evaluating Operations Research
In the past, process evaluation:
– How many projects? How well done?
– Qualitative assessments; short-term impacts
Need to develop an assessment of impact:
– Has OR succeeded in changing service delivery procedures or influencing policy?

Approach developed under FRONTIERS
Drew on indicators developed by an O.R. working group under the EVALUATION Project
Pre-tested the methodology on completed projects in selected countries:
– 1999: Peru, Kenya, Philippines
– 2000: Honduras, Senegal, Bangladesh

Data collection process
Two-person evaluation team:
– FRONTIERS/Tulane staff member and a consultant
Duration of data collection:
– One week in country
Sources of data:
– Project reports and other documentation
– Key informant interviews using the assessment form
Assessment forms (see Appendix A):
– Used to guide discussion
– Used to present/document results

Types of indicators
– Process
– Impact
– Contextual factors

Process Indicators
P-1. Implementing organization actively participated in study design
P-2. Implementing organization actively participated in conduct of the OR project
P-3. Study accomplished its research objectives
P-4. Intervention was implemented as planned
P-5. Completed without delays that would compromise validity of the research design

Process indicators (cont’d)
P-6. Implementing agency participated in developing programmatic recommendations
P-7. Continuity in key personnel over the life of the project
P-8. Technical assistance (TA) judged sound; congenial manner
P-9. Study design was technically sound
P-10. Research design feasible in the local context

Process indicators (cont’d)
P-11. Results judged credible/valid locally
P-12. Research relevant to local program managers
P-13. Study included an assessment of costs
P-14. Results disseminated to key audiences
P-15. Results readily available in written form

Impact Indicators
I-1. Based on OR results, organization implemented activities to improve services
I-2. Improvements in service delivery were observable
I-3. Improvements still observable 24 months post-implementation
I-4. Implementing agency conducted subsequent OR
I-5. …conducted OR without PC assistance

Impact Indicators (cont’d)
I-6. Intervention scaled up by the same organization
I-7. Intervention adopted by another organization
I-8. Intervention replicated in another country
I-9. Change in national policy linked to the OR study
I-10. Original donors funded activities based on results
I-11. New donors funded activities based on OR

Contextual factors
Factors that facilitated:
– Conduct of the study
– Utilization of results
Factors that impeded:
– Conduct of the study
– Utilization of results
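A minimal sketch of how one completed assessment form could be scored against the indicators above, assuming a simple positive / negative / not-applicable rating for each one. The indicator codes come from the slides; the data structure, function, and example scores are hypothetical illustrations, not part of the FRONTIERS instrument.

```python
# Hypothetical scoring of one completed assessment form.
# True = positive, False = negative, None = not applicable / unknown.
from typing import Dict, List, Optional

PROCESS_INDICATORS = [f"P-{i}" for i in range(1, 16)]   # P-1 ... P-15
IMPACT_INDICATORS = [f"I-{i}" for i in range(1, 12)]    # I-1 ... I-11

def summarize(scores: Dict[str, Optional[bool]], indicators: List[str]) -> str:
    """Return 'positive/applicable' counts, matching the x/y format used on the slides."""
    applicable = [code for code in indicators if scores.get(code) is not None]
    positive = [code for code in applicable if scores[code]]
    return f"{len(positive)}/{len(applicable)}"

# Invented example: all process indicators positive except P-13 (no cost study);
# only three impact indicators could be assessed so far.
example_scores = {code: True for code in PROCESS_INDICATORS}
example_scores["P-13"] = False
example_scores.update({"I-1": True, "I-2": True, "I-3": None})

print("Process:", summarize(example_scores, PROCESS_INDICATORS))  # 14/15
print("Impact: ", summarize(example_scores, IMPACT_INDICATORS))   # 2/2
```

Keeping "not applicable" distinct from "negative" is what allows the positive/applicable ratios reported in the findings that follow.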

FINDINGS: THREE CASE STUDIES
Limited to intervention/evaluative studies
Total number of projects: 28
– Bangladesh: 10
– Honduras: 10
– Senegal: 8

Process Indicators: Three Countries (P-1 to P-7)
Proportion of projects with a positive score on each indicator (chart values, in slide order): 28/28, 26/28, 10/10, 10/12, 26/26, 21/26

Process Indicators: Three Countries (P-8 to P-15)
Proportion of projects with a positive score on each indicator (chart values, in slide order): 28/28, 21/24, 27/27, 26/27, 28/28, 27/27, 28/28

Impact Indicators: Three Countries (I-1 to I-6)
Proportion of projects with a positive score on each indicator (chart values, in slide order): 25/27, 21/21, 19/21, 13/18, 2/3, 18/22

Impact Indicators: Three Countries (I-7 to I-11)
Proportion of projects with a positive score on each indicator (chart values, in slide order): 9/17, 2/13, 10/27, 5/23, 7/23
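To show how the positive/applicable figures in the four charts above could be tabulated from individual project assessments, here is a small sketch. The three mini-projects and their scores are invented purely for illustration; only the x/y reporting format is taken from the slides.

```python
# Tabulate, for each indicator, how many projects scored positive out of
# how many projects where the indicator was applicable (the x/y slide format).
from collections import defaultdict

# Invented mini data set: indicator -> True/False/None (None = not applicable).
projects = [
    {"P-1": True, "P-13": False, "I-6": True,  "I-9": None},
    {"P-1": True, "P-13": True,  "I-6": False, "I-9": True},
    {"P-1": True, "P-13": False, "I-6": None,  "I-9": False},
]

positive = defaultdict(int)
applicable = defaultdict(int)
for scores in projects:
    for code, value in scores.items():
        if value is None:
            continue            # skip not-applicable scores
        applicable[code] += 1
        positive[code] += int(value)

for code in sorted(applicable):
    print(f"{code}: {positive[code]}/{applicable[code]}")
```

With real data, the same loop would run over all 28 projects and every P and I indicator.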

Advantages of the Methodology
– Both quantitative and qualitative
– Summary table of data easily produced and interpreted
– Concrete examples included
– Provides rich information on factors affecting utilization

Limitations
Cannot prove cause and effect. Rather: “plausible attribution” if:
– the change in service delivery occurred after the intervention, and
– the change is consistent with the OR results
Requires some subjective judgements; potential for bias
Staff turnover may affect quality of data

Next steps
– Apply the methodology to all FRONTIERS projects (n = 75+)
– Timing: at end of project and 36 months later
– Project monitor to report
– Subset (25%) to be verified by an external team
– Compile results in an ACCESS database

Analyses to be Conducted at the Close of FRONTIERS
– Creation of a scale for each project’s performance on process and impact
– Correlations and cluster analysis of the different indicators in the data set
– Determinants of impact: which process indicators are significantly related to impact?
– Meta-analyses: by country, region, topic
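As a rough illustration of the first two analyses listed above, the sketch below builds a simple additive scale per project (share of applicable indicators scored positive) and checks the rank correlation between the process and impact scales. The scale definition, the invented scores, and the use of a Spearman correlation are assumptions for illustration, not the FRONTIERS analysis plan.

```python
# Sketch: per-project process and impact scales, then a rank correlation
# between them as one way to explore which process scores go with impact.
from scipy.stats import spearmanr  # assumes SciPy is installed

# Invented per-project scale values: share of applicable indicators scored positive.
process_scale = [0.93, 0.80, 1.00, 0.67, 0.87, 0.73]
impact_scale  = [0.75, 0.70, 0.90, 0.30, 0.55, 0.45]

rho, p_value = spearmanr(process_scale, impact_scale)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```

A rank correlation is chosen here only because the scale is ordinal and the number of projects per country is small; the planned cluster analysis and meta-analyses would draw on the same project-by-indicator matrix.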

…Wish us luck. Stay tuned for the results, and thanks for attending.