A Syndrome of Irregular Enthusiasm: Increasing the Utilisation of Evaluation Findings in the UPHOLD Project. By Apollo Nkwake

Presentation Structure  A conceptual framework  The experience  Lessons

The dilemma of utilization: a gap separates doing evaluations from using evaluations for evidence-based policy. On one side are how evaluations are done: time, quality, relevance, credibility, participation, evidence validity/rigor, dissemination/access. On the other are behavioral and institutional factors: a culture of evaluation and the capacity to use findings.

About UPHOLD (1)  Six-year integrated social services program designed by the Government of Uganda and USAID  Operated in 34 districts covering 42% of Uganda’s population (~11.8m people)  Overall project aim was to increase access to and utilization of sustainable, quality social services in Education, Health and HIV/AIDS, in support of USAID’s Strategic Objective 8 (SO8), which aims to improve human capacity  Partnered with >120 CSOs and 34 local governments

The UPHOLD ‘Information’ Mandate  To strengthen existing data collection and information systems for grantees (both LGs and CSOs)  To build district capacities in planning and evidence-based decision-making by making accurate annual data available  To utilize performance results to target interventions  To document and share key lessons learned

Context for doing and using evaluations in UPHOLD  Uganda is decentralized  Local governments and CSO actors are closer to the ground and need more localized information for programming  DHS and other higher-level sources do not provide information disaggregated to local levels  Capacity issues: low capacity to generate reliable information  Resource constraints  Varied capacity levels in planning, collection, processing and use of information  Availability of information is not linked to service delivery planning

Using LQAS as a cost-effective and simple tool  Originally developed in the 1920s to control the quality of output in industrial production processes  Involves taking a small random sample of a manufactured batch (lot) and testing the sampled items for quality  If the number of defective items in the sample exceeds a pre-determined criterion (the decision rule), the lot is rejected  The decision rule is based on the desired production standards, and a statistically determined sample size ‘n’ is chosen so that the manager has a high probability of accepting lots that meet the quality standards and rejecting lots that fail to meet them
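The acceptance-sampling logic above can be sketched numerically. The snippet below computes the two misclassification risks for a given sample size and decision rule from the binomial distribution; the figures n=19 with decision rule d=13 for an 80% target against a 50% lower threshold match commonly published LQAS tables, but the function names are illustrative only.

```python
from math import comb

def binom_cdf(k, n, p):
    # P(X <= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_errors(n, d, p_upper, p_lower):
    """Misclassification risks for an LQAS design that classifies an
    area as 'reaching the target' when at least d of n sampled units
    show the desired outcome:
      alpha: P(classified below target | true coverage = p_upper)
      beta:  P(classified at target    | true coverage = p_lower)
    """
    alpha = binom_cdf(d - 1, n, p_upper)      # reject a genuinely good area
    beta = 1 - binom_cdf(d - 1, n, p_lower)   # accept a genuinely poor area
    return alpha, beta

# n=19 with decision rule 13 for an 80% target vs. a 50% lower threshold
alpha, beta = lqas_errors(19, 13, 0.80, 0.50)
print(f"alpha ~ {alpha:.3f}, beta ~ {beta:.3f}")  # both come out under 10%
```

This is why the LQAS literature often cites n=19: it is small enough for routine field use while keeping both error risks under 10% for common coverage benchmarks.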

Adapting LQAS for public health programs  Can be used locally, at the level of a county or sub-county, to identify priority areas or indicators that are not reaching average coverage or an established benchmark  Can provide an accurate measure of coverage or service system quality at a more aggregate level (e.g. district coverage)

The Basic LQAS Principles - I  Assume a district has seven counties, A through G  Each county is ‘supervised’ by one person  The level of ‘success’ in each county is to be measured

[Diagram: each of the seven counties A-G is classified as either ‘Good’ or ‘Below Average or Established Benchmark’]

What can be done in the circumstances? For counties below the average or established benchmark: identify the reasons for program problems and develop targeted solutions. For counties classified as good: maintain the program at the current level and identify supervisors who can help other workers improve their performance.
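The classify-then-act workflow on this slide can be expressed as a small helper. A minimal sketch, assuming n=19 per county and a decision rule of 13; the county counts below are invented for illustration:

```python
def classify_areas(observed, decision_rule=13):
    """Label each supervision area 'good' (reaching the benchmark) or
    'below' it, given the number of sampled respondents with the
    desired outcome. A count >= decision_rule passes."""
    return {area: ("good" if k >= decision_rule else "below")
            for area, k in observed.items()}

# Hypothetical counts (out of n=19) for the seven counties A-G
counts = {"A": 16, "B": 11, "C": 14, "D": 9, "E": 13, "F": 17, "G": 12}
for area, label in classify_areas(counts).items():
    action = ("maintain; supervisor may mentor others" if label == "good"
              else "identify reasons; develop targeted solutions")
    print(f"County {area}: {label} -> {action}")
```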

Application of LQAS to monitor UPHOLD and Partners’ programs  Each district is a ‘Supervision Unit’ and each county, health sub-district or sub-county is a ‘Supervision Area’  Sample units are households, schools and health facilities  What is evaluated is the overall success in delivering specific services

Using the LQAS method (1)  ~200 partner staff trained for 2 weeks in the LQAS methodology in 2004, 2005, 2006 & 2007  LQAS was done annually within the 34 project districts, involving CSO and LG partners, in 2004, 2005, 2006 & 2007  Districts are divided into supervision areas  District officers are involved in planning, data collection, analysis and dissemination  Fitted within the planning cycle

Using the LQAS method (2)  19 villages are sampled from each of the five ‘Supervision Areas’ in each district  Data is collected from households, schools and health facilities  5 households are sampled from each village and a different questionnaire administered to each of the sampled households (~12,300 households covered)  Schools and health facilities are also surveyed (423 health units and 1,449 schools)
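The two-stage draw described above (19 villages per supervision area, then 5 households per village) can be sketched as follows; the village names and the assumption of numbered household lists are hypothetical, not UPHOLD's actual sampling frame:

```python
import random

def draw_lqas_sample(frame, n_villages=19, n_households=5, seed=None):
    """Two-stage random sample for one supervision area.
    `frame` maps village name -> number of listed households
    (a hypothetical sampling frame). Returns, for each of
    n_villages randomly chosen villages, n_households randomly
    chosen household indices."""
    rng = random.Random(seed)
    chosen = rng.sample(sorted(frame), n_villages)
    return {v: sorted(rng.sample(range(1, frame[v] + 1), n_households))
            for v in chosen}

# Hypothetical frame: 30 villages with 40-120 listed households each
frame = {f"village_{i:02d}": 40 + (i * 7) % 81 for i in range(30)}
sample = draw_lqas_sample(frame, seed=42)
print(len(sample), "villages;",
      sum(len(h) for h in sample.values()), "households")
```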

What LQAS can do  A sampling method that: –Can be used locally, at the level of a ‘supervision area’ (e.g. district) to identify priority areas (e.g. county, sub-county) or indicators that are not reaching average coverage or an established benchmark –Can provide an accurate measure of coverage or health system quality at a more aggregate level (e.g. district, program catchment area, etc.) –Is not diagnostic

Why we supported CSO and LG partners to adopt LQAS  Low sample size needs (n=19 in most cases)  Simple to apply yet has very specific conclusions  District level people could be trained to entirely ‘own’ this methodology  Provides high quality information at low & affordable cost  Fast – ‘supervision areas’ are able to conduct self- evaluation and obtain results immediately after the survey  Results are locally relevant and can be utilized in district level annual planning and decision-making

Example: monitoring malaria prevention in Bushenyi district, Western Uganda [Chart: % of under-fives sleeping under an ITN the night before the survey]

Example: monitoring uptake of reproductive health services in Wakiso district, Central Uganda [Chart: % of mothers who delivered at health centers]

What we learned  Sequence: there should be a strong link between evidence generation, planning, resource allocation and service delivery/policy  In a decentralized setting with conflicting political interests, a single piece of evidence goes a long way  Results-based reviews are very empowering and enthusing  Emphasizing use strengthens quality  It is possible to make comparisons within and across districts and to measure success against national targets  LQAS has fostered more equitable allocation of resources at district level due to ‘evidence-based’ planning

What we learned from supporting CSOs and LGs to use LQAS (Contd)  The methodology is simple to use, and CSO & LG personnel can be trained to carry out and analyze the annual surveys  ‘Start-up’ training costs may be high, but this is often a one-off expense, since persons already employed at district level are utilized for the surveys  At a cost of ~2,700 USD per district, LQAS is suitable for annual routine data collection even in resource-limited settings