Common lessons on how we measure service delivery
Markus Goldstein, World Bank


Outline
– Big picture issues: conceptualization and definition of the exercise
– Getting started: issues involved in the design of the exercise
– Implementation issues

The biggest picture: is it worth it?
Costs:
– Monetary costs
– Frictions generated during data collection
– Things that will go wrong or get complicated and require time and effort
Benefits (the uses to which the data will be put):
– Strengthen monitoring
– Increase accountability
– Answer policy-relevant research questions
– Rigorously evaluate programs
You affect both sides of this ledger. In the end, weigh feasibility against potential policy impact.

The Bigger Picture
Carefully consider scope versus depth:
– What is the central question this exercise is designed to answer? e.g. report cards capture perceptions, but these may not correlate with outcomes (Bihar vs. Kerala; Lundberg)
– Too many cooks, while providing a strong constituency, can spoil the broth
– PETS example: focus on one piece of the puzzle, even asking about specific funds by name (Filmer)

The Bigger Picture
Weigh the trade-off between relevance and comparability:
– Relevance: fairly accurate measurement, covering a swath of the population of concern, and useful for local decision making
– Comparability: results can be compared across providers and settings, including across countries
– More accurate usually means less likely to be comparable

The Bigger Picture
Make sure the data will be used:
– Data are particularly underused in multitopic surveys
– Identify concrete uses and applications (and the person responsible) beforehand
– A dissemination and consultation plan helps
– Be open to new uses in the early phases of design and collection

Getting started
Administrative data are the starting point for any exercise:
– Admin data can sometimes get us farther than we think: broader coverage, more frequently collected than surveys
– Examine admin data before fielding a purposive survey
– If possible, use the service delivery data exercise to strengthen admin data (e.g. PETS and financial systems)
– Make sure improving admin data is part of the national agenda
– Admin data will often be the first step (e.g. a school survey needs a school census)

Getting started
Admin data that are actively used are more likely to be of better quality.
When looking at the effects of a program, admin data may provide critical information:
– Complement with discussions with program staff to better understand what the data mean
The service delivery measurement process may itself change the underlying process that produces the data:
– e.g. report cards, PETS (better at hiding, or a better system?)

Getting started
Be sure to tread lightly:
– Cooperation of staff is critical (except for absenteeism checks)
– Unannounced visits are risky and potentially expensive
– Broaden the scope in some cases to make the survey less threatening
Build cooperation around the service delivery measurement exercise:
– Bring together ministries, donors, and researchers (Bekh et al.: consultative/advisory teams)
– Costly, but with benefits: it identifies diverse data needs, and the data are more likely to be used

Getting started
Use the service delivery measurement exercise to strengthen institutions:
– This is an opening to build institutional capacity and commitment to the collection and use of monitoring data
– Housing it within a ministry helps build capacity
Triangulate to improve accuracy:
– e.g. informal payments as measured in facility surveys versus exit surveys (Lundberg)
– Compare the same source at different levels (central vs. facility) or different modes of reporting (e.g. different individuals within a facility)
– e.g. ghost workers
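The ghost-worker check above can be sketched as a simple set comparison between a central payroll and staff actually verified during a facility visit. This is a minimal illustration; all staff IDs and figures are hypothetical.

```python
# Sketch: triangulating a central payroll list against staff actually
# observed during a facility visit, to flag possible "ghost workers".
# All IDs are hypothetical.
payroll = {"ST001", "ST002", "ST003", "ST004", "ST005"}  # on central payroll
observed = {"ST001", "ST003", "ST004"}                   # verified on site

possible_ghosts = sorted(payroll - observed)  # paid but never seen
unrecorded = sorted(observed - payroll)       # seen but not paid centrally

print("possible ghosts:", possible_ghosts)
```

In practice a single unannounced visit is weak evidence, so flagged IDs would be re-checked across several visits before any conclusion is drawn.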

How much did you pay today? (from exit survey)

Expenditure on health services (from household survey)
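The two chart slides above contrast payment reports from exit surveys and household surveys. One way to act on such discrepancies is a crude screening rule that flags facilities where the two instruments diverge sharply. A minimal sketch; facility names, amounts, and the 50% threshold are all hypothetical choices, not a standard.

```python
# Sketch: screening for facilities where payments reported in an exit
# survey diverge sharply from payments reported in a household survey.
exit_survey = {"clinic_a": 1200, "clinic_b": 0, "clinic_c": 450}   # mean payment
household = {"clinic_a": 2500, "clinic_b": 800, "clinic_c": 500}   # mean payment

# Flag facilities where the gap exceeds 50% of the household figure
# (guarding against division issues when the figure is zero).
flagged = sorted(f for f in exit_survey
                 if abs(household[f] - exit_survey[f]) > 0.5 * max(household[f], 1))
```

Flagged facilities are candidates for follow-up, not proof of informal payments: recall periods, question wording, and who answers all differ between the two instruments.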

Getting started
Keep the broader picture in mind when designing the measurement tool:
– e.g. an Indonesia survey tracked central government transfers but also showed local governments were reducing their own allocations (Filmer); think general equilibrium
Do qualitative work:
– Qualitative work is useful not only for refining the quantitative tool, but for deciding which tool to use
– Iterate again later (if you can) to better understand results

Getting started
Do pilot testing:
– Check whether the tool is producing the data you need
– We have less experience about what works and what does not in service delivery measurement
Think carefully about the unit of observation and the sampling strategy:
– e.g. demand side versus supply side
Think about the level of disaggregation:
– This is a particular issue with admin data, although lower-level data sometimes exist and can be collected
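When the facility is the unit of observation, a common design choice is to stratify the sample, for example by region, so every stratum is represented. A minimal sketch with hypothetical facility lists and an arbitrary 20% sampling fraction:

```python
import random

# Sketch: stratified sampling of facilities by region. The regions,
# facility IDs, and 20% fraction are hypothetical illustrations.
facilities = [("north", f"N{i}") for i in range(20)] + \
             [("south", f"S{i}") for i in range(10)]

by_region = {}
for region, fid in facilities:
    by_region.setdefault(region, []).append(fid)

random.seed(0)  # reproducible draw
# Sample 20% of facilities within each stratum, at least one per stratum
sample = {r: random.sample(ids, max(1, len(ids) // 5))
          for r, ids in by_region.items()}
```

Real designs would weight strata by service volume or population, and often sample users within facilities as a second stage; this only shows the mechanics.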

Getting started
Examine potential ways to apply geographical information:
– A powerful way to integrate, link, and display different data
– Requires proper training
– Geo-referenced data also need to be time-referenced
– Central coordination helps avoid duplication and potential discrepancies
– Geo data on households can be sensitive
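As one concrete use of geo-referencing, household and facility coordinates can be linked by great-circle distance, e.g. to attach each household to its nearest facility. A minimal sketch; the coordinates are hypothetical.

```python
from math import radians, sin, cos, asin, sqrt

# Sketch: linking a geo-referenced household to its nearest facility.
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

facilities = {"clinic_a": (0.35, 32.58), "clinic_b": (0.40, 32.70)}
household = (0.36, 32.60)

nearest = min(facilities,
              key=lambda f: haversine_km(*household, *facilities[f]))
```

Straight-line distance is only a proxy for access; travel time over the road network usually matters more, which is one reason proper GIS training is flagged above.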

Getting started
Don’t forget mobile service providers.
When looking at policy or program changes, don’t forget implementation lags, or even information lags.

Implementation
Listen to survey and data collection teams. Useful areas include:
– Gaps in the data collection instrument
– Effectiveness of questions
– Implementation lags
Make sure to debrief early.
Time the data collection carefully:
– When are records destroyed?
– Is use of facilities seasonal?
– Environmental conditions may change over the year
– The flow of funds between the center and facilities may vary over the year

Implementation
Be specific about definitions (e.g. in PETS).
Linking different types of data increases the power of the analysis, but is messy:
– Identifiers can vary (administrative/political units, different administrative levels, by project, etc.)
– The earlier you start, the better
– Consider GIS and backup matching methods
Set up your data so that it can be used by others.
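The identifier problem above often comes down to the same facility appearing under differently punctuated names in two sources. A minimal record-linkage sketch, with hypothetical facility names and figures; real exercises typically need fuzzier matching than this.

```python
# Sketch: linking records from two sources whose facility identifiers
# differ in case and punctuation.
def normalize(raw):
    """Reduce an identifier to lowercase alphanumerics for matching."""
    return "".join(ch for ch in raw.lower() if ch.isalnum())

admin = {"KAMPALA-HC-II": 150}    # e.g. budgeted amounts from the center
survey = {"kampala hc ii": 120}   # e.g. receipts reported at the facility

linked = {}
for raw_a, budget in admin.items():
    for raw_s, receipts in survey.items():
        if normalize(raw_a) == normalize(raw_s):
            linked[normalize(raw_a)] = (budget, receipts)
```

Building a shared master facility list early, as the slide suggests, is far cheaper than reconstructing these links after the fact.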

Implementation
Consider the potential for applications of census data, including small area estimation (census data are common).
Use a data entry program that allows rapid identification and rectification of mistakes.
Make copies of all records discovered (you don’t know when you’ll need them) and used (for checking later).
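Rapid identification of mistakes usually means range and consistency checks that fire at data entry, while the team can still revisit the facility. A minimal sketch; the field names and bounds are hypothetical.

```python
# Sketch: range and consistency checks run at data entry.
def check_record(rec):
    """Return a list of validation errors for one facility record."""
    errors = []
    # Staff present must be non-negative and not exceed total staff
    if not (0 <= rec.get("staff_present", -1) <= rec.get("staff_total", 0)):
        errors.append("staff_present out of range")
    if rec.get("patients_today", 0) < 0:
        errors.append("negative patient count")
    return errors

good = {"staff_total": 8, "staff_present": 6, "patients_today": 40}
bad = {"staff_total": 5, "staff_present": 9, "patients_today": -1}
```

Checks like these catch keying errors, not deliberate misreporting; the triangulation ideas earlier in the deck address the latter.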

Thank you
Are You Being Served? on the web:

Report cards (Lundberg)
Compare facility survey data with report cards in Uganda.
Significant correlations:
– Waiting time (-)
– Consultation time (-)
– Treated politely (+)
– Asked questions (+)
Not significant:
– Given physical exam
– Touched during examination
– Pulse taken

Perceptions unpacked (Lundberg)
Compare facility survey data with exit polls in Uganda.
Significant correlations:
– Waiting time (-)
– Consultation time (-)
– Treated politely (+)
– Asked questions (+)
Not significant:
– Given physical exam
– Touched during examination
– Pulse taken
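The signed correlations on these slides come from comparing an objective facility measure against a perception score across facilities. A minimal sketch of that kind of check; the waiting times and satisfaction ratings are hypothetical, not Lundberg's data.

```python
from statistics import mean

# Sketch: Pearson correlation between an objective measure (waiting time)
# and a perception score (satisfaction). All figures are hypothetical.
def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

waiting_minutes = [10, 25, 40, 60, 90]      # measured at the facility
satisfaction = [4.5, 4.0, 3.2, 2.8, 2.0]    # patient rating, 1-5 scale

r = pearson(waiting_minutes, satisfaction)  # negative, as on these slides
```

A significance test (and clustering by facility) would be needed before reading much into any single correlation.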