
1 Common lessons on how we measure service delivery
Markus Goldstein, World Bank

2 Outline
Big picture issues – conceptualization and definition of the exercise
Getting started – issues involved in the design of the exercise
Implementation issues

3 The biggest picture: is it worth it?
Costs
– Monetary costs
– Frictions generated during data collection
– Things that will go wrong or get complicated and require time and effort
Benefits – the uses to which the data will be put
– Strengthen monitoring
– Increase accountability
– Answer policy-relevant research questions
– Rigorously evaluate programs
You affect both of these. In the end, weigh feasibility against potential policy impact.

4 The Bigger Picture
Carefully consider scope versus depth
– What is the central question this exercise is designed to answer?
– e.g. report cards give perceptions, but these may not correlate with outcomes (Bihar v. Kerala; Lundberg)
– Too many cooks, while providing a strong constituency, can spoil the broth
– e.g. PETS: focus on one piece of the puzzle, even ask about funds by name (Filmer)

5 The Bigger Picture
Weigh the trade-off between relevance and comparability
– Relevance: fairly accurate measurement, covering a swath of the population of concern, and useful for local decision making
– Comparability: results can be compared across providers and settings, including across countries
– More accurate = less likely to be comparable

6 The Bigger Picture
Make sure the data will be used
– Data are particularly underused in multitopic surveys
– Identify concrete uses and applications (and the person responsible) beforehand
– A dissemination and consultation plan helps
– Be open to new uses in the early phases of design and collection

7 Getting started
Administrative data are the starting point for any exercise
– Admin data can sometimes get us farther than we think – broader coverage, more frequently collected than surveys
– Examine admin data before fielding a purposive survey
– If possible, use the service delivery data exercise to strengthen admin data (e.g. PETS – financial systems)
– Make sure improving admin data is part of the national agenda
– Admin data will be the first step (e.g. a school survey needs a school census)

8 Getting started
Admin data that are actively used are more likely to be of better quality
When looking at the effects of a program, admin data may provide critical information
– Complement with discussions with program staff to better understand what the data mean
The service delivery measurement process may itself change the underlying process that produces the data
– e.g. report cards, PETS (better at hiding, or a better system)

9 Getting started
Be sure to tread lightly
– Cooperation of staff is critical (except for absenteeism studies)
– Unannounced visits are risky and potentially expensive
– Broaden the scope in some cases to make the survey less threatening
Build cooperation around the service delivery measurement exercise
– Bring together ministries, donors, and researchers (Bekh et al.: consultative/advisory teams)
– Costly, but with benefits (identifies diverse data sources; results are more likely to be used)

10 Getting started
Use the service delivery measurement exercise to strengthen institutions
– This is an opening to build institutional capacity and commitment to the collection and use of monitoring data
– Housing it within the ministry helps build capacity
Triangulate to improve accuracy
– e.g. informal payments as measured in facility surveys versus exit surveys (Lundberg)
– Compare the same source at different levels (central or facility) or different modes of reporting (e.g. different individuals within a facility) – e.g. ghost workers
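Triangulation of this kind can be mechanized. The sketch below is a minimal illustration, not a prescribed method: it assumes a hypothetical central payroll list and a facility-level staff roster, and flags names paid centrally but never observed at the facility as candidate ghost workers (with the reverse discrepancy surfaced as well).

```python
# Minimal sketch: flag candidate "ghost workers" by comparing two
# independent reports of the same staff list. All names are hypothetical.

def triangulate_staff(central_payroll, facility_roster):
    """Return (names on payroll never observed at the facility,
    names observed at the facility but missing from the payroll)."""
    paid = set(central_payroll)
    observed = set(facility_roster)
    candidate_ghosts = sorted(paid - observed)
    unpaid_observed = sorted(observed - paid)
    return candidate_ghosts, unpaid_observed

# Hypothetical example: payroll vs. staff seen during facility visits
payroll = ["A. Okello", "B. Nansubuga", "C. Mugisha", "D. Atim"]
roster = ["A. Okello", "C. Mugisha", "E. Ssali"]

ghosts, unpaid = triangulate_staff(payroll, roster)
print(ghosts)  # paid centrally but never seen at the facility
print(unpaid)  # seen at the facility but not on the payroll
```

In practice a name appearing on only one list is a lead to investigate (spelling variants, transfers, timing of visits), not proof of fraud, which is why the sketch reports both directions of mismatch rather than a verdict.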

11 How much did you pay today? (from exit survey)

12 Expenditure on health services (from household survey)

13 Getting started
Keep the broader picture in mind in designing the measurement tool
– e.g. an Indonesia survey tracked central government transfers but also showed local governments were reducing their allocations (Filmer) – think general equilibrium
Do qualitative work
– Qualitative work is useful not only for refining the quantitative tool, but for deciding which tool to use
– Iterate again later (if you can) to better understand results

14 Getting started
Do pilot testing
– Check whether the tool is producing the data you need
– We have less experience about what works and what does not in service delivery
Think carefully about the unit of observation and the sampling strategy
– e.g. demand side versus supply side
Think about the level of disaggregation
– A particular issue with admin data – although sometimes lower-level data exist and it is possible to collect them
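One common way to operationalize the sampling decision on the supply side is a stratified random draw, e.g. sampling facilities within districts so that no district drops out of the sample by chance. A minimal sketch, using a hypothetical sampling frame of clinics:

```python
import random

def stratified_sample(units_by_stratum, n_per_stratum, seed=0):
    """Draw up to n units at random from each stratum (e.g. clinics
    per district). A fixed seed makes the draw reproducible."""
    rng = random.Random(seed)
    sample = {}
    for stratum, units in units_by_stratum.items():
        k = min(n_per_stratum, len(units))  # small strata: take what exists
        sample[stratum] = sorted(rng.sample(units, k))
    return sample

# Hypothetical frame: clinics listed by district
frame = {
    "District A": ["clinic_1", "clinic_2", "clinic_3", "clinic_4"],
    "District B": ["clinic_5", "clinic_6"],
    "District C": ["clinic_7", "clinic_8", "clinic_9"],
}

for district, clinics in stratified_sample(frame, n_per_stratum=2).items():
    print(district, clinics)
```

If sampling probabilities differ across strata (as with the unequal district sizes above), the analysis needs sampling weights to recover population-level estimates.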

15 Getting started
Examine potential ways to apply geographical information
– A powerful way to integrate, link, and display different data
– Requires proper training
– Geo-referenced data need to be time-referenced as well
– Central coordination will help avoid duplication and potential discrepancies
– Geo data on households can be sensitive

16 Getting started
Don't forget mobile service providers
In looking at policy or program changes, don't forget implementation lags, or even information lags

17 Implementation
Listen to survey and data collection teams
– Useful areas include gaps in the data collection instrument, effectiveness of questions, and implementation lags
– Make sure to debrief early
Time the data collection carefully
– When are records destroyed?
– Is use of facilities seasonal?
– Environmental conditions may change over the year
– The flow of funds between the center and facilities may vary over the year

18 Implementation
Be specific about definitions (e.g. PETS)
Linking different types of data increases the power of the analysis, but is messy
– Identifiers can vary (admin/political units, different admin levels, by project, etc.)
– The earlier you start, the better
– Consider GIS and backup matching methods
Set up your data so that it can be used by others
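The identifier problem usually means building an explicit crosswalk before any merge. A minimal sketch, assuming hypothetical district codes that differ between an admin spending dataset and a survey dataset; the point is that unmatched records are reported, not silently dropped:

```python
# Minimal sketch: link two datasets whose district identifiers differ,
# via an explicit crosswalk. All codes and values are hypothetical.

# Crosswalk: ministry district code -> survey district code
crosswalk = {"MOH-01": "D001", "MOH-02": "D002", "MOH-03": "D003"}

admin_spending = {"MOH-01": 120000, "MOH-02": 95000, "MOH-04": 50000}
survey_outcomes = {"D001": 0.62, "D002": 0.55, "D003": 0.71}

def link(admin, survey, xwalk):
    """Merge on harmonized IDs; collect records that fail to match
    so they can be investigated early."""
    linked, unmatched = {}, []
    for admin_id, spend in admin.items():
        survey_id = xwalk.get(admin_id)
        if survey_id in survey:
            linked[survey_id] = (spend, survey[survey_id])
        else:
            unmatched.append(admin_id)
    return linked, unmatched

linked, unmatched = link(admin_spending, survey_outcomes, crosswalk)
print(linked)     # matched district records: (spending, outcome)
print(unmatched)  # ['MOH-04'] has no crosswalk entry: investigate
```

Starting the crosswalk early, as the slide advises, is what makes the unmatched list short by the time the analysis runs.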

19 Implementation
Consider the potential applications of census data, including small area estimation (census data are common…)
Use a data entry program that allows rapid identification and rectification of mistakes
Make copies of all records discovered (you don't know when you'll need them) and used (for checking later)
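Rapid identification of mistakes typically means range and consistency checks that run while the paper form is still at hand. A minimal sketch, with hypothetical field names and bounds:

```python
def check_record(record):
    """Return a list of validation errors for one entered record,
    so mistakes can be fixed at entry time rather than in analysis."""
    errors = []
    # Single-field range checks
    if not (0 <= record.get("wait_minutes", -1) <= 600):
        errors.append("wait_minutes out of range 0-600")
    if record.get("fee_paid", 0) < 0:
        errors.append("fee_paid is negative")
    # Cross-field consistency check
    if record.get("visited") == "no" and record.get("fee_paid", 0) > 0:
        errors.append("fee paid but no visit recorded")
    return errors

good = {"visited": "yes", "wait_minutes": 45, "fee_paid": 500}
bad = {"visited": "no", "wait_minutes": 999, "fee_paid": 200}

print(check_record(good))  # []
print(check_record(bad))   # two errors flagged for correction
```

Dedicated data entry packages build such checks into the entry screen itself; the sketch only shows the logic they encode.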

20 Thank you
Are You Being Served? on the web:

21 Report cards (Lundberg)
Compare facility survey data with report cards in Uganda
Significant correlations:
– Waiting time (-)
– Consultation time (-)
– Treated politely (+)
– Asked questions (+)
Not significant:
– Given physical exam
– Touched during examination
– Pulse taken

22 Perceptions unpacked (Lundberg)
Compare facility survey data with exit polls in Uganda
Significant correlations:
– Waiting time (-)
– Consultation time (-)
– Treated politely (+)
– Asked questions (+)
Not significant:
– Given physical exam
– Touched during examination
– Pulse taken
