1 The National Evaluation Platform
Approach
Robert E. Black, MD, MPH
Institute for International Programs, Bloomberg School of Public Health, Johns Hopkins University

2 Outline
Why a new approach is needed
National Evaluation Platforms (NEPs): the basics
Country example: Malawi
Practicalities and costs

3 Most current evaluations of large-scale programs aim to use designs like this
[Diagram: program leads to coverage and impact; no program, no coverage and no impact]
Two basic designs at present:
Before-and-after evaluation in the program area
Comparison of program and non-program areas over time

4 But reality is much more complex
[Diagram: coverage and impact are shaped not only by the program itself but also by routine health services, other health and nutrition programs, interventions in other sectors, and general socioeconomic and other contextual factors]

5 Mozambique: How to evaluate the impact of USAID-supported programs?
Traditional approach: intervention versus comparison areas
Source: Hilde De Graeve, Bert Schreuder.

6 Mozambique: Simultaneous implementation of multiple programs
Separate, uncoordinated, inefficient evaluations, if any
Inability to compare different programs due to differences in methodological approaches and indicators
Source: Hilde De Graeve, Bert Schreuder.

7 New evaluation designs are needed
Large-scale programs:
Evaluators do not control the timetable or strength of implementation
Multiple simultaneous programs with overlapping interventions and aims
Contextual factors that cannot be anticipated
Need for country capacity and local evidence to guide programming
Sources: Victora CG, Bryce J, Black RE. Learning from new initiatives in maternal and child health. Lancet 2007; 370 (9593). Victora CG, Black RE, Bryce J. Evaluating child survival programs. Bull World Health Organ 2009; 87: 83.

8 National Evaluation Platforms: The Basics
(Lancet, 2011)

9 Builds on a common evaluation framework, adapted at country level
Common principles (with IHP+, Countdown, etc.)
Standard indicators
Broad acceptance

10 Evaluation databases with districts as the units
District-level databases covering the entire country, containing standard information on:
Inputs (partners, programs, budget allocations, infrastructure)
Processes/outputs (DHMT plans, ongoing training, supervision, campaigns, community participation, financing schemes such as conditional cash transfers)
Outcomes (availability of commodities, quality of care measures, human resources, coverage)
Impact (mortality, nutritional status)
Contextual factors (demographics, poverty, migration)
Permits national-level evaluations of multiple simultaneous programs

11 A single database with districts as the rows
[Diagram: districts (Chitipa, Karonga, ….) as rows; columns drawn from core data points in the health sector (HMIS, DHS, nutrition surveillance system, national stocks database) and from other sectors (women's education, rainfall patterns), with quality checking and feedback to the source]
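A district-row database like this can be assembled by joining per-source extracts on the district name. A minimal sketch in Python; the source dictionaries, field names, and values are all invented for illustration, not taken from the talk:

```python
# Hypothetical per-source extracts, keyed by district. Real feeds would come
# from HMIS, DHS, surveillance systems, and other sectors.
hmis = {"Chitipa": {"opd_visits_u5": 12400}, "Karonga": {"opd_visits_u5": 15800}}
dhs = {"Chitipa": {"careseeking_pct": 61.0}, "Karonga": {"careseeking_pct": 58.5}}
rainfall = {"Chitipa": {"annual_rainfall_mm": 900}, "Karonga": {"annual_rainfall_mm": 1100}}

def build_platform_rows(*sources):
    """Merge the source extracts into one row (dict) per district.
    A district missing from a source simply lacks that source's fields,
    which is the kind of gap the quality-checking step would flag back
    to the source."""
    rows = {}
    for source in sources:
        for district, fields in source.items():
            rows.setdefault(district, {"district": district}).update(fields)
    return rows

platform = build_platform_rows(hmis, dhs, rainfall)
```

Each row then carries every indicator available for that district, which is what allows the national-level analyses described on the following slides.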

12 Types of comparisons supported by the platform approach
Areas with or without a given program: traditional before-and-after analysis with a comparison group
Dose-response analyses: regression analyses of outcome variables according to dose of implementation
Stepped-wedge analyses: in case the program is implemented sequentially
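The dose-response comparison amounts to regressing a district-level outcome on implementation dose. A minimal sketch, with entirely hypothetical district values (illustrative only, not data from the platform):

```python
import statistics

# Hypothetical district-level pairs (invented numbers):
# dose = implementation strength (e.g., trained CHWs per 1,000 population)
# response = percentage-point change in coverage, baseline to endline
districts = {
    "Chitipa": (1.8, 9.0),
    "Karonga": (0.6, 3.5),
    "Mzimba": (2.4, 12.0),
    "Kasungu": (1.1, 5.0),
    "Dedza": (0.3, 1.0),
}

dose = [d for d, _ in districts.values()]
response = [r for _, r in districts.values()]

# Least-squares slope: coverage points gained per unit of implementation dose
mx, my = statistics.fmean(dose), statistics.fmean(response)
slope = sum((x - mx) * (y - my) for x, y in zip(dose, response)) \
        / sum((x - mx) ** 2 for x in dose)
intercept = my - slope * mx
```

In practice the platform would fit such regressions across all districts, with contextual factors as covariates; this toy version only shows the shape of the calculation.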

13 Evaluation platform Interim (formative) data analyses
(All analyses at district level)
Are programs being deployed where need is greatest? Correlate baseline characteristics (mortality, coverage, SES, health systems strength, etc.) with implementation strength; this allows assessment of placement bias
Is implementation strong enough to have an impact? Document implementation strength and run simulations of likely impact (e.g., LiST)
How best to increase coverage? Correlate implementation strength and approaches with achieved coverage (measured in midline surveys)
How can programs be improved? Disseminate preliminary findings with feedback to government and partners

14 Evaluation platform Summative data analyses
Did programs increase coverage?
Comparison of areas with and without each program over time
Dose-response time-series analyses correlating strength of program implementation with achieved coverage
Was coverage associated with impact?
Dose-response time-series analyses of coverage and impact indicators
Simulation models (e.g., LiST) to corroborate results
Did programs have an impact on mortality and nutritional status?
Dose-response time-series analyses correlating strength of program implementation with impact measures

15 The platform approach can contribute to all types of designs
Having baseline information on all districts allows researchers to measure and control for placement bias
In real life one cannot predict which districts will have strong implementation and which will not
In intervention-comparison designs, it is important to document that comparison districts are free of the intervention
Collecting information on several outcomes allows assessment of side effects of the program on other health indicators

16 Country Example: CCM in Malawi

17 Simultaneous implementation of multiple programs
Separate, uncoordinated, inefficient evaluations (if any)
Inability to compare different programs due to differences in methodological approaches and indicators
This shows the presence of the various funding partners across the provinces. As you can see, the whole country is covered with their flags and logos. Technical agencies such as WHO and UNICEF are present in practically all provinces. Others, mainly bilateral, are present in only one province (Denmark, EC, Finland, Flanders, France) or two (Catalunya, Ireland, Italy, Spain, Switzerland, UK). Within each province an agency often covers only one or a few districts.

18 Malawi CCM scale-up limits use of intervention-comparison design
CCM supported in all districts beginning in 2009… and implemented in hard-to-reach areas (March 2011)
[Figure: Proportion of hard-to-reach areas with ≥1 functional village clinic, March 2011]

19 Malawi adaptation of National Evaluation Platform approach
National Evaluation Platform design using dose-response analysis, with:
Dose = program implementation strength
Response = increases in coverage; decreases in mortality
Evaluation question: Are increases in coverage and reductions in mortality greater in districts with stronger MNCH program implementation?

20 Platform design overview
Design element: Data sources (sample = 28 districts)
Documentation of program implementation and contextual factors: full documentation every 6 months through systematic engagement of DHMTs
Quality of care survey at 1st-level health facilities: existing 2009 data to be used for 18 districts; repeat survey in 2011
Quality of care at community level (HSAs): desirable to conduct in all 28 districts (not included in this budget proposal)
Intervention coverage: DHS 2010, with samples of 1,000 households representative at district level in all 28 districts; DHS/MICS 2014 with samples representative at district level in all 28 districts
Costs: costing exercises in ≈1/3 of districts, distributed by region and chosen systematically to reflect differences in implementation strategy or health system context
Impact (under-five mortality and nutritional status): end-line household survey (MICS or DHS?) in 2014; modeled estimates of impact based on measured changes in coverage using LiST

21 National Evaluation Platform: Progress in Malawi - 1
Continued district-level documentation in 16 districts
Pilot of cellphone interviews for community-level documentation
Stakeholder meeting in April 2011:
Full endorsement by the MOH
Partners urged to coordinate around developing a common approach for assessing CCM and non-CCM program implementation strength
Need to allow sufficient implementation time to increase the likelihood of impact
MOH addressed a letter to donors requesting support for the platform
Partners' meetings in September and December 2011 to agree on plans for measuring implementation strength

22 National Evaluation Platform: Progress in Malawi - 2
All partners (SCF, PSI, WHO, UNICEF) actively monitoring CCM implementation in their districts
Funding secured for 16 of 28 districts; additional funding for the remaining districts seems probable
Discussions under way about broadening the platform to cover nutrition programs
Other countries expressing interest: Mozambique, Bangladesh, Burkina Faso, …

23 Analysis Plan
Dose: CCM implementation strength (per 1,000 population)
CHWs
CHWs trained in CCM
CHWs supervised
CHWs with essential commodities available
Financial inputs
Response:
Change in treatment rates for childhood illnesses
Change in under-five mortality (U5M)
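The dose side of this plan can be operationalized as per-1,000-population indicators combined into a composite score. A sketch with invented numbers; the unweighted mean used to combine components is an assumption for illustration, not the plan's actual weighting:

```python
# Hypothetical district inputs (all numbers invented for illustration):
district = {
    "population": 250_000,
    "chws": 310,
    "chws_trained_ccm": 240,
    "chws_supervised": 180,
    "chws_with_commodities": 150,
}

def per_1000(count):
    """Express a head count per 1,000 population, as in the analysis plan."""
    return count / (district["population"] / 1000)

dose_components = {
    "chws_per_1000": per_1000(district["chws"]),
    "trained_per_1000": per_1000(district["chws_trained_ccm"]),
    "supervised_per_1000": per_1000(district["chws_supervised"]),
    "stocked_per_1000": per_1000(district["chws_with_commodities"]),
}

# Unweighted mean as one simple composite; the real weighting of components
# would be agreed with partners, not assumed here.
dose_score = sum(dose_components.values()) / len(dose_components)
```

Computing the same score for every district each documentation round yields the "dose" series that the response measures are regressed against.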

24 Contextual Factors (categories and indicators)
Environmental, demographic and socioeconomic:
Rainfall patterns: average annual rainfall; seasonal rain patterns
Altitude: height above sea level
Epidemics: qualitative
Humanitarian crises
Socio-economic factors: women's education and literacy; household assets; ethnicity, religion and occupation of head of household
Demographic: population; population density; urbanization; total fertility rate; family size
Fuel costs! Added as this slowed program implementation
Health systems and programs:
User fees: changes in user fees for IMCI drugs
Other MNCH health programs: the presence of other programs or partners working in MNCH

25 Advantageous context for NEP
Strong network of MNCH partners implementing CCM
Administrative structure decentralized to districts
SWAp II in development now
District-level databases (2006 MICS, 2010 DHS, Malawi Socio-Economic Database (MASEDA))
DHS includes approx. 1,000 households in each district

26 Practicalities and Limitations

27 Sample sizes must be calculated on a country-by-country basis
Statistical power (the likelihood of detecting an effect) will depend on:
Number of districts in the country (fixed; e.g., 28 in Malawi)
How strongly the program is implemented, and how much implementation affects coverage and mortality
How much implementation varies from district to district
Baseline coverage levels
Presence of other programs throughout the districts
How many households are included in surveys in each district (may require oversampling)
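How power depends on district count, effect size, and between-district variation can be explored by simulation. A sketch using a permutation test on a simulated district-level dose-response slope; all effect sizes, dose ranges, and noise levels are illustrative assumptions, not estimates for any country:

```python
import random
import statistics

def ols_slope(x, y):
    """Least-squares slope of y on x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def simulated_power(n_districts=28, true_slope=1.0, noise_sd=5.0,
                    n_sims=200, n_perm=200, alpha=0.05, seed=1):
    """Monte Carlo power of a permutation test for a district-level
    dose-response slope. Doses are drawn uniformly on 0-10; responses are
    slope * dose plus Gaussian noise (all values illustrative)."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sims):
        dose = [rng.uniform(0, 10) for _ in range(n_districts)]
        response = [true_slope * d + rng.gauss(0, noise_sd) for d in dose]
        observed = abs(ols_slope(dose, response))
        shuffled = response[:]
        extreme = 0
        for _ in range(n_perm):
            rng.shuffle(shuffled)  # break any dose-response link
            if abs(ols_slope(dose, shuffled)) >= observed:
                extreme += 1
        if (extreme + 1) / (n_perm + 1) < alpha:
            rejections += 1
    return rejections / n_sims
```

Varying `n_districts` or `noise_sd` shows how quickly power erodes when a country has few districts, implementation varies little between them, or district-level surveys are noisy, which is why the sample-size calculation must be redone country by country.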

28 Practical arrangements
Platform should be led by a credible independent partner (e.g., a university or statistical office), supported by an external academic group if necessary
Steering committee with the MOH and other relevant government units (Finance, Planning), the Statistical Office, international and bilateral organizations, NGOs, etc.

29 Main costs of the platform approach
Building and maintaining a database with secondary information already collected by others: requires a database manager and a statistician/epidemiologist for supervision; may require reanalysis of existing surveys, censuses, etc.
Keeping track of implementation of different programs at district level: requires hiring local informants, training them and supervising their work
Adding special assessments (costs, quality of care, etc.): may require substantial investments in facility or CHW surveys
Oversampling household surveys: may require substantial investments, but this will not be required in all countries

30 Summary: Evaluation platform
Limitations:
Observational design (but no other alternative is possible)
High cost, particularly due to the large size of surveys (but cheaper than standalone surveys)
Requires transparency and collaboration by multiple programs and agencies
Advantages:
Adapted to the current reality of multiple simultaneous programs and interventions
Identification of selection biases
Promotes country ownership and donor coordination
Evaluation as a continuous process
Flexible design allows for changes in implementation

31 Thank you
