
1 There and Back Again: An Impact Evaluator’s Tale Paul Gertler University of California, Berkeley July, 2015

2 Answer these questions
1. What is impact evaluation?
2. What makes a good impact evaluation?
3. Why is impact evaluation valuable?


4 Impact Evaluation An assessment of the causal effect of a project, program, or policy on beneficiary outcomes. Estimates the change in outcomes attributable to the intervention.

5 Impact Evaluation Answers
o What is the effect of information on hand washing, hygiene, and child health?
o Does paying primary health care workers for performance improve access & quality?
o Do early childhood education programs improve subsequent learning?
o Does expansion of urban sewers reduce open defecation & improve health?

6 Impact Evaluation Answers
o What was the effect of the program on outcomes?
o How much better off are the beneficiaries because of the program/policy?
o How would outcomes change if the program design changed?
o Is the program cost-effective?
Traditional M&E cannot answer these.

7 Answer these questions
1. What is impact evaluation?
2. What makes a good impact evaluation?
3. Why is impact evaluation valuable?

8 How are Impact Evaluations Useful?
o To inform program design
o As an input to funding decisions
o As a means of influencing ideas

9 How are Impact Evaluations Useful? As an input to funding decisions

10 PROGRESA / Oportunidades (Mexico) [Inform Program Design | Influence Ideas | Input to Funding Decisions]
o Households paid to send children to school and attend regular health checkups
o First evaluation of a large-scale CCT program
o Evaluation shows significant impacts on education and health → scaled up & adopted by new presidential administration
[Photo: families enrolled in Oportunidades]

11 Roof Rain Water Cisterns (Brazil) [Inform Program Design | Influence Ideas | Input to Funding Decisions]
o Northeastern states are very dry
o Collect rain from roofs during the rainy season; store it in a cistern
o Evaluation: no significant impact on health
o But families value the cisterns: higher house values and reduced depression

12 Water Sensors

13 Low Cost Pre-Schools (Mozambique) [Inform Program Design | Influence Ideas | Input to Funding Decisions]
o Pre-schools constructed in 30 villages (Save the Children)
o Volunteer community members trained to staff the schools
o Evaluation shows significant impacts on health and education indicators → scaled up to cover 600 rural communities

14 How are Impact Evaluations Useful? To inform program design

15 Improving Access to Essential Medicines (Zambia) [Influence Ideas | Input to Funding Decisions | Inform Program Design]
o Clinics receive monthly supplies from district stores: $14.50 / day of averted stock-out
o Clinics receive supplies direct from central stores: $4.20 / day of averted stock-out
o Stock-outs are reduced under both distribution systems
o Direct distribution more cost-effective → replicated across Zambia (World Bank, 2010)

16 Targeting the Poor (Indonesia) [Influence Ideas | Input to Funding Decisions | Inform Program Design]
o Randomized field experiments evaluating the accuracy of 3 methods for targeting the poor: community-based, proxy means test (PMT), self-targeting
o PMT & community-based found to be most accurate → findings used by Indonesian government to build a registry of the poorest 40% (World Bank, 2012)

17 Ongoing work
o Slum sanitation connection fees (Kenya)
o Private sanitation facility subsidies (India)
o Social versus private
o Management of public toilets (India)
o Behavior change (CLTS)

18 How are Impact Evaluations Useful? As a means of influencing ideas

19 Conditional Cash Transfer (CCT) Programs [Input to Funding Decisions | Inform Program Design | Influence Ideas]
[Map: countries implementing CCT programs in 1997]

20 Conditional Cash Transfer (CCT) Programs [Input to Funding Decisions | Inform Program Design | Influence Ideas]
[Map: countries implementing CCT programs in 2011]

21 Results-Based Financing (Rwanda) [Input to Funding Decisions | Inform Program Design | Influence Ideas]
o Health clinics paid to immunize children and encourage women to give birth in a clinic
o Treatment group received bonuses according to performance; control group received a grant regardless of performance
o Evaluation shows RBF has a significant impact on prenatal care: an increase of 15% of a standard deviation in treatment (RBF) facilities relative to control facilities
o Results inspire RBF designs in other countries, including Nigeria (Basinga et al., 2010)

22 Results-Based Financing (Argentina) [Input to Funding Decisions | Inform Program Design | Influence Ideas]
o Plan Nacer provides additional health care coverage to women and young children
o Federal resources allocated to provinces based on enrollment and health results achieved by each province
o Evaluation shows RBF has a significant impact on prenatal care and infant/maternal mortality → results inspire RBF programs in the Dominican Republic & Peru

23 Answer these questions
2. What makes a good impact evaluation?
3. Why is impact evaluation valuable?

24 How to assess impact
e.g. How much does a safe water intervention reduce diarrhea?
What is the beneficiary's incidence of diarrhea in the last 3 days with the program compared to without the program?
Compare the same individual with & without the program at the same point in time.
Formally, program impact is: α = (Y | P=1) − (Y | P=0)
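The impact formula can be made concrete with a tiny numerical sketch (hypothetical numbers, not from the talk): if we could observe the same beneficiary's outcome both with and without the program, the impact α would be a simple difference.

```python
# Hypothetical potential outcomes for one beneficiary (illustrative only):
# y1 = diarrhea episodes in the last 3 days WITH the program (P=1)
# y0 = diarrhea episodes in the last 3 days WITHOUT the program (P=0)
y1 = 1
y0 = 3

# Program impact: alpha = (Y | P=1) - (Y | P=0)
alpha = y1 - y0
print(alpha)  # -2: the program averts two episodes
```

The catch, developed on the following slides, is that for any real person only one of y1 or y0 is ever observed; the other is the counterfactual.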

25 Solving the evaluation problem
o We never observe the same individual with and without the program at the same point in time.
o Counterfactual: what would have happened without the program.
o Estimated impact is the difference between the treated observation and the counterfactual.
o Use a control group to estimate the counterfactual.
o The counterfactual is key to impact evaluation.

26 Possible Solutions Need to guarantee comparability of treatment and control groups, so that the ONLY remaining difference is the intervention. Consider:
o Experimental design/randomization
o Quasi-experiments

27 Counterfactual Criteria Treated & counterfactual groups:
(1) Have identical characteristics,
(2) Except for benefiting from the intervention.
The only reason for a difference in outcomes between treated and counterfactual is the intervention itself.

28 2 Counterfeit Counterfactuals
1. Before and after: the same individual before the treatment
2. Those not enrolled: those who chose not to enroll in the program, or those who were not offered the program
Problem: we cannot completely know why the treated are treated and the others are not.

29 1. Before and After: Example Agricultural assistance program
o Financial assistance to purchase inputs.
o Compare rice yields before and after the program.
o Before is a normal rainfall year, but after is a drought.
o Find a fall in rice yield. Did the program fail?
o Cannot separate (identify) the effect of the financial assistance program from the effect of rainfall.

30 2. Those not enrolled: Example Health insurance offered
o Compare health care utilization of those who bought insurance to those who did not
o Who buys insurance? Those who expect large medical expenditures
o Who does not? Those who are healthy
o Even with no insurance, those who did not buy would have lower medical costs than those who did
→ Poor estimate of the counterfactual
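The insurance example can be simulated. A minimal sketch (all numbers hypothetical): insurance truly lowers costs here, but because sicker people self-select into buying, the naive insured-vs-uninsured comparison gets even the sign of the effect wrong.

```python
import random

random.seed(0)

# Hypothetical expected medical costs for 10,000 people.
costs = [random.gauss(100, 30) for _ in range(10_000)]

# Suppose insurance truly REDUCES a holder's costs by 20 units.
TRUE_EFFECT = -20

# Self-selection: only people expecting above-average costs buy insurance.
insured = [c + TRUE_EFFECT for c in costs if c > 100]
uninsured = [c for c in costs if c <= 100]

mean = lambda xs: sum(xs) / len(xs)
naive_estimate = mean(insured) - mean(uninsured)

# The naive comparison comes out strongly POSITIVE (around +28), nowhere
# near the true -20: it confounds who-buys with what-insurance-does.
print(round(naive_estimate, 1))
```

The point is the slide's: the non-enrolled are a poor estimate of the counterfactual whenever the reasons for enrolling are related to the outcome.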

31 Program placement: example
o Government offers a family planning program to villages with high fertility
o Compare fertility in villages offered the program to fertility in other villages
o Program targeted based on fertility, so (1) treatment villages have high fertility and (2) counterfactual villages have low fertility
o Estimated program impact is confounded with the targeting criteria

32 What's wrong? Selection bias:
1. People choose to participate for specific reasons
2. Many times those reasons are related to the outcome of interest
3. Cannot separately identify the impact of the program from these other factors/reasons

33 Need to know…
o All the reasons why someone gets the program and others do not.
o All the reasons why individuals are in the treatment versus the control group.
If those reasons are correlated with the outcome, we cannot identify/separate the program impact from other explanations of differences in outcomes.

34 Possible Solutions Need to guarantee comparability of treatment and control groups, so that the ONLY remaining difference is the intervention. In this seminar we will consider:
o Experimental design/randomization
o Quasi-experiments (regression discontinuity, double differences)
o Instrumental variables

35 These solutions all involve… Knowing how the data are generated. Randomization:
o Gives all units an equal chance of being in the control or treatment group
o Guarantees that all factors/characteristics will be, on average, equal between the groups
o The only difference is the intervention
If randomization is not possible, we need transparent & observable criteria for who is offered the program.
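A minimal sketch of why randomization works (hypothetical numbers): assigning the program by coin flip makes the two groups comparable on average, so the simple difference in means recovers the true effect.

```python
import random

random.seed(1)

TRUE_EFFECT = -20  # the program truly reduces costs by 20 units (hypothetical)

# Hypothetical baseline outcomes for 10,000 people.
costs = [random.gauss(100, 30) for _ in range(10_000)]

# Randomize: every person has an equal chance of treatment or control.
random.shuffle(costs)
half = len(costs) // 2
treatment = [c + TRUE_EFFECT for c in costs[:half]]  # receives the program
control = costs[half:]                               # estimates the counterfactual

mean = lambda xs: sum(xs) / len(xs)
estimate = mean(treatment) - mean(control)

# Both groups are balanced on average, so the estimate lands near the
# true -20 (up to sampling noise).
print(round(estimate, 1))
```

With selection instead of a coin flip, the same difference in means would mix the program effect with the reasons people were treated; randomization removes those reasons by construction.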

36 Working Smarter in IE
o Be strategic in selecting programs to evaluate
o Use IEs to fill knowledge gaps & assess alternatives for key programs
o Use IE to maximize program efficiency, not just impact: impact evaluation as an operational research tool
o Don't assume costly data collection is always necessary: use administrative data when possible

37 Ensuring the Impact of Impact Evaluations: A Tale of 2 Evaluations
o Engage early. Engage often. Involve stakeholders at every stage; make the IE prospective, part of program design.
o Work locally. Think globally. Foster relationships on the ground with decision-makers; results inform decisions beyond the borders of the country studied.

38 Water Privatization (Argentina)
o Municipal water & sanitation: public versus private management
o Evaluation: increase in water quality, increase in access by the poor, large reductions in child mortality
o No influence on policy: academic study without stakeholder involvement

39 Improving Housing in Urban Slums (Piso Firme) [Inform Program Design | Influence Ideas | Input to Funding Decisions]
o Replacing dirt floors with cement floors (Piso Firme)
o Evaluation shows significant impacts on reducing diarrhea and parasitic infections & improving cognitive development → scaled up to 3 million households
o Moving families to new housing developments (Tu Casa): no impact → program canceled

40 Ensuring the Impact of Impact Evaluations: A Tale of 2 Evaluations
o Engage early. Engage often. Involve stakeholders at every stage; make the IE prospective, part of program design.
o Work locally. Think globally. Foster relationships on the ground with decision-makers; results inform decisions beyond the borders of the country studied.

41 Messages
o IE is useful for policy: resource allocation, benefit design, influencing global ideas
o What to evaluate: high-cost programs with large numbers of beneficiaries and little existing knowledge
o Start early (prospective): build control groups into the rollout
o Work locally, think globally

