
1 Impact Evaluation for Evidence-Based Policy Making. Arianna Legovini, Lead Specialist, Africa Impact Evaluation Initiative

2 How to turn this child…

3 …into this child

4 Why Evaluate?
- Fiscal accountability
  - Allocate limited budget to what works best
- Program effectiveness
  - Managing by results: do more of what works
- Political sustainability
  - Negotiate budget
  - Inform constituents

5 Traditional M&E and Impact Evaluation
- Monitoring tracks implementation efficiency from inputs to outputs ("monitor efficiency")
- Impact evaluation measures effectiveness from outputs to outcomes ("evaluate effectiveness")
[Slide diagram: results chain $$$ / INPUTS -> OUTPUTS -> OUTCOMES, with BEHAVIOR linking outputs to outcomes]

6 Question types and methods
M&E: monitoring & process evaluation (descriptive analysis)
- Is the program being implemented efficiently?
- Is the program targeting the right population?
- Are outcomes moving in the right direction?
Impact evaluation (causal analysis)
- What was the effect of the program on outcomes?
- How would outcomes change under alternative program designs?
- Is the program cost-effective?

7 Answer with traditional M&E or IE?
- Are nets being delivered as planned? M&E
- Do IPTs increase cognitive ability? IE
- What is the correlation between HIV treatment and prevalence? M&E
- How does HIV testing affect prevention behavior? IE

8 Efficacy & Effectiveness
- Efficacy:
  - Proof of concept
  - Pilot under ideal conditions
- Effectiveness:
  - At scale
  - Normal circumstances & capabilities
  - Lower or higher impact?
  - Higher or lower costs?

9 Use impact evaluation to…
- Test innovations
- Scale up what works (e.g. de-worming)
- Cut/change what does not (e.g. HIV counseling)
- Measure effectiveness of programs (e.g. JTPA)
- Find the best tactics to change people's behavior (e.g. bring children to school)
- Manage expectations

10 What makes a good impact evaluation?

11 Evaluation problem
- Ideal: compare the same individual with & without the program at the same point in time
- BUT: we never observe the same individual with and without the program at the same point in time
- Formally, the impact of the program is: α = (Y | P=1) - (Y | P=0)
- Example: how much does an anti-malaria program lower under-five mortality?
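A minimal sketch of this identification problem in code, assuming made-up outcomes (the names y0 and y1 and the impact of 5 are illustrative, not from the slides): each unit has an outcome with and without the program, but only one of the two is ever observed.

```python
# Minimal sketch of the evaluation problem: the true impact needs both
# potential outcomes, yet real data contain only one per unit.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000
y0 = rng.normal(100.0, 10.0, n)        # outcome without the program
y1 = y0 + 5.0                          # outcome with the program (true impact = 5)
enrolled = rng.random(n) < 0.5         # who actually receives the program

true_impact = (y1 - y0).mean()         # requires both outcomes: never observable
observed = np.where(enrolled, y1, y0)  # what a real data set would contain

print(true_impact)                     # 5.0 by construction
```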

12 Solving the evaluation problem
- Counterfactual: what would have happened without the program
- Estimate the counterfactual, i.e. find a control or comparison group
- Counterfactual criteria:
  - Treated & counterfactual groups have identical initial average characteristics
  - The only reason for the difference in outcomes is the intervention

13 "Counterfeit" Counterfactuals
- Before and after:
  - The same individual before the treatment
- Non-participants:
  - Those who choose not to enroll in the program, or
  - Those who were not offered the program
  - Problem: we cannot determine why some are treated and some are not

14 Before and After Example
Food aid:
- Compare mortality before and after
- Observe that mortality increases
- Did the program fail?
- "Before" was a normal year, but "after" was a famine year
- Cannot separate (identify) the effect of food aid from the effect of the drought

15 Before & After
- Compare Y before & after the intervention
- Before & after counterfactual = B
- Estimated impact = A - B
- Control for time-varying factors:
  - True counterfactual = C
  - True impact = A - C
  - A - B under-estimates the true impact
[Slide figure: outcome Y over time from t-1 (before) to t (after treatment), showing the observed outcome A, the before & after counterfactual B, and the true counterfactual C]
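A minimal numeric sketch of this bias, with made-up mortality figures continuing the food-aid example from the previous slide:

```python
# Minimal sketch with invented numbers: the before-and-after comparison (A - B)
# misses what would have happened anyway (the famine), so it under-states the
# program's true impact (A - C).
mortality_before = 50.0      # B: mortality in the normal year, before food aid
mortality_after = 60.0       # A: observed mortality in the famine year, with aid
mortality_no_aid = 80.0      # C: mortality the famine year would have seen
                             #    without aid (the true, unobserved counterfactual)

before_after_estimate = mortality_after - mortality_before  # A - B = +10 ("program failed"?)
true_impact = mortality_after - mortality_no_aid            # A - C = -20 (aid cut mortality)

print(before_after_estimate, true_impact)
```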

16 Non-Participants…
- Compare non-participants to participants
- Counterfactual: non-participant outcomes
- Problem: why did they not participate?
- Estimated impact: α_i = (Y_it | P=1) - (Y_kt | P=0)
- Hypothesis: (Y_kt | P=0) = (Y_it | P=0)
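A minimal simulation sketch of why this hypothesis can fail, assuming a made-up data-generating process in which sicker people select into the program:

```python
# Minimal sketch: when the sick self-select into the program, non-participants
# are a poor counterfactual and the naive comparison confounds the program
# effect with baseline health.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
baseline_health = rng.normal(0.0, 1.0, n)       # unobserved health status
participates = baseline_health < 0.0            # the sick are the ones who sign up
true_effect = 2.0
outcome = baseline_health + true_effect * participates + rng.normal(0.0, 1.0, n)

naive = outcome[participates].mean() - outcome[~participates].mean()
print(naive)   # roughly 0.4, far below the true effect of 2.0
```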

17 Exercise: why might participants and non-participants differ?
- Mothers who came to the health unit for ORT and mothers who did not? (Child had diarrhea; access to clinic)
- Communities that applied for funds for IRS and communities that did not? (Coastal and mountain; epidemic and non-epidemic)
- People who receive ART and people who do not? (People with HIV; access to clinic)

18 Health program example
- Treatment is offered; who signs up?
  - Those who are sick
  - Areas with epidemics
- These have lower health status than those who do not sign up
- Healthy people/communities are a poor estimate of the counterfactual

19 What's wrong?
- Selection bias: people choose to participate for specific reasons
- Many times these reasons are directly related to the outcome of interest
- Cannot separately identify the impact of the program from these other factors/reasons

20 Need to know…
- Why some get assigned to the treatment group and others to the control group
- If the reasons are correlated with the outcome, we cannot separately identify the program impact from these other "selection" factors
- In short: the process by which the data are generated

21 Possible Solutions…
- Guarantee comparability of treatment and control groups
- The ONLY remaining difference is the intervention
- How?
  - Experimental design / randomization
  - Quasi-experiments:
    - Regression discontinuity
    - Double differences (see the sketch after this slide)
  - Instrumental variables
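A minimal sketch of the double-difference idea, with made-up group means; the parallel-trends assumption it relies on is noted in the comments:

```python
# Minimal sketch of a double difference (difference-in-differences): subtracting
# the comparison group's change nets out a common time trend, assuming both
# groups would have trended in parallel without the program.
treated_before, treated_after = 50.0, 65.0
comparison_before, comparison_after = 48.0, 55.0

change_treated = treated_after - treated_before            # 15: program effect + trend
change_comparison = comparison_after - comparison_before   # 7: trend alone (assumed)
double_difference = change_treated - change_comparison     # 8: estimated impact

print(double_difference)
```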

22 These solutions all involve…
EITHER randomization:
- Give everyone an equal chance of being in the control or treatment group
- Guarantees that all factors/characteristics will be, on average, equal between the groups
- The only difference is the intervention
OR transparent & observable criteria for assignment into the program
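A minimal sketch of the balance property, assuming an illustrative baseline characteristic ("age" is not from the slides):

```python
# Minimal sketch: randomly assigning half the eligible units to treatment gives
# everyone the same chance and balances characteristics between groups on average.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
age = rng.normal(35.0, 10.0, n)           # a baseline characteristic
treated = rng.permutation(n) < n // 2     # a random half assigned to treatment

print(age[treated].mean(), age[~treated].mean())  # close to each other, by design
```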

23 Finding controls: opportunities
- Budget constraints:
  - Eligible units who get the program = potential treatments
  - Eligible units who do not = potential controls
- Roll-out capacity:
  - Those who go first = potential treatments
  - Those who go later = potential controls

24 Finding controls: ethical considerations
- Do not delay benefits: base the rollout on budget/capacity constraints
- Equity: equally deserving populations deserve an equal chance of going first
- Use a transparent & accountable method:
  - Give everyone eligible an equal chance
  - If ranking is based on criteria, the criteria should be measurable and public

25 Thank you

