www.worldbank.org/hdchiefeconomist | The World Bank Human Development Network, Spanish Impact Evaluation Fund

Presentation transcript:

The World Bank Human Development Network Spanish Impact Evaluation Fund

IMPACT EVALUATION AND MEASURING RESULTS: From Promises into Evidence
This material constitutes supporting material for the "Impact Evaluation in Practice" book. This additional material is made freely available, but please acknowledge its use as follows: Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B. and Vermeersch, C. M. J., 2010, Impact Evaluation in Practice: Ancillary Material, The World Bank, Washington DC. The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.

How do we turn this teacher…

… into this teacher?

Answer these questions
1. Why is evaluation valuable?
2. How to implement an impact evaluation?
3. What makes a good impact evaluation?


Why Evaluate?
Need evidence on what works.
Information is key to sustainability.
Improve program/policy implementation.
Budgets are limited, and bad policies could hurt.
o Design (eligibility, benefits)
o Operations (efficiency & targeting)
o Budget negotiations
o Informing beliefs and the press
o Results agenda and aid effectiveness

Impact Evaluation Answers
What was the effect of the program on outcomes?
How much better off are the beneficiaries because of the program/policy?
How would outcomes change under an alternative program design?
Is the program cost-effective?
Traditional M&E cannot answer these questions.

Impact Evaluation Answers
What is the effect of scholarships on school attendance and performance (test scores)?
Does contracting out primary health care lead to an increase in access?
Does replacing dirt floors with cement reduce parasites and improve child health?
Do improved roads increase access to labor markets and raise incomes?

Answer these questions
1. Why is evaluation valuable?
2. How to implement an impact evaluation?
3. What makes a good impact evaluation?

How to assess impact
What is a beneficiary's test score with the program compared to without the program?
We would need to compare the same individual with and without the program at the same point in time.
Formally, program impact is: α = (Y | P=1) − (Y | P=0)
e.g. How much does an education program improve test scores (learning)?

Solving the evaluation problem
Counterfactual: what would have happened without the program.
Estimated impact is the difference between the treated observation and the counterfactual.
We never observe the same individual with and without the program at the same point in time, so we need to estimate the counterfactual.
The counterfactual is the key to impact evaluation.
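
The potential-outcomes idea on these slides can be sketched in a short simulation (all numbers hypothetical). Here each student has both potential scores, so the true impact α is computable; in real data only one of the two is ever observed, which is why the counterfactual must be estimated:

```python
import random

random.seed(0)

# Hypothetical simulation: each student has TWO potential test scores,
# one with the program (y1) and one without (y0). In real data we only
# ever observe one of the two; the other is the counterfactual.
n = 10_000
y0 = [random.gauss(50, 10) for _ in range(n)]  # score without the program
y1 = [s + 5 for s in y0]                       # program adds 5 points (assumed)

# True impact for the SAME individuals: alpha = (Y | P=1) - (Y | P=0)
alpha = sum(a - b for a, b in zip(y1, y0)) / n
print(round(alpha, 1))  # 5.0, by construction
```

Because both potential outcomes are simulated, α comes out exactly as built in; the evaluation problem is that no real data set contains both columns.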

Counterfactual Criteria
The treated and counterfactual groups:
(1) have identical characteristics,
(2) except for benefiting from the intervention.
There is no other reason for differences in outcomes: the only reason for the difference is the intervention itself.

2 Counterfeit Counterfactuals
1. Before and after: the same individual before the treatment.
2. Those not enrolled:
o those who chose not to enroll in the program
o those who were not offered the program
Problem: we cannot completely know why the treated are treated and the others are not.

1. Before and After: Examples
School scholarship program's effect on enrollment.
Agricultural assistance program:
o Financial assistance to purchase inputs.
o Compare rice yields before and after.
o Before is normal rainfall, but after is drought, and we find a fall in rice yields.
o Did the program fail? We cannot separate (identify) the effect of the financial assistance program from the effect of rainfall.

2. Those not enrolled: Example 1
A job training program is offered.
Compare employment and earnings of those who sign up to those who did not.
Who signs up? Those who are most likely to benefit, i.e. those with more ability, who would have higher earnings than non-participants even without job training.
This is a poor estimate of the counterfactual.

2. Those not enrolled: Example 2
Health insurance is offered.
Compare health care utilization of those who bought insurance to those who did not.
o Who buys insurance? Those who expect large medical expenditures.
o Who does not? Those who are healthy.
Even with no insurance, those who did not buy would have lower medical costs than those who did.
This is a poor estimate of the counterfactual.

Program placement: example
The government offers a family planning program to villages with high fertility.
Compare fertility in villages offered the program to fertility in other villages.
The program is targeted based on fertility, so (1) treatment villages have high fertility and (2) counterfactual villages have low fertility.
The estimated program impact is confounded with the targeting criteria.

What's wrong? Selection bias
People choose to participate for specific reasons:
o Job training: ability and earnings
o Health insurance: health status and medical expenditures
These reasons are often related to the outcome of interest.
We cannot separately identify the impact of the program from these other factors/reasons.
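
The job-training example can be made concrete with a small simulation (hypothetical data; the "ability" variable and all numbers are invented for illustration). The program's true effect is set to zero, yet the naive treated-vs-not-enrolled comparison shows a large positive difference, because ability drives both enrollment and earnings:

```python
import random

random.seed(1)

# Hypothetical job-training simulation: "ability" raises earnings AND the
# chance of signing up, but the program itself does nothing (true effect = 0).
n = 20_000
treated_earn, control_earn = [], []
for _ in range(n):
    ability = random.random()              # unobserved, uniform on [0, 1]
    earnings = 100 + 50 * ability          # ability alone drives earnings
    signs_up = random.random() < ability   # high-ability people enroll more
    (treated_earn if signs_up else control_earn).append(earnings)

# Naive comparison: enrollees vs. non-enrollees
naive = sum(treated_earn) / len(treated_earn) - sum(control_earn) / len(control_earn)
print(round(naive, 1))  # clearly positive despite a zero true effect: selection bias
```

The non-enrollees are a poor counterfactual here: the entire measured "impact" is the pre-existing ability gap between the two groups.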

Need to know…
All the reasons why someone gets the program and others do not.
All the reasons why individuals are in the treatment versus the control group.
If these reasons are correlated with the outcome, we cannot identify/separate the program impact from other explanations of the differences in outcomes.

Possible Solutions
We need to guarantee comparability of the treatment and control groups, so that the ONLY remaining difference is the intervention.
In this seminar we will consider:
o Experimental design/randomization
o Quasi-experiments (regression discontinuity, double differences)
o Instrumental variables

These solutions all involve…
Knowing how the data are generated.
Randomization:
o Gives everyone an equal chance of being in the control or treatment group.
o Guarantees that all factors/characteristics will, on average, be equal between the groups.
o The only difference is the intervention.
If randomization is not possible, we need transparent and observable criteria for who is offered the program.
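
Extending the earlier hypothetical simulation, randomization can be sketched as a coin flip that ignores ability entirely. With a true effect of 5 built in, the simple difference in means between the randomized groups recovers it, because ability is balanced across groups on average:

```python
import random

random.seed(2)

# Hypothetical simulation: ability affects the outcome, but a coin flip
# (not ability) decides who is treated. True program effect = 5.
n = 20_000
treated, control = [], []
for _ in range(n):
    ability = random.random()
    score = 50 + 20 * ability          # baseline outcome depends on ability
    if random.random() < 0.5:          # equal chance for everyone
        treated.append(score + 5)      # assumed true effect of 5 points
    else:
        control.append(score)

# Difference in means between randomized groups
diff = sum(treated) / len(treated) - sum(control) / len(control)
print(round(diff, 1))  # close to 5: randomization balances ability on average
```

The same difference-in-means that was badly biased under self-selection is unbiased here; the only thing that changed is how the data were generated.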

Road Map: The next 5 days
The Context (today):
o Why do results matter?
o Linking monitoring with evaluation.
o Importance of evidence for policy.
The Tools (today, Monday, Tuesday):
o Identification strategies.
o Sample size and power.
o Operational issues.
The Experience (Wednesday, Thursday, Friday):
o Group work on evaluation design and presentations.

Thank You

Q & A