Impact Evaluation Methods Regression Discontinuity Design and Difference in Differences Slides by Paul J. Gertler & Sebastian Martinez.

Presentation transcript:


2 Measuring Impact
Experimental design/randomization
Quasi-experiments
– Regression Discontinuity
– Double differences (diff in diff)
– Other options

3 Case 4: Regression Discontinuity
Assignment to treatment is based on a clearly defined index or parameter with a known cutoff for eligibility.
RD is possible when units can be ordered along a quantifiable dimension that is systematically related to the assignment of treatment.
The effect is measured at the discontinuity – the estimated impact around the cutoff may not generalize to the entire population.

4 Indexes are common in targeting of social programs
Anti-poverty programs → targeted to households below a given poverty index
Pension programs → targeted to population above a certain age
Scholarships → targeted to students with high scores on standardized tests
CDD programs → awarded to NGOs that achieve the highest scores

5 Example: Effect of Cash Transfer on Consumption
Target the transfer to the poorest households:
– Construct a poverty index from 1 to 100 using pre-intervention characteristics
– Households with a score <= 50 are poor
– Households with a score > 50 are non-poor
Cash transfer given to poor households
Measure outcomes (e.g., consumption) before and after the transfer
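The cash-transfer example above can be simulated to show how the RD estimate is actually computed. This is a minimal sketch with assumed, illustrative numbers (effect size, slope, noise, bandwidth are not from the deck): simulate a poverty index, apply the cutoff rule at 50, then run a local linear regression with separate slopes on each side of the cutoff, so the coefficient on the treatment dummy is the jump at the discontinuity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulation of the slide's example: poverty index 1-100,
# cutoff at 50, and a cash transfer that raises poor households' consumption.
n = 5000
index = rng.uniform(1, 100, n)
treated = (index <= 50).astype(float)            # poor households get the transfer
true_effect = 10.0                               # assumed effect on consumption
consumption = 100 + 0.5 * index + true_effect * treated + rng.normal(0, 5, n)

# RD estimate: local linear regression within a bandwidth around the cutoff,
# allowing different slopes on each side; the coefficient on `treated`
# measures the jump in consumption at the discontinuity.
bandwidth = 10.0
m = np.abs(index - 50) <= bandwidth
x = index[m] - 50                                # score centered at the cutoff
X = np.column_stack([np.ones(x.size), treated[m], x, treated[m] * x])
beta, *_ = np.linalg.lstsq(X, consumption[m], rcond=None)
rd_estimate = beta[1]                            # estimated effect at the cutoff
```

Note that only observations near the cutoff enter the regression, which is exactly why RD recovers a local effect rather than a population-wide one.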

6–9 [Figure slides: consumption plotted against the poverty index – the baseline with the Non-Poor/Poor split at the cutoff, outcomes after the transfer, and the Treatment Effect shown as the jump at the discontinuity]

10 Case 4: Regression Discontinuity
Oportunidades assigned benefits based on a poverty index, where:
Treatment = 1 if score <= 750
Treatment = 0 if score > 750
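The Oportunidades assignment rule on this slide is fully deterministic, which is what makes RD applicable. As a sketch, it is just an indicator function of the score (the function name below is mine, not from the program):

```python
def oportunidades_treatment(score: float, cutoff: float = 750) -> int:
    """Assignment rule from the slide: a household is treated
    if and only if its poverty-index score is at or below the cutoff."""
    return 1 if score <= cutoff else 0
```

Because eligibility flips sharply at 750, households just above and just below the cutoff are plausibly comparable, and the outcome difference between them identifies the program's effect.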

11 Case 4: Regression Discontinuity
[Figure: baseline – no treatment]

12 Case 4: Regression Discontinuity
[Figure: treatment period]

13 Potential Disadvantages of RD
Local average treatment effects – not always generalizable
Power: the effect is estimated at the discontinuity, so we generally have fewer observations than in a randomized experiment with the same sample size
Specification can be sensitive to functional form: make sure the relationship between the assignment variable and the outcome variable is correctly modeled, including:
– Nonlinear relationships
– Interactions
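The functional-form warning above can be made concrete with a simulation. In this sketch (all numbers are assumptions for illustration), the true outcome is a smooth cubic function of the score with no treatment effect at all; a misspecified linear RD regression reports a sizeable spurious jump at the cutoff, while a specification flexible enough to capture the curvature finds essentially none.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: the TRUE outcome is a smooth cubic in the score,
# with NO treatment effect at the cutoff (the cutoff is 0 here).
n = 4000
x = rng.uniform(-50, 50, n)          # assignment score, centered at the cutoff
d = (x <= 0).astype(float)           # eligibility rule: treated iff score <= cutoff
y = 1e-4 * x**3 + rng.normal(0, 1, n)

def jump_estimate(design):
    """OLS coefficient on the treatment dummy: the estimated jump at the cutoff."""
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta[1]

# Misspecified linear model: attributes the curvature to a discontinuity.
linear = jump_estimate(np.column_stack([np.ones(n), d, x, d * x]))

# Cubic specification (with full interactions): correctly finds ~no jump.
cubic = jump_estimate(np.column_stack(
    [np.ones(n), d, x, d * x, x**2, d * x**2, x**3, d * x**3]))
```

In practice this is why RD analyses report estimates across several polynomial orders and bandwidths rather than trusting a single specification.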

14 Advantages of RD for Evaluation
RD yields an unbiased estimate of the treatment effect at the discontinuity
Can often take advantage of a known rule for assigning the benefit – such rules are common in the design of social policy
– No need to “exclude” a group of eligible households/individuals from treatment

15 Measuring Impact
Experimental design/randomization
Quasi-experiments
– Regression Discontinuity
– Double differences (diff in diff)
– Other options

16 Case 5: Diff in Diff
Compare the change in outcomes between the treatment and non-treatment groups
– Impact is the difference in the change in outcomes
Impact = (Y_t1 − Y_t0) − (Y_c1 − Y_c0)
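The double difference on this slide is simple arithmetic on four group means. A minimal sketch with hypothetical, illustrative numbers (not from the deck):

```python
# Hypothetical before/after mean outcomes for each group.
y_t0, y_t1 = 60.0, 74.0   # treatment group: before, after
y_c0, y_c1 = 58.0, 66.0   # control group:   before, after

# Impact = (Y_t1 - Y_t0) - (Y_c1 - Y_c0): the treatment group's change,
# net of the change the control group experienced anyway.
impact = (y_t1 - y_t0) - (y_c1 - y_c0)
print(impact)   # (74-60) - (66-58) = 14 - 8 = 6.0
```

Subtracting the control group's change removes any common time trend, which is precisely what the simple before/after difference for the treated group fails to do.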

17 [Figure: outcome over time for the Treatment Group and Control Group, with the treatment point marked and the Average Treatment Effect shown]

18 [Figure: outcome over time for the Treatment Group and Control Group, showing both the Estimated Average Treatment Effect and the Average Treatment Effect]

19 Diff in Diff
Fundamental assumption: trends (slopes) are the same in the treatment and control groups
Need a minimum of three points in time to verify this and estimate the treatment effect (two pre-intervention)
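The three-period logic above can be sketched directly: with two pre-intervention periods you can compare pre-trends before computing the double difference. All numbers below are hypothetical, chosen so that the pre-trends match.

```python
# Hypothetical group means at three points in time:
# two pre-intervention periods (t0, t1) and one post period (t2).
treat = {"t0": 50.0, "t1": 54.0, "t2": 65.0}
control = {"t0": 48.0, "t1": 52.0, "t2": 56.0}

# Parallel-trends check: pre-intervention slopes should match
# before the diff-in-diff estimate can be trusted.
assert (treat["t1"] - treat["t0"]) == (control["t1"] - control["t0"])

# Diff-in-diff using the last pre-period and the post period.
impact = (treat["t2"] - treat["t1"]) - (control["t2"] - control["t1"])
print(impact)   # (65-54) - (56-52) = 11 - 4 = 7.0
```

If the pre-trends had differed, the double difference would mix the treatment effect with the pre-existing divergence, which is exactly the failure mode the slide warns about.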

20 Case 5: Diff in Diff

21 Impact Evaluation Example – Summary of Results