
1 Econometric Approaches to Causal Inference: Difference-in-Differences and Instrumental Variables Graduate Methods Master Class Department of Government, Harvard University February 25, 2005

2 Overview: diff-in-diffs and IV
Difference-in-differences — Data: randomized experiment or natural experiment. Problem: we cannot observe the counterfactual (what if the treatment group had not received treatment?). Method: difference-in-differences.
Instrumental variables — Data: observational data or natural experiment. Problem: omitted variable bias (OVB), selection bias, simultaneous causality. Method: instrumental variables.

3 Diff-in-diffs: basic idea
Suppose we randomly assign treatment to some units (or nature assigns treatment "as if" by random assignment).
To estimate the treatment effect, we could just compare the treated units before and after treatment.
However, we might pick up the effects of other factors that changed around the time of treatment.
Therefore, we use a control group to "difference out" these confounding factors and isolate the treatment effect.

4 Diff-in-diffs: without regression
One approach is simply to take the mean value of each group's outcome before and after treatment:

         Treatment group   Control group
Before   T_B               C_B
After    T_A               C_A

and then calculate the "difference-in-differences" of the means:
Treatment effect = (T_A - T_B) - (C_A - C_B)
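The calculation from the means can be sketched in a few lines of Python. The cell means here are made-up numbers, purely for illustration:

```python
# Hypothetical group means, before and after treatment (illustrative only)
T_B, T_A = 20.0, 26.0   # treatment group: before, after
C_B, C_A = 18.0, 21.0   # control group: before, after

# Difference-in-differences of the means
treatment_effect = (T_A - T_B) - (C_A - C_B)
print(treatment_effect)  # 3.0
```

The treated group rose by 6 and the control group by 3, so the estimated treatment effect is the difference of those differences, 3.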

5 Diff-in-diffs: with regression
We can get the same result in a regression framework (which allows us to add regression controls, if needed):
y_i = β0 + β1*treat_i + β2*after_i + β3*treat_i*after_i + e_i
where treat = 1 if in treatment group, 0 if in control group; after = 1 if after treatment, 0 if before treatment.
The coefficient on the interaction term (β3) gives us the difference-in-differences estimate of the treatment effect.

6 Diff-in-diffs: with regression
To see this, plug zeros and ones into the regression equation:
y_i = β0 + β1*treat_i + β2*after_i + β3*treat_i*after_i + e_i

             Treatment group        Control group   Difference
Before       β0 + β1                β0              β1
After        β0 + β1 + β2 + β3      β0 + β2         β1 + β3
Difference   β2 + β3                β2              β3
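Because this two-period, two-group model is saturated, the regression coefficients can be read straight off the cell means. A minimal Python sketch, using hypothetical cell means, makes the algebra concrete:

```python
# Hypothetical cell means for a two-period, two-group design (illustrative)
T_B, T_A = 20.0, 26.0   # treatment group: before, after
C_B, C_A = 18.0, 21.0   # control group: before, after

# Solving the table of means for the regression coefficients:
b0 = C_B                          # control group, before
b1 = T_B - C_B                    # treatment/control gap before treatment
b2 = C_A - C_B                    # time trend in the control group
b3 = (T_A - T_B) - (C_A - C_B)    # difference-in-differences

# Sanity check against the "After, treatment group" cell: b0+b1+b2+b3 = T_A
assert b0 + b1 + b2 + b3 == T_A
print(b3)  # 3.0
```

The interaction coefficient b3 reproduces exactly the difference-in-differences of the means from the previous slide.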

7 Diff-in-diffs: example
Card and Krueger (1994): what is the effect of increasing the minimum wage on employment at fast food restaurants?
Confounding factor: national recession
Treatment group = NJ; Control group = PA
Before = Feb 92; After = Nov 92
FTE_i = β0 + β1*NJ_i + β2*Nov92_i + β3*NJ_i*Nov92_i + e_i

8 Diff-in-diffs: example
FTE_i = β0 + β1*NJ_i + β2*Nov92_i + β3*NJ_i*Nov92_i + e_i
[Figure: FTE over time for the control group (PA) and the treatment group (NJ); the gap between the NJ line and the PA trend after the minimum wage increase is the treatment effect.]

9 Diff-in-diff-in-diffs
A difference-in-difference-in-differences (DDD) model allows us to study the effect of treatment on different groups.
If we are concerned that our estimated treatment effect might be spurious, a common robustness test is to introduce a comparison group that should not be affected by the treatment.
For example, if we want to know how welfare reform has affected labor force participation, we can use a DD model that takes advantage of policy variation across states, and then use a DDD model to study how the policy has affected single versus married women.

10 Diff-in-diffs: drawbacks
Diff-in-diff estimation is only appropriate if treatment is random; however, in the social sciences this method is usually applied to data from natural experiments, raising questions about whether treatment is truly random.
Also, diff-in-diffs typically use several years of serially correlated data but ignore the resulting inconsistency of standard errors (see Bertrand, Duflo, and Mullainathan 2004).

11 IV: basic idea
Suppose we want to estimate a treatment effect using observational data.
The OLS estimator is biased and inconsistent (due to correlation between regressor and error term) if there is
- omitted variable bias
- selection bias
- simultaneous causality
If a direct solution (e.g. including the omitted variable) is not available, instrumental variables regression offers an alternative way to obtain a consistent estimator.

12 IV: basic idea
Consider the following regression model:
y_i = β0 + β1*X_i + e_i
Variation in the endogenous regressor X_i has two parts:
- the part that is uncorrelated with the error ("good" variation)
- the part that is correlated with the error ("bad" variation)
The basic idea behind instrumental variables regression is to isolate the "good" variation and disregard the "bad" variation.

13 IV: conditions for a valid instrument
The first step is to identify a valid instrument.
A variable Z_i is a valid instrument for the endogenous regressor X_i if it satisfies two conditions:
1. Relevance: corr(Z_i, X_i) ≠ 0
2. Exogeneity: corr(Z_i, e_i) = 0

14 IV: two-stage least squares
The most common IV method is two-stage least squares (2SLS).
Stage 1: Decompose X_i into the component that can be predicted by Z_i and the problematic component:
X_i = π0 + π1*Z_i + v_i
Stage 2: Use the predicted value of X_i from the first-stage regression to estimate its effect on y_i:
y_i = β0 + β1*X-hat_i + u_i
Note: software packages like Stata perform the two stages in a single regression, producing the correct standard errors.
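The two stages can be carried out by hand with nothing more than one-regressor OLS. A self-contained sketch on a tiny made-up dataset, constructed so that the instrument is exactly uncorrelated with the error in-sample and the true effect of X on y is 1.5:

```python
# Toy data (illustrative): Z is the instrument, X the endogenous regressor.
# Data-generating process: X = 1 + 2*Z + u, y = 3 + 1.5*X + e, with e = u,
# so X is correlated with the error e, but Z is not.
Z = [0, 0, 1, 1]
X = [0.0, 2.0, 2.0, 4.0]
y = [2.0, 7.0, 5.0, 10.0]

def ols(x, w):
    """Slope and intercept of an OLS regression of w on x."""
    n = len(x)
    mx, mw = sum(x) / n, sum(w) / n
    slope = (sum((a - mx) * (b - mw) for a, b in zip(x, w))
             / sum((a - mx) ** 2 for a in x))
    return slope, mw - slope * mx

# Naive OLS of y on X is biased by the endogeneity: slope = 2.0, not 1.5
ols_slope, _ = ols(X, y)

# Stage 1: regress X on Z and form the fitted values X-hat
pi1, pi0 = ols(Z, X)
X_hat = [pi0 + pi1 * z for z in Z]

# Stage 2: regress y on X-hat; the slope is the 2SLS estimate
beta1, beta0 = ols(X_hat, y)
print(ols_slope, beta1)  # 2.0 1.5
```

Note that the slide's caveat still applies: running the two stages manually gives the right point estimate but the wrong standard errors, because the second stage ignores that X-hat was itself estimated; dedicated routines (e.g. ivregress in Stata) correct for this.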

15 IV: example
Levitt (1997): what is the effect of increasing the police force on the crime rate?
This is a classic case of simultaneous causality (high crime areas tend to need large police forces), resulting in an incorrectly signed (positive) coefficient.
To address this problem, Levitt uses the timing of mayoral and gubernatorial elections as an instrumental variable.
Is this instrument valid?
Relevance: police force increases in election years.
Exogeneity: election cycles are pre-determined.

16 IV: example
Two-stage least squares:
Stage 1: Decompose police hires into the component that can be predicted by the electoral cycle and the problematic component:
police_i = π0 + π1*election_i + v_i
Stage 2: Use the predicted value of police_i from the first-stage regression to estimate its effect on crime_i:
crime_i = β0 + β1*police-hat_i + u_i
Finding: an increased police force reduces violent crime (but has little effect on property crime).

17 IV: number of instruments
There must be at least as many instruments as endogenous regressors.
Let k = number of endogenous regressors, m = number of instruments.
The regression coefficients are:
exactly identified if m = k (OK)
overidentified if m > k (OK)
underidentified if m < k (not OK: the coefficients cannot be estimated)

18 IV: testing instrument relevance
How do we know if our instruments are valid?
Recall our first condition for a valid instrument:
1. Relevance: corr(Z_i, X_i) ≠ 0
Stock and Watson's rule of thumb: the first-stage F-statistic testing the hypothesis that the coefficients on the instruments are jointly zero should be at least 10 (for a single endogenous regressor).
A small F-statistic means the instruments are "weak" (they explain little of the variation in X) and the estimator is biased.
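With a single instrument, the first-stage F-statistic is just the squared t-statistic on Z. A rough sketch of the rule-of-thumb check in plain Python, using made-up data and the homoskedastic formula:

```python
# Made-up first-stage data (illustrative): instrument Z, regressor X
Z = [0, 0, 1, 1, 0, 1]
X = [1.0, 2.1, 3.2, 4.0, 1.5, 3.8]
n = len(Z)

# R-squared of the first-stage regression of X on Z (single instrument)
mz, mx = sum(Z) / n, sum(X) / n
szx = sum((z - mz) * (x - mx) for z, x in zip(Z, X))
szz = sum((z - mz) ** 2 for z in Z)
sxx = sum((x - mx) ** 2 for x in X)
r2 = szx ** 2 / (szz * sxx)

# With one instrument, F = t^2 = (n - 2) * R^2 / (1 - R^2)
F = (n - 2) * r2 / (1 - r2)
print(F > 10)  # instrument clears the Stock-Watson rule of thumb
```

In this toy sample F is roughly 28, comfortably above the threshold of 10; a weak instrument would leave F in the single digits.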

19 IV: testing instrument exogeneity
Recall our second condition for a valid instrument:
2. Exogeneity: corr(Z_i, e_i) = 0
If you have the same number of instruments and endogenous regressors, it is impossible to test for instrument exogeneity.
But if you have more instruments than regressors:
Overidentifying restrictions test – regress the residuals from the 2SLS regression on the instruments (and any exogenous control variables) and test whether the coefficients on the instruments are all zero.

20 IV: drawbacks
It can be difficult to find an instrument that is both relevant (not weak) and exogenous.
Assessment of instrument exogeneity can be highly subjective when the coefficients are exactly identified.
IV can be difficult to explain to those who are unfamiliar with it.

21 Sources
Stock and Watson, Introduction to Econometrics.
Bertrand, Duflo, and Mullainathan, "How Much Should We Trust Differences-in-Differences Estimates?" Quarterly Journal of Economics, February 2004.
Card and Krueger, "Minimum Wages and Employment: A Case Study of the Fast Food Industry in New Jersey and Pennsylvania," American Economic Review, September 1994.
Angrist and Krueger, "Instrumental Variables and the Search for Identification: From Supply and Demand to Natural Experiments," Journal of Economic Perspectives, Fall 2001.
Levitt, "Using Electoral Cycles in Police Hiring to Estimate the Effect of Police on Crime," American Economic Review, June 1997.

