Christopher Dougherty, EC220 Introduction to Econometrics (Chapter 3). Slideshow: Multicollinearity. Original citation: Dougherty, C. (2012) EC220 Introduction to econometrics (chapter 3). [Teaching Resource] © 2012 The Author. This version available in LSE Learning Resources Online: May 2012. This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 License. This license allows the user to remix, tweak, and build upon the work even for commercial purposes, as long as the user credits the author and licenses their new creations under identical terms.

MULTICOLLINEARITY 1
[Table of the six observations on X2, X3, and Y; the numerical values are not reproduced in this transcript.]
Suppose that Y = 2 + 3X2 + X3 and that X3 = 2X2 - 1. There is no disturbance term in the equation for Y, but that is not important. Suppose that we have the six observations shown.
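The setup can be reproduced numerically. A minimal sketch, assuming for illustration that X2 runs from 1 to 6 (the slide's actual observation values are not reproduced in this transcript):

```python
import numpy as np

# Illustrative data satisfying the slide's two equations
# (X2 = 1..6 is an assumption; the real table is not reproduced here).
X2 = np.arange(1, 7)
X3 = 2 * X2 - 1            # exact linear relationship X3 = 2*X2 - 1
Y = 2 + 3 * X2 + X3        # true relationship, no disturbance term

# Because X3 = 2*X2 - 1, the very same Y is generated by simpler equations:
print(np.allclose(Y, 1 + 5 * X2))        # True
print(np.allclose(Y, 3.5 + 2.5 * X3))    # True
```

Any data set built this way is observationally identical under all three equations, which is the point the next few slides make graphically.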

MULTICOLLINEARITY 2
[Line graphs of Y, X2, and X3.]
The three variables are plotted as line graphs above. Looking at the data, it is impossible to tell whether the changes in Y are caused by changes in X2, by changes in X3, or jointly by changes in both X2 and X3.

MULTICOLLINEARITY 3
[Table showing the change from the previous observation in X2, X3, and Y.]
Numerically, Y increases by 5 in each observation, while X2 increases by 1.

MULTICOLLINEARITY 4
Hence the true relationship could have been Y = 1 + 5X2.

MULTICOLLINEARITY 5
However, it can also be seen that X3 increases by 2 in each observation.

MULTICOLLINEARITY 6
Hence the true relationship could equally have been Y = 3.5 + 2.5X3.

MULTICOLLINEARITY 7
These two possibilities are special cases of Y = 3.5 - 2.5p + 5pX2 + 2.5(1 - p)X3, which would fit the relationship for any value of p (p = 1 gives Y = 1 + 5X2; p = 0 gives Y = 3.5 + 2.5X3).

MULTICOLLINEARITY 8
There is no way that regression analysis, or any other technique, could determine the true relationship from this infinite set of possibilities, given the sample data.
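The family of observationally equivalent equations can be derived directly, as a weighted combination of the two special cases above (a reconstruction consistent with the garbled formula on the original slide):

```latex
\begin{align*}
Y &= p\,(1 + 5X_2) + (1-p)\,(3.5 + 2.5X_3) \\
  &= 3.5 - 2.5p + 5pX_2 + 2.5(1-p)X_3 ,
\end{align*}
```

with p = 1 and p = 0 recovering the two special cases, and every other p fitting the data equally well.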

MULTICOLLINEARITY 9
What would happen if you tried to run a regression when there is an exact linear relationship among the explanatory variables?

MULTICOLLINEARITY 10
We will investigate, using the model with two explanatory variables shown above. [Note: A disturbance term has now been included in the true model, but it makes no difference to the analysis.]

MULTICOLLINEARITY 11
The expression for the multiple regression coefficient b2 is shown above. We will substitute for X3 using its relationship with X2.
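The slide image containing the expression is not reproduced in this transcript. In the standard deviation-from-means notation, the OLS coefficient referred to is

```latex
b_2 \;=\; \frac{\left(\sum x_2 y\right)\left(\sum x_3^2\right) \;-\; \left(\sum x_3 y\right)\left(\sum x_2 x_3\right)}
               {\left(\sum x_2^2\right)\left(\sum x_3^2\right) \;-\; \left(\sum x_2 x_3\right)^2},
```

where lowercase letters denote deviations from the sample means.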

MULTICOLLINEARITY 12
First, we will replace the terms highlighted on the slide.

MULTICOLLINEARITY 13
We have made the replacement.

MULTICOLLINEARITY 14
Next, we replace the terms now highlighted.

MULTICOLLINEARITY 15
We have made the replacement.

MULTICOLLINEARITY 16
Finally, we replace this term.

MULTICOLLINEARITY 17
Again, we have made the replacement.
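The slides show the algebra only as images; the outcome of the three replacements can be summarized as follows. Since X3 = 2X2 - 1, the deviations from the means satisfy x3 = 2x2, so in the expression for b2

```latex
\begin{align*}
\text{numerator} &= \Bigl(\sum x_2 y\Bigr)\Bigl(\sum (2x_2)^2\Bigr) - \Bigl(\sum 2x_2\,y\Bigr)\Bigl(\sum x_2\cdot 2x_2\Bigr)
 = 4\Bigl(\sum x_2 y\Bigr)\Bigl(\sum x_2^2\Bigr) - 4\Bigl(\sum x_2 y\Bigr)\Bigl(\sum x_2^2\Bigr) = 0, \\[4pt]
\text{denominator} &= \Bigl(\sum x_2^2\Bigr)\Bigl(4\sum x_2^2\Bigr) - \Bigl(2\sum x_2^2\Bigr)^2 = 0 .
\end{align*}
```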

MULTICOLLINEARITY 18
It turns out that the numerator and the denominator are both equal to zero. The regression coefficient is not defined.
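The 0/0 outcome can be checked numerically with illustrative data (X2 = 1..6 is an assumption; any values satisfying X3 = 2X2 - 1 give the same result):

```python
import numpy as np

# Check that the numerator and denominator of b2 both vanish
# when X3 = 2*X2 - 1 holds exactly.
X2 = np.arange(1.0, 7.0)
X3 = 2 * X2 - 1
Y = 2 + 3 * X2 + X3

def s(a, b):
    """Sum of products of deviations from the sample means."""
    return np.sum((a - a.mean()) * (b - b.mean()))

num = s(X2, Y) * s(X3, X3) - s(X3, Y) * s(X2, X3)
den = s(X2, X2) * s(X3, X3) - s(X2, X3) ** 2
print(num, den)   # both zero (up to floating point): b2 = 0/0, undefined
```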

MULTICOLLINEARITY 19
It is unusual for there to be an exact relationship among the explanatory variables in a regression. When this occurs, it is typically because there is a logical error in the specification.

MULTICOLLINEARITY 20
[Stata output: . reg EARNINGS S EXP EXPSQ; the numerical results are not reproduced in this transcript.]
However, it often happens that there is an approximate relationship. For example, when relating earnings to schooling and work experience, it is often reasonable to suppose that the effect of work experience is subject to diminishing returns.

MULTICOLLINEARITY 21
[Stata output: . reg EARNINGS S EXP EXPSQ]
A standard way of allowing for this is to include EXPSQ, the square of EXP, in the specification. According to the hypothesis of diminishing returns, the coefficient of EXPSQ, b4, should be negative.

MULTICOLLINEARITY 22
[Stata output: . reg EARNINGS S EXP EXPSQ]
We fit this specification using Data Set 21. The schooling component of the regression results is not much affected by the inclusion of the EXPSQ term. The coefficient of S indicates that an extra year of schooling increases hourly earnings by $2.75.

MULTICOLLINEARITY 23
[Stata output: . reg EARNINGS S EXP]
In the specification without EXPSQ it was 2.68, not much different.

MULTICOLLINEARITY 24
[Stata output: . reg EARNINGS S EXP EXPSQ]
The standard error of the coefficient of S, which was 0.23 in the specification without EXPSQ, is also little changed, and the coefficient remains highly significant.

MULTICOLLINEARITY 25
[Stata output: . reg EARNINGS S EXP EXPSQ]
By contrast, the inclusion of the new term has had a dramatic effect on the coefficient of EXP. Now it is negative, which makes little sense, and insignificant.

MULTICOLLINEARITY 26
[Stata output: . reg EARNINGS S EXP]
Previously it had been positive and highly significant.

MULTICOLLINEARITY 27
[Stata output: . reg EARNINGS S EXP EXPSQ]
The coefficient of EXPSQ is also strange. It is positive, suggesting increasing returns to experience. However, it is not significant.

MULTICOLLINEARITY 28
[Stata output: . reg EARNINGS S EXP EXPSQ, followed by . cor EXP EXPSQ (obs=540); the correlation value is not reproduced.]
The reason for these problems is that EXPSQ is highly correlated with EXP. This makes it difficult to discriminate between the individual effects of EXP and EXPSQ, and the regression estimates tend to be erratic.
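The scraped output omits the correlation itself, but the effect is easy to illustrate. A sketch using uniform draws on 0 to 40 years of experience as a stand-in for the actual data set (an assumption, not the real sample):

```python
import numpy as np

# Illustrative: experience and its square are almost perfectly correlated.
# Uniform draws on 0-40 years stand in for the actual data (an assumption).
rng = np.random.default_rng(0)
EXP = rng.uniform(0, 40, size=540)
EXPSQ = EXP ** 2
r = np.corrcoef(EXP, EXPSQ)[0, 1]
print(round(r, 2))   # close to the theoretical value of about 0.97
```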

MULTICOLLINEARITY 29
[Stata output: coefficient tables from . reg EARNINGS S EXP EXPSQ and . reg EARNINGS S EXP]
The high correlation causes the standard error of EXP to be larger than it would have been if EXP and EXPSQ had been less highly correlated, warning us that the point estimate is unreliable.

MULTICOLLINEARITY 30
When high correlations among the explanatory variables lead to erratic point estimates of the coefficients, large standard errors, and unsatisfactorily low t statistics, the regression is said to be suffering from multicollinearity.
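The slideshow does not use the term, but the standard summary of this standard-error inflation is the variance inflation factor: with correlation r between two regressors, the variance of each slope estimate is multiplied by 1/(1 - r^2). A minimal sketch:

```python
import math

# Variance inflation factor 1/(1 - r^2) for two correlated regressors:
# the standard error of each slope is multiplied by sqrt(VIF).
for r in (0.0, 0.9, 0.97, 0.99):
    vif = 1 / (1 - r ** 2)
    print(f"r = {r:.2f}  VIF = {vif:7.2f}  SE multiplier = {math.sqrt(vif):.2f}")
```

At a correlation of 0.97, comparable to that between EXP and EXPSQ, the standard errors are roughly four times larger than they would be with uncorrelated regressors.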

MULTICOLLINEARITY 31
[Stata output: coefficient tables from . reg EARNINGS S EXP EXPSQ and . reg EARNINGS S EXP]
Note that the coefficients remain unbiased and the standard errors remain valid.

MULTICOLLINEARITY 32
Multicollinearity may also be caused by an approximate linear relationship among the explanatory variables. When there are only two, an approximate linear relationship means there will be a high correlation, but this is not always the case when there are more than two.
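The last point can be illustrated with simulated data (all values here are hypothetical, chosen only to make the point): three regressors can have only moderate pairwise correlations while an almost exact linear relationship holds among them.

```python
import numpy as np

# Three regressors with moderate pairwise correlations but a near-exact
# linear relationship x3 = x1 + x2 (illustrative data, not the data set).
rng = np.random.default_rng(1)
n = 540
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + x2 + rng.normal(scale=0.01, size=n)   # tiny noise

print(np.corrcoef([x1, x2, x3]).round(2))
# x3 correlates with x1 and x2 only at about 0.7 each, and x1 with x2 near 0,
# yet x3 - x1 - x2 is almost exactly zero, so a regression on all three
# would suffer from severe multicollinearity.
```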

Copyright © Christopher Dougherty. These slideshows may be downloaded by anyone, anywhere, for personal use. Subject to respect for copyright and, where appropriate, attribution, they may be used as a resource for teaching an econometrics course. There is no need to refer to the author. The content of this slideshow comes from Section 3.4 of C. Dougherty, Introduction to Econometrics, fourth edition, 2011, Oxford University Press. Additional (free) resources for both students and instructors may be downloaded from the OUP Online Resource Centre. Individuals studying econometrics on their own, who feel that they might benefit from participation in a formal course, should consider the London School of Economics summer school course EC212 Introduction to Econometrics or the University of London International Programmes distance learning course 20 Elements of Econometrics.
