Welcome to Econ 420 Applied Regression Analysis: Study Guide, Week Eleven

Perfect Multicollinearity (Chapter 6)
When two or more independent variables have a perfect (error-free) linear relationship with each other
–Which assumption does this violate? It violates Assumption 2
–Example: Equation 6-1, page 116
–Consequence: OLS cannot estimate the model
–Remedy: drop one of the variables
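Why OLS breaks down can be seen directly in the design matrix: an exact linear relationship makes the columns of X linearly dependent, so X'X is singular and the normal equations have no unique solution. A minimal numpy sketch with made-up data (the coefficients 3 and 2 are arbitrary, not from the textbook):

```python
import numpy as np

# Hypothetical data: x2 is an exact (error-free) linear function of x1.
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = 3.0 * x1 + 2.0

# Design matrix with a constant term: its columns are linearly dependent,
# so X'X is singular and OLS has no unique coefficient estimates.
X = np.column_stack([np.ones(50), x1, x2])
print(np.linalg.matrix_rank(X))  # prints 2, not 3: one column is redundant
```

Dropping either x1 or x2 restores full column rank, which is why the standard remedy is to drop one variable.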

Imperfect Multicollinearity
When two or more independent variables have an imperfect (strong but not exact) linear relationship with each other
–Example: Equation 6-3, page 117

Consequences of Imperfect Multicollinearity
–The B hats are still unbiased
–The B hats have higher-than-normal standard errors
–What does this imply for the t-test? We may conclude that the Bs are not significantly different from zero when in fact they are
–Adding or dropping variables and observations can change the B hats substantially
–The adjusted R squared remains largely unaffected
–The B hats of uncorrelated variables remain unaffected
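The inflated standard errors can be illustrated by simulation. This is a sketch with made-up data, not an example from the textbook: the same model is fit once with an uncorrelated second regressor and once with a highly correlated one, and only the standard error of B1 hat changes noticeably.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)

def se_of_b1(x2):
    """Fit y = B0 + B1*x1 + B2*x2 + e by OLS; return the standard error of B1 hat."""
    X = np.column_stack([np.ones(n), x1, x2])
    y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    s2 = resid @ resid / (n - X.shape[1])   # estimated error variance
    cov = s2 * np.linalg.inv(X.T @ X)       # estimated OLS covariance matrix
    return float(np.sqrt(cov[1, 1]))

se_uncorrelated = se_of_b1(rng.normal(size=n))                  # x2 unrelated to x1
se_correlated = se_of_b1(0.99 * x1 + 0.1 * rng.normal(size=n))  # x2 almost equal to x1
print(se_uncorrelated < se_correlated)  # the correlated case has the larger standard error
```

The larger standard error shrinks the t-score, which is exactly why a truly nonzero B can look insignificant.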

Detection of Multicollinearity
If you have a high adjusted R squared but low t-scores, suspect a multicollinearity problem

Test for Multicollinearity
1. Calculate the correlation coefficients between each pair of independent variables, and between each independent variable and the dependent variable
–EViews directions: Quick > Group Statistics > Correlation
–Two rules: 1) if |r(X1, X2)| > |r(X1, Y)|, there is a problem; 2) if |r(X1, X2)| > 0.8, there is a problem
–Limitation: this approach only detects multicollinearity between pairs of variables
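The two rules of thumb can be checked with np.corrcoef. A sketch with made-up data (the variable names and the 0.9/0.3 weights in the data-generating process are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.3 * rng.normal(size=n)   # x2 strongly correlated with x1
y = 1.0 + x1 + x2 + rng.normal(size=n)

r_x1_x2 = np.corrcoef(x1, x2)[0, 1]
r_x1_y = np.corrcoef(x1, y)[0, 1]

rule1 = abs(r_x1_x2) > abs(r_x1_y)   # rule 1: |r(X1, X2)| > |r(X1, Y)|
rule2 = abs(r_x1_x2) > 0.8           # rule 2: |r(X1, X2)| > 0.8
print(rule1, rule2)                  # either True flags a multicollinearity problem
```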

Test for Multicollinearity
2. Regress each independent variable on the other independent variable
–Do an F-test of significance at the 1% level
–If you reject the null hypothesis, multicollinearity is a problem
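The auxiliary regression and its F-test can be sketched as follows, with made-up data; the 1% critical value is taken from scipy.stats.f:

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(4)
n = 100
x2 = rng.normal(size=n)
x1 = 0.9 * x2 + 0.3 * rng.normal(size=n)   # x1 strongly related to x2

# Auxiliary regression: x1 on a constant and x2.
Z = np.column_stack([np.ones(n), x2])
b, *_ = np.linalg.lstsq(Z, x1, rcond=None)
resid = x1 - Z @ b
r2 = 1.0 - resid @ resid / ((x1 - x1.mean()) ** 2).sum()

k = 1                                       # regressors excluding the constant
F = (r2 / k) / ((1.0 - r2) / (n - k - 1))   # F statistic of the auxiliary regression
crit = f_dist.ppf(0.99, k, n - k - 1)       # 1% critical value
print(F > crit)  # True here: reject the null, so multicollinearity is a problem
```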

Sometimes three or more independent variables are correlated
–Example: Income = f(wage rate, tax rate, hours of work, ….)
–Wage rate, tax rate, and hours of work may all be highly correlated with each other
–A simple correlation coefficient may not capture this

Test of Multicollinearity among Three or More Independent Variables
–Regress each independent variable (say X1) on the other independent variables (X2, X3, X4)
–Then calculate the variance inflation factor: VIF = 1 / (1 - R^2)
–If VIF > 4, then X1 is highly correlated with the other independent variables
–Do the same for the other independent variables
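A numpy sketch of the VIF procedure, using hypothetical data in which x3 is close to x1 + x2 (the helper name vif and the data-generating weights are mine, not from the textbook):

```python
import numpy as np

def vif(X, j):
    """Regress column j of X on the remaining columns (plus a constant)
    and return the variance inflation factor 1 / (1 - R^2)."""
    y = X[:, j]
    Z = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ b
    r2 = 1.0 - resid @ resid / ((y - y.mean()) ** 2).sum()
    return 1.0 / (1.0 - r2)

# Hypothetical data: x3 is almost a linear combination of x1 and x2.
rng = np.random.default_rng(3)
x1, x2 = rng.normal(size=(2, 100))
x3 = x1 + x2 + 0.2 * rng.normal(size=100)
X = np.column_stack([x1, x2, x3])

print([round(vif(X, j), 1) for j in range(3)])  # each VIF is well above the cutoff of 4
```

Note that no single pairwise correlation need be extreme here: the problem only shows up when each variable is regressed on all of the others, which is exactly what VIF does.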

Remedies for Multicollinearity
1) If your main goal is to use the equation for forecasting and you do not need t-tests on the individual estimated coefficients, do nothing. Multicollinearity does not affect the predictive power of your equation.
2) If one of the variables seems redundant, drop it.
–Example: you don't need both real and nominal interest rates in your model

Remedies for Multicollinearity
3) If all variables need to stay in the equation, transform the multicollinear variables.
–Example: Number of domestic cars sold = B0 + B1 (average price of domestic cars) + B2 (average price of foreign cars) + ….. + e
–Problem: the prices of domestic and foreign cars are highly correlated
–Solution: Number of domestic cars sold = B0 + B1 (ratio of the average price of domestic cars to the average price of foreign cars) + ….. + e
4) Increase the sample size or choose a different random sample.

Assignment 9 (30 points)
Due: before 10:00 PM, Friday, November , Page , page 132
