
1
**Welcome to Econ 420 Applied Regression Analysis**

Study Guide Week Eleven

2
**Perfect Multicollinearity (Chapter 6)**

When two or more independent variables have a perfect (error-free) linear relationship with each other. Which assumption does this violate? It violates Assumption 2. Example: Equation 6-1, page 116. Consequence: OLS is unable to estimate the model. Remedy: drop one of the collinear variables.
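The breakdown of OLS under perfect multicollinearity can be seen directly: when one regressor is an exact linear function of another, the regressor matrix loses a rank, so (X'X) is singular and cannot be inverted. A minimal numpy sketch (the data here are made up purely for illustration):

```python
import numpy as np

# Illustrative (made-up) data: x2 is an exact, error-free linear
# function of x1 -- perfect multicollinearity.
rng = np.random.default_rng(0)
x1 = rng.normal(size=20)
x2 = 3.0 * x1 + 5.0                      # perfect linear relationship
X = np.column_stack([np.ones(20), x1, x2])

# OLS needs (X'X)^(-1), but X has rank 2, not 3: one column is a
# linear combination of the others, so X'X is singular and the OLS
# estimates are not uniquely defined.
print(np.linalg.matrix_rank(X))          # 2
```

Dropping either x1 or x2 restores full rank, which is exactly the "drop one variable" remedy above.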

3
**Imperfect Multicollinearity**

When two or more independent variables have an imperfect (strong but not exact) linear relationship with each other. Example: Equation 6-3, page 117.

4
**Consequences of Imperfect Multicollinearity**

The B-hats remain unbiased, but they have higher-than-normal standard errors. What does this imply for the t-test? We may conclude that coefficients are not significantly different from zero when in fact they are. Adding or dropping variables or observations can change the B-hats significantly. The adjusted R-squared remains largely unaffected, and the B-hats of uncorrelated variables remain unaffected.
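The inflated standard errors follow from the OLS variance formula Var(b) = σ²(X'X)⁻¹. A small simulation sketch (hypothetical data with σ² = 1, not from the textbook) comparing the standard error of the B1-hat when X1 and X2 are nearly uncorrelated versus highly correlated:

```python
import numpy as np

# Simulation sketch (hypothetical data, sigma^2 = 1): the covariance
# matrix of the OLS estimator is Var(b) = sigma^2 * (X'X)^(-1).
rng = np.random.default_rng(1)
n, sigma2 = 100, 1.0

def se_b1(corr):
    """Standard error of b1 when corr(X1, X2) is roughly `corr`."""
    x1 = rng.normal(size=n)
    x2 = corr * x1 + np.sqrt(1.0 - corr**2) * rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return np.sqrt(cov[1, 1])            # SE of the coefficient on x1

# The second standard error comes out several times larger: the t-score
# on B1 shrinks even though the estimate of B1 is still unbiased.
print(se_b1(0.0), se_b1(0.95))
```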

5
**Detection of Multicollinearity**

If you have a high adjusted R-squared but low t-scores, suspect a multicollinearity problem.

6
**Test for Multicollinearity**

Calculate the correlation coefficients between each pair of independent variables, and between each independent variable and the dependent variable. EViews directions: Quick → Group Statistics → Correlations. Two rules: 1) if |rX1,X2| > |rX1,Y|, there is a problem; or 2) if |rX1,X2| > 0.8, there is a problem. Limitation: this approach only detects multicollinearity between pairs of variables.
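Outside EViews, the same two rules can be checked with ordinary correlation coefficients. A minimal Python sketch (the function name and any data passed to it are hypothetical):

```python
import numpy as np

# Sketch of the two pairwise rules from the slide above.
def suspect_pairwise(x1, x2, y):
    """True if either rule flags a multicollinearity problem for X1, X2."""
    r_x1x2 = np.corrcoef(x1, x2)[0, 1]   # r between the two regressors
    r_x1y = np.corrcoef(x1, y)[0, 1]     # r between X1 and the dependent var
    rule1 = abs(r_x1x2) > abs(r_x1y)     # Rule 1: |r(X1,X2)| > |r(X1,Y)|
    rule2 = abs(r_x1x2) > 0.8            # Rule 2: |r(X1,X2)| > 0.8
    return rule1 or rule2
```

As the slide notes, this only catches collinearity between one pair of variables at a time; the VIF approach below handles three or more.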

7
**Test for Multicollinearity**

Regress each independent variable on the other independent variable(s). Do an F-test of significance at the 1% level. If you reject the null hypothesis, multicollinearity is a problem.
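This auxiliary regression can be sketched in plain numpy (function name and data hypothetical). It returns the F statistic for regressing one independent variable on the others; the reader then compares it with the 1% critical value of F(q, n − q − 1) from a table:

```python
import numpy as np

# Hedged sketch of the auxiliary-regression F-test.
def aux_regression_F(X, j):
    """F statistic for regressing column j of X on the remaining columns."""
    n, k = X.shape
    y = X[:, j]
    A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    q = k - 1                            # regressors in the auxiliary model
    return (r2 / q) / ((1 - r2) / (n - q - 1))
```

A large F (rejecting the null of no relationship at the 1% level) signals that the variable is well explained by the others, i.e. multicollinearity is a problem.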

8
**Sometimes 3 or more independent variables are correlated**

Example: Income = f(wage rate, tax rate, hours of work, …). Wage rate, tax rate, and hours of work may all be highly correlated with one another. A simple correlation coefficient may not capture this.

9
**Test of Multicollinearity among 3 or more independent variables**

Regress each independent variable (say, X1) on the other independent variables (X2, X3, X4). Then calculate the variance inflation factor: VIF = 1 / (1 − R²), where R² comes from that auxiliary regression. If VIF > 4, then X1 is highly correlated with the other independent variables. Do the same for each of the other independent variables.
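A minimal VIF sketch in numpy (function name hypothetical; X holds the regressors as columns, without a constant column):

```python
import numpy as np

# VIF sketch: regress column j of X on the other columns (with an
# intercept), take the auxiliary R^2, and compute VIF = 1 / (1 - R^2).
def vif(X, j):
    """Variance inflation factor for column j of regressor matrix X."""
    n = X.shape[0]
    y = X[:, j]
    A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 / (1.0 - r2)
```

By the slide's rule of thumb, VIF > 4 (auxiliary R² > 0.75) flags column j as highly correlated with the other regressors.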

10
**Remedies for Multicollinearity**

1) If your main goal is to use the equation for forecasting, and you don't want to do specific t-tests on each estimated coefficient, then do nothing: multicollinearity does not affect the predictive power of your equation. 2) If it seems that you have a redundant variable, drop it. Example: you don't need both real and nominal interest rates in your model.

11
**Remedies for Multicollinearity**

3) If all variables need to stay in the equation, transform the multicollinear variables. Example: number of domestic cars sold = B0 + B1 (average price of domestic cars) + B2 (average price of foreign cars) + … + e. Problem: the prices of domestic and foreign cars are highly correlated. Solution: number of domestic cars sold = B0 + B1 (ratio of the average price of domestic cars to the average price of foreign cars) + … + e. 4) Increase the sample size or choose a different random sample.

12
**Assignment 9 (30 points) Due: Before 10:00 PM, Friday, November 9**

Problems 11 (page 131) and 13 (page 132).
