1 MF-852 Financial Econometrics, Lecture 8: Introduction to Multiple Regression. Roy J. Epstein, Fall 2003.

2 Topics: Formulation and Estimation of a Multiple Regression; Interpretation of the Regression Coefficients; Omitted Variables; Collinearity; Advanced Hypothesis Testing.

3 Multiple Regression. Used when 2 or more independent variables explain the dependent variable: Yi = β0 + β1X1i + β2X2i + … + βkXki + ei, or in matrix notation, Yi = Xiβ + ei.
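A minimal sketch of how such a model can be estimated by ordinary least squares. The simulated data and coefficient values (b0 = 1, b1 = 2, b2 = -0.5) are illustrative assumptions, not from the lecture:

```python
# Sketch: OLS estimation of Y = b0 + b1*X1 + b2*X2 + e on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + rng.normal(size=n)   # illustrative true coefficients

X = np.column_stack([np.ones(n), X1, X2])             # design matrix with intercept column
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)                                        # estimates of b0, b1, b2, close to (1, 2, -0.5)
```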

4 The Error Term. Same assumptions as before: E(ei) = 0, var(ei) = σ², cov(X, e) = 0, and cov(ei, ej) = 0 for i ≠ j.

6 The Estimated Coefficients. Each coefficient measures the marginal effect of its independent variable, controlling for the other effects, i.e., the effect of Xi "all else equal." Estimates can be sensitive to which other variables are included in the regression.

7 Omitted Variables. Suppose the true model is Yi = β0 + β1X1i + β2X2i + ei, but you leave out X2 (through ignorance or lack of data). Does it matter?

8 Analysis of Omitted Variables. The error term now includes both e and X2: Yi = β0 + β1X1i + ui = β0 + β1X1i + [β2X2i + ei]. Two cases: (A) X2 correlated with X1: the estimate of β1 is biased, because it picks up the effect of X2 and attributes it to X1. (B) X2 uncorrelated with X1: no bias.
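A small simulation can illustrate both cases. The data-generating values (b1 = 2, b2 = 3) and the correlation structure are assumptions chosen for illustration:

```python
# Sketch: omitted-variable bias when X2 is left out of the regression.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
b0, b1, b2 = 1.0, 2.0, 3.0

def slope_without_x2(X1, X2):
    """Regress Y on a constant and X1 only (X2 omitted); return the estimated slope on X1."""
    Y = b0 + b1 * X1 + b2 * X2 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), X1])
    return np.linalg.lstsq(X, Y, rcond=None)[0][1]

X1 = rng.normal(size=n)

# Case A: X2 correlated with X1 -> the slope on X1 also picks up part of X2's effect (bias).
X2_corr = 0.8 * X1 + rng.normal(scale=0.5, size=n)
print("correlated X2:  ", slope_without_x2(X1, X2_corr))    # well above the true b1 = 2

# Case B: X2 uncorrelated with X1 -> no bias; the slope is close to 2.
X2_uncorr = rng.normal(size=n)
print("uncorrelated X2:", slope_without_x2(X1, X2_uncorr))
```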

9 Case Study: MIT Lawsuit.

11 Collinearity. Let Yi = β0 + β1X1i + β2X2i + ei and suppose X1 and X2 are highly correlated. What difference does it make? It becomes hard to estimate β1 and β2 separately: there is no bias, but the standard errors are large.

12 Collinearity: Diagnosis. Neither X1 nor X2 has a significant t statistic, BUT X1 is significant when X2 is left out of the regression, and vice versa. Test joint significance with an F test.
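A sketch of this diagnosis pattern, assuming statsmodels is available; the data-generating values are illustrative, with X2 constructed as a near-copy of X1:

```python
# Sketch: collinearity diagnosis -- small individual t statistics, significant joint F test.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 50
X1 = rng.normal(size=n)
X2 = X1 + rng.normal(scale=0.05, size=n)              # X2 is almost a copy of X1
Y = 1.0 + 1.0 * X1 + 1.0 * X2 + rng.normal(size=n)    # illustrative true coefficients

res = sm.OLS(Y, sm.add_constant(np.column_stack([X1, X2]))).fit()
print(res.tvalues[1:])                                # t statistics on X1 and X2: typically insignificant
print(res.f_test("x1 = 0, x2 = 0"))                   # joint F test: strongly rejects

res1 = sm.OLS(Y, sm.add_constant(X1)).fit()           # leave X2 out and X1 becomes highly significant
print(res1.tvalues[1])
```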

13 Exact Collinearity. Let Yi = β0 + β1X1i + β2X2i + ei and suppose X2 is an exact linear function of X1, e.g., X2 = a + bX1. Then the model cannot be estimated at all! Exact collinearity can also occur with 3 or more X's.

14 Exact Collinearity: Example. A regression to explain calories as a function of the fat content of foods: X1 is fat in ounces per portion and X2 is fat in the same food in grams, so X2i = 28.35 X1i. The model Yi = β0 + β1X1i + β2X2i + ei can't be estimated. Intuition: X2 contains no independent information.
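A sketch of this example with made-up calorie and fat numbers; only the conversion factor 28.35 grams per ounce comes from the slide:

```python
# Sketch: exact collinearity -- fat in ounces (X1) and the same fat in grams (X2 = 28.35*X1).
import numpy as np

rng = np.random.default_rng(3)
n = 30
fat_oz = rng.uniform(0.1, 2.0, size=n)                          # fat in ounces per portion
fat_g = 28.35 * fat_oz                                          # same fat measured in grams
calories = 100 + 250 * fat_oz + rng.normal(scale=10, size=n)    # made-up calorie equation

X = np.column_stack([np.ones(n), fat_oz, fat_g])
beta, resid, rank, sv = np.linalg.lstsq(X, calories, rcond=None)
print(rank)   # 2, not 3: the design matrix is rank-deficient, so the two
              # fat coefficients are not separately identified
```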

15 Tests of Restrictions. Suppose H0: β2 = 2β1 in Yi = β0 + β1X1i + β2X2i + ei. Test H0 with a reformulated model that embeds the restriction: Yi = γ0 + γ1(X1i + 2X2i) + γ2X2i + ei, where γ1 = β1 and γ2 = β2 − 2β1. Under H0, γ2 = 0, which can be tested with the usual t statistic.
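A sketch of the restriction test, with data simulated under H0 (the coefficient values are illustrative); the t statistic on the X2 term in the reformulated regression tests γ2 = 0:

```python
# Sketch: testing H0: b2 = 2*b1 via the reformulated regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
b1 = 1.5
Y = 0.5 + b1 * X1 + 2 * b1 * X2 + rng.normal(size=n)   # data generated with b2 = 2*b1, so H0 is true

Z1 = X1 + 2 * X2                                        # carries the restricted combination
X = sm.add_constant(np.column_stack([Z1, X2]))          # coefficient on X2 is gamma2 = b2 - 2*b1
res = sm.OLS(Y, X).fit()
print(res.tvalues[2], res.pvalues[2])                   # usual t test of gamma2 = 0, i.e. of H0
```

If the data were generated with β2 far from 2β1, the same t statistic would be large and H0 would be rejected.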

16 Test Your Understanding! What is the difference between exact collinearity, e.g., X2i = 2X1i, and a coefficient restriction, e.g., H0: β2 = 2β1? Relate the concepts to the model.

