FIN357 Multiple Regression Analysis: y = β0 + β1x1 + β2x2 + ... + βkxk + u. 1. Estimation


1 Multiple Regression Analysis y = β0 + β1x1 + β2x2 + ... + βkxk + u. 1. Estimation

2 Similarities with Simple Regression β0 is still the intercept. β1 through βk are all called slope parameters. u is still the error term (or disturbance). We still assume that E(u|x1, x2, …, xk) = 0, and we still minimize the sum of squared residuals.
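The estimation step above can be sketched in NumPy: choose the coefficient vector that minimizes the sum of squared residuals, which is equivalent to solving the normal equations X'Xβ = X'y. This example is not from the slides; the data, seed, and true coefficient values are illustrative.

```python
import numpy as np

# Illustrative data: n = 100 observations, k = 2 regressors.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
u = rng.normal(size=n)                  # error term (disturbance)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + u      # true betas: 1.0, 2.0, -0.5

# Design matrix with a column of ones for the intercept beta_0.
X = np.column_stack([np.ones(n), x1, x2])

# OLS: minimize the sum of squared residuals (least squares).
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # estimates close to [1.0, 2.0, -0.5]
```

With n = 100 and unit error variance, the estimates land within a few tenths of the true values.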

3 Interpreting the Coefficients Each slope coefficient βj measures the partial effect of xj on y, holding the other regressors fixed.

4 Goodness-of-Fit Define the total sum of squares SST = Σ(yi − ȳ)², the explained sum of squares SSE = Σ(ŷi − ȳ)², and the residual sum of squares SSR = Σûi². For OLS with an intercept, SST = SSE + SSR.

5 Goodness-of-Fit (continued) We can compute the fraction of the total sum of squares (SST) that is explained by the model: R² = SSE/SST = 1 − SSR/SST.
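A quick numerical check of the two equivalent R² formulas, using an illustrative simulated sample (the data and seed are made up for the sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 1.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat
resid = y - y_hat

SST = np.sum((y - y.mean()) ** 2)      # total sum of squares
SSE = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
SSR = np.sum(resid ** 2)               # residual sum of squares

r2_a = SSE / SST
r2_b = 1 - SSR / SST
print(r2_a, r2_b)  # the two formulas agree (for OLS with an intercept)
```

The identity SST = SSE + SSR holds exactly for OLS with an intercept, which is why the two expressions for R² coincide.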

6 Too Many or Too Few Variables What happens if we include variables in our specification that don't belong? The OLS estimators remain unbiased, and our tests will still be valid, but less powerful (meaning less likely to detect a significant relationship). What if we exclude a variable from our specification that does belong? The OLS estimators will usually be biased.

7 Omitted Variable Bias

8 Omitted Variable Bias Summary There are two cases in which a simple regression omitting x2 is still unbiased: β2 = 0, that is, x2 doesn't really belong in the model; or x1 and x2 are uncorrelated in the sample.
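Both cases can be illustrated with a small Monte Carlo sketch (not from the slides; sample sizes, coefficients, and the correlation structure are chosen for illustration). We run the "short" regression of y on x1 alone, omitting x2, and average the slope estimate across many samples:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 500, 1000
beta1, beta2 = 1.0, 1.0  # illustrative true coefficients

def short_reg_slope(corr):
    """Average slope on x1 from the short regression of y on x1 (omitting x2)."""
    est = []
    for _ in range(reps):
        x1 = rng.normal(size=n)
        # x2 is correlated with x1 when corr != 0.
        x2 = corr * x1 + rng.normal(size=n)
        y = beta1 * x1 + beta2 * x2 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x1])
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        est.append(b[1])
    return np.mean(est)

mean_uncorr = short_reg_slope(0.0)
mean_corr = short_reg_slope(0.8)
print(mean_uncorr)  # near 1.0: no bias when x1 and x2 are uncorrelated
print(mean_corr)    # near 1.8: bias of beta2 times the x2-on-x1 slope (0.8)
```

When x2 is omitted but correlated with x1, the short-regression slope converges to β1 + β2·δ, where δ is the slope from regressing x2 on x1, matching the bias formula behind this slide.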

9 The following three slides contain some technical notes. You may skip them if you find them boring.

10 Assumptions in OLS The population model is linear in parameters: y = β0 + β1x1 + β2x2 + … + βkxk + u. We use a random sample of size n, {(xi1, xi2, …, xik, yi): i = 1, 2, …, n}, from the population. E(u|x1, x2, …, xk) = 0. None of the x's is constant, and there are no exact linear relationships among the x's.
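The last assumption (no exact linear relationships among the regressors) is equivalent to the design matrix having full column rank, which can be checked directly. A minimal sketch, with a deliberately constructed violation for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2 * x1 - x2  # exact linear combination: violates the assumption

X_ok = np.column_stack([np.ones(n), x1, x2])
X_bad = np.column_stack([np.ones(n), x1, x2, x3])

# Full column rank <=> no exact linear relationships among the regressors.
print(np.linalg.matrix_rank(X_ok))   # 3: equals the number of columns, assumption holds
print(np.linalg.matrix_rank(X_bad))  # 3 < 4 columns: perfect collinearity
```

When the rank is less than the number of columns, X'X is singular and the OLS coefficients are not uniquely determined.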

11 Further Assumption Assume Var(u|x1, x2, …, xk) = σ² (homoskedasticity). These five assumptions are known as the Gauss-Markov assumptions.

12 The Gauss-Markov Theorem Given our five Gauss-Markov assumptions, it can be shown that OLS is "BLUE": the Best Linear Unbiased Estimator. Thus, if the assumptions hold, use OLS.
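The "Best" in BLUE means OLS has the smallest variance among all linear unbiased estimators. A simulation sketch of that comparison (not from the slides; the competing estimator, a Wald-type grouping estimator that splits the sample at the median of x, is one standard example of another linear unbiased estimator):

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 100, 2000
x = rng.normal(size=n)        # regressor held fixed across replications
beta0, beta1 = 1.0, 2.0       # illustrative true coefficients

# Alternative linear unbiased estimator: split the sample at the median
# of x and use the ratio of the group-mean differences (Wald estimator).
hi = x > np.median(x)
ols_est, wald_est = [], []
for _ in range(reps):
    y = beta0 + beta1 * x + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    ols_est.append(b[1])
    wald_est.append((y[hi].mean() - y[~hi].mean()) / (x[hi].mean() - x[~hi].mean()))

# Both estimators are unbiased for beta1, but OLS has the smaller variance.
print(np.mean(ols_est), np.var(ols_est))
print(np.mean(wald_est), np.var(wald_est))
```

Both sampling distributions center on the true slope, but the OLS estimates are visibly less dispersed, which is exactly what the theorem guarantees under the five assumptions.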

