National Chengchi University, Sun Yat-sen Institute joint elective. Course: Quantitative Methods and Statistics for the Social Sciences — Methodology of Social Sciences. Topic: Further Inference in the Multiple Regression Model. Date: November 6, 2003.

Presentation transcript:

1 National Chengchi University, Sun Yat-sen Institute joint elective. Course: Quantitative Methods and Statistics for the Social Sciences — Methodology of Social Sciences. Topic: Further Inference in the Multiple Regression Model. Date: November 6, 2003

2 National Chengchi University, Sun Yat-sen Institute elective — 黃智聰. Restricted Least Squares. A hypothesis about a single parameter is tested with a t test; a joint null hypothesis is tested with an F test. The F test is based on a comparison of the sum of squared errors from the original, unrestricted multiple regression model with the sum of squared errors from a restricted regression model in which the null hypothesis is assumed to be true.

3 Example: y = α0 + α1X1 + α2X2 + α3X3 + e, with H0: α2 = α3 = 0; i.e., the restricted model is y = α0 + α1X1 + e.

F = [(SSE_R − SSE_U)/J] / [SSE_U/(T − K)]

If F ≥ F(J, T−K, α), reject the null hypothesis. Equivalently, if p = P(F(2, 96) ≥ F) < 0.05, reject the null hypothesis.
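The F statistic above can be computed directly from the two fitted regressions. A minimal numerical sketch with numpy (the simulated data, seed, and the helper name `sse` are illustrative assumptions, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100
X1, X2, X3 = rng.normal(size=(3, T))
# Data generated with alpha2 = alpha3 = 0, so H0 is true in this simulation.
y = 1.0 + 0.5 * X1 + rng.normal(size=T)

def sse(y, cols):
    """Sum of squared errors from an OLS fit of y on the given columns plus an intercept."""
    Z = np.column_stack([np.ones(len(y))] + list(cols))
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ b
    return resid @ resid

sse_u = sse(y, [X1, X2, X3])   # unrestricted model: K = 4 parameters
sse_r = sse(y, [X1])           # restricted model imposing alpha2 = alpha3 = 0
J, K = 2, 4
F = ((sse_r - sse_u) / J) / (sse_u / (T - K))
print(F)  # compare with the F(2, 96) critical value, about 3.09 at alpha = 0.05
```

Because the restricted model is nested in the unrestricted one, SSE_R ≥ SSE_U always holds, so F is never negative.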

4 There is no "greater than" or "less than" in the null hypothesis:
H0: β2 = 0, β3 = 0, …, βK = 0
H1: at least one of β2, …, βK is nonzero (e.g., β2 ≠ 0, or β3 ≠ 0, or both).
Note that F = t² when J = 1. Notice also that if the model is y = β0 + β1X1 + β2X2² + e, then dy/dX2 = 2β2X2, which implies that X2 has a different influence on y at each value of X2, while dy/dX1 = β1 is the same influence of X1 on all y.

5 Another example: y = β0 + β1X1 + β2X2 + e. Contrast H0: β1 = β2 = 0 with H0: β1 = β2. Under the latter, the restricted model is y = β0 + β1(X1 + X2) + e, and the F test uses F(1, T−3, α) since J = 1.
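The equality restriction β1 = β2 is imposed by regressing on the single constructed regressor X1 + X2. A hedged sketch (simulated data and helper names are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 100
X1, X2 = rng.normal(size=(2, T))
# Data generated with beta1 = beta2 = 0.7, so H0: beta1 = beta2 holds here.
y = 2.0 + 0.7 * X1 + 0.7 * X2 + rng.normal(size=T)

def sse(y, *cols):
    Z = np.column_stack([np.ones(len(y)), *cols])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    r = y - Z @ b
    return r @ r

sse_u = sse(y, X1, X2)      # unrestricted model, K = 3
sse_r = sse(y, X1 + X2)     # restriction imposed: one common slope on X1 + X2
F = (sse_r - sse_u) / (sse_u / (T - 3))   # J = 1, compare with F(1, 97, alpha)
print(F)
```

Since J = 1 here, this F statistic equals the square of the t statistic for testing β1 − β2 = 0 in the unrestricted model.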

6 8.6 Model Specification. Three essential features of model choice: (1) choice of functional form; (2) choice of explanatory variables (regressors) to be included in the model; (3) whether the multiple regression model assumptions MR1–MR6 hold.

7 1. Omitted and irrelevant variables. Example: y = β0 + β1X1 + β2X2 + e. If we do not observe X2 and instead regress y = β0 + β1*X1 + e, then β1* = β1 only if Cov(X1, X2) = 0 (or under the very strong assumption that β2 = 0). However, Cov(X1, X2) = 0 is very rare in practice.
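The omitted-variable bias is easy to see numerically: when X2 is dropped, the coefficient on X1 absorbs β2·Cov(X1, X2)/Var(X1). A simulation sketch (all names and parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 200
# X1 and X2 are deliberately correlated, so Cov(X1, X2) != 0.
X1 = rng.normal(size=T)
X2 = 0.8 * X1 + rng.normal(size=T)
beta1, beta2 = 1.0, 2.0
y = beta1 * X1 + beta2 * X2 + rng.normal(size=T)

def slope_on_x1(y, *cols):
    """OLS coefficient on the first regressor after the intercept."""
    Z = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(Z, y, rcond=None)[0][1]

b1_full = slope_on_x1(y, X1, X2)   # close to beta1 = 1.0
b1_short = slope_on_x1(y, X1)      # biased upward by roughly beta2 * 0.8 = 1.6
print(b1_full, b1_short)
```

With Cov(X1, X2)/Var(X1) ≈ 0.8 and β2 = 2, the short regression's slope converges to about 1 + 1.6 = 2.6 rather than 1.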

8 If an estimated equation has coefficients with unexpected signs or unrealistic magnitudes, a possible cause of these strange results is the omission of an important variable. The t test and the F test are the two significance tests that can be used to assess whether a variable or a group of variables should be included in an equation.

9 Notice: there are two possible reasons for a test outcome that does not reject a zero null hypothesis. (1) The corresponding variables have no influence and can be excluded from the model. (2) The corresponding variables are important ones for inclusion in the model, but the data are not sufficiently good to reject H0.

10 1. P(cannot reject H0 │ null is true): accepting H0 here correctly drops an insignificant coefficient. 2. P(cannot reject H0 │ null is not true): we may believe we are excluding an irrelevant variable, but we could instead be inducing omitted-variable bias in the remaining coefficient estimates.

11 So should we simply include as many variables as possible? Suppose the true model is Y = β0 + β1X1 + β2X2 + e --------- (1), but we estimate Y = β0 + β1X1 + β2X2 + β3X3 + e --------- (2). Then Var(b1) and Var(b2) are greater in (2) than in (1) if X3 is correlated with X1 and X2.
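The variance cost of an irrelevant but correlated regressor can be checked by Monte Carlo: estimate b1 many times with and without the extra variable and compare sampling variances. A sketch under assumed parameter values (none of the numbers come from the lecture):

```python
import numpy as np

rng = np.random.default_rng(3)
T, reps = 100, 500
b1_true_model, b1_with_extra = [], []
for _ in range(reps):
    X1 = rng.normal(size=T)
    X3 = 0.9 * X1 + 0.3 * rng.normal(size=T)   # irrelevant but highly correlated with X1
    X2 = rng.normal(size=T)
    y = 1.0 + 1.0 * X1 + 1.0 * X2 + rng.normal(size=T)  # true model excludes X3

    def b1(*cols):
        Z = np.column_stack([np.ones(T), *cols])
        return np.linalg.lstsq(Z, y, rcond=None)[0][1]

    b1_true_model.append(b1(X1, X2))
    b1_with_extra.append(b1(X1, X2, X3))

# Sampling variance of b1 is inflated when the irrelevant X3 is included.
print(np.var(b1_true_model), np.var(b1_with_extra))
```

Both estimators are unbiased for β1 = 1, but the second one is much noisier, which is exactly the trade-off the slide describes.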

12 2. Testing for Model Misspecification: The RESET Test. Misspecification arises when we have (1) omitted important variables, (2) included irrelevant ones, (3) chosen a wrong functional form, or (4) violated the model assumptions. The Regression Specification Error Test (RESET) detects omitted variables and incorrect functional form.

13 Suppose Y = β0 + β1X1 + β2X2 + e, with fitted values ŷ = b0 + b1X1 + b2X2. RESET augments the model with powers of the fitted values:

Y = β0 + β1X1 + β2X2 + γ1ŷ² + e -------- (1)
Y = β0 + β1X1 + β2X2 + γ1ŷ² + γ2ŷ³ + e -------- (2)

In (1), test H0: γ1 = 0 against H1: γ1 ≠ 0. In (2), test H0: γ1 = γ2 = 0 against H1: γ1 ≠ 0 or γ2 ≠ 0. Rejecting H0 means the original model is inadequate and can be improved. Failure to reject H0 means the test has not been able to detect any misspecification.
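Version (1) of RESET can be implemented by hand as an F test on the added ŷ² term. A sketch in which the true model is deliberately nonlinear, so the test should reject (the data-generating process and helper names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
T = 150
X1, X2 = rng.normal(size=(2, T))
# True relationship contains X1 squared, so a purely linear model is misspecified.
y = 1.0 + X1 + 0.5 * X1**2 + X2 + rng.normal(size=T)

def fit(y, Z):
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ b
    return b, resid @ resid

Z = np.column_stack([np.ones(T), X1, X2])
b, sse_linear = fit(y, Z)
yhat = Z @ b

# RESET (1): add yhat^2 and test gamma1 = 0 with J = 1.
Z_reset = np.column_stack([Z, yhat**2])
_, sse_reset = fit(y, Z_reset)
F = (sse_linear - sse_reset) / (sse_reset / (T - Z_reset.shape[1]))
print(F)  # a large F relative to F(1, T-4) critical values signals misspecification
```

Adding ŷ³ as well and testing the two γ coefficients jointly gives version (2) with J = 2.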

14 8.7 Collinear Variables. Many variables may move together in systematic ways; such variables are said to be collinear. When several collinear variables are involved, the problem is labeled collinearity or multicollinearity. Any effort to measure the individual or separate effects (e.g., marginal products) of various mixes of inputs from such data will then be difficult.

15 政治大學 中山所選修 黃智聰 (1) relationships b/w valuables (2) values of an explanatory valuable do not vary or change much within the sample (difficult to isolate its impact) of data also collinearity. Consequences of collinear (1) the least squares estimator is not defined if Γ 23 (correlation coefficient)=±1 then, Var(b 2 ) is undefined since 0 appear in the denominator. (2) Nearly exact linear, some of Var, se, cov of LSE may be large. imprecise information by the sample data about the unknow parameter.

16 (3) Larger standard errors make coefficients appear insignificant: collinear variables do not provide enough information to estimate their separate effects, even though theory may indicate their importance in the relationship. (4) Estimates are sensitive to the addition or deletion of a few observations, or to the deletion of apparently insignificant variables. (5) Accurate forecasts may still be possible if the nature of the collinear relationship remains the same within the future sample observations.

17 8.7.2 Identifying and Mitigating Collinearity. (1) Correlation coefficient between X1 and X2: a value above 0.8 or 0.9 indicates a strong linear association. But pairwise correlations cannot reveal collinearity involving X1, X2, and X3 jointly. (2) Auxiliary regressions: X2 = a1X1 + a3X3 + … + aKXK + e. If R² is high (> 0.8), a large portion of the variation in X2 is explained by variation in the other explanatory variables.
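The auxiliary-regression diagnostic can be sketched directly: regress one explanatory variable on the others and inspect R². In this illustrative simulation (values and names are assumptions), X2 is built to be nearly a linear combination of X1 and X3, so the auxiliary R² comes out close to 1:

```python
import numpy as np

rng = np.random.default_rng(5)
T = 100
X1, X3 = rng.normal(size=(2, T))
# X2 is almost an exact linear combination of X1 and X3 (small independent noise).
X2 = 0.7 * X1 + 0.7 * X3 + 0.2 * rng.normal(size=T)

# Auxiliary regression: X2 on the other explanatory variables plus an intercept.
Z = np.column_stack([np.ones(T), X1, X3])
b, *_ = np.linalg.lstsq(Z, X2, rcond=None)
resid = X2 - Z @ b
r2 = 1 - (resid @ resid) / np.sum((X2 - X2.mean()) ** 2)
print(r2)  # an R^2 above 0.8 flags X2 as collinear with X1 and X3
```

Equivalently, 1/(1 − R²) from this auxiliary regression is the variance inflation factor (VIF) for X2.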
