政治大學中山所共同選修 (National Chengchi University, Graduate Institute of Sun Yat-sen Studies, joint elective). Course: Quantitative Methods and Statistics for the Social Sciences – Quantitative Methods (Methodology of Social Sciences). Topic: Further Inference in the Multiple Regression Model. Date: November 6, 2003.

Restricted Least Squares. A single parameter is tested with a t test; a joint null hypothesis is tested with an F test. The F test is based on a comparison of the sum of squared errors from the original, unrestricted multiple regression model with the sum of squared errors from a restricted model in which the null hypothesis is assumed to be true.

Example: y = α0 + α1X1 + α2X2 + α3X3 + e, with H0: α2 = α3 = 0, i.e. the restricted model is y = α0 + α1X1 + e.

F = [(SSE_R − SSE_U)/J] / [SSE_U/(T − K)]

If F ≥ F(J, T−K, α), reject the null hypothesis; equivalently, if p = P(F(2, 96) ≥ F) < 0.05, reject the null hypothesis.
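A minimal sketch of this calculation (not part of the original slides; the data are simulated and the coefficient values are made up) computes the F statistic directly from the two sums of squared errors:

```python
import numpy as np
from scipy.stats import f

rng = np.random.default_rng(0)
T, K, J = 100, 4, 2          # observations, parameters in the full model, restrictions

# Simulated data for y = a0 + a1*X1 + a2*X2 + a3*X3 + e; here X2 and X3 are truly irrelevant.
X1, X2, X3 = rng.normal(size=(3, T))
y = 1.0 + 0.5 * X1 + rng.normal(size=T)

def sse(y, X):
    """Sum of squared errors from an OLS fit of y on X (X already contains the intercept)."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    return resid @ resid

ones = np.ones(T)
sse_u = sse(y, np.column_stack([ones, X1, X2, X3]))   # unrestricted model
sse_r = sse(y, np.column_stack([ones, X1]))           # restricted model with a2 = a3 = 0

F = ((sse_r - sse_u) / J) / (sse_u / (T - K))
p = f.sf(F, J, T - K)                                 # P(F(2, 96) >= F)
print(F, p)
```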

There is no "greater than" or "less than" in the null hypothesis; the test is two-sided:
H0: β2 = 0, β3 = 0, …, βK = 0
H1: at least one of β2, β3, …, βK is nonzero.
When J = 1, F = t².
Notice that if the model is y = β0 + β1X1 + β2X2² + e, then dy/dX2 = 2β2X2, which implies that X2 has a different marginal effect on y at each observation, while dy/dX1 = β1 is the same effect of X1 on y for all observations.

Another example: y = β0 + β1X1 + β2X2 + e. Compare the joint hypothesis H0: β1 = β2 = 0 with the single restriction H0: β1 = β2. Under the latter, the restricted model is y = β0 + β1(X1 + X2) + e, and the F test has J = 1 restriction, with critical value F(1, T−3, α).
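The single-restriction version works the same way: the restriction β1 = β2 is imposed by regressing on the combined variable X1 + X2. A sketch under the same simulated-data assumptions:

```python
import numpy as np
from scipy.stats import f

rng = np.random.default_rng(1)
T = 100
X1, X2 = rng.normal(size=(2, T))
y = 0.2 + 0.8 * X1 + 0.8 * X2 + rng.normal(size=T)   # generated so that beta1 = beta2 holds

def sse(y, X):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    return resid @ resid

ones = np.ones(T)
sse_u = sse(y, np.column_stack([ones, X1, X2]))      # unrestricted: y = b0 + b1*X1 + b2*X2 + e
sse_r = sse(y, np.column_stack([ones, X1 + X2]))     # restricted:   y = b0 + b1*(X1 + X2) + e

F = (sse_r - sse_u) / (sse_u / (T - 3))              # J = 1
p = f.sf(F, 1, T - 3)                                # compare with F(1, T-3, alpha)
print(F, p)
```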

8.6 Model Specification. Three essential features of model choice: (1) choice of functional form; (2) choice of explanatory variables (regressors) to be included in the model; (3) whether the multiple regression model assumptions MR1–MR6 hold.

1. Omitted and irrelevant variables. Example: suppose the true model is y = β0 + β1X1 + β2X2 + e. If we do not have X2 and instead regress y = β0 + β1*X1 + e, then β1* = β1 only if Cov(X1, X2) = 0 (or if X2 is irrelevant, i.e. β2 = 0), which is a very strong assumption. In practice, Cov(X1, X2) = 0 is very rare, so omitting X2 biases the estimate of β1.
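A small simulation (a sketch with made-up coefficients, not from the slides) makes the omitted-variable bias visible: when Cov(X1, X2) ≠ 0 and X2 is dropped, the estimated coefficient on X1 drifts away from its true value.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 5000
X1 = rng.normal(size=T)
X2 = 0.7 * X1 + rng.normal(size=T)                    # X2 correlated with X1, so Cov(X1, X2) != 0
y = 1.0 + 2.0 * X1 + 3.0 * X2 + rng.normal(size=T)    # true model: beta1 = 2, beta2 = 3

ones = np.ones(T)
b_full, *_ = np.linalg.lstsq(np.column_stack([ones, X1, X2]), y, rcond=None)
b_short, *_ = np.linalg.lstsq(np.column_stack([ones, X1]), y, rcond=None)

print(b_full[1])    # close to the true beta1 = 2
print(b_short[1])   # close to 2 + 3*0.7 = 4.1: beta1 plus beta2 times the slope of X2 on X1
```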

If an estimated equation has coefficients with unexpected signs or unrealistic magnitudes, a possible cause of these strange results is the omission of an important variable. The t-test and the F-test are the two significance tests that can assess whether a variable or a group of variables should be included in an equation.

Notice: there are two possible reasons for a test outcome that does not reject a zero null hypothesis. (1) The corresponding variables have no influence and can be excluded from the model. (2) The corresponding variables are important ones that belong in the model, but the data are not sufficiently informative to reject H0.

1. P(cannot reject H0 | null is true): accepting H0 leads us to call the coefficient insignificant. 2. P(cannot reject H0 | null is not true): in this case we might be excluding an irrelevant variable, but we might also be inducing omitted-variable bias in the remaining coefficient estimates.

So should we include as many variables as possible? No. Suppose the true model is Y = β0 + β1X1 + β2X2 + e (1), but we estimate Y = β0 + β1X1 + β2X2 + β3X3 + e (2). Then Var(b1) and Var(b2) are greater in (2) than in (1) whenever the irrelevant X3 is correlated with X1 and X2.
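A hedged sketch of that cost (simulated data; the correlation strength 0.9 is arbitrary, not from the slides) compares the standard error of b1 with and without the irrelevant but collinear X3:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
T = 200
X1 = rng.normal(size=T)
X2 = rng.normal(size=T)
X3 = 0.9 * X1 + 0.3 * rng.normal(size=T)              # irrelevant but highly correlated with X1
y = 1.0 + 2.0 * X1 + 1.5 * X2 + rng.normal(size=T)    # true model excludes X3 (beta3 = 0)

small = sm.OLS(y, sm.add_constant(np.column_stack([X1, X2]))).fit()
big = sm.OLS(y, sm.add_constant(np.column_stack([X1, X2, X3]))).fit()

print(small.bse[1])   # standard error of b1 in the correctly specified model
print(big.bse[1])     # larger standard error once the irrelevant, collinear X3 is added
```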

2. Testing for Model Misspecification: The RESET Test. A model is misspecified when we have (1) omitted important variables, (2) included irrelevant ones, (3) chosen a wrong functional form, or (4) violated the model assumptions. The Regression Specification Error Test (RESET) is designed to detect omitted variables and an incorrect functional form.

Suppose Y = β0 + β1X1 + β2X2 + e, with fitted values ŷ = b0 + b1X1 + b2X2. The RESET augmented regressions are:
Y = β0 + β1X1 + β2X2 + r1ŷ² + e   (1)
Y = β0 + β1X1 + β2X2 + r1ŷ² + r2ŷ³ + e   (2)
In (1), test H0: r1 = 0 against H1: r1 ≠ 0. In (2), test H0: r1 = r2 = 0 against H1: r1 ≠ 0 or r2 ≠ 0. Rejecting H0 means the original model is inadequate and can be improved. Failing to reject H0 means the test has not been able to detect any misspecification.
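A minimal RESET sketch under these notations (simulated data; variable names and coefficients are hypothetical): fit the base model, form ŷ² and ŷ³, and test whether their coefficients are jointly zero with the same kind of F test as above.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 200
X1, X2 = rng.normal(size=(2, T))
y = 1.0 + 0.5 * X1 + 0.5 * X2 + 0.4 * X1**2 + rng.normal(size=T)   # true relation is nonlinear in X1

X_base = sm.add_constant(np.column_stack([X1, X2]))
base = sm.OLS(y, X_base).fit()
yhat = base.fittedvalues

# RESET variant (2): add yhat^2 and yhat^3 and test H0: r1 = r2 = 0
X_aug = np.column_stack([X_base, yhat**2, yhat**3])
aug = sm.OLS(y, X_aug).fit()

f_stat, p_value, df_diff = aug.compare_f_test(base)
print(f_stat, p_value)   # a small p-value signals misspecification (here the omitted X1^2 term)
```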

8.7 Collinear Variables. Many variables may move together in systematic ways; such variables are said to be collinear. When several collinear variables are involved, the problem is labeled collinearity or multicollinearity. Any effort to measure the individual or separate effects (marginal products) of various mixes of inputs from such data will then be difficult.

Collinearity arises when (1) there are relationships between the explanatory variables, or (2) the values of an explanatory variable do not vary or change much within the sample, which makes it difficult to isolate its impact. Consequences of collinearity: (1) The least squares estimator is not defined if r23 (the correlation coefficient between X2 and X3) equals ±1; then Var(b2) is undefined because the factor (1 − r23²) in its denominator is zero. (2) With nearly exact linear relationships, some of the variances, standard errors, and covariances of the least squares estimators may be large, so the sample data give imprecise information about the unknown parameters.

(3) Standard errors rise, so estimates may appear insignificant: collinear variables do not provide enough information to estimate their separate effects, even though theory may indicate their importance in the relationship. (4) Estimates are sensitive to the addition or deletion of a few observations, or to the deletion of an apparently insignificant variable. (5) Accurate forecasts may still be possible if the nature of the collinear relationship remains the same within the future sample observations.

Identifying and Mitigating Collinearity. (1) Correlation coefficients: a high correlation between X1 and X2 indicates a strong linear association between them, but pairwise correlations cannot reveal collinearity that involves X1, X2, and X3 jointly. (2) Auxiliary regressions: regress X2 = a1X1 + a3X3 + … + aKXK + e. If R² is high (above 0.8), a large portion of the variation in X2 is explained by variation in the other explanatory variables.
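The auxiliary-regression check is easy to automate. A sketch (simulated collinear data; the 0.8 threshold is the rule of thumb quoted above): regress each explanatory variable on all the others and inspect the R², or equivalently the variance inflation factor 1/(1 − R²).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
T = 200
X1 = rng.normal(size=T)
X3 = rng.normal(size=T)
X2 = 0.8 * X1 + 0.5 * X3 + 0.2 * rng.normal(size=T)   # X2 is nearly a linear combination of X1, X3

# Auxiliary regression: X2 on the other explanatory variables (plus an intercept)
aux = sm.OLS(X2, sm.add_constant(np.column_stack([X1, X3]))).fit()
r2 = aux.rsquared
vif = 1.0 / (1.0 - r2)          # variance inflation factor for X2

print(r2, vif)                  # R^2 above roughly 0.8 flags X2 as collinear with the others
```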