REGRESSION DIAGNOSTIC I: MULTICOLLINEARITY


CHAPTER 4: REGRESSION DIAGNOSTIC I: MULTICOLLINEARITY
Damodar Gujarati, Econometrics by Example, second edition

MULTICOLLINEARITY
One of the assumptions of the classical linear regression model (CLRM) is that there is no exact linear relationship among the regressors. If one or more such relationships exist among the regressors, we call it multicollinearity, or collinearity for short.
Perfect collinearity: an exact linear relationship among the regressors exists.
Imperfect collinearity: the regressors are highly, but not perfectly, collinear.
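The distinction can be seen numerically: under perfect collinearity the X'X matrix of the OLS normal equations is singular, so no unique estimates exist. A minimal numpy sketch (variable names are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x2 = rng.normal(size=n)
x3 = 2.0 * x2                        # perfect collinearity: X3 is an exact linear function of X2
X = np.column_stack([np.ones(n), x2, x3])

# With perfect collinearity X'X is rank-deficient, so the OLS normal
# equations have no unique solution.
print(np.linalg.matrix_rank(X.T @ X))  # 2, not 3
```

Replacing `2.0 * x2` with `2.0 * x2 + noise` makes the rank full again (imperfect collinearity): estimation is possible, but, as the next slide shows, the estimates become imprecise.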

CONSEQUENCES
If collinearity is high but not perfect, several consequences ensue:
The OLS estimators are still BLUE, but one or more regression coefficients have large standard errors relative to the values of the coefficients, making the t ratios small. One may therefore conclude, misleadingly, that the true values of these coefficients are not different from zero.
Even though some regression coefficients are statistically insignificant, the R2 value may be very high.
The regression coefficients may also be very sensitive to small changes in the data, especially if the sample is relatively small.
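The inflated-standard-error consequence is easy to demonstrate by simulation. The following sketch (not from the text; the data-generating process and seed are arbitrary) fits the same model twice, once with nearly uncorrelated regressors and once with highly correlated ones, and compares the standard error of the coefficient on X2:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30

def ols_se(r):
    # Draw X2, X3 with population correlation r, generate
    # y = 1 + X2 + X3 + u, and return the OLS standard error of b2.
    x = rng.multivariate_normal([0.0, 0.0], [[1.0, r], [r, 1.0]], size=n)
    X = np.column_stack([np.ones(n), x])
    y = 1.0 + x[:, 0] + x[:, 1] + rng.normal(size=n)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    s2 = resid @ resid / (n - 3)       # unbiased error-variance estimate
    return float(np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1]))

se_low, se_high = ols_se(0.1), ols_se(0.99)
print(se_low, se_high)  # the second is typically several times larger
```

With the same true coefficients, the t ratio under r = 0.99 is correspondingly smaller, which is exactly the "insignificant coefficients, high R2" pattern described above.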

VARIANCE INFLATION FACTOR
For the three-variable regression model
  Yi = B1 + B2 X2i + B3 X3i + ui
it can be shown that
  var(b2) = σ² / [Σ x2i² (1 − r23²)]  and  var(b3) = σ² / [Σ x3i² (1 − r23²)],
where σ² is the variance of the error term ui, r23 is the coefficient of correlation between X2 and X3, and lowercase x2i and x3i denote deviations from the sample means.
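The deviation-form variance formula agrees exactly with the matrix formula var(b) = σ²(X'X)⁻¹. A quick numerical check (the data and σ² are made-up values for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 40
x2 = rng.normal(size=n)
x3 = 0.8 * x2 + rng.normal(scale=0.5, size=n)
X = np.column_stack([np.ones(n), x2, x3])
sigma2 = 2.0  # assumed error variance

# Textbook formula: var(b2) = sigma^2 / (sum of squared deviations of
# X2, times (1 - r23^2)).
x2d, x3d = x2 - x2.mean(), x3 - x3.mean()
r23 = (x2d @ x3d) / np.sqrt((x2d @ x2d) * (x3d @ x3d))
var_formula = sigma2 / ((x2d @ x2d) * (1.0 - r23**2))

# Matrix formula: var(b) = sigma^2 * (X'X)^{-1}; element [1, 1] is var(b2).
var_matrix = sigma2 * np.linalg.inv(X.T @ X)[1, 1]
print(var_formula, var_matrix)  # the two agree
```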

VARIANCE INFLATION FACTOR (CONT.)
The term 1 / (1 − r23²) is the variance-inflating factor (VIF). VIF is a measure of the degree to which the variance of the OLS estimator is inflated because of collinearity.
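For the two-regressor case the VIF is a direct function of the pairwise correlation, which makes its growth easy to tabulate (a small helper sketch, not from the text):

```python
# VIF for the two-regressor model, from the correlation between X2 and X3.
def vif_two_regressor(r23):
    return 1.0 / (1.0 - r23**2)

print(vif_two_regressor(0.0))   # 1.0: no collinearity, no inflation
print(vif_two_regressor(0.9))   # ≈ 5.26
print(vif_two_regressor(0.99))  # ≈ 50.25
```

Note how the inflation accelerates: going from r23 = 0.9 to 0.99 roughly multiplies the variance by ten.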

DETECTION OF MULTICOLLINEARITY
1. High R2 but few significant t ratios.
2. High pairwise correlations among the explanatory variables (regressors).
3. High partial correlation coefficients.
4. Significant F tests in the auxiliary regressions (regressions of each regressor on the remaining regressors).
5. High variance inflation factor (VIF), particularly exceeding 10, and correspondingly low tolerance factor (TOL, the reciprocal of VIF).
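With more than two regressors, the VIF for each regressor comes from its auxiliary regression (point 4 above): VIF_j = 1 / (1 − R²_j). A self-contained numpy sketch of that computation, on simulated data (function name and data are illustrative):

```python
import numpy as np

def vif(X, j):
    # Auxiliary-regression VIF: regress column j of X on the remaining
    # columns (plus a constant) and return 1 / (1 - R^2).
    y = X[:, j]
    Z = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ b
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 / (1.0 - r2)

rng = np.random.default_rng(2)
x2 = rng.normal(size=100)
x3 = x2 + rng.normal(scale=0.2, size=100)  # highly collinear pair
X = np.column_stack([x2, x3])
v = vif(X, 0)
tol = 1.0 / v                              # tolerance factor, TOL = 1/VIF
print(v, tol)                              # VIF well above 10, TOL well below 0.1
```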

REMEDIAL MEASURES
What should we do if we detect multicollinearity?
1. Nothing, for we often have no control over the data.
2. Redefining the model by excluding variables may attenuate the problem, provided we do not omit relevant variables.
3. Principal components analysis: construct artificial variables from the regressors such that they are orthogonal to one another. These principal components then become the regressors in the model, although the interpretation of their coefficients is not as straightforward.
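The principal-components idea can be sketched in a few lines of numpy, assuming the standard construction (standardize the regressors, then rotate onto the eigenvectors of their correlation matrix); the data here are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
x2 = rng.normal(size=n)
x3 = x2 + rng.normal(scale=0.1, size=n)    # nearly collinear regressors
X = np.column_stack([x2, x3])

# Standardize, then rotate onto the eigenvectors of the correlation
# matrix; the resulting principal-component scores are orthogonal.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
PC = Z @ eigvecs                           # principal-component scores

gram = PC.T @ PC
print(np.round(gram, 6))                   # off-diagonal entries ~0: orthogonal
```

Regressing Y on the columns of `PC` removes the collinearity by construction, but each coefficient now measures the effect of a weighted mix of X2 and X3, which is the interpretability cost noted above.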