
Lecture 17. Summary of the previous lecture: EViews.

Today's discussion:
- R-squared
- Adjusted R-squared
- The game of maximizing adjusted R-squared
- The multiple regression model: the problem of estimation

A problem of regression analysis: the CLRM assumes no multicollinearity among the regressors included in the regression model. We will discuss:
- What is the nature of multicollinearity?
- Is multicollinearity really a problem?
- What are its practical consequences?
- How does one detect it?
- What remedial measures can be taken to alleviate the problem of multicollinearity?

Nature of multicollinearity. Multicollinearity (MC) means the existence of a "perfect," or exact, linear relationship among some or all of the explanatory variables of a regression model. For example, for regressors X1, X2, ..., Xk, perfect multicollinearity exists when λ1X1 + λ2X2 + ... + λkXk = 0, where the λ's are constants that are not all zero simultaneously.
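To make the definition concrete, here is a small NumPy sketch (an illustration added to this transcript, not from the original slides; the data are simulated). When one regressor is an exact multiple of another, the X'X matrix of the OLS normal equations loses full rank and is singular, so the coefficients have no unique solution:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = 3 * x1                      # perfect linear relationship: x2 = 3*x1
X = np.column_stack([np.ones(n), x1, x2])

xtx = X.T @ X                    # the X'X matrix of the OLS normal equations
# Three columns but only two linearly independent ones: X'X is singular,
# so the normal equations (X'X)b = X'y have no unique solution.
print(np.linalg.matrix_rank(xtx))   # 2, not 3
```

This is exactly the indeterminacy described on the next slide: with perfect MC, OLS cannot separate the effect of x1 from that of x2.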

Ballantine view of MC

The logic behind assuming no MC in the CLRM:
1. If multicollinearity is perfect, the regression coefficients of the X variables are indeterminate and their standard errors are infinite.
2. If multicollinearity is less than perfect, the regression coefficients, although determinate, possess large standard errors (in relation to the coefficients themselves), which means the coefficients cannot be estimated with great precision or accuracy.

Sources of multicollinearity. There are several sources of multicollinearity:
1. The data collection method employed, for example sampling over a limited range of the values taken by the regressors in the population.
2. Constraints on the model or in the population being sampled. For example, in a regression of electricity consumption on income (X2) and house size (X3), there is a physical constraint in the population: families with higher incomes generally have larger homes than families with lower incomes.
3. Model specification, for example adding polynomial terms to a regression model, especially when the range of the X variable is small.

Sources of MC (continued):
4. An overdetermined model. This happens when the model has more explanatory variables than observations.
5. Regressors included in the model that share a common trend, i.e., variables that all increase or decrease over time. Thus, in a regression of consumption expenditure on income, wealth, and population, the regressors income, wealth, and population may all be growing over time at more or less the same rate, leading to collinearity among these variables.

Theoretical consequences of multicollinearity. If the assumptions of the classical model are satisfied, the OLS estimators of the regression coefficients are BLUE. Even if multicollinearity is very high, as in the case of near-perfect multicollinearity, the OLS estimators still retain the BLUE property. Then what is the multicollinearity fuss all about? There is no purely statistical answer; the difficulty is practical rather than theoretical. Result: in large samples, multicollinearity is not a serious issue.
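The claim that near multicollinearity leaves unbiasedness intact can be checked with a small Monte Carlo sketch (an added illustration; the parameter values are assumed for the example). Across many simulated samples, the OLS estimate of the coefficient on the collinear regressor is still centered on its true value; it is just very noisy in any single sample:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps, true_b2 = 100, 2000, 0.5
estimates = np.empty(reps)
for r in range(reps):
    x1 = rng.normal(size=n)
    x2 = x1 + 0.05 * rng.normal(size=n)   # highly collinear regressors
    y = 1.0 + 2.0 * x1 + true_b2 * x2 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    estimates[r] = b[2]                   # OLS estimate of the x2 coefficient

# Centered on the true 0.5 on average, but with a large sample-to-sample spread
print(estimates.mean(), estimates.std())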

Practical Consequences of Multicollinearity

Detection of multicollinearity:
- Multicollinearity is a question of degree, not of kind. The meaningful issue is not its presence or absence but whether its degree is high or low.
- It is a feature of the sample, not of the population, since it refers to the condition of the explanatory variables, which are assumed to be nonstochastic. So it is a problem of the sample, not the population.
- There is no unique method of detecting it or of measuring its strength; only some rules of thumb exist.
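One widely used rule of thumb is the variance inflation factor (VIF). The sketch below is an added illustration with simulated data; it computes VIF_j = 1/(1 - R_j^2), where R_j^2 comes from the auxiliary regression of regressor X_j on a constant and the remaining regressors. A conventional warning level is VIF > 10:

```python
import numpy as np

def vif(X, j):
    """Variance inflation factor for column j of regressor matrix X:
    VIF_j = 1 / (1 - R_j^2), where R_j^2 is from the auxiliary regression
    of X_j on a constant and the remaining regressors."""
    y = X[:, j]
    Z = np.column_stack([np.ones(X.shape[0]), np.delete(X, j, axis=1)])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ b
    r_squared = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 / (1.0 - r_squared)

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.1 * rng.normal(size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)              # unrelated regressor
X = np.column_stack([x1, x2, x3])

# Rule of thumb: VIF > 10 signals serious multicollinearity
print([round(vif(X, j), 1) for j in range(3)])
```

Here the VIFs for x1 and x2 come out far above 10 while the VIF for x3 stays near 1, matching the idea that collinearity is a matter of degree measured within the sample at hand.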

Rules to detect Multicollinearity

Rules to detect Multicollinearity (continued)