1. The independent variables do not form a linearly dependent set; i.e., the explanatory variables are not perfectly correlated.
2. Homoscedasticity: the probability distributions of the error term have a constant variance for all values of the independent variables (the X_i's).

Perfect multicollinearity is a violation of assumption (1). Heteroscedasticity is a violation of assumption (2).
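As a quick illustration of why assumption (1) matters, here is a minimal sketch (Python, with invented data): when one regressor is an exact linear function of another, the X'X matrix is rank-deficient, so the OLS normal equations have no unique solution.

```python
# Sketch with invented data: perfect multicollinearity makes X'X singular.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = 3.0 * x1                             # x2 is an exact linear function of x1
X = np.column_stack([np.ones(n), x1, x2])

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))         # prints 2, not 3: X'X is not invertible,
                                          # so the normal equations X'X b = X'y
                                          # have no unique solution b
```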

Multicollinearity is a problem with time series regression

Suppose we wanted to estimate the following specification using quarterly time series data:

Auto Sales_t = β_0 + β_1 Income_t + β_2 Prices_t

where Income_t is nominal income in quarter t and Prices_t is an index of auto prices in quarter t. The data reveal a strong positive correlation between nominal income and car prices.
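A hedged sketch of this setup with simulated quarterly data (all parameter values are invented for illustration; statsmodels is assumed available for the OLS fit):

```python
# Simulated version of the auto-sales specification above (invented numbers).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
T = 80                                              # 20 years of quarterly data
income = 100 + np.cumsum(rng.normal(1.0, 2.0, T))   # trending nominal income
prices = 0.8 * income + rng.normal(0, 3.0, T)       # price index tracks income closely
sales = 50 + 0.5 * income - 0.3 * prices + rng.normal(0, 5.0, T)

print(np.corrcoef(income, prices)[0, 1])            # close to 1: strong correlation
X = sm.add_constant(np.column_stack([income, prices]))
print(sm.OLS(sales, X).fit().summary())             # note the inflated standard errors
```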

[Figure: scatter plot of car prices against (nominal) income, showing an approximate linear relationship between the explanatory variables]

Why is multicollinearity a problem?

In the case of perfectly collinear explanatory variables, OLS does not work. In the case where there is an approximate linear relationship among the explanatory variables (the X_i's), the estimates of the coefficients are still unbiased, but you run into the following problems (illustrated in the sketch after this list):

- The estimates of the coefficients have high standard errors, weakening the capacity of the equation to produce accurate forecasts.
- High standard errors mean small t-ratios and a greater likelihood that null hypotheses will not be rejected.
- Multicollinearity mingles the effects of the independent variables together, making it difficult for the researcher to disentangle the separate effects of the explanatory variables on the dependent variable.
- Estimates of the coefficients (the β's) are "unstable," meaning that a comparatively small change in the data set can produce a big change in the estimate of a coefficient.
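A small simulation makes the first and last points concrete: as the correlation between two regressors rises, the standard errors of both slope estimates grow, even though the estimator stays unbiased (invented data; statsmodels assumed available).

```python
# Sketch: slope standard errors grow as the regressor correlation rises.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
for rho in (0.0, 0.7, 0.99):
    x1 = rng.normal(size=n)
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)  # corr(x1, x2) ~ rho
    y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)
    fit = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
    print(f"rho={rho:4.2f}  se(b1)={fit.bse[1]:.3f}  se(b2)={fit.bse[2]:.3f}")
```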

How do you know you have a problem with multicollinearity?

- Do the estimates have high standard errors? Are the t-ratios microscopic?
- Does the correlation matrix reveal a high correlation between explanatory variables--say, 0.70 or higher?
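Both checks are easy to automate. The sketch below (invented data mirroring the income/prices example; pandas and statsmodels assumed) prints the correlation matrix and the variance inflation factors (VIFs), a related diagnostic where values above about 10 are commonly read as a warning sign.

```python
# Diagnostics sketch: correlation matrix and variance inflation factors (VIFs).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
income = 100 + np.cumsum(rng.normal(1.0, 2.0, 80))
prices = 0.8 * income + rng.normal(0, 3.0, 80)
df = pd.DataFrame({"income": income, "prices": prices})

print(df.corr())                          # pairwise r well above the 0.70 rule of thumb

X = sm.add_constant(df)
for i in range(1, X.shape[1]):            # skip the constant term
    print(X.columns[i], variance_inflation_factor(X.values, i))
```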

What can be done about multicollinearity?

- Increase the sample size.
- Delete one or more explanatory variables from your specification (see the sketch below).
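For the second remedy, the effect is easy to see in simulation: with invented data where x2 nearly duplicates x1, refitting without x2 shrinks the standard error on x1's coefficient considerably.

```python
# Sketch: dropping a nearly redundant regressor shrinks the surviving SE.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(0, 0.05, n)          # x2 nearly duplicates x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

full = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
trim = sm.OLS(y, sm.add_constant(x1)).fit()
print("se(b1) with both regressors:", full.bse[1])
print("se(b1) after dropping x2:   ", trim.bse[1])
```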

Heteroscedasticity sometimes shows up when we do regression analysis using cross-sectional data. Consider the following model:

Y_i = β_0 + β_1 X_i + e_i

where β_0 + β_1 X_i is the deterministic part of the equation and e_i is the error term. Recall that we assume that E(e_i) = 0.
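A minimal way to generate data from this model with heteroscedastic errors (all values invented) is to let the error's standard deviation grow with X_i while keeping its mean at zero:

```python
# Sketch: errors with mean zero but a variance that rises with X.
import numpy as np

rng = np.random.default_rng(7)
n = 200
x = rng.uniform(10, 100, n)               # e.g., household income
e = rng.normal(0, 0.1 * x)                # sd of e_i grows with x_i, yet E(e_i) = 0
y = 5.0 + 0.3 * x + e                     # Y_i = b0 + b1*X_i + e_i
```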

[Figure: two distributions (JAR #1 and JAR #2) with the same mean, μ = 0, but different variances]

[Figure: the disturbance distributions under heteroscedasticity--P(e) plotted at explanatory-variable values X_1 and X_2, with the spread widening as X grows]

[Figure: scatter diagram of ascending heteroscedasticity--spending for electronics plotted against household income]

Why is heteroscedasticity a problem?

- Heteroscedasticity does not give us biased estimates of the coefficients; however, it does make the standard errors of the estimates unreliable. That is, we will understate the standard errors.
- Because the standard errors are understated, t-tests cannot be trusted. We run the risk of rejecting a null hypothesis that should not be rejected. (A sketch comparing conventional and robust standard errors follows.)
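One standard response, sketched below with invented data, is to compare conventional OLS standard errors against heteroscedasticity-robust (White/HC1) ones; under ascending heteroscedasticity the conventional figure is typically too small.

```python
# Sketch: conventional vs. heteroscedasticity-robust (White) standard errors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 500
x = rng.uniform(10, 100, n)
y = 5.0 + 0.3 * x + rng.normal(0, 0.1 * x)    # error spread grows with x

X = sm.add_constant(x)
plain = sm.OLS(y, X).fit()                    # conventional covariance
robust = sm.OLS(y, X).fit(cov_type="HC1")     # White-style robust covariance
print("conventional se(b1):", plain.bse[1])
print("robust       se(b1):", robust.bse[1])
```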