
Chapter 11 Autocorrelation

Learning Objectives
- Understand the autoregressive structure of the error term
- Understand methods for detecting autocorrelation
- Understand how to correct for autocorrelation
- Understand unit roots and cointegration

What is Autocorrelation? Autocorrelation occurs when the error term in one period is related to the error term in previous periods:
$\varepsilon_t = \rho \varepsilon_{t-1} + u_t$ where $|\rho| < 1$
Positive autocorrelation ($\rho > 0$): positive errors tend to follow positive errors and negative errors tend to follow negative errors.
Negative autocorrelation ($\rho < 0$): positive errors tend to follow negative errors and negative errors tend to follow positive errors.
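
A minimal Stata sketch of what an AR(1) error process looks like; the variable names (t, u, eps) and the value 0.8 for $\rho$ are illustrative assumptions, not from the slides:

clear
set obs 100
set seed 12345
generate t = _n
tsset t
generate u = rnormal()             // white-noise shocks
generate eps = u                   // starting value for the recursion
replace eps = 0.8*L.eps + u in 2/L // AR(1): this period's error leans on last period's

Plotting eps against t would show long runs of positive and negative values, the visual signature of positive autocorrelation.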

No Autocorrelation

Positive Autocorrelation

Negative Autocorrelation

The Issues and Consequences Associated with Autocorrelation Problem: Autocorrelation violates time-series assumption T6, which states that the error terms must not be correlated across time periods.
Consequences:
- Under autocorrelation, parameter estimates remain unbiased.
- Parameter estimates are no longer minimum variance among all unbiased estimators.
- Estimated standard errors are incorrect, so all measures of precision based on them (t-statistics, confidence intervals, p-values) are also incorrect.

Goals of this Chapter

An Important Caveat before Continuing With more advanced statistical packages, many researchers include a very simple command asking their chosen statistical program to provide standard error estimates that automatically correct for autocorrelation (Newey-West standard errors). Even though correcting for autocorrelation is straightforward, it is important to first work through the more "old-school" examples below before learning how to calculate Newey-West standard errors.

Understand the Autoregressive Structure of the Error Term
AR(1): the error term this period is related to the error term last period
$\varepsilon_t = \rho \varepsilon_{t-1} + u_t$ where $|\rho| < 1$
AR(2): the error term this period is related to the error terms of the last two periods
$\varepsilon_t = \rho_1 \varepsilon_{t-1} + \rho_2 \varepsilon_{t-2} + u_t$ where $|\rho_i| < 1$

Understand the Autoregressive Structure of the Error Term
AR(1,4): the error term this period is related to the error term last period and the error term four periods ago
$\varepsilon_t = \rho_1 \varepsilon_{t-1} + \rho_4 \varepsilon_{t-4} + u_t$ where $|\rho_i| < 1$
AR(4): the error term this period is related to the error terms of the last four periods
$\varepsilon_t = \rho_1 \varepsilon_{t-1} + \rho_2 \varepsilon_{t-2} + \rho_3 \varepsilon_{t-3} + \rho_4 \varepsilon_{t-4} + u_t$ where $|\rho_i| < 1$

Understand Methods for Detecting Autocorrelation
Informal methods
- Graphs
Formal methods using statistical tests
- Durbin-Watson test
- Regression test

Informal Method Graph any of the following and look for a pattern in the observations (see the sketch below):
- the residuals against each independent variable
- the residuals over time
- the residuals against the lagged residuals
If a pattern exists, that is evidence of autocorrelation.
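
A minimal Stata sketch of these diagnostic plots, assuming a tsset dataset with dependent variable y, regressor x, and time variable t (all names illustrative):

regress y x
predict e, residuals      // save the residuals
generate e_lag = L.e      // one-period lagged residuals
twoway scatter e t        // residuals over time
twoway scatter e x        // residuals against the regressor
twoway scatter e e_lag    // residuals against lagged residuals

A patternless cloud suggests no autocorrelation; an upward-sloping cloud in the last plot suggests positive autocorrelation.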

Regression of Export Volume in England on Exchange Rate from 1930 to 2009

Notice how positive residuals tend to follow positive residuals and negative residuals tend to follow negative residuals.

This residual plot is obtained by checking the residual plot option in Excel when running a regression. As in the previous slide, notice how there is a pattern between the residuals and the independent variable.

The primary drawback of the informal method is that it is not clear how strong a pattern must be before we conclude that the model suffers from autocorrelation. This is why we need formal tests of autocorrelation.

Formal Methods for Detecting Autocorrelation The formal methods that we consider are all based on statistical tests of the following general null and alternative hypotheses:
$H_0$: the error terms are not correlated over time
$H_1$: the error terms are correlated over time

Testing for Autocorrelation
- Durbin-Watson test
- Regression test

Durbin-Watson Test for AR(1) How to do it:
(1) Estimate the population regression model $y_t = \beta_0 + \beta_1 x_{1,t} + \beta_2 x_{2,t} + \dots + \beta_k x_{k,t} + \varepsilon_t$ and obtain the residuals $e_t = y_t - \hat{y}_t$.
(2) Calculate the terms required for the Durbin-Watson statistic: $e_t^2$, $e_{t-1}$, and $(e_t - e_{t-1})^2$.
(3) Calculate the Durbin-Watson statistic
$d = \dfrac{\sum_{t=2}^{T} (e_t - e_{t-1})^2}{\sum_{t=1}^{T} e_t^2}$
(4) Consult the appropriate Durbin-Watson table to determine whether to reject the null hypothesis of no autocorrelation. Rule of thumb: if $d$ is near 2 there is no AR(1); if $d$ is close to 0 or 4, there is evidence of AR(1).
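
A minimal Stata sketch of this calculation, assuming a tsset dataset with variables y and x (names illustrative); the built-in shortcut estat dwatson at the end is standard Stata:

regress y x
predict e, residuals
generate num = (e - L.e)^2    // (e_t - e_{t-1})^2, missing for t = 1
generate den = e^2            // e_t^2
quietly summarize num
scalar dw_num = r(sum)
quietly summarize den
scalar dw_den = r(sum)
display "Durbin-Watson d = " dw_num/dw_den
estat dwatson                 // Stata's built-in version after regress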

Durbin-Watson Test for AR(1) Why It Works: Under perfect positive autocorrelation, this period's error always equals last period's error, so every numerator term $(e_t - e_{t-1})^2$ is zero and $d = 0$. Under perfect negative autocorrelation, this period's error is always exactly opposite last period's error, so $(e_t - e_{t-1})^2 = 4e_t^2$ and $d = 4$. Accordingly, calculated values of the test statistic close to 0 or close to 4 indicate that autocorrelation is present in the data.
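
A useful way to see this, via the standard textbook approximation (not stated on the slide): expanding the numerator gives

$d = \dfrac{\sum (e_t - e_{t-1})^2}{\sum e_t^2} = \dfrac{\sum e_t^2 - 2\sum e_t e_{t-1} + \sum e_{t-1}^2}{\sum e_t^2} \approx 2(1 - \hat{\rho})$

so $\hat{\rho} = 1$ gives $d \approx 0$, $\hat{\rho} = 0$ gives $d \approx 2$, and $\hat{\rho} = -1$ gives $d \approx 4$.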

Durbin-Watson Test for AR(1) Potential Issues:
(1) The test cannot be performed in models with lagged dependent variables.
(2) The test can only be performed on models in which the suspected autocorrelation takes the form of AR(1).
(3) The errors must be normally distributed.
(4) The model must include an intercept.
(5) There is an inconclusive region.

Inconclusive Region of the Durbin-Watson Test

Durbin-Watson Test Example
$d = \dfrac{\sum_{t=2}^{T} (e_t - e_{t-1})^2}{\sum_{t=1}^{T} e_t^2} = \dfrac{29{,}875{,}729{,}432}{404{,}949{,}622{,}578} = 0.0738$
Critical values: $d_{Lower} = 1.61$, $d_{Upper} = 1.66$. Because $0.0738 < 1.61$, we reject the null hypothesis of no autocorrelation and conclude that the model is AR(1).

Regression Test for AR(1) How to do it:
(1) Estimate the population regression model $y_t = \beta_0 + \beta_1 x_{1,t} + \beta_2 x_{2,t} + \dots + \beta_k x_{k,t} + \varepsilon_t$ and obtain the residuals $e_t = y_t - \hat{y}_t$.
(2) Calculate the residuals lagged one period, $e_{t-1}$, for each observation starting with $t = 2$.
(3) Estimate the population regression model $e_t = \rho e_{t-1} + u_t$.
(4) Perform a test of the individual significance of the estimated slope coefficient $\hat{\rho}$. (A sketch of these steps in Stata follows below.)
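
A minimal Stata sketch, assuming a tsset dataset with variables y and x (names illustrative):

regress y x
predict e, residuals
generate e_lag = L.e        // residuals lagged one period
regress e e_lag, noconstant // e_t = rho * e_{t-1} + u_t
* A statistically significant coefficient on e_lag is evidence of AR(1).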

Regression Test for AR(1) Why It Works: Autocorrelation of the form AR(1) exists if the current period errors are correlated with immediate prior period errors. Hence, if a regression of the current period residuals on the residuals lagged one period yields a statistically significant coefficient, we would conclude that the errors are correlated and that an AR(1) process does exist.

Regression Test for AR(1) for Trade Volume Data (Dependent Variable is Residuals) The individual significance of the lagged residuals is much less than 0.05 (or 0.01, for that matter), so we reject the null hypothesis of no AR(1) and conclude that the model suffers from first-order autocorrelation.

Regression Test for AR(2) for Trade Volume Data (Dependent Variable is Residuals) The significance F of the joint significance test of the lagged residuals is much less than 0.05 (or 0.01, for that matter), so we reject the null hypothesis of no AR(2) and conclude that the model suffers from second-order autocorrelation.
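
A minimal Stata sketch of the AR(2) version, extending the AR(1) test above (variable names illustrative):

regress y x
predict e, residuals
generate e_lag1 = L.e
generate e_lag2 = L2.e
regress e e_lag1 e_lag2, noconstant // e_t = rho1*e_{t-1} + rho2*e_{t-2} + u_t
test e_lag1 e_lag2                  // joint significance of the lagged residuals
* A small joint p-value is evidence of second-order autocorrelation.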

Correcting for Autocorrelation
- Cochrane-Orcutt
- Prais-Winsten
- Newey-West autocorrelation and heteroskedasticity consistent standard errors

Cochrane-Orcutt Correction for AR(1) Process How to do it:
(1) Estimate the population regression model $y_t = \beta_0 + \beta_1 x_{1,t} + \beta_2 x_{2,t} + \dots + \beta_k x_{k,t} + \varepsilon_t$ and obtain the residuals $e_t = y_t - \hat{y}_t$ and the residuals lagged one period, $e_{t-1}$.
(2) Estimate the regression model $e_t = \rho e_{t-1} + u_t$ to generate an estimate $\hat{\rho}$.
(3) Convert the data using the estimated value $\hat{\rho}$.
(4) Estimate the population regression model on the transformed data, $\tilde{y}_t = \tilde{\beta}_0 + \beta_1 \tilde{x}_t + \tilde{\varepsilon}_t$.

Cochrane-Orcutt Transformation for AR(1) Process
Convert $y_t$ into $\tilde{y}_t = y_t - \hat{\rho} y_{t-1}$.
Convert the intercept regressor into $1 - \hat{\rho}$ (so the transformed intercept term is $\beta_0 (1 - \hat{\rho})$).
Convert each independent variable $x_{t,j}$ into $\tilde{x}_{t,j} = x_{t,j} - \hat{\rho} x_{t-1,j}$ for $j = 1, \dots, k$.
This works because if the error term is AR(1), then $\tilde{\varepsilon}_t = \varepsilon_t - \rho \varepsilon_{t-1} = u_t$. Notice that $u_t$ does not suffer from autocorrelation.
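
A minimal Stata sketch of the manual transformation, continuing the regression-test sketch above so that e and e_lag already exist (variable names illustrative):

regress e e_lag, noconstant
scalar rho = _b[e_lag]      // estimated rho from the residual regression
generate y_star = y - rho*L.y
generate x_star = x - rho*L.x
generate c_star = 1 - rho   // transformed intercept regressor
regress y_star c_star x_star, noconstant

Note that the first observation is lost because $y_0$ and $x_0$ do not exist; the Prais-Winsten method below recovers it.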

Cochrane-Orcutt Correction for AR(1) Process Why It Works: In AR(1) processes, the current period error is related to the immediate prior period error according to the equation $\varepsilon_t = \rho \varepsilon_{t-1} + u_t$. This method accounts for the correlation by using the observed data to estimate $\rho$ and then using that estimate to convert the data into a form that removes the correlation: $\tilde{\varepsilon}_t = \varepsilon_t - \rho \varepsilon_{t-1} = u_t$.

Cochrane-Orcutt Correction in STATA How to do it: First declare the data to be time-series data using the command
tsset time
Then use the command
prais y x1 x2, corc

Cochrane-Orcutt Correction in STATA

Prais-Winsten Correction for AR(1) Process How to do it:
(1) Repeat the first three steps of the Cochrane-Orcutt method.
(2) Rescale the first observation: $\tilde{y}_1 = \sqrt{1 - \hat{\rho}^2} \cdot y_1$ and $\tilde{x}_1 = \sqrt{1 - \hat{\rho}^2} \cdot x_1$.
(3) Estimate the population regression model $\tilde{y}_t = \tilde{\beta}_0 + \beta_1 \tilde{x}_t + \tilde{\varepsilon}_t$.
Why It Works: For the same reason as the Cochrane-Orcutt procedure, except that the resulting estimates are now BLUE (if $\rho$ is not estimated and the model is truly AR(1)) because all $T$ observations are utilized.
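
A minimal Stata sketch of the first-observation rescaling, continuing the manual Cochrane-Orcutt sketch above (variable names illustrative):

replace y_star = sqrt(1 - rho^2)*y in 1
replace x_star = sqrt(1 - rho^2)*x in 1
replace c_star = sqrt(1 - rho^2) in 1
regress y_star c_star x_star, noconstant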

Prais-Winsten Correction in STATA How to do it: First declare the data to be time-series data using the command
tsset time
Then use the command
prais y x1 x2

Prais-Winsten Correction in STATA

Newey-West Standard Errors The preferred method to correct for autocorrelation is to use Newey-West autocorrelation and heteroskedasticity consistent standard errors. The coefficient estimates are still unbiased, so the only thing that needs to be corrected is the standard errors. In STATA, the command is
newey y x1 x2 x3, lag(#)
where # is the maximum lag at which the errors may be autocorrelated (Stata requires the lag() option).
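
A minimal usage sketch, assuming AR(1) errors and a tsset dataset (variable names illustrative):

tsset year
newey y x1 x2 x3, lag(1)  // Newey-West SEs allowing one lag of autocorrelation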

STATA Results with Newey-West Standard Errors

What is a Unit Root? A unit root occurs when the parameter on the AR(1) process equals 1, i.e., $\rho = 1$:
$\varepsilon_t = \rho \varepsilon_{t-1} + u_t$ with $\rho = 1$
Explosive time series: a random shock has an increasingly larger influence over time.
Dickey-Fuller test: the test used to test for a unit root.

Using the Dickey-Fuller Test to Test for a Unit Root
$H_0$: $\rho = 1$
$H_A$: $\rho < 1$
Use the STATA command
dfgls y
If the test statistic is smaller in absolute value than the critical value (i.e., less negative), fail to reject the null hypothesis and conclude there is a unit root.

STATA Results of Dickey-Fuller Test on Export Volume Because the test statistic is smaller in absolute value than the critical value, we fail to reject the null hypothesis and conclude these data suffer from a unit root.

What to Do if the Data Suffer from a Unit Root?
(1) First difference the data, $ydiff_t = y_t - y_{t-1}$, and test whether first differencing eliminated the unit root (see the sketch below).
(2) Find a variable that is cointegrated with $y_t$. Cointegration occurs when two variables each have a unit root but move together such that a linear combination of the two variables does not have a unit root.
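
A minimal Stata sketch of the first-differencing step, assuming a tsset dataset (variable names illustrative):

generate ydiff = D.y  // first difference: y_t - y_{t-1}
dfgls ydiff           // re-run the unit-root test on the differenced series
* If the test now rejects the null, the differenced series is free of the unit root.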