1 Chapter 11 Autocorrelation

2 Learning Objectives
- Understand the autoregressive structure of the error term
- Understand methods for detecting autocorrelation
- Understand how to correct for autocorrelation
- Understand unit roots and cointegration

3 What is Autocorrelation?
Autocorrelation occurs when the error term in one period is related to the error term in previous periods: $\varepsilon_t = \rho \varepsilon_{t-1} + u_t$, where $|\rho| < 1$.
Positive autocorrelation is when $\rho > 0$: positive errors tend to follow positive errors and negative errors tend to follow negative errors.
Negative autocorrelation is when $\rho < 0$: positive errors tend to follow negative errors and negative errors tend to follow positive errors.
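A minimal Stata sketch that simulates an AR(1) error process so the pattern is visible; the value $\rho = 0.7$, the seed, and the sample size are illustrative assumptions, not anything from the chapter's data:

    clear
    set obs 100
    set seed 12345
    gen t = _n
    tsset t
    gen u = rnormal()                    // white-noise innovation u_t
    gen e = u in 1                       // initialize the first error
    replace e = 0.7*e[_n-1] + u in 2/l   // e_t = 0.7 e_{t-1} + u_t
    line e t                             // long runs of same-signed errors signal positive autocorrelation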

4 No Autocorrelation

5 Positive Autocorrelation

6 Negative Autocorrelation

7 The Issues and Consequences Associated with Autocorrelation
Problem: Autocorrelation violates time-series assumption T6, which states that the error terms must not be correlated across time periods.
Consequences:
- Under autocorrelation, parameter estimates are unbiased.
- Parameter estimates are not minimum variance among all unbiased estimators.
- Estimated standard errors are incorrect, so all measures of precision based on the estimated standard errors are also incorrect.

8 Goals of this Chapter

9 An Important Caveat before Continuing
With more advanced statistical packages, many researchers include a very simple command asking their chosen statistical program to provide standard error estimates that automatically correct for autocorrelation (Newey-West standard errors). Even though correcting for autocorrelation is that straightforward, it is important to first work through the more "old-school" examples below before learning how to calculate Newey-West standard errors.

10 Understand the Autoregressive Structure of the Error Term
AR(1) – the error term this period is related to the error term last period: $\varepsilon_t = \rho \varepsilon_{t-1} + u_t$, where $|\rho| < 1$.
AR(2) – the error term this period is related to the error terms of the last two periods: $\varepsilon_t = \rho_1 \varepsilon_{t-1} + \rho_2 \varepsilon_{t-2} + u_t$, where $|\rho_i| < 1$.

11 Understand the Autoregressive Structure of the Error Term
AR(1,4) – the error term this period is related to the error term last period and the error term four periods ago: $\varepsilon_t = \rho_1 \varepsilon_{t-1} + \rho_4 \varepsilon_{t-4} + u_t$, where $|\rho_i| < 1$.
AR(4) – the error term this period is related to the error terms of the last four periods: $\varepsilon_t = \rho_1 \varepsilon_{t-1} + \rho_2 \varepsilon_{t-2} + \rho_3 \varepsilon_{t-3} + \rho_4 \varepsilon_{t-4} + u_t$, where $|\rho_i| < 1$.

12 Understand Methods for Detecting Autocorrelation
Informal methods:
- Graphs
Formal methods using statistical tests:
- Durbin-Watson test
- Regression test

13 Informal Method
Graph either:
- the residuals against each independent variable,
- the squared residuals over time, or
- the residuals against the lagged residuals,
and look for a pattern in the observations. If a pattern exists, that is evidence of autocorrelation (see the sketch below).
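A minimal Stata sketch of these informal checks, assuming a dataset with outcome y, a single regressor x, and a time variable time (all names hypothetical):

    tsset time
    regress y x
    predict e, residuals    // residuals e_t
    scatter e x             // residuals against the independent variable
    gen e2 = e^2
    line e2 time            // squared residuals over time
    scatter e L.e           // residuals against the one-period-lagged residuals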

14 Regression of Export Volume in England on Exchange Rate from 1930 to 2009

15 Notice how positive residuals tend to follow positive residuals and negative residuals tend to follow negative residuals.

16 This residual plot is obtained by checking the residual plot option in Excel when running a regression. As in the previous slide, notice the pattern between the residuals and the independent variable.

17 The primary drawback of the informal method is that it is not clear how much of a pattern must exist before we conclude that the model suffers from autocorrelation. This leads us to formal tests for autocorrelation.

18 Formal Methods for Detecting Autocorrelation
The formal methods that we consider are all based on statistical tests of the following general null and alternative hypotheses:
$H_0$: the error terms are not correlated over time
$H_1$: the error terms are correlated over time

19 Testing for Autocorrelation
- Durbin-Watson test
- Regression test

20 Durbin-Watson Test for AR(1)
How to do it:
(1) Estimate the population regression model $y_t = \beta_0 + \beta_1 x_{1,t} + \beta_2 x_{2,t} + \dots + \beta_k x_{k,t} + \varepsilon_t$ and obtain the residuals, $e_t = y_t - \hat{y}_t$.
(2) Calculate the terms required for the Durbin-Watson statistic: $e_t^2$, $e_{t-1}$, and $(e_t - e_{t-1})^2$.
(3) Calculate the Durbin-Watson statistic $d = \frac{\sum_{t=2}^{T}(e_t - e_{t-1})^2}{\sum_{t=1}^{T} e_t^2}$.
(4) Consult the appropriate Durbin-Watson table to determine whether to reject the null hypothesis of no autocorrelation. The rule of thumb is that if $d$ is near 2 there is no AR(1), while if $d$ is close to 0 or 4 there is evidence of AR(1). (A Stata sketch follows below.)
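A Stata sketch of the test, using both the built-in command and the by-hand formula so the two can be compared (variable names hypothetical):

    tsset time
    regress y x1 x2
    estat dwatson                 // reports the Durbin-Watson d statistic
    predict e, residuals
    gen num = (e - L.e)^2         // (e_t - e_{t-1})^2, missing for t = 1
    gen den = e^2                 // e_t^2
    quietly summarize num
    scalar sumnum = r(sum)        // sum over t = 2,...,T
    quietly summarize den
    scalar sumden = r(sum)        // sum over t = 1,...,T
    display "d = " sumnum/sumden  // should match estat dwatson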

21 Durbin-Watson Test for AR(1)
Why It Works: Under perfect positive autocorrelation, this period's error always equals last period's error, meaning that $d = 0$. Under perfect negative autocorrelation, this period's error is always exactly opposite last period's error, meaning that $d = 4$. Accordingly, calculated values of the test statistic closer to 0 or closer to 4 indicate that autocorrelation is present in the data.

22 Durbin-Watson Test for AR(1)
Potential Issues:
(1) The test cannot be performed in models with lagged dependent variables.
(2) The test can only be performed on models in which the suspected autocorrelation takes the form of AR(1).
(3) The errors must be normally distributed.
(4) The model must include an intercept.
(5) There is an inconclusive region.

23 Inconclusive Region of the Durbin-Watson Test

24 Durbin-Watson Test Example
$d = \frac{\sum_{t=2}^{T}(e_t - e_{t-1})^2}{\sum_{t=1}^{T} e_t^2}$
Critical values: $d_{Lower} = 1.61$, $d_{Upper} = 1.66$.
Because the calculated $d$ is less than 1.61, we reject the null hypothesis of no autocorrelation and conclude that the model is AR(1).

25 Regression Test for AR(1)
How to do it:
(1) Estimate the population regression model $y_t = \beta_0 + \beta_1 x_{1,t} + \beta_2 x_{2,t} + \dots + \beta_k x_{k,t} + \varepsilon_t$ and obtain the residuals, $e_t = y_t - \hat{y}_t$.
(2) Calculate the residuals lagged one period, $e_{t-1}$, for each observation starting with $t = 2$.
(3) Estimate the population regression model $e_t = \rho e_{t-1} + u_t$.
(4) Perform a test of the individual significance of the estimated slope coefficient $\hat{\rho}$.
(A Stata sketch follows below.)
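A minimal Stata sketch of the regression test; the lag regression omits the constant to match the model in step (3), and all variable names are hypothetical:

    tsset time
    regress y x1 x2
    predict e, residuals
    regress e L.e, noconstant   // estimates e_t = rho e_{t-1} + u_t
    * a significant t-statistic on L.e is evidence of AR(1)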

26 Regression Test for AR(1)
Why It Works: Autocorrelation of the form AR(1) exists if the current period errors are correlated with immediate prior period errors. Hence, if a regression of the current period residuals on the residuals lagged one period yields a statistically significant coefficient, we would conclude that the errors are correlated and that an AR(1) process does exist.

27 Regression Test for AR(1) for Trade Volume Data, Dependent Variable is Residuals
The individual significance of the lagged residual is much less than 0.05 (or 0.01 for that matter), so we reject the null hypothesis of no AR(1) and conclude the model suffers from first-order autocorrelation.

28 Regression Test for AR(2) for Trade Volume Data, Dependent Variable is Residuals
The significance F of the joint significance test on the lagged residuals is much less than 0.05 (or 0.01 for that matter), so we reject the null hypothesis of no AR(2) and conclude the model suffers from second-order autocorrelation.

29 Correcting for Autocorrelation
- Cochrane-Orcutt
- Prais-Winsten
- Newey-West autocorrelation and heteroskedasticity consistent standard errors

30 Cochrane-Orcutt Correction for an AR(1) Process
How to do it:
(1) Estimate the population regression model $y_t = \beta_0 + \beta_1 x_{1,t} + \beta_2 x_{2,t} + \dots + \beta_k x_{k,t} + \varepsilon_t$ and obtain the residuals, $e_t = y_t - \hat{y}_t$, and the residuals lagged one period, $e_{t-1}$.
(2) Estimate the regression model $e_t = \rho e_{t-1} + u_t$ to generate an estimate $\hat{\rho}$.
(3) Convert the data using the estimated value of $\hat{\rho}$ (see the next slide).
(4) Estimate the population regression model on the transformed data, $y_t^* = \beta_0^* + \beta_1 x_t^* + \varepsilon_t^*$.

31 Cochrane-Orcutt Transformation for an AR(1) Process
Convert $y_t$ into $y_t^* = y_t - \hat{\rho} y_{t-1}$.
Convert the intercept: the constant term becomes $\beta_0^* = \beta_0 (1 - \hat{\rho})$.
Convert each independent variable $x_{j,t}$ into $x_{j,t}^* = x_{j,t} - \hat{\rho} x_{j,t-1}$ for $j = 1, \dots, k$.
This works because if the error term is AR(1), then $\varepsilon_t^* = \varepsilon_t - \rho \varepsilon_{t-1} = u_t$, and $u_t$ does not suffer from autocorrelation. (A Stata sketch follows below.)
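A sketch of the transformation by hand in Stata for a single regressor x, continuing from the regression-test sketch above so that the residual regression of e on L.e is the most recent estimation (names hypothetical):

    scalar rho = _b[L.e]          // rho-hat from: regress e L.e, noconstant
    gen ystar = y - rho*L.y       // y*_t = y_t - rho-hat y_{t-1}
    gen xstar = x - rho*L.x       // x*_t = x_t - rho-hat x_{t-1}
    regress ystar xstar           // intercept estimates beta_0 (1 - rho-hat)
    display _b[_cons]/(1 - rho)   // recover the estimate of beta_0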

32 Cochrane-Orcutt Correction for an AR(1) Process
Why It Works: In AR(1) processes, the current period error is related to the immediate prior period error according to the equation $\varepsilon_t = \rho \varepsilon_{t-1} + u_t$. This method accounts for the correlation by using the observed data to estimate the value of $\rho$ and using that estimate to convert the data into a form, $\varepsilon_t^* = \varepsilon_t - \hat{\rho} \varepsilon_{t-1} = u_t$, that removes the correlation.

33 Cochrane-Orcutt Correction in STATA
How to do it: First declare the data to be time-series data using the command
    tsset time
Then use the command
    prais y x1 x2, corc

34 Cochrane-Orcutt Correction in STATA

35 Prais-Winsten Correction for an AR(1) Process
How to do it:
(1) Repeat the first four steps of the Cochrane-Orcutt method.
(2) Rescale the first observation: $y_1^* = \sqrt{1 - \hat{\rho}^2} \cdot y_1$ and $x_1^* = \sqrt{1 - \hat{\rho}^2} \cdot x_1$.
(3) Estimate the population regression model on the transformed data, $y_t^* = \beta_0^* + \beta_1 x_t^* + \varepsilon_t^*$.
Why It Works: For the same reason as the Cochrane-Orcutt procedure, except that the resulting estimates are now BLUE (if $\rho$ is not estimated and the model is truly AR(1)) because all $T$ observations are utilized.

36 Prais-Winsten Correction in STATA
How to do it: First declare the data to be time-series data using the command
    tsset time
Then use the command
    prais y x1 x2

37 Prais-Winsten Correction in STATA

38 Newey-West Standard Errors
The preferred method to correct for autocorrelation is to use Newey-West autocorrelation and heteroskedasticity consistent standard errors. The coefficient estimates are still unbiased, so the only thing that needs to be corrected is the standard errors. In STATA, the command is
    newey y x1 x2 x3, lag(#)
where the required lag(#) option sets the maximum lag at which autocorrelation is allowed. (A sketch follows below.)
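A minimal Stata sketch; lag(4) here is only an illustrative choice of maximum lag, and the variable names are hypothetical:

    tsset time
    newey y x1 x2 x3, lag(4)   // Newey-West SEs allowing autocorrelation up to 4 lags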

39 STATA Results with Newey-West Standard Errors

40 What is a Unit Root?
A unit root occurs when the parameter on the AR(1) process is equal to 1, i.e. $\rho = 1$ in $\varepsilon_t = \rho \varepsilon_{t-1} + u_t$.
An explosive time series is one in which a random shock has an increasingly larger influence (the case $|\rho| > 1$).
The Dickey-Fuller test is the test used to test for a unit root.
(A simulation sketch follows below.)
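To see the difference a unit root makes, a minimal Stata sketch that simulates a random walk; the seed and sample size are illustrative:

    clear
    set obs 100
    set seed 54321
    gen t = _n
    tsset t
    gen u = rnormal()
    gen e = u in 1
    replace e = e[_n-1] + u in 2/l   // rho = 1: every shock is permanent
    line e t                         // the series wanders with no tendency to revert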

41 Using the Dickey-Fuller Test to Test for a Unit Root
$H_0$: $\rho = 1$ (unit root)
$H_A$: $\rho < 1$
Use the STATA command
    dfgls y
If the test statistic is less than the critical value in absolute value, then fail to reject the null hypothesis and conclude there is a unit root.

42 STATA Results of Dickey-Fuller Test on Export Volume
Because the test statistic is less than the critical value in absolute value, we fail to reject the null hypothesis and conclude these data suffer from a unit root.

43 What to Do if the Data Suffer from a Unit Root?
(1) First difference the data, $y_t^{diff} = y_t - y_{t-1}$, and test whether the first differencing eliminated the unit root (see the sketch below).
(2) Find a variable that is cointegrated with $y_t$. Cointegration occurs when two variables each have a unit root but move together such that a linear combination of the two does not have a unit root.
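A sketch of step (1) in Stata, using the built-in difference operator (variable names hypothetical):

    tsset time
    gen ydiff = D.y   // ydiff_t = y_t - y_{t-1}
    dfgls ydiff       // re-run the DF-GLS unit-root test on the differenced series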

