AUTOCORRELATION 1

The third Gauss-Markov condition is that the values of the disturbance term in the observations in the sample are generated independently of each other.

[Scatter plot of y against x with the fitted line y = β1 + β2x]
AUTOCORRELATION 2

In the graph above, it is clear that this condition is violated. Positive values tend to be followed by positive ones, and negative values by negative ones. Successive values tend to have the same sign. This is described as positive autocorrelation.

[Scatter plot of y against x around the line y = β1 + β2x, showing positive autocorrelation]
AUTOCORRELATION 3

In this graph, positive values tend to be followed by negative ones, and negative values by positive ones. This is an example of negative autocorrelation.

[Scatter plot of y against x around the line y = β1 + β2x, showing negative autocorrelation]
AUTOCORRELATION 4

First-order autoregressive autocorrelation: AR(1)

u_t = ρ u_{t-1} + ε_t

A particularly common type of autocorrelation, at least as an approximation, is first-order autoregressive autocorrelation, usually denoted AR(1) autocorrelation.
AUTOCORRELATION 5

u_t = ρ u_{t-1} + ε_t

It is autoregressive because u_t depends on lagged values of itself, and first-order because it depends only on its previous value. u_t also depends on ε_t, an injection of fresh randomness at time t, often described as the innovation at time t.
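As an illustration (my own addition, not part of the original slides), an AR(1) disturbance term can be simulated in a few lines of Python; the function name and the seed are arbitrary choices:

```python
import random

def ar1_series(rho, n, seed=0):
    """Simulate an AR(1) disturbance term: u_t = rho * u_{t-1} + eps_t,
    where each innovation eps_t is drawn from N(0, 1)."""
    rng = random.Random(seed)
    u = [rng.gauss(0, 1)]          # start the series with the first innovation
    for _ in range(n - 1):
        u.append(rho * u[-1] + rng.gauss(0, 1))
    return u

u = ar1_series(rho=0.6, n=50)      # 50 observations with fairly strong persistence
```

With rho = 0 the series is just the innovations themselves; as rho approaches 1, each value carries over more of its predecessor and runs of same-signed values lengthen.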
AUTOCORRELATION 6

Fifth-order autoregressive autocorrelation: AR(5)

u_t = ρ_1 u_{t-1} + ρ_2 u_{t-2} + ρ_3 u_{t-3} + ρ_4 u_{t-4} + ρ_5 u_{t-5} + ε_t

Here is a more complex example of autoregressive autocorrelation. It is described as fifth-order, and so denoted AR(5), because it depends on lagged values of u_t up to the fifth lag.
AUTOCORRELATION 7

Third-order moving average autocorrelation: MA(3)

u_t = ε_t + λ_1 ε_{t-1} + λ_2 ε_{t-2} + λ_3 ε_{t-3}

The other main type of autocorrelation is moving average autocorrelation, where the disturbance term is a linear combination of the current innovation and a finite number of previous ones.
AUTOCORRELATION 8

This example is described as third-order moving average autocorrelation, denoted MA(3), because it depends on the three previous innovations as well as the current one.
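A sketch of how an MA(3) disturbance term could be simulated (my addition; the weight arguments l1, l2, l3 stand in for the λ coefficients and are illustrative, not from the slides):

```python
import random

def ma3_series(l1, l2, l3, n, seed=0):
    """Simulate an MA(3) disturbance term:
    u_t = eps_t + l1*eps_{t-1} + l2*eps_{t-2} + l3*eps_{t-3}."""
    rng = random.Random(seed)
    eps = [rng.gauss(0, 1) for _ in range(n + 3)]   # 3 pre-sample innovations
    return [eps[t] + l1 * eps[t - 1] + l2 * eps[t - 2] + l3 * eps[t - 3]
            for t in range(3, n + 3)]
```

Unlike the AR(1) case, an innovation here affects the disturbance term for only three further periods, so the autocorrelation cuts off after the third lag.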
AUTOCORRELATION 9

The rest of this sequence gives examples of the patterns that are generated when the disturbance term is subject to AR(1) autocorrelation. The object is to provide some benchmark images to help you assess plots of residuals in time series regressions.
AUTOCORRELATION 10

We will use 50 independent values of ε, taken from a normal distribution with 0 mean, and generate series for u using different values of ρ.
AUTOCORRELATION 11

We have started with ρ equal to 0, so there is no autocorrelation. We will increase ρ progressively in steps of 0.1.

[Plot of u for ρ = 0]
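The experiment described here, one fixed set of innovations run through different values of ρ, can be mimicked as follows (my own sketch; the variable names and seed are arbitrary):

```python
import random

rng = random.Random(42)
eps = [rng.gauss(0, 1) for _ in range(50)]   # one fixed set of 50 innovations

def build_u(rho, eps):
    """Turn a fixed innovation series into an AR(1) series for a given rho."""
    u = [eps[0]]
    for e in eps[1:]:
        u.append(rho * u[-1] + e)
    return u

# The same randomness, filtered through increasing persistence:
series = {k / 10: build_u(k / 10, eps) for k in range(10)}
```

Because every series shares the same innovations, any visual difference between the plots reflects ρ alone, which is the point of the benchmark images.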
AUTOCORRELATION 12

[Plot of u for ρ = 0.1]

AUTOCORRELATION 13

[Plot of u for ρ = 0.2]
AUTOCORRELATION 14

With ρ equal to 0.3, a pattern of positive autocorrelation is beginning to be apparent.
AUTOCORRELATION 15

[Plot of u for ρ = 0.4]

AUTOCORRELATION 16

[Plot of u for ρ = 0.5]
AUTOCORRELATION 17

With ρ equal to 0.6, it is obvious that u is subject to positive autocorrelation. Positive values tend to be followed by positive ones and negative values by negative ones.
AUTOCORRELATION 18

[Plot of u for ρ = 0.7]

AUTOCORRELATION 19

[Plot of u for ρ = 0.8]
AUTOCORRELATION 20

With ρ equal to 0.9, the sequences of values with the same sign have become long and the tendency to return to 0 has become weak.
AUTOCORRELATION 21

The process is now approaching what is known as a random walk, where ρ is equal to 1 and the process becomes nonstationary. The terms random walk and nonstationarity will be defined in the next chapter. For the time being we will assume |ρ| < 1.
AUTOCORRELATION 22

Next we will look at negative autocorrelation, starting with the same set of 50 independently distributed values of ε_t.
AUTOCORRELATION 23

We will take larger steps this time.
AUTOCORRELATION 24

With ρ equal to -0.6, you can see that positive values tend to be followed by negative ones, and vice versa, more frequently than you would expect as a matter of chance.
AUTOCORRELATION 25

Now the pattern of negative autocorrelation is very obvious.
AUTOCORRELATION 26

============================================================
Dependent Variable: LGFOOD
Method: Least Squares
Sample: 1959 1994
Included observations: 36
============================================================
Variable      Coefficient   Std. Error   t-Statistic    Prob.
============================================================
C               2.658875     0.278220      9.556745    0.0000
LGDPI           0.605607     0.010432     58.05072     0.0000
LGPRFOOD       -0.302282     0.068086     -4.439712    0.0001
============================================================
R-squared            0.992619   Mean dependent var     6.112169
Adjusted R-squared   0.992172   S.D. dependent var     0.193428
S.E. of regression   0.017114   Akaike info criterion -5.218197
Sum squared resid    0.009665   Schwarz criterion     -5.086238
Log likelihood      96.92755    F-statistic           2219.014
Durbin-Watson stat   0.613491   Prob(F-statistic)      0.000000
============================================================

Finally, we will look at a plot of the residuals of the logarithmic regression of expenditure on food on income and relative price.
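The Durbin-Watson statistic reported at the bottom of the output (0.61 here, against roughly 2 under no autocorrelation) can be computed directly from the residuals. A minimal sketch, using the standard formula rather than any particular package:

```python
def durbin_watson(resid):
    """Durbin-Watson statistic:
    d = sum_t (e_t - e_{t-1})^2 / sum_t e_t^2.
    Values near 2 indicate no autocorrelation, values near 0 strong
    positive autocorrelation, and values near 4 strong negative."""
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    den = sum(e ** 2 for e in resid)
    return num / den

# Perfectly persistent residuals give d = 0; alternating signs push d toward 4.
print(durbin_watson([1, 1, 1, 1]))    # 0.0
print(durbin_watson([1, -1, 1, -1]))  # 3.0
```

The statistic's formal use as a test, including its inconclusive region, is the subject of the next sequence.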
AUTOCORRELATION 27

This is the plot of the residuals, of course, not the disturbance term. But if the disturbance term is subject to autocorrelation, then the residuals will be subject to a similar pattern of autocorrelation.
AUTOCORRELATION 28

You can see that there is strong evidence of positive autocorrelation. Comparing the graph with the randomly generated patterns, one would say that ρ is about 0.6 or 0.7. The next step is to perform a formal test for autocorrelation, the subject of the next sequence.
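One quick numerical cross-check (my own addition, not from the slides): by the well-known approximation d ≈ 2(1 − ρ̂), the reported Durbin-Watson statistic of about 0.61 implies ρ̂ ≈ 0.69, consistent with the visual judgement of 0.6 to 0.7. The lag-1 estimate of ρ itself can be sketched as a no-intercept regression of each residual on its predecessor:

```python
def estimate_rho(resid):
    """Rough AR(1) coefficient estimate from residuals:
    regress e_t on e_{t-1} with no intercept,
    rho_hat = sum(e_t * e_{t-1}) / sum(e_{t-1}^2)."""
    num = sum(resid[t] * resid[t - 1] for t in range(1, len(resid)))
    den = sum(resid[t - 1] ** 2 for t in range(1, len(resid)))
    return num / den

# A noiseless geometric decay with ratio 0.5 recovers rho_hat = 0.5 exactly:
print(estimate_rho([1.0, 0.5, 0.25, 0.125]))  # 0.5
```

This is only an informal diagnostic; the formal test in the next sequence accounts for the sampling distribution of the statistic.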
Copyright Christopher Dougherty 2000. This slideshow may be freely copied for personal use.