Presentation on theme: "Auto Regressive, Integrated, Moving Average Box-Jenkins models A stationary times series can be modelled on basis of the serial correlations in it. A non-stationary."— Presentation transcript:

1 Auto Regressive, Integrated, Moving Average Box-Jenkins models A stationary time series can be modelled on the basis of the serial correlations in it. A non-stationary time series can be transformed into a stationary time series, modelled, and back-transformed to the original scale (e.g. for purposes of forecasting). ARIMA models: the AR and MA parts can be modelled on a stationary series; the I (integration) part has to do with the transformation.

2 AR-models (for stationary time series) Consider the model Y_t = δ + φ·Y_{t−1} + e_t with {e_t} i.i.d. with zero mean and constant variance σ² (white noise), and where δ (delta) and φ (phi) are (unknown) parameters. This is an autoregressive process of order 1: AR(1). Set δ = 0 for the sake of simplicity ⇒ E(Y_t) = 0 ⇒ γ_k = Cov(Y_t, Y_{t−k}) = Cov(Y_t, Y_{t+k}) = E(Y_t·Y_{t−k}) = E(Y_t·Y_{t+k})
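As a concrete illustration (not part of the original slides), the AR(1) recursion above can be simulated directly; the function name `simulate_ar1` and all parameter values are ad hoc choices for this sketch:

```python
import random

def simulate_ar1(phi, sigma, n, seed=0, burn=500):
    """Simulate n observations of Y_t = phi*Y_{t-1} + e_t (delta = 0),
    where e_t is i.i.d. Gaussian white noise with standard deviation
    sigma.  A burn-in stretch is discarded so the series starts close
    to its stationary distribution.
    """
    rng = random.Random(seed)
    y, series = 0.0, []
    for t in range(burn + n):
        y = phi * y + rng.gauss(0.0, sigma)
        if t >= burn:
            series.append(y)
    return series

series = simulate_ar1(phi=0.7, sigma=1.0, n=20000)
sample_mean = sum(series) / len(series)   # should be near E(Y_t) = 0
```

With δ = 0 the process mean is 0, so the sample mean of a long simulated series should be close to zero.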

3 Now:
γ_0 = E(Y_t·Y_t) = E((φ·Y_{t−1} + e_t)·Y_t) = φ·E(Y_{t−1}·Y_t) + E(e_t·Y_t)
 = φ·γ_1 + E(e_t·(φ·Y_{t−1} + e_t)) = φ·γ_1 + φ·E(e_t·Y_{t−1}) + E(e_t·e_t)
 = φ·γ_1 + 0 + σ² (for e_t is independent of Y_{t−1})
γ_1 = E(Y_{t−1}·Y_t) = E(Y_{t−1}·(φ·Y_{t−1} + e_t)) = φ·E(Y_{t−1}·Y_{t−1}) + E(Y_{t−1}·e_t)
 = φ·γ_0 + 0 (for e_t is independent of Y_{t−1})
γ_2 = E(Y_{t−2}·Y_t) = E(Y_{t−2}·(φ·Y_{t−1} + e_t)) = φ·E(Y_{t−2}·Y_{t−1}) + E(Y_{t−2}·e_t)
 = φ·γ_1 + 0 (for e_t is independent of Y_{t−2})

4 Yule-Walker equations:
γ_0 = φ·γ_1 + σ²
γ_1 = φ·γ_0
γ_2 = φ·γ_1
…
γ_k = φ·γ_{k−1} = … = φ^k·γ_0
Substituting the second equation into the first: γ_0 = φ²·γ_0 + σ²

5 Solving γ_0 = φ²·γ_0 + σ² gives γ_0 = σ²/(1 − φ²). Note that for γ_0 to be positive and finite (which we require of a variance) the following must hold: φ² < 1, i.e. |φ| < 1. This is in effect the condition for an AR(1)-process to be weakly stationary. Now, note that ρ_k = Corr(Y_t, Y_{t−k}) = γ_k/γ_0 = φ^k.
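The variance formula γ_0 = σ²/(1 − φ²) can be checked by simulation (a stdlib-only sketch; the parameter values and function name are arbitrary illustration choices):

```python
import random

def ar1_sample_variance(phi, sigma, n=200000, seed=1, burn=1000):
    """Sample variance of a long simulated AR(1) series
    Y_t = phi*Y_{t-1} + e_t, with a burn-in stretch discarded."""
    rng = random.Random(seed)
    y, total, total_sq, count = 0.0, 0.0, 0.0, 0
    for t in range(burn + n):
        y = phi * y + rng.gauss(0.0, sigma)
        if t >= burn:
            total += y
            total_sq += y * y
            count += 1
    mean = total / count
    return total_sq / count - mean * mean

phi, sigma = 0.6, 1.0
gamma0_theory = sigma**2 / (1 - phi**2)   # = 1.5625
gamma0_sample = ar1_sample_variance(phi, sigma)
```

For phi = 0.6 and sigma = 1 the theoretical value is 1/(1 − 0.36) = 1.5625, and the sample variance of a long series lands close to it.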

6 Recall that ρ_k as a function of k is called the autocorrelation function (ACF): "auto" because it gives correlations within the same time series. For pairs of different time series one can define the cross-correlation function, which gives correlations at different lags between the series. By studying the ACF it might be possible to identify the approximate magnitude of φ.
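To make this concrete, here is a stdlib-only sketch (the helper `sample_acf` is our own construction): it simulates an AR(1) series and checks that the sample ACF is close to the theoretical φ^k.

```python
import random

def sample_acf(series, max_lag):
    """Sample autocorrelation function of a series up to max_lag."""
    n = len(series)
    mean = sum(series) / n
    c0 = sum((y - mean) ** 2 for y in series) / n
    acf = []
    for k in range(max_lag + 1):
        ck = sum((series[t] - mean) * (series[t + k] - mean)
                 for t in range(n - k)) / n
        acf.append(ck / c0)
    return acf

# Simulate an AR(1) series with phi = 0.8 (burn-in discarded), then
# compare the sample ACF to the theoretical values phi**k.
rng = random.Random(2)
phi, y, series = 0.8, 0.0, []
for t in range(1000 + 50000):
    y = phi * y + rng.gauss(0.0, 1.0)
    if t >= 1000:
        series.append(y)
acf = sample_acf(series, 5)
# acf[k] should be close to 0.8**k at each lag k.
```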

7 Examples: (slides 7-9 showed example ACF plots; the figures are not reproduced in this transcript)

10 The general linear process: Y_t = e_t + ψ_1·e_{t−1} + ψ_2·e_{t−2} + … AR(1) as a general linear process: substituting Y_{t−1} = φ·Y_{t−2} + e_{t−1} repeatedly gives Y_t = e_t + φ·e_{t−1} + φ²·e_{t−2} + …
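This equivalence can be verified numerically (a stdlib-only sketch; the truncation length of 60 terms is an arbitrary choice):

```python
import random

# Build the same AR(1) series two ways: by the recursion
# Y_t = phi*Y_{t-1} + e_t, and by the truncated linear-process sum
# Y_t = e_t + phi*e_{t-1} + phi**2*e_{t-2} + ...  With |phi| < 1 the
# weights phi**j die out, so a truncated sum is already very accurate.
rng = random.Random(3)
phi, n, trunc = 0.5, 200, 60
e = [rng.gauss(0.0, 1.0) for _ in range(n)]

y_rec = [0.0] * n                      # recursive construction
for t in range(1, n):
    y_rec[t] = phi * y_rec[t - 1] + e[t]

y_lin = [sum(phi**j * e[t - j] for j in range(min(trunc, t + 1)))
         for t in range(n)]            # linear-process construction

# Once t >= trunc, the two constructions agree up to the truncation
# error, which is on the order of phi**trunc (negligible here).
```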

11 If |φ| < 1 the weights φ^j are absolutely summable ⇒ the representation as a linear process is valid. |φ| < 1 is at the same time the condition for stationarity of an AR(1)-process. Second-order autoregressive process, AR(2): Y_t = δ + φ_1·Y_{t−1} + φ_2·Y_{t−2} + e_t

12 Characteristic equation. Write the AR(2) model (taking δ = 0) as Y_t − φ_1·Y_{t−1} − φ_2·Y_{t−2} = e_t. Its characteristic equation is 1 − φ_1·x − φ_2·x² = 0.

13 Stationarity of an AR(2)-process. The characteristic equation has two roots (it is a second-order equation; under certain conditions there is one multiple root). The roots may be complex-valued. If the absolute values (moduli) of both roots exceed 1, the process is stationary. Absolute value > 1 ⇔ the roots lie outside the unit circle in the complex plane.
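The root condition translates directly into code (a sketch; `ar2_is_stationary` is our own helper name):

```python
import cmath

def ar2_is_stationary(phi1, phi2):
    """AR(2) stationarity check: both roots of the characteristic
    equation 1 - phi1*x - phi2*x**2 = 0 must lie outside the unit
    circle (modulus > 1)."""
    if phi2 == 0:                      # degenerates to an AR(1) check
        return abs(phi1) < 1
    # Solve a*x**2 + b*x + c = 0 with a = -phi2, b = -phi1, c = 1;
    # cmath.sqrt handles complex roots transparently.
    a, b, c = -phi2, -phi1, 1.0
    disc = cmath.sqrt(b * b - 4 * a * c)
    roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
    return all(abs(r) > 1 for r in roots)
```

For example, (φ_1, φ_2) = (0.5, 0.3) passes the check, while (0.5, 0.6) fails.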

14 Equivalently, stationarity requires (φ_1, φ_2) to lie within the triangular region defined by φ_1 + φ_2 < 1, φ_2 − φ_1 < 1 and |φ_2| < 1 (the blue triangle on the slide). Some of these pairs give complex roots.

15 Finding the autocorrelation function. Yule-Walker equations for AR(2): ρ_k = φ_1·ρ_{k−1} + φ_2·ρ_{k−2} for k ≥ 2. Start with ρ_0 = 1, which gives ρ_1 = φ_1/(1 − φ_2).
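The recursion is easy to run directly (a sketch with illustrative parameter values):

```python
def ar2_acf(phi1, phi2, max_lag):
    """ACF of a stationary AR(2) process from the Yule-Walker recursion:
    rho_0 = 1, rho_1 = phi1 / (1 - phi2),
    rho_k = phi1*rho_{k-1} + phi2*rho_{k-2} for k >= 2.
    """
    rho = [1.0, phi1 / (1.0 - phi2)]
    for k in range(2, max_lag + 1):
        rho.append(phi1 * rho[k - 1] + phi2 * rho[k - 2])
    return rho[:max_lag + 1]

acf = ar2_acf(0.5, 0.3, 10)   # rho_1 = 0.5/0.7, then the recursion takes over
```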

16 For any values of φ_1 and φ_2 (within the stationarity region) the autocorrelations decrease exponentially with k. For complex roots of the characteristic equation the correlations show a damped sine-wave behaviour as k increases. See the figures on page 74 in the textbook.

17 The general autoregressive process, AR(p): Y_t = δ + φ_1·Y_{t−1} + … + φ_p·Y_{t−p} + e_t. Its ACF is exponentially decaying, in a damped sine-wave fashion if the characteristic equation has complex roots.

18 Moving average processes, MA. An MA-process is always stationary. MA(1): Y_t = μ + e_t − θ·e_{t−1}
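A stdlib sketch of an MA(1) process, under the sign convention Y_t = e_t − θ·e_{t−1} used above (for which ρ_1 = −θ/(1 + θ²) and ρ_k = 0 for k > 1); the parameter values are arbitrary illustration choices:

```python
import random

# MA(1): Y_t = e_t - theta*e_{t-1}.  Theoretical ACF:
# rho_1 = -theta / (1 + theta**2), and rho_k = 0 for every k > 1
# (the ACF "cuts off" after lag 1).
rng = random.Random(4)
theta, n = 0.6, 100000
e = [rng.gauss(0.0, 1.0) for _ in range(n + 1)]
y = [e[t] - theta * e[t - 1] for t in range(1, n + 1)]

mean = sum(y) / n
c0 = sum((v - mean) ** 2 for v in y) / n

def rho(k):
    """Sample autocorrelation of y at lag k."""
    ck = sum((y[t] - mean) * (y[t + k] - mean) for t in range(n - k)) / n
    return ck / c0

rho1_theory = -theta / (1 + theta**2)   # about -0.441
# rho(1) should be close to rho1_theory; rho(2), rho(3), ... close to 0.
```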

19 General pattern for the ACF of an MA(q)-process: it "cuts off" after lag q, i.e. ρ_k = 0 for k > q.

20 Invertibility (of an MA-process): an MA(q)-process can be rewritten as an AR(∞)-process, provided the resulting coefficients π_1, π_2, … fulfil the conditions of stationarity for Y_t. They do if the characteristic equation of the MA(q)-process has all its roots outside the unit circle (modulus > 1).
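For the MA(1) case this condition is easy to state in code (a sketch, again using the convention Y_t = e_t − θ·e_{t−1}; both helper names are our own):

```python
def ma1_invertible(theta):
    """MA(1) with Y_t = e_t - theta*e_{t-1}: the characteristic equation
    1 - theta*x = 0 has the single root x = 1/theta, which lies outside
    the unit circle exactly when |theta| < 1.
    """
    return abs(theta) < 1

def ma1_ar_weights(theta, n_weights):
    """First coefficients of the AR(inf) representation
    Y_t = -theta*Y_{t-1} - theta**2*Y_{t-2} - ... + e_t,
    which die out only when the process is invertible.
    """
    return [-theta**j for j in range(1, n_weights + 1)]
```

For θ = 0.5 the first AR(∞) weights are [-0.5, -0.25, -0.125], a geometrically decaying sequence; for |θ| ≥ 1 the weights would blow up instead.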

21 Autoregressive-moving average processes, ARMA(p,q): Y_t = δ + φ_1·Y_{t−1} + … + φ_p·Y_{t−p} + e_t − θ_1·e_{t−1} − … − θ_q·e_{t−q}

22 Non-stationary processes. A simple grouping of non-stationary processes: non-stationary in mean; non-stationary in variance; non-stationary in both mean and variance. Classical approach: try to "make" the process stationary before modelling. Modern approach: try to model the process in its original form.

23 Classical approach, non-stationary in mean. Example: the random walk Y_t = Y_{t−1} + e_t is non-stationary, but its first difference W_t = Y_t − Y_{t−1} = e_t is white noise and hence stationary.
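The random-walk example can be demonstrated in a few lines (a stdlib-only sketch):

```python
import random

# A random walk Y_t = Y_{t-1} + e_t is non-stationary in mean, but its
# first difference W_t = Y_t - Y_{t-1} = e_t is white noise, hence
# stationary.
rng = random.Random(5)
e = [rng.gauss(0.0, 1.0) for _ in range(1000)]

walk = [0.0]
for shock in e:
    walk.append(walk[-1] + shock)

diffed = [walk[t] - walk[t - 1] for t in range(1, len(walk))]
# diffed recovers the white-noise shocks (up to floating-point rounding).
```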

24 More generally: first-order non-stationarity in mean ⇒ use first-order differencing; second-order non-stationarity in mean ⇒ use second-order differencing; …

25 ARIMA(p,d,q): an ARMA(p,q) model fitted to the d-times differenced series. Common in practice: d ≤ 2, p ≤ 3, q ≤ 3.

26 Non-stationarity in variance. Classical approach: use power transformations (Box-Cox). Common order of application:
1. Square root
2. Fourth root
3. Log
4. Reciprocal (1/Y)
For non-stationarity both in mean and variance:
1. Power transformation
2. Differencing
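The four transformations above can be compared on a toy series (a sketch; the series values are invented purely for illustration and "spread" here is just the range):

```python
import math

# Variance-stabilizing power transformations in the order the slide
# lists them: square root, fourth root, log, reciprocal.  Which one is
# appropriate depends on how strongly the variability grows with the
# level of the series.
transforms = [
    ("square root", lambda y: math.sqrt(y)),
    ("fourth root", lambda y: y ** 0.25),
    ("log",         lambda y: math.log(y)),
    ("reciprocal",  lambda y: 1.0 / y),
]

# Illustrative positive-valued series whose spread grows with its level.
series = [10.0, 12.0, 40.0, 55.0, 160.0, 210.0]

def spread(values):
    """Range of the values (a crude measure of variability)."""
    return max(values) - min(values)

# Each transformation compresses the range relative to the raw series.
spreads = {name: spread([f(y) for y in series]) for name, f in transforms}
```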

