# R. Werner, Solar-Terrestrial Influences Institute - BAS: Time Series Analysis by Means of Inferential Statistical Methods

1 R. Werner, Solar-Terrestrial Influences Institute - BAS: Time series analysis by means of inferential statistical methods

2 Inferential statistical analysis of time series. Now: measures of the significance of extrapolated trends and of causal relations between two variables. Cross-section analysis: Y is a realization of a stochastic process; the errors, for example, must have a given probability distribution. Time series analysis: prognosis for y_{t+1}; on this basis the influence of exogenous parameters can be investigated.

3 A model that describes probability structures is called a stochastic process. The model includes assumptions about the mechanisms generating the observed time series. A general assumption is (weak) stationarity:
(a) autocovariances with a lag greater than k are assumed to be zero → moving-average models;
(b) autocovariances of higher order can be calculated from those of lower order → autoregressive models.

4 (figure-only slide)

5 (figure-only slide)

6 Wavelet transformation, MMNR*100 hab data

7 Autoregressive (AR) models of order p: z_t = φ_1 z_{t-1} + … + φ_p z_{t-p} + a_t, where the error term a_t is white noise.

8 AR(1) process
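
As an illustrative sketch (not part of the original slides), an AR(1) process z_t = φ z_{t-1} + a_t with Gaussian white noise can be simulated in a few lines of Python:

```python
import random

def simulate_ar1(phi, n, sigma=1.0, seed=1):
    """Simulate z_t = phi * z_{t-1} + a_t, with a_t ~ N(0, sigma^2) white noise."""
    rng = random.Random(seed)
    z = [0.0]                          # start at the process mean
    for _ in range(n - 1):
        z.append(phi * z[-1] + rng.gauss(0.0, sigma))
    return z

series = simulate_ar1(0.8, 500)
```

With |φ| < 1 the process is stationary and fluctuates around zero; the simulated series should show the strong positive lag-1 correlation implied by φ = 0.8.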

9 (figure-only slide)

10 Theoretical autocorrelation function (ACF); for an AR(1) process the ACF is ρ_k = φ_1^k.

11 Yule-Walker equations for AR(1), AR(2), and the general AR(p) case
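
For the AR(2) case the two Yule-Walker equations, ρ_1 = φ_1 + φ_2 ρ_1 and ρ_2 = φ_1 ρ_1 + φ_2, can be solved in closed form; a minimal Python sketch:

```python
def yule_walker_ar2(rho1, rho2):
    """Solve the AR(2) Yule-Walker system
         rho1 = phi1 + phi2 * rho1
         rho2 = phi1 * rho1 + phi2
    for (phi1, phi2), given the first two autocorrelations."""
    denom = 1.0 - rho1 ** 2
    phi1 = rho1 * (1.0 - rho2) / denom
    phi2 = (rho2 - rho1 ** 2) / denom
    return phi1, phi2
```

If the data are really AR(1), so that ρ_2 = ρ_1², the solution collapses to φ_2 = 0 and φ_1 = ρ_1, as expected.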

12 AR(2): conditions of stationarity in the (φ_1, φ_2) plane: φ_2 + φ_1 < 1, φ_2 − φ_1 < 1, |φ_2| < 1. In the area under the parabola φ_1² + 4φ_2 = 0 the AR(2) model describes a quasi-cyclic process.
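
A small helper (an illustrative sketch, not from the slides) can classify an AR(2) parameter pair: the triangle φ_2 + φ_1 < 1, φ_2 − φ_1 < 1, |φ_2| < 1 gives stationarity, and below the parabola φ_1² + 4φ_2 = 0 the characteristic roots are complex, which produces the quasi-cyclic behaviour:

```python
def ar2_regime(phi1, phi2):
    """Classify the AR(2) model z_t = phi1*z_{t-1} + phi2*z_{t-2} + a_t.

    Stationarity triangle: phi2 + phi1 < 1, phi2 - phi1 < 1, |phi2| < 1.
    Complex characteristic roots (phi1**2 + 4*phi2 < 0) yield a quasi-cycle.
    """
    stationary = (phi2 + phi1 < 1) and (phi2 - phi1 < 1) and (abs(phi2) < 1)
    if not stationary:
        return "non-stationary"
    if phi1 ** 2 + 4 * phi2 < 0:
        return "quasi-cycle"
    return "monotone/mixed decay"
```

For example, the pair φ_1 = 1.7, φ_2 = −0.95 used later in the presentation lies in the quasi-cycle region.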

13 (figure: simulated realizations with starting values z(1)=1 and z(2)=2, plotted against time)

14 (figure-only slide)

15 (figure-only slide)

16 Stepwise calculation of the coefficients from the Yule-Walker equations (k = 1, k = 2, …). The theoretical PACF of an AR(p) process has values different from zero only for k = 1, 2, …, p! The partial autocorrelation function (PACF), known from cross-section statistics, is therefore a model identification tool.
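
The stepwise Yule-Walker solution is the Durbin-Levinson recursion; a sketch in Python that returns the partial autocorrelations φ_kk from a given theoretical ACF:

```python
def pacf_from_acf(rho):
    """Durbin-Levinson recursion: partial autocorrelations phi_kk,
    computed stepwise from the autocorrelations rho[0..K] (rho[0] == 1)."""
    K = len(rho) - 1
    phi_prev = {}          # phi_{k-1, j} coefficients from the previous step
    pacf = []
    for k in range(1, K + 1):
        if k == 1:
            phikk = rho[1]
            phi_cur = {1: phikk}
        else:
            num = rho[k] - sum(phi_prev[j] * rho[k - j] for j in range(1, k))
            den = 1.0 - sum(phi_prev[j] * rho[j] for j in range(1, k))
            phikk = num / den
            phi_cur = {j: phi_prev[j] - phikk * phi_prev[k - j] for j in range(1, k)}
            phi_cur[k] = phikk
        pacf.append(phikk)
        phi_prev = phi_cur
    return pacf
```

For an AR(1) ACF ρ_k = φ^k, all partial autocorrelations beyond lag 1 vanish, matching the cut-off property of the PACF for an AR(p) process.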

17 Theoretical autocorrelation function (ACF) and partial autocorrelation function (PACF) for an AR(2) process with φ_1 = 1.7, φ_2 = −0.95

18 Yule 1927

19 (figure-only slide)

20 (figure-only slide)

21 Residuals; parameter estimation:

| Parameter | Yule    | This work |
|-----------|---------|-----------|
| φ_1       | 1.3425  | 1.3571    |
| φ_2       | -0.6550 | -0.6601   |
| c         | 0       | 0         |

22 Distribution of the residuals; autocorrelation function of the residuals

23 Moving-average (MA) models. AR models describe processes as a function of past z values; however, as was shown for the AR(2) process z_t = 1.7 z_{t-1} − 0.95 z_{t-2} + a_t, the process is forced by the noise a_t (with a theoretically infinite influence of the shocks). The idea now is, as for the AR process, to model z_t directly as a finite series of lagged a_t terms, keeping the number of parameters small.

24 Autocorrelation function for an MA(1) process and for an MA(2) process
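
Under the sign convention z_t = a_t + θ_1 a_{t-1} (conventions differ between textbooks), the MA(1) autocorrelations cut off after lag 1; a minimal sketch:

```python
def ma1_acf(theta, K=5):
    """Theoretical ACF of the MA(1) process z_t = a_t + theta * a_{t-1}:
    rho_1 = theta / (1 + theta**2) and rho_k = 0 for all k >= 2."""
    rho1 = theta / (1.0 + theta ** 2)
    return [1.0, rho1] + [0.0] * (K - 1)   # lags 0, 1, ..., K
```

Note that |ρ_1| ≤ 0.5 for any real θ, with the maximum reached at θ = 1; this sharp cut-off of the empirical ACF is what identifies the MA order.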

25 PACF? Invertibility condition for the MA(2) model in the (θ_1, θ_2) plane: θ_2 + θ_1 < 1, θ_2 − θ_1 < 1, |θ_2| < 1. In the area under the parabola θ_1² + 4θ_2 = 0 the MA(2) model describes a quasi-cyclic process. The empirical ACF is a tool for identification of the MA order.

26 Invertibility condition: for an MA(1) process we have |θ_1| < 1.

27 The MA(1) process can be represented by an AR(∞) process. In general, an MA(q) process can be represented by an AR(∞) process, and an AR(p) process can be represented by an MA(∞) process. Box-Jenkins principle: models with the minimum number of parameters have to be used.
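
With the convention z_t = a_t + θ a_{t-1} and |θ| < 1, inverting the MA polynomial gives z_t = Σ_{j≥1} π_j z_{t-j} + a_t with π_j = −(−θ)^j; the weights decay geometrically, which is why invertibility matters. A sketch:

```python
def ma1_as_ar_weights(theta, J=10):
    """AR(infinity) weights pi_j for the invertible MA(1) z_t = a_t + theta*a_{t-1}:
    z_t = sum_{j>=1} pi_j * z_{t-j} + a_t, with pi_j = -(-theta)**j."""
    return [-((-theta) ** j) for j in range(1, J + 1)]
```

For θ = 0.5 the weights are 0.5, −0.25, 0.125, …: a short MA description corresponds to an infinitely long, but rapidly fading, AR description.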

28 Other models:
- ARMA: mixed model with AR and MA terms
- ARIMA: autoregressive integrated moving-average model; it uses differences of the time series
- SARIMA: seasonal ARIMA model with a constant seasonal pattern
- VARMA: vector ARMA

29 Forecast AR(1)
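
For a zero-mean AR(1) the l-step-ahead forecast is ẑ_t(l) = φ^l z_t, decaying geometrically toward the mean; a minimal sketch:

```python
def ar1_forecast(phi, z_last, horizon):
    """l-step-ahead forecasts of a zero-mean AR(1): z_hat(t+l) = phi**l * z_t."""
    return [phi ** l * z_last for l in range(1, horizon + 1)]
```

For a process with mean μ, the same formula applies to the deviations: the forecast is μ + φ^l (z_t − μ).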

30 MA(1): it can be shown that the forecast for horizons beyond the model order is simply the process mean, so MA models are not useful for prognosis more than q steps ahead.

31 Forecast of the SSN by an AR(9) model

32 Dynamical regression. Ordinary linear regression: Y_i = α + β X_i + ε_i (X_i may be transformed). α and β can be optimally estimated by ordinary least squares (OLS) under the assumptions that Y_i is normally distributed and X_i need not be stochastic (it can, for example, be fixed), together with:
1. E(ε_i) = 0
2. ε_i is not autocorrelated: Cov(ε_i, ε_j) = 0 for i ≠ j
3. ε_i is normally distributed
4. equilibrium conditions
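
The OLS estimates for the simple model Y_i = α + β X_i + ε_i have the familiar closed form; a self-contained sketch:

```python
def ols(x, y):
    """OLS estimates (alpha, beta) for y_i = alpha + beta * x_i + eps_i."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # centred cross-product
    sxx = sum((xi - mx) ** 2 for xi in x)                     # centred sum of squares
    beta = sxy / sxx
    alpha = my - beta * mx
    return alpha, beta
```

On exactly linear data the estimates recover the generating line, e.g. y = 1 + 2x.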

33 For time series this can be written formally (i → t). The assumption of equilibrium is not necessary. However, in time series the error term is often autocorrelated:
- the estimates are not efficient (they do not have minimal variance);
- autocorrelation of X can be transferred to ε; autocorrelation of ε biases σ_ε away from its true value, which in turn implies a wrong value of σ_β.
(γ: autocorrelation of the residuals; λ: autocorrelation of the predictors)

34 Simple lag model (models dynamical in X). Distributed lag model: the influence is distributed over k lags, for example k = 2. Because the lagged X values are strongly correlated, the statistical interpretation of the individual β coefficients does not make sense.

35 Therefore more restrictions are needed, for example β_k = β_0 δ^k (0 < δ < 1), where the influence decreases exponentially with k. Then the model has only three parameters: α, β_0, δ.

36 How to determine the parameters? The Koyck transformation: subtracting δ times the equation at t−1 gives Y_t = α(1 − δ) + β_0 X_t + δ Y_{t-1} + (ε_t − δ ε_{t-1}). Using OLS, δ and β_0 can be estimated, and after this the β_k.
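
Under the Koyck restriction β_k = β_0 δ^k the whole lag distribution is determined by two numbers, and the long-run multiplier is β_0/(1 − δ); a sketch:

```python
def koyck_weights(beta0, delta, K):
    """Distributed-lag weights under the Koyck restriction beta_k = beta0 * delta**k."""
    return [beta0 * delta ** k for k in range(K + 1)]
```

For β_0 = 2 and δ = 0.5 the weights are 2, 1, 0.5, 0.25, …, summing toward the long-run multiplier 2/(1 − 0.5) = 4.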

37 Similar models: the adaptive expectation model; the partial adjustment model; models with two or more input variables

38 Model with autocorrelated error term. Recall that ε_t in linear regression has to be N(0, σ); here ε_t is an AR(1) process, ε_t = ρ ε_{t-1} + u_t. Estimation of the regression coefficients by the Cochrane-Orcutt method: 1. Estimate α and β by OLS, calculate the residuals e_t, and estimate the autocorrelation coefficient ρ.

39 2. Transform to the new regression equation Y_t − ρ Y_{t-1} = α(1 − ρ) + β (X_t − ρ X_{t-1}) + u_t, where u_t is white noise. Note: to test whether ε_t is autocorrelated, the Durbin-Watson test can be applied.
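
The Durbin-Watson statistic mentioned here is d = Σ_t (e_t − e_{t-1})² / Σ_t e_t² ≈ 2(1 − r_1): values near 2 indicate no first-order autocorrelation, values near 0 strong positive, and values near 4 strong negative autocorrelation. A sketch:

```python
def durbin_watson(e):
    """Durbin-Watson statistic d = sum_t (e_t - e_{t-1})**2 / sum_t e_t**2."""
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    den = sum(et ** 2 for et in e)
    return num / den
```

Constant residuals (perfect positive correlation) give d = 0, while a strictly alternating residual sequence pushes d toward 4.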

40 (figure-only slide)

41 (figure-only slide)

42 e = 0.0626·t − 121.35

43 Autocorrelation function of the detrended residuals

44 Partial autocorrelation function of the detrended residuals

45 Acknowledgement: I want to thank the Ministry of Education and Science for supporting this work under contract DVU01/0120.
