
Published by Marina Madewell. Modified over 4 years ago.

1
The Spectral Representation of Stationary Time Series

2
Stationary time series satisfy the properties:
1. Constant mean: $E(x_t) = \mu$
2. Constant variance: $\text{Var}(x_t) = \sigma^2$
3. The correlation between two observations $(x_t, x_{t+h})$ depends only on the distance $h$.
These properties make a frequency-domain (spectral) description of a stationary time series possible.

3
Recall that
$$x_t = \sum_{i=1}^{k}\left[X_i \cos(\lambda_i t) + Y_i \sin(\lambda_i t)\right]$$
is a stationary time series, where $X_1, X_2, \dots, X_k$ and $Y_1, Y_2, \dots, Y_k$ are independent random variables with
$$E(X_i) = E(Y_i) = 0, \quad \text{Var}(X_i) = \text{Var}(Y_i) = \sigma_i^2,$$
and $\lambda_1, \lambda_2, \dots, \lambda_k$ are $k$ values in $(0, \pi)$.

4
With this time series,
$$E(x_t) = 0 \quad \text{and} \quad \gamma(h) = \sum_{i=1}^{k} \sigma_i^2 \cos(\lambda_i h).$$
We can give it a non-zero mean, $\mu$, by adding $\mu$ to the equation:
$$x_t = \mu + \sum_{i=1}^{k}\left[X_i \cos(\lambda_i t) + Y_i \sin(\lambda_i t)\right].$$
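The claim that this series has mean 0 and autocovariance $\gamma(h) = \sum_i \sigma_i^2 \cos(\lambda_i h)$ can be checked by Monte Carlo simulation. A minimal Python sketch; the variances, frequencies, and lags below are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = np.array([1.0, 0.5])   # component variances sigma_i^2 (illustrative)
lam = np.array([0.8, 2.0])      # frequencies lambda_i in (0, pi) (illustrative)
t, h = 3, 2
n = 200_000                     # Monte Carlo replications

# X_i, Y_i independent with mean 0 and variance sigma_i^2
X = rng.normal(0.0, np.sqrt(sigma2), size=(n, 2))
Y = rng.normal(0.0, np.sqrt(sigma2), size=(n, 2))
x_t = (X * np.cos(lam * t) + Y * np.sin(lam * t)).sum(axis=1)
x_th = (X * np.cos(lam * (t + h)) + Y * np.sin(lam * (t + h))).sum(axis=1)

gamma_mc = np.mean(x_t * x_th)               # sample E[x_{t+h} x_t]
gamma_th = np.sum(sigma2 * np.cos(lam * h))  # sum_i sigma_i^2 cos(lambda_i h)
print(gamma_mc, gamma_th)                    # the two should agree closely
```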

5
We now extend this example to a wider class of time series, which turns out to be the complete set of weakly stationary time series. In this case the collection of frequencies may even vary over a continuous range of frequencies $[0, \pi]$.

6
The Riemann integral:
$$\int_a^b g(x)\,dx$$
The Riemann–Stieltjes integral:
$$\int_a^b g(x)\,dF(x)$$
If $F$ is continuous with derivative $f$, then
$$\int_a^b g(x)\,dF(x) = \int_a^b g(x)\, f(x)\,dx.$$
If $F$ is a step function with jumps $p_i$ at $x_i$, then
$$\int_a^b g(x)\,dF(x) = \sum_i g(x_i)\, p_i.$$
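Both cases can be checked numerically. A small Python sketch; the choices of $g$, $F$, the jump points, and the jump sizes are illustrative assumptions:

```python
import numpy as np

g = lambda x: x**2

# Case 1: continuous F(x) = x^2 on [0, 1], so f(x) = F'(x) = 2x and the
# Riemann-Stieltjes integral becomes the Riemann integral of g(x) f(x).
x = np.linspace(0.0, 1.0, 10_001)
mid = (x[:-1] + x[1:]) / 2            # midpoint rule
dx = np.diff(x)
continuous_case = np.sum(g(mid) * 2 * mid * dx)   # = integral of 2x^3 = 0.5

# Case 2: step F with jumps p_i at x_i -> weighted sum of g at the jumps.
xi = np.array([0.2, 0.5, 0.9])
p = np.array([0.3, 0.3, 0.4])
step_case = np.sum(g(xi) * p)
print(continuous_case, step_case)
```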

7
First, we develop the concept of integration with respect to a stochastic process. Let $\{U(\lambda) : \lambda \in [0, \pi]\}$ denote a stochastic process with mean 0 and independent increments; that is,
$$E\{[U(\lambda_2) - U(\lambda_1)][U(\lambda_4) - U(\lambda_3)]\} = 0 \quad \text{for } 0 \le \lambda_1 < \lambda_2 \le \lambda_3 < \lambda_4 \le \pi,$$
and $E[U(\lambda)] = 0$ for $0 \le \lambda \le \pi$.

8
In addition, let $G(\lambda) = E[U^2(\lambda)]$ for $0 \le \lambda \le \pi$ and assume $G(0) = 0$. It is easy to show that $G(\lambda)$ is monotonically non-decreasing, i.e. $G(\lambda_1) \le G(\lambda_2)$ for $\lambda_1 < \lambda_2$.

9
Now let us define
$$\int_0^\pi g(\lambda)\,dU(\lambda),$$
analogous to the Riemann–Stieltjes integral.

10
Let $0 = \lambda_0 < \lambda_1 < \lambda_2 < \dots < \lambda_n = \pi$ be any partition of the interval. Let $\lambda_i^*$ denote any value in the interval $[\lambda_{i-1}, \lambda_i]$. Consider
$$V_n = \sum_{i=1}^{n} g(\lambda_i^*)\left[U(\lambda_i) - U(\lambda_{i-1})\right].$$
Suppose that $\max_i(\lambda_i - \lambda_{i-1}) \to 0$ and there exists a random variable $V$ such that
$$\lim_{n \to \infty} E\left[(V_n - V)^2\right] = 0.$$

11
Then $V$ is denoted by
$$V = \int_0^\pi g(\lambda)\,dU(\lambda).$$

12
Properties:
1. $E\left[\int_0^\pi g(\lambda)\,dU(\lambda)\right] = 0$
2. $E\left[\left(\int_0^\pi g(\lambda)\,dU(\lambda)\right)^2\right] = \int_0^\pi g^2(\lambda)\,dG(\lambda)$
3. $E\left[\int_0^\pi g_1(\lambda)\,dU(\lambda)\int_0^\pi g_2(\lambda)\,dU(\lambda)\right] = \int_0^\pi g_1(\lambda)\,g_2(\lambda)\,dG(\lambda)$
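The mean and variance of a stochastic integral $\int_0^\pi g(\lambda)\,dU(\lambda)$ can be illustrated numerically by taking $U$ to be a process with independent Gaussian increments of variance $\Delta\lambda$, so that $G(\lambda) = \lambda$. This choice of $U$, and $g(\lambda) = \cos\lambda$, are assumptions made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_grid, n_rep = 400, 10_000
lam = np.linspace(0.0, np.pi, n_grid + 1)
dlam = np.diff(lam)
mid = (lam[:-1] + lam[1:]) / 2
g = np.cos(mid)

# U has independent N(0, dlam) increments, hence G(lambda) = E[U^2(lambda)] = lambda
dU = rng.normal(0.0, np.sqrt(dlam), size=(n_rep, n_grid))
V = (g * dU).sum(axis=1)            # approximates the integral of g dU

print(V.mean())                      # property 1: approximately 0
print(V.var(), np.sum(g**2 * dlam))  # property 2: approx. integral of g^2 dG = pi/2
```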

13
The Spectral Representation of Stationary Time Series

14
Let $\{X(\lambda) : \lambda \in [0, \pi]\}$ and $\{Y(\lambda) : \lambda \in [0, \pi]\}$ denote two mutually uncorrelated stochastic processes with mean 0 and independent increments. Also let
$$F(\lambda) = E[X^2(\lambda)] = E[Y^2(\lambda)] \quad \text{for } 0 \le \lambda \le \pi, \quad F(0) = 0.$$
Now define the time series $\{x_t : t \in T\}$ as follows:
$$x_t = \int_0^\pi \cos(t\lambda)\,dX(\lambda) + \int_0^\pi \sin(t\lambda)\,dY(\lambda).$$

15
Then $E[x_t] = 0$.

16
Also
$$E[x_{t+h}\, x_t] = \int_0^\pi \cos(h\lambda)\,dF(\lambda).$$

19
Thus the time series $\{x_t : t \in T\}$ defined by
$$x_t = \int_0^\pi \cos(t\lambda)\,dX(\lambda) + \int_0^\pi \sin(t\lambda)\,dY(\lambda)$$
is a stationary time series with
$$\gamma(h) = \int_0^\pi \cos(h\lambda)\,dF(\lambda).$$
$F(\lambda)$ is called the spectral distribution function. If $f(\lambda) = F'(\lambda)$ exists, then $f(\lambda)$ is called the spectral density function.

20
Note: the spectral distribution function, $F(\lambda)$, and the spectral density function, $f(\lambda)$, describe how the variance of $x_t$ is distributed over the frequencies in the interval $[0, \pi]$.

21
The autocovariance function, $\gamma(h)$, can be computed from the spectral density function, $f(\lambda)$, as follows:
$$\gamma(h) = \int_0^\pi \cos(h\lambda)\, f(\lambda)\,d\lambda.$$
Also, the spectral density function, $f(\lambda)$, can be computed from the autocovariance function, $\gamma(h)$, as follows:
$$f(\lambda) = \frac{1}{\pi}\left[\gamma(0) + 2\sum_{h=1}^{\infty} \gamma(h)\cos(h\lambda)\right].$$
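The two formulas, $\gamma(h) = \int_0^\pi \cos(h\lambda) f(\lambda)\,d\lambda$ and the series for $f(\lambda)$ in terms of $\gamma(h)$, are inverses of each other. This can be checked numerically for an autocovariance with only finitely many non-zero terms; the values below are illustrative (they correspond to an MA(1) series with $\beta_1 = 0.6$, $\sigma^2 = 1$):

```python
import numpy as np

gamma = {0: 1.36, 1: 0.6}   # gamma(0), gamma(1); gamma(h) = 0 for h >= 2

# spectral density on [0, pi] built from the autocovariances
f = lambda lam: (gamma[0] + 2 * gamma[1] * np.cos(lam)) / np.pi

# recover gamma(h) by midpoint-rule integration of cos(h*lam) * f(lam)
lam = np.linspace(0.0, np.pi, 20_001)
mid = (lam[:-1] + lam[1:]) / 2
dlam = np.diff(lam)
recovered = []
for h in (0, 1, 2):
    recovered.append(np.sum(np.cos(h * mid) * f(mid) * dlam))
print(recovered)   # approximately 1.36, 0.6, 0
```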

22
Example: let $\{u_t : t \in T\}$ be identically distributed and uncorrelated with mean zero (a white noise series). Thus
$$\gamma(h) = \begin{cases} \sigma^2 & h = 0 \\ 0 & h \ne 0 \end{cases}$$
and
$$f(\lambda) = \frac{\sigma^2}{\pi} \quad \text{for } 0 \le \lambda \le \pi.$$

23
Graph: the spectral density of a white noise series is constant (flat) over $[0, \pi]$.

24
Example: suppose $X_1, X_2, \dots, X_k$ and $Y_1, Y_2, \dots, Y_k$ are independent random variables with
$$E(X_i) = E(Y_i) = 0, \quad \text{Var}(X_i) = \text{Var}(Y_i) = \sigma_i^2.$$
Let $\lambda_1, \lambda_2, \dots, \lambda_k$ denote $k$ values in $(0, \pi)$. Then
$$x_t = \sum_{i=1}^{k}\left[X_i \cos(\lambda_i t) + Y_i \sin(\lambda_i t)\right].$$

25
If we define
$$X(\lambda) = \sum_{i:\, \lambda_i \le \lambda} X_i \quad \text{and} \quad Y(\lambda) = \sum_{i:\, \lambda_i \le \lambda} Y_i \quad \text{for } \lambda \in [0, \pi],$$
then
$$x_t = \int_0^\pi \cos(t\lambda)\,dX(\lambda) + \int_0^\pi \sin(t\lambda)\,dY(\lambda).$$
Note: $X(\lambda)$ and $Y(\lambda)$ are "random" step functions, and $F(\lambda)$ is a step function with jumps $\sigma_i^2$ at $\lambda_i$.

27
Another important comment: in the case when $F(\lambda)$ is continuous,
$$F(\lambda) = \int_0^\lambda f(\omega)\,d\omega, \quad \text{where } f(\lambda) = F'(\lambda),$$

28
and in this case
$$\gamma(h) = \int_0^\pi \cos(h\lambda)\, f(\lambda)\,d\lambda.$$
Sometimes the spectral density function, $f(\lambda)$, is extended to the interval $[-\pi, \pi]$ and is assumed symmetric about 0 (i.e. $f_s(\lambda) = f_s(-\lambda) = f(\lambda)/2$). It can be shown that
$$\gamma(h) = \int_{-\pi}^{\pi} e^{ih\lambda}\, f_s(\lambda)\,d\lambda.$$

29
Hence
$$f_s(\lambda) = \frac{1}{2\pi}\sum_{h=-\infty}^{\infty} \gamma(h)\, e^{-ih\lambda}.$$
From now on we will use the symmetric spectral density function and let it be denoted by $f(\lambda)$.

30
Linear Filters

31
Let $\{x_t : t \in T\}$ be any time series and suppose that the time series $\{y_t : t \in T\}$ is constructed as follows:
$$y_t = \sum_{s} a_s x_{t-s}.$$
The time series $\{y_t : t \in T\}$ is said to be constructed from $\{x_t : t \in T\}$ by means of a linear filter.

input $x_t$ → Linear Filter $\{a_s\}$ → output $y_t$
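For a filter with finitely many weights, the construction $y_t = \sum_s a_s x_{t-s}$ is just a convolution. A minimal sketch; the weights and input values are arbitrary illustrative numbers:

```python
import numpy as np

a = np.array([0.5, 0.3, 0.2])             # filter weights a_0, a_1, a_2 (illustrative)
x = np.array([1.0, 2.0, -1.0, 0.5, 3.0])  # input series (illustrative)

# y_t = sum_s a_s x_{t-s}; mode="valid" keeps only those t for which
# every lagged value x_{t-s} needed by the filter is available
y = np.convolve(x, a, mode="valid")
print(y)   # [0.3, 0.35, 1.45]
```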

32
Let $\gamma_x(h)$ denote the autocovariance function of $\{x_t : t \in T\}$ and $\gamma_y(h)$ the autocovariance function of $\{y_t : t \in T\}$. Assume also that $E[x_t] = E[y_t] = 0$. Then
$$\gamma_y(h) = \sum_{s}\sum_{r} a_s a_r\, \gamma_x(h - s + r).$$

35
Hence
$$f_y(\lambda) = \left|A(e^{-i\lambda})\right|^2 f_x(\lambda),$$
where
$$A(e^{-i\lambda}) = \sum_{s} a_s e^{-is\lambda}$$
is the transfer function of the linear filter.

36
Note:
$$\left|A(e^{-i\lambda})\right|^2 = A(e^{-i\lambda})\,A(e^{i\lambda}),$$
hence
$$f_y(\lambda) = A(e^{-i\lambda})\,A(e^{i\lambda})\, f_x(\lambda).$$

37
Spectral density function: Moving Average time series of order q, MA(q). Let $\beta_0 = 1, \beta_1, \beta_2, \dots, \beta_q$ denote $q + 1$ numbers. Let $\{u_t : t \in T\}$ denote a white noise time series with variance $\sigma^2$. Let $\{x_t : t \in T\}$ denote an MA(q) time series with $\mu = 0$:
$$x_t = u_t + \beta_1 u_{t-1} + \dots + \beta_q u_{t-q}.$$
Note: $\{x_t : t \in T\}$ is obtained from $\{u_t : t \in T\}$ by a linear filter.

38
Now
$$A(e^{-i\lambda}) = \sum_{s=0}^{q} \beta_s e^{-is\lambda} \quad \text{and} \quad f_u(\lambda) = \frac{\sigma^2}{2\pi}.$$
Hence
$$f_x(\lambda) = \frac{\sigma^2}{2\pi}\left|\sum_{s=0}^{q} \beta_s e^{-is\lambda}\right|^2.$$

39
Example: $q = 1$
$$f_x(\lambda) = \frac{\sigma^2}{2\pi}\left(1 + 2\beta_1\cos\lambda + \beta_1^2\right).$$
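For the MA(1) case, the closed form $\frac{\sigma^2}{2\pi}(1 + 2\beta_1\cos\lambda + \beta_1^2)$ should agree with the transfer-function expression $\frac{\sigma^2}{2\pi}|1 + \beta_1 e^{-i\lambda}|^2$. A quick numerical check with illustrative values $\beta_1 = 0.6$, $\sigma^2 = 1$:

```python
import numpy as np

beta1, sigma2 = 0.6, 1.0
lam = np.linspace(0.0, np.pi, 1000)

f_transfer = sigma2 / (2 * np.pi) * np.abs(1 + beta1 * np.exp(-1j * lam))**2
f_closed = sigma2 / (2 * np.pi) * (1 + 2 * beta1 * np.cos(lam) + beta1**2)
print(np.allclose(f_transfer, f_closed))   # True
```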

40
Example: $q = 2$
$$f_x(\lambda) = \frac{\sigma^2}{2\pi}\left(1 + \beta_1^2 + \beta_2^2 + 2\beta_1(1 + \beta_2)\cos\lambda + 2\beta_2\cos 2\lambda\right).$$

41
Spectral density function for MA(1) Series

42
Spectral density function: Autoregressive time series of order p, AR(p). Let $\beta_1, \beta_2, \dots, \beta_p$ denote $p$ numbers. Let $\{u_t : t \in T\}$ denote a white noise time series with variance $\sigma^2$. Let $\{x_t : t \in T\}$ denote an AR(p) time series with $\mu = 0$:
$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \dots + \beta_p x_{t-p} + u_t.$$
Note: $\{u_t : t \in T\}$ is obtained from $\{x_t : t \in T\}$ by a linear filter.

43
Now
$$f_u(\lambda) = \left|1 - \sum_{s=1}^{p} \beta_s e^{-is\lambda}\right|^2 f_x(\lambda) = \frac{\sigma^2}{2\pi}.$$
Hence
$$f_x(\lambda) = \frac{\sigma^2}{2\pi}\,\frac{1}{\left|1 - \sum_{s=1}^{p} \beta_s e^{-is\lambda}\right|^2}.$$

44
Example: $p = 1$
$$f_x(\lambda) = \frac{\sigma^2}{2\pi}\,\frac{1}{1 - 2\beta_1\cos\lambda + \beta_1^2}.$$
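As with the MA(1) case, the AR(1) closed form $\frac{\sigma^2}{2\pi}/(1 - 2\beta_1\cos\lambda + \beta_1^2)$ should agree with $\frac{\sigma^2}{2\pi}/|1 - \beta_1 e^{-i\lambda}|^2$. A numerical check with illustrative values $\beta_1 = 0.7$, $\sigma^2 = 1$:

```python
import numpy as np

beta1, sigma2 = 0.7, 1.0
lam = np.linspace(0.0, np.pi, 1000)

f_transfer = sigma2 / (2 * np.pi) / np.abs(1 - beta1 * np.exp(-1j * lam))**2
f_closed = sigma2 / (2 * np.pi) / (1 - 2 * beta1 * np.cos(lam) + beta1**2)
print(np.allclose(f_transfer, f_closed))   # True
```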

45
Example: $p = 2$
$$f_x(\lambda) = \frac{\sigma^2}{2\pi}\,\frac{1}{1 + \beta_1^2 + \beta_2^2 - 2\beta_1(1 - \beta_2)\cos\lambda - 2\beta_2\cos 2\lambda}.$$

46
Example : Sunspot Numbers (1770-1869)

48
Autocorrelation function and partial autocorrelation function

49
Spectral density estimate

50
Assuming an AR(2) model

51
A linear discrete time series: Moving Average time series of infinite order

52
Let $\psi_0 = 1, \psi_1, \psi_2, \dots$ denote an infinite sequence of numbers. Let $\{u_t : t \in T\}$ denote a white noise time series with variance $\sigma^2$:
- independent
- mean 0, variance $\sigma^2$.
Let $\{x_t : t \in T\}$ be defined by the equation
$$x_t = \sum_{s=0}^{\infty} \psi_s u_{t-s}.$$
Then $\{x_t : t \in T\}$ is called a linear discrete time series. Comment: a linear discrete time series is a moving average time series of infinite order.

53
The AR(1) time series: let $\{x_t : t \in T\}$ be defined by the equation
$$x_t = \beta_1 x_{t-1} + u_t.$$
Then
$$x_t = u_t + \beta_1 u_{t-1} + \beta_1^2 u_{t-2} + \dots = \sum_{s=0}^{\infty} \beta_1^s u_{t-s},$$

54
where $\psi_s = \beta_1^s$ and $|\beta_1| < 1$. An alternative approach uses the backshift operator, $B$, defined by $B x_t = x_{t-1}$. The equation
$$x_t = \beta_1 x_{t-1} + u_t$$
can be written
$$(1 - \beta_1 B)\, x_t = u_t.$$

55
Now since
$$(1 - \beta_1 B)^{-1} = 1 + \beta_1 B + \beta_1^2 B^2 + \beta_1^3 B^3 + \dots,$$
the equation
$$(1 - \beta_1 B)\, x_t = u_t$$
has the equivalent form:
$$x_t = (1 - \beta_1 B)^{-1} u_t = u_t + \beta_1 u_{t-1} + \beta_1^2 u_{t-2} + \dots$$

56
The time series $\{x_t : t \in T\}$ can be written as a linear discrete time series
$$x_t = \sum_{s=0}^{\infty} \psi_s u_{t-s}, \quad \text{where } \psi_s = \beta_1^s.$$
For the general AR(p) time series
$$\beta(B)\, x_t = u_t, \quad \text{where } \beta(B) = 1 - \beta_1 B - \beta_2 B^2 - \dots - \beta_p B^p,$$
$[\beta(B)]^{-1}$ can be found by carrying out the multiplication
$$\left(1 - \beta_1 B - \dots - \beta_p B^p\right)\left(\psi_0 + \psi_1 B + \psi_2 B^2 + \dots\right) = 1$$
and equating coefficients.
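Carrying out that multiplication and equating coefficients of powers of $B$ gives the recursion $\psi_s = \sum_{j=1}^{\min(s,p)} \beta_j \psi_{s-j}$ with $\psi_0 = 1$. A sketch, checked against the known AR(1) answer $\psi_s = \beta_1^s$; the coefficient value is illustrative:

```python
import numpy as np

def ar_psi_weights(beta, n):
    """First n psi weights of [beta(B)]^{-1} for beta(B) = 1 - sum_j beta_j B^j."""
    psi = [1.0]
    for s in range(1, n):
        # coefficient of B^s in beta(B) * psi(B) must vanish
        psi.append(sum(beta[j] * psi[s - 1 - j] for j in range(min(s, len(beta)))))
    return np.array(psi)

beta1 = 0.7                        # illustrative AR(1) coefficient
psi = ar_psi_weights([beta1], 8)
print(psi)                         # matches beta1**s: 1, 0.7, 0.49, ...
```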

57
Thus the AR(p) time series
$$\beta(B)\, x_t = u_t$$
can be written
$$x_t = [\beta(B)]^{-1} u_t = \psi(B)\, u_t = \sum_{s=0}^{\infty} \psi_s u_{t-s}, \quad \text{where } \psi(B) = [\beta(B)]^{-1}.$$
This is called the Random Shock form of the series.


59
An ARMA(p, q) time series $\{x_t : t \in T\}$ satisfies the equation
$$\beta(B)\, x_t = \alpha(B)\, u_t,$$
where
$$\beta(B) = 1 - \beta_1 B - \beta_2 B^2 - \dots - \beta_p B^p \quad \text{and} \quad \alpha(B) = 1 + \alpha_1 B + \alpha_2 B^2 + \dots + \alpha_q B^q.$$
The Random Shock form of an ARMA(p, q) time series:
$$x_t = \psi(B)\, u_t.$$

60
Again the time series $\{x_t : t \in T\}$ can be written as a linear discrete time series, namely
$$x_t = \sum_{s=0}^{\infty} \psi_s u_{t-s},$$
where $\psi(B) = [\beta(B)]^{-1}\alpha(B)$ can be found by carrying out the multiplication
$$\left(1 - \beta_1 B - \dots - \beta_p B^p\right)\left(\psi_0 + \psi_1 B + \psi_2 B^2 + \dots\right) = 1 + \alpha_1 B + \dots + \alpha_q B^q$$
and equating coefficients.
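Equating coefficients now gives $\psi_s = \alpha_s + \sum_{j=1}^{\min(s,p)} \beta_j \psi_{s-j}$, with $\alpha_s = 0$ for $s > q$ and $\psi_0 = 1$. A sketch for an illustrative ARMA(1, 1), verified by multiplying $\beta(B)\psi(B)$ back out:

```python
import numpy as np

beta = [0.5]     # beta(B) = 1 - 0.5 B   (illustrative)
alpha = [0.4]    # alpha(B) = 1 + 0.4 B  (illustrative)
n = 10

psi = [1.0]
for s in range(1, n):
    a_s = alpha[s - 1] if s - 1 < len(alpha) else 0.0
    psi.append(a_s + sum(beta[j] * psi[s - 1 - j] for j in range(min(s, len(beta)))))

# check: (1 - 0.5 B)(psi_0 + psi_1 B + ...) should equal 1 + 0.4 B
product = np.convolve([1.0, -beta[0]], psi)[:n]
print(psi[:3])   # [1.0, 0.9, 0.45]
print(product)   # approximately [1, 0.4, 0, 0, ...]
```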

61
Thus an ARMA(p, q) time series can be written
$$x_t = \sum_{s=0}^{\infty} \psi_s u_{t-s}, \quad \text{where } \psi(B) = [\beta(B)]^{-1}\alpha(B).$$

62
The inverted form of a stationary time series: Autoregressive time series of infinite order

63
An ARMA(p, q) time series $\{x_t : t \in T\}$ satisfies the equation
$$\beta(B)\, x_t = \alpha(B)\, u_t,$$
where
$$\beta(B) = 1 - \beta_1 B - \dots - \beta_p B^p \quad \text{and} \quad \alpha(B) = 1 + \alpha_1 B + \dots + \alpha_q B^q.$$
Suppose that $[\alpha(B)]^{-1}$ exists. This will be true if the roots of the polynomial
$$\alpha(x) = 1 + \alpha_1 x + \alpha_2 x^2 + \dots + \alpha_q x^q$$
all exceed 1 in absolute value. The time series $\{x_t : t \in T\}$ in this case is called invertible.

64
Then
$$[\alpha(B)]^{-1}\beta(B)\, x_t = u_t,$$
where
$$\pi(B) = [\alpha(B)]^{-1}\beta(B) = 1 - \pi_1 B - \pi_2 B^2 - \dots,$$
or
$$x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \dots + u_t.$$

65
Thus an ARMA(p, q) time series can be written
$$x_t = \sum_{s=1}^{\infty} \pi_s x_{t-s} + u_t, \quad \text{where } \pi(B) = [\alpha(B)]^{-1}\beta(B).$$
This is called the inverted form of the time series. It expresses the time series as an autoregressive time series of infinite order.
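The $\pi$ weights can be obtained the same way, by expanding $[\alpha(B)]^{-1}\beta(B)$ as a power series in $B$ and equating coefficients. A sketch for an illustrative ARMA(1, 1), verified by multiplying back by $\alpha(B)$:

```python
import numpy as np

beta = [0.5]     # beta(B) = 1 - 0.5 B   (illustrative)
alpha = [0.4]    # alpha(B) = 1 + 0.4 B  (illustrative)
n = 10

# d(B) = [alpha(B)]^{-1} beta(B) = 1 - pi_1 B - pi_2 B^2 - ...
b = [1.0] + [-bj for bj in beta] + [0.0] * (n - len(beta) - 1)
d = [1.0]
for s in range(1, n):
    d.append(b[s] - sum(alpha[j] * d[s - 1 - j] for j in range(min(s, len(alpha)))))
pi_w = [-ds for ds in d[1:]]   # x_t = pi_1 x_{t-1} + pi_2 x_{t-2} + ... + u_t

# check: alpha(B) d(B) should reproduce beta(B)
product = np.convolve([1.0] + alpha, d)[:n]
print(pi_w[:3])   # [0.9, -0.36, 0.144]
print(product)    # approximately [1, -0.5, 0, 0, ...]
```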
