1 Stationary Time Series AMS 586

2 The Moving Average Time Series of order q, MA(q)

Let {Z_t | t ∈ T} denote a white noise time series with variance σ². Let {X_t | t ∈ T} be defined by the equation

X_t = θ_0 Z_t + θ_1 Z_{t-1} + θ_2 Z_{t-2} + ... + θ_q Z_{t-q} + μ.

Then {X_t | t ∈ T} is called a Moving Average time series of order q (denoted by MA(q)).

3 The mean value for an MA(q) time series: E[X_t] = μ.

The autocovariance function for an MA(q) time series:

γ(h) = σ² (θ_0 θ_h + θ_1 θ_{h+1} + ... + θ_{q-h} θ_q) for 0 ≤ h ≤ q, and γ(h) = 0 for h > q.

The autocorrelation function for an MA(q) time series: ρ(h) = γ(h) / γ(0).

4 The autocorrelation function for an MA(q) time series

Comment: ρ(h) cuts off to zero after lag q.
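As a quick numerical illustration (a sketch, not part of the original slides; the coefficients θ and σ² below are made up), the MA(q) autocovariance formula and the cutoff after lag q can be checked directly:

```python
# Autocovariance of an MA(q) process:
# gamma(h) = sigma^2 * sum_{i=0}^{q-h} theta_i * theta_{i+h} for h <= q, else 0.
# The coefficients theta and sigma2 below are illustrative, not from the slides.

def ma_autocovariance(theta, sigma2, h):
    """theta = [theta_0, ..., theta_q]; returns gamma(h)."""
    q = len(theta) - 1
    if h > q:
        return 0.0          # the autocovariance cuts off after lag q
    return sigma2 * sum(theta[i] * theta[i + h] for i in range(q - h + 1))

theta = [1.0, 0.6, -0.3]    # an MA(2) with theta_0 = 1 (illustrative values)
sigma2 = 1.0
gamma = [ma_autocovariance(theta, sigma2, h) for h in range(5)]
rho = [g / gamma[0] for g in gamma]
print(rho)   # rho(3) = rho(4) = 0: the ACF cuts off after lag q = 2
```

The same function works for any q; only the first q + 1 autocovariances are nonzero.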

5 The Autoregressive Time Series of order p, AR(p)

Let {Z_t | t ∈ T} be a white noise time series with variance σ². Let {X_t | t ∈ T} be defined by the equation

X_t = φ_1 X_{t-1} + φ_2 X_{t-2} + ... + φ_p X_{t-p} + δ + Z_t.

Then {X_t | t ∈ T} is called an Autoregressive time series of order p (denoted by AR(p)).

6 The mean of a stationary AR(p)

Assuming {X_t | t ∈ T} is stationary, take expectations of the defining equation to obtain the mean μ:

μ = φ_1 μ + φ_2 μ + ... + φ_p μ + δ, so that μ = δ / (1 − φ_1 − φ_2 − ... − φ_p).

Now we can center (remove the mean of) the time series as follows:

X_t − μ = φ_1 (X_{t-1} − μ) + ... + φ_p (X_{t-p} − μ) + Z_t.

7 Computing the autocovariance of a stationary AR(p)

Now assume {X_t | t ∈ T} is stationary with mean zero:

X_t = φ_1 X_{t-1} + ... + φ_p X_{t-p} + Z_t.

Note, for a zero-mean sequence: γ(h) = E[X_t X_{t-h}].

Multiplying by X_{t-h}, h ≥ 0, and taking expectations of the equation, we obtain the Yule-Walker equations for the autocovariance.

8 The autocovariance function γ(h) of a stationary AR(p) series satisfies the equations:

For h > 0, we have: γ(h) = φ_1 γ(h-1) + φ_2 γ(h-2) + ... + φ_p γ(h-p).

For h = 0, we have: γ(0) = φ_1 γ(1) + φ_2 γ(2) + ... + φ_p γ(p) + σ².

9 The autocorrelation function ρ(h) of a stationary AR(p) series satisfies the equations:

ρ(h) = φ_1 ρ(h-1) + φ_2 ρ(h-2) + ... + φ_p ρ(h-p) for h > 0,

with ρ(0) = 1 and ρ(-h) = ρ(h); in particular the same homogeneous recursion holds for h > p.

10 or:

ρ(h) = c_1 r_1^{-h} + c_2 r_2^{-h} + ... + c_p r_p^{-h},

where r_1, r_2, …, r_p are the roots of the polynomial

φ(x) = 1 − φ_1 x − φ_2 x² − ... − φ_p x^p,

and c_1, c_2, …, c_p are determined by using the starting values of the sequence ρ(h).

11 Conditions for stationarity of an Autoregressive Time Series of order p, AR(p)

12 Consider the AR(1) case, X_t = φ_1 X_{t-1} + δ + Z_t.

If |φ_1| > 1, the value of X_t increases in magnitude and Z_t eventually becomes negligible: the time series {X_t | t ∈ T} exhibits deterministic behavior.

If φ_1 = 1 and δ = 0, the time series {X_t | t ∈ T} satisfies the equation X_t = X_{t-1} + Z_t (a random walk): non-stationary random behavior.

13 For an AR(p) time series, consider the polynomial

φ(x) = 1 − φ_1 x − φ_2 x² − ... − φ_p x^p

with roots r_1, r_2, …, r_p. Then:

{X_t | t ∈ T} is stationary if |r_i| > 1 for all i.

If |r_i| < 1 for at least one i, then {X_t | t ∈ T} exhibits deterministic behavior.

If |r_i| ≥ 1 for all i and |r_i| = 1 for at least one i, then {X_t | t ∈ T} exhibits non-stationary random behavior.
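The root condition is easy to check numerically. The sketch below (the AR coefficients are illustrative, not from the slides) finds the roots of φ(x) with numpy and tests whether they all lie outside the unit circle:

```python
import numpy as np

# Stationarity check for an AR(p) via the roots of
# phi(x) = 1 - phi_1 x - ... - phi_p x^p.
# The coefficient values below are illustrative.

def ar_is_stationary(phi):
    """phi = [phi_1, ..., phi_p]; stationary iff all roots of phi(x) lie outside the unit circle."""
    # numpy.roots expects coefficients from the highest degree down:
    # -phi_p x^p - ... - phi_1 x + 1
    coeffs = [-c for c in reversed(phi)] + [1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(ar_is_stationary([0.5, 0.3]))   # a stationary AR(2)
print(ar_is_stationary([1.2]))        # explosive AR(1): root 1/1.2 is inside the unit circle
```

For an AR(1) with coefficient φ₁ the single root is 1/φ₁, so the function reduces to the familiar condition |φ₁| < 1.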

14 Since

ρ(h) = c_1 r_1^{-h} + c_2 r_2^{-h} + ... + c_p r_p^{-h}

and |r_1| > 1, |r_2| > 1, …, |r_p| > 1 for a stationary AR(p) series, we have ρ(h) → 0 as h → ∞, i.e. the autocorrelation function ρ(h) of a stationary AR(p) series tails off to zero.

15 Special Cases: The AR(1) time series

Let {X_t | t ∈ T} be defined by the equation

X_t = φ_1 X_{t-1} + δ + Z_t.

16 Consider the polynomial φ(x) = 1 − φ_1 x with root r_1 = 1/φ_1.

1. {X_t | t ∈ T} is stationary if |r_1| > 1, i.e. |φ_1| < 1.
2. If |r_1| < 1, i.e. |φ_1| > 1, then {X_t | t ∈ T} exhibits deterministic behavior.
3. If |r_1| = 1, i.e. |φ_1| = 1, then {X_t | t ∈ T} exhibits non-stationary random behavior.

17 Special Cases: The AR(2) time series

Let {X_t | t ∈ T} be defined by the equation

X_t = φ_1 X_{t-1} + φ_2 X_{t-2} + δ + Z_t.

18 Consider the polynomial φ(x) = 1 − φ_1 x − φ_2 x², where r_1 and r_2 are the roots of φ(x).

1. {X_t | t ∈ T} is stationary if |r_1| > 1 and |r_2| > 1. This is true if

φ_1 + φ_2 < 1, φ_2 − φ_1 < 1 and |φ_2| < 1.

These inequalities define a triangular region for φ_1 and φ_2.

2. If |r_i| < 1 for at least one i, then {X_t | t ∈ T} exhibits deterministic behavior.
3. If |r_i| ≥ 1 for i = 1, 2 and |r_i| = 1 for at least one i, then {X_t | t ∈ T} exhibits non-stationary random behavior.
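The equivalence between the root condition and the triangular region can be checked numerically; the sketch below (grid values are arbitrary) compares the two criteria over a grid of (φ₁, φ₂) pairs:

```python
import numpy as np

# AR(2) stationarity: the root condition |r_1|, |r_2| > 1 for
# phi(x) = 1 - phi1 x - phi2 x^2 is equivalent to the triangle
# phi1 + phi2 < 1,  phi2 - phi1 < 1,  |phi2| < 1.
# The grid of coefficient values is illustrative.

def roots_outside(phi1, phi2):
    roots = np.roots([-phi2, -phi1, 1.0])
    return bool(np.all(np.abs(roots) > 1.0))

def in_triangle(phi1, phi2):
    return phi1 + phi2 < 1 and phi2 - phi1 < 1 and abs(phi2) < 1

# The two criteria agree at every grid point (none lies on a boundary):
for phi1 in np.linspace(-1.8, 1.8, 13):
    for phi2 in np.linspace(-1.3, 1.3, 13):
        assert roots_outside(phi1, phi2) == in_triangle(phi1, phi2)
print("root condition and triangle region agree on the grid")
```

The triangle form is convenient in practice because it avoids computing roots at all.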

19 Patterns of the ACF and PACF of AR(2) Time Series

(Figure: in the shaded region of the (φ_1, φ_2) triangle the roots of the AR operator are complex.)

20 The Mixed Autoregressive Moving Average Time Series of order p and q: the ARMA(p,q) series

21 The Mixed Autoregressive Moving Average Time Series of order p and q, ARMA(p,q)

Let φ_1, φ_2, …, φ_p, θ_1, θ_2, …, θ_q, δ denote p + q + 1 numbers (parameters).

Let {Z_t | t ∈ T} denote a white noise time series with variance σ²: uncorrelated, mean 0, variance σ².

Let {X_t | t ∈ T} be defined by the equation

X_t = φ_1 X_{t-1} + ... + φ_p X_{t-p} + δ + Z_t + θ_1 Z_{t-1} + ... + θ_q Z_{t-q}.

Then {X_t | t ∈ T} is called a Mixed Autoregressive-Moving Average time series (an ARMA(p,q) series).

22 Mean value, variance, autocovariance function and autocorrelation function of an ARMA(p,q) series

23 Similar to an AR(p) time series, for certain values of the parameters φ_1, …, φ_p an ARMA(p,q) time series may not be stationary.

An ARMA(p,q) time series is stationary if the roots (r_1, r_2, …, r_p) of the polynomial φ(x) = 1 − φ_1 x − φ_2 x² − ... − φ_p x^p satisfy |r_i| > 1 for all i.

24 Assume that the ARMA(p,q) time series {X_t | t ∈ T} is stationary.

Let μ = E(X_t). Then μ = φ_1 μ + ... + φ_p μ + δ, or

μ = δ / (1 − φ_1 − ... − φ_p).

25 The autocovariance function γ(h) of a stationary mixed autoregressive-moving average time series {X_t | t ∈ T} can be determined by multiplying the (centered) defining equation by X_{t-h} and taking expectations. Thus

γ(h) = φ_1 γ(h-1) + ... + φ_p γ(h-p) + E[Z_t X_{t-h}] + θ_1 E[Z_{t-1} X_{t-h}] + ... + θ_q E[Z_{t-q} X_{t-h}].

26 Hence only the terms E[Z_{t-i} X_{t-h}] with i ≥ h survive, since E[Z_{t-i} X_{t-h}] = 0 for i < h (X_{t-h} depends only on the noise terms Z_s with s ≤ t − h).


28 We need to calculate:

E[Z_t X_t], E[Z_{t-1} X_t], E[Z_{t-2} X_t], etc.

29 The autocovariance function γ(h) satisfies:

For h = 0, 1, …, q: γ(h) = φ_1 γ(h-1) + ... + φ_p γ(h-p) + correction terms involving σ², the θ's and the cross-expectations above.

For h > q: γ(h) = φ_1 γ(h-1) + ... + φ_p γ(h-p).

30 We then use the first (p + 1) equations to determine γ(0), γ(1), γ(2), …, γ(p).

We use the subsequent equations to determine γ(h) for h > p.

31 Example: the autocovariance function γ(h) for an ARMA(1,1) time series.

For h = 0, 1:

γ(0) = φ_1 γ(1) + σ² [1 + θ_1 (φ_1 + θ_1)]
γ(1) = φ_1 γ(0) + σ² θ_1

For h > 1:

γ(h) = φ_1 γ(h-1).

32 Substituting γ(0) into the second equation and solving the resulting pair, we get:

γ(0) = σ² (1 + 2 φ_1 θ_1 + θ_1²) / (1 − φ_1²).

Substituting γ(1) into the first equation, we get:

γ(1) = σ² (1 + φ_1 θ_1)(φ_1 + θ_1) / (1 − φ_1²).

33 For h > 1:

γ(h) = φ_1 γ(h-1) = φ_1^{h-1} γ(1).
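The closed forms above can be sanity-checked numerically: plugging them back into the two h = 0, 1 equations should give an identity. The parameter values below are illustrative, not from the slides:

```python
# Numerical check of the ARMA(1,1) autocovariance formulas:
#   gamma(0) = sigma^2 (1 + 2 phi theta + theta^2) / (1 - phi^2)
#   gamma(1) = sigma^2 (1 + phi theta)(phi + theta) / (1 - phi^2)
#   gamma(h) = phi * gamma(h-1) for h > 1.
# Parameter values are illustrative.

phi, theta, sigma2 = 0.7, 0.4, 1.0

g0 = sigma2 * (1 + 2 * phi * theta + theta**2) / (1 - phi**2)
g1 = sigma2 * (1 + phi * theta) * (phi + theta) / (1 - phi**2)

# The closed forms satisfy the two equations from the derivation:
eq0 = phi * g1 + sigma2 * (1 + theta * (phi + theta))   # should equal gamma(0)
eq1 = phi * g0 + sigma2 * theta                         # should equal gamma(1)
print(abs(g0 - eq0), abs(g1 - eq1))                     # both are ~0

# For h > 1 the autocovariance decays geometrically:
gammas = [g0, g1]
for h in range(2, 6):
    gammas.append(phi * gammas[-1])
```

Because γ(h) = φ₁^{h-1} γ(1) for h > 1, the ACF of an ARMA(1,1) tails off geometrically after lag 1.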

34 The Backshift Operator B

35 Consider the time series {X_t : t ∈ T} and let M denote the linear space spanned by the set of random variables {X_t : t ∈ T} (i.e. all linear combinations of elements of {X_t : t ∈ T} and their limits in mean square). M is a vector space.

Let B be an operator on M defined by BX_t = X_{t-1}. B is called the backshift operator.

36 Note:

1. B is a linear operator on M.
2. We can also define the operator B^k with B^k X_t = B(B(...B X_t)) = X_{t-k}.
3. The polynomial operator p(B) = c_0 I + c_1 B + c_2 B² + ... + c_k B^k can also be defined by the equation

p(B) X_t = (c_0 I + c_1 B + c_2 B² + ... + c_k B^k) X_t
         = c_0 I X_t + c_1 B X_t + c_2 B² X_t + ... + c_k B^k X_t
         = c_0 X_t + c_1 X_{t-1} + c_2 X_{t-2} + ... + c_k X_{t-k}.

37 4. The power series operator p(B) = c_0 I + c_1 B + c_2 B² + ... can also be defined by the equation

p(B) X_t = (c_0 I + c_1 B + c_2 B² + ...) X_t = c_0 X_t + c_1 X_{t-1} + c_2 X_{t-2} + ...

5. If p(B) = c_0 I + c_1 B + c_2 B² + ... and q(B) = b_0 I + b_1 B + b_2 B² + ... are such that p(B) q(B) = I, i.e. p(B) q(B) X_t = I X_t = X_t, then q(B) is denoted by [p(B)]^{-1}.

38 Other operators closely related to B:

1. F = B^{-1}, the forward shift operator, defined by F X_t = B^{-1} X_t = X_{t+1}; and
2. ∇ = I − B, the first difference operator, defined by ∇X_t = (I − B) X_t = X_t − X_{t-1}.
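On a finite sample path these operators are just index shifts and pairwise differences; the sketch below (function names are mine, not from the slides) illustrates B and ∇ = I − B on a small series:

```python
# The backshift and first-difference operators on a finite sample path,
# as a sketch. Function names and the sample values are illustrative.

def backshift(x, k=1):
    """(B^k x)_t = x_{t-k}; the k entries with no predecessor are dropped."""
    return x[:-k] if k > 0 else list(x)

def difference(x):
    """(nabla x)_t = x_t - x_{t-1}, i.e. the operator I - B."""
    return [x[t] - x[t - 1] for t in range(1, len(x))]

x = [2.0, 5.0, 4.0, 7.0, 11.0]
print(backshift(x))    # the lagged path [2.0, 5.0, 4.0, 7.0]
print(difference(x))   # [3.0, -1.0, 3.0, 4.0]
```

Applying `difference` twice gives ∇² = (I − B)², the second difference, and so on.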

39 The equation for an MA(q) time series

X_t = θ_0 Z_t + θ_1 Z_{t-1} + ... + θ_q Z_{t-q} + μ

can be written

X_t = θ(B) Z_t + μ, where θ(B) = θ_0 I + θ_1 B + θ_2 B² + ... + θ_q B^q.

40 The equation for an AR(p) time series

X_t = φ_1 X_{t-1} + ... + φ_p X_{t-p} + δ + Z_t

can be written

φ(B) X_t = δ + Z_t, where φ(B) = I − φ_1 B − φ_2 B² − ... − φ_p B^p.

41 The equation for an ARMA(p,q) time series

X_t = φ_1 X_{t-1} + ... + φ_p X_{t-p} + δ + Z_t + θ_1 Z_{t-1} + ... + θ_q Z_{t-q}

can be written

φ(B) X_t = θ(B) Z_t + δ,

where θ(B) = I + θ_1 B + ... + θ_q B^q and φ(B) = I − φ_1 B − ... − φ_p B^p.

42 Some comments about the backshift operator B:

1. It is a useful notational device, allowing us to write the equations for MA(q), AR(p) and ARMA(p,q) time series in a very compact form.
2. It is also useful for making certain computations related to the time series described above.

43 The partial autocorrelation function: a useful tool in time series analysis

44 The partial autocorrelation function

Recall that the autocorrelation function of an AR(p) process satisfies the equation

ρ_x(h) = φ_1 ρ_x(h-1) + φ_2 ρ_x(h-2) + ... + φ_p ρ_x(h-p).

For 1 ≤ h ≤ p these equations (Yule-Walker) become:

ρ_x(1) = φ_1 + φ_2 ρ_x(1) + ... + φ_p ρ_x(p-1)
ρ_x(2) = φ_1 ρ_x(1) + φ_2 + ... + φ_p ρ_x(p-2)
...
ρ_x(p) = φ_1 ρ_x(p-1) + φ_2 ρ_x(p-2) + ... + φ_p.

45 In matrix notation:

R φ = ρ_p,

where R is the p × p matrix with (i, j) entry ρ_x(|i − j|) (so R has 1's on the diagonal), φ = (φ_1, …, φ_p)′ and ρ_p = (ρ_x(1), …, ρ_x(p))′.

These equations can be used to find φ_1, φ_2, …, φ_p, if the time series is known to be AR(p) and the autocorrelation function ρ_x(h) is known.
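Solving the system R φ = ρ_p is a standard linear solve. The sketch below (AR coefficients are illustrative) builds the theoretical ρ(1), ρ(2) of a known AR(2) and recovers the coefficients from the 2 × 2 Yule-Walker system:

```python
import numpy as np

# Solving the Yule-Walker equations R phi = rho for an AR(2), as a sketch.
# We start from a known AR(2), compute its theoretical rho(1), rho(2),
# and recover phi_1, phi_2 by solving the 2x2 system.
# Coefficient values are illustrative, not from the slides.

phi1, phi2 = 0.5, 0.3

# Theoretical ACF of the AR(2), from the Yule-Walker equations themselves:
rho1 = phi1 / (1 - phi2)
rho2 = phi1 * rho1 + phi2

R = np.array([[1.0,  rho1],
              [rho1, 1.0]])            # R[i, j] = rho(|i - j|)
rhs = np.array([rho1, rho2])
phi_hat = np.linalg.solve(R, rhs)
print(phi_hat)                         # recovers [0.5, 0.3]
```

With sample autocorrelations in place of the theoretical ones, the same solve gives the Yule-Walker estimates of the AR coefficients.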

46 If the time series is not autoregressive, the equations can still be used to solve for φ_1, φ_2, …, φ_p, for any value of p ≥ 1. In this case

φ_1^(p), φ_2^(p), …, φ_p^(p)

are the values that minimize the mean square error

E[(X_t − φ_1 X_{t-1} − ... − φ_p X_{t-p})²].

47 Definition: The partial autocorrelation function at lag k is defined to be

Φ_kk = φ_k^(k),

the last coefficient obtained when the Yule-Walker equations above are solved with p = k.

48 Comment: the partial autocorrelation function Φ_kk is determined from the autocorrelation function ρ(h).

49 Some more comments:

1. The partial autocorrelation function at lag k, Φ_kk, can be interpreted as a corrected autocorrelation between X_t and X_{t-k}, conditioning on the intervening variables X_{t-1}, X_{t-2}, ..., X_{t-k+1}.
2. If the time series is an AR(p) time series, then Φ_kk = 0 for k > p.
3. If the time series is an MA(q) time series, then ρ_x(h) = 0 for h > q.

50 A General Recursive Formula for the Autoregressive Parameters and the Partial Autocorrelation Function (PACF)

51 Let Φ_k1, Φ_k2, …, Φ_kk denote the autoregressive parameters of order k satisfying the Yule-Walker equations:

ρ(h) = Φ_k1 ρ(h-1) + Φ_k2 ρ(h-2) + ... + Φ_kk ρ(h-k), h = 1, 2, …, k.

52 Then it can be shown that:

Φ_{k+1,k+1} = [ρ(k+1) − Σ_{j=1}^{k} Φ_kj ρ(k+1−j)] / [1 − Σ_{j=1}^{k} Φ_kj ρ(j)]

and

Φ_{k+1,j} = Φ_kj − Φ_{k+1,k+1} Φ_{k,k+1−j}, j = 1, 2, …, k.

53 Proof: The Yule-Walker equations of order k:

ρ(h) = Φ_k1 ρ(h-1) + ... + Φ_kk ρ(h-k), h = 1, …, k.

54 In matrix form:

R_k Φ_k = ρ_k,

where R_k is the k × k matrix with (i, j) entry ρ(i − j), Φ_k = (Φ_k1, …, Φ_kk)′ and ρ_k = (ρ(1), …, ρ(k))′.

55 The equations for Φ_{k+1} = (Φ_{k+1,1}, …, Φ_{k+1,k+1})′ are the same system of order k + 1:

R_{k+1} Φ_{k+1} = ρ_{k+1}.

56 Let A denote the k × k matrix that reverses order (1's on the anti-diagonal, 0's elsewhere), so that

A ρ_k = (ρ(k), …, ρ(1))′ and A R_k A = R_k.

57 The equations may be written, in partitioned form,

R_k φ* + Φ_{k+1,k+1} A ρ_k = ρ_k and (A ρ_k)′ φ* + Φ_{k+1,k+1} = ρ(k+1),

where φ* = (Φ_{k+1,1}, …, Φ_{k+1,k})′. Multiplying the first equations by R_k^{-1}:

φ* = R_k^{-1} ρ_k − Φ_{k+1,k+1} R_k^{-1} A ρ_k, or φ* = Φ_k − Φ_{k+1,k+1} A Φ_k,

using R_k^{-1} A = A R_k^{-1} (which follows from A R_k A = R_k).

58 Substituting this into the second equation (and using A′ = A, A² = I):

ρ_k′ A Φ_k − Φ_{k+1,k+1} ρ_k′ Φ_k + Φ_{k+1,k+1} = ρ(k+1),

or

Φ_{k+1,k+1} (1 − ρ_k′ Φ_k) = ρ(k+1) − ρ_k′ A Φ_k.

59 Hence

Φ_{k+1,k+1} = [ρ(k+1) − ρ_k′ A Φ_k] / [1 − ρ_k′ Φ_k] = [ρ(k+1) − Σ_{j=1}^{k} Φ_kj ρ(k+1−j)] / [1 − Σ_{j=1}^{k} Φ_kj ρ(j)],

and, from φ* = Φ_k − Φ_{k+1,k+1} A Φ_k,

Φ_{k+1,j} = Φ_kj − Φ_{k+1,k+1} Φ_{k,k+1−j}, j = 1, …, k.
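The recursion above (the Durbin-Levinson algorithm) is easy to implement; the sketch below computes the PACF from a given autocorrelation sequence and checks it on an AR(1), whose ACF ρ(h) = φ₁^h is known in closed form (the value φ₁ = 0.6 is illustrative):

```python
# Durbin-Levinson recursion from the preceding slides, as a sketch:
# given rho(1), ..., rho(n), compute the order-k AR coefficients
# and the PACF values PHI_kk.

def durbin_levinson(rho):
    """rho = [rho(1), rho(2), ...]; returns the list of PACF values PHI_kk."""
    pacf = [rho[0]]                  # PHI_11 = rho(1)
    phi = [rho[0]]                   # current coefficients [PHI_k1, ..., PHI_kk]
    for k in range(1, len(rho)):
        num = rho[k] - sum(phi[j] * rho[k - 1 - j] for j in range(k))
        den = 1.0 - sum(phi[j] * rho[j] for j in range(k))
        phi_kk = num / den
        # PHI_{k+1,j} = PHI_kj - PHI_{k+1,k+1} * PHI_{k,k+1-j}
        phi = [phi[j] - phi_kk * phi[k - 1 - j] for j in range(k)] + [phi_kk]
        pacf.append(phi_kk)
    return pacf

# For an AR(1) with phi_1 = 0.6, rho(h) = 0.6^h, so the PACF should be
# 0.6 at lag 1 and zero at every later lag:
rho = [0.6 ** h for h in range(1, 6)]
print(durbin_levinson(rho))
```

The same function applied to sample autocorrelations gives the sample PACF used for model identification.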

60 Some Examples

61 Example 1: MA(1) time series

Suppose that {X_t | t ∈ T} satisfies the following equation:

X_t = μ + Z_t + θ_1 Z_{t-1},

where {Z_t | t ∈ T} is white noise with σ = 1.1. Find:

1. The mean of the series.
2. The variance of the series.
3. The autocorrelation function.
4. The partial autocorrelation function.

62 Solution

Now {X_t | t ∈ T} satisfies the equation X_t = μ + Z_t + θ_1 Z_{t-1}. Thus:

1. The mean of the series: μ = 12.0.

The autocovariance function for an MA(1) is

γ(0) = σ² (1 + θ_1²), γ(1) = σ² θ_1, γ(h) = 0 for h > 1.

63 Thus:

2. The variance of the series: γ(0) = σ² (1 + θ_1²) = (1.1)² (1 + θ_1²).

3. The autocorrelation function:

ρ(1) = θ_1 / (1 + θ_1²), ρ(h) = 0 for h > 1.

64 4. The partial autocorrelation function at lag k is obtained by solving the Yule-Walker equations of order k with this ρ(h). Thus

Φ_kk = −(−θ_1)^k (1 − θ_1²) / (1 − θ_1^{2(k+1)}),

so |Φ_kk| decays geometrically with k, alternating in sign when θ_1 > 0.
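The closed form can be checked against the low-order expressions Φ₁₁ = ρ(1) and Φ₂₂ = [ρ(2) − ρ(1)²]/[1 − ρ(1)²] (with ρ(2) = 0 for an MA(1)); the value of θ₁ below is illustrative:

```python
# Check of the MA(1) partial autocorrelation formula
#   PHI_kk = -(-theta)^k (1 - theta^2) / (1 - theta^(2(k+1)))
# against PHI_11 = rho(1) and PHI_22 = (rho(2) - rho(1)^2) / (1 - rho(1)^2),
# using rho(2) = 0 for an MA(1). The theta value is illustrative.

theta = 0.5
rho1 = theta / (1 + theta**2)        # rho(1) of the MA(1)

def pacf_ma1(theta, k):
    return -((-theta) ** k) * (1 - theta**2) / (1 - theta ** (2 * (k + 1)))

phi11 = rho1
phi22 = (0.0 - rho1**2) / (1 - rho1**2)
print(pacf_ma1(theta, 1), phi11)     # both equal rho(1)
print(pacf_ma1(theta, 2), phi22)     # both equal -rho(1)^2 / (1 - rho(1)^2)
```

Unlike the ACF, which cuts off after lag 1, the PACF of an MA(1) never vanishes exactly; it only tails off.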


66 Graph: the partial autocorrelation function Φ_kk

(Figure: plot of Φ_kk against lag k.)

67 Exercise: use the recursive method (the Durbin-Levinson formula above) to calculate Φ_kk and the coefficients Φ_kj.


69 Example 2: AR(2) time series

Suppose that {X_t | t ∈ T} satisfies the following equation:

X_t = 0.4 X_{t-1} + φ_2 X_{t-2} + δ + Z_t,

where {Z_t | t ∈ T} is white noise with σ = 2.1. Is the time series stationary?

Find: 1. the mean of the series; 2. the variance of the series; 3. the autocorrelation function; 4. the partial autocorrelation function.

70 1. The mean of the series: μ = δ / (1 − φ_1 − φ_2).

3. The autocorrelation function satisfies the Yule-Walker equations:

ρ(1) = φ_1 + φ_2 ρ(1), ρ(2) = φ_1 ρ(1) + φ_2, and ρ(h) = φ_1 ρ(h-1) + φ_2 ρ(h-2) for h > 2.

71 Hence

ρ(1) = φ_1 / (1 − φ_2) and ρ(2) = φ_2 + φ_1² / (1 − φ_2),

with ρ(h) for h > 2 obtained from the recursion ρ(h) = φ_1 ρ(h-1) + φ_2 ρ(h-2).

72 2. The variance of the series:

γ(0) = σ² / (1 − φ_1 ρ(1) − φ_2 ρ(2)).

4. The partial autocorrelation function:

Φ_11 = ρ(1), Φ_22 = [ρ(2) − ρ(1)²] / [1 − ρ(1)²], and Φ_kk = 0 for k > 2.

73 The partial autocorrelation function of an AR(p) time series cuts off after lag p.
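The cutoff can be seen numerically: if we fit an AR(3) by Yule-Walker to the theoretical ACF of an AR(2), the third coefficient, which equals Φ₃₃, comes out zero. The AR(2) coefficients below are illustrative, not the (partially lost) values from the example:

```python
import numpy as np

# The PACF of an AR(2) cuts off after lag 2: fitting an AR(3) by
# Yule-Walker to its theoretical ACF gives a zero third coefficient
# (which equals PHI_33). Coefficients are illustrative.

phi1, phi2 = 0.5, 0.3

# Theoretical ACF of the AR(2) from the Yule-Walker recursion:
rho = [1.0, phi1 / (1 - phi2)]
for h in range(2, 5):
    rho.append(phi1 * rho[h - 1] + phi2 * rho[h - 2])

R3 = np.array([[rho[abs(i - j)] for j in range(3)] for i in range(3)])
coef3 = np.linalg.solve(R3, np.array(rho[1:4]))
print(coef3)   # the third entry (PHI_33) is zero up to rounding
```

The fitted AR(3) simply reproduces (φ₁, φ₂, 0): extra lags beyond p add nothing for a true AR(p).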

74 Example 3: ARMA(1,2) time series

Suppose that {X_t | t ∈ T} satisfies the following equation:

X_t = 0.4 X_{t-1} + δ + Z_t + θ_1 Z_{t-1} + θ_2 Z_{t-2},

where {Z_t | t ∈ T} is white noise with σ = 1.6. Is the time series stationary?

Find: 1. the mean of the series; 2. the variance of the series; 3. the autocorrelation function; 4. the partial autocorrelation function.

75 Theoretical Patterns of ACF and PACF

Type of Model | Typical Pattern of ACF                                          | Typical Pattern of PACF
AR(p)         | Decays exponentially or with a damped sine wave pattern, or both | Cuts off after lag p
MA(q)         | Cuts off after lag q                                            | Declines exponentially
ARMA(p,q)     | Exponential decay                                               | Exponential decay

76 References

G. E. P. Box, G. M. Jenkins and G. C. Reinsel (1994). Time Series Analysis: Forecasting and Control. Prentice-Hall.

Brockwell, Peter J. and Davis, Richard A. (1991). Time Series: Theory and Methods. Springer-Verlag.

We also thank colleagues who posted their notes as on-line open resources for time series analysis.

