
1 Analysis of Financial Data Spring 2012
Lecture 4: Time Series Models - 1
Priyantha Wijayatunga, Department of Statistics, Umeå University
Priyantha.wijayatunga@stat.umu.se
Course homepage: http://www8.stat.umu.se/kursweb/vt012/staa2st017mom2/

2 Stochastic Process
Time series models provide a way to examine the time variation of, for example, log-returns of a stock, stock prices, etc., which may show a constant long-term trend, seasonality, etc.
A time series is a realization (sample) from a stochastic process – a sequence of random variables with a time order.
A time series model may have parameters that define it. A time series model without any excess of parameters is a parsimonious model.
Usually time series models assume stationarity.

3 Stationarity
A time series may have the same fluctuations from period to period – the same random behaviour over time.
Log-returns of a stock from year to year may have a similar mean and standard deviation (a time-invariant behaviour). Usually stock prices themselves cannot have similar random fluctuations, due to inflation, etc.
Seasonal variations (seasonal effects) are rare in financial time series, as opposed to econometric time series such as sun-cream sales figures.

4 Stationarity
A time series is said to be strictly stationary if the probability distribution of one part of the series is the same as that of any other part with the same duration: if the time series is written as {X_t : t = 1, ..., N}, the joint probability distribution of k observations X_{n+1}, ..., X_{n+k} is the same as that of X_{m+1}, ..., X_{m+k}, where n and m refer to arbitrary time points. The probability distribution is invariant to the time origin.
This is a very strong assumption: all aspects of the two probability distributions must be the same!
Weak stationarity: the mean, variance and covariances are unchanged over time (skewness, kurtosis, quantiles may change!). The autocorrelation then depends only on the time distance (lag).

5 Autocorrelation Function
Stock prices may not be weakly stationary, but changes in stock prices can be weakly stationary.
Write the time series as {X_t : t = 1, ..., N}. For a weakly stationary series the autocovariance function is γ(h) = Cov(X_t, X_{t+h}) and the autocorrelation function (ACF) is ρ(h) = γ(h) / γ(0). Both depend only on the time distance (lag) h, not on the time point t.
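
As a quick illustration (a minimal Python sketch using numpy and statsmodels, with a simulated random-walk "price" series rather than real data): the sample ACF of the prices decays very slowly, while the ACF of the price changes is near zero at all nonzero lags.

```python
import numpy as np
from statsmodels.tsa.stattools import acf

# Simulated prices: a random walk is not weakly stationary,
# but its first differences are.
rng = np.random.default_rng(0)
prices = 100 + rng.normal(size=1000).cumsum()

print(np.round(acf(prices, nlags=5), 2))           # close to 1 at every lag
print(np.round(acf(np.diff(prices), nlags=5), 2))  # near 0 beyond lag 0
```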

6 Making a Time Series Stationary
Write the time series as {X_t : t = 1, ..., N}. The difference operator is ΔX_t = X_t − X_{t−1}.
We can then check whether the differenced time series is weakly stationary.
We can continue differencing (ΔΔX_t = ΔX_t − ΔX_{t−1}, and so on); usually 1 or 2 differencings are enough!
Compare the variances of the differenced time series with each other – the smaller the variance, the better.

7 Making a Time Series Stationary

 t    X_t     ΔX_t    ΔΔX_t
 1    67.30
 2    65.10   -2.20
 3    66.20    1.10    3.30
 4    62.90   -3.30   -4.40
 5    63.35    0.45    3.75
 6    62.80   -0.55   -1.00
 7    62.60   -0.20    0.35
 8    61.65   -0.95   -0.75
 9    62.40    0.75    1.70
 ...

Descriptive statistics:

         N    Std. deviation   Variance
 ΔX_t    20      1.16838         1.365
 ΔΔX_t   19      2.05408         4.219
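
The computation behind the table can be sketched in Python. Note the caveat: only the nine prices shown above are used, whereas the slide's descriptive statistics come from the full series of 21 observations, so the variances printed here will not match 1.365 and 4.219 exactly. Since ΔX_t has the smaller variance, a single differencing suffices for this series.

```python
import numpy as np

# First nine prices from the table above (the full series has 21 points).
x = np.array([67.30, 65.10, 66.20, 62.90, 63.35,
              62.80, 62.60, 61.65, 62.40])

d1 = np.diff(x)        # ΔX_t  = X_t - X_{t-1}
d2 = np.diff(x, n=2)   # ΔΔX_t = ΔX_t - ΔX_{t-1}

print(d1)  # -2.20  1.10 -3.30  0.45 -0.55 -0.20 -0.95  0.75
print(d2)  #  3.30 -4.40  3.75 -1.00  0.35 -0.75  1.70

# Sample variances: the smaller one indicates the better differencing order.
print(np.var(d1, ddof=1), np.var(d2, ddof=1))
```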

8 Weak White Noise Process
A process {ε_t} is a weak white noise process if it has a constant mean, a constant variance, and zero correlation between values at different times: E(ε_t) = µ, Var(ε_t) = σ², and Cov(ε_t, ε_s) = 0 for t ≠ s.
Weak white noise processes are weakly stationary.
They are useful in time series modelling as building blocks for other processes.
Because there is no dependence from the past to the present, we cannot predict the future from the past.
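
A hedged sketch: i.i.d. Gaussian noise is one example of a weak white noise process; its sample ACF should be near zero at every nonzero lag.

```python
import numpy as np
from statsmodels.tsa.stattools import acf

# i.i.d. N(0, 1) draws: zero mean, constant variance, no autocorrelation.
rng = np.random.default_rng(1)
eps = rng.normal(loc=0.0, scale=1.0, size=1000)

print(np.round(acf(eps, nlags=5), 3))  # lag 0 is 1 by definition; the rest ~ 0
```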

9 Parameter Estimation in a Stationary Process
The natural estimates are the sample mean X̄ = (1/n) Σ X_t, the sample autocovariance γ̂(h) = (1/n) Σ_{t=1}^{n−h} (X_{t+h} − X̄)(X_t − X̄), and the sample autocorrelation ρ̂(h) = γ̂(h) / γ̂(0).
Note: for the sample autocovariance function one may use 1/(n−h) instead of 1/n, but when n ≫ h there is only a small difference between the two estimates.
A stationary process is "somewhat" parsimonious. But it is not fully so, as we have infinitely many parameters ρ(1), ρ(2), ρ(3), ... We will look at a class of models that are parsimonious.
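
A minimal sketch of the two autocovariance estimators on simulated data (the helper name autocov is our own, not from any library):

```python
import numpy as np

def autocov(x, h, denom="n"):
    """Sample autocovariance at lag h, scaled by 1/n (standard)
    or by 1/(n-h) (the alternative mentioned above)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xm = x - x.mean()
    s = np.dot(xm[h:], xm[:n - h])
    return s / n if denom == "n" else s / (n - h)

x = np.random.default_rng(0).normal(size=500)
for h in (1, 5, 20):
    # The two estimates nearly coincide when n >> h.
    print(h, autocov(x, h), autocov(x, h, denom="n-h"))
```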

10 Autoregressive Models (AR)
The time series is modelled so that the present observation is a weighted sum of past observations plus a white noise term, i.e., as a regression model. The AR(1) model is
X_t − µ = ϕ(X_{t−1} − µ) + ε_t,
where {ε_t} is a weak white noise process.
Note that this is the simple linear regression of X_t on X_{t−1}.
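
A sketch of this regression view on simulated AR(1) data: fitting X_t on X_{t−1} by ordinary least squares recovers ϕ (here µ = 0 and ϕ = 0.6 are chosen for illustration).

```python
import numpy as np

rng = np.random.default_rng(2)
phi_true, n = 0.6, 2000

# Simulate an AR(1) series with mean 0.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal()

# Simple linear regression of X_t on X_{t-1}; polyfit returns [slope, intercept].
phi_hat = np.polyfit(x[:-1], x[1:], deg=1)[0]
print(phi_hat)  # close to 0.6
```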

11 Properties of AR(1)
Iterating the AR(1) equation backwards gives
X_t − µ = ε_t + ϕ ε_{t−1} + ϕ² ε_{t−2} + ... = Σ_{j=0}^∞ ϕ^j ε_{t−j},
so the value at time t is just a weighted sum of all past noises.
The AR(1) process is stationary when |ϕ| < 1; then E(X_t) = µ and Var(X_t) = σ² / (1 − ϕ²).

12 Autocorrelation Function of AR(1)
The autocorrelation function (ACF) of a stationary AR(1) is ρ(h) = ϕ^h, so it depends only on ϕ (when |ϕ| < 1): it decays geometrically toward zero, monotonically for 0 < ϕ < 1 and with alternating signs for −1 < ϕ < 0. [The slide shows plots of these possible shapes.]
If the sample ACF (SACF) does not look like one of these shapes, then an AR(1) model is not suitable for the data.
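
The two shapes can be tabulated directly from ρ(h) = ϕ^h (a small sketch, standing in for the slide's original plots):

```python
import numpy as np

# Theoretical ACF of a stationary AR(1): rho(h) = phi**h.
h = np.arange(9)
for phi in (0.8, -0.8):
    print(phi, np.round(phi ** h, 3))
# phi =  0.8: geometric decay toward 0
# phi = -0.8: same magnitudes, but alternating in sign
```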

13 Nonstationary AR(1)
When |ϕ| ≥ 1 the AR(1) process is nonstationary (the variance and autocorrelations are not finite in the limit).
When ϕ = 1 the AR(1) process is a random walk: X_t = X_{t−1} + ε_t, so X_t = X_0 + ε_1 + ... + ε_t.
The process is not stationary: Var(X_t) = t σ², i.e., the variance increases linearly with time.
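
A simulation sketch of the linear variance growth (assuming σ² = 1 and X_0 = 0):

```python
import numpy as np

# Random walk: X_t = X_{t-1} + eps_t, so Var(X_t) = t * sigma^2.
rng = np.random.default_rng(3)
paths = rng.normal(size=(10000, 100)).cumsum(axis=1)  # 10000 walks, 100 steps

# Cross-sectional variance at t = 10, 50, 100: roughly 10, 50, 100.
print(paths[:, [9, 49, 99]].var(axis=0))
```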

14 Explosive AR(1)
When |ϕ| > 1, suppose the AR(1) process starts at X_0, with µ = 0. Then X_t = ϕ^t X_0 + Σ_{j=0}^{t−1} ϕ^j ε_{t−j}.
The process is not stationary: the variance increases geometrically with time. Explosive!
This AR process is not very useful, because most time series do not have this property.
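
A corresponding sketch for the explosive case (ϕ = 1.05 chosen for illustration):

```python
import numpy as np

# Explosive AR(1) (phi > 1): the cross-sectional standard deviation
# grows roughly like phi**t, i.e. geometrically.
rng = np.random.default_rng(9)
phi, steps, n_paths = 1.05, 200, 2000

x = np.zeros(n_paths)
for t in range(1, steps + 1):
    x = phi * x + rng.normal(size=n_paths)
    if t in (50, 100, 200):
        print(t, x.std())  # each value roughly phi**50 times the previous
```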

15 Partial Autocorrelation Function
Partial autocorrelation at lag p: the excess correlation between X_t and X_{t−p} (in an AR(p) model) that is not accounted for by X_{t−1} through X_{t−p+1}.
The partial autocorrelation function (PACF) "cuts off" at lag p for a time series that fits an AR(p) model.
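
A sketch using statsmodels' pacf on a simulated AR(1) series: the cut-off after lag 1 is exactly what the "cut-off at lag p" rule predicts for p = 1.

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

# Simulate an AR(1) series with phi = 0.7.
rng = np.random.default_rng(4)
x = np.zeros(3000)
for t in range(1, 3000):
    x[t] = 0.7 * x[t - 1] + rng.normal()

# Sample PACF: lag 1 near 0.7, lags 2 and beyond near 0.
print(np.round(pacf(x, nlags=5), 3))
```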

16 Residuals for Model Checking
Residuals are the observed errors: the differences between the observed values and the model's fitted values.
Residuals should be random and close to zero for a good model.
Plot the residuals to look for patterns and violations of the above properties.
Check the autocorrelation (non-independence) of the residuals by making an autocorrelogram.
Apply the Box-Ljung test to the residuals to check whether their autocorrelations are zero.
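
A hedged sketch of this workflow with statsmodels on simulated data (an illustration, not the course's own software): fit an AR(1), then run the Box-Ljung test on the residuals.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

# Simulate an AR(1) series, then fit the correct model.
rng = np.random.default_rng(5)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + rng.normal()

res = ARIMA(x, order=(1, 0, 0)).fit()

# Box-Ljung test up to lag 10 (model_df=1 accounts for the fitted AR term).
# A large p-value means no evidence of leftover residual autocorrelation.
print(acorr_ljungbox(res.resid, lags=[10], model_df=1))
```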

17 Moving Average Model
X_t follows a moving average model of order 1, MA(1), if X_t = µ + ε_t + θ ε_{t−1}, where {ε_t} is a weak white noise process.
That is, the present value does not depend on the past value(s) of the series itself but is a weighted sum of present and past values of a white noise process.
The mean is constant and the autocorrelation function depends only on the lag, not on time: ρ(1) = θ / (1 + θ²) and ρ(h) = 0 for h ≥ 2.
An MA(1) process is unconditionally stationary.
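
A sketch verifying the MA(1) autocorrelation structure on simulated data (θ = 0.7 chosen for illustration, so the theoretical lag-1 autocorrelation is 0.7 / (1 + 0.7²) ≈ 0.47):

```python
import numpy as np
from statsmodels.tsa.stattools import acf

# MA(1) with mu = 0: X_t = eps_t + 0.7 * eps_{t-1}.
rng = np.random.default_rng(6)
eps = rng.normal(size=5001)
x = eps[1:] + 0.7 * eps[:-1]

# Sample ACF: lag 1 near 0.47, lags 2 and beyond near 0.
print(np.round(acf(x, nlags=4), 3))
```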

18 Moving Average Process of Order q
X_t follows a moving average model of order q, MA(q), if X_t = µ + ε_t + θ_1 ε_{t−1} + ... + θ_q ε_{t−q}.
The problem with estimating the θ-coefficients is that, since the ε-values are not observed, both must be estimated simultaneously.
In an MA process the present value depends on unobserved factors.
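
A sketch of the joint estimation, here done by maximum likelihood via statsmodels' ARIMA (an illustration; the slide does not prescribe a particular estimator):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulate an MA(2) series: X_t = eps_t + 0.5*eps_{t-1} + 0.3*eps_{t-2}.
rng = np.random.default_rng(7)
eps = rng.normal(size=3002)
x = eps[2:] + 0.5 * eps[1:-1] + 0.3 * eps[:-2]

# Theta-coefficients and the noise variance are estimated jointly.
print(ARIMA(x, order=(0, 0, 2)).fit().params)  # MA terms near 0.5 and 0.3
```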

19 AR(1) and MA
The AR(1) process with mean 0 can be written as
X_t = ϕ X_{t−1} + ε_t = ε_t + ϕ ε_{t−1} + ϕ² ε_{t−2} + ... = Σ_{j=0}^∞ ϕ^j ε_{t−j}.
So AR(1) is a weighted sum of all past white noise values – an MA(∞) process!
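
A numerical check of this identity (truncating the infinite sum at j = t and starting the recursion at X_0 = ε_0, so the two constructions match exactly):

```python
import numpy as np

rng = np.random.default_rng(8)
phi, n = 0.6, 200
eps = rng.normal(size=n)

# Recursive AR(1) construction: X_t = phi * X_{t-1} + eps_t.
x_rec = np.empty(n)
x_rec[0] = eps[0]
for t in range(1, n):
    x_rec[t] = phi * x_rec[t - 1] + eps[t]

# MA representation: X_t = sum_{j=0}^{t} phi**j * eps_{t-j}.
x_ma = np.array([sum(phi ** j * eps[t - j] for j in range(t + 1))
                 for t in range(n)])

print(np.allclose(x_rec, x_ma))  # True: the two representations coincide
```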

