Analysis of Financial Data Spring 2012 Lecture 4: Time Series Models - 1 Priyantha Wijayatunga Department of Statistics, Umeå University

Stochastic Process
Time series models let us study the time variation of, for example, the log-returns of a stock or the stock prices themselves, which may show a long-term trend, seasonality, etc. A time series is a realization (a sample) from a stochastic process: a sequence of random variables ordered in time. A time series model may have parameters that define it; a model without any excess parameters is a parsimonious model. Usually time series models assume stationarity.

Stationarity
A time series may show the same fluctuations from period to period, i.e., the same random behaviour over time. The log-returns of a stock may have a similar mean and standard deviation from year to year (time-invariant behaviour). Usually the stock prices themselves do not have similar random fluctuations over time, due to inflation, etc. Seasonal variations (seasonal effects) are rare in financial time series, as opposed to econometric time series such as suncream-sales figures.

Stationarity
A time series is said to be strictly stationary if the probability distribution of one part of the series is the same as that of any other part of the same duration: if the series is written as {X_t : t = 1,...,N}, the joint distribution of the k observations X_{n+1},...,X_{n+k} is the same as that of X_{m+1},...,X_{m+k}, where n and m are any two time points. The probability distribution is invariant to the time origin. This is a very strong assumption: all aspects of the two probability distributions must be the same! Weak stationarity requires only that the mean, variance and covariances are unchanged over time (the skewness, kurtosis and quantiles may change). The autocorrelation then depends only on the time distance (lag).
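In standard notation (the symbols here are assumptions, since the slide's formulas were not transcribed), the two notions read:

```latex
% Strict stationarity: joint distributions are invariant to a time shift h
(X_{t_1},\dots,X_{t_k}) \overset{d}{=} (X_{t_1+h},\dots,X_{t_k+h})
\quad \text{for all } k,\ t_1<\dots<t_k \text{ and all } h

% Weak stationarity: only the first two moments must be time-invariant
E(X_t)=\mu, \qquad \mathrm{Var}(X_t)=\sigma^2, \qquad
\mathrm{Cov}(X_t,X_{t+h})=\gamma(h)\ \text{(a function of the lag } h \text{ only)}
```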

Autocorrelation Function
Stock prices themselves may not be weakly stationary, but the changes in stock prices can be. Write the time series as {X_t : t = 1,...,N}. Under weak stationarity the autocorrelation depends only on the time distance (lag) between observations.
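A sketch of the usual definitions (notation assumed, as the slide's formulas were not transcribed): for a weakly stationary series the autocovariance and autocorrelation functions are

```latex
\gamma(h)=\mathrm{Cov}(X_t,\,X_{t+h}), \qquad
\rho(h)=\frac{\gamma(h)}{\gamma(0)}=\mathrm{Corr}(X_t,\,X_{t+h})
```

with ρ(0) = 1 and ρ(-h) = ρ(h).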

Making a Time Series Stationary
Write the time series as {X_t : t = 1,...,N} and define the difference operator by ∆X_t = X_t - X_{t-1}. We can then check whether the differenced series is weakly stationary, and we can continue differencing, although one or two differences are usually enough! Compare the variances of the differenced series with each other: the smaller the variance, the better.

Making a Time Series Stationary
(Worked example: a series X_t of 21 observations, its first differences ∆X_t and second differences ∆∆X_t. The individual table values are not recoverable from the transcript; the summary statistics are.)

Descriptive Statistics
Series   N    Std. Deviation   Variance
∆X_t     20   1.16838          1.365
∆∆X_t    19   2.05408          4.219

The once-differenced series has the smaller variance, so one difference is preferred here.
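The variance comparison can be reproduced on synthetic data; a minimal sketch with NumPy, where the random-walk series and its length are illustration choices, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: a random walk, a classic nonstationary series
x = np.cumsum(rng.normal(0, 1, 500))

d1 = np.diff(x)        # first difference:  X_t - X_{t-1}
d2 = np.diff(x, n=2)   # second difference

# The once-differenced series already has the smaller variance,
# so one difference suffices; differencing again only adds noise.
print(np.var(d1) < np.var(d2))
```

For a random walk the first difference is white noise with variance sigma^2, while the second difference has variance 2*sigma^2, which is why over-differencing inflates the variance.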

Weak White Noise Process
Weak white noise processes are weakly stationary. They are useful in time series modelling as building blocks for other processes. Because there is no dependence of the present on the past, the future cannot be predicted from the past.
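A sketch of the defining conditions for a weak white noise process {ε_t} (notation assumed):

```latex
E(\varepsilon_t)=0, \qquad \mathrm{Var}(\varepsilon_t)=\sigma^2, \qquad
\mathrm{Cov}(\varepsilon_t,\varepsilon_s)=0 \quad \text{for } t\neq s
```

Uncorrelatedness, not full independence, is all that weak white noise requires.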

Parameter Estimation in a Stationary Process
Note: for the sample autocovariance function one may use 1/(n-h) instead of 1/n, but when n is much larger than h the difference between the two estimates is small. A stationary process is "somewhat" parsimonious, but not fully so, as there are infinitely many autocorrelations ρ(1), ρ(2), ρ(3),... We will look at a class of models that are parsimonious.
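The estimators referred to above are, in the usual 1/n form (notation assumed, as the slide's formulas were not transcribed):

```latex
\bar{X}=\frac{1}{n}\sum_{t=1}^{n}X_t, \qquad
\hat{\gamma}(h)=\frac{1}{n}\sum_{t=1}^{n-h}\bigl(X_t-\bar{X}\bigr)\bigl(X_{t+h}-\bar{X}\bigr), \qquad
\hat{\rho}(h)=\frac{\hat{\gamma}(h)}{\hat{\gamma}(0)}
```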

Autoregressive Models (AR)
The time series is modelled so that the present observation is a weighted average of past observations plus a white noise term, i.e., as a regression model. The AR(1) model, X_t = δ + φX_{t-1} + ε_t, is simply the linear regression of X_t on X_{t-1}.
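Since AR(1) is just a regression of X_t on X_{t-1}, the coefficient can be estimated by least squares. A sketch on simulated data, where the true φ = 0.6, the series length and the zero mean are all illustration choices:

```python
import numpy as np

rng = np.random.default_rng(1)
phi, n = 0.6, 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()   # AR(1) with mean 0

# Least-squares slope of X_t on X_{t-1} (no intercept, since the mean is 0)
phi_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
print(abs(phi_hat - phi) < 0.1)   # the estimate lands close to the true phi
```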

Properties of AR(1)
By repeated substitution, the value at time t is a weighted average of all past noise terms. The AR(1) process is stationary when |φ| < 1.
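Under |φ| < 1 the standard moments follow (a sketch; σ² is the white noise variance):

```latex
E(X_t)=\mu, \qquad \mathrm{Var}(X_t)=\frac{\sigma^2}{1-\phi^2}, \qquad |\phi|<1
```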

Autocorrelation Function of AR(1)
The autocorrelation function (ACF) of a stationary AR(1) process depends only on φ (when |φ| < 1). Its possible shapes are either a geometric decay towards zero (for φ > 0) or a decay in absolute value with alternating signs (for φ < 0). If the sample ACF (SACF) does not look like one of these shapes, the AR(1) model is not suitable for the data.
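The shape is governed by a single standard formula for the stationary AR(1) process:

```latex
\rho(h)=\phi^{|h|}, \qquad h=0,\pm 1,\pm 2,\dots
```

For φ > 0 this decays geometrically; for φ < 0 it decays while alternating in sign.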

Nonstationary AR(1)
When |φ| ≥ 1 the AR(1) process is nonstationary (the variance and autocorrelations do not settle to finite, time-invariant values). When φ = 1 the AR(1) process is a random walk. A random walk is not stationary: its variance increases linearly with time.
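Setting φ = 1 and iterating back to the start gives the random-walk form (a sketch, with X_0 the starting value):

```latex
X_t = X_0 + \sum_{i=1}^{t}\varepsilon_i, \qquad
\mathrm{Var}(X_t \mid X_0) = t\,\sigma^2
```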

Explosive AR(1)
When |φ| > 1, suppose the AR(1) process starts at X_0 with µ = 0. The process is not stationary: the variance increases geometrically with time. Explosive! This AR process is not of much use, because most real time series do not behave this way.
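Iterating back to X_0 (with µ = 0) shows the geometric growth explicitly; a sketch:

```latex
X_t=\phi^{t}X_0+\sum_{j=0}^{t-1}\phi^{j}\varepsilon_{t-j}, \qquad
\mathrm{Var}(X_t \mid X_0)=\sigma^2\,\frac{\phi^{2t}-1}{\phi^{2}-1}, \qquad |\phi|>1
```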

Partial Autocorrelation Function
Partial autocorrelation at lag p: the excess correlation between X_t and X_{t-p} that is not accounted for by X_{t-1} through X_{t-p+1}. The partial autocorrelation function "cuts off" after lag p for a time series that follows an AR(p) model.
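One common way to state this definition (a sketch; the φ_pp notation is an assumption) is as the last coefficient in the order-p autoregression:

```latex
X_t=\phi_{p1}X_{t-1}+\phi_{p2}X_{t-2}+\dots+\phi_{pp}X_{t-p}+\varepsilon_t,
\qquad \mathrm{PACF}(p)=\phi_{pp}
```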

Residuals for Model Checking
Residuals are the observed errors: the differences between the observed values and the model's fitted values. For a good model the residuals should be random and close to zero. Plot the residuals to look for patterns and violations of these properties. Check the autocorrelation (non-independence) of the residuals by making an autocorrelogram. Apply the Box-Ljung test to the residuals to check whether their autocorrelations are zero.
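The Box-Ljung statistic can be computed directly from the sample autocorrelations. A sketch in NumPy, where the two example series, the lag count m = 10, and the 5%-level chi-square critical value 18.31 (10 df) are illustration choices:

```python
import numpy as np

def ljung_box(x, m):
    """Box-Ljung statistic Q = n(n+2) * sum_k acf(k)^2 / (n-k), k = 1..m."""
    n = len(x)
    r = x - x.mean()
    denom = np.dot(r, r)
    acf = np.array([np.dot(r[:-k], r[k:]) / denom for k in range(1, m + 1)])
    return n * (n + 2) * np.sum(acf ** 2 / (n - np.arange(1, m + 1)))

rng = np.random.default_rng(2)
white = rng.normal(size=200)               # residuals with no autocorrelation
ar = np.zeros(200)
for t in range(1, 200):
    ar[t] = 0.8 * ar[t - 1] + rng.normal()  # strongly autocorrelated "residuals"

# Compare Q with the chi-square critical value at 10 df, 5% level (about 18.31):
# Q above it rejects the hypothesis of zero autocorrelation.
print(ljung_box(white, 10), ljung_box(ar, 10))
```

When the residuals come from a fitted ARMA(p,q) model, the chi-square degrees of freedom are usually reduced to m - p - q.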

Moving Average Model
X_t is a moving average model of order 1, MA(1), if X_t = µ + ε_t + θε_{t-1}. That is, the present value does not depend on the past values of the series itself, but is a weighted average of present and past values of a white noise process. The mean is constant and the autocorrelation depends only on the lag, not on time: the MA(1) process is unconditionally stationary.
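Standard MA(1) results (a sketch, assuming X_t = µ + ε_t + θε_{t-1} with white noise variance σ²):

```latex
\mathrm{Var}(X_t)=(1+\theta^2)\sigma^2, \qquad
\rho(1)=\frac{\theta}{1+\theta^2}, \qquad \rho(h)=0 \ \text{for } |h|\ge 2
```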

Moving Average Process of Order q
X_t is a moving average process of order q, MA(q), if X_t = µ + ε_t + θ_1ε_{t-1} + ... + θ_qε_{t-q}. The problem with estimating the θ-coefficients is that the ε-values are not observed, so the two must be estimated simultaneously. In an MA process the present value depends on unobserved factors.
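For MA(q) the autocorrelations vanish beyond lag q (a standard result):

```latex
\rho(h)=0 \quad \text{for } h>q
```

so the ACF "cuts off" after lag q, the mirror image of the PACF cut-off for AR(p).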

AR(1) and MA
Expanding the mean-0 AR(1) process by repeated substitution shows that X_t is a weighted average of all past white noise values, i.e., AR(1) is an MA(∞) process!
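The substitution can be written out explicitly (a sketch, assuming the mean-0 form X_t = φX_{t-1} + ε_t with |φ| < 1):

```latex
X_t=\varepsilon_t+\phi\varepsilon_{t-1}+\phi^{2}\varepsilon_{t-2}+\dots
   =\sum_{j=0}^{\infty}\phi^{j}\varepsilon_{t-j}
```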