Modern methods

The classical approach:

Method: Time series regression
Pros: Easy to implement. Fairly easy to interpret. Covariates may be added (normalization). Inference is possible (though sometimes questionable).
Cons: Static. Normal-based inference not generally reliable. Cyclic component hard to estimate.

Method: Decomposition
Pros: Easy to interpret. Possible to have dynamic seasonal effects. Cyclic components can be estimated.
Cons: Descriptive (no inference, by definition). Static in trend.

Explanation of the static behaviour: the classical approach assumes all components except the irregular ones (i.e. $\varepsilon_t$ and $IR_t$) to be deterministic, i.e. fixed functions or constants. To overcome this problem, all components should be allowed to be stochastic, i.e. to be random variates. From a statistical point of view, a time series $y_t$ should be treated as a stochastic process. We will use the terms time series and process interchangeably, depending on the situation.

Stationary and non-stationary time series

Characteristics of a stationary time series:
Constant mean
Constant variance
(and autocovariances that depend only on the lag, not on time)

$\Rightarrow$ A time series with a trend is non-stationary!

Auto-Regressive, Integrated, Moving Average: Box-Jenkins models

A stationary time series can be modelled on the basis of the serial correlations in it. A non-stationary time series can be transformed into a stationary time series, modelled, and back-transformed to the original scale (e.g. for purposes of forecasting).

ARIMA models: the AR and MA parts can be modelled on a stationary series; the I (integrated) part has to do with the transformation.
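The whole workflow can be sketched in Python. A minimal sketch, assuming statsmodels is available; the toy random-walk series and the order (1, 1, 0) are illustrative choices, not part of the original slides (in practice the order is identified from the SAC/SPAC plots discussed later):

```python
# A minimal Box-Jenkins sketch: difference once, fit an AR(1) on the
# differences, and forecast back on the original scale.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
y = pd.Series(np.cumsum(rng.normal(size=200)))  # toy non-stationary series

model = ARIMA(y, order=(1, 1, 0))   # AR(1) on the first differences
result = model.fit()
print(result.summary())
print(result.forecast(steps=5))     # forecasts on the original scale
```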

Different types of transformation

1. From a series with a linear trend to a series with no trend: first-order differences

$z_t = y_t - y_{t-1}$

MTB > diff c1 c2

Note that the differenced series varies around zero.

2. From a series with a quadratic trend to a series with no trend: second-order differences

$w_t = z_t - z_{t-1} = (y_t - y_{t-1}) - (y_{t-1} - y_{t-2}) = y_t - 2y_{t-1} + y_{t-2}$

MTB > diff 2 c3 c4
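Both differencing steps are one-liners in Python as well; a minimal sketch, where the quadratic-trend toy series is an illustrative assumption:

```python
# First- and second-order differencing with NumPy, mirroring the
# Minitab commands above (`y` is assumed to be a 1-D array).
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(100)
y = 0.05 * t**2 + rng.normal(size=100)  # toy series with a quadratic trend

z = np.diff(y)        # z_t = y_t - y_{t-1}, removes a linear trend
w = np.diff(y, n=2)   # w_t = y_t - 2*y_{t-1} + y_{t-2}, removes a quadratic trend
print(z.mean(), w.mean())  # the twice-differenced series varies around zero
```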

3. From a series with non-constant variance (heteroscedastic) to a series with constant variance (homoscedastic): Box-Cox transformations (Box and Cox, 1964)

$y_t^{(\lambda)} = \begin{cases} \dfrac{(y_t + \mu)^{\lambda} - 1}{\lambda}, & \lambda \neq 0 \\ \ln(y_t + \mu), & \lambda = 0 \end{cases}$

In practice, $\mu$ is chosen so that $y_t + \mu$ is always > 0.

Simpler form, if we know that $y_t$ is always > 0 (as is the usual case for measurements):

$y_t^{(\lambda)} = \begin{cases} \dfrac{y_t^{\lambda} - 1}{\lambda}, & \lambda \neq 0 \\ \ln y_t, & \lambda = 0 \end{cases}$

The log transform ($\ln y_t$) usually also makes the data "more" normally distributed.

Example: application of the root ($\sqrt{y_t}$) and log ($\ln y_t$) transforms
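A minimal sketch of these variance-stabilizing transforms in Python; scipy.stats.boxcox requires strictly positive data (i.e. the simpler form with $\mu = 0$) and estimates $\lambda$ by maximum likelihood. The toy heteroscedastic series is an illustrative assumption:

```python
# Variance-stabilizing transforms: square root, log, and Box-Cox.
import numpy as np
from scipy.stats import boxcox

rng = np.random.default_rng(0)
y = np.exp(0.02 * np.arange(200) + rng.normal(scale=0.3, size=200))  # positive, heteroscedastic

y_sqrt = np.sqrt(y)       # root transform
y_log = np.log(y)         # log transform
y_bc, lam = boxcox(y)     # Box-Cox; lambda estimated by maximum likelihood
print(f"estimated lambda: {lam:.3f}")
```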

AR-models (for stationary time series)

Consider the model

$y_t = \delta + \phi \cdot y_{t-1} + a_t$

with $\{a_t\}$ i.i.d. with zero mean and constant variance $\sigma^2$, and where $\delta$ (delta) and $\phi$ (phi) are (unknown) parameters.

Set $\delta = 0$ for the sake of simplicity $\Rightarrow E(y_t) = 0$.

Let $R(k) = \mathrm{Cov}(y_t, y_{t-k}) = \mathrm{Cov}(y_t, y_{t+k}) = E(y_t \cdot y_{t-k}) = E(y_t \cdot y_{t+k})$

$\Rightarrow R(0) = \mathrm{Var}(y_t)$, assumed to be constant.
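To make the model concrete, here is a minimal simulation sketch; the values $\phi = 0.7$ and $\sigma = 1$ are arbitrary illustrative choices (the printed comparison anticipates the result $R(0) = \sigma^2/(1-\phi^2)$ derived below):

```python
# Simulate an AR(1) process y_t = phi * y_{t-1} + a_t with delta = 0.
import numpy as np

rng = np.random.default_rng(42)
phi, sigma, n = 0.7, 1.0, 5000

y = np.zeros(n)
a = rng.normal(scale=sigma, size=n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + a[t]

print(np.var(y))                 # sample variance, an estimate of R(0)
print(sigma**2 / (1 - phi**2))   # theoretical R(0), derived below
```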

Now:

$R(0) = E(y_t \cdot y_t) = E(y_t \cdot (\phi \cdot y_{t-1} + a_t)) = \phi \cdot E(y_t \cdot y_{t-1}) + E(y_t \cdot a_t) = \phi \cdot R(1) + E((\phi \cdot y_{t-1} + a_t) \cdot a_t) = \phi \cdot R(1) + \phi \cdot E(y_{t-1} \cdot a_t) + E(a_t \cdot a_t) = \phi \cdot R(1) + \sigma^2$ (since $a_t$ is independent of $y_{t-1}$)

$R(1) = E(y_t \cdot y_{t+1}) = E(y_t \cdot (\phi \cdot y_t + a_{t+1})) = \phi \cdot E(y_t \cdot y_t) + E(y_t \cdot a_{t+1}) = \phi \cdot R(0) + 0$ (since $a_{t+1}$ is independent of $y_t$)

$R(2) = E(y_t \cdot y_{t+2}) = E(y_t \cdot (\phi \cdot y_{t+1} + a_{t+2})) = \phi \cdot E(y_t \cdot y_{t+1}) + E(y_t \cdot a_{t+2}) = \phi \cdot R(1) + 0$ (since $a_{t+2}$ is independent of $y_t$)

$R(0) = \phi \cdot R(1) + \sigma^2$
$R(1) = \phi \cdot R(0)$        (the Yule-Walker equations)
$R(2) = \phi \cdot R(1)$
…

$\Rightarrow R(k) = \phi \cdot R(k-1) = \dots = \phi^k \cdot R(0)$

$\Rightarrow R(0) = \phi^2 \cdot R(0) + \sigma^2$

Note that for R(0) to be positive and finite (which we require of a variance), the following must hold:

$R(0) = \dfrac{\sigma^2}{1 - \phi^2}$, which requires $\phi^2 < 1$, i.e. $-1 < \phi < 1$

This is in effect the condition for an AR(1)-process to be weakly stationary.

Note now that

$\rho_k = \dfrac{R(k)}{R(0)} = \dfrac{\phi^k \cdot R(0)}{R(0)} = \phi^k$

$\rho_k$ is called the autocorrelation function (ACF) of $y_t$: "auto" because it gives correlations within the same time series. For pairs of different time series one can define the cross-correlation function, which gives correlations at different lags between the series. By studying the ACF it might be possible to identify the approximate magnitude of $\phi$.
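A minimal sketch comparing the sample ACF of a simulated AR(1) series with the theoretical $\rho_k = \phi^k$; statsmodels' acf computes the sample version, and $\phi = 0.7$ is an arbitrary choice:

```python
# Compare the theoretical AR(1) ACF, rho_k = phi**k, with the sample ACF.
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(42)
phi, n = 0.7, 5000
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

sample_acf = acf(y, nlags=10)
for k in range(11):
    print(f"lag {k:2d}: sample {sample_acf[k]:+.3f}  theoretical {phi**k:+.3f}")
```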

Examples: [plots omitted]

The look of an ACF can be similar for different kinds of time series; e.g. the ACF for an AR(1) with $\phi = 0.3$ could be approximately the same as the ACF for an autoregressive time series of order higher than 1 (we will discuss higher-order AR-models later).

To do a less ambiguous identification we need another statistic: the partial autocorrelation function (PACF):

$\upsilon_k = \mathrm{Corr}(y_t, y_{t-k} \mid y_{t-k+1}, y_{t-k+2}, \dots, y_{t-1})$

i.e. the conditional correlation between $y_t$ and $y_{t-k}$ given all observations in between. Note that $-1 \leq \upsilon_k \leq 1$.
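The PACF can likewise be computed from data. A minimal sketch using statsmodels' pacf on a simulated AR(1) series; for AR(1) a single clear spike at lag 1 is expected:

```python
# Sample PACF of a simulated AR(1) series: a single spike at lag 1
# (approximately phi) and values near zero afterwards.
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(42)
phi, n = 0.7, 5000
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

print(np.round(pacf(y, nlags=5), 3))  # lag 1 close to 0.7, later lags near 0
```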

A concept sometimes hard to interpret, but it can be shown that for AR(1)-models with $\phi$ positive the PACF has a single positive spike at lag 1 (and is zero at all later lags), while for AR(1)-models with $\phi$ negative it has a single negative spike at lag 1.

Assume now that we have a sample $y_1, y_2, \dots, y_n$ from a time series assumed to follow an AR(1)-model.

Example: DKK/USD exchange rate series [plot omitted]

The ACF and the PACF can be estimated from data by their sample counterparts:

Sample autocorrelation function (SAC):

$r_k = \dfrac{\sum_{t=k+1}^{n} (y_t - \bar{y})(y_{t-k} - \bar{y})}{\sum_{t=1}^{n} (y_t - \bar{y})^2}$

if $n$ is large; otherwise a scaling might be needed.

Sample partial autocorrelation function (SPAC): complicated structure, so the formula is not shown here.

The variance functions of these two estimators can also be estimated

$\Rightarrow$ opportunity to test

$H_0: \rho_k = 0$ vs. $H_a: \rho_k \neq 0$

or

$H_0: \upsilon_k = 0$ vs. $H_a: \upsilon_k \neq 0$

for a particular value of $k$.

Estimated sample functions are usually plotted together with critical limits based on the estimated variances.
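A minimal sketch of such a test: under $H_0$ the sample autocorrelations are approximately normal with standard error about $1/\sqrt{n}$, so bars outside $\pm 1.96/\sqrt{n}$ are flagged (statsmodels' acf can also return confidence intervals directly via its alpha argument):

```python
# Sample ACF with approximate 95% critical limits. Under H0: rho_k = 0 the
# estimator is roughly N(0, 1/n), giving limits of about +/- 1.96 / sqrt(n).
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(7)
y = rng.normal(size=400)       # white noise: every rho_k should test as zero
limit = 1.96 / np.sqrt(len(y))

r = acf(y, nlags=10)
for k in range(1, 11):
    verdict = "reject H0" if abs(r[k]) > limit else "do not reject H0"
    print(f"lag {k:2d}: r_k = {r[k]:+.3f}, limit = +/-{limit:.3f} -> {verdict}")
```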

Example (cont.): DKK/USD exchange rate. [SAC and SPAC plots with critical limits omitted]

Ignoring all bars within the red critical limits, we would identify the series as an AR(1) with positive $\phi$. The value of $\phi$ is approximately 0.9 (the ordinate of the first bar in the SAC plot and in the SPAC plot).

Higher-order AR-models

AR(2): $y_t = \delta + \phi_1 y_{t-1} + \phi_2 y_{t-2} + a_t$, where the term $\phi_2 y_{t-2}$ must be present

AR(3): $y_t = \delta + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \phi_3 y_{t-3} + a_t$, or other combinations with $\phi_3 y_{t-3}$

AR(p): $y_t = \delta + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \dots + \phi_p y_{t-p} + a_t$, i.e. different combinations with $\phi_p y_{t-p}$

Stationarity conditions:

For p > 2, difficult to express in closed form.

For p = 2: the values of $\phi_1$ and $\phi_2$ must lie within the triangle defined by

$\phi_1 + \phi_2 < 1$, $\phi_2 - \phi_1 < 1$, and $-1 < \phi_2 < 1$

(shown as the blue triangle in the original figure, omitted here).
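For any order p the condition can also be checked numerically: the process is weakly stationary when all roots of the characteristic polynomial $1 - \phi_1 z - \dots - \phi_p z^p$ lie outside the unit circle. A minimal sketch:

```python
# Check AR(p) stationarity: all roots of 1 - phi_1 z - ... - phi_p z^p
# must lie outside the unit circle (|z| > 1).
import numpy as np

def is_stationary(phis):
    # np.roots wants coefficients from the highest power down:
    # [-phi_p, ..., -phi_1, 1] represents 1 - phi_1 z - ... - phi_p z^p.
    coeffs = np.r_[-np.asarray(phis)[::-1], 1.0]
    return np.all(np.abs(np.roots(coeffs)) > 1.0)

print(is_stationary([0.5, 0.3]))   # True:  inside the AR(2) triangle
print(is_stationary([0.9, 0.2]))   # False: phi_1 + phi_2 = 1.1 >= 1
```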

Typical patterns of the ACF and PACF for higher-order stationary AR-models (AR(p)):

ACF: similar pattern as for AR(1), i.e. (exponentially) decreasing bars, (most often) positive for $\phi_1$ positive and alternating for $\phi_1$ negative.

PACF: the first p values $\upsilon_1, \dots, \upsilon_p$ are non-zero with decreasing magnitude; the rest are all zero (cut-off point at lag p). (Most often) all positive if $\phi_1$ is positive and alternating if $\phi_1$ is negative.

Examples: AR(2) with $\phi_1$ positive, and AR(5) with $\phi_1$ negative [ACF and PACF plots omitted]
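These patterns can be reproduced with simulated data. A minimal sketch using statsmodels' ArmaProcess; the coefficient values are arbitrary illustrative choices:

```python
# Reproduce the typical ACF/PACF patterns for higher-order AR models.
# ArmaProcess takes the AR polynomial with a leading 1 and negated
# coefficients: [1, -phi_1, ..., -phi_p].
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

def show_patterns(phis, label, n=5000, seed=3):
    proc = ArmaProcess(np.r_[1.0, -np.asarray(phis)], [1.0])
    assert proc.isstationary  # coefficients must satisfy the conditions above
    y = proc.generate_sample(nsample=n, burnin=100,
                             distrvs=np.random.default_rng(seed).standard_normal)
    print(label)
    print("  ACF :", np.round(acf(y, nlags=6)[1:], 2))   # decays gradually
    print("  PACF:", np.round(pacf(y, nlags=6)[1:], 2))  # cuts off after lag p

show_patterns([0.5, 0.3], "AR(2), phi_1 positive:")
show_patterns([-0.5, 0.2, 0.1, -0.05, 0.05], "AR(5), phi_1 negative:")
```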