Autoregressive Integrated Moving Average (Box-Jenkins) models


A stationary time series can be modelled on the basis of the serial correlations in it. A non-stationary time series can be transformed into a stationary time series, modelled, and then back-transformed to the original scale (e.g. for purposes of forecasting). In an ARIMA model, the AR and MA parts can be modelled on a stationary series; the I (integrated) part has to do with the transformation.

AR models (for stationary time series)

Consider the model

Y_t = δ + φ·Y_{t-1} + e_t

with {e_t} i.i.d. with zero mean and constant variance σ² (white noise), and where δ (delta) and φ (phi) are (unknown) parameters. This is an autoregressive process of order 1: AR(1).

Set δ = 0 for the sake of simplicity. Then

E(Y_t) = 0
γ_k = Cov(Y_t, Y_{t-k}) = Cov(Y_t, Y_{t+k}) = E(Y_t·Y_{t-k}) = E(Y_t·Y_{t+k})
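As a numerical sketch (not part of the slides), the AR(1) recursion can be simulated directly; φ = 0.7 and σ = 1 are arbitrary illustrative choices:

```python
import numpy as np

# Illustrative sketch: simulate the AR(1) recursion
# Y_t = phi * Y_{t-1} + e_t with delta = 0; phi = 0.7 and sigma = 1 are
# arbitrary example values.
rng = np.random.default_rng(0)
phi, sigma, n = 0.7, 1.0, 100_000

e = rng.normal(0.0, sigma, size=n)   # white noise {e_t}
y = np.empty(n)
y[0] = e[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t]

sample_mean = y.mean()   # close to E(Y_t) = 0
sample_var = y.var()     # close to gamma_0 = sigma^2 / (1 - phi^2)
```

With |φ| < 1 the sample variance settles near σ²/(1 - φ²), anticipating the stationarity condition derived below.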

Now:

γ_0 = E(Y_t·Y_t) = E((φ·Y_{t-1} + e_t)·Y_t) = φ·E(Y_{t-1}·Y_t) + E(e_t·Y_t)
    = φ·γ_1 + E(e_t·(φ·Y_{t-1} + e_t)) = φ·γ_1 + φ·E(e_t·Y_{t-1}) + E(e_t·e_t)
    = φ·γ_1 + σ²  (for e_t is independent of Y_{t-1})

γ_1 = E(Y_{t-1}·Y_t) = E(Y_{t-1}·(φ·Y_{t-1} + e_t)) = φ·E(Y_{t-1}·Y_{t-1}) + E(Y_{t-1}·e_t)
    = φ·γ_0  (for e_t is independent of Y_{t-1})

γ_2 = E(Y_{t-2}·Y_t) = E(Y_{t-2}·(φ·Y_{t-1} + e_t)) = φ·E(Y_{t-2}·Y_{t-1}) + E(Y_{t-2}·e_t)
    = φ·γ_1  (for e_t is independent of Y_{t-2})

 0 =   1 + σ 2  1 =  ·  0 Yule-Walker equations  2 =  ·  1 …   k =  ·  k-1 =…=  k ·  0  0 =  2 ·  0 + σ 2 

Note that for γ_0 to be positive and finite (which we require of a variance) the following must hold:

|φ| < 1, giving γ_0 = σ²/(1 - φ²)

This is in effect the condition for an AR(1) process to be weakly stationary. Now, note that

ρ_k = γ_k/γ_0 = φ^k

Recall that ρ_k is called the autocorrelation function (ACF): "auto" because it gives correlations within the same time series. For pairs of different time series one can define the cross-correlation function, which gives correlations at different lags between the series. By studying the ACF it might be possible to identify the approximate magnitude of φ.

Examples:

The general linear process

Y_t = e_t + ψ_1·e_{t-1} + ψ_2·e_{t-2} + …

AR(1) as a general linear process (by repeated substitution of Y_{t-1} = φ·Y_{t-2} + e_{t-1}, and so on):

Y_t = e_t + φ·e_{t-1} + φ²·e_{t-2} + …
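This representation can be checked numerically: truncating the infinite sum at some lag J reproduces the recursive AR(1) values almost exactly once the weights φ^j have died out. φ = 0.5, n = 500 and J = 60 are illustrative assumptions, not values from the slides:

```python
import numpy as np

# Sketch: compare the recursive AR(1) construction with its truncated
# general-linear-process form Y_t ≈ e_t + phi*e_{t-1} + ... + phi^J*e_{t-J}.
rng = np.random.default_rng(2)
phi, n, J = 0.5, 500, 60
e = rng.normal(size=n)

# Recursive construction: Y_t = phi * Y_{t-1} + e_t
y = np.empty(n)
y[0] = e[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t]

# Linear-process construction, truncated at lag J
weights = phi ** np.arange(J + 1)
y_lin = np.array(
    [np.dot(weights[: t + 1], e[t::-1][: J + 1]) for t in range(n)]
)

max_gap = float(np.max(np.abs(y[J:] - y_lin[J:])))  # negligible once t > J
```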

If |φ| < 1, the representation as a linear process is valid; |φ| < 1 is at the same time the condition for stationarity of an AR(1) process.

Second-order autoregressive process, AR(2):

Y_t = φ_1·Y_{t-1} + φ_2·Y_{t-2} + e_t

Characteristic equation

Write the AR(2) model as

(1 - φ_1·B - φ_2·B²)·Y_t = e_t

where B is the backshift operator (B·Y_t = Y_{t-1}). The characteristic equation is

1 - φ_1·x - φ_2·x² = 0

Stationarity of an AR(2) process

The characteristic equation has two roots (it is a second-order equation; under certain conditions there is one multiple root). The roots may be complex-valued. If the absolute values of both roots exceed 1, the process is stationary (absolute value > 1 ⇔ the roots lie outside the unit circle).
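The root criterion is easy to apply numerically. A minimal sketch (the helper name and parameter values are our own, for illustration):

```python
import numpy as np

# Illustrative stationarity check for an AR(2) model via the roots of its
# characteristic equation 1 - phi1*x - phi2*x^2 = 0.
def ar2_is_stationary(phi1, phi2):
    # np.roots expects coefficients ordered from the highest power down.
    roots = np.roots([-phi2, -phi1, 1.0])
    return bool(np.all(np.abs(roots) > 1.0))

stat_a = ar2_is_stationary(0.5, 0.3)  # roots ~ 1.17 and -2.84: stationary
stat_b = ar2_is_stationary(0.5, 0.6)  # one root ~ 0.94 inside: not stationary
```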

Equivalently, stationarity requires (φ_1, φ_2) to lie within the triangle defined by φ_1 + φ_2 < 1, φ_2 - φ_1 < 1 and |φ_2| < 1. Some of these pairs define complex roots.

Finding the autocorrelation function

Yule-Walker equations:

ρ_1 = φ_1 + φ_2·ρ_1
ρ_k = φ_1·ρ_{k-1} + φ_2·ρ_{k-2},  k ≥ 2

Start with ρ_0 = 1, which gives ρ_1 = φ_1/(1 - φ_2).
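The recursion can be sketched in a few lines (the function name and the parameter values below are illustrative, not from the slides):

```python
# Sketch: the Yule-Walker equations for AR(2) give the ACF recursively.
# rho_0 = 1; the first equation rho_1 = phi1 + phi2*rho_1 yields
# rho_1 = phi1 / (1 - phi2); later lags follow
# rho_k = phi1*rho_{k-1} + phi2*rho_{k-2}.
def ar2_acf(phi1, phi2, nlags):
    rho = [1.0, phi1 / (1.0 - phi2)]
    for k in range(2, nlags + 1):
        rho.append(phi1 * rho[k - 1] + phi2 * rho[k - 2])
    return rho[: nlags + 1]

acf = ar2_acf(0.5, 0.3, 5)  # illustrative parameter values; decays with k
```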

For any values of φ_1 and φ_2 the autocorrelations will decrease exponentially with k. For complex roots of the characteristic equation the correlations will show a damped sine-wave behaviour as k increases. See the figures on page 74 in the textbook.

The general autoregressive process, AR(p):

Y_t = φ_1·Y_{t-1} + … + φ_p·Y_{t-p} + e_t

The ACF is exponentially decaying, in damped sine-wave fashion if there are complex roots.

Moving average processes, MA: always stationary.

MA(1):

Y_t = e_t - θ·e_{t-1}

with ACF ρ_1 = -θ/(1 + θ²) and ρ_k = 0 for k > 1.

General pattern: for an MA(q) process the ACF "cuts off" after lag q, i.e. ρ_k = 0 for all k > q.
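The cut-off is visible in simulation. A hedged sketch (θ = 0.8 and the sample size are arbitrary example values):

```python
import numpy as np

# Illustrative check of the MA(1) cut-off: for Y_t = e_t - theta*e_{t-1},
# rho_1 = -theta/(1 + theta^2) and rho_k = 0 for k > 1.
rng = np.random.default_rng(3)
theta, n = 0.8, 200_000
e = rng.normal(size=n)
y = e[1:] - theta * e[:-1]

def acf(x, k):
    x = x - x.mean()
    return np.mean(x[: len(x) - k] * x[k:]) / np.mean(x * x)

r1, r2, r3 = acf(y, 1), acf(y, 2), acf(y, 3)
rho1_theory = -theta / (1.0 + theta**2)  # about -0.488
```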

Invertibility (of an MA process)

An MA(q) process is invertible if it can be rewritten as

Y_t = π_1·Y_{t-1} + π_2·Y_{t-2} + … + e_t

i.e. an AR(∞) process, provided the resulting coefficients π_1, π_2, … fulfil the conditions of stationarity for Y_t. They do if the characteristic equation of the MA(q) process has all its roots outside the unit circle (modulus > 1).
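The root check mirrors the AR stationarity check. A sketch, assuming the sign convention Y_t = e_t - θ_1·e_{t-1} - … - θ_q·e_{t-q} (the helper name is our own):

```python
import numpy as np

# Sketch of the invertibility criterion: all roots of the MA characteristic
# polynomial 1 - theta1*x - ... - thetaq*x^q must lie outside the unit circle.
def ma_is_invertible(thetas):
    coeffs = [-th for th in reversed(thetas)] + [1.0]  # highest power first
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

inv_a = ma_is_invertible([0.5])  # root at x = 2: invertible
inv_b = ma_is_invertible([2.0])  # root at x = 0.5: not invertible
```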

Autoregressive-moving average processes, ARMA(p,q):

Y_t = φ_1·Y_{t-1} + … + φ_p·Y_{t-p} + e_t - θ_1·e_{t-1} - … - θ_q·e_{t-q}
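A numerical sketch of the mixed case (φ = 0.6 and θ = 0.3 are arbitrary illustrative values): beyond lag 1 the ARMA(1,1) ACF decays like an AR(1) ACF, ρ_k = φ·ρ_{k-1} for k ≥ 2.

```python
import numpy as np

# Illustrative sketch: simulate ARMA(1,1),
# Y_t = phi*Y_{t-1} + e_t - theta*e_{t-1}, and check that the ACF beyond
# lag 1 decays geometrically with rate phi.
rng = np.random.default_rng(6)
phi, theta, n = 0.6, 0.3, 200_000
e = rng.normal(size=n)
y = np.empty(n)
y[0] = e[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t] - theta * e[t - 1]

def acf(x, k):
    x = x - x.mean()
    return np.mean(x[: len(x) - k] * x[k:]) / np.mean(x * x)

ratio = acf(y, 2) / acf(y, 1)  # close to phi
```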

Non-stationary processes

A simple grouping of non-stationary processes: non-stationary in mean; non-stationary in variance; non-stationary in both mean and variance.

Classical approach: try to make the process stationary before modelling. Modern approach: try to model the process in its original form.

Classical approach: non-stationary in mean

Example: random walk

Y_t = Y_{t-1} + e_t

Its first difference ∇Y_t = Y_t - Y_{t-1} = e_t is white noise, hence stationary.
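A short sketch (illustrative, not from the slides) makes the point concrete: differencing a simulated random walk returns exactly the noise it was built from.

```python
import numpy as np

# Sketch: a random walk Y_t = Y_{t-1} + e_t is non-stationary in mean, but
# its first difference is exactly the white-noise series e_t again.
rng = np.random.default_rng(4)
e = rng.normal(size=50_000)
y = np.cumsum(e)  # random walk: cumulative sum of the noise

dy = np.diff(y)  # first difference Y_t - Y_{t-1}
recovered = bool(np.allclose(dy, e[1:]))  # differencing recovers the noise
```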

More generally:
First-order non-stationarity in mean ⇒ use first-order differencing.
Second-order non-stationarity in mean ⇒ use second-order differencing.
…

ARIMA(p,d,q): an ARMA(p,q) model fitted to the d times differenced series. Common in practice: d ≤ 2, p ≤ 3, q ≤ 3.
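The role of d can be sketched with a deterministic trend (the quadratic coefficient and series length below are arbitrary illustrative choices): a quadratic trend is second-order non-stationary in mean, so one differencing still leaves a trend while differencing twice removes it.

```python
import numpy as np

# Illustrative sketch of the "I(d)" part of ARIMA: np.diff with n=2 applies
# first-order differencing twice, removing a quadratic trend.
rng = np.random.default_rng(5)
t = np.arange(1000, dtype=float)
y = 0.01 * t**2 + rng.normal(size=1000)  # quadratic trend + noise

d1 = np.diff(y, n=1)  # still trending: mean grows with t
d2 = np.diff(y, n=2)  # roughly constant mean

drift_d1 = d1[-100:].mean() - d1[:100].mean()  # large: trend remains
drift_d2 = d2[-100:].mean() - d2[:100].mean()  # near zero: trend removed
```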

Non-stationarity in variance

Classical approach: use power transformations (Box-Cox). Common order of application:
1. Square root
2. Fourth root
3. Log
4. Reciprocal (1/Y)

For non-stationarity in both mean and variance:
1. Power transformation
2. Differencing
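The listed transformations can be sketched as follows; the helper name power_transform is our own, not from any library, and each transform assumes strictly positive data.

```python
import numpy as np

# Sketch of the power transformations listed above, in the suggested order
# of application; each is only defined for positive data.
def power_transform(y, kind):
    y = np.asarray(y, dtype=float)
    if kind == "sqrt":
        return np.sqrt(y)
    if kind == "fourth_root":
        return y ** 0.25
    if kind == "log":
        return np.log(y)
    if kind == "reciprocal":
        return 1.0 / y
    raise ValueError(f"unknown transform: {kind}")

# Example: a series growing multiplicatively has constant increments
# (a linear pattern) after the log transform.
y = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
z = power_transform(y, "log")
```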