BOX JENKINS (ARIMA) METHODOLOGY

BOX JENKINS (ARIMA) METHODOLOGY ARIMA MODELS A specialized class of linear filtering techniques that uses current and past values of the dependent variable and past residuals to produce accurate short-term forecasts. BOX JENKINS METHOD An iterative approach to identifying a potential model from the general class of ARIMA models for a stationary time series.

AUTOREGRESSIVE MODELS AR = AutoRegressive The explanatory variables in the model are time-lagged values of the forecast variable.

MOVING AVERAGE MODELS MA = Moving Average Not the same as moving average smoothing methods: here the forecast variable is expressed as a weighted average of current and past terms of the error series.

ARMA MODELS AR models can be effectively coupled with MA models to form a general and useful class of time series models called autoregressive moving average (ARMA) models. These can only be used when the data are stationary; stationary data have a constant mean and constant variance. The models can also be applied to data that are non-stationary in the mean by differencing the series.
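As a minimal sketch of that last point (not part of the original slides), the Python code below simulates a hypothetical series that is non-stationary in the mean, differences it once, and uses the augmented Dickey-Fuller test from statsmodels to check both versions; the series and parameter values are assumptions chosen purely for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)

# Hypothetical data: a random walk with drift, non-stationary in the mean.
y = pd.Series(np.cumsum(rng.normal(loc=0.2, scale=1.0, size=200)))

# First differences remove the stochastic trend.
dy = y.diff().dropna()

# ADF test: a small p-value suggests the series is stationary.
print("ADF p-value, original series   :", adfuller(y)[1])
print("ADF p-value, differenced series:", adfuller(dy)[1])
```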

ARIMA (p, d, q) MODELS AR: p = order of the autoregressive process; I: d = degree of differencing involved; MA: q = order of the moving average process.
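For readers working in Python rather than Minitab, the (p, d, q) triple maps directly onto the order argument of the statsmodels ARIMA class; the series y below is only a placeholder for whatever data are being modelled, and the chosen order (1, 1, 1) is an arbitrary example.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Placeholder data; substitute the actual series being modelled.
y = np.cumsum(np.random.default_rng(0).normal(size=150))

# order=(p, d, q): p AR terms, d differences, q MA terms.
result = ARIMA(y, order=(1, 1, 1)).fit()
print(result.summary())
```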

BASIC MODELS ARIMA (0, 0, 0) ― WHITE NOISE ARIMA (0, 1, 0) ― RANDOM WALK ARIMA (1, 0, 0) ― AUTOREGRESSIVE MODEL (order 1) ARIMA (0, 0, 1) ― MOVING AVERAGE MODEL (order 1) ARIMA (1, 0, 1) ― SIMPLE MIXED MODEL
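A small simulation, included here only as an illustration, makes the differences between these basic models visible; the parameter values 0.7 and 0.5 are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
e = rng.normal(size=n)            # white noise: ARIMA(0, 0, 0)

random_walk = np.cumsum(e)        # ARIMA(0, 1, 0): y_t = y_{t-1} + e_t

ar1 = np.zeros(n)                 # ARIMA(1, 0, 0): y_t = 0.7 y_{t-1} + e_t
for t in range(1, n):
    ar1[t] = 0.7 * ar1[t - 1] + e[t]

ma1 = e.copy()                    # ARIMA(0, 0, 1): y_t = e_t - 0.5 e_{t-1}
ma1[1:] = e[1:] - 0.5 * e[:-1]
```

Plotting the four series side by side shows the random walk wandering away from its starting point while the stationary series fluctuate around a constant mean.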

RANDOM WALK ARIMA (0, 1, 0): Y_t = Y_{t-1} + ε_t, or (1 − B)Y_t = ε_t, where B is the backward shift operator (BY_t = Y_{t-1}).

AUTOREGRESSIVE MODEL (order 1) ARIMA (1, 0, 0) or AR(1): Y_t = φ_1 Y_{t-1} + ε_t, or (1 − φ_1 B)Y_t = ε_t, where φ_1 is the autoregressive parameter and B is the backward shift operator.

MOVING AVERAGE MODEL (order 1) ARIMA (0, 0, 1) or MA(1): Y_t = ε_t − θ_1 ε_{t-1}, or Y_t = (1 − θ_1 B)ε_t, where θ_1 is the moving average parameter and B is the backward shift operator.

SIMPLE MIXED MODEL ARIMA (1, 0, 1): Y_t = φ_1 Y_{t-1} + ε_t − θ_1 ε_{t-1}, or (1 − φ_1 B)Y_t = (1 − θ_1 B)ε_t.

BOX-JENKINS PARADIGM
IDENTIFICATION (data preparation and model selection)
↓
ESTIMATION
↓
DIAGNOSTIC CHECKING (if the model is not OK, return to IDENTIFICATION)
↓ (if OK)
FORECAST
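The paradigm can be sketched end to end in Python with statsmodels; everything in the sketch below (the simulated series, the candidate order (1, 1, 1), the 12-step forecast horizon) is an assumption for illustration, not something specified in the slides.

```python
import matplotlib.pyplot as plt
import numpy as np
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.arima.model import ARIMA

y = np.cumsum(np.random.default_rng(7).normal(size=200))  # placeholder data

# IDENTIFICATION: inspect the ACF/PACF of the (differenced) series.
plot_acf(np.diff(y), lags=20)
plot_pacf(np.diff(y), lags=20)
plt.show()

# ESTIMATION: fit a candidate model.
res = ARIMA(y, order=(1, 1, 1)).fit()

# DIAGNOSTIC CHECKING: residuals should look like white noise.
print(acorr_ljungbox(res.resid, lags=[10], boxpierce=True))

# FORECAST: only once the diagnostics are acceptable.
print(res.forecast(steps=12))
```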

IDENTIFICATION STAGE Step 1: Make the series stationary. Difference the data to stabilize the mean: if the trend is linear, first-order differencing will suffice; if the trend is nonlinear, second-order differencing is required (take differences of the first-differenced data). Transform the data to stabilize the variance: use a log or square root transformation, which adds another level of complexity to the process.
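A hedged sketch of Step 1, assuming a hypothetical series named sales whose variance grows with its level: the log stabilizes the variance, and first (or, if needed, second) differences stabilize the mean.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
# Hypothetical series whose level and variability both grow over time.
sales = pd.Series(np.exp(0.02 * np.arange(200) + rng.normal(scale=0.1, size=200)))

log_sales = np.log(sales)          # stabilize the variance
d1 = log_sales.diff().dropna()     # first differences: enough for a linear trend
d2 = d1.diff().dropna()            # second differences: only if the trend is nonlinear
```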

IDENTIFICATION STAGE Step 2: Identify an appropriate model. Compare the ACF of the series with the theoretical ACF for various ARIMA models. Compare the PACF of the series with the theoretical PACF for various ARIMA models. PACF = partial autocorrelation; it measures the degree of relationship between the current value and some earlier value of the series when the effects of the intervening time lags are removed.
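In Python, the sample ACF and PACF of the prepared series can be plotted side by side and compared with the theoretical patterns; the series y below is just a stand-in for the stationarised data, and 24 lags is an arbitrary choice.

```python
import matplotlib.pyplot as plt
import numpy as np
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

y = np.random.default_rng(5).normal(size=200)  # placeholder for the stationary series

fig, axes = plt.subplots(2, 1, figsize=(8, 6))
plot_acf(y, lags=24, ax=axes[0])    # compare with the theoretical ACF patterns
plot_pacf(y, lags=24, ax=axes[1])   # compare with the theoretical PACF patterns
plt.tight_layout()
plt.show()
```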

THEORETICALLY AR(p) → (i) autocorrelations tail off to zero exponentially; (ii) p significant partial autocorrelations. MA(q) → (i) q significant autocorrelations; (ii) partial autocorrelations tail off to zero exponentially.
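These theoretical patterns can be generated directly; the sketch below, with arbitrary parameter values φ_1 = 0.7 and θ_1 = 0.5, uses the ArmaProcess helper from statsmodels (note its convention: the coefficient arrays are the lag polynomials, so (1 − 0.7B) is entered as [1, -0.7]).

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# AR(1) with phi_1 = 0.7:  (1 - 0.7B) Y_t = e_t
ar1 = ArmaProcess(ar=np.array([1, -0.7]), ma=np.array([1]))
print("AR(1) ACF :", np.round(ar1.acf(lags=6), 3))    # tails off exponentially
print("AR(1) PACF:", np.round(ar1.pacf(lags=6), 3))   # cuts off after lag 1

# MA(1) with theta_1 = 0.5:  Y_t = (1 - 0.5B) e_t
ma1 = ArmaProcess(ar=np.array([1]), ma=np.array([1, -0.5]))
print("MA(1) ACF :", np.round(ma1.acf(lags=6), 3))    # cuts off after lag 1
print("MA(1) PACF:", np.round(ma1.pacf(lags=6), 3))   # tails off exponentially
```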

IDENTIFICATION Plot the data. If necessary, transform the data to achieve stationarity in the variance (log or power transformation). Check the time series plot, ACF, and PACF for nonstationarity in the mean; if the series is nonstationary in the mean, take first differences. When stationarity has been achieved, examine the ACF and PACF to identify a plausible model by comparing them with the theoretical patterns. Note: because we are dealing with real data, and randomness is present, the ACF and PACF will rarely follow the underlying theoretical process exactly.

ESTIMATION Study the sampling statistics of the current solution: a) check the significance of the parameters by looking at the t-test results (p-values below the typical α = 0.05 level indicate significance); b) overfit to verify the model solution (the additional terms should be insignificant); c) decide between competing models by checking for a minimum MS or AIC (the latter is Akaike's Information Criterion; it is not provided in Minitab).
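A sketch of the estimation step, under the assumption that ARIMA(1,1,0) and ARIMA(1,1,1) are the competing candidates for a hypothetical series y: the parameter p-values come from the fitted results, and the AIC (not available in Minitab, but reported by statsmodels) is used to choose between the models.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

y = np.cumsum(np.random.default_rng(11).normal(size=200))  # placeholder series

candidates = {"ARIMA(1,1,0)": (1, 1, 0), "ARIMA(1,1,1)": (1, 1, 1)}
for name, order in candidates.items():
    res = ARIMA(y, order=order).fit()
    # Parameters with p-values above 0.05 are not significant (possible overfitting).
    print(name, "AIC =", round(res.aic, 2))
    print(res.pvalues.round(3))
```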

DIAGNOSTIC CHECKING Study the residuals to verify that they are random: a) check the ACF of the residuals (no significant spikes should be present); b) check the PACF of the residuals (no significant spikes should be present); c) test for individually significant autocorrelations and use the Box-Pierce Q statistic to test for nonrandomness in a set of autocorrelations (all p-values should be > 0.05); d) if the residuals are not white noise, go back to identification to improve the model selection.
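A sketch of these checks in Python, assuming res is a fitted ARIMA results object like the ones in the earlier examples: the residual ACF and PACF should show no significant spikes, and the Ljung-Box/Box-Pierce p-values should all exceed 0.05.

```python
import matplotlib.pyplot as plt
import numpy as np
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.arima.model import ARIMA

y = np.cumsum(np.random.default_rng(13).normal(size=200))  # placeholder series
res = ARIMA(y, order=(1, 1, 1)).fit()

plot_acf(res.resid, lags=24)    # a) no significant spikes expected
plot_pacf(res.resid, lags=24)   # b) no significant spikes expected
plt.show()

# c) portmanteau tests on the residual autocorrelations
print(acorr_ljungbox(res.resid, lags=[6, 12, 24], boxpierce=True))
```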

BOX-PIERCE Q STATISTIC Null hypothesis: the set of autocorrelations out to lag k is equivalent to the null set (i.e., the residuals are random). Failing to reject the null hypothesis indicates that the model is adequate, as there is no pattern left in the residuals. Consequently we want p-values greater than the typical alpha level of 0.05, because here we want to retain the null.
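For concreteness, the Box-Pierce statistic is n times the sum of the squared residual autocorrelations out to lag k, referred to a chi-square distribution; the sketch below computes it by hand for a hypothetical white-noise residual series (when applied to ARIMA residuals, the degrees of freedom would normally be reduced by the number of estimated AR and MA parameters).

```python
import numpy as np
from scipy import stats
from statsmodels.tsa.stattools import acf

resid = np.random.default_rng(17).normal(size=200)  # hypothetical residual series
n, k = len(resid), 12

r = acf(resid, nlags=k, fft=False)[1:]   # autocorrelations r_1 ... r_k
Q = n * np.sum(r ** 2)                   # Box-Pierce Q statistic
p_value = stats.chi2.sf(Q, df=k)         # df = k here; subtract p + q for model residuals

print(f"Q = {Q:.2f}, p-value = {p_value:.3f}")  # p > 0.05 -> no evidence against randomness
```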