Seasonal ARIMA (FPP Chapter 8)

Backshift Notation
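
As a minimal sketch (assuming these slides follow the FPP textbook's standard conventions; the original slide equations are not reproduced in the transcript), the backshift operator B shifts a series back one period, and ordinary and seasonal differences are written as polynomials in B:

\[
B y_t = y_{t-1}, \qquad B^m y_t = y_{t-m}
\]

\[
(1 - B)\, y_t = y_t - y_{t-1}, \qquad (1 - B^m)\, y_t = y_t - y_{t-m}
\]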

Seasonal ARIMA Models
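
As a further sketch under the same assumption, an ARIMA(p,d,q)(P,D,Q)_m model multiplies non-seasonal and seasonal AR, differencing, and MA factors. For quarterly data (m = 4), an ARIMA(1,1,1)(1,1,1)_4 model can be written as

\[
(1 - \phi_1 B)(1 - \Phi_1 B^{4})(1 - B)(1 - B^{4})\, y_t
  = (1 + \theta_1 B)(1 + \Theta_1 B^{4})\, \varepsilon_t
\]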

Rules for Identifying ARIMA Order

Rule 1: If the series has positive autocorrelations out to a high number of lags, it probably needs a higher order of differencing.

Rule 2: If the lag-1 autocorrelation is zero or negative, or the autocorrelations are all small and patternless, the series does not need a higher order of differencing. If the lag-1 autocorrelation is -0.5 or more negative, the series may be overdifferenced. Beware of overdifferencing!

Rule 3: The optimal order of differencing is often the order of differencing at which the standard deviation of the differenced series is lowest.
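
As a hedged illustration (not part of the original slides), the differencing checks in Rules 1-3 can be run in R with the forecast package, using the euretail series from the example later in this deck (available in the fpp companion package):

library(fpp)                 # loads the forecast package and the euretail data

ndiffs(euretail)             # suggested number of ordinary differences
nsdiffs(euretail)            # suggested number of seasonal differences

# Rules 1-2: inspect the ACF of the raw and differenced series
tsdisplay(euretail)
tsdisplay(diff(diff(euretail, lag = 4)))

# Rule 3: compare standard deviations across candidate orders of differencing
sd(euretail)
sd(diff(euretail))
sd(diff(diff(euretail, lag = 4)))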

Rules for Identifying ARIMA Order

Rule 4: A model with no orders of differencing assumes that the original series is stationary (among other things, mean-reverting). A model with one order of differencing assumes that the original series has a constant average trend (e.g. a random walk or SES-type model, with or without growth). A model with two orders of total differencing assumes that the original series has a time-varying trend (e.g. a random trend or linear exponential smoothing-type model).

Rule 5: A model with no orders of differencing normally includes a constant term (which represents the mean of the series). A model with two orders of total differencing normally does not include a constant term. In a model with one order of total differencing, a constant term should be included if the series has a non-zero average trend.
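
As a sketch of how Rules 4-5 map onto the forecast package's Arima() interface (the model orders here are illustrative assumptions, not choices made on the slides):

library(fpp)

# One order of total differencing: a drift (constant) term captures a non-zero average trend
fit_drift <- Arima(euretail, order = c(0, 1, 1), include.drift = TRUE)

# Two orders of total differencing (d + D = 2): the constant is normally omitted
fit_nodrift <- Arima(euretail, order = c(0, 1, 1), seasonal = c(0, 1, 1))

summary(fit_drift)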

Identifying the Number of AR & MA Terms

Rule 6: If the partial autocorrelation function (PACF) of the differenced series displays a sharp cutoff and/or the lag-1 autocorrelation is positive (i.e., the series appears slightly "underdifferenced"), consider adding one or more AR terms to the model. The lag beyond which the PACF cuts off is the indicated number of AR terms.

Rule 7: If the autocorrelation function (ACF) of the differenced series displays a sharp cutoff and/or the lag-1 autocorrelation is negative (i.e., the series appears slightly "overdifferenced"), consider adding an MA term to the model. The lag beyond which the ACF cuts off is the indicated number of MA terms.
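
A short R sketch of Rules 6-7 (again an illustration, not from the slides): read candidate AR and MA orders off the PACF and ACF of the fully differenced series.

library(fpp)

dy <- diff(diff(euretail, lag = 4))   # one seasonal plus one ordinary difference
Acf(dy)     # a sharp cutoff here suggests the number of MA terms (Rule 7)
Pacf(dy)    # a sharp cutoff here suggests the number of AR terms (Rule 6)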

Identifying the Number of AR & MA Terms

Rule 8: It is possible for an AR term and an MA term to cancel each other's effects, so if a mixed AR-MA model seems to fit the data, also try a model with one fewer AR term and one fewer MA term, particularly if the parameter estimates in the original model require more than 10 iterations to converge.

Rule 9: If there is a unit root in the AR part of the model (i.e., the sum of the AR coefficients is almost exactly 1), reduce the number of AR terms by one and increase the order of differencing by one.
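
A rough sketch of the Rule 9 check (the ARIMA(2,0,1)(0,1,1) order is an arbitrary illustrative choice, not from the slides): sum the estimated AR coefficients of a fitted model and watch for values near 1.

library(fpp)

fit <- Arima(euretail, order = c(2, 0, 1), seasonal = c(0, 1, 1),
             include.drift = TRUE)
ar_coefs <- coef(fit)[grepl("^ar", names(coef(fit)))]
sum(ar_coefs)   # a sum close to 1 suggests dropping one AR term and differencing once more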

Identifying the Number of AR & MA Terms

Rule 10: If there is a unit root in the MA part of the model (i.e., the sum of the MA coefficients is almost exactly 1), reduce the number of MA terms by one and reduce the order of differencing by one.

Rule 11: If the long-term forecasts appear erratic or unstable, there may be a unit root in the AR or MA coefficients.
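
A similar sketch for Rules 10-11 (illustrative, not from the slides): sum the MA coefficients and eyeball the long-horizon forecasts for erratic behaviour.

library(fpp)

fit <- auto.arima(euretail)
ma_coefs <- coef(fit)[grepl("^ma", names(coef(fit)))]
sum(ma_coefs)                  # R writes the MA polynomial as (1 + ma1*B + ...), so an MA unit root shows up as a sum near -1
plot(forecast(fit, h = 24))    # Rule 11: unstable long-term forecasts hint at a unit root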

Identifying the Seasonal Part of the Model

Rule 12: If the series has a strong and consistent seasonal pattern, use one order of seasonal differencing, but never use more than one order of seasonal differencing or more than two orders of total differencing (seasonal + nonseasonal).

Rule 13: If the autocorrelation at the seasonal period is positive, consider adding an SAR term to the model. If the autocorrelation at the seasonal period is negative, consider adding an SMA term to the model. Do not mix SAR and SMA terms in the same model, and avoid using more than one of either kind.
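
A sketch of Rules 12-13 for quarterly data (the model orders are illustrative assumptions): take one seasonal difference, then add either an SMA or an SAR term depending on the sign of the autocorrelation at the seasonal lag, but not both.

library(fpp)

dy <- diff(euretail, lag = 4)            # one order of seasonal differencing (Rule 12)
Acf(diff(dy))                            # check the sign of the spike at lag 4 (Rule 13)

fit_sma <- Arima(euretail, order = c(0, 1, 1), seasonal = c(0, 1, 1))   # negative seasonal autocorrelation
fit_sar <- Arima(euretail, order = c(0, 1, 1), seasonal = c(1, 1, 0))   # positive seasonal autocorrelation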

Example 1: European Retail Trade

Example 1: European Retail Trade

> auto.arima(euretail)
Series: euretail
ARIMA(1,1,1)(0,1,1)[4]

Coefficients:
         ar1      ma1     sma1
      0.8828  -0.5208  -0.9704
s.e.  0.1424   0.1756   0.6793

sigma^2 estimated as 0.1411:  log likelihood=-30.19
AIC=68.37   AICc=69.11   BIC=76.68

Example 1: European Retail Trade

> auto.arima(euretail, stepwise=FALSE, approximation=FALSE)
Series: euretail
ARIMA(0,1,3)(0,1,1)[4]

Coefficients:
         ma1     ma2     ma3     sma1
      0.2625  0.3697  0.4194  -0.6615
s.e.  0.1239  0.1260  0.1296   0.1555

sigma^2 estimated as 0.1451:  log likelihood=-28.7
AIC=67.4   AICc=68.53   BIC=77.78
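
As a follow-up sketch (not part of the original output): the second search, with stepwise and approximation turned off, finds a model with a slightly lower AICc (68.53 versus 69.11), so it would typically be residual-checked and used for forecasting along these lines.

library(fpp)

fit <- auto.arima(euretail, stepwise = FALSE, approximation = FALSE)
Acf(residuals(fit))                                               # residuals should look like white noise
Box.test(residuals(fit), lag = 16, fitdf = 4, type = "Ljung-Box")
plot(forecast(fit, h = 12))                                       # three years of quarterly forecasts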

Example 1: European Retail Trade