Time Series Analysis Negar Koochakzadeh

Outline
Introduction
- Time Series Data
- Stationary / Non-Stationary TS Data
Existing TSA Models
- AR (Auto-Regression)
- MA (Moving Average)
- ARMA (Auto-Regressive Moving Average)
- ARIMA (Auto-Regressive Integrated Moving Average)
- SARIMA (Seasonal ARIMA)
Examples
- Example 1: International Airline Passengers
- Examples 2 & 3: Energy Load Prediction
Time Series Data Mining
- Time Series Classification (SVM)
- Example 4: Stock Market Analysis

Time Series Data In many fields of study, data are collected from a system over time. Such a sequence of observations generates a time series. Examples:
- Closing prices of the stock market
- A country's unemployment rate
- Temperature readings of an industrial furnace
- Sea level changes in coastal regions
- Number of flu cases in a region
- Inventory levels at a production site

Temporal Behaviour Most physical processes do not change quickly, which often makes consecutive observations correlated. This correlation between consecutive observations is called autocorrelation. Standard modeling methods based on the assumption of independent observations can therefore be misleading, and we need alternative methods that take the serial dependence in the data into account.

Stationary Time Series Data Stationary time series are characterized by a distribution that is invariant under time shifts; in particular, the mean and variance of such series are constant over time. If arbitrary snapshots of the time series we study exhibit similar behaviour in central tendency and spread, we can assume that the series is indeed stationary.

Stationary or Non-Stationary? In practice, there is no clear demarcation line between a stationary and a non-stationary process. Some methods to identify stationarity:
- Visual inspection
- Intuition and knowledge about the process
- Autocorrelation Function (ACF)
- Variogram

Visual Inspection A properly constructed graph of a time series can dramatically improve the statistical analysis and accelerate the discovery of the hidden information in the data. "You can observe a lot by watching." [Yogi Berra, 1963] This is particularly true of time series data analysis!

Intuition and Knowledge Does it make sense...
- for a tightly controlled chemical process to exhibit similar behaviour in mean and variance over time?
- to expect the stock market "to remain in equilibrium about a constant mean level"?
The choice between a stationary and a non-stationary model must often be made on the basis of not only the data but also a physical understanding of the process.

Autocorrelation Function (ACF) Autocorrelation is the cross-correlation of a time series with itself at lag k. The ACF summarizes, as a function of k, how correlated observations that are k lags apart are. If the ACF does not dampen out, the process is likely not stationary (for a non-stationary time series, the ACF will not die out quickly). Approximate 95% confidence bounds of ±2/√n, where n is the length of the series, indicate the reliability of each estimated autocorrelation.
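
As an illustration beyond the slides, here is a minimal Python sketch, assuming statsmodels and matplotlib are installed, that computes and plots the sample ACF of a hypothetical series; the ±2/√n bounds appear as the shaded band in the plot:

    import numpy as np
    import matplotlib.pyplot as plt
    from statsmodels.tsa.stattools import acf
    from statsmodels.graphics.tsaplots import plot_acf

    # Hypothetical example series: an AR(1) process with phi = 0.8
    rng = np.random.default_rng(0)
    z = np.zeros(200)
    for t in range(1, 200):
        z[t] = 0.8 * z[t - 1] + rng.standard_normal()

    print(acf(z, nlags=20))   # sample autocorrelations r_0, r_1, ..., r_20
    plot_acf(z, lags=20)      # ACF plot with approximate 95% confidence bounds
    plt.show()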

Variogram The variogram G_k measures the variance of differences k time units apart relative to the variance of differences one time unit apart: G_k = Var(z_{t+k} - z_t) / Var(z_{t+1} - z_t). For a stationary process, G_k plotted as a function of k will approach an asymptote; if the process is non-stationary, G_k will increase monotonically.
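
A small sketch of this diagnostic, coded directly from the definition above rather than taken from the slides:

    import numpy as np

    def variogram(z, max_lag):
        # G_k = Var(z[t+k] - z[t]) / Var(z[t+1] - z[t])
        denom = np.var(np.diff(z))
        return [np.var(z[k:] - z[:-k]) / denom for k in range(1, max_lag + 1)]

    # For a stationary series G_k levels off; for a random walk it keeps growing.
    rng = np.random.default_rng(1)
    print(variogram(rng.standard_normal(500), 10))             # roughly flat near 1
    print(variogram(np.cumsum(rng.standard_normal(500)), 10))  # increases with k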

Modeling and Prediction "If we wish to make predictions, then clearly we must assume that something does not vary with time." [Brockwell and Davis, 2002] Let's try to build a model for our time series process based on:
- Serial dependency
- Leading indicators
- Disturbances
True disturbances are caused by unknown and/or uncontrollable factors that have a direct impact on the process. It is impossible to construct a comprehensive deterministic model that accounts for all possible disturbances, since by definition they are unknown. In such cases, a probabilistic or stochastic model is more appropriate to describe the behaviour of the process.

Notation: Backshift Operator The backshift operator B shifts an observation back one time unit: B z_t = z_{t-1}, and more generally B^k z_t = z_{t-k}. The regular difference operator is then written as 1 - B, so that (1 - B) z_t = z_t - z_{t-1}.

Auto-Regressive Models AR(p) An autoregressive model of order p expresses the current value as a linear function of the p previous values plus an error term: z_t = δ + φ_1 z_{t-1} + φ_2 z_{t-2} + ... + φ_p z_{t-p} + a_t, where a_t is an error term (white noise) assumed to be uncorrelated with zero mean and constant variance. The random error a_t cannot be observed; instead we estimate it using the one-step-ahead forecast error. The regression coefficients φ_i, i = 1, ..., p, are parameters to be estimated from the data.
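
A minimal sketch, assuming statsmodels, that simulates an AR(2) process with hypothetical coefficients and recovers them from the data:

    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    # Simulate z_t = 0.6 z_{t-1} + 0.2 z_{t-2} + a_t
    rng = np.random.default_rng(2)
    z = np.zeros(500)
    for t in range(2, 500):
        z[t] = 0.6 * z[t - 1] + 0.2 * z[t - 2] + rng.standard_normal()

    fit = AutoReg(z, lags=2).fit()
    print(fit.params)   # intercept followed by estimates of phi_1 and phi_2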

Moving Average Models MA(q) In a moving average model, current and previous disturbances affect the current value: a sequence of random shocks bombards the system, not just a single shock. The MA(q) model is z_t = μ + a_t - θ_1 a_{t-1} - ... - θ_q a_{t-q}, where the a_t are uncorrelated random shocks with zero mean and constant variance, and the coefficients θ_i, i = 1, ..., q, are parameters to be determined from the data.

Auto-Regressive Moving Average ARMA(p,q) Typical stationary time series models come in three general classes: auto-regressive (AR) models, moving average (MA) models, and a combination of the two, the ARMA(p,q) model: z_t = δ + φ_1 z_{t-1} + ... + φ_p z_{t-p} + a_t - θ_1 a_{t-1} - ... - θ_q a_{t-q}.
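
A sketch, assuming statsmodels, that simulates an ARMA(1,1) process and fits it with the ARIMA class using d = 0; the coefficients are hypothetical, and note that statsmodels uses a plus-sign convention for the MA terms:

    import numpy as np
    from statsmodels.tsa.arima_process import arma_generate_sample
    from statsmodels.tsa.arima.model import ARIMA

    # Lag-polynomial convention: ar = [1, -phi_1], ma = [1, +0.4] means
    # z_t = 0.7 z_{t-1} + a_t + 0.4 a_{t-1}
    np.random.seed(3)
    z = arma_generate_sample(ar=[1, -0.7], ma=[1, 0.4], nsample=500)

    fit = ARIMA(z, order=(1, 0, 1)).fit()   # ARMA(1,1) = ARIMA with d = 0
    print(fit.summary())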

Identifying an Appropriate Model The ACF plays a crucial role in the identification of time series models. The particular model within the ARMA class is identified by looking at the ACF and PACF.

Partial Autocorrelation Function (PACF) The partial autocorrelation at lag k is a conditional correlation: the correlation between two variables given that we know, and take into account, the values of some other set of variables. Here it measures how Z_t and Z_{t-k} are correlated once we account for how both are related to the intermediate observations Z_{t-1}, Z_{t-2}, ..., Z_{t-k+1}. Formally, the kth-order partial autocorrelation is Corr(Z_t - P(Z_t), Z_{t-k} - P(Z_{t-k})), where P(x) denotes the projection of x onto the space spanned by Z_{t-1}, Z_{t-2}, ..., Z_{t-k+1}. Partial correlation in a regression context: let y be a response variable and x1, x2, x3 predictor variables. The partial correlation between y and x3 is the correlation between the two determined while taking into account how both y and x3 are related to x1 and x2. It can be found by correlating the residuals from two regressions: (1) a regression predicting y from x1 and x2, and (2) a regression predicting x3 from x1 and x2. Basically, we correlate the "parts" of y and x3 that are not predicted by x1 and x2.
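
A companion sketch, assuming statsmodels, plotting the sample PACF of the hypothetical AR(2) series used earlier; per the identification table below, its PACF should cut off after lag 2:

    import numpy as np
    import matplotlib.pyplot as plt
    from statsmodels.graphics.tsaplots import plot_pacf

    rng = np.random.default_rng(2)
    z = np.zeros(500)
    for t in range(2, 500):
        z[t] = 0.6 * z[t - 1] + 0.2 * z[t - 2] + rng.standard_normal()

    plot_pacf(z, lags=20)   # spikes at lags 1 and 2, then cuts off
    plt.show()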

ARMA Model Identification from ACF and PACF
- AR(p): ACF tails off (infinite damped exponentials and/or damped sine waves); PACF is finite and cuts off after p lags.
- MA(q): ACF is finite and cuts off after q lags; PACF tails off (infinite damped exponentials and/or damped sine waves).
- ARMA(p, q): both ACF and PACF tail off (infinite damped exponentials and/or damped sine waves).
Source: Adapted from Box, Jenkins, and Reinsel (BJR).

Examples

Models for Non-Stationary Data Standard autoregressive moving average (ARMA) models apply only to stationary time series, but the assumption that a time series is stationary is often quite unrealistic. (Stationarity is not natural!) For a system to exhibit stationary behaviour, it has to be tightly controlled and maintained over time; otherwise, it will tend to drift away from stationarity.

Converting Non-Stationary Data to Stationary A more realistic claim is that the changes to a process, i.e., the first difference, form a stationary process. If that is not realistic, we may try to see whether the changes of the changes, the second difference, form a stationary process. If so, we can model the changes, make forecasts of their future values, and from the model of the changes build models and create forecasts of the original non-stationary time series. In practice, we seldom need to go beyond second-order differencing.
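
A quick sketch of first and second differencing with pandas; the trending series is hypothetical:

    import numpy as np
    import pandas as pd

    # Hypothetical series with a linear trend: non-stationary in the mean
    rng = np.random.default_rng(4)
    z = pd.Series(0.5 * np.arange(200) + rng.standard_normal(200))

    d1 = z.diff()          # first difference: z_t - z_{t-1}
    d2 = z.diff().diff()   # second difference, if the first still shows a trend
    print(d1.mean(), d2.mean())   # d1 averages near the trend slope, d2 near zero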

Auto-Regressive Integrated Moving Average (ARIMA) For non-stationary data, it is appropriate to difference the data before fitting a (stationary) ARMA model to the differenced series. Because the inverse operation of differencing is summing, or integrating, an ARMA model applied to data differenced d times is called an autoregressive integrated moving average process, ARIMA(p, d, q). In practice, the orders p, d, and q are seldom higher than 2.
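
A minimal sketch, assuming statsmodels, fitting an ARIMA(1, 1, 1) model to a hypothetical non-stationary series and forecasting; in practice the orders would come from the identification stages listed below:

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(4)
    z = np.cumsum(0.5 + rng.standard_normal(200))   # hypothetical random walk with drift

    fit = ARIMA(z, order=(1, 1, 1)).fit()   # d = 1: model the first difference
    print(fit.summary())
    print(fit.forecast(steps=12))           # 12-step-ahead forecasts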

Stages of the Time Series Model Building Process Using ARIMA
1. Consider a general ARIMA model.
2. Identify the appropriate degree of differencing, if needed.
3. Using the ACF and PACF, find a tentative model.
4. Estimate the parameters of the model using appropriate software.
5. Perform the residual analysis. Is the model adequate?
6. Start forecasting.

Model Evaluation Once a model has been fitted to the data, we proceed to conduct a number of diagnostic checks. If the model fits well, the residuals should essentially behave like white noise; in other words, they should be uncorrelated with constant variance. A standard check is to compute the ACF and PACF of the residuals: if they stay within the confidence interval, there is no indication that the model fits poorly.
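
A sketch of these checks, assuming statsmodels; the Ljung-Box test is a standard companion to the residual ACF, added here beyond the slide text:

    import numpy as np
    import matplotlib.pyplot as plt
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.graphics.tsaplots import plot_acf
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(4)
    z = np.cumsum(0.5 + rng.standard_normal(200))   # same hypothetical series as above
    resid = ARIMA(z, order=(1, 1, 1)).fit().resid

    plot_acf(resid, lags=20)   # residual ACF should stay within the confidence bounds
    plt.show()
    print(acorr_ljungbox(resid, lags=[10]))   # small p-values flag leftover autocorrelation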

Exponentially Weighted Moving Average (EWMA) The EWMA is a special case of the ARIMA model. Unlike a regular average, which assigns equal weight to all observations, an EWMA has a relatively short memory and assigns decreasing weights to past observations. It makes good practical sense that a forecast should be a weighted average assigning the most weight to the most recent observation, somewhat less weight to the one before it, and so on.
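
A sketch with pandas; the equivalence of the EWMA forecast to an ARIMA(0,1,1) model is a standard result mentioned here as background, not taken from the slides:

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(5)
    z = pd.Series(np.cumsum(rng.standard_normal(100)))   # hypothetical series

    # With smoothing constant alpha, the weights decay as alpha, alpha(1-alpha), ...
    ewma = z.ewm(alpha=0.3, adjust=False).mean()
    print(ewma.iloc[-1])   # the last smoothed value is the one-step-ahead forecast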

Seasonal Models In ARIMA models, the serial dependence of the current observation on previous observations is often strongest for the immediate past and follows a decaying pattern as we move further back in time. For some systems, however, this dependence shows a repeating, cyclic behaviour. This cyclic, or more commonly, seasonal pattern can be exploited to further improve forecasting performance, and ARIMA models are flexible enough to model both seasonal and non-seasonal dependence.

Example 1: International Airline Passengers

Trend and Seasonal Relationship Two relationships go on simultaneously: between observations for successive months within the same year, and between observations for the same month in successive years. Therefore, we essentially need to build two time series models and then combine the two. If the season is s periods long (in this example, s = 12 months), then observations that are s time intervals apart are alike.

Pre-Processing: Log Transformation The variability is not constant but increases over time. The goal of the transformation is to find a scale on which the residuals, after fitting a model, have homogeneous variability.

Apply Differencing to Seasonal Data For seasonal data, we may need not only the regular difference (1 - B) z_t but also a seasonal difference (1 - B^s) z_t. Sometimes we may even need both (e.g., (1 - B)(1 - B^12) z_t) to obtain an ACF that dies out sufficiently quickly.
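
A sketch of combined regular and seasonal differencing with pandas; the monthly series is hypothetical:

    import numpy as np
    import pandas as pd

    # Hypothetical monthly series with a trend and a 12-month seasonal cycle
    rng = np.random.default_rng(6)
    t = np.arange(144)
    z = pd.Series(0.1 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.standard_normal(144))

    w = z.diff().diff(12)   # (1 - B)(1 - B^12) z_t
    print(w.dropna().head())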

Investigate ACFs Only the last candidate (the combination of a regular difference and a seasonal difference) yields a stationary series.

Model Identification Identifying stationary seasonal models is a modification of the procedure used for regular ARMA models, in which the patterns of the sample ACF and PACF provide guidance. First, look for similarities that are 12 lags apart: the ACF seems to cut off after the first seasonal lag (k = 12), a sign of a moving average model applied to the 12-month seasonal pattern. Second, look for patterns between successive months: the ACF again seems to cut off after the first lag, suggesting a first-order MA term in the regular model. For non-seasonal data, it is usually sufficient to consider the ACF and PACF for up to 20-25 lags, but for seasonal data we recommend increasing the number of lags to at least 3 or 4 multiples of the seasonality.

Model Evaluation ACF of the residuals after fitting a first-order seasonal MA model to the differenced log series (1 - B)(1 - B^12) log z_t: the ACF shows a significant negative spike at lag 1, indicating that we need an additional regular moving average term.

ARIMA (p,d,q)x(P,D,Q)12 The model identified above, with one regular and one seasonal difference plus first-order MA terms in both the regular and seasonal parts, is ARIMA(0,1,1)x(0,1,1)12, the classic "airline model".
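
A sketch fitting this seasonal model with statsmodels. The data loading is an assumption: get_rdataset fetches the classic AirPassengers series over the internet, and any locally stored 12-month seasonal series would work the same way:

    import numpy as np
    import statsmodels.api as sm

    # Fetch the airline passengers data (requires internet access)
    data = sm.datasets.get_rdataset("AirPassengers").data
    y = np.log(data["value"])   # log transform to stabilize the growing variance

    # ARIMA(0,1,1)x(0,1,1)_12: regular + seasonal difference, regular + seasonal MA(1)
    model = sm.tsa.statespace.SARIMAX(y, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12))
    fit = model.fit(disp=False)
    print(fit.summary())
    print(np.exp(fit.forecast(steps=12)))   # forecasts back on the original scale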

Example 2: Energy Peak Load Prediction The hourly peak load follows a daily periodic pattern (s = 24 hours). Convert the peak load values into a stationary series and then apply ARMA.

Example 3: Energy Load Prediction Daily, weekly, and monthly periodic patterns; exogenous variables (temperature). The authors proposed applying Periodic Auto-Regression (PAR). An auto-regression is periodic when its parameters are allowed to vary across seasons.

Example 3 (cont'd) Proposed model template:
- Seasonally varying intercept term
- Dummy variable for the weekly seasonal pattern
- Dummy variable for the monthly seasonal pattern
- Exogenous variable for temperature sensitivity

Time Series Data Mining Use the serial dependency of the forecast variable to build the training set. Leading indicators may exhibit behaviour similar to the forecast variable; the important task is to find out whether a lagged relationship exists between the indicators and the predicted variable. If such a relationship exists, then from the current and past behaviour of the leading indicators it may be possible to determine how the predicted variable (e.g., sales) will behave in the near future.

Time Series SVM The standard SVM optimization problem minimizes a regularization term plus a penalty on the training errors; in the modified SVM for time series, the error term is exponentially weighted so that errors on recent observations count more than errors on distant ones [4].
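
As a hedged reconstruction of the missing formulas (following the general form of the weighted-error SVM in [4]; the exact weighting function used there may differ), in LaTeX:

    % Standard SVM/SVR objective: all slack variables weighted equally
    \min_{w,\,b,\,\xi,\,\xi^*}\; \tfrac{1}{2}\lVert w \rVert^2
      + C \sum_{i=1}^{N} (\xi_i + \xi_i^*)

    % Modified SVM for time series: increasing weights lambda_i put more
    % emphasis on errors at recent observations
    \min_{w,\,b,\,\xi,\,\xi^*}\; \tfrac{1}{2}\lVert w \rVert^2
      + C \sum_{i=1}^{N} \lambda_i (\xi_i + \xi_i^*),
    \qquad \lambda_1 \le \lambda_2 \le \cdots \le \lambda_N

Here the ξ_i and ξ_i* are the slack variables of the ε-insensitive loss, and the weights λ_i grow toward the most recent observation i = N.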

Example 4: Stock Market Analysis Portfolio optimization is the decision process of asset selection and weighting such that the collection of assets satisfies an investor's objectives. Here we exploit the serial dependency, or lagged relationship, between stock performance and financial indicators from the companies.

Stock Ranking Learn the relationship between stocks' current features and their future rank score (a lagged relationship) by applying a modified version of the SVM Rank algorithm for time series, based on an exponentially weighted error.

References
[1] S. Bisgaard and M. Kulahci, Time Series Analysis and Forecasting by Example. Hoboken, NJ: John Wiley & Sons, Inc., 2011.
[2] R. P. Singh, P. X. Gao, and D. J. Lizotte, "On hourly home peak load prediction," in IEEE SmartGridComm, 2012.
[3] M. Espinoza, C. Joye, R. Belmans, and B. De Moor, "Short-term load forecasting, profile identification, and customer segmentation: A methodology based on periodic time series," IEEE Transactions on Power Systems, vol. 20, pp. 1622-1630, 2005.
[4] F. E. H. Tay and L. Cao, "Modified support vector machines in financial time series forecasting," Neurocomputing, vol. 48, pp. 847-861, 2002.

Questions?