Time Series Analysis and Forecasting

Time Series Analysis and Forecasting
Lecture Notes MA(4030), prepared by T. M. J. A. Cooray

Introduction
A time series is a set of observations generated sequentially in time.
Continuous vs. discrete time series.
The observations of a discrete time series, made at some fixed interval h at times 1, 2, …, N, may be denoted by x(1), x(2), …, x(N).

Introduction (cont.)
Discrete time series may arise in two ways:
1. By sampling a continuous time series
2. By accumulating a variable over a period of time
Characteristics of the time series considered here: time periods are of equal length, and there are no missing values.

Components of a time series
x_t = F_t + e_t, where F_t denotes the systematic part of the series and e_t the random (irregular) part.

Definitions: Trend pattern
Although time series data generally exhibit random fluctuations, a time series may also show gradual shifts or movements to relatively higher or lower values over a longer period of time. A trend is usually the result of long-term factors such as population increases or decreases, changing demographic characteristics of the population, technology, and/or consumer preferences.

Definitions
A seasonal pattern exists when a series is influenced by seasonal factors (e.g., the quarter of the year, the month, or the day of the week). Seasonality is always of a fixed and known period. Hence, seasonal time series are sometimes called periodic time series.
A cyclic pattern exists when the data exhibit rises and falls that are not of a fixed period. The duration of these fluctuations is usually at least 2 years. Think of business cycles, which usually last several years, but where the length of the current cycle is unknown beforehand.
Many people confuse cyclic behavior with seasonal behavior, but they are really quite different. If the fluctuations are not of a fixed period they are cyclic; if the period is unchanging and associated with some aspect of the calendar, the pattern is seasonal. In general, the average length of cycles is longer than the length of a seasonal pattern, and the magnitude of cycles tends to be more variable than the magnitude of seasonal patterns.

Areas of application
Forecasting.
Determination of the transfer function of a system.
Design of simple feed-forward and feedback control schemes.

Forecasting
Applications: economic and business planning; inventory and production control; control and optimization of industrial processes.
The lead time of the forecasts is the period over which forecasts are needed.
The degree of sophistication ranges from simple ideas (moving averages, simple regression techniques) to complex statistical concepts (the Box-Jenkins methodology).

Approaches to forecasting
Self-projecting approach.
Cause-and-effect approach.

Approaches to forecasting (cont.)
Self-projecting approach.
Advantages: quickly and easily applied; a minimum of data is required; gives reasonable short- to medium-term forecasts; provides a baseline against which forecasts developed with other models can be measured.
Disadvantages: not useful for forecasting far into the future; does not take external factors into account.
Cause-and-effect approach.
Advantages: brings more information into the forecast; more accurate medium- to long-term forecasts.
Disadvantages: forecasts of the explanatory time series are required.

Some traditional self-projecting models
Overall trend models. The trend could be linear, exponential, parabolic, etc. A linear trend has the form Trend_t = A + Bt. Short-term changes are difficult to track.
Smoothing models. They respond to the most recent behavior of the series, employ the idea of weighted averages, and range in their degree of sophistication. The simple exponential smoothing method: S_t = α·x_t + (1 − α)·S_{t−1}, with 0 < α < 1 (sketched in code below).
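To make the smoothing idea concrete, here is a minimal sketch of simple exponential smoothing in plain Python; the toy series and the choice α = 0.3 are purely illustrative and not taken from the notes.

```python
# Simple exponential smoothing: S_t = alpha * x_t + (1 - alpha) * S_{t-1}
def simple_exponential_smoothing(x, alpha=0.3):
    """Return the exponentially smoothed series for observations x (0 < alpha < 1)."""
    smoothed = [x[0]]                              # initialise with the first observation
    for obs in x[1:]:
        smoothed.append(alpha * obs + (1 - alpha) * smoothed[-1])
    return smoothed

series = [12.0, 13.5, 12.8, 14.1, 15.0, 14.6, 15.9]   # toy data
print(simple_exponential_smoothing(series, alpha=0.3))
```

A larger α makes the smoothed series track recent observations more closely; a smaller α gives heavier weight to the past.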

Some traditional self-projecting models (cont.)
Seasonal models. Very common; most seasonal time series also contain long- and short-term trend patterns.
Decomposition models. The series is decomposed into its separate patterns, and each pattern is modeled separately.
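As an illustration of the decomposition idea, a short sketch using statsmodels' seasonal_decompose on a synthetic monthly series; the generated data and the additive model are assumptions made only for the example.

```python
# Classical decomposition of a monthly series into trend, seasonal and residual parts.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Toy monthly series: linear trend + annual seasonality + noise (illustration only).
idx = pd.date_range("2015-01", periods=96, freq="MS")
rng = np.random.default_rng(0)
y = pd.Series(0.5 * np.arange(96) + 10 * np.sin(2 * np.pi * np.arange(96) / 12)
              + rng.normal(scale=2, size=96), index=idx)

result = seasonal_decompose(y, model="additive", period=12)
print(result.trend.dropna().head())
print(result.seasonal.head(12))       # one full seasonal cycle
```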

Drawbacks of the use of traditional models
There is no systematic approach for the identification and selection of an appropriate model; the identification process is therefore mainly trial and error.
It is difficult to verify the validity of the model.
Most traditional methods were developed from intuitive and practical considerations rather than from a statistical foundation.
They are too narrow to deal efficiently with all time series.

ARIMA models Autoregressive Integrated Moving-average Can represent a wide range of time series A “stochastic” modeling approach that can be used to calculate the probability of a future value lying between two specified limits

ARIMA(p, d, q), the Autoregressive Integrated Moving Average model, is a well-known time-series forecasting method proposed by Box and Jenkins in the early 1970s; it is therefore also called the Box-Jenkins model or the Box-Jenkins method. In ARIMA(p, d, q), AR denotes the autoregressive part and p the autoregressive order; MA denotes the moving-average part and q the moving-average order; d is the number of differences needed to make the series stationary. An ARIMA model transforms a nonstationary time series into a stationary one and then regresses the dependent variable on its own lagged values and on the current and lagged values of a random error term. Depending on whether the original series is stationary and on which components appear in the regression, the family includes moving-average (MA), autoregressive (AR), autoregressive moving-average (ARMA), and ARIMA processes.
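In symbols, the ARIMA(p, d, q) model just described has the standard textbook form below, written with the backshift operator B and with a_i, b_i for the AR and MA coefficients to match the notation used later in these notes; this is not a formula transcribed from the slides.

```latex
\[
\phi(B)\,\nabla^{d} x_t = \theta(B)\, e_t ,
\qquad
\phi(B) = 1 - a_1 B - \cdots - a_p B^{p} ,
\qquad
\theta(B) = 1 - b_1 B - \cdots - b_q B^{q} ,
\qquad
\nabla = 1 - B .
\]
```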

ARIMA models (cont.)
In the 1960s Box and Jenkins recognized the importance of these models in the area of economic forecasting. Their book "Time Series Analysis: Forecasting and Control", by George E. P. Box and Gwilym M. Jenkins, was first published in 1970, with a revised edition in 1976. The methodology is therefore often called the Box-Jenkins approach.

Transfer function modeling
Y_t = ν(B)·X_t, where ν(B) = ν_0 + ν_1·B + ν_2·B^2 + …
B is the backshift operator: B^m X_t = X_{t−m}.

Transfer function modeling (cont.)
The study of process dynamics can achieve better control and improved design.
Methods for estimating transfer function models:
Classical methods, based on deterministic perturbations. Uncontrollable disturbances ("noise") are not accounted for, and hence these methods have not always been successful.
Statistical methods, which make allowance for "noise": the Box-Jenkins methodology.

Process control
Feed-forward control and feedback control.
[Diagram: in each scheme a control equation translates the deviation of the output from its target into an adjustment of the compensating variable X.]

Process control (cont.)

Process control (cont.)
The Box-Jenkins approach to control is to typify the disturbance by a suitable time series or stochastic model, and the inertial characteristics of the system by a suitable transfer function model.
The "control equation" allows the action that should be taken at any given time to be calculated from the present and previous states of the system.
Various ways, corresponding to various levels of technological sophistication, can be used to execute a "control action" called for by the control equation.

The Box-Jenkins model building process
Flowchart: model identification → model estimation → is the model adequate? If not, modify the model and repeat; if yes, produce forecasts.

The Box-Jenkins model building process (cont.)
Model identification: autocorrelations and partial autocorrelations.
Model estimation: the objective is to minimize the sum of squares of the errors.
Model validation: certain diagnostics are used to check the validity of the model.
Model forecasting: the estimated model is used to generate forecasts and confidence limits for the forecasts.
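A minimal sketch of this identification–estimation–validation–forecasting cycle in Python with statsmodels; the simulated series, the ARIMA(1,1,1) order and the Ljung-Box lag are illustrative choices only, not prescriptions from the notes.

```python
# Box-Jenkins cycle: identification (ACF/PACF), estimation, diagnostic checking, forecasting.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.stattools import acf, pacf
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(1)
y = pd.Series(np.cumsum(rng.normal(size=200)))      # toy nonstationary series

# 1. Identification: inspect sample ACF/PACF of the differenced series.
dy = y.diff().dropna()
print("ACF :", np.round(acf(dy, nlags=10), 2))
print("PACF:", np.round(pacf(dy, nlags=10), 2))

# 2. Estimation: fit a candidate model (order chosen here only as an example).
fit = ARIMA(y, order=(1, 1, 1)).fit()
print(fit.summary())

# 3. Validation: Ljung-Box test on the residuals (large p-values suggest adequacy).
print(acorr_ljungbox(fit.resid, lags=[10]))

# 4. Forecasting: point forecasts with confidence limits.
forecast = fit.get_forecast(steps=12)
print(forecast.summary_frame(alpha=0.05))
```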

Important fundamentals
A Normal process; stationarity; regular differencing; autocorrelations (ACs); the white noise process; the linear filter model; invertibility.

A Normal process (a Gaussian process)
The Box-Jenkins methodology analyzes a time series as a realization of a stochastic process. The observation z_t at a given time t can be regarded as a realization of a random variable z_t with probability density function p(z_t).
The observations at any two times t_1 and t_2 may be regarded as realizations of two random variables z_{t1} and z_{t2} with joint probability density function p(z_{t1}, z_{t2}).
If the probability distribution associated with any set of times is a multivariate Normal distribution, the process is called a normal or Gaussian process.

Stationary stochastic processes
In order to model a time series with the Box-Jenkins approach, the series has to be stationary.
In practical terms, the series is stationary if it tends to wander more or less uniformly about some fixed level.
In statistical terms, a stationary process is assumed to be in a particular state of statistical equilibrium, i.e., p(x_t) is the same for all t.

Stationary stochastic processes (cont.)
The process is called "strictly stationary" if the joint probability distribution of any m observations made at times t_1, t_2, …, t_m is the same as that associated with m observations made at times t_1 + k, t_2 + k, …, t_m + k.
When m = 1, the stationarity assumption implies that the probability distribution p(z_t) is the same for all times t.

Stationary stochastic processes (cont.)
In particular, if z_t is a stationary process, then the first difference ∇z_t = z_t − z_{t−1} and the higher differences ∇^d z_t are stationary.
Most time series are nonstationary.

Achieving stationarity
Regular differencing (RD):
1st order: ∇x_t = (1 − B)x_t = x_t − x_{t−1}
2nd order: ∇^2 x_t = (1 − B)^2 x_t = x_t − 2x_{t−1} + x_{t−2}
B is the backward shift operator.
It is unlikely that more than two regular differencings would ever be needed. Sometimes regular differencing by itself is not sufficient and a prior transformation is also needed.
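In code, regular differencing is a one-liner; the sketch below also applies an augmented Dickey-Fuller test as one common (but not the only) way to judge how much differencing is needed. The random-walk data are synthetic.

```python
# Regular differencing and a simple stationarity check.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)
x = pd.Series(np.cumsum(rng.normal(size=300)))   # random walk: nonstationary

d1 = x.diff().dropna()          # first difference:  (1 - B) x_t
d2 = x.diff().diff().dropna()   # second difference: (1 - B)^2 x_t

for name, series in [("original", x), ("1st difference", d1), ("2nd difference", d2)]:
    pvalue = adfuller(series)[1]
    print(f"{name:15s}  ADF p-value = {pvalue:.3f}")   # small p-value => reject a unit root
```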

Some nonstationary series

Some nonstationary series (cont.)

Some nonstationary series (cont.)
How can we determine the number of regular differencings needed?

Autocorrelations (ACs)
Autocorrelations are statistical measures that indicate how a time series is related to itself over time.
The autocorrelation at lag 1 is the correlation between the original series z_t and the same series lagged by one period (represented as z_{t−1}).

Autocorrelations (cont.)
The theoretical autocorrelation function and the sample autocorrelation function (standard forms given below).
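In the usual notation, with γ_k the autocovariance at lag k and z̄ the sample mean, the standard forms are:

```latex
\[
\rho_k = \frac{\gamma_k}{\gamma_0}
       = \frac{\operatorname{Cov}(z_t,\, z_{t+k})}{\operatorname{Var}(z_t)} ,
\qquad
r_k = \frac{\sum_{t=1}^{N-k} (z_t - \bar{z})(z_{t+k} - \bar{z})}
           {\sum_{t=1}^{N} (z_t - \bar{z})^{2}} .
\]
```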

Autocorrelations (cont.)
A graph of the correlation values is called a "correlogram".
In practice, to obtain a useful estimate of the autocorrelation function, at least 50 observations are needed.
The estimated autocorrelations r_k would be calculated up to a lag no larger than N/4.
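A correlogram of this kind can be produced with statsmodels; a brief sketch on a synthetic series of 100 observations, using lags up to N/4 = 25 as suggested above:

```python
# Sample autocorrelations and a correlogram.
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(7)
z = np.cumsum(rng.normal(size=100))     # 100 observations (the suggested minimum)

r = acf(z, nlags=25)                    # estimate r_k up to lag N/4 = 25
print(np.round(r[:11], 2))
plot_acf(z, lags=25)                    # the correlogram
plt.show()
```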

A "correlogram" of a nonstationary time series

After one RD

After two RD

The white noise process
The Box-Jenkins models are based on the idea that a time series can be usefully regarded as generated from (driven by) a series of uncorrelated, independent "shocks" e_t.
Such a sequence e_t, e_{t−1}, e_{t−2}, … is called a "white noise process".

The linear filter model
[Diagram: white noise e_t → ψ(B) → x_t]
A "linear filter" is a model that transforms the white noise process e_t into the process that generated the time series x_t.

The linear filter model (cont.) (B) is the “transfer function” of the filter Time Series Analysis Lecture Notes MA(4030)Prepared By TMJA Cooray

The linear filter model (cont.)
The linear filter can be put in another, inverted form; both forms are written out below.
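Assuming the standard ψ-weight and π-weight notation (textbook forms, not transcribed from the slide, with μ the level of the process), the two forms are:

```latex
% Moving-average (psi-weight) form:
\[
x_t = \mu + \psi(B)\, e_t
    = \mu + e_t + \psi_1 e_{t-1} + \psi_2 e_{t-2} + \cdots
\]
% Inverted (pi-weight) autoregressive form, with  \pi(B) = 1 - \pi_1 B - \pi_2 B^2 - \cdots :
\[
\pi(B)\,(x_t - \mu) = e_t ,
\qquad\text{i.e.}\qquad
x_t - \mu = \pi_1 (x_{t-1} - \mu) + \pi_2 (x_{t-2} - \mu) + \cdots + e_t .
\]
```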

Stationarity and invertibility conditions for a linear filter
For a linear process to be stationary, the ψ weights must form a convergent series.
If the current observation x_t depends on past observations with weights that decrease as we go back in time, the series is called invertible.
For a linear process to be invertible, the π weights must form a convergent series (see below).
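Stated in the usual textbook way (again an assumption about what the slide showed, since its formulas are not reproduced in the transcript):

```latex
\[
\text{Stationarity:}\quad \sum_{j=0}^{\infty} \lvert \psi_j \rvert < \infty ,
\qquad\qquad
\text{Invertibility:}\quad \sum_{j=1}^{\infty} \lvert \pi_j \rvert < \infty .
\]
```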

Model building blocks
Autoregressive (AR) models; moving-average (MA) models; mixed ARMA models; nonstationary models (ARIMA models); the mean parameter; the trend parameter.

Autoregressive (AR) models
An autoregressive model of order p is written out below.
The autoregressive process can be thought of as the output from a linear filter with transfer function φ^{-1}(B) when the input is white noise e_t.
The equation φ(B) = 0 is called the "characteristic equation".
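Written out in the standard form, using a_i for the autoregressive coefficients to match the AR(1) notation used later in these notes:

```latex
\[
x_t = a_1 x_{t-1} + a_2 x_{t-2} + \cdots + a_p x_{t-p} + e_t ,
\qquad\text{i.e.}\qquad
\phi(B)\, x_t = e_t ,
\quad
\phi(B) = 1 - a_1 B - a_2 B^{2} - \cdots - a_p B^{p} .
\]
```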

Moving-average (MA) models
A moving-average model of order q is written out below.
The moving-average process can be thought of as the output from a linear filter with transfer function θ(B) when the input is white noise e_t.
The equation θ(B) = 0 is called the "characteristic equation".
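Similarly, the standard MA(q) form, with b_i for the moving-average coefficients and the conventional minus signs in the operator:

```latex
\[
x_t = e_t - b_1 e_{t-1} - b_2 e_{t-2} - \cdots - b_q e_{t-q} ,
\qquad\text{i.e.}\qquad
x_t = \theta(B)\, e_t ,
\quad
\theta(B) = 1 - b_1 B - b_2 B^{2} - \cdots - b_q B^{q} .
\]
```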

Mixed AR and MA (ARMA) models
A moving-average process of 1st order can be written as an autoregressive process of infinite order (see below). Hence, if the process were really MA(1), we would obtain a non-parsimonious representation in terms of an autoregressive model.
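With the same sign convention, and assuming |b_1| < 1 so the expansion converges, the rewrite referred to here is:

```latex
\[
x_t = (1 - b_1 B)\, e_t
\quad\Longrightarrow\quad
e_t = (1 - b_1 B)^{-1} x_t
    = x_t + b_1 x_{t-1} + b_1^{2} x_{t-2} + \cdots ,
\]
```

so that x_t = −b_1 x_{t−1} − b_1^2 x_{t−2} − ⋯ + e_t: an autoregressive representation of infinite order, hence non-parsimonious.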

Mixed AR and MA (ARMA) models (cont.)
In order to obtain a parsimonious model, sometimes it is necessary to include both AR and MA terms in the model, giving an ARMA(p, q) model (written out below).
The ARMA process can be thought of as the output from a linear filter with transfer function θ(B)/φ(B) when the input is white noise e_t.
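Combining the two operators gives the standard ARMA(p, q) form, in the same a/b notation:

```latex
\[
\phi(B)\, x_t = \theta(B)\, e_t ,
\qquad\text{i.e.}\qquad
x_t = a_1 x_{t-1} + \cdots + a_p x_{t-p}
      + e_t - b_1 e_{t-1} - \cdots - b_q e_{t-q} .
\]
```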

The Box-Jenkins model building process
Model identification: autocorrelations and partial autocorrelations.
Model estimation.
Model validation: certain diagnostics are used to check the validity of the model.
Model forecasting.

Partial autocorrelations (PACs)
Partial autocorrelations are another set of statistical measures that are used to identify time series models.
A PAC is similar to an AC, except that when it is calculated, the ACs at all lags within the lag of interest are partialled out (Box & Jenkins, 1976).

Partial autocorrelations (cont.)
PACs can be calculated from the values of the ACs, where each PAC is obtained from a different set of linear equations that describe a pure autoregressive model of an order equal to the lag of the partial autocorrelation being computed.
The PAC at lag k is denoted by φ_kk. The double subscript emphasizes that φ_kk is the autoregressive parameter φ_k of the autoregressive model of order k.

Model identification
The sample ACs and PACs are computed for the series and compared to the theoretical autocorrelation and partial autocorrelation functions of the candidate models being investigated.
Theoretical ACs and PACs; stationarity and invertibility conditions.

Stationarity and invertibility conditions
For a linear process to be stationary, the ψ weights must form a convergent series; for it to be invertible, the π weights must form a convergent series (as stated earlier for the linear filter).

Stationarity requirements for an AR(1) model
For an AR(1) to be stationary: −1 < a_1 < 1, i.e., the root of the characteristic equation 1 − a_1B = 0 lies outside the unit circle.
For an AR(1) it can be shown that ρ_k = a_1·ρ_{k−1}, which with ρ_0 = 1 has the solution ρ_k = a_1^k for k > 0.
That is, for a stationary AR(1) model the theoretical autocorrelation function decays exponentially to zero, whereas the theoretical partial autocorrelation function has a cut-off after the 1st lag.

Invertibility requirements for an MA(1) model
For an MA(1) to be invertible: −1 < b_1 < 1, i.e., the root of the characteristic equation 1 − b_1B = 0 lies outside the unit circle.
For an MA(1) it can be shown that ρ_1 = −b_1 / (1 + b_1^2) and ρ_k = 0 for k > 1.
That is, for an invertible MA(1) model the theoretical autocorrelation function has a cut-off after the 1st lag, whereas the theoretical partial autocorrelation function decays exponentially to zero.
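These decay and cut-off patterns are easy to check numerically; a sketch using statsmodels' ArmaProcess, with the coefficient value 0.7 chosen arbitrarily for illustration:

```python
# Theoretical ACF/PACF of an AR(1) and an MA(1) process: decay vs. cut-off patterns.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# AR(1) with a1 = 0.7: lag polynomial (1 - 0.7B) on the AR side.
ar1 = ArmaProcess(ar=[1, -0.7], ma=[1])
# MA(1) with b1 = 0.7: lag polynomial (1 - 0.7B) on the MA side.
ma1 = ArmaProcess(ar=[1], ma=[1, -0.7])

print("AR(1) ACF :", np.round(ar1.acf(6), 3))    # decays geometrically (0.7^k)
print("AR(1) PACF:", np.round(ar1.pacf(6), 3))   # cuts off after lag 1
print("MA(1) ACF :", np.round(ma1.acf(6), 3))    # cuts off after lag 1
print("MA(1) PACF:", np.round(ma1.pacf(6), 3))   # decays towards zero
```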

Higher order models
For an AR model of order p > 1: the autocorrelation function consists of a mixture of damped exponentials and damped sine waves, and the partial autocorrelation function has a cut-off after lag p.
For an MA model of order q > 1: the autocorrelation function has a cut-off after lag q, and the partial autocorrelation function consists of a mixture of damped exponentials and damped sine waves.

Permissible regions for the AR and MA parameters

Theoretical ACs and PACs (cont.)

Theoretical ACs and PACs (cont.)

Model identification

Model order determination

Some further topics
Modeling vector (multivariate) time series.
Modeling unequally spaced time series.
Modeling nonlinear time series.