Time Series Analysis PART II

Econometric Forecasting Forecasting is an important part of econometric analysis, for some people probably the most important. How do we forecast economic variables such as GDP, inflation, exchange rates, stock prices, and unemployment rates?

APPROACHES TO ECONOMIC FORECASTING (1) exponential smoothing methods; (2) single-equation regression models; (3) simultaneous-equation regression models; (4) autoregressive integrated moving average (ARIMA) models; (5) vector autoregression (VAR).

ARIMA Models Popularly known as the Box–Jenkins (BJ) methodology, but technically known as the ARIMA methodology, this approach emphasizes not the construction of single-equation or simultaneous-equation models but the analysis of the probabilistic, or stochastic, properties of economic time series on their own, under the philosophy "let the data speak for themselves."

Unlike regression models, in which y_t is explained by k regressors X_1, X_2, X_3, ..., X_k, the BJ-type time series models allow y_t to be explained by past, or lagged, values of y itself and by stochastic error terms.

Autoregressive Processes (AR) An AR(1) process is written as: y_t = α y_{t-1} + ε_t, where ε_t ~ IID(0, σ²); i.e., the current value of y_t is equal to α times its previous value plus an unpredictable component ε_t. We say that y_t follows a first-order autoregressive, or AR(1), stochastic process.
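The AR(1) process above can be sketched in a short simulation. This is an illustrative example, not from the slides: the coefficient, sample size, and seed are made-up choices, and numpy is assumed to be available.

```python
import numpy as np

# Simulate the AR(1) process y_t = alpha * y_{t-1} + eps_t,
# eps_t ~ IID N(0, 1). alpha, n, and the seed are illustrative choices.
rng = np.random.default_rng(0)
alpha, n = 0.7, 5000
eps = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = alpha * y[t - 1] + eps[t]

# For a stationary AR(1), corr(y_t, y_{t-1}) = alpha, so the lag-1
# sample autocorrelation should come out close to 0.7.
r1 = np.corrcoef(y[:-1], y[1:])[0, 1]
```

Note that |α| < 1 is needed for stationarity; with |α| ≥ 1 the simulated series would wander off without mean reversion.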

Here the value of y at time t depends on its value in the previous time period and a random term; the y values are expressed as deviations from their mean value. In other words, this model says that the forecast value of y at time t is simply some proportion (= α) of its value at time (t − 1) plus a random shock or disturbance at time t.

If we consider a model y_t = α_1 y_{t-1} + α_2 y_{t-2} + ε_t, then we say that y_t follows a second-order autoregressive, or AR(2), process. That is, the value of y at time t depends on its values in the previous two time periods.

This can be extended to an AR(p) process: y_t = α_1 y_{t-1} + α_2 y_{t-2} + … + α_p y_{t-p} + ε_t, where ε_t ~ IID(0, σ²); i.e., the current value of y_t depends on its p past values plus an unpredictable component ε_t.

Moving Average Processes (MA) An MA(1) process is written as y_t = ε_t + β_1 ε_{t-1}, where ε_t ~ IID(0, σ²); i.e., the current value is given by an unpredictable component ε_t plus β_1 times the previous period's error.

Here y at time t is a moving average of the current and past error terms (a constant term can be added as well). Thus, in the present case, we say that y follows a first-order moving average, or MA(1), process.

But if y follows y_t = ε_t + β_1 ε_{t-1} + β_2 ε_{t-2}, then it is an MA(2) process. This can also be extended to an MA(q) process: y_t = ε_t + β_1 ε_{t-1} + … + β_q ε_{t-q}, where ε_t ~ IID(0, σ²). In short, a moving average process is simply a linear combination of white noise error terms.
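A distinctive property of an MA(q) process is that its autocorrelation function is zero beyond lag q. A made-up simulation (coefficient, sample size, and seed are illustrative, not from the slides) shows this for MA(1):

```python
import numpy as np

# Simulate the MA(1) process y_t = eps_t + beta * eps_{t-1}.
# Its theoretical ACF is rho_1 = beta / (1 + beta**2) and rho_k = 0
# for k > 1, so the sample ACF should "cut off" after lag 1.
rng = np.random.default_rng(1)
beta, n = 0.5, 20000
eps = rng.standard_normal(n + 1)
y = eps[1:] + beta * eps[:-1]

r1 = np.corrcoef(y[:-1], y[1:])[0, 1]  # near beta/(1+beta^2) = 0.4
r2 = np.corrcoef(y[:-2], y[2:])[0, 1]  # near 0
```

This cut-off behavior is exactly what the identification step of the Box–Jenkins method looks for in the correlogram.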

Autoregressive and Moving Average (ARMA) Processes Of course, it is quite likely that y has characteristics of both AR and MA and is therefore ARMA. Thus, y_t follows an ARMA(1, 1) process if it can be written as y_t = α_1 y_{t-1} + ε_t + β_1 ε_{t-1}.

This is because there is one autoregressive and one moving average term. In general, an ARMA(p, q) process has p autoregressive and q moving average terms: y_t = α_1 y_{t-1} + … + α_p y_{t-p} + ε_t + β_1 ε_{t-1} + … + β_q ε_{t-q}.
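An ARMA(1, 1) process can be simulated by the same recursion. In this illustrative sketch (parameters and seed are made up, not from the slides), a useful check is that beyond lag 1 the ACF decays geometrically at the AR rate, i.e. ρ_2 = α ρ_1:

```python
import numpy as np

# Simulate the ARMA(1,1) process y_t = alpha*y_{t-1} + eps_t + beta*eps_{t-1}.
rng = np.random.default_rng(2)
alpha, beta, n = 0.5, 0.3, 20000
eps = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = alpha * y[t - 1] + eps[t] + beta * eps[t - 1]

# After lag 1, the ACF of an ARMA(1,1) decays geometrically: rho_2 = alpha*rho_1.
r1 = np.corrcoef(y[:-1], y[1:])[0, 1]
r2 = np.corrcoef(y[:-2], y[2:])[0, 1]
```

The mixed pattern (neither a clean cut-off nor a pure geometric decay from lag 1) is what makes ARMA orders harder to read off the correlogram than pure AR or MA orders.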

Autoregressive Integrated Moving Average (ARIMA) Process The time series models we have already discussed are based on the assumption that the time series involved are (weakly) stationary. Briefly, the mean and variance of a weakly stationary time series are constant and its covariance is time-invariant. But we know that many economic time series are nonstationary, that is, they are integrated.

If a time series is integrated of order 1 [i.e., it is I(1)], its first differences are I(0), that is, stationary. Similarly, if a time series is I(2), its second difference is I(0). In general, if a time series is I(d), differencing it d times yields an I(0) series. Therefore, if we have to difference a time series d times to make it stationary and then apply an ARMA(p, q) model to it,

we say that the original time series is ARIMA(p, d, q), that is, it is an autoregressive integrated moving average time series, where p denotes the number of autoregressive terms, d the number of times the series has to be differenced before it becomes stationary, and q the number of moving average terms.

Thus, an ARIMA(2, 1, 2) time series has to be differenced once (d = 1) before it becomes stationary and the (first-differenced) stationary time series can be modeled as an ARMA(2, 2) process, that is, it has two AR and two MA terms.
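The effect of differencing an integrated series can be seen in a small made-up example (sample size and seed are illustrative): a random walk is I(1), and taking first differences recovers a stationary series.

```python
import numpy as np

# A random walk y_t = y_{t-1} + eps_t is nonstationary (I(1));
# its first difference dy_t = eps_t is stationary (I(0)).
rng = np.random.default_rng(5)
eps = rng.standard_normal(5000)
y = np.cumsum(eps)   # random walk (the levels wander without mean reversion)
dy = np.diff(y)      # first difference recovers the white-noise shocks

r_level = np.corrcoef(y[:-1], y[1:])[0, 1]   # near 1: levels are persistent
r_diff = np.corrcoef(dy[:-1], dy[1:])[0, 1]  # near 0: differences look white
```

The near-unit lag-1 autocorrelation of the levels versus the near-zero autocorrelation of the differences is the signature that d = 1 is the appropriate degree of differencing here.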

THE BOX–JENKINS (BJ) METHODOLOGY Given a time series, how does one know whether it follows a purely AR process (and if so, what is the value of p), a purely MA process (and if so, what is the value of q), an ARMA process (and if so, what are the values of p and q), or an ARIMA process, in which case we must know the values of p, d, and q? The BJ methodology comes in handy in answering this question. The method consists of four steps:

Step 1. Identification. Find the appropriate values of p, d, and q. We use the correlogram (ACF) and partial correlogram (PACF) for this task.
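The identification step can be sketched as follows. This is an illustrative example (the data, MA coefficient, and seed are made up): compute the sample ACF and flag spikes that fall outside the approximate 95% band ± 1.96/√n; an MA(1) series should show a single significant spike at lag 1.

```python
import numpy as np

# Sample autocorrelation function up to nlags.
def sample_acf(x, nlags):
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[:-k], x[k:]) / denom
                             for k in range(1, nlags + 1)])

rng = np.random.default_rng(3)
eps = rng.standard_normal(2001)
y = eps[1:] + 0.8 * eps[:-1]        # an MA(1): ACF should cut off after lag 1

acf = sample_acf(y, 5)
band = 1.96 / np.sqrt(len(y))       # approximate 95% confidence band
significant = [k for k in range(1, 6) if abs(acf[k]) > band]
```

For an AR process the roles reverse: the PACF cuts off at lag p while the ACF decays gradually, which is how the correlogram and partial correlogram together suggest p and q.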

Step 2. Estimation. Having identified the appropriate p and q values, the next stage is to estimate the parameters of the autoregressive and moving average terms included in the model. Sometimes this calculation can be done by simple least squares, but sometimes we will have to resort to nonlinear (in the parameters) estimation methods.

Step 3. Diagnostic checking. Having chosen a particular ARIMA model, and having estimated its parameters, we next see whether the chosen model fits the data reasonably well, for it is possible that another ARIMA model might do the job as well. This is why Box–Jenkins ARIMA modeling is more an art than a science; considerable skill is required to choose the right ARIMA model.

One simple test of the chosen model is to see if the residuals estimated from this model are white noise; if they are, we can accept the particular fit; if not, we must start over. Thus, the BJ methodology is an iterative process.
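One standard way to formalize the white-noise check on residuals is the Ljung–Box Q statistic (a common choice, though the slides do not name a specific test). In this illustrative sketch the data and seed are made up; under the null of white noise, Q over m lags is approximately chi-square with m degrees of freedom, and 18.31 is the standard 5% critical value for m = 10.

```python
import numpy as np

# Ljung-Box Q statistic: Q = n(n+2) * sum_{k=1..m} r_k^2 / (n-k),
# where r_k is the lag-k sample autocorrelation of the residuals.
def ljung_box_q(x, m):
    n = len(x)
    r = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(r, r)
    q = 0.0
    for k in range(1, m + 1):
        rk = np.dot(r[:-k], r[k:]) / denom
        q += rk**2 / (n - k)
    return n * (n + 2) * q

rng = np.random.default_rng(4)
white = rng.standard_normal(1000)   # residuals from an adequate model
q_white = ljung_box_q(white, 10)    # should fall below 18.31 (5%, 10 df)

ar = np.zeros(1000)                 # residuals left by a misspecified model
for t in range(1, 1000):
    ar[t] = 0.7 * ar[t - 1] + white[t]
q_ar = ljung_box_q(ar, 10)          # should be far above 18.31: start over
```

A small Q means we cannot reject white noise and can accept the fit; a large Q sends us back to the identification step, which is what makes the BJ methodology iterative.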

Step 4. Forecasting. One of the reasons for the popularity of ARIMA modeling is its success in forecasting. In many cases, the forecasts obtained by this method are more reliable than those obtained from traditional econometric modeling, particularly for short-term forecasts. Of course, each case must be checked.
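For the simplest case, the forecasting step has a closed form: for a zero-mean AR(1) with coefficient α, the h-step-ahead point forecast from the last observation y_T is α^h · y_T, decaying toward the unconditional mean. The numbers below are illustrative, not estimates from real data:

```python
# h-step-ahead AR(1) point forecasts: yhat_{T+h} = alpha**h * y_T.
# alpha and y_last are made-up illustrative values.
alpha, y_last = 0.7, 2.0
forecasts = [alpha**h * y_last for h in range(1, 6)]
# forecasts shrink geometrically toward the series mean (zero here)
```

This geometric decay is why ARIMA forecasts are most informative at short horizons: far enough ahead, the forecast is just the series mean.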

Vector Autoregression (VAR) The term autoregressive refers to the appearance of the lagged value of the dependent variable on the right-hand side, and the term vector to the fact that we are dealing with a vector of two (or more) variables.

Volatility Financial time series, such as stock prices, exchange rates, and inflation rates, often exhibit the phenomenon of volatility clustering: periods in which their prices show wide swings for an extended time are followed by periods of relative calm.

Knowledge of volatility is of crucial importance in many areas. For example, considerable macroeconometric work has been done in studying the variability of inflation over time. For some decision makers, inflation in itself may not be bad, but its variability is bad because it makes financial planning difficult.

A characteristic of most financial time series is that in their level form they are random walks; that is, they are nonstationary. In their first-difference form, on the other hand, they are generally stationary.

Therefore, instead of modeling the levels of financial time series, why not model their first differences? But these first differences often exhibit wide swings, or volatility, suggesting that the variance of financial time series varies over time.

How can we model such "varying variance"? This is where the so-called autoregressive conditional heteroscedasticity (ARCH) model comes in.
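The ARCH(1) idea can be sketched in a simulation (the slides stop before specifying a model, so the equations and all parameters here are illustrative): the conditional variance follows σ²_t = ω + a·ε²_{t-1}, and the return is ε_t = σ_t z_t with z_t ~ IID N(0, 1). Returns are serially uncorrelated, but their squares are not, reproducing volatility clustering.

```python
import numpy as np

# Simulate an ARCH(1) process: sigma2_t = omega + a * e_{t-1}^2,
# e_t = sqrt(sigma2_t) * z_t. omega, a, n, and the seed are made up.
rng = np.random.default_rng(7)
omega, a, n = 0.2, 0.4, 20000
z = rng.standard_normal(n)
e = np.zeros(n)
sigma2 = np.full(n, omega / (1 - a))   # start at the unconditional variance
for t in range(1, n):
    sigma2[t] = omega + a * e[t - 1] ** 2
    e[t] = np.sqrt(sigma2[t]) * z[t]

# Hallmark of volatility clustering: returns look uncorrelated,
# but squared returns are positively autocorrelated.
r_ret = np.corrcoef(e[:-1], e[1:])[0, 1]
r_sq = np.corrcoef(e[:-1] ** 2, e[1:] ** 2)[0, 1]
```

Autocorrelation in the squares but not in the levels is exactly the "varying variance" that ARCH (and its GARCH extensions) is designed to capture.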