EC 827 Module 4: Forecasting Multiple Variables from their own Histories

Multiple Variable AR Systems
• Vector Autoregression (VAR)
» equations for several variables, where each variable depends not only on its own history but also on the histories of all the other variables
» the multiple-variable extension of an AR model
• In principle one could specify multiple-variable MA or multiple-variable ARMA models
» in practice such models are difficult to specify
» typically low-order VARs are adequate to approximate MA or ARMA processes

VAR Process: A Two Variable Example
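The equations for this slide did not survive the transcript. A two-variable, first-order VAR consistent with the coefficient names used on the following slides (b1, c1, e1t) might look like this; the intercepts a1, a2 and own-lag coefficients d1, d2 are assumed names, not from the original:

```latex
\begin{aligned}
X_t &= a_1 + b_1 Z_{t-1} + d_1 X_{t-1} + e_{1t} \\
Z_t &= a_2 + c_1 X_{t-1} + d_2 Z_{t-1} + e_{2t}
\end{aligned}
```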

Predictive Causality (Diebold p. 303)
• Says that information in the history of one variable can be used to improve the forecasts of a second variable, compared to forecasting from the history of the second variable alone.
• Z has predictive causality for X if b1 is not equal to zero; X has predictive causality for Z if c1 is not equal to zero.

VAR Processes: Higher Order Systems
• Only one lag of X and Z appears in each of the equations above
– systems with one lag are referred to as first-order systems
– higher-order systems involve more than one lag of at least one of the variables
• It is not necessary to limit the size of the forecasting problem to only two variables
– but data limitations preclude systems with a very large number of variables (say, > 10)

How Long for the Lags?
• Long enough to get rid of any significant autocorrelation in the residuals of each equation (otherwise there is information left over that could improve the forecasts)
• Not so long that the model is "over-parameterized" and there is a loss of forecasting efficiency
• AIC and SIC again (the RATS VAR procedure, via the menus, will compute their logs and print them)

How Long for the Lags? II
• One strategy:
– start with a longer lag
– check that the autocorrelations of the residuals are small (i.e., you're not wasting information)
– shorten the lag and re-estimate
» do AIC and/or SIC increase or decrease? (smaller is better!)
» check the stability of the estimated coefficients as the lag length is shortened
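As a sketch of this strategy (not the RATS procedure the slides refer to; all data here are simulated), one can fit a VAR equation by equation with OLS at several lag lengths and compare AIC and SIC computed from the log determinant of the residual covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-variable VAR(1): y_t = A y_{t-1} + e_t
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])
T = 400
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.5, size=2)

def fit_var(y, p):
    """Equation-by-equation OLS fit of a VAR(p)."""
    T, n = y.shape
    rows = T - p
    # Regressor matrix: constant plus lags 1..p of every variable
    X = np.hstack([np.ones((rows, 1))] +
                  [y[p - j - 1:T - j - 1] for j in range(p)])
    Y = y[p:]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    sigma = resid.T @ resid / rows   # residual covariance matrix
    k = X.shape[1] * n               # total estimated coefficients
    return sigma, k, rows

for p in (1, 2, 3, 4):
    sigma, k, rows = fit_var(y, p)
    logdet = np.log(np.linalg.det(sigma))
    aic = logdet + 2 * k / rows          # smaller is better
    sic = logdet + k * np.log(rows) / rows
    print(f"p={p}  AIC={aic:.3f}  SIC={sic:.3f}")
```

SIC penalizes extra lags more heavily than AIC, so it tends to pick the shorter lag length when the two disagree.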

How Long for the Lags? III
• Generally you will not need a lot of lags for seasonally adjusted data
– 3–4 lags for quarterly observations
– 5–7 lags for monthly observations
• For non-seasonally adjusted data, be careful about autocorrelations at seasonal frequencies
– you may need a short continuous lag, then another lag at the seasonal frequency

Leading Indicators: An Example

Leading Indicator
• When c1 = 0, the history of the X variable does not influence future values of the Z variable (no predictive causality of X for Z)
• As long as b1 is not equal to zero, the history of Z has predictive value for future outcomes of X
– under these conditions we say that Z is a leading indicator of X
– whether it is a good or bad leading indicator depends on the size of b1 and the variance of e1t

Testing for Leading Indicators
• The question of interest is whether all the coefficients on lagged values of one variable are zero in the regression in which some other variable is the dependent variable.
• An F test can be used to examine the hypothesis that multiple regression coefficients are jointly equal to zero.
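A minimal sketch of this F test on simulated data (not the course's RATS code): the unrestricted regression of X on its own lags and the lags of Z is compared with the restricted regression that drops the Z lags, using the standard sum-of-squared-residuals F statistic. The lag length of 2 is an arbitrary choice for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 300
z = np.zeros(T)
x = np.zeros(T)
for t in range(1, T):
    z[t] = 0.5 * z[t - 1] + rng.normal()
    x[t] = 0.3 * x[t - 1] + 0.6 * z[t - 1] + rng.normal()  # Z leads X

def ssr(X, y):
    """Sum of squared residuals from an OLS fit."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    return e @ e

p = 2                              # lags of each variable included
rows = T - p
ones = np.ones(rows)
x_lags = np.column_stack([x[p - j - 1:T - j - 1] for j in range(p)])
z_lags = np.column_stack([z[p - j - 1:T - j - 1] for j in range(p)])
y = x[p:]

X_u = np.column_stack([ones, x_lags, z_lags])   # unrestricted
X_r = np.column_stack([ones, x_lags])           # restricted: Z lags excluded
q = p                                           # number of restrictions
k = X_u.shape[1]                                # parameters, unrestricted model
F = ((ssr(X_r, y) - ssr(X_u, y)) / q) / (ssr(X_u, y) / (rows - k))
print(f"F({q}, {rows - k}) = {F:.2f}")
```

The statistic is compared with an F(q, rows − k) critical value; a large F rejects the hypothesis that Z has no predictive causality for X.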

Leading Indicators  Until recently the U.S Department of Commerce published a leading indicator series – Recently “privatized” (or “outsourced”) to Conference Board (NY research operation) – not a single variable, but a weighted sum of 11 variables (a composite leading indicator) – alleged systematic autocorrelation between this composite variable and real output (real GDP)

Leading Indicators  Commerce (or Conference Board) Composite Leading Indicator viewed as a forecasting device for future expansions or recessions. – autocorrelations are not 1.0; not a perfect forecasting device by any means – frequently generates false signals of recessions – accuracy somewhat improved (though not perfect by any means) by looking at average behavior over several months.

Industrial Production Autocorrelations
(chart: autocorrelations of the log of industrial production)

Log Change in IP Autocorrelations
(chart: autocorrelations of log differences of industrial production)

Log Difference IP AR(1) Model
Dependent Variable DQIP – Estimation by Least Squares
Monthly Data From 1947:02 To 1994:12
Durbin-Watson Statistic 2.07
Variables: 1. Constant  2. DQIP{1}

Differenced IP AR(1) Residuals
(chart: residuals from the log-difference IP AR(1) model)

IP–Composite Leading Indicator Model
Dependent Variable DQIP – Estimation by Least Squares
Monthly Data From 1947:02 To 1994:12
Variables: 1. Constant  2. DIND{1}  3. DIND{2}  4. DIND{3}  5. DIND{4}  6. DQIP{1}

Forecasting from VAR Models
• One-period-ahead forecasts:
– multiply the coefficients of the model by the most recently observed values of the time series and add the terms up = the forecast of the next period's value (tX_{t+1})
• Multiple-period-ahead forecasts:
– the required data values are not yet available
– for tX_{t+2}, use predicted values: tX_{t+1} in the first lag slot, X_t in the second, etc.
– for tX_{t+3}, use predicted values: tX_{t+2} in the first lag slot, tX_{t+1} in the second, X_t in the third, etc.
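The iteration above can be sketched in a few lines. The coefficient values here are hypothetical numbers for illustration, not estimates from the course data; each step feeds the previous forecast back into the model in place of the not-yet-observed values.

```python
import numpy as np

# Hypothetical estimated VAR(1):
#   X_t = 0.1 + 0.5 X_{t-1} + 0.2 Z_{t-1}
#   Z_t = 0.0 + 0.1 X_{t-1} + 0.4 Z_{t-1}
c = np.array([0.1, 0.0])
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])

y_t = np.array([1.0, 2.0])    # most recent observation (X_t, Z_t)

forecasts = []
y_hat = y_t
for h in range(1, 4):
    # Previous forecast replaces unobserved data at each horizon
    y_hat = c + A @ y_hat
    forecasts.append(y_hat)
    print(f"h={h}: X={y_hat[0]:.3f}  Z={y_hat[1]:.3f}")
```

At h=1 this uses only observed data (X = 0.1 + 0.5·1.0 + 0.2·2.0 = 1.0); from h=2 onward the forecasts themselves enter the right-hand side.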

Cointegration  Suppose that you have several variables that are generated by unit root processes –random walks with or without drift  Suppose that such variables are “tied together” - there are linear combinations of the variables that are stationary –such variables are said to be cointegrated

Cointegration and Vector Error Correction Models (VECM)
• Variables that are cointegrated can be represented by a special kind of VAR – a Vector Error Correction Model
• Two Variable VECM:
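The VECM equations were lost in the transcript. Using the names introduced on the next slide (f1, f2 as the error-correction coefficients and gX_{t-1} + hZ_{t-1} as the stationary combination), a minimal two-variable VECM would be as follows; the intercepts a1, a2 are assumed names, and lagged-difference terms (e.g. ΔX_{t-1}, ΔZ_{t-1}) may also appear on the original slide:

```latex
\begin{aligned}
\Delta X_t &= a_1 + f_1\,(g X_{t-1} + h Z_{t-1}) + e_{1t} \\
\Delta Z_t &= a_2 + f_2\,(g X_{t-1} + h Z_{t-1}) + e_{2t}
\end{aligned}
```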

Cointegration and VECMs
• gX_{t-1} + hZ_{t-1} is called the cointegrating vector (the linear combination of X and Z that is stationary)
• f1 and f2 are called the error-correction coefficients
• if f1 and f2 are both equal to zero, then the VECM is just an ordinary VAR in first differences of X and Z

Cointegration and VECMs
• Advantage of the VECM specification:
– a VAR in differences ignores the information that the levels of the variables cannot wander aimlessly, but are tied together in the long run
– it may be possible to improve intermediate- to long-run forecasts over a VAR in differences alone

Judging Forecasts I
(chart: prediction-realization diagram – predicted change plotted against actual change, showing the line of perfect forecast and turning-point errors)

Judging Forecasts III
• Mean Squared Error (MSE):
– obviously, smaller is better, again...
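The slide's formula is not in the transcript; the standard definition, in terms of predicted changes P_t and actual changes A_t over n forecast periods, is:

```latex
\mathrm{MSE} = \frac{1}{n}\sum_{t=1}^{n}\left(P_t - A_t\right)^2
```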

Judging Forecasts IV
• Theil inequality proportions (they add to 1.0)
– U^M = bias proportion
» large values are bad; they indicate systematic differences between actual and average predicted changes
– U^S = variance proportion
» large values indicate unequal variances of actual and predicted changes
– U^C = covariance proportion
» zero = perfect correlation between actual and predicted changes
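These proportions come from a standard decomposition of the MSE: MSE = (mean bias)² + (difference in standard deviations)² + 2(1 − r)·s_a·s_p, where r is the correlation between actual and predicted changes. A small sketch on simulated data (not from the course):

```python
import numpy as np

def theil_proportions(actual, predicted):
    """Decompose MSE into Theil bias, variance, and covariance proportions."""
    a = np.asarray(actual, float)
    p = np.asarray(predicted, float)
    mse = np.mean((p - a) ** 2)
    sa, sp = a.std(), p.std()                 # population std (ddof=0)
    r = np.corrcoef(a, p)[0, 1]
    um = (p.mean() - a.mean()) ** 2 / mse     # U^M: bias proportion
    us = (sp - sa) ** 2 / mse                 # U^S: variance proportion
    uc = 2 * (1 - r) * sa * sp / mse          # U^C: covariance proportion
    return um, us, uc

rng = np.random.default_rng(2)
actual = rng.normal(size=100)
predicted = actual + rng.normal(scale=0.3, size=100)  # noisy but unbiased
um, us, uc = theil_proportions(actual, predicted)
print(f"U^M={um:.3f}  U^S={us:.3f}  U^C={uc:.3f}  sum={um + us + uc:.3f}")
```

The three proportions always sum to 1, and U^C falls to zero only when actual and predicted changes are perfectly correlated, matching the slide's description.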