STAT 497 LECTURE NOTE 11 VAR MODELS AND GRANGER CAUSALITY

VECTOR TIME SERIES A vector series consists of multiple single series. Why do we need multiple series? – To be able to understand the relationships between the several components – To be able to get better forecasts

VECTOR TIME SERIES Price movements in one market can spread easily and instantly to another market. For this reason, financial markets are more dependent on each other than ever before, so we have to consider them jointly to better understand the dynamic structure of the global market. Knowing how markets are interrelated is of great importance in finance, and for an investor or a financial institution holding multiple assets, these interrelations play an important role in decision making.

VECTOR TIME SERIES Consider an m-dimensional time series Y_t = (Y_{1t}, Y_{2t}, …, Y_{mt})'. The series Y_t is weakly stationary if its first two moments are time invariant and the cross covariance between Y_{it} and Y_{js}, for all i and j, is a function of the time difference (s - t) only.

VECTOR TIME SERIES The mean vector: μ = E(Y_t) = (μ_1, μ_2, …, μ_m)'. The covariance matrix function: Γ(k) = E[(Y_t - μ)(Y_{t+k} - μ)'], whose (i,j)-th element γ_ij(k) is the cross covariance between Y_{it} and Y_{j,t+k}.

VECTOR TIME SERIES The correlation matrix function: ρ(k) = D^{-1/2} Γ(k) D^{-1/2}, where D is a diagonal matrix in which the i-th diagonal element is the variance of the i-th process, i.e., D = diag(γ_11(0), …, γ_mm(0)). The covariance and correlation matrix functions are positive semi-definite.
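
These moment functions have straightforward sample analogues. Below is a minimal NumPy sketch (the function names and the shape convention, an (n, m) array with one column per component series, are our own choices for illustration):

```python
import numpy as np

def sample_cov_matrix(Y, k):
    """Lag-k sample cross-covariance matrix Gamma-hat(k) of an (n, m) series Y."""
    n, _ = Y.shape
    Yc = Y - Y.mean(axis=0)             # center by the sample mean vector
    return Yc[: n - k].T @ Yc[k:] / n   # sum of (Y_t - mean)(Y_{t+k} - mean)' over t

def sample_corr_matrix(Y, k):
    """Lag-k sample correlation matrix: D^{-1/2} Gamma-hat(k) D^{-1/2}."""
    d = np.sqrt(np.diag(sample_cov_matrix(Y, 0)))   # component standard deviations
    return sample_cov_matrix(Y, k) / np.outer(d, d)
```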

VECTOR WHITE NOISE PROCESS {a_t} ~ WN(0, Σ) if and only if {a_t} is stationary with mean vector 0 and covariance matrix function Γ(k) = Σ for k = 0 and Γ(k) = 0 for k ≠ 0.

VECTOR TIME SERIES {Y_t} is a linear process if it can be expressed as Y_t = μ + ∑_{j=-∞}^{∞} Ψ_j a_{t-j}, where {Ψ_j} is a sequence of m×m matrices whose entries are absolutely summable, i.e., ∑_{j=-∞}^{∞} |ψ_{ik,j}| < ∞ for each i, k.

VECTOR TIME SERIES For a linear process (taking μ = 0), E(Y_t) = 0 and Γ(k) = ∑_{j=-∞}^{∞} Ψ_j Σ Ψ'_{j+k}.
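
This formula follows in one line from the white noise property of {a_t}; a short derivation (ours, not on the original slide):

```latex
\Gamma(k) = E\!\left[Y_t Y_{t+k}'\right]
          = \sum_{j}\sum_{l} \Psi_j \, E\!\left[a_{t-j}\, a_{t+k-l}'\right] \Psi_l'
          = \sum_{j} \Psi_j \,\Sigma\, \Psi_{j+k}' ,
% since E[a_s a_u'] = \Sigma only when s = u, i.e. only the terms with l = j + k survive.
```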

MA (WOLD) REPRESENTATION Y_t = ∑_{s=0}^{∞} Ψ_s a_{t-s} with Ψ_0 = I. For the process to be stationary, the Ψ_s should be square summable in the sense that each of the m×m sequences {ψ_{ij,s}} is square summable, i.e., ∑_{s=0}^{∞} ψ_{ij,s}² < ∞.

AR REPRESENTATION Π(B)Y_t = a_t, i.e., Y_t = ∑_{s=1}^{∞} Π_s Y_{t-s} + a_t. For the process to be invertible, the Π_s should be absolutely summable.

THE VECTOR AUTOREGRESSIVE MOVING AVERAGE (VARMA) PROCESSES VARMA(p,q) process: Φ_p(B) Y_t = Θ_q(B) a_t, where Φ_p(B) = I - Φ_1 B - … - Φ_p B^p and Θ_q(B) = I - Θ_1 B - … - Θ_q B^q.

VARMA PROCESS A VARMA process is stationary if the zeros of |Φ_p(B)| are outside the unit circle, and invertible if the zeros of |Θ_q(B)| are outside the unit circle.
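
For the AR side, the determinant condition is equivalent to all eigenvalues of the VAR(p) companion matrix lying inside the unit circle, which is easy to check numerically. A sketch (our own helper, assuming Phis is a list of the m×m coefficient matrices Φ_1, …, Φ_p as NumPy arrays):

```python
import numpy as np

def companion_eigenvalues(Phis):
    """Eigenvalues of the VAR(p) companion matrix; all moduli < 1 <=> stationary."""
    p, m = len(Phis), Phis[0].shape[0]
    top = np.hstack(Phis)                  # first block row: [Phi_1  Phi_2 ... Phi_p]
    shift = np.eye(m * (p - 1), m * p)     # identity blocks that shift the stacked state
    return np.linalg.eigvals(np.vstack([top, shift]))
```

For p = 1 the companion matrix is just Φ itself, which is the condition used for the VAR(1) process below.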

IDENTIFIABILITY PROBLEM Multiplying both sides of the model by an arbitrary matrix polynomial may give an identical covariance matrix function. So, the VARMA(p,q) model is not identifiable: we cannot uniquely determine p and q.

IDENTIFIABILITY PROBLEM Example: a VARMA(1,1) process whose MA(∞) representation reduces to a VMA(1), so the VARMA(1,1) and VMA(1) forms cannot be distinguished from the covariance structure.

IDENTIFIABILITY To eliminate this problem, there are three methods suggested by Hannan (1969, 1970, 1976, 1979). – From each of the equivalent models, choose the minimum MA order q and AR order p. The resulting representation will be unique if Rank(Φ_p(B)) = m. – Represent Φ_p(B) in lower triangular form. If the order of φ_ij(B) is less than or equal to the order of φ_ii(B) for i,j = 1,2,…,m, then the model is identifiable. – Represent Φ_p(B) in the form Φ_p(B) = φ_p(B) I, where φ_p(B) is a univariate AR(p) polynomial. The model is identifiable if φ_p ≠ 0.

VAR(1) PROCESS Y_t = Φ Y_{t-1} + a_t: here Y_{i,t} depends not only on the lagged values of Y_{it} but also on the lagged values of the other variables. Always invertible. Stationary if the zeros of |I - ΦB| are outside the unit circle. Setting λ = 1/B, the zeros of |I - ΦB| are related to the eigenvalues of Φ.
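
To see the connection between the zeros and the eigenvalues, a short derivation (ours, not on the original slide):

```latex
% For B \neq 0, factor (-B) out of each of the m rows of the determinant:
|I - \Phi B| = (-B)^m \,\bigl|\Phi - B^{-1} I\bigr| = 0
\iff B^{-1} = \lambda_i \ \text{for some eigenvalue } \lambda_i \text{ of } \Phi .
% Zeros with |B| > 1 therefore correspond to eigenvalues with |\lambda_i| < 1.
```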

VAR(1) PROCESS Hence, the VAR(1) process is stationary if the eigenvalues λ_i, i = 1,2,…,m, of Φ are all inside the unit circle. The autocovariance matrix function satisfies Γ(k) = Γ(k-1)Φ' for k ≥ 1.

VAR(1) PROCESS For k = 1, Γ(1) = Γ(0)Φ', while at k = 0, Γ(0) = ΦΓ(0)Φ' + Σ.

VAR(1) PROCESS Then, Γ(k) = Γ(0)(Φ')^k for k ≥ 1, and Γ(0) can be solved from vec(Γ(0)) = (I - Φ⊗Φ)^{-1} vec(Σ).

VAR(1) PROCESS Example: for the example's coefficient matrix Φ, the eigenvalues of Φ all lie inside the unit circle, so the process is stationary.
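
The slide's coefficient matrix is not reproduced here, so the sketch below checks the stationarity condition for a hypothetical Φ (the numbers are made up):

```python
import numpy as np

Phi = np.array([[0.5, 0.1],     # hypothetical VAR(1) coefficient matrix
                [0.4, 0.5]])
eigenvalues = np.linalg.eigvals(Phi)            # here: 0.7 and 0.3
print("eigenvalue moduli:", np.abs(eigenvalues))
print("stationary:", bool(np.all(np.abs(eigenvalues) < 1)))
```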

VMA(1) PROCESS Y_t = μ + a_t - Θ a_{t-1}. Always stationary. The autocovariance function: Γ(0) = Σ + ΘΣΘ', Γ(1) = -ΣΘ', Γ(-1) = -ΘΣ, and Γ(k) = 0 for |k| > 1. The autocovariance matrix function cuts off after lag 1.

VMA(1) PROCESS Hence, the VMA(1) process is invertible if the eigenvalues λ_i, i = 1,2,…,m, of Θ are all inside the unit circle.

IDENTIFICATION OF VARMA PROCESSES Same as in the univariate case. SAMPLE CORRELATION MATRIX FUNCTION: Given a vector series of n observations, the sample correlation matrix function is ρ̂(k) = [ρ̂_ij(k)], where the ρ̂_ij(k) are the sample cross correlations between the i-th and j-th component series. It is very useful for identifying a VMA(q).

SAMPLE CORRELATION MATRIX FUNCTION Tiao and Box (1981) proposed using +, - and . signs to show the significance of the cross correlations: + sign: the value is greater than 2 times the estimated standard error; - sign: the value is less than -2 times the estimated standard error; . sign: the value is within ±2 times the estimated standard error.
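
A sketch of this summary in NumPy (our own helper; it uses the rough 2/√n bound for the standard error of a sample cross correlation under whiteness, so the cutoff is an approximation):

```python
import numpy as np

def ccm_signs(Y, k):
    """Tiao-Box style indicator matrix for the lag-k sample cross correlations."""
    n, _ = Y.shape
    Yc = Y - Y.mean(axis=0)
    gamma_k = Yc[: n - k].T @ Yc[k:] / n
    gamma_0 = Yc.T @ Yc / n
    d = np.sqrt(np.diag(gamma_0))
    rho = gamma_k / np.outer(d, d)          # lag-k sample correlation matrix
    crit = 2.0 / np.sqrt(n)                 # approx. 2 estimated standard errors
    return np.where(rho > crit, "+", np.where(rho < -crit, "-", "."))
```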

PARTIAL AUTOREGRESSION OR PARTIAL LAG CORRELATION MATRIX FUNCTION These are useful for identifying the VAR order. The partial autoregression matrix function was proposed by Tiao and Box (1981), but it is not a proper correlation coefficient. Heyse and Wei (1985) then proposed the partial lag correlation matrix function, which is a proper correlation coefficient. Both of them can be used to identify the VARMA(p,q).

GRANGER CAUSALITY In time series analysis, we sometimes would like to know whether changes in one variable have an impact on changes in other variables. To investigate this more formally, we need the Granger causality test.

GRANGER CAUSALITY In principle, the concept is as follows: if X causes Y, then changes in X happen first and are followed by changes in Y.

GRANGER CAUSALITY If X causes Y, there are two conditions to be satisfied: 1. X can help in predicting Y: a regression of Y on past X has a large R². 2. Y cannot help in predicting X.

GRANGER CAUSALITY In most regressions, it is very hard to discuss causality. For instance, the significance of the coefficient β in a regression of y on x only tells us about the 'co-occurrence' of x and y, not that x causes y. In other words, the regression usually only tells us there is some 'relationship' between x and y; it does not tell us the nature of that relationship, such as whether x causes y or y causes x.

GRANGER CAUSALITY One good feature of time series vector autoregression is that we can test 'causality' in some sense. This test was first proposed by Granger (1969), and we therefore refer to it as Granger causality. We will restrict our discussion to a system of two variables, x and y. y is said to Granger-cause x if current or lagged values of y help to predict future values of x. Conversely, y fails to Granger-cause x if, for all s > 0, the mean squared error of a forecast of x_{t+s} based on (x_t, x_{t-1}, …) alone is the same as that of a forecast based on both (y_t, y_{t-1}, …) and (x_t, x_{t-1}, …).

GRANGER CAUSALITY If we restrict ourselves to linear functions, y fails to Granger-cause x if MSE[Ê(x_{t+s} | x_t, x_{t-1}, …)] = MSE[Ê(x_{t+s} | x_t, x_{t-1}, …, y_t, y_{t-1}, …)]. Equivalently, we can say that x is exogenous in the time series sense with respect to y, or that y is not linearly informative about future x.

GRANGER CAUSALITY A variable X is said to Granger cause another variable Y if Y can be better predicted from the past of X and Y together than from the past of Y alone, other relevant information being used in the prediction (Pierce, 1977).

GRANGER CAUSALITY In the VAR equation, the example we proposed above implies a lower triangular coefficient matrix: the coefficients on lagged y in the equation for x are zero. Or, if we use the MA representation, the corresponding elements of the Ψ_j matrices are zero as well.

GRANGER CAUSALITY Consider a linear projection of y_t on past, present and future x's, y_t = c + ∑_{j=0}^{∞} b_j x_{t-j} + ∑_{j=1}^{∞} d_j x_{t+j} + e_t, where E(e_t x_τ) = 0 for all t and τ. Then y fails to Granger-cause x iff d_j = 0 for j = 1, 2, …

TESTING GRANGER CAUSALITY Procedure 1) Check that both series are stationary in mean, variance and covariance (if necessary, transform the data via logs or differences to ensure this). 2) Estimate AR(p) models for each series, where p is large enough to ensure white noise residuals. F tests and other criteria (e.g., Schwarz or Akaike) can be used to establish the maximum lag p that is needed. 3) Re-estimate both models, now including all the lags of the other variable. 4) Use F tests to determine whether, after controlling for past Y, past values of X can improve forecasts of Y (and vice versa).
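
In practice, steps 2) to 4) are automated in standard software. For instance, statsmodels provides grangercausalitytests, which tests whether the series in the second column of the data array Granger-causes the series in the first column. A sketch (the simulated data are our own; y is built to depend on lagged x, so the test should reject):

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

data = np.column_stack([y, x])      # does x (column 2) Granger-cause y (column 1)?
results = grangercausalitytests(data, maxlag=4)
print(results[1][0]["ssr_ftest"])   # (F statistic, p-value, df_denom, df_num) at lag 1
```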

TEST OUTCOMES 1. X Granger causes Y but Y does not Granger cause X. 2. Y Granger causes X but X does not Granger cause Y. 3. X Granger causes Y and Y Granger causes X (i.e., there is a feedback system). 4. X does not Granger cause Y and Y does not Granger cause X.

TESTING GRANGER CAUSALITY The simplest test is to estimate, by OLS, the regression x_t = c + α_1 x_{t-1} + … + α_p x_{t-p} + δ_1 y_{t-1} + … + δ_p y_{t-p} + u_t and then conduct an F-test of the null hypothesis H_0: δ_1 = δ_2 = … = δ_p = 0.

TESTING GRANGER CAUSALITY 2. Run the full regression, y_t = c + α_1 y_{t-1} + … + α_p y_{t-p} + β_1 x_{t-1} + … + β_p x_{t-p} + u_t, and calculate its RSS (full model). 3. Run the limited regression, y_t = c + α_1 y_{t-1} + … + α_p y_{t-p} + u_t, and calculate its RSS (restricted model).

TESTING GRANGER CAUSALITY 4. Do the following F-test using the RSS obtained in stages 2 and 3: F = [(n - k)/q] · [(RSS_restricted - RSS_full)/RSS_full], where n: number of observations; k: number of parameters in the full model; q: number of restrictions being tested (the number of parameters dropped from the full model).

TESTING GRANGER CAUSALITY 5. If H_0 is rejected, then X causes Y. The same technique can be used to investigate whether or not Y causes X.
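
For transparency, here is a minimal NumPy/SciPy sketch of stages 2 to 5 (the function names are our own; it applies the F formula using the effective sample of n - p usable observations):

```python
import numpy as np
from scipy import stats

def rss(y, X):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

def granger_f_test(y, x, p):
    """F test of H0: the p lags of x do not help predict y, given p lags of y."""
    n = len(y)
    target = y[p:]
    ylags = np.column_stack([y[p - j : n - j] for j in range(1, p + 1)])
    xlags = np.column_stack([x[p - j : n - j] for j in range(1, p + 1)])
    X_full = np.column_stack([np.ones(n - p), ylags, xlags])   # full model
    X_rest = np.column_stack([np.ones(n - p), ylags])          # restricted model
    rss_full, rss_rest = rss(target, X_full), rss(target, X_rest)
    q = p                                   # number of restrictions (excluded x lags)
    dof = (n - p) - X_full.shape[1]         # residual degrees of freedom, full model
    F = ((rss_rest - rss_full) / q) / (rss_full / dof)
    return F, stats.f.sf(F, q, dof)         # F statistic and p-value

# e.g. F, pval = granger_f_test(y, x, p=4); reject H0 when pval is small
```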

Example of the Usage of the Granger Test World Oil Price and Growth of US Economy Does an increase in the world oil price influence the growth of the US economy, or does the growth of the US economy affect the world oil price? James Hamilton did this study using the following model: Z_t = a_0 + a_1 Z_{t-1} + … + a_m Z_{t-m} + b_1 X_{t-1} + … + b_m X_{t-m} + ε_t, where Z_t = ΔP_t, the change in the world price of oil, and X_t = log(GNP_t / GNP_{t-1}).

World Oil Price and Growth of US Economy There are two causalities that need to be examined: (i) H_0: Growth of the US economy does not influence the world oil price. Full: Z_t = a_0 + a_1 Z_{t-1} + … + a_m Z_{t-m} + b_1 X_{t-1} + … + b_m X_{t-m} + ε_t. Restricted: Z_t = a_0 + a_1 Z_{t-1} + … + a_m Z_{t-m} + ε_t.

World Oil Price and Growth of US Economy (ii) H_0: The world oil price does not influence the growth of the US economy. Full: X_t = a_0 + a_1 X_{t-1} + … + a_m X_{t-m} + b_1 Z_{t-1} + … + b_m Z_{t-m} + ε_t. Restricted: X_t = a_0 + a_1 X_{t-1} + … + a_m X_{t-m} + ε_t.

World Oil Price and Growth of US Economy F test results: 1. The hypothesis that the world oil price does not influence the US economy is rejected; the world oil price does influence the US economy. 2. The hypothesis that the US economy does not affect the world oil price is not rejected; the US economy has no detectable effect on the world oil price.

World Oil Price and Growth of US Economy Summary of James Hamilton's results: F statistics from experiment (I), F(4,86), and experiment (II), F(8,74), for the null hypotheses: I. economic growth does not cause the world oil price; II. the world oil price does not cause economic growth.

World Oil Price and Growth of US Economy Remark: the first experiment used 95 observations with m = 4, while the second experiment used 91 observations with m = 8.

Chicken vs. Egg This causality test can also be used to address which came first: the chicken or the egg. More specifically, the test can be used to examine whether the existence of eggs causes the existence of chickens, or vice versa. Thurman and Fisher did this study using yearly data on chicken and egg production in the US from 1930 to 1983. The results: 1. The egg causes the chicken. 2. There is no evidence that the chicken causes the egg.

Chicken vs. Egg Remark: the hypothesis that the egg has no effect on the chicken population is rejected, while the other hypothesis, that the chicken has no effect on the egg, is not rejected. Why?

GRANGER CAUSALITY We have to be aware that Granger causality is not the same as what we usually mean by causality. For instance, even if x_1 does not cause x_2, it may still help to predict x_2, and thus Granger-cause x_2, if changes in x_1 precede changes in x_2 for some reason. A naive example is that dragonflies fly much lower before a rain storm, due to the lower air pressure. We know that dragonflies do not cause rain storms, but their behavior does help to predict one; thus, it Granger-causes rain storms.