Part 25: Time Series 25-1/25 Econometrics I Professor William Greene Stern School of Business Department of Economics
Econometrics I Part 25 – Time Series
Modeling an Economic Time Series Observed y_0, y_1, …, y_t, … What is the sample? Random sampling? The observation window
Estimators Functions of sums of observations Law of large numbers? Nonindependent observations What does increasing the sample size mean? Asymptotic properties? (There are no finite-sample properties.)
Interpreting a Time Series Time domain: a process y(t) = a x(t) + b y(t-1) + … Regression-like approach/interpretation Frequency domain: a sum of terms y(t) = Σ_j [α_j cos(ω_j t) + β_j sin(ω_j t)] Contribution of different frequencies to the observed series. (High-frequency data and financial econometrics – "frequency" is used slightly differently there.)
For example,…
In parts…
Studying the Frequency Domain Cannot identify the number of terms Cannot identify frequencies from the time series Deconstructing the variance, autocovariances, and autocorrelations Contributions at different frequencies Apparent large weights at different frequencies Using Fourier transforms of the data Does this provide new information about the series?
Autocorrelation in Regression y_t = x_t′b + ε_t with Cov(ε_t, ε_{t-1}) ≠ 0 Ex. RealCons_t = a + b·RealIncome_t + ε_t, U.S. data, quarterly.
Autocorrelation How does it arise? What does it mean? Modeling approaches Classical – direct: corrective Estimation that accounts for autocorrelation Inference in the presence of autocorrelation Contemporary – structural Model the source Incorporate the time-series aspect in the model
Stationary Time Series y_t = b_1 y_{t-1} + b_2 y_{t-2} + … + b_P y_{t-P} + e_t Autocovariance: γ_k = Cov[y_t, y_{t-k}] Autocorrelation: ρ_k = γ_k / γ_0 Stationary series: γ_k depends only on k, not on t Weak stationarity: E[y_t] is not a function of t, and E[y_t y_{t-s}] is not a function of t, only of |s| Strong stationarity: the joint distribution of [y_t, y_{t-1}, …, y_{t-s}] for any window of length s periods is not a function of t A condition for weak stationarity: the smallest root of the characteristic polynomial 1 - b_1 z - b_2 z² - … - b_P z^P = 0 is greater than one in absolute value. The unit circle. Complex roots. Example: y_t = ρ y_{t-1} + e_t; 1 - ρz = 0 has root z = 1/ρ, so |z| > 1 ⇔ |ρ| < 1.
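The root condition on the slide above can be checked numerically. A minimal sketch (not from the slides; the function names are my own) for the AR(2) case, solving the characteristic polynomial with the quadratic formula:

```python
import cmath

def char_roots(b1, b2):
    """Roots of the AR(2) characteristic polynomial 1 - b1*z - b2*z**2 = 0."""
    if b2 == 0:
        return [complex(1.0 / b1)]           # AR(1) case: single root z = 1/b1
    disc = cmath.sqrt(b1 * b1 + 4.0 * b2)    # solve b2*z**2 + b1*z - 1 = 0
    return [(-b1 + disc) / (2.0 * b2), (-b1 - disc) / (2.0 * b2)]

def is_stationary(b1, b2=0.0):
    """Weak stationarity: every root lies outside the unit circle."""
    return all(abs(z) > 1.0 for z in char_roots(b1, b2))

print(is_stationary(0.5, 0.3))   # True: both roots outside the unit circle
print(is_stationary(1.0))        # False: unit root z = 1
```

Using `cmath` keeps the check correct when the roots are complex, as the slide notes they may be.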
Stationary vs. Nonstationary Series
The Lag Operator L x_t = x_{t-1}, L² x_t = x_{t-2}, L^P x_t + L^Q x_t = x_{t-P} + x_{t-Q} Polynomials in L: y_t = B(L) y_t + e_t, A(L) y_t = e_t Invertibility: y_t = [A(L)]^{-1} e_t
Inverting a Stationary Series y_t = ρ y_{t-1} + e_t (1 - ρL) y_t = e_t y_t = [1 - ρL]^{-1} e_t = e_t + ρ e_{t-1} + ρ² e_{t-2} + … Stationary series can be inverted Autoregressive vs. moving-average form of the series
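The equivalence of the autoregressive and moving-average forms can be verified numerically. A small illustrative sketch (simulation settings are my own), starting both forms from a zero presample value:

```python
import random

random.seed(0)
rho, T = 0.6, 200
e = [random.gauss(0.0, 1.0) for _ in range(T)]

# Autoregressive recursion: y_t = rho*y_{t-1} + e_t, starting from y_{-1} = 0
y_ar = [0.0] * T
for t in range(T):
    y_ar[t] = (rho * y_ar[t - 1] if t > 0 else 0.0) + e[t]

# Moving-average form: y_t = e_t + rho*e_{t-1} + rho^2*e_{t-2} + ...
y_ma = [sum(rho ** k * e[t - k] for k in range(t + 1)) for t in range(T)]

print(max(abs(a - b) for a, b in zip(y_ar, y_ma)))  # agrees up to rounding
```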
Regression with Autocorrelation y_t = x_t′b + e_t, e_t = ρ e_{t-1} + u_t (1 - ρL) e_t = u_t e_t = (1 - ρL)^{-1} u_t E[e_t] = E[(1 - ρL)^{-1} u_t] = (1 - ρL)^{-1} E[u_t] = 0 Var[e_t] = (1 + ρ² + ρ⁴ + …) σ_u² = σ_u² / (1 - ρ²) Cov[e_t, e_{t-1}] = Cov[ρ e_{t-1} + u_t, e_{t-1}] = ρ Var[e_{t-1}] + Cov[u_t, e_{t-1}] = ρ σ_u² / (1 - ρ²)
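A quick check (illustrative, with parameter values of my choosing) that the truncated geometric series matches the closed form σ_u²/(1 - ρ²) above:

```python
rho, sigma_u = 0.7, 1.5

# Var[e_t] = sigma_u^2 * (1 + rho^2 + rho^4 + ...), truncated at 2000 terms
var_series = sigma_u ** 2 * sum(rho ** (2 * k) for k in range(2000))
var_closed = sigma_u ** 2 / (1.0 - rho ** 2)

# Cov[e_t, e_{t-1}] = rho * Var[e_{t-1}]
cov_closed = rho * var_closed

print(abs(var_series - var_closed) < 1e-9)  # True: truncation error is negligible
```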
OLS vs. GLS OLS: Unbiased? Consistent (except in the presence of a lagged dependent variable) Inefficient GLS: Consistent and efficient
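One classical feasible-GLS recipe in the spirit of this slide is Cochrane-Orcutt: estimate ρ from the OLS residuals, quasi-difference the data, and rerun OLS. A simulation sketch (variable names, seed, and simulation settings are my own):

```python
import random
import statistics

def ols_slope(x, y):
    """Slope of a through-the-means OLS fit of y on x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return num / sum((xi - mx) ** 2 for xi in x)

random.seed(1)
T, b, rho = 500, 2.0, 0.7
x = [random.gauss(0.0, 1.0) for _ in range(T)]
e = [0.0] * T
for t in range(1, T):
    e[t] = rho * e[t - 1] + random.gauss(0.0, 1.0)   # AR(1) disturbance
y = [b * xi + ei for xi, ei in zip(x, e)]

b_ols = ols_slope(x, y)                    # consistent but inefficient
res = [yi - b_ols * xi for xi, yi in zip(x, y)]
rho_hat = ols_slope(res[:-1], res[1:])     # estimate rho from the residuals
# Quasi-difference and re-run OLS on the transformed data
x_star = [x[t] - rho_hat * x[t - 1] for t in range(1, T)]
y_star = [y[t] - rho_hat * y[t - 1] for t in range(1, T)]
b_fgls = ols_slope(x_star, y_star)
print(round(b_ols, 2), round(b_fgls, 2))   # both near the true b = 2.0
```

Both estimators are consistent here; the efficiency gain of FGLS shows up in smaller sampling variation across replications, not in any single draw.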
[Regression output; numeric values not recovered from the transcript: ordinary least squares regression of REALCONS on a constant and REALDPI, reporting the Durbin-Watson statistic and rho = cor[e, e(-1)]; the same coefficients with Newey-West robust standard errors (10 periods); and an AR(1) model e(t) = rho·e(t-1) + u(t) estimated iteratively, reporting the final rho, Durbin-Watson statistics and standard deviations for e(t) and u(t), with N[0,1] used for significance levels.]
Detecting Autocorrelation Use residuals Durbin-Watson: d = Σ_{t=2..T} (e_t - e_{t-1})² / Σ_{t=1..T} e_t² ≈ 2(1 - r) Assumes normally distributed disturbances and strictly exogenous regressors Variable addition (Godfrey): y_t = x_t′β + ρ ε_{t-1} + u_t Use the regression residuals e_t and test ρ = 0 Assumes consistency of b.
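The Durbin-Watson statistic is easy to compute directly from residuals; a minimal sketch (the example residual series are my own):

```python
def durbin_watson(e):
    """d = sum_(t>=2) (e_t - e_{t-1})^2 / sum_t e_t^2; d near 2 suggests no autocorrelation."""
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    den = sum(et ** 2 for et in e)
    return num / den

print(durbin_watson([1.0] * 100))        # 0.0: extreme positive autocorrelation
print(durbin_watson([1.0, -1.0] * 50))   # 3.96: extreme negative autocorrelation
```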
A Unit Root? How to test for ρ = 1? By construction: ε_t - ε_{t-1} = (ρ - 1) ε_{t-1} + u_t Test for γ = (ρ - 1) = 0 using regression? The variance goes to 0 faster than 1/T. Need a new table; can't use standard t tables. Dickey-Fuller tests Unit roots in economic data. (Are there?) Nonstationary series Implications for conventional analysis
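A sketch of the Dickey-Fuller idea (illustrative only: no constant, no lagged differences; the -1.95 figure is the standard 5% Dickey-Fuller critical value for this case, which is not tabulated on the slides):

```python
import math
import random

def df_tstat(y):
    """t-statistic on gamma in the regression dy_t = gamma*y_{t-1} + u_t (no constant)."""
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    ylag = y[:-1]
    gamma = sum(a * d for a, d in zip(ylag, dy)) / sum(a * a for a in ylag)
    resid = [d - gamma * a for a, d in zip(ylag, dy)]
    s2 = sum(r * r for r in resid) / (len(resid) - 1)
    return gamma / math.sqrt(s2 / sum(a * a for a in ylag))

random.seed(2)
T = 500
stat = [0.0] * T    # stationary AR(1), rho = 0.5
walk = [0.0] * T    # random walk, rho = 1
for t in range(1, T):
    stat[t] = 0.5 * stat[t - 1] + random.gauss(0.0, 1.0)
    walk[t] = walk[t - 1] + random.gauss(0.0, 1.0)

# The t-ratio does not have a t distribution under the null; the 5% Dickey-Fuller
# critical value here is about -1.95, well below the usual normal/t cutoff.
print(df_tstat(stat) < -1.95)   # rejects the unit root for the stationary series
```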
Reinterpreting Autocorrelation
Integrated Processes Integrated of order P when the Pth differenced series is stationary Stationary series are I(0) Trending series are often I(1); then y_t - y_{t-1} = Δy_t is I(0). [Most macroeconomic data series.] Accelerating series might be I(2); then (y_t - y_{t-1}) - (y_{t-1} - y_{t-2}) = Δ²y_t is I(0). [Money stock in hyperinflationary economies. Difficult to find many applications in economics.]
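Differencing down to stationarity can be illustrated directly. A sketch with deterministic trends standing in for the stochastic series discussed above (the example series are my own):

```python
def difference(y):
    """First difference: (delta y)_t = y_t - y_{t-1}."""
    return [y[t] - y[t - 1] for t in range(1, len(y))]

trend = [0.5 * t for t in range(10)]        # linear trend: behaves like I(1)
accel = [float(t * t) for t in range(10)]   # quadratic trend: behaves like I(2)

print(difference(trend))               # constant 0.5: one difference removes the trend
print(difference(difference(accel)))   # constant 2.0: two differences are needed
```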
Cointegration: Real DPI and Real Consumption
Cointegration – Divergent Series?
Cointegration x(t) and y(t) are obviously I(1) It looks like any linear combination of x(t) and y(t) will also be I(1) Does a model y(t) = b·x(t) + u(t), where u(t) is I(0), make any sense? How can u(t) be I(0)? In fact, there is a linear combination, [1, -b], that is I(0). Example: y(t) = 0.1t + noise, x(t) = 0.2t + noise y(t) and x(t) have a common trend y(t) and x(t) are cointegrated; here [1, -0.5] is the combination that removes the trend.
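A numerical version of the slide's example (the 0.5 weight and the trend-slope check are my additions; a full cointegration test would examine the residuals with Dickey-Fuller machinery):

```python
import random
import statistics

def trend_slope(y):
    """OLS slope of y on t; clearly nonzero when the series carries a trend."""
    t = list(range(len(y)))
    mt, my = statistics.fmean(t), statistics.fmean(y)
    num = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
    return num / sum((ti - mt) ** 2 for ti in t)

random.seed(3)
T = 500
y = [0.1 * t + random.gauss(0.0, 1.0) for t in range(T)]
x = [0.2 * t + random.gauss(0.0, 1.0) for t in range(T)]

# The combination [1, -0.5] cancels the common trend: u(t) = y(t) - 0.5*x(t)
u = [yi - 0.5 * xi for yi, xi in zip(y, x)]

print(abs(trend_slope(y) - 0.1) < 0.01)   # True: y trends at rate 0.1
print(abs(trend_slope(u)) < 0.01)         # True: the combination does not trend
```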
Cointegration and I(0) Residuals