1 Bristol MSc Time Series Econometrics, Spring 2015 Univariate time series processes, moments, stationarity

2 Overview Moving average processes Autoregressive processes MA representation of autoregressive processes Computing first and second moments: means, variances, autocovariances, autocorrelations Stationarity, strong and weak; ergodicity The lag operator, lag polynomials, invertibility. Mostly from Hamilton (1994), but see also Cochrane's monograph on time series.

3 Aim These univariate concepts are needed in multivariate analysis. Moment calculation is a building block in forming the likelihood of time series data, and therefore in estimation. Foundational tools in descriptive time series modelling.

4 Two notions of the mean in time series 1. Imagine many computers simulating a series in parallel. If at date t we took an average across all of them, what would that converge to as we made the number I of these computers large? 2. Suppose we used one computer to simulate a time series process. What would the average of all its observations converge to as T got very large?
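In symbols, the two notions are the ensemble average and the time average,
\bar{y}_t = \frac{1}{I}\sum_{i=1}^{I} y_t^{(i)} \qquad \text{and} \qquad \bar{y} = \frac{1}{T}\sum_{t=1}^{T} y_t,
and for a stationary, ergodic process both converge to the same limit, E[Y_t].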

5 Variance, autocovariance The variance; the general autocovariance, which nests the variance. Related to the covariance from general (i.e. not just time-series) multivariate analysis.
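In symbols, with \mu = E[Y_t]:
\gamma_0 = E[(Y_t - \mu)^2], \qquad \gamma_j = E[(Y_t - \mu)(Y_{t-j} - \mu)],
so the variance is the j = 0 case of the autocovariance.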

6 Autocorrelation, correlation The autocorrelation of order j is the autocovariance of order j divided by the variance. It comes from the definition and computation of the general notion of correlation in multivariate, not necessarily time-series, analysis.
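In symbols:
\rho_j = \frac{\gamma_j}{\gamma_0}, \qquad \rho_0 = 1, \qquad -1 \leq \rho_j \leq 1.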

7 Moving average processes First-order MA process, 'MA(1)': mu and theta are parameters, e is a white noise shock. Computing the mean and variance of an MA(1).
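In standard notation, the MA(1) and its first two moments are
Y_t = \mu + \varepsilon_t + \theta\varepsilon_{t-1}, \qquad E[Y_t] = \mu, \qquad \gamma_0 = (1+\theta^2)\sigma^2.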

8 White noise Cross section average of shocks is zero. Variance is some constant. No ‘correlation’ across different units. Gaussian white noise if, in addition, normally distributed.
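In symbols, \{\varepsilon_t\} is white noise if
E[\varepsilon_t] = 0, \qquad E[\varepsilon_t^2] = \sigma^2, \qquad E[\varepsilon_t\varepsilon_\tau] = 0 \;\text{for}\; t \neq \tau.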

9 1st autocovariance of an MA(1) Higher-order autocovariances of an MA(1) are 0. It's an exercise to explain why this is.
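In symbols:
\gamma_1 = E[(\varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_{t-1} + \theta\varepsilon_{t-2})] = \theta\sigma^2, \qquad \gamma_j = 0 \;\text{for}\; j \geq 2.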

10 Higher order MA processes MA(2) MA(n) And we can have infinite order MA processes, referred to as MA(inf).
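In symbols:
MA(2): Y_t = \mu + \varepsilon_t + \theta_1\varepsilon_{t-1} + \theta_2\varepsilon_{t-2}
MA(n): Y_t = \mu + \varepsilon_t + \theta_1\varepsilon_{t-1} + \dots + \theta_n\varepsilon_{t-n}
MA(inf): Y_t = \mu + \sum_{j=0}^{\infty}\theta_j\varepsilon_{t-j}, with \theta_0 = 1.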

11 Why ‘moving average’ process? The RHS is an ‘average’ [actually a weighted sum] And it is a sum whose coverage or window ‘moves’ as the time indicator grows.

12 Stationarity, ergodicity Weak or covariance stationarity: mean and autocovariances are independent of t. Strong stationarity: the joint density of elements in the sequence depends not on t, just on the gaps between the different elements. Ergodicity: convergence of the 'time-series' average to the 'cross-section' average.
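In symbols, weak stationarity requires
E[Y_t] = \mu \;\text{for all}\; t, \qquad E[(Y_t - \mu)(Y_{t-j} - \mu)] = \gamma_j \;\text{for all}\; t,
i.e. the mean and the autocovariances depend at most on the gap j, never on the date t.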

13 Cross-sectional and time series stationarity y_t = rho*y_{t-1} + sd*e_t; rho = 0.8, sd = 1. Top panel: variance of outturns ACROSS simulations. Bottom panel: rolling variance OVER TIME for 1 simulation.

14 Cross-sectional and time-series non-stationarity y_t = rho*y_{t-1} + sd*e_t; rho = 1.002, sd = 1. Coefficient just over unity, but the cross-sectional variance is exploding… And the rolling time-series variance is not constant either.

15 Matlab code to simulate ARs, compute and plot cs and ts variances

%script to demonstrate non-stationarity in AR(1) and time series / cross
%sectional notion of variance.
clear all;                      %ensures memory doesn't carry forward errors from runs of old versions
tsample=1000;                   %length of time series to simulate
mcsample=50;                    %number of time series in our monte carlo
rho=1.002;                      %autoregressive parameter
sd=1;                           %standard deviation of shocks
shocks=randn(tsample,mcsample);
y=zeros(tsample,mcsample);      %store our simulated data here
csvar=zeros(tsample-1,1);       %store cross-sectional variances here, one per date
tsvar=zeros(tsample-1,1);       %store rolling time-series variances here
%simulate mcsample independent AR(1) paths
for i=1:mcsample
    for j=2:tsample
        y(j,i)=rho*y(j-1,i)+sd*shocks(j,i);
    end
end
%calculate cross-sectional variances, across simulations at each date
for i=2:tsample
    csvar(i-1)=var(y(i,:));
end
%calculate rolling time-series variances for the first simulation
for j=2:tsample
    tsvar(j-1)=var(y(1:j,1));
end
%chart results
figure
subplot(2,1,1)
plot(csvar)
title('Cross-sectional variances, rho=1.002')
subplot(2,1,2)
plot(tsvar)
title('Time-series variances, rho=1.002')

16 AR(1), MA(1) Initial shock=0, theta=0.7

17 Matlab code to simulate MA(1), AR(1)
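The code itself did not survive into the transcript; what follows is a minimal sketch of a script that would produce such simulations, with phi = theta = 0.7 assumed to match the neighbouring slide:

%hypothetical reconstruction: simulate an AR(1) and an MA(1) driven by the same shocks
tsample=100;                %length of simulated series
phi=0.7;                    %AR(1) parameter
theta=0.7;                  %MA(1) parameter
e=randn(tsample,1);         %Gaussian white noise
yar=zeros(tsample,1);       %AR(1): y_t = phi*y_{t-1} + e_t, initialised at 0
yma=zeros(tsample,1);       %MA(1): y_t = e_t + theta*e_{t-1}
for t=2:tsample
    yar(t)=phi*yar(t-1)+e(t);
    yma(t)=e(t)+theta*e(t-1);
end
figure
plot([yar yma])
legend('AR(1), phi=0.7','MA(1), theta=0.7')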

18 AR and ARMA processes AR(1) AR(2) ARMA(1,1) Which process you use will depend on whether you have economics/theory to guide you, or statistical criteria.
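In symbols:
AR(1): Y_t = c + \phi Y_{t-1} + \varepsilon_t
AR(2): Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t
ARMA(1,1): Y_t = c + \phi Y_{t-1} + \varepsilon_t + \theta\varepsilon_{t-1}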

19 MA representation of an AR(1) Derive the MA rep by repeatedly substituting out for lagged Y using the AR(1) form, as below.
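The first two substitutions:
Y_t = c + \phi Y_{t-1} + \varepsilon_t = c + \phi(c + \phi Y_{t-2} + \varepsilon_{t-1}) + \varepsilon_t = c(1+\phi) + \phi^2 Y_{t-2} + \varepsilon_t + \phi\varepsilon_{t-1},
and so on, pushing the lagged Y term ever further into the past.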

20 MA(inf) representation of AR(1) Exists provided |phi| < 1. Shows that for a stationary AR(1), we can view today's Y as the sum of the infinite sequence of past shocks. Note that the imprint of past shocks on today is smaller the further back in time they happened, because of the dampening implied by |phi| < 1.
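In symbols:
Y_t = \frac{c}{1-\phi} + \sum_{j=0}^{\infty}\phi^j\varepsilon_{t-j}, \qquad |\phi| < 1.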

21 Impulse response function for an AR(1) Start from zero. The effect of a shock today is the shock itself. The effect of that shock tomorrow, in period t+1. And then propagated out another period…. The IRF asks: what is the effect of a shock (an impulse) at a particular horizon in the future? Note the relationship with the MA(inf) rep of an AR(1).
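In symbols, the response of Y at horizon h to a unit shock at date t is
\frac{\partial Y_{t+h}}{\partial\varepsilon_t} = \phi^h,
exactly the coefficient on \varepsilon_{t-h} in the MA(inf) representation.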

22 IRF for AR(1): an example Suppose phi=0.8, c=0, e_0=1. A shock of e_0=1 is what we would often take as a standardised shock size to illustrate the shape of the IRF for an estimated time series process. Or we might take a shock size of 1 standard deviation. Note how the IRF for the stationary AR(1) is monotonic [always falling] and dies out.
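Concretely, with \phi = 0.8 the IRF at horizons 0, 1, 2, 3, 4, … is 1, 0.8, 0.64, 0.512, 0.4096, …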

23 The forecast in an AR(1) The ('time-series') expectation, given information at 0, of Y_1. The forecast at some horizon h. The forecast error we will record when period h comes along and we observe the data we forecast.
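In symbols:
E_0[Y_1] = c + \phi Y_0, \qquad E_t[Y_{t+h}] = c(1 + \phi + \dots + \phi^{h-1}) + \phi^h Y_t,
and the forecast error at horizon h is Y_{t+h} - E_t[Y_{t+h}].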

24 Forecast errors in an AR(1) Partially construct the MA rep of an outturn for Y at horizon h in the future. We see that the forecast error at horizon h is a moving average of the shocks that hit between now and h.
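In symbols, the outturn is
Y_{t+h} = c(1 + \phi + \dots + \phi^{h-1}) + \phi^h Y_t + \sum_{j=0}^{h-1}\phi^j\varepsilon_{t+h-j},
so the forecast error at horizon h is \sum_{j=0}^{h-1}\phi^j\varepsilon_{t+h-j}.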

25 Forecast error analysis Armed with our analytical time series forecast errors…. We can compute their expectation. We can compare the expectation to the outturn. [Are they biased?] We can compute the expected autocorrelation of the errors. Their variance…. Connection with the empirical literature on rational expectations and survey/professional forecaster measures of inflation expectations.
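For example, the h-step error has expectation zero and variance
\sigma^2\sum_{j=0}^{h-1}\phi^{2j} = \sigma^2\,\frac{1-\phi^{2h}}{1-\phi^2}.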

26 VAR(1) representation of AR(2) AR(2). VAR(1) representation of the AR(2): the first line has the 'meat'; the second line is just an identity. Bold type is sometimes used to denote matrices. Why do this? Certain formulae for IRFs, standard errors, or forecast error variances are easily derivable for first-order models. So get the higher-order model into first-order form and then proceed….
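In symbols, the companion form of the AR(2) is
\begin{bmatrix} Y_t \\ Y_{t-1} \end{bmatrix} = \begin{bmatrix} c \\ 0 \end{bmatrix} + \begin{bmatrix} \phi_1 & \phi_2 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} Y_{t-1} \\ Y_{t-2} \end{bmatrix} + \begin{bmatrix} \varepsilon_t \\ 0 \end{bmatrix}.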

27 The lag operator The lag operator shifts the time subscript backwards, or, if we write its inverse, forwards. We can use it to express time series processes like AR models differently. The lag operator is commutative with multiplication.
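In symbols:
L Y_t = Y_{t-1}, \qquad L^j Y_t = Y_{t-j}, \qquad L^{-1} Y_t = Y_{t+1},
so the AR(1) can be written (1 - \phi L)Y_t = c + \varepsilon_t.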

28 Rediscovering the MA(inf) representation of an AR(1) with the lag operator Operate on both sides of the AR(1) in lag-operator form with a truncated geometric sum in L; expanding the compound operator on the LHS gives the result below.
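Writing the operation out:
(1 + \phi L + \phi^2 L^2 + \dots + \phi^t L^t)(1 - \phi L)Y_t = (1 + \phi L + \dots + \phi^t L^t)(c + \varepsilon_t),
and the compound operator on the LHS collapses to (1 - \phi^{t+1}L^{t+1}).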

29 Rediscovering….ctd The LHS of the above, written explicitly without lag operators. Note that as t goes to inf, we are left with Y_t. So with the aid of the lag operator, we have rediscovered the MA(inf) representation of the AR(1).
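Explicitly:
(1 - \phi^{t+1}L^{t+1})Y_t = Y_t - \phi^{t+1}Y_{-1} \to Y_t \;\text{as}\; t \to \infty,
while the RHS tends to \frac{c}{1-\phi} + \sum_{j=0}^{\infty}\phi^j\varepsilon_{t-j}.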

30 Lag operators and invertibility of AR(1) This is what we have established, implying that these operators are approximately inverses of one another. Note the defining property of (any) inverse operator: composed with the original, it gives '1', the identity operator.
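In symbols:
(1 + \phi L + \phi^2 L^2 + \dots)(1 - \phi L) = 1, \qquad \text{so} \qquad (1 - \phi L)^{-1} = \sum_{j=0}^{\infty}\phi^j L^j \quad \text{for}\; |\phi| < 1.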

31 Invertibility, ctd… Provided |phi| < 1, we can operate on both sides with the inverse of the AR lag polynomial to recover the MA(inf) form. This is what is referred to as the 'invertibility' property of an AR(1) process. Analogous properties are deduced for multivariate vector autoregressive (VAR) processes too.
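In symbols:
Y_t = (1 - \phi L)^{-1}(c + \varepsilon_t) = \frac{c}{1-\phi} + \sum_{j=0}^{\infty}\phi^j\varepsilon_{t-j}.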

32 Computing mean and variance of AR(2) More involved than for the AR(1) Introduces likelihood computation for more complex processes Introduces recursive nature of autocovariances and its usefulness NB: it will simply be an exercise to do this for an AR(1) process.

33 Mean of an AR(2) Here is an AR(2) process. Start with calculating the mean. To get the mean, simply take expectations of both sides.
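In symbols:
Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t; \qquad \mu = c + \phi_1\mu + \phi_2\mu \;\Rightarrow\; \mu = \frac{c}{1 - \phi_1 - \phi_2}.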

34 Variance of an AR(2) Rewrite our AR(2) using this substitution for the constant term c. This is what we get after making the substitution for c.
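Substituting c = \mu(1 - \phi_1 - \phi_2) gives the deviations-from-mean form
Y_t - \mu = \phi_1(Y_{t-1} - \mu) + \phi_2(Y_{t-2} - \mu) + \varepsilon_t.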

35 Variance of an AR(2) Multiply by (Y_t − mu), take expectations, and we get the recursive equation in autocovariances below; this is also where the sigma^2 term comes from.
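In symbols:
\gamma_0 = \phi_1\gamma_1 + \phi_2\gamma_2 + \sigma^2,
where the \sigma^2 term is E[(Y_t - \mu)\varepsilon_t]: the current shock is the only piece of Y_t correlated with \varepsilon_t.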

36 Variance of an AR(2) General form of the recursive autocovariance equation, formed by multiplying not by Y_t-mu, but Y_t-j-mu, then taking expectations.
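In symbols, for j \geq 1:
\gamma_j = \phi_1\gamma_{j-1} + \phi_2\gamma_{j-2}.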

37 Variance of an AR(2) Divide both sides by the variance, or the 0th-order autocovariance, to get an equation in autocorrelations. Set j=1 to get the first result below, noting that rho_0=1 and rho_1=rho_{-1}. Set j=2 and the recursive equation in autocorrelations implies the second… which we can rewrite, substituting in the expression for rho_1.
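In symbols:
\rho_j = \phi_1\rho_{j-1} + \phi_2\rho_{j-2}; \qquad \rho_1 = \frac{\phi_1}{1 - \phi_2}, \qquad \rho_2 = \phi_1\rho_1 + \phi_2 = \frac{\phi_1^2}{1 - \phi_2} + \phi_2.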

38 Variance of an AR(2) Rewrite the autocovariances on the RHS in terms of autocorrelations. Then substitute in the autocorrelations we found on the last slide…. And rearrange as an equation in gamma_0, the variance, which is what we were trying to solve for. Done!
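In symbols:
\gamma_0 = \phi_1\rho_1\gamma_0 + \phi_2\rho_2\gamma_0 + \sigma^2 \;\Rightarrow\; \gamma_0 = \frac{\sigma^2}{1 - \phi_1\rho_1 - \phi_2\rho_2}.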

39 Recap Moving average processes Autoregressive processes ARMA processes Methods for computing first and second moments of these Impulse response Forecast, forecast errors MA(infinity) representation of an AR(1) Lag operators, polynomials in the lag operator

