
1 Time Series Basics Fin250f: Lecture 8.1 Spring 2010 Reading: Brooks, chapter 5.1-5.7

2 Outline  Linear stochastic processes  Autoregressive process  Moving average process  Lag operator  Model identification: PACF/ACF, information criteria

3 Stochastic Processes

4 Time Series Definitions  Strictly stationary  Covariance stationary  Uncorrelated  White noise

5 Strictly Stationary  All distributional features are independent of time

6 Weak or Covariance Stationary  Variances and covariances independent of time
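
In standard notation, with \gamma_s the autocovariance at lag s, covariance stationarity requires

  E(y_t) = \mu                    for all t
  Var(y_t) = \sigma^2 < \infty    for all t
  Cov(y_t, y_{t-s}) = \gamma_s    depending only on the lag s, not on t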

7 Autocorrelation
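
The autocorrelation function scales the autocovariances by the variance, so in the same notation

  \rho_s = \gamma_s / \gamma_0,   s = 0, 1, 2, ...   (hence \rho_0 = 1)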

8 White Noise

9 White Noise in Words  Weakly stationary  All autocovariances at non-zero lags are zero  Not necessarily independent
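
A standard formal statement, with u_t the white-noise series:

  E(u_t) = \mu,   Var(u_t) = \sigma^2 < \infty,   \gamma_s = 0 for all s \neq 0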

10 Time Series Estimates
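
The usual sample estimate of the lag-s autocorrelation, and its approximate distribution under the null of no autocorrelation (which gives the familiar +/- 1.96/\sqrt{T} confidence bands), are

  \hat{\rho}_s = \sum_{t=s+1}^{T} (y_t - \bar{y})(y_{t-s} - \bar{y}) / \sum_{t=1}^{T} (y_t - \bar{y})^2
  \hat{\rho}_s approx. N(0, 1/T) under H_0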

11 Ljung-Box Statistic
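
The Ljung-Box statistic tests whether the first m autocorrelations are jointly zero:

  Q^* = T(T+2) \sum_{k=1}^{m} \hat{\rho}_k^2 / (T - k)   ~   \chi^2_m   under H_0: \rho_1 = ... = \rho_m = 0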

12 Linear Stochastic Processes  Linear models  Time series dependence  Common econometric frameworks  Engineering background

13 Autoregressive Process, Order 1: AR(1)
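
In the form used on the following slides, with u_t white noise and \phi the autoregressive coefficient:

  y_t = \mu + \phi y_{t-1} + u_t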

14 AR(1) Properties

15 More AR(1) Properties

16 More AR(1) properties
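
For |\phi| < 1 the standard unconditional moments are

  E(y_t) = \mu / (1 - \phi),   Var(y_t) = \sigma^2 / (1 - \phi^2),   \rho_s = \phi^s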

17 AR(1): Zero mean form

18 AR(m) (Order m)
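
The order-m autoregression adds further lags of y:

  y_t = \mu + \phi_1 y_{t-1} + \phi_2 y_{t-2} + ... + \phi_m y_{t-m} + u_t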

19 Moving Average Process of Order 1, MA(1)

20 MA(1) Properties
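
With u_t white noise and \theta the moving-average coefficient, the MA(1) and its moments are

  y_t = \mu + u_t + \theta u_{t-1}
  E(y_t) = \mu,   Var(y_t) = (1 + \theta^2) \sigma^2,   \rho_1 = \theta / (1 + \theta^2),   \rho_s = 0 for s > 1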

21 MA(m)
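
The order-m moving average adds further lags of the shock:

  y_t = \mu + u_t + \theta_1 u_{t-1} + \theta_2 u_{t-2} + ... + \theta_m u_{t-m}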

22 Stationarity  Process not exploding  For AR(1): requires |\phi| < 1  All finite MAs are stationary  More complex beyond AR(1)

23 AR(1) -> MA(infinity)
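
Repeated substitution in the zero-mean AR(1), with |\phi| < 1 so the weights die out, gives the infinite-order moving-average representation:

  y_t = \phi y_{t-1} + u_t = u_t + \phi u_{t-1} + \phi^2 u_{t-2} + ... = \sum_{j=0}^{\infty} \phi^j u_{t-j}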

24 Lag Operator (L)
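
The lag operator shifts a series back in time:

  L y_t = y_{t-1},   L^2 y_t = y_{t-2},   L^k y_t = y_{t-k}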

25 Using the Lag Operator (Mean adjusted form)

26 An important feature for L
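
The feature needed for the conversions on the surrounding slides is that, for |\phi| < 1, the lag polynomial (1 - \phi L) can be inverted as a geometric series:

  1 / (1 - \phi L) = 1 + \phi L + \phi^2 L^2 + \phi^3 L^3 + ...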

27 MA(1) -> AR(infinity)

28 MA->AR
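
For |\theta| < 1 (invertibility), inverting the MA(1) lag polynomial gives an infinite-order autoregressive representation (zero-mean form):

  y_t = (1 + \theta L) u_t
  u_t = y_t / (1 + \theta L) = \sum_{j=0}^{\infty} (-\theta)^j y_{t-j}
  so  y_t = \theta y_{t-1} - \theta^2 y_{t-2} + \theta^3 y_{t-3} - ... + u_t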

29 ARs and MAs  Can convert any stationary AR to an infinite MA  Exponentially declining weights  Can only convert "invertible" MAs to ARs  Stationarity and invertibility: easy for AR(1), MA(1); more difficult for larger models

30 Combining AR and MA: ARMA(p,q) (more later)
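
Putting the two pieces together:

  y_t = \mu + \phi_1 y_{t-1} + ... + \phi_p y_{t-p} + u_t + \theta_1 u_{t-1} + ... + \theta_q u_{t-q}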

31 Modeling Procedures (Box-Jenkins)  Identification: determine structure (how many lags? AR, MA, or ARMA?); this step is tricky  Estimation: estimate the parameters  Residual diagnostics  Next section: forecast performance and evaluation
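
A minimal sketch of the identification/estimation steps, assuming the Python statsmodels package (rather than the Matlab toolbox mentioned on a later slide); the simulated series and candidate orders are purely illustrative:

  # Box-Jenkins sketch: simulate an AR(1), then compare candidate ARMA orders
  # by information criteria. Assumes numpy and statsmodels are installed.
  import numpy as np
  from statsmodels.tsa.arima.model import ARIMA

  rng = np.random.default_rng(42)
  T, phi = 500, 0.9
  y = np.zeros(T)
  for t in range(1, T):
      y[t] = phi * y[t - 1] + rng.standard_normal()   # AR(1) with phi = 0.9

  # Identification/estimation: fit each ARMA(p, q) candidate and compare criteria.
  for p in range(3):
      for q in range(3):
          res = ARIMA(y, order=(p, 0, q)).fit()
          print(f"ARMA({p},{q}):  AIC = {res.aic:7.1f}   SBIC = {res.bic:7.1f}")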

32 Identification Tools  Diagnostics: ACF, partial ACF  Information criteria  Forecasts

33 Autocorrelation

34 Partial Autocorrelation  Correlation between y(t) and y(t-k) after removing the effects of all intermediate lags (1, ..., k-1)  Marginal forecast contribution of y(t-k) given all shorter lags

35 Partial Autocorrelation

36 For an AR(1)

37 AR(1) (0.9)

38 For an MA(1)

39 MA(1) (0.9)

40 General Features  Autoregressive: decaying ACF; PACF drops to zero beyond the model order (p)  Moving average: decaying PACF; ACF drops to zero beyond the model order (q)  Don't count on things looking so good
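
A small simulation illustrating these patterns (a sketch, assuming statsmodels and matplotlib; coefficients of 0.9 as on the preceding slides):

  # Simulate an AR(1) and an MA(1) with coefficient 0.9 and plot ACF/PACF.
  import matplotlib.pyplot as plt
  from statsmodels.tsa.arima_process import arma_generate_sample
  from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

  # arma_generate_sample takes lag-polynomial coefficients:
  # AR(1), phi = 0.9  ->  ar = [1, -0.9];   MA(1), theta = 0.9  ->  ma = [1, 0.9]
  ar1 = arma_generate_sample(ar=[1, -0.9], ma=[1], nsample=1000)
  ma1 = arma_generate_sample(ar=[1], ma=[1, 0.9], nsample=1000)

  fig, axes = plt.subplots(2, 2, figsize=(10, 6))
  plot_acf(ar1, ax=axes[0, 0], title="AR(1) ACF: decays geometrically")
  plot_pacf(ar1, ax=axes[0, 1], title="AR(1) PACF: cuts off after lag 1")
  plot_acf(ma1, ax=axes[1, 0], title="MA(1) ACF: cuts off after lag 1")
  plot_pacf(ma1, ax=axes[1, 1], title="MA(1) PACF: decays")
  plt.tight_layout()
  plt.show()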

41 Information Criteria  Akaike (AIC)  Schwarz Bayesian criterion (SBIC)  Hannan-Quinn (HQIC)  Objective: penalize model errors, penalize model complexity, favor simple/accurate models

42 Information Criteria
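
In one common formulation, with \hat{\sigma}^2 the residual variance, k the number of estimated parameters and T the sample size:

  AIC  = ln(\hat{\sigma}^2) + 2k/T
  SBIC = ln(\hat{\sigma}^2) + (k/T) ln T
  HQIC = ln(\hat{\sigma}^2) + (2k/T) ln(ln T)

For typical sample sizes SBIC imposes the stiffest complexity penalty and AIC the lightest.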

43 Estimation  Autoregressive (AR): OLS; biased (downward), but consistent and approximately normal for large T  Moving average (MA) and ARMA: numerical estimation procedures, built into many packages  Matlab econometrics toolbox

44 Residual Diagnostics  Get the model residuals (in-sample forecast errors)  Run this series through various diagnostics: ACF, PACF, Ljung-Box, plots  Should be white noise (no remaining structure)
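
A sketch of these checks, continuing the Box-Jenkins example above (assumes statsmodels and matplotlib; `res` is the fitted ARIMA results object from that sketch):

  # Residual diagnostics: residuals of a well-specified model should be white noise.
  import matplotlib.pyplot as plt
  from statsmodels.stats.diagnostic import acorr_ljungbox
  from statsmodels.graphics.tsaplots import plot_acf

  resid = res.resid                          # model residuals (in-sample forecast errors)
  print(acorr_ljungbox(resid, lags=[10]))    # Ljung-Box Q at lag 10; large p-value -> no remaining structure
  plot_acf(resid, title="Residual ACF: should show no significant spikes")
  plt.show()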

