
Using SAS for Time Series Data
LSU Economics Department, March 16, 2012

Next workshop (March 30): Instrumental Variables Estimation

Principles of Econometrics, 4th Edition
Chapter 12: Regression with Time-Series Data: Nonstationary Variables

Chapter 12 Contents
12.1 Stationary and Nonstationary Variables
12.2 Spurious Regressions
12.3 Unit Root Tests for Stationarity
12.4 Cointegration

The aim is to describe how to estimate regression models involving nonstationary variables
– The first step is to examine the time-series concepts of stationarity and nonstationarity and how we distinguish between them

12.1 Stationary and Nonstationary Variables

The change in a variable is an important concept
– The change in a variable y_t, also known as its first difference, is given by Δy_t = y_t - y_{t-1}; Δy_t is the change in the value of y from period t-1 to period t
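As a quick illustration (not part of the original slides, which use SAS), the first difference is a one-liner in Python/NumPy:

```python
# First difference of a series: dy_t = y_t - y_{t-1}.
import numpy as np

y = np.array([100.0, 102.0, 101.0, 105.0])
dy = np.diff(y)          # one observation shorter than y
print(dy)                # [ 2. -1.  4.]
```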

FIGURE 12.1: U.S. economic time series

Formally, a time series y_t is stationary if its mean and variance are constant over time, and if the covariance between two values from the series depends only on the length of time separating the two values, not on the actual times at which the variables are observed

That is, the time series y_t is stationary if, for all values and every time period:
E(y_t) = μ (Eq. 12.1a)
var(y_t) = σ² (Eq. 12.1b)
cov(y_t, y_{t+s}) = cov(y_t, y_{t-s}) = γ_s (Eq. 12.1c)

The First-Order Autoregressive Model
FIGURE 12.2: Time-series models

12.2 Spurious Regressions
FIGURE 12.3: Time series and scatter plot of two random walk variables

A simple regression of series one (rw_1) on series two (rw_2) yields an apparently significant relationship
– These results are completely meaningless, or spurious; the apparent significance of the relationship is false

When nonstationary time series are used in a regression model, the results may spuriously indicate a significant relationship when there is none
– In these cases the least squares estimator and least squares predictor do not have their usual properties, and t-statistics are not reliable
– Since many macroeconomic time series are nonstationary, it is particularly important to take care when estimating regressions with macroeconomic variables

12.3 Unit Root Tests for Stationarity

There are many tests for determining whether a series is stationary or nonstationary
– The most popular is the Dickey-Fuller test

Dickey-Fuller Test 1 (No Constant and No Trend)
Consider the AR(1) model: y_t = ρ y_{t-1} + v_t
– We can test for nonstationarity by testing the null hypothesis that ρ = 1 against the alternative that |ρ| < 1, or simply ρ < 1

A more convenient form is obtained by subtracting y_{t-1} from both sides:
Δy_t = γ y_{t-1} + v_t, where γ = ρ - 1 (Eq. 12.5a)
– The hypotheses are H_0: γ = 0 (nonstationary) and H_1: γ < 0 (stationary)
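This test regression can be hand-rolled in a few lines of Python/NumPy (an illustrative sketch, not the slides' SAS code); note that the resulting tau statistic must be compared with Dickey-Fuller critical values, not the usual t tables:

```python
# Tau statistic for the Dickey-Fuller regression dy_t = gamma * y_{t-1} + v_t
# (no constant, no trend), built directly from the OLS formulas.
# Compare tau with the Dickey-Fuller critical values (e.g. -1.94 at 5%).
import numpy as np

def df_tau_no_const(y):
    dy = np.diff(y)                       # dependent variable
    ylag = y[:-1]                         # single regressor y_{t-1}
    gamma = (ylag @ dy) / (ylag @ ylag)   # OLS slope with no intercept
    resid = dy - gamma * ylag
    s2 = (resid @ resid) / (len(dy) - 1)  # error variance, one parameter
    se = np.sqrt(s2 / (ylag @ ylag))
    return gamma / se

rng = np.random.default_rng(0)
rw = np.cumsum(rng.standard_normal(500))  # random walk: gamma = 0 is true
print(df_tau_no_const(rw))
```

For a clearly stationary series the tau statistic comes out strongly negative; for a random walk it usually does not.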

Dickey-Fuller Test 2 (With Constant but No Trend)
The second Dickey-Fuller test includes a constant term in the test equation:
Δy_t = α + γ y_{t-1} + v_t (Eq. 12.5b)
– The null and alternative hypotheses are the same as before

Dickey-Fuller Test 3 (With Constant and With Trend)
The third Dickey-Fuller test includes a constant and a trend in the test equation:
Δy_t = α + γ y_{t-1} + λt + v_t (Eq. 12.5c)
– The null and alternative hypotheses are H_0: γ = 0 and H_1: γ < 0

The Dickey-Fuller Critical Values
To test the hypothesis in all three cases, we simply estimate the test equation by least squares and examine the t-statistic for the hypothesis that γ = 0
– Unfortunately, this t-statistic no longer has the t-distribution
– Instead, we use a statistic often called the τ (tau) statistic

Table 12.2: Critical Values for the Dickey-Fuller Test

To carry out a one-tail test of significance, if τ_c is the critical value obtained from Table 12.2, we reject the null hypothesis of nonstationarity if τ ≤ τ_c
– If τ > τ_c, then we do not reject the null hypothesis that the series is nonstationary

An important extension of the Dickey-Fuller test allows for the possibility that the error term is autocorrelated
– Consider the model:
Δy_t = γ y_{t-1} + Σ_{s=1}^{m} a_s Δy_{t-s} + v_t (Eq. 12.6)
where the lagged differences Δy_{t-s} are included to eliminate autocorrelation in the error term v_t

The Dickey-Fuller Tests: An Example
As an example, consider the two interest rate series:
– The federal funds rate (F_t)
– The three-year bond rate (B_t)
Following procedures described in Sections 9.3 and 9.4, we find that the inclusion of one lagged difference term is sufficient to eliminate autocorrelation in the residuals in both cases

The test equations are estimated for F_t and B_t (coefficient estimates not reproduced here)
– The 5% critical value for tau (τ_c) is -2.86
– Since each computed tau statistic is greater than -2.86, we do not reject the null hypothesis that the series are nonstationary

Order of Integration
Recall that if y_t follows a random walk, then γ = 0 and the first difference of y_t becomes Δy_t = v_t
– Series like y_t, which can be made stationary by taking the first difference, are said to be integrated of order one, denoted I(1)
– Stationary series are said to be integrated of order zero, I(0)
– In general, the order of integration of a series is the minimum number of times it must be differenced to make it stationary

The Dickey-Fuller test for a random walk is then applied to the first differences ΔF_t and ΔB_t (estimates not reproduced here)

Based on the large negative value of the tau statistic (well below the 5% critical value of -1.94), we reject the null hypothesis that ΔF_t is nonstationary and accept the alternative that it is stationary
– We similarly conclude that ΔB_t is stationary (-7.662 < -1.94)

12.4 Cointegration

As a general rule, nonstationary time-series variables should not be used in regression models, to avoid the problem of spurious regression
– There is an exception to this rule

There is an important case in which e_t = y_t - β_1 - β_2 x_t is a stationary I(0) process
– In this case y_t and x_t are said to be cointegrated
– Cointegration implies that y_t and x_t share similar stochastic trends and, since the difference e_t is stationary, they never diverge too far from each other

The test for stationarity of the residuals is based on the test equation:
Δê_t = γ ê_{t-1} + v_t (Eq. 12.7)
– The regression has no constant term because the mean of the regression residuals is zero
– We are basing this test upon estimated values of the residuals

Table 12.4: Critical Values for the Cointegration Test

There are three sets of critical values
– Which set we use depends on whether the residuals ê_t are derived from:
an equation with no constant term (Eq. 12.8a)
an equation with a constant term (Eq. 12.8b)
an equation with a constant and a trend (Eq. 12.8c)

An Example of a Cointegration Test
Consider the estimated least squares regression relating the two interest rate series; the unit root test for stationarity is then applied to its estimated residuals (estimates not reproduced here)

The null and alternative hypotheses in the test for cointegration are:
– H_0: the series are not cointegrated (the residuals are nonstationary)
– H_1: the series are cointegrated (the residuals are stationary)
Similar to the one-tail unit root tests, we reject the null hypothesis of no cointegration if τ ≤ τ_c, and we do not reject it if τ > τ_c

Chapter 9: Regression with Time Series Data: Stationary Variables
Walter R. Paczkowski, Rutgers University

Chapter 9 Contents
9.1 Introduction
9.2 Finite Distributed Lags
9.3 Serial Correlation
9.4 Other Tests for Serially Correlated Errors
9.5 Estimation with Serially Correlated Errors
9.6 Autoregressive Distributed Lag Models
9.7 Forecasting
9.8 Multiplier Analysis

9.1 Introduction

When modeling relationships between variables, the nature of the data that have been collected has an important bearing on the appropriate choice of an econometric model
– Two features of time-series data to consider:
1. Time-series observations on a given economic unit, observed over a number of time periods, are likely to be correlated
2. Time-series data have a natural ordering according to time

There is also the possible existence of dynamic relationships between variables
– A dynamic relationship is one in which the change in a variable now has an impact on that same variable, or on other variables, in one or more future time periods
– These effects do not occur instantaneously but are spread, or distributed, over future time periods

FIGURE 9.1: The distributed lag effect

Dynamic Nature of Relationships
Ways to model the dynamic relationship:
1. Specify that a dependent variable y is a function of current and past values of an explanatory variable x:
y_t = f(x_t, x_{t-1}, x_{t-2}, …) (Eq. 9.1)
– Because of the existence of these lagged effects, Eq. 9.1 is called a distributed lag model

Ways to model the dynamic relationship (continued):
2. Capture the dynamic characteristics of a time series by specifying a model with a lagged dependent variable as one of the explanatory variables:
y_t = f(y_{t-1}) (Eq. 9.2)
or:
y_t = f(y_{t-1}, x_t, x_{t-1}, x_{t-2}) (Eq. 9.3)
– Such models are called autoregressive distributed lag (ARDL) models, with "autoregressive" meaning a regression of y_t on its own lag or lags
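Constructing the lagged variables for an ARDL-style model is mostly bookkeeping; a small pandas sketch (the numbers and column names are hypothetical):

```python
# Build lagged columns for an ARDL-style regression with pandas.
# One row is lost to each lag and dropped before estimation.
import pandas as pd

df = pd.DataFrame({"y": [3.0, 3.2, 3.1, 3.5, 3.6],
                   "x": [1.0, 1.1, 0.9, 1.2, 1.3]})
df["y_lag1"] = df["y"].shift(1)   # y_{t-1}
df["x_lag1"] = df["x"].shift(1)   # x_{t-1}
df = df.dropna()                  # drop the first row (no lag available)
print(df.shape)                   # (4, 4)
```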

Ways to model the dynamic relationship (continued):
3. Model the continuing impact of change over several periods via the error term:
y_t = f(x_t) + e_t, with e_t = f(e_{t-1}) (Eq. 9.4)
– In this case e_t is correlated with e_{t-1}; we say the errors are serially correlated or autocorrelated

Least Squares Assumptions
The primary assumption is Assumption MR4: cov(e_i, e_j) = 0 for i ≠ j; for time series, this is written as cov(e_t, e_s) = 0 for t ≠ s
– The dynamic models in Eqs. 9.2, 9.3 and 9.4 imply correlation between y_t and y_{t-1}, or between e_t and e_{t-1}, or both, so they clearly violate assumption MR4

9.2 Finite Distributed Lags

Consider a linear model in which, after q time periods, changes in x no longer have an impact on y:
y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + … + β_q x_{t-q} + e_t (Eq. 9.5)
– Note the notation change: β_s is used to denote the coefficient of x_{t-s}, and α is introduced to denote the intercept

Model 9.5 has two uses:
– Forecasting
– Policy analysis: what is the effect of a change in x on y?

9.3 Serial Correlation

When is assumption TSMR5, cov(e_t, e_s) = 0 for t ≠ s, likely to be violated, and how do we assess its validity?
– When a variable exhibits correlation over time, we say it is autocorrelated or serially correlated; these terms are used interchangeably

9.3.1a Computing Autocorrelation
More generally, the k-th order sample autocorrelation for a series y, which gives the correlation between observations k periods apart, is:
r_k = Σ_{t=k+1}^{T} (y_t - ȳ)(y_{t-k} - ȳ) / Σ_{t=1}^{T} (y_t - ȳ)² (Eq. 9.14)
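Eq. 9.14 translates directly into code; a minimal Python/NumPy sketch on illustrative simulated data (not the slides' series):

```python
# Sample autocorrelation at lag k, following Eq. 9.14:
#   r_k = sum_{t=k+1..T} (y_t - ybar)(y_{t-k} - ybar) / sum_t (y_t - ybar)^2
# Demonstrated on a simulated AR(1) series with coefficient 0.7.
import numpy as np

def sample_autocorr(y, k):
    dev = np.asarray(y, dtype=float) - np.mean(y)
    return (dev[k:] @ dev[:-k]) / (dev @ dev)

rng = np.random.default_rng(3)
e = rng.standard_normal(1000)
ar1 = np.empty(1000)
ar1[0] = e[0]
for t in range(1, 1000):
    ar1[t] = 0.7 * ar1[t - 1] + e[t]

print(sample_autocorr(ar1, 1))   # close to the true value 0.7
```

A rough large-sample significance check compares |r_k| with 2/√T.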

How do we test whether an autocorrelation is significantly different from zero?
– The null hypothesis is H_0: ρ_k = 0
– A suitable test statistic is Z = √T r_k, which is approximately standard normal in large samples (Eq. 9.17)

9.3.1b The Correlogram
The correlogram, also called the sample autocorrelation function, is the sequence of autocorrelations r_1, r_2, r_3, …
– It shows the correlation between observations that are one period apart, two periods apart, three periods apart, and so on

FIGURE 9.6: Correlogram for G

9.3.2a A Phillips Curve
To determine if the errors are serially correlated, we compute the least squares residuals:
ê_t = y_t - b_1 - b_2 x_t (Eq. 9.20)

The k-th order autocorrelation for the residuals is computed from Eq. 9.14 with ê_t in place of y_t
– The estimated least squares equation is Eq. 9.22 (estimates not reproduced here)

The residual autocorrelations at the first five lags are then examined (values not reproduced here)

9.4 Other Tests for Serially Correlated Errors

A Lagrange Multiplier Test
If e_t and e_{t-1} are correlated, then one way to model the relationship between them is to write:
e_t = ρ e_{t-1} + v_t (Eq. 9.23)
– We can substitute this into a simple regression equation:
y_t = β_1 + β_2 x_t + ρ e_{t-1} + v_t (Eq. 9.24)

To derive the relevant auxiliary regression for the autocorrelation LM test, we write the test equation as:
y_t = β_1 + β_2 x_t + ρ ê_{t-1} + v_t (Eq. 9.25)
– But since we know that y_t = b_1 + b_2 x_t + ê_t, we can substitute for y_t

Rearranging, we get:
ê_t = (β_1 - b_1) + (β_2 - b_2) x_t + ρ ê_{t-1} + v_t (Eq. 9.26)
– If H_0: ρ = 0 is true, then LM = T × R² has an approximate χ²(1) distribution, where T and R² are the sample size and goodness-of-fit statistic, respectively, from least squares estimation of Eq. 9.26

9.5 Estimation with Serially Correlated Errors

Three estimation procedures are considered:
1. Least squares estimation
2. An estimation procedure that is relevant when the errors are assumed to follow what is known as a first-order autoregressive model
3. A general estimation strategy for estimating models with serially correlated errors

We will also encounter models with a lagged dependent variable, such as the ARDL model:
y_t = δ + θ_1 y_{t-1} + δ_0 x_t + δ_1 x_{t-1} + v_t

ASSUMPTION FOR MODELS WITH A LAGGED DEPENDENT VARIABLE
TSMR2A: In the multiple regression model y_t = β_1 + β_2 x_{t2} + … + β_K x_{tK} + v_t, where some of the x_{tk} may be lagged values of y, v_t is uncorrelated with all x_{tk} and their past values

Least Squares Estimation
Suppose we proceed with least squares estimation without recognizing the existence of serially correlated errors. What are the consequences?
1. The least squares estimator is still a linear unbiased estimator, but it is no longer best
2. The formulas for the standard errors usually computed for the least squares estimator are no longer correct
– Confidence intervals and hypothesis tests that use these standard errors may be misleading

It is possible to compute correct standard errors for the least squares estimator: HAC (heteroskedasticity and autocorrelation consistent) standard errors, also known as Newey-West standard errors
– These are analogous to the heteroskedasticity-consistent standard errors

Consider the model y_t = β_1 + β_2 x_t + e_t
– The variance of b_2 is:
var(b_2) = [Σ w_t² var(e_t)] × [1 + 2 Σ_{t<s} w_t w_s cov(e_t, e_s) / Σ w_t² var(e_t)] (Eq. 9.27)
where w_t = (x_t - x̄) / Σ (x_t - x̄)²

When the errors are not correlated, cov(e_t, e_s) = 0 and the term in square brackets equals one
– The resulting expression is the one used to find heteroskedasticity-consistent (HC) standard errors
– When the errors are correlated, the term in square brackets is estimated to obtain HAC standard errors

If we call the quantity in square brackets g, and its estimate ĝ, then the relationship between the two estimated variances is:
var_HAC(b_2) = var_HC(b_2) × ĝ (Eq. 9.28)

9.5.2b Nonlinear Least Squares Estimation
Substituting, we get:
y_t = β_1(1 - ρ) + β_2 x_t + ρ y_{t-1} - ρβ_2 x_{t-1} + v_t (Eq. 9.43)

The coefficient of x_{t-1} equals -ρβ_2
– Although Eq. 9.43 is a linear function of the variables x_t, y_{t-1} and x_{t-1}, it is not a linear function of the parameters (β_1, β_2, ρ)
– The usual linear least squares formulas therefore do not apply; the values of (β_1, β_2, ρ) that minimize S_v are nonlinear least squares estimates
