12.3 Correcting for Serial Correlation w/ Strictly Exogenous Regressors The following autocorrelation correction requires all our regressors to be strictly exogenous -in particular, we should have no lagged explanatory variables Assume that our error terms follow AR(1) SERIAL CORRELATION : -assuming from here on in that everything is conditional on X, we can calculate variance as:

12.3 Correcting for Serial Correlation w/ Strictly Exogenous Regressors
If we consider a single explanatory variable, we can eliminate the correlation in the error term as follows (for t ≥ 2):
y_t - ρy_{t-1} = (1 - ρ)β_0 + β_1(x_t - ρx_{t-1}) + e_t
or, more compactly,
ytilde_t = (1 - ρ)β_0 + β_1 xtilde_t + e_t
This provides us with new error terms e_t that are uncorrelated
-Note that ytilde_t and xtilde_t are called QUASI-DIFFERENCED DATA

12.3 Correcting for Serial Correlation w/ Strictly Exogenous Regressors
Note that OLS is not BLUE yet, as quasi-differencing the first observation would require the unobserved y_0
-to make OLS BLUE and ensure the first observation's error has the same variance as the others, we set
ytilde_1 = (1 - ρ^2)^{1/2} y_1,  xtilde_1 = (1 - ρ^2)^{1/2} x_1
-note that our first observation's quasi-differenced data is calculated differently than all other observations
-note also that this is another example of GLS estimation
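The quasi-differencing transformation, including the special scaling of the first observation, can be sketched as follows. This is a minimal illustration for one variable with ρ treated as known; the function name is hypothetical:

```python
import numpy as np

def quasi_difference(y, x, rho):
    """Quasi-difference y and x for AR(1) errors with known rho.

    For t >= 2: ytilde_t = y_t - rho*y_{t-1} (same for x).
    The first observation is scaled by sqrt(1 - rho^2) (Prais-Winsten)
    so its transformed error has the same variance as the others.
    """
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    y_t = np.empty_like(y)
    x_t = np.empty_like(x)
    scale = np.sqrt(1.0 - rho**2)
    y_t[0] = scale * y[0]
    x_t[0] = scale * x[0]
    y_t[1:] = y[1:] - rho * y[:-1]
    x_t[1:] = x[1:] - rho * x[:-1]
    return y_t, x_t
```

Note that in the transformed regression the intercept column also changes: it becomes (1 - ρ) for t ≥ 2 and sqrt(1 - ρ²) for t = 1.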

12.3 Correcting for Serial Correlation w/ Strictly Exogenous Regressors
Given multiple explanatory variables, we have (for t ≥ 2):
ytilde_t = (1 - ρ)β_0 + β_1 xtilde_{t1} + … + β_k xtilde_{tk} + e_t
-note that this GLS estimator is BLUE and will generally differ from OLS
-note also that our t and F statistics are now valid, so testing can be done

12.3 Correcting for Serial Correlation w/ Strictly Exogenous Regressors
Unfortunately, ρ is rarely known, but it can be estimated by regressing the OLS residuals uhat_t on uhat_{t-1}:
ρhat = Σ_{t=2} uhat_t uhat_{t-1} / Σ_{t=2} uhat_{t-1}^2
We then use ρhat in place of ρ to estimate the quasi-differenced equation
Note that in this FEASIBLE GLS (FGLS) estimation, the estimation error in ρhat does not affect the FGLS estimator's asymptotic distribution

Feasible GLS Estimation of the AR(1) Model
1) Regress y on all x's to obtain residuals uhat_t
2) Regress uhat_t on uhat_{t-1} to obtain the OLS estimate ρhat
3) Use ρhat to estimate the quasi-differenced equation
We now have adjusted slope estimates with valid standard errors for testing
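The three steps above can be sketched with plain numpy. This is one pass of Cochrane-Orcutt style FGLS (the first observation is dropped); the function names are illustrative, not from any particular package:

```python
import numpy as np

def ols(X, y):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def cochrane_orcutt(y, X):
    """One pass of FGLS for AR(1) errors (Cochrane-Orcutt).

    X must include a constant column. Returns (beta, rho_hat).
    The first observation is dropped; Prais-Winsten would instead
    rescale it by sqrt(1 - rho^2).
    """
    y = np.asarray(y, float)
    X = np.asarray(X, float)
    # Step 1: OLS residuals
    u = y - X @ ols(X, y)
    # Step 2: regress uhat_t on uhat_{t-1} (no intercept) -> rho_hat
    rho = float(u[1:] @ u[:-1] / (u[:-1] @ u[:-1]))
    # Step 3: OLS on quasi-differenced data (constant column becomes 1 - rho,
    # so the coefficient on it is still beta_0)
    y_t = y[1:] - rho * y[:-1]
    X_t = X[1:] - rho * X[:-1]
    return ols(X_t, y_t), rho
```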

12.3 FGLS Notes
-Using ρhat is not valid for small sample sizes
-this is because FGLS is not unbiased; it is only consistent if the data are weakly dependent
-while FGLS is not BLUE, it is asymptotically more efficient than OLS (again, in large samples)
-two examples of FGLS are COCHRANE-ORCUTT (CO) ESTIMATION and PRAIS-WINSTEN (PW) ESTIMATION
-these estimators are similar and differ only in their treatment of the first observation

12.3 Iterated FGLS
-Practically, FGLS is often iterated:
-once FGLS is estimated, its residuals are used to recalculate ρhat, and FGLS is estimated again
-this is generally repeated until ρhat converges to a number
-regression programs can perform this iteration automatically
-theoretically, the first iteration already satisfies all large-sample properties needed for tests
-Note: regression programs can also correct for AR(q) errors using a more complicated FGLS
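The iteration loop can be sketched as follows: re-estimate ρhat from the latest FGLS residuals and re-run the quasi-differenced regression until ρhat stops moving. A minimal self-contained sketch (names and tolerances are illustrative):

```python
import numpy as np

def iterated_cochrane_orcutt(y, X, tol=1e-8, max_iter=100):
    """Iterated Cochrane-Orcutt FGLS for AR(1) errors.

    X must include a constant column. Each pass recomputes residuals
    in the original units, re-estimates rho, and re-runs OLS on the
    quasi-differenced data (first observation dropped).
    """
    y = np.asarray(y, float)
    X = np.asarray(X, float)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    rho, rho_old = 0.0, np.inf
    for _ in range(max_iter):
        u = y - X @ beta                       # residuals in original units
        rho = float(u[1:] @ u[:-1] / (u[:-1] @ u[:-1]))
        if abs(rho - rho_old) < tol:           # rho has converged
            break
        rho_old = rho
        y_t = y[1:] - rho * y[:-1]             # quasi-difference, drop obs 1
        X_t = X[1:] - rho * X[:-1]
        beta = np.linalg.lstsq(X_t, y_t, rcond=None)[0]
    return beta, rho
```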

12.3 FGLS vs. OLS
-In certain cases (such as in the presence of unit roots), FGLS can fail to obtain accurate estimates; its estimates can differ greatly from OLS
-When FGLS and OLS give similar estimates, FGLS is preferred if autocorrelation exists, as it is more efficient
-If FGLS and OLS estimates differ greatly, more complicated statistical estimation is needed

12.5 Serial Correlation-Robust Inference
-FGLS can fail for a variety of reasons:
-explanatory variables are not strictly exogenous
-sample size is too small
-the form of autocorrelation is unknown or more complicated than AR(1)
-in these cases, OLS standard errors can be corrected for arbitrary autocorrelation
-the estimates themselves are unchanged, so OLS remains inefficient (much like the het-robust correction of simple OLS)

12.5 Autocorrelation-Robust Inference
-To correct standard errors for arbitrary autocorrelation, choose an integer g > 0 (generally g = 1 or 2 for annual data; larger for data observed more frequently than annually) and compute:
vhat = Σ_{t=1}^{n} ahat_t^2 + 2 Σ_{h=1}^{g} [1 - h/(g+1)] Σ_{t=h+1}^{n} ahat_t ahat_{t-h}
where ahat_t = rhat_t uhat_t, rhat_t is the residual from regressing x_1 on all other x's, and uhat_t is the residual from the usual OLS estimation

12.5 Autocorrelation-Robust Inference
-After obtaining vhat, the standard error of B_1hat is adjusted using:
se(B_1hat) = [se_OLS(B_1hat)/σhat]^2 (vhat)^{1/2}
-note that this transformation can be applied to every variable (as any can be listed as x_1)
-these standard errors are also robust to arbitrary heteroskedasticity
-this transformation is done using the OLS subcommand /autcov=1 in SHAZAM, but can also be done step by step:

Serial Correlation-Robust Standard Error for B_1hat
1) Regress y on all x's to obtain residuals uhat_t, OLS standard errors, and σhat
2) Regress x_1 on all other x's to obtain residuals rhat_t
3) Use these residuals to compute vhat as seen previously
4) Using vhat, obtain the new standard error through:
se(B_1hat) = [se_OLS(B_1hat)/σhat]^2 (vhat)^{1/2}
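The four steps above can be sketched with numpy. This is a minimal illustration (the function name is hypothetical, and it assumes X has the constant in column 0 and x_1 in column 1):

```python
import numpy as np

def sc_robust_se_beta1(y, X, g):
    """Serial-correlation-robust standard error for the coefficient on
    X[:, 1], using Bartlett kernel weights 1 - h/(g+1) up to lag g.
    Minimal sketch; X = [const, x1, other x's]."""
    y = np.asarray(y, float)
    X = np.asarray(X, float)
    n, k = X.shape
    # Step 1: OLS of y on all x's -> uhat, sigma_hat, se_OLS(B_1hat)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    u = y - X @ beta
    sigma2 = u @ u / (n - k)
    XtX_inv = np.linalg.inv(X.T @ X)
    se_ols = np.sqrt(sigma2 * XtX_inv[1, 1])
    # Step 2: regress x1 on all other regressors -> residuals rhat
    others = np.delete(X, 1, axis=1)
    r = X[:, 1] - others @ np.linalg.lstsq(others, X[:, 1], rcond=None)[0]
    # Step 3: vhat = sum(a_t^2) + 2*sum_h w_h * sum_t a_t*a_{t-h}
    a = r * u
    v = a @ a
    for h in range(1, g + 1):
        v += 2.0 * (1.0 - h / (g + 1.0)) * (a[h:] @ a[:-h])
    # Step 4: rescale the OLS standard error
    return (se_ols / np.sqrt(sigma2))**2 * np.sqrt(v)
```

With g = 0 this collapses to the usual heteroskedasticity-robust (White) standard error for B_1hat, which is a useful sanity check.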

12.5 SC-Robust Notes
-Note that these serial correlation (SC) robust standard errors are poorly behaved in small samples (even samples as large as 100)
-note that g must be chosen, making this correction less than automatic
-if serial correlation is severe, this correction leaves OLS very inefficient, especially in small samples
-use this correction only if forced to (some variables not strictly exogenous, or lagged dependent variables present)
-the correction is like a hand grenade, not a sniper rifle

12.6 Het in Time Series
-As in cross-sectional studies, heteroskedasticity in time series studies doesn't cause bias or inconsistency
-it does invalidate standard errors and tests
-while robust corrections for autocorrelation may also correct for Het, the opposite is NOT true:
-Heteroskedasticity-Robust Statistics do NOT correct for autocorrelation
-note also that autocorrelation is often more damaging to a model than Het (depending on the amount of autocorrelation (ρ) and the amount of Het)

12.6 Testing and Fixing Het in Time Series
In order to test for Het:
1) Serial correlation must be tested for and corrected first
2) Dynamic heteroskedasticity (see next section) must not exist
Fixing Het is the same as in the cross-sectional case:
1) WLS is BLUE if correctly specified
2) FGLS is asymptotically valid in large samples
3) Het-robust corrections are better than nothing (they don't correct the estimates, only the s.e.'s)

12.6 Dynamic Het
-Time series adds the complication that the variance of the error term may depend on explanatory variables of other periods (and thus errors of other periods)
-Engle (1982) suggested the AUTOREGRESSIVE CONDITIONAL HETEROSKEDASTICITY (ARCH) model. A first-order ARCH (ARCH(1)) model looks like:
E(u_t^2 | u_{t-1}, u_{t-2}, …) = α_0 + α_1 u_{t-1}^2

12.6 ARCH
-The ARCH(1) model can be rewritten as:
u_t^2 = α_0 + α_1 u_{t-1}^2 + v_t
-which is similar to the autoregressive model and has the similar stability condition that α_1 < 1
-While ARCH does not make OLS biased or inconsistent, if it exists, WLS or maximum likelihood (ML) estimation is asymptotically more efficient (better estimates)
-note that the usual het-robust standard errors and test statistics are still valid under ARCH
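Because the rewritten ARCH(1) model is just a regression of squared residuals on their lag, a simple check for ARCH(1) is to run that regression and look at the slope (and the LM statistic n·R², which is asymptotically chi-squared with 1 df under no ARCH). A minimal sketch; the function name is illustrative:

```python
import numpy as np

def arch1_test(u):
    """Regress uhat_t^2 on uhat_{t-1}^2.

    Returns (alpha1_hat, LM) where LM = n * R^2 from this regression.
    Large LM values are evidence of ARCH(1) effects.
    """
    u = np.asarray(u, float)
    y = u[1:]**2
    X = np.column_stack([np.ones(len(y)), u[:-1]**2])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b
    r2 = 1.0 - resid @ resid / np.sum((y - y.mean())**2)
    return b[1], len(y) * r2
```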

12.6 Het and Autocorrelation…end of the world?
-typically, serial correlation is a more serious issue than Het, as it affects standard errors and estimation efficiency more
-however, a low ρ value may make Het the more serious problem
-we've already seen that het-robust autocorrelation tests are straightforward

12.6 Het and Autocorrelation…is there hope?
If Het and autocorrelation are found, one can:
1) Fix autocorrelation using the CO or PW method (AUTO command in SHAZAM)
2) Apply heteroskedasticity-robust standard errors to the regression (not possible through a simple SHAZAM command)
As a last resort, SC-robust standard errors are also heteroskedasticity-robust

12.6 Het and Autocorrelation…is there hope?
Alternately, Het can be corrected through a combined WLS AR(1) procedure. Given Var(u_t | x_t) = σ^2 h_t, divide the equation through by h_t^{1/2}:
y_t/h_t^{1/2} = β_0/h_t^{1/2} + β_1 x_{t1}/h_t^{1/2} + … + β_k x_{tk}/h_t^{1/2} + u_t/h_t^{1/2}
-Since u_t/h_t^{1/2} is homoskedastic, the above equation can be estimated using CO or PW

FGLS with Heteroskedasticity and AR(1) Serial Correlation:
1) Regress y on all x's to obtain residuals uhat_t
2) Regress log(uhat_t^2) on all x_t's (or on yhat_t and yhat_t^2) and obtain fitted values ghat_t
3) Estimate h_t: hhat_t = exp(ghat_t)
4) Estimate the weighted equation
y_t/hhat_t^{1/2} = β_0/hhat_t^{1/2} + β_1 x_{t1}/hhat_t^{1/2} + … + β_k x_{tk}/hhat_t^{1/2} + u_t/hhat_t^{1/2}
by Cochrane-Orcutt (CO) or Prais-Winsten (PW) methods. (This corrects for serial correlation.)
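The four steps above can be sketched in numpy, using Cochrane-Orcutt (first observation dropped) in the final step. A minimal illustration under the assumptions stated; the function name is hypothetical:

```python
import numpy as np

def fgls_het_ar1(y, X):
    """FGLS for heteroskedasticity plus AR(1) errors.

    X must include a constant column. Steps: (1) OLS residuals,
    (2)-(3) estimate h_t via a regression of log(uhat^2) on the x's,
    (4) weight by 1/sqrt(hhat_t), then Cochrane-Orcutt on the
    weighted equation. Returns (beta, rho_hat).
    """
    y = np.asarray(y, float)
    X = np.asarray(X, float)
    ols = lambda A, b: np.linalg.lstsq(A, b, rcond=None)[0]
    # Step 1: OLS residuals
    u = y - X @ ols(X, y)
    # Steps 2-3: fitted values of log(uhat^2), then hhat_t = exp(ghat_t)
    g = X @ ols(X, np.log(u**2))
    h = np.exp(g)
    # Step 4a: weight the data so the error is homoskedastic
    w = np.sqrt(h)
    yw = y / w
    Xw = X / w[:, None]
    # Step 4b: Cochrane-Orcutt on the weighted equation
    uw = yw - Xw @ ols(Xw, yw)
    rho = float(uw[1:] @ uw[:-1] / (uw[:-1] @ uw[:-1]))
    beta = ols(Xw[1:] - rho * Xw[:-1], yw[1:] - rho * yw[:-1])
    return beta, rho
```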