7-1 MGMG 522 : Session #7 Serial Correlation (Ch. 9)

7-2 Serial Correlation (a.k.a. Autocorrelation)
• Autocorrelation is a violation of Classical Assumption #4 (the error terms are uncorrelated with each other).
• Autocorrelation can be categorized into 2 kinds:
– Pure autocorrelation (autocorrelation that exists in a correctly specified regression model).
– Impure autocorrelation (autocorrelation caused by specification errors: omitted variables or an incorrect functional form).
• Autocorrelation mostly happens in data sets where the order of observations has some meaning (e.g., time-series data).

7-3 Pure Autocorrelation
• Classical Assumption #4 implies that there is no correlation between any two observations of the error term, or E(r_ij) = 0 for i ≠ j.
• The most common kind of autocorrelation is first-order autocorrelation, where the current observation of the error term is correlated with the previous observation of the error term.
• Mathematically: ε_t = ρ·ε_t-1 + u_t, where ε = error term, ρ = simple correlation coefficient (-1 < ρ < +1), and u = classical (non-autocorrelated) error term.
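The first-order process ε_t = ρ·ε_t-1 + u_t above is easy to see in simulation. The sketch below (not from the textbook; ρ = 0.7 and the sample size are hypothetical values chosen for illustration) generates an AR(1) error series and checks that the sample correlation between consecutive errors comes out near ρ:

```python
import numpy as np

# Simulate a first-order autocorrelated error process:
#   eps_t = rho * eps_{t-1} + u_t,  with u_t a classical (white-noise) error.
# rho = 0.7 is a hypothetical value chosen for illustration.
rng = np.random.default_rng(0)
T, rho = 500, 0.7
u = rng.standard_normal(T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + u[t]

# The sample correlation between eps_t and eps_{t-1} should land near rho.
r = np.corrcoef(eps[1:], eps[:-1])[0, 1]
print(round(r, 2))
```

With positive ρ, a positive error tends to be followed by a positive error, exactly as slide 7-4 describes.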

7-4 ρ (pronounced "rho")
• -1 < ρ < +1
• ρ < 0 indicates negative autocorrelation (the signs of the error term switch back and forth).
• ρ > 0 indicates positive autocorrelation (a positive error term tends to be followed by a positive error term, and a negative error term by a negative error term).
• Positive autocorrelation is more common than negative autocorrelation.

7-5 Higher-Order Autocorrelation
Examples:
1. Seasonal autocorrelation: ε_t = ρ·ε_t-4 + u_t
2. Second-order autocorrelation: ε_t = ρ1·ε_t-1 + ρ2·ε_t-2 + u_t

7-6 Impure Autocorrelation
• Caused by specification errors: omitted variables or an incorrect functional form.
• Specification errors should be corrected first, by investigating the independent variables and/or the functional form.
• How can omitted variables or an incorrect functional form cause autocorrelation?
• Remember that the error term is the sum of the effects of:
1. Omitted variables
2. Nonlinearities
3. Measurement errors
4. Pure error

7-7 Example: Omitted Variable Causes Autocorrelation
• Correct model: Y = β0 + β1·X1 + β2·X2 + ε
• If X2 is omitted: Y = β0 + β1·X1 + ε*, where ε* = β2·X2 + ε
• If ε is small compared to β2·X2, and X2 is serially correlated (very likely in a time series), ε* will be autocorrelated.
• The estimate of β1 will be biased (because X2 is omitted).
• Both the bias and the impure autocorrelation will disappear once the model is corrected.
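This mechanism can be demonstrated numerically. In the sketch below (simulated data; all coefficients and the random-walk X2 are hypothetical choices, not from the textbook), omitting a serially correlated X2 pushes β2·X2 into the residuals, which then inherit its autocorrelation:

```python
import numpy as np

# True model: Y = 1 + 2*X1 + 1.5*X2 + eps, with X2 serially correlated.
# Omitting X2 makes the error term eps* = 1.5*X2 + eps autocorrelated.
rng = np.random.default_rng(4)
T = 300
x1 = rng.standard_normal(T)
x2 = rng.standard_normal(T).cumsum()       # a highly persistent regressor
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.standard_normal(T)

X_bad = np.column_stack([np.ones(T), x1])  # X2 wrongly omitted
b, *_ = np.linalg.lstsq(X_bad, y, rcond=None)
e = y - X_bad @ b

# Durbin-Watson statistic of the residuals: far below 2 signals
# strong positive (here impure) autocorrelation.
dw = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
print(round(dw, 2))
```

Re-running the regression with X2 included brings the Durbin-Watson statistic back toward 2, matching the slide's point that impure autocorrelation disappears once the model is corrected.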

7-8 Example: Incorrect Functional Form Causes Autocorrelation
• Correct model: Y = β0 + β1·X1 + β2·X1² + ε
• Our model: Y = β0 + β1·X1 + ε*, where ε* = β2·X1² + ε
• Autocorrelation could result. (See Figure 9.5 on p. 323.)

7-9 Consequences of Autocorrelation
1. Pure autocorrelation does not cause bias in the coefficient estimates.
2. Pure autocorrelation increases the variances of the coefficient estimates.
3. Pure autocorrelation causes OLS to underestimate the standard errors of the coefficients. (Hence, pure autocorrelation overestimates the t-values.)

7-10 Example of the Consequences
1. With no autocorrelation: b1 = …, SE(b1) = …, t-value = …
2. With autocorrelation but a correct SE of the coefficient: b1 = …, SE*(b1) = …, t-value = …
3. With autocorrelation and an OLS-underestimated SE of the coefficient: b1 = …, SE(b1) = …, t-value = 2.66

7-11 Detection of Autocorrelation
• Use the Durbin-Watson d test.
• The Durbin-Watson d test is only appropriate when:
– the regression model includes an intercept term,
– the autocorrelation is of first order, and
– the regression model does not include a lagged dependent variable (e.g., Y_t-1) as an independent variable.
• The Durbin-Watson d statistic for T observations is:
d = Σ_{t=2..T} (e_t - e_t-1)² / Σ_{t=1..T} e_t²

7-12 The d Statistic
• d = 0 indicates extreme positive autocorrelation (e_t = e_t-1).
• d = 4 indicates extreme negative autocorrelation (e_t = -e_t-1).
• d = 2 indicates no autocorrelation: expanding the numerator, Σ(e_t - e_t-1)² = Σ(e_t² - 2·e_t·e_t-1 + e_t-1²) ≈ Σ(e_t² + e_t-1²) when the cross-product term is approximately zero, so d ≈ 2.
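The d statistic is simple enough to compute directly from residuals. The sketch below (simulated series with hypothetical seeds, not textbook data) shows d near 2 for uncorrelated residuals and near 0 for strongly positively correlated ones:

```python
import numpy as np

def durbin_watson(e):
    """Durbin-Watson d = sum over t of (e_t - e_{t-1})^2 / sum of e_t^2."""
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(1)
white = rng.standard_normal(1000)            # uncorrelated "residuals"
trend = np.cumsum(rng.standard_normal(1000)) # highly persistent series

print(round(durbin_watson(white), 1))  # should be close to 2 (no autocorrelation)
print(round(durbin_watson(trend), 1))  # should be close to 0 (extreme positive)
```

In practice d is computed on the OLS residuals and compared against the dL and dU critical values used in the decision rules on the next slides.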

7-13 The Use of the d Test
1. Econometricians almost never run a one-sided test for negative autocorrelation. Most tests are either for positive autocorrelation (one-sided) or for autocorrelation of either sign (two-sided).
2. The d test is sometimes inconclusive.

7-14 Example: One-Sided d Test for Positive Autocorrelation
H0: ρ ≤ 0 (no positive autocorrelation)
H1: ρ > 0 (positive autocorrelation)
• Decision rule:
If d < dL → Reject H0
If d > dU → Do not reject H0
If dL ≤ d ≤ dU → Inconclusive

7-15 Example: Two-Sided d Test for Autocorrelation
H0: ρ = 0 (no autocorrelation)
H1: ρ ≠ 0 (autocorrelation)
• Decision rule:
If d < dL → Reject H0
If d > 4 - dL → Reject H0
If dU < d < 4 - dU → Do not reject H0
Otherwise → Inconclusive

7-16 Correcting Autocorrelation
• Use Generalized Least Squares (GLS) to restore the minimum-variance property of the estimation.
• For first-order autocorrelation, GLS quasi-differences the model: starting from Y_t = β0 + β1·X_t + ε_t with ε_t = ρ·ε_t-1 + u_t (Eq.1), subtract ρ times the lagged equation to obtain Y_t - ρ·Y_t-1 = β0·(1 - ρ) + β1·(X_t - ρ·X_t-1) + u_t (Eq.4), whose error term u_t is a classical error term.

7-17 GLS Properties
1. Now the error term is not autocorrelated, so OLS estimation of Eq.4 will be minimum variance.
2. The slope coefficient β1 of Eq.1 is the same as that of Eq.4, and has the same meaning.
3. The Adj-R² values from Eq.1 and Eq.4 should not be compared, because the dependent variables are not the same in the two models.

7-18 GLS Methods
1. Use the Cochrane-Orcutt method. (EViews does not support this estimation method; details on p. ….)
2. Use the AR(1) method. (In EViews, insert the term AR(1) after the list of independent variables.)
3. When the d test is inconclusive, GLS should not be used.
4. Even when the d test is conclusive, GLS should not be used if:
– the autocorrelation is impure, or
– the consequence of the autocorrelation is minor.
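A minimal one-iteration Cochrane-Orcutt sketch, on simulated data, shows the quasi-differencing idea in practice (all parameter values here are hypothetical; the full method iterates until the estimate of ρ converges, and EViews' AR(1) estimator is a related but distinct approach):

```python
import numpy as np

# True model: y = 2 + 0.5*x + eps, with AR(1) errors eps_t = 0.8*eps_{t-1} + u_t.
rng = np.random.default_rng(2)
T = 300
x = rng.standard_normal(T).cumsum()
u = rng.standard_normal(T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.8 * eps[t - 1] + u[t]
y = 2.0 + 0.5 * x + eps

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Step 1: OLS on the original model; keep the residuals.
X = np.column_stack([np.ones(T), x])
b = ols(X, y)
e = y - X @ b

# Step 2: estimate rho by regressing e_t on e_{t-1} (no intercept).
rho_hat = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])

# Step 3: quasi-difference and re-estimate by OLS (the GLS step):
#   y_t - rho*y_{t-1} = b0*(1 - rho) + b1*(x_t - rho*x_{t-1}) + u_t
ys = y[1:] - rho_hat * y[:-1]
xs = x[1:] - rho_hat * x[:-1]
Xs = np.column_stack([np.ones(T - 1) * (1 - rho_hat), xs])
b_gls = ols(Xs, ys)
print(b_gls)  # the slope estimate keeps the same meaning as in the original model
```

As slide 7-17 notes, the slope coefficient in the transformed equation estimates the same β1 as the original equation.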

7-19 Autocorrelation-Corrected Standard Errors
• The idea of this remedy goes like this:
• Pure autocorrelation introduces no bias in the coefficient estimates.
• But the standard errors of the coefficient estimates are larger with autocorrelation than without it.
• Therefore, why not fix the standard errors and leave the coefficient estimates alone?
• This approach is analogous to HCCM (the heteroskedasticity-consistent covariance matrix) remedy for heteroskedasticity.

7-20 Autocorrelation-Corrected Standard Errors
• In EViews, choose "LS", click "Options", then select "Heteroskedasticity-Consistent Coefficient Covariance" and select "Newey-West".
• The resulting standard errors of the coefficient estimates will be larger than those from plain OLS.
• Newey-West standard errors are also known as HAC standard errors, because they correct for both Heteroskedasticity and AutoCorrelation.
• This method works best in large samples. In fact, when the sample size is large, you can always use HAC standard errors.
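The Newey-West correction can be sketched from scratch to make the mechanics concrete. The code below (simulated data, hypothetical parameter values and lag length; EViews performs the same kind of sandwich correction internally) compares conventional OLS standard errors with HAC standard errors built from Bartlett-weighted autocovariances of the score terms x_t·e_t:

```python
import numpy as np

# Simulate a regression with AR(1) errors (rho = 0.8, hypothetical).
rng = np.random.default_rng(3)
T = 400
x = rng.standard_normal(T).cumsum()
u = rng.standard_normal(T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.8 * eps[t - 1] + u[t]
y = 1.0 + 0.5 * x + eps

X = np.column_stack([np.ones(T), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b

# Conventional OLS standard errors (assume no autocorrelation).
s2 = e @ e / (T - X.shape[1])
se_ols = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))

# Newey-West: Bartlett-weighted sum of autocovariances of x_t * e_t.
L = 8                                   # truncation lag (a hypothetical choice)
Xe = X * e[:, None]                     # score contributions
S = Xe.T @ Xe / T                       # lag-0 term
for l in range(1, L + 1):
    w = 1 - l / (L + 1)                 # Bartlett kernel weight
    G = Xe[l:].T @ Xe[:-l] / T
    S += w * (G + G.T)
XtX_inv = np.linalg.inv(X.T @ X / T)
se_hac = np.sqrt(np.diag(XtX_inv @ S @ XtX_inv / T))

print(se_ols[1], se_hac[1])
```

With positive autocorrelation in both the errors and the regressor, the HAC slope standard error comes out larger than the conventional one, which is exactly the correction the slide describes.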