1 MF-852 Financial Econometrics
Lecture 10: Serial Correlation and Heteroscedasticity
Roy J. Epstein, Fall 2003

2 Topics
- Serial correlation: what is it? Its effect on hypothesis tests. Testing and correcting for serial correlation.
- Heteroscedasticity: ditto.
- ARCH (or how to win a Nobel prize).

3 Serial Correlation
- The error terms in the regression should be independent, i.e., E(e_i e_j) = 0 for any i ≠ j.
- If this assumption is not true, the errors are serially correlated.
- Only a problem for time-series data.

4 Serial Correlation: Possible Causes
- Omitted variables.
- Wrong functional form.
- "Inertia" in economic data: the error term is composed of many small effects, each with a similar trend.

5 Correlated Error Terms
- Suppose E(e_i e_{i-1}) ≠ 0. This implies neighboring observations are correlated, not independent. A 1st-order process: the most common form of serial correlation.
- Suppose E(e_i e_{i-4}) ≠ 0. A 4th-order process: often occurs with quarterly data.

6 Graph of Residuals from a Regression

7 Importance of Serial Correlation
- Regression coefficients (the marginal effects) are unbiased.
- BUT their standard errors are biased. The bias generally understates the standard errors, so significance tests are biased against H_0: H_0 is rejected too often.

8 Bias in Standard Errors
- Standard errors for the coefficients depend on the estimated variance of the error term, s_e².
- The regression program assumes independent errors with mean 0, so it calculates s_e² = Σ e_t² / (n − k), where k is the number of estimated coefficients.
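As a minimal numpy sketch of that calculation (standard OLS formulas; the function name and setup are illustrative, not from the lecture):

```python
import numpy as np

def naive_ols_se(X, y):
    """Coefficient standard errors as a regression program computes them,
    assuming independent, mean-zero errors."""
    n, k = X.shape
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ beta
    s2_e = e @ e / (n - k)                    # estimated error variance
    cov_beta = s2_e * np.linalg.inv(X.T @ X)  # ignores error covariances
    return beta, np.sqrt(np.diag(cov_beta))
```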

9 Why the Standard Errors are Biased
- The calculation ignores the covariances when the errors are NOT independent.
- The covariance between errors, when it exists, is usually positive.
- So s_e² is understated and the standard errors are biased downward.
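A small simulation sketch of the consequence, under assumed settings (a trending regressor, AR(1) errors with ρ = 0.8, and a true slope of zero, none of which come from the lecture): the naive 5% t-test rejects H_0 far more often than 5%.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, rho = 100, 2000, 0.8
rejections = 0
for _ in range(reps):
    x = np.cumsum(rng.normal(size=n))       # regressor with "inertia"
    u = rng.normal(size=n)
    e = np.zeros(n)
    for t in range(1, n):                   # positively autocorrelated errors
        e[t] = rho * e[t - 1] + u[t]
    y = 1.0 + e                             # true beta_0 = 1, beta_1 = 0
    X = np.column_stack([np.ones(n), x])
    b = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ b
    s2 = resid @ resid / (n - 2)            # naive variance estimate
    se1 = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    rejections += abs(b[1] / se1) > 1.96    # nominal 5% two-sided t-test
print(f"rejection rate: {rejections / reps:.2f} (nominal 0.05)")
```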

10 Testing for Serial Correlation
- The most common test is the Durbin-Watson statistic.
- Only used for 1st-order serial correlation.
- Calculated as DW = Σ_{t=2}^{n} (e_t − e_{t−1})² / Σ_{t=1}^{n} e_t².

11 Durbin-Watson Statistic
- When the covariance between neighboring observations is zero, DW should be close to 2.
- High positive covariance drives DW toward 0.
- H_0 of no 1st-order serial correlation: DW = 2.
- Look up critical values in a table (RR, p. 592).
- See the sample regression in the xls file.
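A direct numpy translation of the statistic (a sketch; statsmodels, if available, ships an equivalent durbin_watson function in statsmodels.stats.stattools):

```python
import numpy as np

def durbin_watson(e):
    """Durbin-Watson statistic: sum of squared first differences of the
    residuals over their sum of squares. A value near 2 is consistent
    with no 1st-order serial correlation."""
    e = np.asarray(e)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
```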

12 Model with Serial Correlation
- Y_t = β_0 + β_1 X_t + e_t.
- Suppose e_t = ρ e_{t−1} + u_t, where u_t is another error with mean 0 that is serially independent and uncorrelated with e and X.
- −1 < ρ < 1 (otherwise the process is explosive).
- u_t is called the innovation in e because it is the new component of e each period.
- Serially correlated: E(e_t e_{t−1}) = ρ var(e_t).
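A short simulation sketch of this error process (ρ = 0.7 and the sample size are arbitrary choices) that checks the stated covariance numerically:

```python
import numpy as np

rng = np.random.default_rng(1)
rho, n = 0.7, 50_000
u = rng.normal(size=n)              # serially independent innovation
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + u[t]    # 1st-order serially correlated error
print(np.mean(e[1:] * e[:-1]))      # sample E(e_t e_{t-1})
print(rho * np.var(e))              # should be approximately equal
```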

13 How to Find ρ
- Estimate it as ρ̂ = 1 − DW/2. We can do this in Excel.
- Fancier procedures: Cochrane-Orcutt, Hildreth-Lu, and others. A good regression program will calculate ρ automatically.

14 Fixing Serial Correlation
- Suppose ρ is known. Then "difference" the model:
  Y_t − ρY_{t−1} = β_0(1 − ρ) + β_1(X_t − ρX_{t−1}) + (e_t − ρe_{t−1})
  or
  Y_t − ρY_{t−1} = β_0(1 − ρ) + β_1(X_t − ρX_{t−1}) + u_t.
- u_t is a "well behaved" error.
- The differenced model yields unbiased coefficients and unbiased standard errors. See example.
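A sketch of the quasi-differencing fix, assuming ρ has already been estimated (e.g., as 1 − DW/2 from slide 13); this is essentially one step of the Cochrane-Orcutt procedure. The function name is illustrative.

```python
import numpy as np

def quasi_difference_ols(y, x, rho):
    """Estimate Y_t - rho*Y_{t-1} = b0*(1 - rho) + b1*(X_t - rho*X_{t-1}) + u_t
    by OLS and return (b0, b1) on the original scale."""
    y_star = y[1:] - rho * y[:-1]             # one observation is lost
    x_star = x[1:] - rho * x[:-1]
    X = np.column_stack([np.ones(len(x_star)), x_star])
    a, b1 = np.linalg.solve(X.T @ X, X.T @ y_star)
    return a / (1 - rho), b1                  # intercept estimates b0*(1 - rho)
```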

15 Heteroscedasticity
- Strange name! Greek for "different variances."
- A violation of the last assumption about the residual: the same variance for each error term.
- Can occur with any kind of data.

16 Heteroscedasticity: Possible Causes
- Wrong functional form.
- Var(e) correlated with an included X variable on the right side of the regression: cov(var(e), X) ≠ 0, NOT cov(e, X) ≠ 0.

17 Heteroscedasticity: Importance
- Regression coefficients (the marginal effects) are unbiased.
- BUT their standard errors are biased, and the direction of the bias is not usually known.
- Confidence levels, p-values, and t statistics are not reliable.

18 Model with Heteroscedasticity
- Y_t = β_0 + β_1 X_t + e_t.
- Suppose var(e_t) = σ²X_t². Then var(e) is different for each observation.

19 Fixing Heteroscedasticity: Weighted Least Squares
- Observations with smaller error variance are "better." Give them more weight when estimating the model.
- Weighted Least Squares (WLS): multiply observations by weighting factors that equalize the variance:
  (1/X_t)Y_t = (1/X_t)β_0 + (1/X_t)β_1 X_t + (1/X_t)e_t
- Var((1/X_t)e_t) = (1/X_t²)σ²X_t² = σ².

20 Calculating WLS
- Suppose the form of the heteroscedasticity is known, e.g., we need to weight by 1/X_t. You just need to create new variables:
  (1/X_t)Y_t = (1/X_t)β_0 + β_1 + u_t
- The intercept in WLS is β_1; the slope on 1/X_t is β_0.
- A "well behaved" error term yields unbiased coefficients and unbiased standard errors.
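A numpy sketch of these new variables for this particular variance form, var(e_t) = σ²X_t² (assumes all X_t > 0; the function name is illustrative):

```python
import numpy as np

def wls_weight_by_x(y, x):
    """WLS when var(e_t) = sigma^2 * x_t^2: divide the model through by x_t.
    In the transformed regression the intercept is beta_1 and the slope
    on 1/x_t is beta_0; both are returned on the original scale."""
    y_star = y / x
    X = np.column_stack([1.0 / x, np.ones(len(x))])  # columns: [1/x_t, 1]
    b0, b1 = np.linalg.solve(X.T @ X, X.T @ y_star)
    return b0, b1                                    # original beta_0, beta_1
```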

21 ARCH Models
- AutoRegressive Conditional Heteroscedasticity (ARCH) model.
- A regression model with serial correlation ("autoregressive") AND heteroscedasticity.
- Used to model the volatility, i.e., variance, of returns.

22 ARCH Models
- Sometimes you want to model volatility itself (e.g., it is an input to an option pricing model).
- Volatility can change over time, with periods of high and low volatility.
- ARCH describes this process.

23 Formulation of the ARCH Model
- Y_t = β_0 + β_1 X_t + e_t.
- Var(e_t) = α_0 + α_1 e_{t−1}². A 1st-order ARCH process.
- Can estimate the β's and α's and perform WLS.
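A simulation sketch of an ARCH(1) error (the α values are arbitrary assumptions), followed by a crude two-step estimate of the α's obtained by regressing the squared error on its own lag; production tools such as the Python arch package instead fit these models by maximum likelihood.

```python
import numpy as np

rng = np.random.default_rng(2)
a0, a1, n = 0.2, 0.6, 5000
e = np.zeros(n)
for t in range(1, n):
    sigma2 = a0 + a1 * e[t - 1] ** 2       # conditional variance, ARCH(1)
    e[t] = np.sqrt(sigma2) * rng.normal()  # volatility clusters over time

# Two-step estimate: since E(e_t^2 | e_{t-1}) = a0 + a1*e_{t-1}^2,
# an OLS regression of e_t^2 on e_{t-1}^2 recovers the alphas.
Z = np.column_stack([np.ones(n - 1), e[:-1] ** 2])
alpha = np.linalg.solve(Z.T @ Z, Z.T @ (e[1:] ** 2))
print(alpha)                               # approximately [a0, a1]
```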