1 MF-852 Financial Econometrics Lecture 8 Introduction to Multiple Regression Roy J. Epstein Fall 2003.


2 Topics
Formulation and Estimation of a Multiple Regression
Interpretation of the Regression Coefficients
Omitted Variables
Collinearity
Advanced Hypothesis Testing

3 Multiple Regression
Used when 2 or more independent variables explain the dependent variable:
Yi = β0 + β1X1i + β2X2i + … + βkXki + ei
or, in matrix form, Yi = Xiβ + ei
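As a minimal sketch of this estimation (not from the lecture), the coefficients can be computed by least squares with NumPy; the data and the true coefficient values (1.0, 2.0, -0.5) are invented for the illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)

# Hypothetical true model: Y = 1.0 + 2.0*X1 - 0.5*X2 + e
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + rng.normal(scale=0.1, size=n)

# Stack a column of ones for the intercept beta0, then solve the
# least-squares problem for all coefficients at once
X = np.column_stack([np.ones(n), X1, X2])
beta_hat = np.linalg.lstsq(X, Y, rcond=None)[0]
print(beta_hat)  # close to [1.0, 2.0, -0.5]
```

With small error variance and n = 200, the estimates land very close to the coefficients used to generate the data.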

4 The Error Term
Same assumptions as before:
E(ei) = 0
var(ei) = σ²
cov(X, ei) = 0
cov(ei, ej) = 0 for i ≠ j


6 The Estimated Coefficients
Each coefficient measures the marginal effect of its independent variable, controlling for the other effects, i.e., the effect of that X "all else equal."
Estimates can be sensitive to which other variables are included in the regression.

7 Omitted Variables
Suppose the true model is: Yi = β0 + β1X1i + β2X2i + ei
But you leave out X2 (through ignorance or lack of data).
Does it matter?

8 Analysis of Omitted Variables
The error term now includes both e and the omitted X2:
Yi = β0 + β1X1i + ui = β0 + β1X1i + [β2X2i + ei]
Two cases:
A. X2 correlated with X1: the estimate of β1 is biased. It picks up the effect of X2 and attributes it to X1.
B. X2 uncorrelated with X1: no bias.
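Both cases can be checked by simulation; this sketch is not from the lecture, and all coefficient values (intercept 1.0, slopes 2.0 and 3.0, correlation loading 0.8) are made up. In case A the short-regression slope converges to 2.0 + 3.0 × 0.8 = 4.4 rather than 2.0:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
X1 = rng.normal(size=n)
X_short = np.column_stack([np.ones(n), X1])  # regression with X2 omitted

# Case A: omitted X2 is correlated with X1 (loading 0.8)
X2 = 0.8 * X1 + rng.normal(size=n)
Y = 1.0 + 2.0 * X1 + 3.0 * X2 + rng.normal(size=n)
bA = np.linalg.lstsq(X_short, Y, rcond=None)[0]
print(bA[1])  # near 2.0 + 3.0 * 0.8 = 4.4, not 2.0: omitted-variable bias

# Case B: omitted X2 is uncorrelated with X1
X2u = rng.normal(size=n)
Yu = 1.0 + 2.0 * X1 + 3.0 * X2u + rng.normal(size=n)
bB = np.linalg.lstsq(X_short, Yu, rcond=None)[0]
print(bB[1])  # near 2.0: no bias, just a noisier error term
```

The factor 0.8 is exactly the slope of the auxiliary regression of X2 on X1, which is why the bias equals β2 times that slope.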

9 Case Study — MIT Lawsuit


11 Collinearity
Let Yi = β0 + β1X1i + β2X2i + ei
Suppose X1 and X2 are highly correlated. What difference does it make?
It becomes hard to estimate β1 and β2 separately: no bias, but large standard errors.
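The "no bias, but large standard errors" point can be seen numerically. This sketch (not from the lecture, with made-up coefficients) compares the slope standard errors for a mildly correlated pair of regressors against a nearly collinear pair:

```python
import numpy as np

def ols_se(X, Y):
    """Standard errors of OLS coefficients: sqrt of diag of s^2 * (X'X)^-1."""
    n, k = X.shape
    beta = np.linalg.lstsq(X, Y, rcond=None)[0]
    resid = Y - X @ beta
    s2 = resid @ resid / (n - k)
    return np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))

rng = np.random.default_rng(2)
n = 500
X1 = rng.normal(size=n)
e = rng.normal(size=n)

# Mildly correlated vs. nearly collinear second regressor
X2_low = X1 + 1.00 * rng.normal(size=n)
X2_high = X1 + 0.05 * rng.normal(size=n)

Y_low = 1.0 + 2.0 * X1 + 3.0 * X2_low + e
Y_high = 1.0 + 2.0 * X1 + 3.0 * X2_high + e

se_low = ols_se(np.column_stack([np.ones(n), X1, X2_low]), Y_low)
se_high = ols_se(np.column_stack([np.ones(n), X1, X2_high]), Y_high)
print(se_low[1], se_high[1])  # the slope SE is far larger in the collinear case
```

The estimator is still unbiased in both cases; collinearity only inflates the variance of the individual slope estimates.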

12 Collinearity—Diagnosis
Neither X1 nor X2 has a significant t statistic, BUT X1 is significant when X2 is left out of the regression, and vice versa.
Test joint significance with an F test.
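The joint F test can be computed from the restricted and unrestricted sums of squared residuals. A sketch on simulated nearly collinear data (coefficients invented for the illustration; the 5% critical value of F(2, 197) is roughly 3.0):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
X1 = rng.normal(size=n)
X2 = X1 + 0.05 * rng.normal(size=n)   # nearly collinear with X1
Y = 1.0 + 1.0 * X1 + 1.0 * X2 + rng.normal(size=n)

def ssr(X, Y):
    """Sum of squared residuals from an OLS fit."""
    b = np.linalg.lstsq(X, Y, rcond=None)[0]
    r = Y - X @ b
    return r @ r

X_u = np.column_stack([np.ones(n), X1, X2])  # unrestricted model
X_r = np.ones((n, 1))                        # restricted: beta1 = beta2 = 0
q, k = 2, 3                                  # restrictions, parameters
F = ((ssr(X_r, Y) - ssr(X_u, Y)) / q) / (ssr(X_u, Y) / (n - k))
print(F)  # far above the critical value: X1 and X2 are jointly significant
```

Individually the two t statistics may look weak because the regressors are nearly interchangeable, but the F test shows that together they explain Y strongly.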

13 Exact Collinearity
Let Yi = β0 + β1X1i + β2X2i + ei
Suppose X2 is an exact linear function of X1, e.g., X2 = a + bX1.
Then the model cannot be estimated at all!
Exact collinearity can also occur with 3 or more X's.

14 Exact Collinearity—Example
A regression to explain calories as a function of the fat content of foods:
X1 is fat in ounces per portion
X2 is fat in the same food in grams
Then X2i = 28.35 X1i (1 ounce ≈ 28.35 grams)
Can't estimate Yi = β0 + β1X1i + β2X2i + ei
Intuition: there is no independent information in X2.
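The breakdown is visible in the design matrix itself. In this sketch (synthetic portion sizes, not from the lecture), the grams column is an exact multiple of the ounces column, so the matrix loses a rank and X'X cannot be inverted:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
fat_oz = rng.uniform(0.1, 3.0, size=n)  # X1: fat in ounces per portion
fat_g = 28.35 * fat_oz                  # X2: the same fat in grams

X = np.column_stack([np.ones(n), fat_oz, fat_g])
print(np.linalg.matrix_rank(X))  # 2, not 3: X'X is singular, betas not identified
```

Any attempt to invert X'X here fails (or returns garbage from a pseudo-inverse), which is the computational face of "no independent information in X2."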

15 Tests of Restrictions
Suppose H0: β2 = 2β1 in Yi = β0 + β1X1i + β2X2i + ei
Test H0 with a reformulated model that embeds the restriction:
Yi = β0 + β1(X1i + 2X2i) + γ2X2i + ei, where γ2 = β2 - 2β1
Under H0, γ2 = 0
Can test with the usual t statistic
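The reformulation can be checked by simulating data in which the restriction holds. This sketch is not from the lecture; the coefficient values (β1 = 1.5, β2 = 3.0 = 2β1) are invented so that H0 is true by construction:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)

# Data generated with beta2 = 2*beta1, so H0 holds
Y = 0.5 + 1.5 * X1 + 3.0 * X2 + rng.normal(size=n)

# Reformulated regressors: (X1 + 2*X2) absorbs the restriction,
# and the coefficient on the extra X2 column is gamma2 = beta2 - 2*beta1
Z = np.column_stack([np.ones(n), X1 + 2.0 * X2, X2])
b = np.linalg.lstsq(Z, Y, rcond=None)[0]
resid = Y - Z @ b
s2 = resid @ resid / (n - 3)
se = np.sqrt(s2 * np.diag(np.linalg.inv(Z.T @ Z)))
t_gamma2 = b[2] / se[2]
print(t_gamma2)  # small in absolute value: no evidence against H0
```

The coefficient on the combined regressor recovers β1, and the t statistic on the extra X2 column is the usual one-restriction test of H0.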

16 Test Your Understanding!
What is the difference between exact collinearity, e.g., X2i = 2X1i, and a coefficient restriction, e.g., H0: β2 = 2β1?
Relate the concepts to the model.