**CHAPTER 3: TWO VARIABLE REGRESSION MODEL: THE PROBLEM OF ESTIMATION**

ECONOMETRICS I. Textbook: Damodar N. Gujarati (2004), Basic Econometrics, 4th edition, The McGraw-Hill Companies.

**3.1 THE METHOD OF ORDINARY LEAST SQUARES**

PRF: $Y_i = \beta_1 + \beta_2 X_i + u_i$

SRF: $Y_i = \hat{\beta}_1 + \hat{\beta}_2 X_i + \hat{u}_i$

How is the SRF determined? We do not minimize the simple sum of the residuals, $\sum \hat{u}_i$. Why not? Because large positive and negative residuals would cancel each other, so even a badly fitting line could make that sum zero or very small.

**Least squares criterion**

Choose $\hat{\beta}_1$ and $\hat{\beta}_2$ so that the sum of the squared residuals is as small as possible:

$\min \sum \hat{u}_i^2 = \sum \left(Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_i\right)^2$


We adopt the least-squares criterion: we minimize the sum of the squared residuals. This sum is a function of the estimated parameters $\hat{\beta}_1$ and $\hat{\beta}_2$. Differentiating it with respect to each parameter and setting the derivatives to zero yields the normal equations:

$\sum Y_i = n\hat{\beta}_1 + \hat{\beta}_2 \sum X_i$

$\sum Y_i X_i = \hat{\beta}_1 \sum X_i + \hat{\beta}_2 \sum X_i^2$


Solving the normal equations simultaneously, we obtain:

$\hat{\beta}_2 = \dfrac{n\sum X_i Y_i - \sum X_i \sum Y_i}{n\sum X_i^2 - \left(\sum X_i\right)^2}$

$\hat{\beta}_1 = \bar{Y} - \hat{\beta}_2 \bar{X}$

With $x_i = X_i - \bar{X}$ and $y_i = Y_i - \bar{Y}$ (deviations from the sample means), $\hat{\beta}_2$ can alternatively be expressed as:

$\hat{\beta}_2 = \dfrac{\sum x_i y_i}{\sum x_i^2}$
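These formulas translate directly into code. A minimal sketch in Python, using illustrative consumption (Y) and income (X) figures assumed here to resemble the textbook's hypothetical numerical example:

```python
# Illustrative consumption (Y) and income (X) data (assumed).
X = [80, 100, 120, 140, 160, 180, 200, 220, 240, 260]
Y = [70, 65, 90, 95, 110, 115, 120, 140, 155, 150]

n = len(X)
x_bar = sum(X) / n
y_bar = sum(Y) / n

# Slope in deviation form: beta2_hat = sum(x_i * y_i) / sum(x_i ** 2)
sum_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(X, Y))
sum_x2 = sum((xi - x_bar) ** 2 for xi in X)
beta2_hat = sum_xy / sum_x2
beta1_hat = y_bar - beta2_hat * x_bar   # intercept: Y-bar minus slope * X-bar

print(round(beta1_hat, 4), round(beta2_hat, 4))  # 24.4545 0.5091
```

For these (assumed) numbers the fitted line is $\hat{Y}_i = 24.4545 + 0.5091\,X_i$.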

**Three Statistical Properties of OLS Estimators**

I. The OLS estimators are expressed solely in terms of the observable quantities X and Y, so they can easily be computed. II. They are point estimators (not interval estimators): given the sample, each estimator provides only a single (point) value of the relevant population parameter. III. Once the OLS estimates are obtained from the sample data, the sample regression line is easily obtained.

**The properties of the regression line**

1. It passes through the sample means of Y and X: $\bar{Y} = \hat{\beta}_1 + \hat{\beta}_2 \bar{X}$.
2. The mean of the fitted values equals the mean of the actual values: $\bar{\hat{Y}} = \bar{Y}$.
3. The mean value of the residuals is zero: $\sum \hat{u}_i = 0$.
4. The residuals are uncorrelated with the fitted values: $\sum \hat{u}_i \hat{Y}_i = 0$.
5. The residuals are uncorrelated with $X_i$: $\sum \hat{u}_i X_i = 0$.
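These properties of the fitted line can be verified numerically. A sketch using illustrative data (assumed) — each of the five quantities below is zero up to floating-point rounding:

```python
# Verify the properties of the fitted OLS line on illustrative data (assumed).
X = [80, 100, 120, 140, 160, 180, 200, 220, 240, 260]
Y = [70, 65, 90, 95, 110, 115, 120, 140, 155, 150]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n

beta2_hat = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(X, Y))
             / sum((xi - x_bar) ** 2 for xi in X))
beta1_hat = y_bar - beta2_hat * x_bar
Y_hat = [beta1_hat + beta2_hat * xi for xi in X]   # fitted values
u_hat = [yi - yh for yi, yh in zip(Y, Y_hat)]      # residuals

checks = [
    (beta1_hat + beta2_hat * x_bar) - y_bar,        # 1. line passes through the means
    sum(Y_hat) / n - y_bar,                         # 2. mean of fitted = mean of actual
    sum(u_hat),                                     # 3. residuals sum to zero
    sum(u * yh for u, yh in zip(u_hat, Y_hat)),     # 4. residuals vs fitted values
    sum(u * x for u, x in zip(u_hat, X)),           # 5. residuals vs X
]
print(all(abs(c) < 1e-6 for c in checks))  # True
```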

**3.2 The Classical Linear Regression Model: The Assumptions Underlying the Method of Least Squares**

1. The regression model is linear in the parameters.
2. X values are fixed in repeated sampling (X is nonstochastic).
3. Zero mean value of the disturbance: $E(u_i \mid X_i) = 0$.
4. Homoscedasticity: $\operatorname{var}(u_i \mid X_i) = \sigma^2$, the same for every observation.
5. No autocorrelation between the disturbances: $\operatorname{cov}(u_i, u_j) = 0$ for $i \neq j$.
6. Zero covariance between $u_i$ and $X_i$.
7. The number of observations n must be greater than the number of parameters to be estimated.
8. Variability in the X values: the X values in a sample must not all be the same.
9. The regression model is correctly specified (no specification bias).
10. There is no perfect multicollinearity among the explanatory variables.
**Example of perfect multicollinearity: X1 = 2X2+X3**

In the slide's table, every value of X1 satisfies X1 = 2X2 + X3 (for example, X1 = 5 with X2 = 2, X3 = 1), so X1 is an exact linear combination of X2 and X3. With such data, the three regressors cannot all be included in one regression: no unique values for their coefficients exist.
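A small sketch, using made-up values chosen to satisfy X1 = 2X2 + X3 exactly, showing why estimation breaks down: the cross-product (moment) matrix of the regressors is singular, so the normal equations cannot be solved uniquely.

```python
# Made-up regressor values constructed so that X1 = 2*X2 + X3 holds exactly.
X2 = [2, 4, 6, 8]
X3 = [1, 3, 2, 5]
X1 = [2 * a + b for a, b in zip(X2, X3)]   # [5, 11, 14, 21]

regs = [X1, X2, X3]
# Moment matrix: M[i][j] = sum over observations of regs[i] * regs[j]
M = [[sum(a * b for a, b in zip(r, s)) for s in regs] for r in regs]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# An exact linear dependence among the regressors makes the determinant zero,
# so the matrix has no inverse and the coefficients are not identified.
print(det3(M))  # 0
```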

**PRECISION OR STANDARD ERRORS OF LEAST SQUARES ESTIMATES**

$\operatorname{var}(\hat{\beta}_2) = \dfrac{\sigma^2}{\sum x_i^2}$, $\quad \operatorname{se}(\hat{\beta}_2) = \dfrac{\sigma}{\sqrt{\sum x_i^2}}$

$\operatorname{var}(\hat{\beta}_1) = \dfrac{\sum X_i^2}{n \sum x_i^2}\,\sigma^2$, $\quad \operatorname{se}(\hat{\beta}_1) = \sqrt{\dfrac{\sum X_i^2}{n \sum x_i^2}}\,\sigma$

var: variance; se: standard error. $\sigma^2$ is the constant homoscedastic variance of $u_i$; its OLS estimator is $\hat{\sigma}^2 = \dfrac{\sum \hat{u}_i^2}{n-2}$, and $\hat{\sigma}$ is the standard error of the estimate.
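A sketch of these formulas in code, using the same kind of illustrative data (assumed):

```python
import math

# Illustrative data (assumed).
X = [80, 100, 120, 140, 160, 180, 200, 220, 240, 260]
Y = [70, 65, 90, 95, 110, 115, 120, 140, 155, 150]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
sum_x2 = sum((xi - x_bar) ** 2 for xi in X)   # sum of squared deviations of X
beta2_hat = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(X, Y)) / sum_x2
beta1_hat = y_bar - beta2_hat * x_bar

# sigma^2-hat = RSS / (n - 2): estimator of the error variance
rss = sum((yi - beta1_hat - beta2_hat * xi) ** 2 for xi, yi in zip(X, Y))
sigma2_hat = rss / (n - 2)

se_beta2 = math.sqrt(sigma2_hat / sum_x2)
se_beta1 = math.sqrt(sigma2_hat * sum(xi ** 2 for xi in X) / (n * sum_x2))

print(round(sigma2_hat, 4), round(se_beta1, 4), round(se_beta2, 4))
# 42.1591 6.4138 0.0357
```

Note the divisor n − 2: two degrees of freedom are used up estimating $\hat{\beta}_1$ and $\hat{\beta}_2$.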

**Gauss–Markov Theorem**

An estimator, say the OLS estimator $\hat{\beta}_2$, is said to be a best linear unbiased estimator (BLUE) of $\beta_2$ if the following hold:

1. It is linear: a linear function of a random variable, such as the dependent variable Y.
2. It is unbiased: $E(\hat{\beta}_2) = \beta_2$.
3. It has minimum variance in the class of all linear unbiased estimators (it is efficient).
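Unbiasedness can be illustrated with a small Monte Carlo simulation: the average of the OLS slope estimates across many samples should be close to the true slope. Everything here (true parameters, error distribution, sample design) is an assumption chosen for the sketch:

```python
import random

random.seed(42)
beta1, beta2, sigma = 3.0, 0.5, 2.0       # assumed "true" parameters
X = list(range(1, 21))                    # X fixed in repeated sampling
n = len(X)
x_bar = sum(X) / n
sum_x2 = sum((x - x_bar) ** 2 for x in X)

# Draw many samples from Y = beta1 + beta2*X + u and re-estimate the slope.
estimates = []
for _ in range(5000):
    Y = [beta1 + beta2 * x + random.gauss(0, sigma) for x in X]
    y_bar = sum(Y) / n
    b2 = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y)) / sum_x2
    estimates.append(b2)

mean_b2 = sum(estimates) / len(estimates)
print(round(mean_b2, 2))  # close to the true value 0.5
```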

**The coefficient of determination r²**

The total variation in Y decomposes as TSS = ESS + RSS:

TSS (total sum of squares): $\sum (Y_i - \bar{Y})^2$

ESS (explained sum of squares): $\sum (\hat{Y}_i - \bar{Y})^2$

RSS (residual sum of squares): $\sum \hat{u}_i^2$

The coefficient of determination is $r^2 = \dfrac{\text{ESS}}{\text{TSS}} = 1 - \dfrac{\text{RSS}}{\text{TSS}}$.


The quantity $r^2$ thus defined is known as the (sample) coefficient of determination and is the most commonly used measure of the goodness of fit of a regression line. Verbally, $r^2$ measures the proportion (or percentage) of the total variation in Y explained by the regression model. It is a nonnegative quantity, with limits $0 \leq r^2 \leq 1$.
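Computing the decomposition and $r^2$ for illustrative data (assumed):

```python
# Illustrative data (assumed).
X = [80, 100, 120, 140, 160, 180, 200, 220, 240, 260]
Y = [70, 65, 90, 95, 110, 115, 120, 140, 155, 150]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
beta2_hat = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(X, Y))
             / sum((xi - x_bar) ** 2 for xi in X))
beta1_hat = y_bar - beta2_hat * x_bar
Y_hat = [beta1_hat + beta2_hat * xi for xi in X]

tss = sum((yi - y_bar) ** 2 for yi in Y)               # total variation
ess = sum((yh - y_bar) ** 2 for yh in Y_hat)           # explained part
rss = sum((yi - yh) ** 2 for yi, yh in zip(Y, Y_hat))  # residual part

r2 = ess / tss                                         # = 1 - rss/tss
print(round(r2, 4))  # 0.9621 -- about 96% of the variation is explained
```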

**The coefficient of correlation r**

r is the sample correlation coefficient. It can be computed directly as $r = \dfrac{\sum x_i y_i}{\sqrt{\sum x_i^2}\sqrt{\sum y_i^2}}$, or, in the two-variable model, as $r = \pm\sqrt{r^2}$, taking the sign of $\hat{\beta}_2$.
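A quick check with illustrative data (assumed): computing r directly and recovering it from $r^2$ with the sign of the slope give the same value.

```python
import math

# Illustrative data (assumed).
X = [80, 100, 120, 140, 160, 180, 200, 220, 240, 260]
Y = [70, 65, 90, 95, 110, 115, 120, 140, 155, 150]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
x = [xi - x_bar for xi in X]   # deviations of X
y = [yi - y_bar for yi in Y]   # deviations of Y

# Direct formula: r = sum(x*y) / sqrt(sum(x^2) * sum(y^2))
r_direct = (sum(a * b for a, b in zip(x, y))
            / math.sqrt(sum(a * a for a in x) * sum(b * b for b in y)))

# Via r^2, attaching the sign of the estimated slope
beta2_hat = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)
r_from_r2 = math.copysign(math.sqrt(r_direct ** 2), beta2_hat)

print(round(r_direct, 4))  # 0.9808
```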

**Some of the properties of r**

1. It can be positive or negative, depending on the sign of $\sum x_i y_i$.
2. It lies between the limits $-1 \leq r \leq 1$.
3. It is symmetrical: the correlation of X with Y equals the correlation of Y with X.
4. It is independent of the origin and scale of measurement of X and Y.
5. It is a measure of linear association only; it does not imply causation.
**Homework**

Study the numerical example on the relevant pages of the textbook (data on page 88). There will be questions on the midterm exam similar to the ones in this example.

