VARIABLE MISSPECIFICATION I: OMISSION OF A RELEVANT VARIABLE


In this sequence and the next we will investigate the consequences of misspecifying the regression model in terms of explanatory variables.

To keep the analysis simple, we will assume that there are only two possibilities: either Y depends only on X2, or it depends on both X2 and X3.

If Y depends only on X2 and we fit a simple regression model, we will not encounter any problems, assuming of course that the regression model assumptions are valid.

Likewise, we will not encounter any problems if Y depends on both X2 and X3 and we fit the multiple regression.

In this sequence we will examine the consequences of fitting a simple regression when the true model is multiple.

In the next one we will do the opposite and examine the consequences of fitting a multiple regression when the true model is simple.

The omission of a relevant explanatory variable causes the regression coefficients to be biased and the standard errors to be invalid. The consequences can be summarized in a table:

                          True model:             True model:
  Fitted model            Y depends on X2 only    Y depends on X2 and X3
  Y regressed on X2       Correct specification,  Coefficients are biased
                          no problems             (in general); standard
                                                  errors are invalid.
  Y regressed on X2, X3   (next sequence)         Correct specification,
                                                  no problems

In the present case, the omission of X3 causes b2 to be biased. We will explain this first intuitively and then demonstrate it mathematically.

[Path diagram: X2 has a direct effect β2 on Y, holding X3 constant; X3 has an effect β3 on Y; X2 also has an apparent effect on Y through acting as a mimic for X3.]

The intuitive reason is that, in addition to its direct effect β2, X2 has an apparent indirect effect as a consequence of acting as a proxy for the missing X3.

The strength of the proxy effect depends on two factors: the strength of the effect of X3 on Y, which is given by β3, and the ability of X2 to mimic X3.

The ability of X2 to mimic X3 is determined by the slope coefficient obtained when X3 is regressed on X2.
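The proxy mechanism just described can be checked with a small simulation. The sketch below uses a hypothetical data-generating process (the coefficients 1.0 and 2.0 and the mimic slope of 0.5 are invented for illustration, not taken from these slides) and confirms that the misspecified slope picks up β2 plus β3 times the slope from regressing X3 on X2:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
beta2, beta3 = 1.0, 2.0          # hypothetical true coefficients

# Hypothetical DGP: X3 is correlated with X2, so X2 can "mimic" X3
x2 = rng.normal(size=n)
x3 = 0.5 * x2 + rng.normal(size=n)
y = 1.0 + beta2 * x2 + beta3 * x3 + rng.normal(size=n)

def slope(x, t):
    """OLS slope from a simple regression of t on x."""
    xd, td = x - x.mean(), t - t.mean()
    return (xd @ td) / (xd @ xd)

b2 = slope(x2, y)      # misspecified regression: omits x3
d32 = slope(x2, x3)    # ability of x2 to mimic x3 (about 0.5 here)

print(f"b2 = {b2:.3f}, beta2 + beta3*d32 = {beta2 + beta3 * d32:.3f}")
```

With these numbers b2 comes out close to 2.0 rather than the true 1.0, matching β2 + β3 times the mimic slope.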

We will now derive the expression for the bias mathematically.

Although Y really depends on X3 as well as X2, we make a mistake and regress Y on X2 only. The slope coefficient b2 is then given by the usual simple-regression formula.

We substitute for Y using the true model.

We simplify and demonstrate that b2 has three components.

To investigate biasedness or unbiasedness, we take the expected value of b2. The first two terms are unaffected because they contain no random components, so we focus on the expectation of the error term.

X2 is nonstochastic, so the denominator of the error term is nonstochastic and may be taken outside the expression for the expectation.

In the numerator, the expectation of a sum is equal to the sum of the expectations (first expected value rule).

In each product, the factor involving X2 may be taken out of the expectation because X2 is nonstochastic.

By Assumption A.3, the expected value of u is 0. It follows that the expected value of the sample mean of u is also 0. Hence the expected value of the error term is 0.

Thus we have shown that the expected value of b2 is equal to the true value plus a bias term. (The bias of an estimator is defined as the difference between its expected value and the true value of the parameter being estimated.)
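The equations themselves appear as images on the slides and are not reproduced in this transcript. The standard derivation, consistent with the steps just described, is:

```latex
b_2 = \frac{\sum (X_{2i} - \bar{X}_2)(Y_i - \bar{Y})}{\sum (X_{2i} - \bar{X}_2)^2}
```

Substituting the true model $Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$ and simplifying gives the three components:

```latex
b_2 = \beta_2
    + \beta_3 \frac{\sum (X_{2i} - \bar{X}_2)(X_{3i} - \bar{X}_3)}{\sum (X_{2i} - \bar{X}_2)^2}
    + \frac{\sum (X_{2i} - \bar{X}_2)(u_i - \bar{u})}{\sum (X_{2i} - \bar{X}_2)^2}
```

Taking expectations, with the X values nonstochastic and $E(u_i) = 0$,

```latex
E(b_2) = \beta_2
       + \beta_3 \frac{\sum (X_{2i} - \bar{X}_2)(X_{3i} - \bar{X}_3)}{\sum (X_{2i} - \bar{X}_2)^2}
```

so the bias is β3 multiplied by the slope coefficient from a regression of X3 on X2.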

As a consequence of the misspecification, the standard errors, t tests, and F test are invalid.

We will illustrate the bias using an educational attainment model. To keep the analysis simple, we will assume that in the true model S depends only on ASVABC and SM. The corresponding regression, using EAEF Data Set 21, is shown below.

. reg S ASVABC SM
[regression output: numerical values not preserved in this transcript]

We will run the regression a second time, omitting SM. Before we do this, we will try to predict the direction of the bias in the coefficient of ASVABC.

. cor SM ASVABC (obs=540)
[correlation table not preserved in this transcript]

It is reasonable to suppose, as a matter of common sense, that β3 is positive. This assumption is strongly supported by the fact that its estimate in the multiple regression is positive and highly significant.

The correlation between ASVABC and SM is positive, so the numerator of the bias term must be positive. The denominator is automatically positive, since it is a sum of squares and there is some variation in ASVABC. Hence the bias should be positive.

Here is the regression omitting SM.

. reg S ASVABC
[regression output not preserved in this transcript]

. reg S ASVABC SM
. reg S ASVABC
[coefficient tables not preserved in this transcript]

As you can see, the coefficient of ASVABC is indeed higher when SM is omitted. Part of the difference may be due to pure chance, but part is attributable to the bias.

Here is the regression omitting ASVABC instead of SM. We would expect b3 to be upwards biased. We anticipate that β2 is positive, and we know that both the numerator and the denominator of the other factor in the bias expression are positive.

. reg S SM
[regression output not preserved in this transcript]

In this case the bias is quite dramatic: the coefficient of SM has more than doubled. The reason for the bigger effect is that the variation in SM is much smaller than that in ASVABC, while β2 and β3 are similar in size, judging by their estimates.
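The reason the lower-variance regressor suffers the larger proportional bias is that the denominator of the bias term is the sum of squares of the included variable, so a regressor with little variation attracts a large bias factor. A simulation with made-up coefficients and variances (not the EAEF data) illustrates the asymmetry:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
beta2 = beta3 = 1.0                      # similar-sized true effects, as on the slide

x2 = rng.normal(scale=2.0, size=n)               # large variation (like ASVABC)
x3 = 0.25 * x2 + rng.normal(scale=0.5, size=n)   # small variation (like SM)
y = beta2 * x2 + beta3 * x3 + rng.normal(size=n)

def slope(x, t):
    """OLS slope from a simple regression of t on x."""
    xd, td = x - x.mean(), t - t.mean()
    return (xd @ td) / (xd @ xd)

b2_simple = slope(x2, y)   # omitting x3: modest upward bias
b3_simple = slope(x3, y)   # omitting x2: dramatic upward bias

print(f"b2 when x3 omitted: {b2_simple:.2f} (true 1.0)")
print(f"b3 when x2 omitted: {b3_simple:.2f} (true 1.0)")
```

With these invented variances, the coefficient of the high-variance x2 is inflated by about 25% (from 1.0 to roughly 1.25), while the coefficient of the low-variance x3 roughly triples, mirroring the contrast on the slide.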

Finally, we will investigate how R2 behaves when a variable is omitted. In the simple regression of S on ASVABC, R2 is 0.34, and in the simple regression of S on SM it is 0.13.

. reg S ASVABC SM
. reg S ASVABC
. reg S SM
[goodness-of-fit statistics not preserved in this transcript]

Does this imply that ASVABC explains 34% of the variance in S and SM 13%? No, because the multiple regression reveals that their joint explanatory power is 0.35, not 0.47.

In the second regression, ASVABC is partly acting as a proxy for SM, and this inflates its apparent explanatory power. Similarly, in the third regression, SM is partly acting as a proxy for ASVABC, again inflating its apparent explanatory power.

However, it is also possible for omitted variable bias to lead to a reduction in the apparent explanatory power of a variable. This will be demonstrated using a simple earnings function model, supposing the logarithm of hourly earnings to depend on S and EXP.

. reg LGEARN S EXP
[regression output not preserved in this transcript]

If we omit EXP from the regression, the coefficient of S should be subject to a downward bias. β3 is likely to be positive. The numerator of the other factor in the bias term is negative, since S and EXP are negatively correlated. The denominator is positive.

. cor S EXP (obs=540)
[correlation table not preserved in this transcript]

For the same reasons, the coefficient of EXP in a simple regression of LGEARN on EXP should be downwards biased.

. reg LGEARN S EXP
. reg LGEARN S
. reg LGEARN EXP
[coefficient tables and goodness-of-fit statistics not preserved in this transcript]

As can be seen, the coefficients of S and EXP are indeed lower in the simple regressions.

A comparison of R2 for the three regressions shows that the sum of R2 in the simple regressions is actually less than R2 in the multiple regression.

This is because the apparent explanatory power of S in the second regression has been undermined by the downward bias in its coefficient. The same is true for the apparent explanatory power of EXP in the third equation.
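The contrast between the two examples (sum of the simple R2 values above the multiple R2 when the regressors are positively correlated, below it when they are negatively correlated) can be reproduced in a simulation. This is an illustrative sketch with invented coefficients, not the EAEF data:

```python
import numpy as np

def r2(X, y):
    """R-squared from an OLS regression of y on the columns of X, with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(1)
n = 50_000
results = {}
for rho in (0.6, -0.6):                          # correlation between the regressors
    x2 = rng.normal(size=n)
    x3 = rho * x2 + np.sqrt(1 - rho**2) * rng.normal(size=n)
    y = x2 + x3 + rng.normal(size=n)             # both true coefficients equal to 1
    joint = r2(np.column_stack([x2, x3]), y)
    total = r2(x2, y) + r2(x3, y)
    results[rho] = (total, joint)
    print(f"rho = {rho:+.1f}: sum of simple R2 = {total:.2f}, multiple R2 = {joint:.2f}")
```

With positively correlated regressors each one proxies for the other and the simple R2 values overstate their contributions; with negatively correlated regressors the downward biases make the simple R2 values understate them.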

Copyright Christopher Dougherty. These slideshows may be downloaded by anyone, anywhere, for personal use. Subject to respect for copyright and, where appropriate, attribution, they may be used as a resource for teaching an econometrics course. There is no need to refer to the author.

The content of this slideshow comes from Section 6.2 of C. Dougherty, Introduction to Econometrics, fourth edition, 2011, Oxford University Press. Additional (free) resources for both students and instructors may be downloaded from the OUP Online Resource Centre.

Individuals studying econometrics on their own who feel that they might benefit from participation in a formal course should consider the London School of Economics summer school course EC212 Introduction to Econometrics or the University of London International Programmes distance learning course EC2020 Elements of Econometrics.