1
Christopher Dougherty, EC220 - Introduction to Econometrics (Chapter 1). Slideshow: Deriving Linear Regression Coefficients. Original citation: Dougherty, C. (2012) EC220 - Introduction to econometrics (chapter 1). [Teaching Resource] © 2012 The Author. This version available in LSE Learning Resources Online: May 2012. This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 License. This license allows the user to remix, tweak, and build upon the work even for commercial purposes, as long as the user credits the author and licenses their new creations under the identical terms.

2
DERIVING LINEAR REGRESSION COEFFICIENTS
This sequence shows how the regression coefficients for a simple regression model are derived, using the least squares criterion (OLS, for ordinary least squares).
True model: Y = β1 + β2X + u

3
DERIVING LINEAR REGRESSION COEFFICIENTS
We will start with a numerical example with just three observations: (1,3), (2,5), and (3,6).

4
DERIVING LINEAR REGRESSION COEFFICIENTS
Writing the fitted regression as Ŷ = b1 + b2X, we will determine the values of b1 and b2 that minimize RSS, the sum of the squares of the residuals.
True model: Y = β1 + β2X + u
Fitted line: Ŷ = b1 + b2X

5
DERIVING LINEAR REGRESSION COEFFICIENTS
Given our choice of b1 and b2, the residuals are ei = Yi − b1 − b2Xi.

6
SIMPLE REGRESSION ANALYSIS
The sum of the squares of the residuals is thus RSS = (3 − b1 − b2)² + (5 − b1 − 2b2)² + (6 − b1 − 3b2)².

7
SIMPLE REGRESSION ANALYSIS
The quadratics have been expanded.

8
SIMPLE REGRESSION ANALYSIS
Like terms have been added together.

9
SIMPLE REGRESSION ANALYSIS
For a minimum, the partial derivatives of RSS with respect to b1 and b2 should be zero. (We should also check a second-order condition.)

10
SIMPLE REGRESSION ANALYSIS
The first-order conditions give us two equations in two unknowns.
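For the three observations (1,3), (2,5), and (3,6), the algebra displayed on these slides (the equation images are not reproduced in this transcript) works out as follows:

```latex
\begin{align*}
RSS &= (3 - b_1 - b_2)^2 + (5 - b_1 - 2b_2)^2 + (6 - b_1 - 3b_2)^2 \\
    &= 3b_1^2 + 12 b_1 b_2 + 14 b_2^2 - 28 b_1 - 62 b_2 + 70, \\
\frac{\partial RSS}{\partial b_1} &= 6 b_1 + 12 b_2 - 28 = 0, \\
\frac{\partial RSS}{\partial b_2} &= 12 b_1 + 28 b_2 - 62 = 0.
\end{align*}
```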

11
SIMPLE REGRESSION ANALYSIS
Solving them, we find that RSS is minimized when b1 and b2 are equal to 1.67 and 1.50, respectively.
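As a quick numerical check, the two first-order conditions can be written in matrix form and solved directly (a sketch; the variable names are mine, not from the slides):

```python
import numpy as np

# Data for the three-observation example: (1, 3), (2, 5), (3, 6)
X = np.array([1.0, 2.0, 3.0])
Y = np.array([3.0, 5.0, 6.0])

# Normal equations: n*b1 + (sum X)*b2 = sum Y
#                   (sum X)*b1 + (sum X^2)*b2 = sum XY
A = np.array([[len(X), X.sum()],
              [X.sum(), (X ** 2).sum()]])
c = np.array([Y.sum(), (X * Y).sum()])

b1, b2 = np.linalg.solve(A, c)
print(b1, b2)  # b1 = 5/3 ≈ 1.67, b2 = 1.50
```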

12
DERIVING LINEAR REGRESSION COEFFICIENTS
Here is the scatter diagram again.

13
DERIVING LINEAR REGRESSION COEFFICIENTS
The fitted line is Ŷ = 1.67 + 1.50X, and the fitted values of Y at X = 1, 2, and 3 are 3.17, 4.67, and 6.17.

14
DERIVING LINEAR REGRESSION COEFFICIENTS
Now we will do the same thing for the general case with n observations.

15
DERIVING LINEAR REGRESSION COEFFICIENTS
Given our choice of b1 and b2, we will obtain a fitted line Ŷ = b1 + b2X.

16
DERIVING LINEAR REGRESSION COEFFICIENTS
The residual for the first observation is e1 = Y1 − b1 − b2X1.

17
DERIVING LINEAR REGRESSION COEFFICIENTS
Similarly we define the residuals for the remaining observations; that for the last one is en = Yn − b1 − b2Xn.

18
DERIVING LINEAR REGRESSION COEFFICIENTS
RSS, the sum of the squares of the residuals, is defined for the general case: RSS = Σei² = Σ(Yi − b1 − b2Xi)². For comparison, in the numerical example RSS = (3 − b1 − b2)² + (5 − b1 − 2b2)² + (6 − b1 − 3b2)².
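The general-case RSS translates directly into a short function (a sketch; the names are mine). Evaluated on the numerical example, an arbitrary fit gives a larger RSS than the least squares fit:

```python
def rss(b1, b2, X, Y):
    """Sum of squared residuals for the fitted line Y-hat = b1 + b2*X."""
    return sum((y - b1 - b2 * x) ** 2 for x, y in zip(X, Y))

X = [1, 2, 3]
Y = [3, 5, 6]
print(rss(1.0, 1.0, X, Y))    # 9.0: a deliberately poor fit
print(rss(5 / 3, 1.5, X, Y))  # 1/6 ≈ 0.167: the least squares fit
```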

19
DERIVING LINEAR REGRESSION COEFFICIENTS
The quadratics are expanded.

20
DERIVING LINEAR REGRESSION COEFFICIENTS
Like terms are added together.

21
DERIVING LINEAR REGRESSION COEFFICIENTS
Note that in this equation the observations on X and Y are just data that determine the coefficients in the expression for RSS.

22
DERIVING LINEAR REGRESSION COEFFICIENTS
The choice variables in the expression are b1 and b2. This may seem a bit strange because in elementary calculus courses b1 and b2 are usually constants and X and Y are variables.

23
DERIVING LINEAR REGRESSION COEFFICIENTS
However, if you have any doubts, compare what we are doing in the general case with what we did in the numerical example.

24
DERIVING LINEAR REGRESSION COEFFICIENTS
The first derivative with respect to b1: ∂RSS/∂b1 = 2nb1 − 2ΣYi + 2b2ΣXi = 0.

25
DERIVING LINEAR REGRESSION COEFFICIENTS
With some simple manipulation we obtain a tidy expression for b1: b1 = Ȳ − b2X̄.

26
DERIVING LINEAR REGRESSION COEFFICIENTS
The first derivative with respect to b2: ∂RSS/∂b2 = 2b2ΣXi² − 2ΣXiYi + 2b1ΣXi = 0.

27
SIMPLE REGRESSION ANALYSIS
Divide through by 2: b2ΣXi² − ΣXiYi + b1ΣXi = 0.

28
SIMPLE REGRESSION ANALYSIS
We now substitute for b1 using the expression obtained for it, and we thus obtain an equation that contains b2 only.

29
SIMPLE REGRESSION ANALYSIS
The definition of the sample mean has been used.

30
SIMPLE REGRESSION ANALYSIS
The last two terms have been disentangled.

31
SIMPLE REGRESSION ANALYSIS
Terms not involving b2 have been transferred to the right side.

32
SIMPLE REGRESSION ANALYSIS
To create space, the equation is shifted to the top of the slide.

33
SIMPLE REGRESSION ANALYSIS
Hence we obtain an expression for b2: b2 = (ΣXiYi − nX̄Ȳ) / (ΣXi² − nX̄²).

34
SIMPLE REGRESSION ANALYSIS
In practice, we shall use an alternative expression, b2 = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)². We will demonstrate that it is equivalent.

35
SIMPLE REGRESSION ANALYSIS
Expanding the numerator, we obtain Σ(Xi − X̄)(Yi − Ȳ) = ΣXiYi − ȲΣXi − X̄ΣYi + nX̄Ȳ.

36
SIMPLE REGRESSION ANALYSIS
In the second term the mean value of Y is a common factor. In the third, the mean value of X is a common factor. The last term is the same for all i.

37
SIMPLE REGRESSION ANALYSIS
We use the definitions of the sample means (ΣXi = nX̄, ΣYi = nȲ) to simplify the expression: the numerator becomes ΣXiYi − nX̄Ȳ.

38
SIMPLE REGRESSION ANALYSIS
Hence we have shown that the numerators of the two expressions are the same.

39
SIMPLE REGRESSION ANALYSIS
The denominator is mathematically a special case of the numerator, replacing Y by X. Hence the expressions are equivalent.
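The two forms of the slope estimator can be checked numerically on the example data (a sketch; the function names are mine):

```python
import numpy as np

def ols_slope_deviations(X, Y):
    """b2 = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)."""
    x, y = X - X.mean(), Y - Y.mean()
    return (x * y).sum() / (x * x).sum()

def ols_slope_raw(X, Y):
    """b2 = (sum(XiYi) - n*Xbar*Ybar) / (sum(Xi^2) - n*Xbar^2)."""
    n = len(X)
    return (((X * Y).sum() - n * X.mean() * Y.mean())
            / ((X * X).sum() - n * X.mean() ** 2))

X = np.array([1.0, 2.0, 3.0])
Y = np.array([3.0, 5.0, 6.0])
b2 = ols_slope_deviations(X, Y)
b1 = Y.mean() - b2 * X.mean()
print(b1, b2)  # b1 ≈ 1.67, b2 = 1.5, as in the numerical example
print(np.isclose(ols_slope_raw(X, Y), b2))  # True: the two forms agree
```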

40
DERIVING LINEAR REGRESSION COEFFICIENTS
The scatter diagram is shown again. We will summarize what we have done. We hypothesized that the true model is Y = β1 + β2X + u, we obtained some data, and we fitted the line Ŷ = b1 + b2X.

41
DERIVING LINEAR REGRESSION COEFFICIENTS
We chose the parameters of the fitted line so as to minimize the sum of the squares of the residuals. As a result, we derived the expressions for b1 and b2.

42
DERIVING LINEAR REGRESSION COEFFICIENTS
Typically, an intercept should be included in the regression specification. Occasionally, however, one may have reason to fit the regression without an intercept. In the case of a simple regression model, the true and fitted models become as shown.
True model: Y = β2X + u
Fitted line: Ŷ = b2X

43
DERIVING LINEAR REGRESSION COEFFICIENTS
We will derive the expression for b2 from first principles using the least squares criterion. The residual in observation i is ei = Yi − b2Xi.

44
DERIVING LINEAR REGRESSION COEFFICIENTS
With this, the sum of the squares of the residuals is RSS = Σ(Yi − b2Xi)².

45
DERIVING LINEAR REGRESSION COEFFICIENTS
Differentiating with respect to b2, we obtain the first-order condition for a minimum: dRSS/db2 = −2ΣXi(Yi − b2Xi) = 0.

46
DERIVING LINEAR REGRESSION COEFFICIENTS
Hence, we obtain the OLS estimator of b2 for this model: b2 = ΣXiYi / ΣXi².

47
DERIVING LINEAR REGRESSION COEFFICIENTS
The second derivative, d²RSS/db2² = 2ΣXi², is positive, confirming that we have found a minimum.
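The no-intercept estimator can be checked on the same three observations (a sketch; the function name is mine). Note that the slope differs from the 1.50 obtained when an intercept is included:

```python
def ols_slope_no_intercept(X, Y):
    """b2 = sum(Xi*Yi) / sum(Xi^2) for a line fitted through the origin."""
    return sum(x * y for x, y in zip(X, Y)) / sum(x * x for x in X)

X = [1, 2, 3]
Y = [3, 5, 6]
print(ols_slope_no_intercept(X, Y))  # 31/14 ≈ 2.214
```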

48
Copyright Christopher Dougherty. These slideshows may be downloaded by anyone, anywhere for personal use. Subject to respect for copyright and, where appropriate, attribution, they may be used as a resource for teaching an econometrics course. There is no need to refer to the author. The content of this slideshow comes from Section 1.3 of C. Dougherty, Introduction to Econometrics, fourth edition 2011, Oxford University Press. Additional (free) resources for both students and instructors may be downloaded from the OUP Online Resource Centre. Individuals studying econometrics on their own and who feel that they might benefit from participation in a formal course should consider the London School of Economics summer school course EC212 Introduction to Econometrics or the University of London International Programmes distance learning course 20 Elements of Econometrics.
