Presentation on theme: "Christopher Dougherty EC220 - Introduction to econometrics (chapter 10) Slideshow: binary choice logit models Original citation: Dougherty, C. (2012) EC220."— Presentation transcript:

1 Christopher Dougherty EC220 - Introduction to econometrics (chapter 10) Slideshow: binary choice logit models. Original citation: Dougherty, C. (2012) EC220 - Introduction to econometrics (chapter 10). [Teaching Resource] © 2012 The Author. This version available at: http://learningresources.lse.ac.uk/136/ Available in LSE Learning Resources Online: May 2012. This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 License. This license allows the user to remix, tweak, and build upon the work even for commercial purposes, as long as the user credits the author and licenses their new creations under the identical terms. http://creativecommons.org/licenses/by-sa/3.0/

2 1 BINARY CHOICE MODELS: LOGIT ANALYSIS The linear probability model may make the nonsense predictions that an event will occur with probability greater than 1 or less than 0. [Figure: the linear probability model, with the fitted line β1 + β2Xi plotted against Xi and the probability scale from 0 to 1 on the vertical axis (Y, p); the quantities β1 + β2Xi and 1 – β1 – β2Xi are marked at points A and B.]

3 The usual way of avoiding this problem is to hypothesize that the probability is a sigmoid (S-shaped) function of Z, F(Z), where Z is a function of the explanatory variables. BINARY CHOICE MODELS: LOGIT ANALYSIS 2

4 Several mathematical functions are sigmoid in character. One is the logistic function shown here. As Z goes to infinity, e^(-Z) goes to 0 and p goes to 1 (but cannot exceed 1). As Z goes to minus infinity, e^(-Z) goes to infinity and p goes to 0 (but cannot fall below 0). BINARY CHOICE MODELS: LOGIT ANALYSIS 3
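For reference, the logistic function that appeared as an image on this slide is

p = F(Z) = \frac{1}{1 + e^{-Z}}

The two limits can be checked numerically; the following do-file fragment is an illustrative addition to the transcript, not part of the original slides:

// F(Z) for a large positive and a large negative value of Z
display 1/(1 + exp(-10))    // very close to 1
display 1/(1 + exp(10))     // very close to 0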

5 The model implies that, for values of Z less than –2, the probability of the event occurring is low and insensitive to variations in Z. Likewise, for values greater than 2, the probability is high and insensitive to variations in Z. BINARY CHOICE MODELS: LOGIT ANALYSIS 4

6 To obtain an expression for the sensitivity, we differentiate F(Z) with respect to Z. The box gives the general rule for differentiating a quotient and applies it to F(Z). BINARY CHOICE MODELS: LOGIT ANALYSIS 5
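The box on the original slide is an image; the quotient-rule calculation it summarizes is reconstructed here:

f(Z) = \frac{dF}{dZ}
     = \frac{d}{dZ}\left(\frac{1}{1+e^{-Z}}\right)
     = \frac{0\cdot\left(1+e^{-Z}\right) - 1\cdot\left(-e^{-Z}\right)}{\left(1+e^{-Z}\right)^{2}}
     = \frac{e^{-Z}}{\left(1+e^{-Z}\right)^{2}}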

7 The sensitivity, as measured by the slope, is greatest when Z is 0. The marginal function, f(Z), reaches a maximum at this point. 6
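For completeness (this value is not stated on the slide), the maximum of the marginal function is

f(0) = \frac{e^{0}}{\left(1+e^{0}\right)^{2}} = \frac{1}{4}

so the slope of the probability function never exceeds 0.25.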

8 For a nonlinear model of this kind, maximum likelihood estimation is much superior to the use of the least squares principle for estimating the parameters. More details concerning its application are given at the end of this sequence. BINARY CHOICE MODELS: LOGIT ANALYSIS 7

9 We will apply this model to the graduating from high school example described in the linear probability model sequence. We will begin by assuming that ASVABC is the only relevant explanatory variable, so Z is a simple function of it. BINARY CHOICE MODELS: LOGIT ANALYSIS 8

10 BINARY CHOICE MODELS: LOGIT ANALYSIS

. logit GRAD ASVABC

Iteration 0:  Log Likelihood = -162.29468
Iteration 1:  Log Likelihood = -132.97646
Iteration 2:  Log Likelihood = -117.99291
Iteration 3:  Log Likelihood = -117.36084
Iteration 4:  Log Likelihood = -117.35136
Iteration 5:  Log Likelihood = -117.35135

Logit Estimates                                   Number of obs =        570
                                                  chi2(1)       =      89.89
                                                  Prob > chi2   =     0.0000
Log Likelihood = -117.35135                       Pseudo R2     =     0.2769

------------------------------------------------------------------------------
    grad |      Coef.   Std. Err.        z     P>|z|      [95% Conf. Interval]
---------+--------------------------------------------------------------------
  asvabc |   .1666022   .0211265      7.886    0.000      .1251951    .2080094
   _cons |  -5.003779   .8649213     -5.785    0.000     -6.698993   -3.308564
------------------------------------------------------------------------------

The Stata command is logit, followed by the outcome variable and the explanatory variable(s). Maximum likelihood estimation is an iterative process, so the first part of the output will be like that shown. 9

11 BINARY CHOICE MODELS: LOGIT ANALYSIS

. logit GRAD ASVABC

Iteration 0:  log likelihood = -118.67769
Iteration 1:  log likelihood = -104.45292
Iteration 2:  log likelihood = -97.135677
Iteration 3:  log likelihood = -96.887294
Iteration 4:  log likelihood = -96.886017

Logit estimates                                   Number of obs =        540
                                                  LR chi2(1)    =      43.58
                                                  Prob > chi2   =     0.0000
Log likelihood = -96.886017                       Pseudo R2     =     0.1836

------------------------------------------------------------------------------
        GRAD |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      ASVABC |   .1313626    .022428     5.86   0.000     .0874045    .1753206
       _cons |  -3.240218   .9444844    -3.43   0.001    -5.091373   -1.389063
------------------------------------------------------------------------------

In this case the coefficients of the Z function are as shown. 10
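Although the coefficients themselves have no direct intuitive interpretation (as a later slide notes), they can be used to compute fitted probabilities. The fragment below is an illustrative sketch added to the transcript, not part of the slides; the score of 50 and the scalar name z50 are arbitrary choices:

logit GRAD ASVABC

// fitted Z and the implied probability at an illustrative ASVABC score of 50
scalar z50 = _b[_cons] + _b[ASVABC]*50
display "Z at ASVABC = 50:  " z50                                     // about 3.33 with these estimates
display "estimated probability of graduating:  " 1/(1 + exp(-z50))    // about 0.97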

12 Since there is only one explanatory variable, we can draw the probability function and marginal effect function as functions of ASVABC. BINARY CHOICE MODELS: LOGIT ANALYSIS 11
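A sketch of how the two curves could be drawn from the fitted coefficients using twoway function; this is an added illustration to be run from a do-file, not the method used to produce the original figure, and the plotting range 20 to 70 simply brackets the observed ASVABC scores:

logit GRAD ASVABC
scalar b1 = _b[_cons]
scalar b2 = _b[ASVABC]

// probability F(Z) on the left axis, marginal effect f(Z)*b2 on the right axis
twoway (function y = 1/(1 + exp(-(scalar(b1) + scalar(b2)*x))), range(20 70)) ///
       (function y = scalar(b2)*exp(-(scalar(b1) + scalar(b2)*x)) / ///
                     (1 + exp(-(scalar(b1) + scalar(b2)*x)))^2, range(20 70) yaxis(2)), ///
       xtitle(ASVABC) legend(order(1 "probability" 2 "marginal effect"))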

13 BINARY CHOICE MODELS: LOGIT ANALYSIS We see that ASVABC has its greatest effect on graduating when it is below 40, that is, in the lower ability range. Any individual with a score above the average (50) is almost certain to graduate. 12

14 BINARY CHOICE MODELS: LOGIT ANALYSIS The t statistic indicates that the effect of variations in ASVABC on the probability of graduating from high school is highly significant. [Logit output as shown above.] 13

15 BINARY CHOICE MODELS: LOGIT ANALYSIS Strictly speaking, the t statistic is valid only for large samples, so the normal distribution is the reference distribution. For this reason the statistic is denoted z in the Stata output. This z has nothing to do with our Z function. [Logit output as shown above.] 14

16 BINARY CHOICE MODELS: LOGIT ANALYSIS The coefficients of the Z function do not have any direct intuitive interpretation. 15

17 However, we can use them to quantify the marginal effect of a change in ASVABC on the probability of graduating. We will do this theoretically for the general case where Z is a function of several explanatory variables. BINARY CHOICE MODELS: LOGIT ANALYSIS 16

18 Since p is a function of Z, and Z is a function of the X variables, the marginal effect of Xi on p can be written as the product of the marginal effect of Z on p and the marginal effect of Xi on Z. BINARY CHOICE MODELS: LOGIT ANALYSIS 17

19 We have already derived an expression for dp/dZ. The marginal effect of Xi on Z is given by its β coefficient. BINARY CHOICE MODELS: LOGIT ANALYSIS 18

20 Hence we obtain an expression for the marginal effect of Xi on p. BINARY CHOICE MODELS: LOGIT ANALYSIS 19
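Written out, the expression built up over the last three slides (shown as images in the original) is

\frac{\partial p}{\partial X_i} \;=\; \frac{dp}{dZ}\,\frac{\partial Z}{\partial X_i} \;=\; f(Z)\,\beta_i \;=\; \frac{e^{-Z}}{\left(1+e^{-Z}\right)^{2}}\,\beta_i ,
\qquad Z = \beta_1 + \beta_2 X_2 + \cdots + \beta_k X_k .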

21 The marginal effect is not constant because it depends on the value of Z, which in turn depends on the values of the explanatory variables. A common procedure is to evaluate it for the sample means of the explanatory variables. BINARY CHOICE MODELS: LOGIT ANALYSIS 20

22 BINARY CHOICE MODELS: LOGIT ANALYSIS The sample mean of ASVABC in this sample is 51.36.

. sum GRAD ASVABC

    Variable |     Obs        Mean    Std. Dev.        Min         Max
-------------+--------------------------------------------------------
        GRAD |     540    .9425926    .2328351          0           1
      ASVABC |     540    51.36271    9.567646   25.45931    66.07963

[Logit output as shown above.] 21

23 BINARY CHOICE MODELS: LOGIT ANALYSIS When evaluated at the mean, Z is equal to 3.507. [Summary statistics and logit output as shown above.] 22

24 BINARY CHOICE MODELS: LOGIT ANALYSIS e^(-Z) is 0.030. Hence f(Z) is 0.028. [Summary statistics as shown above.] 23

25 BINARY CHOICE MODELS: LOGIT ANALYSIS The marginal effect, evaluated at the mean, is therefore 0.004. This implies that a one point increase in ASVABC would increase the probability of graduating from high school by 0.4 percent. [Summary statistics as shown above.] 24
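The arithmetic on the last four slides can be reproduced after the logit command. The do-file fragment below is an illustrative reconstruction (the scalar names zbar and fz are ours); in more recent versions of Stata the margins command gives the same figure directly:

logit GRAD ASVABC
quietly summarize ASVABC

// Z at the sample mean of ASVABC, then f(Z) and the marginal effect
scalar zbar = _b[_cons] + _b[ASVABC]*r(mean)               // about 3.507
scalar fz   = exp(-zbar)/(1 + exp(-zbar))^2                // about 0.028
display "marginal effect at the mean:  " fz*_b[ASVABC]     // about 0.004

// built-in alternative (Stata 11 or later)
margins, dydx(ASVABC) atmeans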

26 BINARY CHOICE MODELS: LOGIT ANALYSIS In this example, the marginal effect at the mean of ASVABC is very low. The reason is that anyone with an average score (51.36) is almost certain to graduate anyway. So an increase in the score has little effect. 25

27 BINARY CHOICE MODELS: LOGIT ANALYSIS To show that the marginal effect varies, we will also calculate it for ASVABC equal to 30. A one point increase in ASVABC then increases the probability by 2.9 percent. [Summary statistics as shown above.] 26
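Continuing the same illustrative sketch at a score of 30 (with the logit estimates above still in memory; again an addition to the transcript, not the original slide's calculation):

scalar z30 = _b[_cons] + _b[ASVABC]*30
display "probability at ASVABC = 30:      " 1/(1 + exp(-z30))                        // about 0.67
display "marginal effect at ASVABC = 30:  " _b[ASVABC]*exp(-z30)/(1 + exp(-z30))^2   // about 0.029

// or, with the built-in command
margins, dydx(ASVABC) at(ASVABC=30)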

28 An individual with a score of 30 has only a 67 percent probability of graduating, and an increase in the score has a relatively large impact. BINARY CHOICE MODELS: LOGIT ANALYSIS 27

29 BINARY CHOICE MODELS: LOGIT ANALYSIS Here is the output for a model with a somewhat better specification.

. logit GRAD ASVABC SM SF MALE

Iteration 0:  log likelihood = -118.67769
Iteration 1:  log likelihood = -104.73493
Iteration 2:  log likelihood = -97.080528
Iteration 3:  log likelihood = -96.806623
Iteration 4:  log likelihood = -96.804845
Iteration 5:  log likelihood = -96.804844

Logit estimates                                   Number of obs =        540
                                                  LR chi2(4)    =      43.75
                                                  Prob > chi2   =     0.0000
Log likelihood = -96.804844                       Pseudo R2     =     0.1843

------------------------------------------------------------------------------
        GRAD |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      ASVABC |   .1329127   .0245718     5.41   0.000     .0847528    .1810726
          SM |   -.023178   .0868122    -0.27   0.789    -.1933267    .1469708
          SF |   .0122663   .0718876     0.17   0.865    -.1286307    .1531634
        MALE |   .1279654   .3989345     0.32   0.748    -.6539318    .9098627
       _cons |  -3.252373   1.065524    -3.05   0.002    -5.340761   -1.163985
------------------------------------------------------------------------------

28

30 BINARY CHOICE MODELS: LOGIT ANALYSIS We will estimate the marginal effects, putting all the explanatory variables equal to their sample means.

. sum GRAD ASVABC SM SF MALE

    Variable |     Obs        Mean    Std. Dev.        Min         Max
-------------+--------------------------------------------------------
        GRAD |     540    .9425926    .2328351          0           1
      ASVABC |     540    51.36271    9.567646   25.45931    66.07963
          SM |     540    11.57963    2.816456          0          20
          SF |     540    11.83704     3.53715          0          20
        MALE |     540          .5    .5004636          0           1

29

31 BINARY CHOICE MODELS: LOGIT ANALYSIS The first step is to calculate Z, when the X variables are equal to their sample means.

Logit: Marginal Effects

              mean        b    product     f(Z)    f(Z)b
ASVABC       51.36    0.133      6.826    0.028    0.004
SM           11.58   -0.023     -0.269    0.028   -0.001
SF           11.84    0.012      0.146    0.028    0.000
MALE          0.50    0.128      0.064    0.028    0.004
Constant      1.00   -3.252     -3.252
Total                            3.514

30

32 BINARY CHOICE MODELS: LOGIT ANALYSIS We then calculate f(Z). [Marginal effects table as shown above.] 31

33 BINARY CHOICE MODELS: LOGIT ANALYSIS The estimated marginal effects are f(Z) multiplied by the respective coefficients. We see that the effect of ASVABC is about the same as before. Mother's schooling has a negligible effect and father's schooling has no discernible effect at all. [Marginal effects table as shown above.] 32

34 BINARY CHOICE MODELS: LOGIT ANALYSIS Males have a 0.4 percent higher probability of graduating than females. These effects would all have been larger if they had been evaluated at a lower ASVABC score. [Marginal effects table as shown above.] 33
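The same at-the-means marginal effects can be obtained with the margins command in Stata 11 or later. This is offered as a cross-check on the hand calculation in the table, not as the method used in the slides. Note that MALE is entered as an ordinary regressor, so margins treats it as continuous, matching the f(Z)b calculation above; declaring it as a factor variable would give a discrete 0-to-1 change instead.

logit GRAD ASVABC SM SF MALE
margins, dydx(ASVABC SM SF MALE) atmeans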

35 BINARY CHOICE MODELS: LOGIT ANALYSIS This sequence will conclude with an outline explanation of how the model is fitted using maximum likelihood estimation. Individuals who graduated: outcome probability is F(Zi). 34

36 BINARY CHOICE MODELS: LOGIT ANALYSIS In the case of an individual who graduated, the probability of that outcome is F(Z). We will give subscripts 1, ..., s to the individuals who graduated. Individuals who graduated: outcome probability is F(Zi), i = 1, ..., s. 35

37 BINARY CHOICE MODELS: LOGIT ANALYSIS In the case of an individual who did not graduate, the probability of that outcome is 1 – F(Z). We will give subscripts s+1, ..., n to these individuals. Individuals who graduated: outcome probability is F(Zi), i = 1, ..., s. Individuals who did not graduate: outcome probability is 1 – F(Zi), i = s+1, ..., n. 36

38 BINARY CHOICE MODELS: LOGIT ANALYSIS

Maximize  F(Z1) × ... × F(Zs) × [1 – F(Zs+1)] × ... × [1 – F(Zn)]
          (did graduate)          (did not graduate)

We choose b1 and b2 so as to maximize the joint probability of the outcomes, that is, F(Z1) × ... × F(Zs) × [1 – F(Zs+1)] × ... × [1 – F(Zn)]. There are no closed-form formulae for b1 and b2; they have to be determined iteratively by a trial-and-error process. 37
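In logarithms, the function being maximized is the log-likelihood

\log L(b_1, b_2) \;=\; \sum_{i=1}^{s}\log F(Z_i) \;+\; \sum_{i=s+1}^{n}\log\bigl[1 - F(Z_i)\bigr], \qquad Z_i = b_1 + b_2 X_i .

The iterative search is what the logit command carries out internally. Purely as an illustration of the idea (this is an addition to the transcript, and the evaluator name logit_ll is our own), the same estimates can be reproduced with Stata's ml routines:

// hand-coded binary logit log-likelihood, method lf
program define logit_ll
    args lnf xb
    quietly replace `lnf' = ln(invlogit(`xb'))      if $ML_y1 == 1
    quietly replace `lnf' = ln(1 - invlogit(`xb'))  if $ML_y1 == 0
end

ml model lf logit_ll (GRAD = ASVABC)
ml maximize    // should reproduce the coefficients reported by logit GRAD ASVABC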

39 Copyright Christopher Dougherty 2011. These slideshows may be downloaded by anyone, anywhere for personal use. Subject to respect for copyright and, where appropriate, attribution, they may be used as a resource for teaching an econometrics course. There is no need to refer to the author. The content of this slideshow comes from Section 10.2 of C. Dougherty, Introduction to Econometrics, fourth edition 2011, Oxford University Press. Additional (free) resources for both students and instructors may be downloaded from the OUP Online Resource Centre http://www.oup.com/uk/orc/bin/9780199567089/. Individuals studying econometrics on their own and who feel that they might benefit from participation in a formal course should consider the London School of Economics summer school course EC212 Introduction to Econometrics http://www2.lse.ac.uk/study/summerSchools/summerSchool/Home.aspx or the University of London International Programmes distance learning course 20 Elements of Econometrics www.londoninternational.ac.uk/lse.

