Econometrics I
Professor William Greene, Stern School of Business, Department of Economics
Part 10: Prediction


Forecasting
Objective: forecasting.
Distinction: ex post vs. ex ante forecasting.
  Ex post: the RHS data are observed.
  Ex ante: the RHS data must themselves be forecasted.
Prediction vs. model validation: within-sample prediction vs. prediction for a "hold-out sample."

Prediction Intervals
Given x0, predict y0. Two cases:
  Estimate E[y0 | x0] = x0'β, or predict the outcome y0 = x0'β + ε0.
The obvious predictor is b'x0 plus an estimate of ε0: forecast ε0 as 0, but allow for its variance.
Alternatively: when we predict y0 with b'x0, what is the forecast error? Since
  ŷ0 − y0 = b'x0 − x0'β − ε0 = x0'(b − β) − ε0,
the variance of the forecast error is x0' Var[b] x0 + σ².
How do we estimate this, and how do we form a confidence interval? Two cases:
  If x0 is a vector of constants, the variance is just x0' Var[b] x0 (plus σ² when predicting y0 itself). Form the confidence interval as usual.
  If x0 itself had to be estimated, the prediction involves a product of random variables. What is the variance of that product? (Ouch!) One possibility: use bootstrapping.
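The fixed-x0 case above can be sketched numerically. This is a minimal illustration on simulated data (not the course data), using only NumPy: estimate b and s², form the forecast-error variance s² + x0' Var[b] x0, and build the usual 95% interval.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for illustration only: y = 1 + 2x + eps
n = 100
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y                 # OLS coefficients
e = y - X @ b
s2 = e @ e / (n - X.shape[1])         # estimate of sigma^2

x0 = np.array([1.0, 1.5])             # fixed point at which to predict
y0_hat = x0 @ b

# Forecast-error variance: s2 + x0' Var[b] x0, with Var[b] = s2 (X'X)^-1
var_forecast = s2 + s2 * (x0 @ XtX_inv @ x0)
se_forecast = np.sqrt(var_forecast)

lower, upper = y0_hat - 1.96 * se_forecast, y0_hat + 1.96 * se_forecast
print(y0_hat, (lower, upper))
```

The extra s² term is what distinguishes a prediction interval for y0 from a confidence interval for E[y0 | x0].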

Forecast Variance
The variance of the forecast error is
  σ² + x0' Var[b] x0 = σ² + σ² [x0'(X'X)⁻¹ x0].
If the model contains a constant term, this can be written in terms of squares and cross products of deviations from the means; in the bivariate case,
  σ² [1 + 1/n + (x0 − x̄)² / Σi (xi − x̄)²].
Interpretation: the forecast variance is smallest in the middle of our "experience" and increases as we move outside it.
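The "smallest in the middle of our experience" interpretation can be checked directly: with a constant in the model, x0'(X'X)⁻¹x0 equals 1/n + (x0 − x̄)²/Σ(xi − x̄)², so the forecast variance is minimized at the sample mean of x. A small simulated check (illustrative data only):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.normal(loc=2.0, size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 0.5 * x + rng.normal(scale=0.3, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
e = y - X @ b
s2 = e @ e / (n - 2)

def forecast_var(x0_scalar):
    """Estimated forecast-error variance at the point x0."""
    x0 = np.array([1.0, x0_scalar])
    return s2 * (1.0 + x0 @ XtX_inv @ x0)

xbar = x.mean()
# Variance is smallest at the sample mean and grows with |x0 - xbar|
print(forecast_var(xbar), forecast_var(xbar + 2.0), forecast_var(xbar + 4.0))
```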

Butterfly Effect

Internet Buzz Data

A Prediction Interval
The usual 95% interval. Its width reflects two sources of uncertainty: variation due to ε, and variation due to estimating α and β with a and b.

Slightly Simpler Formula for Prediction

Prediction from Internet Buzz Regression

Prediction Interval for Buzz = .8

Dummy Variable for One Observation
Consider a dummy variable that isolates a single observation. What does this do? Define d to be the dummy variable in question and Z the matrix of all other regressors, so that X = [Z, d], and run the multiple regression of y on X. We know that X'e = 0, where e is the column vector of residuals. That implies d'e = 0, which says that ej = 0 for that particular observation: the observation is predicted perfectly. This is a fairly important result, and worth remembering.
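The d'e = 0 result is easy to verify numerically. A minimal sketch on simulated data: add a dummy that is 1 only for observation j, run OLS, and confirm that observation's residual is exactly zero.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 30, 3
Z = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = Z @ rng.normal(size=k) + rng.normal(size=n)

j = 7                       # observation to isolate
d = np.zeros(n)
d[j] = 1.0
X = np.column_stack([Z, d])  # X = [Z, d]

b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b

# X'e = 0 implies d'e = 0, which forces e[j] = 0:
# observation j is predicted perfectly
print(e[j])
```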

Oaxaca Decomposition
Two groups, two regression models (two time periods, men vs. women, two countries, etc.):
  y1 = X1 β1 + ε1 and y2 = X2 β2 + ε2.
Consider the mean values,
  y1* = E[y1 | mean x1] = x1*' β1
  y2* = E[y2 | mean x2] = x2*' β2.
Now explain why y1* differs from y2*. (That is, departing from y2, why is y1 different? One could also reverse the roles of groups 1 and 2.)
  y1* − y2* = x1*' β1 − x2*' β2
            = x1*' (β1 − β2) + (x1* − x2*)' β2
            = (change in model) + (change in conditions).
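The algebra of the decomposition can be checked with a few lines of arithmetic. The means and coefficient vectors below are hypothetical numbers chosen for illustration, not estimates from any data set:

```python
import numpy as np

# Hypothetical group means and coefficients (illustration only)
xbar1 = np.array([1.0, 12.0, 0.6])
xbar2 = np.array([1.0, 10.5, 0.4])
b1 = np.array([0.20, 0.05, 0.10])
b2 = np.array([0.15, 0.04, 0.12])

ybar1 = xbar1 @ b1
ybar2 = xbar2 @ b2

coeff_part = xbar1 @ (b1 - b2)   # "change in model" (coefficients)
x_part = (xbar1 - xbar2) @ b2    # "change in conditions" (characteristics)

# The two components sum exactly to the total difference
print(ybar1 - ybar2, coeff_part + x_part)
```

Note that the split is not unique: reversing the roles of groups 1 and 2 weights the coefficient gap by x2* and the characteristics gap by b1 instead.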

The Oaxaca Decomposition

Application - Income
German Health Care Usage Data: 7,293 individuals, varying numbers of periods. Data downloaded from the Journal of Applied Econometrics Archive. This is an unbalanced panel with 7,293 individuals and 27,326 observations in total; the number of observations per individual ranges from 1 to 7 (frequencies: 1=1525, 2=2158, 3=825, 4=926, 5=1051, 6=1000, 7=987). The data can be used for regression, count models, binary choice, ordered choice, and bivariate binary choice. Variables used here:
  HHNINC  = household nominal monthly net income in German marks (4 observations with income = 0 were dropped)
  HHKIDS  = 1 if children under age 16 in the household, 0 otherwise
  EDUC    = years of schooling
  AGE     = age in years
  MARRIED = 1 if married, 0 if not
  FEMALE  = 1 if female, 0 if male

Regression: Female = 0 (Men)

Regression: Female = 1 (Women)

Pooled Regression

Application

namelist ; X = one,age,educ,married,hhkids $
? Get results for females
include ; new ; female = 1 $                          Subsample: females
regr    ; lhs = hhninc ; rhs = x $                    Regression
matrix  ; bf = b ; vf = varb ; xbarf = mean(x) $      Coefficients, variance, mean x
calc    ; meanincf = bf'xbarf $                       Mean prediction for females
? Get results for males
include ; new ; female = 0 $                          Subsample: males
regr    ; lhs = hhninc ; rhs = x $                    Regression
matrix  ; bm = b ; vm = varb ; xbarm = mean(x) $      Coefficients, etc.
calc    ; meanincm = bm'xbarm $                       Mean prediction for males
? Examine the difference in mean predicted income
calc    ; list ; meanincm ; meanincf                  Display means
        ; diff = xbarm'bm - xbarf'bf $                Difference in means
matrix  ; vdiff = xbarm'[vm]xbarm + xbarf'[vf]xbarf $ Variance of the difference
calc    ; list ; diffwald = diff^2 / vdiff $          Wald test of difference = 0
? "Discrimination" component of the difference
matrix  ; db = bm - bf ; discrim = xbarm'db           Difference in coefficients
        ; vdb = vm + vf ; vdiscrim = xbarm'[vdb]xbarm $   Variance of discrimination
calc    ; list ; discrim ; dwald = discrim^2 / vdiscrim $ Wald test that D = 0
? Difference due to the difference in X
matrix  ; dx = xbarm - xbarf $                        Difference in characteristics
matrix  ; qual = dx'bf ; vqual = dx'[vf]dx $          Contribution to the total difference
calc    ; list ; qual ; qualwald = qual^2 / vqual $   Wald test
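For readers without the software used above, the same computation can be sketched in Python. The data here are simulated stand-ins for the two subsamples (the original regresses hhninc on one, age, educ, married, hhkids); the point is the decomposition and the Wald statistics, not the numbers.

```python
import numpy as np

rng = np.random.default_rng(3)

def ols(X, y):
    """Return OLS coefficients and the estimated covariance matrix of b."""
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y
    e = y - X @ b
    s2 = e @ e / (len(y) - X.shape[1])
    return b, s2 * XtX_inv

def make_group(n, beta):
    """Simulated subsample (illustration only, not the German data)."""
    X = np.column_stack([np.ones(n), rng.normal(size=(n, len(beta) - 1))])
    y = X @ beta + rng.normal(scale=0.2, size=n)
    return X, y

Xm, ym = make_group(400, np.array([0.30, 0.05, 0.02]))   # "men"
Xf, yf = make_group(350, np.array([0.25, 0.04, 0.02]))   # "women"

bm, vm = ols(Xm, ym); xbarm = Xm.mean(axis=0)
bf, vf = ols(Xf, yf); xbarf = Xf.mean(axis=0)

diff = xbarm @ bm - xbarf @ bf                  # difference in mean predictions
vdiff = xbarm @ vm @ xbarm + xbarf @ vf @ xbarf
diffwald = diff**2 / vdiff                      # Wald test of difference = 0

db = bm - bf
discrim = xbarm @ db                            # "discrimination" component
vdiscrim = xbarm @ (vm + vf) @ xbarm
dwald = discrim**2 / vdiscrim                   # Wald test that D = 0

dx = xbarm - xbarf
qual = dx @ bf                                  # difference due to difference in X
qualwald = qual**2 / (dx @ vf @ dx)

print(diff, diffwald, discrim, dwald, qual, qualwald)
```

As in the slide 13 algebra, the two components sum exactly to the total difference in mean predictions.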

Results
Listed calculator results for MEANINCM, MEANINCF, DIFF, DIFFWALD, DISCRIM, DWALD, QUAL, and QUALWALD.

Decompositions
