
1
Lecture 10, Feb 1, 2006

2
Added-variable plots
Added-variable plots give you a visual sense of whether x2 is a useful addition to the model E(y|x1) = a + b*x1.

3
Steps to making one
1. Regress y on x1; compute the residuals of y on x1: y "-" x1 (remove x1 from y).
2. Regress x2 on x1; compute the residuals of x2 on x1: x2 "-" x1 (remove x1 from x2).
3. Plot y "-" x1 vs. x2 "-" x1, as in the sketch below.
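A minimal R sketch of those steps, assuming a hypothetical data frame dat with columns y, x1, and x2:

res.y  <- resid(lm(y ~ x1, data = dat))    # y "-" x1: remove x1 from y
res.x2 <- resid(lm(x2 ~ x1, data = dat))   # x2 "-" x1: remove x1 from x2
plot(res.x2, res.y,
     xlab = "residuals of x2 on x1",
     ylab = "residuals of y on x1")
abline(lm(res.y ~ res.x2))                 # slope of the added-variable plot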

4
Interpret
If there is a "significant" slope, then x2 is useful. The slope of the added-variable plot is the same as the coefficient B2 you get if you fit E(y|x1, x2) = B0 + B1*x1 + B2*x2.
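To check this numerically (continuing the hypothetical dat example above):

coef(lm(res.y ~ res.x2))[2]                 # slope of the added-variable plot
coef(lm(y ~ x1 + x2, data = dat))["x2"]     # B2 from the full model; the two agree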

5
Significance Tests
Find the slope of an added-variable plot and do a t-test to see if the slope is significant. The value of the t-stat is almost the same as the value of the t-stat for the "full" model. The p-values will differ because the degrees of freedom are different: n-2 for the added-variable slope, n-3 for the full model.
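A quick way to see both t-statistics (same hypothetical dat example):

summary(lm(res.y ~ res.x2))$coefficients           # t-test on the added-variable slope, df = n - 2
summary(lm(y ~ x1 + x2, data = dat))$coefficients  # t-test for x2 in the full model, df = n - 3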

6
Mussel Beds

7
Is density related to food levels?

8
Is density related to human use?

9
If we know the human use level, do we need to know the food level?

10
Added-Variable

11
Summary of added-variable plot
slope = 10.87: an additional unit of food is worth 10.87 mm of thickness, on average, controlling for human use
t = 4.742, p = 1.45e-05

12
Summary of lm(thickness ~ food + human.use)
E(thickness | food, human.use) = 63 + 18.87*food - 6.294*human.use
t_food = 4.70, p = 1.84e-05
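A sketch of that fit in R; the data frame name mussels is assumed here, with columns thickness, food, and human.use:

fit <- lm(thickness ~ food + human.use, data = mussels)  # mussels is a hypothetical data frame
summary(fit)    # coefficients, t-statistics, and p-values for food and human.use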

13
Testing one variable: Plan #1
H0: Beta1 = 0; Beta0, Beta2, Beta3, etc. "arbitrary"
Ha: Beta1 <> 0; others arbitrary
Fit the full model: y = B0 + B1*x1 + B2*x2 + B3*x3 + ...
Fit the reduced model: y = B0 + B2*x2 + B3*x3 + ...
Compare RSS

14
partial F-test
Compare RSS(full) with RSS(reduced).
Note: RSS(reduced) will be larger than RSS(full).
F = ( (RSS(reduced) - RSS(full)) / 1 ) / ( RSS(full) / (n - p) )
Note: The denominator is just the MSE of the full model (the estimate of sigma^2).

15
Formula for partial F
F = (reduction in RSS) / DF_num, divided by RSS(full) / DF_den.
The DF of a model is n minus the number of parameters estimated.
DF_num = DF(reduced) - DF(full)
DF_den = DF(full) = n - p

16
In R (long)
full <- lm(y ~ x1 + x2 + x3 + x4)
red <- lm(y ~ x2 + x3 + x4)
anova(full); anova(red)
Compute the partial F by hand from the output, as in the sketch below.
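A minimal sketch of that hand computation from the two fits above (deviance() returns a model's RSS):

rss.full <- deviance(full)                  # RSS of the full model
rss.red  <- deviance(red)                   # RSS of the reduced model
df.den   <- df.residual(full)               # n - p for the full model
F.stat   <- ((rss.red - rss.full) / 1) / (rss.full / df.den)
pf(F.stat, 1, df.den, lower.tail = FALSE)   # p-value for the partial F-test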

17
In R (short)
full <- lm(y ~ x1 + x2 + x3)
anova(full)
Read the output.
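An even shorter route, assuming the reduced fit red from the previous slide, is to pass both fits to anova(), which prints the partial F-test directly:

anova(red, full)   # change in RSS, its df, the F statistic, and the p-value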

18
Plan #2: t^2 = F
full <- lm(y ~ x1 + x2 + x3)
summary(full)
Look at the t-statistic.
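To verify the t^2 = F identity numerically (using the hypothetical fit full above and testing x1):

t.x1 <- summary(full)$coefficients["x1", "t value"]
t.x1^2    # equals the partial F for dropping x1 from the full model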
