Presentation on theme: "Prediction/Regression"— Presentation transcript:

1 Prediction/Regression
Chapter 12 Prediction/Regression Part 3: Apr. 22, 2008

2 Multiple Regression Bivariate prediction – 1 predictor, 1 criterion
Multiple regression – use multiple predictors
Reg model/equation is the same; we just use a separate reg coefficient (b) for each predictor
Ex) multiple regression formula with three predictor variables: Ŷ (predicted Y) = a + (b1)(X1) + (b2)(X2) + (b3)(X3)
a is still the regression constant (where the reg line crosses the y axis)
b1 is the regression coefficient for X1, b2 is the regression coefficient for X2, etc…
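A minimal Python sketch of the same idea (not from the slides; statsmodels and the made-up data below are assumptions standing in for whatever data set the course used):

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical data: three predictors (X1-X3) and one criterion (Y)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))                 # columns play the role of X1, X2, X3
    y = 2 + 0.5*X[:, 0] - 0.3*X[:, 1] + 0.8*X[:, 2] + rng.normal(size=50)

    X_const = sm.add_constant(X)                 # adds the column that carries the constant a
    model = sm.OLS(y, X_const).fit()
    print(model.params)                          # [a, b1, b2, b3] -- unstandardized coefficients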

3 Standardized regression coefficients
With bivariate regression, we discussed finding the slope of the reg line, b. That was an unstandardized regression coefficient (based on the original scale of measurement).
If the variable was measured on a 1-8 scale, that would be the scale for b as well.
But many times we're interested in comparing our regression results to other researchers'. They may have measured the same variables but used different measures (maybe a 1-20 scale).
Standardized regression coefficients (β or beta) will let us compare (more generalizable)

4 Using standardized coefficients (betas)
There is a formula for changing b into β in the chapter, but you won't be asked to use it.
So the regression equation (model) would look like this if we use standardized regression coefficients (β): predicted Zy = (β1)(Zx1) + (β2)(Zx2) + (β3)(Zx3)
(All scores are in z-score form, so there is no separate regression constant.)
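Continuing the hypothetical data and fitted model from the sketch on slide 2, one common conversion (not necessarily the chapter's exact formula) is β = b × SDx / SDy, which matches what you get by re-fitting on z-scores:

    # Convert each unstandardized b into a standardized beta
    b = model.params[1:]                               # drop the constant a
    betas = b * X.std(axis=0, ddof=1) / y.std(ddof=1)

    # Check: z-score everything and refit -- the slopes are the betas, and the constant is ~0
    Zx = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    Zy = (y - y.mean()) / y.std(ddof=1)
    z_model = sm.OLS(Zy, sm.add_constant(Zx)).fit()
    print(betas)
    print(z_model.params[1:])                          # same values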

5 Overlap among predictors
Common for there to be correlation among predictor variables
β gives us the unique contribution of each variable
β1 gives the unique contribution of X1 in predicting Y, excluding any overlap w/ other predictors
R2 gives the % variance in Y explained by all of the predictors together
There will be a significance test for R2 in SPSS to determine whether the entire regression model explains significant variance in Y.
If yes → then examine the individual predictors' βs – there will be a signif test for each of these. Is each predictor important, or only some of them?
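For the fitted model from the earlier sketch, the R2 and its overall significance test can be read off like this (illustrative statsmodels attributes, not SPSS output):

    print(model.rsquared)    # R2: % variance in Y explained by all predictors together
    print(model.f_pvalue)    # significance test for R2 (the whole regression model)
    # If this test is significant, go on to examine each predictor's beta (slides 6-7)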

6 Interpreting beta In general, can interpret it like a correlation between your predictor and criterion:
If β is positive, higher scores on the predictor (x) are related to higher scores on the criterion (y)
If β is negative, higher scores on x go with lower scores on y.
More specifically, β gives us the predicted amount of change (in SD units) in the criterion for every 1 SD increase in the predictor
Example…
In bivariate regression (1 predictor), β is equal to the correlation betw the predictor & criterion (r)
But this doesn't work when we use more than 1 predictor
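A quick check of that last point with the hypothetical data from slide 2 (illustrative only): with a single predictor the standardized slope equals Pearson's r, which no longer holds once more predictors are added:

    import numpy as np
    import statsmodels.api as sm

    x1 = X[:, 0]                                   # use just one predictor
    r = np.corrcoef(x1, y)[0, 1]                   # correlation between x1 and y
    zx1 = (x1 - x1.mean()) / x1.std(ddof=1)
    zy = (y - y.mean()) / y.std(ddof=1)
    beta1 = sm.OLS(zy, sm.add_constant(zx1)).fit().params[1]
    print(r, beta1)                                # same value in the bivariate case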

7 Hypothesis tests for regression
We are usually interested in multiple issues:
Is the β significantly different from 0? (similar to the hyp test for correlation – is there any relationship?)
If β = 0, then knowing someone's score on x (predictor) tells us nothing about their score on y (criterion)…we can't predict y from x.
In multiple regression, we may be interested in which predictor is the best (has the strongest relationship to the criterion)
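In statsmodels terms (continuing the same fitted model; this mirrors, but is not, the SPSS output), the test of whether each coefficient differs from 0 is the usual t = b / SE(b):

    print(model.params / model.bse)   # t statistics (identical to model.tvalues)
    print(model.pvalues)              # two-tailed p-values; p < .05 -> that predictor significantly predicts y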

8 Mult Reg (cont.) How to judge the relative importance of each predictor variable in predicting the criterion?
Consider both the rs and the βs
Not necessarily the same rank order of magnitude for the rs and the βs, so check both.
βs indicate the unique relationship betw a predictor and the criterion, controlling for the other predictors
rs indicate the general relationship betw x & y (includes effects of the other predictors)
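With the same hypothetical data, the two sets of numbers the slide asks you to compare can be printed side by side (the rs come straight from the raw variables; the βs come from the z-scored fit sketched under slide 4):

    import numpy as np

    rs = [np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])]
    print(np.round(rs, 3))                   # zero-order r for each predictor (overlap included)
    print(np.round(z_model.params[1:], 3))   # beta for each predictor (unique contribution)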

9 Extensions of Multiple Reg
We've discussed the simplest version of multiple regression, where all predictors are entered into the equation at the same time.
Another option: Hierarchical mult reg – enter X1 at step 1, enter X2 at step 2, enter X3 at step 3, and examine the changes in the equation at each step (see the sketch below)
How does R2 change at each step?
What happens to the betas for each variable when others are introduced into the equation?
When might you use hierarchical regression?
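A rough sketch of the hierarchical approach, again using the made-up data from slide 2 and entering one predictor per step:

    import statsmodels.api as sm

    r2 = []
    for k in (1, 2, 3):                                   # step 1: X1; step 2: X1+X2; step 3: X1+X2+X3
        step = sm.OLS(y, sm.add_constant(X[:, :k])).fit()
        r2.append(step.rsquared)
    print(r2)                                             # R2 after each step
    print([r2[0]] + [r2[i] - r2[i-1] for i in (1, 2)])    # change in R2 at each step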

10 Prediction in Research Articles
Bivariate prediction models are rarely reported
Multiple regression results are commonly reported
Note the example table in the book: it reports the rs and βs for each predictor, and reports R2 in a note at the bottom.

11 Reporting mult. regression
From the previous table…
The multiple regression equation was significant, R2 = .13, p <
Depression (β = .30, p < .001) and age (β = .20, p < .001) both significantly predicted the intragroup effect, but number of sessions and duration of the disorder were not significant predictors.
This indicates that older adults and those with higher levels of depression had higher (better) intragroup effects.

12 SPSS Reg Example Analyze → Regression → Linear
Note the terms used in SPSS:
"Independent Variable"…this is x (the predictor)
"Dependent Variable"…this is y (the criterion)
Class handout of output – what to look for:
"Model Summary" section – shows R2
ANOVA section – 1st line gives the 'sig value'; if < .05 → signif
This tests the significance of R2 (is the whole regression equation significant or not? If yes → it does predict y)
Coefficients section – 1st line gives the 'constant' = a
Other lines give the 'standardized coefficients' = β (beta) for each predictor
For each predictor there is also a significance test (if 'sig' is < .05, that predictor is significantly different from 0 and does predict y)
If it is significant, you'd want to interpret the beta (like a correlation)
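If you want to reproduce the handout's structure outside SPSS, the statsmodels fit from the earlier sketches prints roughly the same pieces in one place (an analogue, not the SPSS output itself):

    # summary() shows R2 (cf. "Model Summary"), the F test (cf. "ANOVA"),
    # and the constant plus each predictor's coefficient and sig test (cf. "Coefficients")
    print(model.summary())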

