Presentation on theme: "Overview of Techniques" — Presentation transcript:

1 Overview of Techniques: Case 1
Independent variable is groups or conditions; dependent variable is continuous (y).
One sample: z-test or t-test
Two samples: t-test (independent or paired)
Three or more samples: one-way ANOVA F-test
Factorial design: two-way ANOVA F-test
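For reference, here is a minimal sketch of how each of these Case 1 tests might be run in Python with scipy and statsmodels. The data are simulated and the variable names are made up for illustration.

```python
# Hypothetical illustration: each Case 1 test run on simulated data.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)

# One sample: t-test against a hypothesized mean of 100
y = rng.normal(103, 15, size=30)
print(stats.ttest_1samp(y, popmean=100))

# Two samples: independent and paired t-tests
g1, g2 = rng.normal(0, 1, 25), rng.normal(0.5, 1, 25)
print(stats.ttest_ind(g1, g2))
print(stats.ttest_rel(g1, g2))

# Three (or more) samples: one-way ANOVA F-test
g3 = rng.normal(1.0, 1, 25)
print(stats.f_oneway(g1, g2, g3))

# Factorial design: two-way ANOVA F-tests via a linear model
df = pd.DataFrame({
    "dv": rng.normal(0, 1, 40),
    "a": np.repeat(["a1", "a2"], 20),
    "b": np.tile(np.repeat(["b1", "b2"], 10), 2),
})
model = smf.ols("dv ~ C(a) * C(b)", data=df).fit()
print(anova_lm(model, typ=2))
```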

2 Overview of Techniques: Case 2
Independent variable is continuous (x); dependent variable is continuous (y).
One DV, one predictor: correlation, simple linear regression
One DV, multiple predictors: partial correlation, multiple correlation, multiple regression

3 What if we have two predictor variables?
We want to predict depression, and we have measured stress and loneliness. We can ask several questions:
1) Which is the stronger predictor?
2) How well do they predict depression together?
3) What is the effect of loneliness on depression, controlling for stress?

4 What if we have two predictor variables?

Regressing depression on stress (R² = .0625):
Predictor    Unstandardized Coefficient    Standard Error    Standardized Coefficient    t        sig
Stress       .50                           .16               .25                         3.125    <.05

Regressing depression on loneliness (R² = .04):
Predictor    Unstandardized Coefficient    Standard Error    Standardized Coefficient    t        sig
Loneliness   .80                           .28               .20                         2.86     <.05

Which is the better predictor? How well do they predict depression together?
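A small sketch of these two separate simple regressions in Python with statsmodels. The slide's raw scores are not given, so the data below are simulated and the numbers will differ; the point is that in simple regression the standardized coefficient equals Pearson's r and R² equals r².

```python
# Hypothetical sketch: two separate simple regressions on simulated data
# (the slide's raw stress/loneliness/depression scores are not given).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 150
stress = rng.normal(0, 1, n)
loneliness = 0.5 * stress + rng.normal(0, 1, n)   # correlated predictors
depression = 0.4 * stress + 0.3 * loneliness + rng.normal(0, 1, n)
df = pd.DataFrame(dict(stress=stress, loneliness=loneliness, depression=depression))

for predictor in ["stress", "loneliness"]:
    fit = smf.ols(f"depression ~ {predictor}", data=df).fit()
    b = fit.params[predictor]
    r = df["depression"].corr(df[predictor])
    # In simple regression the standardized slope equals Pearson's r,
    # and R-squared equals r squared.
    print(predictor, "b =", round(b, 3), "beta = r =", round(r, 3),
          "R2 =", round(fit.rsquared, 4))
```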

5 Multiple Correlation

6 How well do they predict depression together?
[Venn diagram: the overlap between the depression and loneliness circles is R² for loneliness.]

7 How well do they predict depression together?
[Venn diagram: the overlap between the depression and stress circles is R² for stress.]

8 How well do they predict depression together?
[Venn diagram: depression overlaps loneliness alone in region (a), both loneliness and stress in region (b), and stress alone in region (c).]
Multiple R²: (a) + (b) + (c)
Pearson's R² for loneliness: (a) + (b)
Pearson's R² for stress: (c) + (b)
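A hypothetical sketch of this decomposition with simulated, correlated predictors: the multiple R², regions (a) + (b) + (c), is typically smaller than the sum of the two Pearson r²'s, because the shared region (b) gets counted twice in that sum.

```python
# Hypothetical sketch: with correlated predictors, the multiple R-squared
# (regions a + b + c) is typically smaller than the sum of the two Pearson
# r-squareds, because the shared region (b) is counted twice in that sum.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 200
stress = rng.normal(0, 1, n)
loneliness = 0.6 * stress + rng.normal(0, 1, n)
depression = 0.4 * stress + 0.3 * loneliness + rng.normal(0, 1, n)
df = pd.DataFrame(dict(stress=stress, loneliness=loneliness, depression=depression))

r2_stress = df["depression"].corr(df["stress"]) ** 2        # regions (c) + (b)
r2_lonely = df["depression"].corr(df["loneliness"]) ** 2    # regions (a) + (b)
multiple_r2 = smf.ols("depression ~ stress + loneliness",
                      data=df).fit().rsquared               # regions (a) + (b) + (c)

print("r2(stress) + r2(loneliness) =", round(r2_stress + r2_lonely, 3))
print("multiple R2                 =", round(multiple_r2, 3))  # smaller here: (b) double-counted above
```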

9 Partial Correlation

10 What is the effect of loneliness controlling for stress?
[Venn diagram: regions (a), (b), and (c) as on slide 8.]
Pearson's R² for loneliness: (a) + (b)
Pearson's R² for stress: (c) + (b)
Partial R² for loneliness: (a)
Partial R² for stress: (c)
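One common way to obtain a partial correlation is to residualize: regress both depression and loneliness on stress, then correlate what is left over. A minimal sketch on simulated data (the variable values are made up for illustration):

```python
# Hypothetical sketch of a partial correlation computed by residualizing:
# regress both depression and loneliness on stress, then correlate the residuals.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 200
stress = rng.normal(0, 1, n)
loneliness = 0.6 * stress + rng.normal(0, 1, n)
depression = 0.4 * stress + 0.3 * loneliness + rng.normal(0, 1, n)
df = pd.DataFrame(dict(stress=stress, loneliness=loneliness, depression=depression))

res_dep = smf.ols("depression ~ stress", data=df).fit().resid
res_lon = smf.ols("loneliness ~ stress", data=df).fit().resid
partial_r = np.corrcoef(res_dep, res_lon)[0, 1]
print("partial r(loneliness, depression | stress) =", round(partial_r, 3),
      " partial r^2 =", round(partial_r ** 2, 3))
```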

11 Multiple Regression

12 Types of effects
[Venn diagram: regions (a), (b), and (c) as on slide 8.]
Total effect of stress: (b) + (c)
Shared effect of stress and loneliness: (b)
Unique effect of stress: (c)
Slope coefficients in simple regression capture total effects.
Slope coefficients in multiple regression capture unique effects.
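A small sketch of this distinction on simulated data: the slope for stress from a simple regression is its total effect, while its slope from a multiple regression that also includes loneliness is its unique effect.

```python
# Hypothetical sketch: the simple-regression slope is a "total" effect,
# while the multiple-regression slope for the same predictor is its "unique" effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 200
stress = rng.normal(0, 1, n)
loneliness = 0.6 * stress + rng.normal(0, 1, n)
depression = 0.4 * stress + 0.3 * loneliness + rng.normal(0, 1, n)
df = pd.DataFrame(dict(stress=stress, loneliness=loneliness, depression=depression))

total = smf.ols("depression ~ stress", data=df).fit().params["stress"]
unique = smf.ols("depression ~ stress + loneliness", data=df).fit().params["stress"]
print("total effect of stress: ", round(total, 3))   # picks up variance shared with loneliness
print("unique effect of stress:", round(unique, 3))  # loneliness partialled out
```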

13 Reasons for Multiple Regression
1) It allows you to directly compare the effect sizes of different predictor variables.
2) Adding additional predictors that are related to your Y variable (we call them covariates) lets you explain more of the residual variance. This makes MS error smaller and increases your power.
3) If you are worried that your key predictor is confounded with other variables, you can "partial them out" or "control for them" by including them in the analysis.
[Venn diagram: regions (a), (b), and (c) as on slide 8.]

14 Two separate regressions

Regressing depression on stress (R² = .0625):
Predictor    Unstandardized Coefficient    Standard Error    Standardized Coefficient    t        sig
Stress       .50                           .16               .25                         3.125    <.05

Regressing depression on loneliness (R² = .04):
Predictor    Unstandardized Coefficient    Standard Error    Standardized Coefficient    t        sig
Loneliness   .80                           .28               .20                         2.86     <.05

15 A multiple regression: regressing depression on loneliness and stress

Predictor    Unstandardized Partial Coefficient    Standard Error    Standardized Partial Coefficient    t       sig
Intercept    1.8                                   1.1               -                                    1.64    .08
Stress       .34                                   .11               .17                                  3.09    <.05
Loneliness   .10                                   .05                                                    2.00    .06

Multiple R² = .0625; df = n - p - 1
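A hedged sketch of fitting this kind of model in Python with statsmodels (the slide's raw data are not provided, so simulated data stand in and the estimates will differ). The summary gives each predictor's partial coefficient, standard error, t, and p, and the residual degrees of freedom are n - p - 1.

```python
# Hypothetical sketch: fitting depression on stress and loneliness together
# (the slide's raw data are not given, so the numbers will differ).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 150
stress = rng.normal(0, 1, n)
loneliness = 0.6 * stress + rng.normal(0, 1, n)
depression = 0.4 * stress + 0.3 * loneliness + rng.normal(0, 1, n)
df = pd.DataFrame(dict(stress=stress, loneliness=loneliness, depression=depression))

fit = smf.ols("depression ~ stress + loneliness", data=df).fit()
print(fit.summary())          # partial (unique) coefficient, SE, t, and p for each predictor
print("R2 =", fit.rsquared)
print("residual df = n - p - 1 =", int(fit.df_resid))   # here 150 - 2 - 1 = 147
```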

16 Multiple Regression ANOVA

Source    SS    df    s²
Model
Error
Total

The F-test is for the whole model; it doesn't tell you about individual predictors.
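A sketch of where these ANOVA quantities come from in a fitted regression, on simulated data: F is the model mean square divided by the error mean square, and it tests all predictors jointly.

```python
# Hypothetical sketch: the regression ANOVA table and whole-model F-test.
# F = (SS_model / df_model) / (SS_error / df_error); it tests all predictors jointly.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 150
stress = rng.normal(0, 1, n)
loneliness = 0.6 * stress + rng.normal(0, 1, n)
depression = 0.4 * stress + 0.3 * loneliness + rng.normal(0, 1, n)
df = pd.DataFrame(dict(stress=stress, loneliness=loneliness, depression=depression))

fit = smf.ols("depression ~ stress + loneliness", data=df).fit()
ss_model, ss_error = fit.ess, fit.ssr            # explained and residual sums of squares
df_model, df_error = fit.df_model, fit.df_resid
F = (ss_model / df_model) / (ss_error / df_error)
print("SS_model =", round(ss_model, 2), " SS_error =", round(ss_error, 2),
      " SS_total =", round(fit.centered_tss, 2))
print("F =", round(F, 2), " matches fit.fvalue =", round(fit.fvalue, 2),
      " p =", fit.f_pvalue)
```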

17 Categorical Predictors in Multiple Regression

18 A dichotomous 0/1 predictor: regressing depression on gender (0 = female, 1 = male)

Gender    depression
0         8
1         4
1         10
0         15
1         8
0         14

19 A dichotomous 0/1 predictor: regressing depression on gender (0 = female, 1 = male)

Predictor    Unstandardized Partial Coefficient    Standard Error    Standardized Partial Coefficient    t       sig
Intercept    12.33                                 1.99              -                                    6.2     <.01
Gender       -5.00                                 2.81              -.66                                 -1.8    .15

The intercept coefficient tells you the mean depression of the 0 (female) group.
The gender coefficient tells you what to add to get the mean depression of the 1 (male) group.
If the gender coefficient is significant, the groups differ significantly.
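A sketch that runs this regression on the six data points from the previous slide: the intercept comes out as the female mean (12.33), the gender slope as the male minus female difference (-5.00), and the slope's t-test is identical to an independent-samples t-test on the two groups.

```python
# Sketch using the six data points from the previous slide: the intercept is the
# mean of the 0 (female) group, the gender slope is the male - female difference,
# and the slope's t-test matches an independent-samples t-test.
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.DataFrame({"gender": [0, 1, 1, 0, 1, 0],
                   "depression": [8, 4, 10, 15, 8, 14]})

fit = smf.ols("depression ~ gender", data=df).fit()
print(fit.params)        # Intercept ~ 12.33 (female mean), gender ~ -5.00 (difference)
print(fit.tvalues["gender"], fit.pvalues["gender"])

t, p = stats.ttest_ind(df.loc[df.gender == 1, "depression"],
                       df.loc[df.gender == 0, "depression"])
print(t, p)              # same t and p as the regression slope
```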

20 Categorical and Continuous Predictors in Multiple Regression

21 Combining Types of Predictors
T-tests and ANOVAs use group variables to predict continuous outcomes.
Correlations and simple regressions use continuous variables to predict continuous outcomes.
Multiple regression lets you use 1) information about group membership and 2) information about other continuous measurements in the same analysis.

22 Combining Types of Predictors
Why would we want this? Imagine that we have a control group and a highly provoked group, and we also measure the "TypeA-ness" of each participant. We notice that, because of streaky random sampling, we got more TypeA people in the control group than in the provoked group. Multiple regression allows us to see whether there was an effect of our manipulation, controlling for individual differences in TypeA-ness. Basically, it lets us put a situational manipulation and a personality scale measurement into the same study.

23
Group      Provoke    TypeA    aggression
Control    0          3        3
Control    0          6        4
Control    0          10       6
Control    0          8        5
High       1          2        9
High       1          4        10
High       1          11       24
High       1          12       20

24
Predictor    Unstandardized Partial Coefficient    Standard Error    Standardized Partial Coefficient    t        sig
Intercept    -3.26                                 2.23              -                                    -1.46    .20
Provoke      10.675                                1.89              .734                                 5.65     <.01
Type A       1.15                                  .26               .565                                 4.35     <.01

There is a significant effect of experimental condition and a significant effect of TypeA-ness.
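A sketch that fits this model to the eight rows from the previous slide using statsmodels; the coefficients reproduce the table above (intercept about -3.26, Provoke about 10.675, TypeA about 1.15), and both predictors come out significant.

```python
# Sketch using the eight rows from the previous slide: one 0/1 manipulation
# (Provoke) and one continuous covariate (TypeA) in the same model.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "provoke":    [0, 0, 0, 0, 1, 1, 1, 1],
    "type_a":     [3, 6, 10, 8, 2, 4, 11, 12],
    "aggression": [3, 4, 6, 5, 9, 10, 24, 20],
})

fit = smf.ols("aggression ~ provoke + type_a", data=df).fit()
print(fit.params)    # intercept ~ -3.26, provoke ~ 10.675, type_a ~ 1.15 (cf. the table above)
print(fit.pvalues)   # both predictors significant, as on the slide
```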

25 General Linear Model

26 All of the techniques we've covered so far can be expressed as special cases of multiple regression.
If you run a multiple regression with an intercept and no slopes, the t-test for the intercept is the same as a one-sample t-test.
If you put in a dichotomous (0/1) predictor, the t-test for your slope will be the same as an independent-samples t-test.
If you put in dummy variables for multiple groups, your regression ANOVA will be the same as your one-way or two-way ANOVA.
If you put in one continuous predictor, your standardized β will be the same as your r.
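Two of these equivalences, sketched on simulated data (the numbers are made up; only the equalities matter): the t for a 0/1 slope matches the independent-samples t-test, and the standardized slope for one continuous predictor matches Pearson's r.

```python
# Hypothetical sketch of two of the equivalences listed above, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(7)

# (1) A 0/1 predictor: the regression slope's t equals the independent-samples t.
group = np.repeat([0, 1], 20)
y = rng.normal(0, 1, 40) + 0.8 * group
d = pd.DataFrame({"group": group, "y": y})
slope_t = smf.ols("y ~ group", data=d).fit().tvalues["group"]
ttest_t = stats.ttest_ind(y[group == 1], y[group == 0]).statistic
print(round(slope_t, 6), round(ttest_t, 6))   # identical

# (2) One continuous predictor: the standardized slope equals Pearson's r.
x = rng.normal(0, 1, 40)
y2 = 0.5 * x + rng.normal(0, 1, 40)
d2 = pd.DataFrame({"zx": (x - x.mean()) / x.std(ddof=1),
                   "zy": (y2 - y2.mean()) / y2.std(ddof=1)})
beta = smf.ols("zy ~ zx", data=d2).fit().params["zx"]
r = np.corrcoef(x, y2)[0, 1]
print(round(beta, 6), round(r, 6))            # identical
```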

27 General Linear Model
Plus, multiple regression can do so much more:
Looking at several continuous predictors together in one model.
Controlling for confounds.
Using covariates to "soak up" residual variance.
Looking at categorical and continuous predictors together in one model.
Looking at interactions between categorical and continuous variables.

