
1 Statistics for the Social Sciences Psychology 340 Fall 2013 Tuesday, November 19 Chi-Squared Test of Independence

2 Homework #13 due 11/28: Ch 17, #13, 14, 19, 20

3 Last Time: Clarification and review of some regression concepts; multiple regression; regression in SPSS

4 This Time: Review of multiple regression. New topic: chi-squared test of independence. Announcements: Final project due date extended from Dec. 5 to Dec. 6; it must be turned in to the psychology department by 4 p.m. Extra credit is due by the start of class (Dec. 5) to receive credit; evidence of academic dishonesty regarding extra credit will be referred for disciplinary action. Exam IV (emphasizing correlation, regression, and the chi-squared test) is on Tuesday, December 3. The final exam is on Tuesday, 12/10 at 7:50 a.m.

5 Multiple Regression Typically researchers are interested in predicting with more than one explanatory variable. In multiple regression, an additional predictor variable (or set of variables) is used to predict the residuals left over from the first predictor.

6 Multiple Regression Bi-variate regression prediction model: Y = intercept + slope(X) + error

7 Multiple Regression Bi-variate regression prediction model: Y = intercept + slope(X) + error, where the intercept + slope(X) part is the “fit” and the error term is the “residual.” Multiple regression prediction models extend this with additional predictors.

8 Multiple Regression Multiple regression prediction model: Y = intercept + slope1(first explanatory variable) + slope2(second explanatory variable) + slope3(third explanatory variable) + slope4(fourth explanatory variable) + error (whatever variability is left over)

9 Multiple Regression Predict test performance based on: study time (first explanatory variable), test time (second), what you eat for breakfast (third), hours of sleep (fourth), plus error (whatever variability is left over)

10 Multiple Regression For example, predict test performance based on: study time, test time, what you eat for breakfast, and hours of sleep. Typically your analysis consists of testing multiple regression models against one another (e.g., a one-predictor model versus a two-predictor model) to see which “fits” best, comparing the R²s of the models.

11 Multiple Regression Model #1: The response variable is the total variability in test performance; the predictor is total study time, r = .6. There is some co-variance between the two variables. R² for the model = .36, leaving 64% of the variance unexplained. If we know total study time, we can predict 36% of the variance in test performance.

12 Multiple Regression Model #2: Add test time to the model, alongside total study time (r = .6). There is little co-variance between test performance and test time (r = .1), so R² for the model rises only to .37, leaving 63% of the variance unexplained. We can explain slightly more of the variance in test performance.

13 Multiple Regression Model #3: Add breakfast food to the model, alongside total study time (r = .6) and test time (r = .1). There is no co-variance between test performance and breakfast food (r = .0); they are not related, so we can NOT explain any more of the variance. R² for the model stays at .37, with 63% of the variance unexplained.

14 Multiple Regression Model #4: Add hours of sleep to the model (r = .45), alongside total study time (r = .6) and test time (r = .1). There is some co-variance between test performance and hours of sleep, so we can explain more of the variance: R² for the model = .45, leaving 55% unexplained. But notice what happens with the overlap (covariation between explanatory variables): you can't just add the r's or the r²'s.
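The same point can be sketched outside SPSS. Below is a minimal Python illustration (not from the lecture; the data and variable names are made up) of comparing nested models and of why the full-model R² is less than the sum of the individual r²s when predictors overlap:

```python
# A minimal sketch with hypothetical data (statsmodels assumed installed).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
study = rng.normal(size=n)
sleep = 0.3 * study + rng.normal(size=n)      # predictors co-vary
test = 0.6 * study + 0.4 * sleep + rng.normal(size=n)

# Model 1: study time only
m1 = sm.OLS(test, sm.add_constant(study)).fit()

# Model 4: add hours of sleep
X = sm.add_constant(np.column_stack([study, sleep]))
m4 = sm.OLS(test, X).fit()

# Because study and sleep overlap, R² for the two-predictor model is
# smaller than r²(study) + r²(sleep): you can't just add the r²s.
print(f"Model 1 R² = {m1.rsquared:.2f}, Model 4 R² = {m4.rsquared:.2f}")
```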

15 Multiple Regression The “least squares” regression equation when there are multiple intercorrelated predictor (x) variables is found by calculating “partial regression coefficients” for each x. A partial regression coefficient for x1 shows the relationship between y and x1 while statistically controlling for the other x variables (or holding the other x variables constant).

16 Multiple Regression The formula for the partial regression coefficient (with two predictors) is: b1 = [(rY1 - rY2·r12) / (1 - r12²)] × (sY / s1), where rY1 = correlation of x1 and y, rY2 = correlation of x2 and y, r12 = correlation of x1 and x2, sY = standard deviation of y, and s1 = standard deviation of x1.
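As a check on the formula, here is a short Python sketch (hypothetical data, not part of the course materials) that computes b1 from the correlations and standard deviations and compares it with the coefficient from a least-squares fit:

```python
# Partial regression coefficient via b1 = [(rY1 - rY2*r12)/(1 - r12**2)]*(sY/s1)
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = 0.5 * x1 + rng.normal(size=200)          # x1 and x2 are intercorrelated
y = 2.0 * x1 + 1.0 * x2 + rng.normal(size=200)

rY1 = np.corrcoef(y, x1)[0, 1]
rY2 = np.corrcoef(y, x2)[0, 1]
r12 = np.corrcoef(x1, x2)[0, 1]
b1 = (rY1 - rY2 * r12) / (1 - r12**2) * (y.std(ddof=1) / x1.std(ddof=1))

# The same coefficient from an ordinary least-squares fit of y on [1, x1, x2]
X = np.column_stack([np.ones_like(x1), x1, x2])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0][1]
print(b1, b_ols)   # the two estimates of b1 agree
```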

17 Multiple Regression The multiple correlation coefficient (R) is an estimate of the relationship between the dependent variable (y) and the best linear combination of predictor variables (the correlation of y and y-predicted). R² tells you the amount of variance in y explained by the particular multiple regression model being tested.

18 Multiple Regression in SPSS Setup as before: variables (explanatory and response) are entered into columns. There are a couple of different ways to use SPSS to compare different models.

19 Regression in SPSS Analyze => Regression => Linear

20 Multiple Regression in SPSS Method 1: enter all the explanatory variables together. Enter all of the predictor variables into the Independent Variable field and the predicted (criterion) variable into the Dependent Variable field.

21 Multiple Regression in SPSS [Output screenshot] The output shows the variables in the model, the r for the entire model, the r² for the entire model, and the unstandardized coefficients for each variable (var1 and var2, labeled by variable name).

22 Multiple Regression in SPSS [Output screenshot] The output shows the variables in the model, the r for the entire model, the r² for the entire model, and the standardized coefficients for each variable (var1 and var2, labeled by variable name).

23 Multiple Regression Which coefficient should you use, standardized or unstandardized? Unstandardized b’s are easier to use if you want to predict a raw score based on raw scores (no z-scores needed). Standardized β’s make it easy to directly compare which variable is most “important” in the equation.
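One way to see the connection between the two kinds of coefficients: the standardized β’s are simply the b’s you get after z-scoring every variable. A minimal Python sketch with made-up data:

```python
# Standardized betas = raw-score b's computed on z-scored variables.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 2))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=150)

def zscore(a):
    return (a - a.mean(axis=0)) / a.std(axis=0, ddof=1)

b_raw = sm.OLS(y, sm.add_constant(X)).fit().params[1:]                  # b's
b_std = sm.OLS(zscore(y), sm.add_constant(zscore(X))).fit().params[1:]  # β's
print(b_raw, b_std)   # the β's are directly comparable across predictors
```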

24 Multiple Regression in SPSS Method 2: enter the first model, then add another variable for the second model, etc. Enter the predicted (criterion) variable into the Dependent Variable field and the first predictor variable into the Independent Variable field, then click the Next button.

25 Multiple Regression in SPSS Method 2, continued: enter the second predictor variable into the Independent Variable field, then click Statistics.

26 Multiple Regression in SPSS Click the ‘R squared change’ box.

27 Multiple Regression in SPSS [Output screenshot] The output shows the results of two models: the variables in the first model (math SAT) and the variables in the second model (math and verbal SAT).

28 Multiple Regression in SPSS [Output screenshot] For Model 1, the output shows the variables in the first model (math SAT), the r² for the first model, and the coefficient for var1 (labeled by variable name).

29 Multiple Regression in SPSS [Output screenshot] For Model 2, the output shows the variables in the second model (math and verbal SAT), the r² for the second model, and the coefficients for var1 and var2 (labeled by variable name).

30 Multiple Regression in SPSS [Output screenshot] The output shows the results of two models: the variables in the first model (math SAT) and in the second model (math and verbal SAT). Change statistics: is the change in r² from Model 1 to Model 2 statistically significant?
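SPSS reports this as an F test on the change in r². Here is a minimal sketch of the usual hierarchical-regression formula, using hypothetical numbers (the R² values echo the earlier study-time example; the sample size is made up):

```python
# F test for the R² change between nested models, assuming the standard
# formula F = (ΔR²/Δk) / ((1 - R²_full)/(n - k_full - 1)).
from scipy import stats

n = 100                 # hypothetical sample size
r2_1, k1 = 0.36, 1      # Model 1: one predictor
r2_2, k2 = 0.45, 2      # Model 2: two predictors

F = ((r2_2 - r2_1) / (k2 - k1)) / ((1 - r2_2) / (n - k2 - 1))
p = stats.f.sf(F, k2 - k1, n - k2 - 1)   # upper-tail p-value
print(f"F({k2 - k1}, {n - k2 - 1}) = {F:.2f}, p = {p:.4f}")
```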

31 Cautions in Multiple Regression We can use as many predictors as we wish but we should be careful not to use more predictors than is warranted. –Simpler models are more likely to generalize to other samples. –If you use as many predictors as you have participants in your study, you can predict 100% of the variance. Although this may seem like a good thing, it is unlikely that your results would generalize to any other sample and thus they are not valid. –You probably should have at least 10 participants per predictor variable (and probably should aim for about 30).

32 New (Final) Topic Chi-Squared Test of Independence

33 Chi-Squared Test for Independence A manufacturer of watches takes a sample of 200 people. Each person is classified by age (young: under 30, vs. old: over 30) and watch type preference (digital vs. analog). The question: is there a relationship between age and watch preference?


35 Statistical analysis follows design We have finished the top part of the chart! Focus on this section of the chart for the rest of the semester.


37 Chi-Squared Test for Independence Step 1: State the hypotheses. H0: preference is independent of age (“no relationship”). HA: preference is related to age (“there is a relationship”). [Observed scores table shown on slide.]

38 Chi-Squared Test for Independence Step 2: Compute your degrees of freedom and get the critical value. df = (#columns - 1) × (#rows - 1) = (3 - 1) × (2 - 1) = 2. Go to the chi-square statistic table (B-8) and find the critical value: for this example, with df = 2 and α = 0.05, the critical chi-squared value is 5.99.
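If table B-8 isn't handy, the same critical value can be reproduced in Python (an illustration, not part of the course):

```python
# Critical chi-squared value for df = 2, alpha = .05.
from scipy import stats

df = (3 - 1) * (2 - 1)                 # (#columns - 1) * (#rows - 1)
crit = stats.chi2.ppf(1 - 0.05, df)    # upper 5% cutoff
print(f"critical value = {crit:.2f}")  # 5.99
```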

39 Chi-Squared Test for Independence Step 3: Collect the data. Obtain row and column totals (sometimes called the marginals) and calculate the expected frequencies. [Observed scores table shown on slide.]

40 Chi-Squared Test for Independence Step 3: Collect the data. Obtain row and column totals (sometimes called the marginals) and calculate the expected frequencies. [Observed scores table shown on slide.] Spot check: make sure the row totals and the column totals add up to the same grand total.

41 Chi-Squared Test for Independence Step 3: Collect the data. Obtain row and column totals (sometimes called the marginals) and calculate the expected frequencies. [Observed scores table shown on slide.] Expected scores:

              Digital   Analog   Undecided   Total
    Under 30     70       56        14        140
    Over 30      30       24         6         60
    Total       100       80        20        200

42 Chi-Squared Test for Independence Step 3: Collect the data. Obtain row and column totals (sometimes called the marginals) and calculate the expected frequencies. [Observed scores table shown on slide.] Expected scores:

              Digital   Analog   Undecided   Total
    Under 30     70       56        14        140
    Over 30      30       24         6         60
    Total       100       80        20        200

“Expected frequencies”: if the null hypothesis is correct, then these are the frequencies that you would expect.
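The rule behind the expected table is fe = (row total × column total) / grand total. A minimal Python sketch reproducing the slide's expected scores from the marginals:

```python
# Expected frequencies from the marginals: fe = row total * column total / N.
import numpy as np

row_totals = np.array([140, 60])        # under 30, over 30
col_totals = np.array([100, 80, 20])    # digital, analog, undecided
n = 200

fe = np.outer(row_totals, col_totals) / n
print(fe)   # [[70. 56. 14.]
            #  [30. 24.  6.]]
```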

43 Chi-Squared Test for Independence Step 3: compute the χ². Find the residuals (fo - fe) for each cell.

44 Computing the Chi-square Step 3: compute the χ². Find the residuals (fo - fe) for each cell.

45 Computing the Chi-square Step 3: compute the χ². Find the residuals (fo - fe) for each cell. Square these differences.

46 Computing the Chi-square Step 3: compute the χ². Find the residuals (fo - fe) for each cell. Square these differences. Divide the squared differences by fe.

47 Computing the Chi-square Step 3: compute the χ². Find the residuals (fo - fe) for each cell. Square these differences. Divide the squared differences by fe. Sum the results: χ² = Σ (fo - fe)² / fe.
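Putting the steps together in Python (an illustration: the slides' observed counts did not survive this transcript, so the observed table below is a hypothetical one, chosen only to be consistent with the marginals above):

```python
# chi² = sum over cells of (fo - fe)² / fe, with fe from the marginals.
import numpy as np

fo = np.array([[90, 40, 10],    # hypothetical observed counts
               [10, 40, 10]])
fe = np.outer(fo.sum(axis=1), fo.sum(axis=0)) / fo.sum()

chi2 = ((fo - fe) ** 2 / fe).sum()   # square residuals, divide by fe, sum
print(f"chi-squared = {chi2:.2f}")
```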

48 Chi-Squared, the final step Step 4: Compare the computed statistic (38.09) against the critical value (5.99) and make a decision about your hypotheses. Since 38.09 > 5.99, here we reject H0 and conclude that there is a relationship between age and watch preference.

49 In SPSS Analyze => Descriptive Statistics => Crosstabs. Select the two variables (usually nominal or ordinal) you want to examine and click the arrow to move one into the “rows” box and one into the “columns” box. Click on the “statistics” button and check the “Chi-square” box. Click “continue.” Click “OK.”

50 SPSS Output Look at the “Chi-Square Tests” box. The top row of this box gives the results for “Pearson Chi-Square.” “Value” is the value of the χ² statistic, and “df” is the degrees of freedom for the test. “Asymp. Sig. (2-sided)” is the probability (p-value) associated with the test. The chi-squared distribution, like the F-distribution, is “squared,” so a one-tailed test is not possible.
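For comparison, scipy's chi2_contingency runs the same Pearson test that SPSS reports (again using the hypothetical observed table from the earlier sketch):

```python
# Pearson chi-squared test of independence on a contingency table.
import numpy as np
from scipy import stats

fo = np.array([[90, 40, 10],
               [10, 40, 10]])
chi2, p, df, fe = stats.chi2_contingency(fo, correction=False)
print(f"chi² = {chi2:.2f}, df = {df}, p = {p:.4g}")
```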

