CHAPTER 12: Analysis of Variance (ANOVA)
© Copyright McGraw-Hill 2000

Slide 2: Objectives
- Use the one-way ANOVA technique to determine if there is a significant difference among three or more means.
- Determine which means differ, using the Scheffé or Tukey test, if the null hypothesis is rejected in the ANOVA.
- Use the two-way ANOVA technique to determine if there is a significant difference in the main effects or interaction.

Slide 3: Introduction
The F-test, used to compare two variances, can also be used to compare three or more means. This technique is called analysis of variance, or ANOVA. For three groups, the F-test can only show whether or not a difference exists among the three means, not where the difference lies. Other statistical tests, such as the Scheffé test and the Tukey test, are used to find where the difference exists.

Slide 4: Analysis of Variance
When an F-test is used to test a hypothesis concerning the means of three or more populations, the technique is called analysis of variance (commonly abbreviated as ANOVA). Although the t-test is commonly used to compare two means, it should not be used to compare three or more means.

Slide 5: Assumptions for the F-Test
The following assumptions apply when using the F-test to compare three or more means.
1. The populations from which the samples were obtained must be normally or approximately normally distributed.
2. The samples must be independent of each other.
3. The variances of the populations must be equal.

Slide 6: F-Test
In the F-test, two different estimates of the population variance are made. The first estimate is called the between-group variance, and it involves finding the variance of the means. The second estimate, the within-group variance, is made by computing the variance using all the data and is not affected by differences in the means.

Slide 7: F-Test (cont'd.)
If there is no difference in the means, the between-group variance will be approximately equal to the within-group variance, the F-test value will be close to 1, and the null hypothesis will not be rejected. However, when the means differ significantly, the between-group variance will be much larger than the within-group variance, the F-test value will be significantly greater than 1, and the null hypothesis will be rejected.

Slide 8: Hypotheses
For a test of the difference among three or more means, the following hypotheses should be used:
H_0: μ_1 = μ_2 = ⋯ = μ_k, where k is the number of groups
H_1: At least one mean is different from the others.
A significant test value means that there is a high probability that this difference in the means is not due to chance.

Slide 9: Degrees of Freedom
The degrees of freedom for this F-test are d.f.N. = k − 1, where k is the number of groups, and d.f.D. = N − k, where N is the sum of the sample sizes of the groups, N = n_1 + n_2 + ⋯ + n_k. The sample sizes do not need to be equal. The F-test to compare means is always right-tailed.
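To make the degrees-of-freedom arithmetic concrete, here is a minimal Python sketch (not part of the original slides) that also looks up the right-tailed critical value. It assumes SciPy is available; the group sizes and significance level are made-up illustration values.

```python
from scipy.stats import f

sample_sizes = [5, 5, 5]   # hypothetical group sizes n_1, n_2, n_3
k = len(sample_sizes)      # number of groups
N = sum(sample_sizes)      # total number of values

dfn = k - 1                # d.f.N. = k - 1  ->  2
dfd = N - k                # d.f.D. = N - k  ->  12

alpha = 0.05
critical_value = f.ppf(1 - alpha, dfn, dfd)   # right-tailed critical value, about 3.89
print(dfn, dfd, round(critical_value, 2))
```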

Slide 10: Procedure for Finding the F-Test Value
Step 1: Find the mean and variance of each sample.
Step 2: Find the grand mean.
Step 3: Find the between-group variance.
Step 4: Find the within-group variance.
Step 5: Find the F-test value.

Slide 11: Analysis of Variance Summary Table

Source     Sum of Squares   d.f.     Mean Squares   F
Between    SS_B             k − 1    MS_B           MS_B / MS_W
Within     SS_W             N − k    MS_W
Total

Slide 12: Sum of Squares Between Groups
The sum of squares between groups, denoted SS_B, is found using the following formula:
SS_B = Σ n_i (X̄_i − X̄_GM)²
where the grand mean, denoted by X̄_GM, is the mean of all values in the samples.

Slide 13: Sum of Squares Within Groups
The sum of squares within groups, denoted SS_W, is found using the following formula:
SS_W = Σ (n_i − 1) s_i²
Note: Dividing SS_W by its degrees of freedom, N − k, gives an overall variance that is a weighted average of the individual sample variances. It does not involve using differences of the means.

Slide 14: The Mean Squares
The mean square values are equal to the sums of squares divided by their degrees of freedom:
MS_B = SS_B / (k − 1) and MS_W = SS_W / (N − k)
The F-test value is F = MS_B / MS_W.
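The following is a minimal Python sketch (not part of the original slides) of Steps 1–5, using the SS_B, SS_W, and mean-square formulas above. The three groups of measurements are made-up illustration data.

```python
from statistics import mean, variance

# Three hypothetical samples (made-up data), one list per group.
groups = [
    [10, 12, 9, 15, 13],
    [14, 18, 12, 16, 15],
    [20, 17, 22, 19, 21],
]

k = len(groups)                                  # number of groups
N = sum(len(g) for g in groups)                  # total number of values

means = [mean(g) for g in groups]                # Step 1: sample means
variances = [variance(g) for g in groups]        # Step 1: sample variances
grand_mean = mean(x for g in groups for x in g)  # Step 2: grand mean of all values

# Step 3: between-group sum of squares, SS_B = sum of n_i * (X-bar_i - grand mean)^2
ss_b = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
# Step 4: within-group sum of squares, SS_W = sum of (n_i - 1) * s_i^2
ss_w = sum((len(g) - 1) * v for g, v in zip(groups, variances))

ms_b = ss_b / (k - 1)   # between-group variance (mean square)
ms_w = ss_w / (N - k)   # within-group variance (mean square)
f_value = ms_b / ms_w   # Step 5: F-test value
print(round(f_value, 2))
```

For these made-up data the F value comes out to roughly 16.9, well above the critical value of about 3.89 from the earlier sketch, so the null hypothesis of equal means would be rejected.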

Slide 15: Scheffé Test
In order to conduct the Scheffé test, one must compare the means two at a time, using all possible combinations of means. For example, if there are three means, the following comparisons must be done: X̄_1 versus X̄_2, X̄_1 versus X̄_3, and X̄_2 versus X̄_3.

Slide 16: Formula for the Scheffé Test
F_s = (X̄_i − X̄_j)² / [s_W² (1/n_i + 1/n_j)]
where X̄_i and X̄_j are the means of the samples being compared, n_i and n_j are the respective sample sizes, and s_W² is the within-group variance.

Slide 17: F Value for the Scheffé Test
To find the critical value F′ for the Scheffé test, multiply the critical value for the F-test by k − 1:
F′ = (k − 1)(C.V.)
There is a significant difference between the two means being compared when F_s is greater than F′.
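Below is a minimal sketch (not from the slides) of the Scheffé comparisons, reusing the made-up groups from the one-way ANOVA sketch above; it assumes SciPy is available for the critical F value.

```python
from itertools import combinations
from statistics import mean, variance
from scipy.stats import f

# Same made-up groups as in the one-way ANOVA sketch.
groups = [[10, 12, 9, 15, 13], [14, 18, 12, 16, 15], [20, 17, 22, 19, 21]]
k = len(groups)
N = sum(len(g) for g in groups)
means = [mean(g) for g in groups]
ms_w = sum((len(g) - 1) * variance(g) for g in groups) / (N - k)   # within-group variance

alpha = 0.05
critical_f = f.ppf(1 - alpha, k - 1, N - k)   # critical value of the ordinary F-test
f_prime = (k - 1) * critical_f                # Scheffé critical value F'

for i, j in combinations(range(k), 2):        # all pairs of means
    n_i, n_j = len(groups[i]), len(groups[j])
    f_s = (means[i] - means[j]) ** 2 / (ms_w * (1 / n_i + 1 / n_j))
    print(f"group {i + 1} vs group {j + 1}: F_s = {f_s:.2f}, "
          f"significant = {f_s > f_prime}")
```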

Slide 18: Tukey Test
The Tukey test can also be used after the analysis of variance has been completed to make pairwise comparisons between means when the groups have the same sample size. The symbol for the test value in the Tukey test is q.

Slide 19: Formula for the Tukey Test
q = (X̄_i − X̄_j) / √(s_W² / n)
where X̄_i and X̄_j are the means of the samples being compared, n is the size of each sample, and s_W² is the within-group variance.

Slide 20: Tukey Test Results
When the absolute value of q is greater than the critical value for the Tukey test, there is a significant difference between the two means being compared.
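Here is a minimal sketch (not from the slides) of the Tukey pairwise test values, again reusing the made-up groups, which all have the same sample size. The critical value for q is a hypothetical value looked up from a studentized range table for the chosen alpha, k, and d.f.

```python
from itertools import combinations
from math import sqrt
from statistics import mean, variance

# Same made-up groups as in the one-way ANOVA sketch (equal sample sizes).
groups = [[10, 12, 9, 15, 13], [14, 18, 12, 16, 15], [20, 17, 22, 19, 21]]
k = len(groups)
N = sum(len(g) for g in groups)
n = len(groups[0])                 # common sample size
means = [mean(g) for g in groups]
ms_w = sum((len(g) - 1) * variance(g) for g in groups) / (N - k)   # within-group variance

# Critical value of q for alpha = 0.05, k = 3 means, d.f. = N - k = 12, taken from a
# studentized range table (roughly 3.77; check your own table, or use
# scipy.stats.studentized_range.ppf in recent SciPy versions).
q_critical = 3.77

for i, j in combinations(range(k), 2):
    q = (means[i] - means[j]) / sqrt(ms_w / n)
    print(f"group {i + 1} vs group {j + 1}: q = {q:.2f}, "
          f"significant = {abs(q) > q_critical}")
```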

Slide 21: Two-Way Analysis of Variance
The two-way analysis of variance is an extension of the one-way analysis of variance already discussed; it involves two independent variables. The independent variables are also called factors.

Slide 22: Two-Way Analysis of Variance (cont'd.)
Using the two-way analysis of variance, the researcher is able to test the effects of two independent variables, or factors, on one dependent variable. In addition, the interaction effect of the two variables can be tested.

Slide 23: Two-Way ANOVA Terms
The variables, or factors, are changed between two levels (i.e., two different treatments). The groups for a two-way ANOVA are sometimes called treatment groups.

Slide 24: Two-Way ANOVA Designs
A 3 × 2 design crosses factor A at levels A1, A2, and A3 with factor B at levels B1 and B2; a 3 × 3 design crosses levels A1, A2, and A3 with levels B1, B2, and B3.

Slide 25: Two-Way ANOVA Null Hypotheses
A two-way ANOVA has several null hypotheses: one for each independent variable and one for the interaction.

Slide 26: Two-Way ANOVA Summary Table

Source           Sum of Squares   d.f.             Mean Squares   F
A                SS_A             a − 1            MS_A           MS_A / MS_W
B                SS_B             b − 1            MS_B           MS_B / MS_W
A × B            SS_A×B           (a − 1)(b − 1)   MS_A×B         MS_A×B / MS_W
Within (error)   SS_W             ab(n − 1)        MS_W

where a and b are the numbers of levels of factors A and B, and n is the number of observations in each group.
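Below is a minimal sketch (not from the slides) of producing such a summary table in Python. It assumes pandas and statsmodels are installed; the factor names A and B, the column name score, and the data are made-up illustration values.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Made-up data: factor A at two levels, factor B at two levels, two values per cell.
data = pd.DataFrame({
    "A":     ["low", "low", "low", "low", "high", "high", "high", "high"],
    "B":     ["x", "x", "y", "y", "x", "x", "y", "y"],
    "score": [10, 12, 14, 15, 20, 22, 13, 12],
})

# 'C(A) * C(B)' fits both main effects and the A x B interaction.
model = ols("score ~ C(A) * C(B)", data=data).fit()
summary_table = sm.stats.anova_lm(model, typ=2)   # SS, d.f., F, and p for each source
print(summary_table)
```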

Slide 27: Assumptions for the Two-Way ANOVA
1. The populations from which the samples were obtained must be normally or approximately normally distributed.
2. The samples must be independent.
3. The variances of the populations from which the samples were selected must be equal.
4. The groups must be equal in sample size.

Slide 28: Graphing Interactions
To interpret the results of a two-way analysis of variance, researchers suggest drawing a graph, plotting the means of each group, analyzing the graph, and interpreting the results.

Slide 29: Disordinal Interaction
If the graph of the means has lines that intersect each other, the interaction is said to be disordinal. When there is a disordinal interaction, one should not interpret the main effects without considering the interaction effect.

Slide 30: Ordinal Interaction
An ordinal interaction is evident when the lines of the graph neither cross nor are parallel. If the F-test value for the interaction is significant and the lines do not cross each other, then the interaction is said to be ordinal and the main effects can be interpreted independently of each other.

Slide 31: No Interaction
When there is no significant interaction effect, the lines in the graph will be parallel or approximately parallel. When this situation occurs, the main effects can be interpreted independently of each other because there is no significant interaction.
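The following is a minimal sketch (not from the slides) of the graph described in the last three slides: plot the cell means for each level of factor B across the levels of factor A, then check whether the lines cross (disordinal interaction), differ without crossing (ordinal interaction), or stay roughly parallel (no interaction). It assumes pandas and matplotlib are installed and reuses the hypothetical data from the two-way ANOVA sketch.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Same made-up two-factor data as in the two-way ANOVA sketch.
data = pd.DataFrame({
    "A":     ["low", "low", "low", "low", "high", "high", "high", "high"],
    "B":     ["x", "x", "y", "y", "x", "x", "y", "y"],
    "score": [10, 12, 14, 15, 20, 22, 13, 12],
})

# Cell means: rows are the levels of factor A, columns are the levels of factor B.
cell_means = data.groupby(["A", "B"])["score"].mean().unstack()

# One line per level of factor B, plotted across the levels of factor A.
for level_b in cell_means.columns:
    plt.plot(cell_means.index, cell_means[level_b], marker="o", label=f"B = {level_b}")

plt.xlabel("Level of factor A")
plt.ylabel("Mean of the dependent variable")
plt.legend()
plt.show()
```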

Slide 32: Summary
The F-test can be used to compare two sample variances to determine whether they are equal. It can also be used to compare three or more means; this procedure is called analysis of variance, or ANOVA. When there is one independent variable, the analysis of variance is called a one-way ANOVA; when there are two independent variables, it is called a two-way ANOVA.

Slide 33: Summary (cont'd.)
The ANOVA technique uses two estimates of the population variance. The between-group variance is the variance of the sample means; the within-group variance is the overall variance of all the values. When there is no significant difference among the means, the two estimates will be approximately equal and the F-test value will be close to 1.

Slide 34: Summary (cont'd.)
If there is a significant difference among the means, the between-group variance will be larger than the within-group variance, and a significant test value will result. If there is a significant difference among the means and the sample sizes are the same, the Tukey test can be used to find where the difference lies. The Scheffé test is more general and can be used even if the sample sizes are not equal.

Slide 35: Conclusions
The two-way ANOVA enables the researcher to test the effects of two independent variables and a possible interaction effect on one dependent variable.

