
1 PUAF 610 TA Session 8

2 Recover from midterm

3 TODAY: F-distribution, analysis of variance, Kruskal-Wallis test, correlation

4 F-Distribution: An F-distribution has two degrees-of-freedom parameters: the degrees of freedom for the numerator (dfn) and the degrees of freedom for the denominator (dfd). There is a different F distribution for each combination of numerator and denominator degrees of freedom.

5 F-Distribution

6 F-Distribution: To use the F table, select the significance level to be used, then determine the appropriate combination of degrees of freedom.

7 F-Distribution: Suppose the α = 0.10 level of significance is selected, with 5 degrees of freedom in the numerator and 7 degrees of freedom in the denominator. The F value from the table is 2.88, which means that exactly 0.10 of the area under the F curve lies to the right of F = 2.88.
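The same lookup can be reproduced in software. Below is a minimal Python/scipy sketch, purely illustrative; the session itself reads the value from a printed F table:

```python
from scipy import stats

alpha = 0.10        # chosen significance level
dfn, dfd = 5, 7     # numerator and denominator degrees of freedom

# ppf is the inverse CDF, so ppf(1 - alpha) leaves area alpha in the upper tail.
f_crit = stats.f.ppf(1 - alpha, dfn, dfd)
print(round(f_crit, 2))                        # about 2.88, matching the table

# Sanity check: the area to the right of f_crit equals alpha.
print(round(stats.f.sf(f_crit, dfn, dfd), 2))  # 0.10
```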

8 Analysis of Variance: Introduction. Question: does the mean of a quantitative variable depend on which group (given by a categorical variable) the individual is in? If the categorical variable has only 2 values, use a two-sample t-test; if it has 3 or more values, use ANOVA.

9 Assumptions of ANOVA

10 Logic of ANOVA: state the hypotheses, compute the F-statistic, compare it with the critical value (or use the p-value), and state the conclusion.

11 Analysis of Variance: ANOVA measures two sources of variation in the data and compares their relative sizes. Variation BETWEEN groups: for each value, the difference between its group mean and the overall mean. Variation WITHIN groups: for each value, the difference between that value and the mean of its group.

12 Analysis of Variance: The ANOVA F-statistic is the ratio of the between-group variation to the within-group variation, F = MST / MSE, the treatment mean square divided by the error mean square. A large F indicates that there is more difference between groups than within groups.
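As an illustration of this ratio, here is a minimal Python sketch (hypothetical toy data, not from the session) that computes the treatment and error mean squares by hand and checks the resulting F against scipy's one-way ANOVA:

```python
import numpy as np
from scipy import stats

# Hypothetical toy data: three small groups, purely for illustration.
groups = [np.array([4.0, 5.0, 6.0]),
          np.array([6.0, 7.0, 8.0]),
          np.array([8.0, 9.0, 10.0])]

all_values = np.concatenate(groups)
grand_mean = all_values.mean()
k = len(groups)                # number of groups
n_total = all_values.size      # total number of observations

# Between-group (treatment) sum of squares and mean square.
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ms_between = ss_between / (k - 1)

# Within-group (error) sum of squares and mean square.
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
ms_within = ss_within / (n_total - k)

F = ms_between / ms_within
print(F)

# scipy's one-way ANOVA gives the same F statistic.
print(stats.f_oneway(*groups).statistic)
```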

13 Analysis of Variance

14 Analysis of Variance


19 Example: A manager wishes to determine whether the mean times required to complete a certain task differ for the three levels of employee training. He randomly selected 10 employees at each of the three levels of training (Beginner, Intermediate, and Advanced). Do the data provide sufficient evidence to indicate that the mean times required to complete the task differ for at least two of the three levels of training?

20 Example
Level of Training   N    Mean   Variance
Advanced            10   24.2   21.54
Intermediate        10   27.1   18.64
Beginner            10   30.2   17.76

21 Example: Ho: the mean times required to complete the task do not differ across the three levels of training (µB = µI = µA). Ha: the mean times required to complete the task differ for at least two of the three levels of training.

22 Example
Source       df   SS        MS       F
Treatments   2    180.067   90.033   4.662
Error        27   521.460   19.313
Total        29   701.527

23 Example

24 Example: Decision: Reject Ho. Conclusion: There is sufficient evidence to indicate that the mean times required to complete the task differ for at least two of the three levels of training.
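A quick software check of this decision (an illustrative Python/scipy sketch using the F value and degrees of freedom from the ANOVA table above, and assuming the usual 5% significance level, which the transcript does not state explicitly):

```python
from scipy import stats

F = 4.662           # F statistic from the ANOVA table (slide 22)
dfn, dfd = 2, 27    # treatment and error degrees of freedom

# The p-value is the upper-tail area of the F distribution beyond the observed F.
p_value = stats.f.sf(F, dfn, dfd)
print(p_value)      # below 0.05, consistent with rejecting Ho

# Equivalently, compare F against the 5% critical value.
f_crit = stats.f.ppf(0.95, dfn, dfd)
print(F > f_crit)   # True
```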

25 Example: Consider the following random samples from three different populations. Do the data provide sufficient evidence to indicate that the means differ?

26 Example: The data below resulted from measuring the change in resistance after subjecting identical resistors to three different temperatures for a period of 24 hours. The sample size of each group was 5. In the language of Design of Experiments, we have an experiment in which each of three treatments was replicated 5 times.

27 Example
Level 1   Level 2   Level 3
6.9       8.3       8.0
5.4       6.8       10.5
5.8       7.8       8.1
4.6       9.2       6.9
4.0       6.5       9.3
Is there a difference among the population means?
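One possible way to run this one-way ANOVA in software (an illustrative Python/scipy sketch on the data above; the course itself works with Stata output):

```python
from scipy import stats

# Resistance-change measurements for the three temperature levels (slide 27).
level1 = [6.9, 5.4, 5.8, 4.6, 4.0]
level2 = [8.3, 6.8, 7.8, 9.2, 6.5]
level3 = [8.0, 10.5, 8.1, 6.9, 9.3]

result = stats.f_oneway(level1, level2, level3)
print(result.statistic, result.pvalue)
# A large F with a p-value below 0.05 indicates that at least two of the
# population means differ.
```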

28 Kruskal-Wallis test: The Kruskal-Wallis test is a nonparametric (distribution-free) test used to compare three or more independent groups of sampled data. It is used when the assumptions of ANOVA are not met: in ANOVA, we assume that each group is normally distributed, whereas in the Kruskal-Wallis test we make no assumption about the distribution. If the normality assumptions are met, the Kruskal-Wallis test is not as powerful as ANOVA.

29 Kruskal-Wallis test: The hypotheses for the comparison of three (or more) independent groups are: H0: μ1 = μ2 = μ3; Ha: not all the means are equal.

30 Kruskal-Wallis test: The test statistic for the Kruskal-Wallis test is H. When the sample size in each group is small (< 5) and the number of groups is less than 4, a tabled Kruskal-Wallis value should be compared with the H statistic to determine the significance level. Otherwise, a chi-square distribution with k - 1 degrees of freedom (where k is the number of groups) can be used to approximate the significance level of the test.
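For comparison, here is an illustrative Python/scipy sketch running a Kruskal-Wallis test on the resistor data from slide 27 (scipy uses the chi-square approximation with k - 1 degrees of freedom described above; this is not output shown in the session):

```python
from scipy import stats

# Same resistor data as the ANOVA example (slide 27).
level1 = [6.9, 5.4, 5.8, 4.6, 4.0]
level2 = [8.3, 6.8, 7.8, 9.2, 6.5]
level3 = [8.0, 10.5, 8.1, 6.9, 9.3]

# H statistic and its approximate p-value (chi-square with k - 1 = 2 df).
H, p_value = stats.kruskal(level1, level2, level3)
print(H, p_value)
```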


33 Kruskal-Wallis test: Do not worry; you just need to know how to interpret the STATA output.

34 When to use what?

35 Correlation: We want to find out the relationship between two variables: are they associated or not? The standard measure of this relationship is called correlation. The relationship can be negative (as x increases, y goes down) or positive (as x increases, y also increases).

36 Correlation: Scatter plots provide a graphical way of interpreting correlation. Scatter plots are similar to line graphs in that they use horizontal and vertical axes to plot data points. The closer the plotted points come to forming a straight line, the higher the correlation between the two variables, that is, the stronger the relationship.

37 Correlation: Scatter plots provide an approximation. If the data points form a straight line going from the origin out to high x and y values, the variables have a positive correlation. If the line goes from a high value on the y-axis down to a high value on the x-axis, the variables have a negative correlation.
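A small, hypothetical matplotlib sketch (data made up for illustration; numpy and matplotlib are assumed to be available) that generates the two scatter-plot patterns just described:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y_pos = 2 * x + rng.normal(0, 2, 50)        # y rises with x: positive correlation
y_neg = 25 - 2 * x + rng.normal(0, 2, 50)   # y falls as x rises: negative correlation

fig, axes = plt.subplots(1, 2, figsize=(8, 3))
axes[0].scatter(x, y_pos)
axes[0].set_title("Positive correlation")
axes[1].scatter(x, y_neg)
axes[1].set_title("Negative correlation")
plt.show()
```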

38 Correlation: Variables may not be correlated at all. In that case there is no pattern to where the data points lie; they do not seem to go in any particular direction.

39 Correlation: We use the correlation coefficient to obtain an exact value for the relationship between two variables: it measures the strength and the direction of a linear relationship. The correlation coefficient (r) ranges from -1 to +1, and the absolute value of the coefficient is a measure of the magnitude of the relationship.

40 Correlation: Positive correlation: if x and y have a strong positive linear correlation, r is close to +1; an r value of exactly +1 indicates a perfect positive fit; as values of x increase, values of y also increase. Negative correlation: if x and y have a strong negative linear correlation, r is close to -1; an r value of exactly -1 indicates a perfect negative fit; as values of x increase, values of y decrease. No correlation: if there is no linear correlation or only a weak linear correlation, r is close to 0.
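A minimal Python sketch (hypothetical data, purely for illustration) computing r for the three cases:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)

y_pos = 2 * x + rng.normal(0, 1, 100)    # strong positive linear relationship
y_neg = -2 * x + rng.normal(0, 1, 100)   # strong negative linear relationship
y_none = rng.normal(0, 1, 100)           # no relationship with x

# np.corrcoef returns the 2x2 correlation matrix; entry [0, 1] is r between x and y.
print(np.corrcoef(x, y_pos)[0, 1])   # close to +1
print(np.corrcoef(x, y_neg)[0, 1])   # close to -1
print(np.corrcoef(x, y_none)[0, 1])  # close to 0
```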


42 Correlation: Interpret the correlation coefficient! Suppose a research study reported a correlation of r = 0.75 between compensation and number of years of education. What could you say about the relationship between these two variables?

43 Correlation: There is a positive relationship between level of education and compensation. This means that people with more education tend to earn higher salaries. Similarly, people with low levels of education tend to have correspondingly lower salaries.

