1 Stats Lunch: Day 7 One-Way ANOVA

2 Basic Steps of Calculating an ANOVA
Example group means: M = 3, M = 6, M = 10
Remember, there are 2 ways to estimate pop. variance in ANOVA:
1. Within Groups: Estimate the variance using the data from each group
- This is a measure of "error": how much Ss differ w/in a group
2. Between Groups: Estimate the variance using the means of each group
- This estimate is affected by BOTH error and the impact of our independent variable (drug type)

3 Step 1: Estimate pop. variance from variation w/in each group
- Estimate from each group the same way we did w/ t tests: S²_group = SS/df
- At this point, we are dealing with equal N's in each group
- So, we don't have to control for the quality of estimates we get from different sample sizes (like we did w/ independent t tests)
- Instead, we can just directly pool (average) the different estimates we get from each group:
  (S²_1 + S²_2 + S²_3) / number of groups
  Ex: (1 + 0.5 + 1)/3 = 0.83
- This is S²_within, or as it's more commonly known:
  - "Mean Squares Within" = MS_within
  - or "Mean Squares Error" = MS_error
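A minimal numeric check of the pooling step, assuming the slide's hypothetical group variances (1, 0.5, 1) and equal group sizes:

```python
import numpy as np

# Hypothetical within-group variance estimates from the slide: 1, 0.5, 1
group_variances = np.array([1.0, 0.5, 1.0])

# With equal n's, MS_within is just the average of the group variances
ms_within = group_variances.mean()
print(ms_within)   # 0.8333...
```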

4 Step 2: Estimate pop. variance from variation between groups
- Get the variance of the means of each group. This has two parts:
Part A) Estimate the variance of the distribution of means:
- Squared deviation of each mean from the average of all the means, divided by the df for the between-groups estimate
- Grand Mean (GM): the mean of all the means
  Ex: (3 + 6 + 10)/3 = 19/3 = 6.33
- df_between = # of groups - 1
- S²_M = Σ(M - GM)² / df_between
  = ((3 - 6.33)² + (6 - 6.33)² + (10 - 6.33)²) / 2
  = (11.09 + 0.11 + 13.47) / 2 ≈ 12.33
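Part A as a short sketch, using the same hypothetical group means:

```python
import numpy as np

group_means = np.array([3.0, 6.0, 10.0])   # hypothetical group means from the slide

grand_mean = group_means.mean()            # (3 + 6 + 10) / 3, about 6.33
df_between = len(group_means) - 1          # number of groups - 1 = 2

# Estimated variance of the distribution of means
s2_m = np.sum((group_means - grand_mean) ** 2) / df_between
print(grand_mean, s2_m)                    # about 6.33 and 12.33
```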

5 Part B: Figure the estimated variance of the pop. of individual scores
- What we've done so far is estimate the variance of the distribution of means from a small # of means… now we need to estimate the variance of the distribution of individual scores this was based on
- This is essentially the opposite of what we've done before
- We do this by reversing the procedure we've used in the past:
  - Instead of dividing by n, we multiply by n (the number of scores in each group)
- S²_between (aka MS_between) = S²_M × n = 12.33 × 5 ≈ 61.67
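Parts A and B together, as a sketch under the same assumptions (n = 5 per group is inferred from the slide's arithmetic):

```python
import numpy as np

group_means = np.array([3.0, 6.0, 10.0])   # hypothetical group means from the slide
n_per_group = 5                            # group size assumed in the slide's example

s2_m = np.var(group_means, ddof=1)         # Part A: variance of the means (divides by df = 2)
ms_between = s2_m * n_per_group            # Part B: multiply by n to get back to individual scores
print(s2_m, ms_between)                    # about 12.33 and 61.67
```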

6 Step 3: Calculate the F ratio
- F ratio = between-groups variance / within-groups variance
- F = MS_between / MS_within
  Ex: F = 61.67 / 0.83 ≈ 74
Step 4: Decide if you should reject the null
- If F > 1, we should start to think about rejecting the null
- Of course, we need to quantify this somehow
- We compare our F score to an F distribution
- Like t tests, if our F ratio exceeds a cutoff point (based on our alpha level), we reject the null
F ratio = what we can explain / what we can't
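Steps 3 and 4 sketched with scipy, using the rounded values above and a .05 alpha (df_within = 12 assumes n = 5 per group):

```python
from scipy import stats

ms_between, ms_within = 61.67, 0.83     # carried over from the worked example
df_between, df_within = 2, 12           # (groups - 1) and (total N - groups), with n = 5 per group

f_ratio = ms_between / ms_within
f_crit = stats.f.ppf(0.95, df_between, df_within)      # cutoff for alpha = .05
p_value = stats.f.sf(f_ratio, df_between, df_within)   # area beyond our observed F

print(f_ratio, f_crit, p_value)   # F about 74, cutoff about 3.89, p far below .05
```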

7 Setting Up a One-Way B/TWN Ss ANOVA in SPSS
1) Click on Analyze…
2) Choose "Compare Means", and then "One-Way ANOVA"…
Or better yet:
1) Click on Analyze…
2) Choose "General Linear Model" (G.L.M.), and then "Univariate"…

8 Setting Up a One-Way B/TWN Ss ANOVA in SPSS
3) Select your D.V.
4) Add your I.V. to the "Fixed Factors" window (you can run a bunch of individual ANOVAs by adding more IVs)
5) Click on "Options"
6) I like to display means…
7) You'll probably want to click here to get your simple comparisons (more on this in a second)
8) Get your effect sizes, change your α level, etc.
9) Click on "Continue", then "OK"
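Not SPSS, but a quick way to sanity-check the point-and-click output is to run the same analysis in Python; the raw scores below are made up so that the group means are 3, 6, 10 and the group variances are 1, 0.5, 1, matching the running example:

```python
import numpy as np
from scipy import stats

# Hypothetical raw scores, n = 5 per condition
new_drug = np.array([2, 2, 3, 4, 4])
old_drug = np.array([5, 6, 6, 6, 7])
placebo  = np.array([9, 9, 10, 11, 11])

f_ratio, p_value = stats.f_oneway(new_drug, old_drug, placebo)
print(f_ratio, p_value)   # F = 74.0, p well below .05
```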

9 Setting Up a One-Way B/TWN Ss ANOVA in SPSS

10 But what does this tell us?

11 The ANOVA is often called the "omnibus" or "overall" test:
- It tests whether there is a difference anywhere among the groups
- It controls the familywise α (FWα), protecting against Type I error
- However, if we were doing this study for real, we'd also want to figure out WHERE the differences are:
  - Is the New Drug better than the placebo? Better than the Old Drug?
- We do this with "Planned Comparisons": specific comparisons between individual means (decided in advance)
  - AKA "planned contrasts" or "a priori comparisons"
- We do planned comps the same way as the ANOVA: what we know (between) / what we don't (within)

12 What we don't know = the within-groups (error) variance
- We can just recycle this from the omnibus F
What we know = the between-groups variance:
- Same idea as before, but we are only interested in the variance between a pair of means
- To figure the between-groups variance, we do the same two steps as before:
  A) Estimate the variance of the distribution of means
  B) Estimate the variance of the population of individual scores
So, say we want to make two comparisons:
- New Drug (M = 3) vs. Placebo (M = 10)
- New Drug (M = 3) vs. Old Drug (M = 6)
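The slide's two-step logic applied to a pair of means, as a small sketch (group size and MS_within are carried over from the earlier hypothetical example):

```python
ms_within = 0.8333     # recycled from the omnibus ANOVA
n_per_group = 5        # group size assumed in the example

def planned_comparison_f(m1, m2, n, ms_error):
    """F for a planned comparison of two group means: between-groups variance
    for just this pair of means, over the pooled within-groups (error) variance."""
    s2_m = ((m1 - m2) ** 2) / 2    # Part A: variance of the two means (df = 1)
    ms_between = s2_m * n          # Part B: scale up to the variance of individual scores
    return ms_between / ms_error

print(planned_comparison_f(3, 10, n_per_group, ms_within))   # New Drug vs. Placebo, F about 147
print(planned_comparison_f(3, 6,  n_per_group, ms_within))   # New Drug vs. Old Drug, F about 27
```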

13 Problems w/ Multiple Comparisons (again)
- Doing multiple planned comparisons can lead to the same problems as doing a bunch of t tests
- Hence, people often use the "Bonferroni Correction" to control for the problems associated w/ multiple comparisons
- It adjusts the alpha level for each comparison so the familywise α does not exceed .05 (or whatever your accepted cutoff happens to be)
- Bonferroni correction: familywise significance level / # of comparisons
  Ex: .05/2 = .025, so we'd use an alpha of .025 for each comparison
- However, many argue that you get 2 (or even 3) "free" planned comparisons for which you don't have to adjust alpha
  - The goal is to balance Type I and Type II errors
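The correction itself is just division; a one-line sketch for the two planned comparisons above:

```python
alpha_familywise = 0.05     # the familywise alpha we want to protect
n_comparisons = 2           # New Drug vs. Placebo, New Drug vs. Old Drug

alpha_per_comparison = alpha_familywise / n_comparisons
print(alpha_per_comparison)   # 0.025: each planned comparison is tested at this stricter level
```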

14 Setting Up Planned Comparisons in SPSS
…Do the same thing as before, and then:
7) Click here to get your simple comparisons
8) Choose "LSD" (least significant difference) for no adjustments, or choose "Bonferroni"
(Output tables for LSD vs. Bonferroni: note the p values…)

15 Other Issues: Orthogonality
- Not all planned comparisons are created equal…
- To maintain statistical integrity, a set of comparisons SHOULD be orthogonal to one another.
- Orthogonal: the comparisons reflect non-overlapping (independent) information… the outcome of one comparison gives no info about the outcome of another.
(Example of orthogonal comparisons, shown as a table of contrast weights)
- We know two comparisons are orthogonal if the sum of the products of their contrast weights (column by column) = 0
- You can have (number of levels - 1) orthogonal comparisons.
- The SS of a complete set of orthogonal comparisons = SS_between.
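The orthogonality check is easy to script; a sketch with hypothetical contrast weights over the three conditions (equal n's assumed):

```python
import numpy as np

# Hypothetical contrast weights over (New Drug, Old Drug, Placebo)
c1 = np.array([1, -1,  0])    # New Drug vs. Old Drug
c2 = np.array([1,  1, -2])    # both drugs vs. Placebo

# With equal n's, two comparisons are orthogonal if the sum of the
# products of their weights is zero
print(np.dot(c1, c2))         # 0  -> orthogonal

c3 = np.array([1,  0, -1])    # New Drug vs. Placebo
print(np.dot(c1, c3))         # 1  -> NOT orthogonal with c1
```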

16 Other Issues: Orthogonality
Problems:
- Orthogonal comparisons might not be what you WANT to know (e.g., they're not psychologically, psychiatrically, or experimentally important)
(Example of non-orthogonal comparisons, shown as a table of contrast weights)
What to do about it?
- Depends who you ask…
- However, most people would say that the meaningfulness of a set of contrasts is important…
- Can also do "Post-Hoc" tests…

