Presentation on theme: "Experimental Design & Analysis" — Presentation transcript:
1 Experimental Design & Analysis
Hypothesis Testing and Analysis of Variance
January 30, 2007
Doctoral Seminar, Spring Semester 2007
2 Outline
Statistical inferences
Null hypothesis testing
Sources of variance
Treatment variance vs. error variance
Sums of squares
3 Statistical Inference
Means
The mean is a measure of central tendency: μ (population) or x̄ (sample)
Sum of squares
The sum of squares is the sum of the squared deviations of a set of scores or values from their mean:
(1) Subtract the mean from each score
(2) Square each resulting difference
(3) Add up the squared differences
Mean square, or variance
The mean square of a sample is a measure of variability: divide the sum of squares by the degrees of freedom; σ² or s²
Standard deviation
The square root of the variance; σ or s
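A minimal sketch of these definitions in Python, using a small hypothetical set of scores (the data are illustrative, not from the slides):

```python
# Hypothetical scores for illustration
scores = [4, 6, 8, 10]

mean = sum(scores) / len(scores)              # measure of central tendency
ss = sum((y - mean) ** 2 for y in scores)     # sum of squared deviations from the mean
variance = ss / (len(scores) - 1)             # mean square: SS divided by df (n - 1)
sd = variance ** 0.5                          # standard deviation: square root of variance

print(mean, ss, variance, sd)
```

For these scores the mean is 7.0 and the sum of squares is 20.0; dividing by df = 3 gives the sample variance s².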
4 Statistical Inference
Student's t-test is used to compare two groups when sample sizes are small (N < 30); with larger samples the normal-curve z test is used.
Let E be the experimental condition and C the control condition; let m be the means, s the standard deviations, and n the sample size per group:
t = (mE − mC) / √[(s²E + s²C) / n]
The computed t is compared against the critical value found in t tables.
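The slide's formula (which assumes equal n per group) can be sketched directly; the experimental and control scores below are hypothetical:

```python
import math

# Hypothetical data: equal group sizes, as the slide's formula assumes
exp = [7, 9, 6, 8, 10]   # experimental condition E
ctl = [5, 6, 4, 7, 5]    # control condition C
n = len(exp)

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    # sample variance: sum of squares divided by df = n - 1
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# t = (mE - mC) / sqrt[(s2E + s2C) / n]
t = (mean(exp) - mean(ctl)) / math.sqrt((var(exp) + var(ctl)) / n)
print(t)
```

The resulting t would then be compared against the critical value from a t table for the chosen α level and degrees of freedom.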
5 Statistical Inference
Hypothesis testing
Inferences about the population are based on parameters estimated from a sample.
H0: μ1 = μ2 = μ3 = etc. suggests that there are no treatment effects.
Observing an effect requires falsifying the null hypothesis H0.
How different are two mean scores?
6 Statistical Inference
Differences due to treatment
A systematic source of difference, arising by virtue of the experimental condition.
Differences due to error
A random source of difference; because participants are randomly assigned to groups, any remaining differences between them are due to chance.
7 Statistical Inferences
Evaluation of the null hypothesis compares two sources of variability:
differences between experimental groups of subjects / differences among subjects within the same groups
= (treatment effects + experimental error) / experimental error
When H0 is true, treatment effects are zero, so the ratio reduces to experimental error / experimental error = 1.
8 Null Hypothesis Significance Testing
5 steps of NHST:
(1) State H0, H1, and the α level
(2) Determine the rejection region and state the rejection rule
(3) Compute the test statistic
(4) Make a decision (reject H0 or fail to reject H0)
(5) Conclude in terms of the problem scenario
9 Statistical Inference
When H0 is true, the expected F value is 1, although random variation can make it > 1 or < 1.
When H0 is false, F is expected to be > 1.
Since random variation alone can produce F > 1, the problem is determining how much greater than 1 it must be before we conclude, on the basis of improbability, that its size indicates a real difference among the treatments.
The F distribution enables us to determine how improbable an experimental outcome is under the assumption that the null hypothesis is true. If the value of F is so large that it is improbable (p < .05 or p < .01) given a true null hypothesis, we conclude that the null hypothesis is false and that the hypothetical population means are not equal.
10 Sources of Variance
Partitioning variance
To distinguish systematic variability (treatment effects) from random error, we must partition variance by calculating component deviation scores:
Total deviation
Between-groups deviation
Within-groups deviation
11 Component Deviations
[Figure: a single score decomposed into its total deviation from the grand mean Y̅T, a within-group deviation from its group mean, and a between-group deviation of the group mean from Y̅T. From Keppel & Wickens, p. 23]
12 Sources of Variance
What is the grand mean, Y̅T? What are the group means?
[Table of participant scores under Treatments A1, A2, and A3; compute the group means Y̅1, Y̅2, Y̅3 and the grand mean Y̅T. From Keppel & Wickens, p. 24]
14 Sources of Variance
Calculate sums of squares:
Total sum of squares = Σ(Yij − Y̅T)²
Between sum of squares = Σ(Y̅j − Y̅T)²
Within sum of squares = Σ(Yij − Y̅j)²
Sums of SquaresTotal = Sums of SquaresBetween + Sums of SquaresWithin
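The partition SS_Total = SS_Between + SS_Within can be verified numerically. The three treatment groups below are hypothetical (not the Keppel & Wickens data):

```python
# Hypothetical data: three treatment groups, n = 4 scores each
groups = [
    [3, 5, 4, 4],     # A1
    [6, 8, 7, 7],     # A2
    [9, 11, 10, 10],  # A3
]

all_scores = [y for g in groups for y in g]
grand = sum(all_scores) / len(all_scores)          # grand mean Y_T
group_means = [sum(g) / len(g) for g in groups]    # group means Y_j

# Total SS: squared deviations of every score from the grand mean
ss_total = sum((y - grand) ** 2 for y in all_scores)
# Between SS: squared deviations of group means from the grand mean,
# counted once per score in the group
ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, group_means))
# Within SS: squared deviations of each score from its own group mean
ss_within = sum((y - m) ** 2 for g, m in zip(groups, group_means) for y in g)

print(ss_total, ss_between, ss_within)
```

For these scores the pieces are 78 = 72 + 6, confirming the identity.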
16 Analysis of Variance
Sums of squares provide the building blocks for analysis of variance and significance testing.
Analysis of variance involves 2 steps:
(1) Calculate variance estimates, known as mean squares, by dividing component sums of squares by their degrees of freedom
(2) The ratio of the between-group mean square to the within-group mean square provides the F statistic
17 Analysis of Variance
Source   df         Mean Square       F
A        a − 1      SS_A / df_A       MS_A / MS_S/A
S/A      a(n − 1)   SS_S/A / df_S/A
Total    an − 1
a = number of levels of factor A
n = sample size of a group
df = degrees of freedom
Check the F value in a table of critical values, organized by degrees of freedom, to determine the significance level:
Numerator degrees of freedom = a − 1
Denominator degrees of freedom = a(n − 1)
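The two-step computation in this table can be sketched end to end, continuing the same hypothetical three-group data (a = 3, n = 4):

```python
# Hypothetical data: a = 3 treatment levels, n = 4 scores per group
groups = [
    [3, 5, 4, 4],
    [6, 8, 7, 7],
    [9, 11, 10, 10],
]
a = len(groups)        # number of levels of factor A
n = len(groups[0])     # sample size per group

all_scores = [y for g in groups for y in g]
grand = sum(all_scores) / len(all_scores)
means = [sum(g) / n for g in groups]

# Component sums of squares
ss_a = n * sum((m - grand) ** 2 for m in means)                       # between (A)
ss_sa = sum((y - m) ** 2 for g, m in zip(groups, means) for y in g)   # within (S/A)

# Step 1: mean squares = SS / df
df_a, df_sa = a - 1, a * (n - 1)
ms_a, ms_sa = ss_a / df_a, ss_sa / df_sa

# Step 2: F ratio = between-group MS / within-group MS
f = ms_a / ms_sa
print(df_a, df_sa, f)
```

The observed F would then be checked against the tabled critical value for (a − 1, a(n − 1)) degrees of freedom, here (2, 9), to decide significance at the chosen α level.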