CHAPTER 13 ANOVA.


1 CHAPTER 13 ANOVA

2 Relationship of Statistical Tests
Does this Diagram Make Sense to You?

3 ANALYSIS OF VARIANCE (ANOVA)
ANOVA tests for differences among two or more population means. Notation: σ² = s² = MS = variance; MS = mean square (the mean of the squared deviations). Ex. of ANOVA research: the effect of temperature on recall.

4 Statistics: Standard Deviations and Variances
Population: σ² = SS/N, σ = √(SS/N)
Sample: s² = SS/(n - 1) = SS/df, s = √(SS/df)
MS = SS/df
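The population and sample formulas above can be sketched in plain Python (an illustrative sketch, not part of the original slides; the function names are my own):

```python
def sum_of_squares(scores):
    """SS: sum of squared deviations from the mean."""
    mean = sum(scores) / len(scores)
    return sum((x - mean) ** 2 for x in scores)

def population_variance(scores):
    """sigma^2 = SS / N"""
    return sum_of_squares(scores) / len(scores)

def sample_variance(scores):
    """s^2 = SS / (n - 1) = SS / df"""
    return sum_of_squares(scores) / (len(scores) - 1)
```

For example, for the scores 1, 2, 4, 5: SS = 10, σ² = 10/4 = 2.5, and s² = 10/3 ≈ 3.33.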

5 Effects of Temperature (IV) on Recall (DV): one IV with three levels (50, 70, 90) and one DV

6 FACTORIAL ANOVA

7

8 MS bet = SS bet / df bet    MS with = SS with / df with    F = MS bet / MS with

9

10 ANOVA
SS bet = Σ(T²/n) - G²/N    SS with = Σss (the SS inside each treatment)    SS total = SS bet + SS with
df bet = k - 1    df with = N - k    df total = df bet + df with = N - 1
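The formulas above can be combined into one function that computes the full ANOVA partition from summary statistics (an illustrative sketch; the function name and argument layout are my own):

```python
def anova_from_summaries(T, n, ss_inside):
    """T: treatment totals; n: per-treatment sample sizes;
    ss_inside: the SS inside each treatment."""
    G = sum(T)        # grand total
    N = sum(n)        # total number of scores
    k = len(T)        # number of treatments
    ss_bet = sum(t ** 2 / m for t, m in zip(T, n)) - G ** 2 / N
    ss_with = sum(ss_inside)
    df_bet, df_with = k - 1, N - k
    F = (ss_bet / df_bet) / (ss_with / df_with)   # F = MS bet / MS with
    return ss_bet, ss_with, df_bet, df_with, F
```

The returned F ratio is then compared against the critical F for (df bet, df with) at the chosen alpha level.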

11 Post Hoc Tests (Post Tests)
Post hoc tests are additional hypothesis tests that are done after an ANOVA to determine exactly which mean differences are significant and which are not.

12 Post Hoc Tests (Post Tests)
We use post hoc tests when (1) we reject the null hypothesis and (2) there are three or more groups. Common tests: 1. Tukey's, 2. Scheffé, Bonferroni, Duncan, LSD, etc. Tukey's Honestly Significant Difference (HSD) test uses q, the Studentized range statistic: HSD = q·√(MS with / n).

13 Coefficient of Determination
If r = 0.80, then r² = This means 64% of the variability in the Y scores can be predicted from the relationship with X. (Equivalently, r = √r².)

14 Measuring Effect Size for ANOVA with the Coefficient of Determination: r² = η² (eta squared)

15 Problems
The data in the next slide were obtained from an independent-measures experiment designed to examine people's preferences for the viewing distance of a 60-inch high-definition television. Four viewing distances were evaluated, 9 feet, 12 feet, 15 feet, and 18 feet, with a separate group of participants tested at each distance. Each individual watched a 30-minute television program from a specific distance and then completed a brief questionnaire measuring their satisfaction with the experience.

16 Problems
One question asked them to rate the viewing distance on a scale from 1 (Very Bad: definitely need to move closer or farther away) to 7 (Excellent: perfect viewing distance). The purpose of the ANOVA is to determine whether there are any significant differences among the four viewing distances that were tested. Before we begin the hypothesis test, note that we have already computed several summary statistics for the data in the next slide. Specifically, the treatment totals (T) and SS values are shown for the entire set of data.

17 Problems
         9 feet   12 feet   15 feet   18 feet
T        5        10        25        20
SS       8        8         10        6
M        1        2         5         4
n        5        5         5         5
N = 20, G = 60, ΣX² = 262, k = 4

18 Problems
Having these summary values simplifies the computations in the hypothesis test, and we suggest that you always compute these summary statistics before you begin an ANOVA. We will set alpha at α = .05.
Step 1) H0: µ1 = µ2 = µ3 = µ4 (There is no treatment effect.)
H1: At least one of the treatment means is different.

19 Step 2
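Step 2 appeared as an image in the original deck; presumably it applied the standard formulas to the summary table. A sketch of that arithmetic for the viewing-distance data:

```python
T = [5, 10, 25, 20]     # treatment totals for 9, 12, 15, 18 feet
ss = [8, 8, 10, 6]      # SS inside each treatment
n, N, G = 5, 20, 60     # per-group n, total N, grand total G

ss_bet = sum(t ** 2 / n for t in T) - G ** 2 / N   # 230 - 180 = 50
ss_with = sum(ss)                                  # 32
df_bet, df_with = 3, 16                            # k - 1 and N - k
F = (ss_bet / df_bet) / (ss_with / df_with)
print(round(F, 2))  # 8.33
```

With α = .05 and df = (3, 16), the critical F is about 3.24, so an obtained F of 8.33 leads to rejecting H0.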

20 Problems
A human factors psychologist studied three computer keyboard designs. Three samples of individuals were given material to type on a particular keyboard, and the number of errors committed by each participant was recorded. The data are on the next slide. Set alpha at α = .01.

21 Problems
         Keyboard A   Keyboard B   Keyboard C
T        5            25           30
SS       12           20           14
M        1            5            6
n        5            5            5
N = 15, G = 60, ΣX² = 356, k = 3
Is there a significant difference among the three computer keyboard designs?

22 Problems
Step 1) H0: µ1 = µ2 = µ3 (There are no differences among the computer keyboard designs.)
H1: At least one of the computer keyboard designs is different.

23 Step 2
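Likewise, Step 2 for the keyboard data was an image in the original deck; a sketch of the standard arithmetic (per-group n = 5 is inferred from M = T/n):

```python
T = [5, 25, 30]      # treatment totals for keyboards A, B, C
ss = [12, 20, 14]    # SS inside each treatment
n, N, G = 5, 15, 60  # per-group n (inferred), total N, grand total G

ss_bet = sum(t ** 2 / n for t in T) - G ** 2 / N   # 310 - 240 = 70
ss_with = sum(ss)                                  # 46
F = (ss_bet / 2) / (ss_with / 12)                  # df bet = 2, df with = 12
print(round(F, 2))  # 9.13
```

With α = .01 and df = (2, 12), the critical F is about 6.93, so an obtained F of 9.13 leads to rejecting H0: the keyboard designs differ significantly.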

24 CHAPTER 14

25 Chapter 15 Correlation & Regression

26 What Is Correlation?
Correlation measures the strength and the direction of the relationship between two or more variables. A correlation has three components:
1. The strength of the coefficient
2. The direction of the relationship
3. The form of the relationship
1. The strength of the coefficient is indicated by its absolute value. The closer the value is to 1.0, either positive or negative, the stronger the linear relationship; the closer the value is to 0, the weaker the linear relationship.

27 Correlation
2. The direction of the relationship is indicated by the sign of the correlation coefficient. A positive coefficient indicates that as one variable (X) increases, so does the other (Y). A negative coefficient indicates that as one variable (X) increases, the other variable (Y) decreases.
3. The form of the relationship is linear. In correlation, variables are not identified as independent or dependent, because the researcher is measuring a single relationship mutually shared between the two variables. As a result, causality should not be inferred from a correlation.

28 Correlation
Remember, the correlation coefficient can only measure a linear relationship. A zero correlation indicates no linear relationship; however, it does not indicate no relationship at all. A coefficient of zero rules out a linear relationship, but a curvilinear relationship could still exist. The scatterplots below illustrate this point (scatterplots omitted from the transcript).

29 Correlation Is Based on a Statistic Called Covariance
Variance and covariance can be used to measure the quality of an individual item in a test; reliability and validity measure the quality of the entire test. σ² = SS/N is used for one set of data. Variance is the degree of variability of the scores around the mean.

30 The Correlational Method: SS, Standard Deviations, and Variances
Population: σ² = SS/N, σ = √(SS/N)
Sample: s² = SS/(n - 1) = SS/df, s = √(SS/df)
SS = Σx² - (Σx)²/N (computational formula)
SS = Σ(x - μ)² (definitional formula: sum of squared deviations from the mean)

31 Variance
Example data X: 1, 2, 4, 5
Population: σ² = SS/N    Sample: s² = SS/(n - 1) = SS/df
SS = Σx² - (Σx)²/N or SS = Σ(x - μ)², the sum of squared deviations from the mean. For these scores, SS = 10, so σ² = 10/4 = 2.5 and s² = 10/3 ≈ 3.33.

32 Covariance
Correlation is based on a statistic called covariance: Cov(X,Y), or Sxy, = SP/(n - 1). Correlation: r = SP/√(SSx·SSy). Covariance is a number that reflects the degree to which two variables vary together. (X, Y data table omitted.)

33 Covariance
Cov(X,Y) = SP/(n - 1)
Two ways to calculate SP:
SP = Σxy - (Σx·Σy)/N (computational formula)
SP = Σ(x - μx)(y - μy) (definitional formula)
SP requires two sets of data; SS requires only one set of data.
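The SP formula and the Pearson r formula above can be sketched as follows (illustrative; the function names are my own):

```python
def sum_of_products(x, y):
    """Computational formula: SP = Σxy - (Σx·Σy)/N."""
    return sum(a * b for a, b in zip(x, y)) - sum(x) * sum(y) / len(x)

def ss(v):
    """SS = Σ(x - mean)²."""
    m = sum(v) / len(v)
    return sum((a - m) ** 2 for a in v)

def pearson_r(x, y):
    """r = SP / √(SSx · SSy)."""
    return sum_of_products(x, y) / (ss(x) * ss(y)) ** 0.5
```

For example, pearson_r([1, 2, 3, 4], [2, 4, 6, 8]) is 1.0, a perfect positive correlation.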

34 The Correlational Method
Correlation is the degree to which events or characteristics vary with each other. It measures the strength of a relationship but does not imply cause and effect. The people chosen for a study are its subjects or participants, collectively called a sample; the sample must be representative.

35 The Correlational Method
Correlational data can be graphed, and a "line of best fit" can be drawn. Types of correlation: 1. Pearson, 2. Spearman, 3. Point-Biserial, 4. Partial.

36 Types of Correlation
In correlational research we use continuous variables (interval or ratio scale) for 1. the Pearson correlation (for linear relationships). If it is difficult to measure a variable on an interval or ratio scale, then we use:

37 Types of Correlation 2.Spearman Correlation
The Spearman correlation uses ordinal, or rank-ordered, data and measures the consistency of a relationship (a monotonic relationship). Ex. a teacher may feel confident about rank-ordering students' leadership abilities but would find it difficult to measure leadership on some other scale.

38 Monotonic Transformation
Monotonic data are rank-ordered numbers on an ordinal scale; examples: 1, 2, 3, 4 or 2, 4, 6, 8. The Spearman correlation can be used to measure the degree of monotonic relationship between two variables.

39 Ex. of Monotonic Data (X, Y data table omitted)

40 Types of Correlation
3. The Point-Biserial Correlation. We can use both continuous and discrete variables (data) in the point-biserial correlation. (It can be a substitute for the independent-measures t test.)

41 3. The Point-Biserial Correlation
The point-biserial correlation is used to measure the relationship between two variables in situations in which one variable consists of regular numerical scores (a non-dichotomy) but the second variable has only two values (a dichotomy). We can also calculate this correlation from the t test: r² = t²/(t² + df), where r² is the coefficient of determination, a measure of effect size (analogous to Cohen's d); r = √r².
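The r² = t²/(t² + df) conversion is simple enough to sketch directly (an illustrative helper; the name is my own):

```python
def r_squared_from_t(t, df):
    """Effect size r² recovered from an independent-measures t statistic."""
    return t ** 2 / (t ** 2 + df)
```

For example, a t of 3 with df = 16 gives r² = 9/25 = 0.36, meaning 36% of the variance is accounted for.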

42 4. A Partial Correlation
In special situations we can use partial correlations. A partial correlation measures the relationship between two variables while controlling for the influence of a third variable by holding it constant. Ex. the correlation between the number of churches and the amount of crime (the third variable is population).

43 The Correlational Method
Correlational data can be graphed and a “line of best fit” can be drawn

44 Positive correlation: variables change in the same direction

45 Positive Correlation

46 Negative correlation: variables change in the opposite direction

47 Negative Correlation

48 No Correlation
Unrelated: no consistent relationship

49 No Correlation

50 The Correlational Method
The magnitude (strength) of a correlation is also important High magnitude = variables which vary closely together; fall close to the line of best fit Low magnitude = variables which do not vary as closely together; loosely scattered around the line of best fit

51 The Correlational Method
Direction and magnitude of a correlation are often calculated statistically Called the “correlation coefficient,” symbolized by the letter “r” Sign (+ or -) indicates direction Number (from 0.00 to 1.00) indicates magnitude 0.00 = no consistent relationship +1.00 = perfect positive correlation -1.00 = perfect negative correlation Most correlations found in psychological research fall far short of “perfect”

52 The Correlational Method
Correlations can be trusted based on statistical probability “Statistical significance” means that the finding is unlikely to have occurred by chance By convention, if there is less than a 5% probability that findings are due to chance (p < 0.05), results are considered “significant” and thought to reflect the larger population Generally, confidence increases with the size of the sample and the magnitude of the correlation

53 The Correlational Method
Advantages of correlational studies: Have high external validity Can generalize findings Can repeat (replicate) studies on other samples Difficulties with correlational studies: Lack internal validity Results describe but do not explain a relationship

54 External & Internal Validity
External Validity External validity addresses the ability to generalize your study to other people and other situations. Internal Validity Internal validity addresses the "true" causes of the outcomes that you observed in your study. Strong internal validity means that you not only have reliable measures of your independent (predictors) and dependent variables (criterions) BUT a strong justification that causally links your independent variables (IV) to your dependent variables (DV).

55 The Correlational Method: Pearson
r = SP/√(SSx·SSy), with df = n - 2
SP requires two sets of data; SS requires only one set of data. (X, Y data table omitted.)

56 The Correlational Method: Spearman
r = SP/√(SSx·SSy), applied to the ranks (Original Data → Ranks)
SP requires two sets of data; SS requires only one set of data. (X, Y data table omitted.)
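The Spearman procedure can be sketched by ranking each variable and then applying the Pearson formula to the ranks (illustrative; averaging tied ranks is the usual convention, though the slides do not show tie handling):

```python
def ranks(values):
    """Rank from 1 (smallest); ties receive the average of their ranks."""
    ordered = sorted(values)
    return [(2 * ordered.index(v) + 1 + ordered.count(v)) / 2 for v in values]

def spearman_r(x, y):
    """Pearson r computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    sp = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    ssx = sum((a - mx) ** 2 for a in rx)
    ssy = sum((b - my) ** 2 for b in ry)
    return sp / (ssx * ssy) ** 0.5
```

For the monotonic but nonlinear pairs X = 1, 2, 3, 4 and Y = 1, 4, 9, 100, spearman_r returns 1.0 even though the Pearson r on the raw scores would be below 1.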

57 Percentage of Variance Accounted for by the Treatment
The percentage of variance accounted for by the treatment (a measure of effect size, similar in purpose to Cohen's d) is known as ω² (omega squared); it is also called the coefficient of determination (next slide).

58 Coefficient of Determination
If r = 0.80, then r² = This means 64% of the variability in the Y scores can be predicted from the relationship with X. (Equivalently, r = √r².)

59 Problems
Test the hypothesis for the following n = 4 pairs of scores for a correlation, α = .01. r = SP/√(SSx·SSy). (X, Y data table omitted.)

60 Problems
Step 1) H0: ρ = 0 (There is no population correlation.)
H1: ρ ≠ 0 (There is a real correlation.)
ρ (rho) denotes the population correlation. We will set alpha at α = .01.

61 STEP 2
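The X, Y values for this problem were lost in the transcript, so the data below are hypothetical; the sketch shows the standard test of H0: ρ = 0 via t = r·√(df/(1 - r²)) with df = n - 2:

```python
x = [1, 2, 3, 4]   # hypothetical data; the slide's values did not survive
y = [2, 1, 4, 3]   # hypothetical data

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sp = sum((a - mx) * (b - my) for a, b in zip(x, y))
ssx = sum((a - mx) ** 2 for a in x)
ssy = sum((b - my) ** 2 for b in y)
r = sp / (ssx * ssy) ** 0.5          # Pearson r: 0.6 for this data
df = n - 2
t = r * (df / (1 - r ** 2)) ** 0.5   # compare against the critical t for df = 2
```

For these hypothetical scores r = .60 and t ≈ 1.06, far below the two-tailed critical t for df = 2 at α = .01, so H0 would not be rejected.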

62

63 Problems
Test the hypothesis for the following set of n = 5 pairs of scores for a positive correlation, α = .05. (X, Y data table omitted.)

64 Problems
Step 1) H0: ρ ≤ 0 (The population correlation is not positive.)
H1: ρ > 0 (The population correlation is positive.)
ρ (rho) denotes the population correlation. We will set alpha at α = .05.

65

66 Three Levels of Analysis for Prediction/Validity
INPUTS → PROCESSES → OUTCOMES
Ex. Stress (INPUT) is an unpleasant psychological state (PROCESS) that occurs in response to environmental pressures (e.g., job demands) and can lead to withdrawal (OUTCOME).

67

68 Bi-Variate Regression Analysis
Bi-variate regression analysis extends correlation and attempts to measure the extent to which a predictor variable (X) can be used to make a prediction about a criterion measure (Y). Bi-variate regression uses a linear model to predict the criterion measure. The formula for the predicted score is: Y' = a + bX

69 Bivariate Regression
The components of the line of best fit (Y' = a + bX) are: the Y-intercept (a), the constant; and the slope (b), the coefficient on the predictor variable (X).

70 Bivariate Regression
The Y-intercept is the average value of Y when X is zero. The Y-intercept is also called the constant, because this is the amount of Y that is present when the influence of X is null (0). The slope is the average change in Y for a one-unit change in X; thus, the slope represents the direction and intensity of the line.
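The intercept and slope described above can be computed with the standard least-squares formulas b = SP/SSx and a = M_Y - b·M_X (these formulas are standard, though the slide does not spell them out):

```python
def fit_line(x, y):
    """Return (a, b) for the line of best fit Y' = a + bX."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sp = sum((a - mx) * (b - my) for a, b in zip(x, y))
    ssx = sum((a - mx) ** 2 for a in x)
    b = sp / ssx       # slope: average change in Y per one-unit change in X
    a = my - b * mx    # Y-intercept: value of Y' when X = 0
    return a, b
```

For example, for X = 0, 1, 2, 3 and Y = 1, 3, 5, 7 the fit is a = 1 and b = 2, i.e. Y' = 1 + 2X.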

71 Regression and Prediction
Y' = bX + a (regression line, with error term e)

72

73 Bivariate Regression Line of Best Fit: Y' = 2.635 + .204X
With this equation a predicted score may be made for any value of X within the range of the data. a = 2.635 (Y-intercept) and b = .204 (slope).

74 Multiple Regression Analysis
Multiple regression analysis is an extension of bi-variate regression in which several predictor variables are used to predict one criterion measure (Y). In general, this method is considered advantageous, since an outcome measure can seldom be accurately explained by one predictor variable alone. Ex. three aspects of personality (OCPD, Narcissistic, Histrionic) predicting Depression: Y' = a + b1X1 + b2X2 + b3X3
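A sketch of fitting Y' = a + b1X1 + b2X2 by solving the normal equations (XᵀX)w = Xᵀy; both the tiny Gaussian-elimination solver and the data are illustrative assumptions, not from the slides:

```python
def solve(A, b):
    """Solve A·w = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bb] for row, bb in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_multiple(X, y):
    """X: rows of predictor values (x1, x2, ...); returns [a, b1, b2, ...]."""
    rows = [[1.0] + list(r) for r in X]   # prepend the intercept column
    cols = list(zip(*rows))
    XtX = [[sum(a * b for a, b in zip(c1, c2)) for c2 in cols] for c1 in cols]
    Xty = [sum(a * b for a, b in zip(c, y)) for c in cols]
    return solve(XtX, Xty)
```

With perfectly linear data the fit recovers the generating coefficients exactly; with real data it gives the least-squares estimates.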

75 Path Analysis
Path analysis is an extension of regression. In path analysis the researcher examines the ability of more than one predictor variable (X) to explain or predict multiple dependent variables (Y). Ex. two aspects of personality (OCPD, Narcissistic) predicting Depression and Anxiety. (Path diagram: X1 and X2 each predicting Y1 and Y2, with error terms E.)

76

77 Relationship of Statistical Tests
Does this diagram make sense to you?

