Orthogonal Linear Contrasts: a technique for partitioning the ANOVA sum of squares into individual degrees of freedom.

Definition: Let x_1, x_2, ..., x_p denote p numerical quantities computed from the data. These could be statistics or the raw observations. A linear combination of x_1, x_2, ..., x_p is a quantity L computed in the following manner: L = c_1 x_1 + c_2 x_2 + ... + c_p x_p, where the coefficients c_1, c_2, ..., c_p are predetermined numerical values.

Definition: If the coefficients c_1, c_2, ..., c_p satisfy c_1 + c_2 + ... + c_p = 0, then the linear combination L = c_1 x_1 + c_2 x_2 + ... + c_p x_p is called a linear contrast.

Example: L = x_1 - 4x_2 + 6x_3 - 4x_4 + x_5 = (1)x_1 + (-4)x_2 + (6)x_3 + (-4)x_4 + (1)x_5 is a linear combination; since its coefficients sum to 1 - 4 + 6 - 4 + 1 = 0, it is also a linear contrast.

Definition: Let A = a_1 x_1 + a_2 x_2 + ... + a_p x_p and B = b_1 x_1 + b_2 x_2 + ... + b_p x_p be two linear contrasts of the quantities x_1, x_2, ..., x_p. Then A and B are called orthogonal linear contrasts if, in addition to a_1 + a_2 + ... + a_p = 0 and b_1 + b_2 + ... + b_p = 0, it is also true that a_1 b_1 + a_2 b_2 + ... + a_p b_p = 0.

Example: Let A = x_1 - x_2 and B = x_1 + x_2 - 2x_3. Note: the coefficients of each contrast sum to zero, and (1)(1) + (-1)(1) + (0)(-2) = 0, so A and B are orthogonal linear contrasts.
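The two defining conditions can be checked numerically. A minimal NumPy sketch, using the hypothetical contrasts A = x_1 - x_2 and B = x_1 + x_2 - 2x_3:

```python
import numpy as np

# Coefficient vectors of two linear contrasts of x1, x2, x3
a = np.array([1, -1, 0])   # A = x1 - x2
b = np.array([1, 1, -2])   # B = x1 + x2 - 2*x3

# Each is a contrast: its coefficients sum to zero
print(a.sum(), b.sum())    # 0 0

# Orthogonality: the dot product of the coefficient vectors is zero
print(np.dot(a, b))        # 0
```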

Definition: Let A = a_1 x_1 + a_2 x_2 + ... + a_p x_p, B = b_1 x_1 + b_2 x_2 + ... + b_p x_p, ..., and L = l_1 x_1 + l_2 x_2 + ... + l_p x_p be a set of linear contrasts of the quantities x_1, x_2, ..., x_p. Then the set is called a set of mutually orthogonal linear contrasts if each linear contrast in the set is orthogonal to every other linear contrast in the set.

Theorem: The maximum number of linear contrasts in a set of mutually orthogonal linear contrasts of the quantities x_1, x_2, ..., x_p is p - 1. The value p - 1 is called the degrees of freedom (df) for comparing the quantities x_1, x_2, ..., x_p.

Comments
1. Linear contrasts make comparisons amongst the p values x_1, x_2, ..., x_p.
2. Orthogonal linear contrasts make independent comparisons amongst the p values x_1, x_2, ..., x_p.
3. The number of independent comparisons amongst the p values x_1, x_2, ..., x_p is p - 1.

Definition: Let L = c_1 x̄_1 + c_2 x̄_2 + ... + c_p x̄_p denote a linear contrast of the p treatment means x̄_1, x̄_2, ..., x̄_p. If each mean x̄_i is calculated from n observations, then the sum of squares for testing the linear contrast L is defined to be:
SS_L = n L^2 / (c_1^2 + c_2^2 + ... + c_p^2)

The degrees of freedom (df) for testing the linear contrast L is 1, and the F-ratio for testing the linear contrast L is defined to be:
F = SS_L / MS_Error
which is compared with the F distribution with 1 and the Error degrees of freedom.
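These two formulas are easy to compute directly. A short Python sketch; the means, n, and MS_Error below are hypothetical illustration values, not taken from the text:

```python
import numpy as np

def contrast_ss(means, c, n):
    """Sum of squares for a linear contrast of p treatment means,
    each mean based on n observations: SS_L = n * L^2 / sum(c_i^2)."""
    c = np.asarray(c, dtype=float)
    L = float(np.dot(c, means))          # estimated value of the contrast
    return n * L**2 / np.sum(c**2)

# Hypothetical example: 3 treatment means, n = 5 observations each.
# The contrast c = (1, 1, -2) compares mean 3 with the average of means 1 and 2.
means = [10.0, 12.0, 17.0]
ss = contrast_ss(means, [1, 1, -2], 5)   # L = -12, so SS_L = 5 * 144 / 6 = 120
ms_error = 4.0                           # hypothetical MS_Error from an ANOVA table
F = ss / ms_error                        # F-ratio with 1 and nu (Error) df
print(ss, F)                             # 120.0 30.0
```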

Theorem: Let L_1, L_2, ..., L_{p-1} denote p - 1 mutually orthogonal linear contrasts for comparing the p means. Then the sum of squares for comparing the p means based on p - 1 degrees of freedom, SS_Between, satisfies:
SS_Between = SS_{L_1} + SS_{L_2} + ... + SS_{L_{p-1}}

Comment: Defining a set of orthogonal linear contrasts for comparing the p means allows the researcher to "break apart" the sum of squares for comparing the p means, SS_Between, and make an individual test of each linear contrast.
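The partition can be illustrated numerically. A minimal NumPy sketch with simulated data and a hypothetical choice of two orthogonal contrasts for p = 3 groups:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 3, 10
# Simulated balanced one-way layout: p groups of n observations
data = rng.normal(size=(p, n)) + np.array([[0.0], [1.0], [2.0]])

means = data.mean(axis=1)
grand = data.mean()
ss_between = n * np.sum((means - grand) ** 2)

# Two mutually orthogonal contrasts of the 3 means (a hypothetical choice)
contrasts = [np.array([1.0, -1.0, 0.0]),
             np.array([1.0, 1.0, -2.0])]

# SS for each contrast: n * L^2 / sum(c_i^2)
ss_parts = [n * np.dot(c, means) ** 2 / np.dot(c, c) for c in contrasts]

# The p - 1 = 2 contrast sums of squares add up to SS_Between
print(ss_between, sum(ss_parts))
```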

The Diet-Weight Gain example. The sum of squares for comparing the 6 means is given in the ANOVA table.

Five mutually orthogonal contrasts are given below, together with a description of the purpose of each contrast:
- A comparison of the high-protein diets with the low-protein diets.
- A comparison of the beef source of protein with the pork source of protein.

- A comparison of the meat (beef, pork) sources of protein with the cereal source of protein.
- A comparison representing the interaction between level of protein and source of protein for the meat sources of protein.
- A comparison representing the interaction between level of protein and the cereal source of protein.

The ANOVA table for testing these contrasts is given below. The mutually orthogonal contrasts that are eventually selected should be determined prior to observing the data, and should be determined by the objectives of the experiment.

Another five mutually orthogonal contrasts are given below, together with a description of the purpose of each contrast:
- A comparison of the beef source of protein with the pork source of protein.
- A comparison of the meat (beef, pork) sources of protein with the cereal source of protein.

- A comparison of the high- and low-protein diets for the beef source of protein.
- A comparison of the high- and low-protein diets for the cereal source of protein.
- A comparison of the high- and low-protein diets for the pork source of protein.

The ANOVA table for testing these contrasts is given below:

Orthogonal Linear Contrasts: Polynomial Regression

Orthogonal Linear Contrasts for Polynomial Regression

Example: In this example we are measuring the "life" of an electronic component and how it depends on the temperature at activation.

The ANOVA table (Source, SS, df, MS, F): the Treatment sum of squares is partitioned into Linear, Quadratic, Cubic, and Quartic components, each with 1 df; Total SS = 730 with 14 df. Contrast values: L = 25.00, Q2 = ..., C = 0.00, Q4 = 30.00.

The ANOVA tables for determining the degree of the polynomial: testing for an effect of the factor.

Testing for departure from Linear

Testing for departure from Quadratic
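The standard orthogonal polynomial contrast coefficients for five equally spaced factor levels (linear through quartic, as in the table above) can be written down and checked for mutual orthogonality. A short NumPy sketch:

```python
import numpy as np

# Standard orthogonal polynomial contrast coefficients
# for p = 5 equally spaced factor levels
poly = {
    "linear":    np.array([-2, -1,  0,  1,  2]),
    "quadratic": np.array([ 2, -1, -2, -1,  2]),
    "cubic":     np.array([-1,  2,  0, -2,  1]),
    "quartic":   np.array([ 1, -4,  6, -4,  1]),
}

# Each row is a contrast: its coefficients sum to zero
for c in poly.values():
    assert c.sum() == 0

# Any two distinct rows are orthogonal
names = list(poly)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        assert np.dot(poly[names[i]], poly[names[j]]) == 0

print("all five-level polynomial contrasts are mutually orthogonal")
```

Each contrast's sum of squares is computed exactly as before, SS = n * L^2 / sum(c_i^2), which is how the Treatment SS splits into linear, quadratic, cubic, and quartic components.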

Multiple Testing: Tukey's multiple comparison procedure; Scheffé's multiple comparison procedure

Multiple Testing - a Simple Example. Suppose we are interested in testing whether two parameters (β_1 and β_2) are equal to zero. There are two approaches:
1. We could test each parameter separately:
   a) H_0: β_1 = 0 against H_A: β_1 ≠ 0, then
   b) H_0: β_2 = 0 against H_A: β_2 ≠ 0.
2. We could develop an overall test of H_0: β_1 = 0 and β_2 = 0 against H_A: β_1 ≠ 0 or β_2 ≠ 0.

1. To test each parameter separately, a) and then b), we might use a test that rejects H_0 when the test statistic exceeds a critical value, where the critical value is chosen so that the probability of a Type I error for each individual test is α.

2. To perform an overall test of H_0: β_1 = 0 and β_2 = 0 against H_A: β_1 ≠ 0 or β_2 ≠ 0, we might use a single test whose critical value is chosen so that the overall probability of a Type I error is α.
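The motivation for controlling the overall error rate can be shown with a short calculation: if the two separate tests were independent and each run at level α, the chance of at least one Type I error exceeds α. A minimal sketch:

```python
# If m independent tests are each run at level alpha, the probability
# of making at least one Type I error (the familywise error rate) is
# 1 - (1 - alpha)^m, which is larger than alpha for m > 1.
alpha = 0.05
m = 2
familywise = 1 - (1 - alpha) ** m
print(familywise)   # 0.0975 (approximately; larger than 0.05)
```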

Post-hoc Tests Multiple Comparison Tests

Suppose we have p means. An F-test has revealed that there are significant differences amongst the p means. We want to perform an analysis to determine precisely where the differences exist.

Tukey’s Multiple Comparison Test

Tukey's critical difference:
D = q_α(p, ν) · s_x̄
where s_x̄ = sqrt(MS_Error / n) denotes the standard error of each mean, q_α(p, ν) is the tabled value of Tukey's studentized range statistic, p = number of means, and ν = df for Error. Two means are declared significantly different if they differ by more than this amount.
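A sketch of this computation in Python, using scipy.stats.studentized_range (available in SciPy 1.7+) in place of the printed table for q; the MS_Error, n, p, and ν values below are hypothetical:

```python
import numpy as np
from scipy.stats import studentized_range

def tukey_critical_difference(ms_error, n, p, df_error, alpha=0.05):
    """Tukey critical difference: q_alpha(p, nu) * sqrt(MS_Error / n)."""
    q = studentized_range.ppf(1 - alpha, p, df_error)
    return q * np.sqrt(ms_error / n)

# Hypothetical values: p = 6 means, n = 10 observations per mean,
# MS_Error = 4.0 on 54 error df (from an imagined ANOVA table)
d = tukey_critical_difference(ms_error=4.0, n=10, p=6, df_error=54)
print(d)
```

Any pair of means whose difference exceeds d would be declared significantly different.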

Scheffe’s Multiple Comparison Test

Scheffé's critical difference (for linear contrasts):
S = sqrt((p - 1) · F_α(p - 1, ν)) · s_L
where s_L = sqrt(MS_Error · (c_1^2 + ... + c_p^2) / n) is the standard error of the contrast, and F_α(p - 1, ν) is the tabled value of the F distribution (p - 1 = df for comparing p means, ν = df for Error). A linear contrast is declared significant if it exceeds this amount.

Scheffé's critical difference (for comparing two means):
S = sqrt((p - 1) · F_α(p - 1, ν)) · sqrt(2 · MS_Error / n)
Two means are declared significantly different if they differ by more than this amount.
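A sketch of Scheffé's critical value in Python; comparing two means is the special case of a contrast with coefficients c = (1, -1, 0, ..., 0). The numerical inputs are hypothetical:

```python
import numpy as np
from scipy.stats import f

def scheffe_critical_difference(c, ms_error, n, p, df_error, alpha=0.05):
    """Scheffe critical value for a contrast L = sum(c_i * mean_i):
    sqrt((p - 1) * F_alpha(p - 1, nu)) * sqrt(MS_Error * sum(c_i^2) / n)."""
    c = np.asarray(c, dtype=float)
    f_crit = f.ppf(1 - alpha, p - 1, df_error)
    s_L = np.sqrt(ms_error * np.sum(c**2) / n)
    return np.sqrt((p - 1) * f_crit) * s_L

# Hypothetical values: p = 6 means, n = 10 obs each, MS_Error = 4.0, 54 error df;
# the contrast (1, -1, 0, 0, 0, 0) compares the first two means
d = scheffe_critical_difference([1, -1, 0, 0, 0, 0],
                                ms_error=4.0, n=10, p=6, df_error=54)
print(d)
```

Because Scheffé's method protects every possible contrast simultaneously, its critical difference for a simple pairwise comparison is wider than Tukey's.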

Underlined groups have no significant differences

There are many multiple (post hoc) comparison procedures:
1. Tukey's
2. Scheffé's
3. Duncan's Multiple Range
4. Newman-Keuls
etc.

There is considerable controversy: "I have not included the multiple comparison methods of D. B. Duncan because I have been unable to understand their justification." - H. Scheffé, The Analysis of Variance