
Slide 1: Orthogonality

One way to delve further into the impact a factor has on the yield is to break the Sum of Squares (SSQ) into “orthogonal” components. If SSB_col has (C-1) df (which corresponds to having C levels, or C columns), then SSB_col can be broken up into (C-1) individual SSQ values, each with a single degree of freedom, and each addressing a different inquiry into the data’s message.

Slide 2: Orthogonality

If each “question” asked of the data is orthogonal to all the other “questions,” two generally desirable properties result:

1. Each “question” is independent of every other one; the probabilities of Type I and Type II errors in the ensuing hypothesis tests are independent and “stand alone.”

Slide 3: Orthogonality

2. The (C-1) SSQ values are guaranteed to add up exactly to the total SSB_col you started with.

What? How? Watch! Consider 4 column means (columns 1, 2, 3, 4):

  6   4   1   -3      Grand Mean = 2

Slide 4

Call these values Y_1, Y_2, Y_3, Y_4, and define

  Z_1 = Σ_{j=1..4} a_1j · Y_j,
  Z_2 = Σ_{j=1..4} a_2j · Y_j,
  Z_3 = Σ_{j=1..4} a_3j · Y_j.

Slide 5

Under what conditions will Σ_{i=1..3} Z_i² = Σ_{j=1..4} (Y_j - Ȳ)²?

One answer:

  (1) Σ_{j=1..4} a_ij² = 1 for all i  (i = 1, 2, 3)
  (2) Σ_{j=1..4} a_ij  = 0 for all i
  (3) Σ_{j=1..4} a_{i1,j} · a_{i2,j} = 0 for all i1 ≠ i2

A linear combination of the treatment means satisfying (2) is called a contrast; contrasts whose coefficient rows also satisfy (3) are called orthogonal.

Slide 6

Writing the a_ij’s as a “matrix,” one possibility among many:

    1/2   1/2  -1/2  -1/2
    1/2  -1/2   1/2  -1/2
    1/2  -1/2  -1/2   1/2

With Y = (Y_1, Y_2, Y_3, Y_4) = (6, 4, 1, -3) and Ȳ = 2:

  Z_1 = (1/2)(6) + (1/2)(4) - (1/2)(1) - (1/2)(-3) =  6,   Z_1² = 36
  Z_2 = (1/2)(6) - (1/2)(4) + (1/2)(1) - (1/2)(-3) =  3,   Z_2² =  9
  Z_3 = (1/2)(6) - (1/2)(4) - (1/2)(1) + (1/2)(-3) = -1,   Z_3² =  1
                                                           Total = 46

Slide 7

Σ_j (Y_j - Ȳ)² = (6-2)² + (4-2)² + (1-2)² + (-3-2)² = 16 + 4 + 1 + 25 = 46. OK! How does this help us?
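
To make Slides 5–7 concrete, here is a minimal NumPy sketch (not part of the original slides; array names are arbitrary) that checks conditions (1)–(3) for the coefficient matrix of Slide 6 and reproduces the 46 of Slide 7:

```python
import numpy as np

# Column means and grand mean from Slide 6
Y = np.array([6.0, 4.0, 1.0, -3.0])
Ybar = Y.mean()                      # grand mean = 2

# One possible set of contrast coefficients (rows of A), as on Slide 6
A = np.array([[ 0.5,  0.5, -0.5, -0.5],
              [ 0.5, -0.5,  0.5, -0.5],
              [ 0.5, -0.5, -0.5,  0.5]])

# Condition (1): each row's squared coefficients sum to 1
print((A**2).sum(axis=1))            # [1. 1. 1.]
# Condition (2): each row sums to 0, so each row is a contrast
print(A.sum(axis=1))                 # [0. 0. 0.]
# Condition (3): distinct rows are orthogonal (off-diagonal entries are 0)
print(A @ A.T)                       # identity matrix

# The Z values and the SSQ identity of Slide 7
Z = A @ Y
print(Z, Z**2)                                # [ 6.  3. -1.] [36.  9.  1.]
print((Z**2).sum(), ((Y - Ybar)**2).sum())    # 46.0  46.0
```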

Slide 8

Consider the following data, which, let’s say, are the column means of a one-factor ANOVA, with the one factor being “DRUG”:

  Ȳ.1 = 5,  Ȳ.2 = 6,  Ȳ.3 = 7,  Ȳ.4 = 10,  Ȳ.. = 7

and Σ_j (Ȳ.j - Ȳ..)² = 14. (SSB_c = 14·R, where R = # rows.)

Slide 9

Consider the following two examples.

Example 1:
  Column 1: Placebo
  Column 2: Sulfa Type S1
  Column 3: Sulfa Type S2
  Column 4: Antibiotic Type A

Suppose the questions of interest are
  (1) Placebo vs. non-placebo
  (2) S1 vs. S2
  (3) (Average) S vs. A

Slide 10

How would you combine columns to analyze each question?

                   P    S1   S2   A
  P vs. non-P:    -3     1    1   1
  S1 vs. S2:       0    -1    1   0
  S vs. A:         0    -1   -1   2

Note: conditions 2 and 3 are satisfied.

Slide 11

Divide the top row by √12, the middle row by √2, and the bottom row by √6 (to satisfy condition 1).

Slide 12

                         P        S1       S2       A
                         Ȳ.1=5    Ȳ.2=6    Ȳ.3=7    Ȳ.4=10     Z_i²
  Placebo vs. drugs:    -3/√12    1/√12    1/√12    1/√12      5.33
  S1 vs. S2:             0       -1/√2     1/√2     0          0.50
  Average S vs. A:       0       -1/√6    -1/√6     2/√6       8.17
                                                       Total:  14.00
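
A short NumPy sketch (not from the original slides; names are arbitrary) that normalizes the Slide 10 contrasts as in Slide 11 and reproduces the Z² column of Slide 12:

```python
import numpy as np

means = np.array([5.0, 6.0, 7.0, 10.0])          # P, S1, S2, A column means

# Raw contrasts from Slide 10, then normalized so each row's squares sum to 1
raw = np.array([[-3.0,  1.0,  1.0, 1.0],         # Placebo vs. drugs
                [ 0.0, -1.0,  1.0, 0.0],         # S1 vs. S2
                [ 0.0, -1.0, -1.0, 2.0]])        # average S vs. A
A = raw / np.sqrt((raw**2).sum(axis=1, keepdims=True))   # divides by sqrt(12), sqrt(2), sqrt(6)

Z = A @ means
print(np.round(Z**2, 2))        # [5.33 0.5  8.17]
print(round((Z**2).sum(), 2))   # 14.0 -- matches the sum of (column mean - grand mean)^2
```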

Slide 13

Example 2:

  Ȳ.1: Sulfa type S1
  Ȳ.2: Sulfa type S2
  Ȳ.3: Antibiotic type A1
  Ȳ.4: Antibiotic type A2

Slide 14

Exercise: Suppose the questions of interest are:
  1. The difference between sulfa types
  2. The difference between antibiotic types
  3. The difference between sulfa and antibiotic types, on average

Write down the three corresponding contrasts. Are they orthogonal? If not, can we make them orthogonal?

Slide 15

OK! Now to the analysis:

                         S1       S2       A1       A2
                        (5)      (6)      (7)      (10)       Z_i²
  S1 vs. S2:           -1/√2     1/√2     0        0          0.5
  A1 vs. A2:            0        0       -1/√2     1/√2       4.5
  Ave. S vs. Ave. A:   -1/2     -1/2      1/2      1/2        9.0
                                                      Total:  14.00
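
A small sketch answering the Slide 14 exercise numerically (not part of the original slides; assumes NumPy, and the variable names are arbitrary): the three raw contrasts are checked for orthogonality and then normalized to reproduce the Z² column above.

```python
import numpy as np

means = np.array([5.0, 6.0, 7.0, 10.0])          # S1, S2, A1, A2 column means

# Raw contrasts for the three questions of Slide 14
raw = np.array([[-1.0,  1.0,  0.0, 0.0],         # S1 vs. S2
                [ 0.0,  0.0, -1.0, 1.0],         # A1 vs. A2
                [-1.0, -1.0,  1.0, 1.0]])        # average S vs. average A

# Off-diagonal entries are all 0, so the three contrasts are already orthogonal
print(raw @ raw.T)

# Normalize (condition 1) and reproduce the Z^2 column of Slide 15
A = raw / np.sqrt((raw**2).sum(axis=1, keepdims=True))
Z = A @ means
print(np.round(Z**2, 2))        # [0.5 4.5 9. ]  -> total 14.0
```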

Slide 16

Example (R = 8 observations per column):

  Placebo:  Ȳ = 5
  ASP1:     Ȳ = 6
  ASP2:     Ȳ = 7
  Buff:     Ȳ = 10

  Ȳ.. = 7

Slide 17

ANOVA (table shown as an image on the original slide); critical value F.05(3,28) = 2.95.

Slide 18

Now, an orthogonal breakdown:

                          Placebo   ASP1    ASP2    Buff
                           (5)      (6)     (7)     (10)
  Placebo vs. others:     -3/√12    1/√12   1/√12   1/√12
  ASP1 vs. ASP2:           0       -1/√2    1/√2    0
  ASP vs. Buff:            0       -1/√6   -1/√6    2/√6
  Placebo vs. Buff:       -1/√2     0       0       1/√2

Slide 19

                           Z        Z²       Z² × 8
  Placebo vs. others:     8/√12     5.33     42.64
  ASP1 vs. ASP2:          1/√2      0.50      4.00
  ASP vs. Buff:           7/√6      8.17     65.36
                 Totals:           14.00    112.00

  Placebo vs. Buff:       5/√2     25/2     100

Slide 20

ANOVA:

  Source    SSQ      df   MSQ     F
  Drugs     112       3
    Z1       42.64    1   42.64    8.53
    Z2        4.00    1    4.00     .80
    Z3       65.36    1   65.36   13.07
  Error     140      28    5

  F_{1-.05}(1,28) = 4.20

  Z4        100       1   100     20        F_{1-.05/3}(1,28) < 7.64
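
If NumPy and SciPy are available, the following sketch (not from the original slides; names are arbitrary) reproduces the single-df F tests of Slides 18–20 from the column means, R = 8, and MSE = 5. The slides round Z² before multiplying by 8, so their 42.64 and 65.36 appear here as 42.67 and 65.33; the F ratios agree.

```python
import numpy as np
from scipy import stats

means = np.array([5.0, 6.0, 7.0, 10.0])     # Placebo, ASP1, ASP2, Buff
R, mse, df_err = 8, 5.0, 28                  # from Slides 16 and 20

contrasts = {"Placebo vs. others": [-3, 1, 1, 1],
             "ASP1 vs. ASP2":      [ 0,-1, 1, 0],
             "ASP vs. Buff":       [ 0,-1,-1, 2],
             "Placebo vs. Buff":   [-1, 0, 0, 1]}   # 4th contrast: not orthogonal to the first row

for name, c in contrasts.items():
    c = np.array(c, dtype=float)
    c /= np.sqrt((c**2).sum())               # normalize so the squared coefficients sum to 1
    ssq = R * (c @ means)**2                 # single-df sum of squares
    F = ssq / mse
    print(f"{name:20s} SSQ={ssq:7.2f}  F={F:6.2f}")

# Critical values quoted on Slide 20: unadjusted, and Bonferroni-adjusted (alpha/3)
print(stats.f.ppf(0.95, 1, df_err))          # ~4.20
print(stats.f.ppf(1 - 0.05/3, 1, df_err))    # between 4.20 and the 1% point 7.64
```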

Slide 21

Conclusion of the drug example: a significant difference between Placebo and the rest, and between the ASPs and Buff, but not between the two different ASPs.

Another example: the variable (coded) is mileage per gallon.

  Gasoline   YIELD   Description
  I           -4     Standard gasoline
  II          19     Standard, plus additive A made by P
  III         21     Standard, plus additive B made by P
  IV          10     Standard, plus additive A made by Q
  V           18     Standard, plus additive B made by Q

Slide 22

Questions actually chosen:

  (Z1) Standard gasoline vs. gasoline with an additive
  (Z2) P vs. Q
  (Z3) Between the two additives of P
  (Z4) Between the two additives of Q

Slide 23

With the appropriate orthogonal matrix and Z² values:

          I        II       III      IV       V        Z_i²
  Z1:   -4/√20    1/√20    1/√20    1/√20    1/√20    352.8
  Z2:    0        1/2      1/2     -1/2     -1/2       36.0
  Z3:    0        1/√2    -1/√2     0        0          2.0
  Z4:    0        0        0        1/√2    -1/√2      32.0
                                                Total: 422.8

By far, the largest part of the total variability in yields is associated with standard gasoline vs. gasoline with an additive.
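
A minimal check of Slide 23 (not part of the original slides; assumes NumPy, arbitrary names), verifying the four Z² values and that they sum to the total SSQ among the five yields:

```python
import numpy as np

yields = np.array([-4.0, 19.0, 21.0, 10.0, 18.0])   # gasolines I..V (coded mpg)

raw = np.array([[-4.0, 1.0, 1.0,  1.0,  1.0],   # Z1: standard vs. any additive
                [ 0.0, 1.0, 1.0, -1.0, -1.0],   # Z2: P vs. Q
                [ 0.0, 1.0,-1.0,  0.0,  0.0],   # Z3: between the two additives of P
                [ 0.0, 0.0, 0.0,  1.0, -1.0]])  # Z4: between the two additives of Q
A = raw / np.sqrt((raw**2).sum(axis=1, keepdims=True))   # normalize each row

Z = A @ yields
print(np.round(Z**2, 1))                                  # [352.8  36.   2.  32.]
print(round((Z**2).sum(), 1),                             # 422.8
      round(((yields - yields.mean())**2).sum(), 1))      # 422.8
```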

Slide 24: Orthogonal Breakdowns in 2^k and 2^(k-p) Designs

Let n = 4. Let the four observed yields be the four yields of a 2² factorial experiment:

  Y_1 = (1),  Y_2 = a,  Y_3 = b,  Y_4 = ab

Slide 25

Example: Miles per Gallon by Gas Type and Auto Make

  Group:    1    2    3    4
           16   28   16   28
           22   27   25   30
           16   17   16   19
           10   20   16   18
           18   23   19   24
            8   23   16   25
  Means:   15   23   18   24

Slide 26

Suppose:

  Column 1 → (1):  mean 15
  Column 2 → a:    mean 23
  Column 3 → b:    mean 18
  Column 4 → ab:   mean 24

  A = Gas Type  (levels 0, 1)
  B = Auto Make (levels 0, 1)      (a 2² design)

Slide 27

Earlier we formed:

                       (1)    a     b    ab
  estimate of 2A:      -1    +1    -1   +1
  estimate of 2B:      -1    -1    +1   +1
  estimate of 2AB:     +1    -1    -1   +1

Slide 28

Which, for present purposes, we replace by:

           (1)     a      b      ab
  A:      -1/2   +1/2   -1/2   +1/2
  B:      -1/2   -1/2   +1/2   +1/2
  AB:     +1/2   -1/2   -1/2   +1/2

Now we can see that these coefficients of the yields are elements of an orthogonal matrix. So A, B, and AB constitute orthogonal estimates.

Slide 29

Standard one-way ANOVA:

  Source   SSQ   df   MSQ    F
  Col      324    3   108    5.4
  Error    400   20    20

  F_{.95}(3,20) ≈ 3.1
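
A minimal NumPy sketch (not from the original slides; names are arbitrary) reproducing the Slide 29 one-way ANOVA from the Slide 25 data as transcribed:

```python
import numpy as np

# Miles-per-gallon data from Slide 25: 6 observations per column, columns = groups 1..4
data = np.array([[16, 28, 16, 28],
                 [22, 27, 25, 30],
                 [16, 17, 16, 19],
                 [10, 20, 16, 18],
                 [18, 23, 19, 24],
                 [ 8, 23, 16, 25]], dtype=float)

col_means = data.mean(axis=0)                 # [15. 23. 18. 24.]
grand = data.mean()                           # 20.0

ss_col = data.shape[0] * ((col_means - grand)**2).sum()   # between-column SSQ
ss_err = ((data - col_means)**2).sum()                     # within-column SSQ
df_col, df_err = data.shape[1] - 1, data.size - data.shape[1]

print(ss_col, df_col, ss_col / df_col)        # 324.0  3  108.0
print(ss_err, df_err, ss_err / df_err)        # 400.0  20  20.0
print((ss_col / df_col) / (ss_err / df_err))  # F = 5.4
```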

Slide 30

Then:

  A  = (-15 + 23 - 18 + 24)/2 = 14/2 =  7
  B  = (-15 - 23 + 18 + 24)/2 =  4/2 =  2
  AB = ( 15 - 23 - 18 + 24)/2 = -2/2 = -1

  A² = 49,  B² = 4,  AB² = 1

Slide 31

Multiply each of these by the number of data points in each column:

  A²:   6 × 49 = 294
  B²:   6 ×  4 =  24
  AB²:  6 ×  1 =   6
  TOTAL:         324
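 
A short sketch of the Slide 28–31 breakdown (not part of the original slides; assumes NumPy, arbitrary names): the ±1/2 rows applied to the column means give A, B, AB, and multiplying their squares by 6 recovers the single-df SSQ values.

```python
import numpy as np

means = np.array([15.0, 23.0, 18.0, 24.0])   # (1), a, b, ab column means; 6 observations each
n_per_col = 6

# Orthonormal rows from Slide 28 (the +/-1 signs of Slide 27, halved)
A = 0.5 * np.array([[-1,  1, -1,  1],    # A
                    [-1, -1,  1,  1],    # B
                    [ 1, -1, -1,  1]])   # AB

Z = A @ means                             # [ 7.  2. -1.]
ssq = n_per_col * Z**2                    # single-df sums of squares
print(ssq, ssq.sum())                     # [294.  24.   6.]  324.0 = SSQ(Col)
```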

Slide 32

And:

  Source    SSQ    df   MS     F_calc
  Col       324     3
    A       294     1   294    14.7
    B        24     1    24     1.2
    AB        6     1     6      .3
  Error     400    20    20

  F_{.95}(1,20) ≈ 4.3

Slide 33

If instead:

  Column 1 → c:    15
  Column 2 → a:    23
  Column 3 → b:    18
  Column 4 → abc:  24

  A = Gas Type,  B = Auto Make,  C = Highway      (a 2^(3-1) design)

Slide 34

ANOVA: We’d get the same breakdown of the SSQ, but, this being the “+” block of I = ABC, the sources are now alias pairs:

  Source    SSQ    df
  A + BC    294     1
  B + AC     24     1
  AB + C      6     1
  Error     400    20
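
The aliasing on Slide 34 can be seen directly from the design columns. A minimal sketch (not from the original slides; assumes NumPy, arbitrary names): build the full 2³ sign table, keep only the “+” block of I = ABC, and check that each main-effect column coincides with its alias’s interaction column there.

```python
import itertools
import numpy as np

# Full 2^3 design in -1/+1 coding for factors A, B, C
runs = np.array(list(itertools.product([-1, 1], repeat=3)))   # columns: A, B, C
A_col, B_col, C_col = runs.T

# Keep only the "+" block of the defining relation I = ABC
block = (A_col * B_col * C_col) == 1
A_blk, B_blk, C_blk = A_col[block], B_col[block], C_col[block]

# Within this half-fraction, each main-effect column equals its alias's interaction column
print(np.array_equal(A_blk, B_blk * C_blk))   # True: A is aliased with BC
print(np.array_equal(B_blk, A_blk * C_blk))   # True: B is aliased with AC
print(np.array_equal(C_blk, A_blk * B_blk))   # True: C is aliased with AB
```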

Slide 35

What if the contrasts of interest are not orthogonal?

* Bonferroni method: the same F test (SSQ = R × Z_i²), but using α = a/k per contrast, where a is the overall error rate and k is the number of contrasts of interest.
* Scheffé method: p. 108, skipped. (Reference: Statistical Principles of Research Design and Analysis, by Robert O. Kuehl.)

With k the number of contrasts of interest:
  1. If k ≤ C-1: Bonferroni method
  2. If k > C-1: Bonferroni or Scheffé method
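
A small sketch of the Bonferroni adjustment (not part of the original slides; assumes SciPy, and the helper name is hypothetical): the usual single-df F test, with the per-contrast level set to a/k. It is applied here to the extra “Placebo vs. Buff” contrast of Slide 20, using k = 3 as on that slide.

```python
from scipy import stats

def bonferroni_f_test(z, R, mse, df_err, k, overall_alpha=0.05):
    """F test for one normalized contrast Z, tested at level overall_alpha / k."""
    ssq = R * z**2                                    # single-df SSQ for the contrast
    F = ssq / mse
    f_crit = stats.f.ppf(1 - overall_alpha / k, 1, df_err)
    return F, f_crit, F > f_crit

# Placebo vs. Buff from Slide 20: Z = 5/sqrt(2), R = 8, MSE = 5, 28 error df
print(bonferroni_f_test(5 / 2**0.5, R=8, mse=5.0, df_err=28, k=3))   # F = 20, exceeds the adjusted critical value
```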

Slide 36

1. Can k be larger than C-1?
2. Can k be smaller than C-1?

If the k contrasts are required to be orthogonal: No to (1), Yes to (2). For case (2), do the same F test (but the sum of the SSQ values will not equal SSB). See Slide 16.

