
1 STT 511-STT411: DESIGN OF EXPERIMENTS AND ANALYSIS OF VARIANCE. Dr. Cuixian Chen. Chapter 3: Experiments with a Single Factor: ANOVA. (Design & Analysis of Experiments 8E, 2012, Montgomery.)

2 Portland Cement Formulation (page 26). Review: the two (independent) sample scenario from Chapter 2. An engineer is studying the formulation of a Portland cement mortar. He has added a polymer latex emulsion during mixing to determine whether this impacts the curing time and tension bond strength of the mortar. The experimenter prepared 10 samples of the original formulation and 10 samples of the modified formulation. Q: How many factors? How many levels? Factor: mortar formulation; levels: the two different formulations, treated as two treatments (two levels of the factor).

3 Review: In Chapter 2, we compared two conditions or treatments. That is a single-factor experiment with two levels of the factor, where the factor is mortar formulation and the two levels are the two different formulation methods. Many experiments of this type involve more than two levels of the factor. Chapter 3 focuses on methods for the design and analysis of single-factor experiments with an arbitrary number a of factor levels (a treatments). Eg: in clinical trials, we compare mean responses among a number of parallel-dose groups, or among strata based on patients' background information, such as race, age group, or disease severity. We assume that the experiment has been completely randomized.

4 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 4 What If There Are More Than Two Factor Levels?  There are lots of practical situations where there are: either more than two levels of interest, or there are several factors of simultaneous interest.  The t-test does not directly apply.  The analysis of variance (ANOVA) is the appropriate analysis “engine” for these types of experiments  The ANOVA was developed by Fisher in the early 1920s, and initially applied to agricultural experiments  Used extensively today for industrial experiments

5 An Example (See pg. 66). An engineer is interested in investigating the relationship between the RF power setting and the etch rate for a plasma etching tool. The objective of an experiment like this is to model the relationship between etch rate and RF power, and to specify the power setting that will give a desired target etch rate. The response variable is etch rate. She is interested in a particular gas (C2F6) and gap (0.80 cm), and wants to test four levels of RF power: 160W, 180W, 200W, and 220W, with five wafers tested at each level. That is, the experimenter chooses 4 levels of RF power and the experiment is replicated 5 times at each level, with the runs made in random order.

6 Example, page 66: Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 6

7 7 An Example (See pg. 66) Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery

8 8 An Example (See pg. 66) Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery Does changing the power change the mean etch rate? Is there an optimum level for power? We would like to have an objective way to answer these questions. The t-test really doesn’t apply here – more than two factor levels.

9 The Analysis of Variance (Sec. 3.2, pg. 68). In general, there will be a levels of the factor (a treatments) and n replicates of the experiment, run in random order: a completely randomized design (CRD). N = total number of runs. We consider the fixed effects case; the random effects case will be discussed later. The objective is to test hypotheses about the equality of the a treatment means. Example data layout:
Treatment   Observations
L1          0.9    1.0    1.1
L2          1.95   2.0    2.05
L3          2.99   3.0    3.01

10 Notation for ANOVA. Example data layout:
Treatment   Observations
L1          0.9    1.0    1.1
L2          1.95   2.0    2.05
L3          2.99   3.0    3.01

11 ANOVA Effects model. The name "analysis of variance" stems from a partitioning of the total variability in the response variable into components that are consistent with a model for the experiment. The basic single-factor ANOVA effects model is given below, where NID means normally and independently distributed.
Treatment   Observations
L1          0.9    1.0    1.1
L2          1.95   2.0    2.05
L3          2.99   3.0    3.01
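In symbols, a sketch of the standard single-factor effects model the slide refers to (LaTeX, usual Montgomery notation):

y_{ij} = \mu + \tau_i + \varepsilon_{ij}, \qquad i = 1, \dots, a, \; j = 1, \dots, n, \qquad \varepsilon_{ij} \sim \mathrm{NID}(0, \sigma^2),

where \mu is the overall mean, \tau_i is the i-th treatment effect, and \varepsilon_{ij} is the random error.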

12 ANOVA Means Models for the Data. There are several ways to write a model for the data; the means-model form and the expected response it implies are sketched below.
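A sketch of the two forms and the implied expectation (LaTeX, same notation as the effects model above):

\text{Means model: } y_{ij} = \mu_i + \varepsilon_{ij}, \qquad
\text{Effects model: } y_{ij} = \mu + \tau_i + \varepsilon_{ij}, \quad \mu_i = \mu + \tau_i,

so that E(y_{ij}) = \mu_i, i.e., each observation is expected to equal its own treatment mean.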

13 The Analysis of Variance (ANOVA). Total variability is measured by the total sum of squares. The basic ANOVA partitioning is obtained by adding and subtracting one term (the treatment average) inside the square. Does this expression depend on each j?
Treatment   Observations            Treatment mean
L1          0.9    1.0    1.1       1.0
L2          1.95   2.0    2.05      2.0
L3          2.99   3.0    3.01      3.0
Grand average = 2.0

14 The Analysis of Variance (ANOVA). Total variability is measured by the total sum of squares and is split by the basic ANOVA partitioning. Does this expression depend on each j?
Treatment   Observations            Treatment mean
L1          0.9    1.0    1.1       1.0
L2          1.95   2.0    2.05      2.0
L3          2.99   3.0    3.01      3.0
Grand average = 2.0

15 Design & Analysis of Experiments 8E 2012 Montgomery 15 The fundamental ANOVA identity Chapter 3  The total variability in the data, as measured by the total corrected sum of squares, can be partitioned into a sum of squares of the differences between the treatment averages and the grand average, plus a sum of squares of the differences of observations within treatments from the treatment average.  Now, the difference between the observed treatment averages and the grand average is a measure of the differences between treatment means, whereas the differences of observations within a treatment from the treatment average can be due only to random error.
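In symbols, the identity described above is (LaTeX sketch, with \bar{y}_{i.} the i-th treatment average and \bar{y}_{..} the grand average):

SS_T = \sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{..})^2
     = n \sum_{i=1}^{a} (\bar{y}_{i.} - \bar{y}_{..})^2 + \sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{i.})^2
     = SS_{\mathrm{Treatments}} + SS_E.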

16 The Analysis of Variance (ANOVA). SS_Treatments: sum of squares due to treatments (i.e., between treatments). SS_E: sum of squares due to error (i.e., within treatments). A large value of SS_Treatments reflects large differences in treatment means; a small value of SS_Treatments likely indicates no differences in treatment means. The formal statistical hypotheses can be written in terms of the treatment means or, equivalently, the treatment effects (both forms are sketched below).
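A sketch of the two equivalent forms of the hypotheses (LaTeX):

H_0: \mu_1 = \mu_2 = \cdots = \mu_a \quad \text{vs.} \quad H_1: \mu_i \neq \mu_j \text{ for at least one pair } (i, j),

or, in terms of the treatment effects,

H_0: \tau_1 = \tau_2 = \cdots = \tau_a = 0 \quad \text{vs.} \quad H_1: \tau_i \neq 0 \text{ for at least one } i.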

17 17 The Analysis of Variance (ANOVA)  While sums of squares cannot be directly compared to test the hypothesis of equal means, mean squares can be compared.  A mean square is a sum of squares divided by its degrees of freedom:  If the treatment means are equal, the treatment and error mean squares will be (theoretically) equal.  If treatment means differ, the treatment mean square will be larger than the error mean square. Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery
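Written out, the two mean squares are (a sketch, with N = an the total number of observations):

MS_{\mathrm{Treatments}} = \frac{SS_{\mathrm{Treatments}}}{a - 1}, \qquad MS_E = \frac{SS_E}{N - a}.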

18 The Analysis of Variance is Summarized in a Table. Computing: see text, p. 69. The reference distribution for F0 is the F(a-1, a(n-1)) distribution. Reject the null hypothesis (equal treatment means) if F0 exceeds the upper-alpha critical value of that distribution (see the sketch below).
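A sketch of the test statistic and rejection rule (LaTeX; with equal replication N - a = a(n - 1), matching the reference distribution quoted above):

F_0 = \frac{MS_{\mathrm{Treatments}}}{MS_E}, \qquad \text{reject } H_0 \text{ if } F_0 > F_{\alpha,\, a-1,\, N-a}.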

19 page 74: Calculation of ANOVA table Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 19

20 ANOVA Table. Use the ANOVA to test H0: µ1 = µ2 = µ3 = µ4 vs. H1: some means are different.
Treatment   Observations            Treatment mean
L1          0.9    1.0    1.1       1.0
L2          1.95   2.0    2.05      2.0
L3          2.99   3.0    3.01      3.0
Grand average = 2.0
Q: Find SS_Total, SS_treat, and SS_error; then find MS_treat, MS_error, F0 and the p-value (an R sketch follows below).
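A minimal R sketch for this question, using the three-treatment toy data in the table (the variable names y and treat are illustrative only):

## toy data: three treatments, three observations each
y <- c(0.9, 1.0, 1.1,  1.95, 2.0, 2.05,  2.99, 3.0, 3.01)
treat <- factor(rep(c("L1", "L2", "L3"), each = 3))
anova(lm(y ~ treat))   ## prints Df, Sum Sq, Mean Sq, F value, Pr(>F)

By hand, SS_treat = 3[(1-2)^2 + (2-2)^2 + (3-2)^2] = 6 and SS_error = 0.0252, so MS_treat = 3, MS_error = 0.0042, F0 is roughly 714, and the p-value is essentially zero.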

21 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 21 ANOVA Table: Example 3-1 Q: SS Total = 72,209.75, SS treat = 66,870.55, first find SS error ; then find MS treat, MS error, F0 and p-value.

22 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 22 ANOVA Table: Example 3-1 SS Total = 72,209.75, SS treat = 66,870.55, SS error = 5339.20, MS treat = 22,290.18, MS error = 333.70, F 0 = 66.80, p-value = 2.881934e-09.
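A quick check of how these entries fit together (a = 4 power levels, n = 5 wafers, N = 20; LaTeX):

SS_E = 72{,}209.75 - 66{,}870.55 = 5{,}339.20, \qquad
MS_{\mathrm{treat}} = \frac{66{,}870.55}{3} = 22{,}290.18, \qquad
MS_E = \frac{5{,}339.20}{16} = 333.70, \qquad
F_0 = \frac{22{,}290.18}{333.70} \approx 66.80.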

23 ANOVA Table: Example 3-1. P-value = 1 - pf(66.80, 3, 16) = 2.881934e-09. R example?

24 ANOVA Table: Example 3-1. The reference distribution and the p-value. Critical value = qf(0.95, 3, 16) = 3.238872.

25 Example 3.1: ANOVA in R
## One-way ANOVA in R ##
tab3.1 <- read.table("\\\\bearsrv\\classrooms\\Math\\wangy\\stt4511\\Etch-Rate.TXT", header=TRUE)
y <- tab3.1[,2]   ## assign y as the response variable (etch rate)
x <- tab3.1[,1]   ## assign x as the explanatory variable (power)
anova(lm(y ~ as.factor(x)))
## or
anova(lm(tab3.1[,2] ~ as.factor(tab3.1[,1])))
## or
summary(aov(tab3.1$EtchRate ~ as.factor(tab3.1$Power)))
## or
summary(aov(tab3.1[,2] ~ as.factor(tab3.1[,1])))
## the WRONG way:
anova(lm(y ~ x))   ## Power is NOT treated as a categorical variable here

26 Note 1: A factor is a vector object used to specify a discrete classification. If the treatment is coded with numerical values, "as.factor" must be applied to the treatment variable (tab3.1$Power). Note 2: If the treatment is coded with character values, "as.factor" may be omitted (character variables are treated as factors automatically). Note 3: Always check the "df" column to verify the results. Note 4: Some software also prints a Total row.

27 Example 3.1: ANOVA in R with Data input
## Here is the way we code the data into R: ##
x = c(rep(160, 5), rep(180, 5), rep(200, 5), rep(220, 5))
y = c(575, 542, 530, 539, 570, 565, 593, 590, 579, 610, 600, 651, 610, 637, 629, 725, 700, 715, 685, 710)
anova(lm(y ~ as.factor(x)))

28 ANOVA in R with Data input
data.tab2.1 <- read.table("\\\\bearsrv\\classrooms\\Math\\wangy\\stt4511\\Tension-Bond.TXT", header=TRUE)
x <- data.tab2.1[,1]
y <- data.tab2.1[,2]
With Table 2.1, how can we input the data to R for ANOVA?
## Here is the way we code the data into R: ##
x = c(rep("Modified", 10), rep("Unmodified", 10))
y = c(16.85, 16.40, 17.21, 16.35, 16.52, 17.04, 16.96, 17.15, 16.59, 16.57, 16.62, 16.75, 17.37, 17.12, 16.98, 16.87, 17.34, 17.02, 17.08, 17.27)
anova(lm(y ~ as.factor(x)))

29 The connection of ANOVA with the 2-sample (pooled) t-test, output analysis
## two sample pooled t-test is a special case of ANOVA? ##
data.tab2.1 <- read.table("\\\\bearsrv\\classrooms\\Math\\wangy\\stt4511\\Tension-Bond.TXT", header=TRUE)
x <- data.tab2.1[,1]
y <- data.tab2.1[,2]
t.test(x, y, alternative = "two.sided", mu = 0, var.equal = TRUE)
## see what ANOVA does... ##
data2.1 <- c(x, y)
group <- c(rep("modified", 10), rep("unmodified", 10))   ## group is a categorical variable here ##
anova(lm(data2.1 ~ group))

30 The connection of ANOVA with the 2-sample (pooled) t-test, output analysis

31 The connection of ANOVA with the 2-sample (pooled) t-test, another example
## another example... ##
bhh.tab3.1 <- read.table("\\\\bearsrv\\classrooms\\Math\\wangy\\stt4511\\BHH2-Data\\tab0301.dat", header=TRUE)
## the yield column is our response variable ##
x <- bhh.tab3.1$yield[1:10]
y <- bhh.tab3.1$yield[11:20]
t.test(x, y, alternative = "less", mu = 0, var.equal = TRUE)
## see what ANOVA does... ##
Yield <- bhh.tab3.1$yield
Method <- bhh.tab3.1$method
anova(lm(Yield ~ Method))
## OR ##
anova(lm(bhh.tab3.1$yield ~ bhh.tab3.1$method))

32 The connection of ANOVA with the t-test, another output analysis

33 Review: ANOVA Effects model. The name "analysis of variance" stems from a partitioning of the total variability in the response variable into components that are consistent with a model for the experiment. The basic single-factor ANOVA effects model is as sketched earlier (slide 11), where NID means normally and independently distributed.
Treatment   Observations
L1          0.9    1.0    1.1
L2          1.95   2.0    2.05
L3          2.99   3.0    3.01

34 Review: ANOVA Means Models for the Data. There are several ways to write a model for the data, and each observation is expected to equal its own treatment mean (see slide 12).

35 Estimation of the Model Parameters Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 35

36 A confidence interval estimate of the ith treatment mean (df = N - a).
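A sketch of the interval the slide describes (LaTeX; MS_E estimates \sigma^2 and the t distribution has the stated N - a degrees of freedom):

\bar{y}_{i.} \pm t_{\alpha/2,\, N-a} \sqrt{MS_E / n}

and, for the difference between two treatment means (equal sample sizes n),

(\bar{y}_{i.} - \bar{y}_{j.}) \pm t_{\alpha/2,\, N-a} \sqrt{2\, MS_E / n}.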

37 Example: Estimation of the Model Parameters and Confidence Intervals. Q1: Find the estimates of the overall mean and each of the treatment effects. Q2: Find a 95 percent confidence interval on the mean of treatment 4 (220W of RF power). Q3: Find a 95 percent confidence interval on µ1 - µ4.
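A minimal R sketch for Q1-Q3, re-entering the Example 3.1 data from slide 27 (variable names are illustrative; MS_E = 333.70 with 16 df is taken from the ANOVA table above):

x <- factor(c(rep(160, 5), rep(180, 5), rep(200, 5), rep(220, 5)))
y <- c(575, 542, 530, 539, 570,  565, 593, 590, 579, 610,
       600, 651, 610, 637, 629,  725, 700, 715, 685, 710)
grand.mean <- mean(y)                 ## Q1: estimate of the overall mean mu
trt.means  <- tapply(y, x, mean)      ## estimates of the treatment means mu_i
tau.hat    <- trt.means - grand.mean  ## Q1: estimates of the treatment effects tau_i
MSE <- 333.70; n <- 5; df.error <- 16
trt.means["220"] + c(-1, 1) * qt(0.975, df.error) * sqrt(MSE / n)        ## Q2: 95% CI for mu_4
(trt.means["160"] - trt.means["220"]) +
  c(-1, 1) * qt(0.975, df.error) * sqrt(2 * MSE / n)                     ## Q3: 95% CI for mu_1 - mu_4

For Q2 this gives an interval of roughly (689.7, 724.3) around the treatment-4 average of 707.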

38 Example: Estimation of the Model Parameters and find Confidence Interval Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 38

39 Chapter 3 Design & Analysis of Experiments 8E 2012 Montgomery 39 A little (very little) humor…

40 ANOVA calculations are usually done via computer. The text exhibits sample calculations from several very popular software packages: R, SAS, Design-Expert, JMP, and Minitab (see pages 102-105), and discusses some of the summary statistics provided by these packages.

41 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 41 Model Adequacy Checking in the ANOVA Text reference, Section 3.4, pg. 80  Checking assumptions is important  Normality for residuals  Constant/equal variance  Independence  Have we fit the right model?  Later we will talk about what to do if some of these assumptions are violated

42 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 42 Model Adequacy Checking in the ANOVA  Examination of residuals (see text, Sec. 3-4, pg. 80)  Computer software generates the residuals  Residual plots are very useful  Normal probability plot of residuals. If the model is adequate, the residuals should be structureless; that is, they should contain no obvious patterns. If the underlying error distribution is normal, this plot will resemble a straight line. In visualizing the straight line, place more emphasis on the central values of the plot than on the extremes.

43 Model Adequacy Checking in the ANOVA
## residual analysis ##
tab3.1 <- read.table("\\\\bearsrv\\classrooms\\Math\\wangy\\stt4511\\Etch-Rate.TXT", header=TRUE)
y = tab3.1$EtchRate
x = tab3.1$Power
model3.1 <- lm(y ~ as.factor(x))
res3.1 <- resid(model3.1)
fit3.1 <- model3.1$fitted
par(mfrow=c(1,3))
plot(c(1:20), res3.1); abline(h=0)
qqnorm(res3.1); qqline(res3.1)
plot(fit3.1, res3.1)
dev.off()
## OR you can do ##
par(mfrow=c(2,2))
plot(model3.1)
################## equal variance test #####################
boxplot(tab3.1$EtchRate ~ as.factor(tab3.1$Power))
bartlett.test(tab3.1$EtchRate ~ as.factor(tab3.1$Power))
################### multiple comparison in R ##################
tapply(tab3.1$EtchRate, as.factor(tab3.1$Power), mean)
pairwise.t.test(tab3.1$EtchRate, as.factor(tab3.1$Power))

44 [Figure: three residual plots - residuals in time sequence/run order (for checking correlation), normal Q-Q plot, and residuals versus fitted values (for model checking).] Plot 1 is helpful in detecting strong correlation between residuals: a tendency to have runs of positive and negative residuals indicates positive correlation. If the model is correct and the assumptions are satisfied, plot 3 should be structureless.

45 Statistical Tests for Equality of Variance: Bartlett’s test Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 45 ln(10)=2.302585 http://en.wikipedia.org/wiki/Bartlett's_test
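A sketch of Bartlett's test statistic in its usual base-10-log form, which is where the ln(10) = 2.302585 constant above enters (LaTeX; S_i^2 is the i-th sample variance and S_p^2 the pooled variance):

\chi_0^2 = 2.3026\, \frac{q}{c}, \qquad
q = (N - a)\log_{10} S_p^2 - \sum_{i=1}^{a} (n_i - 1)\log_{10} S_i^2, \qquad
c = 1 + \frac{1}{3(a-1)} \left( \sum_{i=1}^{a} \frac{1}{n_i - 1} - \frac{1}{N - a} \right),

and \chi_0^2 is compared with the chi-square distribution with a - 1 degrees of freedom.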

46 Statistical Tests for Equality of Variance: Bartlett's test. Use Bartlett's test to test H0: σ1 = σ2 = σ3 = σ4 vs. Ha: some σi are different.

47 Statistical Tests for Equality of Variance: Bartlett's test. P-value = 1 - pchisq(0.43, 3) = 0.9339778.

48 Bartlett’s test in R for equal variances (a>=2) Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 48 ## equal variance test ## tab3.1<- read.table("\\\\bearsrv\\classrooms\\Math\\wangy \\stt4511\\Etch-Rate.TXT",header=TRUE); y=tab3.1[,2]; x=tab3.1[,1]; boxplot(y ~ as.factor(x)) bartlett.test(y ~ as.factor(x)) ## OR you can use ## boxplot(tab3.1$EtchRate~as.factor(tab3.1$Power)) bartlett.test(tab3.1$EtchRate~as.factor(tab3.1$Power))

49 Post-ANOVA Comparison of Means. The ANOVA tests the hypothesis of equal treatment means; assume that the residual analysis is satisfactory. In ANOVA, if H0 is rejected, we do not know which specific means are different. (Review) Checking the ANOVA assumptions is important: normality, constant/equal variance, independence. Have we fit the right model?

50 Post-ANOVA Comparison of Means. Determining which specific means differ following an ANOVA is called multiple pairwise comparison among all a population means. Suppose we are interested in comparing all pairs of the a treatment means, and the null hypotheses we wish to test are H0: µi = µj vs. Ha: µi ≠ µj for all i ≠ j. There are lots of ways to do this; see text, Section 3.5, pg. 91. We will use pairwise t-tests on means, sometimes called Fisher's Least Significant Difference (Fisher's LSD) method.

51 Pairwise t-tests on means: Fisher's Least Significant Difference. Test H0: µi = µj vs. Ha: µi ≠ µj for all i ≠ j.
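A sketch of the LSD cutoff used for each pair (LaTeX; MS_E has N - a degrees of freedom):

\mathrm{LSD} = t_{\alpha/2,\, N-a} \sqrt{MS_E \left( \frac{1}{n_i} + \frac{1}{n_j} \right)},

and \mu_i and \mu_j are declared different when |\bar{y}_{i.} - \bar{y}_{j.}| > \mathrm{LSD}. With equal replication this reduces to t_{\alpha/2,\, N-a} \sqrt{2\, MS_E / n}.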

52 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 52 Pairwise t-tests on means: Fisher’s Least Significant Difference Q: Use Fisher’s LSD for Pairwise t-tests on means. That is, to test which two population means are different, for all distinct i, j, where i, j=1, 2, 3, 4.

53 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 53 Pairwise t-tests on means: Fisher’s Least Significant Difference

54 Pairwise t.test from R
################### multiple comparison in R ##################
tab3.1 <- read.table("\\\\bearsrv\\classrooms\\Math\\wangy\\stt4511\\Etch-Rate.TXT", header=TRUE)
y = tab3.1[,2]
x = tab3.1[,1]
tapply(y, as.factor(x), mean)   ## finding group means directly ##
pairwise.t.test(y, as.factor(x), p.adjust.method = "none")   ## pairwise comparisons between group levels; "none" gives unadjusted (Fisher LSD-style) p-values ##
## OR you can do ##
tapply(tab3.1$EtchRate, as.factor(tab3.1$Power), mean)
pairwise.t.test(tab3.1$EtchRate, as.factor(tab3.1$Power), p.adjust.method = "none")

55 STT511: Tukey's ‘Honest Significant Difference’ method

56 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 56 STT511: Tukey's ‘Honest Significant Difference’ method

57 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 57 Tukey's ‘Honest Significant Difference’ method

58 Tukey's 'Honest Significant Difference' method. Q: Use Tukey's HSD for pairwise comparisons of means. That is, test which two population means are different, for all distinct i, j, where i, j = 1, 2, 3, 4 (the cutoff is sketched below).
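A sketch of the Tukey cutoff used for each pair (LaTeX; q_\alpha(a, f) is the upper-\alpha point of the studentized range for a groups and f = N - a error degrees of freedom, the quantity tabulated in Table VI referenced on the next slide):

T_\alpha = q_\alpha(a, f) \sqrt{\frac{MS_E}{n}},

and two means are declared different when |\bar{y}_{i.} - \bar{y}_{j.}| > T_\alpha (equal sample sizes n).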

59 Page 702, Table VI Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 59

60 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 60 Tukey's ‘Honest Significant Difference’ method

61
tab3.1 <- read.table("\\\\bearsrv\\classrooms\\Math\\wangy\\stt4511\\Etch-Rate.TXT", header=TRUE)
y = tab3.1[,2]
x = tab3.1[,1]
tapply(y, as.factor(x), mean)   ## finding group means directly ##
a1 <- aov(y ~ as.factor(x)); summary(a1)   ## for ANOVA analysis ##
## OR use ##
anova(lm(y ~ as.factor(x)))   ## for ANOVA analysis ##
posthoc <- TukeyHSD(a1, 'as.factor(x)', conf.level = 0.95)   ## Tukey's 'Honest Significant Difference' method ##
plot(posthoc)
## Online Tukey table: http://www.watpon.com/table/studen_range.pdf
## TukeyHSD creates a set of confidence intervals on the differences between the means of the levels of a factor
## with the specified family-wise probability of coverage.
## The intervals are based on the studentized range statistic (Tukey's 'Honest Significant Difference' method).
## Each component is a matrix with columns: diff, the difference in the observed means;
## lwr, the lower end point of the interval; upr, the upper end point;
## and p adj, the p-value after adjustment for the multiple comparisons.
## It lists all possible pairwise comparisons between the groups.

62 Design-Expert Output. Q: How do we obtain the SE from the ANOVA table? Q: How do we get the CI for the group average? Q: How do we get the CI for the group average difference?

63 Extra Examples for students’ practice

64 Why Does the ANOVA Work?
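A sketch of the standard expected-mean-square argument behind this slide (LaTeX, fixed-effects case):

E(MS_E) = \sigma^2, \qquad
E(MS_{\mathrm{Treatments}}) = \sigma^2 + \frac{n \sum_{i=1}^{a} \tau_i^2}{a - 1},

so under H_0 (all \tau_i = 0) both mean squares estimate \sigma^2 and F_0 should be near 1, while real differences among the treatment means inflate MS_Treatments and push F_0 above 1.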

65 3.8 Other Examples of Single-Factor Experiments Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 65

66 3.8 Other Examples of Single-Factor Experiments
############################# tab3.12 ####################################
y <- c(118.8, 122.6, 115.6, 113.6, 119.5, 115.9, 115.8, 115.1, 116.9, 115.4, 115.6, 107.9,
       105.4, 101.1, 102.7, 97.1, 101.9, 98.9, 100.0, 99.8, 102.6, 100.9, 104.5, 93.5,
       102.1, 105.8, 99.6, 102.7, 98.8, 100.9, 102.8, 98.7, 94.7, 97.8, 99.7, 98.6)
x <- c(rep("DC", 12), rep("DC+MK", 12), rep("MC", 12))
tab3.12 <- list(y, x)
anova(lm(y ~ as.factor(x)))
####### residual analysis ###########
model3.12 <- lm(y ~ as.factor(x))
res3.12 <- resid(model3.12)
fit3.12 <- model3.12$fitted
par(mfrow=c(1,3))
plot(c(1:36), res3.12)
qqnorm(res3.12); qqline(res3.12); plot(fit3.12, res3.12)
dev.off()
##### equal variance test #####################
boxplot(y ~ as.factor(x))
bartlett.test(y ~ as.factor(x))
###### multiple comparison in R ##################
tapply(y, x, mean)
pairwise.t.test(y, as.factor(x))

67 Example 3.8.1, page 110: Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 67

68 68 Conclusions? Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery

69  Another Example 69 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery

70 70 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery

71 Q: Find a 95% CI for the average of DC. Q: Find a 95% CI for the difference in the averages of DC and MC. (An R sketch follows below.)
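A minimal R sketch for these two intervals, reusing the y and x vectors entered on slide 66 (variable names are illustrative; MS_E and its degrees of freedom are pulled from the fitted ANOVA table):

fit      <- lm(y ~ as.factor(x))
MSE      <- anova(fit)["Residuals", "Mean Sq"]
df.error <- anova(fit)["Residuals", "Df"]
means    <- tapply(y, x, mean)
n        <- table(x)                              ## 12 observations per group
means["DC"] + c(-1, 1) * qt(0.975, df.error) * sqrt(MSE / n["DC"])        ## 95% CI for the DC mean
(means["DC"] - means["MC"]) +
  c(-1, 1) * qt(0.975, df.error) * sqrt(MSE * (1/n["DC"] + 1/n["MC"]))    ## 95% CI for DC - MC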

72 Example 3.8.1, page 110 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 72

73 Example 3.8.1, page 110 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery 73

74 74 Chapter 3Design & Analysis of Experiments 8E 2012 Montgomery

75 Commands in R: summary. aov, lm, as.factor, resid, rep, bartlett.test, pairwise.t.test, tapply.

