Statistical Analysis. Professor Lynne Stokes, Department of Statistical Science. Lecture 6: Solving Normal Equations and Estimating Estimable Model Parameters.

Presentation transcript:

Slide 1: Statistical Analysis. Professor Lynne Stokes, Department of Statistical Science. Lecture 6: Solving Normal Equations and Estimating Estimable Model Parameters.

Slide 2: Regression Models. Model: y = Xβ + e. Residuals: e = y − Xb. Sum of squared residuals: SSE(b) = (y − Xb)′(y − Xb). Least squares solution: choose b to minimize the sum of squared residuals, which leads to solving the normal equations X′Xb = X′y.
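As a concrete illustration (not part of the slides), here is a minimal Python/NumPy sketch of the computation just described: form the normal equations X′Xb = X′y for a full-rank regression design and solve them. The design, coefficients, and data below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative full-rank regression design: intercept plus two predictors.
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Normal equations: (X'X) b = X'y.  With full column rank, X'X is invertible.
b = np.linalg.solve(X.T @ X, X.T @ y)

residuals = y - X @ b
sse = residuals @ residuals            # sum of squared residuals at the solution
print("least squares solution:", b)
print("SSE:", sse)

# np.linalg.lstsq solves the same problem without forming X'X explicitly.
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print("agrees with lstsq:", np.allclose(b, b_lstsq))
```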

Slide 3: Regression Solution. Under the usual assumptions, the least squares estimator is unique, unbiased, minimum variance, and consistent, has a known sampling distribution, and is universally used.
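A small simulation sketch (added here for illustration, not from the lecture) of the unbiasedness property: averaging the least squares estimates over many simulated data sets recovers the true coefficients. The design and parameter values are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

n, n_sims = 30, 2000
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # fixed design
beta_true = np.array([5.0, 1.5])

estimates = np.empty((n_sims, 2))
for s in range(n_sims):
    y = X @ beta_true + rng.normal(scale=1.0, size=n)    # new errors each replicate
    estimates[s] = np.linalg.solve(X.T @ X, X.T @ y)

# The average of the estimates should be close to the true parameters.
print("true beta:    ", beta_true)
print("mean estimate:", estimates.mean(axis=0))
```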

Slide 4: Analysis of Completely Randomized Designs. Fixed factor effects: the factor levels are specifically chosen, inferences are desired only for the factor levels included in the experiment, and the levels produce systematic, repeatable changes in the mean response.

Slide 5: Flow Rate Experiment (MGH Fig. 6.1). Fixed or random?

Slide 6: Flow Rate Experiment. [Chart: average flow rate for filter types A, B, C, and D.] Conclusion?

Slide 7: Statistical Model for Single-Factor, Fixed Effects Experiments. Model: y_ij = μ + α_i + e_ij, i = 1, ..., a; j = 1, ..., r_i, where y_ij is the response, μ is the overall mean (constant), α_i is the main effect for level i, and e_ij is the error. α_i = effect of level i = the change in the mean response at level i.

Slide 8: Statistical Model for Single-Factor, Fixed Effects Experiments. Cell means model: y_ij = μ_i + e_ij, i = 1, ..., a; j = 1, ..., r_i. Effects model: y_ij = μ + α_i + e_ij, i = 1, ..., a; j = 1, ..., r_i. Connection between the two fixed effects models: μ_i = μ + α_i.

Slide 9: Solving the Normal Equations: Single-Factor, Balanced Experiment. Model: y_ij = μ + α_i + e_ij, i = 1, ..., a; j = 1, ..., r; n = ar. Matrix formulation: y = Xβ + e with y = (y_11 y_12 ... y_1r ... y_a1 y_a2 ... y_ar)′.

Slide 10: Solving the Normal Equations. Residuals: e = y − Xb. Least squares solution: minimize the sum of squared residuals by solving the normal equations X′Xb = X′y.

Slide 11: Solving the Normal Equations. For the balanced single-factor effects model, the normal equations X′Xb = X′y are nμ̂ + r∑_i α̂_i = y.. and rμ̂ + rα̂_i = y_i·, i = 1, ..., a, where y_i· = ∑_j y_ij and y.. = ∑_ij y_ij (check).

Slide 12: Solving the Normal Equations. Check: the normal equations are linearly dependent (the first equation is the sum of the remaining a equations). There are a + 1 parameters but only a linearly independent equations, so there are infinitely many solutions.
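A sketch of the rank deficiency just described, using the illustrative values a = 3 and r = 4: the effects-model matrix X = [1 : X_A] has a + 1 columns but rank a, so X′X is singular, and one of the infinitely many solutions can be obtained from the Moore-Penrose generalized inverse. The data are simulated only so the equations have a right-hand side.

```python
import numpy as np

rng = np.random.default_rng(2)

a, r = 3, 4                       # illustrative: 3 levels, 4 replicates each
n = a * r
levels = np.repeat(np.arange(a), r)

# Effects-model design matrix X = [1 : X_A] (overall mean column + level indicators).
X = np.column_stack([np.ones(n), (levels[:, None] == np.arange(a)).astype(float)])
y = rng.normal(loc=10 + levels, size=n)     # any data will do for the illustration

print("columns of X:", X.shape[1])                     # a + 1 = 4
print("rank of X:   ", np.linalg.matrix_rank(X))       # a = 3, so X'X is singular

# One solution of the normal equations, via the Moore-Penrose generalized inverse.
b_pinv = np.linalg.pinv(X.T @ X) @ (X.T @ y)
print("one solution:", np.round(b_pinv, 3))
print("satisfies the normal equations:",
      np.allclose(X.T @ X @ b_pinv, X.T @ y))
```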

Slide 13: Solving the Normal Equations. One solution of the normal equations (for example, setting μ̂ = 0 gives α̂_i = ȳ_i·).

Slide 14: Solving the Normal Equations. Another solution (for example, μ̂ = ȳ.., α̂_i = ȳ_i· − ȳ..).

Slide 15: Solving the Normal Equations. Yet another solution (for example, α̂_a = 0, μ̂ = ȳ_a·, α̂_i = ȳ_i· − ȳ_a· for i ≠ a).

Slide 16: Solving the Normal Equations. Solutions are not estimates. Estimable functions: all solutions to the normal equations produce the same estimates of "estimable functions" of the model means; for such functions, all solutions provide one unique estimator, and the estimators are unbiased.
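To illustrate this point numerically (an added sketch, not from the slides): two different solutions of the same singular normal equations, one from the generalized inverse and one obtained by imposing the constraint α_a = 0, differ as parameter vectors but give identical estimates of the estimable cell means μ + α_i.

```python
import numpy as np

rng = np.random.default_rng(3)

a, r = 3, 4
n = a * r
levels = np.repeat(np.arange(a), r)
X = np.column_stack([np.ones(n), (levels[:, None] == np.arange(a)).astype(float)])
y = rng.normal(loc=10 + 2.0 * levels, size=n)

XtX, Xty = X.T @ X, X.T @ y

# Solution 1: generalized-inverse solution.
b1 = np.linalg.pinv(XtX) @ Xty

# Solution 2: impose alpha_a = 0 by dropping the last column of X, solve the
# full-rank system, then put the zero back so both solutions are comparable.
b2_reduced = np.linalg.solve(X[:, :-1].T @ X[:, :-1], X[:, :-1].T @ y)
b2 = np.append(b2_reduced, 0.0)

print("solution 1:", np.round(b1, 3))      # the two parameter solutions differ ...
print("solution 2:", np.round(b2, 3))

# ... but the estimates of the estimable functions mu + alpha_i are identical.
cell_means_1 = b1[0] + b1[1:]
cell_means_2 = b2[0] + b2[1:]
print("estimated cell means agree:", np.allclose(cell_means_1, cell_means_2))
```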

Slide 17: Solving the Normal Equations: Two-Factor, Balanced Experiment. Model: y_ijk = μ_ij + e_ijk = μ + α_i + β_j + (αβ)_ij + e_ijk, i = 1, ..., a; j = 1, ..., b; k = 1, ..., r; n = abr. Matrix formulation: y = Xβ + e with X = [1 : X_A : X_B : X_AB] and parameter vector β = (μ, α_1, ..., α_a, β_1, ..., β_b, (αβ)_11, ..., (αβ)_ab)′.

Slide 18: Solving the Normal Equations: Two-Factor, Balanced Experiment (continued). Same model and matrix formulation, X = [1 : X_A : X_B : X_AB]. Number of parameters: 1 + a + b + ab, but rank(X) < 1 + a + b + ab.
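A sketch (with the illustrative sizes a = 3, b = 2, r = 2) confirming the rank statement: X = [1 : X_A : X_B : X_AB] has 1 + a + b + ab columns, but its rank is only ab, because the intercept and main-effect columns are sums of the interaction (cell indicator) columns.

```python
import numpy as np

a, b, r = 3, 2, 2                        # illustrative factor sizes
n = a * b * r

# Factor-level index of every observation (i and j vary, k indexes replicates).
i_idx = np.repeat(np.arange(a), b * r)
j_idx = np.tile(np.repeat(np.arange(b), r), a)

ones = np.ones((n, 1))
X_A = (i_idx[:, None] == np.arange(a)).astype(float)            # a columns
X_B = (j_idx[:, None] == np.arange(b)).astype(float)            # b columns
X_AB = (X_A[:, :, None] * X_B[:, None, :]).reshape(n, a * b)    # ab cell indicators

X = np.hstack([ones, X_A, X_B, X_AB])

print("number of parameters:", 1 + a + b + a * b)       # 12
print("columns of X:        ", X.shape[1])              # 12
print("rank(X):             ", np.linalg.matrix_rank(X))  # ab = 6 < 12
```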

Slide 19: Solving the Normal Equations. The normal equations for the two-factor model, with a check.

Slide 20: Solving the Normal Equations. Linear dependencies in X and one set of constraints that resolves them:
Matrix 1_n: no dependencies.
Matrix X_A: 1 dependency (the columns of X_A sum to 1_n); one solution sets α_a = 0, which eliminates a column from X_A and leaves a − 1 "degrees of freedom".

Slide 21: Solving the Normal Equations. The table continues:
Matrix X_B: 1 dependency (the columns of X_B sum to 1_n); one solution sets β_b = 0, which eliminates a column from X_B and leaves b − 1 "degrees of freedom".

Slide 22: Solving the Normal Equations. The table continues:
Matrix X_AB: 1 + (a − 1) + (b − 1) dependencies; the sum over all of its columns equals 1_n, and setting (αβ)_ab = 0 eliminates a column from X_AB.

Slide 23: Solving the Normal Equations. The X_AB row in full: besides the sum of all columns equaling 1_n, the sums of its columns over j for each i = 1, ..., a − 1 and over i for each j = 1, ..., b − 1 equal remaining columns of X_A and X_B. One solution sets (αβ)_ab = 0, (αβ)_ib = 0 for i = 1, ..., a − 1, and (αβ)_aj = 0 for j = 1, ..., b − 1, leaving (a − 1)(b − 1) "degrees of freedom".

Slide 24: Solving the Normal Equations (summary of Slides 20-23). Counting the constraints: 1 + 1 + {1 + (a − 1) + (b − 1)} = a + b + 1. Degrees of freedom: (1 + a + b + ab) − (a + b + 1) = ab = 1 + (a − 1) + (b − 1) + (a − 1)(b − 1).

Slide 25: Solving the Normal Equations. Check that the constrained solution satisfies the normal equations.

Slide 26: Solving the Normal Equations. Another solution, with a check that it also satisfies the normal equations.

Slide 27: Flow Rate Experiment (MGH Fig. 6.1, revisited). Fixed or random?

Slide 28: Quantifying Factor Effects. Effect: the change in average response due to changes in factor levels. [Diagram: averages for factor levels 1, 2, 3, ..., k together with the overall average.] Effect of level t: (average response at level t) − (overall average).

Slide 29: Quantifying Factor Effects. Effect: the change in average response due to changes in factor levels. [Same diagram as Slide 28.] Effect of changing from level s to level t: (average response at level t) − (average response at level s).
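A brief numeric sketch of the two definitions above (illustrative level averages, not the flow rate data): the effect of level t is its level average minus the overall average, and the effect of changing from level s to level t is the difference of the two level averages.

```python
import numpy as np

# Illustrative level averages for a single factor with k = 4 levels.
level_means = np.array([10.0, 12.0, 9.5, 11.0])
overall_mean = level_means.mean()        # balanced case: simple average of level means

# Effect of each level t: (average at level t) - (overall average).
level_effects = level_means - overall_mean
print("level effects:", np.round(level_effects, 3))

# Effect of changing from level s to level t: difference of the level averages.
s, t = 0, 1
print("effect of changing from level 1 to level 2:",
      round(level_means[t] - level_means[s], 3))
```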

Slide 30: Quantifying Factor Effects. Main effects for factor A: the change in average response due to changes in the levels of factor A. Main effects for factor B: the change in average response due to changes in the levels of factor B. Interaction effects for factors A and B: compare the effect of level i of factor A at level j of factor B with the overall effect of level i of factor A.

Slide 31: Quantifying Factor Effects. Main effects for factor A: the change in average response due to changes in the levels of factor A. Main effects for factor B: the change in average response due to changes in the levels of factor B. Interaction effects for factors A and B: the change in average response due to joint changes in factors A and B in excess of the changes attributable to the main effects.
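A sketch of these definitions applied to a hypothetical two-way table of cell means (balanced case assumed): main effects are marginal averages minus the grand average, the effect of A at a fixed level of B is a conditional effect, and the interaction is the excess of the conditional effects over the main effects.

```python
import numpy as np

# Hypothetical cell means: rows = levels of factor A (a = 3), columns = levels of B (b = 2).
cell_means = np.array([[10.0, 12.0],
                       [11.0, 15.0],
                       [ 9.0, 10.0]])

grand = cell_means.mean()
A_marginal = cell_means.mean(axis=1)       # average response at each level of A
B_marginal = cell_means.mean(axis=0)       # average response at each level of B

# Main effects: changes in average response across the levels of each factor.
A_main = A_marginal - grand
B_main = B_marginal - grand

# Effect of level i of A *at* level j of B (conditional effect).
A_at_B = cell_means - B_marginal[None, :]

# Interaction: the conditional effects in excess of the main effects.
interaction = A_at_B - A_main[:, None]

print("main effects of A:", np.round(A_main, 3))
print("main effects of B:", np.round(B_main, 3))
print("effect of A at each level of B:\n", np.round(A_at_B, 3))
print("interaction (excess over main effects):\n", np.round(interaction, 3))
```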

Slide 32: Two-Level Factors. It is common to use the difference between the two level averages, ȳ_2 − ȳ_1, as the factor effect. Note: if r_1 = r_2, then the effect of level 1 is ȳ_1 − ȳ = −(ȳ_2 − ȳ_1)/2 and the effect of level 2 is ȳ_2 − ȳ = (ȳ_2 − ȳ_1)/2.

Slide 33: Factors at Two Levels. The most common choice for designs involving many factors; many efficient fractional factorial and screening designs are available; p two-level factors can be used in place of a factor whose number of levels is 2^p.

Slide 34: Calculating Two-Level Factor Effects: Pilot Plant Study. Main effect: the difference between the average responses at the two levels of a factor. M(Temp) = (average at 180°) − (average at the low temperature level) = 23.0. M(Conc) = (average at 40%) − (average at 20%) = −5.0. M(Catalyst) = (average with catalyst C2) − (average with catalyst C1) = 1.5. (BHH Section 10.3; MGH Section 5.3.)

Slide 35: Calculating Two-Level Factor Effects. Two-factor interaction effect: half the difference between the main effects of one factor at each level of the second factor. M(Conc | C2) = (average at 40% and C2) − (average at 20% and C2) = −5.0. M(Conc | C1) = (average at 40% and C1) − (average at 20% and C1) = −5.0. I(Conc, Cat) = {M(Conc | C2) − M(Conc | C1)} / 2 = 0. (BHH Section 10.4; MGH Section 5.3.)

Slide 36: Calculating Two-Level Factor Effects. Two-factor interaction effect: half the difference between the main effects of one factor at each level of the second factor. M(Temp | C2) = (average at 180° and C2) − (average at the low temperature and C2) = 33.0. M(Temp | C1) = (average at 180° and C1) − (average at the low temperature and C1) = 13.0. I(Temp, Cat) = {M(Temp | C2) − M(Temp | C1)} / 2 = (33.0 − 13.0) / 2 = 10.0.
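The pilot plant responses themselves are not reproduced in this transcript, so the sketch below uses hypothetical temperature-by-catalyst averages, chosen only so that the computed effects match the values quoted on Slides 34 and 36 (23.0, 13.0, 33.0, 10.0). It shows the arithmetic: a main effect is the difference between the average responses at the two levels, and the interaction is half the difference between the conditional main effects.

```python
import numpy as np

# Hypothetical average responses for a two-level temperature x two-level catalyst
# layout (NOT the actual pilot plant numbers, which are not in the transcript).
# Rows: temperature (low, 180 deg); columns: catalyst (C1, C2).
avg = np.array([[55.0, 60.0],     # low temperature
                [68.0, 93.0]])    # 180 degrees

# Main effect of temperature: average at the high level minus average at the low level.
M_temp = avg[1].mean() - avg[0].mean()

# Conditional main effects of temperature at each catalyst level.
M_temp_C1 = avg[1, 0] - avg[0, 0]
M_temp_C2 = avg[1, 1] - avg[0, 1]

# Interaction: half the difference between the conditional main effects.
I_temp_cat = (M_temp_C2 - M_temp_C1) / 2

print("M(Temp)      =", M_temp)        # 23.0
print("M(Temp | C1) =", M_temp_C1)     # 13.0
print("M(Temp | C2) =", M_temp_C2)     # 33.0
print("I(Temp,Cat)  =", I_temp_cat)    # 10.0
```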

Slide 37: Cell Means and Effects Model Estimability: Three-Factor Balanced Experiment. y_ijkl = μ_ijk + e_ijkl, i = 1, ..., a; j = 1, ..., b; k = 1, ..., c; l = 1, ..., r, with μ_ijk = μ + α_i + β_j + γ_k + (αβ)_ij + (αγ)_ik + (βγ)_jk + (αβγ)_ijk.

Slide 38: Cell Means Models: Estimable Functions. All cell means are estimable.

Slide 39: Cell Means Models: Estimable Functions. All cell means are estimable, and so are all linear combinations of cell means; this does not depend on parameter constraints.
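An added sketch of this estimability statement: in a balanced cell means model, any linear combination of cell means, for example the simple effect μ_11 − μ_21, is estimated by the same combination of the cell averages ȳ_ij, with no parameter constraints involved. The cell means, coefficients, and simulated data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

a, b, r = 2, 3, 5
true_mu = np.array([[10.0, 12.0, 11.0],
                    [14.0, 13.0, 15.0]])      # illustrative cell means mu_ij

# Simulated balanced data: r observations per (i, j) cell.
y = true_mu[:, :, None] + rng.normal(scale=1.0, size=(a, b, r))
cell_avgs = y.mean(axis=2)                    # ybar_ij estimates mu_ij

# A linear combination of cell means, e.g. the simple effect mu_11 - mu_21,
# is estimated by the same combination of the cell averages.
c = np.zeros((a, b))
c[0, 0], c[1, 0] = 1.0, -1.0
estimate = (c * cell_avgs).sum()
print("estimate of mu_11 - mu_21:", round(estimate, 3))
```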

Slide 40: Cell Means Models: Estimable Functions. All cell means are estimable; some linear combinations of cell means are uninterpretable, while others are essential.

Slide 41: Cell Means and Effects Models. Imposing parameter constraints simplifies the relationships and makes the parameters more interpretable.

Slide 42: Parameter Equivalence: Effects Representation and Cell Means Model. Under the usual sum-to-zero parameter constraints (∑_i α_i = 0, ∑_j β_j = 0, ∑_i (αβ)_ij = ∑_j (αβ)_ij = 0), the means and mean effects are related by: μ = the average of all cell means, α_i = (average of the cell means in row i) − μ, β_j = (average of the cell means in column j) − μ, and (αβ)_ij = μ_ij − μ − α_i − β_j.
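A sketch of this equivalence under the sum-to-zero constraints: recover effects-model parameters from a hypothetical table of cell means, check that the constraints hold, and confirm that the effects reproduce the cell means exactly.

```python
import numpy as np

# Illustrative table of cell means mu_ij (rows = factor A, columns = factor B).
mu = np.array([[10.0, 12.0, 11.0],
               [14.0, 13.0, 15.0]])

mu_bar = mu.mean()                      # overall mean
alpha = mu.mean(axis=1) - mu_bar        # A effects: row averages minus overall mean
beta = mu.mean(axis=0) - mu_bar         # B effects: column averages minus overall mean
ab = mu - mu.mean(axis=1, keepdims=True) - mu.mean(axis=0, keepdims=True) + mu_bar

# Sum-to-zero constraints hold by construction ...
print(np.isclose(alpha.sum(), 0), np.isclose(beta.sum(), 0),
      np.allclose(ab.sum(axis=0), 0), np.allclose(ab.sum(axis=1), 0))

# ... and the effects reproduce the cell means: mu_ij = mu + alpha_i + beta_j + (ab)_ij.
reconstructed = mu_bar + alpha[:, None] + beta[None, :] + ab
print(np.allclose(reconstructed, mu))
```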

Slide 43: Contrasts. Contrasts often eliminate nuisance parameters; e.g., for coefficients c_1, ..., c_a with ∑_i c_i = 0, the contrast ∑_i c_i μ_i = ∑_i c_i (μ + α_i) = ∑_i c_i α_i, so the overall mean μ drops out.
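A tiny numeric sketch of this point (parameter values illustrative): because the contrast coefficients sum to zero, the overall mean cancels, and the contrast of level means equals the same contrast of the level effects alone.

```python
import numpy as np

# Illustrative effects-model parameters for a single factor with a = 4 levels.
mu = 50.0                                    # nuisance overall mean
alpha = np.array([2.0, -1.0, 0.5, -1.5])     # level effects
level_means = mu + alpha                     # mu_i = mu + alpha_i

# A contrast: coefficients that sum to zero, e.g. level 1 vs. level 2.
c = np.array([1.0, -1.0, 0.0, 0.0])
print("coefficients sum to zero:", c.sum() == 0)

# Because sum(c) = 0, the overall mean drops out of the contrast.
print(c @ level_means, "==", c @ alpha)      # both equal alpha_1 - alpha_2
```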

Slide 44: Contrasts. Main effects and interactions (show that each can be expressed as a contrast).