Web example squares-means-marginal-means-vs.html.

Data (summarized by treatment-by-center cell; counts and means as used in the calculations below):

               Center 1              Center 2
Treatment A    n = 3, mean = 3       n = 2, mean = 7.5
Treatment B    n = 2, mean = 5.5     n = 3, mean = 5

Model with only main effects (JMP output): Least Sq Mean and Mean for each Center level.

Model with only main effects (JMP output): Least Sq Mean and Mean for each Treatment level (A and B).

Model with interaction (the right model to use in JMP): Least Sq Mean and Mean for each Center level.

Model with interaction (the right model to use in JMP): Least Sq Mean and Mean for each Treatment level. The values match the hand calculations worked out below:

Trt Level    Least Sq Mean    Mean
A            5.25             4.8
B            5.25             5.2

How are Means calculated? (from the webpage) The mean value for a treatment is simply the sum of all of its observations divided by the number of observations:
Mean for treatment A = 24/5 = 4.8
Mean for treatment B = 26/5 = 5.2
So the Mean for treatment B is larger than the Mean for treatment A.
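The same arithmetic is easy to check in Python; a minimal sketch using only the group totals and counts quoted above:

```python
# Raw (arithmetic) treatment means: sum of all observations / number of observations.
# Totals and counts are the ones quoted above (24 and 26, each over 5 observations).
totals = {"A": 24, "B": 26}
counts = {"A": 5, "B": 5}

for trt in ("A", "B"):
    print(trt, totals[trt] / counts[trt])   # A -> 4.8, B -> 5.2
```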

How are LS Means calculated? (again, from the webpage) Table 2 shows the calculation of least squares means. The first step is to calculate the mean for each cell, i.e., each treatment-by-center combination:
9/3 = 3 for the treatment A, center 1 combination
7.5 for the treatment A, center 2 combination
5.5 for the treatment B, center 1 combination
5 for the treatment B, center 2 combination

LS Means continued (again from the webpage) After the mean for each cell is calculated, the least squares means are simply the average of these cell means.
For treatment A, the LS mean is (3 + 7.5)/2 = 5.25.
For treatment B, it is (5.5 + 5)/2 = 5.25.
The LS Means for the two treatment groups are identical, even though the raw means differ.
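A minimal Python sketch of the same two-step recipe, starting from the four cell means quoted above (cell means first, then an unweighted average over centers):

```python
# Cell means for each (treatment, center) combination, taken from the example.
cell_means = {("A", 1): 3.0, ("A", 2): 7.5,
              ("B", 1): 5.5, ("B", 2): 5.0}

# LS mean for a treatment = unweighted average of its cell means across centers.
for trt in ("A", "B"):
    means = [m for (t, _), m in cell_means.items() if t == trt]
    print(trt, sum(means) / len(means))   # A -> 5.25, B -> 5.25
```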

Ahh, that's fine, but what about empty cells?
First fit a two-way model with least squares.
Then estimate the predicted value for that empty cell.
Put that value in as though it were "real" data; this is called "imputing" the value.
Redo the analysis and voila, you get the LS Means!
One can do this iteratively, and this is called "multiple imputation", but that gets used elsewhere in Statistics (i.e., you don't have to worry about it here).
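A sketch of that recipe in Python using statsmodels. The data frame below is made-up, illustrative data (not the webpage's values), with the (B, center 2) cell deliberately left empty; the column names trt, center, and y are just labels chosen for this example:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative data only; the (trt B, center 2) cell has no observations.
df = pd.DataFrame({
    "trt":    ["A", "A", "A", "A", "B", "B"],
    "center": ["1", "1", "2", "2", "1", "1"],
    "y":      [3.0, 4.0, 7.0, 8.0, 5.0, 6.0],
})

# Steps 1-2: fit a two-way main-effects model by least squares and
# predict the value for the empty (B, center 2) cell.
fit = smf.ols("y ~ C(trt) + C(center)", data=df).fit()
pred = float(fit.predict(pd.DataFrame({"trt": ["B"], "center": ["2"]}))[0])

# Step 3: impute the prediction as though it were a "real" observation.
df_full = pd.concat(
    [df, pd.DataFrame({"trt": ["B"], "center": ["2"], "y": [pred]})],
    ignore_index=True,
)

# Step 4: redo the analysis -- each LS mean is the average of that
# treatment's cell means, now that every cell is filled.
cell_means = df_full.groupby(["trt", "center"])["y"].mean()
print(cell_means.groupby(level="trt").mean())
```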

Predicted values for empty cells
Predicted values for empty cells are obtained with a Regression model. With Regression, you can obtain "predicted values" even where there is no data point. You can do the same thing in ANOVA by using Regression with Dummy Variables. Dummy variables are "indicator variables" for class variables.
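To make the "regression with dummy variables" point concrete, here is a numpy-only sketch using the same made-up data as above: each class variable becomes a 0/1 indicator column, the coefficients come from ordinary least squares, and the fitted equation yields a predicted value for the empty cell.

```python
import numpy as np

# Same illustrative data as above; (trt B, center 2) has no observations.
rows = [("A", "1", 3.0), ("A", "1", 4.0),
        ("A", "2", 7.0), ("A", "2", 8.0),
        ("B", "1", 5.0), ("B", "1", 6.0)]

# Design matrix with dummy (indicator) variables:
# column 0 = intercept, column 1 = 1 if trt is B, column 2 = 1 if center is 2.
X = np.array([[1.0, trt == "B", center == "2"] for trt, center, _ in rows])
y = np.array([obs for _, _, obs in rows])

# Ordinary least-squares estimates of the regression coefficients.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predicted value for the empty (B, center 2) cell: both indicators switched on.
print(np.array([1.0, 1.0, 1.0]) @ beta)
```

With that prediction in hand, the imputation recipe from the previous slide proceeds exactly as before.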