1-Way ANOVA with Numeric Factor – Dose-Response
Dose-Response Studies in Laboratory Animals
S.J. Ruberg (1995). "Dose Response Studies. II. Analysis and Interpretation," Journal of Biopharmaceutical Statistics, 5(1), 15-42.

Data Description
N = 60 animals tested
g = 10 doses (0.0 to 4.5 by 0.5)
n_i = 6 animals per dose
Data given as mean and standard deviation by dose

Analysis of Variance - Calculations
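The slide's worked calculations are not reproduced in this transcript, but since the data are reported only as a mean and standard deviation per dose, the one-way ANOVA can be sketched entirely from those summary statistics. The means and SDs below are hypothetical placeholders (not Ruberg's values); only the design matches the study: g = 10 doses, n = 6 animals per dose, N = 60.

```python
import numpy as np

# One-way ANOVA from per-group summary statistics (means and SDs).
# The means/SDs are hypothetical placeholders; the design matches
# the slide: g = 10 doses, n = 6 per dose, N = 60.
g, n = 10, 6
N = g * n
means = np.array([10.1, 10.4, 11.0, 13.2, 16.8, 21.5, 25.0, 27.3, 28.1, 28.4])  # hypothetical
sds   = np.array([7.5,  7.9,  7.2,  8.1,  7.7,  7.6,  8.0,  7.8,  7.4,  7.9])   # hypothetical

grand_mean = means.mean()                       # equal n, so the simple average of dose means
ss_trt = n * np.sum((means - grand_mean) ** 2)  # between-dose (treatment) sum of squares
ss_err = np.sum((n - 1) * sds ** 2)             # within-dose (error) SS from the SDs
df_trt, df_err = g - 1, N - g                   # 9 and 50 for this design
ms_trt, ms_err = ss_trt / df_trt, ss_err / df_err
F = ms_trt / ms_err
print(df_trt, df_err, round(F, 2))
```

With real data the same two lines for ss_trt and ss_err fill in the ANOVA table; no raw observations are needed when group sizes, means, and SDs are known.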

Analysis of Variance Table & F-Test

Dunnett’s Pairwise Comparisons with a Control
1-Sided Tests: H_0i: μ_i − μ_1 = 0 vs. H_Ai: μ_i − μ_1 > 0, i = 2,…,10
Overall experiment-wise error rate = 0.05
Number of comparisons = 9
Critical value (50 error df, 9 comparisons) = 2.49
Std error of difference in pairs of means = sqrt(60.08(2/6)) = sqrt(20.03) = 4.48
Minimum significant difference = 2.49(4.48) = 11.14
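The arithmetic on this slide can be checked directly; a minimal sketch using the slide's own numbers (MSE = 60.08 with 50 error df, n = 6 per dose, one-sided Dunnett critical value 2.49 for 9 comparisons, taken from a table):

```python
import math

# Dunnett minimum significant difference from the slide's numbers.
mse, n_per_group, d_crit = 60.08, 6, 2.49
se_diff = math.sqrt(mse * (2 / n_per_group))  # SE of a difference of two dose means
msd = d_crit * se_diff                        # any dose mean exceeding control by more is significant
print(round(se_diff, 2), round(msd, 2))
```

Any dose whose sample mean exceeds the control mean by more than the MSD (11.14) is declared significantly higher at the 0.05 experiment-wise level.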

Contrasts and Sums of Squares

Orthogonal Polynomials
Coefficients of the dose means that describe the structure of the means in polynomial form: linear, quadratic, cubic, … (up to order g − 1 = 9 for this example)
Squared coefficients sum to 1
Products of coefficients sum to 0 for different polynomial contrasts (orthogonal)
Note: P_0 is not a contrast, but is used to obtain the intercept in regression

Estimated Contrasts, Sums of Squares, ANOVA
Based on the F-tests, we will consider the 3rd- and 5th-order polynomials.

Fitted Polynomial Regression Model
To obtain the k-th order fitted polynomial, we multiply the estimated “contrasts” for P_0, …, P_k by the corresponding coefficients of the contrasts for each dose. Note that P_0 is not a contrast, but a linear function of the means.
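The step above can be sketched as a truncated projection: with orthonormal coefficient columns, the estimated P_0…P_k values times their coefficients sum to the order-k least-squares fit at each dose. The dose means below are hypothetical placeholders, not the study's values:

```python
import numpy as np

# Order-k fitted polynomial recovered from orthonormal-polynomial
# "contrast" estimates. Dose means are hypothetical placeholders.
doses = np.arange(0.0, 5.0, 0.5)
V = np.vander(doses, N=len(doses), increasing=True)
Q, _ = np.linalg.qr(V)                  # orthonormal polynomial coefficients (assumed construction)
ybar = np.array([10.1, 10.4, 11.0, 13.2, 16.8, 21.5, 25.0, 27.3, 28.1, 28.4])  # hypothetical
est = Q.T @ ybar                        # estimated P_0..P_9 values
k = 3
fitted_k = Q[:, :k + 1] @ est[:k + 1]   # 3rd-order fitted mean at each dose
# Keeping all g = 10 terms reproduces the dose means exactly:
assert np.allclose(Q @ est, ybar)
```

Changing k from 3 to 5 gives the 5th-order fit on the next slide; the two fits differ only in the P_4 and P_5 terms added to the projection.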

3 rd and 5 th Order Polynomials