G89.2229 Multiple Regression, Week 6 (Wednesday), Lect 6W: Polynomial example; Orthogonal polynomials; Statistical power for regression


Slide 1 (G89.2229 Lect 6W): Overview: polynomial example; orthogonal polynomials; statistical power for regression. G89.2229 Multiple Regression, Week 6 (Wednesday).

Slide 2 (G89.2229 Lect 6W): Constructing polynomial fits
Two approaches for constructing polynomial fits:
» Simply create squared and cubed versions of X.
» Center first: create squared and cubed versions of (X - C), where the constant C is typically the mean, so X_c = X - mean(X). X_c and X_c² will have little or no correlation.
Both approaches yield identical fits, but centered polynomials are easier to interpret. (A numeric sketch follows below.)
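A minimal numpy sketch of both claims, using hypothetical data (the variable names and values are illustrative, not from the lecture):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, 200)           # raw predictor
    xc = x - x.mean()                     # centered version, X_c

    # Raw X and X^2 are nearly collinear; centered X_c and X_c^2 are not
    # (exactly uncorrelated in the population when X is symmetric).
    print(np.corrcoef(x, x**2)[0, 1])     # close to 1
    print(np.corrcoef(xc, xc**2)[0, 1])   # close to 0

    # Both bases give identical fitted values for a quadratic model.
    y = 1 + 0.5 * x + 0.2 * x**2 + rng.normal(0, 1, x.size)
    A = np.column_stack([np.ones_like(x), x, x**2])
    B = np.column_stack([np.ones_like(x), xc, xc**2])
    ba, *_ = np.linalg.lstsq(A, y, rcond=None)
    bb, *_ = np.linalg.lstsq(B, y, rcond=None)
    assert np.allclose(A @ ba, B @ bb)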

Slide 3 (G89.2229 Lect 6W): Example from Cohen: interest in minor subject as a function of credits in minor. [Plot not reproduced in the transcript.]

Slide 4 (G89.2229 Lect 6W): Interpreting polynomial regression
Suppose we have the model Y = b0 + b1*X1 + b2*X2 + e.
» b1 is interpreted as the effect of X1 when X2 is adjusted for.
Now suppose X1 = W and X2 = W². What does it mean to "hold constant" X2 in this context?
When the zero point is interpretable:
» the linear term is the slope at point 0
» the quadratic term is the acceleration at point 0
» the cubic term is the change in acceleration at point 0
(A worked derivative follows below.)
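These interpretations follow from differentiating the fitted curve, a step that is implicit in the slide. For the cubic case:

    Y = b0 + b1*W + b2*W² + b3*W³
    dY/dW = b1 + 2*b2*W + 3*b3*W²      (slope)
    d²Y/dW² = 2*b2 + 6*b3*W            (acceleration)
    d³Y/dW³ = 6*b3                     (change in acceleration)

At W = 0 these reduce to b1, 2*b2, and 6*b3, so each coefficient describes the curve exactly at the zero point; this is why an interpretable zero point matters.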

Slide 5 (G89.2229 Lect 6W): Orthogonal polynomials
In experiments, one might have three or four equally spaced levels of treatment:
» 0, 1, 2
» 0, 1, 2, 3
These levels can be used with polynomial models to fit linear, quadratic, or cubic trends: we would simply construct squared and cubed forms.

Slide 6 (G89.2229 Lect 6W): Making polynomials orthogonal
The raw linear, quadratic, and cubic trends all rise together, and the quadratic curve closely resembles the cubic. Orthogonal polynomials eliminate this redundancy hierarchically:
» the constant is removed from the linear trend
» the constant and linear are removed from the quadratic
» the constant, linear, and quadratic are removed from the cubic
(The sketch below shows the same hierarchy as a QR/Gram-Schmidt step.)
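A numpy sketch of this hierarchical clean-up. QR decomposition performs exactly the sequential residualization described above; the four-level example is hypothetical:

    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0])     # four equally spaced levels

    # Raw polynomial design matrix: constant, linear, quadratic, cubic.
    X = np.column_stack([np.ones_like(x), x, x**2, x**3])

    # QR = Gram-Schmidt: each column of Q is the part of the raw column
    # left over after removing everything that came before it.
    Q, R = np.linalg.qr(X)

    print(np.round(Q.T @ Q, 10))           # identity: columns are orthogonal

    # Up to sign and scaling, the columns match the tabled orthogonal
    # polynomial contrasts for four equally spaced levels:
    #   linear:    -3 -1  1  3
    #   quadratic:  1 -1 -1  1
    #   cubic:     -1  3 -3  1
    print(np.round(Q / np.abs(Q).max(axis=0), 3))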

Slide 7 (G89.2229 Lect 6W): Analysis with orthogonal polynomials
If we substitute orthogonal polynomials for the usual linear, squared, and cubic terms, we:
» recover the same polynomial fit
» obtain effects that are useful in determining the polynomial order
Even when cubic effects are included, with orthogonal effects:
» the linear term is the average effect
» the quadratic is adjusted for the linear, but not for the cubic
» the cubic is adjusted for all terms before it
The regression coefficients, however, are difficult to interpret. (A numeric check follows below.)
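A brief numpy check of both properties, continuing the hypothetical four-level example (the data are simulated, not from the lecture):

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.repeat([0.0, 1.0, 2.0, 3.0], 5)   # 4 levels, 5 cases each
    y = 2 + 1.5 * x - 0.4 * x**2 + rng.normal(0, 1, x.size)

    X = np.column_stack([np.ones_like(x), x, x**2, x**3])  # raw basis
    Q, _ = np.linalg.qr(X)                                 # orthogonal basis

    b_raw, *_ = np.linalg.lstsq(X, y, rcond=None)
    b_orth, *_ = np.linalg.lstsq(Q, y, rcond=None)

    # Same fitted values: both bases span the same column space.
    assert np.allclose(X @ b_raw, Q @ b_orth)

    # Hierarchical property: dropping the cubic column leaves the
    # lower-order orthogonal coefficients unchanged.
    b_sub, *_ = np.linalg.lstsq(Q[:, :3], y, rcond=None)
    assert np.allclose(b_orth[:3], b_sub)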

Slide 8 (G89.2229 Lect 6W): Computing orthogonal polynomials
» Can copy values from Cohen et al. or other tables, substituting the orthogonal versions for the original polynomial values.
» Can use the MATRIX routine of SPSS to implement a special transformation: read the polynomial data into MATRIX, then use the program provided, which essentially does four things:
  - computes the polynomial sums of squares/cross-products
  - finds a Cholesky factor
  - inverts the Cholesky factor
  - transforms the polynomial values to be orthogonal

Slide 9 (G89.2229 Lect 6W): The MATRIX program

    MATRIX.
    GET X /VARIABLES = X, XSQ, XCUB.
    COMPUTE XFULL={MAKE(100,1,1),X}.
    COMPUTE XX=T(XFULL)*XFULL.
    COMPUTE XCHOL=CHOL(XX).
    PRINT XCHOL.
    COMPUTE ICHOL=INV(XCHOL).
    PRINT ICHOL.
    COMPUTE XORTH=XFULL*ICHOL.
    SAVE XORTH /OUTFILE=* /VARIABLES= ORTH0 ORTH1 ORTH2 ORTH3.
    END MATRIX.
    MATCH FILES /FILE=* /FILE='C:\My Documents\Pat\Courses\G Regression\Examples\Reg06W.sav'.
    EXECUTE.
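For readers without SPSS, here is a line-by-line translation of the program into numpy, run on simulated stand-in data (the data are an assumption; note that numpy's Cholesky factor is lower-triangular, so it is transposed to match SPSS's upper-triangular CHOL):

    import numpy as np

    # Stand-in for GET X /VARIABLES = X, XSQ, XCUB (simulated, 100 cases).
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 3, 100)
    X = np.column_stack([x, x**2, x**3])

    XFULL = np.column_stack([np.ones(100), X])   # {MAKE(100,1,1), X}
    XX = XFULL.T @ XFULL                         # T(XFULL)*XFULL
    XCHOL = np.linalg.cholesky(XX).T             # upper-triangular, as SPSS CHOL
    ICHOL = np.linalg.inv(XCHOL)                 # INV(XCHOL)
    XORTH = XFULL @ ICHOL                        # ORTH0, ORTH1, ORTH2, ORTH3

    # The transformation succeeds: the new columns are orthonormal.
    assert np.allclose(XORTH.T @ XORTH, np.eye(4))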

Slide 10 (G89.2229 Lect 6W): The transformed variables. [The table of transformed variables is not reproduced in the transcript.]