Chem 302 - Math 252 Chapter 5: Regression

Presentation transcript:

Chem 302 - Math 252 Chapter 5: Regression

Linear & Nonlinear Regression
Linear regression
– Linear in the parameters
– Does not have to be linear in the independent variable(s)
– Can be solved through a system of linear equations
Nonlinear regression
– Nonlinear in the parameters
– Usually requires linearization and iteration
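For illustration (these model forms are my examples, not taken from the slide), a polynomial is linear in its parameters even though it is nonlinear in x, whereas an exponential model is nonlinear in its parameter β₁:

\[
y = \beta_0 + \beta_1 x + \beta_2 x^2 \quad\text{(linear in the parameters, nonlinear in } x\text{)}
\]
\[
y = \beta_0\, e^{\beta_1 x} \quad\text{(nonlinear in the parameter } \beta_1\text{)}
\]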

Linear Least-Squares Regression
Residual: r_i = y_i − ŷ_i (difference between the measured value and the value predicted by the model)
Sum of square residuals: Z = Σ_i r_i²
Want to minimize Z with respect to the parameters β

Linear Least-Squares Regression
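For the straight-line case y = β₀ + β₁x, setting the derivatives of Z with respect to β₀ and β₁ to zero gives the familiar closed-form estimates. A minimal Python sketch (illustrative only, not the course's own code):

```python
import numpy as np

def linear_least_squares(x, y):
    """Fit y = b0 + b1*x by minimizing Z = sum((y - b0 - b1*x)**2)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    sx, sy = x.sum(), y.sum()
    sxx, sxy = (x * x).sum(), (x * y).sum()
    # Solution of the normal equations for slope and intercept
    b1 = (n * sxy - sx * sy) / (n * sxx - sx**2)
    b0 = (sy - b1 * sx) / n
    Z = ((y - b0 - b1 * x) ** 2).sum()   # sum of squared residuals
    return b0, b1, Z

# Example usage with made-up data
b0, b1, Z = linear_least_squares([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
```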

Example

Linear Least-Squares Regression
Uncertainties in Parameters – Example
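The standard expressions for the straight-line fit (assuming equal, independent errors in y; quoted from standard least-squares theory rather than the original slide) are:

\[
s^2 = \frac{Z}{n-2}, \qquad
\sigma_{\beta_1}^2 = \frac{s^2}{\sum_i (x_i-\bar{x})^2}, \qquad
\sigma_{\beta_0}^2 = s^2\!\left(\frac{1}{n} + \frac{\bar{x}^2}{\sum_i (x_i-\bar{x})^2}\right)
\]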

Linear Least-Squares Regression
Regression on “y”
– Treat x as y and y as x
– Choose x as the variable with the smallest error
– Can also be determined by equation
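As one reading of the last point (my illustration, not the slide's original equation): if the roles are swapped and the fit gives x = a′ + b′y, the line can be re-expressed with y as the dependent variable:

\[
x = a' + b'\,y \quad\Longrightarrow\quad y = -\frac{a'}{b'} + \frac{1}{b'}\,x
\]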

Linear Least-Squares Regression

Example – Vapour Pressure of Cadmium

Linear Least-Squares Regression Uncertainties in Parameters

Nonlinear Least-Squares Regression
– Minimizing Z now results in a system of nonlinear equations
– Linearize & solve iteratively
– Need initial estimates of the parameters
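A minimal sketch of the linearize-and-iterate idea (a generic Gauss-Newton step with a numerical Jacobian; the function name, step sizes, and the exponential example are my own, not the course's worked example):

```python
import numpy as np

def gauss_newton(model, beta0, x, y, tol=1e-8, max_iter=50):
    """Fit y ~ model(x, beta) by repeatedly linearizing about the current beta."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        r = y - model(x, beta)                      # residuals at current estimate
        # Numerical (central-difference) Jacobian of the model w.r.t. the parameters
        J = np.empty((len(x), len(beta)))
        for j in range(len(beta)):
            db = np.zeros_like(beta)
            db[j] = 1e-6 * max(abs(beta[j]), 1.0)
            J[:, j] = (model(x, beta + db) - model(x, beta - db)) / (2 * db[j])
        # Solve the linearized least-squares problem J @ delta ~= r
        delta, *_ = np.linalg.lstsq(J, r, rcond=None)
        beta += delta
        if np.linalg.norm(delta) < tol:
            break
    return beta

# Example: exponential decay with made-up data and a rough initial guess
model = lambda x, b: b[0] * np.exp(-b[1] * x)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([2.0, 1.2, 0.75, 0.44])
beta = gauss_newton(model, [1.0, 0.5], x, y)
```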

Nonlinear Least-Squares Regression – Example
Van der Waals parameters for nitrogen
[Data table: p/atm, T/K, V_m/(L mol⁻¹)]

Weighted Least-Squares Regression
– May not always want to give equal weight to each point
– Applies to both the linear and nonlinear case
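A minimal weighted straight-line fit, where each point carries a weight w_i (for example 1/σ_i²; the weighting choice and function name are my illustration, not the slide's notation):

```python
import numpy as np

def weighted_linear_fit(x, y, w):
    """Minimize Z = sum(w_i * (y_i - b0 - b1*x_i)**2) for a straight line."""
    x, y, w = (np.asarray(a, dtype=float) for a in (x, y, w))
    sw = w.sum()
    sx, sy = (w * x).sum(), (w * y).sum()
    sxx, sxy = (w * x * x).sum(), (w * x * y).sum()
    # Weighted normal equations for slope and intercept
    b1 = (sw * sxy - sx * sy) / (sw * sxx - sx**2)
    b0 = (sy - b1 * sx) / sw
    return b0, b1

# Points with smaller uncertainty get larger weight (w = 1/sigma**2)
b0, b1 = weighted_linear_fit([1, 2, 3], [2.0, 4.1, 5.9], [4.0, 1.0, 1.0])
```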

Drawbacks of Iterative Matrix Method
– Local minima can cause problems
– Can be sensitive to the initial guess
– Derivatives must be evaluated at each iteration

Simplex Method
A simplex has one more vertex than the dimension of the space
– 2D – triangle
– m parameters – m+1 vertices
Simplex method used to optimize a set of parameters
– Find the optimal set of β's such that Z is a minimum
More robust than the previous iterative procedure
– Often slower

Simplex Method
1. Evaluate Z at m+1 unique sets of parameters
2. Identify Z_B (best, smallest) and Z_W (worst, largest)
3. Calculate the centroid of all but the worst point (average of the parameter sets, ignoring the worst set)
4. Reflect the worst point through the centroid

Simplex Method
5. Replace the worst point:
   a. If Z_R1 < Z_B (the reflected point is better than the previous best), calculate a second reflected point R2
      i. If Z_R2 < Z_R1, replace W with R2
      ii. Otherwise replace W with R1
   b. If Z_B < Z_R1 < Z_W, replace W with R1
   c. If Z_R1 > Z_W, a contracted point R3 is calculated
      i. If Z_R3 < Z_W, replace W with R3
      ii. Otherwise move all points closer to the best point
6. Repeat until converged or the maximum number of iterations has been performed
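A compact sketch of these steps in Python (generic implementation following the slide's recipe; the coefficients of 1 for reflection, 2 for the second reflection, and 0.5 for contraction and shrinking are my assumptions, since the slide does not state them):

```python
import numpy as np

def simplex_minimize(Z, points, max_iter=200, tol=1e-10):
    """Minimize Z(beta) with the reflect / second-reflect / contract / shrink steps.

    points: (m+1) x m array of initial parameter sets (the starting simplex).
    """
    P = np.asarray(points, dtype=float)
    for _ in range(max_iter):
        vals = np.array([Z(p) for p in P])
        order = np.argsort(vals)
        P, vals = P[order], vals[order]          # P[0] = best, P[-1] = worst
        if vals[-1] - vals[0] < tol:
            break
        C = P[:-1].mean(axis=0)                  # centroid of all but the worst
        R1 = C + (C - P[-1])                     # reflect worst through centroid
        zR1 = Z(R1)
        if zR1 < vals[0]:                        # better than the best: try R2
            R2 = C + 2.0 * (C - P[-1])
            P[-1] = R2 if Z(R2) < zR1 else R1
        elif zR1 < vals[-1]:                     # between best and worst: accept R1
            P[-1] = R1
        else:                                    # worse than the worst: contract
            R3 = C + 0.5 * (P[-1] - C)
            if Z(R3) < vals[-1]:
                P[-1] = R3
            else:                                # shrink everything toward the best
                P = P[0] + 0.5 * (P - P[0])
    return P[0], Z(P[0])

# Example: minimize a simple quadratic in two parameters
Z = lambda b: (b[0] - 1.0) ** 2 + 10.0 * (b[1] + 2.0) ** 2
best, zbest = simplex_minimize(Z, [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
```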

Simplex Regression – Example
Van der Waals parameters for nitrogen
[Data table: p/atm, T/K, V_m/(L mol⁻¹)]

Simplex program

Simplex – Example
Iteration trace from the program (β values and responses for each point not reproduced here):
– Iteration 1: best, worst, centroid, first reflected point, second reflected point
– Iteration 2: worst, best, centroid, first reflected point, second reflected point
– Iteration 3: best, worst, centroid, first reflected point
– Iteration 4: best, worst, centroid, first reflected point, contracted point
– …
– Iteration 31: best, worst, centroid, first reflected point, contracted point
– Iteration 32: worst, best, centroid, first reflected point, contracted point
Iterations converged. R² and the final converged parameters (k, β) reported.

Simplex – Example (Iteration 1): plot of the simplex showing the best point B, worst point W, centroid C, and reflected points R1 and R2.

Simplex – Example (Iteration 2): plot showing B, W, C, R1, and R2.

Simplex – Example (Iteration 3): plot showing B, W, C, and R1.

Simplex – Example (Iteration 4): plot showing B, W, C, R1, and the contracted point.

Simplex – Example (Iteration 32): plot showing B, W, C, R1, and the contracted point.

Comparing Models
– Often have more than one equation that can be used to represent the data
– If two equations (models) have the same number of parameters, the one with the smaller Z is the better representation (fit)
– If two models have different numbers of parameters, a direct comparison cannot be made
  – Need to use the F distribution & a confidence level
  – Model A – the model with fewer parameters; Model B – the model with more parameters

Comparing Models
Model B is a better model if (and only if) the F ratio exceeds the critical value; in the standard extra-sum-of-squares form,
F = [(Z_A − Z_B)/(p_B − p_A)] / [Z_B/(n − p_B)] > F(p_B − p_A, n − p_B),
where p_A and p_B are the numbers of parameters and n is the number of data points.
Usually look up F in a table and compare the ratios.
With Maple, can calculate the confidence level for which B is a better model than A.
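A small numerical illustration of this test (the Z values, parameter counts, and n below are made up; scipy's F distribution supplies the critical value and confidence level):

```python
from scipy.stats import f

# Made-up example: model A (2 parameters) vs. model B (3 parameters), n data points
Z_A, p_A = 4.8, 2
Z_B, p_B = 3.1, 3
n = 20

# Extra-sum-of-squares F ratio
F_ratio = ((Z_A - Z_B) / (p_B - p_A)) / (Z_B / (n - p_B))

# Critical value at the 95% confidence level, and the confidence level
# at which B would just be judged the better model
F_crit = f.ppf(0.95, p_B - p_A, n - p_B)
confidence = f.cdf(F_ratio, p_B - p_A, n - p_B)

better = F_ratio > F_crit   # True -> accept model B at the 95% level
```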