Nonlinear Fitting

Linearizing Nonlinear Functions

Not recommended unless you have information on the errors in your y data and you weight the fit according to those errors. Note that these errors will change due to the transformation!
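As an illustration of why the errors change (this derivation is not on the slide): for a log transform z = ln y, first-order error propagation gives

```latex
\sigma_{\ln y} \approx \frac{\sigma_y}{y},
\qquad
w_i = \frac{1}{\sigma_{\ln y_i}^2} = \frac{y_i^2}{\sigma_{y_i}^2}
```

so the fit to the transformed data should use the weights w_i, which down-weight points with small y.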

Example

x    y
1    0.5
2    1.7
3    3.4
4    5.7
5    8.7
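A minimal Octave sketch of a linearized, weighted fit to this data. The slide does not state the model or the measurement errors; a power law y = a·x^b and a constant sigma_y are assumed here for illustration.

```matlab
% Linearized fit of an assumed power law y = a*x^b:
% ln y = ln a + b*ln x, which is linear in the parameters (ln a, b).
x = [1 2 3 4 5]';
y = [0.5 1.7 3.4 5.7 8.7]';

sigma_y = 0.1 * ones(size(y));      % hypothetical errors in y (not from the slide)
w = (y ./ sigma_y).^2;              % weights, since var(ln y) ~ (sigma_y / y)^2

X = [ones(size(x)) log(x)];         % design matrix for ln y = c(1) + c(2)*ln x
W = diag(w);
c = (X' * W * X) \ (X' * W * log(y));   % weighted least squares

a = exp(c(1));                      % back-transform the intercept
b = c(2);
fprintf('y ~ %.3f * x^%.3f\n', a, b);
```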

Fitting Polynomials (Polynomial Regression)

For a quadratic, a 3×3 system of equations must be solved, meaning you must have at least three data points.
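The slide's own equations did not survive extraction; for a quadratic y = a₀ + a₁x + a₂x², the system is the standard normal equations:

```latex
\begin{pmatrix}
n          & \sum x_i   & \sum x_i^2 \\
\sum x_i   & \sum x_i^2 & \sum x_i^3 \\
\sum x_i^2 & \sum x_i^3 & \sum x_i^4
\end{pmatrix}
\begin{pmatrix} a_0 \\ a_1 \\ a_2 \end{pmatrix}
=
\begin{pmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{pmatrix}
```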

Fitting Polynomials (Polynomial Regression)

Form the objective function:
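The objective itself was an image on the slide; the standard least-squares objective for a quadratic is

```latex
S(a_0, a_1, a_2) = \sum_{i=1}^{n} \bigl( y_i - a_0 - a_1 x_i - a_2 x_i^2 \bigr)^2,
\qquad
\frac{\partial S}{\partial a_k} = 0, \quad k = 0, 1, 2
```

Setting the three partial derivatives to zero yields exactly the 3×3 system shown above.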

Example

x    y
1    2
5    3
8    4
17   16

In Matlab you would write: a = inv(X'*X)*(X'*y). Switch to Octave…
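A minimal sketch of this step in Octave. The slide's data table is garbled in the transcript, so the pairing of x and y below is a best-effort reading and is illustrative only.

```matlab
% Quadratic fit via the normal equations, as on the slide.
x = [1 5 8 17]';                 % data as recovered from the garbled transcript
y = [2 3 4 16]';

X = [ones(size(x)) x x.^2];      % design matrix for y = a0 + a1*x + a2*x^2
a = inv(X' * X) * (X' * y);      % normal equations, as written on the slide
a2 = (X' * X) \ (X' * y);        % the backslash form is numerically safer
```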

Errors in Parameters

σ² is estimated from the sum of squared errors (residuals) as before:
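The estimate itself (an image on the slide) has the standard form, with n data points and p fitted parameters (p = 3 for a quadratic):

```latex
\hat{\sigma}^2 = \frac{\mathrm{SSE}}{n - p}
             = \frac{\sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2}{n - p}
```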

Errors in Parameters

The parameter covariance matrix is Cov(a) = σ̂² (XᵀX)⁻¹. Its diagonal entries are the variances of the parameters; their square roots are the standard errors.
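Continuing the Octave example above, a sketch of how these quantities are computed:

```matlab
% Parameter covariance and standard errors for the quadratic fit.
n = length(y);                   % number of data points
p = size(X, 2);                  % number of fitted parameters
r = y - X * a;                   % residuals
sigma2 = (r' * r) / (n - p);     % estimated error variance
Cov_a = sigma2 * inv(X' * X);    % parameter covariance matrix
se_a = sqrt(diag(Cov_a));        % standard errors = sqrt of the diagonal
```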