# Function Approximation


Function Approximation
Function approximation (Chapters 13 & 14):
- method of least squares
- minimize the residuals
- the given data points contain noise
- the purpose is to find the trend represented by the data

Function interpolation (Chapters 15 & 16):
- the approximating function matches the given data exactly
- the given data points are precise
- the purpose is to find data between these points
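The distinction can be sketched in Python with hypothetical data; here NumPy's `polyfit` and `interp` stand in for the regression and interpolation methods developed in these chapters:

```python
import numpy as np

# Hypothetical noisy samples of an underlying linear trend (illustration only).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Regression: one straight line that follows the trend; the residuals are
# generally nonzero because the data contain noise.
a1, a0 = np.polyfit(x, y, 1)      # least-squares slope and intercept
resid = y - (a0 + a1 * x)

# Interpolation: the approximating function matches every given point exactly,
# and is used to estimate values between the points.
yq = np.interp(2.0, x, y)         # returns the data value 5.2 exactly
```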

Interpolation and Regression

Curve Fitting: Fitting a Straight Line (Chapter 13)

Least-Squares Regression
- Curve Fitting
- Statistics Review
- Linear Least-Squares Regression
- Linearization of Nonlinear Relationships
- MATLAB Functions

Wind Tunnel Experiment
Curve fitting example: measure air resistance as a function of velocity.

Regression and Interpolation
Curve fitting:
(a) Least-squares regression
(b) Linear interpolation
(c) Curvilinear interpolation

Least-squares fit of a straight line
[Figure: force F (N) versus velocity v (m/s) data with the least-squares straight-line fit]

Simple Statistics
Measurements of the coefficient of thermal expansion of structural steel [×10⁻⁶ in/(in·°F)]: mean, standard deviation, variance, etc.

Statistics Review
- Arithmetic mean: $\bar{y} = \dfrac{\sum y_i}{n}$
- Standard deviation about the mean: $s_y = \sqrt{\dfrac{S_t}{n-1}}$, where $S_t = \sum (y_i - \bar{y})^2$
- Variance (spread): $s_y^2 = \dfrac{S_t}{n-1}$
- Coefficient of variation: $\mathrm{c.v.} = \dfrac{s_y}{\bar{y}} \times 100\%$
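These statistics are direct to compute; a short Python sketch with a hypothetical sample of repeated measurements:

```python
import math

# Hypothetical sample of repeated measurements (illustration only).
y = [6.495, 6.595, 6.615, 6.635, 6.485, 6.555]
n = len(y)

ybar = sum(y) / n                          # arithmetic mean
St = sum((yi - ybar) ** 2 for yi in y)     # sum of squares about the mean
sy = math.sqrt(St / (n - 1))               # standard deviation
var = St / (n - 1)                         # variance
cv = sy / ybar * 100                       # coefficient of variation, in %
```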

Coefficient of Thermal Expansion
- Mean
- Sum of the squares of the residuals
- Standard deviation
- Variance
- Coefficient of variation

Histogram Normal Distribution
A histogram is used to depict the distribution of data. For a large data set, the histogram often approaches the normal distribution (using the data in Table 12.2).

Regression and Residual

Linear Regression Fitting a straight line to observations
[Figure: fits with small residual errors versus large residual errors]

Linear Regression
Equation for the straight line: $y = a_0 + a_1 x + e$. The difference between the observation and the line, $e_i = y_i - a_0 - a_1 x_i$, is the residual (error).

Least Squares Approximation
Minimizing the residuals (errors); possible criteria:
- minimum average error (suffers from cancellation)
- minimum absolute error
- minimax error (minimizing the maximum error)
- least squares (linear, quadratic, ...)
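A small numeric illustration of why the plain sum of errors fails as a criterion (the residual values below are made up for the demonstration):

```python
# Residuals of two hypothetical straight-line fits to the same data.
# A badly wrong line can still have residuals that cancel, so the plain
# sum of errors is not a usable fit criterion; the sum of squares is.
residuals_bad  = [-5.0, 0.0, 5.0]    # poor fit: large errors that cancel
residuals_good = [-0.1, 0.2, -0.1]   # good fit: small errors

sum_bad    = sum(residuals_bad)                   # 0.0, misleadingly "perfect"
sumsq_bad  = sum(e * e for e in residuals_bad)    # 50.0, correctly large
sumsq_good = sum(e * e for e in residuals_good)   # about 0.06, correctly small
```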

[Figure: three criteria compared: minimize the sum of errors, minimize the sum of absolute errors, and minimize the maximum error]

Linear Least Squares
Minimize the total square error $S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$. A straight-line approximation is not likely to pass through all the points if $n > 2$.

Linear Least Squares Solve for (a0 ,a1)
Total square-error function (sum of the squares of the residuals): $S_r(a_0, a_1) = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$. Minimize $S_r(a_0, a_1)$ by setting its partial derivatives with respect to $a_0$ and $a_1$ to zero, then solve for $(a_0, a_1)$.

Linear Least Squares
Setting $\partial S_r/\partial a_0 = 0$ and $\partial S_r/\partial a_1 = 0$ gives the normal equations for the line $y = a_0 + a_1 x$:

$n\,a_0 + \left(\sum x_i\right) a_1 = \sum y_i$

$\left(\sum x_i\right) a_0 + \left(\sum x_i^2\right) a_1 = \sum x_i y_i$

with solution $a_1 = \dfrac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}$ and $a_0 = \bar{y} - a_1 \bar{x}$.

With least squares, positive differences do not cancel negative differences, and the differentiation is straightforward. Squaring also weights the differences: small differences become smaller and large differences are magnified.

Linear Least Squares: the sums in the normal equations can be computed with sum( ) in MATLAB.
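The same sum-based computation, sketched in Python; the data here are hypothetical but chosen to reproduce the coefficients a0 = 0.0714 and a1 = 0.8393 that appear in the M-file output later in the lecture:

```python
# Straight-line least squares using only running sums, mirroring the
# MATLAB sum()-based computation.
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]   # hypothetical data
n = len(x)

Sx  = sum(x)
Sy  = sum(y)
Sxx = sum(xi * xi for xi in x)
Sxy = sum(xi * yi for xi, yi in zip(x, y))

a1 = (n * Sxy - Sx * Sy) / (n * Sxx - Sx ** 2)   # slope
a0 = Sy / n - a1 * Sx / n                        # intercept: ybar - a1*xbar
```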

Correlation Coefficient
- Sum of squares of the residuals with respect to the mean: $S_t = \sum (y_i - \bar{y})^2$
- Sum of squares of the residuals with respect to the regression line: $S_r = \sum (y_i - a_0 - a_1 x_i)^2$
- Coefficient of determination: $r^2 = \dfrac{S_t - S_r}{S_t}$
- Correlation coefficient: $r = \sqrt{\dfrac{S_t - S_r}{S_t}}$

Correlation Coefficient
An alternative formulation of the correlation coefficient, more convenient for computer implementation:

$r = \dfrac{n\sum x_i y_i - \sum x_i \sum y_i}{\sqrt{n\sum x_i^2 - \left(\sum x_i\right)^2}\,\sqrt{n\sum y_i^2 - \left(\sum y_i\right)^2}}$
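A Python sketch of the single-pass formulation; with the same hypothetical data used above, it reproduces the r = 0.9318 shown in the example output:

```python
import math

# Correlation coefficient from raw sums: no need to fit the line first.
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]   # hypothetical data
n = len(x)

Sx, Sy = sum(x), sum(y)
Sxx = sum(xi * xi for xi in x)
Syy = sum(yi * yi for yi in y)
Sxy = sum(xi * yi for xi, yi in zip(x, y))

r = (n * Sxy - Sx * Sy) / (
    math.sqrt(n * Sxx - Sx ** 2) * math.sqrt(n * Syy - Sy ** 2)
)
```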

Standard Error of the Estimate
If the spread of the data about the line is normal, the "standard deviation" for the regression line is the standard error of the estimate, $s_{y/x} = \sqrt{S_r/(n-2)}$. There is no error when $n = 2$, because the two parameters $a_0$ and $a_1$ make the line pass through both points exactly.
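A sketch of the computation; the data and coefficients are the same hypothetical values used above, chosen to match the example output (err = 2.9911, Syx = 0.7734):

```python
import math

# Standard error of the estimate, s_{y/x} = sqrt(Sr/(n-2)).
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]   # hypothetical data
n = len(x)
a0, a1 = 0.0714, 0.8393                   # fitted coefficients

# Residual sum of squares about the regression line.
Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))
syx = math.sqrt(Sr / (n - 2))   # n-2: two parameters (a0, a1) were estimated
```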

Spread of data around the mean
Linear regression reduces the spread of the data: the normal distribution of points about the mean is wider than the distribution about the best-fit line.

Standard Deviation for Regression Line
$s_y$: spread around the mean. $s_{y/x}$: spread around the regression line.

Example: Linear Regression

Example: Linear Regression
- Standard deviation about the mean: $s_y$
- Standard error of the estimate: $s_{y/x}$
- Correlation coefficient: $r$

Linear Least-Squares Regression

Modified MATLAB M-File

```
» x=1:7
» y=[ ]
» s=linear_LS(x,y)
a0 = 0.0714
a1 = 0.8393
    x    y    (a0+a1*x)    (y-a0-a1*x)
err = 2.9911    % sum of squares of residuals, Sr
Syx = 0.7734    % standard error of the estimate
r = 0.9318      % correlation coefficient
```

Linear regression: y = 0.0714 + 0.8393x

» x=0:1:7; y=[ ];
Linear regression: y = x
Error: Sr =
Correlation coefficient: r =

function [x,y] = example1

```
» s=Linear_LS(x,y)
a0 = 4.5933
a1 = -1.8570
    x    y    (a0+a1*x)    (y-a0-a1*x)
err =
Syx = 1.6996
r = 0.9617
```

Linear least-squares fit: y = 4.5933 − 1.8570x, with standard error Syx = 1.6996 and correlation coefficient r = 0.9617.

Data in arbitrary order
```
» [x,y]=example2
x =
  Columns 1 through 7
  Columns 8 through 10
y =
» s=Linear_LS(x,y)
a0 =
a1 = 3.6005
    x    y    (a0+a1*x)    (y-a0-a1*x)
err = 4.2013e+003
Syx =
r = 0.4434
```

The data are in arbitrary order. The large errors (Sr = 4.2013e+003) and the low correlation coefficient (r = 0.4434) indicate a poor straight-line fit.

Linearization of Nonlinear Relationships

[Figure: untransformed power equation, x vs. y; transformed data, log x vs. log y]

Linearization of Nonlinear Relationships
- Exponential equation: $y = a_1 e^{b_1 x} \;\Rightarrow\; \ln y = \ln a_1 + b_1 x$
- Power equation: $y = a_2 x^{b_2} \;\Rightarrow\; \log y = \log a_2 + b_2 \log x$ (log denotes base-10)
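A sketch of the power-equation transformation in Python; the data are hypothetical, generated from y = 2x² with small perturbations, so the fit should recover an exponent near 2 and a coefficient near 2:

```python
import math

# Fit the power model y = a*x**b by linear regression on the
# transformed data (log10 x, log10 y), then back-transform.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 7.9, 18.2, 31.8]   # roughly 2*x**2 with noise (hypothetical)

X = [math.log10(v) for v in x]
Y = [math.log10(v) for v in y]
n = len(X)

Sx, Sy = sum(X), sum(Y)
Sxx = sum(v * v for v in X)
Sxy = sum(u * v for u, v in zip(X, Y))

b = (n * Sxy - Sx * Sy) / (n * Sxx - Sx ** 2)   # exponent b2
a = 10 ** (Sy / n - b * Sx / n)                 # coefficient a2, back-transformed
```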

Linearization of Nonlinear Relationships
- Saturation-growth-rate equation: $y = a_3 \dfrac{x}{b_3 + x} \;\Rightarrow\; \dfrac{1}{y} = \dfrac{1}{a_3} + \dfrac{b_3}{a_3}\,\dfrac{1}{x}$, a linear (rational) relationship between $1/y$ and $1/x$
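The same idea for the saturation-growth-rate model, sketched with data generated exactly from hypothetical parameters a = 3, b = 2, which the transformed fit then recovers:

```python
# Fit y = a*x/(b + x) by regressing 1/y on 1/x:  1/y = 1/a + (b/a)*(1/x).
x = [1.0, 2.0, 4.0, 8.0]
y = [3.0 * v / (2.0 + v) for v in x]   # generated from a=3, b=2 (hypothetical)

X = [1.0 / v for v in x]
Y = [1.0 / v for v in y]
n = len(X)

Sx, Sy = sum(X), sum(Y)
Sxx = sum(v * v for v in X)
Sxy = sum(u * v for u, v in zip(X, Y))

slope = (n * Sxy - Sx * Sy) / (n * Sxx - Sx ** 2)   # equals b/a
inter = Sy / n - slope * Sx / n                     # equals 1/a
a = 1.0 / inter
b = slope * a
```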

Example 12.4: Power Equation
Fit the power equation $y = a_2 x^{b_2}$ by transforming the data: fit a straight line to $\log x_i$ vs. $\log y_i$, then back-transform. The resulting power-equation fit is shown along with the original data ($x$ vs. $y$).

```
>> x=[ ];
>> y=[ ];
>> [a, r2] = linregr(x,y)
a =
r2 = 0.8805
```

```
>> linregr(log10(x),log10(y))
r2 = 0.9481
```

The transformed fit (log x vs. log y) gives log y = 1.9842 log x − 0.5620, so y = 10^(−0.5620) x^1.9842 ≈ 0.2741 x^1.9842.

MATLAB Functions
Least-squares fit of an nth-order polynomial: p = polyfit(x,y,n)
Evaluate the value of the polynomial using y = polyval(p,x)
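NumPy provides functions with the same names and conventions (coefficients ordered highest power first), so the workflow can be sketched in Python with the hypothetical data used earlier:

```python
import numpy as np

# Least-squares polynomial fit and evaluation, MATLAB-style interface.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
y = np.array([0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5])   # hypothetical data

p = np.polyfit(x, y, 1)     # 1st-order fit: p = [slope, intercept]
yhat = np.polyval(p, x)     # evaluate the fitted polynomial at x
```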

CVEN 302-501 Homework No. 9 Chapter 13
Prob (20) & 13.2 (20) (hand calculations); Prob (30) & 13.7 (30) (hand calculation and MATLAB program). You may use spreadsheets for your hand computation. Due Wednesday, Oct 22, 2008, at the beginning of the period.