**Function Approximation**

- Function approximation (Chapters 13 & 14): the method of least squares minimizes the residuals. The given data points contain noise; the purpose is to find the trend represented by the data.
- Function interpolation (Chapters 15 & 16): the approximating function matches the given data exactly. The given data points are precise; the purpose is to estimate values between these points.

**Interpolation and Regression**

**Chapter 13 Curve Fitting: Fitting a Straight Line**

**Least-Squares Regression**

- Curve fitting
- Statistics review
- Linear least-squares regression
- Linearization of nonlinear relationships
- MATLAB functions

**Wind Tunnel Experiment**

Curve-fitting example: measure air resistance as a function of velocity.

**Regression and Interpolation**

Curve-fitting approaches: (a) least-squares regression, (b) linear interpolation, (c) curvilinear interpolation.

**Least-Squares Fit of a Straight Line**

(Figure: force F, N plotted against velocity v, m/s.)

**Simple Statistics: Mean, Standard Deviation, Variance, etc.**

Measurements of the coefficient of thermal expansion of structural steel [×10⁻⁶ in/(in·°F)]: mean, standard deviation, variance, etc.

**Statistics Review**

- Arithmetic mean: ȳ = Σyᵢ / n
- Standard deviation about the mean: sy = √(St / (n − 1)), where St = Σ(yᵢ − ȳ)²
- Variance (spread): sy² = St / (n − 1)
- Coefficient of variation: c.v. = (sy / ȳ) × 100%

**Coefficient of Thermal Expansion**

Computed for the thermal-expansion data: the mean, the sum of the squares of the residuals (St), the standard deviation, the variance, and the coefficient of variation.
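These quantities can be sketched in Python as a stand-in for the slide's MATLAB computations; `simple_stats` is an illustrative name, and any data passed in are hypothetical, not the textbook's table values:

```python
import math

def simple_stats(y):
    """Mean, St (sum of squared residuals about the mean),
    standard deviation, variance, and coefficient of variation (%)."""
    n = len(y)
    mean = sum(y) / n
    St = sum((yi - mean) ** 2 for yi in y)   # sum of squares of residuals
    variance = St / (n - 1)                  # spread about the mean
    std = math.sqrt(variance)
    cv = std / mean * 100                    # coefficient of variation, %
    return mean, St, std, variance, cv

# e.g. simple_stats([2.0, 4.0, 6.0]) returns (4.0, 8.0, 2.0, 4.0, 50.0)
```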

**Histogram and the Normal Distribution**

A histogram is used to depict the distribution of data. For a large data set, the histogram often approaches the normal distribution (use the data in Table 12.2).

**Regression and Residual**

**Linear Regression: Fitting a Straight Line to Observations**

(Figure: two candidate fits, one with small residual errors and one with large residual errors.)

**Linear Regression: Equation for a Straight Line**

The model is y = a0 + a1x + e. The difference between an observation and the line, eᵢ = yᵢ − (a0 + a1xᵢ), is the residual or error.

**Least-Squares Approximation**

Criteria for minimizing the residuals (errors):
- minimum average error (suffers from cancellation)
- minimum absolute error
- minimax error (minimizing the maximum error)
- least squares (linear, quadratic, …)

**Minimize the Sum of Errors, the Sum of Absolute Errors, or the Maximum Error**

(Figure: example fits under each of the three criteria.)

**Linear Least Squares: Minimize the Total Squared Error**

A straight-line approximation is unlikely to pass through all points when n > 2.

**Linear Least Squares: Solve for (a0, a1)**

The total squared-error function is the sum of the squares of the residuals, Sr(a0, a1) = Σ(yᵢ − a0 − a1xᵢ)². Setting ∂Sr/∂a0 = 0 and ∂Sr/∂a1 = 0 gives two equations that are solved for (a0, a1).

**Linear Least Squares: Normal Equations**

Minimizing Sr yields the normal equations for y = a0 + a1x. Solving them:

a1 = (n Σxᵢyᵢ − Σxᵢ Σyᵢ) / (n Σxᵢ² − (Σxᵢ)²),  a0 = ȳ − a1 x̄
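The normal-equation solution can be sketched in Python as a stand-in for the course's MATLAB routine; `linear_ls` is an illustrative name, not the slide's `linear_LS` function:

```python
def linear_ls(x, y):
    """Fit y = a0 + a1*x by the normal equations of linear least squares."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    # slope from the normal equations, intercept from a0 = y_mean - a1*x_mean
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n
    return a0, a1
```

For data lying exactly on y = 1 + 2x, the fit recovers a0 = 1 and a1 = 2.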

**Advantages of Least Squares**

- Positive differences do not cancel negative differences.
- Differentiation is straightforward.
- Squaring weights the differences: small differences become smaller and large differences are magnified.

**Linear Least Squares: Use sum( ) in MATLAB**

Each of the sums Σxᵢ, Σyᵢ, Σxᵢyᵢ, and Σxᵢ² in the normal equations can be accumulated with MATLAB's sum( ).

**Correlation Coefficient**

- St: sum of squares of the residuals with respect to the mean, St = Σ(yᵢ − ȳ)²
- Sr: sum of squares of the residuals with respect to the regression line, Sr = Σ(yᵢ − a0 − a1xᵢ)²
- Coefficient of determination: r² = (St − Sr) / St
- Correlation coefficient: r = √r²
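The coefficient of determination can be sketched in Python (an illustrative helper, assuming the fitted coefficients a0 and a1 are already known):

```python
def coefficient_of_determination(x, y, a0, a1):
    """r^2 = (St - Sr) / St for the straight-line fit y = a0 + a1*x."""
    mean = sum(y) / len(y)
    St = sum((yi - mean) ** 2 for yi in y)                       # about the mean
    Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))   # about the line
    return (St - Sr) / St
```

A perfect fit gives r² = 1, since Sr = 0.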

**Correlation Coefficient (Alternative Form)**

An alternative formulation of the correlation coefficient is more convenient for computer implementation, since it needs only running sums:

r = (n Σxᵢyᵢ − Σxᵢ Σyᵢ) / (√(n Σxᵢ² − (Σxᵢ)²) √(n Σyᵢ² − (Σyᵢ)²))

**Standard Error of the Estimate**

If the spread of the data about the line is normal, the standard error of the estimate, sy/x = √(Sr / (n − 2)), acts as a “standard deviation” for the regression line. There is no error if n = 2, since two points determine a0 and a1 exactly.
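The standard error of the estimate can be sketched in Python (an illustrative helper; assumes the fitted coefficients a0 and a1 are already known):

```python
import math

def standard_error(x, y, a0, a1):
    """Standard error of the estimate, s_y/x = sqrt(Sr / (n - 2)).

    Undefined (division by zero) when n = 2: two points fix a0 and a1 exactly.
    """
    Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))
    return math.sqrt(Sr / (len(x) - 2))
```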

**Spread of Data Around the Mean**

Linear regression reduces the spread of the data: compare the normal distribution of points around the mean (spread sy) with the narrower distribution around the best-fit line (spread sy/x).

**Standard Deviation for the Regression Line**

- sy: spread around the mean
- sy/x: spread around the regression line

**Example: Linear Regression**

Compute the standard deviation about the mean, the standard error of the estimate, and the correlation coefficient.

**Linear Least-Squares Regression**

**Modified MATLAB M-File**

**Sum of Squares of Residuals Sr and Standard Error of the Estimate**

» x = 1:7
» y = [ ] (data elided in source)
» s = linear_LS(x,y)
a0 = 0.0714, a1 = 0.8393
x, y, (a0+a1*x), (y-a0-a1*x) (table values elided in source)
err = 2.9911 (sum of squares of residuals, Sr)
Syx = 0.7734 (standard error of the estimate)
r = 0.9318 (correlation coefficient)

The fitted line is y = 0.0714 + 0.8393x.

**Linear Regression: y = 0.0714 + 0.8393x**

» x = 0:1:7; y = [ ]; (data elided in source)

(Plot of the fitted line y = 0.0714 + 0.8393x; the Sr and r values are elided in the source.)

**Example 1: y = 4.5933 + 1.8570x, r = 0.9617**

» [x,y] = example1;
» s = Linear_LS(x,y)
a0 = 4.5933, a1 = 1.8570
x, y, (a0+a1*x), (y-a0-a1*x) (table values elided in source)
err = (value elided in source)
Syx = 1.6996
r = 0.9617

(Plot of the linear least-squares fit; the fitted equation, Sr, and r values are elided in the source.)

**Data in Arbitrary Order**

» [x,y] = example2 (10 data points; values elided in source)
» s = Linear_LS(x,y)
a0 = (value elided in source)
a1 = 3.6005
err = 4.2013e+003 (sum of squares of residuals, Sr)
r = 0.4434

Large errors! The low correlation coefficient (r = 0.4434) indicates that the straight-line fit represents these data poorly.

(Plot of the poor straight-line fit; the fitted equation, Sr, and r values are elided in the source.)

**Linearization of Nonlinear Relationships**

**Power Equation: Untransformed Data (x vs. y) and Transformed Data (log x vs. log y)**

**Linearization of Nonlinear Relationships**

- Exponential equation: y = α e^(βx) linearizes to ln y = ln α + βx.
- Power equation: y = α x^β linearizes to log y = log α + β log x (log is base-10).
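The exponential linearization can be sketched in Python (fit ln y against x, then transform back; `fit_exponential` is an illustrative name, and the data used below are synthetic):

```python
import math

def fit_exponential(x, y):
    """Fit y = a * exp(b*x) by linear least squares on (x, ln y)."""
    ly = [math.log(yi) for yi in y]
    n = len(x)
    sx, sy = sum(x), sum(ly)
    sxy = sum(xi * li for xi, li in zip(x, ly))
    sxx = sum(xi * xi for xi in x)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope of the linearized fit
    a = math.exp(sy / n - b * sx / n)               # back-transform the intercept
    return a, b
```

Data generated from y = 3 e^(0.5x) recover a ≈ 3 and b ≈ 0.5.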

**Linearization of Nonlinear Relationships**

- Saturation-growth-rate equation: y = α x / (β + x) linearizes to 1/y = 1/α + (β/α)(1/x), a rational function in 1/x.

**Example 12.4: Power Equation**

Transform the data (log xᵢ vs. log yᵢ), fit a straight line, and plot the resulting power-equation fit along with the original data (x vs. y). (Example curve: y = 2x².)
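The power-equation transform can be sketched in Python (fit log10 y against log10 x, then transform back; `fit_power` is an illustrative name, and the test data below are synthetic, drawn from y = 2x²):

```python
import math

def fit_power(x, y):
    """Fit y = a * x**b by linear least squares on (log10 x, log10 y)."""
    lx = [math.log10(xi) for xi in x]
    ly = [math.log10(yi) for yi in y]
    n = len(lx)
    sx, sy = sum(lx), sum(ly)
    sxy = sum(u * v for u, v in zip(lx, ly))
    sxx = sum(u * u for u in lx)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # exponent
    a = 10 ** (sy / n - b * sx / n)                 # back-transform the intercept
    return a, b
```

Data generated from y = 2x² recover a ≈ 2 and b ≈ 2.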

>> x = [ ]; (data elided in source)
>> y = [ ];
>> [a, r2] = linregr(x,y)
a = (coefficient values elided in source)
r2 = 0.8805

The untransformed linear fit gives r² = 0.8805.

**log x vs. log y: log y = 1.9842 log x − 0.5620**

>> linregr(log10(x), log10(y))
r2 = 0.9481
ans = (coefficient values elided in source)

The transformed fit is log y = 1.9842 log x − 0.5620, so y = (10^−0.5620) x^1.9842, with r² = 0.9481.

**MATLAB Functions**

Least-squares fit of an nth-order polynomial: p = polyfit(x,y,n). Evaluate the polynomial with y = polyval(p,x).
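NumPy's polyfit and polyval mirror this MATLAB calling convention, with coefficients returned highest power first; a sketch with synthetic data on a known line:

```python
import numpy as np

# Hypothetical data lying exactly on y = 1 + 2x (for illustration only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.0 + 2.0 * x

p = np.polyfit(x, y, 1)   # degree-1 fit; p = [a1, a0], highest power first
yhat = np.polyval(p, x)   # evaluate the fitted polynomial at x
```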

**CVEN 302-501 Homework No. 9, Chapter 13**

Prob (20) & 13.2 (20) (hand calculations); Prob (30) & 13.7 (30) (hand calculation and MATLAB program). You may use spreadsheets for your hand computation. Due Wednesday, October 22, 2008, at the beginning of the period.
