1 Engineering Computation: Curve Fitting by Least-Squares Regression and Spline Interpolation (Part 7)

2 Engineering Computation Curve Fitting. Curve Fitting: Given a set of points (experimental data, tabular data, etc.), fit a curve (or surface) to the points so that we can easily evaluate f(x) at any x of interest. If x is within the data range, we are interpolating (generally safe). If x is outside the data range, we are extrapolating (often dangerous).

3 Engineering Computation Curve Fitting. Curve Fitting: Two main methods will be covered: 1. Least-Squares Regression: the function is a "best fit" to the data and does not necessarily pass through the points; used for scattered (experimental) data and for developing models for analysis/design. 2. Interpolation: the function passes through all (or most) points; used to interpolate values of well-behaved (precise) data or for geometric design.

4 Engineering Computation Curve Fitting & Interpolation. Curve Fitting: 1. We have discussed Least-Squares Regression, where the function is a "best fit" to the points but does not necessarily pass through them. 2. We now discuss Interpolation & Extrapolation, where the function passes through all (or at least most) points. [Figure: interpolation within the data range; extrapolation beyond it.]

5 Engineering Computation Least Squares Regression: General Procedure

6 Engineering Computation Least Squares Regression. Curve Fitting by Least-Squares Regression: Objective: obtain a low-order approximation (curve or surface) that "best fits" the data. Note: because the order of the approximation is less than the number of data points, the curve or surface cannot pass through all the points, so we need a consistent criterion for determining the "best fit." Typical usage: scattered (experimental) data; developing empirical models for analysis/design.

7 Engineering Computation Least Squares Regression. Least-Squares Regression: 1. In the laboratory, apply x, measure y, and tabulate the data (xi, yi). 2. Plot the data and examine the relationship. [Figure: scatter plot of the data points (xi, yi).]

9 Engineering Computation Least Squares Regression. Least-Squares Regression: 3. Develop a "model", an approximate relationship between y and x, e.g., y = m x + b. 4. Use the model to predict or estimate y for any given x. 5. A "best fit" of the data requires: an optimal way of finding the parameters (e.g., the slope and intercept of a straight line); perhaps optimizing the selection of the model form (i.e., linear, quadratic, exponential, ...); and that the magnitudes of the residual errors do not vary in any systematic fashion. [In statistical applications, the residual errors should be independent and identically distributed.]

10 Engineering Computation Least Squares Regression. Least-Squares Regression. Given: n data points (x1, y1), (x2, y2), ..., (xn, yn). Obtain: a "best fit" curve f(x) = a0 Z0(x) + a1 Z1(x) + a2 Z2(x) + ... + am Zm(x), where the ai are the unknown parameters of the model and the Zi are known functions of x. We will focus on two of the many possible types of regression models: Simple Linear Regression: Z0(x) = 1 and Z1(x) = x. General Polynomial Regression: Z0(x) = 1, Z1(x) = x, Z2(x) = x^2, ..., Zm(x) = x^m.
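
As an illustration of this general form, here is a minimal MATLAB sketch that builds the matrix of basis-function values for a polynomial model (Z0 = 1, Z1 = x, ..., Zm = x^m) and evaluates the fitted curve. The data vectors and the order m are made-up placeholders; the least-squares solve itself is developed on the following slides.

% Build the basis-function matrix Z for a polynomial model and fit it (placeholder data).
x = [0.5 1.1 1.9 3.0 4.2]';      % hypothetical x data
y = [1.2 2.0 3.1 4.8 6.9]';      % hypothetical y data
m = 2;                           % polynomial order, giving m+1 unknown coefficients
n = length(x);
Z = zeros(n, m+1);
for j = 0:m
    Z(:, j+1) = x.^j;            % column j+1 holds Zj(x) = x^j evaluated at each xi
end
a = Z \ y;                       % least-squares coefficients a0..am
f = Z * a;                       % fitted values f(xi)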

11 MATLAB's regress function: b = regress(y,X) returns the vector of regression coefficients b in the linear model y = X*b (X is an n-by-p matrix, y is the n-by-1 vector of observations). [B,BINT,R,RINT,STATS] = regress(y,X,ALPHA) uses the input ALPHA to calculate 100(1 - ALPHA)% confidence intervals for B and for the residual vector R, returned in BINT and RINT respectively. The vector STATS contains the R-square statistic along with the F and p values for the regression. Example:
>> x = linspace(0,1,20)';                % 20 points on [0,1]
>> y = 2*x + 1 + 0.1*randn(20,1);        % synthetic data: slope 2, intercept 1, noise
>> plot(x,y,'.')
>> xx = [ones(20,1), x];                 % design matrix [1, x]
>> b = regress(y,xx)
b =
    1.0115
    1.9941
>> yy = xx*b;                            % fitted values
>> hold on
>> plot(x,yy,'k-')

12 Engineering Computation Least Squares Regression: General Procedure. Least-Squares Regression (cont'd): General Procedure: For the i-th data point (xi, yi), we find the set of coefficients for which: yi = a0 Z0(xi) + a1 Z1(xi) + ... + am Zm(xi) + ei, where ei is the residual error, the difference between the reported value and the model: ei = yi - a0 Z0(xi) - a1 Z1(xi) - ... - am Zm(xi). Our "best fit" will minimize the total sum of the squares of the residuals: Sr = Σ ei^2 (summed over i = 1, ..., n).

13 Engineering Computation Least Squares Regression: General Procedure. Our "best fit" will be the function which minimizes the sum of squares of the residuals. [Figure: at x = xi, the residual ei is the vertical distance between the measured value yi and the modeled value on the fitted curve.]

14 Engineering Computation Least Squares Regression: General Procedure. Least-Squares Regression (cont'd): To minimize this expression with respect to the unknowns a0, a1, ..., am, take the derivatives of Sr and set them to zero: ∂Sr/∂aj = -2 Σ Zj(xi) [yi - a0 Z0(xi) - ... - am Zm(xi)] = 0, for j = 0, 1, ..., m.

15 Engineering Computation Least Squares: Linear Algebra. Least-Squares Regression (cont'd): In linear-algebra form: {Y} = [Z]{A} + {E}, or {E} = {Y} - [Z]{A}, where {E} and {Y} are n x 1, [Z] is n x (m+1), and {A} is (m+1) x 1; n = number of points and (m+1) = number of unknowns. {E}^T = [e1 e2 ... en], {Y}^T = [y1 y2 ... yn], {A}^T = [a0 a1 a2 ... am].

16 Engineering Computation Least Squares: Sum of Squared Errors. Least-Squares Regression (cont'd): With {E} = {Y} - [Z]{A}, the sum of squared residuals is Sr = {E}^T{E} = ({Y} - [Z]{A})^T ({Y} - [Z]{A}) = {Y}^T{Y} - {A}^T[Z]^T{Y} - {Y}^T[Z]{A} + {A}^T[Z]^T[Z]{A} = {Y}^T{Y} - 2{A}^T[Z]^T{Y} + {A}^T[Z]^T[Z]{A}. Setting ∂Sr/∂ai = 0 for i = 0, ..., m yields 0 = 2[Z]^T[Z]{A} - 2[Z]^T{Y}, or [Z]^T[Z]{A} = [Z]^T{Y}.

17 Engineering Computation Least Squares: Normal Equations. Least-Squares Regression (cont'd): [Z]^T[Z]{A} = [Z]^T{Y} (C&C Eq. 17.25). This is the general form of the normal equations. They provide (m+1) equations in (m+1) unknowns. (Note that we end up with a system of linear equations.)
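
To make the normal equations concrete, here is a minimal MATLAB sketch, using the same made-up data as the earlier sketch and a straight-line basis Z = [1, x]. It forms and solves [Z]^T[Z]{A} = [Z]^T{Y} directly and compares the result with MATLAB's backslash operator, which solves the same least-squares problem in a more numerically robust way.

% Solve the normal equations directly and via backslash (placeholder data).
x = [0.5 1.1 1.9 3.0 4.2]';
y = [1.2 2.0 3.1 4.8 6.9]';
Z = [ones(size(x)), x];          % simple linear model: Z0 = 1, Z1 = x
a_normal = (Z' * Z) \ (Z' * y);  % solve [Z]'[Z]{A} = [Z]'{Y}
a_backslash = Z \ y;             % QR-based least squares; usually preferred numerically
disp([a_normal, a_backslash])    % the two coefficient vectors should agree closely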

18 Engineering Computation Least Squares: Simple Linear Regression. Simple Linear Regression (m = 1): Given: n data points (x1, y1), (x2, y2), ..., (xn, yn) with n > 2. Obtain: a "best fit" curve f(x) = a0 + a1 x from the n equations: y1 = a0 + a1 x1 + e1; y2 = a0 + a1 x2 + e2; ...; yn = a0 + a1 xn + en. Or, in matrix form: [Z]^T[Z]{A} = [Z]^T{Y}.

19 Engineering Computation Least Squares: Simple Linear Regression. Simple Linear Regression (m = 1): The normal equations [Z]^T[Z]{A} = [Z]^T{Y}, upon multiplying out the matrices, become the normal equations for linear regression (C&C Eqs. 17.4-5): n a0 + (Σ xi) a1 = Σ yi and (Σ xi) a0 + (Σ xi^2) a1 = Σ xi yi. (This form works well for spreadsheets.)

20 Engineering Computation Least Squares: Simple Linear Regression. Simple Linear Regression (m = 1): [Z]^T[Z]{A} = [Z]^T{Y}. Solving for {A} gives C&C equations (17.6) and (17.7): a1 = (n Σ xi yi - Σ xi Σ yi) / (n Σ xi^2 - (Σ xi)^2) and a0 = ybar - a1 xbar, where xbar and ybar are the means of the xi and yi.

21 Engineering Computation Least Squares: Simple Linear Regression. Simple Linear Regression (m = 1): [Z]^T[Z]{A} = [Z]^T{Y}. A better version of the first normal equation is a0 = ybar - a1 xbar (the fitted line passes through the centroid of the data), which is easier and numerically more stable; the second equation remains the same.
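
A minimal MATLAB sketch of these closed-form coefficients, using made-up data; MATLAB's built-in polyfit is included only as an independent check (it returns coefficients in descending powers of x).

% Simple linear regression from the closed-form formulas (placeholder data).
x = [1 2 3 4 5 6]';
y = [2.9 5.1 6.8 9.2 10.9 13.1]';
n = length(x);
a1 = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);   % slope (C&C 17.6)
a0 = mean(y) - a1*mean(x);                                        % intercept (C&C 17.7)
p = polyfit(x, y, 1);                      % independent check: p = [slope, intercept]
fprintf('a0 = %.4f, a1 = %.4f (polyfit: %.4f, %.4f)\n', a0, a1, p(2), p(1));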

22 ENGRD 241 / CEE 241: Engineering Computation Curve Fitting. Common Nonlinear Relations: Objective: use linear equations for simplicity. Remedy: transform the data into linear form and perform the regression. Given data which appears as: (1) an exponential-like curve, y = a1 e^(b1 x) (e.g., population growth, radioactive decay, attenuation of a transmission line), one can use the linearized form ln(y) = ln(a1) + b1 x.

23 ENGRD 241 / CEE 241: Engineering Computation Curve Fitting. Common Nonlinear Relations: (2) Power-like curve: y = a2 x^b2, linearized as ln(y) = ln(a2) + b2 ln(x). (3) Saturation growth-rate curve (e.g., population growth under limiting conditions): y = a3 x / (b3 + x). Be careful about the implied distribution of the errors; always use the untransformed values for error analysis. [Figure: example saturation growth-rate curves with a3 = 5 and b3 = 1..10.]
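
As a sketch of the transform-then-regress idea for the exponential case, the following MATLAB lines (with synthetic, made-up data) fit ln(y) = ln(a1) + b1 x by straight-line regression and then recover a1 and b1; note that the residuals are evaluated in the original, untransformed units.

% Linearized fit of an exponential-like relationship y = a1*exp(b1*x) (synthetic data).
x = (0:0.5:4)';
y = 2.0*exp(0.6*x) .* (1 + 0.05*randn(size(x)));   % noisy synthetic measurements
Z = [ones(size(x)), x];
c = Z \ log(y);                  % fit ln(y) = ln(a1) + b1*x
a1 = exp(c(1));                  % recover a1 from the intercept
b1 = c(2);                       % slope of the linearized fit
resid = y - a1*exp(b1*x);        % residuals in the untransformed units, for error analysis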

24 Engineering Computation Goodness of Fit. Major Points in Least-Squares Regression: 1. In all regression models one is solving an overdetermined system of equations, i.e., more equations than unknowns. 2. How good is the fit? This is often judged by a coefficient of determination, r^2.

25 Engineering Computation Goodness of Fit. r^2 compares the spread of the data about the regression line with the spread of the data about the mean. Spread of the data around the regression line: Sr = Σ ei^2 = Σ (yi - f(xi))^2. Spread of the data around the mean: St = Σ (yi - ybar)^2.

26 Engineering Computation Goodness of Fit. The coefficient of determination, r^2 = (St - Sr) / St, describes how much of the variance is "explained" by the regression equation. We want r^2 close to 1.0. It does not work for comparing models with different numbers of parameters. Be careful when using different transformations; always do the error analysis on the untransformed data.

27 Engineering Computation Standard Error of the Estimate. Precision: If the spread of the points around the line is of similar magnitude along the entire range of the data, then one can use the standard error of the estimate, s_y/x = sqrt( Sr / (n - (m+1)) ), to describe the precision of the regression estimate (in which m+1 is the number of coefficients calculated for the fit, e.g., m+1 = 2 for linear regression). The standard error of the estimate is a standard deviation in y about the fitted curve.
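
Putting the goodness-of-fit quantities together, here is a minimal MATLAB sketch (again with made-up data and a straight-line fit) that computes Sr, St, the coefficient of determination r^2, and the standard error of the estimate as defined on the last few slides.

% Goodness-of-fit summary for a straight-line fit (placeholder data).
x = [1 2 3 4 5 6 7 8]';
y = [1.8 4.1 5.9 8.2 9.8 12.1 14.2 15.9]';
n = length(x);
Z = [ones(n,1), x];
a = Z \ y;                        % least-squares coefficients [a0; a1]
Sr = sum((y - Z*a).^2);           % spread of the data about the regression line
St = sum((y - mean(y)).^2);       % spread of the data about the mean
r2 = (St - Sr) / St;              % coefficient of determination
syx = sqrt(Sr / (n - 2));         % standard error of the estimate (m+1 = 2 here)
fprintf('r^2 = %.4f, s_y/x = %.4f\n', r2, syx);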

28 Engineering Computation Standard Error of the Estimate. Statistics: Chapra and Canale, in sections PT5.2, 17.1.3, and 17.4.3, discuss the statistical interpretation of least-squares regression and some of the associated statistical concepts. The statistical theory of least-squares regression is elegant, powerful, and widely used in the analysis of real data throughout the sciences. See Lecture Notes pages X-14 through X-16.

