Copyright © 2006 The McGraw-Hill Companies, Inc. Permission required for reproduction or display.
~ Curve Fitting ~ Least Squares Regression (Chapter 17)


Slide 1: ~ Curve Fitting ~ Least Squares Regression (Chapter 17)

Slide 2: Curve Fitting
Fit the best curve to a discrete data set and obtain estimates for other data points. Two general approaches:
- If the data exhibit a significant degree of scatter, find a single curve that represents the general trend of the data.
- If the data are very precise, pass a curve (or series of curves) exactly through each of the points: interpolation.

Slide 3: Simple Statistics
In the sciences, if several measurements are made of a particular quantity, additional insight can be gained by summarizing the data in one or more well-chosen statistics:
- Arithmetic mean: the sum of the individual data points (yi) divided by the number of points.
- Standard deviation: a common measure of spread for a sample; its square is the variance.
- Coefficient of variation: quantifies the spread of the data relative to the mean (similar in spirit to relative error).
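These three statistics can be computed with Python's standard library. A minimal sketch, using a small hypothetical sample of repeated measurements (the data values here are illustrative, not from the slides):

```python
import statistics

# Hypothetical repeated measurements of one quantity
y = [6.495, 6.595, 6.615, 6.635, 6.485, 6.555]

mean = statistics.mean(y)        # arithmetic mean: sum(y) / n
s = statistics.stdev(y)          # sample standard deviation (divides by n - 1)
var = statistics.variance(y)     # variance = square of the standard deviation
cv = (s / mean) * 100            # coefficient of variation, in percent

print(f"mean = {mean:.4f}, s = {s:.4f}, c.v. = {cv:.2f}%")
```

Note that `statistics.stdev` uses the sample form (dividing by n - 1), which matches the usual convention for measured data.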

Slide 4: Linear Regression
Fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), ..., (xn, yn).
Line equation: y = a0 + a1 x, where a0 is the intercept and a1 is the slope.
With yi the measured value and ei the error (residual): yi = a0 + a1 xi + ei, so ei = yi - a0 - a1 xi.

Slide 5: Choosing Criteria for a "Best Fit"
Minimize the sum of the residual errors for all available data? Inadequate: positive and negative residuals can cancel. Minimize the sum of the absolute values? Also inadequate: it does not yield a unique best line. How about minimizing the maximum distance that an individual point falls from the line? This does not work either: a single outlying point dominates the fit.
[Figure: the three criteria illustrated against a regression line.]

Slide 6:
The best strategy is to minimize the sum of the squares of the residuals, Sr, between the measured y and the y calculated with the linear model. This criterion yields a unique line for a given set of data. We need to compute a0 and a1 such that Sr is minimized.
[Figure: error ei between a data point and the fitted line.]
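Written out (reconstructed here, since the slide's equation was an image), the quantity being minimized is:

```latex
S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2
```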

Slide 7: Least-Squares Fit of a Straight Line
Setting the partial derivatives of Sr with respect to a0 and a1 to zero yields the normal equations, which can be solved simultaneously.

Slide 8: Least-Squares Fit of a Straight Line
Solving the normal equations simultaneously gives a1, and then a0 in terms of the mean values of x and y.
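The normal equations and their simultaneous solution, in the standard form (reconstructed, since the slide showed them as images):

```latex
\begin{aligned}
n\,a_0 + \Bigl(\sum x_i\Bigr) a_1 &= \sum y_i \\
\Bigl(\sum x_i\Bigr) a_0 + \Bigl(\sum x_i^2\Bigr) a_1 &= \sum x_i y_i
\end{aligned}
\qquad\Longrightarrow\qquad
a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \bigl(\sum x_i\bigr)^2},
\quad a_0 = \bar{y} - a_1 \bar{x}
```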

Slide 9: "Goodness" of Our Fit
Sr = sum of the squares of residuals around the regression line.
St = total sum of the squares around the mean.
(St - Sr) quantifies the improvement, or error reduction, due to describing the data in terms of a straight line rather than as an average value.
r: correlation coefficient; r²: coefficient of determination.
For a perfect fit, Sr = 0 and r = r² = 1, signifying that the line explains 100 percent of the variability of the data. For r = r² = 0, Sr = St and the fit represents no improvement.
[Figure: the spread of the data (a) around the mean and (b) around the best-fit line; notice the improvement in the error due to linear regression.]
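In symbols (reconstructed from the definitions on the slide):

```latex
S_t = \sum_{i=1}^{n} (y_i - \bar{y})^2, \qquad
S_r = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2, \qquad
r^2 = \frac{S_t - S_r}{S_t}
```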

Slide 10: Example
Use linear regression to fit the following data to a straight line.

  x      y      x²     xy
  1     5.1     1      5.1
  2     5.9     4     11.8
  3     6.3     9     18.9
∑ 6    17.3    14     35.8

Solution: Substituting the sums into the normal equations and solving gives a0 ≈ 4.57 and a1 = 0.6, so y = 4.57 + 0.6 x.

Slide 11: Example
Use linear regression to fit the following data to a straight line.

  x      y      x²     xy
  0      5      0      0
  1      9      1      9
  2     13      4     26
  3     17      9     51
  4     21     16     84
∑ 10    65     30    170

Solution: Solving this system of two equations gives a0 = 5 and a1 = 4, so y = 5 + 4 x.
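Both examples can be checked numerically. A small sketch of the normal-equation solution (the helper name fit_line is mine, not from the slides):

```python
def fit_line(x, y):
    """Least-squares straight line y = a0 + a1*x via the normal equations."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n    # a0 = y_mean - a1 * x_mean
    return a0, a1

a0, a1 = fit_line([1, 2, 3], [5.1, 5.9, 6.3])            # slide 10 data
print(a0, a1)    # a0 ≈ 4.567, a1 ≈ 0.6

b0, b1 = fit_line([0, 1, 2, 3, 4], [5, 9, 13, 17, 21])   # slide 11 data
print(b0, b1)    # b0 ≈ 5.0, b1 ≈ 4.0
```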

Slide 12: Linearization of Nonlinear Relationships
(a) Data that are ill-suited for linear least-squares regression. (b) Indication that a parabola may be more suitable.
Exponential equation: y = α1 e^(β1 x).

Slide 13: Linearization of Nonlinear Relationships (continued)
Saturation-growth-rate equation: y = α3 x / (β3 + x). Power equation: y = α2 x^(β2).
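The three model equations and their linearized forms, in the standard notation (reconstructed, since the slide equations were images):

```latex
\begin{aligned}
\text{Exponential:} \quad & y = \alpha_1 e^{\beta_1 x}
  & \Rightarrow \quad & \ln y = \ln \alpha_1 + \beta_1 x \\
\text{Power:} \quad & y = \alpha_2 x^{\beta_2}
  & \Rightarrow \quad & \log y = \log \alpha_2 + \beta_2 \log x \\
\text{Saturation growth rate:} \quad & y = \alpha_3 \frac{x}{\beta_3 + x}
  & \Rightarrow \quad & \frac{1}{y} = \frac{1}{\alpha_3} + \frac{\beta_3}{\alpha_3}\,\frac{1}{x}
\end{aligned}
```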

Slide 14:
Data to be fit to the power equation y = α2 x^(β2):

  x      y      log x    log y
  1     0.5     0       -0.301
  2     1.7     0.301    0.226
  3     3.4     0.477    0.531
  4     5.7     0.602    0.756
  5     8.4     0.699    0.924

After the (log y) vs. (log x) plot is obtained, find β2 and α2 using linear regression on the transformed data. The result: log y = 1.75 log x - 0.300, so β2 = 1.75 and log α2 = -0.3, giving α2 = 0.5. See Exercises.xls.
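The log-log fit can be reproduced in a few lines. A sketch using base-10 logarithms, as on the slide:

```python
import math

x = [1, 2, 3, 4, 5]
y = [0.5, 1.7, 3.4, 5.7, 8.4]

# Transform to log10 space and fit a straight line there
lx = [math.log10(v) for v in x]
ly = [math.log10(v) for v in y]

n = len(lx)
sx, sy = sum(lx), sum(ly)
sxx = sum(v * v for v in lx)
sxy = sum(a * b for a, b in zip(lx, ly))

beta2 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope = beta_2
log_alpha2 = sy / n - beta2 * sx / n               # intercept = log10(alpha_2)
alpha2 = 10 ** log_alpha2

print(f"log y = {beta2:.2f} log x {log_alpha2:+.3f}")  # slope ≈ 1.75, intercept ≈ -0.300
print(f"y ≈ {alpha2:.2f} * x^{beta2:.2f}")             # alpha_2 ≈ 0.5
```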

Slide 15: Polynomial Regression
Some engineering data are poorly represented by a straight line; a curve (polynomial) may be better suited to fit the data. The least-squares method can be extended to fit the data to higher-order polynomials. As an example, consider a second-order polynomial fit to the data points: y = a0 + a1 x + a2 x² + e.

Slide 16: Polynomial Regression
To fit the data to an mth-order polynomial, we need to solve a system of (m+1) linear equations in the (m+1) unknowns a0, a1, ..., am, which can be written in matrix form.
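For a second-order polynomial (m = 2), the system has the familiar form (reconstructed from the standard least-squares derivation):

```latex
\begin{bmatrix}
n          & \sum x_i   & \sum x_i^2 \\
\sum x_i   & \sum x_i^2 & \sum x_i^3 \\
\sum x_i^2 & \sum x_i^3 & \sum x_i^4
\end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}
=
\begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix}
```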

Slide 17: Example: Polynomial Regression
Fit a second-order polynomial to the following data.

  xi     yi      xi²    xi³    xi⁴     xi·yi    xi²·yi
  0      2.1     0      0       0       0         0
  1      7.7     1      1       1       7.7       7.7
  2     13.6     4      8      16      27.2      54.4
  3     27.2     9     27      81      81.6     244.8
  4     40.9    16     64     256     163.6     654.4
  5     61.1    25    125     625     305.5    1527.5
∑ 15   152.6    55    225     979     585.6    2488.8

Solution: Solving this system of three equations gives a0 = 2.48, a1 = 2.36, a2 = 1.86, so y = 2.48 + 2.36 x + 1.86 x².
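The column sums in the table are exactly the entries of the 3x3 normal-equation system for a quadratic. A sketch that builds and solves it (assumes numpy is installed):

```python
import numpy as np

x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

# Normal-equation matrix for a 2nd-order polynomial fit
A = np.array([[len(x),       x.sum(),      (x**2).sum()],
              [x.sum(),      (x**2).sum(), (x**3).sum()],
              [(x**2).sum(), (x**3).sum(), (x**4).sum()]])
b = np.array([y.sum(), (x * y).sum(), (x**2 * y).sum()])

a0, a1, a2 = np.linalg.solve(A, b)
print(f"y = {a0:.2f} + {a1:.2f} x + {a2:.2f} x^2")  # y = 2.48 + 2.36 x + 1.86 x^2
```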

