Chapter 12 Curve Fitting: Fitting a Straight Line (Gab-Byung Chae)

1 Chapter 12 Curve Fitting: Fitting a Straight Line (Gab-Byung Chae)

2

3 Curve Fitting
Finding a function that passes through (or near) a set of discrete data points.
- May require estimates at points between the discrete values.
- May require a simplified version of a complicated function.

4 12.1 Two General Approaches for Curve Fitting
- Least-squares regression: where the data exhibits "scatter", fit a single curve that represents the general trend without necessarily passing through every point.
- Interpolation: where the data is known to be very precise, fit a curve or a series of curves that pass directly through each of the points.

5 12.2 Statistics Review
For n data points y1, y2, ..., yn:
- mean: ȳ = (Σ yi) / n
- standard deviation: s_y = sqrt(St / (n - 1)), where St = Σ (yi - ȳ)^2
- variance: s_y^2 = St / (n - 1)
- coefficient of variation: c.v. = (s_y / ȳ) × 100%
- degrees of freedom: n - 1
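As a quick illustration (not part of the original slides), a minimal MATLAB sketch of these quantities, using the y-values that appear later in Example 12.2; the variable names are only illustrative:

% Summary statistics for a sample y1, ..., yn
y    = [25 70 380 550 610 1220 830 1450];
n    = length(y);
ybar = sum(y) / n;              % mean
St   = sum((y - ybar).^2);      % sum of squares about the mean
sy   = sqrt(St / (n - 1));      % standard deviation
vy   = St / (n - 1);            % variance
cv   = sy / ybar * 100;         % coefficient of variation, in percent
fprintf('mean = %.2f, s_y = %.2f, c.v. = %.1f%%\n', ybar, sy, cv)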

6 The Normal Distribution
As n increases, the histogram of the measurements often approaches the normal distribution. For normally distributed data, about 68% of the total measurements fall within the range ȳ ± s_y.

7 12.3 Least Squares Regression
Minimize some measure of the difference between the approximating function and the given data points. In the least-squares method, the error is measured as the sum of the squares of the residuals: E = Σ ei^2 = Σ (yi - f(xi))^2.

8 Linear Least Squares Regression
f(x) is in linear form: f(x) = ax + b, and the residual at each data point is e = y - ax - b. Could we simply minimize the sum of the residual errors for all the available data, Σ ei? No: any straight line passing through the midpoint of the line connecting the data points yields that minimum (positive and negative errors cancel), so this criterion does not give a unique best line. Least squares therefore minimizes the sum of the squared residuals instead.

9 E = Σ (yi - a*xi - b)^2 is minimized when the partial derivatives of E with respect to each of the variables a and b are 0. Setting ∂E/∂a = 0 and ∂E/∂b = 0 and rearranging gives the normal equations, Eqs. (12.15) and (12.16):
a*Σ xi^2 + b*Σ xi = Σ xi*yi
a*Σ xi + n*b = Σ yi

10 Example 12.2 Find the straight line that best fits the points (10, 25), (20, 70), (30, 380), (40, 550), (50, 610), (60, 1220), (70, 830), (80, 1450) in the least-squares sense.
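For concreteness, here is a minimal MATLAB sketch (not from the original slides) that solves the normal equations of Eqs. (12.15) and (12.16) for this data; variable names are illustrative:

% Least-squares straight line f(x) = a*x + b via the normal equations
x = [10 20 30 40 50 60 70 80];
y = [25 70 380 550 610 1220 830 1450];
n = length(x);
Sx  = sum(x);      Sy  = sum(y);
Sxx = sum(x.^2);   Sxy = sum(x.*y);
% a*Sxx + b*Sx = Sxy  and  a*Sx + b*n = Sy
a = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2);   % slope
b = (Sy - a*Sx) / n;                    % intercept
fprintf('f(x) = %.4f*x + %.4f\n', a, b)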

11 Quantification of Error

12 If the spread of the points around the line is of similar magnitude along the entire range of the data, and the distribution of these points about the line is normal, then the least-squares estimates of a and b are the best available. This is called the maximum likelihood principle in statistics.

13 Standard Error of Estimate
The standard error of the estimate, s_y/x = sqrt(Sr / (n - 2)) with Sr = Σ (yi - a*xi - b)^2, quantifies the spread of the data around the regression line, just as s_y = sqrt(St / (n - 1)) quantifies the spread around the mean. The sum of squares is divided by n - 2 because two quantities, a and b, were estimated from the data, so two degrees of freedom are lost.
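A short MATLAB sketch (an addition, not from the slides) of the two spread measures for the Example 12.2 data, using polyfit for the straight-line fit:

% Spread around the mean (s_y) vs. spread around the regression line (s_y/x)
x = [10 20 30 40 50 60 70 80];
y = [25 70 380 550 610 1220 830 1450];
n = length(y);
z   = polyfit(x, y, 1);              % straight-line fit [slope, intercept]
St  = sum((y - mean(y)).^2);         % sum of squares about the mean
Sr  = sum((y - polyval(z, x)).^2);   % sum of squared residuals about the line
sy  = sqrt(St / (n - 1));            % standard deviation
syx = sqrt(Sr / (n - 2));            % standard error of the estimate
fprintf('s_y = %.1f, s_y/x = %.1f\n', sy, syx)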

14 Coefficient of Determination
The coefficient of determination is r^2 = (St - Sr) / St, where r is the correlation coefficient.
- r^2 = 1: the line explains 100% of the variability of the data.
- r^2 = 0: the fit represents no improvement over simply using the mean.

15 Example 12.3 These results indicate that 88.05% of the original uncertainty has been explained by the linear model.
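The r^2 value can be recomputed directly from its definition; the short MATLAB sketch below (an addition, using the Example 12.2 data) should give a value close to the 0.8805 quoted above:

% Coefficient of determination r^2 = (St - Sr) / St for the straight-line fit
x  = [10 20 30 40 50 60 70 80];
y  = [25 70 380 550 610 1220 830 1450];
z  = polyfit(x, y, 1);
St = sum((y - mean(y)).^2);
Sr = sum((y - polyval(z, x)).^2);
r2 = (St - Sr) / St;
fprintf('r^2 = %.4f\n', r2)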

16 Example 12.3

17 12.3.4 Linearization of Nonlinear Relationships
- The exponential equation: y = a1*e^(b1*x), linearized as ln y = ln a1 + b1*x.
- The power equation: y = a2*x^b2, linearized as log y = log a2 + b2*log x.
- The saturation-growth-rate equation: y = a3*x/(b3 + x), linearized as 1/y = 1/a3 + (b3/a3)*(1/x).

18

19 Example 12.4 Fit the points (10, 25), (20, 70), (30, 380), (40, 550), (50, 610), (60, 1220), (70, 830), (80, 1450) using the power equation y = a2*x^b2. Taking logarithms gives log y = log a2 + b2*log x, so f(x) is again in linear form, f(x) = ax + b, with x and y replaced by log x and log y, slope a = b2, and intercept b = log a2.
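A minimal MATLAB sketch of this linearized power fit (not from the original slides): polyfit applied to the log-transformed data returns b2 as the slope and log10(a2) as the intercept.

% Power-law fit y = a2*x^b2 via linear regression on the log-transformed data
x = [10 20 30 40 50 60 70 80];
y = [25 70 380 550 610 1220 830 1450];
p  = polyfit(log10(x), log10(y), 1);   % p(1) = b2 (slope), p(2) = log10(a2) (intercept)
b2 = p(1);
a2 = 10^p(2);
fprintf('y = %.4f * x^%.4f\n', a2, b2)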

20

21 Example 12.4

22

23 12.4 Implementation of Linear Regression

24 MATLAB’s methods
z = polyfit(x, y, n) : fits a polynomial of degree n to the data x, y (n = 1 gives a straight line)
yy = polyval(z, xx) : evaluates the fitted polynomial z at the points xx
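For instance, a short usage sketch with the Example 12.2 data might look like this (the grid xx is only illustrative):

x  = [10 20 30 40 50 60 70 80];
y  = [25 70 380 550 610 1220 830 1450];
z  = polyfit(x, y, 1);          % z(1) = slope, z(2) = intercept of the fitted line
xx = 0:1:90;                    % fine grid for drawing the line
yy = polyval(z, xx);
plot(x, y, 'o', xx, yy, '-')    % data points and fitted straight line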

