
Curve Fitting, Session 10. Course: S0262-Analisis Numerik. Year: 2010.


1

2 Curve Fitting, Session 10. Course: S0262-Analisis Numerik. Year: 2010

3 Material Outline
Curve Fitting
– Least-squares fit
– Quantification of error
– Coefficient of determination
– Coefficient of correlation

4 CURVE FITTING
In curve fitting, n pairs of numbers (x1, y1), (x2, y2), …, (xn, yn) are given, typically obtained from observations or field measurements of some quantity. The objective is to find a function f that relates the pairs, f(xj) ≈ yj; in other words, when the function is plotted, the resulting curve best fits the data points.

5 CURVE FITTING
One way to find a fitting function for n pairs of observed values is to minimize the discrepancy between the data points and the curve. Minimizing this discrepancy is known as least-squares regression. Least-squares regression includes linear regression and polynomial regression.

6 LINEAR REGRESSION
In linear regression, n pairs of observations or field measurements are fitted to a straight line. The line can be written as
y = a0 + a1 x + E,
where a0 is the intercept, a1 is the slope (gradient), and E is the error (discrepancy) between a data point and the chosen linear model. Rearranging gives
E = y − a0 − a1 x,
which shows that the error E is the difference between the true value y and the approximate value a0 + a1 x.

7 LINEAR REGRESSION
E = y − a0 − a1 x
There are several criteria for defining the best fit:
1. Minimize the sum of the residuals (errors), Σ E
2. Minimize the sum of the absolute residuals, Σ |E|
3. Minimize the sum of the squared residuals, Σ E²
Of these three criteria, the best is to minimize the sum of the squared residuals, Σ E². One advantage of this criterion is that the resulting line is unique for each set of n data pairs. This approach is known as the least-squares fit.
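To make the three criteria concrete, here is a minimal Python sketch; the data, the candidate line, and the variable names are illustrative assumptions, not taken from the slides.

```python
# Hypothetical data and a hypothetical candidate line (a0, a1); not from the slides.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.2, 1.9, 3.2, 3.8]
a0, a1 = 0.2, 0.95

residuals = [y - a0 - a1 * x for x, y in zip(xs, ys)]

sum_e   = sum(residuals)                  # criterion 1: positive and negative errors cancel
sum_abs = sum(abs(e) for e in residuals)  # criterion 2: may not give a unique line
sum_sq  = sum(e * e for e in residuals)   # criterion 3: least squares, unique best line

print(sum_e, sum_abs, sum_sq)
```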

8 Least-Squares Fit
The coefficients a0 and a1 in the previous equation are determined by minimizing the sum of the squared errors (residuals):
Sr = Σ (yi − a0 − a1 xi)²
To minimize Sr (calculus), set its partial derivatives with respect to a0 and a1 to zero:
∂Sr/∂a0 = −2 Σ (yi − a0 − a1 xi) = 0
∂Sr/∂a1 = −2 Σ xi (yi − a0 − a1 xi) = 0

9 Least-Squares Fit
Solving the two equations above for a0 and a1 gives:
a1 = (n Σ xi yi − Σ xi Σ yi) / (n Σ xi² − (Σ xi)²)
a0 = ȳ − a1 x̄
where x̄ and ȳ are the mean values of the xi and yi.
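A minimal Python sketch of these formulas (the function name linear_fit is an illustrative choice, not something defined in the slides):

```python
def linear_fit(xs, ys):
    """Return (a0, a1) of the least-squares line y = a0 + a1*x."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a0 = sy / n - a1 * sx / n                       # intercept = y_mean - a1 * x_mean
    return a0, a1
```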

10 QUANTIFICATION OF ERROR OF LINEAR REGRESSION
The standard deviation of the data about the regression line (the standard error of the estimate) is
sy/x = √( Sr / (n − 2) )
where Sr = Σ (yi − a0 − a1 xi)² is the sum of the squared residuals.
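A sketch of this quantity in Python, assuming a0 and a1 have already been computed (for example with a helper like the one sketched above, which is itself an assumption):

```python
import math

def standard_error(xs, ys, a0, a1):
    """Standard error of the estimate: s_y/x = sqrt(Sr / (n - 2))."""
    sr = sum((y - a0 - a1 * x) ** 2 for x, y in zip(xs, ys))  # sum of squared residuals
    return math.sqrt(sr / (len(xs) - 2))
```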

11 QUANTIFICATION OF ERROR OF LINEAR REGRESSION
In addition to the sum of the squared residuals (Sr), there is the sum of the squares around the mean value, St = Σ (yi − ȳ)². The difference between St and Sr quantifies the improvement, or error reduction, obtained by using the linear regression instead of the mean value. Two coefficients that quantify this improvement are the coefficient of determination (r²) and the correlation coefficient (r); they measure how well the linear regression fits the data.

12 QUANTIFICATION OF ERROR OF LINEAR REGRESSION
Coefficient of determination: r² = (St − Sr) / St
Correlation coefficient: r = √r², with 0 ≤ r ≤ 1.
r = 1 indicates a perfect fit (Sr = 0); r = 0 indicates no improvement over the mean value (St = Sr).
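Both coefficients can be computed directly from St and Sr; a minimal sketch (the function name is illustrative, not from the slides):

```python
def determination_and_correlation(xs, ys, a0, a1):
    """Return (r2, r) for the line y = a0 + a1*x, with r2 = (St - Sr) / St."""
    y_mean = sum(ys) / len(ys)
    st = sum((y - y_mean) ** 2 for y in ys)                   # spread about the mean
    sr = sum((y - a0 - a1 * x) ** 2 for x, y in zip(xs, ys))  # spread about the line
    r2 = (st - sr) / st
    return r2, r2 ** 0.5
```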

13 Least-Squares Fit
Example: Find the linear regression line that fits the following data and estimate the standard error of the estimate.

i   xi   yi
1   1    0.5
2   2    2.5
3   3    2.0
4   4    4.0
5   5    3.5
6   6    6.0
7   7    5.5

14 Answer: complete the following table.

i   xi   yi    xi·yi   xi²   yi − a0 − a1·xi
1   1    0.5   …       …     …
2   2    2.5   …       …     …
3   3    2.0   …       …     …
4   4    4.0   …       …     …
5   5    3.5   …       …     …
6   6    6.0   …       …     …
7   7    5.5   …       …     …
Σ = …

15 Answer (continued): after completing the previous table, substitute the sums into the formulas for a1, a0, and sy/x.
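The completed numbers were shown as an image in the original slides. A self-contained sketch that reproduces the computation for the tabulated data; the values noted in the comments follow from the formulas above, they are not transcribed from the slide:

```python
# Data from the example table on slide 13.
xs = [1, 2, 3, 4, 5, 6, 7]
ys = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]

n = len(xs)
sx, sy = sum(xs), sum(ys)
sxy = sum(x * y for x, y in zip(xs, ys))
sxx = sum(x * x for x in xs)

a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope, about 0.8393
a0 = sy / n - a1 * sx / n                       # intercept, about 0.0714

sr = sum((y - a0 - a1 * x) ** 2 for x, y in zip(xs, ys))  # Sr, about 2.9911
syx = (sr / (n - 2)) ** 0.5                               # s_y/x, about 0.7735

print(f"y = {a0:.4f} + {a1:.4f} x,  s_y/x = {syx:.4f}")
```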

16 POLYNOMIAL REGRESSION
For most cases, the linear regression just discussed is adequate to fit the data distribution. For some cases, however, it is not, and a polynomial function can be used as an alternative.
A polynomial of degree m can be written as
y = a0 + a1 x + a2 x² + … + am x^m + E
As before, the sum of the squared errors can be written as
Sr = Σ (yi − a0 − a1 xi − a2 xi² − … − am xi^m)²

17 POLYNOMIAL REGRESSION
The polynomial given above contains m + 1 unknown coefficients: a0, a1, …, am. These are determined by minimizing the sum of the squared errors Sr, setting
∂Sr/∂ak = 0 for k = 0, 1, …, m.
This yields m + 1 linear equations (the normal equations), from which the parameters a0, a1, …, am can be determined.
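A minimal sketch of polynomial least squares that assembles and solves these normal equations with NumPy; NumPy is an assumed dependency and the slides do not prescribe any particular implementation:

```python
import numpy as np

def poly_fit(xs, ys, m):
    """Least-squares polynomial of degree m; returns coefficients [a0, a1, ..., am]."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    # Normal equations A @ a = b, with A[j, k] = sum(x**(j+k)) and b[j] = sum(x**j * y).
    A = np.array([[np.sum(xs ** (j + k)) for k in range(m + 1)] for j in range(m + 1)])
    b = np.array([np.sum(xs ** j * ys) for j in range(m + 1)])
    return np.linalg.solve(A, b)

# Example with hypothetical data: fit a quadratic (m = 2).
coeffs = poly_fit([0, 1, 2, 3, 4, 5], [2.1, 7.7, 13.6, 27.2, 40.9, 61.1], 2)
print(coeffs)  # [a0, a1, a2]
```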

