1 Curve Fitting and Regression EEE 244

2 Descriptive Statistics in MATLAB
MATLAB has several built-in commands to compute and display descriptive statistics. Assuming some column vector s:
–mean(s), median(s), mode(s): calculate the mean, median, and mode of s. mode is part of the Statistics Toolbox.
–min(s), max(s): calculate the minimum and maximum values in s.
–var(s), std(s): calculate the variance (the square of the standard deviation) and the standard deviation of s.
Note: if a matrix is given, the statistics are returned for each column.

3 Measurements of Drain Current
Drain current in mA, measured at intervals from t = 10 am to 10 pm:
6.5 6.3 6.2 6.5 6.2 6.7 6.4 6.8
Find the mean, median, mode, min, max, variance, and standard deviation using the appropriate MATLAB functions.
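As a cross-check outside MATLAB, the same statistics can be computed with NumPy. This is an illustrative sketch, not part of the slides; note that ddof=1 matches MATLAB's sample-variance convention, and MATLAB's mode (which returns the smallest of tied values) is omitted here because Python's conventions for ties differ.

```python
import numpy as np

# Drain-current measurements from the slide, in mA
s = np.array([6.5, 6.3, 6.2, 6.5, 6.2, 6.7, 6.4, 6.8])

s_mean = s.mean()                 # like mean(s)
s_median = np.median(s)           # like median(s)
s_min, s_max = s.min(), s.max()   # like min(s), max(s)
s_var = s.var(ddof=1)             # like var(s): sample variance, divisor n-1
s_std = s.std(ddof=1)             # like std(s)
```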

4 Histograms in MATLAB
[n, x] = hist(s, x): determines the number of elements of s in each bin, where x is a vector containing the center values of the bins. Open the MATLAB help and run the example with 1000 randomly generated values for s.
[n, x] = hist(s, m): determines the number of elements of s in each of m bins; x will contain the bin centers. The default is m = 10. Repeat the previous example with m = 5.
hist(s) draws the histogram plot.
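The same binning can be sketched in NumPy (an illustrative analogue, not the slide's own code). One difference worth noting: np.histogram returns bin edges, while MATLAB's hist returns bin centers, so the edges are converted below.

```python
import numpy as np

# 1000 randomly generated values, as in the MATLAB help example
s = np.random.default_rng(0).standard_normal(1000)

# like [n, x] = hist(s, 5): counts per bin, using 5 bins
n, edges = np.histogram(s, bins=5)

# MATLAB's hist reports bin centers; convert the edges accordingly
centers = (edges[:-1] + edges[1:]) / 2
```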

5 REGRESSION EEE 244

6 Linear Regression
Fitting a straight line to a set of paired observations (x1, y1), (x2, y2), …, (xn, yn):
y = a0 + a1x + e
a1: slope
a0: intercept
e: error, or residual, between the model and the observations

7 Linear Least-Squares Regression
Linear least-squares regression is a method to determine the "best" coefficients in a linear model for a given data set. "Best" for least-squares regression means minimizing the sum of the squares of the estimate residuals. For a straight-line model, this gives:
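The minimized quantity (the slide's equation image is missing; this is the standard least-squares criterion for the model y = a0 + a1x + e defined above):

```latex
S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2
```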

8 Least-Squares Fit of a Straight Line Using the model: the slope and intercept producing the best fit can be found using:
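The closed-form solution the slide refers to (reconstructed here in its standard form; the equation image is missing from the transcript):

```latex
a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left( \sum x_i \right)^2},
\qquad
a_0 = \bar{y} - a_1 \bar{x}
```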

9 Example
Force F (N) measured at velocity v (m/s), with the sums needed for a least-squares fit:

 i    x_i = v    y_i = F    (x_i)^2    x_i*y_i
 1       10         25         100        250
 2       20         70         400       1400
 3       30        380         900      11400
 4       40        550        1600      22000
 5       50        610        2500      30500
 6       60       1220        3600      73200
 7       70        830        4900      58100
 8       80       1450        6400     116000
 Σ      360       5135       20400     312850
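A sketch applying the standard least-squares formulas to the tabulated data (illustrative; the slide itself only shows the table):

```python
import numpy as np

x = np.array([10, 20, 30, 40, 50, 60, 70, 80], dtype=float)          # v (m/s)
y = np.array([25, 70, 380, 550, 610, 1220, 830, 1450], dtype=float)  # F (N)

n = len(x)
# slope: a1 = (n*sum(x*y) - sum(x)*sum(y)) / (n*sum(x^2) - sum(x)^2)
a1 = (n * (x * y).sum() - x.sum() * y.sum()) / (n * (x ** 2).sum() - x.sum() ** 2)
# intercept: a0 = mean(y) - a1*mean(x)
a0 = y.mean() - a1 * x.mean()
# a1 ≈ 19.47024, a0 ≈ -234.2857, i.e. F ≈ 19.47*v - 234.29
```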

10 Standard Error of the Estimate Regression data showing (a) the spread of data around the mean of the dependent data and (b) the spread of the data around the best fit line: The reduction in spread represents the improvement due to linear regression.
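The quantities behind this comparison, following the usual definitions (the slide's figure and equations are not reproduced in the transcript): S_t measures the spread around the mean, S_r the spread around the fit line, and the coefficient of determination r² quantifies the improvement.

```latex
s_y = \sqrt{\frac{S_t}{n-1}}, \quad S_t = \sum_{i=1}^{n} (y_i - \bar{y})^2;
\qquad
s_{y/x} = \sqrt{\frac{S_r}{n-2}};
\qquad
r^2 = \frac{S_t - S_r}{S_t}
```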

11 MATLAB Functions
MATLAB has a built-in function polyfit that fits a least-squares nth-order polynomial to data:
–p = polyfit(x, y, n)
x: independent data
y: dependent data
n: order of polynomial to fit
p: coefficients of the polynomial f(x) = p1*x^n + p2*x^(n-1) + … + pn*x + pn+1
MATLAB's polyval command can be used to compute a value using the coefficients:
–y = polyval(p, x)
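NumPy exposes functions with the same names and coefficient ordering, so the MATLAB workflow can be sketched directly (illustrative data, not from the slides):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0  # data lying exactly on the line y = 2x + 1

p = np.polyfit(x, y, 1)      # like p = polyfit(x, y, 1): p[0] is the slope
y_at_2 = np.polyval(p, 2.0)  # like y = polyval(p, 2): evaluate the fit at x = 2
```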

12 Polyfit function
polyfit can be used to perform REGRESSION if the number of data points is much larger than the number of coefficients:
–p = polyfit(x, y, n)
x: independent data (Vce, 10 data points)
y: dependent data (Ic)
n: order of polynomial to fit; n = 1 (linear fit)
p: coefficients of the polynomial (two coefficients) f(x) = p1*x^n + p2*x^(n-1) + … + pn*x + pn+1

13 Polynomial Regression
The least-squares procedure can be readily extended to fit data to a higher-order polynomial. Again, the idea is to minimize the sum of the squares of the estimate residuals.
The figure shows the same data fit with:
a) a first-order polynomial
b) a second-order polynomial

14 Process and Measures of Fit For a second order polynomial, the best fit would mean minimizing: In general, this would mean minimizing:
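The minimized quantities referred to here (the equation images are missing from the transcript; these are the standard forms, consistent with the S_r criterion used for the straight line):

```latex
\text{second order:} \quad
S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right)^2
\qquad
\text{general (order } m\text{):} \quad
S_r = \sum_{i=1}^{n} \left( y_i - \sum_{j=0}^{m} a_j x_i^{j} \right)^2
```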

15 INTERPOLATION EEE 244

16 Polynomial Interpolation
You will frequently have occasions to estimate intermediate values between precise data points. The function you use to interpolate must pass through the actual data points, which makes interpolation more restrictive than fitting. The most common method for this purpose is polynomial interpolation, where an (n-1)th-order polynomial is found that passes through n data points:
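The polynomial referred to (the equation image is missing; this is the usual form, with the indexing convention matching MATLAB's 1-based coefficients):

```latex
f(x) = a_1 + a_2 x + a_3 x^2 + \cdots + a_n x^{n-1}
```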

17 Determining Coefficients using Polyfit
MATLAB's built-in polyfit and polyval commands can also be used for interpolation; all that is required is making sure the order of the fit for n data points is n-1.
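A sketch of this idea in NumPy (illustrative data): with 3 points and order 3 - 1 = 2, the fitted polynomial reproduces every data point exactly rather than approximating them.

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0])
y = np.array([3.0, 7.0, 21.0])  # samples of y = x**2 + x + 1

# order = (number of points) - 1 makes polyfit an exact interpolator
p = np.polyfit(x, y, len(x) - 1)

# polyval at the original x values returns the original y values
y_check = np.polyval(p, x)
```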

18 Newton Interpolating Polynomials Another way to express a polynomial interpolation is to use Newton’s interpolating polynomial. The differences between a simple polynomial and Newton’s interpolating polynomial for first and second order interpolations are:

19 Newton Interpolating Polynomials (contd.)
The first-order Newton interpolating polynomial may be obtained from linear interpolation and similar triangles, as shown. The resulting formula based on known points x1 and x2 and the values of the dependent function at those points is:
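The formula in question (the equation image is missing from the transcript; this is the standard first-order Newton form):

```latex
f_1(x) = f(x_1) + \frac{f(x_2) - f(x_1)}{x_2 - x_1} \, (x - x_1)
```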

20 Newton Interpolating Polynomials (contd.)
The second-order Newton interpolating polynomial introduces some curvature to the line connecting the points, but still goes through the first two points. The resulting formula based on known points x1, x2, and x3 and the values of the dependent function at those points is:
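The second-order formula (reconstructed in its standard divided-difference form; the equation image is missing):

```latex
f_2(x) = b_1 + b_2 (x - x_1) + b_3 (x - x_1)(x - x_2),
\quad \text{where} \quad
b_1 = f(x_1), \quad
b_2 = \frac{f(x_2) - f(x_1)}{x_2 - x_1}, \quad
b_3 = \frac{\dfrac{f(x_3) - f(x_2)}{x_3 - x_2} - \dfrac{f(x_2) - f(x_1)}{x_2 - x_1}}{x_3 - x_1}
```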

21 Lagrange Interpolating Polynomials
Another method that uses shifted values to express an interpolating polynomial is the Lagrange interpolating polynomial. The differences between a simple polynomial and the Lagrange interpolating polynomials for first- and second-order polynomials are: where the Li are weighting coefficients that are functions of x.

22 Lagrange Interpolating Polynomials (contd.)
The first-order Lagrange interpolating polynomial may be obtained from a weighted combination of two linear interpolations, as shown. The resulting formula based on known points x1 and x2 and the values of the dependent function at those points is:
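The first-order Lagrange formula (the equation image is missing; this is its standard form, a weighted combination of the two function values):

```latex
f_1(x) = \frac{x - x_2}{x_1 - x_2} \, f(x_1) + \frac{x - x_1}{x_2 - x_1} \, f(x_2)
```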

23 As with Newton's method, the Lagrange version has an estimated error of:
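The error estimate referred to (reconstructed from the standard finite-divided-difference form; the equation image is missing from the transcript):

```latex
R_n \approx f[x_{n+1}, x_n, \ldots, x_1] \prod_{i=1}^{n} (x - x_i)
```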

24 Figure 18.10

25 Lagrange Interpolating Polynomials (contd.)
In general, the Lagrange polynomial interpolation for n points is: where Li is given by:
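In the standard notation, the general form is f(x) = Σ Li(x)·f(xi) with Li(x) = Π over j ≠ i of (x - xj)/(xi - xj). A direct sketch of this formula (illustrative data, not from the slides):

```python
def lagrange_interp(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    n = len(xs)
    for i in range(n):
        # L_i(x) = product over j != i of (x - x_j) / (x_i - x_j)
        L = 1.0
        for j in range(n):
            if j != i:
                L *= (x - xs[j]) / (xs[i] - xs[j])
        total += L * ys[i]
    return total

xs = [1.0, 2.0, 4.0]
ys = [3.0, 7.0, 21.0]  # samples of y = x**2 + x + 1
```

The interpolant passes through every data point exactly and, between them, reproduces the unique quadratic through the three samples.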

