Chapter 13. General Least-Squares and Nonlinear Regression Gab-Byung Chae.


1 Chapter 13. General Least-Squares and Nonlinear Regression Gab-Byung Chae

2

3 13.1 Polynomial Regression: Some engineering data, although exhibiting a marked pattern, is poorly represented by a straight line. As discussed in Chap. 12, one method to fit such data is to use transformations. Another alternative is to fit polynomials to the data using polynomial regression.

4 Least Squares Regression: Minimize some measure of the difference between the approximating function and the given data points. In the least-squares method, the error is measured as E = Σ_{i=1}^{n} [y_i − f(x_i)]^2. The minimum of E occurs when the partial derivatives of E with respect to each of the variables are 0.

5 Linear Least Squares Regression: f(x) is in a linear form, f(x) = a x + b. The error E = Σ_i [y_i − (a x_i + b)]^2 is minimized when the normal equations hold: a Σ x_i^2 + b Σ x_i = Σ x_i y_i and a Σ x_i + b n = Σ y_i.
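A minimal sketch of assembling and solving these two normal equations in MATLAB (the data is reused from Example 13.1 below, purely for illustration):

x = [0 1 2 3 4 5]';  y = [2.1 7.7 13.6 27.2 40.9 61.1]';
n = length(x);
N = [sum(x.^2) sum(x); sum(x) n];   % coefficient matrix of the normal equations
r = [sum(x.*y); sum(y)];            % right-hand side
ab = N\r                            % ab(1) = slope a, ab(2) = intercept b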

6 Quadratic Least Squares Approximation: f(x) is in a quadratic form, f(x) = a x^2 + b x + c. The error E = Σ_i [y_i − (a x_i^2 + b x_i + c)]^2 is minimized when the three normal equations hold: a Σ x_i^4 + b Σ x_i^3 + c Σ x_i^2 = Σ x_i^2 y_i, a Σ x_i^3 + b Σ x_i^2 + c Σ x_i = Σ x_i y_i, and a Σ x_i^2 + b Σ x_i + c n = Σ y_i.

7 Cubic Least Squares Approximation: f(x) is in a cubic form, f(x) = a x^3 + b x^2 + c x + d. The error E = Σ_i [y_i − (a x_i^3 + b x_i^2 + c x_i + d)]^2 is minimized when the partial derivatives with respect to a, b, c, and d are all zero, giving four normal equations. This case can be easily extended to an mth-order polynomial.

8 Determining the coefficients of an mth-order polynomial is equivalent to solving a system of m+1 simultaneous linear equations. The standard error is formulated as s_{y/x} = sqrt(S_r / (n − (m+1))), because (m+1) data-derived coefficients, a_0, a_1, …, a_m, were used to compute S_r.
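As a sketch (the choice m = 2 and the data are assumptions for illustration), the (m+1)×(m+1) normal-equation system can be built directly from these sums of powers:

x = [0 1 2 3 4 5]';  y = [2.1 7.7 13.6 27.2 40.9 61.1]';
m = 2;
N = zeros(m+1);  r = zeros(m+1,1);
for i = 0:m
    for j = 0:m
        N(i+1,j+1) = sum(x.^(i+j));   % entries are sums of powers of x
    end
    r(i+1) = sum(y .* x.^i);          % right-hand side: sums of y times powers of x
end
a = N\r                               % a(1) = a0, a(2) = a1, ..., a(m+1) = am

With m = 2 and this data, N reproduces the matrix [6 15 55; 15 55 225; 55 225 979] used in Example 13.1 below.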

9 Example 13.1: Fit a second-order polynomial to the data in the first two columns of Table 13.1 (x = 0, 1, 2, 3, 4, 5; y = 2.1, 7.7, 13.6, 27.2, 40.9, 61.1).

10 Sol> Setting up the normal equations for this data yields the 3×3 system N{a} = {r} solved below.

11 >> N = [6 15 55; 15 55 225; 55 225 979];
>> r = [152.6 585.6 2488.8]';   % transposed to a column vector so N\r is defined
>> a = N\r
a =
    2.4786
    2.3593
    1.8607

12 The standard error: s_{y/x} = sqrt(S_r / (n − (m+1))). The coefficient of determination: r^2 = (S_t − S_r) / S_t. The correlation coefficient: r. Here S_t = Σ (y_i − ȳ)^2 is the sum of the squares of the residuals between the data points (y_i) and the mean, and S_r = Σ [y_i − f(x_i)]^2 is the sum of the squares of the residuals between the data points (y_i) and the regression curve.

13 Figure 13.2: Fit of a second-order polynomial.

14 13.2 Multiple Linear Regression: An extension of linear regression in which y is a linear function of two or more independent variables, e.g. y = a_0 + a_1 x_1 + a_2 x_2. For this two-dimensional case, the regression "line" becomes a plane (Fig. 13.3). The sum of the squares of the residuals is S_r = Σ_i (y_i − a_0 − a_1 x_{1,i} − a_2 x_{2,i})^2.

15 Figure 13.3: Graphical depiction of multiple linear regression where y is a linear function of x_1 and x_2.

16 Minimizing S_r with respect to a_0, a_1, and a_2 yields the normal equations [n Σx_1 Σx_2; Σx_1 Σx_1^2 Σx_1x_2; Σx_2 Σx_1x_2 Σx_2^2] {a_0; a_1; a_2} = {Σy; Σx_1 y; Σx_2 y}.

17 Example 13.2 Multiple Linear Regression: Use multiple linear regression to fit this data.

18 Solving the normal equations gives the regression coefficients and the fitted plane; a sketch of the computation follows.
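A minimal sketch of this computation with synthetic data generated from a known plane (the values are illustrative, not the textbook's table, so the fit can be checked against the generating coefficients):

x1 = [0 2 2.5 1 4 7]';
x2 = [0 1 2 3 6 2]';
y  = 5 + 4*x1 - 3*x2;           % synthetic data from the plane y = 5 + 4*x1 - 3*x2
Z  = [ones(size(x1)) x1 x2];    % basis functions: 1, x1, x2
a  = (Z'*Z)\(Z'*y)              % expect a = [5; 4; -3]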

19 Extension to m dimensions: y = a_0 + a_1 x_1 + a_2 x_2 + … + a_m x_m. Power equations of the form y = a_0 x_1^{a_1} x_2^{a_2} ⋯ x_m^{a_m} can be linearized by taking logarithms. Standard error: s_{y/x} = sqrt(S_r / (n − (m+1))).
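A short sketch of the logarithmic linearization for a two-variable power equation (all values are synthetic and purely illustrative):

x1 = [1 2 3 4 5]';  x2 = [2 1 4 3 5]';
y  = 3.0 * x1.^1.5 .* x2.^0.5;           % exact power law, no noise
Z  = [ones(size(x1)) log(x1) log(x2)];   % regress log(y) on log(x1) and log(x2)
c  = (Z'*Z)\(Z'*log(y));
a0 = exp(c(1)), a1 = c(2), a2 = c(3)     % expect 3.0, 1.5, 0.5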

20 13.3 General Linear Least Squares: the general model is y = a_0 z_0 + a_1 z_1 + ⋯ + a_m z_m + e, where z_0, z_1, …, z_m are m+1 basis functions. When z_0 = 1, z_1 = x_1, …, z_m = x_m, we have simple or multiple linear regression. When z_0 = 1, z_1 = x, z_2 = x^2, …, z_m = x^m, we have polynomial regression.

21 The basis functions can be highly nonlinear. For example, they can be sinusoids such as z_1 = cos(ωt) and z_2 = sin(ωt), or other nonlinear functions of the independent variable. The model is still "linear" because it is linear in the unknown coefficients a_j.
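A sketch of fitting such a sinusoidal model with general linear least squares (ω, the data, and the coefficients are all illustrative assumptions):

w = 2*pi;  t = (0:0.05:1)';
y = 1.5 + 0.8*cos(w*t) - 0.3*sin(w*t) + 0.1*randn(size(t));  % synthetic noisy data
Z = [ones(size(t)) cos(w*t) sin(w*t)];   % nonlinear in t, but linear in the a's
a = (Z'*Z)\(Z'*y)                        % close to [1.5; 0.8; -0.3]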

22 Equation (13.7) can be expressed in matrix notation as {y} = [Z]{a} + {e}, where m is the number of variables in the model, n is the number of data points, and [Z] is the n × (m+1) matrix of the basis functions evaluated at the data points.

23 Because n > m in most cases, [Z] is generally not a square matrix.

24 The sum of the squares of the residuals, S_r = Σ_{i=1}^{n} (y_i − Σ_{j=0}^{m} a_j z_{j,i})^2, is minimized by taking its partial derivative with respect to each of the coefficients and setting the resulting equation equal to zero. The outcome is the normal equations [Z]^T [Z] {a} = [Z]^T {y}.

25 Example 13.3 Polynomial Regression with MATLAB. Repeat Example 13.1:
>> x = [0 1 2 3 4 5]';
>> y = [2.1 7.7 13.6 27.2 40.9 61.1]';
>> Z = [ones(size(x)) x x.^2]
Z =
     1     0     0
     1     1     1
     1     2     4
     1     3     9
     1     4    16
     1     5    25

26 >> Z'*Z
ans =
     6    15    55
    15    55   225
    55   225   979
>> a = (Z'*Z)\(Z'*y)
a =
    2.4786
    2.3593
    1.8607

27 >> Sr = sum((y-Z*a).^2)
Sr =
    3.7466
>> r2 = 1 - Sr/sum((y-mean(y)).^2)
r2 =
    0.9985
>> Syx = sqrt(Sr/(length(x)-length(a)))
Syx =
    1.1175

28 13.4 QR Factorization and the Backslash Operator: QR factorization and singular value decomposition are beyond the scope of this book, but MATLAB uses them internally, e.g. in polyfit and the backslash operator, to solve the general model {y} = [Z]{a} of Eq. (13.8):
>> x = [0 1 2 3 4 5]';
>> y = [2.1 7.7 13.6 27.2 40.9 61.1]';
>> Z = [ones(size(x)) x x.^2];
>> a = polyfit(x,y,2)
>> a = Z\y
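One caveat worth a quick check (a sketch reusing the same data): polyfit returns coefficients in descending powers, while Z\y returns them in ascending order, so the two solutions agree only after a flip:

x = [0 1 2 3 4 5]';  y = [2.1 7.7 13.6 27.2 40.9 61.1]';
Z = [ones(size(x)) x x.^2];
a_pf = polyfit(x, y, 2);            % row vector [a2 a1 a0], descending powers
a_bs = Z\y;                         % column vector [a0; a1; a2], ascending powers
max(abs(a_pf(:) - flipud(a_bs)))    % essentially zero (round-off only)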

29 13.5 Nonlinear Regression: some models are nonlinear in their coefficients and cannot be handled by general linear least squares, e.g. y = a_0 (1 − e^{−a_1 x}). The sum of the squares of the residuals, f(a_0, a_1) = Σ_i [y_i − a_0 (1 − e^{−a_1 x_i})]^2, is then a nonlinear function of the coefficients. Find a_0 and a_1 that minimize the function f. MATLAB's fminsearch function can be used for this purpose: [x, fval] = fminsearch(fun, x0, options, p1, p2, …)

30 Example 13.4 Nonlinear Regression with MATLAB: Recall Example 12.4 with Table 12.1, where the power model y = a_0 x^{a_1} was fit to force-versus-velocity data by transformation. This time, use nonlinear regression. Employ initial guesses of 1 for both coefficients.

31  Sol  M-file function f = fSSR(a, xm, ym) yp = a(1)*xm.^a(2); f =sum((ym-yp).^2); >> x=[10 20 30 40 50 60 70 80 ]; >> y = [25 70 380 550 610 1220 830 1450]; >> fminsearch(@fSSR, [1,1],[], x,y) ans = 2.53839730236869 1.43585317478585 2.53839730236869 1.43585317478585

32 Figure 13.4: Comparison of transformed and untransformed model fits for force versus velocity data from Table 12.1.

