1 Tutorial 7: SVD and Total Least Squares

2 Singular Value Decomposition
We already know that the basis of eigenvectors of a matrix A is a convenient basis for working with A. However, for a rectangular matrix A, dim(Ax) ≠ dim(x) and the concept of eigenvectors does not exist. Yet A^T A is a symmetric real matrix (for real A), and therefore there is an orthonormal basis of its eigenvectors {u_k}: A^T A u_k = λ_k u_k. Consider the vectors v_k = A u_k / √λ_k (for λ_k > 0). They are also orthonormal, since
v_k^T v_l = (A u_k)^T (A u_l) / √(λ_k λ_l) = u_k^T (A^T A) u_l / √(λ_k λ_l) = λ_l u_k^T u_l / √(λ_k λ_l) = δ_kl.
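As a sketch of this construction (using a random rectangular matrix purely for illustration), the following MATLAB lines build the u_k as eigenvectors of A'*A, form v_k = A*u_k/sqrt(lambda_k), and check that the resulting v_k are orthonormal:

A = randn(5,3);                        % illustration only: any real rectangular matrix
[U_eig, Lambda] = eig(A'*A);           % columns of U_eig are the u_k
lambda = diag(Lambda);                 % eigenvalues lambda_k >= 0
V = A*U_eig*diag(1./sqrt(lambda));     % v_k = A*u_k / sqrt(lambda_k)
disp(V'*V);                            % approximately the identity matrix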

3 Since A^T A is positive semidefinite, its eigenvalues satisfy λ_k ≥ 0. Define the singular values of A as σ_k = √λ_k and order them in non-increasing order. Motivation: one can see that if A itself is square, symmetric and positive semidefinite, then {u_k, σ_k} are its own eigenvectors and eigenvalues (for a general symmetric A, σ_k = |λ_k|). For a general matrix A, assume σ_1 ≥ σ_2 ≥ … ≥ σ_r > 0 = σ_{r+1} = σ_{r+2} = … = σ_m.
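A quick MATLAB check of the motivation above (the matrix below is an arbitrary symmetric example, not from the tutorial): the singular values of a symmetric matrix are the absolute values of its eigenvalues, and they coincide with the eigenvalues when the matrix is positive semidefinite:

B = [2 1; 1 -3];                       % symmetric, with one negative eigenvalue
disp(sort(abs(eig(B)), 'descend'));    % |eigenvalues| in non-increasing order
disp(svd(B));                          % singular values: the same numbers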

4 SVD: Example
Now we can write A = VΣU^T, where the columns of U are the eigenvectors u_k of A^T A, the columns of V are the vectors v_k, and Σ is the (rectangular) diagonal matrix of the singular values σ_k; equivalently, A = σ_1 v_1 u_1^T + σ_2 v_2 u_2^T + …
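A note on MATLAB's convention, shown as a sketch with a random matrix: MATLAB's svd returns [U,S,V] such that A = U*S*V', so MATLAB's V corresponds to the tutorial's U (the eigenvectors of A^T A) and MATLAB's U to the tutorial's V:

A = randn(4,3);                        % illustration only
[Um, S, Vm] = svd(A);                  % MATLAB convention: A = Um*S*Vm'
disp(norm(A - Um*S*Vm'));              % ~0: the factors reconstruct A
% rank-one expansion: A = sum over k of S(k,k) * Um(:,k) * Vm(:,k)'
A1 = zeros(size(A));
for k = 1:3
    A1 = A1 + S(k,k)*Um(:,k)*Vm(:,k)';
end
disp(norm(A - A1));                    % ~0 as well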

5 SVD: Example
Let us find the SVD of the matrix A. In order to find U, we calculate the eigenvalues and eigenvectors of A^T A: (5-λ)^2 - 9 = 0; λ^2 - 10λ + 16 = 0; hence λ_1 = 8, λ_2 = 2.

6 SVD: Example
The corresponding eigenvectors u_1, u_2 are found by solving (A^T A - λ_k I) u_k = 0 for λ_1 = 8 and λ_2 = 2, and normalizing each solution to unit length.

7 SVD: Example
Now we obtain V and Σ: from σ_k = √λ_k we get σ_1 = 2√2 and σ_2 = √2, and v_k = A u_k / σ_k; the decomposition is A = VΣU^T.
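The example matrix itself is not reproduced in this transcript. Purely as an illustration, the sketch below uses a hypothetical matrix A = [2 2; -1 1], chosen only because its A^T A = [5 3; 3 5] gives exactly the characteristic equation (5-λ)^2 - 9 = 0 quoted above; it reproduces the eigenvalues 8 and 2 and the singular values 2√2 and √2:

A = [2 2; -1 1];                       % hypothetical matrix (assumption, not from the slides)
disp(roots([1 -10 16]));               % characteristic polynomial roots: 8 and 2
[U_eig, Lambda] = eig(A'*A);
[lambda, idx] = sort(diag(Lambda), 'descend');
U = U_eig(:, idx);                     % u_k: eigenvectors of A'*A
sigma = sqrt(lambda);                  % singular values: 2*sqrt(2), sqrt(2)
V = A*U*diag(1./sigma);                % v_k = A*u_k / sigma_k
disp(norm(A - V*diag(sigma)*U'));      % ~0: A = V*Sigma*U' in the tutorial's notation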

8 Total Least Squares
Consider again (see Tutorial 4) the set of data points {(t_i, b_i)}, i = 1,…,n, and the problem of linear approximation of this set by a line b ≈ a·t + c. In the Least Squares (LS) approach, we defined a set of equations a·t_i + c = b_i, one per data point. When the system has no exact solution, the LS solution minimizes the sum of squared errors Σ_i (a·t_i + c − b_i)^2.

9 Total Least Squares
This approach assumes that in the set of points {(t_i, b_i)} the values of b_i are measured with errors, while the values of t_i are exact, so the errors are measured vertically, as demonstrated in the figure.

10 Total Least Squares
Assume that we rewrite the line equation with the roles of t and b exchanged, t = a'·b + c'. Then the corresponding LS equations become a'·b_i + c' = t_i, corresponding to the minimization of Σ_i (a'·b_i + c' − t_i)^2, i.e. errors measured horizontally. This means that assuming the noise is in t_i or in b_i will usually lead to different solutions.

11 Illustration
Consider the following Matlab code:

% Create the data
x = (0:0.01:2)';  y = 0.5*x + 4;
xn = x + randn(201,1)*0.3;
yn = y + randn(201,1)*0.3;
figure(1); clf;
plot(x, y, 'r'); hold on; grid on;
plot(xn, yn, '+');
% LS - version 1 - horizontal is fixed
A = [ones(201,1), xn];  b = yn;
param = inv(A'*A)*A'*b;
plot(xn, A*param, 'g');
% LS - version 2 - vertical is fixed
C = [ones(201,1), yn];  t = xn;
param = inv(C'*C)*C'*t;
plot(C*param, yn, 'b');

12 TLS
To solve the problem with noise along both t_i and b_i, we rewrite the line equation in terms of the centered data, where t̄ and b̄ denote the means of the t_i and b_i. Now we can write the system A x ≈ 0, where the i-th row of A is (t_i − t̄, b_i − b̄). An exact solution of this system is possible only if the points (t_i, b_i) lie on the same line; in this case rank(A) = 1. This formulation is symmetric with respect to t and b.
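A small sketch of this rank observation (made-up numbers, for illustration only): when the points lie exactly on a line the centered data matrix has rank 1, and with noise its rank becomes 2:

t = (0:0.1:1)';
b = 2*t + 3;                           % points exactly on a line
A_exact = [t - mean(t), b - mean(b)];
disp(rank(A_exact));                   % 1
bn = b + 0.05*randn(size(b));          % add noise to b
A_noisy = [t - mean(t), bn - mean(bn)];
disp(rank(A_noisy));                   % 2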

13 TLS
In practice the rank of A is 2, since the points are noisy and do not lie on the same line. SVD factorization and zeroing of the second singular value allow us to construct the matrix A_1 closest to A (in the Frobenius norm) with rank(A_1) = 1.
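A brief check of this claim (the matrix below is an arbitrary illustration): zeroing the second singular value yields a rank-1 matrix A_1, and the Frobenius error ||A − A_1|| equals the zeroed singular value σ_2:

A = [1 2; 3 4; 5 6.5];                 % arbitrary illustrative matrix
[U, S, V] = svd(A);
S1 = S;  S1(2,2) = 0;                  % zero the second singular value
A1 = U*S1*V';                          % best rank-1 approximation of A
disp(rank(A1));                        % 1
disp(norm(A - A1, 'fro'));             % equals ...
disp(S(2,2));                          % ... the zeroed singular value sigma_2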

14 TLS
The geometric interpretation of the TLS method is finding a line (a constant a) and a set of points on that line, such that these points lie closest, in the L2 sense, to the data set; in other words, the sum of squared orthogonal distances from the data points to the line is minimized.

15 Total Least Squares

% TLS fit via SVD (continues the code of slide 11)
xnM = mean(xn);  ynM = mean(yn);
A = [xn - xnM, yn - ynM];
[U, D, V] = svd(A);
D(2,2) = 0;                      % zero the second singular value
Anew = U*D*V';                   % closest rank-1 matrix to A
plot(Anew(:,1) + xnM, Anew(:,2) + ynM, 'r');
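As a follow-up sketch (the variable names continue from the code above; dir1, a_tls and c_tls are introduced here for illustration and do not appear in the slides), the TLS line direction is the first right singular vector of A, the fitted line passes through the mean point, and each row of Anew is the orthogonal projection of the corresponding centered data point onto that direction:

dir1 = V(:,1);                         % direction of the TLS line (first right singular vector)
a_tls = dir1(2)/dir1(1);               % slope (assumes the fitted line is not vertical)
c_tls = ynM - a_tls*xnM;               % the line passes through the mean point (xnM, ynM)
fprintf('TLS line: y = %.3f*x + %.3f\n', a_tls, c_tls);
disp(norm(Anew - A*dir1*dir1', 'fro'));  % ~0: rows of Anew are projections onto dir1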

