Properties Of the Quadratic Performance Surface

Properties of the Quadratic Performance Surface (Lecture Three)

The Quadratic Performance Surface
From the derivation of the Wiener-Hopf solution, the quadratic performance surface is expressed by
ξ = ξmin + (W − W*)ᵀ R (W − W*)
The quadratic performance surface is a function of the autocorrelation matrix R of the input vector. Eigenanalysis of the autocorrelation matrix gives information about the characteristics of the performance surface.
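As a quick numerical illustration, the sketch below evaluates this expression for made-up values of R, W*, and ξmin (all assumed for the example, not taken from the lecture):

```matlab
% Sketch: evaluate the quadratic performance surface at one weight vector
R     = [2 1; 1 2];      % assumed input autocorrelation matrix
Wopt  = [1; -0.5];       % assumed optimal (Wiener) weight vector W*
ximin = 0.1;             % assumed minimum mean square error

W  = [0.3; 0.4];         % an arbitrary weight vector
V  = W - Wopt;
xi = ximin + V' * R * V  % mean square error at W
```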

Introduction
From the standpoint of engineering applications, eigenvalue problems are among the most important problems connected with matrices. For linear discrete-time and continuous-time systems, the eigenvalues of the system matrix A completely determine stability. The eigenvectors of A form a very convenient choice of basis vectors: they may be used to uncouple the state equations and provide a convenient framework for system analysis.
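A small sketch of the stability remark; the matrix A below is assumed purely for illustration:

```matlab
% Sketch: stability from the eigenvalues of an assumed system matrix A
A = [0.5 0.2; -0.1 0.8];
lam = eig(A);
stable_discrete   = all(abs(lam)  < 1)   % discrete-time test: |lambda_i| < 1
stable_continuous = all(real(lam) < 0)   % continuous-time test: Re(lambda_i) < 0
```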

Definition
Let the domain and range of a linear transformation A be D(A) and R(A) within the vector space X. Those vectors xi and scalars λi which satisfy the condition A(xi) = λi xi are called eigenvectors and eigenvalues respectively. The trivial case xi = 0 is excluded.

Eigenvalues
For a linear transformation represented by a matrix A, the condition A xi = λi xi becomes (A − λi I) xi = 0. A necessary condition for a nontrivial solution of this set of n homogeneous equations is rank(A − λi I) < n, which results in the characteristic equation det(A − λi I) = 0.
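A hedged sketch of the characteristic-equation route, using an assumed 2-by-2 matrix:

```matlab
% Sketch: eigenvalues as roots of the characteristic polynomial of an assumed A
A = [4 1; 2 3];
c = poly(A);                 % coefficients of det(lambda*I - A)
lam_from_roots = roots(c)    % roots of the characteristic equation (5 and 2)
lam_from_eig   = eig(A)      % same eigenvalues computed directly
```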

If there are p < n distinct roots λ1, …, λp, the characteristic polynomial can be written as
det(A − λI) = (λ1 − λ)^m1 (λ2 − λ)^m2 ⋯ (λp − λ)^mp
The integer mi is called the algebraic multiplicity of λi. The characteristic equation has to be solved for its roots to obtain the eigenvalues.
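For instance (an assumed example), a matrix can have fewer distinct eigenvalues than its size:

```matlab
% Sketch: a repeated eigenvalue; lambda = 2 has algebraic multiplicity m = 2
A = [2 1; 0 2];
eig(A)        % returns 2 twice, i.e. p = 1 distinct root with m1 = 2
```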

Eigenvector and Eigenvalue Problems
The membrane stretch problem: an elastic membrane is stretched such that a point P goes over into a point Q. Find the principal directions of stretch.

The governing equation for the stretch is y = Ax, where x is the position of P, y is the position of Q, and A is the transformation matrix. The eigenvalues of the transformation matrix are 2 and 8.
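The transformation matrix itself does not appear in the text above, so the sketch below assumes the standard textbook choice A = [5 3; 3 5], whose eigenvalues are indeed 2 and 8:

```matlab
% Sketch: principal directions of stretch for an assumed transformation matrix
A = [5 3; 3 5];
[Q, Lambda] = eig(A);
diag(Lambda)    % stretch factors: 2 and 8
Q               % columns: principal directions, along [1;-1] and [1;1] (up to sign)
```
A unit circle of material points is mapped by A into an ellipse whose axes lie along these eigenvector directions.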

[Figure: original and stretched membrane]

Thus the eigenvectors specify the directions along which the output is simply a scaled copy of the input: for an input along an eigenvector, the output is a scalar multiple of that input. The constant scaling factor is called the eigenvalue, and the corresponding direction is the eigenvector associated with that eigenvalue.

The Quadratic Performance Surface
Using eigenanalysis we can identify the basis, the principal axes, on which the performance surface is defined. Hence analysis of the performance surface is simplified.

Normal Form of the Input Correlation Matrix
The eigenvalues of the input autocorrelation matrix R are defined by
R Qn = λn Qn
where Qn is the nth eigenvector, corresponding to the nth eigenvalue λn. The eigenvalues are computed from the characteristic equation
det[R − λI] = 0
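A minimal numerical check of this definition, using an assumed 2-by-2 autocorrelation matrix:

```matlab
% Sketch: verify R*Qn = lambda_n*Qn for an assumed autocorrelation matrix
R = [2 1; 1 2];
[Q, Lambda] = eig(R);
n = 1;
R*Q(:,n) - Lambda(n,n)*Q(:,n)   % ~[0;0] up to rounding error
```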

Normal Form of the Input Correlation Matrix
The eigenvectors form the basis vectors for the input autocorrelation matrix R, and the eigenvalues are the weights associated with those basis vectors. Stacking the eigenvector equations for all L + 1 eigenvectors, we can write
R [Q0 Q1 … QL] = [λ0Q0 λ1Q1 … λLQL]

Normal Form of the Input Correlation Matrix
Therefore we can also write
RQ = QΛ    or    R = QΛQ⁻¹
This is the normal form of R, where Q = [Q0 Q1 … QL] is the eigenvector matrix and Λ is a diagonal matrix with the eigenvalues as the diagonal entries, called the eigenvalue matrix.
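A quick numerical confirmation of the normal form (R below is an assumed example):

```matlab
% Sketch: reconstruct R from its eigenvector and eigenvalue matrices
R = [2 1; 1 2];
[Q, Lambda] = eig(R);
norm(R - Q*Lambda/Q)     % ~0: R = Q*Lambda*Q^-1
norm(R - Q*Lambda*Q')    % ~0: here Q is orthonormal, so Q^-1 = Q'
```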

Properties of the Eigenvalues and Eigenvectors
As R is a symmetric matrix, the eigenvectors corresponding to distinct eigenvalues are mutually orthogonal. Since R is a real, symmetric autocorrelation matrix, it is positive semidefinite, so all eigenvalues are real and greater than or equal to zero. The eigenvector matrix Q can be normalized such that QQᵀ = I.
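These properties can be checked numerically; the sketch below builds a sample autocorrelation matrix from assumed random data:

```matlab
% Sketch: eigen-properties of a sample autocorrelation matrix (assumed data)
x = randn(1000, 3);              % assumed zero-mean input samples (rows)
R = (x' * x) / size(x, 1);       % sample autocorrelation matrix: symmetric, PSD
[Q, Lambda] = eig(R);
isreal(diag(Lambda))             % eigenvalues are real
all(diag(Lambda) >= -1e-12)      % and non-negative (up to rounding)
norm(Q*Q' - eye(3))              % ~0: Q*Q' = I after normalization
```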

Geometrical Significance of the Eigenvectors and Eigenvalues
The eigenvectors and the eigenvalues are related to certain properties of the error surface. We know that the error performance surface forms a hyperparabolic surface in a space of N dimensions for N − 1 weights, the additional dimension being the mean square error axis.

Geometrical Significance of the Eigenvectors and Eigenvalues
[Figure: the hyperparabolic (bowl-shaped) surface in three dimensions for two weights; axes w0, w1, and MSE]

Geometrical Significance of the Eigenvectors and Eigenvalues If we cut the paraboloid with planes parallel to the w0w1-plane, we obtain concentric ellipses corresponding to different values of mean square error.

Geometrical Significance of the Eigenvectors and Eigenvalues
[Figure: concentric ellipses in the w0-w1 plane, with different shades corresponding to different values of mean square error]

Geometrical Significance of the Eigenvectors and Eigenvalues
From the mean square error expression, the equation for the ellipses can be written as
WᵀRW − 2PᵀW = constant
Equivalently, in terms of the translated weight vector V = (W − W*), we can write
VᵀRV = constant
The general expression for the ellipses in function form can be written as
F(V) = VᵀRV
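To visualize these ellipses, a short sketch with an assumed 2-by-2 correlation matrix:

```matlab
% Sketch: contours of F(V) = V'*R*V are concentric ellipses (assumed R)
R = [2 1; 1 2];
[v0, v1] = meshgrid(-2:0.05:2, -2:0.05:2);
F = R(1,1)*v0.^2 + 2*R(1,2)*v0.*v1 + R(2,2)*v1.^2;   % V'*R*V at each grid point
contour(v0, v1, F); axis equal
xlabel('v_0'); ylabel('v_1')
```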

Geometrical Significance of the Eigenvectors and Eigenvalues
A vector normal to the ellipses can be obtained by taking the gradient of F, which gives ∇F(V) = 2RV. A principal axis of the ellipses passes through the origin, so a point V′ on a principal axis must have its normal parallel to V′ itself, i.e. of the form μV′.

Geometrical Significance of the Eigenvectors and Eigenvalues
Since the principal axis is normal to the ellipses F(V) = constant, at V′ we have
2RV′ = μV′   or equivalently   [R − (μ/2)I] V′ = 0
Thus V′ lies along a principal axis and is also an eigenvector of the matrix R, with eigenvalue λ = μ/2.
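A small numerical check that an eigenvector of R has the ellipse normal aligned with itself (R is the same assumed example matrix):

```matlab
% Sketch: at an eigenvector of R, the ellipse normal 2*R*V is parallel to V
R = [2 1; 1 2];
[Q, ~] = eig(R);
Vp = Q(:, 1);                  % a principal-axis direction
g  = 2 * R * Vp;               % gradient of F(V) = V'*R*V at Vp
g(1)*Vp(2) - g(2)*Vp(1)        % ~0: g and Vp are parallel
```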

Geometrical Significance of the Eigenvectors and Eigenvalues
The eigenvectors of the input correlation matrix define the principal axes of the error surface.

Geometrical Significance of the Eigenvectors and Eigenvalues
Take the expression for the mean square error
ξ = ξmin + VᵀRV,   where V = (W − W*)
Replace R by its normal form R = QΛQ⁻¹ = QΛQᵀ. We have
ξ = ξmin + V′ᵀ Λ V′,   where V′ = QᵀV
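A quick sketch confirming that the two forms of ξ agree (all numerical values below are assumed for illustration):

```matlab
% Sketch: xi in weight coordinates equals xi in principal coordinates
R = [2 1; 1 2];  Wopt = [1; -0.5];  ximin = 0.1;   % assumed values
[Q, Lambda] = eig(R);
W  = [0.3; 0.4];
V  = W - Wopt;
Vp = Q' * V;                        % translate, then rotate to principal axes
xi1 = ximin + V'  * R      * V;
xi2 = ximin + Vp' * Lambda * Vp;
xi1 - xi2                           % ~0: the two expressions agree
```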

Geometrical Significance of the Eigenvectors and Eigenvalues
The gradient of the above expression with respect to V′ is
∇ξ = 2ΛV′ = 2[λ0v′0  λ1v′1  …  λLv′L]ᵀ
To summarize: V = (W − W*) can be considered as a translation to a new origin, and V′ = QᵀV is the transformation (rotation) to the principal coordinate system.

Geometrical Significance of the Eigenvectors and Eigenvalues
The gradient of ξ along the nth principal axis is given by
∂ξ/∂v′n = 2λn v′n,   so that   ∂²ξ/∂v′n² = 2λn
Thus the eigenvalues of the input correlation matrix R determine the second derivative (curvature) of the error surface ξ with respect to the principal axes of ξ.
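A finite-difference sketch of the curvature result, again with an assumed R and ξmin:

```matlab
% Sketch: curvature of xi along principal axis n is 2*lambda_n
R = [2 1; 1 2];  ximin = 0.1;              % assumed values
[Q, Lambda] = eig(R);
xi = @(vp) ximin + vp' * Lambda * vp;      % MSE in principal coordinates
n = 2;  h = 1e-3;  e = zeros(2,1);  e(n) = 1;
d2 = (xi(h*e) - 2*xi(zeros(2,1)) + xi(-h*e)) / h^2   % ~ 2*Lambda(n,n)
```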

Assignment
All exercise questions from Chapter 2. MATLAB questions are due in the next lab on Tuesday.