# 6-1 Linear Transformations. 6-2 Hopfield Network Questions.


## 6-1 Linear Transformations

## 6-2 Hopfield Network Questions

The network output is repeatedly multiplied by the weight matrix $W$. What is the effect of this repeated operation? Will the output converge, go to infinity, or oscillate? In this chapter we investigate matrix multiplication, which represents a general linear transformation.
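The question above can be explored numerically. The matrices below are hypothetical 2×2 examples (not from the text), chosen so their eigenvalue magnitudes produce the three behaviors the slide asks about:

```python
import numpy as np

# Hypothetical "weight matrices" chosen only to illustrate the three behaviors.
shrink = np.array([[0.5, 0.0],
                   [0.0, 0.3]])    # all |eigenvalues| < 1 -> output decays to zero
grow   = np.array([[1.5, 0.0],
                   [0.0, 0.2]])    # an |eigenvalue| > 1 -> output grows without bound
flip   = np.array([[-1.0, 0.0],
                   [0.0, -1.0]])   # eigenvalues on the unit circle (-1) -> oscillation

def iterate(W, x, steps):
    """Repeatedly multiply x by W, as in the Hopfield question."""
    for _ in range(steps):
        x = W @ x
    return x

x0 = np.array([1.0, 1.0])
print(np.linalg.norm(iterate(shrink, x0, 50)))  # near zero: converges
print(np.linalg.norm(iterate(grow, x0, 50)))    # very large: diverges
print(iterate(flip, x0, 51))                    # sign flips each step: oscillates
```

The outcome is governed by the eigenvalues of $W$, which is why the chapter develops eigenvalues and diagonalization.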

## 6-4 Linear Transformations

A transformation consists of three parts:

1. a set of elements $X = \{x_i\}$, called the domain,
2. a set of elements $Y = \{y_i\}$, called the range, and
3. a rule relating each $x_i \in X$ to an element $y_i \in Y$.

A transformation $A$ is linear if:

1. for all $x_1, x_2 \in X$, $A(x_1 + x_2) = A(x_1) + A(x_2)$, and
2. for all $x \in X$ and all scalars $a \in \mathbb{R}$, $A(a x) = a\, A(x)$.

## 6-5 Example: Rotation

Is rotation about the origin through an angle $\theta$ linear? (It is: both conditions in the definition hold.)
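This can be checked numerically against the two-part definition above. The angle and test vectors below are arbitrary choices for illustration:

```python
import numpy as np

theta = 0.7  # an arbitrary angle in radians; any value works
# Rotation about the origin in R^2, written as matrix multiplication.
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def A(x):
    return R @ x

x1 = np.array([1.0, 2.0])
x2 = np.array([-3.0, 0.5])
a = 2.5

# Condition 1: A(x1 + x2) = A(x1) + A(x2)
print(np.allclose(A(x1 + x2), A(x1) + A(x2)))  # True
# Condition 2: A(a x) = a A(x)
print(np.allclose(A(a * x1), a * A(x1)))       # True
```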

## 6-6 Example: Translation

Is translation linear? (It is not: translation by a nonzero vector $b$ maps $x$ to $x + b$, so for example it maps the zero vector to $b \neq 0$, which no linear transformation can do.)
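The failure is easy to exhibit numerically; the translation vector below is a hypothetical choice:

```python
import numpy as np

b = np.array([1.0, 1.0])  # a fixed nonzero translation vector (hypothetical)

def T(x):
    """Translation by b."""
    return x + b

x1 = np.array([2.0, 0.0])
x2 = np.array([0.0, 3.0])

# Additivity fails: T(x1 + x2) = x1 + x2 + b, but T(x1) + T(x2) = x1 + x2 + 2b.
print(np.allclose(T(x1 + x2), T(x1) + T(x2)))  # False
```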

## 6-7 Matrix Representation (1)

Any linear transformation between two finite-dimensional vector spaces can be represented by matrix multiplication. Let $\{v_1, v_2, \ldots, v_n\}$ be a basis for $X$, and let $\{u_1, u_2, \ldots, u_m\}$ be a basis for $Y$. Let $A: X \to Y$, and expand the input and output vectors in these bases:

$$x = \sum_{j=1}^{n} x_j v_j, \qquad y = A(x) = \sum_{i=1}^{m} y_i u_i.$$

## 6-8 Matrix Representation (2)

Since $A$ is a linear operator,

$$A(x) = A\!\left(\sum_{j=1}^{n} x_j v_j\right) = \sum_{j=1}^{n} x_j\, A(v_j).$$

Since the $u_i$ are a basis for $Y$, each transformed basis vector can be expanded as

$$A(v_j) = \sum_{i=1}^{m} a_{ij}\, u_i.$$

(The coefficients $a_{ij}$ will make up the matrix representation of the transformation.)

## 6-9 Matrix Representation (3)

Because the $u_i$ are independent, matching coefficients gives

$$y_i = \sum_{j=1}^{n} a_{ij}\, x_j,$$

which is equivalent to matrix multiplication:

$$\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{bmatrix} =
\begin{bmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}.$$

## 6-10 Summary

A linear transformation can be represented by matrix multiplication. To find the matrix which represents the transformation, transform each basis vector of the domain and expand the result in terms of the basis vectors of the range. Each of these expansions gives one column of the matrix.
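The procedure in the summary can be sketched directly. The map (reflection across the line $y = x$) and the two basis sets below are hypothetical choices, not from the text; the point is that each column of the matrix is the expansion of one transformed domain basis vector in the range basis:

```python
import numpy as np

# A concrete linear map on R^2 (hypothetical): reflection across the line y = x.
def f(x):
    return np.array([x[1], x[0]])

# Hypothetical bases: {v1, v2} for the domain, {u1, u2} for the range.
v1, v2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])
u1, u2 = np.array([2.0, 0.0]), np.array([0.0, 1.0])
U = np.column_stack([u1, u2])

# One column per domain basis vector: expand f(v_j) in the range basis
# by solving U a_j = f(v_j) for the coefficients a_j.
cols = [np.linalg.solve(U, f(v)) for v in (v1, v2)]
A = np.column_stack(cols)

# Check: if x has coordinates c in {v1, v2}, then f(x) has coordinates A @ c in {u1, u2}.
c = np.array([3.0, -2.0])
x = c[0] * v1 + c[1] * v2
print(np.allclose(np.linalg.solve(U, f(x)), A @ c))  # True
```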

## 6-11 Change of Basis

Consider the linear transformation $A: X \to Y$. Let $\{v_1, v_2, \ldots, v_n\}$ be a basis for $X$, and let $\{u_1, u_2, \ldots, u_m\}$ be a basis for $Y$. The matrix representation is:

$$\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{bmatrix} =
\begin{bmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}.$$

## 6-12 New Basis Sets

Now consider different basis sets. Let $\{t_1, t_2, \ldots, t_n\}$ be a basis for $X$, and let $\{w_1, w_2, \ldots, w_m\}$ be a basis for $Y$. The new matrix representation is:

$$\begin{bmatrix} y'_1 \\ y'_2 \\ \vdots \\ y'_m \end{bmatrix} =
\begin{bmatrix}
a'_{11} & a'_{12} & \cdots & a'_{1n} \\
a'_{21} & a'_{22} & \cdots & a'_{2n} \\
\vdots & \vdots & & \vdots \\
a'_{m1} & a'_{m2} & \cdots & a'_{mn}
\end{bmatrix}
\begin{bmatrix} x'_1 \\ x'_2 \\ \vdots \\ x'_n \end{bmatrix}.$$

## 6-13 How Are A and A′ Related?

Expand $t_i$ in terms of the original basis vectors for $X$:

$$t_i = \sum_{j=1}^{n} t_{ji}\, v_j = \begin{bmatrix} t_{1i} \\ t_{2i} \\ \vdots \\ t_{ni} \end{bmatrix}.$$

Expand $w_i$ in terms of the original basis vectors for $Y$:

$$w_i = \sum_{j=1}^{m} w_{ji}\, u_j = \begin{bmatrix} w_{1i} \\ w_{2i} \\ \vdots \\ w_{mi} \end{bmatrix}.$$

## 6-14 How Are A and A′ Related? (continued)

Collect the new basis vectors into the columns of two matrices:

$$B_t = \begin{bmatrix} t_1 & t_2 & \cdots & t_n \end{bmatrix}, \qquad
B_w = \begin{bmatrix} w_1 & w_2 & \cdots & w_m \end{bmatrix}.$$

Then the two representations are related by a similarity transform:

$$A' = B_w^{-1} A\, B_t.$$

## 6-15 Example

Consider a transformation $A: \mathbb{R}^3 \to \mathbb{R}^2$ whose matrix representation relative to the standard basis sets is given. Find the matrix for this transformation relative to the new basis sets.

## 6-16 Example (continued)

Step 1: form the matrices $B_t$ and $B_w$ from the new basis vectors.

Step 2: use the similarity transform $A' = B_w^{-1} A\, B_t$.
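The transcript does not include the slide's actual matrices, so the values below are stand-ins chosen only to illustrate the two steps for an $\mathbb{R}^3 \to \mathbb{R}^2$ transformation:

```python
import numpy as np

# Stand-in representation of A in the standard bases (hypothetical values).
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])

# Step 1: form Bt (columns = new basis for R^3) and Bw (columns = new basis for R^2).
# These basis choices are also hypothetical.
Bt = np.array([[1.0, 0.0, 1.0],
               [0.0, 1.0, 1.0],
               [0.0, 0.0, 1.0]])
Bw = np.array([[1.0, 1.0],
               [0.0, 1.0]])

# Step 2: apply the similarity transform A' = Bw^-1 A Bt.
A_prime = np.linalg.inv(Bw) @ A @ Bt
print(A_prime)
```

As a sanity check, for any coordinate vector $c$ in the new domain basis, $B_w (A' c)$ and $A (B_t c)$ must agree, since both compute the same output vector in the standard basis.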

## 6-17 Eigenvalues and Eigenvectors

Let $A: X \to X$ be a linear transformation. Those nonzero vectors $z \in X$ and those scalars $\lambda$ which satisfy

$$A(z) = \lambda z$$

are called eigenvectors and eigenvalues, respectively.

## 6-18 Computing the Eigenvalues

Skewing example (45°):

$$A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}, \qquad
\lvert A - \lambda I \rvert =
\begin{vmatrix} 1-\lambda & 1 \\ 0 & 1-\lambda \end{vmatrix}
= (1-\lambda)^2 = 0
\quad\Rightarrow\quad \lambda_1 = \lambda_2 = 1.$$

To find the eigenvectors, solve $(A - \lambda I)\, z = 0$:

$$\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}
\begin{bmatrix} z_{11} \\ z_{21} \end{bmatrix}
= \begin{bmatrix} 0 \\ 0 \end{bmatrix}
\quad\Rightarrow\quad z_{21} = 0, \qquad
z_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}.$$

For this transformation there is only one independent eigenvector.
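The hand computation above can be confirmed with numpy:

```python
import numpy as np

# The 45-degree skew from the example.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)   # repeated eigenvalue: 1, 1
print(eigvecs)   # the returned columns are (nearly) parallel to [1, 0]^T

# A - I has rank 1, so its null space (the eigenspace) is one-dimensional:
# there is only one independent eigenvector, [1, 0]^T up to scale.
print(np.linalg.matrix_rank(A - np.eye(2)))  # 1
```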

## 6-19 Diagonalization

Perform a change of basis (similarity transformation) using the eigenvectors as the basis vectors. If the eigenvalues are distinct, the new matrix will be diagonal:

$$B = \begin{bmatrix} z_1 & z_2 & \cdots & z_n \end{bmatrix} \ \text{(eigenvectors)}, \qquad
\{\lambda_1, \lambda_2, \ldots, \lambda_n\} \ \text{(eigenvalues)},$$

$$B^{-1} A B =
\begin{bmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & & \vdots \\
0 & 0 & \cdots & \lambda_n
\end{bmatrix}.$$

## 6-20 Example

Diagonal form:
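The transcript does not include the slide's example matrix, so the matrix below is a hypothetical one with distinct eigenvalues, used only to illustrate the diagonalization procedure:

```python
import numpy as np

# Hypothetical matrix with distinct eigenvalues (2 and 3), so it is diagonalizable.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, B = np.linalg.eig(A)   # columns of B are the eigenvectors

# Change of basis using the eigenvectors: the result is diagonal,
# with the eigenvalues on the diagonal.
D = np.linalg.inv(B) @ A @ B
print(np.round(D, 10))
```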