Presentation transcript: Linear Transformations (Chapter 6)

Slide 1 (6-1): Linear Transformations

Slide 2 (6-2): Hopfield Network Questions

The network output is repeatedly multiplied by the weight matrix W. What is the effect of this repeated operation? Will the output converge, go to infinity, or oscillate? In this chapter we want to investigate matrix multiplication, which represents a general linear transformation.
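
The effect of repeated multiplication is easy to see numerically. Below is a minimal Python/NumPy sketch (not from the slides); the weight matrices are made-up examples chosen so that the eigenvalue magnitudes, which this chapter develops, determine whether the iterates decay, grow, or oscillate.

```python
import numpy as np

def iterate(W, a0, steps=20):
    """Return the sequence a, Wa, W^2 a, ... for `steps` iterations."""
    a = np.asarray(a0, dtype=float)
    history = [a]
    for _ in range(steps):
        a = W @ a          # one pass through the (linear) recurrent layer
        history.append(a)
    return history

# Illustrative weight matrices (chosen for this sketch, not from the chapter):
W_decay  = np.array([[0.5, 0.0], [0.0, 0.9]])   # |eigenvalues| < 1 -> iterates shrink toward 0
W_growth = np.array([[1.5, 0.0], [0.0, 0.2]])   # an eigenvalue > 1 -> iterates blow up
W_osc    = np.array([[0.0, -1.0], [1.0, 0.0]])  # eigenvalues on the unit circle -> iterates oscillate

for name, W in [("decay", W_decay), ("growth", W_growth), ("oscillate", W_osc)]:
    final = iterate(W, [1.0, 1.0])[-1]
    print(name, np.linalg.eigvals(W), final)
```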

Slide 3 (6-3): Linear Transformations

A transformation consists of three parts:
1. A set of elements X = {x_i}, called the domain,
2. A set of elements Y = {y_i}, called the range, and
3. A rule relating each x_i ∈ X to an element y_i ∈ Y.

A transformation is linear if:
1. For all x_1, x_2 ∈ X, A(x_1 + x_2) = A(x_1) + A(x_2),
2. For all x ∈ X and all scalars a ∈ ℝ, A(a x) = a A(x).

Slide 4 (6-4): Example: Rotation

Is rotation (about the origin) linear? Check the two conditions:
1. Does A(x_1 + x_2) = A(x_1) + A(x_2)?
2. Does A(a x) = a A(x)?
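
A quick numerical check of both conditions for rotation about the origin; this is a sketch added for illustration, using random test vectors rather than a proof.

```python
import numpy as np

def rotate(x, theta):
    """Rotate a 2-D vector by angle theta (counter-clockwise, about the origin)."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ x

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=2), rng.normal(size=2)
a, theta = 2.5, np.pi / 6

# Condition 1: A(x1 + x2) = A(x1) + A(x2)
assert np.allclose(rotate(x1 + x2, theta), rotate(x1, theta) + rotate(x2, theta))
# Condition 2: A(a x) = a A(x)
assert np.allclose(rotate(a * x1, theta), a * rotate(x1, theta))
print("Rotation about the origin satisfies both linearity conditions.")
```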

Slide 5 (6-5): Matrix Representation (1)

Any linear transformation between two finite-dimensional vector spaces can be represented by matrix multiplication. Let {v_1, v_2, ..., v_n} be a basis for X, and let {u_1, u_2, ..., u_m} be a basis for Y. Let A : X → Y.

Slide 6 (6-6): Matrix Representation (2)

Since A is a linear operator,

$$A(\mathbf{x}) = A\Big(\sum_{j=1}^{n} x_j \mathbf{v}_j\Big) = \sum_{j=1}^{n} x_j\, A(\mathbf{v}_j).$$

Since the u_i are a basis for Y,

$$A(\mathbf{v}_j) = \sum_{i=1}^{m} a_{ij}\, \mathbf{u}_i.$$

(The coefficients a_ij will make up the matrix representation of the transformation.)

Slide 7 (6-7): Matrix Representation (3)

Because the u_i are independent, the coefficients of each u_i must match on both sides:

$$y_i = \sum_{j=1}^{n} a_{ij}\, x_j .$$

This is equivalent to the matrix multiplication

$$\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{bmatrix}.$$

Slide 8 (6-8): Summary

A linear transformation can be represented by matrix multiplication. To find the matrix which represents the transformation, we must transform each basis vector for the domain and then expand the result in terms of the basis vectors of the range. Each of these equations gives us one column of the matrix.
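
This recipe translates directly into code. The sketch below is an added illustration (not from the slides): `skew` is a hypothetical stand-in transformation, the card-deck shear discussed next, under the assumption A(s_2) = tan(θ)·s_1 + s_2. Because the standard basis is also used for the range, each transformed basis vector is already its own column of coordinates.

```python
import numpy as np

def matrix_of(transform, domain_basis):
    """Apply `transform` to each domain basis vector; stack the results as columns.
    (Assumes the range basis is the standard basis, so no further expansion is needed.)"""
    return np.column_stack([transform(v) for v in domain_basis])

# Hypothetical stand-in transformation: shear along the first axis by angle theta.
def skew(x, theta=np.pi / 4):
    return np.array([x[0] + np.tan(theta) * x[1], x[1]])

s1, s2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A = matrix_of(skew, [s1, s2])
print(A)    # columns are skew(s1) and skew(s2): approximately [[1, 1], [0, 1]] for theta = 45 degrees
```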

Slide 9 (6-9): Example (1)

Stand a deck of playing cards on edge so that you are looking at the deck sideways. Draw a vector x on the edge of the deck. Now "skew" the deck by an angle θ, as shown below, and note the new vector y = A(x). What is the matrix of this transformation in terms of the standard basis set?

Slide 10 (6-10): Example (2)

To find the matrix we need to transform each of the basis vectors. We will use the standard basis vectors {s_1, s_2} for both the domain and the range.

Slide 11 (6-11): Example (3)

We begin with s_1. If we draw a line on the bottom card and then skew the deck, the line does not change:

$$A(\mathbf{s}_1) = \mathbf{s}_1 = 1 \cdot \mathbf{s}_1 + 0 \cdot \mathbf{s}_2.$$

This gives us the first column of the matrix.

Slide 12 (6-12): Example (4)

Next, we skew s_2. The vertical basis vector tilts by the angle θ:

$$A(\mathbf{s}_2) = \tan\theta \cdot \mathbf{s}_1 + 1 \cdot \mathbf{s}_2.$$

This gives us the second column of the matrix.

Slide 13 (6-13): Example (5)

The matrix of the transformation is:

$$\mathbf{A} = \begin{bmatrix} 1 & \tan\theta \\ 0 & 1 \end{bmatrix}.$$

Slide 14 (6-14): Change of Basis

Consider the linear transformation A : X → Y. Let {v_1, v_2, ..., v_n} be a basis for X, and let {u_1, u_2, ..., u_m} be a basis for Y. The matrix representation is:

$$\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{bmatrix}.$$

Slide 15 (6-15): New Basis Sets

Now let's consider different basis sets. Let {t_1, t_2, ..., t_n} be a basis for X, and let {w_1, w_2, ..., w_m} be a basis for Y. The new matrix representation is:

$$\begin{bmatrix} a'_{11} & a'_{12} & \cdots & a'_{1n} \\ a'_{21} & a'_{22} & \cdots & a'_{2n} \\ \vdots & \vdots & & \vdots \\ a'_{m1} & a'_{m2} & \cdots & a'_{mn} \end{bmatrix} \begin{bmatrix} x'_1 \\ x'_2 \\ \vdots \\ x'_n \end{bmatrix} = \begin{bmatrix} y'_1 \\ y'_2 \\ \vdots \\ y'_m \end{bmatrix}.$$

Slide 16 (6-16): How are A and A' related?

Expand t_i in terms of the original basis vectors for X, and expand w_i in terms of the original basis vectors for Y:

$$\mathbf{t}_i = \sum_{j=1}^{n} t_{ji}\,\mathbf{v}_j = \begin{bmatrix} t_{1i} \\ t_{2i} \\ \vdots \\ t_{ni} \end{bmatrix}, \qquad \mathbf{w}_i = \sum_{j=1}^{m} w_{ji}\,\mathbf{u}_j = \begin{bmatrix} w_{1i} \\ w_{2i} \\ \vdots \\ w_{mi} \end{bmatrix}.$$

Slide 17 (6-17): How are A and A' related?

Collect the new basis vectors into matrices:

$$\mathbf{B}_t = \begin{bmatrix} \mathbf{t}_1 & \mathbf{t}_2 & \cdots & \mathbf{t}_n \end{bmatrix}, \qquad \mathbf{B}_w = \begin{bmatrix} \mathbf{w}_1 & \mathbf{w}_2 & \cdots & \mathbf{w}_m \end{bmatrix}.$$

Then the two representations are related by a similarity transform:

$$\mathbf{A}' = \mathbf{B}_w^{-1}\,\mathbf{A}\,\mathbf{B}_t.$$

Slide 18 (6-18): Example (1)

Take the skewing problem described previously, and find the new matrix representation using the basis set {t_1, t_2}:

$$\mathbf{t}_1 = \begin{bmatrix} 0.5 \\ 1 \end{bmatrix}, \qquad \mathbf{t}_2 = \begin{bmatrix} -1 \\ 1 \end{bmatrix}.$$

(Same basis for domain and range, so B_w = B_t.)

Slide 19 (6-19): Example (2)

For θ = 45°, the skew matrix is A = [1 1; 0 1], and the new representation follows from the similarity transform A' = B_t^{-1} A B_t with B_t = [t_1 t_2].
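
Here is a NumPy sketch of that computation. The entries of t_1 and t_2 are taken from the slide text above and may be garbled by extraction, so treat the specific numbers as illustrative; the point is the mechanics of A' = B_t^{-1} A B_t.

```python
import numpy as np

# Skew matrix for theta = 45 degrees (tan 45 = 1), as derived in the example.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# New basis vectors as read from the slide (illustrative values).
t1 = np.array([0.5, 1.0])
t2 = np.array([-1.0, 1.0])

B_t = np.column_stack([t1, t2])          # columns are the new basis vectors
A_prime = np.linalg.inv(B_t) @ A @ B_t   # similarity transform (same basis for domain and range)
print(A_prime)
```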

Slide 20 (6-20): Example (3)

Try a test vector, and check the result using the reciprocal basis vectors: the coordinates of a vector in the new basis are given by x' = B_t^{-1} x.
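
A sketch of the test-vector check (an added illustration; the test vector itself is arbitrary and the basis entries are the same illustrative values as above): transform once in the standard basis, once in the new basis, and confirm both routes give the same result. The rows of B_t^{-1} play the role of the reciprocal basis vectors.

```python
import numpy as np

A = np.array([[1.0, 1.0],        # 45-degree skew in the standard basis
              [0.0, 1.0]])
B_t = np.column_stack([[0.5, 1.0], [-1.0, 1.0]])   # new basis {t1, t2} (illustrative values)
A_prime = np.linalg.inv(B_t) @ A @ B_t

x = np.array([1.0, 2.0])                  # arbitrary test vector, standard-basis coordinates
x_prime = np.linalg.inv(B_t) @ x          # coordinates in the new basis (via reciprocal basis vectors)

y_standard = A @ x                        # transform in the standard basis
y_via_new  = B_t @ (A_prime @ x_prime)    # transform in the new basis, convert back
print(np.allclose(y_standard, y_via_new)) # True: the two representations agree
```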

Slide 21 (6-21): Eigenvalues and Eigenvectors

Let A : X → X be a linear transformation. Those vectors z ∈ X which are not equal to zero, and those scalars λ, which satisfy

A(z) = λ z,

are called eigenvectors and eigenvalues, respectively. Can you find an eigenvector for this transformation?

Slide 22 (6-22): Computing the Eigenvalues

For the skewing example (θ = 45°):

$$\mathbf{A} = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}, \qquad |\mathbf{A} - \lambda\mathbf{I}| = (1-\lambda)^2 = 0 \;\Rightarrow\; \lambda_1 = \lambda_2 = 1.$$

For λ = 1:

$$(\mathbf{A} - \lambda_1\mathbf{I})\,\mathbf{z}_1 = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} z_{11} \\ z_{21} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \;\Rightarrow\; z_{21} = 0, \qquad \mathbf{z}_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}.$$

For this transformation there is only one eigenvector.
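
The same computation with NumPy (added for illustration) confirms the repeated eigenvalue and the single independent eigenvector direction.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])                # skew matrix for theta = 45 degrees

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                            # [1. 1.]  -- a repeated eigenvalue
print(eigvecs)                            # both columns lie (numerically) along s1 = [1, 0]

# Direct check of the defining equation A z = lambda z for z = [1, 0]:
z = np.array([1.0, 0.0])
print(np.allclose(A @ z, 1.0 * z))        # True
```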

Slide 23 (6-23): Diagonalization

Perform a change of basis (similarity transformation) using the eigenvectors as the basis vectors. If the eigenvalues are distinct, the new matrix will be diagonal:

$$\mathbf{B} = \begin{bmatrix} \mathbf{z}_1 & \mathbf{z}_2 & \cdots & \mathbf{z}_n \end{bmatrix} \;\text{(eigenvectors)}, \qquad \mathbf{B}^{-1}\mathbf{A}\mathbf{B} = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix} \;\text{(eigenvalues)}.$$
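
A short NumPy illustration of diagonalization (using a made-up example matrix, not the one on the slides): with distinct eigenvalues, using the eigenvectors as the new basis produces a diagonal matrix.

```python
import numpy as np

# Illustrative matrix with distinct eigenvalues (not taken from the slides).
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

eigvals, B = np.linalg.eig(A)     # columns of B are the eigenvectors
D = np.linalg.inv(B) @ A @ B      # similarity transform using the eigenvector basis

print(np.round(D, 10))            # diagonal matrix with the eigenvalues on the diagonal
print(eigvals)                    # eigenvalues 3 and 2 (order may vary)
```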

Slide 24 (6-24): Example

Diagonal Form:

