Presentation transcript:

6-1 Linear Transformations

6-2 Hopfield Network Questions
The network output is repeatedly multiplied by the weight matrix W. What is the effect of this repeated operation? Will the output converge, go to infinity, or oscillate? In this chapter we want to investigate matrix multiplication, which represents a general linear transformation.
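A small numerical sketch can make the question concrete (the weight matrix W and the starting vector here are illustrative, not taken from the slides): repeatedly multiplying by W stretches the output along some directions and shrinks it along others, and the eigenvalues of W decide which.

```python
import numpy as np

# Illustrative weight matrix (not from the slides): symmetric, with one
# eigenvalue larger than 1 and one smaller than 1 in magnitude.
W = np.array([[0.9, 0.3],
              [0.3, 0.9]])

y = np.array([1.0, -0.5])   # arbitrary initial output vector
for k in range(50):
    y = W @ y               # repeated multiplication by W

eigvals, eigvecs = np.linalg.eig(W)
print("eigenvalues of W:", eigvals)      # 1.2 and 0.6
print("y after 50 iterations:", y)
# Components along eigenvectors with |lambda| > 1 grow; components with
# |lambda| < 1 decay. Whether the output converges, blows up, or oscillates
# is governed by the eigenvalues of W.
```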

6-3 Linear Transformations
A transformation consists of three parts:
1. A set of elements $X = \{x_i\}$, called the domain,
2. A set of elements $Y = \{y_i\}$, called the range, and
3. A rule relating each $x_i \in X$ to an element $y_i \in Y$.
A transformation $A$ is linear if:
1. For all $x_1, x_2 \in X$: $A(x_1 + x_2) = A(x_1) + A(x_2)$,
2. For all $x \in X$ and all scalars $a \in \mathbb{R}$: $A(a x) = a A(x)$.

6-4 Example - Rotation
Is rotation about the origin linear? Check the two conditions from the definition: 1. rotating the sum of two vectors gives the same result as summing the rotated vectors; 2. rotating a scaled vector gives the same result as scaling the rotated vector. Both conditions hold (the original slide shows this graphically), so rotation is linear.
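A quick numerical check of the two conditions for a rotation (this snippet is illustrative and not part of the slides):

```python
import numpy as np

def rotate(x, theta):
    """Rotate a 2-D vector x counterclockwise by angle theta (radians)."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ x

theta = 0.6
x1 = np.array([2.0, -1.0])
x2 = np.array([0.5,  3.0])
a = 2.5

# Condition 1: A(x1 + x2) == A(x1) + A(x2)
print(np.allclose(rotate(x1 + x2, theta), rotate(x1, theta) + rotate(x2, theta)))  # True
# Condition 2: A(a x) == a A(x)
print(np.allclose(rotate(a * x1, theta), a * rotate(x1, theta)))                   # True
```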

6-5 Matrix Representation - (1)
Any linear transformation between two finite-dimensional vector spaces can be represented by matrix multiplication. Let $\{v_1, v_2, \ldots, v_n\}$ be a basis for $X$, and let $\{u_1, u_2, \ldots, u_m\}$ be a basis for $Y$. Let $A: X \to Y$, and expand the vectors in terms of these bases: $x = \sum_{j=1}^{n} x_j v_j$ and $y = A(x) = \sum_{i=1}^{m} y_i u_i$.

6-6 Matrix Representation - (2)
Since $A$ is a linear operator, $A(x) = A\!\left(\sum_{j=1}^{n} x_j v_j\right) = \sum_{j=1}^{n} x_j A(v_j)$. Since the $u_i$ form a basis for $Y$, each transformed basis vector can be expanded as $A(v_j) = \sum_{i=1}^{m} a_{ij} u_i$. (The coefficients $a_{ij}$ will make up the matrix representation of the transformation.) Substituting, $\sum_{i=1}^{m} y_i u_i = \sum_{j=1}^{n} x_j \sum_{i=1}^{m} a_{ij} u_i$.

6-7 Matrix Representation - (3)
Because the $u_i$ are independent, the coefficients of each $u_i$ on the two sides must be equal: $y_i = \sum_{j=1}^{n} a_{ij} x_j$. In matrix form,
$$\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{bmatrix}$$
Thus applying the transformation is equivalent to matrix multiplication.

6-8 Summary
A linear transformation can be represented by matrix multiplication. To find the matrix that represents the transformation, transform each basis vector of the domain and expand the result in terms of the basis vectors of the range. Each of these expansions gives one column of the matrix.
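As a sketch of this recipe (the transformation and bases below are illustrative, not from the slides), the matrix can be assembled column by column from the images of the domain basis vectors:

```python
import numpy as np

def matrix_of(transform, domain_basis, range_basis):
    """Build the matrix of a linear transformation.

    Column j holds the coordinates of transform(v_j) with respect to the
    range basis: solve  B_u @ a_j = transform(v_j)  for a_j, where B_u has
    the range basis vectors as columns.
    """
    B_u = np.column_stack(range_basis)
    cols = [np.linalg.solve(B_u, transform(v)) for v in domain_basis]
    return np.column_stack(cols)

# Illustrative transformation: projection of (x, y) onto the line y = x.
T = lambda v: np.array([v[0] + v[1], v[0] + v[1]]) / 2.0

std_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
A = matrix_of(T, std_basis, std_basis)
print(A)                       # [[0.5 0.5], [0.5 0.5]]
print(np.allclose(A @ np.array([3.0, 1.0]), T(np.array([3.0, 1.0]))))  # True
```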

6-9 Example - (1)
Stand a deck of playing cards on edge so that you are looking at the deck sideways. Draw a vector $x$ on the edge of the deck. Now "skew" the deck by an angle $\theta$, as shown in the slide figure, and note the new vector $y = A(x)$. What is the matrix of this transformation in terms of the standard basis set?

6-10 Example - (2)
To find the matrix we need to transform each of the basis vectors. We will use the standard basis vectors $\{s_1, s_2\}$ for both the domain and the range.

6-11 Example - (3)
We begin with $s_1$. If we draw a line along the bottom card and then skew the deck, the line does not change, so $A(s_1) = s_1 = 1 \cdot s_1 + 0 \cdot s_2$. This gives us the first column of the matrix, $[1, 0]^T$.

6-12 Example - (4)
Next, we skew $s_2$: the tilted vector, expanded in terms of $s_1$ and $s_2$, gives us the second column of the matrix.

6-13 Example - (5)
The matrix of the transformation is built from these two expansions: the coordinates of $A(s_1)$ form the first column and the coordinates of $A(s_2)$ form the second column.
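For concreteness, here is a sketch of the skewing transformation modelled as a horizontal shear, $A(s_1) = s_1$ and $A(s_2) = \tan\theta \, s_1 + s_2$. The slides define the skew graphically, so this particular parametrization is an assumption; only the general recipe (columns = transformed basis vectors) is taken from the slides.

```python
import numpy as np

def skew_matrix(theta):
    """Horizontal shear: s1 is unchanged, s2 picks up a horizontal
    component proportional to the skew angle. This parametrization is an
    assumption -- the slides define the skew only graphically."""
    return np.array([[1.0, np.tan(theta)],
                     [0.0, 1.0]])

A = skew_matrix(np.pi / 4)     # 45-degree skew
print(A)                       # [[1. 1.], [0. 1.]]
x = np.array([1.0, 2.0])
print(A @ x)                   # the skewed vector y = A(x) -> [3. 2.]
```

With this choice the first column is $[1, 0]^T$, consistent with the line on the bottom card not moving.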

6-14 Change of Basis
Consider the linear transformation $A: X \to Y$. Let $\{v_1, v_2, \ldots, v_n\}$ be a basis for $X$, and let $\{u_1, u_2, \ldots, u_m\}$ be a basis for $Y$. The matrix representation is
$$\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{bmatrix}$$

6-15 New Basis Sets
Now let's consider different basis sets. Let $\{t_1, t_2, \ldots, t_n\}$ be a basis for $X$, and let $\{w_1, w_2, \ldots, w_m\}$ be a basis for $Y$. The new matrix representation is
$$\begin{bmatrix} a'_{11} & a'_{12} & \cdots & a'_{1n} \\ a'_{21} & a'_{22} & \cdots & a'_{2n} \\ \vdots & \vdots & & \vdots \\ a'_{m1} & a'_{m2} & \cdots & a'_{mn} \end{bmatrix} \begin{bmatrix} x'_1 \\ x'_2 \\ \vdots \\ x'_n \end{bmatrix} = \begin{bmatrix} y'_1 \\ y'_2 \\ \vdots \\ y'_m \end{bmatrix}$$

6-16 How are A and A' related?
Expand $t_i$ in terms of the original basis vectors for $X$: $t_i = \sum_{j=1}^{n} t_{ji} v_j$, with coefficient column $[t_{1i}, t_{2i}, \ldots, t_{ni}]^T$. Expand $w_i$ in terms of the original basis vectors for $Y$: $w_i = \sum_{j=1}^{m} w_{ji} u_j$, with coefficient column $[w_{1i}, w_{2i}, \ldots, w_{mi}]^T$.

6-17 How are A and A' related?
Collect these coefficient columns into the matrices $B_t = [t_1 \; t_2 \; \cdots \; t_n]$ and $B_w = [w_1 \; w_2 \; \cdots \; w_m]$. The two representations are then related by the similarity transform $A' = B_w^{-1} A B_t$.
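A sketch of the change-of-basis computation (the matrix A and the new basis are illustrative, not the slides' numbers), using the same basis for domain and range so the similarity transform reduces to $A' = B^{-1} A B$:

```python
import numpy as np

# Illustrative numbers: a transformation A given in the standard basis,
# and a new basis {t1, t2} used for both the domain and the range.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
t1 = np.array([1.0, 1.0])
t2 = np.array([-1.0, 1.0])
B = np.column_stack([t1, t2])

A_prime = np.linalg.solve(B, A @ B)     # same as inv(B) @ A @ B
print(A_prime)

# Consistency check: transforming coordinates commutes with applying A.
x_prime = np.array([0.7, -0.2])         # coordinates of x in the new basis
x = B @ x_prime                         # the same vector in the old basis
print(np.allclose(B @ (A_prime @ x_prime), A @ x))   # True
```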

6-18 Example - (1)
Take the skewing problem described previously, and find the new matrix representation using a new basis set $\{t_1, t_2\}$. (The same basis is used for the domain and the range, so $A' = B_t^{-1} A B_t$.)

6-19 Example - (2)
For $\theta = 45°$, evaluate the similarity transform $A' = B_t^{-1} A B_t$ numerically.

6-20 Example - (3)
As a check, try a test vector: transform it with $A$ in the original basis, and verify that the same vector is obtained by converting to the new basis, applying $A'$, and converting back. The conversion into the new basis uses the reciprocal basis vectors (the rows of $B_t^{-1}$).
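Continuing the illustrative numbers from the previous sketch, the test-vector check with reciprocal basis vectors looks like this:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.column_stack([[1.0, 1.0], [-1.0, 1.0]])   # columns are t1, t2
A_prime = np.linalg.solve(B, A @ B)

# Reciprocal basis vectors: the rows of B^-1, so that r_i . t_j = delta_ij.
R = np.linalg.inv(B)

x = np.array([2.0, 1.0])        # test vector in the original basis
x_prime = R @ x                 # its coordinates in the new basis
y_prime = A_prime @ x_prime     # apply the transformation in the new basis
y = B @ y_prime                 # expand the result back in the old basis

print(np.allclose(y, A @ x))    # True: both routes give the same vector
```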

6-21 Eigenvalues and Eigenvectors
Let $A: X \to X$ be a linear transformation. Those vectors $z \in X$ that are not equal to zero, and those scalars $\lambda$, which satisfy $A(z) = \lambda z$, are called eigenvectors and eigenvalues, respectively. Can you find an eigenvector for this transformation?

6-22 Computing the Eigenvalues
The eigenvalues are the roots of the characteristic equation $\det(A - \lambda I) = 0$; the eigenvectors then satisfy $(A - \lambda I) z = 0$. For the skewing example ($\theta = 45°$), the second component of any eigenvector must be zero: $z_{21} = 0$, so $z = [z_{11}, 0]^T = z_{11}[1, 0]^T$, with eigenvalue $\lambda = 1$ (since $A(s_1) = s_1$). For this transformation there is only one (linearly independent) eigenvector.
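Numerically, with the shear form assumed earlier for the 45° skew, a standard eigenvalue routine confirms the repeated eigenvalue and the single eigenvector direction:

```python
import numpy as np

A = np.array([[1.0, 1.0],       # 45-degree skew, using the shear form
              [0.0, 1.0]])      # assumed in the earlier sketch

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                  # [1. 1.]  -- a repeated eigenvalue
print(eigvecs)                  # both computed columns lie along [1, 0]
# The eigenvalue 1 is repeated but its eigenspace is one-dimensional,
# spanned by [1, 0]: the matrix is defective and cannot be diagonalized.
```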

6-23 Diagonalization
Perform a change of basis (similarity transformation) using the eigenvectors as the basis vectors. If the eigenvalues are distinct, the new matrix will be diagonal. With the eigenvectors as columns, $B = [z_1 \; z_2 \; \cdots \; z_n]$, and eigenvalues $\{\lambda_1, \lambda_2, \ldots, \lambda_n\}$,
$$B^{-1} A B = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}$$

6-24 Example - Diagonal Form
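A short sketch of the diagonalization procedure (the matrix here is illustrative, not the example worked on the slide):

```python
import numpy as np

# Illustrative matrix with distinct eigenvalues (not the slide's example).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, B = np.linalg.eig(A)        # columns of B are the eigenvectors
D = np.linalg.solve(B, A @ B)        # B^-1 A B

print(np.round(D, 10))               # diagonal, with eigenvalues 2 and 3
print(np.allclose(D, np.diag(eigvals)))   # True
```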