Linear Transformations

Hopfield Network Questions The network output is repeatedly multiplied by the weight matrix W. What is the effect of this repeated operation? Will the output converge, go to infinity, or oscillate? In this chapter we want to investigate matrix multiplication, which represents a general linear transformation.
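Whether the repeated operation converges, blows up, or oscillates is governed by the eigenvalues of W, which is where this chapter is headed. A minimal numerical sketch (the weight matrix below is a made-up illustration, not a trained Hopfield network):

```python
import numpy as np

# Hypothetical 2x2 weight matrix, chosen for illustration.  Its eigenvalues
# (0.9 and 0.5) both have magnitude less than 1, so repeated multiplication
# shrinks every output toward the zero vector.
W = np.array([[0.7, 0.2],
              [0.2, 0.7]])

x = np.array([1.0, -1.0])
for _ in range(50):
    x = W @ x          # the network output is repeatedly multiplied by W

# After 50 iterations the output has (almost) converged to zero.
```

With an eigenvalue of magnitude greater than 1 the same loop would diverge instead, which is why the eigenvalue analysis later in the chapter answers the question above.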

Linear Transformations A transformation consists of three parts: 1. A set of elements X = {xi}, called the domain, 2. A set of elements Y = {yi}, called the range, and 3. A rule relating each xi ∈ X to an element yi ∈ Y. A transformation A is linear if: 1. For all x1, x2 ∈ X, A(x1 + x2) = A(x1) + A(x2), 2. For all x ∈ X, a ∈ ℝ, A(a x) = a A(x).
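The two conditions can be spot-checked numerically on random inputs. A sketch (the two maps below are illustrative examples, not from the slides):

```python
import numpy as np

def is_linear(A, trials=100):
    """Numerically spot-check the two linearity conditions for a map A on R^2."""
    rng = np.random.default_rng(0)
    for _ in range(trials):
        x1, x2 = rng.normal(size=2), rng.normal(size=2)
        a = rng.normal()
        if not np.allclose(A(x1 + x2), A(x1) + A(x2)):   # condition 1: additivity
            return False
        if not np.allclose(A(a * x1), a * A(x1)):        # condition 2: homogeneity
            return False
    return True

M = np.array([[1.0, 2.0],
              [0.0, 1.0]])
matrix_map = lambda x: M @ x      # matrix multiplication satisfies both conditions
shift_map = lambda x: x + 1.0     # translation violates additivity, so it is not linear
```

Random trials can only refute linearity, not prove it, but they catch nonlinear maps such as the translation immediately.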

Example - Rotation Is rotation linear? 1. Rotating the sum of two vectors gives the same result as adding the two rotated vectors: A(x1 + x2) = A(x1) + A(x2). 2. Rotating a scaled vector gives the same result as scaling the rotated vector: A(a x) = a A(x). Both conditions hold, so rotation is a linear transformation.

Matrix Representation - (1) Any linear transformation between two finite-dimensional vector spaces can be represented by matrix multiplication. Let {v1, v2, ..., vn} be a basis for X, and let {u1, u2, ..., um} be a basis for Y. Let A: X → Y be such a transformation.

Matrix Representation - (2) Expand x in terms of the basis for X: x = Σj xj vj. Since A is a linear operator, A(x) = A(Σj xj vj) = Σj xj A(vj). Since the ui are a basis for Y, each transformed basis vector can be expanded as A(vj) = Σi aij ui. (The coefficients aij will make up the matrix representation of the transformation.)

Matrix Representation - (3) Substituting, A(x) = Σi (Σj aij xj) ui. Writing y = Σi yi ui and matching coefficients (because the ui are independent) gives Σj aij xj = yi for i = 1, ..., m, or in matrix form

[ a11 a12 ... a1n ] [ x1 ]   [ y1 ]
[ a21 a22 ... a2n ] [ x2 ] = [ y2 ]
[  .   .       .  ] [ .. ]   [ .. ]
[ am1 am2 ... amn ] [ xn ]   [ ym ]

This is equivalent to matrix multiplication: Ax = y.

Summary A linear transformation can be represented by matrix multiplication. To find the matrix which represents the transformation we must transform each basis vector for the domain and then expand the result in terms of the basis vectors of the range. Each of these equations gives us one column of the matrix.
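The recipe in the summary, transform each basis vector and collect the results as columns, can be written directly in code. A sketch using a rotation by 90° as the example map (the helper name `matrix_of` is mine, not from the slides):

```python
import numpy as np

def matrix_of(A, n):
    """Build the matrix of a linear map A on R^n in the standard basis:
    column j is A applied to the j-th standard basis vector."""
    cols = []
    for j in range(n):
        e = np.zeros(n)
        e[j] = 1.0                 # j-th standard basis vector
        cols.append(A(e))          # its image gives one column of the matrix
    return np.column_stack(cols)

# A 2-D rotation by 90 degrees, defined as a function rather than a matrix.
rot90 = lambda x: np.array([-x[1], x[0]])

M = matrix_of(rot90, 2)            # recovers [[0, -1], [1, 0]]
```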

Example - (1) Stand a deck of playing cards on edge so that you are looking at the deck sideways. Draw a vector x on the edge of the deck. Now “skew” the deck by an angle θ, and note the new vector y = A(x). What is the matrix of this transformation in terms of the standard basis set?

Example - (2) To find the matrix we need to transform each of the basis vectors. We will use the standard basis vectors for both the domain and the range.

Example - (3) We begin with s1: if we draw a line on the bottom card and then skew the deck, the line will not change, so A(s1) = s1 = 1·s1 + 0·s2. This gives us the first column of the matrix: [1, 0]^T.

Example - (4) Next, we skew s2: the top of the deck shifts sideways by tan θ while the height is unchanged, so A(s2) = tan(θ)·s1 + 1·s2. This gives us the second column of the matrix: [tan θ, 1]^T.

Example - (5) The matrix of the transformation is:

A = [ 1  tan θ ]
    [ 0    1   ]
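The two columns can be verified by applying the skew matrix to the standard basis vectors. A quick check, using a 30° skew picked just for illustration:

```python
import numpy as np

theta = np.pi / 6                 # a 30-degree skew, chosen for illustration
A = np.array([[1.0, np.tan(theta)],
              [0.0, 1.0]])

s1 = np.array([1.0, 0.0])         # basis vector along the bottom of the deck
s2 = np.array([0.0, 1.0])         # basis vector up the side of the deck

y1 = A @ s1                       # skewing leaves s1 unchanged
y2 = A @ s2                       # s2 picks up a horizontal component tan(theta)
```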

Change of Basis Consider the linear transformation A: X → Y. Let {v1, v2, ..., vn} be a basis for X, and let {u1, u2, ..., um} be a basis for Y. The matrix representation is Ax = y, where A = [aij] is the m×n matrix of coefficients found above.

New Basis Sets Now let’s consider different basis sets. Let {t1, t2, ..., tn} be a basis for X, and let {w1, w2, ..., wm} be a basis for Y. The new matrix representation is A′x′ = y′, where A′ = [a′ij] and x′, y′ hold the coefficients of x and y relative to the new bases.

How are A and A′ related? Expand ti in terms of the original basis vectors for X: ti = Σj tji vj, so ti has the coefficient (column) vector [t1i, t2i, ..., tni]^T. Expand wi in terms of the original basis vectors for Y: wi = Σj wji uj, with coefficient vector [w1i, w2i, ..., wmi]^T.

How are A and A′ related? Collect these coefficient vectors as the columns of two matrices, Bt = [t1 t2 ... tn] and Bw = [w1 w2 ... wm]. Then the two representations are related by A′ = Bw⁻¹ A Bt. When the same basis is used for the domain and the range (Bw = Bt = B), this is the similarity transform A′ = B⁻¹AB.
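A similarity transform changes the matrix but not the underlying transformation, so quantities such as the eigenvalues must be unchanged. A sketch using the 45° skew matrix, with a made-up basis matrix B (the specific B below is illustrative, not from the slides):

```python
import numpy as np

theta = np.pi / 4                     # the 45-degree skew from the example
A = np.array([[1.0, np.tan(theta)],
              [0.0, 1.0]])            # representation in the standard basis

# Hypothetical new basis vectors (as columns), used for both domain and range.
B = np.array([[1.0, 1.0],
              [0.5, 1.0]])

A_prime = np.linalg.inv(B) @ A @ B    # similarity transform: A' = B^{-1} A B

# A and A' represent the same transformation, so their eigenvalues agree.
lam = np.sort(np.linalg.eigvals(A))
lam_prime = np.sort(np.linalg.eigvals(A_prime))
```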

Example - (1) Take the skewing problem described previously, and find the new matrix representation using the basis set {t1, t2}. (The basis vectors t1 and t2 are given graphically in the original slide; the same basis is used for the domain and the range.)

Example - (2) For θ = 45°, tan θ = 1, so the standard-basis matrix is A = [1 1; 0 1]; substituting A and the basis matrix B into A′ = B⁻¹AB gives the new representation A′.

Example - (3) Try a test vector: transforming its new-basis representation by A′ gives the same result (expressed in the new basis) as transforming the vector itself by A. Check using reciprocal basis vectors: the coefficients of a vector relative to {t1, t2} are obtained with the reciprocal basis vectors, i.e. the rows of B⁻¹.

Eigenvalues and Eigenvectors Let A: X → X be a linear transformation. Those vectors z ∈ X, which are not equal to zero, and those scalars λ which satisfy A(z) = λz are called eigenvectors and eigenvalues, respectively. Can you find an eigenvector for this transformation?

Computing the Eigenvalues Skewing example (45°): A = [1 1; 0 1], so det(A − λI) = (1 − λ)² = 0 gives λ1 = λ2 = 1. Then (A − λI)z = 0 becomes [0 1; 0 0][z11; z21] = 0, which forces z21 = 0, so z = [z11; 0] and we can take z = [1; 0]. For this transformation there is only one eigenvector direction.
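This repeated eigenvalue with a single eigenvector direction can be confirmed numerically:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])       # the 45-degree skew (tan 45° = 1)

lam, vecs = np.linalg.eig(A)
# det(A - lambda*I) = (1 - lambda)^2 = 0, so both eigenvalues are 1.
# (A - I)z = 0 forces the second component of z to be zero, so every
# eigenvector is a multiple of [1, 0]: a single eigenvector direction.
```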

Diagonalization Perform a change of basis (similarity transformation) using the eigenvectors as the basis vectors. If the eigenvalues are distinct, the new matrix will be diagonal. With the eigenvectors z1, z2, ..., zn as the columns of B = [z1 z2 ... zn],

B⁻¹AB = diag(λ1, λ2, ..., λn),

with the eigenvalues appearing on the diagonal.
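A sketch of the diagonalization step, using a small symmetric matrix chosen for illustration (its eigenvalues, 3 and 1, are distinct):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # example matrix with distinct eigenvalues 3 and 1

lam, B = np.linalg.eig(A)        # columns of B are the eigenvectors of A

D = np.linalg.inv(B) @ A @ B     # change of basis using the eigenvectors
# D is diagonal, with the eigenvalues of A on its diagonal.
```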

Example Diagonal Form: the diagonal form is obtained by applying B⁻¹AB to the matrix of the transformation, with its eigenvectors as the columns of B.