Lecture 13: Inner Product Spaces & Linear Transformations

Last Time
- Orthonormal Bases: Gram-Schmidt Process
- Mathematical Models and Least Squares Analysis
- Inner Product Space Applications

Elementary Linear Algebra, R. Larsen et al. (5th Edition), TKUEE 翁慶昌, NTUEE, SCC_12_2007

Lecture 13: Inner Product Spaces & L.T.

Today
- Mathematical Models and Least Squares Analysis
- Inner Product Space Applications
- Introduction to Linear Transformations
Reading Assignment: Secs 5.4, 5.5, 6.1, 6.2

Next Time
- The Kernel and Range of a Linear Transformation
- Matrices for Linear Transformations
- Transition Matrix and Similarity
Reading Assignment: Secs

What Have You Actually Learned about Projection So Far?

Mathematical Models and Least Squares Analysis

Let W be a subspace of an inner product space V.
(a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W.
(b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, written W⊥ (read "W perp"):
W⊥ = {v in V : ⟨v, w⟩ = 0 for every w in W}

Direct sum: Let W1 and W2 be two subspaces of R^n. If each vector x in R^n can be uniquely written as a sum of a vector w1 from W1 and a vector w2 from W2, x = w1 + w2, then R^n is the direct sum of W1 and W2, and you can write R^n = W1 ⊕ W2.

Thm 5.13: (Properties of orthogonal subspaces) Let W be a subspace of R^n. Then the following properties are true.
(1) dim(W) + dim(W⊥) = n
(2) R^n = W ⊕ W⊥
(3) (W⊥)⊥ = W

Thm 5.16: (Fundamental subspaces of a matrix) If A is an m×n matrix, then
(1) (R(A))⊥ = N(A^T)
(2) (R(A^T))⊥ = N(A)
(3) R(A) = (N(A^T))⊥
(4) R(A^T) = (N(A))⊥
Here R(A) is the column space of A, R(A^T) is the row space, N(A) is the nullspace of A, and N(A^T) is the nullspace of A^T.
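The relation (R(A^T))⊥ = N(A) can be illustrated numerically. A minimal pure-Python sketch (the rank-1 matrix A and its null-space vector n below are made up for illustration, not taken from the slides):

```python
def dot(u, v):
    """Euclidean dot product of two equal-length vectors."""
    return sum(x * y for x, y in zip(u, v))

# Made-up rank-1 matrix: the second row is twice the first,
# so the null space N(A) is spanned by n = (2, -1).
A = [[1, 2],
     [2, 4]]
n = [2, -1]

# Every row of A (hence every vector in the row space R(A^T))
# is orthogonal to n, illustrating (R(A^T))-perp = N(A).
for row in A:
    assert dot(row, n) == 0
```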

Ex 6: (Fundamental subspaces) Find the four fundamental subspaces of the matrix. (reduced row-echelon form) Sol:

Check:

Ex 3: Let W be a subspace of R^4.
(a) Find a basis for W.
(b) Find a basis for the orthogonal complement of W.
Sol: (reduced row-echelon form)

is a basis for W Notes:

Least Squares Problem

Least squares problem for a system of linear equations Ax = b:
(1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x.
(2) When the system is inconsistent, how do we find the "best possible" solution of the system, that is, the value of x for which the difference between Ax and b is smallest?

Least squares solution: Given a system Ax = b of m linear equations in n unknowns, the least squares problem is to find a vector x in R^n that minimizes ‖b − Ax‖ with respect to the Euclidean inner product on R^n. Such a vector is called a least squares solution of Ax = b.

A^T A x = A^T b (the normal system associated with Ax = b)

Note: The problem of finding the least squares solution of Ax = b is equal to the problem of finding an exact solution of the associated normal system.

Thm: For any linear system Ax = b, the associated normal system A^T A x = A^T b is consistent, and all solutions of the normal system are least squares solutions of Ax = b. Moreover, if W is the column space of A, and x is any least squares solution of Ax = b, then the orthogonal projection of b on W is proj_W b = Ax.

Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least squares solution. This solution is given by x = (A^T A)^(-1) A^T b. Moreover, if W is the column space of A, then the orthogonal projection of b on W is proj_W b = Ax = A (A^T A)^(-1) A^T b.
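The theorem can be tried out on a small system. A minimal pure-Python sketch of solving the normal equations A^T A x = A^T b (the data matrix A and vector b below, fitting a line c0 + c1·t to three points, are made up for illustration and are not the slides' Ex 7, whose matrix was lost):

```python
def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

def solve2(M, v):
    """Cramer's rule for a 2x2 system M x = v."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - v[1] * M[0][1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

A = [[1, 1], [1, 2], [1, 3]]   # hypothetical data matrix (t = 1, 2, 3)
b = [2, 3, 5]                  # hypothetical observations

At = transpose(A)
x = solve2(matmul(At, A), matvec(At, b))   # least squares solution: c0 = 1/3, c1 = 3/2

# The residual r = b - Ax is orthogonal to every column of A,
# i.e. Ax is the orthogonal projection of b on the column space of A.
r = [bi - axi for bi, axi in zip(b, matvec(A, x))]
for col in At:
    assert abs(dotval := sum(c * ri for c, ri in zip(col, r))) < 1e-9
```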

Ex 7: (Solving the normal equations) Find the least squares solution of the following system and find the orthogonal projection of b on the column space of A.

Sol: the associated normal system

the least squares solution of Ax = b the orthogonal projection of b on the column space of A

Keywords in Section 5.4: orthogonal to W: 正交於 W orthogonal complement: 正交補集 direct sum: 直和 projection onto a subspace: 在子空間的投影 fundamental subspaces: 基本子空間 least squares problem: 最小平方問題 normal equations: 一般方程式

Application: Cross Product

Cross product (vector product) of two vectors A and B: A × B.
The result is a vector; its direction is given by the right-hand rule.
The cross product is not commutative: A × B = −(B × A).
The cross product is distributive: A × (B + C) = A × B + A × C.

Parallelogram representation of the vector product: |A × B| = |A||B| sin θ is the area of the parallelogram with sides A and B.

Triple scalar product of vectors: A · (B × C). The dot and the cross may be interchanged: A · (B × C) = (A × B) · C. The result is a scalar.

Triple scalar product: parallelepiped representation. |A · (B × C)| is the volume of the parallelepiped defined by A, B, and C.
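These identities are easy to check numerically. A minimal pure-Python sketch (the three axis-aligned vectors below are made up so the parallelepiped is a 2×3×4 box):

```python
def cross(u, v):
    """Cross product of two 3-vectors."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a, b, c = [2, 0, 0], [0, 3, 0], [0, 0, 4]   # edges of a 2 x 3 x 4 box

assert cross(a, b) == [0, 0, 6]     # right-hand rule: a x b points along +z
assert cross(b, a) == [0, 0, -6]    # not commutative: A x B = -(B x A)

# The dot and the cross may be interchanged; |A . (B x C)| is the box volume:
assert dot(a, cross(b, c)) == dot(cross(a, b), c) == 24   # 2 * 3 * 4
```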

Fourier Approximation

Fourier Approximation

The Fourier series transforms a given periodic function into a superposition of sine and cosine waves. The following equations are used:
f(x) ≈ a0/2 + a1 cos x + … + an cos nx + b1 sin x + … + bn sin nx
with coefficients
a_j = (1/π) ∫ f(x) cos(jx) dx,  b_j = (1/π) ∫ f(x) sin(jx) dx  (integrals over [0, 2π])
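The coefficient integrals can be approximated numerically. A minimal pure-Python sketch (the test function f(x) = x and the sample count N are made-up choices for illustration):

```python
import math

def fourier_coeffs(f, n, N=4000):
    """Approximate the Fourier coefficients a_j, b_j (j = 0..n) of f
    on [0, 2*pi) using a left Riemann sum with N sample points."""
    dx = 2 * math.pi / N
    xs = [k * dx for k in range(N)]
    a = [sum(f(x) * math.cos(j * x) for x in xs) * dx / math.pi
         for j in range(n + 1)]
    b = [sum(f(x) * math.sin(j * x) for x in xs) * dx / math.pi
         for j in range(n + 1)]
    return a, b

a, b = fourier_coeffs(lambda x: x, 3)
# For f(x) = x on [0, 2*pi) the exact coefficients are a_j = 0 for j >= 1
# and b_j = -2/j, so b[1] is close to -2 and b[2] is close to -1.
```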

Today
- Mathematical Models and Least Squares Analysis (Cont.)
- Inner Product Space Applications
- Introduction to Linear Transformations
- The Kernel and Range of a Linear Transformation

6.1 Introduction to Linear Transformations

A function T that maps a vector space V into a vector space W (written T: V → W):
V: the domain of T
W: the codomain of T

Image of v under T: If v is in V and w is in W such that T(v) = w, then w is called the image of v under T.
The range of T: the set of all images of vectors in V.
The preimage of w: the set of all v in V such that T(v) = w.

Ex 1: (A function from R^2 into R^2) Define T: R^2 → R^2 by T(v1, v2) = (v1 − v2, v1 + 2v2).
(a) Find the image of v = (−1, 2).
(b) Find the preimage of w = (−1, 11).
Sol:
(a) T(−1, 2) = (−1 − 2, −1 + 2·2) = (−3, 3).
(b) T(v1, v2) = (−1, 11) gives v1 − v2 = −1 and v1 + 2v2 = 11, so v1 = 3 and v2 = 4. Thus {(3, 4)} is the preimage of w = (−1, 11).
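The image and preimage computations can be sketched in a few lines of Python. This assumes the slide's Ex 1 map is T(v1, v2) = (v1 − v2, v1 + 2v2), which is consistent with the stated preimage {(3, 4)} of (−1, 11):

```python
def T(v):
    """Assumed Ex 1 map: T(v1, v2) = (v1 - v2, v1 + 2*v2)."""
    v1, v2 = v
    return (v1 - v2, v1 + 2 * v2)

assert T((-1, 2)) == (-3, 3)          # image of v = (-1, 2)

# Preimage of w = (-1, 11): solve v1 - v2 = -1, v1 + 2*v2 = 11
# by Cramer's rule on the coefficient matrix [[1, -1], [1, 2]].
w = (-1, 11)
det = 1 * 2 - (-1) * 1                # = 3
v1 = (w[0] * 2 - (-1) * w[1]) / det   # = 3
v2 = (1 * w[1] - 1 * w[0]) / det      # = 4
assert (v1, v2) == (3.0, 4.0)
assert T((3, 4)) == (-1, 11)          # check: (3, 4) really maps to w
```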

Linear Transformation (L.T.): A function T: V → W is called a linear transformation of V into W if the following two properties hold for all u, v in V and every scalar c:
(1) T(u + v) = T(u) + T(v)
(2) T(cu) = cT(u)

Notes:
(1) A linear transformation is said to be operation preserving:
    addition in V corresponds to addition in W: T(u + v) = T(u) + T(v)
    scalar multiplication in V corresponds to scalar multiplication in W: T(cu) = cT(u)
(2) A linear transformation from a vector space into itself (T: V → V) is called a linear operator.

Ex 2: (Verifying a linear transformation T from R 2 into R 2 ) Pf:

Therefore, T is a linear transformation

Ex 3: (Functions that are not linear transformations)

Notes: Two uses of the term "linear".
(1) f(x) = x + 1 is called a linear function because its graph is a line.
(2) f(x) = x + 1 is not a linear transformation from the vector space R into R because it preserves neither vector addition nor scalar multiplication: f(x + y) = x + y + 1 ≠ f(x) + f(y), and f(cx) = cx + 1 ≠ cf(x).

Zero transformation: T: V → W with T(v) = 0 for every v in V.
Identity transformation: T: V → V with T(v) = v for every v in V.

Thm 6.1: (Properties of linear transformations) If T: V → W is a linear transformation, then
(1) T(0) = 0
(2) T(−v) = −T(v)
(3) T(u − v) = T(u) − T(v)
(4) If v = c1v1 + c2v2 + … + cnvn, then T(v) = c1T(v1) + c2T(v2) + … + cnT(vn)

Ex 4: (Linear transformations and bases) Let T: R^3 → R^3 be a linear transformation for which the images of the standard basis vectors are given. Find T(2, 3, −2).
Sol: Write (2, 3, −2) = 2e1 + 3e2 − 2e3; since T is a L.T., T(2, 3, −2) = 2T(e1) + 3T(e2) − 2T(e3).

Ex 5: (A linear transformation defined by a matrix) The function T is defined as T(v) = Av for an m×n matrix A. Show that T is a linear transformation.
Sol: T(u + v) = A(u + v) = Au + Av = T(u) + T(v) (vector addition); T(cv) = A(cv) = c(Av) = cT(v) (scalar multiplication).

Thm 6.2: (The linear transformation given by a matrix) Let A be an m×n matrix. The function T defined by T(v) = Av is a linear transformation from R^n into R^m.
Note: here v is written as an n×1 column matrix, so Av is an m×1 column matrix; an m×n matrix thus maps R^n into R^m.
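Thm 6.2 can be checked directly on sample vectors. A minimal pure-Python sketch (the 2×3 matrix A and the vectors u, v are made up for illustration):

```python
def matvec(A, x):
    """Matrix-vector product A x, i.e. the transformation T(x) = A x."""
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

A = [[1, 0, 2],
     [0, 3, -1]]            # made-up matrix: T maps R^3 into R^2

u, v, c = [1, 2, 3], [4, 5, 6], 7
add = lambda p, q: [pi + qi for pi, qi in zip(p, q)]
scale = lambda s, p: [s * pi for pi in p]

# T preserves addition and scalar multiplication:
assert matvec(A, add(u, v)) == add(matvec(A, u), matvec(A, v))   # T(u+v) = T(u)+T(v)
assert matvec(A, scale(c, u)) == scale(c, matvec(A, u))          # T(cu) = cT(u)
```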

Ex 7: (Rotation in the plane) Show that the L.T. T: R^2 → R^2 given by the matrix
A = [ cos θ  -sin θ
      sin θ   cos θ ]
has the property that it rotates every vector in R^2 counterclockwise about the origin through the angle θ.
Sol: Write v = (x, y) in polar coordinates, x = r cos α and y = r sin α, where
r: the length of v
α: the angle from the positive x-axis counterclockwise to the vector v

Then T(v) = (r cos(α + θ), r sin(α + θ)), where
r: the length of T(v)
α + θ: the angle from the positive x-axis counterclockwise to the vector T(v)
Thus, T(v) is the vector that results from rotating the vector v counterclockwise through the angle θ.
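The rotation matrix can be verified numerically: it sends e1 = (1, 0) to (cos θ, sin θ) and preserves lengths. A minimal sketch (the test angles and vectors are made-up choices):

```python
import math

def rotate(theta, v):
    """Apply the rotation matrix [[cos t, -sin t], [sin t, cos t]] to v."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

# Rotating e1 by 90 degrees gives (0, 1), up to floating-point error:
x, y = rotate(math.pi / 2, (1, 0))
assert abs(x) < 1e-12 and abs(y - 1) < 1e-12

# Rotation preserves length: |T(v)| = |v| = r.
x, y = rotate(0.7, (3, 4))
assert abs(math.hypot(x, y) - 5) < 1e-12
```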

Ex 8: (A projection in R^3) The linear transformation T: R^3 → R^3 given by
T(x, y, z) = (x, y, 0)
is called a projection in R^3: it maps each vector onto the xy-plane.

Ex 9: (A linear transformation from M(m×n) into M(n×m)) Let T(A) = A^T, the function that maps an m×n matrix A to its transpose. Show that T is a linear transformation.
Sol: T(A + B) = (A + B)^T = A^T + B^T = T(A) + T(B), and T(cA) = (cA)^T = cA^T = cT(A). Therefore, T is a linear transformation from M(m×n) into M(n×m).

Keywords in Section 6.1: function: 函數 domain: 論域 codomain: 對應論域 image of v under T: 在 T 映射下 v 的像 range of T: T 的值域 preimage of w: w 的反像 linear transformation: 線性轉換 linear operator: 線性運算子 zero transformation: 零轉換 identity transformation: 相等轉換

Today
- Mathematical Models and Least Squares Analysis (Cont.)
- Inner Product Space Applications
- Introduction to Linear Transformations
- The Kernel and Range of a Linear Transformation

6.2 The Kernel and Range of a Linear Transformation

Kernel of a linear transformation T: Let T: V → W be a linear transformation. Then the set of all vectors v in V that satisfy T(v) = 0 is called the kernel of T and is denoted by ker(T).

Ex 1: (Finding the kernel of a linear transformation)
Sol:

Ex 2: (The kernel of the zero and identity transformations)
(a) T(v) = 0 (the zero transformation T: V → W): every vector maps to 0, so ker(T) = V.
(b) T(v) = v (the identity transformation T: V → V): only v = 0 maps to 0, so ker(T) = {0}.

Ex 3: (Finding the kernel of a linear transformation)
Sol:
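Finding a kernel amounts to solving T(x) = 0. A minimal pure-Python sketch (the matrix below is made up, not the slides' Ex 3): for T: R^3 → R^2 with T(x) = Ax, back-substitution on Ax = 0 (x2 = -x3, x1 = -x2 = x3) gives ker(T) = span{(1, -1, 1)}.

```python
def matvec(A, x):
    """Matrix-vector product A x."""
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

# Made-up matrix: x1 + x2 = 0 and x2 + x3 = 0 force x = t * (1, -1, 1).
A = [[1, 1, 0],
     [0, 1, 1]]
k = [1, -1, 1]                # spanning vector of ker(T)

assert matvec(A, k) == [0, 0]
# Every scalar multiple of k is also in the kernel:
for t in (-2, 3):
    assert matvec(A, [t * ki for ki in k]) == [0, 0]
```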