Chapter 5 Orthogonality


Outline
Scalar Product in Rn
Orthogonal Subspaces
Least Squares Problems
Inner Product Spaces
Orthogonal Sets
The Gram-Schmidt Orthogonalization Process

Scalar product in Rn

Def: Let x and y be vectors in either R2 or R3. The distance between x and y is defined to be the number ||x − y||.

Theorem 5.1.1: If x and y are two nonzero vectors in either R2 or R3 and θ is the angle between them, then x^T y = ||x|| ||y|| cos θ.

Proof: By the law of cosines, ||y − x||^2 = ||x||^2 + ||y||^2 − 2 ||x|| ||y|| cos θ. Expanding the left-hand side gives ||y − x||^2 = ||x||^2 − 2 x^T y + ||y||^2, and comparing the two expressions yields x^T y = ||x|| ||y|| cos θ.

Corollary 5.1.2 (Cauchy-Schwarz Inequality): If x and y are vectors in either R2 or R3, then |x^T y| ≤ ||x|| ||y||, with equality holding if and only if one of the vectors is 0 or one vector is a multiple of the other.

Note: If θ is the angle between two nonzero vectors x and y, then cos θ = x^T y / (||x|| ||y||). Thus |cos θ| ≤ 1, consistent with the Cauchy-Schwarz Inequality.

Def: The vectors x and y in R2 (or R3) are said to be orthogonal if x^T y = 0.

Scalar and Vector Projections: for v ≠ 0, the scalar projection of u onto v is α = u^T v / ||v||, and the vector projection of u onto v is p = (u^T v / v^T v) v = α (v / ||v||).

Example: Find the point on a given line through the origin that is closest to the point (1, 4). Sol: Take a nonzero vector w that lies on the line. The desired point is the vector projection of (1, 4)^T onto w, since the residual is then orthogonal to the line.
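The specific line in the slide did not survive the transcript, so the following sketch assumes the line spanned by w = (1, 3)^T; the projection formula, not the particular numbers, is the point of the example.

    import numpy as np

    # Hypothetical direction vector for the line (the original is not recoverable).
    w = np.array([1.0, 3.0])
    b = np.array([1.0, 4.0])          # the given point

    p = (b @ w) / (w @ w) * w         # vector projection of b onto w
    print("closest point on the line:", p)
    print("residual orthogonal to the line:", np.isclose((b - p) @ w, 0.0))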

Example: Find the equation of the plane passing through a given point P0 and normal to a given vector N. Sol: A point P lies on the plane exactly when N^T (P − P0) = 0; writing this out gives the equation of the plane.

Example: Find the distance from a given point Q to a plane. Sol: A normal vector N to the plane can be read off from the coefficients of the plane's equation. The distance is the absolute value of the scalar projection of the vector from any point P0 on the plane to Q onto N, i.e., |N^T (Q − P0)| / ||N||.

Application 1: Information Retrieval Revisited. Table 1 lists the frequency of each of the keywords determinants, eigenvalues, linear, matrices, numerical, orthogonality, spaces, systems, transformations, and vector in each of the eight modules M1 through M8.

Application 1: Information Retrieval Revisited. If A is the matrix corresponding to Table 1, then the columns of the database matrix Q are determined by normalizing the columns of A to unit length. To do a search for the keywords orthogonality, spaces, and vector, we form a unit search vector x whose entries are all zero except for the three rows corresponding to the search words; the value 1/√3 is put in each of those rows.

Application 1: Information Retrieval Revisited. The entries of Q^T x are the cosines of the angles between the search vector x and the columns of Q. Since the entry corresponding to Module 5 is the one closest to 1, this indicates that the direction of the search vector is closest to the direction of that column, and hence that Module 5 is the one that best matches our search criteria.
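The frequency counts in Table 1 were lost in the transcript, so this sketch uses a made-up keyword/module matrix; it only illustrates the normalize-and-compare-cosines idea described above.

    import numpy as np

    keywords = ["orthogonality", "spaces", "vector", "matrices"]
    # Hypothetical keyword frequencies; rows = keywords, columns = modules M1..M4.
    A = np.array([[2.0, 0.0, 5.0, 1.0],
                  [1.0, 3.0, 4.0, 0.0],
                  [0.0, 2.0, 6.0, 2.0],
                  [4.0, 1.0, 1.0, 3.0]])

    Q = A / np.linalg.norm(A, axis=0)        # database matrix: unit columns

    x = np.zeros(len(keywords))              # unit search vector
    for word in ("orthogonality", "spaces", "vector"):
        x[keywords.index(word)] = 1.0
    x /= np.linalg.norm(x)

    y = Q.T @ x                              # cosines between x and each module column
    print("cosines:", np.round(y, 3))
    print("best match: M%d" % (np.argmax(y) + 1))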

Application 2: Correlation and Covariance Matrices
Table 2  Math Scores, Fall 1996

Student   Assignment  Exams  Final
S1           198       200    196
S2           160       165    165
S3           158       158    133
S4           150       165     91
S5           175       182    151
S6           134       135    101
S7           152       136     80
Average      161       163    131

Application 2: Correlation and Covariance Matrices. The column vectors of X represent the deviations from the mean for each of the three sets of scores. The three sets of translated data specified by the column vectors of X all have mean 0 and all sum to 0. To compare two sets of scores, scale the corresponding columns to make them unit vectors and take their scalar product, which is the cosine of the angle between them; a cosine value near 1 indicates that the two sets of scores are highly correlated.

Application 2: Correlation And Covariance Matrices The matrix C is referred to as a correlation matrix. The three sets of scores in our example are all positively correlated since the correlation coefficients are all positive. A negative coefficient would indicate that two data sets were negatively correlated. A coefficient of 0 would indicate that they were uncorrelated.
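A sketch of the computation just described; the score entries follow the reconstructed Table 2 (they match the printed column averages, but treat the exact numbers as an assumption).

    import numpy as np

    S = np.array([[198, 200, 196],
                  [160, 165, 165],
                  [158, 158, 133],
                  [150, 165,  91],
                  [175, 182, 151],
                  [134, 135, 101],
                  [152, 136,  80]], dtype=float)

    X = S - S.mean(axis=0)                # deviations from the column means
    U = X / np.linalg.norm(X, axis=0)     # scale each column to a unit vector
    C = U.T @ U                           # correlation matrix (cosines between score sets)
    print(np.round(C, 3))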

5-2 Orthogonal Subspaces. Def: Two subspaces X and Y of Rn are said to be orthogonal if x^T y = 0 for every x ∈ X and every y ∈ Y. If X and Y are orthogonal, we write X ⊥ Y.

Def: Let Y be a subspace of Rn. The set of all vectors in Rn that are orthogonal to every vector in Y will be denoted Y⊥. Thus Y⊥ = { x ∈ Rn : x^T y = 0 for every y ∈ Y }. The set Y⊥ is called the orthogonal complement of Y.

Remarks: If X and Y are orthogonal subspaces of Rn, then X ∩ Y = {0}. If Y is a subspace of Rn, then Y⊥ is also a subspace of Rn.

Four Fundamental Subspaces. Let A be an m×n matrix. The four fundamental subspaces of A are the column space R(A), the nullspace N(A), the row space R(A^T), and the left nullspace N(A^T); R(A^T) and N(A) are subspaces of Rn, while R(A) and N(A^T) are subspaces of Rm.

Theorem 5.2.1 (Fundamental Subspace Theorem): If A is an m×n matrix, then N(A) = R(A^T)⊥ and N(A^T) = R(A)⊥. pf: Let x ∈ N(A); then Ax = 0, so x is orthogonal to every row of A and hence to every vector in R(A^T). Also, if x is orthogonal to every row of A, then Ax = 0, so x ∈ N(A). Thus N(A) = R(A^T)⊥. Similarly, applying the same argument to A^T gives N(A^T) = R(A)⊥.

Example: For a specific matrix A one can compute N(A) and R(A^T) directly; clearly every vector in N(A) is orthogonal to every vector in R(A^T), illustrating the theorem.

Theorem 5.2.2: If S is a subspace of Rn, then dim S + dim S⊥ = n. Furthermore, if {x1, ..., xr} is a basis for S and {xr+1, ..., xn} is a basis for S⊥, then {x1, ..., xr, xr+1, ..., xn} is a basis for Rn.

Proof: If S = {0}, then S⊥ = Rn and the result follows. Suppose S ≠ {0}. Let {x1, ..., xr} be a basis for S and let X be the r×n matrix whose rows are x1^T, ..., xr^T. Then S = R(X^T) and, by Theorem 5.2.1, S⊥ = N(X); since rank(X) = r, dim S⊥ = dim N(X) = n − r.

To show that {x1, ..., xr, xr+1, ..., xn} is a basis for Rn, it remains to show that these n vectors are linearly independent. Let c1 x1 + ... + cr xr + cr+1 xr+1 + ... + cn xn = 0, and set y = c1 x1 + ... + cr xr ∈ S and z = cr+1 xr+1 + ... + cn xn ∈ S⊥. Then y = −z belongs to S ∩ S⊥ = {0}, so y = z = 0. Since each of the two sets is linearly independent, all the coefficients ci must be 0.

Def: If U and V are subspaces of a vector space W and each w ∈ W can be written uniquely as a sum u + v, where u ∈ U and v ∈ V, then we say that W is a direct sum of U and V, and we write W = U ⊕ V.

Theorem 5.2.3: If S is a subspace of Rn, then Rn = S ⊕ S⊥. pf: By Theorem 5.2.2, every x ∈ Rn can be written as x = u + v with u ∈ S and v ∈ S⊥. To show uniqueness, suppose x = u1 + v1 = u2 + v2, where u1, u2 ∈ S and v1, v2 ∈ S⊥. Then u1 − u2 = v2 − v1 belongs to S ∩ S⊥ = {0}, so u1 = u2 and v1 = v2.

Theorem 5.2.4: If S is a subspace of Rn, then (S⊥)⊥ = S. pf: Let x ∈ S; then x is orthogonal to every vector in S⊥, so S ⊆ (S⊥)⊥. Conversely, if x ∈ (S⊥)⊥, write x = u + v with u ∈ S and v ∈ S⊥ (Theorem 5.2.3); then 0 = v^T x = v^T u + v^T v = ||v||^2, so v = 0 and x = u ∈ S.

Remark: Since Rn = R(A^T) ⊕ N(A) and Rm = R(A) ⊕ N(A^T), every b ∈ R(A) can be written as b = Ax for a unique x ∈ R(A^T); i.e., the restriction of the mapping x ↦ Ax to the row space R(A^T) is a bijection onto the column space R(A).

Thus A defines a bijection from R(A^T) onto R(A), and likewise A^T defines a bijection from R(A) onto R(A^T).

Cor 5.2.5: Let A be an m×n matrix and b ∈ Rm. Then either (i) there is a vector x ∈ Rn such that Ax = b, or (ii) there is a vector y ∈ Rm such that A^T y = 0 and y^T b ≠ 0. pf: Either b ∈ R(A), which is case (i), or b ∉ R(A). In the latter case write b = p + y with p ∈ R(A) and y ∈ R(A)⊥ = N(A^T), y ≠ 0; then A^T y = 0 and y^T b = y^T y ≠ 0.

Example: Let A be a given matrix; find bases for its four fundamental subspaces. The basic idea is that the row space and the solution set of Ax = 0 are invariant under row operations. Sol: (i) Reduce A to row echelon form U; the nonzero rows of U form a basis for R(A^T). (Why?) (ii) Solving Ux = 0 gives a basis for N(A). (iii) Similarly, applying the same procedure to A^T yields bases for R(A) and N(A^T). (iv) Clearly, each pair of complementary subspaces is orthogonal, as Theorem 5.2.1 asserts. A numerical check is sketched below.
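The matrix in the slide's example is not recoverable, so the sketch below uses an assumed A and checks the fundamental subspace relations numerically.

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])          # assumed example matrix (rank 1)

    Z = null_space(A)                        # orthonormal basis for N(A)
    Zt = null_space(A.T)                     # orthonormal basis for N(A^T)

    # Every vector of N(A) is orthogonal to the rows of A, and likewise for A^T.
    print("N(A) perp R(A^T):", np.allclose(A @ Z, 0))
    print("N(A^T) perp R(A):", np.allclose(A.T @ Zt, 0))

    # dim N(A) + rank(A) = n, as in Theorem 5.2.2.
    print(Z.shape[1] + np.linalg.matrix_rank(A) == A.shape[1])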

Example: For a specific matrix A one can find N(A) and R(A^T), examine the mapping x ↦ Ax restricted to R(A^T), and ask what the matrix representation of that restricted mapping is.

5-4 Inner Product Spaces. A tool to measure the orthogonality of two vectors in a general vector space.

Def: An inner product on a vector space V is a function ⟨·,·⟩ that assigns to each pair of vectors x, y ∈ V a real number ⟨x, y⟩ satisfying the following conditions: (i) ⟨x, x⟩ ≥ 0, with equality iff x = 0; (ii) ⟨x, y⟩ = ⟨y, x⟩ for all x, y ∈ V; (iii) ⟨αx + βy, z⟩ = α⟨x, z⟩ + β⟨y, z⟩ for all x, y, z ∈ V and all scalars α, β.

Example: (i) Let V = Rn and ⟨x, y⟩ = x^T y; then ⟨·,·⟩ is an inner product on Rn. (ii) Let w1, ..., wn be positive scalars and ⟨x, y⟩ = Σ wi xi yi; then ⟨·,·⟩ is an inner product on Rn. (iii) Let V = C[a, b] and ⟨f, g⟩ = ∫_a^b f(x) g(x) dx; then ⟨·,·⟩ is an inner product on C[a, b]. (iv) Let V = Pn, let w be a positive function, and let x1, ..., xn be distinct real numbers; then ⟨p, q⟩ = Σ w(xi) p(xi) q(xi) is an inner product on Pn.

Def: Let ⟨·,·⟩ be an inner product on a vector space V and let u, v ∈ V. We say u and v are orthogonal if ⟨u, v⟩ = 0. The length or norm of v is given by ||v|| = √⟨v, v⟩.

Theorem 5.4.1 (The Pythagorean Law): If u and v are orthogonal vectors in an inner product space V, then ||u + v||^2 = ||u||^2 + ||v||^2. pf: ||u + v||^2 = ⟨u + v, u + v⟩ = ⟨u, u⟩ + 2⟨u, v⟩ + ⟨v, v⟩ = ||u||^2 + ||v||^2.

Example 1: Consider C[a, b] with inner product ⟨f, g⟩ = ∫_a^b f(x) g(x) dx. For specific functions f and g one can (ii) compute ||f|| and ||g||, (iii) check whether ⟨f, g⟩ = 0, and (iv) when f and g are orthogonal, verify the Pythagorean Law ||f + g||^2 = ||f||^2 + ||g||^2.

Example 2: Consider C[−π, π] with inner product ⟨f, g⟩ = (1/π) ∫_{−π}^{π} f(x) g(x) dx. It can be shown that (i) ⟨cos jx, cos kx⟩ = 0 and ⟨sin jx, sin kx⟩ = 0 for j ≠ k, (ii) ⟨cos jx, sin kx⟩ = 0 for all j, k, and (iii) ||cos kx|| = ||sin kx|| = 1 for k ≥ 1. Thus {cos x, sin x, cos 2x, sin 2x, ...} is an orthonormal set.

Remark: The inner product in Example 2 plays a key role in Fourier analysis applications involving trigonometric approximation of functions.

Example 3: Let and let Then is not orthogonal to

Def: Let u and v be two vectors in an inner product space V, with v ≠ 0. Then the scalar projection of u onto v is defined as α = ⟨u, v⟩ / ||v||. The vector projection of u onto v is p = (⟨u, v⟩ / ⟨v, v⟩) v = α (v / ||v||).

Lemma: Let p be the vector projection of u onto v (v ≠ 0). Then p = cv for some scalar c, and u − p is orthogonal to p. pf: With c = ⟨u, v⟩ / ⟨v, v⟩ we have p = cv, and ⟨u − p, p⟩ = c⟨u, v⟩ − c^2 ⟨v, v⟩ = c⟨u, v⟩ − c⟨u, v⟩ = 0.

Theorem 5.4.2 (Cauchy-Schwarz Inequality): Let u and v be two vectors in an inner product space V. Then |⟨u, v⟩| ≤ ||u|| ||v||. Moreover, equality holds iff u and v are linearly dependent. pf: If v = 0, both sides are 0 and the result holds. If v ≠ 0, let p be the vector projection of u onto v; since u − p ⊥ p, the Pythagorean Law gives ||u||^2 = ||p||^2 + ||u − p||^2 ≥ ||p||^2 = ⟨u, v⟩^2 / ||v||^2, which is the inequality. Equality holds iff u = p, i.e., iff u is a multiple of v; together with the case v = 0, equality holds iff u and v are linearly dependent.

Note: From the Cauchy-Schwarz Inequality, −1 ≤ ⟨u, v⟩ / (||u|| ||v||) ≤ 1 for nonzero u and v. Thus, we can define the angle between the two nonzero vectors u and v by cos θ = ⟨u, v⟩ / (||u|| ||v||).

Def: Let V be a vector space. A function ||·|| from V to the real numbers is said to be a norm if it satisfies (i) ||v|| ≥ 0, with equality iff v = 0; (ii) ||αv|| = |α| ||v|| for every scalar α and every v ∈ V; (iii) ||u + v|| ≤ ||u|| + ||v|| for all u, v ∈ V. Remark: Such a vector space is called a normed linear space.

Theorem 5.4.3: If V is an inner product space, then ||v|| = √⟨v, v⟩ defines a norm on V. pf: Conditions (i) and (ii) are immediate, and the triangle inequality (iii) follows from the Cauchy-Schwarz Inequality. Def: The distance between u and v is defined as ||u − v||.

Example: Let , then

Remark: In the case of a norm that is not derived from an inner product, the Pythagorean Law will not hold. Example: In R2 with the 1-norm ||x||_1 = |x1| + |x2|, take x = (1, 0)^T and y = (0, 1)^T; then ||x + y||_1^2 = 4 while ||x||_1^2 + ||y||_1^2 = 2. (Why? Because ||·||_1 does not come from an inner product.)

Example: Let , then

Example: Let Then

5-3 Least Squares Problems

Least squares problems. A typical example: Given data points (x1, y1), ..., (xm, ym), find the best line y = c0 + c1 x to fit the data, i.e., find c0 and c1 such that the sum of squared residuals Σ (yi − c0 − c1 xi)^2 is minimum. Geometrical meaning: the vertical distances from the data points to the line are made as small as possible in the aggregate.

Least squares problems: Given an m×n matrix A with m > n and b ∈ Rm, the equation Ax = b may not have any solution. The objective of the least squares problem is to find x̂ such that ||b − Ax̂|| is as small as possible, i.e., to find x̂ satisfying ||b − Ax̂|| ≤ ||b − Ax|| for all x ∈ Rn.

Preview of the results: It will be shown that x̂ is a least squares solution of Ax = b iff A^T A x̂ = A^T b. If the columns of A are linearly independent, then x̂ = (A^T A)^{-1} A^T b is the unique least squares solution.

Theorem 5.3.1: Let S be a subspace of Rm, b ∈ Rm, and let p be the projection of b onto S. Then (i) ||b − y|| > ||b − p|| for all y ∈ S with y ≠ p, and (ii) p is the unique vector in S that is closest to b. pf: (i) Write b − y = (b − p) + (p − y), where b − p ∈ S⊥ and p − y ∈ S. If y ≠ p, the Pythagorean Law gives ||b − y||^2 = ||b − p||^2 + ||p − y||^2 > ||b − p||^2. (ii) follows directly from (i) by noting that b has a unique expression b = p + z with p ∈ S and z ∈ S⊥.

Question: How do we find the x̂ which solves the least squares problem? Ans.: From the previous theorem, we know that Ax̂ must be the projection of b onto R(A), i.e., b − Ax̂ ∈ R(A)⊥ = N(A^T), so A^T (b − Ax̂) = 0. Definition: The system A^T A x = A^T b is called the normal equations for the least squares problem Ax = b.

Remark: The normal equations always have at least one solution. In general, it is possible to have more than one solution. If x̂ is a solution, then the general solution is of the form x̂ + z with z ∈ N(A^T A) = N(A).

Theorem 5.3.2: Let A be an m×n matrix with rank(A) = n and let b ∈ Rm. Then the normal equations A^T A x = A^T b have a unique solution x̂ = (A^T A)^{-1} A^T b, and x̂ is the unique least squares solution to Ax = b. pf: It suffices to show that A^T A is nonsingular. If A^T A x = 0, then x^T A^T A x = ||Ax||^2 = 0, so Ax = 0; since the columns of A are linearly independent, x = 0.

Note: The projection vector p = Ax̂ is the element of R(A) that is closest to b in the least squares sense. Thus, p = A(A^T A)^{-1} A^T b, and the matrix P = A(A^T A)^{-1} A^T is called the projection matrix (it projects any vector of Rm onto R(A)).
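A small numerical check of Theorem 5.3.2 and the projection matrix, using an assumed A and b (the slides do not give a specific example here).

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])                 # assumed full-column-rank matrix
    b = np.array([1.0, 2.0, 2.0])

    x_hat = np.linalg.solve(A.T @ A, A.T @ b)  # unique solution of the normal equations
    P = A @ np.linalg.inv(A.T @ A) @ A.T       # projection matrix onto R(A)
    p = A @ x_hat                              # projection of b onto R(A)

    print("x_hat:", x_hat)
    print("p equals P b:", np.allclose(p, P @ b))
    print("residual orthogonal to R(A):", np.allclose(A.T @ (b - p), 0))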

Application 2: Spring Constants. Suppose a spring obeys Hooke's law, so that its length is a linear function of the force applied, and a series of length/force measurements is taken (with measurement error). How can the spring constant be determined? Sol: Note that the resulting overdetermined linear system is inconsistent. The normal equations A^T A x = A^T b give the least squares estimate of the unknown parameters, and hence of the spring constant.

Example 2: Given the data points (xi, yi), i = 1, ..., m, find the best least squares fit by a linear function. Sol: Let the desired linear function be y = c0 + c1 x. The problem becomes that of finding the least squares solution of Ac = y, where the i-th row of A is (1, xi). Since rank(A) = 2, the normal equations have a unique solution, and this gives the best linear least squares fit.

Example 3: Find the best quadratic least squares fit to the data points (xi, yi), i = 1, ..., m. Sol: Let the desired quadratic function be y = c0 + c1 x + c2 x^2. The problem becomes that of finding the least squares solution of Ac = y, where the i-th row of A is (1, xi, xi^2). Since rank(A) = 3, the normal equations have a unique solution, and this gives the best quadratic least squares fit.
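The data in Examples 2 and 3 did not survive the transcript, so the sketch below fits a line and a parabola to assumed points; the construction of the matrix A is the part these examples illustrate.

    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])      # assumed data
    y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

    # Linear fit y ~ c0 + c1 x: columns of A are 1 and x (rank 2).
    A1 = np.column_stack([np.ones_like(x), x])
    c_lin = np.linalg.solve(A1.T @ A1, A1.T @ y)

    # Quadratic fit y ~ c0 + c1 x + c2 x^2: columns 1, x, x^2 (rank 3).
    A2 = np.column_stack([np.ones_like(x), x, x**2])
    c_quad = np.linalg.solve(A2.T @ A2, A2.T @ y)

    print("linear coefficients:   ", np.round(c_lin, 4))
    print("quadratic coefficients:", np.round(c_quad, 4))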

5-5 Orthonormal Sets

Orthonormal Sets. They simplify the least squares solution (avoid computing a matrix inverse) and improve numerical computational stability.

Def: {v1, ..., vn} is said to be an orthogonal set in an inner product space V if ⟨vi, vj⟩ = 0 whenever i ≠ j. Moreover, if in addition ||vi|| = 1 for each i, then {v1, ..., vn} is said to be orthonormal.

Example 2: An orthogonal set of nonzero vectors need not be orthonormal; however, dividing each vector by its length yields an orthonormal set.

Theorem 5.5.1: Let {v1, ..., vn} be an orthogonal set of nonzero vectors in an inner product space V. Then they are linearly independent. pf: Suppose that c1 v1 + ... + cn vn = 0. Taking the inner product of both sides with vi gives ci ⟨vi, vi⟩ = 0, and since vi ≠ 0 it follows that ci = 0 for each i.

Example: {1/√2, cos x, sin x, cos 2x, sin 2x, ...} is an orthonormal set of C[−π, π] with the inner product ⟨f, g⟩ = (1/π) ∫_{−π}^{π} f(x) g(x) dx. Note: Now you know what is meant when one says that two functions are orthogonal.

Theorem 5.5.2: Let {u1, ..., un} be an orthonormal basis for an inner product space V. If v = Σ ci ui, then ci = ⟨v, ui⟩. pf: ⟨v, ui⟩ = ⟨Σ cj uj, ui⟩ = Σ cj ⟨uj, ui⟩ = ci.
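A quick sketch of Theorem 5.5.2 in R2 with the standard inner product; the rotated basis below is an assumed example.

    import numpy as np

    t = 0.3                                        # assumed rotation angle
    u1 = np.array([np.cos(t), np.sin(t)])          # orthonormal basis u1, u2
    u2 = np.array([-np.sin(t), np.cos(t)])
    v = np.array([2.0, -1.0])

    c1, c2 = v @ u1, v @ u2                        # c_i = <v, u_i>
    print("reconstruction matches v:", np.allclose(c1 * u1 + c2 * u2, v))
    print("Parseval's Formula holds:", np.isclose(c1**2 + c2**2, v @ v))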

Cor 5.5.3: Let {u1, ..., un} be an orthonormal basis for an inner product space V. If u = Σ ai ui and v = Σ bi ui, then ⟨u, v⟩ = Σ ai bi. pf: ⟨u, v⟩ = ⟨Σ ai ui, v⟩ = Σ ai ⟨ui, v⟩ = Σ ai bi, using Theorem 5.5.2 for the last step.

Cor (Parseval's Formula): If {u1, ..., un} is an orthonormal basis for an inner product space V and v = Σ ci ui, then ||v||^2 = Σ ci^2. pf: By Corollary 5.5.3, ||v||^2 = ⟨v, v⟩ = Σ ci^2.

Example 4: and form an orthonormal basis for . If , then and

Example 5: Determine the value of an integral without computing antiderivatives. Sol: Expand the integrand in terms of the orthonormal trigonometric functions and apply Parseval's Formula.

Def: A matrix Q is said to be an orthogonal matrix if the column vectors of Q form an orthonormal set in Rn. Example 6: The rotation matrix and the elementary reflection matrix are orthogonal matrices.

Properties of orthogonal matrices: If Q is orthogonal, then Q^T Q = I, Q^{-1} = Q^T, (Qx)^T (Qy) = x^T y, and ||Qx|| = ||x|| for all x, y ∈ Rn.

Theorem 5.5.6: If the columns of A form an orthonormal set in Rm, then A^T A = I and the least squares solution to Ax = b is x̂ = A^T b. This avoids computing a matrix inverse.
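A sketch of Theorem 5.5.6 (and of Cor 5.5.9 below) using an assumed matrix with orthonormal columns.

    import numpy as np

    A = np.array([[1/np.sqrt(2), 0.0],
                  [1/np.sqrt(2), 0.0],
                  [0.0,          1.0]])      # assumed orthonormal columns
    b = np.array([1.0, 2.0, 3.0])

    x_hat = A.T @ b                          # least squares solution, no inverse needed
    p = A @ x_hat                            # = A A^T b, the projection of b onto R(A)

    print("A^T A = I:", np.allclose(A.T @ A, np.eye(2)))
    print("residual orthogonal to R(A):", np.allclose(A.T @ (b - p), 0))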

Theorem 5.5.7 & 5.5.8: Let S be a subspace of an inner product space V and let x ∈ V. Let {u1, ..., un} be an orthonormal basis for S. If p = Σ ci ui, where ci = ⟨x, ui⟩ for each i, then p − x ∈ S⊥ and p is the element of S that is closest to x.

Cor 5.5.9: Let S be a nonzero subspace of Rm and b ∈ Rm. If {u1, ..., uk} is an orthonormal basis for S and U = (u1, ..., uk), then the projection of b onto S is p = UU^T b. pf: By Theorem 5.5.7, p = Σ ⟨b, ui⟩ ui = Σ (ui^T b) ui = U(U^T b) = UU^T b.

Note: If the columns of U form an orthonormal set, then UU^T is the projection matrix onto the column space of U.

Example 7: Let S be the subspace spanned by a given orthonormal set {u1, ..., uk}. Find the vector in S that is closest to a given vector b. Sol: The closest vector is the projection p = ⟨b, u1⟩u1 + ... + ⟨b, uk⟩uk = UU^T b.

Approximation of functions. Example 8: Find the best least squares approximation to a given continuous function on an interval by a linear function. Sol: Construct an orthonormal basis {u1, u2} for the subspace of linear functions (with respect to the integral inner product on that interval) and compute the projection p = ⟨f, u1⟩u1 + ⟨f, u2⟩u2.


Approximation by trigonometric polynomials. FACT: {1/√2, cos x, sin x, cos 2x, sin 2x, ..., cos nx, sin nx} forms an orthonormal set in C[−π, π] with respect to the inner product ⟨f, g⟩ = (1/π) ∫_{−π}^{π} f(x) g(x) dx. Problem: Given a continuous 2π-periodic function f, find a trigonometric polynomial of degree n which is a best least squares approximation to f.

Sol: It suffices to find the projection of f onto the subspace spanned by {1/√2, cos x, sin x, ..., cos nx, sin nx}. The best approximation t(x) = a0/2 + Σ_{k=1}^{n} (ak cos kx + bk sin kx) has coefficients ak = (1/π) ∫_{−π}^{π} f(x) cos kx dx and bk = (1/π) ∫_{−π}^{π} f(x) sin kx dx.
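A sketch of the trigonometric least squares approximation; the test function f(x) = |x| and the Riemann-sum evaluation of the coefficient integrals are assumptions made for illustration.

    import numpy as np

    def trig_approx(f, n, m=4000):
        # Coefficients a_k, b_k computed with a simple Riemann sum over [-pi, pi].
        x = np.linspace(-np.pi, np.pi, m, endpoint=False)
        dx = 2 * np.pi / m
        fx = f(x)
        a = [np.sum(fx * np.cos(k * x)) * dx / np.pi for k in range(n + 1)]
        b = [np.sum(fx * np.sin(k * x)) * dx / np.pi for k in range(1, n + 1)]
        def t(z):
            return (a[0] / 2
                    + sum(a[k] * np.cos(k * z) for k in range(1, n + 1))
                    + sum(b[k - 1] * np.sin(k * z) for k in range(1, n + 1)))
        return t

    t4 = trig_approx(np.abs, n=4)
    for z in (0.0, 1.0, 2.0):
        print(z, round(float(t4(z)), 3), "vs", abs(z))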

Example: Consider with inner product of (i) Check that is orthonormal (ii) Let


5-6 Gram-Schmidt Orthogonalization Process

Gram-Schmidt Orthogonalization Process. Question: Given an ordinary basis {x1, x2, ..., xn}, how can it be transformed into an orthonormal basis {u1, u2, ..., un}?

Given {x1, x2, ..., xn}, set u1 = x1 / ||x1||; clearly span{u1} = span{x1} and {u1} is orthonormal. Next, let p1 be the projection of x2 onto span{u1} and set u2 = (x2 − p1) / ||x2 − p1||; clearly {u1, u2} is orthonormal and spans the same subspace as {x1, x2}. Similarly, at each step we subtract from the next vector its projection onto the span of the vectors already constructed and normalize the result. We have the next result.

Theorem 5.6.1 (The Gram-Schmidt process): Let {x1, ..., xn} be a basis for an inner product space V. Define u1 = x1 / ||x1|| and, for k = 1, ..., n − 1, u_{k+1} = (x_{k+1} − pk) / ||x_{k+1} − pk||, where pk = ⟨x_{k+1}, u1⟩u1 + ... + ⟨x_{k+1}, uk⟩uk is the projection of x_{k+1} onto span{u1, ..., uk}. Then {u1, ..., un} is an orthonormal basis for V.

Example: Find an orthonormal basis for P3 with inner product given by ⟨p, q⟩ = Σ_{i=1}^{3} p(xi) q(xi), where x1, x2, x3 are distinct real numbers. Sol: Apply the Gram-Schmidt process starting with the basis {1, x, x^2}.
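A minimal sketch of the Gram-Schmidt process for column vectors in Rm with the standard inner product (the polynomial example above would use the discrete inner product instead); the starting basis below is an assumption.

    import numpy as np

    def gram_schmidt(X):
        # Columns of X form a basis; returns a matrix whose columns are orthonormal.
        U = np.zeros_like(X, dtype=float)
        for k in range(X.shape[1]):
            v = X[:, k].astype(float)
            for j in range(k):                 # subtract projection onto span{u_1,...,u_{k-1}}
                v = v - (v @ U[:, j]) * U[:, j]
            U[:, k] = v / np.linalg.norm(v)
        return U

    X = np.array([[1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])            # assumed basis of R3
    U = gram_schmidt(X)
    print("U^T U = I:", np.allclose(U.T @ U, np.eye(3)))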

Theorem 5.6.2 (QR Factorization): If A is an m×n matrix of rank n, then A can be factored into a product QR, where Q is an m×n matrix with orthonormal columns and R is an n×n matrix that is upper triangular and invertible.

Proof of QR Factorization: Apply the Gram-Schmidt process to the columns a1, ..., an of A to obtain orthonormal vectors q1, ..., qn. At the k-th step, ak = r1k q1 + ... + rkk qk, where rik = ⟨ak, qi⟩ for i < k and rkk = ||ak − pk−1|| > 0. Setting Q = (q1, ..., qn) and R = (rik) gives A = QR with R upper triangular and, since its diagonal entries are positive, invertible.

Proof of QR Factorization (cont.)

Theorem 5.6.3: If A is an m×n matrix of rank n, then the solution to the least squares problem Ax = b is given by x̂ = R^{-1} Q^T b, where Q and R are the matrices obtained from Theorem 5.6.2. The solution x̂ may be obtained by using back substitution to solve Rx̂ = Q^T b.
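A sketch of the QR approach to least squares with an assumed A and b; numpy's qr routine stands in for the Gram-Schmidt construction of Theorem 5.6.2.

    import numpy as np
    from scipy.linalg import solve_triangular

    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0],
                  [1.0, 4.0]])                 # assumed matrix of rank 2
    b = np.array([1.0, 2.0, 2.0, 3.0])

    Q, R = np.linalg.qr(A)                     # Q: orthonormal columns, R: upper triangular
    x_hat = solve_triangular(R, Q.T @ b)       # back substitution for R x = Q^T b

    print("x_hat:", x_hat)
    print("matches lstsq:", np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))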

Proof of Theorem 5.6.3: Substituting A = QR into the normal equations gives R^T Q^T Q R x = R^T Q^T b, i.e., R^T R x = R^T Q^T b. Since R^T is invertible, this reduces to Rx = Q^T b, so x̂ = R^{-1} Q^T b.

Example 3: Solve a least squares problem Ax = b by direct calculation: compute the QR factorization of A via Gram-Schmidt, then solve Rx̂ = Q^T b by back substitution.