Chapter 5 Orthogonality

5.1 The Scalar Product in $\mathbb{R}^n$

The product $x^T y$ is called the scalar product of $x$ and $y$. In particular, if $x = (x_1, \ldots, x_n)^T$ and $y = (y_1, \ldots, y_n)^T$, then $x^T y = x_1 y_1 + x_2 y_2 + \cdots + x_n y_n$.

The Scalar Product in $\mathbb{R}^2$ and $\mathbb{R}^3$

Definition. Let $x$ and $y$ be vectors in either $\mathbb{R}^2$ or $\mathbb{R}^3$. The distance between $x$ and $y$ is defined to be the number $\|x - y\|$.

Example. If $x = (3, 4)^T$ and $y = (-1, 7)^T$, then the distance between $x$ and $y$ is given by $\|y - x\| = \|(-4, 3)^T\| = \sqrt{(-4)^2 + 3^2} = 5$.
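A quick numerical sketch of these computations, using NumPy and the vectors from the example above:

import numpy as np

x = np.array([3.0, 4.0])
y = np.array([-1.0, 7.0])

scalar_product = x @ y            # x^T y = 3*(-1) + 4*7 = 25
distance = np.linalg.norm(y - x)  # ||y - x|| = ||(-4, 3)^T|| = 5

print(scalar_product, distance)   # 25.0 5.0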

Theorem 5.1.1. If $x$ and $y$ are two nonzero vectors in either $\mathbb{R}^2$ or $\mathbb{R}^3$ and $\theta$ is the angle between them, then
(1) $x^T y = \|x\|\,\|y\|\cos\theta$.

Corollary 5.1.2 (Cauchy-Schwarz Inequality). If $x$ and $y$ are vectors in either $\mathbb{R}^2$ or $\mathbb{R}^3$, then
(2) $|x^T y| \le \|x\|\,\|y\|$,
with equality holding if and only if one of the vectors is $0$ or one vector is a multiple of the other.
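A minimal sketch checking the angle formula (1) and the Cauchy-Schwarz inequality (2) numerically; the vectors here are arbitrary illustrative choices:

import numpy as np

x = np.array([3.0, 4.0])
y = np.array([-1.0, 7.0])

cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
theta = np.arccos(cos_theta)      # angle between x and y, in radians

# Cauchy-Schwarz: |x^T y| <= ||x|| ||y||
print(theta, abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y))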

Definition. The vectors $x$ and $y$ in $\mathbb{R}^2$ (or $\mathbb{R}^3$) are said to be orthogonal if $x^T y = 0$.

Example. (a) The vector $0$ is orthogonal to every vector in $\mathbb{R}^2$. (b) In $\mathbb{R}^2$, vectors such as $(3, 4)^T$ and $(4, -3)^T$ are orthogonal, since $3 \cdot 4 + 4 \cdot (-3) = 0$. (c) In $\mathbb{R}^3$, vectors such as $(2, 3, 1)^T$ and $(1, -1, 1)^T$ are orthogonal, since $2 \cdot 1 + 3 \cdot (-1) + 1 \cdot 1 = 0$.

Scalar and Vector Projections

[Figure: $x$ is decomposed as $p + z$, where $p = \alpha u$ lies along the unit vector $u$ in the direction of $y$, $z = x - p$ is orthogonal to $y$, and $\theta$ is the angle between $x$ and $y$.]

The scalar $\alpha$ is called the scalar projection of $x$ onto $y$, and the vector $p$ is called the vector projection of $x$ onto $y$.

Scalar projection of $x$ onto $y$: $\alpha = \dfrac{x^T y}{\|y\|}$.
Vector projection of $x$ onto $y$: $p = \alpha\,\dfrac{1}{\|y\|}\,y = \dfrac{x^T y}{y^T y}\,y$.
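A short sketch of these two projection formulas, using illustrative vectors; the check confirms that the residual $x - p$ is orthogonal to $y$:

import numpy as np

x = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])

alpha = (x @ y) / np.linalg.norm(y)   # scalar projection of x onto y
p = ((x @ y) / (y @ y)) * y           # vector projection of x onto y

z = x - p
print(np.isclose(z @ y, 0.0))         # True: x - p is orthogonal to y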

Example. The point $Q$ is the point on the line that is closest to the point $(1, 4)$. Determine the coordinates of $Q$.

[Figure: the line, the point $(1, 4)$, its closest point $Q$ on the line, and the vectors $v$ and $w$.]
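The line itself is not reproduced in the transcript, so the sketch below assumes, purely for illustration, a line through the origin spanned by $w = (3, 1)^T$; the closest point $Q$ is then the vector projection of $(1, 4)^T$ onto $w$:

import numpy as np

v = np.array([1.0, 4.0])   # the given point (1, 4)
w = np.array([3.0, 1.0])   # direction vector of the assumed line

Q = ((v @ w) / (w @ w)) * w
print(Q)                   # [2.1 0.7] for this assumed line
print((v - Q) @ w)         # 0.0: the error vector v - Q is orthogonal to the line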

Orthogonality in $\mathbb{R}^n$

The vectors $x$ and $y$ in $\mathbb{R}^n$ are said to be orthogonal if $x^T y = 0$.

5.2 Orthogonal Subspaces

Definition. Two subspaces $X$ and $Y$ of $\mathbb{R}^n$ are said to be orthogonal if $x^T y = 0$ for every $x \in X$ and every $y \in Y$. If $X$ and $Y$ are orthogonal, we write $X \perp Y$.

Example. Let $X$ be the subspace of $\mathbb{R}^3$ spanned by $e_1$, and let $Y$ be the subspace spanned by $e_2$. Then $X \perp Y$.

Example. Let $X$ be the subspace of $\mathbb{R}^3$ spanned by $e_1$ and $e_2$, and let $Y$ be the subspace spanned by $e_3$. Then $X \perp Y$; in fact, $Y$ consists of all vectors in $\mathbb{R}^3$ that are orthogonal to every vector in $X$.

Definition. Let $Y$ be a subspace of $\mathbb{R}^n$. The set of all vectors in $\mathbb{R}^n$ that are orthogonal to every vector in $Y$ will be denoted $Y^\perp$. Thus
$Y^\perp = \{\, x \in \mathbb{R}^n \mid x^T y = 0 \text{ for every } y \in Y \,\}.$
The set $Y^\perp$ is called the orthogonal complement of $Y$.

Remarks. 1. If $X$ and $Y$ are orthogonal subspaces of $\mathbb{R}^n$, then $X \cap Y = \{0\}$. 2. If $Y$ is a subspace of $\mathbb{R}^n$, then $Y^\perp$ is also a subspace of $\mathbb{R}^n$.

Fundamental Subspaces

Theorem 5.2.1 (Fundamental Subspaces Theorem). If $A$ is an $m \times n$ matrix, then $N(A) = R(A^T)^\perp$ and $N(A^T) = R(A)^\perp$.

Theorem 5.2.2. If $S$ is a subspace of $\mathbb{R}^n$, then $\dim S + \dim S^\perp = n$. Furthermore, if $\{x_1, \ldots, x_r\}$ is a basis for $S$ and $\{x_{r+1}, \ldots, x_n\}$ is a basis for $S^\perp$, then $\{x_1, \ldots, x_r, x_{r+1}, \ldots, x_n\}$ is a basis for $\mathbb{R}^n$.

Definition. If $U$ and $V$ are subspaces of a vector space $W$ and each $w \in W$ can be written uniquely as a sum $u + v$, where $u \in U$ and $v \in V$, then we say that $W$ is a direct sum of $U$ and $V$, and we write $W = U \oplus V$.

Theorem 5.2.3. If $S$ is a subspace of $\mathbb{R}^n$, then $\mathbb{R}^n = S \oplus S^\perp$.

Theorem 5.2.4. If $S$ is a subspace of $\mathbb{R}^n$, then $(S^\perp)^\perp = S$.

Theorem 5.2.5. If $A$ is an $m \times n$ matrix and $b \in \mathbb{R}^m$, then either there is a vector $x \in \mathbb{R}^n$ such that $Ax = b$, or there is a vector $y \in \mathbb{R}^m$ such that $A^T y = 0$ and $y^T b \neq 0$.

Example. For a given matrix $A$, find bases for $N(A)$, $R(A^T)$, $N(A^T)$, and $R(A)$.
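A sketch of how such bases can be computed numerically via the singular value decomposition; the matrix $A$ below is an illustrative rank-1 choice, not the one from the slide:

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))       # numerical rank

col_space = U[:, :r]             # orthonormal basis for R(A)
null_AT   = U[:, r:]             # orthonormal basis for N(A^T) = R(A)^perp
row_space = Vt[:r, :].T          # orthonormal basis for R(A^T)
null_A    = Vt[r:, :].T          # orthonormal basis for N(A) = R(A^T)^perp

print(np.allclose(A @ null_A, 0))      # True
print(np.allclose(A.T @ null_AT, 0))   # True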

5.4 Inner Product Spaces

Definition. An inner product on a vector space $V$ is an operation on $V$ that assigns, to each pair of vectors $x$ and $y$ in $V$, a real number $\langle x, y \rangle$ satisfying the following conditions:
I. $\langle x, x \rangle \ge 0$, with equality if and only if $x = 0$.
II. $\langle x, y \rangle = \langle y, x \rangle$ for all $x$ and $y$ in $V$.
III. $\langle \alpha x + \beta y, z \rangle = \alpha \langle x, z \rangle + \beta \langle y, z \rangle$ for all $x, y, z$ in $V$ and all scalars $\alpha$ and $\beta$.

The Vector Space $\mathbb{R}^{m \times n}$

Given $A$ and $B$ in $\mathbb{R}^{m \times n}$, we can define an inner product by
$\langle A, B \rangle = \sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij} b_{ij}.$
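A small sketch of this inner product on $\mathbb{R}^{2 \times 2}$, computed both entrywise and as $\mathrm{trace}(A^T B)$; the matrices are illustrative choices:

import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

ip_entrywise = np.sum(A * B)     # sum of a_ij * b_ij = 5
ip_trace = np.trace(A.T @ B)     # equivalent formulation: trace(A^T B) = 5
print(ip_entrywise, ip_trace)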

Basic Properties of Inner Product Spaces

If $v$ is a vector in an inner product space $V$, the length, or norm, of $v$ is given by $\|v\| = \sqrt{\langle v, v \rangle}$.

Theorem 5.4.1 (The Pythagorean Law). If $u$ and $v$ are orthogonal vectors in an inner product space $V$, then $\|u + v\|^2 = \|u\|^2 + \|v\|^2$.

Example. In $\mathbb{R}^2$ with the standard inner product, $u = (3, 0)^T$ and $v = (0, 4)^T$ are orthogonal, and $\|u + v\|^2 = \|(3, 4)^T\|^2 = 25 = 9 + 16 = \|u\|^2 + \|v\|^2$.

Definition. If $u$ and $v$ are vectors in an inner product space $V$ and $v \neq 0$, then the scalar projection of $u$ onto $v$ is given by $\alpha = \dfrac{\langle u, v \rangle}{\|v\|}$, and the vector projection of $u$ onto $v$ is given by $p = \alpha\,\dfrac{1}{\|v\|}\,v = \dfrac{\langle u, v \rangle}{\langle v, v \rangle}\,v$.

Theorem 5.4.2 (The Cauchy-Schwarz Inequality). If $u$ and $v$ are any two vectors in an inner product space $V$, then $|\langle u, v \rangle| \le \|u\|\,\|v\|$. Equality holds if and only if $u$ and $v$ are linearly dependent.

5.5 Orthonormal Sets

Definition. Let $v_1, v_2, \ldots, v_n$ be nonzero vectors in an inner product space $V$. If $\langle v_i, v_j \rangle = 0$ whenever $i \neq j$, then $\{v_1, v_2, \ldots, v_n\}$ is said to be an orthogonal set of vectors.

Example. The set $\{(1, 1, 1)^T, (2, 1, -3)^T, (4, -5, 1)^T\}$ is an orthogonal set in $\mathbb{R}^3$.

Theorem 5.5.1. If $\{v_1, v_2, \ldots, v_n\}$ is an orthogonal set of nonzero vectors in an inner product space $V$, then $v_1, v_2, \ldots, v_n$ are linearly independent.

Definition. An orthonormal set of vectors is an orthogonal set of unit vectors. The set $\{u_1, u_2, \ldots, u_n\}$ is orthonormal if and only if $\langle u_i, u_j \rangle = \delta_{ij}$, where $\delta_{ij} = 1$ if $i = j$ and $\delta_{ij} = 0$ if $i \neq j$.

Theorem 5.5.2. Let $\{u_1, u_2, \ldots, u_n\}$ be an orthonormal basis for an inner product space $V$. If $v = \sum_{i=1}^{n} c_i u_i$, then $c_i = \langle v, u_i \rangle$.

Corollary 5.5.3. Let $\{u_1, u_2, \ldots, u_n\}$ be an orthonormal basis for an inner product space $V$. If $u = \sum_{i=1}^{n} a_i u_i$ and $v = \sum_{i=1}^{n} b_i u_i$, then $\langle u, v \rangle = \sum_{i=1}^{n} a_i b_i$.

Corollary 5.5.4 (Parseval's Formula). If $\{u_1, u_2, \ldots, u_n\}$ is an orthonormal basis for an inner product space $V$ and $v = \sum_{i=1}^{n} c_i u_i$, then $\|v\|^2 = \sum_{i=1}^{n} c_i^2$.
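A numerical sketch of Theorem 5.5.2 and Parseval's formula, using an illustrative orthonormal basis of $\mathbb{R}^2$ with the standard inner product:

import numpy as np

u1 = np.array([1.0, 1.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0]) / np.sqrt(2)
v = np.array([3.0, 5.0])

c1, c2 = u1 @ v, u2 @ v                    # c_i = <v, u_i>   (Theorem 5.5.2)
print(np.allclose(v, c1 * u1 + c2 * u2))   # True: v = c1 u1 + c2 u2
print(np.isclose(v @ v, c1**2 + c2**2))    # True: ||v||^2 = c1^2 + c2^2  (Parseval)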

Orthogonal Matrices

Definition. An $n \times n$ matrix $Q$ is said to be an orthogonal matrix if the column vectors of $Q$ form an orthonormal set in $\mathbb{R}^n$.

Theorem 5.5.5. An $n \times n$ matrix $Q$ is orthogonal if and only if $Q^T Q = I$.

Example. For any fixed $\theta$, the matrix
$Q = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$
is orthogonal.

Properties of Orthogonal Matrices

If $Q$ is an $n \times n$ orthogonal matrix, then
(a) the column vectors of $Q$ form an orthonormal basis for $\mathbb{R}^n$;
(b) $Q^T Q = I$;
(c) $Q^T = Q^{-1}$;
(d) $\det(Q) = 1$ or $-1$;
(e) the transpose of an orthogonal matrix is an orthogonal matrix;
(f) the product of two orthogonal matrices is also an orthogonal matrix.
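A numerical check of several of these properties, using the rotation matrix from the example above as the orthogonal matrix:

import numpy as np

theta = 0.3                                    # any fixed angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))         # (b) Q^T Q = I
print(np.allclose(Q.T, np.linalg.inv(Q)))      # (c) Q^T = Q^{-1}
print(np.isclose(abs(np.linalg.det(Q)), 1.0))  # (d) det(Q) = 1 or -1
P = Q @ Q                                      # (f) a product of orthogonal matrices
print(np.allclose(P.T @ P, np.eye(2)))         #     is again orthogonal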

The Gram-Schmidt Orthogonalization Process

Theorem 5.6.1 (The Gram-Schmidt Process). Let $\{x_1, x_2, \ldots, x_n\}$ be a basis for the inner product space $V$. Let
$u_1 = \dfrac{1}{\|x_1\|}\,x_1$
and define $u_2, \ldots, u_n$ recursively by
$u_{k+1} = \dfrac{1}{\|x_{k+1} - p_k\|}\,(x_{k+1} - p_k), \qquad k = 1, \ldots, n-1,$

where
$p_k = \langle x_{k+1}, u_1 \rangle u_1 + \langle x_{k+1}, u_2 \rangle u_2 + \cdots + \langle x_{k+1}, u_k \rangle u_k$
is the projection of $x_{k+1}$ onto $\mathrm{Span}(u_1, u_2, \ldots, u_k)$. The set $\{u_1, u_2, \ldots, u_n\}$ is an orthonormal basis for $V$.

Example. For a given matrix $A$, find an orthonormal basis for the column space of $A$.
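A minimal Gram-Schmidt sketch applied to the columns of a matrix with the standard inner product; the matrix below is an illustrative full-rank choice, not the $A$ from the slide, and np.linalg.qr is mentioned only as a cross-check:

import numpy as np

def gram_schmidt(X):
    """Orthonormal basis for the span of the columns of X (columns assumed independent)."""
    U = []
    for k in range(X.shape[1]):
        x = X[:, k]
        w = x.copy()
        for u in U:
            w = w - (u @ x) * u          # subtract <x_k, u_i> u_i, the projection onto u_i
        U.append(w / np.linalg.norm(w))  # normalize the remaining component
    return np.column_stack(U)

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: the columns of Q are orthonormal
# np.linalg.qr(A)[0] gives the same basis up to the signs of the columns.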