
Class 26: Question 1
1. An orthogonal basis for A
2. An orthogonal basis for the column space of A
3. An orthogonal basis for the row space of A
4. An orthogonal basis for the null space of A

Class 26: Answer 1: (B) The given vectors are n linearly independent columns of a matrix A, so they form a basis for the column space of A. Starting with this basis, Gram-Schmidt orthogonalization produces an orthogonal basis that spans the same space as the original collection of vectors. Thus the answer is (B): it will span the column space of A.
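The Gram-Schmidt process described above can be sketched in plain Python. This is a minimal illustration, not code from the slides; the two example columns are an arbitrary linearly independent pair standing in for the columns of A.

```python
# Minimal Gram-Schmidt sketch: given a linearly independent list of
# vectors, produce an orthogonal basis spanning the same space.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            c = dot(v, b) / dot(b, b)          # projection coefficient
            w = [wi - c * bi for wi, bi in zip(w, b)]
        basis.append(w)
    return basis

cols = [(1, 1, 0), (1, 0, 1)]                  # hypothetical columns of A
orth = gram_schmidt(cols)
assert abs(dot(orth[0], orth[1])) < 1e-12      # result is orthogonal
```

Normalizing each output vector (dividing by its length) would turn this orthogonal basis into an orthonormal one.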

Class 26: Question 2

Class 26: Answer 2: (B) The way to do this problem is to perform one step of Gram-Schmidt. The new pair (-2,1,0) and (3/5,6/5,1) is an orthogonal set of vectors formed from (-2,1,0) and (1,1,1).
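That single Gram-Schmidt step can be checked numerically: subtract from (1,1,1) its projection onto (-2,1,0) and the result is (3/5, 6/5, 1).

```python
# One Gram-Schmidt step: u2 = v - (v.u1 / u1.u1) u1.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u1 = (-2, 1, 0)
v = (1, 1, 1)
c = dot(v, u1) / dot(u1, u1)                # = -1/5
u2 = tuple(vi - c * ui for vi, ui in zip(v, u1))

# u2 is (3/5, 6/5, 1), and the new pair is orthogonal
assert all(abs(x - y) < 1e-12 for x, y in zip(u2, (0.6, 1.2, 1.0)))
assert abs(dot(u1, u2)) < 1e-12
```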

Class 26: Question 3 TRUE or FALSE:
1. TRUE
2. FALSE

Class 26: Answer 3: (A) Although this second pair of vectors is not formed by using Gram-Schmidt, it is still an orthogonal pair. The question is: does span{(-2,1,0),(1,1,1)} equal span{(-2,1,0),(3,6,5)}? In other words, are they bases for the same space? Is (3,6,5) a linear combination of (-2,1,0) and (1,1,1)? Yes! 5(1,1,1)+(-2,1,0)=(3,6,5). Another way to see that the spaces are the same is to check that the cross products of the two pairs are parallel, i.e. the two planes have the same normal direction, and thus they must be the same plane.
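Both checks from the answer can be verified directly: the linear combination 5(1,1,1)+(-2,1,0), and the parallel cross products of the two basis pairs.

```python
# (a) write (3,6,5) as a combination of (-2,1,0) and (1,1,1);
# (b) check both pairs share the same normal direction, so they
#     span the same plane.

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

a, b = (-2, 1, 0), (1, 1, 1)
w = (3, 6, 5)

# (a) 5*(1,1,1) + 1*(-2,1,0) = (3,6,5)
combo = tuple(5 * bi + ai for ai, bi in zip(a, b))
assert combo == w

# (b) the normals of the two planes are parallel
n1 = cross(a, b)                       # normal of span{a, b}
n2 = cross(a, w)                       # normal of span{a, w}
assert cross(n1, n2) == (0, 0, 0)      # parallel normals -> same plane
```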

Class 26: Question 4 TRUE or FALSE: The Gram-Schmidt Orthogonalization process can be used to construct an orthonormal set of vectors from an arbitrary set of vectors. 1. TRUE. 2. FALSE.

Class 26: Answer 4: (B) FALSE! The set of vectors that Gram-Schmidt works on is not an arbitrary set of vectors, but a linearly independent set. Although technically Gram-Schmidt produces an orthogonal set, one can always normalize each resulting vector to produce an orthonormal set.

Class 25: Question 5 1. TRUE. 2. FALSE.

Class 25: Answer 5: (B) There are multiple bases for R^2, so there must be multiple orthogonal bases for R^2. All you need to do is pick any two nonzero orthogonal vectors and normalize them; repeating this with different pairs produces multiple orthonormal bases in R^2.

Class 25: Question 6 Let Q be a square matrix with orthonormal columns. TRUE or FALSE: Q^-1 = Q^T.
1. TRUE
2. FALSE

Class 25: Answer 6: (A) If Q is a square matrix with orthonormal columns, then Q^T Q must equal the identity matrix. This is clear because each entry of Q^T Q is a dot product of two columns of Q (the rows of Q^T are the columns of Q). Since the columns are orthonormal, those dot products are 1 when a column is dotted with itself and 0 otherwise. The product therefore has 1s along the diagonal and 0s elsewhere, which is the identity matrix. Thus, since Q^T Q = I, it must be that Q^-1 = Q^T.
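A quick numeric check of Q^T Q = I, using a 2x2 rotation matrix as an example of a square matrix with orthonormal columns (any angle works; 0.7 is an arbitrary choice):

```python
# Entry (i, j) of Q^T Q is the dot product of columns i and j of Q.
import math

t = 0.7                                    # arbitrary rotation angle
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

n = len(Q)
QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(n)) for j in range(n)]
       for i in range(n)]

for i in range(n):
    for j in range(n):
        expected = 1.0 if i == j else 0.0
        assert abs(QtQ[i][j] - expected) < 1e-12   # Q^T Q = I
```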

Class 25: Question 7

Class 25: Answer 7 (A)

Class 25: Question 8

Class 25: Answer 8: (A) The key thing to understand about this question is that it is the same question as the one before! In other words, to make an orthogonal projection onto a subspace one needs a basis for that subspace. What is a basis for the subspace corresponding to the line y = x/2? The line corresponds to span{(2,1)}, so (2,1) is a basis for the subspace. The orthogonal projection of (-3,1) onto span{(2,1)} is then ((-3,1)·(2,1) / (2,1)·(2,1)) (2,1) = (-5/5)(2,1) = (-2,-1).
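The projection formula proj = (b·a / a·a) a can be checked directly for b = (-3,1) and a = (2,1):

```python
# Orthogonal projection of b onto the line span{a}.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a = (2, 1)                                 # basis for the line y = x/2
b = (-3, 1)
c = dot(b, a) / dot(a, a)                  # (-6 + 1) / 5 = -1
proj = tuple(c * ai for ai in a)
assert proj == (-2.0, -1.0)

# the residual b - proj is orthogonal to the line
resid = tuple(bi - pi for bi, pi in zip(b, proj))
assert dot(resid, a) == 0
```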

Class 25: Question 9

Class 25: Answer 9: (D) The line labeled l is y = 3x, which is the subspace span{(1,3)}. If z is the projection of b onto this subspace, it will be some scalar multiple of (1,3), i.e. (c,3c). However, the more important part of the problem is the interpretation of b - z, which equals b - proj_l(b). Recall that this new vector b - z is orthogonal to l by definition, so there is no way b - z could be both a point on l and orthogonal to it. Thus only two of the statements are true.
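The orthogonality of the residual b - z can be demonstrated for the line y = 3x. The vector b below is an illustrative choice, not taken from the original slide (which was an image):

```python
# For l = span{(1, 3)}, the residual b - z (z = proj onto l) is
# always orthogonal to l.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a = (1, 3)                                 # direction of the line y = 3x
b = (4, 2)                                 # hypothetical example vector
c = dot(b, a) / dot(a, a)                  # = 10/10 = 1
z = tuple(c * ai for ai in a)              # z = (1, 3), a multiple of a
resid = tuple(bi - zi for bi, zi in zip(b, z))
assert dot(resid, a) == 0                  # b - z is perpendicular to l
```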