Orthogonal Sets (12/2/05) Recall that "orthogonal" matches the geometric idea of "perpendicular". Definition. A set of vectors u_1, u_2, …, u_p in R^n is called an orthogonal set if each pair of distinct vectors is orthogonal, i.e., u_i · u_j = 0 whenever i ≠ j. An orthogonal basis for a subspace W of R^n is (you can guess!) a basis for W that is also an orthogonal set.
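As a quick illustration (a sketch added here, not part of the original slides), the definition is easy to check numerically. The helper name `is_orthogonal_set` is my own, and the example vectors are a common textbook orthogonal set in R^3:

```python
import numpy as np

def is_orthogonal_set(vectors, tol=1e-12):
    # Check that every pair of distinct vectors has dot product zero;
    # `tol` absorbs floating-point round-off.
    return all(abs(np.dot(vectors[i], vectors[j])) <= tol
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

# u1.u2 = -3+2+1 = 0, u1.u3 = -3-4+7 = 0, u2.u3 = 1-8+7 = 0.
u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-1.0, -4.0, 7.0])
print(is_orthogonal_set([u1, u2, u3]))  # True
```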

Orthogonal bases are nice! Orthogonal bases are especially easy to work with, since the weights of any vector with respect to such a basis can be computed directly (no row reduction). Theorem. If u_1, u_2, …, u_p is an orthogonal basis for W and y = c_1 u_1 + c_2 u_2 + … + c_p u_p, then for each i, c_i = (y · u_i) / (u_i · u_i). Example: Verify that (1,1) and (1,-1) form an orthogonal basis for R^2 and express (5,-2) in terms of this basis.
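Here is a minimal NumPy sketch of that example (the function name `weights` is mine, not from the slides): since (1,1) · (1,-1) = 0 and two orthogonal nonzero vectors in R^2 form a basis, the theorem gives the coordinates of (5,-2) directly.

```python
import numpy as np

def weights(y, basis):
    # c_i = (y . u_i) / (u_i . u_i) for an orthogonal basis u_1, ..., u_p.
    return [np.dot(y, u) / np.dot(u, u) for u in basis]

u1 = np.array([1.0, 1.0])
u2 = np.array([1.0, -1.0])
y = np.array([5.0, -2.0])

print(np.dot(u1, u2))       # 0.0, so {u1, u2} is an orthogonal basis for R^2
c1, c2 = weights(y, [u1, u2])
print(c1, c2)               # 1.5 3.5, i.e. y = (3/2) u1 + (7/2) u2
print(c1 * u1 + c2 * u2)    # [ 5. -2.], recovering y
```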

Orthogonal Projections Looking at the picture of the above example, we see that the weights we get give the orthogonal projection of the given vector onto the lines determined by the basis vectors. That is, if L is the line determined by a basis vector u, then the orthogonal projection of a vector y onto L is just proj_L y = c u = ((y · u) / (u · u)) u.
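A sketch of the projection formula in the same vein (the helper name `proj_onto_line` is hypothetical), including a check that the residual y - proj_L y is orthogonal to u:

```python
import numpy as np

def proj_onto_line(y, u):
    # Orthogonal projection of y onto L = span{u}: ((y . u) / (u . u)) u.
    return (np.dot(y, u) / np.dot(u, u)) * u

y = np.array([5.0, -2.0])
u = np.array([1.0, 1.0])

p = proj_onto_line(y, u)
print(p)                  # [1.5 1.5], the weight c = 3/2 from the example times u
print(np.dot(y - p, u))   # 0.0: the residual y - p is perpendicular to the line
```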

Orthonormal Sets An orthogonal set all of whose vectors are unit vectors (recall, this means they have norm, or length, equal to 1) is called an orthonormal set. Note that orthonormal bases are even simpler than orthogonal bases, since now each weight on y is just y · u_i. The standard basis e_1, e_2, …, e_n for R^n is obviously an orthonormal basis.
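To see why the weights simplify, here is a small sketch (my own, under the same assumptions as above) that normalizes the earlier orthogonal basis; with unit vectors the denominator u_i · u_i equals 1, so each weight is just a dot product:

```python
import numpy as np

u1 = np.array([1.0, 1.0]) / np.sqrt(2.0)   # unit vectors: norm 1
u2 = np.array([1.0, -1.0]) / np.sqrt(2.0)
y = np.array([5.0, -2.0])

c1, c2 = np.dot(y, u1), np.dot(y, u2)      # no division needed
print(np.allclose(c1 * u1 + c2 * u2, y))   # True
```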

Orthogonal Matrices A square matrix is called orthogonal if its columns form an orthonormal set. It's easy to check that if U is an orthogonal matrix, then U^T U = I_n. Hence if U is orthogonal, its transpose and its inverse are the same!
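For instance (an illustrative sketch, not from the slides), 2-by-2 rotation matrices are orthogonal, and NumPy confirms that the transpose agrees with the inverse:

```python
import numpy as np

theta = 0.7  # any angle works; rotations are the classic orthogonal matrices
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(U.T @ U, np.eye(2)))     # True: U^T U = I_2
print(np.allclose(U.T, np.linalg.inv(U)))  # True: U^{-1} = U^T
```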

Assignments Correct your test (due Wed 12/7). On Monday (12/5), we will have Lab #4 on diagonalization of matrices; please read Section 5.3 in preparation for that lab. For Wednesday (12/7), please read Section 6.2 and do Exercises 1-23 odd.