Section 6.6 Orthogonal Matrices

ORTHOGONAL MATRICES A square matrix with the property that A⁻¹ = Aᵀ is said to be an orthogonal matrix. It follows that a square matrix A is orthogonal if and only if AᵀA = AAᵀ = I.
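A minimal NumPy sketch (my own illustration, not from the slides) checking the defining property on a rotation matrix, which is orthogonal:

```python
import numpy as np

theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A is orthogonal iff AᵀA = AAᵀ = I (equivalently A⁻¹ = Aᵀ).
print(np.allclose(A.T @ A, np.eye(2)))      # True
print(np.allclose(np.linalg.inv(A), A.T))   # True
```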

THEOREM Theorem 6.6.1: The following are equivalent for an n×n matrix A. (a) A is orthogonal. (b) The row vectors of A form an orthonormal set in Rⁿ with the Euclidean inner product. (c) The column vectors of A form an orthonormal set in Rⁿ with the Euclidean inner product.
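A small numerical check (assumption: the 3×3 orthogonal matrix comes from a QR factorization) that the rows, and equally the columns, form an orthonormal set: pairwise dot products are 0 and each vector has norm 1.

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # Q is orthogonal

rows = Q        # row vectors of Q
cols = Q.T      # column vectors of Q (as rows of Qᵀ)
for vecs in (rows, cols):
    gram = vecs @ vecs.T        # entry (i, j) is the dot product vᵢ ∙ vⱼ
    print(np.allclose(gram, np.eye(3)))   # True: an orthonormal set
```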

PROPERTIES OF ORTHOGONAL MATRICES Theorem 6.6.2: (a) The inverse of an orthogonal matrix is orthogonal. (b) A product of orthogonal matrices is orthogonal. (c) If A is orthogonal, then det(A) = 1 or det(A) = −1.
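A hedged NumPy check of these three properties on two concrete orthogonal matrices, a rotation and a reflection (the helper is_orthogonal is my own name):

```python
import numpy as np

def is_orthogonal(M, tol=1e-12):
    return np.allclose(M.T @ M, np.eye(M.shape[0]), atol=tol)

A = np.array([[0.0, -1.0], [1.0, 0.0]])     # 90-degree rotation, det = 1
B = np.array([[1.0, 0.0], [0.0, -1.0]])     # reflection across the x-axis, det = -1

print(is_orthogonal(np.linalg.inv(A)))      # (a) the inverse is orthogonal
print(is_orthogonal(A @ B))                 # (b) the product is orthogonal
print(np.linalg.det(A), np.linalg.det(B))   # (c) determinants are +1 and -1
```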

ORTHOGONAL MATRICES AS LINEAR OPERATORS Theorem 6.6.3: If A is an n×n matrix, the following are equivalent. (a) A is orthogonal. (b) ||Ax|| = ||x|| for all x in Rⁿ. (c) Ax ∙ Ay = x ∙ y for all x and y in Rⁿ.
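A quick numerical illustration of parts (b) and (c) (any orthogonal A will do; here one is generated by QR factorization):

```python
import numpy as np

rng = np.random.default_rng(1)
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # a 4×4 orthogonal matrix
x = rng.standard_normal(4)
y = rng.standard_normal(4)

print(np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x)))   # ||Ax|| = ||x||
print(np.isclose((A @ x) @ (A @ y), x @ y))                   # Ax ∙ Ay = x ∙ y
```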

ORTHOGONAL LINEAR OPERATORS If T: Rⁿ → Rⁿ is multiplication by an orthogonal matrix A, then T is called an orthogonal linear operator. Note: It follows from parts (a) and (b) of the preceding theorem that orthogonal linear operators are precisely those operators that leave the lengths of vectors unchanged.
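A sketch (the operator T and the vector chosen here are my own example) of an orthogonal linear operator: multiplication by a 90-degree rotation matrix changes a vector's direction but not its length.

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])      # orthogonal, so T is an orthogonal operator

def T(x):
    return A @ x

x = np.array([3.0, 4.0])
print(T(x))                                       # [-4.  3.]  rotated vector
print(np.linalg.norm(x), np.linalg.norm(T(x)))    # both 5.0: length unchanged
```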

CHANGE OF ORTHONORMAL BASIS Theorem 6.6.4: If P is the transition matrix from one orthonormal basis to another orthonormal basis for an inner product space, then P is an orthogonal matrix; that is, P⁻¹ = Pᵀ.
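An illustration of the theorem in R³ with the Euclidean inner product (the two orthonormal bases here are generated by QR factorizations; this is my own example, not from the slides). The transition matrix from the second basis to the first has as its j-th column the coordinates of the j-th vector of the second basis relative to the first, i.e. B1⁻¹B2 = B1ᵀB2, and it comes out orthogonal.

```python
import numpy as np

rng = np.random.default_rng(2)
B1, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # columns: first orthonormal basis
B2, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # columns: second orthonormal basis

P = B1.T @ B2                            # transition matrix from basis B2 to basis B1
print(np.allclose(P.T @ P, np.eye(3)))   # True: P⁻¹ = Pᵀ, so P is orthogonal
```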