Math 307 Spring, 2003 Hentzel Time: 1:10-2:00 MWF Room: 1324 Howe Hall Instructor: Irvin Roy Hentzel Office 432 Carver Phone 515-294-8141

Text: Linear Algebra With Applications, Second Edition, by Otto Bretscher.

Previous Assignment: Friday, Mar 28, Chapter 5.5, Page 240, Problems 1 through

1. If matrix A is orthogonal, then matrix A^2 must be orthogonal as well. True. Orthogonal matrices are closed under multiplication: (A^2)^T A^2 = A^T (A^T A) A = A^T A = I.

2. The equation (AB)^T = A^T B^T holds for all n x n matrices A, B. False. The correct version is (AB)^T = B^T A^T. For example, take

A = | 0 1 |    B = | 0 0 |
    | 0 0 |        | 1 0 |

Then (AB)^T = | 1 0 |^T = | 1 0 |
              | 0 0 |     | 0 0 |

but A^T B^T = | 0 0 | | 0 1 | = | 0 0 |
              | 1 0 | | 0 0 |   | 0 1 |

while B^T A^T = | 0 1 | | 0 0 | = | 1 0 | = (AB)^T.
                | 0 0 | | 1 0 |   | 0 0 |
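The counterexample above is easy to check numerically. A minimal sketch in plain Python (the matmul and transpose helpers are my own, not from the text):

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

def transpose(X):
    return [list(r) for r in zip(*X)]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]

AB_T = transpose(matmul(A, B))              # [[1, 0], [0, 0]]
AtBt = matmul(transpose(A), transpose(B))   # [[0, 0], [0, 1]] -- not (AB)^T
BtAt = matmul(transpose(B), transpose(A))   # [[1, 0], [0, 0]] -- equals (AB)^T
```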

3. If A and B are symmetric n x n matrices, then A+B must be symmetric as well. True. The symmetric matrices form a subspace, so they are closed under addition.

4. If matrices A and S are orthogonal, then S^-1 A S is orthogonal as well. True. S^-1 = S^T is orthogonal, and the product of orthogonal matrices is orthogonal.

5. All nonzero symmetric matrices are invertible. False. A counterexample is

| 1 1 |
| 1 1 |

which has rank 1.

6. If A is an n x n matrix such that A A^T = I, then A must be an orthogonal matrix. True. Since A is square, A A^T = I means A^T = A^-1, so A^T A = I as well. Thus the columns of A are orthonormal.

7. If V is a unit vector in R^n, and L = span[V], then Proj_L(X) = (X o V) X for all vectors X in R^n. False. (X o V) X is a multiple of X, which need not lie in L. The correct formula is Proj_L(X) = (X o V) V.

8. If A is a symmetric matrix, then 7 A must be symmetric as well. True. The symmetric matrices form a subspace, so they are closed under scalar multiplication.

9. If T is a linear transformation from R^n to R^n such that T(E_1), T(E_2), ..., T(E_n) are all unit vectors, then T must be an orthogonal transformation. False.

| 1 1 |
| 0 0 |

is a counterexample: both columns are unit vectors, but they are not orthogonal.

10. If A is an invertible matrix, then the equation (A^T)^-1 = (A^-1)^T must hold. True. I = I^T = (A A^-1)^T = (A^-1)^T A^T, so (A^-1)^T is the inverse of A^T.

11. If A and B are symmetric n x n matrices, then ABBA must be symmetric as well. True. (ABBA)^T = A^T B^T B^T A^T = ABBA.

12. If matrices A and B commute, then matrices A^T and B^T must commute as well. True. A^T B^T = (B A)^T = (A B)^T = B^T A^T.

13. There is a subspace V of R^5 such that dim(V) = dim(V^⊥), where V^⊥ denotes the orthogonal complement of V. False: dim(V) + dim(V^⊥) = 5, which is odd, so the two dimensions cannot be equal.

14. Every invertible matrix A can be expressed as the product of an orthogonal matrix and an upper triangular matrix. True. This is the A = Q R decomposition.
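Question 14 can be made concrete. Below is a minimal Gram-Schmidt sketch of the A = QR decomposition in plain Python (the qr helper and the sample matrix are my own illustration, not from the text):

```python
import math

def qr(A):
    """Classical Gram-Schmidt: orthonormalize the columns of A to get Q;
    R = Q^T A is then upper triangular."""
    n = len(A)
    cols = [[A[i][j] for i in range(n)] for j in range(n)]
    q = []  # orthonormal columns built so far
    for v in cols:
        w = v[:]
        for u in q:
            c = sum(a * b for a, b in zip(u, v))   # component of v along u
            w = [wi - c * ui for wi, ui in zip(w, u)]
        norm = math.sqrt(sum(x * x for x in w))
        q.append([x / norm for x in w])
    Q = [[q[j][i] for j in range(n)] for i in range(n)]
    R = [[sum(q[i][k] * A[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
    return Q, R

A = [[3, 1], [4, 2]]
Q, R = qr(A)
# Q ~ [[0.6, -0.8], [0.8, 0.6]] (orthogonal), R ~ [[5, 2.2], [0, 0.4]] (upper triangular)
```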

15. If X and Y are two vectors in R^n, then the equation |X+Y|^2 = |X|^2 + |Y|^2 must hold. False. This only holds when X and Y are orthogonal.

16. If A is an n x n matrix such that | A U | = 1 for all unit vectors U, then A must be an orthogonal matrix. True. This means that | A X | = | X | for all X, and a transformation that preserves length is orthogonal.

17. If matrix A is orthogonal, then A^T must be orthogonal as well. True. If A is orthogonal, then A is square and A^T A = I, which means A A^T = I as well, so A^T is orthogonal.

18. If A and B are symmetric n x n matrices, then AB must be symmetric as well. FALSE.

| 0 1 | | 1 0 | = | 0 0 |
| 1 0 | | 0 0 |   | 1 0 |

is a counterexample.

19. If V is a subspace of R^n and X is a vector in R^n, then the inequality X o Proj_V X >= 0 must hold. True. Write X = Y + N, where Y = Proj_V X is in V and N is perpendicular to V. Then Y o N = 0, so X o Y = (Y + N) o Y = Y o Y + N o Y = Y o Y >= 0.

20. If A is any matrix with ker(A) = {0}, then the matrix A A^T represents the orthogonal projection onto the image of A. False. This is true only if the columns of A are orthonormal. If not, one has to use A (A^T A)^-1 A^T.
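The distinction in question 20 can be seen numerically. A minimal plain-Python sketch (the column vector and the matmul helper are my own): for a matrix A with a single column a, the formula A (A^T A)^-1 A^T reduces to a a^T / (a o a), which is a projection, while A A^T alone is not.

```python
def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

a = [1, 2]                                 # one column, not a unit vector
aa = sum(x * x for x in a)                 # a o a = 5

P = [[x * y / aa for y in a] for x in a]   # A (A^T A)^-1 A^T = a a^T / (a o a)
P2 = matmul(P, P)                          # P^2 = P, so P is a projection

AAT = [[x * y for y in a] for x in a]      # A A^T with no correction factor
AAT2 = matmul(AAT, AAT)                    # [[5, 10], [10, 20]] = 5 * A A^T: not a projection
```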

21. The entries of an orthogonal matrix are all less than or equal to 1. True. Since the squares of the entries in each column add to 1, each entry has absolute value at most 1.

22. For every nonzero subspace of R^n there is an orthonormal basis. True. This is the Gram-Schmidt process.

23. | 3 -4 |  is an orthogonal matrix.
    | 4  3 |

False. The columns are not of unit length (each has length 5).

24. If V is a subspace of R^n and X is a vector in R^n, then the vector Proj_V X must be orthogonal to the vector X - Proj_V X. True. X - Proj_V X is perpendicular to the subspace V, and Proj_V X lies in V, so Proj_V X is perpendicular to X - Proj_V X.

25. If A and B are orthogonal 2x2 matrices, then A B = B A. False. With

A = 1/Sqrt[2] | 1 -1 |    and    B = 1/Sqrt[2] | 1  1 |
              | 1  1 |                         | 1 -1 |

A B = | 0 1 |    but    B A = | 1  0 |
      | 1 0 |                 | 0 -1 |
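A quick numeric check of these two products; a plain-Python sketch (the matmul helper is my own):

```python
import math

def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

s = 1 / math.sqrt(2)
A = [[s, -s], [s, s]]      # rotation by 45 degrees
B = [[s, s], [s, -s]]      # a reflection

AB = matmul(A, B)          # ~ [[0, 1], [1, 0]]
BA = matmul(B, A)          # ~ [[1, 0], [0, -1]], so AB != BA
```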

26. If A is a symmetric matrix, vector V is in the image of A and W is in the kernel of A, then the equation V o W = 0 must hold. True. Write V = AX. Then V o W = (AX)^T W = X^T A^T W = X^T A W = X^T (AW) = 0, since AW = 0.

27. The formula ker(A) = ker(A^T A) holds for all matrices A. True. If AX = 0, then A^T (AX) = 0. Conversely, if A^T A X = 0, then X^T A^T A X = 0, so (AX) o (AX) = 0 and thus AX = 0.

28. If A^T A = A A^T for an n x n matrix A, then A must be orthogonal. False. It is true for any symmetric matrix, including

| 1 1 |
| 1 1 |

29. If A is any symmetric 2x2 matrix, then there must be a real number x such that A - x I_2 fails to be invertible. True. Write

A = | a b |
    | b c |

Then det(A - x I_2) = (a-x)(c-x) - b^2, a quadratic in x with discriminant (a-c)^2 + 4 b^2 >= 0, so it has a real root x, and for that x the matrix A - x I_2 is not invertible.

30. If A is any matrix, then the matrix 1/2(A - A^T) is skew-symmetric. False: if A is not square, then A - A^T is not defined. If A is square, then it is true: (1/2(A - A^T))^T = 1/2(A^T - A) = -1/2(A - A^T).

31. If A is an invertible matrix such that A^-1 = A, then A must be orthogonal. False.

A = | 1  b |
    | 0 -1 |

squares to the identity for any b, but is not orthogonal when b is nonzero.

32. If the entries of two vectors V and W in R^n are all positive, then V and W must enclose an acute angle. True. V o W is positive, so Cos[theta] = (V o W)/(|V| |W|) is positive and theta < Pi/2.

33. The formula (ker B)^⊥ = im(B^T) holds for all matrices B. True. X is in ker(B) exactly when X is orthogonal to every row of B, that is, to every column of B^T; so ker(B) = (im B^T)^⊥, and taking orthogonal complements gives the formula.

34. The matrix A^T A is symmetric for all matrices A. True. (A^T A)^T = A^T (A^T)^T = A^T A.

35. If matrix A is similar to B and A is orthogonal, then B must be orthogonal as well. False. Take the orthogonal matrix

A = 1/Sqrt[2] | 1 -1 |    and    S = | 1 1 | ,  so  S^-1 = | 1 -1 |
              | 1  1 |               | 0 1 |               | 0  1 |

Then S^-1 A S = 1/Sqrt[2] | 0 -2 |
                          | 1  2 |

which is similar to A but not orthogonal.
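This similarity computation can be verified numerically; a plain-Python sketch (the matmul helper is my own):

```python
import math

def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

s = 1 / math.sqrt(2)
A = [[s, -s], [s, s]]              # orthogonal: rotation by 45 degrees
S = [[1, 1], [0, 1]]
S_inv = [[1, -1], [0, 1]]

B = matmul(S_inv, matmul(A, S))    # ~ (1/Sqrt[2]) [[0, -2], [1, 2]]
first_col_len = math.hypot(B[0][0], B[1][0])   # ~ 0.707, not 1: B is not orthogonal
```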

36. The formula Im(B) = Im(B^T B) holds for all square matrices B. False.

B = | 0 1 |  has image  | x |
    | 0 0 |             | 0 |

but B^T B = | 0 0 | | 0 1 | = | 0 0 |  has image  | 0 |
            | 1 0 | | 0 0 |   | 0 1 |             | x |

37. If matrix A is symmetric and matrix S is orthogonal, then the matrix S^-1 A S must be symmetric. True. Since S is orthogonal, S^-1 = S^T, and (S^T A S)^T = S^T A^T S = S^T A S when A is symmetric.

39. There are orthogonal 2x2 matrices A and B such that A+B is orthogonal as well. True.

| 1/2        -Sqrt[3]/2 |    and    | 1/2         Sqrt[3]/2 |
| Sqrt[3]/2   1/2       |           | -Sqrt[3]/2  1/2       |

are two orthogonal matrices (rotations by +60 and -60 degrees) which add to I_2.

40. If | AX | <= | X | for all X in R^n, then A must represent the orthogonal projection onto a subspace V of R^n. False: let A = 1/2 I. Then |AX| = 1/2 |X| <= |X|, but A^2 = 1/4 I is not equal to A, so A is not a projection.

41. Any square matrix can be written as the sum of a symmetric and a skew-symmetric matrix. True. A = 1/2(A + A^T) + 1/2(A - A^T).
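The decomposition in question 41 is easy to check on a concrete matrix; a plain-Python sketch (the sample matrix and helper are my own):

```python
def transpose(X):
    return [list(r) for r in zip(*X)]

A = [[1, 2], [3, 4]]
At = transpose(A)

Sym  = [[(A[i][j] + At[i][j]) / 2 for j in range(2)] for i in range(2)]
Skew = [[(A[i][j] - At[i][j]) / 2 for j in range(2)] for i in range(2)]

# Sym  = [[1.0, 2.5], [2.5, 4.0]]   (symmetric)
# Skew = [[0.0, -0.5], [0.5, 0.0]]  (skew-symmetric)
# and Sym + Skew recovers A entry by entry
```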

42. If x_1, x_2, ..., x_n are any real numbers, then the inequality

( SUM_{k=1}^n x_k )^2  <=  n SUM_{k=1}^n x_k^2

must hold. True. Apply the Cauchy-Schwarz inequality |A o B|^2 <= |A|^2 |B|^2 with B = (x_1, ..., x_n) and A = the vector all of whose entries are 1.
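A randomized sanity check of this inequality; a plain-Python sketch (the sample sizes and seed are my own choices):

```python
import random

# Check (SUM x_k)^2 <= n * SUM x_k^2 on random data.
random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    xs = [random.uniform(-5, 5) for _ in range(n)]
    lhs = sum(xs) ** 2
    rhs = n * sum(x * x for x in xs)
    assert lhs <= rhs + 1e-9   # small tolerance for floating-point roundoff
```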

43. If A A^T = A^2 for a 2x2 matrix A, then A must be symmetric. True. A(A^T - A) = 0, and N = A^T - A is skew-symmetric, say

N = | 0  c |
    | -c 0 |

If c were nonzero, N would be invertible (det N = c^2), and A N = 0 would force A = 0; but then N = A^T - A = 0, a contradiction. Hence c = 0 and A = A^T.