Subject: Applied Mathematics


Matrices (Applied Mathematics). Created by subject teacher Rahi Sonar.

Matrix Operations
Matrix addition/subtraction: the two matrices must be of the same size.
Matrix multiplication: an m x n matrix times a q x p matrix gives an m x p matrix; condition: n = q.
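To make the size rules concrete, here is a minimal NumPy sketch (NumPy and the example matrices are assumptions of this illustration, not content from the slides):

```python
import numpy as np

# Addition/subtraction: both matrices must have the same size.
A = np.array([[1, 2, 3],
              [4, 5, 6]])        # 2 x 3
B = np.array([[6, 5, 4],
              [3, 2, 1]])        # 2 x 3
print(A + B)                     # elementwise sum, still 2 x 3

# Multiplication: (m x n)(q x p) is defined only when n = q,
# and the result is m x p.
C = np.array([[1, 0],
              [0, 1],
              [1, 1]])           # 3 x 2
print(A @ C)                     # (2 x 3)(3 x 2) -> 2 x 2
```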

Identity Matrix: the square matrix I with ones on the main diagonal and zeros elsewhere; it satisfies AI = A and IA = A for any matrix A of compatible size.

Matrix Transpose: A^T is obtained by interchanging the rows and columns of A, so the (i, j) entry of A^T is the (j, i) entry of A. Useful properties: (A^T)^T = A, (A + B)^T = A^T + B^T, and (AB)^T = B^T A^T.

Symmetric Matrices: a square matrix A is symmetric if A = A^T, i.e., a_ij = a_ji for all i, j (for example, B B^T is symmetric for any matrix B).

Determinants
2 x 2: for A = [[a, b], [c, d]], det(A) = ad - bc.
3 x 3: expand along a row or column into 2 x 2 cofactors.
n x n: defined recursively by cofactor expansion along any row or column.
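A small cross-check of these formulas, assuming NumPy (not mentioned in the slides) and matrices chosen only for illustration:

```python
import numpy as np

# 2 x 2: det([[a, b], [c, d]]) = a*d - b*c
A2 = np.array([[1.0, 2.0],
               [3.0, 4.0]])
print(1.0 * 4.0 - 2.0 * 3.0)     # -2.0, by the 2 x 2 formula
print(np.linalg.det(A2))         # -2.0 (up to floating-point error)

# 3 x 3: cofactor expansion along the first row gives
# 2*(3*4 - 2*1) - 0*(1*4 - 2*0) + 1*(1*1 - 3*0) = 20 + 1 = 21.
A3 = np.array([[2.0, 0.0, 1.0],
               [1.0, 3.0, 2.0],
               [0.0, 1.0, 4.0]])
print(np.linalg.det(A3))         # ~ 21.0
```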

Determinants (cont'd): the determinant of a diagonal matrix is the product of its diagonal entries, det(diag(d1, ..., dn)) = d1 d2 ... dn.

Matrix Inverse
The inverse A^-1 of a square matrix A has the property A A^-1 = A^-1 A = I. A^-1 exists only if det(A) ≠ 0.
Terminology: a singular matrix is one whose inverse does not exist; an ill-conditioned matrix is one that is close to being singular.
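A minimal sketch of inversion and the singular/ill-conditioned terminology, assuming NumPy and illustrative matrices:

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
print(np.linalg.det(A))          # 10.0, non-zero, so A^-1 exists

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))   # True: A A^-1 = I
print(np.allclose(A_inv @ A, np.eye(2)))   # True: A^-1 A = I

# A singular matrix has determinant 0 and no inverse.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])       # second row = 2 * first row
print(np.linalg.det(S))          # ~ 0; np.linalg.inv(S) would raise LinAlgError

# A large condition number signals an ill-conditioned (nearly singular) matrix.
print(np.linalg.cond(A))
```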

Matrix Inverse (cont'd). Properties of the inverse: (A^-1)^-1 = A, (AB)^-1 = B^-1 A^-1, (A^T)^-1 = (A^-1)^T, and (kA)^-1 = (1/k) A^-1 for any scalar k ≠ 0.

Pseudo-inverse: the pseudo-inverse A^+ of a matrix A (which can be non-square, e.g., m x n) with linearly independent columns is given by A^+ = (A^T A)^-1 A^T. It can be shown that A^+ A = I.
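A sketch of the left pseudo-inverse formula stated above, assuming NumPy; the built-in np.linalg.pinv (computed via SVD) is used only as a cross-check:

```python
import numpy as np

# A tall 3 x 2 matrix with linearly independent columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# A^+ = (A^T A)^-1 A^T, valid because A^T A is invertible here.
A_plus = np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(A_plus @ A, np.eye(2)))        # True: A^+ A = I

# NumPy's pseudo-inverse agrees with the explicit formula.
print(np.allclose(A_plus, np.linalg.pinv(A)))    # True
```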

Matrix trace: tr(A) is the sum of the diagonal elements of A. Properties: tr(A + B) = tr(A) + tr(B), tr(kA) = k tr(A), tr(A^T) = tr(A), and tr(AB) = tr(BA).
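A quick numerical check of these trace properties, assuming NumPy and randomly generated matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

print(np.trace(A))                                             # sum of the diagonal entries
print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))  # True
print(np.isclose(np.trace(2.5 * A), 2.5 * np.trace(A)))        # True
print(np.isclose(np.trace(A.T), np.trace(A)))                  # True
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))            # True
```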

Rank of a matrix: equal to the dimension of the largest square sub-matrix of A that has a non-zero determinant. (The example matrix shown on the slide has rank 3.)

Rank of a matrix (cont'd). Alternative definition: the maximum number of linearly independent columns (or rows) of A. In the slide's example the rows are not all linearly independent, so the rank is not 4; a small numerical illustration follows below.
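The illustration mentioned above, as a minimal NumPy sketch (the matrix is an assumption chosen so that one row is the sum of the others, not the matrix from the slides):

```python
import numpy as np

# Row 3 = row 1 + row 2, so only two rows are linearly independent.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [1.0, 3.0, 7.0]])

print(np.linalg.matrix_rank(A))   # 2 (not 3)
print(np.linalg.det(A))           # ~ 0: the largest sub-matrix with a
                                  # non-zero determinant is 2 x 2
```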

Rank and singular matrices: an n x n matrix A is singular (A^-1 does not exist) exactly when rank(A) < n; equivalently, A is invertible exactly when rank(A) = n, i.e., when det(A) ≠ 0.

Orthogonal matrices. Notation: write the columns of A as a1, a2, ..., an. A is orthogonal if its columns are pairwise orthogonal, i.e., ai^T aj = 0 whenever i ≠ j.

Orthonormal matrices: A is orthonormal if A^T A = A A^T = I, i.e., its columns are pairwise orthogonal unit vectors. Note that if A is orthonormal, it is easy to find its inverse: A^-1 = A^T. Property: multiplication by an orthonormal matrix preserves lengths, ||Ax|| = ||x||.
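A sketch of these facts for a rotation matrix, assuming NumPy (both the library and the example are illustrative assumptions):

```python
import numpy as np

theta = np.pi / 6
# Columns of a rotation matrix are orthogonal unit vectors.
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))       # True: Q^T Q = I
print(np.allclose(np.linalg.inv(Q), Q.T))    # True: Q^-1 = Q^T

x = np.array([3.0, 4.0])
print(np.linalg.norm(x), np.linalg.norm(Q @ x))   # both 5.0: ||Qx|| = ||x||
```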

Eigenvalues and Eigenvectors
The vector v is an eigenvector of matrix A and λ is the corresponding eigenvalue of A if Av = λv (assume non-zero v).
Interpretation: the linear transformation implied by A cannot change the direction of an eigenvector v, only its magnitude (v is scaled by λ).
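A minimal sketch of the defining equation Av = λv, assuming NumPy and an illustrative 2 x 2 matrix:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding (unit-length) eigenvectors.
eigvals, eigvecs = np.linalg.eig(A)
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))   # True: A v = lambda v
```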

Computing λ and v: to find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial det(A - λI) = 0; then, for each eigenvalue, the corresponding eigenvectors are found by solving (A - λI)v = 0. An example is sketched below.
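A worked illustration of the characteristic-polynomial route, assuming NumPy and a 2 x 2 matrix chosen for the example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# By hand: det(A - lambda*I) = (2 - lambda)^2 - 1
#                            = lambda^2 - 4*lambda + 3
#                            = (lambda - 1)(lambda - 3),
# so the eigenvalues are lambda = 1 and lambda = 3.
coeffs = np.poly(A)        # characteristic polynomial coefficients
print(coeffs)              # [ 1. -4.  3.]
print(np.roots(coeffs))    # [3. 1.]

# Eigenvectors: solve (A - lambda*I) v = 0 for each eigenvalue.
# For lambda = 3, v is proportional to (1, 1).
print(np.allclose((A - 3.0 * np.eye(2)) @ np.array([1.0, 1.0]), 0.0))   # True
```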

Properties
Eigenvalues and eigenvectors are only defined for square matrices (i.e., m = n).
Eigenvectors are not unique: if v is an eigenvector, so is kv for any k ≠ 0.
Suppose λ1, λ2, ..., λn are the eigenvalues of A; then det(A) = λ1 λ2 ... λn and tr(A) = λ1 + λ2 + ... + λn.
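A quick numerical check of the last two identities, assuming NumPy and an illustrative 3 x 3 matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
eigvals = np.linalg.eigvals(A)

print(np.isclose(np.prod(eigvals), np.linalg.det(A)))   # True: product of eigenvalues = det(A)
print(np.isclose(np.sum(eigvals), np.trace(A)))         # True: sum of eigenvalues = tr(A)
```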

Properties (cont'd): A is positive definite if x^T A x > 0 for every non-zero vector x; equivalently, all eigenvalues of A are positive.
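A sketch of this positive-definiteness criterion, assuming NumPy and a small symmetric matrix chosen for illustration:

```python
import numpy as np

A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])
print(np.linalg.eigvalsh(A))        # [1. 3.]: all eigenvalues positive

rng = np.random.default_rng(1)
for _ in range(5):
    x = rng.standard_normal(2)      # random non-zero vectors
    print(float(x @ A @ x) > 0)     # True each time: x^T A x > 0
```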

Matrix diagonalization
Given A, find P such that P^-1 A P is diagonal (i.e., P diagonalizes A). Take P = [v1 v2 ... vn], where v1, v2, ..., vn are the eigenvectors of A; then P^-1 A P = D = diag(λ1, λ2, ..., λn).

Matrix diagonalization (cont'd). Example: a numerical illustration is sketched below.
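The example referred to above, done numerically under assumed NumPy usage (the matrix is illustrative, not the one on the slide):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # eigenvalues 5 and 2

# Columns of P are the eigenvectors of A.
eigvals, P = np.linalg.eig(A)
D = np.linalg.inv(P) @ A @ P

print(np.round(D, 10))                     # diagonal, with the eigenvalues on the diagonal
print(np.allclose(D, np.diag(eigvals)))    # True: P^-1 A P = D
```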

Are all n x n matrices diagonalizable? Only if P^-1 exists, i.e., only if A has n linearly independent eigenvectors, so that rank(P) = n. If A has n distinct eigenvalues λ1, λ2, ..., λn, then the corresponding eigenvectors v1, v2, ..., vn form a basis: they are (1) linearly independent and (2) span R^n.

Diagonalization → Decomposition: let us assume that A is diagonalizable; then A = P D P^-1, where D = diag(λ1, ..., λn) and the columns of P are the corresponding eigenvectors.

Decomposition: symmetric matrices
The eigenvalues of a symmetric matrix are all real, and eigenvectors corresponding to distinct eigenvalues are orthogonal. Choosing P with orthonormal eigenvectors as its columns gives P^-1 = P^T, so A = P D P^T = λ1 p1 p1^T + λ2 p2 p2^T + ... + λn pn pn^T.
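A sketch of this symmetric (spectral) decomposition, assuming NumPy; np.linalg.eigh is the symmetric-matrix eigen-solver used here for illustration:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric

# eigh returns real eigenvalues and orthonormal eigenvectors.
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)

print(np.allclose(np.linalg.inv(P), P.T))   # True: P^-1 = P^T
print(np.allclose(A, P @ D @ P.T))          # True: A = P D P^T

# Spectral form: A = sum_i lambda_i * p_i * p_i^T
A_rebuilt = sum(lam * np.outer(p, p) for lam, p in zip(eigvals, P.T))
print(np.allclose(A, A_rebuilt))            # True
```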