Review of Linear Algebra


Review of Linear Algebra Dr. Chang Shu COMP 4900C Winter 2008

Linear Equations A system of linear equations, e.g. a11 x1 + a12 x2 = b1, a21 x1 + a22 x2 = b2, can be written in matrix form, or in general: Ax = b, where A is the matrix of coefficients, x the vector of unknowns, and b the right-hand side.
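A small system like the one above can be solved numerically. The following is a minimal NumPy sketch; the particular coefficients are illustrative, not from the slides.

```python
import numpy as np

# Illustrative 2x2 system (not from the slides):
#   3x + 2y = 7
#   1x - 1y = -1
A = np.array([[3.0, 2.0],
              [1.0, -1.0]])
b = np.array([7.0, -1.0])

# Solves Ax = b for a square, invertible A
x = np.linalg.solve(A, b)
```

Here `np.linalg.solve` is preferable to computing `A^-1` explicitly: it is both faster and more numerically stable.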

Vectors A vector is an ordered list of numbers, e.g. v = (v1, v2, ..., vn). The length or the norm of a vector is ||v|| = sqrt(v1^2 + v2^2 + ... + vn^2), e.g. ||(3, 4)|| = 5.

Vector Arithmetic Vector addition: u + v adds corresponding components. Vector subtraction: u - v subtracts corresponding components. Multiplication by scalar: c*v scales every component by c.
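These componentwise operations and the norm can be sketched with NumPy (the vectors chosen here are illustrative):

```python
import numpy as np

u = np.array([2.0, 1.0])
v = np.array([1.0, 3.0])

s = u + v                    # vector addition, componentwise
d = u - v                    # vector subtraction, componentwise
w = 3.0 * u                  # multiplication by a scalar
norm_u = np.linalg.norm(u)   # ||u|| = sqrt(2^2 + 1^2) = sqrt(5)
```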

Dot Product (inner product) x . y = x1 y1 + x2 y2 + ... + xn yn = ||x|| ||y|| cos(theta), where theta is the angle between x and y.
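A quick sketch of the dot product and the angle formula, with illustrative vectors:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -5.0, 6.0])

dot = np.dot(x, y)   # 1*4 + 2*(-5) + 3*6 = 12

# Recover cos(theta) from x . y = ||x|| ||y|| cos(theta)
cos_theta = dot / (np.linalg.norm(x) * np.linalg.norm(y))
```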

Linear Independence A set of vectors is linearly dependent if one of the vectors can be expressed as a linear combination of the other vectors. A set of vectors is linearly independent if none of the vectors can be expressed as a linear combination of the other vectors.
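One practical way to test independence, not mentioned on the slide but standard, is via matrix rank: vectors stacked as rows are independent exactly when the rank equals their count.

```python
import numpy as np

# Stack vectors as rows; independent iff rank equals the number of vectors.
dependent = np.array([[1.0, 2.0],
                      [2.0, 4.0]])      # second row = 2 * first row
independent = np.array([[1.0, 0.0],
                        [0.0, 1.0]])    # standard basis vectors

rank_dep = np.linalg.matrix_rank(dependent)     # 1 < 2 -> dependent
rank_ind = np.linalg.matrix_rank(independent)   # 2 == 2 -> independent
```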

Vectors and Points Two points P(x1, y1) and Q(x2, y2) in a Cartesian coordinate system define a vector v = (x2 - x1, y2 - y1). A point can also be represented as a vector, defined by the point and the origin (0, 0). Note: a point and a vector are different things; vectors do not have positions.

Matrix A matrix is an m×n array of numbers, e.g. a 2×3 matrix has 2 rows and 3 columns.

Matrix Arithmetic Matrix addition: add corresponding entries. Matrix multiplication: (AB)ij = sum over k of Aik Bkj. Matrix transpose: (A^T)ij = Aji, i.e. rows become columns.
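These three operations can be sketched in NumPy (matrices chosen for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

S = A + B    # matrix addition, entrywise
P = A @ B    # matrix multiplication: P[i, j] = sum_k A[i, k] * B[k, j]
T = A.T      # transpose: rows become columns
```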

Matrix multiplication is not commutative: in general, AB ≠ BA.
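The slide's concrete example was lost; a pair of matrices that demonstrates the point is:

```python
import numpy as np

# Illustrative pair (not the slide's original example)
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[1.0, 0.0],
              [1.0, 1.0]])

AB = A @ B   # [[2, 1], [1, 1]]
BA = B @ A   # [[1, 1], [1, 2]]  -> AB != BA
```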

Symmetric Matrix We say matrix A is symmetric if A = A^T, i.e. Aij = Aji for all i, j. A symmetric matrix must be a square matrix.

Inverse of a Matrix If A is a square matrix, the inverse of A, written A^-1, satisfies A A^-1 = A^-1 A = I, where I, the identity matrix, is a diagonal matrix with all 1's on the diagonal. Not every square matrix has an inverse; A^-1 exists exactly when A has nonzero determinant.
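A minimal NumPy sketch of the defining property, with an illustrative invertible matrix:

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])   # det(A) = 4*6 - 7*2 = 10, so A is invertible

A_inv = np.linalg.inv(A)

# Both products recover the identity matrix (up to floating-point rounding)
left = A @ A_inv
right = A_inv @ A
```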

Trace of a Matrix The trace of a square matrix is the sum of its diagonal entries: tr(A) = a11 + a22 + ... + ann.

Orthogonal Matrix A matrix A is orthogonal if A^T A = A A^T = I, or equivalently A^-1 = A^T. Example: any rotation matrix is orthogonal.
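A sketch checking both characterizations on a 2D rotation matrix (a standard example of an orthogonal matrix):

```python
import numpy as np

theta = np.pi / 6
# A 2D rotation matrix; its columns are orthonormal.
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

check_identity = Q.T @ Q          # should equal I
check_inverse = np.linalg.inv(Q)  # should equal Q^T
```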

Matrix Transformation A matrix-vector multiplication y = Ax transforms one vector into another.

Coordinate Rotation Rotating a vector counter-clockwise by an angle theta is the linear transformation given by the matrix R(theta) with rows (cos theta, -sin theta) and (sin theta, cos theta).
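The rotation above can be sketched as a small function; rotating (1, 0) by 90 degrees should give (0, 1):

```python
import numpy as np

def rotate(v, theta):
    """Rotate a 2D vector counter-clockwise by angle theta (radians)."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ v

v = np.array([1.0, 0.0])
v90 = rotate(v, np.pi / 2)   # (1, 0) rotated 90 degrees
```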

Eigenvalue and Eigenvector We say that x is an eigenvector of a square matrix A if Ax = lambda*x for some scalar lambda; lambda is called an eigenvalue and x is called an eigenvector. The transformation defined by A changes only the magnitude of such a vector, not its direction. Example: a matrix such as A with rows (4, 1) and (2, 3) has 5 and 2 as eigenvalues, and (1, 1) and (1, -2) as the corresponding eigenvectors.
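The slide's original matrix was lost; the matrix below is an illustrative choice whose characteristic polynomial is l^2 - 7l + 10 = (l - 5)(l - 2), so it has the eigenvalues 5 and 2 named on the slide:

```python
import numpy as np

# Illustrative matrix with eigenvalues 5 and 2
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors

# Each pair satisfies the defining equation A v = lambda v
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```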

Properties of Eigenvectors If lambda1, lambda2, ..., lambdaq are distinct eigenvalues of a matrix, then the corresponding eigenvectors e1, e2, ..., eq are linearly independent. A real, symmetric matrix has real eigenvalues with eigenvectors that can be chosen to be orthonormal.

Least Squares When m > n for an m-by-n matrix A, the system Ax = b generally has no solution. In this case, we look for an approximate solution: a vector x such that ||Ax - b|| is as small as possible. This is the least-squares solution.

Least Squares The least-squares solution of the linear system Ax = b satisfies the normal equation A^T A x = A^T b, where A^T A is square and symmetric. The least-squares solution makes ||Ax - b||^2 minimal.

Least Squares Fitting of a Line Line equation: y = c + d*x. Each data point (xi, yi) gives one equation c + d*xi = yi. The best solution (c, d) is the one that minimizes the sum of squared errors, i.e. the sum over i of (c + d*xi - yi)^2.

Least Squares Fitting - Example Problem: find the line y = c + d*x that best fits the three points P1 = (-1, 1), P2 = (1, 1), P3 = (2, 3). Solution: the system c - d = 1, c + d = 1, c + 2d = 3 is overdetermined. The normal equation A^T A [c, d]^T = A^T b gives 3c + 2d = 5 and 2c + 6d = 6, so the solution is c = 9/7, d = 4/7, and the best line is y = 9/7 + (4/7)x.
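The fit above can be verified numerically with `np.linalg.lstsq`, which computes the least-squares solution directly:

```python
import numpy as np

# The three points from the slide
xs = np.array([-1.0, 1.0, 2.0])
ys = np.array([1.0, 1.0, 3.0])

# Design matrix for y = c + d*x: each row is [1, x_i]
A = np.column_stack([np.ones_like(xs), xs])

# Least-squares solution minimizes ||A [c, d]^T - y||
(c, d), *_ = np.linalg.lstsq(A, ys, rcond=None)
# Expected: c = 9/7, d = 4/7
```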

SVD: Singular Value Decomposition An m×n matrix A can be decomposed into A = U D V^T, where U is m×m, V is n×n, and both of them have orthonormal columns; D is an m×n diagonal matrix whose diagonal entries are the singular values of A.
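A minimal sketch of the full SVD on an illustrative 3×2 matrix, reassembling A from its factors:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])        # 3x2 matrix

U, s, Vt = np.linalg.svd(A)       # full SVD: U is 3x3, Vt is 2x2

# Embed the singular values in an m x n "diagonal" matrix D
D = np.zeros_like(A)
D[:2, :2] = np.diag(s)

A_rebuilt = U @ D @ Vt            # should recover A
```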