+ Review of Linear Algebra
10-725 Optimization, 1/14/10 Recitation
Sivaraman Balakrishnan

+ Outline
- Matrix subspaces
- Linear independence and bases
- Gaussian elimination
- Eigenvalues and eigenvectors
- Definiteness
- Matlab essentials: Geoff's LP sketcher, linprog, debugging and using documentation

Basic concepts
- A vector in R^n is an ordered set of n real numbers, e.g. v = (1, 6, 3, 4) is in R^4; it can be written as a column vector or as a row vector.
- An m-by-n matrix is an object with m rows and n columns, each entry filled with a (typically real) number.
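A minimal Matlab sketch of these objects (the values are illustrative):

% A row vector and a column vector in R^4
v_row = [1 6 3 4];         % 1-by-4
v_col = [1; 6; 3; 4];      % 4-by-1; note v_col == v_row'

% A 2-by-3 matrix, entered row by row
A = [1 2 3;
     4 5 6];
size(A)                    % returns [2 3]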

Basic concepts - II
- Vector dot product: u . v = u_1 v_1 + ... + u_n v_n (equivalently u^T v for column vectors).
- Matrix product: if A is m-by-n and B is n-by-p, then C = AB is the m-by-p matrix with C_ij = sum_k A_ik B_kj.
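A small Matlab sketch of both products (the values are illustrative):

% Dot product, two equivalent ways
u = [1; 2; 3];  v = [4; 5; 6];
d1 = dot(u, v);            % built-in dot product
d2 = u' * v;               % inner product as a 1-by-3 times 3-by-1 multiply

% Matrix product: inner dimensions must agree
A = [1 2; 3 4];            % 2-by-2
B = [5 6 7; 8 9 10];       % 2-by-3
C = A * B;                 % 2-by-3 result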

+ Matrix subspaces
What is a matrix?
- Geometric notion: a matrix is an object that "transforms" a vector from its row space to its column space.
- Vector space: a set of vectors closed under scalar multiplication and addition.
- Subspace: a subset of a vector space that is also closed under these operations; every subspace contains the zero vector (the trivial subspace).

+ Row space of a matrix
- The vector space spanned by the rows of the matrix.
- Span: the set of all linear combinations of a set of vectors.
- This isn't always R^n: for example, the rows of [1 2; 2 4] span only a line in R^2.
- The dimension of the row space is the number of linearly independent rows (the rank).
- We'll discuss how to calculate the rank in a couple of slides.

+ Null space, column space
- Null space: the orthogonal complement of the row space; every vector in this space is a solution to the equation Ax = 0.
- Rank-nullity theorem: for an m-by-n matrix, rank + dim(null space) = n.
- Column space: the space spanned by the columns; its dimension also equals the rank.
- The left null space (solutions of A^T x = 0) is the orthogonal complement of the column space, with the analogous rank-nullity statement.
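A small Matlab check of the rank-nullity theorem on an illustrative rank-1 matrix:

% A has n = 3 columns and rank 1 (row 2 = 2 * row 1)
A = [1 2 3;
     2 4 6];
r = rank(A);               % 1
N = null(A);               % orthonormal basis for the null space (3-by-2)
r + size(N, 2)             % rank + nullity = 3 = number of columns
A * N                      % ~zero: null-space vectors solve Ax = 0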

+ Linear independence
- A set of vectors is linearly independent if none of them can be written as a linear combination of the others (see the rank check sketched below).
- Given a vector space, we can find a set of linearly independent vectors that spans the space; the cardinality of this set is the dimension of the vector space.
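In Matlab, one quick (illustrative) independence test compares the rank to the number of vectors:

% Columns are linearly independent iff rank equals the column count
M = [1 0 1;
     0 1 1;
     1 1 2];               % third column = first + second
rank(M) == size(M, 2)      % false here: the columns are dependent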

+ Gaussian elimination
- Finding the rank and the row echelon form of a matrix.
- Applications: solving a linear system of equations (we saw this in class); finding the inverse of a matrix.
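A sketch of how this looks in Matlab using rref, which performs Gauss-Jordan elimination (the matrix is illustrative):

% Row reduction and rank
A = [1 2 3;
     2 4 7;
     1 0 1];
R = rref(A);               % reduced row echelon form
rank(A)                    % = number of nonzero rows of R

% Solving Ax = b by reducing the augmented matrix [A b]
b = [6; 13; 2];
rref([A b])                % last column is the solution when A is invertible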

+ Basis of a vector space
What is a basis?
- A basis is a maximal set of linearly independent vectors and a minimal set of spanning vectors of a vector space.
Orthonormal basis
- Two vectors are orthonormal if their dot product is 0 and each has length 1; an orthonormal basis consists of pairwise orthonormal vectors.
What is special about orthonormal bases?
- Projection is easy.
- Very useful length property.
- Universal: by Gram-Schmidt, given any basis we can find an orthonormal basis with the same span.
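In practice one rarely runs Gram-Schmidt by hand; Matlab's qr (a numerically stable variant) or orth produces an orthonormal basis for the same span. An illustrative sketch:

% Orthonormalize the columns of B (assumed linearly independent)
B = [1 1;
     1 0;
     0 1];                 % basis for a 2-D subspace of R^3
[Q, R] = qr(B, 0);         % economy-size QR; Q has orthonormal columns
Q' * Q                     % ~identity: orthonormality check
orth(B)                    % another orthonormal basis for the same span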

+ Matrices as constraints
- Geoff introduced writing an LP with a constraint matrix.
- We know how to write any LP in standard form. Why not just solve it to find "opt"?

A special basis for square matrices
- The eigenvectors of a matrix are the (conventionally unit-length) vectors x that satisfy Ax = λx; an example calculation is on the next slide.
- For symmetric matrices, the eigenvectors can be chosen orthonormal and the eigenvalues are real.
- This is a very useful orthonormal basis with many interesting properties, e.g. optimal matrix approximation (PCA/SVD).
- Other famous orthonormal bases are the Fourier basis and wavelet bases.

Eigenvalues
- An eigenpair satisfies (A − λI)x = 0 with x ≠ 0.
- λ is an eigenvalue iff det(A − λI) = 0 (the characteristic equation).
- Example: see the worked case below.
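The original example lived on a slide image; as a stand-in, here is a 2-by-2 case worked by hand in the comments and checked with Matlab's eig:

% A = [2 1; 1 2]:
%   det(A - lambda*I) = (2 - lambda)^2 - 1 = 0  =>  lambda = 1 or 3
%   lambda = 1 gives x ~ (1, -1);  lambda = 3 gives x ~ (1, 1)
A = [2 1; 1 2];
[V, D] = eig(A);           % columns of V are unit eigenvectors,
                           % diag(D) = [1; 3] are the eigenvalues
A * V - V * D              % ~zero: verifies A*v = lambda*v columnwise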

+ Projections (vector)
[Figure: projecting b = (2, 2) onto a = (1, 0) in the plane, and a 3-D example with the vector (2, 2, 2) and the standard basis vectors (1,0,0), (0,1,0), (0,0,1).]
- The projection of b onto a is proj_a(b) = (a . b / a . a) a.

+ Matrix projection
- Generalizing the formula from the previous slide to the column space of a matrix Q: the coordinates of the projection are (Q^T Q)^(-1) Q^T v, and the projected vector itself is Q (Q^T Q)^(-1) Q^T v.
- Special case of a matrix Q with orthonormal columns: coordinates Q^T v, projected vector Q Q^T v.
- You've probably seen something very similar in least squares regression.
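A Matlab sketch of the general formula (Q and v are illustrative):

% Project v onto the column space of Q (columns assumed independent)
Q = [1 0;
     1 1;
     0 1];                 % 3-by-2
v = [2; 2; 2];
coords = (Q' * Q) \ (Q' * v);   % coordinates in the basis of Q's columns
p = Q * coords;                 % the projected vector itself
Q' * (v - p)                    % ~zero: residual orthogonal to the columns
% Least squares connection: coords also equals Q \ v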

Definiteness
- Characterization based on eigenvalues: a symmetric matrix is positive definite iff all of its eigenvalues are positive.
- Positive definite matrices are a special sub-class of invertible matrices.
- One way to test for positive definiteness is to show x^T A x > 0 for all x ≠ 0.
- A very useful example that you'll see a lot in this class: the covariance matrix (always positive semi-definite, and positive definite when nondegenerate).
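Two standard Matlab checks, sketched on an illustrative symmetric matrix:

% Eigenvalue test: symmetric A is positive definite iff all eig(A) > 0
A = [2 -1; -1 2];
all(eig(A) > 0)            % true here

% Cholesky test: chol succeeds exactly when A is positive definite
[~, p] = chol(A);
isPosDef = (p == 0);       % p == 0 signals success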

Matlab Tutorial - 1
- linsolve
- Stability and the condition number
- Geoff's sketching code - might be very useful for HW1
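A minimal sketch of linsolve and the condition number (the system is illustrative):

% Solve Ax = b
A = [1 2; 3 4];
b = [5; 6];
x = linsolve(A, b);        % equivalent here to x = A \ b

% Conditioning: relative errors in the data can be amplified
% by up to ~cond(A) in the solution
kappa = cond(A);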

Matlab Tutorial - 2
- linprog - also very useful for HW1
- Also covered: debugging basics and using Matlab help
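A minimal linprog sketch (requires the Optimization Toolbox; the LP itself is made up for illustration):

% Minimize f'x subject to A*x <= b, x >= 0
f = [-1; -2];              % negate to *maximize* x1 + 2*x2
A = [1 1;
     1 3];
b = [4; 6];
lb = [0; 0];               % x >= 0
x = linprog(f, A, b, [], [], lb);   % [] = no equality constraints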

+ Extra stuff
- Vector and matrix norms: operator norm and Frobenius norm for matrices, L_p norms for vectors.
- Determinants
- SVD/PCA
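For reference, a sketch of these quantities in Matlab (the matrix is illustrative):

A = [3 0; 4 5];
norm(A)                    % operator (spectral) norm: largest singular value
norm(A, 'fro')             % Frobenius norm
v = [3; 4];
[norm(v, 1), norm(v, 2), norm(v, Inf)]   % L1, L2, Linf vector norms
det(A)                     % determinant
[U, S, V] = svd(A);        % singular value decomposition: A = U*S*V'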