Review of Linear Algebra / Introduction to Matlab


Review of Linear Algebra / Introduction to Matlab
10-701/15-781 Machine Learning, Fall 2010
Recitation by Leman Akoglu, 9/16/10

Outline
- Linear Algebra Basics
- Matrix Calculus
- Singular Value Decomposition (SVD)
- Eigenvalue Decomposition
- Low-rank Matrix Inversion
- Matlab essentials

Basic concepts
A vector in R^n is an ordered set of n real numbers, e.g. v = (1, 6, 3, 4) is in R^4. It can be written as a column vector,
    v = [1; 6; 3; 4],
or as a row vector,
    v^T = [1 6 3 4].
An m-by-n matrix is an object in R^(m x n) with m rows and n columns, each entry filled with a (typically) real number:
    A = [a_11 a_12 ... a_1n;
         a_21 a_22 ... a_2n;
          ...
         a_m1 a_m2 ... a_mn]

Basic concepts
Vector norms: a norm ||x|| is, informally, a measure of the "length" of the vector. Common norms:
- L1:             ||x||_1 = sum_i |x_i|
- L2 (Euclidean): ||x||_2 = sqrt( sum_i x_i^2 )
- Linfinity:      ||x||_inf = max_i |x_i|
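As a quick illustration (added here, not part of the original slides), these norms can be computed in Matlab with the built-in norm function:
% Vector norms in Matlab
x = [1; -2; 3];
norm(x, 1)     % L1 norm: |1| + |-2| + |3| = 6
norm(x, 2)     % L2 (Euclidean) norm: sqrt(1 + 4 + 9) = sqrt(14)
norm(x, Inf)   % Linfinity norm: largest absolute entry = 3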

Basic concepts
We will use lower-case letters for vectors; x_i denotes the i-th element of x.
Vector dot (inner) product: u . v = u^T v = sum_i u_i v_i (a scalar).
Vector outer product: u v^T (a matrix whose (i, j) entry is u_i v_j).
If u . v = 0 and ||u||_2 != 0, ||v||_2 != 0, then u and v are orthogonal.
If u . v = 0 and ||u||_2 = 1, ||v||_2 = 1, then u and v are orthonormal.
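A small Matlab sketch (added for illustration; the vectors are arbitrary examples) of the inner and outer product:
% Inner and outer products
u = [1; 0; 2];
v = [0; 3; 0];
u' * v          % inner (dot) product: 1*0 + 0*3 + 2*0 = 0, so u and v are orthogonal
dot(u, v)       % same thing, using the built-in function
u * v'          % outer product: a 3x3 matrix with entries u(i)*v(j)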

Basic concepts
We will use upper-case letters for matrices; A_ij denotes the entry in row i, column j.
Matrix product: for an m-by-n matrix A and an n-by-p matrix B, the product C = AB is the m-by-p matrix with entries C_ij = sum_k A_ik B_kj.
e.g. [1 2; 3 4] * [5 6; 7 8] = [19 22; 43 50]
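In Matlab, * is matrix multiplication and the inner dimensions must agree (a short added sketch with made-up sizes):
% Matrix product dimensions
A = randn(2, 3);     % 2-by-3
B = randn(3, 4);     % 3-by-4
C = A * B;           % 2-by-4
size(C)              % returns [2 4]
% A * A would be an error here: the inner dimensions (3 and 2) do not match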

Special matrices
- diagonal
- upper-triangular
- lower-triangular
- tri-diagonal
- I (identity matrix)
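A brief Matlab sketch (added, not part of the slide) of how such special matrices can be built or extracted:
% Special matrices in Matlab
diag([1 2 3])        % 3x3 diagonal matrix with 1, 2, 3 on the diagonal
eye(3)               % 3x3 identity matrix I
A = magic(4);        % an arbitrary example matrix
triu(A)              % upper-triangular part of A
tril(A)              % lower-triangular part of A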

Basic concepts
Transpose: (A^T)_ij = A_ji. You can think of it as "flipping" the rows and columns, or "reflecting" the vector/matrix across its main diagonal.
e.g. [1 2; 3 4]^T = [1 3; 2 4]
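In Matlab the transpose operator is ' (a quick added example):
% Transpose
A = [1 2; 3 4];
A'                        % [1 3; 2 4]
B = [5 6; 7 8];
isequal((A*B)', B'*A')    % true: transposing a product reverses the order
% (for complex matrices, ' is the conjugate transpose; use .' for the plain transpose)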

Linear independence
A set of vectors is linearly independent if none of them can be written as a linear combination of the others.
Vectors v1, …, vk are linearly independent if c1*v1 + … + ck*vk = 0 implies c1 = … = ck = 0.
e.g. if the only solution of the homogeneous system is (u, v) = (0, 0), the columns are linearly independent; a relation such as x3 = -2*x1 + x2 shows that x1, x2, x3 are linearly dependent.
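One added way to check this numerically in Matlab is via the rank of the matrix whose columns are the vectors (a sketch, not from the slides):
% Checking linear independence with rank
V = [1 0 1;
     0 1 1;
     2 3 5];          % columns c1, c2, c3
rank(V)               % 2 < 3, so the columns are linearly dependent (c3 = c1 + c2)
rank(eye(3))          % 3: the standard basis vectors are independent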

Span of a vector space
If all vectors in a vector space can be expressed as linear combinations of a set of vectors v1, …, vk, then v1, …, vk spans the space.
A basis is a maximal set of linearly independent vectors and, equivalently, a minimal spanning set of the vector space; the cardinality of a basis is the dimension of the space.
e.g. (1,0,0), (0,1,0), (0,0,1) form a basis of R^3.

Rank of a Matrix
rank(A), the rank of an m-by-n matrix A, is
- the maximal number of linearly independent columns
- = the maximal number of linearly independent rows
- = the dimension of col(A)
- = the dimension of row(A)
If A is m-by-n, then rank(A) <= min(m, n).
If rank(A) = m, then A has full row rank; if rank(A) = n, then A has full column rank.
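A short Matlab illustration (added; the matrices are arbitrary examples) of full row and column rank:
% Rank of rectangular matrices
X = randn(5, 3);      % 5-by-3 with random entries
rank(X)               % 3 with probability 1: full column rank (rank = number of columns)
rank(X')              % also 3: X' is 3-by-5 and has full row rank
rank([1 2; 2 4])      % 1: the second row is twice the first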

Inverse of a matrix
The inverse of a square matrix A, denoted A^-1, is the unique matrix such that A A^-1 = A^-1 A = I (the identity matrix); it exists only if A has full rank.
If A^-1 and B^-1 exist, then (AB)^-1 = B^-1 A^-1 and (A^T)^-1 = (A^-1)^T.
For orthonormal matrices: A^-1 = A^T.
For diagonal matrices: D^-1 = diag(1/d_1, …, 1/d_n), assuming all d_i != 0.
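A quick Matlab check of these identities (an added sketch; the matrices are arbitrary invertible examples):
% Inverse identities
A = [2 1; 1 3];
B = [1 4; 0 2];
norm(inv(A*B) - inv(B)*inv(A))   % ~0: (AB)^-1 = B^-1 A^-1
norm(inv(A') - inv(A)')          % ~0: (A^T)^-1 = (A^-1)^T
D = diag([2 5 10]);
inv(D)                           % diag([0.5 0.2 0.1]): just invert the diagonal entries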

Dimensions
By Thomas Minka, "Old and New Matrix Algebra Useful for Statistics".

Examples: http://matrixcookbook.com/

Singular Value Decomposition (SVD)
Any matrix A can be decomposed as A = U D V^T, where D is a diagonal matrix with d = rank(A) non-zero elements.
The first d columns of U are an orthonormal basis for col(A).
The first d columns of V are an orthonormal basis for row(A).
Applications of the SVD:
- matrix pseudoinverse
- low-rank matrix approximation
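A Matlab sketch (added; the matrix is an arbitrary random example) of the decomposition and a rank-k approximation:
% SVD and low-rank approximation
A = rand(6, 4);
[U, D, V] = svd(A);              % A == U*D*V' (up to round-off)
norm(A - U*D*V')                 % ~0
k = 2;                           % keep the k largest singular values
Ak = U(:,1:k) * D(1:k,1:k) * V(:,1:k)';
norm(A - Ak)                     % equals the (k+1)-th singular value D(k+1,k+1)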

Eigenvalue Decomposition
Any symmetric matrix A can be decomposed as A = U D U^T, where D is diagonal with d = rank(A) non-zero elements (the eigenvalues) and the columns of U (the eigenvectors) are orthonormal.
The first d columns of U are an orthonormal basis for col(A) = row(A).
Re-interpreting Ab: decompose b along the eigenvector directions u1, u2, …; the component along u1 is stretched by d1, the component along u2 by d2, and so on.
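A small added Matlab example for a symmetric matrix:
% Eigenvalue decomposition of a symmetric matrix
A = [2 1; 1 2];                  % symmetric
[U, D] = eig(A);                 % columns of U are eigenvectors, diag(D) the eigenvalues (1 and 3)
norm(A - U*D*U')                 % ~0
norm(U'*U - eye(2))              % ~0: the eigenvectors are orthonormal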

Low-rank Matrix Inversion
In many applications (e.g. linear regression, Gaussian models) we need to invert the covariance matrix X^T X (each row of the n-by-m matrix X is a data sample).
If the number of features is huge (e.g. each sample is an image, #samples n << #features m), inverting the m-by-m matrix X^T X becomes a problem: the complexity of matrix inversion is generally cubic in the matrix dimension, here O(m^3).
Matlab can comfortably handle matrix inversion with m in the thousands, but not much more than that.

Low-rank Matrix Inversion
With the help of the SVD, we actually do NOT need to explicitly invert X^T X:
- Decompose X = U D V^T.
- Then X^T X = V D U^T U D V^T = V D^2 V^T.
- Since (V D^2 V^T)(V (D^2)^-1 V^T) = I, we know that (X^T X)^-1 = V (D^2)^-1 V^T.
- Inverting the diagonal matrix D^2 is trivial.
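A Matlab sketch of this trick (added; it assumes X has full column rank so that X^T X really is invertible, and uses the economy-size SVD):
% (X'X)^-1 via the SVD, without forming inv(X'*X) directly
n = 500; m = 50;
X = randn(n, m);                     % each row is a data sample
[U, D, V] = svd(X, 'econ');          % X = U*D*V', with D an m-by-m diagonal matrix
d = diag(D);                         % singular values
invXtX = V * diag(1 ./ d.^2) * V';   % V * (D^2)^-1 * V'
norm(invXtX - inv(X'*X))             % ~0 (up to round-off)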

http://matrixcookbook.com/
Basics, Derivatives, Decompositions, Distributions, …

Review of Linear Algebra / Introduction to Matlab

MATrix LABoratory
Mostly used for its mathematical libraries; matrix manipulation is very easy in Matlab.
If this is your first time using Matlab, we strongly suggest you go through the "Getting Started" part of the Matlab help; it covers a lot of useful basic syntax.

Installing Matlab
- Matlab licenses are expensive, but "free" for you!
- Available for installation by contacting help+@cs.cmu.edu (SCS students only)
- Available for download at my.cmu.edu (Windows XP SP3+, MacOS X 10.5.5+, ~4GB!)

Making Arrays
% A simple array
>> [1 2 3 4 5]
ans: 1 2 3 4 5
>> [1,2,3,4,5]
>> v = [1;2;3;4;5]
v = 1
    2
    3
    4
    5
>> v'
ans: 1 2 3 4 5
>> 1:5
>> 1:2:5
ans: 1 3 5
>> 5:-2:1
ans: 5 3 1
>> rand(3,1)
ans: 0.0318
     0.2769
     0.0462

Making Matrices
% All the following are equivalent
>> [1 2 3; 4 5 6; 7 8 9]
>> [1,2,3; 4,5,6; 7,8,9]
>> [[1 2; 4 5; 7 8] [3; 6; 9]]
>> [[1 2 3; 4 5 6]; [7 8 9]]
ans: 1 2 3
     4 5 6
     7 8 9

Making Matrices
% Creating all ones, zeros, identity, diagonal matrices
>> zeros( rows, cols )
>> ones( rows, cols )
>> eye( rows )
>> diag([1 2 3])
% Creating random matrices
>> rand( rows, cols )    % Unif[0,1]
>> randn( rows, cols )   % N(0, 1)
% Make 3x5 with N(1, 4) entries
>> 2 * randn(3,5) + 1
% Get the size
>> [rows, cols] = size( matrix );

Accessing Elements
Unlike C-like languages, indices start from 1 (NOT 0).
>> A = [1 2 3; 4 5 6; 7 8 9]
ans: 1 2 3
     4 5 6
     7 8 9
% Access individual elements
>> A(2,3)
ans: 6
% Access 2nd column ( : means all elements)
>> A(:,2)
ans: 2
     5
     8

Accessing Elements
A = 1 2 3
    4 5 6
    7 8 9
% Matlab stores matrices in column-major order, so linear indexing runs down the columns
>> A([1, 3, 5])
ans: 1 7 5
>> A( [1,3], 2:end )
ans: 2 3
     8 9
% Logical indexing (on the original A)
>> A( A > 5 )
ans: 7
     8
     6
     9
>> [i, j] = find(A > 5)
i = 3    j = 1
    3        2
    2        3
    3        3
>> A( A > 5 ) = -1
A =  1  2  3
     4  5 -1
    -1 -1 -1

Matrix Operations
A = 1 2 3
    4 5 6
    7 8 9
>> A'
>> A*A        % same as A^2
>> A.*B       % element-wise product
>> inv(A)
>> A/B, A./B, A+B, …
>> A + 2 * (A / 4)
ans:  1.5000  3.0000  4.5000
      6.0000  7.5000  9.0000
     10.5000 12.0000 13.5000
>> A ./ A
ans: 1 1 1
     1 1 1
     1 1 1
% Solving systems
>> (A + eye(3)) \ [1;2;3]     % same as inv(A+eye(3)) * [1; 2; 3]
ans: -1.0000
     -0.0000
      1.0000

Plotting in Matlab
% Let's plot a Gaussian N(0,1)
% Generate 1 million random data points
d = randn(1000000, 1);
% Find the histogram
x = min(d):0.1:max(d);
c = histc(d, x);
p = c / 1000000;    % fraction of samples per bin (divide by the 0.1 bin width for a true density)
% Plot the (approximate) pdf
plot(x, p);

Other things to know
- if and for statements; scripts and functions
- Useful operators: >, <, >=, <=, ==, &, |, &&, ||, +, -, /, *, ^, ./, .*, .^, ', \, …
- Useful functions: sum, mean, var, not, min, max, find, exist, pause, exp, sqrt, sin, cos, reshape, sort, sortrows, length, size, setdiff, ismember, isempty, intersect, plot, hist, title, xlabel, ylabel, legend, rand, randn, zeros, ones, eye, inv, diag, ind2sub, sub2ind, logical, repmat, num2str, disp, svd, eig, sparse, clear, clc, help, …