CPSC 491 Xin Liu Nov 17, 2010

Introduction
- Xin Liu, PhD student of Dr. Rokne
- Contact: slides are downloadable at pages.cpsc.ucalgary.ca/~liuxin
- The way to the math world: lecture attendance (it is hard to learn this by yourself) and practice, practice, practice …

Matrix-Vector Multiplication
- Linear (first-degree) systems are the simplest, yet most widely used, systems in science and engineering.
- A basic problem: solving the linear system Ax = b.
- Straightforward method: Gaussian elimination. It can be hard to apply when the system is large-scale or poorly conditioned (a small disturbance in the coefficients causes a big difference in the solution).
- A better method: the SVD (Singular Value Decomposition), which will be introduced gradually over a series of lectures.
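As an illustration (not part of the original slides), the NumPy sketch below solves a small made-up system both with the built-in direct solver, which uses Gaussian elimination with partial pivoting, and via the SVD; the ratio of the largest to the smallest singular value is the condition number that measures how sensitive the solution is to disturbances in the coefficients.

```python
import numpy as np

# A hypothetical, well-conditioned 3x3 system Ax = b.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

# Direct solve (LAPACK applies Gaussian elimination with partial pivoting).
x_direct = np.linalg.solve(A, b)

# SVD-based solve: A = U S V^T, so x = V S^{-1} U^T b.
U, s, Vt = np.linalg.svd(A)
x_svd = Vt.T @ ((U.T @ b) / s)

print(np.allclose(x_direct, x_svd))  # True: both give the same solution
print(s.max() / s.min())             # condition number of A
```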

Definitions
- An n-vector is x = (x_1, x_2, …, x_n)^T, with each x_i a real number (think of 3-vectors in Euclidean space).
- An m×n matrix A = (a_ij) has m rows and n columns.
- Multiplication: b = Ax is the m-vector with entries b_i = Σ_{j=1}^{n} a_ij x_j.
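A minimal NumPy sketch of these definitions, with made-up numbers:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])          # an n-vector (n = 3)
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])        # an m x n matrix (m = 2, n = 3)

b = A @ x                              # b_i = sum_j a_ij * x_j
print(b)                               # [7. 5.]
```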

Linear Mapping
- x ↦ Ax is a linear mapping: it satisfies the distributive law A(x + y) = Ax + Ay and the associative law for scalars A(αx) = α(Ax).
- Conversely, every linear map from R^n to R^m can be expressed as multiplication by an m×n matrix.
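A quick numerical check of the two laws, using an arbitrary 3x2 matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])                         # a map from R^2 to R^3
x, y, alpha = np.array([1.0, -1.0]), np.array([2.0, 0.5]), 3.0

print(np.allclose(A @ (x + y), A @ x + A @ y))     # distributive law
print(np.allclose(A @ (alpha * x), alpha * (A @ x)))  # associative law for scalars
```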

Matrix-vector multiplication (another view)
- View matrix-vector multiplication from another angle: write A in terms of its column vectors, A = [a_1 | a_2 | … | a_n].
- Then b = Ax can be written as b = x_1 a_1 + x_2 a_2 + … + x_n a_n.
- That means b is a linear combination of the columns of A, with the entries of x as coefficients.
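The column-combination view can be verified directly; the matrix and vector below are arbitrary:

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])
x = np.array([1.0, 2.0, 3.0])

# b = Ax equals the combination x_1*a_1 + x_2*a_2 + x_3*a_3 of A's columns.
b_combo = sum(x[j] * A[:, j] for j in range(A.shape[1]))
print(np.allclose(A @ x, b_combo))  # True
```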

Matrix-matrix multiplication
- Matrix-matrix multiplication B = AC is defined by b_ij = Σ_k a_ik c_kj.
- We can compute B column by column: the j-th column of B is b_j = A c_j.
- Each column of B is therefore a linear combination of the columns of A, with the coefficients c_kj.
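A small sketch checking the column-wise view of B = AC with random matrices:

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((4, 3))
C = np.random.default_rng(1).standard_normal((3, 2))
B = A @ C

# Each column of B is A applied to the corresponding column of C,
# i.e. a linear combination of the columns of A with coefficients c_kj.
for j in range(C.shape[1]):
    assert np.allclose(B[:, j], A @ C[:, j])
```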

Range
- Definition: the range of a matrix A, range(A), is the set of vectors that can be expressed as Ax for some x.
- Theorem: range(A) is the space spanned by the columns of A.
- For this reason the range of A is also called the column space of A.

Nullspace
- Definition: the nullspace (solution space) of A is the set of vectors x that satisfy Ax = 0.
- Each vector x in the nullspace gives the expansion coefficients of the zero vector as a linear combination of the columns of A.
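One standard way to compute a nullspace basis numerically is via the SVD; this sketch (an illustration, not from the slides) uses a rank-deficient matrix whose third column is the sum of the first two:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0]])        # column 3 = column 1 + column 2

# Right singular vectors belonging to (numerically) zero singular values span the nullspace.
U, s, Vt = np.linalg.svd(A)
tol = max(A.shape) * np.finfo(float).eps * s.max()
rank = int(np.sum(s > tol))
null_basis = Vt[rank:].T               # columns span the nullspace of A

print(np.allclose(A @ null_basis, 0))  # True: A maps these vectors to zero
```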

Rank
- Column rank = dimension of the space spanned by the matrix's columns = number of linearly independent columns.
- Row rank = dimension of the space spanned by the matrix's rows = number of linearly independent rows.
- Theorem: row rank = column rank; this common value is the rank of the matrix.
- A matrix has full rank if its rank equals the largest possible value, min(m, n).
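A small check with NumPy's matrix_rank, using a matrix whose third row is the sum of the first two:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [5.0, 7.0, 12.0]])       # row 3 = row 1 + row 2, so the rank is 2

print(np.linalg.matrix_rank(A))        # 2
print(np.linalg.matrix_rank(A.T))      # 2: row rank equals column rank
```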

Inverse
- A nonsingular (invertible) matrix must be square and of full rank.
- The m columns of a nonsingular m×m matrix A span (form a basis for) the whole space R^m, so any vector in R^m can be expressed as a linear combination of the columns of A.
- The inverse of A is the matrix A^{-1} such that A A^{-1} = A^{-1} A = I, where I is the m×m identity matrix. The inverse of a nonsingular matrix is unique.
- A^{-1} b is the unique solution of Ax = b; it is the vector of coefficients of the expansion of b in the basis formed by the columns of A.
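A sketch of these facts for a made-up 2x2 system; in practice one calls np.linalg.solve rather than forming the inverse explicitly:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])             # a nonsingular 2x2 matrix
b = np.array([3.0, 5.0])

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))   # A A^{-1} = I
print(np.allclose(A_inv @ A, np.eye(2)))   # A^{-1} A = I

x = A_inv @ b                              # coefficients of b in the basis of A's columns
print(np.allclose(A @ x, b))               # x is the unique solution of Ax = b
```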

Transpose
- Definition: the transpose A^T of an m×n matrix A is the n×m matrix whose (i, j) entry is the (j, i) entry of A.
- A is symmetric if A = A^T.
- Multiplication: (AB)^T = B^T A^T.
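A short NumPy illustration of the transpose rules (the matrices are arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])          # 2x3
B = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])               # 3x2

print(A.T.shape)                          # (3, 2): the (i, j) entry of A^T is the (j, i) entry of A
S = A @ A.T
print(np.allclose(S, S.T))                # A A^T is symmetric
print(np.allclose((A @ B).T, B.T @ A.T))  # (AB)^T = B^T A^T
```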

Inner product
- The inner product of x, y in R^n is x^T y = Σ_i x_i y_i.
- Euclidean length: ||x|| = sqrt(x^T x).
- Angle: cos θ = x^T y / (||x|| ||y||).
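These three quantities in NumPy, for two made-up 3-vectors:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, 0.0, 1.0])

inner = x @ y                                   # inner product x^T y
length_x = np.sqrt(x @ x)                       # Euclidean length ||x||
angle = np.arccos(inner / (np.linalg.norm(x) * np.linalg.norm(y)))
print(inner, length_x, np.degrees(angle))
```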

Orthogonal vectors
- Vectors x and y are orthogonal (perpendicular) if x^T y = 0.
- A set of vectors is orthogonal if its vectors are pairwise orthogonal.
- Two sets of vectors X and Y are orthogonal if every x in X is orthogonal to every y in Y.

Orthonormal
- Definition: a set of vectors is orthonormal if it is orthogonal and every vector has unit length.
- Theorem: the vectors in an orthogonal set of nonzero vectors are linearly independent.
- Corollary: an orthogonal set of m nonzero vectors in R^m is a basis for R^m.
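Assuming the standard notion of orthonormality, one convenient way to obtain orthonormal vectors numerically is the QR factorization; the sketch below is illustrative and not from the slides:

```python
import numpy as np

# QR factorization of a random matrix produces Q with orthonormal columns.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
Q, R = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(3)))   # q_i^T q_j = 1 if i == j, 0 otherwise
```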

Components of a vector
- Inner products can be used to decompose arbitrary vectors into orthogonal components, by projecting them onto a set of orthonormal vectors.

Components of a vector (continued)
- Given orthonormal vectors q_1, …, q_n and an arbitrary vector v, write v = r + (q_1^T v) q_1 + … + (q_n^T v) q_n, where the remainder r is orthogonal to each q_i.
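A sketch of this decomposition, using orthonormal vectors obtained from a QR factorization of a random matrix (an illustration, not the slide's example):

```python
import numpy as np

# Orthonormal vectors (the columns of Q) and an arbitrary vector v.
Q, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((3, 2)))
v = np.array([1.0, 2.0, 3.0])

coeffs = Q.T @ v                    # inner products q_i^T v
proj = Q @ coeffs                   # component of v in span(q_1, q_2)
r = v - proj                        # remainder, orthogonal to every q_i
print(np.allclose(Q.T @ r, 0))      # True
```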

Orthogonal matrices
- Definition: a square matrix Q is orthogonal if its columns are orthonormal, i.e. Q^T Q = I.
- Equivalently, according to the definition, Q^T = Q^{-1}, or Q Q^T = I.

An example: the 2D rotation matrix
- The 2D rotation matrix Q(θ) = ( cos θ, -sin θ ; sin θ, cos θ ) is orthogonal: Q^T Q = I for any angle θ.
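A numerical check that the rotation matrix is orthogonal, for an arbitrary angle:

```python
import numpy as np

theta = np.pi / 6                                # a 30-degree rotation (arbitrary choice)
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))           # Q^T Q = I, so Q is orthogonal
print(np.allclose(np.linalg.inv(Q), Q.T))        # hence Q^{-1} = Q^T
```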

Multiplication by an orthogonal matrix
- Inner products are preserved: (Qx)^T (Qy) = x^T y.
- Angles between vectors are preserved.
- Lengths are preserved: ||Qx|| = ||x||.
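A sketch verifying these preservation properties with a random orthogonal matrix (obtained here via QR, an arbitrary choice):

```python
import numpy as np

Q, _ = np.linalg.qr(np.random.default_rng(2).standard_normal((3, 3)))  # a random orthogonal matrix
x = np.array([1.0, 0.0, 2.0])
y = np.array([0.0, 3.0, 1.0])

print(np.allclose((Q @ x) @ (Q @ y), x @ y))                 # inner products preserved
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # lengths preserved
# Angles are preserved too, since they depend only on inner products and lengths.
```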