MAT 2401 Linear Algebra 5.3 Orthonormal Bases

HW: WebAssign 5.3 and Written Homework

Basis S = {(1,0,0), (0,1,0), (0,0,1)} is the standard basis for R^3. It is described as an orthonormal basis. Every element of R^3 can be written as a linear combination of the elements of S; for example, (3,4,-2) = 3i + 4j - 2k. In general, we can think of this process as encoding “a piece of info” by the elements of the basis.
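A small numerical sketch of this encoding (using NumPy; the names are illustrative): with an orthonormal basis, each coefficient can be read off with a single dot product, and the vector can be rebuilt from its coefficients.

import numpy as np

# The standard (orthonormal) basis of R^3: the rows of the identity matrix.
i, j, k = np.eye(3)

v = np.array([3.0, 4.0, -2.0])

# Each coordinate of v is recovered by one dot product with a basis vector.
coeffs = [float(v @ e) for e in (i, j, k)]
print(coeffs)                      # [3.0, 4.0, -2.0]

# Decode: rebuild v from its coefficients.
v_rebuilt = sum(c * e for c, e in zip(coeffs, (i, j, k)))
print(np.allclose(v, v_rebuilt))   # True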

Preview Orthonormal bases are fundamental to the development of Fourier analysis and wavelets, which have all kinds of applications, such as signal processing and image compression. We will look at how to find orthonormal bases for an inner product space V.

Quote… It is very difficult to show you why, in practical applications, we want this specific kind of basis. So I am going to show you an excerpt from chapter 6 of the book “The World According to Wavelets” by Barbara Hubbard.

Quote…about Efficiency The fact that all the vectors in a non-orthogonal basis come into play for the computation of a single coefficient is also bothersome when one wants to compute or adjust quantization errors.

Quote…about Efficiency In an orthogonal basis one can calculate the “energy” of the total error by adding the energies of the errors of each coefficient; it’s not necessary to reconstruct the signal. In a non-orthogonal basis one has to reconstruct the signal to measure the error.
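A quick numerical illustration of this point (a sketch in NumPy, not from the slides): perturb the coefficients of a signal in an orthonormal basis and in a non-orthogonal basis, and check whether the error energy can be read off from the coefficients alone.

import numpy as np

rng = np.random.default_rng(0)

# Columns of Q: an orthonormal basis of R^3 (via QR factorization).
# Columns of B: a generic, non-orthogonal basis.
Q = np.linalg.qr(rng.standard_normal((3, 3)))[0]
B = rng.standard_normal((3, 3))

coeff_err = 0.01 * rng.standard_normal(3)   # simulated quantization errors

# Orthonormal basis: energy of the total error equals the sum of the
# coefficient error energies -- no reconstruction of the signal needed.
print(np.isclose(np.sum((Q @ coeff_err) ** 2), np.sum(coeff_err ** 2)))  # True

# Non-orthogonal basis: the identity fails, so the signal itself must be
# reconstructed to measure the error.
print(np.isclose(np.sum((B @ coeff_err) ** 2), np.sum(coeff_err ** 2)))  # False (in general)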

Quote…about Redundancy The most dramatic comparison is between … “everything is said 10 times.” In an orthonormal basis, each vector encodes information that is encoded nowhere else.

Another Example… JPEG

JPEG is not possible without …

Basis Not all bases are created equal!

Good, Better, Best

Orthonormal Bases A basis S for an inner product space V is orthonormal if 1. For u, v ∈ S with u ≠ v, <u, v> = 0. 2. For u ∈ S, u is a unit vector (||u|| = 1).
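The definition translates directly into a computation. A minimal Python check (taking the inner product to be the dot product on R^n):

import numpy as np

def is_orthonormal(vectors, tol=1e-10):
    """Check both conditions of the definition for a list of vectors in R^n."""
    for a in range(len(vectors)):
        # Condition 2: each vector is a unit vector.
        if abs(vectors[a] @ vectors[a] - 1.0) > tol:
            return False
        # Condition 1: distinct vectors are orthogonal.
        for b in range(a + 1, len(vectors)):
            if abs(vectors[a] @ vectors[b]) > tol:
                return False
    return True

S = [np.array(v, dtype=float) for v in [(1, 0, 0), (0, 1, 0), (0, 0, 1)]]
print(is_orthonormal(S))   # True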

Example 1 S = {(1,0), (0,1)} is an orthonormal basis for R^2 with the dot product. (From a previous lecture, we know S is a basis of R^2.)

Example 1 Verification: (1,0)·(0,1) = 0, and ||(1,0)|| = ||(0,1)|| = 1, so S is orthonormal.

Remark The standard basis is an orthonormal basis for R^n with the dot product.

Example 2 S = {1, x, x^2} is an orthonormal basis for P_2 with the standard inner product on P_2, <a_0 + a_1x + a_2x^2, b_0 + b_1x + b_2x^2> = a_0b_0 + a_1b_1 + a_2b_2.

Example 2 Verification: <1, x> = <1, x^2> = <x, x^2> = 0, and ||1|| = ||x|| = ||x^2|| = 1, so S is orthonormal.

Example 3 S = {(1,1,1), (-1,1,0), (1,2,1)} is a basis for R^3. However, it is not orthonormal. Q: How do we “get” an orthonormal basis from S?

Gram-Schmidt Process
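For reference, the process is the standard recursion: given a basis {v_1, …, v_n} of V, build mutually orthogonal vectors w_k and then normalize:

\[
\begin{aligned}
w_1 &= v_1, \\
w_k &= v_k - \sum_{i=1}^{k-1} \frac{\langle v_k, w_i\rangle}{\langle w_i, w_i\rangle}\, w_i, \qquad k = 2, \dots, n, \\
u_k &= \frac{w_k}{\lVert w_k \rVert}.
\end{aligned}
\]

The set {u_1, …, u_n} is then an orthonormal basis for V.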

Idea Subtract from each vector its projections onto the vectors already constructed; what remains is orthogonal to all of them. Normalizing each result then yields an orthonormal set.

Example 3 S={(1,1,1), (-1,1,0), (1,2,1)}
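A Python sketch of the process applied to this S (the function name is illustrative; the arithmetic is the standard Gram-Schmidt computation):

import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis spanning the same space as `vectors`."""
    ortho = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # Subtract the projection onto each orthonormal vector built so far.
        for u in ortho:
            w -= (w @ u) * u
        ortho.append(w / np.linalg.norm(w))
    return ortho

S = [(1, 1, 1), (-1, 1, 0), (1, 2, 1)]
for u in gram_schmidt(S):
    print(np.round(u, 4))
# (1,1,1)/sqrt(3), (-1,1,0)/sqrt(2), (1,1,-2)/sqrt(6)

Note that (1,1,1)·(-1,1,0) = 0 already, so only the third vector needs a correction; the resulting orthonormal basis is {(1,1,1)/√3, (-1,1,0)/√2, (1,1,-2)/√6}.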
