Chapter 13 Discrete Image Transforms

Chapter 13 Discrete Image Transforms. 13.1 Introduction. 13.2 Linear transformations. One-dimensional discrete linear transformations. Definition: if $x$ is an $N \times 1$ vector and $T$ is an $N \times N$ matrix, then $y = Tx$ is a linear transformation of the vector $x$; $T$ is called the kernel matrix of the transformation.

Linear Transformations. Rotation and scaling are examples of linear transformations. If $T$ is a nonsingular matrix, the inverse linear transformation is $x = T^{-1}y$.

Linear Transformations: Unitary Transforms. If the kernel matrix of a linear transformation is a unitary matrix, the transformation is called a unitary transformation. A matrix $T$ is unitary if $T^{-1} = T^{*t}$, i.e., $TT^{*t} = T^{*t}T = I$, where $T^{*t}$ denotes the complex-conjugate transpose of $T$. If $T$ is unitary and real, then $T$ is an orthogonal matrix: $T^{-1} = T^{t}$.

Unitary Transforms. The rows (and likewise the columns) of an orthogonal matrix form a set of orthonormal vectors. The one-dimensional DFT is an example of a unitary transform, $F = Wf$, or $f = W^{-1}F$, where the kernel matrix $W$, with elements $W_{u,x} = \frac{1}{\sqrt{N}}\,e^{-j2\pi ux/N}$, is a unitary matrix.
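
As a quick sanity check, here is a minimal NumPy sketch (not from the slides) that builds the unitary DFT kernel for $N = 8$ and verifies the two properties just stated: the kernel times its conjugate transpose is the identity, and the transform preserves vector length.

```python
import numpy as np

N = 8
u, x = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
W = np.exp(-2j * np.pi * u * x / N) / np.sqrt(N)  # unitary DFT kernel

# Unitary: W times its conjugate transpose is the identity matrix.
assert np.allclose(W @ W.conj().T, np.eye(N))

# A unitary transform preserves vector length (a rotation in C^N).
f = np.random.rand(N)
print(np.linalg.norm(f), np.linalg.norm(W @ f))  # the two norms agree
```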

Unitary Transforms: Interpretation. The rows (or columns) of a unitary matrix form an orthonormal basis for the $N$-dimensional vector space. A unitary transform can be viewed as a coordinate transformation, rotating the vector in $N$-space without changing its length. A unitary linear transformation converts an $N$-dimensional vector into an $N$-dimensional vector of transform coefficients, each of which is computed as the inner product of the input vector $x$ with one of the rows of the transform matrix $T$. The forward transformation is referred to as analysis, and the backward transformation as synthesis.

Two-dimensional discrete linear transformations. A two-dimensional linear transformation is $G(m,n) = \sum_{i=0}^{N-1}\sum_{j=0}^{N-1} T(i,j;m,n)\,F(i,j)$, where the kernel $T(i,j;m,n)$ forms an $N^2 \times N^2$ block matrix having $N \times N$ blocks, each of which is an $N \times N$ matrix. If $T(i,j;m,n) = T_r(i,m)\,T_c(j,n)$, the transformation is called separable.

2-D separable symmetric unitary transforms. If $T_r = T_c = T$, the transformation is symmetric and can be written in matrix form as $G = TFT$. The inverse transform is $F = T^{-1}GT^{-1}$. Example: the two-dimensional DFT.
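
A short NumPy sketch (illustrative, not from the text) of the separable symmetric form $G = TFT$ using the unitary DFT kernel; it also confirms the result matches the library FFT up to the $1/N$ unitary scaling.

```python
import numpy as np

N = 8
u, x = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
W = np.exp(-2j * np.pi * u * x / N) / np.sqrt(N)  # unitary DFT kernel

F = np.random.rand(N, N)            # stands in for an N-by-N image
G = W @ F @ W                       # separable symmetric forward transform
F_back = W.conj().T @ G @ W.conj().T  # inverse: conjugate transpose of W

assert np.allclose(F_back, F)
# Agrees with the library 2-D FFT up to the 1/N unitary scaling:
assert np.allclose(G, np.fft.fft2(F) / N)
```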

Orthogonal transformations. If the matrix $T$ is real, the unitary transformation is called an orthogonal transformation. Its inverse transformation is $F = T^{t}GT^{t}$. If $T$ is also a symmetric matrix, the forward and inverse transformations are identical: $G = TFT$ and $F = TGT$.

Basis functions and basis images. The rows of a unitary matrix form an orthonormal basis for an $N$-dimensional vector space. Different unitary transformations correspond to different choices of basis vectors. A unitary transformation corresponds to a rotation of a vector in an $N$-dimensional vector space ($N^2$-dimensional in the two-dimensional case).

Basis Images. The inverse unitary transform can be viewed as a weighted sum of $N^2$ basis images: $F = \sum_{u=0}^{N-1}\sum_{v=0}^{N-1} G(u,v)\,B_{u,v}$, where each basis image $B_{u,v}$ is the outer product of two rows of the transform matrix. Any image can be decomposed into a set of basis images by the forward transformation, and the inverse transformation reconstitutes the image by summing the weighted basis images.

13.4 Sinusoidal Transforms. 13.4.1 The discrete Fourier transform. The forward and inverse DFTs are $F(u) = \frac{1}{\sqrt{N}}\sum_{x=0}^{N-1} f(x)\,e^{-j2\pi ux/N}$ and $f(x) = \frac{1}{\sqrt{N}}\sum_{u=0}^{N-1} F(u)\,e^{j2\pi ux/N}$, where $j = \sqrt{-1}$.

13.4 Sinusoidal Transforms. The spectrum vector: the frequency corresponding to the $i$th element of $F$ is $u_i = i/N$ cycles per sample for $0 \le i \le N/2$ and $u_i = (N-i)/N$ for $N/2 < i \le N-1$; that is, frequency increases with $i$ up to the element $i = N/2$ and then decreases again toward $i = N-1$.

13.4 Sinusoidal Transforms. The frequencies are symmetric about the highest-frequency component at $i = N/2$. Using a circular shift by the amount $N/2$, we can place the zero-frequency component at index $N/2$, with frequency increasing in both directions from there and the Nyquist (highest) frequency falling at $F_0$. This can be done by changing the signs of the odd-numbered elements of $f(x)$ prior to computing the DFT. This is because $f(x)(-1)^x = f(x)\,e^{j\pi x}$, and modulation by $e^{j\pi x}$ shifts the spectrum circularly by $N/2$.
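
The sign-change trick is easy to demonstrate numerically. This small NumPy snippet (an illustration, not from the slides) shows that modulating $f(x)$ by $(-1)^x$ before the DFT gives the same result as circularly shifting the spectrum by $N/2$.

```python
import numpy as np

N = 8
f = np.random.rand(N)

# Center the spectrum by modulating the signal: f(x) * (-1)**x.
F_centered = np.fft.fft(f * (-1.0) ** np.arange(N))

# Same result as computing the DFT and circularly shifting by N/2.
assert np.allclose(F_centered, np.fft.fftshift(np.fft.fft(f)))
```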

13.4 Sinusoidal Transforms. The two-dimensional DFT. For the two-dimensional DFT, multiplying the image by $(-1)^{x+y}$ (changing the sign of half the elements of the image matrix, in a checkerboard pattern) shifts its zero-frequency component to the center of the spectrum by swapping the four quadrants diagonally.

13.4.2 Discrete Cosine Transform. The two-dimensional discrete cosine transform (DCT) is defined as $G(u,v) = \alpha(u)\alpha(v)\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} f(x,y)\cos\!\frac{(2x+1)u\pi}{2N}\cos\!\frac{(2y+1)v\pi}{2N}$, and its inverse is $f(x,y) = \sum_{u=0}^{N-1}\sum_{v=0}^{N-1} \alpha(u)\alpha(v)\,G(u,v)\cos\!\frac{(2x+1)u\pi}{2N}\cos\!\frac{(2y+1)v\pi}{2N}$, where $\alpha(0) = \sqrt{1/N}$ and $\alpha(u) = \sqrt{2/N}$ for $u = 1, \ldots, N-1$.

The discrete cosine transform. The DCT can be expressed in unitary matrix form as $G = CfC^{t}$, where the kernel matrix $C$ has elements $C_{u,x} = \alpha(u)\cos\!\frac{(2x+1)u\pi}{2N}$. The DCT is widely used in image compression.
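
A minimal sketch (assuming the $N = 8$ case) that builds the DCT kernel matrix just defined, checks that it is orthogonal, and runs a forward/inverse 2-D DCT round trip.

```python
import numpy as np

N = 8
u, x = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
alpha = np.where(np.arange(N) == 0, np.sqrt(1.0 / N), np.sqrt(2.0 / N))
C = alpha[:, None] * np.cos((2 * x + 1) * u * np.pi / (2 * N))  # DCT kernel

assert np.allclose(C @ C.T, np.eye(N))  # orthogonal (real unitary)

f = np.random.rand(N, N)
G = C @ f @ C.T        # forward 2-D DCT
f_back = C.T @ G @ C   # inverse: just the transpose, since C is orthogonal
assert np.allclose(f_back, f)
```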

The sine transform. The discrete sine transform (DST) is defined as $G = SfS$ and $f = SGS$. The DST has the unitary kernel $S_{u,x} = \sqrt{\frac{2}{N+1}}\sin\!\frac{(u+1)(x+1)\pi}{N+1}$, which is real and symmetric, and hence its own inverse.

The Hartley transform. The forward two-dimensional discrete Hartley transform is $H(u,v) = \frac{1}{N}\sum_{x=0}^{N-1}\sum_{y=0}^{N-1} f(x,y)\,\mathrm{cas}\!\left(\frac{2\pi}{N}(ux+vy)\right)$, and the inverse DHT is $f(x,y) = \frac{1}{N}\sum_{u=0}^{N-1}\sum_{v=0}^{N-1} H(u,v)\,\mathrm{cas}\!\left(\frac{2\pi}{N}(ux+vy)\right)$, where the basis function is $\mathrm{cas}(\theta) = \cos\theta + \sin\theta$.

The Hartley transform. The unitary kernel matrix of the Hartley transform has elements $T_{u,x} = \frac{1}{\sqrt{N}}\,\mathrm{cas}\!\frac{2\pi ux}{N}$. The Hartley transform is the real part minus the imaginary part of the corresponding Fourier transform, and the Fourier transform is the even part minus $j$ times the odd part of the Hartley transform.
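
This illustrative snippet (not from the text) constructs the one-dimensional unitary Hartley kernel, confirms it is its own inverse (it is real, symmetric, and orthogonal), and verifies the stated relation $H = \mathrm{Re}(F) - \mathrm{Im}(F)$ against the unitary Fourier transform.

```python
import numpy as np

N = 8
f = np.random.rand(N)

# 1-D unitary Hartley kernel: cas(theta) = cos(theta) + sin(theta).
u, x = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
theta = 2 * np.pi * u * x / N
T = (np.cos(theta) + np.sin(theta)) / np.sqrt(N)

assert np.allclose(T @ T, np.eye(N))  # symmetric orthogonal: its own inverse

# Hartley = real part minus imaginary part of the (unitary) Fourier transform.
F = np.fft.fft(f) / np.sqrt(N)
assert np.allclose(T @ f, F.real - F.imag)
```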

13.5 Rectangular wave transforms. 13.5.1 The Hadamard Transform. Also called the Walsh transform. The Hadamard transform is a symmetric, separable orthogonal transformation that has only $+1$ and $-1$ as elements in its (unnormalized) kernel matrix. It exists only for $N = 2^n$. For the two-by-two case, $H_2 = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$, and for the general case, $H_{2N} = \frac{1}{\sqrt{2}}\begin{bmatrix} H_N & H_N \\ H_N & -H_N \end{bmatrix}$.
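
A small sketch of the recursion above: the normalized Hadamard kernel of order $2^n$ built with Kronecker products (`hadamard` is a hypothetical helper name, not from the text).

```python
import numpy as np

def hadamard(n):
    """Normalized Hadamard kernel of order N = 2**n, built recursively."""
    H = np.array([[1.0]])
    core = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    for _ in range(n):
        H = np.kron(core, H)  # H_{2N} = (1/sqrt 2) [[H, H], [H, -H]]
    return H

H8 = hadamard(3)
assert np.allclose(H8 @ H8.T, np.eye(8))   # orthogonal
assert np.allclose(H8, H8.T)               # symmetric: its own inverse
```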

The Hadamard Transform. The ordered Hadamard transform rearranges the rows of the kernel matrix in order of increasing sequency (the number of sign changes along the row), which plays a role analogous to increasing frequency.

13.5.3 The Slant Transform. The orthogonal kernel matrix for the slant transform is obtained iteratively, starting from $S_2 = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$, as $S_N = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 0 & & 1 & 0 & \\ a_N & b_N & & -a_N & b_N & \\ & & I & & & I \\ 0 & 1 & & 0 & -1 & \\ -b_N & a_N & & b_N & a_N & \\ & & I & & & -I \end{bmatrix}\begin{bmatrix} S_{N/2} & 0 \\ 0 & S_{N/2} \end{bmatrix}$

13.5.3 The Slant Transform. Here $I$ is the identity matrix of order $N/2 - 2$, and the constants $a_N$ and $b_N$, chosen to keep the kernel orthogonal, satisfy $a_{2N} = \sqrt{\frac{3N^2}{4N^2-1}}$ and $b_{2N} = \sqrt{\frac{N^2-1}{4N^2-1}}$.

13.5.3 The Slant Transform. [Figure: the slant transform basis functions for N = 8.]

13.5.4 The Haar Transform. The basis functions of the Haar transform: for any integer $k$ with $0 \le k \le N-1$, let $k = 2^p + q - 1$, where $2^p$ is the largest power of 2 contained in $k$ and $q - 1$ is the remainder. The Haar function is defined on $0 \le x < 1$ by $h_0(x) = \frac{1}{\sqrt{N}}$

13.5.4 The Haar Transform. and, for $k > 0$, $h_k(x) = \frac{1}{\sqrt{N}}\begin{cases} 2^{p/2}, & (q-1)/2^p \le x < (q-\tfrac{1}{2})/2^p \\ -2^{p/2}, & (q-\tfrac{1}{2})/2^p \le x < q/2^p \\ 0, & \text{otherwise.} \end{cases}$ The 8-by-8 Haar orthogonal kernel matrix is $\frac{1}{\sqrt{8}}\begin{bmatrix} 1&1&1&1&1&1&1&1 \\ 1&1&1&1&-1&-1&-1&-1 \\ \sqrt{2}&\sqrt{2}&-\sqrt{2}&-\sqrt{2}&0&0&0&0 \\ 0&0&0&0&\sqrt{2}&\sqrt{2}&-\sqrt{2}&-\sqrt{2} \\ 2&-2&0&0&0&0&0&0 \\ 0&0&2&-2&0&0&0&0 \\ 0&0&0&0&2&-2&0&0 \\ 0&0&0&0&0&0&2&-2 \end{bmatrix}$
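
A sketch (under the $k = 2^p + q - 1$ indexing above) that samples the Haar functions to build the $N \times N$ kernel and checks that its rows are orthonormal; `haar_matrix` is an illustrative helper, not from the text.

```python
import numpy as np

def haar_matrix(N):
    """Sampled Haar kernel (rows = basis functions); N must be a power of 2."""
    H = np.zeros((N, N))
    H[0, :] = 1.0
    x = (np.arange(N) + 0.5) / N       # sample points at bin centers
    for k in range(1, N):
        p = int(np.log2(k))            # 2**p = largest power of 2 in k
        q = k - 2**p + 1               # so that k = 2**p + q - 1
        lo, mid, hi = (q - 1) / 2**p, (q - 0.5) / 2**p, q / 2**p
        H[k, (x >= lo) & (x < mid)] = 2 ** (p / 2)
        H[k, (x >= mid) & (x < hi)] = -(2 ** (p / 2))
    return H / np.sqrt(N)

H8 = haar_matrix(8)
assert np.allclose(H8 @ H8.T, np.eye(8))  # rows are orthonormal
```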

13.5.4 The Haar Transform. [Figure: the Haar basis functions for N = 8.]

Basis images of Haar

Basis images of Hadamard

Basis images of DCT

13.6 Eigenvector-based transforms. Eigenvalues and eigenvectors. For an $N \times N$ matrix $A$ and a scalar $\lambda$: if $Av = \lambda v$ has a nontrivial solution $v \ne 0$, then $\lambda$ is called an eigenvalue of $A$, and a vector $v$ that satisfies $Av = \lambda v$ is called an eigenvector of $A$.

13.6 Eigenvector-based transforms. 13.6.2 Principal-Component Analysis. Suppose $x$ is an $N \times 1$ random vector. The mean vector can be estimated from its $L$ samples as $m_x = \frac{1}{L}\sum_{l=1}^{L} x_l$, and its covariance matrix can be estimated by $C_x = \frac{1}{L}\sum_{l=1}^{L}(x_l - m_x)(x_l - m_x)^{t}$. The matrix $C_x$ is real and symmetric.

13.6 Eigenvector-based transforms. Let $A$ be the matrix whose rows are the eigenvectors of $C_x$. Then $AC_xA^{t}$ is a diagonal matrix having the eigenvalues of $C_x$ along its diagonal, i.e., $AC_xA^{t} = \Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_N)$. Let the matrix $A$ define a linear transformation by $y = A(x - m_x)$.

13.6 Eigenvector-based transforms. It can be shown that the covariance matrix of the vector $y$ is $C_y = AC_xA^{t} = \Lambda$. Since $\Lambda$ is a diagonal matrix, its off-diagonal elements are zero, so the elements of $y$ are uncorrelated. Thus the linear transformation removes the correlation among the variables. The reverse transform $x = A^{t}y + m_x$ reconstructs $x$ from $y$.
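
An illustrative NumPy experiment (the specific mean and covariance values are arbitrary demo choices): estimate the covariance of correlated samples, form $A$ from its eigenvectors, and confirm that the covariance of $y = A(x - m_x)$ comes out diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)
# L samples of a correlated 3-dimensional random vector (columns = samples).
L = 10_000
X = rng.multivariate_normal([0, 0, 0],
                            [[4, 2, 1], [2, 3, 1], [1, 1, 2]], size=L).T

m = X.mean(axis=1, keepdims=True)
C = (X - m) @ (X - m).T / L        # sample covariance matrix
lam, V = np.linalg.eigh(C)         # eigenvectors in the columns of V
A = V.T                            # rows of A = eigenvectors of C

Y = A @ (X - m)                    # Hotelling / PCA transform
C_y = Y @ Y.T / L
# Off-diagonal elements are ~0: the components of y are uncorrelated.
print(np.round(C_y, 3))
```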

13.6 Dimension Reduction. We can reduce the dimensionality of the $y$ vector by ignoring one or more of the eigenvectors that have small eigenvalues. Let $B$ be the $M \times N$ matrix ($M < N$) formed by discarding the lower $N - M$ rows of $A$ (those corresponding to the smallest eigenvalues), and let $m_x = 0$ for simplicity. Then the transformed vector $\hat{y} = Bx$ has the smaller dimension $M$.

13.6 Dimension Reduction. The vector can be reconstructed (approximately) by $\hat{x} = B^{t}\hat{y}$. The mean square reconstruction error is the sum of the discarded eigenvalues, $\mathrm{MSE} = \sum_{i=M+1}^{N}\lambda_i$. The vector $\hat{y}$ is called the principal component of the vector $x$.
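
A sketch of the truncation step (sample dimensions and variances are arbitrary): keep the $M$ rows of $A$ with the largest eigenvalues, reconstruct with $B^{t}$, and check that the observed mean square error matches the sum of the discarded eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(1)
L, N, M = 10_000, 4, 2
# Zero-mean samples (columns) with two dominant principal directions.
basis, _ = np.linalg.qr(rng.standard_normal((N, N)))
X = basis @ (np.array([3.0, 1.5, 0.3, 0.1])[:, None]
             * rng.standard_normal((N, L)))

C = X @ X.T / L                   # sample covariance (mean is ~0)
lam, V = np.linalg.eigh(C)        # eigenvalues in ascending order
A = V[:, ::-1].T                  # rows = eigenvectors, largest first
B = A[:M]                         # discard the lower N - M rows

y_hat = B @ X                     # reduced M-dimensional representation
X_hat = B.T @ y_hat               # approximate reconstruction
mse = np.mean(np.sum((X - X_hat) ** 2, axis=0))
print(mse, lam[:N - M].sum())     # MSE ~ sum of discarded eigenvalues
```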

13.6.3 The Karhunen-Loeve Transform. The K-L transform is defined as $y = A(x - m_x)$, where the rows of $A$ are the eigenvectors of the covariance matrix $C_x$. The dimension-reducing capability of the K-L transform makes it quite useful for image compression. When the image is a first-order Markov process, where the correlation between pixels decreases exponentially with their separation distance, the basis images for the K-L transform can be written explicitly.

13.6.3 The Karhunen-Loeve Transform. When the correlation between adjacent pixels approaches unity, the K-L basis functions approach those of the discrete cosine transform. Thus the DCT is a good approximation to the K-L transform.
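
This can be checked numerically: for a first-order Markov covariance $C(i,j) = \rho^{|i-j|}$ with $\rho$ near 1, the eigenvectors of $C$ line up with the DCT basis vectors almost exactly (illustrative sketch; $\rho = 0.95$ and $N = 8$ are arbitrary choices).

```python
import numpy as np

N, rho = 8, 0.95
# Covariance of a first-order Markov process: C[i, j] = rho**|i - j|.
C = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
lam, V = np.linalg.eigh(C)
KL = V[:, ::-1].T                 # K-L basis, largest eigenvalue first

# Unitary DCT kernel for comparison.
u, x = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
alpha = np.where(np.arange(N) == 0, np.sqrt(1.0 / N), np.sqrt(2.0 / N))
Cdct = alpha[:, None] * np.cos((2 * x + 1) * u * np.pi / (2 * N))

# Each K-L row nearly matches the corresponding DCT row (up to sign):
print(np.round(np.abs(np.sum(KL * Cdct, axis=1)), 3))  # values near 1
```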

13.6.4 The SVD Transform. Singular value decomposition: any $N \times N$ matrix $A$ can be decomposed as $A = U\Sigma V^{t}$, where the columns of $U$ and $V$ are the eigenvectors of $AA^{t}$ and $A^{t}A$, respectively, and $\Sigma$ is an $N \times N$ diagonal matrix containing the singular values of $A$. The forward singular value decomposition (SVD) transform is $\Sigma = U^{t}AV$, and the inverse SVD transform is $A = U\Sigma V^{t}$.

13.6.4 The SVD transform. For the SVD transform, the kernel matrices are image-dependent. The SVD has very high image-compression power: since $\Sigma$ is diagonal, the $N^2$ transform coefficients reduce to $N$ singular values, giving lossless compression of the coefficient matrix by a factor of $N$, and even higher lossy compression ratios can be achieved by ignoring some of the small singular values.
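
A minimal sketch of lossy SVD compression (the block size and rank are arbitrary demo values): decompose a matrix, keep only the $k$ largest singular values, and measure the reconstruction error.

```python
import numpy as np

rng = np.random.default_rng(2)
N, k = 64, 8
A = rng.random((N, N))           # stand-in for an N-by-N image block

U, s, Vt = np.linalg.svd(A)      # A = U @ diag(s) @ Vt

# Keep only the k largest singular values (lossy compression).
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Relative error is governed by the discarded singular values.
err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"rank-{k} approximation, relative Frobenius error: {err:.3f}")
```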

The SVD transform. [Illustration of Figure 13-7.]

13.7 Transform domain filtering. As in the Fourier transform domain, filters can be designed in other transform domains. Transform-domain filtering involves modifying the transform coefficients prior to reconstruction of the image via the inverse transform. If either the desired components or the undesired components of the image resemble one or a few of the basis images of a particular transform, then that transform will be useful in separating the two.
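
An illustrative example of the idea, using the DCT (all of the specific signals here are made up for the demo): the "undesired component" is constructed to be exactly one basis image, so zeroing the corresponding coefficient before the inverse transform removes it completely.

```python
import numpy as np

N = 8
u, x = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
alpha = np.where(np.arange(N) == 0, np.sqrt(1.0 / N), np.sqrt(2.0 / N))
T = alpha[:, None] * np.cos((2 * x + 1) * u * np.pi / (2 * N))  # DCT kernel

clean = np.add.outer(np.arange(N), np.arange(N)) / (2.0 * N)  # smooth ramp
noise = np.outer(T[N - 1], T[N - 1])   # undesired component = one basis image
img = clean + 0.5 * noise

G = T @ img @ T.T                  # forward transform
G[N - 1, N - 1] = 0.0              # zero the offending coefficient
filtered = T.T @ G @ T             # inverse transform

assert np.allclose(filtered, clean)  # the interfering component is removed
```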

13.7 Transform domain filtering. The Haar transform is a good candidate for detecting vertical and horizontal lines and edges, since its basis images are localized horizontal and vertical rectangular-wave patterns.

13.7 Transform domain filtering. [Illustration of Figure 13-8.]

13.7 Transform domain filtering. [Figure 13-9.]