
Similar presentations
MMSE Estimation for Sparse Representation Modeling. Joint work with Irad Yavneh.
Learning Measurement Matrices for Redundant Dictionaries Richard Baraniuk Rice University Chinmay Hegde MIT Aswin Sankaranarayanan CMU.
Submodular Dictionary Selection for Sparse Representation Volkan Cevher Laboratory for Information and Inference Systems - LIONS.
The Analysis (Co-)Sparse Model Origin, Definition, and Pursuit
K-SVD Dictionary-Learning for Analysis Sparse Models
Learning sparse representations to restore, classify, and sense images and videos Guillermo Sapiro University of Minnesota Supported by NSF, NGA, NIH,
Extensions of wavelets
A novel supervised feature extraction and classification framework for land cover recognition of the off-land scenario Yan Cui
1 Micha Feigin, Danny Feldman, Nir Sochen
Ilias Theodorakopoulos PhD Candidate
Design of Non-Linear Kernel Dictionaries for Object Recognition
An Introduction to Sparse Coding, Sparse Sensing, and Optimization Speaker: Wei-Lun Chao Date: Nov. 23, 2011 DISP Lab, Graduate Institute of Communication.
Entropy-constrained overcomplete-based coding of natural images André F. de Araujo, Maryam Daneshi, Ryan Peng Stanford University.
Sparse & Redundant Signal Representation, and its Role in Image Processing Michael Elad The CS Department The Technion – Israel Institute of technology.
Dictionary-Learning for the Analysis Sparse Model Michael Elad The Computer Science Department The Technion – Israel Institute of technology Haifa 32000,
Image Super-Resolution Using Sparse Representation By: Michael Elad Single Image Super-Resolution Using Sparse Representation Michael Elad The Computer.
Sparse and Overcomplete Data Representation
SRINKAGE FOR REDUNDANT REPRESENTATIONS ? Michael Elad The Computer Science Department The Technion – Israel Institute of technology Haifa 32000, Israel.
Mathematics and Image Analysis, MIA'06
Image Denoising via Learned Dictionaries and Sparse Representations
Over-Complete & Sparse Representations for Image Decomposition and Inpainting Michael Elad The Computer Science Department The Technion – Israel Institute.
Image Denoising and Beyond via Learned Dictionaries and Sparse Representations Michael Elad The Computer Science Department The Technion – Israel Institute.
An Introduction to Sparse Representation and the K-SVD Algorithm
ITERATED SRINKAGE ALGORITHM FOR BASIS PURSUIT MINIMIZATION Michael Elad The Computer Science Department The Technion – Israel Institute of technology Haifa.
New Results in Image Processing based on Sparse and Redundant Representations Michael Elad The Computer Science Department The Technion – Israel Institute.
Optimized Projection Directions for Compressed Sensing Michael Elad The Computer Science Department The Technion – Israel Institute of technology Haifa.
Image Denoising with K-SVD Priyam Chatterjee EE 264 – Image Processing & Reconstruction Instructor : Prof. Peyman Milanfar Spring 2007.
Recent Trends in Signal Representations and Their Role in Image Processing Michael Elad The CS Department The Technion – Israel Institute of technology.
Image Decomposition and Inpainting By Sparse & Redundant Representations Michael Elad The Computer Science Department The Technion – Israel Institute of.
SOS Boosting of Image Denoising Algorithms
Over-Complete & Sparse Representations for Image Decomposition*
A Weighted Average of Several Sparse Representations is Better than the Sparsest One Alone Michael Elad The Computer Science Department The Technion –
A Sparse Solution of is Necessarily Unique !! Alfred M. Bruckstein, Michael Elad & Michael Zibulevsky The Computer Science Department The Technion – Israel.
Multiscale transforms : wavelets, ridgelets, curvelets, etc.
Sparse and Redundant Representation Modeling for Image Processing Michael Elad The Computer Science Department The Technion – Israel Institute of technology.
Topics in MMSE Estimation for Sparse Approximation Michael Elad The Computer Science Department The Technion – Israel Institute of technology Haifa 32000,
Alfredo Nava-Tudela John J. Benedetto, advisor
The Quest for a Dictionary. We Need a Dictionary  The Sparse-land model assumes that our signal x can be described as emerging from the PDF:  Clearly,
AMSC 6631 Sparse Solutions of Linear Systems of Equations and Sparse Modeling of Signals and Images: Midyear Report Alfredo Nava-Tudela John J. Benedetto,
Cs: compressed sensing
Iterated Denoising for Image Recovery Onur G. Guleryuz To see the animations and movies please use full-screen mode. Clicking on.
Sparse Representations of Signals: Theory and Applications * Michael Elad The CS Department The Technion – Israel Institute of technology Haifa 32000,
“A fast method for Underdetermined Sparse Component Analysis (SCA) based on Iterative Detection- Estimation (IDE)” Arash Ali-Amini 1 Massoud BABAIE-ZADEH.
Karthik Gurumoorthy, Ajit Rajwade, Arunava Banerjee, Anand Rangarajan. Department of CISE, University of Florida.
Fast and incoherent dictionary learning algorithms with application to fMRI Authors: Vahid Abolghasemi Saideh Ferdowsi Saeid Sanei. Journal of Signal Processing.
Learning to Sense Sparse Signals: Simultaneous Sensing Matrix and Sparsifying Dictionary Optimization Julio Martin Duarte-Carvajalino, and Guillermo Sapiro.
Sparse & Redundant Representation Modeling of Images Problem Solving Session 1: Greedy Pursuit Algorithms By: Matan Protter Sparse & Redundant Representation.
Image Decomposition, Inpainting, and Impulse Noise Removal by Sparse & Redundant Representations Michael Elad The Computer Science Department The Technion.
A Weighted Average of Sparse Representations is Better than the Sparsest One Alone Michael Elad and Irad Yavneh SIAM Conference on Imaging Science ’08.
Image Priors and the Sparse-Land Model
Single Image Interpolation via Adaptive Non-Local Sparsity-Based Modeling The research leading to these results has received funding from the European.
Jianchao Yang, John Wright, Thomas Huang, Yi Ma CVPR 2008 Image Super-Resolution as Sparse Representation of Raw Image Patches.
From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images Alfred M. Bruckstein (Technion), David L. Donoho (Stanford), Michael.
Sparsity Based Poisson Denoising and Inpainting
Sparse and Redundant Representations and Their Applications in
Recent Results on the Co-Sparse Analysis Model
Parallelization of Sparse Coding & Dictionary Learning
Improving K-SVD Denoising by Post-Processing its Method-Noise
Presentation transcript:

The K–SVD: Design of Dictionaries for Redundant and Sparse Representation of Signals. Joint work with Michal Aharon and Freddy Bruckstein. Michael Elad, The Computer Science Department, The Technion – Israel Institute of Technology, Haifa 32000, Israel. SPIE – Wavelets XI, San Diego, August 2nd, 2005.

Agenda:
1. A Visit to Sparseland – motivating redundancy & sparsity.
2. The Quest for a Dictionary – Fundamentals: common approaches.
3. The Quest for a Dictionary – Practice: introducing the K-SVD.
4. Results: preliminary results and applications.
NOTE: This lecture concentrates on the fundamentals of this work, and thus does not fully overlap the accompanying paper.

M Generating Signals in Sparseland K N A fixed Dictionary Every column in D (dictionary) is a prototype signal (Atom). M The vector  is generated randomly with few non-zeros in random locations and random values. A sparse & random vector N The K–SVD: Design of Dictionaries for Redundant and Sparse representation of Signals

M Sparseland Signals Are Interesting  Sparseland is here !? Simple: Every generated signal is built as a linear combination of few atoms from our dictionary D Rich: A general model: the obtained signals are a special type mixture-of-Gaussians (or Laplacians). Popular: Recent work on signal and image processing adopt this model and successfully deploys it to applications.  Sparseland is here !? Multiply by D M The K–SVD: Design of Dictionaries for Redundant and Sparse representation of Signals

Signal Processing in Sparseland: a sparse & random vector α is multiplied by D, and noise is added, yielding the measured signal. Four major questions: (1) What are practical ways to estimate α? (2) Is the estimate equal to α, or even close? (3) How effective are these methods? (4) How do we get D? – OUR FOCUS TODAY!! Recent results give optimistic answers to these questions.


M Problem Setting Multiply by D Given a set of data examples, we assume the Sparseland model and ask The K–SVD: Design of Dictionaries for Redundant and Sparse representation of Signals

Choose D – Modeling Approach: replace the model with another well-defined, simple mathematical model (e.g., images as piecewise-C2-smooth regions with C2-smooth edges) and fit a dictionary accordingly, based on existing methods. Examples: Curvelets [Candes & Donoho], Contourlets [Do & Vetterli], Bandlets [Mallat & Le-Pennec], and others. Pros: builds on existing methods, fast transforms, proven optimality for the model. Cons: What is the relation to Sparseland? Linearity? How to adapt to other signals? Bad for “smaller” signal families.

Choose D – Training Approach (OUR APPROACH FOR TODAY): train the dictionary directly on the given examples, optimizing w.r.t. sparsity and other desired properties (normalized atoms, etc.). Examples: ML (HVS) [Field & Olshausen], MAP [Lewicki & Sejnowski], MOD [Engan et al.], ICA-like [Kreutz-Delgado et al.], and others. Pros: fits the data, the design fits the true objectives, can be adapted to small families. Cons: no fast version, slow training, scalability issues?


Practical Approach – Objective: stack the examples as the columns of a matrix X and write X ≈ DA. Each example is a linear combination of atoms from D; that is, each example has a sparse representation with no more than L atoms. Find D and A minimizing ||X − DA||_F² subject to every column of A having at most L non-zeros (n, K, L are assumed known). Field & Olshausen (’96), Engan et al. (’99), Lewicki & Sejnowski (’00), Cotter et al. (’03), Gribonval et al. (’04).
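The training objective above, written compactly (X holds the examples as columns, A the corresponding sparse representations):

```latex
\min_{\mathbf{D},\,\mathbf{A}} \; \left\| \mathbf{X} - \mathbf{D}\mathbf{A} \right\|_F^2
\quad \text{subject to} \quad
\left\| \boldsymbol{\alpha}_j \right\|_0 \le L \;\; \forall j,
```

where α_j denotes the j-th column of A.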

K-Means for Clustering: clustering is an extreme form of sparse coding – each example is represented by a single atom. The algorithm alternates two stages. Initialize D; Sparse Coding stage: assign each example to its nearest-neighbor atom; Dictionary Update stage: update D column-by-column, by mean computation over the relevant examples.
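The two alternating stages of K-means can be sketched as dictionary learning with one atom per example. This is a minimal illustrative sketch; the function name and parameters are mine, not from the talk.

```python
import numpy as np

def kmeans_dictionary(X, K, iters=20, seed=0):
    """K-means as extreme sparse coding: X is an n-by-N data matrix
    (examples as columns); each example uses exactly one atom.
    Returns the n-by-K dictionary D and the per-example atom labels."""
    rng = np.random.default_rng(seed)
    n, N = X.shape
    # Initialize D with randomly chosen examples.
    D = X[:, rng.choice(N, size=K, replace=False)].copy()
    labels = np.zeros(N, dtype=int)
    for _ in range(iters):
        # Sparse Coding stage: nearest-neighbor assignment.
        d2 = ((X[:, None, :] - D[:, :, None]) ** 2).sum(axis=0)  # K x N
        labels = d2.argmin(axis=0)
        # Dictionary Update stage: column-by-column mean computation
        # over the relevant examples.
        for k in range(K):
            members = X[:, labels == k]
            if members.size:
                D[:, k] = members.mean(axis=1)
    return D, labels
```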

The K–SVD Algorithm – General. Aharon, Elad, & Bruckstein (’04). Initialize D; Sparse Coding stage: use MP or BP; Dictionary Update stage: update D column-by-column by SVD computation.

K–SVD: Sparse Coding Stage. D is known! For the j-th example we solve min over α of ||x_j − Dα||₂² subject to ||α||₀ ≤ L – a pursuit problem!
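The pursuit problem of this stage can be attacked greedily; below is a textbook Orthogonal Matching Pursuit sketch (the slides mention MP/BP/OMP), not the exact code behind the talk.

```python
import numpy as np

def omp(D, x, L):
    """Greedy pursuit: pick at most L atoms of D (normalized columns)
    to approximate x, re-fitting coefficients by least squares."""
    residual = x.copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(L):
        # Pick the atom most correlated with the current residual.
        k = int(np.argmax(np.abs(D.T @ residual)))
        if k not in support:
            support.append(k)
        # Re-fit all selected coefficients jointly.
        sub = D[:, support]
        c, *_ = np.linalg.lstsq(sub, x, rcond=None)
        residual = x - sub @ c
    coef[support] = c
    return coef
```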

K–SVD: Dictionary Update Stage. G_k: the set of examples that use the column d_k. The content of d_k influences only the examples in G_k. Let us fix all of D and A apart from the atom d_k and its corresponding coefficients in A, and seek both to better fit the residual!

K–SVD: Dictionary Update Stage. We should solve min over d_k and its coefficients of ||E_k − d_k a_k||_F², where the residual E_k is computed from the examples in G_k with atom k's contribution removed. d_k is obtained by SVD on the examples' residual in G_k.
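The update of a single atom can be sketched as follows: restrict to the examples using atom k, remove every other atom's contribution, and take the rank-1 SVD approximation of the residual. A minimal sketch with illustrative names, not the authors' code.

```python
import numpy as np

def update_atom(D, A, X, k):
    """K-SVD-style update of column d_k of D and row k of A, in place,
    assuming the model X ~= D @ A (examples as columns of X)."""
    users = np.flatnonzero(A[k, :])  # G_k: examples that use atom k
    if users.size == 0:
        return
    # Residual of those examples with atom k's own contribution removed.
    E = X[:, users] - D @ A[:, users] + np.outer(D[:, k], A[k, users])
    # Best rank-1 fit: d_k <- first left singular vector (unit norm),
    # coefficients <- first singular value times first right singular vector.
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    D[:, k] = U[:, 0]
    A[k, users] = s[0] * Vt[0, :]
```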


K–SVD: A Synthetic Experiment. Create a 20×30 random dictionary with normalized columns. Generate 2000 signal examples, each using 3 atoms, and add noise. Train a dictionary using the K-SVD and MOD, and compare the results.

K–SVD on Images: overcomplete Haar vs. K-SVD. Training: 10,000 sample 8-by-8 images; 441 dictionary elements; approximation method: OMP.

Filling-In Missing Pixels: given an image with missing values, apply pursuit per 8×8 block, using a decimated dictionary with the rows of the missing pixels removed; then multiply the found representation by the complete dictionary to get the recovered image. With 60% missing pixels – Haar results: average # coefficients 4.42, RMSE 21.52; K-SVD results: average # coefficients 4.08, RMSE 11.68.
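The inpainting recipe above – pursue on the observed rows, reconstruct with the full dictionary – can be sketched for a single block. For brevity the pursuit here is 1-sparse (one atom) rather than full OMP; the function and names are illustrative.

```python
import numpy as np

def inpaint_block(D, x, mask):
    """D: n-by-K dictionary; x: a block whose entries where mask is
    False are missing; mask: boolean array of length n.
    Returns the reconstructed full block."""
    Dm = D[mask, :]  # decimated dictionary: rows of missing pixels removed
    Dm_unit = Dm / np.linalg.norm(Dm, axis=0)
    # 1-sparse pursuit on the observed entries only.
    corr = Dm_unit.T @ x[mask]
    k = int(np.argmax(np.abs(corr)))
    c, *_ = np.linalg.lstsq(Dm[:, [k]], x[mask], rcond=None)
    # Multiply the found representation by the COMPLETE dictionary,
    # which fills in the missing pixels.
    return D[:, [k]] @ c
```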

Summary. Today we discussed: 1. A Visit to Sparseland – motivating redundancy & sparsity. 2. The Quest for a Dictionary – Fundamentals: common approaches. 3. The Quest for a Dictionary – Practice: introducing the K-SVD. 4. Results: preliminary results and applications. Open Questions: scalability – treatment of bigger blocks and large images; uniqueness, and the influence of noise; equivalence – a guarantee to get the perfect dictionary; choosing K – what forces govern the redundancy; other applications; … http://www.cs.technion.ac.il/~elad

Supplement Slides

Uniqueness? Is D unique in explaining the origin of the signals? (n, K, L are assumed known.)

Uniqueness? YES!! If the set of signals is rich enough*, then D is unique. Aharon, Elad, & Bruckstein (’05). *“Rich enough”: the signals can be clustered into groups that share the same support; at least L+1 examples per group are needed. Comments: this result is proved constructively, but the number of examples needed to pull this off is huge – we will show a far better method next. A parallel result that takes noise into account could be constructed similarly.

Naïve Compression: OMP with an error bound, comparing the K-SVD, Haar, and DCT dictionaries.

Naïve Compression results – DCT: BPP = 1.01, RMSE = 8.14; K-SVD and Haar: BPP = 0.502.