A Weighted Average of Sparse Representations is Better than the Sparsest One Alone
Michael Elad, The Computer Science Department, The Technion – Israel Institute of Technology

Presentation transcript:

Slide 1: A Weighted Average of Sparse Representations is Better than the Sparsest One Alone
Michael Elad*, The Computer Science Department, The Technion – Israel Institute of Technology, Haifa 32000, Israel
* Joint work with Irad Yavneh, The Computer Science Department, The Technion – Israel Institute of Technology, Haifa 32000, Israel
SIAM Conference on Imaging Science IS'08, Session on Topics in Sparse and Redundant Representations – Part I, July 8th, 2008, San Diego

Slide 2: Noise Removal?
Today we focus on signal/image denoising: given a noisy measurement y, remove the additive noise.
- Important: (i) it is a practical application; (ii) it is a convenient platform for testing basic ideas in signal/image processing.
- Many directions have been considered: partial differential equations, statistical estimators, adaptive filters, inverse problems & regularization, wavelets, example-based techniques, sparse representations, and more.
- Main message today: several sparse representations can be found and used for better denoising performance. We introduce, demonstrate, and explain this new idea.

Slide 3: Part I – The Basics of Denoising by Sparse Representations

Slide 4: Denoising by Energy Minimization
Here y denotes the given measurements and x the unknown signal to be recovered. Many of the proposed signal denoising algorithms minimize an energy function of the form
  f(x) = \frac{1}{2}\|y - x\|_2^2 + \lambda \,\mathrm{Pr}(x),
where the first term expresses the relation to the measurements and the second is the prior (regularization).
- This is in fact a Bayesian point of view, adopting Maximum-A-posteriori-Probability (MAP) estimation.
- Clearly, the wisdom in such an approach lies in the choice of the prior, i.e., in modeling the signals of interest.
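For reference, a short standard derivation (textbook Bayesian reasoning, not specific to this talk) of how such an energy function arises from MAP estimation:

```latex
\hat{x}_{\mathrm{MAP}}
  = \arg\max_x P(x \mid y)
  = \arg\max_x \frac{P(y \mid x)\, P(x)}{P(y)}
  = \arg\min_x \big[ -\log P(y \mid x) - \log P(x) \big].
% With additive white Gaussian noise, -log P(y|x) = ||y - x||_2^2 / (2 sigma^2) + const,
% so identifying -log P(x) with the penalty lambda * Pr(x) recovers
% f(x) = (1/2) ||y - x||_2^2 + lambda * Pr(x).
```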

Slide 5: The Evolution of Pr(x)
During the past several decades we have made all sorts of guesses about the prior Pr(x) for signals/images, including hidden Markov models, compression algorithms as priors, and more. A rough timeline of such priors: Energy → Smoothness → Adapt+Smooth → Robust Statistics → Total-Variation → Wavelet Sparsity → Sparse & Redundant.

Slide 6: Sparse Modeling of Signals
A fixed dictionary D of size N×K (with K > N, i.e., redundant) generates signals x = Dα:
- Every column in D is a prototype signal (an atom).
- The vector α is sparse and random: it is generated with few (say L) non-zeros, at random locations and with random values.
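A minimal NumPy sketch of this generation model (the sizes N, K, L are illustrative; the slide's figure labels were lost in transcription):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, L = 20, 30, 3                       # signal length, #atoms, cardinality

D = rng.standard_normal((N, K))
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms (columns)

alpha = np.zeros(K)
support = rng.choice(K, size=L, replace=False)
alpha[support] = rng.standard_normal(L)   # random values at random locations

x = D @ alpha                             # the generated (ideal) signal
```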

Slide 7: Back to Our MAP Energy Function
- The L0 "norm" \|\alpha\|_0 effectively counts the number of non-zeros in α.
- The vector α is the (sparse and redundant) representation of the signal.
- Bottom line: denoising of y is done by minimizing
  \min_\alpha \|\alpha\|_0 \ \ \text{subject to}\ \ \|D\alpha - y\|_2^2 \le \varepsilon,
  or the penalized form
  \min_\alpha \|D\alpha - y\|_2^2 + \mu \|\alpha\|_0,
  and the denoised signal is \hat{x} = D\hat{\alpha}.

Slide 8: The Solver We Use: Greed-Based
- Matching Pursuit (MP) is one of the greedy algorithms that finds one atom at a time [Mallat & Zhang ('93)].
- Step 1: find the single atom that best matches the signal.
- Next steps: given the previously found atoms, find the next one that best fits the residual.
- The algorithm stops when the error falls below the destination threshold.
- Orthogonal Matching Pursuit (OMP) is an improved version that re-evaluates the coefficients by least squares after each round.

Slide 9: Orthogonal Matching Pursuit
[Flowchart: Initialization → Main Iteration → Stop? Yes/No.] OMP finds one atom at a time while approximating the solution of \min_\alpha \|\alpha\|_0 \ \text{s.t.}\ \|D\alpha - y\|_2^2 \le \varepsilon.
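A compact Python sketch of OMP as described on these two slides (a generic textbook implementation, not the author's Matlab code; the function name and the stopping rule are mine):

```python
import numpy as np

def omp(D, y, eps):
    """Greedy pursuit: pick the atom best matching the residual,
    re-fit all chosen coefficients by least squares, repeat until
    the residual energy drops below eps."""
    _, K = D.shape
    support = []
    a = np.zeros(K)
    residual = y.copy()
    while residual @ residual > eps and len(support) < K:
        # Atom most correlated with the current residual
        # (atoms are assumed to be unit-norm columns).
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # Least-squares re-evaluation of all active coefficients.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        a = np.zeros(K)
        a[support] = coef
        residual = y - D @ a
    return a
```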

Slide 10: Part II – Finding & Using More than One Sparse Representation

Slide 11: Back to the Beginning. What If …
Consider the denoising problem, and suppose that we can find a group of J candidate solutions α₁, …, α_J such that each is sparse and each gives a small residual \|D\alpha_j - y\|_2^2 \le \varepsilon. Basic questions:
- What could we do with such a set of competing solutions in order to better denoise y?
- Why should this work?
- How shall we practically find such a set of solutions?
These questions were studied and answered recently [Elad & Yavneh ('08)].

Slide 12: Motivation
Why bother with such a set?
- Because of the intriguing relation to example-based techniques, where several nearest neighbors of the signal are used jointly.
- Because each representation conveys a different story about the desired signal.
- Because pursuit algorithms are often wrong in finding the sparsest representation, and relying on a single solution is then too sensitive.
- … and maybe there are "deeper" reasons?

Slide 13: Generating Many Representations
Our answer: randomizing the OMP. [Flowchart: the OMP loop of Slide 9 (Initialization → Main Iteration → Stop? Yes/No), with the deterministic atom selection replaced by a randomized choice, so that repeated runs yield different representations.]
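A hedged Python sketch of such a randomized OMP ("RandOMP"): the argmax atom selection is replaced by a random draw whose probability grows with the squared correlation to the residual. The exponent constant c below follows the Elad-Yavneh analysis, but the slide's exact formula was lost in transcription, so treat it as an assumption:

```python
import numpy as np

def rand_omp(D, y, eps, sigma, sigma_x, rng):
    """Like OMP, but the next atom is drawn at random with probability
    increasing in its correlation to the residual; different runs
    therefore return different sparse representations."""
    _, K = D.shape
    # Assumed weighting constant (Elad & Yavneh); not confirmed by the slide.
    c = sigma_x**2 / (2 * sigma**2 * (sigma_x**2 + sigma**2))
    support, a = [], np.zeros(K)
    residual = y.copy()
    while residual @ residual > eps and len(support) < K:
        corr = D.T @ residual
        logits = c * corr**2
        w = np.exp(logits - logits.max())       # numerically stabilized weights
        j = int(rng.choice(K, p=w / w.sum()))
        if j in support:
            break                               # re-drawn atom: stop rather than loop
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        a = np.zeros(K)
        a[support] = coef
        residual = y - D @ a
    return a
```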

Slide 14: Let's Try
Proposed experiment:
- Form a random dictionary D.
- Multiply by a sparse vector α₀.
- Add Gaussian i.i.d. noise (σ = 1) and obtain y = Dα₀ + v.
- Solve the problem using OMP and obtain a representation.
- Use RandOMP (many runs) and obtain a set of representations.
- Let's look at the obtained representations …
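Putting the pieces together, a sketch of this experiment using the omp and rand_omp functions above (the problem sizes, the cardinality, and the number of RandOMP runs are my assumptions, since the slide's values were lost; plain averaging of the RandOMP runs is used, anticipating the next slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, L, sigma, sigma_x = 100, 200, 10, 1.0, 1.0   # assumed sizes

# Random unit-norm dictionary and a sparse ground truth x0 = D @ a0.
D = rng.standard_normal((N, K))
D /= np.linalg.norm(D, axis=0)
a0 = np.zeros(K)
a0[rng.choice(K, L, replace=False)] = sigma_x * rng.standard_normal(L)
x0 = D @ a0
y = x0 + sigma * rng.standard_normal(N)

eps = N * sigma**2          # stop when the residual reaches the noise level

a_omp = omp(D, y, eps)
J = 100                     # number of RandOMP representations to average
a_avg = np.mean([rand_omp(D, y, eps, sigma, sigma_x, rng)
                 for _ in range(J)], axis=0)

for name, a in [("OMP", a_omp), ("RandOMP average", a_avg)]:
    # Noise attenuation: output error energy relative to input noise energy.
    rel = np.sum((D @ a - x0) ** 2) / np.sum((y - x0) ** 2)
    print(f"{name}: noise attenuation {rel:.3f}")
```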

Slide 15: Some Observations
We see that OMP gives the sparsest solution; nevertheless, it is not the most effective for denoising. The cardinality of a representation does not reveal its efficiency.

Slide 16: The Surprise …
Let's propose the average of the RandOMP representations as our solution. This representation is NOT SPARSE AT ALL, yet its noise attenuation is 0.06, whereas OMP gives 0.16.

Slide 17: Is It Consistent? Yes!
Here are the results of 1000 trials with the same parameters: the averaged RandOMP solution consistently denoises better than the OMP one. [Per-trial scatter plot lost in transcription.]

Slide 18: The Explanation – Our Signal Model
Assumed signal model, with a fixed N×K dictionary D:
- D is fixed and known.
- α is built by: choosing the support S with probability P(S) out of all 2^K possibilities Ω, and then choosing the coefficients on S as i.i.d. Gaussian entries* ~ N(0, σ_x²), giving P(x|S).
- The ideal signal is x = Dα, so the p.d.f. of the signal is P(x) = \sum_{S \in \Omega} P(S)\, P(x|S).
* Not exactly, but this does not change our analysis.

Slide 19: The Explanation – Adding Noise
We measure y = x + v, where the noise v is an additive white Gaussian vector with p.d.f. P_v(v). The p.d.f. of the noisy signal P(y), and the conditionals P(y|S) and P(S|y), are then clear and well-defined (although nasty).

Slide 20: Some Algebra Leads To …
Conditioned on a support S, the estimate E[x|y,S] is a (regularized) projection of the signal y onto the support S. Implication: the best estimator in terms of L2 error, namely the MMSE estimator E[x|y], is a weighted average of many sparse representations!
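Reconstructed in standard notation (the slide's equation images were lost; this is the usual computation under the Gaussian model of Slides 18-19, with D_S denoting the sub-dictionary of the atoms in S):

```latex
\hat{x}_S = \mathbb{E}[x \mid y, S]
          = D_S \Big( D_S^\top D_S + \tfrac{\sigma^2}{\sigma_x^2} I \Big)^{-1} D_S^\top y
  \quad \text{(a regularized projection of $y$ onto the atoms in $S$),}
\qquad
\hat{x}_{\mathrm{MMSE}} = \mathbb{E}[x \mid y]
          = \sum_{S \in \Omega} P(S \mid y)\, \hat{x}_S .
```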

Slide 21: As It Turns Out …
- The MMSE estimate we derived requires a sweep through all 2^K supports (a combinatorial search), which is impractical.
- Similarly, an explicit expression for P(x|y) can be derived and maximized; this is the MAP estimate, and it too requires a sweep through all possible supports, so it is impractical as well.
- OMP is a (good) approximation of the MAP estimate.
- RandOMP is a (good) approximation of the Minimum-Mean-Squared-Error (MMSE) estimate: it is close to a Gibbs sampler of the probability P(S|y), from which the weights should be drawn.
Back to the beginning: why use several representations? Because their average leads to provably better noise suppression.
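For a dictionary small enough to enumerate (as in the experiment on the next slide), the exact MMSE estimate can be computed directly. A sketch, simplified to supports of a known fixed cardinality L and a uniform prior P(S), both of which are my assumptions:

```python
import numpy as np
from itertools import combinations

def exact_mmse(D, y, L, sigma, sigma_x):
    """Exact MMSE by enumerating all supports of cardinality L:
    a P(S|y)-weighted average of per-support regularized projections."""
    N, K = D.shape
    log_w, xs = [], []
    for S in combinations(range(K), L):
        DS = D[:, S]
        # y | S is zero-mean Gaussian with covariance C (signal + noise).
        C = sigma_x**2 * (DS @ DS.T) + sigma**2 * np.eye(N)
        _, logdet = np.linalg.slogdet(C)         # C is positive definite
        log_w.append(-0.5 * (y @ np.linalg.solve(C, y)) - 0.5 * logdet)
        # Regularized projection of y onto the atoms of S.
        G = DS.T @ DS + (sigma**2 / sigma_x**2) * np.eye(L)
        xs.append(DS @ np.linalg.solve(G, DS.T @ y))
    log_w = np.array(log_w)
    w = np.exp(log_w - log_w.max())              # stabilized posterior weights
    w /= w.sum()
    return (w[:, None] * np.array(xs)).sum(axis=0)
```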

Slide 22: Comparative Results
The following results correspond to a small dictionary (20×30), for which the combinatorial formulas can be evaluated as well. Parameters: N = 20, K = 30, true support cardinality = 3, σ_x = 1, averaged over 1000 experiments. [Plot of relative mean-squared error lost in transcription; legend: 1. empirical oracle (known support), 2. theoretical oracle, 3. empirical MMSE, 4. theoretical MMSE, 5. empirical MAP, 6. theoretical MAP, 7. OMP, 8. RandOMP.]

Slide 23: Part III – Summary and Conclusions

Slide 24: Today We Have Seen that …
Sparsity, redundancy, and the use of examples are important ideas that can be used in designing better tools in signal/image processing. In our work we cover theoretical, numerical, and applicative issues related to this model and its use in practice. Today we have shown that averaging several sparse representations of a signal leads to better denoising, as it approximates the MMSE estimator. More on all of this (including the slides, the papers, and a Matlab toolbox) is available online.