SHRINKAGE FOR REDUNDANT REPRESENTATIONS? Michael Elad The Computer Science Department The Technion – Israel Institute of Technology Haifa 32000, Israel.


SHRINKAGE FOR REDUNDANT REPRESENTATIONS? Michael Elad The Computer Science Department The Technion – Israel Institute of Technology Haifa 32000, Israel SPARS'05 Signal Processing with Adaptive Sparse Structured Representations November 16-18, 2005 – Rennes, France

Shrinkage for Redundant representations? 2 Noise Removal Our story begins with signal/image denoising … Remove Additive Noise? • 100 years of activity – numerous algorithms. • Directions considered include: PDE methods, statistical estimators, adaptive filters, inverse problems & regularization, sparse representations, …

Shrinkage for Redundant representations? 3 Shrinkage For Denoising • Shrinkage is a simple yet effective denoising algorithm [Donoho & Johnstone, 1993]. [Diagram: Apply Wavelet Transform → LUT → Apply Inverse Wavelet Transform] • Justification 1: minimax near-optimal over the Besov (smoothness) signal space (complicated!). • Justification 2: Bayesian (MAP) optimal [Simoncelli & Adelson 1996, Moulin & Liu 1999]. • In both justifications, additive white Gaussian noise and a unitary transform are crucial assumptions for the optimality claims.
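The unitary scheme above can be sketched in a few lines. This is a toy illustration, using a 2-point orthonormal Haar transform as a stand-in for a full wavelet transform; the function names are illustrative, not from the talk:

```python
import math

def soft(z, t):
    """The shrinkage LUT: soft-threshold a scalar coefficient."""
    return math.copysign(max(abs(z) - t, 0.0), z)

def haar_pair(a, b):
    """2-point orthonormal Haar butterfly; the matrix is symmetric and
    orthonormal, so the same operation is also its own inverse."""
    s = 1.0 / math.sqrt(2.0)
    return (s * (a + b), s * (a - b))

def denoise(signal, threshold):
    """Transform -> shrink -> inverse transform, on non-overlapping pairs."""
    out = []
    for i in range(0, len(signal), 2):
        lo, hi = haar_pair(signal[i], signal[i + 1])
        lo, hi = soft(lo, threshold), soft(hi, threshold)
        a, b = haar_pair(lo, hi)  # inverse = same butterfly (orthonormal)
        out += [a, b]
    return out
```

With threshold 0 this reduces to the identity; with a large threshold every coefficient is zeroed out and the output collapses to zero, the two extremes of the trade-off the threshold controls.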

Shrinkage for Redundant representations? 4 Redundant Transforms? The number of coefficients is now (much) greater than the number of input samples (pixels). [Diagram: Apply Redundant Transform → LUT → Apply its (pseudo-) Inverse Transform] • This scheme is still applicable, and it works fine (tested with curvelet, contourlet, undecimated wavelet, and more). • However, it is no longer the optimal solution for the MAP criterion. TODAY'S FOCUS: IS SHRINKAGE STILL RELEVANT WHEN HANDLING REDUNDANT (OR NON-UNITARY) TRANSFORMS? HOW? WHY?

Shrinkage for Redundant representations? 5 Agenda 1. Bayesian Point of View – a Unitary Transform: optimality of shrinkage 2. What About Redundant Representations? Is shrinkage still relevant? Why? How? 3. Conclusions [Portrait: Thomas Bayes]

Shrinkage for Redundant representations? 6 The MAP Approach Given the measurements y, and with x the unknown to be recovered, minimize with respect to x a function composed of two parts: a log-likelihood term and a prior (regularization) term.
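The function itself is not reproduced in the transcript; in the standard MAP form used throughout the talk (Gaussian noise, prior expressed through a regularizer), it presumably reads:

```latex
f(x) \;=\; \underbrace{\tfrac{1}{2}\,\lVert x - y\rVert_2^2}_{\text{log-likelihood term}}
\;+\; \lambda\,\underbrace{\rho(x)}_{\text{prior / regularization}}
```

where y are the given measurements, x is the unknown to be recovered, and ρ(·) encodes the prior Pr(x).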

Shrinkage for Redundant representations? 7 Image Prior? During the past several decades we have made all sorts of guesses about the prior Pr(x): the Mumford & Shah formulation, compression algorithms as priors, … [Timeline of priors: Energy → Smoothness → Adaptive smoothness → Robust statistics → Total variation → Wavelet sparsity → Sparse & redundant (today's focus)]

Shrinkage for Redundant representations? 8 (Unitary) Wavelet Sparsity With the prior λ‖Wx‖₁ and a unitary W, the objective is f(x) = ½‖x − y‖₂² + λ‖Wx‖₁. Since the L2 norm is unitarily invariant, define z = Wx (so x = Wᴴz) and get f(z) = ½‖z − Wy‖₂² + λ‖z‖₁. We got a separable set of 1-D optimization problems.

Shrinkage for Redundant representations? 9 Why Shrinkage? We want to minimize the 1-D function f(z) = ½(z − a)² + λ|z| with respect to z, where a is the corresponding coefficient of Wy. Its minimizer is the soft threshold ẑ = sign(a)·max(|a| − λ, 0), implemented as a LUT. A LUT can be built for any other robust function (replacing the |z|), including non-convex ones (e.g., the L0 norm)!
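The soft-threshold formula can be sanity-checked against a brute-force minimization of the same 1-D function. A small sketch (the grid search is only for verification, not part of any algorithm):

```python
def soft(a, lam):
    """Closed-form minimizer of f(z) = 0.5*(z - a)**2 + lam*abs(z)."""
    if a > lam:
        return a - lam
    if a < -lam:
        return a + lam
    return 0.0

def brute_min(a, lam, lo=-10.0, hi=10.0, n=200001):
    """Grid-search the same objective and return the best z found."""
    best_z, best_f = lo, float("inf")
    for i in range(n):
        z = lo + (hi - lo) * i / (n - 1)
        f = 0.5 * (z - a) ** 2 + lam * abs(z)
        if f < best_f:
            best_f, best_z = f, z
    return best_z
```

For |a| ≤ λ the quadratic pull toward a is dominated by the λ|z| penalty at the origin, which is why the minimizer is exactly zero there; this dead-zone is what produces sparsity.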

Shrinkage for Redundant representations? 10 Agenda 1. Bayesian Point of View – a Unitary Transform: optimality of shrinkage 2. What About Redundant Representations? Is shrinkage still relevant? Why? How? 3. Conclusions

Shrinkage for Redundant representations? 11 An Overcomplete Transform Redundant transforms are important because they can (i) lead to a shift-invariance property, (ii) represent images better (because of orientation/scale analysis), and (iii) enable deeper sparsity (and thus give a more structured prior).

Shrinkage for Redundant representations? 12 However: Analysis versus Synthesis For a redundant transform T the two formulations are no longer equivalent. Analysis prior: minimize ½‖x − y‖₂² + λ‖Tx‖₁ with respect to x. Synthesis prior: define x = Dα and minimize ½‖Dα − y‖₂² + λ‖α‖₁ with respect to α; this is the Basis Pursuit formulation.

Shrinkage for Redundant representations? 13 Basis Pursuit As Objective Our objective: minimize f(α) = ½‖Dα − y‖₂² + λ‖α‖₁ with respect to α. Getting a sparse solution implies that y is composed of few atoms from D.

Shrinkage for Redundant representations? 14 Sequential Coordinate Descent • The unknown, α, has k entries. • How about optimizing with respect to each of them sequentially? Set j=1; fix all entries of α apart from the j-th one; optimize the objective with respect to α_j; set j = j+1 mod k, and repeat. • The objective per each entry becomes a 1-D problem.
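A toy sketch of the sequential scheme, assuming a small explicit matrix D (which is exactly what real transforms do not provide). Each inner update is solved by the soft-threshold LUT, with the correlation of the atom against the residual as its argument and a threshold scaled by the atom's squared norm:

```python
def soft(a, lam):
    """Soft threshold: exact minimizer of 0.5*(z - a)**2 + lam*abs(z)."""
    return (a - lam) if a > lam else (a + lam) if a < -lam else 0.0

def coord_descent(D, y, lam, sweeps=50):
    """Sequential shrinkage: cycle over the k coefficients; each update
    is a 1-D problem solved exactly by a scalar shrinkage."""
    n, k = len(D), len(D[0])
    alpha = [0.0] * k
    for _ in range(sweeps):
        for j in range(k):
            # residual with the j-th atom's contribution excluded
            r = [y[i] - sum(D[i][m] * alpha[m] for m in range(k) if m != j)
                 for i in range(n)]
            norm2 = sum(D[i][j] ** 2 for i in range(n))   # ||d_j||^2
            corr = sum(D[i][j] * r[i] for i in range(n))  # d_j^T r
            alpha[j] = soft(corr / norm2, lam / norm2)
    return alpha
```

Note that every update touches a single column d_j of D, which is why the scheme is impractical when D is only available as a fast operator.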

Shrinkage for Redundant representations? 15 We Get Sequential Shrinkage BEFORE: we had the 1-D function f(z) = ½(z − a)² + λ|z| to minimize, and the solution was the soft threshold ẑ = sign(a)·max(|a| − λ, 0). NOW: our 1-D objective is f(α_j) = ½‖r_j − d_j·α_j‖₂² + λ|α_j|, where r_j is the residual without the j-th atom's contribution, and the solution is again a shrinkage: α_j = sign(c_j)·max(|c_j| − λ/‖d_j‖₂², 0), with c_j = d_jᵀr_j/‖d_j‖₂².

Shrinkage for Redundant representations? 16 Sequential? Not Good!! • This method (set j=1; fix all entries of α apart from the j-th one; optimize with respect to α_j; j = j+1 mod k) requires drawing one column at a time from D. • In most transforms this is not convenient at all!

Shrinkage for Redundant representations? 17 How About Parallel Shrinkage? • Assume a current solution α_n. • Using the previous method, we have k descent directions, each obtained by a simple shrinkage. • How about taking all of them at once, with a proper relaxation? • A little bit of math leads to: for j=1:k, compute the descent direction per α_j: v_j; then update the solution by combining all of them.

Shrinkage for Redundant representations? 18 The Proposed Algorithm Initialize α₀, then iterate: compute the synthesis error e = y − Dα_n; compute its back-projection b = Dᴴe; apply a shrinkage operation; and update by line-search (*). (*) At all stages, the dictionary is applied as a whole, either directly or via its adjoint.
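The iteration can be sketched as follows. This is a simplified version with a fixed relaxation step μ and a fixed constant c in place of the slide's line search, and it assumes c is at least the largest eigenvalue of DᵀD; all parameter choices here are illustrative:

```python
def soft(a, lam):
    """Scalar shrinkage (soft threshold)."""
    return (a - lam) if a > lam else (a + lam) if a < -lam else 0.0

def iterated_shrinkage(D, y, lam, c, iters=100, mu=1.0):
    """Parallel shrinkage: each iteration uses only D as a whole (the
    synthesis error) and its adjoint (the back-projection), followed by
    a scalar shrinkage; no individual atoms are ever extracted."""
    n, k = len(D), len(D[0])
    alpha = [0.0] * k
    for _ in range(iters):
        # synthesis error: e = y - D*alpha
        e = [y[i] - sum(D[i][j] * alpha[j] for j in range(k)) for i in range(n)]
        # back-projection via the adjoint: b = D^H e
        b = [sum(D[i][j] * e[i] for i in range(n)) for j in range(k)]
        # shrink the updated coefficients, then relax with step mu
        new = [soft(alpha[j] + b[j] / c, lam / c) for j in range(k)]
        alpha = [alpha[j] + mu * (new[j] - alpha[j]) for j in range(k)]
    return alpha
```

With a unitary D and c = 1, a single iteration already returns the unitary shrinkage solution, consistent with the relation to simple shrinkage discussed on the next slides.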

Shrinkage for Redundant representations? 19 The First Iteration – A Closer Look Initialize α₀ = 0 and compute the first iteration (*). For example: a tight (DDᴴ = c·I) and normalized (W = I) frame.

Shrinkage for Redundant representations? 20 Relation to Simple Shrinkage [Diagram: Apply Redundant Transform → LUT → Apply its (pseudo-) Inverse Transform] In this case the first iteration coincides with the simple shrinkage scheme shown earlier.

Shrinkage for Redundant representations? 21 A Simple Example D: a 100×1000 matrix, a union of 10 random unitary matrices; y = Dα, with α having 15 non-zeros in random locations; λ = 1, α₀ = 0; line-search: Armijo. [Plot: objective function value vs. iteration, comparing Steepest Descent, Sequential Shrinkage, Parallel Shrinkage with line-search, and MATLAB's fminunc]

Shrinkage for Redundant representations? 22 Image Denoising [Plot: objective function vs. iterations, comparing Iterative Shrinkage, Steepest Descent, Conjugate Gradient, and Truncated Newton] The matrix W gives a variance per each coefficient, learned from the corrupted image. D is the contourlet transform (recent version). The length of α: ~1e+6. The Sequential Shrinkage algorithm can no longer be simulated.

Shrinkage for Redundant representations? 23 Image Denoising [Plot: denoising PSNR vs. iterations, comparing Iterative Shrinkage, Steepest Descent, Conjugate Gradient, and Truncated Newton] Even though one iteration of our algorithm is equivalent in complexity to one iteration of Steepest Descent, the performance is much better.

Shrinkage for Redundant representations? 24 Image Denoising [Images: Original Image; Noisy Image with σ=20; Iterated Shrinkage, first iteration, PSNR=28.30dB; Iterated Shrinkage, second iteration, PSNR=31.05dB]

Shrinkage for Redundant representations? 25 Closely Related Work • The "same" algorithm was derived in several other works: sparse representation over curvelets [Starck, Candes, Donoho, 2003]; the E-M algorithm for image restoration [Figueiredo & Nowak 2003]; Iterated Shrinkage for general linear inverse problems [Daubechies, Defrise, & De-Mol, 2004]. • The work proposed here is different in several ways: Motivation: shrinkage for redundant representations, rather than general inverse problems. Derivation: we used a parallel CD-algorithm, while others used the EM algorithm or a sequence of surrogate functions. Algorithm: we obtain a slightly different algorithm, where the norms of the atoms are used differently, different thresholds are used, and the choice of μ is different.

Shrinkage for Redundant representations? 26 Agenda 1. Bayesian Point of View – a Unitary Transform: optimality of shrinkage 2. What About Redundant Representations? Is shrinkage still relevant? Why? How? 3. Conclusions

Shrinkage for Redundant representations? 27 Conclusion Shrinkage is an appealing signal denoising technique. When is it optimal? For additive Gaussian noise and unitary transforms. What if the transform is redundant? Option 1: apply sequential coordinate descent, which leads to a sequential shrinkage algorithm. How to avoid the need to extract atoms? Go parallel: compute all the CD directions and use their average. Getting what? A very easy to implement parallel shrinkage algorithm that requires only a forward transform, scalar shrinkage, and an inverse transform.