Image Decomposition and Inpainting By Sparse & Redundant Representations. Michael Elad, The Computer Science Department, The Technion – Israel Institute of Technology.

Presentation transcript:

Image Decomposition and Inpainting By Sparse & Redundant Representations. Michael Elad, The Computer Science Department, The Technion – Israel Institute of Technology, Haifa 32000, Israel. SIAM Conference on Imaging Science, May 15-17, 2005, Minneapolis, Minnesota. Session: Variational and PDE models for image decomposition – Part II. David L. Donoho, Statistics Department, Stanford, USA. Jean-Luc Starck, CEA – Service d'Astrophysique, CEA-Saclay, France.

Sparse representations for Image Decomposition 2 General. Sparsity and over-completeness have important roles in analyzing and representing signals. The main directions of our research efforts in recent years: analysis of the (basis/matching) pursuit algorithms, properties of sparse representations (uniqueness), and deployment to applications. Today we discuss the image-decomposition application (image = cartoon + texture). We present: (1) theoretical analysis serving this application, (2) practical considerations, and (3) an application – filling holes in images (inpainting).

Sparse representations for Image Decomposition 3 Agenda. 1. Introduction – Sparsity and Over-completeness!? 2. Theory of Decomposition – Uniqueness and Equivalence. 3. Decomposition in Practice – Practical Considerations, Numerical Algorithm. 4. Discussion.

Sparse representations for Image Decomposition 4 Atom (De-)Composition. Given a signal s, we are often interested in its representation (transform) as a linear combination of 'atoms' from a given dictionary Φ of size N×L: s = Φα. If the dictionary is over-complete (L > N), there are numerous ways to obtain the 'atom-decomposition' α. Among those possibilities, we consider the sparsest.
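A tiny NumPy illustration (my own, not from the slides) of why over-completeness creates many valid representations: the minimum-energy solution returned by the pseudo-inverse differs from the sparse representation that generated the signal, yet both reproduce s exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 4, 8                          # over-complete: more atoms than dimensions
Phi = rng.standard_normal((N, L))
Phi /= np.linalg.norm(Phi, axis=0)   # unit-norm atoms

alpha_sparse = np.zeros(L)
alpha_sparse[[1, 5]] = [2.0, -1.0]   # a 2-sparse representation
s = Phi @ alpha_sparse

alpha_l2 = np.linalg.pinv(Phi) @ s   # minimum-energy representation of the same s
print(np.allclose(Phi @ alpha_l2, s))                       # True: also valid
print(np.count_nonzero(alpha_sparse),                       # 2 nonzeros ...
      np.count_nonzero(np.round(alpha_l2, 12)))             # ... vs. typically all 8
```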

Sparse representations for Image Decomposition 5 Atom Decomposition? Searching for the sparsest representation, we face the optimization task (P_0): min_α ||α||_0 subject to Φα = s. This is hard to solve; the complexity grows exponentially with L. Two practical routes: (i) greedy stepwise regression – the Matching Pursuit (MP) algorithm or its orthogonal version (OMP) [Mallat & Zhang '93]; (ii) replace the l_0 norm by an l_1 norm – Basis Pursuit (BP) [Chen, Donoho & Saunders '95]: (P_1): min_α ||α||_1 subject to Φα = s.
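To make the greedy route concrete, here is a minimal Orthogonal Matching Pursuit sketch in Python (NumPy only). It is a generic OMP under the usual assumptions (unit-norm atoms, target sparsity k supplied by the caller), not the implementation used in this work; Phi, s, and k are placeholders.

```python
import numpy as np

def omp(Phi, s, k, tol=1e-8):
    """Greedy Orthogonal Matching Pursuit: approximate
    min ||alpha||_0 subject to Phi @ alpha = s, using at most k atoms."""
    N, L = Phi.shape
    residual = s.copy()
    support = []
    alpha = np.zeros(L)
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        correlations = np.abs(Phi.T @ residual)
        correlations[support] = 0.0          # do not re-pick chosen atoms
        j = int(np.argmax(correlations))
        support.append(j)
        # Re-fit all chosen coefficients by least squares (the "orthogonal" step).
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], s, rcond=None)
        residual = s - Phi[:, support] @ coeffs
        if np.linalg.norm(residual) < tol:
            break
    alpha[support] = coeffs
    return alpha
```

Basis Pursuit would instead solve the l_1 problem with a linear-programming or iterative-shrinkage solver; the shrinkage route reappears in the algorithm slide later.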

Sparse representations for Image Decomposition 6 Questions about Decomposition. Interesting observation: in many cases the pursuit algorithms successfully find the sparsest representation. Why should BP/MP/OMP work well? Are there conditions for this success? Could there be several different sparse representations? What about uniqueness? How does all this lead to image separation? To inpainting?

Sparse representations for Image Decomposition 7 Agenda. 1. Introduction – Sparsity and Over-completeness!? 2. Theory of Decomposition – Uniqueness and Equivalence. 3. Decomposition in Practice – Practical Considerations, Numerical Algorithm. 4. Discussion.

Sparse representations for Image Decomposition 8 Decomposition – Definition. We consider two families of images: a family of texture images and a family of cartoon images. Our assumption: the given image s is a superposition of one component from each family. Our inverse problem: given s, find its building parts and the mixture weights.

Sparse representations for Image Decomposition 9 Use of Sparsity. The dictionary Φ_x (of size N×L) is chosen such that the representations of images X from the first family are sparse, while the representations of images Y from the other family are non-sparse. We similarly construct Φ_y to sparsify the Y's while being inefficient in representing the X's.

Sparse representations for Image Decomposition 10 Choice of Dictionaries. Two routes: training from examples, or an educated guess: texture could be represented by a local overlapped DCT, and cartoon could be built by Curvelets/Ridgelets/Wavelets (depending on the content). Note that if we desire to enable partial support and/or different scales, the dictionaries must have multiscale and locality properties built into them.
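As a toy illustration of the "educated guess" route, the sketch below builds an overcomplete 2D DCT dictionary for small patches; the patch size and redundancy are arbitrary, and applying it to overlapping patches is what makes it the "local overlapped" DCT mentioned above. The function name and parameters are my own, not code from the authors.

```python
import numpy as np

def overcomplete_dct_dictionary(patch_size=8, atoms_per_dim=11):
    """Overcomplete 2D DCT dictionary for (patch_size x patch_size) patches:
    sample atoms_per_dim > patch_size cosine frequencies in 1D, then take the
    Kronecker product of the two 1D dictionaries."""
    n, k = patch_size, atoms_per_dim
    D1 = np.cos(np.outer(np.arange(n) + 0.5, np.arange(k)) * np.pi / k)
    D1 -= D1.mean(axis=0, keepdims=True)   # remove DC from the oscillating atoms
    D1[:, 0] = 1.0 / np.sqrt(n)            # keep one explicit DC atom
    D1 /= np.linalg.norm(D1, axis=0)       # unit-norm columns
    return np.kron(D1, D1)                 # (n*n) x (k*k), redundant since k > n

D = overcomplete_dct_dictionary()
print(D.shape)   # (64, 121): 64-pixel patches, 121 atoms
```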

Sparse representations for Image Decomposition 11 Decomposition via Sparsity. Write the image as s = Φ_x α_x + Φ_y α_y and seek the sparsest pair of representations: min over {α_x, α_y} of ||α_x||_0 + ||α_y||_0 subject to s = Φ_x α_x + Φ_y α_y. Why should this work?

Sparse representations for Image Decomposition 12 Uniqueness via the 'Spark'. Given a unit-norm signal s, assume we hold two different representations for it using Φ: s = Φγ_1 = Φγ_2, so that Φ(γ_1 - γ_2) = Φv = 0. Definition: given a matrix Φ, define σ = Spark{Φ} as the smallest number of columns from Φ that are linearly dependent.

Sparse representations for Image Decomposition 13 Theorem 1 – Uniqueness Rule. Any two different representations of the same signal using an arbitrary dictionary cannot both be sparse: ||γ_1||_0 + ||γ_2||_0 ≥ σ = Spark{Φ} [Donoho & Elad '03]. Consequently, if we find a representation γ that satisfies ||γ||_0 < σ/2, then it is necessarily unique (the sparsest).
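For completeness, the one-line reasoning behind Theorem 1, written out in LaTeX; it follows directly from the definition of the Spark on the previous slide.

```latex
\Phi\gamma_1 = \Phi\gamma_2 = s,\ \gamma_1 \neq \gamma_2
\;\Longrightarrow\; \Phi(\gamma_1-\gamma_2) = 0
\;\Longrightarrow\; \|\gamma_1-\gamma_2\|_0 \ge \sigma = \mathrm{Spark}\{\Phi\}
\;\Longrightarrow\; \|\gamma_1\|_0 + \|\gamma_2\|_0 \ge \|\gamma_1-\gamma_2\|_0 \ge \sigma .
% Hence, if \|\gamma\|_0 < \sigma/2, any competing representation must carry
% more than \sigma/2 nonzeros, so \gamma is the unique sparsest solution.
```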

Sparse representations for Image Decomposition 14 Uniqueness Rule – Implications. Apply the rule to the combined dictionary [Φ_x Φ_y]: if a decomposition s = Φ_x α_x + Φ_y α_y satisfies ||α_x||_0 + ||α_y||_0 < Spark{[Φ_x Φ_y]}/2, it is necessarily the sparsest one possible, and it will be found. For dictionaries effective in describing the 'cartoon' and 'texture' contents, we can say that the decomposition that leads to separation is the sparsest one possible.

Sparse representations for Image Decomposition 15 Lower bound on the "Spark". Define the Mutual Coherence as μ = max over k≠j of |φ_k^T φ_j| (for unit-norm atoms φ_k). Based on the Geršgorin disk theorem, we can show that a lower bound on the spark is σ ≥ 1 + 1/μ. Since the Geršgorin bound is not tight, this lower bound on the Spark is too pessimistic.
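A quick numerical illustration of these quantities in Python (generic NumPy code, not from the talk; the spikes-plus-DCT dictionary and the dimension N = 64 are arbitrary choices).

```python
import numpy as np

def mutual_coherence(Phi):
    """mu = max_{k != j} |<phi_k, phi_j>| for a dictionary with unit-norm columns."""
    Phi = Phi / np.linalg.norm(Phi, axis=0)
    G = np.abs(Phi.T @ Phi)          # absolute Gram matrix
    np.fill_diagonal(G, 0.0)         # ignore self inner products
    return G.max()

# Example: the two-ortho dictionary [identity | DCT] with L = 2N atoms.
N = 64
n, k = np.meshgrid(np.arange(N), np.arange(N), indexing='ij')
C = np.cos(np.pi * (2 * n + 1) * k / (2 * N)) * np.sqrt(2.0 / N)
C[:, 0] /= np.sqrt(2.0)              # orthonormal DCT-II matrix (columns = atoms)
Phi = np.hstack([np.eye(N), C])

mu = mutual_coherence(Phi)           # here mu is about sqrt(2/N)
print(f"mu = {mu:.4f}  ->  Spark >= 1 + 1/mu = {1 + 1/mu:.2f}")
```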

Sparse representations for Image Decomposition 16 Equivalence – The Result. Theorem 2 [Donoho & Elad '02; Gribonval & Nielsen '03]: given a signal s with a representation s = Φγ, and assuming that ||γ||_0 < (1 + 1/μ)/2, P_1 (BP) is guaranteed to find the sparsest solution. In short, BP is guaranteed to succeed if a sufficiently sparse solution exists. A similar result exists for the greedy algorithms [Tropp '03]. In practice, MP and BP succeed far above this bound.
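To give a feel for the bound, a worked example with my own numbers (not from the talk), using the two-ortho dictionary of spikes and DCT atoms from the snippet above.

```latex
% For [I \mid \mathrm{DCT}] in dimension N = 64: \mu \approx \sqrt{2/N} \approx 0.177,
% so the equivalence result guarantees BP recovery whenever
\|\gamma\|_0 \;<\; \tfrac{1}{2}\Bigl(1 + \tfrac{1}{\mu}\Bigr)
\;\approx\; \tfrac{1}{2}\Bigl(1 + \sqrt{N/2}\Bigr) \;\approx\; 3.3 ,
% i.e. only about 3 atoms. This illustrates why the worst-case bound is
% pessimistic compared to the typical empirical behavior noted on the slide.
```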

Sparse representations for Image Decomposition 17 Agenda. 1. Introduction – Sparsity and Over-completeness!? 2. Theory of Decomposition – Uniqueness and Equivalence. 3. Decomposition in Practice – Practical Considerations, Numerical Algorithm. 4. Discussion.

Sparse representations for Image Decomposition 18 Noise Considerations. Forcing an exact representation s = Φ_x α_x + Φ_y α_y is sensitive to additive noise and to model mismatch; we therefore relax the equality into an error-tolerant constraint, ||s - Φ_x α_x - Φ_y α_y||_2 ≤ ε (or an equivalent quadratic data-fidelity penalty). Recent results [Tropp '04; Donoho et al. '04] show that the noisy case generally meets similar rules of uniqueness and equivalence.
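Written out explicitly (my LaTeX transcription of the standard penalized relaxation; the weight lambda is a tuning parameter, not a value given on the slide):

```latex
\min_{\alpha_x,\,\alpha_y}\;
\|\alpha_x\|_1 + \|\alpha_y\|_1
\;+\; \lambda\,\bigl\| s - \Phi_x\alpha_x - \Phi_y\alpha_y \bigr\|_2^2 .
% The Lagrangian (penalized) form of the error-tolerant constraint above;
% lambda trades sparsity of the two representations against data fidelity.
```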

Sparse representations for Image Decomposition 19 Artifacts Removal. We want to add external forces to help the separation succeed even when the dictionaries are not perfect; in practice, a Total-Variation (TV) penalty on the cartoon part is added to suppress ringing artifacts and push that component toward a piecewise-smooth result.

Sparse representations for Image Decomposition 20 Complexity. Posed over the representations, the problem has 2L unknowns, with 2L ≫ 2N, instead of the 2N unknowns of the two separated images themselves. Define the two image unknowns to be s_x = Φ_x α_x and s_y = Φ_y α_y, and obtain a simplified problem posed directly over the images.
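A sketch of the resulting image-domain objective, in the spirit of the companion Starck-Elad-Donoho papers. The use of analysis transforms T_x, T_y in place of the synthesis coefficients, and the placement of the TV term on the cartoon component, reflect my reading of the simplification rather than a formula quoted from the slide.

```latex
\min_{s_x,\,s_y}\;
\|T_x s_x\|_1 \;+\; \|T_y s_y\|_1
\;+\; \lambda\,\| s - s_x - s_y \|_2^2
\;+\; \gamma\,\mathrm{TV}(s_y).
% s_x and s_y are the two image components (2N unknowns), T_x and T_y are the
% corresponding forward (analysis) transforms, and the TV term acts on the
% cartoon part, as discussed on the previous slide.
```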

Sparse representations for Image Decomposition 21 Simplification – Justifications. Heuristic: (1) a bounding-function argument; (2) relation to BCR (block-coordinate relaxation); (3) relation to MAP estimation. Theoretic: see recent results by D.L. Donoho.

Sparse representations for Image Decomposition 22 Algorithm. An algorithm was developed to solve the above problem. It iterates between an update of s_x and an update of s_y. Every update (of either s_x or s_y) is done by a forward and a backward fast transform; this is the dominant computational part of the algorithm. The update itself is performed by soft-thresholding with a diminishing threshold (similar to BCR, but sub-optimal due to the non-unitary dictionaries). The TV part is handled by simple gradient descent. Convergence is obtained after a moderate number of iterations.
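A minimal sketch of this block-coordinate, diminishing-soft-threshold iteration in Python. The transforms are passed in as generic forward/backward function pairs, and the threshold schedule, iteration count, TV step, and all function names are placeholders; this illustrates the structure of the algorithm described above, not the authors' exact implementation.

```python
import numpy as np

def soft_threshold(c, t):
    """Element-wise soft-thresholding (shrinkage) with threshold t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def mca_separate(s, analyze_x, synthesize_x, analyze_y, synthesize_y,
                 n_iter=50, t_max=None, tv_step=0.0, tv_grad=None):
    """Block-coordinate MCA-style separation of s into s_x + s_y.

    analyze_*/synthesize_* are forward/backward transform callables for the two
    dictionaries. The threshold decreases linearly from t_max toward zero, and a
    gradient-descent step on a TV term for s_y is applied if tv_grad is given.
    """
    s_x = np.zeros_like(s)
    s_y = np.zeros_like(s)
    if t_max is None:
        t_max = 0.5 * np.abs(s).max()
    for it in range(n_iter):
        t = t_max * (1.0 - it / n_iter)              # diminishing threshold
        # Update s_x: give it the residual not explained by s_y, then shrink.
        r = s - s_y
        s_x = synthesize_x(soft_threshold(analyze_x(r), t))
        # Update s_y symmetrically.
        r = s - s_x
        s_y = synthesize_y(soft_threshold(analyze_y(r), t))
        # Optional TV regularization of the cartoon part by gradient descent.
        if tv_grad is not None and tv_step > 0.0:
            s_y = s_y - tv_step * tv_grad(s_y)
    return s_x, s_y
```

In the experiments reported next, the two transform pairs would correspond to a local DCT (texture) and a curvelet/wavelet transform (cartoon); any pair of fast analysis/synthesis operators can be plugged in here.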

Sparse representations for Image Decomposition 23 Results 1 – Synthetic Case. [Figure] Original image composed as a combination of texture and cartoon; the separated texture (spanned by global DCT functions); the very low-frequency content, removed prior to the separation; the separated cartoon (spanned by 5-layer Curvelet functions + LPF).

Sparse representations for Image Decomposition 24 Results 2 – Synthetic + Noise. [Figure] Original image composed as a combination of texture, cartoon, and additive Gaussian noise; the separated texture (spanned by global DCT functions); the separated cartoon (spanned by 5-layer Curvelet functions + LPF); the residual, being the identified noise.

Sparse representations for Image Decomposition 25 Results 3 – Edge Detection. [Figure] Edge detection on the original image vs. edge detection on the cartoon part of the image.

Sparse representations for Image Decomposition 26 Results 4 – Good Old 'Barbara'. [Figure] Original 'Barbara' image; separated texture using a local overlapped DCT (32×32 blocks); separated cartoon using Curvelets (5 resolution layers).

Sparse representations for Image Decomposition 27 Results 4 – Zoom In. [Figure] Zoom-in on the texture part of the result shown in the previous slide; zoom-in on the cartoon part; the same region taken from the result of Vese et al., for comparison.

Sparse representations for Image Decomposition 28 Results 5 – Gemini. [Figure] The original image, the galaxy SBS as photographed by Gemini; the texture part, spanned by a global DCT; the residual, being the additive noise; the cartoon part, spanned by wavelets.

Sparse representations for Image Decomposition 29 Application – Inpainting. For separation we fit the model to all of s. What if some values in s are unknown (at known locations!!!)? We simply restrict the data-fidelity term to the known samples; the reconstructed sum of the two parts then fills in the missing values, and this image is the inpainted outcome. Interesting comparison to [Bertalmio et al. '02].
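A sketch of how the same objective is adapted for inpainting, consistent with the MCA inpainting paper listed at the end: a diagonal mask M (1 at known pixels, 0 at missing ones) enters the data-fidelity term. The notation follows the earlier sketch and is my transcription, not a quote.

```latex
\min_{s_x,\,s_y}\;
\|T_x s_x\|_1 + \|T_y s_y\|_1
+ \lambda\,\bigl\| M\,( s - s_x - s_y ) \bigr\|_2^2
+ \gamma\,\mathrm{TV}(s_y).
% M zeroes out the missing pixels, so the fit is enforced only where data exist;
% the inpainted image is \hat{s} = s_x + s_y, which is defined everywhere,
% including at the missing locations.
```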

Sparse representations for Image Decomposition 30 Results 6 – Inpainting. [Figure] Source; cartoon part; texture part; inpainted outcome.

Sparse representations for Image Decomposition 31 Results 7 – Inpainting. [Figure] Source; cartoon part; texture part; inpainted outcome.

Sparse representations for Image Decomposition 32 Results 8 – Inpainting. [Figure] Source image and inpainted outcome.

Sparse representations for Image Decomposition 33 Results 9 - Inpainting

Sparse representations for Image Decomposition 34 Agenda. 1. Introduction – Sparsity and Over-completeness!? 2. Theory of Decomposition – Uniqueness and Equivalence. 3. Decomposition in Practice – Practical Considerations, Numerical Algorithm. 4. Discussion.

Sparse representations for Image Decomposition 35 Summary. Over-completeness and sparsity are powerful tools in the representation of signals. Application? Decompose an image into cartoon + texture. Theoretical justification? We show theoretical results explaining how this could lead to successful separation, and we show that the pursuit algorithms are expected to succeed. Practical issues? We present ways to robustify the process and apply it to image inpainting. Where next? Choice of dictionaries, performance beyond the bounds, other applications, and more...

Sparse representations for Image Decomposition 36 These slides and the following related papers can be found in:
M. Elad, "Why Simple Shrinkage is Still Relevant for Redundant Representations?", submitted to the IEEE Trans. on Information Theory (submitted in December).
M. Elad, J.-L. Starck, P. Querre, and D.L. Donoho, "Simultaneous Cartoon and Texture Image Inpainting Using Morphological Component Analysis (MCA)", Journal on Applied and Computational Harmonic Analysis, Vol. 19, November 2005.
D.L. Donoho, M. Elad, and V. Temlyakov, "Stable Recovery of Sparse Overcomplete Representations in the Presence of Noise", IEEE Trans. on Information Theory, Vol. 52, pp. 6-18, January 2006.
J.-L. Starck, M. Elad, and D.L. Donoho, "Image Decomposition via the Combination of Sparse Representations and a Variational Approach", IEEE Trans. on Image Processing, Vol. 14, No. 10, October 2005.
J.-L. Starck, M. Elad, and D.L. Donoho, "Redundant Multiscale Transforms and their Application for Morphological Component Analysis", Advances in Imaging and Electron Physics, Vol. 132, 2004.
D.L. Donoho and M. Elad, "Maximal Sparsity Representation via l1 Minimization", Proc. Nat. Acad. Sci., Vol. 100, March 2003.

Sparse representations for Image Decomposition 37 Appendix A – Relation to Vese & Osher's Formulation. If Φ is the local DCT, then requiring sparsity of the coefficients parallels their requirement for oscillatory behavior of the texture component. If Φ is one resolution layer of the non-decimated Haar transform, the l_1 penalty on its coefficients amounts to TV.
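A short LaTeX sketch (my own wording) of the last claim: the finest-scale detail coefficients of a one-level undecimated Haar transform are, up to a constant, first differences of the image, so an l_1 penalty on them is a discrete (anisotropic) TV term.

```latex
% Undecimated Haar detail coefficients (finest scale, 1-D for brevity):
d[n] \;=\; \tfrac{1}{\sqrt{2}}\,\bigl(s[n+1]-s[n]\bigr)
\quad\Longrightarrow\quad
\|d\|_1 \;=\; \tfrac{1}{\sqrt{2}} \sum_n \bigl| s[n+1]-s[n] \bigr|
\;\propto\; \mathrm{TV}_{\mathrm{discrete}}(s).
% In 2-D, the horizontal and vertical detail bands together give the
% anisotropic discrete TV of the image.
```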