Mathematics and Image Analysis, MIA'06

Sparse & Redundant Signal Representation and its Role in Image Processing. Michael Elad, The Computer Science Department, The Technion – Israel Institute of Technology, Haifa 32000, Israel. Mathematics and Image Analysis (MIA'06), Paris, 18-21 September 2006.

Today's Talk is About Sparsity and Redundancy. We will try to show that sparsity and redundancy can be used to design new (or renewed) and powerful signal/image processing tools. We will present both the theory behind sparsity and redundancy, and how these ideas are deployed in image processing applications. This is an overview lecture, describing the recent activity in this field.

Agenda – Welcome to Sparseland:
1. A Visit to Sparseland – motivating sparsity & overcompleteness.
2. Problem 1: Transforms & Regularizations – how & why should this work?
3. Problem 2: What About D? – the quest for the origin of signals.
4. Problem 3: Applications – image filling-in, denoising, separation, compression, ...

M Generating Signals in Sparseland K N A fixed Dictionary Every column in D (dictionary) is a prototype signal (Atom). M The vector  is generated randomly with few non-zeros in random locations and random values. A sparse & random vector N Sparse and Redundant Signal Representation, and Its Role in Image Processing

M Sparseland Signals Are Special Simple: Every generated signal is built as a linear combination of few atoms from our dictionary D Rich: A general model: the obtained signals are a special type mixture-of-Gaussians (or Laplacians). Multiply by D M Sparse and Redundant Signal Representation, and Its Role in Image Processing

M T Transforms in Sparseland ? M Assume that x is known to emerge from . . We desire simplicity, independence, and expressiveness. M How about “Given x, find the α that generated it in ” ? T Sparse and Redundant Signal Representation, and Its Role in Image Processing

So, In Order to Transform ... we need to solve an under-determined linear system of equations Dα = x, where D and x are known. Among all (infinitely many) possible solutions we want the sparsest one! We will measure sparsity using the L0 norm ||α||_0, the number of non-zeros in α.

Measure of Sparsity? Consider the Lp measures ||α||_p^p = Σ_k |α_k|^p. As p → 0 this tends to a count of the non-zeros in the vector, i.e., the L0 measure used above.
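A tiny numeric illustration of this limit (the example vector below is mine, chosen arbitrarily):

```python
import numpy as np

alpha = np.array([0.0, 2.5, 0.0, -0.1, 0.7])   # three non-zeros
for p in (1.0, 0.5, 0.1, 0.01):
    print(p, np.sum(np.abs(alpha) ** p))
# The printed sums approach 3.0, the L0 count of non-zeros, as p -> 0.
```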

M Signal’s Transform in Sparseland Multiply by D A sparse & random vector M Are there practical ways to get ? 4 Major Questions Is ? Under which conditions? How effective are those ways? How would we get D? Sparse and Redundant Signal Representation, and Its Role in Image Processing

M Q Inverse Problems in Sparseland ? M Assume that x is known to emerge from . M M Suppose we observe , a “blurred” and noisy version of x with . How will we recover x? Noise How about “find the α that generated the x …” again? Q Sparse and Redundant Signal Representation, and Its Role in Image Processing

Inverse Problems in Sparseland – 4 Major Questions (again!): Is the recovered α equal to the original one, and how far can it go? Are there practical ways to get it? How effective are those ways? How would we get D? (The data model: a sparse & random vector, multiplied by D, then "blurred" by H.)

Back Home ... Any Lessons? Several recent trends worth looking at – Sparseland is HERE:
- JPEG to JPEG2000 – from (L2-norm) KLT to wavelets and non-linear approximation: Sparsity.
- From Wiener to robust restoration – from the L2-norm (Fourier) to L1 (e.g., TV, Beltrami, wavelet shrinkage, ...): Sparsity.
- From unitary to richer representations – frames, shift-invariance, steerable wavelets, contourlets, curvelets: Overcompleteness.
- Approximation theory – non-linear approximation: Sparsity & Overcompleteness.
- ICA and related models: Independence and Sparsity.

To Summarize So Far ... The Sparseland model for signals is very interesting. Who cares? We do – it is relevant to us. So, what are the implications? We need to answer the 4 questions posed, and then show that all this works well in practice.

Agenda:
1. A Visit to Sparseland – motivating sparsity & overcompleteness.
2. Problem 1: Transforms & Regularizations – how & why should this work?
3. Problem 2: What About D? – the quest for the origin of signals.
4. Problem 3: Applications – image filling-in, denoising, separation, compression, ...

Let's Start with the Transform ... Our dream for now: find the sparsest solution of Dα = x, with D and x known. Put formally, solve (P0): min ||α||_0 subject to Dα = x.

Question 1 – Uniqueness? Suppose we can solve (P0) exactly. Why should we necessarily get the α that generated x? It might happen that the solution we find is a different (even sparser) representation.

Matrix "Spark" [Donoho & E. ('02)]. Definition: given a matrix D, σ = Spark{D} is the smallest number of columns from D that are linearly dependent. (In tensor decomposition, Kruskal defined something similar already in 1989.) Example: a matrix may have Spark = 3 while its Rank = 4.
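A brute-force check of the definition, feasible only for tiny matrices; the 4×5 example matrix below is mine, chosen to reproduce the slide's "Spark = 3, Rank = 4" situation.

```python
import itertools
import numpy as np

def spark(D, tol=1e-10):
    # Smallest number of columns of D that are linearly dependent (brute force).
    n_rows, n_cols = D.shape
    for s in range(1, n_cols + 1):
        for cols in itertools.combinations(range(n_cols), s):
            if np.linalg.matrix_rank(D[:, list(cols)], tol=tol) < s:
                return s
    return n_cols + 1  # no dependent subset: columns in general position

D = np.array([[1., 0., 0., 0., 1.],
              [0., 1., 0., 0., 1.],
              [0., 0., 1., 0., 0.],
              [0., 0., 0., 1., 0.]])
print(np.linalg.matrix_rank(D), spark(D))  # prints: 4 3
```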

Uniqueness Rule [Donoho & E. ('02)]: suppose the (P0) problem has been solved somehow. If the found representation satisfies ||α||_0 < Spark{D}/2, then it is necessarily unique (the sparsest). This result implies that if M generates signals using "sparse enough" α, solving (P0) will find them exactly.

M Question 2 – Practical P0 Solver? Are there reasonable ways to find ? Multiply by D Sparse and Redundant Signal Representation, and Its Role in Image Processing

Matching Pursuit (MP) [Mallat & Zhang (1993)]: MP is a greedy algorithm that finds one atom at a time. Step 1: find the one atom that best matches the signal. Next steps: given the previously found atoms, find the next one that best fits the residual. The Orthogonal MP (OMP) is an improved version that re-evaluates all the coefficients after each round.
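As an illustration, here is a minimal OMP sketch (mine, not the authors' code); plain MP would skip the least-squares re-fit and update only the newly chosen coefficient.

```python
import numpy as np

def omp(D, x, L):
    # Orthogonal Matching Pursuit sketch; assumes D has normalized columns.
    residual, support, coef = x.copy(), [], np.zeros(0)
    for _ in range(L):
        # Pick the atom most correlated with the current residual.
        k = int(np.argmax(np.abs(D.T @ residual)))
        support.append(k)
        # Re-evaluate all coefficients on the chosen support (the "orthogonal" step).
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    alpha = np.zeros(D.shape[1])
    alpha[support] = coef
    return alpha
```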

Basis Pursuit (BP) [Chen, Donoho, & Saunders (1995)]: instead of solving min ||α||_0 s.t. Dα = x, solve min ||α||_1 s.t. Dα = x. The newly defined problem is convex (a linear program), and very efficient solvers can be deployed: interior-point methods [Chen, Donoho, & Saunders ('95)], sequential shrinkage for a union of ortho-bases [Bruce et al. ('98)], iterated shrinkage [Figueiredo & Nowak ('03), Daubechies, Defrise, & De Mol ('04), E. ('05), E., Matalon, & Zibulevsky ('06)].
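One simple member of the iterated-shrinkage family mentioned above, written as a sketch for the penalized (Lagrangian) form of BP; it is not the specific algorithm of any of the cited papers, and λ and the iteration count are illustrative.

```python
import numpy as np

def ista(D, x, lam, n_iter=200):
    # Iterated shrinkage (ISTA) for: min_a 0.5*||D a - x||^2 + lam*||a||_1
    step = 1.0 / np.linalg.norm(D, 2) ** 2          # 1 / Lipschitz constant of the gradient
    alpha = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ alpha - x)                # gradient of the quadratic term
        z = alpha - step * grad
        alpha = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return alpha
```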

M Question 3 – Approx. Quality? How effective are the MP/BP Multiply by D How effective are the MP/BP in finding ? Sparse and Redundant Signal Representation, and Its Role in Image Processing

Evaluating the "Spark": compute the Gram matrix DᵀD, assuming normalized columns. The Mutual Coherence M is the largest off-diagonal entry of DᵀD in absolute value. The Mutual Coherence is a property of the dictionary (just like the "Spark"), and in fact the following relation can be shown: Spark{D} ≥ 1 + 1/M.
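A direct computation of this quantity (a sketch):

```python
import numpy as np

def mutual_coherence(D):
    # Largest off-diagonal entry of |D^T D|, for a column-normalized dictionary.
    Dn = D / np.linalg.norm(D, axis=0)
    G = np.abs(Dn.T @ Dn)
    np.fill_diagonal(G, 0.0)
    return G.max()

# Lower bound on the Spark implied by the relation above:
#   spark(D) >= 1 + 1 / mutual_coherence(D)
```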

BP and MP Equivalence [Donoho & E. ('02), Gribonval & Nielsen ('03), Tropp ('03), Temlyakov ('03)]: given a signal x with a representation x = Dα, and assuming that ||α||_0 < (1 + 1/M)/2, BP and MP are guaranteed to find the sparsest solution. MP and BP are different in general (hard to say which is better). The above result corresponds to the worst case; average-performance results are available too, showing much better bounds [Donoho ('04), Candes et al. ('04), Tanner et al. ('05), Tropp et al. ('06), Tropp ('06)].

What About Inverse Problems? (The data model: a sparse & random vector, multiplied by D, then "blurred" by H.) We had similar questions regarding uniqueness, practical solvers, and their efficiency. It turns out that similar answers are applicable here, due to several recent works [Donoho, E., & Temlyakov ('04), Tropp ('04), Fuchs ('04), Gribonval et al. ('05)].

To Summarize So Far ... The Sparseland model is relevant to us; we can design transforms and priors based on it. How do we find the sparsest solution? Use pursuit algorithms. Why do they work so well? A sequence of works during the past 3-4 years gives theoretical justification for the behavior of these tools. What next? How shall we find D, and will this work for applications – and which?

Agenda:
1. A Visit to Sparseland – motivating sparsity & overcompleteness.
2. Problem 1: Transforms & Regularizations – how & why should this work?
3. Problem 2: What About D? – the quest for the origin of signals.
4. Problem 3: Applications – image filling-in, denoising, separation, compression, ...

M Problem Setting Multiply by D Given these P examples and a fixed size [NK] dictionary D, how would we find D? Sparse and Redundant Signal Representation, and Its Role in Image Processing

Uniqueness? M If is rich enough* and if Uniqueness then D is unique. Aharon, E., & Bruckstein (`05) M “Rich Enough”: The signals from could be clustered to groups that share the same support. At least L+1 examples per each are needed. Comments: This result is proved constructively, but the number of examples needed to pull this off is huge – we will show a far better method next. A parallel result that takes into account noise is yet to be constructed. Sparse and Redundant Signal Representation, and Its Role in Image Processing

Practical Approach – Objective: minimize over D and A the error ||X - DA||_F^2, subject to every column of A having no more than L non-zeros. Each example is a linear combination of atoms from D; each example has a sparse representation with no more than L atoms [Field & Olshausen ('96), Engan et al. ('99), Lewicki & Sejnowski ('00), Cotter et al. ('03), Gribonval et al. ('04)]. (N, K, L are assumed known; D has normalized columns.)

K-Means for Clustering (clustering can be viewed as an extreme case of sparse coding): initialize D; Sparse Coding stage – assign each example in X to its nearest-neighbor column; Dictionary Update stage – update D column-by-column by a mean computation over the relevant examples; repeat.

The K-SVD Algorithm – General [Aharon, E., & Bruckstein ('04)]: initialize D; Sparse Coding stage – code the examples in X using MP or BP; Dictionary Update stage – update D column-by-column by an SVD computation; repeat.

K-SVD: Sparse Coding Stage. For the jth example we solve min over α of ||Dα - x_j||^2 subject to ||α||_0 ≤ L – a pursuit problem!

K-SVD: Dictionary Update Stage. G_k: the set of examples that use the column d_k. The content of d_k influences only the examples in G_k. Let us fix all of A apart from the coefficients that multiply d_k, and seek both d_k and those coefficients so as to better fit the residual!

K-SVD: Dictionary Update Stage. We should solve min over d_k and its coefficients a_k of ||E_k - d_k a_k^T||_F^2, where E_k is the residual of the examples in G_k with the contribution of d_k removed. d_k (and the updated coefficients) are obtained by a rank-1 SVD of this residual.
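To make the update stage concrete, here is a sketch of one K-SVD dictionary-update sweep in NumPy (variable names and shapes are mine); alternating it with the OMP sparse-coding stage above gives the full K-SVD loop.

```python
import numpy as np

def ksvd_update(D, A, X):
    # One dictionary-update sweep. D: N x K dictionary, A: K x P coefficients, X: N x P examples.
    for k in range(D.shape[1]):
        Gk = np.nonzero(A[k, :])[0]                  # examples that actually use atom d_k
        if Gk.size == 0:
            continue
        # Residual of those examples with the contribution of atom k removed.
        Ek = X[:, Gk] - D @ A[:, Gk] + np.outer(D[:, k], A[k, Gk])
        # Rank-1 fit: take the leading singular pair as the new atom and coefficients.
        U, s, Vt = np.linalg.svd(Ek, full_matrices=False)
        D[:, k] = U[:, 0]
        A[k, Gk] = s[0] * Vt[0, :]
    return D, A
```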

K-SVD: A Synthetic Experiment. Create a 20×30 random dictionary with normalized columns; generate 2000 signal examples, each using 3 atoms, and add noise; train a dictionary using K-SVD and MOD and compare the results.

To Summarize So Far ... The Sparseland model for signals is relevant to us, but in order to use it effectively we need to know D. How can D be found? Use the K-SVD algorithm. Will it work well? We have shown how to practically train D using the K-SVD. What next? Show that all of the above can be deployed in applications.

Agenda:
1. A Visit to Sparseland – motivating sparsity & overcompleteness.
2. Problem 1: Transforms & Regularizations – how & why should this work?
3. Problem 2: What About D? – the quest for the origin of signals.
4. Problem 3: Applications – image filling-in, denoising, separation, compression, ...

Application 1: Image Inpainting. Assume the signal x has been created by x = Dα0 with a very sparse α0. Missing values in x imply missing rows in this linear system; by removing these rows we get a reduced system with dictionary D' and observation x'. Now solve min ||α||_0 s.t. D'α = x'. If α0 was sparse enough, it will be the solution of the above problem! Thus, computing Dα0 recovers x perfectly.

Application 1: Side Note. Compressed Sensing leans on the very same principle, leading to alternative sampling theorems. Assume the signal x has been created by x = Dα0 with a very sparse α0. Multiply this set of equations by a matrix Q that reduces the number of rows; the new, smaller system of equations is QDα0 = Qx. If α0 was sparse enough, it will be the sparsest solution of the new system; thus, computing Dα0 recovers x perfectly. Compressed sensing focuses on conditions for this to happen, guaranteeing such recovery.

Application 1: The Practice. Given a noisy image y, we can clean it using the Maximum A-posteriori Probability (MAP) estimator, minimizing λ||α||_1 + ||Dα - y||^2 [Chen, Donoho, & Saunders ('95)]. What if some of the pixels in that image are missing (filled with zeros)? Define a mask operator as the diagonal matrix W, and now solve instead the masked problem, minimizing λ||α||_1 + ||W(Dα - y)||^2. When handling general images, there is a need to concatenate two dictionaries to get an effective treatment of both texture and cartoon contents – this leads to separation [E., Starck, & Donoho ('05)].
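A small sketch of this masked formulation for a single signal or patch, reusing the ista() sketch from earlier; λ and the iteration count are illustrative, and this is not the authors' implementation.

```python
import numpy as np

def inpaint(D, y, known_mask, lam=0.1, n_iter=300):
    # Solve min_a 0.5*||W(D a - y)||^2 + lam*||a||_1, with W a diagonal 0/1 mask,
    # then reconstruct the full signal as D a (missing entries get filled in).
    W = np.diag(known_mask.astype(float))
    alpha = ista(W @ D, W @ y, lam, n_iter)   # ista() as sketched earlier
    return D @ alpha
```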

Inpainting Results (figure): source image with missing values vs. the inpainted outcome, using a predetermined dictionary: curvelet (cartoon) + overlapped DCT (texture).

Inpainting Results: more about this application will be given in Jean-Luc Starck's talk that follows.

Application 2: Image Denoising. Given a noisy image y, we have already mentioned the ability to clean it by minimizing λ||α||_1 + ||Dα - y||^2. When using the K-SVD, we cannot train a dictionary for large-support images – how do we go from a local treatment of patches to a global prior? The solution: force shift-invariant sparsity on each patch of size N-by-N (e.g., N = 8) in the image, including overlaps.

Application 2: Image Denoising. Our MAP penalty becomes the minimization, over x and all the patch representations {α_ij}, of λ||x - y||^2 + Σ_ij μ_ij ||α_ij||_0 + Σ_ij ||Dα_ij - R_ij x||^2, where R_ij extracts the patch at the (i,j) location; the last two sums form our prior.

Application 2: Image Denoising. The dictionary (and thus the image prior) is trained on the corrupted image itself! This leads to an elegant fusion of the K-SVD and the denoising task, via alternating minimization: initialize x = y with D known; with x and D fixed, compute α_ij per patch using matching pursuit; with x and {α_ij} fixed, compute D to minimize the penalty using the SVD, updating one column at a time (K-SVD); with D and {α_ij} fixed, compute x in closed form, which amounts to a simple averaging of shifted, denoised patches.

Application 2: The Algorithm: (1) sparse-code every patch of 8-by-8 pixels; (2) update the dictionary based on the above codes; (3) compute the output image x. The computational cost of this algorithm is mostly due to the OMP steps done on each image patch – O(N²×L×K×Iterations) per pixel.
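A stripped-down sketch of the patch-based scheme (no dictionary training, a fixed sparsity L instead of an error-based stopping rule, and the λ·y term of the closed-form averaging omitted for brevity; the patch size and L are illustrative). It reuses the omp() sketch from earlier.

```python
import numpy as np

def denoise_image(y, D, L, patch=8):
    # Sparse-code every overlapping patch, then average the overlapping reconstructions.
    H, W = y.shape
    acc = np.zeros_like(y, dtype=float)
    cnt = np.zeros_like(y, dtype=float)
    for i in range(H - patch + 1):
        for j in range(W - patch + 1):
            p = y[i:i + patch, j:j + patch].ravel()
            mean = p.mean()
            rec = D @ omp(D, p - mean, L) + mean     # denoised patch (omp() as sketched earlier)
            acc[i:i + patch, j:j + patch] += rec.reshape(patch, patch)
            cnt[i:i + patch, j:j + patch] += 1.0
    return acc / cnt                                  # simple averaging of shifted patches
```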

Denoising Results (figure): the source image, the noisy image, the initial dictionary (overcomplete DCT, 64×256), the dictionary obtained after 10 iterations, and the result at 30.829 dB. The results of this algorithm, tested over a corpus of images and with various noise powers, compete favorably with the state of the art – the GSM + steerable-wavelets denoising algorithm of [Portilla, Strela, Wainwright, & Simoncelli ('03)] – giving ~1 dB better results on average.

Application 3: Compression. The problem: compressing photo-ID images. General-purpose methods (JPEG, JPEG2000) do not take the specific image family into account; by adapting to the image content (PCA/K-SVD), better results can be obtained. For these techniques to operate well, training dictionaries locally (per patch) using a training set of images is required. In PCA, only the (quantized) coefficients are stored, whereas the K-SVD requires storage of the atom indices as well. Geometric alignment of the image is very helpful and should be done.

Application 3: The Algorithm. On the training set (2500 images): detect main features and warp the images to a common reference (20 parameters); divide each image into disjoint 15-by-15 patches and, for each patch location, compute a mean and a dictionary; per patch, find the operating parameters (number of atoms L, quantization Q). On the test image: warp, remove the mean from each patch, sparse-code using L atoms, apply Q, and dewarp.

Compression Results (figure): comparison of the reconstructed images at 820 bytes per file.

Compression Results (figure): comparison of the reconstructed images at 550 bytes per file.

Compression Results (figure): comparison of the reconstructed images at 400 bytes per file.

Today We Have Discussed:
1. A Visit to Sparseland – motivating sparsity & overcompleteness.
2. Problem 1: Transforms & Regularizations – how & why should this work?
3. Problem 2: What About D? – the quest for the origin of signals.
4. Problem 3: Applications – image filling-in, denoising, separation, compression, ...

Summary: Sparsity and overcompleteness are important ideas that can be used to design better tools in signal/image processing – but there are difficulties in using them! This is ongoing work: deepening our theoretical understanding, speeding up pursuit methods, training the dictionary, demonstrating applications, ... The dream? Future transforms and regularizations should be data-driven, non-linear, overcomplete, and sparsity-promoting.

Thank You for Your Time. More information, including these slides, can be found on my web page: http://www.cs.technion.ac.il/~elad  THE END!