Sparsity Based Poisson Denoising and Inpainting


Sparsity Based Poisson Denoising and Inpainting Raja Giryes, Tel Aviv University Joint work with Michael Elad, Technion

Agenda Problem Definition – Poisson Denoising Existing Poisson Denoising Methods Poisson Greedy Algorithm Experimental results Poisson Inpainting

Denoising Problem: the original image x is unknown; y is a noisy measurement of x. The goal is to recover x from y.

Gaussian Denoising Problem: y = x + n, where n is zero-mean white Gaussian noise with variance σ², i.e., each element n_i ~ N(0, σ²).

Gaussian Noisy Measurements: another perspective on the Gaussian denoising problem is to view the measurements as Gaussian distributed with mean equal to the original signal, y_i ~ N(x_i, σ²). The variance σ² determines the noise power.

Poisson Noisy Measurements: the measurements are Poisson distributed, y_i ~ Poisson(x_i). Unlike the Gaussian case, Poisson noise is not additive, and its strength depends on the signal itself. The noise power is measured by the peak value, peak = max_i x_i.
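As an illustration of how the peak value controls the noise level, here is a minimal NumPy sketch (not from the slides; the function and variable names are mine) that scales a clean image to a given peak and draws Poisson counts:

```python
import numpy as np

def add_poisson_noise(x, peak, seed=0):
    """Scale the clean image x so its maximum equals `peak`, then draw
    Poisson counts y_i ~ Poisson(x_i).  Lower peak means stronger noise."""
    rng = np.random.default_rng(seed)
    scaled = x / x.max() * peak
    return rng.poisson(scaled), scaled

x = np.linspace(0.1, 1.0, 10_000).reshape(100, 100)   # toy "image"
y1, x1 = add_poisson_noise(x, peak=1)
y100, x100 = add_poisson_noise(x, peak=100)

def rel_err(y, xs):
    # relative error of the raw counts against the scaled clean image
    return np.linalg.norm(y - xs) / np.linalg.norm(xs)
```

The measurements are integer counts, and the relative error shrinks as the peak grows (per-pixel SNR scales like the square root of the intensity), which is why the low-peak regimes on the later slides are the hard ones.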

Poisson Denoising Problem: the noisy image distribution is P(y_i = k | x_i) = (x_i^k / k!) e^{-x_i}, where k is a non-negative integer. Large x_i gives a large SNR; small x_i gives a small SNR.

Poisson Denoising Applications: tomography (CT, PET, and SPECT), astrophysics, fluorescence microscopy, night vision, spectral imaging, etc.

Tomography: slices of a skeletal SPECT image [Takalo, Hytti and Ihalainen 2011]

Fluorescence Microscopy C. elegans embryo labeled with three fluorescent dyes [Luisier, Vonesch, Blu and Unser 2010]

Astrophysics XMM/Newton image of the Kepler SN1604 supernova [Starck, Donoho and Candès 2003]

Agenda Problem Definition – Poisson Denoising Existing Poisson Denoising Methods Poisson Greedy Algorithm Experimental results Poisson Inpainting

Denoising Methods: many denoising methods exist, but most of them assume a Gaussian noise model. We have two options: (1) use a transformation that makes the noise approximately Gaussian, or (2) work directly with the Poisson model.

The Anscombe Transform: the Anscombe transform converts Poisson-distributed data into approximately Gaussian-distributed data with unit variance by applying f(y) = 2·sqrt(y + 3/8) elementwise [Anscombe, 1948]. The approximation is valid only when peak > 4.
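A minimal sketch of the Anscombe pair (the simple algebraic inverse shown here is biased at low counts; exact unbiased inverses exist, e.g. Makitalo and Foi, cited later in these slides):

```python
import numpy as np

def anscombe(y):
    """Forward Anscombe transform: Poisson counts to approximately
    unit-variance Gaussian data (a good approximation for peak > 4)."""
    return 2.0 * np.sqrt(y + 3.0 / 8.0)

def inv_anscombe(z):
    """Simple algebraic inverse; biased at low counts."""
    return (z / 2.0) ** 2 - 3.0 / 8.0

rng = np.random.default_rng(0)
y = rng.poisson(20.0, size=200_000)   # intensity well above the peak > 4 limit
z = anscombe(y)                        # z.var() is close to 1 here
```

At low peaks the approximation breaks down, which is the motivation for working with the Poisson model directly on the next slide.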

Poisson Log-likelihood: we will work directly with the Poisson data. Maximizing the log-likelihood of the Poisson distribution yields the minimization problem min_x Σ_i (x_i − y_i log x_i). Reminder: in the Gaussian case we had min_x ||y − x||². Without regularization the minimizer is simply x = y, so a prior is needed.
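The data-fidelity term can be checked numerically. This small sketch (function and variable names are mine) evaluates the Poisson negative log-likelihood up to the constant log(y_i!) term and verifies that, without a prior, it is minimized at x = y:

```python
import numpy as np

def poisson_nll(x, y):
    """Poisson negative log-likelihood up to the constant log(y_i!):
    f(x) = sum_i (x_i - y_i * log(x_i)).  This is the data-fidelity
    term; the Gaussian analogue is ||y - x||^2."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x - y * np.log(np.maximum(x, 1e-12))))

# Setting the gradient to zero, 1 - y_i / x_i = 0, gives x = y: the
# unregularized minimizer just reproduces the noisy data.
y = np.array([3.0, 1.0, 4.0])
grid = np.linspace(0.5, 6.0, 1000)
vals = [poisson_nll(np.array([g, 1.0, 4.0]), y) for g in grid]
best = grid[int(np.argmin(vals))]   # coordinate-wise minimum near y_0 = 3
```

This makes the slide's point concrete: the likelihood alone hands back the noise, so the sparsity prior of the next slides has to supply the regularization.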

Sparsity Prior for Poisson Denoising (1): a regular sparsity prior, x = Dα, leads to a non-negatively constrained optimization problem (x must stay positive). Instead we use an exponential model, x = exp(Dα), which yields min_α Σ_i [exp(Dα)_i − y_i (Dα)_i] s.t. ||α||_0 ≤ k. Here D is a given dictionary and ||α||_0 counts the non-zero elements of α.

Sparsity Prior for Poisson Denoising (2) The minimization problem is likely to be NP-hard. Approximations are needed.

l1 Relaxation: one option is to relax the ||α||_0 constraint into an l1 penalty λ||α||_1, where λ is a relaxation parameter. This problem can be solved using the SPIRAL algorithm [Harmany et al., 2012].

Non-local PCA (NLPCA): a GMM (Gaussian Mixture Model) based method. Cluster the noisy patches into a small number of large groups, and for each cluster train a PCA subspace.
Non-local Sparse PCA (NLSPCA): uses l1 regularization on top of NLPCA.
Binning: aggregate nearby pixels to improve the SNR, denoise the down-sampled image, then interpolate the recovered image back to the initial size.
[Salmon, Harmany, Deledalle, Willett 2013]
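The binning step can be sketched as a 2x2 block sum. Since sums of independent Poisson variables are again Poisson, the binned image is a valid Poisson image with four times the peak at a quarter of the resolution (sketch and names are mine):

```python
import numpy as np

def bin2(y):
    """2x2 binning: sum disjoint 2x2 blocks of the count image y.
    Sums of independent Poissons are Poisson, so the binned image is
    Poisson with 4x the mean (peak) at a quarter of the resolution."""
    h, w = y.shape
    return y[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

rng = np.random.default_rng(0)
clean = np.full((64, 64), 0.5)   # flat image with peak 0.5
y = rng.poisson(clean)
yb = bin2(y)                      # 32x32 image, mean count roughly 4x larger
```

The trade-off is the loss of resolution, which the interpolation step at the end can only partially undo.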

Agenda Problem Definition – Poisson Denoising Existing Poisson Denoising Methods Poisson Greedy Algorithm Experimental results Poisson Inpainting Novel Part

Exponential Sparsity Prior: the classical approaches try to recover the representation z; we focus on recovering x itself. [Salmon et al. 2012, Giryes and Elad 2012]

Poisson Greedy Algorithm - Summary: Divide the image into a set of overlapping patches. Cluster the noisy patches (using Gaussian filtering) into a large number of small groups. Each group of patches is assumed to share the same non-zero locations (support) in its representations. A global dictionary is used for all groups of patches. Given the reconstructed patches, we form the final image by averaging.

Dictionary Learning: joint learning of the dictionary D and the representations with a fixed support [Smith, Elad 2013]. After we have the representations of all the patches and their supports, we minimize the objective with respect to D. A global initial dictionary is used for all images, trained on a separate example image.

Our Algorithm vs. NLPCA
Poisson Greedy Algorithm: large number of clusters; small cluster size; clustering using Gaussian filtering; global dictionary for all patches; dictionary-learning based approach.
NLPCA: small number of clusters; large cluster size; clustering using k-means; local dictionary for each cluster; GMM based approach.

Algorithm Summary (pipeline diagram): extracting overlapping patches, Gaussian filtering, patch grouping, applying the Poisson greedy algorithm for each group, dictionary learning, and averaging the patches.

Poisson Greedy Algorithm - Sparse Coding
Input: group of noisy patches.
Initialization: empty support, t = 0.
While t < k:
  t = t + 1
  Find a new support element and the representations.
  Update the support.
Form the patches estimate.
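A simplified sketch of the group-wise greedy idea, assuming a least-squares residual in place of the Poisson log-likelihood for brevity (the actual algorithm ranks candidate atoms by likelihood gain under the exponential model); all names here are illustrative:

```python
import numpy as np

def group_greedy(Y, D, k):
    """Greedy sparse coding for a group of patches Y (one patch per
    column) that share a single support, in the spirit of the slide.
    This sketch uses a least-squares residual for simplicity."""
    support, residual = [], Y.copy()
    for _ in range(k):
        # Score each atom by its correlation with the group residual
        # and pick the best one not already in the support.
        scores = np.linalg.norm(D.T @ residual, axis=1)
        scores[support] = -np.inf
        support.append(int(np.argmax(scores)))
        Ds = D[:, support]
        # Joint representations over the current shared support.
        A, *_ = np.linalg.lstsq(Ds, Y, rcond=None)
        residual = Y - Ds @ A
    return support, Ds @ A

# Recover a 2-sparse group: 5 patches built from atoms 3 and 17.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 30))
D /= np.linalg.norm(D, axis=0)
Y = D[:, [3, 17]] @ rng.standard_normal((2, 5))
support, Y_hat = group_greedy(Y, D, k=2)
```

Pooling the scores across the whole group is what makes the shared-support assumption pay off: atoms that help every patch in the cluster win over atoms that only fit the noise of a single patch.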

Boot-strapped Stopping Criterion: ideally we want to select a different number of non-zeros for each patch. We want to add elements to the support until the error with respect to the original patch (in the original image) stops decreasing. Since we do not have access to the original image, we use the patches of the estimated image from the previous iteration instead.
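The stopping rule itself reduces to scanning an error curve. A tiny illustrative sketch (the errors here would be measured against the previous iteration's estimate, as the slide describes):

```python
def boot_stop(errors):
    """Sketch of the boot-strapped stopping rule: errors[t] is the
    patch error at cardinality t + 1, measured against the previous
    iteration's estimate (a stand-in for the unknown original image).
    Return the cardinality at which the error stops decreasing."""
    k = 1
    while k < len(errors) and errors[k] < errors[k - 1]:
        k += 1
    return k
```

For example, boot_stop([5.0, 3.0, 2.5, 2.6, 2.4]) returns 3: the error rises when going from three to four non-zeros, so the later dip is never reached.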

Agenda Problem Definition – Poisson Denoising Existing Poisson Denoising Methods Poisson Greedy Algorithm Denoising Results Poisson Inpainting Novel Part

Experiment - Parameter Setting: patches of size 20 by 20; patches clustered into groups of size 50; initial cardinality of the patches is k = 2; 5 dictionary-learning iterations; the process is repeated once with re-clustering based on the recovered image.

Noisy Image Max y value = 7 Peak = 1

Poisson Greedy Algorithm: dictionary-learned atoms. Results at Peak = 1: Ours 22.59 dB, NLSPCA 20.37 dB, BM3Dbin 19.41 dB [Giryes and Elad 2013].

Poisson Greedy Algorithm, results at Peak = 1: Ours 22.59 dB, NLSPCA 20.37 dB, BM3Dbin 19.41 dB [Salmon et al. 2013].

Poisson Greedy Algorithm, results at Peak = 1: Ours 22.59 dB, NLSPCA 20.37 dB, BM3Dbin 19.41 dB [Makitalo and Foi 2011].

Original Image

Noisy Image Max y value = 3 Peak = 0.2

Poisson Greedy Algorithm, results at Peak = 0.2: Ours 24.16 dB, NLSPCA 22.98 dB, BM3Dbin 23.16 dB [Giryes and Elad 2013].

Poisson Greedy Algorithm, results at Peak = 0.2: Ours 24.16 dB, NLSPCA 22.98 dB, BM3Dbin 23.16 dB [Salmon et al. 2013].

Poisson Greedy Algorithm, results at Peak = 0.2: Ours 24.16 dB, NLSPCA 22.98 dB, BM3Dbin 23.16 dB [Makitalo and Foi 2011].

Original Image

Noisy Image Max y value = 8 Peak = 2

Poisson Greedy Algorithm, results at Peak = 2: Ours 24.76 dB, NLSPCA 23.23 dB, BM3Dbin 24.23 dB [Giryes and Elad 2013].

Poisson Greedy Algorithm, results at Peak = 2: Ours 24.76 dB, NLSPCA 23.23 dB, BM3Dbin 24.23 dB [Salmon et al. 2013].

Poisson Greedy Algorithm, results at Peak = 2: Ours 24.76 dB, NLSPCA 23.23 dB, BM3Dbin 24.23 dB [Makitalo and Foi 2011].

Original Image

Recovery Results: 8 test images; 6 peak levels (0.1, 0.2, 0.5, 1, 2, 4). Best average recovery error for 5 of the 6 peak values; second best for peak = 1 (a difference of 0.02 dB). On average 0.288 dB better than the second-best method.

Agenda Problem Definition – Poisson Denoising Existing Poisson Denoising Methods Poisson Greedy Algorithm Denoising Results Poisson Inpainting Novel Part

The Poisson Inpainting Problem: some of the pixels in y are occluded. The mask defines the missing and given pixels in the measured noisy image.

Poisson Inpainting Objective: the Poisson inpainting minimization problem is the Poisson denoising objective restricted to the observed (unmasked) pixels. We approximate this problem using a greedy algorithm, as before.

Noise Estimation for Inpainting: given a recovery x̂, we replace each unknown pixel in y with a noisy pixel generated from Poisson(x̂). This yields a fully populated noisy image to which we can apply the regular dictionary-update steps.
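This fill-in step is straightforward to sketch (function and variable names are mine): keep the measured pixels and draw the occluded ones from Poisson(x̂):

```python
import numpy as np

def fill_missing(y, mask, x_hat, seed=0):
    """Keep the measured pixels of y (mask == True) and replace each
    occluded pixel with a Poisson draw from the current recovery
    x_hat, producing a fully populated noisy image."""
    rng = np.random.default_rng(seed)
    filled = y.copy()
    missing = ~mask
    filled[missing] = rng.poisson(x_hat[missing])
    return filled

rng = np.random.default_rng(1)
x_hat = np.full((32, 32), 2.0)            # current recovery, peak = 2
y = rng.poisson(x_hat)                    # measured counts
mask = rng.random((32, 32)) > 0.4         # roughly 60% of pixels observed
filled = fill_missing(y, mask, x_hat)
```

Drawing from Poisson(x̂), rather than pasting x̂ itself, keeps the noise statistics of the filled pixels consistent with the observed ones, so the unchanged dictionary-update machinery still applies.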

Inpainting Results: 23.86 dB. Peak = 1, 20% missing pixels.

Inpainting Results: 22.83 dB. Peak = 1, 40% missing pixels.

Inpainting Results: 21.02 dB. Peak = 1, 60% missing pixels.

Inpainting Results Average over four different test images

Inpainting Results: 24.34 dB, Peak = 1.

Inpainting Results: 23.58 dB, Peak = 2.

Inpainting Results: 22.72 dB, Peak = 2.

Inpainting Results: 19.76 dB, Peak = 1.

Conclusion: Poisson-based denoising; sparse representation modeling for Poisson noise; a greedy Poisson algorithm; state-of-the-art denoising results; a Poisson inpainting algorithm.

Questions?