A Weighted Average of Sparse Representations is Better than the Sparsest One Alone. Michael Elad and Irad Yavneh, SIAM Conference on Imaging Science '08.

A Weighted Average of Sparse Representations is Better than the Sparsest One Alone. Michael Elad and Irad Yavneh, SIAM Conference on Imaging Science '08. Presented by Dehong Liu, ECE, Duke University, July 24, 2009.

Outline
– Motivation
– A mixture of sparse representations
– Experiments and results
– Analysis
– Conclusion

Motivation
– Noise removal problem: y = x + v, where y is the measured signal, x is the clean signal, and v is zero-mean i.i.d. Gaussian noise.
– Sparse representation: x = Dα, where D ∈ R^(n×m) with n < m, and α is a sparse vector.
– Compressive sensing problem
– Orthogonal Matching Pursuit (OMP)
– Sparsest representation
– Question: "Does this mean that other competitive and slightly inferior sparse representations are meaningless?"

A mixture of sparse representations
– How to generate a set of sparse representations? Randomized OMP.
– How to fuse these sparse representations? A plain averaging.

OMP algorithm
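
The algorithm figure on this slide does not survive in the transcript. As a stand-in, here is a minimal NumPy sketch of standard OMP as used in the talk: at each step pick the atom most correlated with the residual, re-fit the coefficients by least squares on the selected support, and stop once the residual energy falls below the noise threshold T. The function name and stopping rule are illustrative, not the authors' exact implementation.

import numpy as np

def omp(D, y, noise_threshold, max_atoms=None):
    """Orthogonal Matching Pursuit (illustrative sketch).
    D: (n, m) dictionary with unit-norm columns; y: (n,) noisy measurement."""
    n, m = D.shape
    max_atoms = max_atoms or n
    residual = y.copy()
    support = []
    alpha = np.zeros(m)
    while residual @ residual > noise_threshold and len(support) < max_atoms:
        correlations = D.T @ residual
        # deterministically pick the atom most correlated with the residual
        i = int(np.argmax(np.abs(correlations)))
        if i in support:        # safety guard against degenerate re-selection
            break
        support.append(i)
        # re-fit all chosen coefficients by least squares on the support
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        alpha = np.zeros(m)
        alpha[support] = coeffs
        residual = y - D @ alpha
    return alpha, support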

Randomized OMP
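
The pseudocode figure for this slide is likewise missing from the transcript. The sketch below conveys the idea of the randomized variant discussed in the talk: instead of always taking the single best-matching atom, each iteration draws the next atom at random with probability that grows with its squared correlation to the residual, so repeated runs produce different, competitive sparse representations. The weighting constant c is a simplification; in the paper the exponent involves the signal and noise variances.

import numpy as np

def rand_omp(D, y, noise_threshold, c=1.0, max_atoms=None, rng=None):
    """Randomized OMP (illustrative sketch): each run samples a different
    support, favoring atoms better correlated with the residual."""
    rng = np.random.default_rng() if rng is None else rng
    n, m = D.shape
    max_atoms = max_atoms or n
    residual = y.copy()
    support = []
    alpha = np.zeros(m)
    while residual @ residual > noise_threshold and len(support) < max_atoms:
        scores = c * (D.T @ residual) ** 2
        weights = np.exp(scores - scores.max())   # numerically stable weights
        weights[support] = 0.0                    # never re-pick a chosen atom
        support.append(int(rng.choice(m, p=weights / weights.sum())))
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        alpha = np.zeros(m)
        alpha[support] = coeffs
        residual = y - D @ alpha
    return alpha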

Experiments and results
Model: y = x + v = Dα + v
– D: a 100×200 random dictionary with entries drawn from N(0,1) and columns then normalized.
– α: a random representation with k = 10 non-zeros chosen at random, with values drawn from N(0,1).
– v: white Gaussian noise with entries drawn from N(0,1).
– Noise threshold in the OMP algorithm: T = 100 (??).
– Run OMP once, and RandOMP 1000 times (a sketch of this setup follows below).
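
To make the protocol concrete, here is a hedged reconstruction of the experiment using the omp and rand_omp sketches above: a 100×200 Gaussian dictionary with normalized columns, a representation with k = 10 random non-zeros, unit-variance white noise, one OMP run, and many RandOMP runs whose outputs are plainly averaged. The threshold follows the slide's value, which the presenter already flags as uncertain.

import numpy as np

rng = np.random.default_rng(0)
n, m, k = 100, 200, 10
T = 100                 # noise threshold quoted on the slide (marked "(??)" there)
num_rand_runs = 1000

D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)          # N(0,1) entries, then unit-norm columns

alpha_true = np.zeros(m)
alpha_true[rng.choice(m, k, replace=False)] = rng.standard_normal(k)

x = D @ alpha_true                      # clean signal
y = x + rng.standard_normal(n)          # add white Gaussian noise, sigma = 1

alpha_omp, _ = omp(D, y, T)             # the single sparsest-style estimate
alpha_avg = np.mean([rand_omp(D, y, T, rng=rng) for _ in range(num_rand_runs)],
                    axis=0)             # plain average of RandOMP representations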

Observations

Sparse vector reconstruction: the average representation over 1000 RandOMP representations is not sparse at all.

Denoising factor, based on 1000 experiments. RandOMP is run 100 times within each experiment.
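
The defining formula did not survive the transcript, so the following continues the sketch above under an assumption: the denoising factor is taken as the output error energy divided by the input noise energy, so values below 1 mean the estimate improves on the raw measurement.

# Assumed definition (the slide's own formula is missing from the transcript):
# denoising factor = ||x_hat - x||^2 / ||y - x||^2, smaller is better.
def denoising_factor(x_hat, x, y):
    return np.sum((x_hat - x) ** 2) / np.sum((y - x) ** 2)

print(denoising_factor(D @ alpha_omp, x, y))   # single OMP estimate
print(denoising_factor(D @ alpha_avg, x, y))   # averaged RandOMP estimate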

Performance with different parameters

Analysis: RandOMP is an approximation of the Minimum Mean Squared Error (MMSE) estimate.
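
The quoted derivation is missing from the transcript. As a hedged reconstruction of the standard Bayesian argument the talk relies on (not necessarily the slide's exact notation): with a prior over the support S of the representation, the MMSE estimate is by definition a posterior-weighted average of the per-support estimates,

\hat{x}_{\mathrm{MMSE}} = \mathbb{E}[x \mid y] = \sum_{S} P(S \mid y)\, \mathbb{E}[x \mid y, S],

so no single sparse representation attains it in general. RandOMP approximates this sum by drawing supports with probabilities close to P(S | y) and averaging the resulting estimates.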

Comparison
The above results correspond to a 20×30 dictionary. Parameters: true support = 3, σ_x = 1, averaged over 1000 experiments. The plot shows the relative mean-squared error of:
1. Empirical Oracle
2. Theoretical Oracle
3. Empirical MMSE
4. Theoretical MMSE
5. Empirical MAP
6. Theoretical MAP
7. OMP
8. RandOMP

Conclusion: the paper shows that averaging several sparse representations of a signal leads to better denoising, as the average approximates the MMSE estimator.