Learning to Sense Sparse Signals: Simultaneous Sensing Matrix and Sparsifying Dictionary Optimization Julio Martin Duarte-Carvajalino, and Guillermo Sapiro.


Learning to Sense Sparse Signals: Simultaneous Sensing Matrix and Sparsifying Dictionary Optimization Julio Martin Duarte-Carvajalino, and Guillermo Sapiro University of Minnesota IEEE Transactions on Image Processing, Vol. 18, No. 7, July 2009 Presented by Haojun Chen

Outline Introduction Sensing Matrix Learning KSVD Algorithm Coupled-KSVD Experiment Results Conclusion

Introduction
Compressive sensing (CS) rests on two fundamental principles: sparsity and incoherent sampling. A signal x with a sparse representation x = Dθ, where θ (N x 1) has S non-zero entries, is measured as y = Φx = ΦDθ, with y (m x 1), Φ (m x N), and D (N x N). The Gram matrix is G = D~^T D~, where D~ = ΦD with all columns normalized; G should be as close to the identity as possible. Image source: www.usna.edu/Users/weapsys/avramov/Compressed%20sensing%20tutorial/cs1v4.ppt
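The identity requirement on the Gram matrix can be checked numerically. A minimal NumPy sketch (with randomly drawn Φ and D as stand-ins for learned ones) that forms the normalized effective dictionary ΦD and measures how far its Gram matrix is from the identity:

```python
import numpy as np

rng = np.random.default_rng(0)
m, N = 16, 64                          # measurements, signal dimension

Phi = rng.standard_normal((m, N))      # sensing matrix (random stand-in)
D = rng.standard_normal((N, N))        # square sparsifying dictionary

De = Phi @ D                           # effective dictionary Phi @ D
De = De / np.linalg.norm(De, axis=0)   # normalize all columns

G = De.T @ De                          # Gram matrix
mu = np.abs(G - np.eye(N)).max()       # largest off-diagonal magnitude
print(f"max off-diagonal Gram entry: {mu:.3f}")  # smaller is better; ideally G ~ I
```

Because the columns are unit-norm, every off-diagonal entry is a correlation in [-1, 1]; a random Φ typically leaves substantial off-diagonal mass, which is what the learned sensing matrix below reduces.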

Sensing Matrix Learning
Assume the dictionary D is known; the goal is to find the sensing matrix Φ such that D^T Φ^T Φ D ≈ I. Let D D^T = V Λ V^T be the eigen-decomposition of D D^T. Define Γ = Φ V; the objective is then to compute Γ to minimize ||Λ Γ^T Γ Λ − Λ||_F^2. Let λ_1 ≥ λ_2 ≥ … ≥ λ_n be the eigenvalues in Λ. Solution: the rows of Γ pick out the m largest eigenvalues, γ_i = λ_i^(−1/2) e_i^T for i = 1, …, m.

Sensing Matrix Learning
Replacing Γ back in terms of Φ (rows of Φ): once we obtain Γ, set Φ = Γ V^T. Algorithm summary: compute the eigen-decomposition of D D^T, build Γ from the m largest eigenvalues, and form Φ = Γ V^T.
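The algorithm summary above can be sketched in code. This is a minimal NumPy sketch under the assumption that the minimizer takes the closed form Φ = Γ V^T with Γ scaling the top-m eigendirections of D D^T by λ_i^(−1/2), which makes Φ D D^T Φ^T = I_m; the function name `learn_sensing_matrix` is ours, not the paper's:

```python
import numpy as np

def learn_sensing_matrix(D, m):
    """Build an m-row sensing matrix Phi for a dictionary D (n x K) so that
    the Gram matrix of Phi @ D is close to the identity.

    Sketch: diagonalize D D^T = V Lam V^T, keep the m largest eigenvalues,
    and scale those eigendirections by 1/sqrt(lambda_i)."""
    n = D.shape[0]
    lam, V = np.linalg.eigh(D @ D.T)        # eigenvalues in ascending order
    idx = np.argsort(lam)[::-1][:m]         # indices of the m largest eigenvalues
    Gamma = np.zeros((m, n))
    Gamma[np.arange(m), idx] = 1.0 / np.sqrt(lam[idx])
    return Gamma @ V.T                      # Phi = Gamma V^T
```

By construction, Φ(DD^T)Φ^T = Γ Λ Γ^T = I_m, i.e. the projected dictionary energy is whitened across the measurement directions.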

KSVD Algorithm
The objective of the KSVD algorithm is to solve, for a given sparsity level S, min over D, Θ of ||X − D Θ||_F^2 subject to ||θ_i||_0 ≤ S for every column θ_i. There are two stages in the KSVD algorithm. Sparse Coding Stage: for D fixed, compute Θ using MP or BP. Dictionary Update Stage: update one atom at a time. Let θ^j denote the j-th row of Θ and E_j = X − Σ_{k≠j} d_k θ^k the representation error with atom j removed.
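The sparse coding stage can use any pursuit algorithm; a minimal sketch of Orthogonal Matching Pursuit (OMP, the pursuit used later in these slides) for a single signal, assuming unit-norm dictionary columns:

```python
import numpy as np

def omp(D, x, S):
    """Orthogonal Matching Pursuit: greedily select up to S atoms (columns of
    D, assumed unit-norm) and re-solve least squares on the chosen support."""
    support = []
    residual = x.copy()
    theta = np.zeros(D.shape[1])
    for _ in range(S):
        j = int(np.argmax(np.abs(D.T @ residual)))   # atom most correlated with residual
        if j not in support:
            support.append(j)
        # orthogonal projection: refit all selected coefficients jointly
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    theta[support] = coef
    return theta
```

For an orthonormal dictionary, OMP recovers an S-sparse representation exactly; for overcomplete dictionaries it is a greedy approximation to the L0 problem.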

KSVD Algorithm
Define the group of examples that use the atom d_j: ω_j = {i : θ^j(i) ≠ 0}. Let E^R_j be E_j restricted to the columns in ω_j; then updating d_j and its coefficients amounts to the best rank-1 approximation of E^R_j. Let E^R_j = U Δ V^T be the SVD of E^R_j. Solution: d_j = u_1 and θ^j_R = δ_1 v_1^T, the largest singular value and its singular vectors.

KSVD Algorithm
The KSVD algorithm consists of the following key steps:
Initialize D^(0).
Repeat until convergence:
Sparse Coding Stage: for D fixed, solve for each training patch using OMP to obtain Θ.
Dictionary Update Stage: for j = 1 to K: define the group of examples that use this atom, ω_j = {i : 1 ≤ i ≤ P, θ^j(i) ≠ 0}, where P is the number of training square patches; let E^R_j be the restriction to ω_j of E_j = X − Σ_{k≠j} d_k θ^k; obtain the largest singular value δ_1 of E^R_j and the corresponding singular vectors u_1, v_1; update using d_j = u_1 and θ^j_R = δ_1 v_1^T.
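The key steps above can be sketched end-to-end. A compact, self-contained NumPy sketch with our own initialization choice (atoms drawn at random from the training data) and a minimal internal OMP; this is an illustration of the alternation, not the authors' implementation:

```python
import numpy as np

def _omp(D, x, S):
    # minimal OMP for the sparse coding stage (unit-norm columns assumed)
    support, residual = [], x.copy()
    for _ in range(S):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    theta = np.zeros(D.shape[1])
    theta[support] = coef
    return theta

def ksvd(X, K, S, n_iter=10, seed=0):
    """Sketch of K-SVD: alternate OMP sparse coding with per-atom rank-1
    SVD updates of the dictionary and the matching coefficients."""
    rng = np.random.default_rng(seed)
    n, P = X.shape
    D = X[:, rng.choice(P, K, replace=False)].astype(float)  # init from data
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_iter):
        # sparse coding stage
        Theta = np.column_stack([_omp(D, X[:, i], S) for i in range(P)])
        # dictionary update stage
        for j in range(K):
            omega = np.flatnonzero(Theta[j])      # examples using atom j
            if omega.size == 0:
                continue                          # atom unused this round
            # error without atom j, restricted to the examples that use it
            E = X[:, omega] - D @ Theta[:, omega] + np.outer(D[:, j], Theta[j, omega])
            U, sing, Vt = np.linalg.svd(E, full_matrices=False)
            D[:, j] = U[:, 0]                     # new atom: top left singular vector
            Theta[j, omega] = sing[0] * Vt[0]     # matching coefficients
    return D, Theta
```

Note that the coefficient row is updated together with the atom, which keeps the support fixed and guarantees the residual does not increase at each atom update.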

Coupled-KSVD
To simultaneously train a dictionary D and the projection matrix Φ, the following optimization problem is considered: min over Φ, D, Θ of α||X − D Θ||_F^2 + ||Y − Φ D Θ||_F^2 subject to the sparsity constraint, where Y collects the (possibly noisy) projections of the training data and α balances the two terms. Define X~ = [√α X; Y] and D~ = [√α D; Φ D]; then the above equation can be rewritten as min ||X~ − D~ Θ||_F^2. The solution is obtained from KSVD applied to the stacked data X~ and stacked dictionary D~.

Coupled-KSVD
The Coupled-KSVD algorithm consists of the following key steps:
Initialize D^(0).
Repeat until convergence:
For D fixed, compute Φ using the algorithm in sensing matrix learning.
For Φ and D fixed, solve for each training patch using OMP on the stacked system to obtain Θ.
For j = 1 to K: define the group of examples that use this atom, ω_j = {i : 1 ≤ i ≤ P, θ^j(i) ≠ 0}, where P is the number of training square patches; let E~^R_j be the restriction to ω_j of E~_j = X~ − Σ_{k≠j} d~_k θ^k; obtain the largest singular value of E~^R_j and the corresponding singular vectors; update the stacked atom d~_j and the coefficients θ^j_R from them, recovering the dictionary atom d_j from d~_j.
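One alternation of the coupled scheme can be sketched as follows: update Φ for the current dictionary (using the whitening-style construction from the sensing-matrix slides, itself an assumption on the closed form), then sparse-code against the stacked system [√α X; ΦX] ≈ [√α D; ΦD] Θ. The helper names are ours, and the dictionary update stage is omitted for brevity:

```python
import numpy as np

def sensing_matrix(D, m):
    # whitening-style Phi from the top-m eigendirections of D D^T (sketch)
    lam, V = np.linalg.eigh(D @ D.T)
    idx = np.argsort(lam)[::-1][:m]
    Gamma = np.zeros((m, D.shape[0]))
    Gamma[np.arange(m), idx] = 1.0 / np.sqrt(lam[idx])
    return Gamma @ V.T

def coupled_step(X, D, m, alpha, S):
    """One coupled alternation (sketch): refresh Phi for the current D,
    then OMP-code the stacked system [sqrt(a)X; Phi X] ~ [sqrt(a)D; Phi D] Theta."""
    Phi = sensing_matrix(D, m)
    Xs = np.vstack([np.sqrt(alpha) * X, Phi @ X])   # stacked data X~
    Ds = np.vstack([np.sqrt(alpha) * D, Phi @ D])   # stacked dictionary D~
    Dn = Ds / np.linalg.norm(Ds, axis=0)            # normalized copy for atom selection
    Theta = np.zeros((D.shape[1], X.shape[1]))
    for i in range(X.shape[1]):                     # OMP per training column
        x, support, r = Xs[:, i], [], Xs[:, i].copy()
        for _ in range(S):
            j = int(np.argmax(np.abs(Dn.T @ r)))
            if j not in support:
                support.append(j)
            coef, *_ = np.linalg.lstsq(Ds[:, support], x, rcond=None)
            r = x - Ds[:, support] @ coef
        Theta[support, i] = coef
    return Phi, Theta
```

A full coupled-KSVD iteration would follow this with the rank-1 SVD atom updates on the stacked error, as in the step list above.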

Experiment Strategies Uncoupled random (UR) Uncoupled learning (UL) Coupled random (CR) Coupled learning (CL)

Experiment Results
Training data: 6600 8 x 8 patches extracted at random from 440 images. Testing data: 120000 8 x 8 patches from 50 images. Comparison of the average MSE of retrieval for the testing patches at different noise levels and α (K = 256, overcomplete; K = 64, complete).

Experiment Results
Comparison of the retrieval MSE ratios CL/CR and CL/UL at different noise levels and α (K = 256, overcomplete; K = 64, complete).

Experiment Results
Best values of α that produced the minimum retrieval MSE and, at the same time, the best CL/CR and CL/UL ratios, for a representative noise level of 5%.

Experiment Results
Testing image consisting of non-overlapping 8 x 8 patches reconstructed from their noisy projections (5% noise level).

Experiment Results
Distribution of the off-diagonal elements of the Gram matrix for each of the four strategies.

Conclusions
A framework for learning an optimal sensing matrix for a given sparsifying dictionary was introduced. A novel approach for simultaneously learning the sensing matrix and the sparsifying dictionary was proposed.