Recovery of Clustered Sparse Signals from Compressive Measurements

Presentation transcript:

Recovery of Clustered Sparse Signals from Compressive Measurements Volkan Cevher volkan@rice.edu Piotr Indyk Chinmay Hegde Richard Baraniuk

The Digital Universe
Size: ~300 billion gigabytes generated in 2007
  digital bits > stars in the universe
  growing by a factor of 10 every 5 years
  > Avogadro's number (6.02x10^23) in 15 years
Growth fueled by multimedia / multisensor data: audio, images, video, surveillance cameras, sensor nets, ...
In 2007, digital data generated > total storage; by 2011, 1/2 of the digital universe will have no home
[Source: IDC Whitepaper "The Diverse and Exploding Digital Universe", March 2008]

Challenges Acquisition Compression Processing

Approaches
Do nothing / Ignore: be content with where we are...
  generalizes well; robust

Approaches Finite Rate of Innovation Sketching / Streaming Compressive Sensing [Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]

Approaches: SPARSITY
Finite Rate of Innovation / Sketching / Streaming / Compressive Sensing
[Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]

Agenda A short review of compressive sensing Beyond sparse models Potential gains via structured sparsity Block-sparse model (K,C)-sparse model Conclusions

Compressive Sensing 101
Goal: Recover a sparse or compressible signal from measurements
Problem: Random projection not full rank
Solution: Exploit the sparsity/compressibility geometry of the acquired signal

Compressive Sensing 101
Goal: Recover a sparse or compressible signal from measurements
Problem: Random projection not full rank, but satisfies the Restricted Isometry Property (RIP) (e.g., iid Gaussian, iid Bernoulli, ...)
Solution: Exploit the sparsity/compressibility geometry of the acquired signal
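The measurement setup can be made concrete with a minimal numpy sketch (the dimensions and scaling here are illustrative assumptions, not values from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 8            # ambient dimension, measurements, sparsity

# K-sparse signal: K nonzero coordinates at random locations
x = np.zeros(N)
support = rng.choice(N, K, replace=False)
x[support] = rng.standard_normal(K)

# iid Gaussian measurement matrix, with rows scaled so that Phi
# approximately preserves the norm of sparse vectors (as the RIP requires)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

y = Phi @ x                      # M < N measurements: Phi is not full column rank
```

Because M < N, Phi has a nontrivial null space; recovery is only possible by exploiting the sparsity of x.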

Compressive Sensing 101
Goal: Recover a sparse or compressible signal from measurements
Problem: Random projection not full rank
Solution: Exploit the model geometry of the acquired signal

Basic Signal Models
Sparse signal: only K out of N coordinates nonzero
Model: union of K-dimensional subspaces aligned w/ coordinate axes
[plot: coefficient magnitude vs. sorted index]

Basic Signal Models
Sparse signal: only K out of N coordinates nonzero; model: union of K-dimensional subspaces
Compressible signal: sorted coordinates decay rapidly to zero
  well-approximated by a K-sparse signal (simply by thresholding)
[plot: power-law decay of coefficient magnitude vs. sorted index]
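The best K-term approximation by thresholding can be sketched directly (the power-law exponent and signal length are illustrative assumptions):

```python
import numpy as np

def best_k_term(x, K):
    """Best K-term approximation: keep the K largest-magnitude entries."""
    xk = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-K:]
    xk[idx] = x[idx]
    return xk

# Compressible signal: sorted coefficient magnitudes decay like i^(-p)
rng = np.random.default_rng(1)
N, p = 256, 1.5
mags = np.arange(1, N + 1, dtype=float) ** (-p)
x = rng.permutation(mags * rng.choice([-1, 1], N))

for K in (8, 16, 32):
    err = np.linalg.norm(x - best_k_term(x, K))
    print(K, err)               # approximation error shrinks rapidly as K grows
```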

Recovery Algorithms
Goal: given y = Phi x, recover x
Convex optimization formulations: basis pursuit, Dantzig selector, Lasso, ...
Greedy algorithms: iterative thresholding (IT), compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP)
At their core: iterative sparse approximation

Performance of Recovery
Using the above methods (IT, CoSaMP, SP):
Sparse signals -- noise-free measurements: exact recovery; noisy measurements: stable recovery
Compressible signals -- recovery as good as the best K-sparse approximation
CS recovery error bounded by the signal's K-term approximation error plus noise
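The garbled error bound on this slide has the generic form of the standard CoSaMP/SP guarantees (reconstructed here with generic constants; the talk's exact constants are not recoverable):

```latex
\| x - \hat{x} \|_2 \;\le\; C_1 \, \| x - x_K \|_2
  \;+\; \frac{C_2}{\sqrt{K}} \, \| x - x_K \|_1
  \;+\; C_3 \, \| n \|_2
```

where x_K is the best K-term approximation of x and n is the measurement noise. For exactly K-sparse signals the first two terms vanish, giving exact recovery in the noise-free case.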

Sparse Signals
pixels: background subtracted images
wavelets: natural images
Gabor atoms: chirps/tones

Sparsity as a Model
Sparse/compressible signal model captures simplistic primary structure
[figure: sparse image]

Beyond Sparse Models
Sparse/compressible signal model captures simplistic primary structure
Modern compression/processing algorithms capture richer secondary coefficient structure
pixels: background subtracted images
wavelets: natural images
Gabor atoms: chirps/tones

Sparse Signals Defn: K-sparse signals comprise a particular set of K-dim canonical subspaces

Model-Sparse Signals Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces

Model-Sparse Signals Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces Structured subspaces <> fewer subspaces <> relaxed RIP <> fewer measurements [Blumensath and Davies]
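The chain "fewer subspaces <> relaxed RIP <> fewer measurements" can be quantified with the standard union-of-subspaces embedding count (a sketch consistent with the cited [Blumensath and Davies] result, not the slide's exact statement):

```latex
M = O\!\left( K + \log L \right)
```

measurements suffice for a stable embedding of a union of L subspaces of dimension K. For generic K-sparse signals L = \binom{N}{K}, which gives the usual M = O(K log(N/K)); a structured model with far fewer subspaces needs correspondingly fewer measurements.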

Model-Sparse Signals Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces Structured subspaces <> increased signal discrimination <> improved recovery perf. <> faster recovery

Block Sparse Signals

Block-Sparsity Motivation
Signal model: sensor networks
  intra-sensor: sparsity
  inter-sensor: common sparse supports
Union of subspaces when signals are concatenated
[figure: sparse spectra, frequency vs. sensors]
*Individual block sizes may vary...
[Tropp, Eldar, Mishali; Stojnic, Parvaresh, Hassibi; Baraniuk, VC, Duarte, Hegde]

Block-Sparsity Recovery
Sampling bound in terms of B = # of blocks
Problem-specific solutions
  Mixed-norm solutions: l2 within block, l1 across blocks [Eldar, Mishali (provable); Stojnic, Parvaresh, Hassibi]
  Greedy solutions: simultaneous orthogonal matching pursuit [Tropp]
Model-based recovery framework [Baraniuk, VC, Duarte, Hegde]

Standard CS Recovery: Iterative Thresholding [Nowak, Figueiredo; Kingsbury, Reeves; Daubechies, Defrise, De Mol; Blumensath, Davies; ...]
  update signal estimate
  prune signal estimate (best K-term approx)
  update residual
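The update / prune / residual loop can be sketched as plain iterative hard thresholding (a minimal illustration with a conservative step size; not the tuned IT/CoSaMP/SP implementations cited, and the problem sizes are illustrative assumptions):

```python
import numpy as np

def hard_threshold(v, K):
    """Best K-term approximation: keep the K largest-magnitude entries."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-K:]
    out[idx] = v[idx]
    return out

def iht(y, Phi, K, iters=500):
    """Iterative hard thresholding: gradient step on ||y - Phi x||^2,
    then prune to the best K-term approximation."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2   # conservative step size
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        r = y - Phi @ x                        # update residual
        x = x + step * Phi.T @ r               # update signal estimate
        x = hard_threshold(x, K)               # prune (best K-term approx)
    return x

# noise-free demo: recovery of a K-sparse signal
rng = np.random.default_rng(3)
N, M, K = 256, 128, 8
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_hat = iht(Phi @ x_true, Phi, K)
```

The prune step is the only place sparsity enters; swapping it for a structured approximation gives the model-based variants discussed next.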

Model-based CS Recovery: Iterative Model Thresholding [VC, Duarte, Hegde, Baraniuk; Baraniuk, VC, Duarte, Hegde]
  update signal estimate
  prune signal estimate (best K-term model approx)
  update residual

Model-based CS Recovery: Provable guarantees [Baraniuk, VC, Duarte, Hegde]
CS recovery error bounded by the signal's K-term model approximation error plus noise
  update signal estimate
  prune signal estimate (best K-term model approx)
  update residual

Block-Sparse CS Recovery Iterative Block Thresholding within block sort + threshold across blocks
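The "within block, sort + threshold across blocks" prune step can be sketched as follows (the block size and the demo signal are illustrative assumptions):

```python
import numpy as np

def block_threshold(x, block_size, J):
    """Within each block, compute the energy; across blocks, keep the J
    largest-energy blocks and zero out the rest."""
    blocks = x.reshape(-1, block_size)
    energy = (blocks ** 2).sum(axis=1)      # within-block energy
    keep = np.argsort(energy)[-J:]          # threshold across blocks
    out = np.zeros_like(blocks)
    out[keep] = blocks[keep]
    return out.ravel()

# demo: signal with 2 strong blocks of size 4 plus small noise everywhere
rng = np.random.default_rng(4)
x = 0.01 * rng.standard_normal(32)
x[4:8] += np.array([3.0, -2.0, 1.5, 2.5])
x[20:24] += np.array([-1.0, 4.0, 2.0, -3.0])
xb = block_threshold(x, block_size=4, J=2)
```

Dropping this operator into the thresholding loop in place of the plain K-term prune yields the block-sparse recovery described above.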

Block-Sparse Signal Example
target; CoSaMP (MSE = 0.723); block-sparse model recovery (MSE = 0.015)
Blocks are pre-specified.

Clustered Sparse Signals: (K,C)-sparse signal model

Clustered Sparsity
(K,C)-sparse signal model (1-D): K-sparse within at most C clusters
Stable recovery w/ model-based framework
Model approximation via dynamic programming (recursive / bottom-up)
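A minimal sketch of the dynamic-programming model approximation, assuming clusters are runs of consecutive indices; this simple O(NKC) forward pass illustrates the idea and is not necessarily the exact recursive / bottom-up formulation from the talk:

```python
def kc_sparse_approx(x, K, C):
    """Best (K,C)-sparse approximation of a 1-D signal: keep at most K
    entries forming at most C contiguous clusters, maximizing captured
    energy.  DP state: (entries kept, clusters used, last entry kept?)."""
    N = len(x)
    w = [v * v for v in x]
    NEG = float("-inf")
    # dp[k][c][s] = best energy after processing a prefix; s = 1 means the
    # most recent entry was kept (a cluster is currently open)
    dp = [[[NEG, NEG] for _ in range(C + 1)] for _ in range(K + 1)]
    dp[0][0][0] = 0.0
    par = []                                  # predecessor states for backtracking
    for i in range(N):
        nd = [[[NEG, NEG] for _ in range(C + 1)] for _ in range(K + 1)]
        pr = [[[None, None] for _ in range(C + 1)] for _ in range(K + 1)]
        for k in range(K + 1):
            for c in range(C + 1):
                for s in (0, 1):
                    v = dp[k][c][s]
                    if v == NEG:
                        continue
                    # option 1: skip entry i (closes any open cluster)
                    if v > nd[k][c][0]:
                        nd[k][c][0] = v
                        pr[k][c][0] = (k, c, s)
                    # option 2: keep entry i (opens a cluster if none is open)
                    if k < K:
                        nc = c if s == 1 else c + 1
                        if nc <= C and v + w[i] > nd[k + 1][nc][1]:
                            nd[k + 1][nc][1] = v + w[i]
                            pr[k + 1][nc][1] = (k, c, s)
        dp = nd
        par.append(pr)
    # pick the best final state and backtrack the kept support
    _, k, c, s = max((dp[k][c][s], k, c, s)
                     for k in range(K + 1) for c in range(C + 1) for s in (0, 1))
    keep = [False] * N
    for i in range(N - 1, -1, -1):
        keep[i] = (s == 1)
        k, c, s = par[i][k][c][s]
    return [xi if kp else 0.0 for xi, kp in zip(x, keep)]
```

For example, with K = 4 and C = 2 the routine picks the two highest-energy clusters of a toy signal and zeros everything else, which is exactly the model-approximation step the recovery loop needs.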

Example

Simulation via Block-Sparsity
Clustered sparse <> approximable by block-sparse*
  *if we are willing to pay a 3x sampling penalty
Proof by (adversarial) construction ...

Clustered Sparsity in 2D
Model clustering of significant pixels in the space domain using a graphical model (MRF)
Ising model; approximation via graph cuts [VC, Duarte, Hegde, Baraniuk]
[figures: target; Ising-model recovery; CoSaMP recovery; LP (FPC) recovery]

Conclusions
Why CS works: stable embedding for signals with special geometric structure
Sparse signals >> model-sparse signals
Compressible signals >> model-compressible signals
Greedy model-based signal recovery algorithms
  upshot: provably fewer measurements, more stable recovery
  new model: clustered sparsity

Volkan Cevher / volkan@rice.edu