Hybrid Dense/Sparse Matrices in Compressed Sensing Reconstruction


Hybrid Dense/Sparse Matrices in Compressed Sensing Reconstruction
Ilya Poltorak, Dror Baron, Deanna Needell
Came out of my personal experience with 301 – Fourier analysis and linear systems.
The work has been supported by the Israel Science Foundation and the National Science Foundation.

CS Measurement
Replace samples by a more general encoder based on a few linear projections (inner products).
(Figure: measurements, sparse signal, # non-zeros.)
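The encoder above can be sketched in a few lines. The sizes N, M, K and the dense Gaussian choice of the matrix are illustrative, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: signal length N, measurements M << N, non-zeros K.
N, M, K = 40, 10, 3

x = np.zeros(N)                                   # strictly sparse signal
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N))                 # generic dense encoder
y = Phi @ x                                       # M inner products = measurements
```

Each measurement is one inner product of a row of the matrix with the signal, so M numbers stand in for N samples.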

Caveats
Input x strictly sparse with real values.
Noiseless measurements (noise can be addressed later).
Assumptions relevant to content distribution (later).

Why is Decoding Expensive?
Culprit: a dense, unstructured measurement matrix.
(Figure: sparse signal, measurements, nonzero entries.)

Sparse Measurement Matrices (dense later!)
LDPC measurement matrix (sparse):
Only {-1, 0, +1} entries in Φ.
Each row of Φ contains L randomly placed nonzeros.
Fast matrix-vector multiplication ⇒ fast encoding & decoding.
(Figure: sparse signal, measurements, nonzero entries.)
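A minimal way to draw such a matrix; the function name `ldpc_like_matrix` is my own, not from the talk:

```python
import numpy as np

def ldpc_like_matrix(M, N, L, rng):
    """M x N measurement matrix: each row has exactly L randomly placed
    entries drawn from {-1, +1}; all other entries are 0."""
    Phi = np.zeros((M, N))
    for i in range(M):
        cols = rng.choice(N, size=L, replace=False)   # L distinct columns
        Phi[i, cols] = rng.choice([-1.0, 1.0], size=L)
    return Phi

rng = np.random.default_rng(1)
Phi = ldpc_like_matrix(M=6, N=20, L=4, rng=rng)
```

Because each row has only L nonzeros, a matrix-vector product costs O(ML) rather than O(MN), which is the source of the fast encoding and decoding.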

Example
(Figure: a sparse 0/1 measurement matrix acting on a sparse signal whose non-zero values are 1 and 4.)

Example
What does a zero measurement imply? Hint: x is strictly sparse.

Example
Graph reduction!

Example
What do matching measurements imply? Hint: the non-zeros in x are real numbers.

Example
What is the last entry of x?

Noiseless Algorithm [Luby & Mitzenmacher 2005] [Sarvotham, Baron, & Baraniuk 2006] [Zhang & Pfister 2008]
Initialize.
Phase 1: zero measurements.
Phase 2: matching measurements.
Phase 3: singleton measurements.
Typically iterate 2-3 times until done, then arrange the output.
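The three phases can be written as a runnable sketch under the slides' assumptions (0/1 sparse matrix, strictly sparse x with generic real non-zeros, noiseless measurements). The function names and the toy example are mine, not from the talk:

```python
import numpy as np

def _reduce(Phi, y, x_hat):
    """Graph reduction: subtract each resolved entry's contribution from
    the measurements and remove its edges (zero out its column)."""
    for j in np.flatnonzero(~np.isnan(x_hat)):
        y -= Phi[:, j] * x_hat[j]
        Phi[:, j] = 0.0

def peel_decode(Phi, y, max_iter=3):
    """Sketch of the three-phase decoder for a 0/1 sparse measurement
    matrix and noiseless y = Phi @ x. Unresolved entries are np.nan."""
    Phi = Phi.astype(float).copy()
    y = y.astype(float).copy()
    M, N = Phi.shape
    x_hat = np.full(N, np.nan)
    for _ in range(max_iter):
        # Phase 1: a zero measurement forces all of its neighbors to zero.
        for m in range(M):
            if np.isclose(y[m], 0.0):
                x_hat[np.flatnonzero(Phi[m])] = 0.0
        _reduce(Phi, y, x_hat)
        # Phase 2: two equal non-zero measurements force the entries in the
        # symmetric difference of their supports to zero (generic values).
        for i in range(M):
            for k in range(i + 1, M):
                if not np.isclose(y[i], 0.0) and np.isclose(y[i], y[k]):
                    x_hat[np.flatnonzero(np.abs(Phi[i] - Phi[k]))] = 0.0
        _reduce(Phi, y, x_hat)
        # Phase 3: a row with a single unresolved neighbor pins its value.
        for m in range(M):
            cols = np.flatnonzero(Phi[m])
            if len(cols) == 1:
                x_hat[cols[0]] = y[m]
        _reduce(Phi, y, x_hat)
        if not np.isnan(x_hat).any():
            break
    return x_hat

# Toy example in the spirit of the slides: non-zero values 3 and 7.
Phi = np.array([[1, 0, 1, 0, 0, 1],
                [0, 1, 0, 1, 0, 0],
                [0, 1, 0, 0, 1, 0],
                [0, 0, 0, 1, 1, 0],
                [0, 1, 0, 0, 0, 0]])
x = np.array([0.0, 3.0, 0.0, 0.0, 7.0, 0.0])
x_hat = peel_decode(Phi, Phi @ x)
```

On this toy problem the peeling terminates within a single sweep of the three phases; the slides report the same 2-3 iteration behavior at full scale.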

Numbers (4 seconds)
N = 40,000, 5% non-zeros, M = 0.22N, L = 20 ones per row. Only 2-3 iterations needed.

Iteration, Phase   Zeros   Non-zeros   Total
1,1                30615
1,2                35224     977       36201
1,3                          1500      36724
2,1                36800               38300
2,2                37180     1833      39013
2,3                          2063      39243
3,1                37268               39331
3,2                37289     2074      39363
3,3                          2083      39372
4,1
4,2                37291     2084      39375
4,3

Challenge
With M measurements, parts of the signal are still not reconstructed. How do we recover the rest of the signal?

Solution: Hybrid Dense/Sparse Matrix
With M measurements, parts of the signal are still not reconstructed.
Add extra dense measurements: the residual of the signal is recovered with the residual dense columns.
(Figure: residual columns.)

Sudocodes with Two-Part Decoding [Sarvotham, Baron, & Baraniuk 2006]
Sudocodes (related to sudoku):
Graph reduction solves most of the CS problem.
The residual is solved via matrix inversion.
(Diagram: sudo decoder, then residual via matrix inversion over the residual columns.)

Contribution 1: Two-Part Reconstruction
Many CS algorithms for sparse matrices [Gilbert et al., Berinde & Indyk, Sarvotham et al.].
Many CS algorithms for dense matrices [Cormode & Muthukrishnan, Candes et al., Donoho et al., Gilbert et al., Milenkovic et al., Berinde & Indyk, Zhang & Pfister, Hale et al., …].
Solve each part with the appropriate algorithm: a sparse solver first, then the residual via a dense solver over the residual columns.
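The two-part step can be sketched as follows, assuming the sparse stage marks unresolved entries with np.nan; the dense stage here is plain least squares standing in for whatever dense solver is used, and the small numeric example is hypothetical:

```python
import numpy as np

def two_part_decode(Phi_dense, y_dense, x_partial):
    """Two-part reconstruction sketch: x_partial is the output of a sparse
    solver with np.nan marking entries it could not resolve; the residual is
    recovered from the extra dense measurements via least squares."""
    x_hat = x_partial.copy()
    unresolved = np.isnan(x_hat)
    # Subtract the already-recovered part from the dense measurements,
    # then invert over the residual dense columns only (a small system).
    r = y_dense - Phi_dense[:, ~unresolved] @ x_hat[~unresolved]
    x_hat[unresolved] = np.linalg.lstsq(Phi_dense[:, unresolved], r,
                                        rcond=None)[0]
    return x_hat

# Hypothetical small case: the sparse stage recovered x[0] only.
Phi_dense = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0]])
x_true = np.array([2.0, 0.0, 5.0])
x_partial = np.array([2.0, np.nan, np.nan])
x_hat = two_part_decode(Phi_dense, Phi_dense @ x_true, x_partial)
```

Because the sparse solver already pins most entries, the dense inversion runs only on the few residual columns, which keeps the second stage cheap.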

Runtimes (K=0.05N, M=0.22N)

Theoretical Results [Sarvotham, Baron, & Baraniuk 2006]
Fast encoder and decoder: sub-linear decoding complexity (caveat: constructing the data structure).
Distributed content distribution: sparsified data, measurements stored on different servers; any M measurements suffice.
Strictly sparse signals, noiseless measurements.

Contribution 2: Noisy Measurements
Results can be extended to noisy measurements:
Part 1 (zero measurements): declare a measurement zero when |y_m| < ε.
Part 2 (matching): declare a match when |y_i - y_j| < ε.
Part 3 (singleton): unchanged.
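The two thresholded tests amount to replacing exact comparisons with tolerance checks; the helper names and the symbol eps are illustrative:

```python
def noisy_phase1_zero(y_m, eps):
    """Part 1 under noise: treat a measurement as zero if |y_m| < eps."""
    return abs(y_m) < eps

def noisy_phase2_match(y_i, y_j, eps):
    """Part 2 under noise: treat two measurements as matching if
    |y_i - y_j| < eps."""
    return abs(y_i - y_j) < eps
```

The threshold must dominate the noise level but stay below the smallest signal magnitude, which is why the next slide asks for a small ε (large SNR).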

Problems with Noisy Measurements
Multiple iterations alias noise into the next iteration! Use one iteration.
Requires a small threshold ε (large SNR).
Contribution 3: Provable reconstruction, with deterministic & random variants.

Summary
Hybrid dense/sparse matrix with two-part reconstruction:
Simple (cute?) algorithm, fast.
Applicable to content distribution.
Expandable to measurement noise.

THE END