Sudocodes: Fast measurement and reconstruction of sparse signals
Shriram Sarvotham, Dror Baron, Richard Baraniuk
ECE Department, Rice University
dsp.rice.edu/cs


Motivation: coding of sparse data
Distributed delivery of data with sparse representation
– Content delivery networks
– Peer-to-peer networks
– Distributed file storage systems
E.g., thresholded DCT/wavelet coefficients used in JPEG/JPEG2000

Motivation: coding of sparse data
Distributed coding of sparse data
– Can we exploit sparsity?
– Efficient?
– Low complexity?

Sparse signal processing
Signal x has K non-zero coefficients
Efficient ways to measure and recover x?
Traditional DSP approach:
– Acquisition: first obtain N measurements
– Sparsity is exploited in the processing stage
New compressive sampling (CS) approach:
– Acquisition: obtain just M << N measurements
– Sparsity is exploited during signal acquisition [Candes et al.; Donoho]

Compressive sampling
Signal x is K-sparse: length N, with K nonzero entries
Measure signal via M linear projections: y = Φx
M measurements are enough to encode the signal
Random Gaussian measurements Φ will work!
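The measurement setup on this slide can be sketched in a few lines of NumPy; the dimensions below are illustrative, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 1000, 10, 200          # signal length, sparsity, number of measurements

# K-sparse signal: K nonzero entries at random positions
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.normal(size=K)

# dense random Gaussian measurement matrix, y = Phi @ x
Phi = rng.normal(size=(M, N)) / np.sqrt(M)
y = Phi @ x

print(y.shape)    # (200,) -- far fewer numbers than the N = 1000 signal samples
```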

CS Miracle: L1 reconstruction
Find the explanation with the smallest L1 norm: min ||x||₁ subject to y = Φx [Candes et al.; Donoho]
If M = O(K log(N/K)), then perfect reconstruction w/ high probability

CS Miracle: L1 reconstruction
Performance:
– Polynomial complexity reconstruction
– Efficient encoding

CS Miracle: L1 reconstruction
But… L1 reconstruction is still impractical for many applications
Reconstruction times:
– N=1,000: t = 10 seconds
– N=10,000: t = 3 hours
– N=100,000: t ≈ months
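The L1 program above is a linear program in disguise. A minimal sketch using SciPy's `linprog` (not the talk's solver; problem sizes here are small and illustrative):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
N, K, M = 40, 3, 25                       # small sizes so the LP solves instantly

x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.normal(size=K)
Phi = rng.normal(size=(M, N))
y = Phi @ x

# min ||x||_1 s.t. Phi x = y, written as an LP over [x; t] with |x_j| <= t_j:
#   minimize sum(t)  s.t.  x - t <= 0,  -x - t <= 0,  Phi x = y
c = np.concatenate([np.zeros(N), np.ones(N)])
A_ub = np.block([[np.eye(N), -np.eye(N)], [-np.eye(N), -np.eye(N)]])
b_ub = np.zeros(2 * N)
A_eq = np.hstack([Phi, np.zeros((M, N))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * N + [(0, None)] * N)
x_hat = res.x[:N]
print(np.max(np.abs(x_hat - x)))          # exact recovery up to solver tolerance
```

With Gaussian Φ and M comfortably above K log(N/K), recovery is exact with high probability, but solving this LP at large N is what makes the timings above so long.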

Why is reconstruction expensive?
Culprit: the measurement matrix Φ is dense and unstructured

Fast CS reconstruction
Sudocode matrix Φ (sparse)
Only 0/1 entries in Φ
Each row of Φ contains L randomly placed 1's
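A sudocode-style matrix can be sketched as follows (the row weight L and sizes are illustrative; the talk's exact construction may differ):

```python
import numpy as np

def sudo_matrix(M, N, L, rng):
    """Sparse 0/1 matrix: each row has exactly L randomly placed ones."""
    Phi = np.zeros((M, N))
    for i in range(M):
        Phi[i, rng.choice(N, size=L, replace=False)] = 1.0
    return Phi

rng = np.random.default_rng(0)
Phi = sudo_matrix(M=120, N=100, L=20, rng=rng)
# measuring a sparse signal is now just sums over small random subsets of entries
```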

Sudocodes
Sudocode performance:
– Efficient encoding
– Sub-linear complexity reconstruction
Encouraging numerical results:
N=100,000, K=1,000 → t=5.47 seconds, M=5,132

Sudocode reconstruction
Process each measurement in succession
Each measurement can recover some coefficients

Case 1: Zero measurement
Resolves all coefficients in the row's support: each must be zero
Can resolve up to L coefficients at once
Reduces the size of the problem

Case 2: #(support set) = 1
A row whose support contains a single unresolved index j gives the value of that coefficient directly
Trivially resolves it

Case 3: Matching measurements
Matching measurement values originate from the same support
Disjoint part of the supports → those coefficients = 0
Common part of the supports → contains the nonzeros

Trigger of revelations
Recovery of one coefficient can trigger more revelations
An avalanche of coefficient revelations

Sudocode reconstruction
Like sudoku puzzles
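Cases 1 and 2 plus the avalanche step amount to a peeling decoder. A minimal sketch, assuming a 0/1 sudocode matrix and strictly positive nonzeros (Case 3 matching is omitted; names and sizes are illustrative):

```python
import numpy as np
from collections import deque

def sudo_decode(Phi, y, tol=1e-9):
    """Peeling decoder: Case 1 (zero measurement), Case 2 (singleton support),
    and the avalanche of revelations. Returns NaN for unresolved entries."""
    M, N = Phi.shape
    supports = [set(np.flatnonzero(Phi[i])) for i in range(M)]
    rows_of = [set() for _ in range(N)]
    for i, s in enumerate(supports):
        for j in s:
            rows_of[j].add(i)
    resid = np.asarray(y, dtype=float).copy()
    x_hat = np.full(N, np.nan)
    queue, in_q = deque(range(M)), [True] * M

    def resolve(j, v):
        # reveal x_j = v and subtract it from every measurement covering j
        x_hat[j] = v
        for i in rows_of[j]:
            supports[i].discard(j)
            resid[i] -= v
            if not in_q[i]:
                queue.append(i)
                in_q[i] = True
        rows_of[j] = set()

    while queue:
        i = queue.popleft()
        in_q[i] = False
        s = supports[i]
        if not s:
            continue
        if abs(resid[i]) < tol:            # Case 1: all covered coefficients are 0
            for j in list(s):
                resolve(j, 0.0)
        elif len(s) == 1:                  # Case 2: a single unresolved coefficient
            resolve(next(iter(s)), resid[i])
    return x_hat

rng = np.random.default_rng(0)
N, K, L, M = 100, 5, 20, 120
Phi = np.zeros((M, N))
for i in range(M):
    Phi[i, rng.choice(N, size=L, replace=False)] = 1.0
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.uniform(1.0, 2.0, size=K)
x_hat = sudo_decode(Phi, Phi @ x)
```

Each resolved coefficient shrinks the supports of the rows covering it, which creates new zero and singleton rows: exactly the avalanche described above.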

Practical considerations
Bottleneck: the search for matching measurements
– With a binary search tree, matches are found in logarithmic time
Re-explaining measurements requires additional data structures for the match search

Design of Sudo measurement matrix
Choice of row weight L:
– Small L: most measurements reveal only zeros → many measurements needed
– Large L: most measurements cover several nonzeros and are uninformative → many measurements needed
Intuition: choose L ≈ N/K so that each measurement covers about one nonzero coefficient
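The tradeoff can be checked numerically: with hypergeometric row statistics, the probability that a row covers exactly one nonzero peaks at L ≈ N/K (sizes are illustrative):

```python
from math import comb

N, K = 1000, 10

def p_singleton(L):
    # P(a row of weight L covers exactly one of the K nonzeros), hypergeometric
    return K * comb(N - K, L - 1) / comb(N, L)

best_L = max(range(1, N - K + 2), key=p_singleton)
print(best_L)   # 100 == N // K
```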

Related work
[Cormode, Muthukrishnan]
– CS scheme based on group testing
[Gilbert et al.] Chaining Pursuit
– CS scheme based on group testing and iterating
– Works best for super-sparse signals

Performance comparison

                     L1 reconstruction      Chaining Pursuit        Sudocodes
N=10,000,  K=10      M=99,    T = 3 hours   M=5,915,  t=0.16 sec    M=461,   t=0.14 sec
N=10,000,  K=100     M=664,   T = 3 hours   M=90,013, t=2.43 sec    M=803,   t=0.37 sec
N=100,000, K=10      M=1,329, T ~ months    M=17,398, t=1.13 sec    M=931,   t=1.09 sec
N=10,000,  K=1,000   M=3,321, T = 3 hours   M>10^6,   t>30 sec      M=5,132, t=5.47 sec

Utility in CDNs
Measurements come from different sources
Just needs enough measurements in total

Ongoing work
– Statistical dependencies between non-zero coefficients
– Irregular degree distributions
– Adaptive linear projections
– Noisy measurements

Conclusions
Sudocodes for CS:
– highly efficient
– low complexity
Key idea: use a sparse measurement matrix Φ
Applications to content distribution

THE END Compressed sensing webpage: dsp.rice.edu/cs

Number of measurements
Theorem: With the row weight chosen as above, Phase 1 requires M = O(K log N) measurements to exactly reconstruct the coefficients
Proof sketch:

Two-phase decoding
Phase 1: decode most of the coefficients
Phase 2: decode the remaining coefficients
Why?
– When most coefficients are already decoded, Phase 2 saves a factor of measurements on the small remaining part

Phase 2 measurements and decoding
The remaining unresolved part of the signal is non-sparse, of small dimension
Resolve the remaining coefficients by inverting the corresponding sub-matrix of Φ
Phase 2 complexity = the cost of this small matrix inversion
Key: choose the phase split so that Phase 2 complexity is negligible
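Phase 2 reduces to a small dense linear solve. A minimal NumPy sketch, with an illustrative split (the remainder dimension and matrix here are assumptions, not the talk's values):

```python
import numpy as np

rng = np.random.default_rng(2)
d = 8                                  # few remaining unresolved coefficients
x_rem = rng.normal(size=d)             # the non-sparse remainder

# d dense measurements restricted to the unresolved coordinates
Phi2 = rng.normal(size=(d, d))
y2 = Phi2 @ x_rem

# invert the small square sub-matrix to resolve the remainder
x_rem_hat = np.linalg.solve(Phi2, y2)
print(np.max(np.abs(x_rem_hat - x_rem)))   # tiny: exact up to floating point
```

Since d stays small, this inversion costs a negligible O(d^3) compared with Phase 1.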

Compressive sampling
Signal is K-sparse in some basis/dictionary
– WLOG assume sparsity in the space domain
Measure signal via M linear projections
Random sparse measurements will work!

Signal model
Signal is strictly sparse
Every nonzero coefficient ~ a continuous distribution
→ each nonzero coefficient is unique almost surely