A Single-letter Characterization of Optimal Noisy Compressed Sensing
Dongning Guo, Dror Baron, Shlomo Shamai

Setting
Replace samples by more general measurements based on a few linear projections (inner products): $y = \Phi x$.
[Figure: measurement vector $y$ (M measurements), sparse signal $x$ of length N with K non-zeros.]

Signal Model
Signal entry $X_n = B_n U_n$, with i.i.d. $B_n \sim \text{Bernoulli}(\epsilon)$ (sparsity) and i.i.d. $U_n \sim P_U$.
[Figure: $P_U$ and Bernoulli($\epsilon$) feed a multiplier, yielding the input distribution $P_X$.]
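
A minimal sketch of this signal model in Python (the Gaussian choice for $P_U$ and all parameter values are illustrative assumptions; the slides leave $P_U$ general):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000         # signal length
epsilon = 0.1    # Bernoulli sparsity rate (assumed value)

B = rng.random(N) < epsilon       # B_n ~ Bernoulli(epsilon), i.i.d.
U = rng.standard_normal(N)        # U_n ~ P_U, here standard Gaussian
x = B * U                         # X_n = B_n * U_n: about epsilon*N non-zeros
```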

Measurement Noise
–Measurement process is typically analog
–Analog systems add noise, non-linearities, etc.
–Assume Gaussian noise for ease of analysis
–Can be generalized to non-Gaussian noise

Noise Model
Noiseless measurements denoted $y_0 = \Phi x$; noisy measurements $y = y_0 + z$ with Gaussian noise $z$.
Unit-norm columns in $\Phi$ normalize the signal power, so the SNR is set by the noise variance (a sketch follows below).
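
A sketch of the measurement step under the stated assumptions (the SNR normalization below is an assumption; the slides only fix unit-norm columns and Gaussian noise):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, epsilon, snr = 1000, 500, 0.1, 10.0
x = (rng.random(N) < epsilon) * rng.standard_normal(N)  # sparse signal as above

Phi = rng.standard_normal((M, N))
Phi /= np.linalg.norm(Phi, axis=0)        # unit-norm columns
y0 = Phi @ x                              # noiseless measurements y_0
sigma2 = np.mean(y0**2) / snr             # assumed SNR = E[y_0^2] / sigma^2
y = y0 + np.sqrt(sigma2) * rng.standard_normal(M)  # noisy measurements
```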

Model process as measurement channel
Measurements provide information!
[Block diagram: source encoder → channel encoder → channel → channel decoder → source decoder, with CS measurement playing the role of encoding and CS decoding the role of decoding.]
Allerton 2006 [Sarvotham, Baron, & Baraniuk]

Single-Letter Bounds
Theorem [Sarvotham, Baron, & Baraniuk 2006]: For a sparse signal with rate-distortion function R(D), there is a lower bound on the measurement rate for a given SNR and distortion D (a plausible reconstruction of the bound follows this slide).
Numerous single-letter bounds:
–[Aeron, Zhao, & Saligrama]
–[Akcakaya & Tarokh]
–[Rangan, Fletcher, & Goyal]
–[Gastpar & Reeves]
–[Wang, Wainwright, & Ramchandran]
–[Tune, Bhaskaran, & Hanly]
–…
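
The theorem's displayed bound did not survive extraction; a plausible reconstruction, assuming the usual source-channel converse form in which each noisy measurement is treated as one use of an AWGN channel with capacity $\tfrac{1}{2}\log_2(1+\mathrm{SNR})$ bits:

$$\frac{M}{N} \;\ge\; \frac{R(D)}{\tfrac{1}{2}\log_2(1+\mathrm{SNR})}.$$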

Goal: Precise Single-letter Characterization of Optimal CS

What Single-letter Characterization?
Ultimately, what can one say about $X_n$ given $Y$? The posterior $p(x_n \mid Y, \Phi)$ is a sufficient statistic, but it is very complicated; we want a simple characterization of its quality.
Large-system limit: $N, M \to \infty$ with the measurement rate $M/N$ held constant.
[Figure labels: channel, posterior.]

Main Result: Single-letter Characterization
Result 1: Conditioned on $X_n = x_n$, the observations $(Y, \Phi)$ are statistically equivalent to a degraded scalar Gaussian observation of $x_n$, which is easy to compute (a plausible form is given below).
Estimation quality from $(Y, \Phi)$ is just as good as from this noisier scalar observation.
[Figure: the vector channel and posterior reduce to an equivalent scalar channel with degradation.]
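
The equivalent-channel display was lost in extraction; following the CDMA replica results the deck cites [Guo & Verdú 2005], a plausible form of the scalar equivalence (the paper's exact normalization may differ) is

$$V_n = x_n + \frac{1}{\sqrt{\eta\,\mathrm{SNR}}}\, Z_n, \qquad Z_n \sim \mathcal{N}(0,1),$$

i.e., the posterior of $X_n$ given $(Y, \Phi)$ matches the posterior given the single degraded scalar observation $V_n$.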

 2 (0,1) is fixed point of Take-home point: degraded scalar channel Non-rigorous owing to replica method w/ symmetry assumption –used in CDMA detection [Tanaka 2002, Guo & Verdu 2005] Related analysis [Rangan, Fletcher, & Goyal 2009] –MMSE estimate (not posterior) using [Guo & Verdu 2005] –extended to several CS algorithms particularly LASSO Details

Decoupling

Decoupling Result
Result 2: In the large-system limit, any arbitrary (constant) number $L$ of input elements decouple:
$$p(x_{n_1}, \ldots, x_{n_L} \mid Y, \Phi) \;\to\; \prod_{l=1}^{L} p(x_{n_l} \mid Y, \Phi).$$
Take-home point: the “interference” from each individual signal entry vanishes

Sparse Measurement Matrices

Sparse Measurement Matrices [Baron, Sarvotham, & Baraniuk]
–LDPC-like measurement matrix (sparse): mostly zeros in $\Phi$; nonzeros $\sim P_\Phi$
–Each row contains $\approx Nq$ randomly placed nonzeros
–Fast matrix-vector multiplication ⇒ fast encoding / decoding
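
A sketch of such a sparse LDPC-like matrix with fast multiplication (drawing nonzeros from ±1 is an illustrative assumption; the slides only specify nonzeros $\sim P_\Phi$):

```python
import numpy as np
from scipy.sparse import csr_matrix

rng = np.random.default_rng(0)
N, M, q = 1000, 500, 0.02
k = int(round(N * q))                       # ~Nq nonzeros per row

rows = np.repeat(np.arange(M), k)
cols = np.concatenate([rng.choice(N, size=k, replace=False) for _ in range(M)])
vals = rng.choice([-1.0, 1.0], size=M * k)  # nonzeros ~ P_Phi (here: +/-1)
Phi = csr_matrix((vals, (rows, cols)), shape=(M, N))

x = (rng.random(N) < 0.1) * rng.standard_normal(N)
y = Phi @ x                                 # sparse matvec: O(M*N*q) work
```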

CS Decoding Using BP [Baron, Sarvotham, & Baraniuk]
–Measurement matrix represented by a graph
–Estimate the input iteratively
–Implemented via nonparametric BP [Bickson, Sommer, …]
[Figure: bipartite graph connecting signal entries $x$ to measurements $y$.]
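
Full nonparametric BP is beyond a short snippet, but the flavor of iterative estimation on this model can be sketched with an AMP-style iteration (a related message-passing algorithm, not the decoder from these slides) that alternates a pseudo-data step with the Bernoulli-Gaussian posterior-mean denoiser:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, epsilon, snr = 1000, 500, 0.1, 10.0
x = (rng.random(N) < epsilon) * rng.standard_normal(N)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)        # ~unit-norm columns
sigma2 = np.mean((Phi @ x) ** 2) / snr
y = Phi @ x + np.sqrt(sigma2) * rng.standard_normal(M)

def denoise(r, tau2):
    """Posterior mean of X = B*U given pseudo-data r = x + N(0, tau2)."""
    w1 = epsilon * np.exp(-0.5 * r**2 / (1 + tau2)) / np.sqrt(1 + tau2)
    w0 = (1 - epsilon) * np.exp(-0.5 * r**2 / tau2) / np.sqrt(tau2)
    return (w1 / (w1 + w0)) * r / (1 + tau2)

x_hat, z = np.zeros(N), y.copy()
for _ in range(30):
    tau2 = np.mean(z**2)                     # effective noise variance
    r = x_hat + Phi.T @ z                    # pseudo-data: a decoupled scalar channel
    x_new = denoise(r, tau2)
    h = 1e-5                                 # numerical denoiser derivative
    d = (denoise(r + h, tau2) - denoise(r - h, tau2)) / (2 * h)
    z = y - Phi @ x_new + (N / M) * np.mean(d) * z   # Onsager correction
    x_hat = x_new

print("reconstruction MSE:", np.mean((x_hat - x) ** 2))
```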

Identical Single-letter Characterization w/ BP
Result 3: Conditioned on $X_n = x_n$, the observations $(Y, \Phi)$ are statistically equivalent to the identical degraded scalar observation as in Result 1.
–Sparse matrices are just as good
–BP is asymptotically optimal!

[Figure: joint posterior density illustrating decoupling between two input entries (N=500, M=250, ε=0.1, SNR=10).]

[Figure: MSE of CS-BP versus other CS methods (N=1000, ε=0.1, q=0.02); the comparison includes the MMSE benchmark and CS-BP.]

Conclusion
–Single-letter characterization of CS
–Decoupling
–Sparse matrices just as good
–Asymptotically optimal CS-BP algorithm