
1 A Single-letter Characterization of Optimal Noisy Compressed Sensing
Dongning Guo, Dror Baron, Shlomo Shamai

2 Setting
Replace samples by more general measurements based on a few linear projections (inner products): y = Φx
[Figure: measurement vector y, measurement matrix Φ, sparse signal x with K non-zeros]

3 Signal Model
Signal entry X_n = B_n U_n
iid B_n ~ Bernoulli(γ) → sparse
iid U_n ~ P_U
[Figure: multiplier combining Bernoulli(γ) and P_U to produce P_X]
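A minimal sketch of this signal model in Python; the sparsity rate γ = 0.1 and the Gaussian choice of P_U are illustrative assumptions, not values fixed by the slide:

```python
import numpy as np

rng = np.random.default_rng(0)
N, gamma = 1000, 0.1            # signal length and sparsity rate (assumed)

B = rng.random(N) < gamma       # B_n ~ Bernoulli(gamma)
U = rng.standard_normal(N)      # U_n ~ P_U (Gaussian, for illustration only)
X = B * U                       # X_n = B_n * U_n: iid sparse signal
```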

4 Measurement Noise
Measurement process is typically analog
Analog systems add noise, non-linearities, etc.
Assume Gaussian noise for ease of analysis
Can be generalized to non-Gaussian noise

5 Noise Model
Noiseless measurements denoted y_0 = Φx
Noisy measurements y = y_0 + z, with Gaussian noise z
Unit-norm columns of Φ → SNR = E‖y_0‖² / E‖z‖²
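A hedged sketch of this measurement model; the dense Gaussian Φ and the noise level σ are placeholder assumptions (slide 15 later replaces Φ with a sparse matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, gamma, sigma = 1000, 500, 0.1, 0.3    # assumed values

X = (rng.random(N) < gamma) * rng.standard_normal(N)

Phi = rng.standard_normal((M, N))
Phi /= np.linalg.norm(Phi, axis=0)          # unit-norm columns

y0 = Phi @ X                                # noiseless measurements y_0
y = y0 + sigma * rng.standard_normal(M)     # noisy measurements y = y_0 + z
snr = np.sum(y0**2) / (M * sigma**2)        # empirical SNR = E||y_0||^2 / E||z||^2
```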

6 Model process as measurement channel
Measurements provide information!
[Diagram: source encoder → channel encoder → channel → channel decoder → source decoder; CS measurement plays the role of the encoders, CS decoding the role of the decoders]
Allerton 2006 [Sarvotham, Baron, & Baraniuk]

7 Single-Letter Bounds
Theorem [Sarvotham, Baron, & Baraniuk 2006]: For a sparse signal with rate-distortion function R(D), the measurement rate required to achieve distortion D at a given SNR is lower bounded (a capacity-style sanity check follows this list)
Numerous single-letter bounds:
– [Aeron, Zhao, & Saligrama]
– [Akcakaya & Tarokh]
– [Rangan, Fletcher, & Goyal]
– [Gastpar & Reeves]
– [Wang, Wainwright, & Ramchandran]
– [Tune, Bhaskaran, & Hanly]
– …
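The exact form of the bound is not recoverable from this transcript, but a capacity-style sanity check conveys the idea: each noisy measurement can carry at most ½·log₂(1+SNR) bits, while describing the source to distortion D needs R(D) bits per entry. The numbers below are hypothetical:

```python
import numpy as np

SNR = 10.0        # measurement SNR (assumed)
R_D = 0.5         # rate-distortion value in bits per entry (assumed)

bits_per_measurement = 0.5 * np.log2(1 + SNR)    # Gaussian-channel capacity
rate_lower_bound = R_D / bits_per_measurement    # measurements per signal entry
print(rate_lower_bound)                          # ~0.29 here
```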

8 Goal: Precise Single-letter Characterization of Optimal CS

9 What Single-letter Characterization?
Ultimately, what can one say about X_n given Y? (sufficient statistic)
Very complicated
Want a simple characterization of its quality
Large-system limit: N → ∞ with the measurement rate M/N fixed
[Diagram: channel and posterior]

10 Main Result: Single-letter Characterization
Result 1: Conditioned on X_n = x_n, the observations (Y, Φ) are statistically equivalent to a degraded scalar Gaussian observation
  √(η·SNR)·x_n + N_n,  N_n ~ N(0,1)
which is easy to compute…
Estimation quality from (Y, Φ) is just as good as from this noisier scalar observation
[Diagram: channel and posterior, with degradation η]
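To make "statistically equivalent" concrete, here is a sketch of the claimed scalar equivalent; the degradation value η = 0.55 is assumed for illustration (the next slide gives the equation that determines it):

```python
import numpy as np

rng = np.random.default_rng(0)
SNR, eta = 10.0, 0.55          # eta assumed here; see the fixed point below

x_n = 1.0                      # condition on one signal entry X_n = x_n
y_scalar = np.sqrt(eta * SNR) * x_n + rng.standard_normal()
# Result 1: inference about x_n from the full (Y, Phi) is asymptotically
# no better than inference from this single degraded scalar observation.
```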

11  2 (0,1) is fixed point of Take-home point: degraded scalar channel Non-rigorous owing to replica method w/ symmetry assumption –used in CDMA detection [Tanaka 2002, Guo & Verdu 2005] Related analysis [Rangan, Fletcher, & Goyal 2009] –MMSE estimate (not posterior) using [Guo & Verdu 2005] –extended to several CS algorithms particularly LASSO Details

12 Decoupling

13 Decoupling Result
Result 2: In the large-system limit, any arbitrary (constant) number L of input elements decouple:
  p(x_{n_1}, …, x_{n_L} | Y, Φ) → ∏_{l=1}^{L} p(x_{n_l} | Y, Φ)
Take-home point: "interference" from each individual signal entry vanishes

14 Sparse Measurement Matrices

15 Sparse Measurement Matrices [Baron, Sarvotham, & Baraniuk]
LDPC measurement matrix (sparse)
Mostly zeros in Φ; nonzeros ~ P_Φ
Each row contains ≈ Nq randomly placed nonzeros
Fast matrix-vector multiplication → fast encoding / decoding
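A sketch of such a matrix; the ±1 nonzero values are an assumption standing in for the generic P_Φ of the slide:

```python
import numpy as np
from scipy.sparse import csr_matrix

def sparse_measurement_matrix(M, N, q, rng):
    """Each row gets ~N*q randomly placed nonzeros, drawn here as +/-1."""
    k = max(1, round(N * q))
    rows, cols, vals = [], [], []
    for m in range(M):
        cols.extend(rng.choice(N, size=k, replace=False))
        rows.extend([m] * k)
        vals.extend(rng.choice([-1.0, 1.0], size=k))
    return csr_matrix((vals, (rows, cols)), shape=(M, N))

rng = np.random.default_rng(0)
Phi = sparse_measurement_matrix(M=500, N=1000, q=0.02, rng=rng)

x = (rng.random(1000) < 0.1) * rng.standard_normal(1000)
y = Phi @ x          # O(M*N*q) multiply: fast encoding
```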

16 CS Decoding Using BP [Baron, Sarvotham, & Baraniuk]
Measurement matrix represented by a bipartite graph between signal entries x and measurements y
Estimate input iteratively
Implemented via nonparametric BP [Bickson, Sommer, …]
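Nonparametric BP itself is beyond a short sketch, but the bipartite graph the decoder iterates over can be read directly off the sparse matrix; the matrix construction here is a placeholder:

```python
from scipy.sparse import random as sparse_random

Phi = sparse_random(500, 1000, density=0.02, format="csr", random_state=0)

# Bipartite-graph view: measurement node m is connected to the signal
# nodes in its row support; signal node n to the measurements touching it.
meas_neighbors = [Phi.indices[Phi.indptr[m]:Phi.indptr[m + 1]]
                  for m in range(Phi.shape[0])]
Phi_csc = Phi.tocsc()
sig_neighbors = [Phi_csc.indices[Phi_csc.indptr[n]:Phi_csc.indptr[n + 1]]
                 for n in range(Phi.shape[1])]
# BP passes messages along these edges in both directions until the
# per-entry posteriors stabilize.
```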

17 Identical Single-letter Characterization w/ BP
Result 3: Conditioned on X_n = x_n, the observations (Y, Φ) are statistically equivalent to the same degraded scalar observation, with identical degradation η
Sparse matrices just as good
BP is asymptotically optimal!

18 Decoupling Between Two Input Entries (N=500, M=250, γ=0.1, SNR=10)
[Figure: joint posterior density of two input entries, illustrating the decoupling into a product form]

19 CS-BP vs Other CS Methods (N=1000, γ=0.1, q=0.02)
[Figure: MMSE of CS-BP versus other CS methods]

20 Conclusion
Single-letter characterization of CS
Decoupling
Sparse matrices just as good
Asymptotically optimal CS-BP algorithm

