1
COMPRESSED SENSING Luis Mancera Visual Information Processing Group Dep. Computer Science and AI Universidad de Granada

2
CONTENTS 1. WHAT? Introduction to Compressed Sensing (CS) 2. HOW? Theory behind CS 3. FOR WHAT PURPOSE? CS applications 4. AND THEN? Active research and future lines

3
CONTENTS 1. WHAT? Introduction to Compressed Sensing (CS) 2. HOW? Theory behind CS 3. FOR WHAT PURPOSE? CS applications 4. AND THEN? Active research and future lines

4
Transmission scheme: Sample (N samples) → Compress (to K coefficients, N >> K) → Transmit → Receive → Decompress. Why so many samples? Natural signals are sparse/compressible, so compression causes no significant perceptual loss; but acquiring N samples only to keep K of them is a brick wall to performance.

5
Shannon/Nyquist theorem. The Shannon/Nyquist theorem tells us to use a sampling interval of 1/(2W) seconds, where W is the highest frequency of the signal. This is a worst-case bound for ANY band-limited signal; sparse/compressible signals are a favorable case. The CS solution: merge sampling and compression.

6
Compressed Sensing (CS). Acquire M measurements, K < M << N → Transmit → Receive → Reconstruct. Recover sparse signals by directly acquiring compressed data: replace samples by measurements. What do we need for CS to succeed?

7
We know how to Sense Compressively. "I'm glad this battle is over. Finally my military period is over. I will now go back to Motril and get married, and then I will raise pigs as I have always wanted to." "Aye." "Cool! Do you mean you're glad this battle is over because now you've finished here and you will go back to Motril, get married, and raise pigs as you always wanted to?"

8
What does CS need? A nice sensing dictionary, appropriate sensing, a priori knowledge, a recovery process. "Wie lange wird das nehmen?" ("How long will this take?") "What?" "Saint Roque's dog has no tail." "I know this guy so well that I know what he means. Cool!" Words → Idea

9
CS needs: a nice sensing dictionary (INCOHERENCE), appropriate sensing (RANDOMNESS), a priori knowledge (SPARSENESS), and a recovery process (OPTIMIZATION).

10
Sparseness: less is more. Idea: a stranger approaching a hut by the only known road, the valley. Dictionary: how to express it? Combining elements… J.F. Cooper (Wyandotté): "He was advancing by the only road that was ever traveled by the stranger as he approached the Hut; or, he came up the valley." E.A. Poe (Comments to Wyandotté): "Hummm, you could say the same using fewer words: He was advancing by the valley, the only road traveled by a stranger approaching the Hut." SPARSER

11
Sparseness: less is more. Sparseness: the property of being small in number or amount, often scattered over a large area [Cambridge Advanced Learner's Dictionary]. (Illustration: a certain distribution vs. a sparser distribution.)

12
Sparseness: less is more. Original Einstein image vs. reconstructions taking 10% of the pixels, 10% of the Fourier coefficients, and 10% of the wavelet coefficients. Pixels: not sparse. A new domain can increase sparseness.
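As a toy version of this slide's experiment (a sketch, not the deck's actual code; it assumes NumPy and a synthetic 1-D signal in place of the Einstein image):

```python
import numpy as np

# A signal that is dense in the sample ("pixel") domain but sparse in Fourier.
n = 512
t = np.arange(n) / n
x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

# Keep only the largest 10% of Fourier coefficients, zero the rest.
X = np.fft.fft(x)
keep = n // 10
small = np.argsort(np.abs(X))[:-keep]   # indices of the smallest 90%
X[small] = 0

# Reconstruct: almost no loss, because the energy sits in a few coefficients.
x_rec = np.real(np.fft.ifft(X))
rel_err = np.linalg.norm(x - x_rec) / np.linalg.norm(x)
```

Keeping 10% of the raw samples instead would lose 90% of the signal; keeping 10% of the transform coefficients loses almost nothing, which is the sense in which a new domain increases sparseness.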

13
Sparseness: less is more. Dictionary: how to express it? Linear analysis vs. non-linear analysis of X-let elementary functions (atoms): a linear subband vs. a non-linear (SPARSER) subband. X-let-based representations are compressible, meaning that most of the energy is concentrated in few coefficients. Analysis-sense sparseness: the response of X-let filters is sparse [Mallat 89, Olshausen & Field 96]. Synthesis-sense sparseness: we can increase sparseness by non-linear analysis.

14
Sparseness: less is more. Dictionary: how to express it? Idea: combining another way… X-let elementary functions, non-linear subband: SPARSER. Taking around 3.5% of the total coefficients: by taking fewer coefficients we achieve strict sparseness, at the price of only approximating the image (PSNR: dB).

15
Incoherence. Sparse signals in a given dictionary must be dense in another, incoherent one. The sampling dictionary should be incoherent w.r.t. the one where the signal is sparse/compressible. Example: a time-sparse signal and its frequency-dense representation.
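A minimal numerical illustration of the time/frequency extreme case (assuming NumPy; the spike position is arbitrary):

```python
import numpy as np

# A maximally time-sparse signal: a single spike.
n = 64
x = np.zeros(n)
x[10] = 1.0

# Its Fourier representation is maximally dense: every frequency bin
# has exactly the same magnitude.
mags = np.abs(np.fft.fft(x))
spread = mags.max() - mags.min()
```

One nonzero sample in time spreads its energy evenly over all n frequency bins, which is exactly the density an incoherent sampling dictionary needs.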

16
Measurement and recovery processes. Measurement process: sparseness + incoherence → random sampling will do. Recovery process: numerical non-linear optimization is able to exactly recover the signal given the measurements.

17
CS relies on: A priori knowledge: many natural signals are sparse or compressible in a proper basis. Nice sensing dictionary: signals should be dense when expressed in the sampling waveforms. Appropriate sensing: random sampling has been demonstrated to work well. Recovery process: bounds for exact recovery depend on the optimization method.

18
Summary. CS is a simple and efficient signal acquisition protocol that samples at a reduced rate and later uses computational power to reconstruct the signal from what appears to be an incomplete set of measurements. CS is universal, democratic and asymmetrical.

19
CONTENTS 1. WHAT? Introduction to Compressed Sensing (CS) 2. HOW? Theory behind CS 3. FOR WHAT PURPOSE? CS applications 4. AND THEN? Active research and future lines

20
The sensing problem. x: original discrete signal (an N×1 vector). Φ: sampling dictionary (a matrix). y: sampled signal (vector).

21
The sensing problem. Traditional sampling: the sampling dictionary is the identity, y = Ix, with the sampled signal y N×1, I N×N and the original signal x N×1.

22
The sensing problem. When the signal is sparse/compressible, we can directly acquire a condensed representation with no/little information loss: y = Φx, with y M×1, Φ M×N, and x N×1 with K nonzero entries, K < M << N. Random projection will work if M = O(K log(N/K)) [Candès et al., Donoho, 2004].
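A sketch of such a measurement step, assuming NumPy and a Gaussian random Φ (one common choice; the slide does not fix a distribution):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 256, 8
m = int(4 * k * np.log(n / k))     # M = O(K log(N/K)) measurements

# A K-sparse signal: k nonzero entries at random positions.
x = np.zeros(n)
support = rng.choice(n, k, replace=False)
x[support] = rng.standard_normal(k)

# Random Gaussian sensing matrix (rows scaled by 1/sqrt(m)).
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ x                         # compressed measurements, length m << n
```

The vector y is what gets transmitted: far shorter than x, yet (by the theory in the following slides) enough to recover x exactly.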

23
Universality. Random measurements can be used if the signal is sparse/compressible in any basis: y = ΦΨa, with y M×1, Φ M×N, Ψ N×N, and a N×1 with K nonzero entries, K < M << N.

24
Good sensing waveforms? Φ and Ψ should be incoherent. Measure the largest correlation between any two elements: μ(Φ,Ψ) = √N · max_{k,j} |⟨φ_k, ψ_j⟩|. Large correlation means low incoherence. Examples: spike and Fourier bases (maximal incoherence); a random basis and any fixed basis.
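The spike/Fourier example can be checked numerically; a sketch assuming NumPy, using the mutual coherence μ(Φ,Ψ) = √N · max |⟨φ_k, ψ_j⟩|, which for orthonormal bases lies in [1, √N]:

```python
import numpy as np

n = 32
Phi = np.eye(n)                            # spike (identity) basis
F = np.fft.fft(np.eye(n)) / np.sqrt(n)     # orthonormal DFT basis

# Mutual coherence: sqrt(n) times the largest inner product magnitude
# between any spike and any Fourier atom.
mu = np.sqrt(n) * np.max(np.abs(Phi.conj().T @ F))
```

Every Fourier atom has magnitude 1/√n at every sample, so μ hits the lower bound 1: spikes and sinusoids are maximally incoherent.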

25
Solution: sensing randomly. Take M = O(K log(N/K)) random measurements → Transmit → Receive → Reconstruct. We have set up the encoder; let's now study the decoder.

26
CS recovery. Assume a is K-sparse and y = ΦΨa. We can recover a by solving: min ||a||_0 subject to y = ΦΨa, where ||a||_0 counts the number of active coefficients. This is an NP-hard (combinatorial) problem, so we use some tractable approximation.

27
Robust CS recovery. What if a is only compressible and y = Φa + n, with n an unknown error term? Isometry constant δ_K of Φ: the smallest δ_K such that, for all K-sparse vectors x, (1 − δ_K)·||x||² ≤ ||Φx||² ≤ (1 + δ_K)·||x||². Φ obeys a Restricted Isometry Property (RIP) if δ_K is not too close to 1. Φ obeying a RIP means any subset of K columns is nearly orthogonal. To recover K-sparse signals we need δ_2K < 1 (unique solution).
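A quick Monte Carlo sanity check of the near-isometry (a rough empirical proxy over random sparse directions, not a computation of the true isometry constant; assumes NumPy):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, k = 256, 100, 5
Phi = rng.standard_normal((m, n)) / np.sqrt(m)

# How far does ||Phi x||^2 stray from ||x||^2 = 1 over random
# k-sparse unit vectors?  The worst observed deviation is an
# empirical lower bound on delta_k.
worst = 0.0
for _ in range(500):
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    x /= np.linalg.norm(x)
    worst = max(worst, abs(np.linalg.norm(Phi @ x) ** 2 - 1.0))
```

For this Gaussian Φ the deviation stays well below 1, which is the RIP behavior that makes sparse recovery well-posed.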

28
Recovery techniques: minimization of the L1-norm, greedy techniques, iterative thresholding, total-variation minimization, …

29
Recovery by minimizing the L1-norm (the sum of absolute values): solve min ||a||_1 subject to y = ΦΨa. Convexity: a tractable problem, solvable by linear or second-order programming. For some constant C > 0, the L1 solution â equals a if M ≥ C·μ²(Φ,Ψ)·K·log N.
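Basis pursuit can be posed as a linear program by splitting a into positive and negative parts; a sketch assuming NumPy and SciPy's `linprog` (the sizes and seed are arbitrary):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, m, k = 64, 32, 4

# K-sparse ground truth and random Gaussian measurements y = Phi a.
a = np.zeros(n)
a[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n))
y = Phi @ a

# LP form: write a = u - v with u, v >= 0; then ||a||_1 = sum(u + v).
# Minimize sum(u + v) subject to Phi (u - v) = y.
c = np.ones(2 * n)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
a_hat = res.x[:n] - res.x[n:]
err = np.linalg.norm(a_hat - a)
```

With M = 32 measurements of a 4-sparse length-64 signal, the L1 solution coincides with the sparse ground truth, illustrating exact recovery from far fewer equations than unknowns.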

30
Recovery by minimizing the L1-norm. Noisy data: solve the LASSO problem min ||a||_1 subject to ||y − ΦΨa||_2 ≤ ε. A convex problem solvable via second-order cone programming (SOCP). If δ_2K < √2 − 1, then ||â − a||_2 ≤ C_0·||a − a_K||_1/√K + C_1·ε, where a_K is the best K-term approximation of a.

31
Example of L1 recovery. A: a 120×512 random orthonormal matrix. Perfect recovery of x by L1-minimization from y = Ax.

32
Recovery by Greedy Pursuit. Algorithm: (1) new active component: the one whose column φ_i is most correlated with y; (2) find the best approximation ŷ to y using the active components; (3) subtract ŷ from y to form the residual e; (4) set y = e and repeat. Very fast for small-scale problems, but not as accurate/robust for large signals in the presence of noise.
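The steps above are essentially Orthogonal Matching Pursuit; a minimal sketch assuming NumPy (the least-squares re-fit implements "find the best approximation using the active components"):

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: greedily pick the column of Phi most
    correlated with the residual, then re-fit by least squares."""
    n = Phi.shape[1]
    support = []
    residual = y.copy()
    for _ in range(k):
        # New active component: column most correlated with the residual.
        i = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(i)
        # Best approximation to y using the active columns.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

rng = np.random.default_rng(2)
n, m, k = 128, 48, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat = omp(Phi, Phi @ x_true, k)
err = np.linalg.norm(x_hat - x_true)
```

Each iteration costs one matrix-vector product and a small least-squares solve, which is why greedy pursuit is fast at small scale.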

33
Recovery by Iterative Thresholding. Algorithm: iterate between a shrinkage/thresholding operation and a projection onto perfect reconstruction. If soft-thresholding is used, the theory is analogous to L1-minimization. If hard-thresholding is used, the error is within a constant factor of the best attainable estimation error [Blumensath 08].
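A minimal hard-thresholding variant (iterative hard thresholding) can be sketched as follows, assuming NumPy; the spectral normalization of Φ is an assumption this sketch makes so the unit-step iteration stays stable:

```python
import numpy as np

def iht(Phi, y, k, iters=500):
    """Iterative hard thresholding: a gradient step on ||y - Phi x||^2,
    then projection onto k-sparse vectors (keep the k largest-magnitude
    entries, zero the rest)."""
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = x + Phi.T @ (y - Phi @ x)        # gradient step
        small = np.argsort(np.abs(x))[:-k]   # all but the k largest
        x[small] = 0.0                       # hard threshold
    return x

rng = np.random.default_rng(3)
n, m, k = 128, 64, 4
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k)
Phi = rng.standard_normal((m, n))
Phi /= np.linalg.norm(Phi, 2)    # unit step size needs ||Phi||_2 <= 1
x_hat = iht(Phi, Phi @ x_true, k)
err = np.linalg.norm(x_hat - x_true)
```

Swapping the hard threshold for soft-thresholding turns the same loop into the L1-style iteration the slide mentions.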

34
Recovery by TV minimization. Sparseness: signals have few jumps. Convexity: a tractable problem. Accurate and robust, but can be slow for large-scale problems.

35
Example of TV recovery. Φ: a (partial) Fourier transform. Perfect recovery of x by TV-minimization. (Figure: original x, a least-squares reconstruction x_LS, and the TV reconstruction, which equals x.)

36
Summary. Sensing: use random sampling in dictionaries with low coherence with respect to the one where the signal is sparse; choose M wisely. Recovery: a wide range of techniques is available; L1-minimization seems to work well, but choose the one that best fits your needs.

37
CONTENTS 1. WHAT? Introduction to Compressed Sensing (CS) 2. HOW? Theory behind CS 3. FOR WHAT PURPOSE? CS applications 4. AND THEN? Active research and future lines

38
Some CS applications: data compression, compressive imaging, detection/classification/estimation/learning, medical imaging, analog-to-information conversion, biosensing, geophysical data analysis, hyperspectral imaging, compressive radar, astronomy, communications, surface metrology, spectrum analysis, …

39
Data compression. The sparse basis Ψ may be unknown or impractical to implement at the encoder. A randomly designed Φ can be considered a universal encoding strategy. This may be helpful for distributed source coding in multi-signal settings [Baron et al. 05, Haupt and Nowak 06, …]

40
Magnetic resonance imaging

41
Rice Single-Pixel CS Camera

42
Rice Analog-to-Information conversion: turns an analog input signal into discrete digital measurements. An extension of the A2D converter that samples at the signal's information rate rather than its Nyquist rate.

43
CS in Astronomy [Bobin et al 08]. Desperate need for data compression; resolution, sensitivity and photometry are important. Herschel satellite (ESA, 2009): conventional compression cannot be used. CS can help with: new compressive sensors; a flexible compression/decompression scheme; computational cost O(t) vs. JPEG 2000's O(t log t); decoupling of compression and decompression. CS outperforms conventional compression.

44
CONTENTS 1. WHAT? Introduction to Compressed Sensing (CS) 2. HOW? Theory behind CS 3. FOR WHAT PURPOSE? CS applications 4. AND THEN? Active research and future lines

45
CS is a very active area

46
More than seventy 2008 papers in the CS repository. Most active areas: new applications (de-noising, learning, video, …) and new recovery methods (non-convex, variational, CoSaMP, …). ICIP 08: COMPRESSED SENSING FOR MULTI-VIEW TRACKING AND 3-D VOXEL RECONSTRUCTION; COMPRESSIVE IMAGE FUSION; IMAGE REPRESENTATION BY COMPRESSED SENSING; KALMAN FILTERED COMPRESSED SENSING; NONCONVEX COMPRESSIVE SENSING AND RECONSTRUCTION OF GRADIENT-SPARSE IMAGES: RANDOM VS. TOMOGRAPHIC FOURIER SAMPLING; …

47
Conclusions. CS is a new technique for acquiring and compressing images simultaneously. Sparseness + incoherence + random sampling allow perfect reconstruction under some conditions. A wide range of applications is possible. Big research effort now on recovery techniques.

48
Our future lines? Convex CS: TV-regularization. Non-convex CS: L0-GM for CS; intermediate norms (0 < p < 1) for CS. CS applications: super-resolved sampling? Detection, estimation, classification, …

49
Thank you See references and software here:
