1 Source Localization on a budget Volkan Cevher volkan@rice.edu Rice University with Petros Boufounos, Rich Baraniuk, Anna Gilbert, Martin Strauss, and Lance Kaplan

2 Localization Problem Goal: Localize targets by fusing measurements from a network of sensors [Cevher, Duarte, Baraniuk; EUSIPCO 2007 | Model and Zibulevsky; SP 2006 | Cevher et al.; ICASSP 2006 | Malioutov, Cetin, and Willsky; IEEE TSP 2005 | Chen et al.; Proc. of IEEE 2003]

3 Localization Problem Goal: Localize targets by fusing measurements from a network of sensors –collect time signal data –communicate signals across the network –solve an optimization problem

4 Digital Revolution Goal: Localize targets by fusing measurements from a network of sensors –collect time signal data –communicate signals across the network –solve an optimization problem

5 Digital Data Acquisition Foundation: Shannon/Nyquist sampling theorem (in time and in space): “if you sample densely enough (at the Nyquist rate), you can perfectly reconstruct the original analog data”
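For reference, a standard statement of the theorem the slide quotes, with the symbols f_s (sampling rate) and B (signal bandwidth) introduced here rather than taken from the slide:

    % Shannon/Nyquist sampling theorem (standard form)
    % A signal x(t) bandlimited to B Hz is exactly recovered from its
    % uniform samples x(n/f_s) whenever the sampling rate exceeds 2B:
    \[
      x(t) = \sum_{n=-\infty}^{\infty} x\!\Big(\tfrac{n}{f_s}\Big)\,
             \operatorname{sinc}\!\big(f_s t - n\big),
      \qquad f_s > 2B .
    \]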

6 Major Trends in Sensing –higher resolution / denser sampling –large numbers of sensors –increasing # of modalities / mobility

7 Problems of the Current Paradigm Goal: Localize targets by fusing measurements from a network of sensors –collect time signal data → requires potentially high-rate (Nyquist) sampling –communicate signals across the network → potentially large communication burden –solve an optimization problem → e.g., MLE Need compression

8 Approaches Do nothing / Ignore: be content with the existing approaches –generalizes well –robust

9 Approaches Finite Rate of Innovation Sketching / Streaming Compressive Sensing [Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]

10 Approaches Finite Rate of Innovation Sketching / Streaming Compressive Sensing SPARSITY

11 Agenda A short review of compressive sensing Localization via dimensionality reduction –Experimental results A broader view of localization Conclusions

12 A Short Review of Compressive Sensing Theory

13 Compressive Sensing 101 Goal: Recover a sparse or compressible signal from measurements Problem: Random projection not full rank Solution: Exploit the sparsity / compressibility geometry of acquired signal

14 Goal: Recover a sparse or compressible signal from measurements Problem: Random projection not full rank but satisfies Restricted Isometry Property (RIP) Solution: Exploit the sparsity / compressibility geometry of acquired signal –iid Gaussian –iid Bernoulli –… Compressive Sensing 101

15 Goal: Recover a sparse or compressible signal from measurements Problem: Random projection not full rank but satisfies Restricted Isometry Property (RIP) Solution: Exploit the sparsity / compressibility geometry of acquired signal via convex optimization or greedy algorithm –iid Gaussian –iid Bernoulli –… Compressive Sensing 101
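As a minimal sketch of the setup behind slides 13–15 (the symbols y, Φ, x, M, N, K are standard compressive-sensing notation, not copied from the slides): the sensor takes M ≪ N random projections of an N-dimensional K-sparse signal and recovers it by, e.g., basis pursuit.

    % Measurement model and one convex recovery formulation (basis pursuit)
    \[
      y = \Phi x, \qquad \Phi \in \mathbb{R}^{M \times N},\; M \ll N,\;
      \|x\|_0 \le K,
    \]
    \[
      \hat{x} = \arg\min_{z \in \mathbb{R}^{N}} \|z\|_1
      \quad \text{subject to} \quad \Phi z = y .
    \]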

16 Concise Signal Structure Sparse signal: only K out of N coordinates nonzero –model: union of K-dimensional subspaces aligned w/ coordinate axes [plot: coefficient magnitude vs. sorted index]

17 Concise Signal Structure Sparse signal: only K out of N coordinates nonzero –model: union of K-dimensional subspaces Compressible signal: sorted coordinates decay rapidly to zero (power-law decay) –well-approximated by a K-sparse signal (simply by thresholding) [plot: coefficient magnitude vs. sorted index]
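A small numpy sketch of the thresholding step mentioned above: keep the K largest-magnitude coefficients of a compressible signal and drop the rest. The function and variable names are illustrative assumptions, not from the talk.

    import numpy as np

    def best_k_term(x, K):
        """Best K-term approximation: keep the K largest |x_i|, zero the rest."""
        x_k = np.zeros_like(x)
        keep = np.argsort(np.abs(x))[-K:]      # indices of the K largest coefficients
        x_k[keep] = x[keep]
        return x_k

    # A power-law-decaying (compressible) coefficient vector is captured
    # almost entirely by its K-sparse thresholded version.
    N, K = 1024, 20
    coeffs = np.sign(np.random.randn(N)) * np.arange(1, N + 1, dtype=float) ** -1.2
    x_k = best_k_term(coeffs, K)
    print(np.linalg.norm(coeffs - x_k) / np.linalg.norm(coeffs))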

18 Restricted Isometry Property (RIP) Preserve the structure of sparse/compressible signals RIP of order 2K implies: for all K-sparse x1 and x2, pairwise distances are approximately preserved [figure: K-planes]

19 Restricted Isometry Property (RIP) Preserve the structure of sparse/compressible signals Random subGaussian (iid Gaussian, Bernoulli) matrix has the RIP with high probability when the number of measurements M is on the order of K log(N/K) [figure: K-planes]
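The inequality behind the last two slides, in its standard form (the constant δ and the bound on M are the usual ones, stated here as background rather than read off the slide):

    % RIP of order 2K with constant delta: for all K-sparse x_1, x_2
    \[
      (1-\delta)\,\|x_1 - x_2\|_2^2 \;\le\; \|\Phi(x_1 - x_2)\|_2^2
      \;\le\; (1+\delta)\,\|x_1 - x_2\|_2^2 ,
    \]
    % and a random subGaussian matrix satisfies this with high probability once
    \[
      M \;\ge\; c\, K \log(N/K)
    \]
    % for a constant c depending on delta.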

20 Recovery Algorithms Goal: given the measurements y and the measurement matrix, recover the sparse/compressible signal Convex optimization formulations –basis pursuit, Dantzig selector, Lasso, … Greedy algorithms –orthogonal matching pursuit (OMP), iterative thresholding (IT), compressive sampling matching pursuit (CoSaMP) –at their core: iterative sparse approximation
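A compact sketch of orthogonal matching pursuit, one of the greedy algorithms listed above (unoptimized and illustrative; the function name and demo parameters are assumptions, not the talk's code):

    import numpy as np

    def omp(Phi, y, K):
        """Greedily recover a K-sparse x from y = Phi @ x (possibly noisy)."""
        M, N = Phi.shape
        residual, support = y.copy(), []
        for _ in range(K):
            # pick the column most correlated with the current residual
            j = int(np.argmax(np.abs(Phi.T @ residual)))
            if j not in support:
                support.append(j)
            # least-squares re-fit on the enlarged support
            coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
            residual = y - Phi[:, support] @ coef
        x_hat = np.zeros(N)
        x_hat[support] = coef
        return x_hat

    # Tiny demo: recover a K-sparse signal from M random Gaussian measurements.
    rng = np.random.default_rng(0)
    N, M, K = 256, 64, 8
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    x = np.zeros(N)
    x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
    print(np.linalg.norm(omp(Phi, Phi @ x, K) - x))   # ~0 when recovery succeeds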

21 Performance of Recovery Using ℓ1-based methods, IT, CoSaMP Sparse signals –noise-free measurements: exact recovery –noisy measurements: stable recovery Compressible signals –recovery as good as K-sparse approximation: CS recovery error ≤ constant × (signal K-term approximation error) + constant × (noise)
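The flattened error bound at the end of the slide, written out in the standard form such guarantees take (the constants C_0, C_1 are the usual symbols, not the slide's):

    % Stable/robust recovery: the error is controlled by the best K-term
    % approximation error of the signal plus the measurement noise.
    \[
      \|x - \hat{x}\|_2 \;\le\;
      C_0\, \frac{\|x - x_K\|_1}{\sqrt{K}} \;+\; C_1\, \|n\|_2 ,
    \]
    % where x_K is the best K-sparse approximation of x and n is the noise.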

22 Universality Random measurements can be used for signals sparse in any basis

23 Universality Random measurements can be used for signals sparse in any basis

24 Universality Random measurements can be used for signals sparse in any basis [equation annotation: sparse coefficient vector with K nonzero entries]
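A one-line sketch of why universality holds (standard symbols, introduced here for illustration): if the signal is sparse in some basis Ψ, the same random Φ works because the product ΦΨ is again a suitable random measurement matrix acting on the sparse coefficient vector.

    % Signal sparse in basis Psi: alpha has only K nonzero entries
    \[
      x = \Psi \alpha, \qquad y = \Phi x = (\Phi \Psi)\,\alpha ,
      \qquad \|\alpha\|_0 \le K .
    \]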

25

26 Signal recovery is not always required. ELVIS: Enhanced Localization via Incoherence and Sparsity (Back to Localization)

27 An Important Detail Solve two entangled problems for localization –estimate source locations –estimate source signals

28 Today Instead, solve one localization problem –estimate source locations by exploiting random projections of observed signals –estimate source signals

29 ELVIS Instead, solve one localization problem –estimate source locations by exploiting random projections of observed signals –estimate source signals Bayesian model order selection & MAP estimation in a decentralized sparse approximation framework that leverages –source sparsity –incoherence of sources –spatial sparsity of sources [VC, Boufounos, Baraniuk, Gilbert, Strauss; IPSN’09]

30 Problem Setup Discretize space into a localization grid with N grid points –fixes localization resolution –P sensors do not have to be on grid points

31 Localization as Sparse Approximation [figure: localization grid with the true target location; actual sensor measurements matched against a local localization dictionary]

32 Multiple Targets [figure: localization grid with 2 true target locations and the actual sensor measurements]
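As a hedged sketch of the formulation the two figures depict (the symbols y_i, Ψ_i, and b are introduced here for illustration and are not necessarily the talk's notation): each sensor's observation is modeled as a sparse combination of dictionary columns indexed by grid position, so the support of the coefficient vector marks the target locations.

    % Localization as sparse approximation over the N-point grid
    \[
      y_i \;\approx\; \Psi_i\, b, \qquad b \in \mathbb{R}^{N},\;
      \|b\|_0 = K \ll N ,
    \]
    % column n of Psi_i predicts what sensor i would observe for a target at
    % grid point n; the K nonzero entries of b are the occupied grid points.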

33 Local Dictionaries Sample the sparse / compressible signal using CS –Fourier sampling [Gilbert, Strauss] Calculate the local dictionary at sensor i using its measurements –for all grid positions n=1,…,N: assume that the target is at grid position n –for all sensors j=1,…,P: use the Green's function to estimate the signal sensor j would measure if the target were at position n
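A minimal sketch of the dictionary-building loop described above, using a simple free-space Green's function (1/r attenuation plus propagation delay) as a stand-in propagation model; the model, function names, and parameters are illustrative assumptions, not the talk's implementation.

    import numpy as np

    def local_dictionary(sensor_pos, grid, source_sig, fs=1000.0, c=343.0):
        """Column n: the signal this sensor would record for a target at grid[n]."""
        N, L = len(grid), len(source_sig)
        Psi = np.zeros((L, N))
        spectrum = np.fft.rfft(source_sig)
        freqs = np.fft.rfftfreq(L, d=1.0 / fs)
        for n, g in enumerate(grid):
            r = np.linalg.norm(sensor_pos - g) + 1e-6   # sensor-to-grid distance
            delay = r / c                               # propagation delay [s]
            # free-space Green's function: 1/r attenuation and a phase delay
            Psi[:, n] = np.fft.irfft(
                spectrum * np.exp(-2j * np.pi * freqs * delay), n=L) / r
        return Psi

    # Example: 10 x 10 localization grid; the sensor need not sit on a grid point.
    grid = np.array([[i, j] for i in range(10) for j in range(10)], dtype=float)
    sensor = np.array([2.3, 7.1])
    Psi = local_dictionary(sensor, grid, np.random.randn(256))
    print(Psi.shape)   # (256, 100): one column per grid position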

34 Valid Dictionaries Sparse approximation works when the columns of the dictionary are mutually incoherent True when the target signal has a fast-decaying autocorrelation Extends to multiple targets with small cross-correlation
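A quick numerical check of the incoherence condition above, stated as the usual mutual-coherence number (an illustrative diagnostic, not a step taken from the talk):

    import numpy as np

    def mutual_coherence(Psi):
        """Largest normalized inner product between distinct dictionary columns."""
        G = Psi / np.linalg.norm(Psi, axis=0, keepdims=True)   # unit-norm columns
        gram = np.abs(G.T @ G)
        np.fill_diagonal(gram, 0.0)                            # drop self-products
        return float(gram.max())

    # Values well below 1 mean the sparse-approximation step can tell grid
    # positions apart, mirroring the fast-decaying autocorrelation and small
    # cross-correlation conditions on the slide.
    print(mutual_coherence(np.random.randn(256, 100)))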

35 Typical Correlation Functions [figure panels: ACF of Toyota Prius, Isuzu Rodeo, and Chevy Camaro; CCF of Rodeo vs. Prius, Rodeo vs. Camaro, and Camaro vs. Prius]

36 An Important Issue [figure: localization grid, actual sensor measurements] Need to send the measured signals across the network

37 Enter Compressive Sensing Sparse localization vector → acquire and transmit compressive measurements of the actual observations without losing information

38 So Far… Use random projections of observed signals in two ways: –create local sensor dictionaries that sparsify source locations –create intersensor communication messages (K targets on an N-dim grid) [equation annotations: populated using recovered signals; random iid]

39 ELVIS Highlights Use random projections of observed signals in two ways: –create local sensor dictionaries that sparsify source locations → sample at the source sparsity –create intersensor communication messages → communicate at the spatial sparsity Robust to (i) quantization (1-bit quantization; see paper) and (ii) packet drops No Signal Reconstruction [diagram: ELVIS dictionary]

40 ELVIS Highlights Use random projections of observed signals in two ways: –create local sensor dictionaries that sparsify source locations → sample at the source sparsity –create intersensor communication messages → communicate at the spatial sparsity Robust to (i) quantization (1-bit quantization; see paper) and (ii) packet drops Provable greedy estimation for ELVIS dictionaries Bearing pursuit – computationally efficient reconstruction No Signal Reconstruction

41 Experiments

42 Field Data Results 5-vehicle convoy, >100× sub-Nyquist

43

44 Sensing System Problems Common theme so far… sensors → representations → metrics → “do our best” Purpose of deployment –Multi-objective: sensing, lifetime, connectivity, coverage, reliability, etc.

45 Competition among Objectives Common theme so far… sensors → representations → metrics → “do our best” Purpose of deployment –Multi-objective: sensing, lifetime, connectivity, coverage, reliability, etc. Limited resources → conflicts in objectives

46 Diversity of Objectives Multiple objectives –localization: area –lifetime: time –connectivity: probability –coverage: area –reliability: probability Unifying framework –utility

47 Optimality Pareto efficiency –Economics / optimization literature Pareto Frontier –a fundamental limit for achievable utilities

48 Pareto Frontiers for Localization Mathematical framework for multi-objective design (best sensor portfolio) Elements of the design –constant budget → optimization polytope –sensor dictionary –random deployment –communications [VC, Kaplan; IPSN’09, TOSN’09]
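As a generic illustration of “constant budget → optimization polytope” (a hedged sketch, not the paper's exact formulation): with per-unit cost c_i for sensor type i and total budget B, the feasible sensor portfolios n = (n_1, …, n_S) form a polytope, and the Pareto frontier is traced by maximizing one utility subject to lower bounds on the others over that set.

    % Budget polytope and one scalarization of the multi-objective design
    \[
      \mathcal{P} = \{\, n \ge 0 \;:\; c^{\mathsf{T}} n \le B \,\},
    \]
    \[
      \max_{n \in \mathcal{P}} \; U_{\text{loc}}(n)
      \quad \text{subject to} \quad U_k(n) \ge u_k
      \;\; \text{for the other objectives } k
      \;\; (\text{lifetime, connectivity, coverage, reliability}).
    \]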

49 Pareto Frontiers for Localization Mathematical framework for multi-objective design Elements of the design –constant budget –sensor dictionary → measurement type (bearing or range) and error, sensor reliability, field-of-view, sensing range, and mobility [figure: example sensors with costs $10, $30, $200, $1M, $5M]

50 Pareto Frontiers for Localization Mathematical framework for multi-objective design Elements of the design –constant budget –sensor dictionary –random deployment → optimize expected / worst-case utilities –communications

51 Pareto Frontiers for Localization Mathematical framework for multi-objective design Elements of the design –constant budget –sensor dictionary –random deployment –communications → bearing or range

52 Statement of Results – 1 Theory to predict the localization performance with management –signals → performance characterizations –key idea: duality among sensors ↔ existence of a reference sensing system Provable diminishing returns

53 Statement of Results – 2 Optimal heterogeneity –sparse solutions bounded by # of objectives –key theorems: concentration of resources, dominating sensor pairs Solution algorithms –integer optimization

54 Conclusions CS –sensing via dimensionality reduction ELVIS –source localization via dimensionality reduction –provable and efficient recovery via bearing pursuit Current work –clock synchronization –sensor position errors via linear filtering Pareto Frontiers w/ ELVIS: reactive systems

55 Questions? Volkan Cevher / volkan@rice.edu

