Source Localization on a Budget Volkan Cevher, Rice University, with Petros Boufounos, Rich Baraniuk, Anna Gilbert, Martin Strauss, Lance Kaplan
Localization Problem Goal: Localize targets by fusing measurements from a network of sensors [Cevher, Duarte, Baraniuk; EUSIPCO 2007 | Model and Zibulevsky; SP 2006 | Cevher et al.; ICASSP 2006 | Malioutov, Cetin, and Willsky; IEEE TSP 2005 | Chen et al.; Proc. of IEEE 2003]
Localization Problem Goal: Localize targets by fusing measurements from a network of sensors –collect time signal data –communicate signals across the network –solve an optimization problem
Digital Revolution Goal: Localize targets by fusing measurements from a network of sensors –collect time signal data –communicate signals across the network –solve an optimization problem
Digital Data Acquisition Foundation: Shannon/Nyquist sampling theorem: “if you sample densely enough (at the Nyquist rate), you can perfectly reconstruct the original analog data”
Major Trends in Sensing higher resolution / denser sampling large numbers of sensors increasing # of modalities / mobility
Problems of the Current Paradigm Goal: Localize targets by fusing measurements from a network of sensors –collect time signal data: requires potentially high-rate (Nyquist) sampling –communicate signals across the network: potentially large communication burden –solve an optimization problem, e.g., MLE => Need compression
Approaches Do nothing / ignore: be content with the existing approaches –generalizes well –robust
Approaches Finite Rate of Innovation Sketching / Streaming Compressive Sensing [Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]
Approaches Finite Rate of Innovation Sketching / Streaming Compressive Sensing SPARSITY
Agenda A short review of compressive sensing Localization via dimensionality reduction –Experimental results A broader view of localization Conclusions
A Short Review of Compressive Sensing Theory
Compressive Sensing 101 Goal: Recover a sparse or compressible signal x from measurements y = Φx Problem: Random projection Φ (M × N, M < N) is not full rank, but satisfies the Restricted Isometry Property (RIP) when populated with iid Gaussian, iid Bernoulli, … entries Solution: Exploit the sparsity / compressibility geometry of the acquired signal via convex optimization or a greedy algorithm
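The measurement model above can be sketched in a few lines of NumPy; the dimensions N, M, K below are illustrative, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 8              # ambient dimension, measurements, sparsity

# K-sparse signal: only K of the N coordinates are nonzero
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# iid Gaussian random projection: M x N with M < N, so not full column rank
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

y = Phi @ x                       # the M compressive measurements
```

Note that y has far fewer entries than x; recovery is possible only because x is sparse.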
Concise Signal Structure Sparse signal: only K out of N coordinates nonzero –model: union of K-dimensional subspaces aligned w/ coordinate axes Compressible signal: sorted coordinates decay rapidly to zero (power-law decay) –well-approximated by a K-sparse signal (simply by thresholding)
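Thresholding to the K largest coefficients is easy to demonstrate; the power-law exponent below is illustrative:

```python
import numpy as np

def k_term_approx(x, K):
    """Best K-term approximation: keep the K largest-magnitude coordinates."""
    xk = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-K:]
    xk[keep] = x[keep]
    return xk

# compressible signal: sorted coefficient magnitudes decay like n^(-1.5)
n = np.arange(1, 257)
x = (-1.0) ** n * n ** -1.5

xk = k_term_approx(x, 16)
# relative error is small because only the power-law tail is discarded
rel_err = np.linalg.norm(x - xk) / np.linalg.norm(x)
```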
Restricted Isometry Property (RIP) Preserves the structure of sparse/compressible signals: RIP of order 2K implies, for all K-sparse x1 and x2, (1 - δ) ||x1 - x2||² ≤ ||Φ(x1 - x2)||² ≤ (1 + δ) ||x1 - x2||² A random subGaussian (iid Gaussian, Bernoulli) matrix has the RIP with high probability if M = O(K log(N/K))
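The near-isometry can be checked empirically on random K-sparse vectors; this is a Monte Carlo sanity check with made-up dimensions, not a proof of the RIP:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 256, 96, 4
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# the ratio ||Phi x|| / ||x|| should concentrate near 1 for sparse x
ratios = []
for _ in range(200):
    x = np.zeros(N)
    idx = rng.choice(N, size=K, replace=False)
    x[idx] = rng.standard_normal(K)
    ratios.append(np.linalg.norm(Phi @ x) / np.linalg.norm(x))
ratios = np.array(ratios)
```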
Recovery Algorithms Goal: given y = Φx, recover a sparse/compressible x Convex optimization formulations –basis pursuit, Dantzig selector, Lasso, … Greedy algorithms –orthogonal matching pursuit (OMP), iterative thresholding (IT), compressive sampling matching pursuit (CoSaMP) –at their core: iterative sparse approximation
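A minimal orthogonal matching pursuit, the simplest of the greedy family; this is a textbook sketch, not the talk's implementation, and the test problem sizes are illustrative:

```python
import numpy as np

def omp(Phi, y, K):
    """Orthogonal matching pursuit: greedily grow the support, then
    least-squares fit on the selected columns."""
    N = Phi.shape[1]
    residual, support = y.astype(float).copy(), []
    for _ in range(K):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        # least-squares fit on the current support, then update residual
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(2)
N, M, K = 128, 64, 4
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x = np.zeros(N)
idx = rng.choice(N, size=K, replace=False)
x[idx] = rng.choice([-1.0, 1.0], size=K) * rng.uniform(1.0, 2.0, size=K)
x_hat = omp(Phi, Phi @ x, K)      # noise-free measurements: exact recovery
```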
Performance of Recovery Using ℓ1 methods, IT, CoSaMP Sparse signals –noise-free measurements: exact recovery –noisy measurements: stable recovery Compressible signals –recovery as good as the best K-sparse approximation: CS recovery error <= C1 × (signal K-term approx error) + C2 × (noise)
Universality Random measurements can be used for signals sparse in any basis: x = Ψα, where the sparse coefficient vector α has K nonzero entries
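Universality can be illustrated numerically: composing a Gaussian Φ with any orthonormal basis Ψ (here a DCT-II basis, chosen for illustration) yields an effective matrix ΦΨ that is again iid Gaussian in distribution, so sparse coefficient vectors are treated just like sparse signals:

```python
import numpy as np

rng = np.random.default_rng(3)
N, M, K = 128, 64, 6

# orthonormal DCT-II basis: any x can be written x = Psi @ alpha
n = np.arange(N)
Psi = np.cos(np.pi * (n[:, None] + 0.5) * n[None, :] / N)
Psi[:, 0] *= np.sqrt(1.0 / N)
Psi[:, 1:] *= np.sqrt(2.0 / N)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)
A = Phi @ Psi      # effective sensing matrix seen by the sparse coefficients

# by rotation invariance of the Gaussian, A preserves norms of sparse
# coefficient vectors just as Phi preserves norms of sparse signals
ratios = []
for _ in range(200):
    alpha = np.zeros(N)
    idx = rng.choice(N, size=K, replace=False)
    alpha[idx] = rng.standard_normal(K)
    ratios.append(np.linalg.norm(A @ alpha) / np.linalg.norm(alpha))
ratios = np.array(ratios)
```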
Signal recovery is not always required. ELVIS: Enhanced Localization via Incoherence and Sparsity (Back to Localization)
An Important Detail Solve two entangled problems for localization –estimate source locations –estimate source signals
Today Instead, solve one localization problem –estimate source locations by exploiting random projections of observed signals, without estimating the source signals
ELVIS Instead, solve one localization problem –estimate source locations by exploiting random projections of observed signals, without estimating the source signals Bayesian model order selection & MAP estimation in a decentralized sparse approximation framework that leverages –source sparsity –incoherence of sources –spatial sparsity of sources [VC, Boufounos, Baraniuk, Gilbert, Strauss; IPSN’09]
Problem Setup Discretize space into a localization grid with N grid points –fixes localization resolution –P sensors do not have to be on grid points
Localization as Sparse Approximation localization grid actual sensor measurements true target location local localization dictionary
Multiple Targets localization grid actual sensor measurements 2 true target locations
Local Dictionaries Sample the sparse / compressible signal using CS –Fourier sampling [Gilbert, Strauss] Calculate the local dictionary at sensor i using its measurements –for all grid positions n = 1, …, N: assume that the target is at grid position n –for all sensors j = 1, …, P: use the Green’s function to estimate the signal sensor j would measure if the target were at position n
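A toy version of this construction, assuming a free-space Green's function (pure delay plus 1/r attenuation), a known source waveform, and made-up geometry; `local_dictionary` and all parameters below are illustrative, not the talk's actual dictionary:

```python
import numpy as np

def local_dictionary(sensor_pos, grid_pos, source_sig, fs, c=343.0):
    """Column n stacks the signals all P sensors would observe if the
    source sat at grid point n: the source waveform delayed by r/c and
    attenuated by 1/r (free-space Green's function, applied in the
    frequency domain)."""
    T = len(source_sig)
    S = np.fft.rfft(source_sig)
    freqs = np.fft.rfftfreq(T, d=1.0 / fs)
    cols = []
    for g in grid_pos:
        per_sensor = []
        for p in sensor_pos:
            r = max(np.linalg.norm(g - p), 1e-3)   # avoid divide-by-zero
            H = np.exp(-2j * np.pi * freqs * r / c) / r
            per_sensor.append(np.fft.irfft(S * H, n=T))
        cols.append(np.concatenate(per_sensor))
    return np.array(cols).T                        # shape (P*T, N)

# toy setup: 3 sensors, a 4-point grid, a short noise-like source
rng = np.random.default_rng(5)
sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
grid = np.array([[2.0, 2.0], [2.0, 8.0], [8.0, 2.0], [8.0, 8.0]])
D = local_dictionary(sensors, grid, rng.standard_normal(128), fs=1000.0)
```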
Valid Dictionaries Sparse approximation works when the dictionary columns are mutually incoherent True when the target signal has a fast-decaying autocorrelation Extends to multiple targets with small cross-correlations
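The fast-decay condition is easy to sanity-check: a noise-like source has a normalized autocorrelation that drops immediately off the zero lag (toy check with synthetic white noise, not the vehicle data):

```python
import numpy as np

def normalized_acf(x, lag):
    """Normalized autocorrelation at a given positive lag."""
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

rng = np.random.default_rng(4)
s = rng.standard_normal(512)       # white-noise-like source signal

# off-peak autocorrelation stays small => mutually incoherent columns
peak_off_lag = max(abs(normalized_acf(s, lag)) for lag in range(1, 50))
```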
Typical Correlation Functions (field data) ACFs: Toyota Prius, Isuzu Rodeo, Chevy Camaro CCFs: Rodeo vs. Prius, Rodeo vs. Camaro, Camaro vs. Prius
An Important Issue Need to send the observed signals across the network
Enter Compressive Sensing Sparse localization vector => acquire and transmit compressive measurements of the actual observations without losing information
So Far… Use random projections of observed signals two ways: –create local sensor dictionaries that sparsify source locations –create intersensor communication messages (K targets on N-dim grid) populated using recovered signals random iid
ELVIS Highlights Use random projections of observed signals two ways: –create local sensor dictionaries (the ELVIS dictionary) that sparsify source locations => sample at source sparsity –create intersensor communication messages => communicate at spatial sparsity; robust to (i) quantization (1-bit quantization; see paper) and (ii) packet drops No signal reconstruction Provable greedy estimation for ELVIS dictionaries Bearing pursuit: computationally efficient reconstruction
Experiments
Field Data Results 5-vehicle convoy, >100× sub-Nyquist sampling
Sensing System Problems Common theme so far: sensors -> representations -> metrics -> “do our best” Purpose of deployment –multi-objective: sensing, lifetime, connectivity, coverage, reliability, etc. Limited resources -> conflicts among objectives
Diversity of Objectives Multiple objectives (and their utility units) –localization: area –lifetime: time –connectivity: probability –coverage: area –reliability: probability Unifying framework –utility
Optimality Pareto efficiency –Economics / optimization literature Pareto Frontier –a fundamental limit for achievable utilities
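For a finite set of candidate designs the Pareto frontier is simple to compute; the (lifetime, localization) utilities below are made up for illustration:

```python
def pareto_frontier(points):
    """Return the non-dominated points when every coordinate is a
    utility to be maximized."""
    frontier = []
    for p in points:
        # p is dominated if some other point is at least as good everywhere
        dominated = any(
            q != p and all(qi >= pi for qi, pi in zip(q, p))
            for q in points
        )
        if not dominated:
            frontier.append(p)
    return frontier

# toy (lifetime, localization-accuracy) utilities for candidate deployments
designs = [(1.0, 0.2), (0.8, 0.5), (0.5, 0.5), (0.3, 0.9), (0.2, 0.8)]
frontier = pareto_frontier(designs)
# frontier keeps (1.0, 0.2), (0.8, 0.5), (0.3, 0.9); the rest are dominated
```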
Pareto Frontiers for Localization Mathematical framework for multi-objective design (the best sensor portfolio) [VC, Kaplan; IPSN’09, TOSN’09] Elements of the design –constant budget -> optimization polytope –sensor dictionary -> measurement type (bearing or range) and error, sensor reliability, field-of-view, sensing range, and mobility (sensor costs spanning $10, $30, $200, $1M, $5M) –random deployment -> optimize expected / worst-case utilities –communications -> bearing or range
Statement of Results – 1 Theory to predict the localization performance with management –signals -> performance characterizations –key idea: duality among sensors <=> existence of a reference sensing system Provable diminishing returns
Statement of Results – 2 Optimal heterogeneity –sparse solutions bounded by the # of objectives –key theorems: concentration of resources, dominating sensor pairs Solution algorithms –integer optimization
Conclusions CS –sensing via dimensionality reduction ELVIS –source localization via dimensionality reduction –provable and efficient recovery via bearing pursuit Current work –clock synchronization –sensor position errors via linear filtering –Pareto Frontiers w/ ELVIS: reactive systems
Questions? Volkan Cevher