Source Localization on a Budget — Volkan Cevher, Rice University, with Petros, Rich, Anna, Martin, Lance


Localization Problem Goal: Localize targets by fusing measurements from a network of sensors [Cevher, Duarte, Baraniuk; EUSIPCO 2007 | Model and Zibulevsky; SP 2006 | Cevher et al.; ICASSP 2006 | Malioutov, Cetin, and Willsky; IEEE TSP 2005 | Chen et al.; Proc. of IEEE 2003]

Localization Problem Goal: Localize targets by fusing measurements from a network of sensors –collect time signal data –communicate signals across the network –solve an optimization problem

Digital Revolution Goal: Localize targets by fusing measurements from a network of sensors –collect time signal data –communicate signals across the network –solve an optimization problem

Digital Data Acquisition Foundation: Shannon/Nyquist sampling theorem: “if you sample densely enough (at the Nyquist rate), you can perfectly reconstruct the original analog data”

Major Trends in Sensing higher resolution / denser sampling large numbers of sensors increasing # of modalities / mobility

Problems of the Current Paradigm Goal: Localize targets by fusing measurements from a network of sensors –collect time signal data → requires potentially high-rate (Nyquist) sampling –communicate signals across the network → potentially large communication burden –solve an optimization problem → e.g., MLE Need compression

Approaches Do nothing / Ignore: be content with the existing approaches –generalizes well –robust

Approaches Finite Rate of Innovation Sketching / Streaming Compressive Sensing [Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]

Approaches Finite Rate of Innovation Sketching / Streaming Compressive Sensing (common theme: SPARSITY)

Agenda A short review of compressive sensing Localization via dimensionality reduction –Experimental results A broader view of localization Conclusions

A Short Review of Compressive Sensing Theory

Compressive Sensing 101 Goal: Recover a sparse or compressible signal from measurements Problem: Random projection not full rank, but satisfies the Restricted Isometry Property (RIP) –iid Gaussian –iid Bernoulli –… Solution: Exploit the sparsity / compressibility geometry of the acquired signal via convex optimization or a greedy algorithm
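The measurement model on these slides can be sketched in a few lines; the sizes and the iid Gaussian measurement matrix below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 5           # ambient dimension, measurements, sparsity

# K-sparse signal: only K of the N coordinates are nonzero
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# Random projection: an M x N iid Gaussian matrix with M << N (not full column rank)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                    # compressive measurements: far fewer numbers than N
```

With M = 64 rows and N = 256 columns, the projection cannot be inverted in general; recovery must lean on the sparsity of x, as the next slides describe.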

Concise Signal Structure Sparse signal: only K out of N coordinates nonzero –model: union of K-dimensional subspaces aligned w/ coordinate axes Compressible signal: sorted coordinates decay rapidly to zero (power-law decay) –well-approximated by a K-sparse signal (simply by thresholding)
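The thresholding claim is easy to check numerically; this sketch builds a power-law-compressible vector (the exponent 1.5 is an arbitrary choice) and keeps only its K largest-magnitude entries:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 1024, 20

# Compressible signal: coefficient magnitudes follow a power law
mags = np.arange(1, N + 1, dtype=float) ** -1.5
x = rng.choice([-1.0, 1.0], size=N) * mags
rng.shuffle(x)

# Best K-term approximation: keep the K largest-magnitude coordinates
keep = np.argsort(np.abs(x))[-K:]
xK = np.zeros(N)
xK[keep] = x[keep]

rel_err = np.linalg.norm(x - xK) / np.linalg.norm(x)
```

Most of the energy sits in a few coordinates, so the K-term approximation error is a small fraction of the signal norm.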

Restricted Isometry Property (RIP) Preserve the structure of sparse/compressible signals –RIP of order 2K implies, for all K-sparse x1 and x2: (1 − δ2K)‖x1 − x2‖² ≤ ‖Φ(x1 − x2)‖² ≤ (1 + δ2K)‖x1 − x2‖² –a random subGaussian (iid Gaussian, Bernoulli) matrix has the RIP with high probability if M = O(K log(N/K))
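A minimal empirical check of the near-isometry, with illustrative sizes (M = 128 is on the order of K log(N/K) here); the worst deviation of the squared-norm ratio from 1 plays the role of the RIP constant:

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 512, 8
M = 128                         # on the order of K * log(N / K) measurements

Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# Check that Phi nearly preserves the norm of many random K-sparse vectors
worst = 0.0
for _ in range(200):
    x = np.zeros(N)
    x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
    ratio = np.linalg.norm(Phi @ x) ** 2 / np.linalg.norm(x) ** 2
    worst = max(worst, abs(ratio - 1.0))
```

This samples random sparse vectors rather than certifying all of them (certification is combinatorial), but it shows the concentration the RIP formalizes.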

Recovery Algorithms Goal: given y = Φx, recover x Convex optimization formulations –basis pursuit, Dantzig selector, Lasso, … Greedy algorithms –orthogonal matching pursuit, iterative thresholding (IT), compressive sampling matching pursuit (CoSaMP) –at their core: iterative sparse approximation
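As a concrete instance of the greedy family, here is a bare-bones orthogonal matching pursuit sketch (not the paper's implementation); in this over-measured, noise-free regime it typically recovers the support exactly:

```python
import numpy as np

def omp(Phi, y, K):
    """Orthogonal matching pursuit: greedy iterative sparse approximation."""
    support, residual = [], y.copy()
    for _ in range(K):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        # least-squares fit on the chosen support, then update the residual
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(3)
N, M, K = 256, 100, 5
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

x_hat = omp(Phi, Phi @ x, K)
rel_err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
```

After each least-squares step the residual is orthogonal to the selected columns, so the algorithm never re-selects an atom; that orthogonalization is what distinguishes OMP from plain matching pursuit.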

Performance of Recovery Using ℓ1 methods, IT, CoSaMP Sparse signals –noise-free measurements: exact recovery –noisy measurements: stable recovery Compressible signals –recovery as good as K-sparse approximation CS recovery error ≤ C1 · (signal K-term approximation error) + C2 · (noise)

Universality Random measurements can be used for signals sparse in any basis: x = Ψα with a sparse coefficient vector α (K nonzero entries)
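Universality follows from the rotation invariance of the Gaussian distribution: if Φ is iid Gaussian and Ψ is orthonormal, then ΦΨ is again iid Gaussian. A quick numerical illustration (sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
N, M, K = 128, 60, 4

# An arbitrary orthonormal basis Psi (here a random one, via QR)
Psi, _ = np.linalg.qr(rng.standard_normal((N, N)))

# alpha is K-sparse; x = Psi @ alpha is dense in the canonical basis
alpha = np.zeros(N)
alpha[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
x = Psi @ alpha

# The same random Gaussian measurements apply: Phi @ Psi is again
# Gaussian-distributed, so it still roughly preserves the norm of
# the sparse coefficient vector alpha
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
A = Phi @ Psi
ratio = np.linalg.norm(A @ alpha) ** 2 / np.linalg.norm(alpha) ** 2
```

The signal x itself has almost no zero entries, yet recovery can proceed on the sparse α through the effective matrix ΦΨ, with the sensing stage never needing to know Ψ.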

Signal recovery is not always required. ELVIS: Enhanced Localization via Incoherence and Sparsity (Back to Localization)

An Important Detail Solve two entangled problems for localization –estimate source locations –estimate source signals

Today Instead, solve one localization problem –estimate source locations by exploiting random projections of observed signals –estimate source signals

ELVIS Instead, solve one localization problem –estimate source locations by exploiting random projections of observed signals –estimate source signals Bayesian model order selection & MAP estimation in a decentralized sparse approximation framework that leverages –source sparsity –incoherence of sources –spatial sparsity of sources [VC, Boufounos, Baraniuk, Gilbert, Strauss; IPSN’09]

Problem Setup Discretize space into a localization grid with N grid points –fixes localization resolution –P sensors do not have to be on grid points

Localization as Sparse Approximation [figure: localization grid, actual sensor measurements, true target location, local localization dictionary]

Multiple Targets [figure: localization grid, actual sensor measurements, 2 true target locations]

Local Dictionaries Sample the sparse / compressible signal using CS –Fourier sampling [Gilbert, Strauss] Calculate the dictionary at sensor i using its measurements –for all grid positions n=1,…,N: assume the target is at grid position n –for all sensors j=1,…,P: use the Green's function to estimate the signal sensor j would measure if the target were at position n
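A toy sketch of this dictionary construction, assuming a simple free-space propagation model (1/r attenuation and integer-sample delays); the function name, sampling rate, and propagation model are all illustrative, not the paper's exact construction:

```python
import numpy as np

def local_dictionary(sensor_pos, grid, template, c=343.0, fs=1000.0):
    """Column n holds the (toy) signal the sensor would record if the
    source sat at grid point n: a delayed, 1/r-attenuated template."""
    T = len(template)
    D = np.zeros((T, len(grid)))
    for n, g in enumerate(grid):
        r = max(np.linalg.norm(sensor_pos - g), 1e-3)   # source-sensor range
        d = int(round(r / c * fs))                      # propagation delay, samples
        D[d:, n] = template[: T - d] / r
    return D

# 5 x 5 localization grid; the sensor need not sit on a grid point
grid = np.array([(gx, gy) for gx in range(0, 50, 10)
                          for gy in range(0, 50, 10)], dtype=float)
sensor = np.array([5.0, 5.0])
template = np.sin(2 * np.pi * 50.0 * np.arange(256) / 1000.0)

D = local_dictionary(sensor, grid, template)
```

Each grid hypothesis yields one column, so matching the sensor's observation against the columns is exactly the sparse-approximation problem the preceding slides set up.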

Valid Dictionaries Sparse approximation works when the columns of the dictionary are mutually incoherent –true when the target signal has a fast-decaying autocorrelation –extends to multiple targets with small cross-correlations
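Mutual incoherence is cheap to verify; this sketch measures the largest normalized inner product between distinct dictionary columns (a repeated column drives it to 1):

```python
import numpy as np

def mutual_coherence(D):
    """Largest |inner product| between distinct, normalized columns."""
    Dn = D / np.linalg.norm(D, axis=0)
    G = np.abs(Dn.T @ Dn)
    np.fill_diagonal(G, 0.0)
    return float(G.max())

rng = np.random.default_rng(5)

# Random columns in high dimension are nearly orthogonal (incoherent) ...
D_good = rng.standard_normal((200, 400))

# ... while a repeated column makes the dictionary maximally coherent
D_bad = np.hstack([D_good, D_good[:, :1]])
```

In the localization setting, a slowly decaying autocorrelation plays the role of the repeated column: nearby grid hypotheses produce near-identical dictionary columns and sparse approximation loses its discriminative power.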

Typical Correlation Functions [figure: autocorrelation functions (ACFs) of a Toyota Prius, Isuzu Rodeo, and Chevy Camaro; cross-correlation functions (CCFs) for Rodeo vs. Prius, Rodeo vs. Camaro, and Camaro vs. Prius]

An Important Issue [figure: localization grid, actual sensor measurements] Need to send the sensor measurements across the network

Enter Compressive Sensing Sparse localization vector → acquire and transmit compressive measurements of the actual observations without losing information

So Far… Use random projections of observed signals two ways: –create local sensor dictionaries that sparsify source locations –create intersensor communication messages (K targets on an N-dim grid), populated using recovered signals via a random iid matrix

ELVIS Highlights Use random projections of observed signals two ways: –create local sensor dictionaries that sparsify source locations → sample at source sparsity –create intersensor communication messages → communicate at spatial sparsity –robust to (i) quantization (1-bit quantization, see paper) and (ii) packet drops –provable greedy estimation for ELVIS dictionaries –bearing pursuit: computationally efficient reconstruction No Signal Reconstruction

Experiments

Field Data Results 5-vehicle convoy, >100× sub-Nyquist

Sensing System Problems Common theme so far… sensors → representations → metrics → “do our best” Purpose of deployment –multi-objective: sensing, lifetime, connectivity, coverage, reliability, etc.

Competition among Objectives Common theme so far… sensors → representations → metrics → “do our best” Purpose of deployment –multi-objective: sensing, lifetime, connectivity, coverage, reliability, etc. Limited resources → conflicts among objectives

Diversity of Objectives Multiple objectives –localization: area –lifetime: time –connectivity: probability –coverage: area –reliability: probability Unifying framework –utility

Optimality Pareto efficiency –Economics / optimization literature Pareto Frontier –a fundamental limit for achievable utilities
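For a finite set of candidate designs, the Pareto frontier is just the set of non-dominated utility vectors; a minimal sketch (the portfolio utilities below are made up):

```python
def pareto_frontier(points):
    """Return the utility points not dominated by any other point,
    where every coordinate is a utility to be maximized."""
    def dominated(p):
        # p is dominated if some other point is at least as good everywhere
        return any(q != p and all(qi >= pi for qi, pi in zip(q, p))
                   for q in points)
    return [p for p in points if not dominated(p)]

# (coverage, lifetime) utilities for five hypothetical sensor portfolios
portfolios = [(0.9, 0.2), (0.5, 0.5), (0.2, 0.9), (0.4, 0.4), (0.8, 0.1)]
frontier = pareto_frontier(portfolios)
```

Points beaten in every objective by some other point drop out; what remains traces the achievable trade-off curve, the fundamental limit the slide refers to.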

Pareto Frontiers for Localization Mathematical framework for multi-objective design [VC, Kaplan; IPSN’09, TOSN’09] Elements of the design –constant budget → optimization polytope over the best sensor portfolio –sensor dictionary → measurement type (bearing or range) and error, sensor reliability, field-of-view, sensing range, and mobility (sensor costs: $10, $30, $200, $1M, $5M) –random deployment → optimize expected / worst-case utilities –communications → bearing or range

Statement of Results – 1 Theory to predict the localization performance with management –signals → performance characterizations –key idea: duality among sensors ↔ existence of a reference sensing system –provable diminishing returns

Statement of Results – 2 Optimal heterogeneity –sparse solutions bounded by # of objectives –key theorems: concentration of resources, dominating sensor pairs Solution algorithms –integer optimization

Conclusions CS –sensing via dimensionality reduction ELVIS –source localization via dimensionality reduction –provable and efficient recovery via bearing pursuit Current work –clock synchronization –sensor position errors via linear filtering –Pareto Frontiers w/ ELVIS: reactive systems

Questions? Volkan Cevher