Source Localization on a budget • Volkan Cevher (volkan@rice.edu), Rice University • Collaborators: Petros, Rich, Anna, Martin, Lance
Localization Problem • Goal: Localize targets by fusing measurements from a network of sensors [Cevher, Duarte, Baraniuk; EUSIPCO 2007 | Model and Zibulevsky; SP 2006 | Cevher et al.; ICASSP 2006 | Malioutov, Cetin, and Willsky; IEEE TSP 2005 | Chen et al.; Proc. of IEEE 2003]
Localization Problem • Goal: Localize targets by fusing measurements from a network of sensors • collect time signal data • communicate signals across the network • solve an optimization problem
Digital Revolution • Goal: Localize targets by fusing measurements from a network of sensors • collect time signal data • communicate signals across the network • solve an optimization problem
Digital Data Acquisition • Foundation: Shannon/Nyquist sampling theorem • “if you sample densely enough (at the Nyquist rate), you can perfectly reconstruct the original analog data” (figure: dense sampling in time and in space)
Major Trends in Sensing • higher resolution / denser sampling • large numbers of sensors • increasing # of modalities / mobility
Problems of the Current Paradigm • Goal: Localize targets by fusing measurements from a network of sensors • collect time signal data • requires potentially high-rate (Nyquist) sampling • communicate signals across the network • potentially large communication burden • solve an optimization problem • e.g., MLE → Need compression
Approaches • Do nothing / Ignore: be content with the existing approaches • generalizes well • robust
Approaches • Finite Rate of Innovation • Sketching / Streaming • Compressive Sensing [Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]
Approaches • Finite Rate of Innovation • Sketching / Streaming • Compressive Sensing → SPARSITY
Agenda • A short review of compressive sensing • Localization via dimensionality reduction • Experimental results • A broader view of localization • Conclusions
Compressive Sensing 101 • Goal: Recover a sparse or compressible signal from measurements • Problem: Random projection not full rank • Solution: Exploit the sparsity / compressibility geometry of the acquired signal
Compressive Sensing 101 • Goal: Recover a sparse or compressible signal from measurements • Problem: Random projection not full rank, but satisfies the Restricted Isometry Property (RIP) • Solution: Exploit the sparsity / compressibility geometry of the acquired signal • iid Gaussian • iid Bernoulli • …
Compressive Sensing 101 • Goal: Recover a sparse or compressible signal from measurements • Problem: Random projection not full rank, but satisfies the Restricted Isometry Property (RIP) • Solution: Exploit the sparsity / compressibility geometry of the acquired signal via convex optimization or a greedy algorithm • iid Gaussian • iid Bernoulli • …
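A minimal numerical sketch of this setup (illustrative sizes, not values from the talk): a K-sparse signal in R^N measured by an iid Gaussian matrix with M ≪ N rows.

```python
# Minimal CS setup sketch (illustrative sizes): K-sparse x, iid Gaussian Phi, y = Phi @ x.
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 1024, 10, 120                      # ambient dim, sparsity, measurements

x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)   # K-sparse signal

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # iid Gaussian: RIP with high probability
y = Phi @ x                                      # M << N compressive measurements
```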
Concise Signal Structure • Sparse signal: only K out of N coordinates nonzero • model: union of K-dimensional subspaces aligned w/ coordinate axes (figure: coefficient magnitude vs. sorted index)
Concise Signal Structure • Sparse signal: only K out of N coordinates nonzero • model: union of K-dimensional subspaces • Compressible signal: sorted coordinates decay rapidly to zero, well-approximated by a K-sparse signal (simply by thresholding) (figure: power-law decay vs. sorted index)
Restricted Isometry Property (RIP) • Preserve the structure of sparse/compressible signals • RIP of order 2K implies: (1 − δ2K) ||x1 − x2||² ≤ ||Φx1 − Φx2||² ≤ (1 + δ2K) ||x1 − x2||² for all K-sparse x1 and x2 (figure: union of K-planes)
Restricted Isometry Property (RIP) • Preserve the structure of sparse/compressible signals • Random subGaussian (iid Gaussian, Bernoulli) matrix has the RIP with high probability if the number of measurements M = O(K log(N/K))
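As a rough worked example (illustrative numbers, not from the talk): with N = 1024 and K = 10, K ln(N/K) ≈ 10 × 4.6 ≈ 46, so on the order of a hundred random measurements typically suffice in place of 1024 Nyquist-rate samples; the constant in front is modest in practice.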
Recovery Algorithms • Goal: given y = Φx (and Φ), recover x • ℓ1 and convex optimization formulations • basis pursuit, Dantzig selector, Lasso, … • Greedy algorithms • orthogonal matching pursuit, iterative thresholding (IT), compressive sensing matching pursuit (CoSaMP) • at their core: iterative sparse approximation
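As an example of the greedy family, here is a minimal iterative hard thresholding (IHT) sketch; the step size and iteration count are illustrative choices, and this is not presented as the talk's exact solver.

```python
# Minimal iterative hard thresholding (IHT) sketch for y = Phi @ x with x K-sparse.
import numpy as np

def iht(y, Phi, K, n_iter=100, step=1.0):
    """Return a K-sparse estimate of x from measurements y = Phi @ x (+ noise)."""
    x_hat = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        x_hat = x_hat + step * (Phi.T @ (y - Phi @ x_hat))   # gradient step
        keep = np.argsort(np.abs(x_hat))[-K:]                # indices of K largest entries
        pruned = np.zeros_like(x_hat)
        pruned[keep] = x_hat[keep]                           # hard threshold to K terms
        x_hat = pruned
    return x_hat
```

Running `iht(y, Phi, K)` on the measurement sketch above should recover the support of x in the noise-free case.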
Performance of Recovery • Using ℓ1 methods, IT, CoSaMP • Sparse signals • noise-free measurements: exact recovery • noisy measurements: stable recovery • Compressible signals • recovery as good as K-sparse approximation • CS recovery error ≤ C1 · (measurement noise) + C2 · (K-term approximation error of the signal)
Universality • Random measurements can be used for signals sparse in any basis
Universality • Random measurements can be used for signals sparse in any basis (figure: signal = basis × sparse coefficient vector with K nonzero entries)
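A hedged sketch of universality (requires scipy; the DCT basis Ψ and the sizes are illustrative assumptions): the signal is sparse in a transform basis rather than in the canonical basis, yet the same kind of random projections is used.

```python
# Universality sketch: x = Psi @ alpha with alpha sparse in the DCT basis (illustrative).
import numpy as np
from scipy.fft import idct

rng = np.random.default_rng(1)
N, K, M = 512, 8, 80
alpha = np.zeros(N)
alpha[rng.choice(N, K, replace=False)] = rng.standard_normal(K)  # sparse coefficients

x = idct(alpha, norm='ortho')                    # synthesize x = Psi @ alpha
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # the same kind of random matrix
y = Phi @ x                                      # recovery would target alpha in Phi @ Psi
```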
Signal recovery is not always required. ELVIS: Enhanced Localization via Incoherence and Sparsity (Back to Localization)
An Important Detail • Solve two entangled problems for localization • estimate source locations • estimate source signals
Today • Instead, solve one localization problem • estimate source locations by exploiting random projections of observed signals • estimating the source signals is skipped (no signal reconstruction needed)
ELVIS • Instead, solve one localization problem • estimate source locations by exploiting random projections of observed signals (no need to estimate the source signals) • Bayesian model order selection & MAP estimation in a decentralized sparse approximation framework that leverages • source sparsity • incoherence of sources • spatial sparsity of sources [VC, Boufounos, Baraniuk, Gilbert, Strauss; IPSN’09]
Problem Setup • Discretize space into a localization grid with N grid points • fixes localization resolution • P sensors do not have to be on grid points
Localization as Sparse Approximation (figure: localization grid, local localization dictionary, actual sensor measurements, true target location)
Multiple Targets (figure: localization grid, actual sensor measurements, 2 true target locations)
Local Dictionaries • Sample the sparse / compressible signal using CS • Fourier sampling [Gilbert, Strauss] • Calculate the local dictionary at sensor i using its own measurements • for all grid positions n=1,…,N: assume that the target is at grid position n • for all sensors j=1,…,P: use the Green’s function to estimate the signal sensor j would measure if the target were at position n
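A hedged sketch of one way such a local dictionary could be populated. The assumptions here are not from the talk: free-field propagation, the Green's function modelled as a pure delay plus 1/r attenuation, and integer-sample delays; `local_dictionary` and its arguments are hypothetical names.

```python
# Hypothetical local-dictionary construction at sensor i (simplified propagation model).
import numpy as np

def local_dictionary(y_i, sensor_i, sensors, grid, fs, c=343.0):
    """Column n: predicted network measurements if a source sat at grid point n.

    y_i      : (T,) samples recorded at sensor i
    sensor_i : (2,) position of sensor i
    sensors  : (P, 2) positions of all sensors
    grid     : (N, 2) candidate source positions on the localization grid
    fs       : sampling rate [Hz];  c : propagation speed [m/s]
    """
    T, P, N = len(y_i), len(sensors), len(grid)
    Psi = np.zeros((P * T, N))
    for n, g in enumerate(grid):
        r_i = np.linalg.norm(g - sensor_i) + 1e-6
        for j, s in enumerate(sensors):
            r_j = np.linalg.norm(g - s) + 1e-6
            delay = int(round((r_j - r_i) / c * fs))        # relative delay in samples
            Psi[j * T:(j + 1) * T, n] = np.roll(y_i, delay) * (r_i / r_j)  # delay + 1/r
    return Psi
```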
Valid Dictionaries • Sparse approximation works when the columns of the dictionary are mutually incoherent • True when the target signal has a fast-decaying autocorrelation • Extends to multiple targets with small cross-correlation
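A quick way to check this incoherence condition numerically (a generic coherence computation, not a procedure from the talk):

```python
# Mutual coherence of a dictionary: the largest normalized inner product between
# distinct columns; small values indicate that sparse approximation is well posed.
import numpy as np

def mutual_coherence(Psi):
    cols = Psi / (np.linalg.norm(Psi, axis=0, keepdims=True) + 1e-12)
    gram = np.abs(cols.T @ cols)
    np.fill_diagonal(gram, 0.0)
    return gram.max()
```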
Typical Correlation Functions (figure panels: ACF of Toyota Prius, Isuzu Rodeo, Chevy Camaro; CCF of Rodeo vs. Prius, Rodeo vs. Camaro, Camaro vs. Prius)
An Important Issue • Need to send the sensor measurements across the network (figure: localization grid, actual sensor measurements)
Enter Compressive Sensing • Sparse localization vector → acquire and transmit compressive measurements of the actual observations without losing information
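A hedged sketch of how compressive inter-sensor messaging could look. The function names, the shared-seed trick, and the plain matching-pursuit localizer are illustrative assumptions, not the paper's protocol or its bearing-pursuit algorithm.

```python
# Illustrative compressive messaging: transmit a short random projection of the
# observations; localize against the equally projected dictionary.
import numpy as np

def compress(y, M, seed=0):
    """Project y to M numbers; a shared seed lets receivers rebuild Phi locally."""
    rng = np.random.default_rng(seed)
    Phi = rng.standard_normal((M, len(y))) / np.sqrt(M)
    return Phi @ y, Phi

def localize(z, Phi, Psi, K):
    """Pick K grid points greedily from compressed measurements z (simple matching pursuit)."""
    A = Phi @ Psi                               # compressed localization dictionary
    residual, picks = z.astype(float).copy(), []
    for _ in range(K):
        scores = np.abs(A.T @ residual) / (np.linalg.norm(A, axis=0) + 1e-12)
        n = int(np.argmax(scores))
        picks.append(n)
        a = A[:, n]
        residual = residual - a * (a @ residual) / (a @ a)   # remove chosen component
    return picks                                # indices of estimated target locations
```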
So Far… • Use random projections of observed signals two ways: • create local sensor dictionaries that sparsify source locations • create intersensor communication messages (figure: ELVIS dictionary for K targets on an N-dim grid, populated using recovered signals; random iid projection)
ELVIS Highlights • Use random projections of observed signals two ways: • create local sensor dictionaries that sparsify source locations → sample at source sparsity • create intersensor communication messages → communicate at spatial sparsity; robust to (i) quantization (1-bit quantization in the paper) and (ii) packet drops • No Signal Reconstruction (figure: ELVIS dictionary)
ELVIS Highlights • Use random projections of observed signals two ways: • create local sensor dictionaries that sparsify source locations → sample at source sparsity • create intersensor communication messages → communicate at spatial sparsity; robust to (i) quantization (1-bit quantization in the paper) and (ii) packet drops • Provable greedy estimation for ELVIS dictionaries • Bearing pursuit: computationally efficient reconstruction • No Signal Reconstruction
Field Data Results • 5-vehicle convoy • >100× sub-Nyquist
Sensing System Problems • Common theme so far… sensors → representations → metrics → “do our best” • Purpose of deployment • Multi-objective: sensing, lifetime, connectivity, coverage, reliability, etc.
Competition among Objectives • Common theme so far… sensors → representations → metrics → “do our best” • Purpose of deployment • Multi-objective: sensing, lifetime, connectivity, coverage, reliability, etc. • Limited resources → conflicts in objectives
Diversity of Objectives • Multiple objectives • localization (area) • lifetime (time) • connectivity (probability) • coverage (area) • reliability (probability) • Unifying framework • utility
Optimality • Pareto efficiency • Economics / optimization literature • Pareto Frontier • a fundamental limit for achievable utilities
Pareto Frontiers for Localization • Mathematical framework for multi-objective design → best sensor portfolio • Elements of the design • constant budget → optimization polytope • sensor dictionary • random deployment • communications [VC, Kaplan; IPSN’09, TOSN’09]
Pareto Frontiers for Localization • Mathematical framework for multi-objective design • Elements of the design • constant budget • sensor dictionary → measurement type (bearing or range) and error, sensor reliability, field-of-view, sensing range, and mobility (figure: example sensors costing $10, $30, $200, $1M, $5M)
Pareto Frontiers for Localization • Mathematical framework for multi-objective design • Elements of the design • constant budget • sensor dictionary • random deployment → optimize expected / worst-case utilities • communications
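A hedged sketch of what a Pareto-frontier computation over sensor portfolios under a fixed budget could look like. All costs, utilities, and the two objectives are made-up illustrative numbers, not values from the talk.

```python
# Illustrative Pareto frontier over sensor portfolios under a constant budget.
import numpy as np
from itertools import product

costs  = np.array([10.0, 30.0, 200.0])   # unit cost per sensor type (assumed)
loc_u  = np.array([0.2, 0.5, 3.0])       # per-sensor localization utility (assumed)
life_u = np.array([1.0, 0.6, 0.1])       # per-sensor lifetime utility (assumed)
budget = 400.0

# Enumerate integer portfolios inside the budget polytope (brute force is fine here).
portfolios, utils = [], []
for counts in product(*(range(int(budget // c) + 1) for c in costs)):
    counts = np.array(counts)
    if counts @ costs <= budget:
        portfolios.append(counts)
        utils.append((counts @ loc_u, counts @ life_u))
utils = np.array(utils)

def pareto_mask(U):
    """True for portfolios that no other portfolio dominates in every objective."""
    keep = np.ones(len(U), dtype=bool)
    for i, u in enumerate(U):
        dominated = np.any(np.all(U >= u, axis=1) & np.any(U > u, axis=1))
        keep[i] = not dominated
    return keep

frontier = [p for p, ok in zip(portfolios, pareto_mask(utils)) if ok]
```

Under a constant budget the feasible sensor counts form a polytope, and the Pareto-efficient portfolios trace out the frontier of achievable (localization, lifetime) utility pairs, which is the fundamental limit referred to on the Optimality slide.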