
Recovery of Clustered Sparse Signals from Compressive Measurements


Presentation Transcript


  1. Recovery of Clustered Sparse Signals from Compressive Measurements Chinmay Hegde, Richard Baraniuk, Volkan Cevher (volkan@rice.edu), Piotr Indyk

  2. The Digital Universe
  • Size: ~300 billion gigabytes generated in 2007
    • digital bits > stars in the universe
    • growing by a factor of 10 every 5 years: > Avogadro's number (6.02×10^23) in 15 years
  • Growth fueled by multimedia / multisensor data: audio, images, video, surveillance cameras, sensor nets, …
  • In 2007, more digital data was generated than total available storage; by 2011, half of the digital universe will have no home
  [Source: IDC Whitepaper "The Diverse and Exploding Digital Universe", March 2008]

  3. Challenges
  • Acquisition
  • Compression
  • Processing

  4. Approaches
  • Do nothing / ignore: be content with where we are…
    • generalizes well
    • robust

  5. Approaches
  • Finite Rate of Innovation
  • Sketching / Streaming
  • Compressive Sensing
  [Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]

  6. Approaches
  • Finite Rate of Innovation
  • Sketching / Streaming
  • Compressive Sensing
  • Common thread: SPARSITY
  [Vetterli, Marziliano, Blu; Blu, Dragotti, Vetterli, Marziliano, Coulot; Gilbert, Indyk, Strauss, Cormode, Muthukrishnan; Donoho; Candes, Romberg, Tao; Candes, Tao]

  7. Agenda
  • A short review of compressive sensing
  • Beyond sparse models
  • Potential gains via structured sparsity
    • block-sparse model
    • (K,C)-sparse model
  • Conclusions

  8. Compressive Sensing 101
  • Goal: recover a sparse or compressible signal x from measurements y = Φx
  • Problem: the random projection Φ is not full rank
  • Solution: exploit the sparsity/compressibility geometry of the acquired signal

  9. Compressive Sensing 101
  • Goal: recover a sparse or compressible signal x from measurements y = Φx
  • Problem: the random projection Φ is not full rank, but it satisfies the Restricted Isometry Property (RIP)
  • Solution: exploit the sparsity/compressibility geometry of the acquired signal
  • Example Φ: iid Gaussian, iid Bernoulli, …
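
A minimal numpy sketch of the acquisition step described above (not from the slides; dimensions and variable names are illustrative):

import numpy as np

rng = np.random.default_rng(0)
N, M, K = 1024, 128, 10                 # illustrative dimensions, M << N

# K-sparse signal: K nonzero coordinates at random locations
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# iid Gaussian measurement matrix; such matrices satisfy the RIP with
# high probability once M = O(K log(N/K))
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                             # M << N, so Phi is not full column rank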

  10. Compressive Sensing 101
  • Goal: recover a sparse or compressible signal x from measurements y = Φx
  • Problem: the random projection Φ is not full rank
  • Solution: exploit the model geometry of the acquired signal

  11. Basic Signal Models
  • Sparse signal: only K out of N coordinates nonzero
  • Model: union of K-dimensional subspaces aligned with the coordinate axes
  [figure: coefficient magnitudes vs. sorted index]

  12. Basic Signal Models
  • Sparse signal: only K out of N coordinates nonzero
    • model: union of K-dimensional subspaces
  • Compressible signal: sorted coordinates decay rapidly to zero
    • well-approximated by a K-sparse signal (simply by thresholding; see the sketch below)
  [figure: power-law decay of coefficient magnitudes vs. sorted index]
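
The "simply by thresholding" remark translates directly to code. A hedged sketch of the best K-term approximation of a compressible signal (names and dimensions are illustrative):

import numpy as np

def best_k_term(x, K):
    # keep the K largest-magnitude coefficients, zero out the rest
    xk = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-K:]
    xk[idx] = x[idx]
    return xk

# compressible signal: coefficient magnitudes follow a power-law decay
rng = np.random.default_rng(1)
N, K = 1024, 20
x = rng.standard_normal(N) * np.arange(1, N + 1) ** -1.5
approx_err = np.linalg.norm(x - best_k_term(x, K))  # small when decay is fast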

  13. Recovery Algorithms
  • Goal: given y = Φx, recover x
  • ℓ1-norm and convex optimization formulations
    • basis pursuit, Dantzig selector, Lasso, … (a basis-pursuit sketch follows)
  • Greedy algorithms
    • iterative thresholding (IT), compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP)
  • At their core: iterative sparse approximation
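
As an illustration of the convex route, basis pursuit (min ‖x‖₁ subject to Φx = y) can be posed as a linear program by splitting x into nonnegative parts. A sketch using scipy (an assumption for illustration, not the authors' code):

import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi, y):
    # write x = u - v with u, v >= 0, so ||x||_1 = sum(u) + sum(v)
    M, N = Phi.shape
    c = np.ones(2 * N)                   # objective: minimize sum(u) + sum(v)
    A_eq = np.hstack([Phi, -Phi])        # constraint: Phi u - Phi v = y
    res = linprog(c, A_eq=A_eq, b_eq=y)  # default bounds already enforce u, v >= 0
    u, v = res.x[:N], res.x[N:]
    return u - v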

  14. Performance of Recovery
  • Using ℓ1 methods, IT, CoSaMP, SP:
  • Sparse signals
    • noise-free measurements: exact recovery
    • noisy measurements: stable recovery
  • Compressible signals: recovery as good as the best K-sparse approximation,
    ‖x − x̂‖₂ ≤ C₁‖x − x_K‖₂ + C₂‖n‖₂
    (CS recovery error ≤ signal K-term approximation error + noise)

  15. Sparse Signals
  • pixels: background-subtracted images
  • wavelets: natural images
  • Gabor atoms: chirps/tones

  16. Sparsity as a Model
  • Sparse/compressible signal model captures simplistic primary structure
  [figure: sparse image]

  17. Beyond Sparse Models
  • Sparse/compressible signal model captures simplistic primary structure
  • Modern compression/processing algorithms capture richer secondary coefficient structure
    • pixels: background-subtracted images
    • wavelets: natural images
    • Gabor atoms: chirps/tones

  18. Sparse Signals • Defn: K-sparse signals comprise a particular set of K-dim canonical subspaces

  19. Model-Sparse Signals • Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces

  20. Model-Sparse Signals
  • Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces
  • Structured subspaces
    ⇔ fewer subspaces
    ⇔ relaxed RIP
    ⇔ fewer measurements
  [Blumensath and Davies]

  21. Model-Sparse Signals
  • Defn: A K-sparse signal model comprises a particular (reduced) set of K-dim canonical subspaces
  • Structured subspaces
    ⇔ increased signal discrimination
    ⇔ improved recovery performance
    ⇔ faster recovery

  22. Block Sparse Signals

  23. Block-Sparsity
  • Motivation: sensor networks
  • Signal model
    • intra-sensor: sparsity
    • inter-sensor: common sparse supports
  • Union of subspaces when the signals are concatenated
  [figure: sensors × frequency]
  *Individual block sizes may vary…
  [Tropp, Eldar, Mishali; Stojnic, Parvaresh, Hassibi; Baraniuk, VC, Duarte, Hegde]

  24. Block-Sparsity Recovery
  • Sampling bound (B = # of active blocks): M = O(K + B log(N/B))
  • Problem-specific solutions
    • mixed ℓ2/ℓ1-norm solutions: ℓ2 within a block, ℓ1 across blocks (a sketch of this norm follows) [Eldar, Mishali (provable); Stojnic, Parvaresh, Hassibi]
    • greedy solutions: simultaneous orthogonal matching pursuit [Tropp]
  • Model-based recovery framework: mixed ℓ2/ℓ0-norm [Baraniuk, VC, Duarte, Hegde]
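
A sketch of the mixed ℓ2/ℓ1 norm referenced above: ℓ2 within each block, ℓ1 (a plain sum) across blocks. Equal-sized consecutive blocks of length J are an assumption made for brevity; minimizing this norm subject to y = Φx promotes block-sparse solutions.

import numpy as np

def mixed_l2_l1(x, J):
    blocks = x.reshape(-1, J)                    # one row per block of length J
    return np.linalg.norm(blocks, axis=1).sum()  # l2 within, l1 across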

  25. Standard CS Recovery
  • Iterative Thresholding (see the sketch below)
    1. update the signal estimate
    2. prune the signal estimate (best K-term approximation)
    3. update the residual
  [Nowak, Figueiredo; Kingsbury, Reeves; Daubechies, Defrise, De Mol; Blumensath, Davies; …]
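
The three steps above map one-to-one onto a minimal iterative hard thresholding loop (a sketch under the usual IHT assumptions, not the authors' exact code):

import numpy as np

def iht(y, Phi, K, iters=100):
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        b = x + Phi.T @ (y - Phi @ x)     # update signal estimate (gradient step)
        idx = np.argsort(np.abs(b))[-K:]  # prune: best K-term approximation
        x = np.zeros_like(b)
        x[idx] = b[idx]                   # residual y - Phi x is refreshed next pass
    return x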

  26. Model-based CS Recovery
  • Iterative Model Thresholding (see the sketch below)
    1. update the signal estimate
    2. prune the signal estimate (best K-term model approximation)
    3. update the residual
  [VC, Duarte, Hegde, Baraniuk; Baraniuk, VC, Duarte, Hegde]
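
The model-based variant only swaps the pruning step. A hedged sketch where model_approx is any best K-term model-approximation oracle (block, clustered, …; the name is illustrative):

import numpy as np

def model_iht(y, Phi, K, model_approx, iters=100):
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        b = x + Phi.T @ (y - Phi @ x)  # update signal estimate
        x = model_approx(b, K)         # prune to the best K-term *model* approximation
    return x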

  27. Model-based CS Recovery
  • Provable guarantees [Baraniuk, VC, Duarte, Hegde]:
    ‖x − x̂‖₂ ≤ C₁‖x − M_K(x)‖₂ + C₂‖n‖₂, where M_K(x) is the best K-term model approximation
    (CS recovery error ≤ signal K-term model-approximation error + noise)
  • Same iteration as above: update the signal estimate, prune it to the best K-term model approximation, update the residual

  28. Block-Sparse CS Recovery
  • Iterative Block Thresholding
    • within each block: compute the ℓ2 energy
    • across blocks: sort the block energies and threshold (keep the most energetic blocks; see the sketch below)
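
A sketch of that block-thresholding step, assuming consecutive blocks of size J and keeping the B most energetic blocks. It can be plugged into model_iht above, e.g. model_approx = lambda b, K: block_approx(b, B, J):

import numpy as np

def block_approx(x, B, J):
    blocks = x.reshape(-1, J)
    energy = np.linalg.norm(blocks, axis=1)  # l2 energy within each block
    keep = np.argsort(energy)[-B:]           # sort and keep the B largest blocks
    out = np.zeros_like(blocks)
    out[keep] = blocks[keep]
    return out.reshape(-1)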

  29. Block-Sparse Signal
  [figure: target signal; CoSaMP recovery (MSE = 0.723); block-sparse model recovery (MSE = 0.015)]
  Blocks are pre-specified.

  30. Clustered Sparse Signals: the (K,C)-sparse signal model

  31. Clustered Sparsity
  • (K,C)-sparse signals (1-D): K-sparse with support contained in at most C clusters
  • Stable recovery from M = O(K + C log(N/C)) measurements
  • Recovery:
    • via the model-based framework
    • model approximation via dynamic programming (recursive / bottom-up; see the sketch below)
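
A hedged sketch of a dynamic-programming model approximation for the 1-D (K,C) model (illustrative, not the authors' implementation): pick at most K coordinates forming at most C contiguous clusters so as to capture the most signal energy.

import numpy as np

def kc_approx(x, K, C):
    w = x ** 2
    # state: (coords picked, clusters used, was the previous coord picked?)
    # value: (captured energy, support so far)
    states = {(0, 0, False): (0.0, [])}
    for i in range(len(x)):
        new = {}
        def relax(key, val):
            if key not in new or val[0] > new[key][0]:
                new[key] = val
        for (k, c, last), (e, sup) in states.items():
            relax((k, c, False), (e, sup))       # skip x[i]
            c2 = c if last else c + 1            # picking x[i] may open a new cluster
            if k < K and c2 <= C:
                relax((k + 1, c2, True), (e + w[i], sup + [i]))
        states = new
    support = max(states.values(), key=lambda t: t[0])[1]
    xk = np.zeros_like(x)
    xk[support] = x[support]
    return xk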

  32. Example

  33. Simulation via Block-Sparsity
  • Clustered sparse ⇔ approximable by block-sparse*
    *if we are willing to pay a 3× sampling penalty
  • Proof by (adversarial) construction …

  34. Clustered Sparsity in 2-D
  • Model the clustering of significant pixels in the space domain using a graphical model (Markov random field)
  • Ising model; approximation via graph cuts [VC, Duarte, Hegde, Baraniuk]
  [figure: target; Ising-model recovery; CoSaMP recovery; LP (FPC) recovery]
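
A sketch of the Ising prior driving that 2-D model (the graph-cut minimization itself is not shown): the support is a ±1 field and neighboring pixels are rewarded for agreeing, so low-energy supports are spatially clustered.

import numpy as np

def ising_energy(s, lam=1.0):
    # s: 2-D array of +/-1 support indicators
    agree = (s[:, :-1] * s[:, 1:]).sum() + (s[:-1, :] * s[1:, :]).sum()
    return -lam * agree  # more neighbor agreement -> lower energy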

  35. Conclusions
  • Why CS works: stable embedding for signals with special geometric structure
  • Sparse signals >> model-sparse signals
  • Compressible signals >> model-compressible signals
  • Greedy model-based signal recovery algorithms
    • upshot: provably fewer measurements, more stable recovery
    • new model: clustered sparsity

  36. Volkan Cevher / volkan@rice.edu
