
CS: Compressed Sensing




  1. CS: Compressed Sensing Jialin Peng

  2. Outline • Introduction • Exact/Stable Recovery Conditions • ℓ1-norm based recovery • OMP-based recovery • Some related recovery algorithms • Sparse Representation • Applications

  3. Introduction [Figure: conventional acquisition pipeline: sense at high rate, compress, store/transmit, decompress] • high-density sensors • high-speed sampling • …… • A large amount of the sampled data is then discarded during compression • A certain minimum number of samples (the Nyquist rate) is required to perfectly capture an arbitrary bandlimited signal

  4. Sparse Property • Important classes of signals have naturally sparse representations with respect to fixed bases (e.g., Fourier, wavelet), or concatenations of such bases. • Audio, images, … • Although images (or their features) are naturally very high dimensional, in many applications images belonging to the same class exhibit degenerate structure. • Low-dimensional subspaces, submanifolds • Representative samples: sparse representation

  5. Transform coding: JPEG, JPEG2000, MPEG, and MP3

  6. The Goal Develop an end-to-end system • Sampling • Processing • Reconstruction • All operations are performed at a low rate: below the Nyquist rate of the input (Nyquist-rate sampling may be too costly, or even physically impossible) • Relying on structure in the input

  7. Sparse: the simplest choice is the best one • Signals can often be well approximated as a linear combination of just a few elements from a known basis or dictionary. • When this representation is exact, we say that the signal is sparse. Remark: In many cases these high-dimensional signals contain relatively little information compared to their ambient dimension.
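To make the "few elements from a known basis" idea concrete, here is a small sketch (Python/NumPy, not from the slides; the signal and frequencies are illustrative): a signal built from three sinusoids is exactly 6-sparse in the Fourier basis, so keeping only its 6 largest coefficients reconstructs it almost perfectly.

```python
import numpy as np

# A signal built from 3 sinusoids is 6-sparse in the Fourier basis
# (each real sinusoid contributes a conjugate pair of coefficients).
n = 256
t = np.arange(n) / n
signal = (np.sin(2 * np.pi * 5 * t)
          + 0.5 * np.sin(2 * np.pi * 12 * t)
          + 0.25 * np.sin(2 * np.pi * 40 * t))

coeffs = np.fft.fft(signal)

# Keep only the k largest-magnitude coefficients, zero out the rest.
k = 6
idx = np.argsort(np.abs(coeffs))[::-1][:k]
approx_coeffs = np.zeros_like(coeffs)
approx_coeffs[idx] = coeffs[idx]

# Reconstruct from the k kept coefficients.
approx = np.real(np.fft.ifft(approx_coeffs))
rel_err = np.linalg.norm(signal - approx) / np.linalg.norm(signal)
```

Because the three frequencies fall exactly on the FFT grid, the representation is exact and the relative error is at machine-precision level.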

  8. Introduction (recap) [Figure: conventional acquisition pipeline: sense at high rate, compress, store/transmit, decompress] • high-density sensors • high-speed sampling • …… • A large amount of the sampled data is then discarded during compression • A certain minimum number of samples (the Nyquist rate) is required to perfectly capture an arbitrary bandlimited signal

  9. Introduction [Figure: compressive acquisition pipeline: modified sensor, reduced data, storage, optimization-based reconstruction] • Sparse priors on the signal • Nonuniform sampling • Imaging algorithm: optimization • A less demanding sensor • Reduced data • ……

  10. Introduction y = Ax, where A is the sensing matrix, x the signal, and y the measurements

  11. Compression: find the most concise representation. Compressed sensing: a finite-dimensional signal having a sparse or compressible representation can be recovered from a small set of linear, nonadaptive measurements. • How should we design the sensing matrix A to ensure that it preserves the information in the signal x? • How can we recover the original signal x from the measurements y? Nonlinearity: 1. The unknown nonzero locations make the model nonlinear: the choice of which dictionary elements are used can change from signal to signal. 2. The recovery algorithms are nonlinear. Compressible: the signal is well approximated by a signal with only k nonzero coefficients.

  12. Introduction How can we recover the unknown signal? Exact/Stable Recovery Conditions • Let A be a matrix of size m × n with m ≪ n. • For a k-sparse signal x, let y = Ax be the measurement vector. • Our goal is to exactly/stably recover the unknown signal x from the measurements y. • The problem is under-determined. • Thanks to sparsity, we can reconstruct the signal via min ||x||_0 subject to Ax = y.

  13. Exact/stable recovery conditions • The spark of a given matrix A • Null space property (NSP) of order k • The restricted isometry property Remark: verifying that a general matrix A satisfies any of these properties has a combinatorial computational complexity
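The combinatorial cost mentioned in the remark is easy to see for the spark (the smallest number of linearly dependent columns). A brute-force sketch in Python/NumPy (illustrative only; the matrix is a toy example):

```python
import numpy as np
from itertools import combinations

def spark(A, tol=1e-10):
    """Smallest number of columns of A that are linearly dependent.
    Brute force over all column subsets -- exponential in n, which is
    exactly why verifying such conditions is combinatorial."""
    n = A.shape[1]
    for size in range(1, n + 1):
        for cols in combinations(range(n), size):
            # Columns are dependent iff the submatrix is rank-deficient.
            if np.linalg.matrix_rank(A[:, cols], tol=tol) < size:
                return size
    return n + 1  # all columns independent (only possible if n <= m)

A = np.array([[1., 0., 0., 1.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
# Columns 0, 1, 3 are dependent (col 3 = col 0 + col 1); no 2 columns are.
print(spark(A))  # -> 3
```

Uniqueness of k-sparse solutions to Ax = y holds precisely when spark(A) > 2k, but computing the spark this way visits every support set.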

  14. Exact/stable recovery conditions Restricted Isometry Property • The restricted isometry constant (RIC) δ_k is defined as the smallest constant satisfying (1 − δ_k)||x||_2^2 ≤ ||Ax||_2^2 ≤ (1 + δ_k)||x||_2^2 for all k-sparse x. • The restricted orthogonality constant (ROC) θ_{k,k′} is the smallest number such that |⟨Ax, Ax′⟩| ≤ θ_{k,k′} ||x||_2 ||x′||_2 for all k-sparse x and k′-sparse x′ with disjoint supports.
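As a hedged illustration of the RIC definition (not from the slides): for a small matrix, δ_k can be computed exhaustively from the Gram-matrix eigenvalues of every k-column submatrix, the same combinatorial search the earlier remark warns about.

```python
import numpy as np
from itertools import combinations

def ric(A, k):
    """Restricted isometry constant of order k by exhaustive search:
    delta_k is the largest deviation from 1 of the eigenvalues of
    A_S^T A_S over all k-column submatrices A_S."""
    n = A.shape[1]
    delta = 0.0
    for cols in combinations(range(n), k):
        gram = A[:, cols].T @ A[:, cols]
        eigs = np.linalg.eigvalsh(gram)  # sorted ascending
        delta = max(delta, abs(eigs[0] - 1), abs(eigs[-1] - 1))
    return delta

rng = np.random.default_rng(0)
m, n = 10, 15
# Gaussian matrix scaled so columns have unit norm in expectation.
A = rng.standard_normal((m, n)) / np.sqrt(m)
print(ric(A, 2))
```

A matrix with orthonormal columns has δ_k = 0 for every feasible k; random Gaussian matrices of this size give a small but nonzero δ_2.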

  15. Exact/stable recovery conditions • Solving the ℓ0 minimization is NP-hard, so we usually relax it to ℓ1 or ℓp (0 < p < 1) minimization.
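The ℓ1 relaxation (basis pursuit) can be solved as a linear program. A minimal sketch using SciPy's `linprog` (the split x = u − v with u, v ≥ 0 is the standard reformulation; the dimensions and random seed here are illustrative, not from the slides):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 s.t. Ax = y as an LP: write x = u - v with
    u, v >= 0, so that ||x||_1 = sum(u) + sum(v)."""
    m, n = A.shape
    c = np.ones(2 * n)                 # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])          # A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    uv = res.x
    return uv[:n] - uv[n:]

rng = np.random.default_rng(1)
m, n, k = 12, 24, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

x_hat = basis_pursuit(A, y)
```

The LP solution is always feasible (Ax̂ = y up to solver tolerance) and has ℓ1 norm no larger than that of the true sparse signal; with enough Gaussian measurements it typically recovers x exactly.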

  16. Exact/stable recovery conditions • For inaccurate measurements y = Ax + e with ||e||_2 ≤ ε, the stable reconstruction model is min ||x||_1 subject to ||Ax − y||_2 ≤ ε.

  17. Exact/stable recovery conditions • Some other Exact/Stable Recovery Conditions:

  18. Exact/stable recovery conditions • Baraniuk et al. have proved that for some random matrices, such as • Gaussian, • Bernoulli, • …… we can exactly/stably reconstruct the unknown signal with overwhelmingly high probability.

  19. Exact/stable recovery conditions cf. ℓ1 minimization

  20. Exact/stable recovery conditions • Some evidence indicates that ℓp minimization with 0 < p < 1 can exactly/stably recover the signal from fewer measurements.

  21. Quicklook Interpretation • A dimensionality-reducing projection. • RIP: an approximately isometric embedding, i.e., pairwise Euclidean distances between sparse signals are nearly preserved in the reduced space.
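The near-isometry of random projections is easy to check empirically. A small sketch (illustrative dimensions, not from the slides): project random points through a scaled Gaussian matrix and compare pairwise distances before and after.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, n_pts = 1000, 200, 20

X = rng.standard_normal((n_pts, d))           # points in high dimension
P = rng.standard_normal((m, d)) / np.sqrt(m)  # scaled Gaussian projection
Y = X @ P.T                                   # points in reduced space

# Ratio of projected distance to original distance for every pair;
# values near 1 mean the embedding is approximately isometric.
ratios = []
for i in range(n_pts):
    for j in range(i + 1, n_pts):
        ratios.append(np.linalg.norm(Y[i] - Y[j])
                      / np.linalg.norm(X[i] - X[j]))
```

With m = 200 the distance ratios concentrate tightly around 1, even though the ambient dimension dropped by a factor of 5.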

  22. Quicklook Interpretation

  23. Quicklook Interpretation • The ℓ2 norm penalizes large coefficients heavily, so solutions tend to have many small coefficients. • Under the ℓ1 norm, many small coefficients carry a larger penalty than a few large ones, so solutions tend to be sparse.

  24. Algorithms • ℓ1-minimization algorithms: iterative soft thresholding, iteratively reweighted least squares, … • Greedy algorithms: Orthogonal Matching Pursuit, iterative thresholding • Combinatorial algorithms
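A minimal Orthogonal Matching Pursuit sketch in NumPy (the dimensions, seed, and test signal are illustrative, not from the slides): greedily pick the column most correlated with the residual, then re-fit by least squares on the accumulated support.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: run k greedy iterations, each one
    adding the column most correlated with the current residual and
    re-solving least squares on the chosen support."""
    m, n = A.shape
    support = []
    residual = y.copy()
    x = np.zeros(n)
    for _ in range(k):
        # Column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit restricted to the chosen support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = y - A @ x
    return x

rng = np.random.default_rng(2)
m, n, k = 20, 40, 3
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)     # unit-norm columns
x_true = np.zeros(n)
x_true[[3, 17, 31]] = [1.5, -2.0, 1.0]
y = A @ x_true

x_hat = omp(A, y, k)
```

After k iterations the estimate has at most k nonzeros, and each least-squares step makes the residual orthogonal to the selected columns, so its norm shrinks monotonically.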

  25. CS builds upon the fundamental fact that we can represent many signals using only a few non-zero coefficients in a suitable basis or dictionary. Nonlinear optimization can then enable recovery of such signals from very few measurements.

  26. Sparse property • The choice of basis for representing the data • From generic incoherent bases to task-specific (often overcomplete or redundant) dictionaries

  27. MRI Reconstruction MR images are usually sparse in certain transform domains, such as the finite-difference and wavelet domains.
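A tiny illustration of the finite-difference sparsity mentioned above (a toy example, not MRI data): a piecewise-constant signal is dense in the sample domain, but its finite differences are sparse, with one nonzero per jump.

```python
import numpy as np

# A piecewise-constant "image row": every raw sample is nonzero,
# but the finite-difference (gradient) has only 2 nonzeros.
signal = np.concatenate([np.full(40, 1.0),
                         np.full(30, 3.0),
                         np.full(30, 2.0)])

diff = np.diff(signal)
print(np.count_nonzero(signal))  # 100 nonzero samples
print(np.count_nonzero(diff))    # 2 nonzero differences (the jumps)
```

This is why total-variation (ℓ1 on the gradient) regularization works well for piecewise-smooth images such as MR scans.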

  28. Sparse Representation • Consider a family of images representing natural and typical image content: • Such images are very diverse vectors in a very high-dimensional space • Do they occupy the entire space? • No: spatially smooth images occur much more often than highly non-smooth, disorganized images • An ℓ1-norm measure enforces sparsity of the signal/image derivatives • Sparse representation

  29. Matrix completion • Recovering an unknown (approximately) low-rank matrix from a sample of its entries. • Formulations: rank minimization (NP-hard), its convex relaxation (nuclear-norm minimization), and unconstrained (regularized) variants.
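One simple way to attack the rank-constrained formulation is to alternate between enforcing the observed entries and truncating to the target rank via the SVD. This is an illustrative sketch (not a specific algorithm from the slides); the rank-1 test matrix, sampling rate, and seed are assumptions.

```python
import numpy as np

def complete_rank_r(M_obs, mask, r, n_iter=200):
    """Fill in missing entries of a low-rank matrix by alternating
    between (1) resetting the observed entries to their known values
    and (2) projecting onto the set of rank-r matrices via the SVD."""
    X = np.where(mask, M_obs, 0.0)
    for _ in range(n_iter):
        Z = np.where(mask, M_obs, X)      # keep observed entries fixed
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        X = (U[:, :r] * s[:r]) @ Vt[:r]   # best rank-r approximation
    return X

rng = np.random.default_rng(3)
n = 20
M = np.outer(rng.standard_normal(n), rng.standard_normal(n))  # rank 1
mask = rng.random((n, n)) < 0.6                               # ~60% observed

X = complete_rank_r(np.where(mask, M, 0.0), mask, r=1)
rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

With a rank-1 matrix and 60% of entries observed, the problem is heavily oversampled and the iteration recovers the missing entries accurately.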
