
Efficient Statistical Pruning for Maximum Likelihood Decoding






Presentation Transcript


  1. Efficient Statistical Pruning for Maximum Likelihood Decoding Radhika Gowaikar Babak Hassibi California Institute of Technology July 3, 2003

  2. Outline • Integer Least Squares Problem • Probabilistic Setup, Complexity as Random Variable • Sphere Decoder • Modified Algorithm • Statistical Pruning, Expected Complexity • Results • Analysis • Conclusions and Future Work

  3. Integer Least-Squares Problems • Search space is discrete, perhaps infinite • Given a “skewed” lattice generated by a matrix H and a vector x • Find the “closest” lattice point, i.e., minimize ||x − Hs||² over integer vectors s • Known to be NP-hard
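The closest-lattice-point search above can be sketched as a brute-force enumeration over a finite constellation; the exponential cost in the dimension is exactly the NP-hardness the slide mentions. A minimal pure-Python sketch; the names (`brute_force_ils`, `H`, `x`, the 2×2 example) are illustrative, not from the slides.

```python
import itertools

def brute_force_ils(H, x, alphabet):
    """Exhaustively search the constellation for the closest lattice point.

    H: channel matrix as a list of rows, x: received vector,
    alphabet: allowed symbol values per coordinate.
    Cost grows as |alphabet|**n -- the NP-hardness in action.
    """
    n = len(H[0])
    best, best_dist = None, float("inf")
    for s in itertools.product(alphabet, repeat=n):
        # squared norm of the residual x - H s
        dist = sum((x[i] - sum(H[i][j] * s[j] for j in range(n))) ** 2
                   for i in range(len(H)))
        if dist < best_dist:
            best, best_dist = s, dist
    return best, best_dist

# toy example: 2x2 "skewed" lattice, BPSK-like alphabet {-1, +1}
H = [[1.0, 0.5], [0.0, 1.0]]
x = [1.4, -1.1]
s_hat, d = brute_force_ils(H, x, (-1, 1))
```

Even for this toy case the loop visits |alphabet|ⁿ points; at N = 50 antennas (as in the later results) exhaustive search is hopeless, which motivates the sphere decoder.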

  4. Applications in ML Decoding • ML detection leads to integer least-squares problems • Signal constellation is a subset of a lattice (PAM, QAM) • Noise is additive white Gaussian (AWGN) • E.g., multi-antenna systems

  5. Approximate Solutions • Zero-forcing cancellation • Nulling and canceling • Nulling and canceling with optimal ordering • These require only polynomial computation, but the bit error rate (BER) suffers [Figure: BER comparison, ML vs. approximate methods]
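The simplest of the approximate detectors listed above, zero-forcing, can be sketched in a few lines: invert the channel and quantize each coordinate independently. This is a hypothetical 2×2 illustration (explicit matrix inverse, BPSK alphabet), not the authors' code; error amplification by the inverse is what degrades the BER relative to ML.

```python
def zero_forcing_2x2(H, x, alphabet=(-1, 1)):
    """Zero-forcing detection for a 2x2 channel: apply H^{-1} to the
    received vector, then slice each coordinate to the nearest
    constellation symbol.  Polynomial time, but noise amplification
    hurts the BER versus ML detection."""
    (a, b), (c, d) = H
    det = a * d - b * c
    # unconstrained solution H^{-1} x, written out for the 2x2 case
    s1 = (d * x[0] - b * x[1]) / det
    s2 = (-c * x[0] + a * x[1]) / det
    # slice: round each coordinate to the nearest alphabet symbol
    quant = lambda v: min(alphabet, key=lambda q: abs(v - q))
    return (quant(s1), quant(s2))

H = [[1.0, 0.5], [0.0, 1.0]]
x = [1.4, -1.1]
```

Nulling-and-canceling refines this by detecting one symbol at a time and subtracting its contribution before detecting the next, optionally in an optimized order.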

  6. Exact Methods • Sphere decoding: search in a hypersphere centered at the received vector (Fincke-Pohst; Viterbo, Boutros; Vikalo, Hassibi) • How do we find the points that lie in the hypersphere?

  7. Sphere Decoder • Goal: find the points in the hypersphere without exhaustive search • When the dimension is one, the sphere condition is simply an interval • Use this to go from a k-dimensional partial point to a (k+1)-dimensional one • Search over spheres of radius r and dimensions 1, 2, …, N • Use the QR decomposition of the channel matrix to facilitate this
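The one-dimensional step above (the sphere condition becomes an interval) can be written out explicitly. A sketch under the standard QR formulation: `R` is the upper-triangular factor, `y` the rotated received vector, and the remaining squared-radius budget determines the interval of admissible values for the current coordinate. The symbols `R`, `y`, `s_fixed` are assumptions filled in from the sphere-decoding literature, not from the transcript.

```python
import math

def admissible_interval(R, y, k, s_fixed, r2_remaining):
    """Interval of s_k values consistent with the sphere condition,
    given upper-triangular R, rotated received vector y, the
    already-chosen coordinates s_{k+1..N-1} (in s_fixed), and the
    squared-radius budget left after the higher dimensions.
    Returns real bounds (lo, hi); the lattice candidates for s_k
    are the integers inside."""
    n = len(R)
    # center of the interval: (y_k - sum_{j>k} R_kj s_j) / R_kk
    t = y[k] - sum(R[k][j] * s_fixed[j] for j in range(k + 1, n))
    c = t / R[k][k]
    half = math.sqrt(r2_remaining) / abs(R[k][k])
    return (c - half, c + half)

# toy example: last coordinate (k = 1) of a 2-D problem, radius^2 = 1
R = [[2.0, 1.0], [0.0, 1.0]]
y = [1.0, 0.5]
lo, hi = admissible_interval(R, y, 1, [None, None], 1.0)
```

Each admissible value of s_k then becomes a branch from which the (k+1)-dimensional condition is checked, which is how the search grows one dimension at a time.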

  8. Sphere Decoder – How it Works • Call y = Q*x, where H = QR is the QR decomposition; minimizing ||x − Hs||² is then equivalent to minimizing ||y − Rs||², with R upper triangular

  9. How it Works contd. • Because R is upper triangular, the metric decomposes term by term: the k-th term depends only on s_k, …, s_N

  10. Search Space and Tree • Solving these conditions successively generates a tree • Complexity depends on the size of the tree
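The tree search above can be sketched as a depth-first traversal that fixes one coordinate per level and abandons any branch whose partial metric leaves the sphere; the number of nodes visited is the tree size that governs complexity. A sketch assuming the QR-reduced system (upper-triangular `R`, rotated vector `y`) and a finite alphabet; names are illustrative, not verbatim from the slides.

```python
def sphere_decode(R, y, alphabet, r2):
    """Depth-first search of the tree of partial points.
    Works from the last row of the upper-triangular R (dimension
    N-1) down to row 0; each tree level fixes one more coordinate.
    The radius shrinks whenever a full point is found.
    Returns (best point, its squared distance, nodes visited)."""
    n = len(R)
    best = [None, r2]          # best point and best squared distance
    nodes = [0]

    def dfs(k, s, used2):
        for q in alphabet:
            s[k] = q
            # metric increment of row k: depends only on s_k..s_{N-1}
            resid = y[k] - sum(R[k][j] * s[j] for j in range(k, n))
            d2 = used2 + resid * resid
            if d2 > best[1]:
                continue       # outside the sphere: abandon branch
            nodes[0] += 1
            if k == 0:
                best[0], best[1] = tuple(s), d2
            else:
                dfs(k - 1, s, d2)

    dfs(n - 1, [None] * n, 0.0)
    return best[0], best[1], nodes[0]

# toy example: 2-D lattice, BPSK-like alphabet, radius^2 = 4
R = [[1.0, 0.5], [0.0, 1.0]]
y = [1.4, -1.1]
s_hat, d2, visited = sphere_decode(R, y, (-1, 1), 4.0)
```

Here the sphere condition prunes two of the four candidate leaves, so only two tree nodes are visited, whereas brute force would evaluate all four points.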

  11. Reducing Complexity • Prune branches of the search tree statistically • Not ML decoding any more
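The modification can be sketched by tightening the sphere decoder with depth-dependent thresholds: a partial point at depth k must keep its metric below β_k, not just below the overall radius, so some branches ML would explore are cut. The threshold schedule and names here are illustrative assumptions; the paper's actual pruning rule is not reproduced verbatim.

```python
def pruned_sphere_decode(R, y, alphabet, thresholds):
    """Sphere decoder with statistical pruning: after fixing the
    (depth+1)-th coordinate, the partial metric must stay below the
    depth-dependent threshold thresholds[depth].  Branches the plain
    sphere decoder would keep may be cut, so the result is no longer
    guaranteed ML -- the complexity/BER trade-off of the slides."""
    n = len(R)
    best = [None, float("inf")]

    def dfs(k, s, used2):
        depth = n - 1 - k          # coordinates fixed before this level
        for q in alphabet:
            s[k] = q
            resid = y[k] - sum(R[k][j] * s[j] for j in range(k, n))
            d2 = used2 + resid * resid
            if d2 > thresholds[depth] or d2 >= best[1]:
                continue           # statistically pruned
            if k == 0:
                best[0], best[1] = tuple(s), d2
            else:
                dfs(k - 1, s, d2)

    dfs(n - 1, [None] * n, 0.0)
    return best[0], best[1]

R = [[1.0, 0.5], [0.0, 1.0]]
y = [1.4, -1.1]
# loose thresholds: the ML point survives
s_loose, d_loose = pruned_sphere_decode(R, y, (-1, 1), [0.05, 1.0])
# over-tight first threshold: every branch is cut, the point is lost
s_tight, _ = pruned_sphere_decode(R, y, (-1, 1), [0.005, 1.0])
```

The second call shows the failure mode the next slides quantify: if the thresholds are too aggressive, even the transmitted point can be pruned, which is why ε must be controlled.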

  12. Results Complexity exponent and BER for N=20 with QPSK

  13. Probability of Error • Let ε be the probability that the transmitted point s is not in the search space • Need to keep ε small • It can be shown that ε admits an explicit expression
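The quantity ε above can be estimated numerically: at the transmitted point the metric is the squared norm of the AWGN vector, so ε is the probability that this (scaled chi-square) variable exceeds the squared radius. A Monte Carlo sketch; the function name, parameters, and the Monte Carlo approach are illustrative assumptions, while the slides state ε can also be computed exactly.

```python
import math
import random

def epsilon_mc(n, sigma2, r2, trials=20000, seed=1):
    """Monte Carlo estimate of epsilon = P(transmitted point falls
    outside the search sphere).  At the transmitted point the metric
    equals ||v||^2 for AWGN v with per-component variance sigma2,
    i.e. sigma2 times a chi-square with n degrees of freedom, so
    epsilon = P(||v||^2 > r^2)."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        m = sum(rng.gauss(0.0, math.sqrt(sigma2)) ** 2 for _ in range(n))
        if m > r2:
            misses += 1
    return misses / trials

# e.g. N = 20 dimensions, unit noise variance, radius^2 = 40
eps = epsilon_mc(n=20, sigma2=1.0, r2=40.0)
```

For these values ε is well under a percent, illustrating how a modest radius already captures the transmitted point with high probability.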

  14. Finding ε • ε can be determined exactly in terms of the pruning parameters • Choose the parameters to make ε as small as desired • Theorem: [statement on slide]

  15. Computational Complexity • Defined by the search region at each dimension and the constellation • Need to find the expected number of constellation points falling in each search region

  16. Finding the Expected Complexity • The per-dimension conditions are independent, hence the expectation factors • It can be determined exactly • Yet we have to employ approximations…

  17. Easy-to-Compute Upper Bound • For a point to be retained at dimension k, it needs to satisfy k conditions • For an upper bound, impose just the k-th condition • The resulting expression involves the incomplete gamma function
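The incomplete gamma function mentioned above arises because each single condition is a chi-square tail: for n-dimensional Gaussian noise with variance σ², P(||v||² ≤ r²) equals the regularized lower incomplete gamma P(n/2, r²/(2σ²)). A pure-Python sketch of that building block via the standard power series; it is not the slide's actual bound, whose exact form is not in the transcript.

```python
import math

def reg_lower_gamma(a, x, terms=200):
    """Regularized lower incomplete gamma P(a, x), computed from the
    power series gamma(a, x) = x^a e^{-x} * sum_n x^n / (a(a+1)...(a+n)),
    divided by Gamma(a).  For n-dimensional Gaussian noise with
    variance sigma^2:  P(||v||^2 <= r^2) = P(n/2, r^2 / (2 sigma^2))."""
    if x <= 0:
        return 0.0
    total, term = 0.0, 1.0 / a
    for n in range(terms):
        total += term
        term *= x / (a + n + 1)      # next series term
    # multiply by x^a e^{-x} / Gamma(a), in log space for stability
    return total * math.exp(a * math.log(x) - x - math.lgamma(a))

# sanity check: P(1, x) = 1 - e^{-x}; chi-square example for n = 20
p_exp = reg_lower_gamma(1.0, 1.0)            # ~ 1 - 1/e
p_chi = reg_lower_gamma(10.0, 20.0)          # P(chi^2_20 <= 40)
```

Plugging such terms into the product over dimensions gives an upper bound on the expected number of retained points that needs only gamma-function evaluations.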

  18. Approximations • It can be shown that the bound takes a simpler form whose parameters are functions of the problem dimensions • The complexity can now be determined by Monte Carlo simulations

  19. Simulation Results Complexity exponent and BER for N=20 with QAM

  20. Simulation Results Complexity Exponent and BER for N=50 with QAM

  21. Conclusions and Future Work • Significant reduction in Complexity • BER can be made close to optimal • Quantify trade-off between BER and Complexity • Compare with other decoding algorithms • Analyze for signaling schemes with coding • Other applications for these techniques…?
