
Wideband Communications



  1. Wideband Communications Lecture 18-19: Multi-user detection Aliazam Abbasfar

  2. Outline • Multi-user detection (MUD) • Optimum detection • De-correlator • MMSE detector • Nonlinear detector

  3. Multi-user detection • Single user detection • Requires a single signature waveform + timing • Single-user matched filter • Optimum receiver when only one correlator output is available • Was believed to be close to optimum • The interference can be approximated as a Gaussian RV • Multi-user detection • Multi-user matched filters • Improves the estimate when the other correlator outputs are available • y1 is not a sufficient statistic for b1 • y2, y3, …, yK also carry information about b1
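To make the model used on the following slides concrete, here is a minimal numpy sketch (not part of the original slides) of a two-user synchronous CDMA matched-filter bank; the signatures, amplitudes, and noise level are illustrative assumptions. It shows that the vector of matched-filter outputs follows y = R A b + n, with noise covariance σ²R.

```python
# Illustrative sketch: matched-filter bank for synchronous 2-user CDMA.
# Signatures, amplitudes, and noise level are assumed values.
import numpy as np

rng = np.random.default_rng(0)

# Unit-energy signature waveforms (chip sequences), K = 2 users
s1 = np.array([1, 1, 1, 1]) / 2.0
s2 = np.array([1, 1, -1, 1]) / 2.0
S = np.column_stack([s1, s2])          # N x K signature matrix
R = S.T @ S                            # cross-correlation matrix (R[0,1] = rho)

A = np.diag([1.0, 2.0])                # received amplitudes A1, A2
b = np.array([+1, -1])                 # transmitted bits
sigma = 0.3

r = S @ A @ b + sigma * rng.standard_normal(len(s1))   # received chip vector
y = S.T @ r                            # matched-filter outputs, one per user

# y equals R A b plus correlated Gaussian noise with covariance sigma^2 R
print(y, R @ A @ b)
```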

  4. Two-user optimum MUD • y(t) = A1b1 s1(t) + A2b2 s2(t) + n(t) • y1 = A1b1 + ρ21 A2b2 + n1 • y2 = ρ12 A1b1 + A2b2 + n2 • y = R A b + n • Noises are correlated • E[n nᵀ] = σ² R • MAP estimation: • Jointly optimum: max P{ (b1, b2) | y1, y2 } • Individually optimum: max P{ b1 | y1, y2 } • USUALLY gives the same answer • Not necessarily (counter-example: P++ = 0.26, P+- = 0.27, P-+ = 0.26, P-- = 0.21) • Usually there is only one pair with high posterior probability • Which one to compute? • Individually optimum: best BER for each user • Jointly optimum: lower complexity • ML estimation: • Jointly optimum: max P{ y1, y2 | (b1, b2) } • Individually optimum: max P{ y1, y2 | b1 } • ML = MAP estimation when bits are equiprobable
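The counter-example on this slide can be checked directly. The short sketch below (illustrative only, using the posterior probabilities quoted above) shows that the jointly optimum decision is (+, −) while the individually optimum decisions are (+, +), so the two rules can indeed disagree.

```python
# Worked check of the slide's counter-example: jointly optimum MAP picks the
# most likely pair; individually optimum MAP marginalizes over the other bit.
post = {(+1, +1): 0.26, (+1, -1): 0.27, (-1, +1): 0.26, (-1, -1): 0.21}

# Jointly optimum: argmax over pairs -> (+1, -1)
joint = max(post, key=post.get)

# Individually optimum: argmax of each bit's marginal posterior
p_b1_plus = post[(+1, +1)] + post[(+1, -1)]     # 0.53 -> b1 = +1
p_b2_plus = post[(+1, +1)] + post[(-1, +1)]     # 0.52 -> b2 = +1
indiv = (+1 if p_b1_plus > 0.5 else -1,
         +1 if p_b2_plus > 0.5 else -1)

print(joint, indiv)   # (+1, -1) vs (+1, +1): the two rules disagree here
```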

  5. Two-user ML estimation • Cross-correlation matrix decomposition • Cholesky decomposition: R = L Lᴴ • y' = L⁻¹y = Lᴴ A b + L⁻¹n = Lᴴ A b + n' • y'1 = A1b1 + ρ A2b2 + n'1 • y'2 = λ A2b2 + n'2 (λ = √(1 − ρ²)) • New noises are uncorrelated • E[n' n'ᵀ] = σ² I • Jointly optimum ML estimation: • The nearest point to one of the 4 hypotheses • min [ (y'1 − (A1b1 + ρ A2b2))² + (y'2 − λ A2b2)² ] • min |L⁻¹y − Lᴴ A b|² • No near-far effect
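A minimal sketch of this whiten-and-search procedure, assuming illustrative values for ρ, the amplitudes, and the noise level: factor R with numpy's Cholesky routine, whiten the matched-filter outputs, and pick the bit pair whose image Lᴴ A b is nearest to y'.

```python
# Illustrative sketch: jointly optimum ML for two users via Cholesky whitening.
import itertools
import numpy as np

rng = np.random.default_rng(1)
rho, sigma = 0.5, 0.4
R = np.array([[1.0, rho], [rho, 1.0]])
A = np.diag([1.0, 2.0])
b_true = np.array([+1, -1])

L = np.linalg.cholesky(R)                       # R = L L^H (real case: L L^T)
n = L @ (sigma * rng.standard_normal(2))        # noise with covariance sigma^2 R
y = R @ A @ b_true + n                          # matched-filter outputs
y_w = np.linalg.solve(L, y)                     # y' = L^-1 y = L^H A b + white noise

# Jointly optimum ML: minimize |y' - L^H A b|^2 over the 4 bit pairs
best = min(itertools.product([+1, -1], repeat=2),
           key=lambda b: np.sum((y_w - L.T @ A @ np.array(b)) ** 2))
print(best)
```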

  6. Signal space analysis • y = X1 s1 + X2 s2 + n • Orthonormal basis that spans the signal space • p1 = s1 • p2 = (s2 − (s2ᴴs1) s1) / |s2 − (s2ᴴs1) s1| • p2 = (s2 − ρ s1) / λ • Projections onto the orthonormal basis: • y'1 = p1ᴴ y = y1 • y'2 = p2ᴴ y = (y2 − ρ y1) / λ • Noises are uncorrelated • ML solution: nearest point to the hypotheses • Any orthonormal set works
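A small numerical check of the Gram-Schmidt basis above, with assumed example signatures: p2 comes out unit-norm and orthogonal to p1 = s1, and the normalizer is λ = √(1 − ρ²).

```python
# Illustrative check of the Gram-Schmidt basis p1 = s1, p2 = (s2 - rho*s1)/lambda.
import numpy as np

s1 = np.array([1, 1, 1, 1]) / 2.0
s2 = np.array([1, 1, -1, 1]) / 2.0
rho = s1 @ s2
lam = np.sqrt(1 - rho ** 2)

p1 = s1
p2 = (s2 - rho * s1) / lam            # unit norm, orthogonal to s1

print(np.round([p1 @ p1, p2 @ p2, p1 @ p2], 6))   # -> 1, 1, 0
```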

  7. Two-user ML estimation (2) • y' = Lᴴ A b + n' • y'1 = A1b1 + ρ A2b2 + n'1 • y'2 = λ A2b2 + n'2 • E[n' n'ᵀ] = σ² I • Individually optimum ML estimation: • max over b1 of [ P(y' | b1, b2=+1) + P(y' | b1, b2=−1) ] • max { exp[ −((y'1 − (A1b1 + ρA2))² + (y'2 − λA2)²) / 2σ² ] + exp[ −((y'1 − (A1b1 − ρA2))² + (y'2 + λA2)²) / 2σ² ] } • More complex than joint estimation • No near-far effect
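A minimal sketch of the individually optimum rule for b1 (illustrative parameters; Gaussian likelihoods up to a constant factor): sum the likelihood over both values of b2 and pick the b1 with the larger marginal.

```python
# Illustrative sketch: individually optimum ML for b1 in the whitened model.
import numpy as np

rng = np.random.default_rng(2)
rho, sigma = 0.5, 0.4
R = np.array([[1.0, rho], [rho, 1.0]])
A = np.diag([1.0, 2.0])
L = np.linalg.cholesky(R)
b_true = np.array([+1, -1])

y = R @ A @ b_true + L @ (sigma * rng.standard_normal(2))
y_w = np.linalg.solve(L, y)                     # y' = L^H A b + white noise

def lik(b1, b2):
    e = y_w - L.T @ A @ np.array([b1, b2])
    return np.exp(-e @ e / (2 * sigma ** 2))    # Gaussian likelihood (up to a constant)

# Pick the b1 with the larger marginal likelihood lik(b1,+1) + lik(b1,-1)
b1_hat = max([+1, -1], key=lambda b1: lik(b1, +1) + lik(b1, -1))
print(b1_hat)
```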

  8. K-user • y' = Lᴴ A b + n' • E[n' n'ᵀ] = σ² I • Jointly optimum ML estimation: • The nearest point to one of the hypotheses • min { |L⁻¹y − Lᴴ A b|² } • max { 2bᵀA y − bᵀA R A b } • Combinatorial optimization • Complexity: O(2ᴷ) • Really complex for long codes • Complexity is even higher for asynchronous CDMA
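A minimal sketch of the O(2ᴷ) exhaustive search, with an assumed random correlation matrix, amplitudes, and noise level: enumerate all bit vectors and maximize the metric 2bᵀAy − bᵀARAb from this slide.

```python
# Illustrative sketch: brute-force jointly optimum ML for K users.
import itertools
import numpy as np

rng = np.random.default_rng(3)
K = 4
# A random but valid correlation matrix for illustration
S = rng.standard_normal((16, K))
S /= np.linalg.norm(S, axis=0)
R = S.T @ S
A = np.diag(rng.uniform(0.5, 2.0, K))
sigma = 0.3
b_true = rng.choice([-1, 1], K)

y = R @ A @ b_true + np.linalg.cholesky(R) @ (sigma * rng.standard_normal(K))

def metric(b):
    b = np.asarray(b, dtype=float)
    return 2 * b @ A @ y - b @ A @ R @ A @ b

b_hat = max(itertools.product([+1, -1], repeat=K), key=metric)   # 2^K hypotheses
print(b_true, np.array(b_hat))
```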

  9. Power trade-off regions • Optimum MUD vs. single user • BER = 3×10⁻⁵ • SNR = 12 dB • No penalty if ρ < 0.5

  10. De-correlator • y = R A b + n • y' = R⁻¹ y = A b + R⁻¹ n • Data bits are de-correlated • bk = sgn( y'k ) • New noises are correlated • E[n' n'ᵀ] = σ² R⁻¹ • Advantages: • Ak's not needed • Can be de-centralized • y'k = pkᴴ y • The best estimate when the Ak's are not known • BER = Q( Ak / (σ √([R⁻¹]kk)) ) • No near-far problem • Error free in the absence of noise
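A minimal de-correlator sketch with assumed parameters: multiply the matched-filter outputs by R⁻¹ and slice. The amplitudes never enter the detector, and a strong interferer does not hurt the weak user.

```python
# Illustrative sketch: de-correlating detector for two users.
import numpy as np

rng = np.random.default_rng(4)
rho, sigma = 0.7, 0.3
R = np.array([[1.0, rho], [rho, 1.0]])
A = np.diag([1.0, 3.0])                 # strong interferer: no near-far problem here
b_true = np.array([+1, -1])

y = R @ A @ b_true + np.linalg.cholesky(R) @ (sigma * rng.standard_normal(2))
y_dec = np.linalg.solve(R, y)           # R^-1 y = A b + R^-1 n
b_hat = np.sign(y_dec)
print(b_hat)
```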

  11. Signal space analysis • y = X1 s1 + X2 s2 + … + XK sK + n • For each user, find a basis vector orthogonal to the other users' signatures • Two-user case: • p1 ⊥ s2 • p2 ⊥ s1 • Decision regions • Independent of A • Pass through the origin • Noises are correlated • Noise enhancement

  12. Power trade-off regions • De-correlator vs. single user and optimum MUD • BER = 3×10⁻⁵ • SNR = 12 dB • Minimum total power when users have the same power

  13. MMSE detector • y = R A b + n • y' = G y = G( R A b + n ) • bk = sgn( y'k ) • Minimize MSE: E[ |y' − b|² ] • G = (R + σ² A⁻²)⁻¹ • Notes: • Ak's are needed • Can be de-centralized • y'k = pkᴴ y • Performance • Behaves like the de-correlator at high SNR • Close to the single-user matched filter (near-optimum) at low SNR • No near-far problem • Error free with no noise
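A minimal MMSE-detector sketch with assumed parameters, applying G = (R + σ²A⁻²)⁻¹ to the matched-filter outputs; unlike the de-correlator, the amplitudes are needed to form G.

```python
# Illustrative sketch: linear MMSE detector for two users.
import numpy as np

rng = np.random.default_rng(5)
rho, sigma = 0.7, 0.5
R = np.array([[1.0, rho], [rho, 1.0]])
A = np.diag([1.0, 3.0])
b_true = np.array([+1, -1])

y = R @ A @ b_true + np.linalg.cholesky(R) @ (sigma * rng.standard_normal(2))

G = np.linalg.inv(R + sigma ** 2 * np.linalg.inv(A @ A))   # needs the amplitudes
b_hat = np.sign(G @ y)
print(b_hat)

# As sigma -> 0, G -> R^-1 (the de-correlator); at low SNR the solution
# weights toward the matched filter, which is why MMSE sits between the two.
```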

  14. Successive cancellation

  15. Successive cancellation • Detect one user at a time and cancel its interference • Popular order: decreasing received power • Best order: decreasing correlator output power • For the two-user case: • Detect user 2 first • Subtract its contribution from the Rx signal • Near-far effect can happen
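A minimal two-user successive-cancellation sketch (assumed parameters, with user 2 received much stronger): detect user 2 from its matched-filter output, subtract its reconstructed contribution from y1, then detect user 1.

```python
# Illustrative sketch: successive interference cancellation, user 2 first.
import numpy as np

rng = np.random.default_rng(6)
rho, sigma = 0.5, 0.3
A1, A2 = 1.0, 3.0                      # user 2 is received much stronger
b_true = np.array([+1, -1])

R = np.array([[1.0, rho], [rho, 1.0]])
A = np.diag([A1, A2])
y = R @ A @ b_true + np.linalg.cholesky(R) @ (sigma * rng.standard_normal(2))

b2_hat = np.sign(y[1])                 # user 2 straight from its matched filter
y1_clean = y[0] - A2 * b2_hat * rho    # cancel user 2's estimated contribution in y1
b1_hat = np.sign(y1_clean)
print(b1_hat, b2_hat)

# If b2_hat is wrong (or A2 is known imperfectly), the cancellation adds
# interference instead of removing it, which is where the near-far effect re-enters.
```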

  16. Power trade-off regions • vs. Optimum MUD • User 2 is detected first • BER1 = BER2 = 3×10⁻⁵ • ρ = 0.5 → SNR1 = 12 dB, SNR2 = 15 dB • Same power: SNR1 = 18 dB, SNR2 = 18 dB

  17. Decision feedback detector • y' = L⁻¹y = Lᴴ A b + L⁻¹n = Lᴴ A b + n' • y'1 = A1b1 + ρ A2b2 + n'1 • y'2 = λ A2b2 + n'2 • Detect b2 first • Cancel b2 in y'1, then detect b1 • BERk ≈ Q( Ak Lkk / σ ) • Lower effective signal amplitude (Lkk ≤ 1) • No near-far effect
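A minimal decision-feedback sketch with assumed parameters: whiten with L⁻¹, decide b2 from y'2 (which is free of b1), then feed that decision back before deciding b1.

```python
# Illustrative sketch: two-user decision-feedback detector.
import numpy as np

rng = np.random.default_rng(7)
rho, sigma = 0.5, 0.3
R = np.array([[1.0, rho], [rho, 1.0]])
A = np.diag([1.0, 2.0])
L = np.linalg.cholesky(R)
b_true = np.array([+1, -1])

y = R @ A @ b_true + L @ (sigma * rng.standard_normal(2))
y_w = np.linalg.solve(L, y)            # y'1 = A1 b1 + rho A2 b2 + n'1, y'2 = lam A2 b2 + n'2

b2_hat = np.sign(y_w[1])               # decide the interference-free user first
b1_hat = np.sign(y_w[0] - rho * A[1, 1] * b2_hat)   # feed the decision back
print(b1_hat, b2_hat)
```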

  18. Reading • Verdu 4.1, 5.1, 5.5, 6.1, 6.2, 6.3, 7.1, 7.5
