
Workshop on Random Matrix Theory and Wireless Communications. Bridging the Gaps: Free Probability and Channel Capacity. Antonia Tulino, Università degli Studi di Napoli. Chautauqua Park, Boulder, Colorado, July 17, 2008.




Presentation Transcript


  1. Workshop on Random Matrix Theory and Wireless Communications Bridging the Gaps: Free Probability and Channel Capacity • Antonia Tulino • Università degli Studi di Napoli • Chautauqua Park, Boulder, Colorado, July 17, 2008

  2. Linear Vector Channel y = Hx + n • noise n = AWGN + interference • N-dimensional output y • K-dimensional input x • (N×K) channel matrix H • A variety of communication problems are captured by simply reinterpreting K, N, and H: • Fading • Wideband • Multiuser • Multiantenna

  3. Role of the Singular Values Mutual Information: I(SNR) = (1/N) log₂ det(I + SNR·HH†) = (1/N) Σᵢ log₂(1 + SNR·λᵢ(HH†)) Ergodic case: the relevant quantity is E[I(SNR)], averaged over the distribution of the squared singular values. Non-Ergodic case: the mutual information is a random variable, and the outage probability P[I(SNR) < R] governs performance.
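Since det(I + γHH†) = Πᵢ(1 + γσᵢ²), the mutual information depends on H only through its squared singular values; a numerical sketch (dimensions, SNR, and the Gaussian H are illustrative assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, gamma = 8, 4, 10.0
H = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2 * K)

# log-det form: (1/N) log2 det(I + gamma H H^dagger)
I_logdet = np.log2(np.linalg.det(np.eye(N) + gamma * H @ H.conj().T)).real / N

# singular-value form: (1/N) sum_i log2(1 + gamma sigma_i^2)
sv2 = np.linalg.svd(H, compute_uv=False) ** 2
I_sv = np.sum(np.log2(1 + gamma * sv2)) / N

print(I_logdet, I_sv)   # the two forms coincide
```
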

  4. Role of the Singular Values Minimum Mean-Square Error (MMSE): MMSE = (1/K) tr[(I + SNR·H†H)⁻¹] = (1/K) Σᵢ 1/(1 + SNR·λᵢ(H†H))
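The MMSE likewise depends on H only through its squared singular values; a numerical sketch under illustrative assumptions (dimensions, SNR, Gaussian H):

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, gamma = 8, 4, 10.0
H = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2 * K)

# trace form: (1/K) tr[(I + gamma H^dagger H)^{-1}]
mmse_tr = np.trace(np.linalg.inv(np.eye(K) + gamma * H.conj().T @ H)).real / K

# singular-value form: average of 1/(1 + gamma sigma_i^2) over the K eigenvalues
sv2 = np.linalg.svd(H, compute_uv=False) ** 2
sv2 = np.concatenate([sv2, np.zeros(K - len(sv2))])  # pad with zero modes if K > min(N, K)
mmse_sv = np.mean(1.0 / (1 + gamma * sv2))

print(mmse_tr, mmse_sv)
```
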

  5. H-Models • Independent and identically distributed entries • Separable correlation model • UIU-model with independent, arbitrarily distributed entries, and with unitary factors uniformly distributed over the manifold of complex matrices such that [the stated constraint holds]

  6. Gaussian Erasure Channels Flat Fading & Deterministic ISI with random erasure mechanisms: • link congestion/failure (networks) • cellular systems with unreliable wired infrastructure • impulse noise (DSL) • faulty transducers (sensor networks)

  7. d-Fold Vandermonde Matrix, with phases i.i.d. uniform in [0, 1]^d • Sensor networks • Multiantenna multiuser communications • Detection of distributed targets

  8. Flat Fading & Deterministic ISI: {Aᵢ} an i.i.d. sequence

  9. Formulation • S = asymptotically circulant matrix (stationary input with PSD S(f)) • [the ISI matrix] = asymptotically circulant matrix • Grenander-Szegő theorem: the eigenvalues of S are asymptotically distributed as samples of its symbol
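The Grenander-Szegő statement can be illustrated numerically: for a banded Hermitian Toeplitz matrix built from the Fourier coefficients of a PSD S(f), the sorted eigenvalues approach the sorted samples S(k/n). The PSD below is an illustrative assumption:

```python
import numpy as np

def S(f):
    return 1.0 + 0.8 * np.cos(2 * np.pi * f)   # illustrative PSD, lags 0 and +-1 only

n = 512
f = np.arange(n) / n
c = np.zeros(n)
c[0], c[1] = 1.0, 0.4                          # Fourier coefficients of S(f)
T = np.array([[c[abs(i - j)] for j in range(n)] for i in range(n)])  # Toeplitz

eig_T = np.sort(np.linalg.eigvalsh(T))
samples = np.sort(S(f))
# Grenander-Szego: the two sorted sequences agree closely for large n
print(np.max(np.abs(eig_T - samples)))
```
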

  10. Deterministic ISI & |Aᵢ| = 1. The waterfilling input power spectral density is given by: S(f) = [θ − 1/(γ|H(f)|²)]⁺, where the water level θ is chosen so that S(f) integrates to the input power. Key Tool: Grenander-Szegő theorem on the distribution of the eigenvalues of large Toeplitz matrices
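A minimal waterfilling sketch; the channel gain profile, SNR, and unit power constraint are illustrative assumptions, and the water level θ is found by bisection:

```python
import numpy as np

def waterfill(gain, gamma, power=1.0):
    """S(f) = max(theta - 1/(gamma*gain(f)), 0) with mean power constraint."""
    lo, hi = 0.0, 1.0 / (gamma * gain.min()) + power + 1.0
    for _ in range(100):                 # bisection on the water level theta
        theta = 0.5 * (lo + hi)
        S = np.maximum(theta - 1.0 / (gamma * gain), 0.0)
        if S.mean() > power:
            hi = theta
        else:
            lo = theta
    return S, theta

f = np.linspace(0, 1, 256, endpoint=False)
gain = 1.0 + 0.9 * np.cos(2 * np.pi * f) ** 2   # illustrative |H(f)|^2
S, theta = waterfill(gain, gamma=2.0)
print(S.mean())   # ~1.0 by construction
```
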

  11. Deterministic ISI & Flat Fading Key Question: The distribution of the eigenvalues of a large-dimensional random matrix: • S = asymptotically circulant matrix • A = random diagonal fading matrix

  12. Aᵢ = eᵢ ∈ {0,1} Key Question: The distribution of the eigenvalues of a large-dimensional random matrix: • S = asymptotically circulant matrix • E = random 0-1 diagonal matrix

  13. RANDOM MATRIX THEORY: η- & Shannon-Transform The η- and Shannon-transform of a nonnegative definite random matrix with asymptotic ESD: η_X(γ) = E[1/(1 + γX)], V_X(γ) = E[log₂(1 + γX)], with X a nonnegative random variable whose distribution is the asymptotic ESD, while γ is a nonnegative real number. A. M. Tulino and S. Verdú, “Random Matrices and Wireless Communications,” Foundations and Trends in Communications and Information Theory, vol. 1, no. 1, June 2004.
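From the Tulino-Verdú definitions, η_X(γ) = E[1/(1+γX)] and V_X(γ) = E[log₂(1+γX)]; a Monte Carlo sketch using the empirical spectrum of a Wishart-type matrix as X (sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 400, 200
H = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2 * N)
X = np.linalg.eigvalsh(H @ H.conj().T)   # empirical spectrum of H H^dagger (incl. zero modes)

def eta(g):
    return np.mean(1.0 / (1.0 + g * X))      # eta-transform at SNR g

def shannon(g):
    return np.mean(np.log2(1.0 + g * X))     # Shannon-transform (bits) at SNR g

print(eta(10.0), shannon(10.0))
```
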

  14. RANDOM MATRIX THEORY: Shannon-Transform Let A be a nonnegative definite random matrix. Theorem: The Shannon and η-transforms are related through [expression], where [the auxiliary variable] is defined by the fixed-point equation [omitted in transcript].
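One relation between the two transforms follows directly from the definitions: d/dγ E[log₂(1+γX)] = E[X/(1+γX)]/ln 2 = (1 − η_X(γ))/(γ ln 2). A numerical check (the exponential X is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.exponential(size=200_000)          # any nonnegative X works

def eta(g):
    return np.mean(1.0 / (1.0 + g * X))

def shannon(g):
    return np.mean(np.log2(1.0 + g * X))

g, h = 2.0, 1e-4
lhs = (shannon(g + h) - shannon(g - h)) / (2 * h)   # numerical d/dg of V_X
rhs = (1.0 - eta(g)) / (g * np.log(2.0))            # (1 - eta)/(g ln 2)
print(lhs, rhs)
```

The identity holds samplewise, so the empirical transforms satisfy it to finite-difference accuracy even with Monte Carlo noise.
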

  15. Properties of [the fixed-point solution] • it is the solution to the equation [omitted in transcript] • monotonically increasing with γ • monotonically decreasing with y

  16. η-Transform Theorem: The η-transform of [the matrix of interest] is [expression], where [the auxiliary variable] is the solution to: [fixed-point equation]

  17. Shannon-Transform Theorem: The Shannon-transform of [the matrix of interest] is [expression], where a and n are the solutions to: [fixed-point equations]

  18. Flat Fading & Deterministic ISI: stationary Gaussian inputs with power spectral density S(f). Theorem: The mutual information is: [expression], with [auxiliary quantities defined by fixed-point equations]



  21. Special Case: No Fading

  22. Special Case: Memoryless Channels

  23. Special Case: Gaussian Erasure Channels: stationary Gaussian inputs with power spectral density S(f). Theorem: The mutual information is: [expression], with [auxiliary quantities defined by fixed-point equations]
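A brute-force sketch of the erasure model (PSD, SNR, erasure probability, and block length are all illustrative assumptions): build the circulant input covariance from S(f), erase outputs i.i.d., and average the log-det mutual information:

```python
import numpy as np

rng = np.random.default_rng(4)
n, gamma, e = 256, 5.0, 0.7      # block length, SNR, keep-probability e

f = np.arange(n) / n
Sf = 1.0 + 0.8 * np.cos(2 * np.pi * f)     # illustrative input PSD
F = np.fft.fft(np.eye(n)) / np.sqrt(n)     # unitary DFT matrix
Sigma = (F.conj().T * Sf) @ F              # circulant covariance with eigenvalues Sf

trials = []
for _ in range(20):
    E = np.diag(rng.random(n) < e).astype(float)   # each output kept w.p. e
    M = np.eye(n) + gamma * E @ Sigma @ E
    trials.append(np.linalg.slogdet(M)[1] / np.log(2) / n)  # (1/n) log2 det
mi = np.mean(trials)
print(mi)
```
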

  24. Flat Fading & Deterministic ISI: stationary Gaussian inputs with power spectral density S(f). Let [definitions, omitted in transcript]. Theorem: The mutual information is: [expression]

  25. Example: n = 200

  26. Example: n = 1000

  27. Input Optimization Theorem: Let [the channel] be a random matrix whose i-th column satisfies [the stated condition], with [the input covariance] chosen so that: [the optimality condition holds]. A. M. Tulino, A. Lozano and S. Verdú, “Capacity-Achieving Input Covariance for Single-User Multi-Antenna Channels,” IEEE Trans. on Wireless Communications, 2006

  28. Input Optimization Theorem: The capacity-achieving input power spectral density is: [expression], where [the water level] is chosen so that [the power constraint is met]

  29. Input Optimization Corollary: The effect of fading on the capacity-achieving input power spectral density is an SNR penalty: a factor κ < 1 regulates the amount of water admitted on each frequency, tailoring the waterfilling solution for γ (with the fading-free water level for γ) to fading channels.

  30. η-Transform Theorem: The η-transform of [the matrix of interest] is [expression], where [the auxiliary variable] is the solution to: [fixed-point equation]

  31. Proof: Key Ingredient • We can replace S by its asymptotically equivalent circulant counterpart, Σ = FΛF† • Let Q = EF, denote by qᵢ the i-th column of Q, and let [the relevant quadratic forms]

  32. Proof: Matrix inversion lemma:
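The rank-one (Sherman-Morrison) form of the matrix inversion lemma used at this step can be checked numerically; a sketch with an arbitrary Hermitian positive definite B and vector q:

```python
import numpy as np

# (B + q q^dagger)^{-1} = B^{-1} - B^{-1} q q^dagger B^{-1} / (1 + q^dagger B^{-1} q)
rng = np.random.default_rng(5)
n = 6
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = A @ A.conj().T + np.eye(n)               # Hermitian positive definite
q = rng.standard_normal(n) + 1j * rng.standard_normal(n)

Binv = np.linalg.inv(B)
lhs = np.linalg.inv(B + np.outer(q, q.conj()))
rhs = Binv - (Binv @ np.outer(q, q.conj()) @ Binv) / (1 + q.conj() @ Binv @ q)
print(np.max(np.abs(lhs - rhs)))
```
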

  33. Proof:

  34. Proof: Lemma:

  35. Asymptotics • Low-power (γ → 0) • High-power (γ → ∞)

  36. Asymptotics: High-SNR At large SNR we can closely approximate the mutual information linearly; we need the high-SNR slope and the high-SNR dB offset, where [definitions omitted in transcript]
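Numerically, the high-SNR slope of (1/N) log₂ det(I + γHH†) as a function of log₂ γ equals min(N, K)/N, the number of nonzero singular values per dimension; a sketch with illustrative dimensions:

```python
import numpy as np

rng = np.random.default_rng(6)
N, K = 8, 4
H = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2)

def I(gamma):
    # mutual information per receive dimension at SNR gamma
    return np.log2(np.linalg.det(np.eye(N) + gamma * H @ H.conj().T)).real / N

g1, g2 = 1e4, 1e6
slope = (I(g2) - I(g1)) / (np.log2(g2) - np.log2(g1))
print(slope, min(N, K) / N)    # slope approaches min(N, K)/N
```
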

  37. Asymptotics: High-SNR Theorem: Let [the stated conditions hold], and define the generalized bandwidth: [expression]

  38. Asymptotics • Sporadic Erasure (e → 0) • Sporadic Non-Erasure (e → 1)

  39. Asymptotics: Sporadic Erasures (e → 0) Theorem: For any output power spectral density and [SNR regime]: [bound]. High SNR: [the behavior of the] memoryless noisy erasure channel, where [the constant] is the water level of the PSD that achieves [capacity]. Low SNR: [expression]. Theorem: For sporadic erasures: [expansion]

  40. Asymptotics: Sporadic Non-Erasures (e → 1) Theorem: [expansion]. Theorem: Optimizing over [the input PSD], with [a constant involving] the maximum channel gain: [expression]

  41. Bounds: Theorem: The mutual information rate is lower bounded by: [expression]. Equality for S(f) = 1.

  42. Bounds: Theorem: The mutual information rate is upper bounded by:

  43. d-Fold Vandermonde Matrix, multiplied by: • a diagonal matrix (either random or deterministic) with compactly supported measure • a diagonal matrix (either random or deterministic) with compactly supported measure

  44. d-Fold Vandermonde Matrix Theorem: The η-transform of [the matrix] is [expression]. The Shannon-transform is [expression].

  45. d-Fold Vandermonde Matrix Theorem: The p-th moment of [the matrix] is: [expression]
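The moments can be estimated by Monte Carlo; a sketch for the 1-fold case with N/K fixed (sizes and trial count are illustrative; with the 1/√N normalization below, the first moment is exactly 1):

```python
import numpy as np

rng = np.random.default_rng(7)
N, K = 400, 200      # illustrative sizes, ratio K/N = 1/2

def moment(p, trials=10):
    """Estimate E[(1/K) tr((V^dagger V)^p)] for a Vandermonde V with uniform phases."""
    acc = 0.0
    for _ in range(trials):
        x = rng.random(K)                              # phases i.i.d. uniform on [0, 1]
        V = np.exp(-2j * np.pi * np.arange(N)[:, None] * x[None, :]) / np.sqrt(N)
        M = V.conj().T @ V
        acc += np.trace(np.linalg.matrix_power(M, p)).real / K
    return acc / trials

print(moment(1), moment(2), moment(3))
```
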

  46. Summary • Asymptotic distribution of A½ S A½: a new result at the intersection of the asymptotic eigenvalue distribution of Toeplitz matrices and that of random matrices • The mutual information of a channel with ISI and fading • Optimality of waterfilling in the presence of fading known at the receiver • Easily computable asymptotic expressions in various regimes (low and high SNR) • New results for d-fold Vandermonde matrices and for their products with diagonal matrices
