
Backing off from infinity: fundamental communication limits in non-asymptotic regimes

Andrea Goldsmith. Thanks to collaborators Chen, Eldar, Grover, Mirghaderi, Weissman.



Presentation Transcript


  1. Backing off from infinity: fundamental communication limits in non-asymptotic regimes Andrea Goldsmith Thanks to collaborators Chen, Eldar, Grover, Mirghaderi, Weissman

  2. Information Theory and Asymptopia • Capacity with asymptotically small error achieved by asymptotically long codes. • Defining capacity in terms of asymptotically small error and infinite delay is brilliant! • Has also been limiting • Cause of unconsummated union between networks and information theory • Optimal compression based on properties of asymptotically long sequences • Leads to optimality of separation • Other forms of asymptopia • Infinite SNR, energy, sampling, precision, feedback, …

  3. Why back off? Theory not informing practice

  4. Theory vs. practice What else lives in asymptopia?

  5. Backing off from: infinite blocklength • Recent developments on finite blocklength • Channel codes (capacity C at finite n) • Source codes (entropy H or rate distortion R(D)) [Ingber, Kochman’11; Kostina, Verdu’11] [Wang et al.’11; Kostina, Verdu’12] • Separation not optimal at finite blocklength

  6. Grand Challenges Workshop: CTW Maui • From the perspective of the cellular industry, the Shannon bounds evaluated by Slepian are within 0.5 dB for a packet size of 30 bits or more for the real AWGN channel at 0.5 bits/sym, for BLER = 1e-4. In this perhaps narrow context there is not much uncertainty for performance evaluations. • For cellular and general wireless channels, finite blocklength bounds for practical fading models are needed, and there is very little work along those lines. • Even for the AWGN channel, the computational effort of evaluating the Shannon bounds is formidable. • This indicates a need for accurate approximations, such as those recently developed based on the idea of channel dispersion.
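The channel-dispersion approximation mentioned above can be sketched numerically. This is a minimal illustration of the normal approximation R ≈ C − √(V/n)·Q⁻¹(ε) for the real AWGN channel, using the standard capacity and dispersion expressions; the function name is mine, not from the slides.

```python
from math import log2, sqrt, e
from statistics import NormalDist

def awgn_normal_approx(snr, n, eps):
    """Normal (dispersion) approximation to the maximal rate, in
    bits/symbol, of the real AWGN channel at blocklength n and
    block error rate eps."""
    C = 0.5 * log2(1 + snr)                                      # capacity
    V = (snr * (snr + 2)) / (2 * (snr + 1) ** 2) * log2(e) ** 2  # dispersion
    q_inv = NormalDist().inv_cdf(1 - eps)                        # Q^{-1}(eps)
    return C - sqrt(V / n) * q_inv

# At SNR = 1 (capacity 0.5 bits/sym, the operating point on the slide),
# the gap to capacity shrinks as O(1/sqrt(n)).
```

The back-off from capacity scales as 1/√n, which is why even moderate packet sizes land close to the Shannon bound in the CTW observation above.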

  7. Diversity vs. Multiplexing Tradeoff • Use antennas for multiplexing (high rate, but error prone) or for diversity (low Pe) • Diversity/multiplexing tradeoff (Zheng/Tse) • What is infinite here?
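The Zheng-Tse tradeoff referenced above has a simple closed form at infinite blocklength: a piecewise-linear curve through the points (k, (M−k)(N−k)). A minimal sketch (function name is mine):

```python
import numpy as np

def dmt(r, M, N):
    """Zheng-Tse diversity-multiplexing tradeoff for an M x N MIMO
    channel: the piecewise-linear curve connecting the points
    (k, (M-k)(N-k)) for k = 0, ..., min(M, N)."""
    ks = np.arange(min(M, N) + 1)
    ds = (M - ks) * (N - ks)
    return np.interp(r, ks, ds)   # diversity gain at multiplexing gain r

# Endpoints: full diversity M*N at r = 0, zero diversity at r = min(M, N).
```

The curve makes the slide's tension concrete: pushing the multiplexing gain r up necessarily pulls the achievable diversity order down.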

  8. Backing off from: infinite SNR • High SNR Myth: Use some spatial dimensions for multiplexing and others for diversity • Reality: Use all spatial dimensions for one or the other* • Diversity is wasteful of spatial dimensions with HARQ • Adapt modulation/coding to channel SNR *“Transmit Diversity vs. Spatial Multiplexing in Modern MIMO Systems”, Lozano/Jindal

  9. Diversity-Multiplexing-ARQ Tradeoff • Suppose we allow ARQ with incremental redundancy • ARQ is a form of diversity [Caire/El Gamal 2005] • [Figure: diversity-multiplexing curves for ARQ window sizes L = 1, 2, 3, 4]

  10. Joint Source/Channel Coding • Use antennas for multiplexing: high-rate ST code with a high-rate quantizer, but error prone • Use antennas for diversity: high-diversity ST code with a low-rate quantizer, low Pe • How should antennas be used? Depends on the end-to-end metric

  11. Joint Source-Channel Coding w/MIMO • Chain: source encoder → index assignment → channel encoder → MIMO channel → channel decoder → inverse index assignment → source decoder • Increased rate at the source encoder decreases source distortion, but permits less diversity at the channel encoder, resulting in more errors and maybe higher total distortion • A joint design is needed

  12. Antenna Assignment vs. SNR

  13. Relaying in wireless networks • Intermediate nodes (relays) in a route help forward the packet to its final destination • Decode-and-forward (store-and-forward) most common: packet decoded, then re-encoded for transmission; removes noise at the expense of complexity • Amplify-and-forward: relay just amplifies the received packet; also amplifies noise, so it works poorly for long routes and at low SNR • Compress-and-forward: relay compresses the received packet; used when the source-relay link is good but the relay-destination link is weak • Capacity of the relay channel is unknown: we only have bounds
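The "also amplifies noise" point can be made concrete with the well-known two-hop variable-gain AF end-to-end SNR, γ1γ2/(γ1 + γ2 + 1), which always sits below the weaker hop. The half-duplex comparison with decode-and-forward below is a simplified sketch (ideal DF, no direct source-destination link); function names are mine.

```python
from math import log2

def af_end_to_end_snr(snr1, snr2):
    """End-to-end SNR of two-hop variable-gain amplify-and-forward
    with per-hop SNRs snr1, snr2: noise from hop 1 is re-amplified."""
    return (snr1 * snr2) / (snr1 + snr2 + 1)

def two_hop_rates(snr1, snr2):
    """Half-duplex two-hop rates (bits/sym): AF vs. idealized DF,
    where DF is limited by the weaker individual hop."""
    r_af = 0.5 * log2(1 + af_end_to_end_snr(snr1, snr2))
    r_df = 0.5 * min(log2(1 + snr1), log2(1 + snr2))
    return r_af, r_df
```

In this simplified model DF never does worse than AF, and the AF penalty grows as hops (and hence re-amplified noise) accumulate, matching the "works poorly for long routes" bullet.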

  14. Cooperation in Wireless Networks • Relaying is a simple form of cooperation • Many more complex ways to cooperate: virtual MIMO, generalized relaying, interference forwarding, and one-shot/iterative conferencing • Many theoretical and practical issues: overhead, forming groups, dynamics, full-duplex, synch, …

  15. Generalized Relaying and Interference Forwarding • Relay observes Y3 = X1 + X2 + Z3 and transmits X3 = f(Y3); the receivers observe Y4 = X1 + X2 + X3 + Z4 and Y5 = X1 + X2 + X3 + Z5 (analog network coding) • Relay can forward all or part of the messages • Relay can forward interference, to help the receivers subtract it out • Much room for innovation

  16. Beneficial to forward both interference and message

  17. In fact, it can achieve capacity • For large powers Ps, P1, P2, …, analog network coding (AF) approaches capacity (Maric/Goldsmith’12) • Asymptopia?

  18. Interference Alignment • Addresses the number of interference-free signaling dimensions in an interference channel • Based on our orthogonal analysis earlier, it would appear that resources need to be divided evenly, so only 2BT/N dimensions are available per user • Jafar and Cadambe showed that by aligning interference, 2BT/2 dimensions are available • Everyone gets half the cake! (Except at finite SNRs…)
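The dimension counting on this slide reduces to simple arithmetic, sketched below as per-user fractions of the 2BT total dimensions; the alignment result holds asymptotically in SNR (Cadambe/Jafar), and the function names are mine.

```python
def dof_per_user_orthogonal(K):
    """Orthogonal sharing (TDMA/FDMA): the 2BT signaling dimensions
    are split evenly, so each of K users gets a 1/K fraction."""
    return 1 / K

def dof_per_user_alignment(K):
    """Interference alignment (Cadambe/Jafar): at asymptotically high
    SNR each user gets half the dimensions, independent of K."""
    return 1 / 2
```

For K = 10 users, alignment promises a 5x gain in signaling dimensions over orthogonalization, which is exactly why the finite-SNR caveat on the slide matters.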

  19. Backing off from: infinite SNR • High SNR Myth: Decode-and-forward is equivalent to amplify-and-forward, which is optimal at high SNR • Noise amplification drawback of AF diminishes at high SNR • Amplify-and-forward achieves full degrees of freedom in MIMO systems (Borade/Zheng/Gallager’07) • At high SNR, amplify-and-forward is within a constant gap of the capacity upper bound as the received powers increase (Maric/Goldsmith’07) • Reality: optimal relaying is unknown at most SNRs • Amplify-and-forward is highly suboptimal outside the high per-node SNR regime, which is not always the high-power or high-channel-gain regime • Amplify-and-forward has an unbounded gap from capacity in the high channel gain regime (Avestimehr/Diggavi/Tse’11); decode-and-forward is used in practice • Relay strategy should depend on the worst link

  20. Capacity and Feedback • Capacity under feedback largely unknown: channels with memory, finite-rate and/or noisy feedback, multiuser channels, multihop networks • ARQ is ubiquitous in practice • Works well on finite-rate noisy feedback channels • Reduces end-to-end delay • Why hasn’t theory met practice when it comes to feedback?

  21. Point-to-Point Memoryless Channels: Perfect Feedback • Shannon: feedback does not increase the capacity of DMCs • Schalkwijk-Kailath scheme for AWGN channels • Low-complexity linear recursive scheme • Achieves capacity • Double-exponential decay in error probability
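The linear recursion behind Schalkwijk-Kailath can be simulated in a few lines. This is a minimal sketch of the estimation-error recursion, assuming a unit-variance-noise AWGN channel, power `snr` per channel use, a message parameter with unit prior variance, and noiseless feedback of the receiver's running MMSE estimate; names and the exact normalization are mine.

```python
import numpy as np

def sk_feedback(theta, n, snr, rng):
    """Schalkwijk-Kailath-style recursion: each channel use, the
    transmitter (which knows the receiver's estimate via noiseless
    feedback) sends the current estimation error scaled to power snr."""
    est = 0.0
    var = 1.0                                    # prior variance of theta
    for _ in range(n):
        x = np.sqrt(snr / var) * (theta - est)   # error scaled to power snr
        y = x + rng.normal()                     # AWGN, unit noise variance
        est += np.sqrt(var / snr) * (snr / (snr + 1)) * y   # MMSE update
        var /= 1 + snr                           # error variance contracts
    return est, var

# After n uses the error variance is (1 + snr)^(-n); for a fixed message
# constellation this yields the double-exponential error decay above.
```

The geometric contraction of `var` by (1 + SNR) per use is the whole trick: the receiver's uncertainty shrinks exponentially in n, so the error probability of a discrete message set decays doubly exponentially.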

  22. Backing off from: Perfect Feedback • [Shannon ’59]: no feedback • [Pinsker, Gallager et al.]: perfect feedback (infinite rate, no noise) • [Kim et al. ’07/’10]: feedback with AWGN • [Polyanskiy et al. ’10]: noiseless feedback reduces the minimum energy per bit when nR is fixed and n → ∞

  23. Gaussian Channel with Rate-Limited Feedback • Feedback is rate-limited; no noise • Constraints on the forward power and feedback rate • Objective: choose the encoder and feedback module to maximize the decay rate of the error probability

  24. A super-exponential error probability is achievable if and only if [condition on the feedback rate] • [Regime 1]: the error exponent is finite but higher than the no-feedback error exponent • [Regime 2]: double-exponential error probability • [Regime 3]: L-fold exponential error probability

  25. Feedback under Energy/Delay Constraint • Forward channel: m-bit encoder and decoder; feedback channel: m-bit encoder and decoder • If decoding succeeds, send back a termination alarm; otherwise, resend with additional energy • If the termination alarm is received, report the current estimate as the decoded message • Objective: choose the energy allocation to minimize the overall probability of error, subject to energy/delay constraints

  26. Feedback Gain under Energy/Delay Constraint • Depends on the error probability model ε(·) • Exponential error model ε(x) = βe^(−αx): applicable when Tx energy dominates; feedback gain is high if the total energy budget is large enough; no feedback gain for energy budgets below a threshold • Super-exponential error model ε(x) = βe^(−αx²): applicable when Tx and coding energy are comparable; no feedback gain for energy budgets above a threshold

  27. Backing off from: perfect feedback • Memoryless point-to-point channels: capacity unchanged with perfect feedback • Simple linear scheme improves the error exponent (Schalkwijk-Kailath: double-exponential error decay) • Feedback reduces energy consumption • Capacity of feedback channels largely unknown • Unknown for general channels with memory and perfect feedback • Unknown under finite-rate and/or noisy feedback • Unknown in general for multiuser channels • Unknown in general for multihop networks • ARQ is ubiquitous in practice • Assumes channel errors • Works well on finite-rate noisy feedback channels • Reduces end-to-end delay

  28. How to use feedback in wireless networks? • Output feedback (noisy/compressed) • Channel information (CSI) • Acknowledgements • Something else? • Interesting applications to neuroscience

  29. Backing off from: infinite sampling New Channel Sampling Mechanism (rate fs) • For a given sampling mechanism (i.e. a “new” channel) • What is the optimal input signal? • What is the tradeoff between capacity and sampling rate? • What known sampling methods lead to highest capacity? • What is the optimal sampling mechanism? • Among all possible (known and unknown) sampling schemes

  30. Capacity under Sampling w/Prefilter • Theorem: channel capacity is determined by water-filling over the “folded” SNR filtered by S(f) • The optimal prefilter suppresses aliasing
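The water-filling operation the theorem refers to can be illustrated in isolation. This is a generic water-filling sketch over discrete subchannels, not the folded-SNR formula itself (that equation was a figure in the original slide); function names are mine.

```python
import numpy as np

def waterfill(noise_levels, power):
    """Water-filling over parallel subchannels: given per-subchannel
    noise-to-gain levels N_i, find the water level mu satisfying
    sum(max(mu - N_i, 0)) = power, and return (allocation, mu)."""
    levels = np.asarray(noise_levels, float)
    n = np.sort(levels)
    # try pouring into the k lowest subchannels, for k = len, ..., 1
    for k in range(len(n), 0, -1):
        mu = (power + n[:k].sum()) / k
        if mu > n[k - 1]:            # all k subchannels stay above water
            break
    p = np.maximum(mu - levels, 0.0)
    return p, mu

def capacity_bits(noise_levels, power):
    """Total rate of the water-filling allocation, bits per use."""
    p, _ = waterfill(noise_levels, power)
    return float(np.sum(np.log2(1 + p / np.asarray(noise_levels, float))))
```

Deep subchannels (low noise-to-gain, i.e. high folded SNR) get the most power, and sufficiently poor subchannels get none, which is how the optimal prefilter's aliasing suppression shows up in the sampled-channel capacity.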

  31. Capacity not monotonic in fs • Consider a “sparse” channel • Capacity not monotonic in fs! • Single-branch sampling fails to exploit channel structure

  32. Filter Bank Sampling • Theorem: capacity of the sampled channel using a bank of m filters with aggregate rate fs • Similar to MIMO; no combining!

  33. Equivalent MIMO Channel Model • For each f, the sampled channel is an equivalent MIMO channel • Theorem 3: the capacity of the sampled channel using a bank of m filters with aggregate rate fs is achieved by pre-whitening, decoupling the MIMO channel, and water-filling over its singular values
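The decoupling step on this slide can be sketched generically: an SVD turns y = Hx + z into parallel scalar channels, and capacity water-fills the total power over the squared singular values. A minimal sketch (not the sampled-channel theorem itself; `mimo_capacity` is my name):

```python
import numpy as np

def mimo_capacity(H, power, noise_var=1.0):
    """Capacity (bits/use) of y = Hx + z with white noise, via SVD
    decoupling and water-filling over the singular values of H."""
    s = np.linalg.svd(H, compute_uv=False)
    inv_gains = noise_var / s[s > 1e-12] ** 2    # per-mode noise-to-gain
    # water-fill the power budget over the decoupled eigenmodes
    n = np.sort(inv_gains)
    for k in range(len(n), 0, -1):
        mu = (power + n[:k].sum()) / k
        if mu > n[k - 1]:
            break
    p = np.maximum(mu - inv_gains, 0.0)
    return float(np.sum(np.log2(1 + p / inv_gains)))
```

Pre-whitening in Theorem 3 plays the role of making the noise white so that this SVD decomposition applies mode by mode at each frequency f.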

  34. Joint Optimization of Input and Filter Bank • Selects the m branches with the m highest SNRs • Example (bank of 2 branches): choose the branches with the highest and 2nd-highest SNR, ignoring the low-SNR branches • Capacity now monotonic in fs • Can we do better?

  35. Sampling with Modulator+Filter (1 or more) • Theorem: a single modulator+filter branch can match the capacity of filter-bank sampling • Theorem: optimal among all time-preserving nonuniform sampling techniques of rate fs

  36. Backing off from: Infinite processing power Is Shannon-capacity still a good metric for system design?

  37. Our approach

  38. Power consumption via a network graph • Power consumed in nodes and wires • Extends early work of El Gamal et al.’84 and Thompson’80

  39. Fundamental area-time-performance tradeoffs • Area occupied by wires; encoding/decoding clock cycles • For encoding/decoding “good” codes, stay away from capacity! • Close to capacity we have large chip area, more time, and more power

  40. Total power diverges to infinity! • Regular LDPCs closer to the bound than capacity-approaching LDPCs! • Need novel code designs with short wires and good performance

  41. Conclusions • Information theory asymptopia has provided much insight and decades of sublime delight to researchers • Backing off from infinity is required for some problems to gain insight and fundamental bounds • New mathematical tools, and new ways of applying conventional tools, are needed for these problems • Many interesting applications in finance, biology, neuroscience, …
