
Cooperative Communications in Networks: Random coding for wireless multicast

Performance for fading channels and inter-session coding. Brooke Shrader and Anthony Ephremides, University of Maryland, October 2008. Joint work with Randy Cogill, University of Virginia.





Presentation Transcript


  1. Cooperative Communications in Networks: Random coding for wireless multicast. Performance for fading channels and inter-session coding. Brooke Shrader and Anthony Ephremides, University of Maryland, October 2008. Joint work with Randy Cogill, University of Virginia

  2. Introduction and Motivation • Alternatives to overcome MAC-layer channel errors in multicast transmission: repeatedly send packets (ARQ), or use network coding • Previous work: network coding outperforms ARQ for time-invariant channels; coding is used within (but not between) multicast sessions

  3. Random linear coding for multicast • Form random linear combinations of K packets: for packets s1, …, sK, form y = a1·s1 + a2·s2 + … + aK·sK, where the ai are generated randomly and uniformly from a u-ary alphabet and the sum is taken in the finite field. If ai = 0 for all i, generate new coefficients. • Transmit the coefficients ai in the packet header • Decode: solve a system of linear equations in the si
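
A minimal sketch of this encode/decode procedure, assuming a prime field GF(257) in place of the u-ary alphabet and small illustrative packet sizes (none of these values come from the talk):

```python
# Sketch of random linear coding over the prime field GF(P).
# P = 257 and the packet sizes below are illustrative assumptions.
import random

P = 257  # field size (prime); stands in for the u-ary alphabet

def encode(packets):
    """Form one random linear combination of the K packets (symbol lists)."""
    coeffs = [random.randrange(P) for _ in packets]
    while all(c == 0 for c in coeffs):          # slide: redraw an all-zero draw
        coeffs = [random.randrange(P) for _ in packets]
    combo = [sum(c * pkt[j] for c, pkt in zip(coeffs, packets)) % P
             for j in range(len(packets[0]))]
    return coeffs, combo                        # coeffs ride in the header

def rank(vectors):
    """Rank of coefficient vectors over GF(P), by Gaussian elimination."""
    rows, r = [list(v) for v in vectors], 0
    for col in range(len(rows[0]) if rows else 0):
        piv = next((i for i in range(r, len(rows)) if rows[i][col]), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        inv = pow(rows[r][col], P - 2, P)       # inverse via Fermat's little theorem
        rows[r] = [x * inv % P for x in rows[r]]
        for i in range(len(rows)):
            if i != r and rows[i][col]:
                f = rows[i][col]
                rows[i] = [(x - f * y) % P for x, y in zip(rows[i], rows[r])]
        r += 1
    return r

def decode(received):
    """Solve the K x K linear system in the packets s_i (full rank assumed)."""
    K = len(received)
    aug = [list(c) + list(y) for c, y in received]
    for col in range(K):
        piv = next(i for i in range(col, K) if aug[i][col])
        aug[col], aug[piv] = aug[piv], aug[col]
        inv = pow(aug[col][col], P - 2, P)
        aug[col] = [x * inv % P for x in aug[col]]
        for i in range(K):
            if i != col and aug[i][col]:
                f = aug[i][col]
                aug[i] = [(x - f * y) % P for x, y in zip(aug[i], aug[col])]
    return [row[K:] for row in aug]             # recovered packets s_1..s_K

# Example: K = 3 packets of 4 symbols; keep only innovative combinations.
pkts = [[random.randrange(P) for _ in range(4)] for _ in range(3)]
rxd = []
while len(rxd) < len(pkts):
    c, y = encode(pkts)
    if rank([cc for cc, _ in rxd] + [c]) > len(rxd):   # innovative?
        rxd.append((c, y))
assert decode(rxd) == pkts
```

The receiver keeps a combination only if it increases the rank of the coefficient matrix, which mirrors the "linearly independent combinations" counted in the throughput analysis later in the talk.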

  4. Previous work: Stable throughput in random access • Setting: 2 sources, 2 destinations • If user n has a packet to send, it transmits in a slot with probability pn • Stable throughput: the region of (λ1, λ2) for which there exists (p1, p2) such that both queues remain finite (Figure: stability regions for random linear coding vs. retransmitting packets.)

  5. Previous work: Stable throughput in random access • Random linear coding approaches capacity as K→∞. (Figure: stability regions for increasing K.)

  6. Previous work: Queueing delay in random linear coding • Policy: fix K, the number of packets coded together; delay includes the time spent waiting to form a group of K (Figure: multicast with 3 destinations, q = 0.9.)

  7. Previous work: Queueing delay in random linear coding • Policy: let k, the number of packets coded together, vary with the traffic load, 1 ≤ k ≤ K • As K→∞, delay performance approaches the retransmission delay (Figure: unicast with 1 destination, q = 0.5.)

  8. Recent/ongoing work • Rateless coding naturally adapts the coding rate to variations in the channel; coding across sessions means that receivers decode additional packets that are not intended for them. • We analyze the multicast throughput for random linear coding: over a fading wireless channel, where the reception probability depends on packet length, overhead, and SNR; and across multiple multicast sessions

  9. Multicast throughput Let Tm denote the number of slots needed for destination m to collect K linearly independent random linear combinations. The multicast throughput is η = K / E[max(T1, …, TM)]. Difficulty: the Tm are correlated due to correlation in the random linear combinations sent to different destination nodes; this is true even if the channels to the destination nodes are independent.
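
A quick way to see this definition in action is a Monte Carlo estimate of K / E[max_m Tm]. The sketch below assumes independent erasure channels with reception probability q and treats every received combination as innovative (a reasonable approximation for large alphabets); the values of K, M, and q are illustrative:

```python
# Monte Carlo sketch of the multicast throughput eta = K / E[max_m T_m].
import random

def sample_T(K, q):
    """Slots for one destination to collect K (assumed innovative) packets."""
    slots, collected = 0, 0
    while collected < K:
        slots += 1
        if random.random() < q:
            collected += 1
    return slots

def throughput(K, M, q, trials=10_000):
    total = sum(max(sample_T(K, q) for _ in range(M)) for _ in range(trials))
    return K / (total / trials)     # K over the empirical E[max_m T_m]

print(throughput(K=30, M=10, q=0.9))   # approaches q as K grows (cf. slide 5)
```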

  10. Lower bound on multicast throughput Assume the channels to the M destinations are identically distributed (but not independent). For random variables X1, X2, …, XM that are identically distributed and correlated, the expected maximum can be bounded for any t > 0; this in turn lower-bounds the multicast throughput. A reconstruction of the bound appears below.
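
The slide's two formulas did not survive extraction. A standard moment-generating-function bound consistent with the stated setup (identically distributed, possibly correlated Xm, and the throughput definition from the previous slide) is the following reconstruction; treat it as a sketch, not a verbatim copy of the slide:

```latex
% Jensen's inequality plus a union bound over the M variables:
%   exp(t E[max_m X_m]) <= E[exp(t max_m X_m)] <= sum_m E[exp(t X_m)] = M E[exp(t X_1)],
% so for any t > 0:
\mathbb{E}\Big[\max_{1\le m\le M} X_m\Big]
  \;\le\; \frac{1}{t}\,\ln\!\big(M\,\mathbb{E}\big[e^{tX_1}\big]\big).
% Applying this with X_m = T_m in \eta = K / \mathbb{E}[\max_m T_m]:
\eta \;\ge\; \frac{K\,t}{\ln M + \ln \mathbb{E}\big[e^{tT_1}\big]}.
```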

  11. I: Packet length and overhead • Network coding can approach the min-cut capacity in the limit as the alphabet size approaches infinity. • Random network coding: overhead is needed to transmit the coefficients of the random code. • Packet length (symbols per packet and alphabet size) must be sufficiently large in order to approach the min-cut capacity and ensure small (fractional) overhead. • Our approach: model the packet erasure probability as a function of packet length (symbols per packet and alphabet size).

  12. I: Packet erasure probability Assume there is no channel coding within packets: for a packet to be received, every symbol must be received. q: probability that a transmitted packet is successfully received at a destination node. Pu: u-ary symbol error probability of the modulation scheme; depends on the SNR and channel model (e.g., AWGN). The resulting expression for q is reconstructed below.
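
The slide's expression for q was also lost. Under the stated assumptions, plus independence of symbol errors (an assumption added here) and n symbols per packet (n is defined on the next slide), the natural reconstruction is:

```latex
q = (1 - P_u)^{n}
```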

  13. I: Accounting for overhead Each packet consists of n u-ary symbols; coding is performed on groups of K packets. The multicast throughput is lower bounded, for any t > 0, by the earlier bound scaled by the ratio of information to information + overhead; a reconstruction appears below.
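
A plausible reconstruction, assuming the K header coefficients occupy K of the n u-ary symbols so that the information fraction is (n − K)/n, scales the bound from slide 10 by that ratio:

```latex
\eta \;\ge\; \frac{n-K}{n}\cdot\frac{K\,t}{\ln M + \ln \mathbb{E}\big[e^{tT_1}\big]},
\qquad t > 0.
```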

  14. I: Fading channel model The channel to each destination node evolves as a Markov chain with “Good” and “Bad” states. qG: Probability that a transmitted packet is received in “Good” state qB: Probability that a transmitted packet is received in “Bad” state A packet-erasure version of the Gilbert channel model.

  15. I: Augmented Markov chain for reception at each destination State (S, j), where S is the “Good” or “Bad” state and j = 0, 1, …, K is the number of linearly independent random linear combinations that have been received. P: transition probability matrix. Assume qB = 0, so the initial state is always S = G. Transmission time T1: the time to reach state (S, K) from (G, 0).
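
The expected transmission time E[T1] can be estimated by simulating this augmented chain directly. In the sketch below the Good-to-Bad and Bad-to-Good transition probabilities p_gb and p_bg, and all numeric values, are illustrative assumptions; each reception in the Good state is treated as innovative, and per the slide qB = 0, so the chain starts in (G, 0):

```python
# Simulation sketch of the augmented chain (S, j): S is the Gilbert channel
# state, j the number of (assumed innovative) received combinations.
import random

def sample_T1(K, q_G, p_gb, p_bg):
    """Slots to reach j = K starting from state (G, 0)."""
    state, j, slots = "G", 0, 0
    while j < K:
        slots += 1
        if state == "G" and random.random() < q_G:
            j += 1                                   # reception in Good state
        # Gilbert channel transition before the next slot
        if state == "G":
            state = "B" if random.random() < p_gb else "G"
        else:
            state = "G" if random.random() < p_bg else "B"
    return slots

trials = 10_000
print("E[T1] =", sum(sample_T1(30, 0.9, 0.1, 0.5) for _ in range(trials)) / trials)
```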

  16. I: Multicast throughput versus K Compared to a time-invariant channel with a fixed probability of reception. (Figure: M = 10, n = 250, u = 8; QAM modulation over an AWGN channel with SNR per bit of 3.5 dB in the “Good” state and −∞ dB in the “Bad” state.)

  17. II: Coding across multicast sessions • One source node • K multicast sessions, each with an independent arrival process of equal rate • Each session serves M destination nodes • Channels to all MK destination nodes are identically distributed with reception probability q • Random linear coding: create random linear combinations from the K head-of-line packets, one from each session

  18. II: Coding across multicast sessions For successful decoding, each destination must decode the packets from all K multicast sessions. Using a bound on E[max(X1, X2, …, XMK)], we bound the throughput; a reconstruction appears below.
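
The bound itself was lost in extraction. If the moment-generating-function bound from slide 10 is applied over all MK destinations, each of which needs K innovative combinations, the analogous expression would read as follows (a reconstruction under that assumption, not the slide's exact formula):

```latex
\eta \;\ge\; \frac{K\,t}{\ln(MK) + \ln \mathbb{E}\big[e^{tT_1}\big]},
\qquad t > 0.
```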

  19. II: Multicast throughput for coding across sessions For a large number of sessions and receivers per session, coding outperforms retransmissions. (Figure: coding across sessions (lower bound) vs. retransmissions (upper bound); K = 50, q = 0.8.)

  20. Future work: Cooperative communications in networks • Combine network coding with the back-pressure algorithm for congestion control • Packet erasure models for networks: erasure probability → 1 as packet “length” grows • Cooperative techniques in which relay nodes utilize idle slots
