# Network Coding – AAU Summer School Advanced Topics I


1. Network Coding – AAU Summer School, Advanced Topics I. Prof. Daniel Lucani, Ph.D.

2. Recap
- How do you add in GF(2^n)? XOR, bit by bit.
- How do you multiply in GF(2^n)? Polynomial arithmetic (modulo an irreducible polynomial).
- What is the field size in GF(2^n)? Simply 2^n, meaning you have the elements 0, 1, ..., 2^n − 1.
- What is a generation?
- How do you create a coded packet?
- What makes network coding unique?
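
The recap above can be made concrete with a small sketch. The following Python is a minimal GF(2^8) implementation, assuming the irreducible polynomial x^8 + x^4 + x^3 + x + 1 (0x11B, the one AES uses); an actual network coding library may choose a different reduction polynomial.

```python
def gf256_add(a, b):
    # Addition in GF(2^n) is bitwise XOR (so subtraction is the same operation).
    return a ^ b

def gf256_mul(a, b):
    # Carry-less polynomial multiplication, reduced modulo 0x11B
    # (assumed polynomial x^8 + x^4 + x^3 + x + 1).
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a          # add a * x^i whenever bit i of b is set
        b >>= 1
        carry = a & 0x80
        a = (a << 1) & 0xFF  # multiply a by x
        if carry:
            a ^= 0x1B        # reduce: x^8 ≡ x^4 + x^3 + x + 1
    return p
```

Because addition is XOR, every element is its own additive inverse, which is why "subtract a row" and "add a row" coincide during Gaussian elimination.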

3. Generating a Coded Packet
- Generating a linear network coded packet (CP): operations are over a finite field of size q, e.g. g = 8 bits, q = 256.
- (Slide figure: each n-bit data packet is multiplied symbol-wise by its coefficient C_i and the products are summed into the coded data; the h-bit header carries the g-bit coefficients.)
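
A sketch of the encoding step over GF(2), where multiplying by a coefficient reduces to "include or don't include" a packet: a coded packet is the XOR of a random subset of the generation, and the coefficient vector travels in the header. Function and variable names here are illustrative, not from any particular library.

```python
import random

def encode_gf2(packets):
    # packets: one generation of equal-length packets (lists of byte values).
    # Over GF(2) each coding coefficient is a single bit, so a coded packet
    # is the XOR of the original packets whose coefficient is 1.
    coeffs = [random.randint(0, 1) for _ in packets]
    coded = [0] * len(packets[0])
    for c, pkt in zip(coeffs, packets):
        if c:
            coded = [x ^ y for x, y in zip(coded, pkt)]
    return coeffs, coded  # the coefficients are sent in the packet header
```

Over GF(2^8) the structure is identical, with the XOR of a subset replaced by a coefficient-weighted sum using field multiplication.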

4. NC at the Receiver
- Decoding is Gaussian elimination: an n × n matrix requires An^3 + Bn^2 + Cn operations.
- (Slide figure: the received coded packets CP1…CP6 equal the M × M coefficient matrix [a_ij] times the original packets P1'…P6'.)
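
At each arrival, the receiver's core question is whether the new coded packet is innovative, i.e. whether its coefficient vector is independent of those already held. Over GF(2) this reduces to maintaining an XOR basis; a minimal sketch, with coefficient vectors represented as integer bitmasks (an assumed representation):

```python
def gf2_rank(rows):
    # rows: coefficient vectors of received coded packets, as int bitmasks.
    # Forward elimination over GF(2): keep one stored row per leading-bit
    # position. The returned rank is the number of innovative packets held;
    # decoding succeeds once it reaches the generation size M.
    pivots = {}  # leading-bit position -> reduced row
    for r in rows:
        while r:
            lead = r.bit_length() - 1
            if lead not in pivots:
                pivots[lead] = r   # new pivot: this packet was innovative
                break
            r ^= pivots[lead]      # eliminate and keep reducing
    return len(pivots)
```

A packet that reduces to zero contributed nothing new, which is exactly the event the field-size analysis on the next slides quantifies.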

5. Field Size Analysis
- Tx inputs random linear network coded packets (M original packets) into a channel/network with packet losses; Rx needs n independent linear combinations, collected one coded-packet arrival at a time.
- The process is modeled as a Markov chain (state: number of independent combinations received so far).

6. Field Size Analysis
- If M is large: little overhead, small performance degradation.
- Connection to Kodo, Task 7: single link, GF(2), generation size 8.
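
The "small performance degradation" can be quantified. A standard result gives the probability that M uniformly random coefficient vectors over GF(q) are linearly independent, i.e. that the receiver decodes with exactly M received coded packets; a quick sketch:

```python
def prob_full_rank(M, q):
    # Probability that M uniformly random coded packets over GF(q) are
    # linearly independent: prod_{i=1..M} (1 - q^-i). With this probability
    # the receiver needs no extra transmissions beyond M.
    p = 1.0
    for i in range(1, M + 1):
        p *= 1.0 - q ** (-i)
    return p
```

For the Kodo task's parameters (generation size 8), this is roughly 0.29 over GF(2) but above 0.99 over GF(256), which is the trade-off the field-size analysis captures: larger fields waste fewer packets at the cost of heavier arithmetic.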

7. Field Size Analysis: Distribution

8. Why Generations?
- Gaussian elimination on an n × n matrix requires An^3 + Bn^2 + Cn operations.
- Case 1 (a single generation of M packets): O(M^3).
- Case of "k" generations of size "p", with p = M/k: O(k·p^3).
- Overhead: recall that each packet in a generation has an associated coefficient that needs to be sent.
  - Case 1, for GF(2^g): overhead per coded packet is g·M bits.
  - Case of "k" generations of size "p": g·p bits.
- (Slide figure: with generations, the M × M coefficient matrix becomes block-diagonal, zero outside each generation's block.)
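
The two cases above can be checked with back-of-the-envelope arithmetic (constants dropped; `decode_ops` and `header_bits` are illustrative names, not from any library):

```python
def decode_ops(M, k):
    # Rough decoding cost: one generation of size M costs ~M^3 operations;
    # k generations of size p = M/k cost ~k * p^3 = M^3 / k^2.
    p = M // k
    return k * p ** 3

def header_bits(gen_size, g=8):
    # Coefficient overhead per coded packet: one GF(2^g) symbol
    # for every packet in the generation.
    return g * gen_size
```

For example, splitting M = 1024 packets into 16 generations of 64 cuts the decoding cost by a factor of 16^2 = 256 and shrinks the per-packet header from 8192 bits to 512 bits.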

9. Systematic Coding: Complexity
- Gaussian elimination now runs on an n × n matrix with n = M − D, where D is the number of uncoded (systematic) packets received; it requires An^3 + Bn^2 + Cn operations.
- The distribution of D determines the average number of operations and is linked to the channel model.
- For IID erasures Be(Pe), the expected cost is A(M·Pe)^3 + B'(M·Pe)^2 + C'(M·Pe), i.e. O(M^3·Pe^3); the operations for the first elimination (product) are about D^2(M − D).
- (Slide figure: the first D rows of the M × M matrix are unit vectors from the uncoded packets; only the remaining rows carry dense coefficients.)
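
Plugging the expected number of erased systematic packets, about M·Pe, into the cubic cost gives the O(M^3·Pe^3) figure above; a sketch with assumed constants A, B', C':

```python
def expected_systematic_ops(M, pe, A=1.0, B=1.0, C=1.0):
    # With IID erasures Be(pe), on average M*pe systematic packets are lost,
    # so only an (M*pe)-sized block must be recovered by elimination:
    # cost ~ A*(M*pe)^3 + B'*(M*pe)^2 + C'*(M*pe).
    d = M * pe
    return A * d ** 3 + B * d ** 2 + C * d
```

At Pe = 0.1 and M = 100 this is about 1110 operations (with unit constants), versus about 10^6 for decoding a fully coded generation of the same size, which is the payoff of sending the systematic phase uncoded.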

10. Sparse Code Structures
- What does sparsity mean? A large fraction of the coefficients are zero.
- Why use sparse structures? Efficient decoders (less complexity).
- What are we giving up? Performance: we need to transmit more coded packets.
- What are the challenges? Decoders; the complexity-performance trade-off; re-coding may destroy the sparse structure.

11. Sparse Code Structures
- Decoders: the forward pass of Gaussian elimination is the dominant contributor to complexity, and it can introduce spurious coefficients into unprocessed coded packets, so the sparse structure is lost.
- Re-coding: if not careful, it can increase density.
- (Slide example: a matrix starts with 14 non-zero coefficients; after a few elimination steps, forming combinations such as P1 + 2P3 and then P1 + 2P3 + 3P5 + 5P6, it holds 16 non-zero coefficients.)
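
The density growth under re-coding can be seen in a two-line GF(2) example: XOR-combining two coefficient vectors yields a vector whose support is their symmetric difference, so for vectors with disjoint supports the non-zero counts simply add.

```python
def density(row):
    # Fraction of non-zero coefficients in a coefficient vector.
    return sum(1 for c in row if c) / len(row)

# Two sparse GF(2) coefficient vectors with disjoint supports (toy example):
a = [1, 1, 0, 0, 0, 0]
b = [0, 0, 1, 1, 0, 0]
recoded = [x ^ y for x, y in zip(a, b)]  # support = union of a and b: denser
```

A careless recoder that keeps combining its stored packets therefore drifts toward dense vectors, which is exactly the structure loss the slide warns about.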

12. What About Random Arrivals?
- So far: focus on a single batch/generation, with all packets in the generation already available at the source.
- How do we manage a stream/flow of data, e.g. video streaming?
- One approach: create generations from the packets already available.
- (Slide figure: packets arrive at a server queue at λ pkt/s and are served at μ pkt/s, so the load is ρ = λ/μ; generation sizes vary between Mmin and Mmax.)

13. Online Network Coding
- Online transmission with no generations: the number of packets coded together could be very large (and is random).
- Requires feedback for queue management.
- What is the right strategy? Drop packets when decoded?
- Question 1: how much time do we wait until decoding?
- Question 2: how do we manage queues?
- Question 3: how do we provide feedback?

14. Online NC: Drop When Decoded
- "Drop when decoded" works.
- Problem: the queue at the sender grows until a decoding event; it is proven to grow as Ω(1/(1 − ρ)^2).
- Feedback: ACK the number of independent linear combinations received; feedback is not tied to specific packets.
- Advantage: if ACKs can be lost, a single feedback packet may ACK several packets.
- (Slide figure: the receiver accumulates rows of the coefficient matrix until a decoding event recovers P1…P6; the ACK reports the running count of independent combinations, e.g. 2, 3, 6.)

15. Online NC: Drop When "Seen"
- The sender can drop a packet from its queue once it has been "seen" by the receiver, not necessarily decoded.
- Proven that the queue at the sender grows only as O(1/(1 − ρ)).
- Same advantages under ACK losses: a single feedback packet may ACK several packets.
- (Slide figure: as before, the ACK reports the number of independent combinations received.)
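
The "seen" notion can be sketched over GF(2). One common formalization from the online network coding literature (assumed here, since the slide does not spell it out): packet P_k is seen when the receiver holds a combination whose lowest-index non-zero coefficient is k. Row-reducing on the lowest set bit exposes exactly those pivots.

```python
def seen(rows):
    # rows: coefficient vectors as int bitmasks, bit k = coefficient of
    # packet P_{k+1}. Reduce each row against stored pivots, keyed by the
    # lowest-index non-zero coefficient; the pivot positions are the
    # indices of the packets the receiver has 'seen'.
    pivots = {}
    for r in rows:
        while r:
            lead = (r & -r).bit_length() - 1  # lowest set bit = earliest packet
            if lead not in pivots:
                pivots[lead] = r
                break
            r ^= pivots[lead]
    return sorted(pivots)
```

With only P1 + P2 received, P1 is already seen and can be dropped from the sender's queue, even though nothing is decodable yet; that earlier dropping is where the O(1/(1 − ρ)) improvement comes from.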

16. Take Away Points
- Breaking large files into generations; motivation: complexity and overhead.
- Systematic NC and other sparse structures reduce complexity by exploiting structure; in multi-hop networks that structure is difficult to preserve.
- Random arrivals: various strategies can be used; we could still have generations, or use an online coding approach; feedback becomes important.