
Zero-error source-channel coding with source side information at the decoder

Presentation Transcript


  1. J. Nayak, E. Tuncel and K. Rose University of California, Santa Barbara Zero-error source-channel coding with source side information at the decoder

  2. Outline • The problem • Asymptotically vanishing error case • Zero error • Unrestricted Input • Restricted Input • How large are the gains? • Conclusions

  3. The Problem • [Block diagram: the source output Un enters a source-channel encoder producing channel input Xn; the channel p(y|x) outputs Yn; the decoder observes Yn together with the side information Vn and reconstructs Ûn. A parallel diagram shows separate coding: source encoder producing an index i, channel encoder/decoder over p(y|x), then source decoder using Vn.] • Is separate source and channel coding optimal? • Does an encoder-decoder pair exist?

  4. Asymptotically Vanishing Probability of Error • Source coding: R > H(U|V) achievable with a Slepian-Wolf code • Channel coding: R < C • Source-channel codes (Shamai et al.) • Communication not possible if H(U|V) > C • Separate source and channel coding is asymptotically optimal
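The H(U|V) threshold above is easy to evaluate numerically. A minimal sketch, using a hypothetical joint distribution p(u, v) chosen purely for illustration:

```python
import math

# Toy joint distribution p(u, v) on U = V = {0, 1}.
# These probabilities are illustrative, not from the talk.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal p(v).
pv = {}
for (u, v), prob in p.items():
    pv[v] = pv.get(v, 0.0) + prob

# H(U|V) = -sum_{u,v} p(u,v) log2 p(u|v)
h_u_given_v = -sum(prob * math.log2(prob / pv[v])
                   for (u, v), prob in p.items() if prob > 0)

print(f"H(U|V) = {h_u_given_v:.4f} bits")
```

With vanishing error, reliable transmission at one channel use per source symbol is possible exactly when this value is below the channel capacity C.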

  5. Channel • Channel transition probability p(y|x), y ∈ Y, x ∈ X • Characteristic graph GX of the channel: inputs x1 and x2 are connected if they are confusable, i.e., can produce the same output y • Examples • Noiseless channel: edge-free graph • Conventional channel: complete graph

  6. Channel Code • Code = symbols from an independent set of GX • 1-use capacity = log2 α(GX), where α is the independence number • n uses of the channel • Graph = GX^n, the n-fold AND (strong) product of GX • Zero-error capacity of the channel: C(GX) = limn (1/n) log2 α(GX^n), depends only on the characteristic graph GX
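These quantities can be checked by brute force on the pentagon C5, the channel graph that reappears on slide 13. A sketch assuming nothing beyond the definitions of independent set and AND product:

```python
from itertools import combinations

# Pentagon C5: vertices 0..4, edges between cyclically adjacent vertices.
def c5_adj(a, b):
    return (a - b) % 5 in (1, 4)

def alpha(vertices, adjacent):
    """Independence number by brute force (fine for tiny graphs only)."""
    best = 0
    for k in range(1, len(vertices) + 1):
        if any(all(not adjacent(a, b) for a, b in combinations(s, 2))
               for s in combinations(vertices, k)):
            best = k
        else:
            break
    return best

# One use: alpha(C5) = 2, so the 1-use rate is log2 2 = 1 bit.
print(alpha(range(5), c5_adj))  # -> 2

# Two uses: AND (strong) product C5^2. Vertices are pairs; two distinct
# pairs are adjacent iff each coordinate is equal or adjacent in C5.
verts = [(i, j) for i in range(5) for j in range(5)]
def prod_adj(p, q):
    return p != q and all(a == b or c5_adj(a, b) for a, b in zip(p, q))

# alpha(C5^2) = 5, so C(C5) >= (1/2) log2 5 = log2 sqrt(5) > 1 bit/use.
print(alpha(verts, prod_adj))  # -> 5
```

Two channel uses already beat scalar coding (about 1.16 bits/use versus 1); the Lovász theta function of slide 15 shows this is in fact the capacity of C5.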

  7. Source With Side Information • (U,V) ∈ U×V, (U,V) ~ p(u,v) • Support set SUV = {(u,v) ∈ U×V : p(u,v) > 0} • Confusability graph on U: GU = (U, EU), with u1 and u2 connected if both are consistent with some common side information v • Examples • U = V: edge-free graph • U, V independent: complete graph
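The confusability graph can be built directly from the support set. A small sketch (the helper name `confusability_edges` is ours, not from the talk) reproducing the two examples on the slide:

```python
from itertools import combinations

def confusability_edges(support):
    """Edges of G_U: u1 ~ u2 iff some side information v is consistent
    with both sources, so the decoder cannot tell them apart."""
    by_v = {}
    for u, v in support:
        by_v.setdefault(v, set()).add(u)
    edges = set()
    for us in by_v.values():
        for u1, u2 in combinations(sorted(us), 2):
            edges.add((u1, u2))
    return edges

# U = V (decoder already knows U): edge-free graph.
print(confusability_edges({(u, u) for u in range(3)}))  # -> set()

# U, V independent: every pair of sources is confusable -> complete graph.
print(confusability_edges({(u, v) for u in range(3) for v in range(2)}))
```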

  8. Source Code • Rate depends only on GU • Connected nodes cannot receive the same codeword ⇒ encoding = coloring of GU • Rate = log2 χ(GU), where χ is the chromatic number • Two cases • Unrestricted inputs • Restricted inputs
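Since encoding is a coloring of GU, the scalar rate follows from the chromatic number. A brute-force sketch for the pentagon source of slide 13 (exponential in the graph size, illustrative only):

```python
from itertools import product

def c5_adj(a, b):
    return (a - b) % 5 in (1, 4)

def chromatic_number(vertices, adjacent):
    """Smallest k admitting a proper coloring, by exhaustive search."""
    vs = list(vertices)
    for k in range(1, len(vs) + 1):
        for colors in product(range(k), repeat=len(vs)):
            if all(colors[i] != colors[j]
                   for i in range(len(vs)) for j in range(i + 1, len(vs))
                   if adjacent(vs[i], vs[j])):
                return k
    return len(vs)

# chi(C5) = 3: the scalar zero-error source rate is log2 3 bits.
print(chromatic_number(range(5), c5_adj))  # -> 3
```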

  9. Unrestricted Input • (u,v) not necessarily in SUV • Decode correctly if (u,v) ∈ SUV • 1-instance rate: log2 χ(GU) • n-instance graph = GU^(n), the n-fold OR product of GU • Asymptotic rate for a UI code: RUI(GU) = limn (1/n) log2 χ(GU^(n)) = log2 χf(GU), the fractional chromatic number

  10. Restricted Input • (u,v) always in SUV • 1-instance rate: log2 χ(GU) • n-instance graph = GU^n, the n-fold AND product of GU • Asymptotic rate = Witsenhausen rate of the source graph: RW(GU) = limn (1/n) log2 χ(GU^n)
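For the pentagon, the Witsenhausen rate quoted on slide 14 is achieved already at n = 2: the classical coloring c(i, j) = (j - 2i) mod 5 properly colors the AND product C5^2 with 5 colors. A sketch verifying this (the coloring is Witsenhausen's pentagon construction, not taken from these slides):

```python
import math

def c5_adj(a, b):
    return (a - b) % 5 in (1, 4)

# AND (strong) product of C5 with itself: 25 two-instance source pairs.
verts = [(i, j) for i in range(5) for j in range(5)]
def prod_adj(p, q):
    return p != q and all(a == b or c5_adj(a, b) for a, b in zip(p, q))

# Candidate 5-coloring of C5^2.
color = {(i, j): (j - 2 * i) % 5 for i, j in verts}

# Proper iff adjacent (confusable) pairs always get distinct colors.
proper = all(color[p] != color[q]
             for p in verts for q in verts if prod_adj(p, q))
print(proper)  # -> True

# 5 colors for 2 instances: rate (1/2) log2 5 = log2 sqrt(5) bits/instance.
print(f"{0.5 * math.log2(5):.4f}")  # -> 1.1610
```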

  11. Source-Channel Coding • 1 source instance - 1 channel use code • Encoder σsc: U → X • Decoder ψsc: Y × V → U • If u1 and u2 are not distinguishable given the side information, σsc(u1) and σsc(u2) must not be able to produce the same output y • Equivalently: u1 and u2 connected in GU ⇒ σsc(u1) and σsc(u2) are not connected in GX and σsc(u1) ≠ σsc(u2)
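The condition on this slide is a finite check over the source edges. A sketch (the function name is ours) that verifies the identity map on the pentagon pair of slide 13, where the channel graph is the complement of the source graph:

```python
def is_valid_scalar_code(sigma, source_edges, channel_edges):
    """Confusable sources must map to distinct, non-confusable inputs."""
    for u1, u2 in source_edges:
        x1, x2 = sigma[u1], sigma[u2]
        if x1 == x2 or (x1, x2) in channel_edges or (x2, x1) in channel_edges:
            return False
    return True

# Source graph GU = C5: edges between cyclically adjacent vertices...
gu = {(i, (i + 1) % 5) for i in range(5)}
# ...and channel graph GX = complement of GU: edges at cyclic distance 2.
gx = {(i, (i + 2) % 5) for i in range(5)}

# When GX is the complement of GU, the identity map is a valid scalar
# code: GU-neighbors are automatically non-adjacent in GX.
identity = {u: u for u in range(5)}
print(is_valid_scalar_code(identity, gu, gx))  # -> True
```

This is exactly why (G, complement of G) is always a compatible pair, as slide 12 states.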

  12. Source-Channel Coding • If an n-n UI (RI) code exists for some n, (GU, GX) is a UI (RI) compatible pair • (G, Ḡ) is always a UI and RI compatible pair: the identity map is a valid scalar code when the channel graph is the complement of the source graph

  13. Unrestricted Input • Channel graph GX5: pentagon on {A, B, C, D, E} • Source graph GU5 = complement of the channel graph: pentagon on {a, b, c, d, e} • C(GX5) = log2 √5 (Lovász) • RUI(GU5) = log2 (5/2) > C(GX5) • Separate coding needs more than one channel use per source symbol, while a scalar joint code exists by slide 12
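The gap claimed on this slide is easy to evaluate numerically, assuming C(C5) = log2 √5 (Lovász) and RUI = log2 χf(C5) = log2 (5/2):

```python
import math

# Pentagon example: zero-error channel capacity vs UI source rate.
capacity = 0.5 * math.log2(5)   # C(GX5) = log2 sqrt(5)
rate_ui = math.log2(5 / 2)      # RUI(GU5) = log2 chi_f(C5) = log2(5/2)

print(f"C(GX5)    = {capacity:.4f} bits")
print(f"RUI(GU5)  = {rate_ui:.4f} bits")
print(rate_ui > capacity)  # -> True: separate coding cannot keep up
```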

  14. Restricted Input • Previous example not useful: RW(GU5) = log2 √5 = C(GX5) • Keep source graph GU = G, channel graph GX = complement of G • Approach: find f(G) such that C(GX) ≤ f(G) ≤ RW(GU) • If either inequality is strict for some G, done!

  15. Lovász theta function: ϑ(G) = min max_i 1/(cᵀui)², minimized over unit vectors c and orthonormal representations {ui} of G (unit vectors with ui ⊥ uj whenever i and j are not connected) • Lovász: C(G) ≤ log2 ϑ(G) • Key result: RW(G) ≥ log2 ϑ(Ḡ), so f(G) = log2 ϑ(Ḡ) fits the sandwich of slide 14
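For odd cycles the theta function has a closed form, ϑ(C_n) = n cos(π/n)/(1 + cos(π/n)), which for n = 5 recovers Lovász's ϑ(C5) = √5, consistent with the capacity on slide 13. A quick numerical sanity check:

```python
import math

def lovasz_theta_odd_cycle(n):
    """Closed form for theta of an odd cycle C_n (Lovász '79)."""
    c = math.cos(math.pi / n)
    return n * c / (1 + c)

theta_c5 = lovasz_theta_odd_cycle(5)
print(f"theta(C5) = {theta_c5:.6f}")        # sqrt(5), about 2.236068
print(abs(theta_c5 - math.sqrt(5)) < 1e-9)  # -> True
```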

  16. Restricted Input • GU = Schläfli graph (27-vertex graph), GX = ḠU • Haemers: C(GX) ≤ log2 7 < log2 9 = log2 ϑ(GX) ≤ RW(GU) • Code exists since GX = ḠU, so separate coding is strictly suboptimal

  17. How large are the gains? • Measure: channel uses per source symbol; separate coding needs RW(GU)/C(GX), a joint code may need only 1 • Alon '90: there exist graphs G such that C(G) < log k while ϑ(G) is arbitrarily large • Hence, given l, there exist G such that RW(Ḡ)/C(G) > l

  18. Conclusions • Under a zero-error constraint, separate source and channel coding is asymptotically sub-optimal • Not so in the asymptotically vanishing error case • In the zero-error case, the gains from joint coding can be arbitrarily large

  19. Scalar Code Design Complexity • Instance: source graph G • Question: does a scalar source-channel code exist from G to channel H? • Equivalent to the graph homomorphism problem from G into H • NP-complete whenever the target graph is non-bipartite (Hell & Nešetřil '90)

  20. Future Work • Do UI compatible pairs (GU, GX) exist with RW(GU) < C(GX) < RUI(GU)? • For what classes of graphs is separate coding optimal?
