
Introduction to Quantum Shannon Theory



Presentation Transcript


  1. Introduction to Quantum Shannon Theory Patrick Hayden (McGill University), 12 February 2007, BIRS Quantum Structures Workshop

  2. Overview • What is Shannon theory? • Why quantum Shannon theory? • Highlights: • The brilliant trivialities • Basic capacity theorems • The grand unified theory

  3. Information theory • A practical question: How best to make use of a given communications resource? • A mathematico-epistemological question: How to quantify uncertainty and information? • Shannon: Solved the first by considering the second. A mathematical theory of communication [1948]

  4. Quantifying uncertainty • Shannon entropy: H(X) = -∑_x p(x) log₂ p(x) • Term suggested by von Neumann (more on him later) • Can arrive at the definition axiomatically: H(X,Y) = H(X) + H(Y) for independent X, Y, etc. • Operational point of view…
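
A minimal numerical sketch of this definition and the additivity axiom (numpy-based; the function name shannon_entropy and the example distributions are mine, not from the slides):

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log2 p(x) for a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

# Additivity axiom: H(X,Y) = H(X) + H(Y) for independent X, Y
px, py = [0.5, 0.5], [0.9, 0.1]
pxy = np.outer(px, py).ravel()        # joint distribution of independent X, Y
assert np.isclose(shannon_entropy(pxy),
                  shannon_entropy(px) + shannon_entropy(py))
```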

  5. Compression A source emits independent copies X_1, X_2, …, X_n of X. If X is binary, a typical output looks like 0000100111010100010101100101, with about nP(X=0) 0’s and nP(X=1) 1’s. Of the 2^n possible strings in {0,1}^n, only ~2^{nH(X)} are typical. Can compress n copies of X to a binary string of length ~nH(X).
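
A rough numerical check of this counting, under parameters of my own choosing (n = 1000, P(X=1) = 0.11), comparing the number of "typical" strings to 2^n:

```python
from math import comb, log2

n, p1 = 1000, 0.11                                   # p1 = P(X = 1)
H = -(p1 * log2(p1) + (1 - p1) * log2(1 - p1))       # ≈ 0.5 bits per symbol

# Strings with roughly n*p1 ones carry almost all the probability;
# count those with between 9% and 13% ones and compare to the 2^n total.
typical = sum(comb(n, k) for k in range(90, 131))
print(round(log2(typical)), round(n * H), n)         # ~n*H "typical" bits, versus n
```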

  6. Quantifying information Information is that which reduces uncertainty. H(X|Y) is the uncertainty in X when the value of Y is known: H(X|Y) = E_Y H(X|Y=y) = H(X,Y) - H(Y). Mutual information: I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y). (Diagram relating H(X), H(Y), H(X,Y), H(X|Y), H(Y|X), and I(X;Y).)
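
A small sketch verifying these identities on a concrete joint distribution (the 2×2 matrix pxy is an arbitrary example of mine):

```python
import numpy as np

def H(p):
    """Shannon entropy (base 2) of a probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution p(x, y) of two correlated bits (rows: x, columns: y)
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

H_X_given_Y = H(pxy) - H(py)                   # H(X|Y) = H(X,Y) - H(Y)
I_XY = H(px) + H(py) - H(pxy)                  # I(X;Y) = H(X)+H(Y)-H(X,Y)
assert np.isclose(I_XY, H(px) - H_X_given_Y)   # the two forms agree
print(H_X_given_Y, I_XY)                       # ≈ 0.72, ≈ 0.28
```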

  7. Sending information through noisy channels Statistical model of a noisy channel 𝒩: a message m is encoded, sent through 𝒩, and decoded as m′. Shannon’s noisy coding theorem: In the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through 𝒩 is given by the formula C(𝒩) = max_{p(X)} I(X;Y).
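
As an illustration (my own example, not the slide's), the capacity formula can be evaluated numerically for a binary symmetric channel, where the maximum over input distributions recovers the familiar C = 1 - H₂(f):

```python
import numpy as np

def H2(p):
    """Binary entropy in bits."""
    p = np.clip(p, 1e-15, 1 - 1e-15)
    return float(-(p * np.log2(p) + (1 - p) * np.log2(1 - p)))

def I_bsc(q, f):
    """I(X;Y) = H(Y) - H(Y|X) for a binary symmetric channel with flip
    probability f and input distribution P(X=1) = q."""
    return H2(q * (1 - f) + (1 - q) * f) - H2(f)

f = 0.1
# Maximize I(X;Y) over input distributions; the optimum is the uniform input,
# giving the well-known capacity 1 - H2(f) of the binary symmetric channel.
C = max(I_bsc(q, f) for q in np.linspace(0, 1, 1001))
assert np.isclose(C, 1 - H2(f))
```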

  8. Shannon theory provides • Practically speaking: A holy grail for error-correcting codes • Conceptually speaking: An operationally-motivated way of thinking about correlations • What’s missing (for a quantum mechanic)? Features from linear structure: Entanglement and non-orthogonality

  9. Quantum Shannon Theory provides • General theory of interconvertibility between different types of communications resources: qubits, cbits, ebits, cobits, sbits… • Relies on a major simplifying assumption: Computation is free • …and a minor simplifying assumption: Noise and data have regular structure

  10. Basic resources 1 qubit: a state |ψ⟩ ∈ span{|0⟩, |1⟩}. 1 ebit: |Φ⁺⟩_AB = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/√2.
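
A minimal numpy sketch of these two resources as state vectors (variable names are my own):

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# 1 qubit: any normalized |ψ⟩ in span{|0⟩, |1⟩}
psi = (ket0 + 1j * ket1) / np.sqrt(2)

# 1 ebit: |Φ+⟩_AB = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/√2
phi_plus = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

assert np.isclose(np.linalg.norm(psi), 1)
assert np.isclose(np.linalg.norm(phi_plus), 1)
```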

  11. Brilliant Triviality # 1: Superdense coding Alice holds j ∈ {0,1,2,3} (2 bits). Sharing 1 ebit |Φ⁺⟩ with Bob and sending him 1 qubit, she can convey j exactly. Entanglement allows one qubit to carry two bits of classical data. [BW92]
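
A sketch simulating the protocol with state vectors, assuming the standard encoding by the Pauli operators I, X, Z, XZ (the slide's diagram does not spell out this choice):

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)      # shared ebit |Φ+⟩

# Alice encodes j ∈ {0,1,2,3} by applying I, X, Z or XZ to her half of |Φ+⟩;
# the four resulting states form the orthonormal Bell basis.
paulis = [I2, X, Z, X @ Z]
bell_basis = np.column_stack([np.kron(P, I2) @ phi_plus for P in paulis])

for j, P in enumerate(paulis):
    sent = np.kron(P, I2) @ phi_plus                 # Alice sends her qubit to Bob
    outcome = int(np.argmax(np.abs(bell_basis.conj().T @ sent)))  # Bell measurement
    assert outcome == j                              # Bob recovers both bits
```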

  12. Brilliant Triviality # 2: Teleportation Two classical bits and one ebit can be used to send one qubit: Alice measures her input qubit |ψ⟩ together with her half of |Φ⁺⟩, sends the two-bit outcome j to Bob, and Bob corrects his half to recover |ψ⟩. Fiction: the qubit itself travels to Bob. Reality: only two classical bits are transmitted. [BBCJPW93]
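
A sketch simulating teleportation with state vectors; the Bell-measurement convention and correction operators below are standard choices of mine rather than anything specified on the slide:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
paulis = [I2, X, Z, X @ Z]

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)           # shared ebit |Φ+⟩
bell = [np.kron(P, I2) @ phi_plus for P in paulis]       # Bell basis |β_j⟩

psi = np.array([0.6, 0.8j])                   # arbitrary qubit |ψ⟩ to teleport
state = np.kron(psi, phi_plus)                # registers ordered (A1, A2, B)

for j, P in enumerate(paulis):
    # Alice measures A1A2 in the Bell basis and gets outcome j (probability 1/4);
    # Bob's unnormalized post-measurement state is (⟨β_j| ⊗ I)|ψ⟩|Φ+⟩.
    bob = np.kron(bell[j].conj().reshape(1, -1), I2) @ state
    bob = P @ bob                             # Bob's correction after receiving the 2 cbits
    bob = bob / np.linalg.norm(bob)
    assert np.isclose(abs(np.vdot(bob, psi)), 1.0)   # |ψ⟩ recovered (up to phase)
```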

  13. Quantifying uncertainty • Let ρ = ∑_x p(x) |x⟩⟨x| be a density operator • von Neumann entropy: H(ρ) = -tr[ρ log ρ] • Equal to the Shannon entropy of ρ’s eigenvalues • Analog of a joint random variable: ρ_AB describes a composite system A ⊗ B • H(A)_ρ = H(ρ_A) = H(tr_B ρ_AB)
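
A small sketch of these definitions (the function names and the Bell-state example are mine):

```python
import numpy as np

def von_neumann_entropy(rho):
    """H(ρ) = -tr[ρ log2 ρ], i.e. the Shannon entropy of ρ's eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def reduced_state_A(rho_AB, dA, dB):
    """ρ_A = tr_B ρ_AB for a bipartite density matrix on C^dA ⊗ C^dB."""
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

# Example: a maximally entangled pure state has H(AB) = 0 but H(A) = 1
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_AB = np.outer(phi_plus, phi_plus.conj())
print(von_neumann_entropy(rho_AB))                         # 0.0
print(von_neumann_entropy(reduced_state_A(rho_AB, 2, 2)))  # 1.0
```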

  14. Compression No statistical assumptions: Just quantum mechanics! A source emits independent copies of ρ_AB: ρ_AB ⊗ ρ_AB ⊗ ρ_AB ⊗ … dim(support of ρ_B^⊗n) ~ 2^{nH(B)}. Can compress n copies of B to a system of ~nH(B) qubits while preserving correlations with A.

  15. Quantifying information H(A|B): the uncertainty in A when the value of B is known? Define H(A|B) = H(AB) - H(B). Example: |Φ⁺⟩_AB = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/√2, for which ρ_B = I/2, so H(A|B) = 0 - 1 = -1. Conditional entropy can be negative!
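
A numerical check of this example, in a few lines of numpy (helper names are mine):

```python
import numpy as np

def H(rho):
    """von Neumann entropy in bits."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)       # |Φ+⟩_AB
rho_AB = np.outer(phi_plus, phi_plus)                # pure state, so H(AB) = 0
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)  # tr_A gives I/2

print(H(rho_AB) - H(rho_B))      # H(A|B) = 0 - 1 = -1: negative!
```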

  16. Quantifying information Information is that which reduces uncertainty. H(A|B) = H(AB) - H(B) is the uncertainty in A when the value of B is known (?), and the mutual information is I(A;B) = H(A) - H(A|B) = H(A) + H(B) - H(AB) ≥ 0. (Diagram relating H(A), H(B), H(AB), H(A|B), H(B|A), and I(A;B).)

  17. Sending classical information through noisy channels Physical model of a noisy channel 𝒩: a trace-preserving, completely positive map. A message m is encoded as a quantum state, sent through 𝒩, and decoded (by a measurement) as m′. HSW noisy coding theorem: In the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through 𝒩 is given by the (regularization of the) formula C(𝒩) = max_{{p_x, ρ_x}} I(X;B), where I(X;B) is evaluated on the classical-quantum state ∑_x p(x) |x⟩⟨x|_X ⊗ 𝒩(ρ_x)_B (the Holevo quantity χ).
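
A sketch evaluating the quantity being maximized, the Holevo quantity χ, for one fixed ensemble and one simple channel; the dephasing channel and the {|+⟩, |−⟩} ensemble are my own illustrative choices, not the slide's:

```python
import numpy as np

def H(rho):
    """von Neumann entropy in bits."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

Z = np.array([[1, 0], [0, -1]])
def dephasing(rho, p=0.1):
    """A simple qubit channel N(ρ) = (1-p)ρ + p ZρZ (example channel)."""
    return (1 - p) * rho + p * Z @ rho @ Z

# Ensemble {p_x, ρ_x}: |+⟩ and |−⟩ with equal probability
plus = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)
ensemble = [(0.5, np.outer(plus, plus)), (0.5, np.outer(minus, minus))]

rho_avg = sum(p * rho for p, rho in ensemble)
chi = H(dephasing(rho_avg)) - sum(p * H(dephasing(rho)) for p, rho in ensemble)
print(chi)    # Holevo quantity of this ensemble ≈ 0.53 bits per channel use
```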

  18. Sending quantum information through noisy channels Physical model of a noisy channel 𝒩: a trace-preserving, completely positive map. A state |ψ⟩ ∈ C^d is encoded (TPCP map), sent through 𝒩, and decoded (TPCP map). LSD noisy coding theorem: In the limit of many uses, the optimal rate (1/n) log d at which Alice can reliably send qubits to Bob through 𝒩 is given by the (regularization of the) formula Q(𝒩) = max_ρ [H(B) − H(AB)] = max_ρ [−H(A|B)], where the entropies are evaluated on (id_A ⊗ 𝒩)(φ_ρ) for a purification φ_ρ of ρ. Conditional entropy!
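
A sketch computing the coherent information H(B) − H(AB) for one input through one channel; the dephasing channel and the maximally entangled input are my own illustrative choices:

```python
import numpy as np

def H(rho):
    """von Neumann entropy in bits."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

I2 = np.eye(2)
Z = np.array([[1, 0], [0, -1]])
p = 0.1
# Kraus operators of a dephasing channel N(ρ) = (1-p)ρ + p ZρZ (example channel)
kraus = [np.sqrt(1 - p) * I2, np.sqrt(p) * Z]

# Send half of a maximally entangled state through N: ρ_AB = (id ⊗ N)(Φ)
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
Phi = np.outer(phi, phi)
rho_AB = sum(np.kron(I2, K) @ Phi @ np.kron(I2, K).conj().T for K in kraus)
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)

coherent_info = H(rho_B) - H(rho_AB)    # H(B) - H(AB) = -H(A|B)
print(coherent_info)                    # ≈ 0.53 qubits per channel use
```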

  19. The family paradigm Many problems in quantum Shannon theory are all versions of the same problem: protocols transform into each other. Combining the “father” and “mother” protocols with teleportation (TP) and superdense coding (SD) yields entanglement distillation, the quantum capacity, teleporting over noisy states, the entanglement-assisted classical capacity, and superdense coding with noisy states. Devetak, Harrow, Winter [2003]

  20. Further unification Fully quantum Slepian-Wolf sits above the family: via special cases, Schmidt symmetry, time-reversal, and channel simulation it generates the father and mother protocols, and with them entanglement distillation, the quantum capacity, teleporting over noisy states, the entanglement-assisted classical capacity, superdense coding with noisy states, quantum multiple access capacities, and distributed compression. Abeyesinghe, Devetak, Hayden, Winter [2006]

  21. The art of forgetting

  22. The art of forgetting How can Bob unilaterally destroy his correlation with Alice? What is the minimal number of particles he must discard before the remaining state is uncorrelated? Starting from ρ_{AB1B2B3}, Bob throws particles into the trash: first ρ_{AB2B3}, then ρ_{AB2} = ρ_A ⊗ ρ_{B2}. In this case, by discarding 2 particles, Bob succeeded in eliminating all correlations with Alice’s particle.

  23. Purification and correlation Purification: When faced with a mixed state, we can always imagine that the state describes part of a larger system on which the state is pure. Purifications are essentially unique (up to local transformations of the purifying space): if Tr_BD ψ_ABCD = ρ_A ⊗ σ_C, then |ψ⟩_ABCD = (id_AC ⊗ U_BD)|φ⟩_AB|φ′⟩_CD and |φ⟩_AB|φ′⟩_CD = (id_AC ⊗ U_BD⁻¹)|ψ⟩_ABCD for some unitary U_BD on the purifying systems B and D.
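
A sketch of the purification construction itself, via the eigendecomposition |ψ⟩_BD = ∑_k √λ_k |e_k⟩_B |k⟩_D (the helper purify is mine):

```python
import numpy as np

def purify(rho):
    """Return |ψ⟩_BD with tr_D |ψ⟩⟨ψ| = ρ, built from ρ's eigendecomposition."""
    lam, vecs = np.linalg.eigh(rho)
    d = rho.shape[0]
    psi = np.zeros(d * d, dtype=complex)
    for k in range(d):
        if lam[k] > 1e-12:
            basis_D = np.zeros(d)
            basis_D[k] = 1.0
            # |ψ⟩ += √λ_k |e_k⟩_B |k⟩_D
            psi += np.sqrt(lam[k]) * np.kron(vecs[:, k], basis_D)
    return psi

rho_B = np.array([[0.7, 0.0], [0.0, 0.3]])
psi = purify(rho_B)
# Check: tracing out the purifying system D recovers ρ_B
rho_check = np.trace(np.outer(psi, psi.conj()).reshape(2, 2, 2, 2), axis1=1, axis2=3)
assert np.allclose(rho_check, rho_B)
```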

  24. The benefits of forgetting: Applied theology Watch again, this time including the purification |ψ⟩_{AB1B2B3C} held in Charlie’s Magical Bucket O’ Particles. Bob discards particles until ρ_{AB2} = ρ_A ⊗ ρ_{B2}. All purifications are equivalent up to a local transformation in Charlie’s lab, so Charlie holds uncorrelated purifications of both Alice’s particle and Bob’s remaining particles.

  25. The benefits of forgetting: Applied theology Before: |ψ⟩_{AB1B2B3C}. After: |φ⟩_{AC1}|φ′⟩_{B2C2C3}. Alice never did anything ⇒ her marginal state ρ_A is unchanged. Originally, her purification is held by both Bob and Charlie; afterwards, entirely by Charlie. Bob transferred his entanglement with Alice to Charlie and distilled entanglement with Charlie, just by discarding particles!

  26. Fully quantum Slepian-Wolf: How much does Bob need to send? Uncertainty: von Neumann entropy H(A) = H(ρ_A) = -tr[ρ_A log ρ_A]. Correlation: mutual information I(A;B) = H(A) + H(B) − H(AB), which is 0 if and only if ρ_AB = ρ_A ⊗ ρ_B; I(A;B) = m for m pairs of correlated bits and 2m for m ebits (the maximum). Starting from |ψ⟩_{ABC}^⊗n, the initial mutual information is nI(A;B) and the final mutual information must be 0. Each qubit Bob discards has the potential to eliminate at most 2 bits of correlation, so Bob should (ideally) send around nI(A;B)/2 qubits to Charlie.
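
A sketch computing the per-copy quantity I(A;B)/2 for a concrete (randomly chosen) pure state |ψ⟩_ABC of three qubits; all helper names are mine:

```python
import numpy as np

def H(rho):
    """von Neumann entropy in bits."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

# A random pure state |ψ⟩_ABC of three qubits (stand-in for the slide's |ψ⟩_ABC)
rng = np.random.default_rng(0)
psi = rng.normal(size=8) + 1j * rng.normal(size=8)
psi /= np.linalg.norm(psi)
T = psi.reshape(2, 2, 2)                       # tensor indices (A, B, C)

def reduced(tensor, keep):
    """Marginal of a pure state: reshape to (kept, traced-out) and contract."""
    traced = [ax for ax in range(3) if ax not in keep]
    M = np.transpose(tensor, keep + traced).reshape(2 ** len(keep), -1)
    return M @ M.conj().T

rho_A, rho_B, rho_AB = reduced(T, [0]), reduced(T, [1]), reduced(T, [0, 1])
I_AB = H(rho_A) + H(rho_B) - H(rho_AB)
print(I_AB / 2)     # qubits per copy that Bob should send to Charlie
```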

  27. How does Bob choose which qubits to send? At random! (According to the unitarily invariant measure on the typical subspace of B^⊗n.) Bob can ignore the correlation structure of his state!

  28. Final accounting Investment: Bob sends Charlie ~n I(A;B)/2 qubits. Rewards: 1) Charlie holds Alice’s purification 2) B and C establish ~n I(B;C)/2 ebits. Final state: |φ⟩_{AC1}|φ′⟩_{B2C2C3}. OK – but what good is it?

  29. Entanglement distillation Bob and Charlie share many copies (ρ_BC)^⊗n of a noisy entangled state and would like to convert them to ebits. Only local operations and classical communication are allowed. The forgetting protocol is good, but it uses quantum communication, so implement the quantum communication using teleportation: transmit 1 qubit using 2 cbits and 1 ebit. Net rate of ebit production: I(B;C)/2 − I(A;B)/2 = H(C) − H(BC), which is optimal. [Devetak/Winter 03]
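
A numerical check, on a random three-qubit pure state, that the net rate I(B;C)/2 − I(A;B)/2 indeed equals H(C) − H(BC) when ABC is pure (helper names are mine):

```python
import numpy as np

def H(rho):
    """von Neumann entropy in bits."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

# Any pure |ψ⟩_ABC will do; A purifies the noisy state ρ_BC shared by Bob and Charlie
rng = np.random.default_rng(1)
psi = rng.normal(size=8) + 1j * rng.normal(size=8)
psi /= np.linalg.norm(psi)
T = psi.reshape(2, 2, 2)                 # tensor indices (A, B, C)

def reduced(tensor, keep):
    """Marginal of a pure state: reshape to (kept, traced-out) and contract."""
    traced = [ax for ax in range(3) if ax not in keep]
    M = np.transpose(tensor, keep + traced).reshape(2 ** len(keep), -1)
    return M @ M.conj().T

rho_A, rho_B, rho_C = (reduced(T, [k]) for k in range(3))
rho_AB, rho_BC = reduced(T, [0, 1]), reduced(T, [1, 2])

I_AB = H(rho_A) + H(rho_B) - H(rho_AB)
I_BC = H(rho_B) + H(rho_C) - H(rho_BC)
# Net ebit rate of the protocol equals the hashing-bound expression H(C) - H(BC)
assert np.isclose(I_BC / 2 - I_AB / 2, H(rho_C) - H(rho_BC))
```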

  30. Conclusions • Information theory can be generalized to analyze quantum information processing • Yields a rich theory, surprising conceptual simplicity • Compression, data transmission, superdense coding, teleportation, subspace transmission • Capacity zoo, using noisy entanglement, channel simulation: all are closely related • Operational approach to thinking about quantum mechanics
