

Presentation Transcript


  1. Sampling of min-entropy relative to quantum knowledge. Robert König, in collaboration with Renato Renner.

  2. Random access codes [Ambainis, Nayak, Ta-Shma, Vazirani 99 / Nayak 99; Ben-Aroya, Regev, de Wolf 07]: n coin tosses are encoded into an m-qubit state and kept in storage; at a later time a (random) subset of the bits is to be recovered by a measurement, which may be chosen adaptively depending on the subset. The quantity of interest is the decoding probability.
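
For context (a known result of the cited works, not spelled out in the transcript): Nayak's bound states that any encoding of n uniformly random bits into m qubits from which each individual bit can be recovered with probability at least p satisfies

\[
m \;\ge\; \bigl(1 - h(p)\bigr)\, n,
\qquad
h(p) = -p \log_2 p - (1-p)\log_2(1-p),
\]

so quantum storage offers essentially no savings for random access codes.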

  3. Sampling of min-entropy vs. random access codes. The two settings are compared: a bound on entropy vs. a decoding probability; a (pseudo)random subset chosen at a later time in both cases; an arbitrary quantum state as correlation vs. a bounded number of stored qubits; random variables over a large alphabet vs. coin flips. Claim: given a bound on the entropy of the whole string, the entropy rate is preserved on the sampled subset.

  4. Min-entropy and secret keys. For classical-quantum states, the min-entropy is equal to the extractable key length (and also equal to the guessing entropy [K, Schaffner, Renner 08]). Privacy amplification [BBR88, BBCM95, Renner05] generates approximately H_min(X|Q) bits of secure key from a partially secret raw key X, against an adversary holding quantum information Q, and this is optimal.
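
As an illustration of the privacy-amplification step, here is a minimal Python sketch of two-universal hashing with a random Toeplitz matrix over GF(2); the function name and the bit-list representation are my own choices, not part of the talk. By the leftover hash lemma, out_len should be chosen somewhat below the (smooth) min-entropy of X given Q, roughly H_min minus 2 log(1/epsilon).

    def toeplitz_hash(raw_key_bits, seed_bits, out_len):
        """Compress an n-bit raw key to out_len bits with a two-universal hash.

        seed_bits (length n + out_len - 1) defines a random Toeplitz matrix T
        over GF(2) via T[i][j] = seed_bits[i + n - 1 - j]; the output is T x mod 2.
        """
        n = len(raw_key_bits)
        assert len(seed_bits) == n + out_len - 1
        out = []
        for i in range(out_len):
            bit = 0
            for j, x in enumerate(raw_key_bits):
                bit ^= seed_bits[i + n - 1 - j] & x  # parity of the GF(2) inner product
            out.append(bit)
        return out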

  5. Key expansion in the bounded storage model: sample-then-extract [Maurer92, ..., Vadhan03], previously only analysed for a classical adversary. A large source of randomness is temporarily available to both parties over an insecure channel, while the adversary's quantum storage is limited to a bounded number of qubits. Each party reads only a substring (sampled with the shared seed S) and applies privacy amplification to it; privacy amplification itself is known to work against a quantum adversary. The result is an expanded number of key bits.

  6. Implication for the bounded storage model. The sample-then-extract approach for building locally computable extractors (Vadhan03) works against quantum adversaries! Validity against quantum adversaries cannot be established from classical extractor properties alone [Gavinsky, Kempe, Kerenidis, Raz & de Wolf '06].
  • Ingredients: a large source of randomness and a short initial shared key, split into a seed for the "sampler" and a seed for the extractor.
  • Aim: generate a larger number of secure key bits.
  • Sample: choose a random subset. Extract: "standard" hashing. A sketch of these two steps follows.
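
A minimal Python sketch of the two steps just listed (sample with one part of the shared key, extract with the other), assuming the toeplitz_hash helper sketched above; all names, parameter values, and the use of Python's random module are illustrative only.

    import random

    def sample_then_extract(source_bits, sampler_seed, extractor_seed,
                            subset_size, out_len):
        """Locally computable extraction: read only a short substring of the source."""
        # Sample: choose a (pseudo)random subset S of positions from the sampler seed.
        rng = random.Random(sampler_seed)
        positions = sorted(rng.sample(range(len(source_bits)), subset_size))
        substring = [source_bits[p] for p in positions]
        # Extract: "standard" two-universal hashing of the sampled substring.
        return toeplitz_hash(substring, extractor_seed, out_len)

    # Illustrative usage; random() is not a cryptographic generator.
    source = [random.getrandbits(1) for _ in range(10_000)]  # large, temporarily available source
    sampler_seed = random.getrandbits(64)                     # first part of the short shared key
    subset_size, out_len = 256, 128
    extractor_seed = [random.getrandbits(1) for _ in range(subset_size + out_len - 1)]
    fresh_key = sample_then_extract(source, sampler_seed, extractor_seed, subset_size, out_len)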

  7. Main result: sampling of min-entropy. Sample: choose a (classical) random subset of the blocks ("blockwise sampling"); the claim holds for any state. Rephrased: if the min-entropy of the whole string is large, then for a (randomly chosen) subset of the blocks the min-entropy of the sampled substring is correspondingly large. A large alphabet size c is needed. A schematic version of the statement is sketched below.
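
Schematically, as my own reconstruction of the formulas that did not survive the transcript (all constants and explicit loss terms omitted): if X = (X_1, ..., X_n) consists of blocks over an alphabet of size c, then for any state,

\[
H_{\min}(X_1 \ldots X_n \mid Q) \;\ge\; \alpha\, n \log c
\quad\Longrightarrow\quad
H_{\min}^{\varepsilon}(X_S \mid S\,Q) \;\gtrsim\; (\alpha - \delta)\, |S| \log c
\]

for a randomly chosen subset S of block positions, with small δ and ε for suitable parameters; that is, the min-entropy rate is approximately preserved under blockwise sampling.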

  8. Why sampling (Shannon) entropy works. There is a simple proof for sampling of Shannon entropy, using only subadditivity and the chain rule: repeated application of the chain rule splits the joint entropy into a sum of contributions, and a random subset hits the "good" (high-entropy) parts with high probability.

  9. Why sampling (Shannon) entropy works (continued). Subadditivity removes the dependence on variables not in the subset, and the chain rule (recombination) then shows that the entropy of the sampled substring is large, as made explicit below.
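
In formulas (my reconstruction of the argument the slides describe): for a subset S = {i_1 < ... < i_s} of positions,

\[
H(X_S \mid Q) \;=\; \sum_{k=1}^{s} H\bigl(X_{i_k} \mid X_{i_1}\ldots X_{i_{k-1}},\, Q\bigr)
\;\ge\; \sum_{i \in S} H\bigl(X_i \mid X_1 \ldots X_{i-1},\, Q\bigr),
\]

since conditioning on additional variables can only decrease Shannon entropy. Summed over all i, the terms on the right give H(X_1 ... X_n | Q) by the chain rule, so a uniformly random subset S collects, in expectation, an |S|/n fraction of a large total.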

  10. (Min-)entropy rules. Three rules are needed for the entropy-sampling argument to work (see Renato's talk):
  • subadditivity
  • chain rule (recombination)
  • chain rule (splitting): not true in general!
  Two of these hold trivially; the third has to be replaced for min-entropy, since recursive application of this rule is impossible.

  11. Entropy splitting and recombining. The state is split into components and later recombined; the weights form a probability distribution, and additional properties hold if the split states are constructed using the eigendecomposition of the conditional operator. General strategy for showing a lower bound on the smooth min-entropy: 1. construct an orthogonal decomposition; 2. choose a high-entropy subset of components; 3. show that the distance to the original state is small while the entropy of the retained part is large. A schematic version is given below.
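
Schematically, under my reading of the slide (normalisation and the precise choice of distance measure are glossed over), the three steps amount to

\[
\rho_{XQ} \;=\; \sum_{z} q_z\, \rho^{(z)}_{XQ}
\quad\text{(orthogonal decomposition)},
\qquad
\tilde{\rho}_{XQ} \;=\; \sum_{z \in \mathcal{Z}_{\mathrm{good}}} q_z\, \rho^{(z)}_{XQ},
\]
\[
\sum_{z \notin \mathcal{Z}_{\mathrm{good}}} q_z \;\le\; \varepsilon
\quad\Longrightarrow\quad
H_{\min}^{\varepsilon}(X \mid Q)_{\rho} \;\ge\; \min_{z \in \mathcal{Z}_{\mathrm{good}}} H_{\min}(X \mid Q)_{\rho^{(z)}},
\]

i.e. dropping the low-entropy components moves the state by at most ε, and the smooth min-entropy of the original state is then bounded below by the worst retained component.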

  12. (Min-)entropy rules, revisited. The problematic rule is the chain rule (splitting), which is not true in general; subadditivity and the chain rule (recombination) remain. For the original state and the split states, an (approximate) chain rule holds for an appropriately chosen (discrete) splitting!

  13. Recursive splitting and recombining: for a given subset, choose high-entropy components; the distance to the original state stays small while the retained entropy is large.

  14. Conclusions / application to the BSM. Sampling preserves the smooth min-entropy rate. Applications:
  • to the BSM: the sample-then-hash approach achieves significant key expansion, turning a short shared key into a longer one against an adversary whose quantum memory is limited to a bounded number of qubits;
  • to general key extraction / QKD schemes: a building block (a "condenser") for constructing randomness-efficient quantum extractors.

  15. THE END Thank you for your attention!
