
Online Coded Caching






Presentation Transcript


  1. Online Coded Caching Mohammad Ali Maddah-Ali, Bell Labs, Alcatel-Lucent, USA. Joint work with Ramtin Pedarsani (UC Berkeley) and Urs Niesen (Bell Labs).

  2. Video on Demand • Video-on-demand services: Netflix, Amazon, Hulu, Verizon/Comcast, … • These services place significant stress on service providers' networks. • Caching can be used to mitigate this stress.

  3. Least Recently Used (LRU) • LRU policy: cache every uncached requested file; when the cache is full, evict the least recently used one. • LRU caching gain: deliver content locally (local gain). • LRU is approximately optimal for a single cache [Sleator, Tarjan, '85]. • LRU is widely used in industry. [Figure: a server, an LRU file cache, and a user.]
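
As a rough illustration of the LRU policy on this slide, here is a minimal Python sketch (the class and method names are ours, not from the talk): cache every uncached requested file and, when full, evict the least recently used one.

    from collections import OrderedDict

    class LRUCache:
        """Minimal sketch of the LRU policy: cache every uncached requested
        file; when the cache is full, evict the least recently used one."""

        def __init__(self, capacity):
            self.capacity = capacity
            self.files = OrderedDict()  # file_id -> file contents

        def request(self, file_id, fetch_from_server):
            if file_id in self.files:
                # Local hit: serve from the cache and mark as most recently used.
                self.files.move_to_end(file_id)
                return self.files[file_id]
            # Miss: fetch from the server and cache the file.
            data = fetch_from_server(file_id)
            self.files[file_id] = data
            if len(self.files) > self.capacity:
                # Evict the least recently used file.
                self.files.popitem(last=False)
            return data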

  4. Beyond Local Gain [Maddah-Ali, Niesen, "Fundamental Limits of Caching", 2012] • Local gain = 0.5 • Global (coding) gain = 0.5 • As the number of caches increases: • local gain stays constant! • global gain scales linearly. [Figure: two-user example with file halves A1, A2 and B1, B2; the server broadcasts the single coded message A2⊕B1, from which each user recovers its missing half using its cache.] Efficient online caching must capture the GLOBAL GAIN.
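
The coded delivery in the figure can be checked with a short numerical example. The sketch below assumes the standard two-user setup from the 2012 paper: user 1 requests A and has cached halves A1 and B1, user 2 requests B and has cached A2 and B2; the variable names are illustrative.

    # Toy check of the two-user coded delivery: the server broadcasts the single
    # XOR A2 ^ B1, and each user recovers its missing half from its own cache.
    A1, A2 = b"AAAA", b"aaaa"   # halves of file A
    B1, B2 = b"BBBB", b"bbbb"   # halves of file B

    def xor(x, y):
        return bytes(a ^ b for a, b in zip(x, y))

    broadcast = xor(A2, B1)      # one coded transmission serves both users

    # User 1 (wants A, has A1 and B1 cached) recovers A2.
    assert xor(broadcast, B1) == A2
    # User 2 (wants B, has A2 and B2 cached) recovers B1.
    assert xor(broadcast, A2) == B1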

  5. In This Talk: Coded Least Recently Sent (LRS) • We propose coded LRS to exploit the global gain. • Cache any uncached requested file: • partially! • randomly • uniformly • no matter who requested it! • When the cache is full: evict the least recently sent file. (A sketch of the cache-update rule is given below.)
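
The cache-update rule above can be sketched in Python as follows. This is only an illustrative sketch under our own naming (CodedLRSCache, on_request); the delivery phase, which forms the coded multicast messages, is not shown.

    import random

    class CodedLRSCache:
        """Sketch of the coded LRS cache-update rule: on a request for a file
        that is not yet partially cached, store a uniformly random fraction of
        it, regardless of which user requested it; when the cache is full,
        evict the file least recently sent by the server."""

        def __init__(self, capacity_bits, fraction):
            self.capacity_bits = capacity_bits
            self.fraction = fraction     # fraction of each file to cache
            self.segments = {}           # file_id -> set of cached bit indices
            self.last_sent = {}          # file_id -> time the server last sent it

        def on_request(self, file_id, file_size_bits, time):
            self.last_sent[file_id] = time   # the server sends (part of) this file now
            if file_id not in self.segments:
                # Cache a uniformly random subset of the file's bits.
                k = int(self.fraction * file_size_bits)
                self.segments[file_id] = set(random.sample(range(file_size_bits), k))
                self._evict_if_needed()

        def _evict_if_needed(self):
            used = sum(len(s) for s in self.segments.values())
            while used > self.capacity_bits and self.segments:
                # Evict the least recently sent file.
                victim = min(self.segments, key=lambda f: self.last_sent.get(f, 0))
                used -= len(self.segments.pop(victim))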

  6. Optimality of Coded LRS • Setting: a set of N equi-popular files, K users, each with an isolated cache of size M (the slide's figure also shows a probability parameter p). • Theorem (Coded LRS): the achieved rate exhibits both a local gain and a global (coding) gain, and the global gain scales with K.
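
The exact rate expression in the theorem is not legible in this transcript. For context, the rate of the (offline) coded caching scheme from Maddah-Ali and Niesen (2012) has exactly the local/global decomposition the slide refers to, with the global gain scaling with K:

    % K users, N equi-popular files, cache of size M files at each user.
    % Shown only to illustrate the local vs. global (coding) gain decomposition;
    % this is the offline rate, not the online theorem's exact statement.
    R(M) \;=\; K\Bigl(1-\frac{M}{N}\Bigr)\cdot\frac{1}{1+KM/N},
    \qquad
    \underbrace{1-\frac{M}{N}}_{\text{local gain}}
    \qquad
    \underbrace{\frac{1}{1+KM/N}}_{\text{global (coding) gain}}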

  7. Sketch of Proof [Figure: users' demands split into demands for partially cached (popular) files, which enjoy the coded-caching gain, and demands for uncached files, which get no caching gain.] • Challenges: • The load due to uncached demands is bounded by a constant. • The number of uncached demands is governed by a complicated Markov chain. • The partially cached demands provide a big (coded) gain.

  8. Performance Evaluation • Real-life demand time series extracted from the Netflix Prize data (10 million demands over a one-year period). • Captures the dynamic variation of the users' demands.

  9. Performance Evaluation [Plot: average rate versus the size of each isolated cache.] • Significant gain due to the coded global gain.

  10. Conclusion • For cache networks, LRU is NOT optimal. • Introduced online coded caching (coded LRS). • Significant gain over LRU. • Proved that coded LRS is approximately optimal under some conditions. • Validated the results on a real-life time series of requests extracted from Netflix data.

  11. Further Reading • Maddah-Ali and Niesen, "Fundamental Limits of Caching", Sept. 2012 (IEEE Trans. on Information Theory, March 2014). • Maddah-Ali and Niesen, "Distributed Caching Attains Order-Optimal Memory-Rate Trade-offs", Jan. 2013 (to appear in IEEE/ACM Trans. on Networking, 2014). • Niesen and Maddah-Ali, "Coded Caching with Non-Uniform Demands", Jun. 2013 (submitted to IEEE Trans. on Information Theory). • Pedarsani, Maddah-Ali, and Niesen, "Online Coded Caching", Nov. 2013 (submitted to IEEE/ACM Trans. on Networking). • Karamchandani, Niesen, Maddah-Ali, and Diggavi, "Hierarchical Coded Caching", Jan. 2014 (submitted to IEEE Trans. on Information Theory).
