Online Coded Caching
Mohammad Ali Maddah-Ali, Bell Labs, Alcatel-Lucent, USA
Joint work with Ramtin Pedarsani (UC Berkeley) and Urs Niesen (Bell Labs)
Video on Demand • Netflix • Amazon • Hulu • Verizon/Comcast • … • Video on demand places significant stress on service providers' networks. • Caching can be used to mitigate this stress.
Least Recently Used (LRU) • Cache every uncached requested file. • When the cache is full: evict the least recently used file. • LRU caching gain: deliver content locally (local gain). • LRU is approximately optimal for a single cache [Sleator, Tarjan, '85]. • LRU is widely used in industry. [Figure: server, a single LRU file cache, and a user.]
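The LRU policy on the slide can be sketched in a few lines. This is an illustrative toy, not the talk's simulator; the class and method names are mine.

```python
from collections import OrderedDict

class LRUCache:
    """Toy LRU cache over whole files, as described on the slide."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.files = OrderedDict()  # ordered from least to most recently used

    def request(self, name):
        if name in self.files:
            self.files.move_to_end(name)    # mark as most recently used
            return "hit"
        # Cache every uncached requested file ...
        if len(self.files) >= self.capacity:
            self.files.popitem(last=False)  # ... evicting the least recently used
        self.files[name] = True
        return "miss"
```

For example, with capacity 2 the sequence A, B, A, C serves the second A locally, and C evicts B (the least recently used file), capturing exactly the local gain the slide refers to.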
Beyond Local Gain [Maddah-Ali, Niesen, "Fundamental Limits of Caching," 2012] • Local gain = 0.5 • Global (coding) gain = 0.5 • As the number of caches increases: • Local gain stays constant! • Global gain scales linearly. [Figure: two users cache halves A1, B1 and A2, B2 of files A and B; the server multicasts the single coded message A2⊕B1 to serve both demands.] Efficient online caching must capture the GLOBAL GAIN.
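The two-user coded delivery in the figure can be checked directly with byte strings. This is a hypothetical demo of the XOR trick: user 1 caches halves (A1, B1), user 2 caches (A2, B2), user 1 requests A and user 2 requests B, and one multicast message serves both.

```python
def xor(x, y):
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(a ^ b for a, b in zip(x, y))

# Each file is split into two halves (dummy contents for illustration).
A1, A2 = b"AAAA", b"aaaa"
B1, B2 = b"BBBB", b"bbbb"

# Placement: user 1 caches (A1, B1); user 2 caches (A2, B2).
# Delivery: user 1 wants file A, user 2 wants file B.
coded = xor(A2, B1)  # single multicast message A2 xor B1

# User 1 XORs out its cached B1 to recover the missing half A2;
# user 2 XORs out its cached A2 to recover the missing half B1.
assert xor(coded, B1) == A2
assert xor(coded, A2) == B1
```

One transmission of half a file replaces two separate half-file unicasts; that saving is the global (coding) gain.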
In this talk: Coded Least Recently Sent (Coded LRS) • We propose coded LRS to exploit the global gain. • Cache any uncached requested file: • Partially! • Randomly • Uniformly • No matter who requested it! • When the cache is full: evict the least recently sent file.
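The placement rule above can be sketched as follows. This is my own illustrative reading of the slide, with made-up parameter names: each cached file is only a random, uniform fraction of its segments, cached whenever the server sends the file to any user, and eviction is by least recently sent.

```python
import random
from collections import OrderedDict

class CodedLRS:
    """Toy sketch of coded LRS placement (names and parameters are illustrative)."""

    def __init__(self, capacity, fraction, segments=100, seed=0):
        self.capacity = capacity    # number of (partially cached) files kept
        self.fraction = fraction    # fraction of each file that is cached
        self.segments = segments    # file is modeled as this many segments
        self.rng = random.Random(seed)
        self.cache = OrderedDict()  # file -> set of cached segment indices

    def on_sent(self, name):
        """Called whenever the server sends (part of) a file to ANY user."""
        if name in self.cache:
            self.cache.move_to_end(name)    # mark as most recently sent
            return
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)  # evict the least recently sent file
        k = int(self.fraction * self.segments)
        # Cache a random, uniform subset of the file's segments -- partially!
        self.cache[name] = set(self.rng.sample(range(self.segments), k))
```

Because every cache stores an independent random subset of each popular file, different users hold different segments, which is what makes coded (XOR) delivery possible.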
Optimality of Coded LRS • Setting: a set of N equi-popular files; K users, each with an isolated cache of size M; each requested file enters the popular set with probability p. • Theorem (Coded LRS): coded LRS achieves both the local gain and a global (coding) gain that scales with K.
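For context, the benchmark that coded LRS is compared against is the offline coded caching rate from the 2012 "Fundamental Limits of Caching" paper; to the best of my recollection, for N files, K users, and cache size M it takes the form

\[
R(M) \;=\; K \cdot \underbrace{\left(1 - \tfrac{M}{N}\right)}_{\text{local gain}} \cdot \underbrace{\frac{1}{1 + K M / N}}_{\text{global (coding) gain}},
\]

where the first factor is the constant local gain per user and the second factor shrinks like 1/K, i.e., the global gain scales with the number of users, matching the slide's claim.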
Sketch of Proof • Split the users' demands into those for partially cached (popular) files and those for uncached files. • Uncached demands see no caching gain; partially cached demands enjoy the coded caching gain. • Challenges: • The load for uncached demands is bounded by a constant. • The number of uncached demands is governed by a complicated Markov chain. • Big gain for the partially cached demands.
Performance Evaluation • Real-life demand time series extracted from the Netflix Prize data (10 million demands over a one-year period). • Captures dynamic variation of the users' demands.
Performance Evaluation [Figure: average rate vs. size of each isolated cache] • Significant gain due to the coded global gain.
Conclusion • For cache networks, LRU is NOT optimal. • Introduced online coded caching (coded LRS). • Significant gain over LRU. • Proved that coded LRS is approximately optimal under some conditions. • Validated the results on a real-life time series of requests extracted from Netflix.
Further Reading • Maddah-Ali and Niesen, "Fundamental Limits of Caching," Sept. 2012 (IEEE Trans. on Information Theory, March 2014). • Maddah-Ali and Niesen, "Decentralized Coded Caching Attains Order-Optimal Memory-Rate Tradeoff," Jan. 2013 (to appear in IEEE/ACM Trans. on Networking, 2014). • Niesen and Maddah-Ali, "Coded Caching with Nonuniform Demands," Jun. 2013 (submitted to IEEE Trans. on Information Theory). • Pedarsani, Maddah-Ali, and Niesen, "Online Coded Caching," Nov. 2013 (submitted to IEEE/ACM Trans. on Networking). • Karamchandani, Niesen, Maddah-Ali, and Diggavi, "Hierarchical Coded Caching," Jan. 2014 (submitted to IEEE Trans. on Information Theory).