This presentation covers Proxy-Assisted Techniques for Delivering Continuous Multimedia Streams, which reduce service latency and resource requirements by using proxy servers. It walks through the Proxy-Assisted Video Delivery Architecture, Catching, Selective Catching, and simulation results, showing how to utilize server and network resources efficiently for video streaming over the Internet. The proposed architectures reduce the resource demands on the central server and improve the service latency experienced by clients.
Proxy-Assisted Techniques for Delivering Continuous Multimedia Streams Lixin Gao, Zhi-Li Zhang, and Don Towsley
Agenda • Related work • Proxy-Assisted Video Delivery Architecture • Proxy-Assisted Catching • Proxy-Assisted Selective Catching • Simulation results • Conclusion
Related Work • Server-push schemes -> Typically designed for “hot” (frequently requested) objects -> Use a fixed number of multicast channels
Limitations of current technology • Server and network resources (server I/O bandwidth and network bandwidth) are the major factors limiting widespread use of video streaming over the Internet • Techniques are needed to utilize server and network resources efficiently • Both service latency and the popularity of each video object should be considered
Advantages of proxy-assisted video delivery • Latency reduction without increasing the demand on backbone network resources • Only the initial frames need to be stored, so the approach remains feasible despite the large data volume of video • The I/O bandwidth requirement on a proxy server is insignificant, since each proxy is responsible for only a limited number of clients
Classification • Proxy-assisted catching: suited for “hot” video objects • Proxy-assisted selective catching: suited even for “cold” (less frequently requested) video objects
Advantages of proposed architectures • Reduce the resource requirements at the central server • Reduce the service latency experienced by clients Assumptions • A client can receive data from 2 channels simultaneously
Proxy-Assisted Catching • Reduces service latency by allowing clients to join an ongoing broadcast • Clients catch up on the missed initial frames by retrieving them from the proxy over a unicast channel
Proxy-Assisted Catching • A partition function determines how each video is divided across the dedicated server broadcast channels [figure omitted]
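A minimal sketch of the client-side catching idea from the two slides above, in Python. The helper callables and names are hypothetical, and the partition/broadcast scheduling details shown on the slide are not reproduced here.

```python
def serve_with_catching(request_time, broadcast_start_time,
                        fetch_prefix_from_proxy, join_broadcast):
    """Illustrative catching logic (assumed helpers, not the paper's API).

    broadcast_start_time: start time of the ongoing server broadcast that
    the client joins; the gap between it and request_time is the prefix
    the client missed and must catch up on.
    """
    missed_seconds = request_time - broadcast_start_time
    # Channel 1: unicast stream of the missed initial frames from the proxy.
    prefix_stream = fetch_prefix_from_proxy(duration=missed_seconds)
    # Channel 2: the ongoing broadcast of the remainder of the video.
    broadcast_stream = join_broadcast()
    # The client plays the prefix immediately while buffering the broadcast,
    # which is why it must be able to receive two channels at once.
    return prefix_stream, broadcast_stream
```

The two returned streams correspond to the two simultaneous channels each client is assumed to receive.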
Optimizing • Server and network bandwidth are the major bottlenecks, hence the goal is to reduce the total number of channels required • Trade-off between -> The number of channels dedicated by the server -> The storage space required at the proxy
Terms involved • N : No. of video objects on the central server • L : Length of a video • λ : Request rate (Poisson arrival process) • K : No. of server channels used to broadcast a video • K* : Optimal number of server channels • i : Video object index • j : Broadcast frame index
Calculation • No. of proxy channels required: [formula omitted] • Total no. of channels required: [formula omitted] • Trade-off between the number of server channels and the expected number of proxy channels required for catch-up
Calculation contd. • Optimization problem: [formula omitted] • Expected number of channels: [formula omitted] • Yields the optimal no. of server channels and the optimal no. of proxy channels
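A hedged restatement of this optimization in LaTeX. The exact expressions from the slide are not reproduced, so the expected catch-up cost is left abstract and the notation is assumed.

```latex
% Sketch (assumed notation, not the slide's exact formula): for video i,
% choose the number of dedicated server broadcast channels K so that the
% total expected channel usage is minimized. The first term grows with K,
% while the expected unicast catch-up load at the proxy shrinks with K,
% which is the trade-off stated on the slides.
\[
  K_i^{*} \;=\; \operatorname*{arg\,min}_{K \ge 1}
  \Bigl( K \;+\; \mathbb{E}\bigl[\,\text{proxy channels for catch-up} \mid K\,\bigr] \Bigr)
\]
```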
Controlled Multicast • A client-pull technique • A client joins the ongoing multicast if its request arrives within a certain threshold time Ti • Otherwise a new multicast channel is allocated Proxy-Assisted Controlled Multicast • The proxy pre-stores the initial Ti frames of the video • The missing portion of the video is sent separately through a unicast channel • A good technique for “cold” video objects
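A minimal sketch of this admission logic in Python. Class and attribute names are hypothetical, and the missed prefix is modeled simply as elapsed seconds rather than frames.

```python
import time


class ControlledMulticast:
    """Sketch of proxy-assisted controlled multicast for one video object.

    Hypothetical names; bookkeeping is deliberately minimal. The proxy is
    assumed to hold the first `threshold_s` seconds of the video so it can
    unicast the missed prefix to late joiners.
    """

    def __init__(self, threshold_s):
        self.threshold_s = threshold_s       # Ti: join window, in seconds
        self.last_multicast_start = None     # start time of the ongoing multicast

    def admit(self, now=None):
        """Decide how to serve a request arriving at time `now` (seconds)."""
        now = time.time() if now is None else now
        if (self.last_multicast_start is not None
                and now - self.last_multicast_start <= self.threshold_s):
            # Join the ongoing multicast; the proxy unicasts the missed
            # prefix, which is at most Ti seconds long.
            missed = now - self.last_multicast_start
            return "join_multicast", missed
        # Beyond the threshold: allocate a new multicast channel at the server.
        self.last_multicast_start = now
        return "new_multicast", 0.0


if __name__ == "__main__":
    cm = ControlledMulticast(threshold_s=300)   # Ti = 5 minutes (arbitrary choice)
    print(cm.admit(now=0.0))      # -> ('new_multicast', 0.0)
    print(cm.admit(now=120.0))    # -> ('join_multicast', 120.0)
    print(cm.admit(now=500.0))    # -> ('new_multicast', 0.0)
```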
Comparison with Proxy-Assisted Controlled Multicast • Total no. of channels required for controlled multicast: [formula omitted] • For large values of λ, the no. of channels required by proxy-assisted catching is smaller • Verified using the following setup: L = 90 min. video object
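The slide's formula is not reproduced above; as a hedged reconstruction, the closed form commonly cited for controlled multicast with an optimally chosen threshold is sketched below in LaTeX.

```latex
% Hedged reconstruction (recalled from the controlled-multicast literature,
% not copied from the slide): for a video of length L_i with Poisson
% request rate \lambda_i and an optimally chosen threshold T_i, the
% expected number of channels required is approximately
\[
  C_i^{\mathrm{CM}} \;\approx\; \sqrt{2\,\lambda_i L_i + 1} \;-\; 1 ,
\]
% i.e. it grows on the order of \sqrt{\lambda_i L_i}. Catching's channel
% requirement grows much more slowly in \lambda_i, which is consistent
% with the slide's statement that catching needs fewer channels for
% large \lambda.
```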
Observation [figure omitted]
Proxy-Assisted Selective Catching • Combines Proxy-Assisted Catching and Controlled Multicast • Broadcasts the most frequently requested videos using Proxy-Assisted Catching and serves the less frequently requested videos using Controlled Multicast
Classifying “Hot” and “Cold” videos • A video is classified as hot if the total no. of channels required using catching is less than the total no. of channels required using controlled multicast
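A small Python sketch of this classification rule. The two channel-cost values are assumed to be computed elsewhere; their formulas are not reproduced on the slides.

```python
def is_hot(channels_catching, channels_controlled_multicast):
    """Classify a video as 'hot' when proxy-assisted catching would need
    fewer total channels than controlled multicast for that video."""
    return channels_catching < channels_controlled_multicast


def split_hot_cold(channel_costs):
    """channel_costs: {video_id: (catching_channels, controlled_multicast_channels)}.

    Returns the sets of hot and cold video ids under the rule above.
    """
    hot = {vid for vid, (catching, cm) in channel_costs.items() if catching < cm}
    cold = set(channel_costs) - hot
    return hot, cold
```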
Simulation results • Simulation settings: N : No. of video objects on the central server; λ : Request rate (Poisson arrival process) • Simulates 150 hours of client requests • Ki* : Broadcast channels reserved for “hot” video objects • Remaining channels used for controlled multicast • Requests served on a first-come, first-served basis
Assumptions • Sufficient proxy resources to store prefixes for all videos • Proxy server has 40GB of storage space and I/O bandwidth of 88 Mb/s
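A short sketch of how the Poisson request workload used in these simulations could be generated (Python). The function name and the per-minute time unit are assumptions for illustration; the 150-hour horizon and the Poisson arrival model are from the slides.

```python
import random


def generate_poisson_arrivals(rate_per_minute, hours=150, seed=0):
    """Generate request arrival times (in minutes) over the simulation horizon.

    rate_per_minute: the aggregate Poisson request rate lambda. The unit is
    an assumption; the slides only state that arrivals are Poisson with
    rate lambda.
    """
    rng = random.Random(seed)
    horizon = hours * 60.0
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_per_minute)   # exponential inter-arrival gaps
        if t > horizon:
            return arrivals
        arrivals.append(t)


if __name__ == "__main__":
    arrivals = generate_poisson_arrivals(rate_per_minute=50)   # lambda = 50, as in the plots
    print(len(arrivals), "requests over 150 hours")
```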
Waiting time vs. total number of channels (λ = 50) [figure omitted]
Waiting time vs. Arrival rate • λ varies from 40 to 80 • Total no. of channels = 700
Total no. of channels vs. arrival rate [figure omitted] • The performance of selective catching and catching is the same
Waiting time vs. Server channels [figure omitted] • 36% saving in the number of channels required at the central server
Number of channels vs. Arrival rate • Significant reduction in central server channel requirement
Waiting time vs. Server channels • Advantage of proxy-assisted selective catching does not critically depend on availability of proxy storage space
Conclusion • The approach is validated through realistic simulations without major simplifying assumptions • If the arrival rate grows beyond the assumed level, service latency increases