Design and Implementation of a Caching System for Streaming Media over the Internet
Ethendranath Bommaiah, Katherine Guo, Markus Hofmann, and Sanjoy Paul
IEEE Real-Time Technology and Applications Symposium, 2000
Outline
- Focus on the design and implementation issues
- Protocols: RTSP as the control protocol, RTP as the data protocol
- Performance: network load, server load, client start-up latency
Application-layer-aware helper in the network
The streaming cache design
- Helper: a caching and data-forwarding proxy
- Each client is associated with one helper, and client requests are redirected to the client's helper.
- The helper serves the request itself if possible; otherwise it forwards the request to the most appropriate helper or to the server.
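The routing rule above can be sketched as a small decision function. This is an illustrative sketch, not the paper's code: the names are hypothetical, and the paper's actual metric for choosing the "most appropriate" helper is not shown on the slide, so the sketch simply picks any peer that holds the object.

```python
def route_request(url, local_cache, peer_helpers, origin_server):
    """Serve from the local cache if possible; otherwise forward to a
    peer helper that holds the object, falling back to the origin server.
    (Hypothetical API; caches are modeled as plain dicts of url -> data.)"""
    if url in local_cache:
        return ("local", local_cache[url])
    # The paper's "most appropriate helper" selection is not specified
    # here; as a placeholder, take the first peer that caches the URL.
    for helper in peer_helpers:
        if url in helper:
            return ("helper", helper[url])
    return ("server", origin_server)
```

For example, a request for an object cached only at a peer is routed to that peer; a request no helper can serve falls through to the server.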
The streaming cache design (cont'd)
- Segmentation of streaming objects
- Client request aggregation: temporal distance, ring buffer
- Data transfer rate control to reduce startup latency
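Request aggregation by temporal distance can be sketched as follows. This is a minimal sketch under an assumed reading of the slide: a new request may share an existing ring buffer only if its playback position lies within the buffer's configured temporal distance; otherwise the helper must allocate a new buffer. All names are illustrative.

```python
def assign_buffer(request_time, buffers, temporal_distance):
    """Attach a new client request to an existing ring buffer whose
    stream position is within `temporal_distance` seconds; return the
    buffer id, or None if a new buffer (and upstream fetch) is needed.
    `buffers` maps buffer id -> start time of the buffered window."""
    for buf_id, start_time in buffers.items():
        if 0 <= request_time - start_time <= temporal_distance:
            return buf_id
    return None
```

A request arriving 3 s behind an active buffer with a 5 s temporal distance joins that buffer; one arriving 10 s behind does not.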
Startup latency
- Without helper: L0 = 2(d1 + d2) + K
- With helper: L1 = d2 + max(K1·r/b, 2d1) + d2 + (K − K1)·r/min(a, b)
- The client does not start playing until its playout buffer is filled.
Startup latency (cont'd)
- The client does not start playing until its playout buffer is filled.
- With helper: the helper downloads K1 seconds of data to the client, and requests the remaining K − K1 seconds of data from its local disk, another helper, or the server.
Start-up latency when getting data from different sources
- With d1 = 0 and b > a: L1 = d2 + K1·r/b + d2 + (K − K1)·r/b
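The latency formulas above can be checked numerically. A sketch under the symbol readings the formulas suggest (these readings are assumptions, not stated on the slides): r is the stream's data rate, b the helper-to-client bandwidth, a the server-to-helper bandwidth, K the playout buffer in seconds, K1 the prefix served by the helper, and d1, d2 the client-to-helper and helper-to-server delays.

```python
def startup_latency_without_helper(d1, d2, K):
    # L0 = 2(d1 + d2) + K: a round trip across both hops, plus the time
    # to fill the K-second playout buffer (as given on the slide).
    return 2 * (d1 + d2) + K

def startup_latency_with_helper(d1, d2, K, K1, r, a, b):
    # L1 = d2 + max(K1*r/b, 2*d1) + d2 + (K - K1)*r/min(a, b):
    # the K1-second prefix arrives from the helper while the remaining
    # K - K1 seconds are fetched over the slower of the two paths.
    return d2 + max(K1 * r / b, 2 * d1) + d2 + (K - K1) * r / min(a, b)
```

For instance, with d1 = 1, d2 = 2 and K = 5, the helper-less latency is 2(1 + 2) + 5 = 11; with a cached prefix the helper path can do substantially better.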
Main module of a helper
Implementation
- RTSP/RTP client and server
- Buffer management: attach a new request to an existing buffer, or allocate a new buffer
- Cache management: maps URLs to local filenames; manages the disk space allocated for caching
- Scheduler: manages the global queue of events (producer and consumer events)
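The scheduler component can be sketched as a time-ordered event queue, with producer events (packets arriving from upstream) and consumer events (packets forwarded to clients) as the callbacks. A minimal sketch, assuming a single-threaded event loop; the class and method names are illustrative, not the paper's API.

```python
import heapq

class Scheduler:
    """Global event queue: events fire in timestamp order, modeling the
    helper's producer and consumer events (illustrative sketch)."""

    def __init__(self):
        self._queue = []   # heap of (fire_time, seq, callback)
        self._seq = 0      # tie-breaker: FIFO order for equal times

    def schedule(self, fire_time, callback):
        heapq.heappush(self._queue, (fire_time, self._seq, callback))
        self._seq += 1

    def run_until(self, now):
        """Fire every event due at or before `now`, in time order,
        and return the list of callback results."""
        fired = []
        while self._queue and self._queue[0][0] <= now:
            _, _, callback = heapq.heappop(self._queue)
            fired.append(callback())
        return fired
```

A producer event scheduled for t = 1 fires before a consumer event scheduled for t = 2, regardless of insertion order.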
Buffer organization
Buffer management
- Modeled by producer and consumer events
- Garbage collection: the buffer's temporal distance is statically chosen, but the number of packets within the ring may vary. Solution: associate a reference count with each RTP packet, and use a garbage-collection event to free packets after they have been forwarded by the last consumer.
- Outgoing stream composition: RTP SSRC (synchronization source identifier) and timestamp
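The reference-counting scheme for ring-buffer slots can be sketched directly. This is a sketch of the mechanism described above, not the paper's code: each packet slot starts with one reference per attached consumer, and the slot becomes reclaimable once the last consumer has forwarded the packet.

```python
class RingSlot:
    """One RTP packet slot in the ring buffer, with a reference count
    so garbage collection frees it only after the last consumer is done."""

    def __init__(self, packet, consumers):
        self.packet = packet
        self.refcount = consumers   # one reference per attached client

    def release(self):
        """Called when one consumer has forwarded this packet; returns
        True when the slot may be reclaimed by the GC event."""
        self.refcount -= 1
        return self.refcount == 0
```

With two attached clients, the first release leaves the packet alive; the second marks it collectable.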
Timestamp translation
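When the helper splices an outgoing stream together from different sources (its cache, another helper, the server), the client must still see one continuous RTP stream, so timestamps are translated onto a single outgoing timeline. A minimal sketch of that translation, assuming each timeline is anchored by the first timestamp seen on it; the function and parameter names are illustrative.

```python
def translate_timestamp(source_ts, source_base, output_base):
    """Map an RTP timestamp from an incoming stream (cached segment or
    upstream feed) onto the outgoing stream's timeline, preserving the
    offset from each timeline's base so spliced packets stay continuous."""
    return output_base + (source_ts - source_base)
```

A packet stamped 1500 on a source whose timeline began at 1000 becomes 90500 on an outgoing timeline that began at 90000.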
Experimental results
- Server: RealServer on a Sun Ultra-4 workstation with 4 processors and 1 GB main memory
- Helper: 400 MHz Pentium II with 250 MB main memory
- Client: 300 MHz Pentium Pro with 250 MB main memory
- Network: 10 Mbps Ethernet
Traffic reduction ratio
- R = (D_out − D_in) / D_out
- D_out: data transferred from the helper to the client
- D_in: data transferred from the server to the helper
- A larger value of R indicates a greater reduction in server load and network traffic.
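The ratio is straightforward to compute; a one-line sketch using the slide's definitions:

```python
def traffic_reduction_ratio(d_out, d_in):
    """R = (D_out - D_in) / D_out. D_out is the data sent from the helper
    to the client; D_in is the data fetched from the server to the helper.
    R near 1 means most data was served from the helper's cache."""
    return (d_out - d_in) / d_out
```

For example, if the helper delivers 100 MB to clients while fetching only 25 MB from the server, R = 0.75.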
Prefix caching benefits (no cache replacement)
Buffer request aggregation benefits
Improvement in startup latency (K = 5 sec)