
1 Author: Francis Chang, Wu-chang Feng, Kang Li  Publisher: INFOCOM 2004  Presenter: Yun-Yan Chang  Date: 2010/12/01

2
 Introduction
 The Bloom filter
 Extension of the Bloom filter
 Bloom filter aging

3  Proposes a modified Bloom filter that allows a small amount of misclassification, which can reduce the size of the packet-classification cache without reducing hit rates.

4
 A space-efficient data structure to store and query set-membership information.
 The data structure consists of M = N × L bins; the bins are organized into L levels with N bins in each level, creating N^L virtual bins.
 Each of the L hash functions can address all M bit buckets.
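The level structure above can be sketched in a few lines of Python. This is an illustrative sketch, not the paper's implementation: it assumes one hash function per level, with each function addressing the N bins of its own level (class and method names are my own).

```python
import hashlib

class PartitionedBloomFilter:
    """Sketch of an L-level Bloom filter: L levels of N bins each
    (M = N * L physical bins). An element maps to one bin per level,
    so the L selected bins jointly act as one of N**L virtual bins."""

    def __init__(self, n_bins, n_levels):
        self.n = n_bins
        self.l = n_levels
        # L rows of N bits each
        self.bins = [[0] * n_bins for _ in range(n_levels)]

    def _index(self, item, level):
        # One independent hash per level (SHA-256 stands in for the
        # hardware-friendly hashes a router would actually use)
        digest = hashlib.sha256(f"{level}:{item}".encode()).hexdigest()
        return int(digest, 16) % self.n

    def insert(self, item):
        for lvl in range(self.l):
            self.bins[lvl][self._index(item, lvl)] = 1

    def query(self, item):
        # True if the bin is set at every level; false positives are
        # possible, false negatives are not
        return all(self.bins[lvl][self._index(item, lvl)]
                   for lvl in range(self.l))
```

For example, after `insert("flow-e")`, `query("flow-e")` is always true, while an unseen flow only matches if all L of its bins happen to collide with set bits.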

5 Figure 1: An example: a Bloom filter with N = 5 bins and L = 3 hash levels. Suppose we wish to insert an element, e.

6  Multiple Predicates
◦ Goal
 To extend the storage capability.
◦ Consider a router with I interfaces; the cache must store a routing interface number for each flow.
◦ Construct a cache composed of I Bloom filters to record I binary predicates.
◦ To look up the forwarding interface number of a flow e, query all I Bloom filters.

7
◦ If e is a member of the ith Bloom filter, flow e should be routed through the ith interface.
◦ If e is not a member of any Bloom filter, e has not been cached.
◦ In the unlikely event that more than one Bloom filter claims e as a member:
 One solution to this problem is to treat the cache lookup as a miss and reclassify e.

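The three lookup rules above can be sketched as follows. This is a minimal illustration with my own names, using simple bit lists in place of a real Bloom filter implementation; the hashing scheme is an assumption, not the paper's.

```python
import hashlib

def _bloom_indices(item, n_bins, n_levels):
    # One hash per level, each addressing n_bins bins (illustrative)
    return [int(hashlib.sha256(f"{lvl}:{item}".encode()).hexdigest(), 16) % n_bins
            for lvl in range(n_levels)]

class MultiPredicateCache:
    """Sketch: I Bloom filters, one per router interface. A usable hit
    means exactly one filter claims the flow; an ambiguous result
    (multiple claimants) is treated as a miss and reclassified."""

    def __init__(self, n_interfaces, n_bins=1024, n_levels=3):
        self.n_bins, self.n_levels = n_bins, n_levels
        # One L-level bit array per interface
        self.filters = [[[0] * n_bins for _ in range(n_levels)]
                        for _ in range(n_interfaces)]

    def cache(self, flow, iface):
        bits = self.filters[iface]
        for lvl, idx in enumerate(_bloom_indices(flow, self.n_bins, self.n_levels)):
            bits[lvl][idx] = 1

    def lookup(self, flow):
        """Return the cached interface for `flow`, or None on a miss.
        More than one matching filter is ambiguous: also a miss."""
        idxs = _bloom_indices(flow, self.n_bins, self.n_levels)
        matches = [i for i, bits in enumerate(self.filters)
                   if all(bits[lvl][idx] for lvl, idx in enumerate(idxs))]
        return matches[0] if len(matches) == 1 else None
```

A `None` result covers both cases that require the classification engine: the flow was never cached, or more than one filter claimed it.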

9 Figure 5: An example: a modified Bloom filter with 5 buckets and 2 hash levels, supporting a router with 8 interfaces. Suppose we wish to cache a flow e that gets routed to interface number 2.

10  Multi-Predicate Comparison
Figure 7: Effect of storing routing information on effective cache size, p = 1e−9, using optimal Bloom filter dimensions.

11  Cold Cache
◦ Empty the cache whenever the Bloom filter becomes full.
◦ Advantage
 Makes full use of all of the memory devoted to the cache.
◦ Disadvantage
 While the cache is being emptied, it cannot be used.
 All cached flows must be re-classified after the cache is emptied, causing a load spike in the classification engine.
 Zeroing out the cache may cause a burst of memory accesses.

12  Double-Buffering
◦ Partition the cache into two Bloom filters: an active cache and a warm-up cache.
◦ Goal
 To avoid the high number of cache misses immediately following cache cleaning.
◦ Disadvantage
 Doubles the memory requirement to store the same number of concurrent flows.
 Zeroing out the expired cache still causes a load spike on the memory bus.
 Potentially doubles the number of memory accesses required to store a new flow.

13 Double-Buffering Algorithm

when a new packet arrives
    if the flow id is in the active cache
        if the active cache is more than ½ full
            insert the flow id into the warm-up cache
        allow packet to proceed
    otherwise
        perform a full classification
        if the classifier allows the packet
            insert the flow id into the active cache
            if the active cache is more than ½ full
                insert the flow id into the warm-up cache
            allow packet to proceed
    if the active cache is full
        switch the active cache and warm-up cache
        zero out the old active cache
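The pseudocode above can be sketched directly in Python. To keep the buffering logic visible, plain sets stand in for the two Bloom filters (an assumption for illustration, not the paper's data structure), and "full" is measured as a flow count against a fixed capacity.

```python
class DoubleBufferedCache:
    """Sketch of the double-buffering scheme: new flows go into the
    active cache; once it passes half full, flows are also copied into
    the warm-up cache, which takes over when the active cache fills."""

    def __init__(self, capacity, classifier):
        self.capacity = capacity      # max flows per cache ("full")
        self.classify = classifier    # returns True if the packet is allowed
        self.active, self.warmup = set(), set()

    def handle_packet(self, flow):
        if flow in self.active:
            # Cache hit; keep warming up the standby cache past half full
            if len(self.active) > self.capacity // 2:
                self.warmup.add(flow)
            allowed = True
        else:
            # Cache miss: perform a full classification
            allowed = self.classify(flow)
            if allowed:
                self.active.add(flow)
                if len(self.active) > self.capacity // 2:
                    self.warmup.add(flow)
        if len(self.active) >= self.capacity:
            # Switch the caches and zero out the old active cache
            self.active, self.warmup = self.warmup, set()
        return allowed
```

Because every flow inserted after the half-full point also lands in the warm-up cache, the newly promoted cache already contains the most recent flows, which is what avoids the miss spike of the cold-cache approach.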


15
Figure 9: Cache hit rates as a function of memory, M.
Figure 11: Average cache misses as a function of memory, M.
For a memory-starved system, the cold-cache approach is more effective with respect to cache hit rates.

16
Figure 12: Variance of cache misses as a function of memory, M (aggregated over 100 ms timescales).
The variance in miss rates decreases much faster in the double-buffered case than in the cold-cache approach.

