
1 Minimize average access time
Items have weights: item i has weight w_i. Let W = Σ w_i be the total weight of the items. We want searches for heavy items to be faster. If p_i = w_i/W is the access frequency of item i, then the average access time is Σ p_i d_i, where d_i is the depth of item i.
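As a quick check of the definition, here is a small Python sketch; the weights and depths are made-up illustrative values, not from the slides:

```python
# Hypothetical weights and depths, chosen only to illustrate the formula.
weights = {"a": 8, "b": 4, "c": 2, "d": 2}   # w_i
depths = {"a": 1, "b": 2, "c": 3, "d": 3}    # d_i in some search tree
W = sum(weights.values())                    # total weight W

# Average access time: sum over items of p_i * d_i, with p_i = w_i / W.
avg = sum((w / W) * depths[i] for i, w in weights.items())
print(avg)  # heavy items at small depth keep this sum low
```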

2 There is a lower bound
For every tree with maximum degree b, Σ p_i d_i ≥ Σ p_i log_b(1/p_i). So we will be looking for trees in which d_i = O(log(W/w_i)). In particular, if all weights are equal, the regular search trees which we have studied will do the job.
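The bound can be checked numerically. A small sketch (the weights are hypothetical) comparing the entropy lower bound for b = 2 with depths of the form ⌈log_2(W/w_i)⌉:

```python
import math

weights = [8, 4, 2, 2]                 # hypothetical item weights w_i
W = sum(weights)
p = [w / W for w in weights]           # access frequencies p_i

# Entropy lower bound for a tree of maximum degree b (binary here, b = 2).
b = 2
lower = sum(pi * math.log(1 / pi, b) for pi in p)

# Depths of the promised form d_i = O(log(W/w_i)), here ceil(log2(W/w_i)).
depths = [math.ceil(math.log2(W / w)) for w in weights]
avg = sum(pi * d for pi, d in zip(p, depths))
assert avg >= lower - 1e-9             # average access time never beats the entropy
```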

3 Static setup: we know the access frequencies
You can find the best tree in O(n log(n)) time (homework).

4 Approximation (Mehlhorn)

5 Approximation (Mehlhorn)
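The figures for these two slides did not survive the transcript. As a hedged sketch of one common form of the weight-balancing rule attributed to Mehlhorn: put at the root the item at which the running prefix weight first reaches half the total weight, and recurse on both sides. The names and tuple representation below are my own:

```python
def build(items):
    """items: list of (key, weight) pairs in key order.
    Bisection rule (Mehlhorn-style sketch): root at the item where the
    running prefix weight first reaches half the total, then recurse."""
    if not items:
        return None
    total = sum(w for _, w in items)
    prefix = 0
    for r, (_, w) in enumerate(items):
        prefix += w
        if prefix * 2 >= total:        # crossed the halfway point
            break
    key = items[r][0]
    return (build(items[:r]), key, build(items[r + 1:]))

def depth(tree, key, d=0):
    left, k, right = tree
    if key == k:
        return d
    return depth(left if key < k else right, key, d + 1)

t = build([("a", 8), ("b", 1), ("c", 1), ("d", 1)])
print(depth(t, "a"))  # the heavy item lands at or near the root
```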

19 Analysis
The sum of the weights of the pieces that correspond to an internal node is no larger than the length of the corresponding interval. An internal node at level i corresponds to an interval of length 1/2^i.

20 Analysis

21 Biased 2-b trees (Bent, Sleator, Tarjan 1980)

22 Biased 2-b trees: definition
Internal nodes have degree between 2 and b. Define the rank of a node x in a 2-b tree recursively as follows: if x is a leaf containing item i, then r(x) = ⌊log_2 w_i⌋; if x is an internal node, then r(x) = 1 + max {r(y) | y is a child of x}. We also need an additional property.
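The rank definition transcribes directly. The tuple-based tree below is a hypothetical shape used only to exercise the recursion; it is not necessarily a valid biased tree:

```python
import math

# Leaves are integer weights; internal nodes are tuples of children.
def rank(node):
    """r(x) = floor(log2 w_i) for a leaf holding weight w_i;
       r(x) = 1 + the maximum rank of a child for an internal node."""
    if isinstance(node, int):                    # leaf: its weight
        return math.floor(math.log2(node))
    return 1 + max(rank(child) for child in node)

tree = ((8, 9), 4, (1, 1))   # hypothetical root with three subtrees
print(rank(tree))
```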

23 Biased 2-3 tree (example)

24 Biased 2-b trees: definition (cont.)
Here is the additional property. Call x major if r(x) = r(p(x)) - 1; otherwise x is minor. Local bias: any neighboring sibling of a minor node is a major leaf. In case all weights are the same, this implies that all leaves are at the same level, and we get regular 2-b trees.

25 Biased 2-3 trees example revisited

26 Are the access times ok?
Define the size of a node x in a 2-b tree recursively as follows: if x is a leaf containing item i, then s(x) = w_i; if x is an internal node, then s(x) = Σ_{y a child of x} s(y).
Lemma: For any node x, 2^(r(x)-1) ≤ s(x); for a leaf x, 2^(r(x)) ≤ s(x) < 2^(r(x)+1).
==> If x is a leaf of depth d, then d < log(W/w_i) + 2.
Proof: d ≤ r(root) - r(x) < (log(s(root)) + 1) - (log(s(x)) - 1) = log(W/w_i) + 2.

27 Are the access times ok? (cont.)
Lemma: For any node x, 2^(r(x)-1) ≤ s(x); for a leaf x, 2^(r(x)) ≤ s(x) < 2^(r(x)+1).
Proof: By induction on r(x).
If x is a leaf, the definition r(x) = ⌊log_2 s(x)⌋ implies that 2^(r(x)) ≤ s(x) < 2^(r(x)+1).
If x is an internal node with a minor child, then x has a major child which is a leaf, say y. So 2^(r(x)-1) = 2^(r(y)) ≤ s(y) < s(x).
If x is an internal node with no minor child, then it has at least two major children y and z: 2^(r(x)-1) = 2^(r(y)-1) + 2^(r(z)-1) ≤ s(y) + s(z) ≤ s(x).
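A numeric spot-check of the lemma on a small hand-built tree (leaves are integer weights, internal nodes are tuples of children; the shape is a hypothetical one chosen to satisfy local bias):

```python
import math

def rank(node):
    # r(x) = floor(log2 w_i) for a leaf of weight w_i, else 1 + max child rank.
    if isinstance(node, int):
        return math.floor(math.log2(node))
    return 1 + max(rank(c) for c in node)

def size(node):
    # s(x) = w_i for a leaf, else the sum of the children's sizes.
    if isinstance(node, int):
        return node
    return sum(size(c) for c in node)

def check(node):
    # Lemma: 2^(r(x)-1) <= s(x); for a leaf, 2^r(x) <= s(x) < 2^(r(x)+1).
    assert 2 ** (rank(node) - 1) <= size(node)
    if isinstance(node, int):
        assert 2 ** rank(node) <= node < 2 ** (rank(node) + 1)
    else:
        for c in node:
            check(c)

check(((8, 9), 16, 2))   # no AssertionError: the bounds hold at every node
```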

28 Concatenation (example)

29 Catenation (definition)
Traverse the right path of the tree rooted at r and the left path of the tree rooted at r' concurrently. Go down one step from the node of higher rank. Stop either when the ranks are equal or when the node of higher rank is a leaf. W.l.o.g. let rank(x) ≥ rank(y); if rank(x) > rank(y), then x is a leaf. Note that rank(p(y)) ≥ rank(x) (otherwise we would not have traversed y, but continued from x or stopped).

30 Catenation (definition)
Let v be the node among p(x) and p(y) of minimum rank. Assume v = p(x); the other case is symmetric.

31 Catenation (definition)
Case 1: If the rank of v is larger by at least 2 than the rank of x, stick x and y as children of a new node g, stick g underneath v, and merge the paths by rank.

32 Catenation (definition)
Case 2: If the rank of v is larger by 1 than the rank of x, add y as a child of v and merge the paths by rank.

33 Concatenation (example)

34 Catenation (definition)
Note that in both cases local bias is preserved!

35 Catenation (the symmetric case)
If v = p(y): note that if y is minor, then x is a major leaf.

36 Catenation (definition)
Traverse the right path of the tree rooted at x and the left path of the tree rooted at y concurrently. Go down one step from the node of higher rank. Stop either when the ranks are equal or when the node of higher rank is a leaf. Merge the traversed paths, ordering nodes by rank.
Case 1: If the rank of the rank-largest node of the last two nodes is one smaller than the rank of the smallest-rank node w above this pair, then stick the last two nodes as children of w and merge the paths by rank. Split w if necessary, and continue splitting as long as a major node splits (the nodes resulting from a split have the same rank). When a minor node splits, add a new node which is a parent of the two nodes resulting from the split, and stop. Otherwise, stop when the root splits.

37 Catenation (definition)
Case 2: If the rank of the rank-largest node of the last two nodes is smaller by at least 2 than the rank of the smallest-rank node w above this pair, then stick the last two nodes as children of a new node g. Stick g underneath the smaller-rank parent of the last two nodes. Merge the paths by rank.

38 Catenation (splitting the high-degree node)
It could be that we have to split a high-degree node. We split as long as we have a high-degree node; when a minor node splits, we add a new parent to the two pieces and stop.
Why does a node split into two nodes of the same rank? Because we can't have two consecutive minor siblings.

39 Catenation (proof of correctness)
Correctness follows from the following observations:
Obs 1: Before splitting, every minor node stands where a minor node used to stand in one of the original trees.
Obs 2: Splitting preserves local bias.

40 Catenation (worst-case analysis)
Worst-case bound: O(max{r(x), r(y)} - max{r(u), r(v)}) = O(log(W/(w⁻ + w⁺))), where x and y are the two roots, u is the rightmost leaf descendant of x, v is the leftmost leaf descendant of y, w⁻ = s(u), w⁺ = s(v), and W is the total weight of both trees. In particular, if y is a leaf and x is the root of a big tree of weight W, then this bound is O(log(W/s(y))).

41 Catenation (amortized analysis)
Amortized bound: O(|r(x) - r(y)|).
Potential (definition): every minor node x has r(p(x)) - r(x) - 1 credits; Φ = the total number of credits.
Proof: We want the potential to decrease by one for every node of rank smaller than r(y) that we traverse.

42 Catenation (amortized analysis)

43 Catenation (amortized analysis)
f had r(e) - r(f) - 1 credits; g needs r(d) - r(g) - 1 credits, which is smaller by at least 2 (in general, smaller by at least 1 plus the number of blue nodes in the figure). d had r(c) - r(d) - 1 credits and now needs r(e) - r(d) - 1. The number of released credits is at least the number of pink nodes in the figure.

44 3-way concatenation (example)

45 3-way concatenation
Do two successive 2-way catenations.
Analysis: amortized O(max{r(x), r(y), r(z)} - min{r(x), r(y), r(z)}); worst-case O(max{r(x), r(y), r(z)} - r(y)).

46 2-way split
Similar to what we did for regular search trees. Suppose we split at a leaf y which is in the tree. We go up from y towards the root x and accumulate a left tree and a right tree by successive 2-way catenations.
Analysis: to split a tree with root x at a leaf y, amortized O(r(x) - r(y)) = O(log(W/s(y))).

47 3-way split
Splitting at an item i which is not in the tree. Let i⁻ be the largest item in the tree that is smaller than i, and let i⁺ be the smallest item in the tree that is bigger than i. Let y be the lowest common ancestor of i⁻ and i⁺. The initial left tree is formed from the children of y containing items less than i; the initial right tree is formed from the children of y containing items bigger than i.
Analysis: to split a tree with root x at an item i not in the tree, amortized O(r(x) - r(y)) = O(log(W/(s(i⁻) + s(i⁺)))).

48 Other operations Define delete, insert, and weight change in a straightforward way in terms of catenate and split.
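A pseudocode sketch of how these reductions might look; split2, split3, catenate2, catenate3, and make_leaf are hypothetical stand-ins for the splits and catenations described above, not operations defined on the slides:

```
insert(T, i, w):
    # i is not in T: 3-way split at i, then 3-way catenate with a new leaf
    (L, R) = split3(T, i)
    return catenate3(L, make_leaf(i, w), R)

delete(T, i):
    # 2-way split at the leaf holding i, then 2-way catenate the rest
    (L, y, R) = split2(T, i)
    return catenate2(L, R)

change_weight(T, i, w):
    # remove the old leaf and rejoin with a leaf of the new weight
    (L, y, R) = split2(T, i)
    return catenate3(L, make_leaf(i, w), R)
```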

49 Extensions
There are many variants: binary variants, and variants that have good bounds for all operations in the worst case.