Binary Search Trees (Dictionaries)


1 Binary Search Trees (Dictionaries)
[Title slide figure: a binary search tree with keys 6, 2, 9, 1, 4, 8 and the comparison labels <, =, >]

2 Ordered Dictionaries Keys are assumed to come from a total order.
New operations: closestKeyBefore(k), closestElemBefore(k), closestKeyAfter(k), closestElemAfter(k)

3 Binary Search (§3.1.1) Binary search performs operation findElement(k) on a dictionary implemented by means of an array-based sequence sorted by key. It works like the high-low game: at each step the number of candidate items is halved, so the search terminates after O(log n) steps. Example: findElement(7) [Figure: the sorted array 1 3 4 5 7 8 9 11 14 16 18 19 with the low (l), middle (m), and high (h) indices narrowing onto 7 until l = m = h]
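A minimal Python sketch of this array-based findElement; the function and variable names (find_element, lo, hi) are illustrative rather than taken from the slides' pseudocode:

    def find_element(A, k):
        # A is sorted by key; return the index of k, or None for NO_SUCH_KEY
        lo, hi = 0, len(A) - 1
        while lo <= hi:
            mid = (lo + hi) // 2          # halve the candidate range each step
            if A[mid] == k:
                return mid
            elif A[mid] < k:
                lo = mid + 1
            else:
                hi = mid - 1
        return None

    # The example from the slide: searching for 7
    keys = [1, 3, 4, 5, 7, 8, 9, 11, 14, 16, 18, 19]
    print(find_element(keys, 7))          # prints 4, the index of key 7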

4 Lookup Table (§3.1.1) A lookup table is a dictionary implemented by means of a sorted sequence: we store the items of the dictionary in an array-based sequence, sorted by key, and we use an external comparator for the keys. Performance: findElement takes O(log n) time, using binary search; insertItem takes O(n) time, since in the worst case we have to shift n/2 items to make room for the new item; removeElement takes O(n) time, since in the worst case we have to shift n/2 items to compact the items after the removal. The lookup table is effective only for dictionaries of small size or for dictionaries on which searches are the most common operations, while insertions and removals are rarely performed (e.g., credit card authorizations).
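A brief sketch, assuming Python's bisect module and storing only keys (no elements) for brevity, that shows where the O(log n) search and the O(n) shifting come from:

    import bisect

    keys = []                                 # sorted array-based sequence of keys

    def insert_item(k):
        bisect.insort(keys, k)                # O(log n) to locate + O(n) to shift

    def find_element(k):
        i = bisect.bisect_left(keys, k)       # O(log n) binary search
        return keys[i] if i < len(keys) and keys[i] == k else None

    def remove_element(k):
        i = bisect.bisect_left(keys, k)
        if i < len(keys) and keys[i] == k:
            keys.pop(i)                       # O(n) shift to compact the sequence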

5 Binary Search Tree (§3.1.2) A binary search tree is a binary tree storing keys (or key-element pairs) at its internal nodes and satisfying the following property: let u, v, and w be three nodes such that u is in the left subtree of v and w is in the right subtree of v; then key(u) ≤ key(v) ≤ key(w). External nodes do not store items. An inorder traversal of a binary search tree visits the keys in increasing order. [Figure: a binary search tree with root 6, left subtree 2 (children 1 and 4), and right subtree 9 (child 8)]
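A minimal sketch of a node type and of inorder traversal, using None in place of the external (leaf) nodes, which store no items; the class and names are illustrative:

    class Node:
        def __init__(self, key, left=None, right=None):
            self.key, self.left, self.right = key, left, right

    def inorder(v):
        # for a binary search tree, yields the keys in increasing order
        if v is not None:
            yield from inorder(v.left)
            yield v.key
            yield from inorder(v.right)

    # The tree from the slide
    root = Node(6, Node(2, Node(1), Node(4)), Node(9, Node(8)))
    print(list(inorder(root)))    # [1, 2, 4, 6, 8, 9]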

6 Search (§3.1.3) To search for a key k, we trace a downward path starting at the root. The next node visited depends on the outcome of the comparison of k with the key of the current node. If we reach a leaf, the key is not found and we return NO_SUCH_KEY. Example: findElement(4)
Algorithm findElement(k, v)
  if T.isExternal(v)
    return NO_SUCH_KEY
  if k < key(v)
    return findElement(k, T.leftChild(v))
  else if k = key(v)
    return element(v)
  else { k > key(v) }
    return findElement(k, T.rightChild(v))
[Figure: searching for 4 follows the path 6 (<), 2 (>), 4 (=)]
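A Python sketch of the same search; None stands in both for external nodes and for NO_SUCH_KEY, and the small Node class is repeated so the snippet stands alone:

    class Node:
        def __init__(self, key, left=None, right=None):
            self.key, self.left, self.right = key, left, right

    def find_element(v, k):
        if v is None:                         # reached an external node: key not found
            return None                       # plays the role of NO_SUCH_KEY
        if k < v.key:
            return find_element(v.left, k)
        elif k == v.key:
            return v.key                      # with key-element pairs, return element(v)
        else:
            return find_element(v.right, k)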

7 Insertion (§3.1.4) To perform operation insertItem(k, o), we search for key k. Assume k is not already in the tree, and let w be the leaf reached by the search. We insert k at node w and expand w into an internal node. Example: insert 5 [Figure: the search for 5 ends at an external node w below 4; that leaf becomes an internal node storing 5]
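A minimal recursive sketch of the insertion; the slide assumes k is not already present, and this version simply sends equal keys to the right subtree:

    class Node:
        def __init__(self, key, left=None, right=None):
            self.key, self.left, self.right = key, left, right

    def insert_item(v, k):
        # returns the (possibly new) root of the subtree after inserting k
        if v is None:                 # reached the leaf w: expand it into an internal node
            return Node(k)
        if k < v.key:
            v.left = insert_item(v.left, k)
        else:
            v.right = insert_item(v.right, k)
        return v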

8 Deletion (§3.1.5) To perform operation removeElement(k), we search for key k. Assume key k is in the tree, and let v be the node storing k. If node v has a leaf child w, we remove v and w from the tree with operation removeAboveExternal(w). Example: remove 4 [Figure: node 4 has a leaf child w; after removeAboveExternal(w), its other child 5 takes its place]

9 Deletion (cont.) We consider the case where the key k to be removed is stored at a node v whose children are both internal: we find the internal node w that follows v in an inorder traversal, we copy key(w) into node v, and we remove node w and its left child z (which must be a leaf) by means of operation removeAboveExternal(z). Example: remove 3 [Figure: key 3 at node v is overwritten by its inorder successor 5, and the successor's old node is removed]
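A Python sketch covering both deletion cases (a leaf child, and two internal children handled via the inorder successor); as before, None plays the role of the external nodes:

    class Node:
        def __init__(self, key, left=None, right=None):
            self.key, self.left, self.right = key, left, right

    def remove_element(v, k):
        # returns the root of the subtree after removing key k, if present
        if v is None:
            return None
        if k < v.key:
            v.left = remove_element(v.left, k)
        elif k > v.key:
            v.right = remove_element(v.right, k)
        else:
            if v.left is None:            # at most one internal child:
                return v.right            # splice v out (removeAboveExternal)
            if v.right is None:
                return v.left
            w = v.right                   # both children internal: find the inorder
            while w.left is not None:     # successor w (leftmost node of right subtree)
                w = w.left
            v.key = w.key                 # copy key(w) into v ...
            v.right = remove_element(v.right, w.key)   # ... then remove w
        return v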

10 Performance (§3.1.6) Consider a dictionary with n items implemented by means of a binary search tree of height h: the space used is O(n), and the methods findElement, insertItem, and removeElement take O(h) time. The height h is O(n) in the worst case and O(log n) in the best case.

11 (2,4) Trees [Title slide figure: a (2,4) tree node with keys 9, 10, 14]

12 Multi-Way Search Tree A multi-way search tree is an ordered tree such that each internal node has at least two children and stores d - 1 key-element items (ki, oi), where d is the number of children. For a node with children v1 v2 … vd storing keys k1 k2 … kd-1: keys in the subtree of v1 are less than k1; keys in the subtree of vi are between ki-1 and ki (i = 2, …, d - 1); keys in the subtree of vd are greater than kd-1. The leaves store no items and serve as placeholders. [Figure: an example multi-way search tree]

13 Multi-Way Inorder Traversal
We can extend the notion of inorder traversal from binary trees to multi-way search trees: namely, we visit item (ki, oi) of node v between the recursive traversals of the subtrees of v rooted at children vi and vi+1. An inorder traversal of a multi-way search tree visits the keys in increasing order. [Figure: an example multi-way search tree annotated with the inorder visit ranks of its keys]
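A compact sketch of a multi-way node and its inorder traversal; the MultiNode class and field names are illustrative, with None entries standing for the external placeholder leaves:

    class MultiNode:
        def __init__(self, keys, children=None):
            self.keys = keys                                      # k1 < k2 < ... < k(d-1)
            self.children = children or [None] * (len(keys) + 1)  # children v1 ... vd

    def inorder(v):
        # visit key ki of node v between the traversals of children vi and v(i+1)
        if v is None:
            return
        for i, k in enumerate(v.keys):
            yield from inorder(v.children[i])
            yield k
        yield from inorder(v.children[-1])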

14 Multi-Way Searching Similar to search in a binary search tree
At each internal node with children v1 v2 … vd and keys k1 k2 … kd-1: if k = ki (i = 1, …, d - 1), the search terminates successfully; if k < k1, we continue the search in child v1; if ki-1 < k < ki (i = 2, …, d - 1), we continue the search in child vi; if k > kd-1, we continue the search in child vd. Reaching an external node terminates the search unsuccessfully. Example: search for 30 [Figure: the search path for key 30 in the example tree]
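A Python sketch of this multi-way search, reusing the illustrative MultiNode shape (a keys list and a children list, with None for external nodes):

    class MultiNode:
        def __init__(self, keys, children=None):
            self.keys = keys
            self.children = children or [None] * (len(keys) + 1)

    def find_element(v, k):
        if v is None:                          # reached an external node: unsuccessful
            return None
        for i, ki in enumerate(v.keys):
            if k == ki:                        # k = ki: terminate successfully
                return ki
            if k < ki:                         # ki-1 < k < ki: continue in child vi
                return find_element(v.children[i], k)
        return find_element(v.children[-1], k) # k > k(d-1): continue in child vd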

15 (2,4) Tree A (2,4) tree (also called a 2-4 tree or 2-3-4 tree) is a multi-way search tree with the following properties. Node-Size Property: every internal node has at most four children. Depth Property: all the external nodes have the same depth. Depending on the number of children, an internal node of a (2,4) tree is called a 2-node, 3-node, or 4-node. [Figure: an example (2,4) tree]

16 Height of a (2,4) Tree Theorem: A (2,4) tree storing n items has height O(log n). Proof: Let h be the height of a (2,4) tree with n items. Since there are at least 2^i items at depth i = 0, …, h - 1 and no items at depth h, we have n ≥ 1 + 2 + 4 + … + 2^(h-1) = 2^h - 1. Thus, h ≤ log (n + 1). It follows that searching in a (2,4) tree with n items takes O(log n) time. [Figure: the tree drawn level by level, with at least 1, 2, …, 2^(h-1) items at depths 0, 1, …, h - 1 and none at depth h]

17 Insertion We insert a new item (k, o) at the parent v of the leaf reached by searching for k. We preserve the depth property, but we may cause an overflow (i.e., node v may become a 5-node). Example: inserting key 30 causes an overflow. [Figure: node v before and after inserting key 30]

18 Overflow and Split We handle an overflow at a 5-node v with a split operation: let v1 … v5 be the children of v and k1 … k4 be the keys of v; node v is replaced by nodes v' and v", where v' is a 3-node with keys k1 k2 and children v1 v2 v3, and v" is a 2-node with key k4 and children v4 v5; key k3 is inserted into the parent u of v (a new root may be created). The overflow may propagate to the parent node u. [Figure: before and after the split, with key k3 moving up into u]
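A sketch of the split itself, under the same illustrative MultiNode representation (keys list, children list); reattaching v' and v" and inserting k3 into the parent u is left to the caller:

    class MultiNode:
        def __init__(self, keys, children=None):
            self.keys = keys
            self.children = children or [None] * (len(keys) + 1)

    def split(v):
        # v is an overflowed 5-node: keys k1..k4 and children v1..v5
        k1, k2, k3, k4 = v.keys
        v1, v2, v3, v4, v5 = v.children
        v_prime = MultiNode([k1, k2], [v1, v2, v3])   # v': a 3-node
        v_second = MultiNode([k4], [v4, v5])          # v": a 2-node
        return v_prime, k3, v_second                  # k3 goes up into the parent u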

19 Analysis of Insertion Algorithm insertItem(k, o)
1. We search for key k to locate the insertion node v
2. We add the new item (k, o) at node v
3. while overflow(v)
     if isRoot(v)
       create a new empty root above v
     v ← split(v)
Let T be a (2,4) tree with n items. Tree T has O(log n) height. Step 1 takes O(log n) time because we visit O(log n) nodes. Step 2 takes O(1) time. Step 3 takes O(log n) time because each split takes O(1) time and we perform O(log n) splits. Thus, an insertion in a (2,4) tree takes O(log n) time.

20 Deletion We reduce deletion of an item to the case where the item is at a node with leaf children. Otherwise, we replace the item with its inorder successor (or, equivalently, with its inorder predecessor) and delete the latter item. Example: to delete key 24, we replace it with 27 (its inorder successor). [Figure: the tree before and after replacing 24 with 27]

21 Underflow and Fusion
Deleting an item from a node v may cause an underflow, where node v becomes a 1-node with one child and no keys. To handle an underflow at node v with parent u, we consider two cases. Case 1: the adjacent siblings of v are 2-nodes. Fusion operation: we merge v with an adjacent sibling w and move an item from u to the merged node v'. After a fusion, the underflow may propagate to the parent u. [Figure: before and after the fusion of v and its sibling w into v']
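A sketch of one symmetric variant of the fusion (v's right sibling w is a 2-node), assuming the illustrative MultiNode representation (keys list, children list) used in the earlier sketches; i is the index of v among the children of u:

    def fuse_with_right_sibling(u, i):
        # v = u.children[i] has underflowed (no keys, one child);
        # its right sibling w is a 2-node (one key, two children)
        v, w = u.children[i], u.children[i + 1]
        merged = MultiNode([u.keys[i]] + w.keys,       # move an item from u down ...
                           v.children + w.children)    # ... and merge v with w into v'
        del u.keys[i]
        u.children[i : i + 2] = [merged]               # v' replaces v and w under u
        # u has lost a key and a child, so the underflow may propagate to u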

22 Underflow and Transfer
To handle an underflow at node v with parent u, we consider two cases. Case 2: an adjacent sibling w of v is a 3-node or a 4-node. Transfer operation: 1. we move a child of w to v; 2. we move an item from u to v; 3. we move an item from w to u. After a transfer, no underflow occurs. [Figure: before and after the transfer among v, u, and w]
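A matching sketch of one variant of the transfer (w is v's right sibling), again on the illustrative MultiNode shape with i the index of v among u's children:

    def transfer_from_right_sibling(u, i):
        # v = u.children[i] has underflowed; its right sibling w is a 3-node or 4-node
        v, w = u.children[i], u.children[i + 1]
        v.children.append(w.children.pop(0))   # 1. move a child of w to v
        v.keys.append(u.keys[i])               # 2. move an item from u to v
        u.keys[i] = w.keys.pop(0)              # 3. move an item from w to u
        # v is now a 2-node, so no underflow remains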

23 Analysis of Deletion Let T be a (2,4) tree with n items
Tree T has O(log n) height. In a deletion operation, we visit O(log n) nodes to locate the node from which to delete the item, and we handle an underflow with a series of O(log n) fusions, followed by at most one transfer. Each fusion and transfer takes O(1) time. Thus, deleting an item from a (2,4) tree takes O(log n) time.

24 Implementing a Dictionary
Comparison of efficient dictionary implementations:
  Implementation   Search, Insert, Delete     Notes
  Hash Table       O(1) expected              no ordered dictionary methods; simple to implement
  Skip List        O(log n) high prob.        randomized insertion
  (2,4) Tree       O(log n) worst-case        complex to implement

25 Red-Black Trees [Title slide figure: a small red-black tree with root 6, children 3 and 8, and node 4 labeled z]

26 Outline and Reading From (2,4) trees to red-black trees (§3.3.3)
Definition; Height; Insertion (restructuring, recoloring); Deletion (adjustment)

27 From (2,4) to Red-Black Trees
A red-black tree is a representation of a (2,4) tree by means of a binary tree whose nodes are colored red or black. In comparison with its associated (2,4) tree, a red-black tree has the same logarithmic time performance and a simpler implementation, with a single node type. [Figure: a 3-node of a (2,4) tree has two equivalent red-black representations]

28 Red-Black Tree A red-black tree can also be defined as a binary search tree that satisfies the following properties. Root Property: the root is black. External Property: every leaf is black. Internal Property: the children of a red node are black. Depth Property: all the leaves have the same black depth. [Figure: an example red-black tree with root 9]
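A minimal node sketch with a color bit; None again plays the role of the external nodes, which are black by the External Property. The RBNode name and fields are illustrative:

    RED, BLACK = "red", "black"

    class RBNode:
        def __init__(self, key, color=RED, left=None, right=None, parent=None):
            self.key, self.color = key, color
            self.left, self.right, self.parent = left, right, parent

    def is_black(v):
        # external (None) nodes count as black
        return v is None or v.color == BLACK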

29 Height of a Red-Black Tree
Theorem: A red-black tree storing n items has height O(log n). Proof: the height of a red-black tree is at most twice the height of its associated (2,4) tree, which is O(log n). The search algorithm for a red-black tree is the same as that for an ordinary binary search tree, so by the above theorem, searching in a red-black tree takes O(log n) time.

30 Insertion To perform operation insertItem(k, o), we execute the insertion algorithm for binary search trees and color red the newly inserted node z, unless it is the root. We preserve the root, external, and depth properties. If the parent v of z is black, we also preserve the internal property and we are done. Else (v is red) we have a double red (i.e., a violation of the internal property), which requires a reorganization of the tree. Example where the insertion of 4 causes a double red: [Figure: inserting 4 as a red child of the red node 3 under root 6]

31 Remedying a Double Red Consider a double red with child z and parent v, and let w be the sibling of v. Case 1: w is black. The double red is an incorrect replacement of a 4-node; Restructuring: we change the 4-node replacement. Case 2: w is red. The double red corresponds to an overflow; Recoloring: we perform the equivalent of a split. [Figure: the two cases, distinguished by the color of w]

32 Restructuring A restructuring remedies a child-parent double red when the parent red node has a black sibling. It is equivalent to restoring the correct replacement of a 4-node. The internal property is restored and the other properties are preserved. [Figure: restructuring around z and v makes the middle key the black root of the subtree, with the other two keys as its red children]
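A sketch of this trinode restructuring on the illustrative RBNode shape from above (key, color, left, right, parent fields); sorting z, its parent, and its grandparent by key identifies the middle node, which becomes the black root of the restructured subtree. Reattaching the returned node in place of the old grandparent is left to the caller:

    RED, BLACK = "red", "black"

    def restructure(z):
        v, u = z.parent, z.parent.parent          # parent and grandparent of z
        a, b, c = sorted((z, v, u), key=lambda n: n.key)
        # the four subtrees hanging off z, v, u, gathered in left-to-right order
        t1, t2, t3, t4 = [t for n in (a, b, c) for t in (n.left, n.right)
                          if t not in (z, v, u)]
        a.left, a.right = t1, t2
        c.left, c.right = t3, t4
        b.left, b.right = a, c
        b.color, a.color, c.color = BLACK, RED, RED
        for child, parent in ((t1, a), (t2, a), (t3, c), (t4, c), (a, b), (c, b)):
            if child is not None:
                child.parent = parent
        return b                                  # new subtree root; caller relinks it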

33 Restructuring (cont.) There are four restructuring configurations, depending on whether the double red nodes are left or right children. [Figure: the four configurations on keys 2, 4, 6, all restructured into the same subtree with 4 as the black root and 2 and 6 as its red children]

34 Recoloring A recoloring remedies a child-parent double red when the parent red node has a red sibling. The parent v and its sibling w become black, and the grandparent u becomes red, unless it is the root. It is equivalent to performing a split on a 5-node. The double red violation may propagate to the grandparent u. [Figure: before and after the recoloring, and the corresponding (2,4) tree split]
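A small sketch of the recoloring step on the RBNode shape from above; it returns the grandparent, where a new double red may now have to be handled:

    RED, BLACK = "red", "black"

    def recolor(z):
        # z is the lower node of a double red whose parent v has a red sibling w
        v = z.parent
        u = v.parent                                  # grandparent of z
        w = u.left if v is u.right else u.right       # sibling of v
        v.color = w.color = BLACK                     # v and w become black
        if u.parent is not None:                      # u becomes red unless it is the root
            u.color = RED
        return u                                      # a double red may propagate here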

35 Analysis of Insertion Algorithm insertItem(k, o)
1. We search for key k to locate the insertion node z
2. We add the new item (k, o) at node z and color z red
3. while doubleRed(z)
     if isBlack(sibling(parent(z)))
       z ← restructure(z)
       return
     else { sibling(parent(z)) is red }
       z ← recolor(z)
Recall that a red-black tree has O(log n) height. Step 1 takes O(log n) time because we visit O(log n) nodes. Step 2 takes O(1) time. Step 3 takes O(log n) time because we perform O(log n) recolorings, each taking O(1) time, and at most one restructuring, taking O(1) time. Thus, an insertion in a red-black tree takes O(log n) time.

36 Deletion To perform operation remove(k), we first execute the deletion algorithm for binary search trees. Let v be the internal node removed, w the external node removed, and r the sibling of w. If either v or r was red, we color r black and we are done. Else (v and r were both black) we color r double black, which is a violation of the depth property requiring a reorganization of the tree. Example where the deletion of 8 causes a double black: [Figure: the tree before and after deleting 8, with r marked double black]

37 Remedying a Double Black
The algorithm for remedying a double black node w with sibling y considers three cases. Case 1: y is black and has a red child. We perform a restructuring, equivalent to a transfer, and we are done. Case 2: y is black and its children are both black. We perform a recoloring, equivalent to a fusion, which may propagate up the double black violation. Case 3: y is red. We perform an adjustment, equivalent to choosing a different representation of a 3-node, after which either Case 1 or Case 2 applies. Deletion in a red-black tree takes O(log n) time.

38 Red-Black Tree Reorganization
Insertion (remedy a double red):
  Red-black tree action    (2,4) tree action                   Result
  restructuring            change of 4-node representation     double red removed
  recoloring               split                               double red removed or propagated up

Deletion (remedy a double black):
  Red-black tree action    (2,4) tree action                   Result
  restructuring            transfer                            double black removed
  recoloring               fusion                              double black removed or propagated up
  adjustment               change of 3-node representation     restructuring or recoloring follows

