Presentation on theme: "AVL Trees (Part-F2)" — Presentation transcript:

1 Part-F2 AVL Trees

2 AVL Tree Definition (§ 9.2) AVL trees are balanced. An AVL tree is a binary search tree T such that, for every internal node v of T, the heights of the children of v differ by at most 1. An example of an AVL tree, with the heights shown next to the nodes:

3 Balanced nodes An internal node is balanced if the heights of its two children differ by at most 1; otherwise, it is unbalanced.
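The balance test above can be sketched directly. This is a minimal illustration, not the slides' code; the `Node` class and the convention that an empty tree has height 0 are assumptions:

```python
class Node:
    """A simple binary tree node (an assumed helper, not from the slides)."""
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def height(node):
    """Height of a (sub)tree; an empty tree has height 0."""
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

def is_balanced(node):
    """A node is balanced if its children's heights differ by at most 1."""
    return abs(height(node.left) - height(node.right)) <= 1
```

For example, a node whose left subtree has height 2 and whose right subtree has height 0 fails the test.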

4 Height of an AVL Tree Fact: The height of an AVL tree storing n keys is O(log n). Proof: Let n(h) denote the minimum number of internal nodes of an AVL tree of height h. We easily see that n(1) = 1 and n(2) = 2. For h > 2, a minimal AVL tree of height h contains the root node, one AVL subtree of height h-1 and another of height h-2. That is, n(h) = 1 + n(h-1) + n(h-2). Since n(h-1) > n(h-2), we get n(h) > 2n(h-2). So n(h) > 2n(h-2), n(h) > 4n(h-4), n(h) > 8n(h-6), …, and by induction n(h) > 2^i n(h-2i). Choosing i so that h-2i reaches the base case gives n(h) > 2^(⌈h/2⌉-1) n(1) = 2^(⌈h/2⌉-1) ≥ 2^(h/2-1). Taking logarithms: h < 2 log n(h) + 2. Thus the height of an AVL tree storing n keys is O(log n).
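The recurrence and the bound from the proof can be checked numerically. A small sketch (the function name `min_nodes` is an assumption):

```python
import math

def min_nodes(h):
    """Minimum number of internal nodes in an AVL tree of height h,
    following the recurrence n(h) = 1 + n(h-1) + n(h-2)."""
    if h == 1:
        return 1
    if h == 2:
        return 2
    return 1 + min_nodes(h - 1) + min_nodes(h - 2)

# Check the proof's bounds: n(h) > 2**(h/2 - 1), hence h < 2*log2(n(h)) + 2.
for h in range(1, 20):
    n = min_nodes(h)
    assert n > 2 ** (h / 2 - 1)
    assert h < 2 * math.log2(n) + 2
```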

5 Insertion in an AVL Tree Insertion is as in a binary search tree: it is always done by expanding an external node. Example (before insertion and after insertion, with nodes labeled a=y, b=x, c=z and the new node w): after the insertion the tree is no longer balanced.

6 Names of important nodes w: the newly inserted node (the insertion itself follows the binary search tree method). The heights of some nodes in T might increase after inserting a node; those nodes must be on the path from w to the root, and no other nodes are affected. z: the first unbalanced node we encounter in going up from w toward the root. y: the child of z with the larger height. y must be an ancestor of w (why? because z is unbalanced only after inserting w). x: the child of y with the larger height. The height of the sibling of x is smaller than that of x (otherwise the height of y could not have increased), and x must be an ancestor of w. See the figure in the last slide.

7 Algorithm restructure(x): Input: a node x of a binary search tree T that has both a parent y and a grandparent z. Output: the tree T after a trinode restructuring. 1. Let (a, b, c) be the list (in increasing order) of the nodes x, y, and z, and let T0, T1, T2, T3 be a left-to-right (inorder) listing of the four subtrees of x, y, and z not rooted at x, y, or z. 2. Replace the subtree rooted at z with a new subtree rooted at b. 3. Let a be the left child of b, and let T0 and T1 be the left and right subtrees of a, respectively. 4. Let c be the right child of b, and let T2 and T3 be the left and right subtrees of c, respectively.
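Steps 1-4 can be sketched by handling the four possible shapes of the chain z-y-x explicitly. This is a minimal illustration on bare nodes; the `Node` class is an assumption, and real AVL code would also reattach b to z's old parent and update heights:

```python
class Node:
    """A simple binary tree node (an assumed helper, not from the slides)."""
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def restructure(x, y, z):
    """Trinode restructuring: return b, the new root of z's subtree."""
    if z.left is y and y.left is x:        # left-left:   (a, b, c) = (x, y, z)
        a, b, c = x, y, z
        t0, t1, t2, t3 = x.left, x.right, y.right, z.right
    elif z.right is y and y.right is x:    # right-right: (a, b, c) = (z, y, x)
        a, b, c = z, y, x
        t0, t1, t2, t3 = z.left, y.left, x.left, x.right
    elif z.left is y and y.right is x:     # left-right:  (a, b, c) = (y, x, z)
        a, b, c = y, x, z
        t0, t1, t2, t3 = y.left, x.left, x.right, z.right
    else:                                  # right-left:  (a, b, c) = (z, x, y)
        a, b, c = z, x, y
        t0, t1, t2, t3 = z.left, x.left, x.right, y.right
    # Steps 2-4: b becomes the root; a and c take T0..T3 as children.
    b.left, b.right = a, c
    a.left, a.right = t0, t1
    c.left, c.right = t2, t3
    return b
```

The first two cases correspond to the single rotations on the next slide, the last two to the double rotations.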

8 Restructuring (as Single Rotations) [figure: the two single-rotation cases]

9 Restructuring (as Double Rotations) [figure: the two double-rotation cases]

10 Insertion Example, continued [figure: the unbalanced subtree of z, with nodes x, y, z and subtrees T0-T3, becomes balanced after restructuring]

11 Theorem: After an insertion, one restructure operation is enough to ensure that the whole tree is balanced. Proof: Left to the reader. (Hint: after the restructuring, the subtree rooted at b has the same height the subtree rooted at z had before the insertion, so all ancestors are balanced again.)

12 Removal in an AVL Tree Removal begins as in a binary search tree, which means the node removed will become an empty external node. Its parent, w, may cause an imbalance. Example: before deletion of 32, and after deletion.

13 Rebalancing after a Removal Let z be the first unbalanced node encountered while travelling up the tree from w, and let y be the child of z with the larger height. Let x be the child of y defined as follows: if one of the children of y is taller than the other, choose x as the taller child of y; if both children of y have the same height, select x as the child of y on the same side as y (i.e., if y is the left child of z, then x is the left child of y; and if y is the right child of z, then x is the right child of y).
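The rule for choosing x can be sketched as follows. `Node` and `height` are assumed helpers (not from the slides), and `y` is assumed to be the taller child of `z`:

```python
class Node:
    """A simple binary tree node (an assumed helper, not from the slides)."""
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def height(node):
    """Height of a (sub)tree; an empty tree has height 0."""
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

def choose_x(z, y):
    """Pick x among y's children, per the removal-rebalancing rule."""
    hl, hr = height(y.left), height(y.right)
    if hl > hr:
        return y.left      # one child is taller: take it
    if hr > hl:
        return y.right
    # Equal heights: take the child of y on the same side as y is of z.
    return y.left if z.left is y else y.right
```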

14 Rebalancing after a Removal We perform restructure(x) to restore balance at z. As this restructuring may upset the balance of another node higher in the tree, we must continue checking for balance until the root of T is reached.

15 Unbalanced after restructuring [figure: restructuring at a=z (with b=y, c=x) restores balance locally, but a node higher in the tree becomes unbalanced]

16 Rebalancing after a Removal We perform restructure(x) to restore balance at z. As this restructuring may upset the balance of another node higher in the tree, we must continue checking for balance until the root of T is reached.

17 Running Times for AVL Trees A single restructure is O(1) using a linked-structure binary tree. find is O(log n): the height of the tree is O(log n), and no restructures are needed. insert is O(log n): the initial find is O(log n), and restructuring up the tree while maintaining heights is O(log n). remove is O(log n): the initial find is O(log n), and restructuring up the tree while maintaining heights is O(log n).

18 Part-G1 Merge Sort

19 Divide-and-Conquer (§ ) Divide-and-conquer is a general algorithm design paradigm: Divide: divide the input data S into two disjoint subsets S1 and S2. Recur: solve the subproblems associated with S1 and S2. Conquer: combine the solutions for S1 and S2 into a solution for S. The base cases for the recursion are subproblems of size 0 or 1. Merge-sort is a sorting algorithm based on the divide-and-conquer paradigm. Like heap-sort, it uses a comparator and has O(n log n) running time. Unlike heap-sort, it does not use an auxiliary priority queue, and it accesses data in a sequential manner (suitable for sorting data on a disk).

20 Merge-Sort (§ 10.1) Merge-sort on an input sequence S with n elements consists of three steps: Divide: partition S into two sequences S1 and S2 of about n/2 elements each. Recur: recursively sort S1 and S2. Conquer: merge S1 and S2 into a unique sorted sequence. Algorithm mergeSort(S, C): Input: sequence S with n elements, comparator C. Output: sequence S sorted according to C. if S.size() > 1 then (S1, S2) ← partition(S, n/2); mergeSort(S1, C); mergeSort(S2, C); S ← merge(S1, S2).
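The three steps above can be sketched on Python lists. This is an illustration, not the slides' code: list slicing stands in for partition, and Python's built-in `<` stands in for the comparator C:

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list."""
    out = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])   # one of these is empty; append the leftovers
    out.extend(b[j:])
    return out

def merge_sort(s):
    if len(s) <= 1:                     # base case: size 0 or 1
        return s
    mid = len(s) // 2                   # divide
    return merge(merge_sort(s[:mid]),   # recur, then conquer
                 merge_sort(s[mid:]))
```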

21 Merging Two Sorted Sequences The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B. Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list, takes O(n) time. Algorithm merge(A, B): Input: sequences A and B with n/2 elements each. Output: sorted sequence of A ∪ B. S ← empty sequence; while ¬A.isEmpty() ∧ ¬B.isEmpty(): if A.first().element() < B.first().element() then S.insertLast(A.remove(A.first())) else S.insertLast(B.remove(B.first())); while ¬A.isEmpty(): S.insertLast(A.remove(A.first())); while ¬B.isEmpty(): S.insertLast(B.remove(B.first())); return S.
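A sketch of the merge pseudocode above, with `collections.deque` standing in for the doubly linked lists (an assumption of this illustration): removing the first element and appending at the end are both O(1), so the whole merge is O(n):

```python
from collections import deque

def merge(a, b):
    """Merge two sorted deques into one sorted deque, consuming both."""
    s = deque()
    while a and b:                 # while neither A nor B is empty
        if a[0] < b[0]:            # compare the first elements
            s.append(a.popleft())
        else:
            s.append(b.popleft())
    while a:                       # drain whatever remains of A
        s.append(a.popleft())
    while b:                       # drain whatever remains of B
        s.append(b.popleft())
    return s
```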

22 Merge-Sort Tree An execution of merge-sort is depicted by a binary tree: each node represents a recursive call of merge-sort and stores the unsorted sequence before the execution and its partition, and the sorted sequence at the end of the execution; the root is the initial call; the leaves are calls on subsequences of size 0 or 1.

23 Execution Example Partition [figures on slides 23-32 trace merge-sort on the sequence 7 2 9 4 3 8 6 1]

24 Execution Example (cont.) Recursive call, partition

25 Execution Example (cont.) Recursive call, partition

26 Execution Example (cont.) Recursive call, base case

27 Execution Example (cont.) Recursive call, base case

28 Execution Example (cont.) Merge

29 Execution Example (cont.) Recursive call, …, base case, merge

30 Execution Example (cont.) Merge

31 Execution Example (cont.) Recursive call, …, merge, merge

32 Execution Example (cont.) Merge

33 Analysis of Merge-Sort The height h of the merge-sort tree is O(log n): at each recursive call we divide the sequence in half. The overall amount of work done at the nodes of depth i is O(n): we partition and merge 2^i sequences of size n/2^i, and we make 2^(i+1) recursive calls. Thus, the total running time of merge-sort is O(n log n).

depth  #seqs  size
0      1      n
1      2      n/2
i      2^i    n/2^i
…      …      …
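The depth table can be checked numerically: at every level the sequence sizes sum back to n, so each of the O(log n) levels does O(n) work. A small sketch (the choice n = 1024 is arbitrary):

```python
import math

n = 1024                              # any power of two keeps the sizes integral
levels = int(math.log2(n)) + 1        # depths 0 .. log2(n)
for i in range(levels):
    num_seqs = 2 ** i                 # #seqs at depth i
    seq_size = n // 2 ** i            # size of each sequence at depth i
    assert num_seqs * seq_size == n   # total work per level stays n
```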

34 Summary of Sorting Algorithms

Algorithm       Time        Notes
selection-sort  O(n^2)      slow, in-place; for small data sets (< 1K)
insertion-sort  O(n^2)      slow, in-place; for small data sets (< 1K)
heap-sort       O(n log n)  fast, in-place; for large data sets (1K — 1M)
merge-sort      O(n log n)  fast, sequential data access; for huge data sets (> 1M)

