Towers of Hanoi: Move n (4) disks from pole A to pole B such that a larger disk is never put on a smaller disk.


Towers of Hanoi
Move n (4) disks from pole A to pole B such that a larger disk is never put on a smaller disk.
[Figure: the three poles A, B, and C]

Recursive solution (poles A, B, C). To move n (4) disks from A to B:
1. Move n-1 (3) disks from A to C
2. Move 1 disk from A to B
3. Move n-1 (3) disks from C to B

Figure 2.19a and b: a) the initial state; b) move n - 1 disks from A to C

Figure 2.19c and d: c) move one disk from A to B; d) move n - 1 disks from C to B

Hanoi towers

public static void solveTowers(int count, char source, char destination, char spare) {
    if (count == 1) {
        System.out.println("Move top disk from pole " + source + " to pole " + destination);
    } else {
        solveTowers(count-1, source, spare, destination);  // X
        solveTowers(1, source, destination, spare);        // Y
        solveTowers(count-1, spare, destination, source);  // Z
    } // end if
} // end solveTowers
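For example, calling solveTowers(3, 'A', 'B', 'C') prints the 2^3 - 1 = 7 moves that transfer three disks from pole A to pole B, using C as the spare pole.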

Recursion tree: The order of recursive calls that results from solveTowers(3,A,B,C)

Figures 2.21a-e: Box trace of solveTowers(3, 'A', 'B', 'C')

Cost of Hanoi Towers
How many moves are necessary to solve the Towers of Hanoi problem for N disks?
moves(1) = 1
moves(N) = moves(N-1) + moves(1) + moves(N-1), i.e. moves(N) = 2*moves(N-1) + 1
Guess the solution and show it is correct with mathematical induction!
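A brief sketch of that guess-and-verify argument (our addition, not on the original slide): guess moves(N) = 2^N - 1. It holds for the base case, moves(1) = 2^1 - 1 = 1, and if moves(N-1) = 2^(N-1) - 1 then

  moves(N) = 2*moves(N-1) + 1 = 2*(2^(N-1) - 1) + 1 = 2^N - 1,

so by induction solveTowers performs 2^N - 1 moves; for N = 4 that is 15 moves.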

Merge Sort
Presentation for use with the textbook Data Structures and Algorithms in Java, 6th edition, by M. T. Goodrich, R. Tamassia, and M. H. Goldwasser, Wiley, 2014

Divide-and-Conquer
Divide-and-conquer is a general algorithm design paradigm:
- Divide: divide the input data S into two disjoint subsets S1 and S2
- Recur: solve the subproblems associated with S1 and S2
- Conquer: combine the solutions for S1 and S2 into a solution for S
The base cases for the recursion are subproblems of size 0 or 1

Merge-Sort
Merge-sort on an input sequence S with n elements consists of three steps:
- Divide: partition S into two sequences S1 and S2 of about n/2 elements each
- Recur: recursively sort S1 and S2
- Conquer: merge S1 and S2 into a unique sorted sequence

Algorithm mergeSort(S)
  Input sequence S with n elements
  Output sequence S sorted according to C
  if S.size() > 1
    (S1, S2) ← partition(S, n/2)
    mergeSort(S1)
    mergeSort(S2)
    S ← merge(S1, S2)

Merging Two Sorted Sequences
The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B
Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list, takes O(n) time

Algorithm merge(A, B)
  Input sequences A and B with n/2 elements each
  Output sorted sequence of A ∪ B
  S ← empty sequence
  while ¬A.isEmpty() ∧ ¬B.isEmpty()
    if A.first().element() < B.first().element()
      S.addLast(A.remove(A.first()))
    else
      S.addLast(B.remove(B.first()))
  while ¬A.isEmpty()
    S.addLast(A.remove(A.first()))
  while ¬B.isEmpty()
    S.addLast(B.remove(B.first()))
  return S
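The pseudocode above translates directly to Java. Here is a minimal, self-contained array-based sketch (our illustration, with made-up class and method names, not the textbook's net.datastructures code):

import java.util.Arrays;

public class MergeSortDemo {
    // Sort a[lo..hi) by splitting in half, sorting each half, and merging.
    static void mergeSort(int[] a, int lo, int hi) {
        if (hi - lo <= 1) return;               // base case: 0 or 1 elements
        int mid = (lo + hi) / 2;
        mergeSort(a, lo, mid);                  // recur on the left half
        mergeSort(a, mid, hi);                  // recur on the right half
        merge(a, lo, mid, hi);                  // conquer: merge the two sorted halves
    }

    // Merge the sorted runs a[lo..mid) and a[mid..hi) into sorted order.
    static void merge(int[] a, int lo, int mid, int hi) {
        int[] tmp = new int[hi - lo];
        int i = lo, j = mid, t = 0;
        while (i < mid && j < hi)
            tmp[t++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i < mid) tmp[t++] = a[i++];
        while (j < hi)  tmp[t++] = a[j++];
        System.arraycopy(tmp, 0, a, lo, tmp.length);
    }

    public static void main(String[] args) {
        int[] a = {7, 2, 9, 4, 3, 8, 6, 1};     // the example sequence used in the slides
        mergeSort(a, 0, a.length);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 6, 7, 8, 9]
    }
}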

Merge-Sort Tree
An execution of merge-sort is depicted by a binary tree
- each node represents a recursive call of merge-sort and stores
  - the unsorted sequence before the execution and its partition
  - the sorted sequence at the end of the execution
- the root is the initial call
- the leaves are calls on subsequences of size 0 or 1
[Figure: example merge-sort tree]

Execution Example
The next slides step through merge-sort on the sequence 7 2 9 4 3 8 6 1, showing the merge-sort tree after each step:
1. Partition
2. Recursive call, partition
3. Recursive call, partition
4. Recursive call, base case
5. Recursive call, base case
6. Merge
7. Recursive call, …, base case, merge
8. Merge
9. Recursive call, …, merge, merge
10. Merge
[Figures: successive states of the merge-sort tree]

Analysis of Merge-Sort
The height h of the merge-sort tree is O(log n)
- at each recursive call we divide the sequence in half
The overall amount of work done at the nodes of depth i is O(n)
- we partition and merge 2^i sequences of size n/2^i
- we make 2^(i+1) recursive calls
Thus, the total running time of merge-sort is O(n log n)

depth | #seqs | size
  0   |   1   |  n
  1   |   2   |  n/2
  i   |  2^i  |  n/2^i
  …   |   …   |  …
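For completeness (our addition; the slide states only the result), the running time satisfies the divide-and-conquer recurrence

  T(n) = 2 T(n/2) + cn  for n > 1,   T(1) = c,

and unrolling it for n a power of 2 gives T(n) = cn + cn log2 n = O(n log n), matching the bound above.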

Quick-Sort
Presentation for use with the textbook Data Structures and Algorithms in Java, 6th edition, by M. T. Goodrich, R. Tamassia, and M. H. Goldwasser, Wiley, 2014

Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
- Divide: pick a random element x (called the pivot) and partition S into
  - L, the elements less than x
  - E, the elements equal to x
  - G, the elements greater than x
- Recur: sort L and G
- Conquer: join L, E and G
[Figure: S split into L, E, G around the pivot x]

Partition
We partition an input sequence as follows:
- we remove, in turn, each element y from S and
- we insert y into L, E or G, depending on the result of the comparison with the pivot x
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time
Thus, the partition step of quick-sort takes O(n) time

Algorithm partition(S, p)
  Input sequence S, position p of pivot
  Output subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, resp.
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.addLast(y)
    else if y = x
      E.addLast(y)
    else { y > x }
      G.addLast(y)
  return L, E, G
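The same partition-into-L/E/G idea, as a small self-contained Java sketch (again an illustration with invented names, not the textbook's implementation):

import java.util.*;

public class QuickSortDemo {
    static final Random RAND = new Random();

    // Sort a list by partitioning around a random pivot into L, E, G, then recurring on L and G.
    static List<Integer> quickSort(List<Integer> s) {
        if (s.size() <= 1) return s;                          // base case
        int x = s.get(RAND.nextInt(s.size()));                // random pivot
        List<Integer> l = new ArrayList<>(), e = new ArrayList<>(), g = new ArrayList<>();
        for (int y : s) {                                     // partition step: O(n)
            if (y < x) l.add(y);
            else if (y == x) e.add(y);
            else g.add(y);
        }
        List<Integer> result = new ArrayList<>(quickSort(l)); // recur on L
        result.addAll(e);                                     // pivot-equal elements are already in place
        result.addAll(quickSort(g));                          // recur on G
        return result;                                        // conquer: L + E + G
    }

    public static void main(String[] args) {
        System.out.println(quickSort(Arrays.asList(7, 4, 9, 6, 2)));  // [2, 4, 6, 7, 9]
    }
}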

Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree
- each node represents a recursive call of quick-sort and stores
  - the unsorted sequence before the execution and its pivot
  - the sorted sequence at the end of the execution
- the root is the initial call
- the leaves are calls on subsequences of size 0 or 1
[Figure: example quick-sort tree]

Execution Example
The next slides step through quick-sort on an example sequence, showing the quick-sort tree after each step:
1. Pivot selection
2. Partition, recursive call, pivot selection
3. Partition, recursive call, base case
4. Recursive call, …, base case, join
5. Recursive call, pivot selection
6. Partition, …, recursive call, base case
7. Join, join
[Figures: successive states of the quick-sort tree]

Worst-case Running Time
The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element
- one of L and G has size n - 1 and the other has size 0
- the running time is proportional to the sum n + (n - 1) + … + 2 + 1
Thus, the worst-case running time of quick-sort is O(n^2)

depth | time
  0   | n
  1   | n - 1
  …   | …
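Writing out that sum (our addition) makes the bound explicit:

  n + (n - 1) + … + 2 + 1 = n(n + 1)/2 = O(n^2).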

Expected Running Time
Consider a recursive call of quick-sort on a sequence of size s
- Good call: the sizes of L and G are each less than 3s/4
- Bad call: one of L and G has size greater than 3s/4
A call is good with probability 1/2
- 1/2 of the possible pivots cause good calls
[Figure: good pivots vs. bad pivots]

Expected Running Time, Part 2
Probabilistic fact: the expected number of coin tosses required in order to get k heads is 2k
For a node of depth i, we expect
- i/2 ancestors are good calls
- the size of the input sequence for the current call is at most (3/4)^(i/2) n
Therefore, we have
- for a node of depth 2 log_{4/3} n, the expected input size is one
- the expected height of the quick-sort tree is O(log n)
The amount of work done at the nodes of the same depth is O(n)
Thus, the expected running time of quick-sort is O(n log n)
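The depth bound 2 log_{4/3} n on this slide comes from solving for the depth i at which the expected input size reaches one (a short derivation, not spelled out on the slide):

  (3/4)^(i/2) n <= 1  ⟺  (4/3)^(i/2) >= n  ⟺  i/2 >= log_{4/3} n  ⟺  i >= 2 log_{4/3} n.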

In-Place Quick-Sort
Quick-sort can be implemented to run in-place
In the partition step, we use replace operations to rearrange the elements of the input sequence such that
- the elements less than the pivot have rank less than h
- the elements equal to the pivot have rank between h and k
- the elements greater than the pivot have rank greater than k
The recursive calls consider
- elements with rank less than h
- elements with rank greater than k

Algorithm inPlaceQuickSort(S, l, r)
  Input sequence S, ranks l and r
  Output sequence S with the elements of rank between l and r rearranged in increasing order
  if l ≥ r
    return
  i ← a random integer between l and r
  x ← S.elemAtRank(i)
  (h, k) ← inPlacePartition(x)
  inPlaceQuickSort(S, l, h - 1)
  inPlaceQuickSort(S, k + 1, r)

In-Place Partitioning
Perform the partition using two indices to split S into L and E ∪ G (a similar method can split E ∪ G into E and G).
Repeat until j and k cross:
- Scan j to the right until finding an element ≥ x.
- Scan k to the left until finding an element < x.
- Swap the elements at indices j and k.
[Figure: indices j and k scanning toward each other, pivot = 6]
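A compact Java sketch of this two-index, in-place scheme combined with the recursive calls above (our illustration, not the textbook's code; for simplicity it uses the last element as the pivot instead of a random one, and all names are invented):

import java.util.Arrays;

public class InPlaceQuickSortDemo {
    // Sort a[lo..hi] in place, partitioning around the last element as pivot.
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;                        // 0 or 1 elements: already sorted
        int x = a[hi];                               // pivot (a random index could be swapped to hi first)
        int j = lo, k = hi - 1;
        while (j <= k) {
            while (j <= k && a[j] < x) j++;          // scan j right until an element >= x
            while (j <= k && a[k] >= x) k--;         // scan k left until an element < x
            if (j < k) { int t = a[j]; a[j] = a[k]; a[k] = t; }  // swap the out-of-place pair
        }
        int t = a[j]; a[j] = a[hi]; a[hi] = t;       // put the pivot between the two regions
        quickSort(a, lo, j - 1);                     // recur on elements < x
        quickSort(a, j + 1, hi);                     // recur on elements >= x
    }

    public static void main(String[] args) {
        int[] a = {7, 2, 9, 4, 3, 8, 6, 1};          // example input
        quickSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));      // [1, 2, 3, 4, 6, 7, 8, 9]
    }
}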