
1 CS203 Lecture 15

2 Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
Divide: pick a random element x (called the pivot) and partition S into
  L: elements less than x
  E: elements equal to x
  G: elements greater than x
Recur: sort L and G
Conquer: join L, E, and G

3 Partition
We partition an input sequence as follows: we remove, in turn, each element y from S and insert y into L, E, or G, depending on the result of the comparison with the pivot x.
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time. Thus, the partition step of quick-sort takes O(n) time.

Algorithm partition(S, p)
  Input: sequence S, position p of the pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, resp.
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.addLast(y)
    else if y = x
      E.addLast(y)
    else { y > x }
      G.addLast(y)
  return L, E, G

4 Java Implementation
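The slide's code image is not reproduced in this transcript. As a rough sketch (the class and method names here are my own, not necessarily the lecture's), a list-based quick-sort following the L/E/G partition described above might look like:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class ListQuickSort {
    private static final Random RAND = new Random();

    public static List<Integer> sort(List<Integer> s) {
        if (s.size() <= 1) return s;  // size 0 or 1: already sorted
        // Divide: pick a random pivot and partition into L, E, G
        int x = s.get(RAND.nextInt(s.size()));
        List<Integer> l = new ArrayList<>();
        List<Integer> e = new ArrayList<>();
        List<Integer> g = new ArrayList<>();
        for (int y : s) {
            if (y < x) l.add(y);
            else if (y == x) e.add(y);
            else g.add(y);
        }
        // Recur on L and G, then Conquer: join L, E, G in order
        List<Integer> result = new ArrayList<>(sort(l));
        result.addAll(e);
        result.addAll(sort(g));
        return result;
    }
}
```

This version trades extra space for simplicity; the in-place variant on the later slides avoids the auxiliary lists.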

5 Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree. Each node represents a recursive call of quick-sort and stores:
the unsorted sequence before the execution and its pivot
the sorted sequence at the end of the execution
The root is the initial call; the leaves are calls on subsequences of size 0 or 1.
[Figure: example quick-sort tree with nodes such as 4 2 → 2 4, 7 9 → 7 9, 2 → 2, 9 → 9]

6 Worst-case Running Time
The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element. One of L and G then has size n - 1 and the other has size 0. The running time is proportional to the sum n + (n - 1) + ... + 2 + 1. Thus, the worst-case running time of quick-sort is O(n^2).
[Figure: at depth 0 the work is n, at depth 1 it is n - 1, and so on down to 1]

7 Expected Running Time
Consider a recursive call of quick-sort on a sequence of size s.
Good call: the sizes of L and G are close to equal.
Bad call: one of L and G is much larger than the other.
[Figure: examples of a good call and a bad call; pivots near the middle of the sorted order are good, pivots near either end are bad]

8 Expected Running Time, Part 2
Best case: the pivots are all perfect; the height of the tree is ⌈log n⌉.
Typical case: some of the pivots are good, some are bad; the height is still O(log n), with a different base.
Worst case: all the pivots are bad; the height of the tree is O(n).
The amount of work done at the nodes of any one depth is O(n), because together the nodes at one depth process at most all n elements.
Thus, the best-case and expected running time of quick-sort is O(n log n); the worst case is O(n^2).

9 In-Place Quick-Sort
Quick-sort can be implemented to run in place. In the partition step, we use swap operations to rearrange the elements of the input sequence such that:
the elements less than the pivot have rank less than h
the elements equal to the pivot have rank between h and k
the elements greater than the pivot have rank greater than k
The recursive calls consider:
elements with rank less than h
elements with rank greater than k

Algorithm inPlaceQuickSort(S, l, r)
  Input: sequence S, ranks l and r
  Output: sequence S with the elements of rank between l and r rearranged in increasing order
  if l ≥ r return
  i ← a random integer between l and r
  x ← S.elemAtRank(i)
  (h, k) ← inPlacePartition(x)
  inPlaceQuickSort(S, l, h - 1)
  inPlaceQuickSort(S, k + 1, r)

10 In-Place Partitioning
Perform the partition using two indices to split S into L and E ∪ G (a similar method can split E ∪ G into E and G).
Repeat until j and k cross:
Scan j to the right until finding an element ≥ x.
Scan k to the left until finding an element < x.
Swap the elements at indices j and k.
[Figure: example scan with pivot = 6, showing the positions of j and k before and after a swap]

11 Java Implementation
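Again, the slide's code image is missing from the transcript. A hedged sketch of the in-place version, using a two-index scan in the spirit of the previous slide (names are mine; this variant moves the pivot to the right end first, a common simplification):

```java
import java.util.Random;

public class InPlaceQuickSort {
    private static final Random RAND = new Random();

    public static void sort(int[] s) {
        quickSort(s, 0, s.length - 1);
    }

    private static void quickSort(int[] s, int l, int r) {
        if (l >= r) return;
        // Pick a random pivot and park it at the right end
        int i = l + RAND.nextInt(r - l + 1);
        swap(s, i, r);
        int x = s[r];
        // Two-index scan: j moves right, k moves left, until they cross
        int j = l, k = r - 1;
        while (j <= k) {
            while (j <= k && s[j] < x) j++;  // scan j right to an element >= x
            while (j <= k && s[k] > x) k--;  // scan k left to an element <= x
            if (j <= k) swap(s, j++, k--);
        }
        swap(s, j, r);  // move the pivot into its final position
        quickSort(s, l, j - 1);
        quickSort(s, j + 1, r);
    }

    private static void swap(int[] s, int a, int b) {
        int t = s[a]; s[a] = s[b]; s[b] = t;
    }
}
```

Unlike the list-based version, this uses O(1) extra space per call; only the recursion stack grows.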

12 Merge Sort
Merge Sort is a recursive divide-and-conquer algorithm. The recursive method splits the list repeatedly until it is made up of single-element sublists, then merges them as the recursion unwinds:

mergeSort(list):
  firstHalf = mergeSort(firstHalf);
  secondHalf = mergeSort(secondHalf);
  list = merge(firstHalf, secondHalf);

merge:
  add the lesser of firstHalf[0] and secondHalf[0] to the new, larger list
  repeat until one of the (already sorted) sublists is exhausted
  add the rest of the remaining sublist to the larger list
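The outline above can be sketched as runnable Java (a minimal version on int arrays; class and method names are my own):

```java
import java.util.Arrays;

public class MergeSortDemo {
    public static int[] mergeSort(int[] list) {
        if (list.length <= 1) return list;  // single-element lists are already sorted
        // Split the list and merge sort each half recursively
        int mid = list.length / 2;
        int[] firstHalf = mergeSort(Arrays.copyOfRange(list, 0, mid));
        int[] secondHalf = mergeSort(Arrays.copyOfRange(list, mid, list.length));
        return merge(firstHalf, secondHalf);
    }

    // Merge: repeatedly move the lesser head element to the output
    // until one sublist is exhausted, then append the rest of the other
    private static int[] merge(int[] a, int[] b) {
        int[] out = new int[a.length + b.length];
        int i = 0, j = 0, k = 0;
        while (i < a.length && j < b.length)
            out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
        while (i < a.length) out[k++] = a[i++];
        while (j < b.length) out[k++] = b[j++];
        return out;
    }
}
```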

13 Merge Sort

14 Merge Sort Time
Assume n is a power of 2. This assumption makes the math simpler; if n is not a power of 2, the difference is irrelevant to the order of complexity.
Merge sort splits the list into two sublists, sorts the sublists using the same algorithm recursively, and then merges them. Each recursive call merge sorts half the list, so the depth of the recursion is the number of times you need to halve n to get lists of size 1, i.e., log n. The single-item lists are, obviously, sorted.
Merge Sort reassembles the list in log n steps, just as it broke the list down. The total size of all the sublists at one level of recursion is n, the original size of the unsorted list. Merging all the sublists at one level therefore takes at most n - 1 comparisons and n copies of elements from the sublists to the merged lists. The total merge time per level is 2n - 1, which is O(n). This O(n) merging happens log n times as the full sorted list is built.
Thus, Merge Sort is O(n log n).

15 Bucket Sort
All sort algorithms discussed so far are general sorting algorithms that work for any type of keys (e.g., integers, strings, and any comparable objects). These algorithms sort the elements by comparing their keys. The lower bound for general sorting algorithms is O(n log n), so no sorting algorithm based on comparisons can perform better than O(n log n).
However, if the keys are small integers, you can use bucket sort without having to compare the keys.

16 Bucket Sort
The bucket sort algorithm works as follows. Assume the keys are in the range from 0 to N-1. We need N buckets labeled 0, 1, ..., N-1. If an element's key is i, the element is put into bucket i. Each bucket holds the elements with the same key value. You can use an ArrayList to implement a bucket.
Bucket Sort is O(n) when the key range N is proportional to n. Searching: finding the right bucket is O(1); searching the list in one bucket is O(1) in the best case and O(n) in the worst case.
Note the similarity to a hash map with a very simple hash function.
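The ArrayList-of-buckets idea above can be sketched as follows (a minimal version assuming integer keys in 0..N-1, with names of my own choosing):

```java
import java.util.ArrayList;
import java.util.List;

public class BucketSortDemo {
    // Sort keys known to lie in the range 0..n-1 without any comparisons
    public static List<Integer> bucketSort(List<Integer> keys, int n) {
        // One ArrayList per possible key value, labeled 0..n-1
        List<List<Integer>> buckets = new ArrayList<>();
        for (int i = 0; i < n; i++) buckets.add(new ArrayList<>());
        // Drop each element into the bucket labeled by its key: O(1) each
        for (int key : keys) buckets.get(key).add(key);
        // Concatenate the buckets in label order to get the sorted result
        List<Integer> sorted = new ArrayList<>();
        for (List<Integer> bucket : buckets) sorted.addAll(bucket);
        return sorted;
    }
}
```

In a realistic use the buckets would hold whole elements keyed by a small integer field, not just the keys themselves; this sketch shows only the bucketing mechanics.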

17 Strategy Pattern
The Strategy pattern provides interchangeable algorithms for solving some problem by encapsulating them in objects. In Java, Strategy is implemented using a Java interface with multiple implementations; each implementing class can use a different algorithm.
A Strategy pattern can be set up so that the particular algorithm can be chosen at runtime based on characteristics of the actual data or other considerations. Arrays.sort() implements the Strategy pattern; it chooses the particular type of sort to use, and this is transparent to client code. An application that can save files in different file formats might use Strategy, deciding at runtime which implementation to use.
Several of my labs have been designed to give you a sense of the type of situation in which you would use Strategy.

18 Strategy Pattern

19 Strategy Pattern
Source:

20 Strategy Pattern
When a pattern has conventional terms for the classes and interfaces, use them in your code. Other developers will understand much of how your code works as soon as they see that you are using the pattern. For example, in Strategy, use the word Strategy in the name of the interface, e.g., PaymentStrategy, and in the names of the implementations, e.g., CreditCardStrategy.
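A minimal Strategy sketch using the payment naming from the slide. PaymentStrategy and CreditCardStrategy are the slide's names; PayPalStrategy, the Checkout class, and the method signatures are my own illustration:

```java
// The Strategy interface: one interchangeable algorithm per implementation
interface PaymentStrategy {
    String pay(double amount);
}

class CreditCardStrategy implements PaymentStrategy {
    public String pay(double amount) {
        return "Charged " + amount + " to credit card";
    }
}

class PayPalStrategy implements PaymentStrategy {
    public String pay(double amount) {
        return "Paid " + amount + " via PayPal";
    }
}

class Checkout {
    private final PaymentStrategy strategy;  // algorithm chosen at runtime

    Checkout(PaymentStrategy strategy) { this.strategy = strategy; }

    // Client code depends only on the interface, not on any
    // concrete payment algorithm
    String process(double amount) { return strategy.pay(amount); }
}
```

Swapping algorithms means passing a different PaymentStrategy to Checkout; Checkout itself never changes.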

