Sorting Recap & More about Sorting


1 Sorting Recap & More about Sorting
[G], [C], and [L] 

2 Bubble sort Details in [L] page 100

3 Bubble sort Details in [L] page 101
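A minimal bubble sort sketch to go with the recap (the class name and the early-exit flag are illustrative, not from [L]):

```java
/** Bubble sort: repeatedly swap adjacent out-of-order neighbors;
    after pass p the largest p elements are in their final places. */
class BubbleSortDemo {
    static void sort(int[] a) {
        for (int pass = a.length - 1; pass > 0; pass--) {
            boolean swapped = false;
            for (int i = 0; i < pass; i++) {
                if (a[i] > a[i + 1]) {            // out of order: swap
                    int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                    swapped = true;
                }
            }
            if (!swapped) break;                   // no swaps: already sorted
        }
    }
}
```

The early-exit flag makes the best case O(n) on already-sorted input.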

4 Selection sort Details in [L] page 99

5 Selection sort Details in [L] page 99
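A minimal selection sort sketch for the recap (class and variable names are illustrative):

```java
/** Selection sort: on each pass, find the smallest element of the
    unsorted suffix and swap it into the next fill position. */
class SelectionSortDemo {
    static void sort(int[] a) {
        for (int fill = 0; fill < a.length - 1; fill++) {
            int min = fill;
            for (int i = fill + 1; i < a.length; i++)
                if (a[i] < a[min]) min = i;        // remember smallest so far
            int t = a[fill]; a[fill] = a[min]; a[min] = t;
        }
    }
}
```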

6 Insertion sort Details in [L] page 134

7 Insertion sort Details in [L] page 135
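A minimal insertion sort sketch for the recap (names are illustrative); Shell sort below builds directly on this loop:

```java
/** Insertion sort: grow a sorted prefix by inserting the next
    element into place, shifting larger elements one slot right. */
class InsertionSortDemo {
    static void sort(int[] a) {
        for (int next = 1; next < a.length; next++) {
            int val = a[next], j = next;
            while (j > 0 && a[j - 1] > val) {      // shift larger elements right
                a[j] = a[j - 1];
                j--;
            }
            a[j] = val;                             // drop val into the gap
        }
    }
}
```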

8 New: Shell's sort
Divide the array into smaller subarrays whose elements are "gap" apart
Sort the subarrays
Decrease "gap"
Sort again (each pass is an "easier" sort)

9 Shell Sort: A Better Insertion Sort
A Shell sort is a type of insertion sort, but with O(n^(3/2)) or better performance, compared with the O(n^2) sorts
It is named after its discoverer, Donald Shell
Shell's sort can be thought of as a divide-and-conquer approach to insertion sort
Instead of sorting the entire array at once, Shell sort sorts many smaller subarrays using insertion sort before sorting the entire array

10 Subarrays gap = 5

11 Subarrays gap = 2

12–75 Trace of Shell Sort
The original slides animate the trace on a 16-element array a[0..15] holding 40, 35, 80, 75, 60, 90, 70, 55, …, 85, 34, 45, 62, 57, 65. The trace proceeds in three rounds:
gap value 7: the seven subarrays whose elements are 7 apart ({a[0], a[7], a[14]}, {a[1], a[8], a[15]}, …, {a[6], a[13]}) are insertion-sorted one at a time, subarray 1 through subarray 7
gap value 3: the three subarrays whose elements are 3 apart are insertion-sorted the same way
gap value 1: a regular insertion sort finishes the job; because the earlier rounds left the array nearly sorted, each element moves only a short distance
(Figure: array snapshot after each subarray sort at each gap value)

76 Shell Sort Algorithm
1. Set the initial value of gap to n / 2
2. while gap > 0
3.   for each array element from position gap to the last element
4.     Insert this element where it belongs in its subarray
5.   if gap is 2, set it to 1
6.   else gap = gap / 2.2   // chosen by experimentation
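The pseudocode above can be sketched in Java as follows (class and variable names are illustrative; the gap / 2.2 step follows the slide):

```java
/** Shell sort following the pseudocode: start with gap = n/2,
    insertion-sort each gap-apart subarray, then shrink gap by 2.2
    (setting gap to 1 when it reaches 2, so a final pass is a
    regular insertion sort). */
class ShellSortDemo {
    static void sort(int[] a) {
        int gap = a.length / 2;
        while (gap > 0) {
            for (int next = gap; next < a.length; next++) {
                int val = a[next], j = next;       // insert a[next] into its subarray
                while (j >= gap && a[j - gap] > val) {
                    a[j] = a[j - gap];             // shift gap-apart elements right
                    j -= gap;
                }
                a[j] = val;
            }
            gap = (gap == 2) ? 1 : (int) (gap / 2.2);
        }
    }
}
```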

77 Analysis of Shell Sort
Because insertion sort's behavior is closer to O(n) than O(n^2) when an array is nearly sorted, presorting the subarrays speeds up the later passes
This matters most when sorting large arrays, where the O(n^2) cost becomes significant

78 Analysis of Shell Sort (cont.)
A general analysis of Shell sort is an open research problem in computer science
Performance depends on how the decreasing sequence of gap values is chosen
If successive powers of 2 are used for gap, performance is O(n^2)
If successive gap values are taken from Hibbard's sequence, 2^k − 1 (i.e., …, 31, 15, 7, 3, 1), the performance can be proven to be O(n^(3/2))
Other sequences give similar or better performance

79 Analysis of Shell Sort (cont.)

80 Example Shell Sort
void shell_sort(int A[], int size) {
    int i, j, incrmnt, temp;
    incrmnt = size / 2;
    while (incrmnt > 0) {
        for (i = incrmnt; i < size; i++) {
            j = i;
            temp = A[i];
            while ((j >= incrmnt) && (A[j - incrmnt] > temp)) {
                A[j] = A[j - incrmnt];
                j = j - incrmnt;
            }
            A[j] = temp;
        }
        incrmnt /= 2;
    }
}

81 Merge Sort
(Figure: merge-sort tree for 7 2 9 4 — 7 2 → 2 7, 9 4 → 4 9, merged to 2 4 7 9)

82 Divide-and-Conquer
Divide-and-conquer is a general algorithm design paradigm:
Divide: divide the input data S into two disjoint subsets S1 and S2
Recur: solve the subproblems associated with S1 and S2
Conquer: combine the solutions for S1 and S2 into a solution for S
The base case for the recursion is subproblems of size 0 or 1
Merge-sort is a sorting algorithm based on the divide-and-conquer paradigm
Like heap-sort, it uses a comparator and has O(n log n) running time
Unlike heap-sort, it does not use an auxiliary priority queue, and it accesses data in a sequential manner (suitable for sorting data on a disk)

83 Merge-Sort
Merge-sort on an input sequence S with n elements consists of three steps:
Divide: partition S into two sequences S1 and S2 of about n/2 elements each
Recur: recursively sort S1 and S2
Conquer: merge S1 and S2 into a unique sorted sequence

Algorithm mergeSort(S, C)
  Input: sequence S with n elements, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1
    (S1, S2) ← partition(S, n/2)
    mergeSort(S1, C)
    mergeSort(S2, C)
    S ← merge(S1, S2)

84 Merging Two Sorted Sequences
The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B
Merging two sorted sequences, each with n/2 elements and implemented as a doubly linked list, takes O(n) time

Algorithm merge(A, B)
  Input: sequences A and B with n/2 elements each
  Output: sorted sequence of A ∪ B
  S ← empty sequence
  while ¬A.isEmpty() ∧ ¬B.isEmpty()
    if A.first().element() < B.first().element()
      S.insertLast(A.remove(A.first()))
    else
      S.insertLast(B.remove(B.first()))
  while ¬A.isEmpty()
    S.insertLast(A.remove(A.first()))
  while ¬B.isEmpty()
    S.insertLast(B.remove(B.first()))
  return S
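The mergeSort and merge pseudocode above can be sketched together in Java, using arrays rather than linked sequences (class and method names are illustrative):

```java
import java.util.Arrays;

/** Recursive merge-sort on int arrays: divide, recur on each half,
    then merge the two sorted halves into one sorted array. */
class MergeSortDemo {
    static int[] sort(int[] s) {
        if (s.length <= 1) return s;               // base case: size 0 or 1
        int[] s1 = sort(Arrays.copyOfRange(s, 0, s.length / 2));
        int[] s2 = sort(Arrays.copyOfRange(s, s.length / 2, s.length));
        return merge(s1, s2);
    }

    static int[] merge(int[] a, int[] b) {
        int[] out = new int[a.length + b.length];
        int x = 0, y = 0, z = 0;
        while (x < a.length && y < b.length)       // take the smaller front element
            out[z++] = (a[x] <= b[y]) ? a[x++] : b[y++];
        while (x < a.length) out[z++] = a[x++];    // copy whichever run remains
        while (y < b.length) out[z++] = b[y++];
        return out;
    }
}
```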

85 Merge-Sort Tree
An execution of merge-sort is depicted by a binary tree
Each node represents a recursive call of merge-sort and stores:
the unsorted sequence before the execution and its partition
the sorted sequence at the end of the execution
The root is the initial call
The leaves are calls on subsequences of size 0 or 1
(Figure: merge-sort tree for 7 2 9 4 → 2 4 7 9)

86–95 Execution Example
The slides step through merge-sort on the sequence 7 2 9 4 | 3 8 6 1, producing 1 2 3 4 6 7 8 9:
Partition: 7 2 9 4 and 3 8 6 1
Recursive calls partition further (7 2 | 9 4, then 7 | 2) down to base cases of size 1
Merge: 7 and 2 merge to 2 7; 9 and 4 merge to 4 9; these merge to 2 4 7 9
Likewise on the right half: 3 8 → 3 8, 6 1 → 1 6, merged to 1 3 6 8
Final merge: 2 4 7 9 and 1 3 6 8 merge to 1 2 3 4 6 7 8 9
(Figure: merge-sort tree highlighting the active node at each step)

96 Analysis of Merge-Sort
The height h of the merge-sort tree is O(log n): at each recursive call we divide the sequence in half
The overall amount of work done at the nodes of depth i is O(n): we partition and merge 2^i sequences of size n/2^i, and we make 2^(i+1) recursive calls
Thus, the total running time of merge-sort is O(n log n)

depth  #seqs  size
0      1      n
1      2      n/2
i      2^i    n/2^i

97 Nonrecursive Merge-Sort
public static void mergeSort(Object[] orig, Comparator c) { // nonrecursive
    Object[] in = new Object[orig.length];      // make a new temporary array
    System.arraycopy(orig, 0, in, 0, in.length); // copy the input
    Object[] out = new Object[in.length];       // output array
    Object[] temp;                              // temp array reference used for swapping
    int n = in.length;
    for (int i = 1; i < n; i *= 2) {            // merge runs of length 2, then 4, then 8, and so on
        for (int j = 0; j < n; j += 2 * i)      // each iteration merges two length-i pairs
            merge(in, out, c, j, i);            // merge two length-i runs in "in" to "out", starting at j
        temp = in; in = out; out = temp;        // swap arrays for next iteration
    }
    // the "in" array contains the sorted array, so re-copy it
    System.arraycopy(in, 0, orig, 0, in.length);
}

protected static void merge(Object[] in, Object[] out, Comparator c, int start, int inc) {
    // merge in[start..start+inc-1] and in[start+inc..start+2*inc-1]
    int x = start;                               // index into run #1
    int end1 = Math.min(start + inc, in.length); // boundary for run #1
    int end2 = Math.min(start + 2 * inc, in.length); // boundary for run #2
    int y = start + inc;                         // index into run #2 (could be beyond array boundary)
    int z = start;                               // index into the out array
    while ((x < end1) && (y < end2))
        if (c.compare(in[x], in[y]) <= 0) out[z++] = in[x++];
        else out[z++] = in[y++];
    if (x < end1)                                // first run didn't finish
        System.arraycopy(in, x, out, z, end1 - x);
    else if (y < end2)                           // second run didn't finish
        System.arraycopy(in, y, out, z, end2 - y);
}

98 Quick-Sort
(Figure: quick-sort tree — 4 2 → 2 4, 7 9 → 7 9)

99 Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
Divide: pick a random element x (called the pivot) and partition S into
L: elements less than x
E: elements equal to x
G: elements greater than x
Recur: sort L and G
Conquer: join L, E and G

100 Partition
We partition an input sequence as follows:
We remove, in turn, each element y from S, and
we insert y into L, E or G, depending on the result of the comparison with the pivot x
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time
Thus, the partition step of quick-sort takes O(n) time

Algorithm partition(S, p)
  Input: sequence S, position p of pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, resp.
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.insertLast(y)
    else if y = x
      E.insertLast(y)
    else { y > x }
      G.insertLast(y)
  return L, E, G
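The L/E/G partition above can be sketched in Java with three lists (a deterministic middle pivot is used here for simplicity, where the slide picks a random one; names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

/** Quick-sort via three-way partition: elements less than, equal
    to, and greater than the pivot go into L, E, G; only L and G
    are sorted recursively, then the three parts are joined. */
class QuickSortDemo {
    static List<Integer> sort(List<Integer> s) {
        if (s.size() <= 1) return s;               // base case: size 0 or 1
        int pivot = s.get(s.size() / 2);           // middle element as pivot
        List<Integer> l = new ArrayList<>(), e = new ArrayList<>(), g = new ArrayList<>();
        for (int y : s) {                          // the partition step: O(n)
            if (y < pivot) l.add(y);
            else if (y == pivot) e.add(y);
            else g.add(y);
        }
        List<Integer> out = new ArrayList<>(sort(l)); // join L, E, G
        out.addAll(e);
        out.addAll(sort(g));
        return out;
    }
}
```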

101 Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree
Each node represents a recursive call of quick-sort and stores:
the unsorted sequence before the execution and its pivot
the sorted sequence at the end of the execution
The root is the initial call
The leaves are calls on subsequences of size 0 or 1
(Figure: quick-sort tree — 4 2 → 2 4, 7 9 → 7 9)

102–108 Execution Example
The slides step through quick-sort on the sequence 7 2 9 4 3 7 6 1 → 1 2 3 4 6 7 8 9:
Pivot selection, then partition into L, E, G
Recursive calls on L and G, with further pivot selections and partitions, down to base cases of size 0 or 1
Joins combine the sorted pieces back together
(Figure: quick-sort tree highlighting the active node at each step)

109 Worst-case Running Time
The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element
Then one of L and G has size n − 1 and the other has size 0
The running time is proportional to the sum n + (n − 1) + … + 2 + 1
Thus, the worst-case running time of quick-sort is O(n^2)

110 Expected Running Time
Consider a recursive call of quick-sort on a sequence of size s
Good call: the sizes of L and G are each less than 3s/4
Bad call: one of L and G has size greater than 3s/4
A call is good with probability 1/2: half of the possible pivots cause good calls
(Figure: good and bad pivot positions in the sequence)

111 Expected Running Time, Part 2
Probabilistic fact: the expected number of coin tosses required to get k heads is 2k
For a node of depth i, we expect that i/2 of its ancestors are good calls
Therefore the size of the input sequence for the current call is at most (3/4)^(i/2) · n
For a node of depth 2 log_{4/3} n, the expected input size is one
So the expected height of the quick-sort tree is O(log n)
The amount of work done at the nodes of the same depth is O(n)
Thus, the expected running time of quick-sort is O(n log n)

112 In-Place Quick-Sort
Quick-sort can be implemented to run in place
In the partition step, we use replace operations to rearrange the elements of the input sequence such that:
the elements less than the pivot have rank less than h
the elements equal to the pivot have rank between h and k
the elements greater than the pivot have rank greater than k
The recursive calls consider:
elements with rank less than h
elements with rank greater than k

Algorithm inPlaceQuickSort(S, l, r)
  Input: sequence S, ranks l and r
  Output: sequence S with the elements of rank between l and r rearranged in increasing order
  if l ≥ r
    return
  i ← a random integer between l and r
  x ← S.elemAtRank(i)
  (h, k) ← inPlacePartition(x)
  inPlaceQuickSort(S, l, h − 1)
  inPlaceQuickSort(S, k + 1, r)

113 In-Place Partitioning
Perform the partition using two indices to split S into L and E ∪ G (a similar method can split E ∪ G into E and G)
Repeat until j and k cross:
Scan j to the right until finding an element > x
Scan k to the left until finding an element < x
Swap the elements at indices j and k
(Figure: indices j and k scanning toward each other, pivot = 6)

114 Java Implementation (only works for distinct elements)
public static void quickSort(Object[] S, Comparator c) {
    if (S.length < 2) return;             // the array is already sorted in this case
    quickSortStep(S, c, 0, S.length - 1); // recursive sort method
}

private static void quickSortStep(Object[] S, Comparator c, int leftBound, int rightBound) {
    if (leftBound >= rightBound) return;  // the indices have crossed
    Object temp;                          // temp object used for swapping
    Object pivot = S[rightBound];
    int leftIndex = leftBound;            // will scan rightward
    int rightIndex = rightBound - 1;      // will scan leftward
    while (leftIndex <= rightIndex) {
        // scan rightward to find an element larger than the pivot
        while ((leftIndex <= rightIndex) && (c.compare(S[leftIndex], pivot) <= 0))
            leftIndex++;
        // scan leftward to find an element smaller than the pivot
        while ((rightIndex >= leftIndex) && (c.compare(S[rightIndex], pivot) >= 0))
            rightIndex--;
        if (leftIndex < rightIndex) {     // both elements were found
            temp = S[rightIndex];
            S[rightIndex] = S[leftIndex]; // swap these elements
            S[leftIndex] = temp;
        }
    }                                     // the loop continues until the indices cross
    temp = S[rightBound];                 // swap pivot with the element at leftIndex
    S[rightBound] = S[leftIndex];
    S[leftIndex] = temp;
    // the pivot is now at leftIndex, so recurse
    quickSortStep(S, c, leftBound, leftIndex - 1);
    quickSortStep(S, c, leftIndex + 1, rightBound);
}

115 Java JDK sorts
Java uses both mergesort and quicksort
It can sort arrays of Comparable objects or of any primitive type
It uses a version of quicksort for primitive types
It uses a version of mergesort for objects
Why?
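A quick usage sketch of the JDK entry points. (The usual answer to the slide's "Why?": sorting objects must be stable, and the JDK's mergesort variant, TimSort, guarantees that; equal primitives are indistinguishable, so a faster unstable quicksort is safe for them.)

```java
import java.util.Arrays;
import java.util.Comparator;

/** Arrays.sort on int[] uses a dual-pivot quicksort; on Object[]
    it uses TimSort, a stable mergesort variant. */
class JdkSortDemo {
    public static void main(String[] args) {
        int[] primitives = {40, 35, 80, 75, 60};
        Arrays.sort(primitives);                       // dual-pivot quicksort
        Integer[] boxed = {40, 35, 80, 75, 60};
        Arrays.sort(boxed, Comparator.reverseOrder()); // TimSort, descending order
        System.out.println(Arrays.toString(primitives) + " " + Arrays.toString(boxed));
        // prints: [35, 40, 60, 75, 80] [80, 75, 60, 40, 35]
    }
}
```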

116 Java JDK sorts
Explore Arrays.sort — N-way! Dual pivot!
Yes, we didn't cover everything; see you in Advanced Algorithms (maybe)

117 Sorting Lower Bound
Presentation for use with the textbook Data Structures and Algorithms in Java, 6th edition, by M. T. Goodrich, R. Tamassia, and M. H. Goldwasser, Wiley, 2014

118 Comparison-Based Sorting
Many sorting algorithms are comparison based: they sort by making comparisons between pairs of objects
Examples: bubble-sort, selection-sort, insertion-sort, heap-sort, merge-sort, quick-sort, ...
Let us therefore derive a lower bound on the running time of any algorithm that uses comparisons to sort n elements, x_1, x_2, …, x_n
(Figure: a decision node "Is x_i < x_j?" with yes/no branches)

119 Counting Comparisons
Let us just count comparisons, then
Each possible run of the algorithm corresponds to a root-to-leaf path in a decision tree

120 Decision Tree Height
The height of the decision tree is a lower bound on the running time
Every input permutation must lead to a separate leaf output
If not, some input …4…5… would have the same output ordering as …5…4…, which would be wrong
Since there are n! = 1 · 2 · … · n leaves, the height is at least log(n!)

121 The Lower Bound
Any comparison-based sorting algorithm takes at least log(n!) time
Therefore, any such algorithm takes time at least log(n!) ≥ log((n/2)^(n/2)) = (n/2) log(n/2)
That is, any comparison-based sorting algorithm must run in Ω(n log n) time

122 Sorting in time less than O(n log n)
How?!

123 Out of the comparison world
We do not compare
We assume we deal with integers
And we assume a limited range
Character strings are fine too
Let us see

124 Counting sort
(Figure: example arrays A (input), B (output), C (counts), and C' (running sums))

125 Counting sort
Knowledge: the numbers fall in a small range
Example 1: sort the final exam scores of a large class
1000 students
Maximum score: 100
Minimum score: 0
Scores are integers
Example 2: sort students according to the first letter of their last name
Number of students: many
Number of letters: 26

126 Counting sort
Input: A[1 . . n], where A[j] ∈ {1, 2, …, k}
Output: B[1 . . n], sorted
Auxiliary storage: C[1 . . k]
Not an in-place sorting algorithm
Requires Θ(n + k) additional storage besides the original array

127 Intuition
Students S1..S5 have scores S1: 100, S2: 90, S3: 85, S4: 100, S5: 90, …
Bucket them by score: 85 → S3; 90 → S2, S5, …; 100 → S1, S4
Reading the buckets in score order gives S3, …, S2, S5, …, S1, S4

128 Intuition
If 50 students have score ≤ 75, what is the rank (lowest to highest) of a student with score 75? Rank 50
If 200 students have score ≤ 90, what is the rank of a student with score 90? Rank 200

129 Counting sort
1. for i ← 1 to k do C[i] ← 0                      ⊳ Initialize
2. for j ← 1 to n do C[A[j]] ← C[A[j]] + 1         ⊳ Count: C[i] = |{key = i}|
3. for i ← 2 to k do C[i] ← C[i] + C[i−1]          ⊳ Running sum: C[i] = |{key ≤ i}|
4. for j ← n downto 1 do                            ⊳ Re-arrange
       B[C[A[j]]] ← A[j]
       C[A[j]] ← C[A[j]] − 1
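The four loops map directly to Java (keys assumed to lie in 1..k, as on the slide; class and variable names are illustrative):

```java
/** Counting sort following the four loops above. Keys are in 1..k;
    index 0 of the count array is simply unused. */
class CountingSortDemo {
    static int[] sort(int[] a, int k) {
        int n = a.length;
        int[] c = new int[k + 1];            // loop 1: C initialized to 0
        for (int j = 0; j < n; j++)          // loop 2: count each key
            c[a[j]]++;
        for (int i = 2; i <= k; i++)         // loop 3: running sum,
            c[i] += c[i - 1];                //   c[i] = #{keys <= i}
        int[] b = new int[n];
        for (int j = n - 1; j >= 0; j--) {   // loop 4: re-arrange (stable)
            b[c[a[j]] - 1] = a[j];           // -1 converts 1-based rank to 0-based index
            c[a[j]]--;
        }
        return b;
    }
}
```

Scanning j from n down to 1 in loop 4 is what makes the sort stable: equal keys keep their original relative order.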

130 Counting-sort example
A: [4, 1, 3, 4, 3]   C: [_, _, _, _]   B: [_, _, _, _, _]

131 Loop 1: initialization
1. for i ← 1 to k do C[i] ← 0
A: [4, 1, 3, 4, 3]   C: [0, 0, 0, 0]   B: [_, _, _, _, _]

132 Loop 2: count (j = 1: A[1] = 4)
2. for j ← 1 to n do C[A[j]] ← C[A[j]] + 1   ⊳ C[i] = |{key = i}|
A: [4, 1, 3, 4, 3]   C: [0, 0, 0, 1]   B: [_, _, _, _, _]

133 Loop 2: count (j = 2: A[2] = 1)
A: [4, 1, 3, 4, 3]   C: [1, 0, 0, 1]   B: [_, _, _, _, _]

134 Loop 2: count (j = 3: A[3] = 3)
A: [4, 1, 3, 4, 3]   C: [1, 0, 1, 1]   B: [_, _, _, _, _]

135 Loop 2: count (j = 4: A[4] = 4)
A: [4, 1, 3, 4, 3]   C: [1, 0, 1, 2]   B: [_, _, _, _, _]

136 Loop 2: count (j = 5: A[5] = 3)
A: [4, 1, 3, 4, 3]   C: [1, 0, 2, 2]   B: [_, _, _, _, _]

137 Loop 3: compute running sum (i = 2)
3. for i ← 2 to k do C[i] ← C[i] + C[i−1]   ⊳ C[i] = |{key ≤ i}|
C': [1, 1, 2, 2]

138 Loop 3: compute running sum (i = 3)
C': [1, 1, 3, 2]

139 Loop 3: compute running sum (i = 4)
C': [1, 1, 3, 5]

140 Loop 4: re-arrange (j = 5: A[5] = 3 goes to B[C[3]] = B[3])
4. for j ← n downto 1 do B[C[A[j]]] ← A[j]; C[A[j]] ← C[A[j]] − 1
B: [_, _, 3, _, _]   C': [1, 1, 3, 5]

141 Loop 4: re-arrange (decrement C[3])
B: [_, _, 3, _, _]   C': [1, 1, 2, 5]

142 Loop 4: re-arrange (j = 4: A[4] = 4 goes to B[C[4]] = B[5])
B: [_, _, 3, _, 4]   C': [1, 1, 2, 5]

143 Loop 4: re-arrange (decrement C[4])
B: [_, _, 3, _, 4]   C': [1, 1, 2, 4]

144 Loop 4: re-arrange (j = 3: A[3] = 3 goes to B[C[3]] = B[2])
B: [_, 3, 3, _, 4]   C': [1, 1, 2, 4]

145 Loop 4: re-arrange (decrement C[3])
B: [_, 3, 3, _, 4]   C': [1, 1, 1, 4]

146 Loop 4: re-arrange (j = 2: A[2] = 1 goes to B[C[1]] = B[1])
B: [1, 3, 3, _, 4]   C': [1, 1, 1, 4]

147 Loop 4: re-arrange (decrement C[1])
B: [1, 3, 3, _, 4]   C': [0, 1, 1, 4]

148 Loop 4: re-arrange (j = 1: A[1] = 4 goes to B[C[4]] = B[4])
B: [1, 3, 3, 4, 4]   C': [0, 1, 1, 4]

149 Loop 4: re-arrange (decrement C[4])
B: [1, 3, 3, 4, 4]   C': [0, 1, 1, 3]

150 Analysis
1. for i ← 1 to k do C[i] ← 0                                        Θ(k)
2. for j ← 1 to n do C[A[j]] ← C[A[j]] + 1                           Θ(n)
3. for i ← 2 to k do C[i] ← C[i] + C[i−1]                            Θ(k)
4. for j ← n downto 1 do B[C[A[j]]] ← A[j]; C[A[j]] ← C[A[j]] − 1    Θ(n)
Total: Θ(n + k)

151 Running time
If k = O(n), then counting sort takes Θ(n) time.
But the theoretical lower bound for sorting is Ω(n log n)! Why does counting sort take less?
Answer: Comparison sorting takes Ω(n log n) time. Counting sort is not a comparison sort. In fact, not a single comparison between elements occurs!

152 Counting Sort
Cool! Why don't we always use counting sort?
Because its cost depends on the range k of the elements.
Could we use counting sort to sort 32-bit integers? Why or why not?
Answer: no, k is too large (2^32 = 4,294,967,296).

153 Stable sorting
Counting sort is a stable sort: it preserves the input order among equal elements.
Why is this important? We will see now.

154 Importance of stability
Let us browse through this "address-sorting example"

155 Task: sort students by alphabetical order of their addresses (state, city, street).

156 Task: sort students by alphabetical order of their addresses (state, city, street).

157 Task: sort students by alphabetical order of their addresses (state, city, street).

158 Is this useful?!

159 Original set: Task: sort students by alphabetical order of their addresses (state, city, street).

160 Task: sort students by alphabetical order of their addresses (state, city, street).

161 Task: sort students by alphabetical order of their addresses (state, city, street).

162 Task: sort students by alphabetical order of their addresses (state, city, street).

163 Task: sort students by alphabetical order of their addresses (state, city, street).

164 Original set: Task: sort students by alphabetical order of their addresses (state, city, street).

165 Task: sort students by alphabetical order of their addresses (state, city, street).

166 Task: sort students by alphabetical order of their addresses (state, city, street).

167 Task: sort students by alphabetical order of their addresses (state, city, street).

168 Making a sort stable
Most Θ(n^2) sorting algorithms are stable
Standard selection sort is not, but can be made so
Most Θ(n log n) sorting algorithms are not stable
Except merge sort
Generic way to make any sorting algorithm stable:
Use two keys; the second key is the original index of the element
When two elements are equal, compare their second keys
Example: 5, 6, 5, 1, 2, 3, 2, 6 becomes (5, 1), (6, 2), (5, 3), (1, 4), (2, 5), (3, 6), (2, 7), (6, 8)
so (2, 5) < (2, 7) and (5, 1) < (5, 3)
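The index-decoration trick can be sketched as follows (`stabilize` is a hypothetical helper, not from the slides; it relies on Python tuples comparing lexicographically, so the original index breaks ties exactly as described above):

```python
def stabilize(sort_fn, A):
    """Make any pure sorting function stable by pairing each element
    with its original index; ties are broken by that second key."""
    decorated = [(x, i) for i, x in enumerate(A)]   # second key = original position
    result = sort_fn(decorated)                     # pairs compare lexicographically
    return [x for x, _ in result]

# The slide's sequence:
data = [5, 6, 5, 1, 2, 3, 2, 6]
print(stabilize(sorted, data))   # [1, 2, 2, 3, 5, 5, 6, 6]
```

Equal elements keep their relative order because (2, 5) < (2, 7) regardless of how `sort_fn` reorders ties.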

169 How to sort very large numbers?
Those numbers are too large for the int type, so they are represented as strings.
One method: use comparison-based sorting, but compare strings character by character.
Change if (A[i] < A[j]) to if (compare(A[i], A[j]) < 0)

Compare(s, t)
  for i = 1 to length(s)
    if (s[i] < t[i]) return -1
    else if (s[i] > t[i]) return 1
  return 0

What's the cost to compare two strings, each with d characters? Θ(d)
Total cost: Θ(d n log n)
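A sketch of this method in Python, assuming all numbers are padded to the same width d so that character-by-character comparison matches numeric order (the sample values are taken from the radix-sort slide):

```python
from functools import cmp_to_key

def compare(s, t):
    """Character-by-character comparison of equal-length digit strings: Θ(d)."""
    for a, b in zip(s, t):
        if a < b:
            return -1
        if a > b:
            return 1
    return 0

nums = ["198099", "340199", "384700", "382408", "614386"]
print(sorted(nums, key=cmp_to_key(compare)))
# each comparison costs Θ(d), so the total is Θ(d n log n)
```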

170 Radix sort
Similar to sorting address books
Treat each digit as a key
Start from the least significant digit
Example keys (most significant digit on the left, least significant on the right):
198099 340199 384700 382408 614386

171 Radix sort illustration
A simpler example:

172 Radix sort illustration
Sort the last digit:

173 Radix sort illustration
Sort the second digit:

174 Radix sort illustration
Sort the first digit: Sorted

175 Time complexity
Sort each of the d digits by counting sort
Total cost: Θ(d(n + k)); with k = 10, total cost: Θ(dn)
Partition the d digits into groups of 3
Total cost: Θ((n + 10^3) d/3)
We can work with binaries rather than decimals
Partition the d bits into groups of r bits
Total cost: Θ((n + 2^r) d/r)
Choose r = log n
Total cost: Θ(dn / log n); compare with Θ(dn log n) for comparison sorting
Catch: radix sort has a larger hidden constant factor

176 Space complexity
Calls counting sort
Therefore additional storage is needed: Θ(n)

177 Radix Sort
In general, radix sort based on counting sort is
Asymptotically fast (i.e., O(n))
Simple to code
A good choice
To think about: can radix sort be used on floating-point numbers? Hmm…

178 Bucket-Sort
Sorted output: (1, c), (3, a), (3, b), (7, d), (7, g), (7, e)
Auxiliary array B of buckets indexed 0 … 9

179 Bucket-Sort
Let S be a sequence of n (key, element) entries with keys in the range [0, N - 1]
Bucket-sort uses the keys as indices into an auxiliary array B of sequences (buckets)
Phase 1: Empty sequence S by moving each entry (k, o) into its bucket B[k]
Phase 2: For i = 0, …, N - 1, move the entries of bucket B[i] to the end of sequence S
Analysis: Phase 1 takes O(n) time; Phase 2 takes O(n + N) time; bucket-sort takes O(n + N) time

Algorithm bucketSort(S, N)
  Input sequence S of (key, element) items with keys in the range [0, N - 1]
  Output sequence S sorted by increasing keys
  B ← array of N empty sequences
  while ¬S.isEmpty()
    f ← S.first()
    (k, o) ← S.remove(f)
    B[k].insertLast((k, o))
  for i ← 0 to N - 1
    while ¬B[i].isEmpty()
      f ← B[i].first()
      (k, o) ← B[i].remove(f)
      S.insertLast((k, o))
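The two phases can be sketched in Python with plain lists standing in for the sequences (an assumption; the slides' version mutates S in place through a sequence ADT):

```python
def bucket_sort(S, N):
    """Sort a list of (key, element) pairs with integer keys in [0, N-1].
    Phase 1 scatters into buckets; phase 2 gathers in key order: O(n + N)."""
    B = [[] for _ in range(N)]        # N empty buckets
    for k, o in S:                    # Phase 1: move each entry into B[k]
        B[k].append((k, o))
    out = []
    for i in range(N):                # Phase 2: concatenate buckets 0 .. N-1
        out.extend(B[i])
    return out

items = [(7, 'd'), (1, 'c'), (3, 'a'), (7, 'g'), (3, 'b'), (7, 'e')]
print(bucket_sort(items, 10))
# [(1, 'c'), (3, 'a'), (3, 'b'), (7, 'd'), (7, 'g'), (7, 'e')]
```

Appending to a bucket and concatenating in index order preserves the input order of equal keys, which is the stability property used later.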

180 Example
Key range [0, 9]; input: (7, d), (1, c), (3, a), (7, g), (3, b), (7, e)
Phase 1 (scatter): B[1]: (1, c)   B[3]: (3, a), (3, b)   B[7]: (7, d), (7, g), (7, e)
Phase 2 (gather): (1, c), (3, a), (3, b), (7, d), (7, g), (7, e)

181

182 Properties and Extensions
Key-type Property: the keys are used as indices into an array and cannot be arbitrary objects; no external comparator
Stable Sort Property: the relative order of any two items with the same key is preserved after the execution of the algorithm
Extensions:
Integer keys in the range [a, b]: put entry (k, o) into bucket B[k - a]
String keys from a set D of possible strings, where D has constant size (e.g., names of the 50 U.S. states): sort D and compute the rank r(k) of each string k of D in the sorted sequence; put entry (k, o) into bucket B[r(k)]

183 Radix-Sort in [G]

184 Lexicographic Order
A d-tuple is a sequence of d keys (k1, k2, …, kd), where key ki is said to be the i-th dimension of the tuple
Example: the Cartesian coordinates of a point in space are a 3-tuple
The lexicographic order of two d-tuples is recursively defined as follows:
(x1, x2, …, xd) < (y1, y2, …, yd) ⇔ x1 < y1 ∨ (x1 = y1 ∧ (x2, …, xd) < (y2, …, yd))
I.e., the tuples are compared by the first dimension, then by the second dimension, etc.

185 Lexicographic-Sort
Algorithm lexicographicSort(S)
  Input sequence S of d-tuples
  Output sequence S sorted in lexicographic order
  for i ← d downto 1
    stableSort(S, Ci)
Let Ci be the comparator that compares two tuples by their i-th dimension
Let stableSort(S, C) be a stable sorting algorithm that uses comparator C
Lexicographic-sort sorts a sequence of d-tuples in lexicographic order by executing d times algorithm stableSort, one per dimension
Lexicographic-sort runs in O(dT(n)) time, where T(n) is the running time of stableSort
Example:
(7,4,6) (5,1,5) (2,4,6) (2,1,4) (3,2,4)
(2,1,4) (3,2,4) (5,1,5) (7,4,6) (2,4,6)   after stableSort on dimension 3
(2,1,4) (5,1,5) (3,2,4) (7,4,6) (2,4,6)   after stableSort on dimension 2
(2,1,4) (2,4,6) (3,2,4) (5,1,5) (7,4,6)   after stableSort on dimension 1
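In Python this is a short sketch, since the built-in `sorted` is guaranteed stable and can play the role of stableSort:

```python
def lexicographic_sort(S, d):
    """Sort d-tuples lexicographically by running a stable sort d times,
    from the last dimension to the first."""
    for i in range(d - 1, -1, -1):
        S = sorted(S, key=lambda t: t[i])   # stable sort on dimension i
    return S

tuples = [(7, 4, 6), (5, 1, 5), (2, 4, 6), (2, 1, 4), (3, 2, 4)]
print(lexicographic_sort(tuples, 3))
# [(2, 1, 4), (2, 4, 6), (3, 2, 4), (5, 1, 5), (7, 4, 6)]
```

Sorting the least significant dimension first works only because each later pass is stable: ties in dimension i keep the order established by dimensions i+1 … d.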

186 Radix-Sort
Radix-sort is a specialization of lexicographic-sort that uses bucket-sort as the stable sorting algorithm in each dimension
Radix-sort is applicable to tuples where the keys in each dimension i are integers in the range [0, N - 1]
Radix-sort runs in time O(d(n + N))
Algorithm radixSort(S, N)
  Input sequence S of d-tuples such that (0, …, 0) ≤ (x1, …, xd) and (x1, …, xd) ≤ (N - 1, …, N - 1) for each tuple (x1, …, xd) in S
  Output sequence S sorted in lexicographic order
  for i ← d downto 1
    bucketSort(S, N)   ⊳ using the i-th dimension as the key

187 Radix-Sort for Binary Numbers
Consider a sequence of n b-bit integers x = x(b-1) … x1 x0
We represent each element as a b-tuple of integers in the range [0, 1] and apply radix-sort with N = 2
This application of the radix-sort algorithm runs in O(bn) time
For example, we can sort a sequence of 32-bit integers in linear time
Algorithm binaryRadixSort(S)
  Input sequence S of b-bit integers
  Output sequence S sorted
  replace each element x of S with the item (0, x)
  for i ← 0 to b - 1
    replace the key k of each item (k, x) of S with bit xi of x
    bucketSort(S, 2)
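A compact sketch of binaryRadixSort for non-negative integers (an assumption; list comprehensions stand in for the two buckets of each stable pass):

```python
def binary_radix_sort(S, b):
    """Sort b-bit non-negative integers with b stable 2-bucket passes,
    from bit 0 (least significant) to bit b-1: O(bn) total."""
    for i in range(b):
        zeros = [x for x in S if not (x >> i) & 1]   # bucket for bit i = 0
        ones = [x for x in S if (x >> i) & 1]        # bucket for bit i = 1
        S = zeros + ones                             # stable gather
    return S

nums = [0b1001, 0b0010, 0b1101, 0b0001, 0b1110]
print([format(x, '04b') for x in binary_radix_sort(nums, 4)])
# ['0001', '0010', '1001', '1101', '1110']
```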

188 Example
Sorting a sequence of 4-bit integers:
1001 0010 1101 0001 1110
0010 1110 1001 1101 0001   after sorting on bit 0
1001 1101 0001 0010 1110   after sorting on bit 1
1001 0001 0010 1101 1110   after sorting on bit 2
0001 0010 1001 1101 1110   after sorting on bit 3 (sorted)

189 Selection in [G]

190 The Selection Problem
Given an integer k and n elements x1, x2, …, xn, taken from a total order, find the k-th smallest element in this set (e.g., k = 3).
Of course, we can sort the set in O(n log n) time and then index the k-th element.
Can we solve the selection problem faster?

191 Quick-Select
Quick-select is a randomized selection algorithm based on the prune-and-search paradigm:
Prune: pick a random element x (called the pivot) and partition S into
L: elements less than x
E: elements equal to x
G: elements greater than x
Search: depending on k, either the answer is in E, or we need to recurse in either L or G:
if k ≤ |L|: recurse on L
if |L| < k ≤ |L| + |E|: the answer is x (done)
if k > |L| + |E|: recurse on G with k' = k - |L| - |E|
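The prune-and-search case analysis reads directly as code. A minimal sketch (not from the slides; it trades the O(1)-per-move sequence operations for list comprehensions):

```python
import random

def quick_select(S, k):
    """Return the k-th smallest element (1-based) of S by prune-and-search."""
    x = random.choice(S)                    # random pivot
    L = [y for y in S if y < x]
    E = [y for y in S if y == x]
    G = [y for y in S if y > x]
    if k <= len(L):
        return quick_select(L, k)           # answer is among the smaller elements
    if k <= len(L) + len(E):
        return x                            # answer equals the pivot: done
    return quick_select(G, k - len(L) - len(E))

print(quick_select([7, 4, 9, 3, 2, 6, 5, 1, 8], 3))   # 3
```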

192 Partition
We partition an input sequence as in the quick-sort algorithm:
We remove, in turn, each element y from S, and
We insert y into L, E, or G, depending on the result of the comparison with the pivot x
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time
Thus, the partition step of quick-select takes O(n) time
Algorithm partition(S, p)
  Input sequence S, position p of pivot
  Output subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, resp.
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.insertLast(y)
    else if y = x
      E.insertLast(y)
    else { y > x }
      G.insertLast(y)
  return L, E, G

193 Quick-Select Visualization
An execution of quick-select can be visualized by a recursion path: each node represents a recursive call of quick-select, and stores k and the remaining sequence
k=5, S=( )
k=2, S=( )
k=2, S=( )
k=1, S=( )
5

194 Expected Running Time
Consider a recursive call of quick-select on a sequence of size s
Good call: the sizes of L and G are each less than 3s/4
Bad call: one of L and G has size greater than 3s/4
A call is good with probability 1/2: 1/2 of the possible pivots cause good calls (pivots in the middle half are good; pivots in the first or last quarter are bad)

195 Deterministic Selection
We can do selection in O(n) worst-case time.
Main idea: recursively use the selection algorithm itself to find a good pivot for quick-select:
Divide S into n/5 sets of 5 elements each
Find a median in each set
Recursively find the median of the "baby" medians
This pivot guarantees a minimum size for both L and G, so neither recursive call is on too large a subsequence.
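A sketch of this median-of-medians selection (not from the slides; it favors clarity over the constant factors a careful implementation would tune):

```python
def select(S, k):
    """Deterministic selection: k-th smallest (1-based) in O(n) worst case."""
    if len(S) <= 5:
        return sorted(S)[k - 1]
    # divide into groups of 5 and take each group's median
    medians = [sorted(S[i:i + 5])[len(S[i:i + 5]) // 2]
               for i in range(0, len(S), 5)]
    x = select(medians, (len(medians) + 1) // 2)   # median of the baby medians
    L = [y for y in S if y < x]                    # then proceed as quick-select
    E = [y for y in S if y == x]
    G = [y for y in S if y > x]
    if k <= len(L):
        return select(L, k)
    if k <= len(L) + len(E):
        return x
    return select(G, k - len(L) - len(E))

print(select(list(range(100, 0, -1)), 10))   # 10
```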

196 Reading [G] Chapter 12: Sorting and Selection
[C] Chapter 8: Sorting in Linear Time Repeated material [C] Chapter 9: Medians and Order Statistics [L] Section 7.1: Sorting by Counting All about Trees is not required <yet> All about Graphs is not required <yet>

197 For Murtaja
Swapping two variables without a temporary:
x ← x + y // x holds x + y, y holds y
y ← x − y // x holds x + y, y holds x
x ← x − y // x holds y, and y holds x
The same trick is applicable to any binary data by the XOR operation:
Start: x = 0100, y = 1010
x ← x XOR y // x = 1110, y = 1010
y ← x XOR y // x = 1110, y = 0100
x ← x XOR y // x = 1010, y = 0100
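The XOR trace runs directly in Python:

```python
x, y = 0b0100, 0b1010
x ^= y   # x = 1110, y = 1010
y ^= x   # x = 1110, y = 0100
x ^= y   # x = 1010, y = 0100
print(format(x, '04b'), format(y, '04b'))   # 1010 0100
```

The values are swapped with no temporary variable; each step matches the trace above because a XOR b XOR b = a.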

