Sorting Algorithms.


1 Sorting Algorithms

2 What is a Sorting Algorithm?
Definition: Given a list of records, produce the records in non-decreasing order of their keys (numerical order or lexicographical order). The output is a permutation of the input.

3 Why is Sorting Important?
Sorting is commonly used as a pre-processing step for searching and matching. Sorting also serves as a building block for many other complex problems. In many organizations, more than 25% of computing time is spent on sorting. No single algorithm is best in every situation: the choice depends on the initial ordering and the size of the list, so we need to know several techniques. The lower-bound analysis is also good practice in basic algorithm analysis.

4 Categories of Sorting Algorithms
Comparison sort vs. non-comparison sort
A comparison sort determines the order of two elements through a comparison operator.
Comparison sorting algorithms: selection sort, bubble sort, insertion sort, quick sort, ...
Non-comparison sorting algorithms: radix sort, bucket sort, counting sort
Note: a non-comparison sort is also called a linear sorting method.

5 Categories of Sorting Algorithms
Internal sort vs. external sort
An internal sorting technique is used when the list is small enough to sort entirely in main memory; the goal is to minimize the number of comparisons.
An external sorting technique is used when the list is too big to fit into main memory and must reside on external storage (e.g., disk or SSD); the goal is to minimize the number of I/O accesses.

6 Stability of Sorting Algorithms
Definition: A stable sorting algorithm maintains the relative order of records with equal keys; the initial order of records with equal keys does not change. An unstable algorithm may reorder such records.

7 What is Selection Sort?
Description: At the i-th iteration (0 <= i <= n-1), given a list L, there are two parts: L[0, i-1] and L[i, n-1].
L[0, i-1]: a sublist of items already sorted (gray)
L[i, n-1]: a sublist of items remaining to be sorted (green)
Select the minimum (red) from the unsorted part, then exchange the minimum with the i-th element in the list.
Example on 10 21 5 8 1 12:
1 21 5 8 10 12 -> 1 5 21 8 10 12 -> 1 5 8 21 10 12 -> 1 5 8 10 21 12 -> 1 5 8 10 12 21

8 Implementation of Selection Sort
void SelectionSort(Data* list, int n) {
    int min, temp;
    for (int i = 0; i < n - 1; i++) {
        min = i;
        // Find the index of the minimum element in list[i..n-1].
        for (int j = i + 1; j < n; j++)
            if (list[j] < list[min]) min = j;
        // Exchange the minimum element and the i-th element.
        SWAP(list[i], list[min], temp);
    }
}
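The slides do not define Data or SWAP; the sketch below assumes Data is int and SWAP is a temp-based macro, and checks the routine against the running example 10 21 5 8 1 12:

```c
#include <assert.h>

/* Assumed definitions (not given in the slides). */
typedef int Data;
#define SWAP(a, b, t) do { (t) = (a); (a) = (b); (b) = (t); } while (0)

void SelectionSort(Data* list, int n) {
    int min, temp;
    for (int i = 0; i < n - 1; i++) {
        min = i;
        for (int j = i + 1; j < n; j++)    /* find the minimum in list[i..n-1] */
            if (list[j] < list[min]) min = j;
        SWAP(list[i], list[min], temp);    /* move it to position i */
    }
}
```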

9 Exercise: Selection Sort
Animation: sorting 3, 4, 5, 4, 9, 1, 2, 6, 7 Draw the step-by-step procedure of selection sort. Is it stable? Initial list After sorting the first element

10 Analysis of Selection Sort
Time complexity
Best case: O(n^2). The number of comparisons: (n-1) + (n-2) + ... + 2 + 1 = n(n-1)/2
Worst case: O(n^2)
Q) Is it stable? A) No. Elements can be moved across non-adjacent positions, so selection sort is unstable. E.g., 4(1) 2 4(2) 1 5
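The 4(1) 2 4(2) 1 5 example can be checked in code. This sketch uses a hypothetical record type (not from the slides) that tags each 4 with its original position, and shows selection sort reversing the order of the two equal keys:

```c
#include <assert.h>

/* Hypothetical record type: a key plus a tag recording original order. */
typedef struct { int key; char tag; } Rec;

/* Selection sort on records, comparing keys only. */
void SelectionSortRec(Rec* list, int n) {
    for (int i = 0; i < n - 1; i++) {
        int min = i;
        for (int j = i + 1; j < n; j++)
            if (list[j].key < list[min].key) min = j;
        Rec t = list[i]; list[i] = list[min]; list[min] = t;
    }
}
```

Sorting 4(1) 2 4(2) 1 5 swaps 4(1) past 4(2) in the very first iteration, so the two equal keys come out in the order 4(2) 4(1).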

11 What is Bubble Sort?
Description: At the i-th iteration (0 <= i <= n-1), there are two parts: L[0, n-i-1] and L[n-i, n-1].
L[0, n-i-1]: a sublist of items remaining to be sorted (green)
L[n-i, n-1]: a sublist of items already sorted (gray)
Compare each pair of adjacent items (red) in the unsorted part and swap them if they are in the wrong order.
Example, 0-th iteration on 10 21 5 8 1 12:
Compare 10 and 21 (no swap); 21 and 5 (swap) -> 10 5 21 8 1 12; 21 and 8 (swap) -> 10 5 8 21 1 12; 21 and 1 (swap) -> 10 5 8 1 21 12; 21 and 12 (swap) -> 10 5 8 1 12 21. The 0-th iteration has ended.

12 Implementation of Bubble Sort
void BubbleSort(Data* list, int n) {
    int temp;
    for (int i = n - 1; i > 0; i--)
        // Compare each pair of adjacent items.
        for (int j = 0; j < i; j++)
            // Swap if they are in the wrong order.
            if (list[j] > list[j + 1])
                SWAP(list[j], list[j + 1], temp);
}

13 Exercise: Bubble Sort Animation: sorting 3, 4, 5, 4, 9, 1, 2, 6, 7
Draw the step-by-step procedure of bubble sort. Is it stable? Initial list After sorting the last element

14 Analysis of Bubble Sort
Time complexity
Best case: O(n^2) (this implementation always performs every comparison)
Worst case: O(n^2)
Q) Is it stable? A) Yes. It only exchanges adjacent items, and equal items are never swapped. Consider 4(1) 4(2).
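The O(n^2) best case above comes from always completing every pass. A common variant (not shown in the slides) tracks whether a pass performed any swap; if not, the list is already sorted and the routine stops early, making the best case O(n) on sorted input:

```c
#include <assert.h>
#include <stdbool.h>

typedef int Data;  /* assumed; the slides do not define Data */

/* Early-exit bubble sort: if a full pass makes no swap, the list is
   already sorted, so the best case on sorted input becomes O(n). */
void BubbleSortEarlyExit(Data* list, int n) {
    for (int i = n - 1; i > 0; i--) {
        bool swapped = false;
        for (int j = 0; j < i; j++) {
            if (list[j] > list[j + 1]) {
                Data t = list[j]; list[j] = list[j + 1]; list[j + 1] = t;
                swapped = true;
            }
        }
        if (!swapped) break;  /* no swaps: remaining prefix is sorted */
    }
}
```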

15 What is Insertion Sort?
Description: At the i-th iteration (1 <= i <= n-1), given a list L, there are two parts: L[0, i-1] and L[i, n-1].
L[0, i-1]: a sublist of items already sorted relative to each other
L[i, n-1]: a sublist of items remaining to be sorted
Insert L[i] into its correct position in L[0, i].
Example on 10 21 5 8 1 12:
insert 21 -> 10 21 5 8 1 12; insert 5 -> 5 10 21 8 1 12; insert 8 -> 5 8 10 21 1 12; insert 1 -> 1 5 8 10 21 12; insert 12 -> 1 5 8 10 12 21

16 Implementation of Insertion Sort
void InsertionSort(Data* list, int n) {
    int j, key;
    for (int i = 1; i < n; i++) {
        key = list[i];  // Choose the i-th element.
        // While the j-th element is greater than key,
        // shift it to the next position.
        for (j = i - 1; j >= 0; j--) {
            if (key < list[j])
                list[j + 1] = list[j];
            else
                break;
        }
        // Insert the key at position j + 1.
        list[j + 1] = key;
    }
}
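As with the earlier routines, Data is not defined in the slides; assuming Data is int, the routine can be checked on the running example:

```c
#include <assert.h>

typedef int Data;  /* assumed definition */

void InsertionSort(Data* list, int n) {
    int j, key;
    for (int i = 1; i < n; i++) {
        key = list[i];                 /* element to insert */
        for (j = i - 1; j >= 0; j--) { /* shift greater elements right */
            if (key < list[j]) list[j + 1] = list[j];
            else break;
        }
        list[j + 1] = key;             /* insert at position j + 1 */
    }
}
```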

17 Exercise: Insertion Sort
Animation: sorting 3, 4, 5, 4, 9, 1, 2, 6, 7 Draw the step-by-step procedure of insertion sort. Is it stable? Initial list After sorting four elements

18 Analysis of Insertion Sort
Time complexity
Best case: O(n) (already sorted input)
Worst case: O(n^2)
Q) Is it stable? A) Yes. Elements are only shifted past strictly greater elements, so equal elements keep their relative order; the exchanges are adjacent, similar to bubble sort.

19 What is Shell Sort?
Description
Invented by Donald Shell in 1959.
Start by sorting elements far apart from each other, then reduce the gap between the elements being compared. It is a generalization of insertion sort: sort every k-th element by considering k interleaved lists, then reduce the gap k and repeat until k = 1.
Example on 9 7 5 20 4 2 22 1 3 15 18:
After sorting every 5th element: 2 7 1 3 4 9 22 5 20 15 18
After sorting every 3rd element: 2 4 1 3 5 9 15 7 20 22 18
After sorting every element: 1 2 3 4 5 7 9 15 18 20 22

20 Example of Shell Sort
When k = 5, sort every 5th element. The five interleaved sublists of 9 7 5 20 4 2 22 1 3 15 18 are (9, 2, 18), (7, 22), (5, 1), (20, 3), and (4, 15); sorting each sublist gives 2 7 1 3 4 9 22 5 20 15 18.

21 Example of Shell Sort
When k = 3, the interleaved sublists of 2 7 1 3 4 9 22 5 20 15 18 are (2, 3, 22, 15), (7, 4, 5, 18), and (1, 9, 20); sorting each sublist gives 2 4 1 3 5 9 15 7 20 22 18. Finally, when k = 1, an ordinary insertion sort yields 1 2 3 4 5 7 9 15 18 20 22.

22 Implementation of Shell Sort
This is similar to insertion sort, but considers the gap between elements. When the gap is one, it is the same as insertion sort.
void IncInsertionSort(int* list, int first, int last, int k) {
    int i, j, key;
    // Insertion sort over every k-th element, starting at first.
    for (i = first + k; i < last; i = i + k) {
        key = list[i];  // Choose the i-th element.
        // While the j-th element is greater than key,
        // shift it right by k positions.
        for (j = i - k; j >= first && key < list[j]; j = j - k)
            list[j + k] = list[j];
        list[j + k] = key;
    }
}

23 Implementation of Shell Sort
It is non-trivial to decide which gap sequence is best.
void ShellSort(Data* list, int n) {
    // Sort every k-th element, reducing k from n/2 to 1.
    for (int k = n / 2; k > 0; k /= 2) {
        if ((k % 2) == 0) k++;  // Make the gap odd.
        for (int i = 0; i < k; i++)
            IncInsertionSort(list, i, n, k);
    }
}
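A compilable sketch combining the two routines (assuming Data is int) reproduces the gap sequence 5, 3, 1 for the 11-element example from the previous slides:

```c
#include <assert.h>

typedef int Data;  /* assumed definition */

/* Insertion sort over every k-th element, starting at index first. */
void IncInsertionSort(int* list, int first, int last, int k) {
    int i, j, key;
    for (i = first + k; i < last; i = i + k) {
        key = list[i];
        for (j = i - k; j >= first && key < list[j]; j = j - k)
            list[j + k] = list[j];   /* shift right by k positions */
        list[j + k] = key;
    }
}

void ShellSort(Data* list, int n) {
    for (int k = n / 2; k > 0; k /= 2) {  /* gaps n/2, ..., 1 */
        if ((k % 2) == 0) k++;            /* make the gap odd */
        for (int i = 0; i < k; i++)
            IncInsertionSort(list, i, n, k);
    }
}
```

For n = 11 the gaps are 5, 3, 1, matching the example; the final k = 1 pass is a plain insertion sort, which guarantees a sorted result.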

24 Exercise: Shell Sort
Animation: sorting 3, 4, 5, 4, 9, 1, 2, 6, 7
Draw the step-by-step procedure of shell sort for k = 5, 3, and 1. Is it stable?
Initial list. After sorting every 5th element.

25 Analysis of Shell Sort
Time complexity
Average case: between O(n log n) and O(n^2), depending on the gap sequence
Worst case: roughly O(n log^2 n) to O(n^2), depending on the gap sequence
Q) Is it stable? A) No. When the gap is more than one, elements can jump over equal elements, losing stability.

26 Divide & Conquer (D&C) Paradigm
Definition: Break a problem down into two or more subproblems of the same or a related type, solve the subproblems, and combine the subsolutions into a solution to the original problem.
Example with 7 12 8 3 1 6 5 4: split into 7 12 8 3 and 1 6 5 4, solve each to get 3 7 8 12 and 1 4 5 6, then merge into 1 3 4 5 6 7 8 12.

27 What is Quick Sort?
Description
Invented by Tony Hoare in 1959. Based on the divide and conquer (D&C) paradigm.
Overall procedure
Pivot selection: pick an element, called a pivot, from the list.
Partitioning: reorder the list around the pivot. The elements less than the pivot come before the pivot; the elements greater than the pivot come after it.
Recursively apply the above steps to the sublists.
Example: partitioning 7 4 8 1 12 11 3 9 around the pivot 7 gives 1 4 3 7 12 11 8 9.

28 Example of Quick Sort
Overall procedure: for each list, select the left-most element as the pivot, then partition the list into two sublists.
7 4 8 1 12 11 3 9 -> 1 4 3 | 7 | 12 11 8 9 -> partition each sublist -> 1 3 4 | 7 | 8 9 11 12 -> 1 3 4 7 8 9 11 12

29 Quick Sort: Partitioning
Select a pivot from the list; in general, the left-most element is chosen as the pivot. Use two variables, low and high:
low: while L[low] is less than the pivot, move low to the right.
high: while L[high] is greater than or equal to the pivot, move high to the left.
Swap the two elements L[low] and L[high], and repeat. Once low and high cross, stop partitioning and swap L[left] and L[high] to put the pivot in its final position.

30 Quick Sort: Partitioning
Assume the left-most element is the pivot; left and right are the starting and ending indices of the sublist being partitioned.
Select the pivot 7: 7 4 8 1 12 11 3 9 (low starts after left, high at right).
Move low until pivot < L[low]: low stops at 8 (index 2).
Move high until pivot >= L[high] fails: high stops at 3 (index 6).
Swap L[low] and L[high]: 7 4 3 1 12 11 8 9.

31 Quick Sort: Partitioning
Continuing on 7 4 3 1 12 11 8 9:
Move low until pivot < L[low]: low stops at 12 (index 4).
Move high until pivot >= L[high] fails: high stops at 1 (index 3).
Now low and high have crossed, so swap L[left] and L[high]: 1 4 3 7 12 11 8 9.
All elements in the left sublist are less than the pivot, and all elements in the right sublist are greater than the pivot.

32 Quick Sort: Partitioning
Partitioning one list into two sublists: all elements in the left sublist are less than the pivot, and all elements in the right sublist are greater than the pivot.
int Partition(Data* list, int left, int right) {
    int pivot = list[left], temp;
    int low = left + 1, high = right;
    while (1) {
        while (low < right && list[low] < pivot)
            low++;   // Move low until pivot <= list[low].
        while (high > left && list[high] >= pivot)
            high--;  // Move high until pivot > list[high].
        if (low < high)  // Swap list[low] and list[high].
            SWAP(list[low], list[high], temp);
        else
            break;
    }
    SWAP(list[left], list[high], temp);  // Put the pivot in its final position.
    return high;  // Return the pivot position.
}

33 Implementation of Quick Sort
Overall procedure
Pivot selection: pick an element, called a pivot, from the list.
Partitioning: reorder the list around the pivot.
Recursively apply the above steps to the sublists.
void QuickSort(Data* list, int left, int right) {
    if (left < right) {
        // mid is the final position of the pivot.
        int mid = Partition(list, left, right);
        // Elements in list[left..mid-1] are less than the pivot.
        QuickSort(list, left, mid - 1);
        // Elements in list[mid+1..right] are greater than the pivot.
        QuickSort(list, mid + 1, right);
    }
}
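Putting Partition and QuickSort together (with assumed Data and SWAP definitions, as before) lets us verify the walkthrough, where the first partition places the pivot 7 at index 3:

```c
#include <assert.h>

typedef int Data;  /* assumed definition */
#define SWAP(a, b, t) do { (t) = (a); (a) = (b); (b) = (t); } while (0)

int Partition(Data* list, int left, int right) {
    int pivot = list[left], temp;
    int low = left + 1, high = right;
    while (1) {
        while (low < right && list[low] < pivot) low++;    /* advance low */
        while (high > left && list[high] >= pivot) high--; /* retreat high */
        if (low < high) SWAP(list[low], list[high], temp);
        else break;                                        /* indices crossed */
    }
    SWAP(list[left], list[high], temp);  /* pivot into final position */
    return high;
}

void QuickSort(Data* list, int left, int right) {
    if (left < right) {
        int mid = Partition(list, left, right);
        QuickSort(list, left, mid - 1);
        QuickSort(list, mid + 1, right);
    }
}
```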

34 Exercise: Quick Sort
Animation: sorting 3, 4, 5, 4, 9, 1, 2, 6, 7
Draw the step-by-step procedure of quick sort. Q: Is it stable? No.
Initial list. After the first partitioning.

35 Analysis of Quick Sort
We expect that the list is split into two halves in the average case: T(n) = 2T(n/2) + cn, where the splitting time is cn. By the master theorem, this is O(n log n), so the average-case time complexity of quick sort is O(n log n).
However, the worst case is that the list is split into sizes 1 and n-1: T(n) = T(n-1) + cn = T(n-2) + c(n-1) + cn = ... = T(1) + c(2 + 3 + ... + n) = O(n^2). The worst-case time complexity of quick sort is O(n^2).

36 Analysis of Quick Sort
The worst case occurs when the pivot selection is extremely skewed. The time complexity of quick sort therefore depends mainly on pivot selection. How can we choose a good pivot in quick sort?

37 What is Merge Sort?
Why does quick sort take O(n^2) in the worst case? Because the sizes of the partitioned sublists can be extremely skewed. Instead, let us split the list into exactly half and half.
Description: Use the D&C paradigm.
Divide: split the list into two halves.
Conquer: sort the two sublists.
Combine: merge the two sorted sublists into one list.
Recursively apply the above steps to the sublists.

38 What is Merge Sort?
Partitioning and merging in a recursive manner. Example with 7 12 8 3 1 6 5 4:
Partitioning: 7 12 8 3 | 1 6 5 4 -> 7 12 | 8 3 | 1 6 | 5 4 -> single elements
Merging: 7 12 | 3 8 | 1 6 | 4 5 -> 3 7 8 12 | 1 4 5 6 -> 1 3 4 5 6 7 8 12

39 Merge Sort: Merging
How to merge two sublists into one list? Compare the current elements of L1 and L2 in sequence. If the element in L1 is less than or equal to that in L2, copy it and move to the next position in L1; otherwise copy the element in L2 and move to the next position in L2.
First sublist: 3 7 8 12. Second sublist: 1 4 5 6. Because 3 > 1, copy 1 from L2 to the sorted list and advance the position in L2. Sorted list: 1.

40 Merge Sort: Merging
First sublist: 3 7 8 12. Second sublist: 1 4 5 6. Because 3 < 4, copy 3 from L1 to the sorted list and advance the position in L1. Sorted list: 1 3.

41 Merge Sort: Merging
First sublist: 3 7 8 12. Second sublist: 1 4 5 6. Because 7 > 4, copy 4 from L2 to the sorted list and advance the position in L2. Sorted list: 1 3 4.

42 Merge Sort: Merging
First sublist: 3 7 8 12. Second sublist: 1 4 5 6. Because 7 > 5 and 7 > 6, copy 5 and 6 from L2 to the sorted list. Sorted list: 1 3 4 5 6.

43 Merge Sort: Merging
First sublist: 3 7 8 12. Second sublist: 1 4 5 6. L2 is exhausted, so copy the remaining items of L1 to the sorted list. Sorted list: 1 3 4 5 6 7 8 12.

44 Implementation of Merging
void Merge(Data* list, int left, int mid, int right) {
    int sorted[MAX_SIZE];
    int first = left, second = mid + 1, i = left;
    // Merge the two sublists by comparing elements in sequence.
    while (first <= mid && second <= right) {
        if (list[first] <= list[second])
            sorted[i++] = list[first++];
        else
            sorted[i++] = list[second++];
    }
    // Copy the remaining items of whichever sublist is not exhausted.
    if (first > mid)
        for (int j = second; j <= right; j++) sorted[i++] = list[j];
    else
        for (int j = first; j <= mid; j++) sorted[i++] = list[j];
    // Copy the sorted result back to the list.
    for (int j = left; j <= right; j++) list[j] = sorted[j];
}
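With MAX_SIZE assumed as 100 (the slides do not give a value) and Data as int, Merge can be checked directly against the walkthrough, merging 3 7 8 12 and 1 4 5 6:

```c
#include <assert.h>

#define MAX_SIZE 100   /* assumed bound, not given in the slides */
typedef int Data;

void Merge(Data* list, int left, int mid, int right) {
    int sorted[MAX_SIZE];
    int first = left, second = mid + 1, i = left;
    while (first <= mid && second <= right) {  /* take the smaller head */
        if (list[first] <= list[second]) sorted[i++] = list[first++];
        else                             sorted[i++] = list[second++];
    }
    if (first > mid)   /* copy whichever sublist is not exhausted */
        for (int j = second; j <= right; j++) sorted[i++] = list[j];
    else
        for (int j = first; j <= mid; j++) sorted[i++] = list[j];
    for (int j = left; j <= right; j++) list[j] = sorted[j];
}
```

Note the tie-break `<=` takes equal keys from the left sublist first, which is what makes merge sort stable.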

45 Implementation of Merge Sort
Overall procedure
Partitioning: split the list into two halves.
Merge: merge the two sorted sublists into one list.
Recursively apply the above steps to the sublists.
void MergeSort(Data* list, int left, int right) {
    if (left < right) {
        int mid = (left + right) / 2;     // Equal partitioning
        MergeSort(list, left, mid);       // Sort the left sublist.
        MergeSort(list, mid + 1, right);  // Sort the right sublist.
        Merge(list, left, mid, right);    // Merge the two sublists.
    }
}

46 Exercise: Merge Sort
Animation: sorting 3, 4, 5, 4, 9, 1, 2, 6, 7
Draw the step-by-step procedure of merge sort. Q: Is it stable?
Initial list. After sorting two sublists.

47 Iterative Merge Sort
Iterative merge sort algorithm: repeatedly merge runs of increasing length. Example with 7 12 8 3 1 6 5 4:
Length 1: 7 12 8 3 1 6 5 4
Length 2: 7 12 | 3 8 | 1 6 | 4 5
Length 4: 3 7 8 12 | 1 4 5 6
Length 8: 1 3 4 5 6 7 8 12

48 Iterative Merge Sort
void IterMergeSort(Data* list, int n) {
    // Merge subarrays in a bottom-up manner: first merge subarrays of
    // size 1 to create sorted subarrays of size 2, then merge subarrays
    // of size 2 to create sorted subarrays of size 4, and so on.
    for (int size = 1; size <= n - 1; size = 2 * size) {
        // Pick the starting point of each subarray of the current size.
        for (int left_start = 0; left_start < n - 1; left_start += 2 * size) {
            // Find the ending point of the left subarray;
            // mid + 1 is the starting point of the right subarray.
            // Clamp mid so a short final run never runs past the array.
            int mid = MIN(left_start + size - 1, n - 1);
            int right_end = MIN(left_start + 2 * size - 1, n - 1);
            // Merge list[left_start..mid] and list[mid+1..right_end].
            Merge(list, left_start, mid, right_end);
        }
    }
}
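A compilable sketch (assuming Data is int, MAX_SIZE 100, and a MIN macro, none of which the slides define); mid is clamped to n - 1 so a short final run of a non-power-of-two list never reads past the array:

```c
#include <assert.h>

#define MAX_SIZE 100
#define MIN(x, y) ((x) < (y) ? (x) : (y))
typedef int Data;

void Merge(Data* list, int left, int mid, int right) {
    int sorted[MAX_SIZE];
    int first = left, second = mid + 1, i = left;
    while (first <= mid && second <= right) {
        if (list[first] <= list[second]) sorted[i++] = list[first++];
        else                             sorted[i++] = list[second++];
    }
    if (first > mid) for (int j = second; j <= right; j++) sorted[i++] = list[j];
    else             for (int j = first; j <= mid; j++)    sorted[i++] = list[j];
    for (int j = left; j <= right; j++) list[j] = sorted[j];
}

/* Bottom-up merge sort: merge runs of size 1, 2, 4, ... */
void IterMergeSort(Data* list, int n) {
    for (int size = 1; size <= n - 1; size = 2 * size) {
        for (int left_start = 0; left_start < n - 1; left_start += 2 * size) {
            int mid = MIN(left_start + size - 1, n - 1);       /* end of left run */
            int right_end = MIN(left_start + 2 * size - 1, n - 1);
            Merge(list, left_start, mid, right_end);
        }
    }
}
```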

49 Analysis of Merge Sort
Time complexity
Split a list into two sublists: O(1)
Sort the two sublists: 2T(n/2)
Merge the two sublists: cn
So the recurrence relation is T(n) = 2T(n/2) + cn, and the time complexity of merge sort is O(n log n). The average case and the worst case are equal.
Is it stable? Yes, the merging procedure can maintain stability (ties are taken from the left sublist first).

50 Comparison of Sorting Algorithms
Algorithm        Best          Average       Worst
Selection sort   O(n^2)        O(n^2)        O(n^2)
Bubble sort      O(n^2)        O(n^2)        O(n^2)
Insertion sort   O(n)          O(n^2)        O(n^2)
Shell sort       O(n log n)    O(?)          O(n^2)
Quick sort       O(n log n)    O(n log n)    O(n^2)
Merge sort       O(n log n)    O(n log n)    O(n log n)
Heap sort        O(n log n)    O(n log n)    O(n log n)
Radix sort       O(dn)         O(dn)         O(dn)

51 Comparison of Sorting Algorithms
Comparing the running time (N = 100K):
Algorithm        Running time (sec)
Selection sort   10.842
Bubble sort      22.894
Insertion sort    7.438
Shell sort        0.056
Quick sort        0.014
Merge sort        0.026
Heap sort         0.034

52 Summary of Sorting Algorithms
Pros & cons
Insertion sort: best for almost-sorted input, O(n); best for a small number of elements.
Quick sort: best in the average case; worst case O(n^2).
Merge sort: best in the worst case, O(n log n).
Combination of sorting algorithms: insertion sort is the fastest when n < 20, quick sort is the fastest when 20 < n < 45, and merge sort is the fastest when n is large.

53 Summary of Sorting Algorithms
Combining sorting algorithms: if the number of elements is small, insertion sort is the fastest.
void QuickSort(Data* list, int left, int right) {
    if (left < right) {
        // mid is the final position of the pivot.
        int mid = Partition(list, left, right);
        // Sort the elements less than the pivot.
        // (InsertionSort here sorts the range [left, right],
        // a range-based variant of the earlier routine.)
        if (mid - left < 20)
            InsertionSort(list, left, mid - 1);
        else
            QuickSort(list, left, mid - 1);
        // Sort the elements greater than the pivot.
        if (right - mid < 20)
            InsertionSort(list, mid + 1, right);
        else
            QuickSort(list, mid + 1, right);
    }
}
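A self-contained sketch of this hybrid, using a hypothetical range-based InsertionSortRange helper (the earlier InsertionSort takes only a length), can be verified on the quick-sort example:

```c
#include <assert.h>

typedef int Data;  /* assumed definition */
#define SWAP(a, b, t) do { (t) = (a); (a) = (b); (b) = (t); } while (0)

/* Hypothetical helper: insertion sort over list[left..right]. */
static void InsertionSortRange(Data* list, int left, int right) {
    for (int i = left + 1; i <= right; i++) {
        Data key = list[i];
        int j = i - 1;
        while (j >= left && key < list[j]) { list[j + 1] = list[j]; j--; }
        list[j + 1] = key;
    }
}

static int Partition(Data* list, int left, int right) {
    int pivot = list[left], temp;
    int low = left + 1, high = right;
    while (1) {
        while (low < right && list[low] < pivot) low++;
        while (high > left && list[high] >= pivot) high--;
        if (low < high) SWAP(list[low], list[high], temp);
        else break;
    }
    SWAP(list[left], list[high], temp);
    return high;
}

/* Hybrid: fall back to insertion sort for sublists shorter than 20. */
void HybridQuickSort(Data* list, int left, int right) {
    if (left < right) {
        int mid = Partition(list, left, right);
        if (mid - left < 20) InsertionSortRange(list, left, mid - 1);
        else                 HybridQuickSort(list, left, mid - 1);
        if (right - mid < 20) InsertionSortRange(list, mid + 1, right);
        else                  HybridQuickSort(list, mid + 1, right);
    }
}
```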

54 Optimal Sorting Time
What is the optimal time for sorting by comparison?
A baseline sorting algorithm: given a list with three elements X, Y, and Z, generate all 3! possible candidate orderings: X Y Z, X Z Y, Y X Z, Y Z X, Z X Y, Z Y X. Check each ordering to see whether it is the sorted one.

55 Optimal Sorting Time
Now compare two elements, Y and Z. If Y < Z, how many candidates will be removed? Of X Y Z, X Z Y, Y X Z, Y Z X, Z X Y, Z Y X, every ordering that places Z before Y is eliminated. Ideally, each comparison removes half of the candidate orderings. How many comparisons do we need?

56 Optimal Sorting Time
For a list with three elements X, Y, and Z, the comparisons form a decision tree. The root compares X <= Y, splitting the six orderings into {[X, Y, Z], [X, Z, Y], [Z, X, Y]} and {[Y, X, Z], [Y, Z, X], [Z, Y, X]}; further comparisons (Y <= Z or X <= Z) split each group until a single ordering remains at each leaf.

57 Optimal Sorting Time
For a list with n elements, how many candidate orderings are there? n!
How many comparisons k are needed to narrow them down to one? 2^k = n!, so k = log n!, and O(log n!) = O(n log n).
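The final equality can be justified without Stirling's full formula; a sketch of the two-sided bound:

```latex
\log_2 n! \;=\; \sum_{i=1}^{n} \log_2 i \;\le\; n \log_2 n,
\qquad
\log_2 n! \;\ge\; \sum_{i=\lceil n/2\rceil}^{n} \log_2 i \;\ge\; \frac{n}{2}\,\log_2 \frac{n}{2},
```

so log n! is bounded above and below by constant multiples of n log n, giving O(log n!) = O(n log n).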

58 Sorting Algorithms Non-comparison sorting algorithms

59 What is Counting Sort?
Description: a non-comparison sorting algorithm.
Count the number of elements with each distinct key value. Then output each element from the input sequence, decreasing its count by one as it is placed.
Example input: 1 6 1 2 1. The counts for keys 1 through 6 are 3, 1, 0, 0, 0, 1, and the sorted output is 1 1 1 2 6.

60 Example of Counting Sort
Description
1. Count the number of elements with each distinct key value.
2. Determine the positions of the key values in the output by accumulating the counts.
3. Output each element from the input sequence, decreasing its count by one as it is placed.
For input 1 6 1 2 1: the counts for keys 1..6 are 3, 1, 0, 0, 0, 1; accumulating gives the ending positions 3, 4, 4, 4, 4, 5.

61 Example of Counting Sort
Process the input 1 6 1 2 1. Because the position of key 1 is 3, store 1 to output[3] and decrease count[1] to 2.

62 Example of Counting Sort
Because the position of key 6 is 5, store 6 to output[5] and decrease count[6] to 4.

63 Example of Counting Sort
Because the position of key 1 is now 2, store 1 to output[2] and decrease count[1] to 1.

64 Example of Counting Sort
Because the position of key 2 is 4, store 2 to output[4] and decrease count[2] to 3.

65 Example of Counting Sort
Because the position of key 1 is now 1, store 1 to output[1] and decrease count[1] to 0. The output 1 1 1 2 6 is sorted.

66 Implementation of Counting Sort
void CountingSort(Data* list, int n) {
    Data count[MAX_SIZE] = { 0 };
    Data output[MAX_SIZE];
    // Count the occurrences of each key.
    for (int i = 0; i < n; i++)
        count[list[i]]++;
    // Accumulate the counts to get ending positions.
    for (int i = 1; i < MAX_SIZE; i++)
        count[i] += count[i - 1];
    // Read the elements of the list and copy them to the output list.
    for (int i = 0; i < n; i++) {
        output[count[list[i]] - 1] = list[i];
        count[list[i]]--;
    }
    // Copy the output list back to the original list.
    for (int i = 0; i < n; i++)
        list[i] = output[i];
}
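With MAX_SIZE assumed as 100 (large enough for the exercise's single-digit keys) and Data as int, the routine can be checked on the exercise list:

```c
#include <assert.h>

#define MAX_SIZE 100   /* assumed bound on key values and list length */
typedef int Data;

void CountingSort(Data* list, int n) {
    Data count[MAX_SIZE] = { 0 };
    Data output[MAX_SIZE];
    for (int i = 0; i < n; i++)              /* count each key */
        count[list[i]]++;
    for (int i = 1; i < MAX_SIZE; i++)       /* accumulate to positions */
        count[i] += count[i - 1];
    for (int i = 0; i < n; i++) {            /* place each element */
        output[count[list[i]] - 1] = list[i];
        count[list[i]]--;
    }
    for (int i = 0; i < n; i++)              /* copy back */
        list[i] = output[i];
}
```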

67 Exercise: Counting Sort
Animation: sorting 3, 4, 5, 4, 9, 1, 2, 6, 7 Draw the step-by-step procedure of counting sort. Q: Is it stable? Initial list

68 Analysis of Counting Sort
Characteristics
The time complexity is linear in the number of items plus the range of key values (the difference between the maximum and minimum keys). It is only suitable when the range of keys is not significantly greater than the number of items. It is often used as a subroutine in another sorting algorithm, radix sort, which can handle larger keys more efficiently.

69 What is Radix Sort?
Description: a non-comparison sorting algorithm.
Group keys by the individual digits that share the same significant position and value.
First pass (least significant digit) on input 12 16 22 34 46: grouping by the last digit (X2, X4, X6) gives 12 22 34 16 46.

70 What is Radix Sort?
Second pass (most significant digit) on 12 22 34 16 46: grouping by the first digit (1X, 2X, 3X, 4X) gives the sorted list 12 16 22 34 46.

71 Implementation of Radix Sort
void Counting(int list[], int n, int exp) {
    int count[10] = { 0 };
    int output[MAX_SIZE];
    // Store the count of occurrences of each digit.
    for (int i = 0; i < n; i++)
        count[(list[i] / exp) % 10]++;
    // Change count[i] so that it contains the actual ending position
    // of this digit in the output list.
    for (int i = 1; i < 10; i++)
        count[i] += count[i - 1];
    // Build the output list (scanning backwards keeps the sort stable).
    for (int i = n - 1; i >= 0; i--) {
        output[count[(list[i] / exp) % 10] - 1] = list[i];
        count[(list[i] / exp) % 10]--;
    }
    // Copy the output list back to list[], so that list[] now contains
    // the numbers sorted by the current digit.
    for (int i = 0; i < n; i++)
        list[i] = output[i];
}

72 Implementation of Radix Sort
void RadixSort(Data* list, int n) {
    // Find the maximum number to know the number of digits.
    int max = list[0];
    for (int i = 1; i < n; i++) {
        if (list[i] > max) max = list[i];
    }
    // Do counting sort for every digit. Note that instead of passing
    // the digit number, exp is passed: exp is 10^i where i is the
    // current digit number.
    for (int exp = 1; max / exp > 0; exp *= 10)
        Counting(list, n, exp);
}
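Combining the two routines (assuming Data is int and MAX_SIZE 100) reproduces the two-digit example from the earlier slides:

```c
#include <assert.h>

#define MAX_SIZE 100   /* assumed bound, not given in the slides */
typedef int Data;

/* Stable counting sort on the digit selected by exp (1, 10, 100, ...). */
void Counting(int list[], int n, int exp) {
    int count[10] = { 0 };
    int output[MAX_SIZE];
    for (int i = 0; i < n; i++)
        count[(list[i] / exp) % 10]++;
    for (int i = 1; i < 10; i++)
        count[i] += count[i - 1];
    for (int i = n - 1; i >= 0; i--) {       /* backwards pass: stable */
        output[count[(list[i] / exp) % 10] - 1] = list[i];
        count[(list[i] / exp) % 10]--;
    }
    for (int i = 0; i < n; i++)
        list[i] = output[i];
}

void RadixSort(Data* list, int n) {
    int max = list[0];
    for (int i = 1; i < n; i++)
        if (list[i] > max) max = list[i];
    for (int exp = 1; max / exp > 0; exp *= 10)  /* one pass per digit */
        Counting(list, n, exp);
}
```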

73 Exercise: Radix Sort Animation: sorting 3, 4, 5, 4, 9, 1, 2, 6, 7
Draw the step-by-step procedure of radix sort. Q: Is it stable? Initial list

74 Analysis of Radix Sort
Time complexity
For each digit, perform a counting sort. The time complexity of radix sort is O(dn), where d is the maximum number of digits and n is the number of elements.

