
1 QuickSort
CSE 143 Lecture 23
Previous slides based on ones by Ethan Apter & Marty Stepp

2 Merge Sort
Merge sort: divide a list into two halves, sort the halves, recombine the sorted halves into a sorted whole.
Merge sort is an example of a "divide and conquer" algorithm.
Divide and conquer algorithm: an algorithm that repeatedly divides the given problem into smaller pieces that can be solved more easily.
It's easier to sort the two small lists than the one big list.

3 Merge Sort Picture
[Diagram: merge sort of the array {22, 18, 12, -4, 58, 7, 31, 42} — the array is repeatedly split in half, then the halves are merged back together in sorted order, ending with {-4, 7, 12, 18, 22, 31, 42, 58}.]

4 sort Final Code
Final version of sort:

public static void sort(int[] list) {
    if (list.length > 1) {
        // split the list into two halves
        int[] list1 = new int[list.length / 2];
        int[] list2 = new int[list.length - list.length / 2];
        for (int i = 0; i < list1.length; i++) {
            list1[i] = list[i];
        }
        for (int i = 0; i < list2.length; i++) {
            list2[i] = list[i + list1.length];
        }

        // sort the halves, then merge them back into the original array
        sort(list1);
        sort(list2);
        mergeInto(list, list1, list2);
    }
}

5 mergeInto Final Code

private static void mergeInto(int[] result, int[] list1, int[] list2) {
    int i1 = 0;  // next unused value in list1
    int i2 = 0;  // next unused value in list2
    for (int i = 0; i < result.length; i++) {
        // take from list1 if list2 is exhausted, or if list1's next
        // value is smaller or equal; otherwise take from list2
        if (i2 >= list2.length ||
                (i1 < list1.length && list1[i1] <= list2[i2])) {
            result[i] = list1[i1];
            i1++;
        } else {
            result[i] = list2[i2];
            i2++;
        }
    }
}
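A minimal usage sketch of the finished merge sort. The demo class name, array contents, and printing are illustrative additions, assuming the sort and mergeInto methods above live in the same class:

import java.util.Arrays;

public class MergeSortDemo {
    public static void main(String[] args) {
        int[] data = {22, 18, 12, -4, 58, 7, 31, 42};
        sort(data);  // the sort method from slide 4
        System.out.println(Arrays.toString(data));
        // expected output: [-4, 7, 12, 18, 22, 31, 42, 58]
    }

    // sort (slide 4) and mergeInto (slide 5) would be defined here
}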

6 Analyzing Our Performance
Sun's sort is faster than ours, but only by a factor of about 2.5.
That's really good! It didn't take us long to write this, and Sun's Arrays.sort is a professional sorting method.
Sun has had years to fine-tune the performance of Arrays.sort, and yet we wrote a reasonably competitive merge sort in less than an hour!
So what's the complexity of merge sort?

7 Complexity of Merge Sort
To determine the time complexity, let's break our merge sort into pieces and analyze the pieces.
Remember, merge sort consists of:
  divide a list into two halves
  sort the halves
  recombine the sorted halves into a sorted whole
Dividing the list in half and recombining the lists are pretty easy to analyze: both have O(n) time complexity.
But what about sorting the halves?

8 Complexity of Merge Sort
We can think of merge sort as occurring in levels:
  at the first level, we want to sort the whole list
  at the second level, we want to sort the two half lists
  at the third level, we want to sort the four quarter lists
  ...
We know there's O(n) work at each level from dividing/recombining the lists.
But how many levels are there?
If we can figure this out, our time complexity is just O(n * num_levels).

9 Complexity of Merge Sort
Because we divide the array in half each time, there are log(n) levels.
So merge sort is an O(n log(n)) algorithm.
This is a big improvement over the O(n^2) sorting algorithms.
log(n) levels, O(n) work at each level.
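The same levels argument can be written as a recurrence; this derivation is a supplementary sketch, not part of the original slides:

T(n) = 2 T(n/2) + cn        (cn covers the split and the merge)
     = 4 T(n/4) + 2cn
     = ...
     = 2^k T(n/2^k) + k * cn

With k = log_2(n) levels, T(n) = n * T(1) + cn * log_2(n), which is O(n log n).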

10 Quick Sort
Pick a "pivot".
Divide into less-than & greater-than pivot.
Sort each side recursively.

11 The steps of QuickSort
[Diagram, after Weiss: start with S = {81, 31, 57, 43, 13, 75, 92, 65, 26}; select a pivot value (57); partition S into S1 (values less than the pivot) and S2 (values greater than the pivot); run QuickSort(S1) and QuickSort(S2); then S1, the pivot, and S2 together give the sorted S = {13, 26, 31, 43, 57, 65, 75, 81, 92}. Presto! S is sorted.]

12 Quick sort vs. Merge sort
Quick sort:
  pick a pivot value from the array
  partition the list around the pivot value
  sort the left half
  sort the right half
Merge sort:
  divide a list into two identically sized halves
  sort the halves
  recombine the sorted halves into a sorted whole

13 QuickSort Example
[Partition trace on {5, 1, 3, 9, 7, 4, 2, 6, 8} with pivot 5: move i to the right until it reaches a value larger than the pivot (9); move j to the left until it reaches a value smaller than the pivot (2); swap them, giving {5, 1, 3, 2, 7, 4, 9, 6, 8}.]

14 QuickSort Example
[Trace continues on {5, 1, 3, 2, 7, 4, 9, 6, 8}: i and j keep moving inward, swapping 7 and 4 along the way, until they cross; the pivot (5) is then swapped into its final position, with S1 (values < pivot) on its left and S2 (values > pivot) on its right.]

15 Partition

private static int partition(int[] list, int low, int high) {
    swap(list, low, (low + high) / 2);  // swap middle value into first pos
    int pivot = list[low];              // remember pivot
    int index1 = low + 1;               // index of first unknown value
    int index2 = high;                  // index of last unknown value
    while (index1 <= index2) {          // while some values still unknown
        if (list[index1] <= pivot) {
            index1++;
        } else if (list[index2] > pivot) {
            index2--;
        } else {
            swap(list, index1, index2);
        }
    }
    swap(list, low, index2);  // put the pivot value between the two
                              // sublists and return its index
    return index2;
}
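The partition code calls a swap helper that doesn't appear in the transcript; a minimal sketch of what it presumably looks like:

private static void swap(int[] list, int i, int j) {
    // exchange the values at indexes i and j
    int temp = list[i];
    list[i] = list[j];
    list[j] = temp;
}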

16 Quicksort

public static void sort(int[] list, int low, int high) {
    if (low < high) {
        int pivotIndex = partition(list, low, high);
        sort(list, low, pivotIndex - 1);   // sort values left of the pivot
        sort(list, pivotIndex + 1, high);  // sort values right of the pivot
    }
}
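Callers typically wouldn't pass the bounds themselves; a hedged sketch of a convenience overload (not shown in the slides):

public static void sort(int[] list) {
    // sort the entire array by quicksorting the full index range
    sort(list, 0, list.length - 1);
}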

17 Optimized Quicksort

Quicksort(A[]: integer array, left, right : integer): {
    pivotindex : integer;
    if left + CUTOFF ≤ right then
        pivot := median3(A, left, right);
        pivotindex := Partition(A, left, right - 1, pivot);
        Quicksort(A, left, pivotindex - 1);
        Quicksort(A, pivotindex + 1, right);
    else
        Insertionsort(A, left, right);
}

Don't use quicksort for small arrays.
CUTOFF = 10 is reasonable.
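median3 isn't defined in the transcript; a common version (following Weiss) sorts the left, middle, and right elements and parks the median just before the end so Partition can use it. The Java sketch below is an assumption about its behavior, not the lecture's code:

private static int median3(int[] a, int left, int right) {
    int center = (left + right) / 2;
    if (a[center] < a[left])   swap(a, left, center);
    if (a[right]  < a[left])   swap(a, left, right);
    if (a[right]  < a[center]) swap(a, center, right);
    // now a[left] <= a[center] <= a[right]
    swap(a, center, right - 1);  // stash the median (pivot) at right - 1
    return a[right - 1];         // return the pivot value
}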

18 Complexity of Quicksort?
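The slide leaves the answer open; for reference, the standard recurrences (a supplementary sketch, not from the slides):

Best/average case (pivot splits the list roughly in half):
    T(n) = 2 T(n/2) + cn   =>   O(n log n)

Worst case (pivot is always the smallest or largest remaining value):
    T(n) = T(n - 1) + cn   =>   O(n^2)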

19 Complexity Classes

Complexity Class   Name               Example
O(1)               constant time      popping a value off a stack
O(log n)           logarithmic time   binary search on an array
O(n)               linear time        scanning all elements of an array
O(n log n)         log-linear time    binary search on a linked list and good sorting algorithms
O(n^2)             quadratic time     poor sorting algorithms (like inserting n items into SortedIntList)
O(n^3)             cubic time         (example in lecture 11)
O(2^n)             exponential time   really hard problems; these grow so fast that they're impractical

20 Examples of Each Complexity Class's Growth Rate

From Reges/Stepp, page 708: assume that all complexity classes can process an input of size 100 in 100ms.

Input Size (n)   O(1)    O(log n)   O(n)      O(n log n)   O(n^2)           O(n^3)           O(2^n)
100              100ms   100ms      100ms     100ms        100ms            100ms            100ms
200              100ms   115ms      200ms     240ms        400ms            800ms            32.7 sec
400              100ms   130ms      400ms     550ms        1.6 sec          6.4 sec          12.4 days
800              100ms   145ms      800ms     1.2 sec      6.4 sec          51.2 sec         36.5 million years
1600             100ms   160ms      1.6 sec   2.7 sec      25.6 sec         6 min 49.6 sec   4.21 * 10^24 years
3200             100ms   175ms      3.2 sec   6 sec        1 min 42.4 sec   54 min 36 sec    5.6 * 10^61 years
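Each entry can be reproduced by scaling the 100ms baseline by the growth of its function; a worked example (supplementary, not in the slides), checking the O(n^2) column at n = 400:

time(400) = 100ms * f(400) / f(100)
          = 100ms * 400^2 / 100^2
          = 100ms * 16
          = 1.6 sec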

