
1 Sorting
Chapter 6 – Insertion Sort 6.1, Quicksort 6.2
Chapter 5 – Mergesort 5.2, Stable Sorts 223-224
Divide & Conquer

2 Three Algorithms to Know
Insertion Sort (not divide-and-conquer)
–O(n²) worst-case and average-case, O(n) best-case
–Very efficient on partially sorted lists
Mergesort (divide-and-conquer)
–O(n log n) worst-case, best-case, and average-case
–Stable, but not the most efficient
Quicksort (divide-and-conquer)
–O(n²) worst-case, but O(n log n) average-case
–Improvements make it the fastest practical sorting algorithm

3 1. Insertion Sort
is(a[], n) {                      // Assume first item a[0] is sorted
  for i = 1 to n-1 {              // Sort a[1] to a[n-1]
    v = a[i];                     // v is the item we want to insert
    j = i - 1;                    // j will iterate from i-1 down to 0
    while (j >= 0 && v < a[j]) {  // Keep looping until v can be inserted
      a[j+1] = a[j];              // Shift to make room
      j--;
    }
    a[j+1] = v;                   // Insert v into its correct position
  }
}
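For reference, here is the same algorithm as a minimal runnable C sketch; the sample array in main and the exact output formatting are my own choices, not from the slides:

    #include <stdio.h>

    /* Insertion sort, following the slide's pseudocode: grow a sorted prefix
       by inserting a[i] into its place among a[0..i-1]. */
    void insertion_sort(int a[], int n) {
        for (int i = 1; i < n; i++) {
            int v = a[i];                 /* item to insert */
            int j = i - 1;
            while (j >= 0 && v < a[j]) {  /* shift larger items right */
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = v;                 /* drop v into the gap */
        }
    }

    int main(void) {
        int a[] = {36, 14, 84, 20, 10};   /* arbitrary sample data */
        int n = sizeof a / sizeof a[0];
        insertion_sort(a, n);
        for (int i = 0; i < n; i++)
            printf("%d ", a[i]);          /* prints: 10 14 20 36 84 */
        printf("\n");
        return 0;
    }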

4 1. Insertion Sort, 10 items, Worst Case: 10*9/2 = 45 comparisons; in general n*(n-1)/2 = O(n²)
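As a quick check of the n*(n-1)/2 figure: on a reverse-sorted list, inserting the item at index i takes i comparisons, so the worst-case total is

    \[
      \sum_{i=1}^{n-1} i \;=\; \frac{n(n-1)}{2},
      \qquad n = 10:\ \frac{10 \cdot 9}{2} = 45 .
    \]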

5 1. Insertion Sort, 10 items, Best Case: 9 comparisons; O(n)

6 1. Insertion Sort, 10 items, Average Case

7 2. Mergesort
mergesort(a[], i, j) {
  if (i == j) return;      // If the sub-list is size 1, it is already sorted
  m = (i+j)/2;             // Compute the mid-point
  mergesort(a, i, m);      // Mergesort the first half
  mergesort(a, m+1, j);    // Mergesort the second half
  merge(a, i, m, j);       // Merge the two sorted halves
}
// Merging is O(n), where n = a + b and a and b are the sizes of the two sub-lists.

8 2. Mergesort
mergesort(a[], i, j) {
  if (i == j) return;      // Base case
  m = (i+j)/2;             // O(1)
  mergesort(a, i, m);      // Recursive call
  mergesort(a, m+1, j);    // Recursive call
  merge(a, i, m, j);       // O(n)
}
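The slides do not show merge itself, so here is one common way to write it, sketched in C under the assumption (made explicit on the later slides) that a temporary array is used; the function name and the use of malloc are my own choices:

    #include <stdlib.h>

    /* Merge the sorted ranges a[i..m] and a[m+1..j] back into a[i..j].
       Every item is moved twice: once into tmp and once back into a. */
    void merge(int a[], int i, int m, int j) {
        int n = j - i + 1;
        int *tmp = malloc(n * sizeof *tmp);
        int p = i, q = m + 1, k = 0;

        while (p <= m && q <= j) {           /* take the smaller front item */
            if (a[p] <= a[q])
                tmp[k++] = a[p++];
            else
                tmp[k++] = a[q++];
        }
        while (p <= m) tmp[k++] = a[p++];    /* copy leftovers from the left half */
        while (q <= j) tmp[k++] = a[q++];    /* copy leftovers from the right half */

        for (k = 0; k < n; k++)              /* copy everything back */
            a[i + k] = tmp[k];
        free(tmp);
    }

Taking from the left half on ties (the <= in the comparison) is what keeps Mergesort stable.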

9 Recall the Big Hammer (the Master Theorem):
a = # of recursive calls, b = 2 (if you cut the input size in half), and n^k is the running time of the actual function (irrespective of recursion).
1. a < b^k: T(n) = Θ(n^k)
2. a = b^k: T(n) = Θ(n^k log n)
3. a > b^k: T(n) = Θ(n^(log_b a))
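Applying this to the Mergesort code on slides 7 and 8, as a sanity check (not worked on the slide): two recursive calls on halves plus linear merging work gives

    \[
      T(n) = 2\,T(n/2) + \Theta(n)
      \;\Rightarrow\;
      a = 2,\; b = 2,\; k = 1,\; a = b^k
      \;\Rightarrow\;
      T(n) = \Theta(n \log n).
    \]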

10 2. The Merge Function 142036841012304548 ?????????

11 1420368412304548 ??????10??

12 14203684304548 ??????1012?

13 203684304548 14?????1012?

14 3684304548 1420????1012?

15 36844548 142030???1012?

16 844548 14203036??1012?

17 8448 1420303645?1012?

18 84 1420303645481012?

19 2. Merge Function
While the merge function does "minimal" comparisons, it must do a lot of "moves", i.e., swaps and copies. In fact, it is nearly impossible to do the merge efficiently without using extra memory. For each merge, all the items must be moved at least twice: once into the temp array and then back again.

20 2. Merge Function
Notice the left and right sub-lists are sorted. They must be sorted; otherwise the merge function won't work. Period.
Left: 1 30 40 48    Right: 10 15 36 81

21 2. Merge Function
The first comparison (1 vs. 10) does NOT require any swaps.
Left: 1 30 40 48    Right: 10 15 36 81

22 2. Merge Function
The second comparison does: 10 is less than 30, so we should swap.
Before: 1 30 40 48    10 15 36 81
After:  1 10 40 48    30 15 36 81

23 2. Merge Function
But how do we continue the merge? Which pointer should be moved?
1 10 40 48    30 15 36 81

24 2. Merge Function
The values 1 and 10 are sorted, so moving the blue pointer makes sense, right? But where should we move it?
1 10 40 48    30 15 36 81

25 2. Merge Function
We can't move it to the 40, because then we'll compare 40 with 30 next, and we'll miss 15, which is smaller.
1 10 40 48    30 15 36 81

26 2. Merge Function
Now we are in big trouble:
1. How exactly will the 15 get into the correct position?
2. What happened to our 2 sorted sub-lists?
1 10 40 48    30 15 36 81

27 2. Merge Function: Catch-22
The only way to efficiently merge sorted sub-lists is to use an extra temp array to do the swapping. However, if you use linked lists, then you can efficiently merge without extra memory. Double however: with linked lists you can't easily jump to the mid-point of each sub-list, so extra iteration is required.
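To make the linked-list point concrete, here is a hedged sketch in C (the node type and function name are my own, not from the slides) of merging two already-sorted singly linked lists by relinking nodes, with no extra array:

    #include <stddef.h>

    struct node {
        int value;
        struct node *next;
    };

    /* Merge two sorted lists by relinking their nodes; no temp array needed.
       Returns the head of the merged list. */
    struct node *merge_lists(struct node *a, struct node *b) {
        struct node head = {0, NULL};     /* dummy head simplifies the loop */
        struct node *tail = &head;

        while (a != NULL && b != NULL) {
            if (a->value <= b->value) {   /* <= keeps the merge stable */
                tail->next = a;
                a = a->next;
            } else {
                tail->next = b;
                b = b->next;
            }
            tail = tail->next;
        }
        tail->next = (a != NULL) ? a : b; /* append whatever remains */
        return head.next;
    }

Finding the mid-point to split a linked list still costs an extra walk down the list, which is the "double however" above.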

28 2. Mergesort Conclusion
Lesson: minimizing comparisons is good. But using extra memory, and having to move so much data around, makes Mergesort one of the most inefficient O(n log n) sorting algorithms, no matter how you implement it.

29 Quicksort
http://pages.stern.nyu.edu/~panos/java/Quicksort/index.html

quicksort(a[], i, j) {
  if (i < j) {
    p = partition(a, i, j);
    quicksort(a, i, p-1);
    quicksort(a, p+1, j);
  }
}

partition(a[], i, j) {
  v = a[i];
  h = i;
  for k = i+1 to j
    if (a[k] < v) {
      h++;
      swap(a[h], a[k]);
    }
  swap(a[i], a[h]);
  return h;
}
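A minimal runnable C translation of the pseudocode above, as a sketch (the sample array in main is made up):

    #include <stdio.h>

    static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

    /* Partition a[i..j] around the pivot a[i]; return the pivot's final index. */
    static int partition(int a[], int i, int j) {
        int v = a[i];              /* pivot value */
        int h = i;                 /* last index holding a value < v */
        for (int k = i + 1; k <= j; k++)
            if (a[k] < v) {
                h++;
                swap(&a[h], &a[k]);
            }
        swap(&a[i], &a[h]);        /* place the pivot; it never moves again */
        return h;
    }

    static void quicksort(int a[], int i, int j) {
        if (i < j) {
            int p = partition(a, i, j);
            quicksort(a, i, p - 1);
            quicksort(a, p + 1, j);
        }
    }

    int main(void) {
        int a[] = {30, 10, 48, 1, 36, 15, 81, 40};  /* arbitrary sample */
        int n = sizeof a / sizeof a[0];
        quicksort(a, 0, n - 1);
        for (int i = 0; i < n; i++)
            printf("%d ", a[i]);                    /* prints: 1 10 15 30 36 40 48 81 */
        printf("\n");
        return 0;
    }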

30 Quicksort, 31 items, Best Case: O(n log n). log n levels, O(n) work at each level.
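The 31-item figure works out exactly because 31 = 2^5 - 1: removing the pivot and splitting evenly at every step gives sub-lists of sizes 15, 7, 3, and 1, i.e., five levels of linear work:

    \[
      31 \to 15 \to 7 \to 3 \to 1,
      \qquad
      \lceil \log_2 31 \rceil = 5 \text{ levels}, \quad O(n) \text{ work per level}
      \;\Rightarrow\; O(n \log n).
    \]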

31 Quicksort, 31 items, Worst Case: O(n²). n levels, O(n) work at each level.

32 Quicksort, 31 items, Average Case: O(n log n). log n levels, O(n) work at each level. Proof on pages 250-252.

33 Quicksort vs. Mergesort
Mergesort: close to the minimum number of comparisons.
–But every comparison requires moving two values: a[i] → c[?] → a[?].
–Mergesort always moves 2n log n values.
Quicksort: does more comparisons, but fewer moves and swaps.
–Also, once the pivot is placed, it is never moved, i.e., it's in its correct position.

34 Important Concept
General Sorting
–The only thing you are allowed to do is compare two items at a time.
Special Sorting
–It is possible to compare one item with every other item in one operation.
–How so?

35 Decision Tree for Sorting
[Decision tree for sorting three items a, b, c: the root compares a <= b, the next levels compare b <= c and a <= c, and the six leaves are the orderings abc, acb, bac, bca, cab, cba.]

36 Decision Tree for Sorting
[The same decision tree, now with a fourth item d, annotated with how many permutations remain consistent with the comparisons made so far: 12 permutations, 6 permutations, 3 permutations, 2 permutations? Each comparison can at best cut the count in half.]

37 Lower-bound on General Sorting
h >= log(n!), where h is the minimum number of comparisons (the height of the decision tree).
log(n!) = Θ(n log n) (well known, 2.3.8).
Thus log(n!) = Ω(n log n), and h = Ω(n log n).
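For completeness, the standard estimate behind the "well known" fact, which the slide cites but does not spell out:

    \[
      \log(n!) \;=\; \sum_{i=1}^{n} \log i
      \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} \log i
      \;\ge\; \frac{n}{2}\,\log\frac{n}{2} \;=\; \Omega(n \log n),
      \qquad
      \log(n!) \;\le\; n \log n \;=\; O(n \log n).
    \]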

