Quicksort, Mergesort, and Heapsort

Quicksort
- Fastest known sorting algorithm in practice.
- Caveats: not stable; vulnerable to certain attacks.
- Average-case complexity: O(N log N).
- Worst-case complexity: O(N^2), but this rarely happens if coded correctly.

Quicksort Outline
Divide and conquer approach:
1. Given array S to be sorted; if the size of S is ≤ 1, we are done.
2. Pick any element v in S as the pivot.
3. Partition S - {v} (the remaining elements in S) into two groups:
   - S1 = {all elements in S - {v} that are smaller than v}
   - S2 = {all elements in S - {v} that are larger than v}
4. Return {quicksort(S1) followed by v followed by quicksort(S2)}.
The trick lies in handling the partitioning (step 3):
- picking a good pivot
- efficiently partitioning in place
A minimal sketch of this outline follows.
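A minimal C++ sketch of the outline above, assuming elements equal to the pivot are kept alongside it (the outline itself only mentions smaller and larger elements); this is the simple, non-in-place version that the later slides refine:

    #include <vector>
    using std::vector;

    // Direct transcription of the outline: not in-place, and the pivot choice
    // (first element) is only a placeholder; later slides refine it.
    vector<int> quicksort(const vector<int>& S) {
        if (S.size() <= 1) return S;              // base case: already sorted
        int v = S[0];                             // pick any element as the pivot
        vector<int> S1, S2, equal{v};
        for (size_t k = 1; k < S.size(); ++k) {   // partition S - {v}
            if (S[k] < v)      S1.push_back(S[k]);
            else if (S[k] > v) S2.push_back(S[k]);
            else               equal.push_back(S[k]);  // duplicates of the pivot
        }
        vector<int> result = quicksort(S1);       // sort the smaller elements
        result.insert(result.end(), equal.begin(), equal.end());
        vector<int> larger = quicksort(S2);       // sort the larger elements
        result.insert(result.end(), larger.begin(), larger.end());
        return result;
    }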

Quicksort example
[Figure: an example run showing pivot selection, partitioning, the recursive calls, and the final merge of the results.]

Picking the Pivot
How would you pick one?
Strategy 1: Pick the first element in S
- Works only if the input is random.
- What if the input S is sorted, or even mostly sorted? All the remaining elements would go into either S1 or S2, giving terrible performance!
- Why worry about sorted input? Remember that quicksort is recursive, so sub-problems could already be sorted. Plus, mostly sorted input is quite frequent.

Picking the Pivot (contd.)
Strategy 2: Pick the pivot randomly
- Would usually work well, even for mostly sorted input.
- Unless the random number generator is not quite random!
- Plus, random number generation can be a relatively expensive operation.

Picking the Pivot (contd.)
Strategy 3: Median-of-three partitioning
- Ideally, the pivot should be the median of the input array S (the element in the middle of the sorted sequence), which would divide the input into two almost equal partitions.
- Unfortunately, it's hard to calculate the median quickly without sorting first!
- So find an approximate median: pivot = median of the left-most, right-most, and center elements of the array S.
- This solves the problem of sorted input.

Picking the Pivot (contd.)
Example: median-of-three partitioning
- Let input S = {6, 1, 4, 9, 0, 3, 5, 2, 7, 8}
- left = 0 and S[left] = 6
- right = 9 and S[right] = 8
- center = (left + right) / 2 = 4 and S[center] = 0
- Pivot = median of S[left], S[right], and S[center] = median of 6, 8, and 0 = S[left] = 6

Partitioning Algorithm
Original input: S = {6, 1, 4, 9, 0, 3, 5, 2, 7, 8}
1. Get the pivot out of the way by swapping it with the last element.
2. Use two "iterators", i and j:
   - i starts at the first element and moves forward.
   - j starts at the last element and moves backward.

Partitioning Algorithm (contd.)
3. While (i < j):
   - Move i to the right until we find a number greater than the pivot.
   - Move j to the left until we find a number smaller than the pivot.
   - If (i < j), swap(S[i], S[j]).
     (The effect is to push larger elements to the right and smaller elements to the left.)
4. Swap the pivot with S[i].
A sketch of this loop in code follows.
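One possible way to code this loop in C++, assuming the pivot has already been swapped to a[right] as described on the previous slide; the name partition and the exact boundary tests are assumptions, and different textbooks handle the edge cases slightly differently:

    #include <utility>
    #include <vector>

    // Returns the final position of the pivot; afterwards every element before
    // that position is <= pivot and every element after it is >= pivot.
    int partition(std::vector<int>& a, int left, int right) {
        int pivot = a[right];                      // pivot was swapped to the last slot
        int i = left - 1, j = right;
        for (;;) {
            while (a[++i] < pivot) {}              // move i right over smaller elements
            while (j > left && a[--j] > pivot) {}  // move j left over larger elements
            if (i >= j) break;                     // the iterators have crossed
            std::swap(a[i], a[j]);                 // push small left, large right
        }
        std::swap(a[i], a[right]);                 // put the pivot between the two groups
        return i;
    }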

Partitioning Algorithm Illustrated
[Figure: step-by-step trace of the example array, alternating "move" and "swap" steps until i and j have crossed, and finally swapping S[i] with the pivot.]

Dealing with Small Arrays
- For small arrays (N ≤ 20), insertion sort is faster than quicksort.
- Quicksort is recursive, so it can spend a lot of time sorting small arrays.
- Hybrid algorithm: switch to insertion sort when the problem size is small (say, N < 20), as in the sketch below.
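A possible insertion sort routine for those small subarrays; the name insertionSort and the (left, right) interface are assumptions, chosen to match the quicksort sketches further down:

    #include <vector>

    // Sort a[left..right] in place by insertion; efficient when the range is short.
    void insertionSort(std::vector<int>& a, int left, int right) {
        for (int p = left + 1; p <= right; ++p) {
            int tmp = a[p];                        // element to insert
            int j = p;
            while (j > left && a[j - 1] > tmp) {
                a[j] = a[j - 1];                   // shift larger elements one slot right
                --j;
            }
            a[j] = tmp;                            // drop it into its place
        }
    }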

Quicksort Driver Routine
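The slide's code is not reproduced in the transcript; a plausible driver might look like the following, where the two-argument quicksort is the recursive helper sketched after the "Quicksort Routine" slide:

    #include <vector>

    void quicksort(std::vector<int>& a, int left, int right);  // recursive helper (below)

    // Public driver: hides the index bookkeeping from the caller.
    void quicksort(std::vector<int>& a) {
        if (!a.empty())
            quicksort(a, 0, static_cast<int>(a.size()) - 1);
    }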

Quicksort Pivot Selection Routine
- Swap a[left], a[center], and a[right] in place so that they end up in sorted order.
- The pivot is now in a[center].
- Swap the pivot a[center] with a[right-1].
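The routine itself is missing from the transcript; a sketch that matches the three annotations above might look like this (the name median3 is an assumption):

    #include <utility>
    #include <vector>

    // Orders a[left], a[center], a[right] with pairwise swaps, then hides the
    // median (the pivot) at a[right - 1] and returns its value.
    int median3(std::vector<int>& a, int left, int right) {
        int center = (left + right) / 2;
        if (a[center] < a[left])   std::swap(a[left],   a[center]);
        if (a[right]  < a[left])   std::swap(a[left],   a[right]);
        if (a[right]  < a[center]) std::swap(a[center], a[right]);
        // Now a[left] <= a[center] <= a[right]; the median sits in a[center].
        std::swap(a[center], a[right - 1]);        // park the pivot at right - 1
        return a[right - 1];
    }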

Quicksort Routine
- Has a side effect: the input array is rearranged in place.
- The inner loops alternate the "move" and "swap" steps from the partitioning illustration.
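The code itself is again missing from the transcript; below is a sketch of the recursive routine that combines median-of-three pivot selection, the move/swap partition loop, and the insertion-sort cutoff from the earlier slides. CUTOFF = 20 is an assumed value, and median3 and insertionSort refer to the sketches above:

    #include <utility>
    #include <vector>

    const int CUTOFF = 20;   // assumed cutoff, matching the "N < 20" suggestion earlier

    int  median3(std::vector<int>& a, int left, int right);        // sketched above
    void insertionSort(std::vector<int>& a, int left, int right);  // sketched above

    // Recursive helper; the "side effect" is that a[left..right] is rearranged in place.
    void quicksort(std::vector<int>& a, int left, int right) {
        if (left + CUTOFF <= right) {
            int pivot = median3(a, left, right);   // pivot parked at a[right - 1]
            int i = left, j = right - 1;
            for (;;) {
                while (a[++i] < pivot) {}          // "move" i right
                while (a[--j] > pivot) {}          // "move" j left
                if (i < j)
                    std::swap(a[i], a[j]);         // "swap" the out-of-place pair
                else
                    break;                         // i and j have crossed
            }
            std::swap(a[i], a[right - 1]);         // restore the pivot to its final spot
            quicksort(a, left, i - 1);             // sort the smaller elements
            quicksort(a, i + 1, right);            // sort the larger elements
        } else {
            insertionSort(a, left, right);         // small subarray: insertion sort
        }
    }

Calling the one-argument driver from the previous slide then sorts the whole vector.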

Exercise: Runtime Analysis
- Worst case
- Average case
- Best case
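One way to set up the exercise (a sketch, not a full derivation; c stands for the constant hidden in the O(N) partitioning work of each call):
- Worst case: the pivot is always the smallest (or largest) element, so one side of the partition is empty: T(N) = T(N-1) + cN, which solves to O(N^2).
- Best case: the pivot is always the median, splitting the array evenly: T(N) = 2T(N/2) + cN, which solves to O(N log N), exactly like the mergesort recurrence at the end of these slides.
- Average case: averaging over all possible pivot ranks gives T(N) = (2/N) * (T(0) + T(1) + ... + T(N-1)) + cN, which also solves to O(N log N).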

Heapsort
- Build a binary heap of the N elements: O(N) time.
- Then perform N deleteMax operations: O(log N) time per deleteMax.
- Total complexity: O(N log N).

Example
[Figure: the heap array after BuildHeap, and again after the first deleteMax.]

Heapsort Implementation
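The implementation is not reproduced in the transcript; a sketch that keeps the max-heap inside the array itself (the helper names percDown and leftChild are assumptions) might look like this:

    #include <utility>
    #include <vector>

    // Index of the left child when the heap is stored starting at index 0.
    inline int leftChild(int i) { return 2 * i + 1; }

    // Percolate a[i] down within the first n elements to restore the max-heap property.
    void percDown(std::vector<int>& a, int i, int n) {
        int tmp = a[i];
        int child;
        for (; leftChild(i) < n; i = child) {
            child = leftChild(i);
            if (child + 1 < n && a[child + 1] > a[child])
                ++child;                           // pick the larger of the two children
            if (a[child] > tmp)
                a[i] = a[child];                   // move the child up
            else
                break;
        }
        a[i] = tmp;
    }

    // Heapsort: buildHeap, then N deleteMax operations into the shrinking tail.
    void heapsort(std::vector<int>& a) {
        int n = static_cast<int>(a.size());
        for (int i = n / 2 - 1; i >= 0; --i)       // buildHeap: O(N)
            percDown(a, i, n);
        for (int i = n - 1; i > 0; --i) {
            std::swap(a[0], a[i]);                 // deleteMax: the max goes to the end
            percDown(a, 0, i);                     // re-heapify the remaining prefix
        }
    }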

Mergesort
- Divide the N values to be sorted into two halves.
- Recursively sort each half using mergesort (base case N = 1: no sorting required).
- Merge the two halves: an O(N) operation.
- Complexity? We'll see.

Mergesort Implementation
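The code is likewise missing from the transcript; a sketch that allocates one temporary array in the driver and reuses it across the recursive calls (the names merge and tmp are assumptions):

    #include <vector>

    // Merge the two sorted runs a[left..center] and a[center+1..right] using tmp.
    void merge(std::vector<int>& a, std::vector<int>& tmp,
               int left, int center, int right) {
        int i = left, j = center + 1, k = left;
        while (i <= center && j <= right)
            tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i <= center) tmp[k++] = a[i++];     // copy any leftovers from the left run
        while (j <= right)  tmp[k++] = a[j++];     // ... or from the right run
        for (k = left; k <= right; ++k)            // copy the merged run back into a
            a[k] = tmp[k];
    }

    // Recursive mergesort on a[left..right].
    void mergesort(std::vector<int>& a, std::vector<int>& tmp, int left, int right) {
        if (left >= right) return;                 // base case: 0 or 1 element
        int center = (left + right) / 2;
        mergesort(a, tmp, left, center);           // sort the left half
        mergesort(a, tmp, center + 1, right);      // sort the right half
        merge(a, tmp, left, center, right);        // O(N) merge of the two halves
    }

    // Driver: allocates the temporary array once instead of once per merge.
    void mergesort(std::vector<int>& a) {
        std::vector<int> tmp(a.size());
        if (!a.empty())
            mergesort(a, tmp, 0, static_cast<int>(a.size()) - 1);
    }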

Complexity Analysis
T(N) = 2 T(N/2) + N      (the recurrence relation for mergesort)
T(N) = 4 T(N/4) + 2N
T(N) = 8 T(N/8) + 3N
...
T(N) = 2^k T(N/2^k) + kN
For k = log N:  T(N) = N T(1) + N log N, i.e. T(N) = O(N log N).