Chapter 9 continued: Quicksort


Chapter 9 continued: Quicksort (Lecture 19)

- Similar to mergesort: a divide-and-conquer, recursive algorithm
- Average running time is O(n log n)
- O(n^2) worst-case performance, although this can be made exponentially unlikely
- Simple to understand and to prove correct
- But for many years it was considered hard to implement
- In practice, faster than mergesort

Basic algorithm to sort array S:

1. If the number of elements in S is 0 or 1, then return.
2. Pick any element v in S. This is called the pivot.
3. Partition S - {v} (the remaining elements in S) into two sets:
   S1 = the set of elements in S - {v} that are <= v, and
   S2 = the set of elements in S - {v} that are >= v.
4. Return {quicksort(S1) followed by v followed by quicksort(S2)}.

Note that:
- The choice of pivot is crucial.
- When we reassemble, there is no need to merge.
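As a concrete (if inefficient) illustration of this set-based description, here is a minimal Java sketch, assuming integer data and taking the first element as the pivot; it builds new lists rather than partitioning in place, and the class and method names are only illustrative. The in-place version used in practice is developed over the following slides.

import java.util.ArrayList;
import java.util.List;

public class SimpleQuickSort {

    // Follows the set-based description directly: not in place, first element as pivot.
    public static List<Integer> quicksort(List<Integer> s) {
        if (s.size() <= 1) {
            return new ArrayList<>(s);              // 0 or 1 elements: nothing to do
        }
        int pivot = s.get(0);                       // pick any element as the pivot v
        List<Integer> s1 = new ArrayList<>();       // elements of S - {v} that are <= v
        List<Integer> s2 = new ArrayList<>();       // elements of S - {v} that are > v
        for (int k = 1; k < s.size(); k++) {
            if (s.get(k) <= pivot) {
                s1.add(s.get(k));
            } else {
                s2.add(s.get(k));
            }
        }
        List<Integer> result = quicksort(s1);       // quicksort(S1)
        result.add(pivot);                          // followed by v
        result.addAll(quicksort(s2));               // followed by quicksort(S2): no merge needed
        return result;
    }
}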

Example: illustration of the steps

Original array:        13 81 92 43 31 65 57 26 75 0
Select pivot:          65
Partition:             13 43 31 57 26 0  |  65  |  81 92 75
Quicksort small part:  0 13 26 31 43 57
Quicksort large part:  75 81 92
Reassembled result:    0 13 26 31 43 57 65 75 81 92

Example for you

Use quicksort to sort: 23 3 -7 6 -18 5 1

- Include all stages.
- How did you choose your pivot in each case? Describe the strategy used (e.g. "let the first element be the pivot").

Picking the pivot: some inadvisable options

Use the first element
- Acceptable if the input is random
- Very poor if the input is presorted or in reverse order: virtually all the elements go into S1 or into S2, and this happens consistently throughout the recursive calls
- If the input is already presorted, quicksort will take quadratic time to do nothing at all!

Choose the larger of the first two elements
- Same problems as above

Pick the pivot randomly
- Theoretically a good idea: it is unlikely that a random choice would consistently produce a bad partition (unless the random number generator is flawed, which is possible!)
- But random number generation is expensive

Median-of-three partitioning

The median of n numbers is the (⌊n/2⌋ + 1)th element (index ⌊n/2⌋) when they are placed in order.
- E.g. the median of -1, 0, 1, 4, 5, 11 is the 4th element, i.e. 4
- The median of -2, 0, 8, 33, 34, 35, 57 is the 4th element, i.e. 33

The median would be the ideal pivot:
- The partition would be balanced (approximately half the elements would go into S1 and half into S2)
- But it is hard to calculate and slow (we would need to sort first!)

Compromise: take the median of the left, centre and right elements (without ordering the whole array).
- The centre position is (left + right)/2
- Take the middle value of the three
- For -2, 0, 8, 33, 24, 35, -7 take the median of -2, 33 and -7 (i.e. -2)
- For -2, 0, 8, 33, 24, 35, 26, -7 take the median of -2, 33 and -7 (i.e. -2 again)

Partitioning strategy

This is one of several strategies; it is known to give good results.
- First step: move the pivot to the far right by swapping it with the rightmost element
- Second step: set i to the leftmost position and j to position right - 1

Identify the pivot (here 6):    8 1 4 9 6 3 5 2 7 0
Swap pivot and last element:    8 1 4 9 0 3 5 2 7 6
i now points at 8 (the leftmost element) and j at 7 (the element just before the pivot).

We will assume for now that all the elements are distinct, and consider the case when they are not later.

Partitioning strategy contd.

Move all small elements to the left part of the array, and all large elements to the right:
- While i is to the left of j:
  - move i right, skipping over elements smaller than the pivot
  - move j left, skipping over elements larger than the pivot
  - when i and j have stopped, i is pointing at a large element and j at a small one
  - if i is still to the left of j, swap those elements
- Finally, swap the element pointed to by i with the pivot.
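As a concrete sketch of this loop in Java: the method name partition, the return of the pivot's final index, and the small i++ / j-- / break safeguards after a swap are my additions, not from the slides (the lecture's pseudocode later on writes the same loop without them, relying on the assumption that all elements are distinct). It assumes the pivot has already been swapped to a[right] as described on the previous slide.

// Partitioning step: assumes the pivot value is already at a[right].
// Returns the index at which the pivot ends up.
static int partition(int[] a, int left, int right) {
    int pivot = a[right];
    int i = left;
    int j = right - 1;
    while (i <= j) {
        while (a[i] < pivot) i++;               // move i right, skipping elements smaller than the pivot
        while (j >= i && a[j] > pivot) j--;     // move j left, skipping elements larger than the pivot
        if (i < j) {                            // i points at a large element, j at a small one: swap them
            int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            i++;                                // safeguard (mine): step past the swapped pair so that
            j--;                                // equal keys cannot stall the loop
        } else {
            break;                              // i and j have met or crossed
        }
    }
    int tmp = a[i]; a[i] = a[right]; a[right] = tmp;  // finally, swap the element at i with the pivot
    return i;
}

On the array 8 1 4 9 0 3 5 2 7 6 (pivot 6) this produces 2 1 4 5 0 3 6 8 7 9 and returns index 6, matching the worked example on the next slide; the extra i++ / j-- does not change the outcome there.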

Partitioning example

Before the first swap:
    8 1 4 9 0 3 5 2 7 6     (pivot 6 at the far right)
i stays where it is, as 8 is already > pivot; j skips along to the element 2. Swap those elements and repeat the process.

After the first swap:
    2 1 4 9 0 3 5 8 7 6

Before the second swap, i has moved to the 9 and j to the 5.
After the second swap:
    2 1 4 5 0 3 9 8 7 6

Before the third swap, i has moved to the 9 and j back to the 3.
STOP: i and j have crossed.

After the swap with the pivot:
    2 1 4 5 0 3 6 8 7 9

Partitioning example continued

When the partition has finished, every element in a position p < i is small, and every element in a position p > i is large.

We would then apply quicksort to the lists (arrays) 2, 1, 4, 5, 0, 3 and 8, 7, 9.

Example for you: complete the next stage of the process, i.e. partition the list 2, 1, 4, 5, 0, 3.

When some elements are identical

The key question: when an element that is not the pivot has the same value as the pivot, should i (or j) stop at it or skip over it?
- We should do the same for i and j, so that the partitioning is not biased.
- In each case, suppose that all the elements are identical.

Stopping: there are many swaps between identical elements, but i and j cross in the middle, so when the pivot is put back two nearly equal partitions are created (so, like mergesort, O(n log n)).

Skipping: there are no swaps between identical elements, but i and j do not cross in the middle, so the pivot will be (re)placed at the last or second-last position. The parts are very unequal: if all values are identical, this gives O(n^2).

So we choose to stop. (See board.)

Small arrays: using a cutoff

- For small arrays, quicksort is not as fast as insertion sort.
- As quicksort is recursive, these small cases occur frequently.
- Solution: use quicksort until the sub-arrays are small, then use insertion sort on the whole array.
- This works very well, as insertion sort is very efficient on nearly sorted arrays.
- It can save about 15% in running time.
- A good cutoff is n = 10, but any cutoff between 5 and 20 is likely to produce similar results.
- It also avoids problems such as finding the median of three when only two elements are left.
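A minimal sketch of how the cutoff is typically used as a driver, assuming a qSort with a cutoff parameter (as on the later slide) and an insertionSort over the whole array, both of which are written in Lab 5; the method names and the constant are only illustrative.

static final int CUTOFF = 10;                 // any value in the suggested 5..20 range

// Quicksort leaves every sub-array of size <= CUTOFF untouched, so the array is
// "almost sorted"; one insertion sort pass over the whole array then finishes cheaply.
static void sort(int[] a) {
    qSort(a, 0, a.length - 1, CUTOFF);        // recursive quicksort with cutoff
    insertionSort(a);                         // fast on a nearly sorted array
}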

Quicksort in practice: the MedianOfThree algorithm

Algorithm MedianOfThree(A, x, y):
    Input: an array A and integers x, y >= 0, such that A has indices x .. y
    Output: the value of the pivot
    Note: on return, A will have the pivot at position y
    c ← (x + y) / 2
    if A[x] > A[c] then swap(A, x, c)
    if A[x] > A[y] then swap(A, x, y)
    if A[c] > A[y] then swap(A, c, y)
    swap(A, c, y)        { move the median to the far right, ready for partitioning }
    pivot ← A[y]
    return pivot

(See board for discussion.)
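A Java sketch of one possible reading of this routine; the swap helper and returning the pivot value (rather than its index) are assumptions, chosen to match how the QuickSort pseudocode on the next slide uses the result.

// Orders A[x], A[c], A[y], then moves the median (the pivot) to position y.
static int medianOfThree(int[] a, int x, int y) {
    int c = (x + y) / 2;                      // centre position
    if (a[x] > a[c]) swap(a, x, c);
    if (a[x] > a[y]) swap(a, x, y);
    if (a[c] > a[y]) swap(a, c, y);
    swap(a, c, y);                            // put the pivot at the far right
    return a[y];                              // the pivot value
}

static void swap(int[] a, int i, int j) {
    int tmp = a[i];
    a[i] = a[j];
    a[j] = tmp;
}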

Quicksort in practice contd.

Algorithm QuickSort(A, x, y, cutoff):
    Input: an array A and integers x, y >= 0, such that A has indices x .. y, and an integer cutoff
    Output: A (almost) sorted
    if (y - x) > cutoff then
        pivot ← MedianOfThree(A, x, y)
        i ← x
        j ← y - 1
        while i <= j do
            while A[i] < pivot do
                i ← i + 1
            while (A[j] > pivot) and (j >= i) do
                j ← j - 1
            if (i < j) then
                swap(A, i, j)
        swap(A, i, y)
        QuickSort(A, x, i-1, cutoff)
        QuickSort(A, i+1, y, cutoff)
    return A
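The pieces can be wired together in Java as follows; this is a sketch that reuses the medianOfThree and partition methods sketched on the earlier slides rather than a line-by-line transcription of the pseudocode, so the names are again only illustrative.

// Sub-arrays of size <= cutoff are left unsorted, to be finished by insertion sort.
static void qSort(int[] a, int x, int y, int cutoff) {
    if (y - x > cutoff) {
        medianOfThree(a, x, y);               // choose the pivot and move it to a[y]
        int p = partition(a, x, y);           // split around the pivot; p is its final index
        qSort(a, x, p - 1, cutoff);           // recursively sort the small side
        qSort(a, p + 1, y, cutoff);           // recursively sort the large side
    }
}

Used together with the driver shown earlier (qSort followed by one insertionSort pass over the whole array), this gives the complete sort asked for in Lab 5.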

QuickSort: putting it all together

This is an exercise for you (part of Lab 5). You need to include the following methods:
- swap
- medianOfThree
- insertionSort
- qSort

You will be sorting an array of country names, so quickSort will need to be generic. You will also need to convert mergeSort to be generic, and compare timings for quickSort, mergeSort, insertionSort and bubbleSort.

Analysis of quicksort

- As for mergesort, T(0) = T(1) = 1.
- The running time for an array of size n equals the running time of the two recursive calls plus the linear time spent in the partition (pivot selection takes only constant time).
- The basic quicksort relation is T(n) = T(i) + T(n - i - 1) + cn, where i = |S1| is the number of elements in S1.
- Worst case: O(n^2)
- Best case: O(n log n)
- Average case: O(n log n)

We will prove the first two of these on the board.
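For reference, a brief sketch of the worst-case and best-case arguments done on the board, assuming the usual simplifications (ignoring the cutoff and rounding in the recurrences):

Worst case: the pivot is always the smallest (or largest) element, so one recursive call is on an empty set and the other is on $n-1$ elements:
$$T(n) = T(n-1) + cn \quad\Rightarrow\quad T(n) = T(1) + c\sum_{k=2}^{n} k = O(n^2).$$

Best case: the pivot always lands in the middle, so $i \approx n/2$:
$$T(n) = 2\,T(n/2) + cn \quad\Rightarrow\quad T(n) = cn\log_2 n + n = O(n \log n).$$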

Finally: comparison of quicksort and mergesort

1. Both routines recursively solve two subproblems and require linear additional work, but
2. unlike mergesort, in quicksort the subproblems are not guaranteed to be of equal size.
3. However, quicksort is faster in practice because the partitioning step can be performed "in place", very efficiently. This more than makes up for (2).

Some examples for you

(See the extra quicksort examples contained in the Lectures folder.)

1. Sort 3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5 using quicksort with median-of-three partitioning and a cutoff of 3.
2. Perform the first partitioning stage on 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1. What do you notice?
3. Construct a permutation of 20 elements that is as bad as possible for quicksort, using median-of-three partitioning and a cutoff of 3.