Quick Sort (11.2) CSE 2011 Winter 2011 19 November 2018.

Quick Sort

- Fastest known sorting algorithm in practice
  - Average case: O(N log N)
  - Worst case: O(N²), but the worst case can be made exponentially unlikely
- Another divide-and-conquer recursive algorithm, like merge sort

Quick Sort: Main Idea

1. If the number of elements in S is 0 or 1, then return (base case).
2. Pick any element v in S (called the pivot).
3. Partition the elements in S except v into two disjoint groups:
   S1 = {x ∈ S − {v} | x ≤ v}
   S2 = {x ∈ S − {v} | x ≥ v}
4. Return QuickSort(S1) + v + QuickSort(S2).
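The four steps above can be sketched directly in Python (an out-of-place illustration of the idea; the later slides develop the in-place version actually used):

```python
def quick_sort_basic(s):
    """Out-of-place quicksort: pick a pivot, partition, recurse."""
    if len(s) <= 1:                        # base case: 0 or 1 elements
        return s
    v = s[0]                               # pick any element as the pivot
    s1 = [x for x in s[1:] if x <= v]      # elements <= pivot
    s2 = [x for x in s[1:] if x > v]       # elements > pivot
    return quick_sort_basic(s1) + [v] + quick_sort_basic(s2)
```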

Quick Sort: Example

Example: Quicksort...

Issues To Consider

- How to pick the pivot?
- How to partition?

Several methods exist; the one we consider is known to give good results and to be simple and efficient.

Partitioning Strategy

We want to partition an array A[left .. right].
- First, get the pivot element out of the way by swapping it with the last element (swap the pivot and A[right]).
- Let i start at the first element and j start at the next-to-last element (i = left, j = right − 1).

Example with pivot 6: 5 7 4 6 3 12 19 → swap the pivot with A[right] → 5 7 4 19 3 12 6, with i at the first element and j at the next-to-last.

Partitioning Strategy

We want to have:
- A[k] ≤ pivot, for k < i
- A[k] ≥ pivot, for k > j

While i < j:
- Move i right, skipping over elements smaller than the pivot.
- Move j left, skipping over elements greater than the pivot.

When both i and j have stopped, A[i] ≥ pivot and A[j] ≤ pivot, so A[i] and A[j] should now be swapped.

Partitioning Strategy (2)

When i and j have stopped and i is still to the left of j (thus a legal pair):
- Swap A[i] and A[j]: the large element is pushed to the right and the small element is pushed to the left.
- After the swap, A[i] ≤ pivot and A[j] ≥ pivot.
- Repeat the process until i and j cross.

Example: in 5 7 4 19 3 12 6, i stops at 7 and j stops at 3; swapping them gives 5 3 4 19 7 12 6.

Partitioning Strategy (3)

Continuing with 5 3 4 19 7 12 6, i advances to 19 and j retreats to 4, so i and j have crossed.
- When i and j have crossed, swap A[i] and the pivot (stored at A[right]): 5 3 4 19 7 12 6 → 5 3 4 6 7 12 19.
- Result: A[k] ≤ pivot, for k < i; A[k] ≥ pivot, for k > i.
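The partitioning strategy above can be sketched in Python (a minimal sketch, assuming the pivot has already been swapped into A[right] as on the first partitioning slide; the function name is my own):

```python
def partition(a, left, right):
    """Partition a[left..right] in place, assuming the pivot is at a[right].
    Returns the final index of the pivot."""
    pivot = a[right]
    i, j = left, right - 1
    while True:
        while a[i] < pivot:                # move i right past small elements
            i += 1
        while j > left and a[j] > pivot:   # move j left past large elements
            j -= 1
        if i >= j:                         # i and j have crossed: stop
            break
        a[i], a[j] = a[j], a[i]            # push large right, small left
        i += 1
        j -= 1
    a[i], a[right] = a[right], a[i]        # move the pivot into its final slot
    return i
```

On the slides' example, partition([5, 7, 4, 19, 3, 12, 6], 0, 6) performs exactly the swaps shown and leaves the pivot 6 at index 3.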

Picking the Pivot Objective: Choose a pivot so that we will get 2 partitions of (almost) equal size.

Picking the Pivot (2)

Use the first element as the pivot:
- If the input is random, this is OK.
- If the input is presorted (or in reverse order), all the elements go into S2 (or S1), and this happens consistently throughout the recursive calls.
- The result is O(N²) behavior (we analyze this case later).

Choose the pivot randomly:
- Generally safe.
- But random number generation can be expensive, and it does not reduce the average running time of the algorithm.

Picking the Pivot (3)

Use the median of the array (the ⌈N/2⌉th largest element):
- Partitioning always cuts the array roughly in half, giving an optimal quick sort: O(N log N).
- However, the exact median is hard to find cheaply.

Median-of-three partitioning:
- Eliminates the bad case for sorted input.
- Reduces the number of comparisons by about 14%.

Median of Three Method

- Compare just three elements: the leftmost, the rightmost, and the center.
- Swap these elements if necessary so that:
  A[left] = smallest
  A[right] = largest
  A[center] = median of the three
- Pick A[center] as the pivot.
- Swap A[center] and A[right − 1] so that the pivot is at the second-to-last position. (Why?)

Median of Three: Example

Start: 2 5 6 4 13 3 12 19 6, with A[left] = 2, A[center] = 13, A[right] = 6.
- Swap A[center] and A[right]: 2 5 6 4 6 3 12 19 13.
- Choose A[center] = 6 as the pivot.
- Swap the pivot and A[right − 1]: 2 5 6 4 19 3 12 6 13.

We only need to partition A[left + 1, …, right − 2]. Why?
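A Python sketch of the method above (the helper name median_of_three is my own; it orders the three candidates in place and hides the pivot at A[right − 1]):

```python
def median_of_three(a, left, right):
    """Order a[left], a[center], a[right]; move the median (the pivot)
    to a[right - 1]. Returns the pivot value."""
    center = (left + right) // 2
    if a[center] < a[left]:
        a[left], a[center] = a[center], a[left]
    if a[right] < a[left]:
        a[left], a[right] = a[right], a[left]
    if a[right] < a[center]:
        a[center], a[right] = a[right], a[center]
    # Now a[left] <= a[center] <= a[right]; hide the pivot at right - 1.
    a[center], a[right - 1] = a[right - 1], a[center]
    return a[right - 1]
```

Run on the slide's example array, it reproduces exactly the three snapshots shown.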

Small Arrays

- For very small arrays, quick sort does not perform as well as insertion sort.
- So do not use quick sort recursively for small arrays; switch to a sorting algorithm that is efficient for small arrays, such as insertion sort.
- When using quick sort recursively, switch to insertion sort when the sub-array has between 5 and 20 elements (10 is usually good). This:
  - saves about 15% of the running time;
  - avoids taking the median of three when the sub-array has only 1 or 2 elements.

Quick Sort: Pseudo-code

The full routine puts the pieces together: choose the pivot, partition, recurse, and fall back to insertion sort for small arrays.
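The code itself did not survive the transcript; below is a Python sketch of the routine the outline describes (median-of-three pivot, in-place partitioning, and a cutoff to insertion sort; the cutoff of 10 follows the earlier slides, and all names are my own):

```python
CUTOFF = 10  # switch to insertion sort below this size

def insertion_sort(a, left, right):
    """Sort a[left..right] in place by insertion."""
    for k in range(left + 1, right + 1):
        tmp, j = a[k], k
        while j > left and a[j - 1] > tmp:
            a[j] = a[j - 1]
            j -= 1
        a[j] = tmp

def quick_sort(a, left, right):
    """Sort a[left..right] in place: median-of-three pivot,
    in-place partition, insertion-sort cutoff for small sub-arrays."""
    if right - left + 1 < CUTOFF:
        insertion_sort(a, left, right)
        return
    # Median-of-three: order a[left], a[center], a[right],
    # then hide the pivot at a[right - 1].
    center = (left + right) // 2
    if a[center] < a[left]:
        a[left], a[center] = a[center], a[left]
    if a[right] < a[left]:
        a[left], a[right] = a[right], a[left]
    if a[right] < a[center]:
        a[center], a[right] = a[right], a[center]
    a[center], a[right - 1] = a[right - 1], a[center]
    pivot = a[right - 1]
    # Partition a[left + 1 .. right - 2]; a[left] and a[right - 1]
    # act as sentinels for j and i respectively.
    i, j = left, right - 1
    while True:
        i += 1
        while a[i] < pivot:
            i += 1
        j -= 1
        while a[j] > pivot:
            j -= 1
        if i >= j:
            break
        a[i], a[j] = a[j], a[i]
    a[i], a[right - 1] = a[right - 1], a[i]  # restore the pivot
    quick_sort(a, left, i - 1)
    quick_sort(a, i + 1, right)
```

Note the inner while loops have no explicit bounds checks; the next slide explains why median-of-three makes this safe.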

Partitioning Part

The partitioning code works only if the pivot is picked by median-of-three:
- A[left] ≤ pivot and A[right] ≥ pivot, so we need to partition only A[left + 1, …, right − 2].
- j will not run past the beginning, because A[left] ≤ pivot.
- i will not run past the end, because A[right − 1] = pivot.

Quick Sort Faster Than Merge Sort

Both quick sort and merge sort take O(N log N) in the average case. Why is quick sort faster than merge sort?
- Its inner loop consists of an increment/decrement (by 1, which is fast), a test, and a jump.
- There is no extra juggling (copying into and out of a temporary array) as in merge sort.

Analysis

Assumptions:
- a random pivot (no median-of-three partitioning);
- no cutoff for small arrays.

Running time:
- pivot selection: constant time, i.e., O(1)
- partitioning: linear time, i.e., O(N)
- plus the running time of the two recursive calls:

T(N) = T(i) + T(N − i − 1) + cN

where i is the number of elements in S1 and c is a constant.

Worst-Case Scenario

What is the worst case? The pivot is the smallest element, every time, so the partition is always completely unbalanced.
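With i = 0 at every level, the recurrence from the analysis slide telescopes:

```latex
T(N) = T(N-1) + cN
     = T(N-2) + c(N-1) + cN
     = \cdots
     = T(1) + c\sum_{k=2}^{N} k
     = O(N^2)
```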

Best-Case Scenario

What is the best case? The partition is perfectly balanced: the pivot is always in the middle (the median of the array).
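With a perfectly balanced partition (i = N/2), the recurrence becomes the same one as for merge sort:

```latex
T(N) = 2\,T(N/2) + cN
\quad\Longrightarrow\quad
T(N) = cN\log_2 N + N\,T(1) = O(N \log N)
```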

Average-Case Analysis

Assume that each of the possible sizes for S1 is equally likely, i.e., each has probability 1/N. This assumption is valid for the pivoting and partitioning strategy just discussed (but may not be for some others). On average, the running time is O(N log N). Proof: pp. 272–273, Data Structures and Algorithm Analysis by M. A. Weiss, 2nd edition.

Next time …

- Arrays (review)
- Linked Lists (3.2, 3.3)
- Stacks, queues (Chapter 5)