QuickSort
Dr. Bun Yue, Professor of Computer Science
CSCI 3333 Data Structures

Acknowledgement: Mr. Charles Moen, Dr. Wei Ding, Ms. Krishani Abeysekera, Dr. Michael Goodrich

Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
- Divide: pick a random element x (called the pivot) and partition S into
  - L, the elements less than x
  - E, the elements equal to x
  - G, the elements greater than x
- Recur: sort L and G
- Conquer: join L, E, and G
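To make the scheme concrete, here is a minimal Java sketch of this three-sequence quick-sort (not taken from the slides; the class and method names are invented for this illustration, and the input is assumed to be a java.util.List of Comparable elements):

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class ThreeWayQuickSort {
    private static final Random RAND = new Random();

    // Returns a new sorted list; the input list S is left unmodified.
    public static <T extends Comparable<T>> List<T> quickSort(List<T> S) {
        if (S.size() < 2) return new ArrayList<>(S);          // base case: size 0 or 1
        // Divide: pick a random pivot and split S into L, E, and G
        T pivot = S.get(RAND.nextInt(S.size()));
        List<T> L = new ArrayList<>(), E = new ArrayList<>(), G = new ArrayList<>();
        for (T y : S) {
            int cmp = y.compareTo(pivot);
            if (cmp < 0) L.add(y);
            else if (cmp == 0) E.add(y);
            else G.add(y);
        }
        // Recur on L and G; Conquer by concatenating L, E, G
        List<T> sorted = new ArrayList<>(quickSort(L));
        sorted.addAll(E);
        sorted.addAll(quickSort(G));
        return sorted;
    }
}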

Partition
We partition an input sequence as follows:
- we remove, in turn, each element y from S, and
- we insert y into L, E, or G, depending on the result of the comparison with the pivot x.
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time. Thus, the partition step of quick-sort takes O(n) time.

Algorithm partition(S, p)
  Input: sequence S, position p of the pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.insertLast(y)
    else if y = x
      E.insertLast(y)
    else { y > x }
      G.insertLast(y)
  return L, E, G
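As an illustration (not part of the original slides), the partition pseudocode might be rendered in Java with java.util.ArrayDeque, whose removeFirst and addLast operations both run in O(1) time. The pivot x is assumed to have already been removed from S by the caller, and the class and method names are invented for this sketch:

import java.util.ArrayDeque;
import java.util.Deque;

public class PartitionSketch {
    // Holds the three output subsequences of one partition step.
    public static class Parts<T> {
        public final Deque<T> L = new ArrayDeque<>();   // elements less than the pivot
        public final Deque<T> E = new ArrayDeque<>();   // elements equal to the pivot
        public final Deque<T> G = new ArrayDeque<>();   // elements greater than the pivot
    }

    // Empties S, routing each element into L, E, or G by comparison with the pivot x.
    public static <T extends Comparable<T>> Parts<T> partition(Deque<T> S, T x) {
        Parts<T> p = new Parts<>();
        while (!S.isEmpty()) {
            T y = S.removeFirst();                // O(1) removal from the front of S
            int cmp = y.compareTo(x);
            if (cmp < 0) p.L.addLast(y);          // O(1) insertion at the end
            else if (cmp == 0) p.E.addLast(y);
            else p.G.addLast(y);
        }
        return p;                                 // n iterations of O(1) work: O(n) total
    }
}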

Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree:
- Each node represents a recursive call of quick-sort and stores
  - the unsorted sequence before the execution and its pivot
  - the sorted sequence at the end of the execution
- The root is the initial call
- The leaves are calls on subsequences of size 0 or 1

Execution Example: pivot selection

Execution Example (cont.): partition, recursive call, pivot selection

Execution Example (cont.): partition, recursive call, base case

Execution Example (cont.): recursive call, …, base case, join

Execution Example (cont.): recursive call, pivot selection

Execution Example (cont.): partition, …, recursive call, base case

Execution Example (cont.): join, join

Worst-case Running Time
The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element:
- One of L and G has size n − 1 and the other has size 0
- The running time is proportional to the sum n + (n − 1) + … + 2 + 1
- Thus, the worst-case running time of quick-sort is O(n²)

depth   time
0       n
1       n − 1
…       …
n − 1   1
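Writing out the arithmetic series makes the quadratic bound explicit:

\[
  n + (n-1) + \dots + 2 + 1 \;=\; \sum_{i=1}^{n} i \;=\; \frac{n(n+1)}{2} \;=\; O(n^2).
\]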

Expected Running Time
Consider a recursive call of quick-sort on a sequence of size s:
- Good call: the sizes of L and G are each less than 3s/4
- Bad call: one of L and G has size greater than 3s/4
A call is good with probability 1/2, since 1/2 of the possible pivots cause good calls: pivots drawn from the middle half of the sorted order are good, while pivots from the first and last quarters are bad.

Expected Running Time, Part 2
Probabilistic fact: the expected number of coin tosses required in order to get k heads is 2k.
For a node of depth i, we expect that i/2 of its ancestors are good calls, so the size of the input sequence for the current call is at most (3/4)^(i/2) · n.
Therefore:
- For a node of depth 2 log_{4/3} n, the expected input size is one
- The expected height of the quick-sort tree is O(log n)
The amount of work done at the nodes of the same depth is O(n). Thus, the expected running time of quick-sort is O(n log n).
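The height bound follows by solving for the depth at which the expected input size drops to one:

\[
  \left(\tfrac{3}{4}\right)^{i/2} n \le 1
  \;\Longleftrightarrow\;
  \left(\tfrac{4}{3}\right)^{i/2} \ge n
  \;\Longleftrightarrow\;
  i \ge 2\log_{4/3} n = O(\log n).
\]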

In-Place Quick-Sort
Quick-sort can be implemented to run in-place. In the partition step, we use replace operations to rearrange the elements of the input sequence such that
- the elements less than the pivot have rank less than h
- the elements equal to the pivot have rank between h and k
- the elements greater than the pivot have rank greater than k
The recursive calls consider
- elements with rank less than h
- elements with rank greater than k

Algorithm inPlaceQuickSort(S, l, r)
  Input: sequence S, ranks l and r
  Output: sequence S with the elements of rank between l and r rearranged in increasing order
  if l ≥ r
    return
  i ← a random integer between l and r
  x ← S.elemAtRank(i)
  (h, k) ← inPlacePartition(x)
  inPlaceQuickSort(S, l, h − 1)
  inPlaceQuickSort(S, k + 1, r)
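The slides do not spell out inPlacePartition; one common way to obtain the (h, k) bounds is a three-way ("Dutch national flag") pass, sketched below in Java on an int array. The class and method names are invented for this example, and the pivot handling is slightly simplified relative to the pseudocode (the pivot value is chosen and passed in directly):

import java.util.Random;

public class InPlaceQuickSortSketch {
    private static final Random RAND = new Random();

    static void inPlaceQuickSort(int[] a, int l, int r) {
        if (l >= r) return;                             // zero or one element: already sorted
        int x = a[l + RAND.nextInt(r - l + 1)];         // random pivot value
        int[] hk = inPlacePartition(a, l, r, x);
        inPlaceQuickSort(a, l, hk[0] - 1);              // elements with rank less than h
        inPlaceQuickSort(a, hk[1] + 1, r);              // elements with rank greater than k
    }

    // Rearranges a[l..r] so that a[l..h-1] < x, a[h..k] == x, a[k+1..r] > x,
    // and returns {h, k}.
    static int[] inPlacePartition(int[] a, int l, int r, int x) {
        int h = l, m = l, k = r;
        while (m <= k) {
            if (a[m] < x)      swap(a, h++, m++);       // grow the "less than" region
            else if (a[m] > x) swap(a, m, k--);         // grow the "greater than" region
            else               m++;                     // equal to the pivot: keep in the middle
        }
        return new int[] { h, k };
    }

    static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }
}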

In-Place Partitioning
Perform the partition using two indices to split S into L and E ∪ G (a similar method can split E ∪ G into E and G). Repeat until j and k cross:
- Scan j to the right until finding an element > x
- Scan k to the left until finding an element < x
- Swap the elements at indices j and k
(The slide illustrates the two scans converging on an example sequence with pivot = 6.)

Summary of Sorting Algorithms

Algorithm        Avg. Time             Notes
selection-sort   O(n²)                 in-place; slow (good for small inputs)
insertion-sort   O(n²)                 in-place; slow (good for small inputs)
quick-sort       O(n log n) expected   in-place, randomized; fastest (good for large inputs)
heap-sort        O(n log n)            in-place; fast (good for large inputs)
merge-sort       O(n log n)            sequential data access; fast (good for huge inputs)

Java Implementation

public static void quickSort(Object[] S, Comparator c) {
  if (S.length < 2) return;                 // the array is already sorted in this case
  quickSortStep(S, c, 0, S.length - 1);     // recursive sort method
}

private static void quickSortStep(Object[] S, Comparator c, int leftBound, int rightBound) {
  if (leftBound >= rightBound) return;      // the indices have crossed
  Object temp;                              // temp object used for swapping
  Object pivot = S[rightBound];
  int leftIndex = leftBound;                // will scan rightward
  int rightIndex = rightBound - 1;          // will scan leftward
  while (leftIndex <= rightIndex) {
    // scan rightward until finding an element larger than the pivot
    while ((leftIndex <= rightIndex) && (c.compare(S[leftIndex], pivot) <= 0))
      leftIndex++;
    // scan leftward to find an element smaller than the pivot
    while ((rightIndex >= leftIndex) && (c.compare(S[rightIndex], pivot) >= 0))
      rightIndex--;
    if (leftIndex < rightIndex) {           // both elements were found
      temp = S[rightIndex];
      S[rightIndex] = S[leftIndex];         // swap these elements
      S[leftIndex] = temp;
    }
  }                                         // the loop continues until the indices cross
  temp = S[rightBound];                     // swap pivot with the element at leftIndex
  S[rightBound] = S[leftIndex];
  S[leftIndex] = temp;                      // the pivot is now at leftIndex, so recurse
  quickSortStep(S, c, leftBound, leftIndex - 1);
  quickSortStep(S, c, leftIndex + 1, rightBound);
}

Note: this version only works for distinct elements.
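A brief usage sketch (not from the slides): since the method takes a raw Object array and a raw Comparator, a caller might sort an array of boxed integers as follows, assuming quickSort and quickSortStep above are declared in the same class:

public static void main(String[] args) {
    Integer[] data = { 7, 2, 9, 4, 3, 8, 6, 1 };                 // sample input with distinct elements
    java.util.Comparator byValue = new java.util.Comparator() {  // raw Comparator, matching the slide's signature
        public int compare(Object a, Object b) {
            return ((Integer) a).compareTo((Integer) b);
        }
    };
    quickSort(data, byValue);                                    // quickSort as defined above
    System.out.println(java.util.Arrays.toString(data));         // prints [1, 2, 3, 4, 6, 7, 8, 9]
}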

C++ Implementation

Questions and Comments?