Quicksort

A short list of categories

Algorithm types we will consider include:
- Simple recursive algorithms
- Backtracking algorithms
- Divide and conquer algorithms
- Dynamic programming algorithms
- Greedy algorithms
- Branch and bound algorithms
- Brute force algorithms
- Randomized algorithms

Quicksort I

To sort a[left...right]:
1. if left < right:
   1.1. Partition a[left...right] such that:
        all of a[left...p-1] are less than a[p], and
        all of a[p+1...right] are >= a[p]
   1.2. Quicksort a[left...p-1]
   1.3. Quicksort a[p+1...right]
2. Terminate

Partitioning (Quicksort II)

- A key step in the Quicksort algorithm is partitioning the array
- We choose some (any) number p in the array to use as a pivot
- We partition the array into three parts:
  - the numbers less than p
  - p itself
  - the numbers greater than or equal to p

Partitioning II

- Choose an array value (say, the first) to use as the pivot
- Starting from the left end, find the first element that is greater than or equal to the pivot
- Searching backward from the right end, find the first element that is less than the pivot
- Interchange (swap) these two elements
- Repeat, searching from where we left off, until done

Partitioning

To partition a[left...right]:
1. Set pivot = a[left], l = left + 1, r = right
2. While l < r, do:
   2.1. while l < right and a[l] < pivot, set l = l + 1
   2.2. while r > left and a[r] >= pivot, set r = r - 1
   2.3. if l < r, swap a[l] and a[r]
3. If a[r] < pivot, set a[left] = a[r] and a[r] = pivot;
   otherwise the pivot is already in its final place at a[left]
   (this happens only when every other element in the range is >= the pivot)
4. Terminate

Example of partitioning

choose pivot:     4 3 6 9 2 4 3 1 2 1 8 9 3 5 6
search:           4 3 6 9 2 4 3 1 2 1 8 9 3 5 6
swap:             4 3 3 9 2 4 3 1 2 1 8 9 6 5 6
search:           4 3 3 9 2 4 3 1 2 1 8 9 6 5 6
swap:             4 3 3 1 2 4 3 1 2 9 8 9 6 5 6
search:           4 3 3 1 2 4 3 1 2 9 8 9 6 5 6
swap:             4 3 3 1 2 2 3 1 4 9 8 9 6 5 6
search:           4 3 3 1 2 2 3 1 4 9 8 9 6 5 6   (the indices l and r have crossed)
swap with pivot:  1 3 3 1 2 2 3 4 4 9 8 9 6 5 6

The partition method (Java)

static int partition(int[] a, int left, int right) {
    int p = a[left], l = left + 1, r = right;
    while (l < r) {
        // advance l past elements that are smaller than the pivot
        while (l < right && a[l] < p) l++;
        // retreat r past elements that are greater than or equal to the pivot
        while (r > left && a[r] >= p) r--;
        // swap a pair that is on the wrong side of the pivot
        if (l < r) {
            int temp = a[l];
            a[l] = a[r];
            a[r] = temp;
        }
    }
    if (a[r] < p) {
        // move the pivot into its final position
        a[left] = a[r];
        a[r] = p;
        return r;
    }
    // otherwise every other element is >= the pivot, so it is already in place
    return left;
}

The quicksort method (in Java)

static void quicksort(int[] array, int left, int right) {
    if (left < right) {
        int p = partition(array, left, right);   // p is the pivot's final index
        quicksort(array, left, p - 1);           // sort the part before the pivot
        quicksort(array, p + 1, right);          // sort the part after the pivot
    }
}
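
A minimal way to exercise the two methods is to run them on the array from the partitioning example. The class below is only a sketch: it wraps the partition and quicksort methods shown above in a runnable program, and the name QuicksortDemo is made up for illustration.

import java.util.Arrays;

public class QuicksortDemo {

    static int partition(int[] a, int left, int right) {
        int p = a[left], l = left + 1, r = right;
        while (l < r) {
            while (l < right && a[l] < p) l++;
            while (r > left && a[r] >= p) r--;
            if (l < r) { int temp = a[l]; a[l] = a[r]; a[r] = temp; }
        }
        if (a[r] < p) { a[left] = a[r]; a[r] = p; return r; }
        return left;
    }

    static void quicksort(int[] array, int left, int right) {
        if (left < right) {
            int p = partition(array, left, right);
            quicksort(array, left, p - 1);
            quicksort(array, p + 1, right);
        }
    }

    public static void main(String[] args) {
        // the array used in the partitioning example
        int[] a = {4, 3, 6, 9, 2, 4, 3, 1, 2, 1, 8, 9, 3, 5, 6};
        quicksort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));
        // prints: [1, 1, 2, 2, 3, 3, 3, 4, 4, 5, 6, 6, 8, 9, 9]
    }
}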

Analysis of quicksort: best case

- Suppose each partition operation divides the array almost exactly in half
- Then the depth of the recursion is log2 n
  - because that's how many times we can halve n
- However, there are many recursive calls! How can we figure out the total work?
- We note that:
  - each partition is linear over its subarray
  - all the partitions at one level together cover the whole array
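
The same observation can be written as a recurrence. A rough sketch, writing c·n for the linear partitioning work done across one level and assuming n is a power of two:

\[
\begin{aligned}
T(n) &= 2\,T(n/2) + c\,n \\
     &= 4\,T(n/4) + 2\,c\,n \\
     &= \cdots = 2^{k}\,T\!\left(n/2^{k}\right) + k\,c\,n \\
     &= n\,T(1) + c\,n\log_2 n \qquad (\text{taking } k = \log_2 n) \\
     &= O(n \log n).
\end{aligned}
\]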

Partitioning at various levels

Best case II

- We cut the array size in half each time
- So the depth of the recursion is log2 n
- At each level of the recursion, all the partitions at that level do work that is linear in n
- O(log2 n) * O(n) = O(n log2 n)
- Hence in the best case (and, it turns out, in the average case as well), quicksort has time complexity O(n log2 n)
- What about the worst case?

Worst case

- In the worst case, partitioning always divides the size-n array into these three parts:
  - a length-one part, containing the pivot itself
  - a length-zero part, and
  - a length n-1 part, containing everything else
- We don't recur on the zero-length part
- Recurring on the length n-1 part requires (in the worst case) recurring to depth n-1

Worst case partitioning

Worst case for quicksort

- In the worst case, recursion may be n levels deep (for an array of size n)
- But the partitioning work done at each level is still n
- O(n) * O(n) = O(n^2)
- So the worst case for Quicksort is O(n^2)
- When does this happen? There are many arrangements that could make this happen
- Here are two common cases:
  - when the array is already sorted
  - when the array is inversely sorted (sorted in the opposite order)
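
The corresponding worst-case recurrence (same sketch, with c·n again standing for the cost of one partition) telescopes to a quadratic:

\[
T(n) = T(n-1) + c\,n = c\,\bigl(n + (n-1) + \cdots + 1\bigr) = \frac{c\,n(n+1)}{2} = O(n^2).
\]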

Typical case for quicksort

- If the array is sorted to begin with, Quicksort is terrible: O(n^2)
- It is possible to construct other bad cases
- However, Quicksort is usually O(n log2 n)
- The constants are so good that Quicksort is generally the fastest sorting algorithm in practice
- Most real-world sorting is done by Quicksort

Tweaking Quicksort

- Almost anything you can try to "improve" Quicksort will actually slow it down
- One good tweak is to switch to a different sorting method when the subarrays get small (say, 10 or 12 elements), as sketched below
  - Quicksort has too much overhead for small array sizes
- For large arrays, it might be a good idea to check beforehand whether the array is already sorted
  - but there is a better tweak than this
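
The small-subarray tweak can be sketched as follows. The cutoff value of 10 and the use of insertion sort are assumptions on my part (the slides say only "a different sorting method" and "say, 10 or 12"), and the names hybridQuicksort and insertionSort are made up for illustration; the sketch reuses the partition method shown earlier.

static final int CUTOFF = 10;   // assumed threshold; worth tuning experimentally

static void hybridQuicksort(int[] a, int left, int right) {
    if (right - left + 1 <= CUTOFF) {
        insertionSort(a, left, right);          // cheap for tiny ranges
    } else {
        int p = partition(a, left, right);      // partition as before
        hybridQuicksort(a, left, p - 1);
        hybridQuicksort(a, p + 1, right);
    }
}

// straightforward insertion sort on a[left...right]
static void insertionSort(int[] a, int left, int right) {
    for (int i = left + 1; i <= right; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= left && a[j] > key) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}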

Picking a better pivot

- Before, we picked the first element of the subarray to use as the pivot
- If the array is already sorted, this results in O(n^2) behavior
- It's no better if we pick the last element
- We could do an optimal quicksort (guaranteed O(n log n)) if we always picked a pivot value that exactly cuts the array in half
- Such a value is called a median: half of the values in the array are larger, half are smaller
- The easiest way to find the median is to sort the array and pick the value in the middle (!)

Median of three

- Obviously, it doesn't make sense to sort the array just to find the median to use as a pivot
- Instead, compare just three elements of our (sub)array: the first, the last, and the middle
- Take the median (middle value) of these three as the pivot (sketched below)
- It's possible (but not easy) to construct cases which will make this technique O(n^2)
- Suppose we rearrange (sort) these three numbers so that the smallest is in the first position, the largest in the last position, and the other in the middle
- This lets us simplify and speed up the partition loop
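
One simple way to combine median-of-three with the partition method above is to order the three sampled elements and then move the median to the front, so a[left] holds the pivot as before. The helper names medianOfThreeToFront and swap are mine, and this sketch keeps the partition loop unchanged rather than implementing the sentinel speed-up the slide alludes to.

// Order a[left], a[mid], a[right], then move the median into a[left]
// so the existing partition method can use it as the pivot.
static void medianOfThreeToFront(int[] a, int left, int right) {
    int mid = left + (right - left) / 2;
    if (a[mid] < a[left])   swap(a, mid, left);
    if (a[right] < a[left]) swap(a, right, left);
    if (a[right] < a[mid])  swap(a, right, mid);
    // now a[left] <= a[mid] <= a[right]; the median sits at mid
    swap(a, left, mid);
}

static void swap(int[] a, int i, int j) {
    int t = a[i];
    a[i] = a[j];
    a[j] = t;
}

A quicksort using this pivot rule would call medianOfThreeToFront(a, left, right) immediately before partition(a, left, right).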

Final comments

- In practice, Quicksort is the fastest known general-purpose sorting algorithm
- For optimum efficiency, the pivot must be chosen carefully
- "Median of three" is a good technique for choosing the pivot
- However, no matter what you do, there will be some cases where Quicksort runs in O(n^2) time

The End