Quick Sort
Cmput 115 - Lecture 13
Department of Computing Science, University of Alberta
©Duane Szafron 2000. Some code in this lecture is based on code from the book Java Structures by Duane A. Bailey or the companion structure package. Revised 1/26/00.



About This Lecture
In this lecture we will learn about a sorting algorithm called the Quick Sort. We will study its implementation and its time and space complexity.

Outline
- The Quick Sort Algorithm
- Quick Sort - Arrays
- Time and Space Complexity of Quick Sort

The Sort Problem
Given a collection whose elements can be compared, put the elements in increasing or decreasing order.

Quick Sort Algorithm 0
Our initial goal is to move one element, called the pivot, to its correct final position, so that all elements to the left of it are smaller than it and all elements to the right of it are larger than it. We call this operation partition(). We select the leftmost element as the pivot. [The slide diagram shows the array with left (l), pivot (p), and right (r) index markers.]

Quick Sort Algorithm 1
Find the rightmost element that is smaller than the pivot element. Exchange the two elements and increment the left index.

Quick Sort Algorithm 2
Find the leftmost element that is larger than the pivot element. Exchange the two elements and decrement the right index.

Quick Sort Algorithm 3
Find the rightmost element that is smaller than the pivot element. Since the right index has passed the left index, there is no such element, and the pivot is in its final location.

Quick Sort Algorithm 4
To complete the sort, recursively partition the sub-list to the left of the pivot and the sub-list to the right of the pivot.

QuickSort Algorithm - Arrays

    public static void quickSort(Comparable anArray[], int size) {
      // pre: 0 <= size <= anArray.length
      // post: values in anArray[0..size - 1] are in ascending order
      quickSortR(anArray, 0, size - 1);
    }

QuickSortR Algorithm - Arrays

    private static void quickSortR(Comparable anArray[], int left, int right) {
      // pre: left <= right
      // post: anArray[left..right] are in ascending order
      int pivot;
      if (left >= right) return;
      pivot = partition(anArray, left, right);
      quickSortR(anArray, left, pivot - 1);
      quickSortR(anArray, pivot + 1, right);
    }

Partition Algorithm - Arrays

    private static int partition(Comparable anArray[], int left, int right) {
      // pre: left <= right
      // post: anArray[left] is in its correct sort location and its
      //       index is returned
      while (true) {
        while ((left < right) &&
               (anArray[left].compareTo(anArray[right]) < 0))
          right--;                        // find rightmost element less than pivot
        if (left < right)
          swap(anArray, left++, right);
        else
          return left;                    // pivot is now in final location
        while ((left < right) &&
               (anArray[left].compareTo(anArray[right]) < 0))
          left++;                         // find leftmost element greater than pivot
        if (left < right)
          swap(anArray, left, right--);
        else
          return right;                   // pivot is now in final location
      }
    }

(code based on Bailey pg. 89)
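The code above calls a swap() helper from Bailey's structure package that the slides do not show. A self-contained sketch of the same three methods, with an assumed three-assignment swap() and made-up Integer test data, might look like this:

```java
import java.util.Arrays;

public class QuickSortDemo {
    // Assumed swap() helper (not shown on the slides): a plain
    // three-assignment exchange of two array elements.
    static void swap(Comparable[] a, int i, int j) {
        Comparable t = a[i]; a[i] = a[j]; a[j] = t;
    }

    // Partition as on the slides: move the pivot (initially at index left)
    // to its final position and return that index.
    static int partition(Comparable[] a, int left, int right) {
        while (true) {
            while (left < right && a[left].compareTo(a[right]) < 0)
                right--;                      // find rightmost element < pivot
            if (left < right) swap(a, left++, right);
            else return left;                 // pivot is in final location
            while (left < right && a[left].compareTo(a[right]) < 0)
                left++;                       // find leftmost element > pivot
            if (left < right) swap(a, left, right--);
            else return right;                // pivot is in final location
        }
    }

    static void quickSortR(Comparable[] a, int left, int right) {
        if (left >= right) return;
        int pivot = partition(a, left, right);
        quickSortR(a, left, pivot - 1);       // sort sub-list left of pivot
        quickSortR(a, pivot + 1, right);      // sort sub-list right of pivot
    }

    public static void quickSort(Comparable[] a, int size) {
        quickSortR(a, 0, size - 1);
    }

    public static void main(String[] args) {
        Integer[] data = {5, 2, 8, 1, 9, 3};  // hypothetical test data
        quickSort(data, data.length);
        System.out.println(Arrays.toString(data)); // prints [1, 2, 3, 5, 8, 9]
    }
}
```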

QuickSort Calling Sequence (qSR = quickSortR, p = partition)

    qSR(a, 0, 8)
        p(a, 0, 8) -> 5
        qSR(a, 0, 4)
            p(a, 0, 4) -> 4
            qSR(a, 0, 3)
                p(a, 0, 3) -> 3
                qSR(a, 0, 2)
                    p(a, 0, 2) -> 1
                    qSR(a, 0, 0)
                    qSR(a, 2, 2)
                qSR(a, 4, 3)
            qSR(a, 5, 4)
        qSR(a, 6, 8)
            p(a, 6, 8) -> 6
            qSR(a, 6, 5)
            qSR(a, 7, 8)
                p(a, 7, 8) -> 7
                qSR(a, 7, 6)
                qSR(a, 8, 8)

Quick Sort Trace 1-6
[The original slides stepped through the array contents for the calling sequence above: trace 1 covers qSR(0,8), p(0,8), and qSR(0,4); trace 2 covers qSR(0,4), p(0,4), qSR(0,3), and p(0,3); trace 3 covers the rest of p(0,3), qSR(0,2), and p(0,2); trace 4 covers qSR(0,0), qSR(2,2), qSR(4,3), and qSR(6,8); trace 5 covers p(6,8), qSR(6,5), qSR(7,8), and p(7,8); trace 6 covers qSR(7,6) and qSR(8,8). The array diagrams themselves did not survive the transcript.]

Counting Comparisons
How many comparison operations are required for a quick sort of an n-element collection? The recursive sort method calls partition() once, and then divides the collection:

    if (left >= right) return;
    pivot = partition(anArray, left, right);
    quickSortR(anArray, left, pivot - 1);
    quickSortR(anArray, pivot + 1, right);

Every call to partition() for a list of size k > 1 performs k - 1 comparisons.

Comparisons - Best Case 1
In the best case, the pivot always lands in the middle of the list being sorted, so a list of size k = 2^m - 1 splits into two sub-lists of size (k - 1)/2. Counting comparisons C(k) for k = 15:

    C(15) = 14 + C(7) + C(7) = 14 + (6 + 2 + 2) + (6 + 2 + 2) = 34

In general, we see that C(k) = k - 1 + 2*C((k - 1)/2) for k = 2^m - 1. [The slide diagram shows the recursion tree for n = k = 15, with sub-lists of size 7, 3, and 1, labelled with the best-case comparisons per call to partition().]
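The best-case count can be cross-checked mechanically. This sketch (hypothetical class and method names, not part of the lecture) evaluates C(k) directly from the recurrence and compares it with the closed form quoted on the next slide:

```java
public class BestCaseCount {
    // Best-case comparison count from the recurrence
    // C(k) = k - 1 + 2*C((k - 1)/2), intended for k = 2^m - 1.
    static int C(int k) {
        if (k <= 1) return 0;              // no comparisons for size 0 or 1
        return (k - 1) + 2 * C((k - 1) / 2);
    }

    // Closed-form solution (n + 1)*log2(n + 1) - 2n.
    static double closedForm(int k) {
        return (k + 1) * (Math.log(k + 1) / Math.log(2)) - 2.0 * k;
    }

    public static void main(String[] args) {
        for (int k : new int[]{3, 7, 15, 31}) {
            System.out.println("C(" + k + ") = " + C(k)
                    + ", closed form = " + closedForm(k));
        }
    }
}
```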

Comparisons - Best Case 2
To find a general formula for C(n), we need to solve the recurrence relation:

    C(n) = n - 1 + 2*C((n - 1)/2) for n >= 3

Solving recurrence relations is beyond the scope of this course and is considered in Cmput 204. The solution of this recurrence relation is:

    C(n) = (n + 1)*log(n + 1) - 2n for n >= 3

Therefore, in the best case, the total number of comparisons is (n + 1)*log(n + 1) - 2n = O(n log(n)).

Check the Recurrence Relation
Check that the solution C(n) = (n + 1)*log(n + 1) - 2n is valid for n >= 3:

    C(3)  = 4*log(4) - 6    = 4*2 - 6     = 8 - 6   = 2
    C(7)  = 8*log(8) - 14   = 8*3 - 14    = 24 - 14 = 10
    C(15) = 16*log(16) - 30 = 16*4 - 30   = 64 - 30 = 34
    C(8)  = 9*log(9) - 16   ~ 9*3.17 - 16 ~ 12.5    ~ 13

[The slide diagram shows the recursion tree for n = k = 8, with sub-lists of size 4, 3, 2, 1, and 0, and the number of comparisons in this partitioning.]

Prove Recurrence Solution is Valid
Prove that a solution to C(n) = n - 1 + 2*C((n - 1)/2) is C(n) = (n + 1)*log(n + 1) - 2n for n >= 3.

Note that:

    C((n - 1)/2) = [(n - 1)/2 + 1]*log[(n - 1)/2 + 1] - 2*[(n - 1)/2]
                 = ((n + 1)/2)*log((n + 1)/2) - (n - 1)

Therefore:

    2*C((n - 1)/2) + (n - 1)
      = 2*{((n + 1)/2)*log((n + 1)/2) - (n - 1)} + (n - 1)
      = (n + 1)*log((n + 1)/2) - 2*(n - 1) + (n - 1)
      = (n + 1)*{log(n + 1) - log(2)} - n + 1
      = (n + 1)*log(n + 1) - (n + 1)*log(2) - n + 1
      = (n + 1)*log(n + 1) - (n + 1)*1 - n + 1
      = (n + 1)*log(n + 1) - n - 1 - n + 1
      = (n + 1)*log(n + 1) - 2n = C(n)

Comparisons - Worst Case
In the worst case, the pivot is at one end of the list, so a list of size k gets divided into a list of size 0 and a list of size k - 1. In this case there are n - 1 calls to partition(), on lists of size n, n - 1, ..., 2. Since there are k - 1 comparisons in partition() for a list of size k, the total number of comparisons is:

    (n - 1) + (n - 2) + ... + 1 = (n - 1)*n/2 = O(n^2)

The bad news is that the worst case occurs when the list is already sorted (or nearly sorted): since we select the leftmost element as the pivot, a sorted list puts the pivot at one end on every call. Think about the consequences of this!
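The (n - 1)*n/2 count can be verified empirically by instrumenting the partition code with a comparison counter and running it on an already-sorted array. This harness is a hypothetical sketch, not part of the lecture:

```java
public class WorstCaseCount {
    static long comparisons;

    // Every element comparison goes through this helper so it can be counted.
    static boolean less(int[] a, int i, int j) {
        comparisons++;
        return a[i] < a[j];
    }

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    // Same partition scheme as the slides, on a plain int array.
    static int partition(int[] a, int left, int right) {
        while (true) {
            while (left < right && less(a, left, right)) right--;
            if (left < right) swap(a, left++, right); else return left;
            while (left < right && less(a, left, right)) left++;
            if (left < right) swap(a, left, right--); else return right;
        }
    }

    static void quickSortR(int[] a, int left, int right) {
        if (left >= right) return;
        int p = partition(a, left, right);
        quickSortR(a, left, p - 1);
        quickSortR(a, p + 1, right);
    }

    // Count comparisons for an already-sorted input of size n (worst case).
    public static long countFor(int n) {
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = i;
        comparisons = 0;
        quickSortR(a, 0, n - 1);
        return comparisons;
    }

    public static void main(String[] args) {
        System.out.println(countFor(100) + " vs " + (99 * 100 / 2)); // prints 4950 vs 4950
    }
}
```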

Comparisons - Average Case
The average case must be between the best case and the worst case. But since the best case is O(n log(n)) and the worst case is O(n^2), some analysis is necessary to find the answer. The analysis yields a complex recurrence relation. The average-case number of comparisons is approximately:

    1.386*n*log(n) - 2.846*n

Therefore, the average-case time complexity is O(n log(n)).
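The approximation can be checked by averaging the comparison count over random permutations. This sketch is hypothetical (the linear term is garbled in the transcript; -2.846*n is assumed here, from the standard average-case analysis 2(n+1)H_n - 4n):

```java
import java.util.Random;

public class AverageCaseCount {
    static long comparisons;

    static boolean less(int[] a, int i, int j) { comparisons++; return a[i] < a[j]; }

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    // Same partition scheme as the slides, on a plain int array.
    static int partition(int[] a, int left, int right) {
        while (true) {
            while (left < right && less(a, left, right)) right--;
            if (left < right) swap(a, left++, right); else return left;
            while (left < right && less(a, left, right)) left++;
            if (left < right) swap(a, left, right--); else return right;
        }
    }

    static void quickSortR(int[] a, int left, int right) {
        if (left >= right) return;
        int p = partition(a, left, right);
        quickSortR(a, left, p - 1);
        quickSortR(a, p + 1, right);
    }

    // Average comparisons over `trials` random permutations of 0..n-1.
    public static double averageComparisons(int n, int trials, long seed) {
        Random rng = new Random(seed);
        long total = 0;
        for (int t = 0; t < trials; t++) {
            int[] a = new int[n];
            for (int i = 0; i < n; i++) a[i] = i;
            for (int i = n - 1; i > 0; i--)      // Fisher-Yates shuffle
                swap(a, i, rng.nextInt(i + 1));
            comparisons = 0;
            quickSortR(a, 0, n - 1);
            total += comparisons;
        }
        return (double) total / trials;
    }

    public static void main(String[] args) {
        int n = 1000;
        double predicted = 1.386 * n * (Math.log(n) / Math.log(2)) - 2.846 * n;
        System.out.println(averageComparisons(n, 50, 42) + " vs " + predicted);
    }
}
```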

Counting Assignments
How many assignment operations are required for a quick sort of an n-element collection? In general there are far fewer assignments than comparisons, since there are 3 assignments per swap, and swap is not called for every comparison:

    while ((left < right) &&
           (anArray[left].compareTo(anArray[right]) < 0))
      right--;
    if (left < right)
      swap(anArray, left++, right);
    else
      return left;

Therefore we ignore the assignments.

Time Complexity of Quick Sort
- Best case: O(n log(n)) comparisons.
- Worst case: O(n^2) comparisons.
- Average case: O(n log(n)) comparisons.

Note that the quick sort is inferior to insertion sort and merge sort if the list is sorted, nearly sorted, or reverse sorted.

Space Complexity of Quick Sort
This sort appears to need only one temporary memory element, for the swap. However, in the worst case there are n recursive calls on the stack at once, each of which requires three parameters: a reference to the array, and two ints, left and right. This means that 3*n more words of storage are necessary. In the average and best cases the recursion depth is only log(n), so this extra storage is negligible. Note that we ignored this extra space in the merge sort since there were only log(n) recursive calls on the stack.
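The depth claims can be illustrated by tracking the maximum recursion depth (again a hypothetical harness, not part of the lecture): on sorted input the depth grows linearly with n, while on shuffled input it stays near log(n):

```java
import java.util.Random;

public class DepthDemo {
    static int depth, maxDepth;

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    // Same partition scheme as the slides, on a plain int array.
    static int partition(int[] a, int left, int right) {
        while (true) {
            while (left < right && a[left] < a[right]) right--;
            if (left < right) swap(a, left++, right); else return left;
            while (left < right && a[left] < a[right]) left++;
            if (left < right) swap(a, left, right--); else return right;
        }
    }

    // quickSortR instrumented to record the deepest point of the recursion.
    static void quickSortR(int[] a, int left, int right) {
        depth++;
        maxDepth = Math.max(maxDepth, depth);
        if (left < right) {
            int p = partition(a, left, right);
            quickSortR(a, left, p - 1);
            quickSortR(a, p + 1, right);
        }
        depth--;
    }

    public static int maxDepthFor(int[] a) {
        depth = 0;
        maxDepth = 0;
        quickSortR(a, 0, a.length - 1);
        return maxDepth;
    }

    public static void main(String[] args) {
        int n = 1024;
        int[] sorted = new int[n];
        for (int i = 0; i < n; i++) sorted[i] = i;
        int[] shuffled = sorted.clone();
        Random rng = new Random(42);
        for (int i = n - 1; i > 0; i--) swap(shuffled, i, rng.nextInt(i + 1));
        System.out.println("sorted input depth:   " + maxDepthFor(sorted));   // linear in n
        System.out.println("shuffled input depth: " + maxDepthFor(shuffled)); // near log2(n)
    }
}
```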