CSC 413/513: Intro to Algorithms
Quicksort (Ch. 7)

Quicksort
- Sorts in place
- Sorts in O(n lg n) time in the average case
- Sorts in O(n^2) time in the worst case
- But in practice, it's quick
- And the worst case doesn't happen often (but more on this later...)
- So why would people use it instead of merge sort?

Quicksort
- Another divide-and-conquer algorithm
- The array A[p..r] is partitioned into two non-empty subarrays A[p..q] and A[q+1..r]
- Invariant: all elements in A[p..q] are less than or equal to all elements in A[q+1..r]
- The subarrays are recursively sorted by calls to quicksort
- Unlike merge sort, no combining step: the two sorted subarrays already form a sorted array

Quicksort Code

Quicksort(A, p, r) {
    if (p < r) {
        q = Partition(A, p, r);
        Quicksort(A, p, q);
        Quicksort(A, q+1, r);
    }
}

Partition
- Clearly, all the action takes place in the Partition() function
- Rearranges the subarray in place
- End result: two subarrays, with all values in the first subarray <= all values in the second
- Returns the index of the "pivot" element separating the two subarrays
- How do you suppose we implement this?

Partition In Words

Partition(A, p, r):
- Select an element to act as the "pivot" (which one?)
- Grow two regions, A[p..i] and A[j..r], such that:
  - all elements in A[p..i] <= pivot
  - all elements in A[j..r] >= pivot
- Increment i until A[i] >= pivot
- Decrement j until A[j] <= pivot
- Swap A[i] and A[j]
- Repeat until i >= j
- Return j

Note: slightly different from the book's partition()

Partition Code

Partition(A, p, r)
    x = A[p];
    i = p - 1;
    j = r + 1;
    while (TRUE)
        repeat
            j--;
        until A[j] <= x;
        repeat
            i++;
        until A[i] >= x;
        if (i < j)
            Swap(A, i, j);
        else
            return j;

Illustrate on A = {5, 3, 2, 6, 4, 1, 3, 7} (traced below). What is the running time of Partition()?
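One possible trace of that example (1-indexed, pivot x = A[1] = 5; worked out here, not from the slides):

A = [5, 3, 2, 6, 4, 1, 3, 7], x = 5, i = 0, j = 9
Pass 1: j stops at 7 (A[7] = 3 <= x), i stops at 1 (A[1] = 5 >= x); i < j, swap -> [3, 3, 2, 6, 4, 1, 5, 7]
Pass 2: j stops at 6 (A[6] = 1), i stops at 4 (A[4] = 6); i < j, swap -> [3, 3, 2, 1, 4, 6, 5, 7]
Pass 3: j stops at 5 (A[5] = 4), i stops at 6 (A[6] = 6); i >= j, return j = 5
Result: A[1..5] = [3, 3, 2, 1, 4] (all <= 5) and A[6..8] = [6, 5, 7] (all >= 5)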

Partition Code

Partition(A, p, r)
    x = A[p];
    i = p - 1;
    j = r + 1;
    while (TRUE)
        repeat
            j--;
        until A[j] <= x;
        repeat
            i++;
        until A[i] >= x;
        if (i < j)
            Swap(A, i, j);
        else
            return j;

Partition() runs in O(n) time: i and j sweep toward each other until they meet, so each element is examined O(1) times.
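A minimal runnable C translation of the two pieces of pseudocode above (a sketch, assuming 0-based inclusive bounds; the in-line swap and the demo in main are illustrative, not from the slides):

#include <stdio.h>

/* Hoare-style partition with pivot x = A[p]: returns an index q such
   that A[p..q] <= x and A[q+1..r] >= x (0-based, inclusive bounds). */
int partition(int A[], int p, int r) {
    int x = A[p];
    int i = p - 1;
    int j = r + 1;
    while (1) {
        do { j--; } while (A[j] > x);   /* repeat j-- until A[j] <= x */
        do { i++; } while (A[i] < x);   /* repeat i++ until A[i] >= x */
        if (i < j) {
            int t = A[i]; A[i] = A[j]; A[j] = t;   /* swap */
        } else {
            return j;
        }
    }
}

/* Sort A[p..r] (inclusive); recursion on A[p..q] and A[q+1..r],
   matching the slide's pseudocode. */
void quicksort(int A[], int p, int r) {
    if (p < r) {
        int q = partition(A, p, r);
        quicksort(A, p, q);
        quicksort(A, q + 1, r);
    }
}

int main(void) {
    int A[] = {5, 3, 2, 6, 4, 1, 3, 7};   /* the slide's example array */
    int n = sizeof A / sizeof A[0];
    quicksort(A, 0, n - 1);
    for (int i = 0; i < n; i++) printf("%d ", A[i]);
    printf("\n");                          /* prints: 1 2 3 3 4 5 6 7 */
    return 0;
}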

Analyzing Quicksort
- What will be the worst case for the algorithm? Partition is always unbalanced
- What will be the best case for the algorithm? Partition is perfectly balanced
- Which is more likely? The latter, by far, except...
- Will any particular input elicit the worst case? Yes: already-sorted input

Analyzing Quicksort
In the worst case:
    T(n) = T(n - 1) + Θ(n)
Works out to T(n) = Θ(n^2)
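Unrolling the recurrence makes the quadratic bound concrete (a quick check, not on the slide):

T(n) = T(n-1) + Θ(n)
     = Θ(n) + Θ(n-1) + ... + Θ(1)
     = Θ(n(n+1)/2) = Θ(n^2)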

Analyzing Quicksort
In the best case:
    T(n) = 2T(n/2) + Θ(n)
What does this work out to?
    T(n) = Θ(n lg n)
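This is the same recurrence as merge sort's; by case 2 of the master theorem (a = 2, b = 2, f(n) = Θ(n) = Θ(n^(log_2 2))), T(n) = Θ(n lg n).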

Improving Quicksort
- The real liability of quicksort is that it runs in O(n^2) time on already-sorted input
- Book discusses two solutions: randomize the input array, OR pick a random pivot element
- How will these solve the problem? By ensuring that no particular input can be chosen to make quicksort run in O(n^2) time
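A sketch of the second fix (random pivot), building on the partition() sketch above; the rand()-based index choice is illustrative (srand() seeding and modulo bias are glossed over):

#include <stdlib.h>

/* Swap a randomly chosen element into position p, then partition as
   before. Now no fixed input can force the worst-case split. */
int randomized_partition(int A[], int p, int r) {
    int k = p + rand() % (r - p + 1);   /* random index in [p, r] */
    int t = A[p]; A[p] = A[k]; A[k] = t;
    return partition(A, p, r);
}

void randomized_quicksort(int A[], int p, int r) {
    if (p < r) {
        int q = randomized_partition(A, p, r);
        randomized_quicksort(A, p, q);
        randomized_quicksort(A, q + 1, r);
    }
}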

Analyzing Quicksort: Average Case
- Assuming random input, average-case running time is much closer to O(n lg n) than O(n^2)
- First, a more intuitive explanation/example:
  - Suppose that partition() always produces a 9-to-1 split; this looks quite unbalanced!
  - The recurrence is thus: T(n) = T(9n/10) + T(n/10) + n (using n instead of O(n) for convenience)
  - How deep will the recursion go? (draw it; see the note below)
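A quick answer to that question (worked out here, not on the slide): the deepest path follows the 9/10 side, reaching depth log_(10/9) n, which is about 6.6 lg n, and the work at each level of the tree totals at most n. So even this lopsided-but-constant split ratio still yields O(n lg n) overall.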

Analyzing Quicksort: Average Case
- Intuitively, a real-life run of quicksort will produce a mix of "bad" and "good" splits, randomly distributed across the recursion tree
- Pretend for intuition that they alternate between best-case (n/2 : n/2) and worst-case (n-1 : 1)
- What happens if we bad-split the root node, then good-split the resulting size (n-1) node?
  - We end up with three subarrays, of size 1, (n-1)/2, (n-1)/2
  - Combined cost of the two splits = n + (n - 1) = 2n - 1 = O(n)
  - No worse than if we had good-split the root node!

Analyzing Quicksort: Average Case
- Intuitively, the O(n) cost of a bad split (or 2 or 3 bad splits) can be absorbed into the O(n) cost of each good split
- Thus the running time of alternating bad and good splits is still O(n lg n), just with slightly higher constants
- We will not pursue a more rigorous analysis of the average case in this class