
Instructor Neelima Gupta

Introduction to some tools for designing algorithms through sorting: iterative and divide-and-conquer approaches

Iterative Algorithms: Insertion Sort – an example
x_1, x_2, …, x_{i-1}, x_i, …, x_n
For i = 2 to n: insert the i-th element x_i into the partially sorted list x_1, x_2, …, x_{i-1} (at the r-th position).

An Example: Insertion Sort

InsertionSort(A, n) {
  for i = 2 to n {
    key = A[i]
    j = i - 1
    while (j > 0) and (A[j] > key) {
      A[j+1] = A[j]
      j = j - 1
    }
    A[j+1] = key
  }
}

Trace of key values (the array snapshots were shown as figures on the original slides):
At iteration 1: key = 8
At iteration 2: key = 7
At iteration 3: key = 10
At iteration 4: key = 12
At iteration 5: key = 5
Final output: the sorted array.

Thanks Brijesh Kumar (08) : MCA-12
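To make the trace concrete, here is a minimal runnable C version of the pseudocode above. It is a sketch: the slides' input array is only partially recoverable, so the first element (3) is an assumption; the remaining values match the keys 8, 7, 10, 12, 5 from the trace.

#include <stdio.h>

/* Insertion sort: 0-indexed C version of the slides' 1-indexed pseudocode,
   so the outer loop runs i = 1 .. n-1 instead of i = 2 .. n. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        /* Shift elements greater than key one position to the right. */
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];
            j = j - 1;
        }
        a[j + 1] = key;   /* insert key at its final position */
    }
}

int main(void) {
    int a[] = {3, 8, 7, 10, 12, 5};   /* first element assumed; keys as in the trace */
    int n = sizeof a / sizeof a[0];
    insertion_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);   /* prints 3 5 7 8 10 12 */
    printf("\n");
    return 0;
}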

Analysis: Insertion Sort

InsertionSort(A, n) {
  for i = 2 to n {
    key = A[i]
    j = i - 1
    while (j > 0) and (A[j] > key) {
      A[j+1] = A[j]
      j = j - 1
    }
    A[j+1] = key
  }
}

Thanks: MCA 2012 Dharam Deo Prasad

Running Time Analysis

Statement                              Cost   Times executed
InsertionSort(A, n) {
  for i = 2 to n {                     c1     n
    key = A[i]                         c2     n - 1
    j = i - 1                          c3     n - 1
    while (j > 0) and (A[j] > key) {   c4     Σ_{i=2..n} (T_i + 1)
      A[j+1] = A[j]                    c5     Σ_{i=2..n} T_i
      j = j - 1                        c6     Σ_{i=2..n} T_i
    }
    A[j+1] = key                       c7     n - 1
  }
}

where T_i is the number of times the while-loop body executes in the i-th iteration of the for loop (so the loop test is evaluated T_i + 1 times), c_k is the constant time required for one execution of the statement, and "Times executed" is the number of times the statement is executed.

Thanks: MCA 2012 Dharam Deo Prasad

Total time

T(n) = (c1 + c2 + c3 + c7)·n − (c2 + c3 + c7) + Σ_{i=2..n} [ (c4 + c5 + c6)·T_i + c4 ]

Thanks: MCA 2012 Dharam Deo Prasad

Worst Case

Worst case: T_i = i − 1, i.e. Σ_{i=2..n} T_i = Σ_{i=2..n} (i − 1) = n(n−1)/2.

Hence
T(n) = (c1 + c2 + c3 + c4 + c7)·n − (c2 + c3 + c4 + c7) + (c4 + c5 + c6)·n(n−1)/2
     = a·n² + b·n + c
where
a = (c4 + c5 + c6)/2
b = (c1 + c2 + c3 + c4 + c7) − (c4 + c5 + c6)/2
c = −(c2 + c3 + c4 + c7)

Thanks: MCA 2012 Dharam Deo Prasad

Best Case

Best case (input already sorted): the while-loop body never executes, so T_i = 0 and Σ_{i=2..n} T_i = 0, while the loop test is still evaluated once per iteration.

Hence
T(n) = (c1 + c2 + c3 + c4 + c7)·n − (c2 + c3 + c4 + c7)
     = a·n + b
where
a = c1 + c2 + c3 + c4 + c7
b = −(c2 + c3 + c4 + c7)
i.e. the best-case running time is linear in n.

Thanks: MCA 2012 Dharam Deo Prasad

Analysis of Algorithms Before we move ahead, let us define the notion of analysis of algorithms more formally

Input Size
Time and space complexity is generally a function of the input size. How we characterize input size depends on the problem:
– Sorting: number of input items
– Multiplication: total number of bits
– Graph algorithms: number of nodes and edges
– etc.

Lower Bounds Please understand the following statements carefully. Any algorithm that sorts by removing at most one inversion per comparison does at least n(n-1)/2 comparisons in the worst case. Hence Insertion Sort is optimal in this category of algorithms.

Optimal? What do we mean by the term "optimal"? Answer: if an algorithm runs as fast in the worst case as is possible for its class of algorithms, we say that the algorithm is optimal, i.e. if the worst-case performance of an algorithm matches the lower bound, the algorithm is said to be "optimal".

Inversion: a pair of elements (x_i, x_j) that is out of order, i.e. i < j but x_i > x_j.
Example: 4, 2, 3. No. of pairs = C(3, 2) = 3, out of which (4, 2) is out of order, i.e. an inversion; (2, 3) is in order; (4, 3) is an inversion.
Thanks to: Dileep Jaiswal (11) : MCA 2012

Among n elements there can be as many as n(n−1)/2 inversions in the worst case (a reverse-sorted input, where every pair is an inversion). Thus, if an algorithm sorts by removing at most one inversion per comparison, then it must do at least n(n−1)/2 comparisons in the worst case.
Thanks to: Dileep Jaiswal (11) : MCA 2012
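To make the definition concrete, a brute-force counter can be written directly from it. This O(n²) sketch is an illustration added here, not part of the original slides.

#include <stdio.h>

/* Count the pairs (i, j) with i < j and a[i] > a[j], i.e. the inversions. */
int count_inversions(const int a[], int n) {
    int count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (a[i] > a[j])
                count++;
    return count;
}

int main(void) {
    int a[] = {4, 2, 3};                      /* the slide's example */
    printf("%d\n", count_inversions(a, 3));   /* prints 2: (4,2) and (4,3) */
    return 0;
}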

Insertion Sort
x_1, x_2, …, x_{k-1}, x_k, …, x_{i-1}, x_i
Suppose x_i is inserted after x_{k-1} (i.e. at position k).
No. of comparisons = (i−1) − (k−1) + 1 = i − k + 1
No. of inversions removed = (i−1) − (k−1) = i − k
No. of inversions removed per comparison = (i − k)/(i − k + 1) ≤ 1
Thanks to: Dileep Jaiswal (11) : MCA 2012

Thus Insertion sort falls in the category of comparison algorithms that remove at most one inversion per comparison. Insertion sort is optimal in this category of algorithms. Thanks to:Dileep Jaiswal (11) :MCA 2012

SELECTION SORT
The algorithm works as follows:
1. Find the maximum value in the array.
2. Swap it with the value in the last position.
3. Repeat the steps above for the remainder of the array.
Thanks to: MCA 2012 Chhaya Nathani (9)

Selection Sort: An Example
For i = n down to 2:
a) Select the maximum in a[1..i].
b) Swap it with a[i].
Thanks to: MCA 2012 Chhaya Nathani (9)

Selection Sort
x_1, x_2, …, x_k, …, x_n
Suppose the maximum is located at x_k. Swap it with x_n and continue.
Thanks to: Dileep Jaiswal (11) : MCA 2012

Selection Sort
x_1, x_2, …, x_k, …, x_{n-i+1}, …, x_n
Suppose we are at the i-th iteration and the maximum of the first n − i + 1 elements is located at x_k. Swap it with x_{n-i+1}.
Thanks to: Dileep Jaiswal (11) : MCA 2012

An Example: Selection Sort
(The original slides stepped through the array snapshots as figures, ending with the sorted array: "DONE!!!!")
Thanks to: MCA 2012 Chhaya Nathani (9)

SelectionSort(A, n) {
  for i = n down to 2 {
    max = i
    for j = i-1 down to 1 {
      if (a[j] > a[max])
        max = j
    }
    if (max != i)
      swap a[i] and a[max]
  }
}
Thanks to: MCA 2012 Chhaya Nathani (9)
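A runnable C rendering of this pseudocode follows. It is a sketch: the sample input is illustrative, since the slides' array is not recoverable.

#include <stdio.h>

/* Selection sort: repeatedly move the maximum of a[0..i] to position i. */
void selection_sort(int a[], int n) {
    for (int i = n - 1; i >= 1; i--) {
        int max = i;                  /* index of the maximum so far */
        for (int j = i - 1; j >= 0; j--)
            if (a[j] > a[max])
                max = j;
        if (max != i) {               /* swap a[i] and a[max] */
            int tmp = a[i];
            a[i] = a[max];
            a[max] = tmp;
        }
    }
}

int main(void) {
    int a[] = {5, 1, 4, 2, 3};        /* illustrative input */
    int n = sizeof a / sizeof a[0];
    selection_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);   /* prints 1 2 3 4 5 */
    printf("\n");
    return 0;
}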

ANALYSIS: SELECTION SORT
Selecting the largest element requires scanning all n elements (this takes n − 1 comparisons) and then swapping it into the last position. Finding the next largest element requires scanning the remaining n − 1 elements, and so on. In total:
(n − 1) + (n − 2) + … + 1 = n(n − 1)/2, i.e. Θ(n²) comparisons.
T(n) = Θ(n²)
Thanks to: MCA 2012 Chhaya Nathani (9)

Selection Sort
x_1, x_2, …, x_k, …, x_{n-i+1}, …, x_n
Suppose we are at the i-th iteration and the maximum is located at x_k.
No. of comparisons = (n − i + 1) − 1 = n − i
No. of inversions removed = (n − i + 1) − 1 − (k − 1) = n − i − k + 1
No. of inversions removed per comparison = (n − i − k + 1)/(n − i) ≤ 1
Thanks to: Dileep Jaiswal (11) : MCA 2012

Thus selection sort also falls in the category of comparison algorithms that remove at most one inversion per comparison. Selection sort is also optimal in this category of algorithms.
Thanks to: Dileep Jaiswal (11) : MCA 2012

Merge Sort (Divide and Conquer Technique)
– Divide the list into two nearly equal halves.
– Sort each half recursively.
– Merge the two sorted lists.
Thanks to: Gaurav Gulzar (MCA-11)

Divide into 2 subsequences; then sort the 2 subsequences and merge them.
(The original slides illustrated this with figures: the initial sequence, the two subsequences, and the sorted sequence.)
Thanks to Himanshu (MCA 2012, Roll No 14)

Merging
Let us have two sorted lists:
A: a_1, a_2, a_3, …, a_n
B: b_1, b_2, b_3, …, b_m
Compare a_1 with b_1, put the smaller one in a new array C, and increase the index of the array that held the smaller element.
Thanks to: Gaurav Gulzar (MCA-11)

Let a_1 < b_1.
A: a_1, a_2, a_3, …, a_n
B: b_1, b_2, b_3, …, b_m
Sorted array C: a_1 (a_1 is copied into C; next, compare a_2 with b_1).
Thanks to: Gaurav Gulzar (MCA-11)
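The merge step described above, as a self-contained C sketch (the array contents here are illustrative, not the slides' figures):

#include <stdio.h>

/* Merge sorted arrays a[0..n-1] and b[0..m-1] into c[0..n+m-1].
   Each output element costs at most one comparison. */
void merge(const int a[], int n, const int b[], int m, int c[]) {
    int i = 0, j = 0, k = 0;
    while (i < n && j < m)
        c[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
    while (i < n) c[k++] = a[i++];   /* copy the rest of a, if any */
    while (j < m) c[k++] = b[j++];   /* copy the rest of b, if any */
}

int main(void) {
    int a[] = {10, 30, 60};
    int b[] = {20, 40, 50, 80};
    int c[7];
    merge(a, 3, b, 4, c);
    for (int k = 0; k < 7; k++) printf("%d ", c[k]);   /* 10 20 30 40 50 60 80 */
    printf("\n");
    return 0;
}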

Example 1 (the step-by-step merge of two sorted arrays into C was shown as figures on the original slides).
Thanks to: Gaurav Gulzar (MCA-11)

Worst Case Analysis
If the no. of elements in the first list is n and in the second list is m, then:
Total no. of comparisons = n + m − 1, i.e. O(m + n) in the worst case.
Thanks to: Gaurav Gulzar (MCA-11)

What is the best case? Number of Comparisons in the best case: min(n, m), i.e. Ω(min(n, m)) Thanks to:Gaurav Gulzar (MCA -11)

Example 2 (again shown as step-by-step figures on the original slides).
Thanks to: Gaurav Gulzar (MCA-11)

What is the best case?
Arrays: total number of steps is Ω(m + n) even in the best case, because after the comparisons stop we must still copy the rest of the elements of the bigger array into C.
Linked lists: total number of steps is min(n, m), i.e. Ω(min(n, m)) in the best case, since the remainder of the longer list can be linked on without copying.

Analysis of Merge Sort
Since the size of both lists to be merged is n/2, in either case (arrays or linked lists) the time to merge the two lists is Θ(n).
If T(n) = no. of comparisons performed on an input of size n, then:
T(n) = T(n/2) + T(n/2) + Θ(n) = 2T(n/2) + cn  ∀ n ≥ n_0
∴ T(n) = O(n log n)
Thanks to: Gaurav Gulzar (MCA-11)
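Unrolling the recurrence shows where the bound comes from (a standard expansion, assuming for simplicity that n is a power of 2):

T(n) = 2T(n/2) + cn
     = 4T(n/4) + 2cn
     = …
     = 2^k · T(n/2^k) + k·cn

Setting k = lg n gives T(n) = n·T(1) + cn·lg n = Θ(n log n).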


Final Points
Merging is Θ(m + n) in the case of arrays. If we use linked lists, then it is Ω(min(m, n)) in the best case. With both halves of size n/2, the time for merging is O(n) and Ω(n/2), i.e. Θ(n).
Thanks to: Gaurav Gulzar (MCA-11)

Conclusion
Merge sort = Θ(n log n). Have we beaten the lower bound? No. It just means that merge sort does not fall in the previous category of algorithms: it can remove more than one inversion per comparison.
Thanks to: Gaurav Gulzar (MCA-11)

What are the worst case and the best case for merge sort?

Merge Sort Vs Insertion Sort What is the advantage of merge sort? What is the advantage of insertion sort?

Merge Sort vs Insertion Sort contd.
Merge sort is faster, but does not take advantage of a list that is already partially sorted. Insertion sort takes advantage of a list that is already partially sorted.

Lower Bound
Any algorithm that sorts by comparisons only must do Ω(n lg n) comparisons in the worst case.

Decision Trees
Provide an abstraction of comparison sorts. In a decision tree, each internal node represents a comparison and each leaf one of the possible sorted orders. (The original slide showed the decision tree for insertion sort applied to x1, x2, x3: internal nodes compare x1:x2, x2:x3, and x1:x3, and the 3! = 6 leaves are the possible orderings of x1, x2, x3.)

Decision Trees
What is the minimum number of leaves in a decision tree? n!, one leaf for each permutation of the input. The longest path of the tree gives us the height of the tree, and actually represents the worst-case scenario for the algorithm:
height of a decision tree = Ω(n log n),
i.e. any comparison sort will perform at least Ω(n log n) comparisons in the worst case.

Proof: Let h denote the height of the tree. What is the maximum no. of leaves of a binary tree of height h? 2^h. So
2^h ≥ number of leaves ≥ n!
(where n = no. of elements and n! is a lower bound on the no. of leaves in the decision tree)
⇒ h ≥ log(n!) = Ω(n log n) (by Stirling's approximation).
(Stirling: n! ≈ √(2πn)·(n/e)^n > (n/e)^n, thus log(n!) > n log n − n log e = Ω(n log n).)

Merge Sort is Optimal
Thus the time to comparison-sort n elements is Ω(n lg n). Corollary: merge sort is an asymptotically optimal comparison sort. Later we'll see another sorting algorithm in this category, namely heapsort, that is also optimal. We'll also see some algorithms that beat this bound; needless to say, those algorithms are not purely based on comparisons. They do something extra.

Quick Sort (Divide and Conquer Technique)
– Pick a pivot element x.
– Partition the array into two subarrays around the pivot x, such that the elements in the left subarray are less than or equal to x and the elements in the right subarray are greater than x:
  [ elements ≤ x | x | elements > x ]
– Recursively sort the left subarray and the right subarray.
Thanks to: Krishn Kant Kundan (MCA-19)

Quick Sort (Algorithm)
QUICKSORT(A, p, q)
  if p < q
    k = PARTITION(A, p, q)
    QUICKSORT(A, p, k-1)
    QUICKSORT(A, k+1, q)
Thanks to: Krishn Kant Kundan (MCA-19)
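The slides invoke PARTITION without reproducing it, so here is a runnable C sketch using the Lomuto partition scheme with the last element as pivot. This is one common choice; the lecture's exact partition routine may differ.

#include <stdio.h>

static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Lomuto partition: pivot = a[q]; after the loop, elements <= pivot sit
   left of the pivot's final index and elements > pivot sit right of it. */
static int partition(int a[], int p, int q) {
    int pivot = a[q];
    int i = p - 1;
    for (int j = p; j < q; j++)
        if (a[j] <= pivot)
            swap(&a[++i], &a[j]);
    swap(&a[i + 1], &a[q]);
    return i + 1;               /* the pivot's final position k */
}

void quicksort(int a[], int p, int q) {
    if (p < q) {
        int k = partition(a, p, q);
        quicksort(a, p, k - 1);     /* left subarray:  elements <= pivot */
        quicksort(a, k + 1, q);     /* right subarray: elements >  pivot */
    }
}

int main(void) {
    int a[] = {6, 2, 8, 5, 11, 10, 4, 1, 9, 7, 3};   /* illustrative input */
    int n = sizeof a / sizeof a[0];
    quicksort(a, 0, n - 1);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}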

Quick Sort (Example)
Let us have the following elements in our array. (The original slides stepped through the partition with indices i and j as figures, stopping when the pointers cross, then made recursive calls on the left and right subarrays until the sorted array was obtained.)
Thanks to: Krishn Kant Kundan (MCA-19)

Analyzing Quicksort
Worst case? Partition is always unbalanced.
Worst-case input? An already-sorted input, if the first element is always picked as the pivot.
Best case? Partition is perfectly balanced.
Best-case input? ?
What are the worst-case and best-case inputs when the middle element is always picked as the pivot?

Worst Case of Quicksort
In the worst case:
T(1) = Θ(1)
T(n) = T(n − 1) + Θ(n)
Does the recursion look familiar? T(n) = Θ(n²)
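Expanding the recurrence (a standard step, added here for completeness):

T(n) = T(n−1) + Θ(n) = Θ(n) + Θ(n−1) + … + Θ(1) = Θ(1 + 2 + … + n) = Θ(n(n+1)/2) = Θ(n²).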

Best Case of Quicksort
In the best case:
T(n) = 2T(n/2) + Θ(n)
Does the recursion look familiar? T(n) = Θ(n lg n)

Why does quicksort work well in practice?
Suppose that partition() always produces a 9-to-1 split. This looks quite unbalanced! The recurrence is:
T(n) = T(9n/10) + T(n/10) + n
T(n) = Θ(n log n)
Such an unbalanced partition, and still Θ(n log n) time?
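A recursion-tree sketch explains why: every level of the tree costs at most n, and the larger piece shrinks by a factor of 10/9 at each level, so the tree has depth log_{10/9} n. Hence

T(n) ≤ n · log_{10/9} n + O(n) = O(n log n).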

Why does quicksort work well in practice?
Intuitively, a real-life run of quicksort will produce a mix of "bad" and "good" splits. Pretend for intuition that they alternate between best case (n/2 : n/2) and worst case (n−1 : 1). What happens if we bad-split the root node and then good-split the resulting size (n−1) node?

Why does quicksort work well in practice? (contd.)
Answer: we end up with three subarrays of sizes 1, (n−1)/2, and (n−1)/2. The combined cost of the two splits is n + (n−1) = 2n − 1 = O(n): no worse than if we had good-split the root node!

Why does quicksort work well in practice? (contd.)
Intuitively, the O(n) cost of a bad split (or 2 or 3 bad splits) can be absorbed into the O(n) cost of each good split. Thus the running time of alternating bad and good splits is still O(n lg n), with slightly higher constants. How can we be more rigorous? We'll do an average-case analysis of quicksort later, when we cover randomized algorithms.

Quicksort vs Merge Sort
Merge sort takes O(n lg n) in the worst case; quicksort takes O(n²) in the worst case. So why would anybody use quicksort instead of merge sort? Because in practice quicksort is quick: the worst case doesn't happen often.

Up Next Linear-Time Sorting Algorithms

The End