Lecture 25: Selection sort, reviewed; Insertion sort, reviewed; Merge sort; Running time of merge sort, 2 ways to look at it; Quicksort; Course evaluations


Selection sort

for k = 1 to n-1
– find the kth smallest item
– swap it with the one in the kth position

Number of comparisons:
– n-1 for k=1
– n-2 for k=2
– …
– 1 for k=n-1
Total: n(n-1)/2
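In Java (the language used in this course), the loop above might be sketched as follows; the class name, method name, and test array are just for illustration:

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Sort a ascending by repeatedly selecting the kth smallest item
    // and swapping it into position k.
    public static void selectionSort(double[] a) {
        int n = a.length;
        for (int k = 0; k < n - 1; k++) {
            int min = k;                          // index of smallest item in a[k..n-1]
            for (int i = k + 1; i < n; i++) {     // n-1-k comparisons on this pass
                if (a[i] < a[min]) min = i;
            }
            double t = a[k]; a[k] = a[min]; a[min] = t;  // swap into position k
        }
    }

    public static void main(String[] args) {
        double[] a = {5, 2, 4, 1, 3};
        selectionSort(a);
        System.out.println(Arrays.toString(a));
    }
}
```

Note that the comparison counts in the inner loop add up to (n-1) + (n-2) + … + 1 = n(n-1)/2, exactly as tallied above.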

Insertion sort

for k = 2 to n
– move the kth item into sorted order with respect to the first k-1 items, which are already sorted
– involves moving sorted items over: can do this at the same time as finding where the kth item has to go

Number of comparisons, worst case:
– 1 for k=2
– 2 for k=3
– …
– n-1 for k=n
Total: n(n-1)/2
On average, roughly n(n-1)/4, but involves more moving of data than selection sort
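The "move items over while searching" idea can be sketched in Java like this (names are illustrative):

```java
public class InsertionSortDemo {
    // Insert the kth item into the sorted prefix a[0..k-1], shifting larger
    // items right in the same loop that searches for its position.
    public static void insertionSort(double[] a) {
        for (int k = 1; k < a.length; k++) {
            double x = a[k];
            int i = k - 1;
            while (i >= 0 && a[i] > x) {  // one comparison per step; worst case k steps
                a[i + 1] = a[i];          // move sorted item over
                i--;
            }
            a[i + 1] = x;                 // x lands in its sorted position
        }
    }

    public static void main(String[] args) {
        double[] a = {5, 2, 4, 1, 3};
        insertionSort(a);
        System.out.println(java.util.Arrays.toString(a));
    }
}
```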

Bubble sort

Pass through the data, swapping neighbors that are out of order
After the first pass, the largest element has "sunk" to the bottom
Repeat until no more swaps are needed, considering one less pair each time
Cost: n-1 comparisons for the first pass, n-2 for the 2nd pass, etc.
May need n-1 passes, so also O(n^2)
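A sketch in the same style, including the early exit when a pass makes no swaps:

```java
public class BubbleSortDemo {
    // Repeatedly swap out-of-order neighbors; each pass sinks the largest
    // remaining element to the end, so consider one less pair each time.
    public static void bubbleSort(double[] a) {
        for (int last = a.length - 1; last > 0; last--) {
            boolean swapped = false;
            for (int i = 0; i < last; i++) {      // last comparisons on this pass
                if (a[i] > a[i + 1]) {
                    double t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                    swapped = true;
                }
            }
            if (!swapped) return;                 // no swaps: already sorted
        }
    }

    public static void main(String[] args) {
        double[] a = {5, 2, 4, 1, 3};
        bubbleSort(a);
        System.out.println(java.util.Arrays.toString(a));
    }
}
```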

Merge sort

Recursive:
– If only one item, return
– Sort the first half by merge sort
– Sort the second half by merge sort
– Merge the results

Keys to efficiency

Recursion: reduce the problem to two problems of half the size (also known as divide and conquer)
Merge step: only requires n comparisons, where n is the number of items to be merged
Need two arrays, but no more
Do not construct arrays inside the recursive call! (The comprehensive edition of Liang does this)
How should we implement this in Java?
public static sort(what parameters?)
We cannot pass "half an array", so what to do?

Details for sort

public static void sort(int left, int right, double[] x, double[] temp) {
    if (left == right) {
        return;
    }
    int mid = (left + right) / 2;
    sort(left, mid, x, temp);         // sort first half of x using temp
    sort(mid + 1, right, x, temp);    // sort second half of x using temp
    merge(left, mid, right, x, temp); // merge the halves into temp
    copy(left, right, temp, x);       // copy temp back to x
}

Details for merge(left, mid, right, x, temp)

Keep 3 ints: i, j, k
– i initialized to left
– j initialized to mid+1
– k initialized to left
Advance until i > mid or j > right
Then copy the rest of whichever half remains
Cute trick: 2 additional while loops
Another trick: do nothing if x[mid] <= x[mid+1]
For this reason, it is better to move the copy inside merge
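These bullets can be turned into code consistent with the sort on the previous slide. One detail differs in this sketch: the copy step lives inside merge, as the last bullet suggests, so the early return when x[mid] <= x[mid+1] safely skips all work. A small driver sort (without the separate copy call) is included so the sketch is self-contained:

```java
public class MergeDemo {
    // Merge sorted halves x[left..mid] and x[mid+1..right] via temp, then copy back.
    static void merge(int left, int mid, int right, double[] x, double[] temp) {
        if (x[mid] <= x[mid + 1]) return;      // halves already in order: do nothing
        int i = left, j = mid + 1, k = left;
        while (i <= mid && j <= right)          // one comparison per item placed
            temp[k++] = (x[i] <= x[j]) ? x[i++] : x[j++];
        while (i <= mid)  temp[k++] = x[i++];   // the 2 additional while loops:
        while (j <= right) temp[k++] = x[j++];  // copy the rest of whichever half remains
        for (int m = left; m <= right; m++)     // copy, moved inside merge
            x[m] = temp[m];
    }

    // Driver: same shape as the sort on the previous slide, minus the copy call.
    static void sort(int left, int right, double[] x, double[] temp) {
        if (left >= right) return;
        int mid = (left + right) / 2;
        sort(left, mid, x, temp);
        sort(mid + 1, right, x, temp);
        merge(left, mid, right, x, temp);
    }

    public static void main(String[] args) {
        double[] x = {5, 2, 4, 1, 3, 6};
        sort(0, x.length - 1, x, new double[x.length]);
        System.out.println(java.util.Arrays.toString(x));
    }
}
```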

Number of comparisons

Let C(n) = number of comparisons to sort an array of length n by merge sort
Clearly, C(n) = 2 C(n/2) + (n-1) if n is even and > 0
(or a little less, if we program merge efficiently)
C(1) = 0
C(2) = 1
C(4) = 5
C(8) = 17
C(16) = 49
Hard to see a pattern, but when we double n, the number of comparisons is also doubled, plus an extra n-1 comparisons

Let's simplify this slightly

Count the merge as n comparisons, instead of n-1:
C(n) = 2 C(n/2) + n if n is even and > 0
C(1) = 0
C(2) = 2
C(4) = 8
C(8) = 24
C(16) = 64
Do you see a pattern now?
Assume n is a power of 2, say n = 2^k
By inspection, C(n) = k·n
In other words, C(n) = n log2(n)
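A quick way to check the pattern: compute the simplified recurrence directly and compare with k·n for powers of 2 (the class name here is illustrative):

```java
public class MergeCount {
    // The simplified comparison count: C(n) = 2*C(n/2) + n, with C(1) = 0.
    static long C(long n) {
        return (n <= 1) ? 0 : 2 * C(n / 2) + n;
    }

    public static void main(String[] args) {
        // For n = 2^k, C(n) should equal k*n, i.e. n*log2(n).
        for (long n = 2, k = 1; n <= 1024; n *= 2, k++) {
            System.out.println("C(" + n + ") = " + C(n) + ", k*n = " + (k * n));
        }
    }
}
```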

Proof by induction on k

Base case: C(1) = 1·log2(1) = 0
Inductive hypothesis: suppose true for k-1: C(2^(k-1)) = (k-1)·2^(k-1)
(Just like: assume the recursive magic works)
Want to prove it is true for k:
C(2^k) = 2 C(2^(k-1)) + 2^k
       = 2 (k-1) 2^(k-1) + 2^k
       = (k-1) 2^k + 2^k
       = k·2^k

Proof that all horses have the same color

Proof by induction on the number of horses:
If n=1, true
Inductive hypothesis: suppose true for n-1: if you have n-1 horses, they are all the same color
Given n horses in a corral, take 1 out: the rest must all have the same color
Put it back and take a different one out: the rest must all have the same color
Therefore they are all the same color
What is wrong with this?

An easier way to think about merge sort

1 call to sort an array of length n
2 calls to sort arrays of length n/2
4 calls to sort arrays of length n/4
8 calls to sort arrays of length n/8
…
n calls to sort arrays of length 1
At each level, the merge operations take a total of at most n comparisons
Since the number of levels is log2 n, the total number of comparisons is at most n log2 n

What if n is not a power of 2?

Does not really matter: mid = (left + right)/2 rounds down, which is fine
The number of comparisons is still approximately n log2(n) (which is not an integer)
We say the running time is O(n log2 n)
The base of the logarithm doesn't matter much, because log10(n) = log2(n) / log2(10)

Quicksort

Similar recursive idea, but avoids the need for 2 arrays
Choose a pivot element, then divide the array into two parts: one with elements ≤ pivot, and one with elements > pivot
Possible ways to choose the pivot: see the next slide
To partition the array, loop from the left to find the first element > pivot, and loop from the right to find the first element ≤ pivot
Swap them and repeat
Place the pivot in the right place
Make two recursive calls
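The partition loop described above might be coded as follows. This sketch simply uses the first element as the pivot, which is only one of the choices discussed on the next slide, so treat it as an illustration rather than the only way to write it:

```java
public class QuickSortDemo {
    // Sort a[left..right] in place: partition around a pivot, then recurse.
    public static void quicksort(double[] a, int left, int right) {
        if (left >= right) return;
        double pivot = a[left];                   // pivot choice: first position
        int i = left, j = right + 1;
        while (true) {
            // scan from the left for the first element >= pivot
            do { i++; } while (i <= right && a[i] < pivot);
            // scan from the right for the first element <= pivot
            do { j--; } while (a[j] > pivot);
            if (i >= j) break;                    // scans have crossed
            double t = a[i]; a[i] = a[j]; a[j] = t;  // swap them and repeat
        }
        double t = a[left]; a[left] = a[j]; a[j] = t; // place pivot in its final spot
        quicksort(a, left, j - 1);                // two recursive calls
        quicksort(a, j + 1, right);
    }

    public static void main(String[] args) {
        double[] a = {3, 1, 4, 1, 5, 9, 2, 6};
        quicksort(a, 0, a.length - 1);
        System.out.println(java.util.Arrays.toString(a));
    }
}
```

Note how only the one array is needed: partitioning rearranges elements in place, unlike the merge step.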

How to choose the pivot

Goal: choose a pivot that divides the array approximately in half, but choose it fast
Ideas?
– first position
– middle position
– average value
– median value
– randomly chosen
– median of the items in the first, middle and last positions

Number of comparisons for quicksort

Assuming the array is divided approximately in half each time: the same as merge sort
Advantage: only one array
Disadvantage: items with equal values may end up interchanged from their original order
This doesn't matter for primitive types, but may matter for objects
For this reason, the Arrays.sort methods use quicksort for primitive types and merge sort for Comparable objects

Other sorts

Heapsort: slightly slower than quicksort on average, but the number of comparisons is n log2 n even in the worst case, like merge sort
Radix sort

Big O notation

We say a function f(n) is O(g(n)) if there is a constant c so that f(n) ≤ c·g(n) for all sufficiently large n
Thus we say the number of comparisons for merge sort is O(n log n)
We don't need to write the base
Which of these functions are O of the others: log n, 2^n, 10^n, n^2, n^3, n, n log n?