
1 Sorting Algorithms Sections 7.1 to 7.7

2 Comparison-Based Sorting Input – 2,3,1,15,11,23,1 Output – 1,1,2,3,11,15,23 Class ‘Animals’ – how do we sort objects such as Rabbit, Cat, Rat? The class must specify how to compare its objects; in general, this requires support for comparison operators such as ‘<’

3 Sorting Definitions In-place sorting –The sort requires no auxiliary data structure for storing intermediate results External sorting –Sorting of records that do not fit in main memory Stable sorting –Elements that compare equal retain their original relative order

4 C++ STL sorting algorithms sort function template –void sort(iterator begin, iterator end) –void sort(iterator begin, iterator end, Comparator cmp) –begin and end are the start and end markers of the container (or of a range within it) –The container must support random access, such as vector –sort is not a stable sort; stable_sort() is stable
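A brief usage sketch of the calls listed above (the Animal struct and its fields are illustrative, not from the slides):

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct Animal {
    std::string name;
    int weight;
};

int main() {
    std::vector<int> v = {2, 3, 1, 15, 11, 23, 1};
    std::sort(v.begin(), v.end());                     // default comparison: operator<

    std::vector<Animal> zoo = {{"Rabbit", 2}, {"Cat", 4}, {"Rat", 1}};
    std::stable_sort(zoo.begin(), zoo.end(),           // custom comparator, stable
        [](const Animal& a, const Animal& b) { return a.weight < b.weight; });

    for (int x : v) std::cout << x << ' ';             // 1 1 2 3 11 15 23
    std::cout << '\n';
    for (const auto& a : zoo) std::cout << a.name << ' ';   // Rat Rabbit Cat
    std::cout << '\n';
}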

5 Heapsort Min heap –Build a binary minHeap of N elements O(N) time –Then perform N findMin and deleteMin operations log(N) time per deleteMin –Total complexity O(N log N) –It requires an extra array to store the results Max heap –Storing each deleted element in the freed slot at the end of the array avoids the need for an extra array

6 Heapsort Implementation
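The implementation on this slide was shown as an image; below is a minimal in-place max-heap sketch in the textbook's style (the percDown helper name follows the book; the exact code may differ):

#include <utility>
#include <vector>

inline int leftChild(int i) { return 2 * i + 1; }

// Percolate a[i] down within the first n elements (max-heap order).
template <typename T>
void percDown(std::vector<T>& a, int i, int n) {
    T tmp = std::move(a[i]);
    int child;
    for (; leftChild(i) < n; i = child) {
        child = leftChild(i);
        if (child != n - 1 && a[child] < a[child + 1]) ++child;   // pick the larger child
        if (tmp < a[child]) a[i] = std::move(a[child]);
        else break;
    }
    a[i] = std::move(tmp);
}

template <typename T>
void heapsort(std::vector<T>& a) {
    int n = (int)a.size();
    for (int i = n / 2 - 1; i >= 0; --i)    // buildHeap: O(N)
        percDown(a, i, n);
    for (int j = n - 1; j > 0; --j) {       // N deleteMax operations
        std::swap(a[0], a[j]);              // move the current maximum to the end
        percDown(a, 0, j);                  // restore the heap on a[0..j-1]
    }
}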

7 Example (MaxHeap) – figure showing the array after buildHeap and after the first deleteMax

8 Bubble Sort Simple to implement Compare neighboring elements and swap them if they are out of order Two nested loops O(n²)

9 Bubble Sort vector a contains n elements to be sorted.

for (i = 0; i < n - 1; i++) {
    for (j = 0; j < n - 1 - i; j++) {
        if (a[j+1] < a[j]) {    /* compare neighbors */
            tmp = a[j];         /* swap a[j] and a[j+1] */
            a[j] = a[j+1];
            a[j+1] = tmp;
        }
    }
}

10 Bubble Sort Example
2, 3, 1, 15    // initial array
2, 1, 3, 15    // after first pass
1, 2, 3, 15    // after second pass
1, 2, 3, 15    // after third pass

11 Insertion Sort An O(n²) sort N-1 passes –After pass p, the elements in positions 0 through p are sorted –Each pass inserts the next element into its correct position within the already-sorted prefix

12 Insertion Sort
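The code on this slide was shown as an image; a minimal sketch of the standard insertion sort loop (vector-based, to match the bubble sort code above):

#include <utility>
#include <vector>

template <typename T>
void insertionSort(std::vector<T>& a) {
    for (int p = 1; p < (int)a.size(); ++p) {
        T tmp = std::move(a[p]);            // the element to insert in this pass
        int j;
        for (j = p; j > 0 && tmp < a[j-1]; --j)
            a[j] = std::move(a[j-1]);       // shift larger elements one slot right
        a[j] = std::move(tmp);              // drop it into its correct position
    }
}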

13 Insertion Sort: Example

14 Insertion Sort - Analysis Pass p involves at most p comparisons Total comparisons ≤ ∑ i for i = 1 to n-1 = n(n-1)/2 = O(n²)

15 Insertion Sort - Analysis Worst Case ? – Reverse sorted list – Max possible number of comparisons – O(n²) Best Case ? – Sorted input – 1 comparison in each pass – O(n)

16 Lower Bound on ‘Simple’ Sorting Simple sorting –Performs only adjacent exchanges –Examples: bubble sort and insertion sort Inversion – an ordered pair (i, j) such that i < j but a[i] > a[j] –Example: 34, 8, 64, 51, 32, 21 –Inversions include (34,8), (34,32), (34,21), (64,51), … Once an array has no inversions, it is sorted So the bound for these sorts depends on the average number of inversions that must be removed
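To make the definition concrete, a small brute-force helper (the function name is made up for illustration):

#include <vector>

// Count ordered pairs (i, j) with i < j but a[i] > a[j].
int countInversions(const std::vector<int>& a) {
    int count = 0;
    for (size_t i = 0; i < a.size(); ++i)
        for (size_t j = i + 1; j < a.size(); ++j)
            if (a[i] > a[j]) ++count;
    return count;
}

// countInversions({34, 8, 64, 51, 32, 21}) returns 9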

17 Theorem 1 The average number of inversions in an array of N distinct elements is N(N-1)/4 –For any list L, consider its reverse list Lr L: 34, 8, 64, 51, 32, 21 Lr: 21, 32, 51, 64, 8, 34 –Every one of the N(N-1)/2 element pairs is an inversion in exactly one of L and Lr –So L and Lr together contain N(N-1)/2 inversions, and on average a list has N(N-1)/4

18 Theorem 2 Any algorithm that sorts by exchanging adjacent elements requires Ω(n²) average time –Each adjacent exchange removes at most one inversion –Average number of inversions = Ω(n²), so the number of exchanges required = Ω(n²)

19 Bound for Comparison-Based Sorting O(N log N) –The optimal bound for comparison-based sorting algorithms –Achieved by Quicksort (on average), Mergesort, and Heapsort

20 Mergesort Divide the N values to be sorted into two halves Recursively sort each half using Mergesort –Base case N = 1 → no sorting required Merge the two (sorted) halves –An O(N) operation

21 Merging Takes O(N) Time In each step, one element of the output array C gets filled –Filling each element takes constant time –So the total merge time is O(N)

22 Mergesort Example

23 Mergesort Implementation
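The implementation on this slide was shown as an image; a minimal sketch in the textbook's style (a recursive mergeSort plus a merge into a shared temporary array; the exact code may differ):

#include <utility>
#include <vector>

// Merge the sorted ranges a[left..center] and a[center+1..right] using tmp.
template <typename T>
void merge(std::vector<T>& a, std::vector<T>& tmp, int left, int center, int right) {
    int i = left, j = center + 1, k = left;
    while (i <= center && j <= right) {
        if (a[j] < a[i]) tmp[k++] = std::move(a[j++]);   // take from the right half
        else             tmp[k++] = std::move(a[i++]);   // take from the left half (stable on ties)
    }
    while (i <= center) tmp[k++] = std::move(a[i++]);    // copy the rest of the left half
    while (j <= right)  tmp[k++] = std::move(a[j++]);    // copy the rest of the right half
    for (int p = left; p <= right; ++p) a[p] = std::move(tmp[p]);
}

template <typename T>
void mergeSort(std::vector<T>& a, std::vector<T>& tmp, int left, int right) {
    if (left >= right) return;              // base case: 0 or 1 element
    int center = (left + right) / 2;
    mergeSort(a, tmp, left, center);        // sort the left half
    mergeSort(a, tmp, center + 1, right);   // sort the right half
    merge(a, tmp, left, center, right);     // merge the two sorted halves: O(N)
}

template <typename T>
void mergeSort(std::vector<T>& a) {
    std::vector<T> tmp(a.size());
    mergeSort(a, tmp, 0, (int)a.size() - 1);
}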


25 Mergesort Complexity Analysis Let T(N) be the running time for input size N Recurrence relation –T(1) = 1 –T(N) = 2T(N/2) + N –T(N) = 4T(N/4) + 2N –T(N) = 8T(N/8) + 3N –… –T(N) = 2^k T(N/2^k) + kN –For k = log N: T(N) = N·T(1) + N log N Complexity: O(N log N)

26 Quicksort Fastest known sorting algorithm in practice –Caveat: not stable Average-case complexity: O(N log N) Worst-case complexity: O(N²) –Rarely happens if implemented well

27 Quicksort Outline Divide-and-conquer approach. Given array S to be sorted:
1. If the size of S is ≤ 1, then done
2. Pick any element v in S as the pivot
3. Partition S - {v} (the remaining elements of S) into two groups
   S1 = {all elements of S - {v} that are smaller than v}
   S2 = {all elements of S - {v} that are larger than v}
4. Return {quicksort(S1) followed by v followed by quicksort(S2)}
The trick lies in handling the partitioning (step 3): –Picking a good pivot –Efficiently partitioning in place
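As a warm-up, a conceptual (not in-place) sketch of the outline above, using separate vectors for S1 and S2; the in-place partitioning used in practice appears on the following slides:

#include <vector>

std::vector<int> quicksortSimple(const std::vector<int>& s) {
    if (s.size() <= 1) return s;            // step 1: base case
    int v = s[0];                           // step 2: pick a pivot (first element, for simplicity)
    std::vector<int> s1, s2, equal;
    for (int x : s) {                       // step 3: partition around v
        if (x < v) s1.push_back(x);
        else if (x > v) s2.push_back(x);
        else equal.push_back(x);            // elements equal to the pivot
    }
    std::vector<int> result = quicksortSimple(s1);             // step 4: combine
    result.insert(result.end(), equal.begin(), equal.end());
    std::vector<int> right = quicksortSimple(s2);
    result.insert(result.end(), right.begin(), right.end());
    return result;
}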

28 Quicksort Example – figure showing the steps: select pivot, partition, recursive calls, merge

29 Quicksort Structure What is the time complexity if the pivot is always the median? Note: partitioning can be performed in O(N) time What is the worst-case height of the recursion tree?

30 Picking the Pivot How would you pick one? Strategy 1: Pick the first element in S –Works only if the input is random –What if input S is sorted, or even mostly sorted? All the remaining elements would go into either S1 or S2! Terrible performance!

31 Picking the Pivot (contd.) Strategy 2: Pick the pivot randomly –Would usually work well, even for mostly sorted input –Unless the random number generator is not quite random! –Plus random number generation is an expensive operation

32 Picking the Pivot (contd.) Strategy 3: Median-of-Three Partitioning –Ideally, the pivot should be the median of the input array S Median = element in the middle of the sorted sequence –Would divide the input into two almost equal partitions –Unfortunately, it's hard to calculate the median quickly, even though it can be done in O(N) time! –So, use an approximate median Pivot = median of the left-most, right-most, and center elements of the array S This solves the problem of sorted input

33 Picking the Pivot (contd.) Example: Median-of-three Partitioning –Let input S = {6, 1, 4, 9, 0, 3, 5, 2, 7, 8} –left=0 and S[left] = 6 –right=9 and S[right] = 8 –center = (left+right)/2 = 4 and S[center] = 0 –Pivot = Median of S[left], S[right], and S[center] = median of 6, 8, and 0 = S[left] = 6

34 Partitioning Algorithm Original input: S = {6, 1, 4, 9, 0, 3, 5, 2, 7, 8} Get the pivot out of the way by swapping it with the last element Have two ‘iterators’ – i and j –i starts at the first element and moves forward –j starts at the last element and moves backward

35 Partitioning Algorithm (contd.) 3. While (i < j) –Move i to the right until we find a number greater than the pivot –Move j to the left until we find a number smaller than the pivot –If (i < j), swap(S[i], S[j]) –(The effect is to push larger elements to the right and smaller elements to the left) 4. Swap the pivot with S[i]

36 Partitioning Algorithm Illustrated – figure stepping through the example: i and j alternately move and swap elements until they cross, then S[i] is swapped with the pivot

37 Dealing with small arrays For small arrays (say, N ≤ 20), –Insertion sort is faster than quicksort Quicksort is recursive –So it can spend a lot of time sorting small arrays Hybrid algorithm: –Switch to using insertion sort when problem size is small (say for N < 20 )

38 Quicksort Driver Routine

39 Quicksort Pivot Selection Routine Reorder a[left], a[center], and a[right] in place so that their median ends up in a[center] The pivot is now in a[center] Swap the pivot a[center] with a[right-1]

40 Quicksort Routine Sorts the array in place, i.e., it has a side effect on its argument
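Slides 38–40 showed the code as images; below is a minimal combined sketch in the textbook's style (driver, median-of-three pivot selection, and the recursive routine with a cutoff to insertion sort for small subarrays; the exact code may differ):

#include <utility>
#include <vector>

// Insertion sort on the subarray a[left..right] (used below the cutoff).
template <typename T>
void insertionSort(std::vector<T>& a, int left, int right) {
    for (int p = left + 1; p <= right; ++p) {
        T tmp = std::move(a[p]);
        int j;
        for (j = p; j > left && tmp < a[j-1]; --j)
            a[j] = std::move(a[j-1]);
        a[j] = std::move(tmp);
    }
}

// Median-of-three (slide 39): order a[left], a[center], a[right],
// then hide the pivot at a[right-1].
template <typename T>
const T& median3(std::vector<T>& a, int left, int right) {
    int center = (left + right) / 2;
    if (a[center] < a[left])   std::swap(a[left], a[center]);
    if (a[right]  < a[left])   std::swap(a[left], a[right]);
    if (a[right]  < a[center]) std::swap(a[center], a[right]);
    std::swap(a[center], a[right - 1]);     // pivot now at a[right-1]
    return a[right - 1];
}

// Recursive routine (slide 40): sorts a[left..right] in place.
template <typename T>
void quicksort(std::vector<T>& a, int left, int right) {
    if (right - left + 1 <= 20) {           // small subarray (slide 37): use insertion sort
        insertionSort(a, left, right);
        return;
    }
    const T& pivot = median3(a, left, right);
    int i = left, j = right - 1;            // partition a[left+1..right-2]
    for (;;) {
        while (a[++i] < pivot) {}           // move i right past elements smaller than the pivot
        while (pivot < a[--j]) {}           // move j left past elements larger than the pivot
        if (i < j) std::swap(a[i], a[j]);
        else break;                         // i and j have crossed
    }
    std::swap(a[i], a[right - 1]);          // restore the pivot: swap it with a[i]
    quicksort(a, left, i - 1);              // sort the smaller elements
    quicksort(a, i + 1, right);             // sort the larger elements
}

// Driver routine (slide 38).
template <typename T>
void quicksort(std::vector<T>& a) {
    if (!a.empty()) quicksort(a, 0, (int)a.size() - 1);
}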