
Sorting

Introduction Common problem: sort a list of values from lowest to highest. Examples: a list of exam scores, the words of a dictionary in alphabetical order, students' names listed alphabetically, student records sorted by ID#. Generally, we are given a list of records that have keys; these keys are used to define an ordering of the items in the list.

C++ Implementation of Sorting Use C++ templates to implement a generic sorting function. This allows the same function to sort items of any class. However, the class to be sorted must provide the following overloaded operators: assignment (=) and ordering (>, <, ==). Example class: the C++ STL string class. In this lecture we will sort integers; however, the algorithms are general and can be applied to any class that meets these requirements, as the sketch below illustrates.
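For illustration only, a minimal sketch of a user-defined type that satisfies those operator requirements (the Score type and its members are invented for this example, not part of the slides); any such type could be passed to the template sorting functions shown later.

#include <string>

// Hypothetical record type: usable with the generic sorts because it
// defines the comparison operators they rely on (operator= is
// compiler-generated for this struct).
struct Score
{
    std::string name;
    int points;

    bool operator<(const Score& other) const  { return points < other.points; }
    bool operator>(const Score& other) const  { return points > other.points; }
    bool operator==(const Score& other) const { return points == other.points; }
};

// Usage sketch:
//   Score exams[3] = { {"Ann", 91}, {"Bob", 78}, {"Cam", 85} };
//   selection_sort(exams, 3);   // same template call as for an int array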

Quadratic Sorting Algorithms We are given n records to sort. There are a number of simple sorting algorithms whose worst-case and average-case performance is quadratic, O(n²): selection sort, insertion sort, and bubble sort.

Sorting an Array of Integers Example: we are given an array of six integers that we want to sort from smallest to largest [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm Start by finding the smallest entry. [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm Swap the smallest entry with the first entry. [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm Swap the smallest entry with the first entry. [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm Part of the array is now sorted. Sorted side Unsorted side [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm Find the smallest element in the unsorted side. Sorted side Unsorted side [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm Swap with the front of the unsorted side. Sorted side Unsorted side [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm We have increased the size of the sorted side by one element. Sorted side Unsorted side [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm The process continues... Sorted side Unsorted side Smallest from unsorted Smallest from unsorted [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm The process continues... Sorted side Unsorted side [0] [1] [2] [3] [4] [5] Swap with front Swap with front

The Selection Sort Algorithm The process continues... Sorted side Unsorted side Sorted side is bigger Sorted side is bigger [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm The process keeps adding one more number to the sorted side. The sorted side has the smallest numbers, arranged from small to large. Sorted side Unsorted side [0] [1] [2] [3] [4] [5]

The Selection Sort Algorithm We can stop when the unsorted side has just one number, since that number must be the largest number. [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

The Selection Sort Algorithm The array is now sorted. We repeatedly selected the smallest element, and moved this element to the front of the unsorted side. [0] [1] [2] [3] [4] [5]

template <class Item>
void selection_sort(Item data[ ], size_t n)
{
    size_t i, j, smallest;
    Item temp;
    if (n < 2) return;                    // nothing to sort!!
    for (i = 0; i < n-1; ++i)
    {
        // find smallest in unsorted part of array
        smallest = i;
        for (j = i+1; j < n; ++j)
            if (data[smallest] > data[j]) smallest = j;
        // put it at front of unsorted part of array (swap)
        temp = data[i];
        data[i] = data[smallest];
        data[smallest] = temp;
    }
}

Selection Sort Time Analysis In O-notation, what is: Worst case running time for n items? Average case running time for n items? Steps of algorithm: for i = 1 to n-1: find the smallest key in the unsorted part of the array; swap the smallest item to the front of the unsorted part; decrease the size of the unsorted part by 1.

Selection Sort Time Analysis In O-notation, what is: Worst case running time for n items? Average case running time for n items? Steps of algorithm: for i = 1 to n-1 (O(n) iterations): find the smallest key in the unsorted part of the array (O(n)); swap the smallest item to the front of the unsorted part; decrease the size of the unsorted part by 1. Selection sort analysis: O(n²)

In the selection_sort code above, the outer loop over i executes O(n) times.

For each outer iteration, the inner loop that scans the unsorted part for the smallest element also does O(n) work in the worst case, so the nested loops give O(n) · O(n) = O(n²).

The Insertion Sort Algorithm The Insertion Sort algorithm also views the array as having a sorted side and an unsorted side. [0] [1] [2] [3] [4] [5]

The Insertion Sort Algorithm The sorted side starts with just the first element, which is not necessarily the smallest element. [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

The Insertion Sort Algorithm The sorted side grows by taking the front element from the unsorted side... [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

The Insertion Sort Algorithm...and inserting it in the place that keeps the sorted side arranged from small to large. [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

The Insertion Sort Algorithm [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

The Insertion Sort Algorithm Sometimes we are lucky and the new inserted item doesn't need to move at all. [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

The Insertion Sort Algorithm Sometimes we are lucky twice in a row. [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

How to Insert One Element 1. Copy the new element to a separate location. [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

How to Insert One Element 2. Shift elements in the sorted side, creating an open space for the new element. [0] [1] [2] [3] [4] [5]

How to Insert One Element 2. Shift elements in the sorted side, creating an open space for the new element. [0] [1] [2] [3] [4] [5]

How to Insert One Element 2. Continue shifting elements... [0] [1] [2] [3] [4] [5]

How to Insert One Element 2. Continue shifting elements... [0] [1] [2] [3] [4] [5]

How to Insert One Element 2. ...until you reach the location for the new element. [0] [1] [2] [3] [4] [5]

How to Insert One Element 3. Copy the new element back into the array, at the correct location. [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

How to Insert One Element The last element must also be inserted. Start by copying it... [0] [1] [2] [3] [4] [5] Sorted side Unsorted side

Sorted Result [0] [1] [2] [3] [4] [5]

template <class Item>
void insertion_sort(Item data[ ], size_t n)
{
    size_t i, j;
    Item temp;
    if (n < 2) return;                    // nothing to sort!!
    for (i = 1; i < n; ++i)
    {
        // take next item at front of unsorted part of array
        // and insert it in appropriate location in sorted part of array
        temp = data[i];
        for (j = i; j > 0 and data[j-1] > temp; --j)
            data[j] = data[j-1];          // shift element one position to the right
        data[j] = temp;
    }
}

Insertion Sort Time Analysis In O-notation, what is: Worst case running time for n items? Average case running time for n items? Steps of algorithm: for i = 1 to n-1 take next key from unsorted part of array insert in appropriate location in sorted part of array: for j = i down to 0, shift sorted elements to the right if key > key[i] increase size of sorted array by 1

Insertion Sort Time Analysis In O-notation, what is: Worst case running time for n items? Average case running time for n items? Steps of algorithm: for i = 1 to n-1 take next key from unsorted part of array insert in appropriate location in sorted part of array: for j = i down to 0, shift sorted elements to the right if key > key[i] increase size of sorted array by 1 Outer loop: O(n)

Insertion Sort Time Analysis In O-notation, what is: Worst case running time for n items? Average case running time for n items? Steps of algorithm: for i = 1 to n-1 take next key from unsorted part of array insert in appropriate location in sorted part of array: for j = i down to 0, shift sorted elements to the right if key > key[i] increase size of sorted array by 1 Outer loop: O(n) Inner loop: O(n)

In the insertion_sort code above, the inner shifting loop does O(n) work per iteration in the worst case; combined with the O(n) outer loop, insertion sort is O(n²).

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5]

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] Swap?

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] Yes!

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] Swap?

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] No.

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] Swap?

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] No.

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] Swap?

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] Yes!

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] Swap?

The Bubble Sort Algorithm The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed. [0] [1] [2] [3] [4] [5] Yes!

The Bubble Sort Algorithm Repeat. [0] [1] [2] [3] [4] [5] Swap? No.

The Bubble Sort Algorithm Repeat. [0] [1] [2] [3] [4] [5] Swap? No.

The Bubble Sort Algorithm Repeat. [0] [1] [2] [3] [4] [5] Swap? Yes.

The Bubble Sort Algorithm Repeat. [0] [1] [2] [3] [4] [5] Swap? Yes.

The Bubble Sort Algorithm Repeat. [0] [1] [2] [3] [4] [5] Swap? Yes.

The Bubble Sort Algorithm Repeat. [0] [1] [2] [3] [4] [5] Swap? Yes.

The Bubble Sort Algorithm Repeat. [0] [1] [2] [3] [4] [5] Swap? No.

The Bubble Sort Algorithm Loop over array n-1 times, swapping pairs of entries as needed. [0] [1] [2] [3] [4] [5] Swap? No.

The Bubble Sort Algorithm Loop over array n-1 times, swapping pairs of entries as needed. [0] [1] [2] [3] [4] [5] Swap? Yes.

The Bubble Sort Algorithm Loop over array n-1 times, swapping pairs of entries as needed. [0] [1] [2] [3] [4] [5] Swap? Yes.

The Bubble Sort Algorithm Loop over array n-1 times, swapping pairs of entries as needed. [0] [1] [2] [3] [4] [5] Swap? Yes.

The Bubble Sort Algorithm Loop over array n-1 times, swapping pairs of entries as needed. [0] [1] [2] [3] [4] [5] Swap? Yes.

The Bubble Sort Algorithm Loop over array n-1 times, swapping pairs of entries as needed. [0] [1] [2] [3] [4] [5] Swap? No.

The Bubble Sort Algorithm Loop over array n-1 times, swapping pairs of entries as needed. [0] [1] [2] [3] [4] [5] Swap? No.

The Bubble Sort Algorithm Continue looping, until done. [0] [1] [2] [3] [4] [5] Swap? Yes.

template <class Item>
void bubble_sort(Item data[ ], size_t n)
{
    size_t i, j;
    Item temp;
    if (n < 2) return;                    // nothing to sort!!
    for (i = 0; i < n-1; ++i)
    {
        for (j = 0; j < n-1; ++j)
            if (data[j] > data[j+1])      // if out of order, swap!
            {
                temp = data[j];
                data[j] = data[j+1];
                data[j+1] = temp;
            }
    }
}

template <class Item>
void bubble_sort(Item data[ ], size_t n)
{
    size_t i, j;
    Item temp;
    bool swapped = true;
    if (n < 2) return;                    // nothing to sort!!
    for (i = 0; swapped and i < n-1; ++i)
    {
        // if no elements swapped in an iteration,
        // then elements are in order: done!
        for (swapped = false, j = 0; j < n-1; ++j)
            if (data[j] > data[j+1])      // if out of order, swap!
            {
                temp = data[j];
                data[j] = data[j+1];
                data[j+1] = temp;
                swapped = true;
            }
    }
}

Bubble Sort Time Analysis In O-notation, what is: Worst case running time for n items? Average case running time for n items? Steps of algorithm: for i = 0 to n-1 for j =0 to n-2 if key[j] > key[j+1] then swap if no elements swapped in this pass through array, done. otherwise, continue

Bubble Sort Time Analysis In O-notation, what is: Worst case running time for n items? Average case running time for n items? Steps of algorithm: for i = 0 to n-1 for j =0 to n-2 if key[j] > key[j+1] then swap if no elements swapped in this pass through array, done. otherwise, continue O(n)

Timing and Other Issues Selection sort, insertion sort, and bubble sort all have a worst-case time of O(n²), making them impractical for large arrays. But they are easy to program and easy to debug. Insertion sort also performs well when the array is nearly sorted to begin with. More sophisticated sorting algorithms are needed when good performance is required in all cases for large arrays. Next time: merge sort, quick sort, and radix sort.

Mergesort and Quicksort

Sorting algorithms Insertion, selection, and bubble sort have quadratic worst-case performance. Can a comparison-based algorithm be faster? Yes: O(n log n), achieved by mergesort and quicksort.

Sorting Algorithms and Average Case Number of Comparisons Simple sorts (straight selection sort, bubble sort, insertion sort): O(N²). More complex sorts (quick sort, merge sort, heap sort): O(N*log N).

Heap Sort Approach First, make the unsorted array into a heap by satisfying the order property. Then repeat the steps below until there are no more unsorted elements: Take the root (maximum) element off the heap by swapping it into its correct place in the array, at the end of the unsorted elements. Reheap the remaining unsorted elements. (This puts the next-largest element into the root position.)

Heap Sort walkthrough (the array diagrams from the original slides are not reproduced here). After creating the original heap, the algorithm repeats three steps: swap the root element into the last place of the unsorted part of the array; that position now holds its final value and need not be considered again; then reheap the remaining unsorted elements. Each round shrinks the unsorted part by one element, and after the final swap all elements are sorted.

template <class ItemType>
void HeapSort(ItemType values[ ], int numValues)
// Post: Sorts array values[0..numValues-1] into ascending order by key
{
    int index;
    // Convert array values[0..numValues-1] into a heap.
    for (index = numValues/2 - 1; index >= 0; index--)
        ReheapDown(values, index, numValues - 1);
    // Sort the array.
    for (index = numValues - 1; index >= 1; index--)
    {
        Swap(values[0], values[index]);
        ReheapDown(values, 0, index - 1);
    }
}
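The HeapSort code above calls helper routines Swap and ReheapDown that these slides do not show. A minimal sketch of what they might look like, assuming the heap is stored in values[0..bottom] with the children of node i at positions 2i+1 and 2i+2:

template <class ItemType>
void Swap(ItemType& a, ItemType& b)
{
    ItemType temp = a;
    a = b;
    b = temp;
}

template <class ItemType>
void ReheapDown(ItemType values[ ], int root, int bottom)
// Restore the heap order property in values[root..bottom],
// assuming both subtrees of root already satisfy it.
{
    int leftChild  = root * 2 + 1;
    int rightChild = root * 2 + 2;
    if (leftChild <= bottom)                    // root is not a leaf
    {
        int maxChild;
        if (leftChild == bottom)                // only one child exists
            maxChild = leftChild;
        else
            maxChild = (values[rightChild] < values[leftChild]) ? leftChild : rightChild;
        if (values[root] < values[maxChild])
        {
            Swap(values[root], values[maxChild]);
            ReheapDown(values, maxChild, bottom);   // keep sifting down
        }
    }
}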

Heap Sort: How many comparisons? In ReheapDown, an element is compared with its two children (and swapped with the larger). Only one element at each level makes this comparison, and a complete binary tree with N nodes has only O(log₂ N) levels, so each reheap costs O(log₂ N); doing this for each of the N elements gives O(N log₂ N) overall.

Merge Sort Apply divide-and-conquer to the sorting problem. Problem: given n elements, sort them into non-decreasing order. Divide-and-conquer: if n = 1, terminate (every one-element list is already sorted); if n > 1, partition the elements into two or more sub-collections, sort each, and combine them into a single sorted list. How do we partition?

Partitioning - Choice 1 First n-1 elements into set A, last element into set B. Sort A using this partitioning scheme recursively; B is already sorted. Combine A and B using method Insert() (= insertion into a sorted array). Leads to a recursive version of InsertionSort(). Number of comparisons: O(n²). Best case = n-1; worst case = n(n-1)/2.

Partitioning - Choice 2 Put the element with the largest key in B and the remaining elements in A. Sort A recursively. To combine sorted A and B, append B to sorted A. Using Max() to find the largest element gives a recursive SelectionSort(); using a bubbling process to find and move the largest element to the right-most position gives a recursive BubbleSort(). All O(n²).

Partitioning - Choice 3 Let's try to achieve balanced partitioning: A gets n/2 elements, B gets the remaining half. Sort A and B recursively. Combine sorted A and B using a process called merge, which combines two sorted lists into one. How? We will see soon.

Example Partition into lists of size n/2: [10, 4, 6, 3, 8, 2, 5, 7] splits into [10, 4, 6, 3] and [8, 2, 5, 7]; these split into [10, 4] [6, 3] [8, 2] [5, 7]; and finally into [4] [10] [3] [6] [2] [8] [5] [7].

Example Cont’d Merge: [4] [10] [3] [6] [2] [8] [5] [7] merge into [4, 10] [3, 6] [2, 8] [5, 7]; then into [3, 4, 6, 10] and [2, 5, 7, 8]; and finally into [2, 3, 4, 5, 6, 7, 8, 10].

Static Method mergeSort()
void mergeSort(Comparable [] a, int left, int right)
{  // sort a[left:right]
   if (left < right)
   {  // at least two elements
      int mid = (left + right) / 2;   // midpoint
      mergeSort(a, left, mid);
      mergeSort(a, mid + 1, right);
      merge(a, b, left, mid, right);  // merge from a into the auxiliary array b
      copy(b, a, left, right);        // copy result back to a
   }
}

Merge Function
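The original slide presented the merge routine as a diagram that is not preserved in this transcript. A minimal C++ sketch of a merge consistent with the call merge(a, b, left, mid, right) above, merging the sorted ranges a[left..mid] and a[mid+1..right] into the auxiliary array b:

template <class Item>
void merge(const Item a[ ], Item b[ ], int left, int mid, int right)
{
    int i = left;        // index into the left run
    int j = mid + 1;     // index into the right run
    int k = left;        // index into the output range of b
    while (i <= mid && j <= right)
    {
        if (a[j] < a[i]) b[k++] = a[j++];
        else             b[k++] = a[i++];   // take from the left run on ties (keeps the merge stable)
    }
    while (i <= mid)   b[k++] = a[i++];     // copy any leftover left-run elements
    while (j <= right) b[k++] = a[j++];     // copy any leftover right-run elements
}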

Evaluation Recurrence equation (assume n is a power of 2, n = 2^k): T(n) = c₁ if n = 1; T(n) = 2T(n/2) + c₂n if n > 1.

Solution by Substitution: T(n) = 2T(n/2) + c₂n and T(n/2) = 2T(n/4) + c₂n/2, so T(n) = 4T(n/4) + 2c₂n = 8T(n/8) + 3c₂n = ... = 2^i T(n/2^i) + i c₂n. Assuming n = 2^k, the expansion halts when we get T(1) on the right side; this happens when i = k: T(n) = 2^k T(1) + k c₂n. Since 2^k = n, we know k = log n; since T(1) = c₁, we get T(n) = c₁n + c₂n log n; thus an upper bound for T_mergeSort(n) is O(n log n).
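The same recurrence and its unrolled solution, restated as displayed math:

\[
T(n) =
\begin{cases}
c_1 & \text{if } n = 1,\\
2\,T(n/2) + c_2 n & \text{if } n > 1,\ n = 2^k,
\end{cases}
\qquad
T(n) = 2^{k}\,T(1) + k\,c_2 n = c_1 n + c_2 n \log_2 n = O(n \log n).
\]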

Quicksort Algorithm Given an array of n elements (e.g., integers): If array only contains one element, return Else pick one element to use as pivot. Partition elements into two sub-arrays: Elements less than or equal to pivot Elements greater than pivot Quicksort two sub-arrays Return results

Example We are given an array of n integers to sort:

Pick Pivot Element There are a number of ways to pick the pivot element. In this example, we will use the first element in the array:

Partitioning Array Given a pivot, partition the elements of the array such that the resulting array consists of: 1. One sub-array that contains elements less than or equal to the pivot 2. Another sub-array that contains elements greater than the pivot The sub-arrays are stored in the original data array. Partitioning loops through the array, swapping elements that are on the wrong side of the pivot.

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index]

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index]

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1.

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1.

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1.

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1.

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1.

pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1.

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1. 5.Swap data[too_small_index] and data[pivot_index] pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1. 5.Swap data[too_small_index] and data[pivot_index] pivot_index = 4 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

Partition Result [0] [1] [2] [3] [4] [5] [6] [7] [8] Left part: <= data[pivot]; right part: > data[pivot]

Recursion: Quicksort Sub-arrays [0] [1] [2] [3] [4] [5] [6] [7] [8] Quicksort is applied recursively to the <= data[pivot] sub-array and the > data[pivot] sub-array, as sketched below.
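Putting the walkthrough together, here is a minimal C++ sketch of the quicksort these slides describe (first element as pivot, partition driven by the too_big_index / too_small_index scans). It is an illustration of the scheme, not production code; the bounds check in step 1 is added so the scan cannot run past the array.

#include <cstddef>
#include <algorithm>   // std::swap

// Partition data[first..last] around the pivot data[first];
// returns the pivot's final position.
template <class Item>
size_t partition(Item data[ ], size_t first, size_t last)
{
    size_t pivot_index     = first;
    size_t too_big_index   = first;    // scans right for an element > pivot
    size_t too_small_index = last;     // scans left for an element <= pivot
    do
    {
        while (too_big_index < last && data[too_big_index] <= data[pivot_index])
            ++too_big_index;                                        // step 1
        while (data[too_small_index] > data[pivot_index])
            --too_small_index;                                      // step 2
        if (too_big_index < too_small_index)                        // step 3
            std::swap(data[too_big_index], data[too_small_index]);
    } while (too_small_index > too_big_index);                      // step 4
    std::swap(data[pivot_index], data[too_small_index]);            // step 5
    return too_small_index;
}

template <class Item>
void quicksort(Item data[ ], size_t first, size_t last)
{
    if (first >= last) return;                    // 0 or 1 element: already sorted
    size_t p = partition(data, first, last);
    if (p > first) quicksort(data, first, p - 1); // sub-array of elements <= pivot
    quicksort(data, p + 1, last);                 // sub-array of elements > pivot
}

// Usage sketch: for an array of n > 0 items, call quicksort(data, 0, n - 1).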

Quicksort Analysis Assume that keys are random, uniformly distributed. What is best case running time?

Quicksort Analysis Assume that keys are random, uniformly distributed. What is best case running time? Recursion: 1. Partition splits array in two sub-arrays of size n/2 2. Quicksort each sub-array

Quicksort Analysis Assume that keys are random, uniformly distributed. What is best case running time? Recursion: 1. Partition splits array in two sub-arrays of size n/2 2. Quicksort each sub-array Depth of recursion tree?

Quicksort Analysis Assume that keys are random, uniformly distributed. What is best case running time? Recursion: 1. Partition splits array in two sub-arrays of size n/2 2. Quicksort each sub-array Depth of recursion tree? O(log 2 n)

Quicksort Analysis Assume that keys are random, uniformly distributed. What is best case running time? Recursion: 1. Partition splits array in two sub-arrays of size n/2 2. Quicksort each sub-array Depth of recursion tree? O(log 2 n) Number of accesses in partition?

Quicksort Analysis Assume that keys are random, uniformly distributed. What is best case running time? Recursion: 1. Partition splits array in two sub-arrays of size n/2 2. Quicksort each sub-array Depth of recursion tree? O(log 2 n) Number of accesses in partition? O(n)

Quicksort Analysis Assume that keys are random, uniformly distributed. Best case running time: O(n log₂ n)

Quicksort Analysis Assume that keys are random, uniformly distributed. Best case running time: O(n log 2 n) Worst case running time?

Quicksort: Worst Case Assume first element is chosen as pivot. Assume we get array that is already in order: pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1. 5.Swap data[too_small_index] and data[pivot_index] pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_indextoo_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1. 5.Swap data[too_small_index] and data[pivot_index] pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_indextoo_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1. 5.Swap data[too_small_index] and data[pivot_index] pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_indextoo_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1. 5.Swap data[too_small_index] and data[pivot_index] pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_indextoo_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1. 5.Swap data[too_small_index] and data[pivot_index] pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_indextoo_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1. 5.Swap data[too_small_index] and data[pivot_index] pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_indextoo_small_index

1.While data[too_big_index] <= data[pivot] ++too_big_index 2.While data[too_small_index] > data[pivot] --too_small_index 3.If too_big_index < too_small_index swap data[too_big_index] and data[too_small_index] 4.While too_small_index > too_big_index, go to 1. 5.Swap data[too_small_index] and data[pivot_index] pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] > data[pivot]<= data[pivot]

Quicksort Analysis Assume that keys are random, uniformly distributed. Best case running time: O(n log 2 n) Worst case running time? Recursion: 1. Partition splits array in two sub-arrays: one sub-array of size 0 the other sub-array of size n-1 2. Quicksort each sub-array Depth of recursion tree?

Quicksort Analysis Assume that keys are random, uniformly distributed. Best case running time: O(n log 2 n) Worst case running time? Recursion: 1. Partition splits array in two sub-arrays: one sub-array of size 0 the other sub-array of size n-1 2. Quicksort each sub-array Depth of recursion tree? O(n)

Quicksort Analysis Assume that keys are random, uniformly distributed. Best case running time: O(n log 2 n) Worst case running time? Recursion: 1. Partition splits array in two sub-arrays: one sub-array of size 0 the other sub-array of size n-1 2. Quicksort each sub-array Depth of recursion tree? O(n) Number of accesses per partition?

Quicksort Analysis Assume that keys are random, uniformly distributed. Best case running time: O(n log 2 n) Worst case running time? Recursion: 1. Partition splits array in two sub-arrays: one sub-array of size 0 the other sub-array of size n-1 2. Quicksort each sub-array Depth of recursion tree? O(n) Number of accesses per partition? O(n)

Quicksort Analysis Assume that keys are random, uniformly distributed. Best case running time: O(n log₂ n) Worst case running time: O(n²)!!!

Quicksort Analysis Assume that keys are random, uniformly distributed. Best case running time: O(n log₂ n) Worst case running time: O(n²)!!! What can we do to avoid the worst case?

Improved Pivot Selection Pick median value of three elements from data array: data[0], data[n/2], and data[n-1]. Use this median value as pivot.
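A minimal sketch of median-of-three selection (the function name median_of_three_index is invented for this illustration; it uses only operator<, per the requirements stated earlier):

template <class Item>
size_t median_of_three_index(const Item data[ ], size_t first, size_t last)
// Return the index of the median of data[first], data[mid], data[last],
// where mid is the middle position of the sub-array.
{
    size_t mid = first + (last - first) / 2;
    const Item& a = data[first];
    const Item& b = data[mid];
    const Item& c = data[last];
    if (a < b)
    {
        if (b < c) return mid;            // a < b < c, so b is the median
        return (a < c) ? last : first;    // median is c or a
    }
    if (c < b) return mid;                // c < b <= a, so b is the median
    return (a < c) ? first : last;        // median is a or c
}

// Usage sketch: swap data[median_of_three_index(data, first, last)] into
// data[first] before partitioning, so the partition code can keep treating
// the first element as the pivot.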

Improving Performance of Quicksort Improved selection of pivot. For sub-arrays of size 3 or less, apply brute force search: Sub-array of size 1: trivial Sub-array of size 2: if(data[first] > data[second]) swap them
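And a sketch of handling the small sub-arrays directly (sizes 1, 2, and 3); again this is only illustrative, and sort_small is an invented helper that a quicksort could call before recursing:

#include <cstddef>
#include <algorithm>   // std::swap

template <class Item>
void sort_small(Item data[ ], size_t first, size_t last)
// Sort a sub-array of at most three elements with at most three comparisons.
{
    if (last <= first) return;                       // size 1: trivial
    if (data[first+1] < data[first]) std::swap(data[first], data[first+1]);
    if (last - first == 1) return;                   // size 2: one compare-and-swap
    if (data[first+2] < data[first+1]) std::swap(data[first+1], data[first+2]);
    if (data[first+1] < data[first])   std::swap(data[first],   data[first+1]);
}

// Usage sketch, inside quicksort:
//   if (last - first + 1 <= 3) { sort_small(data, first, last); return; }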

Radix Sort Limit input to fixed-length numbers or words. Represent symbols in some base b. Each input has exactly d “digits”. Sort the data d times, using one digit position as the key each time. Must sort from the least-significant to the most-significant digit. Must use a “stable” sort, keeping equal-keyed items in the same order.

Radix Sort Example Input data: aba bac caa acb bab cca bba aac (the pile diagrams from the original slides are not reproduced; the passes below follow the stable place-and-join procedure on this input).

Pass 1: look at the rightmost position and place each word, in order, into the a, b, or c pile. Piles: a: aba caa cca bba; b: acb bab; c: bac aac. Join the piles: aba caa cca bba acb bab bac aac.

Pass 2: look at the middle position. Piles: a: caa bab bac aac; b: aba bba; c: cca acb. Join the piles: caa bab bac aac aba bba cca acb.

Pass 3: look at the leftmost position. Piles: a: aac aba acb; b: bab bac bba; c: caa cca. Join the piles: aac aba acb bab bac bba caa cca. The data is now sorted.

Radix Sort Algorithm
rsort(A, d):
  for p := 0 to d-1
    /* Stable sort A, using digit position p as the key. */
    for i := 1 to |A|
      add A[i] to end of list ((A[i] div b^p) mod b)
    A = join lists 0..b-1
Θ(dn) time, where the number of digits d is taken to be a constant.
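A minimal C++ sketch of this LSD radix sort for fixed-width unsigned integers, using base-256 digits and a stable bucket pass per digit (radix_sort and its details are illustrative, not taken from the slides):

#include <vector>
#include <cstdint>

// LSD radix sort for 32-bit unsigned values: four passes, one per base-256
// digit, each pass a stable distribution into 256 buckets followed by a join.
void radix_sort(std::vector<std::uint32_t>& a)
{
    const int base   = 256;
    const int digits = 4;                        // 32 bits / 8 bits per digit
    std::vector<std::vector<std::uint32_t>> buckets(base);
    for (int d = 0; d < digits; ++d)
    {
        for (std::uint32_t x : a)                // stable: preserves order within a bucket
            buckets[(x >> (8 * d)) & 0xFF].push_back(x);
        a.clear();
        for (auto& bucket : buckets)             // join lists 0..base-1
        {
            for (std::uint32_t x : bucket) a.push_back(x);
            bucket.clear();
        }
    }
}

// Usage sketch:
//   std::vector<std::uint32_t> v = {170, 45, 75, 90, 802, 24, 2, 66};
//   radix_sort(v);   // v is now in ascending order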