Chapter 7 Sorting Part II

7.3 QUICK SORT

Example: partition with the leftmost record as the pivot. Pointer i scans from the left until it finds a record larger than the pivot (e.g., 5 > pivot, so it should go to the other side); pointer j scans from the right until it finds a record smaller than the pivot (e.g., 2 < pivot, so it should go to the other side). While i < j, interchange a[i] and a[j] and continue scanning. When the scans meet or cross (i ≥ j), stop: a[j] is the rightmost record ≤ pivot, so interchange a[j] and the pivot. The pivot is now in its final sorted position.

Algorithm

void QuickSort(int a[], int left, int right)
{
  if (left < right) {
    int pivot = a[left];
    int i = left, j = right + 1;
    while (i < j) {
      for (i++; i < j && a[i] < pivot; i++) ;  // scan right for a[i] >= pivot
      for (j--; a[j] > pivot; j--) ;           // scan left for a[j] <= pivot
      if (i < j)
        swap(a[i], a[j]);                      // interchange a[i] and a[j]
    }
    swap(a[left], a[j]);                       // interchange a[j] and the pivot
    QuickSort(a, left, j - 1);
    QuickSort(a, j + 1, right);
  }
}

Analysis of QuickSort() Worst case: ◦ Consider an already sorted list.  The smallest record is always chosen as the pivot.  In the 1st iteration, n-1 elements are examined; in the 2nd iteration, n-2 elements are examined, and so on.  In total, the execution steps are (n-1) + (n-2) + … + 1 = O(n^2).  The worst-case time complexity is therefore O(n^2).

Lemma 7.1 Let T_avg(n) be the expected time for function QuickSort() to sort a list with n records. Then there exists a constant k such that T_avg(n) ≤ k n log_e n for n ≥ 2. ◦ In other words, T_avg(n) = O(n log n).
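A standard sketch of the argument behind the lemma (c is a constant covering the partitioning cost; each pivot position j is equally likely):

```latex
T_{avg}(n) \;\le\; cn + \frac{1}{n}\sum_{j=1}^{n}\bigl(T_{avg}(j-1) + T_{avg}(n-j)\bigr)
           \;=\; cn + \frac{2}{n}\sum_{j=0}^{n-1} T_{avg}(j)
```

Induction on this recurrence then yields the claimed bound T_avg(n) ≤ k n log_e n.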

Variations The position of the pivot determines the time complexity of QuickSort(). The best choice for the pivot is the median. ◦ Variations:  Median-of-three: select the median among three records: the leftmost, the rightmost, and the middle one.  Random: select the pivot randomly.
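The median-of-three variation can be sketched as follows; MedianOfThree is a hypothetical helper name (not from the text) that moves the median to a[left], so the partitioning code can keep using a[left] as the pivot.

```cpp
#include <utility>

// Sketch of the median-of-three variation (MedianOfThree is a
// hypothetical helper, not from the text): sort the leftmost, middle,
// and rightmost records, then move the median to a[left] so the
// partitioning code above can proceed unchanged.
void MedianOfThree(int a[], int left, int right)
{
    int mid = (left + right) / 2;
    if (a[mid]   < a[left]) std::swap(a[mid],   a[left]);
    if (a[right] < a[left]) std::swap(a[right], a[left]);
    if (a[right] < a[mid])  std::swap(a[right], a[mid]);
    // now a[left] <= a[mid] <= a[right]; put the median in front
    std::swap(a[left], a[mid]);
}
```

With a sorted input, this choice turns the O(n^2) worst case above into balanced partitions.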

7.4 HOW FAST CAN WE SORT? (DECISION TREE)

Consideration What is the best computing time for sorting that we can hope for? ◦ Suppose the only operations permitted on keys are comparisons and exchanges.  In this section, we shall prove that O(n log n) is the best possible time,  using a decision tree.

Example 7.4 Decision tree for Insertion Sort working on [K1, K2, K3]: each internal node compares two keys (K1 ≤ K2?, K2 ≤ K3?, K1 ≤ K3?), the left branch (Y) is taken when the comparison holds and the right branch (N) when it does not. Each of the six leaves (I-VI) stops with one of the resulting permutations: [1, 2, 3], [1, 3, 2], [3, 1, 2], [2, 1, 3], [2, 3, 1], [3, 2, 1].

Observations The leaf nodes denote the states at which the sort terminates. The number of permutations is 3! = 6. ◦ In general, there are n! possible permutations of n records to sort.  A path from the root to some leaf node represents one of the n! possibilities. The maximum depth of the tree is 3. ◦ The depth represents the number of comparisons.

Theorem 7.1 Any decision tree that sorts n distinct elements has a height of at least ⌈log_2(n!)⌉ + 1. ◦ When sorting n elements, there are n! different possible results.  Every decision tree for the sorting must have at least n! leaves. ◦ A decision tree is a binary tree; therefore it has  at most 2^(k-1) leaves if its height is k.

Corollary Any algorithm that sorts only by comparisons must have a worst-case computing time of Ω(n log n). Proof ◦ By the theorem, there is a path of length log_2(n!), and log_2(n!) = Ω(n log n).
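The step from log_2(n!) to the Ω(n log n) bound can be made explicit by keeping only the larger half of the factors:

```latex
\log_2 n! \;=\; \sum_{i=2}^{n}\log_2 i
          \;\ge\; \sum_{i=\lceil n/2\rceil}^{n}\log_2\frac{n}{2}
          \;\ge\; \frac{n}{2}\,\log_2\frac{n}{2}
          \;=\; \Omega(n\log n)
```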

7.5 MERGE SORT

Merging Consider how to merge two ordered lists: the sorted sublists initList[l..m] and initList[m+1..n] are merged into mergeList[l..n].

Example: two cursors i1 and i2 start at l and m+1, and iResult starts at l. At each step the smaller of initList[i1] and initList[i2] is copied to mergeList[iResult], and the cursor that supplied it advances. When one sublist is exhausted, the rest of the other is copied to mergeList.

void Merge(int *initList, int *mergeList, int l, int m, int n)
{
  int i1 = l, iResult = l, i2 = m + 1;
  while (i1 <= m && i2 <= n) {          // copy the smaller record to mergeList
    if (initList[i1] <= initList[i2])
      mergeList[iResult++] = initList[i1++];
    else
      mergeList[iResult++] = initList[i2++];
  }
  for (; i1 <= m; i1++)                 // copy the rest of the first sublist
    mergeList[iResult++] = initList[i1];
  for (; i2 <= n; i2++)                 // copy the rest of the second sublist
    mergeList[iResult++] = initList[i2];
}

Analysis of Merge() Time complexity: ◦ The while-loop and the two for-loops examine each element in initList exactly once. ◦ The time complexity is O(n - l + 1). Space complexity: ◦ The additional array mergeList is required to store the merged result. ◦ The space complexity is O(n - l + 1).

7.5.2 Iterative Merge Sort L: the maximum number of records in a block. Successive passes merge blocks of length L = 1, 2, 4, 8, 16, …

Merging blocks of length L (here L = 2): the adjacent blocks [i..i+L-1] and [i+L..i+2L-1] are merged into one block of length 2L.

Adjacent pairs of blocks of size L are merged from initList to resultList. n is the number of records in initList.

void MergePass(int *initList, int *resultList, int n, int L)
{
  int i;
  for (i = 0; i <= n - 2*L; i += 2*L)   // merge adjacent pairs of full blocks
    Merge(initList, resultList, i, i + L - 1, i + 2*L - 1);
  if (i + L - 1 < n - 1)                // merge remaining blocks of L < size < 2L
    Merge(initList, resultList, i, i + L - 1, n - 1);
  else                                  // fewer than L records remain: copy them
    for (; i < n; i++)
      resultList[i] = initList[i];
}

The three cases of a pass: 1. i ≤ n-2L: merge adjacent pairs of full blocks. 2. i+L-1 < n-1: merge the final full block with the shorter block, from i to n-1. 3. i+L-1 ≥ n-1: copy the remaining records unchanged.

Merge Sort L denotes the length of the blocks currently being merged.

void MergeSort(int *a, int n)
{
  int *tempList = new int[n];
  for (int L = 1; L < n; L *= 2) {
    MergePass(a, tempList, n, L);   // the result is put into tempList
    L *= 2;                         // do the next pass directly,
    MergePass(tempList, a, n, L);   // merging records from tempList back to a
  }
  delete [] tempList;
}

Analysis of MergeSort() Suppose there are n records. Space complexity: O(n). Time complexity: ◦ MergePass():  O(n). ◦ MergeSort():  A total of ⌈log_2 n⌉ passes are made over the data, since L doubles each pass.  Therefore, the time complexity is O(n log n).

7.5.3 Recursive Merge Sort We divide the list into two roughly equal parts (left and right), sort them recursively, and merge the two sorted halves.

void Merge(int *initList, int s, int m, int e)
{
  int *temp = new int[e - s + 1];
  int i1 = s, iResult = 0, i2 = m + 1;
  while (i1 <= m && i2 <= e) {
    if (initList[i1] <= initList[i2])
      temp[iResult++] = initList[i1++];
    else
      temp[iResult++] = initList[i2++];
  }
  for (; i1 <= m; i1++)               // copy the rest of the left half
    temp[iResult++] = initList[i1];
  for (; i2 <= e; i2++)               // copy the rest of the right half
    temp[iResult++] = initList[i2];
  for (int i = 0; i < iResult; i++)   // copy the merged result back
    initList[s + i] = temp[i];
  delete [] temp;
}

MergeSort() start and end respectively denote the left end and right end of the range of a to be sorted.

void MergeSort(int *a, int start, int end)
{
  if (end <= start) return;
  int middle = (start + end) / 2;
  MergeSort(a, start, middle);
  MergeSort(a, middle + 1, end);
  Merge(a, start, middle, end);
}

Analysis of Recursive Merge Sort Suppose there are n records to be sorted. Time complexity: O(n log n).
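The bound follows from unrolling the divide-and-conquer recurrence, where c is a constant covering the merge cost:

```latex
T(n) \;=\; 2\,T(n/2) + cn \;=\; 4\,T(n/4) + 2cn \;=\;\cdots\;=\; 2^{k}\,T(1) + k\,cn \;=\; O(n\log n), \qquad k = \log_2 n
```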

Variation Natural Merge Sort ◦ Make an initial pass over the data  to determine the sublists (runs) of records that are already in order, and merge those runs instead of starting from blocks of length 1.
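The initial pass can be sketched as follows; RunBoundaries is an illustrative name, not a routine from the text.

```cpp
#include <vector>

// Illustrative sketch (not from the text) of natural merge sort's
// initial pass: return the index of the last record of each run that
// is already in order, so later passes can merge whole runs.
std::vector<int> RunBoundaries(const int *a, int n)
{
    std::vector<int> ends;
    for (int i = 0; i < n; i++)
        if (i == n - 1 || a[i] > a[i + 1])  // a run ends where order breaks
            ends.push_back(i);
    return ends;
}
```

On nearly sorted data this yields few long runs, so fewer merge passes are needed.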

7.6 HEAP SORT

Discussion Merge Sort ◦ In the worst case and average case, the time complexity is O(n log n). ◦ However, additional storage is required.  There is an O(1)-space merge algorithm, but it is much slower than the original one. Heap Sort ◦ Only a fixed amount of additional storage is required. ◦ The time complexity is also O(n log n). ◦ It uses a max heap.

Selection Sort Suppose there are n records in the list. How to sort the list? ◦ First, find the largest and put it at position n-1. ◦ Second, find the largest among positions 0 to n-2 (the second largest), and put it at position n-2. ◦ … ◦ In the i-th iteration, the i-th largest is selected and put at position n-i. Consider how to sort [3, 4, 1, 5, 2].

[3, 4, 1, 5, 2] select 5, swap it to the end: [3, 4, 1, 2, 5]. Select 4: [3, 2, 1, 4, 5]. Select 3: [1, 2, 3, 4, 5]. Select 2: already in place. Done: [1, 2, 3, 4, 5].
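The scheme above can be sketched directly in code; SelectionSort is an illustrative name, not a routine from the text.

```cpp
#include <utility>

// Direct sketch of selection sort as described above: in each pass,
// find the largest of a[0..last] and swap it into position last.
void SelectionSort(int a[], int n)
{
    for (int last = n - 1; last > 0; last--) {
        int maxPos = 0;
        for (int k = 1; k <= last; k++)
            if (a[k] > a[maxPos])
                maxPos = k;            // index of the largest so far
        std::swap(a[maxPos], a[last]); // largest remaining goes to the end
    }
}
```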

Analysis of Selection In the i-th iteration, O(n-i+1) computing time is required to select the i-th largest. O(n) + O(n-1) + … + O(1) = O(n^2). ◦ How do we decrease the time complexity?  Improve the way the maximum is selected,  using a max heap.  Deleting the maximum from a max heap of n elements costs O(log n).  Note: when using heap sort, the data is stored in a[1..n].
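As a sketch of this improvement (using the standard library rather than the text's heap routines): std::priority_queue is a max heap, so each "select the maximum" step costs O(log n), and selecting all n maxima takes O(n log n) in total.

```cpp
#include <queue>
#include <vector>

// Illustrative sketch, not the text's HeapSort(): build a max heap,
// then repeatedly delete the maximum. Each deletion costs O(log n),
// so producing all n maxima sorts the list in O(n log n).
std::vector<int> HeapSelect(const std::vector<int>& in)
{
    std::priority_queue<int> heap(in.begin(), in.end());  // heapify
    std::vector<int> out;                                 // largest first
    while (!heap.empty()) {
        out.push_back(heap.top());
        heap.pop();
    }
    return out;
}
```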

Example 26 [1] 5 [2] 77 [3] 1 [4] 61 [5] 11 [6] 59 [7] 15 [8] 48 [9] 19 [10] 77 [1] 61 [2] 59 [3] 48 [4] 19 [5] 11 [6] 26 [7] 15 [8] 1 [9] 5 [10] Initial Array Initial Heap

After deleting 77 (heap size = 9, sorted = [77]): 61 48 59 15 19 11 26 5 1. After deleting 61 (heap size = 8, sorted = [61, 77]): 59 48 26 15 19 11 1 5.

After deleting 59 (heap size = 7, sorted = [59, 61, 77]): 48 19 26 15 5 11 1. After deleting 48 (heap size = 6, sorted = [48, 59, 61, 77]): 26 19 11 15 5 1.

Adjust() Adjust the binary tree rooted at root to satisfy the max-heap property; a[1..n] holds the records, and both subtrees of root are already max heaps.

void Adjust(int *a, int root, int n)
{
  int e = a[root];
  int j;
  for (j = 2*root; j <= n; j *= 2) {
    if (j < n && a[j] < a[j+1])   // make j the larger child of its parent
      j++;
    if (e >= a[j]) break;         // e can be placed as the parent of j
    a[j/2] = a[j];                // move the larger child up
  }
  a[j/2] = e;
}

Heap Sort

void HeapSort(int *a, int n)
{
  for (int i = n/2; i >= 1; i--)    // heapify: build the initial max heap
    Adjust(a, i, n);
  for (int i = n-1; i >= 1; i--) {  // sort: move the maximum to the end
    swap(a[1], a[i+1]);
    Adjust(a, 1, i);
  }
}

Analysis of HeapSort() Space complexity: O(1). Time complexity: ◦ Suppose the tree has k levels.  The number of nodes on level i is ≤ 2^(i-1). ◦ In the first loop, Adjust() is called once for each node that has a child.  Time complexity: a node on level i moves down at most k-i levels, so the first loop costs Σ_{i=1..k} 2^(i-1)(k-i) ≤ 2^(k-1) Σ_{d≥1} d/2^d ≤ 2^k = O(n).

Analysis of HeapSort() ◦ In the second loop, Adjust() is called n-1 times with maximum depth k = ⌈log_2(n+1)⌉, and swap is invoked n-1 times. ◦ Consequently, the computing time for this loop is O(n log n). Overall, the time complexity of HeapSort() is O(n log n).