
1 Chapter 11 Sorting Acknowledgement: These slides are adapted from slides provided with Data Structures and Algorithms in C++, Goodrich, Tamassia, and Mount (Wiley 2004) and slides from Jory Denny and Mukulika Ghosh

2 Divide and Conquer Algorithm analysis with recurrence equations
Divide-and-conquer is a general algorithm design paradigm:
Divide: divide the input data S into k (disjoint) subsets S1, S2, …, Sk
Recur: solve the sub-problems recursively
Conquer: combine the solutions for S1, S2, …, Sk into a solution for S
The base cases of the recursion are sub-problems of constant size. Analysis can be done using recurrence equations (relations).

3 Divide and Conquer Algorithm analysis with recurrence equations
When all sub-problems have the same size (frequently the case), the recurrence equation representing the algorithm is:
T(n) = D(n) + k T(n/c) + C(n)
where D(n) is the cost of dividing S into the k sub-problems S1, S2, …, Sk; there are k sub-problems, each of size n/c, that are solved recursively; and C(n) is the cost of combining the sub-problem solutions to get the solution for S.

4 Exercise Recurrence Equation Setup
Algorithm – transform multiplication of two n-bit integers I and J into multiplications of n/2-bit integers and some additions/shifts. Where does recursion happen in this algorithm? Rewrite the step(s) of the algorithm to show this clearly.
Algorithm multiply(I, J)
  Input: n-bit integers I, J
  Output: I * J
  if n > 1
    Split I and J into high- and low-order halves: Ih, Il, Jh, Jl
    x1 ← Ih * Jh; x2 ← Ih * Jl; x3 ← Il * Jh; x4 ← Il * Jl
    Z ← x1 * 2^n + x2 * 2^(n/2) + x3 * 2^(n/2) + x4
  else
    Z ← I * J
  return Z

5 Exercise Recurrence Equation Setup
Algorithm – transform multiplication of two n-bit integers I and J into multiplications of n/2-bit integers and some additions/shifts. Assuming that additions and shifts of n-bit numbers can be done in O(n) time, describe a recurrence equation for the running time of this multiplication algorithm.
Algorithm multiply(I, J)
  Input: n-bit integers I, J
  Output: I * J
  if n > 1
    Split I and J into high- and low-order halves: Ih, Il, Jh, Jl
    x1 ← multiply(Ih, Jh); x2 ← multiply(Ih, Jl); x3 ← multiply(Il, Jh); x4 ← multiply(Il, Jl)
    Z ← x1 * 2^n + x2 * 2^(n/2) + x3 * 2^(n/2) + x4
  else
    Z ← I * J
  return Z

6 Exercise Recurrence Equation Setup
Algorithm – transform multiplication of two n-bit integers I and J into multiplications of n/2-bit integers and some additions/shifts. The recurrence equation for this algorithm is:
T(n) = 4T(n/2) + O(n)
The solution is O(n^2), which is the same as the naïve algorithm.
Algorithm multiply(I, J)
  Input: n-bit integers I, J
  Output: I * J
  if n > 1
    Split I and J into high- and low-order halves: Ih, Il, Jh, Jl
    x1 ← multiply(Ih, Jh); x2 ← multiply(Ih, Jl); x3 ← multiply(Il, Jh); x4 ← multiply(Il, Jl)
    Z ← x1 * 2^n + x2 * 2^(n/2) + x3 * 2^(n/2) + x4
  else
    Z ← I * J
  return Z
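To see why the solution is O(n^2), expand the recurrence level by level. A sketch, assuming the additions and shifts at each call cost at most cn for some constant c:

T(n) = 4T(n/2) + cn
     = 4^2 T(n/4) + (2 + 1)cn
     = …
     = 4^k T(n/2^k) + (2^k − 1)cn

Stopping at k = log2 n gives T(n) = n^2 T(1) + (n − 1)cn = O(n^2).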

7 Merge Sort
[Merge-sort tree example: 7 2 | 9 4 → 2 4 7 9; second level: 7 | 2 → 2 7 and 9 | 4 → 4 9; leaves: 7 → 7, 2 → 2, 9 → 9, 4 → 4]

8 Merge-Sort Merge-sort is based on the divide-and-conquer paradigm. It consists of three steps:
Divide: partition the input sequence S into two sequences S1 and S2 of about n/2 elements each
Recur: recursively sort S1 and S2
Conquer: merge S1 and S2 into a sorted sequence
Algorithm mergeSort(S, C)
  Input: Sequence S of n elements, Comparator C
  Output: Sequence S sorted according to C
  if S.size() > 1
    (S1, S2) ← partition(S, n/2)
    S1 ← mergeSort(S1, C)
    S2 ← mergeSort(S2, C)
    S ← merge(S1, S2)
  return S
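As a quick illustration of the same divide/recur/conquer structure in C++, here is a minimal array-based sketch; it uses std::inplace_merge as a stand-in for the merge step that a later slide implements on linked lists, so it is not the book's list-based code:

#include <algorithm>
#include <cstddef>
#include <vector>

// Divide: split [lo, hi) at its midpoint; Recur: sort each half;
// Conquer: merge the two sorted halves back together.
void mergeSort(std::vector<int>& s, std::size_t lo, std::size_t hi) {
    if (hi - lo <= 1) return;                 // base case: size 0 or 1
    std::size_t mid = lo + (hi - lo) / 2;     // divide
    mergeSort(s, lo, mid);                    // recur on first half
    mergeSort(s, mid, hi);                    // recur on second half
    std::inplace_merge(s.begin() + lo,        // conquer: merge the halves
                       s.begin() + mid,
                       s.begin() + hi);
}

// Example: mergeSort(v, 0, v.size()) sorts the whole vector v.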

9 Merge-Sort Execution Tree (recursive calls)
An execution of merge-sort is depicted by a binary tree. Each node represents a recursive call of merge-sort and stores: the unsorted sequence before the execution and its partition, and the sorted sequence at the end of the execution. The root is the initial call; the leaves are calls on subsequences of size 0 or 1.
[Tree example: 7 2 | 9 4 → 2 4 7 9; second level: 7 | 2 → 2 7 and 9 | 4 → 4 9; leaves: 7 → 7, 2 → 2, 9 → 9, 4 → 4]

10 Execution Example Partition
[Merge-sort execution-tree diagram: the input 7 2 9 4 3 8 6 1 is partitioned into 7 2 9 4 and 3 8 6 1]

11 Execution Example Recursive call, partition
[Merge-sort execution-tree diagram: recursive call on 7 2 9 4, partitioned into 7 2 and 9 4]

12 Execution Example Recursive call, partition
[Merge-sort execution-tree diagram: recursive call on 7 2, partitioned into 7 and 2]

13 Execution Example Recursive call, base case
[Merge-sort execution-tree diagram: base case 7 → 7]

14 Execution Example Recursive call, base case
[Merge-sort execution-tree diagram: base case 2 → 2]

15 Execution Example Merge
[Merge-sort execution-tree diagram: 7 and 2 are merged into 2 7]

16 Execution Example Recursive call, …, base case, merge
[Merge-sort execution-tree diagram: 9 | 4 is sorted and merged into 4 9]

17 Execution Example Merge
[Merge-sort execution-tree diagram: 2 7 and 4 9 are merged into 2 4 7 9]

18 Execution Example Recursive call, …, merge, merge
[Merge-sort execution-tree diagram, right subtree: 3 | 8 → 3 8 and 6 | 1 → 1 6]

19 Execution Example Merge
[Merge-sort execution-tree diagram: 2 4 7 9 and 1 3 6 8 are merged into the final sorted sequence 1 2 3 4 6 7 8 9]

20 Analysis Using Recurrence Relations
The running time of merge-sort can be expressed by the recurrence equation T(n) = 2T(n/2) + M(n). We need to determine M(n), the time to merge two sorted sequences, each of size n/2.
Algorithm mergeSort(S, C)
  Input: Sequence S of n elements, Comparator C
  Output: Sequence S sorted according to C
  if S.size() > 1
    (S1, S2) ← partition(S, n/2)
    S1 ← mergeSort(S1, C)
    S2 ← mergeSort(S2, C)
    S ← merge(S1, S2)
  return S

21 Merging Two Sorted Sequences
The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B. Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list, takes O(n) time, so M(n) = O(n).
Algorithm merge(A, B)
  Input: Sequences A, B with n/2 elements each
  Output: Sorted sequence of A ∪ B
  S ← ∅
  while ¬A.empty() ∧ ¬B.empty()
    if A.front() < B.front()
      S.insertBack(A.front()); A.eraseFront()
    else
      S.insertBack(B.front()); B.eraseFront()
  while ¬A.empty()
    S.insertBack(A.front()); A.eraseFront()
  while ¬B.empty()
    S.insertBack(B.front()); B.eraseFront()
  return S
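A direct C++ rendering of this merge on std::list<int> (operator< stands in for the comparator); every insertion and removal touches only an end of a list, so the whole merge is O(n):

#include <list>

// Merge two sorted lists a and b into a new sorted list s,
// consuming a and b in the process.
std::list<int> merge(std::list<int>& a, std::list<int>& b) {
    std::list<int> s;
    while (!a.empty() && !b.empty()) {
        if (a.front() < b.front()) {            // smaller front goes first
            s.push_back(a.front()); a.pop_front();
        } else {
            s.push_back(b.front()); b.pop_front();
        }
    }
    while (!a.empty()) { s.push_back(a.front()); a.pop_front(); }  // drain a
    while (!b.empty()) { s.push_back(b.front()); b.pop_front(); }  // drain b
    return s;
}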

22 And the complexity of merge-sort…
Algorithm mergeSort(S, C)
  Input: Sequence S of n elements, Comparator C
  Output: Sequence S sorted according to C
  if S.size() > 1
    (S1, S2) ← partition(S, n/2)
    S1 ← mergeSort(S1, C)
    S2 ← mergeSort(S2, C)
    S ← merge(S1, S2)
  return S
So, the running time of merge-sort can be expressed by the recurrence equation:
T(n) = 2T(n/2) + M(n) = 2T(n/2) + O(n) = O(n log n)
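A sketch of why this recurrence solves to O(n log n), by repeated expansion, assuming M(n) ≤ cn for a constant c and that n is a power of 2:

T(n) = 2T(n/2) + cn
     = 2^2 T(n/4) + 2cn
     = …
     = 2^k T(n/2^k) + k·cn

Stopping at k = log2 n gives T(n) = n·T(1) + cn·log2 n = O(n log n), which matches the level-by-level argument on the next slide.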

23 Another Analysis of Merge-Sort
The height h of the merge-sort tree is O(log n): at each recursive call we divide the sequence in half. The work done at each level is O(n): at level i, we partition and merge 2^i sequences of size n/2^i. Thus, the total running time of merge-sort is O(n log n).

depth | #seqs          | size             | cost for level
0     | 1              | n                | O(n)
1     | 2              | n/2              | O(n)
…     | …              | …                | O(n)
i     | 2^i            | n/2^i            | O(n)
…     | …              | …                | O(n)
log n | 2^(log n) = n  | n/2^(log n) = 1  | O(n)

24 Summary of Sorting Algorithms (so far)

Algorithm      | Time                    | Notes
Selection Sort | O(n^2)                  | Slow, in-place; for small data sets
Insertion Sort | O(n^2) WC, AC; O(n) BC  | Slow, in-place; for small data sets
Heap Sort      | O(n log n)              | Fast, in-place; for large data sets
Merge Sort     | O(n log n)              | Fast, sequential data access; for huge data sets

25 Quick-Sort
[Quick-sort tree example: 7 4 9 6 2 → 2 4 6 7 9; second level: 4 2 → 2 4 and 7 9 → 7 9; leaves: 2 → 2, 9 → 9]

26 Quick-Sort Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
Divide: pick a random element x (called the pivot) and partition S into L (elements less than x), E (elements equal to x), and G (elements greater than x)
Recur: sort L and G
Conquer: join L, E, and G

27 Partition
We partition an input sequence as follows: we remove, in turn, each element y from S, and we insert y into L, E, or G, depending on the result of the comparison with the pivot x.
Algorithm partition(S, p)
  Input: Sequence S, position p of the pivot
  Output: Subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively
  L, E, G ← ∅
  x ← S.erase(p)
  while ¬S.empty()
    y ← S.eraseFront()
    if y < x
      L.insertBack(y)
    else if y = x
      E.insertBack(y)
    else  // y > x
      G.insertBack(y)
  return L, E, G
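A minimal C++ sketch of this three-way, list-based partition; the function name partition3 and the use of std::list<int> and std::tuple are illustrative choices, not the book's interface:

#include <list>
#include <tuple>

// Split s into the elements less than, equal to, and greater than the
// pivot x, consuming s. Every removal/insertion is at an end of a list,
// so the whole partition takes O(n) time.
std::tuple<std::list<int>, std::list<int>, std::list<int>>
partition3(std::list<int>& s, int x) {
    std::list<int> l, e, g;
    while (!s.empty()) {
        int y = s.front();
        s.pop_front();
        if (y < x)       l.push_back(y);
        else if (y == x) e.push_back(y);
        else             g.push_back(y);
    }
    return {l, e, g};
}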

28 Quick-Sort Tree An execution of quick-sort is depicted by a binary tree. Each node represents a recursive call of quick-sort and stores: the unsorted sequence before the execution and its pivot, and the sorted sequence at the end of the execution. The root is the initial call; the leaves are calls on subsequences of size 0 or 1.
[Quick-sort tree example: 4 2 → 2 4 and 7 9 → 7 9; leaves: 2 → 2, 9 → 9]

29 Execution Example Pivot selection
[Quick-sort tree diagram: a pivot is chosen for the input sequence]

30 Execution Example Partition, recursive call, pivot selection
[Quick-sort tree diagram for this step]

31 Execution Example Partition, recursive call, base case
[Quick-sort tree diagram for this step]

32 Execution Example Recursive call, …, base case, join
[Quick-sort tree diagram: the subsequence 2 4 3 1 is sorted and joined into 1 2 3 4]

33 Execution Example Recursive call, pivot selection
[Quick-sort tree diagram for this step]

34 Execution Example Partition, …, recursive call, base case
[Quick-sort tree diagram for this step]

35 Execution Example Join, join
[Quick-sort tree diagram: the sorted subsequences are joined into the final sorted sequence]

36 In-Place Quick-Sort
Quick-sort can be implemented to run in-place. In the partition step, we use replace operations to rearrange the elements of the input sequence such that: the elements less than the pivot have indices less than h; the elements equal to the pivot have indices between h and k; the elements greater than the pivot have indices greater than k. The recursive calls consider: elements with indices less than h, and elements with indices greater than k.
Algorithm inPlaceQuickSort(S, l, r)
  Input: Array S, indices l, r
  Output: Array S with the elements between l and r sorted
  if l ≥ r
    return S
  i ← rand() % (r − l + 1) + l    // random integer between l and r
  x ← S[i]
  (h, k) ← inPlacePartition(x)
  inPlaceQuickSort(S, l, h − 1)
  inPlaceQuickSort(S, k + 1, r)

37 In-Place Partitioning
Perform the partition using two indices to split S into L and E ∪ G (a similar method can split E ∪ G into E and G). Repeat until j and k cross:
Scan j to the right until finding an element ≥ x.
Scan k to the left until finding an element < x.
Swap the elements at indices j and k.
[Array diagram with scan indices j and k, pivot x = 6]
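A hedged C++ sketch of an in-place quick-sort built on this two-index scan. It performs only the simpler two-way split (< pivot versus ≥ pivot) rather than the full L/E/G split, and it parks the pivot at the right end first to simplify the index bookkeeping; these are implementation choices, not details taken from the slides:

#include <cstdlib>
#include <utility>
#include <vector>

// Two-index partition: afterwards s[l..j-1] < pivot, s[j] == pivot,
// and s[j+1..r] >= pivot; returns j, the pivot's final index.
int inPlacePartition(std::vector<int>& s, int l, int r, int p) {
    std::swap(s[p], s[r]);                     // park the pivot at the end
    int x = s[r];
    int j = l, k = r - 1;
    while (j <= k) {
        while (j <= k && s[j] < x) ++j;        // scan j right to an element >= x
        while (j <= k && s[k] >= x) --k;       // scan k left to an element < x
        if (j < k) std::swap(s[j], s[k]);      // put both on the correct side
    }
    std::swap(s[j], s[r]);                     // move the pivot into place
    return j;
}

void inPlaceQuickSort(std::vector<int>& s, int l, int r) {
    if (l >= r) return;                        // 0 or 1 element: already sorted
    int i = std::rand() % (r - l + 1) + l;     // random pivot index in [l, r]
    int j = inPlacePartition(s, l, r, i);
    inPlaceQuickSort(s, l, j - 1);             // elements less than the pivot
    inPlaceQuickSort(s, j + 1, r);             // elements >= the pivot (pivot excluded)
}

Excluding the pivot position j from both recursive calls guarantees progress even when many elements are equal to the pivot.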

38 Analysis of Quick Sort Using Recurrence Relations
Algorithm quickSort(S, l, r)
  Input: Sequence S, indices l, r
  Output: Sequence S with the elements between l and r sorted
  if l ≥ r
    return S
  i ← rand() % (r − l + 1) + l    // random integer between l and r
  x ← S.at(i)
  (h, k) ← partition(x)
  quickSort(S, l, h − 1)
  quickSort(S, k + 1, r)
Assumption: a random pivot is expected to give equal-sized sub-lists. The running time of quick-sort can then be expressed as:
T(n) = 2T(n/2) + P(n)
where P(n) is the time to run partition on an input of size n.

39 Analysis of Partition
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time. Thus, the partition step of quick-sort takes O(n) time.
Algorithm partition(S, p)
  Input: Sequence S, position p of the pivot
  Output: Subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively
  L, E, G ← ∅
  x ← S.erase(p)
  while ¬S.empty()
    y ← S.eraseFront()
    if y < x
      L.insertBack(y)
    else if y = x
      E.insertBack(y)
    else  // y > x
      G.insertBack(y)
  return L, E, G

40 So, the expected complexity of Quick Sort
Assumption: a random pivot is expected to give equal-sized sub-lists. The running time of quick-sort can then be expressed as:
T(n) = 2T(n/2) + P(n) = 2T(n/2) + O(n) = O(n log n)
Algorithm quickSort(S, l, r)
  Input: Sequence S, indices l, r
  Output: Sequence S with the elements between l and r sorted
  if l ≥ r
    return S
  i ← rand() % (r − l + 1) + l    // random integer between l and r
  x ← S.at(i)
  (h, k) ← partition(x)
  quickSort(S, l, h − 1)
  quickSort(S, k + 1, r)

41 Worst-case Running Time
The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element. One of L and G has size n − 1 and the other has size 0. The running time is proportional to
n + (n − 1) + … + 2 + 1 = O(n^2)
Alternatively, using recurrence equations: T(n) = T(n − 1) + O(n) = O(n^2).

depth | time
0     | n
1     | n − 1
…     | …

42 Expected Running Time removing the equal-split assumption
Consider a recursive call of quick-sort on a sequence of size s.
Good call: the sizes of L and G are each less than 3s/4.
Bad call: one of L and G has size greater than 3s/4.
A call is good with probability 1/2: 1/2 of the possible pivots cause good calls (the middle half of the ranks are good pivots; the lowest and highest quarters are bad pivots).

43 Expected Running Time Probabilistic fact: the expected number of coin tosses required to get k heads is 2k (e.g., it is expected to take 2 tosses to get one head). For a node of depth i, we expect that i/2 of its ancestors are good calls, so the size of the input sequence for the current call is at most (3/4)^(i/2)·n. Therefore: for a node of depth 2·log_{4/3} n, the expected input size is one, so the expected height of the quick-sort tree is O(log n). The amount of work done at the nodes of the same depth is O(n). Thus, the expected running time of quick-sort is O(n log n).

44 Summary of Sorting Algorithms (so far)

Algorithm      | Time                                       | Notes
Selection Sort | O(n^2)                                     | Slow, in-place; for small data sets
Insertion Sort | O(n^2) WC, AC; O(n) BC                     | Slow, in-place; for small data sets
Heap Sort      | O(n log n)                                 | Fast, in-place; for large data sets
Quick Sort     | Expected O(n log n) AC, BC; O(n^2) WC      | Fastest, randomized, in-place; for large data sets
Merge Sort     | O(n log n)                                 | Fast, sequential data access; for huge data sets

45 Sorting Lower Bound

46 Comparison-Based Sorting
Many sorting algorithms are comparison based: they sort by making comparisons between pairs of objects (each comparison "Is xi < xj?" has a yes/no answer). Examples: bubble-sort, selection-sort, insertion-sort, heap-sort, merge-sort, quick-sort, … Let us therefore derive a lower bound on the running time of any algorithm that uses comparisons to sort n elements x1, x2, …, xn.

47 Counting Comparisons Let us just count comparisons then.
Each possible run of the algorithm corresponds to a root-to-leaf path in a decision tree

48 Decision Tree Height The height of the decision tree is a lower bound on the running time. Every input permutation must lead to a separate leaf output; if not, some input …4…5… would have the same output ordering as …5…4…, which would be wrong. Since there are n! = 1·2·…·n leaves, the height is at least log(n!).

49 The Lower Bound Any comparison-based sorting algorithm takes at least log(n!) time. Since log(n!) ≥ log((n/2)^(n/2)) = (n/2)·log(n/2), any comparison-based sorting algorithm must run in Ω(n log n) time.
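A sketch of the inequality used above: at least n/2 of the factors in n! = n·(n − 1)·…·2·1 are each ≥ n/2, so n! ≥ (n/2)^(n/2) and hence log(n!) ≥ (n/2)·log(n/2). As a concrete illustration, for n = 1000 this bound says any comparison sort needs at least 500·log2(500) ≈ 4483 comparisons in the worst case.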

50 Bucket-Sort and Radix-Sort
Can we sort in linear time?
[Bucket array example: items (1,c), (3,a), (3,b), (7,d), (7,g), (7,e) placed in buckets 1, 3, and 7 of a bucket array B; all other buckets empty (∅)]

51 Bucket-Sort Let S be a sequence of n (key, element) items with keys in the range [0, N − 1]. Bucket-sort uses the keys as indices into an auxiliary array B of sequences (buckets). Phase 1: empty sequence S by moving each entry into its bucket B[k]. Phase 2: for i ← 0 … N − 1, move the items of bucket B[i] to the end of sequence S. Analysis: Phase 1 takes O(n) time; Phase 2 takes O(n + N) time; bucket-sort takes O(n + N) time.
Algorithm bucketSort(S, N)
  Input: Sequence S of entries with integer keys in the range [0, N − 1]
  Output: Sequence S sorted in non-decreasing order of the keys
  B ← array of N empty sequences
  for each entry e ∈ S do
    k ← e.key()
    remove e from S and insert it at the end of bucket B[k]
  for i ← 0 … N − 1 do
    for each entry e ∈ B[i] do
      remove e from bucket B[i] and insert it at the end of S
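A short C++ sketch of bucketSort for (key, element) entries with integer keys in [0, N − 1]; std::pair<int, char> stands in for the book's entries, and moving list nodes with splice keeps both phases at O(1) per item, for O(n + N) overall:

#include <list>
#include <utility>
#include <vector>

// Stable bucket-sort: Phase 1 moves each entry of s into bucket b[key];
// Phase 2 concatenates the buckets back into s in key order.
void bucketSort(std::list<std::pair<int, char>>& s, int N) {
    std::vector<std::list<std::pair<int, char>>> b(N);   // N empty buckets
    while (!s.empty()) {                                  // Phase 1: O(n)
        int k = s.front().first;                          // the entry's key
        b[k].splice(b[k].end(), s, s.begin());            // move it into B[k]
    }
    for (int i = 0; i < N; ++i)                           // Phase 2: O(n + N)
        s.splice(s.end(), b[i]);                          // move bucket back into S
}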

52 Properties and Extensions
Key-type: the keys are used as indices into an array and cannot be arbitrary objects; there is no external comparator.
Stable sorting: the relative order of any two items with the same key is preserved after the execution of the algorithm.
Extensions: Integer keys in the range [a, b] – put entry e into bucket B[k − a]. String keys from a set D of possible strings, where D has constant size (e.g., names of the 50 U.S. states) – sort D, compute the index i(k) of each string k of D in the sorted sequence, and put item e into bucket B[i(k)].

53 Example
Key range [37, 46], mapped to buckets [0, 9].
Input: (45,d) (37,c) (40,a) (45,g) (40,b) (46,e)
Phase 1 (buckets): B[0] = (37,c); B[3] = (40,a), (40,b); B[8] = (45,d), (45,g); B[9] = (46,e); all other buckets empty (∅)
Phase 2 (output): (37,c) (40,a) (40,b) (45,d) (45,g) (46,e)
Is it in-place? Is it stable? Can we use a hash table instead of a bucket array?

54 Lexicographic Order
Given a list of tuples: (7,4,6) (5,1,5) (2,4,6) (2,1,4) (5,1,6) (3,2,4)
After sorting, the list is in lexicographic (non-decreasing) order: (2,1,4) (2,4,6) (3,2,4) (5,1,5) (5,1,6) (7,4,6)

55 Lexicographic Order Formalized
A d-tuple is a sequence of d keys (k1, k2, …, kd), where key ki is said to be the i-th dimension of the tuple. Example: the Cartesian coordinates of a point in space form a 3-tuple (x, y, z). The lexicographic order of two d-tuples is recursively defined as follows:
(x1, x2, …, xd) < (y1, y2, …, yd) ⟺ x1 < y1 ∨ (x1 = y1 ∧ (x2, …, xd) < (y2, …, yd))
i.e., the tuples are compared by the first dimension, then by the second dimension, etc.

56 Exercise Lexicographic Order
Given a list of 2-tuples, we can order the tuples lexicographically by applying a stable sorting algorithm two times: (3,3) (1,5) (2,5) (1,2) (2,3) (1,7) (3,2) (2,2)
Possible ways of doing it: (1) sort first by the 1st element of the tuple and then by the 2nd element; (2) sort first by the 2nd element of the tuple and then by the 1st element.
Show the result of sorting the list (in non-decreasing order) using both options. Why must the sort be stable?

57 Exercise Lexicographic Order
(3,3) (1,5) (2,5) (1,2) (2,3) (1,7) (3,2) (2,2)
Using a stable sort:
Option 1 (sort first by 1st element, then by 2nd element): 1st sort: (1,5) (1,2) (1,7) (2,5) (2,3) (2,2) (3,3) (3,2); 2nd sort: (1,2) (2,2) (3,2) (2,3) (3,3) (1,5) (2,5) (1,7) – WRONG
Option 2 (sort first by 2nd element, then by 1st element): 1st sort: (1,2) (3,2) (2,2) (3,3) (2,3) (1,5) (2,5) (1,7); 2nd sort: (1,2) (1,5) (1,7) (2,2) (2,3) (2,5) (3,2) (3,3) – CORRECT
Why must the sort be stable?

58 Lexicographic-Sort Let Ci be the comparator that compares two tuples by their i-th dimension. Let stableSort(S, C) be a stable sorting algorithm that uses comparator C. Lexicographic-sort sorts a sequence of d-tuples in lexicographic order by executing algorithm stableSort d times, once per dimension, from the last dimension to the first. Lexicographic-sort runs in O(d·T(n)) time, where T(n) is the running time of stableSort.
Algorithm lexicographicSort(S)
  Input: Sequence S of d-tuples
  Output: Sequence S sorted in lexicographic order
  for i ← d … 1 do
    stableSort(S, Ci)
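A small C++ sketch of lexicographicSort, using std::stable_sort as the stable sorting algorithm and representing each d-tuple as a std::vector<int> (an illustrative choice):

#include <algorithm>
#include <vector>

// Sort d-tuples lexicographically: one stable sort per dimension,
// from the last dimension down to the first.
void lexicographicSort(std::vector<std::vector<int>>& s, int d) {
    for (int i = d - 1; i >= 0; --i) {            // dimensions d..1, 0-indexed
        std::stable_sort(s.begin(), s.end(),
            [i](const std::vector<int>& a, const std::vector<int>& b) {
                return a[i] < b[i];               // comparator C_i
            });
    }
}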

59 Radix-Sort Radix-sort is a specialization of lexicographic-sort that uses bucket-sort as the stable sorting algorithm in each dimension. Radix-sort is applicable to tuples where the keys in each dimension i are integers in the range [0, N − 1]. Radix-sort runs in O(d(n + N)) time.
Algorithm radixSort(S, N)
  Input: Sequence S of d-tuples such that (0, …, 0) ≤ (x1, …, xd) ≤ (N − 1, …, N − 1) for each tuple (x1, …, xd) in S
  Output: Sequence S sorted in lexicographic order
  for i ← d … 1 do
    set the key k of each entry (k, (x1, …, xd)) of S to the i-th dimension xi
    bucketSort(S, N)
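A hedged C++ sketch of radix-sort on non-negative integers, treating the base-N digits of each key as its dimensions and doing one stable bucket pass per digit, least-significant first (the classic LSD formulation of the scheme above):

#include <cstdint>
#include <vector>

// LSD radix-sort: d stable bucket passes over n keys, O(d(n + N)) time.
void radixSort(std::vector<std::uint32_t>& s, int d, std::uint32_t N) {
    std::vector<std::vector<std::uint32_t>> buckets(N);
    std::uint32_t div = 1;                        // N^i selects digit i
    for (int i = 0; i < d; ++i) {
        for (std::uint32_t x : s)
            buckets[(x / div) % N].push_back(x);  // distribute by digit i, stably
        s.clear();
        for (auto& b : buckets) {                 // collect in bucket order
            s.insert(s.end(), b.begin(), b.end());
            b.clear();
        }
        div *= N;
    }
}

// For the 4-bit example on the next slide: radixSort(v, 4, 2).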

60 Example Radix-Sort for Binary Numbers
Sorting a sequence of 4-bit integers: d = 4, N = 2, so O(d(n + N)) = O(4(n + 2)) = O(n).
Input:                                       1001 0010 1101 0001 1110
Sort by dimension 4 (least-significant bit): 0010 1110 1001 1101 0001
Sort by dimension 3:                         1001 1101 0001 0010 1110
Sort by dimension 2:                         1001 0001 0010 1101 1110
Sort by dimension 1 (most-significant bit):  0001 0010 1001 1101 1110
What's the complexity? What's the constant in O(d(n + N))?

61 Summary of Sorting Algorithms

Algorithm      | Time                                                | Notes
Selection Sort | O(n^2)                                              | Slow, in-place; for small data sets
Insertion Sort | O(n^2) WC, AC; O(n) BC                              | Slow, in-place; for small data sets
Heap Sort      | O(n log n)                                          | Fast, in-place; for large data sets
Quick Sort     | Expected O(n log n) AC, BC; O(n^2) WC               | Fastest, randomized, in-place; for large data sets
Merge Sort     | O(n log n)                                          | Fast, sequential data access; for huge data sets
Radix Sort     | O(d(n + N)), d = #digits, N = range of digit values | Fastest, stable, only for integers

