
Slide 1: CS 332: Algorithms
Linear-Time Sorting: Review + Bucket Sort
Medians and Order Statistics
David Luebke

Slide 2: Review: Linear-Time Sorting
- Comparison sorts: O(n lg n) at best
  - Model the sort with a decision tree
  - A path down the tree = an execution trace of the algorithm
  - Leaves of the tree = possible permutations of the input
  - The tree must have at least n! leaves, so it has Ω(n lg n) height
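A worked version of that height bound (not spelled out on the slide): a binary tree of height h has at most 2^h leaves, so

\[
2^h \ge n! \;\Longrightarrow\; h \ge \lg(n!) \ge \lg\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right) = \tfrac{n}{2}\lg\tfrac{n}{2} = \Omega(n \lg n).
\]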

Slide 3: Review: Counting Sort
- Counting sort:
  - Assumption: input is in the range 1..k
  - Basic idea:
    - Count the number k of elements ≤ each element i
    - Use that number to place element i in position k of the sorted array
  - No comparisons! Runs in time O(n + k)
  - Stable sort
  - Does not sort in place:
    - O(n) array to hold the sorted output
    - O(k) array for scratch storage

Slide 4: Review: Counting Sort
CountingSort(A, B, k)
    for i = 1 to k
        C[i] = 0;
    for j = 1 to n
        C[A[j]] += 1;
    for i = 2 to k
        C[i] = C[i] + C[i-1];
    for j = n downto 1
        B[C[A[j]]] = A[j];
        C[A[j]] -= 1;
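A runnable Python version of that pseudocode, as a sketch: it assumes integer keys in the range 1..k and returns a new sorted list (the function and variable names are mine, not from the slides).

def counting_sort(A, k):
    """Stable counting sort for values in 1..k; returns a new sorted list."""
    n = len(A)
    C = [0] * (k + 1)            # C[i] = count of value i (index 0 unused)
    B = [0] * n                  # output array
    for x in A:                  # count occurrences of each value
        C[x] += 1
    for i in range(2, k + 1):    # prefix sums: C[i] = number of elements <= i
        C[i] += C[i - 1]
    for x in reversed(A):        # walk backwards so equal keys keep their order (stability)
        B[C[x] - 1] = x          # C[x] is a 1-based position in the output
        C[x] -= 1
    return B

# Example: counting_sort([3, 1, 2, 3, 1], 3) -> [1, 1, 2, 3, 3]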

Slide 5: Review: Radix Sort
- Radix sort:
  - Assumption: input has d digits, each ranging from 0 to k
  - Basic idea:
    - Sort elements by digit, starting with the least significant
    - Use a stable sort (like counting sort) for each pass
  - Each pass over n numbers takes time O(n + k), so the total time is O(dn + dk)
    - When d is constant and k = O(n), this is O(n) time
  - Fast! Stable! Simple!
  - Doesn't sort in place
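A minimal LSD radix sort sketch in Python. It distributes into per-digit buckets, which is a stable pass standing in for the counting-sort pass the slide describes; the names and the base-10 default are my choices.

def radix_sort(A, d, base=10):
    """Sort non-negative integers with at most d base-`base` digits; O(d(n + base))."""
    for p in range(d):                          # least significant digit first
        buckets = [[] for _ in range(base)]
        for x in A:
            buckets[(x // base**p) % base].append(x)   # stable: order within a digit is preserved
        A = [x for bucket in buckets for x in bucket]
    return A

# Example: radix_sort([329, 457, 657, 839, 436, 720, 355], d=3)
# -> [329, 355, 436, 457, 657, 720, 839]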

Slide 6: Bucket Sort
- Bucket sort:
  - Assumption: input is n reals drawn from [0, 1)
  - Basic idea:
    - Create n linked lists (buckets), dividing the interval [0, 1) into subintervals of size 1/n
    - Add each input element to the appropriate bucket, then sort the buckets with insertion sort
  - Uniform input distribution ⇒ O(1) expected bucket size
    - Therefore the expected total time is O(n)
  - These ideas will return when we study hash tables
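A sketch of bucket sort in Python, assuming inputs in [0, 1). It uses the built-in sorted() on each bucket in place of the insertion sort the slide mentions; the names are mine.

def bucket_sort(A):
    """Expected O(n) for uniformly distributed floats in [0, 1)."""
    n = len(A)
    buckets = [[] for _ in range(n)]
    for x in A:
        buckets[int(n * x)].append(x)     # bucket i covers [i/n, (i+1)/n)
    out = []
    for bucket in buckets:
        out.extend(sorted(bucket))        # stand-in for per-bucket insertion sort
    return out

# Example: bucket_sort([0.78, 0.17, 0.39, 0.26, 0.72, 0.94, 0.21])
# -> [0.17, 0.21, 0.26, 0.39, 0.72, 0.78, 0.94]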

Slide 7: Order Statistics
- The ith order statistic of a set of n elements is the ith smallest element
- The minimum is thus the 1st order statistic
- The maximum is (duh) the nth order statistic
- The median is the (n/2)th order statistic
  - If n is even, there are 2 medians
- How can we calculate order statistics?
- What is the running time?

Slide 8: Order Statistics
- How many comparisons are needed to find the minimum element in a set? The maximum?
- Can we find the minimum and maximum with less than twice that cost?
- Yes:
  - Walk through the elements in pairs
    - Compare the two elements of each pair to each other
    - Compare the larger to the running maximum, the smaller to the running minimum
  - Total cost: 3 comparisons per 2 elements = O(3n/2)
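A sketch of that pairwise scan in Python (it assumes a non-empty list; the names are mine).

def min_and_max(A):
    """Find (min, max) using about 3 comparisons per 2 elements."""
    if len(A) % 2:       # odd length: seed both with the first element
        lo = hi = A[0]
        start = 1
    else:                # even length: seed with the first pair (1 comparison)
        lo, hi = (A[0], A[1]) if A[0] < A[1] else (A[1], A[0])
        start = 2
    for j in range(start, len(A), 2):              # parity guarantees j+1 is in range
        a, b = A[j], A[j + 1]
        small, big = (a, b) if a < b else (b, a)   # 1 comparison within the pair
        if small < lo:                             # 1 comparison against the running min
            lo = small
        if big > hi:                               # 1 comparison against the running max
            hi = big
    return lo, hi

# Example: min_and_max([7, 2, 9, 4, 1, 8]) -> (1, 9)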

Slide 9: Finding Order Statistics: The Selection Problem
- A more interesting problem is selection: finding the ith smallest element of a set
- We will show:
  - A practical randomized algorithm with O(n) expected running time
  - A cool algorithm of theoretical interest only with O(n) worst-case running time

Slide 10: Randomized Selection
- Key idea: use Partition() from quicksort
  - But we only need to examine one subarray
  - This savings shows up in the running time: O(n)
- We will again use a slightly different partition than the book:
  q = RandomizedPartition(A, p, r)
  (afterwards, the elements of A[p..q-1] are ≤ A[q] and the elements of A[q+1..r] are ≥ A[q])

Slide 11: Randomized Selection
RandomizedSelect(A, p, r, i)
    if (p == r) then return A[p];
    q = RandomizedPartition(A, p, r);
    k = q - p + 1;
    if (i == k) then return A[q];                      // not in book
    if (i < k) then return RandomizedSelect(A, p, q-1, i);
    else return RandomizedSelect(A, q+1, r, i-k);
(k is the number of elements in A[p..q], i.e. the rank of the pivot A[q] within A[p..r].)
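A runnable Python sketch of RandomizedSelect. The Lomuto-style randomized partition below is my filling-in; the book's partition differs slightly, but the selection logic matches the pseudocode above.

import random

def randomized_partition(A, p, r):
    """Partition A[p..r] in place around a random pivot; return the pivot's final index."""
    pivot_index = random.randint(p, r)
    A[pivot_index], A[r] = A[r], A[pivot_index]
    pivot = A[r]
    i = p - 1
    for j in range(p, r):
        if A[j] <= pivot:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def randomized_select(A, p, r, i):
    """Return the ith smallest element (1-based) of A[p..r]; expected O(n)."""
    if p == r:
        return A[p]
    q = randomized_partition(A, p, r)
    k = q - p + 1                 # rank of the pivot within A[p..r]
    if i == k:
        return A[q]
    if i < k:
        return randomized_select(A, p, q - 1, i)
    return randomized_select(A, q + 1, r, i - k)

# Example: randomized_select([3, 1, 4, 1, 5, 9], 0, 5, 3) -> 3 (the 3rd smallest)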

Slide 12: Randomized Selection
- Analyzing RandomizedSelect():
  - Worst case: the partition always splits 0 : n-1
    T(n) = T(n-1) + O(n) = ??? = O(n^2) (arithmetic series)
    - Worse than sorting!
  - "Best" case: suppose a 9:1 partition
    T(n) = T(9n/10) + O(n) = ??? = O(n) (Master Theorem, case 3)
    - Better than sorting!
    - What if this had been a 99:1 split?
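Filling in the two ??? steps (a sketch; this working is mine, not from the slide):

\[
T(n) = T(n-1) + O(n) = O\!\left(\sum_{k=1}^{n} k\right) = O(n^2).
\]

\[
T(n) = T(9n/10) + O(n):\quad a = 1,\ b = \tfrac{10}{9},\ n^{\log_b a} = n^0 = 1,\ f(n) = n = \Omega(n^{0+\epsilon}),
\]

and the regularity condition a·f(n/b) = (9/10)n ≤ (9/10)·f(n) holds, so case 3 of the Master Theorem gives T(n) = Θ(n).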

Slide 13: Randomized Selection
- Average case
  - For an upper bound, assume the ith element always falls in the larger side of the partition
  - Let's show that T(n) = O(n) by substitution
(The recurrence itself appeared as an equation image annotated "What happened here?"; see the reconstruction after Slide 15 below.)

Slide 14: Randomized Selection
- Assume T(n) ≤ cn for sufficiently large c
- (The derivation appeared as equation images; the step annotations on this slide were:)
  - The recurrence we started with
  - "Split" the recurrence (What happened here?)
  - Substitute T(n) ≤ cn for T(k)
  - Expand the arithmetic series
  - Multiply it out

Slide 15: Randomized Selection
- Assume T(n) ≤ cn for sufficiently large c (continued)
- (Step annotations on this slide's equations:)
  - The recurrence so far
  - Multiply it out
  - Subtract c/2 (What happened here?)
  - Rearrange the arithmetic
  - What we set out to prove
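The equations themselves are missing from this transcript, so here is a reconstruction of the standard substitution argument those step labels describe (constants and floor handling follow CLRS; treat it as a sketch rather than the slide's exact algebra):

\begin{align*}
T(n) &\le \frac{1}{n}\sum_{k=0}^{n-1} T\big(\max(k,\,n-k-1)\big) + \Theta(n)
        && \text{the recurrence we started with}\\
     &\le \frac{2}{n}\sum_{k=\lfloor n/2\rfloor}^{n-1} T(k) + \Theta(n)
        && \text{``split'' the recurrence}\\
     &\le \frac{2}{n}\sum_{k=\lfloor n/2\rfloor}^{n-1} ck + \Theta(n)
        && \text{substitute } T(k)\le ck\\
     &\le \frac{2c}{n}\left(\frac{(n-1)n}{2}-\frac{(n/2-2)(n/2-1)}{2}\right) + \Theta(n)
        && \text{expand the arithmetic series}\\
     &\le \frac{3cn}{4} + \frac{c}{2} + \Theta(n)
        && \text{multiply it out}\\
     &= cn - \left(\frac{cn}{4} - \frac{c}{2} - \Theta(n)\right)
        && \text{rearrange the arithmetic}\\
     &\le cn \quad\text{for sufficiently large } c
        && \text{what we set out to prove}
\end{align*}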

Slide 16: Worst-Case Linear-Time Selection
- The randomized algorithm works well in practice
- What follows is a worst-case linear-time algorithm, really of theoretical interest only
- Basic idea:
  - Generate a good partitioning element
  - Call this element x

Slide 17: Worst-Case Linear-Time Selection
The algorithm in words:
1. Divide the n elements into groups of 5
2. Find the median of each group (How? How long?)
3. Use Select() recursively to find the median x of the ⌈n/5⌉ medians
4. Partition the n elements around x; let k = rank(x)
5. if (i == k) then return x;
   if (i < k) then use Select() recursively to find the ith smallest element in the first partition;
   if (i > k) then use Select() recursively to find the (i-k)th smallest element in the last partition
(A runnable sketch follows below.)
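A runnable Python sketch of those five steps, using list comprehensions for the grouping and the partition instead of in-place operations; the helper names and the handling of duplicates are mine.

def select(A, i):
    """ith smallest element (1-based) of A, worst-case O(n) via median of medians."""
    if len(A) <= 5:
        return sorted(A)[i - 1]
    # Steps 1-2: medians of the groups of 5 (the last group may be smaller)
    groups = [A[j:j + 5] for j in range(0, len(A), 5)]
    medians = [sorted(g)[len(g) // 2] for g in groups]
    # Step 3: recursively find the median x of the medians
    x = select(medians, (len(medians) + 1) // 2)
    # Step 4: partition around x (the rank k of x is len(less) + 1)
    less = [y for y in A if y < x]
    equal = [y for y in A if y == x]
    greater = [y for y in A if y > x]
    # Step 5: return x, or recurse into the side holding the ith smallest element
    if i <= len(less):
        return select(less, i)
    if i <= len(less) + len(equal):
        return x
    return select(greater, i - len(less) - len(equal))

# Example: select([3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5], 6) -> 4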

Slide 18: Worst-Case Linear-Time Selection
- (Sketch the situation on the board)
- How many of the 5-element medians are ≤ x?
  - At least half of the medians: ⌊⌊n/5⌋/2⌋ = ⌊n/10⌋
- How many elements are ≤ x?
  - At least 3⌊n/10⌋ elements
- For large n, 3⌊n/10⌋ ≥ n/4 (How large?)
- So at least n/4 elements are ≤ x
- Similarly: at least n/4 elements are ≥ x

Slide 19: Worst-Case Linear-Time Selection
- Thus after partitioning around x, step 5 will call Select() on at most 3n/4 elements
- The recurrence is therefore:
  T(n) ≤ T(⌈n/5⌉) + T(3n/4) + Θ(n)
- (The solution appeared as equation images; the step annotations were:)
  - Substitute T(n) = cn
  - Combine fractions
  - Express in desired form
  - What we set out to prove
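Since the worked solution was an image, here is a reconstruction of the substitution those labels describe (a sketch following the standard argument, not necessarily the slide's exact algebra):

\begin{align*}
T(n) &\le T(\lceil n/5\rceil) + T(3n/4) + \Theta(n)\\
     &\le c\lceil n/5\rceil + \tfrac{3}{4}cn + \Theta(n)
        && \text{substitute } T(n) = cn\\
     &\le \tfrac{1}{5}cn + c + \tfrac{3}{4}cn + \Theta(n)\\
     &= \tfrac{19}{20}cn + c + \Theta(n)
        && \text{combine fractions}\\
     &= cn - \left(\tfrac{1}{20}cn - c - \Theta(n)\right)
        && \text{express in desired form}\\
     &\le cn \quad\text{for sufficiently large } c \text{ and } n
        && \text{what we set out to prove}
\end{align*}

The 19/20 here is the constant fraction the next slide's intuition refers to.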

Slide 20: Worst-Case Linear-Time Selection
- Intuitively:
  - The work at each level is a constant fraction (19/20) smaller
    - Geometric progression!
  - Thus the O(n) work at the root dominates

Slide 21: Linear-Time Median Selection
- Given a "black box" O(n) median algorithm, what can we do?
  - ith order statistic:
    - Find the median x
    - Partition the input around x
    - if (i ≤ (n+1)/2) recursively find the ith element of the first half
    - else find the (i - (n+1)/2)th element of the second half
    - T(n) = T(n/2) + O(n) = O(n)
  - Can you think of an application to sorting?

Slide 22: Linear-Time Median Selection
- Worst-case O(n lg n) quicksort:
  - Find the median x and partition around it
  - Recursively quicksort the two halves
  - T(n) = 2T(n/2) + O(n) = O(n lg n)
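A sketch of that quicksort in Python, reusing the select() sketch from Slide 17 above as the black-box linear-time median (so the worst case is O(n lg n); the names are mine):

def median_quicksort(A):
    """Quicksort that always pivots on the true median; worst-case O(n lg n)."""
    if len(A) <= 1:
        return A
    x = select(A, (len(A) + 1) // 2)   # black-box O(n) median (see the select() sketch above)
    less = [y for y in A if y < x]
    equal = [y for y in A if y == x]
    greater = [y for y in A if y > x]
    return median_quicksort(less) + equal + median_quicksort(greater)

# Example: median_quicksort([3, 1, 4, 1, 5, 9, 2, 6]) -> [1, 1, 2, 3, 4, 5, 6, 9]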

