CS 332: Algorithms (David Luebke, 10/3/2015): Solving Recurrences Continued; The Master Theorem; Introduction to Heapsort.


Review: Merge Sort

MergeSort(A, left, right) {
  if (left < right) {
    mid = floor((left + right) / 2);
    MergeSort(A, left, mid);
    MergeSort(A, mid+1, right);
    Merge(A, left, mid, right);
  }
}
// Merge() takes two sorted subarrays of A and
// merges them into a single sorted subarray of A.
// Code for this is in the book. It requires O(n)
// time, and *does* require allocating O(n) space
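The pseudocode above translates directly into a short runnable version. This is a sketch following the slide's structure; the helper names and the test array are ours, and the O(n) auxiliary space the slide mentions shows up as the two temporary sublists in merge().

```python
def merge_sort(A, left, right):
    """Sort A[left..right] in place, mirroring the slide's MergeSort()."""
    if left < right:
        mid = (left + right) // 2
        merge_sort(A, left, mid)
        merge_sort(A, mid + 1, right)
        merge(A, left, mid, right)

def merge(A, left, mid, right):
    """Merge sorted subarrays A[left..mid] and A[mid+1..right].
    Allocates O(n) temporary space, as the slide notes."""
    L, R = A[left:mid + 1], A[mid + 1:right + 1]
    i = j = 0
    for k in range(left, right + 1):
        # take from L while it has elements and its head is no larger
        if j >= len(R) or (i < len(L) and L[i] <= R[j]):
            A[k] = L[i]; i += 1
        else:
            A[k] = R[j]; j += 1

data = [5, 2, 4, 7, 1, 3, 2, 6]
merge_sort(data, 0, len(data) - 1)
print(data)  # [1, 2, 2, 3, 4, 5, 6, 7]
```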

Review: Analysis of Merge Sort

Statement                              Effort
MergeSort(A, left, right) {            T(n)
  if (left < right) {                  Θ(1)
    mid = floor((left + right) / 2);   Θ(1)
    MergeSort(A, left, mid);           T(n/2)
    MergeSort(A, mid+1, right);        T(n/2)
    Merge(A, left, mid, right);        Θ(n)
  }
}
● So T(n) = Θ(1) when n = 1, and 2T(n/2) + Θ(n) when n > 1
● This expression is a recurrence

Review: Solving Recurrences
● Substitution method
● Iteration method
● Master method

Review: Solving Recurrences
● The substitution method
  ■ A.k.a. the "making a good guess" method
  ■ Guess the form of the answer, then use induction to find the constants and show that the solution works
  ■ Ran an example: merge sort
    ○ T(n) = 2T(n/2) + cn
    ○ We guess that the answer is O(n lg n)
    ○ Prove it by induction
  ■ Can similarly show T(n) = Ω(n lg n), thus Θ(n lg n)
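The inductive step the slide alludes to can be written out in full. A sketch of the O(n lg n) guess for T(n) = 2T(n/2) + cn (the constant name d is ours):

```latex
% Guess: T(n) \le d\,n \lg n for some constant d > 0.
% Inductive step, assuming the bound holds for n/2:
\begin{align*}
T(n) &= 2\,T(n/2) + cn \\
     &\le 2\bigl(d\,(n/2)\lg(n/2)\bigr) + cn \\
     &= d\,n(\lg n - 1) + cn \\
     &= d\,n\lg n - dn + cn \\
     &\le d\,n\lg n \qquad \text{whenever } d \ge c.
\end{align*}
```

Choosing any d ≥ c closes the induction, which is exactly the "find the constants" step the slide mentions.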

Review: Solving Recurrences
● The "iteration method"
  ■ Expand the recurrence
  ■ Work some algebra to express it as a summation
  ■ Evaluate the summation
● We showed several examples; we were in the middle of:

● T(n) = aT(n/b) + cn
       = a(aT(n/b^2) + cn/b) + cn
       = a^2 T(n/b^2) + cn(a/b) + cn
       = a^2 T(n/b^2) + cn(a/b + 1)
       = a^2 (aT(n/b^3) + cn/b^2) + cn(a/b + 1)
       = a^3 T(n/b^3) + cn(a^2/b^2) + cn(a/b + 1)
       = a^3 T(n/b^3) + cn(a^2/b^2 + a/b + 1)
       …
       = a^k T(n/b^k) + cn(a^(k-1)/b^(k-1) + a^(k-2)/b^(k-2) + … + a^2/b^2 + a/b + 1)

● So we have
  ■ T(n) = a^k T(n/b^k) + cn(a^(k-1)/b^(k-1) + … + a^2/b^2 + a/b + 1)
● For k = log_b n
  ■ n = b^k
  ■ T(n) = a^k T(1) + cn(a^(k-1)/b^(k-1) + … + a^2/b^2 + a/b + 1)
         = c·a^k + cn(a^(k-1)/b^(k-1) + … + a^2/b^2 + a/b + 1)
         = cn·a^k/b^k + cn(a^(k-1)/b^(k-1) + … + a^2/b^2 + a/b + 1)
         = cn(a^k/b^k + a^(k-1)/b^(k-1) + … + a^2/b^2 + a/b + 1)
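The closed form just derived can be sanity-checked numerically: for n an exact power of b, evaluating the recurrence directly should agree with cn·(a^k/b^k + … + a/b + 1), where k = log_b n. A small check (function names and the T(1) = c base case are our assumptions):

```python
def T(n, a, b, c):
    # The recurrence T(n) = a*T(n/b) + c*n, with base case T(1) = c
    return c if n == 1 else a * T(n // b, a, b, c) + c * n

def closed_form(n, a, b, c):
    # cn * (a^k/b^k + a^(k-1)/b^(k-1) + ... + a/b + 1), where n = b^k
    k, m = 0, n
    while m > 1:
        m //= b; k += 1
    return c * n * sum((a / b) ** j for j in range(k + 1))

# agree exactly (up to float rounding) for several (a, b) regimes
for a, b in [(2, 2), (9, 3), (1, 2), (4, 2)]:
    n = b ** 6
    assert abs(T(n, a, b, 1) - closed_form(n, a, b, 1)) < 1e-6 * closed_form(n, a, b, 1)
```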

● So with k = log_b n
  ■ T(n) = cn(a^k/b^k + … + a^2/b^2 + a/b + 1)
● What if a = b?
  ■ T(n) = cn(k + 1) = cn(log_b n + 1) = Θ(n log n)

● So with k = log_b n
  ■ T(n) = cn(a^k/b^k + … + a^2/b^2 + a/b + 1)
● What if a < b?
  ■ Recall that (x^k + x^(k-1) + … + x + 1) = (x^(k+1) - 1)/(x - 1)
  ■ So with x = a/b < 1, the series is bounded above by the constant 1/(1 - a/b)
  ■ T(n) = cn · Θ(1) = Θ(n)

● So with k = log_b n
  ■ T(n) = cn(a^k/b^k + … + a^2/b^2 + a/b + 1)
● What if a > b?
  ■ Now x = a/b > 1, so the geometric series is dominated by its largest term:
  ■ T(n) = cn · Θ(a^k/b^k)
         = cn · Θ(a^(log_b n) / b^(log_b n))
         = cn · Θ(a^(log_b n) / n)
    (recall the logarithm fact: a^(log_b n) = n^(log_b a))
         = cn · Θ(n^(log_b a) / n)
         = Θ(n^(log_b a))

● So… for T(n) = aT(n/b) + cn:
  ■ a = b: T(n) = Θ(n log n)
  ■ a < b: T(n) = Θ(n)
  ■ a > b: T(n) = Θ(n^(log_b a))

The Master Theorem
● Given: a divide and conquer algorithm
  ■ An algorithm that divides the problem of size n into a subproblems, each of size n/b
  ■ Let the cost of each stage (i.e., the work to divide the problem + combine solved subproblems) be described by the function f(n)
● Then the Master Theorem gives us a cookbook for the algorithm's running time:

The Master Theorem
● If T(n) = aT(n/b) + f(n), then
  ■ Case 1: if f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a))
  ■ Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · lg n)
  ■ Case 3: if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n))

Using The Master Method
● T(n) = 9T(n/3) + n
  ■ a = 9, b = 3, f(n) = n
  ■ n^(log_b a) = n^(log_3 9) = Θ(n^2)
  ■ Since f(n) = O(n^(log_3 9 - ε)), where ε = 1, case 1 applies
  ■ Thus the solution is T(n) = Θ(n^2)
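The cookbook can be encoded directly. A small classifier, under the simplifying assumption that f(n) is a plain polynomial n^d (so case 1 is d < log_b a, case 2 is d = log_b a, and case 3 is d > log_b a, with the regularity condition holding automatically); the polynomial restriction and the function name are ours:

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) per the Master Theorem."""
    crit = math.log(a, b)  # log_b a, the critical exponent
    if d < crit - 1e-12:
        return f"Theta(n^{crit:g})"      # case 1: recursion dominates
    if abs(d - crit) <= 1e-12:
        return f"Theta(n^{d:g} lg n)"    # case 2: balanced
    return f"Theta(n^{d:g})"             # case 3: f(n) dominates

print(master_theorem(9, 3, 1))   # case 1: Theta(n^2), matching the slide
print(master_theorem(2, 2, 1))   # case 2: Theta(n^1 lg n), merge sort
print(master_theorem(3, 4, 2))   # case 3: Theta(n^2)
```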

Sorting Revisited
● So far we've talked about two algorithms to sort an array of numbers
  ■ What is the advantage of merge sort?
  ■ What is the advantage of insertion sort?
● Next on the agenda: Heapsort
  ■ Combines the advantages of both previous algorithms

Heaps
● A heap can be seen as a complete binary tree:
  ■ What makes a binary tree complete?
  ■ Is the example above complete?

Heaps
● A heap can be seen as a complete binary tree:
  ■ The book calls them "nearly complete" binary trees; can think of unfilled slots as null pointers

Heaps
● In practice, heaps are usually implemented as arrays

Heaps
● To represent a complete binary tree as an array:
  ■ The root node is A[1]
  ■ Node i is A[i]
  ■ The parent of node i is A[i/2] (note: integer divide)
  ■ The left child of node i is A[2i]
  ■ The right child of node i is A[2i + 1]

Referencing Heap Elements
● So…
  Parent(i) { return ⌊i/2⌋; }
  Left(i)   { return 2*i; }
  Right(i)  { return 2*i + 1; }
● An aside: how would you implement this most efficiently?
● Another aside: really?
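The "most efficiently" aside is pointing at bit operations: with 1-based indexing, Parent, Left, and Right are each a single shift. A sketch (the helper names mirror the slide; the sample array is an assumed valid max-heap):

```python
def parent(i): return i >> 1        # floor(i/2)
def left(i):   return i << 1        # 2i
def right(i):  return (i << 1) | 1  # 2i + 1

# 1-based heap: index 0 is unused so the shift arithmetic works
heap = [None, 16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
assert heap[parent(4)] >= heap[4]   # heap property at node 4
assert left(3) == 6 and right(3) == 7
```

Whether the shifts beat a compiler's optimization of `i/2` and `2*i` is exactly the point of the "really?" aside.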

The Heap Property
● Heaps also satisfy the heap property:
  A[Parent(i)] ≥ A[i] for all nodes i > 1
  ■ In other words, the value of a node is at most the value of its parent
  ■ Where is the largest element in a heap stored?
● Definitions:
  ■ The height of a node in the tree = the number of edges on the longest downward path to a leaf
  ■ The height of a tree = the height of its root

Heap Height
● What is the height of an n-element heap? Why?
● This is nice: basic heap operations take at most time proportional to the height of the heap
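The answer the slide is fishing for is ⌊lg n⌋: the deepest node has index n, and with 1-based indexing a node's depth grows by one each time its index doubles, so node i sits at depth ⌊lg i⌋. A quick check (helper name ours):

```python
import math

def heap_height(n):
    """Height of an n-element heap = depth of node n (1-based)."""
    # bit_length() - 1 counts how many times n can be halved down to 1
    return n.bit_length() - 1

for n in range(1, 10_000):
    assert heap_height(n) == math.floor(math.log2(n))
```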

Heap Operations: Heapify()
● Heapify(): maintain the heap property
  ■ Given: a node i in the heap with children l and r
  ■ Given: two subtrees rooted at l and r, assumed to be heaps
  ■ Problem: the subtree rooted at i may violate the heap property (how?)
  ■ Action: let the value of the parent node "float down" so the subtree at i satisfies the heap property
    ○ What do you suppose will be the basic operation between i, l, and r?

Heap Operations: Heapify()
Heapify(A, i) {
  l = Left(i);
  r = Right(i);
  if (l <= heap_size(A) && A[l] > A[i])
    largest = l;
  else
    largest = i;
  if (r <= heap_size(A) && A[r] > A[largest])
    largest = r;
  if (largest != i) {
    Swap(A, i, largest);
    Heapify(A, largest);
  }
}
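In runnable form, the two comparisons need the bounds checks l ≤ heap_size and r ≤ heap_size so that leaves are handled correctly. A Python sketch using a 1-based array (the function name and demo array are ours; the array is the standard textbook example where node 2 violates the heap property):

```python
def heapify(A, i, heap_size):
    """Float A[i] down until the subtree rooted at i is a max-heap.
    A is 1-based: A[0] is unused; children of i are 2i and 2i+1."""
    l, r = 2 * i, 2 * i + 1
    largest = l if l <= heap_size and A[l] > A[i] else i
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        heapify(A, largest, heap_size)

# node 2 (value 4) violates the heap property; both its subtrees are heaps
A = [None, 16, 4, 10, 14, 7, 9, 3, 2, 8, 1]
heapify(A, 2, 10)
print(A[1:])  # [16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
```

The value 4 swaps with its larger child 14, then with 8, exactly the "float down" the previous slide describes.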

Heapify() Example
● A = {16, 4, 10, 14, 7, 9, 3, 2, 8, 1}; node 2 (value 4) violates the heap property
● Heapify(A, 2): 4 is swapped with its larger child, 14, then one level down with 8
● Result: A = {16, 14, 10, 8, 7, 9, 3, 2, 4, 1}, a valid max-heap

Analyzing Heapify(): Informal
● Aside from the recursive call, what is the running time of Heapify()?
● How many times can Heapify() recursively call itself?
● What is the worst-case running time of Heapify() on a heap of size n?

Analyzing Heapify(): Formal
● Fixing up the relationships between i, l, and r takes Θ(1) time
● If the heap at i has n elements, how many elements can the subtrees at l or r have?
  ■ Draw it
● Answer: 2n/3 (worst case: bottom row 1/2 full)
● So the time taken by Heapify() is given by T(n) ≤ T(2n/3) + Θ(1)
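The 2n/3 bound can be checked by brute force: count the nodes in the subtree rooted at the root's left child for every heap size n, and confirm the count never exceeds 2n/3, with the worst case at heaps whose bottom row is half full. (The counting helper is ours.)

```python
def subtree_size(i, n):
    """Number of nodes in the subtree rooted at 1-based index i, heap size n."""
    if i > n:
        return 0
    return 1 + subtree_size(2 * i, n) + subtree_size(2 * i + 1, n)

for n in range(2, 2000):
    assert subtree_size(2, n) <= 2 * n / 3   # root's left child's subtree

# worst case: bottom row half full, e.g. n = 5 -> left subtree holds 3 of 5 nodes
assert subtree_size(2, 5) == 3
```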

Analyzing Heapify(): Formal
● So we have T(n) ≤ T(2n/3) + Θ(1)
● By case 2 of the Master Theorem, T(n) = O(lg n)
● Thus, Heapify() takes logarithmic time

Heap Operations: BuildHeap()
● We can build a heap in a bottom-up manner by running Heapify() on successive subarrays
  ■ Fact: for an array of length n, all elements in the range A[⌊n/2⌋+1 .. n] are heaps (why?)
  ■ So:
    ○ Walk backwards through the array from ⌊n/2⌋ to 1, calling Heapify() on each node
    ○ The order of processing guarantees that the children of node i are heaps when i is processed

BuildHeap()
// given an unsorted array A, make A a heap
BuildHeap(A) {
  heap_size(A) = length(A);
  for (i = ⌊length[A]/2⌋ downto 1)
    Heapify(A, i);
}
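BuildHeap() in runnable form, with a Heapify() helper included so the sketch stands alone (1-based array; names ours). Walking i from ⌊n/2⌋ down to 1 guarantees that both children of i are already heaps when i is processed:

```python
def heapify(A, i, heap_size):
    """Float A[i] down; A is 1-based, children of i are 2i and 2i+1."""
    l, r = 2 * i, 2 * i + 1
    largest = l if l <= heap_size and A[l] > A[i] else i
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        heapify(A, largest, heap_size)

def build_heap(A):
    """Make 1-based A (A[0] unused) a max-heap, bottom-up."""
    n = len(A) - 1
    for i in range(n // 2, 0, -1):   # leaves A[n/2+1..n] are already heaps
        heapify(A, i, n)

A = [None, 4, 1, 3, 2, 16, 9, 10, 14, 8, 7]
build_heap(A)
print(A[1:])  # [16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
```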

BuildHeap() Example
● Work through example A = {4, 1, 3, 2, 16, 9, 10, 14, 8, 7}

Analyzing BuildHeap()
● Each call to Heapify() takes O(lg n) time
● There are O(n) such calls (specifically, ⌊n/2⌋)
● Thus the running time is O(n lg n)
  ■ Is this a correct asymptotic upper bound?
  ■ Is this an asymptotically tight bound?
● A tighter bound is O(n)
  ■ How can this be? Is there a flaw in the above reasoning?

Analyzing BuildHeap(): Tight
● To Heapify() a subtree takes O(h) time, where h is the height of the subtree
  ■ h = O(lg m), m = # nodes in subtree
  ■ The height of most subtrees is small
● Fact: an n-element heap has at most ⌈n/2^(h+1)⌉ nodes of height h
● CLR 7.3 uses this fact to prove that BuildHeap() takes O(n) time
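The ⌈n/2^(h+1)⌉ fact is easy to verify exhaustively for small heaps; combined with the convergence of Σ h/2^h it yields the O(n) bound. A brute-force check (the height helper is ours):

```python
import math

def node_height(i, n):
    """Height of 1-based node i in an n-node heap:
    length of the longest downward path to a leaf."""
    if 2 * i > n:                       # no children: a leaf
        return 0
    h = node_height(2 * i, n)
    if 2 * i + 1 <= n:
        h = max(h, node_height(2 * i + 1, n))
    return 1 + h

for n in range(1, 300):
    heights = [node_height(i, n) for i in range(1, n + 1)]
    for h in range(max(heights) + 1):
        assert heights.count(h) <= math.ceil(n / 2 ** (h + 1))
```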

Heapsort
● Given BuildHeap(), an in-place sorting algorithm is easily constructed:
  ■ The maximum element is at A[1]
  ■ Discard it by swapping it with the element at A[n]
    ○ Decrement heap_size[A]
    ○ A[n] now contains the correct value
  ■ Restore the heap property at A[1] by calling Heapify()
  ■ Repeat, always swapping A[1] with A[heap_size(A)]

Heapsort
Heapsort(A) {
  BuildHeap(A);
  for (i = length(A) downto 2) {
    Swap(A[1], A[i]);
    heap_size(A) -= 1;
    Heapify(A, 1);
  }
}
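Putting it together, Heapsort() in runnable form, with its helper inlined so the sketch stands alone (1-based array; names ours):

```python
def heapify(A, i, heap_size):
    l, r = 2 * i, 2 * i + 1
    largest = l if l <= heap_size and A[l] > A[i] else i
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        heapify(A, largest, heap_size)

def heapsort(A):
    """Sort 1-based A (A[0] unused) in place."""
    n = len(A) - 1
    for i in range(n // 2, 0, -1):   # BuildHeap
        heapify(A, i, n)
    for i in range(n, 1, -1):        # repeatedly move the max to the end
        A[1], A[i] = A[i], A[1]
        heapify(A, 1, i - 1)         # heap shrinks by one each pass

A = [None, 4, 1, 3, 2, 16, 9, 10, 14, 8, 7]
heapsort(A)
print(A[1:])  # [1, 2, 3, 4, 7, 8, 9, 10, 14, 16]
```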

Analyzing Heapsort
● The call to BuildHeap() takes O(n) time
● Each of the n - 1 calls to Heapify() takes O(lg n) time
● Thus the total time taken by Heapsort() = O(n) + (n - 1) · O(lg n) = O(n) + O(n lg n) = O(n lg n)

Priority Queues
● Heapsort is a nice algorithm, but in practice Quicksort (coming up) usually wins
● But the heap data structure is incredibly useful for implementing priority queues
  ■ A data structure for maintaining a set S of elements, each with an associated value or key
  ■ Supports the operations Insert(), Maximum(), and ExtractMax()
  ■ What might a priority queue be useful for?

Priority Queue Operations
● Insert(S, x) inserts the element x into set S
● Maximum(S) returns the element of S with the maximum key
● ExtractMax(S) removes and returns the element of S with the maximum key
● How could we implement these operations using a heap?
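The three operations map directly onto the heap machinery: Maximum reads A[1], ExtractMax swaps the last element to the root and floats it down, and Insert appends a key and floats it up (the reverse of Heapify's float-down). A sketch of a max-priority queue (class and method names are ours, not from the slides):

```python
class MaxPriorityQueue:
    def __init__(self):
        self.A = [None]                  # 1-based heap array

    def insert(self, key):
        self.A.append(key)
        i = len(self.A) - 1
        while i > 1 and self.A[i // 2] < self.A[i]:   # float up
            self.A[i], self.A[i // 2] = self.A[i // 2], self.A[i]
            i //= 2

    def maximum(self):
        return self.A[1]                 # O(1): the max sits at the root

    def extract_max(self):
        top, last = self.A[1], self.A.pop()
        if len(self.A) > 1:              # float the old last element down
            self.A[1] = last
            i, n = 1, len(self.A) - 1
            while True:
                l, r = 2 * i, 2 * i + 1
                largest = l if l <= n and self.A[l] > self.A[i] else i
                if r <= n and self.A[r] > self.A[largest]:
                    largest = r
                if largest == i:
                    break
                self.A[i], self.A[largest] = self.A[largest], self.A[i]
                i = largest
        return top

q = MaxPriorityQueue()
for k in [4, 1, 3, 16, 9]:
    q.insert(k)
print(q.maximum())                           # 16
print([q.extract_max() for _ in range(5)])   # [16, 9, 4, 3, 1]
```

All three operations run in O(lg n) or better, which is exactly why the heap is the standard backing structure for a priority queue.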