Lecture 3 / 4 Algorithm Analysis


Lecture 3 / 4 Algorithm Analysis Arne Kutzner Hanyang University / Seoul Korea

Heapsort

Fundamentals - Heap
A heap is a nearly complete binary tree in which each node stores a value.
Heap property (max-heap): the value of any node N is greater than or equal to the values of both of N's children.
Height of a node = number of edges on the longest simple path from the node down to a leaf.
Height of the heap = height of the root = Θ(lg n), where n is the number of nodes in the tree.

Heap as a Data Structure
Interpretation of an array as a tree:
PARENT(i): return ⌊i/2⌋
LEFT(i): return 2i
RIGHT(i): return 2i + 1
There are max-heaps and min-heaps.
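The array-as-tree interpretation above can be sketched in Python. This is a minimal illustration, not code from the lecture; the indices are kept 1-based to match the pseudocode, so index 0 of the list is left unused.

```python
# 1-based index helpers matching the slide's PARENT/LEFT/RIGHT.

def parent(i):
    return i // 2          # floor(i/2)

def left(i):
    return 2 * i

def right(i):
    return 2 * i + 1

# A max-heap laid out in an array (index 0 is a dummy so indices are 1-based):
heap = [None, 16, 14, 10, 8, 7, 9, 3, 2, 4, 1]

# Max-heap property: every node dominates its children.
n = len(heap) - 1
assert all(heap[parent(i)] >= heap[i] for i in range(2, n + 1))
```

Node i's children sit at positions 2i and 2i + 1, so no pointers are needed: the tree structure is implicit in the array layout.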

MAX-HEAPIFY
MAX-HEAPIFY is used to maintain the max-heap property.
Before MAX-HEAPIFY, A[i] may be smaller than its children.
Assumption: the left and right subtrees of i are max-heaps.
After MAX-HEAPIFY, the subtree rooted at i is a max-heap.

The way MAX-HEAPIFY works
Compare A[i], A[LEFT(i)], and A[RIGHT(i)].
If necessary, swap A[i] with the larger of the two children to preserve the heap property.
Continue comparing and swapping down the heap until the subtree rooted at i is a max-heap.
If we hit a leaf, the subtree rooted at that leaf is trivially a max-heap.

MAX-HEAPIFY example

MAX-HEAPIFY Pseudocode
Time / comparisons: O(lg n).
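The pseudocode on the slide (not reproduced in this transcript) can be rendered in Python as follows. This is a sketch, not the lecturer's code; it keeps the 1-based indexing of the pseudocode, with A[0] unused.

```python
def max_heapify(A, i, heap_size):
    """Restore the max-heap property at index i (1-based; A[0] unused),
    assuming the subtrees rooted at LEFT(i) and RIGHT(i) are max-heaps."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]   # float the larger child up
        max_heapify(A, largest, heap_size)    # continue down that subtree

A = [None, 4, 14, 7, 2, 8, 1]   # both subtrees of the root are max-heaps
max_heapify(A, 1, 6)
print(A)   # [None, 14, 8, 7, 2, 4, 1]
```

Each recursive step descends one level, so the work is bounded by the height of the node, giving the O(lg n) bound stated above.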

Using MAX-HEAPIFY for creating a max-heap
Starting point: each leaf is already a primitive (single-node) max-heap.
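BUILD-MAX-HEAP can be sketched in Python as below. This is an illustration under the same 1-based conventions as before, not the slide's code verbatim; the input array is the standard textbook example.

```python
def max_heapify(A, i, n):
    # Iteratively sift A[i] down until the subtree rooted at i is a
    # max-heap (1-based indices; A[0] unused).
    while True:
        l, r, largest = 2 * i, 2 * i + 1, i
        if l <= n and A[l] > A[largest]:
            largest = l
        if r <= n and A[r] > A[largest]:
            largest = r
        if largest == i:
            break
        A[i], A[largest] = A[largest], A[i]
        i = largest

def build_max_heap(A, n):
    # Leaves occupy indices n//2 + 1 .. n and are trivial max-heaps, so
    # fix the internal nodes bottom-up; each call's precondition (both
    # subtrees are max-heaps) then holds automatically.
    for i in range(n // 2, 0, -1):
        max_heapify(A, i, n)

A = [None, 4, 1, 3, 2, 16, 9, 10, 14, 8, 7]   # the textbook example input
build_max_heap(A, 10)
print(A)   # [None, 16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
```

Processing nodes from ⌊n/2⌋ down to 1 is exactly the bottom-up order the correctness argument on the next slide relies on.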

Build-Max-Heap Correctness

Complexity of Build-Max-Heap
A loose bound is obviously O(n log n): O(n) calls to MAX-HEAPIFY, each costing O(lg n).
Tighter analysis: an n-element heap has at most ⌈n/2^(h+1)⌉ nodes of height h, and MAX-HEAPIFY on a node of height h costs O(h).
Time/comparisons: Σ_{h=0}^{⌊lg n⌋} ⌈n/2^(h+1)⌉ · O(h) = O(n · Σ_{h=0}^{∞} h/2^h).
Using Formula A.8 (Textbook), Σ_{k=0}^{∞} k x^k = x/(1−x)² with x = 1/2, the sum evaluates to 2.
And we get: BUILD-MAX-HEAP runs in O(n) time.

Heapsort Algorithm
Starting with the root (the maximum element), the algorithm places the maximum element into its correct place in the array by swapping it with the element in the last position in the array.
"Discard" this last node (knowing that it is in its correct place) by decreasing the heap size, and call MAX-HEAPIFY on the new (possibly incorrectly placed) root.
Repeat this "discarding" process until only one node (the smallest element) remains, and it is therefore in the correct place in the array.

Pseudocode of the Heapsort algorithm
Complexity:
BUILD-MAX-HEAP: O(n)
for loop: n − 1 iterations; exchanging elements: O(1); MAX-HEAPIFY: O(lg n)
Total time: O(n lg n)
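Putting the pieces together, the full algorithm can be sketched in Python. As before this is an illustrative rendering with 1-based indices (A[0] unused), not the slide's code.

```python
def max_heapify(A, i, n):
    # Iteratively sift A[i] down (1-based indices; A[0] unused).
    while True:
        l, r, largest = 2 * i, 2 * i + 1, i
        if l <= n and A[l] > A[largest]:
            largest = l
        if r <= n and A[r] > A[largest]:
            largest = r
        if largest == i:
            break
        A[i], A[largest] = A[largest], A[i]
        i = largest

def heapsort(A):
    n = len(A) - 1
    for i in range(n // 2, 0, -1):    # BUILD-MAX-HEAP: O(n)
        max_heapify(A, i, n)
    for i in range(n, 1, -1):         # n - 1 iterations
        A[1], A[i] = A[i], A[1]       # move the current max to its final slot
        max_heapify(A, 1, i - 1)      # restore the heap on the shrunken prefix

A = [None, 5, 13, 2, 25, 7, 17, 20, 8, 4]
heapsort(A)
print(A[1:])   # [2, 4, 5, 7, 8, 13, 17, 20, 25]
```

The sort is in place: apart from a few scalars, no memory beyond the input array is used.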

Priority Queues

Priority queue
Maintains a dynamic set S of elements; each set element has a key (an associated value).
A max-priority queue supports the dynamic-set operations:
INSERT(S, x): inserts element x into set S.
MAXIMUM(S): returns the element of S with the largest key.
EXTRACT-MAX(S): removes and returns the element of S with the largest key.
INCREASE-KEY(S, x, k): increases the value of element x's key to k. Assume k ≥ x's current key value.
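The four operations can be sketched on top of a binary max-heap. This is a minimal illustration (with a 0-based array, unlike the 1-based pseudocode in the lecture); the class and method names are mine, not from the slides.

```python
class MaxPriorityQueue:
    """Sketch of a max-priority queue backed by a binary max-heap (0-based)."""

    def __init__(self):
        self.A = []

    def maximum(self):                 # MAXIMUM(S): O(1)
        return self.A[0]

    def insert(self, key):             # INSERT(S, x): O(lg n)
        self.A.append(float('-inf'))   # new slot with key -infinity ...
        self.increase_key(len(self.A) - 1, key)   # ... then raise it to `key`

    def increase_key(self, i, key):    # INCREASE-KEY(S, x, k): O(lg n)
        assert key >= self.A[i], "new key must be >= current key"
        self.A[i] = key
        while i > 0 and self.A[(i - 1) // 2] < self.A[i]:
            p = (i - 1) // 2
            self.A[i], self.A[p] = self.A[p], self.A[i]   # bubble up
            i = p

    def extract_max(self):             # EXTRACT-MAX(S): O(lg n)
        A = self.A
        top = A[0]
        A[0] = A[-1]                   # move the last element to the root
        A.pop()
        i = 0
        while True:                    # sift the moved element down
            l, r, largest = 2 * i + 1, 2 * i + 2, i
            if l < len(A) and A[l] > A[largest]:
                largest = l
            if r < len(A) and A[r] > A[largest]:
                largest = r
            if largest == i:
                break
            A[i], A[largest] = A[largest], A[i]
            i = largest
        return top

pq = MaxPriorityQueue()
for k in [3, 8, 5]:
    pq.insert(k)
print(pq.maximum())        # 8
print(pq.extract_max())    # 8
print(pq.extract_max())    # 5
```

Note how INSERT is reduced to INCREASE-KEY on a freshly appended −∞ slot, mirroring the reduction used in the lecture's pseudocode.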

Priority queues 1

Priority queues 2

Priority queues 3

Quicksort

Highlights
Worst-case running time: Θ(n²).
Expected running time: Θ(n lg n); the constants hidden in Θ(n lg n) are small.
Sorts in place.

Divide and Conquer Strategy with Quicksort
Divide: Partition A[p..r] into two (possibly empty) subarrays A[p..q−1] and A[q+1..r], such that each element in the first subarray A[p..q−1] is ≤ A[q] and A[q] is ≤ each element in the second subarray A[q+1..r].
Conquer: Sort the two subarrays by recursive calls to QUICKSORT.
Combine: No work is needed to combine the subarrays, because they are sorted in place.

Pseudocode Partition

Pseudocode of the Quicksort algorithm
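The two pseudocode slides (not reproduced in this transcript) can be rendered together in Python. This is a sketch with 0-based indices, unlike the 1-based pseudocode; the example input is the textbook's running example.

```python
def partition(A, p, r):
    """PARTITION: A[r] is the pivot; returns its final index q."""
    pivot = A[r]
    i = p - 1                         # A[p..i] holds elements <= pivot
    for j in range(p, r):             # A[i+1..j-1] holds elements > pivot
        if A[j] <= pivot:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]   # put the pivot between the two regions
    return i + 1

def quicksort(A, p, r):
    if p < r:
        q = partition(A, p, r)   # divide
        quicksort(A, p, q - 1)   # conquer: left of the pivot
        quicksort(A, q + 1, r)   # conquer: right (no combine step needed)

A = [2, 8, 7, 1, 3, 5, 6, 4]
quicksort(A, 0, len(A) - 1)
print(A)   # [1, 2, 3, 4, 5, 6, 7, 8]
```

The comments on `i` and `j` restate the loop invariant discussed on the following slides.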

Creation of Partitions (Divide Step)
Loop invariant:
1. All entries in A[p..i] are ≤ pivot.
2. All entries in A[i+1..j−1] are > pivot.
3. A[r] = pivot.

Creation of Partitions (Divide Step)
Time for partitioning: Θ(n) to partition an n-element subarray.
(Figure: the partitioned array and the resulting left and right recursive calls.)

Correctness of Partition
Initialization: Before the loop starts, all conditions of the loop invariant are satisfied, because A[r] is the pivot and the subarrays A[p..i] and A[i+1..j−1] are empty.
Maintenance: While the loop is running, if A[j] ≤ pivot, then A[j] and A[i+1] are swapped and then i and j are incremented. If A[j] > pivot, then only j is incremented.
Termination: When the loop terminates, j = r, so all elements in A are partitioned into one of the three cases: A[p..i] ≤ pivot, A[i+1..r−1] > pivot, and A[r] = pivot.

Worst Case (intuitive)
Occurs when the subarrays are completely unbalanced: 0 elements in one subarray and n − 1 elements in the other.
This gives the recurrence
T(n) = T(n − 1) + T(0) + Θ(n) = T(n − 1) + Θ(n) = Θ(n²)

Best Case (intuitive)
Occurs when the subarrays are completely balanced every time: each subarray has ≤ n/2 elements.
This gives the recurrence
T(n) = 2T(n/2) + Θ(n) = Θ(n lg n)

Balanced Partitioning
Quicksort's average running time is much closer to the best case than to the worst case.
Imagine that PARTITION always produces a 9-to-1 split. This gives the recurrence
T(n) ≤ T(9n/10) + T(n/10) + Θ(n) = O(n lg n)

Balanced Partitioning (cont.)
(Recursion-tree figure: the tree has full levels down to depth log_{10} n, and its longest path has length log_{10/9} n; every level costs at most cn, so the total is O(n lg n).)

Average Case (Intuition)
Splits in the recursion tree will not always be constant; there will usually be a mix of good and bad splits throughout the recursion tree.
To see that this doesn't affect the asymptotic running time of quicksort, assume that levels alternate between best-case and worst-case splits.

Average Case (Intuition)
The extra level in the left-hand figure only adds to the constant hidden in the Θ-notation.
There are still the same number of subarrays to sort, and only twice as much work was done to get to that point.

Randomized version of quicksort
We have assumed that all input permutations are equally likely. This is not always true.
To correct this, we add randomization to quicksort.
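The randomization can be sketched as follows: pick the pivot uniformly at random from the current subarray, swap it into the last position, and reuse the ordinary PARTITION. This is an illustrative 0-based Python sketch, not the slide's code.

```python
import random

def partition(A, p, r):
    # Ordinary PARTITION with A[r] as the pivot.
    pivot, i = A[r], p - 1
    for j in range(p, r):
        if A[j] <= pivot:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def randomized_partition(A, p, r):
    k = random.randint(p, r)     # choose the pivot uniformly from A[p..r]
    A[k], A[r] = A[r], A[k]      # move it to position r, then partition as before
    return partition(A, p, r)

def randomized_quicksort(A, p, r):
    if p < r:
        q = randomized_partition(A, p, r)
        randomized_quicksort(A, p, q - 1)
        randomized_quicksort(A, q + 1, r)

random.seed(42)                  # fixed seed only to make the demo repeatable
A = list(range(20, 0, -1))       # sorted-descending input: worst case for a fixed pivot
randomized_quicksort(A, 0, len(A) - 1)
print(A)   # [1, 2, ..., 20]
```

With a random pivot, no fixed input is bad: the Θ(n lg n) expected time holds for every input, with the expectation taken over the algorithm's own coin flips rather than over the input distribution.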

Average Case (Detailed)
The dominant cost of the algorithm is partitioning.
PARTITION removes the pivot element from future consideration each time, so PARTITION is called at most n times.
The amount of work that each call to PARTITION does is a constant plus the number of comparisons performed in its for loop.
Let X = the total number of comparisons performed in all calls to PARTITION. Then the total work done over the entire execution is O(n + X).

Average Case – Number of Comparisons
For ease of analysis, rename the elements of A as z_1, z_2, ..., z_n, with z_i being the i-th smallest element.
Define the set Z_ij = {z_i, z_{i+1}, ..., z_j} to be the set of elements between z_i and z_j, inclusive.

Average Case – Number of Comparisons (cont.)
Important: Each pair of elements is compared at most once, because elements are compared only to the pivot element, and the pivot element never appears in any later call to PARTITION.
Let X_ij = I{z_i is compared to z_j} (an indicator random variable).

Average Case – Number of Comparisons (cont.)
Since each pair is compared at most once, the total number of comparisons performed by the algorithm is
X = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} X_ij

Average Case – Number of Comparisons (cont.)
Taking expectations of both sides, and using linearity of expectation and E[I{A}] = Pr{A}:
E[X] = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} E[X_ij] = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} Pr{z_i is compared to z_j}

Average Case – Number of Comparisons (cont.)
Probability that two elements are compared:
Once a pivot x is chosen such that z_i < x < z_j, then z_i and z_j will never be compared at any later time.
If either z_i or z_j is chosen before any other element of Z_ij, then it will be compared to all the elements of Z_ij, except itself.
The probability that z_i is compared to z_j is therefore the probability that either z_i or z_j is the first element of Z_ij chosen as a pivot.

Average Case – Number of Comparisons (cont.)
There are j − i + 1 elements in Z_ij, and pivots are chosen randomly and independently. Thus, the probability that any particular one of them is the first one chosen is 1/(j − i + 1).
Hence Pr{z_i is compared to z_j} = 2/(j − i + 1).

Average Case – Number of Comparisons (cont.)
E[X] = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} 2/(j − i + 1)
     = Σ_{i=1}^{n−1} Σ_{k=1}^{n−i} 2/(k + 1)     (simple transformation: k = j − i)
     < Σ_{i=1}^{n−1} Σ_{k=1}^{n} 2/k             (approximation)
     = Σ_{i=1}^{n−1} O(lg n)                     (harmonic series: Σ_{k=1}^{n} 1/k = ln n + O(1))
     = O(n lg n)
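The E[X] derivation above can be checked empirically by instrumenting randomized quicksort to count the comparisons in PARTITION's for loop. This experiment is my own sketch (the function name and the fixed seed are illustrative, not from the lecture); it confirms that the count lands near 2n ln n, well inside O(n lg n).

```python
import math
import random

def quicksort_count(A):
    """Randomized quicksort that counts comparisons against the pivot."""
    count = 0

    def sort(p, r):
        nonlocal count
        if p >= r:
            return
        k = random.randint(p, r)          # randomized pivot choice
        A[k], A[r] = A[r], A[k]
        pivot, i = A[r], p - 1
        for j in range(p, r):
            count += 1                    # one comparison per loop iteration
            if A[j] <= pivot:
                i += 1
                A[i], A[j] = A[j], A[i]
        A[i + 1], A[r] = A[r], A[i + 1]
        sort(p, i)                        # left of the pivot
        sort(i + 2, r)                    # right of the pivot

    sort(0, len(A) - 1)
    return count

random.seed(0)                            # fixed seed for a repeatable run
n = 1000
data = list(range(n))
random.shuffle(data)
c = quicksort_count(data)
print(c, "comparisons; 2 n ln n ≈", round(2 * n * math.log(n)))
```

For n = 1000 the count should sit close to 2n ln n ≈ 13,800, far below the n(n − 1)/2 ≈ 500,000 comparisons of the worst case.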