CSE 373 Data Structures and Algorithms Lecture 15: Priority Queues (Heaps) III

Generic Collections

Generics and arrays

public class Foo<T> {
    private T myField;           // ok

    public void method1(T param) {
        myField = new T();       // error
        T[] a = new T[10];       // error
    }
}

You cannot create objects or arrays of a parameterized type. Why not?

Generics/arrays, fixed

public class Foo<T> {
    private T myField;                       // ok

    public void method1(T param) {
        myField = param;                     // ok
        T[] a2 = (T[]) (new Object[10]);     // ok
    }
}

But you can declare variables of that type, accept them as parameters, return them, or create arrays by casting Object[].
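Not from the original slides, but as a hedged illustration of where this Object[]-cast idiom typically shows up, here is a minimal sketch of a generic array-backed stack; the class and method names are invented for the example:

// Hypothetical example: a generic array-backed stack that uses the
// Object[] cast because "new T[...]" is not allowed in Java.
public class ArrayStack<T> {
    private T[] elements;
    private int size;

    @SuppressWarnings("unchecked")
    public ArrayStack(int capacity) {
        // Cast an Object[] to T[]; the compiler warns, but this is the
        // standard workaround (java.util.ArrayList does the same thing).
        elements = (T[]) new Object[capacity];
    }

    public void push(T value) {
        elements[size++] = value;
    }

    public T pop() {
        T top = elements[--size];
        elements[size] = null;   // let the garbage collector reclaim it
        return top;
    }
}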

The compareTo method

The standard way for a Java class to define a comparison function for its objects is to define a compareTo method. Example: in the String class, there is a method:

    public int compareTo(String other)

A call of A.compareTo(B) will return:
- a value < 0 if A comes "before" B
- a value > 0 if A comes "after" B
- 0 if A and B are "equal"

Comparable

public interface Comparable<E> {
    public int compareTo(E other);
}

A class can implement the Comparable interface to define a natural ordering function for its objects. A call to the compareTo method should return:
- a value < 0 if the other object comes "before" this one
- a value > 0 if the other object comes "after" this one
- 0 if the other object is considered "equal" to this one

Comparable template

public class name implements Comparable<name> {
    ...

    public int compareTo(name other) {
        ...
    }
}

Exercise: Add a compareTo method to the PrintJob class such that PrintJobs are ordered according to their priority (ascending: lower priority values are more important than higher ones).

Comparable example

public class PrintJob implements Comparable<PrintJob> {
    private String user;
    private int number;
    private int priority;

    public PrintJob(int number, String user, int priority) {
        this.number = number;
        this.user = user;
        this.priority = priority;
    }

    public int compareTo(PrintJob otherJob) {
        return priority - otherJob.priority;
    }

    public String toString() {
        return number + " (" + user + "):" + priority;
    }
}
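Not part of the original slides, but as a hedged usage sketch: PrintJob's natural ordering is exactly what java.util.PriorityQueue uses, so jobs come out lowest priority value first (the class name PrintJobDemo is invented for illustration):

import java.util.PriorityQueue;

public class PrintJobDemo {
    public static void main(String[] args) {
        PriorityQueue<PrintJob> queue = new PriorityQueue<>();
        queue.add(new PrintJob(101, "alice", 5));
        queue.add(new PrintJob(102, "bob", 1));
        queue.add(new PrintJob(103, "carol", 3));

        // Jobs come out in ascending priority order: bob, carol, alice.
        while (!queue.isEmpty()) {
            System.out.println(queue.remove());
        }
    }
}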

d-Heaps

Generalization: d-heaps
- Each node has d children.
- The heap can still be represented by an array.
- Good choices for d are powers of 2, only because multiplying and dividing by powers of 2 is fast on a computer.
- How does the height compare to a binary heap? A d-heap reduces the height of the tree to about log_d n, so insert is faster.

[Figure: example d-heap containing the values 1 through 12]
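For concreteness, here is a minimal sketch (not from the slides) of the index arithmetic for a d-ary heap stored in an array with the root at index 0; the class and method names are invented, and when d is a power of 2 the multiply and divide compile down to bit shifts:

public class DHeapIndexing {
    // Index of the parent of the node at index i in a d-ary heap (root at 0).
    static int parent(int i, int d) {
        return (i - 1) / d;
    }

    // Index of the k-th child (0 <= k < d) of the node at index i.
    static int child(int i, int d, int k) {
        return d * i + k + 1;
    }

    public static void main(String[] args) {
        int d = 4;
        // The children of node 0 are nodes 1..4; the parent of node 7 is node 1.
        System.out.println(child(0, d, 0) + ".." + child(0, d, d - 1));  // 1..4
        System.out.println(parent(7, d));                                // 1
    }
}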

Operations on a d-heap
- insert: runtime = Θ(log_d n), because the depth of the tree decreases.
- remove: runtime = Θ(d log_d n), because bubbleDown must compare up to d children at each level to find the minimum.
- Does this help insert or remove more? It helps insert; remove does more work per level.

Other Priority Queue Operations

More min-heap operations
- decreasePriority: reduce the priority value of an element in the queue. Solution: change the priority and percolateUp.
- increasePriority: increase the priority value of an element in the queue. Solution: change the priority and percolateDown.
- How do we find the element in the queue? What about duplicates? We need a reference to the element: it is easy to delete the minimum, but a heap gives no efficient way to locate an arbitrary element. That lookup problem is where the Dictionary ADT, which we will study soon, comes in.
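A hedged sketch (not the course's actual implementation) of how these two operations might look inside an array-based min-heap; the fields heap and size and the helpers percolateUp/percolateDown are assumed and not shown:

// Assumed fields/helpers of a min-heap class: int[] heap, int size,
// void percolateUp(int index), void percolateDown(int index).
// The caller supplies the element's current index in the array, since a
// heap cannot find an arbitrary element efficiently.

void decreasePriority(int index, int newPriority) {
    // assumes newPriority <= heap[index]
    heap[index] = newPriority;
    percolateUp(index);      // a smaller key may need to rise toward the root
}

void increasePriority(int index, int newPriority) {
    // assumes newPriority >= heap[index]
    heap[index] = newPriority;
    percolateDown(index);    // a larger key may need to sink toward the leaves
}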

More min-heap operations
- remove: given a reference to an object in the queue, remove that object. Solution: set its priority to negative infinity, percolate it up to the root, and then deleteMin.
- findMax: possible by looking at all the leaves, but not really the point of a min-heap!
- buildHeap: just calling insert N times is Θ(N log N) in the worst case and Θ(N) on average. Can we get guaranteed linear time? It turns out we can, and the need for a separate buildHeap operation rests purely on the fact that we can do it faster than the naive insert-N-times approach.

Building a heap
Given a list of numbers, how would you build a heap? At every point, the new item may need to percolate all the way through the heap, so adding the items one at a time is Θ(n log n) in the worst case. A more sophisticated algorithm does it in Θ(n).

O(n) buildHeap
- First, add all elements arbitrarily, maintaining the completeness property.
- Then fix the heap-order property by performing a "bubble down" operation on every node that is not a leaf, starting from the rightmost internal node and working back to the root.

[Figure: example complete tree; the red (internal) nodes are the ones that may need to percolate down]

The idea: pretend it is a heap already and just fix the heap-order property. Question: which nodes MIGHT be out of order in any heap?
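A minimal sketch of this buildHeap idea in Java, assuming the course's usual array layout with the root at index 1 (index 0 unused) and an existing percolateDown helper; both are assumptions, not code from the slides:

// Assumed fields/helpers: int size (number of elements),
// void percolateDown(int index) for a min-heap with the root at index 1.
void buildHeap() {
    // The last internal node is at index size / 2. Bubble down every
    // internal node, working backward from it to the root.
    for (int i = size / 2; i >= 1; i--) {
        percolateDown(i);
    }
}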

buildHeap practice problem
Each element in the list [12, 5, 11, 3, 10, 6, 9, 4, 8, 1, 7, 2] has been inserted into a heap such that the completeness property has been maintained. Now, fix the heap's order property by "bubbling down" every internal node.

[Figure: the resulting complete tree, with 12 at the root; the red (internal) nodes still need to percolate down]

[Figure: intermediate tree states as each internal node is bubbled down, working from the last internal node back toward the root; in the final snapshot only the root (12) remains out of place]

Final state of the heap

[Figure: the final min-heap, in level order: 1 3 2 4 5 6 9 12 8 10 7 11]

How long does this take? The runtime is bounded by the sum of the heights of all the nodes, which is linear: nodes just above the leaves might move 1 step, nodes at height 2 might move 2 steps, and so on, but most nodes move only a small number of steps, so the total is O(n). The full sum is the sum over i = 0 to h of (h - i) * 2^i, which is O(2^h) = O(n). See the text, Theorem 6.1, p. 194, for a detailed proof.
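For completeness, here is one way to evaluate that sum (a sketch, not the textbook's exact proof), written in LaTeX:

\begin{align*}
\sum_{i=0}^{h} (h-i)\,2^{i}
  &= \sum_{j=0}^{h} j\,2^{\,h-j}            && \text{substituting } j = h-i \\
  &= 2^{h} \sum_{j=0}^{h} \frac{j}{2^{j}}   \\
  &\le 2^{h} \cdot 2                        && \text{since } \sum_{j \ge 0} j/2^{j} = 2 \\
  &= O(2^{h}) = O(n)                        && \text{because a complete tree of height } h \text{ has } n \ge 2^{h} \text{ nodes.}
\end{align*}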

Different heaps
Successive inserts, Θ(n log n), and buildHeap, Θ(n), produce slightly different heaps from the same input.

[Figure: the two resulting heaps side by side]

But it doesn't matter, because they are both valid heaps.

Heap Sort

Heap sort
heap sort: an algorithm to sort an array of N elements by turning the array into a heap, then doing a remove N times. The elements come out in sorted order! What is the runtime? This algorithm is not very space-efficient. Why not?
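One hedged way to write this version in Java (not code from the slides) is to lean on java.util.PriorityQueue, which makes the extra storage explicit; the class name is invented for illustration:

import java.util.PriorityQueue;

public class SimpleHeapSort {
    // Sorts the array using a separate min-heap: O(n log n) time,
    // but O(n) extra space for the heap -- hence "not very space-efficient".
    public static void heapSort(int[] a) {
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        for (int value : a) {
            heap.add(value);
        }
        for (int i = 0; i < a.length; i++) {
            a[i] = heap.remove();   // elements come out in sorted order
        }
    }
}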

Improved heap sort
- The heap sort shown requires a second array.
- We can use a max-heap to implement an improved version of heap sort that needs no extra storage.
- Useful on low-memory devices.
- Still only O(n log n) runtime.
- Elegant.

Improved heap sort (1)
Use an array heap, but with 0 as the root index. [Figure: max-heap state after the buildHeap operation]

Improved heap sort (2)
[Figure: state after one remove operation] The remove is modified so that it moves the removed (maximum) element to the end of the array rather than discarding it.

Improved heap sort (3)
[Figure: state after two remove operations] Notice that the largest elements are at the end of the array (becoming sorted!).
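A hedged, self-contained sketch of this in-place version (not the course's exact code): build a max-heap over the array with the root at index 0, then repeatedly swap the maximum to the end and shrink the heap; the class name is invented for illustration:

public class InPlaceHeapSort {
    public static void heapSort(int[] a) {
        // buildHeap: bubble down every internal node, right to left.
        for (int i = a.length / 2 - 1; i >= 0; i--) {
            percolateDown(a, i, a.length);
        }
        // Repeatedly move the max (root) to the end of the shrinking heap.
        for (int end = a.length - 1; end > 0; end--) {
            swap(a, 0, end);
            percolateDown(a, 0, end);
        }
    }

    // Sinks a[i] within a max-heap of the given size (root at index 0).
    private static void percolateDown(int[] a, int i, int size) {
        while (2 * i + 1 < size) {
            int child = 2 * i + 1;
            if (child + 1 < size && a[child + 1] > a[child]) {
                child++;               // pick the larger child
            }
            if (a[i] >= a[child]) {
                break;                 // heap order restored
            }
            swap(a, i, child);
            i = child;
        }
    }

    private static void swap(int[] a, int i, int j) {
        int tmp = a[i];
        a[i] = a[j];
        a[j] = tmp;
    }
}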

Sorting algorithms review

Algorithm        Best case    Average case (†)   Worst case
Bubble sort      n            n²                 n²
Selection sort   n²           n²                 n²
Insertion sort   n            n²                 n²
Mergesort        n log₂ n     n log₂ n           n log₂ n
Heapsort         n log₂ n     n log₂ n           n log₂ n
Quicksort        n log₂ n     n log₂ n           n²

† According to Knuth, the average growth rate of Insertion sort is about 0.9 times that of Selection sort and about 0.4 times that of Bubble sort. The average growth rate of Quicksort is about 0.74 times that of Mergesort and about 0.5 times that of Heapsort.