1 Lecture 16: Binary Heaps
CpSc 212: Algorithms and Data Structures
Brian C. Dean
School of Computing, Clemson University, Fall 2012

2 Data Structure Review

Abstract Data Types (specifications):
- Sequences
- Stacks
- Queues
- Sets
- Maps
- Priority Queues
- "Databases" (i.e., multi-dimensional point sets)

"Concrete" implementations:
- Arrays
- Linked Lists
- Hash Tables
- Binary Search Trees (many types of balancing mechanisms, B-trees, skip lists)
- Binary Heaps
- kd-Trees, Suffix Trees, …

3 Priority Queues

In a simple FIFO queue, elements exit in the same order as they enter. In a priority queue, the element with the highest priority (usually defined as having the lowest key) is always the first to exit. Many uses:
- Scheduling: manage a set of tasks, always performing the highest-priority task next.
- Sorting: insert n elements into a priority queue and they will emerge in sorted order.
- Complex algorithms: for example, Dijkstra's shortest path algorithm is built on top of a priority queue.
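The sorting use above can be sketched with Python's standard-library `heapq` (used here purely for illustration; the lecture builds its own heap later):

```python
import heapq

def pq_sort(items):
    # n inserts followed by n remove-mins, as on the slide:
    pq = []
    for x in items:
        heapq.heappush(pq, x)    # insert(x) with key = x
    return [heapq.heappop(pq) for _ in range(len(pq))]  # n remove-mins

print(pq_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```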

4 Priority Queues

All priority queues support:
- Insert(e, k): insert a new element e with key k.
- Remove-Min: remove and return the element with minimum key.

In practice (mostly due to Dijkstra's algorithm), many also support:
- Decrease-Key(e, Δk): given a pointer to element e within the heap, reduce e's key by Δk.

Some priority queues also support:
- Increase-Key(e, Δk): increase e's key by Δk.
- Delete(e): remove e from the structure.
- Find-Min: return a pointer to the element with minimum key.

5 Redundancies Among Operations

- Given insert and delete, we can implement increase-key and decrease-key.
- Given decrease-key and remove-min, we can implement delete.
- Given find-min and delete, we can implement remove-min.
- Given insert and remove-min, we can implement find-min.
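One of these redundancies can be sketched concretely: find-min built from remove-min followed by insert. The `UnsortedPQ` class here is a hypothetical throwaway priority queue, used only so the sketch runs; it is not from the lecture.

```python
class UnsortedPQ:
    # Minimal illustrative priority queue (hypothetical, not from the slides).
    def __init__(self):
        self.items = []                   # list of (key, element) pairs
    def insert(self, e, k):
        self.items.append((k, e))
    def remove_min(self):                 # O(n) scan for the minimum key
        i = min(range(len(self.items)), key=lambda j: self.items[j][0])
        return self.items.pop(i)

def find_min(pq):
    # find-min = remove-min, then insert the element right back.
    k, e = pq.remove_min()
    pq.insert(e, k)
    return e

pq = UnsortedPQ()
for k in [4, 1, 3]:
    pq.insert(str(k), k)
print(find_min(pq))  # '1'
```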

6 Priority Queue Implementations

There are many simple ways to implement the abstract notion of a priority queue as a concrete data structure:

                                 insert     remove-min
  Unsorted array or linked list  O(1)       O(n)
  Sorted array or linked list    O(n)       O(1)
  Binary heap                    O(log n)   O(log n)
  Balanced binary search tree    O(log n)   O(log n)

7 The Binary Heap

- An almost-complete binary tree (all levels full except the last, which is filled from the left side up to some point).
- Satisfies the heap property: for every element e, key(parent(e)) ≤ key(e).
- The minimum element always resides at the root.
- Physically stored in an array A[0..n-1]. Easy to move around the array in a treelike fashion:
  - Parent(i) = floor((i-1)/2)
  - Left-child(i) = 2i + 1
  - Right-child(i) = 2i + 2

[Figure: the same heap drawn both ways — the mental picture as a tree, and the actual array representation in memory, A = 2 5 3 9 8 6 5 13 14 10.]
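The index arithmetic above can be checked directly on the slide's example array (a sketch in Python, used here as an illustration language):

```python
def parent(i): return (i - 1) // 2
def left(i):   return 2 * i + 1
def right(i):  return 2 * i + 2

# The example heap from the slide, laid out in A[0..n-1]:
A = [2, 5, 3, 9, 8, 6, 5, 13, 14, 10]

# Every non-root element satisfies key(parent(e)) <= key(e):
assert all(A[parent(i)] <= A[i] for i in range(1, len(A)))
print(parent(4), left(4), right(4))  # 1 9 10
```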

8 Heap Operations: sift-up and sift-down

All binary heap operations are built from the two fundamental operations sift-up and sift-down:
- sift-up(i): repeatedly swap element A[i] with its parent as long as A[i] violates the heap property with respect to its parent (i.e., as long as A[i] < A[parent(i)]).
- sift-down(i): as long as A[i] violates the heap property with respect to one of its children, swap A[i] with its smallest child.

Both operations run in O(log n) time, since the height of an n-element heap is O(log n). In some other places, sift-down is called heapify, and sift-up is known as up-heap.
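A minimal sketch of both operations on a plain Python list, following the slide's definitions (0-indexed, with the parent/child formulas from slide 7):

```python
def sift_up(A, i):
    # Swap A[i] with its parent while it violates the heap property.
    while i > 0 and A[i] < A[(i - 1) // 2]:
        A[i], A[(i - 1) // 2] = A[(i - 1) // 2], A[i]
        i = (i - 1) // 2

def sift_down(A, i, n=None):
    # While A[i] violates the heap property with a child, swap A[i]
    # with its smallest child.
    n = len(A) if n is None else n
    while True:
        smallest = i
        for c in (2 * i + 1, 2 * i + 2):
            if c < n and A[c] < A[smallest]:
                smallest = c
        if smallest == i:
            break
        A[i], A[smallest] = A[smallest], A[i]
        i = smallest

A = [2, 5, 3, 9, 8, 6, 5, 13, 14, 1]  # the 1 at index 9 violates the property
sift_up(A, 9)
print(A[0])  # 1
```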

9 Implementing Heap Operations Using sift-up and sift-down

The remaining operations are now easy to implement in terms of sift-up and sift-down (with n elements stored in A[0..n-1], as on slide 7):
- insert: place the new element in A[n], then sift-up(n).
- remove-min: swap A[0] and A[n-1], remove the last element, then sift-down(0).
- decrease-key(i, Δk): decrease A[i] by Δk, then sift-up(i).
- increase-key(i, Δk): increase A[i] by Δk, then sift-down(i).
- delete(i): swap A[i] with A[n-1], remove the last element, then sift-up(i) and sift-down(i).

All of these clearly run in O(log n) time. General idea: modify the heap, then fix any violation of the heap property with one or two calls to sift-up or sift-down.
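The operations above can be sketched as a small class (an illustration in Python, assuming keys double as elements for simplicity):

```python
class MinHeap:
    # A sketch of the slide's operations, 0-indexed as in A[0..n-1].
    def __init__(self):
        self.A = []

    def _sift_up(self, i):
        A = self.A
        while i > 0 and A[i] < A[(i - 1) // 2]:
            A[i], A[(i - 1) // 2] = A[(i - 1) // 2], A[i]
            i = (i - 1) // 2

    def _sift_down(self, i):
        A, n = self.A, len(self.A)
        while True:
            s = i
            for c in (2 * i + 1, 2 * i + 2):
                if c < n and A[c] < A[s]:
                    s = c
            if s == i:
                break
            A[i], A[s] = A[s], A[i]
            i = s

    def insert(self, k):
        self.A.append(k)                 # place new element at the end...
        self._sift_up(len(self.A) - 1)   # ...then sift it up

    def remove_min(self):
        A = self.A
        A[0], A[-1] = A[-1], A[0]        # swap min with the last element
        m = A.pop()
        if A:
            self._sift_down(0)           # fix the one possible violation
        return m

    def decrease_key(self, i, dk):
        self.A[i] -= dk
        self._sift_up(i)

h = MinHeap()
for k in [5, 2, 9, 1]:
    h.insert(k)
print(h.remove_min(), h.remove_min())  # 1 2
```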

10 Caveat: You Can't Easily Find Elements in Heaps (Except the Min)

Each record in the data structure keeps a pointer to the physical element of data it represents, and each element of data maintains a pointer to its corresponding record in the data structure.

[Figure: the actual elements of data (say, stored in an array) cross-linked by pointers with their corresponding records in the heap.]
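One way to sketch this two-way pointer bookkeeping in Python: a dictionary plays the role of each element's pointer back to its record (here, its current array index), and it must be updated on every swap. The class and its names are illustrative assumptions, not the lecture's code.

```python
class IndexedMinHeap:
    # Sketch: heap of (key, element) pairs; pos[element] tracks where
    # each element currently sits, so decrease-key can find it in O(1).
    def __init__(self):
        self.A = []        # list of (key, element) records
        self.pos = {}      # element -> current index in A (back-pointer)

    def _swap(self, i, j):
        A, pos = self.A, self.pos
        A[i], A[j] = A[j], A[i]
        pos[A[i][1]], pos[A[j][1]] = i, j   # keep back-pointers current

    def _sift_up(self, i):
        while i > 0 and self.A[i][0] < self.A[(i - 1) // 2][0]:
            self._swap(i, (i - 1) // 2)
            i = (i - 1) // 2

    def insert(self, e, k):
        self.A.append((k, e))
        self.pos[e] = len(self.A) - 1
        self._sift_up(len(self.A) - 1)

    def decrease_key(self, e, dk):
        i = self.pos[e]                     # found via the back-pointer
        self.A[i] = (self.A[i][0] - dk, e)
        self._sift_up(i)

h = IndexedMinHeap()
for e, k in [("a", 5), ("b", 2), ("c", 9)]:
    h.insert(e, k)
h.decrease_key("c", 8)   # c's key: 9 -> 1, now the minimum
print(h.A[0])  # (1, 'c')
```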

11 Building a Binary Heap

We could build a binary heap in O(n log n) time using n successive calls to insert. Another way to build a heap: start with our n elements in arbitrary order in A[0..n-1], then call sift-down(i) for each i from n-1 down to 0.
- Remarkable fact #1: this builds a valid heap!
- Remarkable fact #2: this runs in only O(n) time!
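The bottom-up construction is short enough to sketch in full (Python used as an illustration language):

```python
def sift_down(A, i, n):
    # Standard sift-down on A[0..n-1], as defined on slide 8.
    while True:
        s = i
        for c in (2 * i + 1, 2 * i + 2):
            if c < n and A[c] < A[s]:
                s = c
        if s == i:
            break
        A[i], A[s] = A[s], A[i]
        i = s

def build_heap(A):
    # Bottom-up construction: sift-down each index from n-1 down to 0.
    for i in range(len(A) - 1, -1, -1):
        sift_down(A, i, len(A))

A = [9, 5, 13, 2, 8, 6, 3]
build_heap(A)
# The heap property now holds everywhere:
assert all(A[(i - 1) // 2] <= A[i] for i in range(1, len(A)))
print(A[0])  # 2
```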

12 Bottom-Up Heap Construction

The key property of sift-down is that it fixes an isolated violation of the heap property at the root: if both subtrees of the root are already valid heaps, then after calling sift-down on the root, the entire tree is a valid heap. Using induction, it is now easy to prove that our "bottom-up" construction yields a valid heap.

[Figure: a root with possible heap-property violations sitting above two valid-heap subtrees; after sift-down, the whole tree is a valid heap.]

13 Bottom-Up Heap Construction

To analyze the running time of bottom-up construction, note that:
- At most n elements reside in the bottom level of the heap, and only 1 unit of work is done to each of them by sift-down.
- At most n/2 elements reside in the 2nd-lowest level, and at most 2 units of work are done to each of them.
- At most n/4 elements reside in the 3rd-lowest level, and at most 3 units of work are done to each of them.

So the total time satisfies T ≤ n + 2(n/2) + 3(n/4) + 4(n/8) + … (for simplicity, we carry the sum out to infinity, as this will certainly give us an upper bound).

Claim: T = 4n = O(n).

14 "Shifting" Technique for Sums

    T   = n + 2(n/2) + 3(n/4) + 4(n/8) + …
  − T/2 =      (n/2) + 2(n/4) + 3(n/8) + …
  -----------------------------------------
    T/2 = n +  (n/2) +  (n/4) +  (n/8) + …

Applying the same trick again:

    T   = 2n + n + (n/2) + (n/4) + …
  − T/2 =      n + (n/2) + (n/4) + …
  ----------------------------------
    T/2 = 2n

so T = 4n.
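As a quick numerical sanity check of the shifting argument (not from the slides), truncating the series far enough out lands essentially on 4n:

```python
# T = n + 2(n/2) + 3(n/4) + 4(n/8) + ... = sum over j >= 1 of j*n/2^(j-1).
# Truncated at j = 59, the tail is negligible.
n = 1000.0
T = sum(j * n / 2 ** (j - 1) for j in range(1, 60))
print(round(T, 6))  # 4000.0, i.e. 4n
```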

15 Heapsort

Any priority queue can be used to sort: just use n inserts followed by n remove-mins. The binary heap gives us a particularly nice way to sort in O(n log n) time, known as heapsort:
- Start with an array A[0..n-1] of elements to sort.
- Build a heap (bottom-up) on A in O(n) time.
- Call remove-min n times.
- Afterwards, A will end up reverse-sorted (it would be forward-sorted if we had started with a "max" heap).

Heapsort compares favorably to:
- Merge sort, because heapsort runs in place.
- Randomized quicksort, because heapsort is deterministic.
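The steps above can be sketched as an in-place heapsort. Following the slide, a min-heap is used, with each remove-min swapping the minimum to the end of the shrinking heap, so the array ends up reverse-sorted:

```python
def sift_down(A, i, n):
    while True:
        s = i
        for c in (2 * i + 1, 2 * i + 2):
            if c < n and A[c] < A[s]:
                s = c
        if s == i:
            break
        A[i], A[s] = A[s], A[i]
        i = s

def heapsort(A):
    n = len(A)
    for i in range(n - 1, -1, -1):   # bottom-up build: O(n)
        sift_down(A, i, n)
    for end in range(n - 1, 0, -1):  # n remove-mins
        A[0], A[end] = A[end], A[0]  # min goes to the end of the heap...
        sift_down(A, 0, end)         # ...and the shrunken heap is fixed
    # With a min-heap, A ends up reverse-sorted, as the slide notes.

A = [5, 2, 9, 1, 7]
heapsort(A)
print(A)  # [9, 7, 5, 2, 1]
```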

16 Treaps

A treap is a binary tree in which each node contains two keys, a "heap key" and a "BST key". It satisfies the heap property with respect to the heap keys, and the BST property with respect to the BST keys. The BST keys will store the elements of our BST; we'll choose heap keys as necessary to assist in balancing.

[Figure: a treap in which each node shows both its heap key and its BST key.]

17 Treaps

- If heap keys are all distinct, then there is only one valid "shape" for a treap. (Why?)
- If we choose heap keys at random, then our treap will be randomly structured!

What about insert and delete?
- insert: insert the new element as a leaf using the standard BST insertion procedure (so the BST property is satisfied), and assign it a random heap key. Then restore the heap property (while preserving the BST property) using sift-up implemented with rotations.
- delete: give the element a heap key of +∞, sift it down (again using rotations, to preserve the BST property) to a leaf, then delete it.
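The insert procedure can be sketched recursively: rotations on the way back up play the role of sift-up steps. This is a minimal illustration under the assumptions that random floats serve as the heap keys and that duplicates go right; it is not the lecture's code.

```python
import random

class Node:
    def __init__(self, bst_key):
        self.bst_key = bst_key
        self.heap_key = random.random()  # random heap key for balancing
        self.left = self.right = None

def rotate_right(y):
    x = y.left
    y.left, x.right = x.right, y
    return x

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    return y

def insert(root, bst_key):
    # Standard BST insertion; on the way back up, each rotation is one
    # step of sift-up, restoring the heap property on the heap keys
    # while preserving the BST property on the BST keys.
    if root is None:
        return Node(bst_key)
    if bst_key < root.bst_key:
        root.left = insert(root.left, bst_key)
        if root.left.heap_key < root.heap_key:
            root = rotate_right(root)
    else:
        root.right = insert(root.right, bst_key)
        if root.right.heap_key < root.heap_key:
            root = rotate_left(root)
    return root

def inorder(t):
    return inorder(t.left) + [t.bst_key] + inorder(t.right) if t else []

def heap_ok(t):
    return (t is None or
            ((t.left is None or t.heap_key <= t.left.heap_key) and
             (t.right is None or t.heap_key <= t.right.heap_key) and
             heap_ok(t.left) and heap_ok(t.right)))

root = None
for k in [5, 2, 9, 1, 7]:
    root = insert(root, k)
print(inorder(root))  # [1, 2, 5, 7, 9] -- the BST property holds
assert heap_ok(root)  # and so does the heap property
```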

18 Mergeable Heaps: Motivation

A mergeable (meldable) heap supports all fundamental priority queue operations as well as:
- merge(h1, h2): merge heaps h1 and h2 into a single heap, destroying h1 and h2 in the process.

The binary heap doesn't seem to support merge any faster than in O(n) time. Why study mergeable heaps?
- They illustrate elegant data structure design techniques.
- Mergeable heaps are the first step along the road to more powerful priority queues, such as binomial heaps, pairing heaps, skew heaps, leftist heaps, and Fibonacci heaps.

19 Heap-Ordered Trees

Suppose we store our priority queue in a heap-ordered binary tree.
- Not necessarily an "almost complete" tree, so we can't encode it succinctly in a single array as with the binary heap.
- Each node maintains a pointer to its parent, left child, and right child.
- The tree is not necessarily "balanced"; it could conceivably be nothing more than a single sorted path.

[Figure: a heap-ordered binary tree.]

20 All You Need is Merge…

Suppose we can merge two heap-ordered trees in O(log n) time. All priority queue operations are now easy to implement in O(log n) time!
- insert: merge with a new 1-element tree.
- remove-min: remove the root, then merge its left and right subtrees.
- decrease-key(e): detach e's subtree, decrease e's key, then merge it back into the main tree.
- delete: use decrease-key + remove-min.
- increase-key: use delete + insert.

21 Recursively Merging Two Heap-Ordered Trees

Take two heap-ordered trees h1 and h2, where h1 has the smaller root. Clearly, h1's root must become the root of the merged tree. To complete the merge, recursively merge h2 into either the left or right subtree of h1. As a base case, the process ends when we merge a heap h1 with an empty heap, the result being just h1.

[Figure: h1's root (the smaller) stays on top; h2 is recursively merged into either the L or R subtree of h1.]
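A sketch of the recursive merge, plus insert and remove-min built on top of it as in slide 20. The slide leaves open which subtree to recurse into; choosing at random is one reasonable option (as in randomized meldable heaps) and is an assumption here, not something the slide specifies:

```python
import random

class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None

def merge(h1, h2):
    # Base case: merging with an empty heap returns the other heap.
    if h1 is None:
        return h2
    if h2 is None:
        return h1
    if h2.key < h1.key:
        h1, h2 = h2, h1                 # ensure h1 has the smaller root
    if random.random() < 0.5:           # recurse into a random subtree
        h1.left = merge(h1.left, h2)
    else:
        h1.right = merge(h1.right, h2)
    return h1

def insert(h, k):
    return merge(h, Node(k))            # merge with a new 1-element tree

def remove_min(h):
    return h.key, merge(h.left, h.right)  # remove root, merge subtrees

h = None
for k in [5, 2, 9, 1, 7]:
    h = insert(h, k)
out = []
while h is not None:
    m, h = remove_min(h)
    out.append(m)
print(out)  # [1, 2, 5, 7, 9]
```

Regardless of the random choices, the minimum is always at the root of a heap-ordered tree, so the elements emerge in sorted order.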

