CSE 326: Data Structures Lecture #3 Mind your Priority Qs


1 CSE 326: Data Structures Lecture #3 Mind your Priority Qs
Bart Niswonger, Summer Quarter 2001

Raise your hand if you can read. Raise your other hand if you can write. Stand up if you can think. WHOOP if you can talk. Point: this classroom is not sacred ground, and I am not a fearsome taskmaster. Please take part in this class; the most horrible thing in the world for an instructor is unresponsive students. Don't be my nightmare!

2 Today’s Outline
The First Quiz! Things Bart Didn’t Get to on Wednesday (Priority Queues & Heaps). Extra heap operations. d-Heaps.

OK, back to computer science. Today we’ll cover everything I missed on Wednesday; that’s heaps of stuff. Then we'll move on to some extra heap operations and d-heaps. I’ll hand the quizzes back right now, however. You MUST hand in your score sheet along with your homework!

3 Back to Queues
Some applications: ordering CPU jobs, simulating events, picking the next search site. Problems? Short jobs should go first; earliest (simulated time) events should go first; the most promising sites should be searched first.

Alright, now we all remember what a tree is. Let’s harken back to queues now. Here are some possible applications of a queue which aren’t quite right: all of these want to favor certain entries in the queue. How do we handle that?

4 Priority Queue ADT
Remember ADTs? Priority Queue operations: create, destroy, insert, deleteMin, is_empty. Priority Queue property: for two elements in the queue, x and y, if x has a lower priority value than y, x will be deleted before y.

[Figure: a queue holding A(4), B(6), D(100), E(5), F(7); G(9) arrives via insert while C(3) leaves via deleteMin.]

We need a new ADT for this: one that returns the best node next (which we’ll generally define as the one with the lowest priority value). I’m often going to represent priority queues as collections of priorities; remember that they are really collections of data with priorities. Moreover, some priority queues don’t even need numeric priorities, just the ability to compare priorities (all of the ones we’ll talk about fall in this category).
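As a concrete sketch (my addition, not part of the slides), the ADT's insert/deleteMin pair maps directly onto the C++ standard library's `std::priority_queue`, flipped into a min-queue with `std::greater`:

```cpp
#include <functional>
#include <queue>
#include <vector>

// std::priority_queue is a max-queue by default; std::greater flips the
// comparison so pop() behaves like the slides' deleteMin.
using MinPQ = std::priority_queue<int, std::vector<int>, std::greater<int>>;

// The ADT operations, expressed in the slides' vocabulary.
void insert(MinPQ& pq, int priority) { pq.push(priority); }

int deleteMin(MinPQ& pq) {
    int smallest = pq.top();  // the PQ property: lowest value comes out first
    pq.pop();
    return smallest;
}
```

Note the elements here are bare priorities, matching the slide's simplification; a real application would store (priority, data) pairs.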

5 Applications of the Priority Q
Hold jobs for a printer in order of length Store packets on network routers in order of urgency Simulate events Select symbols for compression Sort numbers Anything greedy There are, of course, gobs of applications for priority queues. Here are just a few.

6 Naïve Priority Q Data Structures
Unsorted list: insert: O(1), deleteMin: O(n). Sorted list: insert: O(n), deleteMin: O(1).

OK, let’s look at some ways we might implement priority Qs. Unsorted lists give us fast inserts, but we have to search the whole list to delete. Sorted lists give us fast deletes (it’s just the first element!), but we have to either search the whole list (for a linked list) or shift the whole list (for an array) on an insert.
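A minimal sketch of the unsorted-list option (my code, not the slides'): insert appends in O(1), while deleteMin has to scan the whole list for the minimum.

```cpp
#include <algorithm>
#include <vector>

// The "unsorted list" naive priority queue from the slide.
struct UnsortedPQ {
    std::vector<int> items;

    void insert(int x) { items.push_back(x); }   // O(1): just append

    int deleteMin() {                            // O(n): scan for the minimum
        auto it = std::min_element(items.begin(), items.end());
        int val = *it;
        items.erase(it);                         // erase is also O(n) worst case
        return val;
    }
};
```

Swapping the cost profile (sorted list: O(n) insert, O(1) deleteMin) is the same idea with the scan moved into insert.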

7 Binary Search Tree Priority Q Data Structure (that’s a mouthful)
insert: deleteMin:

[Figure: a binary search tree containing 2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14.]

A binary search tree is a binary tree in which all nodes in the left subtree of a node have lower values than the node, and all nodes in the right subtree have higher values. We’ll talk more about these later; for now, let’s get a feel for how well they work as PQ data structures. How long might an insert take here? O(log n). How about deleteMin? It’s always the leftmost node in the tree, right? So it might take O(log n) time to find the node. What about the worst case? The book says that these lopsided deletes don’t cause a problem; ask me if you’re curious why not.

8 Binary Heap Priority Q Data Structure
Heap-order property: a parent’s key is less than its children’s keys; result: the minimum is always at the top. Structure property: a complete tree with fringe nodes packed to the left; result: depth is always O(log n), and the next open location is always known.

[Figure: a binary heap with 2 at the root, containing 2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 14, 20.]

Alright, there are two problems with binary search trees as PQs. First, they’re overkill: why keep everything ordered? We just need to know the least at any given time. Second, they’re not guaranteed to be complete, so we have worst-case O(n) times. So, what’s the very worst an insert could cost? First we'll look at a neato storage trick, then we'll check out the operations.

9 Nifty Storage Trick
Calculations: child: left = 2*node, right = 2*node + 1; parent = floor(node/2); root = 1; next free = length + 1.

[Figure: the heap 2, 4, 5, 7, 6, 10, 8, 11, 9, 12, 14, 20 stored in array slots 1 through 12.]

It so happens that we can store a complete tree in an array and still be able to find children and parents easily. We’ll take advantage of this.
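The slide's calculations, written out as code (a sketch assuming the 1-based array layout above, with slot 0 unused):

```cpp
// Index arithmetic for a complete binary tree stored in a 1-based array:
// root at index 1, next free slot at size + 1.
inline int leftChild(int i)  { return 2 * i; }
inline int rightChild(int i) { return 2 * i + 1; }
inline int parent(int i)     { return i / 2; }  // integer division = floor
```

Because these are single shifts or adds, navigating the tree costs almost nothing, which is a large part of why array-backed heaps are fast in practice.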

10 DeleteMin
pqueue.deleteMin()

[Figure: the heap from the previous slide; the root value 2 is plucked out, leaving a hole at the top.]

OK, back to priority Qs and the most important operation: deleting! To delete, we start by plucking out the root value in O(1) time. Now we have a hole at the top, and a node at the bottom that isn’t proper for a complete tree. So we fix it by putting that value in the hole and moving the hole down until we restore the heap-order property.

11 Percolate Down

[Figure: four snapshots of the hole percolating down; at each step the hole swaps with its smaller child until the moved value, 12, settles into place.]

12 Finally…

[Figure: the finished heap after deleteMin: 4, 6, 5, 7, 12, 10, 8, 11, 9, 20, 14.]

13 DeleteMin Code

    Object deleteMin() {
        assert(!isEmpty());
        returnVal = Heap[1];
        size--;
        newPos = percolateDown(1, Heap[size + 1]);
        Heap[newPos] = Heap[size + 1];
        return returnVal;
    }

    int percolateDown(int hole, Object val) {
        while (2 * hole <= size) {
            left = 2 * hole;
            right = left + 1;
            if (right <= size && Heap[right] < Heap[left])
                target = right;
            else
                target = left;
            if (Heap[target] < val) {
                Heap[hole] = Heap[target];
                hole = target;
            } else
                break;
        }
        return hole;
    }

runtime: O(log n)

Note that there are three things going on in percolateDown: find the smaller child; if the smaller child is still smaller than the moving value, move the smaller child up; otherwise, we’ve found the right spot, and stop.

14 Insert
pqueue.insert(3)

[Figure: the heap 2, 4, 5, 7, 6, 10, 8, 11, 9, 12, 14, 20 with a new hole tacked on at the next free slot for the incoming 3.]

Inserting works similarly. We tack a hole onto the end of the tree and move that hole up until it’s in the right place for the new value.

15 Percolate Up

[Figure: four snapshots of the hole percolating up; at each step the hole swaps with its parent until the new value, 3, settles into place just below the root, 2.]

16 Insert Code

    void insert(Object o) {
        assert(!isFull());
        size++;
        newPos = percolateUp(size, o);
        Heap[newPos] = o;
    }

    int percolateUp(int hole, Object val) {
        while (hole > 1 && val < Heap[hole / 2]) {
            Heap[hole] = Heap[hole / 2];
            hole /= 2;
        }
        return hole;
    }

runtime: O(log n)

Notice that this code is a lot easier.
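Putting the two code slides together, here is a self-contained sketch of the whole structure over ints (the slides use a generic Object; the class name `BinaryHeap` and its vector-backed storage are my choices, not the lecture's):

```cpp
#include <cassert>
#include <vector>

// Binary min-heap combining the slides' percolateUp / percolateDown.
// Index 0 is a dummy so the 1-based arithmetic (child = 2*hole,
// parent = hole/2) carries over directly from the slides.
class BinaryHeap {
    std::vector<int> heap{0};  // slot 0 unused
public:
    bool isEmpty() const { return heap.size() == 1; }

    void insert(int val) {                          // O(log n)
        heap.push_back(0);                          // hole at the next free slot
        int hole = static_cast<int>(heap.size()) - 1;
        while (hole > 1 && val < heap[hole / 2]) {  // percolate up
            heap[hole] = heap[hole / 2];
            hole /= 2;
        }
        heap[hole] = val;
    }

    int deleteMin() {                               // O(log n)
        assert(!isEmpty());
        int minVal = heap[1];
        int last = heap.back();                     // value to re-seat
        heap.pop_back();
        int size = static_cast<int>(heap.size()) - 1;
        int hole = 1;
        while (2 * hole <= size) {                  // percolate down
            int target = 2 * hole;                  // pick the smaller child
            if (target + 1 <= size && heap[target + 1] < heap[target])
                ++target;
            if (heap[target] < last) {
                heap[hole] = heap[target];
                hole = target;
            } else
                break;
        }
        if (size >= 1) heap[hole] = last;
        return minVal;
    }
};
```

Inserting 5, 2, 9, 1 and then calling deleteMin repeatedly yields 1, 2, 5, 9: the heap-order property guarantees ascending output.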

17 Changing Priorities
In many applications the priority of an object in a priority queue may change over time: if a job has been sitting in the printer queue for a long time, increase its priority (unix “renice”). We must have some separate way of finding the position in the queue of the object to change (e.g., a hash table).

18 Other Priority Queue Operations
decreaseKey: given the position of an object in the queue, reduce its priority value. increaseKey: given the position of an object in the queue, increase its priority value. remove: given the position of an object in the queue, remove it. buildHeap: given a set of items, build a heap.

Why do I insist on a pointer to each item in decreaseKey, increaseKey, and remove? Because it’s hard to find an item in a heap: it’s easy to delete the minimum, but how would you find some arbitrary element? This is where the Dictionary ADT and its various implementations, which we will study soon, come up. What about buildHeap? Is that really a necessary operation? It turns out that its necessity is based purely on the fact that we can do it faster than the naïve implementation!

19 DecreaseKey, IncreaseKey, and Remove

    void decreaseKey(int obj) {
        assert(size >= obj);
        temp = Heap[obj];
        newPos = percolateUp(obj, temp);
        Heap[newPos] = temp;
    }

    void increaseKey(int obj) {
        assert(size >= obj);
        temp = Heap[obj];
        newPos = percolateDown(obj, temp);
        Heap[newPos] = temp;
    }

    void remove(int obj) {
        assert(size >= obj);
        newPos = percolateUp(obj, NEG_INF_VAL);
        Heap[newPos] = NEG_INF_VAL;
        deleteMin();
    }

The code for these is pretty simple. Notice that they are called just after the key is changed; they do not take the new value (they just assume it’s already in the heap). This is different from the book’s version, and it's not necessarily better.

20 BuildHeap
Floyd’s Method. Thank you, Floyd.

[Figure: the array 12, 5, 11, 3, 10, 6, 9, 4, 8, 1, 7, 2 drawn as a complete tree; pretend it’s a heap and fix the heap-order property! The red nodes are the ones that are out of order.]

Pull the insert slide back up and ask how long building a heap by repeated inserts takes: O(n log n) worst case, but O(n) average case. Can we get a worst-case O(n) bound? Let’s try pretending the array is a heap already and just fixing the heap-order property. Question: which nodes MIGHT be out of order in any heap?

21 Build(this)Heap

[Figure: four snapshots of Floyd’s method percolating down each internal node in turn, from the last parent back to the root, transforming the array from the previous slide into a valid heap.]

22 Finally…

[Figure: the finished heap: 1, 3, 2, 4, 5, 6, 9, 12, 8, 10, 7, 11.]

runtime: O(n)

How long does this take? Everything just above the fringe might move 1 step, everything of height 2 or greater might move 2 steps, and so on: roughly n * (sum over i of i/2^i) steps in total. But that sum converges to 2 as n goes to infinity (see the book for why). So the runtime is O(n).
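Floyd's method from the last three slides can be sketched as follows (assuming the 1-based array layout from the storage-trick slide; the code is mine, not the lecture's):

```cpp
#include <vector>

// Percolate the value at `hole` down until the heap-order property holds
// below it. Same routine as in deleteMin, parameterized over the array.
void percolateDown(std::vector<int>& heap, int hole, int size) {
    int val = heap[hole];
    while (2 * hole <= size) {
        int target = 2 * hole;                     // pick the smaller child
        if (target + 1 <= size && heap[target + 1] < heap[target])
            ++target;
        if (heap[target] < val) {
            heap[hole] = heap[target];
            hole = target;
        } else
            break;
    }
    heap[hole] = val;
}

// Floyd's buildHeap: pretend the array is a heap and fix the heap-order
// property by percolating down every non-leaf node, last parent first.
void buildHeap(std::vector<int>& heap) {           // O(n) total
    int size = static_cast<int>(heap.size()) - 1;  // slot 0 is unused
    for (int i = size / 2; i >= 1; --i)
        percolateDown(heap, i, size);
}
```

Leaves (indices above size/2) need no work, which is exactly why the total cost telescopes down to O(n) instead of O(n log n).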

23 Thinking about Heaps
Observations: finding a child/parent index is a multiply/divide by two; operations jump widely through the heap; each operation looks at only two new nodes; inserts are at least as common as deleteMins. Realities: division and multiplication by powers of two are fast; looking at only one new piece of data sucks in a whole cache line; with huge data sets, disk accesses dominate.

Now, let’s look at these facts about heaps and see if we can’t come up with a new priority queue implementation.

24 Solution: d-Heaps
Each node has d children. Still representable by an array. Good choices for d: optimize performance based on the ratio of inserts to removes; choose a power of two for efficiency; fit one set of children in a cache line; fit one set of children on a memory page/disk block.

[Figure: a 3-heap containing 1, 3, 7, 2, 4, 8, 5, 12, 11, 10, 6, 9 and its array representation.]

d-heaps address all these problems.
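A sketch of the index arithmetic for a d-heap in a 1-based array. The exact layout is my assumption (node i's children stored contiguously starting at d*(i-1)+2), not something the slide specifies:

```cpp
// k-th child (k in [0, d)) and parent of node i in a 1-based d-heap array.
// With d a power of two, the multiply and divide compile down to shifts,
// which is one of the slide's points in favor of that choice.
inline int dChild(int i, int d, int k) { return d * (i - 1) + 2 + k; }
inline int dParent(int i, int d)       { return (i - 2) / d + 1; }
```

With d = 2 this reduces to the familiar binary-heap formulas (children 2i and 2i+1, parent i/2), so the binary heap is just the d = 2 special case of this layout.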

25 One More Operation Merge two heaps. Ideas?
Anyone have a way to merge two heaps? We’ll find out Monday that we can actually do this in O(log n) time!

26 To Do Read chapter 6 in the book Have teams Do project 1
Ask questions!

27 Coming Up Mergeable heaps Dictionary ADT and Self-Balancing Trees
Unix Development Tutorial (Tuesday) First project due (next Wednesday) Again, turn in that next homework assignment with your score sheet!
