Week 13 - Wednesday.  What did we talk about last time?  NP-completeness.


1 Week 13 - Wednesday

2  What did we talk about last time?  NP-completeness

3

4

5

6 Student Lecture

7

8  Running time
▪ Best case
▪ Worst case
▪ Average case
 Stable
▪ Will elements with the same value get reordered?
 Adaptive
▪ Will a mostly sorted list take less time to sort?
 In-place
▪ Can we perform the sort without additional memory?
 Simplicity of implementation
▪ Relates to the constant hidden by Big Oh
 Online
▪ Can we sort the numbers as they arrive?

9

10  Pros:
▪ Best case running time of O(n)
▪ Stable
▪ Adaptive
▪ In-place
▪ Simple implementation (one of the fastest sorts for 10 elements or fewer!)
▪ Online
 Cons:
▪ Worst case running time of O(n²)

11  We do n rounds
 For round i, assume that the elements 0 through i – 1 are sorted
 Take element i and move it up the list of already sorted elements until you find the spot where it fits
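The rounds described above can be sketched in Java; the class and method names here are illustrative, not from the course materials.

```java
// A minimal insertion sort sketch; sorts the array in place.
public class InsertionSortDemo {
    public static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) { // rounds: elements 0..i-1 are sorted
            int key = a[i];                  // the element to insert this round
            int j = i - 1;
            // shift larger sorted elements right until key's spot is found
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }
}
```

Running it on the slide's example array {7, 45, 0, 54, 37, 108, 51} produces {0, 7, 37, 45, 51, 54, 108}, matching the trace on the next slide.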

12 Insertion sort, round by round:
7 45 0 54 37 108 51
7 45 0 54 37 108 51
0 7 45 54 37 108 51
0 7 45 54 37 108 51
0 7 37 45 54 108 51
0 7 37 45 54 108 51
0 7 37 45 51 54 108

13

14

15  Pros:
▪ Best, worst, and average case running time of O(n log n)
▪ Stable
▪ Ideal for linked lists
 Cons:
▪ Not adaptive
▪ Not in-place: O(n) additional space needed for an array, O(log n) additional space needed for linked lists

16  Take a list of numbers and divide it in half, then, recursively:
 Merge sort each half
 After each half has been sorted, merge them together in order
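The divide-and-merge steps above can be sketched in Java; the names are illustrative, and this version returns a new array rather than sorting in place, which makes the O(n) extra space visible.

```java
import java.util.Arrays;

// A merge sort sketch: split, recursively sort each half, merge in order.
public class MergeSortDemo {
    public static int[] sort(int[] a) {
        if (a.length <= 1) return a;                       // base case
        int mid = a.length / 2;
        int[] left = sort(Arrays.copyOfRange(a, 0, mid));  // sort each half
        int[] right = sort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);                         // merge them together
    }

    // merge two already-sorted arrays into one sorted array
    private static int[] merge(int[] left, int[] right) {
        int[] out = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++]; // <= keeps equal keys in order (stable)
        while (i < left.length) out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
        return out;
    }
}
```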

17 Merge sort example:
7 45 0 54 37 108
Split into halves: 7 45 0 | 54 37 108
Sort each half: 0 7 45 | 37 54 108
Merge: 0 7 37 45 54 108

18

19

20  Pros:
▪ Best and average case running time of O(n log n)
▪ Very simple implementation
▪ In-place
▪ Ideal for arrays
 Cons:
▪ Worst case running time of O(n²)
▪ Not stable

21
1. Pick a pivot
2. Partition the array into a left half smaller than the pivot and a right half bigger than the pivot
3. Recursively quicksort the left half
4. Recursively quicksort the right half

22  Input: array, index, left, right
 Set pivot to be array[index]
 Swap array[index] with array[right]
 Set index to left
 For i from left up to right – 1
▪ If array[i] ≤ pivot
▪ Swap array[i] with array[index]
▪ index++
 Swap array[index] with array[right]
 Return index // so that we know where pivot is
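The partition pseudocode above translates directly into Java; the class name is illustrative.

```java
// A direct translation of the partition pseudocode: pivot goes to the
// right end, smaller values are swapped to the left side, then the pivot
// is swapped into its final position.
public class PartitionDemo {
    public static int partition(int[] array, int index, int left, int right) {
        int pivot = array[index];
        swap(array, index, right);   // park the pivot at the right end
        index = left;
        for (int i = left; i < right; i++) {
            if (array[i] <= pivot) { // values <= pivot move to the left side
                swap(array, i, index);
                index++;
            }
        }
        swap(array, index, right);   // put the pivot in its final spot
        return index;                // so that we know where the pivot is
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i];
        a[i] = a[j];
        a[j] = t;
    }
}
```

On {7, 45, 0, 54, 37, 108} with the pivot taken from index 0 (value 7), the call returns 1: only 0 lands left of the pivot, and 7 ends up at index 1.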

23 Quicksort example:
7 45 0 54 37 108
0 7 45 54 37 108
0 7 45 54 37 108
0 7 37 45 54 108
0 7 37 45 54 108
0 7 37 45 54 108
0 7 37 45 54 108

24  Everything comes down to picking the right pivot
 If you could get the median every time, it would be great
 A common choice is the first element in the range as the pivot
▪ Gives O(n²) performance if the list is sorted (or reverse sorted)
▪ Why?
 Another implementation is to pick a random location
 Another well-studied approach is to pick three random locations and take the median of those three
 An algorithm exists that can find the median in linear time, but its constant is HUGE
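The median-of-three idea above can be sketched as a small helper; the class and method names are illustrative, and the random choice of the three positions is left to the caller.

```java
// Given three candidate positions (in practice chosen at random),
// return the position holding the median of the three values.
public class PivotDemo {
    public static int medianIndex(int[] a, int i, int j, int k) {
        int x = a[i], y = a[j], z = a[k];
        if ((y <= x && x <= z) || (z <= x && x <= y)) return i; // x is the median
        if ((x <= y && y <= z) || (z <= y && y <= x)) return j; // y is the median
        return k;                                               // otherwise z is
    }
}
```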

25

26  A maximum heap is a complete binary tree where
▪ The left and right children of the root have key values less than the root
▪ The left and right subtrees are also maximum heaps
 We can define minimum heaps similarly

27 [Diagram: a maximum heap with root 10, children 9 and 3, and leaves 0 and 1]

28  Easy! We look at the location of the new node in binary
 Ignore the first 1; then each 0 means go left and each 1 means go right
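The binary-path rule above can be sketched as a tiny helper; the class and method names are illustrative, and node locations are numbered from 1 as on the slide.

```java
// For a 1-based node number, drop the leading 1 bit of its binary form,
// then read each remaining 0 as "left" and each 1 as "right".
public class HeapPathDemo {
    public static String pathTo(int location) {
        String bits = Integer.toBinaryString(location);
        StringBuilder path = new StringBuilder();
        for (int i = 1; i < bits.length(); i++)  // skip the leading 1
            path.append(bits.charAt(i) == '0' ? "left " : "right ");
        return path.toString().trim();
    }
}
```

For location 6 (binary 110) this yields "right left", matching the next slide.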

29  Location: 6
 In binary: 110
 Right then left
[Diagram: following that path in the heap 10, 9, 3, 0, 1 to reach the new node's spot]

30  Oh no!
[Diagram: 15 inserted as a new leaf of the heap 10, 9, 3, 0, 1, violating the heap property]

31 [Diagram: 15 bubbles up, swapping with its parent 3 and then with the root 10, until 15 becomes the new root]

32 [Diagram: a heap with root 9, children 3 and 0, and leaf 1]

33 [Diagram: removing the largest value 9 from the top of the heap]

34 [Diagram: the last value 1 replaces the root and bubbles down to restore the heap property]

35  Heaps only have:
▪ Add
▪ Remove Largest
▪ Get Largest
 Which cost:
▪ Add: O(log n)
▪ Remove Largest: O(log n)
▪ Get Largest: O(1)
 Heaps are a perfect data structure for a priority queue

36

37  A priority queue is an ADT that allows us to insert key values and efficiently retrieve the highest priority one
 It has these operations:
▪ Insert(key): Put the key into the priority queue
▪ Max(): Get the highest value key
▪ Remove Max(): Remove the highest value key

38  It turns out that a heap is a great way to implement a priority queue
 Although it's useful to think of a heap as a complete binary tree, almost no one implements them that way
 Instead, we can view the heap as an array
 Because it's a complete binary tree, there will be no empty spots in the array

39 public class PriorityQueue {
       private int[] keys = new int[10];
       private int size = 0;
       …
   }

40  Illustrated:
 The left child of element i is at index 2i + 1
 The right child of element i is at index 2i + 2
[Diagram: the heap 10, 9, 3, 0, 1 stored in array indices 0 through 4]

41 public void insert(int key)
 Always put the key at the end of the array (resizing if needed)
 The value will often need to be bubbled up, using the following helper method:
private void bubbleUp(int index)
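One possible insert/bubbleUp pair is sketched below; this is not the course's reference solution, and resizing is omitted for brevity, so the sketch assumes the array still has room.

```java
// An array-backed max heap sketch: insert at the end, then bubble up.
public class MaxHeapDemo {
    private int[] keys = new int[10];
    private int size = 0;

    public void insert(int key) {
        keys[size] = key;   // always put the key at the end
        bubbleUp(size);
        size++;
    }

    private void bubbleUp(int index) {
        int parent = (index - 1) / 2;
        // swap upward while the new value beats its parent
        while (index > 0 && keys[index] > keys[parent]) {
            int t = keys[index];
            keys[index] = keys[parent];
            keys[parent] = t;
            index = parent;
            parent = (index - 1) / 2;
        }
    }

    public int max() {
        return keys[0];     // the largest value is always at the root
    }
}
```

Inserting 10, 9, 3, 0, 1 and then 15 reproduces the earlier slides: 15 bubbles up past its parent and the old root to become the new maximum.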

42 public int max()
 Find the maximum value in the priority queue
 Hint: this method is really easy

43 public int removeMax()
 Store the value at the top of the heap (array index 0)
 Replace it with the last legal value in the array
 This value will generally need to be bubbled down, using the following helper method:
private void bubbleDown(int index)
 Bubbling down is harder than bubbling up, because you might have two legal children!
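A removeMax/bubbleDown sketch is below; it is illustrative rather than the course's solution, and for brevity it is seeded directly from an array that is assumed to already satisfy the heap property.

```java
// removeMax: save the root, move the last value to the top, bubble it down.
public class RemoveMaxDemo {
    private int[] keys;
    private int size;

    public RemoveMaxDemo(int[] heap) {   // assumes heap is already a valid max heap
        keys = heap.clone();
        size = heap.length;
    }

    public int removeMax() {
        int top = keys[0];               // store the value at array index 0
        size--;
        keys[0] = keys[size];            // replace it with the last legal value
        bubbleDown(0);
        return top;
    }

    private void bubbleDown(int index) {
        while (true) {
            int left = 2 * index + 1;
            int right = 2 * index + 2;
            int largest = index;
            // two legal children are possible: pick the larger one to swap with
            if (left < size && keys[left] > keys[largest]) largest = left;
            if (right < size && keys[right] > keys[largest]) largest = right;
            if (largest == index) return; // heap property restored
            int t = keys[index];
            keys[index] = keys[largest];
            keys[largest] = t;
            index = largest;
        }
    }

    public int max() {
        return keys[0];
    }
}
```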

44

45  Heapsort
 TimSort
 Counting sort
 Radix sort

46  Work on Project 4
 Work on Assignment 6
▪ Due on Friday
 Read sections 2.4 and 5.1

