
1 Greedy Algorithms 15-211 Fundamental Data Structures and Algorithms Peter Lee March 19, 2004

2 Announcements HW6 is due on April 5! Quiz #2 postponed until March 31  an online quiz  requires up to one hour of uninterrupted time with a web browser (actually, only a 15-minute quiz)  must be completed by April 1, 11:59pm

3

4 Objects in calendar are closer than they appear

5 Greed is Good

6 Example: Counting change Suppose we want to give out change, using the minimal number of bills and coins.

7 A change-counting algorithm An easy algorithm for giving out N cents in change:  Choose the largest bill or coin that is ≤ N.  Subtract the value of the chosen bill/coin from N, to get a new value of N.  Repeat until a total of N cents has been counted. Does this work? I.e., does this really give out the minimal number of coins and bills?
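A minimal Java sketch of this greedy loop (the denomination list and the class/method names are illustrative, not from the slides):

```java
import java.util.ArrayList;
import java.util.List;

public class GreedyChange {
    // US bills and coins in cents, largest first (illustrative set).
    static final int[] DENOMS = {10000, 5000, 2000, 1000, 500, 100, 25, 10, 5, 1};

    // Repeatedly take the largest denomination that does not exceed the
    // remaining amount, until nothing remains.
    static List<Integer> makeChange(int n) {
        List<Integer> pieces = new ArrayList<>();
        for (int d : DENOMS) {
            while (n >= d) {
                pieces.add(d);
                n -= d;
            }
        }
        return pieces;
    }

    public static void main(String[] args) {
        // 288 cents -> [100, 100, 25, 25, 25, 10, 1, 1, 1]
        System.out.println(makeChange(288));
    }
}
```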

8 Our simple algorithm For US currency, this simple algorithm actually works. Why do we call this a greedy algorithm?

9 Greedy algorithms At every step, a greedy algorithm  makes a locally optimal decision,  with the idea that in the end it all adds up to a globally optimal solution. Being optimistic like this usually leads to very simple algorithms.

10 Lu Lu’s Pan Fried Noodle Shop Think Globally Act Locally Eat Noodles Over on Craig Street… How Californian...

11 But… What happens if we have a 12-cent coin?
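For instance (a worked case, not from the slides): with denominations {12, 10, 5, 1}, the greedy loop above makes 15 cents as 12 + 1 + 1 + 1, four coins, whereas 10 + 5 does it with two. With a 12-cent coin in the mix, the locally best choice is no longer always globally best.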

12 Hill-climbing Greedy algorithms are often visualized as “hill-climbing”.  Suppose you want to reach the summit, but can only see 10 yards ahead and behind (due to thick fog). Which way?

13 Hill-climbing Greedy algorithms are often visualized as “hill-climbing”.  Suppose you want to reach the summit, but can only see 10 yards ahead and behind (due to thick fog). Which way?

14 Hill-climbing, cont’d Making the locally-best guess is efficient and easy, but doesn’t always work.

15 Where have we seen this before? Greedy algorithms are common in computer science. In fact, from last week…

16 Finding shortest airline routes [Diagram: route map over BOS, PVD, JFK, BWI, MIA, ORD, DFW, LAX, SFO with the mileage of each leg labeled on the edges]

17 Three 2-hop BWI->DFW routes [Diagram: the same route map, highlighting the three two-hop BWI->DFW routes]

18 A greedy algorithm Assume that every city is infinitely far away.  I.e., every city is ∞ miles away from BWI (except BWI, which is 0 miles away).  Now perform something similar to breadth-first search, and optimistically guess that we have found the best path to each city as we encounter it.  If we later discover we are wrong and find a better path to a particular city, then update the distance to that city.

19 Intuition behind Dijkstra’s alg. For our airline-mileage problem, we can start by guessing that every city is ∞ miles away.  Mark each city with this guess. Find all cities one hop away from BWI, and check whether the mileage is less than what is currently marked for that city.  If so, then revise the guess. Continue for 2 hops, 3 hops, etc.

20 Shortest mileage from BWI PVD ∞ BOS ∞ JFK ∞ ORD ∞ LAX ∞ SFO ∞ DFW ∞ BWI 0 MIA ∞

21 Shortest mileage from BWI PVD ∞ BOS ∞ JFK 184 ORD 621 LAX ∞ SFO ∞ DFW ∞ BWI 0 MIA 946

22 Shortest mileage from BWI PVD 328 BOS 371 JFK 184 ORD 621 LAX ∞ SFO ∞ DFW 1575 BWI 0 MIA 946

23 Shortest mileage from BWI PVD 328 BOS 371 JFK 184 ORD 621 LAX ∞ SFO ∞ DFW 1575 BWI 0 MIA 946

24 Shortest mileage from BWI PVD 328 BOS 371 JFK 184 ORD 621 LAX ∞ SFO 3075 DFW 1575 BWI 0 MIA 946

25 Shortest mileage from BWI PVD 328 BOS 371 JFK 184 ORD 621 LAX ∞ SFO 2467 DFW 1423 BWI 0 MIA 946

26 Shortest mileage from BWI PVD 328 BOS 371 JFK 184 ORD 621 LAX 3288 SFO 2467 DFW 1423 BWI 0 MIA 946

27 Shortest mileage from BWI PVD 328 BOS 371 JFK 184 ORD 621 LAX 2658 SFO 2467 DFW 1423 BWI 0 MIA 946

28-30 Shortest mileage from BWI (final distances) PVD 328 BOS 371 JFK 184 ORD 621 LAX 2658 SFO 2467 DFW 1423 BWI 0 MIA 946

31 Dijkstra’s algorithm Algorithm initialization:  Label each node with the distance ∞, except the start node, which is labeled with distance 0. D[v] is the distance label for v.  Put all nodes into a priority queue Q, using the distances as labels.

32 Dijkstra’s algorithm, cont’d
   While Q is not empty do:
     u = Q.removeMin
     for each node z one hop away from u do:
       if D[u] + miles(u,z) < D[z] then
         D[z] = D[u] + miles(u,z)
         change key of z in Q to D[z]
   Note: the use of a priority queue allows the “finished” node (the current minimum) to be found quickly (in O(log N) time).
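A compact Java sketch of this loop (the graph representation and names are illustrative; the change-key step is simulated by removing and re-inserting the node, which is simple but not asymptotically optimal):

```java
import java.util.*;

public class Dijkstra {
    // graph.get(u) maps each neighbor z of u to miles(u, z).
    static Map<String, Integer> shortestFrom(String start,
            Map<String, Map<String, Integer>> graph) {
        Map<String, Integer> dist = new HashMap<>();
        for (String v : graph.keySet()) dist.put(v, Integer.MAX_VALUE); // "infinity"
        dist.put(start, 0);

        // Priority queue Q ordered by the current distance labels D[v].
        PriorityQueue<String> pq =
                new PriorityQueue<>(Comparator.comparingInt(dist::get));
        pq.addAll(graph.keySet());

        while (!pq.isEmpty()) {
            String u = pq.poll();                            // u = Q.removeMin
            if (dist.get(u) == Integer.MAX_VALUE) continue;  // unreachable node
            for (Map.Entry<String, Integer> e : graph.get(u).entrySet()) {
                String z = e.getKey();
                int alt = dist.get(u) + e.getValue();        // D[u] + miles(u, z)
                if (alt < dist.get(z)) {
                    dist.put(z, alt);                        // D[z] = D[u] + miles(u, z)
                    pq.remove(z);                            // "change key of z in Q"
                    pq.add(z);
                }
            }
        }
        return dist;
    }
}
```

Run on the mileage graph from the earlier slides, dist ends with the labels shown on slides 20-30 (BWI 0, JFK 184, PVD 328, …, LAX 2658).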

33 Another Greedy Algorithm

34 The Fractional Knapsack Problem (FKP) You rob a store: find n kinds of items  Gold dust. Wheat. Beer.

35

36 Example 2: Fractional knapsack problem (FKP) You rob a store: find n kinds of items  Gold dust. Wheat. Beer. The total inventory for the i-th kind of item:  Weight: w_i pounds  Value: v_i dollars Knapsack can hold a maximum of W pounds. Q: how much of each kind of item should you take? (Can take fractional weight)

37 FKP: solution Greedy solution:  Fill knapsack with “most valuable” item until all is taken. Most valuable = v_i/w_i (dollars per pound)  Then next “most valuable” item, etc.  Until knapsack is full.
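A small Java sketch of this strategy (class, method, and example numbers are illustrative):

```java
import java.util.Arrays;

public class FractionalKnapsack {
    // Maximum value that fits in capacity W pounds, given per-item total
    // inventory weights w[i] (pounds) and total values v[i] (dollars).
    static double maxLoot(double[] w, double[] v, double W) {
        Integer[] order = new Integer[w.length];
        for (int i = 0; i < order.length; i++) order[i] = i;
        // Consider items by value density v[i]/w[i], highest first.
        Arrays.sort(order, (a, b) -> Double.compare(v[b] / w[b], v[a] / w[a]));

        double value = 0, remaining = W;
        for (int i : order) {
            if (remaining <= 0) break;
            double take = Math.min(w[i], remaining);  // take as much as fits
            value += take * (v[i] / w[i]);
            remaining -= take;
        }
        return value;
    }

    public static void main(String[] args) {
        // Gold dust, wheat, beer with made-up weights/values; 60 lb knapsack.
        System.out.println(maxLoot(new double[]{10, 100, 50},
                                   new double[]{600, 100, 200}, 60));  // 800.0
    }
}
```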

38 Ingredients of a greedy alg. An optimization problem. Is iterative / Proceeds in stages. Has the greedy-choice property: A greedy choice will lead to a globally optimal solution.

39 FKP is greedy An optimization problem:  Maximize value of loot, subject to maximum weight W. (constrained optimization) Proceeds in stages:  Knapsack is filled with one item at a time.

40 FKP is greedy Greedy-choice property: A locally greedy choice will lead to a globally optimal solution. In steps…: Step 1: Does the optimal solution contain the greedy choice? Step 2: Can the greedy choice always be made first?

41 FKP: Greedy-choice: Step 1 Consider the total value, V, of the knapsack. The knapsack must contain item h:  Item h is the item with the highest $/lb. Why? Because if h is not included, we can replace some other item in the knapsack with an equivalent weight of h, and increase V. This can continue until the knapsack is full, or all of h is taken. Therefore any optimal solution must include the greedy choice.

42 More rigorously… Let item h be the item with the highest $/lb. The total inventory of h is w_h pounds, worth v_h dollars. Let k_i be the weight of item i in the knapsack. Then the total value is V = Σ_i k_i (v_i / w_i). If k_h < w_h and k_j > 0 for some j ≠ h, then replace some weight δ of item j with an equal weight of h. Let the new total value be V'. The difference in total value is V' - V = δ (v_h / w_h - v_j / w_j) ≥ 0, since, by definition of h, v_h / w_h ≥ v_j / w_j. Therefore all of item h should be taken.

43 FKP: Greedy-choice: Step 2 Now we want to show that we can always make the greedy choice first. If there is more of item h than the knapsack can hold, then fill the knapsack completely with h.  No other item gives higher total value. Otherwise, the knapsack contains all of h and some other item. We can always make h the first choice, without changing the total value V. Therefore the greedy choice can always be made first.

44 More rigorously… Case I: w_h ≥ W  Fill the knapsack completely with h.  No other item gives higher total value. Case II: w_h < W  Let the 1st choice be item i, and the k-th choice be h; then we can always swap our 1st and k-th choices, and the total value V remains unchanged. Therefore the greedy choice can always be made first.

45 The Binary Knapsack Problem You win the Supermarket Shopping Spree contest.  You are given a shopping cart with capacity C.  You are allowed to fill it with any items you want from Giant Eagle.  Giant Eagle has items 1, 2, … n, which have values v_1, v_2, …, v_n, and sizes s_1, s_2, …, s_n.  How do you (efficiently) maximize the value of the items in your cart?

46

47 BKP is not greedy The obvious greedy strategy of taking the maximum-value item that still fits in the cart does not work. Consider:  Suppose item i has size s_i = C and value v_i.  It can happen that there are items j and k with combined size s_j + s_k ≤ C but v_j + v_k > v_i.

48 BKP: Greedy approach fails Knapsack capacity: 50 lbs. Item 1: $60, 10 lbs ($6/lb). Item 2: $100, 20 lbs ($5/lb). Item 3: $120, 30 lbs ($4/lb). Greedy (highest $/lb first) takes items 1 and 2 for $160; items 1 and 3 give $180; items 2 and 3 give $220 (optimal). BKP has optimal substructure, but not the greedy-choice property: the optimal solution does not contain the greedy choice.
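A tiny Java sketch showing the greedy-by-density strategy coming up short on exactly this instance (names are illustrative):

```java
public class GreedyBkpFails {
    public static void main(String[] args) {
        int[] value  = {60, 100, 120};  // dollars
        int[] weight = {10, 20, 30};    // pounds, already in decreasing $/lb order
        int capacity = 50, total = 0;

        // Greedily take whole items by value density while they still fit.
        for (int i = 0; i < value.length; i++) {
            if (weight[i] <= capacity) {
                total += value[i];
                capacity -= weight[i];
            }
        }
        // Prints 160, but taking items 2 and 3 alone is worth 220.
        System.out.println(total);
    }
}
```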

49 A question for a future lecture… How can we (efficiently) solve the binary knapsack problem? One possible approach:  Dynamic programming
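As a preview, here is a standard 0/1-knapsack dynamic-programming sketch, assuming integer sizes (a textbook formulation, not necessarily the one used later in the course):

```java
public class BinaryKnapsack {
    // Classic 0/1 knapsack DP: best[c] is the best value achievable with
    // remaining capacity c. Runs in O(n * C) time for integer capacity C.
    static int maxValue(int[] size, int[] value, int C) {
        int[] best = new int[C + 1];
        for (int i = 0; i < size.length; i++) {
            // Sweep capacities downward so each item is used at most once.
            for (int c = C; c >= size[i]; c--) {
                best[c] = Math.max(best[c], best[c - size[i]] + value[i]);
            }
        }
        return best[C];
    }

    public static void main(String[] args) {
        // The instance from the previous slide: 10/20/30 lbs worth $60/$100/$120,
        // capacity 50 lbs -> 220.
        System.out.println(maxValue(new int[]{10, 20, 30},
                                    new int[]{60, 100, 120}, 50));
    }
}
```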

50 Machine Scheduling

51 Optimal machine scheduling We are given n tasks and an infinite supply of machines to perform them  each task t_i = [s_i, f_i] has start time s_i and finish time f_i An assignment of tasks to machines is feasible if no machine is assigned two overlapping tasks An assignment is optimal if it is feasible and uses the minimal number of machines

52 Example Tasks:
   task:    a  b  c  d   e   f  g
   start:   0  3  4  9   7   1  6
   finish:  2  7  7  11  10  5  8
   Can you invent a greedy algorithm to find an optimal schedule for these tasks?
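One possible greedy answer, sketched below in Java (sort the tasks by start time and reuse the machine that frees up earliest, tracked with a min-heap of finish times; this is a standard solution, not necessarily the one presented in lecture):

```java
import java.util.Arrays;
import java.util.PriorityQueue;

public class MachineScheduling {
    // Minimum number of machines so that no machine runs two overlapping
    // tasks. Tasks are [start, finish] pairs; a task that starts exactly
    // when another finishes is treated as non-overlapping.
    static int machinesNeeded(int[][] tasks) {
        Arrays.sort(tasks, (a, b) -> Integer.compare(a[0], b[0]));   // by start time
        PriorityQueue<Integer> finishTimes = new PriorityQueue<>();  // busy machines

        for (int[] t : tasks) {
            // Reuse the machine that frees up earliest, if it is free by t's start.
            if (!finishTimes.isEmpty() && finishTimes.peek() <= t[0]) {
                finishTimes.poll();
            }
            finishTimes.add(t[1]);
        }
        // Machines still tracked at the end = total machines ever allocated.
        return finishTimes.size();
    }

    public static void main(String[] args) {
        int[][] tasks = { {0,2}, {3,7}, {4,7}, {9,11}, {7,10}, {1,5}, {6,8} };
        System.out.println(machinesNeeded(tasks));  // 3
    }
}
```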

53 Succeeding with greed 3 ingredients needed: Optimization problem. Proceed in stages. Greedy-choice property: A greedy choice will lead to a globally optimal solution.

