Greedy Algorithms 15-211 Fundamental Data Structures and Algorithms Peter Lee March 19, 2004

Announcements HW6 is due on April 5! Quiz #2 is postponed until March 31. It is an online quiz, requiring up to one hour of uninterrupted time with a web browser (actually, it is only a 15-minute quiz), and it must be completed by April 1, 11:59pm.

Objects in calendar are closer than they appear

Greed is Good

Example: Counting change Suppose we want to give out change, using the minimal number of bills and coins.

A change-counting algorithm An easy algorithm for giving out N cents in change: choose the largest bill or coin that is ≤ N; subtract the value of the chosen bill/coin from N, to get a new value of N; repeat until a total of N cents has been counted. Does this work? I.e., does this really give out the minimal number of coins and bills?
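A minimal Java sketch of this loop (the class name, the method name makeChange, and the denomination list are my own, for illustration):

    import java.util.ArrayList;
    import java.util.List;

    public class GreedyChange {
        // Greedily take the largest denomination that is <= the remaining amount.
        static List<Integer> makeChange(int amount, int[] denominations) {
            List<Integer> coins = new ArrayList<>();
            for (int d : denominations) {                // denominations listed largest first
                while (amount >= d) {
                    coins.add(d);
                    amount -= d;
                }
            }
            return coins;                                // amount reaches 0 whenever a 1-cent coin exists
        }

        public static void main(String[] args) {
            int[] usCoins = {25, 10, 5, 1};              // US coin denominations, in cents
            System.out.println(makeChange(63, usCoins)); // prints [25, 25, 10, 1, 1, 1]
        }
    }

For US currency this greedy choice happens to be optimal, which is the point of the next slide.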

Our simple algorithm For US currency, this simple algorithm actually works. Why do we call this a greedy algorithm?

Greedy algorithms At every step, a greedy algorithm  makes a locally optimal decision,  with the idea that in the end it all adds up to a globally optimal solution. Being optimistic like this usually leads to very simple algorithms.

Lu Lu’s Pan Fried Noodle Shop Think Globally Act Locally Eat Noodles Over on Craig Street… How Californian...

But… What happens if we have a 12-cent coin?
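For instance, with the makeChange sketch above and a 12-cent coin added, makeChange(15, new int[]{12, 10, 5, 1}) greedily returns four coins (12, 1, 1, 1), even though two coins (10 and 5) suffice, so the greedy choice is no longer optimal for that coin system.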

Hill-climbing Greedy algorithms are often visualized as “hill-climbing”.  Suppose you want to reach the summit, but can only see 10 yards ahead and behind (due to thick fog). Which way?

Hill-climbing, cont’d Making the locally-best guess is efficient and easy, but doesn’t always work.

Where have we seen this before? Greedy algorithms are common in computer science. In fact, from last week…

Finding shortest airline routes (a route map over the cities PVD, BOS, JFK, ORD, LAX, SFO, DFW, BWI, and MIA)

Three 2-hop BWI->DFW routes (highlighted on the same route map of PVD, BOS, JFK, ORD, LAX, SFO, DFW, BWI, and MIA)

A greedy algorithm Assume that every city is infinitely far away, i.e., every city is ∞ miles away from BWI (except BWI, which is 0 miles away). Now perform something similar to breadth-first search, and optimistically guess that we have found the best path to each city as we encounter it. If we later discover we are wrong and find a better path to a particular city, then update the distance to that city.

Intuition behind Dijkstra's alg. For our airline-mileage problem, we can start by guessing that every city is ∞ miles away. Mark each city with this guess. Find all cities one hop away from BWI, and check whether the mileage is less than what is currently marked for that city. If so, then revise the guess. Continue for 2 hops, 3 hops, etc.

Shortest mileage from BWI The distance labels are tightened one relaxation at a time, as shown on the route map (∞ = not yet reached; the MIA label is shown only at the start):

       PVD  BOS  JFK  ORD   LAX   SFO   DFW  BWI  MIA
        ∞    ∞    ∞    ∞     ∞     ∞     ∞    0    ∞
        ∞    ∞   184  621    ∞     ∞     ∞    0
       328  371  184  621    ∞     ∞   1575   0
       328  371  184  621    ∞   3075  1575   0
       328  371  184  621    ∞   2467  1423   0
       328  371  184  621  3288  2467  1423   0
       328  371  184  621  2658  2467  1423   0

Dijkstra's algorithm Algorithm initialization: label each node with the distance ∞, except the start node, which is labeled with distance 0 (D[v] is the distance label for v). Put all nodes into a priority queue Q, using the distances as keys.

Dijkstra's algorithm, cont'd
While Q is not empty do:
  u = Q.removeMin()
  for each node z one hop away from u do:
    if D[u] + miles(u,z) < D[z] then
      D[z] = D[u] + miles(u,z)
      change the key of z in Q to D[z]
Note: use of a priority queue allows "finished" nodes to be found quickly (in O(log N) time).
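Here is a minimal Java sketch of this loop, not the course's own code. Because java.util.PriorityQueue has no changeKey operation, the sketch uses the common lazy-deletion variant: instead of changing a key, it inserts a new (node, distance) entry and simply skips stale entries when they come off the queue. The adjacency-map graph representation and the name shortestMiles are assumptions made for illustration.

    import java.util.Comparator;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.PriorityQueue;

    public class Dijkstra {
        record Entry(String node, int dist) {}   // a queue entry (Java 16+ record)

        // graph.get(u).get(z) = miles(u, z); every neighbor must also be a key of graph.
        static Map<String, Integer> shortestMiles(Map<String, Map<String, Integer>> graph,
                                                  String source) {
            Map<String, Integer> dist = new HashMap<>();               // the labels D[v]
            for (String v : graph.keySet()) dist.put(v, Integer.MAX_VALUE);
            dist.put(source, 0);

            PriorityQueue<Entry> q = new PriorityQueue<>(Comparator.comparingInt(Entry::dist));
            q.add(new Entry(source, 0));
            while (!q.isEmpty()) {
                Entry e = q.poll();                                    // u = Q.removeMin()
                if (e.dist() > dist.get(e.node())) continue;           // stale entry: skip it
                for (Map.Entry<String, Integer> edge : graph.get(e.node()).entrySet()) {
                    String z = edge.getKey();
                    int candidate = e.dist() + edge.getValue();        // D[u] + miles(u, z)
                    if (candidate < dist.get(z)) {                     // found a shorter route to z
                        dist.put(z, candidate);
                        q.add(new Entry(z, candidate));                // stands in for changeKey
                    }
                }
            }
            return dist;
        }

        public static void main(String[] args) {
            // A tiny made-up graph just to show the call; not the airline mileage data.
            Map<String, Map<String, Integer>> g = new HashMap<>();
            g.put("A", Map.of("B", 5, "C", 2));
            g.put("B", Map.of("A", 5, "C", 1));
            g.put("C", Map.of("A", 2, "B", 1));
            System.out.println(shortestMiles(g, "A"));                 // A=0, C=2, B=3 (in some order)
        }
    }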

Another Greedy Algorithm

The Fractional Knapsack Problem (FKP) You rob a store and find n kinds of items: gold dust, wheat, beer.

Example 2: Fractional knapsack problem (FKP) You rob a store and find n kinds of items: gold dust, wheat, beer. The total inventory of the i-th kind of item has weight w_i pounds and value v_i dollars. The knapsack can hold a maximum of W pounds. Q: how much of each kind of item should you take? (You can take fractional weights.)

FKP: solution Greedy solution: fill the knapsack with the "most valuable" item until all of it is taken, where most valuable = v_i / w_i (dollars per pound); then take the next "most valuable" item, and so on, until the knapsack is full.
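A minimal Java sketch of this rule, assuming the inventory is given as parallel arrays of weights and values; the method name fractionalKnapsack and the sample numbers are illustrative, not the lecture's:

    import java.util.Arrays;
    import java.util.Comparator;

    public class FractionalKnapsack {
        // Maximum value obtainable with capacity W (pounds), taking items
        // in decreasing order of dollars per pound, fractionally if needed.
        static double fractionalKnapsack(double[] weight, double[] value, double W) {
            Integer[] order = new Integer[weight.length];
            for (int i = 0; i < order.length; i++) order[i] = i;
            Arrays.sort(order, Comparator.comparingDouble(i -> -value[i] / weight[i]));

            double remaining = W, total = 0;
            for (int i : order) {
                if (remaining <= 0) break;
                double take = Math.min(weight[i], remaining);  // possibly a fraction of item i
                total += take * (value[i] / weight[i]);
                remaining -= take;
            }
            return total;
        }

        public static void main(String[] args) {
            // Hypothetical inventory: gold dust, wheat, beer (weights in lbs, values in dollars).
            double[] w = {10, 200, 50};
            double[] v = {5000, 200, 150};
            System.out.println(fractionalKnapsack(w, v, 60));  // 5000 + 50 * (150/50) = 5150.0
        }
    }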

Ingredients of a greedy alg. An optimization problem. Is iterative / Proceeds in stages. Has the greedy-choice property: A greedy choice will lead to a globally optimal solution.

FKP is greedy An optimization problem:  Maximize value of loot, subject to maximum weight W. (constrained optimization) Proceeds in stages:  Knapsack is filled with one item at a time.

FKP is greedy Greedy-choice property: A locally greedy choice will lead to a globally optimal solution. In steps: Step 1: Does the optimal solution contain the greedy choice? Step 2: Can the greedy choice always be made first?

FKP: Greedy-choice: Step 1 Consider the total value, V, of the knapsack. The knapsack must contain item h, the item with the highest $/lb. Why? Because if h is not included, we can replace some other item in the knapsack with an equivalent weight of h, and increase V. This can continue until the knapsack is full, or all of h is taken. Therefore any optimal solution must include the greedy choice.

More rigorously… Let item h be the item with the highest $/lb. The total inventory of h is w_h pounds, and its total value is v_h dollars. Let k_i be the weight of item i in the knapsack, so the total value is V = sum over i of k_i (v_i / w_i). If not all of h has been taken and k_j > 0 for some j ≠ h, then replace (some of) j with an equal weight x of h, and let the new total value be V'. The difference in total value is V' - V = x (v_h / w_h - v_j / w_j) ≥ 0, since, by the definition of h, v_h / w_h ≥ v_j / w_j. Therefore all of item h should be taken.

FKP: Greedy-choice: Step 2 Now we want to show that we can always make the greedy choice first. If there is more of item h than the knapsack can hold, then fill the knapsack completely with h; no other item gives a higher total value. Otherwise, the knapsack contains all of h and some other item(s), and we can always make h the first choice without changing the total value V. Therefore the greedy choice can always be made first.

More rigorously… Case I: w_h ≥ W. Fill the knapsack completely with h; no other item gives a higher total value. Case II: w_h < W. Let the 1st choice be item i and the kth choice be h; then we can always swap the 1st and kth choices, and the total value V remains unchanged. Therefore the greedy choice can always be made first.

The Binary Knapsack Problem You win the Supermarket Shopping Spree contest. You are given a shopping cart with capacity C. You are allowed to fill it with any items you want from Giant Eagle. Giant Eagle has items 1, 2, …, n, which have values v_1, v_2, …, v_n and sizes s_1, s_2, …, s_n. How do you (efficiently) maximize the value of the items in your cart?

BKP is not greedy The obvious greedy strategy of taking the maximum-value item that still fits in the cart does not work. Consider: suppose item i has size s_i = C and value v_i. It can happen that there are items j and k with combined size s_j + s_k ≤ C but v_j + v_k > v_i.

BKP: Greedy approach fails Knapsack capacity: 50 lbs. Item 1: $60, 10 lbs ($6/lb). Item 2: $100, 20 lbs ($5/lb). Item 3: $120, 30 lbs ($4/lb). Taking items greedily by dollars per pound gives items 1 and 2, worth $160; items 1 and 3 are worth $180; items 2 and 3 are worth $220 (optimal). BKP has optimal substructure, but not the greedy-choice property: the optimal solution does not contain the greedy choice.
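Because n = 3, the claim is easy to check by brute force. The snippet below (my own illustration, not from the slides) compares the greedy-by-density total against the best of all 2^3 subsets for the numbers above:

    public class BinaryKnapsackCounterexample {
        public static void main(String[] args) {
            int[] value  = {60, 100, 120};     // dollars
            int[] weight = {10,  20,  30};     // pounds
            int capacity = 50;                 // maximum weight in pounds

            // Greedy by dollars/pound takes item 1 ($6/lb) and item 2 ($5/lb);
            // item 3 no longer fits, because items must be taken whole.
            int greedy = value[0] + value[1];  // $160

            // Brute force over all subsets of the three items.
            int best = 0;
            for (int mask = 0; mask < (1 << value.length); mask++) {
                int v = 0, w = 0;
                for (int i = 0; i < value.length; i++) {
                    if ((mask & (1 << i)) != 0) { v += value[i]; w += weight[i]; }
                }
                if (w <= capacity) best = Math.max(best, v);
            }
            System.out.println("greedy = $" + greedy + ", optimal = $" + best);  // greedy = $160, optimal = $220
        }
    }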

A question for a future lecture… How can we (efficiently) solve the binary knapsack problem? One possible approach:  Dynamic programming

Machine Scheduling

Optimal machine scheduling We are given n tasks and an infinite supply of machines to perform them. Each task t_i = [s_i, f_i] has start time s_i and finish time f_i. An assignment of tasks to machines is feasible if no machine is assigned two overlapping tasks. An assignment is optimal if it is feasible and uses the minimal number of machines.

Example Tasks a, b, c, d, e, f, g, each with a start time and a finish time. Can you invent a greedy algorithm to find an optimal schedule for these tasks?
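One well-known greedy answer, offered here only as a hint since the slide leaves the question open: process the tasks in order of start time, reuse any machine whose current task has already finished, and open a new machine only when none is free. A minimal Java sketch with hypothetical task times (the data and names are illustrative; a task starting exactly when another finishes is assumed not to overlap):

    import java.util.Arrays;
    import java.util.PriorityQueue;

    public class MachineScheduling {
        // Each task is {start, finish}. Returns the number of machines the greedy rule uses,
        // which is also the minimum possible (it equals the maximum number of overlapping tasks).
        static int machinesNeeded(int[][] tasks) {
            Arrays.sort(tasks, (a, b) -> Integer.compare(a[0], b[0]));   // sort by start time
            PriorityQueue<Integer> finishTimes = new PriorityQueue<>();  // earliest-finishing machine on top
            for (int[] t : tasks) {
                if (!finishTimes.isEmpty() && finishTimes.peek() <= t[0]) {
                    finishTimes.poll();      // that machine is free again: reuse it
                }
                finishTimes.add(t[1]);       // the machine (reused or new) is busy until t finishes
            }
            return finishTimes.size();       // one entry per machine in use
        }

        public static void main(String[] args) {
            // Hypothetical tasks a..g as {start, finish}; not the slide's actual numbers.
            int[][] tasks = { {1, 4}, {1, 3}, {2, 5}, {3, 7}, {4, 7}, {6, 9}, {7, 8} };
            System.out.println(machinesNeeded(tasks));   // 3
        }
    }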

Succeeding with greed 3 ingredients needed: Optimization problem. Proceed in stages. Greedy-choice property: A greedy choice will lead to a globally optimal solution.