Lecture 7: Greedy Algorithms II Shang-Hua Teng

Greedy algorithms
A greedy algorithm always makes the choice that looks best at the moment
–My everyday examples: driving in Los Angeles, NY, or Boston for that matter; playing cards; investing in stocks; choosing a university
–The hope: a locally optimal choice will lead to a globally optimal solution
–For some problems it works, and greedy algorithms tend to be easier to code

An Activity Selection Problem (Conference Scheduling Problem)
Input: a set of activities S = {a_1, …, a_n}
Each activity has a start time and a finish time
–a_i = (s_i, f_i)
Two activities are compatible if and only if their intervals do not overlap
Output: a maximum-size subset of mutually compatible activities

The Activity Selection Problem
Here is a set of start and finish times
What is the maximum number of activities that can be completed?
{a_3, a_9, a_11} can be completed
But so can {a_1, a_4, a_8, a_11}, which is a larger set
And it is not unique: consider {a_2, a_4, a_9, a_11}

Interval Representation

Early Finish Greedy
Select the activity with the earliest finish
Eliminate the activities that could not be scheduled
Repeat!

Assuming activities are sorted by finish time
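The pseudocode that originally accompanied this slide did not survive the transcript. Below is a minimal Python sketch of the early-finish greedy selector under that same assumption (input already sorted by finish time); the function name and the sample activity table are illustrative assumptions in the style of the CLRS example, not reproduced from these slides.

def greedy_activity_selector(activities):
    """Early-finish greedy: activities is a list of (start, finish) pairs,
    assumed to be already sorted by finish time."""
    selected = []
    last_finish = float("-inf")
    for start, finish in activities:
        # Greedy choice: take the activity if it starts no earlier than
        # the finish time of the previously selected activity.
        if start >= last_finish:
            selected.append((start, finish))
            last_finish = finish
    return selected

# Example data (assumed: the standard CLRS-style activity table, not shown in the slides).
activities = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
              (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]
print(greedy_activity_selector(activities))
# -> [(1, 4), (5, 7), (8, 11), (12, 16)], i.e. {a_1, a_4, a_8, a_11}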

Why is it Greedy?
Greedy in the sense that it leaves as much opportunity as possible for the remaining activities to be scheduled
The greedy choice is the one that maximizes the amount of unscheduled time remaining

Why is this Algorithm Optimal?
We will show that this algorithm has the following properties:
–The problem has the optimal-substructure property
–The algorithm satisfies the greedy-choice property
Thus, it is optimal

Greedy-Choice Property
Show there is an optimal solution that begins with a greedy choice (with activity 1, which has the earliest finish time)
Suppose A ⊆ S is an optimal solution
–Order the activities in A by finish time. The first activity in A is k
If k = 1, the schedule A begins with a greedy choice
If k ≠ 1, show that there is an optimal solution B to S that begins with the greedy choice, activity 1
–Let B = (A – {k}) ∪ {1}
f_1 ≤ f_k ⇒ the activities in B are disjoint (compatible)
B has the same number of activities as A
Thus, B is optimal

Optimal Substructures
–Once the greedy choice of activity 1 is made, the problem reduces to finding an optimal solution for the activity-selection problem over those activities in S that are compatible with activity 1
Optimal substructure: if A is optimal to S, then A' = A – {1} is optimal to S' = {i ∈ S : s_i ≥ f_1}
Why?
–If we could find a solution B' to S' with more activities than A', adding activity 1 to B' would yield a solution B to S with more activities than A, contradicting the optimality of A
–After each greedy choice is made, we are left with an optimization problem of the same form as the original problem
By induction on the number of choices made, making the greedy choice at every step produces an optimal solution

Optimal substructures
Sort the activities according to finish time, and define the subset S_ij = {a_k ∈ S : f_i ≤ s_k and f_k ≤ s_j}, the activities that can start after a_i finishes and finish before a_j starts
Let c[i, j] be the maximal number of mutually compatible activities in S_ij
We can solve this using dynamic programming, but a simpler approach exists
The recurrence relation for c[i, j] is
c[i, j] = 0 if S_ij = ∅
c[i, j] = max{ c[i, k] + c[k, j] + 1 : a_k ∈ S_ij } otherwise
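For contrast with the greedy solution above, here is a hedged sketch of evaluating that recurrence directly by memoized recursion; the sentinel activities a_0 and a_(n+1) and the helper names are assumptions for illustration, not taken from the slides.

from functools import lru_cache

def dp_activity_count(starts, finishes):
    """Evaluate c[i, j] = max over a_k in S_ij of c[i, k] + c[k, j] + 1
    via memoized recursion. Sentinels a_0 (finish -inf) and a_(n+1)
    (start +inf) are added so that the answer is c[0, n+1]."""
    n = len(starts)
    s = [float("-inf")] + list(starts) + [float("inf")]
    f = [float("-inf")] + list(finishes) + [float("inf")]

    @lru_cache(maxsize=None)
    def c(i, j):
        best = 0
        for k in range(i + 1, j):
            # a_k belongs to S_ij if it starts after a_i finishes
            # and finishes before a_j starts.
            if s[k] >= f[i] and f[k] <= s[j]:
                best = max(best, c(i, k) + c(k, j) + 1)
        return best

    return c(0, n + 1)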

Elements of Greedy Strategy
A greedy algorithm makes a sequence of choices; at each step, the choice that seems best at the moment is chosen
–It does NOT always produce an optimal solution
Two ingredients are exhibited by most problems that lend themselves to a greedy strategy
–Greedy-choice property
–Optimal substructure

Greedy-Choice Property
A globally optimal solution can be arrived at by making a locally optimal (greedy) choice
–Make whatever choice seems best at the moment and then solve the sub-problem arising after the choice is made
–The choice made by a greedy algorithm may depend on the choices made so far, but it cannot depend on any future choices or on the solutions to sub-problems
Greedy algorithms usually progress in a top-down fashion, making one greedy choice after another and iteratively reducing each given problem instance to a smaller one
Of course, we must prove that a greedy choice at each step yields a globally optimal solution

Optimal Substructures
A problem exhibits optimal substructure if an optimal solution to the problem contains within it optimal solutions to sub-problems
–If an optimal solution A to S begins with activity 1, then A' = A – {1} is optimal to S' = {i ∈ S : s_i ≥ f_1}

Knapsack Problem
You want to pack n items in your luggage
–The ith item is worth v_i dollars and weighs w_i pounds
–Take as valuable a load as possible, but the total weight cannot exceed W pounds
–v_i, w_i, W are integers
0-1 knapsack: each item is either taken or not taken
Fractional knapsack: fractions of items can be taken
Both exhibit the optimal-substructure property
–0-1: consider an optimal solution. If item j is removed from the load, the remaining load must be the most valuable load weighing at most W – w_j that can be taken from the other n – 1 items
–Fractional: if weight w of item j is removed from the optimal load, the remaining load must be the most valuable load weighing at most W – w that can be taken from the other n – 1 items plus w_j – w pounds of item j

Greedy Algorithm for the Fractional Knapsack Problem
The fractional knapsack problem is solvable by the greedy strategy
–Compute the value per pound v_i/w_i for each item
–Obeying a greedy strategy, we take as much as possible of the item with the greatest value per pound
–If the supply of that item is exhausted and we can still carry more, we take as much as possible of the item with the next greatest value per pound, and so forth, until we cannot carry any more
–O(n lg n) time (we need to sort the items by value per pound)
–Greedy algorithm? See the sketch below
–Correctness?
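As one possible answer to the "Greedy algorithm?" prompt above, here is a minimal Python sketch of this strategy, assuming items are given as (value, weight) pairs; the function name and interface are illustrative assumptions, not from the slides.

def fractional_knapsack(items, W):
    """items: list of (value, weight) pairs; W: capacity in pounds.
    Greedy strategy: take items in decreasing order of value per pound,
    taking a fraction of the last item if needed. O(n lg n) for the sort."""
    # Sort by value per pound, largest first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total_value = 0.0
    remaining = W
    for value, weight in items:
        if remaining <= 0:
            break
        take = min(weight, remaining)           # whole item, or a fraction of it
        total_value += value * (take / weight)
        remaining -= take
    return total_value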

0-1 Knapsack is Harder!
The 0-1 knapsack problem cannot be solved by the greedy strategy
–The greedy choice may be unable to fill the knapsack to capacity, and the empty space lowers the effective value per pound of the load
–We must compare the solution to the sub-problem in which the item is included with the solution to the sub-problem in which the item is excluded before we can make the choice
–Dynamic programming (a sketch follows below)
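The slide points to dynamic programming as the way out; a minimal bottom-up sketch of the textbook 0-1 knapsack DP is given below, assuming integer weights and capacity W as stated on the Knapsack slide. The table layout is an assumption for illustration, not something given in these slides.

def knapsack_01(values, weights, W):
    """0-1 knapsack by dynamic programming: dp[w] is the best value
    achievable with capacity w using the items considered so far.
    Runs in O(n * W) time and O(W) space; assumes integer weights and W."""
    dp = [0] * (W + 1)
    for v, wt in zip(values, weights):
        # Iterate capacities downward so each item is used at most once.
        for w in range(W, wt - 1, -1):
            dp[w] = max(dp[w], dp[w - wt] + v)
    return dp[W]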

Next Week
Ben Hescott will be the guest lecturer and will present dynamic programming
I will be away at a conference
Both Ben and the TF will offer extra office hours to cover for me. TF: T 12:30–2:00 and TR 9:30–11:00am