 Review: The Greedy Method


§1 Greedy Algorithms

1. Review: The Greedy Method

Make the best decision at each stage, under some greedy criterion. A decision made in one stage is never changed in a later stage, so each decision must preserve feasibility.

2. A Simple Scheduling Problem

The Single Processor Case: Given N jobs j1, j2, …, jN, their running times t1, t2, …, tN, and one processor, schedule the jobs to minimize the average completion time. /* assume nonpreemptive scheduling */

〖Example〗
    job     j1   j2   j3   j4
    time    15    8    3   10

Schedule 1 (given order): j1 finishes at 15, j2 at 23, j3 at 26, j4 at 36.
    Tavg = ( 15 + 23 + 26 + 36 ) / 4 = 25

Schedule 2 (shortest job first): j3 finishes at 3, j2 at 11, j4 at 21, j1 at 36.
    Tavg = ( 3 + 11 + 21 + 36 ) / 4 = 17.75

In general, if the jobs run in the order ji1, ji2, ji3, …, then the completion times are ti1, ti1 + ti2, ti1 + ti2 + ti3, …

Discussion 21: What is the total cost? How can we be "greedy"?
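The single-processor rule can be tried out directly. In this minimal sketch (the function name is mine, not from the slides) the greedy criterion is simply to run the shortest job first:

```python
def average_completion_time(times):
    """Run the jobs in the given order on one processor and
    return the average completion time (nonpreemptive)."""
    finish = total = 0
    for t in times:
        finish += t        # this job completes at the running sum
        total += finish
    return total / len(times)

times = [15, 8, 3, 10]                       # the example jobs j1..j4
s1 = average_completion_time(times)          # Schedule 1: given order
s2 = average_completion_time(sorted(times))  # Schedule 2: shortest job first
```

Here s1 is 25.0 and s2 is 17.75, matching the two schedules in the example.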

 The Multiprocessor Case – N jobs on P processors

〖Example〗 P = 3
    job     j1   j2   j3   j4   j5   j6   j7   j8   j9
    time     3    5    6   10   11   14   15   18   20

An optimal solution (completion times in parentheses):
    P1: j1 (3), j5 (14), j9 (34)
    P2: j2 (5), j4 (15), j7 (30)
    P3: j3 (6), j6 (20), j8 (38)

Another optimal solution:
    P1: j1 (3), j4 (13), j7 (28)
    P2: j2 (5), j5 (16), j8 (34)
    P3: j3 (6), j6 (20), j9 (40)

 Minimizing the Final Completion Time

An optimal solution (every processor finishes by time 34):
    P1: j1 (3), j3 (9), j4 (19), j7 (34)
    P2: j2 (5), j5 (16), j8 (34)
    P3: j6 (14), j9 (34)

This version of the problem is NP-hard.
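A natural greedy rule for several processors (a sketch under my own naming, not spelled out on the slide) gives each job, shortest first, to the processor that becomes free earliest:

```python
import heapq

def greedy_multiprocessor(times, p):
    """Shortest job first on p processors: each job goes to the
    processor that becomes free earliest; returns the completion
    time of each job, in shortest-first order."""
    free = [0] * p               # next free time of each processor
    heapq.heapify(free)
    completions = []
    for t in sorted(times):
        start = heapq.heappop(free)
        heapq.heappush(free, start + t)
        completions.append(start + t)
    return completions

done = greedy_multiprocessor([3, 5, 6, 10, 11, 14, 15, 18, 20], 3)
```

On the nine example jobs this reproduces one of the optimal average-completion-time schedules; it does not, however, minimize the final completion time, which is NP-hard.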

 Flow Shop Scheduling – a simple case with 2 processors

Consider a machine shop that has 2 identical processors P1 and P2. We have N jobs J1, ... , JN that need to be processed, and each job Ji can be broken into 2 tasks ji1 and ji2. A schedule is an assignment of jobs to time intervals on machines such that
 jij must be processed by Pj and the processing time is tij ;
 no machine processes more than one job at any time;
 ji2 may not be started before ji1 is finished.

The goal is to construct a minimum-completion-time 2-machine schedule for a given set of N jobs. Let ai = ti1 ≥ 0 and bi = ti2.

〖Example〗 Given N = 4, T1234 = 40.

Discussion 22: What if ai = 0?

【Proposition】An optimal schedule exists in which min { bi , aj } ≥ min { bj , ai } for every pair of adjacent jobs Ji and Jj. All the schedules with this property have the same completion time.

Algorithm
{   Sort { a1 , ... , aN , b1 , ... , bN } into a non-decreasing sequence ;
    m = minimum element ;
    do {
        if ( ( m == ai ) && ( Ji is not in the schedule ) )
            Place Ji at the left-most empty position ;
        else if ( ( m == bj ) && ( Jj is not in the schedule ) )
            Place Jj at the right-most empty position ;
        m = next minimum element ;
    } while ( m ) ;
}

T = O( N log N )

〖Example〗 Given N = 4.

Discussion 23: What is the optimal solution?
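The ordering rule in the algorithm is Johnson's rule for the two-machine flow shop. A sketch follows; since the slide's task-time table is not reproduced here, the data below is made up for illustration:

```python
def johnson_order(a, b):
    """Scan the multiset {a_i} and {b_i} in non-decreasing order;
    when the minimum is an a_i, job i goes into the left-most empty
    slot, and when it is a b_j, job j goes into the right-most one."""
    n = len(a)
    keys = sorted([(a[i], i, 'a') for i in range(n)] +
                  [(b[i], i, 'b') for i in range(n)])
    order = [None] * n
    left, right = 0, n - 1
    scheduled = [False] * n
    for _, i, side in keys:
        if scheduled[i]:
            continue
        scheduled[i] = True
        if side == 'a':
            order[left] = i
            left += 1
        else:
            order[right] = i
            right -= 1
    return order

def makespan(order, a, b):
    """Completion time of a given job order: task i2 starts only
    after task i1 finishes and P2 is free."""
    t1 = t2 = 0
    for i in order:
        t1 += a[i]                 # P1 runs the first tasks back to back
        t2 = max(t2, t1) + b[i]    # P2 waits for P2 and for task i1
    return t2

a, b = [3, 5, 1, 7], [6, 2, 2, 5]   # hypothetical task times
order = johnson_order(a, b)
```

On this data the algorithm produces the order (J3, J1, J4, J2) with completion time 18.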

3. Approximate Bin Packing

 The Knapsack Problem

A knapsack with a capacity M is to be packed. Given N items, each item i has a weight wi and a profit pi. If xi is the percentage of item i being packed, then the packed profit will be pi xi. An optimal packing is a feasible one with maximum profit; that is, we are supposed to find the values of xi such that Σ pi xi obtains its maximum under the constraints Σ wi xi ≤ M and 0 ≤ xi ≤ 1.

Discussion 24: n = 3, M = 20, (p1, p2, p3) = (25, 24, 15), (w1, w2, w3) = (18, 15, 10). The solution is...?

Q: What must we do in each stage?
A: Pack one item into the knapsack.
Q: On which criterion shall we be greedy?
 maximum profit
 minimum weight
 maximum profit density pi / wi

Sunny Cup 2004: http://acm.zju.edu.cn/onlinejudge/showProblem.do?problemCode=2109
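The profit-density criterion is the one that works for this fractional knapsack. A minimal sketch, run on the Discussion 24 data:

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy on profit density p_i / w_i: take the densest items
    first, splitting the last item if it does not fit entirely."""
    items = sorted(zip(profits, weights),
                   key=lambda pw: pw[0] / pw[1], reverse=True)
    total = 0.0
    for p, w in items:
        if capacity <= 0:
            break
        take = min(w, capacity)     # fraction x_i = take / w
        total += p * take / w
        capacity -= take
    return total

best = fractional_knapsack([25, 24, 15], [18, 15, 10], 20)
```

This packs all of item 2 and half of item 3 for a profit of 31.5; greedy on maximum profit alone (28.2) or minimum weight alone (31.0) both do worse here.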

 The Bin Packing Problem

Given N items of sizes S1 , S2 , …, SN , such that 0 < Si ≤ 1 for all 1 ≤ i ≤ N, pack these items in the fewest number of bins, each of which has unit capacity. The problem is NP-hard.

〖Example〗 N = 7; Si = 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8
An optimal packing uses 3 bins:
    B1: 0.8 + 0.2    B2: 0.7 + 0.3    B3: 0.1 + 0.5 + 0.4

 On-line Algorithms

Place each item before processing the next one; a decision, once made, can NOT be changed.

〖Example〗 Si = 0.4 , 0.4 , 0.6 , 0.6

You never know when the input might end, hence an on-line algorithm cannot always give an optimal solution: packing the two 0.4's together forces 3 bins on the full input, whose optimum is 2 bins { 0.4, 0.6 } and { 0.4, 0.6 }; packing them separately wastes a bin if the input stops after the two 0.4's.

【Theorem】There are inputs that force any on-line bin-packing algorithm to use at least 4/3 the optimal number of bins.

 Next Fit

void NextFit ( )
{   read item1;
    while ( read item2 ) {
        if ( item2 can be packed in the same bin as item1 )
            place item2 in the bin;
        else
            create a new bin for item2;
        item1 = item2;
    }  /* end-while */
}

【Theorem】Let M be the optimal number of bins required to pack a list I of items. Then next fit never uses more than 2M bins. There exist sequences such that next fit uses 2M – 2 bins.
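A runnable version of next fit (the list-of-totals representation and the small epsilon for floating-point round-off are my additions):

```python
def next_fit(sizes):
    """Keep only the current bin open: pack the item there if it
    fits, otherwise close it and open a new bin."""
    bins = []                                    # running total of each bin
    for s in sizes:
        if bins and bins[-1] + s <= 1 + 1e-9:    # epsilon for float error
            bins[-1] += s
        else:
            bins.append(s)
    return len(bins)

n = next_fit([0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8])
```

On the seven-item example this uses 5 bins, versus the optimal 3.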

 First Fit

void FirstFit ( )
{   while ( read item ) {
        scan for the first bin that is large enough for item;
        if ( found )
            place item in that bin;
        else
            create a new bin for item;
    }  /* end-while */
}

First fit can be implemented in O( N log N ) time.

【Theorem】Let M be the optimal number of bins required to pack a list I of items. Then first fit never uses more than 17M / 10 bins. There exist sequences such that first fit uses 17(M – 1) / 10 bins.

 Best Fit

Place a new item in the tightest spot among all bins. T = O( N log N ), and the number of bins used is less than 1.7M.
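First fit and best fit can be sketched the same way. The naive O( N² ) scan is shown here for clarity; the O( N log N ) versions need a balanced search tree over remaining bin capacities:

```python
def first_fit(sizes):
    """Place each item into the first bin that still has room."""
    bins = []
    for s in sizes:
        for i, used in enumerate(bins):
            if used + s <= 1 + 1e-9:      # epsilon for float error
                bins[i] += s
                break
        else:                             # no existing bin fits
            bins.append(s)
    return len(bins)

def best_fit(sizes):
    """Place each item into the fullest bin that still has room."""
    bins = []
    for s in sizes:
        fits = [i for i, used in enumerate(bins) if used + s <= 1 + 1e-9]
        if fits:
            tightest = max(fits, key=lambda i: bins[i])
            bins[tightest] += s
        else:
            bins.append(s)
    return len(bins)

items = [0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8]
ff, bf = first_fit(items), best_fit(items)
```

Both use 4 bins on this input, one better than next fit but still above the optimal 3.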

〖Example〗 Si = 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8 under Next Fit, First Fit, and Best Fit.
Discussion 25: Please show the results.

〖Example〗 Si = 1/7+, 1/7+, 1/7+, 1/7+, 1/7+, 1/7+, 1/3+, 1/3+, 1/3+, 1/3+, 1/3+, 1/3+, 1/2+, 1/2+, 1/2+, 1/2+, 1/2+, 1/2+, where  = 0.001.
The optimal solution requires 6 bins. However, all three on-line algorithms require 10 bins.

 Off-line Algorithms

View the entire item list before producing an answer.

Trouble-maker: the large items.
Solution: sort the items into a non-increasing sequence of sizes, then apply first (or best) fit. This is first (or best) fit decreasing.

〖Example〗 Si = 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8, sorted into 0.8, 0.7, 0.5, 0.4, 0.3, 0.2, 0.1.
First fit decreasing packs: B1: 0.8 + 0.2; B2: 0.7 + 0.3; B3: 0.5 + 0.4 + 0.1.

【Theorem】Let M be the optimal number of bins required to pack a list I of items. Then first fit decreasing never uses more than 11M / 9 + 4 bins. There exist sequences such that first fit decreasing uses 11M / 9 bins.

Simple greedy heuristics can give good results.
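First fit decreasing is then a one-line change to the first-fit sketch:

```python
def first_fit_decreasing(sizes):
    """Sort into non-increasing order, then apply first fit."""
    bins = []
    for s in sorted(sizes, reverse=True):
        for i, used in enumerate(bins):
            if used + s <= 1 + 1e-9:      # epsilon for float round-off
                bins[i] += s
                break
        else:
            bins.append(s)
    return len(bins)

m = first_fit_decreasing([0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8])
```

Here m is 3, which happens to be optimal for this input (though FFD is not optimal in general).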

Research Project 12: Bin Packing Heuristics (25)

This project requires you to implement and compare the performance (both in time and in the number of bins used) of the various bin packing heuristics, including the on-line heuristics (next fit, first fit, best fit) and the off-line first-fit decreasing algorithm. Detailed requirements can be downloaded from http://acm.zju.edu.cn/dsaa/

§2 Divide and Conquer

Divide: smaller problems are solved recursively (except base cases).
Conquer: the solution to the original problem is then formed from the solutions to the subproblems.

Cases solved by divide and conquer:
 The maximum subsequence sum – the O( N log N ) solution
 Tree traversals – O( N )
 Mergesort and quicksort – O( N log N )

Note: Divide and conquer makes at least two recursive calls, and the subproblems are disjoint.

1. Running Time of Divide and Conquer Algorithms

【Theorem】The solution to the equation T(N) = a T(N / b) + Θ( N^k log^p N ), where a ≥ 1, b > 1, and p ≥ 0, is
    T(N) = O( N^(logb a) )         if a > b^k ;
    T(N) = O( N^k log^(p+1) N )    if a = b^k ;
    T(N) = O( N^k log^p N )        if a < b^k .

〖Example〗 Mergesort has a = b = 2, p = 0 and k = 1, so a = b^k and T = O( N log N ).

〖Example〗 Divide with a = 3 and b = 2 for each recursion; conquer in O( N ), that is, k = 1 and p = 0. Then a > b^k and T = O( N^(log2 3) ) = O( N^1.59 ). If the conquer step takes O( N^2 ), then a < b^k and T = O( N^2 ).
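The second example can be checked numerically. This sketch evaluates the recurrence directly and confirms that doubling N roughly triples T(N), as a growth rate of N^(log2 3) predicts:

```python
def T(n):
    """T(n) = 3 T(n/2) + n with T(1) = 1: here a = 3, b = 2, k = 1,
    p = 0, so a > b^k and the theorem gives T(n) = O(n^(log2 3))."""
    return 1 if n <= 1 else 3 * T(n // 2) + n

# Once the n^(log2 3) term dominates, T(2n) / T(n) approaches 3.
ratio = T(2 ** 11) / T(2 ** 10)
```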

2. Closest Points Problem

Given N points in a plane, find the closest pair of points. (If two points have the same position, then that pair is the closest, with distance 0.)

 Simple Exhaustive Search: check N ( N – 1 ) / 2 pairs of points, so T = O( N^2 ).

 Divide and Conquer – similar to the maximum subsequence sum problem.
〖Example〗 Sort according to x-coordinates and divide; conquer by forming a solution from the left part, the right part, and the pairs that cross the dividing line.

It is so simple, and we clearly have an O( N log N ) algorithm: just like finding the max subsequence sum, we have a = b = 2. But how about k? Can you find the cross distance in linear time? It is O( N log N ) all right, but is it really clearly so?

 δ-strip: let δ be the smaller of the two distances found in the left and right halves, and consider the points within distance δ of the dividing line. If NumPointsInStrip = N, we have

    /* points are all in the strip */
    for ( i = 0; i < NumPointsInStrip; i++ )
        for ( j = i + 1; j < NumPointsInStrip; j++ )
            if ( Dist( Pi , Pj ) < δ )
                δ = Dist( Pi , Pj );

Discussion 26: What is the worst case? What can we do then?
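A complete divide-and-conquer closest pair, keeping the strip in y order so the inner loop breaks early, can be sketched as follows (distinct points are assumed; the names are mine):

```python
import math

def brute(points):
    """Exhaustive search, used for the tiny base cases."""
    return min(math.dist(p, q)
               for i, p in enumerate(points) for q in points[i + 1:])

def _closest(px, py):
    """px: points sorted by x; py: the same points sorted by y."""
    n = len(px)
    if n <= 3:
        return brute(px)
    mid = n // 2
    median = px[mid]
    # Points are distinct, so the lexicographic split matches px[:mid].
    ly = [p for p in py if p < median]
    ry = [p for p in py if p >= median]
    d = min(_closest(px[:mid], ly), _closest(px[mid:], ry))
    # Strip around the dividing line, already in y order.
    strip = [p for p in py if abs(p[0] - median[0]) < d]
    for i, p in enumerate(strip):
        for q in strip[i + 1:]:
            if q[1] - p[1] >= d:      # no closer point further up
                break
            d = min(d, math.dist(p, q))
    return d

def closest_pair_distance(points):
    return _closest(sorted(points), sorted(points, key=lambda p: p[1]))

d = closest_pair_distance([(0, 0), (5, 5), (1, 1), (9, 0), (4, 6), (7, 2)])
```

The early break is what keeps the strip pass linear in practice: each point is compared only against neighbors within δ in the y direction.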

Research Project 13: Quoit Design (25)

Have you ever played quoit in a playground? Quoit is a game in which flat rings are pitched at some toys, and all the toys encircled are awarded. In the field of Cyberground, the position of each toy is fixed, and the ring is carefully designed so that it can only encircle one toy at a time. On the other hand, to make the game look more attractive, the ring is designed to have the largest radius. Given a configuration of the field, you are supposed to find the radius of such a ring. Detailed requirements can be downloaded from http://acm.zju.edu.cn/dsaa/