# Review: The Greedy Method


§1 Greedy Algorithms  Review: The Greedy Method Make the best decision at each stage, under some greedy criterion. A decision made in one stage is not changed in a later stage, so each decision should assure feasibility. 2. A Simple Scheduling Problem  The Single Processor Case Given N jobs j1 , j2 , …, jN , their running times t1 , t2 , …, tN , and one processor. Schedule jobs to minimize the average completion time. /* assume nonpreemptive scheduling */ 1/19

〖Example〗

| job  | j1 | j2 | j3 | j4 |
|------|----|----|----|----|
| time | 15 | 8  | 3  | 10 |

Schedule 1 ( j1, j2, j3, j4 ): the jobs complete at times 15, 23, 26, 36.
Tavg = ( 15 + 23 + 26 + 36 ) / 4 = 25

Schedule 2 ( j3, j2, j4, j1 ): the jobs complete at times 3, 11, 21, 36.
Tavg = ( 3 + 11 + 21 + 36 ) / 4 = 17.75

In general, if the jobs run in the order ji1, ji2, ji3, …, their completion times are ti1, ti1 + ti2, ti1 + ti2 + ti3, …

Discussion 21: What is the total cost? How can we be "greedy"?
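The example points to a greedy rule: always run the shortest remaining job first. A minimal runnable sketch (our own code and naming, not the slide's):

```python
# Greedy single-processor scheduling: shortest job first minimizes the
# average completion time.
def average_completion_time(times):
    finished, total = 0, 0
    for t in sorted(times):        # greedy criterion: shortest job first
        finished += t              # this job's completion time
        total += finished          # accumulate all completion times
    return total / len(times)
```

For the jobs above, average_completion_time([15, 8, 3, 10]) reproduces Schedule 2's average of 17.75.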

§1 Greedy Algorithms  The Multiprocessor Case – N jobs on P processors 〖Example〗 P = 3 job time j1 j2 j3 j4 3 5 6 10 j5 j6 j7 j8 11 14 15 18 j9 20 An Optimal Solution Another Optimal Solution j1 3 j2 5 j3 6 j4 15 j5 14 j6 20 j7 30 j8 38 j9 34 j1 3 j4 13 j7 28 j2 5 j5 16 j8 34 j3 6 j6 20 j9 40  Minimizing the Final Completion Time An Optimal Solution j1 3 j2 5 j3 9 j4 19 j5 16 j6 14 j7 j8 34 j9 NP Hard 3/19

§1 Greedy Algorithms  Flow Shop Scheduling – a simple case with 2 processors Consider a machine shop that has 2 identical processors P1 and P2 . We have N jobs J1, ... , JN that need to be processed. Each job Ji can be broken into 2 tasks ji1 and ji2 . A schedule is an assignment of jobs to time intervals on machines such that  jij must be processed by Pj and the processing time is tij .  No machine processes more than one job at any time.  ji2 may not be started before ji1 is finished. 〖Example〗 Given N = 4, T1234 = ? 40 Construct a minimum-completion-time 2 machine schedule for a given set of N jobs. Let ai = ti1  0, and bi = ti2 . Discussion 22: What if ai = 0? 4/19

【Proposition】An optimal schedule exists if min { bi , aj } ≥ min { bj , ai } for any pair of adjacent jobs Ji and Jj (with Ji scheduled before Jj). All the schedules with this property have the same completion time.

```c
Algorithm
{   Sort { a1, ..., aN, b1, ..., bN } into a non-decreasing sequence;
    m = minimum element;
    do {
        if ( ( m == ai ) && ( Ji is not in the schedule ) )
            Place Ji at the left-most empty position;
        else if ( ( m == bj ) && ( Jj is not in the schedule ) )
            Place Jj at the right-most empty position;
        m = next minimum element;
    } while ( m );
}
```

T = O( N log N )

〖Example〗 Given N = 4. Discussion 23: What is the optimal solution?
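The algorithm above can be rendered as runnable code (our own naming; the task times in the usage note are hypothetical, since the slide's table is not reproduced here):

```python
# Flow shop scheduling on 2 machines: sort all 2N task times; repeatedly take
# the minimum, placing job Ji at the left-most empty position if the time is
# a_i, and at the right-most empty position if it is b_i. Jobs already in the
# schedule are skipped.
def flow_shop_order(a, b):
    n = len(a)
    slots = [None] * n
    left, right = 0, n - 1
    # (time, job, machine) triples in non-decreasing order of time
    events = sorted((t, i, m) for m, ts in enumerate((a, b))
                              for i, t in enumerate(ts))
    placed = set()
    for t, i, m in events:
        if i in placed:
            continue
        if m == 0:                 # minimum is a_i: left-most empty position
            slots[left] = i
            left += 1
        else:                      # minimum is b_i: right-most empty position
            slots[right] = i
            right -= 1
        placed.add(i)
    return slots
```

For instance, with hypothetical times a = [3, 5, 1, 7] and b = [6, 2, 2, 5], the resulting order (0-based job indices) is [2, 0, 3, 1]: J3 runs first, J2 last.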

3. Approximate Bin Packing

 The Knapsack Problem
A knapsack with a capacity M is to be packed. Given N items, each item i has a weight wi and a profit pi. If xi is the percentage of item i being packed, then the packed profit will be pi xi.

An optimal packing is a feasible one with maximum profit. That is, we are supposed to find the values of xi such that Σ pi xi obtains its maximum under the constraints Σ wi xi ≤ M and 0 ≤ xi ≤ 1 for 1 ≤ i ≤ N.

Discussion 24: n = 3, M = 20, (p1, p2, p3) = (25, 24, 15), (w1, w2, w3) = (18, 15, 10). Solution is...?

Q: What must we do in each stage?
A: Pack one item into the knapsack.

Q: On which criterion shall we be greedy?
 maximum profit
 minimum weight
 maximum profit density pi / wi
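Greedy packing by maximum profit density can be sketched as follows (our own code, not the slide's; for this fractional problem the density criterion is in fact optimal):

```python
# Fractional knapsack: take items in order of profit density p_i / w_i,
# splitting the last item to fill the remaining capacity exactly.
def knapsack(profits, weights, capacity):
    items = sorted(zip(profits, weights),
                   key=lambda pw: pw[0] / pw[1], reverse=True)
    total = 0.0
    for p, w in items:
        if capacity <= 0:
            break
        take = min(w, capacity)    # whole item, or the fraction that fits
        total += p * (take / w)
        capacity -= take
    return total
```

For Discussion 24's data, knapsack([25, 24, 15], [18, 15, 10], 20) packs all of item 2 and then half of item 3.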

§1 Greedy Algorithms  The Bin Packing Problem Given N items of sizes S1 , S2 , …, SN , such that 0 < Si  1 for all 1  i  N . Pack these items in the fewest number of bins, each of which has unit capacity. 〖Example〗N = 7; Si = 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8 0.8 0.2 B1 0.7 0.3 B2 0.1 0.5 0.4 B3 NP Hard An Optimal Packing 7/19

§1 Greedy Algorithms  On-line Algorithms Place an item before processing the next one, and can NOT change decision. 〖Example〗Si = 0.4 , 0.4 , 0.6 , 0.6 You never know when the input might end. Hence an on-line algorithm cannot always give an optimal solution. 0.4 0.6 0.6 0.4 【Theorem】There are inputs that force any on-line bin-packing algorithm to use at least 4/3 the optimal number of bins. 8/19

§1 Greedy Algorithms  Next Fit void NextFit ( ) { read item1; while ( read item2 ) { if ( item2 can be packed in the same bin as item1 ) place item2 in the bin; else create a new bin for item2; item1 = item2; } /* end-while */ } 【Theorem】Let M be the optimal number of bins required to pack a list I of items. Then next fit never uses more than 2M bins. There exist sequences such that next fit uses 2M – 2 bins. 9/19

§1 Greedy Algorithms  First Fit void FirstFit ( ) { while ( read item ) { scan for the first bin that is large enough for item; if ( found ) place item in that bin; else create a new bin for item; } /* end-while */ } Can be implemented in O( N log N ) 【Theorem】Let M be the optimal number of bins required to pack a list I of items. Then first fit never uses more than 17M / 10 bins. There exist sequences such that first fit uses 17(M – 1) / 10 bins.  Best Fit Place a new item in the tightest spot among all bins. T = O( N log N ) and bin no. < 1.7M 10/19

〖Example〗 Si = 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8
Discussion 25: Please show the results of next fit, first fit, and best fit.

〖Example〗 Si = 1/7 + ε, 1/7 + ε, 1/7 + ε, 1/7 + ε, 1/7 + ε, 1/7 + ε, 1/3 + ε, 1/3 + ε, 1/3 + ε, 1/3 + ε, 1/3 + ε, 1/3 + ε, 1/2 + ε, 1/2 + ε, 1/2 + ε, 1/2 + ε, 1/2 + ε, 1/2 + ε, where ε > 0 is sufficiently small.
The optimal solution requires 6 bins (one item of each size per bin). However, all three on-line algorithms require 10 bins.

§1 Greedy Algorithms  Off-line Algorithms View the entire item list before producing an answer. Trouble-maker: The large items Solution: Sort the items into non-increasing sequence of sizes. Then apply first (or best) fit – first (or best) fit decreasing. 〖Example〗Si = 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8 0.8, 0.7, 0.5, 0.4, 0.3, 0.2, 0.1 0.8 0.2 0.7 0.3 0.5 0.1 【Theorem】Let M be the optimal number of bins required to pack a list I of items. Then first fit decreasing never uses more than 11M / bins. There exist sequences such that first fit decreasing uses 11M / 9 bins. 0.4 Simple greedy heuristics can give good results. 12/19

Research Project 12: Bin Packing Heuristics (25)
This project requires you to implement and compare the performance (both in time and in the number of bins used) of the various bin packing heuristics, including the on-line next fit, first fit, best fit, and first-fit decreasing algorithms. Detailed requirements can be downloaded from

§2 Divide and Conquer

Divide: Smaller problems are solved recursively (except base cases).
Conquer: The solution to the original problem is then formed from the solutions to the subproblems.

Cases solved by divide and conquer:
 The maximum subsequence sum – the O( N log N ) solution
 Tree traversals – O( N )
 Mergesort and quicksort – O( N log N )

Note: Divide and conquer makes at least two recursive calls, and the subproblems are disjoint.

1. Running Time of Divide and Conquer Algorithms
【Theorem】The solution to the equation T(N) = a T(N / b) + Θ( N^k log^p N ), where a ≥ 1, b > 1, and p ≥ 0, is

    T(N) = O( N^(log_b a) )       if a > b^k ;
    T(N) = O( N^k log^(p+1) N )   if a = b^k ;
    T(N) = O( N^k log^p N )       if a < b^k .

〖Example〗 Mergesort has a = b = 2, p = 0 and k = 1, so a = b^k and T = O( N log N ).

〖Example〗 Divide with a = 3 and b = 2 for each recursion; conquer in O( N ) – that is, k = 1 and p = 0. Since a > b^k, T = O( N^(log_2 3) ) = O( N^1.59 ). If the conquer step instead takes O( N^2 ), then k = 2, a < b^k, and T = O( N^2 ).

2. Closest Points Problem
Given N points in a plane, find the closest pair of points. (If two points have the same position, then that pair is the closest, with distance 0.)

 Simple Exhaustive Search
Check all N ( N – 1 ) / 2 pairs of points. T = O( N^2 ).

 Divide and Conquer – similar to the maximum subsequence sum problem
Sort according to x-coordinates and divide; conquer by forming a solution from the left half, the right half, and the pairs that cross the dividing line.
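The outline above can be sketched as follows (our own code; the handling of the strip around the dividing line is discussed on the next slides):

```python
import math

# Divide and conquer closest pair: split on the median x-coordinate, recurse
# on both halves, then check a 2*delta-wide strip around the dividing line.
def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def brute_force(pts):
    return min(dist(p, q) for i, p in enumerate(pts) for q in pts[i + 1:])

def closest_pair(points):
    def solve(px):                         # px is sorted by x-coordinate
        n = len(px)
        if n <= 3:
            return brute_force(px)
        mid = n // 2
        x_cut = px[mid][0]
        delta = min(solve(px[:mid]), solve(px[mid:]))
        # Only points within delta of the dividing line can form a closer
        # cross pair; scan them in order of y-coordinate.
        strip = sorted((p for p in px if abs(p[0] - x_cut) < delta),
                       key=lambda p: p[1])
        for i, p in enumerate(strip):
            for q in strip[i + 1:]:
                if q[1] - p[1] >= delta:
                    break                  # later points are too far in y
                delta = min(delta, dist(p, q))
        return delta
    return solve(sorted(points))
```

For example, closest_pair([(0, 0), (3, 4), (1, 1), (5, 0)]) returns sqrt(2), the distance between (0, 0) and (1, 1).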

Q: It is so simple – just like finding the maximum subsequence sum, we have a = b = 2, so we clearly have an O( N log N ) algorithm.
A: It is O( N log N ) all right. But is it really clearly so? How about k? Can you find the cross distance in linear time?

 δ-strip
If NumPointsInStrip = N /* the points are all in the strip */, we have:

```c
for ( i = 0; i < NumPointsInStrip; i++ )
    for ( j = i + 1; j < NumPointsInStrip; j++ )
        if ( Dist( Pi, Pj ) < δ )
            δ = Dist( Pi, Pj );
```

Discussion 26: What is the worst case? What can we do then?