1 Algorithm Design Methods (I) Fall 2003 CSE, POSTECH

2 Algorithm Design Methods – Greedy method – Divide and conquer – Dynamic programming – Backtracking – Branch and bound

3 Some Methods Not Covered – Linear programming – Integer programming – Simulated annealing – Neural networks – Genetic algorithms – Tabu search

4 Optimization Problem A problem in which some function (called the optimization/objective function) is to be optimized (usually minimized or maximized). The optimization is subject to some constraints.

5 Machine Scheduling Find a schedule that minimizes the finish time. – optimization function … finish time – constraints Each job is scheduled continuously on a single machine for an amount of time equal to its processing requirement. No machine processes more than one job at a time.

6 Bin Packing Pack items into bins using as few bins as possible. – optimization function … number of bins – constraints Each item is packed into a single bin. No bin's capacity is exceeded.

7 Min Cost Spanning Tree Find a spanning tree that has minimum cost. – optimization function … sum of edge costs – constraints Must select n-1 edges of the given n-vertex graph. The selected edges must form a tree.

8 Feasible and Optimal Solutions A feasible solution is a solution that satisfies the constraints. An optimal solution is a feasible solution that optimizes the objective/optimization function.

9 Greedy Method Solve problem by making a sequence of decisions. Decisions are made one by one in some order. Each decision is made using a greedy criterion. A decision, once made, is (usually) not changed later.

10 Machine Scheduling LPT (Longest Processing Time) scheduling: schedule jobs one by one in decreasing order of processing time. Each job is scheduled on the machine on which it finishes earliest. Scheduling decisions are made serially using a greedy criterion (minimize the finish time of this job). LPT scheduling is an application of the greedy method.

11 LPT Schedule The LPT rule does not guarantee minimum finish time schedules. (LPT Finish Time)/(Minimum Finish Time) <= 4/3 – 1/(3m), where m is the number of machines. Minimum finish time scheduling is NP-hard, so the greedy method does not guarantee an optimal schedule here. It does, however, give us a good heuristic for machine scheduling.
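
A minimal C++ sketch of the LPT heuristic described above, assuming the jobs are given as an array of processing times and m is the number of machines; the min-heap of machine finish times and the name lptFinishTime are implementation choices, not from the slides.

#include <algorithm>
#include <queue>
#include <vector>

// LPT heuristic: consider jobs in decreasing order of processing time and
// schedule each on the machine that currently finishes earliest.
// Returns the finish time (makespan) of the resulting schedule.
long long lptFinishTime(std::vector<long long> jobs, int m) {
    std::sort(jobs.begin(), jobs.end(), std::greater<long long>());
    // Min-heap of current machine finish times, one entry per machine.
    std::priority_queue<long long, std::vector<long long>,
                        std::greater<long long>> finish;
    for (int i = 0; i < m; i++) finish.push(0);
    long long makespan = 0;
    for (long long p : jobs) {
        long long t = finish.top(); finish.pop();
        t += p;                          // run the job on that machine
        finish.push(t);
        makespan = std::max(makespan, t);
    }
    return makespan;
}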

12 Container Loading Ship has capacity c. m containers are available for loading. The weight of container i is w_i. Each weight is a positive number. The sum of the container weights is > c. Load as many containers as possible without sinking the ship.

13 Greedy Solution Load containers in increasing order of weight until we get to a container that does not fit. Does this greedy algorithm always load the maximum number of containers? Yes. This may be proved by induction (see Theorem 13.1, p. 624 of the text).
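
A C++ sketch of this greedy rule, assuming the container weights are given in an array and c is the ship capacity; the name loadContainers is illustrative.

#include <algorithm>
#include <vector>

// Greedy container loading: take containers in increasing order of weight
// until the next one no longer fits. Returns the number of containers loaded.
int loadContainers(std::vector<int> w, int c) {
    std::sort(w.begin(), w.end());       // lightest containers first
    int loaded = 0, used = 0;
    for (int weight : w) {
        if (used + weight > c) break;    // next container does not fit
        used += weight;
        loaded++;
    }
    return loaded;
}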

14 Container Loading With 2 Ships Can all containers be loaded into 2 ships whose capacity is c (each)? – Same as bin packing with 2 bins (Are 2 bins sufficient for all items?) – Same as machine scheduling with 2 machines (Can all jobs be completed by 2 machines in c time units?) – NP-hard

15 0/1 Knapsack Problem A hiker wishes to take n items on a trip. The weight of item i is w_i. The knapsack has a weight capacity c. When the sum of the item weights is <= c, all n items can be carried in the knapsack. When the sum of the item weights is > c, some items must be left behind. Which items should be taken?

16 0/1 Knapsack Problem Hiker assigns a profit/value p_i to item i. All weights and profits are positive numbers. Hiker wants to select a subset of the n items to take. – The weight of the subset should not exceed the capacity of the knapsack. (constraint) – Cannot select a fraction of an item. (constraint) – The profit/value of the subset is the sum of the profits of the selected items. (optimization function) – The profit/value of the selected subset should be maximum. (optimization criterion)

17 0/1 Knapsack Problem Let x_i = 1 when item i is selected and let x_i = 0 when item i is not selected. maximize Σ(i=1..n) p_i x_i subject to Σ(i=1..n) w_i x_i <= c

18 Greedy Attempt 1 Be greedy on capacity utilization (select items in increasing order of weight). n = 2, c = 7, w = [3, 6], p = [2, 10]. Only 1 item is selected. Profit/value of the selection is 2. This is not the best selection: taking item 2 alone gives profit 10.

19 Greedy Attempt 2 Be greedy on profit earned (select items in decreasing order of profit). n = 3, c = 7, w = [7, 3, 2], p = [10, 8, 6]. Only 1 item is selected. Profit/value of the selection is 10. This is not the best selection: taking items 2 and 3 (weight 5) gives profit 14.

20 Greedy Attempt 3 Be greedy on profit density (p/w) (select items in decreasing order of profit density). n = 2, c = 7, w = [1, 7], p = [10, 20]. Only 1 item is selected. Profit/value of the selection is 10. This is not the best selection: taking item 2 alone gives profit 20.

21 Greedy Attempt 3 Be greedy on profit density (p/w). – Works when selecting a fraction of an item is permitted. – Select items in decreasing order of profit density; if the next item doesn't fit, take a fraction of it to fill the knapsack. n = 2, c = 7, w = [1, 7], p = [10, 20]. Item 1 and 6/7 of item 2 are selected, for a profit of 10 + (6/7)*20 ≈ 27.1.
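
A C++ sketch of this fractional (continuous) knapsack rule; the function name and the use of double-valued weights and profits are assumptions for illustration.

#include <algorithm>
#include <vector>

// Fractional knapsack: greedy on profit density p/w, taking a fraction of
// the first item that does not fit. Returns the total profit obtained.
double fractionalKnapsack(const std::vector<double>& p,
                          const std::vector<double>& w, double c) {
    int n = (int)p.size();
    std::vector<int> idx(n);
    for (int i = 0; i < n; i++) idx[i] = i;
    // Consider items in decreasing order of profit density.
    std::sort(idx.begin(), idx.end(),
              [&](int a, int b) { return p[a] / w[a] > p[b] / w[b]; });
    double profit = 0, remaining = c;
    for (int i : idx) {
        if (remaining <= 0) break;
        double take = std::min(w[i], remaining);   // whole item or a fraction
        profit += p[i] * (take / w[i]);
        remaining -= take;
    }
    return profit;
}

On the example above, fractionalKnapsack takes all of item 1 and 6/7 of item 2 for a profit of about 27.1.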

22 Greedy Attempt 4 Select a subset with <= k items. If the weight of this subset is > c, discard the subset. If the subset weight is <= c, fill as much of the remaining capacity as possible by being greedy on profit density. Try all subsets with <= k items and select the one that yields maximum profit.

23 0/1 Knapsack Greedy Heuristics First sort the items into decreasing order of profit density. There are O(n^k) subsets with at most k items (C(n,1) + C(n,2) + C(n,3) + … + C(n,k)). Trying one subset takes O(n) time, so the total time is O(n^(k+1)), where k > 0. The heuristic's error is bounded: (best value – greedy value) / best value <= 1/(k+1).
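
A C++ sketch of this heuristic; the bitmask enumeration of subsets assumes a small n and is used only to keep the illustration short, and the function name knapsackHeuristic is hypothetical.

#include <algorithm>
#include <vector>

// Try every subset with at most k items; for each feasible subset, fill the
// remaining capacity greedily by profit density and keep the best profit.
double knapsackHeuristic(const std::vector<double>& p,
                         const std::vector<double>& w, double c, int k) {
    int n = (int)p.size();
    std::vector<int> byDensity(n);
    for (int i = 0; i < n; i++) byDensity[i] = i;
    std::sort(byDensity.begin(), byDensity.end(),
              [&](int a, int b) { return p[a] / w[a] > p[b] / w[b]; });

    double best = 0;
    for (int mask = 0; mask < (1 << n); mask++) {
        int items = 0;
        double weight = 0, profit = 0;
        for (int i = 0; i < n; i++)
            if (mask & (1 << i)) { items++; weight += w[i]; profit += p[i]; }
        if (items > k || weight > c) continue;    // too many items or too heavy
        for (int i : byDensity)                   // greedy fill by density
            if (!(mask & (1 << i)) && weight + w[i] <= c) {
                weight += w[i];
                profit += p[i];
            }
        best = std::max(best, profit);
    }
    return best;
}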

24 0/1 Knapsack Greedy Heuristics

25 Divide and Conquer A large instance is solved as follows: – Divide the large instance into smaller instances. – Solve the smaller instances somehow. – Combine the results of the smaller instances to obtain the result for the original large instance. A small instance is solved in some other way.

26 Small and Large Instance Small instance – Sort a list that has n <= 10 elements. – Find the minimum of n <= 2 elements. Large instance – Sort a list that has n > 10 elements. – Find the minimum of n > 2 elements.

27 Solving A Small Instance A small instance is solved using some direct/simple strategy. – Sort a list that has n <= 10 elements. Use insertion, bubble, or selection sort. – Find the minimum of n <= 2 elements. When n = 0, there is no minimum element. When n = 1, the single element is the minimum. When n = 2, compare the two elements and determine which is smaller.

28 Sort A Large List Sort a list that has n > 10 elements. – Sort 15 elements by dividing them into 2 smaller lists. One list has 7 elements and the other has 8 elements. – Sort these two lists using the method for small lists. – Merge the two sorted lists into a single sorted list.

29 Find The Min Of A Large List Find the minimum of 20 elements. – Divide into two groups of 10 elements each. – Find the minimum element in each group somehow. – Compare the minimums of each group to determine the overall minimum.

30 Recursion In Divide and Conquer Often the smaller instances that result from the divide step are instances of the original problem (true for our sort and min problems). In this case, – If the new instance is a small instance, it is solved using the method for small instances. – If the new instance is a large instance, it is solved using the divide-and-conquer method recursively. Generally, performance is best when the smaller instances that result from the divide step are of approximately the same size.

31 Recursive Find Min Find the minimum of 20 elements. – Divide into two groups of 10 elements each. – Find the minimum element in each group recursively. The recursion terminates when the number of elements is <= 2. At this time the minimum is found using the method for small instances. – Compare the minimums of each group to determine the overall minimum.

32 Min And Max Find the lightest and heaviest of n elements using a balance that allows you to compare the weight of 2 elements. Minimize the number of comparisons.

33 Max Element Find element with max weight from w[0:n-1].
maxElement = 0;
for (int i = 1; i < n; i++)
  if (w[maxElement] < w[i]) maxElement = i;
Number of comparisons of w values is n-1.

34 Min And Max Find the max of n elements making n-1 comparisons. Find the min of the remaining n-1 elements making n-2 comparisons. Total number of comparisons is 2n-3.
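
A short C++ sketch of this two-pass approach, assuming n >= 2; the names minAndMax, minVal, and maxVal are illustrative.

#include <vector>

// Find the max with n-1 comparisons of w values, then the min of the
// remaining n-1 elements with n-2 comparisons (2n-3 in total).
void minAndMax(const std::vector<int>& w, int& minVal, int& maxVal) {
    int n = (int)w.size();
    int maxIdx = 0;
    for (int i = 1; i < n; i++)               // n-1 comparisons
        if (w[maxIdx] < w[i]) maxIdx = i;
    maxVal = w[maxIdx];

    int minIdx = (maxIdx == 0) ? 1 : 0;       // start from some other element
    for (int i = 0; i < n; i++) {
        if (i == maxIdx || i == minIdx) continue;
        if (w[i] < w[minIdx]) minIdx = i;     // n-2 comparisons
    }
    minVal = w[minIdx];
}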

35 Divide and Conquer Small instance: n <= 2. Find the min and max element making at most one comparison.

36 Large Instance Min And Max n > 2. Divide the n elements into 2 groups A and B with floor(n/2) and ceil(n/2) elements, respectively. Find the min and max of each group recursively. Overall min is min{min(A),min(B)}. Overall max is max{max(A),max(B)}.
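
A recursive C++ sketch of this divide-and-conquer scheme; the function name minMax and the use of std::pair are implementation choices.

#include <algorithm>
#include <utility>
#include <vector>

// Divide-and-conquer min and max of a[lo..hi] (inclusive).
// Small instances (1 or 2 elements) use at most one comparison; large
// instances are split in half, solved recursively, and combined with
// two more comparisons.
std::pair<int, int> minMax(const std::vector<int>& a, int lo, int hi) {
    if (lo == hi)                                   // one element
        return {a[lo], a[lo]};
    if (hi == lo + 1)                               // two elements
        return a[lo] < a[hi] ? std::make_pair(a[lo], a[hi])
                             : std::make_pair(a[hi], a[lo]);
    int mid = (lo + hi) / 2;                        // split into two groups
    auto left = minMax(a, lo, mid);
    auto right = minMax(a, mid + 1, hi);
    return {std::min(left.first, right.first),      // overall min
            std::max(left.second, right.second)};   // overall max
}

On the example on the next slide, minMax(a, 0, 7) with a = {3,5,6,2,4,9,3,1} returns the pair {1, 9}.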

37 Min And Max Example Find the min and max of {3,5,6,2,4,9,3,1}. Large instance. A = {3,5,6,2} and B = {4,9,3,1}. min(A) = 2, min(B) = 1. max(A) = 6, max(B) = 9. min{min(A),min(B)} = 1. max{max(A),max(B)} = 9.

38 Time Complexity Let c(n) be the number of comparisons made when finding the min and max of n elements. c(0) = c(1) = 0. c(2) = 1. c(n) = c(floor(n/2)) + c(ceil(n/2)) + 2 when n > 2. To solve the recurrence, assume n is a power of 2 and use repeated substitution. c(n) = ceil(3n/2) – 2.
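
For n = 2^k the repeated substitution can be written out as follows (a standard derivation sketch; it is not spelled out on the slide):

\begin{aligned}
c(n) &= 2\,c(n/2) + 2 \\
     &= 2^2\,c(n/4) + 2^2 + 2 \\
     &\;\;\vdots \\
     &= 2^{k-1}\,c(2) + (2^{k-1} + \dots + 2^2 + 2) \\
     &= \tfrac{n}{2}\cdot 1 + (2^k - 2) = \tfrac{3n}{2} - 2 .
\end{aligned}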

39 Interpretation Of Recursive Version The working of a recursive divide-and-conquer algorithm can be described by a tree, called the recursion tree. The algorithm moves down the recursion tree, dividing large instances into smaller ones. Leaves represent small instances. The recursive algorithm then moves back up the tree, combining the results from the subtrees. The combining step finds the min of the mins computed at the leaves and the max of the leaf maxes.

40 Downward Pass Divides Into Smaller Instances

41 Upward Pass Combines Results From Subtrees

42 Merge Sort Sort the first half of the array using merge sort. Sort the second half of the array using merge sort. Merge the first half of the array with the second half.

43 Merge Algorithm Merge is an operation that combines two sorted arrays. Assume the result is to be placed in a separate array called result (already allocated). The two given arrays are called front and back. front and back are in increasing order. For the complexity analysis, the size of the input, n, is the sum n_front + n_back.

44 Merge Algorithm For each array keep track of the current position. REPEAT until all the elements of one of the given arrays have been copied into result: – Compare the current elements of front and back. – Copy the smaller into the current position of result (break the ties however you like). – Increment the current position of result and the array that was copied from. Copy all the remaining elements of the other given array into result.
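
A C++ sketch of this merge routine, keeping the names front, back, and result from the slide; using std::vector (and growing result with push_back) rather than a preallocated array is an implementation choice.

#include <vector>

// Merge two sorted vectors, front and back, into result.
void merge(const std::vector<int>& front, const std::vector<int>& back,
           std::vector<int>& result) {
    size_t f = 0, b = 0;                        // current positions
    result.clear();
    result.reserve(front.size() + back.size());
    while (f < front.size() && b < back.size()) {
        if (front[f] <= back[b]) result.push_back(front[f++]);
        else                     result.push_back(back[b++]);
    }
    // Copy the remaining elements of whichever array is not exhausted.
    while (f < front.size()) result.push_back(front[f++]);
    while (b < back.size())  result.push_back(back[b++]);
}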

45 Merge Algorithm - Complexity Every element in front and back is copied exactly once. Each copy is two accesses, so the total number of accesses due to copying is 2n. The number of comparisons could be as small as min(n_front, n_back) or as large as n-1. Each comparison is two accesses.

46 Merge Algorithm - Complexity In the worst case the total number of accesses is 2n + 2(n-1) = O(n). In the best case the total number of accesses is 2n + 2 min(n_front, n_back) = O(n). The average case is between the worst and best case and is therefore also O(n).

47 Merge Sort Algorithm Split anArray into two non-empty parts any way you like. For example, front = the first n/2 elements in anArray; back = the remaining elements in anArray. Sort front and back by recursively calling MergeSort. Now you have two sorted arrays containing all the elements from the original array. Use merge to combine them and put the result in anArray.
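
A C++ sketch of MergeSort built on the merge routine sketched after slide 44; copying into separate front and back vectors (rather than sorting in place) is an implementation choice.

#include <vector>

// merge(front, back, result) is the routine sketched after slide 44.
void merge(const std::vector<int>& front, const std::vector<int>& back,
           std::vector<int>& result);

// Merge sort: split anArray into front and back halves, sort each
// recursively, then merge them back into anArray.
void mergeSort(std::vector<int>& anArray) {
    if (anArray.size() <= 1) return;            // small instance: already sorted
    size_t mid = anArray.size() / 2;
    std::vector<int> front(anArray.begin(), anArray.begin() + mid);
    std::vector<int> back(anArray.begin() + mid, anArray.end());
    mergeSort(front);
    mergeSort(back);
    std::vector<int> result;
    merge(front, back, result);
    anArray = result;
}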

48 MergeSort Call Graph (n=7) Each box represents one invocation of MergeSort. How many levels are there in general if the array is divided in half each time? (Figure: the call tree of index ranges, with 0~6 at the root splitting into 0~2 and 3~6, and so on down to the single-element ranges.)

49 MergeSort Call Graph (general) Suppose n = 2^k. How many levels? How many boxes are on level j? What value is in each box at level j? (Figure: subarray sizes n, n/2, n/4, …, 1 at successive levels.)

