
1 Bold Stroke, January 13, 2003. Advanced Algorithms CS 539/441, or: In Search of Efficient General Solutions. Joe Hoffert, Joseph.W.Hoffert@Boeing.com

2 Outline
General Techniques for Polynomial-Time Solutions
–Greedy Algorithms
–Dynamic Programming
–Linear Programming
Problems with Unlikely Poly-Time Solutions (NP-Complete)
Next-Best Solutions (e.g., Approximation Algorithms)
Lower Bound Techniques
On-Line/Dynamic Algorithms

3 Greedy Algorithms
A locally optimal choice leads to a globally optimal solution. In general, a greedy strategy will not yield an optimal solution; it does so only when the properties below hold.
Framework:
–Prove the Greedy Choice Property holds: the first step "g" made by the greedy algorithm is part of some optimal solution S (i.e., using "g" does no worse than an arbitrary optimal solution).
–Prove the Optimal Substructure Property holds: the subproblem P' of P left after "g" is chosen is optimally solved within S, i.e., the solution to P' contained in S is optimal for P'.
Examples: Earliest Deadline First (EDF) scheduling, Huffman coding.

4 Greedy Algorithm Example
EDF scheduling: sort the jobs by deadline, O(n log n); the goal is to maximize the number of jobs that meet their deadlines.
–Greedy choice: schedule the job with the earliest deadline first; must prove this choice is at least as good as any other first choice.
–Subproblem P' is to schedule the remaining jobs; must prove this subproblem is independent of the first choice made.
(Figure: five jobs reordered by deadline into the EDF schedule.)
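A minimal Python sketch of the greedy EDF idea described above; the job tuples, field names, and deadline values are illustrative and not taken from the slides.

```python
from typing import List, Tuple

def edf_schedule(jobs: List[Tuple[str, int, int]]) -> List[str]:
    """Greedy EDF: run jobs in non-decreasing deadline order.

    Each job is (name, duration, deadline).  Returns the names of the
    jobs that meet their deadlines when run back-to-back in EDF order.
    """
    # Greedy choice: sort by deadline, O(n log n).
    ordered = sorted(jobs, key=lambda job: job[2])

    finished_on_time = []
    clock = 0
    for name, duration, deadline in ordered:
        clock += duration
        if clock <= deadline:          # job completes by its deadline
            finished_on_time.append(name)
    return finished_on_time

# Example usage with made-up jobs (name, duration, deadline):
jobs = [("Job1", 2, 9), ("Job2", 1, 3), ("Job3", 3, 4),
        ("Job4", 2, 12), ("Job5", 1, 2)]
print(edf_schedule(jobs))
```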

5 Dynamic Programming
A locally optimal choice does not lead to a globally optimal solution, but there is still optimal substructure.
Framework:
–Try all possible first choices: one of these must be the correct first choice.
–Prove the Optimal Substructure Property holds: the subproblem P' of P left after the first choice is optimally solved within S, i.e., the solution S' to P' contained in S is optimal for P'.
–Use a bottom-up approach to efficiently compute the optimal cost of every subproblem once, exploiting the fact that the subproblems overlap.
Examples: assembly line scheduling, longest common subsequence.

6 Dynamic Programming Example
Assembly line scheduling: find the minimal-cost path through the stations.
(Figure: a chassis enters one of two assembly lines with entry costs e1, e2; each line i has station costs a_i,1 ... a_i,n, transfer costs t_i,j for switching lines between stations, and exit costs x1, x2; a second copy of the diagram shows concrete numeric costs.)
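A bottom-up Python sketch of the assembly-line DP; the cost arrays are illustrative placeholders rather than the numbers from the slide's figure.

```python
def assembly_line(a, t, e, x):
    """Minimal cost through two assembly lines (bottom-up DP).

    a[i][j]: processing cost at station j of line i
    t[i][j]: cost to transfer to the other line after station j of line i
    e[i]:    entry cost onto line i
    x[i]:    exit cost from line i
    """
    n = len(a[0])
    # f[i][j] = cheapest cost of completing station j on line i.
    f = [[0] * n for _ in range(2)]
    f[0][0] = e[0] + a[0][0]
    f[1][0] = e[1] + a[1][0]
    for j in range(1, n):
        # Either stay on the same line or transfer from the other one.
        f[0][j] = min(f[0][j-1], f[1][j-1] + t[1][j-1]) + a[0][j]
        f[1][j] = min(f[1][j-1], f[0][j-1] + t[0][j-1]) + a[1][j]
    return min(f[0][n-1] + x[0], f[1][n-1] + x[1])

# Illustrative data: 2 lines, 4 stations each.
a = [[7, 9, 3, 4], [8, 5, 6, 4]]
t = [[2, 3, 1], [2, 1, 2]]      # t[i][j] applies after station j (j = 0..n-2)
e = [2, 4]
x = [3, 2]
print(assembly_line(a, t, e, x))
```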

7 Linear Programming
Problem defined by a linear objective and a set of linear constraints (equalities or inequalities).
–Poly-time, but with a high exponent in the time complexity (e.g., n^8 to n^10).
–Proves that a poly-time solution exists; more efficient specialized solutions might still be found.
Examples: maximize x1 + x2 subject to the two-dimensional constraints x1 >= 0, x2 >= 0, 2x1 + x2 <= 10, 4x1 - x2 <= 8, 5x1 - 2x2 >= -2; minimize the cost of a flow of 4 units from s to t through a small network with edge capacities and costs.
(Figure: the feasible region with objective lines x1 + x2 = 0, 4, 8, and the flow network before and after routing the 4 units.)
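A sketch of the slide's two-dimensional example using scipy.optimize.linprog; this assumes SciPy is available, and since linprog minimizes, the objective is negated.

```python
from scipy.optimize import linprog

# Maximize x1 + x2  <=>  minimize -(x1 + x2).
c = [-1, -1]

# Inequality constraints in the form A_ub @ x <= b_ub:
#   2*x1 +   x2 <= 10
#   4*x1 -   x2 <=  8
#   5*x1 - 2*x2 >= -2   rewritten as  -5*x1 + 2*x2 <= 2
A_ub = [[2, 1], [4, -1], [-5, 2]]
b_ub = [10, 8, 2]

# x1 >= 0, x2 >= 0
result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x, -result.fun)   # optimal point and the maximal value of x1 + x2
```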

8 NP-Complete Problems
NP-complete problems are:
–In the complexity class NP (non-deterministic polynomial time), i.e., a proposed solution to the problem can be verified in polynomial time.
–NP-hard, i.e., every problem in NP reduces to them in polynomial time.
They are unlikely to have poly-time solutions, though this has not (yet) been proven.
Example: the MP-Scheduling (multiprocessor scheduling) problem: given n jobs and m processors, is there a schedule in which all jobs are processed by a specified time?
(Figure: n jobs being assigned across processors 1 through m.)
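To illustrate the "verifiable in polynomial time" part, here is a hedged Python sketch of a verifier for MP-Scheduling: given a proposed assignment of jobs to processors (the certificate), it checks in linear time whether every processor finishes by the deadline. Finding such an assignment is the hard part; checking one is easy. The function and parameter names are made up for this sketch.

```python
from typing import List

def verify_schedule(job_lengths: List[int],
                    assignment: List[int],
                    num_procs: int,
                    deadline: int) -> bool:
    """Polynomial-time verifier for MP-Scheduling.

    assignment[j] is the processor (0..num_procs-1) that job j runs on.
    Returns True iff every processor's total load finishes by `deadline`.
    """
    load = [0] * num_procs
    for length, proc in zip(job_lengths, assignment):
        load[proc] += length
    return max(load) <= deadline

# Illustrative instance: 5 jobs, 2 processors, deadline 8.
# Loads are 3+4+1 = 8 and 5+2 = 7, so the certificate checks out.
print(verify_schedule([3, 5, 2, 4, 1], [0, 1, 1, 0, 0], 2, 8))
```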

9 Next Best Solutions
What to do when a problem is shown to be NP-complete:
–Is the instance a special case? For instance, vertex cover is NP-complete in general, but if the graph is a tree there is a poly-time algorithm for that case (see the sketch after this list).
–The input size may be small enough that an exponential algorithm is not a problem.
–Use a heuristic and hope it produces something good enough.
–Use approximation algorithms (whose quality relative to optimal can be proven, whether "good enough" or not):
  LP relaxation (integer programming is NP-complete; the LP relaxation may give a good enough answer)
  0-1 knapsack relaxed to fractional knapsack
  Non-preemptive schedule relaxed to allow preemption
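As one concrete instance of the "special case" bullet, the sketch below uses a common leaf-to-root greedy (not from the slides; names are illustrative) to compute a minimum vertex cover of a tree in polynomial time, even though vertex cover on general graphs is NP-complete.

```python
from collections import defaultdict

def tree_vertex_cover(edges):
    """Minimum vertex cover of a tree via a leaf-up greedy.

    Process vertices from the leaves toward the root; whenever an edge
    (child, parent) is still uncovered, put the parent into the cover.
    """
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    root = edges[0][0]
    # Iterative DFS recording parents; `order` lists parents before children.
    parent, order, stack, seen = {root: None}, [], [root], {root}
    while stack:
        u = stack.pop()
        order.append(u)
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                parent[v] = u
                stack.append(v)

    cover = set()
    for u in reversed(order):            # leaves first
        p = parent[u]
        if p is not None and u not in cover and p not in cover:
            cover.add(p)                 # cover the uncovered edge (u, p)
    return cover

# Illustrative tree (a path with a branch); prints a minimum cover of size 2.
print(tree_vertex_cover([(1, 2), (2, 3), (3, 4), (3, 5)]))
```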

10 Approximation Algorithms
Poly-time algorithms with provable bounds relative to the optimal solution.
–If the approximation bound is some constant c, the algorithm is said to be a c-approximation.
Example: a 2-approximation for the 0-1 knapsack problem.
–Sort the items by value per unit of quantity (i.e., value/amount) and start filling up the knapsack.
–Take the larger of the first item that will not completely fit and the sum of all the previous items already in the knapsack. This value is at least 1/2 of the optimal solution, so the algorithm is a 2-approximation.
(Figure: five items packed greedily by value density.)
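A Python sketch of the 2-approximation described above; the item values and weights are made up, and it assumes every single item fits in the knapsack on its own, which the argument needs.

```python
def knapsack_2_approx(items, capacity):
    """Greedy 2-approximation for 0-1 knapsack.

    items: list of (value, weight) pairs, each with weight <= capacity.
    Fill greedily by value density; when the first item no longer fits,
    return the better of (a) what is already packed and (b) that single item.
    """
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    packed_value, remaining = 0, capacity
    for value, weight in items:
        if weight <= remaining:
            packed_value += value
            remaining -= weight
        else:
            # Greedy prefix plus this item covers the fractional optimum,
            # so the larger of the two is at least OPT / 2.
            return max(packed_value, value)
    return packed_value          # everything fit: this is optimal

# Illustrative items (value, weight) with capacity 10; prints 110.
print(knapsack_2_approx([(60, 5), (50, 4), (40, 6), (10, 3)], 10))
```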

11 Lower Bound Techniques
Determine the minimum number of operations needed by any solution; this provides guidance for how good an algorithm can possibly be.
–Decision-tree lower bounds (when applicable): all possible outcomes are enumerated as leaves of a decision tree, so any algorithm must traverse a root-to-leaf path and the depth of the tree bounds the work from below. Example: a lower bound for finding the median of two sorted arrays.
–Adversary strategy: devise a strategy that makes it "hard" for any algorithm to find the solution (i.e., maximizes the number of steps any solution needs). Example: merging two sorted arrays.
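A short worked version of the decision-tree argument, using comparison sorting as the standard illustration (sorting is not the slide's example, but the counting step is the same).

```latex
% A comparison-based algorithm is a binary decision tree whose leaves
% are the possible outputs, so with N distinct outputs:
\text{worst-case comparisons} \;\ge\; \text{tree depth} \;\ge\; \lceil \log_2 N \rceil .
% For sorting n items, N = n! , and
\log_2 n! \;\ge\; \frac{n}{2}\log_2\frac{n}{2} \;=\; \Omega(n \log n).
```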

12 Adversary Strategy Example
Merging two sorted arrays a_1 <= ... <= a_n and b_1 <= ... <= b_n: how many comparisons must any algorithm make before the merged order is determined?
–If fewer than 2n - 1 comparisons are made, more than one possible answer remains: the adversary can swap two elements that were never compared against each other (e.g., an adjacent a_i, b_i pair), and every comparison the algorithm did make is still consistent.
(Figure: the two arrays interleaved element by element, with the adversary free to exchange an uncompared a_i, b_i pair.)
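A small Python illustration of why 2n - 1 is tight: the standard merge, instrumented to count comparisons, run on an adversarial pair of interleaved arrays that forces a comparison for every output slot except the last.

```python
def merge_with_count(a, b):
    """Standard two-way merge that counts element comparisons."""
    out, i, j, comparisons = [], 0, 0, 0
    while i < len(a) and j < len(b):
        comparisons += 1
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])                    # one list is exhausted;
    out.extend(b[j:])                    # the rest is copied without comparisons
    return out, comparisons

# Adversarial input: elements of a and b strictly interleave,
# so every merge step except the last requires a comparison (2n - 1 total).
n = 6
a = [2 * k for k in range(n)]            # 0, 2, 4, ...
b = [2 * k + 1 for k in range(n)]        # 1, 3, 5, ...
merged, used = merge_with_count(a, b)
print(used, 2 * n - 1)                   # both 11 for n = 6
```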

13 On-Line/Dynamic Algorithms
Algorithms that have no knowledge of future requests/input.
–Compare to off-line algorithms, which have complete knowledge of the input in advance.
–Provable bound on cost compared to the off-line solution.
Examples: ski rental vs. purchase (with B = cost to buy and R = cost to rent, there is a 2-competitive algorithm, i.e., a provable bound of roughly 2B on cost); scheduling jobs with deadlines using EDF.
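A sketch of the classic 2-competitive ski-rental rule mentioned above: rent until the rental payments would reach the purchase price B, then buy. The function name and the numeric parameters are illustrative; B and R are the slide's cost variables.

```python
def ski_rental_cost(days_skied: int, B: int, R: int) -> int:
    """Online ski-rental: rent each day until cumulative rent would reach B, then buy.

    Whatever the (unknown) number of days turns out to be, the cost paid is
    at most 2 * min(days_skied * R, B), i.e., at most twice the offline
    optimum, so the rule is 2-competitive.
    """
    cost = 0
    for _day in range(1, days_skied + 1):
        if cost + R < B:     # keep renting while cumulative rent stays below B
            cost += R
        else:
            cost += B        # buy; no further per-day cost
            break
    return cost

# Compare against the offline optimum for a few horizons (B = 100, R = 10):
for days in (3, 9, 10, 25):
    online = ski_rental_cost(days, 100, 10)
    offline = min(days * 10, 100)
    print(days, online, offline)
```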

14 Conclusion
Techniques exist for:
–Provably poly-time algorithms
–Determining which problems are unlikely to have poly-time algorithms
–Next-best solutions with provable bounds
When to use which technique is mostly learned through practice and intuition.
Why does it matter? Provable solutions.

