
Approximation Algorithms: Bristol Summer School 2008. Seffi Naor, Computer Science Dept., Technion, Haifa, Israel.


1 Approximation Algorithms: Bristol Summer School 2008. Seffi Naor, Computer Science Dept., Technion, Haifa, Israel

2 Outline 1. Basics 2. Network design: the primal-dual method 3. Network design: iterative rounding and iterative relaxation 4. Competitive analysis via the primal-dual method

4 What is an Approximation Algorithm? Returns a sub-optimal solution for an optimization problem. For a minimization problem: for every input I, ALG(I) ≤ α · OPT(I). Running time: polynomial. Then, ALG is an α-approximation.

5 Why are Approximation Algorithms needed? Many interesting/useful optimization problems are NP-hard, so an optimal solution cannot be computed efficiently. How well can they be approximated? This leads to a beautiful algorithmic theory. And the problems still need to be solved in practice …

6 Simple Example: Vertex Cover Input: undirected graph G=(V,E). Output: a minimum-cardinality V′ ⊆ V such that V′ ∩ e ≠ ∅ for each e ∈ E. Classic NP-hard problem. Approximation algorithm: find a maximal (inclusion-wise) matching M; output the vertex cover V′ = V(M), the vertices adjacent to edges in M.
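The matching-based algorithm on the slide can be sketched in a few lines. This is a minimal illustration, not code from the lecture; the function name and the path example are my own.

```python
def vertex_cover_via_matching(edges):
    """2-approximation: greedily build a maximal matching and take
    both endpoints of every matched edge as the cover V(M)."""
    matched = set()  # vertices already saturated by the matching
    for u, v in edges:
        if u not in matched and v not in matched:
            matched.update((u, v))  # edge (u, v) joins the matching
    return matched  # endpoints of the matching = the cover

# Path 1-2-3-4: the algorithm matches (1, 2) and (3, 4).
edges = [(1, 2), (2, 3), (3, 4)]
cover = vertex_cover_via_matching(edges)
```

On the path, the cover is {1, 2, 3, 4} (size 4) while OPT = {2, 3} (size 2), so the factor-2 bound is tight on this instance.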

7 Analysis Feasibility of cover: V(M) covers all edges, since every edge must have an endpoint in V(M); otherwise M is not a maximal matching! Approximation factor: the optimal solution must pick a representative from each edge e ∈ M. Therefore, OPT ≥ |V(M)|/2. Approximation factor = |V(M)|/OPT ≤ 2. Question: can this factor be improved? Probably not …

8 Linear Programming Optimize a linear function over a set of linear constraints: minimize c · x subject to: Ax ≥ b, x ≥ 0. Linear programming is solvable in polynomial time [Khachiyan ’79].

9 Back to Vertex Cover Vertex cover can be formulated as an integer/linear program: minimize ∑_v w_v x_v subject to x_u + x_v ≥ 1 for each edge (u,v), where w_v is the weight of vertex v and x_v is the indicator variable for choosing vertex v. Relaxation: 0 ≤ x_v ≤ 1. The LP can be solved in polynomial time, and the LP provides a lower bound on the optimal integral solution! Rounding algorithm: if x_v ≥ ½, then x_v ← 1, else x_v ← 0. For each edge (u,v), at least one of x_u, x_v is ≥ ½. Approximation factor = 2.
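The deterministic rounding step is easy to sketch once a feasible fractional solution is in hand (e.g., from an LP solver); the solver itself is outside the slide, so the fractional values below are supplied by hand for a triangle, where x_v = ½ everywhere is the fractional optimum.

```python
def round_half(x):
    """Threshold rounding for vertex cover: keep v iff x_v >= 1/2.
    Feasibility: every edge (u, v) has x_u + x_v >= 1, so at least
    one endpoint crosses the 1/2 threshold and the edge is covered."""
    return {v for v, val in x.items() if val >= 0.5}

# Triangle a-b-c: fractional optimum x_v = 1/2 on every vertex (LP cost 1.5).
frac = {"a": 0.5, "b": 0.5, "c": 0.5}
cover = round_half(frac)
```

Here rounding yields all three vertices (integral cost 3 against LP cost 1.5), matching the factor-2 analysis exactly.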

10 Duality in Linear Programming Primal ↔ Dual: primal constraint ↔ dual variable; dual constraint ↔ primal variable. Weak Duality Theorem: for u a feasible dual solution and w a feasible primal solution, b·u ≤ c·w.
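The generic pair behind this correspondence, written out under the same sign conventions as the weak-duality statement (this pair is reconstructed from the slide's notation, not copied from it):

```latex
\begin{aligned}
\textbf{Primal:}\quad & \min\; c \cdot w      &\qquad \textbf{Dual:}\quad & \max\; b \cdot u \\
\text{s.t.}\quad      & Aw \ge b              &        \text{s.t.}\quad   & A^{T}u \le c \\
                      & w \ge 0               &                           & u \ge 0
\end{aligned}
```

Weak duality then follows in one line: b·u ≤ (Aw)·u = w·(Aᵀu) ≤ c·w, using feasibility of w for the first inequality and feasibility of u for the last.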

11 Set Cover Elements: U = {1, 2, …, n}. Sets: S_1, …, S_m (each S_i ⊆ {1, 2, …, n}). Each set S_i has cost c_i. Goal: find a min-cost collection of sets that covers U. Set cover can be formulated as an integer/linear program, with x_i the indicator variable for choosing set S_i. Relaxation: 0 ≤ x_i ≤ 1. The LP can be solved in polynomial time, and the LP provides a lower bound on the optimal integral solution!
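The integer program the slide refers to, written out from the definitions above (the formulation itself was lost in extraction; this is the standard covering IP those definitions describe):

```latex
\min \sum_{i=1}^{m} c_i x_i
\quad \text{s.t.} \quad
\sum_{i \,:\, j \in S_i} x_i \ \ge\ 1 \quad \forall j \in U,
\qquad x_i \in \{0,1\}.
```

The LP relaxation replaces x_i ∈ {0,1} with 0 ≤ x_i ≤ 1.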

12 Rounding a Fractional Solution (1) For each S_i: 0 ≤ x_i ≤ 1. For each element j: ∑_{i : j ∈ S_i} x_i ≥ 1. Randomized rounding: pick each set S_i into the cover with probability x_i. Analysis: Exp[cost of cover] = ∑_i c_i x_i = LP cost. Pr[element j is not covered] = ∏_{i : j ∈ S_i} (1 − x_i) ≤ e^{−∑_i x_i} ≤ 1/e. Conclusion: the probability of covering element j is at least a constant!
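The rounding experiment itself is one line of code. A minimal sketch (function name and example values are mine; a real run would feed in the LP solution):

```python
import random

def randomized_round(x, seed=0):
    """Pick each set i into the cover independently with probability x[i]."""
    rng = random.Random(seed)
    return {i for i in range(len(x)) if rng.random() < x[i]}

# random() returns a value in [0, 1), so x_i = 1 is always picked
# and x_i = 0 never is.
chosen = randomized_round([1.0, 0.0, 1.0])
```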

13 Rounding a Fractional Solution (2) Amplify the probability of success: repeat the experiment c·log n times, so that Pr[element j is not covered] ≤ (1/e)^{c log n} = n^{−c}. Analysis: Pr[some element is not covered] ≤ n · n^{−c} = n^{1−c} (union bound over the n elements). Exp[cost of cover] = O(log n) · ∑_i c_i x_i = O(log n) · (LP cost). Conclusion: the approximation factor is O(log n).
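The amplification step just unions independent repetitions of the previous experiment. A sketch under the same assumptions as before (names and constants are illustrative; the slide leaves the constant c unspecified):

```python
import math
import random

def amplified_round(x, n, c=2, seed=0):
    """Union of ceil(c * ln n) independent rounding rounds; n = |U|.
    Each element survives all rounds with probability at most n^(-c)."""
    rng = random.Random(seed)
    rounds = math.ceil(c * math.log(n))
    chosen = set()
    for _ in range(rounds):
        chosen |= {i for i in range(len(x)) if rng.random() < x[i]}
    return chosen, rounds

# With n = 4 and c = 2: ceil(2 * ln 4) = 3 repetitions.
chosen, rounds = amplified_round([1.0, 0.0], n=4)
```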

14 Set Cover: Greedy Algorithm Initially: C is empty. While there is an uncovered element: add to C the set S_i minimizing c_i / (# new elements covered). Analysis: via dual fitting. The primal is a covering LP; its dual is a packing LP.
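The greedy rule translates directly into code. A minimal sketch (the instance at the bottom is an invented example, not from the lecture):

```python
def greedy_set_cover(universe, sets, costs):
    """Repeatedly add the set minimizing cost / (# newly covered elements)."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = min(
            (i for i in range(len(sets)) if sets[i] & uncovered),
            key=lambda i: costs[i] / len(sets[i] & uncovered),
        )
        chosen.append(best)
        uncovered -= sets[best]  # these elements "pay" for set `best`
    return chosen

universe = {1, 2, 3, 4}
sets = [{1, 2, 3}, {3, 4}, {4}]
costs = [3.0, 2.0, 1.0]
picked = greedy_set_cover(universe, sets, costs)
```

On this instance greedy first takes set 0 (ratio 3/3 = 1), then set 2 (ratio 1/1 = 1 beats set 1's 2/1), for total cost 4.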

15 Dual Fitting (1) The primal solution is feasible. Define a dual solution: if element j is covered by set S_i, then y_j = c_i / (# new elements covered by S_i). In the iteration in which S_i is picked: Δprimal = Δdual = c_i, since the cost of S_i is “shared” among the new elements it covers. Thus, cost of primal solution = cost of dual solution. But is the dual solution feasible? Almost, but not quite …

16 Dual Fitting (2) Consider a set S. Suppose the elements in S are covered in the order e_1, …, e_k. When element e_i is covered, y_{e_i} ≤ c(S)/(k − i + 1), since at that moment S could still cover the k − i + 1 elements e_i, …, e_k. Thus, ∑_{i=1}^{k} y_{e_i} ≤ c(S) · H(k) ≤ c(S) · H(n). Dividing the dual variables by H(n) ≈ log n yields a feasible solution. The greedy algorithm is an O(log n)-approximation: primal ≤ dual × H(n).
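The harmonic-sum bound at the heart of this slide can be checked numerically. A small sketch (function names are mine): the worst case is when a set S of size k has its elements picked off one at a time, receiving dual values c(S)/k, c(S)/(k−1), …, c(S)/1, which sum to exactly c(S)·H(k).

```python
def harmonic(n):
    """H(n) = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

def worst_case_dual_mass(cost, k):
    """Total dual value a set S of size k can accumulate: its i-th
    covered element gets at most cost / (k - i + 1)."""
    return sum(cost / (k - i + 1) for i in range(1, k + 1))

# For k = 5 the worst-case mass is exactly c(S) * H(5); scaling every
# dual variable down by H(n) >= H(k) therefore restores feasibility.
mass = worst_case_dual_mass(1.0, 5)
```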

17 Summary Linear programming: a useful method for relaxing combinatorial problems. Randomized rounding: going from a fractional solution to an integral solution without incurring too much damage. Dual programs: useful for generating lower bounds and for analyzing “primal” algorithms.

