
1 Algorithms for hard problems Introduction Juris Viksna, 2015

2 Classical complexity - P and NP [Diagram: the two possibilities, P = NP versus P ≠ NP; in the latter case P and the NP complete problems are distinct subclasses of NP]

3 Vertex cover - how to solve this? VERTEX COVER Instance: A graph G=(V,E) and a positive integer k. Question: Is there a subset S ⊆ V, such that |S| = k and for all {x,y} ∈ E either x ∈ S or y ∈ S? For what values of n = |V| and k can we solve this problem in practice? [Adapted from R.Downey and M.Fellows]

4 Vertex cover - how to solve this? A "universal" approach: - the problem is in NP, so we can try to search through all possible witnesses that a vertex cover of the given size exists - the running time is n^k, thus for n = 100 we could probably deal with k < 7 and for n = 1000 with k < 5... Can we do better?
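As an illustration, a minimal brute-force sketch of this "universal" approach in Python (the graph representation and function name are our own, not from the slides): it enumerates all size-k subsets of V and checks each one, so the running time grows roughly as n^k.

from itertools import combinations

def has_vertex_cover(vertices, edges, k):
    # Try every subset S of size k and check whether it covers all edges.
    for S in combinations(vertices, k):
        S = set(S)
        if all(x in S or y in S for (x, y) in edges):
            return True
    return False

# Example: a 4-cycle has a vertex cover of size 2 but not of size 1.
V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(has_vertex_cover(V, E, 2))   # True
print(has_vertex_cover(V, E, 1))   # False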

5 Approaches to NP-hard problems
- Branch-and-bound algorithms
- Heuristic methods
- Approximation algorithms
- Probabilistic algorithms
- Pseudo-polynomial algorithms
- FPT algorithms

6 Branch-and-bound algorithms CLIQUE Instance: A graph G=(V,E) and a positive integer k. Question: Is there a subset S ⊆ V, such that |S| = k and {x,y} ∈ E for all x,y ∈ S? [Adapted from D.Karabeg] Bron-Kerbosch algorithm, 1971
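A minimal branch-and-bound sketch for the CLIQUE decision problem in Python (our own illustration; this is not the Bron-Kerbosch algorithm itself, although it uses the same candidate-set idea): extend a partial clique vertex by vertex and prune a branch as soon as the partial clique plus the remaining candidates cannot reach size k.

def has_clique(adj, k, clique=frozenset(), candidates=None):
    # adj maps each vertex to the set of its neighbours.
    if candidates is None:
        candidates = set(adj)
    if len(clique) >= k:
        return True
    if len(clique) + len(candidates) < k:
        return False            # bound: this branch can never reach size k
    for v in list(candidates):
        candidates.remove(v)    # v is excluded from later branches
        # branch: add v, keep only candidates adjacent to everything chosen so far
        if has_clique(adj, k, clique | {v}, candidates & adj[v]):
            return True
    return False

# Example: a triangle with a pendant vertex has a 3-clique but no 4-clique.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(has_clique(adj, 3))   # True
print(has_clique(adj, 4))   # False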

7 Heuristic methods TSP (TRAVELING SALESMAN PROBLEM) Instance: A complete graph G=(V,E) with edge weights w: E → R+. Problem: Find a cycle of minimum cost containing each of the vertices exactly once. Note that in this case we consider an optimization problem and not a decision problem.

8 Heuristic methods TSP tour of Sweden: 24,978 cities, tour length 72,500 km, solved in 2004.

9 Heuristic methods Idea - use a state space search algorithm (A*) with some reasonable heuristic: - states: partially completed cycles - production rules: all possible extensions of a partial cycle by adding one edge - if the heuristic h never overestimates the minimum weight of extending a partial cycle to a completed cycle, then A* is guaranteed to find an optimal solution (!)
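A minimal A* skeleton for TSP along these lines in Python (a sketch under our own state representation; the heuristic is passed in as a function and is assumed to be admissible):

import heapq

def tsp_a_star(dist, heuristic):
    # dist is an n x n matrix of edge weights; heuristic(path, unvisited, dist)
    # must never overestimate the cost of completing the partial cycle.
    n = len(dist)
    start = 0
    # A state is a partial cycle starting at city 0, stored as a tuple of cities.
    frontier = [(heuristic((start,), frozenset(range(1, n)), dist), 0.0, (start,))]
    while frontier:
        f, g, path = heapq.heappop(frontier)
        if len(path) == n + 1:            # a complete cycle popped first is optimal
            return g, path
        unvisited = frozenset(range(n)) - frozenset(path)
        if not unvisited:                 # all cities visited: close the cycle
            g2 = g + dist[path[-1]][start]
            heapq.heappush(frontier, (g2, g2, path + (start,)))
            continue
        for city in unvisited:            # production rule: extend by one edge
            g2 = g + dist[path[-1]][city]
            h2 = heuristic(path + (city,), unvisited - {city}, dist)
            heapq.heappush(frontier, (g2 + h2, g2, path + (city,)))
    return None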

10 Heuristic methods TSP. A legal tour is a (Hamiltonian) circuit: – it is a spanning tree (when an edge is removed) with the constraint that each node has at most 2 adjacent edges – removing the adjacency constraint leads to h1: find the cheapest (minimum) spanning tree of the given graph (complexity O(n^2 log n)) [Figure: the given graph, a legal tour, other MSTs] [Adapted from Y.Peng]
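A sketch of the heuristic h1 that plugs into the A* skeleton above (our own implementation, using a simple Prim's algorithm): the weight of a minimum spanning tree over the unvisited cities together with the two endpoints of the partial cycle never exceeds the weight of the cheapest completion, so the heuristic is admissible.

def mst_heuristic(path, unvisited, dist):
    # Weight of an MST over the unvisited cities plus both endpoints of the
    # partial cycle, computed with Prim's algorithm.
    nodes = set(unvisited) | {path[0], path[-1]}
    if len(nodes) <= 1:
        return 0.0
    nodes = list(nodes)
    in_tree = {nodes[0]}
    total = 0.0
    while len(in_tree) < len(nodes):
        # pick the cheapest edge leaving the current tree
        u, v = min(((a, b) for a in in_tree for b in nodes if b not in in_tree),
                   key=lambda e: dist[e[0]][e[1]])
        in_tree.add(v)
        total += dist[u][v]
    return total

# Hypothetical 4-city instance (distances made up for illustration):
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(tsp_a_star(dist, mst_heuristic))   # an optimal tour of length 18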

11 Heuristic methods TSP. A legal tour is a (Hamiltonian) circuit: – it is a connected second degree graph (each node has exactly two adjacent edges) – removing the connectivity constraint leads to h2: find the cheapest second degree graph contained in the given graph (complexity O(n^3)) [Figure: the given complete graph, a legal tour, other second degree graphs] [Adapted from Y.Peng]

12 Approximation algorithms VERTEX COVER Instance: A graph G=(V,E). Problem: Find a smallest subset S ⊆ V, such that for all {x,y} ∈ E either x ∈ S or y ∈ S. Again, here we consider an optimization problem. Approximation algorithm for the VERTEX COVER problem:
S ← ∅
E' ← E[G]
while E' ≠ ∅
    let (u,v) be an arbitrary edge in E'
    S ← S ∪ {u,v}
    remove from E' every edge incident on either u or v
return S

13 Approximation algorithms (the same approximation algorithm for VERTEX COVER as on the previous slide) [Adapted from T.Cormen et al]

14 Approximation algorithms For the approximation algorithm above, let C* be the size of a minimum vertex cover and C the size of the set S returned by the algorithm. The edges chosen by the algorithm are pairwise disjoint (all edges incident on u or v are removed), and from each chosen pair {u,v} at least one vertex must belong to any vertex cover. Thus C ≤ 2C*, i.e. the algorithm finds a 2-approximation.
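A minimal sketch of this algorithm in Python (the graph representation is our own choice):

def vertex_cover_2approx(vertices, edges):
    # Repeatedly pick an uncovered edge and add both of its endpoints.
    cover = set()
    remaining = set(edges)
    while remaining:
        u, v = remaining.pop()           # an arbitrary still-uncovered edge
        cover |= {u, v}
        remaining = {e for e in remaining if u not in e and v not in e}
    return cover

# Example: a star with centre 0; the optimum is {0}, the algorithm returns 2 vertices.
print(vertex_cover_2approx([0, 1, 2, 3], [(0, 1), (0, 2), (0, 3)]))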

15 Approximation algorithms Let C* be the cost of an optimal solution, and let C be the cost of the solution found by an approximation algorithm. The algorithm has an approximation ratio of ρ(n) if, for every input of size n, max(C/C*, C*/C) ≤ ρ(n). We say that an approximation algorithm with an approximation ratio of ρ(n) is a ρ(n)-approximation algorithm. [Adapted from S.Guattery]

16 Approximation algorithms MAX CUT Instance: A graph G=(V,E). Problem: Split V into 2 disjoint sets V1 and V2 such that |{{x,y} ∈ E | x ∈ V1 and y ∈ V2}| is maximal. Easy with ρ(n) = 2 [Erdős 1965]. NP-hard for ρ(n) = 1.06 [Arora et al 1992]. Polynomial for ρ(n) = 1.14 [Goemans, Williamson 1993]. [Adapted from L.Lovász]
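One simple way to reach ratio 2 is the following greedy placement, sketched in Python (our own illustration; this is not the Goemans-Williamson algorithm): place each vertex on the side that cuts more of the edges to its already-placed neighbours, so at least half of all edges end up in the cut, while the optimum cannot exceed |E|.

def max_cut_2approx(vertices, edges):
    side = {}
    for v in vertices:
        # edges cut if v goes to side 1 / side 2, counting already-placed neighbours only
        cut_if_1 = sum(1 for (x, y) in edges
                       if (x == v and side.get(y) == 2) or (y == v and side.get(x) == 2))
        cut_if_2 = sum(1 for (x, y) in edges
                       if (x == v and side.get(y) == 1) or (y == v and side.get(x) == 1))
        side[v] = 1 if cut_if_1 >= cut_if_2 else 2
    return ({v for v in vertices if side[v] == 1},
            {v for v in vertices if side[v] == 2})

# Example: a triangle; any split cuts 2 of its 3 edges.
print(max_cut_2approx([0, 1, 2], [(0, 1), (1, 2), (0, 2)]))   # ({0, 2}, {1})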

17 Approximation algorithms An approximation scheme is an approximation algorithm that takes an instance and an ε > 0, and produces a (1+ε)-approximation. If an approximation scheme runs in time polynomial in both the size of its input and 1/ε, we say it is a polynomial-time approximation scheme (PTAS). [Adapted from S.Guattery]

18 Approximation algorithms KNAPSACK Instance: A finite set A of elements, with a size s: A → Z+ and a value v: A → Z+ for each element, and an integer K. Problem: Find a subset S ⊆ A, such that Σ x∈S s(x) ≤ K and Σ x∈S v(x) is maximal. There is a PTAS for the KNAPSACK problem with running time O(n^3/ε). If P ≠ NP, the general TSP problem cannot be approximated within any constant ρ ≥ 1.
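As an illustration of how such a scheme can work, here is a sketch of the standard value-scaling approximation scheme for KNAPSACK in Python (not necessarily the exact O(n^3/ε) scheme referred to above, and it assumes every item fits in the knapsack by itself): values are rounded down to multiples of K = ε·vmax/n and an exact dynamic program is run over the rounded values.

def knapsack_approx(sizes, values, capacity, eps):
    n = len(values)
    vmax = max(values)
    K = eps * vmax / n
    scaled = [int(v // K) for v in values]       # rounded-down values
    INF = float("inf")
    # best_weight[val] = minimum total size achieving rounded value val
    best_weight = [0] + [INF] * sum(scaled)
    for s, v in zip(sizes, scaled):
        for val in range(len(best_weight) - 1, v - 1, -1):
            if best_weight[val - v] + s < best_weight[val]:
                best_weight[val] = best_weight[val - v] + s
    best = max(val for val, w in enumerate(best_weight) if w <= capacity)
    return best * K        # at least (1 - eps) times the optimal value

# Hypothetical example: the optimum is 90 (take the items of size 3 and 5).
print(knapsack_approx([3, 4, 5], [30, 50, 60], capacity=8, eps=0.1))   # 90.0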

19 Probabilistic algorithms PRIMALITY TESTING Instance:A positive integer n Question:Is n a prime?

20 Probabilistic algorithms The Miller-Rabin algorithm gives a correct answer with probability at least 1 − p in time O(log(1/p)·(log n)^3). If the generalized Riemann hypothesis holds, there is an O((log n)^5) time deterministic algorithm. There is a deterministic O((log n)^(15/2)) time algorithm (!) [Agrawal et al, 2004]. [Adapted from D.Harel]
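A compact sketch of the Miller-Rabin test in Python (the number of rounds is a parameter; for a composite n each round fails to find a witness with probability at most 1/4):

import random

def is_probable_prime(n, rounds=20):
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    # write n - 1 as 2^r * d with d odd
    r, d = 0, n - 1
    while d % 2 == 0:
        r += 1
        d //= 2
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)                  # a^d mod n
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                  # a witnesses that n is composite
    return True                           # probably prime

print(is_probable_prime(2**61 - 1))   # True  (a Mersenne prime)
print(is_probable_prime(2**61 + 1))   # False (divisible by 3)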

21 Pseudo-polynomial algorithms PARTITION Instance: A finite set A = {a1,...,an} of positive integers. Question: Is there a subset S ⊆ A, such that Σ x∈S x = Σ x∈A∖S x? A dynamic programming algorithm: let B = Σ x∈A x. For i ≤ n and j ≤ B/2 define T(i, j) to be true if and only if there is a subset Y ⊆ {a1,...,ai} such that Σ x∈Y x = j. Recurrence: T(i,j) = true iff T(i−1, j) = true or T(i−1, j − ai) = true. Polynomial in nB (!), but not in n...
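A compact version of this dynamic program in Python (a sketch; the table T(i, ·) is collapsed into a single row updated in place):

def can_partition(a):
    # T[j] == True iff some subset of the numbers processed so far sums to j.
    B = sum(a)
    if B % 2 == 1:
        return False
    half = B // 2
    T = [False] * (half + 1)
    T[0] = True
    for ai in a:
        for j in range(half, ai - 1, -1):   # downwards, so each a_i is used at most once
            if T[j - ai]:
                T[j] = True
    return T[half]

print(can_partition([3, 1, 1, 2, 2, 1]))   # True: {3, 2} vs {1, 1, 2, 1}
print(can_partition([2, 3, 4]))            # False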

22 Pseudo-polynomial algorithms [Adapted from D.Karabeg]

23 Vertex cover revisited VERTEX COVER Instance: A graph G=(V,E) and a positive integer k. Question: Is there a subset S ⊆ V, such that |S| = k and for all {x,y} ∈ E either x ∈ S or y ∈ S? We have already described a time n^k algorithm for this problem... However, it is possible to do better:

24 Vertex cover revisited Algorithms for VERTEX COVER:
- O(f(k) n^3) [Fellows, Langston 1986]
- O(f(k) n^2) [Johnson 1987]
- O(2^k n) (polynomial for k = O(log n)) [Fellows 1988]
- O(kn + 2^k k^(2k+2)) [Buss 1989]
- O(kn + 2^k k^2) [Balasubramanian et al 1992]
- O(3^k n) [Papadimitriou 1993]
- O(kn + (4/3)^k k^2) [Balasubramanian et al 1996]
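A minimal bounded search tree sketch in Python with O(2^k |E|) behaviour (our own illustration of the technique behind the O(2^k n) bound, for the decision version with a cover of size at most k): pick any uncovered edge {u,v}; at least one of its endpoints must be in the cover, so branch on the two choices.

def vc_bounded_search(edges, k):
    if not edges:
        return True                  # every edge is covered
    if k == 0:
        return False                 # budget exhausted but edges remain
    (u, v) = edges[0]
    rest_u = [(x, y) for (x, y) in edges if x != u and y != u]   # take u into the cover
    rest_v = [(x, y) for (x, y) in edges if x != v and y != v]   # take v into the cover
    return vc_bounded_search(rest_u, k - 1) or vc_bounded_search(rest_v, k - 1)

# Example: a 4-cycle has a vertex cover of size 2 but not of size 1.
E = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(vc_bounded_search(E, 2))   # True
print(vc_bounded_search(E, 1))   # False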

25 Parametrized complexity Combinatorial "explosion" for NP-hard problems [Adapted from R.Downey and M.Fellows]

26 Parametrized complexity Parametrized complexity attempts to confine combinatorial "explosion" [Adapted from R.Downey and M.Fellows]

27 Parametrized complexity - Definitions Definition (FPT) A parametrized problem L ⊆ Σ* × Σ* is Fixed Parameter Tractable if there is an algorithm that for input (x,y) ∈ Σ* × Σ* with |x| = k and |y| = n decides whether (x,y) ∈ L in time f(k)·n^α, where f is an arbitrary function and α is a constant. The definition does not change if f(k)·n^α is replaced by f(k) + n^α (!)

28 Parametrized complexity - Definitions Suppose M_k solves the problem in time f(k)·n^α for every input of size n. For each k there is a constant c_k such that f(k)·n^α ≤ n^(α+1) for all n ≥ c_k. Define M'_k, which: - simulates M_k for n ≥ c_k - looks up the answer from a precomputed table for n < c_k. Then M'_k solves the problem in time f(k)·c_k^α + n^(α+1).

