
1 Weighted Matching-Algorithms, Hamiltonian Cycles and TSP
Graphs & Algorithms, Lecture 6

2 Weighted bipartite matching
Given: K_{n,n} (the complete bipartite graph on 2n vertices) and an n × n weight matrix with entries w_{i,j} ≥ 0.
Want: a perfect matching M maximizing the total weight.
Weighted cover (u, v): a choice of vertex labels u = (u_1, …, u_n) and v = (v_1, …, v_n), u, v ∈ R^n, such that for all 1 ≤ i, j ≤ n we have u_i + v_j ≥ w_{i,j}.
Cost: c(u, v) := Σ_i (u_i + v_i).
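As a quick illustration of these definitions, here is a small sketch (not from the slides; the names is_weighted_cover and cover_cost are made up) that checks the cover condition and computes the cost for a toy 3 × 3 weight matrix:

def is_weighted_cover(w, u, v):
    """Check the cover condition u_i + v_j >= w_ij for all i, j."""
    n = len(w)
    return all(u[i] + v[j] >= w[i][j] for i in range(n) for j in range(n))

def cover_cost(u, v):
    """Cost c(u, v) = sum_i u_i + sum_i v_i."""
    return sum(u) + sum(v)

# Example: the trivial feasible cover u_i = max_j w_ij, v_j = 0.
w = [[7, 5, 3],
     [2, 8, 4],
     [6, 1, 9]]
u = [max(row) for row in w]
v = [0, 0, 0]
assert is_weighted_cover(w, u, v)
print(cover_cost(u, v))   # 24, an upper bound on the weight of every perfect matching

The trivial cover u_i = max_j w_{i,j}, v_j = 0 is always feasible; it is also the usual starting point of the Hungarian Algorithm on slide 4.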

3 Duality: max. weighted matching and min. weighted vertex cover
For any matching M and weighted cover (u, v) of a weighted bipartite graph, we have w(M) ≤ c(u, v). Moreover, w(M) = c(u, v) if and only if M consists of edges {i, j} such that u_i + v_j = w_{i,j}; in this case, M and (u, v) are optimal.
Equality subgraph G_{u,v} of a fixed cover (u, v): the spanning subgraph of K_{n,n} with {i, j} ∈ E(G_{u,v}) ⇔ u_i + v_j = w_{i,j}.
Idea: a perfect matching of G_{u,v} corresponds to a maximum weight matching of K_{n,n}.
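A small sketch of the equality subgraph (illustrative; equality_subgraph is not a name from the slides) for the same toy matrix and the trivial cover:

def equality_subgraph(w, u, v):
    """Adjacency lists of G_{u,v}: row i is joined to column j iff u_i + v_j = w_ij."""
    n = len(w)
    return [[j for j in range(n) if u[i] + v[j] == w[i][j]]
            for i in range(n)]

w = [[7, 5, 3],
     [2, 8, 4],
     [6, 1, 9]]
u = [max(row) for row in w]     # trivial cover u = (7, 8, 9), v = (0, 0, 0)
v = [0, 0, 0]
print(equality_subgraph(w, u, v))   # [[0], [1], [2]]

Here G_{u,v} already contains the perfect matching (row 0, col 0), (1, 1), (2, 2) of weight 24 = c(u, v), so that matching is optimal by the duality above. In general G_{u,v} has no perfect matching and the cover must be adjusted, which is what the Hungarian Algorithm does.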

4 Hungarian Algorithm Kuhn (1955), Munkres (1957)
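The slide itself is only the heading, so the following is a rough, self-contained Python sketch of the Kuhn-Munkres method in the cover / equality-subgraph formulation of slides 2 and 3: start from the trivial cover, look for a perfect matching in G_{u,v} with augmenting paths, and when none exists shrink the cover by the smallest slack ε. It is an illustrative O(n^4)-style implementation for integer weights, not the slide's original pseudocode:

def hungarian(w):
    """Maximum-weight perfect matching on K_{n,n} for an n x n matrix w of
    nonnegative integer weights. Returns (matching, u, v): the matching as a
    list of (row, column) pairs plus the final weighted cover."""
    n = len(w)
    u = [max(row) for row in w]          # trivial feasible cover: u_i = max_j w_ij
    v = [0] * n                          # v_j = 0

    def eq_adj():
        # adjacency lists of the equality subgraph G_{u,v}
        return [[j for j in range(n) if u[i] + v[j] == w[i][j]]
                for i in range(n)]

    def augment(i, adj, match_col, seen):
        # Kuhn's augmenting-path search from row i inside G_{u,v}
        for j in adj[i]:
            if not seen[j]:
                seen[j] = True
                if match_col[j] is None or augment(match_col[j], adj, match_col, seen):
                    match_col[j] = i
                    return True
        return False

    while True:
        adj = eq_adj()
        match_col = [None] * n           # match_col[j] = row matched to column j
        for i in range(n):
            augment(i, adj, match_col, [False] * n)
        if all(m is not None for m in match_col):
            return [(match_col[j], j) for j in range(n)], u, v
        # No perfect matching in G_{u,v}: mark rows/columns reachable from an
        # unmatched row by alternating paths (Koenig's construction) ...
        matched_rows = set(m for m in match_col if m is not None)
        row_reached = [i not in matched_rows for i in range(n)]
        col_reached = [False] * n
        changed = True
        while changed:
            changed = False
            for i in range(n):
                if not row_reached[i]:
                    continue
                for j in adj[i]:
                    if not col_reached[j]:
                        col_reached[j] = True
                        changed = True
                        if match_col[j] is not None:
                            row_reached[match_col[j]] = True
        # ... then shrink the cover by the smallest slack eps on edges going
        # from reached rows to unreached columns.
        eps = min(u[i] + v[j] - w[i][j]
                  for i in range(n) if row_reached[i]
                  for j in range(n) if not col_reached[j])
        for i in range(n):
            if row_reached[i]:
                u[i] -= eps
        for j in range(n):
            if col_reached[j]:
                v[j] += eps

For the toy matrix of slide 2, hungarian(w) returns the diagonal matching of weight 24 together with the cover u = (7, 8, 9), v = (0, 0, 0). On termination the weight of the returned matching always equals the cost of the returned cover, which certifies optimality by the duality of slide 3.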

5 Correctness of the Hungarian Method
Theorem. The Hungarian Algorithm finds a maximum weight matching and a minimum cost cover.
Proof. The statement is true if the algorithm terminates. Loop invariant: consider (u, v) before and (u', v') after one iteration of the while loop:
(u, v) is a cover of G ⇒ (u', v') is a cover of G
c(u, v) ≥ c(u', v') + ε ≥ w(M)
For rational weights, ε is bounded from below by a positive constant, so the algorithm terminates. In the presence of irrational weights, a more careful selection of the minimum vertex cover is necessary.

6 Hamiltonian Cycles
A graph on n vertices is Hamiltonian if it contains a simple cycle of length n. The Hamiltonian-cycle problem is NP-complete (by reduction from the vertex-cover problem). The naïve algorithm has running time Ω(n!), which is ω(2^n). What are sufficient conditions for a graph to be Hamiltonian?
Theorem (Dirac 1952). Every graph with n ≥ 3 vertices and minimum degree at least n/2 has a Hamiltonian cycle.
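To make the factorial bound concrete, a minimal brute-force check (illustrative only; is_hamiltonian is a made-up name) that tries all (n - 1)! cyclic orderings:

from itertools import permutations

def is_hamiltonian(n, edges):
    """Brute force: does the graph on vertices 0..n-1 with the given edge list
    contain a Hamiltonian cycle? Tries all (n-1)! cyclic orderings."""
    adj = [[False] * n for _ in range(n)]
    for a, b in edges:
        adj[a][b] = adj[b][a] = True
    for perm in permutations(range(1, n)):       # fix vertex 0 to factor out rotations
        cycle = (0,) + perm
        if all(adj[cycle[k]][cycle[(k + 1) % n]] for k in range(n)):
            return True
    return False

print(is_hamiltonian(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))   # True  (the 4-cycle C_4)
print(is_hamiltonian(4, [(0, 1), (0, 2), (0, 3)]))           # False (the star K_{1,3})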

7 Proof of Dirac’s Theorem (Pósa)
Suppose G is a graph with δ(G) ≥ n/2 that contains no Hamiltonian cycle. Insert as many edges into G as possible without creating a Hamiltonian cycle ⇒ embedding of G into a saturated graph G' that contains a Hamiltonian path x_1 x_2 … x_n. The neighbourhood N(x_1) yields n/2 forbidden neighbours for x_n in {x_1, …, x_{n-2}}: if x_1 x_i ∈ E(G') and x_{i-1} x_n ∈ E(G'), then x_1 x_2 … x_{i-1} x_n x_{n-1} … x_i x_1 would be a Hamiltonian cycle. Since x_n cannot connect to itself, at most (n - 1) - n/2 < n/2 possible neighbours remain, contradicting δ(G') ≥ n/2.
[Figure: Hamiltonian path x_1 x_2 … x_{i-1} x_i … x_n with the chords x_1 x_i and x_{i-1} x_n.]
Theorem of Chvátal?

8 Weaker degree conditions
Let G be a graph on n vertices with degrees d_1 ≤ … ≤ d_n; (d_1, …, d_n) is called the degree sequence of G. An integer sequence (a_1, …, a_n) is Hamiltonian if every graph on n vertices with a pointwise greater degree sequence (d_1, …, d_n) is Hamiltonian.
Theorem (Chvátal 1972). An integer sequence (a_1, …, a_n) such that 0 ≤ a_1 ≤ … ≤ a_n < n and n ≥ 3 is Hamiltonian if and only if, for all i < n/2, we have: a_i ≤ i ⇒ a_{n-i} ≥ n - i.
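Chvátal's condition is easy to test directly; a small sketch (the function name is made up; the 1-based a_i are stored in a 0-based Python list):

def chvatal_hamiltonian(a):
    """Test Chvatal's condition for a nondecreasing integer sequence
    a = (a_1, ..., a_n) with 0 <= a_1 <= ... <= a_n < n and n >= 3:
    for every i < n/2, a_i <= i must imply a_{n-i} >= n - i."""
    n = len(a)
    for i in range(1, (n + 1) // 2):             # all integers i with 1 <= i < n/2
        if a[i - 1] <= i and a[n - i - 1] < n - i:
            return False
    return True

# (2, 2, 2, 2, 2) is not a Hamiltonian sequence: the bowtie (two triangles
# sharing a vertex) has the pointwise greater degree sequence (2, 2, 2, 2, 4)
# but no Hamiltonian cycle.
print(chvatal_hamiltonian([2, 2, 2, 2, 2]))      # False
# All degrees >= n/2 = 3 on six vertices, i.e. Dirac's condition.
print(chvatal_hamiltonian([3, 3, 3, 3, 3, 3]))   # True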

9 Traveling Salesman Problem (TSP)
Given n cities and costs c(i, j) ≥ 0 for going from city i to city j (and vice versa). Find a Hamiltonian cycle H* of minimum cost c(H*) = Σ_{e ∈ E(H*)} c(e). Existence of a Hamiltonian cycle is a special case.
Brute force: n! = Θ(n^{n+1/2} e^{-n}) time complexity (Stirling's formula).
Better solutions:
a polynomial time approximation algorithm with approximation ratio 2 for TSP with the triangle inequality
an optimal solution with running time O(n^2 2^n)
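The n! brute force as a short sketch (illustrative; the symmetric cost matrix below is a made-up example):

from itertools import permutations

def tsp_brute_force(c):
    """Try all (n-1)! tours through cities 0..n-1 (city 0 is fixed as the start);
    c[i][j] is the symmetric travel cost between i and j."""
    n = len(c)
    best_cost, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm
        cost = sum(c[tour[k]][tour[(k + 1) % n]] for k in range(n))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

c = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
print(tsp_brute_force(c))   # (18, (0, 1, 3, 2))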

10 Approximation algorithms
Consider a minimization problem. An algorithm ALG achieves approximation ratio ρ(n) ≥ 1 if, for every problem instance P of size n, we have ALG(P) / OPT(P) ≤ ρ(n), where OPT(P) is the optimal value of P. An approximation scheme takes one additional parameter ε and achieves approximation ratio (1 + ε) on every problem instance. A polynomial time approximation scheme (PTAS) runs in polynomial time for every fixed ε > 0. The running time of a fully polynomial time approximation scheme (FPTAS) is also polynomial in 1/ε.

11 2-approximation algorithm for TSP
Running time of the minimum spanning tree computation:
Prim's algorithm with Fibonacci heaps: O(E + V log V)
Kruskal's algorithm: O(E log V)
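The slide lists only the running times of the minimum-spanning-tree step, so here is a sketch of the standard tree-doubling 2-approximation it refers to: compute an MST, walk it in preorder, and shortcut repeated vertices. This version uses a plain O(n^2) Prim instead of the Fibonacci-heap variant, and the factor-2 guarantee needs the triangle inequality; function names are illustrative.

from math import dist

def tsp_two_approx(c):
    """MST-based 2-approximation for metric TSP: c is a symmetric cost matrix
    satisfying the triangle inequality. Returns (tour, cost of the tour)."""
    n = len(c)
    # Prim's algorithm, plain O(n^2) version: grow an MST from vertex 0.
    in_tree = [False] * n
    parent = [0] * n
    best = [float("inf")] * n
    best[0] = 0
    children = [[] for _ in range(n)]
    for _ in range(n):
        u = min((x for x in range(n) if not in_tree[x]), key=lambda x: best[x])
        in_tree[u] = True
        if u != 0:
            children[parent[u]].append(u)
        for x in range(n):
            if not in_tree[x] and c[u][x] < best[x]:
                best[x], parent[x] = c[u][x], u
    # Preorder walk of the MST; skipping repeated vertices yields the tour.
    tour, stack = [], [0]
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    cost = sum(c[tour[k]][tour[(k + 1) % n]] for k in range(n))
    return tour, cost

# Four points in the plane (Euclidean distances satisfy the triangle inequality).
pts = [(0, 0), (0, 2), (3, 0), (3, 2)]
c = [[dist(p, q) for q in pts] for p in pts]
print(tsp_two_approx(c))   # tour [0, 1, 2, 3] of cost about 11.21; the optimum is 10

The bound follows because the optimal tour minus one edge is a spanning tree, so c(MST) ≤ OPT, and the shortcut preorder tour costs at most twice the MST by the triangle inequality.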

12 An optimal TSP algorithm
Let V = {1, …, n} and positive costs c(i, j) be given. For each S ⊆ {2, …, n} and k ∈ S, define P(S, k) := minimum cost of a Hamiltonian path on {1} ∪ S starting in 1 and ending in k. Then
TSP = min{ P(V \ {1}, k) + c(k, 1) : k ∈ V \ {1} }
Recursive computation of P(S, k):
P({k}, k) = c(1, k)
P(S, k) = min{ P(S \ {k}, j) + c(j, k) : j ∈ S \ {k} }
Compute P(S, k) bottom-up (dynamic programming). The number of distinct values P(S, k) is (n - 1) 2^{n-2}, and computing each one takes at most n operations.
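A compact dynamic-programming sketch of this recursion (illustrative; subsets of {2, …, n} are encoded as bitmasks over the 0-indexed cities 1, …, n - 1):

def tsp_held_karp(c):
    """Exact TSP via the dynamic program above: P[S][k] is the minimum cost of a
    path that starts in city 0, visits exactly the cities of S (a bitmask over
    cities 1..n-1) and ends in k.  O(n^2 2^n) time, O(n 2^n) space."""
    n = len(c)
    FULL = 1 << (n - 1)                      # all subsets of {1, ..., n-1}
    INF = float("inf")
    P = [[INF] * n for _ in range(FULL)]
    for k in range(1, n):
        P[1 << (k - 1)][k] = c[0][k]         # P({k}, k) = c(1, k)
    for S in range(1, FULL):
        for k in range(1, n):
            if not ((S >> (k - 1)) & 1) or P[S][k] == INF:
                continue
            for j in range(1, n):            # extend the path ending in k by city j
                if (S >> (j - 1)) & 1:
                    continue
                T = S | (1 << (j - 1))
                P[T][j] = min(P[T][j], P[S][k] + c[k][j])
    # Close the cycle: return to city 0 from the last city k.
    return min(P[FULL - 1][k] + c[k][0] for k in range(1, n))

print(tsp_held_karp([[0, 2, 9, 10],
                     [2, 0, 6, 4],
                     [9, 6, 0, 3],
                     [10, 4, 3, 0]]))        # 18, matching the brute force on slide 9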

13 Example

14 More positive and negative results on TSP
Slight modifications yield a 1.5-approximation algorithm for TSP with the Δ-inequality. (Exercise)
Arora (1996) gave a PTAS for Euclidean TSP with running time n^{O(1/ε)}.
For general TSP, there exists no polynomial time approximation algorithm with any constant approximation ratio ρ ≥ 1, unless P = NP. (Exercise)

