
1
Weighted Matching Algorithms, Hamiltonian Cycles and TSP
Graphs & Algorithms, Lecture 6

2
Weighted bipartite matching
Given:
– K_{n,n} (the complete bipartite graph on 2n vertices)
– an n × n weight matrix with entries w_{i,j} ≥ 0
Want: a perfect matching M maximizing the total weight w(M).
Weighted cover (u, v): a choice of vertex labels u = (u_1, …, u_n) and v = (v_1, …, v_n), u, v ∈ R^n, such that for all 1 ≤ i, j ≤ n we have u_i + v_j ≥ w_{i,j}.
Cost: c(u, v) := Σ_i u_i + Σ_i v_i.

3
Duality: maximum weighted matching and minimum weighted cover
For any matching M and weighted cover (u, v) of a weighted bipartite graph, we have w(M) ≤ c(u, v). Moreover, w(M) = c(u, v) if and only if M consists of edges {i, j} such that u_i + v_j = w_{i,j}. In this case, M and (u, v) are both optimal.
Equality subgraph G_{u,v} of a fixed cover (u, v):
– spanning subgraph of K_{n,n},
– {i, j} ∈ E(G_{u,v}) ⇔ u_i + v_j = w_{i,j}.
Idea: a perfect matching of G_{u,v} corresponds to a maximum weight matching of K_{n,n}.
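The optimality certificate in the duality statement is easy to check numerically. A minimal sketch on a hypothetical 2×2 instance (the weight matrix and cover below are ours, chosen so that the bound is tight):

```python
# Numeric check of weak duality w(M) <= c(u, v) on a hypothetical
# 2x2 instance, and of the equality subgraph of a cover.

w = [[3, 2], [2, 1]]           # weights of K_{2,2}; w[i][j] = w_{i,j}
u, v = [2, 1], [1, 0]          # a feasible cover: u_i + v_j >= w[i][j]

n = len(w)
assert all(u[i] + v[j] >= w[i][j] for i in range(n) for j in range(n))

# Equality subgraph: edges on which the cover constraint is tight.
equality_edges = [(i, j) for i in range(n) for j in range(n)
                  if u[i] + v[j] == w[i][j]]

cover_cost = sum(u) + sum(v)                  # c(u, v) = 4
M = [(0, 0), (1, 1)]                          # perfect matching of tight edges
matching_weight = sum(w[i][j] for i, j in M)  # w(M) = 3 + 1 = 4
assert matching_weight == cover_cost          # equality certifies optimality
```

Since every edge of M lies in the equality subgraph, the theorem guarantees that M is a maximum weight matching and (u, v) a minimum cost cover.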

4
Hungarian Algorithm
Kuhn (1955), Munkres (1957)
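The slides leave the algorithm's steps to the board. Below is a compact sketch of the standard potentials-based O(n^3) implementation of the Kuhn–Munkres method, written for the minimization version; to maximize total weight, negate the matrix first. Variable names and the example instance are ours:

```python
def hungarian(cost):
    """Minimum-cost perfect matching in an n x n cost matrix.
    Returns assignment[i] = column matched to row i.
    u, v are the dual vertex labels (potentials)."""
    n = len(cost)
    INF = float("inf")
    u = [0] * (n + 1)           # row potentials
    v = [0] * (n + 1)           # column potentials
    p = [0] * (n + 1)           # p[j] = row currently matched to column j
    way = [0] * (n + 1)         # alternating-tree back-pointers
    for i in range(1, n + 1):   # insert rows one at a time
        p[0] = i
        j0 = 0
        minv = [INF] * (n + 1)
        used = [False] * (n + 1)
        while True:             # Dijkstra-like search for an augmenting path
            used[j0] = True
            i0, delta, j1 = p[j0], INF, 0
            for j in range(1, n + 1):
                if not used[j]:
                    cur = cost[i0 - 1][j - 1] - u[i0] - v[j]
                    if cur < minv[j]:
                        minv[j], way[j] = cur, j0
                    if minv[j] < delta:
                        delta, j1 = minv[j], j
            for j in range(n + 1):      # update the potentials
                if used[j]:
                    u[p[j]] += delta
                    v[j] -= delta
                else:
                    minv[j] -= delta
            j0 = j1
            if p[j0] == 0:
                break
        while j0:               # augment along the found path
            j1 = way[j0]
            p[j0] = p[j1]
            j0 = j1
    assignment = [0] * n
    for j in range(1, n + 1):
        assignment[p[j] - 1] = j - 1
    return assignment

# Maximum weight matching on a hypothetical 2x2 instance: negate the weights.
w = [[1, 4], [2, 1]]
match = hungarian([[-x for x in row] for row in w])
best = sum(w[i][match[i]] for i in range(len(w)))  # edges (0,1) and (1,0)
```

The potentials u, v maintained by the code play exactly the role of the weighted cover from the previous slide: on termination, the cover cost equals the matching weight.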

5
Correctness of the Hungarian Method
Theorem. The Hungarian Algorithm finds a maximum weight matching and a minimum cost cover.
Proof. The statement is true provided the algorithm terminates. Loop invariant: consider (u, v) before and (u', v') after one iteration of the while loop:
– (u, v) is a cover of G ⇒ (u', v') is a cover of G
– c(u, v) ≥ c(u', v') + ε ≥ w(M) for some ε > 0
For rational weights, ε is bounded from below by an absolute constant, so the algorithm terminates. In the presence of irrational weights, a more careful selection of the minimum vertex cover is necessary.

6
Hamiltonian Cycles
A graph on n vertices is Hamiltonian if it contains a simple cycle of length n.
The Hamiltonian-cycle problem is NP-complete (by reduction from the vertex-cover problem).
The naïve algorithm has running time Ω(n!), which is 2^Ω(n log n).
What are sufficient conditions for Hamiltonian graphs?
Theorem (Dirac 1952). Every graph with n ≥ 3 vertices and minimum degree at least n/2 has a Hamiltonian cycle.
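Dirac's condition is a one-liner to verify on an adjacency list. A minimal sketch (the example graphs are ours); note the condition is sufficient but not necessary, e.g. the 5-cycle is Hamiltonian yet fails it:

```python
def satisfies_dirac(adj):
    """adj: dict vertex -> set of neighbours.
    True iff n >= 3 and every degree is at least n/2."""
    n = len(adj)
    return n >= 3 and all(2 * len(nbrs) >= n for nbrs in adj.values())

cycle4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}  # C_4: degrees 2 = 4/2
path4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}         # P_4: endpoints degree 1
```

Here `satisfies_dirac(cycle4)` holds, while the path `path4` fails the degree bound (and indeed has no Hamiltonian cycle).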

7
Proof of Dirac's Theorem (Pósa)
Suppose G is a graph with δ(G) ≥ n/2 that contains no Hamiltonian cycle. Insert as many edges into G as possible without creating a Hamiltonian cycle; this embeds G into a saturated graph G' that still has no Hamiltonian cycle but contains a Hamiltonian path x_1, x_2, …, x_n. Whenever x_i is a neighbour of x_1, the vertex x_{i−1} is a forbidden neighbour of x_n (otherwise x_1 … x_{i−1} x_n x_{n−1} … x_i x_1 would be a Hamiltonian cycle). The neighbourhood of x_1 thus yields at least n/2 forbidden vertices for x_n in {x_1, …, x_{n−2}}. Since x_n cannot be adjacent to itself either, there is not enough room among the remaining vertices for all n/2 of its neighbours, a contradiction.

8
Weaker degree conditions
Let G be a graph on n vertices with degrees d_1 ≤ … ≤ d_n; (d_1, …, d_n) is called the degree sequence of G.
An integer sequence (a_1, …, a_n) is called Hamiltonian if every graph on n vertices with a pointwise greater degree sequence (d_1, …, d_n), i.e. d_i ≥ a_i for all i, is Hamiltonian.
Theorem (Chvátal 1972). An integer sequence (a_1, …, a_n) with 0 ≤ a_1 ≤ … ≤ a_n < n and n ≥ 3 is Hamiltonian iff, for all i < n/2, we have: a_i ≤ i ⇒ a_{n−i} ≥ n − i.
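Chvátal's condition can be checked directly on a sorted sequence. A minimal sketch (the test sequences below are ours):

```python
def chvatal_hamiltonian(a):
    """Chvatal's condition for a sorted integer sequence
    a_1 <= ... <= a_n, passed 0-indexed as a[0..n-1]."""
    n = len(a)
    if n < 3 or a[0] < 0 or a[-1] >= n:
        return False
    for i in range(1, (n + 1) // 2):        # all i with i < n/2
        # a_i <= i must force a_{n-i} >= n - i
        if a[i - 1] <= i and a[n - i - 1] < n - i:
            return False
    return True

# (2, 2, 2, 2): a Dirac-type sequence for n = 4, condition holds.
# (2, 2, 2, 2, 2): fails at i = 2 (a_2 = 2 <= 2 but a_3 = 2 < 3),
# so some graph dominating this sequence is non-Hamiltonian.
```

Note that the condition judges the sequence, not a particular graph: C_5 itself is Hamiltonian, but its degree sequence (2, 2, 2, 2, 2) is dominated by that of non-Hamiltonian graphs as well.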

9
Traveling Salesman Problem (TSP)
Given n cities and costs c(i, j) ≥ 0 for going from city i to city j (and vice versa), find a Hamiltonian cycle H* of minimum cost c(H*) = Σ_{e ∈ E(H*)} c(e).
Deciding the existence of a Hamiltonian cycle is a special case, so TSP is NP-hard.
Brute force: n! = Θ(n^(n+1/2) e^(−n)) time complexity (Stirling).
Better solutions:
– a polynomial-time approximation algorithm with approximation ratio 2 for TSP with the triangle inequality
– an optimal solution with running time O(n^2 2^n)

10
Approximation algorithms
Consider a minimization problem. An algorithm ALG achieves approximation ratio ρ(n) ≥ 1 if, for every problem instance P of size n, we have ALG(P) / OPT(P) ≤ ρ(n), where OPT(P) is the optimal value of P.
An approximation scheme takes one additional parameter ε > 0 and achieves approximation ratio (1 + ε) on every problem instance.
A polynomial-time approximation scheme (PTAS) runs in polynomial time for every fixed ε > 0. The running time of a fully polynomial-time approximation scheme (FPTAS) is also polynomial in ε^(−1).

11
2-approximation algorithm for TSP (with triangle inequality)
Compute a minimum spanning tree T, then output the cities in the order of a preorder walk of T, shortcutting repeated vertices. The triangle inequality guarantees that the shortcuts do not increase the cost, so the tour costs at most twice the MST weight, hence at most 2 · OPT.
Running time, dominated by the MST computation:
– Prim's algorithm with Fibonacci heaps: O(E + V log V)
– Kruskal's algorithm: O(E log V)
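A runnable sketch of the tree-doubling scheme (Prim's algorithm here uses an ordinary binary heap rather than Fibonacci heaps; the 4-city metric instance is ours):

```python
import heapq

def tsp_2_approx(dist):
    """TSP tour of cost <= 2 * OPT for a symmetric matrix `dist`
    satisfying the triangle inequality: build an MST with Prim's
    algorithm, then shortcut a preorder walk of the tree."""
    n = len(dist)
    in_tree = [False] * n
    children = [[] for _ in range(n)]
    heap = [(0, 0, 0)]                  # (edge weight, vertex, parent)
    while heap:
        _, x, parent = heapq.heappop(heap)
        if in_tree[x]:
            continue
        in_tree[x] = True
        if x != 0:
            children[parent].append(x)  # record the MST edge parent-x
        for y in range(n):
            if not in_tree[y]:
                heapq.heappush(heap, (dist[x][y], y, x))
    tour, stack = [], [0]               # preorder walk, skipping repeats
    while stack:
        x = stack.pop()
        tour.append(x)
        stack.extend(reversed(children[x]))
    return tour

# Hypothetical 4-city instance: unit square with shortest-path distances.
D = [[0, 1, 2, 1],
     [1, 0, 1, 2],
     [2, 1, 0, 1],
     [1, 2, 1, 0]]
tour = tsp_2_approx(D)
cost = sum(D[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))
```

On this instance the preorder tour happens to be optimal (cost 4); in general only the 2 · OPT guarantee holds.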

12
An optimal TSP algorithm
Let V = {1, …, n} and positive costs c(i, j) be given. For each S ⊆ {2, …, n} and k ∈ S, define
P(S, k) := minimum cost of a path that starts in 1, visits every vertex of S exactly once, and ends in k.
Then TSP = min { P(V \ {1}, k) + c(k, 1) : k ∈ V \ {1} }.
Recursive computation of P(S, k):
P({k}, k) = c(1, k)
P(S, k) = min { P(S \ {k}, j) + c(j, k) : j ∈ S \ {k} }
Compute P(S, k) bottom-up (dynamic programming).
The number of distinct P(S, k) values is (n − 1) 2^(n−2), and each requires at most n operations, giving total running time O(n^2 2^n).
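The recursion above translates directly into a bottom-up dynamic program. A sketch (cities are 0-indexed here, with city 0 playing the role of city 1 on the slide; the example matrix is ours):

```python
from itertools import combinations

def held_karp(c):
    """Exact TSP in O(n^2 2^n) time by dynamic programming
    over vertex subsets (Bellman-Held-Karp)."""
    n = len(c)
    # P[(S, k)] = minimum cost of a path that starts in 0, visits
    # every city of S exactly once, and ends in k (k in S).
    P = {(frozenset([k]), k): c[0][k] for k in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for k in S:
                rest = S - {k}
                P[(S, k)] = min(P[(rest, j)] + c[j][k] for j in rest)
    everything = frozenset(range(1, n))
    return min(P[(everything, k)] + c[k][0] for k in everything)

# Hypothetical 4-city instance (unit square with shortest-path distances).
D = [[0, 1, 2, 1],
     [1, 0, 1, 2],
     [2, 1, 0, 1],
     [1, 2, 1, 0]]
```

Subsets are processed in order of increasing size, so P(S \ {k}, j) is always available when P(S, k) is computed, exactly as in the recursion on the slide.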

13
Example

14
More positive and negative results on TSP
Slight modifications yield a 1.5-approximation algorithm for TSP with the triangle inequality. (Exercise)
Arora (1996) gave a PTAS for Euclidean TSP with running time n^(O(1/ε)).
For general TSP, there exists no polynomial-time approximation algorithm with any constant approximation ratio ρ ≥ 1, unless P = NP. (Exercise)
