The Traveling Salesman Problem in Theory & Practice. Lecture 5: Tour Construction Heuristics. 18 February 2014. David S. Johnson.


1 The Traveling Salesman Problem in Theory & Practice. Lecture 5: Tour Construction Heuristics. 18 February 2014. David S. Johnson, dstiflerj@gmail.com, http://davidsjohnson.net. Seeley Mudd 523, Tuesdays and Fridays.

2 Outline. Tour Construction Heuristics: Definitions and Worst-Case Results; NN, Greedy, Savings, Insertion Algorithms, etc.; Performance in Practice (Tour Quality).

3 Nearest Neighbor (NN):
1. Start with some city.
2. Repeatedly go next to the nearest unvisited neighbor of the last city added.
3. When all cities have been added, go from the last back to the first.

4 Theorem [Rosenkrantz, Stearns, & Lewis, “An analysis of several heuristics for the traveling salesman problem,” SIAM J. Comput. 6 (1977), 563-581]:
1. There is a constant c such that, for any instance I obeying the Δ-inequality, NN(I) ≤ c·log(N)·OPT(I). [Proved last week]
2. There exists a constant c' such that, for all N > 3, there are N-city instances I obeying the triangle inequality for which NN(I) > c'·log(N)·OPT(I). [To be proved today]
Definition: For any algorithm A and instance size N, the worst-case ratio for A is R_N(A) = max{A(I)/OPT(I) : I is an N-city instance}.
Corollary: R_N(NN) = Θ(log(N)).

5 Lower Bound Examples (unspecified distances are determined by shortest paths). [Figure: the instances F_1 and F_2, with edge lengths 1, 1+ε, and 2.]
F_1: number N_1 of vertices = 3; OPT(F_1) = 3 + ε; NN-path(F_1) = 2.
F_2: N_2 = 2N_1 + 3 = 9; OPT(F_2) = N_2 + 5ε = 9 + 5ε; NN-path(F_2) = 10. The NN path starts at the left and ends in the middle.

6 F_k: [Figure: F_k is built from copies of F_{k-1} (each without the shortcut between its left and right endpoints), joined by edges of length 1, 1+ε, and 2^{k-1}.]
Number N_k of vertices = 2·N_{k-1} + 3.
OPT(F_k) = N_k + ½(N_k + 1)ε (for k ≥ 2).
NN-path(F_k) = 2·NN-path(F_{k-1}) + 2^k + 2.

7 General Formula for N_k: N_k = 3·(2^k - 1).
Proof by induction. Initialization: N_2 = 9 = 3·(2^2 - 1). Assume true for all k' < k. Then N_k = 2N_{k-1} + 3 = 2(3·(2^{k-1} - 1)) + 3 = 3·2^k - 6 + 3 = 3·(2^k - 1).
Note for future reference: N_k is always odd.

8 General Formula for OPT(F_k) when k ≥ 2: OPT(F_k) = N_k + ½(N_k + 1)ε.
Proof: All edge lengths in an optimal tour must be 1 or 1+ε, which means the tour edges must lie along the base of F_k, where the edges start with 1+ε and then alternate; since N_k is odd, we get one more edge of length 1+ε than of length 1.
Note that this means that, as ε → 0 and k → ∞, OPT(F_k) → N_k = 3·(2^k - 1), hence log(OPT(F_k)) → k + 1 + log(3/2), and hence for sufficiently large k we have k + 1 > ½·log(OPT(F_k)).

9 General Formula for NN-path(F_k): NN-path(F_k) = (k+1)·2^k - 2.
Proof by induction. Initialization: NN-path(F_2) = 10 = 3·2^2 - 2. Assume true for all k' < k. Then NN-path(F_k) = 2·NN-path(F_{k-1}) + 2^k + 2 = 2(k·2^{k-1} - 2) + 2^k + 2 = (k+1)·2^k - 2.
Corollary: as ε → 0 and k → ∞, NN-path(F_k)/OPT(F_k) → (k+1)·2^k/(3·2^k) > log(OPT(F_k))/6.
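As a quick sanity check on these closed forms (my own addition, not part of the original slides), the following short Python script iterates the two recurrences from the base case k = 2 and compares the results with the formulas above; the function name and structure are mine.

```python
# Verify N_k = 3*(2^k - 1) and NN-path(F_k) = (k+1)*2^k - 2 against the
# recurrences N_k = 2*N_{k-1} + 3 and NN-path(F_k) = 2*NN-path(F_{k-1}) + 2^k + 2.
def check_closed_forms(max_k=20):
    n, nn_path = 9, 10                       # base case: N_2 = 9, NN-path(F_2) = 10
    for k in range(2, max_k + 1):
        assert n == 3 * (2**k - 1)
        assert nn_path == (k + 1) * 2**k - 2
        n = 2 * n + 3                        # advance both recurrences to k+1
        nn_path = 2 * nn_path + 2**(k + 1) + 2
    print("closed forms verified for k = 2 ..", max_k)

if __name__ == "__main__":
    check_closed_forms()
```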

10 F_k again [same figure as slide 6]: N_k = 2·N_{k-1} + 3; OPT(F_k) = N_k + ½(N_k + 1)ε (for k ≥ 2); NN-path(F_k) = 2·NN-path(F_{k-1}) + 2^k + 2. But are these two vertices [the endpoints of the 2^{k-1} edge in the figure] really nearest neighbors?

11 [Figure: F_k, with vertices B and C marked in the two copies of F_{k-1}.] Is d(B,C) + 1 + ε > 2^{k-1}?

12 Results for General k. [Figure: F_k with vertices A, B, and C marked.]
Define: D_1(F_k) = d(A,C), D_2(F_k) = d(B,C) = d(A,B).
Note: a shortest path from A to B need never go leftwards (by the triangle inequality).
We shall prove by induction that D_1(F_k) > 2^{k+1} - 3 and D_2(F_k) > 2^k - 1.

13 [Figure: the two choices for the shortest A-C path in F_k.]
Assume the induction hypothesis (for k' < k): D_1(F_{k'}) > 2^{k'+1} - 3 and D_2(F_{k'}) > 2^{k'} - 1.
There are two choices for the shortest A-C path in F_k. Consequently, D_1(F_k) = D_1(F_{k-1}) + 1 + ε + 2^{k-1} + D_2(F_{k-1}).
Similarly, D_2(F_k) = 2 + ε + D_1(F_{k-1}).

14 Basis for the induction: F_2. [Figure: F_2.] D_1(F_2) = 5 + ε, D_2(F_2) = 3 + ε.
Using D_1(F_k) = D_1(F_{k-1}) + 1 + ε + 2^{k-1} + D_2(F_{k-1}) and D_2(F_k) = 2 + ε + D_1(F_{k-1}):
k:         2        3        4        5
D_1(F_k):  5 + ε    13 + 3ε  29 + 6ε  61 + 11ε
D_2(F_k):  3 + ε    7 + 2ε   15 + 4ε  31 + 7ε
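Similarly (again my own addition, not from the slides), one can track the ε coefficients of D_1 and D_2 explicitly and confirm both the table above and the claimed lower bounds; each value is stored as a pair (a, b) meaning a + b·ε.

```python
# Iterate D_1(F_k) = D_1(F_{k-1}) + (1+eps) + 2^(k-1) + D_2(F_{k-1}) and
# D_2(F_k) = (2+eps) + D_1(F_{k-1}), checking the bounds claimed on slide 12.
def check_d_values(max_k=12):
    d1, d2 = (5, 1), (3, 1)                  # D_1(F_2) = 5 + eps, D_2(F_2) = 3 + eps
    for k in range(2, max_k + 1):
        print(f"k={k}: D1 = {d1[0]} + {d1[1]}eps, D2 = {d2[0]} + {d2[1]}eps")
        # the strict inequalities D1 > 2^(k+1) - 3 and D2 > 2^k - 1 hold because
        # the integer parts meet the bounds and the eps coefficients are positive
        assert d1[0] >= 2**(k + 1) - 3 and d1[1] > 0
        assert d2[0] >= 2**k - 1 and d2[1] > 0
        # advance both recurrences to k+1
        d1, d2 = (d1[0] + 1 + 2**k + d2[0], d1[1] + 1 + d2[1]), (2 + d1[0], 1 + d1[1])

if __name__ == "__main__":
    check_d_values()
```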

15 Let us ignore the precise values of the ε terms:
k:                       2     3     4     5    ...  k
D_2(F_k):                3+    7+    15+   31+       (2^k - 1)+
D_2(F_{k-1}) + 1 + ε:          4+    8+    16+       (2^{k-1})+
2^{k-1}:                       4     8     16        2^{k-1}
So the correct NN choice is the 2^{k-1} edge, as claimed.

16 NN Running Time. To find the k-th vertex, k > 1, we find the shortest distance among the N - k + 1 remaining candidates. Total time = Θ(Σ_{k=2..N} (N - k + 1)) = Θ(N²).
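For concreteness, here is a minimal Θ(N²) sketch of NN over a list of (x, y) points (my own illustration, not code from the lecture; the function and variable names are mine).

```python
import math

def nearest_neighbor_tour(points, start=0):
    """Nearest Neighbor over a list of (x, y) points; Theta(N^2) time."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda c: dist(last, c))  # scan all remaining cities
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour  # the tour implicitly closes with the edge back to tour[0]
```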

17 Greedy (Multi-Fragment) (GR):
1. Sort the edges, shortest first, and treat them in that order, starting with an empty graph.
2. While the current graph is not a TSP tour, attempt to add the next shortest edge. If it would yield a vertex of degree exceeding 2, or a cycle on fewer than N cities, discard the edge.

18 Theorem [Ong & Moore, “Worst-case analysis of two travelling salesman heuristics,” Inform. Proc. Letters 2 (1984), 273-277]: For all instances I obeying the Δ-inequality, GR(I)/OPT(I) = O(log N).
Theorem [A. M. Frieze, “Worst-case analysis of algorithms for travelling salesman problems,” Methods of Operations Research 32 (1979), 93-112]: There are N-city instances I_N for arbitrarily large N that obey the Δ-inequality and have GR(I_N)/OPT(I_N) = Ω(log N / log log N).

19 Greedy Running Time. The initial sort takes Θ(N² log(N)). Processing an edge takes constant time to check for degree 3, plus two union-find operations to check for short cycles, for a total of at most Θ(N² α(N)).* Total time = Θ(N² log(N)).
*Note: The union-find operations can be avoided by storing with each vertex u its degree in the current forest (initially 0) and, if the degree is 1, the identity of the vertex v at the other end of the path containing it, which rules out edge {u,v} as a potential choice of next edge. This information is easily updated in constant time whenever we add an edge.
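Below is a sketch of Greedy along these lines (my own, not from the lecture): edges are sorted once, a degree array rejects degree-3 vertices, and a small union-find rejects premature cycles. It assumes Euclidean (x, y) input and N ≥ 3.

```python
import math
from itertools import combinations

def greedy_tour_edges(points):
    """Multi-fragment (Greedy): returns the N tour edges; O(N^2 log N) overall."""
    n = len(points)
    parent = list(range(n))

    def find(x):                              # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    degree = [0] * n
    edges = sorted(combinations(range(n), 2),
                   key=lambda e: math.dist(points[e[0]], points[e[1]]))
    chosen = []
    for u, v in edges:
        if degree[u] >= 2 or degree[v] >= 2:
            continue                          # would create a vertex of degree 3
        ru, rv = find(u), find(v)
        if ru == rv and len(chosen) < n - 1:
            continue                          # would close a cycle on fewer than N cities
        parent[ru] = rv
        degree[u] += 1
        degree[v] += 1
        chosen.append((u, v))
        if len(chosen) == n:
            break
    return chosen
```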

20 Clarke-Wright “Savings” Heuristic:
1. Start with a pseudo-tour in which an arbitrarily chosen city is the “hub” and the salesman returns to the hub after visiting each city (a multigraph in which every non-hub city is connected by two edges to the hub).
2. For each pair of non-hub cities, let the “savings” be the amount by which the pseudo-tour would be shortened if we added an edge between the two and deleted one edge to the hub from each.
3. As long as we do not yet have a tour, find the pair of non-hub cities that yields the most savings among those that have not yet undergone two such shortcuts, and perform the shortcut for them.
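Here is a sketch of the (parallel) Clarke-Wright procedure just described (my own illustration, with names of my choosing). For each fragment endpoint it records the other endpoint of its fragment, which serves the same purpose as the cycle check mentioned on the running-time slide below.

```python
import math

def clarke_wright_tour_edges(points, hub=0):
    """Clarke-Wright savings sketch; O(N^2 log N), dominated by sorting the savings."""
    n = len(points)
    d = lambda a, b: math.dist(points[a], points[b])
    non_hub = [c for c in range(n) if c != hub]
    # savings(i, j) = d(i, hub) + d(j, hub) - d(i, j), largest first
    savings = sorted(((d(i, hub) + d(j, hub) - d(i, j), i, j)
                      for a, i in enumerate(non_hub) for j in non_hub[a + 1:]),
                     reverse=True)
    shortcuts = {c: 0 for c in non_hub}   # how many shortcuts each city has undergone
    other_end = {c: c for c in non_hub}   # other endpoint of the fragment containing c
    path_edges = []
    for _, i, j in savings:
        if len(path_edges) == len(non_hub) - 1:
            break                          # the non-hub cities already form one path
        if shortcuts[i] >= 2 or shortcuts[j] >= 2 or other_end[i] == j:
            continue                       # would give degree 3 or close a sub-cycle
        ei, ej = other_end[i], other_end[j]
        other_end[ei], other_end[ej] = ej, ei
        shortcuts[i] += 1
        shortcuts[j] += 1
        path_edges.append((i, j))
    ends = [c for c in non_hub if shortcuts[c] < 2]   # the two remaining path endpoints
    return path_edges + [(hub, e) for e in ends]      # edges of the final tour
```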

21 Clarke-Wright “Savings” Heuristic. Theorem [A. M. Frieze, “Worst-case analysis of algorithms for travelling salesman problems,” Methods of Operations Research 32 (1979), 93-112]: There are N-city instances I_N for arbitrarily large N that obey the Δ-inequality and have Savings(I_N)/OPT(I_N) = Ω(log N / log log N). No upper bounds are known under the triangle inequality.
We do have R_N(A) = Θ(log(N)) for a sequential variant of the Savings heuristic [Ong & Moore, 1984; B. Golden, “Evaluating a sequential vehicle routing algorithm,” AIIE Transactions 9 (1977), 204-208], in which we:
1. Pick an initial non-hub vertex v as the “current” vertex.
2. Perform the best legal shortcut involving v and some other non-hub vertex that has not yet been involved in a shortcut.
3. Declare the other non-hub vertex involved in the shortcut to be the new “current” vertex.
4. Repeat until all non-hub vertices have been involved in a shortcut.

22 Savings Heuristic Running Time. Mirrors Greedy: sort all (N-1)(N-2)/2 potential shortcuts by decreasing savings, Θ(N² log(N)); processing a shortcut takes constant time to check for involvement in more than two shortcuts, plus two union-find operations to check for short cycles, for a total of at most Θ(N² α(N)).* Total time = Θ(N² log(N)). The sequential version mirrors NN analogously: total time = Θ(N²). *See the footnote for the Greedy running time; the same observations apply.

23 Nearest Addition (NA):
1. Start with a 2-city tour consisting of some city and its nearest neighbor.
2. Repeatedly insert the non-tour city u that is closest to a tour city v into one of the tour edges involving v (the one that yields the shorter tour).

24 Insertion Variants.
Nearest Insertion (NI):
1. Start with a 2-city tour consisting of some city and its nearest neighbor.
2. Repeatedly insert the non-tour city u that is closest to a tour city v into the tour edge that yields the shortest new tour (not necessarily one involving v).
Cheapest Insertion (CI):
1. Start with a 2-city tour consisting of some city and its nearest neighbor.
2. Repeatedly perform the insertion of a non-tour vertex w into a tour edge {u,v} that results in the shortest new tour.

25 More Insertion Variants.
Farthest Insertion (FI):
1. Start with a 2-city tour consisting of the two cities at maximum distance from each other.
2. Repeatedly insert the non-tour city u whose minimum distance to a tour city is maximum, into the tour edge that yields the shortest new tour.
Arbitrary Insertion (AI):
1. Start with some 2-city tour.
2. Repeatedly pick some non-tour vertex u and insert it into the tour edge that yields the shortest new tour.
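As an illustration of how these variants share one skeleton, here is a sketch (mine, not from the lecture) in which a pluggable selection rule decides the next city and the insertion position is always the cheapest edge. For simplicity it always starts from city 0 and its nearest neighbor, whereas FI as described above starts from the two mutually farthest cities; CI, which picks the city and edge jointly, does not quite fit this interface.

```python
import math

def insertion_tour(points, select):
    """Generic insertion heuristic; `select` chooses which city to insert next."""
    n = len(points)
    d = lambda a, b: math.dist(points[a], points[b])
    first = min(range(1, n), key=lambda c: d(0, c))     # 2-city starting tour
    tour = [0, first]
    remaining = set(range(n)) - set(tour)
    while remaining:
        c = select(tour, remaining, d)
        # insert c into the tour edge whose replacement costs least
        i = min(range(len(tour)),
                key=lambda e: d(tour[e], c) + d(c, tour[(e + 1) % len(tour)])
                              - d(tour[e], tour[(e + 1) % len(tour)]))
        tour.insert(i + 1, c)
        remaining.remove(c)
    return tour

# NI: next city is the one closest to the current tour.
nearest_rule = lambda tour, rem, d: min(rem, key=lambda c: min(d(c, t) for t in tour))
# FI: next city is the one whose distance to the current tour is largest.
farthest_rule = lambda tour, rem, d: max(rem, key=lambda c: min(d(c, t) for t in tour))
```

For example, insertion_tour(pts, farthest_rule) gives a Farthest-Insertion-style tour on a point list pts.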

26 Farthest Insertion

27 Theorem [Rosenkrantz, Stearns, & Lewis, 1977]: If instance I obeys the triangle inequality, then 1. R_N(AI) = O(log(N)); 2. R_N(NA) = R_N(NI) = R_N(CI) = 2. The best lower bound known for R_N(AI) is Ω(log(N)/loglog(N)) [Y. Azar, “Lower bounds for insertion methods for TSP,” Combinatorics, Probability & Computing 3 (1994), 285-292]. The best lower bound known for R_N(FI) is R_N(FI) ≥ 6.5 [C. A. J. Hurkens, “Nasty TSP instances for farthest insertion,” IPCO Proc. (1992), 346-352].

28 Upper Bound Proofs. The proof that R_N(AI) = O(log(N)) mirrors that for NN. For the R_N(A) ≤ 2 bounds, the results for NA and NI are straightforward:
- Note that the vertices are added to the tour in the same order that they are added to the tree in Prim’s algorithm for computing an MST, and the optimal tour is at least as long as the MST.
- Thus we need only show that the increase in tour length when a vertex u is added under NA is no more than twice the increase in tree length in Prim’s algorithm, i.e., at most 2d(u,v), where v is u’s nearest neighbor in the tour.
If u is inserted between v and its tour neighbor w, then d(u,w) ≤ d(u,v) + d(v,w) by the Δ-inequality, so the increase d(u,v) + d(u,w) - d(v,w) ≤ 2d(u,v). The result for NI then follows, since its cost at each step never exceeds NA’s cost.

29 Upper Bound for CI. Basic plan: we will associate each insertion with a unique edge of the MST, with the cost of the insertion being no more than twice the cost of the edge. Label the vertices v_1, v_2, …, v_N in the order by which they are added to the tour under CI. Let T be the MST and let T_i be the partial tour just before v_i is added. Say v_j is compatible with v_i if j < i and every internal vertex v_k of the path in T from v_j to v_i has k > i (in other words, v_k is not in T_i). For each i > 1, the “critical vertex” for v_i is the compatible vertex for v_i with largest index, and the “critical edge” for v_i is the first edge on the path to v_i from its critical vertex.

30 Critical Vertices and Edges. [Figure: an example tree with vertices labeled 1 through 10 in insertion order.] Note: In this example, every edge is critical for precisely one vertex. We can prove that this will always be the case.

31 Unique Criticality. Suppose an edge {v_i, v_j}, i < j, is critical for two distinct vertices, v_h and v_k, with h < k. Since v_i is the critical vertex for v_h, we have that h > i and that v_j and all the internal vertices on the path from v_i to v_h must exceed h. (We cannot have h = j, since that would imply j < k and so v_i would not be compatible with v_k.) Since {v_i, v_j} is critical for v_k, all the internal vertices of the path from v_i to v_k must exceed k > h. (In the case where k = j there are no internal vertices.) Hence, all the internal vertices on the path from v_h to v_k must have indices exceeding h, and so v_h is compatible with v_k, with index h > i, implying that v_i is NOT the critical vertex for v_k and {v_i, v_j} is not the critical edge, a contradiction. [Figure: the tree paths among v_i, v_j, v_h, and v_k.]

32 Completing the CI Proof. We first show that, for each inserted vertex v_k, 2 ≤ k ≤ N, the cost of inserting v_k is no more than twice the cost of its critical edge. Let the critical edge be {v_i, v_j}, i < j. By the definition of critical edge we must have j ≥ k. Thus, at the time v_k is added, v_i was in the tour but v_j was not. Hence d(v_i, v_j) must be at least as large as the length of the shortest edge joining a tour vertex u to a non-tour vertex v. By our NA argument, the insertion of v into a tour edge incident on u costs at most 2d(u,v) ≤ 2d(v_i, v_j); the cheapest insertion can cost no more. Summing over all inserted vertices, the total tour length for CI is at most twice the sum of the lengths of all the critical edges, which by the uniqueness lemma is simply the length of the MST. QED

33 Lower Bound Examples. [Figure: points along a line; points two apart have distance 2 - ε.] Adjacent points have distance 1; all other distances are the distance along the line minus ε. NA(I) = NI(I) = CI(I) = 2N - 2 - (N-2)ε, while OPT(I) = N + 1 - ε, so as ε → 0 and N → ∞ the ratio approaches 2, matching the upper bound.

34 More Lower Bounds: Double MST and Christofides. Recall: both algorithms start by computing a minimum spanning tree [O(N²) time] and then add edges so that every vertex has even degree:
- Double MST: double the edges of the spanning tree [O(N) time].
- Christofides: add a minimum-weight matching on the odd-degree vertices [O(N³) time].
Then find an Euler tour of the resulting graph [O(N) time] and traverse it, taking shortcuts to avoid revisiting vertices [O(N) time].
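A sketch of the Double MST pipeline (my own code, not the lecture's): Prim's algorithm builds the tree in O(N²), and a preorder walk of the tree is equivalent to shortcutting the Euler tour of the doubled tree. The Christofides matching step is not shown.

```python
import math

def double_mst_tour(points):
    """Double MST sketch: Prim's MST, then a shortcut preorder walk of the tree."""
    n = len(points)
    d = lambda a, b: math.dist(points[a], points[b])
    in_tree = [False] * n
    best = [math.inf] * n
    parent = [0] * n
    best[0] = 0.0
    children = [[] for _ in range(n)]
    for _ in range(n):                      # Prim's algorithm, O(N^2)
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: best[v])
        in_tree[u] = True
        if u != 0:
            children[parent[u]].append(u)
        for v in range(n):
            if not in_tree[v] and d(u, v) < best[v]:
                best[v], parent[v] = d(u, v), u
    tour, stack = [], [0]                   # preorder walk = shortcut Euler tour
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    return tour
```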

35 More Lower Bounds: Double MST Algorithm. [Figure.] MST length = n + (n+1)(1-ε) + 2ε = 2n + 1 - (n-1)ε. Double MST tour length ≈ 2n + 2n(1-ε) + 2ε = 4n - 2(n-1)ε. Optimal tour length ≈ 2n + 2. So as ε → 0 and n → ∞ the ratio approaches 2, showing the factor-2 guarantee is tight.

36 More Lower Bounds: Christofides Algorithm. [Figure: N cities on the bottom row, N+1 on the top row.] Distance to the nearest city in the same row = 1; distance to the nearest city in the other row = 1 - ε'. MST: length = 2N(1 - ε'). OPT: length = 2N + 1 - 2ε'. Christofides: length = 2N(1 - ε') + N = 3N - 2Nε'.
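Making the point of this family explicit (my own one-line derivation, not on the slide): as N grows and ε' shrinks,

```latex
\[
\frac{\mathrm{Christofides}(I_N)}{\mathrm{OPT}(I_N)}
  = \frac{3N - 2N\varepsilon'}{2N + 1 - 2\varepsilon'}
  \;\longrightarrow\; \frac{3}{2},
\]
```

so these instances show that Christofides' worst-case factor of 3/2 cannot be improved.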

37 Subquadratic Algorithms for Euclidean Instances: Strip.
1. Let R be a minimum rectangle containing all the cities.
2. Partition R into floor(sqrt(N)/3) vertical strips.
3. Sort the cities within each strip by y-coordinate.
4. Starting with the bottom-most point in the leftmost strip, traverse the cities up one strip and down the next until all have been visited, and then return to the starting point.
Total running time: O(N log(N)).
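A sketch of Strip in this spirit (mine, not from the lecture; the strip count follows the formula on the slide, and ties at strip boundaries are resolved arbitrarily):

```python
import math

def strip_tour(points):
    """Strip heuristic sketch: O(N log N); serpentine walk over vertical strips."""
    n = len(points)
    xs = [p[0] for p in points]
    x_min, width = min(xs), max(max(xs) - min(xs), 1e-12)
    k = max(1, math.floor(math.sqrt(n) / 3))            # number of strips (per the slide)
    strips = [[] for _ in range(k)]
    for i, (x, y) in enumerate(points):
        strips[min(k - 1, int((x - x_min) / width * k))].append(i)
    tour = []
    for s, strip in enumerate(strips):
        strip.sort(key=lambda i: points[i][1], reverse=(s % 2 == 1))   # up, then down
        tour.extend(strip)
    return tour
```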

38 Lower Bounds for Strip. [Figure.] Assuming neighboring points are one unit apart, OPT = N, while Strip(I_N) > (sqrt(N)/3)(N/4) = Ω(sqrt(N)·OPT).

39 Subquadratic Algorithms for Euclidean Instances: Spacefilling Curve. Visit the cities in the order in which they occur along a spacefilling curve for a square that contains them all. Running time O(N log(N)). For details, see [Platzman & Bartholdi, “Spacefilling curves and the planar travelling salesman problem,” J. ACM 36 (1989), 719-737].
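Platzman & Bartholdi use a Sierpiński-type curve; as a simpler stand-in (my own sketch, not their algorithm), the following orders the cities by their position along a Hilbert curve over the bounding square, which has the same O(N log N) flavor.

```python
def hilbert_index(order, x, y):
    """Position of integer grid point (x, y), 0 <= x, y < 2**order, along a Hilbert curve."""
    d, s = 0, 2 ** (order - 1)
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        s //= 2
    return d

def spacefilling_tour(points, order=16):
    """Visit cities in the order they occur along the (Hilbert) spacefilling curve."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    side = max(max(xs) - min(xs), max(ys) - min(ys), 1e-12)
    scale = (2 ** order - 1) / side
    def curve_pos(i):
        gx = int((points[i][0] - min(xs)) * scale)
        gy = int((points[i][1] - min(ys)) * scale)
        return hilbert_index(order, gx, gy)
    return sorted(range(len(points)), key=curve_pos)
```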

40 More Geometric Tour Construction Heuristics.
Insertion variants for geometric instances:
- Cheapest Insertion into the convex hull (CHCI)
- “Greatest Angle” insertion into the convex hull (CHGA)
- Convex Hull, Cheapest Insertion with greatest angle (CCA)
Double Strip (best of both horizontal and vertical strip tours) (DST)
Karp’s Partitioning Algorithm (KP)
Litke’s Clustering Algorithm
Bentley’s Fast Recursive Partitioning Heuristic (FRP)

41 Performance “In Practice”: Data Sources.
- Johnson, Bentley, McGeoch, & Rothberg, “Near-Optimal Solutions to Very Large Traveling Salesman Problems,” unpublished (and as-yet-uncompleted) monograph (1994).
- Johnson & McGeoch, “The traveling salesman problem: A case study in local optimization,” chapter in Local Search in Combinatorial Optimization, Aarts & Lenstra (editors), Princeton University Press, Princeton, NJ, 2003, 215-310 [also available on DSJ’s website].
- Johnson & McGeoch, “Experimental analysis of heuristics for the STSP,” chapter in The Traveling Salesman Problem and its Variations, Gutin & Punnen (editors), Kluwer Academic Publishers, Dordrecht, 2002, 369-443 [also available on DSJ’s website].
- Website for the “8th DIMACS Implementation Challenge: The Traveling Salesman Problem” [http://dimacs.rutgers.edu/Challenges/TSP – only minor updates since 2002].

42 Performance “In Practice”, Testbed 1: Random Euclidean Instances. (Results appear to be reasonably well-correlated with those for our real-world instances.) [Figure: example instance, N = 10,000.]

43 Performance “In Practice”, Testbed 2: Random Clustered Instances. Choose N/100 centers uniformly, then generate 100 normally-distributed points around each. [Figures: example instances with N = 1,000, N = 3,162, and N = 10,000.]

44 Performance “In Practice”, Testbed 3: Instances from TSPLIB. [Figures: printed circuit boards, geography, laser logic.]

45 Performance “In Practice” Testbed 4: Random Symmetric Distance Matrices (Unlikely to obey Triangle Inequality.) Let’s start with Random Euclidean and just a few algorithms…

46

47 Nearest Neighbor

48 Greedy

49 Smart-Shortcut Christofides

50 Standard Shortcuts

51 Smart Shortcuts

52 Random Euclidean Performance for N = 1,000,000 (% excess over the Held-Karp (HK) bound):
Algorithm                         % Excess
Smart-Shortcut Christofides       9.8
Savings                           12.2
Farthest Insertion                13.5
Greedy                            14.2
Classic-Shortcut Christofides     14.5
Random Insertion                  15.2
Convex Hull, Cheapest Insertion   22.0
Cheapest Insertion                22.1
Nearest Neighbor                  23.3
Nearest Insertion                 27.0
Strip                             30.2
Nearest Addition                  32.6
Spacefilling Curve                35.1
Double MST w. Smart Shortcuts     39.9

53 Random Euclidean N = 10,000 Solution Quality Distributions

54 Smart-Shortcut Christofides

55 Savings

56 Farthest Insertion

57 Greedy

58 Savings versus Classic Christofides

59 Nearest Neighbor versus Nearest Insertion

60 Greedy versus Nearest Insertion

61 Savings versus Nearest Insertion

62 Conclusion: Many of the heuristics that are famous theoretically are totally dominated in practice. Question: Why care about these dominated heuristics?

63 Random Euclidean N = 1,000 Tour Quality versus Running Time Tradeoffs

