Slide 1: Techniques for Computing and Using Bounds for Combinatorial Optimization Problems
Sharlee Climer and Weixiong Zhang
Department of Computer Science and Engineering, Washington University in St. Louis
This research was funded in part by NDSEG and Olin Fellowships, NSF grants IIS-0196057 and ITR/EIA-0113618, and in part by DARPA Cooperative Agreement F30602-00-2-0531.

Slide 2: Complete slides will be available at: www.cse.wustl.edu/~sclimer or www.climer.us

Slide 3: Overview
- Introduction
- Formulation and notation
- Historical perspective
- Modifications to obtain bounds
- Two-step procedure
- Exploiting the use of bounds
- Future directions

Slide 4: What is the "use of bounds"?

Slide 5 (diagram): upper bound, optimal solution, lower bound

Slide 6 (diagram): The use of bounds: upper bound, optimal solution, lower bound

Slide 7: Use of bounds
- Bounds have been extensively studied in both computer science and operations research (OR)
- Pruning rules in branch-and-bound search
- Without the use of bounds, many important problems would be unsolvable
- Previous efforts to systematically discover effective relaxations to be used for bounds
- Recent work to systematically discover other modifications for use as bounds

Slide 8: Objectives of tutorial
- Survey the previous uses of bounds
- Reorganize existing work in a systematic way
- Point out potential directions for future work

Slide 9: Formulation and notation
- Techniques presented can be applied to a variety of optimization problems
- To demonstrate the use of bounds, we'll use integer linear programs (IPs) as the basic problem structure
- Without loss of generality, we consider only minimization problems
  - Maximization problems can be cast as minimization problems

Slide 10: Integer Linear Programs
Minimize Z = Σ c_i x_i (objective function)
Subject to: a set of linear constraints
x_i integer
If the "x_i integer" constraints are omitted, we have a linear program (LP).

Slide 11: Integer Linear Programs
- Large number of commercial applications
  - Scheduling
  - Routing
  - Planning
- STRIPS planning problems converted to IPs [Kautz and Walser, AAAI-99; Vossen et al., IJCAI-99]
  - Simplifies adding costs and resources
  - Optimality conditions

Slide 12: Integer Linear Programs
Used to model:
- Traveling Salesman Problem
- Constraint Satisfaction Problem
- Robotic motion problems
- Clustering
- Multiple sequence alignment
- Haplotype inferencing
- VLSI circuit design
- Computer disk read head scheduling
- Derivation of physical structures of programs
- Delay-Tolerant Network routing
- Cellular radio network base station locations
- Minimum-energy multicast problem in wireless ad hoc networks

Slide 13: Linear program example
Minimize Z = -11x + 4y
Subject to:
3x + 8y <= 40
11x - 8y <= 16
x, y >= 0
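
A minimal sketch of this example in Python, assuming SciPy is available (an illustration added here, not part of the slide set):

```python
# Solve the slide-13 LP with SciPy's linprog.
from scipy.optimize import linprog

c = [-11, 4]                       # minimize Z = -11x + 4y
A_ub = [[3, 8],                    # 3x + 8y <= 40
        [11, -8]]                  # 11x - 8y <= 16
b_ub = [40, 16]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)              # expect x = 4, y = 3.5, Z = -30 (slide 15)
```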

Slide 14: Linear program example
Minimize Z = -11x + 4y
Rewriting: y = 11/4 x + Z/4
A family of parallel lines with slope 11/4 and unknown y-intercept.

Slide 15: Linear program example
Optimal solution: x = 4, y = 7/2, Z = -30
The optimal solution is always on a vertex.

Slide 16: Integer linear program
Minimize Z = -11x + 4y
Subject to:
3x + 8y <= 40
11x - 8y <= 16
x, y >= 0
x, y integer
Optimal solution: x = 3, y = 3, Z = -21
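
The bound relationship between the IP and its LP relaxation can be reproduced by solving the same model twice, toggling the integrality flags; this sketch assumes SciPy 1.9+ (for milp) and only illustrates the slides' numbers.

```python
# The slide-16 model solved as an IP and as its LP relaxation.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

c = np.array([-11, 4])                          # minimize Z = -11x + 4y
cons = LinearConstraint([[3, 8], [11, -8]], -np.inf, [40, 16])
box = Bounds(0, np.inf)                         # x, y >= 0

ip = milp(c=c, constraints=cons, bounds=box, integrality=np.ones(2))
lp = milp(c=c, constraints=cons, bounds=box, integrality=np.zeros(2))

print("IP:", ip.x, ip.fun)                      # expect x = 3, y = 3, Z = -21
print("LP relaxation:", lp.x, lp.fun)           # expect Z = -30, a lower bound on the IP optimum
```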

Slide 17: Dual problem
- Every IP has a dual problem that is also an IP
- The original IP is referred to as the primal
- If the primal is a minimization IP, then the dual is a maximization IP
- Same coefficients, but rearranged
- Inequalities in constraints are reversed

Slide 18: Dual linear program
Primal: Minimize -11x + 4y
Subject to:
-3x - 8y >= -40
-11x + 8y >= -16
x, y >= 0
Dual: Maximize -40v - 16w
Subject to:
-3v - 11w <= -11
-8v + 8w <= 4
v, w >= 0

Slide 19: Dual linear program
Maximize D = -40v - 16w
Subject to:
-3v - 11w <= -11
-8v + 8w <= 4
v, w >= 0

Slide 20: Dual linear program
Optimal solution: v = 11/28, w = 25/28, D = -30

Slide 21: Dual integer program
Maximize D = -40v - 16w
Subject to:
-3v - 11w <= -11
-8v + 8w <= 4
v, w >= 0
v, w integer
Optimal solution: v = 1, w = 1, D = -56

Slide 22: The Traveling Salesman Problem
- The Traveling Salesman Problem (TSP) is the problem of finding a minimum-cost complete tour of a set of cities
- STSP: the cost from city i to city j is equal to the cost from city j to city i
- ATSP: costs not necessarily equal

Slide 23: The Traveling Salesman Problem
Minimize Z = Σ c_ij x_ij
s.t.: Σ_i x_ij = 1 for j = 1,...,n
Σ_j x_ij = 1 for i = 1,...,n
Σ_(i,j in W) x_ij <= |W| - 1, for all proper non-empty subsets W of V
x_ij ∈ {0,1}

Slide 24 (figure): Reprinted by permission, G. Dantzig, R. Fulkerson, and S. Johnson, Solution of a Large-Scale Traveling-Salesman Problem, Journal of the Operations Research Society of America, volume 2, 1954. Copyright 1954, the Institute for Operations Research and the Management Sciences, 7240 Parkway Drive, Suite 310, Hanover, Maryland 21076.

Slide 25: Omit subtour elimination constraints
Minimize Z = Σ c_ij x_ij
s.t.: Σ_i x_ij = 1 for j = 1,...,n
Σ_j x_ij = 1 for i = 1,...,n
(subtour elimination constraints Σ x_ij <= |W| - 1 omitted)
x_ij ∈ {0,1}
- This is the assignment problem
- A lower bound for the ATSP
- Can be solved in polynomial time
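
The assignment-problem relaxation can be computed with the Hungarian method; a small sketch on a made-up cost matrix, assuming SciPy is available:

```python
# Assignment problem (AP) lower bound for a small ATSP instance.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n = 6
BIG = 1e6                                  # effectively forbids assigning a city to itself
cost = rng.integers(1, 20, size=(n, n)).astype(float)
np.fill_diagonal(cost, BIG)

rows, cols = linear_sum_assignment(cost)   # Hungarian method, polynomial time
ap_bound = cost[rows, cols].sum()
print("AP lower bound on the optimal ATSP tour cost:", ap_bound)
# The AP solution may contain subtours, so it is a relaxation:
# every complete tour costs at least this much.
```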

Slide 26 (figure): Omit subtour elimination constraints

Slide 27: Relax integrality constraints
Minimize Z = Σ c_ij x_ij
s.t.: Σ_i x_ij = 1 for j = 1,...,n
Σ_j x_ij = 1 for i = 1,...,n
Σ_(i,j in W) x_ij <= |W| - 1, for all proper non-empty subsets W of V
0 <= x_ij <= 1 (replacing x_ij ∈ {0,1})
- Linear program (LP) relaxation
- Held-Karp lower bound
- Can be solved in polynomial time

Slide 28 (figure): Relax integrality constraints

Slide 29: Historical perspective
- Branch-and-bound
- Alpha-beta pruning
- Admissible heuristics
  - Abstractions
  - Pattern databases
- Cutting planes
- Gomory cuts
- Branch-and-cut

Slide 30: Branch-and-bound
- In 1958, several papers appeared using branch-and-bound (BnB) [Bock, Op. Res. 1958; Croes, Op. Res. 1958; Eastman, PhD thesis 1958; Rossman and Twery, Op. Res. 1958]
- Three of these papers introduced algorithms for the TSP
- "Branch-and-bound" coined by [Little et al., Op. Res. 1963] (also a TSP algorithm)
- Example: Carpaneto, Dell'Amico, and Toth's (CDT) algorithm for the ATSP [ACM Trans. on Math. Software 1995]

Slide 31 (figure): CDT algorithm

Slide 32: Branch-and-bound
- Croes' TSP algorithm was perhaps the first BnB search published [Croes, Op. Res. 1958]
- Used bounds in two different ways
- Found an approximate solution
- Used this solution as an upper bound for eliminating sets of arcs that cannot simultaneously appear
- BnB tree over remaining arc combinations

Slide 33 (figure): Branch-and-bound. Reprinted by permission, G. A. Croes, A Method for Solving Traveling-Salesman Problems, Operations Research, volume 6, 1958. Copyright 1958, the Institute for Operations Research and the Management Sciences, 7240 Parkway Drive, Suite 310, Hanover, Maryland 21076.

Slide 34: Branch-and-bound
- Croes solved the 49-city STSP that had previously been solved by Dantzig, Fulkerson, and Johnson [Op. Res. 1954]
- Improvements:
  - Mechanized solver
  - Provides "anytime" solution
  - Faster than previous methods
- Croes solved by hand
  - 70 hours to solve
  - Found the optimal solution after 10 hours

Slide 35: Alpha-beta pruning
- Minimax search is a specialized BnB search for game playing
- The objective is to determine the best move
- Alpha-beta pruning is used to reduce the number of nodes that need to be evaluated

Slide 36: Alpha-beta pruning
- Devised by John McCarthy in 1956, but not published
- Used in the NSS chess program in 1958 [Newell et al., IBM Journal of R&D 1958]
- Pearl proved alpha-beta to be an asymptotically optimal game-searching algorithm [Pearl, Comm. of the ACM 1982]
- A modified strategy was used by Deep Blue [Hsu, Behind Deep Blue 2002]
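
A compact illustration of the pruning rule, with alpha and beta acting as the lower and upper bounds on the value a player can force; the game tree here is a made-up example.

```python
# Minimax with alpha-beta pruning over a small hand-made game tree.
def alphabeta(node, alpha, beta, maximizing):
    if isinstance(node, (int, float)):            # leaf: static evaluation
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:                     # beta cutoff: the minimizer avoids this branch
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if beta <= alpha:                     # alpha cutoff
                break
        return value

# Nested lists are internal nodes; numbers are leaf evaluations.
tree = [[3, 5], [6, [7, 4, 5]], [8]]
print(alphabeta(tree, float("-inf"), float("inf"), True))   # prints 8; leaves 4 and 5 are pruned
```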

Slide 37 (figure): Minimax search tree

Slides 38-43 (figures): Alpha-beta pruning, step by step

Slide 44: Admissible heuristics
- Evaluation functions used by minimax are examples of heuristics
- Trade-off of accuracy vs. time to compute
- An admissible heuristic provides a lower bound
- Admissibility is needed to guarantee A* search performance
- Admissibility allows pruning

Slide 45: Admissible heuristics
- Abstractions used for admissible heuristics can be generated by relaxing constraints
- In 1970, Held and Karp used relaxation for the STSP [Op. Res. 1970]
  - 1-tree: a spanning tree with an added edge
  - Node 1 has degree two and is part of a single cycle
  - Find the MST without node 1 and add the two smallest edges incident to node 1
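
The 1-tree bound is easy to compute from a symmetric cost matrix: a minimum spanning tree over all cities except node 1, plus the two cheapest edges at node 1. A sketch on a made-up Euclidean instance, assuming SciPy:

```python
# Held-Karp style 1-tree lower bound for a small STSP instance.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(1)
pts = rng.random((7, 2))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)   # symmetric costs

# MST over all cities except city 0 (playing the role of "node 1").
mst_cost = minimum_spanning_tree(dist[1:, 1:]).sum()

# Add the two cheapest edges incident to city 0; node 1 then has degree two.
two_cheapest = np.sort(dist[0, 1:])[:2].sum()

one_tree_bound = mst_cost + two_cheapest
print("1-tree lower bound on the optimal tour length:", one_tree_bound)
```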

Slide 46 (figure): 1-tree

Slide 47: Systematic generation
- In 1982, Pearl suggested automatically deriving admissible heuristics by systematically deleting constraints [Pearl, reprinted in AI Magazine 1983]
- Absolver II [Prieditis, Machine Learning 1993]
  - An implementation to automatically derive admissible heuristics
  - Uses abstracting transformations: reduces the cost function and/or expands goal states
  - Used to find the first useful heuristic for Rubik's Cube
* Introduced earlier by Somalvico et al.

Slide 48: Pearl's example
Used the STRIPS formulation of the 8-Puzzle problem (figure: start state and goal state grids)

Slide 49: Pearl's example
- 3 primitive predicates
  - ON(x,y): tile x is on cell y
  - CLEAR(y): cell y is clear of tiles
  - ADJ(y,z): cell y is adjacent to cell z
- Each state is defined by a list of 8 ON predicates, one CLEAR predicate, and a fixed set of ADJ predicates describing the adjacencies of the cells on the board

Slide 50: Pearl's example
Operator to move tile x from cell y to cell z:
MOVE(x,y,z)
Precondition list: ON(x,y), CLEAR(z), ADJ(y,z)
Add list: ON(x,z), CLEAR(y)
Delete list: ON(x,y), CLEAR(z)
Find a sequence of MOVE operations to transform from the initial state to a goal state.

Slide 51: Pearl's example
- Removing CLEAR(z) and ADJ(y,z) from the precondition list
  - Permits a tile to be moved from its current position to any other position in one move
  - Equal to the number of tiles that are misplaced in the initial state
- Removing CLEAR(z) from the precondition list
  - Equals the sum of the Manhattan distances
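
These two relaxations are exactly the familiar misplaced-tiles and Manhattan-distance heuristics; a short sketch computing both for an 8-puzzle state (the state encoding, with 0 for the blank, is an assumption made for this example):

```python
# The two admissible heuristics obtained by deleting MOVE preconditions.
# States are length-9 tuples read row by row; 0 denotes the blank cell.
GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)

def misplaced_tiles(state, goal=GOAL):
    """Drop CLEAR(z) and ADJ(y,z): any tile can jump to its goal cell in one move."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def manhattan(state, goal=GOAL):
    """Drop CLEAR(z) only: each tile still slides one adjacent cell per move."""
    total = 0
    for idx, tile in enumerate(state):
        if tile == 0:
            continue
        g = goal.index(tile)
        total += abs(idx // 3 - g // 3) + abs(idx % 3 - g % 3)
    return total

start = (3, 5, 1, 7, 0, 2, 6, 4, 8)               # an arbitrary start state for illustration
print(misplaced_tiles(start), manhattan(start))   # both are lower bounds on the true solution length
```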

Slide 52: Pearl's example
- Removing ADJ(y,z) from the precondition list
  - Swap the blank with any other tile
  - Can be solved with a greedy algorithm: if the current empty cell y is to be covered with tile x, move x to cell y; else move any misplaced tile into y
  - Less intuitive than the other two heuristics
  - Discovered 13 years after A* was tested using the other two
- Constraints can be divided
  - Tighter lower bound if only part of a constraint is removed

Slide 53: Pattern databases
- A look-up table containing precomputed solutions to subproblems
- The cost of solving the entire problem is at least as large as the cost of solving a subproblem
- Large memory requirements
- Used to find the first optimal solutions to random Rubik's Cube instances [Korf, Workshop on Computer Games, IJCAI-97]
- Introduced by Culberson and Schaeffer and demonstrated on the 15-puzzle [Lecture Notes in AI, 1996]
  - Precomputed the minimum cost for placing 7 tiles plus the "empty" tile in the correct final position

Slide 54 (figure): Pattern databases. Fringe target pattern and corner target pattern (15-puzzle tile layouts).

Slide 55: Cutting planes
- OR research moved in a different direction
- Cutting planes: added constraints that tighten a relaxation
- Introduced by Dantzig, Fulkerson, and Johnson [Op. Res. 1954]
  - 49-city STSP
  - Iteratively solve a relaxation
  - Integrality and subtour elimination constraints (SECs) removed
  - Add cutting planes to remove relaxed solutions
  - Added 23 SECs and 2 subjectively improvised cuts to remove non-integral solutions

Slide 56: Generate cuts
General procedure to remove non-integral solutions from binary IPs, as outlined by Hillier and Lieberman [2001]:
1. Consider a <= constraint with nonnegative coefficients
2. Find a group of variables such that
   (a) the constraint is violated if all of them equal 1 (while the others equal 0)
   (b) the constraint is satisfied if any one of them is changed to 0
3. Let k equal the number of variables in the group; the cut is: sum of the variables in the group <= k - 1

Slide 57: Example
10x + 16y + 13z <= 29, where x, y, z are binary variables and integrality is relaxed
The constraint is violated if x, y, and z are all equal to 1
The constraint is satisfied if any one of them is equal to 0
New cut: x + y + z <= 2
The cut removes solutions from the relaxed problem but not from the original problem (e.g. x = 0.5, y = 0.8, z = 0.8)
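
This procedure is easy to automate for a single <= constraint over binary variables by enumerating minimal groups whose coefficients exceed the right-hand side; a brute-force sketch (fine for small constraints, exponential in general):

```python
# Generate cover cuts for one "sum(a_i * x_i) <= b" constraint over binary variables.
from itertools import combinations

def cover_cuts(coeffs, b):
    """Yield index groups G that violate the constraint if all members are 1,
    but satisfy it if any single member is set to 0.
    Each group G gives the cut: sum(x_i for i in G) <= len(G) - 1."""
    n = len(coeffs)
    for size in range(1, n + 1):
        for group in combinations(range(n), size):
            total = sum(coeffs[i] for i in group)
            if total > b and all(total - coeffs[i] <= b for i in group):
                yield group

# Slide 57's constraint: 10x + 16y + 13z <= 29.
for group in cover_cuts([10, 16, 13], 29):
    lhs = " + ".join(f"x{i + 1}" for i in group)
    print(f"cut: {lhs} <= {len(group) - 1}")      # expect: x1 + x2 + x3 <= 2
```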

Slide 58: Gomory cuts
- In 1958, Gomory proposed an iterative search strategy for IPs [Bulletin of the Am. Math. Soc., 1958]
  - Solve with integrality relaxed
  - Apply a cut to remove the relaxed solution
  - Proved termination
- In 1966, Martin implemented the first TSP code using cutting planes [Op. Res. 1966]
  - Relaxed integrality and subtour elimination constraints (SECs)
- Iterative procedures, no branching

Slide 59: Branch-and-cut
- Branch-and-cut (BnC) is a BnB search with cutting planes added to the relaxations at nodes
- Hong implemented BnC code in 1972 for the TSP [PhD thesis, Johns Hopkins, 1972]
- In 1987, Padberg and Rinaldi coined "branch-and-cut" and used it to solve a 532-city STSP [Op. Res. Letters 1987]

Slide 60: Concorde
- In 1994, first implementation of Concorde [Applegate et al., Lecture Notes in Comp. Sci. 2001]
- Branch-and-cut code for the STSP
- Relaxed integrality and SECs
- Also used custom cuts tailored for the TSP
- Apply cuts at a node until diminishing returns
- Branch by setting the value of an arc
- In 2004, Concorde was used to solve a 24,978-city STSP [Applegate et al., www.tsp.gatech.edu]
  - 14,827,429 cutting planes in addition to SECs

Slide 61: Modifications to obtain bounds
- Many possibilities for obtaining bounds have been previously overlooked
- Examine every aspect of the problem description
- Modifications of IPs to produce bounds
  - Relaxing or tightening constraints
  - Modifying the objective function
  - Adding or deleting decision variables
  - Modifications to the dual problem

Slide 62: Example
Minimize Z = y - 4/5 x
Subject to:
x >= 0
y <= 3
y + 13/6 x <= 9
y - 5/13 x >= 1/14
y + 3/5 x >= 6/5
x, y integers

Slide 63 (figure): Solution space
x >= 0; y <= 3; y + 13/6 x <= 9; y - 5/13 x >= 1/14; y + 3/5 x >= 6/5; x, y integers

Slide 64: Objective function
Minimize Z = y - 4/5 x
x = 0, y = 3, Z = 3

Slide 65: Optimal solution
Minimize Z = y - 4/5 x
x = 2, y = 1, Z = -0.6

Slide 66: Relaxing constraints
Minimize Z = y - 4/5 x subject to: x >= 0; y <= 3; y + 13/6 x <= 9; y - 5/13 x >= 1/14; y + 3/5 x >= 6/5; x, y integers
x = 3.5, y = 1.4, Z = -1.4
Lower bound

Slide 67: Relaxing constraints
Minimize Z = y - 4/5 x subject to: x >= 0; y <= 3; y + 13/6 x <= 9; y - 5/13 x >= 1/14; y + 3/5 x >= 6/5; x, y integers
x = 3, y = 1, Z = -1.4
Lower bound

Slide 68: Tightening constraints
Minimize Z = y - 4/5 x subject to: x >= 0; y <= 3; y + 13/6 x <= 9; y - 5/13 x >= 1/14; y + 3/5 x >= 6/5; x, y integers; y >= 2
x = 3, y = 2, Z = -0.4
Upper bound

Slide 69: Tightening constraints
Minimize Z = y - 4/5 x subject to: x >= 0; y <= 3; y + 13/6 x <= 9; y - 5/13 x >= 1/14; y + 3/5 x >= 6/5; x, y integers
x = 2, y = 2, Z = 0.4
Upper bound

Slide 70: Tightening constraints
- Common use: adding constraints that set the values of variables (branching rules)
- Reduces the number of feasible solutions
- Example of an "easier" problem: add constraints setting most variables to zero, and solve the sparse problem

Slide 71: Relaxing optimality
Minimize Z = y - 4/5 x subject to: x >= 0; y <= 3; y + 13/6 x <= 9; y - 5/13 x >= 1/14; y + 3/5 x >= 6/5; x, y integers
x = 3, y = 2, Z = -0.4
Upper bound

Slide 72: Modifying objective function coefficients
Minimize Z = y - 4/5 x subject to: x >= 0; y <= 3; y + 13/6 x <= 9; y - 5/13 x >= 1/14; y + 3/5 x >= 6/5; x, y integers
x = 1, y = 1, Z = 1
Upper bound

Slide 73: Modifying objective function coefficients
Minimize Z = y - 4/5 x subject to: x >= 0; y <= 3; y + 13/6 x <= 9; y - 5/13 x >= 1/14; y + 3/5 x >= 6/5; x, y integers
x = 1, y = 1, Z = 0.2
Upper bound

Slide 74: Modifying objective function coefficients
- Reducing the range of objective function coefficients can yield an "easier" problem for the TSP
- Phase transitions: the difficulty of solving a problem changes dramatically when a parameter is increased beyond a distinct value
- For the TSP, the parameter is the range of c_ij values

Slide 75: Modifying objective function coefficients
- Zhang reduced the range by eliminating least significant bits [CP-AI-OR-02]
- After finding a solution, used the original values to compute the cost of the tour, yielding an upper bound
- Could derive a lower bound by rounding values down and keeping these values when computing the cost of the tour found

Slide 76: Modifying objective function coefficients
- Frieze's polynomial-time STSP algorithm [SIAM Computing 1987]
  - Finds the exact solution with a probability that tends to 1 as the number of cities n tends to infinity
  - Random cost values drawn from zero to B(n) - 1, where B(n) = o(n / log log n)

Slide 77: Modifying objective function coefficients
- Lagrangian relaxation
  - Adds penalties to the objective function for violation of deleted constraints [Fisher, Management Science 1981]
  - Deletion of constraints yields a lower bound
  - Addition of penalties tightens the bound, but it is still a lower bound
  - Can be used as a lower bound for pruning or as a heuristic to guide the search
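
To make the penalty idea concrete, here is a made-up minimization IP with a single complicating constraint dualized; the multiplier is adjusted by a few subgradient steps, and every multiplier value yields a valid lower bound. The instance and step sizes are assumptions for illustration only.

```python
# Lagrangian relaxation of one ">=" constraint in a tiny binary minimization problem:
#   minimize 4*x1 + 3*x2 + 5*x3   s.t.   2*x1 + 3*x2 + 4*x3 >= 5,   x binary  (optimum = 7).
# Dualizing the constraint with multiplier lam >= 0 gives
#   L(lam) = 5*lam + sum_i min(0, c_i - lam*a_i),
# a lower bound on the IP optimum for every lam >= 0.
c = [4, 3, 5]
a = [2, 3, 4]
b = 5

def lagrangian_bound(lam):
    x = [1 if ci - lam * ai < 0 else 0 for ci, ai in zip(c, a)]   # the inner minimization is separable
    value = sum(ci * xi for ci, xi in zip(c, x)) + lam * (b - sum(ai * xi for ai, xi in zip(a, x)))
    return value, x

lam, best = 0.0, float("-inf")
for k in range(50):                           # simple subgradient ascent on the multiplier
    value, x = lagrangian_bound(lam)
    best = max(best, value)
    subgrad = b - sum(ai * xi for ai, xi in zip(a, x))
    lam = max(0.0, lam + (1.0 / (k + 1)) * subgrad)

print("best Lagrangian lower bound:", best)   # approaches 5.5, below the IP optimum of 7
```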

Slide 78: Adding decision variables
- Assuming zero is in the allowable range for the new variable
- Addition of a variable creates a lower bound
- Creates additional feasible solutions; doesn't exclude any feasible solutions of the original problem
- Example 1: add edges in graph problems
  - May create shortcuts

Slide 79: Adding decision variables
- Example 2: tilted drilling machine problems, a class of ATSPs
- Convert the ATSP to an STSP with the 2-node transformation
- Solved 800 100-city instances using Concorde and ATSP solvers
- The number of decision variables more than quadrupled*, yet the modified problems were generally easier to solve
* The number of variables without zero or infinite costs is the same as in the original problem

Slide 80: Deleting decision variables
- Assuming zero is in the allowable range for the deleted variable
- Deletion of a variable creates an upper bound
- Removes feasible solutions from the original problem
- Example for the TSP: remove all arcs with costs greater than a threshold
  - If a tour exists, it is an upper bound
- Same result as adding a constraint setting the value of the variable to zero

Slide 81: Using the dual problem
- Feasible solutions to the dual IP are lower bounds for the primal IP
- Modifications to the dual that yield lower bounds for the primal
  - Tightening constraints
  - Relaxing optimality
  - Decreasing objective function coefficients
  - Changing objective function coefficients, and substituting the original coefficients after the dual is solved
  - Deleting decision variables (assuming zero is in the allowable range)

Slide 82: Summary of modifications
Modification: source of upper bound / source of lower bound
- Relax constraints: - / Primal
- Tighten constraints: Primal / Dual
- Relax optimality: Primal / Dual
- Increase c_i: Primal / -
- Decrease c_i: - / Primal & dual
- Change c_i but use original values after solved: Primal / Dual
- Delete variables*: Primal / Dual
- Add variables*: - / Primal
* Assuming zero is in the allowable range of the variables

Slide 83: "Simplifying" modifications
- Decomposability
  - Manhattan distance for the 8-puzzle example
  - All subproblems can be solved independently
- Partial ordering
  - 8-puzzle: delete the ADJ requirement
  - Placing the empty tile has higher order and is done last
  - Greedy algorithm works
- Create special structures
  - OR development of algorithms for tackling special structures: mutually exclusive alternatives constraints, contingent decisions constraints, set-covering constraints
- Trial-and-error

Slide 84: Set covering problems
- Given a set S = {1, 2, 3, ..., m} and a class C of subsets of S where each subset has a cost associated with it
- Cover all members of S at minimum cost using members of C
- Example: let S = {1, 2, 3, 4, 5} and C = {{1, 2}, {1, 3, 5}, {2, 4, 5}, {3}, {1}, {4, 5}} with costs of 2, 1, 1, 2, 3, 1 respectively

Slide 85: Set covering problems
Let x_i = 1 if the i-th member of C is in the cover, otherwise x_i = 0
Minimize 2x_1 + x_2 + x_3 + 2x_4 + 3x_5 + x_6
Subject to:
x_1 + x_2 + x_5 >= 1
x_1 + x_3 >= 1
x_2 + x_4 >= 1
x_3 + x_6 >= 1
x_2 + x_3 + x_6 >= 1
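
This instance can be handed straight to an IP solver; a sketch of slide 85's model using SciPy's milp (SciPy 1.9+ assumed):

```python
# Slide 85's set covering IP.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

costs = np.array([2, 1, 1, 2, 3, 1])
# Rows = elements 1..5 of S, columns = the six subsets in C (1 means "subset covers element").
A = np.array([[1, 1, 0, 0, 1, 0],    # element 1 is in {1,2}, {1,3,5}, {1}
              [1, 0, 1, 0, 0, 0],    # element 2 is in {1,2}, {2,4,5}
              [0, 1, 0, 1, 0, 0],    # element 3 is in {1,3,5}, {3}
              [0, 0, 1, 0, 0, 1],    # element 4 is in {2,4,5}, {4,5}
              [0, 1, 1, 0, 0, 1]])   # element 5 is in {1,3,5}, {2,4,5}, {4,5}

res = milp(c=costs,
           constraints=LinearConstraint(A, lb=1, ub=np.inf),   # cover every element at least once
           integrality=np.ones(6),
           bounds=Bounds(0, 1))
print(res.x, res.fun)                # expect x_2 = x_3 = 1 with total cost 2
```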

Slide 86: Set covering problems
- Set covering properties
  - Minimization with all constraints '>='
  - All RHS coefficients are 1
  - All other matrix coefficients are 0 or 1
- Weighted set covering
  - RHS >= 1
- Generalized set covering
  - RHS >= 1
  - Other matrix coefficients can be -1, 0, or 1

Slide 87: Set covering problems
- Comparatively easy to solve
- The optimal solution must be a vertex solution of the corresponding LP
- However, this vertex solution is not necessarily the optimal solution of the LP

Slide 88: Set packing problems
- Given a set S = {1, 2, 3, ..., m} and a class C of subsets of S where each subset has a value associated with it
- Pack as many members of C as possible into S yielding maximum total value, without overlaps

Slide 89: Set packing problems
Let x_i = 1 if the i-th member of C is in the pack, otherwise x_i = 0
Maximize 2x_1 + x_2 + x_3 + 2x_4 + 3x_5 + x_6
Subject to:
x_1 + x_2 + x_5 <= 1
x_1 + x_3 <= 1
x_2 + x_4 <= 1
x_3 + x_6 <= 1
x_2 + x_3 + x_6 <= 1

Slide 90: Set packing problems
- Set packing properties
  - Maximization with all constraints '<='
  - All RHS coefficients are 1
  - All other matrix coefficients are 0 or 1
- Weighted set packing
  - RHS >= 1
- Generalized set packing
  - RHS >= 1
  - Other matrix coefficients can be -1, 0, or 1

Slide 91: Set partitioning problems
- Given a set S = {1, 2, 3, ..., m} and a class C of subsets of S where each subset has a value associated with it
- Pack as many members of C as possible into S without overlaps and cover all members of S
- Maximization or minimization

Slide 92: Set partitioning problems
Let x_i = 1 if the i-th member of C is in the pack, otherwise x_i = 0
Max or min 2x_1 + x_2 + x_3 + 2x_4 + 3x_5 + x_6
Subject to:
x_1 + x_2 + x_5 = 1
x_1 + x_3 = 1
x_2 + x_4 = 1
x_3 + x_6 = 1
x_2 + x_3 + x_6 = 1

Slide 93: Set covering, packing, and partitioning problems
- Set partitioning problems are equivalent to set packing problems [Williams, Model Building in Mathematical Programming, 1985]
- The optimal solution for either problem is a vertex solution of its corresponding LP
- These problems are generally even easier than set covering problems
  - The corresponding LP is more constrained due to the '=' constraints
- Problems can be modified to fully or partially resemble any of these problem types

Slide 94: Total unimodularity

Slide 95 (figure): Total unimodularity. Convex hull of feasible integer points

Slide 96: Total unimodularity
- Deriving the convex hull is generally much more difficult than solving the problem
- Sometimes the IP model is the convex hull
  - The problem is totally unimodular
  - Assignment problem
  - Recognize and solve as an LP instead of an IP
- Sometimes problems can be easily reformulated to yield the convex hull
- Sometimes it is possible to reformulate to get closer to the convex hull

Slide 97: Total unimodularity
- Let A = the matrix consisting of the coefficients of the decision variables in the constraints (not including the RHS)
- Reformulate the problem so that it is a maximization problem, the values in A are all non-negative, and the constraints are equalities (add slack variables)
- A is totally unimodular if every square submatrix of A has a determinant equal to -1, 0, or 1 [Garfinkel and Nemhauser, Integer Programming, 1972]
- However, evaluating every determinant is usually too difficult
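
For very small matrices the determinant condition can simply be checked by brute force, which is enough to see the definition in action; the sketch below enumerates every square submatrix (exponential, so only usable on toy matrices):

```python
# Brute-force total unimodularity test (only practical for tiny matrices).
from itertools import combinations
import numpy as np

def is_totally_unimodular(A):
    A = np.asarray(A)
    m, n = A.shape
    for k in range(1, min(m, n) + 1):
        for rows in combinations(range(m), k):
            for cols in combinations(range(n), k):
                det = int(round(np.linalg.det(A[np.ix_(rows, cols)])))
                if det not in (-1, 0, 1):
                    return False
    return True

# Constraint matrix of a 2x2 assignment problem (a classic totally unimodular matrix).
A = np.array([[1, 1, 0, 0],      # row constraints
              [0, 0, 1, 1],
              [1, 0, 1, 0],      # column constraints
              [0, 1, 0, 1]])
print(is_totally_unimodular(A))  # expect True
```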

Slide 98: Total unimodularity
- A sufficient but not necessary property P
- Property P:
  - All values in A are -1, 0, or 1
  - No more than 2 non-zero elements appear in each column
  - Rows can be partitioned into 2 subsets such that:
    - If a column contains 2 non-zero values with the same sign, those rows are in different subsets
    - If a column contains 2 non-zero values with different signs, those rows are in the same subset

Slide 99: Total unimodularity
- Easy to identify when property P holds
- Sometimes possible to reformulate (or modify) a problem so that property P holds for the entire problem or for part of the problem

Slide 100: Total unimodularity example
Common constraint: if any of a set of boolean x_i is equal to 1, then x = 1
Equivalent constraint: x_1 + x_2 + ... + x_n - nx <= 0
Reformulation into n constraints:
x_1 - x <= 0
...
x_n - x <= 0

Slide 101: Total unimodularity example
- The dual of the reformulation has property P
- The example also demonstrates another way to tighten a problem
  - The IP is equivalent with or without the reformulation
  - The sum of the reformulated constraints equals the original constraint
  - Although the IP is equivalent before and after reformulation, the corresponding LPs are very different
  - Adding together a set of constraints in an LP generally enlarges the feasible region
  - Fractional solutions to the LP of the original problem are ruled out in the LP of the reformulated problem
- Modifications can be made to resemble total unimodularity or to replace a single constraint with a set of constraints

Slide 102: Limit crossing
- A 2-step procedure for systematically exploring the use of bounds
- Has been implicitly used in a number of algorithms and search strategies
- To our knowledge, hasn't been generally formalized
  - A narrower formalization appears in [Climer and Zhang, AAAI-02]
- Broadens the focus beyond branch-and-bound

Slide 103: Limit crossing
2 steps:
(1) Find a simple upper or lower bound
(2) Combine upper-bounding and lower-bounding modifications and solve
- If the solution of the doubly-modified problem exceeds the simple upper bound, the upper-bounding modification in step (2) is invalid
- If the solution of the doubly-modified problem is less than the simple lower bound, the lower-bounding modification in step (2) is invalid

Slide 104 (figure): Branch-and-bound search. Incumbent solution

Slide 105: Limit crossing
- Find a simple upper or lower bound that is tight
- Systematically apply modifications to produce doubly-modified problems
  - Either modification can be difficult to solve
  - Only the combination of the two modifications needs to be relatively easy
- May produce novel search strategies

Slide 106: Limit crossing search strategies
- Cut-and-solve [Climer and Zhang, AIJ, to appear]
  - An iterative search strategy
  - Useful for combinatorial optimization problems
- Backbone and fat identifier [Climer and Zhang, AAAI-02]
  - Used to identify characteristic variables

Slide 107: Cut-and-solve
- An iterative search strategy called cut-and-solve
- For each iteration:
  - Step 1: A chunk of the solution space is cut away and solved, providing incumbent solutions
  - Step 2: A relaxed solution is found for the remaining solution space
- Iterate until the relaxed solution is greater than or equal to the incumbent

Slide 108: Example
x >= 0; y <= 3; y + 13/6 x <= 9; y - 5/13 x >= 1/14; y + 3/5 x >= 6/5; x, y integers

Slide 109: Optimal solution
Minimize Z = y - 4/5 x
x = 2, y = 1, Z = -0.6

Slide 110: Iteration 1, first step
Cut away a chunk of the solution space, y - 17/3 x <= -14, and solve this sparse problem

Slide 111: Iteration 1, first step
Incumbent solution is -0.4: x = 3, y = 2, Z = -0.4

Slide 112: Iteration 1, second step
Add the new constraint y - 17/3 x >= -14 to cut off the chunk already solved
Relax integrality and solve

Slide 113: Iteration 1, second step
x = 2.6, y = 1.0, Z = -1.1
The incumbent solution is -0.4, so we need to run another iteration

Slide 114: Iteration 2, first step
Cut away a chunk of the solution space and solve the sparse problem

Slide 115: Iteration 2, first step
This solution is less than the incumbent, so the incumbent becomes -0.6: x = 2, y = 1, Z = -0.6

Slide 116: Iteration 2, second step
Add a constraint to cut off the solved chunk
Relax integrality and solve

Slide 117: Iteration 2, second step
Incumbent value: Z = -0.6
x = 1.0, y = 0.6, Z = -0.2
This relaxed solution is greater than the incumbent, so the incumbent must be optimal
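
The iterations above can be replayed mechanically: solve the cut-away chunk as an IP to get an incumbent, then solve the LP relaxation of the remaining space, and stop once the relaxation can no longer beat the incumbent. A sketch of iteration 1 with SciPy (1.9+ assumed); the slides do not give the iteration-2 piercing cut, so only the first iteration is coded.

```python
# Iteration 1 of the slide 108-117 cut-and-solve example.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

c = np.array([-4/5, 1])                      # minimize Z = y - 4/5 x, variables ordered (x, y)
A = np.array([[0, 1],                        # y <= 3
              [13/6, 1],                     # y + 13/6 x <= 9
              [5/13, -1],                    # y - 5/13 x >= 1/14, rewritten as <=
              [-3/5, -1]])                   # y + 3/5 x >= 6/5, rewritten as <=
b = np.array([3, 9, -1/14, -6/5])
box = Bounds([0, -np.inf], [np.inf, np.inf]) # x >= 0

# First step: solve the cut-away chunk (piercing cut y - 17/3 x <= -14) as an IP -> incumbent.
chunk = LinearConstraint(np.vstack([A, [[-17/3, 1]]]), -np.inf, np.append(b, -14))
inc = milp(c=c, constraints=chunk, integrality=np.ones(2), bounds=box)
print("incumbent:", inc.x, inc.fun)          # expect x = 3, y = 2, Z = -0.4

# Second step: LP relaxation of the remaining space (y - 17/3 x >= -14).
rest = LinearConstraint(np.vstack([A, [[17/3, -1]]]), -np.inf, np.append(b, 14))
rel = milp(c=c, constraints=rest, integrality=np.zeros(2), bounds=box)
print("relaxed remainder:", rel.fun)         # still below the incumbent, so iterate again
```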

Slide 118: Cut-and-solve properties
- Minimal memory requirements
  - Keep only the new constraints and the incumbent solution from one iteration to the next
- No subtrees in which to get lost
- Piercing cuts should try to capture optimal solutions
- Different from conventional cutting planes
  - Piercing cuts remove solutions from the original problem

Slide 119: Cut-and-solve
- Same as the two steps of limit crossing
  - A small chunk is solved to provide a simple upper bound
  - Doubly-modified problem: the added piercing cuts are upper-bounding modifications; the relaxation is a lower-bounding modification
- Unusual upper-bounding modification
  - Results in a search path instead of a search tree

Slide 120: Cut-and-solve
- A general technique for determining piercing cuts for binary IPs
  - Relax integrality and solve the LP
  - The LP yields reduced costs: a lower bound on the increase of the solution if a variable is forced into the LP solution
  - Let S = the set of k variables with the smallest reduced costs
  - Find the optimal solution for set S
  - Add the piercing cut: sum of the variables not in S >= 1
- Custom piercing cuts tailored to a particular problem may work better

Slide 121: Cut-and-solve
- We used the general technique for the ATSP [AIJ, to appear]
  - Piercing cut added: sum of the variables in S <= n - 1
  - Same cut, but with fewer variables in the constraint
- Very good results for difficult instances
  - Real-world instances with structural characteristics
- Results for easy problems not as impressive

Slide 122: Real-world ATSPs
- Super: shortest common superstring
- Shop: no-wait flowshop
- Rtilt, stilt: tilted drilling machine
- Crane: stacker crane
- Disk: computer disk read head
- Coin: pay phone collection

Slide 123: ATSP results
- 100-city instances
- Varied the degree of accuracy of the arc costs
  - Varied the number of digits used by the generators
- Average computation time over 100 trials
- Compared CDT, Concorde, and cut-and-solve

Slide 124: ATSP results
- The super class is not dependent on the number of digits
- CDT averaged 0.073 seconds
- Concorde averaged 8.15 seconds
- Cut-and-solve averaged 2.07 seconds

Slide 125 (figure): ATSP results, shop class

Slide 126 (figure): ATSP results, rtilt class (100 cities, 100 trials)

Slide 127 (figure): ATSP results, stilt class

Slide 128 (figure): ATSP results, crane class

Slide 129 (figure): ATSP results, disk class

Slide 130 (figure): ATSP results, coin class

Slide 131: Backbone and fat identifier
- Backbones are variables that appear in every optimal solution
- Fat variables don't appear in any optimal solution
- Useful for:
  - Enumerating all optimal solutions
  - Identifying critical components
  - Identifying wasted components

Slide 132: Backbone and fat identifier
- Find a simple upper bound
- Attempt to identify backbones
  - Force the exclusion of each variable
  - If the limits cross, it must be a backbone
- Attempt to identify fat
  - Force the inclusion of each variable
  - If the limits cross, it must be fat
- Use problem-specific information to improve results
- Example for the ATSP [Climer and Zhang, AAAI-02]
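
Limit crossing turns this into a simple loop: fix a tour cost as the upper bound, and for each arc compute an AP lower bound with that arc forced out (backbone test) or forced in (fat test); whenever the doubly-modified bound crosses the upper bound, the arc's status is proved. A sketch on a made-up ATSP instance, using the AP bound as on the earlier slides:

```python
# Backbone / fat detection by limit crossing with the assignment-problem lower bound.
import numpy as np
from itertools import permutations
from scipy.optimize import linear_sum_assignment

BIG = 1e6
rng = np.random.default_rng(3)
n = 6
cost = rng.integers(1, 10, size=(n, n)).astype(float)
np.fill_diagonal(cost, BIG)

def ap_bound(c):
    r, col = linear_sum_assignment(c)
    return c[r, col].sum()

def tour_cost(order):
    return sum(cost[order[i], order[(i + 1) % n]] for i in range(n))

# Simple upper bound: best tour found by brute force on this tiny instance.
upper = min(tour_cost((0,) + p) for p in permutations(range(1, n)))

backbone, fat = [], []
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        excl = cost.copy()
        excl[i, j] = BIG                               # force arc (i, j) out
        if ap_bound(excl) > upper:                     # limits cross: arc is in every optimal tour
            backbone.append((i, j))
        incl = cost.copy()
        incl[i, :] = BIG
        incl[:, j] = BIG
        incl[i, j] = cost[i, j]                        # force arc (i, j) in
        if ap_bound(incl) > upper:                     # limits cross: arc is in no optimal tour
            fat.append((i, j))

print("proved backbone arcs:", backbone)
print("proved fat arcs:", fat)
```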

Slide 133 (figure)

Slide 134 (figure): Upper bound = 61

Slide 135 (figure): Upper bound = 61, Lower bound = 24

Slide 136 (figure): Upper bound = 61, Lower bound = 24, Doubly-modified = 61

Slide 137 (figure): Upper bound = 61, Lower bound = 24, Doubly-modified = 70

Slide 138 (figure): Upper bound = 61

Slide 139 (figure): Upper bound = 61, Lower bound = 24

Slide 140 (figure): Upper bound = 61, Doubly-modified = 61

Slide 141 (figure): Upper bound = 61, Doubly-modified = 75

Slide 142 (figure): Upper bound = 61

Slide 143 (figure): Upper bound = 61, Doubly-modified = 152

Slide 144 (figure): Upper bound = 61

Slide 145 (figure): Upper bound = 61, Doubly-modified = 115

Slide 146 (figure): Upper bound = 61

Slide 147 (figure): Upper bound = 61, Doubly-modified = infinity

Slide 148 (figure)

Slide 149 (figure)

Slide 150: Results
- Used the AP lower bound [Climer and Zhang, AAAI-02]
- Random ATSP graphs with cost range equal to the number of cities
- Found close to half of the backbone arcs and 99% of the fat arcs
- Time savings when finding all solutions:
  - 20% for smaller instances
  - 75% for larger instances
- Found very few backbones and fat for real-world problems, as this lower bound is not tight
  - Implementing with a tighter lower bound

Slide 151 (figure): Moving beyond typical branch-and-bound. Cut-and-solve; Backbone & fat identifier

Slide 152: Future directions
- Forecasting effective bounding modifications
  - Automatically identifying "easy" doubly-modified problems
  - Tightness of modifications
- Systematic exploration of IPs
- Other domains
  - Quadratic programs
  - Constraint programming
  - Explore modifications of every facet of the problem definition

Slide 153: Take-home message
- The use of bounds is a powerful technique for solving difficult problems
- It has been shown that a systematic approach to finding lower bounds can produce useful heuristics that are not easily discovered
- Systematically using the 2-step limit crossing procedure, coupled with a clear vision of all potential problem modifications, can yield novel search strategies that exploit bounds

