
A global constraint combining a sum constraint and binary inequalities
Jean-Charles Régin (ILOG Sophia Antipolis) and Michel Rueher (Université de Nice – Sophia Antipolis)

The global constraint IS
Combination of:
a sum constraint: y = Σ_i x_i
and binary inequalities: x_i ≤ x_j + c_ij
Example: x_1 + x_2 = y and x_1 ≤ x_2 - 1

Motivation
To improve the "back" propagation when solving optimization problems in the CP framework.

Optimization problems
Objective function: a sum z = Σ_i x_i
CP framework ⇒ solving a sequence of decision problems (branch & bound) where z < z* (each solution must be better than the previous one)
Local consistency algorithms applied to each constraint separately propagate poorly (not very effective)
Global constraint IS: to improve the back propagation when the bounds of z are modified

Applications in deterministic scheduling problems
Minimizing the mean flow time: F = (1/n) Σ_{j=1..n} (C_j - r_j)
Minimizing tardiness: D = (1/n) Σ_{j=1..n} D_j, where D_j = max(C_j - d_j, 0)
C_j, r_j, d_j: completion time, ready time, and due date of task T_j
Difference constraints: precedence/distance constraints between the tasks

Example 1
D(x_1) = [0,6], D(x_2) = [1,7], D(y) = [1,13]
C_1: x_1 + x_2 = y
C_2: x_1 ≤ x_2 - 1
This system is arc consistent: each value belongs to a solution of every constraint.
If the lower bound of y is set to 6, the system remains arc consistent ⇒ no pruning can be achieved.

Example 1 (continued)
D(x_1) = [0,6], D(x_2) = [1,7], D(y) = [6,13]
C_1: x_1 + x_2 = y
C_2: x_1 ≤ x_2 - 1
However, when x_2 belongs to [1,3], the conjunction of C_1 and C_2 cannot be satisfied.
The global constraint IS deletes [1,3] from the domain D(x_2).
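A quick brute-force check of this example (an illustrative Python sketch, not part of the original slides; the explicit encoding of C_1 and C_2 and the variable names are mine):

```python
from itertools import product

# Domains of Example 1 after the lower bound of y has been set to 6.
D_x1, D_x2, D_y = range(0, 7), range(1, 8), range(6, 14)

def c1(x1, x2, y): return x1 + x2 == y   # sum constraint
def c2(x1, x2, y): return x1 <= x2 - 1   # binary inequality

# Values of x2 supported by the conjunction C1 & C2.
supported = {x2 for x1, x2, y in product(D_x1, D_x2, D_y)
             if c1(x1, x2, y) and c2(x1, x2, y)}
print(sorted(set(D_x2) - supported))     # -> [1, 2, 3]: exactly the values pruned by IS
```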

Summary of our framework
Sum constraint Sum: y = Σ_{j=1..n} x_j
Inequalities Ineq: { x_i - x_j ≤ c_ij, i,j ∈ [1,n] }
Domain constraints Dom: { l_j ≤ x_j ≤ u_j, j ∈ [1,n] }
Global constraint IS = {Sum} ∪ Dom ∪ Ineq
Algorithm (scheme):
If a bound of some x_j is modified ⇒ filter Dom ∪ Ineq by interval consistency
If a bound of y is modified ⇒ filter Sum by interval consistency
Then update the bounds of every x_j with respect to IS

Summary of our framework: contribution
Filtering Sum by interval consistency: O(n)
Filtering Ineq by interval consistency: O(mn)
Filtering IS by interval consistency: O(n(m + n log n))

Sum constraint y=x i is interval consistent iff: (1)min(y)   j=1…n min(x j ) (2)max(y)   j=1…n max(x j ) (3) x i : min(x i )  min(y) -  j i max(x j ) (4) x i : max(x i )  max(y) -  j i min(x j ) 2 August,

Sum constraint (continued)
Checking interval consistency of y = Σ_i x_i is in O(n):
(3) ∀x_i: min(x_i) ≥ min(y) - Σ_{j≠i} max(x_j)
where Σ_{j≠i} max(x_j) = Σ_{j=1..n} max(x_j) - max(x_i)
so once the total sums are computed, each of the n conditions is checked in constant time.
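A minimal sketch of this O(n) bound filtering over integer intervals (illustrative Python, not the authors' code; the in-place list representation of the domains is an assumption):

```python
def filter_sum(xs, y):
    """Interval-consistency filtering for y = sum(xs).
    xs: list of [lo, hi] integer intervals; y: [lo, hi] interval.
    Tightens the bounds in place; returns False on failure."""
    total_min = sum(lo for lo, hi in xs)
    total_max = sum(hi for lo, hi in xs)
    # Conditions (1)-(2): tighten the bounds of y.
    y[0] = max(y[0], total_min)
    y[1] = min(y[1], total_max)
    if y[0] > y[1]:
        return False
    # Conditions (3)-(4): tighten each x_i using the precomputed totals,
    # so the whole pass stays O(n).
    for iv in xs:
        lo, hi = iv
        iv[0] = max(lo, y[0] - (total_max - hi))  # min(x_i) >= min(y) - sum_{j!=i} max(x_j)
        iv[1] = min(hi, y[1] - (total_min - lo))  # max(x_i) <= max(y) - sum_{j!=i} min(x_j)
        if iv[0] > iv[1]:
            return False
    return True

# Example 1: x1 in [0,6], x2 in [1,7], y in [6,13]
xs, y = [[0, 6], [1, 7]], [6, 13]
print(filter_sum(xs, y), xs, y)   # the sum constraint alone prunes nothing here
```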

Binary inequalities
Filtering by AC ⇒ the complexity depends on the size of the domains
Filtering by interval consistency can be achieved in O(mn) ⇒ simple temporal CSP (Dechter et al.)

Simple temporal CSP
Shortest paths: δ(s, x_j) ≤ δ(s, x_i) + c(x_i, x_j)
Distance graph G = (N, E)
N: a source node s with D(s) = {0}, plus one node for each variable
E:
x_i ≤ x_j + c_ij ⇒ arc (x_j, x_i) with cost c_ij
D(x) = [min(x), max(x)]:
s ≤ x - min(x) ⇒ arc (x, s) with cost -min(x)
x ≤ s + max(x) ⇒ arc (s, x) with cost max(x)

Distance graph (continued)
Example: D(x_i) = [1,6], D(x_j) = [2,5], x_j ≤ x_i - 3
[figure: distance graph over s, x_i, x_j with arcs (s,x_i) cost 6, (x_i,s) cost -1, (s,x_j) cost 5, (x_j,s) cost -2, (x_i,x_j) cost -3]

Distance graph (continued)
x_i is interval consistent if D(x_i) = [-δ(x_i, s), δ(s, x_i)]
Example: with x_j ≤ x_i - 3, the domains become D(x_i) = [5,6] and D(x_j) = [2,3]
Computing the shortest paths: O(nm), but after ONE computation we can use reduced costs ⇒ O(m + n log n)
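A small illustration of the distance-graph construction and of D(x_i) = [-δ(x_i,s), δ(s,x_i)]. It uses plain Bellman-Ford runs from every node rather than the reduced-cost scheme the slide refers to, so it is a naive O(n·nm) sketch rather than the O(m + n log n) method; all names are illustrative:

```python
def bellman_ford(nodes, arcs, src):
    """Shortest-path distances from src; arcs is {(u, v): cost}. No negative-cycle check."""
    dist = {v: float("inf") for v in nodes}
    dist[src] = 0
    for _ in range(len(nodes) - 1):
        for (u, v), c in arcs.items():
            if dist[u] + c < dist[v]:
                dist[v] = dist[u] + c
    return dist

def stn_interval_consistency(domains, inequalities):
    """domains: {name: (lo, hi)}; inequalities: list of (i, j, c) meaning x_i <= x_j + c.
    Returns the tightened domains D(x_i) = [-dist(x_i, s), dist(s, x_i)]."""
    nodes = ["s"] + list(domains)
    arcs = {}
    for i, j, c in inequalities:
        arcs[(j, i)] = min(arcs.get((j, i), float("inf")), c)       # x_i <= x_j + c => arc (x_j, x_i)
    for x, (lo, hi) in domains.items():
        arcs[(x, "s")] = min(arcs.get((x, "s"), float("inf")), -lo)  # s <= x - min(x)
        arcs[("s", x)] = min(arcs.get(("s", x), float("inf")), hi)   # x <= s + max(x)
    dist_from_s = bellman_ford(nodes, arcs, "s")
    new_domains = {}
    for x in domains:
        dist_x = bellman_ford(nodes, arcs, x)   # one run per variable; a real solver reuses reduced costs
        new_domains[x] = (-dist_x["s"], dist_from_s[x])
    return new_domains

# Example from the slides: D(x_i)=[1,6], D(x_j)=[2,5], x_j <= x_i - 3
print(stn_interval_consistency({"xi": (1, 6), "xj": (2, 5)}, [("xj", "xi", -3)]))
# -> {'xi': (5, 6), 'xj': (2, 3)}
```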

The IS constraint (Sum & binary inequalities)
(3) ∀x_i: min(x_i) ≥ min(y) - Σ_{j≠i} max(x_j) is too weak to enforce interval consistency on IS
Example: D(x_1) = [0,6], D(x_2) = [1,7], D(y) = [6,13]
C_1: x_1 + x_2 = y
C_2: x_1 ≤ x_2 - 1
min(x_2) ≥ min(y) - max(x_1) = 0, although IS cannot be satisfied when x_2 ≤ 3

The IS constraint (Sum & binary inequalities)
IS is interval consistent iff:
min(y) ≥ Σ_{i=1..n} min(x_i)
max(y) ≤ Σ_{i=1..n} max(x_i)
(3b) ∀x_i: min(x_i) ≥ min(y) - Σ_{j≠i} max_{x_i←min(x_i)}(x_j)
(4b) ∀x_i: max(x_i) ≤ max(y) - Σ_{j≠i} min_{x_i←max(x_i)}(x_j)
where max_{x_i←min(x_i)}(x_j) is the maximum value of D(x_j) that satisfies Ineq when x_i is set to min(x_i)

The IS constraint (Sum & binary inequalities)
(3b) ∀x_i: min(x_i) ≥ min(y) - Σ_{j≠i} max_{x_i←min(x_i)}(x_j)
Example: D(x_1) = [0,6], D(x_2) = [1,7], D(y) = [6,13]
C_1: x_1 + x_2 = y
C_2: x_1 ≤ x_2 - 1
min(x_2) ≥ min(y) - max_{x_2←min(x_2)}(x_1) = 4
⇒ the lower bound of x_2 is raised to 4, i.e., [1,3] is removed from D(x_2)

How to compute the new min(x_i)?
(3b) ∀x_i: min(x_i) ≥ min(y) - Σ_{j≠i} max_{x_i←min(x_i)}(x_j)
⇒ the new bound x̄_i is the solution of x̄_i = min(y) - Σ_{j≠i} max_{x_i←x̄_i}(x_j)

Computing x̄_i
max_{x_i←x̄_i}(x_j) = min(δ(s,x_j), x̄_i + δ'(x_i,x_j))   (1)
where δ'(x_i,x_j) is the length of the shortest path from x_i to x_j in G - {s}
and thus x̄_i = min(y) - Σ_{j≠i} min(δ(s,x_j), x̄_i + δ'(x_i,x_j))   (2)
max_{x_i←x̄_i}(x_j) depends only on the upper bounds of the variables x_k that belong to a shortest path from s to x_j

Computing min(x_i) (continued)
x̄_i = min(y) - Σ_{j≠i} min(δ(s,x_j), x̄_i + δ'(x_i,x_j))   (2)
Algorithm:
(1) Compute δ'(x_i,x_j) for all j ≠ i and let Δ_j = δ(s,x_j) - δ'(x_i,x_j)
(2) L ← sorted list of the Δ_j
    S ← { x_j : Δ_j ≤ 0 }
    S is the set of the x_j's for which min(δ(s,x_j), x̄_i + δ'(x_i,x_j)) = δ(s,x_j)
Loop
    solve x̄_i = min(y) - Σ_{j≠i} min(δ(s,x_j), x̄_i + δ'(x_i,x_j)) with the current S
    S ← S ∪ { x_j : Δ_j ≤ x̄_i }
until S does not change any more
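A toy sketch of this fixpoint loop (illustrative Python; the closed-form solution of equation (2) for a fixed set S and the ceiling step for integer domains are my reading of the scheme, not the authors' implementation):

```python
import math

def new_lower_bound(min_y, delta_s, delta_prime, current_min):
    """Fixpoint computation of the new lower bound x_bar of x_i from equation (2).
    min_y:       current lower bound of y
    delta_s:     {j: delta(s, x_j)} for the other variables j != i
    delta_prime: {j: delta'(x_i, x_j)} shortest paths in G - {s}
    current_min: current lower bound of x_i"""
    others = list(delta_s)
    gaps = {j: delta_s[j] - delta_prime[j] for j in others}   # Delta_j
    S = {j for j in others if gaps[j] <= 0}                   # terms whose minimum is delta(s, x_j)
    while True:
        outside = [j for j in others if j not in S]
        # Solve x_bar = min_y - sum_{j in S} delta(s,x_j) - sum_{j not in S} (x_bar + delta'(x_i,x_j))
        rhs = min_y - sum(delta_s[j] for j in S) - sum(delta_prime[j] for j in outside)
        x_bar = math.ceil(rhs / (1 + len(outside)))           # integer domains
        grown = {j for j in outside if gaps[j] <= x_bar}
        if not grown:
            return max(current_min, x_bar)
        S |= grown                                            # S only grows, so the loop terminates

# Example 1, computing the new min(x_2) with min(y) = 6:
# delta(s, x_1) = 6 and delta'(x_2, x_1) = -1 (from the inequality x_1 <= x_2 - 1).
print(new_lower_bound(6, {"x1": 6}, {"x1": -1}, 1))   # -> 4
```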

Computing min(x_i) (continued)
δ'(x_i,x_j) can be computed on the graph of reduced costs of G - {s} in O(n log n)
Identifying the shortest paths from x_i to x_j that go through s
Iteration step: O(n log n)
Filtering of IS by interval consistency: O(n(m + n log n))
(no propagation step is required when min(x_i) is increased)

Extensions
Works also for weighted sums Σ_i a_i x_i = y, with (3b) adapted accordingly:
x̄_i ≥ (min(y) - Σ_{j≠i} a_j δ(x_i,x_j)) / Σ_j a_j
Works also if not all the variables occur in the sum constraint

Conclusion
An original combination and an efficient algorithm for a new global constraint
⇒ to improve propagation in optimization problems
Further work: implementation & experimentation

Computing min(x_i) (continued)
x̄_i ≥ (1/|X|) (min(y) - Σ_{j≠i} δ(x_i,x_j))
δ(x_i,x_j) can be computed on the graph of reduced costs of G - {s}
Identifying the shortest paths from x_i to x_j that go through s
Iteration step: O(n log n)
Filtering of IS by interval consistency: O(n(m + n log n))
(no propagation step is required when min(x_i) is increased)

Computing x̄_i
max_{x_i←x̄_i}(x_j) = min(δ(s,x_j), x̄_i + δ(x_i,x_j))   (1)
and thus max_{x_i←x̄_i}(x_j) ≤ x̄_i + δ(x_i,x_j)   (2)
and x̄_i ≥ (1/|X|) (min(y) - Σ_{j≠i} δ(x_i,x_j))   (3)
max_{x_i←x̄_i}(x_j) depends only on the upper bounds of the variables x_k that belong to a shortest path from s to x_j
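The step from (2) to (3), written out (my reconstruction of the omitted algebra; it substitutes (2) into condition (3) of the plain sum constraint, with |X| the number of variables in the sum):

```latex
% From (2): \max_{x_i \leftarrow \bar{x}_i}(x_j) \le \bar{x}_i + \delta(x_i,x_j) for every j \ne i.
\bar{x}_i \;\ge\; \min(y) - \sum_{j \ne i} \bigl(\bar{x}_i + \delta(x_i,x_j)\bigr)
\quad\Longrightarrow\quad
|X|\,\bar{x}_i \;\ge\; \min(y) - \sum_{j \ne i} \delta(x_i,x_j)
\quad\Longrightarrow\quad
\bar{x}_i \;\ge\; \frac{1}{|X|}\Bigl(\min(y) - \sum_{j \ne i} \delta(x_i,x_j)\Bigr)
```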

Computing min(x_i) (continued)
max_{x_i←min(x_i)}(x_j) = x̄_i + δ(x_i,x_j)   (2)
Proof (scheme): two cases:
1. x_i belongs to a shortest path from s to x_j when x_i is set to the value x̄_i we are searching for
   ⇒ max_{x_i←min(x_i)}(x_j) = x̄_i + δ(x_i,x_j)
2. x_i does not belong to a shortest path from s to x_j (whatever value is assigned to x_i)
   x̄_i = min(x_i) = -δ(x_i,s)
   max_{x_i←min(x_i)}(x_j) = δ(s,x_j) = x̄_i + δ(x_i,s) + δ(s,x_j), and s belongs to a shortest path from x_i to x_j

Computing min(x_i) (improvement)
(3b) requires x̄_i ≥ (1/|X|) (min(y) - Σ_{j≠i} δ(x_i,x_j))
δ(x_i,x_j) can be approximated by its lower bound max(x_j) - max(x_i)
⇒ the computation can be stopped earlier: if even the resulting over-estimate of x̄_i does not exceed the current min(x_i), no pruning is possible
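A sketch of how such an early-stop test could look (illustrative Python; the function name and interface are assumptions, not the authors' code):

```python
def can_skip_min_update(min_y, maxes, i, current_min):
    """Early-stop test before computing the new lower bound of x_i.
    Approximates delta(x_i, x_j) from below by max(x_j) - max(x_i), which
    over-estimates the candidate bound; if even that over-estimate does not
    exceed the current min(x_i), the shortest-path computation can be skipped."""
    approx = sum(maxes[j] - maxes[i] for j in maxes if j != i)
    candidate = (min_y - approx) / len(maxes)   # relaxed form of (3b)
    return candidate <= current_min

# Example 1, testing x_2: the over-estimate is 3.5 > 1, so the full
# shortest-path computation cannot be skipped here (and indeed it prunes).
print(can_skip_min_update(6, {"x1": 6, "x2": 7}, "x2", 1))   # -> False
```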