(Finite domain) constraint logic programming
Andy King
With thanks to Mark Wallace, Joachim Schimpf, Warwick Harvey, Andrew Cheadle and Andrew Sadler for the use of their ECLiPSe material. Slides originally adapted from notes of Micha Meier.
software/eclipse/win32/
software/eclipse/linux-i386/

Logical course structure
- This overview
- Constraint satisfaction problems (CSPs)
- Bounds propagation, search and optimisation techniques
- Modelling problems with reified constraints

CLP applications
- Fleet assignment – the assignment of a set of aircraft of different types to a predefined schedule of flights
- Job-shop scheduling – a set of jobs and a set of machines are given, each job consisting of a partially ordered set of tasks
- Nurse scheduling – specify which days on and which days off, given constraints on personnel policy, nurses’ qualifications and individual requests
- Interpreting sloppy stick figures – recognizing drawings with missing model parts and noisy data
- See the applications reported at PACT, CP, Constraints and on the applications page
- Do not be deceived: constraint solving is usually hidden beneath an interface coded in Java

Growth of constraint programming
- Results – 64 submissions to ICLP in 2001 and 66 submissions to ICFP in 2001, compared with 135 submissions to CP in 2001
- Systems – AKL, Amulet and Garnet, B-Prolog, Bertrand, CHIP, CPLEX, Cassowary, Cooldraw, Thinglab, ECLiPSe, GNU-Prolog, IF/Prolog, ILOG Solver, Interval Solver for Microsoft Excel, Jsolver, Numerica, Oz, Prolog IV, RISC-CLP(Real), SICStus
- Laboratories and startups – CCC, IC-Parc, IF/Computer, ILOG, PrologIA, Vine Solutions, etc.

Paper and on-line resources
- For ECLiPSe see the “ECLiPSe Constraint Library Manual” in the on-line documentation
- Krzysztof Apt, “Principles of Constraint Programming”, Cambridge University Press, 2003
- Kim Marriott and Peter Stuckey, “Programming with Constraints”, MIT Press, 1998
- Roman Barták, “On-line Guide to Constraint Programming”

Commerce versus science

“Were you to ask me which programming paradigm is likely to gain most in commercial significance over the next 5 years I’d have to pick Constrained Logic Programming (CLP), even though it’s perhaps currently one of the least known and understood” – Dick Pountain, BYTE, February 1995

“Constraint programming represents one of the closest approaches computer science has yet made to the Holy Grail of programming: the user states the problem, the computer solves it” – Eugene C. Freuder, CONSTRAINTS, April 1997

Constraint satisfaction problems Chapter 1: the declarative nature of constraint logic programming

Constraint satisfaction problems (CSPs)
A CSP consists of:
- a finite set of variables X = {x1, …, xn}
- a domain D that is a mapping {x1 ↦ S1, …, xn ↦ Sn} where each Si is a finite set
- a constraint C that is a finite set of primitive constraints C = {c1, …, cm} where var(ci) ⊆ X
The CSP is interpreted as the problem of deciding whether C ∧ x1 ∈ D(x1) ∧ … ∧ xn ∈ D(xn) is satisfiable – whether it possesses a solution

What is an example CSP?
Colour the regions of a map with a limited number of colours, subject to the condition that no two adjacent regions share the same colour. For instance, consider the CSP which encodes the problem of colouring Australia so that Western Australia is red:
- The set of variables is X = {WA, NT, Q, NS, SA, V, T}
- The domain is D = {WA ↦ {red}, NT ↦ S, …, T ↦ S} where S = {red, yellow, blue}
- The set of constraints is C = {WA ≠ NT, WA ≠ SA, NT ≠ SA, NT ≠ Q, SA ≠ Q, SA ≠ NS, SA ≠ V, Q ≠ NS, NS ≠ V}

Coding the colouring CSP in ECLiPSe

:- use_module(library(fd)).

colour(Regions) :-
    Regions = [NT, Q, NS, SA, V, _T],
    Regions :: 1..3,
    [WA] :: 1..1,
    WA #\= NT, WA #\= SA, NT #\= SA,
    NT #\= Q, SA #\= Q, SA #\= NS,
    SA #\= V, Q #\= NS, NS #\= V,
    labeling(Regions).

[eclipse 2]: colour(R).
R = [2, 1, 2, 3, 1, 1]
More (0.00s cpu) ? ;
R = [2, 1, 2, 3, 1, 2]
More (0.00s cpu) ? ;
R = [2, 1, 2, 3, 1, 3]
More (0.00s cpu) ? ;
R = [3, 1, 3, 2, 1, 1]
More (0.00s cpu) ? ;
R = [3, 1, 3, 2, 1, 2]
More (0.00s cpu) ? ;
R = [3, 1, 3, 2, 1, 3]
Yes (0.00s cpu)
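A hedged aside (not on the original slide): SA, NT and Q are mutually adjacent, so their three pairwise disequalities can be grouped with the library constraint alldifferent/1 used on later slides; the set of solutions is unchanged. The name colour_alldiff/1 is invented for illustration.

colour_alldiff(Regions) :-
    Regions = [NT, Q, NS, SA, V, _T],
    Regions :: 1..3,
    [WA] :: 1..1,
    alldifferent([SA, NT, Q]),     % replaces NT #\= SA, NT #\= Q, SA #\= Q
    WA #\= NT, WA #\= SA,
    SA #\= NS, SA #\= V, Q #\= NS, NS #\= V,
    labeling(Regions).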

What is another example of a CSP?
Crypto-arithmetic problems – puzzles in which digits are replaced with distinct letters of the alphabet or other symbols. Consider the classic SEND + MORE = MONEY puzzle that is due to Dudeney [Strand Magazine, July, 1924]:

    S E N D
  + M O R E
  ---------
  M O N E Y

- The set of variables is X = {S, E, N, D, M, O, R, Y, C1, C2, C3}
- The domain is D = {S ↦ {0,…,9}, …, R ↦ {0,…,9}, C1 ↦ {0,1}, C2 ↦ {0,1}, C3 ↦ {0,1}}
- The set of constraints is C = CP ∪ CA where
  - CP = {S ≠ E, S ≠ N, S ≠ D, S ≠ M, S ≠ O, S ≠ R, E ≠ N, E ≠ D, E ≠ M, E ≠ O, E ≠ R, N ≠ D, N ≠ M, N ≠ O, N ≠ R, D ≠ M, D ≠ O, D ≠ R, M ≠ O, M ≠ R, O ≠ R}
  - CA = {D + E = Y + 10C1, N + R + C1 = E + 10C2, E + O + C2 = N + 10C3, S + M + C3 = O + 10M}

Coding the crypto-arithmetic CSP in ECLiPSe

:- use_module(library(fd)).

crypto(Digits) :-
    Digits = [S,E,N,D,M,O,R,Y],
    Digits :: 0..9,
    Carrys = [C1, C2, C3],
    Carrys :: 0..1,
    alldifferent(Digits),
    D + E #= Y + 10 * C1,
    N + R + C1 #= E + 10 * C2,
    E + O + C2 #= N + 10 * C3,
    S + M + C3 #= O + 10 * M,
    labeling(Digits).

[eclipse 2]: crypto(D).
D = [2, 8, 1, 7, 0, 3, 6, 5]
More (0.00s cpu) ? ;
D = [2, 8, 1, 9, 0, 3, 6, 7]
More (0.00s cpu) ? ;
D = [3, 7, 1, 2, 0, 4, 6, 9]
More (0.00s cpu) ? ;
D = [3, 7, 1, 9, 0, 4, 5, 6]
More (0.00s cpu) ? ;
…
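A hedged variant (not on the original slide; the name crypto_classic/1 is invented): adding the usual convention that the leading digits S and M are non-zero leaves just the classic solution 9567 + 1085 = 10652.

crypto_classic(Digits) :-
    Digits = [S,E,N,D,M,O,R,Y],
    Digits :: 0..9,
    Carrys = [C1, C2, C3],
    Carrys :: 0..1,
    S #\= 0, M #\= 0,              % no leading zeroes
    alldifferent(Digits),
    D + E #= Y + 10 * C1,
    N + R + C1 #= E + 10 * C2,
    E + O + C2 #= N + 10 * C3,
    S + M + C3 #= O + 10 * M,
    labeling(Digits).

% [eclipse 3]: crypto_classic(D).
% D = [9, 5, 6, 7, 1, 0, 8, 2]     (and no further solutions)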

What is the n-queens problem?
- Place n queens on an n by n chessboard so that none of them can take each other
- Solutions and non-solutions for, say, n = 6 are shown on the slide as board diagrams (not reproduced here)
- There are n²!/(n!(n² - n)!) possible ways of placing n queens on an n by n board, that is, 1,947,792 for n = 6

Expressing 6-queens as a CSP
- The set of variables is X = {X1, …, X6} where Xi is the column number for the queen in row i
- The assignment X1 = 2, X2 = 4, …, X6 = 5 represents a safe configuration (pictured on the slide)
- This representation assumes that exactly one queen occurs in each row:
  - if two queens occurred in the same row then the configuration is a non-solution (they take horizontally)
  - if zero queens occurred in one row, then at least two queens must occur in another row (and they take horizontally)

Expressing 6-queens as a CSP (cont’d)
- The domain is D = {X1 ↦ {1,…,6}, …, X6 ↦ {1,…,6}}
- The set of constraints is C = CP ∪ CD where
  - CP = {X1 ≠ X2, X1 ≠ X3, X1 ≠ X4, X1 ≠ X5, X1 ≠ X6, X2 ≠ X3, X2 ≠ X4, X2 ≠ X5, X2 ≠ X6, …, X4 ≠ X5, X5 ≠ X6}
  - This ensures that queens cannot take vertically
  - CD = {1 ≠ abs(X1 - X2), 2 ≠ abs(X1 - X3), 3 ≠ abs(X1 - X4), 4 ≠ abs(X1 - X5), 5 ≠ abs(X1 - X6), 1 ≠ abs(X2 - X3), 2 ≠ abs(X2 - X4), 3 ≠ abs(X2 - X5), 4 ≠ abs(X2 - X6), …, 1 ≠ abs(X5 - X6)}
  - This ensures that queens cannot take diagonally

Taking diagonally revisited
- Suppose that X1 = 2
- Consider those X2 which are unsafe relative to X1 (pictured on the slide): in either case 1 = abs(X1 - X2), hence require 1 ≠ abs(X1 - X2)
- Suppose that X5 = 3
- Consider those X3 which are unsafe relative to X5: in either case 2 = abs(X5 - X3), hence require 2 ≠ abs(X5 - X3)

Coding the n-queens CSP in ECLiPSe

:- use_module(library(fd)).

nqueens(N, Soln) :-
    length(Soln, N),
    Soln :: 1..N,
    safe(Soln),
    alldifferent(Soln),
    labeling(Soln).

safe([]).
safe([CN | CNs]) :-
    no_attack(CNs, CN, 1),
    safe(CNs).

no_attack([], _, _).
no_attack([CN | CNs], First_CN, Diff) :-
    % Diff #\= abs(First_CN - CN),
    Diff #\= First_CN - CN,
    Diff #\= CN - First_CN,
    Next_Diff is Diff + 1,
    no_attack(CNs, First_CN, Next_Diff).

Running the n-queens program

[eclipse 26]: nqueens(6, S).
S = [2, 4, 6, 1, 3, 5]
More (0.00s cpu) ? ;
S = [3, 6, 2, 5, 1, 4]
More (0.00s cpu) ? ;
S = [4, 1, 5, 2, 6, 3]
More (0.00s cpu) ? ;
S = [5, 3, 1, 6, 4, 2]
More (0.00s cpu) ? ;
No (0.00s cpu)

(The slide also shows the four solutions as board diagrams.)

Bounds propagation, search and optimisation Chapter 2: how constraint solving is realised

With and without labelling

Without labelling:

:- use_module(library(fd)).

bounds(X, Y, Z) :-
    [X] :: 1..5,
    [Y] :: 1..2,
    [Z] :: 3..5,
    X #= Y + Z.
    % labeling([X, Y, Z]).

[eclipse 8]: bounds(X, Y, Z).
X = X{[4, 5]}
Y = Y{[1, 2]}
Z = Z{[3, 4]}
Yes (0.00s cpu)

With labelling:

:- use_module(library(fd)).

bounds(X, Y, Z) :-
    [X] :: 1..5,
    [Y] :: 1..2,
    [Z] :: 3..5,
    X #= Y + Z,
    labeling([X, Y, Z]).

[eclipse 8]: bounds(X, Y, Z).
X = 4, Y = 1, Z = 3
Yes (0.00s cpu) ? ;
X = 5, Y = 1, Z = 4
Yes (0.00s cpu) ? ;
X = 5, Y = 2, Z = 3
Yes (0.00s cpu)

Unravelling (understanding) bounds propagation
- Initially 1 ≤ X ≤ 5, 1 ≤ Y ≤ 2 and 3 ≤ Z ≤ 5
- Consider X = Y + Z
  - Thus 4 = min(Y) + min(Z) ≤ X ≤ max(Y) + max(Z) = 7
  - Tighten by 4 ≤ X, hence 4 ≤ X ≤ 5, 1 ≤ Y ≤ 2 and 3 ≤ Z ≤ 5
- Consider Y = X - Z
  - Thus -1 = min(X) - max(Z) ≤ Y ≤ max(X) - min(Z) = 2
  - Cannot tighten Y
- Consider Z = X - Y
  - Thus 2 = min(X) - max(Y) ≤ Z ≤ max(X) - min(Y) = 4
  - Tighten by Z ≤ 4, hence 4 ≤ X ≤ 5, 1 ≤ Y ≤ 2 and 3 ≤ Z ≤ 4
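The three steps above instantiate the general bounds rules for the constraint X #= Y + Z, summarised here for clarity in the interval notation of the slide:

    min(Y) + min(Z) ≤ X ≤ max(Y) + max(Z)
    min(X) - max(Z) ≤ Y ≤ max(X) - min(Z)
    min(X) - max(Y) ≤ Z ≤ max(X) - min(Y)

Propagation keeps tightening each variable's bounds with these rules until no bound changes, as the cross example two slides on illustrates.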

Unravelling (unwinding) labelling

[eclipse 8]: [X] :: 1..5, [Y] :: 1..2, [Z] :: 3..5, labeling([X, Y, Z]).
X = 1, Y = 1, Z = 3
Yes (0.00s cpu) ? ;
X = 1, Y = 1, Z = 4
Yes (0.00s cpu) ? ;
X = 1, Y = 1, Z = 5
Yes (0.00s cpu) ? ;
X = 1, Y = 2, Z = 3
Yes (0.00s cpu) ? ;
X = 1, Y = 2, Z = 4
Yes (0.00s cpu) ? ;
X = 1, Y = 2, Z = 5
Yes (0.00s cpu) ? ;

bounds(X, Y, Z) :-
    [X] :: 1..5,
    [Y] :: 1..2,
    [Z] :: 3..5,
    labeling([X, Y, Z]).

[eclipse 9]: bounds(X, Y, Z).
X = 1, Y = 1, Z = 3
Yes (0.00s cpu) ? ;
X = 1, Y = 1, Z = 4
Yes (0.00s cpu) ? ;
X = 1, Y = 1, Z = 5
Yes (0.00s cpu) ? ;

Search without bounds propagation (is inefficient)
- Without bounds propagation, a maximum of 5 × 2 × 3 = 30 cases need to be checked for consistency with X #= Y + Z
- Bounds propagation infers 4 ≤ X ≤ 5, 1 ≤ Y ≤ 2 and 3 ≤ Z ≤ 4, hence a maximum of 2 × 2 × 2 = 8 cases need to be checked for consistency

bounds(X, Y, Z) :-
    [X] :: 1..5,
    [Y] :: 1..2,
    [Z] :: 3..5,
    X #= Y + Z,
    labeling([X, Y, Z]).
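For contrast, a hedged sketch of the same problem written generate-and-test style (not on the original slide; the name naive_bounds/3 is invented): every one of the 30 labellings is enumerated before the arithmetic test is applied.

naive_bounds(X, Y, Z) :-
    [X] :: 1..5,
    [Y] :: 1..2,
    [Z] :: 3..5,
    labeling([X, Y, Z]),     % enumerate first ...
    X =:= Y + Z.             % ... then test, so failures are discovered late

Posting X #= Y + Z before labeling, as bounds/3 does, prunes the domains first, so at most 8 labellings are ever tried.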

Bounds propagation without search (the cross example)
- Initially -4 ≤ X ≤ 4 and -4 ≤ Y ≤ 4
- Consider Y = 2(X - 1), so X = (Y/2) + 1
  - Thus -1 = min(Y)/2 + 1 ≤ X ≤ max(Y)/2 + 1 = 3
  - Tighten -1 ≤ X ≤ 3, hence -1 ≤ X ≤ 3 and -4 ≤ Y ≤ 4
- Consider Y = X
  - Thus -1 = min(X) ≤ Y ≤ max(X) = 3
  - Tighten -1 ≤ Y ≤ 3, hence -1 ≤ X ≤ 3 and -1 ≤ Y ≤ 3
- Consider Y = 2(X - 1), so X = (Y/2) + 1
  - Thus 1/2 = min(Y)/2 + 1 ≤ X ≤ max(Y)/2 + 1 = 5/2
  - Tighten 0 ≤ X ≤ 2, hence 0 ≤ X ≤ 2 and -1 ≤ Y ≤ 3
- Consider Y = X
  - Thus 0 = min(X) ≤ Y ≤ max(X) = 2
  - Tighten 0 ≤ Y ≤ 2, hence 0 ≤ X ≤ 2 and 0 ≤ Y ≤ 2

cross(X, Y) :-
    [X, Y] :: -4..4,
    Y #= 2*(X - 1),
    Y #= X.

[eclipse 8]: cross(X, Y).
X = 2
Y = 2
Yes (0.00s cpu)

Bounds propagation without search (the cross example, continued)
- The story so far: 0 ≤ X ≤ 2 and 0 ≤ Y ≤ 2
- Consider Y = 2(X - 1), so X = (Y/2) + 1
  - Thus 1 = min(Y)/2 + 1 ≤ X ≤ max(Y)/2 + 1 = 2
  - Tighten 1 ≤ X ≤ 2, hence 1 ≤ X ≤ 2 and 0 ≤ Y ≤ 2
- Consider Y = X
  - Thus 1 = min(X) ≤ Y ≤ max(X) = 2
  - Tighten 1 ≤ Y, hence 1 ≤ X ≤ 2 and 1 ≤ Y ≤ 2
- Consider Y = 2(X - 1), so X = (Y/2) + 1
  - Thus 3/2 = min(Y)/2 + 1 ≤ X ≤ max(Y)/2 + 1 = 2
  - Tighten 2 ≤ X, hence 2 ≤ X ≤ 2 and 1 ≤ Y ≤ 2
- Consider Y = X
  - Thus 2 = min(X) ≤ Y ≤ max(X) = 2
  - Tighten 2 ≤ Y, hence 2 ≤ X ≤ 2 and 2 ≤ Y ≤ 2

cross(X, Y) :-
    [X, Y] :: -4..4,
    Y #= 2*(X - 1),
    Y #= X.

[eclipse 8]: cross(X, Y).
X = 2
Y = 2
Yes (0.00s cpu)

Bounds propagation without search (the non_bit example)
- Initially 0 ≤ X ≤ 1 and 0 ≤ Y ≤ 1
- Consider X = 1 - Y
  - Thus 0 = 1 - max(Y) ≤ X ≤ 1 - min(Y) = 1
  - Cannot tighten X
- Consider X = 1 - Y, thus Y = 1 - X
  - Thus 0 = 1 - max(X) ≤ Y ≤ 1 - min(X) = 1
  - Cannot tighten Y
- Consider Y = X
  - Thus 0 = min(X) ≤ Y ≤ max(X) = 1
  - Cannot tighten Y
- Consider Y = X, thus X = Y
  - Thus 0 = min(Y) ≤ X ≤ max(Y) = 1
  - Cannot tighten X

non_bit(X, Y) :-
    [X, Y] :: 0..1,
    X #= 1 - Y,
    X #= Y.

[eclipse 8]: non_bit(X, Y).
X = X{[0, 1]}
Y = Y{[0, 1]}
Yes (0.00s cpu)

Bounds propagation versus search
- Bounds propagation is incomplete; it does not always perform optimal interval pruning
  - It may not have the intelligence to infer that no solutions exist (in non_bit there are 4 cases to consider)
  - It may not have the intelligence to infer that only 3 solutions exist (in bounds there are 8 cases to consider)
- When intelligence fails, resort to brute force
- Follow bounds propagation with labelling to systematically enumerate the space to either:
  - Find a solution (bounds example)
  - Detect that no solutions exist (non_bit example; see the sketch below)
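A minimal sketch (not on the original slides; the name non_bit_labelled/2 is invented): following the incomplete propagation of the non_bit example with labelling lets the solver prove that no solution exists.

non_bit_labelled(X, Y) :-
    [X, Y] :: 0..1,
    X #= 1 - Y,
    X #= Y,
    labeling([X, Y]).

% [eclipse 1]: non_bit_labelled(X, Y).
% No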

Brute force versus divide-and-conquer
- Consider a binary CSP where X = {A, …, H, J}, D = {A ↦ S, …, J ↦ S} and S = {0, 1, 2}
- Now label E (see the next slide)
- E is fixed, so no propagation occurs from G to E. Once E propagates to G, no more propagation can occur over the E–G constraint, so it is as if it is not there
- Thus one CSP over {A, B, C, D} and another CSP over {F, G, H, J}
- The total number of configurations is 3(3⁴ + 3⁴) = 3(81 + 81) = 486 ≪ 19683 = 3⁹
- Thus labelling can decompose a CSP into independent sub-CSPs that are cheaper to solve than the whole
(The slide shows the constraint graph over A, B, C, D, E, F, G, H, J.)

Brute force versus divide-and-conquer
(The slide shows three copies of the constraint graph, with E labelled 0, 1 and 2 respectively, each splitting into the independent sub-CSPs over {A, B, C, D} and {F, G, H, J}.)

What is optimisation?
- Optimisation is the problem of satisfying a CSP so as to minimise or maximise a solution with respect to a cost function
- A classic optimisation problem is the smuggler’s knapsack problem: A smuggler has a knapsack of limited capacity, say 19 units. He can smuggle in bottles of whiskey of size 4 units, bottles of scent of size 3 units and boxes of cigarettes of size 2 units. The profits from a bottle of whiskey, a bottle of scent and a box of cigarettes are 15 pounds, 10 pounds and 7 pounds respectively. The smuggler will only make a trip if he can make 30 pounds or more, so what does he take?

What is optimisation?
- Solve the following CSP:
  - The set of variables is X = {W, S, C, P} for the number of bottles of whisky, bottles of scent, boxes of cigarettes and the profit
  - The domain is D = {W ↦ {0,…,9}, S ↦ {0,…,9}, C ↦ {0,…,9}, P ↦ {0,…,10000}}
  - The set of constraints is C = {4W + 3S + 2C ≤ 19, P = 15W + 10S + 7C, 30 ≤ P}
- So as to maximise the value assigned to P
- A more realistic upper limit on the profit P is 88 (rather than 10000)
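Before any optimisation machinery is introduced, the satisfaction part alone can be posted and labelled. A hedged sketch (not on the original slide; the name knapsack_sat/4 is invented):

knapsack_sat(W, S, C, P) :-
    [W, S, C] :: 0..9,
    [P] :: 0..88,
    4*W + 3*S + 2*C #=< 19,      % knapsack capacity
    P #= 15*W + 10*S + 7*C,      % profit
    30 #=< P,                    % only worth the trip for 30 pounds or more
    labeling([W, S, C]).

Each answer is a feasible load worth at least 30 pounds; the following slides add the machinery for picking the most profitable one.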

“Di-cho-tomic” search (for minimising a cost)
- Dichotomic search is essentially bisection search built on top of a binary decision procedure for a CSP
- Consider minimising a cost function represented by a variable x ∈ X
- Note that the maximum number of iterations is log2(|D(x)|)
- Note that top is not assigned mid within the while loop; it is assigned the value of x in the solution just found, which may be well below mid

function dichotomic(CSP ⟨X, D, C⟩, x ∈ X)
begin
    bot := min(D(x))
    top := max(D(x))
    if ⟨X, D, C⟩ unsatisfiable then error
    while (bot < top)
        mid := ⌊(bot + top) / 2⌋
        if ⟨X, D, C ∪ {bot ≤ x ≤ mid}⟩ satisfiable then
            top := the value of x in the solution found
        else
            bot := ⌊(bot + top) / 2⌋ + 1
    return bot
end

Example minimisation
- Solve the following CSP:
  - The set of variables is X = {x, y}
  - The domain is D = {x ↦ {-10,…,80}, y ↦ {-3,…,14}}
  - The set of constraints is C = {x = (y - 4)²}
- So as to minimise the value assigned to x

bot    top    mid := ⌊(bot + top) / 2⌋    ⟨X, D, C ∪ {bot ≤ x ≤ mid}⟩ satisfiable?
-10    80     35                          yes, with {x ↦ 9, y ↦ 7}
-10    9      -1                          no
0      9      4                           yes, with {x ↦ 1, y ↦ 5}
0      1      0                           yes, with {x ↦ 0, y ↦ 4}
0      0      (bot = top, so the loop exits and 0 is returned)
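The pseudocode can be rendered directly in ECLiPSe. The sketch below is not part of the original slides: the names csp_x/2, dichotomic_x/1, dicho/3 and floor_div2/2 are invented, the initial satisfiability check is omitted, and it assumes library(fd) accepts the non-linear product used to express (y - 4)². The witnesses it finds may differ from those in the table (labelling simply returns the first solution it meets), but the minimum it returns, 0, agrees.

:- use_module(library(fd)).

% the CSP of this slide: x = (y - 4)^2 with x in -10..80 and y in -3..14
csp_x(X, Y) :-
    [X] :: -10..80,
    [Y] :: -3..14,
    X #= (Y - 4) * (Y - 4).

% floor((Bot + Top) / 2) using integer arithmetic only
floor_div2(S, M) :-
    ( S >= 0 -> M is S // 2 ; M is -((1 - S) // 2) ).

dichotomic_x(Min) :-
    dicho(-10, 80, Min).

dicho(Bot, Top, Min) :-
    ( Bot >= Top ->
        Min = Bot
    ;
        S is Bot + Top,
        floor_div2(S, Mid),
        % is the CSP together with Bot =< x =< Mid satisfiable?
        % keep at most one witness value for x
        findall(X, once((csp_x(X, Y), X #>= Bot, X #=< Mid, labeling([X, Y]))), Ws),
        ( Ws = [Witness] ->
            dicho(Bot, Witness, Min)      % top := value of x in the solution found
        ;
            NewBot is Mid + 1,
            dicho(NewBot, Top, Min)       % bot := mid + 1
        )
    ).

% [eclipse 1]: dichotomic_x(Min).
% Min = 0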

What about maximisation?
- The knapsack CSP revisited:
  - The set of variables is X = {W, S, C, P, Q} for the number of bottles of whisky, bottles of scent, boxes of cigarettes, the profit and its negation
  - The domain is D = {W ↦ {0,…,9}, S ↦ {0,…,9}, C ↦ {0,…,9}, P ↦ {0,…,88}, Q ↦ {-88,…,0}}
  - The set of constraints is C = {4W + 3S + 2C ≤ 19, P = 15W + 10S + 7C, 30 ≤ P, P = -Q}
- So as to minimise the value assigned to Q
- The act of minimising Q maximises P

Maximisation in ECLiPSe

:- use_module(library(fd)).
:- use_module(library(branch_and_bound)).

main(Profit, Goods) :-
    Goods = [Whisky, Scent, Ciggys],
    Goods :: 0..9,
    [Profit] :: 0..88,
    [NegProfit] :: -88..0,
    4*Whisky + 3*Scent + 2*Ciggys #<= 19,
    Profit #= 15*Whisky + 10*Scent + 7*Ciggys,
    30 #<= Profit,
    NegProfit #= -Profit,
    bb_min(labeling(Goods), NegProfit,
           bb_options with [strategy:dichotomic]).

Dichotomic optimisation in ECLiPSe

[eclipse 28]: main(Profit, Goods).
Found a solution with cost -35
Found a solution with cost -63
Found no solution with cost … .. -75.5
Found a solution with cost -70
Found no solution with cost … .. -72.75
Found no solution with cost …
Found no solution with cost …
Profit = 70
Goods = [4, 1, 0]
Yes (0.01s cpu)
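A quick check of the reported optimum (added for clarity; the arithmetic is not on the original slide): Goods = [4, 1, 0] uses 4×4 + 1×3 + 0×2 = 19 of the 19 available units and earns 4×15 + 1×10 + 0×7 = 70 pounds, which matches the reported Profit and respects the 30-pound threshold.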

Modelling and reification Chapter 3: how to coerce a problem into a CSP (CLP)

How to organise your day
- Only one task can be performed at any given moment in time
- In disjunctive scheduling, certain jobs, say tasks 1 and 2, cannot simultaneously use a resource

:- use_module(library(fd)).

main(Begins) :-
    Begins = [Beg_Work, Beg_Mail, Beg_Shop, Beg_Bank],
    Ends = [End_Work, End_Mail, End_Shop, End_Bank],
    Begins :: 9..17,
    Ends :: 9..17,
    End_Work #= Beg_Work + 4,
    End_Mail #= Beg_Mail + 1,
    End_Shop #= Beg_Shop + 2,
    End_Bank #= Beg_Bank + 1,
    End_Bank #=< Beg_Shop,
    End_Mail #=< Beg_Work,
    11 #=< Beg_Work,
    one_thing_at_a_time(Begins, Ends),
    labeling(Begins).

Modelling disjunctive scheduling with ECLiPSe

one_thing_at_a_time([], []).
one_thing_at_a_time([Begin | Begins], [End | Ends]) :-
    one_thing_at_a_time(Begins, Ends, Begin, End),
    one_thing_at_a_time(Begins, Ends).

one_thing_at_a_time([], [], _, _).
one_thing_at_a_time([Begin1 | Begins], [End1 | Ends], Begin2, End2) :-
    non_overlap(Begin1, End1, Begin2, End2),
    one_thing_at_a_time(Begins, Ends, Begin2, End2).

non_overlap(Begin1, End1, Begin2, End2) :-
    (Begin1 #>= End2) #\/ (Begin2 #>= End1).

How to organise your day

[eclipse 10]: main(Begins).
Begins = [13,12,10,9]
More (0.00s cpu) ? ;
Begins = [13,10,11,9]
More (0.00s cpu) ? ;
Begins = [13,9,11,10]
More (0.00s cpu) ? ;
Begins = [11,10,15,9]
More (0.00s cpu) ? ;
Begins = [11,9,15,10]
More (0.00s cpu)

The slide pictures each answer hour by hour from 9:00 (W = work, M = mail, S = shop, B = bank):
BSSMWWWW
BMSSWWWW
MBSSWWWW
BMWWWWSS
MBWWWWSS

Reified constraints
- Instead of merely adding a constraint to the store, like X #< Y say, it is often useful to attach a flag, say B, to the constraint to test and set its truth or falsity

[eclipse 1]: [X, Y, B] :: 0..1, X #< Y #<=> B, X #= 0, Y #= 1.
B #= 1 is posted because X < Y is entailed (it is implied from now on)

[eclipse 2]: [X, Y, B] :: 0..1, X #< Y #<=> B, X #= 1, Y #= 0.
B #= 0 is posted because X < Y is disentailed (it can never be implied from now on)

[eclipse 3]: [X, Y, B] :: 0..1, X #< Y #<=> B, B #= 1.
X #< Y is posted to the store which, in turn, ensures X = 0, Y = 1.

[eclipse 4]: [X, Y, B] :: 0..1, X #=< Y #<=> B, B #= 0.
The negation of X #=< Y is posted which, in turn, ensures X = 1, Y = 0.

[eclipse 5]: [X, Y, B] :: 0..1, X #< Y #<=> B, X #= 0.
Neither B #= 0 nor B #= 1 is posted because X < Y is neither entailed nor disentailed.

Reification for implementing disjunction

non_overlap(Begin1, End1, Begin2, End2) :-
    [B1, B2] :: 0..1,
    Begin1 #>= End2 #<=> B1,
    Begin2 #>= End1 #<=> B2,
    B1 + B2 #>= 1.

'XgreaterthanYimpliesXlessthanZ'(X, Y, Z) :-
    [B1, B2] :: 0..1,
    X #> Y #<=> B1,
    X #< Z #<=> B2,
    B1 #<= B2.
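A hedged illustration of how the reified version propagates (not on the original slide; the concrete figures are chosen just for this example). Fix the first task to run from 9 to 13 and let the second, lasting one hour, start anywhere from 9 to 17:

[eclipse 1]: [Begin2] :: 9..17, [End2] :: 10..18, End2 #= Begin2 + 1,
             non_overlap(9, 13, Begin2, End2).

Since End2 is at least 10, the constraint 9 #>= End2 is disentailed and B1 is set to 0; B1 + B2 #>= 1 then forces B2 = 1, which posts Begin2 #>= 13 and narrows Begin2 to 13..17.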

Reification for counting
- Consider the problem of writing a predicate atmost(Ts, C) which ensures that at most C elements of the list Ts are positive

:- use_module(library(fd)).

atmost(Ts, C) :-
    atmost_aux(Ts, 0, C).

atmost_aux([], Acc, C) :-
    Acc #=< C.
atmost_aux([T | Ts], Acc, C) :-
    [B] :: 0..1,
    Acc1 #= Acc + B,
    0 #< T #<=> B,
    atmost_aux(Ts, Acc1, C).

main(Ts, N) :-
    Ts = [T, _],
    Ts :: -1..1,
    atmost(Ts, N),
    T #= 1.

[eclipse 24]: main(Ts, 2).
Ts = [1, _211{[-1..1]}]
Yes (0.00s cpu)
[eclipse 25]: main(Ts, 1).
Ts = [1, _211{[-1..0]}]
Yes (0.00s cpu)
[eclipse 26]: main(Ts, 0).
No (0.00s cpu)

The tennis tournament problem
- The problem is to schedule the matches of a tennis tournament so as to minimise the total length of the tournament
- Each of n competitors plays each of the other n - 1 competitors, giving n(n - 1)/2 matches in all
- Each match takes an equal amount of time and at most c matches can be contested simultaneously since there are c courts
- No competitor can play two or more others at the same time
- No competitor has back-to-back matches, that is, each competitor has at least one match's worth of rest between matches

How can the tennis tournament be modelled?
- The problem can be modelled by an upper triangular matrix which represents “who plays whom and when”
- A matrix for n = 5 and c = 2 is shown on the slide (not reproduced here)
- For n = 6 (our instance) the problem reduces to labelling a list Ts = [T12, T13, T14, T15, T16, T23, T24, T25, T26, …, T56] where Ts is constrained by Ts :: 1..Max_T
- A predicate one_game_apart([T12, T13, T14, T15, T16]) will ensure that each of these time slots is separated by at least one game

How can we write one_game_apart?

one_game_apart([]).
one_game_apart([H | T]) :-
    one_game_apart(T, H),
    one_game_apart(T).

one_game_apart([], _).
one_game_apart([H | T], D) :-
    % abs(H - D) #> 1,
    (H - D #> 1) #\/ (D - H #> 1),
    one_game_apart(T, D).

Schedule at most c matches simultaneously
- We need a predicate my_atmost(Ts, Time, C), say, which constrains Ts so that no more than C matches are played in a given time slot Time

my_atmost(Ts, Time, C) :-
    atmost_aux(Ts, 0, Time, C).

atmost_aux([], Acc, _, C) :-
    Acc #=< C.
atmost_aux([T | Ts], Acc, Time, C) :-
    [B] :: 0..1,
    Acc1 #= Acc + B,
    Time #= T #<=> B,
    atmost_aux(Ts, Acc1, Time, C).
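A hedged usage check (not on the original slide), with three matches of which two fall in slot 1:

[eclipse 1]: my_atmost([1, 1, 3], 1, 2).
Yes
[eclipse 2]: my_atmost([1, 1, 3], 1, 1).
No

With ground arguments every reified flag is fixed at once, so the accumulated count is simply compared against the court limit C.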

Schedule at most c matches simultaneously
- We need to call my_atmost repeatedly, as in my_atmost(Ts, 1, C), my_atmost(Ts, 2, C), …, my_atmost(Ts, HighestT, C), to cover all the time slots 1, 2, …, HighestT
- To perform minimisation, we need to find the maximum time slot occurring in Ts
- It is sufficient to find an upper bound on the time slots in Ts

courts(0, _, _).
courts(Time, Ts, C) :-
    Time > 0,
    my_atmost(Ts, Time, C),
    Time1 is Time - 1,
    courts(Time1, Ts, C).

bound_time([], _).
bound_time([T | Ts], BoundT) :-
    T #=< BoundT,
    bound_time(Ts, BoundT).

How does it all fit together?

main(Ts, C, BoundT) :-
    HighestT = 29,     % 15 games on one court
    Ts = [T12, T13, T14, T15, T16,
          T23, T24, T25, T26,
          T34, T35, T36,
          T45, T46,
          T56],
    Ts :: 1..HighestT,
    [BoundT] :: 1..HighestT,
    one_game_apart([T12, T13, T14, T15, T16]),
    one_game_apart([T12, T23, T24, T25, T26]),
    one_game_apart([T13, T23, T34, T35, T36]),
    one_game_apart([T14, T24, T34, T45, T46]),
    one_game_apart([T15, T25, T35, T45, T56]),
    one_game_apart([T16, T26, T36, T46, T56]),
    courts(HighestT, Ts, C),
    bound_time(Ts, BoundT),
    append(Ts, [BoundT], AllTs),
    bb_min(labeling(AllTs), BoundT,
           bb_options with [strategy:dichotomic]).

Game schedules for 2 and 3 courts (4 is not better)

[eclipse 28]: main(Ts, 2, BoundT).
Found a solution with cost 13
Found no solution with cost …
Found no solution with cost …
Found a solution with cost 11
Ts = [1, 3, 5, 7, 10, 6, 9, 11, 3, 11, 9, 1, 2, 7, 5]
BoundT = 11
More (9.43s cpu) ?

[eclipse 28]: main(Ts, 3, BoundT).
Found a solution with cost 13
Found no solution with cost …
Found a solution with cost 9
Found no solution with cost …
Ts = [1, 3, 5, 7, 9, 5, 7, 9, 3, 9, 1, 7, 3, 1, 5]
BoundT = 9
More (0.15s cpu) ?
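A quick check of the 3-court schedule (added for clarity, not on the original slide), reading Ts as [T12, T13, T14, T15, T16, T23, T24, T25, T26, T34, T35, T36, T45, T46, T56]:

slot 1: T12, T35, T46
slot 3: T13, T26, T45
slot 5: T14, T23, T56
slot 7: T15, T24, T36
slot 9: T16, T25, T34

Each slot hosts exactly three matches, so three courts suffice, and every player's five matches occupy the slots {1, 3, 5, 7, 9}, so consecutive matches are at least two slots apart; BoundT = 9 is consistent with all the constraints.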