1 Refining the Basic Constraint Propagation Algorithm Christian Bessière and Jean-Charles Régin Presented by Sricharan Modali

2 Outline
- AC3
- Two refinements: AC2000, AC2001
- Experiments
- Analytical comparison of AC2001 & AC6
- Conclusion

3 Introduction
- Importance of constraint propagation
- Propagation scheme of most existing constraint-solving engines: constraint-oriented or variable-oriented
- AC3 is a generic algorithm
- AC4, AC6 & AC7: value-oriented propagation

4 Importance of AC3
- When you know the constraint semantics, use special propagation algorithms (e.g., all-diff, functional)
- When nothing is known about the constraints, use a generic AC algorithm (e.g., AC1, 2, 3, 4, 6 or 7)
- AC3 does not require maintaining specific data structures during search, in contrast to AC4, AC6 & AC7
- ⇒ The authors focus on AC3: generic and lightweight

5 Contribution
- Modify AC3 into AC2000 & AC2001
- More efficient with heavy propagation
- Lightweight data structures
- Variable-oriented
- Dominate AC3 (#CC & CPU time)

6 AC2000 vs. AC3
- AC2000, like AC3, is free of any data structure to be maintained during search (not really: the authors use Δ(Xi) per variable)
- Regarding the human cost of implementation, AC2000 needs 5 more lines than AC3

7 AC2001 vs. AC3
- AC2001 needs an extra data structure: an integer for each value-constraint pair, Last(Xi, vi, Xj)
- Achieves optimal worst-case time complexity
- Human cost (implementation): AC2001 needs management of additional data structures (Δ(Xi), Last(Xi, vi, Xj))

8 Constraint network
- P = (X, D, C)
- X is a set of n variables {X1, …, Xn}
- D is a set of domains {D(X1), …, D(Xn)}
- C is a set of e binary constraints between pairs of variables
- Constraint check: verifying whether or not a given pair of values (vi, vj) is allowed by Cij

9 Arc-consistent value
- vi is an arc-consistent value on Cij: vi ∈ D(Xi) and ∃ vj ∈ D(Xj) | (vi, vj) ∈ Cij
- vj is called a support for (Xi, vi) on Cij
- [figure: value vi of Xi linked to its support vj in D(Xj)]

10 Viable value
- Viable value: vi ∈ D(Xi) is viable ⇔ it has a support in all neighboring D(Xj)
- Arc-consistent CSP: all the values in all the domains are viable
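The paper works at the pseudocode level; as a concrete reading of slides 8-10, here is a minimal Python sketch of a binary constraint network together with the support and viability tests. All names (domains, constraints, has_support, is_viable) and the toy constraint X1 < X2 are illustrative assumptions, not taken from the paper.

```python
# Toy binary constraint network P = (X, D, C): domains as sets of values,
# constraints as predicates over ordered pairs (vi, vj).
domains = {
    "X1": {1, 2, 3},
    "X2": {1, 2, 3},
}

# constraints[(Xi, Xj)](vi, vj) is True iff the pair (vi, vj) is allowed by Cij.
constraints = {
    ("X1", "X2"): lambda v1, v2: v1 < v2,
    ("X2", "X1"): lambda v2, v1: v1 < v2,   # same relation seen from X2's side
}

def has_support(xi, vi, xj):
    """True iff some vj in D(Xj) supports (Xi, vi) on Cij.

    Each evaluation of the predicate counts as one constraint check."""
    check = constraints[(xi, xj)]
    return any(check(vi, vj) for vj in domains[xj])

def is_viable(xi, vi):
    """vi is viable iff it has a support in every neighbouring domain."""
    return all(has_support(xi, vi, xj)
               for (a, xj) in constraints if a == xi)

# Example: 3 has no support in D(X2) on X1 < X2, so it is not arc-consistent.
assert not has_support("X1", 3, "X2")
```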

11 AC3
- A variable-oriented propagation scheme
- Difference with [Mackworth 77]: instead of handling a queue of the constraints to be propagated, it has a queue of the variables whose domain has been modified
- This AC3 terminates whenever any domain is empty

12 [figure-only slide; content not included in the transcript]
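Since the pseudocode slide is missing from the transcript, here is a sketch of the variable-oriented AC3 scheme described on slide 11, reusing the toy domains, constraints and has_support from the sketch above; the function names (ac3, revise3) and the queue-handling details are assumptions, not the authors' exact code.

```python
from collections import deque

def revise3(xi, xj):
    """Remove from D(Xi) every value without a support in D(Xj).

    Returns True iff D(Xi) was modified."""
    removed = {vi for vi in domains[xi] if not has_support(xi, vi, xj)}
    domains[xi] -= removed
    return bool(removed)

def ac3():
    """Variable-oriented propagation: Q holds variables whose domain changed."""
    Q = deque(domains)                       # initially every variable counts as modified
    while Q:
        xj = Q.popleft()
        # Revise every constraint Cij whose second variable is Xj.
        for (xi, other) in constraints:
            if other != xj:
                continue
            if revise3(xi, xj):
                if not domains[xi]:          # domain wipe-out: arc inconsistency detected
                    return False
                if xi not in Q:              # re-propagate Xi's deletions
                    Q.append(xi)
    return True
```

Slide 14's complaint is visible here: whenever Xj is re-enqueued, revise3 re-scans every remaining value of Xi, even those whose supports were untouched.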

13 A (bad) example for AC3

14 AC3 overdoes it
- Revise3(Xj, Xi) removes vj from D(Xj)
- AC3 puts Xj in Q
- Propagate3 calls Revise3(Xi, Xj) for every constraint Cij involving Xj
- Revise3(Xi, Xj) will look for a support for every value in D(Xi), even when vj was not a support!

15 Enhancement in AC2000
Instead of blindly looking for a support for a value vi ∈ D(Xi) each time D(Xj) is modified, it is done only when needed.

16 AC2000
- In addition to Q, Δ(Xj) is used
- Δ(Xj) contains the values removed from D(Xj) since the last propagation of Xj
- When calling Revise2000(Xi, Xj, t), a check is made to see if vi has a support in Δ(Xj)

17 Example

18 How AC2000 operates
- The larger Δ(Xj), the closer it gets in size to D(Xj):
  - the more expensive the process is
  - the more likely it is for vi to have a support in Δ(Xj)
- Hence lazymode is used only when |Δ(Xj)| is sufficiently smaller than |D(Xj)|
- Use of lazymode is controlled with Ratio:
  - |Δ(Xj)| / |D(Xj)| < Ratio: use lazymode
  - |Δ(Xj)| / |D(Xj)| ≥ Ratio: do not use lazymode

19 [figure-only slide; content not included in the transcript]
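The AC2000 pseudocode slide is also missing from the transcript. The sketch below illustrates the idea of slides 15-18 on the same toy network: keep a set Δ(Xj) of values deleted since Xj was last propagated, and look for new supports only for the values that actually lost one, switching off lazymode when Δ(Xj) grows too large. RATIO, delta and revise2000 are assumed names, and the surrounding propagation loop (as in the ac3 sketch) is expected to clear delta[xj] once Xj has been propagated.

```python
RATIO = 0.2                                   # threshold used in the paper's experiments

# delta[Xj]: values removed from D(Xj) since the last propagation of Xj.
delta = {x: set() for x in domains}

def revise2000(xi, xj):
    """Revise D(Xi) against D(Xj), skipping values whose supports are intact."""
    check = constraints[(xi, xj)]
    lazymode = len(delta[xj]) < RATIO * len(domains[xj])
    removed = set()
    for vi in set(domains[xi]):
        if lazymode and not any(check(vi, vj) for vj in delta[xj]):
            # No deleted value of Xj supported vi, so vi kept all its supports: skip it.
            continue
        if not has_support(xi, vi, xj):
            removed.add(vi)
    domains[xi] -= removed
    delta[xi] |= removed                      # remember deletions for Xi's next propagation
    return bool(removed)
```

When lazymode is off (Δ(Xj) no longer small relative to D(Xj)), the function degenerates into a plain Revise3-style full scan, which is exactly the trade-off slide 18 describes.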

20 Analysis of AC2000
- Assumption: AC3 is correct
- To prove: the lazymode of AC2000 does not leave arc-inconsistent values in the domains
- The only way the search for a support for a value vi in D(Xi) is skipped is when vi is not supported by any value in Δ(Xj)
- Δ(Xj) contains all the values deleted from D(Xj) since its last propagation ⇒ vi has exactly the same set of supports as before on Cij
- Looking again for a support for vi is useless, as it remains consistent with Cij

21 Space complexity of AC2000
- It is bounded by the sizes of Q and Δ
- Q is O(n), Δ is O(nd), where d is the size of the largest domain
- Overall space complexity: O(nd)

22 Time complexity of AC2000
- The main change is in Revise2000, where both Δ(Xj) and D(Xj) are examined instead of only D(Xj)
- This leads to a worst case where d² checks are performed in Revise2000
- Hence the overall time complexity is O(ed³), since Revise2000 can be called d times per constraint

23 AC2000 overdoes it too…
- In AC2000 we still have to look again for a support for vi on Cij
- If we can remember the support found for vi in D(Xj) the last time Cij was revised,
- then next time we only need to check whether or not this last support belongs to Δ(Xj)

24 AC2001 saves more…
- A new data structure, Last(Xi, vi, Xj), is used to store the value that supports vi
- The function Revise2001 always runs in lazymode, except during the initialization phase
- Further, when supports are checked in a given ordering “<d” (i.e., sorted), we know that there is no support for vi before Last(Xi, vi, Xj) in D(Xj)

25 Example

26 [figure-only slide; content not included in the transcript]
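As above, the AC2001 pseudocode is not in the transcript; this is an illustrative sketch of the mechanism of slides 23-24 on the same toy network. last, revise2001 and the use of sorted() to realise the ordering <d are assumptions; the real algorithm equivalently tests whether the recorded support belongs to Δ(Xj) rather than whether it is still in D(Xj).

```python
# last[(Xi, vi, Xj)]: the support found for vi in D(Xj) the last time Cij was revised.
last = {}

def revise2001(xi, xj):
    """Resume the search for a support of vi just after Last(Xi, vi, Xj)."""
    check = constraints[(xi, xj)]
    dj_sorted = sorted(domains[xj])           # supports are scanned in a fixed ordering <d
    removed = set()
    for vi in set(domains[xi]):
        prev = last.get((xi, vi, xj))
        if prev is not None and prev in domains[xj]:
            continue                          # recorded support still alive: nothing to do
        # No support can exist before prev, so search strictly after it.
        for vj in dj_sorted:
            if prev is not None and vj <= prev:
                continue
            if check(vi, vj):
                last[(xi, vi, xj)] = vj       # remember the new support for the next revision
                break
        else:
            removed.add(vi)                   # no support left: vi is not arc-consistent
    domains[xi] -= removed
    return bool(removed)
```

Because each pointer Last(Xi, vi, Xj) only moves forward in D(Xj), the total search effort per value-constraint pair is bounded by d over the whole run, which is where the O(ed²) bound of slide 28 comes from.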

27 Space complexity of AC2001
- Bounded above by the sizes of Q, Δ and Last
- Q is in O(n), Δ is in O(nd), but Last is in O(ed), since each value vi has a Last pointer for each constraint involving Xi
- This gives an overall space complexity of O(ed)

28 Time complexity of AC2001
- As in AC3 & AC2000, Revise2001 can be called d times per constraint
- But at each call to Revise2001(Xi, Xj, t), for each value vi ∈ D(Xi):
  - there is a test on Last(Xi, vi, Xj)
  - and a search for a support in D(Xj) greater than Last(Xi, vi, Xj)
- The overall time complexity is then bounded above by d(d + d)·2e, which is in O(ed²)
- O(ed²) is optimal
- AC2001 is the first optimal arc-consistency algorithm proposed in the literature that is free of any lists of supported values
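One way to unpack the bound quoted on this slide, as a sketch of the counting argument rather than the paper's exact proof:

```latex
% For one direction (X_i, X_j) of a constraint (there are 2e directions):
%   - at most d calls to Revise2001, each testing Last(X_i, v_i, X_j)
%     for at most d values v_i  ->  at most d \cdot d tests;
%   - for a fixed v_i, the searches for a new support only move forward
%     in the ordering <_d, so over the whole run they scan D(X_j) at most
%     once  ->  at most d \cdot d constraint checks.
\underbrace{d\cdot d}_{\text{tests on Last}}
  + \underbrace{d\cdot d}_{\text{forward searches}}
  = d(d+d),
\qquad
\text{total} \;\le\; d(d+d)\cdot 2e \;=\; 4ed^{2} \;\in\; O(ed^{2}).
```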

29 Experiments
- To see whether AC2000 and AC2001 are effective vs. AC3, compare #CC & CPU time
- Context: pre-processing & search (MAC)
- The goal is not to compete with AC6/AC7
- An improvement (even a small one) w.r.t. AC3 is significant

30 AC as a preprocessing step
- The chance of having some propagation is very small on real instances
- Hence only one real-world instance is considered
- The other instances are randomly generated to fall in the phase-transition region
- A Ratio of 0.2 is taken (no justification given)

31 Parameters
- N: number of variables
- D: size of the domains
- C: number of constraints
- p1: density of constraints, 2C / (N(N-1))
- T: number of forbidden tuples
- p2: tightness of the forbidden tuples, T / D²
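A small sketch of how these two parameters are computed; the concrete numbers below are hypothetical, chosen only to land near the values used on the next slide, and are not the paper's instances.

```python
def density(N, C):
    """p1 = 2C / (N(N-1)): fraction of variable pairs that are constrained."""
    return 2 * C / (N * (N - 1))

def tightness(D, T):
    """p2 = T / D^2: fraction of value pairs forbidden by a constraint."""
    return T / (D ** 2)

print(round(density(N=50, C=55), 3))   # 0.045: a sparse (low-density) problem
print(tightness(D=10, T=94))           # 0.94: a very tight constraint
```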

32 Results
- Low density (p1 = 0.045):
  - Instance 1: under-constrained (p2 = 0.5)
  - Instance 2: over-constrained (p2 = 0.94)
- High tightness (p2 = 0.918, 0.875):
  - Instance 3: sparse (p1 = 0.045)
  - Instance 4: dense (p1 = 1.0)

33 Observation

34 Maintaining arc consistency during search
- MAC-3, MAC-2000, MAC-2001
- Experiments carried out over all the instances in the FullRLFAP archive for which more than 2 seconds are necessary to find a solution or to prove that none exists
- Ratio is again 0.2 (no justification given)

35 Results

36 Observations
- There is a slight gain of MAC2000 over MAC3, except for SCEN#11
- On SCEN#11, MAC2000 outperforms MAC3 for a Ratio of 0.1
- MAC2001 outperforms MAC3, with 9 times fewer CC and 2 times less CPU time

37 Restrictions
- The comparison is between algorithms with simple data structures
- Note that to solve SCEN#11:
  - MAC-6 (MAC + AC6): 14.69 sec
  - MAC3 needs … sec
  - MAC2000 needs … sec
  - MAC2001 needs … sec

38 AC2001 vs. AC6
- The time and space complexities of AC2001 are equal to those of AC6
- What are the differences between AC6 and AC2001?
- Property 1: same #CC!
- Property 2: the difference is in the effort of maintaining the specific data structures
- The authors give conditions for which algorithm wins when

39 Conclusion
- Two refinements to AC3: AC2000 & AC2001
- AC2000 improves slightly over AC3, without maintenance of any new data structure
- AC2001 needs an additional data structure, Last
- AC2001 achieves optimal worst-case time complexity

40 Thanks