
1 Constraint Satisfaction Problems

2 Contents: Representations; Solving with Tree Search and Heuristics; Constraint Propagation; Tree Clustering

3 Posing a CSP A set of variables V_1, …, V_n; a domain D_1, …, D_n over each variable; and a set of constraint relations C_1, …, C_m between variables that indicate the permitted combinations of values. The goal is to find an assignment to each variable such that none of the constraints are violated.

4 Constraint Graphs Nodes = variables, edges = constraints. Example: map coloring of regions A, B, C, D (the slide shows the map and its constraint graph).

5 N-ary Constraint Graphs Example: Variables: X=[1,2], Y=[3,4], Z=[5,6]; Constraint: X + Y = Z (Roman Barták, 1998). The slide draws this both as a hypergraph and as a primal constraint graph over X, Y, Z.

6 Making a Binary CSP We can convert an n-ary constraint C into a unary constraint on a new variable V_c, whose domain D_c is the Cartesian product of the domains of the variables in C. We can convert an n-ary CSP into a binary CSP: create a variable V_c for each constraint C (as above); set its domain D_c to the Cartesian product minus the tuples that violate C; and add binary equivalence constraints between new variables V_c, V_c': if C and C' share a variable X, then V_c and V_c' must agree on X.
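The conversion above is easy to sketch in code. Below is a minimal, illustrative Python sketch (the function and variable names are my own, not from the slides): it builds the dual variable for each n-ary constraint by enumerating the Cartesian product of the scope's domains and keeping only the satisfying tuples, then records an agreement (equivalence) constraint between every pair of dual variables that share an original variable.

from itertools import product

def dual_encode(domains, constraints):
    # domains:     dict  var -> list of values
    # constraints: dict  name -> (scope tuple, predicate over a value tuple)
    # Returns the dual domains (satisfying tuples only) and the binary
    # agreement constraints between dual variables that share a variable.
    dual_domains = {}
    for name, (scope, pred) in constraints.items():
        # Cartesian product of the scope's domains, minus the violating tuples
        dual_domains[name] = [t for t in product(*(domains[v] for v in scope))
                              if pred(*t)]
    agreements = []
    names = list(constraints)
    for i, c1 in enumerate(names):
        for c2 in names[i + 1:]:
            for x in set(constraints[c1][0]) & set(constraints[c2][0]):
                # the tuples chosen for c1 and c2 must agree on variable x
                agreements.append((c1, c2,
                                   constraints[c1][0].index(x),
                                   constraints[c2][0].index(x)))
    return dual_domains, agreements

# The example from the next two slides: X + Y = Z with X=[1,2], Y=[3,4], Z=[5,6]
doms = {"X": [1, 2], "Y": [3, 4], "Z": [5, 6]}
cons = {"XYZ": (("X", "Y", "Z"), lambda x, y, z: x + y == z)}
print(dual_encode(doms, cons)[0]["XYZ"])   # [(1, 4, 5), (2, 3, 5), (2, 4, 6)]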

7 Making a Unary Constraint Variables: X=[1,2], Y=[3,4], Z=[5,6]; Constraint: X + Y = Z (Roman Barták, 1998). Introduce a single new variable XYZ whose initial domain is the Cartesian product of the original domains: XYZ = [(1,3,5),(1,3,6),(1,4,5),(1,4,6),(2,3,5),(2,3,6),(2,4,5),(2,4,6)].

8 Making a Unary Constraint Removing the tuples that violate X + Y = Z leaves XYZ = [(1,4,5),(2,3,5),(2,4,6)].

9 Making a Binary CSP Variables: X=[1,2], Y=[3,4], Z=[5,6], W=[1,3]; Constraints: X + Y = Z, W …

10 Making a Binary CSP (continuation of the same example on the slide)

11 Contents: Representations; Solving with Tree Search and Heuristics; Constraint Propagation; Tree Clustering

12 Generate and Test Generate each possible assignment to the variables and test if the constraints are satisfied. Exponentially many possibilities: O(d^n). Simple but extremely wasteful!
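As a point of comparison, here is a minimal generate-and-test sketch in Python (illustrative names, not from the slides): it enumerates every complete assignment and only then checks the constraints, which is exactly the O(d^n) behaviour the slide warns about.

from itertools import product

def generate_and_test(domains, constraints):
    # domains:     dict var -> list of values
    # constraints: list of (scope, predicate) pairs
    # Returns the first satisfying assignment, or None.
    variables = list(domains)
    for values in product(*(domains[v] for v in variables)):   # O(d^n) tuples
        assignment = dict(zip(variables, values))
        if all(pred(*(assignment[v] for v in scope))
               for scope, pred in constraints):
            return assignment
    return None

# Tiny example: the X + Y = Z CSP from the earlier slides
doms = {"X": [1, 2], "Y": [3, 4], "Z": [5, 6]}
cons = [(("X", "Y", "Z"), lambda x, y, z: x + y == z)]
print(generate_and_test(doms, cons))   # {'X': 1, 'Y': 4, 'Z': 5}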

13 DFS and Backtracking Depth-first search: levels represent variables, and the branches off a node represent the possible instantiations of that node's variable. Test against the constraints after every variable instantiation and backtrack on a violation. This incrementally attempts to extend a partial solution, and whole subtrees are eliminated at once.

14 Example A backtracking trace on three variables V1, V2, V3 with domains (from the figure) V1 = {red, green, blue}, V2 = {red}, V3 = {red, green}. Start: (*,*,*)

15 Example (continued) (r,*,*)

16 Example (continued) (r,r,*)

17 Example (continued) (g,*,*), (g,r,*)

18 Example (continued) (g,r,r), (g,r,g)

19 Example (continued) (b,*,*), (b,r,*), (b,r,r), (b,r,g)

20 Forward Checking Backtracking is still wasteful: a lot of time is spent searching in areas where no solution remains. For example, setting V_4 to value X_1 eliminates all possible values for V_8 under the given constraints; this can cause thrashing. Forward checking removes restricted values from the domains of all uninstantiated variables; if a domain becomes empty, backtracking is done immediately.

21 Heuristics The search can usually be sped up by searching intelligently: Most-constrained variable: expand first the variable with the fewest possible values remaining in its domain. Most-constraining variable: expand first the variable that most restricts the others. Least-constraining value: choose the values that leave the most options open for the remaining variables.
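These three heuristics are usually realised as small ordering functions. A minimal Python sketch (illustrative; it assumes a dict of current domains, a neighbours map built from the binary constraints, and a conflicts predicate):

def most_constrained_variable(unassigned, domains):
    # MRV: pick the variable with the fewest remaining values
    return min(unassigned, key=lambda v: len(domains[v]))

def most_constraining_variable(unassigned, neighbours):
    # Degree heuristic: pick the variable that constrains the most others
    return max(unassigned, key=lambda v: sum(1 for n in neighbours[v]
                                             if n in unassigned))

def least_constraining_values(var, domains, neighbours, conflicts):
    # Order var's values by how few neighbour values they rule out.
    # conflicts(var, val, n, nval) -> True if that pair of values is forbidden.
    def ruled_out(val):
        return sum(1 for n in neighbours[var] for nval in domains[n]
                   if conflicts(var, val, n, nval))
    return sorted(domains[var], key=ruled_out)

In practice the two variable heuristics are combined (most-constrained first, ties broken by most-constraining), and the value heuristic is applied when expanding the chosen variable.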

22 Contents: Representations; Solving with Tree Search and Heuristics; Constraint Propagation; Tree Clustering

23 Constraint Propagation A preprocessing step to shrink the CSP: the constraints are used to gradually narrow down the possible values in the domains of the variables. A singleton may result: if the domain of every variable contains a single value, we do not need to search at all.

24 Arc Consistency An arc (V_i, V_j) in a constraint graph is arc consistent if for every value of V_i there is some value that is permitted for V_j. Algorithm (complexity O(ed^3)):
do
  for each edge (i, j): delete the values from D_i that cause Arc(V_i, V_j) to fail
while deletions were made
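A compact Python sketch of this propagation loop (an AC-3-style queue version rather than the literal repeat-until-no-deletions loop above; the names and the allowed predicate are illustrative). Reading the domains of the following example off its figure as V1 = {green}, V2 = {red, green, blue}, V3 = {green, blue} with inequality constraints, it reproduces the singleton result shown on the slides:

from collections import deque

def arc_consistency(domains, neighbours, allowed):
    # Prune domains until every arc (i, j) is consistent.
    # allowed(i, vi, j, vj) -> True if Vi=vi with Vj=vj satisfies the constraint.
    # Returns False if some domain becomes empty.
    queue = deque((i, j) for i in neighbours for j in neighbours[i])
    while queue:
        i, j = queue.popleft()
        # remove the values of Vi that have no support in Dj
        removed = [vi for vi in domains[i]
                   if not any(allowed(i, vi, j, vj) for vj in domains[j])]
        if removed:
            domains[i] = [v for v in domains[i] if v not in removed]
            if not domains[i]:
                return False
            # revisit arcs pointing at Vi, since its domain shrank
            queue.extend((k, i) for k in neighbours[i] if k != j)
    return True

doms = {1: ["green"], 2: ["red", "green", "blue"], 3: ["green", "blue"]}
nbrs = {1: [2, 3], 2: [1, 3], 3: [1, 2]}
arc_consistency(doms, nbrs, lambda i, vi, j, vj: vi != vj)
print(doms)   # {1: ['green'], 2: ['red'], 3: ['blue']}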

25 Example Arc consistency on three variables with domains (from the figure) V1 = {green}, V2 = {red, green, blue}, V3 = {green, blue}, with an inequality constraint between every pair. Consider edge (1,3): nothing is removed.

26 Example Consider edge (3,1): green is removed from D3, leaving {blue}.

27 Example Consider edge (2,1): green is removed from D2.

28 Example Consider edge (2,3): blue is removed from D2, leaving {red}.

29 Example Consistent and a singleton! V1 = green, V2 = red, V3 = blue.

30 Levels of Consistency The algorithms we have seen are combinations of tree search (TS) and arc consistency (AC) (Nadel, 1988):
Generate and Test: TS
Backtracking: BT = TS + AC 1/5
Forward Checking: FC = TS + AC 1/4
Partial Lookahead: PL = FC + AC 1/3
Full Lookahead: FL = FC + AC 1/2
Really Full Lookahead: RFL = FC + AC

31 Backtracking Given:
– check(i, X_i, j, X_j): true if V_i = X_i together with V_j = X_j is permitted by the constraints
– revise(i, j): true if D_i is empty after making Arc(V_i, V_j) consistent

Direct formulation:

function BT(i, var)
  for var[i] = D_i              % try each value for the i-th variable
    CONSISTENT = true
    for j = 1:i-1               % check against all earlier variables
      CONSISTENT = CONSISTENT && check(i, var[i], j, var[j])
    end
    if CONSISTENT
      if i == n, disp(var), else BT(i+1, var), end
    end
  end

Formulation in terms of revise:

function BT(i)
  EMPTY_DOMAIN = check_backward(i)
  if ~EMPTY_DOMAIN
    for var[i] = D_i
      D_i = var[i]              % restrict D_i to the chosen value
      if i == n, disp(var), else BT(i+1), end
    end
  end

function check_backward(i)
  for j = 1:i-1
    if revise(i, j), return true, end
  end
  return false

32 Forward Checking Similar to backtracking, except with more arc consistency:

function FC(i)
  EMPTY_DOMAIN = check_forward(i)
  if ~EMPTY_DOMAIN
    for var[i] = D_i
      D_i = var[i]              % restrict D_i to the chosen value
      if i == n, disp(var), else FC(i+1), end
    end
  end

function check_forward(i)
  if i > 1
    for j = i:n                 % prune the future domains against the
      if revise(j, i-1), return true, end   % variable assigned at level i-1
    end
  end
  return false

33 Levels of Consistency (Nadel, 1988)
Generate and Test: TS
Backtracking: BT = TS + AC 1/5
Forward Checking: FC = TS + AC 1/4
Partial Lookahead: PL = FC + AC 1/3
Full Lookahead: FL = FC + AC 1/2
Really Full Lookahead: RFL = FC + AC

34 A Stronger Degree of Consistency A graph is K-consistent if, whenever we choose values for any K-1 variables that satisfy all the constraints among them, we can assign any K-th variable a value that satisfies the constraints. A graph is strongly K-consistent if it is J-consistent for all J ≤ K. Node consistency is equivalent to strong 1-consistency; arc consistency is equivalent to strong 2-consistency.

35 Towards Backtrack Free Search A graph that has strong n-consistency requires no search, but acquiring strong n-consistency is exponential in the number of variables (Cooper, 1989). For a general graph that is strongly k-consistent (where k < n), backtracking cannot be avoided.

36 Example Arc consistent, yet a search will backtrack! (Figure: a three-variable coloring problem over V1, V2, V3; the search trace begins (*,*,*), (r,*,*), (r,r,*), …)

37 Constraint Graph Width The nodes of a constraint graph can be ordered (the slide lists all orderings of a three-node graph V1, V2, V3). The width of a node in an ordered graph is the number of incoming arcs from nodes higher in the ordering. The width of an ordered graph is the maximum width of its vertices. The width of a constraint graph is the minimum width over all of its orderings; for the example graph this is 1.
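These three definitions translate almost directly into code. A small illustrative Python sketch (assuming the constraint graph is given as an adjacency dict; the brute-force minimum over orderings is only meant for tiny graphs):

from itertools import permutations

def ordering_width(order, adj):
    # Width of an ordered graph: the maximum, over nodes, of the number of
    # neighbours that appear earlier in the ordering.
    pos = {v: i for i, v in enumerate(order)}
    return max(sum(1 for n in adj[v] if pos[n] < pos[v]) for v in order)

def graph_width(adj):
    # Width of a constraint graph: the minimum width over all orderings.
    return min(ordering_width(order, adj) for order in permutations(adj))

# A three-node chain V1 - V2 - V3 has width 1
chain = {"V1": ["V2"], "V2": ["V1", "V3"], "V3": ["V2"]}
print(graph_width(chain))   # 1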

38 Backtrack Free Search Theorem: if a constraint graph is strongly K-consistent, and K is greater than its width, then there exists a search order that is backtrack free. K>2 consistency algorithms add arcs requiring even greater consistency. If a graph has width 1 we can use node and arc consistency to get strong 2-consistency without adding arcs. All tree-structured constraint graphs have width 1 (Freuder 1988).
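For the width-1 (tree) case in the last point, the backtrack-free procedure is simple enough to sketch. An illustrative Python version (it assumes the tree-structured binary CSP is given as parent links plus an allowed predicate): make every parent arc-consistent with respect to its children from the leaves upward, then assign values from the root down; no backtracking is ever needed.

def solve_tree_csp(order, parent, domains, allowed):
    # order:   variables listed root-first (each child after its parent)
    # parent:  dict mapping each non-root variable to its parent
    # allowed(p, vp, c, vc) -> True if that pair of values satisfies the constraint
    # Pass 1, leaves towards the root: prune parent values with no support.
    for child in reversed(order[1:]):
        p = parent[child]
        domains[p] = [vp for vp in domains[p]
                      if any(allowed(p, vp, child, vc) for vc in domains[child])]
        if not domains[p]:
            return None                      # unsatisfiable
    # Pass 2, root towards the leaves: pick any supported value; by the
    # width-1 theorem this never fails, so no backtracking is needed.
    assignment = {order[0]: domains[order[0]][0]}
    for child in order[1:]:
        p = parent[child]
        assignment[child] = next(vc for vc in domains[child]
                                 if allowed(p, assignment[p], child, vc))
    return assignment

# Small example: a chain V1 - V2 - V3 that must be properly coloured
doms = {"V1": ["red", "green"], "V2": ["red"], "V3": ["red", "green"]}
par = {"V2": "V1", "V3": "V2"}
print(solve_tree_csp(["V1", "V2", "V3"], par, doms,
                     lambda p, vp, c, vc: vp != vc))
# {'V1': 'green', 'V2': 'red', 'V3': 'green'}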

39 Contents: Representations; Solving with Tree Search and Heuristics; Constraint Propagation; Tree Clustering

40 Tree Clustering Motivation Tree-structured constraint graphs can be solved without backtracking, so we would like to turn non-tree graphs into trees by grouping variables. The grouped variables themselves become smaller CSPs. Solving a CSP is exponential in the worst case, so reducing the number of variables we consider at once is also important. If we want to query the CSP many times it is worth investing more time in restructuring it (Dechter, 1988).

41 Redundancy Constraints in the dual graph are equalities on shared variables. Variables: A, B, C, D, E, F; constraint scopes: (ABC), (AEF), (CDE), (ACE). (Figure: the dual graph over nodes ABC, AEF, CDE, ACE with edges labelled by the shared variables, and the reduced join graph/tree in which redundant edges have been removed.)

42 Tree Clustering If the dual graph cannot be reduced to a join tree we can still make it acyclic. Condition for acyclicity: a CSP is acyclic iff its primal graph is chordal and conformal. Given a primal graph, its dual can be made acyclic: triangulate the graph to make it chordal; the maximal cliques then become the constraints/nodes of the new dual graph (Beeri, 1983).

43 Triangulation Use maximum cardinality search (m-ordering) to order the nodes. Add an edge between any two nonadjacent nodes that are connected through nodes higher in the ordering (Tarjan, 1984).
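Both steps can be sketched directly in Python (illustrative; adj is an adjacency dict of sets, and the fill-in step below is the simple variant that walks the ordering from last to first and connects each node's earlier neighbours):

def max_cardinality_search(adj):
    # Repeatedly pick the unnumbered node with the most already-numbered neighbours.
    order, numbered = [], set()
    while len(order) < len(adj):
        v = max((u for u in adj if u not in numbered),
                key=lambda u: len(adj[u] & numbered))
        order.append(v)
        numbered.add(v)
    return order

def triangulate(adj, order):
    # Fill-in: for each node, taken from last to first, connect every pair of
    # its neighbours that come earlier in the ordering.
    adj = {v: set(ns) for v, ns in adj.items()}     # work on a copy
    pos = {v: i for i, v in enumerate(order)}
    for v in reversed(order):
        earlier = [u for u in adj[v] if pos[u] < pos[v]]
        for i, a in enumerate(earlier):
            for b in earlier[i + 1:]:
                adj[a].add(b)
                adj[b].add(a)
    return adj

# The cyclic example from the next slides: constraints (A,C), (A,D), (B,D), (C,E), (D,E)
g = {"A": {"C", "D"}, "B": {"D"}, "C": {"A", "E"},
     "D": {"A", "B", "E"}, "E": {"C", "D"}}
filled = triangulate(g, ["E", "D", "C", "A", "B"])   # the m-ordering used on the slides
print(sorted((a, b) for a in filled for b in filled[a] if a < b))
# the single fill edge C-D is added, giving maximal cliques ACD, BD, CDE

Note that maximum cardinality search breaks ties arbitrarily, so max_cardinality_search(g) may return a different (but equally valid) m-ordering than the one shown on the slides.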

44 The Algorithm Build the primal graph for the CSP: O(n^2). Triangulate: O(n^2). Use the maximal cliques as the nodes of the new dual graph: O(n). Remove any redundancies in the new graph: O(n).

45 Example Variables: A, B, C, D, E; constraints: (A,C), (A,D), (B,D), (C,E), (D,E). (Figure: the dual graph over nodes AC, AD, BD, CE, DE.) Even after removing redundant edges it is still cyclic!

46 Example The primal graph over A, B, C, D, E is ordered by maximum cardinality search: Order: E, D, C, A, B. (Figure: the primal graph and its m-ordering.)

47 Example Triangulating adds the fill edge between C and D; the maximal cliques ACD, BD, and CDE become the nodes of the new dual graph, and after removing redundant edges the result is acyclic!

48 Solving the CSP Solve each node of the tree as a separate small CSP; this can be done in parallel, and the solutions to each node constitute the domain of that node in the tree: O(d^m). Use arc consistency to reduce the domains of each node. Then solve the entire CSP without backtracking.
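Putting the pieces together, here is an illustrative Python sketch of this solving scheme (the names and input format are assumptions: clusters are the nodes of the join tree, listed root-first with parent links, and every original constraint falls inside some cluster):

from itertools import product

def solve_clustered_csp(clusters, parent, domains, constraints):
    # clusters:    dict name -> tuple of variables (nodes of the join tree)
    # parent:      dict child-cluster -> parent-cluster (root absent)
    # constraints: list of (scope, predicate), each contained in one cluster
    order = list(clusters)                              # root-first order
    # 1. Solve each cluster as its own small CSP: its solutions become the
    #    cluster's "domain" in the tree, O(d^m) per cluster.
    sols = {}
    for name, scope in clusters.items():
        local = [(s, p) for s, p in constraints if set(s) <= set(scope)]
        sols[name] = []
        for vals in product(*(domains[v] for v in scope)):
            t = dict(zip(scope, vals))
            if all(p(*(t[v] for v in s)) for s, p in local):
                sols[name].append(t)
    # 2. Directional arc consistency, leaves -> root: keep only parent tuples
    #    that agree with some child tuple on the shared variables.
    for child in reversed(order[1:]):
        p = parent[child]
        shared = set(clusters[child]) & set(clusters[p])
        sols[p] = [t for t in sols[p]
                   if any(all(t[v] == c[v] for v in shared) for c in sols[child])]
        if not sols[p]:
            return None
    # 3. Assign the clusters root -> leaves without backtracking.
    assignment = dict(sols[order[0]][0])
    for child in order[1:]:
        shared = set(clusters[child]) & set(clusters[parent[child]])
        assignment.update(next(c for c in sols[child]
                               if all(assignment[v] == c[v] for v in shared)))
    return assignment

# The clusters from the triangulation example, with two-valued domains and
# "not equal" constraints on the original binary edges.
doms = {v: [1, 2] for v in "ABCDE"}
ne = lambda x, y: x != y
cons = [(("A", "C"), ne), (("A", "D"), ne), (("B", "D"), ne),
        (("C", "E"), ne), (("D", "E"), ne)]
clusters = {"ACD": ("A", "C", "D"), "CDE": ("C", "D", "E"), "BD": ("B", "D")}
print(solve_clustered_csp(clusters, {"CDE": "ACD", "BD": "ACD"}, doms, cons))
# e.g. {'A': 1, 'C': 2, 'D': 2, 'E': 1, 'B': 1}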

49 Appendix

50 Example CSPs N-queens, map coloring, cryptarithmetic, wireless network base station placement, object recognition from image features.

51 Heuristic Repair Start with a random instantiation of the variables; choose a variable and reassign it so that the fewest constraints are violated. Repeat this some number of times; if constraints are still violated, restart with a new random instantiation. Similar to GSAT.
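This is essentially the min-conflicts strategy. An illustrative Python sketch (the names and the conflicts interface are assumptions) with random restarts:

import random

def heuristic_repair(domains, conflicts, max_steps=1000, max_restarts=10):
    # conflicts(assignment, var, val) -> number of constraints violated
    # if `var` were set to `val` under the current assignment.
    variables = list(domains)
    for _ in range(max_restarts):
        # start from a random instantiation of the variables
        assignment = {v: random.choice(domains[v]) for v in variables}
        for _ in range(max_steps):
            violated = [v for v in variables
                        if conflicts(assignment, v, assignment[v]) > 0]
            if not violated:
                return assignment                     # all constraints satisfied
            v = random.choice(violated)
            # reassign v to the value that violates the fewest constraints
            assignment[v] = min(domains[v],
                                key=lambda val: conflicts(assignment, v, val))
    return None                                       # give up after the restarts

The restarts play the same role as GSAT's restarts: they let the repair loop escape local minima where every single-variable change still leaves some constraint violated.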

52 Graham's Algorithm Given a dual constraint graph: if V_i is a variable that appears in exactly one node, remove V_i; if the variables in node N_i are a subset of the variables in another node N_j, remove N_i. Repeat until neither rule applies. The graph is acyclic if the result is the empty set. Easy to verify (Graham, 1979).
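A direct, illustrative Python transcription of the two reduction rules (this procedure is also known as the GYO reduction; nodes is a list of variable sets, one per node of the dual graph):

def grahams_algorithm(nodes):
    # Repeatedly apply the two rules; return True if the dual constraint
    # graph reduces to the empty set, i.e. it is acyclic.
    nodes = [set(n) for n in nodes]
    changed = True
    while changed:
        changed = False
        # Rule 1: remove variables that appear in exactly one node
        for n in nodes:
            unique = {v for v in n if sum(1 for m in nodes if v in m) == 1}
            if unique:
                n -= unique
                changed = True
        # Rule 2: remove a node whose variables are a subset of another node's
        for i, n in enumerate(nodes):
            if any(i != j and n <= m for j, m in enumerate(nodes)):
                del nodes[i]
                changed = True
                break
        nodes = [n for n in nodes if n]        # drop nodes emptied by rule 1
    return not nodes

# The join-tree example from slide 41 collapses; a cycle of three nodes does not.
print(grahams_algorithm([{"A","B","C"}, {"A","E","F"}, {"C","D","E"}, {"A","C","E"}]))  # True
print(grahams_algorithm([{"A","B"}, {"B","C"}, {"A","C"}]))                             # False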

53 Graham's Algorithm Trees collapse to the empty set: edges in the dual graph are constraints on common variables, and nodes in a tree share variables only with their parents. Step 1 removes any unique variables from the children; after step 1 the children are removed by step 2, since they are now subsets of their parent.

54 Graham's Algorithm Cycles don't collapse: step 1 fails, since cycles are formed among nodes whose variables are shared among multiple nodes, and step 2 must fail, since if any node were a subset of another it would have to share all of its variables with at least one other node.

55 Graham's Algorithm ⇒ Primal Graph is Chordal Assume Graham's algorithm succeeds; then step 1 must have removed every node of the primal graph. Assume there was a chordless cycle. Let z be the node on the cycle that is eliminated first, and let x, y be the nodes on the cycle adjacent to z. For step 1 to apply, z must have belonged to only one constraint (constraints are cliques/hyperedges). With z gone, that constraint is left with x and y, so x and y are connected: contradiction!

56 Why m-ordering? We want an ordering that will not add any edges to chordal graphs. Property P (for zero fill-in): if u < v < w, (u,w) is an edge, and (v,w) is not an edge, then there exists a vertex x with v < x such that (v,x) is an edge and (w,x) is not an edge.

57 Proof of Property P Assumptions: a chordal graph G and an ordering α with Property P. Define property Q: let V_0, V_1, …, V_k be an unchorded path for which α(V_k) is maximum; then V_k > … > V_{i+1} > V_i < … < V_2 < V_1 < V_k < V_0, which is not possible! (Figure: with order u < v < w, if there is no edge (v,w) then v, u, w satisfies property Q.)

58 M-ordering Satisfies P Suppose u < v < w … (the remainder of the argument is given in the figure).

