
1 A Logic of Partially Satisfied Constraints Nic Wilson Cork Constraint Computation Centre Computer Science, UCC

2 Soft Constraints
Soft constraint formalisms tend to involve some form of degrees: degrees to which tuples satisfy a constraint (or a set of constraints). These degrees can represent, for example:
(a) costs of not satisfying constraints in an over-constrained problem;
(b) the extent of satisfaction of a vague constraint: constraints can be partially satisfied by tuples;
(c) a degree of uncertainty, given an uncertain constraint: we are unsure about the identity of the constraint.

3 Finite CSPs as a Logic
Set of variables V. Each variable v has domain dom(v). For U ⊆ V, dom(U) is the set of possible assignments to U, i.e., the product of dom(v) over v ∈ U. A complete tuple x is an element of dom(V). A constraint c has an associated set of variables V_c ⊆ V; c is defined to be a subset of dom(V_c) (so a set of partial tuples). Associated with constraint c is the constraint c↑V = {x ∈ dom(V) : x↓V_c ∈ c} on variables V: the set of complete tuples which project to an element of c. c may be considered as a compact representation of c↑V.
For example, let V = {V1, V2, V3, V4, V5}, dom(V1) = {a1, b1}, dom(V2) = {a2, b2}, etc. Let U = {V1, V2}. Then dom(U) = {(a1, a2), (a1, b2), (b1, a2), (b1, b2)}. Define constraint c by V_c = {V1, V2}, c = {(a1, a2), (b1, b2)}. Then x = (a1, a2, b3, a4, b5) is a complete tuple, and x ∈ c↑V since x↓V_c = (a1, a2) is in c.
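The slide's running example can be sketched in code. This is a minimal illustration, not the paper's implementation; the encoding of tuples as sequences of (variable, value) pairs is our own choice.

```python
from itertools import product

# Variables V1..V5, each with a two-value domain {a_i, b_i}.
V = ["V1", "V2", "V3", "V4", "V5"]
dom = {v: (f"a{i+1}", f"b{i+1}") for i, v in enumerate(V)}

def assignments(U):
    """dom(U): all tuples over the variables in U (kept in V's order)."""
    Us = [v for v in V if v in U]
    return {tuple(zip(Us, vals)) for vals in product(*(dom[v] for v in Us))}

def project(x, U):
    """x projected down to U: restrict a tuple to the variables in U."""
    return tuple((v, val) for v, val in x if v in U)

# Constraint c on V_c = {V1, V2}: the set of allowed partial tuples.
Vc = {"V1", "V2"}
c = {(("V1", "a1"), ("V2", "a2")), (("V1", "b1"), ("V2", "b2"))}

def extend(c, Vc):
    """c extended up to V: complete tuples projecting into c."""
    return {x for x in assignments(V) if project(x, Vc) in c}

x = (("V1", "a1"), ("V2", "a2"), ("V3", "b3"), ("V4", "a4"), ("V5", "b5"))
print(x in extend(c, Vc))   # True: x projected to Vc is (a1, a2), in c
```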

4 Semantics
The constraints are for a particular purpose: there is a set M* of complete tuples that are adequate for this purpose. M* is unknown; the constraints give us information about it. Each constraint c is viewed as a restriction on complete tuples: it says that if a tuple is not in c, then it is inadequate for the purpose (so constraints are taken to give negative information). c tells us that M* ⊆ c↑V: any complete tuple outside c↑V is inadequate.
Define a model M to be a subset of dom(V). Each M is a possible candidate for M*. M |= c (M is a model of c) iff M ⊆ c↑V.
Let K be a set of constraints, and d be a constraint. We define K |= d (K entails d) iff every model of (every element of) K is a model of d; in other words, iff K tells us that every tuple outside d↑V is inadequate for our purpose.
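This entailment semantics can be checked by brute force on a tiny instance. The sketch below (our own encoding, with two Boolean variables) uses the observation that the models of K are exactly the subsets of the intersection of the c↑V, so only the largest candidate model needs testing.

```python
from itertools import product

# A constraint is a pair (allowed, Vc): Vc is a tuple of variable
# indices, `allowed` a set of sub-tuples over Vc.
tuples = set(product((0, 1), repeat=2))           # dom(V), V = (v0, v1)

def extend(allowed, Vc):
    """c extended to V: complete tuples whose projection to Vc is allowed."""
    return {t for t in tuples if tuple(t[i] for i in Vc) in allowed}

def satisfies(M, c):
    """M |= c iff M is a subset of c extended to V."""
    return set(M) <= extend(*c)

def entails(K, d):
    """K |= d: models of K are the subsets of the intersection of the
    extended constraints, so it suffices to test the largest one."""
    largest = set(tuples)
    for c in K:
        largest &= extend(*c)
    return satisfies(largest, d)

K = [({(1,)}, (0,)),               # constraint: v0 = 1
     ({(0, 0), (1, 1)}, (0, 1))]   # constraint: v0 = v1
print(entails(K, ({(1,)}, (1,))))  # K entails v1 = 1: True
```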

5 Combination of constraints. Given constraints c and d, c ⊗ d is the constraint c↑U ∩ d↑U on variables U = V_c ∪ V_d. So y ∈ dom(U) is in c ⊗ d iff y↓V_c ∈ c and y↓V_d ∈ d. c ⊗ d is essentially the intersection, as c↑U is essentially the same constraint as c. This operation is commutative and associative.
Projection of constraints. For constraint c, and U ⊆ V_c, the projection c↓U of c to U is defined to be {y↓U : y ∈ c}.
Identity (trivial) constraint. Let 1_V be the constraint dom(V). (Every tuple is allowed.) This gives us no information, and is satisfied by all models.
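Combination and projection can be sketched directly on this representation. The encoding below (partial tuples as frozensets of (variable, value) pairs) is ours; combination merges pairs of tuples that agree on shared variables, which is equivalent to intersecting the extensions.

```python
# A constraint is a pair (tuples, scope): a set of partial tuples plus
# its set of variables.
def combine(c, d):
    """c ⊗ d on Vc ∪ Vd: merged tuples projecting into both c and d."""
    (ct, Vc), (dt, Vd) = c, d
    shared = Vc & Vd
    merged = {yc | yd for yc in ct for yd in dt
              if {p for p in yc if p[0] in shared} ==
                 {p for p in yd if p[0] in shared}}
    return merged, Vc | Vd

def project(c, U):
    """Projection to U: restrict every tuple of c to the variables in U."""
    ct, Vc = c
    return {frozenset(p for p in y if p[0] in U) for y in ct}, U

c = ({frozenset({("x", 0), ("y", 0)}),
      frozenset({("x", 1), ("y", 1)})}, frozenset("xy"))   # x = y
d = ({frozenset({("y", 1), ("z", 0)})}, frozenset("yz"))   # y = 1, z = 0
cd = combine(c, d)
print(project(cd, frozenset("x")))   # only x = 1 survives
```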

6 Proof Theory
Axiom: 1_V
Inference Rules:
From c and d deduce c ⊗ d.
When T ⊆ V_c: from c deduce c↓T.
When V_c = V_d and c ⊆ d: from c deduce d.
The proof theory is sound and complete: K |= d if and only if d can be derived from K ∪ {1_V} using the above rules.

7 Derived inference rules
Arc consistency can be considered as derived inference rules: a combination of constraints followed by projection to a single variable.

8 Derived Inference Rule: Deletion of a variable V1
Combine all constraints involving that variable and project to U − {V1}:
(⊗{c ∈ K : V1 ∈ V_c})↓(U − {V1})
where U is the set of variables involved in the combination. (⊗K)↓V_d can be computed by repeated deletion of variables; the deletion inference rule gives a sound and complete proof procedure (Dechter and Pearl's adaptive consistency algorithm; a special case of Shenoy's fusion algorithm and of Dechter's bucket elimination).
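One deletion step can be sketched as follows (same encoding as before, partial tuples as frozensets of (variable, value) pairs; a simplified illustration of the rule, not the adaptive-consistency algorithm itself):

```python
def combine(c, d):
    """c ⊗ d: merge tuples of c and d that agree on shared variables."""
    (ct, Vc), (dt, Vd) = c, d
    s = Vc & Vd
    return ({yc | yd for yc in ct for yd in dt
             if {p for p in yc if p[0] in s} ==
                {p for p in yd if p[0] in s}},
            Vc | Vd)

def project(c, U):
    """Restrict every tuple of c to the variables in U."""
    ct, _ = c
    return ({frozenset(p for p in y if p[0] in U) for y in ct}, U)

def delete_variable(K, v):
    """Combine all constraints mentioning v, then project v away."""
    bucket = [c for c in K if v in c[1]]
    rest = [c for c in K if v not in c[1]]
    if not bucket:
        return rest
    joint = bucket[0]
    for c in bucket[1:]:
        joint = combine(joint, c)
    return rest + [project(joint, joint[1] - {v})]

# x = y and y = z; deleting y yields a constraint forcing x = z.
K = [({frozenset({("x", a), ("y", a)}) for a in (0, 1)}, frozenset("xy")),
     ({frozenset({("y", a), ("z", a)}) for a in (0, 1)}, frozenset("yz"))]
out = delete_variable(K, "y")
print(out[0])   # the induced constraint x = z on scope {x, z}
```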

9 A constraint c, as defined above, can also be viewed as a function from dom(V_c) to {0, 1}, assigning 1 to the partial tuples in the constraint. Similarly, models can be viewed as functions from dom(V) to {0, 1}. Then we have: M |= c iff M ≤ c↑V, i.e., iff for all tuples x ∈ dom(V), M(x) ≤ c↑V(x), i.e., M(x) ≤ c(x↓V_c).
c ⊗ d is the constraint on V_c ∪ V_d given by (c ⊗ d)(y) = c(y↓V_c) ∧ d(y↓V_d), where ∧ is logical AND (i.e., min).
The projection c↓U of c to U is given by c↓U(u) = ∨{c(y) : y↓U = u}, where ∨ is logical OR (i.e., max).
1_U is the constraint on variables U which is everywhere equal to 1.
The only difference in the proof theory is that the subset inference rule is replaced by: when V_c = V_d and c ≤ d, from c deduce d; since c ≤ d, i.e., c(y) ≤ d(y) for all y, holds iff every tuple in c is a tuple in d.
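The functional view can be sketched on two Boolean variables. The encoding (a constraint as a dict from tuples over its sorted scope to 0 or 1) is ours; combination is pointwise min and projection is max, and projecting a combination onto a single variable gives the arc-consistency flavour of slide 7.

```python
from itertools import product

def combine(c, Vc, d, Vd):
    """(c ⊗ d)(y) = min(c at y restricted to Vc, d at y restricted to Vd)."""
    U = sorted(set(Vc) | set(Vd))
    table = {}
    for vals in product((0, 1), repeat=len(U)):
        y = dict(zip(U, vals))
        yc = tuple(y[v] for v in sorted(Vc))
        yd = tuple(y[v] for v in sorted(Vd))
        table[vals] = min(c[yc], d[yd])        # logical AND
    return table, U

def project(c, Vc, U):
    """Projection: max of c(y) over all y agreeing with u on U."""
    out = {}
    for y, val in c.items():
        u = tuple(v for var, v in zip(sorted(Vc), y) if var in U)
        out[u] = max(out.get(u, 0), val)       # logical OR
    return out

eq = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 1}   # x = y
x1 = {(0,): 0, (1,): 1}                             # x = 1
cd, U = combine(eq, ["x", "y"], x1, ["x"])
proj = project(cd, U, {"y"})
print(proj)   # y = 0 gets degree 0, y = 1 gets degree 1
```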

10 Partially Satisfied Constraints
Suppose we now want to allow degrees of satisfaction of constraints. We choose a partially (or totally) ordered set (A, ≤, 0, 1) to represent these degrees, where A contains a unique maximal element 1 and a unique minimal element 0. A (soft) constraint c is defined to be a function from dom(V_c) to A, with a value of 1 meaning that the tuple completely satisfies the constraint, and a value of 0 meaning that the tuple doesn't satisfy the constraint at all (the tuple is known to be completely inadequate for our purpose).
Define models to be functions from dom(V) to A. A model is intended to represent the 'true' degree to which a tuple is adequate for our purpose. Each constraint is viewed as restricting the set of models: it gives upper bounds on these 'true' degrees. M |= c iff M ≤ c↑V, i.e., iff for all tuples x ∈ dom(V), M(x) ≤ c↑V(x), i.e., M(x) ≤ c(x↓V_c).
As before, for a set of (soft) constraints K, and a (soft) constraint d, we say K |= d iff every model M of K is also a model of d.

11 When A is a distributive lattice
An example of a distributive lattice is a subset lattice. Let B be a set, and let A be a set of subsets of B which is closed under intersection and union: i.e., if α, β ∈ A then α, β ⊆ B, and α ∩ β, α ∪ β ∈ A. In fact any finite distributive lattice is isomorphic to such a subset lattice. α ∩ β is the unique greatest lower bound of α and β; α ∪ β is the unique least upper bound of α and β.
This enables us to define combination and projection, with the same form as before. Let c : dom(V_c) → A and d : dom(V_d) → A be two soft constraints. Their combination c ⊗ d is the constraint on V_c ∪ V_d given by (c ⊗ d)(y) = c(y↓V_c) ∧ d(y↓V_d).
For constraint c, and U ⊆ V_c, the projection c↓U of c to U is defined by c↓U(u) = ∨{c(y) : y↓U = u}. The degree of the projection is the least upper bound (join) of c(y) over all y which project to u.
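The subset-lattice case can be sketched concretely. In this illustration (our own encoding and example degrees, with variables ranging over {0, 1}), degrees are frozensets, combination takes the meet (intersection) and projection takes the join (union).

```python
from itertools import product

def combine(c, d):
    """(c ⊗ d)(y): intersect the degrees from c and d, on scope Vc ∪ Vd."""
    (ct, Vc), (dt, Vd) = c, d
    U = tuple(sorted(set(Vc) | set(Vd)))
    table = {}
    for vals in product((0, 1), repeat=len(U)):
        y = dict(zip(U, vals))
        table[vals] = (ct[tuple(y[v] for v in Vc)] &
                       dt[tuple(y[v] for v in Vd)])
    return table, U

def project(c, W):
    """Projection to W: union (join) of degrees over tuples agreeing on W."""
    ct, Vc = c
    out = {}
    for y, deg in ct.items():
        u = tuple(val for var, val in zip(Vc, y) if var in W)
        out[u] = out.get(u, frozenset()) | deg
    return out, tuple(v for v in Vc if v in W)

full = frozenset("pq")
c = ({(0,): full, (1,): frozenset("p")}, ("x",))
d = ({(0, 0): full, (0, 1): frozenset(),
      (1, 0): frozenset(), (1, 1): frozenset("q")}, ("x", "y"))
cd = combine(c, d)
print(project(cd, ("y",))[0])   # projected degrees for y = 0 and y = 1
```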

12 Sound and Complete Proof Theory
Axiom: 1_V
Inference Rules:
From c and d deduce c ⊗ d.
When U ⊆ V_c: from c deduce c↓U.
When V_c = V_d and c ≤ d: from c deduce d.
Again we have a form of arc consistency as sound (derived) inference rules; and the deletion inference rule again leads to a sound and complete inference procedure, which is efficient if there is a nice hypertree cover of the hypergraph {V_c : c ∈ K}.

13 Relationship with idempotent semi-ring CSPs
Idempotent c-semi-rings are distributive lattices; the combination operation is the same as described above, and the summation is the projection defined above. The deduction is the same. So we have a new semantics for deduction in idempotent semi-ring CSPs.

14 The General Partially Ordered Case
Any partial order can be embedded in a subset lattice (so that the ordering is maintained). In particular we can choose A′ ⊆ A, and for α ∈ A define X_α to be {β ∈ A′ : β ≤ α}. If A′ is chosen sufficiently large we have α ≤ β iff X_α ⊆ X_β, so that ({X_α : α ∈ A}, ⊆) is a representation of the partial order (A, ≤).
(We also need the following equivalence, which holds for the above representation: for all α ∈ A, ( X_α ⊆ ∪{X_β : β ∈ B} implies X_α ⊆ ∪{X_β′ : β′ ∈ B′} ) is equivalent to ∪{X_β : β ∈ B} ⊆ ∪{X_β′ : β′ ∈ B′}.)
We can thus use the above proof procedures on this embedding to get sound and complete deduction procedures for the general case.

15 A Different Semantics for Partially Satisfied Constraints
Models in the above semantics are functions from dom(V) to the partially ordered set A: they assign partially ordered degrees to complete tuples. It can also be natural to define models as functions from dom(V) to some partially ordered set A′ that extends A, using otherwise the same definitions for the entailment relation. The intuition is that the partially ordered values mentioned in the constraints come from some unknown partially ordered set. This leads to a much weaker entailment (a set of constraints K will often have far fewer consequences d).

16 Summary
A logical approach was taken to partially satisfied constraints, where tuples can be assigned values from an arbitrary partially ordered set. The logic has a simple semantics, and a sound and complete proof theory. This gives a new semantics for idempotent semi-ring CSP deduction.


22 For example, if we choose A′ = A, then for α ∈ A we define X_α = {β ∈ A : β ≤ α}.
Then α ≤ β implies X_α ⊆ X_β: this follows by transitivity of ≤: if γ ∈ X_α then γ ≤ α, so γ ≤ β, so γ ∈ X_β.
And X_α ⊆ X_β implies α ≤ β: this follows by reflexivity of ≤: α ≤ α, so α ∈ X_α ⊆ X_β, therefore α ≤ β.
However, we can usually find much smaller sets A′ than A, leading to a more efficient representation of the partial order.
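This claim is easy to check mechanically on a small example. The sketch below uses a diamond-shaped poset of our own choosing (bottom 0, top 1, incomparable middle elements p and q) and verifies that subset inclusion of the down-sets exactly mirrors the order.

```python
# A' = A; X_a is the down-set {b in A : b <= a}.
A = {"0", "p", "q", "1"}
# The order relation as a set of pairs: 0 is bottom, 1 is top,
# every element is related to itself (reflexivity).
le = ({("0", x) for x in A} | {(x, "1") for x in A} |
      {(x, x) for x in A})

def X(a):
    """X_a: the elements of A' = A below or equal to a."""
    return frozenset(b for b in A if (b, a) in le)

ok = all(((a, b) in le) == (X(a) <= X(b)) for a in A for b in A)
print(ok)   # True: subset inclusion faithfully represents the order
```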

