Assumption-Based Truth Maintenance Systems Meir Kalech.


1 Assumption-Based Truth Maintenance Systems Meir Kalech

2 Outline
Last lecture:
1. Consistency-based diagnosis
2. GDE – general diagnosis engine
3. Conflict generation using ATMS
4. Candidate generation
Today's lecture:
1. What is TMS
2. TMS architecture
3. Justification-based TMS
4. Assumption-based TMS

3 What is TMS?
A Truth Maintenance System (TMS) is a Problem Solver module responsible for:
1. Enforcing logical relations among beliefs.
2. Generating explanations for conclusions.
3. Finding solutions to search problems.
4. Supporting default reasoning.
5. Identifying causes of failure and recovering from inconsistencies.

4 1. Enforcement of logical relations
AI problems are typically solved by search.
Search relies on assumptions, and assumptions change.
Changing assumptions means updating the consequences of beliefs.
A TMS is a mechanism to maintain and update relations among beliefs.

5 1. Enforcement of logical relations
Example:
If (cs-501) and (math-218) then (cs-570).
If (cs-570) and (CIT) then (TMS).
If (TMS) then (AI-experience).
The following are relations among beliefs:
(AI-experience) if (TMS).
(TMS) if (cs-570), (CIT).
(cs-570) if (cs-501), (math-218).
Beliefs are propositional variables; a TMS is a mechanism for processing large collections of logical relations on propositional variables.
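
As a sketch (not the deck's implementation), the belief relations above can be run as plain Horn-rule forward chaining; the `RULES` encoding and the function name are illustrative:

```python
# Beliefs as propositional atoms; relations as Horn rules
# (consequent, [antecedents]), mirroring the course example above.
RULES = [
    ("cs-570", ["cs-501", "math-218"]),
    ("TMS", ["cs-570", "CIT"]),
    ("AI-experience", ["TMS"]),
]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose antecedents all hold."""
    believed = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in believed and all(b in believed for b in body):
                believed.add(head)
                changed = True
    return believed

print(forward_chain({"cs-501", "math-218", "CIT"}, RULES))
```

Retracting an assumption (say, cs-501) and re-running shows why a TMS caches the dependency structure instead of recomputing from scratch.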

6 2. Generation of explanations
Solving problems is what Problem Solvers do, but often a solution alone is not enough: the PS is also expected to provide an explanation.
The TMS uses cached inferences for this purpose.
Caching is efficient: generating an inference once and caching it is cheaper than re-running the inference rules that produced it every time it is needed.

7 2. Generation of explanations
Example:
Q: Shall I have AI experience after completing the CIT program?
A: Yes, because of the TMS course.
Q: What do I need to take the TMS course?
A: CS-570 and CIT.
Different types of TMS provide different ways of explaining conclusions (JTMS vs. ATMS). In this example, explaining conclusions in terms of their immediate predecessors works well.

8 3. Finding solutions to search problems
The graph (figure lost) has nodes A, B, C, D, E; the pairwise constraints below correspond to the edges A–B, A–C, B–D, D–E, and C–E.
Color the nodes red (1), green (2), or yellow (3) so that adjacent nodes get different colors.
The following set of constraints describes this problem:
A1 or A2 or A3    not (A1 and B1)   not (A3 and C3)   not (D2 and E2)
B1 or B2 or B3    not (A2 and B2)   not (B1 and D1)   not (D3 and E3)
C1 or C2 or C3    not (A3 and B3)   not (B2 and D2)   not (C1 and E1)
D1 or D2 or D3    not (A1 and C1)   not (B3 and D3)   not (C2 and E2)
E1 or E2 or E3    not (A2 and C2)   not (D1 and E1)   not (C3 and E3)
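
A brute-force check of these constraints is a small illustration (not part of the slides); the edge list is read off the pairwise constraints above:

```python
from itertools import product

NODES = "ABCDE"
EDGES = [("A", "B"), ("A", "C"), ("B", "D"), ("D", "E"), ("C", "E")]
COLOURS = [1, 2, 3]  # red, green, yellow

def solutions():
    """Yield every assignment where adjacent nodes differ in colour."""
    for combo in product(COLOURS, repeat=len(NODES)):
        colour = dict(zip(NODES, combo))
        if all(colour[u] != colour[v] for u, v in EDGES):
            yield colour

sols = list(solutions())
print(len(sols))  # 30: the nodes form a 5-cycle, which has 2^5 - 2 = 30 proper 3-colourings
```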

9 3. Finding solutions to search problems
To find a solution we can use search: branch on the color of A (red, green, or yellow), then on B, C, D, and E in turn, pruning branches that violate a constraint. (The search-tree figure did not survive transcription.)

10 4. Default reasoning and TMS
The PS must make conclusions based on incomplete information.
"Closed-World Assumption" (CWA): X is true unless there is evidence to the contrary.
The CWA helps us limit the underlying search space.
The reasoning scheme that supports the CWA is called "default (or non-monotonic) reasoning".

11 4. Default reasoning and TMS
Example: consider the following knowledge base:
Bird(tom) and ¬Abnormal(tom) → Can_fly(tom)
Penguin(tom) → Abnormal(tom)
Ostrich(tom) → Abnormal(tom)
Bird(tom)
Under the CWA, we assume ¬Abnormal(tom) and therefore we can derive Can_fly(tom).
A non-monotonic TMS supports this type of reasoning.
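
A minimal sketch of the closed-world default for the bird example (the function is illustrative, not a general non-monotonic reasoner):

```python
# Abnormal(tom) is assumed false unless some rule derives it (CWA).
def can_fly(facts):
    """Derive abnormality first, then apply the default rule."""
    abnormal = "Penguin(tom)" in facts or "Ostrich(tom)" in facts
    return "Bird(tom)" in facts and not abnormal

print(can_fly({"Bird(tom)"}))                  # True: the default applies
print(can_fly({"Bird(tom)", "Penguin(tom)"}))  # False: new evidence retracts the default
```

Note the non-monotonicity: adding the fact Penguin(tom) removes a previously held conclusion, which is exactly what a TMS must track.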

12 5. Identifying causes of failures and recovering from inconsistencies
Inconsistencies among beliefs in the KB are always possible:
wrong data (example: "Outside temperature is 320 degrees");
impossible constraints (example: Big-house and Cheap-house and Nice-house).
The dependencies the TMS maintains help identify the reason for an inconsistency, and "dependency-directed backtracking" allows the TMS to recover.

13 TMS applications
Constraint Satisfaction Problems (CSP):
a set of variables, a domain for each variable, and constraints between the variables.
Goal: find a "solution" – an assignment to the variables that satisfies the constraints.
Scenario and planning problems:
find a path of state transitions leading from the initial to the final state (games, strategies).
The TMS identifies the applicable rules.

14 CSP example: allocation problem
Two hosts: {h1, h2} (variables)
Three tasks: {t1, t2, t3} (domain)
Two constraints:
t1 runs before t2 on the same host;
t1 cannot run on the same host as t3.

15 CSP example
The search tree assigns tasks to hosts: t1-h1, t2-h1, t3-h1, …, t1-h2, t2-h2, t3-h2, and so on (tree figure lost in transcription).
The tree has 48 nodes, of which 6 are solutions.

16 Outline
Last lecture:
1. Consistency-based diagnosis
2. GDE – general diagnosis engine
3. Conflict generation using ATMS
4. Candidate generation
Today's lecture:
1. What is TMS
2. TMS architecture
3. Justification-based TMS
4. Assumption-based TMS

17 Problem Solver Architecture
The TMS / PS relationship is the following (diagram lost): the Problem Solver passes justifications and assumptions to the TMS; the TMS returns beliefs and contradictions.

18 How do the TMS and the PS communicate?
The PS works with assertions (facts, beliefs, conclusions, hypotheses), inference rules, and procedures.
Each one of these is assigned a TMS node.
Example:
N1: (rule (student ?x) (assert (and (underpaid ?x) (overworked ?x))))
N2: (student Bob)
Given N1 and N2, the PS can infer
N3: (and (underpaid Bob) (overworked Bob))
The PS treats nodes as logical formulas, while the TMS treats nodes as propositional variables.

19 TMS nodes
Different types of TMS support different types of nodes:
Premise nodes – always true.
Contradiction nodes – always false.
Assumption nodes – the PS believes these whether or not they are supported by the existing evidence.
Each node has a label associated with it; the contents and structure of the label depend on the type of TMS.
Other properties are the node type (premise, assumption, etc.), the node's support (justifications, antecedents), its consequences, etc.

20 TMS justifications
If N3 is created by the PS, the PS reports it to the TMS together with the fact that it follows from N1 and N2:
justification: (N3 ← N2 N1)
Here N3 is called the consequent, and N1 and N2 are the antecedents of the justification.
Justifications record relations among beliefs, for explaining consequents and identifying causes of inconsistencies.
The general format of a justification is:
(⟨consequent⟩ ⟨antecedent-1⟩ … ⟨antecedent-n⟩)

21 Propositional specification of a TMS
TMS nodes are propositional variables.
TMS justifications are propositional formulas:
N1 & N2 & … & Ni → Nj
Here N1, N2, …, Ni, Nj are positive literals; therefore this implication is a Horn formula.
A TMS can be viewed as a set of Horn formulas.

22 PS / TMS interaction
Responsibilities of the PS:
1. Adds assertions and justifications.
2. Makes premises and assumptions.
3. Retracts assumptions.
4. Provides advice on handling contradictions.
Responsibilities of the TMS:
1. Caches beliefs and consequences and maintains labels.
2. Detects contradictions.
3. Performs belief revision.
4. Generates explanations.

23 Outline
Last lecture:
1. Consistency-based diagnosis
2. GDE – general diagnosis engine
3. Conflict generation using ATMS
4. Candidate generation
Today's lecture:
1. What is TMS
2. TMS architecture
3. Justification-based TMS
4. Assumption-based TMS

24 Justification-based TMS
Justifications are used for:
Belief update, when the belief state of a node changes.
Handling contradictions:
1. The contradiction's justification is added to the dependency-directed backtracking system.
2. The system then searches through the dependency network for the assumptions underlying the contradiction.
3. The contradiction is removed.

25 Justification-based TMS
A justification contains an inlist and an outlist; for the justified node to be believed:
inlist – a set of nodes that must be in;
outlist – a set of nodes that must be out.
Syntax: {(inlist), (outlist)}
Premises hold universally: empty inlist and outlist.
At any moment there is only one context: the set of assumptions currently believed.
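
A toy labelling sketch of the inlist/outlist rule, using the weather example from the slides that follow (a node is IN if some justification has all of its inlist IN and all of its outlist OUT; real JTMSs need extra care with circular support, which this acyclic example avoids):

```python
JUSTS = {             # node -> list of (inlist, outlist)
    "A": [((), ("B",))],     # Temperature >= 25, unless B is in
    "C": [((), ("D",))],     # Not raining, unless D is in
    "E": [((), ("F",))],     # Day, unless F is in
    "G": [(("A", "C"), ())], # Nice weather
    "H": [(("E", "G"), ())], # Swim
}

def label(justs):
    """Iterate to a fixpoint: add a node when a justification is satisfied."""
    believed = set()
    changed = True
    while changed:
        changed = False
        for node, js in justs.items():
            ok = any(all(i in believed for i in ins) and
                     all(o not in believed for o in outs)
                     for ins, outs in js)
            if ok and node not in believed:
                believed.add(node)
                changed = True
    return believed

print(sorted(label(JUSTS)))  # ['A', 'C', 'E', 'G', 'H']
```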

26 Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining         {(),(D)}
D: Raining
E: Day                 {(),(F)}
F: Night
The PS concludes "nice weather" from A and C.

27 Justification-based TMS – Example
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining         {(),(D)}
D: Raining
E: Day                 {(),(F)}
F: Night
G: Nice weather        {(A,C),()}   // new node in the JTMS

28 Justification-based TMS – Example
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining         {(),(D)}
D: Raining
E: Day                 {(),(F)}
F: Night
G: Nice weather        {(A,C),()}
The PS concludes "swim" from E and G.

29 Justification-based TMS – Example
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining         {(),(D)}
D: Raining
E: Day                 {(),(F)}
F: Night
G: Nice weather        {(A,C),()}
H: Swim                {(E,G),()}

31 Justification-based TMS – Example
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining         {(),(D)}
D: Raining
E: Day                 {(),(F)}
F: Night
G: Nice weather        {(A,C),()}
H: Swim                {(E,G),()}
I: Contradiction       {(C),()}    // handled by the dependency-directed backtracking system

32 Justification-based TMS – Example
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining         {(),(D)}
D: Raining
E: Day                 {(),(F)}
F: Night
G: Nice weather        {(A,C),()}
H: Swim                {(E,G),()}
I: Contradiction       {(C),()}
X: Handle              {(),()}     // premise
D: Raining             {(X),()}
Context: {(A,D,E), (B,C,F,G,H,I)}

33 Justification-based TMS – Example
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining         {(),(D)}
D: Raining             {(X),()}
E: Day                 {(),(F)}
F: Night
G: Nice weather        {(A,C),()}
H: Swim                {(E,G),()}
I: Contradiction       {(C),()}
X: Handle              {(),()}     // premise
J: Read                {(D,E),()}
K: Contradiction       {(J),()}    // becomes tired; handled by the dependency-directed backtracking system

34 Justification-based TMS – Example
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining         {(),(D)}
D: Raining             {(X),()}
E: Day                 {(),(F)}
F: Night               {(X),()}
G: Nice weather        {(A,C),()}
H: Swim                {(E,G),()}
I: Contradiction       {(C),()}
X: Handle              {(),()}     // premise
J: Read                {(D,E),()}
K: Contradiction       {(J),()}    // becomes tired
Context: {(A,D,F), (B,C,E,G,H,I,J,K)}

35 Justification-based TMS – Example
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining         {(),(D)}
D: Raining             {(X),()}
E: Day                 {(),(F)}
F: Night               {(X),()}
G: Nice weather        {(A,C),()}
H: Swim                {(E,G),()}
I: Contradiction       {(C),()}
X: Handle              {(),()}     // premise
J: Read                {(D,E),()}
K: Contradiction       {(J),()}    // becomes tired
L: Sleep               {(F),()}

36 Outline
Last lecture:
1. Consistency-based diagnosis
2. GDE – general diagnosis engine
3. Conflict generation using ATMS
4. Candidate generation
Today's lecture:
1. What is TMS
2. TMS architecture
3. Justification-based TMS
4. Assumption-based TMS

37 Assumption-based TMS: Motivation
Problem solvers need to explore multiple contexts at the same time, instead of a single one as in the JTMS:
alternate diagnoses of a broken system;
different design choices;
competing theories to explain a set of data.
Problem solvers also need to compare contexts and switch from one context to another.
In a JTMS, this is done by enabling and retracting assumptions; in an ATMS, alternative contexts are explicitly stored.

38 The idea behind ATMS
The assumptions underlying conclusions are important in problem solving:
solutions can be described as sets of assumptions;
states of the world can be represented by sets of assumptions.
Identify sets of assumptions, called here environments, and organize the problem solver around manipulating environments.
This facilitates reasoning with multiple hypotheses.

39 Assumptions and Justifications
The ATMS keeps and manipulates sets of assumptions rather than sets of beliefs.
Three types of nodes:
Premise nodes – always true, but of no special interest to the ATMS.
Assumption nodes – once made, assumptions are never retracted.
Contradictions – defined by means of the assumptions that originate them; such sets of assumptions are called nogoods.
ATMS justifications are Horn formulas of the form
Jk: I1, I2, …, In → Ck,
where I1, I2, …, In are the antecedents and Ck is the consequent of justification Jk.

40 Basic ATMS terminology
The ATMS answers queries about whether a node holds in a given set of beliefs.
Definition. A set of assumptions upon which a given node depends is called an environment. Example: {A, B, C}.
Definition. A label is a set of environments. Example: {{A,B,C}, …, {D,F}}.
That is, the label records the assumptions upon which the node ultimately depends – a major difference from the JTMS, where labels are simply :IN or :OUT.
Definition. An ATMS node Nk is a triple ⟨datum, label, justifications⟩.

41 Basic ATMS terminology
Definition. A node n holds in a given environment E iff it can be derived from E given the set of justifications J: E, J ⊢ n.
An environment is inconsistent if false can be derived: E, J ⊢ ⊥.
Definition. Let E be a (consistent) environment and N the set of nodes derived from E. Then E ∪ N is called the context of E.
Definition. A characterizing environment is a minimal consistent environment from which a context can be derived. Each context is completely specified by its characterizing environment.
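
A sketch of computing a context (environment plus everything derivable from it); the `JUST_RULES` here borrow two rules from the weather example later in these slides, and the names are illustrative:

```python
JUST_RULES = [("E", ("A", "C")),   # A and C -> E  (nice weather)
              ("F", ("E", "D"))]   # E and D -> F  (swim)

def context(env, justs):
    """Close the environment under the Horn justifications."""
    ctx = set(env)
    changed = True
    while changed:
        changed = False
        for head, body in justs:
            if head not in ctx and all(b in ctx for b in body):
                ctx.add(head)
                changed = True
    return ctx

print(sorted(context({"A", "C", "D"}, JUST_RULES)))  # ['A', 'C', 'D', 'E', 'F']
```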

42 ATMS efficiency
The ATMS is provided with a set of assumptions and justifications.
Its task is to determine the contexts efficiently:
incrementally updating only the changed contexts;
data structures that make context-consistency checking and node-inclusion tests very fast.

43 Relations between environments
Because derivation from environments is monotonic, set inclusion between environments implies logical subsumption of consequences.
Example:
E1 = {C}, E2 = {C, D}, E3 = {D, E}
E1 subsumes E2 (E2 is subsumed by E1).
E1 neither subsumes nor is subsumed by E3.
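
Environment subsumption is just set inclusion, as a one-line sketch over the example above:

```python
def subsumes(e1, e2):
    """E1 subsumes E2 iff E1 is a subset of E2 (everything E1 derives, E2 derives)."""
    return set(e1) <= set(e2)

E1, E2, E3 = {"C"}, {"C", "D"}, {"D", "E"}
print(subsumes(E1, E2))                    # True: E1 subsumes E2
print(subsumes(E1, E3), subsumes(E3, E1))  # False False: incomparable
```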

44 How ATMS answers queries
How does the ATMS answer queries about whether a node holds in a given environment?
Easiest way: associate with each node all of the environments in which it holds.
Better way: record only those environments which satisfy the following four properties:
1. Soundness: the node holds in every environment associated with it.
2. Consistency: no associated environment is a nogood.
3. Completeness: every consistent environment in which the node holds is a superset of some associated environment.
4. Minimality: no associated environment is a subset of any other.
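
A sketch of enforcing consistency and minimality on a label (the sample label and nogood are made up for illustration):

```python
def minimize(label, nogoods):
    """Drop environments containing a nogood, then drop supersets of other environments."""
    envs = [frozenset(e) for e in label]
    consistent = [e for e in envs
                  if not any(set(ng) <= e for ng in nogoods)]
    return {e for e in consistent
            if not any(other < e for other in consistent)}

lbl = [{"A"}, {"A", "B"}, {"C", "D"}, {"B", "E"}]
print(minimize(lbl, nogoods=[{"B", "E"}]))
# {A,B} is subsumed by {A}; {B,E} is a nogood; {A} and {C,D} remain
```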

45 ATMS labels
Example dependency network (figure lost). The labels shown include {{B,C}} and {{C,D}} for the intermediate nodes F and G, and {{A},{B,C,D}} for node H.
Is H believed? Yes, because its label is non-empty.
Is H believed under {B, C, D, Z, X}? Yes, because {B, C, D} ⊆ {B, C, D, Z, X}.
Is H believed under {C, D}? No.
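
The query rule can be sketched directly: a node holds in environment E iff some environment in its label is a subset of E. H's label is taken from the example above:

```python
def holds(label, env):
    """A node holds in env iff some label environment is contained in env."""
    return any(set(e) <= set(env) for e in label)

H_label = [{"A"}, {"B", "C", "D"}]
print(holds(H_label, {"B", "C", "D", "Z", "X"}))  # True
print(holds(H_label, {"C", "D"}))                 # False
```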

46 Contradictions
Certain nodes can be declared to be contradictions.
Every environment from which a contradiction can be derived is inconsistent.
Inconsistent environments are called nogoods.
Example (figure lost): nodes F and G feed a contradiction node, so the environments shown, {B, C} and {A, B, C}, become nogoods.

47 Special labels in ATMS
Case 1: Label = { } (empty label). There is no known consistent environment in which the node is believed, i.e. either there is no path from assumptions to it, or all environments for it are inconsistent.
Case 2: Label = {{}} (empty environment). The node is believed in every consistent environment, i.e. the node is either a premise or can be derived strictly from premises.

48–52 Label propagation (figures lost)
These five slides stepped through label propagation on a small dependency network with nodes C, D, G, R, and L, enabling the assumptions A, B, C, and D one at a time and showing how the node labels grow (for example, after enabling C a label contains {C} and {B,C}; after enabling D it also contains {D}). The network diagrams did not survive transcription.

53 Example: datum, justifications, environments
For an example, see Franz Wotawa's slides, pages 7–8.

55 Properties of ATMS (figure-only slide; content lost in transcription)

56 Environment Lattice (figure-only slide: the lattice of all environments over the assumptions, ordered by set inclusion)

57 Comments to lattice
If an environment is nogood, then all of its superset environments are nogood as well. All nogoods here are the result of the nogood {A, B, E}.
The ATMS associates every datum with its contexts. If a datum is in a context, then it is in every superset as well (the inconsistent supersets are ignored).

58 Comments to lattice
The circled nodes and the square nodes in the lattice indicate the contexts of the two antecedents (the specific data were shown in the lost figure).
If the PS infers y=0 from x+y=1 and x=1, then the context for y=0 is the intersection of the contexts of the above.

59 Comments to lattice
One sound and complete label for the consequent is the set whose elements are the unions of all possible combinations of picking one environment from each antecedent node's label.
The environment {A, B, C, D} is removed because it is subsumed by {A, B, C}.
The environment {A, B, D, E} is not included because it is a superset of the inconsistent {A, B, E}.

60 ATMS algorithms
Logical specification of the ATMS:
the ATMS does propositional reasoning over nodes;
ATMS justifications are Horn clauses;
contradictions are characterized by nogoods.
Every ATMS operation which changes a node label can be viewed as adding a justification, i.e. the only operation we need to consider here is label update as a result of adding a justification.

61 ATMS algorithms
Step 1: compute a tentative new (locally correct) label for the affected node as follows. Let L_ik be the label of the i-th antecedent node of the k-th justification for consequent node n. A complete label for node n is
L_new = ⋃_k { x | x = ⋃_i x_i, where x_i ∈ L_ik }
Step 2: all nogoods and subsumed environments are removed from L_new to achieve global correctness.
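
The two steps can be sketched as follows: for each justification, take every union of one environment picked from each antecedent's label, then filter out nogoods and subsumed sets. The function name and data layout are illustrative:

```python
from itertools import product

def update_label(antecedent_labels_per_just, nogoods=()):
    """Step 1: cross-product unions per justification; Step 2: minimize."""
    tentative = set()
    for labels in antecedent_labels_per_just:   # one entry per justification
        for pick in product(*labels):           # one environment per antecedent
            tentative.add(frozenset().union(*pick))
    ok = [e for e in tentative
          if not any(set(ng) <= e for ng in nogoods)]
    return {e for e in ok if not any(other < e for other in ok)}

# F has one justification E, D -> F with Label(E) = {{A,C}} and Label(D) = {{D}},
# as in the lecture's weather example, giving Label(F) = {{A,C,D}}:
lab_E = [frozenset({"A", "C"})]
lab_D = [frozenset({"D"})]
print(update_label([[lab_E, lab_D]]))
```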

62 Propagating label changes
To update node Ni, compute its new label as described.
If the label has not changed, we are done.
Otherwise, if Ni is a contradiction node:
mark all environments in its label as nogoods;
check every node's label for environments marked as nogoods and remove them.
Otherwise, recursively update all of Ni's consequences (other nodes having justifications which mention Ni).
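
The contradiction branch can be sketched in isolation (names are illustrative; the sample labels use the {A,B} nogood from the example on the next slide):

```python
def mark_nogoods(labels, contradiction_label, nogoods):
    """Record the contradiction's environments as nogoods and strip them
    (and their supersets) from every node's label."""
    nogoods.extend(contradiction_label)
    for node, label in labels.items():
        labels[node] = [e for e in label
                        if not any(set(ng) <= set(e) for ng in nogoods)]
    return labels

labels = {"E": [{"A", "C"}], "F": [{"A", "C", "D"}], "X": [{"A", "B"}]}
mark_nogoods(labels, contradiction_label=[{"A", "B"}], nogoods=[])
print(labels)  # X loses its only environment {A,B}; E and F keep theirs
```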

63 Example
Assumptions (proposition, label, justifications):
A: Temperature >= 25   {{A}}   {(A)}
B: Temperature < 25    {{B}}   {(B)}
C: Not raining         {{C}}   {(C)}
D: Day                 {{D}}   {(D)}
Derived facts: E: Nice weather, F: Swim, G: Read, H: Sleep
Rules:
1. A and C → E
2. E and D → F
3. D and out(C) → G
4. out(D) → H
5. A and B → ⊥

64 Example – empty environment
A: Temperature >= 25   {{A}}        {(A)}
B: Temperature < 25    {{B}}        {(B)}
C: Not raining         {{C}}        {(C)}
D: Day                 {{D}}        {(D)}
H: Sleep               {{Out(D)}}   {(Out(D))}
The problem solver applies a breadth-first strategy to supply the assumptions: the empty environment is provided first, then {A}, {B}, …, {A,B}, {A,C}, …, {A,B,C}, etc.
The rules are fired by the PS as justifications over the assumptions.
The empty environment causes the PS to provide out(D) as a justification for H.

65 Example – {D}
A: Temperature >= 25   {{A}}          {(A)}
B: Temperature < 25    {{B}}          {(B)}
C: Not raining         {{C}}          {(C)}
D: Day                 {{D}}          {(D)}
H: Sleep               {{Out(D)}}     {(Out(D))}
G: Read                {{D,Out(C)}}   {(D,Out(C))}
Environments {A}, {B}, and {C} produce no changes.

66 Example – {A,B}
A: Temperature >= 25   {{A}}          {(A)}
B: Temperature < 25    {{B}}          {(B)}
C: Not raining         {{C}}          {(C)}
D: Day                 {{D}}          {(D)}
H: Sleep               {{Out(D)}}     {(Out(D))}
G: Read                {{D,Out(C)}}   {(D,Out(C))}
⊥                      {{A,B}}        {(A,B)}
The PS fires the fifth rule (A and B → ⊥); the ATMS adds this environment to the nogood database.

67 Example – {A,C}
A: Temperature >= 25   {{A}}          {(A)}
B: Temperature < 25    {{B}}          {(B)}
C: Not raining         {{C}}          {(C)}
D: Day                 {{D}}          {(D)}
H: Sleep               {{Out(D)}}     {(Out(D))}
G: Read                {{D,Out(C)}}   {(D,Out(C))}
E: Nice weather        {{A,C}}        {(A,C)}
The PS fires the first rule (A and C → E).
{A,D} and {B,D} could fire the third rule, but it has already been fired.

68 Example – {A,C,D}
A: Temperature >= 25   {{A}}          {(A)}
B: Temperature < 25    {{B}}          {(B)}
C: Not raining         {{C}}          {(C)}
D: Day                 {{D}}          {(D)}
H: Sleep               {{Out(D)}}     {(Out(D))}
G: Read                {{D,Out(C)}}   {(D,Out(C))}
E: Nice weather        {{A,C}}        {(A,C)}
F: Swim                {{A,C,D}}      {(E,D)}
The PS fires the second rule (E and D → F).
{A,B,C}, {A,B,D}, and {A,B,C,D} are supersets of the nogood {A,B} and so are not explored.
Note: if Label(E) = {{A,C},{X,Y}}, then Label(F) = {{A,C,D},{X,Y,D}}.

69 Example – environment lattice (figure lost)
When does Sleep (H) hold?
Squares mark the environments where H holds, circles where G holds, and rhombi where F holds.

70 Back to Diagnosis…
The polybox circuit: multipliers M1, M2, M3 feeding adders A1, A2 (circuit figure lost).
Inputs (premises): A=3 {{}}, B=2 {{}}, C=2 {{}}, D=3 {{}}, E=3 {{}}
Predicted: x=6 {{M1}}, y=6 {{M2}}, z=6 {{M3}}, F=12 {{A1,M1,M2}}, G=12 {{A2,M2,M3}}
Observed: F=10 {{}}, G=12 {{}}
Nogoods: {A1,M1,M2}; later, from G=10 {{A1,M1,M3,A2}} vs. the observed G=12, also {A1,M1,M3,A2}
Back-propagated labels: x=4 {{A1,M2},{A1,A2,M3}}, y=4 {{A1,M1}}, y=6 {{M2},{A2,M3}}, z=6 {{M3},{A2,M2}}, z=8 {{A1,A2,M1}}

71 Bibliography
1. Kenneth D. Forbus and Johan de Kleer, Building Problem Solvers, The MIT Press, 1993.
2. Johan de Kleer, An Assumption-based Truth Maintenance System, Artificial Intelligence 28, 127–162, 1986.
3. Johan de Kleer, Problem Solving with the ATMS, Artificial Intelligence 28, 197–224, 1986.
4. Johan de Kleer, Extending the ATMS, Artificial Intelligence 28, 163–196, 1986.
5. Mladen Stanojevic, Sanja Vranes, and Dusan Velasevic, Using Truth Maintenance Systems: A Tutorial, IEEE Expert: Intelligent Systems and Their Applications 9(6), 45–56, 1994.

