Assumption-Based Truth Maintenance Systems Meir Kalech

Outline
Last lecture: 1. Consistency-based diagnosis 2. GDE – general diagnosis engine 3. Conflict generation using ATMS 4. Candidate generation
Today's lecture: 1. What is TMS 2. TMS architecture 3. Justification-based TMS 4. Assumption-based TMS

What is a TMS? A Truth Maintenance System (TMS) is a Problem Solver module responsible for: 1. Enforcing logical relations among beliefs. 2. Generating explanations for conclusions. 3. Finding solutions to search problems. 4. Supporting default reasoning. 5. Identifying causes for failure and recovering from inconsistencies.

1. Enforcement of logical relations
AI problem -> search. Search utilizes assumptions. Assumptions change. Changing assumptions -> updating the consequences of beliefs.
TMS: a mechanism to maintain and update relations among beliefs.

1. Enforcement of logical relations
Example:
If (cs-501) and (math-218) then (cs-570).
If (cs-570) and (CIT) then (TMS).
If (TMS) then (AI-experience).
The following are the relations among beliefs:
(AI-experience) if (TMS).
(TMS) if (cs-570), (CIT).
(cs-570) if (cs-501), (math-218).
Beliefs are propositional variables. The TMS is a mechanism for processing large collections of logical relations on propositional variables.
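A minimal sketch of this view: the relations above stored as propositional Horn clauses and closed by forward chaining (the helper name is illustrative):

# Minimal sketch: the course-prerequisite relations as Horn clauses.
# Each rule maps a tuple of antecedent propositions to a consequent.
rules = [
    (("cs-501", "math-218"), "cs-570"),
    (("cs-570", "CIT"), "TMS"),
    (("TMS",), "AI-experience"),
]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose antecedents all hold, until a fixpoint."""
    beliefs = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in beliefs and all(a in beliefs for a in antecedents):
                beliefs.add(consequent)
                changed = True
    return beliefs

print(forward_chain({"cs-501", "math-218", "CIT"}, rules))
# the result includes cs-570, TMS and AI-experience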

2. Generation of explanations
Solving problems is what Problem Solvers do. However, solutions alone are often not enough: the PS is also expected to provide an explanation.
The TMS uses cached inferences for that aim. This is efficient: generating a cached inference once is more beneficial than re-running the inference rules that produced it more than once.

2. Generation of explanations
Example:
Q: Shall I have AI experience after completing the CIT program? A: Yes, because of the TMS course.
Q: What do I need to take the TMS course? A: CS-570 and CIT.
Different types of TMS provide different ways of explaining conclusions (JTMS vs. ATMS). In this example, explaining conclusions in terms of their immediate predecessors works much better.

3. Finding solutions to search problems
Color the nodes of the graph (A, B, C, D, E): red (1), green (2), yellow (3). Adjacent nodes must get different colors.
The following set of constraints describes this problem:
A1 or A2 or A3;  B1 or B2 or B3;  C1 or C2 or C3;  D1 or D2 or D3;  E1 or E2 or E3
not (A1 and B1); not (A2 and B2); not (A3 and B3)
not (A1 and C1); not (A2 and C2); not (A3 and C3)
not (B1 and D1); not (B2 and D2); not (B3 and D3)
not (D1 and E1); not (D2 and E2); not (D3 and E3)
not (C1 and E1); not (C2 and E2); not (C3 and E3)
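A small sketch that brute-forces the same problem, assuming the graph edges implied by the constraints above (A–B, A–C, B–D, D–E, C–E):

from itertools import product

colors = ["red", "green", "yellow"]
nodes = ["A", "B", "C", "D", "E"]
edges = [("A", "B"), ("A", "C"), ("B", "D"), ("D", "E"), ("C", "E")]

solutions = []
for assignment in product(colors, repeat=len(nodes)):   # every node gets one color
    coloring = dict(zip(nodes, assignment))
    if all(coloring[u] != coloring[v] for u, v in edges): # adjacent nodes must differ
        solutions.append(coloring)

print(len(solutions), "colorings satisfy the constraints")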

3. Finding solutions to search problems
To find a solution we can use search: [search-tree figure branching on A is red / A is green / A is yellow, then B is green / B is yellow, C is red / C is yellow, D is red / D is green, E is green / E is yellow]

4. Default reasoning and TMS
The PS must make conclusions based on incomplete information.
"Closed-World Assumption" (CWA): X is true unless there is evidence to the contrary.
The CWA helps us limit the underlying search space.
The reasoning scheme that supports the CWA is called "default (or non-monotonic) reasoning".

4. Default reasoning and TMS
Example: consider the following knowledge base:
Bird(tom) ∧ ¬Abnormal(tom) → Can_fly(tom)
Penguin(tom) → Abnormal(tom)
Ostrich(tom) → Abnormal(tom)
Bird(tom)
Under the CWA, we assume ¬Abnormal(tom) and can therefore derive Can_fly(tom).
A non-monotonic TMS supports this type of reasoning.
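A minimal sketch of that default step, assuming a hypothetical string encoding of the rules: Abnormal(tom) is taken to be false unless one of its justifications is present, and Can_fly(tom) then follows.

facts = {"Bird(tom)"}                       # known facts
abnormality_rules = [                        # ways Abnormal(tom) could be derived
    ("Penguin(tom)", "Abnormal(tom)"),
    ("Ostrich(tom)", "Abnormal(tom)"),
]

# Closed-world assumption: Abnormal(tom) is believed false
# unless one of its justifications is actually present.
abnormal = any(premise in facts for premise, _ in abnormality_rules)

if "Bird(tom)" in facts and not abnormal:
    facts.add("Can_fly(tom)")

print(facts)   # Can_fly(tom) is derived; adding Penguin(tom) would block this step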

5. Identifying causes for failures and recovering from inconsistencies
Inconsistencies among beliefs in the KB are always possible: wrong data (example: "Outside temperature is 320 degrees."); impossible constraints (example: Big-house and Cheap-house and Nice-house).
The dependency records the TMS maintains help identify the reason for an inconsistency, and "dependency-directed backtracking" allows the TMS to recover.

TMS applications
Constraint Satisfaction Problems (CSP): a set of variables, a domain for each variable, and constraints between the variables' domains. Goal: find a "solution", an assignment to the variables that satisfies the constraints.
Scenario and planning problems: find a path of state transitions leading from the initial state to the final state (games, strategies). The TMS identifies the applicable rules.

CSP example: allocation problem
Three tasks {t1,t2,t3} (the variables) must be allocated to two hosts {h1,h2} (the domain).
Two constraints:
t1 runs before t2 on the same host;
t1 cannot run on the same host as t3.

CSP example
[Search-tree figure: the first level branches on the six task-host pairs t1-h1, t2-h1, t3-h1, t1-h2, t2-h2, t3-h2; deeper levels extend the partial assignment, e.g. t1-h1, t2-h1 and then t1-h1, t2-h1, t3-h1 or t1-h1, t2-h1, t3-h2, and so on.]
The tree has 48 nodes; 6 are solutions.
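A small sketch that reproduces those counts, assuming the tree branches on which task-host pair to assign next and that "t1 before t2 on the same host" means the two tasks share a host and t1 is assigned first:

from itertools import permutations, product

tasks = ["t1", "t2", "t3"]
hosts = ["h1", "h2"]

leaves = 0
solutions = 0
for order in permutations(tasks):               # order in which tasks are assigned
    for choice in product(hosts, repeat=3):     # host picked at each step
        leaves += 1
        assign = dict(zip(order, choice))
        same_host = assign["t1"] == assign["t2"]
        t1_first = order.index("t1") < order.index("t2")
        separated = assign["t1"] != assign["t3"]
        if same_host and t1_first and separated:
            solutions += 1

print(leaves, solutions)   # 48 leaves in the full tree, 6 of them satisfy the constraints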

Outline
Last lecture: 1. Consistency-based diagnosis 2. GDE – general diagnosis engine 3. Conflict generation using ATMS 4. Candidate generation
Today's lecture: 1. What is TMS 2. TMS architecture 3. Justification-based TMS 4. Assumption-based TMS

Problem Solver Architecture
The TMS / PS relationship is the following: [figure: the Problem Solver passes justifications and assumptions to the TMS; the TMS returns beliefs and contradictions to the Problem Solver]

How do the TMS and the PS communicate?
The PS works with: assertions (facts, beliefs, conclusions, hypotheses), inference rules, and procedures. Each one of these is assigned a TMS node.
Example:
N1: (rule (student ?x) (assert (and (underpaid ?x) (overworked ?x))))
N2: (student Bob)
Given N1 and N2, the PS can infer N3: (and (underpaid Bob) (overworked Bob))
The PS treats nodes as logical formulas, while the TMS treats nodes as propositional variables.

TMS nodes
Different types of TMS support different types of nodes:
Premise nodes – always true.
Contradiction nodes – always false.
Assumption nodes – the PS believes them whether or not they are supported by the existing evidence.
Each node has a label associated with it; the contents and structure of the label depend on the type of TMS.
Other properties are the node type (premise, assumption, etc.), the node's support (justifications, antecedents), the node's consequences, etc.

TMS justifications
If N3 is created by the PS, it is reported to the TMS together with the fact that it follows from N1 and N2:
justification: (N3 ← N1 N2)
Here N3 is called the consequent, and N1 and N2 are the antecedents of the justification.
Justifications record relations among beliefs and are used for explaining consequents and identifying causes of inconsistencies.
The general format of a justification is: (<consequent> <antecedent-1> … <antecedent-n>)

Propositional specification of a TMS
TMS nodes are propositional variables.
TMS justifications are propositional formulas of the form N1 ∧ N2 ∧ … ∧ Ni → Nj.
Here N1, N2, …, Ni, Nj are positive literals, therefore this implication is a Horn formula. The TMS can be viewed as a set of Horn formulas.

PS / TMS interaction
Responsibilities of the PS: 1. Adds assertions and justifications. 2. Makes premises and assumptions. 3. Retracts assumptions. 4. Provides advice on handling contradictions.
Responsibilities of the TMS: 1. Caches beliefs and consequences and maintains labels. 2. Detects contradictions. 3. Performs belief revision. 4. Generates explanations.

Outline
Last lecture: 1. Consistency-based diagnosis 2. GDE – general diagnosis engine 3. Conflict generation using ATMS 4. Candidate generation
Today's lecture: 1. What is TMS 2. TMS architecture 3. Justification-based TMS 4. Assumption-based TMS

Justification-based TMS
Justifications are used for:
Belief update, when the belief state of a node changes.
Handling contradictions: 1. The justification is added to the dependency-directed backtracking system. 2. The dependency network is then searched for the assumptions underlying the contradiction. 3. The contradiction is removed.

Justification-based TMS
A justification contains an inlist and an outlist for the justified node to be believed:
inlist – a set of nodes that must be in;
outlist – a set of nodes that must be out.
Syntax: {(inlist),(outlist)}
Premises hold universally: empty in- and out-lists.
There is only one context; it includes the set of assumptions currently believed.

Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining   {(),(D)}
D: Raining
E: Day   {(),(F)}
F: Night
The PS concludes "nice weather" from A and C.

Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining   {(),(D)}
D: Raining
E: Day   {(),(F)}
F: Night
G: Nice weather   {(A,C),()}
New node in the JTMS.

Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining   {(),(D)}
D: Raining
E: Day   {(),(F)}
F: Night
G: Nice weather   {(A,C),()}
The PS concludes "swim" from E and G.

Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining   {(),(D)}
D: Raining
E: Day   {(),(F)}
F: Night
G: Nice weather   {(A,C),()}
H: Swim   {(E,G),()}

Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining   {(),(D)}
D: Raining
E: Day   {(),(F)}
F: Night
G: Nice weather   {(A,C),()}
H: Swim   {(E,G),()}
I: Contradiction   {(C),()}   (handled by the dependency-directed backtracking system)

Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining   {(),(D)}
D: Raining
E: Day   {(),(F)}
F: Night
G: Nice weather   {(A,C),()}
H: Swim   {(E,G),()}
I: Contradiction   {(C),()}
X: Handle   {(),()}   // premise
D: Raining   {(X),()}
Context: {(A,D,E), (B,C,F,G,H,I)}

Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining   {(),(D)}
D: Raining
E: Day   {(),(F)}
F: Night
G: Nice weather   {(A,C),()}
H: Swim   {(E,G),()}
I: Contradiction   {(C),()}
X: Handle   {(),()}   // premise
D: Raining   {(X),()}
J: Read   {(D,E),()}
K: Contradiction   {(J),()}   // becomes tired (handled by the dependency-directed backtracking system)

Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining   {(),(D)}
D: Raining
E: Day   {(),(F)}
F: Night
G: Nice weather   {(A,C),()}
H: Swim   {(E,G),()}
I: Contradiction   {(C),()}
X: Handle   {(),()}   // premise
D: Raining   {(X),()}
J: Read   {(D,E),()}
K: Contradiction   {(J),()}   // becomes tired
F: Night   {(X),()}
Context: {(A,D,F), (B,C,E,G,H,I,J,K)}

Justification-based TMS – Example
Propositions and justifications:
A: Temperature >= 25   {(),(B)}
B: Temperature < 25
C: Not raining   {(),(D)}
D: Raining
E: Day   {(),(F)}
F: Night
G: Nice weather   {(A,C),()}
H: Swim   {(E,G),()}
I: Contradiction   {(C),()}
X: Handle   {(),()}   // premise
D: Raining   {(X),()}
J: Read   {(D,E),()}
K: Contradiction   {(J),()}   // becomes tired
F: Night   {(X),()}
L: Sleep   {(F),()}
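A rough sketch of how the final labels above can be recomputed (a naive full relabelling sweep, not the incremental algorithm a real JTMS uses; node names follow the example, and X is the premise handle):

# Justifications from the example: node -> list of (inlist, outlist) pairs.
# A node is IN if some justification has all inlist nodes IN and all outlist nodes OUT.
justifications = {
    "X": [((), ())],            # premise: empty in- and out-lists
    "A": [((), ("B",))],        # Temperature >= 25
    "C": [((), ("D",))],        # Not raining
    "E": [((), ("F",))],        # Day
    "D": [(("X",), ())],        # Raining, justified by the premise X
    "F": [(("X",), ())],        # Night, justified by the premise X
    "G": [(("A", "C"), ())],    # Nice weather
    "H": [(("E", "G"), ())],    # Swim
    "I": [(("C",), ())],        # Contradiction (raining vs. not raining)
    "J": [(("D", "E"), ())],    # Read
    "K": [(("J",), ())],        # Contradiction (becomes tired)
    "L": [(("F",), ())],        # Sleep
    "B": [],                    # no justification, so it stays OUT
}

status = {n: False for n in justifications}   # False = OUT, True = IN
changed = True
while changed:                                # naive relabelling sweep;
    changed = False                           # fine for this acyclic network
    for node, justs in justifications.items():
        new = any(all(status[i] for i in inl) and not any(status[o] for o in outl)
                  for inl, outl in justs)
        if new != status[node]:
            status[node], changed = new, True

print(sorted(n for n, s in status.items() if s))   # ['A', 'D', 'F', 'L', 'X']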

Outline
Last lecture: 1. Consistency-based diagnosis 2. GDE – general diagnosis engine 3. Conflict generation using ATMS 4. Candidate generation
Today's lecture: 1. What is TMS 2. TMS architecture 3. Justification-based TMS 4. Assumption-based TMS

Assumption-based TMS: Motivation  Problem solvers need to explore multiple contexts at the same time, instead of a single one (the JTMS case) Alternate diagnoses of a broken system Different design choices Competing theories to explain a set of data  Problem solvers need to compare contexts switching from one context to another. In JTMS, this can be done by enabling and retracting assumptions. In ATMS, alternative contexts are explicitly stored.

The idea behind ATMS
The assumptions underlying conclusions are important in problem solving: solutions can be described as sets of assumptions, and states of the world can be represented by sets of assumptions.
Identify sets of assumptions, called environments here, and organize the problem solver around manipulating environments. This facilitates reasoning with multiple hypotheses.

Assumptions and Justifications
The ATMS keeps and manipulates sets of assumptions rather than sets of beliefs.
Three types of nodes:
Premise nodes – always true, but of no special interest to the ATMS.
Assumption nodes – once made, assumptions are never retracted.
Contradictions – defined by means of the sets of assumptions that originate them; such sets of assumptions are called nogoods.
ATMS justifications are Horn formulas of the form Jk: I1 ∧ I2 ∧ … ∧ In → Ck, where I1, I2, …, In are the antecedents and Ck is the consequent of justification Jk.

Basic ATMS terminology
The ATMS answers queries about whether a node holds in a given set of beliefs.
Definition. A set of assumptions upon which a given node depends is called an environment. Example: {A,B,C}
Definition. A label is a set of environments. Example: {{A,B,C}, …, {D,F}} That is, the label records the assumptions upon which the node ultimately depends – a major difference from the JTMS, where labels are simply :IN or :OUT.
Definition. An ATMS node Nk is a triple ⟨datum, label, justifications⟩.
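One possible in-memory rendering of that triple (a sketch; the field names are illustrative):

from dataclasses import dataclass, field

Environment = frozenset          # a set of assumptions, e.g. frozenset({"A", "B", "C"})

@dataclass
class ATMSNode:
    datum: str                                        # the PS assertion this node stands for
    label: set = field(default_factory=set)           # set of Environments the node holds in
    justifications: list = field(default_factory=list)  # each entry: the antecedent nodes

node = ATMSNode("Nice weather", label={Environment({"A", "C"})})
print(node)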

Basic ATMS terminology
Definition. A node n holds in a given environment E iff it can be derived from E given the set of justifications J: E, J ⊢ n. An environment is inconsistent if false is derived: E, J ⊢ ⊥.
Definition. Let E be a (consistent) environment and N the set of nodes derived from E. Then E ∪ N is called the context of E.
Definition. A characterizing environment is a minimal consistent environment from which a context can be derived. Each context is completely specified by its characterizing environment.

ATMS efficiency
The ATMS is provided with a set of assumptions and justifications. The task of the ATMS is to determine the contexts efficiently: it incrementally updates only the changed contexts, and its data structures make context-consistency checking and node inclusion very fast.

Relations between environments
Because reasoning from environments is monotonic, set inclusion between environments implies logical subsumption of consequences.
Example: E1 = {C}, E2 = {C, D}, E3 = {D, E}.
E1 subsumes E2; E2 is subsumed by E1; E1 neither subsumes nor is subsumed by E3.

How the ATMS answers queries
How does the ATMS answer queries about whether a node holds in a given environment?
Easiest way: associate with each node all of the environments in which it holds.
Better way: record only those environments which satisfy the following four properties:
1. Soundness: the node holds in every environment associated with it.
2. Consistency: no environment in the label is a nogood.
3. Completeness: every consistent environment in which the node holds is a superset of some environment in the label.
4. Minimality: no environment in the label is a subset of any other.
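A small sketch of enforcing consistency and minimality on a label, assuming nogoods are given as sets of assumptions (the environments below are the ones used in the lattice example later on):

def prune_label(environments, nogoods):
    """Drop inconsistent environments (supersets of a nogood) and non-minimal ones."""
    envs = [frozenset(e) for e in environments]
    bad = [frozenset(n) for n in nogoods]
    consistent = [e for e in envs if not any(ng <= e for ng in bad)]
    return {e for e in consistent
            if not any(other < e for other in consistent)}   # keep only minimal sets

label = [{"A", "B", "C"}, {"A", "B", "C", "D"}, {"A", "B", "D", "E"}]
print(prune_label(label, nogoods=[{"A", "B", "E"}]))   # only {A, B, C} survives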

Example dependency network:
Is H believed? Yes, because its label is non-empty.
Is H believed under {B, C, D, Z, X}? Yes, because {B, C, D} ⊆ {B, C, D, Z, X}.
Is H believed under {C, D}? No.
[Figure: dependency network with ATMS labels such as {{B, C}}, {{C, D}} and {{A},{B, C, D}} attached to its nodes (among them D, C, F, G and H).]
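The subset test behind those answers, as a short check, assuming (consistently with the answers above) that H's label is {{A},{B,C,D}}:

def believed_under(label, environment):
    """A node holds in `environment` if some label environment is a subset of it."""
    return any(e <= frozenset(environment) for e in label)

label_H = {frozenset({"A"}), frozenset({"B", "C", "D"})}
print(believed_under(label_H, {"B", "C", "D", "Z", "X"}))   # True
print(believed_under(label_H, {"C", "D"}))                  # False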

Contradictions
Certain nodes can be declared as contradictions.
Every environment from which a contradiction can be derived is inconsistent.
Inconsistent environments are called nogoods.
[Figure: example with nodes F and G and the nogood environments {B, C} and {A, B, C}.]

Special labels in ATMS
Case 1: Label = { } (the empty label). There is no known consistent environment in which the node is believed, i.e. either there is no path from the assumptions to it, or all environments for it are inconsistent.
Case 2: Label = {{}} (the empty environment). The node is believed in every consistent environment, i.e. the node is either a premise or can be derived strictly from premises.

Label propagation [figure: a dependency network over nodes R, C, D, G and L]

Label propagation: enable A [figure: node labels after enabling assumption A]

Label propagation: enable B [figure: node labels after enabling assumption B]

Label propagation: enable C [figure: node labels after enabling assumption C; environments such as {C} and {B,C} appear]

Label propagation: enable D [figure: node labels after enabling assumption D; environments such as {C}, {D} and {B,C} appear]

Example: datum, justifications, environments. For a worked example, see Franz Wotawa's slides, pages 7-8.

Properties of ATMS

Environment Lattice

Comments on the lattice
If an environment is a nogood, then all of its superset environments are nogoods as well. Here all nogoods are the result of the nogood {A, B, E}.
The ATMS associates every datum with its contexts. If a datum is in a context, then it is in every superset of that context as well (the inconsistent supersets are ignored).
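A quick check of that closure property, assuming the five assumptions A–E used in the surrounding lattice example:

from itertools import combinations

assumptions = ["A", "B", "C", "D", "E"]
nogood = frozenset({"A", "B", "E"})

# Every environment (subset of the assumptions) that contains a nogood is itself nogood.
environments = [frozenset(c) for r in range(len(assumptions) + 1)
                for c in combinations(assumptions, r)]
inconsistent = [e for e in environments if nogood <= e]

print(len(environments), "environments,", len(inconsistent), "of them nogood")
# 32 environments, 4 of them nogood: {A,B,E}, {A,B,C,E}, {A,B,D,E}, {A,B,C,D,E}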

Comments on the lattice
The circled nodes indicate all the contexts of one of the two antecedents below, and the square nodes all the contexts of the other.
If the PS infers y=0 from x+y=1 and x=1, then the context for y=0 is the intersection of the contexts of those two nodes.

Comments to lattice  One sound and complete label for the consequent is the set whose elements are the union of all possible combinations of picking one environment from each antecedent node label. Thus one sound and complete label is:  The environment {A, B, C, D} is removed because it is subsumed by {A, B, C}.  The environment {A, B, D, E} is not included because it is a superset of the inconsistent {A, B, E}.

ATMS algorithms
Logical specification of the ATMS: the ATMS does propositional reasoning over nodes; ATMS justifications are Horn clauses; contradictions are characterized by nogoods.
Every ATMS operation which changes a node label can be viewed as adding a justification, so the only operation we need to be concerned with here is label update as a result of adding a justification.

ATMS algorithms
Step 1: Compute a tentative new (locally correct) label for the affected node. Let J_ik be the label of the i-th antecedent node of the k-th justification for the consequent node n; then a complete label for node n is
L_new = ∪_k { x | x = ∪_i x_i, where x_i ∈ J_ik }
Step 2: All nogoods and subsumed environments are removed from L_new to achieve global correctness.

Propagating label changes
To update node Ni, compute its new label as described above.
If the label has not changed: done.
Otherwise, if Ni is a contradiction node: mark all environments in its label as nogoods, then check every node label in the network for environments marked as nogoods and remove them.
Otherwise: recursively update all of Ni's consequences (the other nodes having justifications which mention Ni).
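A rough sketch of this update cycle (simplified: a single contradiction node, no incremental bookkeeping; names such as update and "FALSE" are illustrative):

from itertools import product

def combine(antecedent_labels):
    """Step 1: tentative label = unions of one environment picked from each antecedent label."""
    return {frozenset().union(*choice) for choice in product(*antecedent_labels)}

def prune(label, nogoods):
    """Step 2: drop inconsistent (superset-of-nogood) and subsumed (non-minimal) environments."""
    ok = [e for e in label if not any(ng <= e for ng in nogoods)]
    return {e for e in ok if not any(o < e for o in ok)}

def update(node, labels, justs, nogoods, consequences):
    """Recompute `node`'s label from its justifications and propagate if it changed.

    labels: node -> set of frozenset environments (assumptions start as {{themselves}})
    justs: node -> list of antecedent tuples; consequences: node -> dependent nodes
    """
    new = set()
    for antecedents in justs.get(node, []):            # each justification contributes
        new |= combine([labels[a] for a in antecedents])
    new = prune(new, nogoods)
    if new == labels[node]:
        return                                         # nothing to propagate
    labels[node] = new
    if node == "FALSE":                                # contradiction node
        nogoods |= new                                 # its environments become nogoods
        for n in labels:                               # purge them from every label
            labels[n] = prune(labels[n], nogoods)
    else:
        for child in consequences.get(node, []):
            update(child, labels, justs, nogoods, consequences)

Adding a justification then amounts to extending justs for the consequent (and consequences for its antecedents) and calling update on the consequent.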

Example
Assumptions (proposition, label, justifications):
A: Temperature >= 25   {{A}}   {(A)}
B: Temperature < 25    {{B}}   {(B)}
C: Not Raining         {{C}}   {(C)}
D: Day                 {{D}}   {(D)}
Derived facts: E: Nice weather, F: Swim, G: Read, H: Sleep
Rules:
1. A and C → E
2. E and D → F
3. D and out(C) → G
4. out(D) → H
5. A and B → ⊥

Example – empty environment
A: Temperature >= 25   {{A}}   {(A)}
B: Temperature < 25    {{B}}   {(B)}
C: Not Raining         {{C}}   {(C)}
D: Day                 {{D}}   {(D)}
H: Sleep   {{Out(D)}}   {(Out(D))}
The problem solver applies a breadth-first search strategy to supply the assumptions: the empty environment is provided first, then {A}, {B}, …, {A,B}, {A,C}, …, {A,B,C}, etc. The rules are fired by the PS as justifications for the assumptions. The empty environment causes the PS to provide out(D) as the justification for H.

Example – {D}
A: Temperature >= 25   {{A}}   {(A)}
B: Temperature < 25    {{B}}   {(B)}
C: Not Raining         {{C}}   {(C)}
D: Day                 {{D}}   {(D)}
H: Sleep   {{Out(D)}}   {(Out(D))}
G: Read   {{D,Out(C)}}   {(D,Out(C))}
The environments {A}, {B} and {C} did not change anything.

Example – {A,B}
A: Temperature >= 25   {{A}}   {(A)}
B: Temperature < 25    {{B}}   {(B)}
C: Not Raining         {{C}}   {(C)}
D: Day                 {{D}}   {(D)}
H: Sleep   {{Out(D)}}   {(Out(D))}
G: Read   {{D,Out(C)}}   {(D,Out(C))}
⊥: Contradiction   {{A,B}}   {(A,B)}
The PS fires the fifth rule (A and B → ⊥); the ATMS adds this environment to the nogood database.

Example – {A,C}
A: Temperature >= 25   {{A}}   {(A)}
B: Temperature < 25    {{B}}   {(B)}
C: Not Raining         {{C}}   {(C)}
D: Day                 {{D}}   {(D)}
H: Sleep   {{Out(D)}}   {(Out(D))}
G: Read   {{D,Out(C)}}   {(D,Out(C))}
E: Nice weather   {{A,C}}   {(A,C)}
The PS fires the first rule (A and C → E). {A,D} and {B,D} could fire the third rule, but it has already been fired.

Example – {A,C,D}
A: Temperature >= 25   {{A}}   {(A)}
B: Temperature < 25    {{B}}   {(B)}
C: Not Raining         {{C}}   {(C)}
D: Day                 {{D}}   {(D)}
H: Sleep   {{Out(D)}}   {(Out(D))}
G: Read   {{D,Out(C)}}   {(D,Out(C))}
E: Nice weather   {{A,C}}   {(A,C)}
F: Swim   {{A,C,D}}   {(E,D)}
The PS fires the second rule (E and D → F). {A,B,C}, {A,B,D} and {A,B,C,D} are supersets of {A,B} and so are not used. If Label(E) = {{A,C},{X,Y}} then Label(F) = {{A,C,D},{X,Y,D}}.
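The label computation stated in that last line, checked directly ({X, Y} is the slide's placeholder pair of assumptions):

label_E = [{"A", "C"}, {"X", "Y"}]
label_D = [{"D"}]
label_F = {frozenset(e | d) for e in label_E for d in label_D}   # union of one pick per label
print(label_F)   # {frozenset({'A','C','D'}), frozenset({'D','X','Y'})}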

Example – environment lattice
When does sleep (H) hold? [Lattice figure: squares mark the environments where H holds, circles where G holds, rhombi where F holds.]

Back to Diagnosis…
[Figure: the polybox circuit with multipliers M1, M2, M3 and adders A1, A2.]
Values with empty labels (premises/observations): A=3 {{}}, B=2 {{}}, C=2 {{}}, D=3 {{}}, E=3 {{}}, F=10 {{}}, G=12 {{}}
Predicted values and their ATMS labels: x=6 {{M1}}, y=6 {{M2},{A2,M3}}, z=6 {{M3},{A2,M2}}, x=4 {{A1,M2},{A1,A2,M3}}, y=4 {{A1,M1}}, z=8 {{A1,A2,M1}}, F=12 {{A1,M1,M2}}, G=12 {{A2,M2,M3}}, G=10 {{A1,M1,M3,A2}}
NOGOODS: {A1,M1,M2}, {A1,M1,M3,A2}

Bibliography
1. Kenneth D. Forbus and Johan de Kleer, Building Problem Solvers, The MIT Press, 1993.
2. Johan de Kleer, An assumption-based truth maintenance system, Artificial Intelligence 28, 1986.
3. Johan de Kleer, Problem Solving with the ATMS, Artificial Intelligence 28, 1986.
4. Johan de Kleer, Extending the ATMS, Artificial Intelligence 28, 1986.
5. Mladen Stanojevic, Sanja Vranes and Dusan Velasevic, Using Truth Maintenance Systems: A Tutorial, IEEE Expert: Intelligent Systems and Their Applications 9(6), 45-56, 1994.