Assumption-based Truth Maintenance Systems: Motivation

- Problem solvers need to explore multiple contexts at the same time, instead of a single one (the JTMS case):
  - alternate diagnoses of a broken system
  - different design choices
  - competing theories to explain a set of data
- Problem solvers often need to compare contexts, rapidly switching from one context to another. In a JTMS, this is done by enabling and retracting assumptions; in an ATMS, re-labeling is avoided because alternative contexts are stored explicitly.
- ATMS contexts are monotonic.

The idea behind ATMS

- The assumptions underlying conclusions are important in problem solving:
  - solutions can be concisely described as sets of assumptions
  - states of the world can be represented by sets of assumptions
  - theories can be represented by sets of assumptions
- Identify sets of assumptions, called here environments:
  - organize the problem solver around manipulating environments
  - this facilitates reasoning with multiple hypotheses

The ATMS keeps and manipulates the sets of assumptions that ultimately support a given belief, rather than the sets of beliefs that directly support it. It works with three types of nodes:
1. Premise nodes. These are always true, but they are of no special interest to the ATMS.
2. Assumption nodes. These extend the incomplete description of the domain in different possible ways. Once made, assumptions are never retracted.
3. Contradictions. These are characterized by the sets of assumptions that give rise to them; such sets of assumptions are called nogoods.

ATMS justifications are Horn formulas of the form
  J_k: I_1, I_2, …, I_n → C_k,
where I_1, I_2, …, I_n are the antecedents and C_k is the consequent of justification J_k.

Basic ATMS terminology

As in a JTMS, the primary task of the ATMS is to answer queries about whether a node holds in a given set of beliefs. The following definitions set up the ATMS terminology needed to address this task.

Definition. A set of assumptions upon which a given node depends is called an environment. Example: {A, B, C}

Definition. A label is a set of environments. Example: {{A, B, C}, …, {D, F}}
That is, the label records the assumptions upon which the node ultimately depends. This is a major difference from a JTMS label, which is simply :IN or :OUT.

Definition. An ATMS node, N_k, is a triplet consisting of a datum (the assertion it represents), its label, and its justifications.
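
As an illustration only, here is a minimal sketch of one way to represent these objects in Python, assuming environments are frozensets of assumption names and labels are sets of environments; the class and field names are the sketch's own choices, not a fixed ATMS API.

from dataclasses import dataclass, field

@dataclass
class Justification:
    antecedents: tuple      # names of the antecedent nodes I_1, ..., I_n
    consequent: str         # name of the consequent node C_k

@dataclass
class Node:
    datum: str                                          # the assertion this node stands for
    label: set = field(default_factory=set)             # set of environments (frozensets)
    justifications: list = field(default_factory=list)  # justifications with this node as consequent
    kind: str = "derived"                                # "premise", "assumption", "contradiction" or "derived"

# An assumption node depends exactly on itself; a premise node holds
# under the empty environment.
A = Node("A", label={frozenset({"A"})}, kind="assumption")
P = Node("P", label={frozenset()}, kind="premise")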

Basic ATMS terminology (cont.)

Definition. A node N_k holds in a given environment E iff it is propositionally derivable from E given the set of justifications J.

Definition. Let E be a (consistent) environment and N be the set of nodes propositionally derivable from E. Then E ∪ N is called the context of E.

Definition. A characterizing environment of a context is a set of assumptions under which every node belonging to that context holds. Each context is completely specified by its characterizing environment. This is why the question of whether a node holds in a given set of beliefs (or context) reduces to deciding whether the node holds in a given environment.
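
One way to make "propositionally derivable from E" concrete is simple forward chaining over the Horn justifications. The following sketch is illustrative only; here a justification is simplified to an (antecedents, consequent) pair of node names.

def context_of(environment, justifications, premises=frozenset()):
    """Return the context of E: the assumptions of E plus every node
    derivable from them (and from the premises) via the justifications."""
    derived = set(environment) | set(premises)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in justifications:
            if consequent not in derived and all(a in derived for a in antecedents):
                derived.add(consequent)
                changed = True
    return derived

# Example: with justifications [({"A", "B"}, "F"), ({"F"}, "G")],
# the context of {"A", "B"} is {"A", "B", "F", "G"}.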

Relations between environments

Because inference in the ATMS is monotonic, set inclusion between environments implies logical subsumption of their consequences: everything derivable from an environment is also derivable from any of its supersets.

Example: E1 = {C}, E2 = {C, D}, E3 = {D, E}
- E1 subsumes E2 (E1 ⊆ E2)
- E2 is subsumed by E1
- E1 neither subsumes nor is subsumed by E3
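
In the frozenset representation, subsumption is just a subset test (illustrative sketch):

E1, E2, E3 = frozenset({"C"}), frozenset({"C", "D"}), frozenset({"D", "E"})

def subsumes(e_general, e_specific):
    # The smaller environment subsumes the larger one: whatever holds
    # under e_general also holds under any of its supersets.
    return e_general <= e_specific

assert subsumes(E1, E2)                                  # E1 subsumes E2
assert not subsumes(E1, E3) and not subsumes(E3, E1)     # E1 and E3 are incomparable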

How does the ATMS answer queries about whether a node holds in a given environment?

- The easiest way: associate with each node all of the environments in which the node holds. However, if a node holds universally (that is, in all possible environments, of which there can be up to 2^n, where n is the number of assumptions), this results in a huge data structure attached to the node.
- The better way: because the ATMS is monotonic, we can record only those environments which satisfy the following four properties:
1. Soundness, i.e. the node holds in every environment associated with it.
2. Consistency, i.e. no environment in the label is a nogood.
3. Completeness, i.e. every consistent environment in which the node holds is either in the label or is a superset of some environment in the label.
4. Minimality, i.e. no environment in the label is a proper subset of any other.
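
A minimal sketch of how the consistency and minimality properties can be enforced on a candidate label, assuming nogoods is the set of known inconsistent environments (the function name and signature are illustrative):

def prune_label(candidate_envs, nogoods):
    """Keep only consistent, minimal environments."""
    # Consistency: drop any environment that contains a known nogood.
    consistent = [e for e in candidate_envs
                  if not any(ng <= e for ng in nogoods)]
    # Minimality: drop any environment that is a proper superset of another.
    minimal = {e for e in consistent
               if not any(other < e for other in consistent)}
    return minimal

# Example: with nogoods = {frozenset({"A", "B"})}, the candidate label
# {{A, B, C}, {C}, {C, D}} is pruned to {{C}}.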

ATMS labels: more complex than JTMS labels

Examples. Consider a dependency network (figure omitted) in which the intermediate nodes F and G carry the labels {{B, C}} and {{C, D}}, and the node H carries the label {{A}, {B, C, D}}.
- Is H believed? Yes, because its label is non-empty.
- Is H believed under {B, C, D, Z, X}? Yes, because {B, C, D} ⊆ {B, C, D, Z, X}.
- Is H believed under {C, D}? No: neither {A} nor {B, C, D} is a subset of {C, D}.
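
These queries reduce to a subset test against the label; a short illustrative helper:

def believed_under(label, environment):
    """A node holds in `environment` iff some environment in its label
    is a subset of `environment`."""
    return any(env <= environment for env in label)

label_H = {frozenset({"A"}), frozenset({"B", "C", "D"})}
assert believed_under(label_H, frozenset({"B", "C", "D", "Z", "X"}))   # yes
assert not believed_under(label_H, frozenset({"C", "D"}))              # no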

Contradictions

As in a JTMS, certain nodes can be declared contradictions. In the ATMS this means that every environment under which a contradiction would be believed is inconsistent. Inconsistent environments are called nogoods.

Example (figure omitted): if a contradiction node is supported by the environments {B, C} and {A, B, C}, those environments become nogoods.

There are two special labels in the ATMS:

Case 1: Label = { } (the empty label). There is no known consistent environment in which the node is believed, i.e. either there is no path from the assumptions to it, or all environments for it are inconsistent.

Case 2: Label = {{ }} (the label containing only the empty environment). The node is believed in every consistent environment, i.e. the node is either a premise or can be derived strictly from premises.
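
In the frozenset representation used above, the two special labels look like this (illustrative only):

EMPTY_LABEL = set()              # Case 1: no known consistent environment supports the node
UNIVERSAL_LABEL = {frozenset()}  # Case 2: the empty environment only

def holds_universally(label):
    # The empty environment is a subset of every environment, so a label
    # containing it makes the node hold in every consistent environment.
    return frozenset() in label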

Label propagation (worked example; figures omitted): a small dependency network over nodes R, C, D, G and L is used, and the assumptions A, B, C and D are enabled one at a time, with the labels of the affected nodes updated after each step.

ATMS algorithms

Logical specification of the ATMS:
- The ATMS does propositional reasoning over nodes.
- ATMS justifications are Horn clauses.
- Contradictions are characterized by nogoods.

Every ATMS operation which changes a node label can be viewed as adding a justification, so the only operation we need to be concerned with is the label update that results from adding a justification. This operation is carried out in two steps:

Step 1: Compute a tentative new (locally correct) label for the affected node:
  L_new = ∪_k { x | x = x_1 ∪ x_2 ∪ … ∪ x_n, where x_i ∈ L_ik },
where L_ik is the label of the i-th antecedent of justification J_k.

Step 2: Remove all nogoods and subsumed environments from L_new to achieve global correctness.
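
A sketch of the two steps, assuming each justification contributes the labels of its antecedents as a list of sets of frozensets; function names are illustrative:

from itertools import product

def tentative_label(justification_antecedent_labels):
    """Step 1: for every justification, each way of picking one environment
    per antecedent, unioned together, is a candidate environment."""
    candidates = set()
    for antecedent_labels in justification_antecedent_labels:
        for choice in product(*antecedent_labels):
            candidates.add(frozenset().union(*choice))
    return candidates

def updated_label(justification_antecedent_labels, nogoods):
    """Step 2: drop nogood and subsumed (non-minimal) environments."""
    cands = tentative_label(justification_antecedent_labels)
    cands = {e for e in cands if not any(ng <= e for ng in nogoods)}
    return {e for e in cands if not any(other < e for other in cands)}

# Example: a single justification whose antecedents carry the labels
# {{A}} and {{B}, {A, C}} contributes the tentative label {{A, B}, {A, C}}.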

Propagating label changes

1. To update node N_i, compute its new label as described above.
2. If the label has not changed, stop.
3. If N_i is a contradiction node:
   a. Mark all environments in its label as nogoods.
   b. For every node in the network, check its label for environments newly marked as nogoods and remove them.
4. If N_i is not a contradiction node, recursively update all of its consequences.

This algorithm is guaranteed to terminate with correct labels for all of the nodes in the network. However, it is inefficient, because it constantly recomputes node labels.
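
A compact, self-contained sketch of this propagation loop, using the simplified representation from the earlier sketches (node names, (antecedents, consequent) justifications, labels as sets of frozensets). It recomputes whole labels, which mirrors the inefficiency noted above; this is an illustration, not de Kleer's optimized algorithm.

from itertools import product

def propagate(node, labels, justifications, nogoods, contradictions):
    """Recompute `node`'s label from its incoming justifications, then
    propagate any change to the nodes it justifies."""
    new_label = set()
    for antecedents, consequent in justifications:
        if consequent != node:
            continue
        for choice in product(*(labels[a] for a in antecedents)):
            new_label.add(frozenset().union(*choice))
    # prune nogood and subsumed environments
    new_label = {e for e in new_label if not any(ng <= e for ng in nogoods)}
    new_label = {e for e in new_label if not any(o < e for o in new_label)}
    if new_label == labels[node]:
        return                               # label unchanged: stop
    labels[node] = new_label
    if node in contradictions:
        nogoods |= new_label                 # its environments become nogoods
        for n in labels:                     # strip new nogoods from every label
            labels[n] = {e for e in labels[n] if not any(ng <= e for ng in nogoods)}
    else:
        for antecedents, consequent in justifications:
            if node in antecedents:          # recursively update consequences
                propagate(consequent, labels, justifications, nogoods, contradictions)

# Usage sketch: assumptions A and B, justifications A,B -> F and F -> G.
labels = {"A": {frozenset({"A"})}, "B": {frozenset({"B"})}, "F": set(), "G": set()}
justs = [(("A", "B"), "F"), (("F",), "G")]
propagate("F", labels, justs, set(), set())
# labels["F"] == labels["G"] == {frozenset({"A", "B"})}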

Constructing solutions

The ATMS can answer a variety of questions:
1. Does a node hold in a given context?
2. Is the context generated by a given set of assumptions consistent?
3. Why does a node hold in a given context?
4. Which assumptions underlie a given node?

What exactly is expected from the ATMS (the solution) depends on the task the system is intended to solve. This marks a major difference between JTMS and ATMS:
- In the single-context JTMS, a solution is a contradiction-free state of the TMS which contains beliefs of particular types. Search procedures push the TMS into appropriate states, and the answers are read out of the database.
- In the multiple-context ATMS, a solution is an environment that supports beliefs of particular types. We must ensure that such environments are created somehow.

Constructing solutions by using goal nodes

If a problem-solving task can be solved by using choice sets (i.e. the solution is generated by picking one choice from each choice set), then the solution can be read off the label of a single goal node. The algorithm for generating the solution becomes:
1. Construct a node (or nodes) whose label constitutes the valid solutions.
2. Solve the problem by:
   i. building the appropriate dependency structure (including nogoods),
   ii. enabling assumptions,
   iii. reading off the label of the goal node (see the sketch below).
3. Optimizations:
   i. interleave assumption-making with building the dependency network,
   ii. use multiple goal nodes to decompose the search space.
See textbook, example p. 437.
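
An illustrative sketch of the choice-set idea. Here the goal label is computed by direct enumeration rather than by a full ATMS, and the choice sets and nogood are hypothetical:

from itertools import product

# Hypothetical task: pick one value from each choice set; one pairing is forbidden.
choice_sets = [("a1", "a2"), ("b1", "b2")]
nogoods = {frozenset({"a1", "b1"})}

# The goal node is justified by one choice per set, so its label contains
# every consistent way of picking one assumption from each choice set.
goal_label = set()
for combo in product(*choice_sets):
    env = frozenset(combo)
    if not any(ng <= env for ng in nogoods):
        goal_label.add(env)

assert goal_label == {frozenset({"a1", "b2"}),
                      frozenset({"a2", "b1"}),
                      frozenset({"a2", "b2"})}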

Applications of ATMS

In class, we shall discuss the following two applications of the ATMS:

- Model-based diagnosis. Diagnosis in general aims to identify the cause (or causes) of a system failure, which is signaled by one or more failure symptoms. The traditional approach to diagnosis is the rule-based approach, where rules describe the relationship between symptoms and the underlying causes. The major problem with this easy-to-implement approach is that the actual cause of the failure may not be explicitly represented in the KB, in which case the system will either fail to reach a conclusion or will reach a wrong one. Model-based diagnosis provides an alternative, much more comprehensive approach to building diagnostic systems. (Notes to be distributed in class.)

- Constraint satisfaction. Given a finite set of variables, the possible values of those variables, and a set of constraints on those variables, the constraint satisfaction problem (CSP) consists of assigning to each variable a value from its domain so that all of the constraints are satisfied. The traditional solution relies on generate-and-test search and chronological backtracking; because of the complexity of this type of search, it is practical only for small problems. The ATMS approach to the CSP allows the problem solver to keep track of constraints that have already been evaluated, thus improving the efficiency of the search. (Notes to be distributed in class.)