Concurrent Reasoning with Inference Graphs
Daniel R. Schlegel and Stuart C. Shapiro
Department of Computer Science and Engineering
University at Buffalo, The State University of New York


Concurrent Reasoning with Inference Graphs
Daniel R. Schlegel and Stuart C. Shapiro
Department of Computer Science and Engineering, University at Buffalo, The State University of New York, Buffalo, New York

Problem Statement
– The rise of multi-core computers
– The lack of concurrent natural deduction systems
– A motivating application: natural language understanding for terrorist plot detection

What are Inference Graphs?
Graphs for natural deduction:
– Four types of inference: forward, backward, bi-directional, and focused
– Retain derived formulas for later re-use
– Propagate disbelief
– Built upon propositional graphs
Take advantage of multi-core computers:
– Concurrency and scheduling
– Near-linear speedup

Propositional Graphs
A directed acyclic graph in which every well-formed expression is a node:
– Individual constants
– Functional terms
– Atomic formulas
– Non-atomic formulas ("rules")
Each node has an identifier, either a symbol or wfti[!]. No two nodes have the same identifier.
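The "no two nodes with the same identifier" invariant can be sketched as a lookup table keyed by identifier. This is a minimal illustration, not the authors' implementation; the names `Node` and `PropositionalGraph` are assumptions.

```python
# Hypothetical sketch of a propositional graph with unique node identifiers.
# Class and method names are illustrative, not from the talk.

class Node:
    def __init__(self, identifier, asserted=False):
        self.identifier = identifier      # e.g. "a" or "wft1"
        self.asserted = asserted          # the "!" suffix marks asserted formulas
        self.out_edges = {}               # relation name -> list of target nodes

class PropositionalGraph:
    def __init__(self):
        self._nodes = {}                  # identifier -> Node (no duplicates)

    def get_or_create(self, identifier):
        # Enforce "no two nodes with the same identifier":
        # reuse the existing node if one exists.
        if identifier not in self._nodes:
            self._nodes[identifier] = Node(identifier)
        return self._nodes[identifier]

g = PropositionalGraph()
a1 = g.get_or_create("a")
a2 = g.get_or_create("a")
assert a1 is a2                           # same expression, same node
```

Because identical expressions map to the same node, derived formulas can be found and re-used later rather than re-derived.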

Propositional Graphs
[Figure: the rule "if a and b are true, then c is true" as a graph. The rule node wft1! has and-ant arcs from a and b, and a cq arc to c.]

Inference Graphs
Inference graphs extend propositional graphs by adding channels for information flow (messages):
– i-channels report the truth of an antecedent to a rule node.
– u-channels report the truth of a consequent from a rule node.
Channels contain valves, which hold messages back or allow them through.
[Figure: the wft1! graph from the previous slide, with an i-channel from a to wft1! and a u-channel from wft1! to c.]
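The valve behavior described above can be sketched as follows. This is a simplified illustration under assumed names (`Channel`, `send`, `open_valve`), not the authors' API:

```python
# Illustrative sketch of a channel whose valve holds messages back
# until backward inference opens it.
from collections import deque

class Channel:
    def __init__(self):
        self.valve_open = False
        self._waiting = deque()           # messages held back at the valve

    def send(self, message):
        # A message passes through only when the valve is open;
        # otherwise it waits at the valve.
        if self.valve_open:
            return [message]
        self._waiting.append(message)
        return []

    def open_valve(self):
        # Opening the valve (e.g. on BACKWARD-INFER) releases held messages.
        self.valve_open = True
        released = list(self._waiting)
        self._waiting.clear()
        return released

ch = Channel()
assert ch.send("a : true") == []          # valve closed: message held back
assert ch.open_valve() == ["a : true"]    # opening the valve releases it
assert ch.send("b : true") == ["b : true"]
```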

Messages
Five kinds:
– I-INFER: "I've been inferred."
– U-INFER: "You've been inferred."
– BACKWARD-INFER: "Open valves so I might be inferred."
– CANCEL-INFER: "Stop trying to infer me (close valves)."
– UNASSERT: "I'm no longer believed."

Priorities
Messages have priorities:
– UNASSERT is top priority.
– CANCEL-INFER is next.
– I-INFER and U-INFER are higher priority the closer they are to a result.
– BACKWARD-INFER is lowest.
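The ordering above can be sketched with a priority queue. The numeric ranks here are an assumption for illustration (in particular, this sketch gives all I-INFER/U-INFER messages one fixed rank, whereas the actual system raises their priority the closer they are to a result):

```python
import heapq

# Illustrative priority ranks (lower number = served first); not from the talk.
PRIORITY = {"UNASSERT": 0, "CANCEL-INFER": 1, "I-INFER": 2,
            "U-INFER": 2, "BACKWARD-INFER": 3}

queue = []
arrivals = ["BACKWARD-INFER", "I-INFER", "UNASSERT", "CANCEL-INFER"]
for i, kind in enumerate(arrivals):
    # The arrival index i breaks ties, so equal-priority messages stay FIFO.
    heapq.heappush(queue, (PRIORITY[kind], i, kind))

order = [heapq.heappop(queue)[2] for _ in range(len(queue))]
assert order == ["UNASSERT", "CANCEL-INFER", "I-INFER", "BACKWARD-INFER"]
```

Serving UNASSERT and CANCEL-INFER first means that retracting beliefs and pruning unneeded work always pre-empts starting new inference.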

Rule Node Inference
1. A message arrives at the node.
Assume a KB containing a ^ b -> c, and b. Then a is asserted with forward inference, and an i-infer message "a : true" is sent from a to wft1.
[Figure: nodes a!, b!, c, and rule node wft1!, with the message travelling along the i-channel from a.]

Rule Node Inference
2. The message is translated into Rule Use Information (RUI).
The message "a : true" becomes the RUI (1 positive antecedent {a}; 0 negative antecedents; 2 total antecedents). Rule Use Information is stored in rule nodes to be combined later with others that arrive.

Rule Node Inference
3. Combine the RUI with any existing ones.
The RUI for a (1 positive antecedent {a}; 0 negative; 2 total) is combined with the one which already exists in wft1 for b (1 positive antecedent {b}; 0 negative; 2 total), yielding (2 positive antecedents {a, b}; 0 negative; 2 total).

Rule Node Inference
4. Determine whether the rule can fire.
We have two positive antecedents, and we need two: the rule can fire.
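Steps 2-4 can be sketched as a small RUI structure with a combine operation and a firing test. This is an assumed simplification covering only and-entailment; the field and method names are not from the talk:

```python
# Hypothetical sketch of Rule Use Information (RUI) combination.
from dataclasses import dataclass

@dataclass
class RUI:
    pos: frozenset                        # antecedents known to be true
    neg: frozenset = frozenset()          # antecedents known to be false
    total: int = 2                        # total antecedents of the rule

    def combine(self, other):
        # Merge the evidence carried by two RUIs for the same rule node.
        return RUI(self.pos | other.pos, self.neg | other.neg, self.total)

    def can_fire(self):
        # An and-entailment rule fires when every antecedent is positive.
        return len(self.pos) == self.total and not self.neg

rui_a = RUI(pos=frozenset({"a"}))         # from the message "a : true"
rui_b = RUI(pos=frozenset({"b"}))         # already stored in wft1 for b
combined = rui_a.combine(rui_b)
assert combined.can_fire()                # 2 of 2 positive: wft1 fires
```

Other rule types (e.g. or-entailment) would test `pos` and `neg` against different thresholds, but the combine step is the same.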

Rule Node Inference
5. Send out new messages.
wft1 sends the u-infer message "c : true"; c receives it and asserts itself.

Cycles
Graphs may contain cycles, but no rule node will infer on the same message more than once:
– RUIs with no new information are ignored.
– Already-open valves can't be opened again.
[Figure: a cyclic graph over a, b, wft1!, and wft2!, in which each rule node's consequent feeds the other's antecedent.]
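The "no new information" check that guarantees termination on cyclic graphs can be sketched as follows; `RuleNode` and `receive` are illustrative names, not the authors' API:

```python
# Sketch: a rule node drops RUIs that add no new information,
# so a message cycling back around the graph cannot loop forever.

class RuleNode:
    def __init__(self):
        self.seen = set()                 # frozensets of positive antecedents seen

    def receive(self, positive_antecedents):
        key = frozenset(positive_antecedents)
        if key in self.seen:
            return False                  # no new information: ignore the message
        self.seen.add(key)
        return True                       # new information: process it

wft1 = RuleNode()
assert wft1.receive({"a"}) is True        # first arrival is processed
assert wft1.receive({"a"}) is False       # same message cycles back: dropped
```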

Concurrency and Scheduling Inference Segment: the area between two valves. When messages reach a valve: – A task is created with the same priority as the message. Task: application of the segment’s function to the message. – Task is added to a prioritized queue. Tasks have minimal shared state, easing concurrency. D. R. Schlegel and S. C. Shapiro15

Concurrency and Scheduling A task only operates within a single segment. 1.tasks for relaying newly derived information using segments “later” in the derivation are executed before “earlier” ones, and 2.once a node is known to be true or false, all tasks attempting to derive it are canceled, as long as their results are not needed elsewhere. D. R. Schlegel and S. C. Shapiro16

Example
Backchain on cq. Assume every node requires a single one of its incoming nodes to be true for it to be true (simplified for easy viewing). Two processors will be used.
[Figure: an animated sequence of slides in which backward inference opens valves down through the graph, nodes infer and are inferred, and remaining derivation tasks are cancelled once cq is derived. Legend: Backward Inference (open valve), Inferring, Inferred, Cancelled.]


Evaluation
Concurrency:
– Near-linear performance improvement with the number of processors.
– Performance is robust to changes in graph depth and branching factor.
Scheduling heuristics:
– Backward inference with or-entailment shows a 10x improvement over LIFO queues, and 20-40x over FIFO queues.

Acknowledgements
This work has been supported by a Multidisciplinary University Research Initiative (MURI) grant (Number W911NF ) for Unified Research on Network-based Hard/Soft Information Fusion, issued by the US Army Research Office (ARO) under the program management of Dr. John Lavery.