1 Multi-Entity Bayesian Networks Without Multi-Tears Bayesian Networks Seminar Jan 3-4, 2007.


1 Multi-Entity Bayesian Networks Without Multi-Tears Bayesian Networks Seminar Jan 3-4, 2007

2 Limitations of Bayesian networks BNs are not expressive enough for many real-world applications: they assume a fixed number of attributes, whereas real problems involve varying numbers of related entities of different types.

3 Systems based on first-order logic (FOL) FOL provides the ability to represent entities of different types interacting with each other in varied ways. It "has enough expressive power to define all of mathematics ... and the semantics of every version of logic, including itself." However, FOL-based systems lack a theoretically principled, widely accepted, logically coherent methodology for reasoning under uncertainty.

4 Multi-entity Bayesian networks (MEBN) MEBN is a knowledge representation formalism that combines the expressive power of first-order logic with a sound and logically consistent treatment of uncertainty. It is not a computer language or an application.

5 Multi-entity Bayesian networks (MEBN) (cont..) A formal system that instantiates first-order Bayesian logic. MEBN provides syntax, a set of model construction and inference processes, and semantics that together provide a means of defining probability distributions over unbounded and possibly infinite numbers of interrelated hypotheses. As such, MEBN provides a logical foundation for the many emerging languages that extend the expressiveness of Bayesian networks.

6 An example: Star Trek The example illustrates the limitations of standard BNs for situations that demand a more powerful representation formalism.

7 Decision Support Systems in the 24th Century

8 The Basic Starship Bayesian Network

9

10 Problems Such a fixed BN would be of little use in a real-life starship environment.

11 The BN for Four Starships

12 Limitations of Bayesian networks BNs lack the expressive power to represent entity types (e.g., starships) that can be instantiated as many times as required for the situation at hand.

13 The Premise The likelihood ratio for a high MDR is 7/5 = 1.4 in favor of a starship in cloak mode. Although this favors a cloaked starship in the vicinity, the evidence is not overwhelming.

14 Repetition Repetition is a powerful way to boost the discriminatory power of weak signals. For example, a single pulse reflected from an aircraft usually arrives back at an airport terminal radar receiver greatly weakened, making it hard to tell apart from background noise. However, a steady sequence of reflected radar pulses is easily distinguishable from background noise.
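To make this concrete, here is a minimal sketch (not from the paper) of how a weak per-observation likelihood ratio of 7/5 = 1.4 compounds over repeated, independent high-MDR readings. The prior odds and observation counts are illustrative assumptions.

def posterior_prob(prior_odds: float, likelihood_ratio: float, n_obs: int) -> float:
    """Posterior P(cloaked starship nearby) after n_obs independent high-MDR readings,
    using the odds form of Bayes' rule: posterior odds = prior odds * LR**n_obs."""
    odds = prior_odds * likelihood_ratio ** n_obs
    return odds / (1.0 + odds)

for n in (1, 5, 10):
    print(n, round(posterior_prob(1.0, 7 / 5, n), 3))
# 1 -> 0.583, 5 -> 0.843, 10 -> 0.967: repetition turns weak evidence into strong evidence.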

15 Repetition (cont..) Following the same logic, it is reasonable to assume that an abnormal background disturbance will show random fluctuation, whereas a disturbance caused by a starship in cloak mode would show a characteristic temporal pattern. Thus, when there is a cloaked starship nearby, the MDR state at any time depends on its previous state.

16 The BN for One Starship with Recursion

17 Temporal Recursion Dynamic Bayesian networks (DBNs) and partially dynamic Bayesian networks (PDBNs) handle temporal repetition, but a more general recursion capability is needed, as well as a parsimonious syntax for expressing recursive relationships.

18 Using MEBN Logic A more "realistic" sci-fi scenario: different alien species (Friends, Cardassians, Romulans, and Klingons), with encounters with other possible races addressed using the general label Unknown. The model considers each starship's type, offensive power, ability to inflict harm to the Enterprise given its range, and numerous other features pertinent to the model's purpose.

19 Using MEBN Logic (cont..) MEBN logic represents the world as comprised of entities that have attributes and are related to other entities. Attributes of and relationships among entities are represented by random variables (RVs).

20 Using MEBN Logic (cont..) Knowledge about attributes and relationships is expressed as a collection of MFrags organized into MTheories. A MEBN Fragment (MFrag) represents a conditional probability distribution for instances of its resident RVs given their parents in the fragment graph and the context nodes. A MEBN Theory (MTheory) is a set of MFrags that collectively satisfies consistency constraints ensuring the existence of a unique joint probability distribution over instances of the RVs represented in each of the MFrags within the set.
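One way to picture these definitions is as plain data structures. The sketch below is purely illustrative; the class and field names are assumptions, not the API of any MEBN implementation.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class MFrag:
    # Illustrative layout only: the three node kinds introduced on the following slides.
    name: str
    context_nodes: List[str]                 # conditions under which the local distributions apply
    input_nodes: List[str]                   # RVs whose distributions are defined in other MFrags
    resident_nodes: List[str]                # RVs whose distributions this MFrag defines
    local_distributions: Dict[str, Callable] = field(default_factory=dict)

@dataclass
class MTheory:
    # A set of MFrags; a real MTheory must also satisfy the consistency constraints
    # described above (a unique joint distribution over all RV instances).
    mfrags: List[MFrag]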

21 The DangerToSelf MFrag

22 Using MEBN Logic (cont..) Node types: context nodes, input nodes, and resident nodes. Nodes take arguments, which are filled by entity instances; each entity instance is denoted by a unique identifier beginning with an exclamation point (e.g., !ST1).

23 Using MEBN Logic (cont..) Instances of an RV are formed by filling its arguments with entity identifiers, e.g., HarmPotential(!ST1, !T1) and HarmPotential(!ST2, !T1). Each RV has a home MFrag, where its local distribution is defined. Boolean context nodes can evaluate to True, False, or Absurd; they are relevant only for deciding whether to use a resident random variable's local distribution or its default distribution.
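A minimal sketch of the role these Boolean context nodes play, with invented node values and invented distributions: only when every context node evaluates to True is the resident RV's local distribution used; otherwise the default distribution applies.

def choose_distribution(context_values, local_distribution, default_distribution):
    """context_values: iterable of 'True', 'False', or 'Absurd' strings."""
    if all(v == "True" for v in context_values):
        return local_distribution      # all context constraints satisfied
    return default_distribution        # any False or Absurd context -> default

# Invented example distributions for an instance such as HarmPotential(!ST1, !T1).
local = {"True": 0.8, "False": 0.2}
default = {"True": 0.0, "False": 1.0}

print(choose_distribution(["True", "True"], local, default))    # local distribution used
print(choose_distribution(["True", "Absurd"], local, default))  # default distribution used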

24 Using MEBN Logic (cont..) No probability values are shown for the states of the nodes because a node in an MFrag represents a generic class of random variables. Before distributions can be applied, all instances relevant to the situation at hand must be identified; local distributions are therefore specified as pseudo code rather than static numeric tables.

25 Using MEBN Logic (cont..) Local distributions in standard BNs are typically represented by static tables, which limits each node to a fixed number of parents. An instance of a node in an MTheory might have any number of parents, so MEBN implementations (i.e., languages based on MEBN logic) must provide an expressive language for defining local distributions.

26 An Instance of the DangerToSelf MFrag

27 An Instance of the DangerToSelf MFrag (cont..) The belief for state Unacceptable is .975 (= .90 + .025 * 3), and the beliefs for states High, Medium, and Low are .02 ((1 - .975) * .8), .005 ((1 - .975) * .2), and zero, respectively.
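The arithmetic above can be reproduced with a short sketch. The .90 base belief and .025 increment per harm-capable starship are assumptions back-derived from the slide's numbers, standing in for the paper's actual pseudo code.

def danger_to_self(n_harmful: int) -> dict:
    """Simplified stand-in for the DangerToSelf local distribution.
    Assumes a .90 base belief in Unacceptable plus .025 per starship with
    harm potential; the remainder is split 80/20 between High and Medium."""
    unacceptable = 0.90 + 0.025 * n_harmful
    remainder = 1.0 - unacceptable
    return {
        "Unacceptable": round(unacceptable, 3),   # .975 for three such starships
        "High": round(remainder * 0.8, 3),        # .02
        "Medium": round(remainder * 0.2, 3),      # .005
        "Low": 0.0,
    }

print(danger_to_self(3))
# {'Unacceptable': 0.975, 'High': 0.02, 'Medium': 0.005, 'Low': 0.0}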

28 Using MEBN Logic (cont..) More complex knowledge patterns can be accommodated as needed to suit the requirements of the application. MEBN logic has built-in logical MFrags that provide the ability to express anything that can be expressed in first-order logic.

29 Recursive MFrags One of the main limitations of BNs is their lack of support for recursion. MEBN provides theoretically grounded support for very general recursive definitions of local distributions.

30 Recursive MFrags (cont..) MEBN logic allows influences between instances of the same random variable. The recursion is grounded by specifying an initial distribution at time !T0 that does not depend on a previous magnetic disturbance.
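A toy sketch of how such a recursive definition bottoms out at time !T0. The two MDR states and all probabilities below are invented for illustration, and the cloaked-starship parent is omitted for brevity.

# Illustrative only: a recursively defined distribution for a zone's magnetic
# disturbance, grounded by an initial distribution at !T0 with no temporal parent.
INITIAL = {"Low": 0.8, "High": 0.2}          # assumed distribution at !T0
TRANSITION = {                               # assumed P(state at !Tk | state at !Tk-1)
    "Low":  {"Low": 0.9, "High": 0.1},
    "High": {"Low": 0.3, "High": 0.7},
}

def zone_md_marginal(t: int) -> dict:
    """Marginal distribution of the zone MDR at time step t, obtained by unrolling the recursion."""
    if t == 0:                               # base case: !T0 does not depend on a previous MDR
        return dict(INITIAL)
    prev = zone_md_marginal(t - 1)           # recursive case: depends on the previous time step
    return {s: sum(prev[p] * TRANSITION[p][s] for p in prev) for s in ("Low", "High")}

print(zone_md_marginal(3))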

31 Recursive MFrags (cont..) Recursive definitions can be applied to construct a situation-specific Bayesian network (SSBN) to answer a query. Example: given the entity instances !Z0, !T3, !ST0, and !ST1, answer a query about ZoneMD(!Z0,!T3).

32 The Zone MFrag

33 SSBN Constructed from Zone MFrag

34 Recursive MFrags (cont..) Steps: begin by creating an instance of the home MFrag of the query node ZoneMD(!Z0,!T3); build any CPTs that can already be built; then recursively create instances of the home MFrags of the remaining parents until all the nodes have been added.
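These steps can be written as a short recursive procedure. The sketch below is only a schematic of the construction loop (no context evaluation, finding handling, or pruning), not the full SSBN algorithm.

def build_ssbn(query_node, home_mfrag_of, parents_of, ssbn=None):
    """Schematic SSBN construction starting from an RV instance such as 'ZoneMD(!Z0,!T3)'.
    home_mfrag_of(node) returns the MFrag instance defining the node's distribution;
    parents_of(node) returns the node's parent RV instances under that MFrag."""
    if ssbn is None:
        ssbn = {}
    if query_node in ssbn:                   # already instantiated
        return ssbn
    mfrag = home_mfrag_of(query_node)        # step 1: instance of the node's home MFrag
    parents = parents_of(query_node)         # step 2: the CPT can be built once parents are known
    ssbn[query_node] = {"mfrag": mfrag, "parents": parents}
    for p in parents:                        # step 3: recurse until all nodes have been added
        build_ssbn(p, home_mfrag_of, parents_of, ssbn)
    return ssbn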

35 Building MEBN models with MTheories MFrags provide a flexible means to represent knowledge about specific subjects within the domain of discourse, but the true gain in expressive power is revealed when we aggregate these "knowledge patterns" to form a coherent model of the domain of discourse that can be instantiated to reason about specific situations and refined through learning. Just collecting a set of MFrags that represent specific parts of a domain is not enough to ensure a coherent representation of that domain; for example, a set of MFrags with cyclic influences does not define a unique joint distribution.

36 Building MEBN models with MTheories (cont..) In order to build a coherent model we have to make sure that our set of MFrags collectively satisfies consistency constraints ensuring the existence of a unique joint probability distribution over instances of the random variables mentioned in the MFrags. Such a coherent collection of MFrags is called an MTheory. An MTheory represents a joint probability distribution for an unbounded, possibly infinite number of instances of its random variables.

37 Building MEBN models with MTheories (cont..) A generative MTheory summarizes statistical regularities that characterize a domain. These regularities are captured and encoded in a knowledge base using some combination of expert judgment and learning from observation. To apply a generative MTheory to reason about particular scenarios, we need to provide the system with specific information about the individual entity instances involved in the scenario. Bayesian inference is then used to answer specific questions of interest and to refine the MTheory.

38 Building MEBN models with MTheories (cont..) Findings are the basic mechanism for incorporating observations into MTheories. A finding is represented as a special 2-node MFrag containing a node from the generative MTheory and a node declaring one of its states to have a given value.

39 Building MEBN models with MTheories (cont..) Inserting a finding into an MTheory corresponds to asserting a new axiom in a first-order theory. In other words, MEBN logic is inherently open, having the ability to incorporate new axioms as evidence and update the probabilities of all random variables in a logically consistent way.

40 Building MEBN models with MTheories (cont..) In a valid MTheory, each random variable must have a unique home MFrag, all recursive definitions must terminate in finitely many steps, and there must be no circular influences.

41 The Star Trek Generative MTheory

42 Building MEBN models with MTheories (cont..) It is important to understand the power and flexibility that MEBN logic gives to knowledge base designers by allowing multiple, equivalent ways of portraying the same knowledge.

43 Equivalent MFrag Representations of Knowledge

44 Inference in MEBN Logic In a standard BN, assessing the impact of new evidence involves conditioning on the values of evidence nodes and applying a belief propagation algorithm. In MEBN, we start with an initial generative MTheory, a finding set, and a target set; an SSBN is constructed by creating instances of the finding and target random variables; standard BN inference is then applied; finally, the posterior probabilities of the target nodes are inspected.

45 SSBN for the Star Trek MTheory with Four Starships within Enterprise’s Range