Introduction to Probabilistic Reasoning and Bayesian Networks

Introduction to Probabilistic Reasoning and Bayesian Networks
Hongtao Du, Group Presentation

Outline
- Uncertain Reasoning
- Probabilistic Reasoning
- Bayesian Network (BN)
- Dynamic Bayesian Network (DBN)

Reasoning
The activity of guessing the state of the domain from prior knowledge and observations. Three kinds:
- Causal reasoning
- Diagnostic reasoning
- Combinations of the two

Uncertain Reasoning (Guessing)
- Some aspects of the domain are often unobservable and must be estimated indirectly through other observations.
- The relationships among domain events are often uncertain, particularly those between the observables and non-observables.

- The observations themselves may be unreliable.
- Even when events are observable, we often lack the resources to observe all relevant ones.
- Even when the relations among events are certain, it is often impractical to analyze all of them.

Probabilistic Reasoning
A methodology founded on Bayesian probability theory. Events and objects in the real world are represented by random variables. Probabilistic models include:
- Bayesian reasoning
- Evidence theory
- Robust statistics
- Recursive operators

Graphical Model
A tool that visually illustrates conditional independence among the variables in a given problem. It consists of nodes (random variables or states) and edges (connecting two nodes, directed or undirected). The absence of an edge represents conditional independence between variables.

Key graph concepts: chain, path, cycle, Directed Acyclic Graph (DAG), parents and children.
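
To make these graph terms concrete, here is a minimal Python sketch (the node names A, B, C, D are invented for illustration) that stores a DAG as a parent map and derives children from it:

    # A DAG stored as a map from each node to the list of its parents.
    parents = {
        "A": [],          # A has no parents (a root)
        "B": ["A"],       # edge A -> B
        "C": ["A"],       # edge A -> C
        "D": ["B", "C"],  # edges B -> D and C -> D
    }

    def children(node):
        """Return the children of a node, i.e. the nodes that list it as a parent."""
        return [n for n, ps in parents.items() if node in ps]

    print(children("A"))  # ['B', 'C']
    print(parents["D"])   # ['B', 'C']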

Bayesian Network (BN)
Also called a probabilistic network, belief network, or causal network. A specific type of graphical model that is represented as a Directed Acyclic Graph. (The slide shows an example DAG over the nodes X, Z, Y, U, V, A, B.)

A BN consists of:
- A set of variables (nodes) V = {1, 2, …, k}
- A set of dependencies (edges) D
- A set of probability distribution functions (pdfs) P, one for each variable
Assumptions:
- P(X) = 1 if and only if X is certain
- If X and Y are mutually exclusive, then P(X ∨ Y) = P(X) + P(Y)
- Joint probability: P(X, Y) = P(X|Y) P(Y)

- X represents the hypothesis
- Y represents the evidence
- P(Y|X) is the likelihood
- P(X|Y) is the posterior probability
- If X and Y are conditionally independent given Z, then P(X|Z, Y) = P(X|Z)
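
A minimal numeric sketch of these quantities in Python, with all probabilities invented for illustration:

    # Bayes' rule: P(X|Y) = P(Y|X) P(X) / P(Y), with made-up numbers.
    p_x = 0.01            # prior P(X): hypothesis is true
    p_y_given_x = 0.9     # likelihood P(Y|X): evidence given hypothesis
    p_y_given_not_x = 0.05

    # Total probability: P(Y) = P(Y|X) P(X) + P(Y|~X) P(~X)
    p_y = p_y_given_x * p_x + p_y_given_not_x * (1 - p_x)

    p_x_given_y = p_y_given_x * p_x / p_y  # posterior P(X|Y)
    print(round(p_x_given_y, 4))           # ~0.1538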

Given certain evidence, a BN operates by propagating beliefs throughout the network. For the chain Z → Y → U → V,
P(Z, Y, U, V) = P(Z) P(Y|Z) P(U|Y) P(V|U).
In general, P(X1, …, Xn) = Π_i P(Xi | Pa(Xi)), where Pa(Xi) denotes the parents of node Xi.
Explaining away: if a node is observed, its parents become dependent; two causes (parents) compete to explain the observed data (child).
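
The chain factorization can be checked directly in a short Python sketch; the CPT numbers below are invented:

    from itertools import product

    # Invented CPTs for the chain Z -> Y -> U -> V (all variables binary).
    p_z = {1: 0.3, 0: 0.7}
    p_y_given_z = {(1, 1): 0.8, (0, 1): 0.2, (1, 0): 0.1, (0, 0): 0.9}  # keys: (y, z)
    p_u_given_y = {(1, 1): 0.7, (0, 1): 0.3, (1, 0): 0.4, (0, 0): 0.6}  # keys: (u, y)
    p_v_given_u = {(1, 1): 0.5, (0, 1): 0.5, (1, 0): 0.2, (0, 0): 0.8}  # keys: (v, u)

    def joint(z, y, u, v):
        # P(Z, Y, U, V) = P(Z) P(Y|Z) P(U|Y) P(V|U)
        return p_z[z] * p_y_given_z[(y, z)] * p_u_given_y[(u, y)] * p_v_given_u[(v, u)]

    # Sanity check: the joint sums to 1 over all 16 assignments.
    print(sum(joint(*a) for a in product([0, 1], repeat=4)))  # 1.0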

Tasks in a Bayesian Network
- Inference
- Learning

Inference
Inference is the task of computing the probability of each state of a node in a BN when other variables are known. Method: divide the set of BN nodes into non-overlapping subsets of conditionally independent nodes.

Example: Y is the observed variable. Goal: find the conditional pdf over the remaining variables.
Case 1:

Case 2:
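
A Python sketch of the kind of computation such an example involves: condition on the observed Y and sum out the remaining variables (same invented chain and numbers as in the earlier sketch):

    from itertools import product

    # Invented chain Z -> Y -> U -> V, as before.
    p_z = {1: 0.3, 0: 0.7}
    p_y_given_z = {(1, 1): 0.8, (0, 1): 0.2, (1, 0): 0.1, (0, 0): 0.9}
    p_u_given_y = {(1, 1): 0.7, (0, 1): 0.3, (1, 0): 0.4, (0, 0): 0.6}
    p_v_given_u = {(1, 1): 0.5, (0, 1): 0.5, (1, 0): 0.2, (0, 0): 0.8}

    def joint(z, y, u, v):
        return p_z[z] * p_y_given_z[(y, z)] * p_u_given_y[(u, y)] * p_v_given_u[(v, u)]

    # Posterior over Z given the observation Y = 1, by enumeration:
    # P(Z=z | Y=1) = sum_{u,v} P(z, 1, u, v) / sum_{z,u,v} P(z, 1, u, v)
    unnorm = {z: sum(joint(z, 1, u, v) for u, v in product([0, 1], repeat=2))
              for z in [0, 1]}
    total = sum(unnorm.values())
    posterior = {z: p / total for z, p in unnorm.items()}
    print(posterior)  # Z = 1 becomes more likely than its 0.3 prior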

Learning
Goal: complete the missing beliefs in the network by adjusting the parameters of the Bayesian network so that the pdfs defined by the network sufficiently describe the statistical behavior of the observed data.

- M: a BN model
- θ: the parameters of the probability distributions
- D: the observed data
Goal: estimate θ to maximize the posterior probability P(θ | D, M).

Assume P(θ | D, M) is highly peaked around the maximum likelihood estimates θ̂.
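
When the data are complete, the maximum likelihood estimates reduce to relative frequencies; a minimal Python sketch with fabricated observations of a single edge Z → Y:

    from collections import Counter

    # Fabricated complete observations of (z, y) pairs for an edge Z -> Y.
    data = [(1, 1), (1, 1), (1, 0), (0, 0), (0, 0), (0, 1), (1, 1), (0, 0)]

    pair_counts = Counter(data)
    z_counts = Counter(z for z, _ in data)

    # Maximum likelihood CPT entries are relative frequencies:
    # P_hat(Y = y | Z = z) = N(z, y) / N(z)
    cpt = {(y, z): pair_counts[(z, y)] / z_counts[z]
           for z in z_counts for y in (0, 1)}
    print(cpt[(1, 1)])  # P_hat(Y = 1 | Z = 1) = 0.75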

Dynamic Bayesian Network (DBN)
- A Bayesian network extended with a time series to represent temporal dependencies; it changes or evolves dynamically over time.
- A directed graphical model of stochastic processes, aimed especially at time-series modeling.
- Satisfies the Markovian condition: the state of the system at time t depends only on its immediate past state at time t-1.

Representation
The network is unrolled into time slices t1, t2, …, tk. The transition matrix that represents these time dependencies is called the Conditional Probability Table (CPT).
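
A sketch of such a transition CPT in Python (all numbers invented): for a two-state system the CPT is a matrix whose rows are conditional distributions, and one time step is a matrix-vector product:

    import numpy as np

    # Invented two-state transition CPT: entry [i, j] is P(X_t = j | X_{t-1} = i).
    cpt = np.array([[0.9, 0.1],
                    [0.3, 0.7]])

    belief = np.array([1.0, 0.0])  # start certain of state 0

    # Propagate the state distribution across three time slices.
    for t in range(1, 4):
        belief = belief @ cpt      # P(X_t = j) = sum_i P(X_{t-1} = i) P(X_t = j | X_{t-1} = i)
        print(t, belief)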

Description
- T: the time boundary we are investigating
- Y_t: the observable variables
- X_t: the hidden-state variables
- P(X_t | X_{t-1}): state transition pdfs, specifying the time dependencies between states
- P(Y_t | X_t): observation pdfs, specifying the dependencies of the observation nodes on the other nodes at time slice t
- P(X_1): the initial state distribution

Tasks in a DBN
- Inference
- Decoding
- Learning
- Pruning

Inference
Estimating the pdf of the unknown states from the given observations and initial probability distributions. Goal: find P(X_{1:T} | Y_{1:T}), where Y_{1:T} = {y_1, …, y_T} is a finite set of T consecutive observations and X_{1:T} is the set of corresponding hidden variables.
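
For a chain-structured DBN of the kind described above (an HMM-style model), this inference can be sketched with the standard forward (filtering) recursion; the parameters below are invented:

    import numpy as np

    # Invented HMM-style DBN: 2 hidden states, 2 observation symbols.
    trans = np.array([[0.9, 0.1],   # row i: P(X_t = j | X_{t-1} = i)
                      [0.2, 0.8]])
    emit = np.array([[0.7, 0.3],    # row i: P(Y_t = y | X_t = i)
                     [0.1, 0.9]])
    prior = np.array([0.5, 0.5])    # initial state distribution

    def forward_filter(observations):
        """Return P(X_t | y_1..y_t) for t = 1..T (the filtering distributions)."""
        belief = prior * emit[:, observations[0]]
        belief = belief / belief.sum()
        beliefs = [belief]
        for y in observations[1:]:
            belief = (belief @ trans) * emit[:, y]  # predict, then correct
            belief = belief / belief.sum()
            beliefs.append(belief)
        return beliefs

    for t, b in enumerate(forward_filter([0, 0, 1]), start=1):
        print(t, b)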

Decoding
Finding the hidden-state values that best fit the known observations. Goal: determine the sequence of hidden states with the highest probability.
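
One standard way to perform this decoding for chain-structured models is the Viterbi algorithm (not named on the slide); a compact Python sketch with invented parameters:

    import numpy as np

    trans = np.array([[0.9, 0.1],   # invented P(X_t = j | X_{t-1} = i)
                      [0.2, 0.8]])
    emit = np.array([[0.7, 0.3],    # invented P(Y_t = y | X_t = i)
                     [0.1, 0.9]])
    prior = np.array([0.5, 0.5])

    def viterbi(observations):
        """Return the most probable hidden-state sequence for the observations."""
        delta = np.log(prior) + np.log(emit[:, observations[0]])
        backptr = []
        for y in observations[1:]:
            scores = delta[:, None] + np.log(trans)  # scores[i, j]: best path ending i -> j
            backptr.append(scores.argmax(axis=0))    # best predecessor for each state j
            delta = scores.max(axis=0) + np.log(emit[:, y])
        path = [int(delta.argmax())]                 # best final state
        for bp in reversed(backptr):                 # trace predecessors back to t = 1
            path.append(int(bp[path[-1]]))
        return path[::-1]

    print(viterbi([0, 0, 1, 1]))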

Learning
Given a number of observations, estimate the parameters of the DBN that best fit the observed data. Goal: maximize the joint probability distribution with respect to θ, the model parameter vector.
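
If the hidden states happen to be fully observed, the maximizing transition parameters reduce to counts, as in this Python sketch with a fabricated state sequence (with truly hidden states one would instead use an EM-style procedure such as Baum-Welch):

    from collections import Counter

    # Fabricated, fully observed state sequence for a 2-state DBN.
    states = [0, 0, 1, 1, 1, 0, 0, 1, 0, 0]

    transition_counts = Counter(zip(states, states[1:]))
    from_counts = Counter(states[:-1])

    # ML estimate: P_hat(X_t = j | X_{t-1} = i) = N(i -> j) / N(i -> anything)
    trans_hat = {(i, j): transition_counts[(i, j)] / from_counts[i]
                 for i in from_counts for j in (0, 1)}
    print(trans_hat)  # {(0, 0): 0.6, (0, 1): 0.4, (1, 0): 0.5, (1, 1): 0.5}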

Pruning
An important but difficult task in a DBN: distinguishing which nodes are important for inference and removing the unimportant ones. Possible actions:
- Deleting states from a particular node
- Removing the connection between nodes
- Removing a node from the network

At time slice t, the designated world nodes are a subset of the nodes, representing the part of the system we want to inspect. If the state of a node X is known, say X = x, then the nodes rendered independent of the world nodes by this evidence are no longer relevant to the overall goal of the inference. Thus, (1) delete all such nodes, and (2) incorporate the knowledge that X = x.

Future Work
- Probabilistic reasoning in multi-agent systems
- Different DBNs and their applications
- Discussion of DBN problems