Dynamic Bayesian Networks

Dynamic Bayesian Networks (Probabilistic Graphical Models / Representation / Template Models)

Template Transition Model
[Diagram: a two-time-slice network over the variables Weather, Velocity, Location, and Failure, with edges from the variables in time slice t to their primed copies (Weather', Velocity', Location', Failure') in time slice t+1, and an observation node Obs' in slice t+1.]
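To make the template concrete, here is a minimal sketch of how this transition model's structure could be written down in plain Python. The variable names come from the slide; the exact edge set is an assumption, since the original diagram is not fully recoverable.

```python
# A minimal sketch of the slide's template transition model as plain Python
# data. Variable names come from the slide; the edge set is an assumption.

# Template variables; primed names denote the time t+1 copies.
VARIABLES = ["Weather", "Velocity", "Location", "Failure"]

# Inter-slice (t -> t+1) and intra-slice (t+1 -> t+1) edges of the template.
# ("X", "Y'") means X at time t is a parent of Y at time t+1.
EDGES = [
    ("Weather",   "Weather'"),    # weather persists over time
    ("Weather",   "Velocity'"),   # weather influences the next velocity
    ("Velocity",  "Velocity'"),
    ("Velocity",  "Location'"),   # location integrates velocity
    ("Location",  "Location'"),
    ("Failure",   "Failure'"),    # failures persist
    ("Failure'",  "Obs'"),        # the observation depends on the current state
    ("Location'", "Obs'"),
]

def parents(child):
    """Return the parents of a primed (time t+1) variable in the template."""
    return [p for p, c in EDGES if c == child]

print(parents("Obs'"))   # ["Failure'", "Location'"]
```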

Initial State Distribution
[Diagram: a Bayesian network over the time-slice-0 variables Weather0, Velocity0, Location0, Failure0, and Obs0.]

Ground Bayesian Network
[Diagram: the unrolled network over time slices 0, 1, and 2, with chains Weather0 → Weather1 → Weather2, Velocity0 → Velocity1 → Velocity2, Location0 → Location1 → Location2, Failure0 → Failure1 → Failure2, and observations Obs0, Obs1, Obs2.]

[Demo credit: Tim Huang, Dieter Koller, Jitendra Malik, Gary Ogasawara, Bobby Rao, Stuart Russell, J. Weber]

Dynamic Bayesian Network
A transition model over X1,…,Xn is specified via a 2-time-slice Bayesian network (2-TBN): a network fragment whose nodes include X′1,…,X′n and some subset of X1,…,Xn, and in which only the X′i nodes have parents and a CPD. The 2-TBN defines the conditional distribution P(X′ | X) = ∏i P(X′i | PaX′i).
A dynamic Bayesian network is specified via a 2-TBN BN→ over X1,…,Xn and a Bayesian network BN(0) over the initial state variables X1(0),…,Xn(0).
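As a toy illustration of this factorization, the sketch below evaluates P(X′ | X) for a two-variable 2-TBN with tabular CPDs; all structure and numbers here are illustrative assumptions.

```python
import itertools

# A toy 2-TBN over two binary variables A, B; all numbers are illustrative.
# P(A' | A): indexed by the value of A, columns are A' = 0 / 1.
cpd_A = {0: [0.9, 0.1], 1: [0.2, 0.8]}
# P(B' | A', B): indexed by the parent assignment (A', B).
cpd_B = {(0, 0): [0.7, 0.3], (0, 1): [0.4, 0.6],
         (1, 0): [0.5, 0.5], (1, 1): [0.1, 0.9]}

def transition_prob(x, x_next):
    """P(X' = x_next | X = x) as the product of the per-variable CPDs."""
    a, b = x
    a1, b1 = x_next
    return cpd_A[a][a1] * cpd_B[(a1, b)][b1]

# Sanity check: for every current state, the probabilities over X' sum to 1.
for x in itertools.product([0, 1], repeat=2):
    total = sum(transition_prob(x, xn)
                for xn in itertools.product([0, 1], repeat=2))
    assert abs(total - 1.0) < 1e-9
```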

Ground Network
For a trajectory over time slices 0,…,T, the ground (unrolled) network is defined so that the slice-0 variables Xi(0) have the structure and CPDs of BN(0), and the variables Xi(t+1) in every later slice have the structure and CPDs of the 2-TBN BN→. The joint distribution factorizes as P(X(0:T)) = PBN(0)(X(0)) · ∏t=0…T−1 PBN→(X(t+1) | X(t)).
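A minimal sketch of unrolling, under the same conventions as the earlier structure sketch (a trailing prime marks a time t+1 variable); the template and initial edge lists are illustrative assumptions.

```python
# Given the 2-TBN's template edges and the initial-slice structure, build
# the ground network's edge list for time slices 0..T.

TEMPLATE_EDGES = [("A", "A'"), ("A'", "B'"), ("B", "B'")]
INITIAL_EDGES = [("A", "B")]          # structure of BN(0) over slice 0

def ground_edges(T):
    """Edges of the ground BN over slices 0..T, e.g. ('A', 0) -> ('A', 1)."""
    edges = [((u, 0), (v, 0)) for u, v in INITIAL_EDGES]
    for t in range(T):                # instantiate the template at each step
        for u, v in TEMPLATE_EDGES:
            tu = t + 1 if u.endswith("'") else t   # primed parents live at t+1
            tv = t + 1                # in a 2-TBN, only primed nodes are children
            edges.append(((u.rstrip("'"), tu), (v.rstrip("'"), tv)))
    return edges

for e in ground_edges(2):
    print(e)
```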

Hidden Markov Models
[Diagram: an HMM as the simplest DBN: a 2-TBN with a single state variable S → S′ and observation O′; its unrolled network S0 → S1 → S2 → S3 with observations O1, O2, O3; and a transition diagram over states s1, s2, s3, s4 annotated with probabilities 0.5, 0.7, 0.3, 0.4, 0.6, 0.1, 0.9.]
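Since an HMM is the simplest DBN, filtering in it is a short recurrence. The sketch below implements standard forward filtering; the slide's 4-state transition diagram is not fully recoverable, so the transition matrix reuses its numbers but their placement is an assumption, and the observation matrix and prior are made up for illustration.

```python
import numpy as np

# Forward filtering for an HMM viewed as the simplest DBN.
T = np.array([[0.5, 0.5, 0.0, 0.0],   # P(s' | s); reuses the slide's numbers,
              [0.0, 0.3, 0.7, 0.0],   # but their placement is assumed
              [0.0, 0.0, 0.4, 0.6],
              [0.9, 0.0, 0.0, 0.1]])
O = np.array([[0.8, 0.2],             # P(o | s); illustrative values
              [0.6, 0.4],
              [0.3, 0.7],
              [0.1, 0.9]])
prior = np.full(4, 0.25)              # uniform initial state distribution

def forward_filter(obs):
    """Return P(S_t | o_1..t) for each t (the filtering distributions)."""
    belief = prior
    beliefs = []
    for o in obs:
        belief = O[:, o] * (T.T @ belief)   # predict, then condition on o
        belief /= belief.sum()              # normalize
        beliefs.append(belief)
    return beliefs

for b in forward_filter([0, 1, 1]):
    print(np.round(b, 3))
```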

Consider a smoke detection tracking application, where we have 3 rooms connected in a row. Each room has a true smoke level (X) and a smoke level (Y) measured by a smoke detector situated in the middle of the room. Which of the following is the best DBN structure for this problem?
[Four candidate 2-TBN structures over X1, X2, X3 and detector readings Y′1, Y′2, Y′3 are shown as diagrams; they differ in which between-room and observation edges are included.]
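For concreteness, one plausible candidate structure can be written down as an edge list, as in the sketch below. This is an illustration of how such a structure might be encoded (smoke diffusing between adjacent rooms, each detector reading its own room), not necessarily the slide's intended answer.

```python
# One candidate 2-TBN structure for the smoke example, as an edge list.
# This encoding is an illustration, not necessarily the intended answer.

N_ROOMS = 3

def smoke_2tbn_edges(n_rooms):
    """Edges of a candidate 2-TBN: ('X', j) at time t -> ("X'", i) at t+1."""
    edges = []
    for i in range(n_rooms):
        # True smoke level persists and diffuses to/from adjacent rooms.
        for j in (i - 1, i, i + 1):
            if 0 <= j < n_rooms:
                edges.append((("X", j), ("X'", i)))
        # Each detector measures the current smoke level in its own room.
        edges.append((("X'", i), ("Y'", i)))
    return edges

for e in smoke_2tbn_edges(N_ROOMS):
    print(e)
```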

Consider an application where we have 3 sources, each making sound (X) simultaneously (e.g., a TV, a blender, and a ringing cellphone). We want to separate the observed aggregate signal (Y) into its unobserved constituent signals. Which of the following is the best DBN structure for this problem?
[Four candidate structures over X1, X2, X3 and the aggregate observation Y are shown as diagrams; they differ in whether Y depends on all three sources and in how the sources evolve over time.]

Consider a situation where we have two people in a conversation, each wearing a sensitive lapel microphone. We want to separate each person's speech (X) using the signal (Y) obtained from their microphone. Which of the following is the best DBN structure for this problem?
[Four candidate structures over X1, X2 and microphone readings Y′1, Y′2 are shown as diagrams; they differ in whether each Y′i depends on one speaker's speech or on both.]

Summary
DBNs are a compact representation for encoding structured distributions over arbitrarily long temporal trajectories. They make assumptions that may require appropriate model (re)design:
- the Markov assumption
- time invariance