Intelligent Systems (AI-2), Computer Science CPSC 422, Lecture 17, Oct 19, 2015. Slide sources: D. Koller, Stanford CS, Probabilistic Graphical Models; D. Page, Whitehead Institute, MIT. Several figures from "Probabilistic Graphical Models: Principles and Techniques", D. Koller and N. Friedman, 2009.

Simple but Powerful Approach: Particle Filtering. Idea from exact filtering: we should be able to compute P(X_{t+1} | e_{1:t+1}) from P(X_t | e_{1:t}) ("one slice from the previous slice"). Idea from likelihood weighting: samples should be weighted by the probability of the evidence given the parents. New idea: run multiple samples simultaneously through the network.

Particle Filtering: run all N samples together through the network, one slice at a time. STEP 0: Generate a population of N initial-state samples by sampling from the initial state distribution P(X_0). Here N = 10.

Particle Filtering. STEP 1: Propagate each sample for x_t forward by sampling the next state value x_{t+1} based on the transition model P(X_{t+1} | X_t). (The slide's transition table, with rows R_t = t and R_t = f and the corresponding values of P(R_{t+1} = t), is not legible in this transcript.)

Particle Filtering. STEP 2: Weight each sample by the likelihood it assigns to the evidence. E.g., assume we observe no umbrella at t+1. (The slide's sensor-model table, with rows R_t = t and R_t = f and values of P(u_t) and P(¬u_t), is not legible in this transcript.)

Particle Filtering. STEP 3: Create a new population from the population at X_{t+1}, i.e., resample the population so that the probability that each sample is selected is proportional to its weight. Then start the particle filtering cycle again from the new samples. The three steps are sketched in code below.
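As a concrete illustration, here is a minimal Python sketch of one particle-filtering cycle on the rain/umbrella example. The model numbers (0.5, 0.7, 0.3, 0.9, 0.2) are the standard textbook values, supplied here as assumptions because the slide's tables are not legible in this transcript.

```python
import random

# Assumed rain/umbrella model (standard textbook values, not read off the slide).
P_TRANS = {True: 0.7, False: 0.3}     # P(R_{t+1} = true | R_t)
P_UMBRELLA = {True: 0.9, False: 0.2}  # P(u_t = true | R_t)

def pf_step(particles, observed_umbrella):
    """One particle-filtering cycle: propagate, weight, resample."""
    # STEP 1: propagate each sample through the transition model.
    propagated = [random.random() < P_TRANS[r] for r in particles]
    # STEP 2: weight each sample by the likelihood of the evidence.
    weights = [P_UMBRELLA[r] if observed_umbrella else 1.0 - P_UMBRELLA[r]
               for r in propagated]
    # STEP 3: resample proportionally to the weights.
    return random.choices(propagated, weights=weights, k=len(particles))

# STEP 0: N = 10 samples from an assumed initial distribution P(R_0 = true) = 0.5.
particles = [random.random() < 0.5 for _ in range(10)]
particles = pf_step(particles, observed_umbrella=False)
# The fraction of "rain" particles estimates P(R_{t+1} = true | e_{1:t+1}).
print(sum(particles) / len(particles))
```

With only N = 10 particles the estimate is noisy; resampling keeps the population concentrated on high-likelihood states as more evidence arrives.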

Is PF efficient? In practice, the approximation error of particle filtering remains bounded over time. It is also possible to prove that the approximation maintains bounded error with high probability (under a specific assumption: the probabilities in the transition and sensor models are strictly between 0 and 1).

422 big picture: Where are we? The course map organizes representations and reasoning techniques by task (Query vs. Planning) and environment (Deterministic vs. Stochastic):
- Query, Deterministic: Logics (First-Order Logics, Ontologies); techniques: Full Resolution, SAT
- Query, Stochastic: Belief Nets (Approx.: Gibbs); Markov Chains and HMMs, temporal rep. (Forward, Viterbi; Approx.: Particle Filtering); Undirected Graphical Models: Markov Networks, Conditional Random Fields
- Planning, Stochastic: Markov Decision Processes and Partially Observable MDPs (Value Iteration, Approx. Inference); Reinforcement Learning
- Hybrid (Det + Sto): Prob CFG, Prob Relational Models, Markov Logics
- Applications of AI

Lecture Overview:
- Probabilistic graphical models: intro and example
- Markov Networks: representation (vs. Belief Networks)
- Inference in Markov Networks (exact and approximate)
- Applications of Markov Networks

Probabilistic Graphical Models. From "Probabilistic Graphical Models: Principles and Techniques", D. Koller and N. Friedman, 2009.

Misconception Example. Four students (Alice, Bill, Debbie, Charles) get together in pairs to work on a homework, but only in the following pairs: A-B, A-D, D-C, B-C. The professor misspoke and might have generated a misconception. A student might have figured it out later and told their study partner.

Example: (In)Dependencies. Are A and C independent because they never spoke? No, because A might have figured it out and told B, who then told C. But if we know the values of B and D, A and C become independent. And if we know the values of A and C, B and D become independent.

Which of these two Bnets captures the two independencies of our example? a. / b. / c. Both / d. None

Parameterization of Markov Networks. Factors define the local interactions (like CPTs in Bnets). What about the global model? What do you do with Bnets? (One possible factor representation is sketched below.)
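As a concrete, hypothetical illustration (not from the slides), a factor can be represented as a table mapping assignments of its variables to non-negative potentials. The values below are illustrative:

```python
# A factor over (A, B): maps each joint assignment to a non-negative
# potential. Unlike a CPT, entries need not sum to 1 (illustrative values).
phi1 = {
    (0, 0): 30.0,   # A = a0, B = b0
    (0, 1): 5.0,    # A = a0, B = b1
    (1, 0): 1.0,    # A = a1, B = b0
    (1, 1): 10.0,   # A = a1, B = b1
}
```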

How do we combine local models? As in Bnets, by multiplying them!

Multiplying Factors (as seen in 322 for Variable Elimination). A sketch of the factor product is given below.
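A minimal sketch of the factor product, assuming the table representation above (a factor is a scope, i.e. a tuple of variable names, plus a table keyed by value tuples); entries that agree on the shared variables are multiplied:

```python
from itertools import product

def factor_product(f1, f2, domains):
    """Multiply two factors. A factor is (scope, table): scope is a tuple of
    variable names; table maps value tuples (in scope order) to potentials."""
    scope1, t1 = f1
    scope2, t2 = f2
    # The result is a factor over the union of the two scopes.
    scope = scope1 + tuple(v for v in scope2 if v not in scope1)
    table = {}
    for values in product(*(domains[v] for v in scope)):
        assign = dict(zip(scope, values))
        table[values] = (t1[tuple(assign[v] for v in scope1)]
                         * t2[tuple(assign[v] for v in scope2)])
    return scope, table

# Example: phi1(A, B) * phi2(B, C) gives a factor over (A, B, C).
domains = {"A": (0, 1), "B": (0, 1), "C": (0, 1)}
phi1 = (("A", "B"), {(0, 0): 30.0, (0, 1): 5.0, (1, 0): 1.0, (1, 1): 10.0})
phi2 = (("B", "C"), {(0, 0): 100.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 100.0})
scope, table = factor_product(phi1, phi2, domains)
print(scope, table[(0, 0, 0)])  # ('A', 'B', 'C') 3000.0
```

Multiplying all four pairwise factors of the misconception example this way yields one big unnormalized factor over A, B, C, D.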

Factors do not represent marginal probabilities! (The slide's table, comparing the factor entries for a0b0, a0b1, a1b0, a1b1 with the marginal P(A,B) computed from the joint, is not legible in this transcript.) The sketch below illustrates the difference.
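To illustrate with made-up numbers (continuing the representation above): the joint is the normalized product of all factors, the marginal P(A,B) sums the joint over the remaining variables, and in general it differs from naively normalizing phi1(A,B):

```python
# Hypothetical unnormalized values over (A, B), obtained by multiplying all
# factors and summing out the other variables (illustrative numbers).
unnorm = {(0, 0): 3000.0, (0, 1): 120.0, (1, 0): 6.0, (1, 1): 1200.0}
Z = sum(unnorm.values())                          # partition function
marginal = {k: v / Z for k, v in unnorm.items()}  # P(A, B) from the joint

# Naively normalizing the local factor phi1(A, B) gives something different:
phi1 = {(0, 0): 30.0, (0, 1): 5.0, (1, 0): 1.0, (1, 1): 10.0}
z1 = sum(phi1.values())
print({k: round(v, 3) for k, v in marginal.items()})
print({k: round(v / z1, 3) for k, v in phi1.items()})  # not the same
```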

Step back… from structure to factors/potentials. In a Bnet the joint is factorized as a product of conditional probabilities, one per node. In a Markov Network you have one factor for each maximal clique, and the joint is their normalized product (both factorizations are written out below).
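Written out explicitly (standard formulations, supplied here because the slide's formulas are images):

```latex
% Bnet: one CPT per node
P(X_1,\dots,X_n) = \prod_{i=1}^{n} P(X_i \mid \mathrm{Pa}(X_i))

% Markov network: one potential per maximal clique, normalized by Z
P(X_1,\dots,X_n) = \frac{1}{Z} \prod_{c \in \mathcal{C}} \phi_c(X_c),
\qquad Z = \sum_{x} \prod_{c \in \mathcal{C}} \phi_c(x_c)
```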

Directed vs. Undirected: Independencies and Factorization. (The slide's side-by-side comparison figure is not reproduced in this transcript.)

General definitions. Two nodes in a Markov network are independent if and only if every path between them is cut off by evidence. So the Markov blanket of a node is simply its set of neighbors in the graph, e.g., for C: B and D; for A: B and D.

Markov Networks Applications (1): Computer Vision, where they are called Markov Random Fields. Uses include stereo reconstruction, image segmentation, and object recognition. Typically these are pairwise MRFs: each variable corresponds to a pixel (or superpixel), and edges (factors) correspond to interactions between adjacent pixels in the image. E.g., in segmentation, factors range from generically penalizing discontinuities to encoding specific relations such as "road under car". A pairwise-MRF sketch follows.
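As a hedged illustration of the pairwise-MRF idea (an assumption here, not taken from the slides): a common segmentation formulation scores a labeling by per-pixel unary costs plus a Potts pairwise term that penalizes adjacent pixels with different labels. The numbers below are made up.

```python
import numpy as np

def mrf_energy(labels, unary, smoothness=1.0):
    """Pairwise-MRF (Potts) energy of a labeling.
    labels: (H, W) int array of pixel labels.
    unary:  (H, W, K) array; unary[i, j, k] = cost of label k at pixel (i, j).
    The pairwise term adds `smoothness` for each pair of adjacent
    pixels with different labels (a generic discontinuity penalty)."""
    H, W = labels.shape
    e = unary[np.arange(H)[:, None], np.arange(W)[None, :], labels].sum()
    e += smoothness * (labels[1:, :] != labels[:-1, :]).sum()  # vertical pairs
    e += smoothness * (labels[:, 1:] != labels[:, :-1]).sum()  # horizontal pairs
    return e

# Toy example: 2x2 image, 2 labels, made-up unary costs.
unary = np.array([[[0.1, 0.9], [0.2, 0.8]],
                  [[0.7, 0.3], [0.6, 0.4]]])
labels = np.array([[0, 0], [1, 1]])
print(mrf_energy(labels, unary))  # lower energy = better labeling
```

Inference then amounts to finding a low-energy labeling, e.g. with graph cuts or the approximate methods discussed in this course.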

Image segmentation. (The slide's example segmentations are not reproduced in this transcript.)

Markov Networks Applications (2): Sequence Labeling in NLP and Bioinformatics. Conditional random fields (next class, Wed).

Learning Goals for today's class. You can:
- Justify the need for undirected graphical models (Markov Networks)
- Interpret local models (factors/potentials) and combine them to express the joint
- Define independencies and the Markov blanket for Markov Networks
- Perform exact and approximate inference in Markov Networks
- Describe a few applications of Markov Networks

How to prepare for the midterm (Mon, Oct 26; we will start at 9am sharp):
- Keep working on Assignment 2!
- Go to office hours
- Review the learning goals (listed at the end of the slides for each lecture; a complete list will be posted)
- Revise all the clicker questions and practice exercises
- More practice material will be posted today

How to acquire factors?
