MPE and Partial Inversion in Lifted Probabilistic Variable Elimination. Rodrigo de Salvo Braz, University of Illinois at Urbana-Champaign, with Eyal Amir and Dan Roth.


Page 2 Repetitive patterns in graphical models [Figure: grounded network with nodes epidemic(measles), epidemic(flu), sick(mary,measles), sick(mary,flu), sick(bob,measles), sick(bob,flu), hospital(mary), hospital(bob), and further repetitions of the same pattern]

Page 3 Repetitive patterns in graphical models [Same figure, highlighting one factor: φ(sick(mary,measles), epidemic(measles))]

Page 4 Repetitive patterns in graphical models [Same figure as Page 2]

Page 5 Lots of Redundancy! [Same figure as Page 2]

Page 6 Representing structure [Figure: the repeated sick/epidemic fragments collapsed into one template over sick(P,D) and epidemic(D)] Poole (2003) named these parfactors, for “parameterized factors”

Page 7 Parfactor [Figure: factor linking sick(Person,Disease) and epidemic(Disease)] ∀ Person, Disease: φ(sick(Person,Disease), epidemic(Disease))

Page 8 Parfactor [Figure: factor linking sick(Person,Disease) and epidemic(Disease)] ∀ Person, Disease: φ(sick(Person,Disease), epidemic(Disease)), Person ≠ mary, Disease ≠ flu
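A parfactor like the one above is just a potential plus atom templates, logical variables, and inequality constraints. A minimal sketch of such a representation (class and field names are illustrative choices of mine, not from the paper):

```python
from itertools import product

# Hypothetical minimal parfactor: atom templates over logical variables,
# a potential over the atoms' truth values, and inequality constraints.
class Parfactor:
    def __init__(self, logvars, domains, atoms, potential, constraints=()):
        self.logvars = logvars          # e.g. ("Person", "Disease")
        self.domains = domains          # domain of each logical variable
        self.atoms = atoms              # templates like ("sick", "Person", "Disease")
        self.potential = potential      # function over atom truth values
        self.constraints = constraints  # e.g. (("Person", "!=", "mary"),)

    def groundings(self):
        """Yield one substitution per allowed assignment of logical variables."""
        for values in product(*(self.domains[v] for v in self.logvars)):
            sub = dict(zip(self.logvars, values))
            if all(sub[v] != c for v, _, c in self.constraints):
                yield sub

pf = Parfactor(
    logvars=("Person", "Disease"),
    domains={"Person": ["mary", "bob"], "Disease": ["measles", "flu"]},
    atoms=(("sick", "Person", "Disease"), ("epidemic", "Disease")),
    potential=lambda sick, epidemic: 2.0 if sick == epidemic else 1.0,
    constraints=(("Person", "!=", "mary"), ("Disease", "!=", "flu")),
)
print(len(list(pf.groundings())))  # 1: only (bob, measles) satisfies both constraints
```

The grounding enumerator exists only for checking; the point of lifted inference is to avoid ever calling it.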

Page 9 Lifted Probabilistic Inference Goal: to perform inference at the first-order level, without resorting to grounding. First-Order Variable Elimination (FOVE) generalizes Variable Elimination in propositional graphical models by eliminating whole classes of random variables at once.
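For reference, the propositional Variable Elimination that FOVE generalizes can be sketched in a few lines (a toy implementation with made-up factor tables, not the paper's algorithm):

```python
from itertools import product as iproduct

# Toy propositional VE: a factor is (variables, table). Eliminating a
# variable multiplies the factors that mention it and sums it out.
def eliminate(factors, var, nvals=2):
    touching = [f for f in factors if var in f[0]]
    rest = [f for f in factors if var not in f[0]]
    scope = sorted({v for vars_, _ in touching for v in vars_} - {var})
    table = {}
    for vals in iproduct(range(nvals), repeat=len(scope)):
        assign = dict(zip(scope, vals))
        s = 0.0
        for x in range(nvals):
            assign[var] = x
            t = 1.0
            for vars_, tab in touching:
                t *= tab[tuple(assign[v] for v in vars_)]
            s += t
        table[vals] = s
    return rest + [(tuple(scope), table)]

# P(A) and P(B|A) written as plain tables (values chosen arbitrarily)
fA = (("A",), {(0,): 0.6, (1,): 0.4})
fAB = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
factors = eliminate([fA, fAB], "A")
marg = factors[0][1]                 # marginal over B
print(marg[(0,)], marg[(1,)])
```

Running this gives the marginal P(B): roughly 0.62 and 0.38. FOVE performs this same operation once per class of random variables instead of once per ground variable.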

Page 10 Inference - Inversion Elimination (IE) P(hospital(mary) | sick(mary, measles)) = ? hospital(mary) sick(mary, D) epidemic(D)

Page 11 Inference - Inversion Elimination (IE) P(hospital(mary) | sick(mary, measles)) = ? hospital(mary) sick(mary, D) epidemic(D) [Unification splits the parfactor on D = measles]

Page 12 Inference - Inversion Elimination (IE) P(hospital(mary) | sick(mary, measles)) = ? sick(mary,measles) hospital(mary) sick(mary, D), D ≠ measles epidemic(measles) epidemic(D), D ≠ measles

Page 13 Inference - Inversion Elimination (IE) P(hospital(mary) | sick(mary, measles)) = ? sick(mary,measles) hospital(mary) sick(mary, D), D ≠ measles epidemic(measles) epidemic(D), D ≠ measles

Page 14 Inference - Inversion Elimination (IE) P(hospital(mary) | sick(mary, measles)) = ? sick(mary,measles) hospital(mary) sick(mary, D), D ≠ measles epidemic(measles) epidemic(D), D ≠ measles

Page 15 Inference - Inversion Elimination (IE) P(hospital(mary) | sick(mary, measles)) = ? sick(mary,measles) hospital(mary) sick(mary, D), D ≠ measles epidemic(D), D ≠ measles

Page 16 Inference - Inversion Elimination (IE) P(hospital(mary) | sick(mary, measles)) = ? hospital(mary) sick(mary, D), D ≠ measles epidemic(D), D ≠ measles

Page 17 Inference - Inversion Elimination (IE) P(hospital(mary) | sick(mary, measles)) = ? hospital(mary) sick(mary, D), D ≠ measles

Page 18 Inference - Inversion Elimination (IE) P(hospital(mary) | sick(mary, measles)) = ? hospital(mary)

Page 19 Inversion Elimination
Joint: ∏ᵢ φᵢ(Aᵢ)
Example: ∏_X φ₁(p(X)) ∏_{X,Y} φ₂(p(X), q(X,Y))
Marginalization by eliminating class q(X,Y):
∑_{q(X,Y)} ∏_X φ₁(p(X)) ∏_{X,Y} φ₂(p(X), q(X,Y)) = ∏_X φ₁(p(X)) ∑_{q(X,Y)} ∏_{X,Y} φ₂(p(X), q(X,Y))

Page 20 Inversion Elimination
∑_{q(X,Y)} ∏_{X,Y} φ₂(p(X), q(X,Y))
= ∏_{X,Y} ∑_{q(X,Y)} φ₂(p(X), q(X,Y))   (* depends on certain conditions)
= ∏_{X,Y} φ'(p(X))
= ∏_X ∏_Y φ'(p(X))
= ∏_X φ'(p(X))^|Y|
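The inversion step can be checked numerically on a tiny example with made-up potentials: because each ground q(X,Y) occurs in exactly one ground factor, summing all of them out equals a product of per-factor sums:

```python
from itertools import product

# Numeric sanity check of inversion (arbitrary potential, not from the
# slides): sum over all q(X,Y) of prod_{X,Y} phi(p(X), q(X,Y)) equals
# prod_X (sum_q phi(p(X), q))^|Y|.
Xs, Ys = ["x1", "x2"], ["y1", "y2", "y3"]
phi = lambda p, q: [[1.0, 2.0], [3.0, 0.5]][p][q]  # made-up potential
p_val = {"x1": 0, "x2": 1}                          # fixed assignment to p(X)

# Brute force: sum over all 2^(|X||Y|) assignments to the q atoms
pairs = list(product(Xs, Ys))
brute = 0.0
for bits in product([0, 1], repeat=len(pairs)):
    q_val = dict(zip(pairs, bits))
    term = 1.0
    for (x, y) in pairs:
        term *= phi(p_val[x], q_val[(x, y)])
    brute += term

# Inversion: push the sum inside the product over ground factors
inverted = 1.0
for x in Xs:
    inverted *= sum(phi(p_val[x], q) for q in (0, 1)) ** len(Ys)

print(abs(brute - inverted) < 1e-9)  # True
```

The brute-force side costs 2^|X||Y| terms; the inverted side costs |X| sums, which is the whole point of eliminating the class q(X,Y) at once.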

Page 21 Inversion Elimination - Conditions - I Eliminated atom must contain all logical variables in parfactors involved. sick(P,D) epidemic(D)

Page 22 Inversion Elimination - Conditions - I Eliminated atom must contain all logical variables in parfactors involved. sick(P,D) epidemic(D) Ok, contains both P and D

Page 23 Inversion Elimination - Conditions - I Eliminated atom must contain all logical variables in parfactors involved. sick(P,D) epidemic(D) Not OK: epidemic(D) is missing P

Page 24 Inversion Elimination - Conditions - I Eliminated atom must contain all logical variables in parfactors involved. q(Y,Z) p(X,Y) No atom can be eliminated

Page 25 Inversion Elimination - Conditions - I … sick(mary, flu) epidemic(flu) sick(mary, rubella) epidemic(rubella) … sick(mary, D) epidemic(D), D ≠ measles Eliminated atom must contain all logical variables - this guarantees that subproblems are disjoint.

Page 26 Inversion Elimination - Conditions - II epidemic(measles) epidemic(flu) epidemic(rubella) … epidemic(D1) epidemic(D2), D1 ≠ D2 Inversion Elimination Not OK: it requires eliminated RVs to occur in separate instances of the parfactor

Page 27 Counting Elimination - A Combinatorial Approach
∑_{e(D)} ∏_{D1≠D2} φ(e(D1), e(D2))
= ∑_{e(D)} φ(0,0)^#(0,0) φ(0,1)^#(0,1) φ(1,0)^#(1,0) φ(1,1)^#(1,1), where #(v1,v2) counts the pairs D1 ≠ D2 with values (v1,v2) in e(D)
= ∑_{e(D)} ∏_v φ(v)^#v in e(D), D1≠D2
= ∑_{i=0}^{|e(D)|} (|e(D)| choose i) ∏_v φ(v)^#v (from i)
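A quick numeric check of the counting argument with an arbitrary potential: the 2^n-term sum over assignments to e(D) collapses to n+1 terms grouped by the number i of true atoms, since for ordered pairs D1 ≠ D2 the counts i(i-1), (n-i)(n-i-1), and i(n-i) are determined by i alone:

```python
from itertools import product
from math import comb

# Sketch of counting elimination (made-up potential): summing out all
# e(D) from prod_{D1 != D2} phi(e(D1), e(D2)) depends only on how many
# of the e(D) are true.
n = 5
phi = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 0.5, (1, 1): 3.0}

# Brute force over all 2^n assignments, ordered pairs D1 != D2
brute = 0.0
for e in product([0, 1], repeat=n):
    term = 1.0
    for a in range(n):
        for b in range(n):
            if a != b:
                term *= phi[(e[a], e[b])]
    brute += term

# Counting: group the assignments by i = number of true e(D)
counted = 0.0
for i in range(n + 1):
    counted += (comb(n, i)
                * phi[(1, 1)] ** (i * (i - 1))
                * phi[(0, 0)] ** ((n - i) * (n - i - 1))
                * phi[(1, 0)] ** (i * (n - i))
                * phi[(0, 1)] ** ((n - i) * i))

print(abs(brute - counted) / brute < 1e-9)  # True
```

The brute-force sum has 2^n terms; the counted sum has n+1, which is why counting elimination stays tractable when inversion does not apply.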

Page 28 Counting Elimination - A Combinatorial Approach No shared logical variables between atoms, so counting can be done independently: φ(epidemic(D1, Region), epidemic(D2, Region))

Page 29 Uncovered by Inversion and Counting Eliminating epidemic from φ(epidemic(Disease1,Region), epidemic(Disease2,Region), donations) No logical variable occurs in all atoms, so no Inversion Elimination Shared logical variables, so no Counting Elimination

Page 30 Partial Inversion
∑_{e(D,R)} ∏_{D1≠D2, R} φ(e(D1,R), e(D2,R), d)
= ∏_R ∑_{e(D,r)} ∏_{D1≠D2} φ(e(D1,r), e(D2,r), d)
= ∏_R φ'(d)
= φ'(d)^|R|
= φ''(d)
Inversion elimination is the case where all logical variables are inverted and the subproblem is propositional.
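Partial inversion can likewise be checked numerically (made-up potential and sizes): inverting only R splits the sum into |R| identical subproblems, so the total is the per-region sum raised to |R|:

```python
from itertools import product

# Sketch of partial inversion (arbitrary potential): R appears in every
# atom, so the sum over e(D,R) factorizes into |R| independent copies,
# each itself a counting-elimination problem over e(D,r).
nD, nR, d = 3, 2, 1                                # d fixes the atom "donations"
phi = lambda a, b, dv: 1.0 + a + 2 * b + 0.5 * dv  # made-up potential

def inner_sum():
    """Sum over e(D, r) for a single region r."""
    total = 0.0
    for e in product([0, 1], repeat=nD):
        term = 1.0
        for a in range(nD):
            for b in range(nD):
                if a != b:
                    term *= phi(e[a], e[b], d)
        total += term
    return total

# Brute force over all assignments to e(D, R)
cells = [(a, r) for a in range(nD) for r in range(nR)]
brute = 0.0
for bits in product([0, 1], repeat=len(cells)):
    e = dict(zip(cells, bits))
    term = 1.0
    for r in range(nR):
        for a in range(nD):
            for b in range(nD):
                if a != b:
                    term *= phi(e[(a, r)], e[(b, r)], d)
    brute += term

print(abs(brute - inner_sum() ** nR) / brute < 1e-9)  # True
```

Here each per-region subproblem still has shared logical variables, so it would be handled by counting elimination, exactly as the slides describe.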

Page 31 Partial Inversion, graphically [Figure: the parfactor over epidemic(D1,R), epidemic(D2,R), donations with D1 ≠ D2 splits into independent copies for regions r_1 … r_10] Each instance is a counting elimination problem

Page 32 Partial inversion conditions Conditioned subsets must be disjoint φ(friends(P1, P2), friends(P2, P1), smoke(P1), smoke(P2)) Doesn’t work because subproblems share instances of friends.

Page 33 Second contribution: Lifted MPE In the propositional case, MPE is computed with factors that record the MPE assignment of the eliminated variables. [Figure: network over A, B, C, D]

Page 34 MPE In the propositional case, MPE is computed with factors that record the MPE assignment of the eliminated variables.
B D | φ | MPE
0 0 | 0.3 | C=…
(remaining rows garbled in the transcript; each stores the maximizing value of C)
[Figure: network over A, B, D after eliminating C]

Page 35 MPE In the propositional case, MPE is computed with factors that record the MPE assignment of the eliminated variables.
B | φ | MPE
0 | 0.5 | C=1, D=0
1 | 1.4 | C=1, D=1
[Figure: network over A, B after eliminating C and D]

Page 36 MPE In the propositional case, MPE is computed with factors that record the MPE assignment of the eliminated variables.
A | φ | MPE(B,C,D)
0 | 0.9 | B=0, C=1, D=0
1 | 0.7 | B=1, C=1, D=1

Page 37 MPE In the propositional case, MPE is computed with factors that record the MPE assignment of the eliminated variables.
φ = 0.9, MPE: A=0, B=0, C=1, D=0
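The propositional scheme on Pages 33-37 can be sketched as max-product elimination with traceback (a toy factor of my own, not the slides' numbers): eliminating a variable by maximization stores its maximizing value per assignment of the remaining variables, and the full MPE is read back at the end:

```python
# Toy max-product variable elimination with traceback. A factor is a
# dict from assignment tuples to values; eliminating a variable keeps,
# for each reduced assignment, the best value and the argmax.
def eliminate_max(factor, var_index):
    new_factor, argmax = {}, {}
    for assign, val in factor.items():
        rest = assign[:var_index] + assign[var_index + 1:]
        if rest not in new_factor or val > new_factor[rest]:
            new_factor[rest] = val
            argmax[rest] = assign[var_index]
    return new_factor, argmax

# One factor over (A, B); eliminate B, then A, then trace back
f = {(0, 0): 0.2, (0, 1): 0.9, (1, 0): 0.4, (1, 1): 0.1}
fA, argB = eliminate_max(f, 1)        # max over B for each A
fEmpty, argA = eliminate_max(fA, 0)   # max over A
a = argA[()]
b = argB[(a,)]
print(fEmpty[()], a, b)  # 0.9 0 1
```

Lifted MPE applies this same store-the-argmax idea, except that one stored assignment covers a whole class of ground variables at once.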

Page 38 MPE Same idea in the first-order case, but factors are quantified and so are assignments:
p(X) q(X,Y) | φ | MPE
0 0 | … | r(X,Y) = …
0 1 | … | r(X,Y) = …
1 0 | … | r(X,Y) = …
1 1 | … | r(X,Y) = 1
∀ X, Y φ(p(X), q(X,Y))

Page 39 MPE After Inversion Elimination of q(X,Y):
p(X) | φ' | MPE
0 | … | ∀Y q(X,Y) = 1, r(X,Y) = …
1 | … | ∀Y q(X,Y) = 0, r(X,Y) = 1
∀ X φ'(p(X))
Lifted assignments

Page 40 MPE After Inversion Elimination of p(X):
∀ X φ'(p(X)) becomes φ''
MPE: ∀X ∀Y p(X) = 0, q(X,Y) = 1, r(X,Y) = 0
φ''()

Page 41 MPE After Counting Elimination of e:
e(D1) e(D2) | φ | MPE
0 0 | … | r(D1,D2) = …
0 1 | … | r(D1,D2) = …
1 0 | … | r(D1,D2) = …
1 1 | … | r(D1,D2) = 1
∀ D1, D2 φ(e(D1), e(D2))
φ'' MPE:
(D1=0, D2=0): e(D1)=0, e(D2)=1, r(D1,D2) = …
(D1=0, D2=1): e(D1)=1, e(D2)=1, r(D1,D2) = …
(D1=1, D2=0): e(D1)=1, e(D2)=0, r(D1,D2) = …
(D1=1, D2=1): e(D1)=0, e(D2)=0, r(D1,D2) = 1

Page 42 Conclusions
Partial Inversion: a more general algorithm that subsumes Inversion Elimination.
Lifted MPE: the same idea as in propositional VE, but with lifted assignments that describe sets of basic assignments.
Universally quantified assignments come from Inversion; existentially quantified assignments come from Counting Elimination.
Ultimate goal: to perform lifted probabilistic inference the way logical inference is performed: without grounding and at a higher level.
