
Learning Models of Relational Stochastic Processes Sumit Sanghai.


1 Learning Models of Relational Stochastic Processes Sumit Sanghai

2 Motivation Features of real-world domains –Multiple classes, objects, relations [diagram: example objects P1–P3, A1–A2, V1 connected by relations]

3 Motivation Features of real-world domains –Multiple classes, objects, relations –Uncertainty [diagram: the same objects, with some relations uncertain]

4 Motivation Features of real-world domains –Multiple classes, objects, relations –Uncertainty –Changes with time [diagram: new objects P4–P6 and A3 appear over time]

5 Relational Stochastic Processes Features –Multiple classes, objects, relations –Uncertainty –Change over time Examples: Social networks, molecular biology, user activity modeling, web, plan recognition, … Growth inherent or due to explicit actions Most large datasets are gathered over time –Explore dependencies over time –Predict future

6 Manufacturing Process

7 Manufacturing Process Paint(A, blue)

8 Manufacturing Process Paint(A, blue) Bolt(B, C)

9 Manufacturing Process Paint(A, blue) Bolt(B, C) CPT for A.col: P(A.col(t+1) = blue | A.col(t) = red) = 0.9, P(red | red) = 0.1, P(blue | blue) = 0.95, P(red | blue) = 0.05
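A minimal sketch of this CPT as code, assuming the probability assignments reconstructed above (the dictionary layout and function name are illustrative):

```python
import random

# P(A.col at t+1 | A.col at t) after the action Paint(A, blue),
# using the probabilities reconstructed from the CPT above.
paint_blue_cpt = {
    "red":  {"red": 0.10, "blue": 0.90},
    "blue": {"red": 0.05, "blue": 0.95},
}

def sample_next_color(current_color, cpt=paint_blue_cpt):
    """Sample the color of part A at time t+1 given its color at time t."""
    colors = list(cpt[current_color])
    weights = [cpt[current_color][c] for c in colors]
    return random.choices(colors, weights=weights, k=1)[0]

print(sample_next_color("red"))   # "blue" about 90% of the time
```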

10 Why are they different? Modeling modification, creation, and deletion of objects and relationships Modeling actions (preconditions/effects), activities, plans Can't just throw "time" into the mix Summarizing object information Learning can be made easier by concentrating on temporal dependencies Sophisticated inference techniques like particle filtering may be applicable

11 Outline Background: Dynamic Bayes Nets Dynamic Probabilistic Relational Models Inference in DPRMs Learning with Dynamic Markov Logic Nets Future Work

12 Dynamic Bayesian Networks DBNs model change in uncertain variables over time Each time slice consists of state/observation variables A Bayesian network models the dependency of the current time slice on previous time slice(s) At each node, a conditional model (CPT, logistic regression, etc.) [diagram: two time slices t and t+1 with state variables X1, X2 and observation variable Y]
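A rough sketch of the two-slice structure as a parent map; the edge set is an assumed reading of the slide's figure, not taken verbatim from the talk:

```python
# Two-slice DBN structure: each node at slice t+1 lists its parents.
# X1, X2 are state variables, Y is the observation (assumed edges).
dbn_parents = {
    "X1[t+1]": ["X1[t]"],               # inter-slice dependency
    "X2[t+1]": ["X1[t]", "X2[t]"],      # inter-slice dependencies
    "Y[t+1]":  ["X1[t+1]", "X2[t+1]"],  # observation depends on current state
}

# Each node would also carry its own conditional model
# (a CPT, a logistic regression, etc.) keyed by its parents' values.
```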

13 Inference and learning in DBNs Inference –All techniques from BNs are used –Special techniques like Particle Filtering, Boyen-Koller, Factored Frontier, etc. can be used for state monitoring Learning –Problem essentially the same as for BNs –Structural EM used in case of missing data Needs a fast inference algorithm

14 Particle Filtering in DBNs Task: State monitoring Particle Filter –Samples represent state distribution at time t –Generate samples for t+1 based on model –Reweight according to observations –Resample Particles stay in most probable regions Performs poorly in high-dimensional spaces
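A generic bootstrap particle-filter step matching the loop described above; transition_sample and observation_likelihood stand in for the domain's transition and observation models:

```python
import random

def particle_filter_step(particles, observation,
                         transition_sample, observation_likelihood):
    """One bootstrap particle-filter step: propagate, reweight, resample."""
    # 1. Propagate each particle through the transition model.
    proposed = [transition_sample(p) for p in particles]
    # 2. Reweight by how well each particle explains the observation.
    weights = [observation_likelihood(observation, p) for p in proposed]
    total = sum(weights)
    if total == 0:                       # degenerate case: fall back to uniform weights
        weights = [1.0] * len(proposed)
        total = float(len(proposed))
    weights = [w / total for w in weights]
    # 3. Resample so particles concentrate in high-probability regions.
    return random.choices(proposed, weights=weights, k=len(particles))
```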

15 Incorporating time in First Order Probabilistic Models Simple approach: Time is one of the arguments in first-order logic –Year(p100, 1996), Hot(SVM, 2004) But time is special –The world grows in the direction of time, e.g. Hot(SVM, 2005) depends on Hot(SVM, 2004) –Hard to discover rules that help in state monitoring, future prediction, etc. –Blowup from incorporating time explicitly –Special inference algorithms no longer applicable

16 Dynamic Probabilistic Relational Models A DPRM is a PRM replicated over time slices –just as a DBN is a Bayes Net replicated over time slices In a DPRM, the attributes of each class depend on attributes of the same or a related class –Related class from the current/previous time slice –Previous relation “Unrolled” DPRM = DBN

17 DPRMs: Example [diagram: two time slices. At time t: PLATE 1 (Color: Red, #Holes: 4, Bolted-To slot) and BRACKET 7 (Color: Blue, Size: Large); the action Bolt is applied, giving the state at time t+1]

18 Inference in DPRMs Relational uncertainty → huge state space –E.g. 100 parts → 10,000 possible attachments Particle filter likely to perform poorly –Rao-Blackwellization? Assumptions (relaxed afterwards) –Uncertain reference slots do not appear in slot chains or as parents –Single-valued uncertain reference slots

19 Rao-Blackwellization in DPRMs Sample propositional attributes –Smaller space and less error –Constitute the particle For each uncertain reference slot and particle state –Maintain a multinomial distribution over the set of objects in the target class –Conditioned on values of propositional variables
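A hypothetical sketch of what one Rao-Blackwellized particle could store, following the slide: sampled propositional attributes plus an exactly maintained multinomial per uncertain reference slot (all names and the update rule shown are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class RBParticle:
    # Sampled values of propositional attributes, e.g.
    # {("Bracket_1", "Color"): "Red", ("Bracket_1", "Size"): "Large"}
    propositional: dict = field(default_factory=dict)
    # For each uncertain reference slot, a multinomial over target objects, e.g.
    # {("Bracket_1", "Bolted-To"): {"Pl_1": 0.1, "Pl_2": 0.1, "Pl_3": 0.2, ...}},
    # maintained exactly, conditioned on the sampled propositional values.
    slot_distributions: dict = field(default_factory=dict)

    def update_slot(self, slot, likelihoods):
        """Bayes-update one slot distribution with per-object observation likelihoods."""
        dist = self.slot_distributions[slot]
        posterior = {obj: p * likelihoods.get(obj, 1.0) for obj, p in dist.items()}
        z = sum(posterior.values()) or 1.0
        self.slot_distributions[slot] = {obj: p / z for obj, p in posterior.items()}
```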

20 RBPF: A Particle [table: one particle. Propositional attributes are sampled values, e.g. Bracket 1: Color Red, Size Large, Wt 2 lbs; Bracket 2: Color Blue, Size Small, Wt 1 lb. For each uncertain reference slot (Bolted-To-1, Bolted-To-2) the particle keeps a multinomial distribution over the candidate plates Pl 1 – Pl 10, e.g. 0.1, 0.1, 0.2, 0.1, 0.5, …]

21 Experimental Setup Assembly Domain (AIPS98) –Objects: Plates, Brackets, Bolts –Attributes: Color, Size, Weight, Hole type, etc. –Relations: Bolted-To, Welded-To –Propositional Actions: Paint, Polish, etc. –Relational Actions: Weld, Bolt –Observations Fault model –Faults cause uncertainty –Actions and observations uncertain –Governed by global fault probability (f_p) Task: State Monitoring

22 RBPF vs PF

23 Problems with DPRMs Relationships modeled using slots –Slots and slot chains hard to represent and understand Modeling ternary relationships becomes hard Small subset of first-order logic (conjunctive expressions) used to specify dependencies Independence between objects participating in multi-valued slots Unstructured conditional model

24 Relational Dynamic Bayes Nets Replace slots and attributes with predicates (like in MLNs) Each predicate has parents which are other predicates The conditional model is a first-order probability tree The predicate graph is acyclic A copy of the model at each time slice
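As a rough illustration of a first-order probability tree (not the authors' actual representation), the conditional model for one predicate can be written as nested tests with probabilities at the leaves; the predicate, the tests, and the fault_prob value are assumptions:

```python
# A tiny first-order probability tree for P(Bolted(b, p) at t+1):
# internal nodes test parent predicates at time t, leaves hold probabilities.
def p_bolted_next(bolted_now, action_is_bolt, fault_prob=0.1):
    """Probability that Bolted(b, p) holds at t+1, given the state and action at t."""
    if bolted_now:
        return 1.0 - fault_prob   # already bolted: stays bolted unless a fault occurs
    if action_is_bolt:
        return 1.0 - fault_prob   # Bolt(b, p) executed: succeeds unless a fault occurs
    return 0.0                    # otherwise the pair stays unbolted
```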

25 Inference: Relaxing the assumptions RBPF is infeasible when the assumptions are relaxed Observation: Similar objects behave similarly Sample all predicates –Small number of samples, but large relational predicate space –Smoothing: the likelihood of a small number of points can tell the relative likelihood of others Given a particle, smooth each relational predicate towards similar states

26 Simple Smoothing [table: one particle for Bolted-To(Bracket_1, X) over Pl 1 – Pl 5 (reference distribution 0.1, 0.1, 0.2, 0.1, 0.5), with propositional attributes Color: Red, Size: Large, Wt: 2 lbs; the sampled 0/1 values 1, 0, 1, 1, 1 are smoothed to 0.9, 0.4, 0.9, 0.9, 0.9] Particle Filtering: a particle after smoothing
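One reading of the smoothing step, sketched under the assumption that each sampled 0/1 value is blended with the mean over all pairs of the relation (the mixing weight of 0.5 is an assumption, though it is consistent with the 0.9 / 0.4 values above):

```python
def simple_smooth(sampled, alpha=0.5):
    """Smooth each sampled 0/1 value toward the mean over all object pairs.

    This makes every pair depend on the values of all other pairs of the
    relation (the problem noted on the next slide). alpha is an assumed
    mixing weight, not a value taken from the slides.
    """
    mean = sum(sampled.values()) / len(sampled)
    return {obj: alpha * v + (1 - alpha) * mean for obj, v in sampled.items()}

sampled = {"Pl_1": 1, "Pl_2": 0, "Pl_3": 1, "Pl_4": 1, "Pl_5": 1}
print(simple_smooth(sampled))   # Pl_2 is pulled up to 0.4, the others down to 0.9
```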

27 Simple Smoothing Problems Simple smoothing: the probability of an object pair depends upon the values of all other object pairs of the relation –E.g. P(Bolt(Br_1, Pl_1)) depends on Bolt(Br_i, Pl_j) for all i and j Solution: Make an object pair depend more upon similar pairs –Similarity given by properties of the objects

28 Abstraction Lattice Smoothing An abstraction represents a set of similar object pairs –Bolt(Br1, Pl1) –Bolt(red, large) –Bolt(*,*) Abstraction Lattice: a hierarchy of abstractions –Each abstraction has a coefficient

29 Abstraction Lattice: an example

30 Abstraction Lattice Smoothing P(Bolt(B1, P1)) = w1 P_pf(Bolt(B1, P1)) + w2 P_pf(Bolt(red, large)) + w3 P_pf(Bolt(*, *)) Joint distributions are estimated using relational kernel density estimation –The kernel K(x, x_i) gives the distance between the state and the particle –Distance measured using abstractions
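A minimal sketch of the weighted combination above; the particular estimates and coefficients are illustrative, not values from the experiments:

```python
def abstraction_smooth(estimates, weights):
    """Combine particle-filter estimates taken at different abstraction levels.

    estimates: {abstraction: P_pf estimate at that level}
    weights:   {abstraction: lattice coefficient}, assumed to sum to 1
    """
    return sum(weights[a] * estimates[a] for a in estimates)

# Illustrative values only (not from the slides):
estimates = {"Bolt(B1,P1)": 0.20, "Bolt(red,large)": 0.35, "Bolt(*,*)": 0.10}
weights   = {"Bolt(B1,P1)": 0.60, "Bolt(red,large)": 0.30, "Bolt(*,*)": 0.10}
print(abstraction_smooth(estimates, weights))
```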

31 Abstraction Smoothing vs PF

32 Learning with DMLNs Task: Can MLN learning be directly applied to learn time-based models? Domains –Predicting author, topic distribution in High-Energy Theoretical Physics papers from KDDCup 2003 –Learning action models of manufacturing assembly processes

33 Learning with DMLNs DMLNs = MLNs + Time predicates –R(x,y) -> R(x,y,t), Succ(11, 10), Gt(10,5) Now directly apply MLN structure learning algorithm (Stanley and Pedro) To make it work –Use templates to model Markovian assumption –Restrict number of predicates per clause –Add background knowledge
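A small sketch of how the templating described above might be mechanized: add a time argument to every predicate and instantiate a Markovian clause template over successive steps (the clause syntax and helper names are illustrative):

```python
def temporalize(predicate, args, t="t"):
    """Turn a static atom R(x, y) into its timed version R(x, y, t)."""
    return f"{predicate}({', '.join(list(args) + [t])})"

def markov_template(predicate, args):
    """Markovian template: the atom at step t1 depends on the atom at the previous step t."""
    return (f"{temporalize(predicate, args, 't')} ^ Succ(t1, t) "
            f"=> {temporalize(predicate, args, 't1')}")

print(temporalize("R", ["x", "y"]))      # R(x, y, t)
print(markov_template("Hot", ["SVM"]))   # Hot(SVM, t) ^ Succ(t1, t) => Hot(SVM, t1)
```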

34 Physics dataset

35 Manufacturing Assembly

36 Current and Future Work Current Work –Programming by Demonstration using Dynamic First Order Probabilistic Models Future Work –Learning object creation models –Learning in presence of missing data –Modeling hierarchies (very useful for fast inference) –Applying abstraction smoothing to "static" relational models

