
Published by Kaylee McGinnis; modified over 4 years ago.

Slide 1: Probabilistic Resolution

Slide 2: Logical reasoning. Absolute implications: office ⇐ meeting, office ⇐ talk, office ⇐ pick_book. But what if my rules are not absolute?

Slide 3: Migrating to probabilities: graphical models. [Figure: model over noisy_office, meeting, talk, pick_book] Actually, the original model does not justify the last row.

Slide 4: Migrating to probabilities: graphical models. [Figure: model over noisy_office, meeting, talk, pick_book]

Slide 5: Variable Elimination (VE). [Figure: model over noisy_office, meeting, talk, pick_book]

Slide 6: Variable Elimination (VE). [Figure: eliminating meeting from the factors φ(noisy_office, pick_book, talk, meeting) and φ(meeting)]

Slide 7: Variable Elimination (VE). [Figure: model after elimination, over noisy_office, talk, pick_book]

Slide 8: Variable Elimination (VE). [Figure: model after elimination, over noisy_office, pick_book]

Slide 9: Variable Elimination (VE). [Figure: model after elimination, over noisy_office alone]
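The elimination sequence of slides 5-9 can be sketched in code. The numeric tables below (priors on meeting, talk, pick_book and a noisy-OR-style CPT for noisy_office) are illustrative assumptions; the slides do not give the numbers:

```python
import itertools

def multiply(f, g):
    """Pointwise product of two factors; a factor is (vars, table)."""
    fv, ft = f
    gv, gt = g
    vs = fv + [v for v in gv if v not in fv]
    table = {}
    for assn in itertools.product([0, 1], repeat=len(vs)):
        env = dict(zip(vs, assn))
        table[assn] = (ft[tuple(env[v] for v in fv)]
                       * gt[tuple(env[v] for v in gv)])
    return (vs, table)

def eliminate(f, var):
    """Sum a variable out of a factor."""
    fv, ft = f
    keep = [v for v in fv if v != var]
    table = {}
    for assn, val in ft.items():
        env = dict(zip(fv, assn))
        key = tuple(env[v] for v in keep)
        table[key] = table.get(key, 0.0) + val
    return (keep, table)

# Hypothetical priors and a noisy-OR-style CPT for noisy_office
# (assumed numbers, not taken from the slides).
def prior(v, p):
    return ([v], {(1,): p, (0,): 1.0 - p})

parents = ["meeting", "talk", "pick_book"]
cpt_table = {}
for assn in itertools.product([0, 1], repeat=3):
    p_on = 1.0 - 0.99 * (0.1 ** sum(assn))  # assumed noisy-OR parameters
    cpt_table[(1,) + assn] = p_on
    cpt_table[(0,) + assn] = 1.0 - p_on
f = (["noisy_office"] + parents, cpt_table)

# Eliminate the neighbors one by one, as on slides 6-9.
for v, p in [("meeting", 0.3), ("talk", 0.4), ("pick_book", 0.2)]:
    f = eliminate(multiply(f, prior(v, p)), v)

z = sum(f[1].values())           # should be 1: all inputs are normalized
p_noisy_office = f[1][(1,)] / z  # marginal P(noisy_office = 1)
```

After the three eliminations only the query variable remains, mirroring the progression from slide 5 to slide 9.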

Slide 10: Graphical models generalize logic. [Figure: deterministic model over office, meeting, talk, pick_book]

Slide 11: VE generalizes Resolution. Resolution: from (A ∨ B) and (¬B ∨ C), derive (A ∨ C). Variable Elimination: eliminating B from factors over (A, B) and (B, C) yields a factor over (A, C). There is still an important difference, though.
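The correspondence can be checked concretely: encode each clause as a 0/1 indicator factor and sum out the resolved variable; the support of the resulting factor is exactly the resolvent clause. A minimal sketch:

```python
import itertools

# Resolution as variable elimination: each clause becomes a 0/1 indicator
# factor, and summing out B yields a factor on (A, C) whose support
# (nonzero entries) is exactly the resolvent clause (A v C).

def clause_a_or_b(a, b):        # indicator of the clause (A v B)
    return 1 if (a or b) else 0

def clause_not_b_or_c(b, c):    # indicator of the clause (not B v C)
    return 1 if (not b) or c else 0

resolvent = {}
for a, c in itertools.product([0, 1], repeat=2):
    # Variable elimination step: sum B out of the product of the two factors.
    resolvent[(a, c)] = sum(clause_a_or_b(a, b) * clause_not_b_or_c(b, c)
                            for b in (0, 1))

# Zero exactly where (A v C) is false, i.e. only at A = C = 0.
support = {k: int(v > 0) for k, v in resolvent.items()}
```

The numeric entries also carry multiplicities (the A = C = 1 entry counts both values of B), which pure resolution discards.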

Slide 12: Story so far. Logic uses absolute rules; probabilistic models can deal with noise, and generalize logic; but...

Slide 13: Logical reasoning ends early. Rules: office ⇐ meeting, office ⇐ talk, office ⇐ pick_book, ... Given the evidence meeting, we are done after considering the first rule alone.

Slide 14: Ending early in a deterministic graphical model. Variable Elimination uses all nodes to calculate P(office | meeting). [Figure: model over office, meeting, talk, pick_book]

Slide 15: Ending early in a deterministic graphical model. But if meeting is observed, we don't need to look beyond it. [Figure: model over office, talk, pick_book]

Slide 16: Ending early in a deterministic graphical model. We can use smarter algorithms to end early here as well. [Figure: model over office, talk, pick_book]

Slide 17: Ending early in non-deterministic graphical models. Calculating P(noisy_office | meeting). [Figure: model over noisy_office, meeting, talk, pick_book]

Slide 18: Ending early in non-deterministic graphical models. P(noisy_office | meeting) depends on all nodes. [Figure: model over noisy_office, talk, pick_book]

Slide 19: Ending early in non-deterministic graphical models. [Figure: model over noisy_office, talk, pick_book] But we already know P(noisy_office | meeting) ∈ [0.99, 0.9992]. Can we take advantage of this?

Slide 20: Goal. A graphical model inference algorithm that derives a bound on the solution so far, and ends as soon as the bound is good enough: an anytime algorithm.

Slide 21: Probabilistic Resolution. [Figure: Resolution (from A ∨ B and ¬B ∨ C, derive A ∨ C) side by side with Variable Elimination over A, B, C] Variable Elimination generalizes Resolution, but it neither provides intermediate results nor ends early. Probabilistic Resolution = VE + ending early.

Slide 22: Story so far. Logic uses absolute rules; probabilistic models can deal with noise, and generalize logic; logic ends as soon as possible, but graphical models do not; they can, if we are willing to use bounds. But how to calculate bounds?

Slide 23: But how to get bounds? [Figure: query Q connected to neighbors N1, N2, N3, N4, ...]

Slide 24: But how to get bounds? [Figure: query Q connected to neighbors N1, N2, N3, N4, ...]

Slide 25: But how to get bounds? [Figure: query Q connected to neighbors N1, N2, N3, N4]

Slide 26: [Figure: query Q connected to a single neighbor N]

Slide 27: [Figure: Q—N with factors φ1(Q, N) and φ2(N)] P(Q) ∝ Σ_N φ1(Q, N) φ2(N). Writing the message from N as P2(N): P(Q) ∝ Σ_N φ1(Q, N) P2(N), so P(Q) = f(P2(N)).

Slide 28: But how to get bounds? P(Q) = f(P2(N)). [Figure: f maps the interval [0, 1] of P2(N) values to an interval of P(Q) values]

Slide 29: But how to get bounds? P(Q) = f(P2(N)). [Figure: f maps the probability simplex over N, with vertices (1,0,0), (0,1,0), (0,0,1), into the simplex over Q]

Slide 30: But how to get bounds? P(Q) = f(P2(N)). [Figure: the image of the simplex under f is the bound] Infinite number of points! (Note: justify the inner shape being equal to the outer one.)

Slide 31: But how to get bounds? P(Q) = f(P2(N)). [Figure: f maps the simplex over N into the simplex over Q] Vertices are enough.

Slide 32: But how to get bounds? P(Q) = f(P2(N)). [Figure: f maps the simplex over N into the simplex over Q] No necessary correspondence.

Slide 33: But how to get bounds? P(Q) = f(P2(N)). [Figure: f maps the simplex over N, vertices (1,0,0), (0,1,0), (0,0,1), to an interval of P(Q)] Correspondence would be impossible in this case. (Note: make a slide with the opposite: segment to triangle.)

Slide 34: But how to get bounds? P(Q) = f(P2(N)). [Figure: f maps the interval [0, 1] of P2(N) values to an interval of P(Q) values]

Slide 35: Example I. [Figure: Q—N] For P2(N) ranging over [0, 1], P(Q) = f(P2(N)) ranges over [0.36, 0.67]:
P(Q) ∝ Σ_N φ(Q, N) P2(N) = φ(Q, 0) P2(N=0) + φ(Q, 1) P2(N=1).
For P2(N=0) = 1: P(Q) ∝ φ(Q, 0)·1 + φ(Q, 1)·0 = φ(Q, 0), so P(Q=1) = φ(1, 0) / (φ(0, 0) + φ(1, 0)) = 0.4 / (0.7 + 0.4) ≈ 0.36.
For P2(N=1) = 1: P(Q) ∝ φ(Q, 0)·0 + φ(Q, 1)·1 = φ(Q, 1), so P(Q=1) = φ(1, 1) / (φ(0, 1) + φ(1, 1)) = 0.6 / (0.3 + 0.6) ≈ 0.67.
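The two vertex evaluations above can be reproduced directly; the factor values φ(0,0)=0.7, φ(1,0)=0.4, φ(0,1)=0.3, φ(1,1)=0.6 are the ones appearing in the slide's arithmetic:

```python
# Reproducing Example I: evaluate f at the two vertices of the simplex over N.
phi = {(0, 0): 0.7, (1, 0): 0.4, (0, 1): 0.3, (1, 1): 0.6}  # phi(q, n)

def p_q1(p_n1):
    """P(Q = 1) as a function of the incoming message P2(N = 1) = p_n1."""
    unnorm = {q: phi[(q, 0)] * (1.0 - p_n1) + phi[(q, 1)] * p_n1
              for q in (0, 1)}
    return unnorm[1] / (unnorm[0] + unnorm[1])

# Vertices of the 1-simplex: P2(N = 0) = 1 and P2(N = 1) = 1.
lo, hi = sorted(p_q1(v) for v in (0.0, 1.0))
# lo = 0.4 / 1.1 ~ 0.36 and hi = 0.6 / 0.9 ~ 0.67, the slide's interval.
```

Since f is a ratio of linear functions of P2(N), every interior message lands between the two vertex values, which is why checking vertices suffices.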

Slide 36: Example II. [Figure: Q—N] P2(N) ranging over [0, 1] maps to the single point P(Q=1) = 0.5.

Slide 37: Example III. [Figure: Q—N; f maps P2(N), ranging over the simplex with vertices (1,0,0), (0,1,0), (0,0,1), to P(Q) over [0, 1]]

Slide 38: Example IV. [Figure: model over noisy_office, meeting, talk, pick_book]

Slide 39: Example IV. [Figure: model after elimination, over noisy_office, talk, pick_book]

Slide 40: Example IV. [Figure: model over noisy_office, talk, pick_book; value 0.4]

Slide 41: Example IV. [Figure: model after elimination, over noisy_office, pick_book]

Slide 42: Example IV. [Figure: model over noisy_office, pick_book; value 1]

Slide 43: Example IV. [Figure: model after elimination, over noisy_office alone]

Slide 44: Algorithm. Same as Variable Elimination, but update the bounds every time a neighbor is eliminated; bounds always improve at each neighbor elimination; there is a trade-off between the granularity of bound updates (note: explain granularity) and ordering efficiency.
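One way to picture this loop is on a small chain Q—N1—N2: after absorbing only the factor adjacent to Q, bound P(Q=1) by evaluating at the simplex vertices of the not-yet-absorbed message; absorbing the rest yields the exact value, which falls inside the bound. All numeric tables here are illustrative assumptions:

```python
# Anytime-VE sketch on an assumed chain Q - N1 - N2 with made-up factors.
phi1 = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.3, (1, 1): 0.7}  # phi1(q, n1)
phi2 = {(0, 0): 0.6, (1, 0): 0.4, (0, 1): 0.1, (1, 1): 0.9}  # phi2(n1, n2)
p_n2 = {0: 0.5, 1: 0.5}                                       # prior on N2

def normalize(w):
    z = w[0] + w[1]
    return {q: w[q] / z for q in (0, 1)}

def q_given_message(msg):
    """P(Q = 1) once the message over N1 is fixed to msg."""
    return normalize({q: sum(phi1[(q, n)] * msg[n] for n in (0, 1))
                      for q in (0, 1)})[1]

# Step 1: N1 not yet eliminated -> bound from the simplex vertices of P(N1).
bound = sorted(q_given_message({0: 1.0 - v, 1: v}) for v in (0.0, 1.0))

# Step 2: eliminate N2, then N1 -> the exact answer.
msg_n1 = {n1: sum(phi2[(n1, n2)] * p_n2[n2] for n2 in (0, 1))
          for n1 in (0, 1)}
exact = q_given_message(normalize(msg_n1))
# The exact value lies inside the earlier bound, so the algorithm could
# have stopped at step 1 if that interval was already good enough.
```

Each absorbed neighbor shrinks the vertex-evaluated interval, which is the sense in which the bound always improves.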

Slide 45: Complexity issues. Calculating the bound is exponential in the size of the neighborhood component, so complexity is exponential in the largest neighborhood component during execution; this can be larger than the tree-width; but finding the tree-width is hard anyway.

Slide 46: Preliminary Tests

Slide 47: Conclusions. Making probabilistic inference more like logical inference; getting an anytime algorithm in the process; preparing the ground for first-order Probabilistic Resolution.

