
1
Discrete Optimization in Computer Vision
Nikos Komodakis
Ecole des Ponts ParisTech, LIGM
Traitement de l'information et vision artificielle (Information processing and artificial vision)

2
Message passing algorithms for energy minimization

3
Message-passing algorithms
Central concept: messages
These methods work by propagating messages across the MRF graph
Widely used algorithms in many areas

4
Message-passing algorithms
But how do messages relate to optimizing the energy?
Let's look at a simple example first: the case where the MRF graph is a chain.

5
Message-passing on chains
[figure: MRF graph]

6
Message-passing on chains
[figure: corresponding lattice or trellis]

7
Message-passing on chains
Global minimum in linear time
Optimization proceeds in two passes:
Forward pass (dynamic programming)
Backward pass
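A minimal Python sketch of these two passes (the function name and the list-of-arrays encoding are my own assumptions, not from the slides): the forward pass computes, for every node, the cheapest cost of labeling the prefix ending at that node with each label, recording argmins; the backward pass traces those argmins to recover the optimal labeling.

```python
import numpy as np

def chain_map(unary, pairwise):
    """Exact MAP on a chain MRF by min-sum dynamic programming.

    unary: list of n arrays, unary[i][k] = cost of label k at node i
    pairwise: list of n-1 tables, pairwise[i][k][l] = cost of labels
              (k at node i, l at node i+1)
    Returns (optimal labels, minimum energy).
    """
    n = len(unary)
    # Forward pass: m[i][l] = min cost of the prefix 0..i with x_i = l
    m = [np.asarray(unary[0], dtype=float)]
    back = []  # argmin pointers, reused in the backward pass
    for i in range(1, n):
        cand = m[i - 1][:, None] + np.asarray(pairwise[i - 1], float)
        back.append(np.argmin(cand, axis=0))
        m.append(np.min(cand, axis=0) + np.asarray(unary[i], float))
    # Backward pass: trace the argmin pointers from the last node
    labels = [int(np.argmin(m[-1]))]
    for i in range(n - 2, -1, -1):
        labels.append(int(back[i][labels[-1]]))
    labels.reverse()
    return labels, float(np.min(m[-1]))
```

Because each step takes one min over L labels for each of L labels, the whole chain costs O(n·L²), i.e., linear in the number of nodes.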

8
Message-passing on chains
(example on board: algebraic derivation of the messages)
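The algebraic derivation sketched on the board follows the standard min-sum recurrence (notation assumed here: unary potentials θ_i, pairwise potentials θ_{i-1,i}):

```latex
m_1(x_1) = \theta_1(x_1), \qquad
m_i(x_i) = \theta_i(x_i) + \min_{x_{i-1}} \bigl[\, \theta_{i-1,i}(x_{i-1}, x_i) + m_{i-1}(x_{i-1}) \,\bigr], \qquad
E^{\ast} = \min_{x_n} m_n(x_n)
```

Each message summarizes the cheapest way to label the entire prefix before a node, one value per label; the backward pass then recovers the minimizing arguments.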

9
Message-passing on chains
[figure: chain with nodes p, q, r, s]

10
Forward pass (dynamic programming)
[figure: chain with nodes p, q, r, s]

11–14
[figure: intermediate frames of the forward pass along the chain p, q, r, s]
15
Min-marginal for node s and label j:
μ_s(j) = min over all labelings x with x_s = j of the energy E(x)
[figure: chain with nodes p, q, r, s]

16
Backward pass: recover the optimal labels x_s, x_r, x_q, x_p in turn
[figure: chain with nodes p, q, r, s]

17
Message-passing on chains
How can I compute min-marginals for any node in the chain?
How to compute min-marginals for all nodes efficiently?
What is the running time of message-passing on chains?
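These questions can be answered with one forward and one backward message pass. A hedged sketch (variable names are my own): fwd[i] accumulates the cheapest prefix cost, bwd[i] the cheapest suffix cost, and their sum at node i is the min-marginal table, so all nodes together still cost only O(n·L²).

```python
import numpy as np

def chain_min_marginals(unary, pairwise):
    """Min-marginals for every node of a chain MRF:
    mu[i][j] = min over labelings x with x_i = j of the total energy.
    One forward plus one backward pass, O(n * L^2) overall.
    """
    n = len(unary)
    unary = [np.asarray(u, float) for u in unary]
    pairwise = [np.asarray(p, float) for p in pairwise]
    fwd = [unary[0]]                     # fwd[i] includes unary[i]
    for i in range(1, n):
        fwd.append(np.min(fwd[i - 1][:, None] + pairwise[i - 1], axis=0)
                   + unary[i])
    bwd = [np.zeros_like(unary[n - 1])]  # message into node i from the right
    for i in range(n - 2, -1, -1):
        bwd.insert(0, np.min(pairwise[i] + (unary[i + 1] + bwd[0])[None, :],
                             axis=1))
    return [fwd[i] + bwd[i] for i in range(n)]
```

Taking the minimum of any node's table recovers the same global minimum, which is a handy consistency check.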

18
Message-passing on trees
We can apply the same idea to tree-structured graphs: a slight generalization from chains.
The resulting algorithm is called belief propagation (also known under many other names, e.g., max-product, min-sum).
(For chains, it is also often called the Viterbi algorithm.)

19
Belief propagation (BP)

20
BP on a tree [Pearl '88]
Dynamic programming: global minimum in linear time
BP proceeds in two passes:
Inward pass (leaves to root, dynamic programming)
Outward pass (root to leaves)
Gives min-marginals
[figure: tree with nodes p, q, r; root and leaf marked]
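A sketch of both passes on a tree (my own encoding: an adjacency list plus pairwise tables keyed by edge; none of these names come from the slides). The inward pass sends each message m_{u→v}(x_v) = min over x_u of [θ_u(x_u) + θ_uv(x_u, x_v) + incoming messages except from v]; the outward pass sends the remaining direction of every edge.

```python
import numpy as np

def tree_bp(n, edges, unary, pairwise, root=0):
    """Min-sum belief propagation on a tree-structured MRF.
    pairwise[(u, v)][k][l]: cost of label k at u with label l at v.
    Returns the min-marginal vector of every node."""
    nbrs = {i: [] for i in range(n)}
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)
    # Order nodes root-first by DFS, so reversed(order) visits leaves first.
    order, parent, stack = [], {root: None}, [root]
    while stack:
        u = stack.pop()
        order.append(u)
        for v in nbrs[u]:
            if v != parent[u]:
                parent[v] = u
                stack.append(v)

    def pw(u, v):  # pairwise table oriented as (labels of u) x (labels of v)
        if (u, v) in pairwise:
            return np.asarray(pairwise[(u, v)], float)
        return np.asarray(pairwise[(v, u)], float).T

    msg = {}
    def send(u, v):
        # m_{u->v}(x_v) = min_{x_u} [theta_u(x_u) + theta_uv(x_u, x_v)
        #                            + sum of messages into u except from v]
        b = np.asarray(unary[u], float)
        for w in nbrs[u]:
            if w != v:
                b = b + msg[(w, u)]
        msg[(u, v)] = np.min(pw(u, v) + b[:, None], axis=0)

    for u in reversed(order):            # inward pass: leaves -> root
        if parent[u] is not None:
            send(u, parent[u])
    for u in order:                      # outward pass: root -> leaves
        for v in nbrs[u]:
            if v != parent[u]:
                send(u, v)
    # Min-marginal of a node = its unary potential + all incoming messages.
    return [np.asarray(unary[u], float) + sum(msg[(w, u)] for w in nbrs[u])
            for u in range(n)]
```

Every edge carries exactly two messages (one per direction), which is why the whole tree costs the same O(n·L²) as a chain.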

21
Inward pass (dynamic programming)
[figure: tree with nodes p, q, r]

22–27
[figure: intermediate frames of the inward pass on the tree]
28
Outward pass
[figure: tree with nodes p, q, r]

29
BP on a tree: min-marginals
Min-marginal for node q and label j:
μ_q(j) = min over all labelings x with x_q = j of the energy E(x)
[figure: tree with nodes p, q, r]

30
Belief propagation: message-passing on trees

31
min-marginals = sum of all messages + unary potential

32
What is the running time of message-passing for trees?

33
Message-passing on chains
Essentially, message passing on chains is dynamic programming.
Dynamic programming means reuse of computations.

34
Generalizing belief propagation
Key property (distributivity): min(a+b, a+c) = a + min(b, c)
BP can be generalized to any pair of operators satisfying the above property.
E.g., instead of (min, +), we could have:
(max, *): the resulting algorithm is called max-product. What does it compute?
(+, *): the resulting algorithm is called sum-product. What does it compute?
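A sketch of this operator swap on a chain (the function and its operator-pair framing are my own, not from the slides): the same forward pass is run with the (reduce, combine) pair passed in as parameters, and only distributivity of combine over reduce is used.

```python
import numpy as np

def chain_message_pass(unary, pairwise, reduce_op, combine_op):
    """Forward message pass on a chain, parameterized by an operator pair.

    combine_op must distribute over reduce_op:
      (np.min, np.add)      -> min-sum: the minimum energy
      (np.max, np.multiply) -> max-product: the maximum product of potentials
      (np.sum, np.multiply) -> sum-product: the partition function
    """
    m = np.asarray(unary[0], float)
    for i in range(1, len(unary)):
        # combine the running message with the next pairwise table,
        # then reduce over the previous node's label
        cand = combine_op(m[:, None], np.asarray(pairwise[i - 1], float))
        m = combine_op(reduce_op(cand, axis=0), np.asarray(unary[i], float))
    return float(reduce_op(m))
```

For sum-product the tables should hold potentials exp(-θ) rather than costs; the result is then the partition function. With all-ones tables it simply counts labelings (L^n), which makes a quick sanity check.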

35
Belief propagation as a distributive algorithm
BP works distributively (as a result, it can be parallelized).
Essentially, BP is a decentralized algorithm: global results through local exchange of information.
Simple example to illustrate this: counting soldiers.

36
Counting soldiers in a line
Can you think of a distributive algorithm for the commander to count his soldiers?
(From David MacKay's book "Information Theory, Inference, and Learning Algorithms")

37
Counting soldiers in a line

38
Counting soldiers in a tree
Can we do the same for this case?

39
Counting soldiers in a tree
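The soldiers' rule can be phrased in a few lines (a sketch with an assumed adjacency-dict encoding). It is exactly message passing with counts as messages: each soldier reports to one neighbour the number of soldiers on his own side of that link.

```python
def count_soldiers(adj, u, frm=None):
    """Report from soldier u toward neighbour frm: 1 (himself) plus the
    reports received from all his other neighbours.  Calling this at any
    soldier, summing over all neighbours, yields the total headcount.

    adj: adjacency dict of the tree of soldiers.
    """
    return 1 + sum(count_soldiers(adj, v, u) for v in adj[u] if v != frm)
```

The line of soldiers is the special case where every node has at most two neighbours; note that every soldier, not just the commander, ends up knowing the total.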

40
Counting soldiers
A simple example to illustrate BP.
The same idea can be used in cases which are seemingly more complex:
counting paths through a point in a grid
probability of passing through a node in the grid
In general, we have used the same idea for minimizing MRFs (a much more general problem).

41
Graphs with loops
How about counting these soldiers? Hmmm… overcounting?
