
1 A Trainable Graph Combination Scheme for Belief Propagation Kai Ju Liu New York University

2 Images

3 Pairwise Markov Random Field
Basic structure: vertices, edges
[figure: example graph with five vertices, labeled 1-5]

4 Pairwise Markov Random Field
Basic structure: vertices, edges
Vertex i has a set of possible states X_i and an observed value y_i
Compatibility between states and observed values: $\phi_i(x_i, y_i)$
Compatibility between neighboring vertices i and j: $\psi_{ij}(x_i, x_j)$
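
The slides do not give concrete forms for these compatibilities; a minimal Python sketch of one common choice (a Gaussian observation term and a Potts-style pairwise term, both illustrative assumptions, not the talk's actual parameters):

    import numpy as np

    # Hypothetical compatibilities for a binary-state MRF.
    # phi_i(x_i, y_i): how well state x_i explains observed value y_i.
    # psi_ij(x_i, x_j): how compatible two neighboring states are.
    STATE_MEANS = {0: 0.0, 1: 1.0}   # assumed per-state appearance means

    def phi(x_i, y_i, sigma=0.1):
        return np.exp(-(y_i - STATE_MEANS[x_i]) ** 2 / (2 * sigma ** 2))

    def psi(x_i, x_j, same=0.9, diff=0.1):
        return same if x_i == x_j else diff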

5 Pairwise MRF: Probabilities
Joint probability: $P(\{x\}) = \frac{1}{Z} \prod_i \phi_i(x_i, y_i) \prod_{(i,j)} \psi_{ij}(x_i, x_j)$
Marginal probability: $p_i(x_i) = \sum_{\{x\} \setminus x_i} P(\{x\})$
– Advantage: allows averaging over ambiguous states
– Disadvantage: complexity exponential in the number of vertices
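
To make the exponential cost concrete: computing one marginal by direct summation touches all $S^n$ joint configurations. A brute-force sketch on a small assumed graph (the edge list, state count, and compatibility tables are all illustrative):

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n, S = 5, 2                                  # 5 vertices, 2 states each
    edges = [(0, 1), (1, 2), (1, 3), (3, 4)]     # assumed neighbor structure
    phi = rng.random((n, S))                     # phi_i(x_i, y_i), tabulated
    psi = rng.random((S, S))                     # psi_ij(x_i, x_j), shared table

    def joint_unnorm(x):
        p = np.prod([phi[i, x[i]] for i in range(n)])
        for i, j in edges:
            p *= psi[x[i], x[j]]
        return p

    # Marginal p_1(x_1): sum the joint over all S**n configurations.
    configs = list(itertools.product(range(S), repeat=n))
    Z = sum(joint_unnorm(x) for x in configs)
    p1 = [sum(joint_unnorm(x) for x in configs if x[1] == s) / Z
          for s in range(S)]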

6 Belief Propagation
[figure: message passing on the five-vertex example graph]

7 Beliefs replace probabilities: $b_i(x_i) \propto \phi_i(x_i, y_i) \prod_{j \in N(i)} m_{ji}(x_i)$
Messages propagate information: $m_{ij}(x_j) = \sum_{x_i} \phi_i(x_i, y_i)\, \psi_{ij}(x_i, x_j) \prod_{k \in N(i) \setminus j} m_{ki}(x_i)$
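
A compact sketch of one synchronous round of these updates (the array layout, synchronous schedule, and normalization are my choices for illustration, not something the talk specifies):

    import numpy as np

    def bp_round(phi, psi, msgs, neighbors):
        """One synchronous update of all messages m_ij.

        phi: (n, S) array of phi_i(x_i, y_i) values
        psi: (S, S) shared pairwise table psi(x_i, x_j)
        msgs: dict mapping directed edge (i, j) to a length-S vector
        neighbors: dict mapping vertex i to its list of neighbors
        """
        new = {}
        for (i, j) in msgs:
            prod = phi[i].copy()
            for k in neighbors[i]:
                if k != j:
                    prod *= msgs[(k, i)]
            m = psi.T @ prod               # sum over x_i
            new[(i, j)] = m / m.sum()      # normalize for numerical stability
        return new

    def beliefs(phi, msgs):
        b = phi.copy()
        for (k, i) in msgs:                # fold every incoming message into i
            b[i] *= msgs[(k, i)]
        return b / b.sum(axis=1, keepdims=True)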

8 Belief Propagation Example
[figure: message passing worked through on the example graph]

9 BP: Questions
When can we calculate beliefs exactly?
When do beliefs equal probabilities?
When is belief propagation efficient?
Answer: Singly-Connected Graphs (SCGs)
– Graphs without loops
– Messages terminate at leaf vertices
– Beliefs equal probabilities
– Complexity in the previous example reduced from $13S^5$ to $24S^2$, where S is the number of states per vertex

10 BP on Loopy Graphs
Messages do not terminate
Energy approximation schemes [Freeman et al.]
– Standard belief propagation
– Generalized belief propagation
Standard belief propagation
– Approximates the Gibbs free energy of the system by the Bethe free energy
– Iterates, requiring convergence criteria
[figure: four-vertex loopy graph]

11 BP on Loopy Graphs
Tree-based reparameterization [Wainwright]
– Reparameterizes distributions on singly-connected graphs
– Improved convergence compared to standard belief propagation
– Permits calculation of bounds on approximation errors

12 BP-TwoGraphs
– Eliminates iteration
– Utilizes the advantages of SCGs

13 BP-TwoGraphs
– Consider a loopy graph with n vertices
– Select two sets of SCGs that approximate the graph
– Calculate beliefs on each set of SCGs
– Select the set of beliefs with minimum entropy (see the sketch below)
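
The combination step lends itself to a short sketch. The slide says to select the beliefs with minimum entropy; whether the selection happens per vertex or per graph is not stated, so the per-vertex reading below, and the array shapes, are my assumptions:

    import numpy as np

    def combine_min_entropy(b_H, b_G):
        """b_H, b_G: (n, S) per-vertex beliefs from the two SCG sets."""
        def entropy(b):
            return -(b * np.log(b + 1e-12)).sum(axis=1)
        keep_H = entropy(b_H) <= entropy(b_G)      # lower entropy wins
        return np.where(keep_H[:, None], b_H, b_G)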

14 BP-TwoGraphs on Images
Rectangular grid of pixel vertices
– H_i: horizontal graphs
– G_i: vertical graphs
[figures: horizontal graph, vertical graph, original graph]

15 Image Segmentation
[figures: input image → add noise → segment]

16 Image Segmentation Results

17 Image Segmentation Revisited
[figures: noisy input, ground truth, and max-flow result vs. ground truth]

18 Image Segmentation: Horizontal Graph Analysis

19 Image Segmentation: Vertical Graph Analysis

20 BP-TwoLines
Rectangular grid of pixel vertices
– H_i: horizontal lines
– G_i: vertical lines
[figures: horizontal line, vertical line, original graph]
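
Each horizontal or vertical line of pixels is a chain, i.e. trivially singly connected, so building the two SCG families is simple. A sketch for an h-by-w grid, assuming row-major pixel indexing (an assumption of mine):

    # Edge lists for BP-TwoLines on an h-by-w pixel grid.
    def row_chains(h, w):
        return [[(r * w + c, r * w + c + 1) for c in range(w - 1)]
                for r in range(h)]

    def col_chains(h, w):
        return [[(r * w + c, (r + 1) * w + c) for r in range(h - 1)]
                for c in range(w)]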

21 Image Segmentation Results II

22 Image Segmentation Results III

23 Natural Image Segmentation

24 Boundary-Based Image Segmentation: Window Vertices
Square 2-by-2 window of pixels
Each pixel has two states:
– foreground
– background
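
If a window vertex's state is the joint labeling of its four pixels (my reading of the slide, not stated explicitly), its state space has $2^4 = 16$ elements:

    import itertools

    # All joint labelings of a 2-by-2 window: 0 = background, 1 = foreground.
    window_states = list(itertools.product((0, 1), repeat=4))
    assert len(window_states) == 16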

25 Boundary-Based Image Segmentation: Overlap

26 Boundary-Based Image Segmentation: Graph

27 Real Image Segmentation: Training

28 Real Image Segmentation: Results

29 Real Image Segmentation: Gorilla Results

30 Conclusion
BP-TwoGraphs
– Accurate and efficient
– Extensive use of beliefs
– Trainable parameters
Future work
– Multiple states
– Stereo
– Image fusion

