
Bayesian Belief Propagation


1 Bayesian Belief Propagation
Reading Group

2 Overview
Problem Background
Bayesian Modelling
Markov Random Fields
Examine use of Bayesian Belief Propagation (BBP) in three low level vision applications:
Contour Motion Estimation
Dense Depth Estimation
Unwrapping Phase Images
Convergence Issues
Conclusions

3 Problem Background
A problem of probabilistic inference: estimate unknown variables given observed data.
For low level vision: estimate unknown scene properties (e.g. depth) from image properties (e.g. intensity gradients).

4 Bayesian models in low level vision
A statistical description of an estimation problem. Given data d, we want to estimate unknown parameters u.
Two components:
Prior Model p(u) – Captures known information about the unknown data and is independent of the observed data. A distribution over probable solutions.
Sensor Model p(d|u) – Describes the relationship between the sensed measurements d and the unknown hidden data u.
Combine the two using Bayes' Rule to give the posterior.
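The posterior itself is not reproduced in the transcript (it was presumably an equation image on the slide); combining the two components with Bayes' Rule gives the standard form:

```latex
p(u \mid d) \;=\; \frac{p(d \mid u)\,p(u)}{p(d)} \;\propto\; p(d \mid u)\,p(u)
```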

5 Markov Random Fields
Pairwise Markov Random Field: a model commonly used to represent images.
[Figure: grid of hidden scene nodes (u) connected to image data nodes (d); the u–d links carry the sensor model, and the u–u links within a neighbourhood Ni carry the prior model.]
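For a pairwise MRF of this kind, the joint distribution factorizes into pairwise prior terms on neighbouring hidden nodes and sensor terms linking each hidden node to its data node. The potential names Ψ and Φ below are the conventional ones, not taken from the slide:

```latex
p(u, d) \;=\; \frac{1}{Z}\;\prod_{(i,j):\, j \in N_i} \Psi_{ij}(u_i, u_j)\;\prod_{i} \Phi_i(u_i, d_i)
```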

6 Contour Motion Estimation
Yair Weiss

7 Contour Motion Estimation
Estimate the motion of a contour using only local information.
Less computationally intensive than optical flow.
Application example: object tracking.
Difficult due to the aperture problem.

8 Contour Motion Estimation
[Figure: illustration of the aperture problem, comparing the actual motion of the contour with the ideal estimate.]

9 Contour Motion Estimation
Sensor model: the brightness constancy constraint equation at each contour point i, where Ii = I(xi, yi, t).
Prior model: ui+1 = ui + n, where n ~ N(0, σp).
[Figure: 1D chain of hidden velocity nodes ui-2 … ui+2 with observed data nodes di-2 … di+2.]
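The two equations on this slide appear to have been images; as a hedged reconstruction, the standard brightness constancy constraint and the stated Gaussian prior are:

```latex
% Brightness constancy constraint at contour point i (standard form),
% with I_i = I(x_i, y_i, t) and u_i the local velocity:
\nabla I_i \cdot u_i + \frac{\partial I_i}{\partial t} = 0
% Gaussian prior linking neighbouring velocities along the contour:
u_{i+1} = u_i + n, \qquad n \sim \mathcal{N}(0, \sigma_p^2)
```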

10 1D Belief Propagation
Messages are passed between neighbouring hidden nodes ui along the chain; each hidden node also receives evidence from its data node di.
Iterate until the message values converge (a sketch follows below).
[Figure: 1D chain ui-2 … ui+2 with data nodes di-2 … di+2 and message arrows between neighbouring hidden nodes.]
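A minimal sketch of how the 1D message passing could be implemented for a scalar chain model. This assumes a simplified observation model d_i = u_i + noise rather than the brightness constancy constraint, and all names and parameters are illustrative, not Weiss's code:

```python
import numpy as np

def gaussian_bp_chain(d, sigma_obs=1.0, sigma_prior=1.0, n_iters=5):
    """Gaussian belief propagation on a 1D chain (scalar sketch).

    Observation model:  d_i = u_i + noise,   noise ~ N(0, sigma_obs^2)
    Prior model:        u_{i+1} = u_i + n,   n     ~ N(0, sigma_prior^2)

    Messages are Gaussians stored as (precision, precision * mean).
    On a chain one forward/backward sweep is already exact; the outer loop
    mirrors the slide's "iterate until message values converge".
    Returns the posterior mean and variance at every node.
    """
    d = np.asarray(d, dtype=float)
    n = len(d)
    lam_obs = 1.0 / sigma_obs**2

    right_prec = np.zeros(n)   # precision of the message arriving from the left
    right_pm = np.zeros(n)     # precision * mean of that message
    left_prec = np.zeros(n)    # same, for the message arriving from the right
    left_pm = np.zeros(n)

    for _ in range(n_iters):
        # Forward sweep: message from node i into node i+1.
        for i in range(n - 1):
            prec = lam_obs + right_prec[i]            # local evidence + incoming msg
            mean = (lam_obs * d[i] + right_pm[i]) / prec
            var_out = 1.0 / prec + sigma_prior**2     # pass through the prior link
            right_prec[i + 1] = 1.0 / var_out
            right_pm[i + 1] = right_prec[i + 1] * mean
        # Backward sweep: message from node i into node i-1.
        for i in range(n - 1, 0, -1):
            prec = lam_obs + left_prec[i]
            mean = (lam_obs * d[i] + left_pm[i]) / prec
            var_out = 1.0 / prec + sigma_prior**2
            left_prec[i - 1] = 1.0 / var_out
            left_pm[i - 1] = left_prec[i - 1] * mean

    # Belief: product of local evidence and both incoming messages.
    post_prec = lam_obs + right_prec + left_prec
    post_mean = (lam_obs * d + right_pm + left_pm) / post_prec
    return post_mean, 1.0 / post_prec

if __name__ == "__main__":
    true_u = 2.0
    d = true_u + 0.5 * np.random.randn(20)   # noisy observations of one velocity
    mean, var = gaussian_bp_chain(d, sigma_obs=0.5, sigma_prior=0.1)
    print(mean.round(2))                     # all nodes converge near true_u
```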

11 Results: Contour motion estimation [Weiss]
Faster and more accurate than pre-existing methods such as relaxation.
Results after iteration n are optimal given all data within a distance of n nodes.
Due to the nature of the problem, all velocity components should, and do, converge to the same value.
It would be interesting to try the algorithm on problems where this is not the case:
Multiple motions within the same contour
Rotating contours (requires a new prior model)
Only one-dimensional problems are tackled, but extensions to 2D are discussed.
The algorithm is also used to solve the Direction Of Figure (DOF) problem using convexity (not discussed here).

12 Dense Depth Estimation
Richard Szeliski

13 Depth Estimation
Assume smooth variation in disparity.
Disparity: ui = 1 / Zi, where Zi is the depth at pixel i.
Define the prior using a Gibbs distribution, where Ep(u) is an energy functional encoding the smoothness assumption.
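The Gibbs-distribution prior was an image on the slide; its generic form, with an illustrative first-order smoothness energy over the neighbourhoods Ni (the weights w_ij are an assumption), is:

```latex
p(u) \;=\; \frac{1}{Z_p}\,\exp\!\big(-E_p(u)\big),
\qquad
E_p(u) \;=\; \frac{1}{2}\sum_{i}\sum_{j \in N_i} w_{ij}\,\big(u_i - u_j\big)^2
```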

14 Depth Estimation
Sensor model: the measurements di are related to a correlation metric computed across the image sequence, where H is a measurement matrix and Es(u) is an energy functional.
[Figure: pixel i tracked through images T = 0, 1, …, t, t+1, t+2, t+3.]
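The sensor energy Es(u) was likewise an equation image; a hedged reconstruction of its usual quadratic form, with an assumed measurement noise covariance R, is:

```latex
p(d \mid u) \;\propto\; \exp\!\big(-E_s(u)\big),
\qquad
E_s(u) \;=\; \frac{1}{2}\,(d - Hu)^{\top} R^{-1}\,(d - Hu)
```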

15 Depth Estimation
Posterior: p(u|d) ∝ exp(-E(u)), where E(u) = Ep(u) + Es(u) is the overall energy.
E(u) is quadratic in u, so it is minimized when u = A-1b.
The matrix A-1 is large and expensive to compute directly.
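Writing the combined energy in its generic quadratic form (the exact definitions of A and b were on the slide image) makes the minimizer explicit:

```latex
E(u) \;=\; E_p(u) + E_s(u) \;=\; \tfrac{1}{2}\,u^{\top} A u \;-\; u^{\top} b \;+\; c
\;\;\Longrightarrow\;\;
\nabla E(u) = A u - b = 0
\;\;\Longrightarrow\;\;
u^{*} = A^{-1} b
```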

16 Gauss-Seidel Relaxation
Minimize the energy locally for each node ui, keeping all other nodes fixed. For the quadratic energy this leads to the update rule ui ← (bi − Σj≠i Aij uj) / Aii.
This is also the estimated mean of the marginal probability distribution p(ui|d) given by Gibbs sampling.
The same scheme applies to the 1-D example given by Weiss (a generic sketch of the sweep follows below).
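A minimal sketch of a Gauss-Seidel sweep for a quadratic energy E(u) = ½uᵀAu − uᵀb (a generic implementation, not Szeliski's); each update minimizes the energy in ui with the other nodes held fixed:

```python
import numpy as np

def gauss_seidel(A, b, n_sweeps=100, tol=1e-8):
    """Minimize E(u) = 0.5 u^T A u - u^T b by cyclic coordinate descent.
    Each node u_i is updated with all other nodes held fixed, which for a
    quadratic energy gives the classic Gauss-Seidel rule."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    u = np.zeros_like(b)
    for _ in range(n_sweeps):
        max_change = 0.0
        for i in range(len(b)):
            sigma = A[i] @ u - A[i, i] * u[i]   # contribution of the other nodes
            new_ui = (b[i] - sigma) / A[i, i]
            max_change = max(max_change, abs(new_ui - u[i]))
            u[i] = new_ui
        if max_change < tol:
            break
    return u

if __name__ == "__main__":
    # Tiny 1-D chain: data term + first-difference smoothness term.
    n, lam = 5, 2.0
    d = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    D = np.diff(np.eye(n), axis=0)              # first-difference operator
    A = np.eye(n) + lam * D.T @ D               # A from E(u) = ||u-d||^2/2 + lam*||Du||^2/2
    b = d.copy()
    u = gauss_seidel(A, b)
    print(u.round(3))                           # smoothed version of d
    print(np.linalg.solve(A, b).round(3))       # matches the direct solution u = A^-1 b
```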

17 Results: Dense depth estimation [Szeliski]
Dense (per-pixel) depth estimation from a sequence of images with known camera motion.
Adapted Kalman Filter: estimates of depth from time t-1 are used to improve the estimates at time t.
Uses a multi-resolution technique (image pyramid) to improve convergence times.
Uses Gibbs sampling to sample the posterior (a stochastic analogue of Gauss-Seidel relaxation); not guaranteed to converge.
The problem can be reformulated to use message passing.
Does not account for loops in the network; only recently has belief propagation in networks with loops been fully understood [Yedidia et al].

18 Unwrapping Phase Images
Brendan Frey et al

19 Unwrapping Phase Images
Wrapped phase images are produced by devices such as MRI and radar.
Unwrapping involves finding the shift values between neighbouring points.
Unwrapping is simple in one dimension: there is only one path through the data, and the local gradient can be used to estimate the shift (a minimal 1-D sketch follows below).
For 2D images the problem is much harder (NP-hard): there are many paths through the data, and the shifts along all paths must be consistent.
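A minimal sketch of the simple 1-D case described above, using the local wrapped gradient to estimate the shifts. This is illustrative only (it is not Frey et al.'s belief-propagation method, and the function names are mine):

```python
import numpy as np

def unwrap_1d(phi):
    """1-D phase unwrapping: shift each wrapped phase difference by a
    multiple of 2*pi so it lies in (-pi, pi], then re-integrate."""
    phi = np.asarray(phi, dtype=float)
    d = np.diff(phi)                          # local gradient of the wrapped phase
    shifts = -np.round(d / (2 * np.pi))       # integer shift chosen per step
    d_unwrapped = d + 2 * np.pi * shifts      # estimate of the true gradient
    return np.concatenate(([phi[0]], phi[0] + np.cumsum(d_unwrapped)))

if __name__ == "__main__":
    true_phase = np.linspace(0, 6 * np.pi, 50)          # steadily increasing phase
    wrapped = np.angle(np.exp(1j * true_phase))         # wrapped into (-pi, pi]
    recovered = unwrap_1d(wrapped)
    print(np.allclose(recovered, true_phase))           # True
    print(np.allclose(recovered, np.unwrap(wrapped)))   # matches numpy's unwrap
```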

20 Zero-Curl Constraint
The shifts around every elementary loop of data points must be consistent: they must sum to zero around the loop, i.e. the shift field has zero curl. Constraint nodes attached to the shift nodes enforce this.
[Figure: data points around (x,y) with shift nodes a(x,y), a(x,y+1), b(x,y), b(x+1,y) and a constraint node at the centre of the loop.]

21 Sensor Data
Estimate the relative shift values (variables a and b, each taking a value in {-1, 0, 1}) between neighbouring data points.
Use the local image gradient, estimated from the wrapped image, as the sensor input.
A Gaussian sensor model links the sensor nodes (observed wrapped gradients) to the hidden shift nodes.
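The Gaussian sensor model was shown as an equation image; one common way to write it, assuming the phase is scaled so that shifts are integers, with Δ(x,y) the observed wrapped gradient between two neighbouring points and σ an assumed noise parameter, scores each candidate shift by how small the implied true gradient would be:

```latex
p\big(a(x,y) \mid \Delta(x,y)\big) \;\propto\;
\exp\!\left(-\frac{\big(\Delta(x,y) + a(x,y)\big)^{2}}{2\sigma^{2}}\right),
\qquad a(x,y) \in \{-1, 0, 1\}
```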

22 Belief Propagation
[Figure: a shift node a(x,y) attached to a data point and to constraint nodes; the incoming messages m1–m5 are combined into a belief over the candidate shifts {-1, 0, 1}, e.g. a belief of 1.0 on a single value once the messages agree.]
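A sketch of how the incoming messages at a shift node could be combined into a belief over {-1, 0, 1}. This is generic discrete-BP bookkeeping; the message names m1–m5 follow the figure, and all numbers are illustrative:

```python
import numpy as np

VALUES = (-1, 0, 1)   # candidate shifts at a hidden shift node

def belief(local_evidence, incoming_messages):
    """Combine the local (sensor) evidence with all incoming messages.
    Each argument is an array of 3 non-negative numbers indexed by VALUES.
    The belief is their elementwise product, renormalised to sum to 1."""
    b = np.asarray(local_evidence, dtype=float).copy()
    for m in incoming_messages:
        b *= np.asarray(m, dtype=float)
    return b / b.sum()

if __name__ == "__main__":
    evidence = np.array([0.1, 0.8, 0.1])     # sensor model favours shift 0
    m1 = np.array([0.2, 0.3, 0.5])           # messages from constraint nodes
    m2 = np.array([0.3, 0.4, 0.3])
    m3 = np.array([0.3, 0.3, 0.4])
    m4 = np.array([0.2, 0.5, 0.3])
    m5 = np.array([0.4, 0.3, 0.3])
    b = belief(evidence, [m1, m2, m3, m4, m5])
    print(dict(zip(VALUES, b.round(3))))     # posterior belief over {-1, 0, 1}
```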

23 Results: Unwrapping phase images [Frey et al.]
Initialize the messages to a uniform distribution and iterate to convergence.
Estimates a solution to an NP-hard problem in time linear in the number of nodes, O(n).
Reduction in reconstruction error over relaxation methods.
Does not account for loops in the network: messages can cycle, leading to incorrect belief estimates.
Not guaranteed to converge.

24 Convergence
Convergence is only guaranteed when the network is a tree structure and all data is available.
In networks with loops, messages can cycle, resulting in incorrect belief estimates.
Multi-resolution methods such as image pyramids can be used to speed up convergence (and improve results).

25 Conclusion
BBP is used to infer the marginal posterior distribution of hidden information from observable data.
The message-passing scheme is efficient: linear in the number of nodes, as opposed to exponential.
Local information is propagated globally to achieve more reliable estimates.
Useful for low level vision applications:
Contour Motion Estimation [Weiss]
Dense Depth Estimation [Szeliski]
Unwrapping Phase Images [Frey et al]
Improved results over standard relaxation algorithms.
Can be used in conjunction with a multi-resolution framework to improve convergence times.
Loops must be accounted for to prevent cycling of messages [Yedidia et al].

