
1 Bayesian Belief Propagation Reading Group

2 Overview
Problem Background
Bayesian Modelling
Markov Random Fields
Examine the use of Bayesian Belief Propagation (BBP) in three low-level vision applications:
Contour Motion Estimation
Dense Depth Estimation
Unwrapping Phase Images
Convergence Issues
Conclusions

3 Problem Background
A problem of probabilistic inference: estimate unknown variables given observed data. For low-level vision: estimate unknown scene properties (e.g. depth) from image properties (e.g. intensity gradients).

4 Bayesian models in low-level vision
A statistical description of an estimation problem: given data d, we want to estimate unknown parameters u. Two components:
Prior model p(u) – captures known information about the unknown data and is independent of the observed data; a distribution over probable solutions.
Sensor model p(d|u) – describes the relationship between the sensed measurements d and the unknown hidden data u.
Combine using Bayes' rule to give the posterior: p(u|d) ∝ p(d|u) p(u).
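The combination step is just a pointwise product followed by normalization. The toy sketch below (all distributions and values are illustrative, not taken from the slides) fuses a Gaussian prior over a scalar depth with a Gaussian sensor model via Bayes' rule:

```python
import numpy as np

# Hypothetical discrete example: estimate a scalar depth u from one noisy
# measurement d, combining a prior p(u) with a sensor model p(d|u).
u_values = np.linspace(0.0, 10.0, 101)       # candidate depths

# Prior: depths near 5 are assumed more probable (illustrative choice).
prior = np.exp(-0.5 * ((u_values - 5.0) / 2.0) ** 2)
prior /= prior.sum()

# Sensor model: measurement d is u plus Gaussian noise (sigma = 1).
d = 6.2
likelihood = np.exp(-0.5 * ((d - u_values) / 1.0) ** 2)

# Bayes' rule: posterior proportional to likelihood times prior.
posterior = likelihood * prior
posterior /= posterior.sum()

u_map = u_values[np.argmax(posterior)]
print(f"MAP estimate: {u_map:.2f}")  # lies between the prior mean (5.0) and d (6.2)
```

The posterior mode is pulled between the prior mean and the measurement, weighted by the two variances.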

5 Markov Random Fields
Pairwise Markov Random Field: a model commonly used to represent images.
(Figure: grid of image data nodes d and hidden scene nodes u; the sensor model links each d_i to u_i, and the prior model links each u_i to the nodes in its neighborhood N_i.)

6 Contour Motion Estimation (Yair Weiss)

7 Contour Motion Estimation
Estimate the motion of a contour using only local information. Less computationally intensive than optical flow methods. Application example: object tracking. Difficult due to the aperture problem.

8 Contour Motion Estimation
(Figure: the aperture problem – ideal vs. actual local motion estimates.)

9 Contour Motion Estimation
(Figure: chain of hidden motion nodes u_{i-2} … u_{i+2} with attached data nodes d_{i-2} … d_{i+2}.)
Prior model: u_{i+1} = u_i + η, where η ~ N(0, σ_p).
Sensor model: the Brightness Constancy Constraint Equation, where I_i = I(x_i, y_i, t).

10 1D Belief Propagation
(Figure: messages passed in both directions along the chain of hidden nodes u_i and data nodes d_i.)
Iterate until message values converge.
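A minimal sketch of this message passing on a chain, assuming a Gaussian random-walk prior u_{i+1} = u_i + N(0, σ_p) and a Gaussian observation model d_i = u_i + N(0, σ_d); all names and parameter values are illustrative, not from the original talk:

```python
import numpy as np

def gaussian_bp_chain(d, sigma_p=0.5, sigma_d=1.0):
    """Exact marginal means on a Gaussian chain via forward/backward messages."""
    n = len(d)
    vp, vd = sigma_p ** 2, sigma_d ** 2

    def combine(m1, v1, m2, v2):
        # Product of two Gaussians: precision-weighted average.
        v = 1.0 / (1.0 / v1 + 1.0 / v2)
        return v * (m1 / v1 + m2 / v2), v

    # Forward messages: information from d_0..d_{i-1} passed rightward.
    fwd = [(0.0, np.inf)] * n
    for i in range(1, n):
        m, v = combine(fwd[i - 1][0], fwd[i - 1][1], d[i - 1], vd)
        fwd[i] = (m, v + vp)          # propagate through the random walk
    # Backward messages: information from d_{i+1}..d_{n-1} passed leftward.
    bwd = [(0.0, np.inf)] * n
    for i in range(n - 2, -1, -1):
        m, v = combine(bwd[i + 1][0], bwd[i + 1][1], d[i + 1], vd)
        bwd[i] = (m, v + vp)

    # Belief at node i: product of both messages and the local evidence.
    beliefs = []
    for i in range(n):
        m, v = combine(fwd[i][0], fwd[i][1], d[i], vd)
        m, v = combine(m, v, bwd[i][0], bwd[i][1])
        beliefs.append(m)
    return np.array(beliefs)

d = np.array([1.0, 1.2, 0.9, 1.1, 1.0])
print(gaussian_bp_chain(d))  # smoothed estimates pulled toward their neighbors
```

On a loop-free chain like this, one forward and one backward sweep already give the exact marginals; the "iterate until convergence" form matters once the graph has loops.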

11 Results: Contour motion estimation [Weiss]
Faster and more accurate than pre-existing methods such as relaxation.
Results after iteration n are optimal given all data within a distance of n nodes.
Due to the nature of the problem, all velocity components should, and do, converge to the same value. It would be interesting to try the algorithm on problems where this is not the case:
Multiple motions within the same contour
Rotating contours (requires a new prior model)
Only one-dimensional problems are tackled, but extensions to 2D are discussed. The algorithm is also applied to the Direction Of Figure (DOF) problem using convexity (not discussed here).

12 Dense Depth Estimation (Richard Szeliski)

13 Depth Estimation
Depth Z_i; disparity u_i = 1 / Z_i. Assume smooth variation in disparity. Define the prior using a Gibbs distribution, p(u) ∝ exp(-E_p(u)), where E_p(u) is an energy functional.

14 Depth Estimation
(Figure: sequence of images at times t = 0 … t+3; the disparity measurement d_i is related to a correlation metric across the sequence.)
The sensor model is defined analogously, p(d|u) ∝ exp(-E_s(u)), where H is a measurement matrix and E_s(u) is an energy functional.

15 Depth Estimation
E(u) is the overall energy: E(u) = E_p(u) + E_s(u). Both terms are quadratic, so E(u) = (1/2) u^T A u - u^T b + c, which is minimized when u = A^-1 b.
Posterior: p(u|d) ∝ exp(-E(u)), a Gaussian with mean A^-1 b.
Matrix A^-1 is large and expensive to compute.

16 Gauss-Seidel Relaxation
Minimize the energy locally for each node u_i, keeping all other nodes fixed. This leads to the update rule u_i ← (b_i - Σ_{j≠i} A_ij u_j) / A_ii.
This is also the estimated mean of the marginal probability distribution p(u_i|d) given by Gibbs sampling.
The update rule can be written out explicitly for the 1-D example given by Weiss.
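The local update can be sketched on a small 1D chain. The toy example below (problem size, weights, and the smoothness term are illustrative, not from the slides) builds a quadratic energy from a data term plus a first-difference smoothness term and minimizes it by Gauss-Seidel sweeps:

```python
import numpy as np

# Quadratic energy E(u) = 0.5 * u^T A u - b^T u, assembled from a
# unit-weight data term |u - d|^2 and a smoothness term
# 0.5 * lam * sum_i (u_{i+1} - u_i)^2. Values are illustrative.

def build_system(d, lam=1.0):
    n = len(d)
    A = np.eye(n)                     # data term contributes the identity
    for i in range(n - 1):            # smoothness term: graph Laplacian
        A[i, i] += lam; A[i + 1, i + 1] += lam
        A[i, i + 1] -= lam; A[i + 1, i] -= lam
    return A, d.copy()                # b = d for a unit-weight data term

def gauss_seidel(A, b, iters=200):
    u = np.zeros_like(b)
    for _ in range(iters):
        for i in range(len(b)):
            # Minimize E over u_i with all other nodes held fixed:
            # u_i = (b_i - sum_{j != i} A_ij u_j) / A_ii
            u[i] = (b[i] - A[i].dot(u) + A[i, i] * u[i]) / A[i, i]
    return u

d = np.array([0.0, 2.0, 0.0, 2.0, 0.0])
A, b = build_system(d, lam=1.0)
u = gauss_seidel(A, b)
print(u)
print(np.linalg.solve(A, b))  # the direct solution the sweeps converge to
```

For a small system the direct solve is trivial; the point of the iterative update is that it never forms A^-1, which is what makes it usable when A is large.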

17 Results: Dense depth estimation [Szeliski]
Dense (per-pixel) depth estimation from a sequence of images with known camera motion.
Adapted Kalman filter: estimates of depth from time t-1 are used to improve estimates at time t.
Uses a multi-resolution technique (image pyramid) to improve convergence times.
Uses Gibbs sampling to sample the posterior – stochastic Gauss-Seidel relaxation.
Not guaranteed to converge. The problem can be reformulated to use message passing.
Does not account for loops in the network; only recently has belief propagation in networks with loops been fully understood [Yedidia et al].

18 Unwrapping Phase Images (Brendan Frey et al)

19 Unwrapping Phase Images
Wrapped phase images are produced by devices such as MRI and radar. Unwrapping involves finding the shift values between each point.
Unwrapping is simple in one dimension: there is one path through the data, and the local gradient is used to estimate the shift.
For 2D images the problem is more difficult (NP-hard): there are many paths through the data, and the shifts along all paths must be consistent.
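The easy 1D case can be sketched directly: walk along the single path and, whenever the local gradient jumps by more than π, assume a wrap occurred and undo it with a 2π shift (this is essentially what NumPy's `np.unwrap` does; the example data is made up):

```python
import numpy as np

def unwrap_1d(wrapped):
    """Unwrap a 1D phase signal by keeping each local step in (-pi, pi]."""
    out = np.array(wrapped, dtype=float)
    for i in range(1, len(out)):
        diff = out[i] - out[i - 1]
        # Shift by the multiple of 2*pi that brings the step back into range.
        out[i] -= 2 * np.pi * np.round(diff / (2 * np.pi))
    return out

true_phase = np.linspace(0, 4 * np.pi, 50)               # smooth ramp
wrapped = np.mod(true_phase + np.pi, 2 * np.pi) - np.pi  # wrap into (-pi, pi]
recovered = unwrap_1d(wrapped)
print(np.allclose(recovered, true_phase))
```

This relies on the true phase varying slowly relative to the sampling; the 2D problem is hard precisely because this greedy path-following must be consistent across every possible path.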

20 Zero-Curl Constraint
(Figure: grid cell with data points (x,y), (x+1,y), (x,y+1), (x+1,y+1); shift nodes a(x,y), b(x,y), b(x+1,y), a(x,y+1) on its edges; and a constraint node in the centre.)
The shifts around each elementary loop must sum to zero: a(x,y) + b(x+1,y) - a(x,y+1) - b(x,y) = 0.

21 Sensor Data
Estimate the relative shift values (variables a and b, each taking a value in {-1, 0, 1}) between each pair of data points.
Use the local image gradient as sensor input: the sensor nodes observe wrapped gradients, and the hidden shift nodes carry a and b.
A Gaussian sensor model is used, with its parameters estimated from the wrapped image.

22 Belief Propagation
(Figure: messages m_1 … m_5 passed between shift nodes and constraint nodes; the belief at shift node a(x,y) combines the incoming messages.)
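For discrete states, the belief computation at a shift node is just a normalized product of the local sensor evidence and the incoming messages. A sketch with made-up message values (the state set {-1, 0, 1} is from the slides; everything else is illustrative):

```python
import numpy as np

# Sum-product belief at a shift node with states {-1, 0, +1}.
states = np.array([-1, 0, 1])

# Gaussian sensor evidence for an observed local gradient of 0.8,
# with an assumed noise sigma of 0.5 (both values illustrative).
sensor = np.exp(-0.5 * ((0.8 - states) / 0.5) ** 2)

# Incoming messages from neighboring constraint nodes (made up here;
# in the algorithm these are themselves updated iteratively).
m1 = np.array([0.2, 0.5, 0.3])
m2 = np.array([0.1, 0.3, 0.6])

# Belief: normalized product of evidence and all incoming messages.
belief = sensor * m1 * m2
belief /= belief.sum()
print(dict(zip(states.tolist(), np.round(belief, 3))))
```

Here the evidence and messages both favor a shift of +1, so the belief concentrates there; each outgoing message is computed the same way but omits the message arriving on the edge it is sent along.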

23 Results: Unwrapping phase images [Frey et al.]
Initialize messages to a uniform distribution and iterate to convergence.
Estimates a solution to an NP-hard problem in time linear in the number of nodes.
Reduced reconstruction error compared with relaxation methods.
Does not account for loops in the network; messages could cycle, leading to incorrect belief estimates.
Not guaranteed to converge.

24 Convergence
Convergence is only guaranteed when the network is a tree structure and all data is available.
In networks with loops, messages can cycle, resulting in incorrect belief estimates.
Multi-resolution methods such as image pyramids can be used to speed up convergence (and improve results).

25 Conclusion
BBP is used to infer the marginal posterior distribution of hidden information from observable data.
The efficient message-passing scheme is linear in the number of nodes, as opposed to exponential.
Propagates local information globally to achieve more reliable estimates.
Useful for low-level vision applications:
Contour Motion Estimation [Weiss]
Dense Depth Estimation [Szeliski]
Unwrapping Phase Images [Frey et al]
Improved results over standard relaxation algorithms.
Can be used in conjunction with a multi-resolution framework to improve convergence times.
Need to account for loops to prevent cycling of messages [Yedidia et al].

