Probabilistic image processing and Bayesian network

1 Probabilistic image processing and Bayesian network
Kazuyuki Tanaka, Graduate School of Information Sciences, Tohoku University. References: K. Tanaka: Statistical-mechanical approach to image processing (Topical Review), J. Phys. A, vol.35, pp.R81-R150 (2002). K. Tanaka, H. Shouno, M. Okada and D. M. Titterington: Accuracy of the Bethe approximation for hyperparameter estimation in probabilistic image processing, J. Phys. A, vol.37 (2004). 8 November, 2005 CISJ2005

2 Bayesian Network and Belief Propagation
Bayes Formula Probabilistic Model Probabilistic Information Processing Belief Propagation
J. Pearl: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference (Morgan Kaufmann, 1988). C. Berrou and A. Glavieux: Near optimum error correcting coding and decoding: Turbo-codes, IEEE Trans. Comm., 44 (1996).

3 Contents
Introduction
Belief Propagation
Bayesian Image Analysis and Gaussian Graphical Model
Concluding Remarks

4 Belief Propagation
How should we treat the calculation of the summation over the 2^N configurations? It is very hard to calculate exactly except in some special cases. This motivates the formulation of an approximate algorithm, and the question of the accuracy of that approximate algorithm.

5 Tractable Models
Probabilistic models with no loops are tractable: they are factorizable. Probabilistic models with loops are not tractable: they are not factorizable.

6 Probabilistic model on a graph with no loop
(Figure: nodes 1-6 on a tree.) Marginal probability of node 2.

7 Probabilistic model on a graph with no loop
(Figure: nodes 1-6 on a tree.) The marginal probability of node 2 can be expressed as the product of the messages from all the neighbouring nodes of node 2. The message from node 1 to node 2 can be expressed as the product of the messages from all the neighbouring nodes of node 1 except node 2.
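The message-passing scheme on a loop-free graph described above can be sketched in code. The following is a minimal illustration, not from the original slides, using arbitrary binary potentials PHI (unary) and PSI (pairwise) on a small chain; the belief-propagation marginal is compared against the brute-force summation over all 2^N configurations, which it must reproduce exactly on a tree:

```python
from itertools import product

# Arbitrary illustrative potentials: PHI[x] is the unary potential at every
# node, PSI[x][y] is the pairwise potential (here favouring equal neighbours).
PHI = [1.0, 2.0]
PSI = [[2.0, 1.0], [1.0, 2.0]]

def chain_marginal_bp(n, node):
    """Marginal P(x_node) on an n-node chain via message passing."""
    # Forward messages: m_fwd[i][x] is the message arriving at node i
    # from node i-1; the leftmost node receives the trivial message.
    m_fwd = [[1.0, 1.0] for _ in range(n)]
    for i in range(1, n):
        for x in (0, 1):
            m_fwd[i][x] = sum(PHI[y] * m_fwd[i - 1][y] * PSI[y][x] for y in (0, 1))
    # Backward messages: m_bwd[i][x] arrives at node i from node i+1.
    m_bwd = [[1.0, 1.0] for _ in range(n)]
    for i in range(n - 2, -1, -1):
        for x in (0, 1):
            m_bwd[i][x] = sum(PHI[y] * m_bwd[i + 1][y] * PSI[x][y] for y in (0, 1))
    # Belief = unary potential times the product of all incoming messages.
    b = [PHI[x] * m_fwd[node][x] * m_bwd[node][x] for x in (0, 1)]
    z = b[0] + b[1]
    return [v / z for v in b]

def chain_marginal_brute(n, node):
    """The same marginal by summing over all 2^n configurations."""
    w = {0: 0.0, 1: 0.0}
    for xs in product((0, 1), repeat=n):
        p = 1.0
        for x in xs:
            p *= PHI[x]
        for a, b in zip(xs, xs[1:]):
            p *= PSI[a][b]
        w[xs[node]] += p
    z = w[0] + w[1]
    return [w[0] / z, w[1] / z]
```

On a tree the two routines agree to floating-point accuracy; the point of the slides is that the message-passing version costs O(N) instead of O(2^N).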

8 Probabilistic Model on a Graph with Loops
Marginal Probability

9 Belief Propagation: Message Update Rule
(Figure: a pixel and its neighbouring pixels.) In the Bethe approximation, the marginal probabilities are assumed to take the following form in terms of the messages from the neighbouring pixels to the pixel. These marginal probabilities satisfy the reducibility conditions at each pixel and at each nearest-neighbour pair of pixels, and the messages are determined so as to satisfy the reducibility conditions.

10 Message Passing Rule of Belief Propagation
(Figure: a pixel and its neighbouring pixels.) The reducibility conditions can be rewritten as the following fixed-point equations for the messages. These fixed-point equations correspond to the extremum condition of the Bethe free energy, and they can be solved numerically by natural iteration. The resulting algorithm corresponds to loopy belief propagation.
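The natural iteration of the fixed-point equations can be sketched as follows. This is a minimal illustration, not from the slides: binary variables on a 3-node cycle (the smallest loopy graph), with the same arbitrary potentials PHI and PSI as before, iterating all messages in parallel until they stop changing:

```python
# A 3-node cycle: the smallest graph with a loop.
EDGES = [(0, 1), (1, 2), (2, 0)]
PHI = [1.0, 2.0]                     # unary potential, favours state 1
PSI = [[2.0, 1.0], [1.0, 2.0]]       # pairwise potential, favours agreement

def neighbours(i):
    return [j for a, b in EDGES
            for j in ((b,) if a == i else (a,) if b == i else ())]

def loopy_bp(max_iter=500, tol=1e-10):
    """Iterate the message fixed-point equations; return beliefs and the
    final message change (small change = converged fixed point)."""
    msgs = {(i, j): [0.5, 0.5]
            for a, b in EDGES for i, j in ((a, b), (b, a))}
    diff = 1.0
    for _ in range(max_iter):
        diff, new = 0.0, {}
        for (i, j) in msgs:
            out = []
            for xj in (0, 1):
                s = 0.0
                for xi in (0, 1):
                    p = PHI[xi] * PSI[xi][xj]
                    for k in neighbours(i):
                        if k != j:                 # all neighbours except j
                            p *= msgs[(k, i)][xi]
                    s += p
                out.append(s)
            z = out[0] + out[1]
            out = [v / z for v in out]             # normalise the message
            diff = max(diff, abs(out[0] - msgs[(i, j)][0]))
            new[(i, j)] = out
        msgs = new
        if diff < tol:
            break
    beliefs = []
    for i in range(3):
        b = [PHI[x] for x in (0, 1)]
        for k in neighbours(i):
            for x in (0, 1):
                b[x] *= msgs[(k, i)][x]
        z = b[0] + b[1]
        beliefs.append([b[0] / z, b[1] / z])
    return beliefs, diff
```

On a loopy graph the converged beliefs are only approximations of the true marginals, which is exactly the accuracy question the talk addresses by comparison with exact results.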

11 Fixed Point Equation and Iterative Method

12 Contents
Introduction
Belief Propagation
Bayesian Image Analysis and Gaussian Graphical Model
Concluding Remarks

13 Bayesian Image Analysis
Original Image -> Transmission (Noise) -> Degraded Image

14 Bayesian Image Analysis
Degradation Process: Additive White Gaussian Noise. Original Image -> Transmission -> Degraded Image
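The degradation process named on the slide, additive white Gaussian noise, can be sketched in a few lines. This is an illustrative helper, not from the slides; the function name and the flat-list signal representation are assumptions:

```python
import random

def degrade(signal, sigma, seed=0):
    """Degrade a signal (flat list of pixel/sample values) by adding
    independent zero-mean Gaussian noise of standard deviation sigma
    to every value: g_i = f_i + n_i, n_i ~ N(0, sigma^2)."""
    rng = random.Random(seed)   # seeded for reproducible experiments
    return [x + rng.gauss(0.0, sigma) for x in signal]
```

In the Bayesian formulation this defines the likelihood p(g | f), a product of independent Gaussians, one per pixel.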

15 Bayesian Image Analysis
A Priori Probability Generate Standard Images Similar?

16 Bayesian Image Analysis
A Posteriori Probability Gaussian Graphical Model

17 Bayesian Image Analysis
A Priori Probability Degraded Image Original Image Pixels A Posteriori Probability
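For the Gaussian graphical model, the posterior mean is the solution of a linear system, which for a one-dimensional chain of pixels is tridiagonal. The sketch below is an illustration under stated assumptions, not the talk's implementation: a smoothness prior exp(-alpha/2 * sum_i (x_i - x_{i+1})^2) and Gaussian noise of variance sigma2, giving the system (alpha*L + I/sigma2) m = g/sigma2 with L the chain graph Laplacian, solved here by the Thomas algorithm:

```python
def posterior_mean(g, alpha, sigma2):
    """Posterior mean of a 1-D Gaussian graphical model, solving the
    tridiagonal system (alpha*L + I/sigma2) m = g/sigma2 in O(n)."""
    n = len(g)
    # Diagonal of A = alpha*L + I/sigma2 (endpoints have degree 1).
    main = [alpha * (2 if 0 < i < n - 1 else 1) + 1.0 / sigma2
            for i in range(n)]
    off = -alpha                      # constant sub/super-diagonal
    rhs = [gi / sigma2 for gi in g]
    # Forward sweep of the Thomas algorithm.
    c = [0.0] * n
    d = [0.0] * n
    c[0] = off / main[0]
    d[0] = rhs[0] / main[0]
    for i in range(1, n):
        denom = main[i] - off * c[i - 1]
        c[i] = off / denom
        d[i] = (rhs[i] - off * d[i - 1]) / denom
    # Back substitution.
    m = [0.0] * n
    m[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        m[i] = d[i] - c[i] * m[i + 1]
    return m
```

For a 2-D image the same system involves the lattice Laplacian and is no longer tridiagonal; that is where belief propagation, or the exact multi-dimensional Gaussian formulas discussed later, come in.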

18 Hyperparameter Determination by Maximization of Marginal Likelihood
In image restoration, we usually have to estimate the hyperparameters alpha and p. In statistics, maximum likelihood estimation is often employed: the hyperparameters are determined so as to maximize the marginal likelihood, which is defined by marginalizing the joint probability of the original and degraded images with respect to the original image. The marginal likelihood is expressed in terms of the partition functions of the a priori and the a posteriori probabilistic models, and these partition functions can be calculated approximately by the Bethe approximation. (Figure labels: Marginalization, Degraded Image, Original Image, Marginal Likelihood.)

19 Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm
Q-Function Incomplete Data Equivalent

20 Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm
Marginal Likelihood Q-Function
Iterate the following EM steps until convergence:
A. P. Dempster, N. M. Laird and D. B. Rubin: Maximum likelihood from incomplete data via the EM algorithm, J. Roy. Stat. Soc. B, 39 (1977).
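The EM loop has the same shape in any Gaussian latent-variable model, so it can be illustrated on a scalar toy version of the image model. This is a stand-in sketch, not the talk's algorithm: per pixel, x ~ N(0, 1/alpha) with alpha known, observation y = x + n with n ~ N(0, sigma^2), and EM estimates the noise variance sigma^2. The E-step computes the posterior mean and variance of each x_i; the M-step maximizes the Q-function over sigma^2:

```python
def em_noise_variance(y, alpha, sigma2=1.0, max_iter=10000, tol=1e-13):
    """EM estimate of sigma^2 in the scalar model x ~ N(0, 1/alpha),
    y = x + n, n ~ N(0, sigma^2). Iterates E- and M-steps until the
    estimate stops changing."""
    n = len(y)
    for _ in range(max_iter):
        # E-step: Gaussian posterior of each x_i given y_i.
        prec = alpha + 1.0 / sigma2          # posterior precision
        var = 1.0 / prec                     # posterior variance
        means = [yi / (sigma2 * prec) for yi in y]
        # M-step: argmax of the Q-function over sigma^2, i.e. the
        # expected squared residual plus the posterior variance.
        new = sum((yi - mi) ** 2 + var
                  for yi, mi in zip(y, means)) / n
        if abs(new - sigma2) < tol:
            return new
        sigma2 = new
    return sigma2
```

In this toy model the marginal of y is N(0, 1/alpha + sigma^2), so EM should converge to the maximum-likelihood value: the empirical second moment of y minus 1/alpha. In the full image model the E-step is the expensive part, and the Bethe approximation or belief propagation supplies the needed posterior expectations.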

21 One-Dimensional Signal
(Figure: Original Signal, Degraded Signal, and Estimated Signal obtained by the EM algorithm; vertical axis 0-255.)

22 Image Restoration by Gaussian Graphical Model
EM Algorithm with Belief Propagation Original Image Degraded Image MSE: 1512 MSE: 1529

23 Exact Results of Gaussian Graphical Model
Multi-dimensional Gauss integral formula
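The slide names the multi-dimensional Gauss integral formula without reproducing it; the standard form, which yields the exact partition functions and marginals of the Gaussian graphical model, is (for positive-definite $A$):

```latex
\int_{\mathbb{R}^{N}} \exp\!\Bigl(-\tfrac{1}{2}\,\mathbf{x}^{\mathsf{T}} A\, \mathbf{x} + \mathbf{b}^{\mathsf{T}}\mathbf{x}\Bigr)\, d\mathbf{x}
  \;=\; \frac{(2\pi)^{N/2}}{\sqrt{\det A}}\; \exp\!\Bigl(\tfrac{1}{2}\,\mathbf{b}^{\mathsf{T}} A^{-1} \mathbf{b}\Bigr)
```

Applied to the posterior, this gives the exact posterior mean $A^{-1}\mathbf{b}$ and covariance $A^{-1}$ against which the belief-propagation results are compared.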

24 Comparison of Belief Propagation with Exact Results in Gaussian Graphical Model
MSE Belief Propagation 327 36.302 Exact 315 37.919 Finally, we show only the results for the gray-level image restoration. For each numerical experiments, the loopy belief propagation ca give us better results than the ones by conventional filters. MSE Belief Propagation 260 33.998 Exact 236 34.975 8 November, 2005 CISJ2005
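The tables in these slides report mean squared error between the restored and the original image. For reference, a sketch of that computation (a hypothetical helper, not from the slides), on images represented as flat lists of pixel values:

```python
def mse(a, b):
    """Mean squared error between two images given as flat pixel lists."""
    assert len(a) == len(b), "images must have the same number of pixels"
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
```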

25 Image Restoration by Gaussian Graphical Model
Original Image; Degraded Image (MSE: 1512); Belief Propagation (MSE: 325); Exact (MSE: 315); Lowpass Filter (MSE: 411); Wiener Filter (MSE: 545); Median Filter (MSE: 447).
In each numerical experiment for gray-level image restoration, loopy belief propagation gives better results than the conventional filters.

26 Image Restoration by Gaussian Graphical Model
Original Image; Degraded Image (MSE: 1529); Belief Propagation (MSE: 260); Exact (MSE: 236); Lowpass Filter (MSE: 224); Wiener Filter (MSE: 372); Median Filter (MSE: 244).

27 Extension of Belief Propagation
Generalized Belief Propagation. J. S. Yedidia, W. T. Freeman and Y. Weiss: Constructing free-energy approximations and generalized belief propagation algorithms, IEEE Transactions on Information Theory, 51 (2005). Generalized belief propagation is equivalent to the cluster variation method in statistical mechanics. R. Kikuchi: A theory of cooperative phenomena, Phys. Rev., 81 (1951). T. Morita: Cluster variation method of cooperative phenomena and its generalization I, J. Phys. Soc. Jpn, 12 (1957).

28 Image Restoration by Gaussian Graphical Model
First image:  Belief Propagation: MSE 327 (36.302); Generalized Belief Propagation: MSE 315 (37.909); Exact: (37.919)
Second image: Belief Propagation: MSE 260 (33.998); Generalized Belief Propagation: MSE 236 (34.971); Exact: (34.975)

29 Image Restoration by Gaussian Graphical Model and Conventional Filters
MSE: Belief Propagation 327; Generalized Belief Propagation 315; Exact. Lowpass Filter: (3x3) 388, (5x5) 413. Median Filter: (3x3) 486, (5x5) 445. Wiener Filter: (3x3) 864, (5x5) 548.
(Figure panels: GBP, (3x3) Lowpass, (5x5) Median, (5x5) Wiener.)

30 Image Restoration by Gaussian Graphical Model and Conventional Filters
MSE: Belief Propagation 260; Generalized Belief Propagation 236; Exact. Lowpass Filter: (3x3) 241, (5x5) 224. Median Filter: (3x3) 331, (5x5) 244. Wiener Filter: (3x3) 703, (5x5) 372.
(Figure panels: GBP, (5x5) Lowpass, (5x5) Median, (5x5) Wiener.)

31 Bayesian Image Analysis and Gaussian Graphical Model
Contents
Introduction
Belief Propagation
Bayesian Image Analysis and Gaussian Graphical Model
Concluding Remarks

32 Summary
Formulation of belief propagation.
Accuracy of belief propagation in Bayesian image analysis by means of the Gaussian graphical model (comparison between belief propagation and exact calculation).

