
1 Graduate School of Information Sciences, Tohoku University, Japan
Generalized Belief Propagation for Gaussian Graphical Model in probabilistic image processing Kazuyuki Tanaka Graduate School of Information Sciences, Tohoku University, Japan Reference K. Tanaka, H. Shouno, M. Okada and D. M. Titterington: Accuracy of the Bethe Approximation for Hyperparameter Estimation in Probabilistic Image Processing, J. Phys. A: Math & Gen., 37, 8675 (2004). 6 September, 2005 SPDSA2005 (Roma)

2 Contents: Introduction, Loopy Belief Propagation, Generalized Belief Propagation, Probabilistic Image Processing, Concluding Remarks

3 Bayesian Image Analysis
Original Image → (Noise in Transmission) → Degraded Image. Graphical Model with Loops = Spin System on Square Lattice. Bayesian Image Analysis + Belief Propagation → Probabilistic Image Processing

4 Belief Propagation
Probabilistic model with no loops: Belief Propagation = Transfer Matrix (Lauritzen; Pearl). Probabilistic model with some loops: approximation → Loopy Belief Propagation and Generalized Belief Propagation (Yedidia, Freeman, Weiss). Loopy Belief Propagation (LBP) = Bethe Approximation; Generalized Belief Propagation (GBP) = Cluster Variation Method. How accurate are LBP and GBP?

5 Gaussian Graphical Model

6 Probabilistic Image Processing by Gaussian Graphical Model and Generalized Belief Propagation
How can we construct a probabilistic image processing algorithm by using Loopy Belief Propagation and Generalized Belief Propagation? How accurate are Loopy Belief Propagation and Generalized Belief Propagation? To clarify both questions, we adopt the Gaussian graphical model as the posterior probabilistic model.

7 Contents: Introduction, Loopy Belief Propagation, Generalized Belief Propagation, Probabilistic Image Processing, Concluding Remarks

8 Kullback-Leibler Divergence of Gaussian Graphical Model
Entropy Term

9 Loopy Belief Propagation
Trial Function Tractable Form

10 Loopy Belief Propagation
Trial Function. The Marginal Distribution of a GGM is also a GGM.

11 Loopy Belief Propagation
Bethe Free Energy in GGM

12 Loopy Belief Propagation
m is exact (for the Gaussian graphical model, LBP gives the exact posterior means).

13 Iteration Procedure
Fixed Point Equation → Iteration
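The fixed-point equations can be solved by iterating them from an arbitrary initialization. The following is a minimal sketch, not the talk's actual algorithm: Gaussian loopy belief propagation on a small loopy model p(x) ∝ exp(-x'Jx/2 + h'x), where the model, the 4-node cycle, and the parallel update schedule are illustrative assumptions. Each message is Gaussian, so it is fully described by a precision and a shift, and at convergence the resulting means agree with the exact solution of Jm = h.

```python
import numpy as np

def gaussian_lbp(J, h, n_iter=200):
    """Loopy belief propagation for p(x) ~ exp(-x'Jx/2 + h'x).
    Each message i->j is Gaussian, stored as a precision a[i, j]
    and a shift b[i, j]; all messages are updated in parallel."""
    n = len(h)
    a = np.zeros((n, n))
    b = np.zeros((n, n))
    nbrs = [[j for j in range(n) if j != i and J[i, j] != 0.0] for i in range(n)]
    for _ in range(n_iter):
        a_new, b_new = np.zeros_like(a), np.zeros_like(b)
        for i in range(n):
            for j in nbrs[i]:
                # cavity distribution at i, excluding the message from j
                A = J[i, i] + sum(a[k, i] for k in nbrs[i] if k != j)
                B = h[i] + sum(b[k, i] for k in nbrs[i] if k != j)
                a_new[i, j] = -J[i, j] ** 2 / A
                b_new[i, j] = -J[i, j] * B / A
        a, b = a_new, b_new
    prec = np.array([J[i, i] + sum(a[k, i] for k in nbrs[i]) for i in range(n)])
    mean = np.array([(h[i] + sum(b[k, i] for k in nbrs[i])) / prec[i]
                     for i in range(n)])
    return mean, prec

# a 4-node cycle: the simplest loopy graph
J = np.array([[4.0, 1.0, 0.0, 1.0],
              [1.0, 4.0, 1.0, 0.0],
              [0.0, 1.0, 4.0, 1.0],
              [1.0, 0.0, 1.0, 4.0]])
h = np.array([1.0, 0.0, 2.0, -1.0])
mean, prec = gaussian_lbp(J, h)
print(np.allclose(mean, np.linalg.solve(J, h)))  # True: the means are exact
```

The final print compares the LBP means with the exact means; the approximate marginal precisions, by contrast, need not be exact on a loopy graph.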

14 Loopy Belief Propagation and TAP Free Energy
Mean Field Free Energy

15 Contents: Introduction, Loopy Belief Propagation, Generalized Belief Propagation, Probabilistic Image Processing, Concluding Remarks

16 Generalized Belief Propagation
Cluster: a set of nodes. B is a set of clusters such that no subcluster of an element of B belongs to B. Example: a system consisting of 4 nodes.

17 Selection of B in LBP and GBP
[Diagrams on a 3x3 lattice of nodes 1-9: LBP (Bethe Approx.) and GBP (Square Approx. in CVM).]

18 Selection of B and C in Loopy Belief Propagation
LBP (Bethe Approx.): B is the set of basic clusters; C is the set of basic clusters and their subclusters.

19 Selection of B and C in Generalized Belief Propagation
GBP (Square Approximation in CVM): B is the set of basic clusters; C is the set of basic clusters and their subclusters.

20 Generalized Belief Propagation
Trial Function. The Marginal Distribution of a GGM is also a GGM.

21 Generalized Belief Propagation
m is exact

22 Contents: Introduction, Loopy Belief Propagation, Generalized Belief Propagation, Probabilistic Image Processing, Concluding Remarks

23 Bayesian Image Analysis
Original Image → (Noise in Transmission) → Degraded Image

24 Bayesian Image Analysis
Degradation Process: Additive White Gaussian Noise. Original Image → (Noise in Transmission) → Degraded Image
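The degradation process on this slide, additive white Gaussian noise applied pixel by pixel, can be sketched in a few lines; the image size and noise level are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.uniform(0.0, 255.0, size=(16, 16))     # stand-in "original image"
sigma = 40.0                                   # noise standard deviation
g = f + rng.normal(0.0, sigma, size=f.shape)   # degraded image: g_i = f_i + n_i
# each observed pixel is conditionally Gaussian: g_i | f_i ~ N(f_i, sigma^2)
mse = np.mean((g - f) ** 2)
print(round(mse, 1))  # fluctuates around sigma**2 = 1600.0
```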

25 Bayesian Image Analysis
A Priori Probability: does it generate images similar to standard images?

26 Bayesian Image Analysis
Original Image f, Degraded Image g. The A Posteriori Probability is a Gaussian Graphical Model.
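Because the posterior is a Gaussian graphical model on the pixel lattice, its mean can be computed exactly by solving one sparse linear system. A minimal sketch under an assumed quadratic smoothness prior; the hyperparameter values and the tiny test image are illustrative, not the talk's experimental setup:

```python
import numpy as np

def posterior_mean(g, alpha, sigma2):
    """Exact posterior mean for the Gaussian graphical model posterior
    P(f|g) ~ exp(-(alpha/2) * sum_<ij> (f_i - f_j)^2
                 - sum_i (f_i - g_i)^2 / (2 * sigma2)),
    i.e. the solution of (I/sigma2 + alpha*L) f = g/sigma2,
    where L is the Laplacian of the 4-neighbour pixel lattice."""
    H, W = g.shape
    n = H * W
    L = np.zeros((n, n))
    for y in range(H):
        for x in range(W):
            i = y * W + x
            for dy, dx in ((0, 1), (1, 0)):  # right and down neighbours
                if y + dy < H and x + dx < W:
                    j = (y + dy) * W + (x + dx)
                    L[i, i] += 1.0
                    L[j, j] += 1.0
                    L[i, j] -= 1.0
                    L[j, i] -= 1.0
    A = np.eye(n) / sigma2 + alpha * L  # posterior precision matrix
    return np.linalg.solve(A, g.ravel() / sigma2).reshape(g.shape)

rng = np.random.default_rng(1)
f = 100.0 + 3.0 * np.add.outer(np.arange(8.0), np.arange(8.0))  # smooth ramp "image"
g = f + rng.normal(0.0, 40.0, size=f.shape)                     # degraded image
f_hat = posterior_mean(g, alpha=0.002, sigma2=1600.0)
print(np.mean((f_hat - f) ** 2) < np.mean((g - f) ** 2))  # smoothing reduces the MSE
```

With alpha = 0 the prior drops out and the posterior mean reduces to the degraded image itself, which is a convenient sanity check.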

27 Bayesian Image Analysis
[Diagram: the A Priori Probability over the Original Image's pixels and the Degraded Image combine to give the A Posteriori Probability.]

28 Hyperparameter Determination by Maximization of Marginal Likelihood
In image restoration, we usually have to estimate the hyperparameters alpha and p. In statistics, maximum likelihood estimation is often employed. From this standpoint, the hyperparameters are determined so as to maximize the marginal likelihood, defined by marginalizing the joint probability of the original and degraded images with respect to the original image. The marginal likelihood is expressed in terms of the partition functions of the a priori and a posteriori probabilistic models. We can calculate these partition functions approximately by using the Bethe approximation. [Equations: marginalization over the Original Image gives the Marginal Likelihood of the Degraded Image.]
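For a Gaussian model this statement can be verified exactly, since both partition functions are Gaussian integrals and reduce to log-determinants. A small one-dimensional sketch; the chain prior, the ridge term that keeps the prior proper, and all numerical values are assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
# prior precision P: a chain coupling plus a small ridge so that det(P) > 0
L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
alpha, eps, sigma2 = 0.5, 0.1, 4.0
P = alpha * L + eps * np.eye(n)
g = rng.normal(0.0, 3.0, n)  # an "observed" vector

# log marginal likelihood via the two partition functions (log-determinants)
A = P + np.eye(n) / sigma2          # posterior precision
m = np.linalg.solve(A, g / sigma2)  # posterior mean
_, logdetP = np.linalg.slogdet(P)
_, logdetA = np.linalg.slogdet(A)
logml = (0.5 * logdetP - 0.5 * logdetA
         - 0.5 * n * np.log(2.0 * np.pi * sigma2)
         - g @ g / (2.0 * sigma2) + 0.5 * m @ (A @ m))

# cross-check against the direct formula g ~ N(0, P^{-1} + sigma2*I)
C = np.linalg.inv(P) + sigma2 * np.eye(n)
_, logdetC = np.linalg.slogdet(C)
logml_direct = -0.5 * (n * np.log(2.0 * np.pi) + logdetC
                       + g @ np.linalg.solve(C, g))
print(np.isclose(logml, logml_direct))  # True
```

The cross-check works because det(P^{-1} + sigma2*I) = sigma2^n det(A)/det(P), which is exactly the ratio of the two partition functions.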

29 Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm
Q-Function; Incomplete Data; Equivalent: maximizing the marginal likelihood of the incomplete data is achieved by iteratively maximizing the Q-function.

30 Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm
Marginal Likelihood and Q-Function. Iterate the following EM steps until convergence. EM Algorithm: A. P. Dempster, N. M. Laird and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," J. Roy. Stat. Soc. B, 39 (1977).
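The EM steps can be sketched for a Gaussian model. For brevity this example estimates only the noise variance with the prior held fixed, a simplification of the joint estimation of both hyperparameters described above; the chain prior and all numbers are assumptions. The E-step computes the exact Gaussian posterior, and the M-step maximizes the Q-function in closed form.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
P = 0.5 * L + 0.1 * np.eye(n)       # fixed (assumed) prior precision
f_true = np.linalg.cholesky(np.linalg.inv(P)) @ rng.normal(size=n)
g = f_true + rng.normal(0.0, 2.0, n)  # true noise variance: 4.0

sigma2 = 1.0                          # initial guess
for _ in range(100):
    # E-step: exact Gaussian posterior N(m, A^{-1}) for the current sigma2
    A = P + np.eye(n) / sigma2
    m = np.linalg.solve(A, g / sigma2)
    Sigma = np.linalg.inv(A)
    # M-step: sigma2 <- E[||g - f||^2] / n under the current posterior
    sigma2 = (np.sum((g - m) ** 2) + np.trace(Sigma)) / n
print(round(sigma2, 2))  # the converged estimate of the noise variance
```

Each pass of the loop cannot decrease the marginal likelihood, which is the EM guarantee the slide relies on.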

31 Image Restoration
The original image is generated from the prior probability. (Hyperparameters: maximization of marginal likelihood.) [Images: Original Image, Degraded Image, Loopy Belief Propagation, Mean-Field Method, Exact Result]

32 Numerical Experiments of Logarithm of Marginal Likelihood
The original image is generated from the prior probability. (Hyperparameters: maximization of marginal likelihood.) [Plots: logarithm of the marginal likelihood for the Mean-Field Method (MFA), Loopy Belief Propagation (LBP), and the Exact Result, for the original and degraded images.]

33 Numerical Experiments of Logarithm of Marginal Likelihood
EM Algorithm with Belief Propagation. [Plots: EM trajectories of the hyperparameter estimates for the Mean-Field Method (MFA), Loopy Belief Propagation (LBP), and the Exact Result, for the original and degraded images.]

34 Image Restoration by Gaussian Graphical Model
EM Algorithm with Belief Propagation. [Images: two Original Images and their Degraded Images, MSE: 1512 and MSE: 1529.]

35 Image Restoration by Gaussian Graphical Model
Finally, we show only the results for gray-level image restoration. In each numerical experiment, loopy belief propagation gives better results than the conventional filters. [Images] Original Image; Degraded Image (MSE: 1512); Mean Field Method (MSE: 611); LBP (MSE: 327); TAP (MSE: 320); GBP (MSE: 315); Exact Solution (MSE: 315)

36 Image Restoration by Gaussian Graphical Model
[Images] Original Image; Degraded Image (MSE: 1529); Mean Field Method (MSE: 565); LBP (MSE: 260); TAP (MSE: 248); GBP (MSE: 236); Exact Solution (MSE: 236)

37 Image Restoration by Gaussian Graphical Model
MSE (second reported figure in parentheses). First image: MF 611 (26.918), LBP 327 (36.302), TAP 320 (37.170), GBP 315 (37.909), Exact 315 (37.919). Second image: MF 565 (26.353), LBP 260 (33.998), TAP 248 (34.475), GBP 236 (34.971), Exact 236 (34.975).

38 Image Restoration by Gaussian Graphical Model and Conventional Filters
MSE. GGM: MF 611, LBP 327, TAP 320, GBP 315, Exact 315. Conventional filters: Lowpass Filter (3x3) 388, (5x5) 413; Median Filter 486, 445; Wiener Filter 864, 548. [Images: GBP, (3x3) Lowpass, (5x5) Median, (5x5) Wiener]

39 Image Restoration by Gaussian Graphical Model and Conventional Filters
MSE. GGM: MF 565, LBP 260, TAP 248, GBP 236, Exact 236. Conventional filters: Lowpass Filter (3x3) 241, (5x5) 224; Median Filter 331, 244; Wiener Filter 703, 372. [Images: GBP, (5x5) Lowpass, (5x5) Median, (5x5) Wiener]

40 Contents: Introduction, Loopy Belief Propagation, Generalized Belief Propagation, Probabilistic Image Processing, Concluding Remarks

41 Summary Generalized Belief Propagation for Gaussian Graphical Model
Accuracy of Generalized Belief Propagation. Derivation of the TAP Free Energy for the Gaussian Graphical Model by a perturbation expansion of the Bethe approximation.

42 Future Problem
Hyperparameter estimation by the TAP free energy is better than by Loopy Belief Propagation. A future problem is the effectiveness of the higher-order terms of the TAP free energy for hyperparameter estimation by means of the marginal likelihood in Bayesian image analysis. [Equations: TAP Free Energy; Mean Field Free Energy]

