
Slide 1: Introduction to Probabilistic Image Processing and Bayesian Networks
Kazuyuki Tanaka, Graduate School of Information Sciences, Tohoku University, Sendai, Japan
kazu@smapip.is.tohoku.ac.jp | http://www.smapip.is.tohoku.ac.jp/~kazu/
1 October, 2007, ALT&DS2007 (Sendai, Japan)

Slide 2: Contents
1. Introduction
2. Probabilistic Image Processing
3. Gaussian Graphical Model
4. Belief Propagation
5. Other Applications
6. Concluding Remarks

Slide 3: Contents (section divider: 1. Introduction)

Slide 4: Markov Random Fields for Image Processing
Markov random fields (MRFs) are one of the probabilistic methods for image processing.
S. Geman and D. Geman (1984), IEEE Transactions on PAMI: image processing with MRFs (simulated annealing, line fields).
J. Zhang (1992), IEEE Transactions on Signal Processing: image processing with the EM algorithm for MRFs (mean-field methods).

Slide 5: Markov Random Fields for Image Processing
In Markov random fields, we have to consider not only the states with high probabilities but also those with low probabilities, and we have to estimate not only the image but also the hyperparameters of the probabilistic model. We therefore have to compute statistical quantities repeatedly, both for hyperparameter estimation and for the estimation of the image. This calls for a deterministic algorithm for calculating statistical quantities: belief propagation.

Slide 6: Belief Propagation
Belief propagation was proposed in order to achieve probabilistic inference systems (Pearl, 1988). It has been suggested that belief propagation has a close relationship to mean-field methods in statistical mechanics (Kabashima and Saad, 1998). Generalized belief propagation has been proposed based on advanced mean-field methods (Yedidia, Freeman and Weiss, 2000). An interpretation of generalized belief propagation has also been presented in terms of information geometry (Ikeda, T. Tanaka and Amari, 2004).

Slide 7: Probabilistic Model and Belief Propagation
[Figure: a tree graph on nodes x1, ..., x5 and a cycle graph on nodes x1, x2, x3.]
A function consisting of a product of two-variable functions can be assigned a graph representation. Belief propagation gives exact results for the statistical quantities of probabilistic models whose graph representations are trees. In general, it cannot give exact results for models whose graph representations contain cycles.
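To make the tree-exactness claim concrete, here is a minimal sum-product sketch on a 5-node tree, checked against brute-force enumeration. The particular tree, the pairwise function psi and the bias phi on node 4 are illustrative assumptions, not the slide's model.

```python
import itertools

# Pairwise model on a tree: P(x) ∝ prod over edges of psi(x_i, x_j) times
# a single node bias, with binary states.
edges = [(0, 1), (1, 2), (1, 3), (3, 4)]
psi = [[2.0, 1.0], [1.0, 2.0]]          # favours equal neighbouring states
phi = {4: [1.5, 0.5]}                   # illustrative bias on node 4 toward state 0

def node_factor(k, s):
    return phi.get(k, [1.0, 1.0])[s]

# Brute-force marginal of node 0.
Z, m0 = 0.0, [0.0, 0.0]
for x in itertools.product((0, 1), repeat=5):
    w = 1.0
    for i, j in edges:
        w *= psi[x[i]][x[j]]
    for k in range(5):
        w *= node_factor(k, x[k])
    Z += w
    m0[x[0]] += w
m0 = [v / Z for v in m0]

# Sum-product message passing; on a tree a few sweeps give exact marginals.
nbrs = {k: [] for k in range(5)}
for i, j in edges:
    nbrs[i].append(j); nbrs[j].append(i)
msg = {(i, j): [1.0, 1.0] for i in nbrs for j in nbrs[i]}
for _ in range(10):
    for (i, j) in list(msg):
        out = [0.0, 0.0]
        for t in (0, 1):
            for s in (0, 1):
                p = psi[s][t] * node_factor(i, s)
                for k in nbrs[i]:
                    if k != j:
                        p *= msg[(k, i)][s]
                out[t] += p
        z = sum(out)
        msg[(i, j)] = [v / z for v in out]

belief0 = [node_factor(0, s) for s in (0, 1)]
for k in nbrs[0]:
    belief0 = [belief0[s] * msg[(k, 0)][s] for s in (0, 1)]
z = sum(belief0)
belief0 = [v / z for v in belief0]      # matches m0 exactly on a tree
```

On a graph with cycles the same update rule can still be iterated, but the fixed point is then only an approximation of the true marginals.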

Slide 8: Applications of Belief Propagation
Turbo and LDPC codes in error-correcting codes (Berrou and Glavieux: IEEE Trans. Comm., 1996; Kabashima and Saad: J. Phys. A, 2004, Topical Review). CDMA multiuser detection in mobile phone communication (Kabashima: J. Phys. A, 2003). Satisfiability (SAT) problems in computation theory (Mezard, Parisi, Zecchina: Science, 2002). Image processing (Tanaka: J. Phys. A, 2002, Topical Review; Willsky: Proceedings of the IEEE, 2002). Probabilistic inference in AI (Kappen and Wiegerinck: NIPS, 2002). Applications of belief propagation to many problems formulated as probabilistic models with cycle graph representations have led to many successful results.

Slide 9: Purpose of My Talk
Review of the formulation of probabilistic models for image processing by means of conventional statistical schemes. Review of probabilistic image processing using the Gaussian graphical model (Gaussian Markov random fields) as the most basic example. Review of how to construct a belief propagation algorithm for image processing.

Slide 10: Contents (section divider: 2. Probabilistic Image Processing)

Slide 11: Image Representation in Computer Vision
A digital image is defined on a set of points arranged on a square lattice; the elements of such a digital array are called pixels. We have to treat more than 100,000 pixels even in digital cameras and mobile phones.

Slide 12: Image Representation in Computer Vision
At each point, the intensity of light is represented as an integer or a real number in the digital image data. A monochrome digital image is then expressed as a two-dimensional light intensity function whose value is proportional to the brightness of the image at each pixel.

Slide 13: Noise Reduction by Conventional Filters
[Example: a 3x3 neighbourhood with intensities 192, 202, 190 / 202, 219, 120 / 100, 218, 110; a smoothing filter replaces the centre value 219 by 173, the mean of the nine intensities.]
The function of a linear filter is to take the sum of the products of the mask coefficients and the intensities of the pixels. It is expected that probabilistic algorithms for image processing can be constructed from such aspects of conventional signal processing: smoothing filters lead to Markov random fields and to probabilistic image processing algorithms.
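The linear-filter rule can be sketched in a few lines; assuming a uniform 3x3 mean mask, this reproduces the replacement of the centre value 219 by 173.

```python
# A 3x3 mean filter: the output is the sum of the products of the mask
# coefficients (all 1/9 here) and the neighbouring pixel intensities.
neighbourhood = [
    [192, 202, 190],
    [202, 219, 120],
    [100, 218, 110],
]
mask = [[1 / 9] * 3 for _ in range(3)]
out = sum(mask[a][b] * neighbourhood[a][b] for a in range(3) for b in range(3))
print(round(out))  # 173
```

Other choices of mask coefficients give other linear filters; a median filter is nonlinear and instead takes the median of the neighbourhood.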

Slide 14: Bayes Formula and Bayesian Network
Bayes rule relates the posterior probability to the prior probability: Pr{B|A} = Pr{A|B} Pr{B} / Pr{A}. Event A is given as the observed data, and event B corresponds to the original information to estimate. Thus the Bayes formula can be applied to the estimation of the original information from the given data. [Figure: a two-node Bayesian network over A and B representing the data-generating process.]
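A toy numerical instance of the Bayes formula for a single binary pixel; the prior and the flip probability below are illustrative assumptions.

```python
# B is the original binary pixel state, A the observed (possibly flipped)
# state; all numbers are illustrative.
prior = {0: 0.6, 1: 0.4}                                  # Pr{B}
flip = 0.2                                                # Pr{A != B}

def likelihood(a, b):                                     # Pr{A | B}
    return 1 - flip if a == b else flip

a = 1                                                     # observed datum
evidence = sum(likelihood(a, b) * prior[b] for b in (0, 1))           # Pr{A}
posterior = {b: likelihood(a, b) * prior[b] / evidence for b in (0, 1)}
print(posterior[1])  # 0.32 / 0.44 ≈ 0.727
```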

Slide 15: Image Restoration by Probabilistic Model
[Figure: original image -> transmission with noise -> degraded image.]
Assumption 1: the degraded image is randomly generated from the original image according to the degradation process. Assumption 2: the original image is randomly generated according to the prior probability. These two assumptions are combined through the Bayes formula.

Slide 16: Image Restoration by Probabilistic Model
f_i: light intensity of pixel i in the original image; g_i: light intensity of pixel i in the degraded image (each pixel i is labelled by its position vector). The original image and the degraded image are represented by f = {f_i} and g = {g_i}, respectively.

Slide 17: Probabilistic Modeling of Image Restoration
Assumption 1: a given degraded image is obtained from the original image by changing the state of each pixel to another state with the same probability, independently of the other pixels.
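Assumption 1 can be sketched as follows for binary images; the flip probability p and the image size are illustrative.

```python
import random

# Each binary pixel flips to the other state with the same probability p,
# independently of all other pixels.
def degrade(image, p, rng):
    return [[1 - f if rng.random() < p else f for f in row] for row in image]

original = [[0] * 8 for _ in range(8)]
degraded = degrade(original, p=0.1, rng=random.Random(0))
flips = sum(g for row in degraded for g in row)   # pixels changed from 0 to 1
```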

Slide 18: Probabilistic Modeling of Image Restoration
Assumption 2: the original image is generated according to a prior probability. The prior probability consists of a product of functions defined on neighbouring pixels, taken over all nearest-neighbour pairs of pixels (i, j).

Slide 19: Prior Probability for Binary Image
It is important how we should choose the function Φ(f_i, f_j) in the prior probability. We assume that every nearest-neighbour pair of pixels is more likely to take the same state, i.e. Φ(f_i, f_j) is larger when f_i = f_j than when f_i ≠ f_j.

Slide 20: Prior Probability for Binary Image
The prior probability prefers the configuration with the fewest disagreeing nearest-neighbour pairs (drawn as red lines in the slide). Which state should the centre pixel take when the states of the neighbouring pixels are all fixed to white? Since agreement is preferred, white is the more probable choice.

Slide 21: Prior Probability for Binary Image
Which state should the centre pixel take when the states of the neighbouring pixels are fixed as in the figure, some white and some black? Again, the prior probability prefers the configuration with the fewest disagreeing nearest-neighbour pairs.
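The preference expressed on these two slides can be checked numerically; the factor values below are illustrative assumptions for Φ(f_i, f_j).

```python
# Each nearest-neighbour pair contributes a larger factor when its two
# pixels agree, so configurations with fewer disagreeing pairs ("red
# lines") get higher prior weight.  The 0.7 / 0.3 values are illustrative.
AGREE, DISAGREE = 0.7, 0.3

def centre_weight(centre, neighbours):
    w = 1.0
    for f in neighbours:
        w *= AGREE if f == centre else DISAGREE
    return w

all_white = [0, 0, 0, 0]   # four white neighbours (slide 20): white wins
mixed = [0, 0, 1, 1]       # two white, two black (slide 21): either choice
                           # leaves two disagreeing pairs, so weights coincide
print(centre_weight(0, all_white) > centre_weight(1, all_white))  # True
```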

Slide 22: What Happens for a Large Number of Pixels?
[Figure: covariance between nearest-neighbour pairs of pixels as a function of ln p, showing a disordered state for small p, an ordered state for large p, and a critical point with large fluctuations in between; samples are generated by Markov chain Monte Carlo.]
Patterns containing both ordered and disordered regions are often generated near the critical point.
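The Markov chain Monte Carlo sampling mentioned on this slide can be sketched with a Gibbs sampler; the coupling J (playing the role of ln p), the lattice size and the sweep count are illustrative choices.

```python
import math
import random

# Gibbs sampler for the binary nearest-neighbour prior
# P(f) ∝ exp(J * number of agreeing nearest-neighbour pairs).
def gibbs_sample(n=16, J=0.8, sweeps=50, seed=1):
    rng = random.Random(seed)
    f = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                e = [0.0, 0.0]   # J times the number of neighbours in each state
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    a, b = i + di, j + dj
                    if 0 <= a < n and 0 <= b < n:
                        e[f[a][b]] += J
                # conditional probability of state 1 given the neighbours
                p1 = math.exp(e[1]) / (math.exp(e[0]) + math.exp(e[1]))
                f[i][j] = 1 if rng.random() < p1 else 0
    return f

pattern = gibbs_sample()
```

Small J produces disordered patterns, large J nearly uniform ones, and values near the critical coupling produce the mixed patterns described on the slide.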

Slide 23: Patterns near the Critical Point of the Prior Probability
[Figure: covariance between nearest-neighbour pairs of pixels versus ln p, with sample patterns for small and large p.]
We regard the patterns generated near the critical point as similar to the local patterns in real-world images.

Slide 24: Bayesian Image Analysis
The prior probability of the original image and the degradation process that produces the degraded image together determine the posterior probability. Image processing is then reduced to the calculation of averages, variances and covariances under the posterior probability. (B: set of all nearest-neighbour pairs of pixels; Ω: set of all pixels.)

Slide 25: Estimation of the Original Image
We have several choices for estimating the restored image from the posterior probability: (1) thresholded posterior mean (TPM) estimation, (2) maximum posterior marginal (MPM) estimation, and (3) maximum a posteriori (MAP) estimation. In each choice, the computational time is generally of exponential order in the number of pixels.
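A tiny enumeration makes the estimators and their exponential cost concrete; the binary chain, J and h below are illustrative assumptions rather than the full image model.

```python
import itertools
import math

# MAP and MPM estimates on a 4-pixel binary chain with posterior
#   P(f | g) ∝ exp(J * #{f_i = f_{i+1}} + h * #{f_i = g_i});
# enumeration visits 2^n states, which is why these estimates are
# exponentially hard for real images.
g = (0, 0, 1, 0)
J, h = 1.0, 0.5
n = len(g)

weight = {}
for f in itertools.product((0, 1), repeat=n):
    s = J * sum(f[i] == f[i + 1] for i in range(n - 1))
    s += h * sum(f[i] == g[i] for i in range(n))
    weight[f] = math.exp(s)
Z = sum(weight.values())

map_est = max(weight, key=weight.get)                 # maximises the joint posterior
marginals = [sum(w for f, w in weight.items() if f[i] == 1) / Z for i in range(n)]
mpm_est = tuple(int(m > 0.5) for m in marginals)      # maximises each pixel marginal
print(map_est)  # (0, 0, 0, 0): the isolated 1 in g is smoothed away
```

The TPM estimate would instead threshold the posterior mean of each pixel, which coincides with MPM in this binary case.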

Slide 26: Statistical Estimation of Hyperparameters
The marginal likelihood is obtained by marginalizing the joint probability of the original image F and the degraded image G with respect to F. The hyperparameters α and σ are determined so as to maximize the marginal likelihood Pr{G = g | α, σ}.

Slide 27: Maximization of the Marginal Likelihood by the EM Algorithm
The marginal likelihood is maximized by the EM (expectation-maximization) algorithm: the E-step computes the Q-function, the M-step maximizes it with respect to the hyperparameters, and the two steps are iterated until convergence.
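A scalar analogue of this EM loop, under illustrative assumptions (independent pixels f_i ~ N(0, 1/alpha), observations g_i = f_i + Gaussian noise of known variance), shows the E-step/M-step iteration converging to the maximizer of the marginal likelihood, which this simplified model admits in closed form.

```python
import random
import statistics

# Toy model: f_i ~ N(0, 1/alpha), g_i = f_i + N(0, sigma2), sigma2 known.
# EM estimates alpha from the data g alone.  All numbers are illustrative.
rng = random.Random(42)
sigma2, true_alpha = 0.25, 2.0
g = [rng.gauss(0, (1 / true_alpha + sigma2) ** 0.5) for _ in range(5000)]

alpha = 1.0
for _ in range(200):
    # E-step: posterior mean and variance of each f_i given g_i,
    # giving the second moments needed by the Q-function
    post_var = 1.0 / (alpha + 1.0 / sigma2)
    second_moments = [(post_var * gi / sigma2) ** 2 + post_var for gi in g]
    # M-step: maximise the Q-function over alpha
    alpha = 1.0 / statistics.fmean(second_moments)

# In this toy model the marginal is g ~ N(0, 1/alpha + sigma2), so the
# marginal-likelihood maximiser is available in closed form for comparison.
closed_form = 1.0 / (statistics.fmean(x * x for x in g) - sigma2)
```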

Slide 28: Contents (section divider: 3. Gaussian Graphical Model)

Slide 29: Bayesian Image Analysis by Gaussian Graphical Model
The prior probability is a Gaussian graphical model, and sample patterns are generated from it by the Markov chain Monte Carlo method. (B: set of all nearest-neighbour pairs of pixels; Ω: set of all pixels.)

Slide 30: Bayesian Image Analysis by Gaussian Graphical Model
The degradation process is assumed to be additive white Gaussian noise: the degraded image is obtained by adding white Gaussian noise to the original image. [Figure: histogram of the Gaussian random numbers.] (Ω: set of all pixels.)

Slide 31: Bayesian Image Analysis by Gaussian Graphical Model
The average over the posterior probability can be calculated by using the multi-dimensional Gaussian integral formula, which involves an NxN matrix over the pixels. (B: set of all nearest-neighbour pairs of pixels; Ω: set of all pixels.)
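For the Gaussian graphical model the posterior average reduces to a linear solve with the lattice Laplacian; here is a sketch with illustrative hyperparameters alpha and sigma2, and random data standing in for an image.

```python
import numpy as np

# Posterior P(f|g) ∝ exp(-(alpha/2) * sum over pairs (f_i - f_j)^2
#                        - (1/(2*sigma2)) * sum over pixels (f_i - g_i)^2).
# Its mean solves (I/sigma2 + alpha*L) f = g/sigma2, with L the lattice
# Laplacian; this is the multi-dimensional Gaussian integral in practice.
n = 8                               # an n x n image, N = n*n pixels
N = n * n
alpha, sigma2 = 0.5, 1.0

L = np.zeros((N, N))                # Laplacian: degrees on the diagonal, -1 per edge
for i in range(n):
    for j in range(n):
        k = i * n + j
        for di, dj in ((1, 0), (0, 1)):
            a, b = i + di, j + dj
            if a < n and b < n:
                m = a * n + b
                L[k, k] += 1; L[m, m] += 1
                L[k, m] -= 1; L[m, k] -= 1

rng = np.random.default_rng(0)
g = rng.normal(size=N)              # noisy data in place of a real image
A = np.eye(N) / sigma2 + alpha * L  # posterior precision matrix (N x N)
mean = np.linalg.solve(A, g / sigma2)
```

The solve smooths the data: the posterior mean has smaller variance than g, and the inverse of A gives the posterior covariances.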

Slide 32: Bayesian Image Analysis by Gaussian Graphical Model
Iteration procedure of the EM algorithm in the Gaussian graphical model: the hyperparameters are updated at each EM step until convergence.

Slide 33: Image Restoration by the Markov Random Field Model and Conventional Filters
Mean squared errors (MSE) of the restored images:

  Method                     MSE
  Statistical method (MRF)   315
  Lowpass filter (3x3)       388
  Lowpass filter (5x5)       413
  Median filter (3x3)        486
  Median filter (5x5)        445

[Figure: original image, degraded image, and images restored by the MRF model, the lowpass filter and the median filter.] (Ω: set of all pixels.)

Slide 34: Contents (section divider: 4. Belief Propagation)

Slide 35: Graphical Representation for Tractable Models
Tractable model: a tree graph (e.g. on nodes A, B, C, D), for which each summation can be carried out independently. Intractable model: a cycle graph (e.g. on nodes A, B, C), for which it is hard to carry out each summation independently.

Slide 36: Belief Propagation for Tree Graphical Models
[Figure: a five-node tree.] After taking the summations over nodes 3, 4 and 5, the function of nodes 1 and 2 can be expressed in terms of messages.

Slide 37: Belief Propagation for Tree Graphical Models
By taking the summation over all the nodes except node 1, the message from node 2 to node 1 can be expressed in terms of all the messages incoming to node 2 except the message from node 1 itself.

Slide 38: Loopy Belief Propagation for Graphical Models in Image Processing
The graphical model for image processing is represented by the square lattice, which includes many cycles, so belief propagation is applied as an approximate algorithm for calculating statistical quantities. Every subgraph consisting of a pixel and its four neighbouring pixels can be regarded as a tree graph; this is the basis of loopy belief propagation.

Slide 39: Loopy Belief Propagation for Graphical Models in Image Processing
Message-passing rule in loopy belief propagation: the averages, variances and covariances of the graphical model are expressed in terms of the messages.
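A discrete-state sketch of the loopy message-passing rule on a small lattice; the coupling and evidence strengths, the lattice size and the toy data are illustrative assumptions (the slides' Gaussian model passes continuous messages instead).

```python
import math

# Loopy BP for a binary MRF on an n x n lattice: each pixel keeps one
# message per incoming edge, and each outgoing message combines the local
# evidence with the three other incoming messages.
n = 6
J, h = 0.8, 1.0                                        # illustrative strengths
g = [[1 if (i + j) % 3 == 0 else 0 for j in range(n)] for i in range(n)]  # toy data

def nbrs(i, j):
    for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
        if 0 <= a < n and 0 <= b < n:
            yield a, b

psi = [[math.exp(J if s == t else 0) for t in (0, 1)] for s in (0, 1)]

def phi(i, j, s):                                      # local evidence from g
    return math.exp(h if s == g[i][j] else 0)

msg = {((a, b), (i, j)): [0.5, 0.5]
       for i in range(n) for j in range(n) for a, b in nbrs(i, j)}

for _ in range(30):                  # iterate the message-passing rule
    new = {}
    for (src, dst) in msg:
        i, j = src
        out = [0.0, 0.0]
        for s in (0, 1):
            p = phi(i, j, s)
            for k in nbrs(i, j):
                if k != dst:         # 3 incoming messages -> 1 outgoing
                    p *= msg[(k, src)][s]
            for t in (0, 1):
                out[t] += psi[s][t] * p
        z = out[0] + out[1]
        new[(src, dst)] = [out[0] / z, out[1] / z]
    msg = new

def belief(i, j):                    # approximate pixel marginal
    b = [phi(i, j, s) for s in (0, 1)]
    for k in nbrs(i, j):
        b = [b[s] * msg[(k, (i, j))][s] for s in (0, 1)]
    z = b[0] + b[1]
    return [b[0] / z, b[1] / z]

marginals = [[belief(i, j) for j in range(n)] for i in range(n)]
```

Because the lattice has cycles, these beliefs are approximations; on the Gaussian model of the previous section the analogous messages carry a mean and a variance instead of a two-component table.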

Slide 40: Loopy Belief Propagation for Graphical Models in Image Processing
We have four kinds of message-passing rules for each pixel, one per outgoing direction. Each message-passing rule combines 3 incoming messages into 1 outgoing message. [Figure: visualizations of the passing messages.]

Slide 41: EM Algorithm by Means of Belief Propagation
The EM algorithm for hyperparameter estimation takes the degraded image as input and produces the restored image as output; at each EM step, the required statistical quantities are computed by the update rule of loopy belief propagation.

Slide 42: Probabilistic Image Processing by the EM Algorithm and Loopy BP for the Gaussian Graphical Model
Loopy belief propagation: MSE = 327. Exact calculation: MSE = 315.

Slide 43: Contents (section divider: 5. Other Applications)

Slide 44: Digital Image Inpainting Based on MRF
[Figure: input image with missing regions and the inpainted output.] Markov random fields can also be applied to digital image inpainting (M. Yasuda, J. Ohkubo and K. Tanaka: Proceedings of CIMCA&IAWTIC 2005).

Slide 45: Contents (section divider: 6. Concluding Remarks)

Slide 46: Summary
The formulation of probabilistic models for image processing by means of conventional statistical schemes has been summarized. Probabilistic image processing using the Gaussian graphical model has been shown as the most basic example. It has been explained how to construct a belief propagation algorithm for image processing.

Slide 47: Statistical Mechanics Informatics for Probabilistic Image Processing
S. Geman and D. Geman (1984), IEEE Transactions on PAMI: image processing with MRFs (simulated annealing, line fields). J. Zhang (1992), IEEE Transactions on Signal Processing: image processing with the EM algorithm for MRFs (mean-field methods). K. Tanaka and T. Morita (1995), Physics Letters A: cluster variation method for MRFs in image processing. The original ideas behind several of these techniques, including simulated annealing, mean-field methods and belief propagation, are often rooted in statistical mechanics. The mathematical structure of belief propagation is equivalent to the Bethe approximation and the cluster variation method (Kikuchi method), which are advanced mean-field methods in statistical mechanics.

Slide 48: Statistical Mechanical Informatics for Probabilistic Information Processing
It has been suggested that statistical performance estimation for probabilistic information processing is closely related to spin glass theory. The computational techniques of spin glass theory have been applied to many problems in computer science: error-correcting codes (Y. Kabashima and D. Saad: J. Phys. A, 2004, Topical Review), CDMA multiuser detection in mobile phone communication (T. Tanaka: IEEE Trans. Information Theory, 2002), SAT problems (Mezard, Parisi, Zecchina: Science, 2002), and image processing (K. Tanaka: J. Phys. A, 2002, Topical Review).

Slide 49: SMAPIP Project
MEXT Grant-in-Aid for Scientific Research on Priority Areas. Period: 2002-2005. Head investigator: Kazuyuki Tanaka. Webpage: http://www.smapip.eei.metro-u.ac.jp./ Members: K. Tanaka, Y. Kabashima, H. Nishimori, T. Tanaka, M. Okada, O. Watanabe, N. Murata, ...

Slide 50: DEX-SMI Project
Deepening and Expansion of Statistical Mechanical Informatics (DEX-SMI). MEXT Grant-in-Aid for Scientific Research on Priority Areas. Period: 2006-2009. Head investigator: Yoshiyuki Kabashima. Webpage: http://dex-smi.sp.dis.titech.ac.jp/DEX-SMI/

Slide 51: References
K. Tanaka: Statistical-Mechanical Approach to Image Processing (Topical Review), J. Phys. A, 35 (2002). A. S. Willsky: Multiresolution Markov Models for Signal and Image Processing, Proceedings of the IEEE, 90 (2002).

