Presentation transcript:

1 Bayesian Restoration Using a New Nonstationary Edge-Preserving Image Prior
Giannis K. Chantas, Nikolaos P. Galatsanos, and Aristidis C. Likas
IEEE Transactions on Image Processing, Vol. 15, No. 10, October 2006

2 Outline
- Review of Markov random fields (MRFs) for the signal restoration problem
- Bayesian restoration using a new nonstationary edge-preserving image prior

3 MAP formulation for the signal restoration problem
[Figure: the noisy observed signal d and the restored signal f]

4 MAP formulation for the signal restoration problem
The signal restoration problem can be posed as a MAP estimation problem:
\hat{f} = \arg\max_f p(f \mid d) = \arg\max_f p(d \mid f)\, p(f),
where p(d \mid f) is the observation model and p(f) is the prior model.

5 MAP formulation for the signal restoration problem
Assume the observation is the true signal plus independent Gaussian noise, that is, d = f + n with n \sim N(0, \sigma_n^2 I). Assume the unknown signal f is an MRF; its prior is then a Gibbs distribution,
p(f) \propto \exp\left( -\sum_c V_c(f) \right),
where the V_c are clique potential functions.

6 MAP formulation for the signal restoration problem
Substituting the above models into the MAP estimator and taking the negative logarithm, we get
\hat{f} = \arg\min_f \left\{ \frac{1}{2\sigma_n^2}\|d - f\|^2 + \sum_c V_c(f) \right\},
where the first term comes from the observation model (a similarity/data-fidelity measure) and the second from the prior model (a reconstruction constraint, i.e., regularization).
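For illustration, here is a minimal sketch of this quadratic MAP restoration for a 1-D denoising problem, assuming a Gaussian (quadratic) first-difference prior; the function name and parameter values are illustrative, not from the paper.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def restore_map_quadratic(d, sigma2=0.01, lam=5.0):
    """MAP restoration of a 1-D signal under the model d = f + n,
    n ~ N(0, sigma2*I), with a quadratic smoothness prior on the
    first differences of f.  Minimizing
        ||d - f||^2 / (2*sigma2) + lam * ||D f||^2
    leads to the linear system (I/sigma2 + 2*lam*D^T D) f = d/sigma2.
    """
    n = len(d)
    # First-order difference operator D of size (n-1) x n.
    D = sparse.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1], shape=(n - 1, n))
    A = sparse.eye(n) / sigma2 + 2.0 * lam * (D.T @ D)
    return spsolve(A.tocsc(), d / sigma2)

# Example: denoise a noisy step signal.
rng = np.random.default_rng(0)
f_true = np.concatenate([np.zeros(50), np.ones(50)])
d = f_true + 0.1 * rng.standard_normal(100)
f_hat = restore_map_quadratic(d)
```

Running this on a step signal shows the step being rounded off: the quadratic prior penalizes the edge as heavily as it penalizes noise, which is exactly the problem discussed on the next slide.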

7 MAP formulation for the signal restoration problem
From the potential-function point of view: with a stationary quadratic potential, edge regions are blurred, because such a prior penalizes large intensity differences in the same way regardless of whether they are caused by noise or by genuine edges.

8 MRF with pixel process and line process (Geman and Geman, 1984)
- Lattice of pixel sites S^P, with real-valued labels f_i^P
- Lattice of line sites S^E, with binary labels f_{ii'}^E (0 or 1)
This gives a compound MRF prior with an indicator (line process): setting f_{ii'}^E = 1 switches off the smoothness penalty between neighboring pixels i and i', so the edge between them is preserved.
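A minimal sketch of such a compound-MRF energy for a 1-D signal, with an explicit binary line variable between each pair of neighbors; the penalty weights lam and gamma are illustrative.

```python
import numpy as np

def compound_mrf_energy(f, lines, lam=1.0, gamma=0.5):
    """Energy of a 1-D compound MRF in the spirit of Geman and Geman:
    a real-valued pixel process f and a binary line process `lines`
    (length len(f) - 1).  Where lines[i] == 1 the smoothness penalty
    between f[i] and f[i+1] is switched off, at a fixed cost gamma
    for declaring an edge.
    """
    diff2 = (f[1:] - f[:-1]) ** 2
    smoothness = lam * np.sum((1 - lines) * diff2)  # penalty only where no edge
    edge_cost = gamma * np.sum(lines)               # price paid per active line
    return smoothness + edge_cost
```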

9 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
From the image-modeling point of view, the binary nature (0 or 1) of the line process in the previous prior is insufficient to capture image variations: of two edge patterns, edge pattern 2 may be much sharper than edge pattern 1, yet a binary line variable treats them identically.

10 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
A linear imaging model is assumed in this paper, that is,
d = H f + n,
where d is the observed (degraded) image, H is the known degradation (blurring) operator, f is the unknown image, and n is zero-mean white Gaussian noise.
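A small sketch of how such an observation might be simulated; the uniform blur and noise level are illustrative choices, not necessarily those used in the paper's experiments.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def degrade(f, noise_std=0.01, blur_size=5, seed=0):
    """Simulate the linear imaging model d = H f + n, with H chosen here
    as a uniform (box) blur and n additive white Gaussian noise.
    """
    rng = np.random.default_rng(seed)
    blurred = uniform_filter(f, size=blur_size, mode="reflect")
    return blurred + noise_std * rng.standard_normal(f.shape)
```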

11 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
For the image prior model, the authors use the first-order differences (residuals) of the image f in four directions, 0°, 90°, 45°, and 135°, i.e., the differences between each pixel f(i,j) and its horizontal, vertical, and two diagonal neighbors within a 3x3 patch:

f(i-1,j-1)  f(i-1,j)  f(i-1,j+1)
f(i,j-1)    f(i,j)    f(i,j+1)
f(i+1,j-1)  f(i+1,j)  f(i+1,j+1)

Here f(i,j) denotes the intensity at location (i,j).
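A minimal sketch of these four directional difference images; the exact neighbor and sign conventions (and the periodic boundary handling of np.roll) are illustrative assumptions rather than the paper's exact definitions.

```python
import numpy as np

def directional_differences(f):
    """Return the four first-order difference images of a 2-D array f
    along the 0, 90, 45 and 135 degree directions (np.roll wraps around
    at the borders, which real implementations would handle explicitly).
    """
    eps0   = f - np.roll(f, -1, axis=1)                        # horizontal neighbor
    eps90  = f - np.roll(f, -1, axis=0)                        # vertical neighbor
    eps45  = f - np.roll(np.roll(f, -1, axis=0), 1, axis=1)    # one diagonal neighbor
    eps135 = f - np.roll(np.roll(f, -1, axis=0), -1, axis=1)   # other diagonal neighbor
    return eps0, eps90, eps45, eps135
```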

12 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
The previous equations can also be written in matrix–vector form for the entire image, that is,
\epsilon^k = Q_k f, \quad k = 1, \dots, 4,
where f is the lexicographically ordered image vector and Q_k applies the first-order difference operator in direction k.

13 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
For convenience, the authors introduce compact notation that collects the residuals \epsilon_i^k and the corresponding difference operators over all four directions.

14 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
Assume that the residuals \epsilon_i^k, for each direction k and each pixel location i, are independent zero-mean Gaussians with spatially varying inverse variances (precisions) a_i^k. The joint density of the residuals is then
p(\epsilon) = \prod_{k=1}^{4} \prod_{i=1}^{N} \left( \frac{a_i^k}{2\pi} \right)^{1/2} \exp\left( -\tfrac{1}{2} a_i^k (\epsilon_i^k)^2 \right).
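As a sketch, the corresponding negative log-density (the prior energy, up to constants) can be evaluated directly from the four residual images and their per-pixel precision maps; the function below is illustrative and not taken from the paper.

```python
import numpy as np

def nonstationary_prior_energy(eps_list, a_list):
    """Negative log of the nonstationary residual model, up to constants:
    0.5 * sum_k sum_i [ a_i^k * (eps_i^k)^2 - log(a_i^k) ],
    where eps_list holds the four directional difference images and
    a_list the corresponding per-pixel precision images a_i^k.
    """
    energy = 0.0
    for eps, a in zip(eps_list, a_list):
        energy += 0.5 * np.sum(a * eps ** 2 - np.log(a))
    return energy
```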

15 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
Using the fact that \epsilon^k = Q_k f, we obtain the pdf of the image f (up to a normalizing constant):
p(f) \propto \prod_{k=1}^{4} \prod_{i=1}^{N} (a_i^k)^{1/2} \exp\left( -\tfrac{1}{2} \sum_{k=1}^{4} \sum_{i=1}^{N} a_i^k \left[ (Q_k f)_i \right]^2 \right).
However, the proposed model is over-parameterized: it introduces 4N precision parameters a_i^k for an N-pixel image, far too many to estimate reliably from the data.

16 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
To overcome the over-parameterization problem, the authors view each a_i^k as a random variable rather than a deterministic parameter and place a Gamma hyperprior on it,
p(a_i^k) \propto (a_i^k)^{l_k - 1} \exp\left( -m_k a_i^k \right),
where l_k and m_k are the parameters of the hyperprior (written here in generic shape–rate form; the paper uses its own parameterization in terms of l_k and m_k).
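Putting the pieces together, the full hierarchical model can be summarized as below, again using the generic shape–rate Gamma form rather than the paper's exact parameterization.

```latex
\begin{aligned}
d &= H f + n, \qquad n \sim \mathcal{N}(0, \sigma_n^2 I) \\
\epsilon^k &= Q_k f, \qquad \epsilon_i^k \mid a_i^k \sim \mathcal{N}\!\left(0, (a_i^k)^{-1}\right), \quad k = 1, \dots, 4 \\
a_i^k &\sim \mathrm{Gamma}(l_k, m_k)
\end{aligned}
```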

17 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
More on the Gamma hyperprior for a_i^k. [Figure: pdfs of the hyperprior, contrasting the stationary prior with the nonstationary prior.]

18 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
MAP estimation: maximizing the posterior p(f, a | d) is equivalent to minimizing the negative log-posterior J_MAP(f, a).
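A minimal sketch of what such a MAP scheme can look like for 1-D denoising (H = I), alternating closed-form updates of f and of the precisions a; the a-update below is derived for the generic shape–rate Gamma hyperprior used above, so it is illustrative rather than the paper's exact update equation.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def map_nonstationary_denoise(d, sigma2=0.01, l=2.2, m=0.05, n_iter=20):
    """Alternating MAP-style scheme for 1-D denoising with a nonstationary
    first-difference prior:
      f-step: solve (I/sigma2 + D^T diag(a) D) f = d/sigma2
      a-step: a_i = (2*l - 1) / (2*m + eps_i^2), the maximizer of the
              Gamma-regularized log-posterior for eps = D f.
    Large residuals get small precisions, so the smoothness penalty is
    relaxed exactly where edges are, instead of being switched off by a
    binary line variable.
    """
    n = len(d)
    D = sparse.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1], shape=(n - 1, n))
    f = d.copy()
    a = np.ones(n - 1)
    for _ in range(n_iter):
        A = sparse.eye(n) / sigma2 + D.T @ sparse.diags(a) @ D
        f = spsolve(A.tocsc(), d / sigma2)
        eps = D @ f
        a = (2.0 * l - 1.0) / (2.0 * m + eps ** 2)
    return f
```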

19 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
Bayesian algorithm: since we are interested in the image f rather than in the precisions a_i^k, the a_i^k are marginalized out,
p(f \mid d) \propto \int p(d \mid f)\, p(f \mid a)\, p(a)\, da,
and the image is estimated as the mode of this marginal posterior.
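For intuition, integrating a single precision out of one residual term (with the generic shape–rate Gamma above) gives a heavy-tailed, Student-t-like density:

```latex
\int_0^\infty \mathcal{N}\!\left(\epsilon \mid 0, a^{-1}\right)\,
\mathrm{Gamma}(a \mid l, m)\, da
\;\propto\; \left( 1 + \frac{\epsilon^2}{2m} \right)^{-(l + 1/2)}
```

Such a marginal penalizes large differences far less severely than a quadratic potential, which is the mechanism behind the edge-preserving behavior of the nonstationary prior.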

20 MRF with nonstationary image prior (G. K. Chantas, N. P. Galatsanos, and A. C. Likas, 2006)
Definition of the improvement in signal-to-noise ratio (ISNR):
ISNR = 10 \log_{10} \left( \frac{\|f - d\|^2}{\|f - \hat{f}\|^2} \right) \ \mathrm{dB},
where f is the original image, d the degraded observation, and \hat{f} the restored image.
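A one-line helper implementing this definition:

```python
import numpy as np

def isnr(original, degraded, restored):
    """Improvement in SNR in dB: 10*log10(||f - d||^2 / ||f - f_hat||^2)."""
    num = np.sum((original - degraded) ** 2)
    den = np.sum((original - restored) ** 2)
    return 10.0 * np.log10(num / den)
```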

21 Experimental comparison (figure panels):
- Original image
- Degraded image
- Wiener filter, ISNR: 3.2 dB
- CLS, ISNR: 4.65 dB
- Bayesian nonstationary, ISNR: 5.22 dB, l = 2.2
- MAP nonstationary, ISNR: 5.63 dB, l = 2.2
