
Advanced Image Processing Image Relaxation – Restoration and Feature Extraction 02/02/10.


1 Advanced Image Processing Image Relaxation – Restoration and Feature Extraction 02/02/10

2 Homework discussion
On edge detectors
–Double edge: does Sobel or Prewitt produce a double edge? Does Prewitt behave the same in all directions?
–Does Marr-Hildreth produce closed contours?
–Does Sobel provide better noise-suppression characteristics than Prewitt?
–Is zero-crossing more accurate than the gradient?
–Which detectors are gradient-based? Which use zero-crossings (i.e., the 2nd derivative)?
–Does a large sigma increase the weight of neighborhood pixels?
Misc
–Provide parameter-selection information
–Observations should be based on your results
–Use LaTeX
–Include references
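A small NumPy sketch for probing the Sobel/Prewitt questions above. The kernels are the standard x-direction operators; the step image and the loop-based valid-mode correlation are illustrative choices, not part of the assignment:

```python
import numpy as np

# Sobel and Prewitt horizontal-gradient kernels; Sobel weights the
# center row by 2, giving extra smoothing along the edge direction.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], float)
prewitt_x = np.array([[-1, 0, 1],
                      [-1, 0, 1],
                      [-1, 0, 1]], float)

def apply_kernel(img, k):
    """Valid-mode 2-D correlation (no padding)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i+3, j:j+3] * k)
    return out

# A vertical step edge: both operators respond with one peak straddling
# the transition, not two separate (double) edges.
step = np.zeros((5, 6))
step[:, 3:] = 1.0
gs = apply_kernel(step, sobel_x)
gp = apply_kernel(step, prewitt_x)
print(gs[1])   # [0. 4. 4. 0.]
print(gp[1])   # [0. 3. 3. 0.]
```

Feeding the same operators a noisy image (and a transposed kernel for the y-direction) is one way to check the noise-suppression and directionality questions empirically.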

3 Derivatives

4 http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/OWENS/LECT6/node2.html

5 Outline
Image restoration as a relaxation method
–MAP (maximum a-posteriori probability)
–MFA (mean field annealing)
–VCD (variable conductance diffusion, or anisotropic diffusion)
Image restoration as a blind source separation problem
–Lidan’s CVPR paper
Non-iterative image restoration

6 Relaxation method
Relaxation: a multistep algorithm with the properties that
–The output of a single step is of the same form as the input, so that the step can be applied iteratively
–It converges to a bounded result
–The operation on any element depends only on its neighbors
Restoration as a relaxation algorithm
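A toy step satisfying all three properties, using 4-neighbor averaging (an illustrative choice of local operation, not the restoration algorithm itself):

```python
import numpy as np

def relax_step(f):
    """One relaxation step: each interior pixel moves to the mean of its
    4-neighbors. The output is again an image (same form as the input),
    so the step can be iterated, and each update is purely local."""
    g = f.copy()
    g[1:-1, 1:-1] = 0.25 * (f[:-2, 1:-1] + f[2:, 1:-1] +
                            f[1:-1, :-2] + f[1:-1, 2:])
    return g

f = np.random.default_rng(0).random((8, 8))
for _ in range(200):
    f = relax_step(f)
# The iterates stay bounded in [0, 1] and converge (here, to the smooth
# interior determined by the fixed border values).
```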

7 Image restoration
The degradation model
An ill-posed inverse problem
–Well-posed vs. ill-posed
–Ill-conditioning
–Condition number
Regularization theory
–Maximum A-Posteriori probability (MAP)
–Mean Field Annealing (MFA)

8 Degradation model
g(x, y) = H[f(x, y)] + η(x, y), where f is the original image, H the degradation operator, and η additive noise
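A minimal simulation of the model; the 3×3 box blur for H and the Gaussian η are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
f = np.zeros((16, 16))
f[6:10, 6:10] = 1.0                      # original image f(x, y): a bright square

# H: here an illustrative 3x3 box blur, applied as local averaging
blur = np.ones((3, 3)) / 9.0
g_blur = np.zeros_like(f)
for i in range(1, 15):
    for j in range(1, 15):
        g_blur[i, j] = np.sum(f[i-1:i+2, j-1:j+2] * blur)

eta = 0.05 * rng.standard_normal(f.shape)   # additive noise eta(x, y)
g = g_blur + eta                            # degraded observation g(x, y)
```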

9 Regularization theory
Generally speaking, any regularization method analyzes a related well-posed problem whose solution approximates that of the original ill-posed problem. Well-posedness is achieved by implementing one or more of the following basic ideas:
–restriction of the data;
–change of the space and/or topologies;
–modification of the operator itself;
–the concept of regularization operators; and
–well-posed stochastic extensions of ill-posed problems.

10 Image restoration – An ill-posed problem
In the degradation model, H is ill-conditioned, which makes image restoration an ill-posed problem
–The solution is not stable

11 Ill-conditioning

12 Example (figure): restoration results comparing noise-free vs. sinusoidal-noise input, and restoration with the exact H vs. an inexact H
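The slide's effect can be reproduced numerically; a sketch assuming a 1-D periodic blur as a toy stand-in for H:

```python
import numpy as np

n = 63
# H: illustrative 1-D periodic blur (each sample averaged with its neighbors)
H = np.zeros((n, n))
for i in range(n):
    H[i, i] = 0.5
    H[i, (i - 1) % n] = 0.25
    H[i, (i + 1) % n] = 0.25

f = np.sin(2 * np.pi * np.arange(n) / n)                 # true signal
g_clean = H @ f                                          # exact, noise-free data
g_noisy = g_clean + 1e-3 * np.random.default_rng(2).standard_normal(n)

cond = np.linalg.cond(H)          # large (order 10^3 here): H is ill-conditioned
f_clean = np.linalg.solve(H, g_clean)   # noise-free data, exact H: recovery works
f_noisy = np.linalg.solve(H, g_noisy)   # tiny noise is strongly amplified
```

The large condition number means direct inversion magnifies the noise far beyond its original level, which is exactly the instability the slide illustrates.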

13 MAP
Bayes’ rule
The noise term
–The noise probability distribution
The prior term
–MRF and Gibbs distribution
–Different models of smoothness for modeling the prior energy: piece-wise constant (flat surface), piece-wise planar surface, quadratic surface
Gradient descent

14 Solution formulation
For g = Hf + η, the regularization method constructs the solution as
f* = argmin_f [ u(f, g) + λ v(f) ]
–u(f, g) describes how the real image is related to the degraded data g; in other words, this term models the characteristics of the imaging system.
–λ v(f) is the regularization term, with the regularization operator v operating on the original image f and the regularization parameter λ used to tune the weight of the regularization term.
–By adding the regularization term, the original ill-posed problem turns into a well-posed one; that is, the regularization operator puts constraints on what f might be, which makes the solution more stable.
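A sketch of this construction with Tikhonov-style choices assumed for the demo: u(f, g) = ||Hf − g||², v a first-difference operator, and an illustrative λ = 0.01, solved via the normal equations:

```python
import numpy as np

n = 63
# H: same illustrative 1-D periodic blur as before
H = np.zeros((n, n))
for i in range(n):
    H[i, i] = 0.5
    H[i, (i - 1) % n] = 0.25
    H[i, (i + 1) % n] = 0.25

# v: circular first-difference operator (penalizes non-smooth solutions)
D = np.roll(np.eye(n), -1, axis=1) - np.eye(n)

f = np.sin(2 * np.pi * np.arange(n) / n)
g = H @ f + 1e-3 * np.random.default_rng(3).standard_normal(n)

lam = 1e-2                                # regularization parameter (illustrative)
f_naive = np.linalg.solve(H, g)           # u(f, g) alone: unstable
# argmin ||Hf - g||^2 + lam ||Df||^2  via  (H^T H + lam D^T D) f = H^T g
f_reg = np.linalg.solve(H.T @ H + lam * D.T @ D, H.T @ g)
```

The regularized solution trades a small amount of bias for a large gain in stability against the noise.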

15 MAP (maximum a-posteriori probability)
Formulated from a statistical point of view: the MAP approach finds the estimate of the image f that maximizes the a-posteriori probability p(f|g). According to Bayes' rule, p(f|g) = p(g|f) p(f) / p(g), where
–p(f) is the a-priori probability of the unknown image f; we call it the prior model
–p(g) is the probability of g, which is a constant once g is given
–p(g|f) is the conditional probability density function (pdf) of g; we call it the sensor model, a description of the noisy or stochastic processes that relate the original unknown image f to the measured image g

16 MAP – Derivation
Bayesian interpretation of regularization theory
–Noise term
–Prior term
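The slide's equations did not survive extraction; a sketch of the standard derivation it describes: taking the negative log of Bayes' rule (and dropping the constant p(g)) splits the MAP objective into the noise term and the prior term.

```latex
\hat{f} = \arg\max_f \, p(f \mid g)
        = \arg\max_f \, \frac{p(g \mid f)\, p(f)}{p(g)}
        = \arg\min_f \Bigl[\underbrace{-\ln p(g \mid f)}_{\text{noise term}}
          \;\underbrace{-\,\ln p(f)}_{\text{prior term}}\Bigr]
```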

17 Noise Term
Assume Gaussian noise with zero mean and standard deviation σ
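Under this assumption the sensor model is a product of independent Gaussians, so the noise term reduces to a quadratic data-fit penalty; a sketch:

```latex
p(g \mid f) = \prod_{x,y} \frac{1}{\sqrt{2\pi}\,\sigma}
  \exp\!\left(-\frac{\bigl(g(x,y) - (Hf)(x,y)\bigr)^2}{2\sigma^2}\right)
\quad\Longrightarrow\quad
-\ln p(g \mid f) = \frac{\lVert g - Hf \rVert^2}{2\sigma^2} + \text{const}
```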

18 Prior Model
The a-priori probability of an image under a Gibbs distribution is defined as p(f) = (1/Z) exp(−U(f)/T), where
–U(f) is the energy function
–T is the temperature of the model
–Z is a normalization constant

19 Prior Model (cont’)
U(f), the prior energy function, is usually formulated based on the smoothness property of the original image; therefore, U(f) should measure the extent to which smoothness is violated
–It punishes differences between neighboring pixels

20 Prior Model (cont’)
λ is the parameter that adjusts how smooth the restored image becomes
The k-th derivative models the difference between neighboring pixels; it can also be approximated by convolution with the right kernel

21 Prior Model – Kernel r
The Laplacian kernel
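A sketch of the prior energy built from the Laplacian kernel r (the toy images are illustrative): flat and planar surfaces incur zero energy, matching the smoothness models listed on slide 13, while a noisy image is punished.

```python
import numpy as np

# Discrete Laplacian kernel r: (r * f) approximates the 2nd derivative,
# so U(f) = sum((r * f)^2) punishes differences between neighboring pixels.
r = np.array([[0,  1, 0],
              [1, -4, 1],
              [0,  1, 0]], float)

def prior_energy(f):
    """U(f): sum of squared Laplacian responses over the interior."""
    h, w = f.shape
    u = 0.0
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            u += np.sum(f[i-1:i+2, j-1:j+2] * r) ** 2
    return u

flat = np.full((8, 8), 3.0)                    # constant image: zero energy
ramp = np.outer(np.arange(8.0), np.ones(8))    # planar surface: also zero
noisy = flat + np.random.default_rng(4).random((8, 8))   # punished
```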

22 The Objective Function
Use gradient descent to solve for f
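A minimal gradient-descent sketch of such an objective, assuming H = identity (pure denoising) and a quadratic first-difference prior; λ, the step size, and the iteration count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 64
clean = np.where(np.arange(n) < n // 2, 0.0, 1.0)   # piece-wise constant signal
g = clean + 0.1 * rng.standard_normal(n)            # observed data (H = identity)

lam, alpha = 1.0, 0.05    # prior weight and gradient-descent step (illustrative)

def energy(f):
    # Objective: quadratic noise term + lam * quadratic smoothness prior
    return 0.5 * np.sum((f - g) ** 2) + lam * np.sum(np.diff(f) ** 2)

f = g.copy()
for _ in range(500):
    d = np.diff(f)
    grad = f - g                  # gradient of the noise (data) term
    grad[1:] += 2 * lam * d       # gradient of the prior term,
    grad[:-1] -= 2 * lam * d      #   d/df[i] of sum (f[i+1]-f[i])^2
    f -= alpha * grad             # gradient-descent update
```

Each step lowers the objective, producing a smoother estimate than the raw observation.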

23 MFA
Compared to SA (simulated annealing)
–An optimization scheme
–Annealing combined with gradient descent
–Avoids local minima
In the prior term, the temperature T is gradually lowered as the annealing proceeds

