
Slide 1
Signal- und Bildverarbeitung, 323.014: Image Analysis and Processing
Arjan Kuijper, 23.11.2006
Johann Radon Institute for Computational and Applied Mathematics (RICAM), Austrian Academy of Sciences
Altenbergerstraße 56, A-4040 Linz, Austria
arjan.kuijper@oeaw.ac.at, www.ricam.oeaw.ac.at

Slide 2: Last week
The diffusion can be made locally adaptive to image structure. Three mathematical approaches were discussed:
1. PDE-based nonlinear diffusion, where the luminance function evolves as the divergence of some flow;
2. Evolution of the isophotes as an example of curve evolution;
3. Variational methods, minimizing an energy functional defined on the image.
The nonlinear PDEs involve local image derivatives and cannot be solved analytically. Adaptive smoothing requires geometric reasoning to define the influence on the diffusivity coefficient. The simplest equation is the one proposed by Perona & Malik, where the variable conduction is a function of the local edge strength. Strong gradient magnitudes prevent blurring locally; the effect is edge-preserving smoothing. The Perona & Malik equation deblurs (enhances) edges stronger than the turnover point k and blurs weaker ones.
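As a reminder, one explicit Perona-Malik step can be sketched in a few lines of NumPy. This is an illustration, not the course code; it uses the diffusivity g(s) = 1/(1 + (s/k)^2), one of the two functions proposed in the original Perona-Malik paper, and periodic borders via np.roll for brevity:

```python
import numpy as np

def perona_malik_step(L, k=0.1, dt=0.1):
    """One explicit Perona-Malik step: L_t = div( g(|grad L|) grad L ),
    with g(s) = 1 / (1 + (s/k)^2), which suppresses diffusion across
    strong edges (|grad L| > k) and smooths weaker ones."""
    # One-sided differences to the four neighbours (periodic borders).
    dN = np.roll(L, 1, axis=0) - L
    dS = np.roll(L, -1, axis=0) - L
    dW = np.roll(L, 1, axis=1) - L
    dE = np.roll(L, -1, axis=1) - L
    g = lambda d: 1.0 / (1.0 + (d / k) ** 2)
    # Sum of diffusivity-weighted neighbour fluxes approximates the divergence.
    return L + dt * (g(dN) * dN + g(dS) * dS + g(dW) * dW + g(dE) * dE)
```

Because the flux between each pair of neighbours is antisymmetric, the update conserves the total (and hence average) intensity, in line with the divergence form of the flow.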

Slide 3: Today
Total Variation
- Rudin-Osher-Fatemi (ROF) model
- Denoising
- Edge preserving
- Energy minimizing
- Bounded variation
Taken from:

Slide 4
Let the observed intensity function $u_0(x, y)$ denote the pixel values of a noisy image for $(x, y) \in \Omega$. Let $u(x, y)$ denote the desired clean image, so $u_0 = u + n$ with $n$ additive white $(0, \sigma^2)$ noise. With
$$H(u) = \int_\Omega |\nabla u| \, dx \, dy, \qquad I_1 = \int_\Omega (u - u_0) \, dx \, dy, \qquad I_2 = \tfrac{1}{2}\int_\Omega (u - u_0)^2 \, dx \, dy - \sigma^2,$$
the constrained minimization problem is
$$\min(H \mid I_1, I_2) = \min(H - \lambda_1 I_1 - \lambda_2 I_2).$$

Slide 5
We arrive at the Euler-Lagrange equations
$$0 = \delta H - \lambda_1 \, \delta I_1 - \lambda_2 \, \delta I_2.$$
Integrating this expression over $\Omega$ gives $\lambda_1 = 0$, so the average intensity is preserved.

Slide 6
The solution procedure uses a parabolic equation with time as an evolution parameter or, equivalently, the gradient descent method. This means that we solve the corresponding evolution equation.
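The evolution equation itself did not survive extraction; in the original Rudin-Osher-Fatemi formulation (with $\lambda_1 = 0$ and writing $\lambda$ for $\lambda_2$) it is the gradient-descent flow

```latex
u_t = \nabla \cdot \left( \frac{\nabla u}{|\nabla u|} \right) - \lambda \, (u - u_0),
\qquad t > 0,\ (x, y) \in \Omega,
```

with initial condition $u(x, y, 0) = u_0(x, y)$ and homogeneous Neumann boundary condition $\partial u / \partial n = 0$ on $\partial\Omega$.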

Slide 7
We must compute $\lambda(t)$. We simply multiply the equation by $(u - u_0)$ and integrate by parts over $\Omega$. We then obtain an explicit expression for $\lambda(t)$.
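The resulting expression, as given in the ROF paper (using the noise constraint $\tfrac{1}{2}\int_\Omega (u - u_0)^2 \, dx \, dy = \sigma^2$), is

```latex
\lambda(t) = -\frac{1}{2\sigma^2} \int_\Omega \left[
  \sqrt{u_x^2 + u_y^2}
  \;-\; \frac{u_{0x} u_x + u_{0y} u_y}{\sqrt{u_x^2 + u_y^2}}
\right] dx \, dy,
```

which follows because multiplying the flow by $(u - u_0)$ and integrating by parts turns the divergence term into $-\int_\Omega \nabla(u - u_0) \cdot \nabla u / |\nabla u|$, while the fidelity term produces $2\sigma^2 \lambda$.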

Slide 8
The numerical method in two spatial dimensions is as follows:

Slide 9
The numerical approximation is

Slide 10
and $\lambda^n$ is defined discretely, via the discrete analogue of the expression for $\lambda(t)$. A step-size restriction is imposed for stability.
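The explicit scheme of the ROF paper can be sketched in NumPy. This is a hedged sketch under assumptions: grid spacing h = 1, replicated borders to realize the no-flux boundary condition, a small eps to regularize the division, and λ passed in as a fixed parameter rather than recomputed each step:

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter m(a, b) used by the ROF scheme for the transverse
    difference: picks the smaller-magnitude value when signs agree, else 0."""
    return 0.5 * (np.sign(a) + np.sign(b)) * np.minimum(np.abs(a), np.abs(b))

def rof_step(u, u0, lam, dt=0.1, eps=1e-8):
    """One explicit step of u <- u + dt*div(grad u / |grad u|) - dt*lam*(u - u0),
    with forward-difference fluxes and backward-difference divergence."""
    up = np.pad(u, 1, mode='edge')            # replicate borders (no-flux)
    dxp = up[1:-1, 2:] - up[1:-1, 1:-1]       # forward x-difference
    dxm = up[1:-1, 1:-1] - up[1:-1, :-2]      # backward x-difference
    dyp = up[2:, 1:-1] - up[1:-1, 1:-1]       # forward y-difference
    dym = up[1:-1, 1:-1] - up[:-2, 1:-1]      # backward y-difference
    # Fluxes grad u / |grad u|, with minmod-limited transverse differences.
    gx = dxp / np.sqrt(eps + dxp**2 + minmod(dyp, dym)**2)
    gy = dyp / np.sqrt(eps + dyp**2 + minmod(dxp, dxm)**2)
    # Backward difference of the fluxes gives the divergence; zero flux
    # outside the domain keeps the mean exactly when lam = 0.
    div_x = gx - np.pad(gx, ((0, 0), (1, 0)))[:, :-1]
    div_y = gy - np.pad(gy, ((1, 0), (0, 0)))[:-1, :]
    return u + dt * (div_x + div_y) - dt * lam * (u - u0)
```

In practice one iterates this step until the image stops changing, respecting the step-size restriction mentioned above; λ can be refreshed between steps from the discrete analogue of the λ(t) integral.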

Slide 11: Results

Slide 12

Slide 13
Figure panels: Original, Noisy, Wiener, TV.

Slide 14: Do it slightly sloppy: demo
$$L_t = \nabla \cdot \left( \| \nabla L \|^{-1} \nabla L \right) + \lambda \, (L - L_0)$$
$$\delta L = C \cdot D^{-}_{i,j}\!\left( \| D^{+}_{i,j} L \|^{-1} D^{+}_{i,j} L \right) + \lambda \, (L - L_0)$$
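A minimal NumPy sketch of this "sloppy" update, under assumptions not on the slide: periodic borders via np.roll, a small eps regularizing the norm, and the data term written as λ(L0 − L) so that a positive λ pulls the evolving image back toward the data:

```python
import numpy as np

def sloppy_tv_step(L, L0, lam=0.05, C=0.1, eps=1e-6):
    """Simplified TV update: dL = C * D-( D+L / ||D+L|| ) + lam * (L0 - L),
    using only forward differences D+ for the flux and backward
    differences D- for the divergence (periodic borders)."""
    dxp = np.roll(L, -1, axis=1) - L            # D+ in x
    dyp = np.roll(L, -1, axis=0) - L            # D+ in y
    norm = np.sqrt(eps + dxp**2 + dyp**2)       # ||D+ L||, regularized
    fx, fy = dxp / norm, dyp / norm             # normalized flux
    div = (fx - np.roll(fx, 1, axis=1)) + (fy - np.roll(fy, 1, axis=0))
    return L + C * div + lam * (L0 - L)
```

Iterating this a few hundred times on a noisy image reproduces the demo qualitatively; it lacks the minmod limiter and careful boundary handling of the full scheme, which is exactly the "slightly sloppy" point of the slide.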

Slide 15
Write
$$u_t = -(\delta H - \lambda \, \delta I), \qquad
\lambda = \frac{\langle \delta H, \delta I \rangle}{\langle \delta I, \delta I \rangle}, \qquad
\langle A, B \rangle = \int_\Omega A \cdot B \; d\Omega.$$
Then $I(u)$ is a constant of motion,
$$\partial_t I(u) = \langle \delta I, u_t \rangle = 0,$$
so $I(u_0) = I(u_t)$, and since
$$\partial_t H(u) = -u_t^2 \le 0,$$
the PDE converges to a minimum on the manifold given by the constraints.

Slide 16: Some extensions
A blurred noisy image: $u_0 = (Au)(x,y) + n(x,y)$, where $A$ is a compact operator on $L^2$.
Multiplicative noise and blurring:
$$u_0 = [(Au)(x,y)] \, n(x,y)$$
$$u_0 = (Au)(x,y) + u(x,y) \, n(x,y)$$
A smarter functional:
$$H = \int_\Omega \phi(|\nabla L|) \; d\Omega$$

Slide 17: Recent Developments in Total Variation Image Restoration
T. Chan, S. Esedoglu, F. Park, A. Yip, in: Handbook of Mathematical Models in Computer Vision.
"Since their introduction in a classic paper by Rudin, Osher and Fatemi, total variation minimizing models have become one of the most popular and successful methodologies for image restoration. More recently, there has been a resurgence of interest and exciting new developments, some extending the applicability to inpainting, blind deconvolution and vector-valued images, while others offer improvements in better preservation of contrast, geometry and textures, in ameliorating the staircasing effect, and in exploiting the multiscale nature of the models. In addition, new computational methods have been proposed with improved computational speed and robustness."

Slide 18: Properties
Total variation based image restoration models were first introduced by Rudin, Osher, and Fatemi (ROF) in their pioneering work on edge-preserving image denoising. It is one of the earliest and best-known examples of PDE-based edge-preserving denoising, designed with the explicit goal of preserving sharp discontinuities (edges) in images while removing noise and other unwanted fine-scale detail. Being convex, the ROF model is one of the simplest variational models with this most desirable property. The revolutionary aspect of the model is its regularization term, which allows discontinuities but at the same time disfavors oscillations.

Slide 19: Properties
The constraint of the optimization forces the minimization to take place over images that are consistent with the known noise level. The objective functional itself is called the total variation (TV) of the function $u(x)$; for smooth images it is equivalent to the $L^1$ norm of the derivative, and hence is a measure of the amount of oscillation in the function $u(x)$.
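The discrete TV of an image, the quantity the ROF model minimizes, is easy to compute. A sketch using isotropic forward differences (one common discretization among several):

```python
import numpy as np

def total_variation(u):
    """Isotropic discrete TV: sum over pixels of the Euclidean norm of the
    forward-difference gradient (trimmed so the x- and y-parts align)."""
    dx = np.diff(u, axis=1)[:-1, :]   # forward x-differences
    dy = np.diff(u, axis=0)[:, :-1]   # forward y-differences
    return float(np.sqrt(dx**2 + dy**2).sum())
```

A constant image has TV 0; adding noise raises the TV sharply, which is why minimizing TV subject to the noise constraint removes oscillation while a single clean jump (an edge) contributes only its height times its length.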

Slide 20: Remark
The step from the constrained minimization problem to the unconstrained (Lagrangian) formulation is not trivial!

Slide 21: BV
The space of functions of bounded variation (BV) is an ideal choice for minimizers of the ROF model, since BV provides regularity of solutions but also allows sharp discontinuities (edges). Many other spaces, such as the Sobolev space $W^{1,1}$, do not allow edges. Before defining the space BV, we formally state the definition of TV:
$$TV(u) = \sup \left\{ \int_\Omega u \, \nabla\!\cdot \varphi \; dx \; : \; \varphi \in C_c^1(\Omega; \mathbb{R}^2), \; |\varphi(x)| \le 1 \right\},$$
where $\Omega \subset \mathbb{R}^2$ is a bounded open set. We can now define the space BV as
$$BV(\Omega) = \{ u \in L^1(\Omega) \; : \; TV(u) < \infty \}.$$
Thus, BV functions amount to $L^1$ functions with bounded TV semi-norm.
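A one-dimensional example, not on the slide, makes the contrast with $W^{1,1}$ concrete: a step function has finite TV but no integrable weak derivative.

```latex
\text{Let } \Omega = (0,1) \text{ and } u = \chi_{(1/2,\,1)}.
\text{ For any } \varphi \in C_c^1(\Omega) \text{ with } |\varphi| \le 1:
\quad
\int_0^1 u \, \varphi' \, dx = \int_{1/2}^1 \varphi' \, dx = -\varphi(\tfrac{1}{2}) \le 1,
```

and the bound is approached, so $TV(u) = 1$ and $u \in BV(\Omega)$, while $u \notin W^{1,1}(\Omega)$ because its distributional derivative is the Dirac measure at $1/2$, not an $L^1$ function.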

Slide 22: TV & Contours
Why does this work? Ignoring the constraints, we get
$$\frac{\partial f}{\partial t} = \frac{f_{vv}}{f_w} = -\kappa.$$
The TV norm of $f$ can be obtained by integrating along all contours $f = c$, for all values of $c$. Thus, one can view TV as controlling both the size of the jumps in an image and the geometry of the isophotes (level sets).

Slide 23: Caveats
While using the TV norm as regularization can reduce oscillations and regularize the geometry of level sets without penalizing discontinuities, it possesses some properties which may be undesirable in some circumstances.
- Loss of contrast: the total variation of a function defined on a bounded domain decreases if we rescale it around its mean value in such a way that the difference between the maximum and minimum value (the contrast) is reduced.
- Loss of geometry: in addition to the loss of contrast, the TV of a function may be decreased by reducing the length of each level set.
- Staircasing: the denoised image may look blocky (piecewise constant).
- Loss of texture: although highly effective for denoising, the TV norm cannot preserve delicate small-scale features like texture.
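The loss-of-contrast caveat can be made quantitative with a standard observation not spelled out on the slide: TV is positively one-homogeneous, so shrinking an image around its mean shrinks its TV by the same factor.

```latex
u_c = \bar{u} + c\,(u - \bar{u}), \quad 0 < c < 1
\;\Longrightarrow\;
TV(u_c) = \int_\Omega |\nabla u_c| \, dx = c \int_\Omega |\nabla u| \, dx = c \, TV(u).
```

The regularizer therefore always rewards reducing contrast, which is why TV-denoised images lose some contrast even in regions that were noise-free.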

Slide 24: Summary
Total variation minimizing models have become one of the most popular and successful methodologies for image restoration. ROF is one of the earliest and best-known examples of PDE-based edge-preserving denoising. It was designed with the explicit goal of preserving sharp discontinuities (edges) in images while removing noise and other unwanted fine-scale detail. However, it has some drawbacks, as shown on the previous slides.

Slide 25: Next week
Nonlinear diffusion: mean curvature motion
- Curve evolution
- Denoising
- Edge preserving
- Implementation
- Isophote vs. image implementation
