
Slide 1 — Total variation minimization: Numerical Analysis, Error Estimation, and Extensions
Martin Burger, Johannes Kepler University Linz
SFB Numerical-Symbolic-Geometric Scientific Computing; Radon Institute for Computational & Applied Mathematics; Westfälische Wilhelms-Universität Münster

Slide 2 — Collaborations (Obergurgl, September 2006)
Stan Osher, Jinjun Xu, Guy Gilboa (UCLA); Lin He (Linz / UCLA); Klaus Frick, Otmar Scherzer (Innsbruck); Carola Schönlieb (Vienna); Don Goldfarb, Wotao Yin (Columbia)

Slide 3 — Introduction
Total variation methods are popular in imaging (and inverse problems) because they
- keep sharp edges,
- eliminate oscillations (noise),
- create nice new mathematics.
Many related approaches have appeared in recent years, e.g. ℓ1 penalization / sparsity techniques.

Slide 4 — Introduction
Total variation and related methods have some shortcomings:
- they are difficult to analyze and to obtain error estimates for,
- they commit systematic errors (clean images are not reconstructed perfectly),
- they pose computational challenges,
- some extensions to other imaging tasks are not well understood (e.g. inpainting).

Slide 5 — ROF Model
The starting point of the analysis is the ROF model for denoising: given noisy data f, minimize
  (1/2) ‖u − f‖²_{L²} + α |u|_TV  over u ∈ BV,
with penalization parameter α > 0.
Rudin-Osher-Fatemi 89/92, Acar-Vogel 93, Chambolle-Lions 96, Vogel 95/96, Scherzer-Dobson 96, Chavent-Kunisch 98, Meyer 01, …

Slide 6 — ROF Model
Reconstruction example (code by Jinjun Xu). Panels: clean, noisy, ROF result.
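To make the ROF step concrete, here is a minimal NumPy sketch of Chambolle's projection algorithm (Chambolle 03, cited later in this talk) for min_u (1/2)‖u − f‖² + α|u|_TV; it is not the code by Jinjun Xu used for the figure, and the function name `rof_chambolle` and the step size are this sketch's own choices:

```python
import numpy as np

def grad(u):
    # forward differences with Neumann boundary (zero difference at the far edge)
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    # backward differences: the negative adjoint of grad above
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[0, :] = px[0, :]
    dx[1:-1, :] = px[1:-1, :] - px[:-2, :]
    dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]
    dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]
    dy[:, -1] = -py[:, -2]
    return dx + dy

def rof_chambolle(f, alpha, n_iter=200, tau=0.125):
    # fixed-point iteration on the dual field p; the primal solution is
    # recovered as u = f - alpha * div(p)
    px = np.zeros_like(f); py = np.zeros_like(f)
    for _ in range(n_iter):
        gx, gy = grad(div(px, py) - f / alpha)
        denom = 1.0 + tau * np.sqrt(gx ** 2 + gy ** 2)
        px = (px + tau * gx) / denom
        py = (py + tau * gy) / denom
    return f - alpha * div(px, py)
```

On a noisy step image this removes most of the oscillations while keeping the edge, and it preserves the mean of the data (the correction is a divergence).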

Slide 7 — Error Estimation
First question for error estimation: estimate the difference between u (the minimizer of ROF) and f in terms of the penalization parameter α.
An estimate in the L² norm is standard, but yields no information about edges.
An estimate in the BV norm is too ambitious: even an arbitrarily small difference in edge location can yield a BV norm of order one!

Slide 8 — Error Estimation
We need a better error measure, stronger than L² but weaker than BV. A possible choice is the Bregman distance (Bregman 67),
  D(u, v) = J(u) − J(v) − ⟨J'(v), u − v⟩.
This is a real distance for a strictly convex differentiable functional J, but it is not symmetric. The symmetric version is
  D_sym(u, v) = ⟨J'(u) − J'(v), u − v⟩.

Slide 9 — Error Estimation
The total variation is neither strictly convex nor differentiable. Define the generalized Bregman distance for each subgradient p ∈ ∂J(v),
  D^p(u, v) = J(u) − J(v) − ⟨p, u − v⟩,
with symmetric version
  d(u, v) = ⟨p_u − p_v, u − v⟩,  p_u ∈ ∂J(u), p_v ∈ ∂J(v).
Kiwiel 97, Chen-Teboulle 97

Slide 10 — Error Estimation
Since the TV seminorm is homogeneous of degree one, we have ⟨p, v⟩ = J(v) for p ∈ ∂J(v), and the Bregman distance becomes
  D^p(u, v) = J(u) − ⟨p, u⟩.

Slide 11 — Error Estimation
The Bregman distance for TV is not a strict distance: it can be zero for u ≠ v. In particular, d_TV is zero under a change of contrast (Resmerita-Scherzer 06). The Bregman distance is still nonnegative (TV is convex), and it can provide information about edges.
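The degeneracy under contrast change is easy to check numerically with another one-homogeneous functional, J(u) = ‖u‖₁, standing in for TV (an assumption of this sketch; sign(v) plays the role of the TV subgradient):

```python
import numpy as np

def bregman_l1(u, v):
    # generalized Bregman distance D^p(u, v) = J(u) - <p, u> for the
    # one-homogeneous J(u) = ||u||_1, with subgradient p = sign(v)
    p = np.sign(v)
    return np.abs(u).sum() - (p * u).sum()
```

Scaling v by a positive constant (a "contrast change") leaves the distance at zero, while flipping its sign does not; the distance is always nonnegative by convexity.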

Slide 12 — Error Estimation
Let v be piecewise constant, with white background and color values on the regions. Then we obtain subgradients built from the signed distance functions of the region boundaries.

Slide 13 — Error Estimation
For such v the Bregman distances are given by integrals over the region boundaries; in the limit we obtain the analogous formula for piecewise continuous functions.

Slide 14 — Error Estimation
For an estimate in terms of α we need a smoothness (source) condition on the data: the exact image admits a subgradient q ∈ ∂|·|_TV lying in L². The optimality condition for ROF reads
  u − f + α p = 0,  p ∈ ∂|u|_TV.

Slide 15 — Error Estimation
Subtract q from the optimality condition; estimating yields the bound on the Bregman distance (mb-Osher 04): for exact data f,
  D^q(u, f) ≤ (α/2) ‖q‖².

Slide 16 — Error Estimation
In practice we have to deal with noisy data f, a perturbation of some exact data g with ‖f − g‖ ≤ δ. The estimate for the Bregman distance becomes
  D^q(u, g) ≲ δ²/α + α ‖q‖².

Slide 17 — Error Estimation
Optimal choice of the penalization parameter: α ~ δ, i.e. of the order of the noise variance. This balances the two terms and yields D^q(u, g) = O(δ).

Slide 18 — Error Estimation
A direct extension to deconvolution / linear inverse problems holds under a standard source condition (mb-Osher 04). Further extensions: stronger estimates under stronger conditions (Resmerita 05); nonlinear inverse problems (Resmerita-Scherzer 06).

Slide 19 — Discretization
The natural choice is a primal discretization with piecewise constant functions on a grid.
Problem 1: numerical analysis (characterization of discrete subgradients).
Problem 2: the discrete problems are the same for any anisotropic version of the total variation.

Slide 20 — Discretization
In multiple dimensions, nonconvergence of the primal discretization can be shown for the isotropic TV (p = 2). For the anisotropic TV (p = 1) on rectangular, axis-aligned grids, convergence holds (Fitzpatrick-Keeling 1997).
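The distinction between the isotropic (p = 2) and anisotropic (p = 1) discrete total variation is visible already on tiny grids. A minimal sketch with forward differences (one of several possible discretizations, chosen here for illustration):

```python
import numpy as np

def discrete_tv(u, p):
    # primal, piecewise-constant discretization of TV on a pixel grid:
    # p = 2 gives the isotropic version, p = 1 the anisotropic one
    gx = u[1:, :-1] - u[:-1, :-1]
    gy = u[:-1, 1:] - u[:-1, :-1]
    if p == 1:
        return np.abs(gx).sum() + np.abs(gy).sum()
    return np.sqrt(gx ** 2 + gy ** 2).sum()
```

For a feature aligned with the grid axes the two values coincide pointwise; for a diagonal ramp the anisotropic value is strictly larger (by a factor up to √2), which is exactly the directional bias the primal discretization cannot resolve.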

Slide 21 — Primal-Dual Discretization
Alternative: perform a primal-dual discretization of the optimality system (a variational inequality) over the convex set of admissible dual variables (vector fields of pointwise length at most one).

Slide 22 — Primal-Dual Discretization
Discretize the convex set with appropriate finite elements: piecewise linear elements in 1D, Raviart-Thomas elements in multiple dimensions.

Slide 23 — Primal / Primal-Dual Discretization
In 1D the primal, primal-dual, and dual discretizations are equivalent. An error estimate for the Bregman distance follows by analogous techniques; note that only the natural condition is needed to prove it.

Slide 24 — Primal / Primal-Dual Discretization
In multiple dimensions similar estimates hold, with additional work since the projection of a subgradient is not a discrete subgradient. The primal-dual discretization is equivalent to a discretized dual minimization (Chambolle 03, Kunisch-Hintermüller 04). This can be used to prove existence of discrete solutions and stability of p (mb 06/07?).

Slide 25 — Cartesian Grids
For most imaging applications Cartesian grids are used, and the primal-dual discretization can be reinterpreted as a finite difference scheme: the image intensity corresponds to the color of a pixel of width h around the grid point. Raviart-Thomas elements are particularly easy on Cartesian grids: the first component is piecewise linear in x and piecewise constant in y, z, etc. This leads to a simple finite difference scheme on a staggered grid.
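The staggered-grid structure can be checked in a few lines: with u at cell centers and the dual field on cell faces (the lowest-order Raviart-Thomas picture), the forward-difference gradient and the backward-difference divergence are negative adjoints of each other. A sketch, assuming zero-flux boundary conditions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16
u  = rng.standard_normal((n, n))        # intensities at cell centers
px = rng.standard_normal((n - 1, n))    # x-flux on vertical faces
py = rng.standard_normal((n, n - 1))    # y-flux on horizontal faces

# forward differences: grad u lives on the faces
gx = u[1:, :] - u[:-1, :]
gy = u[:, 1:] - u[:, :-1]

# divergence back onto cell centers (zero flux through the outer boundary)
d = np.zeros((n, n))
d[:-1, :] += px; d[1:, :] -= px
d[:, :-1] += py; d[:, 1:] -= py
```

The discrete integration-by-parts identity ⟨grad u, p⟩ = −⟨u, div p⟩ then holds exactly, which is what makes the primal-dual scheme consistent.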

Slide 26 — Extension I: Iterative Refinement & ISS
ROF minimization has a systematic error: the total variation of the reconstruction is smaller than the total variation of the clean image, and image features are left in the residual f − u. Panels: g (clean), f (noisy), u (ROF), f − u (residual).

Slide 27 — Extension I: Iterative Refinement & ISS
Idea: add the residual ("noise") back to the image to restore the features that were diminished too much, then do ROF again. This gives the iterative procedure (Osher-mb-Goldfarb-Xu-Yin 04)
  u_{k+1} = arg min_u (λ/2) ‖u − (f + v_k)‖² + |u|_TV,  v_{k+1} = v_k + f − u_{k+1},  v_0 = 0,
with fidelity weight λ.
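The loop structure of the refinement is independent of the particular denoiser. A minimal sketch in which a crude linear smoother stands in for the ROF solve (an assumption for illustration only; the talk uses the actual TV minimizer):

```python
import numpy as np

def smooth(u):
    # stand-in "denoiser": a 3-point moving average with periodic boundary
    return 0.25 * np.roll(u, 1) + 0.5 * u + 0.25 * np.roll(u, -1)

def iterative_refinement(f, denoise, n_iter):
    # denoise, then add the residual ("noise") back to the data
    # before the next denoising step
    v = np.zeros_like(f)
    iterates = []
    for _ in range(n_iter):
        u = denoise(f + v)
        v = v + f - u
        iterates.append(u)
    return iterates
```

Adding the residual back progressively restores the features the smoother removed: the distance of the iterates to the data decreases from step to step, which is the loss-of-contrast correction the slide describes.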

Slide 28 — Extension I: Iterative Refinement & ISS
This improves the reconstructions significantly.

Slide 29 — Extension I: Iterative Refinement & ISS

Slide 30 — Extension I: Iterative Refinement & ISS
A simple observation from the optimality condition: p_{k+1} = p_k + λ(f − u_{k+1}) is a subgradient, p_{k+1} ∈ ∂|u_{k+1}|_TV. Consequently, iterative refinement is equivalent to the Bregman iteration
  u_{k+1} = arg min_u (λ/2) ‖u − f‖² + D^{p_k}(u, u_k).

Slide 31 — Extension I: Iterative Refinement & ISS
The choice of the parameter is less important; λ can be kept small (oversmoothing), since the regularizing effect comes from appropriate stopping. Quantitative stopping rules are available, or "stop when you are happy" (S.O.). The limit λ → 0 can be studied: it yields a gradient flow for the dual variable, the "inverse scale space" flow (mb-Gilboa-Osher-Xu 06, mb-Frick-Osher-Scherzer 06).

Slide 32 — Extension I: Iterative Refinement & ISS
Non-quadratic fidelities are possible; some caution is needed for the L¹ fidelity (He-mb-Osher 05, mb-Frick-Osher-Scherzer 06). Error estimation in the Bregman distance: mb-Resmerita 06, in preparation. For further details see the talk of Klaus Frick.

Slide 33 — Extension I: Inverse Scale Space
[Movie by M. Bachmayr, Master's thesis 06]

Slide 34 — Extension I: Iterative Refinement & ISS
The application to other regularization techniques, e.g. wavelet thresholding, is straightforward: starting from soft shrinkage, iterated refinement yields firm shrinkage, and the inverse scale space limit becomes hard shrinkage (Osher-Xu 06). The Bregman distance is a natural sparsity measure; the source condition just requires a sparse signal, and the number of nonzero components is the smoothness measure in the error estimates.
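The soft → firm → hard progression can be written down directly. A minimal sketch of the three shrinkage rules (the two-threshold parameterization t1 < t2 of firm shrinkage is one common convention, assumed here):

```python
import numpy as np

def soft(x, t):
    # soft shrinkage: shrink everything toward zero by t
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def hard(x, t):
    # hard shrinkage: kill small coefficients, keep large ones unchanged
    return np.where(np.abs(x) > t, x, 0.0)

def firm(x, t1, t2):
    # firm shrinkage: interpolates between soft (near t1) and hard (beyond t2)
    a = np.abs(x)
    mid = np.sign(x) * t2 * (a - t1) / (t2 - t1)
    return np.where(a <= t1, 0.0, np.where(a <= t2, mid, x))
```

Firm shrinkage leaves large coefficients untouched (no loss of contrast, matching the iterated-refinement picture), while soft shrinkage biases every surviving coefficient toward zero by t.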

Slide 35 — Extension I: Iterative Refinement & ISS
Total variation, inverse scale space, and shrinkage techniques can be combined nicely; see the talk by Lin He.

Slide 36 — Extension II: Anisotropy
Total variation prefers isotropic structures (circles, spheres) or special anisotropies. In many applications one wants sharp corners in different directions, so adaptive anisotropy is needed; it can be incorporated into ROF and ISS. See the talk by Benjamin Berkels.

Slide 37 — Extension III: Inpainting
It is difficult to construct total variation techniques for inpainting; the original extensions of ROF failed to obtain natural connectivity (see the book by Chan-Shen 05). Given the inpainting region D and the (noisy) image f on the complement of D, one tries to minimize a TV functional with fidelity only where f is given.

Slide 38 — Extension III: Inpainting
The optimality condition will have the form p = A(f − u), with A a linear operator defining the norm of the fidelity term. In particular p = 0 in D!

Slide 39 — Extension III: Inpainting
A different, iterated approach (motivated by Cahn-Hilliard inpainting, Bertozzi et al 05): minimize in each step a three-term functional. The first term provides damping, the second fidelity (fit to f where it is given, and to the old iterate in the inpainting region), and the third term provides smoothing.
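A crude sketch of the masked-fidelity idea, using an explicit L² gradient flow with a smoothed TV term (an assumption for illustration; this is not the fourth-order H⁻¹ flow of the next slide, and the parameter values are ad hoc). The fidelity acts only where data is known, so TV fills the region D:

```python
import numpy as np

def tv_inpaint(f, known, lam=10.0, eps=1e-3, tau=0.05, n_iter=800):
    # gradient descent on  (lam/2) * known * (u - f)^2 + smoothed TV(u);
    # 'known' is True where the data f is given (i.e. outside the region D)
    u = f.copy()
    for _ in range(n_iter):
        gx = np.zeros_like(u); gy = np.zeros_like(u)
        gx[:-1, :] = u[1:, :] - u[:-1, :]
        gy[:, :-1] = u[:, 1:] - u[:, :-1]
        nrm = np.sqrt(gx ** 2 + gy ** 2 + eps)   # smoothed gradient norm
        px, py = gx / nrm, gy / nrm
        d = np.zeros_like(u)                     # div(p) by backward differences
        d[:-1, :] += px[:-1, :]; d[1:, :] -= px[:-1, :]
        d[:, :-1] += py[:, :-1]; d[:, 1:] -= py[:, :-1]
        u = u + tau * (d - lam * known * (u - f))
    return u
```

On a constant image with a square hole, the flow propagates the surrounding value into the hole, illustrating why the connectivity behavior of the chosen smoothing term matters for inpainting.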

Slide 40 — Extension III: Inpainting
Letting the damping parameter tend to zero yields a continuous flow; for the H⁻¹ norm this is a fourth-order flow. A stationary solution (existence?) satisfies the optimality condition above.

Slide 41 — Extension III: Inpainting
Result: penguins.

Slide 42 — Extension IV: Manifolds
Original motivation: Osher-Marquina 01 used a preconditioned gradient flow for ROF, whose stationary state was assumed to be the ROF minimizer. Computational observation: this is not always true! Trivial observation: for the initial value u(0) = 0 the flow remains zero for all time!

Slide 43 — Extension IV: Manifolds
Embarrassing observation: the flow is always created by transport from the initial value. Important observation: the stationary state minimizes ROF on the manifold.

Slide 44 — Extension IV: Manifolds
Surprising observation: for f the indicator function of a convex set, the flow is equivalent to the gradient flow of the L¹ version of ROF, so there is no loss of contrast! A more detailed analysis for general images is needed. A possible extension is ROF minimization on other manifolds via metric gradient flows.

Slide 45 — Download and Contact
Papers and talks: www.indmath.uni-linz.ac.at/people/burger (from October: wwwmath1.uni-muenster.de/num)
E-mail: martin.burger@jku.at

