Presentation on theme: "Edge Preserving Image Restoration using L1 norm"— Presentation transcript:
1 Edge Preserving Image Restoration using L1 norm. Vivek Agarwal, The University of Tennessee, Knoxville.
2 Outline: Introduction; Regularization based image restoration; L2 norm regularization; L1 norm regularization; Tikhonov regularization; Total Variation regularization; Least Absolute Shrinkage and Selection Operator (LASSO); Results; Conclusion and future work.
3 Introduction - Physics of Image Formation. [Slide diagram: the forward process passes the true scene f(x',y') through the imaging system kernel K(x,y,x',y') to the registration system, giving the recorded image g(x,y) plus noise; restoration is the reverse process, from g(x,y) + noise back to f(x',y').]
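As a rough illustration of this forward process, here is a minimal Python sketch assuming a spatially invariant Gaussian blur kernel and additive Gaussian noise (the actual kernel K and noise model used in the presentation are not specified):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def forward_model(f, sigma_blur=2.0, sigma_noise=0.01, seed=0):
    """Simulate the forward process: blur the true scene f and add noise.

    g(x, y) = (K * f)(x, y) + n(x, y), with K taken here as a Gaussian kernel.
    """
    rng = np.random.default_rng(seed)
    blurred = gaussian_filter(f.astype(float), sigma=sigma_blur)  # K * f
    noise = sigma_noise * rng.standard_normal(f.shape)            # additive noise
    return blurred + noise

# Example: degrade a synthetic piecewise-constant image containing an edge.
f_true = np.zeros((64, 64))
f_true[:, 32:] = 1.0            # a vertical edge
g_observed = forward_model(f_true)
```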
4 Image Restoration. Image restoration is a subset of image processing and is a highly ill-posed problem. Most image restoration algorithms use least squares. L2 norm based algorithms produce smooth restorations, which are inaccurate if the image contains edges. L1 norm algorithms preserve the edge information in the restored images, but the algorithms are slow.
5 Well-Posed Problem. In 1923, the French mathematician Hadamard introduced the notion of well-posed problems. According to Hadamard, a problem is called well-posed if: a solution for the problem exists (existence); this solution is unique (uniqueness); and this unique solution is stable under small perturbations in the data, in other words small perturbations in the data should cause small perturbations in the solution (stability). If at least one of these conditions fails, the problem is called ill-posed or incorrectly posed and demands special consideration.
6 Non-existence is Harmful. To deal with non-existence we have to enlarge the domain where the solution is sought. Example: a quadratic equation ax² + bx + c = 0 in general form has two solutions, x = (-b ± √(b² - 4ac)) / 2a. In the real domain there may be no solution (when b² - 4ac < 0), while in the complex domain a solution always exists.
7 Uniqueness. Non-uniqueness is usually caused by the lack or absence of information about the underlying model. Example: neural networks. The error surface has multiple local minima, and many of these minima fit the training data very well; however, the generalization capabilities of these different solutions (predictive models) can be very different, ranging from poor to excellent. How do we pick a model that is going to generalize well? [Slide diagram: Solution #1, Solution #2, Solution #3, each labelled "Bad or good?"]
8 Uniqueness. Non-uniqueness is not always harmful; it depends on what we are looking for. If we are looking for a desired effect, that is, we know what a good solution looks like, then we can be happy with multiple solutions and simply pick a good one from the variety of solutions. Non-uniqueness is harmful if we are looking for an observed effect, that is, we do not know what a good solution looks like. The best way to combat non-uniqueness is to specify a model using prior knowledge of the domain, or at least to restrict the space where the desired model is searched.
9 Instability. Instability is caused by an attempt to reverse cause-effect relationships. Nature always solves only the forward problem, because of the arrow of time: cause always goes before effect. In practice we very often have to reverse the relationship, that is, to go from effect to cause. Examples: convolution-deconvolution, Fredholm integral equations of the first kind. [Slide diagram: the forward operation maps cause to effect.]
10 L1 and L2 Norms. The general expression for the p-norm is ||x||_p = (Σ_i |x_i|^p)^(1/p). The L2 norm, ||x||_2 = (Σ_i x_i²)^(1/2), is the Euclidean or vector distance. The L1 norm, ||x||_1 = Σ_i |x_i|, is also known as the Manhattan norm because it corresponds to the sum of the distances along the coordinate axes.
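For concreteness, a small sketch of these norms computed on a vector with NumPy (standard definitions, not code from the presentation):

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

# General p-norm: ||x||_p = (sum_i |x_i|^p)^(1/p)
l2 = np.sqrt(np.sum(x**2))   # Euclidean distance, same as np.linalg.norm(x, 2)
l1 = np.sum(np.abs(x))       # Manhattan distance, same as np.linalg.norm(x, 1)

print(l2, l1)                # approximately 5.099 and exactly 8.0
```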
11 Why Regularization? Most restoration is based on least squares, but if the problem is ill-posed the least squares method fails: small perturbations in the data produce large, unstable changes in the solution.
12 Regularization. The general formulation for regularization techniques is min_f ||Kf - g||² + λ R(f), where ||Kf - g||² is the error term, λ is the regularization parameter, and R(f) is the penalty term.
13 Tikhonov Regularization. Tikhonov is an L2 norm, or classical, regularization technique. It produces a smoothing effect on the restored image. In zero-order Tikhonov regularization, the regularization operator L is the identity matrix. The Tikhonov regularized solution minimizes ||Kf - g||² + λ||Lf||², with closed form f = (KᵀK + λLᵀL)⁻¹Kᵀg. In higher-order Tikhonov, L is either a first-order or second-order differentiation matrix.
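A minimal sketch of this solution in matrix form, assuming the blur has been written as a matrix K acting on a vectorized image; the closed form used below is the standard one and may differ in detail from the expression on the slide:

```python
import numpy as np

def tikhonov_restore(K, g, lam, L=None):
    """Tikhonov restoration: f = argmin ||K f - g||^2 + lam * ||L f||^2.

    Closed form: f = (K^T K + lam * L^T L)^(-1) K^T g.
    L = identity gives zero-order Tikhonov; a difference matrix gives higher order.
    """
    n = K.shape[1]
    if L is None:
        L = np.eye(n)                       # zero-order: L is the identity matrix
    A = K.T @ K + lam * (L.T @ L)
    return np.linalg.solve(A, K.T @ g)

# Tiny example with a random "blur" matrix standing in for the imaging operator.
rng = np.random.default_rng(0)
K = rng.standard_normal((30, 20))
f_true = rng.standard_normal(20)
g = K @ f_true + 0.05 * rng.standard_normal(30)
f_hat = tikhonov_restore(K, g, lam=0.1)
```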
14 Tikhonov Regularization. [Slide images: original image and blurred image.]
16 Total Variation. Total Variation is a deterministic approach. This regularization method preserves the edge information in the restored images. The TV regularization penalty function obeys the L1 norm. The mathematical expression for TV regularization is TV(f) = ∫ |∇f| dx dy, and the restored image minimizes ||Kf - g||² + λ TV(f).
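To illustrate the edge-preserving behaviour of the TV penalty, here is a short sketch using scikit-image's Chambolle TV denoiser; note that this handles denoising only, not the full deblurring formulation above, and the weight value is an arbitrary choice:

```python
import numpy as np
from skimage.restoration import denoise_tv_chambolle

# Noisy piecewise-constant image: TV denoising removes noise but keeps the edge.
rng = np.random.default_rng(0)
f_true = np.zeros((64, 64))
f_true[:, 32:] = 1.0
noisy = f_true + 0.2 * rng.standard_normal(f_true.shape)

tv_denoised = denoise_tv_chambolle(noisy, weight=0.1)  # larger weight -> smoother result
```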
17 Difference between Tikhonov regularization and Total Variation:
S.No | Tikhonov Regularization | Total Variation regularization
1. | L2 norm penalty | L1 norm (TV) penalty
2. | Assumes smooth and continuous information | Smoothness is not assumed
3. | Computationally less complex | Computationally more complex
4. | Restored image is smooth | Restored image is blocky and preserves the edges
18 Computation Challenges. The Total Variation functional TV(f) = ∫ |∇f| dx dy has gradient -∇·(∇f/|∇f|), which leads to a non-linear PDE in the optimality condition.
19 Computation Challenges (contd.). An iterative method is necessary to solve it. The TV function is non-differentiable at zero. The operator ∇·(∇f/|∇f|) is non-linear. The ill-conditioning of the operator causes numerical difficulties. Good preconditioning is required.
20 Computation of the Regularization Operator. Total Variation restoration is computed by minimizing an objective made up of the least-squares data term and the Total Variation penalty function (L).
21 Computation of the Regularization Operator (contd.). The slide gives the discretization of the Total Variation function and the gradient of the Total Variation.
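A small sketch of one common discretization, using forward differences and a small constant beta to keep the functional differentiable at zero; beta and the exact difference scheme are assumptions, not necessarily the ones used on the slide:

```python
import numpy as np

def tv_discrete(f, beta=1e-3):
    """Discretized total variation: sum over pixels of sqrt(dx^2 + dy^2 + beta^2)."""
    dx = np.diff(f, axis=1, append=f[:, -1:])    # forward differences in x
    dy = np.diff(f, axis=0, append=f[-1:, :])    # forward differences in y
    return np.sum(np.sqrt(dx**2 + dy**2 + beta**2))

def tv_gradient(f, beta=1e-3):
    """Gradient of the discretized TV functional, approximately -div(grad f / |grad f|)."""
    dx = np.diff(f, axis=1, append=f[:, -1:])
    dy = np.diff(f, axis=0, append=f[-1:, :])
    mag = np.sqrt(dx**2 + dy**2 + beta**2)
    px, py = dx / mag, dy / mag
    # Divergence via backward differences (adjoint of the forward difference).
    div = (np.diff(px, axis=1, prepend=px[:, :1]) +
           np.diff(py, axis=0, prepend=py[:1, :]))
    return -div
```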
22 Regularization Operator. The regularization operator is computed using the expression shown on the slide, where the terms are as defined above.
23 Lasso Regression. LASSO, for "Least Absolute Shrinkage and Selection Operator", is a shrinkage and selection method for linear regression introduced by Tibshirani (1996). It minimizes the usual sum of squared errors, with a bound on the sum of the absolute values of the coefficients. Computing the LASSO solution is a quadratic programming problem that can be solved efficiently by the least angle regression (LARS) algorithm. LASSO also uses an L1 penalty norm.
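A short sketch of LASSO with scikit-learn (the data and the alpha value are placeholders, not the settings used in this work); scikit-learn's Lasso uses coordinate descent, while sklearn.linear_model.LassoLars implements the least angle regression route mentioned above:

```python
import numpy as np
from sklearn.linear_model import Lasso

# min_w ||y - X w||^2 / (2 n) + alpha * ||w||_1 : the L1 penalty drives
# many coefficients exactly to zero (shrinkage and selection).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 0.5]                   # only 3 informative features
y = X @ w_true + 0.1 * rng.standard_normal(100)

lasso = Lasso(alpha=0.05).fit(X, y)
print(np.count_nonzero(lasso.coef_), "non-zero coefficients out of 20")
```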
24 Ridge Regression and Lasso Equivalence. The cost function of ridge regression is Σ_i (y_i - x_iᵀw)² + λ||w||₂². Ridge regression is identical to zero-order Tikhonov regularization, and the analytical solutions of ridge and Tikhonov are similar: w = (XᵀX + λI)⁻¹Xᵀy. The bias introduced favors solutions with small weights, and the effect is to smooth the output function.
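A minimal check of this equivalence: the ridge closed form w = (XᵀX + λI)⁻¹Xᵀy matches scikit-learn's Ridge when the intercept is disabled (the data and λ here are arbitrary):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
y = rng.standard_normal(50)
lam = 0.3

w_closed_form = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)
w_sklearn = Ridge(alpha=lam, fit_intercept=False).fit(X, y).coef_

print(np.allclose(w_closed_form, w_sklearn))   # True: same zero-order Tikhonov solution
```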
25 Ridge Regression and Lasso Equivalence. Instead of a single value of λ, different values of λ can be used for different pixels. This should provide the same solution as lasso regression (regularization). Having thus established the relation between lasso and zero-order Tikhonov, there is also a relation between Total Variation and lasso, since both are L1 norm penalties. [Slide diagram: Tikhonov - Lasso relation proved; Total Variation - Lasso relation is our aim to prove.]
27 L1 norm regularization - Restoration. [Slide images: Total Variation restoration and LASSO restoration.]
28 L1 norm regularization - Restoration. [Slide images: blurred and noisy images at three degrees of blur, with the corresponding Total Variation and LASSO restorations.]
29 L1 norm regularization - Restoration. [Slide images: blurred and noisy images at three levels of noise, with the corresponding Total Variation and LASSO restorations.]
30 Cross Section of Restoration. [Slide plots: cross sections for different degrees of blurring, Total Variation vs. LASSO regularization.]
31 Cross Section of Restoration. [Slide plots: cross sections for different levels of noise, Total Variation vs. LASSO regularization.]
32 Comparison of Algorithms. [Slide images: original image, LASSO restoration, Tikhonov restoration, Total Variation restoration.]
33 Effect of Different Levels of Noise and Blurring. [Slide images: blurred and noisy image, LASSO restoration, Tikhonov restoration, Total Variation restoration.]
34 Numerical Analysis of Results - Airplane.
First Level of Noise:
Method | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error | Restoration Time (min)
Total Variation | 2 | 10 | 2.05e-02 | 81.4 | 1.74 | 2.50
LASSO Regression | 1 | 6 | 1.00e-04 | | 1.81 | 0.80
Tikhonov Regularization | - | - | 1.288e-10 | | 9.85 | 0.20
Second Level of Noise:
Method | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error | Restoration Time (min)
Total Variation | 1 | 15 | 1e-03 | 83.5 | 3.54 | 1.4
LASSO Regression | 2 | 4 | .228 | | | 0.8
Tikhonov Regularization | - | - | 1.12e-10 | | 11.2 | 0.30
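The slide does not spell out how the error columns are defined; as a hedged illustration only, one common convention computes a relative restoration error and a data-misfit residual as follows:

```python
import numpy as np

def relative_error_percent(f_true, f_restored):
    """Relative restoration error in percent: 100 * ||f_restored - f_true|| / ||f_true||."""
    return 100.0 * np.linalg.norm(f_restored - f_true) / np.linalg.norm(f_true)

def residual_error(K, f_restored, g):
    """Residual (data misfit) after restoration: ||K f_restored - g||."""
    return np.linalg.norm(K @ f_restored - g)
```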
39 Conclusion. The Total Variation method preserves the edge information in the restored image. The restoration time of Total Variation regularization is high. LASSO provides an impressive alternative to TV regularization: the restoration time of LASSO regularization is about two times less than the restoration time of TV regularization, and the restoration quality of LASSO is better than or equal to the restoration quality of TV regularization.
40 Conclusion. Both LASSO and TV regularization fail to suppress the noise in the restored images. The analysis shows that an increase in the degree of blur increases the restoration error. An increase in the noise level does not have a significant influence on the restoration time, but it affects the residual error.