
1
Discriminative Approach for Wavelet Denoising. Yacov Hel-Or (IDC Herzliya) and Doron Shaked (HP Labs Israel).

2
Motivation – Image denoising: Can we clean Lena?

3
Some reconstruction problems: images of Venus taken by the Russian lander Venera-10 in 1975. Can we “see” through the missing pixels? (Sapiro et al.)

4
Image Inpainting (Sapiro et al.)

5
Image De-mosaicing: Can we reconstruct the color image?

6
Image De-blurring: Can we sharpen Barbara?

7
All the above deal with degraded images; their reconstruction requires solving an inverse problem: inpainting, de-blurring, de-noising, de-mosaicing.

8
Typical Degradation Sources: low illumination; atmospheric attenuation (haze, turbulence, …); optical distortions (geometric, blurring); sensor distortion (quantization, sampling, sensor noise, spectral sensitivity, de-mosaicing).

9
Reconstruction as an Inverse Problem: original image → distortion H → + noise → measurements → reconstruction algorithm. Years of extensive study; thousands of research papers.

10
Typically: the distortion H is singular or ill-posed; the noise n is unknown – only its statistical properties can be learned.

11
Key point: the statistical prior of natural images.

12
The Image Prior: P_x(x) over the image space.

13
Bayesian Reconstruction (MAP): given the measurements y, from among all possible solutions choose the one that maximizes the a-posteriori probability P(x|y) ∝ P(y|x)·P_x(x).
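As an illustration that is not part of the slides: in the special case of a scalar Gaussian prior and Gaussian noise, the MAP estimate has a closed form – linear (Wiener) shrinkage. A minimal numpy sketch, with hypothetical variances:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_x, sigma_n = 2.0, 1.0                 # hypothetical prior / noise stds
x = rng.normal(0.0, sigma_x, 10000)         # samples from the Gaussian prior
y = x + rng.normal(0.0, sigma_n, x.size)    # noisy measurements

# MAP under a Gaussian prior and Gaussian noise is linear Wiener shrinkage:
# x_map = sigma_x^2 / (sigma_x^2 + sigma_n^2) * y
x_map = sigma_x**2 / (sigma_x**2 + sigma_n**2) * y

mse_raw = np.mean((y - x) ** 2)
mse_map = np.mean((x_map - x) ** 2)
assert mse_map < mse_raw   # exploiting the prior reduces the error on average
```

For natural images the prior is far from Gaussian, which is exactly the difficulty the next slides raise.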

14
So, are we set? Unfortunately not! The p.d.f. P_x defines a prior distribution over natural images that is: defined over a huge-dimensional space (10^6 dimensions for a 1K×1K grayscale image); sparsely sampled; known to be non-Gaussian; complicated to model.

15
Example: 3D prior of 2×2 image neighborhoods, from Mumford & Huang, 2000.

16
Marginalization of the Image Prior. Observation 1: the wavelet transform tends to de-correlate pixel dependencies of natural images.

17
Observation 2: the statistics of natural images are homogeneous – all locations share the same statistics.

18
Wavelet Shrinkage Denoising (Donoho & Johnstone ’94, unitary case). Degradation model: y = x + n, with additive white Gaussian noise. The MAP estimator:

19
The MAP estimator gives:

20
The MAP estimator diagonalizes the system, which leads to a very useful property: the reconstruction reduces to scalar mapping functions applied per wavelet coefficient.

21
Wavelet Shrinkage Pipeline: transform W → mapping functions M_i(y_i^w) (the non-linear operation, taking y_i^w to x_i^w) → inverse transform W^T.
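The pipeline can be sketched end-to-end. The following is a toy illustration, not the authors' code: a one-level orthonormal Haar transform stands in for W, soft thresholding stands in for the mapping functions M_i, and the signal, noise level, and threshold are arbitrary choices:

```python
import numpy as np

def haar_1d(v):
    # One level of the orthonormal Haar transform (a unitary W).
    a = (v[0::2] + v[1::2]) / np.sqrt(2)   # approximation band
    d = (v[0::2] - v[1::2]) / np.sqrt(2)   # detail band
    return a, d

def ihaar_1d(a, d):
    # Inverse transform W^T for the one-level Haar case.
    v = np.empty(2 * a.size)
    v[0::2] = (a + d) / np.sqrt(2)
    v[1::2] = (a - d) / np.sqrt(2)
    return v

def soft(w, t):
    # Scalar mapping function M(w): soft thresholding.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

rng = np.random.default_rng(1)
x = np.repeat(rng.normal(0.0, 1.0, 64), 8)   # piecewise-constant "clean" signal
y = x + rng.normal(0.0, 0.3, x.size)         # additive Gaussian noise

a, d = haar_1d(y)                            # transform W
x_hat = ihaar_1d(a, soft(d, 0.6))            # M on the detail band, then W^T

mse_noisy = np.mean((y - x) ** 2)
mse_denoised = np.mean((x_hat - x) ** 2)
assert mse_denoised < mse_noisy
```

With a real wavelet library one would use, e.g., PyWavelets in place of the hand-rolled Haar step.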

22
How Many Mapping Functions? Due to the fact that each sub-band has its own marginal statistics, N mapping functions are needed for N sub-bands.

23
Subband Decomposition. The wavelet transform is expressed as a set of band filters B_i, and the shrinkage is applied per band.

24
Wavelet Shrinkage Pipeline (sub-band form): wavelet transform into bands B_i → per-band shrinkage functions mapping y_i^B to x_i^B → inverse transform via B_i^T and summation.

25
Designing the Mapping Function: the shape of M_j depends solely on the marginal p.d.f. P_j of band j and the noise variance, via the MAP objective.

26
Commonly, the marginals P_j(y^w) are approximated by a Generalized Gaussian Distribution (GGD), with exponent p < 1 for natural images (from Simoncelli ’99).

27
Hard thresholding, soft thresholding, and linear Wiener filtering: MAP estimators for the GGD model with three different exponents. The noise is additive Gaussian, with variance one third that of the signal.
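These three estimators are simple enough to write down directly. A small sketch (the threshold and variances are arbitrary illustrative values, not the ones used in the slides):

```python
import numpy as np

def hard(w, t):
    # Hard thresholding: keep large coefficients, zero the rest.
    return np.where(np.abs(w) > t, w, 0.0)

def soft(w, t):
    # Soft thresholding: the MAP estimator for a Laplacian prior (p = 1).
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def wiener(w, s2, n2):
    # Linear Wiener filtering: the MAP estimator for a Gaussian prior (p = 2).
    return s2 / (s2 + n2) * w

w = np.linspace(-5.0, 5.0, 11)
# Soft thresholding always shrinks at least as much as hard thresholding.
assert np.all(np.abs(hard(w, 2.0)) >= np.abs(soft(w, 2.0)))
# The Wiener rule is simply a linear attenuation.
assert np.allclose(wiener(w, 3.0, 1.0), 0.75 * w)
```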

28
Due to its simplicity, Wavelet Shrinkage became extremely popular: thousands of applications; hundreds of related papers (984 citations of the D&J paper in Google Scholar). What about performance? The denoising performance of the original Wavelet Shrinkage technique is far from state-of-the-art. Why? Wavelet coefficients are not really independent.

29
Recent Developments: since the original approach suggested by D&J, significant improvements were achieved along two lines. Over-complete transforms: scalar MFs, simple, but not considered state-of-the-art. Joint (local) coefficient modeling: multivariate MFs, complicated, but superior results.

30
Joint (Local) Coefficient Modeling (timeline 1994–2006): Sparsity – Mallat & Zhang; Adaptive Thresholding – Li & Orchard; Context Modeling – Chang et al.; HMM – Crouse et al.; HMM – Fan & Xia; Joint Bayesian – Simoncelli; Context Modeling – Portilla et al.; Joint Bayesian – Pižurica et al.; Bivariate – Sendur & Selesnick; Co-occurrence – Shan & Aviyente.

31
Shrinkage in Over-complete Transforms (timeline 1994–2006): Shrinkage – Donoho & Johnstone; Steerable pyramids – Simoncelli & Adelson; Undecimated wavelet – Coifman & Donoho; Ridgelets – Candès; Curvelets – Starck et al.; Contourlets – Do & Vetterli; Ridgelets – Carré & Helbert; Contourlets – Matalon et al.; Ridgelets – Nezamoddini et al.; K-SVD – Aharon & Elad.

32
Over-Complete Shrinkage Denoising: apply an over-complete transform, shrink each coefficient with a mapping function, and invert. The mapping functions are naively borrowed from the unitary case.
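A common way to realize an over-complete transform is cycle spinning: denoise every circular shift with the unitary transform and average the results (the undecimated idea of Coifman & Donoho cited above). The following toy sketch, not the authors' implementation, uses a one-level Haar transform and soft thresholding with arbitrary parameters:

```python
import numpy as np

def soft(w, t):
    # Soft thresholding as the scalar mapping function.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def haar_denoise(y, t):
    # One-level orthonormal Haar shrinkage (the unitary case).
    a = (y[0::2] + y[1::2]) / np.sqrt(2)
    d = soft((y[0::2] - y[1::2]) / np.sqrt(2), t)
    out = np.empty_like(y)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def cycle_spin_denoise(y, t):
    # Average over the transform's circular shifts: an over-complete,
    # shift-invariant version of the same shrinkage.
    shifted = [np.roll(haar_denoise(np.roll(y, -s), t), s) for s in range(2)]
    return np.mean(shifted, axis=0)

rng = np.random.default_rng(2)
x = np.repeat(rng.normal(0.0, 1.0, 64), 8)    # piecewise-constant signal
y = x + rng.normal(0.0, 0.3, x.size)          # additive Gaussian noise

mse_noisy = np.mean((y - x) ** 2)
mse_overcomplete = np.mean((cycle_spin_denoise(y, 0.6) - x) ** 2)
assert mse_overcomplete < mse_noisy
```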

33
What’s wrong with existing MFs? 1. MAP criterion: the solution is biased towards the most probable case. 2. Independence assumption: in the overcomplete case, the wavelet coefficients are inherently dependent. 3. Minimization domain: for the unitary case, MF optimality is expressed in the transform domain; this is incorrect in the overcomplete case. 4. White-noise assumption: image noise is not necessarily white i.i.d.

34
Why are unitary-based MFs still being used? Because the alternative requires: non-marginal statistics; multivariate minimization; multivariate MFs; handling non-white noise.

35
Suggested Approach. Maintain simplicity: use scalar LUTs. Improve efficiency: use over-complete transforms; design optimal MFs with respect to a given set of images; express optimality in the spatial domain; attain optimality with respect to MSE.

36
Optimal Mapping Function. Traditional approach (descriptive): model the wavelet p.d.f. and derive the MF from the MAP objective. Suggested approach (discriminative): design the MF directly from an optimality criterion.

37
The Optimality Criterion: design the MFs with respect to a given set of examples {x_i^e} and {y_i^e}. Critical problem: how to optimize the non-linear MFs?

38
The Spline Transform. Let x ∈ R be a real value in a bounded interval [a,b). Divide [a,b) into M segments with knots q = [q_0, q_1, ..., q_M]; w.l.o.g. assume x ∈ [q_{j-1}, q_j). Define the residue r(x) = (x − q_{j-1}) / (q_j − q_{j-1}). Then x = r(x)·q_j + (1 − r(x))·q_{j-1} = [0, …, 0, 1−r(x), r(x), 0, …]·q = S_q(x)·q.
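The construction above translates directly into code. A minimal sketch of S_q(x), where the interval and number of segments are arbitrary illustrative choices:

```python
import numpy as np

def slt(x, q):
    # Spline Transform S_q(x): a row vector with 1-r(x) and r(x) at the two
    # knots bracketing x, so that slt(x, q) @ q reconstructs x exactly.
    j = int(np.searchsorted(q, x, side='right'))   # x in [q[j-1], q[j])
    r = (x - q[j - 1]) / (q[j] - q[j - 1])         # the residue r(x)
    row = np.zeros(q.size)
    row[j - 1], row[j] = 1.0 - r, r
    return row

q = np.linspace(-4.0, 4.0, 9)        # M = 8 segments on [a, b) = [-4, 4)
x = 1.3
assert np.isclose(slt(x, q) @ q, x)  # x = S_q(x) . q
```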

39
The Spline Transform – cont. A vectorial extension applies S_q(x_i) as the i-th row, one row per entry of x; we call this the Spline Transform (SLT) of x.

40
The SLT Properties. Substitution property: substituting the boundary (knot) vector q with a different vector p in x = S_q(x)·q forms a piecewise-linear mapping x → S_q(x)·p.
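The substitution property amounts to linear interpolation through the points (q_j, p_j). A small sketch, using a soft-threshold shape for p purely as an example:

```python
import numpy as np

# Knots q and substituted values p; p traces a soft-threshold shape here,
# chosen purely for illustration.
q = np.linspace(-4.0, 4.0, 9)
p = np.sign(q) * np.maximum(np.abs(q) - 1.0, 0.0)

def pw_map(x, q, p):
    # S_q(x) . p: the piecewise-linear mapping induced by substitution.
    j = int(np.searchsorted(q, x, side='right'))
    r = (x - q[j - 1]) / (q[j] - q[j - 1])
    return (1.0 - r) * p[j - 1] + r * p[j]

# Identical to linear interpolation through the points (q_j, p_j).
assert np.isclose(pw_map(2.5, q, p), np.interp(2.5, q, p))
```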

41
Back to the MF Design. We approximate the non-linear {M_k} with piecewise-linear functions M_k(y; p_k) = S_q(y)·p_k. Finding {p_k} is then a standard least-squares problem with a closed-form solution!
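Because M_k(y; p_k) = S_q(y)·p_k is linear in p_k, fitting p_k to examples is ordinary least squares. A toy sketch, not the authors' training code: the "clean" targets are generated from a known soft-threshold rule so the fit can be checked against it:

```python
import numpy as np

def slt_matrix(y, q):
    # Stack the rows S_q(y_i) for all samples: the design matrix of the
    # piecewise-linear mapping function.
    S = np.zeros((y.size, q.size))
    j = np.searchsorted(q, y, side='right')
    r = (y - q[j - 1]) / (q[j] - q[j - 1])
    S[np.arange(y.size), j - 1] = 1.0 - r
    S[np.arange(y.size), j] = r
    return S

rng = np.random.default_rng(3)
q = np.linspace(-6.0, 6.0, 16)                       # knot vector
y = rng.uniform(-5.9, 5.9, 5000)                     # "noisy" coefficients (toy)
target = np.sign(y) * np.maximum(np.abs(y) - 1, 0)   # "clean" targets (toy rule)

S = slt_matrix(y, q)
p, *_ = np.linalg.lstsq(S, target, rcond=None)       # closed-form LS solution

# The fitted piecewise-linear MF should closely follow the generating rule.
assert np.max(np.abs(S @ p - target)) < 0.3
```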

42
Designing the MFs – closed-form solution. The pipeline applies the band filters B_k, the mapping functions M_k(y; p_k), the transposed filters B_k^T, summation, and the normalization (B^T B)^{-1}. For an undecimated wavelet these steps are 2D convolutions.

43
Results

44
Training Images

45
Tested Images

46
Simulation setup. Transform used: undecimated DCT. Noise: additive i.i.d. Gaussian. Number of bins: 15. Number of bands: 3×3 … 10×10.

47
Option 1: transform domain – independent bands (pipeline with per-band M_k(y; p_k) and (B^T B)^{-1} normalization).

48
Option 2: spatial domain – independent bands (pipeline with per-band M_k(y; p_k)).

49
Option 3: spatial domain – joint bands (pipeline with M_k(y; p_k) and (B^T B)^{-1} normalization).

50
MFs for UDCT 8×8, (i,i) bands, i=1..4, σ=20: Option 1, Option 2, Option 3.

51
Comparing PSNR results for 8×8 undecimated DCT, σ=20.

52
8×8 UDCT, σ=10.

53
8×8 UDCT, σ=20.

54
8×8 UDCT, σ=10.

55
The Role of Quantization Bins: 8×8 UDCT, σ=10.

56
The Role of the Transform Used, σ=10.

57
The Role of the Training Images.

58
The role of noise variance: MFs for UDCT 8×8, (i,i) bands, i=2..6, for σ = 5, 10, 15, 20.

59
The role of noise variance. Observation: the obtained MFs for different noise variances are similar up to scaling.

60
Comparison between M_20(v) and 0.5·M_10(2v) for bases [2:4]×[2:4].
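For intuition on why such a rescaling can work (a sketch, not the empirical MFs from the slides): for a soft-threshold MF with threshold proportional to the noise level, the relation 0.5·soft(2v, 2T) = soft(v, T) holds exactly, so M_20(v) = 0.5·M_10(2v). A quick check with illustrative values:

```python
import numpy as np

def soft(w, t):
    # Soft-threshold mapping function.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

v = np.linspace(-100.0, 100.0, 201)
# Doubling the noise level doubles the threshold; the scaling identity
# 0.5 * soft(2v, 2T) == soft(v, T) then mirrors M_20(v) = 0.5 * M_10(2v).
assert np.allclose(0.5 * soft(2 * v, 30.0), soft(v, 15.0))
```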

61
Comparison with BLS-GSM

62

63
Other Degradation Models

64
JPEG Artifact Removal

65

66
Image Sharpening

67

68
Conclusions. A new and simple scheme for over-complete-transform-based denoising: MFs are optimized in a discriminative manner; the non-linear minimization is given a linear formulation; the need for modeling a complex statistical prior in a high-dimensional space is eliminated; the scheme applies seamlessly to other degradation problems as long as scalar MFs are used for reconstruction.

69
Conclusions – cont. Extensions: filter-cascade based denoising; multivariate MFs (activity level); non-homogeneous noise characteristics. Open problems: what is the best transform for a given image? How to choose training images that form a faithful representation?

70
Thank You The End

71
MSE for MF scaling from σ=10 to σ=20.

72
MSE for MF scaling from σ=15 to σ=20.

73
MSE for MF scaling from σ=25 to σ=20.

74

75
Image Sharpening

76
Wavelet Shrinkage Pipeline (with (B^T B)^{-1} normalization): wavelet transform into bands B_i → shrinkage functions mapping y_i^B to x_i^B → inverse transform via B_i^T → summation → (B^T B)^{-1}.

77
MFs for UDCT 8×8, (i,i) bands, i=1..4, σ=20: Option 1.

78
MFs for UDCT 8×8, (i,i) bands, i=1..4, σ=20: Option 2.

79
MFs for UDCT 8×8, (i,i) bands, i=1..4, σ=20: Option 3.

80
Comparing PSNR results for 8×8 undecimated DCT, σ=20.

81
Comparing PSNR results for 8×8 undecimated DCT, σ=10.
