1 QUASI MAXIMUM LIKELIHOOD BLIND DECONVOLUTION
Alexander Bronstein

2 BIBLIOGRAPHY
- A. Bronstein, M. Bronstein, and M. Zibulevsky, "Quasi maximum likelihood blind deconvolution: super- and sub-Gaussianity vs. asymptotic stability", submitted to IEEE Trans. Sig. Proc.
- A. Bronstein, M. Bronstein, M. Zibulevsky, and Y. Y. Zeevi, "Quasi maximum likelihood blind deconvolution: asymptotic performance analysis", submitted to IEEE Trans. Information Theory.
- A. Bronstein, M. Bronstein, and M. Zibulevsky, "Relative optimization for blind deconvolution", submitted to IEEE Trans. Sig. Proc.
- A. Bronstein, M. Bronstein, M. Zibulevsky, and Y. Y. Zeevi, "Quasi maximum likelihood blind deconvolution of images acquired through scattering media", submitted to ISBI 2004.
- A. Bronstein, M. Bronstein, M. Zibulevsky, and Y. Y. Zeevi, "Quasi maximum likelihood blind deconvolution of images using optimal sparse representations", CCIT Report No. 455 (EE No. 1399), Dept. of Electrical Engineering, Technion, Israel, December 2003.
- A. Bronstein, M. Bronstein, and M. Zibulevsky, "Blind deconvolution with relative Newton method", CCIT Report No. 444 (EE No. 1385), Dept. of Electrical Engineering, Technion, Israel, October 2003.

3 AGENDA
- Introduction: problem formulation, applications
- QML blind deconvolution
- Asymptotic analysis
- Relative Newton
- Generalizations

4 BLIND DECONVOLUTION PROBLEM
Convolution model: the source signal passes through an unknown convolution kernel and is observed together with additive sensor noise.
Deconvolution: a restoration kernel is applied to the observed signal to produce a source estimate, which can be recovered only up to an arbitrary scaling factor and an arbitrary delay.
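
A minimal numerical sketch of this convolution/deconvolution model in Python (the source distribution, kernel, noise level, and kernel lengths are illustrative assumptions, not values from the slides):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)

# --- convolution model: observed = kernel * source + noise ---
T = 10_000
source = rng.laplace(size=T)                  # an i.i.d. super-Gaussian source (illustrative)
kernel = np.array([1.0, 0.5, -0.2])           # unknown convolution kernel (minimum-phase here)
observed = np.convolve(source, kernel)[:T] + 0.01 * rng.standard_normal(T)

# --- deconvolution: apply a restoration kernel ---
# Truncated impulse response of the exact inverse 1/H(z), obtained by feeding a
# unit impulse through the all-pole filter whose denominator is `kernel`.
impulse = np.zeros(64); impulse[0] = 1.0
restoration = lfilter([1.0], kernel, impulse)
estimate = np.convolve(observed, restoration)[:T]

# The source is recovered only up to an arbitrary scaling factor and delay:
# any c * restoration delayed by d samples is an equally valid solution.
print(np.corrcoef(source[100:-100], estimate[100:-100])[0, 1])  # close to 1
```

Blind deconvolution is the problem of finding such a restoration kernel from `observed` alone, without knowing `kernel` or the source.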

5 APPLICATIONS
- Acoustics, speech processing → dereverberation
- Optics, image processing, biomedical imaging → deblurring
- Communications → channel equalization
- Control → system identification
- Statistics, finance → ARMA estimation

6 AGENDA
- Introduction
- QML blind deconvolution: ML vs. QML, the choice of φ(s), equivariance, gradient and Hessian
- Asymptotic analysis
- Relative Newton
- Generalizations

7 ML BLIND DECONVOLUTION
Maximum-likelihood blind deconvolution rests on the following assumptions:
1. The source signal is i.i.d. with a known probability density function.
2. The convolution kernel has no zeros on the unit circle, i.e. its inverse is stable.
3. No noise (more precisely: no noise model).
4. The source is zero-mean.

8 QUASI ML BLIND DECONVOLUTION
Problems of maximum likelihood:
- The true source PDF is usually unknown.
- The true PDF is often non-log-concave and not well-suited for optimization.
Quasi maximum likelihood blind deconvolution: substitute the true source PDF term with some model function φ(s).
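
For concreteness, here is the QML objective in the form commonly used in this line of work (the exact normalization on the slide is not recoverable from the transcript, so the constants below are assumptions): minimize, over the restoration kernel w, minus the mean log-magnitude frequency response of w plus the empirical mean of φ applied to the restored signal y = w * x. A sketch in Python:

```python
import numpy as np

def qml_cost(w, x, phi):
    """Quasi-ML blind deconvolution cost (a sketch of the standard form:
    minus the mean log-magnitude response of w, plus the empirical mean
    of the model function phi applied to the restored signal)."""
    y = np.convolve(x, w)[:len(x)]                   # restored signal y = w * x
    nfft = 4 * len(w)                                # frequency grid for the log-|W| term
    W = np.fft.rfft(w, nfft)
    log_det = np.mean(np.log(np.abs(W) + 1e-12))     # ≈ (1/2π) ∫ log|W(e^{iθ})| dθ
    return -log_det + np.mean(phi(y))

# example model function: a smoothed absolute value (super-Gaussian prior)
phi = lambda s: np.sqrt(s**2 + 1e-3)
```

The log-spectrum term diverges as w shrinks to zero, so it prevents the trivial solution w = 0 and plays the role of the likelihood normalization that the data term alone lacks.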

9 THE CHOICE OF φ(s)
Super-Gaussian vs. sub-Gaussian model functions (plots).
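
The plots themselves are not preserved in the transcript, but two commonly used families of model functions are sketched below (these specific choices are assumptions, not necessarily the ones shown on the slide):

```python
import numpy as np

def phi_super_gaussian(s, lam=0.1):
    """Smoothed absolute value: approaches |s| as lam -> 0.
    Suited to sparse / super-Gaussian (heavy-tailed) sources, e.g. speech."""
    return np.abs(s) - lam * np.log1p(np.abs(s) / lam)

def phi_sub_gaussian(s):
    """A quartic model function, a standard choice for sub-Gaussian
    (light-tailed, e.g. uniform or bounded) sources."""
    return 0.25 * s**4
```

Heavy-tailed sources call for absolute-value-like model functions that are sharp near zero, whereas light-tailed sources call for functions that are flat near zero and grow faster in the tails.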

10 EQUIVARIANCE
Analysis of the QML estimator of the restoration kernel given the observation.
Theorem: The QML estimator is equivariant, i.e., for every invertible kernel, the estimate computed from the observation convolved with that kernel equals the original estimate convolved with the impulse response of the kernel's inverse.
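
In symbols (notation introduced here: ŵ(x) denotes the QML estimate of the restoration kernel from the observation x, and h⁻¹ the impulse response of the inverse of the kernel h), the equivariance property presumably reads:

```latex
\hat{w}(h \ast x) \;=\; \hat{w}(x) \ast h^{-1}
\qquad \text{for every invertible convolution kernel } h .
```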

11 GRADIENT & HESSIAN OF THE COST
Gradient of the QML cost, expressed with the mirror (time-reversal) operator.
Hessian of the QML cost.
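
Assuming the cost form sketched after slide 8, differentiating it gives the following gradient (a reconstruction under that assumption; the overline denotes the mirror, i.e. time-reversal, operator, which is where the slide's "mirror operator" remark comes in):

```latex
\frac{\partial \ell}{\partial w_n}
  \;=\; \frac{1}{T}\sum_{t} \varphi'(y_t)\, x_{t-n}
  \;-\; \overline{\left(w^{-1}\right)}_{\,n},
\qquad y = w \ast x,
\qquad \overline{a}_{\,n} \equiv a_{-n}.
```

The Hessian follows by differentiating once more; it combines second derivatives φ''(y) of the model function with a term contributed by the log-spectrum part of the cost.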

12 AGENDA
- Introduction
- QML blind deconvolution
- Asymptotic analysis: asymptotic Hessian structure, asymptotic error covariance, Cramér-Rao bounds, superefficiency, examples
- Relative Newton
- Generalizations

13 ASYMPTOTIC HESSIAN AT THE SOLUTION POINT
For a sufficiently large sample size the Hessian approaches its expected value; at the solution point it takes a particularly simple form (a reconstruction is sketched below).
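
The slide's formulas are lost in the transcript. Under the cost form assumed earlier, evaluated in the relative parameterization at the solution point (so that the restored signal equals the i.i.d. source s, with σ² = E[s²]), the expected Hessian works out to the following sketch:

```latex
H_{00} = 1 + \mathbb{E}\!\left[\varphi''(s)\,s^{2}\right], \qquad
H_{nn} = \mathbb{E}\!\left[\varphi''(s)\right]\sigma^{2}, \quad
H_{n,-n} = 1 \quad (n \neq 0), \qquad
H_{nm} = 0 \ \text{otherwise}.
```

This is the separable structure referred to on the following slides: the matrix decouples into a scalar equation for the zero lag and independent 2×2 blocks coupling each pair of mirror lags (n, -n), which is also why the relative Newton system of slide 33 separates.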

14 ASYMPTOTIC ERROR COVARIANCE
The estimation kernel is computed from the data; the exact restoration kernel inverts the convolution kernel, and its scaling factor has to obey a fixed normalization that removes the scale ambiguity.

15 ASYMPTOTIC ERROR COVARIANCE
The estimation error is obtained from a second-order Taylor expansion of the cost around the solution, using equivariance.

16 ASYMPTOTIC ERROR COVARIANCE
Asymptotically (as the sample size grows), the error covariance has a separable structure.

17 ASYMPTOTIC ERROR COVARIANCE
The estimation error covariance matrix asymptotically separates accordingly.

18 ASYMPTOTIC ERROR COVARIANCE
Asymptotic gradient covariance matrices, expressed in terms of moments of the source and of the model function.

19 ASYMPTOTIC ERROR COVARIANCE
Asymptotic signal-to-interference ratio (SIR) estimate.
Asymptotic estimation error covariance.
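
The analytic SIR expression is not preserved in the transcript. As an empirical counterpart, the SIR of a computed restoration can be measured from the global response g = restoration * kernel, treating everything outside the dominant tap as interference. A sketch:

```python
import numpy as np

def sir_db(restoration, kernel):
    """Signal-to-interference ratio (in dB) of the global response
    g = restoration * kernel.  A perfect restoration gives g = c * delta
    delayed by d samples, i.e. infinite SIR."""
    g = np.convolve(restoration, kernel)
    peak = np.argmax(np.abs(g))                      # dominant tap (arbitrary delay)
    signal = np.abs(g[peak]) ** 2
    interference = np.sum(np.abs(g) ** 2) - signal
    return 10.0 * np.log10(signal / interference)
```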

20 CRAMÉR-RAO LOWER BOUNDS
For the true ML estimator (the model function equals minus the log of the true source PDF), the distribution-dependent parameters simplify and the asymptotic error covariance reduces to the Cramér-Rao lower bound.

21 CRAMÉR-RAO LOWER BOUNDS
The asymptotic SIR estimate simplifies correspondingly.

22 SUPEREFFICIENCY
Let the source be sparse, and let the model function be a smoothed absolute value with smoothing parameter λ. In the limit of vanishing smoothing, the asymptotic estimation error vanishes as well (superefficiency).
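
One standard smoothed absolute value with smoothing parameter λ (an assumption; the slide's exact choice is not recoverable from the transcript) is

```latex
\varphi_{\lambda}(s) \;=\; |s| \;-\; \lambda\,\log\!\left(1 + \frac{|s|}{\lambda}\right),
\qquad \varphi_{\lambda}(s) \xrightarrow[\;\lambda \to 0\;]{} |s| .
```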

23 SUPEREFFICIENCY
Similar results are obtained for a uniformly-distributed source with a suitable model function; they can be extended to sources whose PDF vanishes outside some interval.

24 ASYMPTOTIC STABILITY
The QML estimator is said to be asymptotically stable if the true restoration kernel is a local minimizer of the QML cost in the limit of infinite sample size.
Theorem: The QML estimator is asymptotically stable if a certain set of conditions on the source distribution and the model function holds, and asymptotically unstable if any one of a complementary set of conditions holds.

25 EXAMPLE
Generalized Laplace distribution.

26 STABILITY OF THE SUPER-GAUSSIAN ESTIMATOR
Stability regions for super-Gaussian and sub-Gaussian sources (plots).

27 STABILITY OF THE SUB-GAUSSIAN ESTIMATOR
Stability regions for super-Gaussian and sub-Gaussian sources (plots).

28 PERFORMANCE OF THE SUPER-GAUSSIAN ESTIMATOR
Performance for a super-Gaussian source (plot).

29 PERFORMANCE OF THE SUB-GAUSSIAN ESTIMATOR
Performance for a sub-Gaussian source (plot).

30 AGENDA
- Introduction
- QML blind deconvolution
- Asymptotic analysis
- Relative Newton: relative optimization, relative Newton, fast relative Newton
- Generalizations

31 RELATIVE OPTIMIZATION (RO)
0. Start with an initial restoration kernel and an initial source estimate.
1. For k = 1, 2, ... until convergence:
2.   Start from the identity kernel.
3.   Find a kernel that decreases the cost evaluated on the current source estimate.
4.   Update the source estimate by convolving it with the kernel found in step 3.
5. End for.
Restoration kernel estimate: the convolution of all kernels found along the way with the initial kernel. Source estimate: the final updated signal.
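
A minimal sketch of the relative optimization loop in Python. The inner step here is a single gradient-descent step on the QML cost of slide 8 rather than the Newton step introduced on the next slides, and the kernel length, step size, iteration count, and nonlinearity are all illustrative assumptions:

```python
import numpy as np

def qml_gradient(v, x, dphi, nfft=512):
    """Gradient of the QML cost (same form as sketched earlier) w.r.t. the
    kernel v, evaluated on the current source estimate x."""
    y = np.convolve(x, v)[:len(x)]
    # data term: (1/T) * sum_t phi'(y_t) * x_{t-n}
    g_data = np.array([np.mean(dphi(y) * np.roll(x, n)) for n in range(len(v))])
    # log-spectrum term: minus the mirrored impulse response of 1/V(z)
    v_inv = np.fft.irfft(1.0 / np.fft.rfft(v, nfft), nfft)
    g_logdet = -np.array([v_inv[(-n) % nfft] for n in range(len(v))])
    return g_data + g_logdet

def relative_optimization(x, kernel_len=16, n_iter=200, step=0.5, dphi=np.tanh):
    """Relative optimization: at every step, improve a kernel v starting from
    the identity on the CURRENT source estimate, then fold v into the estimate
    and into the accumulated restoration kernel."""
    w = np.zeros(kernel_len); w[0] = 1.0          # accumulated restoration kernel
    x_est = x.copy()                              # current source estimate
    for _ in range(n_iter):
        v = np.zeros(kernel_len); v[0] = 1.0      # step 2: start from the identity kernel
        v -= step * qml_gradient(v, x_est, dphi)  # step 3: decrease the cost (descent step)
        x_est = np.convolve(x_est, v)[:len(x)]    # step 4: update the source estimate
        w = np.convolve(w, v)[:kernel_len]        # accumulate (truncated) restoration kernel
    return w, x_est
```

In the relative Newton algorithm of the next slides, the descent step above is replaced by a single Newton step, whose system decouples near the solution as sketched after slide 13.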

32 RELATIVE OPTIMIZATION (RO)
Observation: the k-th step of the relative optimization algorithm depends only on the current source estimate.
Proposition: the sequence of target function values produced by the relative optimization algorithm is monotonically decreasing.

33 RELATIVE NEWTON
Relative Newton = use one Newton step in the RO algorithm.
Near the solution point, the Newton system separates into small independent subsystems (cf. the asymptotic Hessian structure sketched above).

34 FAST RELATIVE NEWTON
Fast relative Newton = use one Newton step with an approximate Hessian in the RO algorithm, plus a regularized solution of the approximate Newton system.
Approximate Hessian evaluation has the same order of complexity as a gradient evaluation.

35 AGENDA
- Introduction
- QML blind deconvolution
- Asymptotic analysis
- Relative Newton
- Generalizations

36 GENERALIZATIONS
- IIR kernels
- Block processing → online deconvolution
- Multi-channel deconvolution → BSS + BD
- Deconvolution of images + use of sparse representations

37 GENERALIZATIONS: IIR KERNELS
IIR and FIR kernels (figure).

38 GENERALIZATIONS: ONLINE PROCESSING
Block-wise fast relative Newton (figure).

39 GENERALIZATIONS: DECONVOLUTION OF IMAGES
Source, observation, restoration (images).

40 GENERALIZATIONS: DECONVOLUTION OF IMAGES
Newton vs. fast relative Newton (figures).

