1 Presentation: "Maximum Likelihood Estimation". Presented by: Jesu Kiran Spurgen. Date: 11.09.2013

2 Contents
- Introduction
- Unbiased Estimator
- Minimum Variance Criterion
- Minimum Variance Unbiased Estimator
- Finding the BLUE (Best Linear Unbiased Estimator)
- Maximum Likelihood Estimation (MLE)
- Example: Application of MLE
- References

3 Introduction
Estimation theory deals with estimating the values of parameters based on measured data. Estimation is needed for random processes; it is not required for a deterministic process, whose parameters can be computed exactly. An estimator produces either a point estimate or an interval estimate.

4 Introduction
Point estimates: MVUE (Minimum Variance Unbiased Estimator) and ML (Maximum Likelihood).
Interval estimates: confidence intervals and prediction intervals (as in regression analysis).

5 Introduction
An estimated parameter may be biased or unbiased: an estimator is unbiased when its expected value equals the true parameter, e.g. the sample mean as an estimator of the population mean (see the sketch below).
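To make the bias idea concrete, here is a minimal Monte Carlo sketch (not from the slides) checking empirically that the sample mean is an unbiased estimator of the population mean; the true mean, noise level, and sample sizes are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
true_mean, sigma, N, trials = 5.0, 2.0, 20, 100_000

# Draw many independent data sets and compute the sample mean of each.
samples = rng.normal(true_mean, sigma, size=(trials, N))
sample_means = samples.mean(axis=1)

# The average of the sample means approximates E[sample mean];
# for an unbiased estimator it should be close to the true mean.
print("true mean          :", true_mean)
print("E[sample mean] est.:", sample_means.mean())
```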

6 Unbiased Estimator
Consider the problem of estimating a DC level A in uncorrelated noise, x[n] = A + w[n], n = 0, ..., N-1, where w[n] has zero mean and variance σ². Consider the following estimators: Â1 = x[0] (a single sample) and Â2 = (1/N) Σ x[n] (the sample mean). Mathematically, an estimator is unbiased if E(Â) = A. Computing the expectations of the two estimators gives E(Â1) = A and E(Â2) = A, so both are unbiased.

7 Unbiased Estimator
Now, computing the two variances gives var(Â1) = σ² and var(Â2) = σ²/N. Hence, a better estimator is obtained by averaging: as N increases, the variance of the sample mean decreases. A biased estimator does not improve in this way, because its bias persists no matter how many samples are averaged.
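A small simulation sketch, assuming the single-sample and sample-mean estimators described above (the values of A, σ, and N are illustrative), showing that both estimators are unbiased while averaging reduces the variance by roughly a factor of N:

```python
import numpy as np

rng = np.random.default_rng(1)
A, sigma, N, trials = 3.0, 1.0, 50, 100_000

x = A + rng.normal(0.0, sigma, size=(trials, N))   # x[n] = A + w[n]

A1 = x[:, 0]          # estimator 1: a single sample x[0]
A2 = x.mean(axis=1)   # estimator 2: the sample mean

print("mean of A1:", A1.mean(), " var of A1:", A1.var(), " (theory:", sigma**2, ")")
print("mean of A2:", A2.mean(), " var of A2:", A2.var(), " (theory:", sigma**2 / N, ")")
```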

8 Minimum Variance Criterion
Optimality criterion: MSE (mean square error), which measures the mean squared deviation of the estimator from the true value, MSE(Â) = E[(Â − A)²] = var(Â) + b²(A), where b(A) is the bias. Minimizing the MSE directly generally leads to unrealizable estimators, because 1.) the error is composed of both the variance of the estimator and its bias, and 2.) the optimal trade-off depends on the unknown parameter A itself. Solution: constrain the bias to zero and minimize the variance, which leads to the Minimum Variance Unbiased Estimator (a small numerical check of the MSE decomposition follows).
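As an illustration of the decomposition MSE = variance + bias², here is a hedged sketch comparing the unbiased sample mean with a deliberately shrunk (biased) estimator; the shrinkage factor of 0.8 is an arbitrary choice for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
A, sigma, N, trials = 3.0, 1.0, 10, 200_000

x = A + rng.normal(0.0, sigma, size=(trials, N))
A_hat = x.mean(axis=1)          # unbiased sample mean
A_shrunk = 0.8 * A_hat          # shrunk estimator: lower variance, nonzero bias

for name, est in [("sample mean", A_hat), ("shrunk mean", A_shrunk)]:
    bias = est.mean() - A
    var = est.var()
    mse = np.mean((est - A) ** 2)
    # MSE should match var + bias^2 up to Monte Carlo error.
    print(f"{name:12s} bias={bias:+.4f} var={var:.4f} mse={mse:.4f} var+bias^2={var + bias**2:.4f}")
```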

9 Minimum Variance Unbiased Estimator
Likelihood function: the PDF p(x; A) viewed as a function of the unknown parameter A, with the data x fixed. The best estimate of A is whatever value of A maximizes this function, and the "sharpness" of the likelihood function around its peak determines how accurately the unknown parameter can be estimated. Consider x[n] = A + w[n], n = 0, ..., N-1, with w[n] white Gaussian noise of variance σ²; the pdf is p(x; A) = (1/(2πσ²)^(N/2)) exp(−(1/(2σ²)) Σ (x[n] − A)²).

10 Minimum Variance Unbiased Estimator
Taking the natural logarithm and the first derivative gives ∂ln p(x; A)/∂A = (1/σ²) Σ (x[n] − A). Taking the negative of the second derivative gives −∂²ln p(x; A)/∂A² = N/σ², the Fisher information. According to the Cramér-Rao Lower Bound theorem, var(Â) ≥ σ²/N for any unbiased estimator; the sample mean attains this bound.
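To connect the curvature of the log-likelihood with the CRLB, here is a minimal sketch (parameter values chosen only for illustration) that evaluates the second derivative of the Gaussian log-likelihood by a central finite difference and compares −∂²ln p/∂A² with N/σ², the reciprocal of the bound σ²/N:

```python
import numpy as np

rng = np.random.default_rng(3)
A_true, sigma, N = 2.0, 1.5, 100
x = A_true + rng.normal(0.0, sigma, size=N)

def log_lik(A):
    # Gaussian log-likelihood of x[n] = A + w[n], w[n] ~ N(0, sigma^2)
    return -N / 2 * np.log(2 * np.pi * sigma**2) - np.sum((x - A) ** 2) / (2 * sigma**2)

# Central finite difference approximation of the second derivative at A_true.
h = 1e-3
curvature = (log_lik(A_true + h) - 2 * log_lik(A_true) + log_lik(A_true - h)) / h**2

print("-d2/dA2 ln p (numeric)      :", -curvature)
print("Fisher information N/sigma^2:", N / sigma**2)
print("CRLB sigma^2/N              :", sigma**2 / N)
```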

11 The smaller the variance of the estimator, the better!

12 Finding the BLUE (Best Linear Unbiased Estimator)
If the data follow the general linear model x = Hθ + w, where w is a noise vector with zero mean and covariance C, H is the known observation matrix, and θ is the vector of parameters to be estimated, then the BLUE is θ̂ = (Hᵀ C⁻¹ H)⁻¹ Hᵀ C⁻¹ x, and the covariance matrix of θ̂ is C_θ̂ = (Hᵀ C⁻¹ H)⁻¹. A small numerical sketch follows.
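A minimal numpy sketch of the BLUE formula above, assuming a toy observation matrix H and noise covariance C chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy general linear model x = H @ theta + w
N = 100
H = np.column_stack([np.ones(N), np.arange(N)])       # known observation matrix
theta_true = np.array([1.0, 0.05])                    # parameters to be estimated
C = np.diag(0.5 + 0.01 * np.arange(N))                # noise covariance (uncorrelated, unequal variances)
w = rng.multivariate_normal(np.zeros(N), C)
x = H @ theta_true + w

# BLUE: theta_hat = (H^T C^-1 H)^-1 H^T C^-1 x
Ci = np.linalg.inv(C)
cov_theta_hat = np.linalg.inv(H.T @ Ci @ H)           # covariance matrix of the estimator
theta_hat = cov_theta_hat @ H.T @ Ci @ x

print("true theta :", theta_true)
print("BLUE       :", theta_hat)
print("covariance :\n", cov_theta_hat)
```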

13 Maximum Likelihood Estimation (MLE)
Motivation: 1.) the MVU estimator often does not exist or cannot be found; 2.) the BLUE applies only to linear models. MLE: 1.) can be applied whenever the pdf of the data is known; 2.) is asymptotically optimal for large data records. Basic idea: choose the parameter value that makes the observed data the most likely data to have been observed.

14 Maximum Likelihood Estimation (MLE)
Asymptotically efficient: as N → ∞, the bias → 0 and the variance of the estimator → CRLB. Procedure: 1.) Consider a DC level in white Gaussian noise (WGN), x[n] = A + w[n], where A > 0 and the unknown noise variance also equals A. The pdf is p(x; A) = (1/(2πA)^(N/2)) exp(−(1/(2A)) Σ (x[n] − A)²). 2.) Taking the derivative of the log-likelihood function, we have ∂ln p(x; A)/∂A = −N/(2A) + (1/A) Σ (x[n] − A) + (1/(2A²)) Σ (x[n] − A)².

15 Maximum Likelihood Estimation (MLE)
3.) Setting the above equation to 0 gives the MLE, the value of A that maximizes the log-likelihood function: Â = −1/2 + √(1/4 + (1/N) Σ x²[n]), taking the positive root since A > 0. Consistency: Â is a consistent estimator of A if, for every ε > 0, P(|Â − A| > ε) → 0 as N → ∞. Properties of MLE: 1.) the MLE is consistent; 2.) the MLE asymptotically attains the CRLB; 3.) if an MVU estimator exists, the ML procedure will find it.
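A hedged sketch of this example, assuming the closed form Â = −1/2 + √(1/4 + (1/N) Σ x²[n]) reconstructed above; a coarse grid search over the log-likelihood is included as a cross-check, and the true A and data size are illustrative values.

```python
import numpy as np

rng = np.random.default_rng(5)
A_true, N = 4.0, 10_000
# DC level A > 0 in WGN whose variance also equals A
x = A_true + rng.normal(0.0, np.sqrt(A_true), size=N)

# Closed-form MLE obtained by setting the derivative of the log-likelihood to zero
A_mle = -0.5 + np.sqrt(0.25 + np.mean(x**2))

# Cross-check: maximize the log-likelihood on a grid of candidate values
def log_lik(A):
    return -N / 2 * np.log(2 * np.pi * A) - np.sum((x - A) ** 2) / (2 * A)

grid = np.linspace(1.0, 8.0, 2001)
A_grid = grid[np.argmax([log_lik(a) for a in grid])]

print("true A      :", A_true)
print("closed-form :", A_mle)
print("grid search :", A_grid)
```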

16 MLE Example
Consider the general linear model x = Hθ + w with w ~ N(0, C), so the pdf is p(x; θ) = (1/((2π)^(N/2) det(C)^(1/2))) exp(−(1/2)(x − Hθ)ᵀ C⁻¹ (x − Hθ)). Taking the logarithm and differentiating gives ∂ln p(x; θ)/∂θ = Hᵀ C⁻¹ (x − Hθ). Setting this to 0 and solving, we get θ̂ = (Hᵀ C⁻¹ H)⁻¹ Hᵀ C⁻¹ x, the same result as the BLUE and the MVU estimator.
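A small sketch (with an arbitrary toy H and C, not from the slides) checking numerically that maximizing the Gaussian log-likelihood of the linear model yields the same estimate as the closed form (Hᵀ C⁻¹ H)⁻¹ Hᵀ C⁻¹ x:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
N = 50
H = np.column_stack([np.ones(N), np.linspace(0, 1, N)])
theta_true = np.array([2.0, -1.0])
C = 0.3 * np.eye(N)                                   # toy noise covariance
x = H @ theta_true + rng.multivariate_normal(np.zeros(N), C)

Ci = np.linalg.inv(C)

def neg_log_lik(theta):
    # Negative Gaussian log-likelihood, with constant terms dropped
    r = x - H @ theta
    return 0.5 * r @ Ci @ r

theta_closed = np.linalg.inv(H.T @ Ci @ H) @ H.T @ Ci @ x
theta_numeric = minimize(neg_log_lik, x0=np.zeros(2), method="Nelder-Mead").x

print("closed form :", theta_closed)
print("numerical ML:", theta_numeric)
```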

17 Application of MLE
In MRI, the acquired complex data are corrupted by Gaussian noise. Taking the magnitude is a non-linear operation, so the magnitude data follow a Rician PDF. Conventional parameter estimates from magnitude data are biased; MLE under the Rician model is used to reduce this bias (a hedged sketch follows). Other applications: communication systems, image classification.
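A hedged illustration of the idea, using scipy's Rice distribution and its generic maximum-likelihood fit; the signal amplitude and noise level are made-up values, and real MRI pipelines use more specialised estimators than this sketch.

```python
import numpy as np
from scipy.stats import rice

rng = np.random.default_rng(7)

# Simulate MRI-like magnitude data: a complex signal of amplitude nu plus
# independent Gaussian noise (std sigma) in the real and imaginary channels.
nu, sigma, N = 3.0, 1.0, 5_000
real = nu + rng.normal(0.0, sigma, N)
imag = rng.normal(0.0, sigma, N)
magnitude = np.hypot(real, imag)                 # Rician-distributed magnitude

# Naive estimate: the sample mean of the magnitude overestimates nu (bias).
naive = magnitude.mean()

# ML fit of the Rician model (shape b = nu/sigma, scale = sigma, loc fixed at 0).
b_hat, loc_hat, scale_hat = rice.fit(magnitude, floc=0)
nu_mle = b_hat * scale_hat

print("true amplitude nu :", nu)
print("naive mean        :", naive)
print("ML estimate of nu :", nu_mle)
```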

18 References
1.) http://la.epfl.ch/files/content/sites/la/files/users/139973/public/Estimation%20theory/EstimationTheory.pdf
2.) http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.95.1905&rep=rep1&type=pdf
3.) http://en.wikipedia.org/wiki/Maximum_likelihood#cite_note-10
4.) https://files.nyu.edu/mrg217/public/mle_introduction1.pdf
5.) Steven M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory.
