STATISTICAL INFERENCE PART I POINT ESTIMATION

STATISTICAL INFERENCE
Statistical inference: determining unknown properties of a probability distribution on the basis of a sample (usually a r.s.) obtained from that distribution.
Point estimation: a single value is proposed as an estimate of the unknown parameter θ.
Interval estimation: an interval (L, U), believed to contain θ, is proposed.
Hypothesis testing: a claim about θ is accepted or rejected on the basis of the sample.

STATISTICAL INFERENCE
Parameter space (Ω): the set of all possible values of an unknown parameter θ; θ ∈ Ω.
A pdf with unknown parameter: f(x; θ), θ ∈ Ω.
Estimation: where in Ω is θ likely to be?
{ f(x; θ), θ ∈ Ω } is the family of pdfs.

STATISTICAL INFERENCE
Statistic: a function of rvs (usually of the sample rvs in an estimation) that does not contain any unknown parameters.
Estimator of an unknown parameter θ: a statistic θ̂ used for estimating θ.
Estimate: an observed value of an estimator, computed from the realized sample.

METHODS OF ESTIMATION
Method of Moments Estimation, Maximum Likelihood Estimation

METHOD OF MOMENTS ESTIMATION (MME)
Let X1, X2, …, Xn be a r.s. from a population with pmf or pdf f(x; θ1, θ2, …, θk). The MMEs are found by equating the first k population moments to the corresponding sample moments and solving the resulting system of equations.
Population moments: μ′j = E[X^j], j = 1, 2, …, k.
Sample moments: Mj = (1/n) Σ Xi^j, j = 1, 2, …, k.

METHOD OF MOMENTS ESTIMATION (MME)
E[X] = (1/n) Σ Xi, E[X²] = (1/n) Σ Xi², E[X³] = (1/n) Σ Xi³, and so on. Continue this until there are enough equations to solve for the unknown parameters.

EXAMPLES
Let X ~ Exp(θ). For a r.s. of size n, find the MME of θ. For the following sample (assuming it is from Exp(θ)), find the estimate of θ: 11.37, 3, 0.15, 4.27, 2.56, 0.59.
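As a sketch, the exponential MME can be computed directly from the data above; this assumes the mean parameterization E[X] = θ (under the rate parameterization the estimate would be 1/x̄ instead):

```python
# MME sketch for Exp(theta), assuming the mean parameterization
# E[X] = theta (under the rate parameterization, theta_hat = 1 / x_bar).
def mme_exponential(sample):
    # Equate the first population moment E[X] = theta to the first
    # sample moment (the sample mean) and solve for theta.
    return sum(sample) / len(sample)

data = [11.37, 3, 0.15, 4.27, 2.56, 0.59]
theta_hat = mme_exponential(data)
print(theta_hat)  # the sample mean, about 3.657
```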

EXAMPLES
Let X ~ N(μ, σ²). For a r.s. of size n, find the MMEs of μ and σ². For the following sample (assuming it is from N(μ, σ²)), find the estimates of μ and σ²: 4.93, 6.82, 3.12, 7.57, 3.04, 4.98, 4.62, 4.84, 2.95, 4.22.
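A minimal sketch for the normal case: equate the first two population moments, E[X] = μ and E[X²] = σ² + μ², to the sample moments and solve.

```python
# MME sketch for N(mu, sigma^2): equate the first two population moments
# E[X] = mu and E[X^2] = sigma^2 + mu^2 to the corresponding sample
# moments and solve the two equations.
def mme_normal(sample):
    n = len(sample)
    m1 = sum(sample) / n                 # first sample moment
    m2 = sum(x * x for x in sample) / n  # second sample moment
    return m1, m2 - m1 * m1              # mu_hat, sigma2_hat

data = [4.93, 6.82, 3.12, 7.57, 3.04, 4.98, 4.62, 4.84, 2.95, 4.22]
mu_hat, sigma2_hat = mme_normal(data)
print(mu_hat, sigma2_hat)  # about 4.709 and 2.143
```

Note that the resulting σ̂² divides by n, not n − 1, so it is not the usual sample variance.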

DRAWBACKS OF MMEs
Even when a parameter is known to be positive valued, its MME can turn out negative.
If the required moments do not exist, MMEs cannot be found.

MAXIMUM LIKELIHOOD ESTIMATION (MLE)
Let X1, X2, …, Xn be a r.s. from a population with pmf or pdf f(x; θ1, θ2, …, θk). The likelihood function is defined by
L(θ1, …, θk | x1, …, xn) = Π f(xi; θ1, θ2, …, θk), i = 1, …, n.

MAXIMUM LIKELIHOOD ESTIMATION (MLE)
For each sample point (x1, …, xn), let (θ̂1, …, θ̂k) be a parameter value at which L(θ1, …, θk | x1, …, xn) attains its maximum as a function of (θ1, …, θk), with (x1, …, xn) held fixed. A maximum likelihood estimator (MLE) of the parameters (θ1, …, θk) based on a sample (X1, …, Xn) is (θ̂1(X1, …, Xn), …, θ̂k(X1, …, Xn)). The MLE is the parameter point for which the observed sample is most likely.

EXAMPLES
Let X ~ Bin(n, p). One observation on X is available, and it is known that n is either 2 or 3 and p = 1/2 or 1/3. Our objective is to estimate the pair (n, p).

x    (2,1/2)  (2,1/3)  (3,1/2)  (3,1/3)   Max. Prob.
0    1/4      4/9      1/8      8/27      4/9   at (2, 1/3)
1    1/2      4/9      3/8      12/27     1/2   at (2, 1/2)
2    1/4      1/9      3/8      6/27      3/8   at (3, 1/2)
3    0        0        1/8      1/27      1/8   at (3, 1/2)

For example, P(X=0 | n=2, p=1/2) = 1/4.
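The probabilities in this example can be reproduced by direct enumeration over the four candidate parameter pairs; a small sketch:

```python
from math import comb

# MLE over a finite parameter set by direct enumeration: for each observed
# x, pick the (n, p) pair that maximizes P(X = x | n, p).
def binom_pmf(x, n, p):
    if x > n:
        return 0.0
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

candidates = [(2, 1/2), (2, 1/3), (3, 1/2), (3, 1/3)]
mle = {x: max(candidates, key=lambda pair: binom_pmf(x, *pair))
       for x in range(4)}
print(mle)  # x=0 -> (2, 1/3); x=1 -> (2, 1/2); x=2 and x=3 -> (3, 1/2)
```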

MAXIMUM LIKELIHOOD ESTIMATION (MLE)
It is usually convenient to work with the logarithm of the likelihood function. Suppose that f(x; θ1, θ2, …, θk) is a positive, differentiable function of θ1, θ2, …, θk. If a supremum exists, it must satisfy the likelihood equations
∂ ln L(θ1, …, θk) / ∂θj = 0, j = 1, 2, …, k.
An MLE occurring at the boundary of Ω cannot be obtained by differentiation; use inspection instead.

MLE
Moreover, you need to verify that you are in fact maximizing the log-likelihood (or likelihood), e.g. by checking that the second derivative is negative at the critical point.

EXAMPLES
1. X ~ Exp(θ), θ > 0. For a r.s. of size n, find the MLE of θ.
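A sketch of this example, assuming the mean parameterization f(x; θ) = (1/θ)e^(−x/θ): the log-likelihood is l(θ) = −n ln θ − Σxi/θ, the likelihood equation gives θ̂ = x̄, and the second derivative at θ̂ is −n/θ̂² < 0, confirming a maximum.

```python
from math import log

# MLE sketch for Exp(theta) with density f(x; theta) = (1/theta) e^(-x/theta),
# theta > 0 (mean parameterization, an assumption). The log-likelihood is
# l(theta) = -n*log(theta) - sum(x_i)/theta, and solving dl/dtheta = 0
# gives theta_hat = x_bar; the second derivative there is -n/theta_hat^2 < 0.
def loglik(theta, sample):
    return -len(sample) * log(theta) - sum(sample) / theta

data = [11.37, 3, 0.15, 4.27, 2.56, 0.59]
theta_hat = sum(data) / len(data)  # closed-form solution of dl/dtheta = 0

# Sanity check: the closed form beats nearby candidate values.
assert all(loglik(theta_hat, data) >= loglik(t, data)
           for t in (0.5, 1.0, 2.0, theta_hat * 0.9, theta_hat * 1.1))
```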

EXAMPLES
2. X ~ N(μ, σ²). For a r.s. of size n, find the MLEs of μ and σ².

EXAMPLES
3. X ~ Uniform(0, θ), θ > 0. For a r.s. of size n, find the MLE of θ.
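Example 3 is the classic boundary case mentioned earlier: L(θ) = θ^(−n) for θ ≥ max(xi) and 0 otherwise, so L is decreasing on the feasible region and the maximum sits at the boundary θ̂ = max(Xi). A simulation sketch (the true θ here is an illustrative value):

```python
import random

# MLE sketch for Uniform(0, theta): the likelihood L(theta) = theta^(-n)
# for theta >= max(x_i) and 0 otherwise, so it is decreasing on the
# feasible region and the maximum is at the boundary: theta_hat = max(x_i).
# Differentiation never finds this boundary solution; inspection does.
random.seed(0)
true_theta = 5.0  # illustrative value for the simulation
sample = [random.uniform(0, true_theta) for _ in range(50)]
theta_hat = max(sample)
print(theta_hat)  # slightly below the true theta; never above it
```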

INVARIANCE PROPERTY OF THE MLE
If θ̂ is the MLE of θ, then for any function τ(θ), the MLE of τ(θ) is τ(θ̂).
Example: X ~ N(μ, σ²). For a r.s. of size n, the MLE of μ is X̄. By the invariance property of the MLE, the MLE of μ² is X̄².

ADVANTAGES OF MLE
Often yields good estimates, especially for large sample sizes. MLEs are usually consistent estimators. MLEs have the invariance property. The asymptotic distribution of the MLE is normal. It is the most widely used estimation technique.

DISADVANTAGES OF MLE
Requires that the form of the pdf or pmf be known, except for the parameter values. The MLE may not exist or may not be unique. The MLE may not be obtainable explicitly (numerical or search methods may be required). It is sensitive to the choice of starting values when numerical methods are used. MLEs can be heavily biased for small samples, and their optimality properties may not apply for small samples.

SOME PROPERTIES OF ESTIMATORS
UNBIASED ESTIMATOR (UE): An estimator θ̂ is a UE of the unknown parameter θ if E[θ̂] = θ. Otherwise, it is a biased estimator of θ.
Bias(θ̂) = E[θ̂] − θ is the bias of θ̂ for estimating θ. If θ̂ is a UE of θ, then Bias(θ̂) = 0.

SOME PROPERTIES OF ESTIMATORS
ASYMPTOTICALLY UNBIASED ESTIMATOR (AUE): An estimator θ̂ is an AUE of the unknown parameter θ if Bias(θ̂) → 0 as n → ∞, i.e. lim E[θ̂] = θ.

SOME PROPERTIES OF ESTIMATORS
CONSISTENT ESTIMATOR (CE): An estimator θ̂ that converges in probability to the unknown parameter θ for all θ ∈ Ω is called a CE of θ. For large n, a CE tends to be closer to the unknown population parameter. MLEs are generally CEs.

EXAMPLE
For a r.s. of size n from a population with finite mean μ, the sample mean X̄ = (1/n) Σ Xi is a CE of μ: by the WLLN, X̄ → μ in probability as n → ∞.
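A simulation sketch of this consistency claim: the sample mean of normal draws concentrates around μ as n grows (a simulation illustrates but does not prove the WLLN).

```python
import random

# Simulation sketch of consistency via the WLLN: the sample mean of
# N(mu, 1) draws concentrates around mu as n grows.
random.seed(1)
mu = 2.0

def sample_mean(n):
    return sum(random.gauss(mu, 1.0) for _ in range(n)) / n

errors = {n: abs(sample_mean(n) - mu) for n in (10, 1000, 100000)}
print(errors)  # the error at n = 100000 is very close to zero
```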

MEAN SQUARED ERROR (MSE)
The mean squared error of an estimator θ̂ for estimating θ is
MSE(θ̂) = E[(θ̂ − θ)²] = Var(θ̂) + [Bias(θ̂)]².
If MSE(θ̂1) is smaller than MSE(θ̂2), then θ̂1 is the better estimator of θ.
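The decomposition MSE = Var + Bias² can be checked numerically; here is a Monte Carlo sketch using the biased MLE of σ² under N(0, 1) as the estimator (choice of estimator and sample size are ours, for illustration):

```python
import random

# Monte Carlo sketch of the decomposition MSE = Var + Bias^2, using the
# biased MLE of sigma^2 under N(0, 1): (1/n) * sum((x - x_bar)^2), whose
# expectation is (n - 1)/n * sigma^2 = 0.8 for n = 5.
random.seed(2)

def sigma2_mle(sample):
    m = sum(sample) / len(sample)
    return sum((x - m) ** 2 for x in sample) / len(sample)

n, reps, true_sigma2 = 5, 20000, 1.0
ests = [sigma2_mle([random.gauss(0, 1) for _ in range(n)]) for _ in range(reps)]
mean_est = sum(ests) / reps
var = sum((e - mean_est) ** 2 for e in ests) / reps
bias = mean_est - true_sigma2            # negative: this MLE underestimates
mse = sum((e - true_sigma2) ** 2 for e in ests) / reps
# The decomposition holds (up to floating-point error) by construction:
print(abs(mse - (var + bias ** 2)))
```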

MEAN SQUARED ERROR CONSISTENCY
Tn is called mean squared error consistent (or consistent in quadratic mean) if E[(Tn − θ)²] → 0 as n → ∞.
Theorem: Tn is consistent in MSE iff i) Var(Tn) → 0 as n → ∞, and ii) Bias(Tn) = E[Tn] − θ → 0 as n → ∞.
If E[(Tn − θ)²] → 0 as n → ∞, then Tn is also a CE of θ.

EXAMPLES
X ~ Exp(θ), θ > 0. For a r.s. of size n, consider the following estimators of θ, and discuss their bias and consistency. Which estimator is better?
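As an illustration of such a comparison (the two estimators below, X̄ and n·min(Xi), are example choices under the mean parameterization, not necessarily the ones intended on this slide), a small Monte Carlo sketch: both are unbiased for θ, but Var(X̄) = θ²/n shrinks while Var(n·min(Xi)) = θ² does not, so only X̄ is consistent.

```python
import random

# Monte Carlo sketch comparing two unbiased estimators of theta for
# Exp(theta) (mean parameterization; illustrative choices):
# T1 = X_bar is consistent with Var = theta^2 / n, while
# T2 = n * min(X_i) is unbiased but Var(T2) = theta^2 for every n,
# so T2 is not consistent.
random.seed(3)
theta, n, reps = 2.0, 100, 5000
t1, t2 = [], []
for _ in range(reps):
    xs = [random.expovariate(1 / theta) for _ in range(n)]
    t1.append(sum(xs) / n)
    t2.append(n * min(xs))

def mse(ests):
    return sum((e - theta) ** 2 for e in ests) / reps

print(mse(t1), mse(t2))  # roughly 0.04 versus roughly 4
```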