Some General Concepts of Point Estimation

1 Some General Concepts of Point Estimation

2 The motivation Suppose we want to estimate a parameter of a single population (e.g. $\mu$ or $\sigma$) based on a random sample $X_1, X_2, \ldots, X_n$, or a parameter involving more than one sample (e.g. $\mu_1 - \mu_2$, the difference between the means for samples 1 and 2). At times we use $\theta$ to represent a generic parameter.

3 Definition of a point estimate
A point estimate of a parameter $\theta$ is a single number that can be regarded as a sensible value for $\theta$. A point estimate is obtained by selecting a suitable statistic and computing its value from the given sample data. The selected statistic is called the point estimator of $\theta$.

4 An example Consider the following observations on dielectric breakdown voltage for 20 pieces of epoxy resin:

24.46 25.61 26.25 26.42 26.66 27.15 27.31 27.54 27.74 27.94
27.98 28.04 28.28 28.49 28.50 28.76 29.11 29.13 29.50 30.88

Estimators and estimates for $\mu$: estimator = $\bar{X}$, estimate $\bar{x} = \sum x_i / n = 555.75/20 = 27.79$.

5 Example (continued) Another estimator is the 10% trimmed mean $\bar{X}_{tr(10)}$, where the smallest 10% and the largest 10% of the data points are deleted, and the remaining observations are averaged. The estimate is $\bar{x}_{tr(10)} = 445.30/16 = 27.83$.
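As a quick check of the arithmetic on this slide and the previous one, here is a minimal sketch (standard library only) computing both estimates from the 20 breakdown-voltage observations:

```python
# Sample mean and 10% trimmed mean for the breakdown-voltage data.
voltages = [24.46, 25.61, 26.25, 26.42, 26.66, 27.15, 27.31, 27.54,
            27.74, 27.94, 27.98, 28.04, 28.28, 28.49, 28.50, 28.76,
            29.11, 29.13, 29.50, 30.88]

mean = sum(voltages) / len(voltages)          # estimate of mu via X-bar

# 10% trimming on n = 20 deletes the 2 smallest and 2 largest values.
k = len(voltages) // 10
trimmed = sorted(voltages)[k:-k]
trimmed_mean = sum(trimmed) / len(trimmed)

print(round(mean, 2), round(trimmed_mean, 2))  # 27.79 27.83
```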

6 Which estimator should we choose?
Each of these estimators uses a different measure of the center of the sample to estimate $\mu$. Which is closest to the true value? We can't answer that without knowing the true value. The better question is: which estimator will tend to produce estimates closest to the true value?

7 Which estimator should we choose? (continued)
In the best of worlds, we would want an estimator $\hat{\theta}$ for which $\hat{\theta} = \theta$ always. However, $\hat{\theta}$ is a random variable, so this is impossible. Instead, we want an estimator for which the estimation error $\hat{\theta} - \theta$ is small. One criterion is to choose an estimator that minimizes the mean square error $\mathrm{MSE} = E[(\hat{\theta} - \theta)^2]$. However, the MSE will generally depend on the value of $\theta$.
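Expanding the square in the MSE gives the standard bias-variance decomposition, which makes the connection to the unbiasedness discussion that follows explicit:

```latex
\mathrm{MSE}(\hat{\theta})
  = E\big[(\hat{\theta} - \theta)^2\big]
  = V(\hat{\theta}) + \big[E(\hat{\theta}) - \theta\big]^2
  = \text{variance} + (\text{bias})^2 .
```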

8 A way out A way around this dilemma is to restrict attention to estimators that have some specified property and then find the best estimator in the restricted class. A popular property is unbiasedness.

9 Unbiasedness: motivation
Suppose we have two measuring instruments: one has been accurately calibrated, but the other systematically gives readings smaller than the true value being measured. The measurements from the first instrument will average out to the true value, and the instrument is called an unbiased instrument. The measurements from the second instrument have a systematic error component, or bias.

10 Definition A point estimator $\hat{\theta}$ is said to be an unbiased estimator of $\theta$ if $E(\hat{\theta}) = \theta$ for every possible value of $\theta$. If $\hat{\theta}$ is not unbiased, the difference $E(\hat{\theta}) - \theta$ is called the bias of $\hat{\theta}$.

11 Do we need to know the parameter to determine unbiasedness?
We typically don’t need to know the parameter to determine whether an estimator is unbiased. For example, for a binomial rv $X$ with parameters $n$ and $p$, the sample proportion $\hat{p} = X/n$ is unbiased, since $E(X/n) = \frac{1}{n}E(X) = \frac{1}{n}(np) = p$.
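A simulation sketch illustrating this: averaged over many replications, $X/n$ lands on $p$ without the simulation ever "knowing" $p$ in advance. The choices of $n$, $p$, and the replication count below are arbitrary illustration values:

```python
import random

# Simulation sketch: the sample proportion p-hat = X/n is unbiased for p.
random.seed(1)
n, p, reps = 50, 0.3, 20000

phats = []
for _ in range(reps):
    x = sum(1 for _ in range(n) if random.random() < p)  # one binomial draw
    phats.append(x / n)

avg = sum(phats) / reps   # should be close to p = 0.3
print(round(avg, 3))
```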

12 Example 2 Suppose that $X$, the reaction time to a certain stimulus, has a uniform distribution on the interval $[0, \theta]$. We might think to estimate $\theta$ using $\hat{\theta} = \max(X_1, \ldots, X_n)$. This estimator must be biased, since all observations are less than or equal to $\theta$. It can be shown that $E[\max(X_1, \ldots, X_n)] = \frac{n}{n+1}\,\theta$.
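The claimed expectation follows from a short computation with the distribution of the maximum of an iid uniform sample:

```latex
F_{\max}(x) = P(\max X_i \le x)
            = P(X_1 \le x)\cdots P(X_n \le x)
            = \left(\frac{x}{\theta}\right)^{n},
\qquad 0 \le x \le \theta,
\qquad\text{so}\qquad
f_{\max}(x) = \frac{n x^{n-1}}{\theta^{n}},
```

```latex
E[\max X_i] = \int_0^{\theta} x \cdot \frac{n x^{n-1}}{\theta^{n}}\,dx
            = \frac{n}{\theta^{n}} \cdot \frac{\theta^{n+1}}{n+1}
            = \frac{n}{n+1}\,\theta .
```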

13 Example 2 (continued) We can easily modify $\max(X_1, \ldots, X_n)$ to get an unbiased estimator for $\theta$: simply take $\hat{\theta} = \frac{n+1}{n}\max(X_1, \ldots, X_n)$, so that $E(\hat{\theta}) = \frac{n+1}{n}\cdot\frac{n}{n+1}\,\theta = \theta$.
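A simulation sketch makes the bias and its correction visible; the values of $\theta$, $n$, and the replication count are arbitrary illustration choices:

```python
import random

# Simulation sketch: the sample maximum underestimates theta, while the
# rescaled estimator (n+1)/n * max is (approximately) unbiased.
random.seed(2)
theta, n, reps = 1.0, 5, 20000

max_avg = 0.0
adj_avg = 0.0
for _ in range(reps):
    m = max(random.uniform(0, theta) for _ in range(n))
    max_avg += m / reps                 # averages near n/(n+1) * theta
    adj_avg += (n + 1) / n * m / reps   # averages near theta

print(round(max_avg, 3))  # near n/(n+1) = 5/6, i.e. about 0.833
print(round(adj_avg, 3))  # near theta = 1.0
```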

14 Principle of unbiasedness
When choosing among several different estimators of $\theta$, select one that is unbiased.

15 Proposition Let $X_1, X_2, \ldots, X_n$ be a random sample from a distribution with mean $\mu$ and variance $\sigma^2$. Then the estimator $S^2 = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2}{n-1}$ is unbiased for estimating $\sigma^2$.

16 Proof of proposition Recall that $\sigma^2 = V(X) = E(X^2) - [E(X)]^2$, so $E(X_i^2) = \sigma^2 + \mu^2$, and since $V(\bar{X}) = \sigma^2/n$, also $E(\bar{X}^2) = \sigma^2/n + \mu^2$. Then, using the identity $\sum(X_i - \bar{X})^2 = \sum X_i^2 - n\bar{X}^2$, $E(S^2) = \frac{1}{n-1}\left[\sum E(X_i^2) - n\,E(\bar{X}^2)\right]$ ...

17 Proof of proposition (continued)
... $= \frac{1}{n-1}\left[n(\sigma^2 + \mu^2) - n\left(\frac{\sigma^2}{n} + \mu^2\right)\right] = \frac{n\sigma^2 - \sigma^2}{n-1} = \frac{(n-1)\sigma^2}{n-1} = \sigma^2$, as desired.

18 The estimator that has n as the divisor
The estimator $\hat{\sigma}^2 = \frac{\sum(X_i - \bar{X})^2}{n}$ then has expectation $E(\hat{\sigma}^2) = \frac{n-1}{n}E(S^2) = \frac{n-1}{n}\sigma^2$. Its bias is $\frac{n-1}{n}\sigma^2 - \sigma^2 = -\frac{\sigma^2}{n}$.
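The two divisors can be compared by simulation; `statistics.variance` uses the $n-1$ divisor and `statistics.pvariance` uses $n$, and the values of $\mu$, $\sigma$, $n$, and the replication count are arbitrary illustration choices:

```python
import random
from statistics import variance, pvariance

# Simulation sketch: the n-1 divisor is unbiased for sigma^2, while the
# n divisor has expectation (n-1)/n * sigma^2.
random.seed(3)
mu, sigma, n, reps = 0.0, 2.0, 5, 20000

s2_avg = 0.0   # average of S^2 (divisor n-1)
v_avg = 0.0    # average of sigma-hat^2 (divisor n)
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    s2_avg += variance(xs) / reps
    v_avg += pvariance(xs) / reps

print(round(s2_avg, 2))  # near sigma^2 = 4
print(round(v_avg, 2))   # near (n-1)/n * sigma^2 = 3.2
```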

19 Is S unbiased for the population standard deviation?
Unfortunately, though $S^2$ is unbiased for $\sigma^2$, $S$ is not unbiased for $\sigma$. Taking the square root messes up the property of unbiasedness: in fact $E(S) < \sigma$.
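A simulation sketch of this underestimation (the values of $\sigma$, $n$, and the replication count are arbitrary illustration choices; for normal samples with $n = 5$, $E(S)$ is roughly $0.94\,\sigma$):

```python
import random
from statistics import stdev

# Simulation sketch: E(S) < sigma even though E(S^2) = sigma^2.
random.seed(4)
sigma, n, reps = 2.0, 5, 20000

s_avg = 0.0
for _ in range(reps):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    s_avg += stdev(xs) / reps   # sample standard deviation S

print(round(s_avg, 3))  # noticeably below sigma = 2
```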

20 Proposition If $X_1, X_2, \ldots, X_n$ is a random sample from a distribution with mean $\mu$, then $\bar{X}$ is an unbiased estimator of $\mu$. If in addition the distribution is continuous and symmetric, then the sample median $\tilde{X}$ and any trimmed mean are also unbiased estimators of $\mu$.

21 The principle of minimum variance
Among all estimators of $\theta$ that are unbiased, choose the one that has minimum variance. The resulting $\hat{\theta}$ is called the minimum variance unbiased estimator (MVUE) of $\theta$.

22 Example 2 again We argued that for a random sample from the uniform distribution on $[0, \theta]$, $\frac{n+1}{n}\max(X_1, \ldots, X_n)$ is unbiased for $\theta$. Since $E(\bar{X}) = \theta/2$, $2\bar{X}$ is also unbiased for $\theta$.

23 Example continued Now $V(2\bar{X}) = \frac{\theta^2}{3n}$ (Exercise 32) and $V\!\left(\frac{n+1}{n}\max(X_1, \ldots, X_n)\right) = \frac{\theta^2}{n(n+2)}$. As long as $n(n+2) > 3n$, or $n > 1$, the estimator based on the maximum has the smaller variance. But how do we show that it has the minimum variance of all unbiased estimators? Results on MVUEs for certain distributions have been derived, the most important of which follows.

24 Theorem Let $X_1, \ldots, X_n$ be a random sample from a normal distribution with parameters $\mu$ and $\sigma$. Then the estimator $\hat{\mu} = \bar{X}$ is the MVUE for $\mu$.

25 Some complications Note that the last theorem doesn’t say that $\bar{X}$
should be used to estimate $\mu$ for any distribution. For a heavy-tailed distribution like the Cauchy, with density $f(x) = \frac{1}{\pi[1 + (x - \theta)^2]}$, $-\infty < x < \infty$, one is better off using the sample median $\tilde{X}$ (the MVUE is not known).

26 Reporting a point estimate: the standard error
The standard error $\sigma_{\hat{\theta}}$ of an estimator $\hat{\theta}$ is its standard deviation. The standard error gives an idea of a typical deviation of the estimator from its mean. When $\hat{\theta}$ has approximately a normal distribution, we can say with reasonable confidence that the true value of $\theta$ lies within approximately 2 standard errors of $\hat{\theta}$.
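For the sample mean, the standard error $\sigma/\sqrt{n}$ is estimated by $s/\sqrt{n}$. A sketch applying this to the breakdown-voltage data from the earlier example:

```python
from math import sqrt
from statistics import mean, stdev

# Estimated standard error of X-bar for the breakdown-voltage data: s / sqrt(n).
voltages = [24.46, 25.61, 26.25, 26.42, 26.66, 27.15, 27.31, 27.54,
            27.74, 27.94, 27.98, 28.04, 28.28, 28.49, 28.50, 28.76,
            29.11, 29.13, 29.50, 30.88]

xbar = mean(voltages)
s = stdev(voltages)            # sample standard deviation (n-1 divisor)
se = s / sqrt(len(voltages))   # estimated standard error of X-bar

print(round(xbar, 2), round(s, 3), round(se, 3))
# Roughly: xbar near 27.79, s near 1.46, se near 0.33,
# so mu plausibly lies within about 2 * 0.33 of 27.79.
```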

