Statistical Estimation and Sampling Distributions


1 Statistical Estimation and Sampling Distributions
Topic 7 Statistical Estimation and Sampling Distributions

2 Statistical Inference
Statistical inference is the investigation of the properties of unknown population parameters (estimation) based on statistics computed from a sample drawn from that population.

3 Point Estimates
A parameter, denoted by θ, is an unknown property of a population: for example, the mean, variance, proportion, or a particular quantile of its probability distribution. A statistic is a property of a sample: for example, the sample mean, sample variance, sample proportion, or a particular sample quantile. Estimation is the procedure by which the information contained in a sample is used to investigate properties of the population from which the sample is drawn.

4 Point Estimates of Parameters
A point estimate of an unknown parameter θ is a statistic θ̂ that represents a "best guess" at the value of θ. There may be more than one sensible point estimate of a parameter. For example, population parameters are estimated with the corresponding sample statistics:
Mean (μ) → X̄
Standard deviation (σ) → S
Proportion (p) → p̂
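A minimal Python sketch (added for illustration; the sample values are made up) showing how each of these point estimates is computed from a sample:

```python
import math

# Hypothetical sample data, made up for this example
sample = [4.2, 5.1, 3.8, 4.9, 5.5, 4.4, 5.0, 4.7]
n = len(sample)

x_bar = sum(sample) / n                                # estimates the mean mu
s2 = sum((x - x_bar) ** 2 for x in sample) / (n - 1)   # sample variance
s = math.sqrt(s2)                                      # estimates sigma
p_hat = sum(1 for x in sample if x > 4.5) / n          # estimates a proportion p

print(f"x_bar = {x_bar:.3f}, s = {s:.3f}, p_hat = {p_hat:.3f}")
```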

5 Unbiased and Biased Point Estimates
A point estimate θ̂ for a parameter θ is said to be unbiased if

E(θ̂) = θ.

Unbiasedness is a very good property for a point estimate to possess. If a point estimate is not unbiased, then its bias is defined to be

bias = E(θ̂) − θ.

6 Point Estimate of a Success Probability
The obvious point estimate of p is p̂ = X/n. Notice that the number of successes X has a binomial distribution, X ~ B(n, p). Therefore E(X) = np, so that

E(p̂) = E(X)/n = np/n = p,

and p̂ is indeed an unbiased point estimate of p.
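A quick simulation sketch (not part of the original slides; the parameters are arbitrary) illustrates this unbiasedness: the average of p̂ over many repeated binomial experiments comes out very close to the true p.

```python
import random

random.seed(0)
n, p, trials = 50, 0.3, 100_000

total = 0.0
for _ in range(trials):
    x = sum(1 for _ in range(n) if random.random() < p)  # X ~ B(n, p)
    total += x / n                                       # p_hat = X / n
print(f"average p_hat over {trials} trials: {total / trials:.4f} (true p = {p})")
```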

7 Point Estimate of a Population Mean
The obvious point estimate of a population mean μ is the sample mean X̄ = (X1 + … + Xn)/n. Clearly it is unbiased, since E(Xi) = μ for each observation [remember the simple cases of a fair coin (n = 2, μ = p = 1/2) and a fair die (n = 6, μ = p = 1/6)], so that

E(X̄) = (E(X1) + … + E(Xn))/n = nμ/n = μ.

Then X̄ is indeed an unbiased point estimate of μ.

8 Point Estimate of a Population Variance
We know that the sample variance is

S² = Σ(Xi − X̄)²/(n − 1).

Then

(n − 1) E(S²) = E(Σ(Xi − X̄)²) = E(ΣXi²) − n E(X̄²).

9 Point Estimate of a Population Variance
Since E(Xi²) = Var(Xi) + E(Xi)² = σ² + μ², then

E(ΣXi²) = n(σ² + μ²).

10 Point Estimate of a Population Variance
We notice that

E(X̄²) = Var(X̄) + E(X̄)² = σ²/n + μ².

Remember, Var(X̄) = σ²/n and E(X̄) = μ.

11 Point Estimate of a Population Variance
Putting this all together gives

(n − 1) E(S²) = n(σ² + μ²) − n(σ²/n + μ²) = (n − 1)σ²,

so that E(S²) = σ², and the sample variance S² is an unbiased point estimate of the population variance σ².
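A simulation sketch (illustrative only, with arbitrary parameters) confirms this: the n − 1 divisor makes S² average out to σ², while dividing by n underestimates it.

```python
import random

random.seed(1)
mu, sigma2, n, trials = 5.0, 4.0, 10, 50_000

sum_unbiased = 0.0  # divisor n - 1
sum_biased = 0.0    # divisor n
for _ in range(trials):
    xs = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    x_bar = sum(xs) / n
    ss = sum((x - x_bar) ** 2 for x in xs)
    sum_unbiased += ss / (n - 1)
    sum_biased += ss / n

print(f"divisor n-1: {sum_unbiased / trials:.3f} (true sigma^2 = {sigma2})")
print(f"divisor n:   {sum_biased / trials:.3f} (biased low)")
```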

12 Minimum Variance Estimates
The best situation is to construct a point estimate that is unbiased and that also has the smallest possible variance. An unbiased point estimate that has a smaller variance than any other unbiased point estimate is called a minimum variance unbiased estimate (MVUE). The relative efficiency of an unbiased point estimate θ̂1 to an unbiased point estimate θ̂2 is given by

relative efficiency = Var(θ̂2)/Var(θ̂1).
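As a sketch of relative efficiency (an added example, not from the slides): for normal data both the sample mean and the sample median are unbiased estimates of μ, and simulation shows the mean has the smaller variance.

```python
import random
import statistics

random.seed(2)
n, trials = 25, 20_000

means, medians = [], []
for _ in range(trials):
    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    means.append(sum(xs) / n)
    medians.append(statistics.median(xs))

var_mean = statistics.pvariance(means)
var_median = statistics.pvariance(medians)
# For large n this ratio approaches pi/2 = 1.57, so the mean is more efficient
print(f"relative efficiency of mean to median: {var_median / var_mean:.2f}")
```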

13 Mean Square Errors In the case that two point estimates have different expectations and different variances, we prefer the point estimate that minimizes the value of mean square error (MSE) which is defined to be

14 Exercises
Suppose that E(X1) = μ, Var(X1) = 10, E(X2) = μ, and Var(X2) = 15, and consider the point estimates …
a. Calculate the bias of each point estimate. Is any one of them unbiased?
b. Calculate the variance of each point estimate. Which one has the smallest variance?
c. Calculate the mean square error of each point estimate. Which point estimate has the smallest mean square error when μ = 8?
d. What is the relative efficiency of … to the point estimate of …?

15 Exercise Solution

16 Exercise Solution b.

17 Exercise Solution c. d.

18 Sampling Distributions
Since the summary measures of one sample vary from those of another sample, we need to consider the probability distributions, or sampling distributions, of the sample mean X̄, the sample variance S², and the sample proportion p̂.

19 Sample Means
If X1, …, Xn are observations from a population with a mean μ and a variance σ², then the central limit theorem indicates that the sample mean has the approximate distribution

X̄ ~ N(μ, σ²/n).

The standard deviation of the sample mean is referred to as the standard error (SE):

SE(X̄) = σ/√n.

Since the standard deviation σ is usually unknown, it can be replaced by S.
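An illustrative simulation sketch (arbitrary parameters) of the central limit theorem: sample means from a clearly non-normal (exponential) population still have mean μ and standard error σ/√n.

```python
import random
import statistics

random.seed(3)
n, trials = 40, 20_000
mu, sigma = 1.0, 1.0  # an Exponential(1) population has mean 1 and sd 1

sample_means = []
for _ in range(trials):
    xs = [random.expovariate(1.0) for _ in range(n)]
    sample_means.append(sum(xs) / n)

print(f"mean of X-bar: {statistics.mean(sample_means):.3f} (theory: {mu})")
print(f"SE of X-bar:   {statistics.stdev(sample_means):.3f} (theory: {sigma / n ** 0.5:.3f})")
```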

20 Sample Variances
If X1, …, Xn are normally distributed with a mean μ and a variance σ², then the sample variance S² has the distribution

(n − 1)S²/σ² ~ χ²(n−1),

where χ²(n−1) is a chi-square distribution with n − 1 degrees of freedom. In the case that the variance is unknown, if X1, …, Xn are normally distributed with a mean μ, then

√n (X̄ − μ)/S ~ t(n−1),

where t(n−1) is Student's t distribution with n − 1 degrees of freedom.
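A simulation sketch (again with arbitrary parameters): the simulated values of (n − 1)S²/σ² have mean n − 1 and variance 2(n − 1), as a χ²(n−1) distribution requires.

```python
import random
import statistics

random.seed(4)
n, sigma2, trials = 8, 2.0, 20_000

values = []
for _ in range(trials):
    xs = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)]
    values.append((n - 1) * statistics.variance(xs) / sigma2)

print(f"mean:     {statistics.mean(values):.2f} (chi-square theory: {n - 1})")
print(f"variance: {statistics.pvariance(values):.2f} (chi-square theory: {2 * (n - 1)})")
```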

21 Sample Proportions
If X ~ B(n, p), then the sample proportion p̂ = X/n has the approximate distribution

p̂ ~ N(p, p(1 − p)/n).

The standard error of the sample proportion is

SE(p̂) = √(p̂(1 − p̂)/n).
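A one-function sketch of the standard error formula (the counts here are made up):

```python
import math

def proportion_se(p_hat: float, n: int) -> float:
    """Standard error of a sample proportion: sqrt(p_hat (1 - p_hat) / n)."""
    return math.sqrt(p_hat * (1.0 - p_hat) / n)

# e.g. 45 defectives observed in a sample of 300 items (made-up numbers)
print(f"SE = {proportion_se(45 / 300, 300):.4f}")
```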

22 Exercises
1. The capacitances of certain electronic components have a normal distribution with a mean μ = 174 and a standard deviation σ = 2.8. If an engineer randomly selects a sample of n = 30 components and measures their capacitances, what is the probability that the engineer's point estimate of the mean μ will be within the interval (173, 175)?
2. A scientist reports that the proportion of defective items from a process is 12.6%. If the scientist's estimate is based on the examination of a random sample of 360 items from the process, what is the standard error of the scientist's estimate?
3. The pH levels of food items prepared in a certain way are normally distributed with a standard deviation of σ = …. An experimenter estimates the mean pH level by averaging the pH levels of a random sample of n items. What sample size n is needed to ensure that there is a probability of at least 99% that the experimenter's estimate is within 0.5 of the true mean value?

23 Exercise Solutions
1. Recall that X̄ ~ N(μ, σ²/n), so that SE(X̄) = 2.8/√30 = 0.511. Then (look up the table!)

P(173 ≤ X̄ ≤ 175) = Φ((175 − 174)/0.511) − Φ((173 − 174)/0.511) = Φ(1.96) − Φ(−1.96) ≈ 0.95.

2. SE(p̂) = √(p̂(1 − p̂)/n) = √(0.126 × 0.874/360) = 0.0175.

24 Exercise Solutions
3. Recall that X̄ ~ N(μ, σ²/n). The estimate is within 0.5 of the true mean value with probability at least 99% when

P(|X̄ − μ| ≤ 0.5) = 2Φ(0.5√n/σ) − 1 ≥ 0.99,

that is, when 0.5√n/σ ≥ z(0.005) = 2.576, so that n ≥ (2.576σ/0.5)².
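The first two answers can be checked with a short Python sketch (standard library only):

```python
import math
from statistics import NormalDist

z = NormalDist()  # standard normal distribution

# Exercise 1: P(173 < X-bar < 175) with mu = 174, sigma = 2.8, n = 30
se = 2.8 / math.sqrt(30)
prob = z.cdf((175 - 174) / se) - z.cdf((173 - 174) / se)
print(f"P(173 < X-bar < 175) = {prob:.4f}")  # about 0.95

# Exercise 2: standard error of p-hat = 0.126 with n = 360
print(f"SE(p-hat) = {math.sqrt(0.126 * 0.874 / 360):.4f}")  # about 0.0175
```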

25 Maximum Likelihood Estimates
We have considered the obvious point estimates for a success probability and for a population mean and variance. However, it is often of interest to estimate parameters that require less obvious point estimates. For example, how should the parameters of the Poisson, exponential, beta, or gamma distributions be estimated? Maximum likelihood estimation is one of the more general and more technical methods of obtaining point estimates.

26 Maximum Likelihood Estimate for One Parameter
If a data set consists of observations x1, x2, …, xn from a probability distribution f(x, θ) depending upon one unknown parameter θ, the maximum likelihood estimate θ̂ of the parameter is found by maximizing the likelihood function

L(θ) = f(x1, θ) f(x2, θ) ⋯ f(xn, θ).

In practice, the maximization of the likelihood function is usually performed by taking the derivative of the natural log of the likelihood function.
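A sketch of this recipe in Python (the data and the exponential model are invented for illustration): maximize the log-likelihood numerically and compare with the closed-form answer.

```python
import math

# Hypothetical data modeled as Exponential(lam), f(x, lam) = lam * exp(-lam * x)
xs = [0.8, 1.6, 0.3, 2.4, 0.9, 1.1, 0.5, 1.9]

def log_likelihood(lam: float) -> float:
    """ln L(lam) = sum over observations of ln(lam) - lam * x."""
    return sum(math.log(lam) - lam * x for x in xs)

# Crude numeric maximization over a grid of candidate parameter values
candidates = [k / 1000 for k in range(1, 5000)]
lam_hat = max(candidates, key=log_likelihood)

print(f"numeric MLE:     {lam_hat:.3f}")
print(f"closed-form MLE: {len(xs) / sum(xs):.3f}")  # 1 / x-bar for this model
```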

27 Example
Suppose again that x1, x2, …, xn are a set of Bernoulli observations, with each taking the value 1 (success) with probability p and the value 0 (no success) with probability 1 − p. We can write this as

f(x, p) = p^x (1 − p)^(1−x) for x = 0, 1.

The likelihood function is therefore

L(p) = p^x (1 − p)^(n−x),

where x = x1 + x2 + … + xn, and the maximum likelihood estimate p̂ is the value that maximizes this.

28 Example
The log-likelihood is

ln L(p) = x ln(p) + (n − x) ln(1 − p),

and

d ln L(p)/dp = x/p − (n − x)/(1 − p).

Setting this expression equal to 0 and solving for p produces

p̂ = x/n.

29 Maximum Likelihood Estimates for Two Parameters
If a data set consists of observations x1, x2, …, xn from a probability distribution f(x, θ1, θ2) depending upon two unknown parameters, the maximum likelihood estimates θ̂1 and θ̂2 are the values of the parameters that jointly maximize the likelihood function

L(θ1, θ2) = f(x1, θ1, θ2) f(x2, θ1, θ2) ⋯ f(xn, θ1, θ2).

Again, the best way to perform the joint maximization is usually to take derivatives of the log-likelihood with respect to θ1 and θ2 and to set the two resulting expressions equal to 0.

30 Example
The normal distribution is an example of a distribution with two parameters, with probability density function

f(x; μ, σ²) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²)).

The likelihood of a set of normal observations is therefore

L(μ, σ²) = (2πσ²)^(−n/2) exp(−Σ(xi − μ)²/(2σ²)).

31 Example
So that the log-likelihood is

ln L = −(n/2) ln(2πσ²) − Σ(xi − μ)²/(2σ²).

Taking derivatives with respect to the parameter values μ and σ² gives

d ln(L)/dμ = Σ(xi − μ)/σ²
d ln(L)/dσ² = −n/(2σ²) + Σ(xi − μ)²/(2σ⁴).

32 Example
Setting d ln(L)/dμ = 0 gives

μ̂ = (x1 + … + xn)/n = x̄,

and setting d ln(L)/dσ² = 0 then gives

σ̂² = Σ(xi − x̄)²/n.

Did you see any difference from the variance estimate that we discussed before? (Notice the divisor is n rather than n − 1.)
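A short sketch comparing the two estimates on made-up data:

```python
# Hypothetical normal observations, made up for this example
xs = [4.1, 5.3, 4.8, 5.9, 4.4, 5.1, 4.6, 5.5]
n = len(xs)

mu_hat = sum(xs) / n                 # MLE of mu: the sample mean
ss = sum((x - mu_hat) ** 2 for x in xs)
sigma2_mle = ss / n                  # MLE of sigma^2: divisor n
s2 = ss / (n - 1)                    # unbiased sample variance

print(f"mu_hat = {mu_hat:.3f}")
print(f"MLE sigma^2 = {sigma2_mle:.3f}  vs  unbiased S^2 = {s2:.3f}")
```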

33 Exercises
Suppose that the quality inspector at a glass manufacturing company inspects 30 randomly selected sheets of glass and records the number of flaws found in each sheet. The data values are as follows:
0, 1, 1, 1, 0, 0, 0, 2, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 2, 0, 0, 3, 1, 2, 0, 0, 1, 0, 0
If the distribution of the number of flaws per sheet is taken to have a Poisson distribution, how should the parameter λ of the Poisson distribution be estimated, and what is its value?

34 Exercise Solutions
We estimate the parameter λ by maximum likelihood. The probability mass function of the data is

f(x, λ) = e^(−λ) λ^x / x!,

so that the likelihood is

L(λ) = ∏ e^(−λ) λ^(xi) / xi! = e^(−nλ) λ^(Σxi) / ∏ xi!.

The log-likelihood is therefore

ln L(λ) = −nλ + (Σxi) ln(λ) − Σ ln(xi!).

Taking its derivative w.r.t. λ and setting it to zero, we get

d ln L/dλ = −n + (Σxi)/λ = 0.

35 Exercise Solutions
Therefore

λ̂ = (Σxi)/n = x̄ = 17/30 = 0.567.

Since the variance of each data value is λ, Var(λ̂) = Var(X̄) = λ/n, and the standard error of the estimate of a Poisson parameter is

SE(λ̂) = √(λ̂/n) = √(0.567/30) = 0.137.
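The calculation can be reproduced with a few lines of Python using the flaw counts from the exercise:

```python
import math

# Flaw counts from the 30 inspected glass sheets (slide 33)
flaws = [0, 1, 1, 1, 0, 0, 0, 2, 0, 1, 0, 1, 0, 0, 0,
         0, 0, 1, 0, 2, 0, 0, 3, 1, 2, 0, 0, 1, 0, 0]

n = len(flaws)
lam_hat = sum(flaws) / n        # MLE of lambda: the sample mean, 17/30
se = math.sqrt(lam_hat / n)     # standard error sqrt(lambda-hat / n)

print(f"lambda-hat = {lam_hat:.3f}")  # 0.567
print(f"SE         = {se:.3f}")       # 0.137
```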

36 Any Questions?

