Presentation on theme: "Chapter 7. Statistical Estimation and Sampling Distributions"— Presentation transcript:
1 Chapter 7. Statistical Estimation and Sampling Distributions
7.1 Point Estimates
7.2 Properties of Point Estimates
7.3 Sampling Distributions
7.4 Constructing Parameter Estimates
7.5 Supplementary Problems
2 7.1 Point Estimates 7.1.1 Parameters In statistical inference, the term parameter is used to denote a quantity, say $\theta$, that is a property of an unknown probability distribution: for example, the mean, the variance, or a particular quantile of the probability distribution. Parameters are unknown, and one of the goals of statistical inference is to estimate them.
3 Figure 7.1 The relationship between a point estimate and an unknown parameter θ
4 Figure 7.2 Estimation of the population mean by the sample mean
5 7.1.2 Statistics In statistical inference, the term statistic is used to denote a quantity that is a property of a sample. Statistics are functions of a random sample: for example, the sample mean, the sample variance, or a particular sample quantile. Statistics are random variables whose observed values can be calculated from a set of observed data.
6 7.1.3 Estimation Estimation is a procedure of "guessing" properties of the population from which data are collected. A point estimate $\hat{\theta}$ of an unknown parameter $\theta$ is a statistic that represents a "guess" at the value of $\theta$. Example 1 (Machine breakdowns): how can we estimate P(machine breakdown due to operator misuse)? Example 2 (Rolling mill scrap): how can we estimate the mean and variance of the probability distribution of % scrap?
7 7.2 Properties of Point Estimates 7.2.1 Unbiased Estimates (1/5) Definitions: a point estimate $\hat{\theta}$ for a parameter $\theta$ is said to be unbiased if $E(\hat{\theta}) = \theta$. If a point estimate is not unbiased, then its bias is defined to be $\mathrm{bias} = E(\hat{\theta}) - \theta$.
8 7.2.1 Unbiased Estimates (2/5) Point estimate of a success probability: if $X \sim B(n, p)$ counts the successes in $n$ independent trials, then $\hat{p} = X/n$ is an unbiased estimate of $p$, since $E(\hat{p}) = E(X)/n = np/n = p$.
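The unbiasedness of $\hat{p} = X/n$ can be illustrated with a small simulation. This sketch is not part of the original slides; the values of p, n, the number of trials, and the seed are arbitrary demo choices.

```python
import numpy as np

# Draw many binomial counts X ~ B(n, p) and check that the average of
# p_hat = X/n is close to p, illustrating E(p_hat) = p.
rng = np.random.default_rng(6)
p, n, trials = 0.25, 40, 200_000

p_hat = rng.binomial(n, p, size=trials) / n
print(p_hat.mean())  # close to p = 0.25
```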
9 7.2.1 Unbiased Estimates (3/5) Point estimate of a population mean: if $X_1, \ldots, X_n$ is a sample from a distribution with mean $\mu$, then the sample mean $\bar{X} = (X_1 + \cdots + X_n)/n$ is an unbiased estimate of $\mu$, since $E(\bar{X}) = (E(X_1) + \cdots + E(X_n))/n = n\mu/n = \mu$.
10 7.2.1 Unbiased Estimates (4/5) Point estimate of a population variance: the sample variance $S^2 = \sum_{i=1}^n (X_i - \bar{X})^2 / (n-1)$ is an unbiased estimate of $\sigma^2$, i.e. $E(S^2) = \sigma^2$. Dividing by $n$ instead of $n-1$ gives a biased estimate whose expectation is $(n-1)\sigma^2/n$.
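Why the $n-1$ divisor matters can be seen by simulation. This sketch is not from the original slides; the normal population parameters and the seed are arbitrary demo choices.

```python
import numpy as np

# Draw many samples of size n from N(mu, sigma^2) and average the variance
# estimates: dividing by n-1 (ddof=1) is unbiased for sigma^2 = 4, while
# dividing by n (ddof=0) averages out to (n-1)/n * sigma^2 = 3.6.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 5.0, 2.0, 10, 20_000

samples = rng.normal(mu, sigma, size=(trials, n))
s2_unbiased = samples.var(axis=1, ddof=1)  # divide by n-1
s2_biased = samples.var(axis=1, ddof=0)    # divide by n

print(s2_unbiased.mean())  # close to 4.0
print(s2_biased.mean())    # close to 3.6
```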
14 7.2.2 Minimum Variance Estimates (3/4) An unbiased point estimate whose variance is smaller than that of any other unbiased point estimate is called a minimum variance unbiased estimate (MVUE). Related notions: the relative efficiency of one unbiased estimate to another, and the mean squared error (MSE). How is the MSE decomposed? Why is it useful?
16 Example: two independent measurements. Let $\hat{\mu}_1$ and $\hat{\mu}_2$ be two independent point estimates of an unknown parameter $\mu$, with variances $\sigma_1^2$ and $\sigma_2^2$. They are both unbiased estimates since $E(\hat{\mu}_1) = E(\hat{\mu}_2) = \mu$. The relative efficiency of $\hat{\mu}_1$ to $\hat{\mu}_2$ is $\mathrm{Var}(\hat{\mu}_2)/\mathrm{Var}(\hat{\mu}_1) = \sigma_2^2/\sigma_1^2$. Let us consider a new estimate $\hat{\mu} = p\hat{\mu}_1 + (1-p)\hat{\mu}_2$. This estimate is unbiased since $E(\hat{\mu}) = pE(\hat{\mu}_1) + (1-p)E(\hat{\mu}_2) = p\mu + (1-p)\mu = \mu$. What is the optimal value of $p$ that results in $\hat{\mu}$ having the smallest possible mean squared error (MSE)?
17 Let the variance of $\hat{\mu} = p\hat{\mu}_1 + (1-p)\hat{\mu}_2$ be given by $\mathrm{Var}(\hat{\mu}) = p^2\sigma_1^2 + (1-p)^2\sigma_2^2$. Differentiating with respect to $p$ yields $\frac{d}{dp}\mathrm{Var}(\hat{\mu}) = 2p\sigma_1^2 - 2(1-p)\sigma_2^2$. Setting this to zero, the value of $p$ that minimizes the variance is $p = \sigma_2^2/(\sigma_1^2 + \sigma_2^2)$. Therefore, in this example, the variance of the resulting estimate is $\mathrm{Var}(\hat{\mu}) = \sigma_1^2\sigma_2^2/(\sigma_1^2 + \sigma_2^2)$.
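The calculus above can be checked numerically. This sketch is not from the original slides; the two component variances are made-up demo values.

```python
import numpy as np

# Grid search over p for the variance p^2*s1^2 + (1-p)^2*s2^2 of the
# combined unbiased estimate; the minimizer should be
# p* = s2^2/(s1^2 + s2^2), with minimum variance s1^2*s2^2/(s1^2 + s2^2).
s1_sq, s2_sq = 4.0, 1.0

p_grid = np.linspace(0.0, 1.0, 100_001)
variance = p_grid**2 * s1_sq + (1 - p_grid) ** 2 * s2_sq

p_best = p_grid[np.argmin(variance)]
p_star = s2_sq / (s1_sq + s2_sq)           # closed form: 0.2
min_var = s1_sq * s2_sq / (s1_sq + s2_sq)  # closed form: 0.8

print(p_best, variance.min())
```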
18 The relative efficiency of $\hat{\mu}_1$ to the combined estimate $\hat{\mu}$ is $\mathrm{Var}(\hat{\mu})/\mathrm{Var}(\hat{\mu}_1) = \sigma_2^2/(\sigma_1^2 + \sigma_2^2)$. In general, assuming that we have $n$ independent and unbiased estimates $\hat{\theta}_1, \ldots, \hat{\theta}_n$ having variances $\sigma_1^2, \ldots, \sigma_n^2$, respectively, for a parameter $\theta$, we can set the unbiased estimator as $\hat{\theta} = \left(\sum_{i=1}^n \hat{\theta}_i/\sigma_i^2\right) / \left(\sum_{j=1}^n 1/\sigma_j^2\right)$. The variance of this estimator is $\mathrm{Var}(\hat{\theta}) = 1 / \sum_{i=1}^n (1/\sigma_i^2)$.
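The inverse-variance weighting rule can be illustrated by simulation. This sketch is not from the original slides; the three component standard deviations and the seed are arbitrary demo choices.

```python
import numpy as np

# Combine three independent unbiased estimates with inverse-variance
# weights; the combined estimator stays unbiased and its variance should
# be close to the theoretical value 1 / sum(1/sigma_i^2).
rng = np.random.default_rng(1)
theta = 10.0
sigmas = np.array([1.0, 2.0, 0.5])
weights = (1 / sigmas**2) / np.sum(1 / sigmas**2)

trials = 200_000
estimates = rng.normal(theta, sigmas, size=(trials, 3))
combined = estimates @ weights

theory_var = 1 / np.sum(1 / sigmas**2)  # = 1/5.25, about 0.1905
print(combined.mean(), combined.var())
```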
19 Mean squared error (MSE): let us consider a point estimate $\hat{\theta}$ of a parameter $\theta$. Then the mean squared error is defined by $\mathrm{MSE}(\hat{\theta}) = E[(\hat{\theta} - \theta)^2]$. Moreover, notice that $\mathrm{MSE}(\hat{\theta}) = \mathrm{Var}(\hat{\theta}) + \mathrm{bias}(\hat{\theta})^2$, so the MSE accounts for both the variance and the bias of a point estimate.
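The decomposition MSE = variance + bias² can be verified empirically using the biased (divide-by-$n$) variance estimate from earlier as the point estimate. This sketch is not from the original slides; the population parameters and the seed are arbitrary demo choices.

```python
import numpy as np

# Use the biased variance estimate (ddof=0) as theta_hat for sigma^2 = 4
# and check that the empirical MSE equals empirical variance + bias^2.
rng = np.random.default_rng(2)
mu, sigma, n, trials = 0.0, 2.0, 5, 400_000

samples = rng.normal(mu, sigma, size=(trials, n))
theta_hat = samples.var(axis=1, ddof=0)  # biased estimate of sigma^2

mse = np.mean((theta_hat - sigma**2) ** 2)
var_plus_bias_sq = theta_hat.var() + (theta_hat.mean() - sigma**2) ** 2

print(mse, var_plus_bias_sq)  # the two numbers agree
```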
20 7.3 Sampling Distributions 7.3.1 Sample Proportion (1/2) If $X \sim B(n, p)$ records the number of successes in $n$ trials, the sample proportion is $\hat{p} = X/n$, with $E(\hat{p}) = p$ and $\mathrm{Var}(\hat{p}) = p(1-p)/n$.
21 7.3.1 Sample Proportion (2/2) Standard error of the sample proportion: $\mathrm{s.e.}(\hat{p}) = \sqrt{p(1-p)/n}$, estimated in practice by $\sqrt{\hat{p}(1-\hat{p})/n}$.
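The standard error formula for $\hat{p}$ can be checked against a simulated sampling distribution. This sketch is not from the original slides; the values of p, n, and the seed are arbitrary demo choices.

```python
import numpy as np

# Simulate the sampling distribution of p_hat = X/n and compare its
# standard deviation with the formula sqrt(p(1-p)/n).
rng = np.random.default_rng(3)
p, n, trials = 0.3, 50, 200_000

x = rng.binomial(n, p, size=trials)
p_hat = x / n

se_theory = np.sqrt(p * (1 - p) / n)  # about 0.0648
print(p_hat.std(), se_theory)
```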
22 7.3.2 Sample Mean (1/3) Distribution of the sample mean: if $X_1, \ldots, X_n$ is a sample from a distribution with mean $\mu$ and variance $\sigma^2$, then $\bar{X}$ has mean $\mu$ and variance $\sigma^2/n$; for a normal population (or, by the central limit theorem, for large $n$), $\bar{X} \sim N(\mu, \sigma^2/n)$.
23 7.3.2 Sample Mean (2/3) Standard error of the sample mean: $\mathrm{s.e.}(\bar{X}) = \sigma/\sqrt{n}$, estimated by $s/\sqrt{n}$ when $\sigma$ is unknown.
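The $\sigma/\sqrt{n}$ shrinkage of the sample mean's spread can be seen directly by simulation. This sketch is not from the original slides; the population parameters, sample sizes, and seed are arbitrary demo choices.

```python
import numpy as np

# For several sample sizes n, simulate many sample means and compare
# their standard deviation with sigma/sqrt(n).
rng = np.random.default_rng(4)
mu, sigma, trials = 0.0, 3.0, 100_000

for n in (4, 16, 64):
    xbar = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)
    print(n, xbar.std(), sigma / np.sqrt(n))
```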
25 7.3.3 Sample Variance (1/2) Distribution of the sample variance: for a sample from a normal population with variance $\sigma^2$, $(n-1)S^2/\sigma^2 \sim \chi^2_{n-1}$.
26 Theorem: if $X_1, \ldots, X_n$ is a sample from a normal population having mean $\mu$ and variance $\sigma^2$, then $\bar{X}$ and $S^2$ are independent random variables, with $\bar{X}$ being normal with mean $\mu$ and variance $\sigma^2/n$ and $(n-1)S^2/\sigma^2$ being chi-square with $n-1$ degrees of freedom.
(Proof sketch) Let $S^2 = \sum_{i=1}^n (X_i - \bar{X})^2/(n-1)$. Then
$\sum_{i=1}^n (X_i - \mu)^2 = \sum_{i=1}^n (X_i - \bar{X})^2 + n(\bar{X} - \mu)^2,$
or equivalently,
$\sum_{i=1}^n (X_i - \mu)^2 = (n-1)S^2 + n(\bar{X} - \mu)^2.$
Dividing this equation by $\sigma^2$, we get
$\sum_{i=1}^n \left(\frac{X_i - \mu}{\sigma}\right)^2 = \frac{(n-1)S^2}{\sigma^2} + \left(\frac{\bar{X} - \mu}{\sigma/\sqrt{n}}\right)^2.$
The left-hand side is chi-square with $n$ degrees of freedom and the last term is chi-square with 1 degree of freedom, so by the additivity result below (together with the independence of $\bar{X}$ and $S^2$), $(n-1)S^2/\sigma^2$ is chi-square with $n-1$ degrees of freedom.
Cf. Let $X$ and $Y$ be independent chi-square random variables with $m$ and $n$ degrees of freedom, respectively. Then $Z = X + Y$ is a chi-square random variable with $m+n$ degrees of freedom.
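The chi-square claim in the theorem can be checked by simulation via the first two moments of $\chi^2_{n-1}$ (mean $n-1$, variance $2(n-1)$). This sketch is not from the original slides; the population parameters and seed are arbitrary demo choices.

```python
import numpy as np

# Simulate (n-1)S^2/sigma^2 for normal samples of size n = 8 and check
# that its mean and variance match those of a chi-square with
# n-1 = 7 degrees of freedom: mean 7, variance 14.
rng = np.random.default_rng(5)
mu, sigma, n, trials = 1.0, 2.0, 8, 300_000

samples = rng.normal(mu, sigma, size=(trials, n))
stat = (n - 1) * samples.var(axis=1, ddof=1) / sigma**2

print(stat.mean(), stat.var())  # close to 7 and 14
```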