Chapter 7. Statistical Estimation and Sampling Distributions: Presentation Transcript
Slide 1: Chapter 7. Statistical Estimation and Sampling Distributions
7.1 Point Estimates
7.2 Properties of Point Estimates
7.3 Sampling Distributions
7.4 Constructing Parameter Estimates
7.5 Supplementary Problems
Slide 2: 7.1 Point Estimates
7.1.1 Parameters
In statistical inference, the term parameter is used to denote a quantity $\theta$, say, that is a property of an unknown probability distribution: for example, the mean, the variance, or a particular quantile of the probability distribution. Parameters are unknown, and one of the goals of statistical inference is to estimate them.
Slide 3: Figure 7.1 The relationship between a point estimate θ̂ and an unknown parameter θ
Slide 4: Figure 7.2 Estimation of the population mean by the sample mean
Slide 5: 7.1.2 Statistics
In statistical inference, the term statistic is used to denote a quantity that is a property of a sample. Statistics are functions of a random sample: for example, the sample mean, the sample variance, or a particular sample quantile. Statistics are random variables whose observed values can be calculated from a set of observed data.
Slide 6: 7.1.3 Estimation
Estimation is a procedure of "guessing" properties of the population from which data are collected. A point estimate of an unknown parameter $\theta$ is a statistic $\hat{\theta}$ that represents a "guess" at the value of $\theta$.
Example 1 (Machine breakdowns): How should P(machine breakdown due to operator misuse) be estimated?
Example 2 (Rolling mill scrap): How should the mean and variance of the probability distribution of % scrap be estimated? A sketch in Python follows.
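As an illustration of Example 2, here is a minimal Python sketch of computing point estimates of the mean and variance of the % scrap distribution; the data values are hypothetical, not taken from the textbook:

```python
import numpy as np

# Hypothetical % scrap observations (illustrative data, not from the text)
scrap = np.array([20.1, 18.6, 22.4, 19.8, 21.0, 17.9, 23.2, 20.5])

# Point estimate of the population mean: the sample mean
mu_hat = scrap.mean()

# Point estimate of the population variance: the sample variance
# (ddof=1 gives the n-1 divisor, the unbiased estimator of Section 7.2)
sigma2_hat = scrap.var(ddof=1)

print(f"mu_hat = {mu_hat:.3f}, sigma2_hat = {sigma2_hat:.3f}")
```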
Slide 7: 7.2 Properties of Point Estimates
7.2.1 Unbiased Estimates (1/5)
Definitions: A point estimate $\hat{\theta}$ for a parameter $\theta$ is said to be unbiased if $E(\hat{\theta}) = \theta$. If a point estimate is not unbiased, then its bias is defined to be $\mathrm{bias} = E(\hat{\theta}) - \theta$.
Slide 8: 7.2.1 Unbiased Estimates (2/5)
Point estimate of a success probability: if $X \sim B(n, p)$, then $\hat{p} = X/n$ is an unbiased estimate of $p$, since $E(\hat{p}) = E(X)/n = np/n = p$.
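A short simulation sketch, with hypothetical values of n and p, illustrating that the sample proportion X/n is centered at p:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 0.3  # hypothetical sample size and success probability

# Many replications of X ~ B(n, p); p_hat = X / n for each
x = rng.binomial(n, p, size=100_000)
p_hat = x / n

# The average of p_hat should be close to p, illustrating E(p_hat) = p
print(f"mean of p_hat = {p_hat.mean():.4f} (true p = {p})")
```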
Slide 9: 7.2.1 Unbiased Estimates (3/5)
Point estimate of a population mean: if $X_1, \ldots, X_n$ is a sample from a distribution with mean $\mu$, then the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ is an unbiased estimate of $\mu$, since $E(\bar{X}) = \frac{1}{n}\sum_{i=1}^{n} E(X_i) = \mu$.
Slide 10: 7.2.1 Unbiased Estimates (4/5)
Point estimate of a population variance: the sample variance $S^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X})^2$ is an unbiased estimate of $\sigma^2$, since $E(S^2) = \sigma^2$. Dividing by $n$ instead of $n-1$ gives a biased estimate.
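A simulation sketch (hypothetical population values) contrasting the unbiased n-1 divisor with the biased n divisor:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma2, n = 5.0, 4.0, 10  # hypothetical population and sample size

# Draw many samples and compute both variance estimators for each
samples = rng.normal(mu, np.sqrt(sigma2), size=(100_000, n))
s2_unbiased = samples.var(axis=1, ddof=1)  # divisor n-1
s2_biased = samples.var(axis=1, ddof=0)    # divisor n

# E(S^2) = sigma^2 with the n-1 divisor; the n divisor underestimates
print(f"n-1 divisor: {s2_unbiased.mean():.3f} (true sigma^2 = {sigma2})")
print(f"n   divisor: {s2_biased.mean():.3f} (expected {(n-1)/n*sigma2:.3f})")
```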
Slide 14: 7.2.2 Minimum Variance Estimates (3/4)
An unbiased point estimate whose variance is smaller than that of any other unbiased point estimate is called a minimum variance unbiased estimate (MVUE).
Relative efficiency
Mean squared error (MSE)
How is it decomposed? Why is it useful?
Slide 16: Example: two independent measurements
Consider two independent, unbiased point estimates $\hat{\theta}_1$ and $\hat{\theta}_2$ of the unknown parameter $\theta$, with variances $\sigma_1^2$ and $\sigma_2^2$. They are both unbiased estimates since $E(\hat{\theta}_1) = E(\hat{\theta}_2) = \theta$. The relative efficiency of $\hat{\theta}_1$ to $\hat{\theta}_2$ is $\mathrm{Var}(\hat{\theta}_2)/\mathrm{Var}(\hat{\theta}_1)$. Let us consider a new estimate $\hat{\theta} = p\,\hat{\theta}_1 + (1-p)\,\hat{\theta}_2$. This estimate is unbiased since $E(\hat{\theta}) = p\,E(\hat{\theta}_1) + (1-p)\,E(\hat{\theta}_2) = \theta$. What is the optimal value of $p$ that results in $\hat{\theta}$ having the smallest possible mean square error (MSE)?
Slide 17: Let the variance of $\hat{\theta}$ be given by $\mathrm{Var}(\hat{\theta}) = p^2\sigma_1^2 + (1-p)^2\sigma_2^2$. Differentiating with respect to $p$ yields $\frac{d}{dp}\mathrm{Var}(\hat{\theta}) = 2p\sigma_1^2 - 2(1-p)\sigma_2^2$. Setting this to zero, the value of $p$ that minimizes the variance is $p^* = \sigma_2^2/(\sigma_1^2 + \sigma_2^2)$. Therefore, in this example, the variance of $\hat{\theta}$ at $p = p^*$ is $\mathrm{Var}(\hat{\theta}) = \sigma_1^2\sigma_2^2/(\sigma_1^2 + \sigma_2^2)$.
Slide 18: The relative efficiency of $\hat{\theta}$ to $\hat{\theta}_1$ is $\mathrm{Var}(\hat{\theta}_1)/\mathrm{Var}(\hat{\theta}) = (\sigma_1^2 + \sigma_2^2)/\sigma_2^2$. In general, assuming that we have $n$ independent and unbiased estimates $\hat{\theta}_1, \ldots, \hat{\theta}_n$ having variances $\sigma_1^2, \ldots, \sigma_n^2$, respectively, for a parameter $\theta$, we can set the unbiased estimator as the inverse-variance weighted combination $\hat{\theta} = \left(\sum_{i=1}^{n} \hat{\theta}_i/\sigma_i^2\right) / \left(\sum_{i=1}^{n} 1/\sigma_i^2\right)$. The variance of this estimator is $\mathrm{Var}(\hat{\theta}) = 1 / \sum_{i=1}^{n} (1/\sigma_i^2)$.
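The following sketch checks the inverse-variance weighting numerically; the parameter value and the three variances are hypothetical choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 10.0                        # hypothetical true parameter
sigma2 = np.array([1.0, 4.0, 9.0])  # hypothetical variances of 3 estimates

# Inverse-variance weights w_i proportional to 1/sigma_i^2, summing to 1
w = (1 / sigma2) / (1 / sigma2).sum()

# Simulate many independent unbiased estimates and combine them
est = rng.normal(theta, np.sqrt(sigma2), size=(100_000, 3))
combined = est @ w

print(f"mean of combined estimate = {combined.mean():.4f} (unbiased)")
print(f"variance of combined      = {combined.var():.4f}")
print(f"theoretical 1/sum(1/s2)   = {1/(1/sigma2).sum():.4f}")
```

Note that the combined variance, about 0.73 here, is smaller than the smallest individual variance (1.0), which is the point of pooling the estimates.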
Slide 19: Mean square error (MSE)
Let us consider a point estimate $\hat{\theta}$ of a parameter $\theta$. Then the mean square error is defined by $\mathrm{MSE}(\hat{\theta}) = E[(\hat{\theta} - \theta)^2]$. Moreover, notice that $E[(\hat{\theta} - \theta)^2] = E[(\hat{\theta} - E(\hat{\theta}))^2] + (E(\hat{\theta}) - \theta)^2 = \mathrm{Var}(\hat{\theta}) + \mathrm{bias}^2(\hat{\theta})$, so the MSE decomposes into a variance term plus a squared-bias term; for an unbiased estimate it reduces to the variance.
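A quick numerical check of the decomposition, using the biased variance estimator (divisor n) as a convenient example with nonzero bias; all values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2, n = 4.0, 10  # hypothetical population variance and sample size

# Biased variance estimator (divisor n) over many replications
samples = rng.normal(0.0, np.sqrt(sigma2), size=(200_000, n))
est = samples.var(axis=1, ddof=0)

mse = ((est - sigma2) ** 2).mean()
var = est.var()
bias2 = (est.mean() - sigma2) ** 2

# MSE should equal Var + bias^2 (up to simulation noise)
print(f"MSE = {mse:.4f}, Var + bias^2 = {var + bias2:.4f}")
```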
Slide 20: 7.3 Sampling Distributions
7.3.1 Sample Proportion (1/2)
If $X \sim B(n, p)$, the sample proportion is $\hat{p} = X/n$, with $E(\hat{p}) = p$ and $\mathrm{Var}(\hat{p}) = p(1-p)/n$; for large $n$ it is approximately normally distributed.
Slide 21: 7.3.1 Sample Proportion (2/2)
Standard error of the sample proportion: $\mathrm{s.e.}(\hat{p}) = \sqrt{p(1-p)/n}$, estimated in practice by $\sqrt{\hat{p}(1-\hat{p})/n}$.
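A minimal sketch of computing the sample proportion and its estimated standard error from hypothetical counts:

```python
import numpy as np

# Hypothetical data: x successes observed in n trials
x, n = 37, 120

p_hat = x / n
se = np.sqrt(p_hat * (1 - p_hat) / n)  # estimated standard error

print(f"p_hat = {p_hat:.3f}, s.e.(p_hat) = {se:.4f}")
```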
Slide 22: 7.3.2 Sample Mean (1/3)
Distribution of the sample mean: if $X_1, \ldots, X_n$ is a sample from a distribution with mean $\mu$ and variance $\sigma^2$, then $\bar{X}$ has mean $\mu$ and variance $\sigma^2/n$. By the central limit theorem, $\bar{X}$ is approximately $N(\mu, \sigma^2/n)$ for large $n$; if the population itself is normal, this distribution is exact.
Slide 23: 7.3.2 Sample Mean (2/3)
Standard error of the sample mean: $\mathrm{s.e.}(\bar{X}) = \sigma/\sqrt{n}$, estimated by $s/\sqrt{n}$ when $\sigma$ is unknown.
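A simulation sketch (hypothetical mu, sigma, and n) confirming that the spread of the sample mean matches $\sigma/\sqrt{n}$:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 5.0, 2.0, 25  # hypothetical population and sample size

# Empirical spread of the sample mean over many samples
means = rng.normal(mu, sigma, size=(100_000, n)).mean(axis=1)

print(f"empirical s.e.  = {means.std():.4f}")
print(f"sigma / sqrt(n) = {sigma / np.sqrt(n):.4f}")
```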
Slide 25: 7.3.3 Sample Variance (1/2)
Distribution of the sample variance: if $X_1, \ldots, X_n$ is a sample from a $N(\mu, \sigma^2)$ population, then $(n-1)S^2/\sigma^2 \sim \chi^2_{n-1}$.
Slide 26: Theorem: if $X_1, \ldots, X_n$ is a sample from a normal population having mean $\mu$ and variance $\sigma^2$, then $\bar{X}$ and $S^2$ are independent random variables, with $\bar{X}$ being normal with mean $\mu$ and variance $\sigma^2/n$ and $(n-1)S^2/\sigma^2$ being chi-square with $n-1$ degrees of freedom.
(Proof sketch) Writing $X_i - \mu = (X_i - \bar{X}) + (\bar{X} - \mu)$ and noting that the cross terms sum to zero, we have
$\sum_{i=1}^{n}(X_i - \mu)^2 = \sum_{i=1}^{n}(X_i - \bar{X})^2 + n(\bar{X} - \mu)^2$,
or equivalently, $\sum_{i=1}^{n}(X_i - \mu)^2 = (n-1)S^2 + n(\bar{X} - \mu)^2$. Dividing this equation by $\sigma^2$, we get
$\sum_{i=1}^{n}\left(\frac{X_i - \mu}{\sigma}\right)^2 = \frac{(n-1)S^2}{\sigma^2} + \left(\frac{\bar{X} - \mu}{\sigma/\sqrt{n}}\right)^2$.
The left-hand side is chi-square with $n$ degrees of freedom and the last term is chi-square with 1 degree of freedom, so by the fact below, together with the independence of $\bar{X}$ and $S^2$, $(n-1)S^2/\sigma^2$ is chi-square with $n-1$ degrees of freedom.
Cf. Let $X$ and $Y$ be independent chi-square random variables with $m$ and $n$ degrees of freedom, respectively. Then $Z = X + Y$ is a chi-square random variable with $m+n$ degrees of freedom.
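A simulation sketch comparing the empirical distribution of $(n-1)S^2/\sigma^2$ with the chi-square distribution with $n-1$ degrees of freedom; the population values are hypothetical and the check uses scipy's Kolmogorov-Smirnov test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, sigma2, n = 0.0, 2.0, 8  # hypothetical normal population and sample size

# Many replications of (n-1) * S^2 / sigma^2
samples = rng.normal(mu, np.sqrt(sigma2), size=(100_000, n))
stat = (n - 1) * samples.var(axis=1, ddof=1) / sigma2

# Compare moments with chi-square(n-1): mean n-1, variance 2(n-1)
print(f"empirical mean = {stat.mean():.3f}  (chi2 mean = {n-1})")
print(f"empirical var  = {stat.var():.3f}  (chi2 var  = {2*(n-1)})")

# Kolmogorov-Smirnov check against chi2 with n-1 degrees of freedom
ks = stats.kstest(stat, "chi2", args=(n - 1,))
print(f"KS statistic = {ks.statistic:.4f}")
```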