Some General Concepts of Point Estimation

The motivation
Suppose we want to estimate a parameter of a single population (e.g. $\mu$ or $\sigma$) based on a random sample $X_1, X_2, \ldots, X_n$, or a parameter involving more than one population (e.g. $\mu_1 - \mu_2$, the difference between the means, based on samples $X_1, \ldots, X_m$ and $Y_1, \ldots, Y_n$). At times we use $\theta$ to represent a generic parameter.

Definition of a point estimate
A point estimate of a parameter $\theta$ is a single number that can be regarded as a sensible value for $\theta$. A point estimate is obtained by selecting a suitable statistic and computing its value from the given sample data. The selected statistic is called the point estimator of $\theta$.

An example
Consider the following observations on dielectric breakdown voltage for 20 pieces of epoxy resin:

24.46 25.61 26.25 26.42 26.66 27.15 27.31 27.54 27.74 27.94
27.98 28.04 28.28 28.49 28.50 28.76 29.11 29.13 29.50 30.88

One estimator and estimate for $\mu$: the sample mean, with estimator $\hat{\mu} = \bar{X}$ and estimate $\bar{x} = \sum x_i / 20 = 555.75/20 = 27.79$.

Example (continued)
Another estimator is the 10% trimmed mean $\bar{X}_{\mathrm{tr}(10)}$, where the smallest 10% and the largest 10% of the data points are deleted and the remaining observations are averaged. With $n = 20$, the two smallest and two largest observations are dropped, and the estimate is $\bar{x}_{\mathrm{tr}(10)} = 445.30/16 = 27.83$.
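A minimal sketch of these computations, assuming NumPy is available (the trimmed mean is computed by hand rather than with a library helper):

```python
# Minimal sketch (assumes NumPy): point estimates of mu from the 20 observations.
import numpy as np

x = np.array([24.46, 25.61, 26.25, 26.42, 26.66, 27.15, 27.31, 27.54, 27.74, 27.94,
              27.98, 28.04, 28.28, 28.49, 28.50, 28.76, 29.11, 29.13, 29.50, 30.88])

print(x.mean())                  # sample mean, about 27.79
print(np.median(x))              # sample median, about 27.96
print(np.sort(x)[2:-2].mean())   # 10% trimmed mean (drop 2 smallest, 2 largest), about 27.83
```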

Which estimator should we choose?
Each of these estimators uses a different measure of the center of the sample to estimate $\mu$. Which is closest to the true value? We can't answer that without knowing the true value. The better question is: which estimator will tend to produce estimates closest to the true value?

Which estimator should we choose? (continued)
In the best of all worlds, we would want an estimator $\hat{\theta}$ for which $\hat{\theta} = \theta$ always. However, $\hat{\theta}$ is a random variable, so the most we can ask is that the estimation error $\hat{\theta} - \theta$ tend to be small. One criterion is to choose the estimator that minimizes the mean squared error $\mathrm{MSE} = E[(\hat{\theta} - \theta)^2]$. However, the MSE will generally depend on the value of $\theta$, so typically no single estimator minimizes it for every $\theta$.
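To make the MSE criterion concrete, here is a minimal simulation sketch (assuming NumPy; the normal population, sample size, and seed are illustrative choices) comparing the Monte Carlo MSE of the sample mean and sample median as estimators of a normal mean:

```python
# Minimal sketch (assumes NumPy): Monte Carlo estimate of the mean squared error
# MSE = E[(theta_hat - theta)^2] for two estimators of a normal population mean.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 10.0, 2.0, 20, 100_000      # illustrative values

samples = rng.normal(mu, sigma, size=(reps, n))
mean_est = samples.mean(axis=1)                  # estimator 1: sample mean
median_est = np.median(samples, axis=1)          # estimator 2: sample median

print(np.mean((mean_est - mu) ** 2))             # about sigma^2 / n = 0.2
print(np.mean((median_est - mu) ** 2))           # larger than the mean's MSE for normal data
```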

A way out
A way around this dilemma is to restrict attention to estimators that have some specified desirable property and then find the best estimator in that restricted class. A popular property is unbiasedness.

Unbiasedness: motivation
Suppose we have two instruments for making a measurement. One has been accurately calibrated, but the other systematically gives readings smaller than the true value being measured. The measurements from the first instrument will average out to the true value, so it is called an unbiased instrument. The measurements from the second instrument have a systematic error component, or bias.

Definition
A point estimator $\hat{\theta}$ is said to be an unbiased estimator of $\theta$ if $E(\hat{\theta}) = \theta$ for every possible value of $\theta$. If $\hat{\theta}$ is not unbiased, the difference $E(\hat{\theta}) - \theta$ is called the bias of $\hat{\theta}$.

Do we need to know the parameter to determine unbiasedness?
We typically don't need to know the value of the parameter to determine whether an estimator is unbiased. For example, if $X$ is a binomial rv with parameters $n$ and $p$, the sample proportion $\hat{p} = X/n$ is an unbiased estimator of $p$, since $E(X/n) = E(X)/n = np/n = p$.
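A small simulation sketch of this fact (assuming NumPy; $n$, $p$, and the number of replications are illustrative):

```python
# Minimal sketch (assumes NumPy): the sample proportion X/n averages out to p.
import numpy as np

rng = np.random.default_rng(1)
n, p, reps = 50, 0.3, 200_000          # illustrative values
x = rng.binomial(n, p, size=reps)      # X ~ Bin(n, p), one draw per replication
p_hat = x / n                          # sample proportion
print(p_hat.mean())                    # close to p = 0.3, as unbiasedness predicts
```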

Example 2
Suppose that $X$, the reaction time to a certain stimulus, has a uniform distribution on the interval $[0, \theta]$. We might think to estimate $\theta$ from a random sample $X_1, \ldots, X_n$ using $\hat{\theta}_1 = \max(X_1, \ldots, X_n)$. This estimator must be biased, since every observation is less than or equal to $\theta$, so $\hat{\theta}_1 \le \theta$ always. It can be shown that $E(\hat{\theta}_1) = \dfrac{n}{n+1}\,\theta < \theta$.

Example 2 (continued)
We can easily modify $\hat{\theta}_1$ to get an unbiased estimator of $\theta$: simply take $\hat{\theta} = \dfrac{n+1}{n}\max(X_1, \ldots, X_n)$, for which $E(\hat{\theta}) = \dfrac{n+1}{n} \cdot \dfrac{n}{n+1}\,\theta = \theta$.
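A simulation sketch of both estimators (assuming NumPy; $\theta$, $n$, and the seed are illustrative choices):

```python
# Minimal sketch (assumes NumPy): max(X_i) underestimates theta on average,
# while (n+1)/n * max(X_i) does not.
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 5.0, 10, 200_000               # illustrative values

x = rng.uniform(0.0, theta, size=(reps, n))
theta1 = x.max(axis=1)                          # biased: mean about n/(n+1)*theta = 4.55
theta2 = (n + 1) / n * theta1                   # unbiased: mean about theta = 5

print(theta1.mean(), theta2.mean())
```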

Principle of unbiasedness
When choosing among several different estimators of $\theta$, select one that is unbiased.

Proposition
Let $X_1, X_2, \ldots, X_n$ be a random sample from a distribution with mean $\mu$ and variance $\sigma^2$. Then the estimator $S^2 = \dfrac{\sum_{i=1}^n (X_i - \bar{X})^2}{n-1}$ is unbiased for estimating $\sigma^2$.

Proof of proposition
Recall that $\sigma^2 = E(X^2) - [E(X)]^2$, so $E(X_i^2) = \sigma^2 + \mu^2$ and $E\big[(\sum X_i)^2\big] = V(\sum X_i) + [E(\sum X_i)]^2 = n\sigma^2 + n^2\mu^2$. Then, using $\sum (X_i - \bar{X})^2 = \sum X_i^2 - \big(\sum X_i\big)^2/n$,
$$E(S^2) = \frac{1}{n-1}\Big\{\sum E(X_i^2) - \tfrac{1}{n}\,E\big[\big(\textstyle\sum X_i\big)^2\big]\Big\} = \frac{1}{n-1}\Big\{n(\sigma^2+\mu^2) - \tfrac{1}{n}\big(n\sigma^2 + n^2\mu^2\big)\Big\}$$

Proof of proposition (continued)
$$= \frac{1}{n-1}\big(n\sigma^2 + n\mu^2 - \sigma^2 - n\mu^2\big) = \frac{(n-1)\sigma^2}{n-1},$$
which equals $\sigma^2$, as desired.

The estimator that has n as the divisor
The estimator $\hat{\sigma}^2 = \dfrac{\sum (X_i - \bar{X})^2}{n}$ then has expectation $E(\hat{\sigma}^2) = \dfrac{n-1}{n}\,\sigma^2$. Its bias is $\dfrac{n-1}{n}\,\sigma^2 - \sigma^2 = -\dfrac{\sigma^2}{n}$.
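A simulation sketch contrasting the two divisors (assuming NumPy; the population and sample size are illustrative):

```python
# Minimal sketch (assumes NumPy): dividing by n instead of n-1 introduces a bias
# of about -sigma^2 / n.
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n, reps = 0.0, 3.0, 10, 200_000      # sigma^2 = 9, illustrative values

x = rng.normal(mu, sigma, size=(reps, n))
print(x.var(axis=1, ddof=1).mean())             # divisor n-1: about 9
print(x.var(axis=1, ddof=0).mean())             # divisor n:   about (n-1)/n * 9 = 8.1
```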

Is S unbiased for the population standard deviation?
Unfortunately, although $S^2$ is unbiased for $\sigma^2$, $S$ is not unbiased for $\sigma$. Taking the square root destroys the unbiasedness property: since the square root is a concave function, $E(S) < \sigma$ except in degenerate cases.
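A simulation sketch of this underestimation for small normal samples (assuming NumPy; $\sigma$ and $n$ are illustrative):

```python
# Minimal sketch (assumes NumPy): S systematically underestimates sigma for small n.
import numpy as np

rng = np.random.default_rng(4)
sigma, n, reps = 2.0, 5, 200_000                # illustrative values
x = rng.normal(0.0, sigma, size=(reps, n))
s = x.std(axis=1, ddof=1)                       # sample standard deviation S
print(s.mean())                                 # noticeably below sigma = 2
```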

Proposition
If $X_1, X_2, \ldots, X_n$ is a random sample from a distribution with mean $\mu$, then $\bar{X}$ is an unbiased estimator of $\mu$. If in addition the distribution is continuous and symmetric, then the sample median $\tilde{X}$ and any trimmed mean are also unbiased estimators of $\mu$.
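A simulation sketch of the proposition for one symmetric continuous distribution, here a Laplace centered at $\mu$ (assuming NumPy; all values are illustrative):

```python
# Minimal sketch (assumes NumPy): for a symmetric continuous distribution
# (a Laplace centered at mu), the sample mean, sample median, and 10% trimmed
# mean all average out to mu.
import numpy as np

rng = np.random.default_rng(5)
mu, n, reps = 4.0, 20, 100_000                  # illustrative values
x = np.sort(rng.laplace(loc=mu, scale=1.0, size=(reps, n)), axis=1)

print(x.mean(axis=1).mean())                    # sample mean:   about 4
print(np.median(x, axis=1).mean())              # sample median: about 4
print(x[:, 2:-2].mean(axis=1).mean())           # trimmed mean:  about 4
```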

The principle of minimum variance
Among all estimators of $\theta$ that are unbiased, choose the one that has minimum variance. The resulting $\hat{\theta}$ is called the minimum variance unbiased estimator (MVUE) of $\theta$.

Example 2 again
We argued that for a random sample $X_1, \ldots, X_n$ from the uniform distribution on $[0, \theta]$, $\hat{\theta}_1 = \dfrac{n+1}{n}\max(X_1, \ldots, X_n)$ is unbiased for $\theta$. Since $E(\bar{X}) = E(X) = \theta/2$, the estimator $\hat{\theta}_2 = 2\bar{X}$ is also unbiased for $\theta$.

Example continued
Now $V(\hat{\theta}_1) = \dfrac{\theta^2}{n(n+2)}$ (Exercise 32) and $V(\hat{\theta}_2) = V(2\bar{X}) = \dfrac{\theta^2}{3n}$. As long as $n(n+2) > 3n$, or $n > 1$, $\hat{\theta}_1$ has the smaller variance. But how do we show that it has the minimum variance among all unbiased estimators? Results on MVUEs for certain distributions have been derived; the most important of these follows.
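A simulation sketch checking both variance formulas (assuming NumPy; $\theta$ and $n$ are illustrative):

```python
# Minimal sketch (assumes NumPy): simulated variances of the two unbiased
# estimators versus the formulas theta^2/(n(n+2)) and theta^2/(3n).
import numpy as np

rng = np.random.default_rng(6)
theta, n, reps = 5.0, 10, 200_000               # illustrative values
x = rng.uniform(0.0, theta, size=(reps, n))

theta1 = (n + 1) / n * x.max(axis=1)            # estimator based on the maximum
theta2 = 2 * x.mean(axis=1)                     # estimator based on the sample mean

print(theta1.var(), theta**2 / (n * (n + 2)))   # both about 0.21
print(theta2.var(), theta**2 / (3 * n))         # both about 0.83
```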

Theorem
Let $X_1, \ldots, X_n$ be a random sample from a normal distribution with parameters $\mu$ and $\sigma$. Then the estimator $\hat{\mu} = \bar{X}$ is the MVUE for $\mu$.

Some complications
Note that the last theorem does not say that $\bar{X}$ should be used to estimate the center of every distribution. For a heavy-tailed distribution like the Cauchy, with density $f(x; \theta) = \dfrac{1}{\pi\,[1 + (x - \theta)^2]}$, $-\infty < x < \infty$, the sample mean performs very poorly, and one is better off using the sample median $\tilde{X}$ or a trimmed mean (the MVUE is not known).
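A simulation sketch of how erratic $\bar{X}$ is for Cauchy data compared with the sample median (assuming NumPy; the values are illustrative):

```python
# Minimal sketch (assumes NumPy): for Cauchy data centered at theta, sample means
# are wildly unstable while sample medians stay close to theta.
import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 0.0, 25, 50_000                # illustrative values
x = rng.standard_cauchy(size=(reps, n)) + theta

print(x.mean(axis=1).std())                     # huge, erratic spread
print(np.median(x, axis=1).std())               # small, stable spread
```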

Reporting a point estimate: the standard error
The standard error of an estimator $\hat{\theta}$ is its standard deviation $\sigma_{\hat{\theta}} = \sqrt{V(\hat{\theta})}$. The standard error gives an idea of the typical deviation of the estimator from its mean. When $\hat{\theta}$ has approximately a normal distribution, we can say with reasonable confidence that the true value of $\theta$ lies within approximately 2 standard errors of $\hat{\theta}$.
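A minimal sketch computing the usual estimated standard error of $\bar{X}$, $s/\sqrt{n}$ (the standard deviation above with $s$ in place of $\sigma$), for the breakdown-voltage data (assuming NumPy):

```python
# Minimal sketch (assumes NumPy): estimated standard error s/sqrt(n) of the
# sample mean for the breakdown-voltage data, and a rough 2-standard-error range.
import numpy as np

x = np.array([24.46, 25.61, 26.25, 26.42, 26.66, 27.15, 27.31, 27.54, 27.74, 27.94,
              27.98, 28.04, 28.28, 28.49, 28.50, 28.76, 29.11, 29.13, 29.50, 30.88])

xbar = x.mean()
se = x.std(ddof=1) / np.sqrt(len(x))            # about 0.33
print(f"estimate {xbar:.2f}, standard error {se:.2f}")
print(f"mu plausibly between {xbar - 2*se:.2f} and {xbar + 2*se:.2f}")
```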