Estimation  Samples are collected to estimate characteristics of the population of particular interest. Parameter – numerical characteristic of the population.

Slides:



Advertisements
Similar presentations
STATISTICS: POINT ESTIMATION
Professor Ke-Sheng Cheng, Department of Bioenvironmental Systems Engineering, National Taiwan University

Estimation  Samples are collected to estimate characteristics of the population of particular interest. Parameter – numerical characteristic of the population (i.e. ,  ) Parameter – numerical characteristic of the population (i.e. ,  ) Distributional characteristics – pdf and cdf Distributional characteristics – pdf and cdf Statistic - numerical characteristic of the sample used to estimate parameters Statistic - numerical characteristic of the sample used to estimate parameters

Point Estimation  A sample statistic is often used to estimate and draw conclusions about a population parameter (  ). The sample statistic is called a point estimator of  The sample statistic is called a point estimator of  For a particular sample, the calculated value of the statistic is called a point estimate of . For a particular sample, the calculated value of the statistic is called a point estimate of .

Point Estimation  Let X 1, X 2, …, X n be a random sample of size n from the population of interest, and let Y=u(X 1, X 2, …, X n ) be a statistic used to estimate . Then Y is called an estimator of . Then Y is called an estimator of . A specific value of the estimator y=u(x 1, x 2, …, x n ) is called an estimate of . A specific value of the estimator y=u(x 1, x 2, …, x n ) is called an estimate of .

Estimator  Discussion is limited to random variables with functional form of the pdf known. The pdf typically depends on an unknown parameter  which can take on any value in the parameter space . i.e. f(x;  ) The pdf typically depends on an unknown parameter  which can take on any value in the parameter space . i.e. f(x;  ) It is often necessary to pick one member from a family of members as most likely to be true. It is often necessary to pick one member from a family of members as most likely to be true. i.e. pick “best” value of  for f(x;  )i.e. pick “best” value of  for f(x;  ) The best estimator can depend on the distribution being sampled. The best estimator can depend on the distribution being sampled.

Properties of an Estimator  If E[Y]= , then the statistic Y is called an unbiased estimator of . Otherwise, it is said to be biased. E[Y-  ] is the bias of an estimator E[Y-  ] is the bias of an estimator In many cases, the “best” estimator is an unbiased estimator. In many cases, the “best” estimator is an unbiased estimator.

Properties of an Estimator  Another important property is small variance. If two estimators are both unbiased, we prefer the one with small variance. Minimize E[(Y-  ) 2 ] = Var[(Y-  )] + E[(Y-  )] 2 Minimize E[(Y-  ) 2 ] = Var[(Y-  )] + E[(Y-  )] 2 The estimator Y that minimizes E[(Y-  ) 2 ] is said to have minimum mean square error (MSE) The estimator Y that minimizes E[(Y-  ) 2 ] is said to have minimum mean square error (MSE) If we consider only unbiased estimators, the statistic Y that minimizes MSE is called the minimum variance unbiased estimator (MVUE) If we consider only unbiased estimators, the statistic Y that minimizes MSE is called the minimum variance unbiased estimator (MVUE)

Properties of an Estimator  The efficiency of an estimator  1 compare to another estimator  2 is equal to the ratio

Method of Maximum Likelihood  An important method for finding an estimator Let X 1,X 2,…,X n be a random sample of size n from f(x;  ). Let X 1,X 2,…,X n be a random sample of size n from f(x;  ). The likelihood function is the joint pdf of X 1,X 2,…,X n evaluated at observed values x 1,x 2,…,x n as a function of the parameter of interest. The likelihood function is the joint pdf of X 1,X 2,…,X n evaluated at observed values x 1,x 2,…,x n as a function of the parameter of interest. L(  ) = f(x 1, x 2, …, x n ;  ) = f(x 1,  )  f(x 2,  )    f(x n,  ) is the probability of observing x 1,x 2,…,x n if the pdf is f(x;  ).L(  ) = f(x 1, x 2, …, x n ;  ) = f(x 1,  )  f(x 2,  )    f(x n,  ) is the probability of observing x 1,x 2,…,x n if the pdf is f(x;  ). The value of  that maximizes L(  ) is the value of  most likely to have produced x 1,x 2,…,x nThe value of  that maximizes L(  ) is the value of  most likely to have produced x 1,x 2,…,x n

Maximum Likelihood Estimator  The maximum likelihood estimator (MLE) of  is found by setting the differential of L(  ) with respect to  equal to zero and solving for .  The MLE can also be found by maximizing the natural log of L(  ), which is often easier to differentiate.  For more than one parameter, maximum likelihood equations are formed and simultaneously solved to arrive at the MLE’s

Invariance Property  If t is the MLE for  and u(  ) is a function of  then the MLE of u(  ) is u(t). Plug the MLE(s) into the function to get an MLE estimate of the function Plug the MLE(s) into the function to get an MLE estimate of the function

Method of Moments  An important method for finding an estimator Let X 1,X 2,…,X n be a random sample of size n from f(x). The kth population moment is E[X k ]. The kth sample moment is (1/n)  X k. Let X 1,X 2,…,X n be a random sample of size n from f(x). The kth population moment is E[X k ]. The kth sample moment is (1/n)  X k. The method of moments involves setting the sample moment(s) equal to the population moment(s) and solving for the parameter of interest. The method of moments involves setting the sample moment(s) equal to the population moment(s) and solving for the parameter of interest.