Statistical Estimation


1 Ch05 Statistical Estimation

2 CHAPTER CONTENTS
5.1 Introduction
5.2 The Methods of Finding Point Estimators
5.3 Some Desirable Properties of Point Estimators
5.4 A Method of Finding the Confidence Interval: Pivotal Method
5.5 One Sample Confidence Intervals
5.6 A Confidence Interval for the Population Variance
5.7 Confidence Interval Concerning Two Population Parameters
5.8 Chapter Summary
5.9 Computer Examples
Projects for Chapter 5

3 5.1 Introduction
Unknown population parameters can be estimated in two ways: point estimation and interval estimation. For example: how much money do I have in my pocket? A point estimate is a single value, such as $1000; an interval estimate is a range, such as ($700, $1200).
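The point-versus-interval distinction can be sketched in a few lines of Python (an illustrative sketch with made-up data; the numbers are not from the text):

```python
import statistics as st

# Hypothetical repeated measurements of the amount in the pocket
sample = [950, 1100, 870, 1020, 990, 1060, 930, 1080]

n = len(sample)
xbar = st.mean(sample)      # point estimate of the population mean
s = st.stdev(sample)        # sample standard deviation (n - 1 divisor)

# A rough 95% interval estimate using the normal critical value 1.96
half_width = 1.96 * s / n ** 0.5
interval = (xbar - half_width, xbar + half_width)

print(xbar)       # a single number: the point estimate
print(interval)   # a range: the interval estimate
```

The point estimate condenses the data to one number, while the interval conveys how uncertain that number is.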

4 5.2 The Methods of Finding Point Estimators

5 X1, . . ., Xn: independent and identically distributed (iid) random variables (in statistical language, a random sample) with common pdf or pmf f(x, θ1, . . ., θl), where θ1, . . ., θl are the unknown population parameters. Point estimation: determine statistics gi(X1, . . ., Xn), i = 1, . . ., l, that can be used to estimate the values of the parameters.

6 Notation: capital letters such as X̄ and S2 denote estimators; lowercase letters such as x̄ and s2 denote the corresponding estimates. Three of the more popular methods of estimation: the method of moments (this chapter), the method of maximum likelihood (this chapter), and Bayes' method (Chapter 11).

7 Some desirable properties of estimators:
Unbiasedness: the expected value of the estimator equals the parameter being estimated (zero bias).
Consistency: an estimator satisfies the consistency property if the sample estimator has a high probability of being close to the population value θ for a large sample size.
Efficiency: among comparable estimators, the one with the smaller variance is the more efficient.
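Consistency can be illustrated with a small Monte Carlo sketch (an illustration of our own choosing, not from the text): the sample mean of Uniform(0, 1) draws estimates the population mean 0.5, and the probability that it lands near 0.5 grows with the sample size.

```python
import random

random.seed(1)

def fraction_close(n, trials=2000, tol=0.05):
    """Fraction of trials in which |X̄ - 0.5| < tol for samples of size n."""
    hits = 0
    for _ in range(trials):
        xbar = sum(random.random() for _ in range(n)) / n
        if abs(xbar - 0.5) < tol:
            hits += 1
    return hits / trials

# The probability of being close to the true mean rises with n
small, large = fraction_close(10), fraction_close(400)
print(small, large)
```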

8 5.2.1 THE METHOD OF MOMENTS
μ'_k = E[X^k]: the kth population moment about the origin of a random variable X.
m'_k = (1/n) Σ X_i^k: the kth sample moment of the random variable X.
The method of moments: equate the first few sample moments to the corresponding population moments and solve the resulting equations for the unknown parameters.
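As a concrete sketch (a standard textbook case, chosen here as an illustration): for a Uniform(0, θ) population, E[X] = θ/2, so equating the first sample moment to the first population moment gives the method of moments estimator θ̂ = 2X̄.

```python
import random

random.seed(7)
theta_true = 4.0

# Simulated Uniform(0, θ) sample; in practice θ would be unknown
sample = [random.uniform(0, theta_true) for _ in range(5000)]

# Equate m'_1 = X̄ to μ'_1 = θ/2 and solve for θ
xbar = sum(sample) / len(sample)
theta_hat = 2 * xbar

print(theta_hat)   # should be close to the true θ = 4
```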


16 5.2.2 THE METHOD OF MAXIMUM LIKELIHOOD
Even though the method of moments is intuitive and easy to apply, it usually does not yield “good” estimators. The method of maximum likelihood is intuitively appealing: we seek the parameter values that would most likely have produced the data we actually observed. In most cases of practical interest, MLEs perform optimally for sufficiently large samples. This is one of the most versatile methods for fitting parametric statistical models to data.
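A minimal sketch of the idea for an Exponential(λ) sample (the choice of distribution is ours, for illustration): the log-likelihood ℓ(λ) = n log λ − λ Σ x_i is maximized at λ̂ = n/Σ x_i = 1/x̄.

```python
import math, random

random.seed(3)
lam_true = 2.0
data = [random.expovariate(lam_true) for _ in range(2000)]

def log_likelihood(lam):
    # ℓ(λ) = n log λ - λ Σ x_i for an Exponential(λ) sample
    return len(data) * math.log(lam) - lam * sum(data)

# Setting dℓ/dλ = n/λ - Σ x_i = 0 gives the closed-form MLE λ̂ = 1/x̄
lam_hat = len(data) / sum(data)

# Sanity check: the MLE beats nearby parameter values
assert log_likelihood(lam_hat) >= log_likelihood(lam_hat * 0.9)
assert log_likelihood(lam_hat) >= log_likelihood(lam_hat * 1.1)
print(lam_hat)
```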


21 Maximum likelihood estimates give the parameter values for which the observed sample is most likely to have been generated.


27 At times, the MLEs may be hard to calculate. It may then be necessary to use numerical methods to approximate the values of the estimates.
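As an illustration of a numerical MLE (the Cauchy location model is our choice of example; the text does not specify one): the Cauchy log-likelihood has no closed-form maximizer, so the sketch below approximates it by evaluating ℓ(θ) on a grid near the sample median.

```python
import math, random

random.seed(5)
theta_true = 1.5

# Cauchy(θ, 1) sample via the inverse-CDF transform; its location MLE
# has no closed form, so we approximate it numerically
data = [theta_true + math.tan(math.pi * (random.random() - 0.5))
        for _ in range(500)]

def log_likelihood(theta):
    # Up to an additive constant: ℓ(θ) = -Σ log(1 + (x_i - θ)²)
    return -sum(math.log(1 + (x - theta) ** 2) for x in data)

# Grid search around the sample median (a robust starting region for Cauchy)
med = sorted(data)[len(data) // 2]
grid = [med - 2 + 0.005 * k for k in range(801)]
theta_hat = max(grid, key=log_likelihood)

print(theta_hat)   # numerical approximation to the MLE
```

In practice one would refine the grid result with an optimizer (e.g. Newton's method on the score function), but a grid already shows the idea.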


31 5.3 Some Desirable Properties of Point Estimators

32 5.3.1 UNBIASED ESTIMATORS

34 The sample mean is always an unbiased estimator of the population mean.
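This can be checked by simulation (an illustrative Monte Carlo sketch with made-up parameters): averaging many independent replications of X̄ should land very close to μ.

```python
import random

random.seed(11)
mu = 10.0   # population mean, known here only because we simulate

def sample_mean(n):
    """One replication of X̄ from a Normal(μ, 3) population."""
    return sum(random.gauss(mu, 3.0) for _ in range(n)) / n

# E[X̄] = μ even for tiny samples: average many replications of X̄
replications = [sample_mean(5) for _ in range(20000)]
avg_of_means = sum(replications) / len(replications)
print(avg_of_means)   # close to μ = 10, reflecting unbiasedness
```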

35 Population variance (population of size N with elements X1, X2, . . ., XN):
σ2 = (1/N) Σ (Xi − μ)2
Sample variance (sample of size n):
S2 = (1/(n − 1)) Σ (Xi − X̄)2
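A simulation sketch of why the n − 1 divisor matters (illustrative; the parameters are made up): averaging both estimators over many replications shows S2 centering on σ2, while the version that divides by n systematically underestimates it.

```python
import random

random.seed(13)
sigma2 = 9.0   # true population variance: Normal(0, 3) data

def variances(n):
    """Return (divide-by-(n-1), divide-by-n) estimates for one sample."""
    xs = [random.gauss(0.0, 3.0) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)
    return ss / (n - 1), ss / n

trials = [variances(5) for _ in range(20000)]
avg_unbiased = sum(t[0] for t in trials) / len(trials)
avg_biased = sum(t[1] for t in trials) / len(trials)

print(avg_unbiased)   # centers on σ² = 9
print(avg_biased)     # systematically below 9 (around (n-1)/n · σ²)
```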


38 Unbiased estimators need not be unique.
If we have two unbiased estimators T1 and T2, then every weighted average αT1 + (1 − α)T2 with 0 ≤ α ≤ 1 is also unbiased, so there are infinitely many unbiased estimators. It is better to have an estimator that has low bias as well as low variance.
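Both points can be sketched by simulation (our own illustrative construction): T1 (a single observation) and T2 (the sample mean) are both unbiased for μ, and so is any mixture αT1 + (1 − α)T2, but the variances differ, favoring the low-variance estimator.

```python
import random

random.seed(17)
mu = 4.0

def estimators(n=10):
    """Two unbiased estimators of μ: a single observation and the mean."""
    xs = [random.gauss(mu, 2.0) for _ in range(n)]
    return xs[0], sum(xs) / n

def mc(alpha, reps=20000):
    """Monte Carlo mean and variance of the mixture α·T1 + (1-α)·T2."""
    vals = []
    for _ in range(reps):
        t1, t2 = estimators()
        vals.append(alpha * t1 + (1 - alpha) * t2)
    mean = sum(vals) / reps
    var = sum((v - mean) ** 2 for v in vals) / reps
    return mean, var

m_half, v_half = mc(0.5)   # equal-weight mixture
m_zero, v_zero = mc(0.0)   # all weight on the sample mean

print(m_half, m_zero)   # both close to μ = 4: every mixture is unbiased
print(v_half, v_zero)   # variance is smaller with all weight on T2
```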

39 For unbiased estimators, the mean square error reduces to the variance: MSE(θ̂) = E[(θ̂ − θ)2] = Var(θ̂).

40 5.3.2 SUFFICIENCY*


46 5.4 A Method of Finding the Confidence Interval: Pivotal Method


49 5.5 One Sample Confidence Intervals

51 5.6 A Confidence Interval for the Population Variance


54 5.7 Confidence Interval Concerning Two Population Parameters


57 5.8 Chapter Summary


60 5.9 Computer Examples (Optional)

61 Projects for Chapter 5

