1 Probability and Statistical Inference (9th Edition) Chapter 5 (Part 2/2) Distributions of Functions of Random Variables November 25, 2015.

2 Outline
5.5 Random Functions Associated with Normal Distributions
5.6 The Central Limit Theorem
5.7 Approximations for Discrete Distributions
5.8 Chebyshev’s Inequality and Convergence in Probability

3 Random Functions Associated with Normal Distributions
Theorem: Assume that X1, X2, …, Xn are independent random variables with distributions N(μ1, σ1²), N(μ2, σ2²), …, N(μn, σn²), respectively. Then, for any constants c1, c2, …, cn, the linear combination Y = c1X1 + c2X2 + … + cnXn is N(c1μ1 + … + cnμn, c1²σ1² + … + cn²σn²)

4 Random Functions Associated with Normal Distributions
Proof: Recall that the mgf of N(μ, σ²) is M(t) = exp(μt + σ²t²/2). By independence, the mgf of Y = c1X1 + … + cnXn is
MY(t) = ∏ MXi(cit) = exp((∑ ciμi)t + (∑ ci²σi²)t²/2),
which is the mgf of N(∑ ciμi, ∑ ci²σi²)

5 Random Functions Associated with Normal Distributions
Example 1: If X1 and X2 are independent normal random variables N(μ1, σ1²) and N(μ2, σ2²), respectively, then X1 + X2 is N(μ1 + μ2, σ1² + σ2²), and X1 − X2 is N(μ1 − μ2, σ1² + σ2²)
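The sum rule for independent normals is easy to check empirically. A minimal sketch (the specific parameters N(1, 4) and N(3, 9) are illustrative choices, not from the slides): the sum should behave like N(4, 13).

```python
import random
import statistics

# Empirical check: if X1 ~ N(1, 4) (sigma = 2) and X2 ~ N(3, 9) (sigma = 3)
# are independent, then X1 + X2 should be N(1 + 3, 4 + 9) = N(4, 13).
random.seed(0)
n = 200_000
sums = [random.gauss(1, 2) + random.gauss(3, 3) for _ in range(n)]

print(round(statistics.fmean(sums), 2))      # close to 4
print(round(statistics.pvariance(sums), 2))  # close to 13
```

Note that the variances add for the difference X1 − X2 as well, since Var(−X2) = Var(X2).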

6 Random Functions Associated with Normal Distributions
Example 2: If X1, X2, …, Xn is a random sample from a normal distribution N(μ, σ²), then the sample mean X̄ = (1/n)∑ Xi is N(μ, σ²/n)
Proof: Apply the theorem with ci = 1/n, μi = μ, and σi² = σ² for every i: the mean is ∑ (1/n)μ = μ and the variance is ∑ (1/n)²σ² = σ²/n

7 Random Functions Associated with Normal Distributions
One important implication of the distribution of X̄ is that it has a greater probability of falling in an interval containing μ than does a single observation Xk. The larger the sample size n, the smaller the variance σ²/n of the sample mean. The mean μ is a constant, but the sample mean X̄ is a random variable

8 Random Functions Associated with Normal Distributions
For example, assume that X1, X2, …, Xn is a random sample from the N(50, 16) distribution. Then, X̄ is N(50, 16/n). The following figure shows the pdf of X̄ for different values of n
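The shrinking variance of X̄ can be seen by simulation. A sketch of the slide’s setup, drawing repeated samples of size n from N(50, 16) (so σ = 4) and checking that the sample mean behaves like N(50, 16/n):

```python
import random
import statistics

# For each n, draw many samples of size n from N(50, 16) and look at
# the mean and variance of the resulting sample means.
random.seed(1)

def mean_and_var_of_sample_mean(n, reps=20_000):
    means = [statistics.fmean(random.gauss(50, 4) for _ in range(n))
             for _ in range(reps)]
    return statistics.fmean(means), statistics.pvariance(means)

for n in (1, 4, 16):
    m, v = mean_and_var_of_sample_mean(n)
    print(n, round(m, 2), round(v, 2))  # variance shrinks like 16/n
```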

9 Random Functions Associated with Normal Distributions
(Figure: pdfs of X̄ for several values of n; the curves concentrate around 50 as n grows)

10 Random Functions Associated with Normal Distributions
Recall: Let Z1, Z2, …, Zn be i.i.d. N(0,1). Then, W = Z1² + Z2² + … + Zn² is χ²(n)
Let X1, X2, …, Xn be independent chi-square random variables with k1, k2, …, kn degrees of freedom, i.e., χ²(k1), χ²(k2), …, χ²(kn), respectively. Then, Y = X1 + X2 + … + Xn is χ²(k1 + k2 + … + kn)

11 Random Functions Associated with Normal Distributions
Theorem: Let X1, X2, …, Xn be a random sample from the N(μ, σ²) distribution. The sample mean and sample variance are given by X̄ = (1/n)∑ Xi and S² = (1/(n−1))∑ (Xi − X̄)². Then,
(a) X̄ and S² are independent
(b) (n−1)S²/σ² is χ²(n−1)
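Part (b) can be checked by simulation. A sketch under illustrative parameters (μ = 50, σ = 4, n = 8, none of which come from the slides): the statistic (n−1)S²/σ² should be χ²(n−1), so its mean is n−1 and its variance is 2(n−1).

```python
import random
import statistics

# Simulate (n-1)S^2/sigma^2 for many samples of size n from N(mu, sigma^2)
# and compare its mean and variance with those of chi-square(n-1).
random.seed(2)
mu, sigma, n, reps = 50.0, 4.0, 8, 50_000

w = []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    s2 = statistics.variance(xs)          # sample variance (divisor n-1)
    w.append((n - 1) * s2 / sigma**2)

print(round(statistics.fmean(w), 2))      # near n-1 = 7
print(round(statistics.pvariance(w), 2))  # near 2(n-1) = 14
```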

12 Random Functions Associated with Normal Distributions
We will accept (a) without proving it
Proof of (b): Write
∑ (Xi − μ)²/σ² = ∑ [(Xi − X̄) + (X̄ − μ)]²/σ² = (n−1)S²/σ² + n(X̄ − μ)²/σ²,
since the cross term vanishes (∑ (Xi − X̄) = 0)

13 Random Functions Associated with Normal Distributions
The left-hand side is the sum of squares of n i.i.d. N(0,1) variables, so it is χ²(n); and n(X̄ − μ)²/σ² = [(X̄ − μ)/(σ/√n)]² is the square of an N(0,1) variable, so it is χ²(1)

14 Random Functions Associated with Normal Distributions
Because X̄ and S² are independent (part (a)), the mgf of the sum factors:
(1 − 2t)^(−n/2) = E[exp(t(n−1)S²/σ²)] · (1 − 2t)^(−1/2),
so E[exp(t(n−1)S²/σ²)] = (1 − 2t)^(−(n−1)/2), which is the mgf of χ²(n−1)

15 Random Functions Associated with Normal Distributions
It is interesting to observe that ∑ (Xi − μ)²/σ² is χ²(n), while ∑ (Xi − X̄)²/σ² = (n−1)S²/σ² is χ²(n−1). That is, when the actual mean μ is replaced by the sample mean X̄, one degree of freedom is lost

16 Central Limit Theorem
It is useful to first review some related theorems
Theorem (Sample Mean): Let X1, X2, …, Xn be a sequence of i.i.d. random variables with mean μ and variance σ². Then, the sample mean X̄ = (1/n)∑ Xi is a random variable with mean μ and variance σ²/n

17 Central Limit Theorem
Theorem (Strong Law of Large Numbers): Let X1, X2, … be a sequence of i.i.d. random variables with mean μ. Then, with probability 1, X̄ = (X1 + X2 + … + Xn)/n → μ as n → ∞. That is, P(lim X̄ = μ) = 1 (the sample mean converges almost surely, or converges with probability 1, to the expected value)
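The strong law is easy to illustrate with coin tosses. A minimal sketch, assuming Bernoulli(1/2) tosses with E(Xi) = 1/2: the running proportion of heads settles toward 1/2 as n grows.

```python
import random
import statistics

# Running proportion of heads in a long sequence of fair-coin tosses.
random.seed(3)
tosses = [random.randint(0, 1) for _ in range(1_000_000)]

for n in (100, 10_000, 1_000_000):
    print(n, round(statistics.fmean(tosses[:n]), 4))  # approaches 0.5
```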

18 Central Limit Theorem
Theorem (Strong Law of Large Numbers) (cont.): This theorem holds for any distribution of the Xi’s. It is one of the best-known results in probability theory

19 Central Limit Theorem
Theorem (Central Limit Theorem): Let X1, X2, …, Xn be a sequence of i.i.d. random variables with mean μ and variance σ². Then the distribution of (X̄ − μ)/(σ/√n) = (X1 + X2 + … + Xn − nμ)/(σ√n) tends to N(0,1) as n → ∞. That is, P((X̄ − μ)/(σ/√n) ≤ z) → Φ(z) for every z (convergence in distribution)
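A quick numerical sketch of the theorem, using U(0,1) draws (μ = 1/2, σ² = 1/12) as an illustrative choice: standardize the mean of n draws and check that roughly 95% of the standardized values land in (−1.96, 1.96), as N(0,1) predicts.

```python
import math
import random
import statistics

# Standardized sample means of n i.i.d. U(0,1) variables should be
# approximately N(0,1); check the 95% central coverage.
random.seed(4)
n, reps = 30, 50_000
mu, sigma = 0.5, math.sqrt(1 / 12)

zs = [(statistics.fmean(random.random() for _ in range(n)) - mu)
      / (sigma / math.sqrt(n)) for _ in range(reps)]

inside = sum(abs(z) < 1.96 for z in zs) / reps
print(round(inside, 3))  # close to 0.95
```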

20 Central Limit Theorem
Theorem (Central Limit Theorem) (cont.): While X̄ − μ tends to “degenerate” to zero (Strong Law of Large Numbers), the factor √n in √n(X̄ − μ)/σ “spreads out” the probability enough to prevent this degeneration

21 Central Limit Theorem
Theorem (Central Limit Theorem) (cont.): One observation that helps make sense of this result is that, in the case of the normal distribution (i.e., X1, X2, …, Xn are i.i.d. normal), X̄ is N(μ, σ²/n). Hence, (X̄ − μ)/(σ/√n) is (exactly) N(0,1) for each positive value of n. Thus, in the limit, the distribution must also be N(0,1)

22 Central Limit Theorem
Theorem (Central Limit Theorem) (cont.): The powerful fact is that this theorem holds for any distribution of the Xi’s. It explains the remarkable fact that the empirical frequencies of so many natural “populations” exhibit a bell-shaped (i.e., normal) curve. The term “central limit theorem” traces back to George Pólya, who first used it in 1920 in the title of a paper. Pólya referred to the theorem as “central” due to its importance in probability theory

23 Central Limit Theorem The Central Limit Theorem and the Strong Law of Large Numbers are the two fundamental theorems of probability

24 Central Limit Theorem
Example 1 (Normal Approximation to the Uniform Sum Distribution (a.k.a. the Irwin–Hall Distribution)): Let Xi, i = 1, 2, …, n, be i.i.d. U(0,1). Compare the graph of the pdf of Y = X1 + X2 + … + Xn with the graph of the N(n(1/2), n(1/12)) pdf
(Figure: n = 2)

25 Central Limit Theorem
(Figure: n = 4; the normal curve already tracks the uniform-sum pdf closely)

26 Central Limit Theorem
Example 2 (Normal Approximation to the Uniform Sum Distribution (a.k.a. the Irwin–Hall Distribution)): Let Xi, i = 1, 2, …, 10, be i.i.d. U(0,1). Estimate P(X1 + X2 + … + X10 > 7)
Solution: With μ = 1/2 and σ² = 1/12, the sum Y = X1 + … + X10 has mean 10(1/2) = 5 and variance 10(1/12) = 5/6. By the central limit theorem,
P(Y > 7) ≈ P(Z > (7 − 5)/√(5/6)) = P(Z > 2.19) ≈ 0.0143
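The CLT estimate can be compared with a Monte Carlo estimate. A sketch using only the standard library (the normal cdf Φ is built from `math.erf`):

```python
import math
import random

# CLT estimate of P(X1 + ... + X10 > 7) for i.i.d. U(0,1),
# where Y has mean 5 and variance 10/12 = 5/6, versus simulation.
random.seed(5)

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

clt = 1 - phi((7 - 5) / math.sqrt(5 / 6))
mc = sum(sum(random.random() for _ in range(10)) > 7
         for _ in range(200_000)) / 200_000
print(round(clt, 4), round(mc, 4))  # both near 0.014
```

The two estimates agree to about three decimal places; the small gap reflects the approximation error of the CLT at n = 10.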

27 Central Limit Theorem
Example 3 (Normal Approximation to the Chi-Square Distribution): Let X1, X2, …, Xn be i.i.d. N(0,1). Then, Y = X1² + X2² + … + Xn² is chi-square with n degrees of freedom, with E(Y) = n and Var(Y) = 2n. Recall the pdf of Y is
f(y) = y^(n/2 − 1) e^(−y/2) / (Γ(n/2) 2^(n/2)), 0 < y < ∞
Let W = (Y − n)/√(2n)

28 Central Limit Theorem
The pdf of W follows from the change of variable y = n + √(2n)w: fW(w) = √(2n) f(n + √(2n)w). Compare the pdf of W and the pdf of N(0,1)
(Figures: n = 20 and n = 100; the fit improves as n grows)

29 Approximations for Discrete Distributions The beauty of the central limit theorem is that it holds regardless of the underlying distribution (even discrete)

30 Approximations for Discrete Distributions
Example 4 (Normal Approximation to the Binomial Distribution): Let X1, X2, …, Xn be a random sample from a Bernoulli distribution with μ = p and σ² = p(1 − p). Then, Y = X1 + X2 + … + Xn is binomial b(n, p). The central limit theorem states that (Y − np)/√(np(1 − p)) tends to N(0,1) as n approaches infinity

31 Approximations for Discrete Distributions
Thus, if n is sufficiently large, the distribution of Y is approximately N(np, np(1 − p)), and probabilities for the binomial distribution b(n, p) can be approximated with this normal distribution; i.e.,
P(Y ≤ k) ≈ Φ((k + 0.5 − np)/√(np(1 − p)))
for sufficiently large n, where the 0.5 is the half-unit (continuity) correction for a discrete distribution

32 Approximations for Discrete Distributions
Consider n = 10, p = 1/2, i.e., Y ~ b(10, 1/2). Then, by the CLT, Y can be approximated by the normal distribution with mean 10(1/2) = 5 and variance 10(1/2)(1/2) = 5/2. Compare the pmf of Y and the pdf of N(5, 5/2)
(Figure)
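For this b(10, 1/2) case the approximation can be checked exactly with the standard library. A sketch comparing P(Y ≤ 6) computed from the binomial pmf with the N(5, 5/2) approximation under the half-unit continuity correction:

```python
import math

# Exact binomial b(10, 1/2) probability versus the N(5, 5/2)
# approximation with the half-unit continuity correction.

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

n, p = 10, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

# P(Y <= 6) exactly, and approximately via P(Z <= (6.5 - mu)/sigma)
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(7))
approx = phi((6.5 - mu) / sigma)
print(round(exact, 4), round(approx, 4))  # both near 0.828
```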

33 Approximations for Discrete Distributions
Example 5 (Normal Approximation to the Poisson Distribution): Recall the Poisson pmf
f(y) = λ^y e^(−λ)/y!, y = 0, 1, 2, …
where the parameter λ is both the mean and the variance of the distribution. A Poisson random variable counts the number of discrete occurrences (sometimes called “events” or “arrivals”) that take place during a time interval of given length

34 Approximations for Discrete Distributions
A random variable Y having a Poisson distribution with mean 20 can be thought of as the sum of the observations of a random sample of size 20 from a Poisson distribution with mean 1. Thus, (Y − 20)/√20 has a distribution that is approximately N(0,1), and the distribution of Y is approximately N(20, 20)
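This approximation can also be checked directly. A sketch comparing the exact Poisson(20) probability P(Y ≤ 25) (the cutoff 25 is an illustrative choice, not from the slides) with the N(20, 20) approximation under the half-unit continuity correction:

```python
import math

# Exact Poisson(20) cumulative probability versus the N(20, 20)
# approximation with the half-unit continuity correction.

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

lam = 20
exact = sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(26))
approx = phi((25.5 - lam) / math.sqrt(lam))
print(round(exact, 4), round(approx, 4))  # P(Y <= 25), both near 0.89
```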

35 Approximations for Discrete Distributions
Compare the pmf of Y and the pdf of N(20, 20)
(Figure: the normal curve closely tracks the Poisson(20) pmf)

36 Markov’s Inequality
Theorem (Markov’s Inequality): If X is a random variable that takes only nonnegative values, then for any a > 0,
P(X ≥ a) ≤ E(X)/a
The inequality is valid for all distributions (discrete or continuous)

37 Markov’s Inequality
Proof (continuous case): E(X) = ∫₀^∞ x f(x) dx ≥ ∫ₐ^∞ x f(x) dx ≥ ∫ₐ^∞ a f(x) dx = a P(X ≥ a). Dividing both sides by a gives P(X ≥ a) ≤ E(X)/a

38 Markov’s Inequality
Intuition behind Markov’s inequality, using a fair die (discrete) example: E(X) = (1 + 2 + … + 6)/6 = 3.5. Then, for instance, P(X ≥ 6) = 1/6 ≤ 3.5/6 ≈ 0.58, and P(X ≥ 4) = 1/2 ≤ 3.5/4 = 0.875
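The die illustration can be tabulated exactly. A sketch using exact rational arithmetic: for a fair die, E(X) = 7/2, and Markov’s inequality says P(X ≥ a) ≤ E(X)/a for every a > 0.

```python
from fractions import Fraction

# For a fair die, compare P(X >= a) with the Markov bound E(X)/a.
mean = Fraction(7, 2)  # E(X) = (1 + 2 + ... + 6)/6
for a in range(1, 7):
    p_at_least_a = Fraction(7 - a, 6)  # P(X >= a), a = 1, ..., 6
    bound = mean / a                   # Markov bound E(X)/a
    assert p_at_least_a <= bound       # the inequality always holds
    print(a, p_at_least_a, bound)
```

Note the bound is loose (even trivial, exceeding 1) for small a, which is typical of Markov’s inequality.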

39 Chebyshev’s Inequality
Theorem (Chebyshev’s Inequality): If X is a random variable with mean μ and variance σ², then for any k > 0,
P(|X − μ| ≥ k) ≤ σ²/k²
The inequality is valid for all distributions (discrete or continuous) for which the standard deviation exists

40 Chebyshev’s Inequality
Proof: Since (X − μ)² is a nonnegative random variable, we can apply Markov’s inequality (with a = k²) to obtain
P((X − μ)² ≥ k²) ≤ E[(X − μ)²]/k²
Thus, since (X − μ)² ≥ k² if and only if |X − μ| ≥ k,
P(|X − μ| ≥ k) ≤ σ²/k²

41 Chebyshev’s Inequality (Another Form)
Chebyshev’s inequality (another form): Replacing k by kσ gives P(|X − μ| ≥ kσ) ≤ 1/k². That is, the probability that X differs from its mean by at least k standard deviations is less than or equal to 1/k². It follows that the probability that X differs from its mean by less than k standard deviations is at least 1 − 1/k²

42 Chebyshev’s Inequality The importance of Markov’s and Chebyshev’s inequalities is that they enable us to derive (sometimes loose but still useful) bounds on probabilities when only the mean, or both the mean and the variance, of the probability distribution are known

43 Chebyshev’s Inequality
Example 1: If it is known that X has a mean of 25 and a variance of 16 (so σ = 4), then a lower bound for P(17 < X < 33) is given by
P(17 < X < 33) = P(|X − 25| < 8) ≥ 1 − 16/8² = 1 − 16/64 = 3/4
and an upper bound for P(|X − 25| ≥ 12) is
P(|X − 25| ≥ 12) ≤ 16/12² = 16/144 = 1/9
The results hold for any distribution with mean 25 and variance 16
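The arithmetic in this example can be sketched directly, using the standard-deviation form of the inequality: express each distance from the mean in units of σ = 4 and apply the 1 − 1/k² and 1/k² bounds.

```python
# Chebyshev bounds for a distribution with mean 25 and variance 16.
mu, var = 25, 16
sigma = var ** 0.5  # 4.0

# P(17 < X < 33) = P(|X - 25| < 8); 8 is k = 2 standard deviations,
# so Chebyshev gives the lower bound 1 - 1/k^2 = 3/4.
k = 8 / sigma
lower = 1 - 1 / k**2
print(lower)            # 0.75

# P(|X - 25| >= 12); 12 is k = 3 standard deviations,
# so the upper bound is 1/k^2 = 1/9.
k = 12 / sigma
upper = 1 / k**2
print(round(upper, 4))  # 0.1111
```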