
1 Probability and Moment Approximations using Limit Theorems

2 Introduction
Until now, the procedures employed were all based on knowledge of the PMF/PDF and the implementation of its summation/integration. In many situations the PMF/PDF may be unknown, or the summation/integration may not be easily carried out. It would be of great utility, therefore, to be able to approximate the desired quantities using much simpler methods. For RVs that are the sum of a large number of independent and identically distributed (IID) RVs this can be done. We will focus on two very powerful theorems in probability:
- The law of large numbers
- The central limit theorem

3 Convergence and Approximation of a Sum
We need to understand the role that convergence plays in approximating the behavior of a sum of terms. This sum may be evaluated in closed form to allow a comparison to its approximation. The error in the approximation of s_N by its limit s = lim_{N→∞} s_N is given by ε_N = |s_N − s|. The rate of convergence depends upon the particular sequence, but the error eventually goes to zero as N → ∞.
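A minimal numerical sketch of this idea; since the slide does not specify the sequence, the geometric partial sum s_N = Σ_{n=1}^N (1/2)^n with limit s = 1 is an assumption for illustration:

```python
# Sketch: error of a partial sum versus its limit. The sequence
# s_N = sum_{n=1}^N (1/2)^n (limit s = 1) is an assumed example.
s = 1.0      # limit of the assumed geometric series
s_N = 0.0
for N in range(1, 21):
    s_N += 0.5 ** N
    eps_N = abs(s_N - s)          # error in approximating s by s_N
    if N in (1, 5, 10, 20):
        print(f"N = {N:2d}: s_N = {s_N:.6f}, error = {eps_N:.2e}")
```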

4 Law of Large Numbers
We argued before that if a fair coin is tossed N times in succession, then the relative frequency (r.f.) of heads is close to ½. If N → ∞ then we expect the r.f. to approach ½. Let's prove it by modeling the repeated coin toss experiment as a sequence of N Bernoulli subexperiments. The overall experimental output is described by the random vector X = [X_1 X_2 ... X_N]^T. We next assume that the discrete RVs X_i are IID with marginal PMF p_X[k] = 1 − p for k = 0 and p_X[k] = p for k = 1.

5 Law of Large Numbers
The r.f. is given by the sample mean RV X̄_N = (1/N) Σ_{i=1}^N X_i. As N → ∞, consider the mean and variance. Because the X_i's are identically distributed, E[X̄_N] = E_X[X]; because the X_i's are independent (hence uncorrelated), var(X̄_N) = var(X)/N.
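A short simulation sketch verifying these two moment results for Bernoulli trials; the choices p = 0.5, N = 100, and the trial count are assumptions for illustration:

```python
import numpy as np

# Sketch: check E[Xbar_N] = E_X[X] and var(Xbar_N) = var(X)/N by
# simulation for Bernoulli(p); p, N, and trials are assumed values.
rng = np.random.default_rng(0)
p, N, trials = 0.5, 100, 100_000
xbar = rng.binomial(1, p, size=(trials, N)).mean(axis=1)  # sample mean RVs
print("mean of Xbar_N:", xbar.mean(), "(theory:", p, ")")
print("var  of Xbar_N:", xbar.var(), "(theory:", p * (1 - p) / N, ")")
```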

6 Law of Large Numbers
We know that for X_i ~ Ber(p) the variance is var(X_i) = p(1 − p). Since p = ½ for a fair coin, var(X̄_N) = p(1 − p)/N = 1/(4N). Therefore the width of the PMF of X̄_N must decrease as N increases and eventually go to zero. Since the variance goes to zero, we must have that X̄_N → ½ as N → ∞: in the limit the sample mean behaves as a degenerate RV, with all its probability mass at ½.
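A sketch of a single realization's running relative frequency, assuming a fair coin; the printed standard deviation √(1/(4N)) = 0.5/√N shows the shrinking width:

```python
import numpy as np

# Sketch: one realization of the running relative frequency of heads.
# As N grows, std(Xbar_N) = 0.5 / sqrt(N) shrinks, so the r.f. settles
# near 1/2. The total of 10,000 tosses is an assumed value.
rng = np.random.default_rng(1)
tosses = rng.integers(0, 2, size=10_000)             # fair-coin tosses
rel_freq = np.cumsum(tosses) / np.arange(1, 10_001)  # Xbar_N, N = 1..10000
for N in (10, 100, 1000, 10_000):
    print(f"N = {N:5d}: r.f. = {rel_freq[N - 1]:.4f}, "
          f"std = {0.5 / N**0.5:.4f}")
```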

7 Law of Large Numbers
Note that the sum of N IID Bernoulli RVs is a binomial RV, thus S_N = Σ_{i=1}^N X_i ~ bin(N, p) and P[S_N = k] = C(N, k) p^k (1 − p)^{N−k} for k = 0, 1, ..., N. To find the PMF of X̄_N we let X̄_N = S_N/N and note that X̄_N can take on the values k/N for k = 0, 1, ..., N. Thus the PMF becomes P[X̄_N = k/N] = C(N, k) p^k (1 − p)^{N−k}.
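A sketch evaluating this PMF exactly; N = 10 and p = ½ are assumed values for illustration:

```python
from math import comb

# Sketch: exact PMF of Xbar_N = S_N / N for a fair coin, using
# P[Xbar_N = k/N] = C(N, k) p^k (1 - p)^(N - k); N = 10 is assumed.
N, p = 10, 0.5
for k in range(N + 1):
    prob = comb(N, k) * p**k * (1 - p)**(N - k)
    print(f"P[Xbar_N = {k / N:.1f}] = {prob:.4f}")
```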

8 Law of Large Numbers
As N increases, X̄_N takes on values more densely in the interval [0, 1]. We do not obtain a PMF with all its mass concentrated at 0.5 as we might expect. The PMF appears Gaussian in shape, although it changes in amplitude and width for each N.

9 Law of Large Numbers
The preceding results say that for large enough N the sample mean RV will always yield a number close to the expected value, which in our case is ½. In general, X̄_N → E_X[X] as N → ∞.

10 Law of Large Numbers
Note that since X̄_N is the r.f. of heads and p is the probability of heads, we have shown that the r.f. converges to p as N → ∞. The probability of a head in a single coin toss can be interpreted as the value obtained as the r.f. of heads in a large number of independent and identical coin tosses. This justifies our use of the sample mean RV as an estimator of a moment, since (1/N) Σ_{i=1}^N X_i → E_X[X] as N → ∞.

11 Theorem: Law of Large Numbers
If X_1, X_2, ..., X_N are IID RVs with mean E_X[X] and var(X) = σ² < ∞, then X̄_N = (1/N) Σ_{i=1}^N X_i → E_X[X] in probability as N → ∞.
Proof. Consider the probability of the sample mean RV deviating from the expected value by more than ε, where ε is a small positive number. This probability is given by P[|X̄_N − E_X[X]| > ε]. Since E[X̄_N] = E_X[X] and var(X̄_N) = σ²/N, we have upon using Chebyshev's inequality
P[|X̄_N − E_X[X]| > ε] ≤ var(X̄_N)/ε² = σ²/(Nε²),
and taking the limit of both sides yields lim_{N→∞} P[|X̄_N − E_X[X]| > ε] ≤ lim_{N→∞} σ²/(Nε²) = 0.
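A sketch comparing the Chebyshev bound σ²/(Nε²) against the simulated deviation probability; ε = 0.05 and the Bernoulli(½) model are assumptions for illustration:

```python
import numpy as np

# Sketch: Chebyshev bound sigma^2 / (N * eps^2) versus the simulated
# P[|Xbar_N - E[X]| > eps] for Bernoulli(1/2); eps and trials assumed.
rng = np.random.default_rng(2)
p, eps, trials = 0.5, 0.05, 200_000
sigma2 = p * (1 - p)
for N in (100, 1000, 10_000):
    xbar = rng.binomial(N, p, size=trials) / N   # S_N / N drawn directly
    prob = np.mean(np.abs(xbar - p) > eps)
    print(f"N = {N:5d}: P = {prob:.4f} <= "
          f"bound = {sigma2 / (N * eps**2):.4f}")
```

The bound is loose (for small N it even exceeds 1), but it suffices to force the limit to zero.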

12 Theorem: Law of Large Numbers
Since a probability must be greater than or equal to zero, we have lim_{N→∞} P[|X̄_N − E_X[X]| > ε] = 0, which is a statement that the sample mean RV converges in probability to the expected value of a single RV. Convergence in probability does not mean all realizations will converge.

13 Central Limit Theorem (CLT): PDF of a sum of uniformly distributed RVs
By the law of large numbers the PMF/PDF of the sample mean RV decreases in width until all the probability is concentrated about the mean. Let's instead consider the sum of IID RVs when X_i ~ U(−1/2, 1/2). If N = 2, then S_2 = X_1 + X_2, and the PDF of S_2 is easily found using a convolution integral: p_{S_2}(x) = (p_X * p_X)(x) = 1 − |x| for |x| ≤ 1, and 0 otherwise.
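A sketch that approximates this convolution numerically and checks it against the triangular form 1 − |x|; the grid spacing dx = 0.001 is an assumption:

```python
import numpy as np

# Sketch: numerical convolution of two U(-1/2, 1/2) PDFs. The result
# should match the triangular PDF p_S2(x) = 1 - |x| on [-1, 1].
dx = 0.001
x = np.arange(-0.5, 0.5, dx)
p_X = np.ones_like(x)                       # U(-1/2, 1/2) density
p_S2 = np.convolve(p_X, p_X) * dx           # p_X * p_X, supported on [-1, 1]
u = -1.0 + dx * np.arange(p_S2.size)        # grid for the convolution output
for xi in (-0.5, 0.0, 0.5):
    i = np.argmin(np.abs(u - xi))
    print(f"p_S2({xi:+.1f}) ~= {p_S2[i]:.3f}  (exact: {1 - abs(xi):.3f})")
```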

14 CLT: PDF of a sum of uniformly distributed RVs
More generally, we have p_{S_N}(x) = p_X(x) * p_X(x) * ... * p_X(x), an N-fold convolution. To find p_{S_3}(x) we must convolve p_{S_2}(x) with p_X(x) to yield p_X(x) * p_X(x) * p_X(x), since p_{S_2}(x) = p_X(x) * p_X(x). This is p_{S_3}(x) = ∫ p_{S_2}(u) p_X(x − u) du, where only the interval on which both factors are nonzero contributes to the integral.

15 CLT: PDF of a sum of uniformly distributed RVs
Since p_X(−x) = p_X(x), we can express this in a more convenient form as p_{S_3}(x) = ∫ p_{S_2}(u) p_X(u − x) du. The integrand may be determined by plotting p_{S_2}(u) and the right-shifted version p_X(u − x) and multiplying these two functions.

16 CLT: PDF of a sum of uniformly distributed RVs
We have chosen the mean and variance of the Gaussian approximation to match those of p_{S_3}(x). Since var(X) = (b − a)²/12 for X ~ U(a, b), for X_i ~ U(−1/2, 1/2) we have var(X_i) = 1/12, and hence the matching Gaussian is N(0, 3/12).
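A Monte Carlo sketch checking how well the matched Gaussian describes S_3; the sample size and the checked point x = 0.5 are assumptions:

```python
import numpy as np
from scipy.stats import norm

# Sketch: S_3 = X_1 + X_2 + X_3 with X_i ~ U(-1/2, 1/2), compared to the
# matched Gaussian N(0, 3/12). Sample size is an assumed value.
rng = np.random.default_rng(3)
s3 = rng.uniform(-0.5, 0.5, size=(1_000_000, 3)).sum(axis=1)
print("sample mean:", s3.mean(), "(Gaussian: 0)")
print("sample var :", s3.var(), "(Gaussian:", 3 / 12, ")")
# One tail probability: simulated value vs. the Gaussian approximation.
print("P[S_3 <= 0.5] ~", np.mean(s3 <= 0.5),
      "; Gaussian approx:", norm.cdf(0.5 / np.sqrt(3 / 12)))
```

Even for N = 3 the agreement is already close, which is why the slide's plot appears nearly Gaussian.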

17 CLT: PDF of a sum of uniformly distributed RVs
In the case of a uniformly distributed RV over the interval (0, 1), a repeated convolution produces a PDF that moves to the right as N increases, since E[S_N] = N E_X[X] = N/2, and whose width also increases, since var(S_N) = N var(X) = N/12. Because of this it is not possible to state that the PDF converges to any PDF.
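A sketch confirming the drift and spread by simulation, for a few assumed values of N:

```python
import numpy as np

# Sketch: for X_i ~ U(0, 1) the sum's mean and variance both grow with N,
# so the unnormalized PDF of S_N drifts right and widens (no limiting PDF).
rng = np.random.default_rng(5)
for N in (2, 10, 50):
    s_N = rng.uniform(0, 1, size=(100_000, N)).sum(axis=1)
    print(f"N = {N:2d}: E[S_N] ~= {s_N.mean():6.2f} (theory {N / 2:5.2f}), "
          f"var(S_N) ~= {s_N.var():5.2f} (theory {N / 12:5.2f})")
```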

18 Central Limit Theorem
Normalization solves the problem. Define the standardized sum Z_N = (S_N − N E_X[X]) / √(N var(X)), which has zero mean and unit variance for every N. We can assert that this standardized RV will converge to a N(0, 1) RV: the central limit theorem states that the PDF of the standardized sum of a large number of continuous IID RVs will converge to a Gaussian PDF.
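A Monte Carlo sketch of the standardized sum for X_i ~ U(0, 1) with an assumed N = 30, compared against the N(0, 1) CDF at a few points:

```python
import numpy as np
from scipy.stats import norm

# Sketch: Z_N = (S_N - N*E[X]) / sqrt(N*var(X)) for X_i ~ U(0, 1), where
# E[X] = 1/2 and var(X) = 1/12; N = 30 and trials are assumed values.
rng = np.random.default_rng(4)
N, trials = 30, 200_000
s_N = rng.uniform(0, 1, size=(trials, N)).sum(axis=1)
z_N = (s_N - N * 0.5) / np.sqrt(N / 12)
for x in (-1.0, 0.0, 1.0):
    print(f"P[Z_N <= {x:+.1f}] = {np.mean(z_N <= x):.4f}, "
          f"Phi({x:+.1f}) = {norm.cdf(x):.4f}")
```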

19 Central Limit Theorem: Applications
There is no need to know the PDF of each RV or, even if it is known, to determine the exact PDF of the sum, which may not be possible. Some applications are:
- Polling
- Noise characterization
- Scattering effects modeling
- Kinetic theory of gases
- Economic modeling

20 Central Limit Theorem
Central limit theorem for continuous RVs: if X_1, X_2, ..., X_N are continuous IID RVs, each with mean E_X[X] and variance var(X), and Z_N = (Σ_{i=1}^N X_i − N E_X[X]) / √(N var(X)), then Z_N → N(0, 1) as N → ∞. Equivalently, the CDF of the standardized sum converges to Φ(x), i.e., lim_{N→∞} P[Z_N ≤ x] = Φ(x) = ∫_{−∞}^{x} (1/√(2π)) exp(−t²/2) dt.

21 CLT: PDF of a sum of squares of independent N(0, 1) RVs
Let X_i ~ N(0, 1) for i = 1, 2, ..., N and assume the X_i's are independent. We wish to determine the approximate PDF of Y = Σ_{i=1}^N X_i². To apply the CLT we note that since the X_i's are IID, so are the X_i²'s. Then as N → ∞, Y is approximately Gaussian with mean N E_X[X²] and variance N var(X²). But X² ~ χ²_1, so that E_X[X²] = 1 and var(X²) = 2, and therefore Y ≈ N(N, 2N) for large N.
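A sketch comparing the exact chi-squared CDF (Y is χ²_N, the sum of N squared standard normals) with the Gaussian approximation N(N, 2N); N = 20 and the evaluation points are assumptions:

```python
import numpy as np
from scipy.stats import chi2, norm

# Sketch: Y = sum of N squared N(0,1) RVs is chi-squared with N degrees
# of freedom; the CLT gives Y ~ N(N, 2N) approximately. N = 20 assumed.
N = 20
for y in (10, 20, 30):
    print(f"P[Y <= {y}]: chi2 = {chi2.cdf(y, df=N):.4f}, "
          f"Gaussian approx = {norm.cdf((y - N) / np.sqrt(2 * N)):.4f}")
```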

22 CLT: PDF of a sum of squares of independent N(0, 1) RVs
(Figure: the chi-squared PDF of Y and its Gaussian approximation N(N, 2N), shown for N = 10 and N = 20.)

