Presentation on theme: "Generating Functions. The Moments of Y We have referred to E(Y) and E(Y²) as the first and second moments of Y, respectively. In general, E(Y^k) is." — Presentation transcript:

1 Generating Functions

2 The Moments of Y We have referred to E(Y) and E(Y²) as the first and second moments of Y, respectively. In general, E(Y^k) is the kth moment of Y. Consider the polynomial in t whose coefficients incorporate the moments of Y:

1 + t E(Y) + (t²/2!) E(Y²) + (t³/3!) E(Y³) + … + (t^k/k!) E(Y^k) + …

3 Moment Generating Function If the sum converges for all t in some interval |t| < b, the polynomial is called the moment-generating function, m(t), for the random variable Y. And we may note that for each k,

(t^k/k!) E(Y^k) = E((tY)^k / k!).

4 Moment Generating Function Hence, the moment-generating function is given by

m(t) = E(1 + tY + (tY)²/2! + (tY)³/3! + …) = E(e^{tY}).

We may rearrange the expectation and the sum, since the sum is finite for |t| < b.

5 Moment Generating Function That is, m(t) = E(e^{tY}) is the polynomial whose coefficients involve the moments of Y:

m(t) = Σ_{k=0}^∞ (t^k/k!) E(Y^k).
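As a quick numeric sanity check of this identity (a sketch using a hypothetical random variable, uniform on {1, 2, 3}, chosen for illustration and not taken from the slides), the truncated series Σ (t^k/k!) E(Y^k) should match m(t) = E(e^{tY}):

```python
import math

ys = [1, 2, 3]                       # hypothetical Y: uniform on {1, 2, 3}

def m(t):                            # MGF computed directly: E(e^{tY})
    return sum(math.exp(t * y) for y in ys) / len(ys)

def moment(k):                       # kth moment E(Y^k) computed directly
    return sum(y**k for y in ys) / len(ys)

t = 0.3
series = sum(t**k / math.factorial(k) * moment(k) for k in range(25))
print(m(t), series)                  # the truncated series matches m(t)
```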

6 The kth moment To retrieve the kth moment from the MGF, evaluate the kth derivative at t = 0:

d^k m/dt^k = E(Y^k) + t E(Y^{k+1}) + (t²/2!) E(Y^{k+2}) + …

And so, letting t = 0: m^(k)(0) = E(Y^k).
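We can see this numerically with central finite differences (a sketch; the fair six-sided die is a hypothetical example, not one from the slides). Its true moments are E(Y) = 3.5 and E(Y²) = 91/6 ≈ 15.1667:

```python
import math

# hypothetical Y: a fair six-sided die (illustration only)
def m(t):                                 # m(t) = E(e^{tY}) = (1/6) * sum of e^{ty}
    return sum(math.exp(t * y) for y in range(1, 7)) / 6

h = 1e-5
m1 = (m(h) - m(-h)) / (2 * h)             # central difference for m'(0) = E(Y)
m2 = (m(h) - 2 * m(0) + m(-h)) / h**2     # central difference for m''(0) = E(Y^2)
print(m1, m2)                             # ≈ 3.5 and ≈ 15.1667
```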

7 Geometric MGF For the geometric distribution with success probability p (and q = 1 − p),

m(t) = Σ_{y=1}^∞ e^{ty} q^{y−1} p = p e^t Σ_{y=1}^∞ (q e^t)^{y−1} = p e^t / (1 − q e^t), valid for q e^t < 1.
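A minimal check that the closed form agrees with the direct sum over the pmf (the value p = 0.3 is an illustrative assumption, not from the slides):

```python
import math

p, q = 0.3, 0.7                  # illustrative parameter choice (assumed)
t = 0.1                          # q * e^t < 1, so the geometric series converges
closed = p * math.exp(t) / (1 - q * math.exp(t))
series = sum(math.exp(t * y) * q**(y - 1) * p for y in range(1, 500))
print(closed, series)            # closed form and direct sum agree
```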

8 Common MGFs The MGFs for some of the discrete distributions we’ve seen include:

Binomial(n, p): m(t) = (p e^t + q)^n
Geometric(p): m(t) = p e^t / (1 − q e^t)
Poisson(λ): m(t) = e^{λ(e^t − 1)}

9 Recognize the distribution Identify the distribution having the given moment-generating function. Give the mean and variance for this distribution. We could use the derivatives, but is that necessary?

10 Geometric MGF Consider the MGF

m(t) = (1/3) e^t / (1 − (2/3) e^t).

Use derivatives to determine the first and second moments:

m′(t) = (1/3) e^t / (1 − (2/3) e^t)²
m″(t) = (1/3) e^t (1 + (2/3) e^t) / (1 − (2/3) e^t)³

And so, m′(0) = (1/3)/(1/3)² = 3 = E(Y), and m″(0) = (1/3)(5/3)/(1/3)³ = 15 = E(Y²).

11 Geometric MGF Since V(Y) = E(Y²) − [E(Y)]², we have V(Y) = 15 − 3² = 6.

12 Geometric MGF Since m(t) = (1/3) e^t / (1 − (2/3) e^t) is the MGF for a geometric random variable with p = 1/3, our prior results tell us E(Y) = 1/p = 3 and V(Y) = (1 − p)/p² = 6, which agree with our current results.
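A small sanity check of these numbers, summing directly over the geometric pmf p q^{y−1} (truncated at 200 terms, which is plenty since the terms decay geometrically):

```python
p, q = 1/3, 2/3
EY  = sum(y * q**(y - 1) * p for y in range(1, 200))       # E(Y)  = 1/p = 3
EY2 = sum(y * y * q**(y - 1) * p for y in range(1, 200))   # E(Y^2) = 15
V = EY2 - EY**2                                             # V(Y) = 6
print(EY, EY2, V)
```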

13 All the moments Although the mean and variance help to describe a distribution, they alone do not uniquely determine it. All the moments are necessary to uniquely describe a probability distribution. That is, if two random variables have equal MGFs (i.e., m_Y(t) = m_Z(t) for |t| < b), then they have the same probability distribution.

14 m(aY+b)? For the random variable Y with MGF m(t), consider W = aY + b. Then

m_W(t) = E(e^{tW}) = E(e^{t(aY+b)}) = e^{bt} E(e^{(at)Y}) = e^{bt} m(at).

Construct the MGF for the random variable W = 2Y + 3, where Y is a geometric random variable with p = 4/5:

m_W(t) = e^{3t} (4/5) e^{2t} / (1 − (1/5) e^{2t}).
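A sketch verifying this example numerically: E(e^{tW}) summed directly over the geometric pmf should equal e^{bt} m(at) (the test point t = 0.2 is an arbitrary choice inside the region of convergence):

```python
import math

p, q, a, b = 4/5, 1/5, 2, 3           # W = 2Y + 3, Y geometric with p = 4/5

def m(s):                              # geometric MGF, valid while q * e^s < 1
    return p * math.exp(s) / (1 - q * math.exp(s))

def mW_direct(t):                      # E(e^{tW}) summed over the geometric pmf
    return sum(math.exp(t * (a * y + b)) * q**(y - 1) * p for y in range(1, 200))

t = 0.2                                # inside the region of convergence
print(mW_direct(t), math.exp(b * t) * m(a * t))   # the two should agree
```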

15 E(aY+b) Now, based on the MGF, we could again consider E(W) = E(aY + b). Differentiating m_W(t) = e^{bt} m(at) gives

m_W′(t) = b e^{bt} m(at) + a e^{bt} m′(at).

And so, letting t = 0: E(W) = b m(0) + a m′(0) = a E(Y) + b, as expected.

16 Tchebysheff’s Theorem For “bell-shaped” distributions, the empirical rule gave us the 68-95-99.7% rule: the probability a value falls within 1, 2, or 3 standard deviations of the mean, respectively. When the distribution is not so bell-shaped, Tchebysheff’s theorem tells us the probability of being within k standard deviations of the mean is at least 1 − 1/k², for k > 0:

P(|Y − μ| < kσ) ≥ 1 − 1/k².

Remember, it’s just a lower bound.

17 A Skewed Distribution Consider a binomial experiment with n = 10 and p = 0.1. Its distribution is skewed right, with μ = np = 1 and σ = √(np(1 − p)) = √0.9 ≈ 0.9487.

18 A Skewed Distribution Verify Tchebysheff’s lower bound for k = 2:

P(|Y − μ| < 2σ) = P(1 − 2(0.9487) < Y < 1 + 2(0.9487)) = P(Y ≤ 2)
= P(0) + P(1) + P(2) ≈ 0.3487 + 0.3874 + 0.1937 ≈ 0.9298,

which is indeed at least 1 − 1/2² = 0.75.
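The verification above can be sketched in a few lines, summing the binomial pmf over the values within k = 2 standard deviations of the mean:

```python
from math import comb, sqrt

n, p = 10, 0.1
mu = n * p                             # mean = 1.0
sigma = sqrt(n * p * (1 - p))          # std dev ≈ 0.9487

def pmf(y):                            # binomial probability P(Y = y)
    return comb(n, y) * p**y * (1 - p)**(n - y)

k = 2
prob = sum(pmf(y) for y in range(n + 1) if abs(y - mu) < k * sigma)
print(prob, 1 - 1 / k**2)              # ≈ 0.9298, comfortably above the 0.75 bound
```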
