# Some Additional Topics. Distributions of Functions of Random Variables: Gamma distribution, χ² distribution, Exponential distribution


Theorem. Let X and Y denote independent random variables having gamma distributions with parameters (λ, α₁) and (λ, α₂) respectively. Then W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂). Proof: The moment generating function of a gamma random variable with parameters (λ, α) is M(t) = [λ/(λ − t)]^α for t < λ. Since X and Y are independent, M_W(t) = M_X(t) M_Y(t) = [λ/(λ − t)]^α₁ [λ/(λ − t)]^α₂ = [λ/(λ − t)]^(α₁ + α₂).

Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂), we conclude that W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂).
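A quick Monte Carlo check of this additivity property (a sketch using numpy; the parameter values are arbitrary, and note that numpy's gamma sampler takes shape and scale, with scale = 1/λ):

```python
import numpy as np

# If X ~ Gamma(lam, a1) and Y ~ Gamma(lam, a2) are independent,
# X + Y should behave like Gamma(lam, a1 + a2).
rng = np.random.default_rng(0)
lam, a1, a2, n = 2.0, 1.5, 2.5, 200_000

w = rng.gamma(a1, 1 / lam, n) + rng.gamma(a2, 1 / lam, n)

# Gamma(lam, a) has mean a/lam and variance a/lam**2; here
# a = a1 + a2 = 4, so the mean should be near 2 and the variance near 1.
print(w.mean(), w.var())
```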

Theorem (extension to n RVs). Let X₁, X₂, …, Xₙ denote n independent random variables, Xᵢ having a gamma distribution with parameters (λ, αᵢ), i = 1, 2, …, n. Then W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ). Proof: As before, independence gives M_W(t) = M_X₁(t) ⋯ M_Xₙ(t) = [λ/(λ − t)]^(α₁ + α₂ + … + αₙ).

Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ), we conclude that W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ). Therefore the sum of independent gamma random variables with a common rate λ is again gamma, with shape parameter the sum of the individual shapes.

Theorem. Suppose that X is a random variable having a gamma distribution with parameters (λ, α). Then W = aX (a > 0) has a gamma distribution with parameters (λ/a, α). Proof: M_W(t) = E[e^(taX)] = M_X(at) = [λ/(λ − at)]^α = [(λ/a)/((λ/a) − t)]^α, which is the moment generating function of the gamma distribution with parameters (λ/a, α).

Special Cases

1. Let X and Y be independent random variables having an exponential distribution with parameter λ; then X + Y has a gamma distribution with α = 2 and rate λ.
2. Let X₁, X₂, …, Xₙ be independent random variables having an exponential distribution with parameter λ; then S = X₁ + X₂ + … + Xₙ has a gamma distribution with α = n and rate λ.
3. Let X₁, X₂, …, Xₙ be independent random variables having an exponential distribution with parameter λ; then the sample mean x̄ = S/n has a gamma distribution with α = n and rate nλ (by the scaling theorem with a = 1/n).
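Special case 3 can be checked by simulation (a sketch; λ = ¼ and n = 10 are arbitrary choices):

```python
import numpy as np

# The mean of n i.i.d. Exponential(lam) variables should follow a
# gamma distribution with shape n and rate n*lam, so its mean is
# 1/lam and its variance is n/(n*lam)**2 = 1/(n*lam**2).
rng = np.random.default_rng(1)
lam, n, reps = 0.25, 10, 200_000

xbar = rng.exponential(1 / lam, size=(reps, n)).mean(axis=1)

print(xbar.mean())  # should be near 1/lam = 4
print(xbar.var())   # should be near 1/(n*lam**2) = 1.6
```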

[Figure: distribution of the population (exponential) and of the sample mean, another illustration of the central limit theorem.]

Special Cases (continued)

4. Let X and Y be independent random variables having χ² distributions with ν₁ and ν₂ degrees of freedom respectively; then X + Y has a χ² distribution with ν₁ + ν₂ degrees of freedom.
5. Let X₁, X₂, …, Xₙ be independent random variables having χ² distributions with ν₁, ν₂, …, νₙ degrees of freedom respectively; then X₁ + X₂ + … + Xₙ has a χ² distribution with ν₁ + … + νₙ degrees of freedom.

Both of these properties follow from the fact that a χ² random variable with ν degrees of freedom is a gamma random variable with λ = ½ and α = ν/2.

Recall that if Z has a standard normal distribution then Z² has a χ² distribution with 1 degree of freedom. Thus if Z₁, Z₂, …, Z_ν are independent random variables each having a standard normal distribution, then Z₁² + Z₂² + … + Z_ν² has a χ² distribution with ν degrees of freedom.
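A simulation sketch of this fact, comparing the first two moments of the simulated sum with those of the χ² distribution (ν = 5 is an arbitrary choice):

```python
import numpy as np

# The sum of squares of nu independent standard normals should match
# a chi-square distribution with nu degrees of freedom, which has
# mean nu and variance 2*nu.
rng = np.random.default_rng(2)
nu, reps = 5, 200_000

u = (rng.standard_normal((reps, nu)) ** 2).sum(axis=1)

print(u.mean())  # near nu = 5
print(u.var())   # near 2*nu = 10
```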

Theorem. Suppose that U₁ and U₂ are independent random variables and that U = U₁ + U₂. Suppose that U₁ and U have χ² distributions with ν₁ and ν degrees of freedom respectively (ν₁ < ν). Then U₂ has a χ² distribution with ν₂ = ν − ν₁ degrees of freedom. Proof: Since U₁ and U₂ are independent, M_U(t) = M_U₁(t) M_U₂(t), so M_U₂(t) = M_U(t)/M_U₁(t) = (1 − 2t)^(−ν/2) / (1 − 2t)^(−ν₁/2) = (1 − 2t)^(−(ν − ν₁)/2), the moment generating function of a χ² distribution with ν − ν₁ degrees of freedom.

Q.E.D.

## Bivariate Distributions: Discrete Random Variables

The joint probability function: p(x, y) = P[X = x, Y = y].

Marginal distributions: p_X(x) = Σ_y p(x, y) and p_Y(y) = Σ_x p(x, y). Conditional distributions: p(y | x) = p(x, y)/p_X(x) and p(x | y) = p(x, y)/p_Y(y).

The product rule for discrete distributions: p(x, y) = p_X(x) p(y | x) = p_Y(y) p(x | y). Independence: X and Y are independent if and only if p(x, y) = p_X(x) p_Y(y) for all x, y.

Bayes' rule for discrete distributions: p(x | y) = p(y | x) p_X(x) / Σ_u p(y | u) p_X(u). Proof: By definition p(x | y) = p(x, y)/p_Y(y); apply the product rule to the numerator, and expand the denominator as p_Y(y) = Σ_u p(u, y) = Σ_u p(y | u) p_X(u).
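As a small numeric illustration (the joint table below is made up for this sketch, not taken from the slides):

```python
# Joint probability function p(x, y) on {0, 1} x {0, 1} (illustrative values).
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Marginal of Y: p_Y(y) = sum over x of p(x, y).
p_y = {y: sum(v for (_, yy), v in p.items() if yy == y) for y in (0, 1)}

# Bayes' rule: p(x | y) = p(x, y) / p_Y(y).
p_x_given_y1 = {x: p[(x, 1)] / p_y[1] for x in (0, 1)}
print(p_x_given_y1)  # x = 0 with probability 1/3, x = 1 with probability 2/3
```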

## Bivariate Distributions: Continuous Random Variables

Definition: Two random variables are said to have joint probability density function f(x, y) if P[(X, Y) ∈ A] = ∬_A f(x, y) dx dy for every region A of the plane.

Marginal distributions: f_X(x) = ∫ f(x, y) dy and f_Y(y) = ∫ f(x, y) dx. Conditional distributions: f(y | x) = f(x, y)/f_X(x) and f(x | y) = f(x, y)/f_Y(y).

The product rule for continuous distributions: f(x, y) = f_X(x) f(y | x) = f_Y(y) f(x | y). Independence: X and Y are independent if and only if f(x, y) = f_X(x) f_Y(y) for all x, y.

Bayes' rule for continuous distributions: f(x | y) = f(y | x) f_X(x) / ∫ f(y | u) f_X(u) du. Proof: As in the discrete case, f(x | y) = f(x, y)/f_Y(y); apply the product rule to the numerator, and write f_Y(y) = ∫ f(u, y) du = ∫ f(y | u) f_X(u) du.

Example. Suppose that to perform a task we first have to recognize the task, then perform it. Suppose the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4). Once the task is recognized, the time to perform it, Y, is uniform from X/2 to 2X.

1. Find the joint density of X and Y.
2. Find the conditional density of X given Y = y.

Now f_X(x) = ¼ e^(−x/4) for x ≥ 0, and f(y | x) = 1/(2x − x/2) = 2/(3x) for x/2 ≤ y ≤ 2x. Thus, by the product rule, f(x, y) = ¼ e^(−x/4) · 2/(3x) = e^(−x/4)/(6x) for x ≥ 0 and x/2 ≤ y ≤ 2x (and 0 elsewhere).
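A numeric sanity check of this density (a sketch): integrating out y over the strip of width 2x − x/2 = 3x/2 should recover the Exponential(¼) marginal of X, so the double integral of f should be 1.

```python
import numpy as np

# f(x, y) = exp(-x/4) / (6x) for x > 0, x/2 <= y <= 2x.
# Integrating over y (an interval of length 3x/2) gives
# exp(-x/4)/4, the Exponential(1/4) density, whose integral is 1.
x = np.linspace(1e-6, 200, 2_000_001)
g = np.exp(-x / 4) / (6 * x) * (3 * x / 2)  # inner y-integral

total = float(((g[:-1] + g[1:]) / 2 * np.diff(x)).sum())  # trapezoid rule
print(total)  # should be very close to 1
```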

[Graph: the non-zero region of f(x, y) is the wedge x/2 ≤ y ≤ 2x, x ≥ 0, between the lines y = x/2 and y = 2x in the first quadrant.]

By Bayes' rule for continuous distributions, f(x | y) = f(x, y)/f_Y(y), where f_Y(y) = ∫ from y/2 to 2y of e^(−x/4)/(6x) dx. Hence f(x | y) = (e^(−x/4)/x) / ∫ from y/2 to 2y of (e^(−u/4)/u) du, for y/2 ≤ x ≤ 2y.

Conditional Expectation. Let U = g(X, Y) denote any function of X and Y. Then E[U | X = x] = ∫ g(x, y) f(y | x) dy is called the conditional expectation of U = g(X, Y) given X = x.

Conditional Expectation and Variance. More specifically, E[Y | X = x] = ∫ y f(y | x) dy is called the conditional expectation of Y given X = x, and Var[Y | X = x] = E[Y² | X = x] − (E[Y | X = x])² is called the conditional variance of Y given X = x.

An Important Rule. E[Y] = E_X[E[Y | X]] and Var[Y] = E_X[Var[Y | X]] + Var_X[E[Y | X]], where E_X and Var_X denote mean and variance with respect to the marginal distribution of X, f_X(x).
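A simulation sketch of both identities, reusing the task example (X exponential with λ = ¼, Y uniform on (X/2, 2X) given X):

```python
import numpy as np

# X ~ Exponential(1/4), Y | X ~ Uniform(X/2, 2X).
# Then E[Y|X] = 5X/4 and Var[Y|X] = (3X/2)**2 / 12 = 3X**2/16, so
# E[Y]  = E[5X/4] = 5, and
# Var[Y] = E[3X**2/16] + Var[5X/4] = (3/16)*32 + (25/16)*16 = 6 + 25 = 31.
rng = np.random.default_rng(3)
reps = 1_000_000
x = rng.exponential(4, reps)
y = rng.uniform(x / 2, 2 * x)

print(y.mean())  # near 5
print(y.var())   # near 31
```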

Proof. Let U = g(X, Y) denote any function of X and Y. Then E_X[E[U | X]] = ∫ ( ∫ g(x, y) f(y | x) dy ) f_X(x) dx = ∬ g(x, y) f(x, y) dy dx = E[U], since f(y | x) f_X(x) = f(x, y). Taking U = Y gives E[Y] = E_X[E[Y | X]].

Now Var[Y] = E[Y²] − E[Y]². Applying the rule above to Y², E[Y²] = E_X[E[Y² | X]] = E_X[Var[Y | X] + E[Y | X]²], so Var[Y] = E_X[Var[Y | X]] + (E_X[E[Y | X]²] − E_X[E[Y | X]]²) = E_X[Var[Y | X]] + Var_X[E[Y | X]].

Example. Suppose that to perform a task we first have to recognize the task, then perform it. As before, the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4), and once the task is recognized the time to perform it, Y, is uniform from X/2 to 2X.

1. Find E[XY].
2. Find Var[XY].

Solution. Condition on X. Since Y | X = x is uniform on (x/2, 2x), E[Y | X] = 5X/4 and E[Y² | X] = Var[Y | X] + E[Y | X]² = 3X²/16 + 25X²/16 = 7X²/4. For the exponential distribution, E[Xⁿ] = n!/λⁿ, so E[X²] = 2 · 4² = 32 and E[X⁴] = 24 · 4⁴ = 6144. Hence E[XY] = E_X[X E[Y | X]] = (5/4) E[X²] = 40, E[(XY)²] = E_X[X² E[Y² | X]] = (7/4) E[X⁴] = 10752, and Var[XY] = 10752 − 40² = 9152.
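A Monte Carlo check of these two values (a sketch; conditioning gives E[XY] = (5/4)E[X²] = 40 and Var[XY] = (7/4)E[X⁴] − 40² = 9152):

```python
import numpy as np

# X ~ Exponential(1/4), Y | X ~ Uniform(X/2, 2X).
rng = np.random.default_rng(4)
reps = 2_000_000
x = rng.exponential(4, reps)
y = rng.uniform(x / 2, 2 * x)
xy = x * y

print(xy.mean())  # near 40
print(xy.var())   # near 9152
```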

## Conditional Expectation: k (> 2) Random Variables

Definition. Let X_1, X_2, …, X_q, X_{q+1}, …, X_k denote k continuous random variables with joint probability density function f(x_1, x_2, …, x_k). Then the conditional joint probability density function of X_1, X_2, …, X_q given X_{q+1} = x_{q+1}, …, X_k = x_k is f(x_1, …, x_q | x_{q+1}, …, x_k) = f(x_1, …, x_k) / f_{q+1,…,k}(x_{q+1}, …, x_k), where f_{q+1,…,k} denotes the marginal density of X_{q+1}, …, X_k.

Definition. Let U = h(X_1, X_2, …, X_q, X_{q+1}, …, X_k). Then the conditional expectation of U given X_{q+1} = x_{q+1}, …, X_k = x_k is E[U | x_{q+1}, …, x_k] = ∫⋯∫ h(x_1, …, x_k) f(x_1, …, x_q | x_{q+1}, …, x_k) dx_1 ⋯ dx_q. Note that this will be a function of x_{q+1}, …, x_k.

Example. Let X, Y, Z denote three jointly distributed random variables with joint density function f(x, y, z). Determine the conditional expectation of U = X² + Y + Z given X = x, Y = y.

The marginal density of X, Y is f₁₂(x, y) = ∫ f(x, y, z) dz. Thus the conditional distribution of Z given X = x, Y = y is f(z | x, y) = f(x, y, z)/f₁₂(x, y).

The conditional expectation of U = X² + Y + Z given X = x, Y = y is then E[U | x, y] = x² + y + E[Z | x, y] = x² + y + ∫ z f(z | x, y) dz.


The Rule for Conditional Expectation. Let (x_1, x_2, …, x_q, y_1, y_2, …, y_m) = (x, y) denote q + m random variables. Then E[g(x, y)] = E_y[E[g(x, y) | y]] and Var[g(x, y)] = E_y[Var[g(x, y) | y]] + Var_y[E[g(x, y) | y]].

Proof (in the simple case of 2 variables X and Y): E_Y[E[g(X, Y) | Y]] = ∫ ( ∫ g(x, y) f(x | y) dx ) f_Y(y) dy = ∬ g(x, y) f(x, y) dx dy = E[g(X, Y)], since f(x | y) f_Y(y) = f(x, y).

Now the variance identity follows exactly as in the two-variable case above: apply the rule to g(x, y) and to g(x, y)² and combine, giving Var[g] = E_y[Var[g | y]] + Var_y[E[g | y]].

## The Probability of a Gambler's Ruin

Suppose a gambler is playing a game in which he wins $1 with probability p and loses $1 with probability q = 1 − p. Note the game is fair if p = q = ½. Suppose also that he starts with an initial fortune of $i and plays the game until he reaches a fortune of $n or loses all his money (his fortune reaches $0). What is the probability that he achieves his goal? What is the probability that he loses his fortune?

Let Pᵢ = the probability that he achieves his goal, and Qᵢ = 1 − Pᵢ = the probability that he loses his fortune. Let X = the amount that he has won after finishing the game. If the game is fair, then E[X] = (n − i)Pᵢ + (−i)Qᵢ = (n − i)Pᵢ + (−i)(1 − Pᵢ) = 0, so (n − i)Pᵢ = i(1 − Pᵢ) and (n − i + i)Pᵢ = i, i.e. Pᵢ = i/n and Qᵢ = (n − i)/n.
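A simulation sketch of the fair game (the choices i = 3, n = 10 are arbitrary):

```python
import numpy as np

# Fair game (p = q = 1/2): starting from fortune i0, play until the
# fortune hits n or 0; the fraction of wins should approach i0/n.
rng = np.random.default_rng(5)
i0, n, reps = 3, 10, 50_000

wins = 0
for _ in range(reps):
    fortune = i0
    while 0 < fortune < n:
        fortune += 1 if rng.random() < 0.5 else -1
    wins += fortune == n

print(wins / reps)  # near i0/n = 0.3
```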

If the game is not fair, condition on the outcome of the first play: Pᵢ = p Pᵢ₊₁ + q Pᵢ₋₁, with boundary conditions P₀ = 0 and Pₙ = 1. Thus, since p + q = 1, p(Pᵢ₊₁ − Pᵢ) = q(Pᵢ − Pᵢ₋₁), or Pᵢ₊₁ − Pᵢ = (q/p)(Pᵢ − Pᵢ₋₁).

Note that, iterating, Pᵢ₊₁ − Pᵢ = rⁱ(P₁ − P₀) = rⁱP₁, where r = q/p. Also, summing these differences, Pᵢ = P₁(1 + r + … + rⁱ⁻¹) = P₁(1 − rⁱ)/(1 − r) when r ≠ 1.

Hence, setting i = n and using Pₙ = 1 gives P₁ = (1 − r)/(1 − rⁿ), or Pᵢ = (1 − rⁱ)/(1 − rⁿ), where r = q/p.

Note that Qᵢ = 1 − Pᵢ = (rⁱ − rⁿ)/(1 − rⁿ); thus if p < q (so r > 1), then Pᵢ → 0 as n → ∞, and an unfavourable game played against an arbitrarily rich opponent leads to ruin with probability 1.
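The closed form Pᵢ = (1 − rⁱ)/(1 − rⁿ) can be checked against simulation (a sketch; p = 0.55, i = 3, n = 10 are arbitrary choices):

```python
import numpy as np

# Closed-form win probability for the unfair game, r = q/p != 1.
def p_win(i, n, p):
    r = (1 - p) / p
    return (1 - r**i) / (1 - r**n)

rng = np.random.default_rng(6)
i0, n, p, reps = 3, 10, 0.55, 50_000

wins = 0
for _ in range(reps):
    fortune = i0
    while 0 < fortune < n:
        fortune += 1 if rng.random() < p else -1
    wins += fortune == n

print(p_win(i0, n, p), wins / reps)  # the two should agree closely
```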

[Table: ruin probabilities Pᵢ for selected values of i, n, and p.]

Suppose that each person in a restaurant is served in an "equal" time, in the sense that in a group of n people the probability that any one particular person took the longest time is the same for each person, namely 1/n. Suppose that a person starts asking people as they leave, "How long did it take you to be served?", and continues until he finds someone who took longer than himself. Let X = the number of people that he has to ask. Then E[X] = ∞.

Proof: P[X > x] = the probability that, in the group of the first x people together with himself, he took the longest = 1/(x + 1).

Thus E[X] = Σ from x = 0 to ∞ of P[X > x] = Σ from x = 0 to ∞ of 1/(x + 1) = 1 + ½ + ⅓ + …, the harmonic series, which diverges; hence E[X] = ∞.
