Some additional Topics. Distributions of functions of Random Variables: Gamma distribution, χ² distribution, Exponential distribution.


1 Some additional Topics

2 Distributions of functions of Random Variables: Gamma distribution, χ² distribution, Exponential distribution

3 Theorem Let X and Y denote independent random variables each having a gamma distribution with parameters (λ, α₁) and (λ, α₂) respectively. Then W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂). Proof: m_W(t) = m_X(t) m_Y(t) = (1 − t/λ)^(−α₁) (1 − t/λ)^(−α₂) = (1 − t/λ)^−(α₁+α₂).

4 Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂), we conclude that W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂).
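As an informal numerical check of this additivity, the convolution of two gamma densities sharing a rate λ can be compared with the gamma density whose shape is the sum. A minimal Python sketch; the parameter values (λ = 0.5, α₁ = 2, α₂ = 3), grid size, and function names are illustrative choices, not from the slides:

```python
import math

def gamma_pdf(x, lam, alpha):
    """Gamma density in the (rate, shape) parameterization used above."""
    if x <= 0:
        return 0.0
    return lam ** alpha * x ** (alpha - 1) * math.exp(-lam * x) / math.gamma(alpha)

def convolution_at(w, lam, a1, a2, n=20000):
    """Midpoint-rule approximation of (f_X * f_Y)(w) = integral of f_X(x) f_Y(w - x) dx."""
    h = w / n
    return sum(gamma_pdf((k + 0.5) * h, lam, a1) *
               gamma_pdf(w - (k + 0.5) * h, lam, a2) for k in range(n)) * h

lam, a1, a2 = 0.5, 2.0, 3.0
# density of X + Y obtained by convolution vs. gamma(lam, a1 + a2) directly
approx = convolution_at(4.0, lam, a1, a2)
direct = gamma_pdf(4.0, lam, a1 + a2)
```

The two numbers agree to several decimal places, as the theorem predicts.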

5 Theorem (extension to n RV's) Let X₁, X₂, …, Xₙ denote n independent random variables, each having a gamma distribution with parameters (λ, αᵢ), i = 1, 2, …, n. Then W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ). Proof: m_W(t) = m_X₁(t) ⋯ m_Xₙ(t) = (1 − t/λ)^(−α₁) ⋯ (1 − t/λ)^(−αₙ) = (1 − t/λ)^−(α₁+…+αₙ).

6 Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ), we conclude that W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ). Therefore W has density f(w) = λ^α w^(α−1) e^(−λw)/Γ(α) for w ≥ 0, where α = α₁ + … + αₙ.

7 Theorem Suppose that X is a random variable having a gamma distribution with parameters (λ, α). Then W = aX has a gamma distribution with parameters (λ/a, α). Proof: m_W(t) = E[e^(taX)] = m_X(at) = (1 − at/λ)^(−α) = (1 − t/(λ/a))^(−α).
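A quick change-of-variables check of the scaling theorem: if X has density f_X and W = aX, then f_W(w) = f_X(w/a)/a, which should coincide with the gamma density of rate λ/a and the same shape α. A sketch; the parameter values below are illustrative:

```python
import math

def gamma_pdf(x, lam, alpha):
    # gamma density with rate lam and shape alpha
    if x <= 0:
        return 0.0
    return lam ** alpha * x ** (alpha - 1) * math.exp(-lam * x) / math.gamma(alpha)

lam, alpha, a = 2.0, 3.0, 5.0   # illustrative values

def density_of_aX(w):
    # change-of-variables density of W = aX: f_X(w/a) / a
    return gamma_pdf(w / a, lam, alpha) / a
```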

8 Special Cases
1. Let X and Y be independent random variables having an exponential distribution with parameter λ; then X + Y has a gamma distribution with α = 2 and λ.
2. Let X₁, X₂, …, Xₙ be independent random variables having an exponential distribution with parameter λ; then S = X₁ + X₂ + … + Xₙ has a gamma distribution with α = n and λ.
3. Let X₁, X₂, …, Xₙ be independent random variables having an exponential distribution with parameter λ; then the sample mean x̄ = S/n has a gamma distribution with α = n and nλ.
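The three special cases fit together arithmetically: S = X₁ + … + Xₙ is gamma(λ, n), and dividing by n (the scaling theorem with a = 1/n) makes x̄ gamma(nλ, n), whose mean matches a single exponential while its variance shrinks by a factor of n. A sketch in exact rational arithmetic; the rate 1/3 and n = 8 are illustrative:

```python
from fractions import Fraction

def gamma_mean_var(lam, alpha):
    # a gamma(rate lam, shape alpha) has mean alpha/lam and variance alpha/lam^2
    return alpha / lam, alpha / lam ** 2

lam = Fraction(1, 3)   # exponential rate (illustrative)
n = 8
mean_S, var_S = gamma_mean_var(lam, n)          # S = X1 + ... + Xn ~ gamma(lam, n)
mean_bar, var_bar = gamma_mean_var(n * lam, n)  # sample mean ~ gamma(n*lam, n)
```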

9 Distribution of population – Exponential distribution Another illustration of the central limit theorem

10 Special Cases – continued
4. Let X and Y be independent random variables having χ² distributions with ν₁ and ν₂ degrees of freedom respectively; then X + Y has a χ² distribution with ν₁ + ν₂ degrees of freedom.
5. Let X₁, X₂, …, Xₙ be independent random variables having χ² distributions with ν₁, ν₂, …, νₙ degrees of freedom respectively; then X₁ + X₂ + … + Xₙ has a χ² distribution with ν₁ + … + νₙ degrees of freedom.
Both of these properties follow from the fact that a χ² random variable with ν degrees of freedom is a gamma random variable with λ = ½ and α = ν/2.

11 Recall: If z has a Standard Normal distribution then z² has a χ² distribution with 1 degree of freedom. Thus if z₁, z₂, …, z_ν are independent random variables each having a Standard Normal distribution, then U = z₁² + z₂² + … + z_ν² has a χ² distribution with ν degrees of freedom.
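The 1-degree-of-freedom case can be checked through distribution functions: P[z² ≤ x] = 2Φ(√x) − 1, while the χ² CDF with 1 df (a gamma with λ = ½, α = ½) reduces to erf(√(x/2)). A standard-library Python sketch; the test points are arbitrary:

```python
import math

def chi2_1_cdf(x):
    # CDF of chi-square with 1 df; gamma(1/2, 1/2) gives erf(sqrt(x/2))
    return math.erf(math.sqrt(x / 2))

def z_squared_cdf(x):
    # P[z^2 <= x] = P[-sqrt(x) <= z <= sqrt(x)] = 2*Phi(sqrt(x)) - 1
    phi = 0.5 * (1 + math.erf(math.sqrt(x) / math.sqrt(2)))
    return 2 * phi - 1
```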

12 Theorem Suppose that U₁ and U₂ are independent random variables and that U = U₁ + U₂. Suppose that U₁ and U have χ² distributions with degrees of freedom ν₁ and ν respectively (ν₁ < ν). Then U₂ has a χ² distribution with degrees of freedom ν₂ = ν − ν₁. Proof: m_U(t) = m_U₁(t) m_U₂(t), hence m_U₂(t) = (1 − 2t)^(−ν/2) / (1 − 2t)^(−ν₁/2) = (1 − 2t)^−(ν−ν₁)/2.

13 Q.E.D.

14 Bivariate Distributions Discrete Random Variables

15 The joint probability function; p(x,y) = P[X = x, Y = y]

16 Marginal distributions: p_X(x) = Σ_y p(x, y), p_Y(y) = Σ_x p(x, y). Conditional distributions: p_X|Y(x|y) = p(x, y)/p_Y(y), p_Y|X(y|x) = p(x, y)/p_X(x).

17 The product rule for discrete distributions: p(x, y) = p_X(x) p_Y|X(y|x) = p_Y(y) p_X|Y(x|y). Independence: X and Y are independent if and only if p(x, y) = p_X(x) p_Y(y) for all x, y.

18 Bayes rule for discrete distributions: p_X|Y(x|y) = p_Y|X(y|x) p_X(x) / Σ_x′ p_Y|X(y|x′) p_X(x′). Proof: by the product rule p(x, y) = p_Y|X(y|x) p_X(x), and p_Y(y) = Σ_x′ p(x′, y) = Σ_x′ p_Y|X(y|x′) p_X(x′).
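A small numeric illustration of these discrete rules: with a made-up joint pmf (the numbers below are illustrative, not from the slides), Bayes rule recovers exactly the conditional distribution computed directly from the joint table.

```python
# made-up joint pmf p(x, y) on x in {0, 1}, y in {0, 1, 2}; entries sum to 1
p = {(0, 0): 0.10, (0, 1): 0.25, (0, 2): 0.15,
     (1, 0): 0.20, (1, 1): 0.05, (1, 2): 0.25}

def p_X(x):  # marginal of X: sum over y
    return sum(v for (xx, _), v in p.items() if xx == x)

def p_Y(y):  # marginal of Y: sum over x
    return sum(v for (_, yy), v in p.items() if yy == y)

def p_Y_given_X(y, x):  # conditional pmf via the product rule
    return p[(x, y)] / p_X(x)

def bayes(x, y):
    # p(x|y) = p(y|x) p(x) / sum over x' of p(y|x') p(x')
    denom = sum(p_Y_given_X(y, xx) * p_X(xx) for xx in (0, 1))
    return p_Y_given_X(y, x) * p_X(x) / denom
```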

19 Continuous Random Variables

20 Definition: Two random variables X and Y are said to have joint probability density function f(x, y) if P[(X, Y) ∈ A] = ∬_A f(x, y) dx dy for every region A of the plane.

21 Marginal distributions: f_X(x) = ∫ f(x, y) dy, f_Y(y) = ∫ f(x, y) dx. Conditional distributions: f_X|Y(x|y) = f(x, y)/f_Y(y), f_Y|X(y|x) = f(x, y)/f_X(x).

22 The product rule for continuous distributions: f(x, y) = f_X(x) f_Y|X(y|x) = f_Y(y) f_X|Y(x|y). Independence: X and Y are independent if and only if f(x, y) = f_X(x) f_Y(y) for all x, y.

23 Bayes rule for continuous distributions: f_X|Y(x|y) = f_Y|X(y|x) f_X(x) / ∫ f_Y|X(y|x′) f_X(x′) dx′. Proof: by the product rule f(x, y) = f_Y|X(y|x) f_X(x), and f_Y(y) = ∫ f(x′, y) dx′.

24 Example Suppose that to perform a task we first have to recognize the task, then perform the task. Suppose that the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4). Once the task is recognized, the time to perform the task, Y, is uniform from X/2 to 2X.
1. Find the joint density of X and Y.
2. Find the conditional density of X given Y = y.

25 Now f_X(x) = ¼ e^(−x/4) for x ≥ 0, and f_Y|X(y|x) = 1/(2x − x/2) = 2/(3x) for x/2 ≤ y ≤ 2x. Thus f(x, y) = f_X(x) f_Y|X(y|x) = e^(−x/4)/(6x) for x ≥ 0, x/2 ≤ y ≤ 2x.
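A joint density built this way should integrate to 1 over its non-zero region, which gives a sanity check on the construction. A numerical sketch (grid size and cutoff are illustrative); for each x the inner y-integral is exact, since the density is constant in y over (x/2, 2x):

```python
import math

def f_joint(x, y):
    # f(x, y) = (1/4) e^{-x/4} * 2/(3x) on x > 0, x/2 <= y <= 2x, else 0
    if x <= 0 or y < x / 2 or y > 2 * x:
        return 0.0
    return math.exp(-x / 4) / (6 * x)

def total_mass(xmax=200.0, n=200000):
    # integrate over x; inner y-integral is (2x - x/2) * f(x, y) since f is flat in y
    h = xmax / n
    return sum((3 * x / 2) * f_joint(x, x)
               for x in ((k + 0.5) * h for k in range(n))) * h
```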

26 Graph of non-zero region of f(x,y)

27 Bayes rule for continuous distributions

28 Conditional Expectation Let U = g(X,Y) denote any function of X and Y. Then E[g(X,Y) | X = x] = ∫ g(x, y) f_Y|X(y|x) dy is called the conditional expectation of U = g(X,Y) given X = x.

29 Conditional Expectation and Variance More specifically, E[Y | X = x] = ∫ y f_Y|X(y|x) dy is called the conditional expectation of Y given X = x, and Var[Y | X = x] = ∫ (y − E[Y | X = x])² f_Y|X(y|x) dy is called the conditional variance of Y given X = x.

30 An Important Rule E[Y] = E_X[E[Y | X]] and Var[Y] = E_X[Var[Y | X]] + Var_X[E[Y | X]], where E_X and Var_X denote mean and variance with respect to the marginal distribution of X, f_X(x).
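Both identities (the tower rule and the variance decomposition) hold exactly on any finite joint distribution, so they can be verified directly. A sketch with a made-up discrete pmf; the numbers are illustrative:

```python
# made-up finite joint pmf over (x, y) pairs; entries sum to 1
p = {(0, 1): 0.2, (0, 3): 0.1, (1, 1): 0.25, (1, 2): 0.15, (2, 3): 0.3}
xs = {x for x, _ in p}

def pX(x):
    return sum(v for (a, _), v in p.items() if a == x)

def E_Y_given(x):
    # conditional mean of Y given X = x
    return sum(y * v for (a, y), v in p.items() if a == x) / pX(x)

def Var_Y_given(x):
    # conditional variance of Y given X = x
    m = E_Y_given(x)
    return sum((y - m) ** 2 * v for (a, y), v in p.items() if a == x) / pX(x)

# direct moments of Y from the joint pmf
EY = sum(y * v for (_, y), v in p.items())
VarY = sum((y - EY) ** 2 * v for (_, y), v in p.items())

# the two rules: E[Y] = E_X[E[Y|X]] and Var[Y] = E_X[Var[Y|X]] + Var_X[E[Y|X]]
EY_tower = sum(E_Y_given(x) * pX(x) for x in xs)
VarY_decomp = (sum(Var_Y_given(x) * pX(x) for x in xs)
               + sum((E_Y_given(x) - EY_tower) ** 2 * pX(x) for x in xs))
```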

31 Proof: Let U = g(X,Y) denote any function of X and Y. Then E_X[E[U | X]] = ∫ E[U | X = x] f_X(x) dx = ∫ [∫ g(x, y) f_Y|X(y|x) dy] f_X(x) dx.

32 Now f_Y|X(y|x) f_X(x) = f(x, y), so E_X[E[U | X]] = ∬ g(x, y) f(x, y) dy dx = E[U].

33 Example Suppose that to perform a task we first have to recognize the task, then perform the task. Suppose that the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4). Once the task is recognized, the time to perform the task, Y, is uniform from X/2 to 2X.
1. Find E[XY].
2. Find Var[XY].

34 Solution Since Y | X = x is uniform on (x/2, 2x), E[Y | X = x] = (x/2 + 2x)/2 = 5x/4 and E[Y² | X = x] = ((x/2)² + (x/2)(2x) + (2x)²)/3 = 7x²/4.

35 Thus E[XY] = E_X[X · E[Y | X]] = E_X[5X²/4] = (5/4) E[X²] = (5/4)(Var[X] + μ²) = (5/4)(16 + 16) = 40.

36 And E[X²Y²] = E_X[X² · E[Y² | X]] = (7/4) E[X⁴] = (7/4)(4! · 4⁴) = 10752, so Var[XY] = E[X²Y²] − (E[XY])² = 10752 − 40² = 9152.
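The example can be cross-checked by integrating against the marginal of X, using E[Y | X = x] = 5x/4 and E[Y² | X = x] = 7x²/4 for a uniform on (x/2, 2x). A sketch; the integration grid and cutoff below are illustrative choices:

```python
import math

def fX(x):
    # marginal of X: exponential with rate 1/4
    return 0.25 * math.exp(-x / 4)

def integrate(g, xmax=400.0, n=400000):
    # midpoint rule on (0, xmax)
    h = xmax / n
    return sum(g((k + 0.5) * h) for k in range(n)) * h

# E[XY] = E_X[X * E[Y|X]] and E[X^2 Y^2] = E_X[X^2 * E[Y^2|X]]
E_XY = integrate(lambda x: x * (5 * x / 4) * fX(x))
E_X2Y2 = integrate(lambda x: x ** 2 * (7 * x ** 2 / 4) * fX(x))
Var_XY = E_X2Y2 - E_XY ** 2
```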

37 Conditional Expectation: k (>2) random variables

38 Definition Let X1, X2, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, x2, …, xk). Then the conditional joint probability function of X1, X2, …, Xq given Xq+1 = xq+1, …, Xk = xk is f(x1, …, xq | xq+1, …, xk) = f(x1, …, xk) / f(xq+1, …, xk), where the denominator is the marginal density of Xq+1, …, Xk.

39 Definition Let U = h(X1, X2, …, Xq, Xq+1, …, Xk); then the conditional expectation of U given Xq+1 = xq+1, …, Xk = xk is E[U | xq+1, …, xk] = ∫⋯∫ h(x1, …, xk) f(x1, …, xq | xq+1, …, xk) dx1 ⋯ dxq. Note this will be a function of xq+1, …, xk.

40 Example Let X, Y, Z denote 3 jointly distributed random variables with joint density function f(x, y, z). Determine the conditional expectation of U = X² + Y + Z given X = x, Y = y.

41 The marginal distribution of X, Y is f₁₂(x, y) = ∫ f(x, y, z) dz. Thus the conditional distribution of Z given X = x, Y = y is f(z | x, y) = f(x, y, z)/f₁₂(x, y).

42 The conditional expectation of U = X² + Y + Z given X = x, Y = y is E[U | x, y] = x² + y + E[Z | X = x, Y = y], where E[Z | X = x, Y = y] = ∫ z f(z | x, y) dz.

43 Thus the conditional expectation of U = X² + Y + Z given X = x, Y = y is a function of x and y alone.

44 The rule for Conditional Expectation Let (x1, x2, …, xq, y1, y2, …, ym) = (x, y) denote q + m random variables. Then E[g(x, y)] = E_y[E[g(x, y) | y]].

45 Proof (in the simple case of 2 variables X and Y): E_Y[E[g(X,Y) | Y]] = ∫ E[g(X,Y) | Y = y] f_Y(y) dy = ∫ [∫ g(x, y) f_X|Y(x|y) dx] f_Y(y) dy.

46 hence, since f_X|Y(x|y) f_Y(y) = f(x, y), E_Y[E[g(X,Y) | Y]] = ∬ g(x, y) f(x, y) dx dy.

47 Now the right-hand side is exactly E[g(X,Y)], which proves the rule.

48 The probability of a Gambler's Ruin

49 Suppose a gambler is playing a game in which he wins $1 with probability p and loses $1 with probability q = 1 − p. Note the game is fair if p = q = ½. Suppose also that he starts with an initial fortune of $i and plays the game until he reaches a fortune of $n or he loses all his money (his fortune reaches $0). What is the probability that he achieves his goal? What is the probability that he loses his fortune?

50 Let Pᵢ = the probability that he achieves his goal, and Qᵢ = 1 − Pᵢ = the probability that he loses his fortune. Let X = the amount that he has won after finishing the game. If the game is fair, then E[X] = (n − i)Pᵢ + (−i)Qᵢ = (n − i)Pᵢ + (−i)(1 − Pᵢ) = 0, or (n − i)Pᵢ = i(1 − Pᵢ), and (n − i + i)Pᵢ = i, so Pᵢ = i/n.

51 If the game is not fair (p ≠ q), condition on the outcome of the first play: Pᵢ = p Pᵢ₊₁ + q Pᵢ₋₁, with P₀ = 0 and Pₙ = 1. Thus p(Pᵢ₊₁ − Pᵢ) = q(Pᵢ − Pᵢ₋₁), or Pᵢ₊₁ − Pᵢ = (q/p)(Pᵢ − Pᵢ₋₁).

52 Note Pᵢ₊₁ − Pᵢ = (q/p)^i (P₁ − P₀) = (q/p)^i P₁. Also Pᵢ = (Pᵢ − Pᵢ₋₁) + (Pᵢ₋₁ − Pᵢ₋₂) + … + (P₁ − P₀).

53 hence Pᵢ = P₁ [1 + r + r² + … + r^(i−1)] = P₁ (1 − r^i)/(1 − r), where r = q/p.

54 Note Pₙ = 1, thus P₁ = (1 − r)/(1 − rⁿ) and Pᵢ = (1 − r^i)/(1 − rⁿ).
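The standard closed form Pᵢ = (1 − (q/p)^i)/(1 − (q/p)ⁿ) can be checked against its defining recurrence Pᵢ = p Pᵢ₊₁ + q Pᵢ₋₁ and the boundary conditions. A sketch; p = 0.48 and n = 20 are illustrative:

```python
def ruin_win_prob(i, n, p):
    """P_i: probability of reaching fortune n before 0, starting from i."""
    q = 1 - p
    if p == q:
        return i / n          # fair game: P_i = i/n
    r = q / p
    return (1 - r ** i) / (1 - r ** n)
```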

55 table

56 A waiting time paradox

57 Suppose that each person in a restaurant is being served in an “equal” time. That is, in a group of n people the probability that any one particular person took the longest time is the same for each person, namely 1/n. Suppose that a person starts asking people as they leave – “How long did it take you to be served?” He continues until he finds someone who took longer than himself. Let X = the number of people that he has to ask. Then E[X] = ∞.

58 Proof: P[X > x] = the probability that in the group of the first x people together with himself, he took the longest = 1/(x + 1).

59 Thus E[X] = Σ_{x=0}^∞ P[X > x] = Σ_{x=0}^∞ 1/(x + 1) = 1 + ½ + ⅓ + … = ∞, the harmonic series.
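The divergence can be seen numerically: the pmf P[X = x] = 1/x − 1/(x + 1) = 1/(x(x + 1)) sums to 1, yet the partial sums of x·P[X = x] grow like log N without bound. A Python sketch; the cutoffs chosen are illustrative:

```python
def pmf(x):
    # P[X = x] = P[X > x-1] - P[X > x] = 1/x - 1/(x+1)
    return 1.0 / (x * (x + 1))

def partial_expectation(N):
    # sum over x = 1..N of x * P[X = x] = sum of 1/(x+1): a harmonic partial sum
    return sum(x * pmf(x) for x in range(1, N + 1))
```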

60

