1
Functions of Random Variables
2
Methods for determining the distribution of functions of Random Variables
Distribution function method
Moment generating function method
Transformation method
3
Distribution function method
Let X, Y, Z, … have joint density f(x, y, z, …) and let W = h(X, Y, Z, …).
First step: find the distribution function of W, G(w) = P[W ≤ w] = P[h(X, Y, Z, …) ≤ w].
Second step: find the density function of W, g(w) = G′(w).
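To make the two steps concrete, here is a small Python sketch (my own illustration, not an example from the slides) for the assumed case X, Y independent Uniform(0, 1) and W = max(X, Y), where G(w) = P[X ≤ w] P[Y ≤ w] = w² and so g(w) = 2w on (0, 1):

import numpy as np

rng = np.random.default_rng(0)

# Assumed example: X, Y independent Uniform(0, 1), W = max(X, Y).
# Step 1: G(w) = P[W <= w] = P[X <= w] * P[Y <= w] = w**2 for 0 < w < 1.
# Step 2: g(w) = G'(w) = 2w.
x = rng.uniform(size=100_000)
y = rng.uniform(size=100_000)
w = np.maximum(x, y)

for w0 in (0.25, 0.5, 0.75):
    print(f"G({w0}) ~ {np.mean(w <= w0):.3f}   exact: {w0**2:.3f}")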
4
Use of moment generating functions
Using the moment generating functions of X, Y, Z, …, determine the moment generating function of W = h(X, Y, Z, …). Then identify the distribution of W from its moment generating function. This procedure works well for sums, linear combinations, etc.
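As an illustration of this procedure (an assumed example, not one taken from the slides), the following sympy sketch multiplies the moment generating functions of two independent Poisson random variables and confirms that the product is again of Poisson form, so X + Y is Poisson(λ1 + λ2):

import sympy as sp

t, lam1, lam2 = sp.symbols('t lambda1 lambda2', positive=True)

# Standard MGFs of two independent Poisson random variables:
# m_X(t) = exp(lambda1*(e^t - 1)),  m_Y(t) = exp(lambda2*(e^t - 1)).
m_X = sp.exp(lam1 * (sp.exp(t) - 1))
m_Y = sp.exp(lam2 * (sp.exp(t) - 1))

# By independence, m_{X+Y}(t) = m_X(t) * m_Y(t); the exponent of the product
# equals (lambda1 + lambda2)*(e^t - 1), which is the Poisson(lambda1 + lambda2) exponent.
product_exponent = sp.expand(lam1 * (sp.exp(t) - 1) + lam2 * (sp.exp(t) - 1))
poisson_exponent = sp.expand((lam1 + lam2) * (sp.exp(t) - 1))
print(m_X * m_Y)                              # the product of the two MGFs
print(product_exponent - poisson_exponent)    # 0, so X + Y is Poisson(lambda1 + lambda2)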
5
Some Useful Rules
Let X be a random variable with moment generating function m_X(t), and let Y = bX + a. Then
m_Y(t) = m_{bX+a}(t) = E(e^{(bX+a)t}) = e^{at} m_X(bt)
Let X and Y be two independent random variables with moment generating functions m_X(t) and m_Y(t). Then
m_{X+Y}(t) = m_X(t) m_Y(t)
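A quick Monte Carlo check of both rules (my own sketch, with an assumed standard normal X and an independent Exponential(1) variable Y, so that m_X(s) = e^{s²/2} and m_Y(t) = 1/(1 – t) for t < 1):

import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.standard_normal(n)                 # X ~ N(0, 1), m_X(s) = exp(s**2 / 2)
y = rng.exponential(scale=1.0, size=n)     # Y ~ Exponential(1), m_Y(t) = 1/(1 - t), t < 1

t, a, b = 0.3, 2.0, 1.5

# Rule 1: m_{bX+a}(t) = exp(a*t) * m_X(b*t); Monte Carlo estimate vs exact value.
print(np.mean(np.exp(t * (b * x + a))), np.exp(a * t) * np.exp((b * t) ** 2 / 2))

# Rule 2: for independent X and Y, m_{X+Y}(t) = m_X(t) * m_Y(t).
print(np.mean(np.exp(t * (x + y))), np.exp(t ** 2 / 2) / (1 - t))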
6
M. G. F.’s - Continuous distributions
For example:
Uniform(a, b): m(t) = (e^{bt} – e^{at}) / [t(b – a)]
Normal(μ, σ²): m(t) = e^{μt + σ²t²/2}
Exponential(λ): m(t) = λ / (λ – t), for t < λ
Gamma(α, λ): m(t) = [λ / (λ – t)]^α, for t < λ
7
M. G. F.’s - Discrete distributions
For example:
Bernoulli(p): m(t) = q + pe^t, where q = 1 – p
Binomial(n, p): m(t) = (q + pe^t)^n
Geometric(p): m(t) = pe^t / (1 – qe^t), for t < –ln q
Poisson(λ): m(t) = e^{λ(e^t – 1)}
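These standard formulas can be checked numerically; the sketch below (an assumed example, not from the slides) compares Monte Carlo estimates of E[e^{tX}] with the formulas for an Exponential(2) and a Poisson(3) random variable:

import numpy as np

rng = np.random.default_rng(2)
n, t = 1_000_000, 0.4

# Exponential(lambda = 2): m(t) = lambda / (lambda - t) for t < lambda.
lam = 2.0
x = rng.exponential(scale=1 / lam, size=n)
print(np.mean(np.exp(t * x)), lam / (lam - t))            # Monte Carlo vs formula

# Poisson(mu = 3): m(t) = exp(mu * (e^t - 1)).
mu = 3.0
k = rng.poisson(lam=mu, size=n)
print(np.mean(np.exp(t * k)), np.exp(mu * (np.exp(t) - 1)))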
8
The Transformation Method
Theorem Let X denote a random variable with probability density function f(x) and U = h(X). Assume that h(x) is either strictly increasing (or decreasing). Then the probability density of U is
g(u) = f(h^{-1}(u)) |d h^{-1}(u) / du|,
where h^{-1} is the inverse function of h.
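For instance (an assumed example, not from the slides), if X is Exponential(1) and U = h(X) = √X, then h is strictly increasing, h^{-1}(u) = u², and the theorem gives g(u) = f(u²)·|2u| = 2u e^{–u²}. A short simulation check in Python:

import numpy as np

rng = np.random.default_rng(3)

# Assumed example: X ~ Exponential(1), U = h(X) = sqrt(X), h strictly increasing.
# Inverse: h^{-1}(u) = u**2, derivative 2u, so g(u) = f(u**2) * |2u| = 2u * exp(-u**2).
u = np.sqrt(rng.exponential(size=200_000))

# Check via the distribution function: integrating g gives G(u0) = 1 - exp(-u0**2).
for u0 in (0.5, 1.0, 1.5):
    print(f"P[U <= {u0}] ~ {np.mean(u <= u0):.3f}   exact: {1 - np.exp(-u0 ** 2):.3f}")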
9
The Transformation Method (many variables)
Theorem Let x1, x2, …, xn denote random variables with joint probability density function f(x1, x2, …, xn). Let
u1 = h1(x1, x2, …, xn)
u2 = h2(x1, x2, …, xn)
⋮
un = hn(x1, x2, …, xn)
define an invertible transformation from the x's to the u's.
10
Then the joint probability density function of u1, u2, …, un is given by
g(u1, u2, …, un) = f(x1(u1, …, un), …, xn(u1, …, un)) |J|
where
J = det[∂xi / ∂uj]   (i, j = 1, …, n)
is the Jacobian of the transformation, the determinant of the matrix of partial derivatives of the x's with respect to the u's.
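A small sympy sketch of the theorem for an assumed example: X1, X2 independent Exponential(1) with U1 = X1 + X2 and U2 = X1/(X1 + X2). The Jacobian of the inverse transformation is –u1, so g(u1, u2) = u1 e^{–u1} on u1 > 0, 0 < u2 < 1, i.e. U1 is Gamma(2, 1) and U2 is Uniform(0, 1), independent:

import sympy as sp

u1, u2 = sp.symbols('u1 u2', positive=True)

# Assumed example: X1, X2 independent Exponential(1), f(x1, x2) = exp(-x1 - x2).
# Transformation: U1 = X1 + X2, U2 = X1/(X1 + X2); inverse: x1 = u1*u2, x2 = u1*(1 - u2).
x1 = u1 * u2
x2 = u1 * (1 - u2)

# Jacobian: determinant of the partial derivatives of the x's with respect to the u's.
J = sp.simplify(sp.Matrix([[sp.diff(x1, u1), sp.diff(x1, u2)],
                           [sp.diff(x2, u1), sp.diff(x2, u2)]]).det())

# g(u1, u2) = f(x1(u), x2(u)) * |J| = u1 * exp(-u1) on u1 > 0, 0 < u2 < 1.
g = sp.exp(sp.expand(-x1 - x2)) * sp.Abs(J)
print(J, g)    # -u1   u1*exp(-u1)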
11
The probability of a Gambler's ruin
12
Suppose a gambler is playing a game in which he wins $1 with probability p and loses $1 with probability q = 1 – p. Note that the game is fair if p = q = ½. Suppose also that he starts with an initial fortune of $i and plays the game until either he reaches a fortune of $n or he loses all his money (his fortune reaches $0). What is the probability that he achieves his goal? What is the probability that he loses his fortune?
13
Let P_i = the probability that he achieves his goal
Let Q_i = 1 – P_i = the probability that he loses his fortune. Let X = the amount that he has won after finishing the game.
If the game is fair, then
E[X] = (n – i) P_i + (–i) Q_i = (n – i) P_i + (–i)(1 – P_i) = 0,
so (n – i) P_i = i (1 – P_i), hence (n – i + i) P_i = i, i.e. P_i = i / n.
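A short simulation (my own sketch, with an assumed starting fortune i = 3 and goal n = 10) comparing the empirical probability of reaching the goal in the fair game with P_i = i/n:

import numpy as np

rng = np.random.default_rng(4)

def success_probability(i, n, p, trials=20_000):
    # Estimate P_i: the probability of reaching fortune n before 0, starting from i.
    wins = 0
    for _ in range(trials):
        fortune = i
        while 0 < fortune < n:
            fortune += 1 if rng.random() < p else -1
        wins += fortune == n
    return wins / trials

i, n = 3, 10
print(success_probability(i, n, p=0.5), i / n)   # both close to 0.3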
14
If the game is not fair, condition on the result of the first game:
P_i = p P_{i+1} + q P_{i-1}, for i = 1, …, n – 1, with P_0 = 0 and P_n = 1.
Thus, since p + q = 1,
p P_i + q P_i = p P_{i+1} + q P_{i-1}, or p (P_{i+1} – P_i) = q (P_i – P_{i-1}), i.e. P_{i+1} – P_i = (q/p)(P_i – P_{i-1}).
15
Note P_2 – P_1 = (q/p)(P_1 – P_0) = (q/p) P_1, since P_0 = 0.
Also P_3 – P_2 = (q/p)(P_2 – P_1) = (q/p)² P_1, and in general P_{i+1} – P_i = (q/p)^i P_1.
16
Hence, summing these differences,
P_i = (P_1 – P_0) + (P_2 – P_1) + … + (P_i – P_{i-1}) = P_1 [1 + r + r² + … + r^{i-1}],
or P_i = P_1 (1 – r^i) / (1 – r),
where r = q/p (and r ≠ 1, since the game is not fair).
17
Note P_n = 1, thus 1 = P_1 (1 – r^n) / (1 – r), so P_1 = (1 – r) / (1 – r^n),
and P_i = (1 – r^i) / (1 – r^n), where r = q/p.
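A small sympy check of this closed form (using assumed values i = 3, n = 10 for the limit): the formula satisfies the boundary conditions P_0 = 0 and P_n = 1, and as the game becomes fair (r → 1) it recovers P_i = i/n from the earlier argument:

import sympy as sp

r, i, n = sp.symbols('r i n', positive=True)

# Closed form derived above: P_i = (1 - r**i) / (1 - r**n), with r = q/p.
P = (1 - r**i) / (1 - r**n)

# Boundary checks: P_0 = 0 (ruined at the start) and P_n = 1 (goal already reached).
print(sp.simplify(P.subs(i, 0)), sp.simplify(P.subs(i, n)))     # 0 1

# Fair-game limit: as p -> 1/2, r -> 1 and the formula recovers P_i = i/n
# (shown here for the assumed values i = 3, n = 10).
print(sp.limit(P.subs([(i, 3), (n, 10)]), r, 1))                # 3/10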
18
Table: numerical values of P_i for selected values of p, i and n.
19
A waiting time paradox
20
Suppose that each person in a restaurant is served in an “equal” time.
That is, in a group of n people the probability that any particular person took the longest time is the same for each person, namely 1/n. Suppose that a person starts asking people as they leave – “How long did it take you to be served?” He continues until he finds someone who took longer than he did. Let X = the number of people that he has to ask. Then E[X] = ∞.
21
Proof: P[X > x] = the probability that, in the group of the first x people together with himself, he took the longest = 1 / (x + 1).
22
Thus, using E[X] = Σ_{x=0}^∞ P[X > x] for a non-negative integer-valued random variable,
E[X] = Σ_{x=0}^∞ 1/(x + 1) = 1 + 1/2 + 1/3 + 1/4 + …, the harmonic series.
23
The harmonic series diverges:
1 + 1/2 + (1/3 + 1/4) + (1/5 + 1/6 + 1/7 + 1/8) + … ≥ 1 + 1/2 + 1/2 + 1/2 + … = ∞.
Hence E[X] = ∞.
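A simulation sketch of the paradox (my own illustration, using exponential service times as an assumed example; any continuous distribution gives the same answer, since only the ranks of the times matter):

import numpy as np

rng = np.random.default_rng(5)

def people_asked(rng):
    # Number of people asked until one of them took longer than our diner.
    own = rng.exponential()              # the diner's own service time
    count = 0
    while True:
        count += 1
        if rng.exponential() > own:      # this person took longer -> stop asking
            return count

samples = np.array([people_asked(rng) for _ in range(50_000)])

# P[X > x] should be close to 1/(x + 1).
for x in (1, 4, 9):
    print(f"P[X > {x}] ~ {np.mean(samples > x):.3f}   1/(x+1) = {1 / (x + 1):.3f}")

# Because E[X] is infinite, the running sample mean never settles down to a limit.
for m in (500, 5_000, 50_000):
    print(m, samples[:m].mean())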