Chapter 6 Jointly Distributed Random Variables

1 Chapter 6 Jointly Distributed Random Variables
STAT 111

2 In many experiments it is necessary to consider the properties of two or more random variables simultaneously. In what follows we shall be concerned with the bivariate case, that is, with situations where we are interested in a pair of random variables at the same time. Later, we shall extend this discussion to the multivariate case, covering any finite number of random variables.
If X and Y are two discrete random variables, the probability distribution for their simultaneous occurrence can be represented by a function with values f(x, y) for any pair of values (x, y) within the range of X and Y. This function is commonly called the joint probability distribution of X and Y, and it is defined as
f(x, y) = P(X = x, Y = y)

3 Definition
The function f(x, y) is a joint probability distribution (or joint probability mass function) of the discrete random variables X and Y if and only if its values satisfy the conditions:
1. f(x, y) ≥ 0 for all (x, y)
2. ∑x ∑y f(x, y) = 1, where the double summation extends over all (x, y)
Based on this definition, the joint cumulative distribution function gives the joint probability that X ≤ x and Y ≤ y:
F(x, y) = P(X ≤ x, Y ≤ y) = ∑xi ≤ x ∑yi ≤ y f(xi, yi), for −∞ < x < ∞, −∞ < y < ∞
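The two defining conditions and the joint CDF can be checked mechanically. Below is a minimal Python sketch; the small joint pmf used here is a hypothetical example for illustration, not one from the slides.

```python
from fractions import Fraction

# Hypothetical joint pmf stored as a dict mapping (x, y) -> probability.
joint = {
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4),
}

# Condition 1: f(x, y) >= 0 for all (x, y).
assert all(p >= 0 for p in joint.values())
# Condition 2: the double sum over all (x, y) equals 1.
assert sum(joint.values()) == 1

def cdf(joint, x, y):
    """F(x, y) = P(X <= x, Y <= y) = sum of f(xi, yi) over xi <= x, yi <= y."""
    return sum(p for (xi, yi), p in joint.items() if xi <= x and yi <= y)

print(cdf(joint, 0, 1))  # P(X <= 0, Y <= 1) = 1/4 + 1/4 = 1/2
```

Storing the pmf as a dict keyed by (x, y) pairs makes the double summations in the definitions one-line generator expressions.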

4 Example
Let X denote the number of heads and Y the number of heads minus the number of tails when 3 fair coins are tossed. Find the joint probability distribution of X and Y.
The possible pairs (x, y) are (0, −3), (1, −1), (2, 1), (3, 3), with
P(X = 0, Y = −3) = 1/8, P(X = 1, Y = −1) = 3/8, P(X = 2, Y = 1) = 3/8, P(X = 3, Y = 3) = 1/8.
These probabilities are most easily expressed in tabular form:

Y \ X |  0    1    2    3  | Sum
 −3   | 1/8   0    0    0  | 1/8
 −1   |  0   3/8   0    0  | 3/8
  1   |  0    0   3/8   0  | 3/8
  3   |  0    0    0   1/8 | 1/8
 Sum  | 1/8  3/8  3/8  1/8 |  1
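The coin-toss table can be reproduced by enumerating the 8 equally likely outcomes; here is a short Python sketch of that enumeration.

```python
from fractions import Fraction
from itertools import product

# Enumerate the 8 equally likely outcomes of tossing 3 fair coins and
# tabulate X = number of heads, Y = heads minus tails.
joint = {}
for outcome in product("HT", repeat=3):
    x = outcome.count("H")       # number of heads
    y = x - outcome.count("T")   # heads minus tails
    joint[(x, y)] = joint.get((x, y), 0) + Fraction(1, 8)

print(sorted(joint.items()))
# Only (0,-3), (1,-1), (2,1), (3,3) occur, with probabilities 1/8, 3/8, 3/8, 1/8.
```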

5 Example
Suppose that 3 balls are randomly selected from an urn containing 3 red, 4 white and 5 blue balls. Let X denote the number of red balls chosen and Y the number of white balls chosen. Find the joint probability function of X and Y, P(X + Y ≤ 2), P(X = 1, Y = 2), P(X = 0, 0 ≤ Y ≤ 2), and P(X > Y).
With X = #R and Y = #W, choosing x of the 3 red, y of the 4 white, and the remaining 3 − x − y of the 5 blue balls gives
f(x, y) = C(3, x) C(4, y) C(5, 3 − x − y) / C(12, 3), where C(12, 3) = 220:

X \ Y |   0       1       2       3
  0   | 10/220  40/220  30/220  4/220
  1   | 30/220  60/220  18/220    0
  2   | 15/220  12/220    0       0
  3   |  1/220    0       0       0

1. X + Y ≤ 2 for the pairs {(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (0, 2)}, therefore
P(X + Y ≤ 2) = (10 + 30 + 15 + 40 + 60 + 30)/220 = 185/220 = 37/44
2. P(X = 1, Y = 2) = f(1, 2) = 18/220
3. P(X = 0, 0 ≤ Y ≤ 2) = f(0, 0) + f(0, 1) + f(0, 2) = (10 + 40 + 30)/220 = 80/220 = 4/11
4. P(X > Y) = f(1, 0) + f(2, 0) + f(2, 1) + f(3, 0) = (30 + 15 + 12 + 1)/220 = 58/220 = 29/110
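The urn probabilities follow the counting formula f(x, y) = C(3, x) C(4, y) C(5, 3 − x − y) / C(12, 3); a short Python sketch can verify the tabulated answers.

```python
from fractions import Fraction
from math import comb

# f(x, y): choose x of 3 red, y of 4 white, and 3-x-y of 5 blue balls.
def f(x, y):
    if x < 0 or y < 0 or x + y > 3:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(4, y) * comb(5, 3 - x - y), comb(12, 3))

assert f(1, 2) == Fraction(18, 220)                      # P(X=1, Y=2)
p_le2 = sum(f(x, y) for x in range(4) for y in range(4) if x + y <= 2)
print(p_le2)        # P(X+Y <= 2) = 185/220 = 37/44
p_xgty = sum(f(x, y) for x in range(4) for y in range(4) if x > y)
print(p_xgty)       # P(X > Y) = 58/220 = 29/110
```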

6 Example
Let the joint probability mass function of X and Y be given in a table over the values x, y ∈ {0, 1, 2}. Find:
1. P(X = 1, Y ≤ 0) = f(1, 0) = 1/8
2. P(X = 2, Y ≤ 0) = f(2, 0) = 1/8
3. P(X = 2, X + Y = 4) = f(2, 2) = 1/16
4. P(1 ≤ X < 3, Y ≥ 1) = f(1, 1) + f(1, 2) + f(2, 1) + f(2, 2)
5. P(X ≤ 2) = 1 − P(X > 2) = 1 − 0 = 1
6. F(1, 1) = P(X ≤ 1, Y ≤ 1) = f(0, 0) + f(0, 1) + f(1, 0) + f(1, 1)

7 Example
Determine the value of c so that the following functions represent joint probability distributions of the random variables X and Y:
1. f(x, y) = c x y, for x = 1, 2, 3 and y = 1, 2, 3.
∑x ∑y f(x, y) = c (1 + 2 + 3)(1 + 2 + 3) = 36c = 1, so c = 1/36
2. f(x, y) = c |x − y|, for x = −2, 0, 2 and y = 1, 2, 3.
∑x ∑y f(x, y) = c [(3 + 4 + 5) + (1 + 2 + 3) + (1 + 0 + 1)] = 20c = 1, so c = 1/20
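The normalizing constant c is just the reciprocal of the total sum over the grid, so both answers can be checked with two one-line summations in Python.

```python
from fractions import Fraction

# 1. f(x, y) = c*x*y over x = 1,2,3 and y = 1,2,3: sum x*y over the grid.
s1 = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))
print(Fraction(1, s1))   # c = 1/36

# 2. f(x, y) = c*|x - y| over x = -2, 0, 2 and y = 1, 2, 3.
s2 = sum(abs(x - y) for x in (-2, 0, 2) for y in (1, 2, 3))
print(Fraction(1, s2))   # c = 1/20
```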

8 All the preceding definitions concerning two random variables can be generalized to the multivariate case, where there are n random variables. The values of the joint probability distribution of the discrete random variables X1, X2, ..., Xn, defined over the same sample space S, are given by
f(x1, x2, ..., xn) = P(X1 = x1, X2 = x2, ..., Xn = xn)
for every n-tuple (x1, x2, ..., xn) within the range of the random variables. Also, the values of their joint cumulative distribution function are given by
F(x1, x2, ..., xn) = P(X1 ≤ x1, X2 ≤ x2, ..., Xn ≤ xn)

9 Example
Considering n flips of a balanced coin, let X1 be the number of heads (0 or 1) obtained on the first flip, X2 the number of heads obtained on the second flip, ..., and Xn the number of heads obtained on the nth flip. Find the joint probability distribution of these n random variables.
f(x1, x2, ..., xn) = P(X1 = x1, X2 = x2, ..., Xn = xn) = (1/2) × (1/2) × ... × (1/2) = 1/2^n,
where each xi can take on the value 0 or 1.

10 Marginal Distributions
If X and Y are discrete random variables and f(x, y) is the value of their joint probability distribution at (x, y), the function given by
fX(x) = ∑y f(x, y)
for each x within the range of X, is called the marginal distribution of X. Similarly, the function given by
fY(y) = ∑x f(x, y)
for each y within the range of Y, is called the marginal distribution of Y. Note that, because these probabilities are obtained from the margins of the joint table, we call them marginal distributions.

11 Example
Assume that the random variables X and Y have the following joint probability mass function (the urn example above). Find the marginal distributions of X and Y.

X \ Y |   0       1       2       3    | fX(x)
  0   | 10/220  40/220  30/220  4/220  |  84/220
  1   | 30/220  60/220  18/220    0    | 108/220
  2   | 15/220  12/220    0       0    |  27/220
  3   |  1/220    0       0       0    |   1/220
fY(y) | 56/220 112/220  48/220  4/220  |    1

Marginal of X: fX(0) = 84/220, fX(1) = 108/220, fX(2) = 27/220, fX(3) = 1/220
Marginal of Y: fY(0) = 56/220, fY(1) = 112/220, fY(2) = 48/220, fY(3) = 4/220
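Since the marginals are just the row and column sums of the joint table, they can be accumulated in one pass over the joint pmf; here is a Python sketch using the urn example.

```python
from fractions import Fraction
from math import comb
from collections import defaultdict

# Joint pmf of the urn example: f(x, y) = C(3,x) C(4,y) C(5,3-x-y) / 220.
joint = {(x, y): Fraction(comb(3, x) * comb(4, y) * comb(5, 3 - x - y), 220)
         for x in range(4) for y in range(4) if x + y <= 3}

# Marginals are the row and column sums of the joint table.
fx, fy = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), p in joint.items():
    fx[x] += p
    fy[y] += p

print(dict(fx))  # fX: 84/220, 108/220, 27/220, 1/220 (in lowest terms)
print(dict(fy))  # fY: 56/220, 112/220, 48/220, 4/220 (in lowest terms)
```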

12 Example
Assume that the random variables X and Y have the joint probability mass function
f(x, y) = λ^(x+y) e^(−2λ) / (x! y!), x = 0, 1, 2, ...; y = 0, 1, 2, ...
Find the marginal distribution of X.
fX(x) = ∑y=0..∞ λ^(x+y) e^(−2λ) / (x! y!)
      = (λ^x e^(−2λ) / x!) ∑y=0..∞ λ^y / y!
      = (λ^x e^(−2λ) / x!) e^λ      [using e^λ = ∑t=0..∞ λ^t / t!]
      = λ^x e^(−λ) / x!
That is, X has a Poisson distribution with parameter λ.

13 Example
Let the joint distribution of X and Y be given as
f(x, y) = (x + y)/30, x = 0, 1, 2, 3; y = 0, 1, 2
Find the marginal distribution functions of X and Y.

y \ x |  0     1     2     3
  0   |  0    1/30  2/30  3/30
  1   | 1/30  2/30  3/30  4/30
  2   | 2/30  3/30  4/30  5/30

Marginal of X: fX(x) = ∑y=0..2 (x + y)/30 = (3x + 3)/30 = (x + 1)/10, so
x     |  0     1     2     3
fX(x) | 1/10  1/5   3/10  2/5
Similarly, marginal of Y: fY(y) = ∑x=0..3 (x + y)/30 = (6 + 4y)/30 = (3 + 2y)/15, so
y     |  0     1     2
fY(y) | 1/5   1/3   7/15
Alternatively, since the joint table is given, the marginals of X and Y can be read directly as its column and row sums.
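The closed forms fX(x) = (x + 1)/10 and fY(y) = (3 + 2y)/15 can be verified exactly with fractions in Python.

```python
from fractions import Fraction

# Joint pmf f(x, y) = (x + y)/30 over x = 0..3, y = 0..2.
f = {(x, y): Fraction(x + y, 30) for x in range(4) for y in range(3)}
assert sum(f.values()) == 1

# Check the closed forms fX(x) = (x+1)/10 and fY(y) = (3+2y)/15.
for x in range(4):
    assert sum(f[(x, y)] for y in range(3)) == Fraction(x + 1, 10)
for y in range(3):
    assert sum(f[(x, y)] for x in range(4)) == Fraction(3 + 2 * y, 15)
print("closed-form marginals verified")
```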

14 Conditional Distribution
Recall that for any two events A and B, the conditional probability of A given B is defined as
P(A | B) = P(A ∩ B) / P(B), provided P(B) ≠ 0.
Suppose now that A and B are the events X = x and Y = y, so that we can write
P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)     (1)
provided P(Y = y) = fY(y) ≠ 0, where P(X = x, Y = y) = f(x, y) is the value of the joint probability distribution of X and Y at (x, y) and fY(y) is the value of the marginal distribution of Y at y. Denoting the probability in equation (1) by f(x | y), to indicate that x is a variable and y is fixed, we have the following definition.

15 Definition
If f(x, y) is the value of the joint probability distribution of the discrete random variables X and Y at (x, y) and fY(y) is the value of the marginal distribution of Y at y, the function given by
f(x | y) = P(X = x | Y = y) = f(x, y) / fY(y)
for each x within the range of X, is called the conditional distribution of X given Y = y. Similarly, if fX(x) is the value of the marginal distribution of X at x, the function given by
f(y | x) = P(Y = y | X = x) = f(x, y) / fX(x)
for each y within the range of Y, is called the conditional distribution of Y given X = x.

16 Example 1
The joint probability mass function of X and Y is given by
f(1, 1) = 1/8, f(1, 2) = 1/4, f(2, 1) = 1/8, f(2, 2) = 1/2

X \ Y |  1    2   | Sum
  1   | 1/8  1/4  | 3/8
  2   | 1/8  1/2  | 5/8
 Sum  | 2/8  6/8  |  1

Marginal of Y: fY(1) = 2/8, fY(2) = 6/8
1. Compute the conditional mass function of X given Y = i, i = 1, 2.
Conditional of X given Y = 1: f(1 | 1) = (1/8)/(2/8) = 1/2, f(2 | 1) = (1/8)/(2/8) = 1/2
Conditional of X given Y = 2: f(1 | 2) = (1/4)/(6/8) = 1/3, f(2 | 2) = (1/2)/(6/8) = 2/3
2. P(XY ≤ 3) = f(1, 1) + f(1, 2) + f(2, 1) = 1/8 + 1/4 + 1/8 = 1/2
3. P(X/Y > 1) = f(2, 1) = 1/8
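The conditional distribution is just the joint pmf renormalized by the relevant marginal; a Python sketch of Example 1 makes this mechanical.

```python
from fractions import Fraction

# Joint pmf of Example 1: f(1,1)=1/8, f(1,2)=1/4, f(2,1)=1/8, f(2,2)=1/2.
joint = {(1, 1): Fraction(1, 8), (1, 2): Fraction(1, 4),
         (2, 1): Fraction(1, 8), (2, 2): Fraction(1, 2)}

def f_cond_x_given_y(x, y):
    """f(x | y) = f(x, y) / fY(y)."""
    fy = sum(p for (xi, yi), p in joint.items() if yi == y)
    return joint.get((x, y), Fraction(0)) / fy

assert f_cond_x_given_y(1, 1) == Fraction(1, 2)
assert f_cond_x_given_y(1, 2) == Fraction(1, 3)
assert f_cond_x_given_y(2, 2) == Fraction(2, 3)

p_xy_le3 = sum(p for (x, y), p in joint.items() if x * y <= 3)
print(p_xy_le3)   # P(XY <= 3) = 1/2
```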

17 Independent Random Variables
In chapter 3 we stated that two events A and B are independent if and only if
P(A ∩ B) = P(A) P(B),
that is, their joint probability is equal to the product of their marginal probabilities. In the following, the notion of independence of random variables is presented.

18 Definition
The random variables X and Y are said to be independent if and only if
f(x, y) = fX(x) fY(y)
for all possible values of X and Y. In terms of the joint cumulative distribution function, X and Y are independent if and only if
F(x, y) = FX(x) FY(y)
for all possible values of X and Y, or equivalently
P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for every A, B.

19 Thus, loosely speaking, X and Y are independent if knowing the value of one does not change the distribution of the other. Random variables that are not independent are said to be dependent.
Checking for independence of discrete random variables requires a thorough investigation, since it is possible to have the product of the marginal distributions equal to the joint probability distribution for some but not all combinations of (x, y). If one can find any point (x, y) for which f(x, y) ≠ fX(x) fY(y), then the discrete random variables X and Y are not independent.

20 If X is independent of Y, then the conditional mass function is the same as the unconditional one. This follows because if X is independent of Y, then
f(x | y) = P(X = x | Y = y)
         = P(X = x, Y = y) / P(Y = y)
         = P(X = x) P(Y = y) / P(Y = y)
         = P(X = x)

21 Definition
Let X1, X2, ..., Xn be n discrete random variables having densities f1, f2, ..., fn respectively. These random variables are said to be independent if their joint density function f is given by
f(x1, x2, ..., xn) = fX1(x1) fX2(x2) ... fXn(xn)
for all (x1, x2, ..., xn) within their range.

22 Example
Show that the random variables of Example 1 are not independent.
f(1, 1) = 1/8, f(1, 2) = 1/4, f(2, 1) = 1/8, f(2, 2) = 1/2
fX(1) = 3/8, fX(2) = 5/8, fY(1) = 2/8, fY(2) = 6/8
f(1, 1) = 1/8 ≠ 6/64 = fX(1) · fY(1), so X and Y are not independent.
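Because a single failing point is enough to conclude dependence, the check can be written as a loop over every cell of the joint table; here is a Python sketch for Example 1.

```python
from fractions import Fraction

joint = {(1, 1): Fraction(1, 8), (1, 2): Fraction(1, 4),
         (2, 1): Fraction(1, 8), (2, 2): Fraction(1, 2)}
fx = {x: sum(p for (xi, y), p in joint.items() if xi == x) for x in (1, 2)}
fy = {y: sum(p for (x, yi), p in joint.items() if yi == y) for y in (1, 2)}

# X and Y are independent iff f(x, y) = fX(x) fY(y) at EVERY point;
# one failing point is enough to conclude dependence.
independent = all(joint[(x, y)] == fx[x] * fy[y] for x in (1, 2) for y in (1, 2))
print(independent)                    # False
print(joint[(1, 1)], fx[1] * fy[1])  # 1/8 vs (3/8)(2/8) = 3/32
```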

23 Example
The random variables X and Y are specified by
P(X = 1) = 3/4, P(X = 0) = 1/4, P(Y = 1) = 1/3, P(Y = −1) = 2/3
Construct the joint distribution of X and Y assuming that X and Y are independent.
Under independence each cell is the product of the marginals, f(x, y) = fX(x) fY(y):

X \ Y |   1     −1
  1   |  1/4   1/2
  0   | 1/12   1/6
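Going in the other direction from the previous example, independence lets us build the joint table from the marginals by multiplication. The marginals below (P(X=1) = 3/4, P(Y=1) = 1/3) are reconstructed to match the surviving table entries 1/4, 1/12 and 2/12, so treat them as an assumption.

```python
from fractions import Fraction

# Assumed marginals, reconstructed from the table entries in the transcript.
px = {1: Fraction(3, 4), 0: Fraction(1, 4)}
py = {1: Fraction(1, 3), -1: Fraction(2, 3)}

# Under independence every joint cell is the product of the marginals.
joint = {(x, y): px[x] * py[y] for x in px for y in py}
print(joint)
# {(1, 1): 1/4, (1, -1): 1/2, (0, 1): 1/12, (0, -1): 1/6}
```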

