1 Chapter 9: Joint distributions and independence CIS 3033

2 9.1 Joint distributions: discrete
Number of random variables: one, two, or more, especially when they are defined on the same sample space Ω. [Otherwise, consider products of sample spaces, as in Section 2.4.]
What is new: the influence between variables, expressed as relations among events. For example, consider the two random variables S and M, the sum and the maximum of two throws of a die.

3 9.1 Joint distributions: discrete
The joint probability mass function of discrete random variables X and Y (on the same sample space Ω) is the function p: R² → [0, 1] defined by p(a, b) = P(X = a, Y = b) for −∞ < a, b < ∞.
The joint distribution function of X and Y is the function F: R² → [0, 1] defined by F(a, b) = P(X ≤ a, Y ≤ b) for −∞ < a, b < ∞.
The marginal probability mass function of X (or of Y) is obtained from p(a, b) by summing over the values of the other variable: p_X(a) = Σ_b p(a, b), and likewise p_Y(b) = Σ_a p(a, b).
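
To make slide 3 concrete, here is a minimal Python sketch (mine, not part of the slides) that tabulates the joint pmf of S and M from slide 2 and recovers both marginals by summing over the other variable:

```python
from fractions import Fraction
from collections import defaultdict

# Joint pmf of S = sum and M = max of two fair-die throws:
# each of the 36 ordered outcomes has probability 1/36.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(d1 + d2, max(d1, d2))] += Fraction(1, 36)

# Marginal pmfs: sum out the other variable.
p_S = defaultdict(Fraction)
p_M = defaultdict(Fraction)
for (s, m), prob in joint.items():
    p_S[s] += prob
    p_M[m] += prob

print(joint[(7, 4)])   # p(7, 4) = 2/36 = 1/18, from outcomes (3,4) and (4,3)
print(p_S[7])          # p_S(7) = 1/6
print(p_M[4])          # p_M(4) = 7/36
```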

4 9.1 Joint distributions: discrete

5 In many cases the joint probability mass function of X and Y cannot be recovered from the marginal probability mass functions p_X and p_Y alone. The same is true for the distribution functions.
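
A small illustration of this point (my own example, not from the slides): two different joint pmfs on {0, 1} × {0, 1} with identical marginals, so the marginals alone cannot identify the joint:

```python
from fractions import Fraction

h, q = Fraction(1, 2), Fraction(1, 4)

# Two different joint pmfs on {0,1} x {0,1}:
# independent fair coins vs. two copies of the same fair coin.
joint_indep = {(0, 0): q, (0, 1): q, (1, 0): q, (1, 1): q}
joint_equal = {(0, 0): h, (0, 1): 0, (1, 0): 0, (1, 1): h}

for joint in (joint_indep, joint_equal):
    # Marginals: sum out the other coordinate.
    p_X0 = joint[(0, 0)] + joint[(0, 1)]
    p_Y0 = joint[(0, 0)] + joint[(1, 0)]
    print(p_X0, p_Y0)   # prints 1/2 1/2 both times: identical marginals
```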

6 9.2 Joint distributions: continuous
Random variables X and Y have a joint continuous distribution if there is a function f: R² → R, the joint probability density function, with f(x, y) ≥ 0, such that P(a_1 ≤ X ≤ b_1, a_2 ≤ Y ≤ b_2) equals the double integral of f(x, y) over the rectangle [a_1, b_1] × [a_2, b_2].

7 For the distribution functions, the relation between the joint and the marginals is the same as in the discrete case, as given in formulas (9.1) and (9.2).
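
As a sketch of the continuous case (my own example; the density f(x, y) = e^(−x−y) on the positive quadrant is an assumption chosen for illustration), the joint distribution function is the double integral of the density, and the marginal F_X(a) appears as the limit of F(a, b) for large b, just as in the discrete case:

```python
import numpy as np
from scipy import integrate

# Example joint density (an assumption for illustration, not from the
# slides): f(x, y) = exp(-x - y) for x, y >= 0, and 0 otherwise.
# dblquad passes the inner variable first; this f is symmetric anyway.
def f(y, x):
    return np.exp(-x - y)

def F(a, b):
    # Joint distribution function: double integral of f over
    # (-inf, a] x (-inf, b]; f vanishes for negative arguments,
    # so integrating from 0 suffices.
    val, _err = integrate.dblquad(f, 0, a, 0, b)
    return val

print(F(1.0, 2.0))    # ~0.5466 = (1 - e^-1)(1 - e^-2)
print(F(1.0, 50.0))   # ~0.6321 = 1 - e^-1: the marginal F_X(1) as b grows
```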

8 9.3 More than two random variables
The joint distribution function F of X_1, X_2, ..., X_n (all defined on the same sample space Ω) is defined by F(a_1, a_2, ..., a_n) = P(X_1 ≤ a_1, X_2 ≤ a_2, ..., X_n ≤ a_n) for −∞ < a_1, a_2, ..., a_n < ∞.
A joint probability mass function p can be defined for discrete random variables, and a joint probability density function f for continuous random variables, just as in the two-variable case.
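
For concreteness, a short sketch (mine, not from the slides) that evaluates the n-variable distribution function by brute-force enumeration for three fair dice:

```python
from itertools import product
from fractions import Fraction

def joint_cdf(a):
    """F(a_1, ..., a_n) = P(X_1 <= a_1, ..., X_n <= a_n) for n fair dice,
    computed by enumerating all 6**n equally likely outcomes."""
    n = len(a)
    hits = sum(
        all(x_i <= a_i for x_i, a_i in zip(outcome, a))
        for outcome in product(range(1, 7), repeat=n)
    )
    return Fraction(hits, 6 ** n)

print(joint_cdf((2, 3, 6)))   # 36 of the 216 outcomes qualify: 1/6
```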

9 9.3 More than two random variables
Suppose a vase contains N balls numbered 1, 2, ..., N, and we draw n balls without replacement; let X_i be the number on the i-th ball drawn. Since there are N(N−1)···(N−n+1) possible ordered sequences of values for X_1, X_2, ..., X_n, each having the same probability, the joint probability mass function is given by p(a_1, a_2, ..., a_n) = P(X_1 = a_1, X_2 = a_2, ..., X_n = a_n) = 1 / [N(N−1)···(N−n+1)] for all distinct values a_1, a_2, ..., a_n with 1 ≤ a_j ≤ N. The marginal distribution of each X_i is uniform: p_{X_i}(k) = 1/N for k = 1, ..., N.
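
A quick simulation (my own sketch; N = 5 and n = 3 are arbitrary choices) to check both claims: every ordered sequence of draws is equally likely, and each X_i is uniform on 1, ..., N:

```python
import random
from collections import Counter

N, n, trials = 5, 3, 200_000
rng = random.Random(0)

# Each trial draws n of the N balls without replacement, in order.
draws = [tuple(rng.sample(range(1, N + 1), n)) for _ in range(trials)]

# Each of the N(N-1)(N-2) = 60 ordered sequences should appear
# with probability 1/60 ~ 0.0167.
seq_counts = Counter(draws)
print(seq_counts[(1, 2, 3)] / trials)

# The marginal of X_2 (the second ball) should be uniform: 1/N = 0.2.
second = Counter(d[1] for d in draws)
print({k: round(v / trials, 3) for k, v in sorted(second.items())})
```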

10 9.4 Independent random variables
Random variables X and Y are independent if every event involving only X is independent of every event involving only Y. Random variables that are not independent are called dependent.
Equivalently, X and Y are independent if P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b) for all a and b, that is, if the joint distribution function factorizes: F(a, b) = F_X(a) F_Y(b). The same factorization criterion applies to the probability mass function p and to the density f.
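
Applying the pmf criterion to S and M from slide 2 (my own sketch; it rebuilds the joint table from the earlier one so it runs standalone): a single pair (a, b) with p(a, b) ≠ p_S(a) p_M(b) already proves dependence:

```python
from fractions import Fraction
from collections import defaultdict

# Rebuild the joint pmf of S = sum and M = max of two die throws.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(d1 + d2, max(d1, d2))] += Fraction(1, 36)

p_S = defaultdict(Fraction)
p_M = defaultdict(Fraction)
for (s, m), prob in joint.items():
    p_S[s] += prob
    p_M[m] += prob

# Independence would require p(a, b) = p_S(a) * p_M(b) for every pair.
a, b = 2, 6
print(joint[(a, b)])     # P(S = 2, M = 6) = 0: impossible combination
print(p_S[a] * p_M[b])   # (1/36)(11/36) = 11/1296 > 0, so S, M are dependent
```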

11 9.5 Propagation of independence

