1 Chapter 5 Joint Probability Distributions. Joint, n. 1. a cheap, sordid place. 2. the movable place where two bones join. 3. one of the portions in which a carcass is divided by a butcher. 4. adj., shared or common to two or more. Chapter 5A: Discrete RVs

2 This week in Prob/Stat: today's good stuff (time permitting)

3 Joint Probability Distributions It is often useful (and sometimes necessary) to have more than one RV defined in a random experiment. Examples: Polyethylene specs: X = melt point, Y = density. Dimensions of a part: X = length, Y = width. If X and Y are two RVs, the probability distribution that defines their simultaneous behavior is a joint probability distribution.

4 Two Discrete Random Variables Let X = a discrete random variable, the number of orders placed per day for a high-cost item. Let Y = a discrete random variable, the number of items in stock. Joint probability mass function: f_XY(x,y)

5 5-1 Two Discrete Random Variables 5-1.1 Joint Probability Distributions

6 Two Discrete Random Variables Let X = a discrete random variable, the number of orders placed per day for a high-cost item. Let Y = a discrete random variable, the number of items in stock. Pr{X = 0, Y = 1} = f_XY(0,1) = 0.15

7 5-1 Two Discrete Random Variables 5-1.2 Marginal Probability Distributions The individual probability distribution of a random variable is referred to as its marginal probability distribution. The marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables. To determine P(X = x), we sum P(X = x, Y = y) over all points in the range of (X, Y) for which X = x. Subscripts on the probability mass functions distinguish between the random variables.

8 5-1 Two Discrete Random Variables Definition: Marginal Probability Mass Functions

9 Two Discrete Random Variables Let X = a discrete random variable, the number of orders placed per day for a high-cost item. Let Y = a discrete random variable, the number of items in stock. Pr{X = 1} = Pr{X=1, Y=0} + Pr{X=1, Y=1} + Pr{X=1, Y=2} + Pr{X=1, Y=3} = f_X(1) = 0.10 + 0.17 + 0.05 + 0.01 = 0.33. Pr{Y ≥ 2} = f_Y(2) + f_Y(3) = 0.16 + 0.05 = 0.21
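The marginal value f_X(1) is obtained by summing the joint pmf over all y; a minimal sketch using only the X = 1 row of the slide's table:

```python
# Joint pmf entries for X = 1, taken from the slide
f_xy = {(1, 0): 0.10, (1, 1): 0.17, (1, 2): 0.05, (1, 3): 0.01}

# Marginal: f_X(1) = sum over y of f_XY(1, y)
f_x1 = sum(p for (x, y), p in f_xy.items() if x == 1)
print(round(f_x1, 2))  # 0.33
```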

10 Marginal Mean & Variance If the marginal probability distribution of X has the probability mass function f_X(x), then R_x denotes all points of (X,Y) for which X = x and R denotes all points in the range of (X,Y)

11 Using the Marginal Distributions E[X] = μ_X = 0(0.39) + 1(0.33) + 2(0.28) = 0.89. E[Y] = μ_Y = 0(0.38) + 1(0.41) + 2(0.16) + 3(0.05) = 0.88. Var[X] = σ_X² = 0²(0.39) + 1²(0.33) + 2²(0.28) − 0.89² = 0.6579. Var[Y] = σ_Y² = 0²(0.38) + 1²(0.41) + 2²(0.16) + 3²(0.05) − 0.88² = 0.7256
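These moments follow directly from the marginal pmfs; a quick check in Python using the marginals listed on the earlier slides:

```python
fx = {0: 0.39, 1: 0.33, 2: 0.28}           # marginal pmf of X (from the slides)
fy = {0: 0.38, 1: 0.41, 2: 0.16, 3: 0.05}  # marginal pmf of Y

def mean(pmf):
    """E[X] = sum of x * f(x)."""
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    """Var[X] = E[X^2] - (E[X])^2."""
    m = mean(pmf)
    return sum(x**2 * p for x, p in pmf.items()) - m**2

print(round(mean(fx), 4), round(var(fx), 4))  # 0.89 0.6579
print(round(mean(fy), 4), round(var(fy), 4))  # 0.88 0.7256
```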

12 5-1.3 Conditional Probability Distributions

13 Conditional Distribution of Y f_{Y|x}(y) = f_XY(x,y) / f_X(x). f_{Y|x=1}(2) = f_XY(1,2) / f_X(1) = 0.05 / 0.33 = 0.151515

14 Conditional Distribution of X f_{X|y}(x) = f_XY(x,y) / f_Y(y). f_{X|y=2}(1) = f_XY(1,2) / f_Y(2) = 0.05 / 0.16 = 0.3125

15 Conditional Mean and Variance

16 Conditional Mean & Var of Y f_{Y|x}(y) = f_XY(x,y) / f_X(x). E[Y | x=1] = 0(0.30303) + 1(0.515152) + 2(0.151515) + 3(0.030303) = 0.9091. Var[Y | x=1] = 0²(0.30303) + 1²(0.515152) + 2²(0.151515) + 3²(0.030303) − 0.9091² = 0.5675
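The conditional pmf values above are just the X = 1 row of the joint table divided by f_X(1) = 0.33; a sketch that rebuilds them and the conditional moments:

```python
row = {0: 0.10, 1: 0.17, 2: 0.05, 3: 0.01}  # f_XY(1, y), from the slide's table
fx1 = sum(row.values())                      # marginal f_X(1) = 0.33

# Conditional pmf f_{Y|x=1}(y) = f_XY(1, y) / f_X(1)
fy_given_x1 = {y: p / fx1 for y, p in row.items()}

cond_mean = sum(y * p for y, p in fy_given_x1.items())
cond_var = sum(y**2 * p for y, p in fy_given_x1.items()) - cond_mean**2
print(round(cond_mean, 4), round(cond_var, 4))  # 0.9091 0.5675
```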

17 Conditional Mean & Var of X f_{X|y}(x). E[X | y=2] = 0(0.1875) + 1(0.3125) + 2(0.5) = 1.3125. Var[X | y=2] = 0²(0.1875) + 1²(0.3125) + 2²(0.5) − 1.3125² = 0.5898

18 5-1.4 Independence

19 Are X and Y Independent? f_X(1) f_Y(2) = (0.33)(0.16) = 0.0528 ≠ f_XY(1,2) = 0.05. f_X(2) f_Y(0) = (0.28)(0.38) = 0.1064 ≠ f_XY(2,0) = 0.08. No Chuck, they are not independent.
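Independence requires f_XY(x,y) = f_X(x) f_Y(y) at every point, so a single failing cell is enough; checking the two cells from the slide:

```python
# Marginal and joint values taken from the slides
f_x = {1: 0.33, 2: 0.28}
f_y = {0: 0.38, 2: 0.16}
f_xy = {(1, 2): 0.05, (2, 0): 0.08}

for (x, y), joint in f_xy.items():
    product = f_x[x] * f_y[y]
    # Independence would require product == joint; here both cells fail
    print((x, y), round(product, 4), joint)
```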

20 More on Independence Many evaluations of independence are based on knowledge of the physical situation. If we are reasoning based on data, we will need statistical tools to help us. It is very unlikely that counts and estimated probabilities will yield the exact equalities required by the conditions for establishing independence.

21 The Search for Independence Let X = a discrete random variable, the number of defects in a lot of size 3, where the probability of a defect is a constant 0.1. Let Y = a discrete random variable, the demand in a given day for the number of units from the above lot.

22 The Search Continues Assuming independence: f_XY(x,y) = f_X(x) f_Y(y). f_XY(1,2) = f_X(1) f_Y(2) = (0.243)(0.4) = 0.0972. Remember: P(A ∩ B) = P(A) P(B) if A and B are independent
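Here f_X(1) comes from a binomial(3, 0.1) model for the defect count, and f_Y(2) = 0.4 is read off the demand table on the slide; under independence the joint value is just the product. A minimal sketch, with the binomial pmf computed from scratch:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

fx1 = binom_pmf(1, 3, 0.1)   # f_X(1) = 3 * 0.1 * 0.9^2 = 0.243
fy2 = 0.4                    # f_Y(2), as given on the slide
print(round(fx1 * fy2, 4))   # 0.0972
```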

23 Recap - Sample Problem Assume X and Y are jointly distributed with the joint probability mass function f_XY(x,y) given in a table [table image not recovered; X takes values −1, −0.5, 0.5, 1, and the entries include 1/8, 1/16, 3/16, 1/4]

24 Sample Problem Cont’d Determine the marginal probability distribution of X P(X = -1) = 1/8 + 1/8 = 1/4 P(X = -0.5) = 1/16 + 1/16 = 1/8 P(X = 0.5) = 3/16 + 1/4 = 7/16 P(X = 1) = 1/16 + 1/8 = 3/16

25 Sample Problem Cont’d Determine the conditional probability distribution of Y given that X = 1. P(Y = 1 | X = 1) = P(X = 1, Y = 1)/P(X = 1) = (1/16)/(3/16) = 1/3 P(Y = 2 | X = 1) = P(X = 1, Y = 2)/P(X = 1) = (1/8)/(3/16) = 2/3

26 Sample Problem Cont’d Determine the conditional probability distribution of X given that Y = 1. P(X = 0.5 | Y = 1) = P(X = 0.5, Y = 1)/P(Y = 1) = (1/4)/(5/16) = 4/5 P(X = 1 | Y = 1) = P(X = 1, Y = 1)/P(Y = 1) = (1/16)/(5/16) = 1/5
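Both conditionals above are joint cells divided by a marginal; using exact fractions avoids any rounding. The cell values are the ones appearing in the slide computations:

```python
from fractions import Fraction as F

# Joint cells used on the slides
p_x1_y1 = F(1, 16)   # f_XY(1, 1)
p_x1_y2 = F(1, 8)    # f_XY(1, 2)
p_x05_y1 = F(1, 4)   # f_XY(0.5, 1)

p_x1 = p_x1_y1 + p_x1_y2    # P(X = 1) = 3/16
p_y1 = p_x05_y1 + p_x1_y1   # P(Y = 1) = 5/16

print(p_x1_y1 / p_x1, p_x1_y2 / p_x1)   # 1/3 2/3
print(p_x05_y1 / p_y1, p_x1_y1 / p_y1)  # 4/5 1/5
```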

27 5-1.5 Multiple Discrete Random Variables Definition: Joint Probability Mass Function

28 5-1.5 Multiple Discrete Random Variables Definition: Marginal Probability Mass Function

29 5-1.5 Multiple Discrete Random Variables Mean and Variance from Joint Probability

30 5-1.6 Multinomial Probability Distribution

31

32 The Necessary Example Final inspection of products coming off the assembly line categorizes every item as either acceptable, needing rework, or rejected. Historically, 90 percent have been acceptable, 7 percent have needed rework, and 3 percent have been rejected. For the next 10 items produced, what is the probability that there will be 8 acceptable, 2 reworks, and no rejects? Let X_1 = number acceptable, X_2 = number of reworks, X_3 = number of rejects
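The requested probability is multinomial with n = 10 and p = (0.90, 0.07, 0.03); a direct computation of 10!/(8! 2! 0!) × 0.90⁸ × 0.07² × 0.03⁰:

```python
from math import comb

probs = (0.90, 0.07, 0.03)   # P(acceptable), P(rework), P(reject)

# Multinomial coefficient 10! / (8! 2! 0!) = C(10,8) * C(2,2) = 45
coef = comb(10, 8) * comb(2, 2)
p = coef * probs[0]**8 * probs[1]**2 * probs[2]**0
print(round(p, 4))  # 0.0949
```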

33 More of the Necessary Example The production process is assumed to be out of control (i.e., the probability of an acceptable item is less than 0.9) if there are fewer than 8 acceptable items produced from a lot of size 10. What is the probability that the production process will be assumed to be out of control when the probability of an acceptable item remains 0.9? Let X_1 = number acceptable
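Since only "acceptable vs. not acceptable" matters here, X_1 is binomial(10, 0.9) and the question asks for P(X_1 < 8), most easily computed via the complement:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# P(X1 < 8) = 1 - P(8) - P(9) - P(10), with n = 10, p = 0.9
p_out = 1 - sum(binom_pmf(k, 10, 0.9) for k in (8, 9, 10))
print(round(p_out, 4))  # 0.0702
```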

34 This week in Prob/Stat Wednesday’s good stuff

