1 Chapter 5 Joint Probability Distributions The adventure continues as we consider two or more random variables all at the same time. Chapter 5B Discrete

2 Chapter 4 Homework – the stats

 mean: 87.3
 std dev: 16.5
 median: 91
 skewness: -2.12
 minimum: 29
 maximum: 100
 1st quartile: 83
 3rd quartile: 97
 interquartile range: 14
 range: 71
 kurtosis: 4.27

Kurtosis characterizes the relative peakedness or flatness of a distribution compared with the normal distribution. Positive kurtosis indicates a relatively peaked distribution; negative kurtosis indicates a relatively flat distribution.

3 This week in Prob/Stat: today's good stuff as it pertains to discrete RVs

4 Expected Value of a Function of Two Random Variables A common measure of the relationship between two random variables is the covariance. To describe covariance, we need the expected value of a function of two RV's. For discrete X and Y, E[h(X,Y)] = Σ Σ h(x,y) f XY (x,y), summed over every point in the range of (X, Y). E[h(X,Y)] can be thought of as the weighted average of h(x,y) over the range of (X, Y) and represents the average value of h(X, Y).

5 Here is an old favorite joint distribution Let X = the number of orders placed per day for a high cost item Let Y = the number of items in stock at the start of the day. The joint PMF f XY (x,y):

 x = 0: f(0,0) = .20, f(0,1) = .15, f(0,2) = .03, f(0,3) = .01
 x = 1: f(1,0) = .10, f(1,1) = .17, f(1,2) = .05, f(1,3) = .01
 x = 2: f(2,0) = .08, f(2,1) = .09, f(2,2) = .08, f(2,3) = .03

Let Z = Y - X, the daily ending inventory. Then h(x,y) = y - x, and
E[h(X,Y)] = E[Z] = h(0,0)(.2) + h(1,0)(.1) + h(2,0)(.08) + h(0,1)(.15) + h(1,1)(.17) + h(2,1)(.09) + h(0,2)(.03) + h(1,2)(.05) + h(2,2)(.08) + h(0,3)(.01) + h(1,3)(.01) + h(2,3)(.03)
= 0(.2) + (-1)(.1) + (-2)(.08) + (1)(.15) + (0)(.17) + (-1)(.09) + (2)(.03) + (1)(.05) + (0)(.08) + (3)(.01) + (2)(.01) + (1)(.03) = -.01
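The sum above can be sketched in a few lines (hypothetical code, not from the slides), iterating h(x, y) = y - x over the joint PMF:

```python
# Sketch: E[h(X, Y)] for h(x, y) = y - x, computed directly from the
# joint PMF of the orders/inventory example.
f = {  # f[(x, y)] = P(X = x, Y = y)
    (0, 0): .20, (1, 0): .10, (2, 0): .08,
    (0, 1): .15, (1, 1): .17, (2, 1): .09,
    (0, 2): .03, (1, 2): .05, (2, 2): .08,
    (0, 3): .01, (1, 3): .01, (2, 3): .03,
}
ez = sum((y - x) * p for (x, y), p in f.items())  # weighted average of y - x
print(round(ez, 2))  # -0.01
```

The same loop handles any h(x, y); only the expression inside `sum` changes.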

6 Another look at the inventory

 x:         0    0    0    0    1    1    1    1    2    2    2    2   sum
 y:         0    1    2    3    0    1    2    3    0    1    2    3
 f(x,y):   .20  .15  .03  .01  .10  .17  .05  .01  .08  .09  .08  .03  1.00
 y - x:      0    1    2    3   -1    0    1    2   -2   -1    0    1
 (y-x)f:     0  .15  .06  .03  -.10   0  .05  .02  -.16 -.09   0  .03  -.01

Recall: E[X] = μ x = 0 (.39) + 1 (.33) + 2 (.28) = .89
E[Y] = μ y = 0 (.38) + 1 (.41) + 2 (.16) + 3 (.05) = .88
E(Y - X) = E(Y) - E(X) = .88 - .89 = -.01. Only works for linear relationships!

7 Covariance Definition: The covariance between the RV's X & Y, denoted as cov(X,Y) or σ XY, is:
cov(X,Y) = σ XY = E[(X - μ X )(Y - μ Y )] = E[XY] - μ X μ Y

8 Covariance If the points in the joint probability distribution tend to fall along a line of positive (negative) slope, then σ XY is positive (negative). Covariance measures the linear association between RV's. Covariance is not a dimensionless quantity.

9 Returning to our old favorite, now with h(x,y) = xy:
E[XY] = h(0,0)(.2) + h(1,0)(.1) + h(2,0)(.08) + h(0,1)(.15) + h(1,1)(.17) + h(2,1)(.09) + h(0,2)(.03) + h(1,2)(.05) + h(2,2)(.08) + h(0,3)(.01) + h(1,3)(.01) + h(2,3)(.03)
= 0(.2) + 0(.1) + 0(.08) + 0(.15) + (1)(.17) + 2(.09) + 0(.03) + 2(.05) + 4(.08) + 0(.01) + 3(.01) + 6(.03) = .98
Cov(X,Y) = E[XY] - μ x μ y = .98 - (.89)(.88) = .1968
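The same kind of loop sketches the covariance computation (hypothetical code; the PMF values are those of the example):

```python
# Sketch: Cov(X, Y) = E[XY] - E[X]E[Y] for the orders/inventory example.
f = {  # f[(x, y)] = P(X = x, Y = y)
    (0, 0): .20, (1, 0): .10, (2, 0): .08,
    (0, 1): .15, (1, 1): .17, (2, 1): .09,
    (0, 2): .03, (1, 2): .05, (2, 2): .08,
    (0, 3): .01, (1, 3): .01, (2, 3): .03,
}
exy = sum(x * y * p for (x, y), p in f.items())  # E[XY]
ex = sum(x * p for (x, y), p in f.items())       # E[X] = .89
ey = sum(y * p for (x, y), p in f.items())       # E[Y] = .88
cov = exy - ex * ey
print(round(exy, 2), round(cov, 4))  # 0.98 0.1968
```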

10 Covariance Between X and Y Fig 5-13 Joint distributions and the sign of the covariance between X and Y.

11 Correlation Definition: the correlation between RV's X and Y, denoted by ρ XY, is:
ρ XY = cov(X,Y) / (σ X σ Y ) = σ XY / (σ X σ Y )
For any two random variables X and Y, -1 ≤ ρ XY ≤ +1. Similar to covariance, correlation is also a measure of the linear relationship between RV's. If X and Y are independent RV's, then ρ XY = 0. Correlation is dimensionless and unaffected by the units chosen for measure.

12 Returning to, yes, the olde favorite V(X) = E[X²] - μ x ² = 1.45 - (.89)² = .6579 and V(Y) = E[Y²] - μ y ² = 1.50 - (.88)² = .7256, so
ρ XY = .1968 / √(.6579 × .7256) = .1968 / .6909 ≈ .285
I would say, a somewhat weak positive linear relationship.

13 5-3 Covariance and Correlation Example 5-26 Figure 5-14 Joint distribution for Example 5-26.

14 5-3 Covariance and Correlation Example 5-26 (continued)

15 Example 5-27 If P(X = 1) = 0.2, P(X = 2) = 0.6, P(X = 3) = 0.2 and Y = 2X + 5, then P(Y = 7) = 0.2, P(Y = 9) = 0.6, P(Y = 11) = 0.2. Determine the covariance and correlation between X and Y.

 x    y    probability
 1    7    0.2
 2    9    0.6
 3    11   0.2

16 Example 5-27 Cont'd
E(XY) = (1 x 7 x 0.2) + (2 x 9 x 0.6) + (3 x 11 x 0.2) = 18.8
E(X) = (1 x 0.2) + (2 x 0.6) + (3 x 0.2) = 2.0
E(Y) = (7 x 0.2) + (9 x 0.6) + (11 x 0.2) = 9.0
V(X) = (1 – 2)²(0.2) + (2 – 2)²(0.6) + (3 – 2)²(0.2) = 0.4
V(Y) = (7 – 9)²(0.2) + (9 – 9)²(0.6) + (11 – 9)²(0.2) = 1.6
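A quick check (a sketch, not on the slide) that finishes the example from these moments: the covariance is 0.8 and the correlation is exactly 1, as expected when Y is an increasing linear function of X.

```python
from math import sqrt

# Sketch: completing Example 5-27 from the moments computed above.
exy, ex, ey, vx, vy = 18.8, 2.0, 9.0, 0.4, 1.6
cov = exy - ex * ey                  # 18.8 - 18.0 = 0.8
rho = cov / sqrt(vx * vy)            # 0.8 / sqrt(0.4 * 1.6) = 0.8 / 0.8 = 1
print(round(cov, 4), round(rho, 4))  # 0.8 1.0
```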

17 Example 5-27 Cont'd If we change the functional relationship of the RV's X and Y to be Y = -2X + 13 and leave the marginal probabilities of X (f X (x)) unchanged, compute the covariance and correlation again.

 x    y    probability
 1    11   0.2
 2    9    0.6
 3    7    0.2

18 Example 5-27 Cont'd For these changes, E(X), E(Y), V(X), and V(Y) remain the same. However,
E(XY) = (1 x 11 x 0.2) + (2 x 9 x 0.6) + (3 x 7 x 0.2) = 17.2
so cov(X,Y) = 17.2 - (2.0)(9.0) = -0.8 and ρ XY = -0.8 / √(0.4 x 1.6) = -1.

19 Example 5-28 Finally, if we let X & Y have the following joint distribution, we can recompute the covariance and the correlation:

 x    y    f(x,y)
 1    7    0.2
 2    9    0.6
 3    7    0.2

20 Example 5-28 Cont'd For this set of values, E(X) and V(X) are unchanged from the previous versions.
E(XY) = (1 x 7 x 0.2) + (2 x 9 x 0.6) + (3 x 7 x 0.2) = 16.4
E(Y) = (7 x 0.4) + (9 x 0.6) = 8.2
V(Y) = (7 – 8.2)²(0.4) + (9 – 8.2)²(0.6) = 0.96
Are X and Y independent?
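The closing question can be checked directly (a sketch, not from the slides): the covariance here is zero, yet X and Y are not independent, so zero correlation does not imply independence.

```python
# Sketch: Example 5-28's joint PMF has zero covariance, yet X and Y
# are dependent -- zero correlation does not imply independence.
f = {(1, 7): 0.2, (2, 9): 0.6, (3, 7): 0.2}      # joint PMF from the slide
exy = sum(x * y * p for (x, y), p in f.items())  # 16.4
ex = sum(x * p for (x, y), p in f.items())       # 2.0
ey = sum(y * p for (x, y), p in f.items())       # 8.2
cov = exy - ex * ey                              # 16.4 - 16.4 = 0
p_x1 = 0.2                                       # P(X = 1)
p_y7 = 0.2 + 0.2                                 # P(Y = 7) = f(1,7) + f(3,7)
# Independence would require f(1,7) = P(X=1) P(Y=7); it does not hold.
print(round(abs(cov), 6), f[(1, 7)], round(p_x1 * p_y7, 2))  # 0.0 0.2 0.08
```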

21 If X & Y are independent, look: E[XY] = E[X]E[Y], so σ XY = E[XY] - μ X μ Y = 0 and ρ XY = 0.

22 5-5 Linear Combinations of Random Variables Definition: given random variables X 1, X 2, …, X p and constants c 1, c 2, …, c p, Y = c 1 X 1 + c 2 X 2 + … + c p X p is a linear combination of X 1, …, X p. Mean of a Linear Combination:
E(Y) = c 1 E(X 1 ) + c 2 E(X 2 ) + … + c p E(X p )

23 5-5 Linear Combinations of Random Variables Variance of a Linear Combination:
V(Y) = c 1 ²V(X 1 ) + … + c p ²V(X p ) + 2 Σ Σ i<j c i c j cov(X i, X j )
If X 1, …, X p are independent, the covariance terms vanish and V(Y) = c 1 ²V(X 1 ) + … + c p ²V(X p ). Can you provide any insight on this variance thing with covariances?

24 A Demonstration A most enjoyable experience.

25 Linear Combinations of RV's – A First Example If Y = 4X 1 + 3X 2 and X 1 and X 2 are independent, then: E(Y) = 4E(X 1 ) + 3E(X 2 ) V(Y) = 16V(X 1 ) + 9V(X 2 ) The squared coefficients result from the fact that the variable of interest is always squared when computing variance, so its constant must also be squared.

26 Something to Ponder Shouldn't we subtract the variances when Y = X 1 - X 2 ? The Box: Let X 1 = 0, 1, 2, 3, …, 10; range = 10 X 2 = 0, 1, 2, 3, …, 10; range = 10 Then Y = X 1 - X 2 = -10, -9, …, -1, 0, 1, 2, …, 10; range = 20. Differencing widens the spread, so the variances add: V(Y) = V(X 1 ) + V(X 2 ).
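A quick numeric check of the pondering above (a sketch, assuming Y = X1 - X2 with X1 and X2 independent and each equally likely on 0..10): the variances add rather than subtract.

```python
# Sketch: V(X1 - X2) = V(X1) + V(X2) for independent uniforms on 0..10.
vals = range(11)                            # X1, X2 each uniform on 0..10
mu = sum(vals) / 11                         # 5.0
v = sum((x - mu) ** 2 for x in vals) / 11   # V(X1) = V(X2) = 10.0
# E(X1 - X2) = 0, so V(X1 - X2) = E[(X1 - X2)^2] over the 121 equally likely pairs
v_diff = sum((x1 - x2) ** 2 for x1 in vals for x2 in vals) / 121
print(v, v_diff)  # 10.0 20.0
```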

27 Linear Combinations - An example Let X i = a discrete RV, the daily demand for product i, i = 1,…,n, where X i ~ Geo(p i ) and R i = selling price of product i. Let Y = total daily revenue = R 1 X 1 + … + R n X n, and assume independence. What about Pr{Y < y} = F(y) = ?

28 Reproductive Discrete Distributions Given n independent random variables, X i, i = 1,…,n, sums from certain families stay in the family. For example, if X i ~ Poisson(λ i ), then Y = X 1 + … + X n ~ Poisson(λ 1 + … + λ n ); if X i ~ Bin(m i, p) with a common p, then Y ~ Bin(m 1 + … + m n, p).

29 A Reproductive Example Let X i = a discrete random variable, the daily demand for a reversed cylindrical aluminum spring coil Daily demand is Poisson with a mean of 2.3. Lead-time is a fixed 5 working days. There are currently 8 units in stock. A purchase order for additional units has just been placed. What is the probability of not having a stock out before the next shipment arrives? Let Y = the Lead-time demand
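A sketch of the stock-out computation under the stated assumptions: lead-time demand Y is the sum of 5 independent Poisson(2.3) daily demands, so Y ~ Poisson(11.5) by the reproductive property, and "no stock-out" means Y ≤ 8.

```python
from math import exp, factorial

# Sketch: P(no stock-out) = P(Y <= 8) where Y ~ Poisson(5 * 2.3).
lam = 5 * 2.3   # mean lead-time demand
p_no_stockout = sum(exp(-lam) * lam ** k / factorial(k) for k in range(9))
print(round(p_no_stockout, 4))  # roughly 0.19
```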

30 5-5 Linear Combinations of Random Variables Mean and Variance of an Average: if X̄ = (X 1 + … + X p ) / p and each X i has mean μ and variance σ², then E(X̄) = μ and, when the X i are independent, V(X̄) = σ² / p. Note that all X i have the same mean and variance.

31 Example mean and variance of an average The number of nightly guests (rooms rented) at the Dewdrop Inn has a uniform distribution from 82 to 126. Determine the mean and variance of the average number of guests over a 30-day period.
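A sketch of the Dewdrop Inn calculation, assuming "uniform from 82 to 126" means a discrete uniform on the integers 82..126 (variance ((b − a + 1)² − 1)/12):

```python
# Sketch: mean and variance of nightly guests, then of the 30-day average.
a, b = 82, 126
vals = range(a, b + 1)
n = b - a + 1                                 # 45 equally likely values
mu = sum(vals) / n                            # (82 + 126) / 2 = 104.0
var = sum((v - mu) ** 2 for v in vals) / n    # (45**2 - 1) / 12 = 168.67
mu_avg, var_avg = mu, var / 30                # 30-day average: same mean, variance / 30
print(mu, round(var, 2), round(var_avg, 2))   # 104.0 168.67 5.62
```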

32 5-6 General Functions of Random Variables These linear relationships are child's play. What can you say about some nonlinear relationships among the random variables?

33 5-6 General Functions of (Discrete) Random Variables Given a discrete RV X with PMF f X (x), let Y = h(X) and let x = u(y) be the inverse function. Then f Y (y) = f X [u(y)]. I could really use an example of this.

34 The Example Let X = a discrete RV, the number of units tested until a good (success) unit is found. X ~ Geo(.25). Let Y = cost of testing, where Y = 20X², so y = 20, 80, 180, 320, … The inverse is x = u(y) = √(y/20), giving f Y (y) = f X (√(y/20)) = .25(.75)^(√(y/20) - 1).
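The transform can be sketched directly (hypothetical code, not from the slides): evaluate the geometric PMF at the inverse x = √(y/20).

```python
from math import isqrt

# Sketch: PMF of Y = 20 * X**2 when X ~ Geo(0.25) on x = 1, 2, 3, ...
p = 0.25

def f_x(x):
    # geometric PMF: probability the first success occurs on trial x
    return p * (1 - p) ** (x - 1)

def f_y(y):
    # valid only on the range of Y: y = 20 * x**2, x = 1, 2, 3, ...
    return f_x(isqrt(y // 20))

print([round(f_y(y), 4) for y in (20, 80, 180, 320)])  # [0.25, 0.1875, 0.1406, 0.1055]
```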

35 What Independence does for us If X 1 and X 2 are independent: f(x 1, x 2 ) = f 1 (x 1 ) f 2 (x 2 ), E[X 1 X 2 ] = E[X 1 ]E[X 2 ], and cov(X 1, X 2 ) = 0.

36 Some pondering (i.e. deep thinking) Now what can be said about Y = X 1 X 2 ? If X 1 and X 2 are independent: E(Y) = μ 1 μ 2 and V(Y) = σ 1 ²σ 2 ² + μ 1 ²σ 2 ² + μ 2 ²σ 1 ².

37 An Example to Ponder A part made by the Kovery Ance Company is produced in lot sizes of 20 units. Following final assembly, each item is inspected with an independent .9 probability of passing. Those items passing inspection must then be calibrated and aligned. Each independent attempt at alignment has a one-third chance of succeeding. Find the mean and variance of the total number of alignment attempts that must be made for a given production lot.

38 Pondering that Example… Let X 1 = a discrete RV, the number of units in a production lot that pass final inspection. Let X 2 = a discrete RV, the number of alignment attempts until successful. Let Y = total number of alignment attempts.

39 Now Ponder the Variance
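One way to ponder it (a sketch; the compound-sum identity below is an assumption not stated on the slides): treat Y as a random sum of N ~ Bin(20, .9) geometric attempt counts, each Geo(1/3), and use E(Y) = E(N)E(A) and V(Y) = E(N)V(A) + V(N)E(A)².

```python
# Sketch: mean and variance of the total alignment attempts Y as a random sum.
n, p_pass, p_align = 20, 0.9, 1 / 3
e_n, v_n = n * p_pass, n * p_pass * (1 - p_pass)      # Binomial: 18.0 and 1.8
e_a, v_a = 1 / p_align, (1 - p_align) / p_align ** 2  # Geometric: 3.0 and 6.0
e_y = e_n * e_a                                       # 18 * 3 = 54
v_y = e_n * v_a + v_n * e_a ** 2                      # 18 * 6 + 1.8 * 9 = 124.2
print(round(e_y, 1), round(v_y, 1))  # 54.0 124.2
```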

40 Next Time

