Chapter 5 Joint Probability Distributions. The adventure continues as we consider two or more random variables all at the same time. Chapter 5B: Discrete.

Chapter 4 Homework – the stats:

mean                 87.3
std dev              16.5
median               91
skewness             -2.12
minimum              29
maximum              100
1st quartile         83
3rd quartile         97
interquartile range  14
range                71
kurtosis             4.27

Kurtosis characterizes the relative peakedness or flatness of a distribution compared with the normal distribution; positive kurtosis indicates a relatively peaked distribution, negative kurtosis a relatively flat one.

This week in Prob/Stat: today's good stuff as it pertains to discrete RVs.

Expected Value of a Function of Two Random Variables. A common measure of the relationship between two random variables is the covariance. To describe covariance, we first need the expected value of a function of two RVs. For discrete X and Y,

E[h(X, Y)] = Σ over all (x, y) of h(x, y) f_XY(x, y)

E[h(X, Y)] can be thought of as the weighted average of h(x, y) over the points in the range of (X, Y), and it represents the average value of h(X, Y).

Here is an old favorite joint distribution. Let X = the number of orders placed per day for a high-cost item, and let Y = the number of items in stock at the start of the day. The joint pmf f_XY(x, y):

          y = 0   y = 1   y = 2   y = 3
x = 0      .20     .15     .03     .01
x = 1      .10     .17     .05     .01
x = 2      .08     .09     .08     .03

Let Z = Y − X, the daily ending inventory. Then h(x, y) = y − x, and

E[h(X, Y)] = E[Z] = h(0,0)(.2) + h(1,0)(.1) + h(2,0)(.08) + h(0,1)(.15) + h(1,1)(.17) + h(2,1)(.09) + h(0,2)(.03) + h(1,2)(.05) + h(2,2)(.08) + h(0,3)(.01) + h(1,3)(.01) + h(2,3)(.03)
= 0(.2) + (−1)(.1) + (−2)(.08) + (1)(.15) + (0)(.17) + (−1)(.09) + (2)(.03) + (1)(.05) + (0)(.08) + (3)(.01) + (2)(.01) + (1)(.03) = −.01
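
Here is a minimal Python sketch (not from the original slides) that reproduces this calculation, with the joint pmf above stored as a dict keyed by (x, y):

# Joint pmf of (X, Y) from the table above.
f = {(0, 0): 0.20, (1, 0): 0.10, (2, 0): 0.08,
     (0, 1): 0.15, (1, 1): 0.17, (2, 1): 0.09,
     (0, 2): 0.03, (1, 2): 0.05, (2, 2): 0.08,
     (0, 3): 0.01, (1, 3): 0.01, (2, 3): 0.03}

# E[h(X, Y)] = sum of h(x, y) * f(x, y) over the support, here h(x, y) = y - x.
E_Z = sum((y - x) * p for (x, y), p in f.items())
print(round(E_Z, 4))  # -0.01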

Another look at the inventory. From the marginal distributions:

E[X] = μ_X = 0(.39) + 1(.33) + 2(.28) = .89
E[Y] = μ_Y = 0(.38) + 1(.41) + 2(.16) + 3(.05) = .88

E(Y − X) = E(Y) − E(X) = .88 − .89 = −.01

This shortcut only works for linear functions of the RVs!

Covariance. Definition: the covariance between the RVs X and Y, denoted cov(X, Y) or σ_XY, is:

σ_XY = E[(X − μ_X)(Y − μ_Y)] = E[XY] − μ_X μ_Y

Covariance. If the points in the joint probability distribution tend to fall along a line of positive (negative) slope, then σ_XY is positive (negative). Covariance measures the linear association between RVs. Covariance is not a dimensionless quantity; its units are the product of the units of X and Y.

Returning to our old favorite joint pmf, with h(x, y) = xy:

E[XY] = (0·0)(.2) + (1·0)(.1) + (2·0)(.08) + (0·1)(.15) + (1·1)(.17) + (2·1)(.09) + (0·2)(.03) + (1·2)(.05) + (2·2)(.08) + (0·3)(.01) + (1·3)(.01) + (2·3)(.03)
= 0(.2) + 0(.1) + 0(.08) + 0(.15) + 1(.17) + 2(.09) + 0(.03) + 2(.05) + 4(.08) + 0(.01) + 3(.01) + 6(.03) = .98

cov(X, Y) = E[XY] − μ_X μ_Y = .98 − (.89)(.88) = .1968
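
A sketch (again my reconstruction, reusing the same pmf dict) that computes the covariance and, looking ahead one slide, the correlation:

from math import sqrt

f = {(0, 0): 0.20, (1, 0): 0.10, (2, 0): 0.08,
     (0, 1): 0.15, (1, 1): 0.17, (2, 1): 0.09,
     (0, 2): 0.03, (1, 2): 0.05, (2, 2): 0.08,
     (0, 3): 0.01, (1, 3): 0.01, (2, 3): 0.03}

E_X  = sum(x * p for (x, y), p in f.items())      # 0.89
E_Y  = sum(y * p for (x, y), p in f.items())      # 0.88
E_XY = sum(x * y * p for (x, y), p in f.items())  # 0.98
cov  = E_XY - E_X * E_Y                           # 0.1968

V_X = sum((x - E_X) ** 2 * p for (x, y), p in f.items())
V_Y = sum((y - E_Y) ** 2 * p for (x, y), p in f.items())
rho = cov / (sqrt(V_X) * sqrt(V_Y))               # about 0.285
print(round(cov, 4), round(rho, 3))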

Covariance Between X and Y. Figure 5-13: Joint distributions and the sign of the covariance between X and Y.

Correlation. Definition: the correlation between RVs X and Y, denoted ρ_XY, is:

ρ_XY = cov(X, Y) / (σ_X σ_Y) = σ_XY / (σ_X σ_Y)

For any two random variables X and Y, −1 ≤ ρ_XY ≤ +1. Like covariance, correlation is a measure of the linear relationship between RVs. If X and Y are independent RVs, then ρ_XY = 0. Correlation is dimensionless and unaffected by the units chosen for measurement.

Returning to, yes, the olde favorite: with V(X) = E[X²] − μ_X² = 1.45 − .89² ≈ .658 and V(Y) = E[Y²] − μ_Y² = 1.50 − .88² ≈ .726,

ρ_XY = .1968 / (√.658 · √.726) ≈ .285

I would say, a somewhat weak positive linear relationship.

5-3 Covariance and Correlation, Example 5-26. Figure 5-14: Joint distribution for Example 5-26.

5-3 Covariance and Correlation Example 5-26 (continued)

Example 5-27. If P(X = 1) = 0.2, P(X = 2) = 0.6, P(X = 3) = 0.2 and Y = 2X + 5, then P(Y = 7) = 0.2, P(Y = 9) = 0.6, P(Y = 11) = 0.2. Determine the covariance and correlation between X and Y.

Example 5-27 Cont’d

E(XY) = (1 × 7 × 0.2) + (2 × 9 × 0.6) + (3 × 11 × 0.2) = 18.8
E(X) = (1 × 0.2) + (2 × 0.6) + (3 × 0.2) = 2.0
E(Y) = (7 × 0.2) + (9 × 0.6) + (11 × 0.2) = 9.0
V(X) = (1 − 2)²(0.2) + (2 − 2)²(0.6) + (3 − 2)²(0.2) = 0.4
V(Y) = (7 − 9)²(0.2) + (9 − 9)²(0.6) + (11 − 9)²(0.2) = 1.6

So σ_XY = E(XY) − E(X)E(Y) = 18.8 − (2.0)(9.0) = 0.8 and ρ_XY = 0.8 / √(0.4 × 1.6) = +1, a perfect positive linear relationship.
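
A minimal sketch (not from the slides) that verifies Example 5-27 for Y = 2X + 5, and also for the Y = −2X + 13 variant coming up next; cov_corr is a hypothetical helper name:

from math import sqrt

f_X = {1: 0.2, 2: 0.6, 3: 0.2}  # marginal pmf of X

def cov_corr(g):
    """Covariance and correlation of X and Y = g(X) under the pmf f_X."""
    E_X = sum(x * p for x, p in f_X.items())
    E_Y = sum(g(x) * p for x, p in f_X.items())
    V_X = sum((x - E_X) ** 2 * p for x, p in f_X.items())
    V_Y = sum((g(x) - E_Y) ** 2 * p for x, p in f_X.items())
    cov = sum(x * g(x) * p for x, p in f_X.items()) - E_X * E_Y
    return round(cov, 4), round(cov / sqrt(V_X * V_Y), 4)

print(cov_corr(lambda x: 2 * x + 5))    # (0.8, 1.0)
print(cov_corr(lambda x: -2 * x + 13))  # (-0.8, -1.0)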

Example 5-27 Cont’d. If we change the functional relationship of the RVs X and Y to Y = −2X + 13 and leave the marginal probabilities of X (f_X(x)) unchanged, compute the covariance and correlation again.

Example 5-27 Cont’d. For these changes, E(X), E(Y), V(X), and V(Y) remain the same. However,

E(XY) = (1 × 11 × 0.2) + (2 × 9 × 0.6) + (3 × 7 × 0.2) = 17.2

so σ_XY = 17.2 − (2.0)(9.0) = −0.8 and ρ_XY = −0.8 / √(0.4 × 1.6) = −1, a perfect negative linear relationship.

Example 5-28. Finally, if we let X and Y have the following joint distribution, we can recompute the covariance and the correlation:

x    y    p(x, y)
1    7    0.2
2    9    0.6
3    7    0.2

Example 5-28 Cont’d. For this set of values, E(X) and V(X) are unchanged from the previous versions.

E(XY) = (1 × 7 × 0.2) + (2 × 9 × 0.6) + (3 × 7 × 0.2) = 1.4 + 10.8 + 4.2 = 16.4
E(Y) = (7 × 0.4) + (9 × 0.6) = 8.2
V(Y) = (7 − 8.2)²(0.4) + (9 − 8.2)²(0.6) = 0.96

So σ_XY = 16.4 − (2.0)(8.2) = 0 and ρ_XY = 0. Are X and Y independent? No: f(1, 7) = 0.2 but f_X(1) f_Y(7) = (0.2)(0.4) = 0.08. Zero covariance does not imply independence.
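
Zero covariance yet dependent: a small sketch (using my reconstruction of the joint pmf above) that checks the factorization f(x, y) = f_X(x) f_Y(y) at every support point:

# Joint pmf of Example 5-28: Y = 7 when X is 1 or 3, Y = 9 when X is 2.
f = {(1, 7): 0.2, (2, 9): 0.6, (3, 7): 0.2}

f_x, f_y = {}, {}
for (x, y), p in f.items():  # accumulate the marginals
    f_x[x] = f_x.get(x, 0.0) + p
    f_y[y] = f_y.get(y, 0.0) + p

# Independence requires f(x, y) == f_X(x) * f_Y(y) at EVERY (x, y) pair.
independent = all(abs(f.get((x, y), 0.0) - f_x[x] * f_y[y]) < 1e-12
                  for x in f_x for y in f_y)
print(independent)  # False: e.g. f(1, 7) = 0.2 but f_X(1)*f_Y(7) = 0.08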

If X and Y are independent, look: f_XY(x, y) = f_X(x) f_Y(y) for all x and y, so E[XY] = E[X]E[Y], which gives σ_XY = 0 and ρ_XY = 0. The converse is false, as Example 5-28 just showed.

5-5 Linear Combinations of Random Variables. Definition: given random variables X_1, X_2, …, X_p and constants c_1, c_2, …, c_p,

Y = c_1 X_1 + c_2 X_2 + … + c_p X_p

is a linear combination of X_1, X_2, …, X_p. Mean of a linear combination:

E(Y) = c_1 E(X_1) + c_2 E(X_2) + … + c_p E(X_p)

5-5 Linear Combinations of Random Variables. Variance of a linear combination:

V(Y) = c_1² V(X_1) + … + c_p² V(X_p) + 2 Σ over i < j of c_i c_j cov(X_i, X_j)

If X_1, X_2, …, X_p are independent, the covariance terms vanish and

V(Y) = c_1² V(X_1) + c_2² V(X_2) + … + c_p² V(X_p)

Can you provide any insight on this variance thing with covariances?

A Demonstration A most enjoyable experience.

Linear Combinations of RVs – A First Example. If Y = 4X_1 + 3X_2 and X_1 and X_2 are independent, then:

E(Y) = 4E(X_1) + 3E(X_2)
V(Y) = 16V(X_1) + 9V(X_2)

The squared coefficients result from the fact that the variable of interest is always squared when computing a variance, so its constant must be squared as well.
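
To see the squared coefficients at work, here is a simulation sketch with two hypothetical independent RVs (a uniform on {0, …, 4} and a fair coin); these distributions are illustrative assumptions, not from the slides:

import random
random.seed(1)

def draw_x1():  # uniform on {0,...,4}: mean 2, variance (5**2 - 1)/12 = 2
    return random.randint(0, 4)

def draw_x2():  # Bernoulli(0.5): mean 0.5, variance 0.25
    return random.choice([0, 1])

n = 200_000
ys = [4 * draw_x1() + 3 * draw_x2() for _ in range(n)]
mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
print(round(mean, 2), round(var, 2))  # near 4*2 + 3*0.5 = 9.5 and 16*2 + 9*0.25 = 34.25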

Something to Ponder. Shouldn't we subtract the variances when Y = X_2 − X_1? The Box: let
X_1 = 0, 1, 2, 3, …, 10; range = 10
X_2 = 0, 1, 2, 3, …, 10; range = 10
Then Y = X_2 − X_1 = −10, −9, …, −1, 0, 1, 2, …, 10; range = 20.
Subtracting one RV from another spreads the result out further, so the variances still add: V(Y) = V(X_1) + V(X_2). A quick simulation of this follows.
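
The promised simulation of the box, assuming independent discrete uniforms on {0, …, 10}:

import random
random.seed(2)

n = 200_000
diffs = [random.randint(0, 10) - random.randint(0, 10) for _ in range(n)]
mean = sum(diffs) / n
var = sum((d - mean) ** 2 for d in diffs) / n
# Each term is uniform on {0,...,10} with variance (11**2 - 1)/12 = 10,
# so V(X2 - X1) = 10 + 10 = 20: the variances add, they do not cancel.
print(round(var, 1))  # near 20, not 0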

Linear Combinations – An Example. Let X_i = a discrete RV, the daily demand for product i, i = 1, …, n, where X_i ~ Geo(p_i) (trials until first success), and let R_i = the selling price of product i. Let Y = R_1 X_1 + … + R_n X_n = total daily revenue, and assume independence. Then

E(Y) = Σ R_i E(X_i) = Σ R_i / p_i
V(Y) = Σ R_i² V(X_i) = Σ R_i² (1 − p_i) / p_i²

What about Pr{Y < y} = F(y)? There is no convenient closed form; one practical route is Monte Carlo simulation, sketched below.
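
A Monte Carlo sketch of F(y); the three p_i and R_i values here are hypothetical placeholders, not from the slides:

import random
random.seed(3)

p = [0.5, 0.25, 0.2]    # hypothetical geometric demand parameters
R = [10.0, 25.0, 40.0]  # hypothetical selling prices

def geo(q):
    """Geometric draw: number of trials until the first success (1, 2, ...)."""
    k = 1
    while random.random() >= q:
        k += 1
    return k

n = 100_000
revenue = [sum(r * geo(q) for r, q in zip(R, p)) for _ in range(n)]
y = 300.0
print(sum(rev < y for rev in revenue) / n)  # Monte Carlo estimate of F(y) = Pr{Y < y}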

Reproductive Discrete Distributions. Given n independent random variables X_i, i = 1, …, n, sums stay within the family for several common discrete distributions:

If X_i ~ Poisson(λ_i), then X_1 + … + X_n ~ Poisson(λ_1 + … + λ_n).
If X_i ~ Binomial(m_i, p), with p common to all, then X_1 + … + X_n ~ Binomial(m_1 + … + m_n, p).
If X_i ~ Negative Binomial(r_i, p), with p common to all, then X_1 + … + X_n ~ Negative Binomial(r_1 + … + r_n, p).

A Reproductive Example. Let X_i = a discrete random variable, the daily demand for a reversed cylindrical aluminum spring coil. Daily demand is Poisson with a mean of 2.3. Lead-time is a fixed 5 working days. There are currently 8 units in stock, and a purchase order for additional units has just been placed. What is the probability of not having a stockout before the next shipment arrives?

Let Y = the lead-time demand = X_1 + … + X_5 ~ Poisson(5 × 2.3) = Poisson(11.5). No stockout means Y ≤ 8, so we need P(Y ≤ 8).
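
A small sketch (standard library only) that sums the Poisson(11.5) pmf to get P(Y ≤ 8) directly:

from math import exp

lam = 5 * 2.3  # Y ~ Poisson(11.5) by the reproductive property

term = exp(-lam)  # pmf at y = 0
cdf = term
for y in range(1, 9):  # accumulate P(Y <= 8) via the pmf recursion
    term *= lam / y
    cdf += term
print(round(cdf, 4))  # about 0.19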

5-5 Linear Combinations of Random Variables. Mean and Variance of an Average: let X̄ = (X_1 + … + X_n)/n, where all X_i have the same mean μ and the same variance σ². Then

E(X̄) = μ, and, if the X_i are also independent, V(X̄) = σ²/n.

Example: mean and variance of an average. The number of nightly guests (rooms rented) at the Dewdrop Inn has a uniform distribution from 82 to 126. Determine the mean and variance of the average number of guests over a 30-day period.
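
A sketch of the computation, assuming a discrete uniform on the integers 82 through 126 since rooms rented are whole numbers (a continuous uniform would give variance (126 − 82)²/12 ≈ 161.3 instead):

rooms = range(82, 127)  # discrete uniform on the integers 82..126 (45 values)
n_days = 30

mu = sum(rooms) / len(rooms)                          # (82 + 126) / 2 = 104
var = sum((r - mu) ** 2 for r in rooms) / len(rooms)  # (45**2 - 1) / 12 ~ 168.67

# Independent nights, so the 30-day average has variance var / n_days.
print(mu, round(var / n_days, 2))  # E(avg) = 104.0, V(avg) ~ 5.62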

5-6 General Functions of Random Variables. These linear relationships are child's play. What can you say about nonlinear relationships among the random variables?

5-6 General Functions of (Discrete) Random Variables. Given a discrete RV X with pmf f_X(x), let Y = h(X) be one-to-one on the support of X, and let x = u(y) be the inverse function. Then

f_Y(y) = f_X[u(y)]

I could really use an example of this.

The Example. Let X = a discrete RV, the number of units tested until a good (success) unit is found, so X ~ Geo(0.25). Let Y = the cost of testing, where Y = 20X². The inverse is x = u(y) = √(y/20), so

f_Y(y) = f_X(√(y/20)) = 0.25(0.75)^(√(y/20) − 1) for y = 20, 80, 180, 320, …
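
A minimal sketch of the change-of-variable pmf for this example:

def f_X(x):
    """Geometric pmf, p = 0.25: number of trials until the first good unit."""
    return 0.25 * 0.75 ** (x - 1)

def f_Y(y):
    """pmf of Y = 20 * X**2 via the inverse x = u(y) = sqrt(y / 20)."""
    x = round((y / 20) ** 0.5)
    return f_X(x) if x >= 1 and 20 * x * x == y else 0.0

for y in (20, 80, 180, 320):
    print(y, round(f_Y(y), 4))  # 0.25, 0.1875, 0.1406, 0.1055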

What Independence Does for Us. If X_1 and X_2 are independent:

f_X1X2(x_1, x_2) = f_X1(x_1) f_X2(x_2) for all (x_1, x_2)
E[X_1 X_2] = E[X_1] E[X_2]
cov(X_1, X_2) = 0, and therefore ρ = 0

Some pondering (i.e., deep thinking). Now what can be said about Y = X_1 X_2? If X_1 and X_2 are independent:

E(Y) = E(X_1) E(X_2) = μ_1 μ_2
V(Y) = E(X_1²) E(X_2²) − μ_1² μ_2² = σ_1² σ_2² + μ_1² σ_2² + μ_2² σ_1²

An Example to Ponder. A part made by the Kovery Ance Company is produced in lot sizes of 20 units. Following final assembly, each item is inspected, with an independent 0.9 probability of passing. Those items passing inspection must then be calibrated and aligned. Each independent attempt at alignment has a one-third chance of succeeding. Find the mean and variance of the total number of alignment attempts that must be made for a given production lot.

Pondering that Example… Let X_1 = a discrete RV, the number of units in a production lot that pass final inspection, so X_1 ~ Bin(20, 0.9). Let X_2 = a discrete RV, the number of alignment attempts until success for a single unit, so X_2 ~ Geo(1/3). Let Y = the total number of alignment attempts: a random sum of X_1 independent copies of X_2. Then

E(Y) = E(X_1) E(X_2) = (20 × 0.9)(3) = 54

Now Ponder the Variance. For a random sum of N = X_1 independent copies of X_2,

V(Y) = E(X_1) V(X_2) + V(X_1) [E(X_2)]² = (18)(6) + (1.8)(3²) = 108 + 16.2 = 124.2

using V(X_1) = 20(0.9)(0.1) = 1.8 and V(X_2) = (1 − 1/3)/(1/3)² = 6.
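
A Monte Carlo check of both moments under the Bin(20, 0.9) and Geo(1/3) model above; the helper names are mine:

import random
random.seed(4)

def geo(p):
    """Alignment attempts until success (trials until first success)."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

def lot_attempts():
    passed = sum(random.random() < 0.9 for _ in range(20))  # X1 ~ Bin(20, 0.9)
    return sum(geo(1 / 3) for _ in range(passed))           # random sum of X2's

n = 100_000
ys = [lot_attempts() for _ in range(n)]
mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
print(round(mean, 1), round(var, 1))  # theory: E(Y) = 54, V(Y) = 124.2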

Next Time