3.4 Joint Probability Distributions


3.4 Joint Probability Distributions
1. Joint probability of two discrete random variables
2. Joint probability of two continuous random variables
3. Marginal distributions
4. Conditional probability distributions
5. Independence of two or more random variables

Reality: there are many problems in which two or more random variables need to be studied simultaneously. For example:
1. We might wish to study the number of available check-in counters at an airport in conjunction with the number of customers waiting in queue.
2. We might wish to study the yield of a chemical reaction together with the temperature at which the reaction is run.

Typical questions to ask are: "What is the average number of customers in the queue, given that the number of available counters is 5?" "Is the yield independent of the temperature?" "What is the average yield if the temperature is 40°C?" To answer questions of this type, we need to study what are called two-dimensional (or multi-dimensional) random variables, of both discrete and continuous types.

1. Joint probability distribution of discrete random variables
Definition 3.8. Let X and Y be random variables defined on the same sample space S. The ordered pair (X, Y) is called a two-dimensional random variable: each outcome s in S is mapped to the point (X(s), Y(s)) in the xy plane.

Definition 3.8 (continued). A function f(x, y) is the joint probability distribution, or joint probability mass function, of the two-dimensional discrete random variable (X, Y) if:
1. f(x, y) ≥ 0 for all (x, y);
2. $\sum_{x}\sum_{y} f(x, y) = 1$;
3. P(X = x, Y = y) = f(x, y).
For any region A in the xy plane, $P[(X, Y) \in A] = \sum\!\sum_{(x, y) \in A} f(x, y)$.
f(x, y) represents the probability of the simultaneous occurrence of X = x and Y = y, for any pair (x, y) within the range of the random variables X and Y.

Table for Joint Probability Distribution

Example 3.8, page 75. Two refills for a ballpoint pen are selected at random from a box that contains 3 blue refills, 2 red refills, and 3 green refills. If X is the number of blue refills and Y is the number of red refills selected, find (a) the joint probability function f(x, y), and (b) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}.

(a) Find the joint probability function f(x, y).
There are $\binom{8}{2} = 28$ equally likely ways to select 2 refills from the 8 in the box. Counting selections:
f(0, 0) = P(X = 0, Y = 0) = $\binom{3}{0}\binom{2}{0}\binom{3}{2}\big/\binom{8}{2}$ = 3/28
f(0, 1) = P(X = 0, Y = 1) = $\binom{3}{0}\binom{2}{1}\binom{3}{1}\big/\binom{8}{2}$ = 3/14
In general,
f(x, y) = P(X = x, Y = y) = $\binom{3}{x}\binom{2}{y}\binom{3}{2-x-y}\big/\binom{8}{2}$, for x = 0, 1, 2; y = 0, 1, 2; 0 ≤ x + y ≤ 2.
Joint probabilities f(x, y):
y\x   0      1      2
0     3/28   9/28   3/28
1     3/14   3/14   0
2     1/28   0      0
(See Table 3.1, page 75.)

(b) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}.
The pairs (0, 0), (0, 1), and (1, 0) are the only ones in the support satisfying x + y ≤ 1, so
P[(X, Y) ∈ A] = f(0, 0) + f(0, 1) + f(1, 0) = 3/28 + 3/14 + 9/28 = 9/14.
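As a sanity check, the counting argument above can be enumerated directly. The following Python sketch (illustrative, not part of the original slides) tabulates f(x, y) with binomial coefficients and recomputes P(X + Y ≤ 1):

```python
# Minimal sketch: joint pmf of Example 3.8 via hypergeometric counting.
from math import comb

total = comb(8, 2)  # ways to choose 2 refills from 8

def f(x, y):
    """Joint pmf: x blue (of 3), y red (of 2), rest green (of 3)."""
    if x < 0 or y < 0 or x + y > 2:
        return 0.0
    return comb(3, x) * comb(2, y) * comb(3, 2 - x - y) / total

# The pmf sums to 1 over its support.
assert abs(sum(f(x, y) for x in range(3) for y in range(3)) - 1) < 1e-12

# P[(X, Y) in A] with A = {(x, y) : x + y <= 1}
p_A = sum(f(x, y) for x in range(3) for y in range(3) if x + y <= 1)
print(p_A, 9 / 14)  # both 0.642857...
```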

2. Joint probability of two continuous random variables
Definition 3.9. Let X and Y be continuous random variables. The ordered pair (X, Y) is called a two-dimensional continuous random variable. A function f(x, y) is the joint density function of (X, Y) if:
1. f(x, y) ≥ 0 for all (x, y);
2. $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$;
3. $P[(X, Y) \in A] = \iint_A f(x, y)\,dx\,dy$ for any region A in the xy plane.

Example 3.9, page 76. A candy company distributes boxes of chocolates with a mixture of creams, toffees, and nuts coated in both light and dark chocolate. For a randomly selected box, let X and Y, respectively, be the proportions of the light and dark chocolates that are creams, and suppose that the joint density function is
f(x, y) = c(2x + 3y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and f(x, y) = 0 elsewhere.
(Here the support is a square; in general it may be circular, triangular, or some other region.)
(a) Determine c.
(b) Find P[(X, Y) ∈ A], where A is the region A = {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.
(c) Find P[(X, Y) ∈ B], where B is the region B = {(x, y) | 0 < x < y < 1}.

Solution
(a) We need $\int_0^1\!\int_0^1 c(2x + 3y)\,dx\,dy = c\int_0^1 (1 + 3y)\,dy = \tfrac{5}{2}c = 1$, so c = 2/5.
(b) $P[(X, Y) \in A] = \int_{1/4}^{1/2}\!\int_0^{1/2} \tfrac{2}{5}(2x + 3y)\,dx\,dy = \tfrac{2}{5}\int_{1/4}^{1/2}\big(\tfrac{1}{4} + \tfrac{3y}{2}\big)\,dy = 13/160$.
(c) $P[(X, Y) \in B] = \int_0^1\!\int_0^y \tfrac{2}{5}(2x + 3y)\,dx\,dy = \tfrac{2}{5}\int_0^1 4y^2\,dy = 8/15$.
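The three integrals can also be checked numerically; a minimal sketch using SciPy's dblquad, assuming the density reconstructed above:

```python
# Numerical check of Example 3.9 with SciPy.
from scipy.integrate import dblquad

def f(y, x):                      # dblquad integrates func(y, x)
    return 0.4 * (2 * x + 3 * y)  # c = 2/5

# (a) total mass: x and y each over [0, 1]
mass, _ = dblquad(f, 0, 1, 0, 1)
# (b) A = {0 < x < 1/2, 1/4 < y < 1/2}
p_A, _ = dblquad(f, 0, 0.5, 0.25, 0.5)
# (c) B = {0 < x < y < 1}: inner y runs from x to 1
p_B, _ = dblquad(f, 0, 1, lambda x: x, 1)
print(mass, p_A, 13 / 160, p_B, 8 / 15)
```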

3. Marginal distributions
Definition 3.10. The marginal distributions of X alone and of Y alone are
g(x) = P(X = x) = $\sum_{y} f(x, y)$ and h(y) = P(Y = y) = $\sum_{x} f(x, y)$
for the discrete case, and
g(x) = $\int_{-\infty}^{\infty} f(x, y)\,dy$ and h(y) = $\int_{-\infty}^{\infty} f(x, y)\,dx$
for the continuous case.

Example 3.10, page 77. Use Table 3.1, page 75, to find the marginal distributions of X and Y in Example 3.8.
Summing the columns of Table 3.1:
g(0) = P(X = 0) = f(0, 0) + f(0, 1) + f(0, 2) = 3/28 + 3/14 + 1/28 = 5/14
g(1) = P(X = 1) = f(1, 0) + f(1, 1) + f(1, 2) = 9/28 + 3/14 + 0 = 15/28
g(2) = P(X = 2) = f(2, 0) + f(2, 1) + f(2, 2) = 3/28 + 0 + 0 = 3/28
Summing the rows likewise gives h(0) = 15/28, h(1) = 3/7, h(2) = 1/28. In tabular form:
x      0      1      2
g(x)   5/14   15/28  3/28

y      0      1      2
h(y)   15/28  3/7    1/28
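In code, the marginals are simply row and column sums of the joint table; a short illustrative sketch (names are my own):

```python
# Marginals of Example 3.10 as column/row sums of Table 3.1.
from fractions import Fraction as F

joint = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
         (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
         (0, 2): F(1, 28), (1, 2): F(0),     (2, 2): F(0)}

g = {x: sum(joint[(x, y)] for y in range(3)) for x in range(3)}
h = {y: sum(joint[(x, y)] for x in range(3)) for y in range(3)}
print(g)  # g: 5/14, 15/28, 3/28
print(h)  # h: 15/28, 3/7, 1/28
```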

Example. Suppose (X, Y) has a given joint density function f(x, y).
(a) Find P(X ≤ x, Y ≤ y) for x > 0, y > 0.
(b) Find the marginal distribution g(x) of X for x ≥ 0, with g(x) = 0 elsewhere.

4. Conditional probability distributions
Clearly, for the discrete case,
$P[X = x \mid Y = y] = \dfrac{P(X = x, Y = y)}{P(Y = y)} = \dfrac{f(x, y)}{h(y)}$.
For the continuous case, it can be shown that
$P[X \le x \mid Y = y] = \int_{-\infty}^{x} \dfrac{f(t, y)}{h(y)}\,dt$.
Hence it is natural to define the conditional probability distributions as follows.

Definition 3.11. Let X and Y be two random variables, discrete or continuous. The conditional distribution of the random variable X, given Y = y, is
f(x | y) = f(x, y)/h(y), where h(y) > 0.
Similarly, the conditional distribution of the random variable Y, given X = x, is
f(y | x) = f(x, y)/g(x), where g(x) > 0.
Then P(a < X < b | Y = y) = $\sum_{a < x < b} f(x \mid y)$ for the discrete case, and
P(a < X < b | Y = y) = $\int_a^b f(x \mid y)\,dx$ for the continuous case.

Example 3.12, page 79. Referring to Example 3.8 (Table 3.1), find the conditional distribution of X, given that Y = 1, and use it to determine P(X = 0 | Y = 1).
Since h(1) = 3/7,
f(x | 1) = f(x, 1)/h(1) = (7/3) f(x, 1), for x = 0, 1, 2:
f(0 | 1) = (7/3) f(0, 1) = (7/3)(3/14) = 1/2
f(1 | 1) = (7/3) f(1, 1) = (7/3)(3/14) = 1/2
f(2 | 1) = (7/3) f(2, 1) = (7/3)(0) = 0
Hence P(X = 0 | Y = 1) = f(0 | 1) = (3/14)/(3/7) = 1/2.
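A minimal sketch of the same computation, dividing the y = 1 row of the table by its total h(1):

```python
# Conditional pmf f(x | Y = 1) of Example 3.12.
from fractions import Fraction as F

row = {0: F(3, 14), 1: F(3, 14), 2: F(0)}  # f(x, 1) from Table 3.1
h1 = sum(row.values())                     # h(1) = 3/7
cond = {x: row[x] / h1 for x in range(3)}
print(cond)  # f(0|1) = 1/2, f(1|1) = 1/2, f(2|1) = 0
```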

Example 3.14, page 81. Given the joint density function
f(x, y) = x(1 + 3y²)/4 for 0 < x < 2, 0 < y < 1, and f(x, y) = 0 elsewhere,
find g(x), h(y), f(x | y), and evaluate P(1/4 < X < 1/2 | Y = 1/3).
Solution: By definition,
g(x) = $\int_0^1 \frac{x(1 + 3y^2)}{4}\,dy = \frac{x}{2}$, for 0 < x < 2,
and
h(y) = $\int_0^2 \frac{x(1 + 3y^2)}{4}\,dx = \frac{1 + 3y^2}{2}$, for 0 < y < 1.
Therefore,
f(x | y) = f(x, y)/h(y) = x/2, for 0 < x < 2,
and
P(1/4 < X < 1/2 | Y = 1/3) = $\int_{1/4}^{1/2} \frac{x}{2}\,dx = \frac{3}{64}$.
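The marginals, the conditional density, and the final probability can be verified symbolically; a sketch with SymPy, assuming the density as reconstructed above:

```python
# Symbolic check of Example 3.14 with SymPy.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x * (1 + 3 * y**2) / 4

g = sp.integrate(f, (y, 0, 1))          # x/2, the marginal of X
h = sp.integrate(f, (x, 0, 2))          # (1 + 3y^2)/2, the marginal of Y
f_x_given_y = sp.simplify(f / h)        # x/2: free of y
p = sp.integrate(f_x_given_y.subs(y, sp.Rational(1, 3)),
                 (x, sp.Rational(1, 4), sp.Rational(1, 2)))
print(g, h, f_x_given_y, p)             # p = 3/64
```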

5. Independence of two or more random variables
Definition 3.12. Let X and Y be two random variables, discrete or continuous, with joint probability distribution f(x, y) and marginal distributions g(x) and h(y), respectively. The random variables X and Y are said to be statistically independent if and only if
f(x, y) = g(x)h(y)
for all (x, y) within their range.

Example for the discrete case: since f(x, y) = g(x)h(y) for all x, y in the accompanying table, X and Y are statistically independent.

Examples
1. Example 3.15, page 82. From Table 3.1, page 75: f(0, 1) = 3/14, g(0) = 5/14, h(1) = 3/7, so f(0, 1) ≠ g(0)h(1) (3/14 ≠ 15/98). Hence X and Y are not statistically independent.
2. Example 3.14, page 81: g(x)h(y) = (x/2) · (1 + 3y²)/2 = f(x, y), so X and Y are statistically independent.
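Definition 3.12 suggests a mechanical test for the discrete case: compare every cell of the joint table with the product of its marginals. An illustrative sketch for Table 3.1:

```python
# Cell-by-cell independence check for Table 3.1 (Example 3.15).
from fractions import Fraction as F

joint = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
         (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
         (0, 2): F(1, 28), (1, 2): F(0),     (2, 2): F(0)}
g = {x: sum(joint[(x, y)] for y in range(3)) for x in range(3)}
h = {y: sum(joint[(x, y)] for x in range(3)) for y in range(3)}

independent = all(joint[(x, y)] == g[x] * h[y]
                  for x in range(3) for y in range(3))
print(independent)  # False: e.g. f(0,1) = 3/14 but g(0)h(1) = 15/98
```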

Read pages 82–83: joint probability distributions for more than two random variables, and the corresponding marginal and conditional distributions.
Definition 3.13. Let X1, X2, …, Xn be n random variables, discrete or continuous, with joint distribution f(x1, x2, …, xn) and marginal distributions f1(x1), f2(x2), …, fn(xn), respectively. The random variables X1, X2, …, Xn are said to be mutually statistically independent if and only if
f(x1, x2, …, xn) = f1(x1) f2(x2) ··· fn(xn)
for all x1, x2, …, xn.
Remark: this is the key fact used when finding the distribution of X = X1 + X2 + ··· + Xn.

Example 3.16, page 83. Suppose that the shelf life, in years, of a certain perishable food product packaged in cardboard containers is a random variable whose probability density function is given by
f(x) = e^{-x} for x > 0, and f(x) = 0 elsewhere.
Let X1, X2, and X3 represent the shelf lives for three of these containers selected independently, and find P(X1 < 2, 1 < X2 < 3, X3 > 2).
By independence, the joint density is f(x1, x2, x3) = e^{-(x1 + x2 + x3)} for x1, x2, x3 > 0, so
P(X1 < 2, 1 < X2 < 3, X3 > 2) = $\int_2^{\infty}\!\int_1^3\!\int_0^2 e^{-(x_1 + x_2 + x_3)}\,dx_1\,dx_2\,dx_3 = (1 - e^{-2})(e^{-1} - e^{-3})e^{-2} \approx 0.0372$.
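Because the joint density factors under independence, the probability reduces to a product of three one-dimensional exponential integrals; a quick numerical sketch, assuming the density f(x) = e^{-x} stated above:

```python
# Example 3.16 numerically: product of one-dimensional probabilities.
from math import exp

p_x1 = 1 - exp(-2)           # P(X1 < 2)
p_x2 = exp(-1) - exp(-3)     # P(1 < X2 < 3)
p_x3 = exp(-2)               # P(X3 > 2)
print(p_x1 * p_x2 * p_x3)    # ~0.0372, by mutual independence
```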

Example 4. The joint density function of X and Y is given. Determine the conditional densities.
Solution: The marginal densities are obtained by integrating out the other variable: g(x) = $\int_{-\infty}^{\infty} f(x, y)\,dy$ and h(y) = $\int_{-\infty}^{\infty} f(x, y)\,dx$. Hence, from the formulas for conditional densities,
f(x | y) = f(x, y)/h(y) and f(y | x) = f(x, y)/g(x).
Because f(x, y) ≠ g(x)h(y), it is clear that X and Y are not independent.