Jointly Distributed Random Variables


Jointly Distributed Random Variables: Multivariate Distributions

Quite often two or more random variables (X, Y, etc.) are defined for the same random experiment. Example: A bridge hand (13 cards) is selected from a deck of 52 cards. X = the number of spades in the hand. Y = the number of hearts in the hand. In this example we will define: p(x,y) = P[X = x, Y = y].

The function p(x,y) = P[X = x, Y = y] is called the joint probability function of X and Y.

Note: the possible values of X are 0, 1, 2, …, 13; the possible values of Y are also 0, 1, 2, …, 13; and X + Y ≤ 13. Counting hands,

p(x,y) = C(13, x) C(13, y) C(26, 13 − x − y) / C(52, 13),

where C(13, x) is the number of ways of choosing the x spades for the hand, C(13, y) the number of ways of choosing the y hearts, C(26, 13 − x − y) the number of ways of completing the hand with diamonds and clubs, and C(52, 13) the total number of ways of choosing the 13 cards for the hand.
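The counting argument above can be sketched directly in code; this is a minimal transcription of the formula, checked by summing the pmf over all feasible cells:

```python
from math import comb

# Joint pmf of X = number of spades and Y = number of hearts
# in a 13-card bridge hand, following the counting argument above.
def p(x, y):
    if x < 0 or y < 0 or x + y > 13:
        return 0.0
    return comb(13, x) * comb(13, y) * comb(26, 13 - x - y) / comb(52, 13)

# Sanity check: the probabilities sum to 1 over all (x, y).
total = sum(p(x, y) for x in range(14) for y in range(14))
print(abs(total - 1.0) < 1e-9)  # True
```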

Table: p(x,y)

Bar graph: p(x,y)

Example: A die is rolled n = 5 times. X = the number of times a “six” appears. Y = the number of times a “five” appears. Now p(x,y) = P[X = x, Y = y]. The possible values of X are 0, 1, 2, 3, 4, 5; the possible values of Y are 0, 1, 2, 3, 4, 5; and X + Y ≤ 5.

A typical outcome of rolling a die n = 5 times will be a sequence such as F5FF6, where F denotes an outcome in {1, 2, 3, 4}. The probability of any such sequence is (1/6)^x (1/6)^y (4/6)^(5 − x − y), where x = the number of sixes in the sequence and y = the number of fives in the sequence.

Now p(x,y) = P[X = x, Y = y] = K (1/6)^x (1/6)^y (4/6)^(5 − x − y), where K = the number of sequences of length 5 containing x sixes and y fives, namely K = 5!/(x! y! (5 − x − y)!).

Thus p(x,y) = P[X = x, Y = y] = [5!/(x! y! (5 − x − y)!)] (1/6)^x (1/6)^y (4/6)^(5 − x − y) if x + y ≤ 5, and 0 otherwise.
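The formula just derived can be written as a function; a short sketch, checked by verifying that the table entries sum to 1:

```python
from math import factorial

# Joint pmf of X = number of sixes, Y = number of fives in n = 5 rolls.
def p(x, y, n=5):
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1 / 6) ** (x + y) * (4 / 6) ** (n - x - y)

total = sum(p(x, y) for x in range(6) for y in range(6))
print(abs(total - 1.0) < 1e-9)  # True
```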

Table: p(x,y)

Bar graph: p(x,y)

General properties of the joint probability function p(x,y) = P[X = x, Y = y]: p(x,y) ≥ 0 for all x, y, and Σx Σy p(x,y) = 1.

Example: A die is rolled n = 5 times. X = the number of times a “six” appears. Y = the number of times a “five” appears. What is the probability that we roll more sixes than fives, i.e. what is P[X > Y]?

Table: p(x,y)
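The table lookup can also be done in code: a sketch that sums the joint pmf over the cells with x > y and checks the answer against the symmetry between sixes and fives.

```python
from math import factorial

# P[X > Y] for the die example: sum the joint pmf over cells where x > y.
n = 5
def p(x, y):
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1 / 6) ** (x + y) * (4 / 6) ** (n - x - y)

prob = sum(p(x, y) for x in range(n + 1) for y in range(n + 1) if x > y)
ties = sum(p(k, k) for k in range(n + 1))
# Sixes and fives are interchangeable, so P[X > Y] = (1 - P[X = Y]) / 2.
assert abs(prob - (1 - ties) / 2) < 1e-12
print(round(prob, 4))  # 0.3441
```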

Marginal and conditional distributions

Definition: Let X and Y denote two discrete random variables with joint probability function p(x,y) = P[X = x, Y = y] Then pX(x) = P[X = x] is called the marginal probability function of X. and pY(y) = P[Y = y] is called the marginal probability function of Y.

Note: Let y1, y2, y3, … denote the possible values of Y. Then pX(x) = Σj p(x, yj). Thus the marginal probability function of X, pX(x), is obtained from the joint probability function of X and Y by summing p(x,y) over the possible values of Y.
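The summing-out step can be checked numerically on the die example: ignoring fives, X alone counts sixes in 5 independent rolls, so its marginal should be Binomial(5, 1/6). A sketch:

```python
from math import comb, factorial

# Marginal pmf of X by summing the joint pmf over y (die example).
n = 5
def p(x, y):
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    k = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return k * (1 / 6) ** (x + y) * (4 / 6) ** (n - x - y)

def p_X(x):
    return sum(p(x, y) for y in range(n + 1))

# The marginal should match the Binomial(5, 1/6) pmf exactly.
for x in range(n + 1):
    binom = comb(n, x) * (1 / 6) ** x * (5 / 6) ** (n - x)
    assert abs(p_X(x) - binom) < 1e-12
print("marginal matches Binomial(5, 1/6)")
```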

Also, if x1, x2, x3, … denote the possible values of X, then pY(y) = Σi p(xi, y).

Example: A die is rolled n = 5 times X = the number of times a “six” appears. Y = the number of times a “five” appears.

Conditional Distributions

Definition: Let X and Y denote two discrete random variables with joint probability function p(x,y) = P[X = x, Y = y] Then pX |Y(x|y) = P[X = x|Y = y] is called the conditional probability function of X given Y = y and pY |X(y|x) = P[Y = y|X = x] is called the conditional probability function of Y given X = x

Note: pX|Y(x|y) = p(x,y)/pY(y) and pY|X(y|x) = p(x,y)/pX(x).

Marginal distributions describe how one variable behaves ignoring the other variable. Conditional distributions describe how one variable behaves when the other variable is held fixed.

Example: A die is rolled n = 5 times. X = the number of times a “six” appears. Y = the number of times a “five” appears.

The conditional distribution of X given Y = y: pX|Y(x|y) = P[X = x|Y = y]

The conditional distribution of Y given X = x: pY|X(y|x) = P[Y = y|X = x]

Example: A Bernoulli trial (success S with probability p, failure F with probability q = 1 − p) is repeated until two successes have occurred. X = trial on which the first success occurs and Y = trial on which the 2nd success occurs. 1. Find the joint probability function of X, Y. 2. Find the marginal probability functions of X and Y. 3. Find the conditional probability functions of Y given X = x and of X given Y = y.

Solution: A typical outcome would be FF…FS FF…FS, with x − 1 failures before the first success (which occurs on trial x) and y − x − 1 failures between the two successes (the second occurring on trial y). Hence p(x,y) = q^(x−1) p q^(y−x−1) p = p^2 q^(y−2) for 1 ≤ x < y.

Table of p(x,y): the entry in row x, column y is p^2 q^(y−2) for y > x and 0 otherwise; e.g. row x = 1 reads p^2, p^2 q, p^2 q^2, p^2 q^3, … for y = 2, 3, 4, 5, …

The marginal distribution of X: pX(x) = Σ (over y > x) p^2 q^(y−2) = p q^(x−1), x = 1, 2, 3, … This is the geometric distribution.

The marginal distribution of Y: pY(y) = Σ (over x = 1 to y − 1) p^2 q^(y−2) = (y − 1) p^2 q^(y−2), y = 2, 3, 4, … This is the negative binomial distribution with k = 2.

The conditional distribution of X given Y = y: pX|Y(x|y) = p(x,y)/pY(y) = 1/(y − 1) for x = 1, 2, …, y − 1. This is the uniform distribution on the values 1, 2, …, y − 1.

The conditional distribution of Y given X = x: pY|X(y|x) = p(x,y)/pX(x) = p q^(y−x−1) for y = x + 1, x + 2, … This is the geometric distribution with time starting at x + 1.
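A small simulation can confirm the joint pmf p(x, y) = p^2 q^(y−2); the value p = 0.3 below is an illustrative choice, not from the example:

```python
import random

# Simulate Bernoulli trials until two successes: X = trial of the first
# success, Y = trial of the second. Check p(1, 2) = p^2 empirically.
random.seed(1)
p, q = 0.3, 0.7  # illustrative parameter values
samples = []
for _ in range(100_000):
    trial, successes, first = 0, 0, 0
    while successes < 2:
        trial += 1
        if random.random() < p:
            successes += 1
            if successes == 1:
                first = trial
    samples.append((first, trial))

freq = sum(1 for (x, y) in samples if (x, y) == (1, 2)) / len(samples)
print(abs(freq - p * p) < 0.01)  # True: p(1, 2) = p^2 = 0.09
```

The empirical frequency of X = 1 likewise lands near p = 0.3, matching the geometric marginal pX(x) = p q^(x−1).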

Summary: Discrete Random Variables

The joint probability function: p(x,y) = P[X = x, Y = y]

Continuous Random Variables

Definition: Two random variables X and Y are said to have joint probability density function f(x,y) if f(x,y) ≥ 0, the total integral ∫∫ f(x,y) dx dy over the whole plane equals 1, and P[(X, Y) ∈ A] = ∫∫_A f(x,y) dx dy for any region A.

If f(x,y) is a joint density, then z = f(x,y) defines a surface over the x–y plane, and P[(X, Y) ∈ A] is the volume under this surface above the region A.

Multiple Integration

The double integral ∫∫_A f(x,y) dx dy is the volume under the surface z = f(x,y) above the region A.

If the region A = {(x,y) | a ≤ x ≤ b, c ≤ y ≤ d} is a rectangular region with sides parallel to the coordinate axes, then ∫∫_A f(x,y) dx dy = ∫_c^d ∫_a^b f(x,y) dx dy.

To evaluate ∫_c^d ∫_a^b f(x,y) dx dy, first evaluate the inner integral G(y) = ∫_a^b f(x,y) dx, then evaluate the outer integral ∫_c^d G(y) dy.

For fixed y, ∫_a^b f(x,y) dx is the area under the surface above the line where y is constant; multiplied by dy it gives the infinitesimal volume of a slab under the surface.

The same quantity can be calculated by integrating first with respect to y, then x: first evaluate the inner integral H(x) = ∫_c^d f(x,y) dy, then evaluate the outer integral ∫_a^b H(x) dx.

For fixed x, ∫_c^d f(x,y) dy is the area under the surface above the line where x is constant; multiplied by dx it gives the infinitesimal volume of a slab under the surface.

Example: computing a double integral over a rectangle by iterated integration, first with respect to x.

The same quantity can be computed by reversing the order of integration.

Integration over non-rectangular regions

Suppose the region A is defined as follows: A = {(x,y) | a(y) ≤ x ≤ b(y), c ≤ y ≤ d}. Then ∫∫_A f(x,y) dx dy = ∫_c^d ∫_(a(y))^(b(y)) f(x,y) dx dy.

If the region A is defined as follows: A = {(x,y) | a ≤ x ≤ b, c(x) ≤ y ≤ d(x)}, then ∫∫_A f(x,y) dx dy = ∫_a^b ∫_(c(x))^(d(x)) f(x,y) dy dx.

In general the region A can be partitioned into regions of either type.

Example: Compute the volume under f(x,y) = x^2 y + x y^3 over the region A = {(x,y) | x + y ≤ 1, 0 ≤ x, 0 ≤ y}, the triangle with vertices (0, 0), (1, 0) and (0, 1) bounded above by the line x + y = 1.

Integrating first with respect to x, then y: for fixed y (0 ≤ y ≤ 1), x runs from 0 to 1 − y, so the inner integral is ∫_0^(1−y) (x^2 y + x y^3) dx = y(1 − y)^3/3 + y^3 (1 − y)^2/2.

and the outer integral is ∫_0^1 [y(1 − y)^3/3 + y^3 (1 − y)^2/2] dy = 1/60 + 1/120 = 1/40.

Now integrating first with respect to y, then x: for fixed x (0 ≤ x ≤ 1), y runs from 0 to 1 − x, so the inner integral is ∫_0^(1−x) (x^2 y + x y^3) dy = x^2 (1 − x)^2/2 + x(1 − x)^4/4.

Hence ∫_0^1 [x^2 (1 − x)^2/2 + x(1 − x)^4/4] dx = 1/60 + 1/120 = 1/40, agreeing with the first computation.
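A numerical cross-check of this example, approximating the volume over the triangle with a midpoint Riemann sum in pure Python:

```python
# Volume under f(x, y) = x^2*y + x*y^3 over the triangle
# x + y <= 1, x >= 0, y >= 0, via a midpoint Riemann sum.
def f(x, y):
    return x * x * y + x * y ** 3

n = 1000
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x + y <= 1.0:
            total += f(x, y) * h * h

print(total)  # close to 1/40 = 0.025
```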

Continuous Random Variables

Definition: Two random variables X and Y are said to have joint probability density function f(x,y) if f(x,y) ≥ 0, the total integral ∫∫ f(x,y) dx dy over the whole plane equals 1, and P[(X, Y) ∈ A] = ∫∫_A f(x,y) dx dy for any region A.

Definition: Let X and Y denote two random variables with joint probability density function f(x,y). Then the marginal density of X is fX(x) = ∫ f(x,y) dy and the marginal density of Y is fY(y) = ∫ f(x,y) dx (each integral over the whole real line).

Definition: Let X and Y denote two random variables with joint probability density function f(x,y) and marginal densities fX(x), fY(y). Then the conditional density of Y given X = x is fY|X(y|x) = f(x,y)/fX(x), and the conditional density of X given Y = y is fX|Y(x|y) = f(x,y)/fY(y).

The bivariate Normal distribution

Let f(x1, x2) = [1/(2π σ1 σ2 √(1 − ρ^2))] exp{−Q/2}, where Q = [1/(1 − ρ^2)] [((x1 − μ1)/σ1)^2 − 2ρ ((x1 − μ1)/σ1)((x2 − μ2)/σ2) + ((x2 − μ2)/σ2)^2]. This distribution is called the bivariate Normal distribution. The parameters are μ1, μ2, σ1, σ2 and ρ.

Surface Plots of the bivariate Normal distribution

Note: f(x1, x2) is constant when Q is constant. This is true when (x1, x2) lies on an ellipse centered at (μ1, μ2).

Marginal and Conditional distributions

Marginal distributions for the Bivariate Normal distribution. Recall the definition of marginal distributions for continuous random variables: f(x1) = ∫ f(x1, x2) dx2 and f(x2) = ∫ f(x1, x2) dx1. It can be shown that in the case of the bivariate normal distribution the marginal distribution of xi is Normal with mean μi and standard deviation σi.

Proof: The marginal distribution of x2 is f(x2) = ∫ f(x1, x2) dx1. Write z1 = (x1 − μ1)/σ1 and z2 = (x2 − μ2)/σ2. Completing the square in z1,

Q = [1/(1 − ρ^2)] [z1^2 − 2ρ z1 z2 + z2^2] = (z1 − ρ z2)^2/(1 − ρ^2) + z2^2.

Hence

f(x2) = [e^(−z2^2/2) / (2π σ1 σ2 √(1 − ρ^2))] ∫ exp{−(z1 − ρ z2)^2 / (2(1 − ρ^2))} dx1.

The remaining integral is a Normal integral with value σ1 √(2π(1 − ρ^2)), so

f(x2) = [1/(σ2 √(2π))] e^(−(x2 − μ2)^2/(2 σ2^2)).

Thus the marginal distribution of x2 is Normal with mean μ2 and standard deviation σ2. Similarly the marginal distribution of x1 is Normal with mean μ1 and standard deviation σ1.
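The marginal result can be verified numerically: integrate the bivariate normal density over x1 with a midpoint rule and compare with the N(μ2, σ2^2) density at a point. The parameter values below are illustrative choices, not from the slides:

```python
import math

# Check that integrating the bivariate normal density over x1
# recovers the N(mu2, sigma2^2) density in x2.
m1, m2, s1, s2, r = 1.0, 2.0, 1.5, 0.5, 0.6  # illustrative parameters

def f(x1, x2):
    z1 = (x1 - m1) / s1
    z2 = (x2 - m2) / s2
    q = (z1 * z1 - 2 * r * z1 * z2 + z2 * z2) / (1 - r * r)
    return math.exp(-q / 2) / (2 * math.pi * s1 * s2 * math.sqrt(1 - r * r))

def marginal_x2(x2, n=4000, lo=-15.0, hi=15.0):
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h, x2) for i in range(n)) * h

x2 = 2.7
normal_pdf = math.exp(-(((x2 - m2) / s2) ** 2) / 2) / (s2 * math.sqrt(2 * math.pi))
print(abs(marginal_x2(x2) - normal_pdf) < 1e-5)  # True
```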

Conditional distributions for the Bivariate Normal distribution. Recall the definition of conditional distributions for continuous random variables: f(x1|x2) = f(x1, x2)/f(x2) and f(x2|x1) = f(x1, x2)/f(x1). It can be shown that in the case of the bivariate normal distribution the conditional distribution of xi given xj is Normal with mean μi + ρ (σi/σj)(xj − μj) and standard deviation σi √(1 − ρ^2).

Proof

Using the completed-square form of Q, f(x2|x1) = f(x1, x2)/f(x1) = [1/(σ2 √(2π(1 − ρ^2)))] exp{−[x2 − μ2 − ρ(σ2/σ1)(x1 − μ1)]^2 / (2 σ2^2 (1 − ρ^2))}. Thus the conditional distribution of x2 given x1 is Normal with mean μ2 + ρ(σ2/σ1)(x1 − μ1) and standard deviation σ2 √(1 − ρ^2).

Bivariate Normal Distribution with marginal distributions

Bivariate Normal Distribution with conditional distribution

Contours of the density are ellipses centered at (μ1, μ2); the figure shows their major axis together with the regression line (the conditional mean of x2 given x1), illustrating regression to the mean.

Example: Suppose that a rectangle is constructed by first choosing its length, X, and then choosing its width, Y. Its length X is selected from an exponential distribution with mean μ = 1/λ = 5. Once the length has been chosen, its width Y is selected from a uniform distribution from 0 to half its length. Find the probability that its area A = XY is less than 4.

Solution: The joint density is f(x,y) = fX(x) fY|X(y|x) = (1/5) e^(−x/5) · (2/x) for 0 < y < x/2, and P[A < 4] is the integral of f over the part of this region where xy < 4.

The boundary curves xy = 4 and y = x/2 intersect where x^2/2 = 4, i.e. at x = 2√2.

For x ≤ 2√2 the whole strip 0 < y < x/2 satisfies xy < 4, contributing P[X ≤ 2√2] = 1 − e^(−2√2/5); this part can be evaluated in closed form. For x > 2√2 only 0 < y < 4/x qualifies, and since P[Y < 4/x | X = x] = (4/x)/(x/2) = 8/x^2, this contributes ∫ (from 2√2 to ∞) (8/x^2) (1/5) e^(−x/5) dx; this part may require numerical evaluation.
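A Monte Carlo sketch of the whole probability, sampling X and then Y exactly as the rectangle is constructed; the closed-form piece serves as a lower bound on the estimate:

```python
import random
import math

# Estimate P[XY < 4]: X exponential with mean 5, then Y uniform on (0, X/2).
random.seed(0)
N = 200_000
hits = 0
for _ in range(N):
    x = random.expovariate(1 / 5)   # exponential, mean 1/lambda = 5
    y = random.uniform(0, x / 2)    # width uniform on (0, x/2)
    if x * y < 4:
        hits += 1
p_hat = hits / N

# The closed-form part P[X <= 2*sqrt(2)] is a strict lower bound.
closed_form_part = 1 - math.exp(-2 * math.sqrt(2) / 5)
print(p_hat > closed_form_part)  # True
```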

Multivariate distributions (k ≥ 2)

Definition: Let X1, X2, …, Xk denote k discrete random variables. Then p(x1, x2, …, xk) is the joint probability function of X1, X2, …, Xk if p(x1, …, xk) ≥ 0 for all arguments and Σ … Σ p(x1, …, xk) = 1, the sum running over all possible values.

Definition: Let X1, X2, …, Xk denote k continuous random variables. Then f(x1, x2, …, xk) is the joint density function of X1, X2, …, Xk if f(x1, …, xk) ≥ 0 and ∫ … ∫ f(x1, …, xk) dx1 … dxk = 1.

Example: The Multinomial distribution. Suppose that we observe an experiment that has k possible outcomes {O1, O2, …, Ok} independently n times. Let p1, p2, …, pk denote the probabilities of O1, O2, …, Ok respectively. Let Xi denote the number of times that outcome Oi occurs in the n repetitions of the experiment. Then the joint probability function of the random variables X1, X2, …, Xk is p(x1, x2, …, xk) = [n!/(x1! x2! … xk!)] p1^(x1) p2^(x2) … pk^(xk) for nonnegative integers x1, …, xk with x1 + x2 + … + xk = n.

Note: p1^(x1) p2^(x2) … pk^(xk) is the probability of a sequence of length n containing x1 outcomes O1, x2 outcomes O2, …, xk outcomes Ok.

n!/(x1! x2! … xk!) is the number of ways of choosing the positions for the x1 outcomes O1, x2 outcomes O2, …, xk outcomes Ok.

The joint distribution of X1, X2, …, Xk above is called the Multinomial distribution.
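A direct transcription of the multinomial pmf, checked against the binomial pmf, which is the k = 2 special case:

```python
from math import comb, factorial

# Multinomial pmf: counts xs = [x1, ..., xk], probabilities ps = [p1, ..., pk].
def multinomial_pmf(xs, ps):
    n = sum(xs)
    coef = factorial(n)
    for x in xs:
        coef //= factorial(x)
    prob = float(coef)
    for x, p in zip(xs, ps):
        prob *= p ** x
    return prob

# With k = 2 it reduces to the binomial: here n = 4, p = 0.3.
lhs = multinomial_pmf([1, 3], [0.3, 0.7])
rhs = comb(4, 1) * 0.3 * 0.7 ** 3
print(abs(lhs - rhs) < 1e-12)  # True
```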

Example: Suppose that a treatment for back pain has three possible outcomes: O1 - complete cure (no pain), with a 30% chance; O2 - reduced pain, with a 50% chance; O3 - no change, with a 20% chance. Hence p1 = 0.30, p2 = 0.50, p3 = 0.20. Suppose the treatment is applied to n = 4 patients suffering back pain, and let X = the number that result in a complete cure, Y = the number that result in just reduced pain, and Z = the number that result in no change. Find the distribution of X, Y and Z, and compute P[X + Y ≥ Z].

Table: p(x,y,z)

P [X + Y ≥ Z] = 0.9728
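The value 0.9728 can be reproduced by summing the multinomial pmf over the qualifying cells:

```python
from math import factorial

# P[X + Y >= Z] for the back-pain example: n = 4 patients,
# probabilities (0.3, 0.5, 0.2) for cure / reduced pain / no change.
def pmf(x, y, z, ps=(0.3, 0.5, 0.2), n=4):
    if x + y + z != n:
        return 0.0
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(z))
    return coef * ps[0] ** x * ps[1] ** y * ps[2] ** z

prob = sum(pmf(x, y, z)
           for x in range(5) for y in range(5) for z in range(5)
           if x + y + z == 4 and x + y >= z)
print(round(prob, 4))  # 0.9728
```

Equivalently, since X + Y + Z = 4, the event fails only when Z ≥ 3, and P[Z ≥ 3] = 4(0.2)^3(0.8) + (0.2)^4 = 0.0272.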

Example: The Multivariate Normal distribution. Recall the univariate normal distribution, f(x) = [1/(σ √(2π))] e^(−(x − μ)^2/(2σ^2)), and the bivariate normal distribution, f(x1, x2) = [1/(2π σ1 σ2 √(1 − ρ^2))] exp{−Q/2} with Q as above.

The k-variate Normal distribution: f(x1, x2, …, xk) = (2π)^(−k/2) |Σ|^(−1/2) exp{−(1/2) (x − μ)ᵀ Σ^(−1) (x − μ)}, where x = (x1, …, xk)ᵀ, μ is the vector of means and Σ is the (positive definite) covariance matrix.

Marginal distributions

Definition: Let X1, X2, …, Xq, Xq+1, …, Xk denote k discrete random variables with joint probability function p(x1, x2, …, xq, xq+1, …, xk). Then the marginal joint probability function of X1, X2, …, Xq is p12…q(x1, …, xq) = Σ … Σ p(x1, …, xk), the sum running over all values of xq+1, …, xk.

Definition: Let X1, X2, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, x2, …, xq, xq+1, …, xk). Then the marginal joint density function of X1, X2, …, Xq is f12…q(x1, …, xq) = ∫ … ∫ f(x1, …, xk) dxq+1 … dxk.

Conditional distributions

Definition: Let X1, X2, …, Xq, Xq+1, …, Xk denote k discrete random variables with joint probability function p(x1, x2, …, xq, xq+1, …, xk). Then the conditional joint probability function of X1, X2, …, Xq given Xq+1 = xq+1, …, Xk = xk is p1…q|q+1…k(x1, …, xq | xq+1, …, xk) = p(x1, …, xk) / p(q+1)…k(xq+1, …, xk).

Definition: Let X1, X2, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, x2, …, xq, xq+1, …, xk). Then the conditional joint density function of X1, X2, …, Xq given Xq+1 = xq+1, …, Xk = xk is f1…q|q+1…k(x1, …, xq | xq+1, …, xk) = f(x1, …, xk) / f(q+1)…k(xq+1, …, xk).

Definition – Independence of sets of vectors: Let X1, X2, …, Xq, Xq+1, …, Xk denote k continuous random variables with joint probability density function f(x1, x2, …, xq, xq+1, …, xk). Then the variables X1, X2, …, Xq are independent of Xq+1, …, Xk if f(x1, …, xk) = f1…q(x1, …, xq) f(q+1)…k(xq+1, …, xk). A similar definition holds for discrete random variables.

Definition – Mutual Independence: Let X1, X2, …, Xk denote k continuous random variables with joint probability density function f(x1, x2, …, xk). Then the variables X1, X2, …, Xk are called mutually independent if f(x1, x2, …, xk) = f1(x1) f2(x2) … fk(xk). A similar definition holds for discrete random variables.

Example: Let X, Y, Z denote 3 jointly distributed random variables with joint density function f(x, y, z). 1. Find the value of K. 2. Determine the marginal distributions of X, Y and Z. 3. Determine the joint marginal distributions of X, Y; X, Z; Y, Z.

Solution Determining the value of K.

The marginal distribution of X.

The joint marginal distribution of X, Y.

Find the conditional distribution of: 1. Z given X = x, Y = y; 2. Y given X = x, Z = z; 3. X given Y = y, Z = z; 4. Y, Z given X = x; 5. X, Z given Y = y; 6. X, Y given Z = z; 7. Y given X = x; 8. X given Y = y; 9. X given Z = z; 10. Z given X = x; 11. Z given Y = y; 12. Y given Z = z.

From the joint marginal distribution of X, Y, the conditional distribution of Z given X = x, Y = y is fZ|XY(z|x, y) = f(x, y, z) / fXY(x, y).

From the marginal distribution of X, the conditional distribution of Y, Z given X = x is fYZ|X(y, z|x) = f(x, y, z) / fX(x).