Chapter 6 Jointly Distributed Random Variables


STAT 111 — Chapter 6: Jointly Distributed Random Variables

In many experiments it is necessary to consider the properties of two or more random variables simultaneously. In the following we shall be concerned with the bivariate case, that is, with situations where we are interested in a pair of random variables at the same time. Later we shall extend this discussion to the multivariate case, covering any finite number of random variables.

If X and Y are two discrete random variables, the probability distribution for their simultaneous occurrence can be represented by a function with values f(x, y) for any pair of values (x, y) within the range of the random variables X and Y. It is common to refer to this function as the joint probability distribution of X and Y, defined as

f(x, y) = P(X = x, Y = y)

Definition. The function f(x, y) is a joint probability distribution (or joint probability mass function) of the discrete random variables X and Y if and only if its values satisfy the conditions:

1. f(x, y) ≥ 0 for all (x, y);
2. ∑x ∑y f(x, y) = 1, where the double summation extends over all (x, y).

Based on this definition, the joint cumulative distribution function gives the joint probability that X ≤ x and Y ≤ y:

F(x, y) = P(X ≤ x, Y ≤ y) = ∑_{xi ≤ x} ∑_{yj ≤ y} f(xi, yj),  for −∞ < x < ∞, −∞ < y < ∞.
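To make the definition concrete, here is a minimal Python sketch (our own illustration, not part of the original slides): it stores a joint p.m.f. as a dictionary, checks both conditions, and evaluates the joint c.d.f. by summing over all pairs (xi, yj) with xi ≤ x and yj ≤ y. The values used are those of the coin example on the next slide.

```python
from fractions import Fraction

# Joint p.m.f. stored as {(x, y): f(x, y)}; values are the
# three-coin example from the next slide.
pmf = {(0, -3): Fraction(1, 8), (1, -1): Fraction(3, 8),
       (2, 1): Fraction(3, 8), (3, 3): Fraction(1, 8)}

# Condition 1: f(x, y) >= 0 everywhere.
assert all(p >= 0 for p in pmf.values())
# Condition 2: the double sum over all (x, y) equals 1.
assert sum(pmf.values()) == 1

def F(x, y):
    """Joint c.d.f.: F(x, y) = P(X <= x, Y <= y)."""
    return sum(p for (xi, yj), p in pmf.items() if xi <= x and yj <= y)

print(F(1, 0))  # 1/8 + 3/8 = 1/2
```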

Example. Let X denote the number of heads and Y the number of heads minus the number of tails when three coins are tossed. Find the joint probability distribution of X and Y.

The possible pairs (x, y) are (0, −3), (1, −1), (2, 1), (3, 3), with

P(X = 0, Y = −3) = 1/8, P(X = 1, Y = −1) = 3/8, P(X = 2, Y = 1) = 3/8, P(X = 3, Y = 3) = 1/8.

These probabilities are most easily expressed in tabular form:

Y \ X |  0    1    2    3
 −3   | 1/8   0    0    0
 −1   |  0   3/8   0    0
  1   |  0    0   3/8   0
  3   |  0    0    0   1/8
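The table can also be produced mechanically by enumerating the 8 equally likely outcomes; a short sketch of our own, using only the standard library:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

counts = Counter()
for flips in product("HT", repeat=3):          # the 8 equally likely outcomes
    heads = flips.count("H")
    counts[(heads, heads - (3 - heads))] += 1  # (X, Y) with Y = #H - #T = 2X - 3

joint = {xy: Fraction(n, 8) for xy, n in counts.items()}
print(sorted(joint.items()))
# [((0, -3), 1/8), ((1, -1), 3/8), ((2, 1), 3/8), ((3, 3), 1/8)]
```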

Example. Suppose that 3 balls are randomly selected from an urn containing 3 red, 4 white, and 5 blue balls. Let X denote the number of red balls chosen and Y the number of white balls chosen. Find the joint probability function of X and Y, P(X + Y ≤ 2), P(X = 1, Y = 2), P(X = 0, 0 ≤ Y ≤ 2), and P(X > Y).

With X = #red and Y = #white, the number of blue balls chosen is 3 − X − Y, so

f(x, y) = C(3, x) C(4, y) C(5, 3 − x − y) / C(12, 3),  x, y ≥ 0, x + y ≤ 3,

where C(12, 3) = 220. In tabular form:

X \ Y |   0       1       2       3
  0   | 10/220  40/220  30/220   4/220
  1   | 30/220  60/220  18/220    0
  2   | 15/220  12/220    0       0
  3   |  1/220    0       0       0

1. X + Y ≤ 2 for (x, y) ∈ {(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (0, 2)}, so
   P(X + Y ≤ 2) = (10 + 30 + 15 + 40 + 60 + 30)/220 = 185/220.
2. P(X = 1, Y = 2) = f(1, 2) = 18/220.
3. P(X = 0, 0 ≤ Y ≤ 2) = f(0, 0) + f(0, 1) + f(0, 2) = (10 + 40 + 30)/220 = 80/220.
4. P(X > Y) = f(1, 0) + f(2, 0) + f(2, 1) + f(3, 0) = (30 + 15 + 12 + 1)/220 = 58/220.
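A short sketch of our own reproduces the table entries and the requested probabilities from the counting formula above:

```python
from fractions import Fraction
from math import comb

def f(x, y):
    """f(x, y) = C(3, x) * C(4, y) * C(5, 3 - x - y) / C(12, 3)."""
    if x < 0 or y < 0 or x + y > 3:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(4, y) * comb(5, 3 - x - y), comb(12, 3))

print(f(1, 2))                                                         # 9/110, i.e. 18/220
print(sum(f(x, y) for x in range(4) for y in range(4) if x + y <= 2))  # 37/44, i.e. 185/220
print(sum(f(x, y) for x in range(4) for y in range(4) if x > y))       # 29/110, i.e. 58/220
```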

Example. Let the joint probability mass function of X and Y be given by the following table:

X \ Y |  0     1     2
  0   | 1/16  1/16  1/8
  1   | 1/8   1/8   1/4
  2   | 1/8   1/16  1/16

Find:
1. P(X = 1, Y ≤ 0) = f(1, 0) = 1/8
2. P(X = 2, Y ≤ 0) = f(2, 0) = 1/8
3. P(X = 2, X + Y = 4) = f(2, 2) = 1/16
4. P(1 ≤ X < 3, Y ≥ 1) = f(1, 1) + f(1, 2) + f(2, 1) + f(2, 2) = 1/8 + 1/4 + 1/16 + 1/16 = 1/2
5. P(X ≤ 2) = 1 − P(X > 2) = 1 − 0 = 1
6. F(1, 1) = P(X ≤ 1, Y ≤ 1) = f(0, 0) + f(0, 1) + f(1, 0) + f(1, 1) = 1/16 + 1/16 + 1/8 + 1/8 = 3/8

Example. Determine the value of c so that the following functions represent joint probability distributions of the random variables X and Y.

1. f(x, y) = c x y, for x = 1, 2, 3 and y = 1, 2, 3.
   ∑x ∑y f(x, y) = c [1 + 2 + 3 + 2 + 4 + 6 + 3 + 6 + 9] = 36c = 1, so c = 1/36.
2. f(x, y) = c |x − y|, for x = −2, 0, 2 and y = 1, 2, 3.
   ∑x ∑y f(x, y) = c [3 + 4 + 5 + 1 + 2 + 3 + 1 + 0 + 1] = 20c = 1, so c = 1/20.
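The same normalization can be checked by brute force; a small sketch of our own:

```python
from fractions import Fraction

# Case 1: f(x, y) = c * x * y over x, y in {1, 2, 3}.
total = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))
print(Fraction(1, total))   # c = 1/36

# Case 2: f(x, y) = c * |x - y| over x in {-2, 0, 2}, y in {1, 2, 3}.
total = sum(abs(x - y) for x in (-2, 0, 2) for y in (1, 2, 3))
print(Fraction(1, total))   # c = 1/20
```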

All the preceding definitions concerning two random variables can be generalized to the multivariate case, where there are n random variables. The values of the joint probability distribution of the discrete random variables X1, X2, ..., Xn, defined over the same sample space S, are given by

f(x1, x2, ..., xn) = P(X1 = x1, X2 = x2, ..., Xn = xn)

for every n-tuple (x1, x2, ..., xn) within the range of the random variables. Similarly, the values of their joint cumulative distribution function are given by

F(x1, x2, ..., xn) = P(X1 ≤ x1, X2 ≤ x2, ..., Xn ≤ xn).

Example. Consider n flips of a balanced coin. Let X1 be the number of heads (0 or 1) obtained on the first flip, X2 the number of heads obtained on the second flip, ..., and Xn the number of heads obtained on the nth flip. Find the joint probability distribution of these n random variables.

Since the flips are independent and each Xi takes the value 0 or 1 with probability 1/2,

f(x1, x2, ..., xn) = P(X1 = x1, X2 = x2, ..., Xn = xn) = (1/2)(1/2)···(1/2) = 1/2^n.
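Equivalently, the joint p.m.f. assigns probability (1/2)^n to every 0/1 sequence of length n; a one-line check of our own:

```python
from fractions import Fraction

def f(*xs):
    """Joint p.m.f. of n independent fair flips: each 0/1 sequence has prob (1/2)^n."""
    assert all(x in (0, 1) for x in xs)
    return Fraction(1, 2 ** len(xs))

print(f(1, 0, 1))   # 1/8 for n = 3
```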

Marginal Distributions

If X and Y are discrete random variables and f(x, y) is the value of their joint probability distribution at (x, y), the function given by

fX(x) = ∑y f(x, y)

for each x within the range of X is called the marginal distribution of X. Similarly, the function given by

fY(y) = ∑x f(x, y)

for each y within the range of Y is called the marginal distribution of Y. Note that because these probabilities are obtained from the margins of the joint table, they are called marginal distributions.

Example. Assume that the random variables X and Y have the following joint probability mass function (the urn example above). Find the marginal distributions of X and Y.

X \ Y |   0       1        2       3     |  fX(x)
  0   | 10/220   40/220   30/220  4/220  |  84/220
  1   | 30/220   60/220   18/220   0     | 108/220
  2   | 15/220   12/220     0      0     |  27/220
  3   |  1/220     0        0      0     |   1/220
fY(y) | 56/220  112/220   48/220  4/220  |    1

Marginal of X:
 x     |   0        1        2       3
 fX(x) | 84/220  108/220   27/220  1/220

Marginal of Y:
 y     |   0        1        2       3
 fY(y) | 56/220  112/220   48/220  4/220
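The marginals are just the row and column sums of the joint table; a sketch of our own, reusing the joint p.m.f. f from the urn example:

```python
from fractions import Fraction
from math import comb

def f(x, y):
    """Joint p.m.f. of the urn example."""
    if x < 0 or y < 0 or x + y > 3:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(4, y) * comb(5, 3 - x - y), comb(12, 3))

f_X = {x: sum(f(x, y) for y in range(4)) for x in range(4)}  # row sums
f_Y = {y: sum(f(x, y) for x in range(4)) for y in range(4)}  # column sums
print(f_X)  # {0: 21/55, 1: 27/55, 2: 27/220, 3: 1/220}, i.e. 84/220, 108/220, ...
print(f_Y)  # {0: 14/55, 1: 28/55, 2: 12/55, 3: 1/55},   i.e. 56/220, 112/220, ...
```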

Example. Assume that the random variables X and Y have the joint probability mass function

f(x, y) = λ^(x+y) e^(−2λ) / (x! y!),  x = 0, 1, 2, ...; y = 0, 1, 2, ....

Find the marginal distribution of X.

fX(x) = ∑_{y=0}^∞ λ^(x+y) e^(−2λ) / (x! y!)
      = (λ^x e^(−2λ) / x!) ∑_{y=0}^∞ λ^y / y!
      = (λ^x e^(−2λ) / x!) e^λ
      = λ^x e^(−λ) / x!,

using the series e^λ = ∑_{t=0}^∞ λ^t / t!. Thus X has a Poisson distribution with mean λ.
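Numerically, truncating the sum over y at a large bound reproduces the Poisson(λ) p.m.f.; a quick check of our own (λ = 2.5 is an arbitrary test value):

```python
from math import exp, factorial

lam = 2.5   # arbitrary test value for lambda

def joint(x, y):
    """f(x, y) = lam**(x + y) * exp(-2*lam) / (x! * y!)"""
    return lam ** (x + y) * exp(-2 * lam) / (factorial(x) * factorial(y))

x = 3
marginal = sum(joint(x, y) for y in range(100))   # truncated sum over y
poisson = lam ** x * exp(-lam) / factorial(x)     # Poisson(lam) p.m.f. at x
print(abs(marginal - poisson) < 1e-12)            # True
```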

Example. Let the joint distribution of X and Y be given by

f(x, y) = (x + y)/30,  x = 0, 1, 2, 3; y = 0, 1, 2.

Find the marginal distribution functions of X and Y.

In tabular form the joint distribution is:

x \ y |  0     1     2
  0   |  0    1/30  1/15
  1   | 1/30  1/15  1/10
  2   | 1/15  1/10  2/15
  3   | 1/10  2/15  1/6

Marginal of X: fX(x) = ∑_{y=0}^{2} (x + y)/30 = (3x + 3)/30 = (x + 1)/10:

 x     |  0     1     2     3
 fX(x) | 1/10  1/5   3/10  2/5

Marginal of Y: fY(y) = ∑_{x=0}^{3} (x + y)/30 = (6 + 4y)/30 = (3 + 2y)/15:

 y     |  0    1    2
 fY(y) | 1/5  1/3  7/15

Alternatively, the marginals can be read directly from the row and column sums of the joint table.
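A quick verification of the two closed forms with exact arithmetic (our own sketch):

```python
from fractions import Fraction

f = lambda x, y: Fraction(x + y, 30)   # joint p.m.f., x = 0..3, y = 0..2

for x in range(4):
    assert sum(f(x, y) for y in range(3)) == Fraction(x + 1, 10)
for y in range(3):
    assert sum(f(x, y) for x in range(4)) == Fraction(3 + 2 * y, 15)
print("marginals match the closed forms")
```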

Conditional Distributions

Recall that for any two events A and B, the conditional probability of A given B is defined as

P(A | B) = P(A ∩ B) / P(B),  provided P(B) ≠ 0.

Suppose now that A and B are the events X = x and Y = y, so that we can write

P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y),   (1)

provided P(Y = y) = fY(y) ≠ 0, where P(X = x, Y = y) = f(x, y) is the value of the joint probability distribution of X and Y at (x, y) and fY(y) is the value of the marginal distribution of Y at y. Denoting the probability in equation (1) by f(x | y), to indicate that x is a variable and y is fixed, we have the following definition.

Definition. If f(x, y) is the value of the joint probability distribution of the discrete random variables X and Y at (x, y) and fY(y) is the value of the marginal distribution of Y at y, the function given by

f(x | y) = P(X = x | Y = y) = f(x, y) / fY(y)

for each x within the range of X is called the conditional distribution of X given Y = y. Similarly, if fX(x) is the value of the marginal distribution of X at x, the function given by

f(y | x) = P(Y = y | X = x) = f(x, y) / fX(x)

for each y within the range of Y is called the conditional distribution of Y given X = x.

Example 1. The joint probability mass function of X and Y is given by

f(1, 1) = 1/8, f(1, 2) = 1/4, f(2, 1) = 1/8, f(2, 2) = 1/2.

X \ Y |  1    2   | fX(x)
  1   | 1/8  2/8  |  3/8
  2   | 1/8  4/8  |  5/8
fY(y) | 2/8  6/8  |   1

1. Compute the conditional mass function of X given Y = i, i = 1, 2.

   The marginal of Y is fY(1) = 2/8 and fY(2) = 6/8, so

   f(x | Y = 1): f(1 | 1) = (1/8)/(2/8) = 1/2, f(2 | 1) = (1/8)/(2/8) = 1/2.
   f(x | Y = 2): f(1 | 2) = (2/8)/(6/8) = 1/3, f(2 | 2) = (4/8)/(6/8) = 2/3.

2. P(XY ≤ 3) = f(1, 1) + f(1, 2) + f(2, 1) = 1/8 + 1/4 + 1/8 = 1/2.
3. P(X/Y > 1) = f(2, 1) = 1/8.
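Dividing each column of the joint table by its column total gives the conditional p.m.f.s; a small sketch with our own variable names:

```python
from fractions import Fraction

joint = {(1, 1): Fraction(1, 8), (1, 2): Fraction(1, 4),
         (2, 1): Fraction(1, 8), (2, 2): Fraction(1, 2)}

def conditional_X_given_Y(y):
    """f(x | y) = f(x, y) / f_Y(y) for each x in the support."""
    f_Y = sum(p for (x, yy), p in joint.items() if yy == y)
    return {x: p / f_Y for (x, yy), p in joint.items() if yy == y}

print(conditional_X_given_Y(1))   # {1: 1/2, 2: 1/2}
print(conditional_X_given_Y(2))   # {1: 1/3, 2: 2/3}
```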

Independent Random Variables

In Chapter 3 we stated that two events A and B are independent if and only if

P(A ∩ B) = P(A) P(B),

that is, their joint probability is equal to the product of their marginal probabilities. In the following, the notion of independence of random variables is presented.

Definition. The random variables X and Y are said to be independent if and only if

f(x, y) = fX(x) fY(y)

for all possible values of X and Y. In terms of the joint cumulative distribution function, X and Y are independent if and only if

F(x, y) = FX(x) FY(y)

for all possible values of X and Y, or equivalently

P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for all sets A and B.

Thus, loosely speaking, X and Y are independent if knowing the value of one does not change the distribution of the other. Random variables that are not independent are said to be dependent.

Checking for independence of discrete random variables requires a thorough investigation, since it is possible for the product of the marginal distributions to equal the joint probability distribution for some but not all combinations of (x, y). If one can find any point (x, y) for which f(x, y) ≠ fX(x) fY(y), then the discrete random variables X and Y are not independent.
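This pointwise check is easy to automate: compute both marginals and compare f(x, y) with fX(x) fY(y) at every point of the support. A sketch under our own naming; exact arithmetic with Fraction avoids false mismatches:

```python
from fractions import Fraction

def independent(joint):
    """Return True iff f(x, y) == f_X(x) * f_Y(y) at EVERY point (x, y)."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    f_X = {x: sum(joint.get((x, y), Fraction(0)) for y in ys) for x in xs}
    f_Y = {y: sum(joint.get((x, y), Fraction(0)) for x in xs) for y in ys}
    return all(joint.get((x, y), Fraction(0)) == f_X[x] * f_Y[y]
               for x in xs for y in ys)

# Example 1 above: f(1,1) = 1/8 differs from f_X(1) f_Y(1) = 6/64, so dependent.
joint = {(1, 1): Fraction(1, 8), (1, 2): Fraction(1, 4),
         (2, 1): Fraction(1, 8), (2, 2): Fraction(1, 2)}
print(independent(joint))   # False
```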

If X is independent of Y, then the conditional mass function is the same as the unconditional one. This follows because if X is independent of Y, then

f(x | y) = P(X = x | Y = y)
         = P(X = x, Y = y) / P(Y = y)
         = P(X = x) P(Y = y) / P(Y = y)
         = P(X = x).

Definition. Let X1, X2, ..., Xn be n discrete random variables having probability mass functions f1, f2, ..., fn, respectively. These random variables are said to be independent if their joint probability mass function f is given by

f(x1, x2, ..., xn) = fX1(x1) fX2(x2) ··· fXn(xn)

for all (x1, x2, ..., xn) within their range.

Example. Show that the random variables of Example 1 are not independent.

f(1, 1) = 1/8, f(1, 2) = 1/4, f(2, 1) = 1/8, f(2, 2) = 1/2,
fX(1) = 3/8, fX(2) = 5/8, fY(1) = 2/8, fY(2) = 3/4.

Since f(1, 1) = 1/8 ≠ 6/64 = (3/8)(2/8) = fX(1) fY(1), X and Y are not independent.

Example. The random variables X and Y are specified by

P(X = 1) = 2/3, P(X = 0) = 1/3, P(Y = 1) = 1/4, P(Y = −1) = 3/4.

Construct the joint distribution of X and Y, assuming that X and Y are independent.

Since f(x, y) = fX(x) fY(y) under independence:

X \ Y |   1     −1
  0   |  1/12   1/4
  1   |  2/12   1/2
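Under independence the joint table is the outer product of the two marginals; a closing sketch with our own names:

```python
from fractions import Fraction

p_X = {0: Fraction(1, 3), 1: Fraction(2, 3)}
p_Y = {1: Fraction(1, 4), -1: Fraction(3, 4)}

# f(x, y) = f_X(x) * f_Y(y) for independent X and Y.
joint = {(x, y): px * py for x, px in p_X.items() for y, py in p_Y.items()}
print(joint)
# {(0, 1): 1/12, (0, -1): 1/4, (1, 1): 1/6, (1, -1): 1/2}
```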