Joint Probability Distribution

Joint Probability Distributions Given two random variables X and Y defined on the same probability space, the Joint Distribution of X and Y gives the probability of events defined in terms of both X and Y. In the case of only two random variables this is called a Bivariate Distribution; the concept generalizes to any number of random variables, giving a Multivariate Distribution.

Example Consider the roll of a die and let A = 1 if the number is even (2, 4, or 6) and A = 0 otherwise. Furthermore, let B = 1 if the number is prime (2, 3, or 5) and B = 0 otherwise. Find the Joint Distribution of A and B. P(A = 0, B = 0) = P({1}) = 1/6, P(A = 0, B = 1) = P({3, 5}) = 2/6, P(A = 1, B = 0) = P({4, 6}) = 2/6, P(A = 1, B = 1) = P({2}) = 1/6.
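As a cross-check, here is a minimal Python sketch (not part of the original slides) that enumerates the six equally likely outcomes and tallies the joint probabilities of A and B:

```python
from fractions import Fraction
from collections import Counter

# Enumerate the six equally likely die outcomes and classify each one.
counts = Counter()
for roll in range(1, 7):
    a = 1 if roll % 2 == 0 else 0       # A = 1 if the number is even
    b = 1 if roll in (2, 3, 5) else 0   # B = 1 if the number is prime
    counts[(a, b)] += 1

# Each outcome has probability 1/6, so divide each count by 6.
joint = {ab: Fraction(n, 6) for ab, n in counts.items()}
print(joint)
# {(0, 0): Fraction(1, 6), (1, 1): Fraction(1, 6),
#  (0, 1): Fraction(1, 3), (1, 0): Fraction(1, 3)}
```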

Example The joint distribution of the previous slide laid out as a table (X = A, Y = B):

              Y = 0    Y = 1    Row Total
X = 0         1/6      2/6      3/6 = 1/2
X = 1         2/6      1/6      3/6 = 1/2
Column Total  3/6      3/6      1

Joint Probability Distributions
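The formulas on this slide are images that did not survive the transcript. For reference, the standard definition of a discrete joint probability mass function, which the following slides rely on, is:

\[
f_{XY}(x, y) = P(X = x,\, Y = y), \qquad f_{XY}(x, y) \ge 0, \qquad \sum_{x}\sum_{y} f_{XY}(x, y) = 1 .
\]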

Discrete Joint Probability Distribution X and Y are Independent if f(x, y) = fX(x) fY(y) for all x and y; otherwise X and Y are Dependent.

Discrete Joint Probability Distribution A die is rolled and a coin is tossed, so f(x, y) = 1/12 for every (coin, die) pair, e.g. f(H, 1) = 1/12:

              Y = 1   Y = 2   Y = 3   Y = 4   Y = 5   Y = 6   Row Total
X = Head      1/12    1/12    1/12    1/12    1/12    1/12    1/2
X = Tail      1/12    1/12    1/12    1/12    1/12    1/12    1/2
Column Total  1/6     1/6     1/6     1/6     1/6     1/6     1

Marginal Probability Distributions Marginal Probability Distribution: the individual probability distribution of a random variable.
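In symbols (standard definition; the slide's formula is an image not captured in the transcript), the marginals of a discrete joint distribution are obtained by summing over the other variable:

\[
f_X(x) = \sum_{y} f_{XY}(x, y), \qquad f_Y(y) = \sum_{x} f_{XY}(x, y) .
\]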

Discrete Joint Probability Distribution A die is rolled and a coin is tossed; the row and column totals of the previous table are the marginal distributions:

                            Y = 1   Y = 2   Y = 3   Y = 4   Y = 5   Y = 6   Marginal Probability of X
X = Head                    1/12    1/12    1/12    1/12    1/12    1/12    1/2
X = Tail                    1/12    1/12    1/12    1/12    1/12    1/12    1/2
Marginal Probability of Y   1/6     1/6     1/6     1/6     1/6     1/6     1

Joint Probability Distributions

Continuous Two Dimensional Distribution
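The defining property (standard form, since the slide's formulas are images not in the transcript): a joint probability density function fXY satisfies

\[
P\bigl((X, Y) \in A\bigr) = \iint_{A} f_{XY}(x, y)\, dx\, dy, \qquad
f_{XY}(x, y) \ge 0, \qquad
\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f_{XY}(x, y)\, dx\, dy = 1 .
\]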

Joint Probability Distributions X: the time until a computer server connects to your machine; Y: the time until the server authorizes you as a valid user. Each of these random variables measures the wait from a common starting time, and X < Y. Assume that the joint probability density function for X and Y is as given. Find the probability that X < 1000 and Y < 2000.
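The density itself appears only as an image on the original slide, so the sketch below uses a made-up exponential density purely to illustrate how such a rectangle probability can be evaluated numerically; substitute the slide's density to reproduce the lecture's answer. It assumes SciPy is available.

```python
import math
from scipy import integrate

# Hypothetical joint density, for illustration only -- replace with the
# density given on the slide. dblquad expects the integrand as f(y, x).
def f(y, x):
    return 0.001 * 0.002 * math.exp(-0.001 * x - 0.002 * y)

# P(X < 1000 and Y < 2000): integrate over 0 < x < 1000, 0 < y < 2000.
prob, err = integrate.dblquad(f, 0, 1000, 0, 2000)
print(prob)  # about 0.62 for this placeholder density
```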

Joint Probability Distributions

Marginal Probability Distributions
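For a continuous joint density the marginals are obtained by integrating out the other variable (standard formulas; the slide's versions are images):

\[
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy, \qquad
f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx .
\]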

Functions of Random Variables Z = g(X, Y): if we roll two dice and X and Y are the numbers that turn up, then Z = X + Y is the sum of those two numbers. The distribution function of Z follows from the joint distribution of X and Y, in both the discrete and the continuous case.
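In the discrete case the p.m.f. of Z is obtained by summing f(x, y) over all pairs with g(x, y) = z (and by integrating in the continuous case). A minimal Python sketch for the two-dice example:

```python
from fractions import Fraction
from collections import Counter

# Z = X + Y for two fair dice: add up the joint probabilities f(x, y) = 1/36
# over all pairs (x, y) with x + y = z.
pmf_Z = Counter()
for x in range(1, 7):
    for y in range(1, 7):
        pmf_Z[x + y] += Fraction(1, 36)

for z in sorted(pmf_Z):
    print(z, pmf_Z[z])   # e.g. P(Z = 7) = 1/6
```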

Addition of Means The Mean (Expectation) of a sum of random variables equals the sum of Means (Expectations). E(X1 + X2 + … + Xn ) = E(X1) + E(X2) + … + E(Xn)

Multiplication of Means The Mean (Expectation) of the product of Independent random variables equals the product of Means (Expectations). E(X1 X2 … Xn ) = E(X1) E(X2) … E(Xn)

Independence Two continuous random variables X and Y are said to be Independent if fXY(x, y) = fX(x) fY(y) for all x and y.

Addition of Variances The Variance of the sum of Independent random variables equals the sum of the Variances of these variables. In general, σ² = σ1² + σ2² + 2σXY, where σXY = Covariance of X and Y = E(XY) − E(X) E(Y). If X and Y are independent, then E(XY) = E(X) E(Y), so σXY = 0 and σ² = σ1² + σ2².
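The relation follows from expanding the square (standard derivation, not shown in the transcript):

\[
\operatorname{Var}(X + Y) = E\bigl[(X + Y - \mu_X - \mu_Y)^2\bigr]
= \sigma_X^2 + \sigma_Y^2 + 2\,\sigma_{XY},
\]

with σXY = E(XY) − E(X)E(Y) = 0 when X and Y are independent, which gives the stated special case.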

Problem 1 Let f(x, y) = k when 8 ≤ x ≤ 12 and 0 ≤ y ≤ 2, and zero elsewhere. Find k. Find P(X ≤ 11 and 1 ≤ Y ≤ 1.5) and P(9 ≤ X ≤ 13 and Y ≤ 1).

Problem 3 Let f(x, y) = k if x > 0, y > 0, and x + y < 3, and zero otherwise. Find k. Find P(X + Y ≤ 1) and P(Y > X).

Problem 7 What are the mean thickness and standard deviation of transformer cores, each consisting of 50 layers of sheet metal and 49 insulating paper layers, if the metal sheets have mean thickness 0.5 mm each with a standard deviation of 0.05 mm and the paper layers have mean thickness 0.05 mm each with a standard deviation of 0.02 mm?
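A quick numerical check of this kind of problem (a sketch using the addition rules for means and variances, assuming the layer thicknesses are independent, which the problem statement does not spell out):

```python
import math

# Core thickness = sum of 50 metal layers and 49 paper layers (assumed independent).
n_metal, mu_metal, sd_metal = 50, 0.5, 0.05    # mm
n_paper, mu_paper, sd_paper = 49, 0.05, 0.02   # mm

mean_total = n_metal * mu_metal + n_paper * mu_paper        # sum of means
var_total = n_metal * sd_metal**2 + n_paper * sd_paper**2   # sum of variances
print(mean_total, math.sqrt(var_total))   # 27.45 mm and about 0.38 mm
```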

Problem 9 A 5-gear Assembly is put together with spacers between the gears. The mean thickness of the gears is 5.020 cm with a standard deviation of 0.003 cm. The mean thickness of spacers is 0.040 cm with a standard deviation of 0.002 cm. Find the mean and standard deviation of the assembled units consisting of 5 randomly selected gears and 4 randomly selected spacers.

Problem 11 Show that the random variables with densities f(x, y) = x + y and g(x, y) = (x + 1/2)(y + 1/2) for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, with f(x, y) = 0 and g(x, y) = 0 otherwise, have the same Marginal Distributions.

Problem 13 An electronic device consists of two components. Let X and Y [months] be the lengths of time until failure of the first and second components, respectively. Assume that X and Y have the Probability Density f(x, y) = 0.01 e^(−0.1(x + y)) if x > 0 and y > 0, and zero otherwise. a. Are X and Y dependent or independent? b. Find the densities of the Marginal Distributions. c. What is the probability that the first component has a lifetime of 10 months or longer?

Problem 15 Find P(X > Y) when (X, Y) has the Probability Density f(x, y) = 0.25 e^(−0.5(x + y)) if x ≥ 0 and y ≥ 0, and zero otherwise.

Problem 17 Let (X, Y) have the Probability Function f(0, 0) = f(1, 1) = 1/8 and f(0, 1) = f(1, 0) = 3/8. Are X and Y independent?

Marginal Probability Distributions Example: For the random variables in the previous example, calculate the probability that Y exceeds 2000 milliseconds.

Conditional Probability Distributions When two random variables are defined in a random experiment, knowledge of one can change the probabilities of the other.
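In symbols (standard definition, the slide's formula being an image): for fX(x) > 0,

\[
f_{Y \mid x}(y) = P(Y = y \mid X = x) = \frac{f_{XY}(x, y)}{f_X(x)} .
\]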

Conditional Mean and Variance
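The usual definitions, given here because the slide's formulas are images:

\[
E(Y \mid x) = \sum_{y} y\, f_{Y \mid x}(y), \qquad
V(Y \mid x) = \sum_{y} \bigl(y - E(Y \mid x)\bigr)^{2} f_{Y \mid x}(y) = E(Y^2 \mid x) - \bigl(E(Y \mid x)\bigr)^2 .
\]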

Conditional Mean and Variance Example: From the previous example, calculate P(Y = 1 | X = 3), E(Y | X = 1), and V(Y | X = 1).

Independence In some random experiments, knowledge of the value of X does not change any of the probabilities associated with the values of Y. If two random variables are independent, then fXY(x, y) = fX(x) fY(y) for all x and y, and consequently fY|x(y) = fY(y).

Multiple Discrete Random Variables Joint Probability Distributions Multinomial Probability Distribution

Joint Probability Distributions In some cases, more than two random variables are defined in a random experiment. The marginal probability mass function of any one of them is obtained by summing the joint probability mass function over all values of the other variables.

Joint Probability Distributions Mean and Variance

Joint Probability Distributions Conditional Probability Distributions Independence

Multinomial Probability Distribution A joint probability distribution for multiple discrete random variables that is quite useful in an extension of the binomial.
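The standard form of the multinomial distribution (the slide's formula is an image): if n trials fall independently into k classes with probabilities p1, …, pk, then

\[
P(X_1 = x_1, \ldots, X_k = x_k) = \frac{n!}{x_1!\, x_2! \cdots x_k!}\; p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k},
\qquad x_1 + \cdots + x_k = n .
\]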

Multinomial Probability Distribution Example: Of the 20 bits received, what is the probability that 14 are Excellent, 3 are Good, 2 are Fair, and 1 is Poor? Assume that the classifications of individual bits are independent events and that the probabilities of E, G, F, and P are 0.6, 0.3, 0.08, and 0.02, respectively. One sequence of 20 bits that produces the specified numbers of bits in each class can be represented as EEEEEEEEEEEEEEGGGFFP, with P(EEEEEEEEEEEEEEGGGFFP) = 0.6^14 · 0.3^3 · 0.08^2 · 0.02^1. The number of such sequences (permutations of similar objects) is 20!/(14! 3! 2! 1!) = 2,325,600.
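A minimal Python sketch of the same calculation (the numeric result is rounded):

```python
from math import factorial

# 14 Excellent, 3 Good, 2 Fair, 1 Poor out of 20 classified bits.
counts = {"E": 14, "G": 3, "F": 2, "P": 1}
probs  = {"E": 0.6, "G": 0.3, "F": 0.08, "P": 0.02}

n = sum(counts.values())                      # 20
n_sequences = factorial(n)
for c in counts.values():
    n_sequences //= factorial(c)              # 20!/(14! 3! 2! 1!) = 2,325,600

p_one_sequence = 1.0
for cls, c in counts.items():
    p_one_sequence *= probs[cls] ** c         # 0.6^14 * 0.3^3 * 0.08^2 * 0.02

print(n_sequences, n_sequences * p_one_sequence)   # 2325600 and about 0.0063
```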

Two Continuous Random Variables Joint Probability Distributions Marginal Probability Distributions Conditional Probability Distributions Independence

Conditional Probability Distributions

Conditional Probability Distributions Example: For the random variables in the previous example, determine the conditional probability density function for Y given that X = x, and determine P(Y > 2000 | X = 1500).

Conditional Probability Distributions Mean and Variance

Conditional Probability Distributions Example: For the random variables in the previous example, determine the conditional mean for Y given that X = 1500.

Independence

Independence Example: Let the random variables X and Y denote the lengths of two dimensions of a machined part. Assume that X and Y are independent, that the distribution of X is normal with mean 10.5 mm and variance 0.0025 mm², and that the distribution of Y is normal with mean 3.2 mm and variance 0.0036 mm². Determine the probability that 10.4 < X < 10.6 and 3.15 < Y < 3.25. Because X and Y are independent, this probability is the product P(10.4 < X < 10.6) · P(3.15 < Y < 3.25).
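A minimal check with SciPy (assuming SciPy is available; norm takes the standard deviation, i.e. the square root of the stated variances):

```python
from scipy.stats import norm

# X ~ N(10.5, sd=0.05) and Y ~ N(3.2, sd=0.06); the variances are 0.0025 and 0.0036.
p_x = norm.cdf(10.6, loc=10.5, scale=0.05) - norm.cdf(10.4, loc=10.5, scale=0.05)
p_y = norm.cdf(3.25, loc=3.2, scale=0.06) - norm.cdf(3.15, loc=3.2, scale=0.06)

# Independence lets us multiply the two marginal probabilities.
print(p_x * p_y)   # about 0.57
```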

Multiple Continuous Random Variables

Multiple Continuous Random Variables Marginal Probability

Multiple Continuous Random Variables Mean and Variance Independence

Covariance and Correlation When two or more random variables are defined on a probability space, it is useful to describe how they vary together, that is, to measure the relationship between the variables.

Covariance Covariance is a measure of the linear relationship between two random variables. It is defined in terms of the expected value of a function of two random variables, h(X, Y).
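For reference, the definition (standard; the slide's formula is an image):

\[
\operatorname{Cov}(X, Y) = \sigma_{XY} = E\bigl[(X - \mu_X)(Y - \mu_Y)\bigr] = E(XY) - \mu_X \mu_Y .
\]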

Covariance

Covariance

Covariance Example: For the discrete random variables X and Y with the joint distribution shown in the figure, determine the covariance σXY.

Correlation The correlation is a measure of the linear relationship between random variables that is easier to interpret than the covariance.
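In symbols (standard definition):

\[
\rho_{XY} = \frac{\operatorname{Cov}(X, Y)}{\sigma_X \sigma_Y} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y},
\qquad -1 \le \rho_{XY} \le 1 .
\]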

Correlation For independent random variables the covariance is zero, and therefore the correlation is also zero.

Correlation Example: For the two random variables X and Y with the joint distribution given on the slide, calculate the covariance and correlation between X and Y.

Bivariate Normal Distribution Correlation
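The density referred to here (standard form of the bivariate normal, since the slide's formula is an image) is

\[
f_{XY}(x, y) = \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho^2}}
\exp\!\left\{ -\frac{1}{2(1 - \rho^2)}
\left[ \frac{(x - \mu_X)^2}{\sigma_X^2}
- \frac{2\rho (x - \mu_X)(y - \mu_Y)}{\sigma_X \sigma_Y}
+ \frac{(y - \mu_Y)^2}{\sigma_Y^2} \right] \right\},
\]

where ρ is the correlation between X and Y.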

Bivariate Normal Distribution Marginal distributions Dependence

Bivariate Normal Distribution Conditional probability

Bivariate Normal Distribution Ex. Suppose that the X and Y dimensions of an injection-molded part have a bivariate normal distribution with the given parameters. Find P(2.95 < X < 3.05, 7.60 < Y < 7.80).

Bivariate Normal Distribution Ex. Let X and Y be the milliliters of acid and base needed for equivalence, respectively. Assume X and Y have a bivariate normal distribution with the given parameters. Find: the covariance between X and Y, the marginal probability distribution of X, P(X < 116), the conditional distribution of X given Y = 102, and P(X < 116 | Y = 102).

Linear Combination of random variables

Linear Combination of random variables Mean and Variance
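The rules used on the next slides, in symbols (standard results):

\[
E(aX + bY) = a\,E(X) + b\,E(Y), \qquad
V(aX + bY) = a^2 V(X) + b^2 V(Y) + 2ab\,\operatorname{Cov}(X, Y),
\]

and the covariance term vanishes when X and Y are independent.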

Linear Combination of random variables Ex. A semiconductor product consists of 3 layers. The variances in thickness of the first, second, and third layers are 25, 40, and 30 nm². What is the variance of the thickness of the final product? Let X1, X2, X3, and X be random variables that denote the thicknesses of the respective layers and of the final product. Assuming the layer thicknesses are independent, V(X) = V(X1) + V(X2) + V(X3) = 25 + 40 + 30 = 95 nm².

Discrete Joint Probability Distribution Given a bag containing 3 black balls, 2 blue balls, and 3 green balls, a random sample of 4 balls is selected. Find the Joint Probability Distribution of X and Y, where X = number of black balls and Y = number of blue balls in the sample.

f(x, y)       y = 0    y = 1    y = 2    Row Totals
x = 0         0/70     2/70     3/70     5/70
x = 1         3/70     18/70    9/70     30/70
x = 2         9/70     18/70    3/70     30/70
x = 3         3/70     2/70     0/70     5/70
Column Total  15/70    40/70    15/70    1
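The table entries can be reproduced with a short counting sketch (each cell is C(3, x)·C(2, y)·C(3, 4 − x − y)/C(8, 4)):

```python
from math import comb

# Joint p.m.f. of X = number of black balls and Y = number of blue balls in a
# sample of 4 drawn without replacement from 3 black, 2 blue and 3 green balls.
total = comb(8, 4)                      # 70 equally likely samples
for x in range(0, 4):                   # at most 3 black balls
    for y in range(0, 3):               # at most 2 blue balls
        g = 4 - x - y                   # the rest of the sample must be green
        if 0 <= g <= 3:
            num = comb(3, x) * comb(2, y) * comb(3, g)
            print(f"f({x}, {y}) = {num}/{total}")
```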

Marginal Probability Distributions The row totals give the marginal distribution of X and the column totals give the marginal distribution of Y:

f(x, y)                    y = 0    y = 1    y = 2    Marginal Probability of X
x = 0                      0/70     2/70     3/70     5/70
x = 1                      3/70     18/70    9/70     30/70
x = 2                      9/70     18/70    3/70     30/70
x = 3                      3/70     2/70     0/70     5/70
Marginal Probability of Y  15/70    40/70    15/70    1

Independence

Quiz # 4 Given a bag containing 3 black balls, 2 blue balls, and 3 green balls, a random sample of 4 balls is selected. Find the Joint Probability Distribution of X and Y, where X = number of black balls and Y = number of blue balls in the sample.