
Chapter 5 Joint Probability Distributions Joint, n. 1. a cheap, sordid place. 2. the movable place where two bones join. 3. one of the portions in which a carcass is divided by a butcher. 4. adj., shared or common to two or more Chapter 5A Discrete RV

This week in Prob/Stat: today’s good stuff (time permitting)

Joint Probability Distributions It is often useful (or necessary) to define more than one RV in a random experiment. Examples: polyethylene specs, X = melt point and Y = density; dimensions of a part, X = length and Y = width. If X and Y are two RVs, the probability distribution that defines their simultaneous behavior is a joint probability distribution.

Two Discrete Random Variables Let X = a discrete random variable, the number of orders placed per day for a high-cost item. Let Y = a discrete random variable, the number of items in stock. Joint probability mass function: f_XY(x,y)

5-1 Two Discrete Random Variables Joint Probability Distributions

Two Discrete Random Variables Let X = a discrete random variable, the number of orders placed per day for a high-cost item. Let Y = a discrete random variable, the number of items in stock. Pr{X=0, Y=1} = f_XY(0,1) = .15
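A joint pmf for two discrete RVs can be stored as a dictionary keyed by (x, y) pairs. Only f_XY(0,1) = .15 is quoted on this slide; the remaining entries below are a reconstruction chosen to agree with the marginal and conditional values quoted on later slides, so treat the specific numbers as a sketch rather than the textbook's table.

```python
# Joint pmf of X (orders per day, 0-2) and Y (items in stock, 0-3).
# Only f(0,1) = 0.15 appears on the slide; the other entries are
# reconstructed to match the marginals (.39,.33,.28 and .38,.41,.16,.05)
# used later in the chapter.
f_xy = {
    (0, 0): 0.20, (0, 1): 0.15, (0, 2): 0.03, (0, 3): 0.01,
    (1, 0): 0.10, (1, 1): 0.17, (1, 2): 0.05, (1, 3): 0.01,
    (2, 0): 0.08, (2, 1): 0.09, (2, 2): 0.08, (2, 3): 0.03,
}

# A valid pmf: every entry is nonnegative and the entries sum to 1.
assert all(p >= 0 for p in f_xy.values())
assert abs(sum(f_xy.values()) - 1.0) < 1e-9

print(f_xy[(0, 1)])  # Pr{X=0, Y=1} = 0.15
```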

5-1 Two Discrete Random Variables Marginal Probability Distributions The individual probability distribution of a random variable is referred to as its marginal probability distribution. The marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables. To determine P(X = x), we sum P(X = x, Y = y) over all points in the range of (X, Y) for which X = x. Subscripts on the probability mass functions distinguish between the random variables.

5-1 Two Discrete Random Variables Definition: Marginal Probability Mass Functions

Two Discrete Random Variables Let X = a discrete random variable, the number of orders placed per day for a high-cost item. Let Y = a discrete random variable, the number of items in stock. Pr{X = 1} = Pr{X=1, Y=0} + Pr{X=1, Y=1} + Pr{X=1, Y=2} + Pr{X=1, Y=3} = f_X(1) = .10 + .17 + .05 + .01 = .33 Pr{Y ≥ 2} = f_Y(2) + f_Y(3) = .16 + .05 = .21
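Summing the joint pmf over one variable gives the marginal of the other. A minimal sketch, using the reconstructed joint pmf from the earlier slide (only some of its entries are quoted in the original):

```python
from collections import defaultdict

# Reconstructed joint pmf (only some entries appear on the slides).
f_xy = {
    (0, 0): 0.20, (0, 1): 0.15, (0, 2): 0.03, (0, 3): 0.01,
    (1, 0): 0.10, (1, 1): 0.17, (1, 2): 0.05, (1, 3): 0.01,
    (2, 0): 0.08, (2, 1): 0.09, (2, 2): 0.08, (2, 3): 0.03,
}

f_x, f_y = defaultdict(float), defaultdict(float)
for (x, y), p in f_xy.items():
    f_x[x] += p  # sum over y with x fixed -> marginal of X
    f_y[y] += p  # sum over x with y fixed -> marginal of Y

print(round(f_x[1], 4))           # Pr{X = 1} = 0.33
print(round(f_y[2] + f_y[3], 4))  # Pr{Y >= 2} = 0.21
```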

Marginal Mean & Variance If the marginal probability distribution of X has the probability mass function f_X(x), then E(X) = μ_X = Σ_x x f_X(x) = Σ_R x f_XY(x,y) and V(X) = σ_X² = Σ_x (x − μ_X)² f_X(x) = Σ_R (x − μ_X)² f_XY(x,y), where R_x denotes all points in the range of (X,Y) for which X = x and R denotes all points in the range of (X,Y).

Using the Marginal Distributions E[X] = μ_X = 0(.39) + 1(.33) + 2(.28) = .89 E[Y] = μ_Y = 0(.38) + 1(.41) + 2(.16) + 3(.05) = .88 Var[X] = σ_X² = 0²(.39) + 1²(.33) + 2²(.28) − .89² = .6579 Var[Y] = σ_Y² = 0²(.38) + 1²(.41) + 2²(.16) + 3²(.05) − .88² = .7256
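The same means and variances can be computed directly from the joint pmf, since Σ_R x f_XY(x,y) = Σ_x x f_X(x). A sketch using the reconstructed joint pmf from the earlier slide:

```python
# Reconstructed joint pmf (only some entries appear on the slides).
f_xy = {
    (0, 0): 0.20, (0, 1): 0.15, (0, 2): 0.03, (0, 3): 0.01,
    (1, 0): 0.10, (1, 1): 0.17, (1, 2): 0.05, (1, 3): 0.01,
    (2, 0): 0.08, (2, 1): 0.09, (2, 2): 0.08, (2, 3): 0.03,
}

# E[X] = sum over all points of x * f(x,y); same for Y.
E_x = sum(x * p for (x, y), p in f_xy.items())
E_y = sum(y * p for (x, y), p in f_xy.items())

# Var = E[X^2] - (E[X])^2, again summing over the whole range.
Var_x = sum(x**2 * p for (x, y), p in f_xy.items()) - E_x**2
Var_y = sum(y**2 * p for (x, y), p in f_xy.items()) - E_y**2

print(round(E_x, 4), round(E_y, 4))      # 0.89 0.88
print(round(Var_x, 4), round(Var_y, 4))  # 0.6579 0.7256
```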

5-1.3 Conditional Probability Distributions

Conditional Distribution of Y f_{Y|x}(y) = f_XY(x,y) / f_X(x) f_{Y|x=1}(2) = f_XY(1,2) / f_X(1) = .05 / .33 = .1515

Conditional Distribution of X f_{X|y}(x) = f_XY(x,y) / f_Y(y) f_{X|y=2}(1) = f_XY(1,2) / f_Y(2) = .05 / .16 = .3125
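Both conditional pmfs are just the joint pmf restricted to one row or column and renormalized by the corresponding marginal. A sketch with the reconstructed joint pmf:

```python
# Reconstructed joint pmf (only some entries appear on the slides).
f_xy = {
    (0, 0): 0.20, (0, 1): 0.15, (0, 2): 0.03, (0, 3): 0.01,
    (1, 0): 0.10, (1, 1): 0.17, (1, 2): 0.05, (1, 3): 0.01,
    (2, 0): 0.08, (2, 1): 0.09, (2, 2): 0.08, (2, 3): 0.03,
}

f_x1 = sum(p for (x, y), p in f_xy.items() if x == 1)  # f_X(1) = 0.33
f_y2 = sum(p for (x, y), p in f_xy.items() if y == 2)  # f_Y(2) = 0.16

# Divide the fixed row (or column) by its marginal to renormalize.
f_y_given_x1 = {y: p / f_x1 for (x, y), p in f_xy.items() if x == 1}
f_x_given_y2 = {x: p / f_y2 for (x, y), p in f_xy.items() if y == 2}

print(round(f_y_given_x1[2], 4))  # 0.05 / 0.33 = 0.1515
print(round(f_x_given_y2[1], 4))  # 0.05 / 0.16 = 0.3125
```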

Conditional Mean and Variance

Conditional Mean & Var of Y f_{Y|x}(y) = f_XY(x,y) / f_X(x) E[Y|x=1] = 0(.30303) + 1(.51515) + 2(.15152) + 3(.03030) = .9091 Var[Y|x=1] = 0²(.30303) + 1²(.51515) + 2²(.15152) + 3²(.03030) − .9091² = .5675

Conditional Mean & Var of X f_{X|y}(x) = f_XY(x,y) / f_Y(y) E[X|y=2] = 0(.1875) + 1(.3125) + 2(.5) = 1.3125 Var[X|y=2] = 0²(.1875) + 1²(.3125) + 2²(.5) − 1.3125² = .5898
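A conditional mean and variance are ordinary mean-and-variance computations applied to the conditional pmf. The sketch below, again on the reconstructed joint pmf, reproduces both conditional summaries:

```python
# Reconstructed joint pmf (only some entries appear on the slides).
f_xy = {
    (0, 0): 0.20, (0, 1): 0.15, (0, 2): 0.03, (0, 3): 0.01,
    (1, 0): 0.10, (1, 1): 0.17, (1, 2): 0.05, (1, 3): 0.01,
    (2, 0): 0.08, (2, 1): 0.09, (2, 2): 0.08, (2, 3): 0.03,
}

def cond_mean_var(f_xy, fix_x=None, fix_y=None):
    """Mean and variance of the remaining variable given X=fix_x or Y=fix_y."""
    if fix_x is not None:
        row = {y: p for (x, y), p in f_xy.items() if x == fix_x}
    else:
        row = {x: p for (x, y), p in f_xy.items() if y == fix_y}
    total = sum(row.values())                 # the conditioning marginal
    cond = {v: p / total for v, p in row.items()}
    mean = sum(v * q for v, q in cond.items())
    var = sum(v**2 * q for v, q in cond.items()) - mean**2
    return mean, var

m1, v1 = cond_mean_var(f_xy, fix_x=1)   # E[Y|x=1], Var[Y|x=1]
m2, v2 = cond_mean_var(f_xy, fix_y=2)   # E[X|y=2], Var[X|y=2]
print(round(m1, 4), round(v1, 4))       # 0.9091 0.5675
print(round(m2, 4), round(v2, 4))       # 1.3125 0.5898
```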

5-1.4 Independence

Are X and Y Independent? f_X(1) f_Y(2) = (.33)(.16) = .0528 ≠ f_XY(1,2) = .05 f_X(2) f_Y(0) = (.28)(.38) = .1064 ≠ f_XY(2,0) = .08 No Chuck, they are not independent.
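Independence requires f_XY(x,y) = f_X(x) f_Y(y) at every point of the range, so a single failing point settles it. A sketch of the check on the reconstructed joint pmf:

```python
from collections import defaultdict

# Reconstructed joint pmf (only some entries appear on the slides).
f_xy = {
    (0, 0): 0.20, (0, 1): 0.15, (0, 2): 0.03, (0, 3): 0.01,
    (1, 0): 0.10, (1, 1): 0.17, (1, 2): 0.05, (1, 3): 0.01,
    (2, 0): 0.08, (2, 1): 0.09, (2, 2): 0.08, (2, 3): 0.03,
}

f_x, f_y = defaultdict(float), defaultdict(float)
for (x, y), p in f_xy.items():
    f_x[x] += p
    f_y[y] += p

# Independent iff the product of marginals matches the joint at EVERY point.
independent = all(abs(f_x[x] * f_y[y] - p) < 1e-9 for (x, y), p in f_xy.items())
print(independent)                # False
print(round(f_x[1] * f_y[2], 4))  # 0.0528, but f_XY(1,2) = 0.05
```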

More on Independence Many evaluations of independence are based on knowledge of the physical situation. If we are reasoning based on data, we will need statistical tools to help us. It is very, very unlikely that counts and estimated probabilities will yield exact equalities as in the conditions for establishing independence.

The Search for Independence Let X = a discrete random variable, the number of defects in a lot of size 3, where the probability of a defect is a constant .1. Let Y = a discrete random variable, the demand in a given day for the number of units from the above lot.

The Search Continues Assuming independence: f_XY(x,y) = f_X(x) f_Y(y) f_XY(1,2) = f_X(1) f_Y(2) = (.243)(.4) = .0972 Remember: P(A ∩ B) = P(A) P(B) if A and B are independent
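Under independence the whole joint pmf is built by multiplying the two marginals. Here X ~ Binomial(3, .1) as stated; only f_Y(2) = .4 appears on the slide, so the other demand probabilities below are placeholder assumptions just to make a complete pmf:

```python
from math import comb

# X ~ Binomial(n=3, p=0.1): number of defects in a lot of size 3.
# f_X(1) = 3 * 0.1 * 0.9**2 = 0.243, as on the slide.
f_x = {k: comb(3, k) * 0.1**k * 0.9**(3 - k) for k in range(4)}

# Demand pmf for Y: only f_Y(2) = 0.4 is given; the rest are assumed.
f_y = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

# Under independence, the joint pmf is the product of the marginals.
f_xy = {(x, y): f_x[x] * f_y[y] for x in f_x for y in f_y}

print(round(f_x[1], 4))        # 0.243
print(round(f_xy[(1, 2)], 4))  # 0.243 * 0.4 = 0.0972
```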

Recap - Sample Problem Assume X & Y are jointly distributed with the following joint probability mass function: [joint pmf table over x = −1, −0.5, 0.5, 1 and the values of y; the surviving entries include 1/8, 1/16, 3/16, and 1/4]

Sample Problem Cont’d Determine the marginal probability distribution of X P(X = -1) = 1/8 + 1/8 = 1/4 P(X = -0.5) = 1/16 + 1/16 = 1/8 P(X = 0.5) = 3/16 + 1/4 = 7/16 P(X = 1) = 1/16 + 1/8 = 3/16

Sample Problem Cont’d Determine the conditional probability distribution of Y given that X = 1. P(Y = 1 | X = 1) = P(X = 1, Y = 1)/P(X = 1) = (1/16)/(3/16) = 1/3 P(Y = 2 | X = 1) = P(X = 1, Y = 2)/P(X = 1) = (1/8)/(3/16) = 2/3

Sample Problem Cont’d Determine the conditional probability distribution of X given that Y = 1. P(X = 0.5 | Y = 1) = P(X = 0.5, Y = 1)/P(Y = 1) = (1/4)/(5/16) = 4/5 P(X = 1 | Y = 1) = P(X = 1, Y = 1)/P(Y = 1) = (1/16)/(5/16) = 1/5
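Exact fractions avoid rounding in problems like this. A sketch of the Y-given-X=1 computation above, using only the table entries that the slide's own arithmetic quotes (the full table did not survive transcription):

```python
from fractions import Fraction as F

# Entries f_XY(1, y) used in the slide's arithmetic for the X = 1 column.
f_x1_row = {1: F(1, 16), 2: F(1, 8)}

p_x1 = sum(f_x1_row.values())  # P(X = 1) = 3/16, matching the marginal slide

# Conditional pmf of Y given X = 1: divide each entry by P(X = 1).
cond = {y: p / p_x1 for y, p in f_x1_row.items()}

print(cond[1], cond[2])  # 1/3 2/3
```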

5-1.5 Multiple Discrete Random Variables Definition: Joint Probability Mass Function

5-1.5 Multiple Discrete Random Variables Definition: Marginal Probability Mass Function

5-1.5 Multiple Discrete Random Variables Mean and Variance from Joint Probability

5-1.6 Multinomial Probability Distribution

The Necessary Example Final inspection of products coming off the assembly line categorizes every item as either acceptable, needing rework, or rejected. Historically, 90 percent have been acceptable, 7 percent have needed rework, and 3 percent have been rejected. For the next 10 items produced, what is the probability that there will be 8 acceptable, 2 reworks, and no rejects? Let X1 = number acceptable, X2 = number of reworks, X3 = number of rejects
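The multinomial pmf answers this directly: P(X1=8, X2=2, X3=0) = [10!/(8! 2! 0!)] (.9)^8 (.07)^2 (.03)^0. A small helper (the function name is mine, not the textbook's):

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """P(X1=c1, ..., Xk=ck) for a multinomial with n = sum(counts) trials."""
    coef = factorial(sum(counts))        # n!
    for c in counts:
        coef //= factorial(c)            # divide by c1! c2! ... ck!
    p = float(coef)
    for c, q in zip(counts, probs):
        p *= q ** c                      # multiply by p1^c1 ... pk^ck
    return p

# 8 acceptable, 2 reworks, 0 rejects out of the next 10 items.
p = multinomial_pmf([8, 2, 0], [0.90, 0.07, 0.03])
print(round(p, 4))  # 0.0949
```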

More of the Necessary Example The production process is assumed to be out of control (i.e., the probability of an acceptable item is less than .9) if there are fewer than 8 acceptable items produced in a lot of size 10. What is the probability that the production process will be declared out of control when the probability of an acceptable item remains .9? Let X1 = number acceptable
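Since only the count of acceptable items matters here, X1 marginally follows a Binomial(10, .9), and the question asks for P(X1 < 8). A sketch of that tail sum:

```python
from math import comb

# X1 = number of acceptable items in a lot of 10, X1 ~ Binomial(10, 0.9).
p_acc = 0.9
prob_out_of_control = sum(
    comb(10, k) * p_acc**k * (1 - p_acc)**(10 - k)  # P(X1 = k)
    for k in range(8)                               # k = 0, ..., 7: fewer than 8
)
print(round(prob_out_of_control, 4))  # 0.0702
```

So even when the process is in fact still running at .9, it will be flagged as out of control about 7% of the time.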

This week in Prob/Stat Wednesday’s good stuff