Presentation on theme: "Conditional Probability Mass Function. Introduction P[A|B] is the probability of an event A, given that we know that some other event B has occurred."— Presentation transcript:

1 Conditional Probability Mass Function

2 Introduction P[A|B] is the probability of an event A, given that we know that some other event B has occurred. Unless A and B are independent, B will affect the probability of A. Example: We choose one of two coins, a fair coin and a weighted coin, and toss it 4 times. What is the probability of observing 2 or more heads? The probability depends on which coin is selected (the condition). p_X[k | coin 1 chosen] is a binomial PMF that depends on p_1, and p_X[k | coin 2 chosen] is a binomial PMF that depends on p_2.
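A minimal sketch of these two conditional binomial PMFs; the probabilities of heads p_1 = 1/4 and p_2 = 3/4 are taken from the numerical example on slide 7 and are assumptions at this point in the deck:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P[X = k] for X ~ bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

p1, p2 = 0.25, 0.75  # assumed heads probabilities for the two coins
n = 4                # number of tosses

for label, p in (("coin 1", p1), ("coin 2", p2)):
    # P[2 or more heads | coin chosen] = sum of the binomial PMF over k >= 2
    prob = sum(binomial_pmf(k, n, p) for k in range(2, n + 1))
    print(f"P[2 or more heads | {label}] = {prob:.4f}")
```

This prints about 0.2617 for coin 1 and 0.9492 for coin 2, showing how strongly the conditioning event (which coin was chosen) affects the probability.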

3 Conditional Probability Mass Function Let X be the discrete RV describing the outcome of the coin choice. Since S_X = {1,2}, we assign a PMF to X of p_X[1] = α and p_X[2] = 1 − α, where α is the probability of choosing coin 1. The second part of the experiment consists of tossing the chosen coin 4 times in succession, so S_Y = {0,1,2,3,4}. The event A corresponds to 2 or more heads.

4 Conditional Probability Mass Function Only the PMF p_Y[j] is needed to determine the desired probability. To do so we need the joint PMF p_{X,Y}[i,j] = P[X = i, Y = j] (definition of joint PMF). By using the definition of conditional probability for events we have p_{X,Y}[i,j] = P[Y = j | X = i] P[X = i] (definition of cond. prob.), and then p_Y[j] = Σ_i p_{X,Y}[i,j] (definition of marginal PMF).

5 Conditional Probability Mass Function P[Y = j | X = i] can be determined from the experimental description: the chosen coin is tossed 4 times, so P[Y = j | X = i] = C(4,j) p_i^j (1 − p_i)^{4−j} for j = 0,1,…,4, where p_i is the probability of heads for coin i. p_X[i] is given earlier. Note that the probability depends on the outcome X = i via p_i. For a given value of X = i, the probability has all the usual properties of a PMF: it is nonnegative and sums to one over j.

6 Conditional Probability Mass Function Then p_{Y|X}[j|i] = P[Y = j | X = i] is a conditional PMF. Now that we know p_{Y|X}[j|i] and p_X[i], we have p_{X,Y}[i,j] = p_{Y|X}[j|i] p_X[i].

7 Conditional Probability Mass Function Finally, the desired probability of event A is P[A] = Σ_{j=2}^{4} p_Y[j] = Σ_{j=2}^{4} Σ_{i=1}^{2} p_{Y|X}[j|i] p_X[i]. The joint PMF is then given by p_{X,Y}[i,j] = C(4,j) p_i^j (1 − p_i)^{4−j} p_X[i]. As an example, if p_1 = 1/4 and p_2 = 3/4, we have for α = 1/2 that P[A] = 0.6055, but if α = 1/8, then P[A] = 0.8633. Why?
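A short sketch that reproduces the slide's numbers via the law of total probability; α is the probability of choosing coin 1, as assumed above:

```python
from math import comb

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def prob_A(alpha, p1=0.25, p2=0.75, n=4):
    """P[A] = P[2 or more heads] = sum_{j>=2} sum_i p_Y|X[j|i] p_X[i]."""
    p_x = {1: alpha, 2: 1 - alpha}     # PMF of the coin choice
    p_heads = {1: p1, 2: p2}           # heads probability of each coin
    return sum(binomial_pmf(j, n, p_heads[i]) * p_x[i]
               for j in range(2, n + 1) for i in (1, 2))

print(f"{prob_A(0.5):.4f}")    # 0.6055, matching the slide
print(f"{prob_A(0.125):.4f}")  # 0.8633
```

The second probability is larger because a small α means the weighted coin (p_2 = 3/4), which produces many heads, is chosen most of the time.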

8 Conditional Probability Mass Function The conditional PMF can be expressed as p_{Y|X}[j|i] = p_{X,Y}[i,j] / p_X[i]. To make the connection with conditional probability, let's rename the events: A_j = {Y = j} and B_i = {X = i}. Then p_{Y|X}[j|i] = P[A_j ∩ B_i] / P[B_i] = P[A_j | B_i]. Hence, p_{Y|X}[j|i] is a conditional probability for the events A_j and B_i.

9 Joint, Conditional, and Marginal PMFs The conditional PMF is defined as p_{Y|X}[y_j | x_i] = p_{X,Y}[x_i, y_j] / p_X[x_i]. Each PMF in the family is a valid PMF when x_i is considered to be a constant: Σ_j p_{Y|X}[y_j | x_i] = 1 for each x_i, but not Σ_i p_{Y|X}[y_j | x_i] = 1. In the previous example { p_{Y|X}[j|1], p_{Y|X}[j|2] } is a family of valid PMFs.

10 Example: Two Dice Toss Two dice are tossed. All outcomes are equally likely. The numbers of dots are added together. What is the conditional PMF of the sum if it is known that the sum is even? Let X = 0 if the sum is odd and X = 1 if the sum is even, and let Y be the sum. We wish to determine p_{Y|X}[j|0] and p_{Y|X}[j|1] for all j. The sample space for Y is S_Y = {2,3,…,12}.

11 Example: Two Dice Toss The conditional probability of the sum being even and also equaling j is p_{Y|X}[j|1] = P[Y = j, X = 1] / P[X = 1] = (N_j/36) / (1/2) = N_j/18 for even j, where N_j is the number of outcomes in S_{X,Y} for which the sum is j.

12 Example: Two Dice Toss For even j the counts are N_2 = 1, N_4 = 3, N_6 = 5, N_8 = 5, N_10 = 3, N_12 = 1, so p_{Y|X}[j|1] = N_j/18. Note that Σ_j p_{Y|X}[j|1] = 18/18 = 1.
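A brute-force check of this conditional PMF by enumerating all 36 equally likely outcomes; the N_j/18 values fall out of the normalization:

```python
from fractions import Fraction
from collections import defaultdict

# Joint PMF p_X,Y[i, j], where i = 1 if the sum is even, 0 otherwise.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        s = d1 + d2
        joint[(s % 2 == 0, s)] += Fraction(1, 36)

for even in (True, False):
    p_x = sum(p for (e, _), p in joint.items() if e == even)   # = 1/2
    cond = {j: p / p_x for (e, j), p in joint.items() if e == even}
    label = "p_Y|X[j|1]" if even else "p_Y|X[j|0]"
    print(label, dict(sorted(cond.items())))
```

Each conditional PMF sums to one over j, illustrating the normalization noted on the slide.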

13 Properties of PMF Property 1. Joint PMF yields conditional PMFs. If the joint PMF p_{X,Y}[x_i, y_j] is known, then the conditional PMFs are p_{Y|X}[y_j | x_i] = p_{X,Y}[x_i, y_j] / Σ_j p_{X,Y}[x_i, y_j] and p_{X|Y}[x_i | y_j] = p_{X,Y}[x_i, y_j] / Σ_i p_{X,Y}[x_i, y_j]. Hence, the conditional PMF is the joint PMF with x_i fixed and then normalized so that it sums to one.

14 Properties of PMF Property 2. Conditional PMFs are related: p_{X|Y}[x_i | y_j] = p_{Y|X}[y_j | x_i] p_X[x_i] / p_Y[y_j]. (*) Proof: p_{X|Y}[x_i | y_j] = p_{X,Y}[x_i, y_j] / p_Y[y_j], but p_{X,Y}[x_i, y_j] = p_{Y,X}[y_j, x_i], therefore using p_{X,Y}[x_i, y_j] = p_{Y|X}[y_j | x_i] p_X[x_i] yields the desired result.

15 Properties of PMF Property 3. Conditional PMF is expressible using Bayes' rule: p_{X|Y}[x_i | y_j] = p_{Y|X}[y_j | x_i] p_X[x_i] / Σ_k p_{Y|X}[y_j | x_k] p_X[x_k]. (**) Proof: From Property 1, p_Y[y_j] = Σ_k p_{X,Y}[x_k, y_j] = Σ_k p_{Y|X}[y_j | x_k] p_X[x_k]; substituting this into (*) yields the desired result (**).

16 Properties of PMF Property 4. Conditional PMF and its corresponding marginal PMF yield the joint PMF: p_{X,Y}[x_i, y_j] = p_{Y|X}[y_j | x_i] p_X[x_i]. Property 5. Conditional PMF and its corresponding marginal PMF yield the other marginal PMF: p_Y[y_j] = Σ_i p_{Y|X}[y_j | x_i] p_X[x_i]. This is the law of total probability.

17 Conditional PMF Relationships One can also interchange X and Y for similar results.

18 Simplifying Probability Calculations Using Conditioning Conditional PMFs can be used to simplify probability calculations. Example: find the PMF of Z = X + Y, where X and Y are independent. If X were known, say X = i, we could find the PMF of Z because Z = i + Y is a transformation of one discrete RV into another discrete RV Z: p_{Z|X}[j|i] = p_{Y|X}[j − i | i]. (*) To find the unconditional PMF of Z we use Property 5: p_Z[j] = Σ_i p_{Z|X}[j|i] p_X[i] = Σ_i p_{Y|X}[j − i | i] p_X[i], using (*). If X and Y are independent, so that p_{Y|X} = p_Y, then p_Z[j] = Σ_i p_Y[j − i] p_X[i], a discrete convolution.
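A minimal sketch of the resulting discrete convolution for independent X and Y; the fair-die example at the end is illustrative only:

```python
from collections import defaultdict

def pmf_of_sum(p_x, p_y):
    """PMF of Z = X + Y for independent X and Y:
    p_Z[j] = sum_i p_X[i] * p_Y[j - i]  (a discrete convolution)."""
    p_z = defaultdict(float)
    for i, px in p_x.items():
        for k, py in p_y.items():
            p_z[i + k] += px * py
    return dict(sorted(p_z.items()))

# Example: the sum of two independent fair dice.
die = {k: 1 / 6 for k in range(1, 7)}
print(pmf_of_sum(die, die))  # p_Z[7] = 6/36, the largest value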

19 Mean of the Conditional PMF We can determine attributes such as the expected value of a RV Y when it is known that X = x_i. The mean of the conditional PMF is E_{Y|X}[Y | x_i] = Σ_j y_j p_{Y|X}[y_j | x_i]. It is a constant when x_i is fixed; generally, the mean is a function of x_i, and the conditional means for different values of x_i are usually not equal. Example: Two dice are tossed and the RV of interest is the sum, conditioned on whether the sum is even or odd. In this particular case the two conditional means happen to coincide, as the check below shows.
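A quick enumeration of the dice example; since all 36 outcomes are equally likely, the conditional mean is just the average of the sums with the given parity:

```python
from fractions import Fraction

# E_Y|X[Y | even] and E_Y|X[Y | odd] for the sum of two fair dice.
sums = [d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7)]
for parity, name in ((0, "even"), (1, "odd")):
    vals = [s for s in sums if s % 2 == parity]
    mean = Fraction(sum(vals), len(vals))
    print(f"E[Y | sum {name}] = {mean}")  # both turn out to equal 7 here
```

Both conditional means equal 7 for this symmetric example, even though in general E_{Y|X}[Y | x_i] varies with x_i.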

20 Example: Toss One of Two Dice Two dice are given: D_1 = {1,2,3,4,5,6} and D_2 = {2,3,2,3,2,3}. A die is selected at random and tossed. What is the expected number of dots observed for the tossed die? We can view this problem as a conditional one by letting X = 1 if die 1 is chosen, X = 2 if die 2 is chosen, and Y be the number of dots observed. Thus, we wish to determine E_{Y|X}[Y|1] and E_{Y|X}[Y|2].

21 Example: Toss One of Two Dice What is the unconditional mean (the mean of Y)? The unconditional mean is the expected number of dots observed without first conditioning on which die was chosen. Intuitively, since each die is equally likely, it should be the average of the two conditional means. (Figure: simulated outcomes when die 1 is chosen, sample mean 3.88 versus true mean 3.5, and when die 2 is chosen, sample mean 3.58 versus true mean 2.5.)

22 Unconditional Mean Let us determine E_Y[Y] for the following experiment: 1. Choose die 1 or die 2, each with probability 1/2. 2. Toss the chosen die. 3. Count the number of dots on the face of the tossed die; that count is the RV Y. To determine the theoretical mean of Y we need p_Y[j].

23 Unconditional Mean By the law of total probability, p_Y[j] = Σ_i p_{Y|X}[j|i] p_X[i], which gives p_Y[1] = p_Y[4] = p_Y[5] = p_Y[6] = 1/12 and p_Y[2] = p_Y[3] = 1/3. Thus the unconditional mean becomes E_Y[Y] = Σ_j j p_Y[j] = 3. The other way to find the unconditional mean is E_Y[Y] = Σ_i E_{Y|X}[Y|i] p_X[i] = (1/2)(3.5) + (1/2)(2.5) = 3. That is the average of the conditional means.
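A sketch computing the unconditional mean both ways for this experiment; exact fractions are used so the two answers can be compared directly:

```python
from fractions import Fraction

die1 = [1, 2, 3, 4, 5, 6]
die2 = [2, 3, 2, 3, 2, 3]
half = Fraction(1, 2)

# Way 1: build p_Y[j] by the law of total probability, then average.
p_y = {j: half * Fraction(die1.count(j), 6) + half * Fraction(die2.count(j), 6)
       for j in range(1, 7)}
mean1 = sum(j * p for j, p in p_y.items())

# Way 2: average the conditional means E_Y|X[Y|i], weighted by p_X[i].
mean2 = half * Fraction(sum(die1), 6) + half * Fraction(sum(die2), 6)

print(mean1, mean2)  # both equal 3
```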

24 Unconditional Mean (Proof) In general the unconditional mean is found as E_Y[Y] = Σ_i E_{Y|X}[Y | x_i] p_X[x_i]. Proof: Σ_i E_{Y|X}[Y | x_i] p_X[x_i] = Σ_i ( Σ_j y_j p_{Y|X}[y_j | x_i] ) p_X[x_i] (def. of cond. mean) = Σ_j y_j Σ_i p_{X,Y}[x_i, y_j] (def. of cond. PMF) = Σ_j y_j p_Y[y_j] (marginal PMF from joint PMF) = E_Y[Y].

25 Modeling Human Learning Many models of human learning employ a Bayesian framework. A child learns by attempting to pick up a toy, dropping it, and picking it up again after having learned something. Each time the experiment, "attempting to pick up the toy," is repeated, the child learns something, or equivalently narrows down the number of strategies. Using the Bayesian framework, we are able to discriminate the right strategy with more accuracy as we repeatedly perform an experiment and observe its output.

26 Modeling Human Learning: Example Suppose we wish to "learn" whether a coin is fair (p = 1/2) or is weighted (p ≠ 1/2). Our certainty that the coin is fair or not will increase as the number of trials increases. In the Bayesian model we assume that p is a RV. In reality the coin has a fixed probability, but it is unknown to us. Let the probability of heads be denoted by the RV Y and its values by y_j. The PMF p_Y[y_j] is the prior PMF; it summarizes our state of knowledge before the experiment is performed.

27 Modeling Human Learning: Example Let N be the number of coin tosses and X denote the number of heads observed in the N tosses. X ~ bin(N, p), i.e. X is binomially distributed; however, the probability of heads Y is unknown. We can only specify the PMF of X conditionally: given Y = y_j, the conditional PMF of the number of heads for X = i is p_{X|Y}[i | y_j] = C(N,i) y_j^i (1 − y_j)^{N−i} for i = 0,1,…,N. We are interested in the probability of heads after observing the outcomes of the N coin tosses, i.e. in the PMF p_{Y|X}[y_j | i]. This is the posterior PMF, since it is determined after the experiment is performed.

28 Modeling Human Learning: Example The posterior PMF p_{Y|X}[y_j | i] contains all the information about the probability of heads that results from our prior knowledge, summarized by p_Y, and our "data" knowledge, summarized by p_{X|Y}. The posterior PMF is given by Bayes' rule with x_i = i as p_{Y|X}[y_j | i] = p_{X|Y}[i | y_j] p_Y[y_j] / Σ_k p_{X|Y}[i | y_k] p_Y[y_k]. Note that p_{Y|X}[y_j | i] depends on the observed number of heads i.
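A sketch of this Bayesian update. The three-point uniform prior over candidate heads probabilities is a hypothetical choice for illustration; the deck does not preserve the prior used in its figures:

```python
from math import comb

def posterior(i, N, prior):
    """p_Y|X[y_j | i] via Bayes' rule: binomial likelihood times prior,
    normalized over all candidate values of the heads probability."""
    likelihood = {y: comb(N, i) * y**i * (1 - y)**(N - i) for y in prior}
    norm = sum(likelihood[y] * prior[y] for y in prior)
    return {y: likelihood[y] * prior[y] / norm for y in prior}

# Hypothetical uniform prior over three candidate heads probabilities.
prior = {0.25: 1 / 3, 0.5: 1 / 3, 0.75: 1 / 3}
print(posterior(i=7, N=10, prior=prior))  # mass shifts toward y = 0.75
```

As N grows, the posterior concentrates on the value of y closest to the coin's true heads probability, which is the "learning" the slides describe.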

29 Modeling Human Learning: Example

30 Problems A fair coin is tossed. If it comes up heads, then X = 1, and if it comes up tails, then X = 0. Next, a point is selected at random from region A if X = 1 and from region B if X = 0, as shown in the slide's figure (a square of area 4 containing regions A and B, each of area 3/2). If the point selected is in an upper quadrant, we set Y = 1, and if it is in a lower quadrant, we set Y = 0. Find the conditional PMF p_{Y|X}[j|i] for all values of i and j. Next, compute P[Y = 0].

31 Problems Prove that …

32 Problems If X and Y are independent RVs, find the PMF of Z = |X − Y|. Assume that S_X = {0,1,…} and S_Y = {0,1,…}. Hint: proceed as in the Z = X + Y example by first finding p_{Z|X}[j|i], viewing Z = |i − Y| as a transformation of Y, and then applying Property 5.

