
1 Bayes’ Rule

2 Bayes’ Rule - Updating Probabilities
Let A_1, …, A_k be a set of events that partition a sample space (mutually exclusive and exhaustive):
– each event has known probability P(A_i) > 0 (each event can occur)
– for any two events A_i and A_j (i ≠ j), P(A_i and A_j) = 0 (the events are disjoint)
– P(A_1) + … + P(A_k) = 1 (each outcome belongs to exactly one of the events)
If C is an event such that
– 0 < P(C) < 1 (C can occur, but will not necessarily occur)
– we know the probability that C will occur given each event A_i: P(C|A_i),
then we can compute the probability of A_i given that C occurred:
P(A_i|C) = P(C|A_i) P(A_i) / [ P(C|A_1) P(A_1) + … + P(C|A_k) P(A_k) ]
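As a sketch of how this update works numerically (not part of the original slides; the example numbers below are made up), assuming the priors P(A_i) and likelihoods P(C|A_i) are supplied as lists:

# Bayes' rule: posterior probabilities from priors P(A_i) and likelihoods P(C|A_i)
def posterior(priors, likelihoods):
    joint = [p * l for p, l in zip(priors, likelihoods)]   # P(A_i and C)
    p_c = sum(joint)                                        # P(C), law of total probability
    return [j / p_c for j in joint]                         # P(A_i | C)

# Two hypothetical events with equal priors but different likelihoods:
print(posterior([0.5, 0.5], [0.9, 0.1]))                    # -> [0.9, 0.1]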

3 Example - OJ Simpson Trial
Given information on the blood test (T+/T-):
– Sensitivity: P(T+|Guilty) = 1
– Specificity: P(T-|Innocent) = .9957, so P(T+|Innocent) = .0043
Suppose you have a prior belief of guilt: P(G) = p*.
What is the “posterior” probability of guilt after seeing evidence that the blood matches, P(G|T+)?
Source: B. Forst (1996). “Evidence, Probabilities and Legal Standards for Determination of Guilt: Beyond the OJ Trial”, in Representing OJ: Murder, Criminal Justice, and the Mass Culture, ed. G. Barak, pp. 22-28. Harrow and Heston, Guilderland, NY.
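A minimal sketch of this posterior calculation, assuming the sensitivity and false-positive rate quoted above (the prior values in the loop are illustrative, not taken from the slides):

# P(G|T+) = P(T+|G) p* / [ P(T+|G) p* + P(T+|I) (1 - p*) ]
def posterior_guilt(prior, sens=1.0, false_pos=0.0043):
    return sens * prior / (sens * prior + false_pos * (1 - prior))

for p_star in (0.1, 0.5, 0.9):                  # illustrative priors only
    print(p_star, round(posterior_guilt(p_star), 4))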

4 OJ Simpson Posterior (to Positive Test) Probabilities

5 Northern Army at Gettysburg
Regiments: a partition of the soldiers (A_1, …, A_9). Casualty: event C.
P(A_i) = (size of regiment) / (total soldiers) = (Column 3) / 95369
P(C|A_i) = (# casualties) / (regiment size) = (Col 4) / (Col 3)
P(C|A_i) P(A_i) = P(A_i and C) = (Col 5) * (Col 6)
P(C) = sum(Col 7)
P(A_i|C) = P(A_i and C) / P(C) = (Col 7) / .2416
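A sketch of how the slide's table columns are computed, with regiment sizes and casualty counts as inputs (the numbers below are placeholders; the slide's actual table is not reproduced in this transcript):

# Each regiment A_i: (size, casualties). Placeholder values, not the slide's data.
regiments = [(12000, 3000), (11000, 2500), (9000, 2000)]

total = sum(size for size, _ in regiments)                  # total soldiers
p_a   = [size / total for size, _ in regiments]             # P(A_i)
p_c_a = [cas / size for size, cas in regiments]             # P(C|A_i)
joint = [pa * pca for pa, pca in zip(p_a, p_c_a)]           # P(A_i and C)
p_c   = sum(joint)                                          # P(C)
p_a_c = [j / p_c for j in joint]                            # P(A_i|C)
print(p_c, p_a_c)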

6 CRAPS
Player rolls 2 dice (“come out roll”):
– 2, 3, 12 - Lose (Miss Out)
– 7, 11 - Win (Pass)
– 4, 5, 6, 8, 9, 10 - Makes a point. Roll until the point (Win) or a 7 (Lose)
Probability distribution for the first (any) roll: see the sketch below.
After the first roll: P(Win|2) = P(Win|3) = P(Win|12) = 0 and P(Win|7) = P(Win|11) = 1.
What about the other conditional probabilities if you make a point?
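The come-out-roll distribution can be enumerated directly; a minimal sketch (not from the slides):

from fractions import Fraction
from collections import Counter

# Distribution of the sum of two fair dice (the come-out roll)
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
dist = {s: Fraction(n, 36) for s, n in counts.items()}
print(dist[7], dist[11], dist[2] + dist[3] + dist[12])      # 1/6, 1/18, 1/9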

7 CRAPS
Suppose you make a point (4, 5, 6, 8, 9, or 10):
– You win if your point occurs before a 7; otherwise you lose, and either way you stop
– Let P mean you make your point on a roll
– Let C mean you continue rolling (neither your point nor 7)
– You win on any of the mutually exclusive events: P, CP, CCP, …, CC…CP, …
If your point is 4 or 10, then P(P) = 3/36 and P(C) = 27/36.
By independence and the multiplicative and additive rules:
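The formula shown on the slide at this point can be reconstructed as a geometric series over the events listed above (my reconstruction, not the slide's own image):

P(\text{Win}\mid \text{point } 4 \text{ or } 10)
  = \sum_{n=0}^{\infty} P(C)^{n}\,P(P)
  = \frac{P(P)}{1 - P(C)}
  = \frac{3/36}{9/36} = \frac{1}{3}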

8 CRAPS
Similar patterns arise for points 5, 6, 8, and 9:
– For 5 and 9: P(P) = 4/36, P(C) = 26/36, so P(Win|point) = (4/36) / (10/36) = 2/5
– For 6 and 8: P(P) = 5/36, P(C) = 25/36, so P(Win|point) = (5/36) / (11/36) = 5/11
Finally, we can obtain the player’s probability of winning:
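Combining the come-out-roll probabilities with these conditional probabilities gives the overall chance of winning (a reconstruction of the slide's calculation; the final value follows from the numbers above):

P(\text{Win}) = \frac{4}{36}(0) + \frac{8}{36}(1)
  + \frac{6}{36}\cdot\frac{1}{3}
  + \frac{8}{36}\cdot\frac{2}{5}
  + \frac{10}{36}\cdot\frac{5}{11}
  = \frac{244}{495} \approx 0.493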

9 CRAPS - P(Winning)
Note that on the previous slides we derived P(Win|Roll); we multiply each by P(Roll) to obtain P(Roll and Win), and sum those to get P(Win). The last column of the slide’s table gives the probability of each come out roll given that we won: P(Roll|Win) = P(Roll and Win) / P(Win).
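A sketch that carries out both steps with exact fractions (the grouping of the rolls is mine; the probabilities come from the earlier slides):

from fractions import Fraction as F

# Come-out roll groups: (P(Roll), P(Win|Roll)) from the previous slides
rolls = {
    "2, 3, 12": (F(4, 36),  F(0)),
    "7 or 11":  (F(8, 36),  F(1)),
    "4 or 10":  (F(6, 36),  F(1, 3)),
    "5 or 9":   (F(8, 36),  F(2, 5)),
    "6 or 8":   (F(10, 36), F(5, 11)),
}
p_win = sum(p * w for p, w in rolls.values())               # P(Win) = 244/495
for name, (p, w) in rolls.items():
    print(name, (p * w) / p_win)                            # P(Roll | Win)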

