
1 Discrete Mathematics and Its Applications Sixth Edition By Kenneth Rosen Chapter 6 Discrete Probability 歐亞書局

2 6.1 An Introduction to Discrete Probability 6.2 Probability Theory 6.3 Bayes’ Theorem 6.4 Expected Value and Variance

3 6.1 An Introduction to Discrete Probability Gambling –Tossing a die Finite Probability –We will restrict ourselves to experiments that have finitely many outcomes Experiment: a procedure that yields one of a given set of possible outcomes Sample space: the set of possible outcomes Event: a subset of the sample space

4 Definition 1: If S is a finite nonempty sample space of equally likely outcomes, and E is an event (a subset of S), then the probability of E is p(E) = |E|/|S|. –Ex.1-8 –Lotteries –Poker –Sampling without replacement –Sampling with replacement
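
As an illustration of Definition 1, the following sketch computes two probabilities by counting; the specific lottery parameters (picking 6 of 40 numbers) and the four-of-a-kind poker hand are assumptions chosen for illustration, not taken from the slides.

```python
from math import comb

# Definition 1: p(E) = |E| / |S| for equally likely outcomes.
# Illustrative lottery: a player picks 6 numbers out of 40 (assumed parameters).
sample_space_size = comb(40, 6)   # |S|: all possible 6-number draws
winning_outcomes = 1              # |E|: exactly one winning combination
p_win = winning_outcomes / sample_space_size
print(f"p(win) = 1/{sample_space_size} ≈ {p_win:.2e}")

# Poker-style event: a 5-card hand from a 52-card deck contains four of a kind.
four_of_a_kind = 13 * comb(4, 4) * 48      # choose the kind, then the fifth card
p_four = four_of_a_kind / comb(52, 5)
print(f"p(four of a kind) ≈ {p_four:.6f}")
```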

5 The probability of combinations of events –Theorem 1: Let E be an event in a sample space S. The probability of the event Ē, the complementary event of E, is given by: p(Ē) = 1 – p(E). Proof Ex.8
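
A minimal sketch of Theorem 1, assuming a setup along the lines of the cited example: a 10-bit string generated uniformly at random, with E the event that it contains at least one 0.

```python
# Theorem 1: p(Ē) = 1 - p(E).
# It is easier to count the complement Ē = "all ten bits are 1".
total_strings = 2 ** 10
p_complement = 1 / total_strings          # only one all-ones string
p_at_least_one_zero = 1 - p_complement
print(p_at_least_one_zero)                # 1023/1024 ≈ 0.999
```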

6 Theorem 2: Let E1 and E2 be events in the sample space S. Then p(E1 ∪ E2) = p(E1) + p(E2) − p(E1 ∩ E2). –Ex.9 Probabilistic reasoning –Determining which of two events is more likely –Ex.10
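
A small check of Theorem 2, assuming an illustrative experiment: an integer from 1 to 100 is picked uniformly at random, E1 = "divisible by 2", E2 = "divisible by 5".

```python
# Theorem 2: p(E1 ∪ E2) = p(E1) + p(E2) - p(E1 ∩ E2).
S = range(1, 101)
E1 = {n for n in S if n % 2 == 0}
E2 = {n for n in S if n % 5 == 0}

lhs = len(E1 | E2) / len(S)                                    # direct count of the union
rhs = len(E1)/len(S) + len(E2)/len(S) - len(E1 & E2)/len(S)    # inclusion-exclusion
print(lhs, rhs)                                                # both ≈ 0.6
```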

7 6.2 Probability Theory Assigning probabilities –We assign a probability p(s) to each outcome s: 0 ≤ p(s) ≤ 1 for each s ∈ S (S: finite or countable number of outcomes), and Σ_{s∈S} p(s) = 1 –For n possible outcomes x1, x2, …, xn: 0 ≤ p(xi) ≤ 1 for i = 1, 2, …, n, and Σ_{i=1..n} p(xi) = 1 –p: probability distribution Ex.1
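
A minimal sketch of assigning a probability distribution, with assumed weights: a biased coin whose heads outcome is twice as likely as tails.

```python
# Assigning a probability distribution over a finite sample space.
# Assumed weights: p(H) = 2/3 and p(T) = 1/3.
dist = {"H": 2/3, "T": 1/3}

# Check the two conditions from the slide.
assert all(0 <= p <= 1 for p in dist.values())
assert abs(sum(dist.values()) - 1.0) < 1e-12
```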

8 Definition 1: Suppose that S is a set with n elements. The uniform distribution assigns the probability 1/n to each element of S. Definition 2: The probability of the event E is the sum of the probabilities of the outcomes in E. That is, p(E) = Σ_{s∈E} p(s) –Selecting at random –Ex.2
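
A sketch of Definition 2 with an assumed loaded die (the face 3 twice as likely as each other face); both the weights and the event are illustrative.

```python
# Definition 2: p(E) = Σ_{s∈E} p(s).
# Assumed loaded die: p(3) = 2/7, p(s) = 1/7 for every other face.
dist = {s: (2/7 if s == 3 else 1/7) for s in range(1, 7)}

odd = {1, 3, 5}                       # the event "an odd number appears"
p_odd = sum(dist[s] for s in odd)
print(p_odd)                          # 4/7 ≈ 0.571
```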

9 Combinations of events –The same as Sec.6.1: p(Ē) = 1 − p(E), p(E1 ∪ E2) = p(E1) + p(E2) − p(E1 ∩ E2) –Theorem 1: If E1, E2, … is a sequence of pairwise disjoint events in a sample space S, then p(∪_i Ei) = Σ_i p(Ei).

10 Conditional Probability Definition 3: Let E and F be events with p(F) > 0. The conditional probability of E given F, denoted by p(E|F), is defined as p(E|F) = p(E ∩ F)/p(F). –Ex.3-4 Independence –p(E|F) = p(E) Definition 4: The events E and F are independent if and only if p(E ∩ F) = p(E)p(F) –Ex.5-7
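
A small check of Definitions 3 and 4 on an assumed sample space: all 4-bit strings, each equally likely, with E = "the first bit is 1" and F = "the string has an even number of 1s".

```python
from itertools import product

S = list(product("01", repeat=4))
E = [s for s in S if s[0] == "1"]
F = [s for s in S if s.count("1") % 2 == 0]
EF = [s for s in E if s in F]

p = lambda A: len(A) / len(S)
print(p(EF) / p(F))          # p(E|F) = p(E ∩ F)/p(F) = 0.5
print(p(EF), p(E) * p(F))    # equal, so E and F are independent (Definition 4)
```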

11 Bernoulli Trials and the Binomial Distribution Bernoulli trial: each performance of an experiment with two possible outcomes –A success, or a failure –Ex.8 Theorem 2: The probability of exactly k successes in n independent Bernoulli trials, with probability of success p and probability of failure q = 1 − p, is C(n,k)p^k q^(n−k). –Denoted by b(k; n, p), the binomial distribution –Ex.9
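
A sketch of Theorem 2's formula b(k; n, p); the trial count and success probability below are illustrative assumptions.

```python
from math import comb

def b(k, n, p):
    """Probability of exactly k successes in n independent Bernoulli trials
    with success probability p (Theorem 2)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Assumed numbers: a biased coin with success probability 1/3, tossed 7 times.
print(b(4, 7, 1/3))                          # p(exactly 4 successes) ≈ 0.128
print(sum(b(k, 7, 1/3) for k in range(8)))   # the distribution sums to 1
```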

12 Random Variable Definition 5: A random variable is a function from the sample space of an experiment to the set of real numbers. –Despite the name, it is neither a variable nor random –Ex.10 Definition 6: The distribution of a random variable X on a sample space S is the set of pairs (r, p(X=r)) for all r ∈ X(S), where p(X=r) is the probability that X takes the value r. –Ex. 11-12
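
A sketch of Definitions 5 and 6 with an assumed random variable: X maps the outcome of rolling two fair dice to the sum of the faces.

```python
from collections import Counter
from itertools import product

# A random variable is a function from outcomes to real numbers.
S = list(product(range(1, 7), repeat=2))
X = lambda outcome: outcome[0] + outcome[1]

# Distribution of X: the set of pairs (r, p(X = r)).
counts = Counter(X(s) for s in S)
distribution = {r: counts[r] / len(S) for r in sorted(counts)}
print(distribution)       # e.g. p(X = 7) = 6/36
```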

13 The Birthday Problem The smallest number of people needed in a room so that it is more likely than not that at least two of them have the same birthday –Similar to collisions in hashing functions –Ex.13 (The birthday problem) –Ex.14 (Probability of a collision in hashing functions)
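
A sketch of the birthday computation, assuming 366 equally likely birthdays and ignoring leap-year subtleties.

```python
# Probability that n people all have distinct birthdays, with 366 possible days.
def p_all_distinct(n, days=366):
    p = 1.0
    for i in range(n):
        p *= (days - i) / days
    return p

# Smallest n with p(at least one shared birthday) > 1/2.
n = 1
while 1 - p_all_distinct(n) <= 0.5:
    n += 1
print(n)      # 23
```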

14 Monte Carlo Algorithms Deterministic vs. probabilistic algorithms Monte Carlo algorithms –Always produce answers to problems, but a small probability remains that the answers may be incorrect This probability decreases rapidly when the algorithm carries out sufficient computation Ex. 15-16
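
A sketch of the Monte Carlo idea using a simple Fermat-style primality test; this is one illustrative instance, not necessarily the algorithm in the cited examples.

```python
import random

# Monte Carlo flavour: an iteration may answer "composite" with certainty,
# or "probably prime" with a small error probability that shrinks as more
# iterations are carried out.
def probably_prime(n, iterations=20):
    if n < 4:
        return n in (2, 3)
    for _ in range(iterations):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:
            return False          # definitely composite
    return True                   # incorrect only with small probability

print(probably_prime(221))        # 221 = 13 * 17: almost surely reported composite
print(probably_prime(104729))     # 104729 is prime
```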

15 The Probabilistic Method Theorem 3: (The Probabilistic Method) If the probability that an element of a set S does not have a particular property is less than 1, there exists an element in S with this property. –Nonconstructive existence proof Theorem 4: If k is an integer with k ≥ 2, then R(k,k) ≥ 2^(k/2). –Proof
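
A sketch of the standard union-bound argument behind Theorem 4 (a reconstruction under the usual random-coloring setup, not necessarily the exact proof on the slide).

```latex
% Color each edge of $K_n$ red or blue independently with probability $1/2$.
\[
  p(\text{some $k$ vertices induce a monochromatic } K_k)
  \;\le\; \binom{n}{k}\, 2^{\,1-\binom{k}{2}}
  \;\le\; \frac{n^k}{k!}\, 2^{\,1-\binom{k}{2}}
  \;<\; \frac{2^{\,1+k/2}}{k!}
  \;\le\; 1
  \qquad \text{when } n < 2^{k/2} \text{ and } k \ge 3 .
\]
% By Theorem 3 some coloring has no monochromatic $K_k$, so $R(k,k) > n$ for
% every such $n$, i.e.\ $R(k,k) \ge 2^{k/2}$ (the case $k = 2$ is checked directly).
```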

16 6.3 Bayes’ Theorem Ex.1 Theorem 1: (Bayes’ Theorem) Suppose that E and F are events from a sample space S such that p(E) ≠ 0 and p(F) ≠ 0. Then p(F|E) = p(E|F)p(F)/(p(E|F)p(F) + p(E|F̄)p(F̄)). –Proof –Ex.2
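
A numeric sketch of Theorem 1; the prevalence and test accuracies below are assumed for illustration.

```python
# Assumed numbers: a condition with prevalence p(F) = 1/100000, a test with
# p(E|F) = 0.99 (true positive rate) and p(E|F̄) = 0.005 (false positive rate).
p_F = 1 / 100_000
p_E_given_F = 0.99
p_E_given_notF = 0.005

# Bayes' theorem: p(F|E) = p(E|F)p(F) / (p(E|F)p(F) + p(E|F̄)p(F̄))
p_F_given_E = (p_E_given_F * p_F) / (
    p_E_given_F * p_F + p_E_given_notF * (1 - p_F)
)
print(p_F_given_E)    # ≈ 0.002: a positive test is still most likely a false alarm
```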

17 Theorem 2: (Generalized Bayes’ Theorem) Suppose that E is an event from a sample space S and F1, F2, …, Fn are mutually exclusive events such that ∪_{i=1..n} Fi = S. Assume that p(E) ≠ 0 and p(Fi) ≠ 0 for i = 1, 2, …, n. Then p(Fj|E) = p(E|Fj)p(Fj)/Σ_{i=1..n} p(E|Fi)p(Fi).
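
A sketch of Theorem 2 with an assumed three-box setup; the priors and conditional probabilities are illustrative.

```python
# Assumed setup: three boxes F1, F2, F3 chosen with equal probability,
# each containing a different fraction of red balls; E = "a red ball is drawn".
p_F = [1/3, 1/3, 1/3]            # p(Fi): mutually exclusive events covering S
p_E_given_F = [0.2, 0.5, 0.8]    # p(E|Fi)

# Generalized Bayes: p(Fj|E) = p(E|Fj)p(Fj) / Σ_i p(E|Fi)p(Fi)
denominator = sum(pe * pf for pe, pf in zip(p_E_given_F, p_F))
posterior = [p_E_given_F[j] * p_F[j] / denominator for j in range(3)]
print(posterior)                 # p(Fj|E) for j = 1, 2, 3; sums to 1
```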

18 Bayesian Spam Filters Bayesian spam filters –B: set of spam messages, G: set of good messages –nB(w), nG(w): the number of spam and good messages, respectively, containing the word w –p(w) = nB(w)/|B|, q(w) = nG(w)/|G| –p(S|E) = p(E|S)p(S)/(p(E|S)p(S) + p(E|S̄)p(S̄)) Assuming p(S) = p(S̄) = 1/2: p(S|E) = p(E|S)/(p(E|S) + p(E|S̄)) –With p(E|S) = p(w) and p(E|S̄) = q(w): r(w) = p(w)/(p(w) + q(w)) Ex.3-4 –r(w1, w2, …, wk) = Π_{i=1..k} p(wi)/(Π_{i=1..k} p(wi) + Π_{i=1..k} q(wi))
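
A sketch of the filter computation; the corpus sizes and per-word counts below are assumptions made up for illustration.

```python
from math import prod

# Assumed corpora: 2000 spam messages and 1000 good messages.
B, G = 2000, 1000
n_B = {"offer": 250, "winner": 400}     # occurrences in spam messages (assumed)
n_G = {"offer": 25, "winner": 12}       # occurrences in good messages (assumed)

p = {w: n_B[w] / B for w in n_B}        # p(w), the estimate of p(E|S)
q = {w: n_G[w] / G for w in n_G}        # q(w), the estimate of p(E|S̄)

# Single word, assuming p(S) = p(S̄) = 1/2:  r(w) = p(w) / (p(w) + q(w))
r = {w: p[w] / (p[w] + q[w]) for w in p}

# Several words, assuming independence:
# r(w1,...,wk) = Π p(wi) / (Π p(wi) + Π q(wi))
num = prod(p.values())
r_combined = num / (num + prod(q.values()))
print(r, r_combined)
```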

19 Thanks for Your Attention!

