Finite probability space
set Ω (sample space)
function P: Ω → R+ (probability distribution), ∑_{x∈Ω} P(x) = 1

Finite probability space
set Ω (sample space)
function P: Ω → R+ (probability distribution), ∑_{x∈Ω} P(x) = 1
elements of Ω are called atomic events
subsets of Ω are called events
the probability of an event A is P(A) = ∑_{x∈A} P(x)

Examples
1. Roll a (6-sided) die. What is the probability that the number shown is even?
2. Flip two coins. What is the probability that they show the same symbol?
3. Flip five coins. What is the probability that they all show the same symbol?
4. Shuffle a pack of 52 cards. What is the probability that all red cards come before all black cards?
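
A brute-force check of these answers, as a minimal Python sketch (the helper prob and the encoding of the outcomes are assumptions of the sketch, not from the slides):

from itertools import product
from fractions import Fraction
from math import comb

def prob(space, event):
    # Probability of an event under the uniform distribution on a finite sample space.
    space = list(space)
    return Fraction(sum(1 for x in space if event(x)), len(space))

print(prob(range(1, 7), lambda x: x % 2 == 0))                     # 1. die shows an even number: 1/2
print(prob(product("HT", repeat=2), lambda c: len(set(c)) == 1))   # 2. two coins agree: 1/2
print(prob(product("HT", repeat=5), lambda c: len(set(c)) == 1))   # 3. five coins all agree: 1/16
print(Fraction(1, comb(52, 26)))                                   # 4. only the red/black pattern matters: 1/C(52,26)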

Union bound
P(A ∪ B) ≤ P(A) + P(B)
P(A_1 ∪ A_2 ∪ … ∪ A_n) ≤ P(A_1) + P(A_2) + … + P(A_n)

Union bound
P(A_1 ∪ A_2 ∪ … ∪ A_n) ≤ P(A_1) + P(A_2) + … + P(A_n)
Suppose that the probability of winning in a lottery is 10^-6. What is the probability that somebody out of 100 people wins?
A_i = the i-th person wins
somebody wins = ?

Union bound
P(A_1 ∪ A_2 ∪ … ∪ A_n) ≤ P(A_1) + P(A_2) + … + P(A_n)
Suppose that the probability of winning in a lottery is 10^-6. What is the probability that somebody out of 100 people wins?
A_i = the i-th person wins
somebody wins = A_1 ∪ A_2 ∪ … ∪ A_100

Union bound
P(A_1 ∪ A_2 ∪ … ∪ A_n) ≤ P(A_1) + P(A_2) + … + P(A_n)
Suppose that the probability of winning in a lottery is 10^-6. What is the probability that somebody out of 100 people wins?
P(A_1 ∪ A_2 ∪ … ∪ A_100) ≤ 100·10^-6 = 10^-4

Union bound
P(A_1 ∪ A_2 ∪ … ∪ A_n) ≤ P(A_1) + P(A_2) + … + P(A_n)
Suppose that the probability of winning in a lottery is 10^-6. What is the probability that somebody out of 100 people wins?
P(A_1 ∪ A_2 ∪ … ∪ A_100) ≤ 100·10^-6 = 10^-4
If the individual wins are independent, the exact value is
P(A_1 ∪ A_2 ∪ … ∪ A_100) = 1 - P(A_1^C ∩ A_2^C ∩ … ∩ A_100^C) = 1 - P(A_1^C)·P(A_2^C)·…·P(A_100^C) = 1 - (1 - 10^-6)^100 ≈ 0.99·10^-4
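
A quick numeric check of both calculations in Python (a sketch; the winning probability 10^-6 is from the slide, and independence is used only for the exact value):

p_win = 1e-6          # probability that one fixed person wins the lottery
n = 100

union_bound = n * p_win                       # the union-bound estimate: 1e-4
exact_if_independent = 1 - (1 - p_win) ** n   # exact value when the 100 wins are independent
print(union_bound, exact_if_independent)      # 0.0001 vs roughly 0.99995e-4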

Independence
Events A, B are independent if P(A ∩ B) = P(A)·P(B)

Independence
Events A, B are independent if P(A ∩ B) = P(A)·P(B)
“observing whether B happened gives no information on A”

Independence
Events A, B are independent if P(A ∩ B) = P(A)·P(B)
“observing whether B happened gives no information on A”
P(A|B) = P(A ∩ B)/P(B)   (the conditional probability of A, given B)

Independence
Events A, B are independent if P(A ∩ B) = P(A)·P(B)
P(A|B) = P(A)

Examples
Roll two (6-sided) dice. Let S be their sum.
1) What is the probability that S = 7?
2) What is the probability that S = 7, conditioned on S being odd?
3) Let A be the event that S is even and B the event that S is odd. Are A, B independent?
4) Let C be the event that S is divisible by 4. Are A, C independent?
5) Let D be the event that S is divisible by 3. Are A, D independent?
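
A small enumeration sketch in Python for questions 1), 2) and 5) (the encodings are mine); for 3) and 4) note directly that P(A ∩ B) = 0 ≠ P(A)·P(B) and P(A ∩ C) = P(C) ≠ P(A)·P(C), so neither pair is independent:

from itertools import product
from fractions import Fraction

space = list(product(range(1, 7), repeat=2))     # 36 equally likely outcomes
P = lambda ev: Fraction(sum(1 for w in space if ev(w)), len(space))

S = lambda w: w[0] + w[1]
A = lambda w: S(w) % 2 == 0      # S is even
D = lambda w: S(w) % 3 == 0      # S is divisible by 3

print(P(lambda w: S(w) == 7))                                # 1) 1/6
print(P(lambda w: S(w) == 7) / P(lambda w: S(w) % 2 == 1))   # 2) P(S=7 | S odd) = 1/3
print(P(lambda w: A(w) and D(w)) == P(A) * P(D))             # 5) True: A and D are independent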

Examples
[Venn diagram of three events A, B, C]
Are A, B independent? Are A, C independent? Are B, C independent?
Is it true that P(A ∩ B ∩ C) = P(A)·P(B)·P(C)?

Examples
[Venn diagram of three events A, B, C]
Are A, B independent? Are A, C independent? Are B, C independent?
Is it true that P(A ∩ B ∩ C) = P(A)·P(B)·P(C)?
Events A, B, C are pairwise independent but not (fully) independent

Full independence
Events A_1, …, A_n are (fully) independent if for every subset S ⊆ [n] := {1, 2, …, n}:
P(∩_{i∈S} A_i) = ∏_{i∈S} P(A_i)
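
A standard concrete instance of pairwise-but-not-full independence (not necessarily the one drawn on the slide): flip two fair coins and let A = “the first coin shows heads”, B = “the second coin shows heads”, C = “the two coins agree”. Every pairwise intersection has probability 1/4 = (1/2)·(1/2), so the events are pairwise independent, but P(A ∩ B ∩ C) = 1/4 ≠ 1/8 = P(A)·P(B)·P(C).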

Testing equality of strings
Alice holds an n-bit string A, Bob holds an n-bit string B; they are connected by a slow network.
QUESTION: Is A = B?

Testing equality of strings
Alice holds an n-bit string A, Bob holds an n-bit string B; slow network.
QUESTION: Is A = B?
Protocol:
1. Alice picks a random prime p ≤ n^2.
2. Alice computes a := (A mod p) and sends p and a to Bob.
3. Bob computes b := (B mod p) and checks whether a = b.

Testing equality of strings
Protocol:
1. Alice picks a random prime p ≤ n^2.
2. Alice computes a := (A mod p) and sends p and a to Bob.
3. Bob computes b := (B mod p) and checks whether a = b.
How many bits are communicated?

Testing equality of strings
Protocol:
1. Alice picks a random prime p ≤ n^2.
2. Alice computes a := (A mod p) and sends p and a to Bob.
3. Bob computes b := (B mod p) and checks whether a = b.
What is the probability of failure?

Testing equality of strings
Protocol:
1. Alice picks a random prime p ≤ n^2.
2. Alice computes a := (A mod p) and sends p and a to Bob.
3. Bob computes b := (B mod p) and checks whether a = b.
What is the probability of failure?
BAD EVENT = p divides A - B

Testing equality of strings
What is the probability of failure?
BAD EVENT = p divides A - B
How many (different) primes can divide an n-bit number?
How many primes ≤ n^2 are there?

Testing equality of strings
What is the probability of failure?
BAD EVENT = p divides A - B
How many (different) primes can divide an n-bit number?
If distinct primes p_1, p_2, …, p_k all divide an n-bit number M, then 2^n ≥ M ≥ p_1·p_2·…·p_k ≥ 2^k, so k ≤ n.
How many primes ≤ n^2 are there?
Prime Number Theorem: π(m) ≈ m/ln m, where π(m) is the number of primes ≤ m.

Testing equality of strings
If A = B then the algorithm always answers YES.
If A ≠ B then the algorithm answers NO with probability ≥ 1 - 2(ln n)/n: at most n of the ≈ n^2/(2 ln n) primes ≤ n^2 can divide A - B.
Monte Carlo algorithm with 1-sided error.
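
A minimal Python sketch of the protocol, treating the n-bit strings as integers (sympy's randprime is assumed available; any sampler of primes ≤ n^2 would do):

from sympy import randprime     # assumed available; returns a random prime in [a, b)

def alice_message(A, n):
    # Alice: pick a random prime p <= n^2 and send the pair (p, A mod p).
    # Only O(log n) bits are sent, instead of the n bits of A itself.
    p = randprime(2, n * n + 1)
    return p, A % p

def bob_check(B, p, a):
    # Bob: answer YES iff the fingerprints agree modulo p.
    return B % p == a

n = 1000                            # the strings are n-bit numbers
A = (1 << (n - 1)) | 12345
B = (1 << (n - 1)) | 12345          # change a bit of B to see the NO answer (wrong only rarely)
p, a = alice_message(A, n)
print(bob_check(B, p, a))           # True here; when A != B the answer is wrong with small probability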

Random variable
set Ω (sample space), function P: Ω → R+ (probability distribution), ∑_{x∈Ω} P(x) = 1
A random variable is a function Y: Ω → R
The expected value of Y is E[Y] := ∑_{x∈Ω} P(x)·Y(x)

Examples
Roll two dice. Let S be their sum.
If S = 7 then player A gives player B $6, otherwise player B gives player A $1.
S: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12

Examples
Roll two dice. Let S be their sum.
If S = 7 then player A gives player B $6, otherwise player B gives player A $1.
S:  2,  3,  4,  5,  6, 7,  8,  9, 10, 11, 12
Y: -1, -1, -1, -1, -1, 6, -1, -1, -1, -1, -1
Expected income for B: E[Y] = 6·(1/6) - 1·(5/6) = 1/6

Linearity of expectation
E[X + Y] = E[X] + E[Y]
E[X_1 + X_2 + … + X_n] = E[X_1] + E[X_2] + … + E[X_n]

Linearity of expectation
Everybody pays me $1 and writes their name on a card. I mix the cards and give everybody one card. If you get back the card with your name, I pay you $10.
Let n be the number of people in the class. For what n is the game advantageous for me?

Linearity of expectation
Everybody pays me $1 and writes their name on a card. I mix the cards and give everybody one card. If you get back the card with your name, I pay you $10.
X_1 = -9 if player 1 gets his card back, 1 otherwise
E[X_1] = ?

Linearity of expectation
Everybody pays me $1 and writes their name on a card. I mix the cards and give everybody one card. If you get back the card with your name, I pay you $10.
X_1 = -9 if player 1 gets his card back, 1 otherwise
E[X_1] = -9/n + 1·(n-1)/n

Linearity of expectation
Everybody pays me $1 and writes their name on a card. I mix the cards and give everybody one card. If you get back the card with your name, I pay you $10.
X_1 = -9 if player 1 gets his card back, 1 otherwise
X_2 = -9 if player 2 gets his card back, 1 otherwise
E[X_1 + … + X_n] = E[X_1] + … + E[X_n] = n·(-9/n + 1·(n-1)/n) = n - 10
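
A quick simulation of the card game (a sketch, not from the slides), estimating my expected total income and matching n - 10:

import random

def my_income(n):
    # One round: n players each pay $1; the shuffled cards are handed back,
    # and every player who gets their own card back costs me $10.
    cards = list(range(n))
    random.shuffle(cards)
    fixed_points = sum(1 for i, c in enumerate(cards) if i == c)
    return n - 10 * fixed_points

n, rounds = 30, 100_000
print(sum(my_income(n) for _ in range(rounds)) / rounds)   # close to n - 10 = 20

So the game favors me exactly when n - 10 > 0, i.e. for n ≥ 11.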

Expected number of coin-tosses until HEADS?

Expected number of coin-tosses until HEADS?
1/2 → 1 toss, 1/4 → 2 tosses, 1/8 → 3 tosses, 1/16 → 4 tosses, …
∑_{n=1}^{∞} n·2^-n = 2

Expected number of coin-tosses until HEADS?
Let S be the expected number of tosses: S = 1 + (1/2)·S, hence S = 2

Expected number of dice-throws until you get “6”?

Expected number of dice-throws until you get “6”?
Let S be the expected number of throws: S = 1 + (5/6)·S, hence S = 6
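
Both answers are instances of the same fact: the expected number of independent trials until an event of probability p occurs is 1/p. A small simulation sketch:

import random

def trials_until(p):
    # Number of independent trials until the first success (success probability p).
    t = 1
    while random.random() >= p:
        t += 1
    return t

N = 100_000
print(sum(trials_until(1 / 2) for _ in range(N)) / N)   # ~2: coin tosses until HEADS
print(sum(trials_until(1 / 6) for _ in range(N)) / N)   # ~6: dice throws until a "6"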

Coupon collector problem
n coupons to collect
What is the expected number of cereal boxes that you need to buy?
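
The standard calculation (assuming each box contains one of the n coupons chosen uniformly at random, independently): let X_i be the number of boxes bought while you own exactly i-1 distinct coupons. Each such box holds a new coupon with probability (n-i+1)/n, so E[X_i] = n/(n-i+1), and by linearity of expectation E[X_1 + … + X_n] = n·(1/1 + 1/2 + … + 1/n) = n·H_n ≈ n·ln n.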

Expected number of coin-tosses until 3 consecutive HEADS?
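
A sketch of the standard answer, using the same self-referential trick as on the previous slides (the slide only poses the question): let S be the expected number of tosses and condition on how the run starts. S = (1/2)(1 + S) + (1/4)(2 + S) + (1/8)(3 + S) + (1/8)·3, i.e. S = (7/8)·S + 7/4, hence S = 14.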

Markov’s inequality
A group of 10 people has an average income of $20,000. At most how many people in the group can have an income of at least $40,000?
A group of 10 people has an average income of $ . At most how many people in the group can have an income of at least $100,000?

Markov’s inequality
A group of 10 people has an average income of $20,000. At most how many people in the group can have an income of at least $40,000?
Let X be a random variable such that X ≥ 0. Then P(X ≥ a·E[X]) ≤ 1/a
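
Worked answer to the first question: the total income is 10·$20,000 = $200,000 and incomes are nonnegative, so if k people each earn at least $40,000 then 40,000·k ≤ 200,000, i.e. k ≤ 5. This is Markov’s inequality with a = 2: for the income X of a random person, E[X] = $20,000 and P(X ≥ 2·E[X]) ≤ 1/2.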

Example
Alice has an algorithm A whose expected running time is T(n). Bob uses Alice’s algorithm to construct his own algorithm B:
1. Run algorithm A for 2T(n) steps.
2. If A terminated, then B outputs the same answer; otherwise go to step 1.
What is the expected running time of B?
What is the probability that A is still running after 100T(n) steps?
What is the probability that B is still running after 100T(n) steps?
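
A sketch of the intended analysis (assuming each execution of step 1 is a fresh, independent run of A): by Markov’s inequality, a single run of A needs more than 2T(n) steps with probability ≤ 1/2, so each round of B succeeds with probability ≥ 1/2. The number of rounds is therefore at most geometric with success probability 1/2, so the expected running time of B is at most 2·2T(n) = 4T(n). For A alone, Markov’s inequality gives P(A is still running after 100T(n) steps) ≤ 1/100. For B, still being busy after 100T(n) steps means the first 50 independent rounds all failed, which happens with probability ≤ 2^-50.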