CompSci 102 Discrete Math for Computer Science, March 29, 2012, Prof. Rodger. Lecture adapted from Bruce Maggs; lecture developed at Carnegie Mellon, primarily by Prof. Steven Rudich.



Announcements: Recitation this week; bring your laptop. Read Chapter

Probability Theory: Counting in Terms of Proportions

The Descendants of Adam. Adam was X inches tall. He had two sons: one was X+1 inches tall, one was X-1 inches tall. Each of his sons had two sons…

[Diagram: binary tree of heights, X at the root, X-1 and X+1 below it, continuing down to X-4, X-2, X, X+2, X+4.] In the nth generation there will be 2^n males, each with one of n+1 different heights: h_0, h_1, …, h_n. Height h_i = X - n + 2i occurs with proportion C(n, i) / 2^n.

Example: 4th generation. How likely is the second-smallest height? n = 4, so 2^4 = 16 males and 5 different heights. Probability: C(4, 1) / 2^4 = 4/16 = 1/4.

Unbiased Binomial Distribution on n+1 Elements. Let S be any set {h_0, h_1, …, h_n} where each element h_i has associated probability C(n, i) / 2^n. Any such distribution is called an Unbiased Binomial Distribution or an Unbiased Bernoulli Distribution.
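
These proportions are easy to tabulate. A minimal Python sketch (the helper name height_distribution is ours, not from the lecture):

```python
from math import comb

def height_distribution(n):
    """Proportions of the n+1 heights in the n-th generation:
    height h_i = X - n + 2i occurs with proportion C(n, i) / 2**n."""
    return [comb(n, i) / 2 ** n for i in range(n + 1)]

# 4th generation: 2^4 = 16 males, 5 distinct heights.
# The second-smallest height h_1 gets C(4, 1)/16 = 1/4.
dist = height_distribution(4)
```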

Some Puzzles

Teams A and B are equally good. In any one game, each is equally likely to win. What is the most likely length of a best-of-7 series? Equivalently: flip coins until either 4 heads or 4 tails. Is this more likely to take 6 or 7 flips?

6 and 7 Are Equally Likely. To reach either one, after 5 games the wins-to-losses must be 3 to 2. Then there is a ½ chance the 6th game ends it 4 to 2, and a ½ chance it doesn't, leaving 3 to 3 and forcing a 7th game.
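
This can be checked by brute force over all 2^7 equally likely sequences of game winners (a sketch; the helper name series_length is ours):

```python
from itertools import product

def series_length(games):
    """Game number at which one team reaches 4 wins."""
    a = b = 0
    for i, winner in enumerate(games, start=1):
        if winner == 'A':
            a += 1
        else:
            b += 1
        if a == 4 or b == 4:
            return i

# Count every length-7 sequence; the series ends early, but the
# unplayed games just enumerate equally likely continuations.
counts = {4: 0, 5: 0, 6: 0, 7: 0}
for games in product('AB', repeat=7):
    counts[series_length(games)] += 1

probs = {n: c / 2 ** 7 for n, c in counts.items()}
```

Both lengths 6 and 7 come out to 5/16, matching the argument above.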

Silver and Gold. One bag has two silver coins, another has two gold coins, and the third has one of each. One bag is selected at random, and one coin from it is selected at random. It turns out to be gold. What is the probability that the other coin is gold?

3 choices of bag × 2 ways to order bag contents = 6 equally likely paths.

Given that we see a gold coin, 2/3 of the remaining paths have gold in them!

So, sometimes, probabilities can be counterintuitive!

Language of Probability. The formal language of probability is a very important tool in describing and analyzing probability distributions.

Finite Probability Distribution. A (finite) probability distribution D is a finite set S of elements, where each element x in S has a non-negative real weight, proportion, or probability p(x). The weights must satisfy: the sum of p(x) over all x in S equals 1. For convenience we will define D(x) = p(x). S is often called the sample space, and elements x in S are called samples.

[Diagram: sample space S, with one sample x of weight or probability D(x) = p(x) = 0.2.]

Events. Any set E ⊆ S is called an event. Pr_D[E] = sum of p(x) over x in E. [Diagram: an event E inside S, with Pr_D[E] = 0.4.]

Uniform Distribution. If each element has equal probability, the distribution is said to be uniform: p(x) = 1/|S|. Then for any event E, Pr_D[E] = |E| / |S|.

A fair coin is tossed 100 times in a row. What is the probability that we get exactly half heads?

Using the Language. The sample space S is the set of all outcomes {H,T}^100. Each sequence in S is equally likely, and hence has probability 1/|S| = 1/2^100.

Visually: S = all sequences of 100 tosses; x = HHTTT……TH; p(x) = 1/|S|.

S = set of all sequences {H,T}^100. Event E = set of sequences with 50 H's and 50 T's. Probability of event E = proportion of E in S = C(100, 50) / 2^100.
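
Python's arbitrary-precision integers make this exact count easy to evaluate:

```python
from math import comb

# |E| = C(100, 50) sequences with exactly 50 heads, out of |S| = 2^100
p_half = comb(100, 50) / 2 ** 100
# roughly 0.08 -- surprisingly small for the "most likely" outcome
```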

Suppose we roll a white die and a black die. What is the probability that the sum is 7 or 11?

Same Methodology! S = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6), (2,1), (2,2), (2,3), (2,4), (2,5), (2,6), (3,1), (3,2), (3,3), (3,4), (3,5), (3,6), (4,1), (4,2), (4,3), (4,4), (4,5), (4,6), (5,1), (5,2), (5,3), (5,4), (5,5), (5,6), (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }. Pr[E] = |E|/|S| = proportion of E in S = 8/36.
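
The same count, done by enumerating the 36 outcomes:

```python
from fractions import Fraction

S = [(w, b) for w in range(1, 7) for b in range(1, 7)]   # 36 outcomes
E = [(w, b) for (w, b) in S if w + b in (7, 11)]
pr_E = Fraction(len(E), len(S))   # |E| / |S| = 8/36
```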

23 people are in a room. Suppose that all possible birthdays are equally likely. What is the probability that two people will have the same birthday?

And The Same Methods Again! Sample space W = {1, 2, 3, …, 366}^23. A sample is a sequence of 23 numbers, e.g., x = (17, 42, 363, 1, …, 224, 177). Event E = { x ∈ W | two numbers in x are the same }. What is |E|? Count |Ē| instead!

Ē = all sequences in W that have no repeated numbers. |W| = 366^23; |Ē| = (366)(365)…(344). So |Ē| / |W| = 0.494…, and Pr[E] = 1 - |Ē|/|W| = 0.506…
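
The product (366)(365)…(344) / 366^23 can be checked directly:

```python
from fractions import Fraction

days, people = 366, 23
# Pr[no repeated birthday] = (366/366)(365/366)...(344/366)
no_repeat = Fraction(1)
for k in range(people):
    no_repeat *= Fraction(days - k, days)

pr_repeat = 1 - no_repeat
# float(no_repeat) is about 0.494, float(pr_repeat) about 0.506
```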

More Language Of Probability. The probability of event A given event B is written Pr[A | B] and is defined to be Pr[A ∩ B] / Pr[B], the proportion of A ∩ B relative to B. [Diagram: sample space S with overlapping events A and B.]

Suppose we roll a white die and a black die. What is the probability that the white die is 1, given that the total is 7? Event A = {white die = 1}; event B = {total = 7}.

S = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6), (2,1), (2,2), (2,3), (2,4), (2,5), (2,6), (3,1), (3,2), (3,3), (3,4), (3,5), (3,6), (4,1), (4,2), (4,3), (4,4), (4,5), (4,6), (5,1), (5,2), (5,3), (5,4), (5,5), (5,6), (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }. Event A = {white die = 1}; event B = {total = 7}. Pr[A | B] = Pr[A ∩ B] / Pr[B] = |A ∩ B| / |B| = 1/6.
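
Conditional probability by counting, as above:

```python
from fractions import Fraction

S = [(w, b) for w in range(1, 7) for b in range(1, 7)]   # 36 outcomes
B = [s for s in S if s[0] + s[1] == 7]                   # total = 7
A_and_B = [s for s in B if s[0] == 1]                    # white = 1 and total = 7

# Under the uniform distribution, Pr[A | B] = |A ∩ B| / |B|
pr_A_given_B = Fraction(len(A_and_B), len(B))
```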

Independence! A and B are independent events if Pr[A | B] = Pr[A]; equivalently, Pr[A ∩ B] = Pr[A] Pr[B]; equivalently, Pr[B | A] = Pr[B].

Independence! A_1, A_2, …, A_k are independent events if knowing whether some of them occurred does not change the probability of any of the others occurring. E.g., {A_1, A_2, A_3} are independent events if:
Pr[A_1 | A_2] = Pr[A_1]; Pr[A_1 | A_3] = Pr[A_1]; Pr[A_1 | A_2 ∩ A_3] = Pr[A_1]
Pr[A_2 | A_1] = Pr[A_2]; Pr[A_2 | A_3] = Pr[A_2]; Pr[A_2 | A_1 ∩ A_3] = Pr[A_2]
Pr[A_3 | A_1] = Pr[A_3]; Pr[A_3 | A_2] = Pr[A_3]; Pr[A_3 | A_1 ∩ A_2] = Pr[A_3]

Silver and Gold (revisited). One bag has two silver coins, another has two gold coins, and the third has one of each. One bag is selected at random, and one coin from it is selected at random. It turns out to be gold. What is the probability that the other coin is gold?

Let G_1 be the event that the first coin is gold: Pr[G_1] = 1/2. Let G_2 be the event that the second coin is gold. Pr[G_2 | G_1] = Pr[G_1 and G_2] / Pr[G_1] = (1/3) / (1/2) = 2/3. Note: G_1 and G_2 are not independent.
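
The calculation can be confirmed by enumerating the six equally likely (drawn coin, other coin) paths from the earlier slide:

```python
from fractions import Fraction

bags = [('g', 'g'), ('s', 's'), ('g', 's')]
paths = []                       # (drawn coin, other coin), 6 equally likely
for bag in bags:
    for i in (0, 1):
        paths.append((bag[i], bag[1 - i]))

gold_first = [p for p in paths if p[0] == 'g']           # condition on G_1
pr_G2_given_G1 = Fraction(sum(1 for p in gold_first if p[1] == 'g'),
                          len(gold_first))
```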

Monty Hall Problem. The announcer hides a prize behind one of 3 doors at random. You select some door. The announcer opens one of the others with no prize behind it. You can decide to keep your door or switch. What should you do?

Monty Hall Problem. Sample space = { prize behind door 1, prize behind door 2, prize behind door 3 }; each has probability 1/3. Staying: we win if we chose the correct door. Switching: we win if we chose an incorrect door. Pr[choosing correct door] = 1/3; Pr[choosing incorrect door] = 2/3.

Why Was This Tricky? We are inclined to think: after one door is opened, the others are equally likely… But his action is not independent of yours!
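
A quick simulation makes the 2/3-vs-1/3 split concrete (a sketch; the trial function is ours):

```python
import random

def trial(switch, rng):
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    # Host opens a door that is neither your pick nor the prize
    opened = rng.choice([d for d in doors if d not in (pick, prize)])
    if switch:
        # Switch to the one remaining closed door
        pick = next(d for d in doors if d not in (pick, opened))
    return pick == prize

rng = random.Random(0)   # fixed seed for reproducibility
n = 100_000
win_switch = sum(trial(True, rng) for _ in range(n)) / n
win_stay = sum(trial(False, rng) for _ in range(n)) / n
```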

Study Bee: Binomial Distribution; Definition of a Probability Distribution; Language of Probability; Sample Space; Events; Uniform Distribution; Pr[A | B]; Independence.