Lecture 10 – Introduction to Probability

Topics: events, sample space, random variables; examples; probability distribution functions; conditional probabilities; the exponential distribution; the Poisson distribution.

The Monty Hall Problem

There is $1,000,000 behind one door and $0.00 behind the other two. You “reserve” a door (say #1), but it remains closed. Then Monty opens one of the other two doors (say #2). The door Monty opens will always be empty. (Monty wants to keep the money.)

What’s the best strategy?
1. Stay with the original door we chose?
2. Switch to the other unopened door?
We are interested in probability as it relates to helping us make good or optimal decisions.

Another Example: Coins & Drawers

One drawer has 2 gold coins, one drawer has 1 gold and 1 silver coin, and one drawer has 2 silver coins. You select a drawer at random and then randomly select one coin from it. It turns out that your coin is gold. What is the probability that the other coin in the drawer is also gold?

Probability Overview

The sample space S is the set of all possible outcomes of an experiment. Events are subsets of the sample space; an event occurs if any of its elements occur when the experiment is run. Two events A and B are mutually exclusive if their intersection is empty (A ∩ B = ∅). Elements of S are called realizations, outcomes, sample points, or scenarios. The choice of sample space depends on the question that you are trying to answer.

Roll Two Dice

Two possible sample spaces are:
S1 = { (1,1), (1,2), (1,3), …, (6,4), (6,5), (6,6) }
S2 = { 2, 3, 4, …, 11, 12 } (the sum of the two values)

Examples of events:
A1 = “the sum of the face values is 3”. Under S1: A1 = { (1,2), (2,1) }; under S2: A1 = { 3 }.
A2 = “one die has value 3 and the other has value 1”. Under S1: A2 = { (1,3), (3,1) }; under S2: not an event.

Probability

Intuitive: the probability of an event is the proportion of time that the event occurs when the same experiment is repeated many times.

Mathematical: a probability measure P is defined on the set of all events and satisfies the following:
(1) P(A) ≥ 0 for every event A ⊆ S
(2) P(S) = 1
(3) If the events A1, A2, … are mutually exclusive, then P(A1 ∪ A2 ∪ ···) = P(A1) + P(A2) + ···
(4) P(Ā) = 1 − P(A), where Ā is the complement of A
(5) P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
(6) If A and B are independent, then P(A ∩ B) = P(A)P(B)

(7) The conditional probability of event A given B is
P(A|B) = P(A ∩ B) / P(B).
(8) If A and B are independent events, then
P(A|B) = P(A ∩ B) / P(B) = P(A)P(B) / P(B) = P(A).
In this case B gives “no information” as to whether A will occur.

Probability Calculations

Example: toss a balanced coin. Event H: a head appears; event T: a tail appears. Sample space: S = { H, T }.

Intuitively, P(H) = P(T) = 1/2. How do we prove it mathematically?

Proof: S = H ∪ T and H ∩ T = ∅, so
1 = P(S) = P(H ∪ T)   [by (2)]
= P(H) + P(T)   [by (3)]
Since the two outcomes have an equal chance, P(H) = P(T) = 1/2.

Basic Conditional Probabilities

The conditional probability of event A given B is P(A|B) = P(A ∩ B) / P(B).

Example: You roll two dice and are told that the sum is 7. What is the probability that the first die is 2?

Define event B: the sum of the two dice is 7; event A: the first die is 2. Then
B = { (1,6), (2,5), (3,4), (4,3), (5,2), (6,1) } and A ∩ B = { (2,5) }, so
P(A|B) = P(A ∩ B) / P(B) = (1/36) / (6/36) = 1/6.
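Over a finite sample space with equally likely outcomes, a conditional probability like this reduces to counting. Here is a minimal Python sketch (my own illustration, not part of the original slides) that checks the 1/6 answer by enumeration:

```python
# Enumerate the 36 equally likely outcomes of rolling two dice and
# compute P(first die = 2 | sum = 7) by counting.
from fractions import Fraction

outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
B = [(i, j) for (i, j) in outcomes if i + j == 7]   # sum is 7
A_and_B = [(i, j) for (i, j) in B if i == 2]        # ...and first die is 2

print(Fraction(len(A_and_B), len(B)))               # 1/6
```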

Computing Conditional Probabilities

You roll two dice and double your money if the sum is ≥ 8; you lose if the sum is ≤ 7. However, the rolls occur in sequence and you do not get to observe the first roll. You are simply told either “the value is ≥ 4” or “the value is ≤ 3”. After being told this, you get to place your bet.

A = event that you win = { (2,6), (3,5), (3,6), (4,4), (4,5), (4,6), (5,3), (5,4), (5,5), (5,6), (6,2), (6,3), (6,4), (6,5), (6,6) }
B = event that you are told “≥ 4” after the first roll = { (4,1), (4,2), (4,3), (4,4), (4,5), (4,6), (5,1), (5,2), (5,3), (5,4), (5,5), (5,6), (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }

The goal is to calculate win = P(A|B) and lose = P(Ā|B).

P(A|B) = P(A ∩ B) / P(B), and
A ∩ B = { (4,4), (4,5), (4,6), (5,3), (5,4), (5,5), (5,6), (6,2), (6,3), (6,4), (6,5), (6,6) }.

Since each of the 36 realizations (i, j) is equally likely,
P(A|B) = |A ∩ B| / |B| = (12/36) / (18/36) = 2/3.

In a similar manner, we can show that P(Ā|B) = 6/18 = 1/3; equivalently, P(Ā|B) = 1 − P(A|B).
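The same counting argument can be scripted. A short sketch (again my own, not from the slides) verifying P(A|B) = 2/3 and P(Ā|B) = 1/3:

```python
# Verify the betting-game conditional probabilities by enumerating
# the 36 equally likely (first roll, second roll) outcomes.
from fractions import Fraction

outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
A = {(i, j) for (i, j) in outcomes if i + j >= 8}   # win: sum >= 8
B = {(i, j) for (i, j) in outcomes if i >= 4}       # told "first roll >= 4"

print(Fraction(len(A & B), len(B)))                 # P(A|B)     = 2/3
print(Fraction(len(B - A), len(B)))                 # P(not A|B) = 1/3
```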

More Conditional Probability Calculations

P(A|B) = P(A ∩ B) / P(B) and P(B|A) = P(B ∩ A) / P(A), so
P(A ∩ B) = P(A|B)P(B) and P(B ∩ A) = P(B|A)P(A).

This leads to Bayes’ Theorem:
P(A|B) = P(B|A)P(A) / P(B).

Example: Coins & Drawers

Drawer 1 has 2 gold coins, drawer 2 has 1 gold and 1 silver coin, and drawer 3 has 2 silver coins.
Let D1 = event that we pick drawer 1, and G1 = event that the first coin we select is gold.
Then P(D1|G1) is the probability that both coins are gold given that the first is gold.

Coin & Drawer Computations

P(Di) = 1/3 for i = 1, 2, 3
P(G1) = (1)(1/3) + (1/2)(1/3) + (0)(1/3)

By Bayes’ Theorem,
P(D1|G1) = P(G1|D1) P(D1) / P(G1) = (1)(1/3) / [ (1)(1/3) + (1/2)(1/3) + (0)(1/3) ] = 2/3.
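A quick Monte Carlo check of this Bayes computation, sketched under the stated assumptions that the drawer and the coin are both chosen uniformly at random:

```python
# Estimate P(drawer 1 | first coin is gold) by simulation; the
# relative frequency should approach the Bayes answer 2/3.
import random

drawers = [("G", "G"), ("G", "S"), ("S", "S")]
first_gold = both_gold = 0
for _ in range(100_000):
    drawer = random.choice(drawers)   # pick a drawer uniformly
    coin = random.choice(drawer)      # pick one of its coins uniformly
    if coin == "G":
        first_gold += 1
        if drawer == ("G", "G"):      # the other coin is also gold
            both_gold += 1
print(both_gold / first_gold)         # ~0.667
```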

Example: The Monty Hall Problem

There is $1,000,000 behind one of the doors and $0.00 behind the others. Say you pick door #1, and then Monty opens one of the other two doors.

If you don’t switch, P(win) = 1/3.
Optimal strategy: switch to the door that remains closed; then P(win) = 2/3.

Events: D1 = you end up choosing door 1; D2 = you end up choosing door 2; D3 = you end up choosing door 3; L = prize is behind door 1; M = prize is behind door 2; R = prize is behind door 3.

The event that you win is W = (D1 ∩ L) ∪ (D2 ∩ M) ∪ (D3 ∩ R). These are mutually exclusive, so under the switching strategy (you initially reserve door 1 and then switch):

P(W) = P(D1 ∩ L) + P(D2 ∩ M) + P(D3 ∩ R)
= P(D1|L) P(L) + P(D2|M) P(M) + P(D3|R) P(R)
= (0)(1/3) + (1)(1/3) + (1)(1/3) = 2/3
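A simulation makes the 1/3 vs. 2/3 split easy to see. The sketch below (an illustration, not from the slides) assumes you always reserve door #1 and that Monty always opens an empty door you didn’t pick:

```python
# Simulate the Monty Hall game under the stay and switch strategies.
import random

def play(switch: bool) -> bool:
    prize = random.randrange(3)
    pick = 0                                   # you always reserve door #1
    # Monty opens a door that is neither your pick nor the prize.
    opened = next(d for d in range(3) if d != pick and d != prize)
    if switch:
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == prize

n = 100_000
print(sum(play(False) for _ in range(n)) / n)  # stay:   ~1/3
print(sum(play(True) for _ in range(n)) / n)   # switch: ~2/3
```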

Random Variables R.V. is a real-valued function defined on the sample space S Example: toss two dice, Sample space: (1,1),..., (6,6) Quantity of interest: sum of the two dice  2, 3,...,12 RV: function that maps sample space to the quantity of interest Define : X  the RV, which is defined as the sum of 2 fair dice P{X = 2} = P{ (1,1) } = 1/36 P(X = 3} = P{ (1,2), (2,1)} = 2/36 P{X = 4} = P{ (1,3), (2,2), (3,1) } = 3/36... P{X = 7} = P{ (1,6), (2,5), (3,4), (4,3), (5,2), (6,1) } = 6/36 … P{X = 12} = P{ (6,6) } = 1/36

Classification of Random Variables

Discrete random variables take a finite or countable number of possible values. Examples: Bernoulli, binomial, geometric, and Poisson. The probability mass function (pmf) gives the probability of each outcome: p(a) = P{X = a}.

Continuous random variables take an uncountable number of possible values. Examples: uniform, exponential, gamma, and normal. The probability density function (pdf) f satisfies P{X ≤ a} = ∫_{−∞}^{a} f(x) dx.

Expected Value

Discrete random variable X: E[X] = Σ_{x ∈ S} x p(x)
Continuous random variable X: E[X] = ∫_{−∞}^{∞} x f(x) dx
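For the two-dice example above, the discrete formula gives E[X] = 7. A one-screen sketch (assuming the triangular pmf from the previous slide):

```python
# E[X] = sum of x * p(x) over the support of X, for X = sum of two dice.
from fractions import Fraction

pmf = {x: Fraction(6 - abs(x - 7), 36) for x in range(2, 13)}  # triangular pmf
print(sum(x * p for x, p in pmf.items()))                      # 7
```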

The Exponential Distribution

This is the most frequently used distribution for modeling interarrival and service times in queueing systems. In many applications, it is regarded as doing a good job of balancing realism and mathematical tractability. Typical examples:
Time between customer arrivals at an ATM
Time until a machine in a workcenter breaks down
Time between calls to a reservation system

Exponential Random Variable, T

Let λ > 0 be the parameter (rate) of the exponential distribution.

Probability density function (pdf): f(t) = λe^{−λt} for t ≥ 0, and f(t) = 0 for t < 0
Cumulative distribution function: F(t) = Pr{T ≤ t} = ∫_0^t λe^{−λu} du = 1 − e^{−λt} (t ≥ 0)

Expected value of T: E[T] = ∫_0^∞ t f(t) dt = ∫_0^∞ t λe^{−λt} dt = 1/λ
Variance of T: Var[T] = E[(T − 1/λ)^2] = ∫_0^∞ (t − 1/λ)^2 λe^{−λt} dt = 1/λ^2
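As a sanity check on these formulas, one can sample from the distribution and compare the empirical mean and variance with 1/λ and 1/λ²; a sketch using only Python’s standard library:

```python
# Sample an exponential with rate lam and check E[T] ~ 1/lam, Var[T] ~ 1/lam^2.
import random
import statistics

lam = 3.0
samples = [random.expovariate(lam) for _ in range(200_000)]
print(statistics.mean(samples))       # ~0.333 = 1/lam
print(statistics.variance(samples))   # ~0.111 = 1/lam^2
```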

Memoryless Property of an Exponential Random Variable

Pr{T > a + b | T > b} = Pr{T > a} for all a, b ≥ 0.

This says that if we’ve already waited b units of time and the event has not occurred (e.g., the next customer hasn’t arrived), then the probability distribution governing the remaining time until its occurrence is the same as it would be if the system were restarted. That is, the process “forgets” that the event has not yet occurred. This is often a good assumption when the time of the next arrival is not “influenced” by the last arrival.

Proof: Using P(A|B) = P(A ∩ B) / P(B),

P(T > a + b | T > b) = P(T > a + b and T > b) / P(T > b) = P(T > a + b) / P(T > b)
= e^{−λ(a+b)} / e^{−λb} = e^{−λa} e^{−λb} / e^{−λb} = e^{−λa} = P(T > a).
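The identity can also be checked numerically straight from the closed form P(T > t) = e^{−λt}; an illustrative sketch with arbitrarily chosen a and b:

```python
# Check P(T > a + b | T > b) == P(T > a) for an exponential with rate lam.
import math

lam, a, b = 3.0, 0.4, 1.7
lhs = math.exp(-lam * (a + b)) / math.exp(-lam * b)   # P(T > a+b) / P(T > b)
rhs = math.exp(-lam * a)                              # P(T > a)
print(lhs, rhs, math.isclose(lhs, rhs))               # ... True
```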

Using the Exponential Distribution

Calls arrive at an emergency hotline switchboard at a rate of 3 per hour. It is now 8:00 AM.
(a) What is the probability that the first call arrives after 8:30 AM?
(b) What is the probability that the first call arrives between 8:15 and 8:45 AM?
(c) Given that no calls have arrived by 8:30 AM, what is the probability that one or more calls arrive by 9:00 AM?

Solutions

Let T = the interarrival time random variable. Assume T is exponentially distributed with λ = 3, so F(t) = P(T ≤ t) = 1 − e^{−3t} and f(t) = 3e^{−3t}.

(a) P(T > ½) = e^{−3(1/2)} = e^{−3/2} ≈ 0.223

(b) P(¼ < T < ¾) = ∫_{1/4}^{3/4} 3e^{−3u} du = F(¾) − F(¼) = [1 − e^{−9/4}] − [1 − e^{−3/4}] = e^{−3/4} − e^{−9/4} ≈ 0.367

(c) P(T ≤ 1 | T > ½) = 1 − P(T > 1 | T > ½) = 1 − P(T > ½)   [memoryless property]
= 1 − e^{−3(1/2)} ≈ 0.777
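The three answers are quick to reproduce numerically; a sketch using only the survival function P(T > t) = e^{−3t}:

```python
# Hotline example: reproduce answers (a)-(c) with the exponential tail.
import math

lam = 3.0
def survival(t: float) -> float:        # P(T > t) = exp(-lam * t)
    return math.exp(-lam * t)

print(survival(0.5))                    # (a) ~0.223
print(survival(0.25) - survival(0.75))  # (b) e^{-3/4} - e^{-9/4} ~0.367
print(1 - survival(0.5))                # (c) ~0.777 via memorylessness
```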

Relationship between the Exponential Distribution and the Poisson “Counting” Distribution

The exponential distribution is a continuous (time) distribution that, in our setting, models the waiting time until the next event (e.g., the next customer arrival). The Poisson distribution is a discrete (counting) distribution that governs the total number of events (e.g., arrivals) in the time interval (0, t).

Fact: If the inter-event times are independent exponential RVs with rate λ, then the number of events that occur in the interval (0, t) is governed by a Poisson distribution with mean λt (that is, the arrivals form a Poisson process with rate λ). And vice versa.

Poisson Distribution

X_t = # of events (e.g., arrivals) in the interval (0, t) is governed by the following probability mass function:
Pr{X_t = n} = (λt)^n e^{−λt} / n!, for n = 0, 1, …
E[X_t] = λt = arrival rate × length of interval
Note: Pr{X_t = 0} = Pr{T > t} = e^{−λt}
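A small helper makes the pmf concrete and confirms the note that Pr{X_t = 0} equals the exponential tail e^{−λt} (a sketch, with λ = 3 and t = ½ picked for illustration):

```python
# Poisson pmf P{X_t = n} = (lam*t)^n e^{-lam*t} / n!, with a consistency check.
import math

def poisson_pmf(n: int, lam: float, t: float) -> float:
    mu = lam * t
    return mu**n * math.exp(-mu) / math.factorial(n)

print(poisson_pmf(0, 3.0, 0.5))   # P{X_t = 0}
print(math.exp(-3.0 * 0.5))       # P{T > t} = e^{-lam*t}, the same value
```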

Example (revisited)

Calls arrive at an emergency hotline switchboard once every 20 minutes on average. It is now 8:00 AM. Use the Poisson distribution to answer the following questions.

(a) What is the probability that the first call arrives after 8:30 AM?

Solution: X_t ~ Poisson(3t); in particular X_½ ~ Poisson(3/2), where X_½ = # of arrivals in the first ½ hour.
P(X_½ = 0) = (3/2)^0 e^{−3/2} / 0! = e^{−3/2} ≈ 0.223

(b) What is the probability that the first call arrives between 8:15 AM and 8:45 AM?

Solution: Let Y1 = # of arrivals in the first 15 minutes and Y2 = # of arrivals between 8:15 and 8:45 AM. We must determine Pr{Y1 = 0 and Y2 ≥ 1}. Now Y1 ~ Poisson(3/4), Y2 ~ Poisson(3/2), and the events {Y1 = 0} and {Y2 ≥ 1} are independent, so

P(Y1 = 0 and Y2 ≥ 1) = P(Y1 = 0) P(Y2 ≥ 1) = P(Y1 = 0)[1 − P(Y2 = 0)]
= [(3/4)^0 e^{−3/4} / 0!][1 − (3/2)^0 e^{−3/2} / 0!] = e^{−3/4} − e^{−9/4} ≈ 0.367

(c) Given that no calls have arrived by 8:30 AM, what is the probability that one or more calls arrive by 9:00 AM?

Solution: We want Pr{Y2 ≥ 1 | Y1 = 0}, where Y1 = # of arrivals between 8:00 and 8:30 AM and Y2 = # of arrivals between 8:30 and 9:00 AM, with Y1 ~ Poisson(3/2) and Y2 ~ Poisson(3/2). Y1 and Y2 are independent because they are defined on non-overlapping time intervals, so

P(Y2 ≥ 1 | Y1 = 0) = P(Y2 ≥ 1) = 1 − P(Y2 = 0) = 1 − (3/2)^0 e^{−3/2} / 0! = 1 − e^{−3/2} ≈ 0.777
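All three Poisson answers can be reproduced in a few lines (a sketch mirroring the solutions above, with rate 3 per hour):

```python
# Hotline example via the Poisson counting distribution.
import math

def p_zero(mu: float) -> float:          # P(N = 0) for N ~ Poisson(mu)
    return math.exp(-mu)

print(p_zero(1.5))                       # (a) no calls in (8:00, 8:30): ~0.223
print(p_zero(0.75) * (1 - p_zero(1.5)))  # (b) none in first 15 min, >=1 in next 30: ~0.367
print(1 - p_zero(1.5))                   # (c) >=1 call in (8:30, 9:00): ~0.777
```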

Multiple Arrival Streams

If customers of type 1 arrive according to a Poisson process with rate λ1 and customers of type 2 arrive according to an independent Poisson process with rate λ2, then customers arrive, regardless of type, according to a Poisson process with rate λ = λ1 + λ2.

Generalization: Let T_i ~ exp(λ_i), i = 1, …, n, be independent and define T = min{T_1, …, T_n}. Then T ~ exp(λ), where λ = λ1 + ··· + λn.

Restated: the minimum of several independent exponential RVs is an exponential random variable whose rate is the sum of the individual rates.
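The minimum-of-exponentials fact is easy to illustrate by simulation; a sketch with rates 1 and 2 (chosen arbitrarily), where the minimum should have mean 1/(1 + 2):

```python
# Empirical check: min of independent exponentials with rates 1 and 2
# behaves like an exponential with rate 3.
import random
import statistics

rates = [1.0, 2.0]
mins = [min(random.expovariate(r) for r in rates) for _ in range(200_000)]
print(statistics.mean(mins))   # ~0.333 = 1/(1 + 2)
```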

What You Should Know About Probability

How to identify events and the corresponding sample space.
How to work with conditional probabilities.
How to work with probability functions (e.g., normal, exponential, uniform, discrete).
How to work with the Poisson distribution.