Lectures prepared by: Elchanan Mossel, Yelena Shvets. Introduction to Probability, Stat 134, Fall 2005, Berkeley. Follows Jim Pitman's book: Probability, Section 4.2.

Random Times Random times are often described by a distribution with a density. Examples: Lifetime of an individual from a given population. Time until decay of a radioactive atom. Lifetime of a circuit. Time until a phone call arrives.

Random Times If T is a random time then its range is [0, ∞). A random time with density f(t) satisfies: P(a ≤ T ≤ b) = ∫_a^b f(t) dt. The probability of surviving past time s is called the survival function: P(T > s) = ∫_s^∞ f(t) dt. Note that the survival function completely characterizes the distribution: P(a ≤ T ≤ b) = P(T > a) − P(T > b).

Exponential Distribution Definition: A random time T has an Exponential(λ) distribution if it has probability density f(t) = λe^(−λt), (t > 0). Equivalently, for 0 ≤ a ≤ b < ∞: P(a < T < b) = ∫_a^b λe^(−λt) dt = e^(−λa) − e^(−λb). Equivalently: P(T > t) = e^(−λt), (t > 0). Claim: If T has the Exp(λ) distribution then: E(T) = SD(T) = 1/λ.
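The claim E(T) = SD(T) = 1/λ can be checked numerically. The following is a minimal sketch (not part of the slides) that draws Exp(λ) samples by inverse-transform sampling: if U is Uniform(0,1), then −ln(U)/λ has the Exp(λ) distribution.

```python
import math
import random

# Sketch: estimate E(T) and SD(T) for T ~ Exp(lam) by simulation,
# using the inverse-transform method: -ln(U)/lam ~ Exp(lam) for U ~ Uniform(0,1).
random.seed(0)
lam = 2.0
n = 200_000
samples = [-math.log(random.random()) / lam for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
sd = math.sqrt(var)

# Both should be close to 1/lam = 0.5.
print(round(mean, 2), round(sd, 2))
```

With 200,000 samples, both estimates land within a couple of percent of 1/λ, consistent with the claim.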

Exponential densities f(t) = λe^(−λt), (t > 0), plotted for several values of λ.

Memoryless Property Claim: A positive random variable T has the Exp(λ) distribution iff T has the memoryless property: P(T > t+s | T > t) = P(T > s), (t, s > 0). Proof: If T ~ Exp(λ), then P(T > t+s | T > t) = P(T > t+s & T > t)/P(T > t) = P(T > t+s)/P(T > t) = e^(−λ(t+s))/e^(−λt) = e^(−λs) = P(T > s). Conversely, if T has the memoryless property, the survival function G(t) = P(T > t) must solve G(t+s) = G(t)G(s). Thus L(t) = log G(t) satisfies L(t+s) = L(t) + L(s), and L is decreasing. The only decreasing solutions of this functional equation are linear, so L(t) = −λt for some λ > 0, i.e. G(t) = e^(−λt).
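The forward direction of the claim can be verified directly from the survival function. A small sketch (not from the slides), checking the identity at a few (t, s) pairs:

```python
import math

# Sketch: check the memoryless identity P(T > t+s | T > t) = P(T > s)
# directly from the exponential survival function G(t) = exp(-lam * t).
lam = 1.5

def survival(t):
    return math.exp(-lam * t)

for t, s in [(0.5, 1.0), (2.0, 0.3), (4.0, 4.0)]:
    lhs = survival(t + s) / survival(t)  # P(T > t+s | T > t)
    rhs = survival(s)                    # P(T > s)
    assert math.isclose(lhs, rhs)
```

The identity holds exactly (up to floating point), because e^(−λ(t+s))/e^(−λt) = e^(−λs) for every t and s.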

Memoryless Property Mr. D. is expecting a phone call. Given that the call hasn't come by time t minutes, the chance that it will not arrive in the next s minutes is the same as the chance that it hadn't come in the first s minutes: P(T > t+s | T > t) = P(T > s), (t, s ≥ 0).

Hazard Rate The constant parameter λ can be interpreted as the instantaneous death rate or hazard rate: P(T ≤ t + Δ | T > t) = 1 − P(T > t + Δ | T > t) = 1 − P(T > Δ), by the memoryless property, = 1 − e^(−λΔ) = 1 − [1 − λΔ + ½(λΔ)² − …] ≈ λΔ. The approximation holds if Δ is small (so we can neglect higher-order terms).
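The quality of the approximation 1 − e^(−λΔ) ≈ λΔ improves rapidly as Δ shrinks, since the neglected terms are of order (λΔ)². A quick numerical sketch (not from the slides):

```python
import math

# Sketch: for small delta, P(T <= t + delta | T > t) = 1 - exp(-lam*delta),
# which is approximately lam*delta; the relative error shrinks with delta.
lam = 3.0
for delta in [0.1, 0.01, 0.001]:
    exact = 1 - math.exp(-lam * delta)
    approx = lam * delta
    rel_err = abs(exact - approx) / exact
    print(delta, round(rel_err, 4))
```

The relative error drops by roughly a factor of 10 each time Δ does, reflecting the ½(λΔ)² leading error term.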

Exponential & Geometric Distributions Claim: Suppose G(p) has a Geometric(p) distribution; then for all t > 0, lim_{p→0} P(pG(p)/λ > t) = e^(−λt). Proof: lim_{p→0} P(pG(p)/λ > t) = lim_{p→0} P(G(p) > λt/p) = lim_{p→0} (1−p)^⌈λt/p⌉ = e^(−λt).
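The limit in the claim can be seen numerically. This sketch (not from the slides) evaluates (1−p)^⌈λt/p⌉ for shrinking p, with G(p) counting trials up to and including the first success, so that P(G(p) > k) = (1−p)^k:

```python
import math

# Sketch: numerical check that P(p*G(p)/lam > t) = (1-p)^ceil(lam*t/p)
# converges to exp(-lam*t) as p -> 0.
lam, t = 2.0, 0.7
target = math.exp(-lam * t)
for p in [0.1, 0.01, 0.001]:
    prob = (1 - p) ** math.ceil(lam * t / p)
    print(p, round(prob, 4), round(target, 4))
```

By p = 0.001 the geometric tail probability agrees with e^(−λt) to about three decimal places, illustrating how rescaled geometric waiting times converge to an exponential.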

Relationship to the Poisson Process Let the unit interval [0, 1] be subdivided into n cells of length 1/n, each occupied independently with probability p = λ/n. Note that the number of occupied cells is Bin(n, λ/n).

Relationship to the Poisson Process Question: What's the distribution of the successes in an interval (s, t)? The number of successes in (t−s)n trials is distributed as Bin((t−s)n, λ/n) → Poisson((t−s)λ). Question: What is the distribution of the W_i's, the waiting times between success i−1 and success i? Time = (# of empty cells + 1) · (1/n) ~ Geom(λ/n) · (1/n) → Exp(λ). (In the pictured subdivision, for example, W_2 = 3/n, W_3 = 2/n, W_4 = 1/n, W_5 = 2/n, W_6 = 3/n, W_7 = W_8 = W_9 = 1/n.)

Two Descriptions of the Poisson Arrival Process 1. Counts of arrivals: The number of arrivals in a fixed time interval of length t is a Poisson(λt) random variable: P(N(t) = k) = e^(−λt)(λt)^k / k!. The numbers of arrivals on disjoint intervals are independent. 2. Times between successes: The waiting times W_i between successes are Exp(λ) random variables: P(W_i > t) = e^(−λt). The random variables W_i are independent.
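The equivalence of the two descriptions can be tested by simulation: generate a process from description 2 (i.i.d. exponential waiting times) and check that the counts behave as description 1 predicts. A sketch (the helper `count_arrivals` is assumed, not from the slides):

```python
import math
import random

# Sketch: build arrival times from i.i.d. Exp(lam) waiting times and check
# that the number of arrivals in [0, t] behaves like a Poisson(lam*t) count.
random.seed(1)
lam, t, trials = 3.0, 2.0, 20_000

def count_arrivals(lam, t):
    clock, n = 0.0, 0
    while True:
        clock += random.expovariate(lam)  # waiting time W_i ~ Exp(lam)
        if clock > t:
            return n
        n += 1

counts = [count_arrivals(lam, t) for _ in range(trials)]
mean = sum(counts) / trials            # should be close to lam*t = 6
frac_zero = counts.count(0) / trials   # should be close to exp(-6)
print(round(mean, 2), round(frac_zero, 4))
```

The sample mean comes out near λt = 6 and the fraction of runs with no arrivals near e^(−6) ≈ 0.0025, matching the Poisson(λt) count description.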

Poisson Arrival Process Suppose phone calls are coming into a telephone exchange at an average rate of λ = 3 per minute. So the number of calls N(s,t) between minutes s and t satisfies N(s,t) ~ Poisson(3(t−s)), and the waiting time between the (i−1)st and the i-th calls satisfies W_i ~ Exp(3). Question: What's the probability that no call arrives between t=0 and t=2? Solution: N(0,2) ~ Poisson(6), so P(N(0,2) = 0) = e^(−6) ≈ 0.0025.

Poisson Arrival Process Question: What’s the probability that the first call after t=0 takes more than 2 minute to arrive? Solution: W 1  Exp(3), P(W 1  2 ) = e -3*2 = Note: the answer is the same as in the first question. So for the number of calls between the minutes s and t, we have: N(s,t)  Poisson(3(t-s)). For the waiting time between the (i-1) st and the i th calls, we have: W i  Exp(3). Suppose phone calls are coming into a telephone exchange at an average rate of  3 per minute.

Poisson Arrival Process Question: What’s the probability that no call arrives between t=0 and t=2 and at most 4 calls arrive between t=2 and t=3 ? Solution: By independence of N(0,2) and N(2,3), this is P(N(0,2) = 0) * P(N(2,3)  4) = e -6 e -3 ( /2! /3! /4!) =0.0020, So for the number of calls between the minutes s and t, we have: N(s,t)  Poisson(3(t-s)). For the waiting time between the (i-1) st and the i th calls, we have: W i  Exp(3). Suppose phone calls are coming into a telephone exchange at an average rate of  3 per minute.

Poisson Arrival Process Question: What’s the probability that the fourth call arrives within 30 seconds of the third? Solution: P(W 4  0.5) = 1 – P(W 4 > 0.5) = 1 - e -3/2 = So for the number of calls between the minutes s and t, we have: N(s,t)  Poisson(3(t-s)). For the waiting time between the (i-1) st and the i th calls, we have: W i  Exp(3). Suppose phone calls are coming into a telephone exchange at an average rate of  3 per minute.

Poisson Arrival Process Question: What’s the probability that the fifth call takes more than 2 minutes to arrive? Solution: P(W 1 + W 2 + W 3 + W 4 + W 5 > 2) = P(N(0,2)  4) P(N(0,2)  4) = e -6 ( /2! /3! /4!) = So for the number of calls between the minutes s and t, we have: N(s,t)  Poisson(3(t-s)). For the waiting time between the (i-1) st and the i th calls, we have: W i  Exp(3). Suppose phone calls are coming into a telephone exchange at an average rate of  3 per minute.

Poisson r-th Arrival Times Let T_r denote the time of the r-th arrival in a Poisson process with intensity λ. The distribution of T_r is called the Gamma(r, λ) distribution. Note that T_r = W_1 + W_2 + … + W_r, where the W_i's are the waiting times between arrivals. Density: P(T_r ∈ (t, t+dt))/dt = P(N(0,t) = r−1) · λ = λ e^(−λt)(λt)^(r−1)/(r−1)!. Right tail probability: P(T_r > t) = P(N(0,t) ≤ r−1). Mean and SD: E(T_r) = r/λ, SD(T_r) = √r/λ.
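Both the moments and the right-tail identity can be checked by simulating T_r as a sum of r independent exponential waiting times. A sketch (not from the slides):

```python
import math
import random

# Sketch: T_r = W_1 + ... + W_r with W_i ~ Exp(lam).  Check E(T_r) = r/lam,
# SD(T_r) = sqrt(r)/lam, and the tail identity P(T_r > t) = P(N(0,t) <= r-1).
random.seed(2)
lam, r, t, n = 2.0, 5, 3.0, 50_000

samples = [sum(random.expovariate(lam) for _ in range(r)) for _ in range(n)]
mean = sum(samples) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in samples) / n)

tail_sim = sum(x > t for x in samples) / n
mu = lam * t  # N(0,t) ~ Poisson(lam * t)
tail_exact = math.exp(-mu) * sum(mu ** j / math.factorial(j) for j in range(r))

print(round(mean, 2), round(sd, 2), round(tail_sim, 3), round(tail_exact, 3))
```

With these parameters the simulated mean and SD land near r/λ = 2.5 and √r/λ ≈ 1.118, and the empirical tail frequency matches P(Poisson(λt) ≤ r−1).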

Gamma densities for r = 1 to 10. Time is in multiples of 1/λ; probability density is in multiples of λ. Due to the central limit theorem, the Gamma(r, λ) distribution becomes asymptotically normal as r → ∞.

Relationship between Poisson and Exponential.

Property/idea                              Discrete                          Continuous
Wait time until one success                Geometric(p) (trials)             Exponential(λ) (time)
Number of successes in a fixed interval    Binomial(n, p) (fixed trials n)   Poisson(λt) (fixed time t)
Wait time until k successes                NegBin(k, p)                      Gamma(k, λ)
P(k-th success comes after a given point)  P(NegBin(k, p) > n)               P(Gamma(k, λ) > t)
P(fewer than k successes by a given point) P(Binomial(n, p) ≤ k−1)           P(Poisson(λt) ≤ k−1)