Example
A fair coin is tossed 5,000 times. Find the probability that the number of heads is between 2,475 and 2,525.

Solution
We need
$$P(2475 \le k \le 2525) = \sum_{k=2475}^{2525} \binom{5000}{k} \left(\tfrac{1}{2}\right)^{5000}.$$
Since $n$ is large we can use the normal (DeMoivre-Laplace) approximation
$$P(k_1 \le k \le k_2) \approx \Phi\!\left(\frac{k_2 - np}{\sqrt{npq}}\right) - \Phi\!\left(\frac{k_1 - np}{\sqrt{npq}}\right),$$
so that with $n = 5000$ and $p = q = 1/2$ we have $np = 2500$ and $\sqrt{npq} = \sqrt{1250} \approx 35.36$. The approximation is valid for $k_1 = 2475$ and $k_2 = 2525$, since $npq \gg 1$ and both limits lie within about one standard deviation of $np$. Thus
$$P(2475 \le k \le 2525) \approx \Phi(25/35.36) - \Phi(-25/35.36) = 2\Phi(0.707) - 1 \approx 0.52.$$
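A quick numerical check of this slide, comparing the exact binomial sum against the normal approximation (a sketch; the variable names are illustrative):

```python
from fractions import Fraction
from math import comb, erf, sqrt

n, p = 5000, 0.5
k1, k2 = 2475, 2525

# Exact binomial probability P(2475 <= k <= 2525), via big-integer arithmetic.
exact = float(Fraction(sum(comb(n, k) for k in range(k1, k2 + 1)), 2**n))

# DeMoivre-Laplace approximation, with Phi(x) = (1 + erf(x / sqrt(2))) / 2.
mu, sigma = n * p, sqrt(n * p * (1 - p))            # 2500 and ~35.36
phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
approx = phi((k2 - mu) / sigma) - phi((k1 - mu) / sigma)

print(exact, approx)   # both values come out near 0.52
```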
The Poisson Approximation
For large $n$, the Gaussian approximation of a binomial r.v. is valid only if $p$ is fixed, i.e., only if $np \gg 1$ and $npq \gg 1$. What if $np$ is small, or if it does not increase with $n$? For example, suppose $n \to \infty$ and $p \to 0$ such that $np = \lambda$ is a fixed number.
The Poisson Approximation
Consider random arrivals such as telephone calls over a line. Let $n$ be the total number of calls in the interval $(0, T)$. From experience, as $T \to \infty$ we have $n \to \infty$. Suppose $\Delta$ is a small interval of duration inside $(0, T)$.
The Poisson Approximation
Let $p$ be the probability that a single call (occurring during $0$ to $T$) falls inside $\Delta$:
$$p = \frac{\Delta}{T} \to 0 \quad \text{as } T \to \infty.$$
The normal approximation is invalid here. For the interval $\Delta$ in the figure, call it a
(H) “success” if a call lands inside $\Delta$,
(T) “failure” if a call lands outside $\Delta$.
Then
$$P_n(k) = \binom{n}{k} p^k (1-p)^{n-k}$$
is the probability of obtaining $k$ calls (in any order) in the interval of duration $\Delta$. Holding $np = \lambda$ fixed as $n \to \infty$ and $p \to 0$, this binomial probability tends to the Poisson distribution:
$$P_n(k) \to e^{-\lambda} \frac{\lambda^k}{k!}.$$
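The convergence of the binomial p.m.f. to the Poisson p.m.f. can be seen numerically (a sketch; the rate $\lambda = 2$ is an arbitrary illustrative choice):

```python
from math import comb, exp, factorial

lam = 2.0  # lam = n*p held fixed; 2.0 is an arbitrary choice for illustration

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# As n grows with p = lam/n, the binomial pmf approaches the Poisson pmf.
for n in (10, 100, 10000):
    p = lam / n
    err = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam)) for k in range(10))
    print(n, err)   # the maximum discrepancy shrinks as n increases
```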
Example: Winning a Lottery
Suppose two million lottery tickets are issued, with 100 winning tickets among them.
a) If a person purchases 100 tickets, what is the probability of winning?

Solution
The probability of buying a winning ticket is
$$p = \frac{100}{2{,}000{,}000} = 5 \times 10^{-5}.$$
Winning a Lottery - continued
Let $X$ be the number of winning tickets among the $n$ purchased tickets. Since $n = 100$ is large and $p$ is small, $X$ has an approximate Poisson distribution with parameter
$$\lambda = np = 100 \times 5 \times 10^{-5} = 0.005.$$
So the probability of winning is
$$P(\text{win}) = P(X \ge 1) = 1 - P(X = 0) = 1 - e^{-\lambda} \approx 0.005.$$
Winning a Lottery - continued
b) How many tickets should one buy to be 95% confident of having a winning ticket?

Solution
We need
$$P(X \ge 1) = 1 - e^{-\lambda} \ge 0.95.$$
But this gives $e^{-\lambda} \le 0.05$, or $\lambda = np \ge \ln 20 \approx 3$, so that
$$n \ge \frac{3}{5 \times 10^{-5}} = 60{,}000.$$
Thus one needs to buy about 60,000 tickets to be 95% confident of having a winning ticket!
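Both parts of the lottery example can be checked in a few lines (a sketch under the slide's numbers: two million tickets, 100 winners):

```python
from math import ceil, exp, log

p = 100 / 2_000_000        # probability a single ticket wins: 5e-5

# (a) With n = 100 tickets, X is approximately Poisson with lam = n * p.
lam = 100 * p
p_win = 1 - exp(-lam)      # P(X >= 1), roughly 0.005

# (b) Smallest lam with 1 - exp(-lam) >= 0.95 is lam = ln(20) ~ 3,
#     so the required number of tickets is n = ln(20) / p.
n_needed = log(20) / p
print(p_win, ceil(n_needed))   # about 60,000 tickets
```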
Example: Danger in Space Mission
A spacecraft has 100,000 components ($n$ is large). The probability of any one component being defective is $p = 2 \times 10^{-5}$ ($p$ is small). The mission will be in danger if five or more components become defective. Find the probability of such an event.

Solution
Since $n$ is large and $p$ is small, we use the Poisson approximation with parameter $\lambda = np = 2$:
$$P(X \ge 5) = 1 - P(X \le 4) = 1 - \sum_{k=0}^{4} e^{-2} \frac{2^k}{k!} = 1 - 7e^{-2} \approx 0.052.$$
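A numerical check of the tail probability, assuming a per-component defect probability of $2 \times 10^{-5}$ so that $\lambda = np = 2$:

```python
from math import exp, factorial

n, p = 100_000, 2e-5   # component count; defect probability assumed as 2e-5
lam = n * p            # Poisson parameter, here 2

# P(five or more defective) = 1 - P(X <= 4) under the Poisson approximation.
p_danger = 1 - sum(exp(-lam) * lam**k / factorial(k) for k in range(5))
print(p_danger)        # roughly 0.052
```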
Example
Let $B$ represent the event $\{a < X \le b\}$ with $P(B) > 0$. For a given $F_X(x)$, determine $F_X(x \mid B)$ and $f_X(x \mid B)$.

Solution
By definition,
$$F_X(x \mid B) = \frac{P(X \le x,\; a < X \le b)}{P(a < X \le b)}.$$
Example - continued
For $x < a$ we have $\{X \le x\} \cap \{a < X \le b\} = \varnothing$, and hence $F_X(x \mid B) = 0$.
For $a \le x < b$ we have $\{X \le x\} \cap \{a < X \le b\} = \{a < X \le x\}$, and hence
$$F_X(x \mid B) = \frac{F_X(x) - F_X(a)}{F_X(b) - F_X(a)}.$$
For $x \ge b$ we have $\{a < X \le b\} \subset \{X \le x\}$, so that $F_X(x \mid B) = 1$. Thus,
$$f_X(x \mid B) = \frac{f_X(x)}{F_X(b) - F_X(a)}, \quad a < x \le b,$$
and zero otherwise.
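The three-case formula can be validated by simulation. This sketch assumes a concrete distribution, $X \sim$ Exponential(1), and compares the formula against empirically conditioned samples:

```python
import random
from math import exp

# Assumed setup for illustration: X ~ Exponential(1), F(x) = 1 - e^{-x},
# and the conditioning event B = {a < X <= b}.
a, b = 0.5, 2.0
F = lambda x: 1 - exp(-x) if x > 0 else 0.0

def F_cond(x):
    # F_X(x | B), following the three cases above.
    if x < a:
        return 0.0
    if x < b:
        return (F(x) - F(a)) / (F(b) - F(a))
    return 1.0

# Monte Carlo check: keep only the samples of X that satisfy the event B.
random.seed(1)
samples = [random.expovariate(1.0) for _ in range(200_000)]
inside = [x for x in samples if a < x <= b]
x0 = 1.0
empirical = sum(x <= x0 for x in inside) / len(inside)
print(empirical, F_cond(x0))   # the two values should nearly agree
```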
Conditional p.d.f & Bayes’ Theorem
First, we extend the conditional probability results to random variables. We know that if $\{A_i\}$ is a partition of $S$ and $B$ is an arbitrary event, then
$$P(B) = \sum_i P(B \mid A_i)\, P(A_i).$$
By setting $B = \{X \le x\}$ we obtain
$$F_X(x) = \sum_i F_X(x \mid A_i)\, P(A_i), \qquad f_X(x) = \sum_i f_X(x \mid A_i)\, P(A_i).$$
Conditional p.d.f & Bayes’ Theorem
Let $A = \{x < X \le x + \Delta x\}$, so that in the limit as $\Delta x \to 0$,
$$P(B \mid X = x) = \frac{f_X(x \mid B)\, P(B)}{f_X(x)},$$
or
$$f_X(x \mid B) = \frac{P(B \mid X = x)\, f_X(x)}{P(B)}.$$
We also get
$$P(B) = \int_{-\infty}^{\infty} P(B \mid X = x)\, f_X(x)\, dx. \qquad \text{(Total Probability Theorem)}$$
Bayes’ Theorem (continuous version)
Using the total probability theorem in
$$f_X(x \mid B) = \frac{P(B \mid X = x)\, f_X(x)}{P(B)},$$
we get the desired result:
$$f_X(x \mid B) = \frac{P(B \mid X = x)\, f_X(x)}{\int_{-\infty}^{\infty} P(B \mid X = x)\, f_X(x)\, dx}.$$
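A small numerical sketch of the continuous Bayes formula, with assumed choices: $X \sim$ Exponential(1), so $f_X(x) = e^{-x}$, and $P(B \mid X = x) = e^{-x}$. The formula then gives $P(B) = 1/2$ and $f_X(x \mid B) = 2e^{-2x}$:

```python
from math import exp

# Assumed model: f(x) = e^{-x} on (0, inf), P(B | X = x) = e^{-x}.
f = lambda x: exp(-x)
PB_given_x = lambda x: exp(-x)

# P(B) = integral of P(B | X = x) f(x) dx over (0, inf), by a Riemann sum.
dx = 1e-4
PB = sum(PB_given_x(i * dx) * f(i * dx) for i in range(1, 300_000)) * dx

# The a-posteriori density from Bayes' theorem; analytically 2 e^{-2x}.
f_cond = lambda x: PB_given_x(x) * f(x) / PB
print(PB, f_cond(1.0))   # ~0.5 and ~2 * exp(-2)
```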
Example: Coin Tossing Problem Revisited
Let $p = P(H)$ denote the probability of obtaining a head in a toss. For a given coin, a-priori $p$ can possess any value in $(0, 1)$. In the absence of any additional information, assume the a-priori p.d.f. $f_P(p)$ is uniform in $(0, 1)$. After tossing the coin $n$ times, $k$ heads are observed. How can we update $f_P(p)$ with this new information?

Solution
Let $A$ = “$k$ heads in $n$ specific tosses.” Since these tosses result in a specific sequence,
$$P(A \mid P = p) = p^k (1-p)^{n-k},$$
and using the Total Probability Theorem we get
$$P(A) = \int_0^1 p^k (1-p)^{n-k}\, dp = \frac{k!\,(n-k)!}{(n+1)!}.$$
Example - continued
The a-posteriori p.d.f. $f_P(p \mid A)$ represents the updated information given the event $A$. Using Bayes’ theorem,
$$f_P(p \mid A) = \frac{P(A \mid P = p)\, f_P(p)}{P(A)} = \frac{(n+1)!}{k!\,(n-k)!}\, p^k (1-p)^{n-k}, \quad 0 < p < 1. \quad (1)$$
This is a beta distribution. We can use this a-posteriori p.d.f. to make further predictions. For example, in light of the above experiment, what can we say about the probability of a head occurring in the next, $(n+1)$th, toss?
Example - continued
Let $B$ = “head occurring in the $(n+1)$th toss, given that $k$ heads have occurred in the $n$ previous tosses.” Clearly
$$P(B \mid P = p) = p.$$
From the Total Probability Theorem,
$$P(B \mid A) = \int_0^1 p\, f_P(p \mid A)\, dp. \quad (2)$$
Using (1) in (2), we get
$$P(B \mid A) = \int_0^1 p\, \frac{(n+1)!}{k!\,(n-k)!}\, p^k (1-p)^{n-k}\, dp = \frac{k+1}{n+2}.$$
Thus, if $n = 10$ and $k = 6$, then $P(B \mid A) = 7/12 \approx 0.58$, which is more realistic compared to $p = 0.5$.
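The closed form $(k+1)/(n+2)$ can be checked by integrating $p$ against the (unnormalized) beta posterior numerically, a quick sketch for the slide's values $n = 10$, $k = 6$:

```python
# Numerical check of P(B | A) = (k+1)/(n+2) for n = 10, k = 6:
# integrate p * p^k (1-p)^{n-k} over (0,1) and divide by the normalizer.
n, k = 10, 6
N = 1_000_000
dx = 1.0 / N
post_norm = 0.0   # integral of the unnormalized posterior
post_mean = 0.0   # integral of p times the unnormalized posterior
for i in range(1, N):
    p = i * dx
    w = p**k * (1 - p)**(n - k)
    post_norm += w * dx
    post_mean += p * w * dx
print(post_mean / post_norm)   # approaches 7/12 ~ 0.583
```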