
1 Binomial Random Variable Approximations, Conditional Probability Density Functions and Stirling’s Formula

2 Let X represent a binomial r.v. Then
$P(k_1 \le X \le k_2) = \sum_{k=k_1}^{k_2} \binom{n}{k} p^k q^{n-k}$,
and this sum is difficult to evaluate directly for large n. In this context, two approximations are extremely useful.

3 The Normal Approximation (DeMoivre-Laplace Theorem)
Suppose $n \to \infty$ with p held fixed. Then for k in the $\sqrt{npq}$ neighborhood of np, we can approximate
$\binom{n}{k} p^k q^{n-k} \simeq \frac{1}{\sqrt{2\pi npq}}\, e^{-(k-np)^2/2npq}$.
And we have
$P(k_1 \le X \le k_2) = \sum_{k=k_1}^{k_2} \binom{n}{k} p^k q^{n-k} \simeq \int_{k_1}^{k_2} \frac{1}{\sqrt{2\pi npq}}\, e^{-(x-np)^2/2npq}\, dx$, where $q = 1 - p$.
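As a quick numerical check of the point approximation, the Python sketch below compares the exact term $\binom{n}{k} p^k q^{n-k}$ with the Gaussian density above for a few k near np; the values n = 500, p = 0.4 and the chosen k's are illustrative, not taken from the slides.

# A minimal check of the point approximation above; n, p, and the k values
# are illustrative choices, not taken from the slides.
from math import comb, exp, pi, sqrt

n, p = 500, 0.4
q = 1.0 - p
npq = n * p * q                                   # here np = 200, sqrt(npq) ~ 10.95

for k in (185, 195, 200, 205, 215):               # k near np
    exact = comb(n, k) * p**k * q**(n - k)        # C(n,k) p^k q^(n-k)
    gauss = exp(-(k - n * p) ** 2 / (2 * npq)) / sqrt(2 * pi * npq)
    print(f"k={k}: exact={exact:.6f}  gaussian={gauss:.6f}")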

4 As we know, if $k_1$ and $k_2$ are within the interval $(np - \sqrt{npq},\, np + \sqrt{npq})$, the approximation becomes
$P(k_1 \le X \le k_2) \simeq \frac{1}{\sqrt{2\pi}} \int_{y_1}^{y_2} e^{-y^2/2}\, dy$, where $y_1 = \frac{k_1 - np}{\sqrt{npq}}$ and $y_2 = \frac{k_2 - np}{\sqrt{npq}}$.
We can express this formula in terms of the normalized integral
$\mathrm{erf}(x) = \frac{1}{\sqrt{2\pi}} \int_0^x e^{-y^2/2}\, dy$
that has been tabulated extensively.

5 [Table of the normalized integral $\mathrm{erf}(x)$ for selected values of x.]

6 Example
A fair coin is tossed 5,000 times. Find the probability that the number of heads is between 2,475 and 2,525.
Solution
We need $P(2475 \le X \le 2525)$. Since n is large we can use the normal approximation. Here $p = \tfrac{1}{2}$, so that $np = 2500$ and $\sqrt{npq} \simeq 35.36$, and $np - \sqrt{npq} \simeq 2465$ and $np + \sqrt{npq} \simeq 2535$. So the approximation is valid for $k_1 = 2475$ and $k_2 = 2525$.

7 Example - continued
Here $y_1 = \frac{2475 - 2500}{35.36} \simeq -0.707$ and $y_2 = \frac{2525 - 2500}{35.36} \simeq 0.707$. Using the table,
$P(2475 \le X \le 2525) \simeq \mathrm{erf}(0.707) - \mathrm{erf}(-0.707) = 2\,\mathrm{erf}(0.707) \simeq 0.52$.
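The same number can be reproduced with the standard normal CDF; note that Python's math.erf is the usual error function, not the normalized integral tabulated in the slides, so the CDF is written as $\Phi(y) = \tfrac{1}{2}(1 + \mathrm{erf}(y/\sqrt{2}))$. A minimal check:

# Re-computing the coin example with the standard normal CDF. Note math.erf is
# the usual error function, so Phi(y) = 0.5 * (1 + erf(y / sqrt(2))).
from math import erf, sqrt

n, p = 5000, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))          # 2500 and about 35.36

def Phi(y):                                       # standard normal CDF
    return 0.5 * (1.0 + erf(y / sqrt(2.0)))

y1 = (2475 - mu) / sigma                          # about -0.707
y2 = (2525 - mu) / sigma                          # about +0.707
print(Phi(y2) - Phi(y1))                          # about 0.52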

8 The Poisson Approximation
For large n, the Gaussian approximation of a binomial r.v. is valid only if p is fixed, i.e., only if $np \gg 1$ and $npq \gg 1$. What if np is small, or if it does not increase with n? For example, suppose $n \to \infty$ and $p \to 0$ such that $np = \lambda$ is a fixed number.

9 The Poisson Approximation
Consider random arrivals such as telephone calls over a line.
n: total number of calls in the interval (0, T); as $T \to \infty$ we have $n \to \infty$.
Suppose Δ is a small interval of duration inside (0, T).

10 The Poisson Approximation
p: probability of a single call (during 0 to T) occurring in Δ: $p = \frac{\Delta}{T} \to 0$ as $T \to \infty$. The normal approximation is invalid here.
Consider the interval Δ in the figure:
(H) "success": a call inside Δ
(T) "failure": a call outside Δ
$P_n(k)$: probability of obtaining k calls (in any order) in an interval of duration Δ,
$P_n(k) = \binom{n}{k} p^k (1-p)^{n-k}$.

11 The Poisson Approximation
Thus, as $n \to \infty$ and $p \to 0$ with $np = \lambda$ fixed,
$P_n(k) = \binom{n}{k} p^k (1-p)^{n-k} \to \frac{e^{-\lambda} \lambda^k}{k!}$,
the Poisson p.m.f.
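A short sketch illustrating this limit numerically: for $np = \lambda$ held fixed, the binomial p.m.f. approaches the Poisson p.m.f. as n grows. The values of λ, n, and k below are illustrative, not from the slides.

# Binomial p.m.f. versus its Poisson limit for np = lambda held fixed.
from math import comb, exp, factorial

lam = 3.0
for n in (100, 1000, 10000):
    p = lam / n
    for k in (0, 2, 5):
        binom = comb(n, k) * p**k * (1 - p) ** (n - k)
        poisson = exp(-lam) * lam**k / factorial(k)
        print(f"n={n:6d} k={k}: binomial={binom:.6f}  poisson={poisson:.6f}")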

12 Example: Winning a Lottery
Suppose two million lottery tickets are issued with 100 winning tickets among them.
a) If a person purchases 100 tickets, what is the probability of winning?
Solution
The probability of buying a winning ticket is $p = \frac{100}{2{,}000{,}000} = 5 \times 10^{-5}$.

13 Winning a Lottery - continued
X: number of winning tickets
n: number of purchased tickets = 100
X has an approximate Poisson distribution with parameter $\lambda = np = 100 \times 5 \times 10^{-5} = 0.005$, so $P(X = k) = \frac{e^{-\lambda} \lambda^k}{k!}$. The probability of winning is
$P(X \ge 1) = 1 - P(X = 0) = 1 - e^{-\lambda} \simeq 0.005$.

14 Winning a Lottery - continued
b) How many tickets should one buy to be 95% confident of having a winning ticket?
Solution
We need $P(X \ge 1) = 1 - e^{-\lambda} \ge 0.95$. But this gives $e^{-\lambda} \le 0.05$, or $\lambda = np \ge \ln 20 \simeq 3$, so $n \ge \frac{3}{5 \times 10^{-5}} = 60{,}000$. Thus one needs to buy about 60,000 tickets to be 95% confident of having a winning ticket!
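Both parts of the lottery example can be reproduced in a few lines; the only inputs are the ticket counts stated above (p = 5e-5), and ln 20 ≈ 3 is used for the 95% condition.

# Lottery example: p = 100 / 2,000,000 = 5e-5 per ticket and lambda = n * p.
from math import ceil, exp, log

p = 100 / 2_000_000                     # probability that a single ticket wins
print(1 - exp(-100 * p))                # part (a): about 0.005 with 100 tickets

# part (b): P(X >= 1) = 1 - e^{-n p} >= 0.95  <=>  n >= ln(20) / p
print(ceil(log(20) / p))                # about 60,000 tickets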

15 Example: Danger in Space Mission
A spacecraft has 100,000 components. Each component, independently, has a small probability p of being defective. The mission will be in danger if five or more components become defective. Find the probability of such an event.
Solution
n is large and p is small, so we use the Poisson approximation with parameter $\lambda = np$:
$P(X \ge 5) = 1 - P(X \le 4) = 1 - \sum_{k=0}^{4} \frac{e^{-\lambda} \lambda^k}{k!}$.
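The per-component defect probability is not legible in this transcript, so the sketch below only evaluates the Poisson tail $P(X \ge 5)$ for an assumed, illustrative value of $\lambda = np$; substitute the slide's actual λ to reproduce its number.

# The per-component defect probability is not legible in the transcript, so
# lam below is an assumed illustrative value of lambda = n * p.
from math import exp, factorial

lam = 2.0                               # assumed lambda = n * p
p_danger = 1 - sum(exp(-lam) * lam**k / factorial(k) for k in range(5))
print(p_danger)                         # P(five or more defective components)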

16 Conditional Probability Density Function
For an event B with $P(B) > 0$, the conditional distribution function of X given B is
$F_X(x \mid B) = P(X \le x \mid B) = \frac{P(X \le x,\, B)}{P(B)}$.

17 Conditional Probability Density Function
Further, $F_X(-\infty \mid B) = 0$, $F_X(+\infty \mid B) = 1$, and the conditional density is $f_X(x \mid B) = \frac{dF_X(x \mid B)}{dx}$. Since, for $x_1 < x_2$,
$P(x_1 < X \le x_2 \mid B) = F_X(x_2 \mid B) - F_X(x_1 \mid B) = \int_{x_1}^{x_2} f_X(x \mid B)\, dx$.

18 Example
Toss a coin, with X(T) = 0 and X(H) = 1. Suppose $B = \{H\}$. Determine $F_X(x \mid B)$.
Solution
$F_X(x)$ has the staircase form shown in figure (a), with jumps of size q at x = 0 and p at x = 1. We need $F_X(x \mid B)$ for all x. For $x < 0$, $\{X \le x\} = \varnothing$, so that $\{X \le x\} \cap B = \varnothing$ and $F_X(x \mid B) = 0$.
[Figures: (a) $F_X(x)$, (b) $F_X(x \mid B)$; both rise to 1.]

19 Example - continued
For $0 \le x < 1$, $\{X \le x\} = \{T\}$, so that $\{X \le x\} \cap B = \{T\} \cap \{H\} = \varnothing$ and $F_X(x \mid B) = 0$. For $x \ge 1$, $\{X \le x\} = \Omega$, so that $\{X \le x\} \cap B = B$ and $F_X(x \mid B) = P(B)/P(B) = 1$.

20 Example
Given $F_X(x)$, suppose $B = \{X \le a\}$. Find $f_X(x \mid B)$.
Solution
We will first determine $F_X(x \mid B) = \frac{P(X \le x,\, X \le a)}{P(X \le a)}$. For $x < a$, $\{X \le x\} \cap \{X \le a\} = \{X \le x\}$, so that $F_X(x \mid B) = \frac{F_X(x)}{F_X(a)}$.

21 Example - continued
For $x \ge a$, $\{X \le x\} \cap \{X \le a\} = \{X \le a\}$, so that $F_X(x \mid B) = 1$. Thus
$F_X(x \mid B) = \begin{cases} F_X(x)/F_X(a), & x < a \\ 1, & x \ge a \end{cases}$
and hence
$f_X(x \mid B) = \begin{cases} f_X(x)/F_X(a), & x < a \\ 0, & x \ge a. \end{cases}$
[Figures: (a) $F_X(x \mid B)$, (b) $f_X(x \mid B)$.]

22 Example
Let B represent the event $\{a < X \le b\}$ with $P(B) > 0$. For a given $F_X(x)$, determine $F_X(x \mid B)$ and $f_X(x \mid B)$.
Solution

23 Example - continued
For $x < a$, we have $\{X \le x\} \cap B = \varnothing$ and hence $F_X(x \mid B) = 0$. For $a \le x < b$, we have $\{X \le x\} \cap B = \{a < X \le x\}$ and hence $F_X(x \mid B) = \frac{F_X(x) - F_X(a)}{F_X(b) - F_X(a)}$. For $x \ge b$, we have $\{X \le x\} \cap B = B$, so that $F_X(x \mid B) = 1$. Thus,
$f_X(x \mid B) = \begin{cases} \dfrac{f_X(x)}{F_X(b) - F_X(a)}, & a < x \le b \\ 0, & \text{otherwise.} \end{cases}$
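As an illustration of this conditional distribution, the sketch below evaluates $F_X(x \mid a < X \le b)$ for an assumed exponential $F_X(x) = 1 - e^{-x}$ with a = 1 and b = 2; the distribution and the endpoints are illustrative choices, not from the slides.

# F_X(x | a < X <= b) = (F_X(x) - F_X(a)) / (F_X(b) - F_X(a)) on a <= x < b.
from math import exp

def F(x):                               # assumed unconditional CDF (exponential)
    return 1.0 - exp(-x) if x > 0 else 0.0

def F_cond(x, a, b):                    # conditional CDF given {a < X <= b}
    if x < a:
        return 0.0
    if x >= b:
        return 1.0
    return (F(x) - F(a)) / (F(b) - F(a))

for x in (0.5, 1.0, 1.5, 2.0, 2.5):
    print(x, F_cond(x, a=1.0, b=2.0))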

24 Conditional p.d.f & Bayes' Theorem
First, we extend the conditional probability results to random variables. We know that, for events A and B with $P(B) > 0$, $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$. If $\{A_i\}$ is a partition of S and B is an arbitrary event, then
$P(B) = \sum_i P(B \mid A_i)\, P(A_i)$.
By setting $B = \{X \le x\}$ we obtain
$F_X(x) = \sum_i F_X(x \mid A_i)\, P(A_i)$ and $f_X(x) = \sum_i f_X(x \mid A_i)\, P(A_i)$.

25 Conditional p.d.f & Bayes' Theorem
Using $P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}$ with $B = \{x_1 < X \le x_2\}$, we obtain, for $P(A) > 0$,
$P(A \mid x_1 < X \le x_2) = \frac{P(x_1 < X \le x_2 \mid A)\, P(A)}{P(x_1 < X \le x_2)} = \frac{\big(F_X(x_2 \mid A) - F_X(x_1 \mid A)\big)\, P(A)}{F_X(x_2) - F_X(x_1)}$.

26 Conditional p.d.f & Bayes' Theorem
Let $x_1 = x$ and $x_2 = x + \Delta x$, so that in the limit as $\Delta x \to 0$,
$P(A \mid X = x) = \frac{f_X(x \mid A)\, P(A)}{f_X(x)}$,
or $f_X(x \mid A)\, P(A) = P(A \mid X = x)\, f_X(x)$. We also get
$P(A) = \int_{-\infty}^{+\infty} P(A \mid X = x)\, f_X(x)\, dx$ (Total Probability Theorem).
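As a small illustration of this total probability theorem, under an assumed setup not taken from the slides: let X be uniform on (0, 1) and let A be an event with $P(A \mid X = x) = x$ (a coin whose bias is X); then $P(A) = \int_0^1 x\, dx = 1/2$, which the Riemann sum below reproduces.

# Continuous total probability check for an assumed setup (not from the slides):
# X ~ uniform(0,1) and P(A | X = x) = x, so P(A) = int_0^1 x * 1 dx = 0.5.
N = 100_000
dx = 1.0 / N
p_A = sum((i + 0.5) * dx * 1.0 * dx for i in range(N))   # P(A|X=x) * f_X(x) * dx
print(p_A)                                               # about 0.5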

27 Bayes' Theorem (continuous version)
Using the total probability theorem in the previous relation, we get the desired result:
$f_X(x \mid A) = \frac{P(A \mid X = x)\, f_X(x)}{P(A)} = \frac{P(A \mid X = x)\, f_X(x)}{\int_{-\infty}^{+\infty} P(A \mid X = x)\, f_X(x)\, dx}$.

28 Example: Coin Tossing Problem Revisited
p: probability of obtaining a head in a toss. For a given coin, a priori p can possess any value in (0, 1). In the absence of any additional information, we take the a priori p.d.f $f_p(p)$ to be uniform on (0, 1). After tossing the coin n times, k heads are observed. How can we update $f_p(p)$ given this new information?
Solution
Let A = "k heads in n specific tosses". Since these tosses result in a specific sequence,
$P(A \mid p) = p^k q^{n-k}$,
and using the Total Probability Theorem we get
$P(A) = \int_0^1 p^k (1-p)^{n-k} f_p(p)\, dp = \int_0^1 p^k (1-p)^{n-k}\, dp = \frac{(n-k)!\, k!}{(n+1)!}$.

29 Example - continued
The a posteriori p.d.f $f_p(p \mid A)$ represents the updated information given the event A. Using Bayes' theorem,
$f_p(p \mid A) = \frac{P(A \mid p)\, f_p(p)}{P(A)} = \frac{(n+1)!}{(n-k)!\, k!}\, p^k (1-p)^{n-k}, \quad 0 < p < 1.$   (1)
This is a beta distribution. We can use this a posteriori p.d.f to make further predictions. For example, in light of the above experiment, what can we say about the probability of a head occurring in the next, (n+1)th, toss?

30 Example - continued
Let B = "head occurring in the (n+1)th toss, given that k heads have occurred in n previous tosses". Clearly $P(B \mid p) = p$. From the Total Probability Theorem,
$P(B) = \int_0^1 P(B \mid p)\, f_p(p \mid A)\, dp$.   (2)
Using (1) in (2), we get
$P(B) = \int_0^1 p \cdot \frac{(n+1)!}{(n-k)!\, k!}\, p^k (1-p)^{n-k}\, dp = \frac{k+1}{n+2}$.
Thus, if n = 10 and k = 6, then $P(B) = 7/12 \approx 0.58$, which is more realistic compared to p = 0.5.
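A small numerical check of this prediction: with a uniform prior, the a posteriori density is the beta density above and the predictive probability of a head on the next toss is $(k+1)/(n+2)$; the Riemann-sum ratio below re-derives the same value for n = 10, k = 6.

# With a uniform prior, the predictive probability of a head on the next toss
# is (k + 1) / (n + 2). The Riemann-sum ratio below approximates
# E[p | A] = int p^(k+1) (1-p)^(n-k) dp / int p^k (1-p)^(n-k) dp and should agree.
n, k = 10, 6
print((k + 1) / (n + 2))                # 7/12, about 0.583

N = 200_000
grid = [(i + 0.5) / N for i in range(N)]
num = sum(p ** (k + 1) * (1 - p) ** (n - k) for p in grid)
den = sum(p ** k * (1 - p) ** (n - k) for p in grid)
print(num / den)                        # about 0.583 as well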

