Binomial Random Variable Approximations, Conditional Probability Density Functions and Stirling’s Formula.


Let X represent a binomial r.v. as in (3-42). Then from (2-30),

$P(k_1 \le X \le k_2) = \sum_{k=k_1}^{k_2} \binom{n}{k} p^k q^{n-k}. \qquad (4\text{-}1)$

Since the binomial coefficient grows quite rapidly with n, this sum is difficult to compute for large n. In this context, two approximations are extremely useful.

The Normal Approximation (DeMoivre–Laplace Theorem)

Suppose $n \to \infty$ with p held fixed. Then for k in the $\sqrt{npq}$ neighborhood of np, we can approximate

$\binom{n}{k} p^k q^{n-k} \simeq \frac{1}{\sqrt{2\pi npq}}\, e^{-(k-np)^2/2npq},$

and we have

$P(k_1 \le X \le k_2) \simeq \sum_{k=k_1}^{k_2} \frac{1}{\sqrt{2\pi npq}}\, e^{-(k-np)^2/2npq},$

where $q = 1 - p$.

If $k_1$ and $k_2$ are both within the $\sqrt{npq}$ neighborhood of np, we can approximate the sum by an integral:

$P(k_1 \le X \le k_2) \simeq \int_{x_1}^{x_2} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy,$

where

$x_1 = \frac{k_1 - np}{\sqrt{npq}}, \qquad x_2 = \frac{k_2 - np}{\sqrt{npq}}.$

We can express this formula in terms of the normalized integral

$\mathrm{erf}(x) = \frac{1}{\sqrt{2\pi}} \int_0^x e^{-y^2/2}\, dy,$

which has been tabulated extensively.

Example

A fair coin is tossed 5,000 times. Find the probability that the number of heads is between 2,475 and 2,525.

Solution

We need $P(2475 \le X \le 2525)$. Here n = 5,000 and p = q = 1/2, so that $np = 2500$ and $\sqrt{npq} = \sqrt{1250} \approx 35.36$. Since n is large we can use the normal approximation, and it is valid here because $k_1 = 2475$ and $k_2 = 2525$ are both within $\sqrt{npq}$ of np.

Example - continued

Here

$x_1 = \frac{2475 - 2500}{\sqrt{1250}} \approx -\frac{1}{\sqrt{2}}, \qquad x_2 = \frac{2525 - 2500}{\sqrt{1250}} \approx \frac{1}{\sqrt{2}}.$

Using the table,

$P(2475 \le X \le 2525) \simeq \mathrm{erf}(x_2) - \mathrm{erf}(x_1) = 2\,\mathrm{erf}\!\left(\tfrac{1}{\sqrt{2}}\right) \approx 0.52.$
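The coin-toss example can be checked numerically. The sketch below (not part of the original slides) evaluates the normal approximation with Python's `math.erf` and compares it with the exact binomial sum, computed in rational arithmetic because `0.5**5000` underflows a float:

```python
from fractions import Fraction
from math import comb, erf, sqrt

# Coin example: n = 5000 tosses, p = 1/2, P(2475 <= X <= 2525).
n = 5000
p = q = 0.5
mu = n * p
sigma = sqrt(n * p * q)

def phi(x):
    """Standard normal CDF, expressed through math.erf."""
    return 0.5 * (1 + erf(x / sqrt(2)))

x1 = (2475 - mu) / sigma
x2 = (2525 - mu) / sigma
approx = phi(x2) - phi(x1)          # DeMoivre-Laplace approximation, ~ 0.52

# Exact value with exact rationals (comb returns a big integer).
exact = float(sum(Fraction(comb(n, k), 2**n) for k in range(2475, 2526)))

print(approx, exact)
```

The two values agree to about two decimal places; the remaining gap is mostly the missing continuity correction, which the slides do not use.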

The Poisson Approximation

For large n, the Gaussian approximation of a binomial r.v. is valid only if p is fixed, i.e., only if $np \gg 1$ and $npq \gg 1$. What if np is small, or if it does not increase with n? For example, suppose $n \to \infty$ and $p \to 0$ such that $np = \lambda$ is a fixed number.

The Poisson Approximation

Consider random arrivals such as telephone calls over a line. Let n be the total number of calls in the interval (0, T); as $T \to \infty$ we have $n \to \infty$. Suppose Δ is a small subinterval of (0, T).

The Poisson Approximation

Let p be the probability that a single call (occurring somewhere in (0, T)) falls inside Δ; as $T \to \infty$, $p \to 0$, so the normal approximation is invalid here. Treat each call as a Bernoulli trial on the interval Δ:

(H) "success": a call lands inside Δ;
(T) "failure": a call lands outside Δ.

Then the probability of obtaining k calls (in any order) in an interval of duration Δ is

$P_n(k) = \binom{n}{k} p^k (1-p)^{n-k}.$

The Poisson Approximation

With $np = \lambda$ held fixed as $n \to \infty$ and $p \to 0$,

$\binom{n}{k} p^k (1-p)^{n-k} \to e^{-\lambda} \frac{\lambda^k}{k!}.$

Thus, the Poisson p.m.f. is

$P(X = k) = e^{-\lambda} \frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots$
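The quality of this limit is easy to see numerically. The following sketch (an illustration, not from the slides) compares the binomial p.m.f. with its Poisson limit for a large n, small p, and $\lambda = np$ held fixed:

```python
from math import comb, exp, factorial

def binom_pmf(n, p, k):
    # Exact binomial probability of k successes in n trials.
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    # Poisson limit: e^{-lam} lam^k / k!
    return exp(-lam) * lam**k / factorial(k)

n, lam = 5000, 3.0       # illustrative values: n large, np = lam fixed
p = lam / n
for k in range(6):
    print(k, binom_pmf(n, p, k), poisson_pmf(lam, k))
```

For these values the two p.m.f.s agree to three or four decimal places at every k shown.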

Example: Winning a Lottery

Suppose two million lottery tickets are issued, with 100 winning tickets among them.

a) If a person purchases 100 tickets, what is the probability of winning?

Solution

The probability of buying a winning ticket is

$p = \frac{100}{2{,}000{,}000} = 5 \times 10^{-5}.$

Winning a Lottery - continued

Let X be the number of winning tickets among the n = 100 purchased tickets. Since n is large and p is small, X has an approximate Poisson distribution with parameter

$\lambda = np = 100 \times 5 \times 10^{-5} = 0.005.$

So the probability of winning is

$P(X \ge 1) = 1 - P(X = 0) = 1 - e^{-\lambda} \approx \lambda = 0.005.$

Winning a Lottery - continued

b) How many tickets should one buy to be 95% confident of having a winning ticket?

Solution

We need $P(X \ge 1) = 1 - e^{-np} \ge 0.95$, i.e., $e^{-np} \le 0.05$, or $np \ge \ln 20 \approx 3$. This gives $n \ge 3/p = 60{,}000$. Thus one needs to buy about 60,000 tickets to be 95% confident of having a winning ticket!
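Both parts of the lottery example reduce to a couple of lines of arithmetic; a minimal sketch:

```python
from math import ceil, exp, log

p = 100 / 2_000_000          # probability that a single ticket wins

# (a) 100 tickets: lam = np, P(at least one win) = 1 - e^{-lam} ~ lam
lam = 100 * p
p_win = 1 - exp(-lam)

# (b) 95% confidence: 1 - e^{-np} >= 0.95  =>  np >= ln 20  =>  n >= ln(20)/p
n_needed = ceil(log(20) / p)

print(p_win, n_needed)
```

`p_win` is about 0.005 and `n_needed` is just under 60,000, matching the slide's "about 60,000 tickets."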

Example: Danger in Space Mission

A spacecraft has 100,000 components ($n = 10^5$). The probability p of any one component being defective is small. The mission will be in danger if five or more components become defective. Find the probability of such an event.

Solution

Since n is large and p is small, we use the Poisson approximation with parameter $\lambda = np$:

$P(X \ge 5) = 1 - P(X \le 4) = 1 - \sum_{k=0}^{4} e^{-\lambda} \frac{\lambda^k}{k!}.$
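The transcript omits the numerical defect probability, so the sketch below assumes $p = 10^{-5}$ (so that $\lambda = np = 1$) purely for illustration; the structure of the computation is the same for any small p:

```python
from math import exp, factorial

n = 100_000
p = 1e-5                     # ASSUMED value; the slide's number is not in the transcript
lam = n * p                  # Poisson parameter, = 1.0 under this assumption

# P(X >= 5) = 1 - sum_{k=0}^{4} e^{-lam} lam^k / k!
p_danger = 1 - sum(exp(-lam) * lam**k / factorial(k) for k in range(5))
print(p_danger)
```

Under this assumed λ = 1 the danger probability comes out to a few parts in a thousand.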

Conditional Probability Density Function

For an event B with $P(B) \ne 0$, the conditional distribution function of X given B is

$F_X(x \mid B) = P(X \le x \mid B) = \frac{P\{(X \le x) \cap B\}}{P(B)},$

and the conditional probability density function is its derivative,

$f_X(x \mid B) = \frac{dF_X(x \mid B)}{dx}.$

Further, since $F_X(-\infty \mid B) = 0$ and $F_X(+\infty \mid B) = 1$, for $x_1 < x_2$

$P(x_1 < X \le x_2 \mid B) = F_X(x_2 \mid B) - F_X(x_1 \mid B) = \int_{x_1}^{x_2} f_X(x \mid B)\, dx.$

Example

Toss a coin, with X(T) = 0 and X(H) = 1. Suppose $B = \{H\}$. Determine $F_X(x \mid B)$.

Solution

We need $P(X \le x \mid B)$ for all x. For $x < 0$, $\{X \le x\} = \emptyset$, so that $\{(X \le x) \cap B\} = \emptyset$ and $F_X(x \mid B) = 0$.

Example - continued

For $0 \le x < 1$, $\{X \le x\} = \{T\}$, so that $\{(X \le x) \cap B\} = \{T\} \cap \{H\} = \emptyset$ and $F_X(x \mid B) = 0$. For $x \ge 1$, $\{X \le x\}$ is the whole sample space, so that $\{(X \le x) \cap B\} = B$ and $F_X(x \mid B) = 1$.

Example

Given $F_X(x)$, suppose $B = \{X \le a\}$. Find $f_X(x \mid B)$.

Solution

We will first determine $F_X(x \mid B)$. For $x < a$,

$\{(X \le x) \cap (X \le a)\} = \{X \le x\},$

so that

$F_X(x \mid B) = \frac{P(X \le x)}{P(X \le a)} = \frac{F_X(x)}{F_X(a)}.$

Example - continued

For $x \ge a$, $\{(X \le x) \cap (X \le a)\} = \{X \le a\}$, so that $F_X(x \mid B) = 1$. Thus

$F_X(x \mid B) = \begin{cases} F_X(x)/F_X(a), & x < a, \\ 1, & x \ge a, \end{cases}$

and hence

$f_X(x \mid B) = \begin{cases} f_X(x)/F_X(a), & x < a, \\ 0, & \text{otherwise.} \end{cases}$

Example

Let B represent the event $\{a < X \le b\}$ with $F_X(b) > F_X(a)$. For a given $F_X(x)$, determine $F_X(x \mid B)$ and $f_X(x \mid B)$.

Solution

Example - continued

For $x < a$, we have $\{(X \le x) \cap (a < X \le b)\} = \emptyset$, and hence $F_X(x \mid B) = 0$.

For $a \le x < b$, we have $\{(X \le x) \cap (a < X \le b)\} = \{a < X \le x\}$, and hence

$F_X(x \mid B) = \frac{F_X(x) - F_X(a)}{F_X(b) - F_X(a)}.$

For $x \ge b$, we have $\{(X \le x) \cap (a < X \le b)\} = \{a < X \le b\}$, so that $F_X(x \mid B) = 1$. Thus

$f_X(x \mid B) = \begin{cases} \dfrac{f_X(x)}{F_X(b) - F_X(a)}, & a < x \le b, \\ 0, & \text{otherwise.} \end{cases}$
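The truncated-density formula above can be sanity-checked numerically. The sketch below picks an exponential r.v. (an illustrative choice, not from the slides) and verifies that $f_X(x \mid a < X \le b)$ integrates to 1 over $(a, b]$:

```python
from math import exp

lam, a, b = 1.0, 0.5, 2.0            # illustrative rate and interval

def f(x):
    # Exponential density f_X(x) = lam * e^{-lam x}, x >= 0.
    return lam * exp(-lam * x)

def F(x):
    # Exponential CDF F_X(x) = 1 - e^{-lam x}.
    return 1 - exp(-lam * x)

def f_cond(x):
    # Conditional density given B = {a < X <= b}: f(x) / (F(b) - F(a)) on (a, b].
    return f(x) / (F(b) - F(a)) if a < x <= b else 0.0

# Midpoint-rule check that the conditional density integrates to 1 over (a, b].
N = 100_000
h = (b - a) / N
total = sum(f_cond(a + (i + 0.5) * h) for i in range(N)) * h
print(total)
```

The same check works for any continuous $F_X$ with $F_X(b) > F_X(a)$.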

Conditional p.d.f & Bayes' Theorem

First, we extend the conditional probability results to random variables. We know that if $B_1, B_2, \ldots, B_n$ is a partition of S and A is an arbitrary event, then

$P(A) = \sum_{i=1}^{n} P(A \mid B_i)\, P(B_i).$

By setting $A = \{X \le x\}$ we obtain

$F_X(x) = \sum_{i=1}^{n} F_X(x \mid B_i)\, P(B_i), \qquad f_X(x) = \sum_{i=1}^{n} f_X(x \mid B_i)\, P(B_i).$

Conditional p.d.f & Bayes' Theorem

Using

$P(B \mid A) = \frac{P(A \mid B)\, P(B)}{P(A)}$

with $A = \{X \le x\}$, we obtain

$P(B \mid X \le x) = \frac{P(X \le x \mid B)\, P(B)}{P(X \le x)} = \frac{F_X(x \mid B)\, P(B)}{F_X(x)}.$

For the interval $(x_1, x_2]$,

$P(B \mid x_1 < X \le x_2) = \frac{\big(F_X(x_2 \mid B) - F_X(x_1 \mid B)\big)\, P(B)}{F_X(x_2) - F_X(x_1)}.$

Conditional p.d.f & Bayes' Theorem

Let $x_1 = x$ and $x_2 = x + \Delta x$, so that in the limit as $\Delta x \to 0$,

$P(B \mid X = x) = \frac{f_X(x \mid B)\, P(B)}{f_X(x)}.$

We also get (Total Probability Theorem)

$P(B) = \int_{-\infty}^{+\infty} P(B \mid X = x)\, f_X(x)\, dx.$

Bayes' Theorem (continuous version)

Using the total probability theorem in

$f_X(x \mid B) = \frac{P(B \mid X = x)\, f_X(x)}{P(B)},$

we get the desired result:

$f_X(x \mid B) = \frac{P(B \mid X = x)\, f_X(x)}{\int_{-\infty}^{+\infty} P(B \mid X = x)\, f_X(x)\, dx}.$

Example: Coin Tossing Problem Revisited

Let p be the probability of obtaining a head in a toss. For a given coin, a priori p can possess any value in (0, 1). In the absence of any additional information, take the a priori p.d.f $f_P(p)$ to be uniform on (0, 1). After tossing the coin n times, k heads are observed. How can we update $f_P(p)$ with this new information?

Solution

Let A = "k heads in n specific tosses." Since these tosses result in a specific sequence,

$P(A \mid P = p) = p^k (1-p)^{n-k},$

and using the Total Probability Theorem we get

$P(A) = \int_0^1 p^k (1-p)^{n-k}\, dp = \frac{k!\,(n-k)!}{(n+1)!}.$

Example - continued

The a posteriori p.d.f of P represents the updated information given the event A. Using Bayes' theorem,

$f_P(p \mid A) = \frac{P(A \mid P = p)\, f_P(p)}{P(A)} = \frac{(n+1)!}{k!\,(n-k)!}\, p^k (1-p)^{n-k}, \qquad 0 < p < 1. \qquad (1)$

This is a beta distribution. We can use this a posteriori p.d.f to make further predictions. For example, in the light of the above experiment, what can we say about the probability of a head occurring in the next, (n+1)th, toss?

Example - continued

Let B = "head occurring in the (n+1)th toss, given that k heads have occurred in the n previous tosses." Clearly $P(B \mid P = p) = p$. From the Total Probability Theorem,

$P(B) = \int_0^1 P(B \mid P = p)\, f_P(p \mid A)\, dp. \qquad (2)$

Using (1) in (2), we get

$P(B) = \int_0^1 p \cdot \frac{(n+1)!}{k!\,(n-k)!}\, p^k (1-p)^{n-k}\, dp = \frac{k+1}{n+2}.$

Thus, if n = 10 and k = 6, then $P(B) = 7/12 \approx 0.58$, which is more realistic compared to p = 0.5.
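This closed form, Laplace's rule of succession, is easy to verify numerically. The sketch below (an illustration, not from the slides) integrates $p$ against the beta posterior of (1) by the midpoint rule and compares with $(k+1)/(n+2)$:

```python
from math import comb

n, k = 10, 6                              # the slide's values: 6 heads in 10 tosses
c = (n + 1) * comb(n, k)                  # normalizing constant (n+1)!/(k!(n-k)!)

def posterior(p):
    # Beta(k+1, n-k+1) posterior density f_P(p | A) from equation (1).
    return c * p**k * (1 - p)**(n - k)

# Midpoint-rule evaluation of P(B) = integral_0^1 p * f_P(p | A) dp.
N = 200_000
h = 1.0 / N
p_head = 0.0
for i in range(N):
    p = (i + 0.5) * h
    p_head += p * posterior(p) * h

print(p_head, (k + 1) / (n + 2))
```

Both numbers agree with $7/12 \approx 0.5833$ to four decimal places.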