Chapter 2: Random Variables (presentation transcript)

Slide 1: Chapter 2, Random Variables. Discrete: Bernoulli, Binomial, Geometric, Poisson. Continuous: Uniform, Exponential, Gamma, Normal. Expectation and Variance, Joint Distributions, Moment Generating Functions, Limit Theorems.

Slide 2: Definition of a random variable. A random variable is a function that assigns a number to each outcome in a sample space. If the set of all possible values of a random variable X is countable, then X is discrete, and the distribution of X is described by a probability mass function p(x) = P{X = x}. Otherwise, X is a continuous random variable if there is a nonnegative function f(x), defined for all real numbers x, such that for any set B, P{X ∈ B} = ∫_B f(x) dx; f(x) is called the probability density function of X.

Slide 3: pmf's and cdf's. The probability mass function (pmf) of a discrete random variable is positive for at most a countable number of values of X: x_1, x_2, …, with ∑_i p(x_i) = 1. The cumulative distribution function (cdf) of any random variable X is F(x) = P{X ≤ x}; F(x) is a nondecreasing function with F(x) → 0 as x → −∞ and F(x) → 1 as x → ∞. For a discrete random variable X, F(x) = ∑_{x_i ≤ x} p(x_i), a step function.
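To make the pmf-to-cdf relationship concrete, here is a minimal sketch in plain Python (the support points and probabilities are arbitrary example values, not from the slides) that accumulates a pmf into a step-function cdf and checks that the probabilities sum to 1.

```python
# Minimal sketch: the cdf of a discrete random variable is a running sum
# of its pmf. Support points and probabilities are arbitrary example values.
from itertools import accumulate

xs = [1, 2, 3, 4]                    # support x_1, x_2, ...
pmf = [0.125, 0.25, 0.375, 0.25]     # p(x_i); must sum to 1
assert abs(sum(pmf) - 1.0) < 1e-12

print(list(accumulate(pmf)))         # F at the support points: [0.125, 0.375, 0.75, 1.0]

def F(x):
    """cdf F(x) = P{X <= x}: a nondecreasing step function."""
    return sum(p_i for x_i, p_i in zip(xs, pmf) if x_i <= x)

print(F(2.5), F(10))                 # 0.375 (= p(1) + p(2)) and 1.0
```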

Slide 4: Bernoulli random variable. An experiment has two possible outcomes, called "success" and "failure"; this is sometimes called a Bernoulli trial. The probability of success is p. Let X = 1 if success occurs and X = 0 if failure occurs. Then p(0) = P{X = 0} = 1 − p and p(1) = P{X = 1} = p, and X is a Bernoulli random variable with parameter p.

Slide 5: Binomial random variable. A sequence of n independent Bernoulli trials is performed, where the probability of success on each trial is p, and X is the number of successes. Then for i = 0, 1, …, n, p(i) = P{X = i} = C(n, i) p^i (1 − p)^(n−i), where C(n, i) = n!/(i!(n − i)!); X is a binomial random variable with parameters n and p.
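As a concrete check of this formula, the sketch below (plain Python; n = 10 and p = 0.3 are arbitrary example values) evaluates the pmf with math.comb and confirms that the probabilities sum to 1 and that the mean works out to np, anticipating slide 14.

```python
# Sketch: binomial pmf p(i) = C(n, i) * p**i * (1-p)**(n-i) for example n, p.
from math import comb

n, p = 10, 0.3   # arbitrary example parameters

pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]

print(sum(pmf))                                  # ~1.0: probabilities sum to 1
print(sum(i * pi for i, pi in enumerate(pmf)))   # ~3.0 = n*p, the mean
print(pmf[3])                                    # P{X = 3} for these parameters
```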

Slide 6: Geometric random variable. A sequence of independent Bernoulli trials is performed with p = P(success), and X is the number of trials up to and including the first success. Then X may equal 1, 2, …, with p(i) = P{X = i} = (1 − p)^(i−1) p. X is named after the geometric series ∑_{k=0}^∞ x^k = 1/(1 − x) for |x| < 1; use this to verify that ∑_{i=1}^∞ p(i) = 1.
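The series facts can also be checked numerically; this sketch (plain Python, p = 0.25 chosen arbitrarily) truncates the infinite sums at a large cutoff and recovers ∑ p(i) ≈ 1 and, anticipating slide 14, E[X] = ∑ i·p(i) ≈ 1/p.

```python
# Sketch: geometric pmf p(i) = (1-p)**(i-1) * p for i = 1, 2, ...
p = 0.25   # arbitrary example success probability
N = 1000   # truncation point for the infinite sums

pmf = [(1 - p) ** (i - 1) * p for i in range(1, N + 1)]

print(sum(pmf))                                           # ~1.0 (geometric series)
print(sum(i * pi for i, pi in enumerate(pmf, start=1)))   # ~4.0 = 1/p
```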

Slide 7: Poisson random variable. X is a Poisson random variable with parameter λ > 0 if p(i) = P{X = i} = e^(−λ) λ^i / i! for i = 0, 1, 2, …. Note: X can represent the number of "rare events" that occur during an interval of specified length. A Poisson random variable can also approximate a binomial random variable with large n and small p if λ = np: split the interval into n subintervals, and label the occurrence of an event during a subinterval as a "success".
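A small numerical illustration of the binomial-to-Poisson approximation (plain Python; n = 1000 and p = 0.003 are arbitrary example values giving λ = np = 3):

```python
# Sketch: Poisson(lambda = n*p) as an approximation to Binomial(n, p)
# for large n and small p. Example parameters chosen arbitrarily.
from math import comb, exp, factorial

n, p = 1000, 0.003
lam = n * p   # lambda = 3.0

for i in range(6):
    binom = comb(n, i) * p**i * (1 - p)**(n - i)
    poisson = exp(-lam) * lam**i / factorial(i)
    print(i, round(binom, 5), round(poisson, 5))   # the two columns nearly agree
```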

Slide 8: Continuous random variables. A probability density function (pdf) must satisfy f(x) ≥ 0 for all x and ∫_{−∞}^{∞} f(x) dx = 1. The cdf is F(x) = P{X ≤ x} = ∫_{−∞}^{x} f(t) dt. The approximation P{a − ε/2 ≤ X ≤ a + ε/2} ≈ ε f(a) for small ε means that f(a) measures how likely X is to be near a.
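A quick numerical sanity check of both statements, using f(x) = 2x on (0, 1) as an arbitrary example density and a crude midpoint Riemann sum (plain Python):

```python
# Sketch: for the example density f(x) = 2x on (0, 1), check that the pdf
# integrates to 1 and that P{X near a} ~ eps * f(a).
def f(x):
    return 2 * x if 0 < x < 1 else 0.0

def integrate(fn, lo, hi, steps=100_000):
    """Crude midpoint Riemann sum, good enough for a sanity check."""
    h = (hi - lo) / steps
    return sum(fn(lo + (k + 0.5) * h) for k in range(steps)) * h

print(integrate(f, 0, 1))              # ~1.0: total probability

a, eps = 0.6, 0.01
prob_near_a = integrate(f, a - eps / 2, a + eps / 2)
print(prob_near_a, eps * f(a))         # both ~0.012: P{X near a} ~ eps*f(a)
```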

Slide 9: Uniform random variable. X is uniformly distributed over an interval (a, b) if its pdf is f(x) = 1/(b − a) for a < x < b and 0 otherwise. Then its cdf is F(x) = 0 for x ≤ a, F(x) = (x − a)/(b − a) for a < x < b, and F(x) = 1 for x ≥ b. Intuitively, all we know about X is that it takes a value between a and b.

Slide 10: Exponential random variable. X has an exponential distribution with parameter λ > 0 if its pdf is f(x) = λ e^(−λx) for x ≥ 0 and 0 otherwise. Then its cdf is F(x) = 1 − e^(−λx) for x ≥ 0. This distribution has very special characteristics that we will use often!
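Below is a minimal sketch (plain Python, λ = 2 chosen arbitrarily) of the cdf F(x) = 1 − e^(−λx) and of the memoryless property P{X > s + t | X > s} = P{X > t}, which is presumably the special characteristic the slide alludes to.

```python
# Sketch: exponential(lambda) cdf and a check of the memoryless property.
from math import exp

lam = 2.0   # arbitrary example rate

def F(x):     # cdf F(x) = 1 - exp(-lambda*x) for x >= 0
    return 1 - exp(-lam * x) if x >= 0 else 0.0

def tail(x):  # P{X > x} = 1 - F(x) = exp(-lambda*x)
    return 1 - F(x)

s, t = 0.5, 1.3
print(tail(s + t) / tail(s))   # P{X > s+t | X > s}
print(tail(t))                 # P{X > t} -- the same number
```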

Slide 11: Gamma random variable. X has a gamma distribution with parameters λ > 0 and α > 0 if its pdf is f(x) = λ e^(−λx) (λx)^(α−1) / Γ(α) for x ≥ 0 and 0 otherwise. It gets its name from the gamma function Γ(α) = ∫_0^∞ e^(−x) x^(α−1) dx. If α is an integer, then Γ(α) = (α − 1)!.
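The factorial identity is easy to confirm with Python's standard library (math.gamma implements the gamma function):

```python
# Sketch: Gamma(alpha) = (alpha - 1)! for integer alpha, checked with
# math.gamma and math.factorial.
from math import gamma, factorial

for alpha in range(1, 7):
    print(alpha, gamma(alpha), factorial(alpha - 1))   # the last two columns match
```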

Slide 12: Normal random variable. X has a normal distribution with parameters μ and σ² if its pdf is f(x) = (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)) for −∞ < x < ∞. This is the classic "bell-shaped" distribution widely used in statistics. It has the useful characteristic that a linear function Y = aX + b is normally distributed with parameters aμ + b and (aσ)². In particular, Z = (X − μ)/σ has the standard normal distribution with parameters 0 and 1.
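A simulation sketch of the linear-transformation property (Python standard library only; μ, σ, a, b are arbitrary example values):

```python
# Sketch: if X ~ Normal(mu, sigma^2), then Y = a*X + b has mean a*mu + b and
# standard deviation |a|*sigma, and Z = (X - mu)/sigma is standard normal.
import random
import statistics

random.seed(1)
mu, sigma, a, b = 5.0, 2.0, 3.0, -1.0
xs = [random.gauss(mu, sigma) for _ in range(200_000)]

ys = [a * x + b for x in xs]
zs = [(x - mu) / sigma for x in xs]

print(statistics.mean(ys), statistics.pstdev(ys))   # ~ a*mu + b = 14 and |a|*sigma = 6
print(statistics.mean(zs), statistics.pstdev(zs))   # ~ 0 and 1 (standard normal)
```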

Slide 13: Expectation. The expected value (mean) of a random variable is E[X] = ∑_i x_i p(x_i) if X is discrete and E[X] = ∫_{−∞}^{∞} x f(x) dx if X is continuous. It is also called the first moment, like the moment of inertia of the probability distribution. If the experiment is repeated and the random variable observed many times, E[X] represents the long-run average value of the r.v.
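To illustrate the long-run-average interpretation, here is a tiny simulation (plain Python; a fair six-sided die is an arbitrary example) comparing E[X] = ∑ x·p(x) with the sample mean of many repetitions.

```python
# Sketch: expectation as a long-run average, using a fair die as the example.
import random

random.seed(0)
exact = sum(x * (1 / 6) for x in range(1, 7))            # E[X] = 3.5
rolls = [random.randint(1, 6) for _ in range(100_000)]
print(exact, sum(rolls) / len(rolls))                    # 3.5 vs ~3.5
```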

Slide 14: Expectations of discrete random variables. Bernoulli: E[X] = 1(p) + 0(1 − p) = p. Binomial: E[X] = np. Geometric: E[X] = 1/p (by a trick; see text). Poisson: E[X] = λ; the parameter is the expected (average) number of "rare events" per interval, and the random variable is the number of events in a particular interval chosen at random.

Slide 15: Expectations of continuous random variables. Uniform: E[X] = (a + b)/2. Exponential: E[X] = 1/λ. Gamma: E[X] = α/λ. Normal: E[X] = μ; the first parameter is the expected value, which fits with the density being symmetric about x = μ.

Slide 16: Expectation of a function of a r.v. First way: if X is a r.v., then Y = g(X) is a r.v.; find the distribution of Y, then find E[Y] from it. Second way: if X is a random variable, then for any real-valued function g, E[g(X)] = ∑_i g(x_i) p(x_i) if X is discrete and E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx if X is continuous. If g(X) is a linear function of X, then E[aX + b] = aE[X] + b.
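The sketch below works one arbitrary example both ways: X takes the values −1, 0, 1, 2 with probability 1/4 each and g(x) = x², so the "first way" builds the pmf of Y = g(X) while the "second way" applies E[g(X)] = ∑ g(x)p(x) directly.

```python
# Sketch: E[g(X)] computed two ways for an arbitrary example pmf.
from collections import defaultdict

pmf = {-1: 0.25, 0: 0.25, 1: 0.25, 2: 0.25}

def g(x):
    return x ** 2

# First way: find the distribution of Y = g(X), then take E[Y].
pmf_y = defaultdict(float)
for x, px in pmf.items():
    pmf_y[g(x)] += px                     # Y = 1 gets probability 1/4 + 1/4
e_first = sum(y * py for y, py in pmf_y.items())

# Second way: E[g(X)] = sum of g(x) * p(x) directly.
e_second = sum(g(x) * px for x, px in pmf.items())

print(e_first, e_second)                  # both 1.5
```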

Slide 17: Higher-order moments. The nth moment of X is E[X^n] = ∑_i x_i^n p(x_i) in the discrete case and ∫_{−∞}^{∞} x^n f(x) dx in the continuous case. The variance is Var(X) = E[(X − E[X])²]. It is sometimes easier to calculate as Var(X) = E[X²] − (E[X])².
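Both expressions give the same number, as this sketch confirms for the same example pmf used above (X uniform on {−1, 0, 1, 2}):

```python
# Sketch: Var(X) from the definition and from the shortcut formula.
pmf = {-1: 0.25, 0: 0.25, 1: 0.25, 2: 0.25}

mean = sum(x * p for x, p in pmf.items())                      # E[X] = 0.5
var_def = sum((x - mean) ** 2 * p for x, p in pmf.items())     # E[(X - E[X])^2]
var_short = sum(x**2 * p for x, p in pmf.items()) - mean**2    # E[X^2] - (E[X])^2
print(var_def, var_short)                                      # both 1.25
```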

Slide 18: Variances of discrete random variables. Bernoulli: E[X²] = 1²(p) + 0²(1 − p) = p, so Var(X) = p − p² = p(1 − p). Binomial: Var(X) = np(1 − p). Geometric: Var(X) = (1 − p)/p² (by a trick similar to the one for E[X]). Poisson: Var(X) = λ; the parameter is also the variance of the number of "rare events" per interval!

Slide 19: Variances of continuous random variables. Uniform: Var(X) = (b − a)²/12. Exponential: Var(X) = 1/λ². Gamma: Var(X) = α/λ². Normal: Var(X) = σ²; the second parameter is the variance.
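Because these constants are easy to misremember, here is a crude numerical cross-check (plain Python Riemann sums; a = 2, b = 7, and λ = 0.5 are arbitrary example values) of the uniform and exponential entries.

```python
# Sketch: numeric check of Var(X) for Uniform(a, b) and Exponential(lambda).
from math import exp

def integrate(fn, lo, hi, steps=200_000):
    """Crude midpoint Riemann sum."""
    h = (hi - lo) / steps
    return sum(fn(lo + (k + 0.5) * h) for k in range(steps)) * h

a, b = 2.0, 7.0
f_unif = lambda x: 1.0 / (b - a)
m = integrate(lambda x: x * f_unif(x), a, b)                    # (a + b)/2 = 4.5
v = integrate(lambda x: (x - m) ** 2 * f_unif(x), a, b)
print(v, (b - a) ** 2 / 12)                                     # both ~2.0833

lam = 0.5
f_exp = lambda x: lam * exp(-lam * x)
m = integrate(lambda x: x * f_exp(x), 0, 200)                   # ~1/lambda = 2
v = integrate(lambda x: (x - m) ** 2 * f_exp(x), 0, 200)
print(v, 1 / lam ** 2)                                          # both ~4.0
```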

Slide 20: Jointly distributed random variables. See text pages 46-47 for definitions of the joint cdf, pmf, pdf, and marginal distributions. The main result we will use is linearity of expectation, E[X_1 + ⋯ + X_n] = E[X_1] + ⋯ + E[X_n], which holds whether or not the X_i are independent. It is especially useful with indicator r.v.'s: I_A = 1 if A occurs and 0 otherwise, so that E[I_A] = P(A).
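As an illustration of how indicators pair with linearity of expectation, this sketch (plain Python, arbitrary parameters) writes a Binomial(n, p) count as a sum of n indicator variables and recovers E[X] = np by simulation.

```python
# Sketch: X = I_1 + ... + I_n with I_k = 1 if trial k succeeds, so
# E[X] = sum of E[I_k] = n*p. Checked by simulation.
import random

random.seed(2)
n, p, reps = 20, 0.3, 50_000   # arbitrary example parameters

total = 0
for _ in range(reps):
    indicators = [1 if random.random() < p else 0 for _ in range(n)]
    total += sum(indicators)   # X = I_1 + ... + I_n

print(total / reps, n * p)     # sample mean ~ 6.0 = n*p
```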

Slide 21: Independent random variables. X and Y are independent if P{X ≤ x, Y ≤ y} = P{X ≤ x} P{Y ≤ y} for all x and y. This implies that the joint pmf or pdf factors into the marginals: p(x, y) = p_X(x) p_Y(y) or f(x, y) = f_X(x) f_Y(y). Also, if X and Y are independent, then for any functions h and g, E[g(X) h(Y)] = E[g(X)] E[h(Y)].
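A simulation sketch of the last fact, with arbitrary choices X, Y independent Uniform(0, 1), g(x) = x², and h(y) = 1 + y, so both sides should be near E[X²]·E[1 + Y] = (1/3)(3/2) = 0.5:

```python
# Sketch: for independent X and Y, E[g(X)h(Y)] = E[g(X)] * E[h(Y)].
import random

random.seed(3)
N = 200_000
xs = [random.random() for _ in range(N)]
ys = [random.random() for _ in range(N)]   # generated independently of xs

def g(x):
    return x ** 2

def h(y):
    return 1 + y

lhs = sum(g(x) * h(y) for x, y in zip(xs, ys)) / N
rhs = (sum(map(g, xs)) / N) * (sum(map(h, ys)) / N)
print(lhs, rhs)   # both ~0.5
```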

Slide 22: Covariance. The covariance of X and Y is Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y]. If X and Y are independent, then Cov(X, Y) = 0. Properties: Cov(X, X) = Var(X), Cov(X, Y) = Cov(Y, X), Cov(cX, Y) = c Cov(X, Y), and Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z).
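The shortcut formula in action on an arbitrary small joint pmf (plain Python; joint[(x, y)] = P{X = x, Y = y}):

```python
# Sketch: Cov(X, Y) = E[XY] - E[X]E[Y] for an arbitrary example joint pmf.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

ex  = sum(x * p for (x, y), p in joint.items())       # E[X]  = 0.5
ey  = sum(y * p for (x, y), p in joint.items())       # E[Y]  = 0.5
exy = sum(x * y * p for (x, y), p in joint.items())   # E[XY] = 0.4

print(exy - ex * ey)   # Cov(X, Y) = 0.15 > 0: these X and Y are not independent
```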

Slide 23: Variance of a sum of r.v.'s. In general, Var(X_1 + ⋯ + X_n) = ∑_i Var(X_i) + 2 ∑_{i<j} Cov(X_i, X_j). If X_1, X_2, …, X_n are independent, the covariance terms vanish and Var(X_1 + ⋯ + X_n) = Var(X_1) + ⋯ + Var(X_n).
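Continuing with the same arbitrary joint pmf as in the covariance sketch, this check confirms Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y); the covariance term matters here because X and Y are dependent.

```python
# Sketch: variance of a sum with a covariance term, on the example joint pmf.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def e(fn):
    """Expectation of fn(x, y) under the joint pmf."""
    return sum(fn(x, y) * p for (x, y), p in joint.items())

var_x = e(lambda x, y: x**2) - e(lambda x, y: x) ** 2                    # 0.25
var_y = e(lambda x, y: y**2) - e(lambda x, y: y) ** 2                    # 0.25
cov   = e(lambda x, y: x * y) - e(lambda x, y: x) * e(lambda x, y: y)    # 0.15
var_sum = e(lambda x, y: (x + y) ** 2) - e(lambda x, y: x + y) ** 2

print(var_sum, var_x + var_y + 2 * cov)   # both 0.8
```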

Slide 24: Moment generating function. The moment generating function of a r.v. X is φ(t) = E[e^(tX)]. Its name comes from the fact that its derivatives at t = 0 generate the moments: φ^(n)(0) = E[X^n]. Also, if X and Y are independent, then φ_{X+Y}(t) = φ_X(t) φ_Y(t). And there is a one-to-one correspondence between the m.g.f. and the distribution function of a r.v.; this helps to identify distributions with the reproductive property.
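A short symbolic sketch with sympy (assuming sympy is available; the m.g.f. λ/(λ − t) of an exponential(λ) r.v., valid for t < λ, is a standard result used here as the example) recovering the first two moments by differentiating at t = 0.

```python
# Sketch: moments from the m.g.f. phi(t) = lambda/(lambda - t) of an
# exponential(lambda) random variable, via symbolic differentiation.
import sympy as sp

t = sp.symbols("t")
lam = sp.symbols("lambda", positive=True)
phi = lam / (lam - t)                  # m.g.f. of exponential(lambda), t < lambda

m1 = sp.diff(phi, t, 1).subs(t, 0)     # E[X]
m2 = sp.diff(phi, t, 2).subs(t, 0)     # E[X^2]
print(sp.simplify(m1))                 # 1/lambda
print(sp.simplify(m2))                 # 2/lambda**2
print(sp.simplify(m2 - m1**2))         # Var(X) = 1/lambda**2
```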

