The Binomial random variable
A random variable that has the following pmf is said to be a binomial random variable with parameters n, p:
p(i) = C(n, i) p^i (1 − p)^(n − i), i = 0, 1, ..., n
Example: A series of n independent trials, each having a probability p of being a success and 1 − p of being a failure, is performed. Let X be the number of successes in the n trials.
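As a quick illustration, the binomial pmf can be computed directly from the formula above (a minimal Python sketch; the values n = 10 and p = 0.5 are made up for this example):

```python
from math import comb

def binomial_pmf(i, n, p):
    """P(X = i) for a binomial random variable with parameters n, p."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

# Illustrative values: n = 10 trials, success probability p = 0.5
pmf = [binomial_pmf(i, 10, 0.5) for i in range(11)]
print(sum(pmf))  # the pmf sums to 1 over i = 0, ..., n
```

Summing the pmf over all possible values i = 0, ..., n is a useful sanity check: the total must be 1.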
The Poisson random variable
A random variable that has the following pmf is said to be a Poisson random variable with parameter λ:
p(i) = e^(−λ) λ^i / i!, i = 0, 1, 2, ...
Example: The number of cars sold per day by a dealer is Poisson with parameter λ = 2. What is the probability of selling no cars today? What is the probability of selling 2?
Solution: P(X = 0) = e^(−2) ≈ 0.135; P(X = 2) = e^(−2)(2^2 / 2!) ≈ 0.271
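The car-dealer example can be checked numerically with a short Python sketch of the Poisson pmf:

```python
from math import exp, factorial

def poisson_pmf(i, lam):
    """P(X = i) for a Poisson random variable with parameter lam."""
    return exp(-lam) * lam**i / factorial(i)

# The example above: lambda = 2 cars sold per day
print(poisson_pmf(0, 2))  # P(X = 0) = e^-2, about 0.135
print(poisson_pmf(2, 2))  # P(X = 2) = e^-2 * 2^2/2!, about 0.271
```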
Continuous random variables
We say that X is a continuous random variable if there exists a non-negative function f(x), defined for all real values of x, such that for any set B of real numbers,
P(X ∈ B) = ∫_B f(x) dx
The function f(x) is called the probability density function (pdf) of the random variable X.
Properties
1. f(x) ≥ 0 and ∫ f(x) dx over (−∞, ∞) = 1
2. P(a ≤ X ≤ b) = ∫_a^b f(x) dx
3. P(X = a) = 0 for any single point a
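The normalization property can be verified numerically with a Riemann sum (a minimal sketch; the pdf f(x) = 2x on (0, 1) is an illustrative density chosen just for this check):

```python
# Riemann-sum check that an example pdf integrates to 1.
# f(x) = 2x on (0, 1) is a valid pdf: non-negative, and its integral is 1.
def f(x):
    return 2 * x if 0 < x < 1 else 0.0

n = 100_000
dx = 1 / n
# Midpoint rule over (0, 1)
total = sum(f((k + 0.5) * dx) * dx for k in range(n))
print(total)  # close to 1, as required by the normalization property
```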
The Uniform random variable
A random variable that has the following pdf is said to be a uniform random variable over the interval (a, b):
f(x) = 1/(b − a) if a < x < b; 0 otherwise
The Exponential random variable
A random variable that has the following pdf is said to be an exponential random variable with parameter λ:
f(x) = λ e^(−λx) if x ≥ 0; 0 if x < 0
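For the exponential pdf, the tail probability has the closed form P(X > t) = e^(−λt); a numerical integration of the pdf agrees with it (a sketch with illustrative values λ = 1.5 and t = 2.0):

```python
from math import exp

def exponential_pdf(x, lam):
    """pdf of an exponential random variable with parameter lam."""
    return lam * exp(-lam * x) if x >= 0 else 0.0

lam, t = 1.5, 2.0  # illustrative parameter and cutoff
n = 100_000
dx = t / n
# P(X <= t) by a midpoint Riemann sum of the pdf over [0, t]
p_le_t = sum(exponential_pdf((k + 0.5) * dx, lam) * dx for k in range(n))
print(1 - p_le_t)       # close to e^(-lam * t)
print(exp(-lam * t))    # closed-form tail probability
```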
The Gamma random variable
A random variable that has the following pdf is said to be a gamma random variable with parameters α, λ:
f(x) = λ e^(−λx) (λx)^(α−1) / Γ(α) if x ≥ 0; 0 if x < 0
The Normal random variable
A random variable that has the following pdf is said to be a normal random variable with parameters μ, σ^2:
f(x) = (1 / (σ √(2π))) e^(−(x − μ)^2 / (2σ^2)), −∞ < x < ∞
Note: The distribution with parameters μ = 0 and σ = 1 is called the standard normal distribution.
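The normal pdf translates directly into code (a minimal sketch of the formula above):

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    """pdf of a normal random variable with parameters mu and sigma^2."""
    return exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

# The standard normal (mu = 0, sigma = 1) peaks at x = 0,
# where the density equals 1/sqrt(2*pi), about 0.3989
print(normal_pdf(0, 0, 1))
```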
Expectation of a random variable
If X is a discrete random variable with pmf p(x), then the expected value of X is defined by
E[X] = Σ_x x p(x)
Example: p(1) = 0.2, p(3) = 0.3, p(5) = 0.2, p(7) = 0.3
E[X] = 0.2(1) + 0.3(3) + 0.2(5) + 0.3(7) = 4.2
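The example above can be reproduced with a one-line weighted sum:

```python
# The example pmf: p(1)=0.2, p(3)=0.3, p(5)=0.2, p(7)=0.3
pmf = {1: 0.2, 3: 0.3, 5: 0.2, 7: 0.3}
ex = sum(x * p for x, p in pmf.items())  # E[X] = sum of x * p(x)
print(ex)  # matches E[X] = 4.2
```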
If X is a continuous random variable with pdf f(x), then the expected value of X is defined by
E[X] = ∫ x f(x) dx over (−∞, ∞)
Expectation of a Bernoulli random variable: E[X] = 0(1 − p) + 1(p) = p
Expectation of a geometric random variable: E[X] = 1/p
Expectation of a binomial random variable: E[X] = np
Expectation of a Poisson random variable: E[X] = λ
Expectation of a uniform random variable: E[X] = (a + b)/2
Expectation of a normal random variable: E[X] = μ
Expectation of an exponential random variable: E[X] = 1/λ
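The uniform mean can be checked by numerically integrating x f(x) over (a, b) (a sketch with illustrative endpoints a = 1, b = 5, so (a + b)/2 = 3):

```python
# Riemann-sum check of E[X] = (a + b)/2 for X uniform over (a, b)
a, b = 1.0, 5.0  # illustrative interval
n = 100_000
dx = (b - a) / n
# E[X] = integral of x * (1/(b - a)) over (a, b), by the midpoint rule
mean = sum(((a + (k + 0.5) * dx) / (b - a)) * dx for k in range(n))
print(mean)  # close to (a + b)/2 = 3.0
```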
Expectation of a function of a random variable
(1) If X is a discrete random variable with pmf p(x), then for any real-valued function g,
E[g(X)] = Σ_x g(x) p(x)
(2) If X is a continuous random variable with pdf f(x), then for any real-valued function g,
E[g(X)] = ∫ g(x) f(x) dx over (−∞, ∞)
Note: with Y = g(X), P(Y = g(x)) = P(X = x)
If a and b are constants, then E[aX + b] = aE[X] + b.
The expected value E[X^n] is called the nth moment of the random variable X.
The expected value E[(X − E[X])^2] is called the variance of the random variable X and is denoted by Var(X):
Var(X) = E[X^2] − (E[X])^2
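The two forms of the variance can be compared on the earlier example pmf (p(1)=0.2, p(3)=0.3, p(5)=0.2, p(7)=0.3):

```python
pmf = {1: 0.2, 3: 0.3, 5: 0.2, 7: 0.3}
ex = sum(x * p for x, p in pmf.items())               # E[X] = 4.2
ex2 = sum(x**2 * p for x, p in pmf.items())           # E[X^2], the 2nd moment
var_def = sum((x - ex) ** 2 * p for x, p in pmf.items())  # E[(X - E[X])^2]
# The definition and the shortcut E[X^2] - (E[X])^2 agree
print(var_def, ex2 - ex**2)
```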
Jointly distributed random variables
Let X and Y be two random variables. The joint cumulative probability distribution of X and Y is defined as
F(x, y) = P(X ≤ x, Y ≤ y)
If X and Y are both discrete random variables, the joint pmf of X and Y is defined as
p(x, y) = P(X = x, Y = y)
X and Y are said to be jointly continuous if there exists a function f(x, y) such that for any set C of pairs of real numbers,
P((X, Y) ∈ C) = ∫∫_C f(x, y) dx dy
The function f(x, y) is called the joint pdf of X and Y.
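For the discrete case, the joint cdf F(x, y) is obtained by summing the joint pmf (a sketch; the joint pmf on {0, 1} × {0, 1} below is made up for illustration):

```python
# Illustrative joint pmf of (X, Y) on {0, 1} x {0, 1}
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def F(x, y):
    """Joint cdf F(x, y) = P(X <= x, Y <= y), summed from the joint pmf."""
    return sum(p for (i, j), p in joint.items() if i <= x and j <= y)

print(round(F(0, 1), 10))  # P(X <= 0, Y <= 1) = 0.1 + 0.2 = 0.3
print(round(F(1, 1), 10))  # the whole sample space: 1.0
```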