
Discrete Random Variables and Probability Distributions


1 Discrete Random Variables and Probability Distributions
Chapter 5 Discrete Random Variables and Probability Distributions

2 Random Variables A random variable is a variable that takes on numerical values determined by the outcome of a random experiment.

3 Discrete Random Variables
A random variable is discrete if it can take on no more than a countable number of values.

4 Discrete Random Variables (Examples)
The number of defective items in a sample of twenty items taken from a large shipment. The number of customers arriving at a check-out counter in an hour. The number of errors detected in a corporation’s accounts. The number of claims on a medical insurance policy in a particular year.

5 Continuous Random Variables
A random variable is continuous if it can take any value in an interval.

6 Continuous Random Variables (Examples)
The income in a year for a family. The amount of oil imported into the U.S. in a particular month. The change in the price of a share of IBM common stock in a month. The time that elapses between the installation of a new computer and its failure. The percentage of impurity in a batch of chemicals.

7 Discrete Probability Distributions
The probability distribution function, P(x), of a discrete random variable X expresses the probability that X takes the value x, as a function of x. That is, P(x) = P(X = x).

8 Discrete Probability Distributions (Example 5.1)
Graph the probability distribution function for the roll of a single six-sided die. Figure 5.1 plots P(x) against x: each of the values x = 1, 2, 3, 4, 5, 6 has probability P(x) = 1/6.

9 Required Properties of Probability Distribution Functions of Discrete Random Variables
Let X be a discrete random variable with probability distribution function P(x). Then: P(x) ≥ 0 for any value of x. The individual probabilities sum to 1; that is, Σ_x P(x) = 1, where the notation Σ_x indicates summation over all possible values x.

10 Cumulative Probability Function
The cumulative probability function, F(x0), of a random variable X expresses the probability that X does not exceed the value x0, as a function of x0. That is, F(x0) = P(X ≤ x0), where the function is evaluated at all values x0.

11 Derived Relationship Between Probability Function and Cumulative Probability Function
Let X be a random variable with probability function P(x) and cumulative probability function F(x0). Then it can be shown that F(x0) = Σ_{x ≤ x0} P(x), where the notation implies that summation is over all possible values x that are less than or equal to x0.
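
A minimal Python sketch of this relationship, using the fair six-sided die of Example 5.1 as the illustrating distribution (exact fractions are used so the cumulative probabilities come out exactly):

```python
from fractions import Fraction

# Cumulative probability function: F(x0) = sum of P(x) over all x <= x0.
# Illustrated with the fair six-sided die of Example 5.1, where P(x) = 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def cdf(x0):
    """Return F(x0) = P(X <= x0) for the die."""
    return sum(p for x, p in pmf.items() if x <= x0)

print(cdf(3))  # 1/2  (three of the six equally likely values are <= 3)
print(cdf(6))  # 1    (F reaches 1 at the largest possible value)
```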

12 Derived Properties of Cumulative Probability Functions for Discrete Random Variables
Let X be a discrete random variable with a cumulative probability function, F(x0). Then we can show that: 0 ≤ F(x0) ≤ 1 for every number x0. If x0 and x1 are two numbers with x0 < x1, then F(x0) ≤ F(x1).

13 Expected Value The expected value, E(X), of a discrete random variable X is defined as E(X) = Σ_x x P(x), where the notation indicates that summation extends over all possible values x. The expected value of a random variable is called its mean and is denoted μ_X.
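
A short Python sketch of the definition, again using the fair die of Example 5.1 as the assumed example:

```python
from fractions import Fraction

# Expected value of a discrete random variable: E(X) = sum over x of x * P(x).
# Illustrated with the fair six-sided die of Example 5.1, where P(x) = 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())
print(mean)         # 7/2, i.e. E(X) = 3.5
print(float(mean))  # 3.5
```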

14 Expected Value: Functions of Random Variables
Let X be a discrete random variable with probability function P(x) and let g(X) be some function of X. Then the expected value, E[g(X)], of that function is defined as E[g(X)] = Σ_x g(x) P(x).

15 Variance and Standard Deviation
Let X be a discrete random variable. The expectation of the squared discrepancies about the mean, (X − μ_X)², is called the variance, denoted σ_X², and is given by σ_X² = E[(X − μ_X)²] = Σ_x (x − μ_X)² P(x). The standard deviation, σ_X, is the positive square root of the variance.

16 Variance (Alternative Formula)
The variance of a discrete random variable X can be expressed as σ_X² = E(X²) − μ_X² = Σ_x x² P(x) − μ_X².
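
To see numerically that the definition and the alternative formula agree, here is a small Python check on the die of Example 5.1 (an added illustration, not part of the slides):

```python
from fractions import Fraction

# Variance of a discrete random variable, computed two equivalent ways:
#   definition:  sigma^2 = sum of (x - mu)^2 * P(x)
#   shortcut:    sigma^2 = sum of x^2 * P(x)  -  mu^2
pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die from Example 5.1

mu = sum(x * p for x, p in pmf.items())                          # 7/2
var_definition = sum((x - mu) ** 2 * p for x, p in pmf.items())
var_shortcut = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2

print(var_definition)  # 35/12
print(var_shortcut)    # 35/12  (the two formulas give the same value)
```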

17 Expected Value and Variance for Discrete Random Variable Using Microsoft Excel (Figure 5.4)

18 Summary of Properties for Linear Function of a Random Variable
Let X be a random variable with mean μ_X and variance σ_X², and let a and b be any constant fixed numbers. Define the random variable Y = a + bX. Then the mean and variance of Y are μ_Y = E(a + bX) = a + bμ_X and σ_Y² = Var(a + bX) = b²σ_X², so that the standard deviation of Y is σ_Y = |b|σ_X.

19 Summary Results for the Mean and Variance of Special Linear Functions
Let b = 0 in the linear function W = a + bX. Then W = a (for any constant a). If a random variable always takes the value a, it will have mean μ_W = a and variance σ_W² = 0. Let a = 0 in the linear function W = a + bX. Then W = bX, with mean μ_W = bμ_X and variance σ_W² = b²σ_X².

20 Mean and Variance of Z Let a = −μ_X/σ_X and b = 1/σ_X in the linear function Z = a + bX. Then Z = (X − μ_X)/σ_X, so that μ_Z = −μ_X/σ_X + μ_X/σ_X = 0 and σ_Z² = σ_X²/σ_X² = 1.

21 Bernoulli Distribution
A Bernoulli distribution arises from a random experiment which can give rise to just two possible outcomes. These outcomes are usually labeled either “success” or “failure.” If π denotes the probability of a success and the probability of a failure is (1 − π), then the Bernoulli probability function is P(0) = (1 − π) and P(1) = π.

22 Mean and Variance of a Bernoulli Random Variable
The mean is μ_X = E(X) = (0)(1 − π) + (1)(π) = π, and the variance is σ_X² = E[(X − μ_X)²] = (0 − π)²(1 − π) + (1 − π)²π = π(1 − π).

23 Sequences of x Successes in n Trials
The number of sequences with x successes in n independent trials is C(n, x) = n!/(x!(n − x)!), where n! = n × (n − 1) × (n − 2) × ⋯ × 1 and 0! = 1.

24 Binomial Distribution
Suppose that a random experiment can result in two possible mutually exclusive and collectively exhaustive outcomes, “success” and “failure,” and that π is the probability of a success in a single trial. If n independent trials are carried out, the distribution of the resulting number of successes, X, is called the binomial distribution. Its probability distribution function for the binomial random variable X = x is: P(x successes in n independent trials) = [n!/(x!(n − x)!)] π^x (1 − π)^(n − x), for x = 0, 1, …, n.
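
A minimal Python sketch of this probability function, using math.comb for the count of sequences from the previous slide (pi stands in for the success probability π; the example call is illustrative):

```python
from math import comb

def binomial_pmf(x, n, pi):
    """P(x successes in n independent trials), each with success probability pi."""
    return comb(n, x) * pi**x * (1 - pi)**(n - x)

# Illustration: probability of exactly 2 successes in 5 trials with pi = 0.40.
print(binomial_pmf(2, 5, 0.40))  # ~0.3456
```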

25 Mean and Variance of a Binomial Probability Distribution
Let X be the number of successes in n independent trials, each with probability of success π. Then X follows a binomial distribution with mean μ = E(X) = nπ and variance σ² = E[(X − μ)²] = nπ(1 − π).

26 Binomial Probabilities - An Example – (Example 5.7)
An insurance broker, Shirley Ferguson, has five contracts, and she believes that for each contract the probability of making a sale is 0.40. What is the probability that she makes at most one sale? P(at most one sale) = P(X ≤ 1) = P(X = 0) + P(X = 1) = (0.60)^5 + 5(0.40)(0.60)^4 = 0.078 + 0.259 = 0.337
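
The arithmetic in Example 5.7 can be checked directly; the snippet below is a quick Python verification of the two binomial terms:

```python
from math import comb

n, pi = 5, 0.40
p0 = comb(n, 0) * (1 - pi)**5          # P(X = 0) = 0.6^5          ~= 0.078
p1 = comb(n, 1) * pi * (1 - pi)**4     # P(X = 1) = 5(0.4)(0.6)^4  ~= 0.259
print(p0 + p1)                         # ~0.337, probability of at most one sale
```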

27 Binomial Probabilities, n = 100, π = 0.40 (Figure 5.10)

28 Hypergeometric Distribution
Suppose that a random sample of n objects is chosen from a group of N objects, S of which are successes. The distribution of the number of successes, X, in the sample is called the hypergeometric distribution. Its probability function is: P(x) = [C(S, x) × C(N − S, n − x)] / C(N, n), where x can take integer values ranging from the larger of 0 and [n − (N − S)] to the smaller of n and S.
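
A short Python sketch of the hypergeometric probability function; the numbers in the example call (N = 20 objects, S = 4 successes, sample of n = 5) are assumed purely for illustration:

```python
from math import comb

def hypergeometric_pmf(x, n, N, S):
    """P(x successes in a sample of n objects drawn without replacement
    from N objects, S of which are successes)."""
    return comb(S, x) * comb(N - S, n - x) / comb(N, n)

# Illustration: 4 successes among N = 20 objects, sample of n = 5;
# probability of exactly one success in the sample.
print(hypergeometric_pmf(1, 5, 20, 4))  # ~0.47
```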

29 Poisson Probability Distribution
Assume that an interval is divided into a very large number of subintervals, so that the probability of the occurrence of an event in any subinterval is very small. The assumptions of a Poisson probability distribution are: The probability of an occurrence of an event is constant for all subintervals. There can be no more than one occurrence in each subinterval. Occurrences are independent; that is, the numbers of occurrences in non-overlapping intervals are independent of one another.

30 Poisson Probability Distribution
The random variable X is said to follow the Poisson probability distribution if it has the probability function P(x) = (e^(−λ) λ^x) / x!, where P(x) = the probability of x successes over a given period of time or space, given λ; λ = the expected number of successes per time or space unit, λ > 0; and e = 2.71828... (the base for natural logarithms). The mean and variance of the Poisson probability distribution are μ_X = E(X) = λ and σ_X² = E[(X − μ_X)²] = λ.
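
A minimal Python sketch of the Poisson probability function (lam plays the role of λ; the example values are assumed for illustration):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(x occurrences), where lam is the expected number of occurrences."""
    return exp(-lam) * lam**x / factorial(x)

# Illustration: an average of 3 occurrences per interval; probability of exactly 2.
print(poisson_pmf(2, 3.0))  # ~0.224
```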

31 Partial Poisson Probabilities for λ = 0.03 Obtained Using Microsoft Excel PHStat (Figure 5.14)

32 Poisson Approximation to the Binomial Distribution
Let X be the number of successes resulting from n independent trials, each with a probability of success π. The distribution of the number of successes X is binomial, with mean nπ. If the number of trials n is large and nπ is of only moderate size (preferably nπ ≤ 7), this distribution can be approximated by the Poisson distribution with λ = nπ. The probability function of the approximating distribution is then P(x) = (e^(−nπ) (nπ)^x) / x!.
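
A quick numerical check of the approximation in Python, with an assumed example of n = 100 trials and π = 0.03, so that nπ = 3 is of moderate size:

```python
from math import comb, exp, factorial

n, pi = 100, 0.03        # large n, with n*pi = 3 (moderate, well below 7)
lam = n * pi

for x in range(4):
    binom = comb(n, x) * pi**x * (1 - pi)**(n - x)
    poisson = exp(-lam) * lam**x / factorial(x)
    print(x, round(binom, 4), round(poisson, 4))  # the two values are close for each x
```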

33 Joint Probability Functions
Let X and Y be a pair of discrete random variables. Their joint probability function expresses the probability that X takes the specific value x and simultaneously Y takes the value y, as a function of x and y. The notation used is P(x, y), so P(x, y) = P(X = x ∩ Y = y).

34 Joint Probability Functions
Let X and Y be a pair of jointly distributed random variables. In this context the probability function of the random variable X is called its marginal probability function and is obtained by summing the joint probabilities over all possible values of y; that is, P(x) = Σ_y P(x, y). Similarly, the marginal probability function of the random variable Y is P(y) = Σ_x P(x, y).
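
A small Python sketch of recovering the marginal probability functions from a joint probability function; the joint probabilities below are made-up illustrative values, not those of Table 5.6:

```python
# Marginal probability functions obtained from a joint probability function.
# The joint probabilities below are made-up illustrative values (they sum to 1).
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

marginal_x, marginal_y = {}, {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p   # P(x) = sum over y of P(x, y)
    marginal_y[y] = marginal_y.get(y, 0.0) + p   # P(y) = sum over x of P(x, y)

print(marginal_x)  # {0: 0.3, 1: 0.7}  (up to floating-point rounding)
print(marginal_y)  # {0: 0.4, 1: 0.6}
```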

35 Properties of Joint Probability Functions
Let X and Y be discrete random variables with joint probability function P(x, y). Then: P(x, y) ≥ 0 for any pair of values x and y. The sum of the joint probabilities P(x, y) over all possible pairs of values must be 1; that is, Σ_x Σ_y P(x, y) = 1.

36 Conditional Probability Functions
Let X and Y be a pair of jointly distributed discrete random variables. The conditional probability function of the random variable Y, given that the random variable X takes the value x, expresses the probability that Y takes the value y, as a function of y, when the value x is specified for X. This is denoted P(y|x), and so by the definition of conditional probability, P(y|x) = P(x, y) / P(x). Similarly, the conditional probability function of X, given Y = y, is P(x|y) = P(x, y) / P(y).
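
A brief Python sketch of the conditional probability function, built from the same made-up joint probabilities used in the marginal example above:

```python
# Conditional probability function P(y|x) = P(x, y) / P(x),
# using an illustrative joint probability function (made-up values).
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def p_y_given_x(y, x):
    """P(Y = y | X = x) = P(x, y) / P(x)."""
    p_x = sum(p for (xi, _), p in joint.items() if xi == x)  # marginal P(x)
    return joint.get((x, y), 0.0) / p_x

print(p_y_given_x(1, 0))  # 0.20 / 0.30 ~= 0.667
```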

37 Independence of Jointly Distributed Random Variables
The jointly distributed random variables X and Y are said to be independent if and only if their joint probability function is the product of their marginal probability functions, that is, if and only if P(x, y) = P(x)P(y) for all possible pairs of values x and y. And k random variables X1, X2, …, Xk are independent if and only if P(x1, x2, …, xk) = P(x1)P(x2)···P(xk).

38 Expected Value Function of Jointly Distributed Random Variables
Let X and Y be a pair of discrete random variables with joint probability function P(x, y). The expectation of any function g(X, Y) of these random variables is defined as E[g(X, Y)] = Σ_x Σ_y g(x, y) P(x, y).

39 Stock Returns, Marginal Probability, Mean, Variance (Example 5.16)
Table 5.6 gives the joint probability function for the returns on the two stocks, X and Y (return values of 0%, 5%, 10%, and 15%; the table entries are joint probabilities such as 0.0625), from which the marginal probability functions, means, and variances are computed.

40 Covariance Let X be a random variable with mean μ_X, and let Y be a random variable with mean μ_Y. The expected value of (X − μ_X)(Y − μ_Y) is called the covariance between X and Y, denoted Cov(X, Y). For discrete random variables, Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = Σ_x Σ_y (x − μ_X)(y − μ_Y) P(x, y). An equivalent expression is Cov(X, Y) = E(XY) − μ_X μ_Y = Σ_x Σ_y x y P(x, y) − μ_X μ_Y.

41 Correlation Let X and Y be jointly distributed random variables. The correlation between X and Y is ρ = Corr(X, Y) = Cov(X, Y) / (σ_X σ_Y).
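
A short Python sketch computing the covariance (via the equivalent expression on the previous slide) and the correlation from a joint probability function; the joint probabilities are illustrative values, not from the text:

```python
from math import sqrt

# Covariance and correlation of two discrete random variables, computed
# from an illustrative joint probability function (made-up values).
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def expectation(g):
    """E[g(X, Y)] = sum over (x, y) of g(x, y) * P(x, y)."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mu_x = expectation(lambda x, y: x)
mu_y = expectation(lambda x, y: y)

cov = expectation(lambda x, y: x * y) - mu_x * mu_y        # Cov(X, Y) = E(XY) - mu_x*mu_y
sigma_x = sqrt(expectation(lambda x, y: x * x) - mu_x**2)  # standard deviation of X
sigma_y = sqrt(expectation(lambda x, y: y * y) - mu_y**2)  # standard deviation of Y

print(cov)                        # ~ -0.02
print(cov / (sigma_x * sigma_y))  # correlation, ~ -0.089
```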

42 Covariance and Statistical Independence
If two random variables are statistically independent, the covariance between them is 0. However, the converse is not necessarily true.

43 Portfolio Analysis The random variable X is the price for stock A and the random variable Y is the price for stock B. The market value, W, of the portfolio is given by the linear function W = aX + bY, where a is the number of shares of stock A and b is the number of shares of stock B.

44 Portfolio Analysis The mean value for W is μ_W = E(W) = aμ_X + bμ_Y. The variance for W is σ_W² = a²σ_X² + b²σ_Y² + 2ab Cov(X, Y), or, using the correlation, σ_W² = a²σ_X² + b²σ_Y² + 2ab Corr(X, Y) σ_X σ_Y.
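
A minimal Python sketch of the portfolio mean and variance formulas; every number below (share counts, means, variances, covariance) is an assumed illustrative value, not data from the text:

```python
from math import sqrt

# Portfolio mean and variance for W = a*X + b*Y.
# All numbers below are assumed illustrative values.
a, b = 100, 200                 # shares of stock A and stock B
mu_x, mu_y = 25.0, 40.0         # mean prices of A and B
var_x, var_y = 81.0, 121.0      # variances of the two prices
cov_xy = 40.0                   # covariance between the two prices

mu_w = a * mu_x + b * mu_y
var_w = a**2 * var_x + b**2 * var_y + 2 * a * b * cov_xy

# Equivalent form using the correlation Corr(X, Y) = Cov(X, Y) / (sigma_X * sigma_Y).
corr_xy = cov_xy / (sqrt(var_x) * sqrt(var_y))
var_w_alt = a**2 * var_x + b**2 * var_y + 2 * a * b * corr_xy * sqrt(var_x) * sqrt(var_y)

print(mu_w)                # 10500.0
print(var_w, var_w_alt)    # the two variance expressions agree
```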

45 Key Words
Bernoulli Random Variable, Mean and Variance
Binomial Distribution
Conditional Probability Function
Continuous Random Variable
Correlation
Covariance
Cumulative Probability Function
Differences of Random Variables
Discrete Random Variable
Expected Value
Expected Value: Functions of Random Variables
Expected Value: Function of Jointly Distributed Random Variable
Hypergeometric Distribution
Independence of Jointly Distributed Random Variables

46 Key Words (continued)
Joint Probability Function
Marginal Probability Function
Mean of Binomial Distribution
Mean: Functions of Random Variables
Poisson Approximation to the Binomial Distribution
Poisson Distribution
Portfolio Analysis
Portfolio, Market Value
Probability Distribution Function
Properties: Cumulative Probability Functions
Properties: Joint Probability Functions
Properties: Probability Distribution Functions
Random Variable

47 Key Words (continued)
Relationships: Probability Function and Cumulative Probability Function
Standard Deviation: Discrete Random Variable
Sums of Random Variables
Variance: Binomial Distribution
Variance: Discrete Random Variable
Variance: Discrete Random Variable (Alternative Formula)
Variance: Functions of Random Variables

