Continuous Random Variables and Probability Distributions
1 Continuous Random Variables and Probability Distributions (Chapter 6)
2 Continuous Random Variables A random variable is continuous if it can take any value in an interval.
3 Cumulative Distribution Function The cumulative distribution function, F(x), for a continuous random variable X expresses the probability that X does not exceed the value x, as a function of x:
F(x) = P(X ≤ x)
4 Cumulative Distribution Function [Figure: cumulative distribution function F(x) for a uniform random variable over 0 to 1, rising linearly from 0 to 1.]
5 Cumulative Distribution Function Let X be a continuous random variable with a cumulative distribution function F(x), and let a and b be two possible values of X, with a < b. The probability that X lies between a and b is
P(a < X < b) = F(b) − F(a)
6 Probability Density Function Let X be a continuous random variable, and let x be any number lying in the range of values this random variable can take. The probability density function, f(x), of the random variable is a function with the following properties:
f(x) > 0 for all values of x.
The area under the probability density function f(x) over all values of the random variable X is equal to 1.0.
Suppose this density function is graphed. Let a and b be two possible values of the random variable X, with a < b. Then the probability that X lies between a and b is the area under the density function between these points.
The cumulative distribution function F(x0) is the area under the probability density function f(x) up to x0:
F(x0) = ∫ from xm to x0 of f(x) dx
where xm is the minimum value of the random variable X.
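As a quick numerical illustration of these properties (not part of the original slides), the following Python sketch approximates areas under a density by the midpoint rule, using the uniform density f(x) = 1 on 0 to 1 as a hypothetical example:

```python
# Sketch: the two density-function properties, checked numerically for
# the Uniform(0, 1) density f(x) = 1 on [0, 1] (a hypothetical example).

def f(x):
    """Uniform(0, 1) probability density function."""
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def area(lo, hi, n=100_000):
    """Approximate the area under f between lo and hi (midpoint rule)."""
    width = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * width) for i in range(n)) * width

total = area(0.0, 1.0)      # total area under the density -> 1
p_a_b = area(0.25, 0.75)    # P(0.25 < X < 0.75) = area between a and b -> 0.5
print(round(total, 4), round(p_a_b, 4))
```

The same `area` helper also gives the cumulative distribution function: F(x0) is simply `area(xm, x0)`.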
7 Shaded Area is the Probability That X is Between a and b [Figure: density curve with the area between x = a and x = b shaded.]
8 Probability Density Function for a Uniform 0 to 1 Random Variable [Figure: f(x) = 1 on the interval from 0 to 1.]
9 Areas Under Continuous Probability Density Functions Let X be a continuous random variable with probability density function f(x) and cumulative distribution function F(x). Then the following properties hold:
The total area under the curve f(x) is 1.
The area under the curve f(x) to the left of x0 is F(x0), where x0 is any value that the random variable can take.
10 Properties of the Probability Density Function [Figure: uniform density f(x) = 1 on 0 to 1; the total area under the uniform probability density function is 1.]
11 Properties of the Probability Density Function [Figure: area under the uniform probability density function to the left of x0 is F(x0), which equals x0 for this uniform distribution because f(x) = 1.]
12 Rationale for Expectations of Continuous Random Variables Suppose that a random experiment leads to an outcome that can be represented by a continuous random variable. If N independent replications of this experiment are carried out, then the expected value of the random variable is the average of the values taken, as the number of replications becomes infinitely large. The expected value of a random variable is denoted by E(X).
13 Rationale for Expectations of Continuous Random Variables Similarly, if g(X) is any function of the random variable X, then the expected value of this function is the average value taken by the function over repeated independent trials, as the number of trials becomes infinitely large. This expectation is denoted E[g(X)]. Using calculus, we can define expected values for continuous random variables in a way analogous to the definition for discrete random variables, with integrals replacing sums.
14 Mean, Variance, and Standard Deviation Let X be a continuous random variable. There are two important expected values that are used routinely to define continuous probability distributions.
The mean of X, denoted by μX, is defined as the expected value of X:
μX = E(X)
The variance of X, denoted by σX², is defined as the expectation of the squared deviation, (X − μX)², of the random variable from its mean:
σX² = E[(X − μX)²]
An alternative but equivalent expression can be derived:
σX² = E(X²) − μX²
The standard deviation of X, σX, is the square root of the variance.
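A short Python sketch (not from the original slides) can verify both variance expressions numerically, again using the Uniform(0, 1) density, whose mean is 1/2 and variance is 1/12:

```python
# Sketch: E(X) and Var(X) for the Uniform(0, 1) density by midpoint-rule
# integration, checking that E[(X - mu)^2] equals E(X^2) - mu^2.

def integrate(g, lo, hi, n=100_000):
    """Midpoint-rule approximation of the integral of g over [lo, hi]."""
    width = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * width) for i in range(n)) * width

f = lambda x: 1.0                                   # Uniform(0, 1) density
mean = integrate(lambda x: x * f(x), 0.0, 1.0)      # E(X) -> 0.5
var = integrate(lambda x: (x - mean) ** 2 * f(x), 0.0, 1.0)       # -> 1/12
alt_var = integrate(lambda x: x * x * f(x), 0.0, 1.0) - mean ** 2 # E(X^2) - mu^2
print(round(mean, 4), round(var, 4), round(alt_var, 4))
```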
15 Linear Functions of Variables Let X be a continuous random variable with mean μX and variance σX², and let a and b be any constant fixed numbers. Define the random variable W as
W = a + bX
Then the mean and variance of W are
μW = a + bμX and σW² = b²σX²
and the standard deviation of W is
σW = |b|σX
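These rules can be checked by simulation. The sketch below (an illustration, with hypothetical values a = 2, b = 3, and X drawn from a normal distribution with mean 10 and standard deviation 2) compares the sample mean and variance of W = a + bX against a + bμX = 32 and b²σX² = 36:

```python
# Sketch: simulate W = a + bX and compare with the linear-function rules.
import random

random.seed(0)
a, b = 2.0, 3.0
xs = [random.gauss(10.0, 2.0) for _ in range(200_000)]  # X: mu=10, sigma=2
ws = [a + b * x for x in xs]

mean_w = sum(ws) / len(ws)
var_w = sum((w - mean_w) ** 2 for w in ws) / len(ws)
print(round(mean_w, 2), round(var_w, 2))  # theory: 2 + 3*10 = 32 and 9*4 = 36
```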
16 Linear Functions of Variables An important special case of the previous results is the standardized random variable
Z = (X − μX) / σX
which has mean 0 and variance 1.
17 Reasons for Using the Normal Distribution
The normal distribution closely approximates the probability distributions of a wide range of random variables.
Distributions of sample means approach a normal distribution given a "large" sample size.
Computations of probabilities are direct and elegant.
The normal probability distribution has led to good business decisions for a number of applications.
18 Probability Density Function for a Normal Distribution [Figure: symmetric bell-shaped normal density curve.]
19 Probability Density Function of the Normal Distribution The probability density function for a normally distributed random variable X is
f(x) = (1 / √(2πσ²)) e^(−(x − μ)² / (2σ²)) for −∞ < x < ∞
where μ and σ² are any numbers such that −∞ < μ < ∞ and 0 < σ² < ∞, and where e ≈ 2.71828 and π ≈ 3.14159 are mathematical constants.
20 Properties of the Normal Distribution Suppose that the random variable X follows a normal distribution with parameters μ and σ². Then the following properties hold:
The mean of the random variable is μ: E(X) = μ.
The variance of the random variable is σ²: Var(X) = E[(X − μ)²] = σ².
The shape of the probability density function is a symmetric bell-shaped curve centered on the mean μ, as shown in Figure 6.8.
By knowing the mean and variance, we can define the normal distribution by using the notation
X ~ N(μ, σ²)
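A small Python sketch (illustrative, not from the slides) evaluates the normal density formula directly and confirms the symmetry of the bell shape around the mean:

```python
# Sketch: evaluate the normal density f(x) = (1/sqrt(2*pi*sigma^2)) *
# exp(-(x - mu)^2 / (2*sigma^2)) and check symmetry about the mean.
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / math.sqrt(
        2 * math.pi * sigma ** 2
    )

peak = normal_pdf(0.0, 0.0, 1.0)                  # height at the mean, ~0.3989
left = normal_pdf(-1.5, 0.0, 1.0)
right = normal_pdf(1.5, 0.0, 1.0)
print(round(peak, 4), left == right)              # symmetric bell shape
```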
21 Effects of μ on the Probability Density Function of a Normal Random Variable [Figure: two normal densities with the same variance but different means; changing μ shifts the curve along the x-axis without changing its shape.]
22 Effects of σ² on the Probability Density Function of a Normal Random Variable [Figure: two normal densities with the same mean but different variances, one with variance 1; the curve with the larger variance is flatter and more spread out.]
23 Cumulative Distribution Function of the Normal Distribution Suppose that X is a normal random variable with mean μ and variance σ²; that is, X ~ N(μ, σ²). Then the cumulative distribution function is
F(x0) = P(X ≤ x0)
This is the area under the normal probability density function to the left of x0, as illustrated in the figure on the next slide. As for any proper density function, the total area under the curve is 1; that is, F(∞) = 1.
24 Shaded Area is the Probability that X does not Exceed x0 for a Normal Random Variable [Figure: normal density with the area to the left of x0 shaded.]
25 Range Probabilities for Normal Random Variables Let X be a normal random variable with cumulative distribution function F(x), and let a and b be two possible values of X, with a < b. Then
P(a < X < b) = F(b) − F(a)
The probability is the area under the corresponding probability density function between a and b.
26 Range Probabilities for Normal Random Variables [Figure: normal density with the area between a and b shaded.]
27 The Standard Normal Distribution Let Z be a normal random variable with mean 0 and variance 1; that is
Z ~ N(0, 1)
We say that Z follows the standard normal distribution. Denote the cumulative distribution function as F(z), and let a and b be two numbers with a < b; then
P(a < Z < b) = F(b) − F(a)
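The standard normal cumulative distribution function has no closed form, but it can be computed from the error function available in Python's standard library: F(z) = (1 + erf(z/√2))/2. The sketch below (an illustration, not from the slides) reproduces the tabulated value F(1.25) ≈ 0.8944 used on the next slide:

```python
# Sketch: the standard normal CDF via math.erf.
import math

def std_normal_cdf(z):
    """F(z) for Z ~ N(0, 1), computed from the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p_below = std_normal_cdf(1.25)                         # area left of z = 1.25
p_range = std_normal_cdf(1.0) - std_normal_cdf(-1.0)   # P(-1 < Z < 1)
print(round(p_below, 4), round(p_range, 4))            # ~0.8944 and ~0.6827
```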
28 Standard Normal Distribution with Probability for z = 1.25 [Figure: standard normal density with the area to the left of z = 1.25 shaded; F(1.25) = 0.8944.]
29 Finding Range Probabilities for Normally Distributed Random Variables Let X be a normally distributed random variable with mean μ and variance σ². Then the random variable Z = (X − μ)/σ has a standard normal distribution: Z ~ N(0, 1). It follows that if a and b are any numbers with a < b, then
P(a < X < b) = P((a − μ)/σ < Z < (b − μ)/σ) = F((b − μ)/σ) − F((a − μ)/σ)
where Z is the standard normal random variable and F(z) denotes its cumulative distribution function.
30 Computing Normal Probabilities A very large group of students obtains test scores that are normally distributed with mean 60 and standard deviation 15. What proportion of the students obtained scores between 85 and 95?
P(85 < X < 95) = P((85 − 60)/15 < Z < (95 − 60)/15) = P(1.67 < Z < 2.33) = F(2.33) − F(1.67) = 0.9901 − 0.9525 = 0.0376
That is, 3.76% of the students obtained scores in the range 85 to 95.
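This worked example can be reproduced in Python (an illustrative sketch, not part of the slides). Computing the CDF exactly via the error function gives about 0.038; the slide's 3.76% differs slightly because the z-values are rounded to two decimals before a table lookup:

```python
# Sketch: X ~ N(60, 15^2); compute P(85 < X < 95) by standardizing.
import math

def std_normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 60.0, 15.0
z_lo = (85.0 - mu) / sigma        # ~1.67
z_hi = (95.0 - mu) / sigma        # ~2.33
p = std_normal_cdf(z_hi) - std_normal_cdf(z_lo)
print(round(p, 4))                # ~0.038, i.e. about 3.8% of students
```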
31 Approximating Binomial Probabilities Using the Normal Distribution Let X be the number of successes from n independent Bernoulli trials, each with probability of success π. The number of successes, X, is a binomial random variable, and if nπ(1 − π) > 9 a good approximation is
P(a ≤ X ≤ b) ≈ P((a − nπ)/√(nπ(1 − π)) ≤ Z ≤ (b − nπ)/√(nπ(1 − π)))
Or if 5 < nπ(1 − π) < 9 we can use the continuity correction factor to obtain
P(a ≤ X ≤ b) ≈ P((a − 0.5 − nπ)/√(nπ(1 − π)) ≤ Z ≤ (b + 0.5 − nπ)/√(nπ(1 − π)))
where Z is a standard normal variable.
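The quality of the approximation can be seen by comparing it with the exact binomial probability. The sketch below (hypothetical values n = 30 and π = 0.4, chosen for illustration) applies the continuity correction:

```python
# Sketch: exact binomial P(10 <= X <= 14) vs. normal approximation with
# continuity correction, for hypothetical n = 30, pi = 0.4.
import math

def std_normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, pi = 30, 0.4
mu = n * pi                                  # 12
sigma = math.sqrt(n * pi * (1 - pi))         # ~2.683

# Exact probability from the binomial pmf
exact = sum(
    math.comb(n, k) * pi**k * (1 - pi) ** (n - k) for k in range(10, 15)
)

# Normal approximation with continuity correction: P(9.5 < X < 14.5)
approx = std_normal_cdf((14.5 - mu) / sigma) - std_normal_cdf((9.5 - mu) / sigma)
print(round(exact, 4), round(approx, 4))
```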
32 The Exponential Distribution The exponential random variable T (t > 0) has a probability density function
f(t) = λe^(−λt) for t > 0
where λ is the mean number of occurrences per unit time, t is the number of time units until the next occurrence, and e ≈ 2.71828. Then T is said to follow an exponential probability distribution. The cumulative distribution function is
F(t) = 1 − e^(−λt) for t > 0
The distribution has mean 1/λ and variance 1/λ².
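The sketch below (illustrative, using the λ = 0.2 value from the next slide's figure) evaluates the CDF and checks the mean 1/λ by numerically integrating t·f(t):

```python
# Sketch: exponential distribution with lam = 0.2; CDF value and mean check.
import math

lam = 0.2
f = lambda t: lam * math.exp(-lam * t)     # density
F = lambda t: 1.0 - math.exp(-lam * t)     # cumulative distribution function

p5 = F(5.0)                                # P(T <= 5) = 1 - e^{-1} ~ 0.6321
print(round(p5, 4))

# Mean 1/lam checked by midpoint-rule integration of t * f(t)
n, hi = 200_000, 200.0                     # tail beyond 200 is negligible
width = hi / n
mean = sum((i + 0.5) * width * f((i + 0.5) * width) for i in range(n)) * width
print(round(mean, 2))                      # theory: 1/0.2 = 5
```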
33 Probability Density Function for an Exponential Distribution with λ = 0.2 [Figure: exponential density f(t) = 0.2e^(−0.2t), starting at 0.2 when t = 0 and decreasing toward 0 as t increases.]
34 Joint Cumulative Distribution Functions Let X1, X2, . . ., Xk be continuous random variables. Their joint cumulative distribution function, F(x1, x2, . . ., xk), defines the probability that simultaneously X1 is less than x1, X2 is less than x2, and so on; that is
F(x1, x2, . . ., xk) = P(X1 < x1 ∩ X2 < x2 ∩ · · · ∩ Xk < xk)
The cumulative distribution functions F(x1), F(x2), . . ., F(xk) of the individual random variables are called their marginal distribution functions. For any i, F(xi) is the probability that the random variable Xi does not exceed the specific value xi. The random variables are independent if and only if
F(x1, x2, . . ., xk) = F(x1)F(x2) · · · F(xk)
35 Covariance Let X and Y be a pair of continuous random variables, with respective means μX and μY. The expected value of (X − μX)(Y − μY) is called the covariance between X and Y. That is
Cov(X, Y) = E[(X − μX)(Y − μY)]
An alternative but equivalent expression can be derived as
Cov(X, Y) = E(XY) − μXμY
If the random variables X and Y are independent, then the covariance between them is 0. However, the converse is not true.
36 Correlation Let X and Y be jointly distributed random variables. The correlation between X and Y is
ρ = Corr(X, Y) = Cov(X, Y) / (σXσY)
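A simulation sketch (illustrative, with hypothetical data Y = 2X + noise) shows both the shortcut covariance formula E(XY) − μXμY and the correlation definition in action. Here theory gives Cov(X, Y) = 2·Var(X) = 2 and Corr(X, Y) = 2/√5 ≈ 0.894:

```python
# Sketch: sample covariance and correlation for Y = 2X + noise.
import math
import random

random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
ys = [2.0 * x + random.gauss(0.0, 1.0) for x in xs]

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum(x * y for x, y in zip(xs, ys)) / len(xs) - mx * my  # E(XY) - mu_x*mu_y
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / len(xs))
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / len(ys))
corr = cov / (sx * sy)
print(round(cov, 2), round(corr, 3))  # theory: 2 and 2/sqrt(5) ~ 0.894
```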
37 Sums of Random Variables Let X1, X2, . . ., Xk be k random variables with means μ1, μ2, . . ., μk and variances σ1², σ2², . . ., σk². The following properties hold:
The mean of their sum is the sum of their means; that is
E(X1 + X2 + · · · + Xk) = μ1 + μ2 + · · · + μk
If the covariance between every pair of these random variables is 0, then the variance of their sum is the sum of their variances; that is
Var(X1 + X2 + · · · + Xk) = σ1² + σ2² + · · · + σk²
However, if the covariances between pairs of random variables are not 0, the variance of their sum is
Var(X1 + X2 + · · · + Xk) = σ1² + σ2² + · · · + σk² + 2 Σ (over i < j) Cov(Xi, Xj)
38 Differences Between a Pair of Random Variables Let X and Y be a pair of random variables with means μX and μY and variances σX² and σY². The following properties hold:
The mean of their difference is the difference of their means; that is
E(X − Y) = μX − μY
If the covariance between X and Y is 0, then the variance of their difference is
Var(X − Y) = σX² + σY²
If the covariance between X and Y is not 0, then the variance of their difference is
Var(X − Y) = σX² + σY² − 2Cov(X, Y)
39 Linear Combinations of Random Variables The linear combination of two random variables, X and Y, is
W = aX + bY
where a and b are constant numbers. The mean of W is
μW = E(W) = aμX + bμY
The variance of W is
σW² = a²σX² + b²σY² + 2abCov(X, Y)
or, using the correlation,
σW² = a²σX² + b²σY² + 2abCorr(X, Y)σXσY
If both X and Y are jointly normally distributed random variables, then the resulting random variable, W, is also normally distributed, with mean and variance derived above.
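These rules can be verified by simulation. The sketch below (illustrative, with hypothetical a = 3, b = −2 and independent X and Y, so Cov(X, Y) = 0) compares the sample mean and variance of W = aX + bY against aμX + bμY = −5 and a²σX² + b²σY² = 25:

```python
# Sketch: simulate W = aX + bY for independent X ~ N(1, 1) and Y ~ N(4, 4)
# and compare with the linear-combination rules (Cov(X, Y) = 0 here).
import random

random.seed(2)
a, b = 3.0, -2.0
xs = [random.gauss(1.0, 1.0) for _ in range(200_000)]  # mu_x=1, var_x=1
ys = [random.gauss(4.0, 2.0) for _ in range(200_000)]  # mu_y=4, var_y=4
ws = [a * x + b * y for x, y in zip(xs, ys)]

mean_w = sum(ws) / len(ws)
var_w = sum((w - mean_w) ** 2 for w in ws) / len(ws)
print(round(mean_w, 2), round(var_w, 2))  # theory: 3*1 - 2*4 = -5 and 9 + 16 = 25
```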