Continuous Random Variables and Probability Distributions


Chapter 6: Continuous Random Variables and Probability Distributions

Continuous Random Variables A random variable is continuous if it can take any value in an interval.

Cumulative Distribution Function The cumulative distribution function, F(x), for a continuous random variable X expresses the probability that X does not exceed the value x, as a function of x:

F(x) = P(X ≤ x)

[Figure: Cumulative distribution function F(x) for a random variable uniform over 0 to 1; F(x) rises linearly from 0 to 1.]

Cumulative Distribution Function Let X be a continuous random variable with a cumulative distribution function F(x), and let a and b be two possible values of X, with a < b. The probability that X lies between a and b is

P(a < X < b) = F(b) − F(a)

Probability Density Function Let X be a continuous random variable, and let x be any number lying in the range of values this random variable can take. The probability density function, f(x), of the random variable is a function with the following properties:

1. f(x) > 0 for all values of x.
2. The area under the probability density function f(x) over all values of the random variable X is equal to 1.0.
3. Suppose this density function is graphed. Let a and b be two possible values of the random variable X, with a < b. Then the probability that X lies between a and b is the area under the density function between these points:

P(a < X < b) = ∫ from a to b of f(x) dx

4. The cumulative distribution function F(x0) is the area under the probability density function f(x) up to x0:

F(x0) = ∫ from xm to x0 of f(x) dx

where xm is the minimum value of the random variable X.

[Figure: The shaded area under the density curve f(x) between a and b is the probability that X lies between a and b.]

[Figure: Probability density function for a uniform 0 to 1 random variable; f(x) = 1 on the interval from 0 to 1.]

Areas Under Continuous Probability Density Functions Let X be a continuous random variable with probability density function f(x) and cumulative distribution function F(x). Then the following properties hold:

1. The total area under the curve f(x) is 1.
2. The area under the curve f(x) to the left of x0 is F(x0), where x0 is any value that the random variable can take.

[Figure: The total area under the uniform probability density function is 1.]

[Figure: The area under the uniform probability density function to the left of x0 is F(x0), which equals x0 for this uniform distribution because f(x) = 1.]
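The two area properties are easy to check numerically. Below is a minimal sketch (my own illustration, not from the slides) that integrates the uniform 0 to 1 density with scipy to confirm that the total area is 1 and that the area to the left of x0 equals x0.

```python
# Numerical check of the two area properties for the uniform(0, 1) density,
# where f(x) = 1 on [0, 1].
from scipy import integrate

f = lambda x: 1.0  # uniform(0, 1) density

total_area, _ = integrate.quad(f, 0, 1)
print(total_area)  # 1.0 -> total area under f is 1

x0 = 0.3
F_x0, _ = integrate.quad(f, 0, x0)
print(F_x0)        # 0.3 -> area to the left of x0 equals F(x0) = x0
```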

Rationale for Expectations of Continuous Random Variables Suppose that a random experiment leads to an outcome that can be represented by a continuous random variable. If N independent replications of this experiment are carried out, then the expected value of the random variable is the average of the values taken, as the number of replications becomes infinitely large. The expected value of a random variable is denoted by E(X).

Rationale for Expectations of Continuous Random Variables Similarly, if g(X) is any function of the random variable X, then the expected value of this function is the average value taken by the function over repeated independent trials, as the number of trials becomes infinitely large. This expectation is denoted E[g(X)]. Using calculus, we can define expected values for continuous random variables analogously to the definition used for discrete random variables, with the sum replaced by an integral:

E[g(X)] = ∫ g(x) f(x) dx

where the integral is taken over the range of X.

Mean, Variance, and Standard Deviation Let X be a continuous random variable. There are two important expected values that are used routinely to define continuous probability distributions.

The mean of X, denoted by μX, is defined as the expected value of X:

μX = E(X)

The variance of X, denoted by σX², is defined as the expectation of the squared deviation, (X − μX)², of the random variable from its mean:

σX² = E[(X − μX)²]

An alternative but equivalent expression can be derived:

σX² = E(X²) − μX²

The standard deviation of X, σX, is the square root of the variance.
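As a concrete illustration of these definitions (a sketch of mine, not part of the original slides), the mean and variance can be computed by numerical integration; for the uniform 0 to 1 density the results should be 0.5 and 1/12 ≈ 0.0833.

```python
# Mean and variance of a continuous random variable by numerical integration,
# using the uniform(0, 1) density f(x) = 1 as the example.
import numpy as np
from scipy import integrate

f = lambda x: 1.0                                      # density of uniform(0, 1)
mean, _ = integrate.quad(lambda x: x * f(x), 0, 1)     # E(X) = 0.5
ex2, _ = integrate.quad(lambda x: x**2 * f(x), 0, 1)   # E(X^2) = 1/3
var = ex2 - mean**2                                    # sigma^2 = E(X^2) - mu^2 = 1/12
print(mean, var, np.sqrt(var))
```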

Linear Functions of Variables Let X be a continuous random variable with mean μX and variance σX², and let a and b be any constant fixed numbers. Define the random variable W as

W = a + bX

Then the mean and variance of W are

μW = a + bμX  and  σW² = b²σX²

and the standard deviation of W is

σW = |b|σX

Linear Functions of Variables An important special case of the previous results is the standardized random variable

Z = (X − μX) / σX

which has mean 0 and variance 1.
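A quick simulation (my own sketch, assuming numpy; not from the slides) can confirm the linear-function rules and that the standardized variable has mean 0 and variance 1.

```python
# Checking mu_W = a + b*mu_X and var_W = b^2 * var_X, and that
# Z = (X - mu_X)/sigma_X has mean ~0 and variance ~1.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=1_000_000)  # X with mu = 5, sigma = 2

a, b = 3.0, -4.0
w = a + b * x
print(w.mean(), w.var())   # ~ 3 + (-4)*5 = -17  and  (-4)^2 * 2^2 = 64

z = (x - x.mean()) / x.std()
print(z.mean(), z.var())   # ~ 0 and ~ 1
```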

Reasons for Using the Normal Distribution The normal distribution closely approximates the probability distributions of a wide range of random variables. Distributions of sample means approach a normal distribution given a “large” sample size. Computations of probabilities are direct and elegant. The normal probability distribution has led to good business decisions for a number of applications.

[Figure: Probability density function for a normal distribution; a bell-shaped curve centered on the mean μ, with density values ranging from 0.0 to about 0.4.]

Probability Density Function of the Normal Distribution The probability density function for a normally distributed random variable X is

f(x) = (1 / √(2πσ²)) e^(−(x − μ)² / (2σ²))  for −∞ < x < ∞

where μ and σ² are any numbers such that −∞ < μ < ∞ and 0 < σ² < ∞, and where e and π are the mathematical constants e = 2.71828… and π = 3.14159….

Properties of the Normal Distribution Suppose that the random variable X follows a normal distribution with parameters μ and σ². Then the following properties hold:

1. The mean of the random variable is μ: E(X) = μ.
2. The variance of the random variable is σ²: Var(X) = E[(X − μ)²] = σ².
3. The shape of the probability density function is a symmetric bell-shaped curve centered on the mean μ, as shown in Figure 6.8.

By knowing the mean and variance we can define the normal distribution by using the notation

X ~ N(μ, σ²)

[Figure: Effects of μ on the probability density function of a normal random variable; two curves with equal variance, one with mean 5 and one with mean 6, identical in shape but shifted along the x-axis.]

[Figure: Effects of σ² on the probability density function of a normal random variable; two curves with the same mean, one with variance 1 and one with variance 0.0625, the smaller variance giving a taller, narrower curve.]

Cumulative Distribution Function of the Normal Distribution Suppose that X is a normal random variable with mean μ and variance σ²; that is, X ~ N(μ, σ²). Then the cumulative distribution function is

F(x0) = P(X ≤ x0)

This is the area under the normal probability density function to the left of x0, as illustrated in Figure 6.10. As for any proper density function, the total area under the curve is 1; that is, F(∞) = 1.

[Figure: The shaded area under the normal density f(x) to the left of x0 is the probability that X does not exceed x0.]

Range Probabilities for Normal Random Variables Let X be a normal random variable with cumulative distribution function F(x), and let a and b be two possible values of X, with a < b. Then

P(a < X < b) = F(b) − F(a)

The probability is the area under the corresponding probability density function between a and b.

[Figure: The area under the normal density f(x) between a and b is P(a < X < b).]

The Standard Normal Distribution Let Z be a normal random variable with mean 0 and variance 1; that is,

Z ~ N(0, 1)

We say that Z follows the standard normal distribution. Denote the cumulative distribution function as F(z), and let a and b be two numbers with a < b; then

P(a < Z < b) = F(b) − F(a)

[Figure: Standard normal distribution; the area to the left of z = 1.25 is F(1.25) = 0.8944.]
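The tabulated value in the figure can be reproduced with any statistics library; for instance (a sketch using scipy, not part of the slides):

```python
# Verifying the figure's value F(1.25) = 0.8944 for the standard normal CDF.
from scipy.stats import norm

print(norm.cdf(1.25))  # ~0.89435, which rounds to the tabled 0.8944
```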

Finding Range Probabilities for Normally Distributed Random Variables Let X be a normally distributed random variable with mean μ and variance σ². Then the random variable Z = (X − μ)/σ has a standard normal distribution: Z ~ N(0, 1). It follows that if a and b are any numbers with a < b, then

P(a < X < b) = P((a − μ)/σ < Z < (b − μ)/σ) = F((b − μ)/σ) − F((a − μ)/σ)

where Z is the standard normal random variable and F(z) denotes its cumulative distribution function.

Computing Normal Probabilities A very large group of students obtains test scores that are normally distributed with mean 60 and standard deviation 15. What proportion of the students obtained scores between 85 and 95?

P(85 < X < 95) = P((85 − 60)/15 < Z < (95 − 60)/15) = P(1.67 < Z < 2.33) = F(2.33) − F(1.67) = 0.9901 − 0.9525 = 0.0376

That is, 3.76% of the students obtained scores in the range 85 to 95.
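The same calculation can be checked directly against a software normal CDF (a sketch assuming scipy; not part of the original slides). The 3.76% above comes from rounding the z-values to 1.67 and 2.33; the unrounded answer is about 3.80%.

```python
# Worked example check: X ~ N(60, 15^2), P(85 < X < 95).
from scipy.stats import norm

mu, sigma = 60, 15
p = norm.cdf(95, mu, sigma) - norm.cdf(85, mu, sigma)
print(p)  # ~0.0380; the slide's 0.0376 uses z rounded to 1.67 and 2.33
```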

Approximating Binomial Probabilities Using the Normal Distribution Let X be the number of successes from n independent Bernoulli trials, each with probability of success π. The number of successes, X, is a binomial random variable with mean nπ and variance nπ(1 − π). If nπ(1 − π) > 9, a good approximation is

P(a ≤ X ≤ b) ≈ P((a − nπ)/√(nπ(1 − π)) ≤ Z ≤ (b − nπ)/√(nπ(1 − π)))

Or if 5 < nπ(1 − π) < 9, we can use the continuity correction factor to obtain

P(a ≤ X ≤ b) ≈ P((a − 0.5 − nπ)/√(nπ(1 − π)) ≤ Z ≤ (b + 0.5 − nπ)/√(nπ(1 − π)))

where Z is a standard normal variable.
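To see how good the approximation is, one can compare it with the exact binomial probability (a sketch of mine, with arbitrary illustrative values n = 100 and π = 0.1; not from the slides).

```python
# Exact binomial probability versus the normal approximation with
# continuity correction, for n = 100, pi = 0.1, so n*pi*(1-pi) = 9.
import math
from scipy.stats import binom, norm

n, pi = 100, 0.1
a, b = 8, 14
mu = n * pi
sd = math.sqrt(n * pi * (1 - pi))

exact = binom.cdf(b, n, pi) - binom.cdf(a - 1, n, pi)              # P(a <= X <= b)
approx = norm.cdf((b + 0.5 - mu) / sd) - norm.cdf((a - 0.5 - mu) / sd)
print(exact, approx)  # the two values should be close
```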

The Exponential Distribution The exponential random variable T (t > 0) has a probability density function

f(t) = λe^(−λt)  for t > 0

where λ is the mean number of occurrences per unit time, t is the number of time units until the next occurrence, and e = 2.71828…. Then T is said to follow an exponential probability distribution. The cumulative distribution function is

F(t) = 1 − e^(−λt)  for t > 0

The distribution has mean 1/λ and variance 1/λ².
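A short sketch (mine, not from the slides) evaluates the exponential CDF both from the formula and with scipy, using the λ = 0.2 value from the figure below:

```python
# Exponential CDF F(t) = 1 - e^(-lambda*t), evaluated two ways for lambda = 0.2.
import math
from scipy.stats import expon

lam = 0.2
t = 10.0
print(1 - math.exp(-lam * t))       # F(10) by the formula: ~0.8647
print(expon.cdf(t, scale=1 / lam))  # scipy parameterizes by scale = 1/lambda
print(1 / lam, 1 / lam**2)          # mean and variance: 5.0 and 25.0
```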

[Figure: Probability density function for an exponential distribution with λ = 0.2; the density starts at 0.2 and decays toward 0 as x increases past 10 and 20.]

Joint Cumulative Distribution Functions Let X1, X2, …, Xk be continuous random variables. Their joint cumulative distribution function, F(x1, x2, …, xk), defines the probability that simultaneously X1 is less than x1, X2 is less than x2, and so on; that is,

F(x1, x2, …, xk) = P(X1 < x1 ∩ X2 < x2 ∩ … ∩ Xk < xk)

The cumulative distribution functions F(x1), F(x2), …, F(xk) of the individual random variables are called their marginal distribution functions. For any i, F(xi) is the probability that the random variable Xi does not exceed the specific value xi. The random variables are independent if and only if

F(x1, x2, …, xk) = F(x1)F(x2)···F(xk)
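The factorization condition for independence can be illustrated empirically (a simulation sketch of my own, not from the slides): for independent uniform variables, the joint CDF evaluated at a point should match the product of the marginal CDFs.

```python
# For independent variables, F(x1, x2) = F(x1) * F(x2).
# Empirical check with independent uniform(0, 1) variables.
import numpy as np

rng = np.random.default_rng(3)
x1 = rng.uniform(size=1_000_000)
x2 = rng.uniform(size=1_000_000)

a1, a2 = 0.4, 0.7
joint = np.mean((x1 < a1) & (x2 < a2))  # empirical joint CDF F(a1, a2)
print(joint, a1 * a2)                   # both ~0.28
```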

Covariance Let X and Y be a pair of continuous random variables, with respective means μX and μY. The expected value of (X − μX)(Y − μY) is called the covariance between X and Y; that is,

Cov(X, Y) = E[(X − μX)(Y − μY)]

An alternative but equivalent expression can be derived as

Cov(X, Y) = E(XY) − μXμY

If the random variables X and Y are independent, then the covariance between them is 0. However, the converse is not true.

Correlation Let X and Y be jointly distributed random variables. The correlation between X and Y is

ρ = Corr(X, Y) = Cov(X, Y) / (σXσY)
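Both quantities are straightforward to estimate from data. The following simulation sketch (my own, assuming numpy; the dependence Y = 2X + noise is an arbitrary illustrative choice) recovers Cov(X, Y) ≈ 2 and Corr(X, Y) ≈ 0.894.

```python
# Estimating covariance and correlation from samples of two dependent
# normal random variables.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=500_000)
y = 2.0 * x + rng.normal(size=500_000)  # Y depends on X, so Cov(X, Y) != 0

cov = np.cov(x, y)[0, 1]                # sample covariance, ~2.0
corr = np.corrcoef(x, y)[0, 1]          # sample correlation = cov/(sx*sy)
print(cov, corr)                        # corr ~ 2/sqrt(5) ≈ 0.894
```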

Sums of Random Variables Let X1, X2, …, Xk be k random variables with means μ1, μ2, …, μk and variances σ1², σ2², …, σk². The following properties hold:

1. The mean of their sum is the sum of their means; that is,

E(X1 + X2 + … + Xk) = μ1 + μ2 + … + μk

2. If the covariance between every pair of these random variables is 0, then the variance of their sum is the sum of their variances; that is,

Var(X1 + X2 + … + Xk) = σ1² + σ2² + … + σk²

However, if the covariances between pairs of random variables are not 0, the variance of their sum is

Var(X1 + X2 + … + Xk) = σ1² + σ2² + … + σk² + 2ΣΣ Cov(Xi, Xj)

where the double sum runs over all pairs with i < j.

Differences Between a Pair of Random Variables Let X and Y be a pair of random variables with means μX and μY and variances σX² and σY². The following properties hold:

1. The mean of their difference is the difference of their means; that is,

E(X − Y) = μX − μY

2. If the covariance between X and Y is 0, then the variance of their difference is

Var(X − Y) = σX² + σY²

3. If the covariance between X and Y is not 0, then the variance of their difference is

Var(X − Y) = σX² + σY² − 2Cov(X, Y)

Linear Combinations of Random Variables The linear combination of two random variables, X and Y, is

W = aX + bY

where a and b are constant numbers. The mean for W is

μW = E(W) = aμX + bμY

The variance for W is

σW² = a²σX² + b²σY² + 2ab Cov(X, Y)

or, using the correlation,

σW² = a²σX² + b²σY² + 2ab Corr(X, Y)σXσY

If both X and Y are jointly normally distributed random variables, then the resulting random variable, W, is also normally distributed, with mean and variance as derived above.
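A final simulation sketch (my own, not from the slides; the parameter values are arbitrary) checks the mean and variance formulas for W = aX + bY, which also subsume the sum and difference rules above as the special cases a = b = 1 and a = 1, b = −1.

```python
# Checking mu_W = a*mu_X + b*mu_Y and
# var_W = a^2*var_X + b^2*var_Y + 2ab*Cov(X, Y) for W = aX + bY.
import numpy as np

rng = np.random.default_rng(2)
mean = [1.0, 2.0]
cov = [[4.0, 1.5],   # var_X = 4, Cov(X, Y) = 1.5
       [1.5, 9.0]]   # var_Y = 9
x, y = rng.multivariate_normal(mean, cov, size=1_000_000).T

a, b = 2.0, 3.0
w = a * x + b * y
print(w.mean())  # ~ 2*1 + 3*2 = 8
print(w.var())   # ~ 4*4 + 9*9 + 2*2*3*1.5 = 16 + 81 + 18 = 115
```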