Lecture Note 6: Continuous Random Variables and Probability Distributions



Continuous Random Variables A random variable is continuous if it can take any value in an interval.

Probability Distribution of Continuous Random Variables For a continuous random variable, we use two concepts to describe the probability distribution: the probability density function and the cumulative distribution function.

Probability Density Function The probability density function is a concept similar to the probability distribution function of a discrete random variable. You can think of the probability density function as a "smoothed" probability distribution function.

Comparing the Probability Distribution Function (discrete r.v.) and the Probability Density Function (continuous r.v.)

Comparing the Probability Distribution Function (discrete r.v.) and the Probability Density Function (continuous r.v.) The probability distribution function of a discrete random variable shows the probability of each individual outcome.

Comparing the Probability Distribution Function (discrete r.v.) and the Probability Density Function (continuous r.v.) [Figure: a density curve f(x) with the area between a and b shaded] The probability density function shows the probability that the random variable falls in a particular range as the area under the curve over that range.

Probability Density Function Let X be a continuous random variable, and let x denote a value that the random variable X can take. We use f(x) to denote the probability density function.

Example Your client told you that he will visit you between noon and 2pm. Within that window, the time he arrives at your company is totally random. Let X be the random variable for his arrival time (X=1.5 means he visits your office at 1:30pm), and let x be a possible value of X. Then the probability density function f(x) has the following shape.

Probability Density Function Example [Figure: f(x) = 0.5 for 0 ≤ x ≤ 2 and 0 otherwise]

Probability Density Function Example The probability that your client visits your office between 12:30 and 1:00 is given by the shaded area. [Figure: the density f(x) = 0.5 with the area between x = 0.5 and x = 1 shaded]

Probability Density Function Example Note that the area between 0 and 2 should be equal to 1, since the probability that your client arrives between noon and 2pm is 1 (assuming that he keeps his promise to visit between noon and 2pm). [Figure: the density f(x) = 0.5 over 0 ≤ x ≤ 2]

Some Properties of the Probability Density Function f(x) ≥ 0 for any x. The total area under f(x) is 1.
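A minimal sketch of these two properties in Python, assuming numpy and scipy are available and using the uniform client-arrival density on [0, 2] from the example above.

```python
import numpy as np
from scipy.integrate import quad

def f(x):
    """Uniform density on [0, 2]: 0.5 inside the interval, 0 outside."""
    return 0.5 if 0.0 <= x <= 2.0 else 0.0

# Property 1: the density is non-negative (spot-check on a grid of points).
assert all(f(x) >= 0 for x in np.linspace(-1, 3, 101))

# Property 2: the total area under f(x) is 1.
total_area, _ = quad(f, 0, 2)
print(total_area)            # ~1.0

# P(0.5 < X < 1): the client arrives between 12:30 and 1:00.
prob, _ = quad(f, 0.5, 1.0)
print(prob)                  # ~0.25
```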

Cumulative Distribution Function The cumulative distribution function, F(x), for a continuous random variable X expresses the probability that X does not exceed the value x, as a function of x: F(x) = P(X ≤ x).

In other words, the cumulative distribution function F(x) = P(X ≤ x) is given by the shaded area. [Figure: a density curve with the area to the left of x shaded]

Cumulative Distribution Function -Example- [Figure: the density f(x) = 0.5 on [0, 2] alongside the cumulative distribution function F(x), which rises linearly from 0 at x = 0 to 1 at x = 2]

A property of cumulative distribution functions For any a < b, P(a < X ≤ b) = F(b) - F(a).

Probability Density Function and Cumulative Distribution Function -Exercise- The daily revenue of a certain shop is a continuous random variable. Let X be the random variable for the daily revenue (in thousand dollars). Suppose that X has the following probability density function. Ex. 1 Graph f(x). Ex. 2 Find F(3). Ex. 3 Find P(2<X<3). Ex. 4 Graph F(x).

Relationship Between Probability Density Function and Cumulative Distribution Function Let X be a continuous random variable. Then the probability density function and the cumulative distribution function are related as follows: F(x) is the area under f up to x, that is, F(x) = ∫ f(t) dt with the integral taken from the lowest possible value of X up to x; equivalently, f(x) is the derivative of F(x).
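A quick numeric check of this relationship, assuming scipy is available, again using the uniform density on [0, 2] from the client example: integrating f up to x gives F(x), and numerically differentiating F recovers f.

```python
from scipy.integrate import quad

def f(x):
    return 0.5 if 0.0 <= x <= 2.0 else 0.0   # uniform density on [0, 2]

def F(x):
    area, _ = quad(f, 0.0, x)                # F(x) = integral of f from the lower bound to x
    return area

print(F(1.0))                                # ~0.5, i.e. P(X <= 1)

# f is (approximately) the derivative of F.
h = 1e-6
print((F(1.0 + h) - F(1.0 - h)) / (2 * h))   # ~0.5, matches f(1.0)
```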

Expectation for a Continuous Random Variable Expectation for a continuous random variable is defined in a way similar to the expectation for a discrete random variable. See the next slide.

Expectation for a discrete random variable is defined as E(X) = Σ x P(x). For a continuous variable, f(x) corresponds to P(x). However, we cannot sum x f(x), since x varies continuously. Therefore, we define the expectation using an integral: E(X) = ∫ x f(x) dx, where the integral runs from the lowest possible value of X to the highest possible value of X. We use μX to denote E(X).

Expectation of a Function of a Continuous Random Variable Let g(X) be a function of a continuous random variable X. Then the expectation of g(X) is E[g(X)] = ∫ g(x) f(x) dx. A special case is the variance, given on the next slide.

Variance and Standard Deviation for a Continuous Random Variable Variance and standard deviation for a continuous random variable are defined as Var(X) = σX² = E[(X - μX)²] = ∫ (x - μX)² f(x) dx and SD(X) = σX = √Var(X).
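A short sketch, assuming scipy is available, that evaluates these integrals numerically for the uniform density on [0, 2] from the client example (not the revenue exercise, whose density is not reproduced above).

```python
from scipy.integrate import quad

f = lambda x: 0.5   # uniform density on [0, 2]; it is zero outside, so integrate over [0, 2]

mean, _ = quad(lambda x: x * f(x), 0, 2)                 # E(X)   = integral of x f(x) dx
var, _  = quad(lambda x: (x - mean) ** 2 * f(x), 0, 2)   # Var(X) = integral of (x - muX)^2 f(x) dx
sd = var ** 0.5

print(mean, var, sd)   # ~1.0, ~0.333, ~0.577
```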

Expected Value for a Continuous Random Variable -Exercise- Continue using the daily revenue example with the probability density function given earlier. Compute E(X) and SD(X).

Linear Functions of Variables Let X be a continuous random variable with mean μX and variance σX², and let a and b be any fixed constants. Define the random variable W as W = a + bX. Then the mean and variance of W are μW = a + bμX and σW² = b²σX², and the standard deviation of W is σW = |b|σX.

Linear Functions of Variables -Exercise- Continue using the daily revenue example. Suppose that the manager's daily salary in thousand dollars is determined as W = 4 + 0.5X. Ex: Compute E(W) and SD(W).
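The revenue density is not reproduced in this transcript, so no definitive numeric answer is given here; the sketch below uses placeholder values for E(X) and SD(X), which you should replace with the results of the previous exercise, and only shows how the linear-function rules apply.

```python
# Placeholder values for the revenue example; substitute the E(X) and SD(X)
# you obtained in the previous exercise.
mean_X, sd_X = 5.0, 2.0        # hypothetical, for illustration only

a, b = 4.0, 0.5                # W = 4 + 0.5 X (salary in thousand dollars)

mean_W = a + b * mean_X        # E(W)  = a + b E(X)
sd_W = abs(b) * sd_X           # SD(W) = |b| SD(X)

print(mean_W, sd_W)            # 6.5 and 1.0 with the placeholder values
```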

Linear Functions of Variables An important special case of the previous results is the standardized random variable Z = (X - μX)/σX, which has mean 0 and variance 1.

Normal Distribution A random variable X is said to be a normal random variable with mean μ and variance σ² if X has the probability density function f(x) = (1/(σ√(2π))) exp(-(x - μ)²/(2σ²)), where e and π are mathematical constants, e = 2.71828... and π = 3.14159...
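A small check of this formula in Python, assuming scipy is available for comparison: the density is evaluated directly from the expression above and against a library implementation.

```python
import math
from scipy.stats import norm

def normal_pdf(x, mu, sigma):
    """The normal density written out from the formula above."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

print(normal_pdf(1.0, 0.0, 1.0))          # ~0.2420
print(norm.pdf(1.0, loc=0.0, scale=1.0))  # the same value from scipy
```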

Probability Density Function for a Normal Distribution [Figure: bell-shaped normal density curve over x]

A Note About the Normal Distribution Normal distributions approximate many types of distributions. They have been applied in many areas of study, such as economics, finance, and operations management.

The normal distribution approximates many types of distributions.

The shape of the normal distribution The shape of the normal density function changes with the mean and the standard deviation

The Shape of the Normal Distribution -Exercise- Open "Normal Density Function Exercise". The Excel function NORMDIST(x, μ, σ, FALSE) computes the normal density function with mean μ and standard deviation σ evaluated at x. EX1: Plot the normal density functions for μ=0, σ=1 and for μ=1, σ=1. You will see how a change in the mean affects the shape of the density (without changing the standard deviation). EX2: Plot the normal density functions for μ=0, σ=1 and for μ=0, σ=2. You will see how a change in the standard deviation affects the shape (without changing the mean).
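If Excel is not at hand, the same two plots can be sketched in Python, assuming numpy, scipy, and matplotlib are available.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

x = np.linspace(-6, 6, 400)

# EX1: same standard deviation, different means -> the curve shifts sideways.
plt.plot(x, norm.pdf(x, loc=0, scale=1), label="mu=0, sigma=1")
plt.plot(x, norm.pdf(x, loc=1, scale=1), label="mu=1, sigma=1")

# EX2: same mean, larger standard deviation -> the curve flattens (fatter tails).
plt.plot(x, norm.pdf(x, loc=0, scale=2), label="mu=0, sigma=2")

plt.legend()
plt.show()
```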

Answer Changing the mean without changing the standard deviation shifts the density sideways. Increasing the standard deviation makes the shape flatter, with fatter tails.

Properties of the Normal Distribution Suppose that the random variable X follows the normal distribution given on the previous slides. Then the mean of the random variable is μ and the variance of the random variable is σ². The notation X ~ N(μ, σ²) means that the random variable has the normal distribution with mean μ and variance σ².

Cumulative Distribution Function of the Normal Distribution Suppose that X is a normal random variable with mean μ and variance σ²; that is, X ~ N(μ, σ²). Then the cumulative distribution function is F(x0) = P(X ≤ x0). This is the area under the normal probability density function to the left of x0, as illustrated in the figure on the next slide.

Shaded Area is the Probability that X does not Exceed x0 for a Normal Random Variable [Figure: normal density f(x) with the area to the left of x0 shaded]

Range Probabilities for Normal Random Variables Let X be a normal random variable with cumulative distribution function F(x), and let a and b be two possible values of X with a < b. Then P(a < X < b) = F(b) - F(a). This probability is the area under the corresponding probability density function between a and b.

Range Probabilities for Normal Random Variables (Figure 6.12) [Figure: normal density with the area between a and b shaded]

The Standard Normal Distribution The standard normal distribution is the normal distribution with mean 0 and variance 1. We often use Z to denote the standard normal variable; if Z has this distribution, we write Z ~ N(0, 1).

Computing the Cumulative Distribution and a Range Probability for the Standard Normal Distribution We often need to compute the cumulative distribution and range probabilities for a standard normal distribution. Since the standard normal distribution is used so frequently, most textbooks provide a table of its cumulative distribution function; see p. 837 of our textbook.

Computing the Cumulative Distribution and a Range Probability for the Standard Normal Distribution -Exercise- Compute the following probabilities: P(Z≤1.72), P(1.25<Z<1.72), P(Z≤-1.25), P(-1.25<Z<1.72).
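Instead of the table, these values can be checked in Python, assuming scipy is available.

```python
from scipy.stats import norm

Phi = norm.cdf                 # standard normal cumulative distribution function

print(Phi(1.72))               # P(Z <= 1.72)        ~0.957
print(Phi(1.72) - Phi(1.25))   # P(1.25 < Z < 1.72)  ~0.063
print(Phi(-1.25))              # P(Z <= -1.25)       ~0.106
print(Phi(1.72) - Phi(-1.25))  # P(-1.25 < Z < 1.72) ~0.852
```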

Finding Range Probabilities for Normally Distributed Random Variables Let X be a normally distributed random variable with mean μ and variance σ². Then the random variable Z = (X - μ)/σ has a standard normal distribution: Z ~ N(0, 1). It follows that if a and b are any numbers with a < b, then P(a < X < b) = P((a - μ)/σ < Z < (b - μ)/σ) = F((b - μ)/σ) - F((a - μ)/σ), where Z is the standard normal random variable and F(z) denotes its cumulative distribution function.

Computing Normal Probabilities (Example 6.6) A very large group of students obtains test scores that are normally distributed with mean 60 and standard deviation 15. If you randomly choose a student, what is the probability that the test score of the student is between 85 and 95?
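A quick check of this example in Python, assuming scipy is available: standardize both endpoints and take the difference of the cumulative probabilities.

```python
from scipy.stats import norm

mu, sigma = 60, 15

z_low  = (85 - mu) / sigma     # ~1.667
z_high = (95 - mu) / sigma     # ~2.333

prob = norm.cdf(z_high) - norm.cdf(z_low)
print(prob)                    # ~0.038
```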

Joint Cumulative Distribution Functions Let X1, X2, ..., Xk be continuous random variables. Their joint cumulative distribution function, F(x1, x2, ..., xk), defines the probability that simultaneously X1 is less than x1, X2 is less than x2, and so on; that is, F(x1, x2, ..., xk) = P(X1 ≤ x1, X2 ≤ x2, ..., Xk ≤ xk). The cumulative distribution functions F1(x1), F2(x2), ..., Fk(xk) of the individual random variables are called their marginal distribution functions. For any i, Fi(xi) is the probability that the random variable Xi does not exceed the specific value xi. The random variables are independent if and only if F(x1, x2, ..., xk) = F1(x1)F2(x2)···Fk(xk).

Joint Cumulative Distribution Functions Joint cumulative distribution functions often have complex forms, especially when the variables are not independent. Some simple joint distribution functions are presented on the next slide.

Joint Distribution Functions -Example, optional- X and Y are said to have an independent joint uniform distribution if X and Y have the distribution function F(x,y) = xy for 0≤x≤1, 0≤y≤1. X and Y are said to have an independent joint exponential distribution if the joint distribution function is F(x,y) = (1 - e^(-x))(1 - e^(-y)) for 0<x<∞, 0<y<∞.

Expectation of a Function of Two Random Variables Let g(X,Y) be a function of two continuous random variables X and Y, and let f(x,y) be the joint density function of X and Y. Then the expectation of g(X,Y) is defined as E[g(X,Y)] = ∫∫ g(x,y) f(x,y) dx dy. A special case is the covariance, given on the next slide.

Covariance Let X and Y be a pair of continuous random variables, with respective means μX and μY. The expected value of (X - μX)(Y - μY) is called the covariance between X and Y. That is, Cov(X,Y) = E[(X - μX)(Y - μY)]. An alternative but equivalent expression is Cov(X,Y) = E(XY) - μXμY. If the random variables X and Y are independent, then the covariance between them is 0. However, the converse is not true.

Correlation Let X and Y be jointly distributed random variables. The correlation between X and Y is Corr(X,Y) = ρ = Cov(X,Y)/(σXσY).
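A simulation-based sketch of these definitions in Python, assuming numpy is available; the construction of Y here is arbitrary, chosen only so that X and Y are correlated.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)   # Y depends on X, so they are correlated

cov_def = np.mean((x - x.mean()) * (y - y.mean()))   # E[(X - muX)(Y - muY)]
cov_alt = np.mean(x * y) - x.mean() * y.mean()       # E(XY) - muX muY, the alternative form
corr = cov_def / (x.std() * y.std())                 # Cov(X,Y) / (sigmaX sigmaY)

print(cov_def, cov_alt, corr)   # covariance ~0.5 (both forms), correlation ~0.45
```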

Sum of Two Random Variables Let X and Y be jointly distributed random variables. Let μX be the mean of X, μY the mean of Y, σX² the variance of X, and σY² the variance of Y. Then, for any constants a and b, E(aX+bY) = aμX + bμY and Var(aX+bY) = a²σX² + b²σY² + 2ab Cov(X,Y) = a²σX² + b²σY² + 2ab Corr(X,Y)σXσY.

Sums of Random Variables -Exercise- Let X be the return on stock A, and Y be the return on stock B. You have a fixed amount of money to invest in stock A and stock B. Let a be the share of money invested in stock A (if you invest half of the money in stock A, a=0.5), and let b be the share invested in stock B (that is, b=1-a). Then the return on the portfolio W is given by W = aX + bY.

Suppose that μX=0.4, μY=0.1, σX²=0.5, σY²=0.2, Corr(X,Y)=-0.2. Ex 1. Suppose you invest 40% of your money in stock A (i.e., a=0.4). Let W be the return of this portfolio. Compute E(W), Var(W), and SD(W). Ex 2. Compute E(W) and SD(W) for different values of a, using "sum of random variables exercise". Plot the relationship between SD(W) and E(W).
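A sketch of both parts in Python, assuming numpy and matplotlib are available; the spreadsheet "sum of random variables exercise" mentioned above is not reproduced here, so this stands in for it.

```python
import numpy as np
import matplotlib.pyplot as plt

mu_x, mu_y = 0.4, 0.1
var_x, var_y = 0.5, 0.2
corr = -0.2
cov = corr * np.sqrt(var_x) * np.sqrt(var_y)   # Cov(X,Y) = Corr(X,Y) sigmaX sigmaY

def portfolio(a):
    """Mean and standard deviation of W = a X + (1 - a) Y."""
    b = 1 - a
    mean = a * mu_x + b * mu_y
    var = a**2 * var_x + b**2 * var_y + 2 * a * b * cov
    return mean, np.sqrt(var)

# Ex 1: 40% of the money in stock A.
print(portfolio(0.4))          # E(W) = 0.22, SD(W) ~ 0.349

# Ex 2: trace E(W) against SD(W) as the share a varies from 0 to 1.
a_grid = np.linspace(0, 1, 101)
means, sds = zip(*(portfolio(a) for a in a_grid))
plt.plot(sds, means)
plt.xlabel("SD(W)")
plt.ylabel("E(W)")
plt.show()
```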

Answer for Ex 2 [Figure: E(W) plotted against SD(W) as the share a varies from 0 to 1]