Further Topics on Random Variables: Derived Distributions


Further Topics on Random Variables: Derived Distributions. Berlin Chen, Department of Computer Science & Information Engineering, National Taiwan Normal University. Reference: D. P. Bertsekas, J. N. Tsitsiklis, Introduction to Probability, Section 4.1

Two-Step Approach to Calculating a Derived PDF. To calculate the PDF of a function Y = g(X) of a continuous random variable X: (1) calculate the CDF of Y using the formula F_Y(y) = P(g(X) ≤ y) = ∫_{x : g(x) ≤ y} f_X(x) dx; (2) differentiate F_Y to obtain the PDF (called the derived distribution) of Y, f_Y(y) = dF_Y(y)/dy.

Illustrative Examples (1/2) Example 4.1. Let X be uniform on [0, 1]. Find the PDF of Y = √X. Note that Y takes values between 0 and 1. For y in [0, 1], F_Y(y) = P(√X ≤ y) = P(X ≤ y²) = y², so f_Y(y) = 2y on [0, 1] and 0 elsewhere. (The slide plots the flat PDF of X next to the linearly increasing PDF of Y, which rises from 0 to 2.)
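As a quick numerical illustration of the two-step recipe on this example (my own sketch, not part of the original slides), the code below estimates the CDF of Y = √X from simulated values of X ~ Uniform(0, 1), differentiates it on a grid, and compares the result with the analytic PDF f_Y(y) = 2y.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the underlying random variable X ~ Uniform(0, 1) and form Y = g(X) = sqrt(X).
x = rng.uniform(0.0, 1.0, size=1_000_000)
y_samples = np.sqrt(x)

# Step 1: estimate the CDF F_Y(y) = P(g(X) <= y) on a grid.
grid = np.linspace(0.02, 0.98, 49)
cdf = np.array([(y_samples <= t).mean() for t in grid])

# Step 2: differentiate the CDF numerically to get the derived PDF.
pdf_numeric = np.gradient(cdf, grid)

# Compare with the analytic answer f_Y(y) = 2y from Example 4.1.
print(np.max(np.abs(pdf_numeric - 2.0 * grid)))   # small relative to the PDF values (Monte Carlo noise)
```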

Illustrative Examples (2/2) Example 4.3. Let Y = g(X) = X², where X is a random variable with known PDF f_X. Find the PDF of Y expressed in terms of f_X. For y ≥ 0, F_Y(y) = P(X² ≤ y) = P(−√y ≤ X ≤ √y) = F_X(√y) − F_X(−√y), and differentiating gives f_Y(y) = (f_X(√y) + f_X(−√y)) / (2√y) for y > 0, and f_Y(y) = 0 for y < 0.
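One way to sanity-check this formula (the particular choice of f_X below is mine, purely for illustration): if X is standard normal, then Y = X² is chi-square with one degree of freedom, and the derived-PDF expression reproduces SciPy's chi-square density.

```python
import numpy as np
from scipy.stats import norm, chi2

y = np.linspace(0.05, 10.0, 200)

# Derived PDF of Y = X^2 expressed in terms of f_X (here X ~ N(0, 1)).
f_y = (norm.pdf(np.sqrt(y)) + norm.pdf(-np.sqrt(y))) / (2.0 * np.sqrt(y))

# Y = X^2 with X standard normal is chi-square with 1 degree of freedom.
print(np.allclose(f_y, chi2.pdf(y, df=1)))   # True
```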

The PDF of a Linear Function of a Random Variable Let X be a continuous random variable with PDF f_X, and let Y = aX + b for some scalars a ≠ 0 and b. Then f_Y(y) = (1/|a|) f_X((y − b)/a). (The figure on the slide illustrates the case a > 0, b > 0.)

The PDF of a Linear Function of a Random Variable (1/2) Verification of the above formula: for a > 0, F_Y(y) = P(aX + b ≤ y) = P(X ≤ (y − b)/a) = F_X((y − b)/a), and differentiating gives f_Y(y) = (1/a) f_X((y − b)/a). For a < 0, F_Y(y) = P(X ≥ (y − b)/a) = 1 − F_X((y − b)/a), so f_Y(y) = −(1/a) f_X((y − b)/a). Both cases are captured by f_Y(y) = (1/|a|) f_X((y − b)/a).

Illustrative Examples (1/2) Example 4.4. A linear function of an exponential random variable. Suppose that X is an exponential random variable with PDF f_X(x) = λe^{−λx} for x ≥ 0 (and 0 otherwise), where λ is a positive parameter. Let Y = aX + b. Then f_Y(y) = (λ/|a|) e^{−λ(y − b)/a} if (y − b)/a ≥ 0, and 0 otherwise. Note that if b = 0 and a > 0, then Y is exponential with parameter λ/a.
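A small numerical check of the b = 0, a > 0 special case, with illustrative values λ = 2 and a = 3 (my own choices, not from the slides): the derived PDF (λ/a)e^{−(λ/a)y} matches SciPy's exponential density with scale a/λ, i.e., rate λ/a.

```python
import numpy as np
from scipy.stats import expon

lam, a = 2.0, 3.0            # illustrative values, not taken from the slides
y = np.linspace(0.0, 5.0, 100)

# Derived PDF of Y = aX for X ~ Exponential(lam), with a > 0 and b = 0.
f_y = (lam / a) * np.exp(-(lam / a) * y)

# An exponential with rate lam/a has scale a/lam in SciPy's parameterization.
print(np.allclose(f_y, expon.pdf(y, scale=a / lam)))   # True
```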

Illustrative Examples (2/2) Example 4.5. A linear function of a normal random variable is normal. Suppose that X is a normal random variable with mean μ and variance σ², and let Y = aX + b, where a ≠ 0 and b are some scalars. We have f_Y(y) = (1/|a|) f_X((y − b)/a) = (1/(√(2π) |a| σ)) exp(−(y − b − aμ)² / (2a²σ²)), so Y is normal with mean aμ + b and variance a²σ².
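The claim that Y = aX + b is normal with mean aμ + b and variance a²σ² can be spot-checked by simulation; the sketch below uses the illustrative values μ = 1, σ = 2, a = −3, b = 5 (my own choices, not from the slides).

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, a, b = 1.0, 2.0, -3.0, 5.0     # illustrative values

x = rng.normal(mu, sigma, size=1_000_000)
y = a * x + b

# Sample moments should match the predicted mean a*mu + b and standard deviation |a|*sigma.
print(y.mean(), a * mu + b)               # both close to 2.0
print(y.std(), abs(a) * sigma)            # both close to 6.0
```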

Monotonic Functions of a Random Variable (1/4) Let X be a continuous random variable whose values lie in a certain interval I, and consider the random variable Y = g(X), where we assume that g is strictly monotonic over the interval I. That is, either (1) g(x) < g(x′) for all x, x′ ∈ I satisfying x < x′ (monotonically increasing case), or (2) g(x) > g(x′) for all x, x′ ∈ I satisfying x < x′ (monotonically decreasing case).

Monotonic Functions of a Random Variable (2/4) Suppose that g is monotonic and that for some function h and all x in the range of X we have y = g(x) if and only if x = h(y); that is, h is the inverse of g. For example, if g(x) = ax + b with a ≠ 0, then h(y) = (y − b)/a.

Monotonic Functions of a Random Variable (3/4) Assume that h is differentiable with first derivative dh/dy. Then the PDF of Y, in the region where f_Y(y) > 0, is given by f_Y(y) = f_X(h(y)) |dh/dy(y)|. For the monotonically increasing case: F_Y(y) = P(g(X) ≤ y) = P(X ≤ h(y)) = F_X(h(y)), and differentiating gives f_Y(y) = f_X(h(y)) (dh/dy)(y).

Monotonic Functions of a Random Variable (4/4) For the monotonically decreasing case: F_Y(y) = P(g(X) ≤ y) = P(X ≥ h(y)) = 1 − F_X(h(y)), so f_Y(y) = −f_X(h(y)) (dh/dy)(y) = f_X(h(y)) |dh/dy(y)|, since dh/dy < 0 in this case.

Illustrative Examples (1/5) Example 4.6. Let Y = g(X) = X², where X is a continuous uniform random variable on the interval (0, 1]. What is the PDF of Y? Within this interval, g is strictly monotonic, and its inverse is h(y) = √y, so f_Y(y) = f_X(√y) · 1/(2√y) = 1/(2√y) for y ∈ (0, 1], and 0 otherwise.
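As a numerical illustration (my own sketch, not part of the slides), a histogram of simulated values of X² for X uniform on (0, 1] agrees with the PDF f_Y(y) = 1/(2√y) obtained from the monotonic-case formula.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, size=500_000)
y = x ** 2                                 # Y = g(X) = X^2, monotonic on (0, 1]

# Histogram estimate of f_Y versus the formula f_Y(y) = 1 / (2*sqrt(y)).
counts, edges = np.histogram(y, bins=50, range=(0.0, 1.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
formula = 1.0 / (2.0 * np.sqrt(centers))

# Skip the first few bins, where the density blows up near 0 and binning error dominates.
print(np.max(np.abs(counts[5:] - formula[5:])))   # small (Monte Carlo and binning error)
```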

Illustrative Examples (2/5) Example 4.7. Let X and Y be independent random variables, each uniformly distributed on the interval [0, 1]. What is the PDF of the random variable

Question Let X and Y be independent random variables, each uniformly distributed on the interval [0, 1]. What is the PDF of the random variable

Illustrative Examples (3/5) Example 4.8. Let X and Y be independent random variables that are uniformly distributed on the interval [0, 1]. What is the PDF of the random variable

Illustrative Examples (4/5) Extra Example. Let X and Y be independent random variables, each uniformly distributed on the interval [0, 1]. What is the PDF of the random variable

Illustrative Examples (5/5) Example 4.9. Let X and Y be independent random variables that are exponentially distributed with parameter λ. What is the PDF of the random variable

An Extra Example Let X and Y be independent random variables, each uniformly distributed on the interval [0, 1]. What is the PDF of the random variable

Exercise 1. Let X and Y be independent random variables that are uniformly distributed on the interval [0, 1]. What is the PDF of the random variable 2. Let X and Y be independent random variables that are uniformly distributed on the interval [0, 1]. What is the PDF of the random variable

Sums of Independent Random Variables (1/2) We can also use the convolution method to obtain the distribution of W = X + Y. If X and Y are independent discrete random variables with integer values, then p_W(w) = P(X + Y = w) = Σ_x P(X = x) P(Y = w − x) = Σ_x p_X(x) p_Y(w − x), which is the convolution of the PMFs of X and Y.

Sums of Independent Random Variables (2/2) If X and Y are independent continuous random variables, the PDF of W = X + Y can be obtained by f_W(w) = ∫ f_X(x) f_Y(w − x) dx (integrating over all x), which is the convolution of the PDFs of X and Y; the derivation uses the independence assumption.

Illustrative Examples (1/4) Example. Let X and Y be independent with given PMFs p_X and p_Y. Calculate the PMF of W = X + Y by convolution.
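Since the slide's PMFs are shown only as images, the snippet below uses made-up PMFs (hypothetical values, not the slide's) just to demonstrate the convolution mechanics with NumPy.

```python
import numpy as np

# Hypothetical PMFs (the slide's actual values are not reproduced here).
# p_x[k] = P(X = k) for k = 0, 1, 2; p_y[k] = P(Y = k) for k = 0, 1.
p_x = np.array([0.2, 0.5, 0.3])
p_y = np.array([0.6, 0.4])

# PMF of W = X + Y: p_W(w) = sum_x p_X(x) * p_Y(w - x), i.e. a convolution.
p_w = np.convolve(p_x, p_y)
print(p_w)            # P(W = 0), ..., P(W = 3)
print(p_w.sum())      # 1.0
```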

Illustrative Examples (2/4) Example 4.10. The random variables X and Y are independent and uniformly distributed in the interval [0, 1]. The PDF of W = X + Y is f_W(w) = ∫ f_X(x) f_Y(w − x) dx, which evaluates to the triangular PDF: f_W(w) = w for 0 ≤ w ≤ 1, f_W(w) = 2 − w for 1 < w ≤ 2, and 0 otherwise.
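The resulting triangular PDF can be verified by simulation; the sketch below (my own check, not part of the slides) compares a histogram of simulated sums with f_W(w).

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(size=500_000)
y = rng.uniform(size=500_000)
w = x + y

# Triangular PDF of W = X + Y for independent Uniform(0, 1) variables.
def f_w(t):
    return np.where(t <= 1.0, t, 2.0 - t)

counts, edges = np.histogram(w, bins=40, range=(0.0, 2.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(counts - f_w(centers))))   # small (Monte Carlo error)
```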

Illustrative Examples (3/4)

Illustrative Examples (4/4) Alternatively, we can use the "derived distribution" (two-step CDF) method introduced earlier to obtain the same result.

Graphical Calculation of Convolutions Figure 4.10. Illustration of the convolution calculation. For the value of w under consideration, f_{X+Y}(w) is equal to the integral of the function shown in the last plot. Graphically: flip the PDF of Y around the origin, shift it by w (to the right if w > 0, to the left if w < 0), multiply it pointwise with the PDF of X, and integrate the product over x.
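The flip-shift-multiply-integrate recipe can also be carried out numerically on a grid; the sketch below (an illustration assuming two Uniform(0, 1) PDFs, matching Example 4.10) evaluates f_{X+Y}(w) at single values of w and recovers the triangular PDF.

```python
import numpy as np

def conv_at(w, f_x, f_y, lo=-3.0, hi=3.0, n=6001):
    """Approximate (f_x * f_y)(w) = integral of f_x(x) * f_y(w - x) dx on a grid."""
    x = np.linspace(lo, hi, n)
    dx = x[1] - x[0]
    # "Flip" f_y around the origin, "shift" it by w, multiply with f_x, and integrate.
    return np.sum(f_x(x) * f_y(w - x)) * dx

def uniform01(t):
    """PDF of a Uniform(0, 1) random variable, evaluated elementwise."""
    return ((t >= 0.0) & (t <= 1.0)).astype(float)

print(conv_at(0.5, uniform01, uniform01))   # about 0.5 = f_{X+Y}(0.5)
print(conv_at(1.5, uniform01, uniform01))   # about 0.5 = f_{X+Y}(1.5) = 2 - 1.5
```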

Recitation SECTION 4.1 Derived Distributions Problems 1, 4, 8, 11, 14