Further Topics on Random Variables: Convolution, Conditional Expectation and Variance
Berlin Chen, Department of Computer Science & Information Engineering, National Taiwan Normal University
Reference: D. P. Bertsekas, J. N. Tsitsiklis, Introduction to Probability, Sections 4.2–4.3

Sums of Independent Random Variables (1/2)
Recall: If X and Y are independent random variables, the distribution (PMF or PDF) of Z = X + Y can be obtained by computing the transform M_Z(s) = M_X(s) M_Y(s) and inverting it.
We can also use the convolution method to obtain the distribution of Z.
If X and Y are independent discrete random variables with integer values:
p_Z(z) = Σ_x p_X(x) p_Y(z − x) (the convolution of the PMFs of X and Y)

Sums of Independent Random Variables (2/2)
If X and Y are independent continuous random variables, the PDF of Z = X + Y can be obtained by
f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx (the convolution of the PDFs of X and Y)
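The discrete convolution formula translates directly into code. The sketch below is a minimal illustration, not from the slides; the two four-sided-die PMFs are a made-up example.

```python
# Sketch: convolution of PMFs, p_Z(z) = sum_x p_X(x) * p_Y(z - x),
# for independent integer-valued X and Y with PMFs stored as dicts.

def pmf_convolve(p_x, p_y):
    """Return the PMF of Z = X + Y for independent integer-valued X, Y."""
    p_z = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_z[x + y] = p_z.get(x + y, 0.0) + px * py
    return p_z

# Hypothetical example: two fair four-sided dice.
die = {k: 0.25 for k in range(1, 5)}
p_z = pmf_convolve(die, die)
```

The result is again a valid PMF; for the dice, p_Z(5) = 4 · (1/16) = 1/4, the most likely sum.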

Illustrative Examples (1/4)
Example 4.13. Let X and Y be independent and have the PMFs shown in the figure. Calculate the PMF of Z = X + Y by convolution.

Illustrative Examples (2/4)
Example 4.14. The random variables X and Y are independent and uniformly distributed in the interval [0, 1]. The PDF of Z = X + Y is the triangular PDF
f_Z(z) = z for 0 ≤ z ≤ 1, f_Z(z) = 2 − z for 1 ≤ z ≤ 2, and f_Z(z) = 0 otherwise.

Illustrative Examples (3/4)
Using the convolution formula, f_Z(z) = ∫ f_X(x) f_Y(z − x) dx, where the integrand is nonzero only when 0 ≤ x ≤ 1 and 0 ≤ z − x ≤ 1:
For 0 ≤ z ≤ 1: f_Z(z) = ∫_0^z dx = z
For 1 ≤ z ≤ 2: f_Z(z) = ∫_{z−1}^1 dx = 2 − z
f_Z(z) = 0 otherwise.

Illustrative Examples (4/4) Or, we can use the “Derived Distribution” method previously introduced in Section 3.6
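The triangular shape of the PDF of Z = X + Y in Example 4.14 can be sanity-checked by simulation. A quick Monte Carlo sketch (sample sizes and the seed are arbitrary choices):

```python
import random

# Monte Carlo check: Z = X + Y with X, Y ~ Uniform[0, 1] independent
# should have the triangular PDF f_Z(z) = z on [0,1] and 2 - z on [1,2].
random.seed(0)
n = 200_000
samples = [random.random() + random.random() for _ in range(n)]

# P(Z <= 1) should be 1/2, the integral of f_Z(z) = z from 0 to 1.
frac = sum(z <= 1.0 for z in samples) / n
```

With 200,000 samples, `frac` lands very close to 0.5, consistent with the triangular PDF.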

Graphical Calculation of Convolutions
Figure 4.4. Illustration of the convolution calculation: (1) flip the PDF f_Y around the origin; (2) shift it by w (to the right if w > 0, to the left if w < 0), obtaining f_Y(w − x); (3) calculate the product of the two plots, f_X(x) f_Y(w − x). For the value of w under consideration, f_Z(w) is equal to the integral of the function shown in the last plot.
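The flip-shift-multiply-integrate recipe can also be carried out numerically. The sketch below assumes X, Y ~ Uniform[0, 1] (as in Example 4.14) and approximates the integral with a simple Riemann sum; the grid step is an arbitrary choice.

```python
# Numerical version of the flip-shift-multiply-integrate recipe for
# f_Z(w) = ∫ f_X(x) f_Y(w - x) dx, with X, Y ~ Uniform[0, 1].

def f_uniform(x):
    """PDF of Uniform[0, 1]."""
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_z(w, dx=1e-3):
    # f_Y(w - x) is f_Y flipped around the origin, then shifted by w.
    xs = [i * dx for i in range(int(2 / dx) + 1)]
    return sum(f_uniform(x) * f_uniform(w - x) for x in xs) * dx
```

For example, f_z(0.5) ≈ 0.5 and f_z(1.0) ≈ 1.0, matching the triangular PDF of Example 4.14.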

More on Conditional Expectation
Recall that the conditional expectation E[X | Y = y] is defined by
E[X | Y = y] = Σ_x x p_{X|Y}(x | y) (if X is discrete)
E[X | Y = y] = ∫ x f_{X|Y}(x | y) dx (if X is continuous)
and in fact E[X | Y] can be viewed as a function of Y, because its value depends on the value of Y.
Is E[X | Y] a random variable? What is the expected value of E[X | Y]?
Note also that, since E[X | Y] is a function of Y, its expectation follows the expected value rule:
E[E[X | Y]] = Σ_y E[X | Y = y] p_Y(y) (if Y is discrete)
E[E[X | Y]] = ∫ E[X | Y = y] f_Y(y) dy (if Y is continuous)

An Illustrative Example (1/2)
Example 4.15. Let the random variables X and Y have a joint PDF which is equal to 2 for (x, y) belonging to the triangle indicated below and zero everywhere else. What is the value of E[X | Y = y]?
Joint PDF: f_{X,Y}(x, y) = 2 on the triangle with vertices (0, 0), (1, 0), (0, 1).
Conditional PDF: f_{X|Y}(x | y) is uniform on [0, 1 − y], so E[X | Y = y] = (1 − y)/2 (a linear function of y).

An Illustrative Example (2/2)
We saw that E[X | Y = y] = (1 − y)/2. Hence, E[X | Y] is the random variable (1 − Y)/2:
The expectation of E[X | Y]: E[E[X | Y]] = E[(1 − Y)/2] = (1 − E[Y])/2
By the Total Expectation Theorem, this equals E[X].

Law of Iterated Expectations
E[E[X | Y]] = Σ_y E[X | Y = y] p_Y(y) = E[X] (if Y is discrete)
E[E[X | Y]] = ∫ E[X | Y = y] f_Y(y) dy = E[X] (if Y is continuous)
That is, E[E[X | Y]] = E[X].
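The law of iterated expectations can be verified mechanically for any small joint PMF. The joint table below is a made-up example, not from the slides:

```python
# Sketch: verify E[E[X|Y]] = E[X] for a small (hypothetical) discrete joint PMF.
# joint[(x, y)] = P(X = x, Y = y)
joint = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.4}

# Marginal PMF of Y.
p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p

def cond_exp_x(y):
    """E[X | Y = y] = sum_x x * p_{X|Y}(x|y)."""
    return sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y[y]

e_x = sum(x * p for (x, _), p in joint.items())              # E[X] directly
e_iter = sum(cond_exp_x(y) * py for y, py in p_y.items())    # E[E[X|Y]]
```

Both computations give 0.7 here; the agreement holds for any joint PMF, which is the content of the law.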

An Illustrative Example (1/2)
Example 4.16. We start with a stick of length ℓ. We break it at a point which is chosen randomly and uniformly over its length, and keep the piece that contains the left end of the stick. We then repeat the same process on the stick that we are left with. What is the expected length of the stick that we are left with after breaking twice?
Let X be the length of the stick after we break for the first time. Let Y be the length after the second time.
X is uniformly distributed on [0, ℓ]; given X = x, Y is uniformly distributed on [0, x].

An Illustrative Example (2/2)
By the Law of Iterated Expectations, we have
E[Y] = E[E[Y | X]] = E[X/2] = E[X]/2 = ℓ/4
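The stick-breaking answer E[Y] = ℓ/4 is easy to check by simulation. A sketch with ℓ = 1 (seed and sample size are arbitrary choices):

```python
import random

# Simulation of Example 4.16 with l = 1: break at X ~ Uniform[0, 1],
# then, given X = x, break at Y ~ Uniform[0, x]. Expect E[Y] = l/4 = 0.25.
random.seed(1)
n = 200_000
total = 0.0
for _ in range(n):
    x = random.random()          # first break point, uniform on [0, 1]
    y = random.uniform(0.0, x)   # second break point, uniform on [0, x]
    total += y
mean_y = total / n
```

The sample mean of Y settles near 0.25, as the law of iterated expectations predicts.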

More on Conditional Variance
Recall that the conditional variance of X, given Y = y, is defined by
var(X | Y = y) = E[(X − E[X | Y = y])² | Y = y]
var(X | Y) in fact can be viewed as a function of Y, because its value depends on the value of Y.
Is var(X | Y) a random variable? What is the expected value of var(X | Y)?

Law of Total Variance
The expectation of the conditional variance is related to the unconditional variance by
var(X) = E[var(X | Y)] + var(E[X | Y])
(The proof uses the Law of Iterated Expectations.)

Illustrative Examples (1/4)
Example 4.16. (continued) Consider again the problem where we break twice a stick of length ℓ, at randomly chosen points, with X being the length of the stick after the first break and Y being the length after the second break.
Calculate var(Y) using the law of total variance.
X is uniformly distributed on [0, ℓ]; given X = x, Y is uniformly distributed on [0, x]; E[Y | X] = X/2 (a function of X).

Illustrative Examples (2/4)
Since, given X, Y is uniformly distributed on [0, X], we have var(Y | X) = X²/12 (a function of X), so
E[var(Y | X)] = E[X²]/12 = (ℓ²/3)/12 = ℓ²/36
var(E[Y | X]) = var(X/2) = var(X)/4 = (ℓ²/12)/4 = ℓ²/48
var(Y) = E[var(Y | X)] + var(E[Y | X]) = ℓ²/36 + ℓ²/48 = 7ℓ²/144
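The two terms of the total-variance decomposition for the stick example can be checked exactly with rational arithmetic. A sketch with ℓ = 1, using the standard uniform moments E[X²] = 1/3 and var(X) = 1/12:

```python
from fractions import Fraction as F

# Exact check of the total-variance computation for Example 4.16 with l = 1:
#   E[var(Y|X)] = E[X^2 / 12] = (1/3) / 12 = 1/36
#   var(E[Y|X]) = var(X / 2)  = (1/12) / 4 = 1/48
e_var = F(1, 3) / 12     # E[X^2] = 1/3 for X ~ Uniform[0, 1]
var_e = F(1, 12) / 4     # var(X) = 1/12
var_y = e_var + var_e    # law of total variance
```

Adding the two fractions gives var(Y) = 7/144, i.e. 7ℓ²/144 for general ℓ.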

Illustrative Examples (3/4)
Example 4.20. Computing Variances by Conditioning. Consider a continuous random variable X with the PDF given in the following figure:
f_X(x) = 1/2 for 0 ≤ x ≤ 1, f_X(x) = 1/4 for 1 < x ≤ 3, and f_X(x) = 0 otherwise.
We define an auxiliary (discrete) random variable Y as follows: Y = 1 if 0 ≤ X ≤ 1, and Y = 2 if 1 < X ≤ 3.

Illustrative Examples (4/4)
Conditioned on Y = 1, X is uniform on [0, 1]; conditioned on Y = 2, X is uniform on (1, 3]; and P(Y = 1) = P(Y = 2) = 1/2. By the law of total variance,
E[var(X | Y)] = (1/2)(1/12) + (1/2)(1/3) = 5/24
var(E[X | Y]) = (1/2)(1/2 − 5/4)² + (1/2)(2 − 5/4)² = 9/16
var(X) = E[var(X | Y)] + var(E[X | Y]) = 5/24 + 9/16 = 37/48
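The conditioning computation for Example 4.20 can be reproduced exactly in code. The sketch below assumes the two-level PDF reconstructed above (1/2 on [0, 1], 1/4 on (1, 3]) and the fact that Uniform[a, b] has mean (a + b)/2 and variance (b − a)²/12:

```python
from fractions import Fraction as F

# Law-of-total-variance computation for Example 4.20, conditioning on Y.
p    = {1: F(1, 2), 2: F(1, 2)}    # P(Y = 1), P(Y = 2)
mean = {1: F(1, 2), 2: F(2)}       # E[X | Y]: means of Uniform[0,1], Uniform(1,3]
var  = {1: F(1, 12), 2: F(4, 12)}  # var(X | Y): (b - a)^2 / 12 for each piece

e_var = sum(p[y] * var[y] for y in p)                  # E[var(X|Y)]
m     = sum(p[y] * mean[y] for y in p)                 # E[E[X|Y]] = E[X]
var_e = sum(p[y] * (mean[y] - m) ** 2 for y in p)      # var(E[X|Y])
var_x = e_var + var_e
```

This yields E[X] = 5/4 and var(X) = 5/24 + 9/16 = 37/48, with no integration over the original PDF needed.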

Properties of Conditional Expectation and Variance
E[X | Y = y] is a number whose value depends on y.
E[X | Y] is a function of the random variable Y, hence itself a random variable; its value is E[X | Y = y] whenever the value of Y is y.
Law of iterated expectations: E[E[X | Y]] = E[X].
var(X | Y) is a random variable whose value is var(X | Y = y) whenever the value of Y is y.
Law of total variance: var(X) = E[var(X | Y)] + var(E[X | Y]).

Recitation
SECTION 4.2 Convolutions: Problems 11, 12
SECTION 4.3 More on Conditional Expectation and Variance: Problems 15, 16, 17