Berlin Chen Department of Computer Science & Information Engineering


Continuous Random Variables: Joint PDFs, Conditioning, Expectation and Independence
Berlin Chen, Department of Computer Science & Information Engineering, National Taiwan Normal University
Reference: D. P. Bertsekas, J. N. Tsitsiklis, Introduction to Probability, Sections 3.4-3.6

Multiple Continuous Random Variables (1/2)
Two continuous random variables $X$ and $Y$ associated with a common experiment are jointly continuous if they can be described in terms of a joint PDF $f_{X,Y}$ satisfying:
- Nonnegativity: $f_{X,Y}(x,y) \ge 0$ for all $x$, $y$
- Normalization: $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx\,dy = 1$
- Probability: $P((X,Y) \in B) = \iint_{(x,y)\in B} f_{X,Y}(x,y)\,dx\,dy$
Similarly to the one-dimensional case, $f_{X,Y}(a,c)$ can be viewed as the "probability per unit area" in the vicinity of $(a,c)$:
$P(a \le X \le a+\delta,\; c \le Y \le c+\delta) \approx f_{X,Y}(a,c)\cdot\delta^2$, where $\delta$ is a small positive number

Multiple Continuous Random Variables (2/2)
Marginal Probability
- We have already defined that $P(X \in A) = P(X \in A,\; Y \in (-\infty,\infty)) = \int_A \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy\,dx$
- We thus have the marginal PDF $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy$
- Similarly, $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx$

An Illustrative Example
Example 3.10. Two-Dimensional Uniform PDF. We are told that the joint PDF of the random variables $X$ and $Y$ is a constant $c$ on an area $S$ and is zero outside. Find the value of $c$ and the marginal PDFs of $X$ and $Y$.

Joint CDFs
- If $X$ and $Y$ are two (either continuous or discrete) random variables associated with the same experiment, their joint cumulative distribution function (joint CDF) is defined by $F_{X,Y}(x,y) = P(X \le x,\; Y \le y)$
- If $X$ and $Y$ further have a joint PDF $f_{X,Y}$ (i.e., $X$ and $Y$ are continuous random variables), then $F_{X,Y}(x,y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{X,Y}(s,t)\,dt\,ds$
- And $f_{X,Y}(x,y) = \dfrac{\partial^2 F_{X,Y}}{\partial x\,\partial y}(x,y)$, if $F_{X,Y}$ can be differentiated at the point $(x,y)$

An Illustrative Example
Example 3.12. Verify that if $X$ and $Y$ are described by a uniform PDF on the unit square, then the joint CDF is given by $F_{X,Y}(x,y) = P(X \le x,\; Y \le y) = xy$ for $0 \le x, y \le 1$.

Expectation of a Function of Random Variables
- If $X$ and $Y$ are jointly continuous random variables and $g$ is some function, then $Z = g(X,Y)$ is also a random variable (it can be continuous or discrete)
- The expectation of $Z$ can be calculated by $E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{X,Y}(x,y)\,dx\,dy$
- If $g$ is a linear function of $X$ and $Y$, e.g., $g(X,Y) = aX + bY + c$, then $E[aX + bY + c] = aE[X] + bE[Y] + c$, where $a$, $b$, and $c$ are scalars
- We will see in Section 4.1 methods for computing the PDF of $Z$ (if it has one)
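The linearity rule above can be checked numerically. A minimal Monte Carlo sketch, with the hypothetical choice $X, Y \sim \mathrm{Uniform}(0,1)$ and $g(X,Y) = 2X + 3Y + 1$ (so the exact value is $2/2 + 3/2 + 1 = 3.5$):

```python
import random

# Monte Carlo sketch of linearity of expectation: for X, Y ~ Uniform(0, 1)
# (a hypothetical choice), E[aX + bY + c] should equal a/2 + b/2 + c.
random.seed(0)
n = 200_000
a, b, c = 2.0, 3.0, 1.0
e_lin = sum(a * random.random() + b * random.random() + c
            for _ in range(n)) / n   # exact value: 2/2 + 3/2 + 1 = 3.5
```

Note that linearity holds whether or not $X$ and $Y$ are independent; the sample average converges to 3.5 by the law of large numbers.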

More than Two Random Variables
- The joint PDF of three random variables $X$, $Y$, and $Z$ is defined in analogy with the case of two random variables: $P((X,Y,Z) \in B) = \iiint_{(x,y,z)\in B} f_{X,Y,Z}(x,y,z)\,dx\,dy\,dz$
- The corresponding marginal PDFs: $f_{X,Y}(x,y) = \int_{-\infty}^{\infty} f_{X,Y,Z}(x,y,z)\,dz$ and $f_X(x) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y,Z}(x,y,z)\,dy\,dz$
- The expected value rule takes the form $E[g(X,Y,Z)] = \iiint g(x,y,z)\, f_{X,Y,Z}(x,y,z)\,dx\,dy\,dz$
- If $g$ is linear (of the form $aX + bY + cZ$), then $E[aX + bY + cZ] = aE[X] + bE[Y] + cE[Z]$

Conditioning PDF Given an Event (1/3)
- Consider the conditional PDF of a continuous random variable $X$, given an event $A$ with $P(A) > 0$
- If $A$ cannot be described in terms of $X$, the conditional PDF is defined as a nonnegative function $f_{X|A}$ satisfying $P(X \in B \mid A) = \int_B f_{X|A}(x)\,dx$ for any subset $B$ of the real line
- Normalization property: $\int_{-\infty}^{\infty} f_{X|A}(x)\,dx = 1$

Conditioning PDF Given an Event (2/3)
- If $A$ can be described in terms of $X$ ($A$ is a subset of the real line with $P(X \in A) > 0$), the conditional PDF is defined as the nonnegative function $f_{X|\{X \in A\}}(x) = \dfrac{f_X(x)}{P(X \in A)}$ if $x \in A$, and $0$ otherwise
- The conditional PDF is zero outside the conditioning event, and for any subset $B$: $P(X \in B \mid X \in A) = \int_B f_{X|\{X \in A\}}(x)\,dx$
- Normalization property: $\int_A f_{X|\{X \in A\}}(x)\,dx = 1$
- $f_{X|\{X \in A\}}$ retains the same shape as $f_X$ on $A$, except that it is scaled up along the vertical axis

Conditioning PDF Given an Event (3/3)
- If $A_1, A_2, \ldots, A_n$ are disjoint events with $P(A_i) > 0$ for each $i$ that form a partition of the sample space, then $f_X(x) = \sum_{i=1}^{n} P(A_i)\, f_{X|A_i}(x)$ (a version of the total probability theorem)
- Verification of the above total probability theorem: think of $\{x \le X \le x + \delta\}$ as an event $B$, and use the total probability theorem from Chapter 1

Illustrative Examples (1/2)
Example 3.13. The Exponential Random Variable is Memoryless.
- The time $T$ until a new light bulb burns out is exponentially distributed with parameter $\lambda$. John turns the light on, leaves the room, and when he returns, $t$ time units later, finds that the light bulb is still on, which corresponds to the event $A = \{T > t\}$
- Let $X$ be the additional time until the light bulb burns out. What is the conditional PDF of $X$ given $A$?
- Answer: $f_{X|A}(x) = \lambda e^{-\lambda x}$ for $x \ge 0$, i.e., the additional lifetime has the same exponential distribution regardless of $t$ (the memoryless property)
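The memoryless property can be verified by simulation. A minimal sketch with hypothetical values $\lambda = 0.5$, $t = 2$, $x = 1$: the conditional survival probability $P(T > t + x \mid T > t)$ should match the unconditional $P(T > x)$.

```python
import random

# Simulation sketch of memorylessness: for T ~ Exponential(lam), the
# conditional probability P(T > t + x | T > t) should match P(T > x).
random.seed(1)
lam, t, x = 0.5, 2.0, 1.0
samples = [random.expovariate(lam) for _ in range(200_000)]

survivors = [s for s in samples if s > t]
cond = sum(1 for s in survivors if s > t + x) / len(survivors)
uncond = sum(1 for s in samples if s > x) / len(samples)
# both estimates should be close to exp(-lam * x)
```

Both estimates converge to $e^{-\lambda x} \approx 0.607$ for these values.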

Illustrative Examples (2/2)
Example 3.14. The metro train arrives at the station near your home every quarter hour starting at 6:00 AM. You walk into the station every morning between 7:10 and 7:30 AM, with the arrival time in this interval being a uniform random variable (with density 1/20 per minute). What is the PDF of the time $X$ you have to wait for the first train to arrive?
- Condition on the events $A$ = "board the 7:15 train" (arrive between 7:10 and 7:15), with $P(A) = 1/4$, and $B$ = "board the 7:30 train" (arrive between 7:15 and 7:30), with $P(B) = 3/4$
- Total probability theorem: $f_X(x) = P(A)\, f_{X|A}(x) + P(B)\, f_{X|B}(x)$, where the conditional waiting time is uniform on $(0, 5)$ given $A$ and uniform on $(0, 15)$ given $B$
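A quick simulation of this example (measuring time in minutes after 7:00, an assumption for the sketch) confirms the conditional breakdown: by the total expectation theorem, $E[X] = \tfrac{1}{4}\cdot 2.5 + \tfrac{3}{4}\cdot 7.5 = 6.25$ minutes.

```python
import random

# Simulation sketch of Example 3.14 (minutes after 7:00): arrival uniform
# on (10, 30); trains leave at minute 15 and minute 30. The total
# expectation theorem gives E[wait] = (1/4)(2.5) + (3/4)(7.5) = 6.25 min.
random.seed(2)
n = 200_000
total = 0.0
for _ in range(n):
    arrival = random.uniform(10, 30)
    next_train = 15 if arrival <= 15 else 30
    total += next_train - arrival
mean_wait = total / n
```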

Conditioning One Random Variable on Another
- Let two continuous random variables $X$ and $Y$ have a joint PDF. For any $y$ with $f_Y(y) > 0$, the conditional PDF of $X$ given that $Y = y$ is defined by $f_{X|Y}(x|y) = \dfrac{f_{X,Y}(x,y)}{f_Y(y)}$
- Normalization property: $\int_{-\infty}^{\infty} f_{X|Y}(x|y)\,dx = 1$
- The marginal, joint, and conditional PDFs are related to each other by the following formulas: $f_{X,Y}(x,y) = f_Y(y)\, f_{X|Y}(x|y)$ and $f_X(x) = \int_{-\infty}^{\infty} f_Y(y)\, f_{X|Y}(x|y)\,dy$ (marginalization)

Illustrative Examples (1/2)
- Notice that the conditional PDF $f_{X|Y}(x|y)$ has the same shape as the joint PDF $f_{X,Y}(x,y)$ along the slice $Y = y$, because the normalizing factor $f_Y(y)$ does not depend on $x$ (cf. Example 3.13)
- Figure 3.16: Visualization of the conditional PDF $f_{X|Y}(x|y)$. Let $X$, $Y$ have a joint PDF which is uniform on the set $S$. For each fixed $y$, we consider the joint PDF along the slice $Y = y$ and normalize it so that it integrates to 1

Illustrative Examples (2/2)
Example 3.15. Circular Uniform PDF.
- Ben throws a dart at a circular target of radius $r$. We assume that he always hits the target, and that all points of impact are equally likely, so that the joint PDF of the random variables $X$ and $Y$ is uniform: $f_{X,Y}(x,y) = \dfrac{1}{\pi r^2}$ if $x^2 + y^2 \le r^2$, and $0$ otherwise
- What is the marginal PDF $f_X(x)$? For $|x| \le r$: $f_X(x) = \int f_{X,Y}(x,y)\,dy = \dfrac{2\sqrt{r^2 - x^2}}{\pi r^2}$
- For each value $x$ with $|x| < r$, $f_{Y|X}(y|x)$ is uniform on $[-\sqrt{r^2 - x^2},\, \sqrt{r^2 - x^2}]$
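The marginal PDF above can be sanity-checked by Monte Carlo. A sketch with $r = 1$: sample points uniformly on the unit disk by rejection, then compare the empirical $P(|X| \le 1/2)$ with the integral of $f_X$ over $[-1/2, 1/2]$.

```python
import math
import random

# Monte Carlo sketch for the circular uniform PDF (r = 1): sample (X, Y)
# uniformly on the unit disk by rejection, then compare the empirical
# P(|X| <= 1/2) with the analytic marginal f_X(x) = (2/pi) sqrt(1 - x^2)
# integrated over [-1/2, 1/2].
random.seed(3)
xs = []
while len(xs) < 100_000:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1:      # accept only points inside the disk
        xs.append(x)
p_mc = sum(1 for x in xs if abs(x) <= 0.5) / len(xs)

# Antiderivative of (2/pi) sqrt(1 - x^2) is (x sqrt(1 - x^2) + asin(x)) / pi.
def F(x):
    return (x * math.sqrt(1 - x * x) + math.asin(x)) / math.pi

p_exact = F(0.5) - F(-0.5)      # about 0.609
```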

Conditional Expectation Given an Event
- The conditional expectation of a continuous random variable $X$, given an event $A$ with $P(A) > 0$, is defined by $E[X \mid A] = \int_{-\infty}^{\infty} x\, f_{X|A}(x)\,dx$
- The conditional expectation of a function $g(X)$ also has the form $E[g(X) \mid A] = \int_{-\infty}^{\infty} g(x)\, f_{X|A}(x)\,dx$
- Total expectation theorem: $E[X] = \sum_{i=1}^{n} P(A_i)\, E[X \mid A_i]$ and $E[g(X)] = \sum_{i=1}^{n} P(A_i)\, E[g(X) \mid A_i]$, where $A_1, \ldots, A_n$ are disjoint events with $P(A_i) > 0$ for each $i$ that form a partition of the sample space

An Illustrative Example
Example 3.17. Mean and Variance of a Piecewise Constant PDF. Suppose that the random variable $X$ has the piecewise constant PDF $f_X(x) = 1/3$ for $0 \le x \le 1$, $f_X(x) = 2/3$ for $1 < x \le 2$, and $f_X(x) = 0$ otherwise. Find the mean and variance of $X$ by conditioning on the two pieces.
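The computation can be carried out exactly with the total expectation theorem. A sketch assuming the piecewise constant PDF $f_X(x) = 1/3$ on $[0,1]$ and $2/3$ on $(1,2]$: conditioned on each piece, $X$ is uniform, so the conditional moments are those of a uniform random variable.

```python
from fractions import Fraction

# Sketch of a piecewise constant PDF via the total expectation theorem,
# assuming f_X(x) = 1/3 on [0, 1] and 2/3 on (1, 2]. Conditioned on each
# piece, X is uniform, so E[X | piece] is the midpoint of the piece and
# E[X^2 | a <= X <= b] = (a^2 + a*b + b^2) / 3.
p1, p2 = Fraction(1, 3), Fraction(2, 3)   # P(piece) = density * length
m1, m2 = Fraction(1, 2), Fraction(3, 2)   # conditional means (midpoints)
s1 = Fraction(0 + 0 + 1, 3)               # E[X^2 | 0 <= X <= 1] = 1/3
s2 = Fraction(1 + 2 + 4, 3)               # E[X^2 | 1 <  X <= 2] = 7/3
mean = p1 * m1 + p2 * m2                  # total expectation theorem
var = p1 * s1 + p2 * s2 - mean ** 2       # E[X^2] - (E[X])^2
```

This yields $E[X] = 7/6$ and $\mathrm{var}(X) = 11/36$.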

Conditional Expectation Given a Random Variable
- The conditional expectation of $X$ given $Y = y$ is $E[X \mid Y = y] = \int_{-\infty}^{\infty} x\, f_{X|Y}(x|y)\,dx$
- The properties of unconditional expectation carry through, with the obvious modifications, to conditional expectation, e.g., $E[g(X) \mid Y = y] = \int_{-\infty}^{\infty} g(x)\, f_{X|Y}(x|y)\,dx$

Total Probability/Expectation Theorems
- Total probability theorem: for any event $A$ and a continuous random variable $Y$, $P(A) = \int_{-\infty}^{\infty} P(A \mid Y = y)\, f_Y(y)\,dy$
- Total expectation theorem: for any continuous random variables $X$ and $Y$, $E[X] = \int_{-\infty}^{\infty} E[X \mid Y = y]\, f_Y(y)\,dy$

Independence
- Two continuous random variables $X$ and $Y$ are independent if $f_{X,Y}(x,y) = f_X(x)\, f_Y(y)$ for all $x$, $y$
- Since $f_{X,Y}(x,y) = f_{X|Y}(x|y)\, f_Y(y)$, we therefore have $f_{X|Y}(x|y) = f_X(x)$ for all $x$ and all $y$ with $f_Y(y) > 0$
- Or, symmetrically, $f_{Y|X}(y|x) = f_Y(y)$ for all $y$ and all $x$ with $f_X(x) > 0$

More Facts about Independence (1/2)
- If two continuous random variables $X$ and $Y$ are independent, then any two events of the forms $\{X \in A\}$ and $\{Y \in B\}$ are independent: $P(X \in A,\; Y \in B) = P(X \in A)\, P(Y \in B)$
- It also implies that $F_{X,Y}(x,y) = F_X(x)\, F_Y(y)$ for all $x$, $y$
- The converse statement is also true (see the end-of-chapter Problem 28)

More Facts about Independence (2/2)
- If two continuous random variables $X$ and $Y$ are independent, then the random variables $g(X)$ and $h(Y)$ are independent for any functions $g$ and $h$
- Therefore, $E[g(X)\, h(Y)] = E[g(X)]\, E[h(Y)]$; in particular, $E[XY] = E[X]\, E[Y]$ and $\mathrm{var}(X + Y) = \mathrm{var}(X) + \mathrm{var}(Y)$
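The product rule for expectations of independent random variables can be illustrated numerically. A Monte Carlo sketch with the hypothetical choices $X, Y \sim \mathrm{Uniform}(0,1)$ (independent), $g(x) = x^2$, $h(y) = e^y$:

```python
import math
import random

# Monte Carlo sketch: for independent X, Y ~ Uniform(0, 1), check
# E[g(X) h(Y)] = E[g(X)] E[h(Y)] with the hypothetical choices
# g(x) = x^2 and h(y) = exp(y).
random.seed(4)
n = 300_000
sum_g = sum_h = sum_gh = 0.0
for _ in range(n):
    g = random.random() ** 2
    h = math.exp(random.random())
    sum_g += g
    sum_h += h
    sum_gh += g * h
lhs = sum_gh / n                     # estimate of E[g(X) h(Y)]
rhs = (sum_g / n) * (sum_h / n)      # estimate of E[g(X)] E[h(Y)]
```

Both sides converge to $E[X^2]\,E[e^Y] = \tfrac{1}{3}(e - 1)$.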

Recall: the Discrete Bayes' Rule
- Let $A_1, A_2, \ldots, A_n$ be disjoint events that form a partition of the sample space, and assume that $P(A_i) > 0$ for all $i$. Then, for any event $B$ such that $P(B) > 0$, we have $P(A_i \mid B) = \dfrac{P(A_i)\, P(B \mid A_i)}{P(B)} = \dfrac{P(A_i)\, P(B \mid A_i)}{\sum_{j=1}^{n} P(A_j)\, P(B \mid A_j)}$
- The numerator follows from the multiplication rule; the denominator from the total probability theorem
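The rule is straightforward to carry out numerically. A sketch with a hypothetical three-event partition (the prior and likelihood values below are invented for illustration):

```python
# Numeric sketch of the discrete Bayes' rule with a hypothetical
# three-event partition A_1, A_2, A_3 and an observed event B.
priors = [0.5, 0.3, 0.2]          # P(A_i); disjoint events, sums to 1
likelihoods = [0.1, 0.6, 0.3]     # P(B | A_i)

# Denominator: total probability theorem, P(B) = sum_j P(A_j) P(B | A_j)
p_b = sum(p * l for p, l in zip(priors, likelihoods))
# Bayes' rule: P(A_i | B) = P(A_i) P(B | A_i) / P(B)
posteriors = [p * l / p_b for p, l in zip(priors, likelihoods)]
```

Note how observing $B$ shifts the probability mass toward the event with the highest likelihood, and the posteriors sum to 1 by construction.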

Inference and the Continuous Bayes' Rule (1/2)
- Suppose we have a model of an underlying but unobserved phenomenon, represented by a random variable $X$ with PDF $f_X$, and we make a noisy measurement $Y$, which is modeled in terms of a conditional PDF $f_{Y|X}$. Once the experimental value $y$ of $Y$ is measured, what information does this provide on the unknown value of $X$?
- Measurement: $f_{Y|X}(y|x)$. Inference: $f_{X|Y}(x|y) = \dfrac{f_X(x)\, f_{Y|X}(y|x)}{f_Y(y)} = \dfrac{f_X(x)\, f_{Y|X}(y|x)}{\int_{-\infty}^{\infty} f_X(t)\, f_{Y|X}(y|t)\,dt}$

Inference and the Continuous Bayes' Rule (2/2)
Inference about a Discrete Random Variable
- If the unobserved phenomenon is inherently discrete, let $N$ be a discrete random variable that represents the different discrete possibilities for the unobserved phenomenon of interest, and let $p_N$ be the PMF of $N$
- Then $P(N = n \mid Y = y) = \dfrac{p_N(n)\, f_{Y|N}(y|n)}{f_Y(y)} = \dfrac{p_N(n)\, f_{Y|N}(y|n)}{\sum_{i} p_N(i)\, f_{Y|N}(y|i)}$, where the denominator follows from the total probability theorem

Illustrative Examples (1/2)
Example 3.19. A lightbulb produced by the General Illumination Company is known to have an exponentially distributed lifetime $Y$. However, the company has been experiencing quality control problems. On any given day, the parameter $\Lambda$ of the PDF of $Y$ is actually a random variable, uniformly distributed in the interval $[1, 3/2]$. If we test a lightbulb and record its lifetime ($Y = y$), what can we say about the underlying parameter $\Lambda$?
- Conditioned on $\Lambda = \lambda$, $Y$ has an exponential distribution with parameter $\lambda$: $f_{Y|\Lambda}(y|\lambda) = \lambda e^{-\lambda y}$ for $y \ge 0$
- By the continuous Bayes' rule, $f_{\Lambda|Y}(\lambda|y) = \dfrac{f_\Lambda(\lambda)\, f_{Y|\Lambda}(y|\lambda)}{\int f_\Lambda(t)\, f_{Y|\Lambda}(y|t)\,dt} = \dfrac{\lambda e^{-\lambda y}}{\int_1^{3/2} t e^{-t y}\,dt}$ for $1 \le \lambda \le 3/2$

Illustrative Examples (2/2)
Example 3.20. Signal Detection. A binary signal $S$ is transmitted, and we are given that $P(S = 1) = p$ and $P(S = -1) = 1 - p$. The received signal is $Y = N + S$, where $N$ is normal noise with zero mean and unit variance, independent of $S$. What is the probability that $S = 1$, as a function of the observed value $y$ of $Y$?
- Conditioned on $S = s$, $Y$ has a normal distribution with mean $s$ and unit variance
- $P(S = 1 \mid Y = y) = \dfrac{p\, e^{-(y-1)^2/2}}{p\, e^{-(y-1)^2/2} + (1 - p)\, e^{-(y+1)^2/2}}$
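The posterior for the signal-detection example is short enough to implement directly. A sketch (the function name is ours, not the textbook's):

```python
import math

# Sketch of the signal-detection posterior: P(S = 1 | Y = y) for
# Y = S + N, with N standard normal and prior p = P(S = 1).
def posterior_s1(y, p):
    # standard normal density evaluated at (y - mean)
    phi = lambda t: math.exp(-t * t / 2) / math.sqrt(2 * math.pi)
    num = p * phi(y - 1)                      # weight of hypothesis S = 1
    return num / (num + (1 - p) * phi(y + 1)) # normalize over both hypotheses
```

With $p = 1/2$ the expression simplifies to the logistic function $1/(1 + e^{-2y})$: at $y = 0$ the posterior is exactly $1/2$, and a strongly positive observation pushes it toward 1.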

Inference Based on a Discrete Random Variable
- The earlier formula expressing $P(N = n \mid Y = y)$ in terms of $f_{Y|N}(y|n)$ can be turned around to yield $f_{Y|N}(y|n) = \dfrac{f_Y(y)\, P(N = n \mid Y = y)}{p_N(n)}$

Recitation
SECTION 3.4 Joint PDFs of Multiple Random Variables: Problems 15, 16
SECTION 3.5 Conditioning: Problems 18, 20, 23, 24
SECTION 3.6 The Continuous Bayes' Rule: Problems 34, 35