Berlin Chen Department of Computer Science & Information Engineering


Continuous Random Variables: Conditioning, Expectation and Independence
Berlin Chen, Department of Computer Science & Information Engineering, National Taiwan Normal University
Reference: D. P. Bertsekas and J. N. Tsitsiklis, Introduction to Probability, Sections 3.4-3.5

Conditioning PDF Given an Event (1/3)
The conditional PDF of a continuous random variable $X$, given an event $A$ with $P(A) > 0$:
If $A$ cannot be described in terms of $X$, the conditional PDF is defined as a nonnegative function $f_{X|A}$ satisfying
$P(X \in B \mid A) = \int_B f_{X|A}(x)\,dx$ for any subset $B$ of the real line.
Normalization property: $\int_{-\infty}^{\infty} f_{X|A}(x)\,dx = 1$.

Conditioning PDF Given an Event (2/3)
If $A$ can be described in terms of $X$ ($A = \{X \in A'\}$, where $A'$ is a subset of the real line with $P(X \in A') > 0$), the conditional PDF is defined as the nonnegative function
$f_{X|A}(x) = \dfrac{f_X(x)}{P(A)}$ if $x \in A'$, and $f_{X|A}(x) = 0$ otherwise,
and for any subset $B$,
$P(X \in B \mid X \in A') = \dfrac{\int_{A' \cap B} f_X(x)\,dx}{P(X \in A')} = \int_B f_{X|A}(x)\,dx$.
Normalization property: $\int_{A'} \dfrac{f_X(x)}{P(A)}\,dx = 1$.
$f_{X|A}$ has the same shape as $f_X$, except that it is scaled up along the vertical axis by the factor $1/P(A)$.

Conditioning PDF Given an Event (3/3)
If $A_1, A_2, \ldots, A_n$ are disjoint events with $P(A_i) > 0$ for each $i$ that form a partition of the sample space, then
$f_X(x) = \sum_{i=1}^{n} P(A_i)\, f_{X|A_i}(x)$.
Verification of the above total probability theorem: for every $x$,
$P(X \le x) = \sum_{i=1}^{n} P(A_i)\, P(X \le x \mid A_i)$, i.e. $\int_{-\infty}^{x} f_X(t)\,dt = \sum_{i=1}^{n} P(A_i) \int_{-\infty}^{x} f_{X|A_i}(t)\,dt$; differentiating both sides with respect to $x$ gives the result.

Conditional Expectation Given an Event
The conditional expectation of a continuous random variable $X$, given an event $A$ (with $P(A) > 0$), is defined by
$E[X \mid A] = \int_{-\infty}^{\infty} x\, f_{X|A}(x)\,dx$.
The conditional expectation of a function $g(X)$ also has the form
$E[g(X) \mid A] = \int_{-\infty}^{\infty} g(x)\, f_{X|A}(x)\,dx$.
Total expectation theorem:
$E[X] = \sum_{i=1}^{n} P(A_i)\, E[X \mid A_i]$,
where $A_1, \ldots, A_n$ are disjoint events with $P(A_i) > 0$ for each $i$ that form a partition of the sample space.

An Illustrative Example
Example 3.10. Mean and Variance of a Piecewise Constant PDF. Suppose that the random variable $X$ has the piecewise constant PDF
$f_X(x) = 1/3$ if $0 \le x \le 1$, $f_X(x) = 2/3$ if $1 < x \le 2$, and $f_X(x) = 0$ otherwise.
Conditioned on $A_1 = \{0 \le X \le 1\}$ or $A_2 = \{1 < X \le 2\}$, $X$ is uniform on the corresponding interval, so the total expectation theorem gives
$E[X] = \frac{1}{3}\cdot\frac{1}{2} + \frac{2}{3}\cdot\frac{3}{2} = \frac{7}{6}$, $E[X^2] = \frac{1}{3}\cdot\frac{1}{3} + \frac{2}{3}\cdot\frac{7}{3} = \frac{5}{3}$, and $\mathrm{var}(X) = \frac{5}{3} - \frac{49}{36} = \frac{11}{36}$.
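The computation in this example can be sketched numerically. The PDF values used below (1/3 on [0,1], 2/3 on (1,2]) are taken from the textbook version of this example; conditioned on each interval, X is uniform there.

```python
# Check of Example 3.10 via the total expectation theorem.
# PDF: f(x) = 1/3 on [0,1], 2/3 on (1,2] (textbook values).

# Probabilities of the two events A1 = {X in [0,1]}, A2 = {X in (1,2]}
p1, p2 = 1/3, 2/3

# Conditioned on each event, X is uniform on the interval, so the
# conditional means are the interval midpoints.
mean1, mean2 = 0.5, 1.5
# Second moment of a uniform on [a,b] is (a^2 + a*b + b^2)/3
m2_1 = (0 + 0 + 1) / 3          # = 1/3 on [0, 1]
m2_2 = (1 + 2 + 4) / 3          # = 7/3 on [1, 2]

mean = p1 * mean1 + p2 * mean2          # total expectation theorem
second_moment = p1 * m2_1 + p2 * m2_2
var = second_moment - mean ** 2

print(mean, var)   # 7/6 and 11/36
```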

Multiple Continuous Random Variables
Two continuous random variables $X$ and $Y$ associated with a common experiment are jointly continuous and can be described in terms of a joint PDF $f_{X,Y}$ satisfying:
$f_{X,Y}$ is a nonnegative function.
Normalization: $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx\,dy = 1$.
Probability: $P((X,Y) \in B) = \iint_{(x,y)\in B} f_{X,Y}(x,y)\,dx\,dy$ for any subset $B$ of the plane.
Similarly to the one-dimensional case, $f_{X,Y}(a,c)$ can be viewed as the "probability per unit area" in the vicinity of $(a,c)$:
$P(a \le X \le a+\delta,\ c \le Y \le c+\delta) \approx f_{X,Y}(a,c)\cdot\delta^2$,
where $\delta$ is a small positive number.

An Illustrative Example
Example 3.13. Two-Dimensional Uniform PDF. We are told that the joint PDF of the random variables $X$ and $Y$ is a constant $c$ on a set $S$ and is zero outside. Find the value of $c$ and the marginal PDFs of $X$ and $Y$.
(By the normalization property, $c$ must equal 1 divided by the area of $S$; the marginal PDF of $X$ at a point $x$ is obtained by integrating $c$ over the slice of $S$ at that $x$.)

Conditioning One Random Variable on Another
Two continuous random variables $X$ and $Y$ have a joint PDF $f_{X,Y}$. For any $y$ with $f_Y(y) > 0$, the conditional PDF of $X$ given that $Y = y$ is defined by
$f_{X|Y}(x \mid y) = \dfrac{f_{X,Y}(x,y)}{f_Y(y)}$.
Normalization property: $\int_{-\infty}^{\infty} f_{X|Y}(x \mid y)\,dx = 1$ for any fixed $y$.
The marginal, joint and conditional PDFs are related to each other by the following formulas:
$f_{X,Y}(x,y) = f_Y(y)\, f_{X|Y}(x \mid y)$,
$f_X(x) = \int_{-\infty}^{\infty} f_Y(y)\, f_{X|Y}(x \mid y)\,dy$ (marginalization).
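The normalization property can be checked numerically. The joint PDF below, f(x, y) = x + y on the unit square, is an illustrative choice not taken from the slides; for it, f_Y(y) = 1/2 + y.

```python
# Numerical check (illustrative joint PDF, not from the slides):
# f(x, y) = x + y on the unit square.  The conditional PDF
# f_{X|Y}(x | y) = f(x, y) / f_Y(y) must integrate to 1 over x.

def f_joint(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def marginal_y(y, n=20_000):
    # f_Y(y) = integral of f(x, y) over x (midpoint rule)
    h = 1.0 / n
    return sum(f_joint((i + 0.5) * h, y) for i in range(n)) * h

y0 = 0.3
fy = marginal_y(y0)            # analytically: 1/2 + y0 = 0.8
h = 1.0 / 20_000
total = sum(f_joint((i + 0.5) * h, y0) / fy for i in range(20_000)) * h
print(fy, total)               # ~0.8 and ~1.0
```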

Illustrative Examples (1/2)
Notice that the conditional PDF $f_{X|Y}(x \mid y)$ has the same shape as the joint PDF $f_{X,Y}(x,y)$, viewed as a function of $x$ for fixed $y$, because the normalizing factor $f_Y(y)$ does not depend on $x$.
Figure 3.17: Visualization of the conditional PDF $f_{X|Y}(x \mid y)$. Let $X$, $Y$ have a joint PDF which is uniform on a set $S$. For each fixed $y$, we consider the joint PDF along the slice $Y = y$ and normalize it so that it integrates to 1. (cf. Example 3.13)

Illustrative Examples (2/2)
Example 3.15. Circular Uniform PDF. Ben throws a dart at a circular target of radius $r$. We assume that he always hits the target, and that all points of impact are equally likely, so that the joint PDF of the random variables $X$ and $Y$ is uniform:
$f_{X,Y}(x,y) = \dfrac{1}{\pi r^2}$ if $x^2 + y^2 \le r^2$, and 0 otherwise.
What is the marginal PDF $f_X(x)$? Integrating out $y$ gives $f_X(x) = \dfrac{2\sqrt{r^2 - x^2}}{\pi r^2}$ for $|x| \le r$.
For each value $x$ with $|x| < r$, the conditional PDF $f_{Y|X}(y \mid x)$ is uniform on the interval $[-\sqrt{r^2 - x^2},\ \sqrt{r^2 - x^2}]$.
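The marginal derived in this example can be sanity-checked numerically, with the illustrative choice r = 1: the density 2*sqrt(r^2 - x^2)/(pi r^2) should integrate to 1 over [-r, r].

```python
import math

# Numerical sketch for Example 3.15 with an illustrative radius r = 1:
# the marginal f_X(x) = 2*sqrt(r^2 - x^2) / (pi * r^2) integrates to 1.

r = 1.0

def marginal_x(x):
    if abs(x) > r:
        return 0.0
    return 2.0 * math.sqrt(r * r - x * x) / (math.pi * r * r)

# Midpoint rule over [-r, r]
n = 100_000
h = 2.0 * r / n
total = sum(marginal_x(-r + (i + 0.5) * h) for i in range(n)) * h
print(total)  # ~ 1.0
```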

Expectation of a Function of Random Variables
If $X$ and $Y$ are jointly continuous random variables, and $g$ is some function, then $Z = g(X,Y)$ is also a random variable (it can be continuous or discrete).
The expectation of $Z$ can be calculated by
$E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{X,Y}(x,y)\,dx\,dy$.
If $g$ is a linear function of $X$ and $Y$, e.g., $g(X,Y) = aX + bY$, then
$E[aX + bY] = aE[X] + bE[Y]$,
where $a$ and $b$ are scalars.
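The linearity rule can be illustrated with a small Monte Carlo experiment; X uniform on [0, 1] (mean 1/2) and Y exponential with rate 1 (mean 1) are illustrative choices, not from the slides.

```python
import random

random.seed(0)

# Monte Carlo sketch of linearity: E[aX + bY] = a*E[X] + b*E[Y].
# Illustrative choices: X uniform on [0, 1], Y exponential with rate 1.
a, b = 2.0, -3.0
n = 200_000
xs = [random.random() for _ in range(n)]
ys = [random.expovariate(1.0) for _ in range(n)]

lhs = sum(a * x + b * y for x, y in zip(xs, ys)) / n
rhs = a * (sum(xs) / n) + b * (sum(ys) / n)
print(lhs, rhs)  # both near a*(1/2) + b*1 = -2
```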

Conditional Expectation
The properties of unconditional expectation carry through, with the obvious modifications, to conditional expectation; for example,
$E[X \mid Y = y] = \int_{-\infty}^{\infty} x\, f_{X|Y}(x \mid y)\,dx$ and $E[g(X) \mid Y = y] = \int_{-\infty}^{\infty} g(x)\, f_{X|Y}(x \mid y)\,dx$.

Total Probability/Expectation Theorems
Total probability theorem: for any event $A$ and a continuous random variable $Y$,
$P(A) = \int_{-\infty}^{\infty} P(A \mid Y = y)\, f_Y(y)\,dy$.
Total expectation theorem: for any continuous random variables $X$ and $Y$,
$E[X] = \int_{-\infty}^{\infty} E[X \mid Y = y]\, f_Y(y)\,dy$.

Independence
Two continuous random variables $X$ and $Y$ are independent if
$f_{X,Y}(x,y) = f_X(x)\, f_Y(y)$ for all $x$, $y$,
or, equivalently,
$f_{X|Y}(x \mid y) = f_X(x)$ for all $x$ and all $y$ with $f_Y(y) > 0$.

More Facts about Independence (1/2)
If two continuous random variables $X$ and $Y$ are independent, then:
Any two events of the forms $\{X \in A\}$ and $\{Y \in B\}$ are independent:
$P(X \in A,\ Y \in B) = P(X \in A)\, P(Y \in B)$.

More Facts about Independence (2/2)
If two continuous random variables $X$ and $Y$ are independent, then:
The random variables $g(X)$ and $h(Y)$ are independent for any functions $g$ and $h$.
Therefore, $E[g(X)\,h(Y)] = E[g(X)]\, E[h(Y)]$; in particular, $E[XY] = E[X]E[Y]$ and $\mathrm{var}(X+Y) = \mathrm{var}(X) + \mathrm{var}(Y)$.
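The product rule for independent variables can be checked by simulation; the distributions and the functions g(x) = x^2, h(y) = exp(y) below are illustrative choices, for which E[g(X)] = 1/3 and E[h(Y)] = e - 1.

```python
import math
import random

random.seed(1)

# Monte Carlo sketch: for independent X, Y (both uniform on [0, 1] here)
# and g(x) = x**2, h(y) = exp(y),
# E[g(X) h(Y)] should match E[g(X)] * E[h(Y)] = (1/3) * (e - 1).
n = 400_000
sum_g = sum_h = sum_gh = 0.0
for _ in range(n):
    g = random.random() ** 2
    h = math.exp(random.random())
    sum_g += g
    sum_h += h
    sum_gh += g * h

lhs = sum_gh / n
rhs = (sum_g / n) * (sum_h / n)
print(lhs, rhs)  # both near (1/3) * (e - 1) ~ 0.573
```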

Joint CDFs
If $X$ and $Y$ are two (either continuous or discrete) random variables, their joint cumulative distribution function (CDF) is defined by
$F_{X,Y}(x,y) = P(X \le x,\ Y \le y)$.
If $X$ and $Y$ further have a joint PDF $f_{X,Y}$, then
$F_{X,Y}(x,y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{X,Y}(s,t)\,dt\,ds$
and
$f_{X,Y}(x,y) = \dfrac{\partial^2 F_{X,Y}}{\partial x\,\partial y}(x,y)$,
if $F_{X,Y}$ can be differentiated at the point $(x,y)$.

An Illustrative Example
Example 3.20. Verify that if X and Y are described by a uniform PDF on the unit square, then the joint CDF is given by
$F_{X,Y}(x,y) = P(X \le x,\ Y \le y) = xy$ for $0 \le x, y \le 1$.
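A numerical sketch of this verification: integrate the unit-square density to obtain F(x, y), compare with x*y, and recover the density as the mixed partial derivative of F via finite differences.

```python
# Numerical sketch of Example 3.20.

def f(x, y):
    # Uniform PDF on the unit square
    return 1.0 if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def F(x, y, n=300):
    # Midpoint-rule integration of f over [0, x] x [0, y]
    hx, hy = x / n, y / n
    return sum(f((i + 0.5) * hx, (j + 0.5) * hy) * hx * hy
               for i in range(n) for j in range(n))

x0, y0 = 0.4, 0.7
val = F(x0, y0)
print(val, x0 * y0)            # both ~ 0.28

# Recover the PDF as the mixed partial derivative of F
eps = 1e-3
pdf = (F(x0 + eps, y0 + eps) - F(x0 + eps, y0)
       - F(x0, y0 + eps) + F(x0, y0)) / (eps * eps)
print(pdf)                     # ~ 1.0, the uniform density
```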

Recall: the Discrete Bayes’ Rule
Let $A_1, A_2, \ldots, A_n$ be disjoint events that form a partition of the sample space, and assume that $P(A_i) > 0$ for all $i$. Then, for any event $B$ such that $P(B) > 0$, we have
$P(A_i \mid B) = \dfrac{P(A_i)\, P(B \mid A_i)}{P(B)} = \dfrac{P(A_i)\, P(B \mid A_i)}{\sum_{k=1}^{n} P(A_k)\, P(B \mid A_k)}$.
The numerator follows from the multiplication rule, the denominator from the total probability theorem.
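The rule translates directly into a few lines of code; the three-hypothesis numbers below are hypothetical, for illustration only.

```python
# Minimal sketch of the discrete Bayes rule: given priors P(A_i) and
# likelihoods P(B | A_i), return the posteriors P(A_i | B).

def bayes_rule(priors, likelihoods):
    # Total probability theorem: P(B) = sum_k P(A_k) * P(B | A_k)
    p_b = sum(p * l for p, l in zip(priors, likelihoods))
    # Multiplication rule in the numerator: P(A_i) * P(B | A_i)
    return [p * l / p_b for p, l in zip(priors, likelihoods)]

# Hypothetical example: three equally likely hypotheses
posterior = bayes_rule([1/3, 1/3, 1/3], [0.9, 0.5, 0.1])
print(posterior)  # [0.6, 0.333..., 0.066...]; sums to 1
```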

Inference and the Continuous Bayes’ Rule (1/2)
Suppose we have a model of an underlying but unobserved phenomenon, represented by a random variable $X$ with PDF $f_X$, and we make a noisy measurement $Y$, which is modeled in terms of a conditional PDF $f_{Y|X}$. Once the experimental value $y$ of $Y$ is measured, what information does this provide on the unknown value of $X$?
Measurement: $f_{Y|X}(y \mid x)$. Inference:
$f_{X|Y}(x \mid y) = \dfrac{f_X(x)\, f_{Y|X}(y \mid x)}{f_Y(y)} = \dfrac{f_X(x)\, f_{Y|X}(y \mid x)}{\int_{-\infty}^{\infty} f_X(t)\, f_{Y|X}(y \mid t)\,dt}$.

Inference and the Continuous Bayes’ Rule (2/2)
If the unobserved phenomenon is inherently discrete:
Let $N$ be a discrete random variable that represents the different discrete possibilities for the unobserved phenomenon of interest, and let $p_N$ be the PMF of $N$. Then, for an observed value $y$ of $Y$,
$P(N = n \mid Y = y) = \dfrac{p_N(n)\, f_{Y|N}(y \mid n)}{f_Y(y)} = \dfrac{p_N(n)\, f_{Y|N}(y \mid n)}{\sum_{i} p_N(i)\, f_{Y|N}(y \mid i)}$,
where the denominator follows from the total probability theorem.

Illustrative Examples (1/2)
Example 3.18. A lightbulb produced by the General Illumination Company is known to have an exponentially distributed lifetime $Y$. However, the company has been experiencing quality control problems: on any given day, the parameter $\Lambda$ of the PDF of $Y$ is actually a random variable, uniformly distributed in the interval $[1, 3/2]$. If we test a lightbulb and record its lifetime ($Y = y$), what can we say about the underlying parameter $\Lambda$?
Conditioned on $\Lambda = \lambda$, $Y$ has an exponential distribution with parameter $\lambda$, so the continuous Bayes rule gives
$f_{\Lambda|Y}(\lambda \mid y) = \dfrac{2\lambda e^{-\lambda y}}{\int_{1}^{3/2} 2t e^{-t y}\,dt}$ for $1 \le \lambda \le 3/2$.
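A numerical sketch of this example, assuming the uniform prior on [1, 3/2] given in the textbook version: the posterior of Lambda given Y = y is proportional to lambda * exp(-lambda * y), since the flat prior cancels in the Bayes rule.

```python
import math

# Numerical posterior for Example 3.18 (prior on [1, 1.5] assumed
# from the textbook version of the example).

def posterior_grid(y, lo=1.0, hi=1.5, n=5_000):
    h = (hi - lo) / n
    grid = [lo + (i + 0.5) * h for i in range(n)]
    # Unnormalized posterior: likelihood lambda * exp(-lambda * y);
    # the flat prior is a common factor and cancels.
    unnorm = [l * math.exp(-l * y) for l in grid]
    norm = sum(unnorm) * h
    return grid, [u / norm for u in unnorm], h

grid, post, h = posterior_grid(y=2.0)
total = sum(post) * h
print(total)               # ~ 1.0: the posterior is a proper density

# A long observed lifetime makes small failure rates more plausible
print(post[0] > post[-1])  # True for y = 2.0
```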

Illustrative Examples (2/2)
Example 3.19. Signal Detection. A binary signal $S$ is transmitted, and we are given that $P(S = 1) = p$ and $P(S = -1) = 1 - p$. The received signal is $Y = N + S$, where $N$ is normal noise with zero mean and unit variance, independent of $S$. What is the probability that $S = 1$, as a function of the observed value $y$ of $Y$?
Conditioned on $S = s$, $Y$ has a normal distribution with mean $s$ and unit variance, so
$P(S = 1 \mid Y = y) = \dfrac{p\, e^{-(y-1)^2/2}}{p\, e^{-(y-1)^2/2} + (1-p)\, e^{-(y+1)^2/2}}$.
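This detection rule is a direct application of the Bayes rule with normal likelihoods; the common factor 1/sqrt(2*pi) cancels, so only the exponentials are needed.

```python
import math

# Sketch of Example 3.19: P(S = 1 | Y = y) with S in {+1, -1},
# P(S = 1) = p, and Y = S + N for standard normal noise N.

def prob_s_plus(y, p):
    like_plus = math.exp(-((y - 1.0) ** 2) / 2.0)   # N(y; +1, 1), scaled
    like_minus = math.exp(-((y + 1.0) ** 2) / 2.0)  # N(y; -1, 1), scaled
    return p * like_plus / (p * like_plus + (1 - p) * like_minus)

# With equal priors p = 1/2, y = 0 is completely ambiguous,
# and larger observed values favor S = 1
print(prob_s_plus(0.0, 0.5))   # 0.5
print(prob_s_plus(2.0, 0.5))   # close to 1
```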

Recitation
SECTION 3.4 Conditioning on an Event: Problems 14, 17, 18
SECTION 3.5 Multiple Continuous Random Variables: Problems 19, 24, 25, 26, 28