EE 5345 Multiple Random Variables



EE 5345 Multiple Random Variables
Topics: CDFs and pdfs; marginals; independence; functions of several RVs; multidimensional expectation (correlation and covariance); multivariate Gaussian RVs.

Multiple Random Variables: Cumulative Distribution Function
The joint CDF of X1, ..., Xn is
F(x1, ..., xn) = P[X1 ≤ x1, X2 ≤ x2, ..., Xn ≤ xn].

Multiple Random Variables (cont): Probability Density Function
The joint pdf is the mixed partial derivative of the joint CDF:
f(x1, ..., xn) = ∂ⁿF(x1, ..., xn) / ∂x1 ∂x2 ··· ∂xn.
CDF marginals follow by letting the unwanted arguments go to infinity, e.g.
F_X1(x1) = F(x1, ∞, ..., ∞).

Multiple Random Variables (cont): Pdf Marginals
Integrate out what you don't want:
f_X1(x1) = ∫ ··· ∫ f(x1, x2, ..., xn) dx2 ··· dxn.

Multiple Random Variables (cont): Conditional Pdfs
f_{X|Y}(x | y) = f_{X,Y}(x, y) / f_Y(y), wherever f_Y(y) > 0.

Multiple Random Variables (cont): Independence
X1, ..., Xn are independent if and only if the joint is the product of the marginals:
f(x1, ..., xn) = f_X1(x1) f_X2(x2) ··· f_Xn(xn).

Multiple Random Variables (cont): Expectation
E[g(X1, ..., Xn)] = ∫ ··· ∫ g(x1, ..., xn) f(x1, ..., xn) dx1 ··· dxn.
Note: if the Xk's are independent, expectations of products factor:
E[ ∏k gk(Xk) ] = ∏k E[ gk(Xk) ].
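The factorization can be sanity-checked numerically. A minimal Monte Carlo sketch (the choice of Uniform(0,1) and Exponential(1) variables is an arbitrary illustration, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent X ~ Uniform(0,1) and Y ~ Exponential(1)
x = rng.uniform(0.0, 1.0, n)
y = rng.exponential(1.0, n)

# For independent RVs, E[XY] should factor into E[X] * E[Y]
lhs = np.mean(x * y)            # Monte Carlo estimate of E[XY]
rhs = np.mean(x) * np.mean(y)   # estimate of E[X] * E[Y]
print(lhs, rhs)  # both close to 0.5 * 1 = 0.5
```

Both estimates agree to within Monte Carlo error, consistent with E[XY] = E[X]E[Y] for independent variables.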

Multiple Random Variables (cont): Characteristic Function
Φ(ω1, ..., ωn) = E[ exp( j(ω1 X1 + ··· + ωn Xn) ) ].
If the Xk's are independent, the joint characteristic function is the product of the marginal characteristic functions:
Φ(ω1, ..., ωn) = ∏k Φ_Xk(ωk).

Random Variable Sum
If X and Y are independent and Z = X + Y, then
Φ_Z(ω) = E[e^{jω(X+Y)}] = Φ_X(ω) Φ_Y(ω).
Thus, from the convolution theorem of Fourier analysis,
f_Z(z) = (f_X * f_Y)(z).

Random Variable Sum (cont)
If {Xk | 1 ≤ k ≤ n} are i.i.d. (independent and identically distributed) and S = X1 + X2 + ··· + Xn, then
Φ_S(ω) = [Φ_X(ω)]ⁿ.

Random Variable Sum (cont)
If {Xk | 1 ≤ k ≤ n} are i.i.d. and S = X1 + ··· + Xn, then:
S is Gaussian if the Xk's are Gaussian.
S is Poisson if the Xk's are Poisson.
S is Binomial if the Xk's are Binomial.
S is Gamma if the Xk's are Gamma.
S is Cauchy if the Xk's are Cauchy.
S is Negative Binomial if the Xk's are Negative Binomial.
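The Poisson case in the list above is easy to check by simulation: a sum of n i.i.d. Poisson(λ) variables should be Poisson(nλ). A small sketch (parameter choices are illustrative):

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n_terms, lam, n_samples = 5, 2.0, 500_000

# S = X1 + ... + X5 with Xk i.i.d. Poisson(2); theory says S ~ Poisson(10)
s = rng.poisson(lam, size=(n_samples, n_terms)).sum(axis=1)

# Compare the empirical pmf at k = 10 to the exact Poisson(10) pmf
k = 10
empirical = np.mean(s == k)
exact = math.exp(-n_terms * lam) * (n_terms * lam) ** k / math.factorial(k)
print(empirical, exact)
```

The empirical probability matches the exact Poisson(10) pmf value (about 0.125) to within sampling error.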

Functions of Several Random Variables: Types of Transformations
A single function of n RVs.
Functions of n RVs.

Leibnitz’s Rule "It is unworthy of excellent men to lose hours like slaves in the labor of calculation, which could be safely relegated to anyone else if machines were used." Gottfried von Leibnitz. Gottfried von Leibnitz (1648-1716)

One Function of Several Random Variables
CDF of Z = g(X1, X2):
F_Z(z) = P[Z ≤ z] = ∫∫_{R_z} f(x1, x2) dx1 dx2,
where R_z = {(x1, x2) : g(x1, x2) ≤ z}. Challenge: find the region R_z.
(Figure: the region R_z in the (x1, x2) plane.)

One Function of Several Random Variables (cont)
The pdf follows by differentiating: f_Z(z) = dF_Z(z)/dz.
(Figure: the region R_z in the (x1, x2) plane.)

Sum of Two Random Variables
For Z = X + Y, the region is R_z = {(x, y) : x + y ≤ z}, so
F_Z(z) = ∫∫_{x+y≤z} f_{X,Y}(x, y) dx dy.
(Figure: half-plane R_z below the line x + y = z.)

Sum of Two Random Variables (cont)
Writing the double integral with explicit limits:
F_Z(z) = ∫_{−∞}^{∞} ∫_{−∞}^{z−x} f_{X,Y}(x, y) dy dx.
(Figure: half-plane R_z below the line x + y = z.)

Sum of Two Random Variables (cont)
Differentiating with respect to z (Leibniz's rule) gives
f_Z(z) = ∫_{−∞}^{∞} f_{X,Y}(x, z − x) dx.

Sum of Two Random Variables (cont)
If X and Y are independent:
f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx = (f_X * f_Y)(z).
This is the same result we saw with the characteristic function.
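The convolution result can be checked numerically on a grid. For X, Y i.i.d. Uniform(0,1), the convolution of the two flat pdfs should produce the triangular pdf on (0, 2), peaking at height 1 at z = 1 (a minimal sketch using discrete convolution to approximate the integral):

```python
import numpy as np

# pdf of Z = X + Y for independent X, Y is the convolution f_X * f_Y.
# Here X, Y ~ Uniform(0,1), so f_Z should be triangular on (0, 2).
dx = 0.001
x = np.arange(0.0, 1.0, dx)
f_x = np.ones_like(x)   # Uniform(0,1) pdf sampled on a grid
f_y = np.ones_like(x)

# Discrete convolution times the grid step approximates the integral
f_z = np.convolve(f_x, f_y) * dx
z = np.arange(len(f_z)) * dx

# Triangular pdf peaks at z = 1 with height 1, and integrates to 1
peak = f_z[np.argmin(np.abs(z - 1.0))]
area = f_z.sum() * dx
print(peak, area)
```

The computed peak is ≈ 1 and the numerical pdf integrates to ≈ 1, matching the triangular density.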

Product of Two Random Variables
Assume X > 0 and Y > 0. For Z = XY, the region is R_z = {(x, y) : xy ≤ z}, bounded above by the curve y = z/x. Then
F_Z(z) = ∫_0^∞ ∫_0^{z/x} f_{X,Y}(x, y) dy dx.
(Figure: region R_z below the hyperbola y = z/x.)

Product of Two Random Variables, X, Y > 0 (cont)
F_Z(z) = ∫_0^∞ [ ∫_0^{z/x} f_{X,Y}(x, y) dy ] dx.
(Figure: region R_z below the hyperbola y = z/x.)

Product of Two Random Variables, X, Y > 0 (cont)
Using Leibniz's rule to differentiate with respect to z:
f_Z(z) = ∫_0^∞ f_{X,Y}(x, z/x) (1/x) dx.
What happens when we do not restrict X and Y to be positive? The integral then runs over all x, with 1/x replaced by 1/|x|.

Product of Two Random Variables: Example
X and Y i.i.d. and uniform on (0, 1). Then f_Y(z/x) = 1 only when 0 < z/x < 1, i.e. x > z, so for 0 < z < 1
f_Z(z) = ∫_z^1 (1/x) dx = −ln z.

Product of Two Random Variables: Example (cont)
X and Y i.i.d. and uniform on (0, 1):
f_Z(z) = −ln z for 0 < z < 1, and 0 otherwise.
(Figure: plot of f_Z(z) = −ln z on (0, 1).)
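A Monte Carlo sketch checking this example: the pdf f_Z(z) = −ln z implies the CDF F_Z(z) = z − z ln z on (0, 1), which we can compare against simulated products of uniforms:

```python
import math
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Z = XY with X, Y i.i.d. Uniform(0,1); derived pdf is f_Z(z) = -ln z on (0,1)
z = rng.uniform(0, 1, n) * rng.uniform(0, 1, n)

# CDF from the pdf: F_Z(z) = integral of (-ln t) from 0 to z = z - z ln z
z0 = 0.5
empirical = np.mean(z <= z0)
exact = z0 - z0 * math.log(z0)   # about 0.8466
print(empirical, exact)
```

The simulated probability P(Z ≤ 0.5) agrees with z0 − z0 ln z0 to within sampling error.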

Quotient of Two Random Variables
Scaling background: if Y = aX, then
f_Y(y) = (1/|a|) f_X(y/a).
(Figure: f_X(x) and, for a = 2, the stretched and rescaled f_Y(y).)

Quotient of Two Random Variables (cont)
For Z = X/Y: given Y = y, this is a simple scaling problem with a = 1/y. Thus
f_{Z|Y}(z | y) = |y| f_{X|Y}(yz | y).

Quotient of Two Random Variables (cont)
Averaging over Y with the joint pdf:
f_Z(z) = ∫_{−∞}^{∞} |y| f_{X,Y}(yz, y) dy.

Quotient of Two Random Variables: Example
X and Y i.i.d. exponential RVs with rate λ:
f_Z(z) = ∫_0^∞ y (λ e^{−λyz}) (λ e^{−λy}) dy,  z > 0.

Quotient of Two Random Variables: Example (cont)
Carrying out the integration:
f_Z(z) = λ² ∫_0^∞ y e^{−λ(1+z)y} dy = λ² / [λ(1+z)]² = 1/(1+z)²,  z > 0.
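The result f_Z(z) = 1/(1+z)² corresponds to the CDF F_Z(z) = z/(1+z), independent of λ. A Monte Carlo sketch checking this at a few points:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Z = X/Y with X, Y i.i.d. Exponential(1); derived pdf is f_Z(z) = 1/(1+z)^2
z = rng.exponential(1.0, n) / rng.exponential(1.0, n)

# Corresponding CDF is F_Z(z) = z / (1 + z); compare at a few points
for z0 in (0.5, 1.0, 3.0):
    print(np.mean(z <= z0), z0 / (1.0 + z0))
```

In particular P(Z ≤ 1) ≈ 1/2: the quotient of two i.i.d. exponentials has median 1, as symmetry between X and Y suggests.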

Expectation
There are two ways to find E[Z] when Z = g(X, Y):
1. Smart way: integrate g directly against the joint pdf,
E[Z] = ∫∫ g(x, y) f_{X,Y}(x, y) dx dy.
2. Dumb way (unless you already know the distribution of Z): set Z = g(X, Y); find f_Z(z); compute E[Z] = ∫ z f_Z(z) dz.

Expectation (example)
X and Y uniform on (0, 1) and i.i.d. Find E[Z] when Z = cos(2π(X + Y)):
E[Z] = ∫_0^1 ∫_0^1 cos(2π(x + y)) dx dy = 0.
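This example is exactly the "smart way" in action: average g(X, Y) over samples of (X, Y) without ever deriving the distribution of Z. A Monte Carlo sketch (assuming the intended integrand is cos(2π(X + Y)); the π appears to have been lost in extraction):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Average g(X, Y) = cos(2*pi*(X + Y)) directly over samples of (X, Y),
# never computing the pdf of Z itself
x = rng.uniform(0, 1, n)
y = rng.uniform(0, 1, n)
e_z = np.mean(np.cos(2 * np.pi * (x + y)))
print(e_z)  # close to 0, matching the exact double integral
```

The positive and negative lobes of the cosine cancel over a full period, so the estimate hovers near 0.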

Expectation (discrete)
If X and Y are discrete RVs, we can use the probability mass function:
E[g(X, Y)] = Σ_x Σ_y g(x, y) p_{X,Y}(x, y).

Expectation (joint moments)
The joint moments of X and Y are
m_{jk} = E[Xʲ Yᵏ] = ∫∫ xʲ yᵏ f_{X,Y}(x, y) dx dy.
If discrete:
m_{jk} = Σ_x Σ_y xʲ yᵏ p_{X,Y}(x, y).

Expectation (correlation)
The correlation of X and Y is E[XY]. If E[XY] = 0, X and Y are orthogonal.
The covariance of X and Y is
Cov(X, Y) = E[(X − m_X)(Y − m_Y)] = E[XY] − m_X m_Y.
The correlation coefficient is
ρ = Cov(X, Y) / (σ_X σ_Y), with −1 ≤ ρ ≤ 1.
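These definitions can be exercised on simulated data. A minimal sketch (the linear model Y = 0.6X + noise is an illustrative choice, engineered so that Cov(X, Y) = ρ = 0.6):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000

# Correlated pair: Y = 0.6 X + 0.8 W with X, W i.i.d. standard normal,
# so Var(Y) = 0.36 + 0.64 = 1, Cov(X, Y) = 0.6, and rho = 0.6
x = rng.standard_normal(n)
y = 0.6 * x + 0.8 * rng.standard_normal(n)

cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)   # E[XY] - E[X]E[Y]
rho = cov_xy / (np.std(x) * np.std(y))              # correlation coefficient
print(cov_xy, rho)  # both close to 0.6
```

Both the covariance and the correlation coefficient come out near the designed value 0.6, and rho agrees with NumPy's built-in np.corrcoef.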

Joint Gaussian Random Variables
The bivariate Gaussian pdf is
f_{X,Y}(x, y) = [1 / (2π σ1 σ2 √(1−ρ²))] exp{ −[1/(2(1−ρ²))] [ (x−m1)²/σ1² − 2ρ(x−m1)(y−m2)/(σ1σ2) + (y−m2)²/σ2² ] }.
What does this look like?

Contours
If g(x, y) has contours, then f(g(x, y)) has the same contours: along any contour g(x, y) = a, the composition is constant, f(g(x, y)) = f(a).

Joint Gaussian Random Variables (cont)
Thus f_{X,Y}(x, y) has the same contours as the quadratic form in its exponent:
(x−m1)²/σ1² − 2ρ(x−m1)(y−m2)/(σ1σ2) + (y−m2)²/σ2² = constant.
This is the equation for an ellipse.

Joint Gaussian Random Variables (cont)
The means (m1 and m2), variances (σ1² and σ2²), and correlation coefficient ρ uniquely define the 2-D Gaussian. The marginals are 1-D Gaussian RVs.
Do Gaussian marginals imply a joint Gaussian RV? When is a joint Gaussian RV a line mass?
(Figure: elliptical contours centered at (m1, m2) in the (x, y) plane.)
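For the first question the answer is no, and a standard counterexample (added here for illustration, not from the slides) can be checked numerically: take X standard normal and Y = WX with W = ±1 a fair coin independent of X. Each marginal is N(0, 1), but the pair is not jointly Gaussian, since half the probability mass sits on the line y = −x:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200_000

# X ~ N(0,1); W = +/-1 with prob 1/2, independent of X; Y = W * X.
# Y is also N(0,1), but (X, Y) is NOT jointly Gaussian.
x = rng.standard_normal(n)
w = rng.choice([-1.0, 1.0], n)
y = w * x

# A genuine joint Gaussian could never put probability 1/2 on the event
# X + Y = 0, yet here exactly the w = -1 samples land on that line
frac_on_line = np.mean(x + y == 0.0)
print(frac_on_line)  # close to 0.5
```

A point mass of probability 1/2 on a line is impossible for any nondegenerate joint Gaussian pdf, so Gaussian marginals do not imply joint Gaussianity.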

n Jointly Gaussian RV’s where and the covariance matrix is

n Jointly Gaussian RV’s The Characteristic Function