Operations on Multiple Random Variables

Operations on Multiple Random Variables. Previously we discussed operations on a single random variable: expected value, moments, central moments, the characteristic function, the moment generating function, and monotone and nonmonotone transformations of a random variable.

Expected Value of a Function of Multiple Random Variables. For two random variables X and Y with joint density fXY(x,y), the expected value of a function g(X,Y) is E[g(X,Y)] = ∫∫ g(x,y) fXY(x,y) dx dy. For N random variables X1, X2, …, XN with joint density fX1…XN(x1, …, xN), E[g(X1, …, XN)] = ∫…∫ g(x1, …, xN) fX1…XN(x1, …, xN) dx1 … dxN. (See Example 5.1-1.)
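As an illustrative sketch (not part of the original slides), the expectation of a function of two random variables can be estimated by Monte Carlo averaging. The jointly Gaussian pair, the parameter values, and the function g below are assumptions chosen only for illustration; NumPy is used for the sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example: X and Y jointly Gaussian with means 1 and 2,
# unit variances, and covariance C_XY = 0.5.
mean = [1.0, 2.0]
cov = [[1.0, 0.5],
       [0.5, 1.0]]
samples = rng.multivariate_normal(mean, cov, size=200_000)
x, y = samples[:, 0], samples[:, 1]

def g(x, y):
    # An arbitrary example function of the two random variables.
    return x * y + x**2

# E[g(X, Y)] is approximated by the sample average of g over the joint draws.
estimate = np.mean(g(x, y))

# For this g the exact value is E[XY] + E[X^2]
#   = (a_X a_Y + C_XY) + (sigma_X^2 + a_X^2) = 2.5 + 2 = 4.5.
print(estimate)   # should be close to 4.5
```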

Joint Moments about the Origin. The joint moments about the origin of two random variables X and Y are defined as mnk = E[X^n Y^k] = ∫∫ x^n y^k fXY(x,y) dx dy. Setting k = 0 gives mn0 = E[X^n], the nth moment of the single random variable X; setting n = 0 gives m0k = E[Y^k], the kth moment of the single random variable Y. The sum n + k is called the order of the moment; thus m02, m20, and m11 are all second-order moments of X and Y. The first-order moments m10 = E[X] and m01 = E[Y] are the expected values of X and Y and are the coordinates of the center of gravity of the function fXY(x,y).

The second-order moment m11 = E[XY] is called the correlation of X and Y and is given the symbol RXY; hence RXY = m11 = E[XY] = ∫∫ x y fXY(x,y) dx dy. If the correlation can be written as RXY = E[X] E[Y], then the random variables X and Y are said to be uncorrelated. Statistical independence of X and Y implies that X and Y are uncorrelated; the converse is not true in general.
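A quick numerical illustration (an assumption-laden sketch, not from the slides): when X and Y are drawn independently, the sample estimate of the correlation RXY = E[XY] matches E[X]E[Y]. The exponential and uniform distributions below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=500_000)   # assumed distribution for X
y = rng.uniform(-1.0, 3.0, size=500_000)       # assumed distribution for Y, drawn independently of X

r_xy = np.mean(x * y)              # sample estimate of R_XY = E[XY]
product = np.mean(x) * np.mean(y)  # sample estimate of E[X] E[Y]
print(r_xy, product)               # nearly equal, so X and Y are (empirically) uncorrelated
```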

Uncorrelatedness of X and Y does not imply that X and Y are statistically independent in general, except for Gaussian random variables, as will be shown later. If RXY = 0, the random variables X and Y are called orthogonal. For N random variables X1, X2, …, XN, the (n1 + n2 + … + nN)-order joint moments are defined by mn1 n2 … nN = E[X1^n1 X2^n2 … XN^nN], where n1, n2, …, nN are all integers 0, 1, 2, …. (See Example 5.1-2.)

Joint Central Moments. We define the joint central moments of two random variables X and Y as µnk = E[(X − E[X])^n (Y − E[Y])^k] = ∫∫ (x − E[X])^n (y − E[Y])^k fXY(x,y) dx dy. The second-order (n + k = 2) central moments are µ20 = E[(X − E[X])^2] = σX^2, the variance of X, and µ02 = E[(Y − E[Y])^2] = σY^2, the variance of Y. The second-order joint central moment µ11 is very important: it is called the covariance of X and Y and is given the symbol CXY.

Thus CXY = µ11 = E[(X − E[X])(Y − E[Y])]. By direct expansion of the product we get CXY = E[XY] − E[X] E[Y] = RXY − E[X] E[Y]. Note that in the one-dimensional case the variance was defined as σX^2 = E[(X − E[X])^2], which can be written as σX^2 = E[X^2] − (E[X])^2.
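The expansion CXY = RXY − E[X]E[Y] is easy to verify numerically. The sketch below (an illustration, not from the slides; the particular correlated samples are assumed) computes the covariance both directly and via the expansion.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100_000)
y = 0.7 * x + rng.normal(size=100_000)   # y is correlated with x by construction

c_direct = np.mean((x - x.mean()) * (y - y.mean()))    # C_XY = E[(X - E[X])(Y - E[Y])]
c_expanded = np.mean(x * y) - x.mean() * y.mean()      # R_XY - E[X] E[Y]
print(c_direct, c_expanded)                            # the two agree
# np.cov(x, y, bias=True)[0, 1] gives the same quantity.
```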

The normalized second-order central moment ρ = µ11 / (σX σY) = CXY / (σX σY) is known as the correlation coefficient of X and Y. It can be shown that −1 ≤ ρ ≤ 1.
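As a sketch (sample data below are assumed for illustration), the correlation coefficient can be computed from its definition and compared with NumPy's built-in estimate.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100_000)
y = -0.4 * x + rng.normal(size=100_000)   # negatively correlated with x by construction

c_xy = np.mean((x - x.mean()) * (y - y.mean()))
rho = c_xy / (x.std() * y.std())          # rho = C_XY / (sigma_X sigma_Y)
print(rho)                                # always lies in [-1, 1]
print(np.corrcoef(x, y)[0, 1])            # NumPy's estimate, essentially the same value
```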

For N random variables X1, X2, …, XN, the (n1 + n2 + … + nN)-order joint central moments are defined by µn1 n2 … nN = E[(X1 − E[X1])^n1 (X2 − E[X2])^n2 … (XN − E[XN])^nN].

Problem 5.1-1

Problem 5.1-3

Problem 5.1-19

Joint Characteristic Functions. Let us first review the characteristic function of a single random variable. Let X be a random variable with probability density function fX(x). We defined the characteristic function ΦX(ω) as ΦX(ω) = E[e^(jωX)] = ∫ fX(x) e^(jωx) dx, which is the Fourier transform of fX(x) with the sign of ω reversed. The moments mn can be found from the characteristic function as mn = (−j)^n d^n ΦX(ω)/dω^n evaluated at ω = 0.

We now define the joint characteristic function of two random variables X and Y with joint probability density function fXY(x,y) as ΦX,Y(ω1, ω2) = E[e^(j(ω1 X + ω2 Y))] = ∫∫ fXY(x,y) e^(j(ω1 x + ω2 y)) dx dy, where ω1 and ω2 are real numbers. ΦX,Y(ω1, ω2) is the two-dimensional Fourier transform of fXY(x,y) with the signs of ω1 and ω2 reversed. From the inverse two-dimensional Fourier transform we have fXY(x,y) = (1 / 4π^2) ∫∫ ΦX,Y(ω1, ω2) e^(−j(ω1 x + ω2 y)) dω1 dω2.
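Because the joint characteristic function is an expectation, it can also be estimated by averaging e^(j(ω1 X + ω2 Y)) over samples. The sketch below (an illustration, not from the slides; the zero-mean jointly Gaussian pair and its covariance are assumptions) compares the sample average with the known closed form for the Gaussian case.

```python
import numpy as np

rng = np.random.default_rng(4)
samples = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 2.0]], size=300_000)
x, y = samples[:, 0], samples[:, 1]

def phi_xy(w1, w2):
    # Sample-average estimate of Phi_XY(w1, w2) = E[exp(j(w1 X + w2 Y))].
    return np.mean(np.exp(1j * (w1 * x + w2 * y)))

# For zero-mean jointly Gaussian X, Y the exact value is
#   exp(-0.5 * (w1^2 var_X + 2 w1 w2 C_XY + w2^2 var_Y)).
w1, w2 = 0.5, -0.8
exact = np.exp(-0.5 * (w1**2 * 1.0 + 2 * w1 * w2 * 0.3 + w2**2 * 2.0))
print(phi_xy(w1, w2), exact)   # the estimate is close to the exact value
```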

By setting ω2 = 0 in the expression for ΦX,Y(ω1, ω2) above we obtain ΦX,Y(ω1, 0) = E[e^(jω1 X)] = ΦX(ω1), the characteristic function of the marginal probability density function fX(x). Similarly, ΦX,Y(0, ω2) = ΦY(ω2).

The joint moments can be found from the joint characteristic function: mnk = (−j)^(n+k) ∂^(n+k) ΦX,Y(ω1, ω2) / ∂ω1^n ∂ω2^k evaluated at ω1 = ω2 = 0. This expression is the two-dimensional extension of the one-dimensional result above. The joint characteristic function for N random variables is ΦX1,…,XN(ω1, …, ωN) = E[e^(j(ω1 X1 + … + ωN XN))], and the joint moments are obtained from mn1 … nN = (−j)^(n1+…+nN) ∂^(n1+…+nN) Φ / ∂ω1^n1 … ∂ωN^nN evaluated at ω1 = … = ωN = 0.
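The moment-from-derivative recipe can be checked symbolically. This sketch (not from the slides) uses SymPy and the one-dimensional characteristic function Φ(ω) = 1/(1 − jω) of an Exponential(1) variable, an assumed example, to recover the moments mn = n!.

```python
import sympy as sp

w = sp.symbols('w', real=True)
phi = 1 / (1 - sp.I * w)          # characteristic function of an Exponential(1) variable

for n in range(1, 4):
    # m_n = (-j)^n * d^n Phi / dw^n evaluated at w = 0
    m_n = sp.simplify((-sp.I)**n * sp.diff(phi, w, n).subs(w, 0))
    print(n, m_n)                 # prints 1, 2, 6, i.e. m_n = n!
```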

The Gaussian Random Variable. A random variable X is called Gaussian if its density function has the form fX(x) = (1 / √(2π σX^2)) exp(−(x − aX)^2 / (2 σX^2)), where aX = E[X]. The “spread” about the point x = aX is governed by σX.
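As a small sketch (the parameter values aX = 2 and σX = 1.5 are assumptions), the density above can be evaluated directly and checked against SciPy's implementation.

```python
import numpy as np
from scipy.stats import norm

a_x, sigma_x = 2.0, 1.5
x = np.linspace(-4, 8, 7)

# Gaussian density written directly from the formula above.
pdf_manual = np.exp(-(x - a_x)**2 / (2 * sigma_x**2)) / np.sqrt(2 * np.pi * sigma_x**2)
pdf_scipy = norm.pdf(x, loc=a_x, scale=sigma_x)
print(np.allclose(pdf_manual, pdf_scipy))   # True: both give the same density values
```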

Jointly Gaussian Random Variables. Two random variables X and Y are said to be jointly Gaussian (to have a bivariate Gaussian density) if their joint density function is of the form fXY(x,y) = [1 / (2π σX σY √(1 − ρ^2))] exp{ −[1 / (2(1 − ρ^2))] [ (x − aX)^2/σX^2 − 2ρ(x − aX)(y − aY)/(σX σY) + (y − aY)^2/σY^2 ] }, where aX = E[X], aY = E[Y], σX^2 and σY^2 are the variances of X and Y, and ρ = CXY / (σX σY) is their correlation coefficient.
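The bivariate density can be evaluated straight from its parameters and compared with SciPy's multivariate normal. This is an illustrative sketch; all parameter values and the evaluation point are assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

a_x, a_y = 1.0, -1.0
sigma_x, sigma_y, rho = 1.0, 2.0, 0.6

def f_xy(x, y):
    # Bivariate Gaussian density written directly from the slide's form.
    q = ((x - a_x)**2 / sigma_x**2
         - 2 * rho * (x - a_x) * (y - a_y) / (sigma_x * sigma_y)
         + (y - a_y)**2 / sigma_y**2)
    norm_const = 2 * np.pi * sigma_x * sigma_y * np.sqrt(1 - rho**2)
    return np.exp(-q / (2 * (1 - rho**2))) / norm_const

# Equivalent covariance-matrix parameterization.
cov = [[sigma_x**2, rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]
mvn = multivariate_normal(mean=[a_x, a_y], cov=cov)

print(f_xy(0.5, 0.0), mvn.pdf([0.5, 0.0]))   # the two values agree
```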

The joint Gaussian density function has its maximum located at the point (aX, aY). The maximum value is fXY(aX, aY) = 1 / (2π σX σY √(1 − ρ^2)). The locus of constant values of fXY(x,y) is an ellipse; this is equivalent to intersecting the density surface with a plane parallel to the xy-plane.

If ρ = 0, the joint Gaussian density factors as fXY(x,y) = fX(x) fY(y), so uncorrelated jointly Gaussian random variables are also statistically independent.
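A quick numerical check of this factorization (a sketch with assumed parameter values, not from the slides):

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

a_x, a_y, sigma_x, sigma_y = 0.5, -2.0, 1.0, 3.0
# rho = 0, so the covariance matrix is diagonal.
joint = multivariate_normal(mean=[a_x, a_y], cov=[[sigma_x**2, 0.0], [0.0, sigma_y**2]])

x, y = 1.2, -1.0
lhs = joint.pdf([x, y])                                      # f_XY(x, y) with rho = 0
rhs = norm.pdf(x, a_x, sigma_x) * norm.pdf(y, a_y, sigma_y)  # f_X(x) f_Y(y)
print(np.isclose(lhs, rhs))   # True: uncorrelated jointly Gaussian => independent
```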

Example

Example

Example: Let X and Y be two zero-mean, independent Gaussian random variables.

Solution: Since X and Y are independent, their joint density factors as fXY(x,y) = fX(x) fY(y).