Joint Moments and Joint Characteristic Functions.

1 Joint Moments and Joint Characteristic Functions

2 Compact Representation of Joint p.d.f
- Various parameters permit a compact representation of the information contained in the joint p.d.f of two r.vs.
- Given two r.vs X and Y and a function g(x, y), define the r.v Z = g(X, Y).
- We can define the mean of Z to be μ_Z = E(Z) = ∫ z f_Z(z) dz.

3
- It is possible to express the mean of Z = g(X, Y) in terms of f_XY(x, y) without computing f_Z(z).
- Recall that P(z < Z ≤ z + Δz) = f_Z(z) Δz = ∫∫_{(x, y) ∈ D_Δz} f_XY(x, y) dx dy, where D_Δz is the region in the xy plane satisfying z < g(x, y) ≤ z + Δz.
- As Δz covers the entire z axis, the corresponding regions D_Δz are nonoverlapping, and they cover the entire xy plane.

4
- By integrating, we obtain E(Z) = ∫ z f_Z(z) dz = ∫∫ g(x, y) f_XY(x, y) dx dy, or E[g(X, Y)] = ∫∫ g(x, y) f_XY(x, y) dx dy.
- If X and Y are discrete-type r.vs, then E[g(X, Y)] = Σ_i Σ_j g(x_i, y_j) P(X = x_i, Y = y_j).
- Since expectation is a linear operator, we also get E[Σ_k a_k g_k(X, Y)] = Σ_k a_k E[g_k(X, Y)]. A numerical sketch of the discrete-type formula follows below.
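As a quick illustration of the discrete-type formula, the following Python sketch computes E[g(X, Y)] directly from a joint p.m.f and checks linearity. The joint p.m.f and the function g here are made-up examples, not taken from the slides.

```python
import numpy as np

# Hypothetical joint p.m.f of X in {0,1,2} (rows) and Y in {0,1} (columns);
# the entries sum to 1.
p_xy = np.array([[0.10, 0.15],
                 [0.20, 0.25],
                 [0.05, 0.25]])
x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])

# E[g(X,Y)] = sum_i sum_j g(x_i, y_j) P(X = x_i, Y = y_j), here with g(x, y) = x*y + 2*x
gx, gy = np.meshgrid(x_vals, y_vals, indexing="ij")
g = gx * gy + 2 * gx
e_g = np.sum(g * p_xy)

# Linearity check: E[XY + 2X] = E[XY] + 2 E[X]
e_xy = np.sum(gx * gy * p_xy)
e_x = np.sum(gx * p_xy)
print(e_g, e_xy + 2 * e_x)   # the two numbers agree
```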

5
- If X and Y are independent r.vs, then g(X) and h(Y) are always independent of each other.
- Therefore E[g(X) h(Y)] = E[g(X)] E[h(Y)]. Note that this is in general not true (if X and Y are not independent).
- How do we parametrically represent the cross-behavior between two random variables? Toward this, we generalize the variance definition.

6 Covariance
- Given any two r.vs X and Y, define Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)].
- By expanding and simplifying the right side, we also get Cov(X, Y) = E(XY) - μ_X μ_Y = E(XY) - E(X) E(Y).
- It is easy to see that Cov(X, Y)² ≤ σ_X² σ_Y².

7 Proof
- Let U = aX + Y, so that 0 ≤ Var(U) = E[(U - μ_U)²] = a² σ_X² + 2a Cov(X, Y) + σ_Y², a quadratic in the variable a with imaginary (or double) roots, since it is nonnegative for every a.
- Hence the discriminant must be non-positive: 4 Cov(X, Y)² - 4 σ_X² σ_Y² ≤ 0.
- This gives us the desired result.

8 Correlation Parameter (Covariance)
- Using the bound above, we may define the normalized parameter ρ_XY = Cov(X, Y) / (σ_X σ_Y), with -1 ≤ ρ_XY ≤ 1, or Cov(X, Y) = ρ_XY σ_X σ_Y.
- This represents the correlation coefficient between X and Y.
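As a small illustration, the correlation coefficient can be estimated from samples exactly as defined above. The data below are synthetic (an arbitrary linear-plus-noise model), not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic correlated samples: Y = 0.8*X + noise (illustrative choice only)
x = rng.normal(size=100_000)
y = 0.8 * x + 0.6 * rng.normal(size=100_000)

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))   # Cov(X, Y)
rho = cov_xy / (x.std() * y.std())                  # rho_XY = Cov(X, Y) / (sigma_X sigma_Y)
print(rho, np.corrcoef(x, y)[0, 1])                 # both estimates agree; |rho| <= 1
```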

9 Uncorrelatedness and Orthogonality
- If ρ_XY = 0, then X and Y are said to be uncorrelated r.vs.
- If X and Y are uncorrelated, then E(XY) = E(X) E(Y), since Cov(X, Y) = E(XY) - E(X) E(Y) = 0.
- X and Y are said to be orthogonal if E(XY) = 0.
- If either X or Y has zero mean, then orthogonality and uncorrelatedness are equivalent.

10
- If X and Y are independent, from E[g(X) h(Y)] = E[g(X)] E[h(Y)] with g(X) = X and h(Y) = Y, we get E(XY) = E(X) E(Y).
- We conclude that independence implies uncorrelatedness: there cannot be any correlation between two independent r.vs.
- However, random variables can be uncorrelated without being independent.

11 Example
- Let X and Y be independent r.vs with a common distribution (so that σ_X² = σ_Y²). Define Z = X + Y, W = X - Y.
- Show that Z and W are dependent, but uncorrelated r.vs.
Solution
- Solving z = x + y, w = x - y gives the only solution set to be x = (z + w)/2, y = (z - w)/2.
- Moreover, the Jacobian of the transformation gives |J(z, w)| = 1/2, and hence f_ZW(z, w) = (1/2) f_XY((z + w)/2, (z - w)/2).

12 Example - continued
- Thus, using independence, f_ZW(z, w) = (1/2) f_X((z + w)/2) f_Y((z - w)/2),
- and hence the marginal p.d.fs f_Z(z) and f_W(w) follow by integrating out w and z respectively.

13 Example - continued
- Or, by direct computation (Z = X + Y is a sum of independent r.vs), f_Z(z) is the convolution of f_X and f_Y, and similarly for f_W(w).
- We then find f_ZW(z, w) ≠ f_Z(z) f_W(w).
- Thus Z and W are not independent.

14 Example - continued
- However, E(ZW) = E[(X + Y)(X - Y)] = E(X²) - E(Y²),
- and E(Z) E(W) = [E(X) + E(Y)][E(X) - E(Y)] = E(X)² - E(Y)²,
- and hence Cov(Z, W) = E(ZW) - E(Z) E(W) = σ_X² - σ_Y² = 0 (the two variances being equal here),
- implying that Z and W are uncorrelated random variables. A simulation sketch follows below.
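A small simulation makes the example concrete. The specific common distribution of X and Y is not preserved in this transcript, so the sketch below simply assumes X and Y are i.i.d. uniform on (0, 1); any common distribution gives the same qualitative result.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.uniform(0, 1, n)          # assumed distribution, not from the slides
y = rng.uniform(0, 1, n)
z, w = x + y, x - y

# Uncorrelated: the sample correlation of Z and W is near 0
print(np.corrcoef(z, w)[0, 1])

# Dependent: knowing Z constrains W. Given Z > 1.8, |W| must be below 0.2,
# so P(|W| > 0.5 | Z > 1.8) differs sharply from P(|W| > 0.5).
print(np.mean(np.abs(w) > 0.5), np.mean(np.abs(w[z > 1.8]) > 0.5))
```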

15 Example
- Let Z = aX + bY. Determine the variance of Z in terms of σ_X, σ_Y and ρ_XY.
Solution
- μ_Z = a μ_X + b μ_Y, and using the definition of variance, σ_Z² = E[(Z - μ_Z)²] = a² σ_X² + 2ab ρ_XY σ_X σ_Y + b² σ_Y².
- If X and Y are independent, then ρ_XY = 0 and this reduces to σ_Z² = a² σ_X² + b² σ_Y².
- Thus (with a = b = 1) the variance of the sum of independent r.vs is the sum of their variances.
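A quick Monte Carlo check of the variance formula above; the coefficients and the jointly Gaussian test distribution below are arbitrary illustrative choices, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(2)
a, b = 2.0, -3.0
sx, sy, rho = 1.5, 0.7, 0.4

# Draw correlated (X, Y) with the chosen sigmas and correlation coefficient
cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
x, y = rng.multivariate_normal([0, 0], cov, size=500_000).T

z = a * x + b * y
print(z.var())                                                  # simulated Var(aX + bY)
print(a**2 * sx**2 + 2 * a * b * rho * sx * sy + b**2 * sy**2)  # closed-form value
```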

16 Moments
- E(X^k Y^m) = ∫∫ x^k y^m f_XY(x, y) dx dy represents the joint moment of order (k, m) for X and Y.
- We can define the joint characteristic function between two random variables; this will turn out to be useful for moment calculations.

17 Joint Characteristic Functions
- The joint characteristic function between X and Y is defined as Φ_XY(u, v) = E[e^{j(uX + vY)}] = ∫∫ e^{j(ux + vy)} f_XY(x, y) dx dy.
- Note that |Φ_XY(u, v)| ≤ Φ_XY(0, 0) = 1.
- It is easy to show that E(XY) = -∂²Φ_XY(u, v)/∂u∂v evaluated at u = 0, v = 0; a numerical check follows below.
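To make the moment relation concrete, the sketch below approximates Φ_XY empirically from samples and checks that a finite-difference estimate of -∂²Φ/∂u∂v at (0, 0) reproduces E(XY). The bivariate Gaussian sample distribution and its parameters are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)
x, y = rng.multivariate_normal([0.5, -0.2], [[1.0, 0.6], [0.6, 2.0]], size=400_000).T

def phi(u, v):
    # Empirical joint characteristic function  E[exp(j(uX + vY))]
    return np.mean(np.exp(1j * (u * x + v * y)))

h = 1e-3
# Central finite difference for d^2 Phi / (du dv) at (0, 0)
d2 = (phi(h, h) - phi(h, -h) - phi(-h, h) + phi(-h, -h)) / (4 * h * h)
print((-d2).real)          # approximately E(XY)
print(np.mean(x * y))      # direct sample estimate of E(XY)
```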

18
- If X and Y are independent r.vs, then from the definition of the joint characteristic function, we obtain Φ_XY(u, v) = E(e^{juX}) E(e^{jvY}) = Φ_X(u) Φ_Y(v).
- Also, the marginal characteristic functions are recovered as Φ_X(u) = Φ_XY(u, 0) and Φ_Y(v) = Φ_XY(0, v).

19 More on Gaussian r.vs
- X and Y are said to be jointly Gaussian, N(μ_X, μ_Y, σ_X², σ_Y², ρ), if their joint p.d.f has the standard bivariate normal form (displayed below).
- By direct substitution and simplification, we obtain the joint characteristic function of two jointly Gaussian r.vs to be Φ_XY(u, v) = exp{ j(μ_X u + μ_Y v) - (σ_X² u² + 2ρ σ_X σ_Y u v + σ_Y² v²)/2 }.
- So the moments of jointly Gaussian r.vs follow by differentiating Φ_XY(u, v).
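For reference, the joint p.d.f referred to above is the standard bivariate normal density with these parameters, which is presumably the expression shown on the original slide:

```latex
f_{XY}(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^{2}}}
\exp\!\left\{ -\frac{1}{2(1-\rho^{2})}\left[
\frac{(x-\mu_X)^{2}}{\sigma_X^{2}}
- \frac{2\rho (x-\mu_X)(y-\mu_Y)}{\sigma_X \sigma_Y}
+ \frac{(y-\mu_Y)^{2}}{\sigma_Y^{2}} \right] \right\}
```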

20
- By direct computation, it is easy to show that for two jointly Gaussian random variables, E(XY) = μ_X μ_Y + ρ σ_X σ_Y, so that Cov(X, Y) = ρ σ_X σ_Y.
- Thus the parameter ρ in the joint p.d.f represents the actual correlation coefficient of the two jointly Gaussian r.vs.
- Notice that ρ = 0 implies f_XY(x, y) = f_X(x) f_Y(y).
- Thus if X and Y are jointly Gaussian, uncorrelatedness does imply independence between the two random variables.
- The Gaussian case is the only exception where the two concepts imply each other.

21 Example
- Let X and Y be jointly Gaussian r.vs with parameters N(μ_X, μ_Y, σ_X², σ_Y², ρ). Define Z = aX + bY.
- Determine f_Z(z).
Solution
- We make use of the characteristic function: Φ_Z(u) = E(e^{juZ}) = E(e^{j(auX + buY)}) = Φ_XY(au, bu),
- where we substitute the jointly Gaussian characteristic function from the previous slide.

22 Example - continued
- This has the form Φ_Z(u) = e^{jμ_Z u - σ_Z² u²/2}, with μ_Z = a μ_X + b μ_Y and σ_Z² = a² σ_X² + 2ab ρ σ_X σ_Y + b² σ_Y².
- We conclude that Z = aX + bY is also Gaussian, with the mean and variance so defined.
- We can also conclude that any linear combination of jointly Gaussian r.vs generates a Gaussian r.v.

23 Example
- Suppose X and Y are jointly Gaussian r.vs as in the previous example.
- Define two linear combinations Z = aX + bY and W = cX + dY. What can we say about their joint distribution?
Solution
- The joint characteristic function of Z and W is given by Φ_ZW(u, v) = E(e^{j(uZ + vW)}) = E(e^{j((au + cv)X + (bu + dv)Y)}) = Φ_XY(au + cv, bu + dv).

24 Example - continued
- Thus Φ_ZW(u, v) = exp{ j(μ_Z u + μ_W v) - (σ_Z² u² + 2ρ_ZW σ_Z σ_W u v + σ_W² v²)/2 },
- where μ_Z = a μ_X + b μ_Y, μ_W = c μ_X + d μ_Y, σ_Z² = a² σ_X² + 2ab ρ σ_X σ_Y + b² σ_Y², σ_W² = c² σ_X² + 2cd ρ σ_X σ_Y + d² σ_Y²,
- and ρ_ZW = [ac σ_X² + (ad + bc) ρ σ_X σ_Y + bd σ_Y²] / (σ_Z σ_W).
- We conclude that Z and W are also jointly Gaussian r.vs with means, variances and correlation coefficient as described; a numerical check follows below.
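As a numerical sanity check of these expressions, the sketch below compares the derived means, variances and correlation coefficient of Z = aX + bY and W = cX + dY against simulation. All parameter values are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
mu_x, mu_y, sx, sy, rho = 1.0, -2.0, 1.2, 0.8, 0.5
a, b, c, d = 1.0, 2.0, -1.0, 3.0

cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
x, y = rng.multivariate_normal([mu_x, mu_y], cov, size=500_000).T
z, w = a * x + b * y, c * x + d * y

# Closed-form parameters from the slide
var_z = a*a*sx*sx + 2*a*b*rho*sx*sy + b*b*sy*sy
var_w = c*c*sx*sx + 2*c*d*rho*sx*sy + d*d*sy*sy
cov_zw = a*c*sx*sx + (a*d + b*c)*rho*sx*sy + b*d*sy*sy

print(a*mu_x + b*mu_y, z.mean())                                 # mu_Z
print(var_z, z.var())                                            # sigma_Z^2
print(cov_zw / np.sqrt(var_z * var_w), np.corrcoef(z, w)[0, 1])  # rho_ZW
```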

25 Example - summary
- Any two linear combinations of jointly Gaussian random variables (independent or dependent) are also jointly Gaussian r.vs.
- Of course, we could have reached the same conclusion by deriving the joint p.d.f directly.
- In short: Gaussian input → linear operator → Gaussian output.
- Gaussian random variables are also interesting because of the Central Limit Theorem.

26 Central Limit Theorem
- Suppose X₁, X₂, …, Xₙ are a set of zero-mean, independent, identically distributed (i.i.d.) random variables with some common distribution.
- Consider their scaled sum Y = (X₁ + X₂ + ⋯ + Xₙ)/√n.
- Then asymptotically (as n → ∞), Y → N(0, σ²). A simulation sketch follows below.
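The statement can be visualized with a short simulation; the uniform choice for the common distribution below is just one convenient zero-mean example, not specified by the slides.

```python
import numpy as np

rng = np.random.default_rng(5)
n, trials = 50, 200_000

# Zero-mean i.i.d. X_i ~ Uniform(-1, 1), common variance sigma^2 = 1/3
x = rng.uniform(-1, 1, size=(trials, n))
y = x.sum(axis=1) / np.sqrt(n)      # scaled sum (X_1 + ... + X_n)/sqrt(n)

# Moments of Y approach those of N(0, sigma^2): mean ~ 0, variance ~ 1/3
print(y.mean(), y.var(), 1 / 3)
```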

27 Central Limit Theorem - Proof
- We shall prove it here under the independence assumption; the theorem is true under even more general conditions.
- Let σ² represent their common variance. Since E(Xᵢ) = 0, we have E(Xᵢ²) = σ².
- Consider Φ_Y(u) = E(e^{juY}) = E(e^{ju(X₁ + ⋯ + Xₙ)/√n}) = ∏ᵢ E(e^{juXᵢ/√n}) = [Φ_X(u/√n)]ⁿ, where we have made use of the independence of the r.vs Xᵢ.

28 Central Limit Theorem - Proof
- But, by Taylor expansion, Φ_X(u/√n) = E(e^{juX/√n}) = 1 - u²σ²/(2n) + o(1/n).
- Also, Φ_Y(u) = [1 - u²σ²/(2n) + o(1/n)]ⁿ, and as n → ∞, since (1 - x/n)ⁿ → e^{-x}, we get Φ_Y(u) → e^{-u²σ²/2}.
- This is the characteristic function of a zero-mean normal r.v with variance σ², and this completes the proof.

29 Central Limit Theorem - Conclusion
- A large sum of independent random variables, each with finite variance, tends to behave like a normal random variable.
- The individual p.d.fs become unimportant when analyzing the collective behavior of the sum.
- If we model a noise phenomenon as the sum of a large number of independent random variables, this theorem allows us to conclude that the noise behaves like a Gaussian r.v (e.g., electron motion in resistor components).

30 Central Limit Theorem - Conclusion
- The finite variance assumption is necessary.
- Consider the r.vs Xᵢ to be Cauchy distributed with parameter α, and let Y = (X₁ + ⋯ + Xₙ)/√n.
- Then, since Φ_X(u) = e^{-α|u|}, we get Φ_Y(u) = [Φ_X(u/√n)]ⁿ = e^{-α√n |u|},
- which shows that Y is still Cauchy, with parameter α√n, so no Gaussian limit emerges.
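A simulation hint of this behavior (the Cauchy scale parameter below is an arbitrary choice): unlike the uniform example earlier, the scaled sum of Cauchy samples does not settle down.

```python
import numpy as np

rng = np.random.default_rng(6)
n, trials, alpha = 50, 200_000, 1.0

x = alpha * rng.standard_cauchy(size=(trials, n))
y = x.sum(axis=1) / np.sqrt(n)

# Heavy tails persist: the inter-quartile range of Y grows like sqrt(n),
# instead of stabilizing as it would for a finite-variance distribution.
print(np.percentile(y, [25, 75]))
print(np.percentile(x[:, 0], [25, 75]))   # a single Cauchy r.v, for comparison
```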

31 Conclusion
- Joint characteristic functions are useful in determining the p.d.f of linear combinations of r.vs.
- With X and Y as independent Poisson r.vs with parameters λ₁ and λ₂ respectively, let Z = X + Y.
- Then Φ_Z(u) = Φ_X(u) Φ_Y(u) = e^{λ₁(e^{ju} - 1)} e^{λ₂(e^{ju} - 1)} = e^{(λ₁ + λ₂)(e^{ju} - 1)},
- so that Z ~ P(λ₁ + λ₂), i.e., the sum of independent Poisson r.vs is also a Poisson random variable.
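A quick check of this closing statement; λ₁ and λ₂ below are arbitrary illustrative values. The sum of independent Poisson samples matches the mean, variance and p.m.f of a Poisson r.v with parameter λ₁ + λ₂.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(7)
lam1, lam2, n = 2.0, 5.0, 500_000

z = rng.poisson(lam1, n) + rng.poisson(lam2, n)

# For a Poisson(lam1 + lam2) r.v, mean and variance both equal lam1 + lam2
print(z.mean(), z.var(), lam1 + lam2)

# Compare the empirical p.m.f at a few points with the Poisson(lam1 + lam2) p.m.f
for k in (5, 7, 9):
    print(k, np.mean(z == k), exp(-(lam1 + lam2)) * (lam1 + lam2)**k / factorial(k))
```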

