Joint Moments and Joint Characteristic Functions.

Compact Representation of the Joint p.d.f
In this section we introduce various parameters that compactly represent the information contained in the joint p.d.f of two r.vs. Given two r.vs X and Y and a function g(x,y), define the r.v
Z = g(X,Y).
We can define the mean of Z to be
μ_Z = E(Z) = ∫ z f_Z(z) dz.

It is possible to express the mean of Z = g(X,Y) in terms of f_XY(x,y) without computing f_Z(z). Recall that
P(z < Z ≤ z + Δz) = f_Z(z) Δz = ∫∫_{(x,y) ∈ D_Δz} f_XY(x,y) dx dy,
where D_Δz is the region in the xy plane satisfying z < g(x,y) ≤ z + Δz. As Δz covers the entire z axis, the corresponding regions D_Δz are non-overlapping, and they cover the entire xy plane.

By integrating, we obtain
E(Z) = ∫ z f_Z(z) dz = ∫∫ g(x,y) f_XY(x,y) dx dy,
or, if X and Y are discrete-type r.vs,
E(Z) = Σ_i Σ_j g(x_i, y_j) P(X = x_i, Y = y_j).
Since expectation is a linear operator, we also get
E( Σ_k a_k g_k(X,Y) ) = Σ_k a_k E( g_k(X,Y) ).
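
As a quick illustration (not part of the original slides), the following Python sketch evaluates E[g(X,Y)] for a small hypothetical discrete joint pmf via the double sum above and checks the linearity property numerically; the pmf values and the functions g are made up for the example.

```python
import numpy as np

# Hypothetical joint pmf of X in {0, 1, 2} and Y in {0, 1} (rows: x, cols: y).
pmf = np.array([[0.10, 0.15],
                [0.20, 0.25],
                [0.05, 0.25]])
x_vals = np.array([0.0, 1.0, 2.0])
y_vals = np.array([0.0, 1.0])

def expect(g):
    """E[g(X,Y)] = sum_i sum_j g(x_i, y_j) P(X = x_i, Y = y_j)."""
    gxy = np.array([[g(x, y) for y in y_vals] for x in x_vals])
    return np.sum(gxy * pmf)

# Linearity: E[2XY + 3(X + Y)] equals 2 E[XY] + 3 E[X + Y].
lhs = expect(lambda x, y: 2 * x * y + 3 * (x + y))
rhs = 2 * expect(lambda x, y: x * y) + 3 * expect(lambda x, y: x + y)
print(lhs, rhs)   # the two values agree
```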

If X and Y are independent r.vs, then Z = g(X) and W = h(Y) are always independent of each other. Therefore
E[g(X) h(Y)] = E[g(X)] E[h(Y)], and in particular E(XY) = E(X) E(Y).
Note that this is in general not true if X and Y are not independent. How can we parametrically represent the cross-behavior between two random variables? To do so, we generalize the variance definition.

Covariance
Given any two r.vs X and Y, define
Cov(X,Y) = E[(X − μ_X)(Y − μ_Y)].
By expanding and simplifying the right side, we also get
Cov(X,Y) = E(XY) − μ_X μ_Y = E(XY) − E(X) E(Y).
It is easy to see that
|Cov(X,Y)| ≤ √(Var(X) Var(Y)).

Proof
For any real a, consider
h(a) = E{ [a(X − μ_X) + (Y − μ_Y)]² } = a² Var(X) + 2a Cov(X,Y) + Var(Y) ≥ 0,
a quadratic in the variable a with imaginary (or double) roots. Hence its discriminant must be non-positive:
4 Cov²(X,Y) − 4 Var(X) Var(Y) ≤ 0.
This gives us the desired result.

Correlation Parameter (Covariance)
Using the covariance, we may define the normalized parameter
ρ_XY = Cov(X,Y) / √(Var(X) Var(Y)) = Cov(X,Y) / (σ_X σ_Y),   −1 ≤ ρ_XY ≤ 1,
or equivalently Cov(X,Y) = ρ_XY σ_X σ_Y. This ρ_XY represents the correlation coefficient between X and Y.
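
A small Python sketch (not from the original slides) that computes the sample covariance and correlation coefficient directly from the definitions above and compares them with NumPy's built-in estimators; the sample pair below is a made-up example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical correlated samples: Y = 0.6 X + noise.
x = rng.normal(size=100_000)
y = 0.6 * x + 0.8 * rng.normal(size=100_000)

# Sample versions of the definitions above.
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)     # E(XY) - E(X)E(Y)
rho_xy = cov_xy / (np.std(x) * np.std(y))             # Cov / (sigma_X sigma_Y)

print(cov_xy, np.cov(x, y, bias=True)[0, 1])          # should nearly agree
print(rho_xy, np.corrcoef(x, y)[0, 1])                # should nearly agree
```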

Uncorrelatedness and Orthogonality
If ρ_XY = 0, then X and Y are said to be uncorrelated r.vs. From the covariance expansion, if X and Y are uncorrelated, then E(XY) = E(X) E(Y).
X and Y are said to be orthogonal if E(XY) = 0.
If either X or Y has zero mean, then orthogonality and uncorrelatedness are equivalent.

If X and Y are independent, then from E[g(X) h(Y)] = E[g(X)] E[h(Y)] with g(X) = X and h(Y) = Y we get E(XY) = E(X) E(Y). We conclude that independence implies uncorrelatedness: there cannot be any correlation between two independent r.vs. However, the converse is not true in general; random variables can be uncorrelated without being independent, as the following example shows.

Example
Let X and Y be independent r.vs, each uniformly distributed on (0,1). Define Z = X + Y and W = X − Y. Show that Z and W are dependent, but uncorrelated r.vs.
Solution
z = x + y, w = x − y gives the only solution set to be x = (z + w)/2, y = (z − w)/2. Moreover the Jacobian of the transformation satisfies |J(z,w)| = 1/2, and the constraints 0 < x < 1, 0 < y < 1 translate into 0 < z < 2, |w| < z, |w| < 2 − z.

Example - continued
Thus
f_ZW(z,w) = 1/2   for 0 < z < 2, |w| < z, |w| < 2 − z,
and zero elsewhere, and hence the marginal p.d.fs follow by integrating out the other variable.

Example - continued
The marginals are
f_Z(z) = ∫ f_ZW(z,w) dw = z for 0 < z < 1, and 2 − z for 1 < z < 2
(or by direct computation: Z = X + Y is the sum of two independent U(0,1) r.vs, so its density is triangular), and
f_W(w) = ∫ f_ZW(z,w) dz = 1 − |w|,  −1 < w < 1.
We have f_ZW(z,w) ≠ f_Z(z) f_W(w). Thus Z and W are not independent.

Example - continued
However,
E(ZW) = E[(X + Y)(X − Y)] = E(X²) − E(Y²) = 0,
and
E(W) = E(X) − E(Y) = 0,
and hence
Cov(Z,W) = E(ZW) − E(Z) E(W) = 0,
implying that Z and W are uncorrelated random variables.
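
A short simulation (not from the original slides) that illustrates this conclusion numerically: the sample correlation of Z and W is essentially zero, yet conditioning on Z visibly changes the distribution of W, so the two are dependent.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# X, Y independent and uniform on (0, 1), as in the example above.
x = rng.uniform(size=n)
y = rng.uniform(size=n)
z, w = x + y, x - y

# Uncorrelated: the sample correlation is essentially zero.
print("corr(Z, W) ~", np.corrcoef(z, w)[0, 1])

# Dependent: knowing that Z is large constrains |W| (if Z > 1.5 then |W| < 0.5).
print("P(|W| > 0.5)           ~", np.mean(np.abs(w) > 0.5))
print("P(|W| > 0.5 | Z > 1.5) ~", np.mean(np.abs(w[z > 1.5]) > 0.5))
```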

Example
Let Z = aX + bY. Determine the variance of Z in terms of σ_X, σ_Y and ρ_XY.
Solution
μ_Z = E(Z) = a μ_X + b μ_Y, and using the covariance definition,
Var(Z) = E[(Z − μ_Z)²] = E{ [a(X − μ_X) + b(Y − μ_Y)]² } = a² σ_X² + 2ab ρ_XY σ_X σ_Y + b² σ_Y².
If X and Y are independent, then ρ_XY = 0 and this reduces to
Var(Z) = a² σ_X² + b² σ_Y².
Thus the variance of the sum of independent r.vs is the sum of their variances (a = b = 1).
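
The following Python sketch (an illustration, not part of the slides) checks the variance formula above on simulated data; the coefficients and the correlated pair are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
a, b = 2.0, -1.5                          # hypothetical coefficients

# Correlated pair with known sigma_X, sigma_Y and rho.
sx, sy, rho = 1.0, 2.0, 0.4
cov = np.array([[sx**2, rho * sx * sy],
                [rho * sx * sy, sy**2]])
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

z = a * x + b * y
formula = a**2 * sx**2 + 2 * a * b * rho * sx * sy + b**2 * sy**2
print(np.var(z), formula)                 # sample variance matches the closed form
```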

Moments
The quantity
E[X^k Y^m] = ∫∫ x^k y^m f_XY(x,y) dx dy
represents the joint moment of order (k, m) for X and Y. Following the single-variable case, we can define the joint characteristic function between two random variables; it will turn out to be useful for moment calculations.

Joint Characteristic Functions
The joint characteristic function between X and Y is defined as
Φ_XY(u,v) = E[e^{j(Xu + Yv)}] = ∫∫ e^{j(xu + yv)} f_XY(x,y) dx dy.
Note that
|Φ_XY(u,v)| ≤ Φ_XY(0,0) = 1.
It is easy to show that
E(XY) = −∂²Φ_XY(u,v)/∂u∂v evaluated at u = 0, v = 0.
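
As a numerical aside (not from the original slides), the joint characteristic function can be estimated from samples as a Monte Carlo average, and the moment relation above can be checked with a central finite difference of that estimate; the sample pair and step size below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400_000

# Hypothetical correlated sample pair.
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)

def phi(u, v):
    """Monte Carlo estimate of the joint characteristic function E[exp(j(uX + vY))]."""
    return np.mean(np.exp(1j * (u * x + v * y)))

print(abs(phi(0.0, 0.0)))                 # equals 1; |phi(u, v)| <= 1 in general
print(abs(phi(1.3, -0.7)))

# E(XY) = -d^2 phi/(du dv) at (0, 0); approximate the mixed derivative by finite differences.
h = 1e-2
mixed = (phi(h, h) - phi(h, -h) - phi(-h, h) + phi(-h, -h)) / (4 * h**2)
print(-mixed.real, np.mean(x * y))        # the two estimates should be close
```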

If X and Y are independent r.vs, then from the definition of the joint characteristic function we obtain
Φ_XY(u,v) = E(e^{juX}) E(e^{jvY}) = Φ_X(u) Φ_Y(v).
Also,
Φ_X(u) = Φ_XY(u, 0) and Φ_Y(v) = Φ_XY(0, v).

More on Gaussian r.vs
X and Y are said to be jointly Gaussian, written N(μ_X, μ_Y, σ_X², σ_Y², ρ), if their joint p.d.f has the form
f_XY(x,y) = [1 / (2π σ_X σ_Y √(1 − ρ²))] exp{ −[ (x − μ_X)²/σ_X² − 2ρ(x − μ_X)(y − μ_Y)/(σ_X σ_Y) + (y − μ_Y)²/σ_Y² ] / [2(1 − ρ²)] }.
By direct substitution and simplification, we obtain the joint characteristic function of two jointly Gaussian r.vs to be
Φ_XY(u,v) = E[e^{j(Xu + Yv)}] = e^{ j(μ_X u + μ_Y v) − (σ_X² u² + 2ρ σ_X σ_Y uv + σ_Y² v²)/2 }.

By direct computation, it is easy to show that for two jointly Gaussian random variables
Cov(X,Y) = ρ σ_X σ_Y,
so the parameter ρ in N(μ_X, μ_Y, σ_X², σ_Y², ρ) represents the actual correlation coefficient of the two jointly Gaussian r.vs. Notice that ρ = 0 implies f_XY(x,y) = f_X(x) f_Y(y). Thus, if X and Y are jointly Gaussian, uncorrelatedness does imply independence between the two random variables. The Gaussian case is the only exception where the two concepts imply each other.
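
The sketch below (an illustration, not part of the slides) draws jointly Gaussian samples with hypothetical parameters and checks both Cov(X,Y) = ρ σ_X σ_Y and the closed-form joint characteristic function against Monte Carlo estimates.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500_000

mx, my, sx, sy, rho = 1.0, -2.0, 1.5, 0.5, 0.6     # hypothetical parameters
cov = np.array([[sx**2, rho * sx * sy],
                [rho * sx * sy, sy**2]])
x, y = rng.multivariate_normal([mx, my], cov, size=n).T

# Cov(X, Y) = rho * sigma_X * sigma_Y for jointly Gaussian r.vs.
print(np.cov(x, y)[0, 1], rho * sx * sy)

# Closed-form joint characteristic function vs. a Monte Carlo estimate.
u, v = 0.4, -0.9
phi_mc = np.mean(np.exp(1j * (u * x + v * y)))
phi_cf = np.exp(1j * (mx * u + my * v)
                - 0.5 * (sx**2 * u**2 + 2 * rho * sx * sy * u * v + sy**2 * v**2))
print(phi_mc, phi_cf)                               # should be close
```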

Example
Let X and Y be jointly Gaussian r.vs with parameters N(μ_X, μ_Y, σ_X², σ_Y², ρ). Define Z = aX + bY. Determine f_Z(z).
Solution
We make use of the joint characteristic function:
Φ_Z(u) = E(e^{jZu}) = E(e^{j(aX + bY)u}) = Φ_XY(au, bu) = e^{ j μ_Z u − σ_Z² u²/2 },
where
μ_Z = a μ_X + b μ_Y and σ_Z² = a² σ_X² + 2ρ a b σ_X σ_Y + b² σ_Y².

Example - continued
This has the form of the characteristic function of a Gaussian r.v. We conclude that Z = aX + bY is also Gaussian, with the mean and variance defined above. More generally, any linear combination of jointly Gaussian r.vs generates a Gaussian r.v.

Example
Suppose X and Y are jointly Gaussian r.vs as in the previous example. Define two linear combinations
Z = aX + bY,  W = cX + dY.
What can we say about their joint distribution?
Solution
The joint characteristic function of Z and W is given by
Φ_ZW(u,v) = E(e^{j(Zu + Wv)}) = E(e^{jX(au + cv) + jY(bu + dv)}) = Φ_XY(au + cv, bu + dv).

Example - continued
Thus
Φ_ZW(u,v) = e^{ j(μ_Z u + μ_W v) − (σ_Z² u² + 2 ρ_ZW σ_Z σ_W uv + σ_W² v²)/2 },
where
μ_Z = a μ_X + b μ_Y,  μ_W = c μ_X + d μ_Y,
σ_Z² = a² σ_X² + 2ρ a b σ_X σ_Y + b² σ_Y²,  σ_W² = c² σ_X² + 2ρ c d σ_X σ_Y + d² σ_Y²,
and
ρ_ZW = [a c σ_X² + (a d + b c) ρ σ_X σ_Y + b d σ_Y²] / (σ_Z σ_W).
We conclude that Z and W are also jointly distributed Gaussian r.vs with the means, variances and correlation coefficient described above.
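
A quick simulation (not from the original slides) that checks these closed-form parameters against sample statistics; the Gaussian parameters and the coefficients a, b, c, d are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000

mx, my, sx, sy, rho = 0.5, 1.0, 1.0, 2.0, -0.3     # hypothetical parameters
a, b, c, d = 1.0, 2.0, -1.0, 0.5                   # hypothetical coefficients

cov = np.array([[sx**2, rho * sx * sy],
                [rho * sx * sy, sy**2]])
x, y = rng.multivariate_normal([mx, my], cov, size=n).T
z, w = a * x + b * y, c * x + d * y

# Closed-form parameters from the slide.
mz, mw = a * mx + b * my, c * mx + d * my
vz = a**2 * sx**2 + 2 * rho * a * b * sx * sy + b**2 * sy**2
vw = c**2 * sx**2 + 2 * rho * c * d * sx * sy + d**2 * sy**2
rzw = (a * c * sx**2 + (a * d + b * c) * rho * sx * sy + b * d * sy**2) / np.sqrt(vz * vw)

print(np.mean(z), mz, np.mean(w), mw)
print(np.var(z), vz, np.var(w), vw)
print(np.corrcoef(z, w)[0, 1], rzw)                # sample values match the formulas
```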

Example - summary
Any two linear combinations of jointly Gaussian random variables (independent or dependent) are also jointly Gaussian r.vs: a linear operator with a Gaussian input produces a Gaussian output. Of course, we could have reached the same conclusion by deriving the joint p.d.f of Z and W directly. Gaussian random variables are also interesting because of the Central Limit Theorem.

Central Limit Theorem
Suppose X₁, X₂, ..., Xₙ are a set of zero-mean, independent, identically distributed (i.i.d.) random variables with some common distribution. Consider their scaled sum
Y = (X₁ + X₂ + ... + Xₙ) / √n.
Then asymptotically (as n → ∞), Y tends to a N(0, σ²) random variable, where σ² is the common variance.

Central Limit Theorem - Proof
Although the theorem is true under even more general conditions, we shall prove it here under the independence assumption. Let σ² represent the common variance of the Xᵢ. Since E(Xᵢ) = 0, we have E(Xᵢ²) = σ². Consider
Φ_Y(u) = E(e^{jYu}) = E(e^{j Σᵢ Xᵢ u/√n}) = Πᵢ E(e^{j Xᵢ u/√n}) = [Φ_X(u/√n)]ⁿ,
where we have made use of the independence of the r.vs X₁, X₂, ..., Xₙ.

Central Limit Theorem - Proof
But, expanding the characteristic function of each term,
E(e^{j Xᵢ u/√n}) = 1 + (ju/√n) E(Xᵢ) − (u²/2n) E(Xᵢ²) + o(1/n) = 1 − σ² u²/(2n) + o(1/n).
Also, since lim_{n→∞} (1 − x/n)ⁿ = e^{−x}, as n → ∞ we get
Φ_Y(u) = [1 − σ² u²/(2n) + o(1/n)]ⁿ → e^{−σ² u²/2}.
This is the characteristic function of a zero-mean normal r.v with variance σ², and this completes the proof.

Central Limit Theorem - Conclusion
A large sum of independent random variables, each with finite variance, tends to behave like a normal random variable; the individual p.d.fs become unimportant when analysing the behaviour of the sum. If we model a noise phenomenon as the sum of a large number of independent random variables (e.g., electron motion in resistor components), this theorem allows us to conclude that the noise behaves like a Gaussian r.v.
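
A short simulation (an illustration, not from the original slides) of the theorem: scaled sums of i.i.d. zero-mean uniform terms are compared with the limiting normal distribution. NumPy and SciPy are assumed to be available; the term distribution and sizes are arbitrary choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, trials = 500, 20_000

# Zero-mean i.i.d. terms with finite variance (here uniform on (-1, 1), a hypothetical choice).
x = rng.uniform(-1.0, 1.0, size=(trials, n))
y = x.sum(axis=1) / np.sqrt(n)            # scaled sum Y = (X1 + ... + Xn) / sqrt(n)

sigma2 = 1.0 / 3.0                        # variance of U(-1, 1)
print(np.var(y), sigma2)                  # the variance of Y is close to sigma^2

# Kolmogorov-Smirnov distance to N(0, sigma^2) is small, consistent with the CLT.
d, _ = stats.kstest(y, "norm", args=(0.0, np.sqrt(sigma2)))
print("KS distance:", d)
```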

Central Limit Theorem - Conclusion
The finite-variance assumption is necessary. For example, let the r.vs Xᵢ be Cauchy distributed, and let
Y = (X₁ + X₂ + ... + Xₙ) / √n,
where each Xᵢ ~ C(α). Then, since Φ_{Xᵢ}(u) = e^{−α|u|}, we get
Φ_Y(u) = [Φ_X(u/√n)]ⁿ = e^{−α √n |u|},
which shows that Y is still Cauchy, with parameter α√n; it does not tend to a Gaussian r.v.
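
The following sketch (not from the slides) illustrates the counterexample: scaled sums of Cauchy terms remain Cauchy with scale α√n and keep their heavy tails; the parameter values are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, trials, alpha = 300, 20_000, 1.0       # alpha: Cauchy scale parameter (hypothetical value)

x = stats.cauchy.rvs(scale=alpha, size=(trials, n), random_state=rng)
y = x.sum(axis=1) / np.sqrt(n)

# Y is again Cauchy, now with scale alpha * sqrt(n) -- it never becomes Gaussian.
d, _ = stats.kstest(y, "cauchy", args=(0.0, alpha * np.sqrt(n)))
print("KS distance vs Cauchy(0, alpha*sqrt(n)):", d)   # small

# Heavy tails persist: for a Gaussian of comparable width this probability would be negligible.
print("P(|Y| > 20*alpha*sqrt(n)) ~", np.mean(np.abs(y) > 20 * alpha * np.sqrt(n)))
```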

Sum of Independent Poisson r.vs
Joint characteristic functions are useful in determining the p.d.f of linear combinations of r.vs. For example, let X and Y be independent Poisson r.vs with parameters λ₁ and λ₂ respectively, and let Z = X + Y. Then
Φ_Z(u) = Φ_X(u) Φ_Y(u) = e^{λ₁(e^{ju} − 1)} e^{λ₂(e^{ju} − 1)} = e^{(λ₁ + λ₂)(e^{ju} − 1)},
so that Z is Poisson with parameter λ₁ + λ₂; i.e., the sum of independent Poisson r.vs is also a Poisson random variable.
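
A final sketch (not from the slides) that verifies this numerically by comparing the empirical pmf of a sum of independent Poisson samples with the Poisson(λ₁ + λ₂) pmf; the rates are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
lam1, lam2, n = 3.0, 5.0, 500_000          # hypothetical rates

z = rng.poisson(lam1, n) + rng.poisson(lam2, n)

# Empirical pmf of Z vs. the Poisson(lam1 + lam2) pmf.
for k in range(5, 12):
    print(k, np.mean(z == k), stats.poisson.pmf(k, lam1 + lam2))
```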