PROBABILITY AND STATISTICS FOR ENGINEERING Hossein Sameti Department of Computer Engineering Sharif University of Technology Two Functions of Two Random Variables

Two Functions of Two RV.s
 X and Y are two random variables with joint p.d.f $f_{XY}(x,y)$, and $g(x,y)$ and $h(x,y)$ are two functions.
 Define the new random variables: $Z = g(X,Y)$, $W = h(X,Y)$.
 How do we determine the joint p.d.f $f_{ZW}(z,w)$?
 With $f_{ZW}(z,w)$ in hand, the marginal p.d.fs $f_Z(z)$ and $f_W(w)$ can be easily determined.

Two Functions of Two RV.s
 For given z and w,
$$F_{ZW}(z,w) = P(Z \le z,\ W \le w) = P\big(g(X,Y) \le z,\ h(X,Y) \le w\big) = \iint_{D_{z,w}} f_{XY}(x,y)\,dx\,dy,$$
 where $D_{z,w}$ is the region in the xy plane such that $g(x,y) \le z$ and $h(x,y) \le w$.

Example
 X and Y are independent uniformly distributed r.vs in $(0,\theta)$.
 Define $Z = \max(X,Y)$, $W = \min(X,Y)$.
 Determine $F_{ZW}(z,w)$.
Solution
 Obviously both w and z vary in the interval $(0,\theta)$. Thus we need to consider two cases: $w \ge z$ and $w < z$.

Example - continued
 For $w \ge z$: since $\min(X,Y) \le \max(X,Y)$, the event $\{Z \le z\}$ already implies $\{W \le w\}$, so
$$F_{ZW}(z,w) = P(\max(X,Y) \le z) = P(X \le z)P(Y \le z) = \frac{z^2}{\theta^2}.$$
 For $w < z$: with $\{Z \le z,\ W \le w\} = \{\max(X,Y) \le z\} \setminus \{w < X \le z,\ w < Y \le z\}$ we obtain
$$F_{ZW}(z,w) = \frac{z^2}{\theta^2} - \frac{(z-w)^2}{\theta^2}.$$
 Thus
$$F_{ZW}(z,w) = \begin{cases} \dfrac{z^2 - (z-w)^2}{\theta^2}, & 0 < w < z < \theta, \\[4pt] \dfrac{z^2}{\theta^2}, & 0 < z \le w < \theta. \end{cases}$$

Example - continued
 Also, the joint p.d.f follows by differentiation:
$$f_{ZW}(z,w) = \frac{\partial^2 F_{ZW}(z,w)}{\partial z\,\partial w} = \begin{cases} \dfrac{2}{\theta^2}, & 0 < w < z < \theta, \\ 0, & \text{otherwise}. \end{cases}$$
 If $g(x,y)$ and $h(x,y)$ are continuous and differentiable functions, it is possible to develop a formula to obtain the joint p.d.f $f_{ZW}(z,w)$ directly.
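A minimal Monte Carlo sketch (my own illustration, not part of the original slides) to check the joint c.d.f derived above for i.i.d. uniforms on $(0,\theta)$; the parameter values are arbitrary choices:

```python
# Check F_ZW(z, w) = (z/theta)^2 - ((z-w)/theta)^2 for 0 < w < z < theta,
# where Z = max(X, Y), W = min(X, Y), X and Y i.i.d. uniform on (0, theta).
import numpy as np

rng = np.random.default_rng(1)
theta, n = 2.0, 1_000_000
x, y = rng.uniform(0, theta, n), rng.uniform(0, theta, n)
z_samp, w_samp = np.maximum(x, y), np.minimum(x, y)

z, w = 1.5, 0.5                                    # a point with w < z
est = np.mean((z_samp <= z) & (w_samp <= w))
exact = (z / theta)**2 - ((z - w) / theta)**2
print(f"estimate {est:.4f} vs exact {exact:.4f}")  # agree to ~3 decimals
```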

Two Functions of Two RV.s
 Let's consider the system of equations
$$g(x,y) = z, \qquad h(x,y) = w.$$
 Let us say $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$ are the solutions to the above equations, so that $g(x_i, y_i) = z$ and $h(x_i, y_i) = w$ for each i.

Two Functions of Two RV.s
Consider the problem of evaluating the probability
$$P(z < Z \le z + \Delta z,\ w < W \le w + \Delta w).$$
This can be rewritten as:
$$P(z < g(X,Y) \le z + \Delta z,\ w < h(X,Y) \le w + \Delta w).$$
To translate this probability in terms of $f_{XY}(x,y)$, we need to evaluate the equivalent region for $(X,Y)$ in the xy plane.

Two Functions of Two RV.s
 The point A with coordinates (z, w) gets mapped onto the points $A_i'$ with coordinates $(x_i, y_i)$.
 As z changes to $z + \Delta z$ to point B in figure (a),
- let $B_i'$ represent its image in the xy plane, shown in figure (b).
 As w changes to $w + \Delta w$ to C,
- let $C_i'$ represent its image in the xy plane.

Two Functions of Two RV.s
 Finally D goes to $D_i'$.
 $A_i' B_i' C_i' D_i'$ represents the equivalent parallelogram in the xy plane with area $\Delta_i$.
 The desired probability can be alternatively expressed as
$$\sum_i P\big((X,Y) \in A_i'B_i'C_i'D_i'\big) = \sum_i f_{XY}(x_i, y_i)\,\Delta_i.$$
 Equating these, we obtain
$$f_{ZW}(z,w)\,\Delta z\,\Delta w = \sum_i f_{XY}(x_i, y_i)\,\Delta_i.$$
 To simplify this, we need to evaluate the area $\Delta_i$ of the parallelograms in terms of $\Delta z\,\Delta w$.

Two Functions of Two RV.s
 Let $g_1(z,w)$ and $h_1(z,w)$ denote the inverse transformations, so that $x_i = g_1(z,w)$ and $y_i = h_1(z,w)$.
 As the point (z, w) goes to $(z + \Delta z, w)$,
- the point $A_i'$ with coordinates $(x_i, y_i)$ goes to $B_i'$.
 Hence the x and y coordinates of $B_i'$ are given by:
$$x_i + \frac{\partial g_1}{\partial z}\Delta z, \qquad y_i + \frac{\partial h_1}{\partial z}\Delta z.$$
 Similarly, those of $C_i'$ are given by:
$$x_i + \frac{\partial g_1}{\partial w}\Delta w, \qquad y_i + \frac{\partial h_1}{\partial w}\Delta w.$$

Two Functions of Two RV.s
 The area of the parallelogram $A_i'B_i'C_i'D_i'$ is given by
$$\Delta_i = (A_i'B_i')(A_i'C_i')\sin(\theta - \varphi).$$
 From the figure and these equations,
$$(A_i'B_i')\cos\varphi = \frac{\partial g_1}{\partial z}\Delta z, \quad (A_i'B_i')\sin\varphi = \frac{\partial h_1}{\partial z}\Delta z, \quad (A_i'C_i')\cos\theta = \frac{\partial g_1}{\partial w}\Delta w, \quad (A_i'C_i')\sin\theta = \frac{\partial h_1}{\partial w}\Delta w,$$
 so that
$$\Delta_i = \left( \frac{\partial g_1}{\partial z}\frac{\partial h_1}{\partial w} - \frac{\partial g_1}{\partial w}\frac{\partial h_1}{\partial z} \right)\Delta z\,\Delta w,$$
 or
$$\frac{\Delta_i}{\Delta z\,\Delta w} = \left| \frac{\partial g_1}{\partial z}\frac{\partial h_1}{\partial w} - \frac{\partial g_1}{\partial w}\frac{\partial h_1}{\partial z} \right|.$$

Two Functions of Two RV.s
 This determinant is the Jacobian $J(z,w)$ of the transformation $(z,w) \to (x,y)$:
$$J(z,w) = \begin{vmatrix} \partial g_1/\partial z & \partial g_1/\partial w \\ \partial h_1/\partial z & \partial h_1/\partial w \end{vmatrix}.$$
 Using these in the equation above, we get
$$f_{ZW}(z,w) = \sum_i |J(z,w)|\,f_{XY}(x_i, y_i) = \sum_i \frac{1}{|J(x_i, y_i)|}\,f_{XY}(x_i, y_i), \qquad (*)$$
 where $J(x_i, y_i)$ represents the Jacobian of the original transformation, evaluated at the solution pair:
$$J(x_i, y_i) = \begin{vmatrix} \partial g/\partial x & \partial g/\partial y \\ \partial h/\partial x & \partial h/\partial y \end{vmatrix}_{(x,y)=(x_i,y_i)} = \frac{1}{J(z,w)}.$$
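As a sketch of how ( * ) can be mechanized, the snippet below (my own illustration) uses sympy to find the inverse branches and the Jacobian for the illustrative transformation $z = x+y$, $w = x-y$ applied to two standard Gaussians; the transformation and density are assumptions, not from the slides:

```python
# f_ZW(z, w) = sum_i f_XY(x_i, y_i) / |J(x_i, y_i)|, computed symbolically.
import sympy as sp

x, y, z, w = sp.symbols("x y z w", real=True)
g, h = x + y, x - y                       # the transformation z = g, w = h

# Jacobian of the original transformation, J(x, y)
J = sp.Matrix([[sp.diff(g, x), sp.diff(g, y)],
               [sp.diff(h, x), sp.diff(h, y)]]).det()

# Solve for the inverse branches (x_i, y_i) in terms of (z, w)
branches = sp.solve([sp.Eq(g, z), sp.Eq(h, w)], [x, y], dict=True)

# Illustrative joint p.d.f: X, Y i.i.d. standard Gaussian
f_xy = sp.exp(-(x**2 + y**2) / 2) / (2 * sp.pi)

f_zw = sum(f_xy.subs(b) / sp.Abs(J.subs(b)) for b in branches)
print(sp.simplify(f_zw))  # exp(-(z**2 + w**2)/4)/(4*pi): two independent N(0, 2)
```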

Example
 Suppose X and Y are zero mean independent Gaussian r.vs with common variance $\sigma^2$.
 Define $Z = \sqrt{X^2 + Y^2}$, $W = \tan^{-1}(Y/X)$, where $|w| < \pi/2$.
 Obtain $f_{ZW}(z,w)$.
Solution
 Here
$$f_{XY}(x,y) = \frac{1}{2\pi\sigma^2}\, e^{-(x^2+y^2)/2\sigma^2}.$$
 Since $\tan w = y/x$ is periodic with period $\pi$, if $(x_1, y_1)$ is a solution pair, so is $(-x_1, -y_1)$. Thus $y_1 = x_1 \tan w$.

Example - continued
 Substituting this into z, we get $z = \sqrt{x_1^2(1 + \tan^2 w)}$, and hence $x_1 = z\cos w$, $y_1 = z\sin w$.
 Thus there are two solution sets:
$$x_1 = z\cos w,\ y_1 = z\sin w; \qquad x_2 = -z\cos w,\ y_2 = -z\sin w.$$
 The Jacobian of the original transformation is
$$J(x,y) = \begin{vmatrix} x/\sqrt{x^2+y^2} & y/\sqrt{x^2+y^2} \\ -y/(x^2+y^2) & x/(x^2+y^2) \end{vmatrix} = \frac{1}{\sqrt{x^2+y^2}} = \frac{1}{z},$$
 so that $|J(x_1, y_1)| = |J(x_2, y_2)| = 1/z$.

Example - continued
 Also, $J(z,w)$ is
$$J(z,w) = \begin{vmatrix} \cos w & -z\sin w \\ \sin w & z\cos w \end{vmatrix} = z.$$
 Notice that here also $|J(z,w)|$ is the same for both solution pairs.
 Using ( * ),
$$f_{ZW}(z,w) = z\big[f_{XY}(x_1,y_1) + f_{XY}(x_2,y_2)\big] = \frac{z}{\pi\sigma^2}\, e^{-z^2/2\sigma^2}, \qquad 0 < z < \infty,\ |w| < \frac{\pi}{2}.$$
 Thus
$$f_Z(z) = \int_{-\pi/2}^{\pi/2} f_{ZW}(z,w)\,dw = \frac{z}{\sigma^2}\, e^{-z^2/2\sigma^2}, \qquad 0 < z < \infty,$$
 which represents a Rayleigh r.v with parameter $\sigma^2$.

Example - continued
 Also,
$$f_W(w) = \int_0^{\infty} f_{ZW}(z,w)\,dz = \frac{1}{\pi}, \qquad |w| < \frac{\pi}{2},$$
 which represents a uniform r.v in $(-\pi/2, \pi/2)$.
 Moreover, $f_{ZW}(z,w) = f_Z(z)\,f_W(w)$.
 So Z and W are independent.
 To summarize, if X and Y are zero mean independent Gaussian random variables with common variance, then
- $\sqrt{X^2 + Y^2}$ has a Rayleigh distribution,
- $\tan^{-1}(Y/X)$ has a uniform distribution.
- These two derived r.vs are statistically independent.

Example - continued
 Alternatively, with X and Y as independent zero mean Gaussian r.vs with common variance, $X + jY$ represents a complex Gaussian r.v. But
$$X + jY = Z e^{jW},$$
 where Z and W are as above, except that for this representation to hold good on the entire complex plane we must have $-\pi < W \le \pi$ (the phase taken as the two-argument arctangent).
 The magnitude and phase of a complex Gaussian r.v are independent, with Rayleigh and uniform distributions respectively.
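A quick empirical check of this summary (my own sketch, with an arbitrary $\sigma$): the sample magnitude should match the Rayleigh mean $\sigma\sqrt{\pi/2}$, the phase should look uniform on $(-\pi, \pi]$, and the two should be essentially uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, n = 1.5, 1_000_000
x = sigma * rng.standard_normal(n)
y = sigma * rng.standard_normal(n)
z, w = np.hypot(x, y), np.arctan2(y, x)   # magnitude and phase

print("E[Z] ~", z.mean(), "vs sigma*sqrt(pi/2) =", sigma * np.sqrt(np.pi / 2))
print("phase mean ~", w.mean(), "var ~", w.var(), "vs 0 and pi^2/3 =", np.pi**2 / 3)
print("corr(Z, W) ~", np.corrcoef(z, w)[0, 1])   # close to 0
```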

Example
 Let X and Y be independent exponential random variables with common parameter $\lambda$.
 Define U = X + Y, V = X - Y.
 Find the joint and marginal p.d.f of U and V.
Solution
 It is given that
$$f_{XY}(x,y) = \lambda^2 e^{-\lambda(x+y)}, \qquad x > 0,\ y > 0.$$
 Now since u = x + y and v = x - y, we always have $|v| \le u$, and there is only one solution, given by
$$x = \frac{u+v}{2}, \qquad y = \frac{u-v}{2}.$$

Example - continued
 Moreover, the Jacobian of the transformation is given by
$$J(x,y) = \begin{vmatrix} 1 & 1 \\ 1 & -1 \end{vmatrix} = -2,$$
 and hence
$$f_{UV}(u,v) = \frac{1}{2}\,f_{XY}\!\left(\frac{u+v}{2}, \frac{u-v}{2}\right) = \frac{\lambda^2}{2}\, e^{-\lambda u}, \qquad 0 < |v| < u < \infty,$$
 represents the joint p.d.f of U and V. This gives
$$f_U(u) = \int_{-u}^{u} \frac{\lambda^2}{2}\, e^{-\lambda u}\,dv = \lambda^2 u\, e^{-\lambda u}, \qquad 0 < u < \infty,$$
 and
$$f_V(v) = \int_{|v|}^{\infty} \frac{\lambda^2}{2}\, e^{-\lambda u}\,du = \frac{\lambda}{2}\, e^{-\lambda|v|}, \qquad -\infty < v < \infty.$$
 Notice that in this case the r.vs U and V are not independent: $f_{UV}(u,v) \ne f_U(u)\,f_V(v)$.
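A small simulation sketch (not from the slides; $\lambda$ is an arbitrary choice) confirming the derived marginals, an Erlang-2 density for U and a Laplace density for V, and the dependence between them:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n = 0.7, 1_000_000
x = rng.exponential(1 / lam, n)
y = rng.exponential(1 / lam, n)
u, v = x + y, x - y

print("E[U] ~", u.mean(), "vs 2/lam =", 2 / lam)          # Erlang-2 mean
print("Var[V] ~", v.var(), "vs 2/lam^2 =", 2 / lam**2)    # Laplace variance
# U and V are not independent; in particular |V| <= U always:
print("P(|V| > U) ~", np.mean(np.abs(v) > u))              # exactly 0
```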

As we will show, the general transformation formula in ( * ), which makes use of two functions, remains useful even when only one function is specified.

Auxiliary Variables
 Suppose $Z = g(X,Y)$, where
 X and Y are two random variables.
 To determine $f_Z(z)$ by making use of the above formulation in ( * ), we can define an auxiliary variable W = X or W = Y,
 and the p.d.f of Z can be obtained from $f_{ZW}(z,w)$ by proper integration.

Example
 Z = X + Y.
 Let W = Y, so that the transformation is one-to-one and the solution is given by $x = z - w$, $y = w$.
 The Jacobian of the transformation is given by
$$J(x,y) = \begin{vmatrix} 1 & 1 \\ 0 & 1 \end{vmatrix} = 1,$$
 and hence
$$f_{ZW}(z,w) = f_{XY}(z-w, w),$$
 or
$$f_Z(z) = \int_{-\infty}^{\infty} f_{XY}(z-w, w)\,dw.$$
 This reduces to the convolution of $f_X$ and $f_Y$ if X and Y are independent random variables.
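A numerical sketch of the convolution formula (my own illustration; the Exp(1) densities, grid, and step size are arbitrary choices): discretizing $f_Z(z) = \int f_X(z-w) f_Y(w)\,dw$ for two independent Exp(1) r.vs should reproduce the Erlang-2 density $z e^{-z}$:

```python
import numpy as np

dw = 0.01
w = np.arange(0, 20, dw)
f_x = np.exp(-w)                  # X ~ Exp(1)
f_y = np.exp(-w)                  # Y ~ Exp(1)

f_z = np.convolve(f_x, f_y) * dw  # discretized convolution integral
z = np.arange(len(f_z)) * dw
exact = z * np.exp(-z)            # Erlang(2, 1) density
print("max abs error:", np.max(np.abs(f_z - exact)))  # O(dw) discretization error
```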

Example
 Let $X \sim U(0,1)$ and $Y \sim U(0,1)$ be independent.
 Define $Z = \sqrt{-2\ln X}\,\cos(2\pi Y)$.
 Find the density function of Z.
Solution
 Making use of the auxiliary variable W = Y, the solution is
$$x_1 = e^{-z^2/2\cos^2(2\pi w)}, \qquad y_1 = w.$$

Example - continued
 Using these in ( * ), we obtain
$$f_{ZW}(z,w) = \frac{|z|}{\cos^2(2\pi w)}\, e^{-z^2/2\cos^2(2\pi w)},$$
 and
$$f_Z(z) = \int_0^1 f_{ZW}(z,w)\,dw.$$
 Let $u = z\tan(2\pi w)$, so that $du = 2\pi z\sec^2(2\pi w)\,dw$ and $z^2/\cos^2(2\pi w) = z^2 + u^2$.
 Notice that as w varies from 0 to 1 (over the region where a solution exists), u varies from $-\infty$ to $+\infty$.

Example - continued
 Using this in the previous formula, we get
$$f_Z(z) = \frac{1}{2\pi} e^{-z^2/2} \int_{-\infty}^{\infty} e^{-u^2/2}\,du = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}, \qquad -\infty < z < \infty.$$
 As you can see, $Z \sim N(0,1)$.
 A practical procedure to generate Gaussian random variables from two independent uniformly distributed random sequences is therefore based on
$$Z = \sqrt{-2\ln X}\,\cos(2\pi Y)$$
 (the Box-Muller method).
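A minimal sketch of this generation procedure (my own code, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(4)

def box_muller(n):
    """n i.i.d. N(0,1) samples from uniforms via Z = sqrt(-2 ln X) cos(2 pi Y)."""
    u1 = rng.uniform(size=n)
    u2 = rng.uniform(size=n)
    return np.sqrt(-2.0 * np.log(u1)) * np.cos(2.0 * np.pi * u2)

z = box_muller(1_000_000)
print("mean ~", z.mean(), " var ~", z.var())   # close to 0 and 1
```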

Example
 Let X and Y be independent identically distributed Geometric random variables with
$$P(X = k) = P(Y = k) = p\,q^k, \qquad k = 0, 1, 2, \ldots,\quad q = 1 - p.$$
 (a) Show that min(X, Y) and X - Y are independent random variables.
 (b) Show that min(X, Y) and max(X, Y) - min(X, Y) are also independent.
Solution
 (a) Let Z = min(X, Y), and W = X - Y.
 Note that Z takes only nonnegative integer values, while W takes positive, zero, and negative integer values.

Example - continued
 We have P(Z = m, W = n) = P{min(X, Y) = m, X - Y = n}.
 But for $n > 0$ this event equals $\{Y = m,\ X = m + n\}$; for $n < 0$ it equals $\{X = m,\ Y = m + |n|\}$; and for $n = 0$ it equals $\{X = Y = m\}$.
 Thus, using independence,
$$P(Z = m, W = n) = p^2 q^{2m + |n|}, \qquad m = 0, 1, 2, \ldots,\ n = 0, \pm 1, \pm 2, \ldots$$

Example - continued
This represents the joint probability mass function of the random variables Z and W. Also,
$$P(Z = m) = \sum_{n=-\infty}^{\infty} P(Z = m, W = n) = p^2 q^{2m}\Big(1 + 2\sum_{n=1}^{\infty} q^n\Big) = p^2 q^{2m}\,\frac{1+q}{1-q} = (1 - q^2)\,(q^2)^m.$$

Example - continued
 Thus Z represents a Geometric random variable with parameter $q^2$, since $P(Z = m) = (1 - q^2)(q^2)^m$, and
$$P(W = n) = \sum_{m=0}^{\infty} p^2 q^{2m + |n|} = \frac{p^2 q^{|n|}}{1 - q^2}, \qquad n = 0, \pm 1, \pm 2, \ldots$$
 Note that
$$P(Z = m, W = n) = P(Z = m)\,P(W = n),$$
 establishing the independence of the random variables Z and W.
 The independence of X - Y and min(X, Y) when X and Y are independent Geometric random variables is an interesting observation.

Example - continued
 (b) Let Z = min(X, Y), R = max(X, Y) - min(X, Y).
 In this case both Z and R take nonnegative integer values.
 Proceeding as in part (a), for $n \ge 1$ the event $\{Z = m,\ R = n\}$ occurs for the two orderings $\{X = m,\ Y = m + n\}$ and $\{X = m + n,\ Y = m\}$, so we get
$$P(Z = m, R = n) = \begin{cases} 2 p^2 q^{2m + n}, & m = 0, 1, 2, \ldots,\ n = 1, 2, \ldots, \\ p^2 q^{2m}, & m = 0, 1, 2, \ldots,\ n = 0. \end{cases}$$

Example - continued
 This equation is the joint probability mass function of Z and R.
 Also we can obtain:
$$P(Z = m) = \sum_{n=0}^{\infty} P(Z = m, R = n) = p^2 q^{2m}\Big(1 + 2\sum_{n=1}^{\infty} q^n\Big) = (1 - q^2)(q^2)^m,$$
 and
$$P(R = n) = \sum_{m=0}^{\infty} P(Z = m, R = n) = \begin{cases} \dfrac{2 p^2 q^n}{1 - q^2}, & n = 1, 2, \ldots, \\[4pt] \dfrac{p^2}{1 - q^2}, & n = 0. \end{cases}$$
 From these last three equations, we get
$$P(Z = m, R = n) = P(Z = m)\,P(R = n).$$
 This proves the independence of the random variables Z and R as well.
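An empirical sketch of both parts (my own illustration; the parameter values and spot-check points are arbitrary): compare joint frequencies of (Z, W) and (Z, R) against products of the marginal frequencies.

```python
import numpy as np

rng = np.random.default_rng(5)
p, n = 0.3, 1_000_000
# numpy's geometric has support {1, 2, ...}; subtract 1 to get P(k) = p*q^k, k >= 0
x = rng.geometric(p, n) - 1
y = rng.geometric(p, n) - 1

z, w = np.minimum(x, y), x - y                  # part (a)
r = np.maximum(x, y) - np.minimum(x, y)         # part (b)

for m, k in [(0, 0), (1, 2), (2, 1)]:
    jw = np.mean((z == m) & (w == k)); pw = np.mean(z == m) * np.mean(w == k)
    jr = np.mean((z == m) & (r == k)); pr = np.mean(z == m) * np.mean(r == k)
    print(f"m={m},n={k}:  (a) {jw:.5f} vs {pw:.5f}   (b) {jr:.5f} vs {pr:.5f}")
```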