1 Two Functions of Two Random Variables In the spirit of the previous lecture, let us look at an immediate generalization: Suppose X and Y are two random variables with joint p.d.f $f_{XY}(x,y)$. Given two functions $g(x,y)$ and $h(x,y)$, define the new random variables

$Z = g(X,Y), \qquad W = h(X,Y).$  (9-1)

How does one determine their joint p.d.f $f_{ZW}(z,w)$? Obviously with $f_{ZW}(z,w)$ in hand, the marginal p.d.fs $f_Z(z)$ and $f_W(w)$ can be easily determined:

$f_Z(z)=\int_{-\infty}^{\infty} f_{ZW}(z,w)\,dw, \qquad f_W(w)=\int_{-\infty}^{\infty} f_{ZW}(z,w)\,dz.$  (9-2)

2 The procedure is the same as that in (8-3). In fact, for given z and w,

$F_{ZW}(z,w)=P\big(Z(\xi)\le z,\,W(\xi)\le w\big)=P\big(g(X,Y)\le z,\,h(X,Y)\le w\big)=\iint_{D_{z,w}} f_{XY}(x,y)\,dx\,dy,$  (9-3)

where $D_{z,w}$ is the region in the xy plane such that the inequalities $g(x,y)\le z$ and $h(x,y)\le w$ are simultaneously satisfied (Fig. 9.1). We illustrate this technique in the next example.

3 Example 9.1: Suppose X and Y are independent uniformly distributed random variables in the interval $(0,\theta)$. Define

$Z = \max(X,Y), \qquad W = \min(X,Y).$  (9-4)

Determine $f_{ZW}(z,w)$.

Solution: Obviously both z and w vary in the interval $(0,\theta)$. Thus

$F_{ZW}(z,w)=P(Z\le z,\,W\le w)=P\big(\max(X,Y)\le z,\,\min(X,Y)\le w\big).$  (9-5)

We must consider two cases: $w\le z$ and $w>z$, since they give rise to different regions for $D_{z,w}$ (see Figs. 9.2 (a)-(b)).

4 For $w\le z$, from Fig. 9.2 (a), the region $D_{z,w}$ is represented by the doubly shaded area, the union of the rectangles $[0,w]\times[0,z]$ and $[0,z]\times[0,w]$. Thus

$F_{ZW}(z,w)=\dfrac{2zw-w^2}{\theta^2}, \qquad 0<w\le z<\theta,$  (9-6)

and for $w>z$, from Fig. 9.2 (b), we obtain

$F_{ZW}(z,w)=\dfrac{z^2}{\theta^2}, \qquad 0<z<w<\theta.$  (9-7)

With

$f_{ZW}(z,w)=\dfrac{\partial^2 F_{ZW}(z,w)}{\partial z\,\partial w},$  (9-8)

we obtain

$f_{ZW}(z,w)=\dfrac{2}{\theta^2}$ for $0<w\le z<\theta$, and $0$ for $w>z$.  (9-9)

Thus

$f_{ZW}(z,w)=\begin{cases}2/\theta^2, & 0<w\le z<\theta,\\ 0, & \text{otherwise}.\end{cases}$  (9-10)
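The c.d.f derived above can be sanity-checked by simulation. The sketch below (Python, standard library only) assumes the standard form of this example — $Z=\max(X,Y)$, $W=\min(X,Y)$ with $\theta=1$, which is a reconstruction, not stated explicitly in the transcript — and compares the empirical probability $P(Z\le z_0, W\le w_0)$ against the closed form $(2z_0w_0-w_0^2)/\theta^2$ for $w_0\le z_0$:

```python
import random

# Monte Carlo check of Example 9.1 (as reconstructed: Z = max(X,Y),
# W = min(X,Y), X and Y i.i.d. uniform on (0, theta)).
# For w <= z the c.d.f. should be F(z,w) = (2*z*w - w^2) / theta^2.

def empirical_cdf(z, w, theta=1.0, n=200_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x, y = rng.uniform(0, theta), rng.uniform(0, theta)
        if max(x, y) <= z and min(x, y) <= w:
            hits += 1
    return hits / n

theta = 1.0
z0, w0 = 0.8, 0.5                                # a point with w0 <= z0
analytic = (2 * z0 * w0 - w0**2) / theta**2      # = 0.55
est = empirical_cdf(z0, w0, theta)
print(round(analytic, 3), round(est, 3))
```

With 200,000 samples the empirical value agrees with 0.55 to within a few thousandths.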

5 From (9-10), we also obtain

$f_Z(z)=\int_0^z f_{ZW}(z,w)\,dw=\dfrac{2z}{\theta^2}, \qquad 0<z<\theta,$  (9-11)

and

$f_W(w)=\int_w^\theta f_{ZW}(z,w)\,dz=\dfrac{2(\theta-w)}{\theta^2}, \qquad 0<w<\theta.$  (9-12)

If $g(x,y)$ and $h(x,y)$ are continuous and differentiable functions, then as in the case of one random variable (see (5-30)) it is possible to develop a formula to obtain the joint p.d.f $f_{ZW}(z,w)$ directly. Towards this, consider the equations

$g(x,y)=z, \qquad h(x,y)=w.$  (9-13)

For a given point (z,w), equation (9-13) can have many solutions. Let us say

$(x_1,y_1),\,(x_2,y_2),\,\ldots,\,(x_n,y_n)$

6 represent these multiple solutions such that (see Fig. 9.3 (a)-(b))

$g(x_i,y_i)=z, \qquad h(x_i,y_i)=w.$  (9-14)

Consider the problem of evaluating the probability

$P(z<Z\le z+\Delta z,\,w<W\le w+\Delta w)=P\big(z<g(X,Y)\le z+\Delta z,\,w<h(X,Y)\le w+\Delta w\big).$  (9-15)

7 Carrying this argument to the limit, the rectangle $\Delta z\,\Delta w$ maps back onto n small parallelograms around the solution pairs, and we obtain

$f_{ZW}(z,w)=\sum_{i=1}^{n}\dfrac{1}{|J(x_i,y_i)|}\,f_{XY}(x_i,y_i)=\sum_{i=1}^{n}|J_i(z,w)|\,f_{XY}(x_i,y_i),$  (9-29)

where $J(x_i,y_i)$ represents the Jacobian of the original transformation, given by

$J(x,y)=\begin{vmatrix}\partial z/\partial x & \partial z/\partial y\\ \partial w/\partial x & \partial w/\partial y\end{vmatrix}$  (9-28)

evaluated at $(x_i,y_i)$, and $J_i(z,w)$ is the Jacobian of the inverse transformation,

$J_i(z,w)=\begin{vmatrix}\partial x_i/\partial z & \partial x_i/\partial w\\ \partial y_i/\partial z & \partial y_i/\partial w\end{vmatrix},$  (9-31)

the two being related through

$|J_i(z,w)|=\dfrac{1}{|J(x_i,y_i)|}.$  (9-30)
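The reciprocal relation between the forward and inverse Jacobians (cited later as (9-30)) is easy to verify numerically. The sketch below estimates both determinants by central differences, using the linear transformation $z=x+y$, $w=x-y$ of Example 9.3, whose Jacobians are known exactly ($-2$ and $-1/2$):

```python
# Numerical illustration of the reciprocity |J(z,w)| = 1/|J(x,y)| between
# the Jacobians of a transformation and its inverse, for z = x+y, w = x-y.

def jac_det(f, g, x, y, h=1e-6):
    """2x2 Jacobian determinant of the map (f, g) at (x, y), by central differences."""
    fx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    fy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    gx = (g(x + h, y) - g(x - h, y)) / (2 * h)
    gy = (g(x, y + h) - g(x, y - h)) / (2 * h)
    return fx * gy - fy * gx

# forward map (x, y) -> (z, w) and its inverse (z, w) -> (x, y)
Jxy = jac_det(lambda x, y: x + y, lambda x, y: x - y, 0.7, 0.2)
Jzw = jac_det(lambda z, w: (z + w) / 2, lambda z, w: (z - w) / 2, 0.9, 0.5)
print(Jxy, Jzw)   # approximately -2 and -0.5
```

The same `jac_det` helper (a name introduced here for illustration) can be pointed at any differentiable pair g, h to check a hand-computed Jacobian.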

8 Next we shall illustrate the usefulness of the formula in (9-29) through various examples.

Example 9.2: Suppose X and Y are zero mean independent Gaussian r.vs with common variance $\sigma^2$. Define $Z=\sqrt{X^2+Y^2}$ and $W=\tan^{-1}(Y/X)$, where $|w|<\pi/2$. Obtain $f_{ZW}(z,w)$.

Solution: Here

$f_{XY}(x,y)=\dfrac{1}{2\pi\sigma^2}\,e^{-(x^2+y^2)/2\sigma^2}.$  (9-32)

Since

$z=\sqrt{x^2+y^2}, \qquad w=\tan^{-1}(y/x),$  (9-33)

if $(x,y)$ is a solution pair, so is $(-x,-y)$. From (9-33),

$y=x\tan w.$  (9-34)

9 Substituting this into z, we get

$z=\sqrt{x^2+x^2\tan^2 w}=|x|\sec w \;\Longrightarrow\; x=\pm z\cos w,$  (9-35)

and

$y=x\tan w=\pm z\sin w.$  (9-36)

Thus there are two solution sets

$(x_1,y_1)=(z\cos w,\,z\sin w), \qquad (x_2,y_2)=(-z\cos w,\,-z\sin w).$  (9-37)

We can use (9-35) - (9-37) to obtain

$f_{XY}(x_1,y_1)=f_{XY}(x_2,y_2)=\dfrac{1}{2\pi\sigma^2}\,e^{-z^2/2\sigma^2}.$  (9-38)

From (9-28),

$J(x,y)=\begin{vmatrix} x/\sqrt{x^2+y^2} & y/\sqrt{x^2+y^2}\\ -y/(x^2+y^2) & x/(x^2+y^2)\end{vmatrix}=\dfrac{1}{\sqrt{x^2+y^2}},$

so that

$|J(x_1,y_1)|=|J(x_2,y_2)|=\dfrac{1}{z}.$  (9-39)

10 We can also compute $J_i(z,w)$ using (9-31). From (9-33), $x=\pm z\cos w$ and $y=\pm z\sin w$, so that

$J_i(z,w)=\begin{vmatrix}\cos w & -z\sin w\\ \sin w & z\cos w\end{vmatrix}=z(\cos^2 w+\sin^2 w)=z.$  (9-40)

Notice that $|J_i(z,w)|=1/|J(x_i,y_i)|$, agreeing with (9-30). Substituting (9-37) and (9-39) or (9-40) into (9-29), we get

$f_{ZW}(z,w)=\sum_{i=1}^{2}|J_i(z,w)|\,f_{XY}(x_i,y_i)=\dfrac{2z}{2\pi\sigma^2}\,e^{-z^2/2\sigma^2}=\dfrac{z}{\pi\sigma^2}\,e^{-z^2/2\sigma^2}, \qquad 0<z<\infty,\;|w|<\dfrac{\pi}{2}.$  (9-41)

Thus

$f_Z(z)=\dfrac{z}{\sigma^2}\,e^{-z^2/2\sigma^2}, \qquad 0<z<\infty,$  (9-42)

which represents a Rayleigh r.v with parameter $\sigma^2$, and

$f_W(w)=\dfrac{1}{\pi}, \qquad |w|<\dfrac{\pi}{2},$  (9-43)

which represents a uniform r.v. Since $f_{ZW}(z,w)=f_Z(z)f_W(w)$, the r.vs Z and W are independent.
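The Rayleigh result in (9-42) implies $E[Z]=\sigma\sqrt{\pi/2}$, which gives a quick Monte Carlo cross-check (standard library only):

```python
import math, random

# Sanity check of (9-42): Z = sqrt(X^2 + Y^2) for independent zero-mean
# Gaussians with common variance sigma^2 is Rayleigh, so E[Z] = sigma*sqrt(pi/2).

rng = random.Random(7)
sigma = 2.0
n = 200_000
zs = [math.hypot(rng.gauss(0, sigma), rng.gauss(0, sigma)) for _ in range(n)]
mean_z = sum(zs) / n
print(round(mean_z, 3), round(sigma * math.sqrt(math.pi / 2), 3))
```

For $\sigma=2$ the target mean is about 2.507, and the sample mean lands within a few hundredths of it.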

11 Example 9.3: Let X and Y be independent exponential random variables with common parameter $\lambda$. Define U = X + Y, V = X - Y. Find the joint and marginal p.d.f of U and V.

Solution: It is given that

$f_{XY}(x,y)=\lambda^2 e^{-\lambda(x+y)}, \qquad x>0,\;y>0.$  (9-46)

Now since u = x + y and v = x - y, always $u\ge|v|$, and there is only one solution, given by

$x=\dfrac{u+v}{2}, \qquad y=\dfrac{u-v}{2}.$  (9-47)

Moreover the Jacobian of the transformation is given by

$J(x,y)=\begin{vmatrix}1 & 1\\ 1 & -1\end{vmatrix}=-2,$

12 and hence

$f_{UV}(u,v)=\dfrac{1}{|J|}\,f_{XY}\!\Big(\dfrac{u+v}{2},\dfrac{u-v}{2}\Big)=\dfrac{\lambda^2}{2}\,e^{-\lambda u}, \qquad 0<|v|<u<\infty,$  (9-48)

represents the joint p.d.f of U and V. This gives

$f_U(u)=\int_{-u}^{u}\dfrac{\lambda^2}{2}\,e^{-\lambda u}\,dv=\lambda^2 u\,e^{-\lambda u}, \qquad 0<u<\infty,$  (9-49)

and

$f_V(v)=\int_{|v|}^{\infty}\dfrac{\lambda^2}{2}\,e^{-\lambda u}\,du=\dfrac{\lambda}{2}\,e^{-\lambda|v|}, \qquad -\infty<v<\infty.$  (9-50)

Notice that in this case the r.vs U and V are not independent, since $f_{UV}(u,v)\ne f_U(u)f_V(v)$. As we show below, the general transformation formula in (9-29) making use of two functions can be made useful even when only one function is specified.
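The marginals in (9-49)-(9-50) fix the means $E[U]=2/\lambda$ (gamma) and $E[|V|]=1/\lambda$ (from the two-sided exponential form of $f_V$), which a short simulation can confirm:

```python
import random

# Sanity check of (9-49)-(9-50): with X, Y i.i.d. exponential(lam),
# U = X + Y has E[U] = 2/lam and V = X - Y satisfies E[|V|] = 1/lam.

rng = random.Random(3)
lam = 0.5
n = 200_000
mean_u = 0.0
mean_abs_v = 0.0
for _ in range(n):
    x, y = rng.expovariate(lam), rng.expovariate(lam)
    mean_u += (x + y) / n
    mean_abs_v += abs(x - y) / n
print(round(mean_u, 2), round(mean_abs_v, 2))   # near 4.0 and 2.0 for lam = 0.5
```

Note `random.expovariate` takes the rate $\lambda$, matching the parameterization used in (9-46).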

13 Auxiliary Variables: Suppose

$Z=g(X,Y),$  (9-51)

where X and Y are two random variables. To determine $f_Z(z)$ by making use of the above formulation in (9-29), we can define an auxiliary variable W = X or W = Y, and the p.d.f of Z can then be obtained from $f_{ZW}(z,w)$ by proper integration.

Example 9.4: Suppose Z = X + Y and let W = Y, so that the transformation is one-to-one and the solution is given by

$x_1=z-w, \qquad y_1=w.$  (9-52)

14 The Jacobian of the transformation is given by

$J(x,y)=\begin{vmatrix}1 & 1\\ 0 & 1\end{vmatrix}=1,$

and hence

$f_{ZW}(z,w)=f_{XY}(x_1,y_1)=f_{XY}(z-w,\,w),$

or

$f_Z(z)=\int_{-\infty}^{\infty}f_{ZW}(z,w)\,dw=\int_{-\infty}^{\infty}f_{XY}(z-w,\,w)\,dw,$  (9-53)

which agrees with (8-7). Note that (9-53) reduces to the convolution of $f_X$ and $f_Y$ if X and Y are independent random variables:

$f_Z(z)=\int_{-\infty}^{\infty}f_X(z-w)\,f_Y(w)\,dw.$  (9-54)

Next, we consider a less trivial example.

Example 9.5: Let  and  be independent. Define
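The convolution result referenced in (9-53)-(9-54) can be verified against Example 9.3: for two exponential densities with rate $\lambda$, the convolution integral should reproduce $f_U(u)=\lambda^2 u\,e^{-\lambda u}$ of (9-49). A numerical sketch:

```python
import math

# Verify (9-54) for Z = X + Y with X, Y i.i.d. exponential(lam): the
# convolution of two exponential densities should equal lam^2 * z * exp(-lam*z),
# the gamma density found in (9-49).

def f_exp(t, lam):
    return lam * math.exp(-lam * t) if t >= 0 else 0.0

def conv_density(z, lam, steps=20_000):
    # trapezoidal evaluation of the integral of f_X(z-w)*f_Y(w) over w in [0, z]
    h = z / steps
    total = 0.0
    for k in range(steps + 1):
        w = k * h
        weight = 0.5 if k in (0, steps) else 1.0
        total += weight * f_exp(z - w, lam) * f_exp(w, lam)
    return total * h

lam, z = 1.5, 2.0
num = conv_density(z, lam)
exact = lam**2 * z * math.exp(-lam * z)
print(round(num, 6), round(exact, 6))
```

The two values agree essentially to machine precision here, since the integrand is constant in w on [0, z].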