Chapter 5a: Functions of Random Variables (Yang Zhenlin, STAT306/STAT151, SMU)

Contents:
- Functions of One Random Variable --- the change-of-variable technique
- Functions of Two Random Variables --- the change-of-variable technique
- Sums of Independent Random Variables --- the moment generating function (MGF) technique

The main purpose of this chapter is to introduce methods for finding the distribution of a function of the random variable(s).

Functions of One Random Variable

The case of a continuous random variable.

Definition 5a.1 (Change-of-Variable Technique). Let X be a continuous-type random variable with pdf f(x). Let Y = u(X) be a one-to-one transformation of X with inverse function X = v(Y). Then the pdf of Y is given by

    g(y) = f[v(y)] |v'(y)|,

where v'(y) is the derivative of v(y). If the possible values of X are c_1 < x < c_2, then the possible values of Y are u(c_1) < y < u(c_2).

Example 5a.1. Let X have a gamma distribution with pdf

    f(x) = \frac{1}{\Gamma(\alpha)\,\theta^{\alpha}}\, x^{\alpha-1} e^{-x/\theta},  0 < x < \infty.

Let Y = e^X. Find the pdf of Y.

Solution: Since the inverse function is X = v(Y) = ln(Y), v'(y) = 1/y. Thus, by Definition 5a.1, the pdf of Y is given by

    g(y) = f(\ln y) \cdot \frac{1}{y}.

Since the support of X is (0, \infty), the support of Y is (1, \infty). The pdf of Y is thus

    g(y) = \frac{1}{\Gamma(\alpha)\,\theta^{\alpha}}\, (\ln y)^{\alpha-1}\, y^{-1/\theta - 1},  1 < y < \infty.

The way to see the change-of-variable technique is through the CDF: G(y) = P{Y ≤ y} = P{X ≤ v(y)} = F[v(y)]. Taking derivatives leads to g(y) = f[v(y)] |v'(y)|. So the change-of-variable technique is essentially the CDF technique.
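As a quick sanity check (an addition, not part of the original slides), the derived pdf can be compared against simulation. The sketch below assumes NumPy and SciPy are available and picks alpha = 2, theta = 1 arbitrarily for illustration:

import numpy as np
from math import gamma as gamma_fn
from scipy.integrate import quad

alpha, theta = 2.0, 1.0   # illustrative values; the slides leave (alpha, theta) general
rng = np.random.default_rng(0)
x = rng.gamma(shape=alpha, scale=theta, size=500_000)
y = np.exp(x)             # the transformation Y = e^X

def g(y):
    # derived pdf: g(y) = (ln y)^(alpha-1) y^(-1/theta-1) / (Gamma(alpha) theta^alpha), y > 1
    return (np.log(y) ** (alpha - 1)) * y ** (-1.0 / theta - 1.0) / (gamma_fn(alpha) * theta ** alpha)

for p in (2.0, 5.0, 20.0):
    empirical = (y <= p).mean()
    theoretical, _ = quad(g, 1.0, p)   # integrate the derived pdf over (1, p)
    print(f"P(Y <= {p}): simulated {empirical:.4f}, from derived pdf {theoretical:.4f}")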

The case of a discrete random variable.

The change-of-variable technique can also be applied to a random variable X of the discrete type, but there is a major difference: the pmf p(x) = P{X = x} represents probability, whereas the pdf f(x) does not. For a one-to-one transformation Y = u(X) with inverse X = v(Y), we can easily see that the pmf g(y) of Y is

    g(y) = P{Y = y} = P{X = v(y)} = p[v(y)].

The possible values of Y are found directly from the possible values of X through the functional relation Y = u(X).

Example 5a.2. Let X have a Poisson distribution with λ = 4. Find the pmf of Y = X^{1/2}. Since X = Y^2, we have

    g(y) = P{X = y^2} = \frac{4^{y^2} e^{-4}}{(y^2)!},  y = 0, 1, \sqrt{2}, \sqrt{3}, 2, \ldots
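The discrete rule is easy to verify directly; a minimal sketch (not from the slides), using only the Python standard library:

from math import exp, factorial, isclose, sqrt

lam = 4.0
def p(k):                      # Poisson(4) pmf of X
    return lam ** k * exp(-lam) / factorial(k)

def g(y):                      # pmf of Y = sqrt(X): g(y) = p(y^2)
    return p(round(y * y))

total = sum(g(sqrt(k)) for k in range(100))   # sum over the support of Y
print(isclose(total, 1.0))     # True: g is a valid pmf
print(g(2.0))                  # P(Y = 2) = P(X = 4) = 4^4 e^{-4}/4! ≈ 0.1954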

The case of a non-one-to-one function of a continuous r.v.

The change-of-variable technique requires that the function be one-to-one, so it cannot be applied when the function is not one-to-one. However, as noted earlier, the distribution of a function of a random variable is essentially developed from the CDF. Thus, the distribution of a non-one-to-one function can still be derived from the CDF. We demonstrate this idea by showing an important result: the square of a standard normal random variable is a gamma r.v. with parameters (1/2, 2); this special gamma r.v. is called the chi-squared random variable with 1 degree of freedom.

Example 5a.3. Let Z be a standard normal r.v. and let X = Z^2. The CDF of X is, for x > 0,

    F(x) = P{Z^2 ≤ x} = P{-\sqrt{x} ≤ Z ≤ \sqrt{x}} = 2\Phi(\sqrt{x}) - 1,

where \Phi denotes the standard normal CDF. Taking the derivative with respect to x, we obtain the pdf of X:

    f(x) = 2\phi(\sqrt{x}) \cdot \frac{1}{2\sqrt{x}} = \frac{1}{\sqrt{2\pi x}}\, e^{-x/2},  0 < x < \infty.

Recognizing that \Gamma(1/2) = \sqrt{\pi}, the above is the pdf of a gamma r.v. with parameters (1/2, 2),

    f(x) = \frac{1}{\Gamma(1/2)\, 2^{1/2}}\, x^{1/2 - 1} e^{-x/2},

called the chi-squared distribution with 1 d.f.
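This result is easy to confirm by simulation; the sketch below (an addition, assuming SciPy) compares the empirical distribution of Z^2 with the gamma(1/2, 2) and chi-squared(1) CDFs, which coincide:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
z = rng.standard_normal(500_000)
x = z ** 2                                # X = Z^2

for p in (0.5, 1.0, 3.84):                # 3.84 is the familiar 95% point of chi2(1)
    print(f"P(X <= {p}): simulated {(x <= p).mean():.4f}, "
          f"gamma(1/2, 2) {stats.gamma.cdf(p, a=0.5, scale=2):.4f}, "
          f"chi2(1) {stats.chi2.cdf(p, df=1):.4f}")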

Functions of Two Random Variables

The change-of-variable technique can be extended to joint distributions involving two or more random variables, which allows many interesting problems to be solved.

Definition 5a.2 (Change-of-Variable Technique). Let X_1 and X_2 be two continuous-type random variables with joint pdf f(x_1, x_2). Let Y_1 = u_1(X_1, X_2) and Y_2 = u_2(X_1, X_2) be two continuous functions with single-valued inverses X_1 = v_1(Y_1, Y_2) and X_2 = v_2(Y_1, Y_2). Then the joint pdf of Y_1 and Y_2 is

    g(y_1, y_2) = f[v_1(y_1, y_2), v_2(y_1, y_2)]\, |J|,

where J, called the Jacobian, is the determinant of the matrix of partial derivatives:

    J = \begin{vmatrix} \partial v_1/\partial y_1 & \partial v_1/\partial y_2 \\ \partial v_2/\partial y_1 & \partial v_2/\partial y_2 \end{vmatrix}.

Example 5a.4. Let X_1 and X_2 be two independent r.v.s, each with pdf f(x) = e^{-x}, 0 < x < \infty. Consider Y_1 = X_1 - X_2 and Y_2 = X_1 + X_2.
(a) Find the joint pdf of Y_1 and Y_2.
(b) Find the marginal pdfs of Y_1 and Y_2, respectively.

Solution: (a) The inverse functions are X_1 = (Y_1 + Y_2)/2 and X_2 = (Y_2 - Y_1)/2, so the Jacobian is

    J = \begin{vmatrix} 1/2 & 1/2 \\ -1/2 & 1/2 \end{vmatrix} = \frac{1}{2}.

By independence, f(x_1, x_2) = e^{-x_1 - x_2}, and hence

    g(y_1, y_2) = \frac{1}{2}\, e^{-y_2},  -y_2 < y_1 < y_2,  0 < y_2 < \infty,

where the possible values of Y_1 and Y_2 are found as follows: Y_2 = X_1 + X_2 implies 0 < Y_2 < \infty; X_1 = (Y_1 + Y_2)/2 > 0 implies Y_1 > -Y_2; X_2 = (Y_2 - Y_1)/2 > 0 implies Y_1 < Y_2.

[Figure: the region of (y_1, y_2) values, the wedge above the lines y_2 = y_1 and y_2 = -y_1.]

(b) The marginal pdf of Y_2 is

    g_2(y_2) = \int_{-y_2}^{y_2} \frac{1}{2} e^{-y_2}\, dy_1 = y_2\, e^{-y_2},  0 < y_2 < \infty.

The marginal pdf of Y_1: for y_1 ≥ 0,

    g_1(y_1) = \int_{y_1}^{\infty} \frac{1}{2} e^{-y_2}\, dy_2 = \frac{1}{2} e^{-y_1},

and for y_1 < 0,

    g_1(y_1) = \int_{-y_1}^{\infty} \frac{1}{2} e^{-y_2}\, dy_2 = \frac{1}{2} e^{y_1}.

The latter expressions can simply be written as

    g_1(y_1) = \frac{1}{2} e^{-|y_1|},  -\infty < y_1 < \infty,

which is called the double exponential pdf.
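Both marginals can be checked by simulation; the following sketch (an addition, assuming NumPy and SciPy) uses the facts that y_2 e^{-y_2} is the gamma(2, 1) pdf and that the double exponential is SciPy's Laplace distribution:

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x1 = rng.exponential(scale=1.0, size=500_000)
x2 = rng.exponential(scale=1.0, size=500_000)
y1, y2 = x1 - x2, x1 + x2

# Y2 = X1 + X2 should follow gamma(2, 1); Y1 = X1 - X2 the double exponential
print(f"P(Y2 <= 2): simulated {(y2 <= 2).mean():.4f}, gamma(2, 1) {stats.gamma.cdf(2, a=2, scale=1):.4f}")
print(f"P(Y1 <= 1): simulated {(y1 <= 1).mean():.4f}, Laplace {stats.laplace.cdf(1, loc=0, scale=1):.4f}")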

Definition 5a.3. Let X and Y be jointly distributed r.v.s with joint pmf p(x, y) or joint pdf f(x, y), and let u(X, Y) be a continuous function of X and Y. Then u(X, Y) is also a random variable. If X and Y are both discrete,

    E[u(X, Y)] = \sum_x \sum_y u(x, y)\, p(x, y),

and if X and Y are both continuous,

    E[u(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} u(x, y)\, f(x, y)\, dx\, dy.

If X and Y are jointly distributed r.v.s, then

    Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).

If, further, X and Y are independent, then

    Var(X + Y) = Var(X) + Var(Y).

Example 5a.5. The joint probability distribution of X and Y, each taking the values 1, 2, 3, is given by a table of joint probabilities p(x, y). [The full table did not survive extraction; the entries used below are p(1, 1) = 0.20, p(2, 2) = 0.09, p(3, 3) = 0.10, with marginals p_X(1) = 0.42 and p_Y(1) = 0.50.]
(a) Determine the marginal probability distributions of X and Y.
(b) Are X and Y independent? Explain.
(c) Find the probability mass function of X + Y.
(d) Find P(X ≠ Y).

Solution: (a) The marginal pmfs are obtained by summing the joint probabilities over the other variable: p_X(x) = \sum_y p(x, y) and p_Y(y) = \sum_x p(x, y).

(b) No. p(1, 1) = 0.20, but p_X(1)\, p_Y(1) = 0.42 × 0.50 = 0.21 ≠ 0.20, so X and Y are not independent.

(c) Let Z = X + Y. Then Z takes the values 2, 3, 4, 5, 6, with pmf p_Z(z) = \sum_x p(x, z - x). For example, p_Z(3) = P(X + Y = 3) = P(X = 1, Y = 2) + P(X = 2, Y = 1). [The numerical values of p_Z were not recovered.]

(d) P(X ≠ Y) = 1 - P(X = Y) = 1 - P(X = 1, Y = 1) - P(X = 2, Y = 2) - P(X = 3, Y = 3) = 1 - 0.20 - 0.09 - 0.10 = 0.61.
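Since the table's entries did not survive extraction, the sketch below fills in a hypothetical joint pmf (the off-diagonal values are invented, chosen only to be consistent with the recovered entries and to sum to 1) to show how parts (a), (c), and (d) are computed mechanically from any such table:

# Hypothetical joint pmf p(x, y) for x, y in {1, 2, 3}. Only the diagonal entries
# and the marginals pX(1) = 0.42, pY(1) = 0.50 come from the slides; the rest
# are illustrative fill-ins.
joint = {
    (1, 1): 0.20, (1, 2): 0.15, (1, 3): 0.07,
    (2, 1): 0.10, (2, 2): 0.09, (2, 3): 0.05,
    (3, 1): 0.20, (3, 2): 0.04, (3, 3): 0.10,
}

# (a) marginals: sum the joint pmf over the other variable
pX = {x: sum(p for (a, b), p in joint.items() if a == x) for x in (1, 2, 3)}
pY = {y: sum(p for (a, b), p in joint.items() if b == y) for y in (1, 2, 3)}

# (c) pmf of Z = X + Y: collect probability over each value of x + y
pZ = {}
for (x, y), p in joint.items():
    pZ[x + y] = pZ.get(x + y, 0.0) + p

# (d) P(X != Y) = 1 - P(X = Y)
p_neq = 1.0 - sum(joint[(k, k)] for k in (1, 2, 3))

print(pX[1], pY[1])                   # 0.42 0.50
print(joint[(1, 1)], pX[1] * pY[1])   # 0.20 vs 0.21 -> not independent
print(sorted(pZ.items()))             # pmf of Z on {2, 3, 4, 5, 6}
print(round(p_neq, 2))                # 0.61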

Sums of Independent Random Variables

Recall the uniqueness property of the MGF: the MGF of a r.v. X uniquely determines its distribution, and vice versa; e.g., if the MGF of X is the same as that of a normal r.v., then X must be normally distributed.

Using this property, one can easily see the following results:
- The sum of independent binomial r.v.s with the same probability of success p is again a binomial r.v.
- The sum of independent Poisson r.v.s is again a Poisson r.v.
- The sum of independent exponential r.v.s with the same mean is a gamma r.v.
- The sum of independent normal r.v.s is again a normal r.v.
- And more....

The sum of independent normal r.v.s is again a normal r.v.

Recall the MGF of X ~ N(μ, σ^2):

    M_X(t) = \exp\!\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right).

If one can show that the MGF of a random variable has the same form as above, then one can conclude that this random variable is normal, with mean and variance given by the quantities in front of t and t^2/2, respectively.

To demonstrate the result using the MGF technique, consider two independent normal random variables X_1 ~ N(μ_1, σ_1^2) and X_2 ~ N(μ_2, σ_2^2), and let Y = X_1 + X_2. The MGF of Y is

    M_Y(t) = E[e^{t(X_1 + X_2)}] = E[e^{tX_1}]\, E[e^{tX_2}] = M_{X_1}(t)\, M_{X_2}(t),

where the factorization follows from the independence of X_1 and X_2.

It follows that

    M_Y(t) = \exp\!\left(\mu_1 t + \tfrac{1}{2}\sigma_1^2 t^2\right) \exp\!\left(\mu_2 t + \tfrac{1}{2}\sigma_2^2 t^2\right) = \exp\!\left[(\mu_1 + \mu_2)t + \tfrac{1}{2}(\sigma_1^2 + \sigma_2^2)t^2\right].

Recognizing that this MGF has the same form as the MGF of a normal random variable, Y must be normally distributed. In particular,

    Y = X_1 + X_2 ~ N(μ_1 + μ_2, σ_1^2 + σ_2^2).

This result extends easily to the sum of any number of independent normal r.v.s.
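As an optional numerical check (not in the original slides), the result can be verified by simulation with arbitrarily chosen parameters:

import numpy as np
from scipy import stats

mu1, sig1, mu2, sig2 = 1.0, 2.0, -0.5, 1.5   # arbitrary illustrative parameters
rng = np.random.default_rng(3)
y = rng.normal(mu1, sig1, 500_000) + rng.normal(mu2, sig2, 500_000)

# Theory: Y ~ N(mu1 + mu2, sig1^2 + sig2^2)
print(f"mean: simulated {y.mean():.3f}, theory {mu1 + mu2:.3f}")
print(f"var : simulated {y.var():.3f}, theory {sig1**2 + sig2**2:.3f}")
print(f"P(Y <= 2): simulated {(y <= 2).mean():.4f}, "
      f"theory {stats.norm.cdf(2, loc=mu1 + mu2, scale=(sig1**2 + sig2**2) ** 0.5):.4f}")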

The sum of independent binomial r.v.s with the same probability of success p is again a binomial r.v. To see this, use the MGF technique (whiteboard presentation; a sketch of the argument follows).
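A sketch of the whiteboard argument, added here for completeness: if X_1 ~ Bin(n_1, p) and X_2 ~ Bin(n_2, p) are independent, then since M_{X_i}(t) = (1 - p + p e^t)^{n_i},

    M_{X_1 + X_2}(t) = (1 - p + p e^t)^{n_1} (1 - p + p e^t)^{n_2} = (1 - p + p e^t)^{n_1 + n_2},

which is the MGF of Bin(n_1 + n_2, p); by the uniqueness property, X_1 + X_2 ~ Bin(n_1 + n_2, p). Note that the common p is essential for the two factors to combine.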

The sum of independent Poisson r.v.s is again a Poisson r.v. To see this, use the MGF technique (whiteboard presentation; a sketch follows).
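A sketch, added for completeness: if X_1 ~ Poisson(λ_1) and X_2 ~ Poisson(λ_2) are independent, then since M_{X_i}(t) = \exp[\lambda_i(e^t - 1)],

    M_{X_1 + X_2}(t) = \exp[\lambda_1(e^t - 1)]\, \exp[\lambda_2(e^t - 1)] = \exp[(\lambda_1 + \lambda_2)(e^t - 1)],

which is the MGF of Poisson(λ_1 + λ_2); here no restriction on the λ's is needed.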

The sum of independent exponential r.v.s with the same mean is a gamma r.v. To see this, use the MGF technique (whiteboard presentation; a sketch follows).
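A sketch, added for completeness: if X_1, ..., X_n are independent exponential r.v.s with common mean θ, then since M_{X_i}(t) = (1 - \theta t)^{-1} for t < 1/θ,

    M_{X_1 + \cdots + X_n}(t) = (1 - \theta t)^{-n},

which is the MGF of the gamma distribution with parameters (n, θ). This is consistent with Example 5a.4, where Y_2 = X_1 + X_2 ~ gamma(2, 1).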

Functions of Normal R.V.s

In Example 5a.3, we showed that if Z is a standard normal r.v., then X = Z^2 follows a chi-squared distribution with 1 d.f., which is a special gamma r.v. We also showed, using the MGF technique, that the sum of two independent normal r.v.s is again normally distributed. There are many other functions of normal r.v.(s) whose distributions are of interest. In particular, in the context of statistical inference, functions of a random sample drawn from a normal population, or of two random samples drawn from two independent normal populations, are needed for drawing statistical inferences about the normal populations. We put these under the general topic "Sampling Distributions", with details presented in Chapter 5b.