Distributions of Functions of Random Variables November 18, 2015

Presentation transcript:

Probability and Statistical Inference (9th Edition), Chapter 5 (Part 1/2): Distributions of Functions of Random Variables. November 18, 2015

Outline 5.1 Functions of One Random Variable 5.2 Transformation of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique

Functions of One Random Variable Let X be a continuous random variable with pdf f(x). If we consider a function of X, say Y = u(X), then Y is also a random variable with its own distribution. The cdf of Y is G(y) = P(Y <= y) = P(u(X) <= y). The pdf of Y is g(y) = G'(y) (where the apostrophe ' denotes the derivative).

Functions of One Random Variable Change-of-variable technique: Let X be a continuous random variable with pdf f(x) and support c1 < x < c2. We begin by taking Y = u(X) to be a continuous increasing function of X with inverse function X = v(Y). Say the support of X maps onto the support of Y, d1 = u(c1) < y < d2 = u(c2). Then, the cdf of Y is G(y) = P(Y <= y) = P(u(X) <= y) = P(X <= v(y)) = F(v(y)), for d1 < y < d2. Thus, G(y) equals the integral of f(x) over c1 < x < v(y).

Functions of One Random Variable The derivative, g(y) = G'(y), of such an expression is given by g(y) = f(v(y)) v'(y), where v'(y) > 0 because v is increasing. Suppose now the function Y = u(X) and its inverse X = v(Y) are continuous decreasing functions. Then, G(y) = P(X >= v(y)) = 1 - F(v(y)). Thus, g(y) = -f(v(y)) v'(y), where now v'(y) < 0.

Functions of One Random Variable Thus, for both the increasing and decreasing cases, we can write the pdf of Y as g(y) = f(v(y)) |v'(y)|, for d1 < y < d2.
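A minimal numeric sketch of this formula (not from the original slides; the distribution and transformation are my own choices for illustration): take X ~ Exponential(1) and Y = sqrt(X), so v(y) = y², |v'(y)| = 2y, and the formula gives g(y) = 2y·exp(-y²) for y > 0. A histogram estimate of the density of simulated Y values should agree with g.

# Sketch: numerically checking g(y) = f(v(y)) * |v'(y)| for Y = sqrt(X), X ~ Exponential(1).
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)
y = np.sqrt(x)                       # Y = u(X) = sqrt(X), an increasing function

def g(yv):                           # pdf of Y from the change-of-variable formula
    return 2 * yv * np.exp(-yv**2)

# Compare a histogram estimate of the density of Y with g(y) at a few points.
hist, edges = np.histogram(y, bins=200, range=(0, 4), density=True)
centers = (edges[:-1] + edges[1:]) / 2
for point in (0.5, 1.0, 1.5):
    i = np.argmin(np.abs(centers - point))
    print(point, hist[i], g(point))  # the two density estimates should be close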

Functions of One Random Variable Example 1: a distribution for which all intervals of equal length on the support are equally probable is the uniform distribution.

Functions of One Random Variable Theorem 1: Suppose that a random variable X has a continuous distribution for which the cdf is F. Then, the random variable Y = F(X) has a uniform distribution on (0, 1). Random variables from any given continuous distribution can therefore be converted to random variables having a uniform distribution, and vice versa.

Functions of One Random Variable Illustration: for X ~ N(0,1), with F the cdf of N(0,1), the transformed variable Y = F(X) ~ U(0,1). (The slide shows the pdf and cdf of N(0,1) together with the resulting uniform density.)
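As a quick check of Theorem 1 (a sketch I added, not part of the slides), one can simulate X ~ N(0,1), apply the N(0,1) cdf, and test the result for uniformity:

# Sketch: if X ~ N(0,1) and F is its cdf, then Y = F(X) should be uniform on (0,1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
y = stats.norm.cdf(x)                      # probability integral transform

print(y.min(), y.max(), y.mean())          # roughly 0, 1, and 0.5
print(stats.kstest(y, "uniform"))          # large p-value: consistent with U(0,1)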

Functions of One Random Variable Theorem 1 (converse statement): If U is a uniform random variable on (0,1), then the random variable X = F⁻¹(U) has cdf F (where F is a continuous cdf and F⁻¹ is its inverse function). Proof: P(X <= x) = P(F⁻¹(U) <= x) = P(U <= F(x)) = F(x).

Functions of One Random Variable Theorem 1 (converse statement) can be used to generate random variables having any given continuous distribution. To generate values of X distributed according to the cdf F: 1. Generate a random number u from U (a uniform random variable on (0,1)). 2. Compute the value x such that F(x) = u. 3. Take x to be the random number distributed according to the cdf F.
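A minimal sketch of this recipe (my illustration, assuming the Exponential(lam) distribution, whose cdf F(x) = 1 - exp(-lam·x) inverts to F⁻¹(u) = -ln(1 - u)/lam):

# Sketch: inverse-transform sampling for the Exponential(lam) distribution.
import numpy as np

def sample_exponential(n, lam, rng):
    u = rng.uniform(size=n)          # step 1: u ~ U(0,1)
    return -np.log1p(-u) / lam       # steps 2-3: x = F^{-1}(u)

rng = np.random.default_rng(2)
x = sample_exponential(100_000, lam=2.0, rng=rng)
print(x.mean(), 1 / 2.0)             # sample mean should be near 1/lam = 0.5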

Functions of One Random Variable Example 2 (the transformation Y = u(X) is not one-to-one): Let Y = X², where X is Cauchy with pdf f(x) = 1/(π(1 + x²)). Then, for y > 0, G(y) = P(X² <= y) = P(-√y <= X <= √y) = F(√y) - F(-√y), where F is the Cauchy cdf. Thus, g(y) = [f(√y) + f(-√y)] / (2√y) = 1/(π √y (1 + y)), for y > 0. In this case of a two-to-one transformation, we need to sum two terms, each of which is similar to the one-to-one case.

Functions of One Random Variable Consider the discrete case. Let X be a discrete random variable with pmf f(x) = P(X = x). Let Y = u(X) be a one-to-one transformation with inverse X = v(Y). Then, the pmf of Y is g(y) = P(Y = y) = P(X = v(y)) = f(v(y)). Note that, in the discrete case, the derivative |v'(y)| is not needed.

Functions of One Random Variable Example 3: Let X be a uniform random variable on {1,2,…,n}. Then Y=X+a is a uniform random variable on {a+1,a+2,…,a+n}

Transformations of Two Random Variables If X1 and X2 are two continuous random variables with joint pdf f(x1, x2), and if Y1 = u1(X1, X2), Y2 = u2(X1, X2) has the single-valued inverse X1 = v1(Y1, Y2), X2 = v2(Y1, Y2), then the joint pdf of Y1 and Y2 is g(y1, y2) = |J| f(v1(y1, y2), v2(y1, y2)), where | | denotes absolute value and J is the Jacobian, the determinant J = det [ ∂x1/∂y1  ∂x1/∂y2 ; ∂x2/∂y1  ∂x2/∂y2 ]. The derivative of the one-variable case is replaced by the Jacobian.

Transformations of Two Random Variables Example 1: Let X1 and X2 be independent random variables, each with the same pdf f(x) (as given on the slide). Hence, by independence, their joint pdf is f(x1) f(x2). Let Y1 = X1 - X2, Y2 = X1 + X2. Thus, x1 = (y1 + y2)/2, x2 = (y2 - y1)/2, and the Jacobian is J = det [ 1/2  1/2 ; -1/2  1/2 ] = 1/2.

Transformations of Two Random Variables Then, the joint pdf of Y1 and Y2 is g(y1, y2) = (1/2) f((y1 + y2)/2) f((y2 - y1)/2), on the image of the original support.

Transformations of Two Random Variables Example 2 (Box-Muller Transformation): Let X1 and X2 be i.i.d. U(0,1). Let Z1 = √(-2 ln X1) cos(2π X2) and Z2 = √(-2 ln X1) sin(2π X2). Thus, the inverse transformation is X1 = exp(-Q/2) and X2 = (1/(2π)) arctan(Z2/Z1), where Q = Z1² + Z2², and the Jacobian satisfies |J| = (1/(2π)) exp(-Q/2).

Transformations of Two Random Variables Since the joint pdf of X1 and X2 is f(x1, x2) = 1 for 0 < x1 < 1, 0 < x2 < 1, it follows that the joint pdf of Z1 and Z2 is g(z1, z2) = |J| = (1/(2π)) exp(-(z1² + z2²)/2). This is the joint pdf of two i.i.d. N(0,1) random variables. Hence, we can generate two i.i.d. N(0,1) random variables from two i.i.d. U(0,1) random variables using this method.
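A short sketch of the resulting generator (my own code illustrating the method described above, not taken from the slides):

# Sketch: Box-Muller — turn pairs of U(0,1) draws into pairs of independent N(0,1) draws.
import numpy as np

def box_muller(n, rng):
    x1 = rng.uniform(size=n)
    x2 = rng.uniform(size=n)
    r = np.sqrt(-2.0 * np.log(x1))        # radius: sqrt(-2 ln X1)
    z1 = r * np.cos(2.0 * np.pi * x2)
    z2 = r * np.sin(2.0 * np.pi * x2)
    return z1, z2

rng = np.random.default_rng(3)
z1, z2 = box_muller(100_000, rng)
print(z1.mean(), z1.std(), z2.mean(), z2.std())   # near 0 and 1 for each
print(np.corrcoef(z1, z2)[0, 1])                  # near 0: the two are uncorrelated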

Random Samples Assume that we conduct an experiment n times independently. Let Xk be the random variable corresponding to the outcome of the k-th run of the experiment. Then X1, X2,…, Xn form a random sample of size n.

Random Samples For example, if we toss a die n times and let Xk be the random variable corresponding to the outcome of the k-th toss, then X1, X2,…, Xn form a random sample of size n.

Theorems about Independent Random Variables Let X1, X2,…, Xn be n independent discrete random variables with pmfs f1, f2,…, fn, and let h be a function of n variables. Then, the expected value of the random variable Z = h(X1, X2,…, Xn) is E[Z] = Σ_{x1} Σ_{x2} … Σ_{xn} h(x1, x2,…, xn) f1(x1) f2(x2) … fn(xn).

Theorems about Independent Random Variables Likewise, if X1, X2,…, Xn are independent continuous random variables with pdfs f1, f2,…, fn, then E[Z] = ∫ … ∫ h(x1, x2,…, xn) f1(x1) f2(x2) … fn(xn) dx1 dx2 … dxn.

Theorems about Independent Random Variables Theorem: If X1, X2,…, Xn are independent random variables and, for i = 1, 2,…, n, E[hi(Xi)] exists, then E[h1(X1) h2(X2) … hn(Xn)] = E[h1(X1)] E[h2(X2)]… E[hn(Xn)]

Theorems about Independent Random Variables Proof for the discrete case: E[h1(X1) h2(X2) … hn(Xn)] = Σ_{x1} … Σ_{xn} h1(x1) … hn(xn) f1(x1) … fn(xn) = [Σ_{x1} h1(x1) f1(x1)] … [Σ_{xn} hn(xn) fn(xn)] = E[h1(X1)] E[h2(X2)] … E[hn(Xn)]. The proof for the continuous case can be derived similarly.
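A numeric sanity check of the product theorem (my illustration; the particular functions h1(x) = x² and h2(x) = exp(-x) and the two distributions are arbitrary choices):

# Sketch: for independent X1, X2, E[h1(X1) h2(X2)] = E[h1(X1)] * E[h2(X2)].
import numpy as np

rng = np.random.default_rng(4)
x1 = rng.uniform(0, 1, size=1_000_000)
x2 = rng.exponential(1.0, size=1_000_000)   # independent of x1

lhs = np.mean(x1**2 * np.exp(-x2))          # E[h1(X1) h2(X2)] estimated jointly
rhs = np.mean(x1**2) * np.mean(np.exp(-x2)) # E[h1(X1)] * E[h2(X2)]
print(lhs, rhs)                             # the two estimates should be close (about 1/6)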

Theorems about Independent Random Variables Theorem: Assume that X1, X2,…, Xn are n independent random variables with respective means μ1, μ2,…, μn and variances σ1², σ2²,…, σn². Then, the mean and variance of the random variable Y = a1 X1 + a2 X2 + … + an Xn, where a1, a2,…, an are real constants, are μY = a1 μ1 + a2 μ2 + … + an μn and σY² = a1² σ1² + a2² σ2² + … + an² σn².

Theorems about Independent Random Variables Proof: μY = E[a1 X1 + … + an Xn] = a1 E[X1] + … + an E[Xn] = a1 μ1 + … + an μn. For the variance, σY² = E[(Y - μY)²] = E[(Σ_i ai (Xi - μi))²] = Σ_i Σ_j ai aj E[(Xi - μi)(Xj - μj)].

Theorems about Independent Random Variables Since Xi and Xj are independent for i ≠ j, E[(Xi - μi)(Xj - μj)] = E[Xi - μi] E[Xj - μj] = 0. Therefore, σY² = Σ_i ai² E[(Xi - μi)²] = Σ_i ai² σi².
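A quick simulation check of this theorem (my illustration; the distributions and constants are arbitrary choices, not from the slides):

# Sketch: for independent X1, X2 and constants a1, a2, Y = a1*X1 + a2*X2 has
# mean a1*mu1 + a2*mu2 and variance a1**2*var1 + a2**2*var2.
import numpy as np

rng = np.random.default_rng(5)
x1 = rng.normal(loc=2.0, scale=3.0, size=1_000_000)    # mu1 = 2, var1 = 9
x2 = rng.exponential(scale=0.5, size=1_000_000)        # mu2 = 0.5, var2 = 0.25
a1, a2 = 4.0, -2.0

y = a1 * x1 + a2 * x2
print(y.mean(), a1 * 2.0 + a2 * 0.5)                   # both near 7.0
print(y.var(), a1**2 * 9.0 + a2**2 * 0.25)             # both near 145.0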

Moment-Generating Function Technique Let X be a random variable. The moment-generating function (mgf) of X is defined as MX(t) = E[exp(tX)], for values of t for which the expectation exists. It is called the mgf because all of the moments of X can be obtained by successively differentiating MX(t).

Moment-Generating Function Technique For example, MX'(t) = E[X exp(tX)]. Thus, MX'(0) = E[X]. Similarly, MX''(t) = E[X² exp(tX)], so MX''(0) = E[X²].

Moment-Generating Function Technique In general, the nth derivative of MX(t) evaluated at t = 0 equals E[X^n], i.e., MX^(n)(0) = E[X^n], where MX^(n) denotes the nth derivative of MX.
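A symbolic sketch of this idea (my illustration, using the Exponential(lam) mgf M(t) = lam/(lam - t), t < lam, which is a standard result not derived on these slides):

# Sketch: recover moments from an mgf by differentiating and evaluating at t = 0.
import sympy as sp

t, lam = sp.symbols("t lam", positive=True)
M = lam / (lam - t)                       # mgf of Exponential(lam)

first_moment = sp.diff(M, t, 1).subs(t, 0)
second_moment = sp.diff(M, t, 2).subs(t, 0)
print(sp.simplify(first_moment))          # 1/lam   = E[X]
print(sp.simplify(second_moment))         # 2/lam**2 = E[X**2]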

Moment-Generating Function Technique The moment-generating function uniquely determines the distribution. That is, there is a one-to-one correspondence between the moment-generating function (mgf) and the distribution function (pmf/pdf) of a random variable.

Moment-Generating Function Technique Example 1 (mgf of N(0,1)): MX(t) = E[exp(tX)] = ∫ exp(tx) (1/√(2π)) exp(-x²/2) dx = exp(t²/2) ∫ (1/√(2π)) exp(-(x - t)²/2) dx = exp(t²/2), where the last equality follows from the fact that the expression in the integral is the pdf of a normal random variable with mean t and variance 1, which integrates to one.

Moment-Generating Function Technique Exercise (mgf of N(μ, σ²)): show that MX(t) = exp(μt + σ²t²/2).

Moment-Generating Function Technique Theorem: If X1, X2,…, Xn are independent random variables with respective mgfs Mi(t), i = 1, 2,…, n, then the mgf of Y = a1 X1 + a2 X2 + … + an Xn is MY(t) = M1(a1 t) M2(a2 t) … Mn(an t).

Moment-Generating Function Technique Proof: MY(t) = E[exp(tY)] = E[exp(a1 t X1) exp(a2 t X2) … exp(an t Xn)] = E[exp(a1 t X1)] E[exp(a2 t X2)] … E[exp(an t Xn)] (by independence) = M1(a1 t) M2(a2 t) … Mn(an t).

Moment-Generating Function Technique Corollary: If X1, X2,…, Xn correspond to a random sample (independent observations) from a distribution with mgf M(t), then the mgf of Y = X1 + X2 + … + Xn is MY(t) = [M(t)]^n, and the mgf of the sample mean X̄ = Y/n is [M(t/n)]^n.

Moment-Generating Function Technique The mgf of the sum of independent random variables is just the product of the individual mgfs

Moment-Generating Function Technique Example 2: Recall that if Z1, Z2, …, Zn are independent N(0,1), then W = Z1² + Z2² + … + Zn² has a chi-square distribution with n degrees of freedom, denoted χ²(n). Now let X1, X2, …, Xn be independent chi-square random variables with r1, r2, …, rn degrees of freedom, respectively. Show that Y = X1 + X2 + … + Xn is χ²(r1 + r2 + … + rn).

Moment-Generating Function Technique Use the moment-generating function technique: the mgf of a χ²(ri) random variable is Mi(t) = (1 - 2t)^(-ri/2), t < 1/2, so MY(t) = M1(t) M2(t) … Mn(t) = (1 - 2t)^(-(r1 + r2 + … + rn)/2), which is the mgf of a χ²(r1 + r2 + … + rn) random variable. Thus, Y is χ²(r1 + r2 + … + rn).
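A simulation check of this result (my illustration; the particular degrees of freedom are arbitrary choices):

# Sketch: the sum of independent chi-square variables with r1, r2, r3 degrees of
# freedom should be chi-square with r1 + r2 + r3 degrees of freedom.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
r = [2, 3, 5]
y = sum(rng.chisquare(df, size=200_000) for df in r)

print(y.mean(), sum(r))                        # mean of chi2(k) is k = 10
print(stats.kstest(y, "chi2", args=(sum(r),))) # large p-value: consistent with chi2(10)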