TRANSFORMATION OF FUNCTION OF A RANDOM VARIABLE


TRANSFORMATION OF FUNCTION OF A RANDOM VARIABLE UNIVARIATE TRANSFORMATIONS

TRANSFORMATION OF RANDOM VARIABLES If X is an rv with cdf F(x), then Y=g(X) is also an rv. If we write y=g(x), the function g defines a mapping from the original sample space of X, S, to a new sample space S*, the sample space of the rv Y: g: S → S*.

TRANSFORMATION OF RANDOM VARIABLES Let y=g(x) define a one-to-one transformation. That is, the equation y=g(x) can be solved uniquely for x. Ex: Y=X−1 ⇒ X=Y+1, one-to-one. Ex: Y=X² ⇒ X=±√Y, not one-to-one. When the transformation is not one-to-one, find disjoint partitions of S on which it is one-to-one.

TRANSFORMATION OF RANDOM VARIABLES If X is a discrete rv, then S is countable. The sample space for Y=g(X) is S* = {y: y=g(x), x∈S}, also countable. The pmf of Y is f_Y(y) = P(Y=y) = Σ f_X(x), the sum taken over all x∈S with g(x)=y.

Example Let X~GEO(p), i.e. f_X(x) = p(1−p)^(x−1), x=1,2,… Recall: X~GEO(p) is the pmf of the number of Bernoulli trials required to get the first success. Find the pmf of Y=X−1. Solution: X=Y+1, so f_Y(y) = f_X(y+1) = p(1−p)^y, y=0,1,2,…, which is the pmf of the number of failures before the first success.
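A quick Monte Carlo sketch of this example. The value p = 0.3 and the sample size are arbitrary choices, not from the slides; the check is that the empirical distribution of Y = X − 1 matches p(1−p)^y.

```python
import random

# X = number of Bernoulli(p) trials until the first success, so
# Y = X - 1 counts failures before the first success: P(Y=y) = p(1-p)^y.
random.seed(1)
p, n = 0.3, 200_000

def draw_x(p):
    x = 1
    while random.random() >= p:  # failure, keep trying
        x += 1
    return x

ys = [draw_x(p) - 1 for _ in range(n)]
freq0 = ys.count(0) / n        # empirical P(Y=0), should be ~p = 0.3
freq2 = ys.count(2) / n        # empirical P(Y=2), should be ~p(1-p)^2 = 0.147
print(freq0, freq2)
```

With 200,000 draws the empirical frequencies agree with the theoretical pmf to about two decimal places.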

Example Let X be an rv with pmf f_X on S = {−2, −1, 0, 1, 2}. Let Y=X². Then S* = {0, 1, 4}, and f_Y(0) = f_X(0), f_Y(1) = f_X(−1)+f_X(1), f_Y(4) = f_X(−2)+f_X(2).
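The discrete transformation rule can be sketched in a few lines of code. The uniform pmf below is an assumed placeholder (the transcript does not show the slide's actual pmf values); the point is how Y = X² collapses ±x onto the same y.

```python
def pmf_of_transform(pmf_x, g):
    """Return the pmf of Y = g(X) for a discrete X given as {x: P(X=x)},
    by summing f_X(x) over all x with g(x) = y."""
    pmf_y = {}
    for x, p in pmf_x.items():
        y = g(x)
        pmf_y[y] = pmf_y.get(y, 0.0) + p
    return pmf_y

# Assumed example pmf: X uniform on {-2,-1,0,1,2}, Y = X^2.
pmf_x = {x: 0.2 for x in (-2, -1, 0, 1, 2)}
pmf_y = pmf_of_transform(pmf_x, lambda x: x * x)
print(pmf_y)  # {4: 0.4, 1: 0.4, 0: 0.2}
```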

FUNCTIONS OF CONTINUOUS RANDOM VARIABLE Let X be an rv of the continuous type with pdf f_X. Let y=g(x) be differentiable for all x, with g′(x) ≠ 0 (so g is strictly monotone). Then Y=g(X) is also an rv of the continuous type, with pdf f_Y(y) = f_X(g⁻¹(y)) · |d g⁻¹(y)/dy|.

FUNCTIONS OF CONTINUOUS RANDOM VARIABLE Example: Let X have density f_X(x), and let Y=e^X. Then x = g⁻¹(y) = log y and dx = (1/y)dy, so f_Y(y) = f_X(log y) · (1/y) for y > 0.

FUNCTIONS OF CONTINUOUS RANDOM VARIABLE Example: Let X have density f_X(x), and let Y=X². Find the pdf of Y. (Since y=x² is not one-to-one, partition into x<0 and x≥0 and sum the two contributions: f_Y(y) = [f_X(√y) + f_X(−√y)] / (2√y) for y > 0.)

CDF method Example: Let X have a given distribution and consider Y=g(X). What is the pdf of Y? Solution: first find F_Y(y) = P(Y≤y) = P(g(X)≤y) from the distribution of X, then differentiate to obtain f_Y(y).

CDF method Example: Consider a continuous rv X, and Y=X². Find the pdf of Y. Solution: For y > 0, F_Y(y) = P(X²≤y) = P(−√y ≤ X ≤ √y) = F_X(√y) − F_X(−√y). Differentiating, f_Y(y) = [f_X(√y) + f_X(−√y)] / (2√y).
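A numeric sketch of the CDF method for Y = X². Taking X ~ N(0,1) is an assumption for concreteness (the slide leaves X generic); the check compares the formula F_Y(y) = F_X(√y) − F_X(−√y) against simulation.

```python
import math, random

def phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def cdf_y(y):
    """CDF of Y = X^2 for X ~ N(0,1), by the CDF method."""
    if y <= 0:
        return 0.0
    r = math.sqrt(y)
    return phi(r) - phi(-r)

random.seed(2)
n = 100_000
hits = sum(random.gauss(0, 1) ** 2 <= 1.5 for _ in range(n)) / n
print(hits, cdf_y(1.5))  # both ~0.779
```

(Here Y = X² is in fact chi-square with 1 degree of freedom, foreshadowing Example 2 below.)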

TRANSFORMATION OF FUNCTION OF TWO OR MORE RANDOM VARIABLES BIVARIATE TRANSFORMATIONS

DISCRETE CASE Let (X1, X2) be a bivariate random vector with a known joint pmf. Consider a new bivariate random vector (U, V) defined by U=g1(X1, X2) and V=g2(X1, X2), where g1 and g2 are some functions of X1 and X2.

DISCRETE CASE Then, the joint pmf of (U, V) is f_{U,V}(u,v) = P(U=u, V=v) = Σ f_{X1,X2}(x1, x2), the sum taken over all (x1, x2) with g1(x1,x2)=u and g2(x1,x2)=v.

EXAMPLE Let X1 and X2 be independent Poisson random variables with parameters λ1 and λ2. Find the distribution of U=X1+X2.
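The answer is U ~ Poisson(λ1+λ2), by convolving the two pmfs. A small numeric check of that convolution identity (λ1 and λ2 chosen arbitrarily):

```python
import math

def pois(k, lam):
    """Poisson pmf P(X=k) with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam1, lam2 = 1.5, 2.5
for u in range(10):
    # P(U=u) = sum over x of P(X1=x) P(X2=u-x)  (independence)
    conv = sum(pois(x, lam1) * pois(u - x, lam2) for x in range(u + 1))
    assert abs(conv - pois(u, lam1 + lam2)) < 1e-12
print("convolution matches Poisson(lam1 + lam2)")
```

The algebra behind the check: Σ [λ1^x λ2^(u−x) / (x!(u−x)!)] = (λ1+λ2)^u / u! by the binomial theorem.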

CONTINUOUS CASE Let X=(X1, X2, …, Xn) have a continuous joint distribution with joint pdf f, and consider new random variables Y1, Y2, …, Yk defined by Yi = gi(X1, X2, …, Xn), i = 1, …, k.  (*)

CONTINUOUS CASE If the transformation T is one-to-one and onto, then there is no problem determining the inverse transformation: we can invert the equations in (*) to obtain Xi = wi(Y1, Y2, …, Yk), i = 1, …, n.  (**)

CONTINUOUS CASE Assume the partial derivatives ∂xi/∂yj exist at every point (y1, y2, …, yk), with k = n. Under these assumptions, we have the determinant J = det[∂xi/∂yj], i, j = 1, …, n,

CONTINUOUS CASE called the Jacobian of the transformation specified by (**). Then, the joint pdf of Y1, Y2, …, Yk can be obtained by the change-of-variables technique for multiple variables.

CONTINUOUS CASE As a result, the new pdf is f_{Y1,…,Yn}(y1, …, yn) = f_X(x1(y1,…,yn), …, xn(y1,…,yn)) · |J|, where the xi(·) are the inverse functions from (**).

Example Recall that I claimed: let X1, X2, …, Xn be independent rvs with Xi~Gamma(αi, β). Then X1+X2+⋯+Xn ~ Gamma(α1+α2+⋯+αn, β). Prove this for n=2 (for simplicity).
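A sketch of the n=2 proof using the bivariate change-of-variables technique above (the particular choice of Y2 is one standard route, not necessarily the one taken in lecture):

```latex
\text{Let } Y_1 = X_1 + X_2,\quad Y_2 = \frac{X_1}{X_1+X_2}
\;\Rightarrow\; x_1 = y_1 y_2,\quad x_2 = y_1(1-y_2),\quad
|J| = \left|\det\begin{pmatrix} y_2 & y_1 \\ 1-y_2 & -y_1 \end{pmatrix}\right| = y_1.

f_{Y_1,Y_2}(y_1,y_2)
  = \frac{(y_1 y_2)^{\alpha_1-1}\bigl(y_1(1-y_2)\bigr)^{\alpha_2-1}
          e^{-y_1/\beta}}{\Gamma(\alpha_1)\Gamma(\alpha_2)\beta^{\alpha_1+\alpha_2}}\, y_1
  = \underbrace{\frac{y_1^{\alpha_1+\alpha_2-1} e^{-y_1/\beta}}
          {\Gamma(\alpha_1+\alpha_2)\beta^{\alpha_1+\alpha_2}}}_{\mathrm{Gamma}(\alpha_1+\alpha_2,\,\beta)}
    \cdot
    \underbrace{\frac{\Gamma(\alpha_1+\alpha_2)}{\Gamma(\alpha_1)\Gamma(\alpha_2)}\,
          y_2^{\alpha_1-1}(1-y_2)^{\alpha_2-1}}_{\mathrm{Beta}(\alpha_1,\alpha_2)}.
```

The joint pdf factors, so Y1 = X1+X2 ~ Gamma(α1+α2, β), independent of Y2 ~ Beta(α1, α2).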

M.G.F. Method If X1, X2, …, Xn are independent random variables with MGFs M_{Xi}(t), then the MGF of Y = X1+X2+⋯+Xn is M_Y(t) = M_{X1}(t) · M_{X2}(t) ⋯ M_{Xn}(t).
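A brute-force sketch of the product rule for MGFs, using n = 3 independent Bernoulli(p) variables (p and the evaluation point t are arbitrary choices for the check):

```python
import math, itertools

p, n, t = 0.4, 3, 0.7
m_bern = (1 - p) + p * math.exp(t)  # MGF of one Bernoulli(p) at t

# E[e^{t(X1+X2+X3)}] computed directly over all 2^3 outcomes
lhs = sum(
    math.exp(t * sum(xs)) * math.prod(p if x else 1 - p for x in xs)
    for xs in itertools.product((0, 1), repeat=n)
)
print(lhs, m_bern ** n)  # equal up to rounding
```

This is the same argument that identifies X1+⋯+Xn as Binomial(n, p): its MGF is ((1−p) + p e^t)^n.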

Example Recall that I claimed: Let’s prove this.

Example Recall that I claimed: let X1, X2, …, Xn be independent rvs with Xi~Gamma(αi, β). Then X1+X2+⋯+Xn ~ Gamma(α1+α2+⋯+αn, β). We proved this with the transformation technique for n=2. Now, prove it for general n using MGFs.

More Examples on Transformations Recall the relationship: if Z~N(0, 1) and X = μ + σZ, then X~N(μ, σ²). Let's prove this.
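A quick simulation sketch of the location-scale relation (assuming the claim is X = μ + σZ with Z ~ N(0,1), since the transcript's equation is not shown; μ and σ are arbitrary):

```python
import random, statistics

random.seed(3)
mu, sigma, n = 2.0, 1.5, 100_000
# X = mu + sigma * Z with Z ~ N(0,1) should have mean mu and sd sigma.
xs = [mu + sigma * random.gauss(0, 1) for _ in range(n)]
print(statistics.mean(xs))    # ~2.0
print(statistics.pstdev(xs))  # ~1.5
```

The formal proof uses the univariate change-of-variable formula with g(z) = μ + σz, whose inverse has derivative 1/σ.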

Example 2 Recall that I claimed: let X be an rv with X~N(0, 1). Then X² ~ χ²(1). Let's prove this.

Example 3 Let X~N(μ, σ²) and Y=exp(X). Find the pdf of Y.
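By the CDF method, P(Y ≤ y) = P(X ≤ ln y) = Φ((ln y − μ)/σ); differentiating gives the lognormal pdf f_Y(y) = exp(−(ln y − μ)²/(2σ²)) / (yσ√(2π)). A simulation sketch of the CDF identity (μ, σ, and the test point y are arbitrary choices):

```python
import math, random

random.seed(4)
mu, sigma, n, y = 0.5, 0.8, 100_000, 2.0
phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal cdf

# empirical P(exp(X) <= y) vs. Phi((ln y - mu)/sigma)
emp = sum(math.exp(random.gauss(mu, sigma)) <= y for _ in range(n)) / n
print(emp, phi((math.log(y) - mu) / sigma))  # both ~0.595
```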

Example 4 Recall that I claimed: if X and Y have independent N(0, 1) distributions, then Z=X/Y has a Cauchy distribution with θ=0 and σ=1. Recall the pdf of the Cauchy distribution: f(z) = 1 / (πσ[1 + ((z−θ)/σ)²]), −∞ < z < ∞. Let's prove this claim.
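Before the formal bivariate-transformation proof, a Monte Carlo sanity check. The standard Cauchy cdf is 1/2 + arctan(z)/π, so P(Z ≤ 1) = 3/4 and the median is 0 (sample sizes below are arbitrary):

```python
import random

random.seed(5)
n = 200_000
# Z = X/Y for independent standard normals
zs = [random.gauss(0, 1) / random.gauss(0, 1) for _ in range(n)]
below1 = sum(z <= 1 for z in zs) / n
median = sorted(zs)[n // 2]
print(below1)  # ~0.75
print(median)  # ~0 (the Cauchy has no mean, but its median is 0)
```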

Example 5 See Examples 6.3.12 and 6.3.13 in Bain and Engelhardt (pages 207-208 in the 2nd edition). These illustrate two different transformations of the same pair of independent X1, X2 ~ Exp(1): In Example 6.3.12: Y1=X1, Y2=X1+X2. In Example 6.3.13: Y1=X1−X2, Y2=X1+X2.

Example 6 Let X1 and X2 be independent with N(μ1, σ1²) and N(μ2, σ2²) distributions, respectively. Find the pdf of Y=X1−X2.
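The MGF method gives Y ~ N(μ1−μ2, σ1²+σ2²). A simulation sketch of that answer (all parameter values are arbitrary choices):

```python
import random, statistics

random.seed(6)
mu1, s1, mu2, s2, n = 3.0, 1.0, 1.0, 2.0, 100_000
# Y = X1 - X2 should have mean mu1 - mu2 = 2 and variance s1^2 + s2^2 = 5.
ys = [random.gauss(mu1, s1) - random.gauss(mu2, s2) for _ in range(n)]
print(statistics.mean(ys))       # ~2.0
print(statistics.pvariance(ys))  # ~5.0
```

Note the variances add even though the variables are subtracted, because Var(−X2) = Var(X2).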