Functions of Random Variables

Methods for determining the distribution of functions of Random Variables:
1. Distribution function method
2. Moment generating function method
3. Transformation method

Distribution function method
Let X, Y, Z, … have joint density f(x, y, z, …) and let W = h(X, Y, Z, …).
First step: find the distribution function of W,
G(w) = P[W ≤ w] = P[h(X, Y, Z, …) ≤ w].
Second step: find the density function of W by differentiating,
g(w) = G′(w).
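A minimal Python sketch of the two steps, using the illustrative example W = X² with X ~ N(0, 1); the example, variable names, and numerical differentiation are choices made here, not taken from the slides.

```python
# Sketch of the distribution function method for W = X^2, X ~ N(0, 1).
import numpy as np
from scipy import stats

def G(w):
    """Step 1: G(w) = P[W <= w] = P[-sqrt(w) <= X <= sqrt(w)], for w > 0."""
    r = np.sqrt(w)
    return stats.norm.cdf(r) - stats.norm.cdf(-r)

def g(w, h=1e-5):
    """Step 2: density of W, obtained by numerically differentiating G."""
    return (G(w + h) - G(w - h)) / (2 * h)

w = np.linspace(0.5, 4.0, 4)
print(np.round(g(w), 4))                      # numerical g(w)
print(np.round(stats.chi2.pdf(w, df=1), 4))   # exact chi-square(1) density, for comparison
```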

Example: Student's t distribution. Let Z and U be two independent random variables with: 1. Z having a standard normal distribution, and 2. U having a χ² distribution with ν degrees of freedom. Find the distribution of T = Z/√(U/ν).

The density of Z is f(z) = (1/√(2π)) e^{−z²/2}, and the density of U is f(u) = 1/(2^{ν/2} Γ(ν/2)) u^{ν/2−1} e^{−u/2} for u > 0.

Therefore the joint density of Z and U is f(z, u) = f(z) f(u) = (1/√(2π)) e^{−z²/2} · 1/(2^{ν/2} Γ(ν/2)) u^{ν/2−1} e^{−u/2}. The distribution function of T is G(t) = P[T ≤ t] = P[Z ≤ t√(U/ν)], the integral of the joint density over the region z ≤ t√(u/ν).

Then g(t) = G′(t); differentiating under the integral sign and integrating out u gives the density of T below.

Student's t distribution: g(t) = K (1 + t²/ν)^{−(ν+1)/2}, where K = Γ((ν+1)/2) / (√(νπ) Γ(ν/2)).
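A quick numerical check of this density formula against scipy's t distribution; the comparison code and the choice ν = 5 are illustrative, not part of the slides.

```python
# Check the reconstructed t density against scipy.stats.t.
import numpy as np
from scipy import stats
from scipy.special import gamma

def t_density(t, nu):
    """g(t) = K * (1 + t^2/nu)^(-(nu+1)/2), K = Gamma((nu+1)/2)/(sqrt(nu*pi)*Gamma(nu/2))."""
    K = gamma((nu + 1) / 2) / (np.sqrt(nu * np.pi) * gamma(nu / 2))
    return K * (1 + t**2 / nu) ** (-(nu + 1) / 2)

t = np.linspace(-3, 3, 7)
print(np.allclose(t_density(t, nu=5), stats.t.pdf(t, df=5)))  # expected: True
```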

"Student" – W. S. Gosset. Worked for the Guinness brewery. Was not allowed to publish under his own name. Published under the pseudonym "Student".

[Figure: the t distribution compared with the standard normal distribution.]

Distribution of the Max and Min Statistics

Let x_1, x_2, …, x_n denote a sample of size n from the density f(x). Let M = max(x_i); determine the distribution of M. Repeat this computation for m = min(x_i). Assume that the density is the uniform density from 0 to θ.

Hence f(x) = 1/θ for 0 ≤ x ≤ θ (0 otherwise), and the distribution function is F(x) = 0 for x < 0, F(x) = x/θ for 0 ≤ x ≤ θ, and F(x) = 1 for x > θ.

Finding the distribution function of M: G(t) = P[M ≤ t] = P[x_1 ≤ t, …, x_n ≤ t] = F(t)^n = (t/θ)^n for 0 ≤ t ≤ θ.

Differentiating, we find the density function of M: g(t) = G′(t) = n t^{n−1}/θ^n for 0 ≤ t ≤ θ (0 otherwise). [Figure: the density f(x) of a single observation and the density g(t) of the maximum.]

Finding the distribution function of m: G(t) = P[m ≤ t] = 1 − P[x_1 > t, …, x_n > t] = 1 − (1 − F(t))^n = 1 − (1 − t/θ)^n for 0 ≤ t ≤ θ.

Differentiating, we find the density function of m: g(t) = (n/θ)(1 − t/θ)^{n−1} for 0 ≤ t ≤ θ (0 otherwise). [Figure: the density f(x) of a single observation and the density g(t) of the minimum.]
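A short simulation sketch checking the distribution functions of M and m derived above for a Uniform(0, θ) sample; θ = 2, n = 5 and the evaluation point t = 1.5 are arbitrary illustrative choices, not values from the slides.

```python
# Simulation check of the max and min of a Uniform(0, theta) sample.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000
x = rng.uniform(0, theta, size=(reps, n))
M, m = x.max(axis=1), x.min(axis=1)

t = 1.5
print((M <= t).mean(), (t / theta) ** n)           # empirical vs. G_max(t) = (t/theta)^n
print((m <= t).mean(), 1 - (1 - t / theta) ** n)   # empirical vs. G_min(t) = 1 - (1 - t/theta)^n
```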

The probability integral transformation
This transformation converts observations from a uniform distribution on (0, 1) into observations from an arbitrary distribution. Let U denote an observation having a uniform distribution on (0, 1).

Let f(x) denote an arbitrary density function and F(x) its corresponding cumulative distribution function, and let X = F⁻¹(U). Find the distribution of X: G(x) = P[X ≤ x] = P[F⁻¹(U) ≤ x] = P[U ≤ F(x)] = F(x). Hence X has distribution function F(x) and density f(x).

Thus if U has a uniform distribution on (0, 1), then X = F⁻¹(U) has density f(x).
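A minimal sketch of the transformation in practice, using an exponential target as an illustration (the exponential distribution and the rate λ = 2 are choices made here): for F(x) = 1 − e^{−λx}, F⁻¹(u) = −ln(1 − u)/λ.

```python
# Probability integral transformation: uniforms -> exponential observations.
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(size=200_000)

lam = 2.0                      # illustrative rate
x = -np.log(1 - u) / lam       # X = F^{-1}(U) for F(x) = 1 - exp(-lam*x)

print(round(x.mean(), 3), 1 / lam)      # sample mean vs. theoretical mean 1/lam
print(round(x.var(), 3), 1 / lam**2)    # sample variance vs. theoretical variance 1/lam^2
```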

Use of moment generating functions

Definition: Let X denote a random variable with probability density function f(x) if continuous (probability mass function p(x) if discrete). Then m_X(t) = E[e^{tX}] is the moment generating function of X.

The distribution of a random variable X is described by any one of:
1. The density function f(x) if X is continuous (probability mass function p(x) if X is discrete), or
2. The cumulative distribution function F(x), or
3. The moment generating function m_X(t).

Properties
1. m_X(0) = E[e^0] = 1.
2. m_X^(k)(0) = E[X^k] = μ_k, the k-th moment of X.
3. m_X(t) = 1 + μ_1 t + μ_2 t²/2! + μ_3 t³/3! + …

4. Let X be a random variable with moment generating function m_X(t) and let Y = bX + a. Then m_Y(t) = m_{bX+a}(t) = E(e^{(bX+a)t}) = e^{at} m_X(bt).
5. Let X and Y be two independent random variables with moment generating functions m_X(t) and m_Y(t). Then m_{X+Y}(t) = m_X(t) m_Y(t).
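A Monte Carlo sanity check of property 5; the particular distributions (a standard normal and an Exp(1), whose MGFs are e^{t²/2} and 1/(1 − t) for t < 1) and the value t = 0.3 are arbitrary illustrative choices.

```python
# Check m_{X+Y}(t) = m_X(t) m_Y(t) for independent X and Y by simulation.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
X = rng.standard_normal(N)                 # m_X(t) = exp(t^2 / 2)
Y = rng.exponential(scale=1.0, size=N)     # m_Y(t) = 1 / (1 - t), t < 1

t = 0.3
m_X = np.exp(t * X).mean()
m_Y = np.exp(t * Y).mean()
m_sum = np.exp(t * (X + Y)).mean()

print(round(m_sum, 3), round(m_X * m_Y, 3))      # approximately equal (property 5)
print(round(np.exp(t**2 / 2) / (1 - t), 3))      # exact value, for comparison
```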

6. Let X and Y be two random variables with moment generating functions m_X(t) and m_Y(t) and distribution functions F_X(x) and F_Y(y) respectively. If m_X(t) = m_Y(t), then F_X(x) = F_Y(x). This ensures that the distribution of a random variable can be identified by its moment generating function.

M. G. F.’s - Continuous distributions

M. G. F.’s - Discrete distributions

Moment generating function of the gamma distribution: for the density f(x) = (λ^α/Γ(α)) x^{α−1} e^{−λx}, x ≥ 0, the moment generating function is m_X(t) = E[e^{tX}] = ∫₀^∞ e^{tx} (λ^α/Γ(α)) x^{α−1} e^{−λx} dx = (λ^α/Γ(α)) ∫₀^∞ x^{α−1} e^{−(λ−t)x} dx, where t < λ.

using ∫₀^∞ x^{a−1} e^{−bx} dx = Γ(a)/b^a, or equivalently the fact that (b^a/Γ(a)) x^{a−1} e^{−bx} is a density and integrates to 1,

then m_X(t) = (λ^α/Γ(α)) · Γ(α)/(λ − t)^α = (λ/(λ − t))^α = (1 − t/λ)^{−α} for t < λ.
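A Monte Carlo check of this MGF, assuming the rate parameterization used above; α = 3, λ = 2 and t = 0.5 (with t < λ) are arbitrary illustrative values.

```python
# Numerical check of the gamma MGF (lambda / (lambda - t))^alpha.
import numpy as np

rng = np.random.default_rng(0)
alpha, lam, t = 3.0, 2.0, 0.5
x = rng.gamma(shape=alpha, scale=1 / lam, size=1_000_000)  # rate lam -> scale 1/lam

mc = np.exp(t * x).mean()                  # Monte Carlo estimate of E[exp(tX)]
exact = (lam / (lam - t)) ** alpha         # (1 - t/lam)^(-alpha)
print(round(mc, 3), round(exact, 3))
```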

Moment generating function of the standard normal distribution: m_Z(t) = E[e^{tZ}] = ∫ e^{tz} (1/√(2π)) e^{−z²/2} dz, where completing the square gives tz − z²/2 = t²/2 − (z − t)²/2; thus m_Z(t) = e^{t²/2} ∫ (1/√(2π)) e^{−(z−t)²/2} dz = e^{t²/2}.

We will use the expansions e^u = 1 + u + u²/2! + u³/3! + … and m_Z(t) = 1 + μ_1 t + μ_2 t²/2! + μ_3 t³/3! + …, where μ_k = E[Z^k].

Note: m_Z(t) = e^{t²/2} = 1 + (t²/2) + (t²/2)²/2! + (t²/2)³/3! + … = Σ_k t^{2k}/(2^k k!).

Also m_Z(t) = Σ_k μ_k t^k/k!.

Equating coefficients of t^k, we get μ_{2k} = E[Z^{2k}] = (2k)!/(2^k k!) and μ_{2k+1} = E[Z^{2k+1}] = 0.
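These even moments can be checked numerically; the small sketch below compares the coefficient formula with scipy's moments of the standard normal (the check itself is illustrative, not part of the slides).

```python
# mu_{2k} = (2k)! / (2^k k!) vs. scipy's standard normal moments.
import math
from scipy import stats

for k in range(1, 5):
    from_series = math.factorial(2 * k) / (2 ** k * math.factorial(k))
    print(2 * k, from_series, stats.norm.moment(2 * k))   # e.g. 2 -> 1, 4 -> 3, 6 -> 15, 8 -> 105
```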

Use of moment generating functions to find the distribution of functions of Random Variables

Example: Suppose that X has a normal distribution with mean μ and standard deviation σ. Find the distribution of Y = aX + b. Solution: m_X(t) = e^{μt + σ²t²/2}, so by property 4, m_Y(t) = e^{bt} m_X(at) = e^{(aμ + b)t + a²σ²t²/2} = the moment generating function of the normal distribution with mean aμ + b and variance a²σ².

Thus Y = aX + b has a normal distribution with mean aμ + b and variance a²σ². Special case: the z transformation, Z = (X − μ)/σ, i.e. a = 1/σ and b = −μ/σ. Thus Z has a standard normal distribution.
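A quick simulation sketch of the z transformation; μ = 10 and σ = 3 are arbitrary illustrative values.

```python
# Standardizing a normal variable: Z = (X - mu) / sigma should be N(0, 1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, sigma = 10.0, 3.0
x = rng.normal(mu, sigma, size=100_000)
z = (x - mu) / sigma                       # a = 1/sigma, b = -mu/sigma

print(round(z.mean(), 3), round(z.std(), 3))   # close to 0 and 1
print(stats.kstest(z, "norm"))                 # Kolmogorov-Smirnov test against N(0, 1)
```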

Example: Suppose that X and Y are independent, each having a normal distribution with means μ_X and μ_Y and standard deviations σ_X and σ_Y. Find the distribution of S = X + Y. Solution: now m_X(t) = e^{μ_X t + σ_X² t²/2} and m_Y(t) = e^{μ_Y t + σ_Y² t²/2},

so m_S(t) = m_X(t) m_Y(t) = e^{(μ_X + μ_Y)t + (σ_X² + σ_Y²)t²/2} = the moment generating function of the normal distribution with mean μ_X + μ_Y and variance σ_X² + σ_Y². Thus S = X + Y has a normal distribution with mean μ_X + μ_Y and variance σ_X² + σ_Y².

Example: Suppose that X and Y are independent, each having a normal distribution with means μ_X and μ_Y and standard deviations σ_X and σ_Y. Find the distribution of L = aX + bY. Solution: now m_{aX}(t) = m_X(at) = e^{aμ_X t + a²σ_X² t²/2} and m_{bY}(t) = m_Y(bt) = e^{bμ_Y t + b²σ_Y² t²/2},

so m_L(t) = m_X(at) m_Y(bt) = e^{(aμ_X + bμ_Y)t + (a²σ_X² + b²σ_Y²)t²/2} = the moment generating function of the normal distribution with mean aμ_X + bμ_Y and variance a²σ_X² + b²σ_Y². Thus L = aX + bY has a normal distribution with mean aμ_X + bμ_Y and variance a²σ_X² + b²σ_Y².

Special case: a = +1 and b = −1. Thus X − Y has a normal distribution with mean μ_X − μ_Y and variance σ_X² + σ_Y².

Example (extension to n independent RVs): Suppose that X_1, X_2, …, X_n are independent, each having a normal distribution with mean μ_i and standard deviation σ_i (for i = 1, 2, …, n). Find the distribution of L = a_1 X_1 + a_2 X_2 + … + a_n X_n. Solution: now m_{X_i}(t) = e^{μ_i t + σ_i² t²/2} (for i = 1, 2, …, n),

so m_L(t) = m_{X_1}(a_1 t) ⋯ m_{X_n}(a_n t) = e^{(a_1μ_1 + … + a_nμ_n)t + (a_1²σ_1² + … + a_n²σ_n²)t²/2} = the moment generating function of the normal distribution with mean a_1μ_1 + … + a_nμ_n and variance a_1²σ_1² + … + a_n²σ_n². Thus L = a_1 X_1 + … + a_n X_n has a normal distribution with mean a_1μ_1 + … + a_nμ_n and variance a_1²σ_1² + … + a_n²σ_n².
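A simulation sketch checking the mean and variance of L; the coefficients a_i, means μ_i and standard deviations σ_i below are arbitrary illustrative values, not taken from the slides.

```python
# Check the mean and variance of L = a_1 X_1 + ... + a_n X_n for independent normals.
import numpy as np

rng = np.random.default_rng(0)
a = np.array([1.0, -2.0, 0.5])                 # coefficients a_i
mu = np.array([1.0, 2.0, 3.0])                 # means mu_i
sigma = np.array([1.0, 0.5, 2.0])              # standard deviations sigma_i

X = rng.normal(mu, sigma, size=(500_000, 3))   # each row is one draw of (X_1, X_2, X_3)
L = X @ a

print(round(L.mean(), 3), a @ mu)              # mean: a_1*mu_1 + ... + a_n*mu_n
print(round(L.var(), 3), a**2 @ sigma**2)      # variance: a_1^2*sigma_1^2 + ... + a_n^2*sigma_n^2
```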

Special case: a_1 = a_2 = … = a_n = 1/n, so that L = (X_1 + X_2 + … + X_n)/n = x̄, the sample mean. In this case X_1, X_2, …, X_n is a sample from a normal distribution with mean μ and standard deviation σ.

Thus the sample mean x̄ has a normal distribution with mean μ and variance Σ (1/n)² σ² = σ²/n.

Summary: if x_1, x_2, …, x_n is a sample from a normal distribution with mean μ and standard deviation σ, then the sample mean x̄ has a normal distribution with mean μ and variance σ²/n (standard deviation σ/√n).

[Figure: the population distribution and the sampling distribution of the sample mean x̄.]

The Central Limit Theorem: if x_1, x_2, …, x_n is a sample from a distribution with mean μ and standard deviation σ, then, if n is large, the sample mean x̄ has approximately a normal distribution with mean μ and variance σ²/n.

Proof (using moment generating functions): We will use the following fact. Let m_1(t), m_2(t), … denote a sequence of moment generating functions corresponding to the sequence of distribution functions F_1(x), F_2(x), …, and let m(t) be a moment generating function corresponding to the distribution function F(x). If m_i(t) → m(t) for all t, then F_i(x) → F(x) for all x.

Let x_1, x_2, … denote a sequence of independent random variables coming from a distribution with moment generating function m(t) and distribution function F(x). Let S_n = x_1 + x_2 + … + x_n and z_n = (S_n − nμ)/(σ√n) = (x̄ − μ)/(σ/√n); then m_{z_n}(t) = [e^{−μt/(σ√n)} m(t/(σ√n))]^n, and expanding m about t = 0 shows that m_{z_n}(t) → e^{t²/2} as n → ∞.

e^{t²/2} is the moment generating function of the standard normal distribution. Thus the limiting distribution of z_n is the standard normal distribution. Q.E.D.
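A brief simulation sketch of the theorem in action, standardizing the mean of samples from a skewed (exponential) population; the population, the sample size n = 50, and the comparison points are arbitrary illustrative choices.

```python
# Central limit theorem demo: standardized means of Exp(1) samples vs. N(0, 1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 1.0, 50                       # Exp(1) population: mean 1, sd 1
xbar = rng.exponential(1.0, size=(100_000, n)).mean(axis=1)
z = (xbar - mu) / (sigma / np.sqrt(n))

z0 = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(np.round([(z <= v).mean() for v in z0], 3))   # empirical P[z_n <= z0]
print(np.round(stats.norm.cdf(z0), 3))              # standard normal probabilities, for comparison
```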