Use of moment generating functions

1. Using the moment generating functions of X, Y, Z, … determine the moment generating function of W = h(X, Y, Z, …).
2. Identify the distribution of W from its moment generating function.

This procedure works well for sums, linear combinations, etc.

Theorem
Let X and Y denote independent random variables, each having a gamma distribution with parameters (λ, α₁) and (λ, α₂) respectively. Then W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂).

Proof: Since X and Y are independent,
m_W(t) = m_X(t) m_Y(t) = (λ/(λ − t))^α₁ (λ/(λ − t))^α₂ = (λ/(λ − t))^(α₁ + α₂), for t < λ.

Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂), we conclude that W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂).
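The theorem can be sanity-checked by simulation. A minimal sketch using only the Python standard library (the parameter values λ = 2, α₁ = 3, α₂ = 5 are arbitrary illustrations, not from the slides):

```python
import random
import statistics

# If X ~ Gamma(lam, a1) and Y ~ Gamma(lam, a2) in the rate parameterization,
# W = X + Y should be Gamma(lam, a1 + a2), so
# E[W] = (a1 + a2)/lam and Var[W] = (a1 + a2)/lam**2.
random.seed(0)
lam, a1, a2 = 2.0, 3.0, 5.0
n = 200_000

# random.gammavariate takes (shape, scale), with scale = 1/rate
w = [random.gammavariate(a1, 1 / lam) + random.gammavariate(a2, 1 / lam)
     for _ in range(n)]

mean_w = statistics.fmean(w)   # should be close to (3 + 5)/2 = 4
var_w = statistics.variance(w) # should be close to (3 + 5)/4 = 2
```

Note that `random.gammavariate` uses the shape/scale convention, so the rate λ of this document corresponds to a scale of 1/λ.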

Theorem (extension to n RVs)
Let X₁, X₂, …, Xₙ denote n independent random variables, the i-th having a gamma distribution with parameters (λ, αᵢ), i = 1, 2, …, n. Then W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ).

Proof: By independence,
m_W(t) = m_X₁(t) m_X₂(t) ⋯ m_Xₙ(t) = (λ/(λ − t))^(α₁ + α₂ + … + αₙ), for t < λ.

Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ), we conclude that W = X₁ + X₂ + … + Xₙ has a gamma distribution with parameters (λ, α₁ + α₂ + … + αₙ).

Theorem
Suppose that X is a random variable having a gamma distribution with parameters (λ, α). Then W = aX (a > 0) has a gamma distribution with parameters (λ/a, α).

Proof:
m_W(t) = E[e^(taX)] = m_X(at) = (λ/(λ − at))^α = ((λ/a)/((λ/a) − t))^α,
which is the moment generating function of the gamma distribution with parameters (λ/a, α).
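A quick simulation sketch of this rescaling result (the values λ = 3, α = 2, a = 4 are arbitrary choices for illustration):

```python
import random
import statistics

# X ~ Gamma(lam, alpha) in rate form; W = a*X should be Gamma(lam/a, alpha):
# E[W] = alpha/(lam/a) and Var[W] = alpha/(lam/a)**2.
random.seed(1)
lam, alpha, a = 3.0, 2.0, 4.0
n = 200_000

# random.gammavariate takes (shape, scale); a rate of lam means scale 1/lam
w = [a * random.gammavariate(alpha, 1 / lam) for _ in range(n)]

mean_w = statistics.fmean(w)   # expect alpha * a / lam = 8/3
var_w = statistics.variance(w) # expect alpha * (a/lam)**2 = 32/9
```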

Special Cases

1. Let X and Y be independent random variables having χ² distributions with ν₁ and ν₂ degrees of freedom respectively; then X + Y has a χ² distribution with ν₁ + ν₂ degrees of freedom.
2. Let X₁, X₂, …, Xₙ be independent random variables having χ² distributions with ν₁, ν₂, …, νₙ degrees of freedom respectively; then X₁ + X₂ + … + Xₙ has a χ² distribution with ν₁ + ν₂ + … + νₙ degrees of freedom.

Both of these properties follow from the fact that a χ² random variable with ν degrees of freedom is a gamma random variable with λ = ½ and α = ν/2.

Recall that if Z has a Standard Normal distribution, then Z² has a χ² distribution with 1 degree of freedom. Thus if Z₁, Z₂, …, Z_ν are independent random variables, each having a Standard Normal distribution, then Z₁² + Z₂² + … + Z_ν² has a χ² distribution with ν degrees of freedom.
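This, too, is easy to check numerically. A sketch with the arbitrary choice ν = 5:

```python
import random
import statistics

# Sum of nu squared standard normals should have mean nu and variance 2*nu,
# matching the chi-square distribution with nu degrees of freedom.
random.seed(2)
nu, n = 5, 100_000

u = [sum(random.gauss(0, 1) ** 2 for _ in range(nu)) for _ in range(n)]

mean_u = statistics.fmean(u)   # expect nu = 5
var_u = statistics.variance(u) # expect 2*nu = 10
```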

Theorem
Suppose that U₁ and U₂ are independent random variables and that U = U₁ + U₂. Suppose that U₁ and U have χ² distributions with ν₁ and ν degrees of freedom respectively (ν₁ < ν). Then U₂ has a χ² distribution with ν₂ = ν − ν₁ degrees of freedom.

Proof: Since U₁ and U₂ are independent, m_U(t) = m_U₁(t) m_U₂(t), so
m_U₂(t) = m_U(t)/m_U₁(t) = (½/(½ − t))^(ν/2) / (½/(½ − t))^(ν₁/2) = (½/(½ − t))^((ν − ν₁)/2),
the moment generating function of a χ² distribution with ν₂ = ν − ν₁ degrees of freedom.

Q.E.D.

Distribution of the sample variance

Properties of the sample variance

For any constant a,
Σ (xᵢ − a)² = Σ (xᵢ − x̄)² + n(x̄ − a)².

Proof: Write xᵢ − a = (xᵢ − x̄) + (x̄ − a) and expand the square; the cross term 2(x̄ − a) Σ (xᵢ − x̄) vanishes because Σ (xᵢ − x̄) = 0.

Special Cases

1. Setting a = 0 gives the computing formula:
Σ (xᵢ − x̄)² = Σ xᵢ² − (Σ xᵢ)²/n.

2. Setting a = μ gives
Σ (xᵢ − μ)² = Σ (xᵢ − x̄)² + n(x̄ − μ)².
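Both identities can be verified numerically on an arbitrary data set (the sample and the constant a below are made up for illustration):

```python
import random
import statistics

random.seed(3)
x = [random.gauss(10, 2) for _ in range(50)]
n = len(x)
xbar = statistics.fmean(x)
a = 4.2  # arbitrary constant

# sum((x_i - a)^2) = sum((x_i - xbar)^2) + n*(xbar - a)^2
lhs = sum((xi - a) ** 2 for xi in x)
rhs = sum((xi - xbar) ** 2 for xi in x) + n * (xbar - a) ** 2

# computing formula: sum((x_i - xbar)^2) = sum(x_i^2) - (sum(x_i))^2 / n
ss = sum((xi - xbar) ** 2 for xi in x)
computing = sum(xi ** 2 for xi in x) - sum(x) ** 2 / n
```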

Distribution of the sample variance

Let x₁, x₂, …, xₙ denote a sample from the normal distribution with mean μ and variance σ². Let
U = Σ (xᵢ − μ)²/σ².
Then U has a χ² distribution with n degrees of freedom.

Note: by the identity above with a = μ,
Σ (xᵢ − μ)²/σ² = Σ (xᵢ − x̄)²/σ² + n(x̄ − μ)²/σ²,
or U = U₂ + U₁, where U has a χ² distribution with n degrees of freedom, U₂ = Σ (xᵢ − x̄)²/σ², and U₁ = n(x̄ − μ)²/σ².

We also know that x̄ has a normal distribution with mean μ and variance σ²/n, so that
z = (x̄ − μ)/(σ/√n)
has a Standard Normal distribution. Thus
U₁ = z² = n(x̄ − μ)²/σ²
has a χ² distribution with 1 degree of freedom.

If we can show that U₁ and U₂ are independent, then by the subtraction theorem above U₂ = Σ (xᵢ − x̄)²/σ² has a χ² distribution with n − 1 degrees of freedom. The final task would be to show that x̄ and s² = Σ (xᵢ − x̄)²/(n − 1) are independent.

Summary

Let x₁, x₂, …, xₙ denote a sample from the normal distribution with mean μ and variance σ². Then:
1. x̄ has a normal distribution with mean μ and variance σ²/n.
2. U = (n − 1)s²/σ² = Σ (xᵢ − x̄)²/σ² has a χ² distribution with ν = n − 1 degrees of freedom.
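A simulation sketch of item 2 of the summary (the values of μ, σ, and n are arbitrary illustrations):

```python
import random
import statistics

# For samples of size n from N(mu, sigma^2), U = (n-1)*s^2/sigma^2 should
# have mean n-1 and variance 2(n-1), as a chi-square with n-1 df does.
random.seed(4)
mu, sigma, n, reps = 5.0, 2.0, 10, 50_000

us = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    us.append((n - 1) * statistics.variance(sample) / sigma ** 2)

mean_u = statistics.fmean(us)   # expect n - 1 = 9
var_u = statistics.variance(us) # expect 2(n - 1) = 18
```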

The Transformation Method

Theorem
Let X denote a random variable with probability density function f(x), and let U = h(X). Assume that h(x) is either strictly increasing (or strictly decreasing). Then the probability density of U is
g(u) = f(h⁻¹(u)) |d h⁻¹(u)/du|.

Proof: Use the distribution function method.

Step 1: Find the distribution function G(u). For h strictly increasing,
G(u) = P[U ≤ u] = P[h(X) ≤ u] = P[X ≤ h⁻¹(u)] = F(h⁻¹(u)).

Step 2: Differentiate G(u) to find the probability density function g(u):
g(u) = G′(u) = f(h⁻¹(u)) d h⁻¹(u)/du.

For h strictly decreasing, G(u) = P[X ≥ h⁻¹(u)] = 1 − F(h⁻¹(u)), hence g(u) = −f(h⁻¹(u)) d h⁻¹(u)/du. In either case
g(u) = f(h⁻¹(u)) |d h⁻¹(u)/du|.

Example
Suppose that X has a Normal distribution with mean μ and variance σ². Find the distribution of U = h(X) = e^X.

Solution:

Here h⁻¹(u) = ln u and d h⁻¹(u)/du = 1/u, hence for u > 0
g(u) = (1/(uσ√(2π))) exp(−(ln u − μ)²/(2σ²)).
This distribution is called the log-normal distribution.
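As a numeric sketch of this result, the density g(u) below implements the transformation-method formula; it should integrate to 1 and agree with direct simulation of U = e^X (the values of μ and σ are arbitrary):

```python
import math
import random
import statistics

mu, sigma = 0.5, 0.4

def g(u):
    # log-normal density from the transformation method: f(ln u) * (1/u)
    return math.exp(-(math.log(u) - mu) ** 2 / (2 * sigma ** 2)) / (
        u * sigma * math.sqrt(2 * math.pi))

# midpoint-rule integral of g over (0, 20); the tail beyond 20 is negligible here
steps = 20_000
h = 20 / steps
total = h * sum(g((i + 0.5) * h) for i in range(steps))

# direct simulation of U = e^X, X ~ N(mu, sigma^2);
# the log-normal mean is exp(mu + sigma^2/2)
random.seed(5)
sim_mean = statistics.fmean(math.exp(random.gauss(mu, sigma))
                            for _ in range(200_000))
```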

log-normal distribution

The Transformation Method (many variables)

Theorem
Let x₁, x₂, …, xₙ denote random variables with joint probability density function f(x₁, x₂, …, xₙ). Let
u₁ = h₁(x₁, x₂, …, xₙ),
u₂ = h₂(x₁, x₂, …, xₙ),
…
uₙ = hₙ(x₁, x₂, …, xₙ)
define an invertible transformation from the x's to the u's.

Then the joint probability density function of u₁, u₂, …, uₙ is given by
g(u₁, …, uₙ) = f(x₁(u₁, …, uₙ), …, xₙ(u₁, …, uₙ)) |J|,
where
J = det[∂xᵢ/∂uⱼ]
is the Jacobian of the (inverse) transformation.

Example
Suppose that x₁, x₂ are independent with density functions f₁(x₁) and f₂(x₂). Find the distribution of
u₁ = x₁ + x₂, u₂ = x₁ − x₂.
Solving for x₁ and x₂ we get the inverse transformation
x₁ = (u₁ + u₂)/2, x₂ = (u₁ − u₂)/2.

The Jacobian of the transformation is
J = det [ ∂x₁/∂u₁  ∂x₁/∂u₂ ; ∂x₂/∂u₁  ∂x₂/∂u₂ ] = det [ ½  ½ ; ½  −½ ] = −½,
so |J| = ½.

The joint density of x₁, x₂ is f(x₁, x₂) = f₁(x₁) f₂(x₂). Hence the joint density of u₁ and u₂ is
g(u₁, u₂) = ½ f₁((u₁ + u₂)/2) f₂((u₁ − u₂)/2).

From the joint density g(u₁, u₂) we can determine the distribution of u₁ = x₁ + x₂ by integrating out u₂:
g₁(u₁) = ∫ g(u₁, u₂) du₂ = ½ ∫ f₁((u₁ + u₂)/2) f₂((u₁ − u₂)/2) du₂.

Substituting x = (u₁ + u₂)/2 (so that du₂ = 2 dx) gives
g₁(u₁) = ∫ f₁(x) f₂(u₁ − x) dx.
This is called the convolution of the two densities f₁ and f₂.
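The convolution formula can be exercised numerically. A sketch convolving two exponential(λ) densities, which by the gamma addition theorem earlier should reproduce the Gamma(λ, 2) density λ²·u·e^(−λu) (λ = 1.5 and the evaluation point u = 2 are arbitrary choices):

```python
import math

lam = 1.5
h = 0.001  # integration step

def f(x):
    # exponential(lam) density
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def convolve_at(u):
    # midpoint rule for g1(u) = integral of f1(x) * f2(u - x) dx;
    # the integrand is zero outside 0 <= x <= u
    steps = round(u / h)
    return h * sum(f((i + 0.5) * h) * f(u - (i + 0.5) * h)
                   for i in range(steps))

u = 2.0
approx = convolve_at(u)
exact = lam ** 2 * u * math.exp(-lam * u)  # Gamma(lam, 2) density at u
```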

Example: The ex-Gaussian distribution

Let X and Y be two independent random variables such that:
1. X has an exponential distribution with parameter λ;
2. Y has a normal (Gaussian) distribution with mean μ and standard deviation σ.
Find the distribution of U = X + Y. This distribution is used in psychology as a model for the response time to perform a task.

Now the density of U = X + Y is the convolution:
g(u) = ∫₀^∞ λe^(−λx) · (1/(σ√(2π))) exp(−(u − x − μ)²/(2σ²)) dx.

Completing the square in the exponent of the integrand, this can be rewritten as
g(u) = λ exp(λ(μ − u) + λ²σ²/2) ∫₀^∞ (1/(σ√(2π))) exp(−(x − (u − μ − λσ²))²/(2σ²)) dx.

The integral is P[V ≥ 0], where V has a Normal distribution with mean u − μ − λσ² and variance σ². Hence
g(u) = λ exp(λ(μ − u) + λ²σ²/2) Φ((u − μ − λσ²)/σ),
where Φ(z) is the cdf of the standard Normal distribution.

The ex-Gaussian distribution g(u)
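As a sketch check of the closed form just derived (parameter values are arbitrary), the density should integrate to 1, and a directly simulated U = X + Y should have mean μ + 1/λ:

```python
import math
import random
import statistics

lam, mu, sigma = 2.0, 1.0, 0.5

def Phi(z):
    # standard normal cdf via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def g(u):
    # ex-Gaussian density:
    # lam * exp(lam*(mu - u) + lam^2*sigma^2/2) * Phi((u - mu - lam*sigma^2)/sigma)
    return lam * math.exp(lam * (mu - u) + lam ** 2 * sigma ** 2 / 2) \
        * Phi((u - mu - lam * sigma ** 2) / sigma)

# midpoint-rule integral over a range covering essentially all the mass
lo, hi, steps = -4.0, 14.0, 36_000
h = (hi - lo) / steps
total = h * sum(g(lo + (i + 0.5) * h) for i in range(steps))

# direct simulation of U = X + Y; E[U] = mu + 1/lam
random.seed(6)
sim_mean = statistics.fmean(random.expovariate(lam) + random.gauss(mu, sigma)
                            for _ in range(200_000))
```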