Some additional topics. Distributions of functions of random variables: the gamma distribution, the χ² distribution, and the exponential distribution.


Some additional Topics

Distributions of functions of random variables: gamma distribution, χ² distribution, exponential distribution

Theorem: Let X and Y denote independent random variables, each having a gamma distribution with parameters (λ, α₁) and (λ, α₂) respectively. Then W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂). Proof: since X and Y are independent, M_W(t) = M_X(t) M_Y(t) = (1 − t/λ)^(−α₁) (1 − t/λ)^(−α₂) = (1 − t/λ)^(−(α₁ + α₂)).

Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂), we conclude that W = X + Y has a gamma distribution with parameters (λ, α₁ + α₂).

Theorem (extension to n RVs): Let X₁, X₂, …, X_n denote n independent random variables, each having a gamma distribution with parameters (λ, α_i), i = 1, 2, …, n. Then W = X₁ + X₂ + … + X_n has a gamma distribution with parameters (λ, α₁ + α₂ + … + α_n). Proof: by independence, M_W(t) = M_{X₁}(t) ⋯ M_{X_n}(t) = (1 − t/λ)^(−(α₁ + α₂ + … + α_n)).

Recognizing that this is the moment generating function of the gamma distribution with parameters (λ, α₁ + α₂ + … + α_n), we conclude that W = X₁ + X₂ + … + X_n has a gamma distribution with parameters (λ, α₁ + α₂ + … + α_n).
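As an added illustration (not part of the original slides), here is a minimal Monte Carlo check of this additivity property, assuming NumPy and SciPy are available; the rate λ and the shapes α_i below are arbitrary illustrative values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam = 2.0                      # common rate parameter lambda (illustrative value)
alphas = [1.5, 2.0, 0.5]       # shape parameters alpha_1, ..., alpha_n (illustrative)

# Draw independent Gamma(lambda, alpha_i) samples and add them; NumPy's gamma
# sampler is parameterized by (shape, scale), with scale = 1 / lambda.
w = sum(rng.gamma(a, 1.0 / lam, size=100_000) for a in alphas)

# Compare the sum with Gamma(lambda, alpha_1 + ... + alpha_n) via a Kolmogorov-Smirnov test.
ks = stats.kstest(w, stats.gamma(a=sum(alphas), scale=1.0 / lam).cdf)
print(ks.pvalue)               # a large p-value is consistent with the theorem
```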

Theorem: Suppose that X is a random variable having a gamma distribution with parameters (λ, α). Then W = aX has a gamma distribution with parameters (λ/a, α). Proof: M_W(t) = E[e^{t(aX)}] = M_X(at) = (1 − at/λ)^(−α) = (1 − t/(λ/a))^(−α), which is the moment generating function of a gamma distribution with parameters (λ/a, α).

Special cases:
1. Let X and Y be independent random variables having an exponential distribution with parameter λ; then X + Y has a gamma distribution with α = 2 and rate λ.
2. Let X₁, X₂, …, X_n be independent random variables having an exponential distribution with parameter λ; then S = X₁ + X₂ + … + X_n has a gamma distribution with α = n and rate λ.
3. Let X₁, X₂, …, X_n be independent random variables having an exponential distribution with parameter λ; then the sample mean X̄ = S/n has a gamma distribution with α = n and rate nλ (a simulation sketch of this case follows below).
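A small simulation sketch of special case 3, added here for illustration; it assumes NumPy/SciPy, with λ = ¼ and n = 5 chosen arbitrarily.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lam, n = 0.25, 5               # illustrative rate and sample size

# Sample means of n independent Exponential(lambda) variables.
xbar = rng.exponential(scale=1.0 / lam, size=(100_000, n)).mean(axis=1)

# Special case 3: the sample mean should follow Gamma(alpha = n, rate = n * lambda).
print(stats.kstest(xbar, stats.gamma(a=n, scale=1.0 / (n * lam)).cdf).pvalue)
```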

Another illustration of the central limit theorem (figure: the population distribution is exponential).

Special cases (continued):
4. Let X and Y be independent random variables having χ² distributions with ν₁ and ν₂ degrees of freedom respectively; then X + Y has a χ² distribution with ν₁ + ν₂ degrees of freedom.
5. Let X₁, X₂, …, X_n be independent random variables having χ² distributions with ν₁, ν₂, …, ν_n degrees of freedom respectively; then X₁ + X₂ + … + X_n has a χ² distribution with ν₁ + … + ν_n degrees of freedom.
Both of these properties follow from the fact that a χ² random variable with ν degrees of freedom is a gamma random variable with λ = ½ and α = ν/2.

Recall: if Z has a standard normal distribution, then Z² has a χ² distribution with 1 degree of freedom. Thus if Z₁, Z₂, …, Z_ν are independent random variables, each having a standard normal distribution, then Z₁² + Z₂² + … + Z_ν² has a χ² distribution with ν degrees of freedom.
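An added sketch (assuming NumPy/SciPy; ν = 7 is an arbitrary choice) that checks this fact by simulation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
nu = 7                                        # degrees of freedom, illustrative

z = rng.standard_normal(size=(100_000, nu))   # independent standard normals
w = (z ** 2).sum(axis=1)                      # z_1^2 + ... + z_nu^2

# The sum of nu squared standard normals should be chi-square with nu degrees of freedom.
print(stats.kstest(w, stats.chi2(df=nu).cdf).pvalue)
```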

Theorem: Suppose that U₁ and U₂ are independent random variables and that U = U₁ + U₂. Suppose that U₁ and U have χ² distributions with ν₁ and ν degrees of freedom respectively (ν₁ < ν). Then U₂ has a χ² distribution with ν₂ = ν − ν₁ degrees of freedom. Proof: by independence, M_U(t) = M_{U₁}(t) M_{U₂}(t), so M_{U₂}(t) = (1 − 2t)^(−ν/2) / (1 − 2t)^(−ν₁/2) = (1 − 2t)^(−(ν − ν₁)/2), the moment generating function of a χ² distribution with ν − ν₁ degrees of freedom.

Q.E.D.

Bivariate Distributions: Discrete Random Variables

The joint probability function: p(x, y) = P[X = x, Y = y].

Marginal distributions: p_X(x) = Σ_y p(x, y) and p_Y(y) = Σ_x p(x, y). Conditional distributions: p_{Y|X}(y|x) = p(x, y) / p_X(x) and p_{X|Y}(x|y) = p(x, y) / p_Y(y).

The product rule for discrete distributions: p(x, y) = p_X(x) · p_{Y|X}(y|x) = p_Y(y) · p_{X|Y}(x|y). Independence: X and Y are independent if p(x, y) = p_X(x) · p_Y(y) for all x and y.

Bayes rule for discrete distributions: p_{X|Y}(x|y) = p_{Y|X}(y|x) · p_X(x) / p_Y(y). Proof: p_{X|Y}(x|y) = p(x, y) / p_Y(y) = p_{Y|X}(y|x) · p_X(x) / p_Y(y), using the product rule.
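To make the rule concrete, here is an added numerical sketch in Python; the joint pmf values are made up purely for illustration.

```python
import numpy as np

# A small joint pmf p(x, y) on x in {0, 1, 2}, y in {0, 1}; the values are illustrative.
p = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.05, 0.25]])

p_x = p.sum(axis=1)             # marginal pmf of X
p_y = p.sum(axis=0)             # marginal pmf of Y
p_y_given_x = p / p_x[:, None]  # conditional pmf p(y | x)

# Bayes rule: p(x | y) = p(y | x) * p_X(x) / p_Y(y)
p_x_given_y = p_y_given_x * p_x[:, None] / p_y[None, :]

# It must agree with the direct definition p(x, y) / p_Y(y).
print(np.allclose(p_x_given_y, p / p_y[None, :]))   # True
```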

Continuous Random Variables

Definition: Two random variables X and Y are said to have joint probability density function f(x, y) if P[(X, Y) ∈ A] = ∬_A f(x, y) dx dy for any region A of the plane.

Marginal distributions: f_X(x) = ∫ f(x, y) dy and f_Y(y) = ∫ f(x, y) dx. Conditional distributions: f_{Y|X}(y|x) = f(x, y) / f_X(x) and f_{X|Y}(x|y) = f(x, y) / f_Y(y).

The product rule for continuous distributions: f(x, y) = f_X(x) · f_{Y|X}(y|x) = f_Y(y) · f_{X|Y}(x|y). Independence: X and Y are independent if f(x, y) = f_X(x) · f_Y(y) for all x and y.

Bayes rule for continuous distributions: f_{X|Y}(x|y) = f_{Y|X}(y|x) · f_X(x) / f_Y(y). Proof: f_{X|Y}(x|y) = f(x, y) / f_Y(y) = f_{Y|X}(y|x) · f_X(x) / f_Y(y), using the product rule.

Example: Suppose that to perform a task we first have to recognize the task, then perform it. Suppose that the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4). Once the task is recognized, the time to perform the task, Y, is uniform from X/2 to 2X.
1. Find the joint density of X and Y.
2. Find the conditional density of X given Y = y.

Now f_X(x) = (1/4) e^(−x/4) for x > 0, and f_{Y|X}(y|x) = 1/(2x − x/2) = 2/(3x) for x/2 ≤ y ≤ 2x. Thus f(x, y) = f_X(x) · f_{Y|X}(y|x) = e^(−x/4) / (6x) for x > 0 and x/2 ≤ y ≤ 2x, and 0 otherwise.
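As an added cross-check of the derived joint density (a sketch assuming NumPy/SciPy; the event {Y ≤ 3} is an arbitrary choice), we can compare simulation of the model with numerical integration of f(x, y).

```python
import numpy as np
from scipy import integrate

rng = np.random.default_rng(3)

# Simulate the model: X ~ Exponential(lambda = 1/4), then Y | X = x ~ Uniform(x/2, 2x).
x = rng.exponential(scale=4.0, size=500_000)
y = rng.uniform(low=x / 2, high=2 * x)

# The joint density derived above: f(x, y) = (1/4) e^(-x/4) * 2/(3x) on x/2 <= y <= 2x.
def f(y_, x_):
    return np.exp(-x_ / 4.0) / (6.0 * x_)

# Check P[Y <= 3] by simulation against numerical integration of f over the region
# {x/2 <= y <= min(2x, 3)}, which is empty once x > 6.
p_sim = (y <= 3).mean()
p_num, _ = integrate.dblquad(f, 0.0, 6.0,
                             lambda x_: x_ / 2.0,
                             lambda x_: min(2.0 * x_, 3.0))
print(p_sim, p_num)   # the two probabilities should agree closely
```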

Graph of non-zero region of f(x,y)

Bayes rule for continuous distributions

Conditional Expectation: Let U = g(X, Y) denote any function of X and Y. Then E[U | X = x] = ∫ g(x, y) f_{Y|X}(y|x) dy is called the conditional expectation of U = g(X, Y) given X = x.

Conditional Expectation and Variance: More specifically, E[Y | X = x] = ∫ y f_{Y|X}(y|x) dy is called the conditional expectation of Y given X = x, and Var[Y | X = x] = ∫ (y − E[Y | X = x])² f_{Y|X}(y|x) dy is called the conditional variance of Y given X = x.

An important rule: for any function U = g(X, Y), E[U] = E_X[ E[U | X] ] and Var[U] = E_X[ Var[U | X] ] + Var_X[ E[U | X] ], where E_X and Var_X denote mean and variance with respect to the marginal distribution of X, f_X(x).

Proof: Let U = g(X, Y) denote any function of X and Y. Then E_X[ E[U | X] ] = ∫ ( ∫ g(x, y) f_{Y|X}(y|x) dy ) f_X(x) dx = ∬ g(x, y) f(x, y) dy dx = E[U].

Now Var[U] = E[U²] − (E[U])², and by the result just proved E[U²] = E_X[ E[U² | X] ]. Since E[U² | X] = Var[U | X] + (E[U | X])², it follows that Var[U] = E_X[ Var[U | X] ] + E_X[ (E[U | X])² ] − (E_X[ E[U | X] ])² = E_X[ Var[U | X] ] + Var_X[ E[U | X] ].

Example: Suppose that to perform a task we first have to recognize the task, then perform it. Suppose that the time to recognize the task, X, has an exponential distribution with λ = ¼ (i.e., mean μ = 1/λ = 4). Once the task is recognized, the time to perform the task, Y, is uniform from X/2 to 2X.
1. Find E[XY].
2. Find Var[XY].

Solution
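An added Monte Carlo sketch of this example (assuming NumPy); the closed-form targets in the comments are worked out from the conditional-expectation rule above, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(4)

# Model from the example: X ~ Exponential(lambda = 1/4), Y | X = x ~ Uniform(x/2, 2x).
x = rng.exponential(scale=4.0, size=1_000_000)
y = rng.uniform(low=x / 2, high=2 * x)
u = x * y

# The conditional-expectation rule gives E[XY] = E_X[X * E[Y|X]] = (5/4) E[X^2] = 40 and
# Var[XY] = E_X[X^2 * E[Y^2|X]] - E[XY]^2 = (7/4) E[X^4] - 1600 = 9152 for lambda = 1/4.
print(u.mean())   # should land near 40
print(u.var())    # should land near 9152
```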

Conditional Expectation: k (>2) random variables

Definition: Let X₁, X₂, …, X_q, X_{q+1}, …, X_k denote k continuous random variables with joint probability density function f(x₁, x₂, …, x_q, x_{q+1}, …, x_k). Then the conditional joint probability density function of X₁, X₂, …, X_q given X_{q+1} = x_{q+1}, …, X_k = x_k is f(x₁, …, x_q | x_{q+1}, …, x_k) = f(x₁, …, x_k) / f_{q+1,…,k}(x_{q+1}, …, x_k), where f_{q+1,…,k} denotes the marginal density of X_{q+1}, …, X_k.

Definition: Let U = h(X₁, X₂, …, X_q, X_{q+1}, …, X_k). Then the conditional expectation of U given X_{q+1} = x_{q+1}, …, X_k = x_k is E[U | x_{q+1}, …, x_k] = ∫ ⋯ ∫ h(x₁, …, x_k) f(x₁, …, x_q | x_{q+1}, …, x_k) dx₁ ⋯ dx_q. Note this will be a function of x_{q+1}, …, x_k.

Example: Let X, Y, Z denote three jointly distributed random variables with joint density function f(x, y, z). Determine the conditional expectation of U = X² + Y + Z given X = x, Y = y.

The marginal distribution of X, Y is f_{X,Y}(x, y) = ∫ f(x, y, z) dz. Thus the conditional distribution of Z given X = x, Y = y is f(z | x, y) = f(x, y, z) / f_{X,Y}(x, y).

The conditional expectation of U = X² + Y + Z given X = x, Y = y is E[U | x, y] = x² + y + E[Z | x, y].

Thus the conditional expectation of U = X² + Y + Z given X = x, Y = y.

The rule for conditional expectation: Let (x₁, x₂, …, x_q, y₁, y₂, …, y_m) = (x, y) denote q + m random variables. Then E_{(x,y)}[ g(x, y) ] = E_y[ E[ g(x, y) | y ] ].

Proof (in the simple case of 2 variables X and Y)

E[g(X, Y)] = ∬ g(x, y) f(x, y) dx dy, hence, using f(x, y) = f_{X|Y}(x|y) f_Y(y), E[g(X, Y)] = ∫ ( ∫ g(x, y) f_{X|Y}(x|y) dx ) f_Y(y) dy.

Now the inner integral is E[g(X, Y) | Y = y], so E[g(X, Y)] = E_Y[ E[g(X, Y) | Y] ].

The probability of a gambler's ruin

Suppose a gambler is playing a game in which he wins $1 with probability p and loses $1 with probability q = 1 − p. Note the game is fair if p = q = ½. Suppose also that he starts with an initial fortune of $i and plays the game until he reaches a fortune of $n or he loses all his money (his fortune reaches $0). What is the probability that he achieves his goal? What is the probability that he loses his fortune?

Let P_i = the probability that he achieves his goal, and let Q_i = 1 − P_i = the probability that he loses his fortune. Let X = the amount that he has won after finishing the game. If the game is fair, then E[X] = (n − i)P_i + (−i)Q_i = (n − i)P_i + (−i)(1 − P_i) = 0, so (n − i)P_i = i(1 − P_i), hence (n − i + i)P_i = i, that is, P_i = i/n.

If the game is not fair (p ≠ q), condition on the outcome of the first play: P_i = p·P_{i+1} + q·P_{i−1} for 0 < i < n, with boundary conditions P₀ = 0 and P_n = 1. Thus p(P_{i+1} − P_i) = q(P_i − P_{i−1}), or P_{i+1} − P_i = (q/p)(P_i − P_{i−1}).

Note that P₁ − P₀ = P₁. Also, iterating the relation above gives P_{i+1} − P_i = (q/p)^i · P₁.

Hence P_i = (P_i − P_{i−1}) + … + (P₁ − P₀) = P₁(1 + r + … + r^{i−1}), or P_i = P₁ · (1 − r^i)/(1 − r), where r = q/p.

Note that P_n = 1, thus P₁ = (1 − r)/(1 − r^n), and therefore P_i = (1 − r^i)/(1 − r^n).
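An added simulation sketch (assuming NumPy) comparing this formula with direct simulation of the game; the values i = 5, n = 10 and the two choices of p are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)

def ruin_formula(i, n, p):
    """Probability of reaching fortune n before 0, starting from fortune i."""
    q = 1.0 - p
    if p == q:                              # fair game
        return i / n
    r = q / p
    return (1.0 - r ** i) / (1.0 - r ** n)

def simulate(i, n, p, trials=50_000):
    wins = 0
    for _ in range(trials):
        fortune = i
        while 0 < fortune < n:              # play until ruin or the goal is reached
            fortune += 1 if rng.random() < p else -1
        wins += fortune == n
    return wins / trials

for p in (0.5, 0.45):
    print(p, ruin_formula(5, 10, p), simulate(5, 10, p))
```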

Table: numerical values of P_i for various choices of p, i, and n.

A waiting time paradox

Suppose that each person in a restaurant is being served in an "equal" time. That is, in a group of n people the probability that any one person took the longest time is the same for each person, namely 1/n. Suppose that a person starts asking people as they leave: "How long did it take you to be served?" He continues until he finds someone who took longer than himself. Let X = the number of people that he has to ask. Then E[X] = ∞.

Proof: P[X > x] = the probability that in the group of the first x people asked, together with himself, he took the longest = 1/(x + 1).

Thus E[X] = Σ_{x=0}^∞ P[X > x] = Σ_{x=0}^∞ 1/(x + 1) = 1 + 1/2 + 1/3 + …, the harmonic series, which diverges. Hence E[X] = ∞.
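An added simulation sketch (assuming NumPy) of the paradox: because E[X] is infinite, the sample mean of X keeps drifting upward as more diners are simulated.

```python
import numpy as np

rng = np.random.default_rng(6)

def waiting_count():
    """Number of people asked until one took longer than the reference person."""
    own = rng.random()              # the asker's own (continuous) service time
    count = 1
    while rng.random() <= own:      # keep asking while the answers are shorter
        count += 1
    return count

for trials in (1_000, 10_000, 100_000):
    xs = np.array([waiting_count() for _ in range(trials)])
    print(trials, xs.mean())        # the sample mean keeps growing: E[X] is infinite
```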