Functions of Random Variables

Methods for determining the distribution of functions of Random Variables:
1. Distribution function method
2. Moment generating function method
3. Transformation method

Distribution function method
Let $X, Y, Z, \ldots$ have joint density $f(x, y, z, \ldots)$, and let $W = h(X, Y, Z, \ldots)$.
First step: find the distribution function of $W$:
$$G(w) = P[W \le w] = P[h(X, Y, Z, \ldots) \le w].$$
Second step: find the density function of $W$ by differentiation:
$$g(w) = G'(w).$$
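
A minimal numerical sketch of the method (the example and parameters are my own, not from the deck): take $X \sim \text{Exponential}(\lambda)$ and $W = X^2$. Then $G(w) = P[X^2 \le w] = P[X \le \sqrt{w}] = 1 - e^{-\lambda\sqrt{w}}$, and $g(w) = G'(w) = \frac{\lambda}{2\sqrt{w}}\,e^{-\lambda\sqrt{w}}$. The code checks the derived $G(w)$ against the empirical CDF of simulated draws.

```python
import numpy as np

# Distribution function method for W = X**2, X ~ Exponential(lam):
#   G(w) = P[X**2 <= w] = P[X <= sqrt(w)] = 1 - exp(-lam*sqrt(w))
#   g(w) = G'(w) = lam * exp(-lam*sqrt(w)) / (2*sqrt(w))
lam = 1.5
rng = np.random.default_rng(0)
w = rng.exponential(scale=1 / lam, size=1_000_000) ** 2  # draws of W

grid = np.linspace(0.1, 4.0, 40)
G = 1 - np.exp(-lam * np.sqrt(grid))                # derived distribution function
emp = np.searchsorted(np.sort(w), grid) / w.size    # empirical CDF of the draws
print(np.abs(emp - G).max())                        # ~1e-3: the two agree
```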

Use of moment generating functions
1. Using the moment generating functions of $X, Y, Z, \ldots$, determine the moment generating function of $W = h(X, Y, Z, \ldots)$.
2. Identify the distribution of $W$ from its moment generating function.
This procedure works well for sums, linear combinations, etc.
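
A quick symbolic illustration (my own sketch): for independent $X \sim \text{Poisson}(\lambda_1)$ and $Y \sim \text{Poisson}(\lambda_2)$, the product of the two MGFs equals $e^{(\lambda_1+\lambda_2)(e^t-1)}$, which is recognizably the MGF of $\text{Poisson}(\lambda_1+\lambda_2)$.

```python
import sympy as sp

t = sp.symbols('t')
lam1, lam2 = sp.symbols('lambda1 lambda2', positive=True)

# MGF of a Poisson(lam) random variable: E[e^{tX}] = exp(lam*(e^t - 1))
m_X = sp.exp(lam1 * (sp.exp(t) - 1))
m_Y = sp.exp(lam2 * (sp.exp(t) - 1))

# Independence: m_{X+Y}(t) = m_X(t) * m_Y(t); identify the result.
target = sp.exp((lam1 + lam2) * (sp.exp(t) - 1))  # MGF of Poisson(lam1 + lam2)
print((m_X * m_Y).equals(target))                 # True
```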

Some Useful Rules
1. Let $X$ be a random variable with moment generating function $m_X(t)$, and let $Y = bX + a$. Then
$$m_Y(t) = m_{bX+a}(t) = E\!\left(e^{(bX+a)t}\right) = e^{at}\, m_X(bt).$$
2. Let $X$ and $Y$ be two independent random variables with moment generating functions $m_X(t)$ and $m_Y(t)$. Then
$$m_{X+Y}(t) = m_X(t)\, m_Y(t).$$
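
A small Monte Carlo sanity check of rule 1 (my own sketch, with parameters chosen arbitrarily): with $X \sim N(0,1)$ we have $m_X(t) = e^{t^2/2}$, so for $Y = 2X + 3$ the rule predicts $m_Y(t) = e^{3t}\,e^{(2t)^2/2}$.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(2_000_000)
y = 2 * x + 3                          # Y = bX + a with b = 2, a = 3

t = 0.4
empirical = np.exp(t * y).mean()       # Monte Carlo estimate of E[e^{tY}]
predicted = np.exp(3 * t) * np.exp((2 * t) ** 2 / 2)  # e^{at} * m_X(bt)
print(empirical, predicted)            # agree to a couple of decimal places
```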

M. G. F.’s - Continuous distributions
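Standard examples include:
Uniform on $(a, b)$: $m_X(t) = \dfrac{e^{bt} - e^{at}}{(b - a)t}$ for $t \ne 0$, with $m_X(0) = 1$
Exponential($\lambda$): $m_X(t) = \dfrac{\lambda}{\lambda - t}$ for $t < \lambda$
Gamma($\alpha, \lambda$): $m_X(t) = \left(\dfrac{\lambda}{\lambda - t}\right)^{\alpha}$ for $t < \lambda$
$\chi^2$ with $\nu$ degrees of freedom: $m_X(t) = (1 - 2t)^{-\nu/2}$ for $t < \tfrac{1}{2}$
Normal($\mu, \sigma^2$): $m_X(t) = e^{\mu t + \sigma^2 t^2/2}$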

M. G. F.’s - Discrete distributions
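Standard examples include (with $q = 1 - p$):
Bernoulli($p$): $m_X(t) = q + pe^{t}$
Binomial($n, p$): $m_X(t) = (q + pe^{t})^{n}$
Geometric($p$): $m_X(t) = \dfrac{pe^{t}}{1 - qe^{t}}$ for $qe^{t} < 1$
Negative Binomial($r, p$): $m_X(t) = \left(\dfrac{pe^{t}}{1 - qe^{t}}\right)^{r}$ for $qe^{t} < 1$
Poisson($\lambda$): $m_X(t) = e^{\lambda(e^{t} - 1)}$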

The Transformation Method
Theorem. Let $X$ denote a random variable with probability density function $f(x)$ and let $U = h(X)$. Assume that $h(x)$ is either strictly increasing or strictly decreasing. Then the probability density of $U$ is
$$g(u) = f\!\left(h^{-1}(u)\right)\left|\frac{d}{du}\,h^{-1}(u)\right|.$$
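
A quick sketch of the theorem in action (the example is my own): with $X \sim N(0,1)$ and $U = e^X$, we have $h^{-1}(u) = \ln u$ and $\left|\frac{d}{du}\ln u\right| = \frac{1}{u}$, so $g(u) = \phi(\ln u)/u$, the lognormal density.

```python
import numpy as np

rng = np.random.default_rng(2)
u = np.exp(rng.standard_normal(1_000_000))     # U = e^X, X ~ N(0, 1)

hist, edges = np.histogram(u, bins=np.linspace(0, 20, 241), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# g(u) = f(h^{-1}(u)) * |d/du h^{-1}(u)| = phi(ln u) / u  (lognormal density)
g = np.exp(-np.log(centers) ** 2 / 2) / (np.sqrt(2 * np.pi) * centers)

print(np.abs(hist - g).max())  # small (~0.01): histogram matches derived g(u)
```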

The Transformation Method (many variables)
Theorem. Let $x_1, x_2, \ldots, x_n$ denote random variables with joint probability density function $f(x_1, x_2, \ldots, x_n)$. Let
$$u_1 = h_1(x_1, x_2, \ldots, x_n),\quad u_2 = h_2(x_1, x_2, \ldots, x_n),\quad \ldots,\quad u_n = h_n(x_1, x_2, \ldots, x_n)$$
define an invertible transformation from the $x$'s to the $u$'s.

Then the joint probability density function of $u_1, u_2, \ldots, u_n$ is given by
$$g(u_1, u_2, \ldots, u_n) = f\!\left(x_1(u_1, \ldots, u_n), \ldots, x_n(u_1, \ldots, u_n)\right)\,|J|,$$
where
$$J = \det\!\left[\frac{\partial(x_1, x_2, \ldots, x_n)}{\partial(u_1, u_2, \ldots, u_n)}\right]$$
is the Jacobian of the transformation.
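
As a worked sketch (my own example): let $X, Y$ be independent $\text{Exponential}(1)$ and set $U = X + Y$, $V = X/(X+Y)$. The inverse transform is $x = uv$, $y = u(1-v)$, and the Jacobian gives $g(u, v) = u\,e^{-u}$ on $u > 0$, $0 < v < 1$, i.e. $U \sim \text{Gamma}(2, 1)$ independent of $V \sim \text{Uniform}(0, 1)$.

```python
import sympy as sp

u, v = sp.symbols('u v', positive=True)

# Inverse transformation: x = u*v, y = u*(1 - v)
x = u * v
y = u * (1 - v)

# Jacobian of (x, y) with respect to (u, v)
J = sp.Matrix([[sp.diff(x, u), sp.diff(x, v)],
               [sp.diff(y, u), sp.diff(y, v)]]).det()

f = sp.exp(-(x + y))                   # joint density of independent Exp(1) pair
g = sp.simplify(f * sp.Abs(J))
print(g)                               # u*exp(-u): Gamma(2,1) times Uniform(0,1)
```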

The probability of a gambler's ruin

Suppose a gambler is playing a game in which he wins $1 with probability p and loses $1 with probability q = 1 − p. Note that the game is fair if p = q = ½. Suppose also that he starts with an initial fortune of $i and plays the game until he reaches a fortune of $n or loses all his money (his fortune reaches $0). What is the probability that he achieves his goal? What is the probability that he loses his fortune?

Let $P_i$ = the probability that he achieves his goal, and let $Q_i = 1 - P_i$ = the probability that he loses his fortune. Let $X$ = the amount that he has won after finishing the game. If the game is fair, then
$$E[X] = (n - i)P_i + (-i)Q_i = (n - i)P_i + (-i)(1 - P_i) = 0,$$
or $(n - i)P_i = i(1 - P_i)$, and $(n - i + i)P_i = i$, so that
$$P_i = \frac{i}{n}.$$

If the game is not fair, condition on the outcome of the first play:
$$P_i = p\,P_{i+1} + q\,P_{i-1}, \qquad P_0 = 0,\quad P_n = 1.$$
Thus, since $p + q = 1$,
$$p\,(P_{i+1} - P_i) = q\,(P_i - P_{i-1}),$$
or
$$P_{i+1} - P_i = \frac{q}{p}\,(P_i - P_{i-1}).$$

Note that iterating gives
$$P_{i+1} - P_i = \left(\frac{q}{p}\right)^{i}(P_1 - P_0) = \left(\frac{q}{p}\right)^{i}P_1.$$
Also, the sum telescopes:
$$P_i = \sum_{k=0}^{i-1}(P_{k+1} - P_k) = P_1\sum_{k=0}^{i-1}\left(\frac{q}{p}\right)^{k}.$$

Hence
$$P_i = P_1\,\frac{1 - r^{i}}{1 - r} \quad (r \ne 1),$$
or $P_i = i\,P_1$ if $r = 1$, where $r = q/p$.

Note that $P_n = 1$, thus
$$P_1 = \frac{1 - r}{1 - r^{n}},$$
and
$$P_i = \frac{1 - r^{i}}{1 - r^{n}} \quad (p \ne q), \qquad P_i = \frac{i}{n} \quad (p = q = \tfrac{1}{2}).$$
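
A simulation sketch (the parameter values are my own choices) comparing Monte Carlo play against the closed-form probabilities:

```python
import numpy as np

def ruin_sim(p, i, n, trials=20_000, seed=3):
    """Monte Carlo estimate of P[fortune reaches n before 0], starting at i."""
    rng = np.random.default_rng(seed)
    wins = 0
    for _ in range(trials):
        fortune = i
        while 0 < fortune < n:
            fortune += 1 if rng.random() < p else -1
        wins += (fortune == n)
    return wins / trials

p, i, n = 0.48, 10, 20
r = (1 - p) / p
print(ruin_sim(p, i, n), (1 - r**i) / (1 - r**n))  # unfair game: formula above
print(ruin_sim(0.5, i, n), i / n)                  # fair game: P_i = i/n
```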

A waiting time paradox

Suppose that each person in a restaurant is served in an “equal” time. That is, in a group of $n$ people the probability that any given person took the longest time is the same for each person, namely $\frac{1}{n}$. Suppose that a person starts asking people as they leave: “How long did it take you to be served?” He continues until he finds someone who took longer than he did. Let $X$ = the number of people that he has to ask. Then $E[X] = \infty$.

Proof:
$$P[X > x] = P[\text{in the group of the first } x \text{ people together with himself, he took the longest}] = \frac{1}{x + 1}.$$

Thus
$$E[X] = \sum_{x=0}^{\infty} P[X > x] = \sum_{x=0}^{\infty} \frac{1}{x+1} = 1 + \frac{1}{2} + \frac{1}{3} + \cdots,$$
the harmonic series, which diverges, so $E[X] = \infty$.

The harmonic series
$$\sum_{k=1}^{\infty}\frac{1}{k} = 1 + \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + \left(\frac{1}{5} + \cdots + \frac{1}{8}\right) + \cdots \ge 1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots = \infty,$$
since each parenthesized group of $2^{m}$ terms is at least $\frac{1}{2}$.
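
A simulation sketch of the paradox (my own construction, with uniform service times): conditional on our own time $m$, each person asked takes longer with probability $1 - m$, so $X \mid m$ is geometric, and unconditionally $P[X > x] = \int_0^1 m^x\,dm = \frac{1}{x+1}$, matching the proof above.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
mine = rng.random(n)                 # our own service time in each replication
samples = rng.geometric(1.0 - mine)  # X | mine  ~  Geometric(1 - mine)

# E[X] is infinite, so the running sample mean keeps growing (roughly like
# the logarithm of the sample size) instead of converging.
for m in (1_000, 10_000, 100_000):
    print(m, samples[:m].mean())
```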