Moment Generating Functions

Presentation transcript:

Moment generating function

The moment generating function of a random variable X is given by

M_X(t) = E[e^{tX}] = Σ_x e^{tx} p(x) if X is discrete, or ∫ e^{tx} f(x) dx if X is continuous.

Differentiating and evaluating at t = 0 recovers the mean: M_X'(0) = E[X].

More generally, the nth derivative at t = 0 gives the nth moment:

M_X^{(n)}(0) = E[X^n], n = 1, 2, ...
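As a quick numerical sanity check (not part of the slides), the moment-extraction property can be verified by differentiating a known MGF at t = 0. Here the Exponential distribution with rate 2 is used as an illustrative example, where M(t) = 2/(2 − t), E[X] = 1/2, and E[X²] = 1/2:

```python
# MGF of an Exponential(rate = 2) random variable: M(t) = 2 / (2 - t), t < 2.
def mgf(t, rate=2.0):
    return rate / (rate - t)

h = 1e-4
# Central-difference approximations of M'(0) and M''(0):
m1 = (mgf(h) - mgf(-h)) / (2 * h)            # should be E[X]   = 1/rate   = 0.5
m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2  # should be E[X^2] = 2/rate^2 = 0.5

print(m1, m2)
```

Both numerical derivatives land within rounding error of the exact moments.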

Example: X has the Poisson distribution with parameter λ. Then

M_X(t) = Σ_{k=0}^{∞} e^{tk} e^{−λ} λ^k / k! = e^{−λ} Σ_{k=0}^{∞} (λe^t)^k / k! = e^{−λ} e^{λe^t} = e^{λ(e^t − 1)}.
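The closed form can be checked against a truncated version of the defining series; this is a small sketch, with λ = 2 and t = 0.5 chosen arbitrarily:

```python
import math

lam, t = 2.0, 0.5

# Truncated series for M_X(t) = sum_k e^{t k} * e^{-lam} * lam^k / k!
series = sum(math.exp(t * k) * math.exp(-lam) * lam**k / math.factorial(k)
             for k in range(60))

# Closed form derived on the slide: exp(lam * (e^t - 1))
closed_form = math.exp(lam * (math.exp(t) - 1.0))
```

Sixty terms are more than enough for the series to agree with the closed form to near machine precision.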

If X and Y are independent, then

M_{X+Y}(t) = E[e^{t(X+Y)}] = E[e^{tX}] E[e^{tY}] = M_X(t) M_Y(t):

the moment generating function of the sum of two independent random variables is the product of the individual moment generating functions.

Let Y = X_1 + X_2, where X_1 ~ Poisson(λ_1) and X_2 ~ Poisson(λ_2), and X_1 and X_2 are independent. Then

M_Y(t) = M_{X_1}(t) M_{X_2}(t) = e^{λ_1(e^t − 1)} e^{λ_2(e^t − 1)} = e^{(λ_1 + λ_2)(e^t − 1)},

which is the moment generating function of a Poisson(λ_1 + λ_2) random variable, so Y ~ Poisson(λ_1 + λ_2).
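The same conclusion can be confirmed directly, without transforms, by convolving the two pmfs; a small sketch with λ_1 = 1.5 and λ_2 = 2.5 as example values:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam1, lam2 = 1.5, 2.5

def conv_pmf(n):
    # P(Y = n) = sum_j P(X1 = j) P(X2 = n - j), by independence
    return sum(poisson_pmf(j, lam1) * poisson_pmf(n - j, lam2)
               for j in range(n + 1))

# The convolution matches the Poisson(lam1 + lam2) pmf, as the MGF argument predicts.
ok = all(abs(conv_pmf(n) - poisson_pmf(n, lam1 + lam2)) < 1e-12
         for n in range(20))
```

This is the "hard way" that the MGF argument on the slide lets us skip.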

Note: The moment generating function uniquely determines the distribution.

Markov's inequality

If X is a random variable that takes only nonnegative values, then for any a > 0,

P(X ≥ a) ≤ E[X] / a.

Proof (in the case where X is continuous):

E[X] = ∫_0^∞ x f(x) dx ≥ ∫_a^∞ x f(x) dx ≥ ∫_a^∞ a f(x) dx = a P(X ≥ a),

and dividing both sides by a gives the inequality.
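For a distribution with a known tail, the bound can be checked exactly. As an illustrative example (not from the slides), take X ~ Exponential(1), so E[X] = 1 and P(X ≥ a) = e^{−a}:

```python
import math

# X ~ Exponential(1): E[X] = 1 and the exact tail is P(X >= a) = e^{-a}.
mean = 1.0
for a in [0.5, 1.0, 2.0, 5.0, 10.0]:
    tail = math.exp(-a)   # exact tail probability
    bound = mean / a      # Markov bound E[X] / a
    assert tail <= bound
```

Note the bound is loose (for a ≤ 1 it exceeds 1, which is trivially true), but it requires knowing nothing beyond the mean.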

Strong law of large numbers

Let X_1, X_2, ..., X_n be a set of independent random variables having a common distribution, and let E[X_i] = μ. Then, with probability 1,

(X_1 + X_2 + ... + X_n) / n → μ as n → ∞.
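A seeded simulation illustrates the convergence; here the X_i are Uniform(0, 1), so μ = 0.5 (the choice of distribution and sample size are just for demonstration):

```python
import random

random.seed(0)

# X_i ~ Uniform(0, 1), so mu = E[X_i] = 0.5.  The sample mean should
# settle near mu for large n (strong law of large numbers).
n = 100_000
sample_mean = sum(random.random() for _ in range(n)) / n
```

With n = 100,000 draws, the sample mean sits within about a hundredth of μ.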

Central Limit Theorem

Let X_1, X_2, ..., X_n be a set of independent random variables having a common distribution with mean μ and variance σ². Then the distribution of

(X_1 + X_2 + ... + X_n − nμ) / (σ√n)

tends to the standard normal distribution as n → ∞.
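A seeded simulation sketch of the statement, again using Uniform(0, 1) summands (μ = 0.5, σ² = 1/12) as an illustrative choice:

```python
import math
import random

random.seed(1)

# Standardize sums of n Uniform(0, 1) variables: mu = 0.5, sigma^2 = 1/12.
mu, sigma, n = 0.5, math.sqrt(1 / 12), 50
zs = [(sum(random.random() for _ in range(n)) - n * mu) / (sigma * math.sqrt(n))
      for _ in range(2000)]

# The standardized sums should look standard normal: mean near 0,
# variance near 1, about half the mass below 0.
z_mean = sum(zs) / len(zs)
z_var = sum(z * z for z in zs) / len(zs)
frac_below_zero = sum(z < 0 for z in zs) / len(zs)
```

Even at n = 50 the standardized sums are already close to N(0, 1).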

Conditional probability and conditional expectations

Let X and Y be two discrete random variables. Then the conditional probability mass function of X given that Y = y is defined as

p_{X|Y}(x | y) = P(X = x | Y = y) = p(x, y) / p_Y(y)

for all values of y for which P(Y = y) > 0. The conditional expectation of X given that Y = y is defined as

E[X | Y = y] = Σ_x x p_{X|Y}(x | y).
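The discrete definitions translate directly into code. A minimal sketch on a small joint pmf (the probabilities are hypothetical, chosen only for illustration):

```python
# A small joint pmf p(x, y), stored as {(x, y): probability}.
joint = {(0, 0): 0.10, (1, 0): 0.30,
         (0, 1): 0.20, (1, 1): 0.40}

y = 1
# Marginal P(Y = y) = sum_x p(x, y)
p_y = sum(p for (x, yy), p in joint.items() if yy == y)
# Conditional pmf p_{X|Y}(x | y) = p(x, y) / p_Y(y)
cond = {x: p / p_y for (x, yy), p in joint.items() if yy == y}
# Conditional expectation E[X | Y = y] = sum_x x * p_{X|Y}(x | y)
e_x_given_y = sum(x * p for x, p in cond.items())
```

Here P(Y = 1) = 0.6, the conditional pmf is {0: 1/3, 1: 2/3}, and E[X | Y = 1] = 2/3.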

Let X and Y be two continuous random variables. Then the conditional probability density function of X given that Y = y is defined as

f_{X|Y}(x | y) = f(x, y) / f_Y(y)

for all values of y for which f_Y(y) > 0. The conditional expectation of X given that Y = y is defined as

E[X | Y = y] = ∫ x f_{X|Y}(x | y) dx.
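The continuous case can be sketched numerically. As an example (not from the slides), take the standard textbook density f(x, y) = x + y on the unit square, for which f_Y(y) = y + 1/2 and E[X | Y = y] = (1/3 + y/2)/(y + 1/2):

```python
# Joint density f(x, y) = x + y on the unit square.
def f(x, y):
    return x + y

def integrate(g, a, b, n=10_000):
    # Simple midpoint rule, accurate enough for these smooth integrands.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

y = 0.5
f_y = integrate(lambda x: f(x, y), 0.0, 1.0)                    # marginal f_Y(y)
e_x_given_y = integrate(lambda x: x * f(x, y) / f_y, 0.0, 1.0)  # E[X | Y = y]
```

At y = 0.5 the marginal is f_Y(0.5) = 1 and E[X | Y = 0.5] = 7/12, matching the closed form.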

Proof: