Generating Functions

The Moments of Y
We have referred to E(Y) and E(Y²) as the first and second moments of Y, respectively. In general, E(Y^k) is the k-th moment of Y. Consider the polynomial

m(t) = 1 + E(Y)t + E(Y²)t²/2! + E(Y³)t³/3! + … = Σ_{k=0}^∞ E(Y^k) t^k / k!,

where the moments of Y are incorporated into the coefficients.

Moment Generating Function
If the sum converges for all t in some interval |t| < b, the polynomial is called the moment-generating function, m(t), for the random variable Y. And we may note that for each k, the coefficient of t^k in m(t) is E(Y^k)/k!.

Moment Generating Function
Hence, the moment-generating function is given by

m(t) = E(e^{tY}) = E( Σ_{k=0}^∞ (tY)^k / k! ) = Σ_{k=0}^∞ E(Y^k) t^k / k!.

We may rearrange the expectation and the sum, since the series converges for |t| < b.

Moment Generating Function
That is, m(t) = E(e^{tY}) is the polynomial whose coefficients involve the moments of Y.

The k-th moment
To retrieve the k-th moment from the MGF, evaluate the k-th derivative at t = 0:

d^k m / dt^k = Σ_{j ≥ k} E(Y^j) t^{j−k} / (j−k)!.

And so, letting t = 0, every term with j > k vanishes, leaving m^(k)(0) = E(Y^k).
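As a quick numerical sketch (not part of the original slides), the identity m^(k)(0) = E(Y^k) can be checked with finite-difference derivatives. Here the MGF of a fair six-sided die is used as an illustrative example:

```python
import math

def mgf(t):
    # MGF of a fair six-sided die: m(t) = (1/6) * sum of e^{ty} for y = 1..6
    return sum(math.exp(t * y) for y in range(1, 7)) / 6

h = 1e-5
# Central difference for m'(0), which should equal E(Y)
first_moment = (mgf(h) - mgf(-h)) / (2 * h)
# Central difference for m''(0), which should equal E(Y^2)
second_moment = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2

print(first_moment)   # ≈ 3.5
print(second_moment)  # ≈ 15.1667 (= 91/6)
```

The estimates land on E(Y) = 3.5 and E(Y²) = 91/6 to several decimal places, matching the direct pmf calculations.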

Geometric MGF
For the geometric distribution with success probability p (writing q = 1 − p),

m(t) = Σ_{y=1}^∞ e^{ty} q^{y−1} p = pe^t Σ_{y=1}^∞ (qe^t)^{y−1} = pe^t / (1 − qe^t),

provided qe^t < 1, i.e., t < −ln(q).
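The closed form can be sanity-checked numerically; the sketch below (with an arbitrary illustrative p = 0.25 and truncation length) compares pe^t / (1 − qe^t) against a directly summed, truncated version of E(e^{tY}):

```python
import math

p = 0.25  # illustrative success probability; any 0 < p < 1 works
q = 1 - p

def mgf_series(t, N=2000):
    # E(e^{tY}) summed directly over the geometric pmf P(Y = y) = q^(y-1) * p
    return sum(math.exp(t * y) * q**(y - 1) * p for y in range(1, N))

def mgf_closed(t):
    # p e^t / (1 - q e^t), valid where q e^t < 1
    return p * math.exp(t) / (1 - q * math.exp(t))

t = 0.2  # satisfies q * e^t < 1 for p = 0.25
print(mgf_series(t))
print(mgf_closed(t))
```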

Common MGFs
The MGFs for some of the discrete distributions we’ve seen include:

Binomial(n, p): m(t) = (q + pe^t)^n
Geometric(p): m(t) = pe^t / (1 − qe^t), for t < −ln(q)
Poisson(λ): m(t) = e^{λ(e^t − 1)}
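Each closed form can be verified against the definition m(t) = E(e^{tY}) by summing over the pmf. A small sketch for the Poisson and binomial cases (parameter values here are illustrative choices):

```python
import math

def poisson_mgf_series(t, lam, terms=100):
    # E(e^{tY}) summed directly over the Poisson pmf
    return sum(math.exp(t * y) * math.exp(-lam) * lam**y / math.factorial(y)
               for y in range(terms))

def binomial_mgf_series(t, n, p):
    # E(e^{tY}) summed directly over the binomial pmf
    q = 1 - p
    return sum(math.comb(n, y) * p**y * q**(n - y) * math.exp(t * y)
               for y in range(n + 1))

t = 0.3
print(poisson_mgf_series(t, lam=2.0))        # matches e^{2(e^0.3 - 1)}
print(math.exp(2.0 * (math.exp(t) - 1)))
print(binomial_mgf_series(t, n=10, p=0.4))   # matches (0.6 + 0.4 e^0.3)^10
print((0.6 + 0.4 * math.exp(t))**10)
```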

Recognize the distribution
Identify the distribution having a given moment-generating function, and give the mean and variance for this distribution. We could use the derivatives, but is that necessary? If the MGF matches a known form, the mean and variance follow immediately from that distribution’s parameters.

Geometric MGF
Consider the MGF

m(t) = (1/3)e^t / (1 − (2/3)e^t).

Use derivatives to determine the first and second moments:

m′(t) = (1/3)e^t / (1 − (2/3)e^t)²,  so  m′(0) = (1/3)/(1/3)² = 3 = E(Y).

And so, continuing,

m″(t) = (1/3)e^t (1 + (2/3)e^t) / (1 − (2/3)e^t)³,  so  m″(0) = (1/3)(5/3)/(1/3)³ = 15 = E(Y²).

Geometric MGF
Since V(Y) = E(Y²) − [E(Y)]², we have V(Y) = 15 − 3² = 6. And so, the standard deviation is σ = √6 ≈ 2.449.

Geometric MGF
Since m(t) = (1/3)e^t / (1 − (2/3)e^t) is the MGF for a geometric random variable with p = 1/3, our prior results tell us E(Y) = 1/p = 3 and V(Y) = (1 − p)/p² = 6, which do agree with our current results.
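As an additional sketch (illustrative, not from the slides), the values E(Y) = 3, E(Y²) = 15, and V(Y) = 6 can be confirmed by truncated sums over the geometric pmf:

```python
# Geometric with p = 1/3: Y counts trials until the first success
p = 1 / 3
q = 1 - p
N = 500  # truncation length; q^N is negligibly small

mean = sum(y * q**(y - 1) * p for y in range(1, N))
second = sum(y**2 * q**(y - 1) * p for y in range(1, N))
var = second - mean**2

print(mean)    # ≈ 3  (= 1/p)
print(second)  # ≈ 15
print(var)     # ≈ 6  (= (1 - p)/p^2)
```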

All the moments
Although the mean and variance help to describe a distribution, they alone do not uniquely determine it. All the moments are necessary to uniquely describe a probability distribution. That is, if two random variables have equal MGFs (i.e., m_Y(t) = m_Z(t) for |t| < b), then they have the same probability distribution.

m(aY + b)?
For the random variable Y with MGF m(t), consider W = aY + b. Then

m_W(t) = E(e^{tW}) = E(e^{t(aY + b)}) = e^{bt} E(e^{(at)Y}) = e^{bt} m(at).

Construct the MGF for the random variable W = 2Y + 3, where Y is a geometric random variable with p = 4/5:

m_W(t) = e^{3t} m_Y(2t) = e^{3t} (4/5)e^{2t} / (1 − (1/5)e^{2t}).
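A brief numerical check of the identity m_W(t) = e^{bt} m(at) for this example (the truncation length is an arbitrary choice):

```python
import math

p, a, b = 4 / 5, 2, 3  # Y geometric with p = 4/5; W = 2Y + 3
q = 1 - p

def mgf_W_direct(t, N=200):
    # E(e^{tW}) with W = aY + b, summed over the geometric pmf of Y
    return sum(math.exp(t * (a * y + b)) * q**(y - 1) * p for y in range(1, N))

def mgf_W_closed(t):
    # e^{bt} * m_Y(at), with m_Y(s) = p e^s / (1 - q e^s)
    return math.exp(b * t) * p * math.exp(a * t) / (1 - q * math.exp(a * t))

t = 0.1  # small enough that q * e^{at} < 1
print(mgf_W_direct(t))
print(mgf_W_closed(t))
```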

E(aY + b)
Now, based on the MGF, we could again consider E(W) = E(aY + b). Differentiating m_W(t) = e^{bt} m(at) gives

m_W′(t) = b e^{bt} m(at) + a e^{bt} m′(at).

And so, letting t = 0: E(W) = m_W′(0) = b·m(0) + a·m′(0) = aE(Y) + b, as expected (recall m(0) = 1).
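The same conclusion can be sketched numerically: a central-difference estimate of m_W′(0) for the W = 2Y + 3 example (Y geometric with p = 4/5, so E(Y) = 5/4) should land on aE(Y) + b = 2(5/4) + 3 = 5.5:

```python
import math

p, a, b = 4 / 5, 2, 3
q = 1 - p

def mgf_W(t):
    # m_W(t) = e^{bt} * m_Y(at), with m_Y(s) = p e^s / (1 - q e^s)
    return math.exp(b * t) * p * math.exp(a * t) / (1 - q * math.exp(a * t))

h = 1e-6
expected_W = (mgf_W(h) - mgf_W(-h)) / (2 * h)  # m_W'(0) = E(W)
print(expected_W)  # ≈ 5.5 = a*E(Y) + b
```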

Tchebysheff’s Theorem
For “bell-shaped” distributions, the empirical rule gave us the 68–95–99.7% rule for the probability a value falls within 1, 2, or 3 standard deviations of the mean, respectively. When the distribution is not so bell-shaped, Tchebysheff’s theorem tells us the probability of being within k standard deviations of the mean is at least 1 − 1/k², for k > 0:

P(|Y − μ| < kσ) ≥ 1 − 1/k².

Remember, it’s just a lower bound.

A Skewed Distribution
Consider a binomial experiment with n = 10 and p = 0.1. Here μ = np = 1 and σ² = npq = 0.9, so σ ≈ 0.949, and the distribution is skewed to the right.

A Skewed Distribution
Verify Tchebysheff’s lower bound for k = 2:

P(|Y − μ| < 2σ) = P(|Y − 1| < 1.897) = P(Y ∈ {0, 1, 2}) ≈ 0.3487 + 0.3874 + 0.1937 = 0.9298,

which does satisfy the bound, since 0.9298 ≥ 1 − 1/2² = 0.75.
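The verification can be reproduced with a short script using exact binomial pmf values:

```python
import math

n, p = 10, 0.1
q = 1 - p
mu = n * p                    # 1.0
sigma = math.sqrt(n * p * q)  # ≈ 0.9487

def binom_pmf(y):
    # P(Y = y) for the binomial(n, p) distribution
    return math.comb(n, y) * p**y * q**(n - y)

k = 2
# P(|Y - mu| < k*sigma): sum the pmf over all y within k standard deviations
prob = sum(binom_pmf(y) for y in range(n + 1) if abs(y - mu) < k * sigma)

print(prob)          # ≈ 0.9298
print(1 - 1 / k**2)  # Tchebysheff lower bound: 0.75
```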