Some standard univariate probability distributions

Some standard univariate probability distributions
Characteristic, moment generating and cumulant generating functions
Discrete distributions
Continuous distributions
Some distributions associated with the normal
References

Characteristic, moment generating and cumulant generating functions
The characteristic function of a random variable X is defined as the expectation of e^{itX}:
  φ_X(t) = E[e^{itX}].
The moment generating function is defined as the expectation of e^{tX}:
  M_X(t) = E[e^{tX}].
Moments can be calculated by taking derivatives of M(t) and evaluating them at t = 0:
  E[X^k] = d^k M_X(t) / dt^k evaluated at t = 0.
The cumulant generating function is defined as the natural logarithm of the characteristic function:
  K_X(t) = ln φ_X(t).
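
A minimal sketch of this recipe (Python with sympy; an illustrative example not from the slides): differentiate an assumed moment generating function M(t) = λ/(λ - t), the exponential case treated later, and evaluate the derivatives at t = 0.

```python
# Sketch: moments from derivatives of an MGF, here M(t) = lambda/(lambda - t)
# (the exponential distribution discussed later in these slides).
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M = lam / (lam - t)                                        # moment generating function

first_moment = sp.simplify(sp.diff(M, t, 1).subs(t, 0))    # E[X]   = 1/lambda
second_moment = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # E[X^2] = 2/lambda^2
print(first_moment, second_moment)
```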

Discrete distributions: Binomial
Suppose we carry out an experiment whose result is either "success" or "failure". The probability of success is p, so the probability of failure is q = 1 - p. The experiment is repeated independently n times. The probability of exactly k successes is
  P(X = k) = C(n, k) p^k q^{n-k},  k = 0, 1, ..., n,
where C(n, k) = n! / (k!(n-k)!).
Characteristic function: φ(t) = (q + p e^{it})^n.
Moment generating function: M(t) = (q + p e^t)^n.
Exercise: find the first and second moments.
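
A hedged numerical check (Python with scipy, assuming the formulas above): the pmf computed by hand agrees with scipy.stats.binom, and the first two moments asked for in the exercise come out as E[X] = np and Var(X) = npq.

```python
# Sketch: check the binomial pmf and moments numerically.
from math import comb
from scipy.stats import binom

n, p = 10, 0.3
q = 1 - p
for k in range(n + 1):
    by_hand = comb(n, k) * p**k * q**(n - k)      # C(n,k) p^k q^(n-k)
    assert abs(by_hand - binom.pmf(k, n, p)) < 1e-12

print("E[X]  =", binom.mean(n, p), "= n*p   =", n * p)
print("Var X =", binom.var(n, p),  "= n*p*q =", n * p * q)
```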

Discrete distributions: Poisson
When the number of trials n is large, the probability of success p is small, and np is finite and tends to λ, the binomial distribution converges to the Poisson distribution:
  P(X = k) = e^{-λ} λ^k / k!,  k = 0, 1, 2, ...
The Poisson distribution describes the number of occurrences of an event that happens rarely in a short period. It is used, for example, in counting statistics to describe the number of registered photons.
Its characteristic function is
  φ(t) = exp(λ(e^{it} - 1)).
Exercise: what is the first moment?
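
A small numerical illustration (Python with numpy/scipy, not part of the original slides): with np held fixed at λ, binomial probabilities approach Poisson probabilities as n grows, and the first moment of the Poisson distribution is λ.

```python
# Sketch: binomial -> Poisson convergence for fixed lam = n*p, and E[X] = lam.
import numpy as np
from scipy.stats import binom, poisson

lam = 2.0
k = np.arange(0, 15)
for n in (10, 100, 10_000):
    p = lam / n
    gap = np.max(np.abs(binom.pmf(k, n, p) - poisson.pmf(k, lam)))
    print(f"n = {n:6d}   max |binomial pmf - Poisson pmf| = {gap:.2e}")

print("first moment of Poisson(lam):", poisson.mean(lam))   # equals lam
```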

Discrete distributions: Negative Binomial
Consider the same experiment: the probability of success is p and the probability of failure is q = 1 - p. We carry out trials until the k-th success and ask for the probability of observing j failures. (This is called sequential sampling: sampling is carried out until some stopping rule is satisfied.) If there are j failures, the number of trials is k + j and the last trial is a success. The probability of exactly j failures is therefore
  P(J = j) = C(k + j - 1, j) p^k q^j,  j = 0, 1, 2, ...
It is called negative binomial because the coefficients come from the negative binomial series p^{-k} = (1 - q)^{-k}.
Its characteristic function is
  φ(t) = (p / (1 - q e^{it}))^k.
Exercises: what is the moment generating function? What is the first moment?
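
A hedged check (Python with scipy; scipy.stats.nbinom also counts failures before the k-th success): the pmf above matches the library, and the first moment asked for in the exercise comes out as kq/p.

```python
# Sketch: negative binomial pmf for j failures before the k-th success.
from math import comb
from scipy.stats import nbinom

k, p = 3, 0.4
q = 1 - p
for j in range(10):
    by_hand = comb(k + j - 1, j) * p**k * q**j     # C(k+j-1, j) p^k q^j
    assert abs(by_hand - nbinom.pmf(j, k, p)) < 1e-12

print("first moment (mean number of failures):", nbinom.mean(k, p), "= k*q/p =", k * q / p)
```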

Continuous distributions: uniform
The simplest continuous distribution is the uniform distribution on (a, b), with density
  f(x) = 1 / (b - a),  a ≤ x ≤ b  (and 0 otherwise).
Its distribution function is
  F(x) = (x - a) / (b - a) for a ≤ x ≤ b  (0 below a, 1 above b).
Moments and other properties are calculated easily, as sketched below.
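
For completeness, a short LaTeX sketch of the first two moments and the variance (standard results, not shown on the original slide):

```latex
% First two moments of the uniform density f(x) = 1/(b-a) on (a,b):
\[
  \mathrm{E}[X] = \int_a^b \frac{x}{b-a}\,dx = \frac{a+b}{2}, \qquad
  \mathrm{E}[X^2] = \int_a^b \frac{x^2}{b-a}\,dx = \frac{a^2+ab+b^2}{3},
\]
\[
  \mathrm{Var}(X) = \mathrm{E}[X^2] - \mathrm{E}[X]^2 = \frac{(b-a)^2}{12}.
\]
```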

Continuous distributions: exponential
The density of the exponential distribution has the form
  f(x) = λ e^{-λx},  x ≥ 0,  λ > 0.
This distribution has two origins.
Maximum entropy: if we know only that a random variable is non-negative and that its first moment is 1/λ, then the maximum entropy distribution is the exponential.
Poisson-type random processes: if the number of events occurring in the interval (0, x] has a Poisson distribution with mean λx, then the time elapsed until the first event has an exponential distribution. Let T_r denote the time elapsed until the r-th event; then
  P(T_r > x) = P(fewer than r events in (0, x]) = Σ_{j=0}^{r-1} e^{-λx} (λx)^j / j!.
Putting r = 1 gives P(T_1 > x) = e^{-λx}. Taking into account that P(T_1 > x) = 1 - F_1(x) and differentiating with respect to x, we arrive at the exponential density.
This distribution, together with the Poisson, is widely used in reliability studies, life testing, etc.
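
A brief numerical cross-check (Python with numpy/scipy; the rate λ = 2 and the point x = 0.7 are arbitrary): the probability of zero Poisson events in (0, x] equals the exponential survival probability e^{-λx}.

```python
# Sketch: P(no event in (0,x]) under Poisson(lam*x) equals P(T1 > x) under exponential(lam).
import numpy as np
from scipy.stats import poisson, expon

lam, x = 2.0, 0.7
print("Poisson P(zero events in (0,x]):", poisson.pmf(0, lam * x))
print("Exponential P(T1 > x):          ", expon.sf(x, scale=1.0 / lam))
print("exp(-lam*x):                    ", np.exp(-lam * x))
```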

Continuous distributions: Gamma
The gamma distribution can be considered a generalisation of the exponential distribution. Its density has the form
  f(x) = λ^r x^{r-1} e^{-λx} / (r - 1)!,  x ≥ 0.
It is the distribution of the time x elapsing before r events happen in a Poisson-type process with rate λ.
The characteristic function of this distribution is
  φ(t) = (1 - it/λ)^{-r}.
This distribution is widely used in many applications. One of them is the construction of prior probabilities for the sample variance; for this the inverse gamma distribution is used (changing variable to y = 1/x gives the inverse gamma).
The gamma distribution can be generalised to non-integer values of r by replacing (r - 1)! with Γ(r).
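
A simulation sketch (Python with numpy/scipy, illustrative only): the waiting time to the r-th event, a sum of r independent exponentials, is gamma-distributed with shape r and scale 1/λ.

```python
# Sketch: sum of r exponential(lam) waiting times ~ gamma(shape=r, scale=1/lam).
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(1)
r, lam, n_sim = 4, 2.0, 50_000

t_r = rng.exponential(scale=1.0 / lam, size=(n_sim, r)).sum(axis=1)
print(kstest(t_r, gamma(a=r, scale=1.0 / lam).cdf))   # expect a large p-value
```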

Continuous distributions: Normal
Perhaps the most popular and widely used continuous distribution is the normal distribution. The main reason is that a random variable of interest is often the sum of many random variables. According to the central limit theorem, under some conditions (for example: the random variables are independent and their first and second moments exist and are finite), the distribution of the sum converges to the normal distribution.
The density of the normal distribution has the form
  f(x) = 1 / (σ √(2π)) exp(-(x - μ)^2 / (2σ^2)).
Another remarkable fact is that if we know only the mean and the variance, then the maximum entropy distribution is the normal.
Its characteristic function is
  φ(t) = exp(itμ - σ^2 t^2 / 2).
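
A central limit illustration (Python with numpy/scipy; the choice of uniform summands and of 50 terms is arbitrary): standardised sums of independent uniforms already track the standard normal cdf closely.

```python
# Sketch: standardised sums of uniforms are approximately standard normal (CLT).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n_terms, n_sim = 50, 100_000

sums = rng.uniform(0.0, 1.0, size=(n_sim, n_terms)).sum(axis=1)
z = (sums - n_terms * 0.5) / np.sqrt(n_terms / 12.0)   # standardise: mean n/2, variance n/12

for c in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(f"P(Z <= {c:+.0f}): simulated {np.mean(z <= c):.4f}   normal cdf {norm.cdf(c):.4f}")
```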

Exponential family
The exponential family of distributions has the form
  f(x; θ) = exp( A(θ) B(x) + C(x) + D(θ) ).
Many distributions are special cases of this family. The natural exponential family is the subclass with B(x) = x:
  f(x; θ) = exp( A(θ) x + C(x) + D(θ) ),
where A(θ) is the natural parameter. Using the fact that the distribution must be normalised to 1, the characteristic function of the natural exponential family with natural parameter A(θ) = θ can be derived as
  φ(t) = exp( D(θ) - D(θ + it) ).
Try to derive it. Hint: use the normalisation condition to find D(θ), then use it in the expression for the characteristic function.
This family is used for fitting generalised linear models.
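
One possible route to the derivation the slide asks for (a sketch, assuming the natural parameterisation written above; the notation of the original slide images is not preserved):

```latex
% Normalisation of f(x;\theta) = \exp(\theta x + C(x) + D(\theta)) gives
%   e^{-D(\theta)} = \int e^{\theta x + C(x)}\,dx,
% so the characteristic function is
\[
  \varphi_X(t) = \mathrm{E}\!\left[e^{itX}\right]
               = e^{D(\theta)} \int e^{(\theta+it)x + C(x)}\,dx
               = e^{\,D(\theta) - D(\theta+it)} .
\]
```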

Continuous distributions: 2 Normal variables are called standardized if their mean is 0 and variance is 1. Distribution of the sum of the squares of n standardized normal random variables is 2 with n degrees of freedom. Density function is: If there are p linear restraints on the random variables then degree of freedom becomes n-p. Characteristic function for this distribution is: 2 is used widely in statistics for such tests as goodness of fit of a model to the experiment.

Continuous distributions: t and F
Two more distributions are closely related to the normal distribution. One of them is Student's t-distribution. It is used to test whether the mean of a sample is significantly different from 0; a similar application is testing the difference between the means of two samples. If Z has a standardised normal distribution and V has a χ² distribution with n degrees of freedom, independent of Z, then Z / √(V/n) has the t-distribution with n degrees of freedom.
Fisher's F-distribution is the distribution of the ratio of the variances of two samples, i.e. the ratio of two independent χ² variables each divided by its degrees of freedom. It is used to test whether the variances of two samples are different. One important application is in ANOVA.
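
A simulation sketch (Python with numpy/scipy, illustrative only): constructing t and F variables from normal and χ² building blocks and comparing them with the corresponding scipy distributions.

```python
# Sketch: t = Z / sqrt(V/n) and F = (V/n) / (W/m) built from normal and chi-squared parts.
import numpy as np
from scipy.stats import t as t_dist, f as f_dist, kstest

rng = np.random.default_rng(4)
n, m, n_sim = 6, 9, 50_000

z = rng.standard_normal(n_sim)          # standardised normal
v = rng.chisquare(n, n_sim)             # chi-squared, n degrees of freedom
w = rng.chisquare(m, n_sim)             # chi-squared, m degrees of freedom

t_sample = z / np.sqrt(v / n)
f_sample = (v / n) / (w / m)

print(kstest(t_sample, t_dist(df=n).cdf))           # expect large p-values
print(kstest(f_sample, f_dist(dfn=n, dfd=m).cdf))
```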

References
Johnson, N.L. & Kotz, S. (1969, 1970, 1972). Distributions in Statistics. I: Discrete Distributions; II, III: Continuous Univariate Distributions; IV: Continuous Multivariate Distributions. Houghton Mifflin, New York.
Mardia, K.V. & Jupp, P.E. (2000). Directional Statistics. John Wiley & Sons.