Convergence in Distribution


Convergence in Distribution

Recall: Xn → X in probability if P(|Xn − X| ≥ ε) → 0 as n → ∞ for every ε > 0.

Definition: Let X1, X2, … be a sequence of random variables with cumulative distribution functions F1, F2, … and let X be a random variable with cdf FX(x). We say that the sequence {Xn} converges in distribution to X if Fn(x) → FX(x) at every point x at which FX is continuous. This can also be stated as: {Xn} converges in distribution to X if P(Xn ≤ x) → P(X ≤ x) for all x such that P(X = x) = 0.

Convergence in distribution is also called "weak convergence". It is weaker than convergence in probability: we can show that convergence in probability implies convergence in distribution, but the converse does not hold in general.

week 11

Simple Example

Assume n is a positive integer, and suppose that the probability mass function of Xn is P(Xn = 0) = 1/2 + 1/n, P(Xn = 1) = 1/2 − 1/n. Note that this is a valid p.m.f. for n ≥ 2. For n ≥ 2, {Xn} converges in distribution to X, which has p.m.f. P(X = 0) = P(X = 1) = 1/2, i.e. X ~ Bernoulli(1/2).
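A quick numerical sketch of the definition, assuming the p.m.f. P(Xn = 0) = 1/2 + 1/n, P(Xn = 1) = 1/2 − 1/n: at every continuity point x of the Bernoulli(1/2) cdf F, Fn(x) → F(x).

```python
# Convergence in distribution, checked pointwise at the cdf level.
# Assumed p.m.f.: P(Xn = 0) = 1/2 + 1/n, P(Xn = 1) = 1/2 - 1/n  (n >= 2).

def F_n(x, n):
    """cdf of Xn under the assumed p.m.f."""
    if x < 0:
        return 0.0
    if x < 1:
        return 0.5 + 1.0 / n
    return 1.0

def F(x):
    """cdf of X ~ Bernoulli(1/2)."""
    if x < 0:
        return 0.0
    if x < 1:
        return 0.5
    return 1.0

# x = 0.5 is a continuity point of F; the gap |F_n(x) - F(x)| equals 1/n,
# which vanishes as n grows.
for n in (2, 10, 10**6):
    print(n, abs(F_n(0.5, n) - F(0.5)))
```

Note that Fn and F also agree in the limit at the jump points 0 and 1 here, but the definition only requires convergence at continuity points of F.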

Example

Let X1, X2, … be a sequence of i.i.d. random variables with E(Xi) = μ < ∞, and let X̄n = (X1 + … + Xn)/n. Then, by the WLLN, for any a > 0, P(|X̄n − μ| ≥ a) → 0 as n → ∞. So X̄n converges in probability to μ, and hence X̄n also converges in distribution to the constant random variable X = μ.
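A small simulation sketch of the WLLN statement above, with Uniform(0, 1) draws as an arbitrary illustrative choice (so μ = 1/2): the probability that the sample mean misses μ by at least a shrinks as n grows.

```python
import random

random.seed(0)  # fixed seed for reproducibility

def sample_mean(n):
    # mean of n i.i.d. Uniform(0, 1) draws; here mu = 1/2
    return sum(random.random() for _ in range(n)) / n

a, mu, reps = 0.05, 0.5, 2000
results = {}
for n in (10, 100, 1000):
    # Monte Carlo estimate of P(|Xbar_n - mu| >= a)
    results[n] = sum(abs(sample_mean(n) - mu) >= a for _ in range(reps)) / reps
    print(n, results[n])
```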

Continuity Theorem for MGFs

Let X be a random variable such that for some t0 > 0 we have mX(t) < ∞ for |t| ≤ t0. Further, if X1, X2, … is a sequence of random variables with mXn(t) < ∞ for |t| ≤ t0 and mXn(t) → mX(t) for all |t| ≤ t0, then {Xn} converges in distribution to X.

This theorem can also be stated as follows: let Fn be a sequence of cdfs with corresponding mgfs mn, and let F be a cdf with mgf m. If mn(t) → m(t) for all t in an open interval containing zero, then Fn(x) → F(x) at all continuity points of F.

Example: the Poisson distribution can be approximated by a Normal distribution for large λ.

Example to illustrate the Continuity Theorem

Let λ1, λ2, … be an increasing sequence with λn → ∞ as n → ∞, and let {Xn} be a sequence of Poisson random variables with the corresponding parameters. We know that E(Xn) = λn = V(Xn). Let Zn = (Xn − λn)/√λn; then E(Zn) = 0 and V(Zn) = 1. We can show that the mgf of Zn converges to the mgf of a standard Normal random variable, so Zn converges in distribution to Z ~ N(0,1).
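The mgf claim can be checked numerically. For X ~ Poisson(λ) the mgf is exp(λ(e^t − 1)), so the standardized variable Z = (X − λ)/√λ has mgf exp(−t√λ + λ(e^(t/√λ) − 1)); as λ → ∞ this approaches exp(t²/2), the N(0,1) mgf. A sketch:

```python
import math

def mgf_Z(t, lam):
    # mgf of Z = (X - lam)/sqrt(lam) for X ~ Poisson(lam):
    # E[exp(tZ)] = exp(-t*sqrt(lam)) * exp(lam*(exp(t/sqrt(lam)) - 1))
    s = math.sqrt(lam)
    return math.exp(-t * s + lam * (math.exp(t / s) - 1.0))

t = 0.7
target = math.exp(t * t / 2.0)  # mgf of N(0, 1) at t
for lam in (10, 1000, 10**6):
    print(lam, mgf_Z(t, lam), "target:", target)
```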

Example

Suppose X is a Poisson(900) random variable. Find P(X > 950).
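One way to work this, using the Normal approximation from the previous slide (X is approximately N(900, 900)) with a continuity correction, and comparing against the exact Poisson tail:

```python
import math

def phi_cdf(x):
    # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

lam = 900.0

# P(X > 950) = P(X >= 951) ~ P(Z > (950.5 - 900)/30) with continuity correction
approx = 1.0 - phi_cdf((950.5 - lam) / math.sqrt(lam))

# Exact Poisson tail; pmf evaluated on the log scale for numerical stability
def poisson_pmf(k, lam):
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

exact = 1.0 - sum(poisson_pmf(k, lam) for k in range(951))
print("approx:", round(approx, 4), "exact:", round(exact, 4))
```

Both values come out near 0.05, and the approximation error is small because λ = 900 is large.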

Central Limit Theorem

The central limit theorem is concerned with the limiting behaviour of sums of random variables. If X1, X2, … is a sequence of i.i.d. random variables with mean μ and variance σ², and Sn = X1 + … + Xn, then by the WLLN we have that Sn/n → μ in probability. The CLT is concerned not just with the fact of convergence but with how Sn/n fluctuates around μ. Note that E(Sn) = nμ and V(Sn) = nσ². The standardized version of Sn is Zn = (Sn − nμ)/(σ√n), and we have E(Zn) = 0, V(Zn) = 1.

The Central Limit Theorem

Let X1, X2, … be a sequence of i.i.d. random variables with E(Xi) = μ < ∞ and Var(Xi) = σ² < ∞. Suppose the common distribution function FX(x) and the common moment generating function mX(t) are defined in a neighborhood of 0. Let Zn = (Sn − nμ)/(σ√n), where Sn = X1 + … + Xn. Then, for −∞ < x < ∞, P(Zn ≤ x) → Ф(x) as n → ∞, where Ф(x) is the cdf of the standard normal distribution. This is equivalent to saying that Zn converges in distribution to Z ~ N(0,1). Also, writing X̄n = Sn/n, we have Zn = (X̄n − μ)/(σ/√n), i.e. (X̄n − μ)/(σ/√n) converges in distribution to Z ~ N(0,1).
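A simulation sketch of the theorem, with Uniform(0, 1) summands chosen arbitrarily (so μ = 1/2 and σ² = 1/12): the empirical distribution of Zn should be close to Ф at every x.

```python
import math
import random

random.seed(1)  # fixed seed for reproducibility

def phi_cdf(x):
    # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

n, reps = 50, 20000
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)  # mean and sd of Uniform(0, 1)

def z_n():
    # standardized sum Z_n = (S_n - n*mu) / (sigma * sqrt(n))
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / (sigma * math.sqrt(n))

draws = [z_n() for _ in range(reps)]
for x in (-1.0, 0.0, 1.0):
    frac = sum(z <= x for z in draws) / reps  # empirical P(Z_n <= x)
    print(x, round(frac, 3), "vs", round(phi_cdf(x), 3))
```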

Example

Suppose X1, X2, … are i.i.d. random variables and each has the Poisson(3) distribution, so E(Xi) = V(Xi) = 3. The CLT says that (Sn − 3n)/√(3n) converges in distribution to Z ~ N(0,1) as n → ∞, where Sn = X1 + … + Xn.
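A sketch checking this by simulation, using Knuth's method (a standard way to draw Poisson variates) to generate Poisson(3) samples: the standardized sums should have mean near 0 and variance near 1.

```python
import math
import random

random.seed(2)  # fixed seed for reproducibility

def poisson3():
    # Knuth's algorithm for one Poisson(3) draw
    L, k, p = math.exp(-3.0), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

n, reps = 100, 4000
zs = []
for _ in range(reps):
    s = sum(poisson3() for _ in range(n))          # S_n, sum of n Poisson(3) draws
    zs.append((s - 3 * n) / math.sqrt(3 * n))      # standardized as in the CLT

m = sum(zs) / reps
v = sum(z * z for z in zs) / reps - m * m
print("mean:", round(m, 3), "variance:", round(v, 3))
```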

Examples

A very common application of the CLT is the Normal approximation to the Binomial distribution. Suppose X1, X2, … are i.i.d. random variables and each has the Bernoulli(p) distribution, so E(Xi) = p and V(Xi) = p(1 − p). The CLT says that (Sn − np)/√(np(1 − p)) converges in distribution to Z ~ N(0,1) as n → ∞. Let Yn = X1 + … + Xn; then Yn has a Binomial(n, p) distribution, so for large n, Yn is approximately N(np, np(1 − p)).

Suppose we flip a biased coin 1000 times and the probability of heads on any one toss is 0.6. Find the probability of getting at least 550 heads.

Suppose we toss a coin 100 times and observe 60 heads. Is the coin fair?
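The two coin questions can be worked with this Normal approximation (continuity correction included); the printed values are approximations, not exact Binomial probabilities:

```python
import math

def phi_cdf(x):
    # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# 1000 flips, P(heads) = 0.6: probability of at least 550 heads.
n, p = 1000, 0.6
mu, sd = n * p, math.sqrt(n * p * (1 - p))
p_550 = 1.0 - phi_cdf((549.5 - mu) / sd)  # ~ 0.9994: at least 550 heads is very likely
print(round(p_550, 4))

# 100 flips of a coin assumed fair: probability of 60 or more heads.
n, p = 100, 0.5
mu, sd = n * p, math.sqrt(n * p * (1 - p))
p_60 = 1.0 - phi_cdf((59.5 - mu) / sd)  # ~ 0.029 one-sided
print(round(p_60, 4))
```

Under the fair-coin assumption, 60 or more heads has probability about 0.029 (roughly 0.057 if we also count equally extreme outcomes on the low side), so 60 heads in 100 tosses is mildly unusual but not overwhelming evidence that the coin is unfair.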