§2. The central limit theorem

1. Convergence in distribution

Suppose that {X_n} is a sequence of r.v.s with d.f.s F_n(x), and that X is a r.v. with d.f. F(x). If at every continuity point x of F(x) we have

lim_{n→∞} F_n(x) = F(x),

then {X_n} is said to converge to X in distribution, denoted by X_n →d X (n → ∞).
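A quick numerical illustration of this definition (our own sketch, not from the slides): let X_n be uniform on {1/n, 2/n, …, n/n}, so F_n(x) = ⌊nx⌋/n, which converges to the d.f. of Uniform(0, 1) everywhere.

```python
import math

def F_n(x, n):
    # d.f. of X_n, uniform on {1/n, 2/n, ..., n/n}
    return min(max(math.floor(n * x), 0), n) / n

def F(x):
    # d.f. of the limit X ~ Uniform(0, 1)
    return min(max(x, 0.0), 1.0)

# sup |F_n - F| over a grid of continuity points shrinks like 1/n
for n in (10, 100, 1000):
    gap = max(abs(F_n(k / 997, n) - F(k / 997)) for k in range(998))
    print(f"n = {n}: sup|F_n - F| ~= {gap:.4f}")
```

Here the convergence is even uniform, but the definition only asks for pointwise convergence at continuity points of F.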

2. Central Limit Theorems (CLT)

Lévy-Lindeberg's CLT. Suppose that {X_n} are i.i.d. r.v.s with mean E(X_k) = μ < ∞ and variance D(X_k) = σ² < ∞, k = 1, 2, …. Then {X_n} obeys the CLT; that is, as n → ∞,

( Σ_{k=1}^{n} X_k − nμ ) / (σ√n) →d N(0, 1),

which also means that, for every x,

lim_{n→∞} P{ ( Σ_{k=1}^{n} X_k − nμ ) / (σ√n) ≤ x } = Φ(x),

where Φ is the standard normal d.f.
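A small Monte Carlo check of the theorem (an illustrative sketch; the Exponential(1) choice of X_k and the sample sizes are ours, not from the slides): the standardized sum of i.i.d. exponentials should be approximately standard normal.

```python
import math
import random

random.seed(1)

def phi(x):
    # standard normal d.f. via the error function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# X_k ~ Exponential(1), so mu = sigma = 1 (any non-normal i.i.d. choice works)
n, trials, mu, sigma = 200, 10000, 1.0, 1.0

hits = 0
for _ in range(trials):
    s = sum(random.expovariate(1.0) for _ in range(n))
    if (s - n * mu) / (sigma * math.sqrt(n)) <= 1.0:
        hits += 1

estimate = hits / trials
print(f"P(Z_n <= 1) ~= {estimate:.3f}  vs  Phi(1) = {phi(1.0):.3f}")
```

The empirical frequency lands close to Φ(1) ≈ 0.841 even though each X_k is far from normal.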

De Moivre-Laplace's CLT. Suppose that Z_n (n = 1, 2, …) follows the binomial distribution with parameters n and p (0 < p < 1). Then

( Z_n − np ) / √(np(1−p)) →d N(0, 1) as n → ∞.

Proof. Write Z_n = X_1 + X_2 + … + X_n, where X_1, …, X_n are i.i.d. Bernoulli(p) r.v.s with E(X_k) = p and D(X_k) = p(1−p). The conclusion then follows directly from Lévy-Lindeberg's CLT.
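To see the approximation at work (our own numerical check; the values n = 200, p = 0.3, k = 68 are chosen for illustration), compare the exact binomial d.f. with the normal approximation it justifies:

```python
import math

def phi(x):
    # standard normal d.f.
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def binom_cdf(k, n, p):
    # exact P(Z_n <= k) for Z_n ~ B(n, p)
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

n, p, k = 200, 0.3, 68
exact = binom_cdf(k, n, p)
approx = phi((k - n * p) / math.sqrt(n * p * (1 - p)))  # De Moivre-Laplace
print(f"exact = {exact:.4f}, normal approximation = {approx:.4f}")
```

The two values agree to within a couple of percentage points; a continuity correction (using k + 1/2 in place of k) would tighten the match further.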

Example 2. A life insurance company has sold 10000 policies. Each policy carries a premium of 12 dollars, the mortality rate is 0.6%, and the company pays 1000 dollars for each claim. Determine:
(1) the probability that the company runs a deficit;
(2) treating the payment per claim as adjustable, the largest payment per claim for which the company's profit is at least 60000 dollars with probability 0.9.

Solution. Let X denote the number of deaths within one year. Then X ~ B(n, p) with n = 10000 and p = 0.6%, so E(X) = np = 60 and √(np(1−p)) = √59.64 ≈ 7.72. Let Y denote the profit of the company.

(1) The total premium income is 10000 × 12 = 120000 dollars, so Y = 120000 − 1000X and, by the CLT,

P{Y < 0} = P{X > 120} = 1 − P{X ≤ 120} ≈ 1 − Φ((120 − 60)/7.72) = 1 − Φ(7.77) ≈ 0.

(2) Suppose the payment per claim is a dollars. Then Y = 120000 − aX, and we need

P{Y ≥ 60000} = P{X ≤ 60000/a} ≥ 0.9.

By the CLT, this is equivalent to

Φ((60000/a − 60)/7.72) ≥ 0.9, i.e. (60000/a − 60)/7.72 ≥ 1.28,

since Φ(1.28) ≈ 0.9. Solving gives 60000/a ≥ 69.88, i.e. a ≤ 858.6. Hence the payment per claim should be at most 858 dollars.
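The arithmetic of Example 2 can be checked directly (a sketch using the numbers above, with z_0.9 ≈ 1.28 as in the solution):

```python
import math

def phi(x):
    # standard normal d.f.
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

n, p = 10000, 0.006                  # number of policies, mortality rate
premium, payment = 12, 1000          # dollars per policy / per claim
mu = n * p                           # E(X) = 60
sigma = math.sqrt(n * p * (1 - p))   # ~7.72

# (1) income is n*premium = 120000, so a deficit means X > 120
z = (n * premium / payment - mu) / sigma
print(f"P(deficit) ~= {1 - phi(z):.2e}")          # essentially 0

# (2) largest payment a with P(120000 - a*X >= 60000) >= 0.9,
#     i.e. P(X <= 60000/a) >= 0.9; using z_0.9 ~= 1.28:
a_max = 60000 / (mu + 1.28 * sigma)
print(f"largest payment per claim ~= {math.floor(a_max)} dollars")
```

The deficit probability is astronomically small because 120 claims would be nearly 8 standard deviations above the mean of 60.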

Abraham de Moivre. Born: 26 May 1667 in Vitry (near Paris), France. Died: 27 November 1754 in London, England.

Pierre-Simon Laplace. Born: 23 March 1749 in Beaumont-en-Auge, Normandy, France. Died: 5 March 1827 in Paris, France.