Large Sample Distribution Theory


Large Sample Distribution Theory Burak Saltoğlu

Outline Convergence in Probability Laws of Large Numbers Convergence of Functions Convergence to a Random Variable Convergence in Distribution: Limiting Distributions Central Limit Theorems Asymptotic Distributions

Convergence in Probability Definition 1: Let x_n be a sequence of random variables, where n is the sample size. The random variable x_n converges in probability to a constant c if the values of x_n that are not close to c become increasingly unlikely as n increases: for every ε > 0, lim n→∞ Prob(|x_n − c| > ε) = 0. If x_n converges in probability to c, we write plim x_n = c. All the mass of the probability distribution concentrates around c.

Mean Square Convergence Definition 2 (mean square convergence): x_n converges in mean square to c if lim n→∞ E[(x_n − c)²] = 0, which holds when lim E[x_n] = c and lim Var[x_n] = 0. Mean square convergence implies convergence in probability.

Convergence in Probability Definition 3: An estimator θ̂_n of a parameter θ is a consistent estimator iff plim θ̂_n = θ. Theorem 1: The mean of a random sample from any distribution with finite mean μ and finite variance σ² is a consistent estimator of μ. Proof: E[x̄_n] = μ and Var[x̄_n] = σ²/n → 0, so x̄_n converges to μ in mean square and therefore in probability.
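Theorem 1 is easy to check by simulation. A minimal NumPy sketch (the exponential population, ε, and the sample sizes are arbitrary illustrative choices) estimates Prob(|x̄_n − μ| > ε) for growing n and shows it shrinking toward 0:

```python
import numpy as np

rng = np.random.default_rng(42)
mu, eps, reps = 2.0, 0.25, 2000

# Estimate P(|x̄_n - μ| > ε) by simulation for growing n.
# Consistency of the sample mean says this probability → 0.
probs = {}
for n in (5, 50, 500):
    xbars = rng.exponential(scale=mu, size=(reps, n)).mean(axis=1)
    probs[n] = np.mean(np.abs(xbars - mu) > eps)

print(probs)
```

The estimated probability falls sharply between n = 5 and n = 500, exactly the concentration of mass around μ that Definition 1 describes.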

Convergence in Probability Corollary to Theorem 1: In random sampling, for any function g(x), if E[g(x)] and Var[g(x)] are finite constants, then plim (1/n) Σ g(x_i) = E[g(x)]. Example: In sampling from a normal distribution, the sample mean is consistent for μ and the sample variance is consistent for σ².

Laws of Large Numbers The weak law of large numbers is based on convergence in probability; the strong law of large numbers is based on almost sure convergence.

Laws of Large Numbers 1. Khinchine's Weak Law of Large Numbers: If x_1, …, x_n are i.i.d. with E[x_i] = μ < ∞, then plim x̄_n = μ. Remarks: 1) No finite-variance assumption is needed (unlike Theorem 1). 2) It requires i.i.d. sampling.

2. Chebyshev's Weak Law of Large Numbers: If x_1, …, x_n are independent with means μ_i, variances σ_i², and (1/n²) Σ σ_i² → 0, then plim (x̄_n − μ̄_n) = 0. There are two differences between Khinchine's and Chebyshev's LLNs: Chebyshev's does not require convergence to a constant (the average mean μ̄_n may change with n); more importantly, Chebyshev's allows heterogeneity of the distributions, i.e., it does not require identical distributions.
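A sketch of the heterogeneity point, with a made-up cycle of standard deviations for independent normal draws: even though the X_i are not identically distributed, the variance condition holds and the sample mean still collapses onto the common mean 0.

```python
import numpy as np

rng = np.random.default_rng(1)
reps = 2000

# Independent but NOT identically distributed draws:
# X_i ~ N(0, σ_i²) with σ_i cycling through {0.5, 1, 3}.
# Chebyshev's LLN still applies since (1/n²) Σ σ_i² → 0.
def abs_mean_error(n):
    sigmas = np.resize([0.5, 1.0, 3.0], n)
    draws = rng.normal(0.0, sigmas, size=(reps, n))
    return np.abs(draws.mean(axis=1)).mean()

err_small, err_large = abs_mean_error(30), abs_mean_error(3000)
print(err_small, err_large)
```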

Convergence of Functions Theorem 2 (Slutsky): For a continuous function g(x_n) that is not a function of n, plim g(x_n) = g(plim x_n). Using Slutsky's theorem, we can write some rules for probability limits.

Convergence of Functions Rules for Probability Limits 1) For plim x = c and plim y = d: i) plim (x + y) = c + d; ii) plim xy = cd; iii) plim (x/y) = c/d, provided d ≠ 0. 2) For random matrices X and Y with plim X = A and plim Y = B: i) plim X⁻¹ = A⁻¹, provided A is nonsingular; ii) plim XY = AB.
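The scalar rules can be sanity-checked numerically. A sketch with arbitrary constants c = 2 and d = 5, using sample means of large samples as the random quantities:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# x̄ →p c = 2 and ȳ →p d = 5, so by the plim rules the sum,
# product, and ratio converge to c+d = 7, cd = 10, c/d = 0.4.
x = rng.normal(2.0, 1.0, n).mean()
y = rng.normal(5.0, 1.0, n).mean()
print(x + y, x * y, x / y)
```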

Convergence in Distribution: Limiting Distributions Definition 6: x_n with cdf F_n(x) converges in distribution to a random variable x with cdf F(x) if lim n→∞ |F_n(x) − F(x)| = 0 at every continuity point of F(x). Then F(x) is the limiting distribution of x_n, written x_n →d x.

Convergence in Distribution: Limiting Distributions Definition 7: The limiting mean and variance of a random variable are those of its limiting distribution. Example: Consider the Student t distribution with n − 1 degrees of freedom. It has mean 0 and variance (n − 1)/(n − 3); this is its exact distribution. However, as n grows it converges to the standard normal distribution, that is, t_{n−1} →d N(0, 1). So it has limiting mean 0 and limiting variance 1.
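The variance claim can be checked by simulating t variates from their definition, t = Z / sqrt(χ²_df / df); the degrees-of-freedom values below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
reps = 400_000

def t_draws(df):
    # Build t variates from primitives: t = Z / sqrt(χ²_df / df).
    z = rng.standard_normal(reps)
    return z / np.sqrt(rng.chisquare(df, reps) / df)

exact_vs_sample = {}
for df in (5, 30, 300):
    # (exact variance df/(df-2), simulated sample variance)
    exact_vs_sample[df] = (df / (df - 2), t_draws(df).var())
print(exact_vs_sample)
```

Both the exact variance df/(df − 2) and its simulated counterpart fall toward the limiting value 1 as df grows.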

Convergence in Distribution: Limiting Distributions Rules for Limiting Distributions: 1) If x_n →d x and plim y_n = c, then x_n y_n →d cx. 2) As a corollary to Slutsky's theorem, if x_n →d x and g(x) is a continuous function, then g(x_n) →d g(x). For example, the exact distribution of t²_{n−1} is F(1, n − 1), but its limiting distribution is (N(0, 1))² = chi-square(1), that is, t²_{n−1} →d χ²(1).
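A sketch of the t² example, comparing tail quantiles of a squared t variate with a χ²(1) variate (df = 200 is an arbitrary stand-in for "large n"):

```python
import numpy as np

rng = np.random.default_rng(11)
reps, df = 200_000, 200

# Square of a t(df) variate vs. a χ²(1) variate: for large df
# their upper-tail quantiles nearly coincide.
z = rng.standard_normal(reps)
t_sq = (z / np.sqrt(rng.chisquare(df, reps) / df)) ** 2
chisq1 = rng.chisquare(1, reps)

q95_tsq = np.quantile(t_sq, 0.95)
q95_chi = np.quantile(chisq1, 0.95)
print(q95_tsq, q95_chi)   # both near 3.84, the χ²(1) 95% point
```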

Convergence in Distribution: Limiting Distributions 3) If y_n has a limiting distribution and plim (x_n − y_n) = 0, then x_n has the same limiting distribution as y_n.

Central Limit Theorems Lindeberg–Lévy Central Limit Theorem: If x_1, …, x_n are a random sample from any distribution with finite mean μ and finite variance σ², then √n (x̄_n − μ) →d N(0, σ²).
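A quick simulation sketch of the theorem, using an exponential population (a deliberately skewed choice; reps and n are arbitrary): the standardized sample mean should look standard normal.

```python
import numpy as np

rng = np.random.default_rng(5)
reps, n = 50_000, 100
mu, sigma = 1.0, 1.0          # exponential(1): mean 1, sd 1

# √n (x̄ − μ)/σ from a heavily skewed population should be
# approximately standard normal for moderate n.
xbars = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (xbars - mu) / sigma
print(z.mean(), z.std())   # ≈ 0 and ≈ 1
```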

Central Limit Theorems Delta Method: To find the limiting normal distribution of a function, we use the 'delta' method, a linear Taylor approximation, which gives the following theorem. Theorem 4: If √n (z_n − μ) →d N(0, σ²) and g(z) is a continuously differentiable function not involving the sample size n, then √n (g(z_n) − g(μ)) →d N(0, g′(μ)² σ²).
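A sketch with the hypothetical choice g(z) = z², sampling from N(2, 1), for which the theorem predicts a limiting standard deviation of |g′(2)|·σ = 4:

```python
import numpy as np

rng = np.random.default_rng(9)
reps, n = 20_000, 200
mu, sigma = 2.0, 1.0
g, g_prime = (lambda v: v ** 2), (lambda v: 2 * v)

# Theorem 4 predicts √n (g(x̄) − g(μ)) ≈ N(0, g'(μ)² σ²),
# i.e. a standard deviation of |g'(2)| · 1 = 4 here.
xbars = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (g(xbars) - g(mu))
print(z.std())   # ≈ 4
```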

Asymptotic Distributions Definition 8: An asymptotic distribution is a distribution that is used to approximate the true finite-sample distribution of a random variable. A good way to derive an asymptotic distribution is from a known limiting distribution. For example, if we know √n (x̄_n − μ)/σ →d N(0, 1), then we can write, approximately (asymptotically), x̄_n ~a N(μ, σ²/n) as the asymptotic distribution of the sample mean.
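A sketch of the asymptotic distribution in use: build 1.96-standard-error confidence intervals from x̄ ~a N(μ, σ²/n) and check their empirical coverage (the exponential population and sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(13)
reps, n, mu = 5000, 500, 2.0

# Intervals x̄ ± 1.96 · s/√n from the asymptotic normal
# approximation should cover μ about 95% of the time.
x = rng.exponential(mu, size=(reps, n))
xbars = x.mean(axis=1)
ses = x.std(axis=1, ddof=1) / np.sqrt(n)
covered = (xbars - 1.96 * ses < mu) & (mu < xbars + 1.96 * ses)
print(covered.mean())   # ≈ 0.95
```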

End of the Lecture