Parameter, Statistic and Random Samples

Presentation transcript:

Parameter, Statistic and Random Samples

A parameter is a number that describes the population. It is a fixed number, but in practice we do not know its value. A statistic is a function of the sample data, i.e., a quantity whose value can be calculated from the sample data. It is a random variable with a distribution function. The random variables X_1, X_2, …, X_n are said to form a (simple) random sample of size n if the X_i's are independent random variables and each X_i has the same probability distribution. We say that the X_i's are iid.

Example

Toss a coin n times. Suppose the X_i's are Bernoulli random variables with p = ½, so E(X_i) = ½. The proportion of heads is (X_1 + X_2 + … + X_n)/n. It is a statistic.
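The point that a statistic varies from sample to sample can be seen in a short simulation. This is a minimal sketch (the helper name `head_proportion` and the use of a seeded generator are illustrative, not from the slides):

```python
import random

def head_proportion(n, p=0.5, seed=0):
    """Simulate n Bernoulli(p) tosses and return the proportion of heads.

    The proportion (X_1 + ... + X_n)/n is a statistic: a function of the
    sample data whose value changes from sample to sample.
    """
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(n)) / n

# Two different samples give two different values of the same statistic.
print(head_proportion(100, seed=1))
print(head_proportion(100, seed=2))
```

Rerunning with different seeds produces different values, which is exactly why a statistic has a distribution of its own.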

Sampling Distribution of a Statistic

The sampling distribution of a statistic is the distribution of values taken by the statistic in all possible samples of the same size from the same population. The distribution function of a statistic is NOT the same as the distribution of the original population that generated the original sample. Probability rules can be used to obtain the distribution of a statistic provided that it is a "simple" function of the X_i's and either there are relatively few different values in the population or else the population distribution has a "nice" form. Alternatively, we can perform a simulation experiment to obtain information about the sampling distribution of a statistic.
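The simulation experiment mentioned above can be sketched as follows; this is one possible setup (a Uniform(0, 1) population and the sample mean as the statistic are illustrative choices, not from the slides):

```python
import random
import statistics

def simulate_sampling_distribution(n, reps=2000, seed=0):
    """Approximate the sampling distribution of the sample mean by
    drawing `reps` samples of size n from a Uniform(0, 1) population
    and recording the statistic each time."""
    rng = random.Random(seed)
    return [statistics.mean(rng.random() for _ in range(n))
            for _ in range(reps)]

means = simulate_sampling_distribution(n=25)
print(statistics.mean(means))   # near the population mean 1/2
print(statistics.stdev(means))  # near sigma/sqrt(n) = (1/sqrt(12))/5
```

The recorded values approximate the sampling distribution: their average sits near the population mean and their spread shrinks like σ/√n.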

Markov's Inequality

If X is a non-negative random variable and a > 0, then

P(X ≥ a) ≤ E(X)/a.

Proof: Since X is non-negative, X ≥ a·1{X ≥ a}, where 1{X ≥ a} is the indicator of the event {X ≥ a}. Taking expectations gives E(X) ≥ a·P(X ≥ a); dividing by a completes the proof.
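A quick empirical check of Markov's inequality, here on Exponential(1) draws, which are non-negative with E(X) = 1 (the choice of distribution is illustrative):

```python
import random

# Draw many non-negative values and compare each tail probability
# with the Markov bound E(X)/a.
rng = random.Random(0)
xs = [rng.expovariate(1.0) for _ in range(100_000)]
mean = sum(xs) / len(xs)

for a in (1.0, 2.0, 5.0):
    tail = sum(x >= a for x in xs) / len(xs)
    print(f"P(X >= {a}) ~ {tail:.4f}  <=  E(X)/a = {mean / a:.4f}")
```

Note the bound holds exactly for the empirical distribution too, since each x with x ≥ a contributes at least a to the sample total.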

Chebyshev's Inequality

For a random variable X with E(X) = μ and V(X) = σ², and for any a > 0,

P(|X − μ| ≥ a) ≤ σ²/a².

Proof: Apply Markov's inequality to the non-negative random variable (X − μ)²:

P(|X − μ| ≥ a) = P((X − μ)² ≥ a²) ≤ E((X − μ)²)/a² = σ²/a².
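The same empirical check works for Chebyshev's inequality; a N(5, 2²) population is used here purely as an illustration:

```python
import random

# Compare the two-sided tail P(|X - mu| >= a) with the bound sigma^2/a^2.
rng = random.Random(1)
mu, sigma = 5.0, 2.0
xs = [rng.gauss(mu, sigma) for _ in range(100_000)]

for a in (2.0, 4.0, 6.0):
    tail = sum(abs(x - mu) >= a for x in xs) / len(xs)
    bound = sigma ** 2 / a ** 2
    print(f"P(|X - mu| >= {a}) ~ {tail:.4f}  <=  sigma^2/a^2 = {bound:.4f}")
```

For a normal population the actual tails are far smaller than the bound, which foreshadows the "very weak upper bound" remark in the coin example below.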

Law of Large Numbers

We are interested in a sequence of random variables X_1, X_2, X_3, … that are independent and identically distributed (i.i.d.). Let S_n = X_1 + X_2 + … + X_n. Suppose E(X_i) = μ and V(X_i) = σ². Then

E(S_n/n) = μ and V(S_n/n) = σ²/n.

Intuitively, as n → ∞, V(S_n/n) → 0, so S_n/n should settle down near μ.

Formally, the Weak Law of Large Numbers (WLLN) states the following: suppose X_1, X_2, X_3, … are i.i.d. with E(X_i) = μ < ∞ and V(X_i) = σ² < ∞. Then for any positive number a,

P(|S_n/n − μ| ≥ a) → 0 as n → ∞.

This is called convergence in probability.

Proof: S_n/n has mean μ and variance σ²/n, so by Chebyshev's inequality P(|S_n/n − μ| ≥ a) ≤ σ²/(na²), which tends to 0 as n → ∞.

Example

Flip a coin 10,000 times. Let E(X_i) = ½ and V(X_i) = ¼. Take a = 0.01; then by Chebyshev's inequality,

P(|S_n/n − ½| ≥ 0.01) ≤ ¼ / (10,000 × 0.01²) = 0.25.

Chebyshev's inequality gives a very weak upper bound, but it works regardless of the distribution of the X_i's. The WLLN states that the proportion of heads in the 10,000 tosses converges in probability to 0.5.
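How weak is the bound? For this example the exact binomial probability can be computed and compared with 0.25. A sketch using exact integer arithmetic (the `Fraction` route avoids floating-point underflow in 0.5^10000):

```python
from fractions import Fraction
from math import comb

# Chebyshev: P(|S_n/n - 1/2| >= 0.01) <= 0.25 for n = 10,000 fair tosses.
n = 10_000
chebyshev_bound = Fraction(1, 4) / (n * Fraction(1, 100) ** 2)  # = 1/4

# |S_n/n - 1/2| >= 0.01 means at most 4900 or at least 5100 heads.
favorable = sum(comb(n, k) for k in range(0, 4901)) \
          + sum(comb(n, k) for k in range(5100, n + 1))
exact = float(Fraction(favorable, 2 ** n))
print(float(chebyshev_bound))  # 0.25
print(exact)                   # roughly 0.05 -- far below the bound
```

The true probability is several times smaller than the Chebyshev bound, which is the price paid for an inequality that holds for every distribution.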

Strong Law of Large Numbers

Suppose X_1, X_2, X_3, … are i.i.d. with E(X_i) = μ < ∞. Then S_n/n converges to μ as n → ∞ with probability 1. That is,

P(lim_{n→∞} S_n/n = μ) = 1.

This is called convergence almost surely.
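Almost-sure convergence is a statement about individual sample paths, so it can be illustrated by following one path of the running average; Uniform(0, 2) draws (μ = 1) are an illustrative choice:

```python
import random

# One sample path of the running average S_n/n for Uniform(0, 2) draws.
rng = random.Random(42)
total = 0.0
running = {}
for i in range(1, 100_001):
    total += rng.uniform(0.0, 2.0)
    if i in (10, 1_000, 100_000):
        running[i] = total / i
print(running)  # the running average drifts toward mu = 1 as n grows
```

Early averages bounce around, but along the path the running average settles down near μ, which is what "with probability 1" promises for almost every path.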

Central Limit Theorem

The central limit theorem is concerned with the limiting behaviour of sums of random variables. If X_1, X_2, … is a sequence of i.i.d. random variables with mean μ and variance σ², and S_n = X_1 + X_2 + … + X_n, then by the WLLN we have that S_n/n → μ in probability. The CLT is concerned not just with the fact of convergence but with how S_n/n fluctuates around μ. Note that E(S_n) = nμ and V(S_n) = nσ². The standardized version of S_n is

Z_n = (S_n − nμ)/(σ√n),

and we have E(Z_n) = 0 and V(Z_n) = 1.

The Central Limit Theorem

Let X_1, X_2, … be a sequence of i.i.d. random variables with E(X_i) = μ < ∞ and Var(X_i) = σ² < ∞, and let Z_n = (S_n − nμ)/(σ√n). Then, for −∞ < x < ∞,

P(Z_n ≤ x) → Φ(x) as n → ∞,

where Z is a standard normal random variable and Φ(x) is the cdf of the standard normal distribution. This is equivalent to saying that Z_n converges in distribution to Z ~ N(0,1). Also, since

(S_n/n − μ)/(σ/√n) = Z_n,

the standardized sample mean converges in distribution to Z ~ N(0,1) as well.
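The convergence P(Z_n ≤ x) → Φ(x) can be seen numerically. A sketch using sums of Uniform(0,1) variables (μ = ½, σ² = 1/12; the choice of distribution and of n = 30 is illustrative):

```python
import math
import random

def phi(x):
    """Standard normal cdf, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Standardize sums of n i.i.d. Uniform(0,1) variables and compare
# the empirical cdf of Z_n with Phi.
rng = random.Random(0)
n, reps = 30, 20_000
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)
zs = [(sum(rng.random() for _ in range(n)) - n * mu) / (sigma * math.sqrt(n))
      for _ in range(reps)]

for x in (-1.0, 0.0, 1.0):
    empirical = sum(z <= x for z in zs) / reps
    print(f"P(Z_n <= {x}) ~ {empirical:.3f}   Phi({x}) = {phi(x):.3f}")
```

Even at n = 30 the empirical probabilities agree with Φ(x) to two decimal places, which is why the normal approximation is so widely useful.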

Example

Suppose X_1, X_2, … are i.i.d. random variables, each with the Poisson(3) distribution, so E(X_i) = V(X_i) = 3. The CLT says that

(S_n − 3n)/√(3n) → Z ~ N(0,1) in distribution as n → ∞.

Examples

A very common application of the CLT is the normal approximation to the binomial distribution. Suppose X_1, X_2, … are i.i.d. random variables, each with the Bernoulli(p) distribution, so E(X_i) = p and V(X_i) = p(1 − p). The CLT says that

(S_n − np)/√(np(1 − p)) → Z ~ N(0,1) in distribution as n → ∞.

Let Y_n = X_1 + … + X_n; then Y_n has a Binomial(n, p) distribution. So for large n, Y_n is approximately N(np, np(1 − p)).

Suppose we flip a biased coin 1000 times and the probability of heads on any one toss is 0.6. Find the probability of getting at least 550 heads.

Suppose we toss a coin 100 times and observe 60 heads. Is the coin fair?
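The first question can be worked through numerically. This sketch applies the normal approximation with a continuity correction and, as an assumption-free check, also sums the exact binomial tail in log space to avoid underflow:

```python
import math

def phi(x):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# P(at least 550 heads in 1000 tosses of a p = 0.6 coin).
n, p = 1000, 0.6
mu = n * p                           # 600
sigma = math.sqrt(n * p * (1 - p))   # about 15.49

# Normal approximation with continuity correction: P(Y >= 550) ~ P(N >= 549.5)
approx = 1.0 - phi((549.5 - mu) / sigma)
print(approx)  # about 0.999 -- at least 550 heads is very likely

# Exact binomial tail via log-pmf terms (lgamma keeps the terms in range).
exact = sum(math.exp(math.lgamma(n + 1) - math.lgamma(k + 1)
                     - math.lgamma(n - k + 1)
                     + k * math.log(p) + (n - k) * math.log(1 - p))
            for k in range(550, n + 1))
print(exact)
```

Since 550 lies about 3.2 standard deviations below the mean of 600, the event "at least 550 heads" is nearly certain. The same machinery applied to the second question (60 heads in 100 fair tosses is 2 standard deviations above 50) suggests the coin is suspect.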

Sampling from a Normal Population

If the original population has a normal distribution, the sample mean is also normally distributed; we do not need the CLT in this case. In general, if X_1, X_2, …, X_n are i.i.d. N(μ, σ²), then

S_n = X_1 + X_2 + … + X_n ~ N(nμ, nσ²) and S_n/n ~ N(μ, σ²/n).
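The exactness of S_n/n ~ N(μ, σ²/n) holds for every n, even very small n; a quick simulation check with n = 5 (the parameter values are illustrative):

```python
import random
import statistics

# Sample means of n = 5 draws from N(10, 3^2): exactly N(10, 9/5).
rng = random.Random(7)
mu, sigma, n, reps = 10.0, 3.0, 5, 20_000
means = [statistics.mean(rng.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]
print(statistics.mean(means))   # near mu = 10
print(statistics.stdev(means))  # near sigma / sqrt(n), about 1.342
```

Unlike the CLT examples above, no "large n" is needed here: the distribution of the sample mean is exactly normal because sums of independent normals are normal.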
