Chapter 5.6 From DeGroot & Schervish. The Normal Distributions.


The Normal Distributions
The most widely used model for random variables with continuous distributions is the family of normal distributions. The random variables studied in various physical experiments often have distributions that are approximately normal. Moreover, if a large random sample is taken from some distribution, many important functions of the observations in the sample will have distributions that are approximately normal.

Properties of Normal Distributions
A random variable X has the normal distribution with mean μ and variance σ² (−∞ < μ < ∞ and σ > 0) if X has a continuous distribution with the following p.d.f.:

f(x | μ, σ²) = (1 / ((2π)^(1/2) σ)) exp(−(x − μ)² / (2σ²)), for −∞ < x < ∞.
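As a quick numerical check, the p.d.f. above can be coded directly and compared against the standard library's implementation in `statistics.NormalDist` (Python 3.8+). The parameter values below are illustrative, not taken from the text.

```python
from math import exp, pi, sqrt
from statistics import NormalDist

def normal_pdf(x, mu, sigma):
    """p.d.f. of the normal distribution with mean mu and variance sigma**2."""
    return exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sqrt(2 * pi) * sigma)

# Compare against the standard library at a few points (mu=2, sigma=1.5 chosen arbitrarily).
d = NormalDist(mu=2.0, sigma=1.5)
for x in (-1.0, 0.0, 2.0, 4.5):
    assert abs(normal_pdf(x, 2.0, 1.5) - d.pdf(x)) < 1e-12
```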

The m.g.f. of the Normal Distribution
If X has the normal distribution with mean μ and variance σ², its moment generating function is

ψ(t) = E(e^(tX)) = exp(μt + σ²t²/2), for −∞ < t < ∞.

Mean and Variance
The mean and variance of the normal distribution are μ and σ², respectively. They follow from the first two derivatives of the m.g.f.:

ψ′(t) = (μ + σ²t) exp(μt + σ²t²/2), so ψ′(0) = μ = E(X),
ψ″(t) = ((μ + σ²t)² + σ²) exp(μt + σ²t²/2), so ψ″(0) = μ² + σ² = E(X²),

and hence Var(X) = E(X²) − (E(X))² = σ².
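The derivative calculations can be sanity-checked by differentiating the m.g.f. numerically at t = 0 with central differences; the first derivative should approximate E(X) = μ and the second E(X²) = μ² + σ². The values μ = 1.5, σ = 2 are illustrative.

```python
from math import exp

def normal_mgf(t, mu, sigma):
    """m.g.f. of the normal distribution: exp(mu*t + sigma**2 * t**2 / 2)."""
    return exp(mu * t + sigma ** 2 * t ** 2 / 2)

mu, sigma, h = 1.5, 2.0, 1e-5
# Central differences approximate psi'(0) = E[X] and psi''(0) = E[X^2].
m1 = (normal_mgf(h, mu, sigma) - normal_mgf(-h, mu, sigma)) / (2 * h)
m2 = (normal_mgf(h, mu, sigma) - 2 * normal_mgf(0, mu, sigma)
      + normal_mgf(-h, mu, sigma)) / h ** 2

assert abs(m1 - mu) < 1e-4                       # E[X] = mu
assert abs((m2 - m1 ** 2) - sigma ** 2) < 1e-3   # Var(X) = E[X^2] - E[X]^2 = sigma^2
```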

The Shapes of Normal Distributions
The p.d.f. f(x | μ, σ²) of the normal distribution with mean μ and variance σ² is symmetric with respect to the point x = μ. Therefore, μ is both the mean and the median of the distribution. The p.d.f. attains its maximum value at the point x = μ.

Linear Transformations
If a random variable X has a normal distribution, then every linear function of X will also have a normal distribution.

Theorem: If X has the normal distribution with mean μ and variance σ², and if Y = aX + b, where a and b are given constants and a ≠ 0, then Y has the normal distribution with mean aμ + b and variance a²σ².

Linear Transformations
Proof: Let ψ denote the m.g.f. of X and ψ_Y the m.g.f. of Y. Then

ψ_Y(t) = E(e^(tY)) = E(e^(t(aX+b))) = e^(bt) ψ(at) = exp((aμ + b)t + a²σ²t²/2).

By comparing this expression for ψ_Y with the m.g.f. of a normal distribution given earlier, we see that ψ_Y is the m.g.f. of the normal distribution with mean aμ + b and variance a²σ². Hence, Y must have this normal distribution.

The Standard Normal Distribution
The normal distribution with mean 0 and variance 1 is called the standard normal distribution. Its p.d.f. and c.d.f. are usually denoted by φ and Φ:

φ(x) = (2π)^(−1/2) exp(−x²/2), and Φ(x) = ∫ from −∞ to x of φ(u) du.

Φ has no closed form and is evaluated from tables or numerical approximations.

Theorem
By the symmetry of the standard normal p.d.f. about 0, for all x and all 0 < p < 1,

Φ(−x) = 1 − Φ(x) and Φ⁻¹(p) = −Φ⁻¹(1 − p).

Converting Normal Distributions to Standard
Let X have the normal distribution with mean μ and variance σ², and let F be the c.d.f. of X. Then Z = (X − μ)/σ has the standard normal distribution, and for all x and all 0 < p < 1,

F(x) = Φ((x − μ)/σ) and F⁻¹(p) = μ + σΦ⁻¹(p).

Proof: Z = (X − μ)/σ is a linear function of X, so it has the standard normal distribution. Therefore,

F(x) = Pr(X ≤ x) = Pr(Z ≤ (x − μ)/σ) = Φ((x − μ)/σ).

For the second equation, let p = F(x) in the first equation and then solve for x in the resulting equation.
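Both conversion identities can be checked directly with `statistics.NormalDist`, whose `cdf` and `inv_cdf` methods play the roles of F, F⁻¹, Φ, and Φ⁻¹. The values μ = 7, σ = 2.5 are illustrative.

```python
from statistics import NormalDist

mu, sigma = 7.0, 2.5
X = NormalDist(mu, sigma)   # distribution of X
Z = NormalDist(0.0, 1.0)    # standard normal

# F(x) = Phi((x - mu) / sigma)
for x in (4.0, 7.0, 9.5):
    assert abs(X.cdf(x) - Z.cdf((x - mu) / sigma)) < 1e-12

# F^{-1}(p) = mu + sigma * Phi^{-1}(p)
for p in (0.1, 0.5, 0.975):
    assert abs(X.inv_cdf(p) - (mu + sigma * Z.inv_cdf(p))) < 1e-9
```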

Linear Combinations of Normally Distributed Variables
Theorem: If the random variables X1, ..., Xk are independent and if Xi has the normal distribution with mean μi and variance σi² (i = 1, ..., k), then the sum X1 + ... + Xk has the normal distribution with mean μ1 + ... + μk and variance σ1² + ... + σk².
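A Monte Carlo sketch of the theorem: simulate sums of independent normals and check that the sample mean and variance of the sums match μ1 + ... + μk and σ1² + ... + σk². The parameter triples are illustrative.

```python
import random
from statistics import fmean, pvariance

random.seed(0)
params = [(1.0, 2.0), (-3.0, 1.0), (0.5, 4.0)]   # (mu_i, sigma_i), chosen arbitrarily
n = 200_000

# Each draw is the sum X1 + X2 + X3 of independent normals.
sums = [sum(random.gauss(m, s) for m, s in params) for _ in range(n)]

mu_sum = sum(m for m, _ in params)        # -1.5
var_sum = sum(s ** 2 for _, s in params)  # 4 + 1 + 16 = 21
assert abs(fmean(sums) - mu_sum) < 0.05
assert abs(pvariance(sums) - var_sum) < 0.5
```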

Corollary
If the random variables X1, ..., Xk are independent, if Xi has the normal distribution with mean μi and variance σi² (i = 1, ..., k), and if a1, ..., ak, b are constants with at least one ai nonzero, then a1X1 + ... + akXk + b has the normal distribution with mean a1μ1 + ... + akμk + b and variance a1²σ1² + ... + ak²σk².

Example
Suppose that the heights, in inches, of the women in a certain population follow the normal distribution with mean 65 and standard deviation 1, and that the heights of the men follow the normal distribution with mean 68 and standard deviation 3. Suppose also that one woman is selected at random and, independently, one man is selected at random. Determine the probability that the woman will be taller than the man.

Example
Let W denote the height of the selected woman, and let M denote the height of the selected man. Then the difference W − M has the normal distribution with mean 65 − 68 = −3 and variance 1² + 3² = 10. Therefore, if we let

Z = (W − M + 3) / 10^(1/2),

then Z has the standard normal distribution. It follows that

Pr(W > M) = Pr(W − M > 0) = Pr(Z > 3/10^(1/2)) = Pr(Z > 0.949) = 1 − Φ(0.949) ≈ 0.171.
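The final probability in this example can be reproduced with `statistics.NormalDist`, using the distribution of the difference W − M derived above:

```python
from math import sqrt
from statistics import NormalDist

# W - M ~ N(65 - 68, 1**2 + 3**2) = N(-3, 10)
diff = NormalDist(mu=-3.0, sigma=sqrt(10.0))

# Pr(W > M) = Pr(W - M > 0) = 1 - F(0)
p = 1.0 - diff.cdf(0.0)
assert abs(p - 0.171) < 1e-3
```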

Corollary
If the random variables X1, ..., Xn form a random sample from the normal distribution with mean μ and variance σ², then the sample mean Xn has the normal distribution with mean μ and variance σ²/n.

Example
Suppose that a random sample of size n is to be taken from the normal distribution with mean μ and variance 9. Determine the minimum value of n for which

Pr(|Xn − μ| ≤ 1) ≥ 0.95.

Example
The sample mean Xn will have the normal distribution for which the mean is μ and the standard deviation is 3/n^(1/2). Therefore, if we let

Z = n^(1/2)(Xn − μ)/3,

then Z will have the standard normal distribution. In this example, n must be chosen so that

Pr(|Xn − μ| ≤ 1) = Pr(|Z| ≤ n^(1/2)/3) = 2Φ(n^(1/2)/3) − 1 ≥ 0.95,

which holds when n^(1/2)/3 ≥ 1.96, that is, n ≥ 34.57. The smallest such n is 35.
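The sample-size calculation above can be confirmed by searching for the smallest n directly and comparing it with the closed-form bound n ≥ (σ Φ⁻¹(0.975))²:

```python
import math
from statistics import NormalDist

Z = NormalDist()                         # standard normal
sigma, half_width, level = 3.0, 1.0, 0.95

# Smallest n with Pr(|Xbar_n - mu| <= 1) >= 0.95, where Xbar_n has sd sigma/sqrt(n).
n = 1
while 2 * Z.cdf(half_width * n ** 0.5 / sigma) - 1 < level:
    n += 1

# Closed form: sqrt(n)/3 >= Phi^{-1}(0.975) ~ 1.96, i.e. n >= 34.57, so n = 35.
n_formula = math.ceil((sigma * Z.inv_cdf((1 + level) / 2) / half_width) ** 2)
assert n == n_formula == 35
```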

The Lognormal Distributions
If log(X) has the normal distribution with mean μ and variance σ², we say that X has the lognormal distribution with parameters μ and σ².

Moments of a Lognormal Distribution
The moments of a lognormal random variable are easy to compute based on the m.g.f. of a normal distribution. Let Y = log(X), so that Y has the normal distribution with mean μ and variance σ² and m.g.f. ψ(t) = E(e^(tY)) = exp(μt + σ²t²/2). Since Y = log(X), we have

E(X^t) = E(e^(t log X)) = E(e^(tY)) = ψ(t) for all real t.

In particular, the mean and variance of X are

E(X) = ψ(1) = exp(μ + σ²/2), and
Var(X) = ψ(2) − ψ(1)² = exp(2μ + σ²)(exp(σ²) − 1).
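A Monte Carlo sketch of the moment formulas: generate X = e^Y with Y normal and compare the sample mean of X with ψ(1) = exp(μ + σ²/2). The parameters μ = 0.5, σ = 0.8 are illustrative.

```python
import random
from math import exp
from statistics import fmean

mu, sigma = 0.5, 0.8
mean_X = exp(mu + sigma ** 2 / 2)                          # psi(1)
var_X = exp(2 * mu + sigma ** 2) * (exp(sigma ** 2) - 1)   # psi(2) - psi(1)**2

random.seed(1)
# X = e^Y where Y ~ N(mu, sigma^2), so X is lognormal(mu, sigma^2).
xs = [exp(random.gauss(mu, sigma)) for _ in range(200_000)]

assert abs(fmean(xs) - mean_X) / mean_X < 0.02
assert var_X > 0
```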