Chapter 4: Mathematical Expectation:

4.1 Mean of a Random Variable:
Definition 4.1: Let X be a random variable with a probability distribution f(x). The mean (or expected value) of X is denoted by μ_X (or E(X)) and is defined by:
E(X) = Σ_x x f(x)              if X is discrete
E(X) = ∫_{−∞}^{∞} x f(x) dx    if X is continuous
Example 4.1: (Reading Assignment)
Example: (Example 3.3) A shipment of 8 similar microcomputers to a retail outlet contains 3 that are defective and 5 that are non-defective. If a school makes a random purchase of 2 of these computers, find the expected number of defective computers purchased.

Solution: Let X = the number of defective computers purchased. In Example 3.3, we found that the probability distribution of X is:
x              0      1      2
f(x) = P(X=x)  10/28  15/28  3/28
or, equivalently: f(x) = C(3, x) C(5, 2−x) / C(8, 2) for x = 0, 1, 2.
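As a quick sanity check, the hypergeometric pmf of Example 3.3 and the resulting mean can be computed directly (a minimal sketch using only the standard library):

```python
from math import comb

# pmf of X = number of defectives among 2 computers chosen from 8 (3 defective):
# f(x) = C(3, x) * C(5, 2 - x) / C(8, 2), from the counting argument of Example 3.3.
f = {x: comb(3, x) * comb(5, 2 - x) / comb(8, 2) for x in (0, 1, 2)}

assert abs(sum(f.values()) - 1) < 1e-12    # probabilities sum to 1
mean = sum(x * p for x, p in f.items())    # E(X) = sum of x * f(x)
print(round(mean, 4))                      # expected number of defectives: 0.75
```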

The expected value of the number of defective computers purchased is the mean of X:
E(X) = (0) f(0) + (1) f(1) + (2) f(2) = (0)(10/28) + (1)(15/28) + (2)(3/28) = 21/28 = 0.75 (computers)
Example 4.3: Let X be a continuous random variable that represents the life (in hours) of a certain electronic device. The pdf of X is given by:
f(x) = 20000/x³ for x > 100, and f(x) = 0 elsewhere.
Find the expected life of this type of device.

Solution:
E(X) = ∫_{100}^{∞} x · (20000/x³) dx = ∫_{100}^{∞} 20000/x² dx = [−20000/x]_{100}^{∞} = 200 (hours)
Therefore, we expect this type of electronic device to last, on average, 200 hours.
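The integral can also be checked numerically (a sketch; the truncation point B and grid size n are arbitrary choices, and the tail beyond B is added analytically as ∫_B^∞ 20000/x² dx = 20000/B):

```python
# Numerical check of E(X) = ∫_100^∞ x * (20000 / x^3) dx for Example 4.3.
def integrand(x):
    return 20000 / x**2  # x * f(x) simplifies to 20000 / x^2

def expected_life(B=1_000_000, n=100_000):
    # Composite Simpson's rule on [100, B] (n must be even),
    # plus the exact analytic tail 20000 / B for the rest of the domain.
    h = (B - 100) / n
    s = integrand(100) + integrand(B)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * integrand(100 + i * h)
    return s * h / 3 + 20000 / B

print(round(expected_life(), 1))  # ≈ 200.0 hours
```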

Theorem 4.1: Let X be a random variable with a probability distribution f(x), and let g(X) be a function of the random variable X. The mean (or expected value) of the random variable g(X) is denoted by μ_{g(X)} (or E[g(X)]) and is defined by:
E[g(X)] = Σ_x g(x) f(x)              if X is discrete
E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx    if X is continuous
Example: Let X be a discrete random variable taking the values x = 0, 1, 2 with probability distribution f(x). Find E[g(X)], where g(X) = (X − 1)².

Solution: g(X)=(X 1)2 = (01)2 f(0) + (11)2 f(1) +(21)2 f(2) = (1)2 + (0)2 +(1)2

Example: In Example 4.3, find E . Solution:

4.2 Variance (of a Random Variable):
The most important measure of the variability of a random variable X is the variance of X, denoted by Var(X) or σ²_X.
Definition 4.3: Let X be a random variable with a probability distribution f(x) and mean μ. The variance of X is defined by:
Var(X) = σ²_X = E[(X − μ)²] = Σ_x (x − μ)² f(x)              if X is discrete
Var(X) = σ²_X = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx    if X is continuous
Definition: The positive square root of the variance of X, σ_X = √Var(X), is called the standard deviation of X.
Note: Var(X) = E[g(X)], where g(X) = (X − μ)².

Theorem 4.2: The variance of the random variable X is given by:
Var(X) = σ²_X = E(X²) − μ², where μ = E(X).
Example 4.9: Let X be a discrete random variable with the following probability distribution:
x     0     1     2     3
f(x)  0.51  0.38  0.10  0.01
Find Var(X) = σ²_X.

Solution:
μ = E(X) = (0) f(0) + (1) f(1) + (2) f(2) + (3) f(3)
         = (0)(0.51) + (1)(0.38) + (2)(0.10) + (3)(0.01) = 0.61
1. First method: σ² = Σ_x (x − μ)² f(x)
σ² = (0 − 0.61)² f(0) + (1 − 0.61)² f(1) + (2 − 0.61)² f(2) + (3 − 0.61)² f(3)
   = (−0.61)² (0.51) + (0.39)² (0.38) + (1.39)² (0.10) + (2.39)² (0.01) = 0.4979
2. Second method: σ² = E(X²) − μ²
E(X²) = (0²) f(0) + (1²) f(1) + (2²) f(2) + (3²) f(3)
      = (0)(0.51) + (1)(0.38) + (4)(0.10) + (9)(0.01) = 0.87
σ² = 0.87 − (0.61)² = 0.87 − 0.3721 = 0.4979
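Both methods can be checked in a few lines (a sketch using the pmf of Example 4.9):

```python
# Variance of Example 4.9, by the definition and by the shortcut E(X^2) - mu^2.
pmf = {0: 0.51, 1: 0.38, 2: 0.10, 3: 0.01}

mu = sum(x * p for x, p in pmf.items())                        # E(X) = 0.61
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())       # first method
var_short = sum(x * x * p for x, p in pmf.items()) - mu ** 2   # second method

print(round(mu, 4), round(var_def, 4), round(var_short, 4))    # 0.61 0.4979 0.4979
```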

Example 4.10: Let X be a continuous random variable with the following pdf:
f(x) = 2(x − 1) for 1 < x < 2, and f(x) = 0 elsewhere.
Find the mean and the variance of X.
Solution:
μ = E(X) = ∫_1^2 x · 2(x − 1) dx = 5/3
E(X²) = ∫_1^2 x² · 2(x − 1) dx = 17/6
σ² = E(X²) − μ² = 17/6 − (5/3)² = 17/6 − 25/9 = 1/18
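The mean and variance of Example 4.10 can be verified numerically. This sketch takes the pdf to be f(x) = 2(x − 1) on (1, 2), which is consistent with E(X) = 5/3 and E(X²) = 17/6; Simpson's rule is exact here because the integrands are polynomials of degree ≤ 3:

```python
# Composite Simpson's rule (n even); exact for polynomials up to degree 3.
def simpson(fn, a, b, n=1000):
    h = (b - a) / n
    s = fn(a) + fn(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * fn(a + i * h)
    return s * h / 3

f = lambda x: 2 * (x - 1)                      # pdf on (1, 2)
mu = simpson(lambda x: x * f(x), 1, 2)         # E(X)  = 5/3
ex2 = simpson(lambda x: x * x * f(x), 1, 2)    # E(X^2) = 17/6
print(round(mu, 6), round(ex2 - mu ** 2, 6))   # 1.666667 0.055556  (variance = 1/18)
```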

4.3 Means and Variances of Linear Combinations of Random Variables:
If X₁, X₂, …, Xₙ are n random variables and a₁, a₂, …, aₙ are constants, then the random variable
Y = a₁X₁ + a₂X₂ + … + aₙXₙ
is called a linear combination of the random variables X₁, X₂, …, Xₙ.
Theorem 4.5: If X is a random variable with mean μ = E(X), and if a and b are constants, then:
E(aX ± b) = a E(X) ± b, i.e. μ_{aX±b} = a μ_X ± b
Corollary 1: E(b) = b (take a = 0 in Theorem 4.5)
Corollary 2: E(aX) = a E(X) (take b = 0 in Theorem 4.5)

Example 4.16: Let X be a random variable with the following probability density function:
f(x) = x²/3 for −1 < x < 2, and f(x) = 0 elsewhere.
Find E(4X + 3).
Solution:
E(X) = ∫_{−1}^{2} x · (x²/3) dx = ∫_{−1}^{2} x³/3 dx = [x⁴/12]_{−1}^{2} = 5/4
E(4X + 3) = 4 E(X) + 3 = 4(5/4) + 3 = 8
Another solution: with g(X) = 4X + 3,
E(4X + 3) = E[g(X)] = ∫_{−1}^{2} (4x + 3)(x²/3) dx = 8
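Both routes to E(4X + 3) can be compared numerically (a sketch; the pdf f(x) = x²/3 on (−1, 2) is taken as in the example, and Simpson's rule is exact for these cubic integrands):

```python
# Composite Simpson's rule (n even); exact for polynomials up to degree 3.
def simpson(fn, a, b, n=1000):
    h = (b - a) / n
    s = fn(a) + fn(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * fn(a + i * h)
    return s * h / 3

f = lambda x: x * x / 3                                   # pdf on (-1, 2)
mu = simpson(lambda x: x * f(x), -1, 2)                   # E(X) = 5/4
via_linearity = 4 * mu + 3                                # Theorem 4.5
directly = simpson(lambda x: (4 * x + 3) * f(x), -1, 2)   # E[g(X)] integral
print(round(via_linearity, 6), round(directly, 6))        # 8.0 8.0
```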

Theorem: If X₁, X₂, …, Xₙ are n random variables and a₁, a₂, …, aₙ are constants, then:
E(a₁X₁ + a₂X₂ + … + aₙXₙ) = a₁E(X₁) + a₂E(X₂) + … + aₙE(Xₙ)
Corollary: If X and Y are random variables, then E(X ± Y) = E(X) ± E(Y).
Theorem 4.9: If X is a random variable with variance σ²_X = Var(X), and if a and b are constants, then:
Var(aX ± b) = a² Var(X), i.e. σ²_{aX±b} = a² σ²_X

Theorem: If X₁, X₂, …, Xₙ are n independent random variables and a₁, a₂, …, aₙ are constants, then:
Var(a₁X₁ + a₂X₂ + … + aₙXₙ) = a₁² Var(X₁) + a₂² Var(X₂) + … + aₙ² Var(Xₙ)
Corollary: If X and Y are independent random variables, then:
· Var(aX + bY) = a² Var(X) + b² Var(Y)
· Var(aX − bY) = a² Var(X) + b² Var(Y)
· Var(X ± Y) = Var(X) + Var(Y)

Example: Let X and Y be two independent random variables such that E(X) = 2, Var(X) = 4, E(Y) = 7, and Var(Y) = 1. Find:
1. E(3X + 7) and Var(3X + 7)
2. E(5X + 2Y − 2) and Var(5X + 2Y − 2)
Solution:
1. E(3X + 7) = 3 E(X) + 7 = 3(2) + 7 = 13
   Var(3X + 7) = (3)² Var(X) = (9)(4) = 36
2. E(5X + 2Y − 2) = 5 E(X) + 2 E(Y) − 2 = (5)(2) + (2)(7) − 2 = 22
   Var(5X + 2Y − 2) = Var(5X + 2Y) = 5² Var(X) + 2² Var(Y) = (25)(4) + (4)(1) = 104
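The arithmetic of this example maps directly onto the theorems (a sketch; plain substitution into the linearity and independence rules above):

```python
# Given: X, Y independent with E(X)=2, Var(X)=4, E(Y)=7, Var(Y)=1.
EX, VX, EY, VY = 2, 4, 7, 1

E1 = 3 * EX + 7                    # E(3X+7) = 3 E(X) + 7
V1 = 3 ** 2 * VX                   # Var(3X+7) = 9 Var(X); the constant drops out
E2 = 5 * EX + 2 * EY - 2           # E(5X+2Y-2), by linearity
V2 = 5 ** 2 * VX + 2 ** 2 * VY     # Var(5X+2Y-2) = 25 Var(X) + 4 Var(Y)
print(E1, V1, E2, V2)              # 13 36 22 104
```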

4.4 Chebyshev's Theorem:
* Suppose that X is any random variable with mean E(X) = μ, variance Var(X) = σ², and standard deviation σ.
* Chebyshev's theorem gives a conservative estimate of the probability that the random variable X assumes a value within k standard deviations (kσ) of its mean μ, which is P(μ − kσ < X < μ + kσ).
* P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²

Theorem 4.11 (Chebyshev's Theorem): Let X be a random variable with mean E(X) = μ and variance Var(X) = σ². Then for k > 1:
P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k², equivalently P(|X − μ| < kσ) ≥ 1 − 1/k²
Example 4.22: Let X be a random variable having an unknown distribution with mean μ = 8 and variance σ² = 9 (standard deviation σ = 3). Find the following probabilities:
(a) P(−4 < X < 20)
(b) P(|X − 8| ≥ 6)

P( k <X<  +k) 1 Solution: (a) P(4 <X< 20)= ?? P( k <X<  +k) 1 (4 <X< 20)= ( k <X<  +k)    4=  k   4= 8 k(3) or 20= + k  20= 8+ k(3)   4= 8 3k  20= 8+ 3k  3k=12  3k=12  k=4  k=4 Therefore, P(4 <X< 20) , and hence,P(4 <X< 20)  (approximately)

(b) P(|X 8|  6)= ?? P(|X8|  6)=1  P(|X8| < 6) P(|X 8| < 6)= ?? P(|X | < k) 1 (|X8| < 6) = (|X | < k) 6= k  6=3k  k=2 P(|X8| < 6)  1  P(|X8| < 6)  1   1  P(|X8| < 6)   P(|X8|  6)  Therefore, P(|X8|  6)  (approximately)

Another solution for part (b):
P(|X − 8| < 6) = P(−6 < X − 8 < 6) = P(−6 + 8 < X < 6 + 8) = P(2 < X < 14)
Matching (2 < X < 14) with (μ − kσ < X < μ + kσ):
2 = μ − kσ → 2 = 8 − 3k → 3k = 6 → k = 2
P(2 < X < 14) ≥ 1 − 1/2² = 3/4, i.e. P(|X − 8| < 6) ≥ 3/4
→ −P(|X − 8| < 6) ≤ −3/4
→ 1 − P(|X − 8| < 6) ≤ 1 − 3/4 = 1/4
→ P(|X − 8| ≥ 6) ≤ 1/4
Therefore, P(|X − 8| ≥ 6) is at most 1/4 (approximately).
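Chebyshev's bound holds for any distribution with a finite variance; a quick empirical check (a sketch using an exponential sample purely for illustration — the bound itself is distribution-free):

```python
import random

random.seed(0)  # reproducible illustration
xs = [random.expovariate(1.0) for _ in range(100_000)]

# Sample mean and standard deviation.
mu = sum(xs) / len(xs)
sd = (sum((x - mu) ** 2 for x in xs) / len(xs)) ** 0.5

for k in (2, 3, 4):
    # Fraction of the sample within k standard deviations of the mean;
    # Chebyshev guarantees this is at least 1 - 1/k^2.
    within = sum(1 for x in xs if abs(x - mu) < k * sd) / len(xs)
    print(k, round(within, 4), within >= 1 - 1 / k ** 2)
```

For this skewed sample the observed fractions comfortably exceed the bounds 3/4, 8/9, and 15/16, illustrating how conservative the estimate is.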