Mathematical Expectation

Chapter 4: Mathematical Expectation

Mean of a Random Variable: Definition: Let X be a random variable with a probability distribution f(x). The mean (or expected value) of X is denoted by μ_X (or E(X)) and is defined by:

μ_X = E(X) = Σ_x x f(x), if X is discrete;
μ_X = E(X) = ∫_{−∞}^{∞} x f(x) dx, if X is continuous.

Example: A shipment of 8 similar microcomputers to a retail outlet contains 3 that are defective and 5 that are non-defective. If a school makes a random purchase of 2 of these computers, find the expected number of defective computers purchased.

Solution: Let X = the number of defective computers purchased. Sampling 2 of the 8 computers without replacement gives the hypergeometric distribution f(x) = C(3, x) C(5, 2 − x) / C(8, 2), that is:

x    | 0     | 1     | 2
f(x) | 10/28 | 15/28 | 3/28

The expected value of the number of defective computers purchased is the mean (or the expected value) of X, which is:

E(X) = μ_X = Σ_{x=0}^{2} x f(x) = 0·f(0) + 1·f(1) + 2·f(2) = 0(10/28) + 1(15/28) + 2(3/28) = 21/28 = 3/4 = 0.75.
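A quick computational check of this arithmetic, using the pmf tabulated above (the Python snippets in this transcript are illustrative additions, not part of the original slides):

```python
# Expected number of defective computers: E(X) = sum over x of x * f(x).
from fractions import Fraction

pmf = {0: Fraction(10, 28), 1: Fraction(15, 28), 2: Fraction(3, 28)}

mean = sum(x * p for x, p in pmf.items())
print(mean, float(mean))  # 3/4 and 0.75
```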

Example: A box containing 7 components is sampled by a quality inspector; the box contains 4 good components and 3 defective components. A sample of 3 is taken by the inspector. Find the expected value of the number of good components in this sample.

Solution: Let X represent the number of good components in the sample. Sampling 3 of the 7 components without replacement gives f(x) = C(4, x) C(3, 3 − x) / C(7, 3) for x = 0, 1, 2, 3:

x    | 0    | 1     | 2     | 3
f(x) | 1/35 | 12/35 | 18/35 | 4/35

Therefore, E(X) = 0(1/35) + 1(12/35) + 2(18/35) + 3(4/35) = 60/35 = 12/7 ≈ 1.7.

Thus, if a sample of size 3 is selected at random over and over again from a box of 4 good components and 3 defective components, it will contain, on average, 1.7 good components.
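The same mean can be cross-checked with scipy's hypergeometric distribution (scipy's parameterization: M = population size, n = number of good components, N = sample size):

```python
# Mean of the hypergeometric count of good components:
# population M = 7, of which n = 4 are good; a sample of N = 3 is drawn.
from scipy.stats import hypergeom

rv = hypergeom(M=7, n=4, N=3)
print(rv.mean())  # 1.714... = 12/7, about 1.7 good components on average
```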

Example: Let X be a continuous random variable that represents the life (in hours) of a certain electronic device. The pdf of X is given by:

f(x) = 20000/x³ for x > 100, and f(x) = 0 elsewhere.

Find the expected life of this type of device.

Solution:

E(X) = ∫_{100}^{∞} x (20000/x³) dx = ∫_{100}^{∞} (20000/x²) dx = [−20000/x]_{100}^{∞} = 200 hours.

Therefore, we can expect this type of device to last, on average, 200 hours.
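A numerical cross-check of this integral, under the pdf stated above:

```python
# E(X) = integral of x * f(x) dx over the support, with f(x) = 20000/x**3 for x > 100.
from scipy.integrate import quad
import numpy as np

f = lambda x: 20000 / x**3

expected_life, _ = quad(lambda x: x * f(x), 100, np.inf)
print(expected_life)  # 200.0 hours
```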

Theorem: Let X be a random variable with a probability distribution f(x), and let g(X) be a function of the random variable X. The mean (or expected value) of the random variable g(X) is denoted by μ_{g(X)} (or E[g(X)]) and is defined by:

μ_{g(X)} = E[g(X)] = Σ_x g(x) f(x), if X is discrete;
μ_{g(X)} = E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx, if X is continuous.

Example: Let X be a discrete random variable with the following probability distribution. Find E[g(X)], where g(X) = (X − 1)².

Solution:

Example: Suppose that the number of cars X that pass through a car wash between 4:00 P.M. and 5:00 P.M. on any sunny Friday has the following probability distribution: Let g(X) = 2X − 1 represent the amount of money, in dollars, paid to the attendant by the manager. Find the attendant's expected earnings for this particular time period.

Solution:
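The slide's table and worked solution appear as an image in the original. As a stand-in, the sketch below assumes the distribution from the standard textbook version of this example (x = 4, …, 9 with probabilities 1/12, 1/12, 1/4, 1/4, 1/6, 1/6) and applies E[g(X)] = Σ g(x) f(x); substitute the actual pmf if it differs:

```python
# Attendant's expected earnings E(2X - 1) under the assumed car-wash pmf.
from fractions import Fraction as F

pmf = {4: F(1, 12), 5: F(1, 12), 6: F(1, 4), 7: F(1, 4), 8: F(1, 6), 9: F(1, 6)}

g = lambda x: 2 * x - 1
earnings = sum(g(x) * p for x, p in pmf.items())
print(earnings, float(earnings))  # 38/3, about 12.67 dollars
```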

Example: Let X be a continuous random variable that represents the life (in hours) of a certain electronic device. The pdf of X is given by:

f(x) = 20000/x³ for x > 100, and f(x) = 0 elsewhere.

Find E(1/X).

Solution:

E(1/X) = ∫_{100}^{∞} (1/x)(20000/x³) dx = 20000 ∫_{100}^{∞} x^{−4} dx = [−20000/(3x³)]_{100}^{∞} = 20000/(3(100)³) = 1/150 ≈ 0.0067.

Definition: Let X and Y be random variables with joint probability distribution f(x, y). The mean (or expected value) of the random variable g(X, Y) is

μ_{g(X,Y)} = E[g(X, Y)] = Σ_x Σ_y g(x, y) f(x, y), if X and Y are discrete;
μ_{g(X,Y)} = E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy, if X and Y are continuous.

Example: Let X and Y be random variables with joint probability distribution f(x, y). Find the expected value of g(X, Y) = XY.

Solution:
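The slide's joint table is not reproduced in this transcript. A sketch of the general recipe E(XY) = Σ_x Σ_y x y f(x, y), with a made-up joint pmf standing in for the table:

```python
# E[g(X, Y)] for g(X, Y) = XY, summed over every cell of a joint pmf table.
from fractions import Fraction as F

joint_pmf = {  # (x, y): f(x, y); placeholder values, summing to 1
    (0, 0): F(1, 4), (0, 1): F(1, 8),
    (1, 0): F(1, 8), (1, 1): F(1, 2),
}

e_xy = sum(x * y * p for (x, y), p in joint_pmf.items())
print(e_xy)  # 1/2; only the (1, 1) cell contributes here
```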

Exercises 4.1 – 4.2 – 4.10 – 4.13 and 4.17 on page 117

Variance (of a Random Variable): The most important measure of variability of a random variable X is called the variance of X and is denoted by Var(X) or σ².

Definition: Let X be a random variable with a probability distribution f(x) and mean μ. The variance of X is defined by:

σ² = Var(X) = E[(X − μ)²] = Σ_x (x − μ)² f(x), if X is discrete;
σ² = Var(X) = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx, if X is continuous.

Definition (standard deviation): The positive square root of the variance, σ = √(Var(X)), is called the standard deviation of X.

Theorem: The variance of a random variable X is σ² = E(X²) − μ². Proof: See page 121.
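A small check that this shortcut agrees with the definition, using a placeholder pmf (not one from the slides):

```python
# Compare Var(X) = E[(X - mu)^2] with the shortcut E(X^2) - mu^2.
from fractions import Fraction as F

pmf = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}  # placeholder distribution

mu   = sum(x * p for x, p in pmf.items())              # mean: 1
var1 = sum((x - mu)**2 * p for x, p in pmf.items())    # definition: 1/2
var2 = sum(x * x * p for x, p in pmf.items()) - mu**2  # shortcut:   1/2
print(var1, var2)  # both 1/2
```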

Example: Let X be a discrete random variable with the following probability distribution. Find the variance of X.

Solution:

Means and Variances of Linear Combinations of Random Variables

Theorem: If X is a random variable with mean μ = E(X), and if a and b are constants, then:

E(aX ± b) = a E(X) ± b, i.e., μ_{aX±b} = a μ_X ± b.

Corollary 1: E(b) = b (take a = 0 in the theorem).
Corollary 2: E(aX) = a E(X) (take b = 0 in the theorem).

Example: Let X be a random variable with the following probability density function:

f(x) = x²/3 for −1 < x < 2, and f(x) = 0 elsewhere.

Find E(4X + 3).

Solution:

E(X) = ∫_{−1}^{2} x (x²/3) dx = [x⁴/12]_{−1}^{2} = (16 − 1)/12 = 5/4, so E(4X + 3) = 4 E(X) + 3 = 4(5/4) + 3 = 8.

Another solution: compute E(4X + 3) directly:

E(4X + 3) = ∫_{−1}^{2} (4x + 3)(x²/3) dx = [(x⁴ + x³)/3]_{−1}^{2} = (16 + 8)/3 − (1 − 1)/3 = 8.
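Both routes can be verified symbolically; a sketch under the density given above:

```python
# E(4X + 3) two ways: via E(X), and directly as E[g(X)].
import sympy as sp

x = sp.symbols('x')
f = x**2 / 3  # density on (-1, 2)

EX = sp.integrate(x * f, (x, -1, 2))                # E(X) = 5/4
direct = sp.integrate((4*x + 3) * f, (x, -1, 2))    # E(4X + 3) directly
print(4 * EX + 3, direct)  # both give 8
```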

Theorem: The expected value of the sum or difference of two or more functions of a random variable X is the sum or difference of the expected values of the functions:

E[g(X) ± h(X)] = E[g(X)] ± E[h(X)],

and likewise for functions of jointly distributed X and Y:

E[g(X, Y) ± h(X, Y)] = E[g(X, Y)] ± E[h(X, Y)].

Corollary: If X and Y are random variables, then: E(X ± Y) = E(X) ± E(Y)

Theorem: If X is a random variable with variance Var(X) = σ_X², and if a and b are constants, then:

Var(aX ± b) = a² Var(X), i.e., σ²_{aX±b} = a² σ_X².

Theorem: If X and Y are random variables with joint probability distribution f(x, y), and a and b are constants, then:

Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y).

Corollary: If X and Y are independent random variables, then:
• Var(aX + bY) = a² Var(X) + b² Var(Y)
• Var(aX − bY) = a² Var(X) + b² Var(Y)
• Var(X ± Y) = Var(X) + Var(Y)

Example: Let X and Y be two independent random variables such that E(X) = 2, Var(X) = 4, E(Y) = 7, and Var(Y) = 1. Find: 1. E(3X + 7) and Var(3X + 7) 2. E(5X + 2Y − 2) and Var(5X + 2Y − 2).

Solution:
1. E(3X + 7) = 3 E(X) + 7 = 3(2) + 7 = 13; Var(3X + 7) = 3² Var(X) = (9)(4) = 36.
2. E(5X + 2Y − 2) = 5 E(X) + 2 E(Y) − 2 = (5)(2) + (2)(7) − 2 = 22; Var(5X + 2Y − 2) = Var(5X + 2Y) = 5² Var(X) + 2² Var(Y) = (25)(4) + (4)(1) = 104.
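A Monte Carlo sanity check of these rules; the normal distributions below are an arbitrary stand-in, since only the means and variances matter:

```python
# Simulate X and Y with E(X)=2, Var(X)=4, E(Y)=7, Var(Y)=1, then check Z = 5X + 2Y - 2.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.normal(loc=2, scale=2, size=n)  # scale is the standard deviation, sqrt(4)
Y = rng.normal(loc=7, scale=1, size=n)

Z = 5 * X + 2 * Y - 2
print(Z.mean(), Z.var())  # close to 22 and 104
```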

Chebyshev's Theorem: Suppose that X is any random variable with mean E(X) = μ, variance Var(X) = σ², and standard deviation σ. Chebyshev's theorem gives a conservative lower bound on the probability that the random variable X assumes a value within k standard deviations (kσ) of its mean μ:

P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k².

Theorem: Let X be a random variable with mean E(X) = μ and variance Var(X) = σ²; then for k > 1 we have:

P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k².

Example: Let X be a random variable having an unknown distribution with mean μ = 8 and variance σ² = 9 (standard deviation σ = 3). Find the following probabilities: (a) P(−4 < X < 20) (b) P(|X − 8| ≥ 6)

Solution:

(a) P(−4 < X < 20): Matching (−4 < X < 20) with (μ − kσ < X < μ + kσ): −4 = 8 − 3k and 20 = 8 + 3k give k = 4. Therefore, P(−4 < X < 20) ≥ 1 − 1/k² = 1 − 1/16 = 15/16.

(b) P(|X − 8| ≥ 6): Chebyshev's theorem can equivalently be written as P(|X − μ| ≥ kσ) ≤ 1/k². Here kσ = 6 with σ = 3 gives k = 2, so P(|X − 8| ≥ 6) ≤ 1/2² = 1/4.

Another solution for part (b): P(|X − 8| < 6) = P(−6 < X − 8 < 6) = P(−6 + 8 < X < 6 + 8) = P(2 < X < 14). Matching (2 < X < 14) with (μ − kσ < X < μ + kσ): 2 = μ − kσ ⇔ 2 = 8 − 3k ⇔ 3k = 6 ⇔ k = 2. Hence P(2 < X < 14) ≥ 1 − 1/4 = 3/4, and P(|X − 8| ≥ 6) = 1 − P(2 < X < 14) ≤ 1/4.
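Chebyshev's bound holds for any distribution with these moments and is usually loose. As an illustration, compare it against one assumed distribution with mean 8 and variance 9:

```python
# A shifted exponential with mean 8 and variance 9 (rate 1/3, shift 5); any
# distribution with these moments must satisfy P(|X - 8| < 6) >= 3/4.
import numpy as np

rng = np.random.default_rng(0)
X = 5 + rng.exponential(scale=3, size=1_000_000)  # mean 5 + 3 = 8, variance 3^2 = 9

print((np.abs(X - 8) < 6).mean())  # about 0.95, comfortably above the 0.75 bound
```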

Definition: Let X and Y be random variables with joint probability distribution f(x, y) and means μ_X and μ_Y, respectively. The covariance of X and Y is

σ_XY = Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)].

Theorem: The covariance of two random variables X and Y with means μ_X and μ_Y, respectively, is given by

σ_XY = E(XY) − μ_X μ_Y.

Proof: See page 124.

Definition: Let X and Y be random variables with covariance σ_XY and standard deviations σ_X and σ_Y, respectively. The correlation coefficient of X and Y is

ρ_XY = σ_XY / (σ_X σ_Y).

Notes:
• ρ_XY is free of the units of X and Y.
• The correlation coefficient satisfies the inequality −1 ≤ ρ_XY ≤ 1.
• ρ_XY assumes a value of zero when σ_XY = 0.

Example: Find the covariance and the correlation coefficient of X and Y for the following joint distribution:

Solution:

Therefore,

To calculate the correlation coefficient
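Once the joint table is known, both quantities follow mechanically from σ_XY = E(XY) − μ_X μ_Y and ρ_XY = σ_XY/(σ_X σ_Y). A sketch with a placeholder table (not the slide's):

```python
# Covariance and correlation coefficient from a joint pmf table.
from fractions import Fraction as F
from math import sqrt

joint = {(0, 0): F(1, 8), (0, 1): F(3, 8), (1, 0): F(3, 8), (1, 1): F(1, 8)}

EX  = sum(x * p for (x, _), p in joint.items())
EY  = sum(y * p for (_, y), p in joint.items())
EXY = sum(x * y * p for (x, y), p in joint.items())
EX2 = sum(x * x * p for (x, _), p in joint.items())
EY2 = sum(y * y * p for (_, y), p in joint.items())

cov = EXY - EX * EY
rho = cov / sqrt((EX2 - EX**2) * (EY2 - EY**2))
print(cov, rho)  # -1/8 and -0.5 for this placeholder table
```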

Example: The fraction X of male runners and the fraction Y of female runners who compete in marathon races are described by the joint density function

f(x, y) = 8xy for 0 ≤ y ≤ x ≤ 1, and f(x, y) = 0 elsewhere.

Find the covariance and the correlation coefficient of X and Y.

Solution:

E(X) = ∫_0^1 ∫_0^x x(8xy) dy dx = 4/5, E(Y) = ∫_0^1 ∫_0^x y(8xy) dy dx = 8/15, and E(XY) = ∫_0^1 ∫_0^x (xy)(8xy) dy dx = 4/9.

Therefore, σ_XY = E(XY) − μ_X μ_Y = 4/9 − (4/5)(8/15) = 100/225 − 96/225 = 4/225.

To calculate the correlation coefficient:

E(X²) = ∫_0^1 ∫_0^x x²(8xy) dy dx = 2/3 and E(Y²) = ∫_0^1 ∫_0^x y²(8xy) dy dx = 1/3, so σ_X² = 2/3 − (4/5)² = 2/75 and σ_Y² = 1/3 − (8/15)² = 11/225.

Hence ρ_XY = σ_XY / (σ_X σ_Y) = (4/225) / √((2/75)(11/225)) = 4/√66 ≈ 0.49.
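A symbolic verification of these integrals, under the density reconstructed above:

```python
# Covariance and correlation for f(x, y) = 8xy on 0 <= y <= x <= 1.
import sympy as sp

x, y = sp.symbols('x y')
f = 8 * x * y

def E(g):
    # inner integral over y in [0, x], then outer integral over x in [0, 1]
    return sp.integrate(sp.integrate(g * f, (y, 0, x)), (x, 0, 1))

cov = E(x * y) - E(x) * E(y)  # 4/225
rho = cov / sp.sqrt((E(x**2) - E(x)**2) * (E(y**2) - E(y)**2))
print(cov, sp.simplify(rho))  # 4/225 and 2*sqrt(66)/33, i.e. 4/sqrt(66) ≈ 0.49
```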