**Probability & Statistics for Engineers & Scientists, by Walpole, Myers, Myers & Ye ~ Chapter 4 Notes**

Class notes for ISE 201, San Jose State University, Industrial & Systems Engineering Dept. Steve Kennedy.

**Mean of a Set of Observations**

Suppose an experiment involves tossing 2 coins. The result is either 0, 1, or 2 heads. Suppose the experiment is repeated 15 times, and suppose that 0 heads is observed 3 times, 1 head 8 times, and 2 heads 4 times. What is the average number of heads flipped?

$$\bar{x} = \frac{(0)(3) + (1)(8) + (2)(4)}{15} = \frac{16}{15} \approx 1.07$$

This could also be written as a weighted average,

$$\bar{x} = (0)\left(\tfrac{3}{15}\right) + (1)\left(\tfrac{8}{15}\right) + (2)\left(\tfrac{4}{15}\right) \approx 1.07$$

where 3/15, 8/15, etc. are the fractions of times the given number of heads came up. The average is also called the mean.
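The arithmetic above can be sketched directly; the counts are taken from the example, and both forms give the same mean:

```python
# Coin-toss example: heads -> number of times observed (values from the example)
counts = {0: 3, 1: 8, 2: 4}
n = sum(counts.values())  # 15 repetitions

# Plain average of all observations
x_bar = sum(heads * times for heads, times in counts.items()) / n

# Equivalent weighted average using observed fractions
x_bar_weighted = sum(heads * (times / n) for heads, times in counts.items())

print(round(x_bar, 2), round(x_bar_weighted, 2))  # 1.07 1.07
```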

**Mean of a Random Variable**

A similar technique, taking the probability of an outcome times the value of the random variable for that outcome, is used to calculate the mean of a random variable. The mean or expected value of a random variable X with probability distribution $f(x)$ is

$$\mu = E(X) = \sum_{x} x\,f(x) \quad \text{if discrete, or} \quad \mu = E(X) = \int_{-\infty}^{\infty} x\,f(x)\,dx \quad \text{if continuous.}$$
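A minimal sketch of the discrete case, using the pmf for the number of heads in two fair coin tosses (the pmf values are assumed, not from the slides):

```python
# pmf of X = number of heads in two fair tosses (assumed example)
f = {0: 0.25, 1: 0.5, 2: 0.25}

# E(X) = sum over x of x * f(x)
mu = sum(x * p for x, p in f.items())
print(mu)  # 1.0
```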

**Mean of a Random Variable Depending on X**

If X is a random variable with distribution $f(x)$, the mean $\mu_{g(X)}$ of the random variable $g(X)$ is

$$\mu_{g(X)} = E[g(X)] = \sum_{x} g(x)\,f(x) \quad \text{if discrete, or} \quad \mu_{g(X)} = E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx \quad \text{if continuous.}$$
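A quick sketch with $g(x) = x^2$ and an assumed pmf: each value of $g$ is weighted by the probability of the underlying $x$, not of $g(x)$ itself.

```python
# pmf of X (assumed two-coin example) and a function of X
f = {0: 0.25, 1: 0.5, 2: 0.25}
g = lambda x: x * x

# E[g(X)] = sum over x of g(x) * f(x)
mu_g = sum(g(x) * p for x, p in f.items())
print(mu_g)  # 1.5
```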

**Expected Value for a Joint Distribution**

If X and Y are random variables with joint probability distribution $f(x,y)$, the mean or expected value $\mu_{g(X,Y)}$ of the random variable $g(X,Y)$ is

$$\mu_{g(X,Y)} = E[g(X,Y)] = \sum_{x}\sum_{y} g(x,y)\,f(x,y) \quad \text{if discrete, or}$$

$$\mu_{g(X,Y)} = E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\,f(x,y)\,dy\,dx \quad \text{if continuous.}$$

Note that the mean of a distribution is a single value, so it doesn't make sense to talk of the mean of the distribution $f(x,y)$.
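A sketch of the discrete double sum; the joint pmf and the choice $g(x,y) = x + 2y$ are assumed for illustration:

```python
# Assumed joint pmf over (x, y) pairs; probabilities sum to 1
f = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
g = lambda x, y: x + 2 * y

# E[g(X, Y)] = double sum of g(x, y) * f(x, y)
mu_g = sum(g(x, y) * p for (x, y), p in f.items())
print(round(mu_g, 6))  # 1.5
```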

**Variance**

What was the variance of a set of observations?

The variance $\sigma^2$ of a random variable X with distribution $f(x)$ is

$$\sigma^2 = E[(X - \mu)^2] = \sum_{x} (x - \mu)^2 f(x) \quad \text{if discrete, or} \quad \sigma^2 = E[(X - \mu)^2] = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx \quad \text{if continuous.}$$

An equivalent and easier computational formula, also easy to remember, is

$$\sigma^2 = E[X^2] - E[X]^2 = E[X^2] - \mu^2$$

"The expected value of $X^2$ minus the expected value of X... squared." The derivation from the previous formula is simple.
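Both formulas can be checked against each other on an assumed pmf (the two-coin example again):

```python
# pmf of X (assumed two-coin example)
f = {0: 0.25, 1: 0.5, 2: 0.25}
mu = sum(x * p for x, p in f.items())

# Definitional formula: E[(X - mu)^2]
var_def = sum((x - mu) ** 2 * p for x, p in f.items())

# Computational formula: E[X^2] - mu^2
var_short = sum(x * x * p for x, p in f.items()) - mu ** 2

print(var_def, var_short)  # 0.5 0.5
```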

**Variance of a Sample**

There's also a somewhat similar, better computational formula for $s^2$. What is $s^2$? What was the original formula for the variance of a sample? The original formula is

$$s^2 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})^2}{n - 1}$$

and the computational formula is

$$s^2 = \frac{n \sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2}{n(n-1)}$$
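A sketch confirming the two sample-variance formulas agree; the data values are made up for illustration:

```python
# Assumed sample data
data = [2.0, 4.0, 4.0, 5.0, 7.0]
n = len(data)
x_bar = sum(data) / n

# Definitional formula: sum of squared deviations over n - 1
s2_def = sum((x - x_bar) ** 2 for x in data) / (n - 1)

# Computational formula: avoids computing x_bar first
s2_short = (n * sum(x * x for x in data) - sum(data) ** 2) / (n * (n - 1))

print(round(s2_def, 6), round(s2_short, 6))  # 3.3 3.3
```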

**Covariance**

If X and Y are random variables with joint probability distribution $f(x,y)$, the covariance $\sigma_{XY}$ of X and Y is defined as

$$\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)]$$

The better computational formula for covariance is

$$\sigma_{XY} = E(XY) - \mu_X \mu_Y$$

Note that although the standard deviation can't be negative, the covariance $\sigma_{XY}$ can be negative. Covariance will be useful later when looking at the linear relationship between two random variables.
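A sketch using the computational formula on an assumed joint pmf over $x, y \in \{0, 1\}$; large probability on the diagonal cells makes the covariance positive:

```python
# Assumed joint pmf; mass concentrated on (0,0) and (1,1)
f = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

mu_x = sum(x * p for (x, y), p in f.items())
mu_y = sum(y * p for (x, y), p in f.items())
e_xy = sum(x * y * p for (x, y), p in f.items())

# Computational formula: E(XY) - mu_X * mu_Y
cov = e_xy - mu_x * mu_y
print(round(cov, 2))  # 0.15
```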

**Correlation Coefficient**

If X and Y are random variables with covariance $\sigma_{XY}$ and standard deviations $\sigma_X$ and $\sigma_Y$ respectively, the correlation coefficient $\rho_{XY}$ is defined as

$$\rho_{XY} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}$$

Correlation coefficient notes: What are the units of $\rho_{XY}$? (It is dimensionless.) What is the possible range of $\rho_{XY}$? ($-1 \le \rho_{XY} \le 1$.) What is the meaning of the correlation coefficient? If $\rho_{XY} = 1$ or $-1$, then there is an exact linear relationship between Y and X (i.e., $Y = a + bX$). If $\rho_{XY} = 1$, then $b > 0$, and if $\rho_{XY} = -1$, then $b < 0$. We can show this by calculating the correlation coefficient of X and $a + bX$, which simplifies to $b / \sqrt{b^2} = \pm 1$.
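This exact-linear-relationship claim can be checked numerically; the pmf and the constants $a = 3$, $b = -2$ are assumed, and since $b < 0$ the result should be $\rho = -1$:

```python
import math

# Assumed pmf of X and an exact linear relationship Y = a + bX with b < 0
f = {0: 0.25, 1: 0.5, 2: 0.25}
a, b = 3, -2

mu_x = sum(x * p for x, p in f.items())
mu_y = a + b * mu_x  # E(a + bX) = a + b*E(X)

cov = sum((x - mu_x) * ((a + b * x) - mu_y) * p for x, p in f.items())
sd_x = math.sqrt(sum((x - mu_x) ** 2 * p for x, p in f.items()))
sd_y = abs(b) * sd_x  # sigma of a + bX is |b| * sigma_X

rho = cov / (sd_x * sd_y)
print(round(rho, 6))  # -1.0
```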

**Linear Combinations of Random Variables**

If a and b are constants, $E(aX + b) = aE(X) + b$. This also holds if $a = 0$ or $b = 0$. If we add (or subtract) two functions, $E[g(X) \pm h(X)] = E[g(X)] \pm E[h(X)]$. This is also true for functions of two or more random variables. That is, $E[g(X,Y) \pm h(X,Y)] = E[g(X,Y)] \pm E[h(X,Y)]$.
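The first identity is easy to verify directly; the pmf and the constants $a = 3$, $b = 2$ are assumed:

```python
# Assumed pmf of X and arbitrary constants
f = {0: 0.25, 1: 0.5, 2: 0.25}
a, b = 3, 2

# E(aX + b) computed directly vs. via a*E(X) + b
lhs = sum((a * x + b) * p for x, p in f.items())
rhs = a * sum(x * p for x, p in f.items()) + b
print(lhs, rhs)  # 5.0 5.0
```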

**Functions of Two or More Random Variables**

The expected value of the sum (or difference) of two random variables is equal to the sum (or difference) of the expected values: $E(X \pm Y) = E(X) \pm E(Y)$. The expected value of the product of two *independent* random variables is equal to the product of the expected values: $E(XY) = E(X)\,E(Y)$.
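Both facts can be checked on two small assumed pmfs, building the joint distribution as $f(x,y) = f_X(x)\,f_Y(y)$ so that X and Y are independent by construction:

```python
# Assumed marginal pmfs; independence: joint f(x,y) = fx[x] * fy[y]
fx = {0: 0.5, 1: 0.5}
fy = {1: 0.3, 2: 0.7}

ex = sum(x * p for x, p in fx.items())
ey = sum(y * p for y, p in fy.items())

# E(X + Y) and E(XY) computed from the joint distribution
e_sum = sum((x + y) * px * py for x, px in fx.items() for y, py in fy.items())
e_prod = sum(x * y * px * py for x, px in fx.items() for y, py in fy.items())

print(round(e_sum, 6), round(ex + ey, 6))   # 2.2 2.2
print(round(e_prod, 6), round(ex * ey, 6))  # 0.85 0.85
```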

**Variance Relationships**

For a random variable X with variance $\sigma^2$,

$$\sigma^2_{aX+b} = a^2 \sigma^2_X$$

So adding a constant does what? (It leaves the variance unchanged.) And multiplying by a constant does what? (It multiplies the variance by the square of the constant.) For two random variables X and Y,

$$\sigma^2_{aX+bY} = a^2 \sigma^2_X + b^2 \sigma^2_Y + 2ab\,\sigma_{XY}$$

What if X and Y are independent? Then $\sigma_{XY} = 0$. Note that the correlation coefficient is also 0.
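The two-variable identity can be verified on an assumed joint pmf with arbitrary constants $a = 2$, $b = 3$:

```python
# Assumed joint pmf and arbitrary constants
f = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
a, b = 2, 3

def e(g):
    """Expected value of g(x, y) under the joint pmf f."""
    return sum(g(x, y) * p for (x, y), p in f.items())

mu_x, mu_y = e(lambda x, y: x), e(lambda x, y: y)
var_x = e(lambda x, y: (x - mu_x) ** 2)
var_y = e(lambda x, y: (y - mu_y) ** 2)
cov = e(lambda x, y: (x - mu_x) * (y - mu_y))

# Var(aX + bY) directly vs. a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
lhs = e(lambda x, y: (a * x + b * y - (a * mu_x + b * mu_y)) ** 2)
rhs = a * a * var_x + b * b * var_y + 2 * a * b * cov
print(round(lhs, 6), round(rhs, 6))  # 5.05 5.05
```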

**Chebyshev's Inequality**

The probability that any random variable X will assume a value within k standard deviations of the mean is at least $1 - 1/k^2$. That is,

$$P(\mu - k\sigma < X < \mu + k\sigma) \ge 1 - \frac{1}{k^2}$$

This theorem is both very general and very weak. Very general, since it holds for any probability distribution. Very weak for the same reason: because it is a worst-case limit that holds for any distribution. If we know the distribution, we can get a tighter bound than this (how?), so Chebyshev's inequality is used mainly when the distribution is unknown. Care must be taken, however, not to assume an underlying distribution when the distribution is really unknown.
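An empirical sketch of how loose the bound is: for an assumed Exponential(1) distribution (where $\mu = \sigma = 1$), the true probability of landing within $k = 2$ standard deviations of the mean is about 0.95, well above Chebyshev's guarantee of 0.75.

```python
import random

random.seed(0)
k = 2
n = 100_000
mu = sigma = 1.0  # mean and std. dev. of Exponential(rate = 1)

# Simulate and count the fraction within (mu - k*sigma, mu + k*sigma)
xs = [random.expovariate(1.0) for _ in range(n)]
within = sum(mu - k * sigma < x < mu + k * sigma for x in xs) / n

print(within, ">=", 1 - 1 / k**2)  # observed fraction comfortably beats 0.75
```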
