Random Variables

1 Random Variables
A random variable is simply a real-valued function defined on the sample space of an experiment.
Example. Three fair coins are flipped. The number of heads, Y, that appear is a random variable. Let us list the sample space, S.

Sample Point   No. of Heads, Y   Probability
(H,H,H)        3                 1/8
(H,H,T)        2                 1/8
(H,T,H)        2                 1/8
(T,H,H)        2                 1/8
(H,T,T)        1                 1/8
(T,H,T)        1                 1/8
(T,T,H)        1                 1/8
(T,T,T)        0                 1/8

2 Example, continued.
P{Y = 0} = P({s | Y(s) = 0}) = P{(T,T,T)} = 1/8
P{Y = 1} = P({s | Y(s) = 1}) = P{(H,T,T),(T,H,T),(T,T,H)} = 3/8
P{Y = 2} = P({s | Y(s) = 2}) = P{(H,H,T),(H,T,H),(T,H,H)} = 3/8
P{Y = 3} = P({s | Y(s) = 3}) = P{(H,H,H)} = 1/8
Since Y must take on one of the values 0, 1, 2, 3, we must have P{Y = 0} + P{Y = 1} + P{Y = 2} + P{Y = 3} = 1/8 + 3/8 + 3/8 + 1/8 = 1, and this agrees with the probabilities listed above.
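The enumeration on this slide can be checked with a short sketch (Python is assumed here; the variable names are illustrative):

```python
from itertools import product
from collections import Counter

# Enumerate all 8 equally likely outcomes of three fair coin flips
outcomes = ["".join(o) for o in product("HT", repeat=3)]

# Y(s) = number of heads in outcome s
counts = Counter(s.count("H") for s in outcomes)

# P{Y = k} = (number of outcomes with k heads) / 8
pmf = {k: counts[k] / 8 for k in sorted(counts)}
print(pmf)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```

The probabilities sum to 1, as the slide notes they must.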

3 Cumulative distribution function of a random variable
For a random variable X, the function F defined by F(t) = P{X ≤ t}, −∞ < t < ∞, is called the cumulative distribution function, or simply, the distribution function. Clearly, F is a nondecreasing function of t. All probability questions about X can be answered in terms of the cumulative distribution function F. For example, P{a < X ≤ b} = F(b) − F(a) for a < b.

4 Proof of P{a < X ≤ b} = F(b) − F(a)
For sets A and B, where B ⊂ A, P(A − B) = P(A) − P(B). Let A = {s | X(s) ≤ b} and B = {s | X(s) ≤ a}, with a < b. Then A − B = {s | a < X(s) ≤ b}, so
P{a < X ≤ b} = P(A − B) = P(A) − P(B) = F(b) − F(a).

5 Properties of the cumulative distribution function
For a random variable X, the cumulative distribution function (c.d.f.) F was defined by F(t) = P{X ≤ t}.
1. F is nondecreasing.
2. lim F(t) = 1 as t → ∞.
3. lim F(t) = 0 as t → −∞.
4. F is right continuous.
The previous properties of F imply that P{X < b} = F(b−), the left-hand limit of F at b, and hence P{X = b} = F(b) − F(b−).

6 Example of a distribution function
Suppose that a bus arrives at a station every day between 10am and 10:30am, at random. Let X be the arrival time, measured in minutes after 10am, so that X is uniformly distributed on [0, 30]. Therefore, the distribution function is
F(t) = 0 for t < 0,  F(t) = t/30 for 0 ≤ t < 30,  F(t) = 1 for t ≥ 30.
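A minimal sketch of this distribution function (assuming, as above, that time is measured in minutes after 10am):

```python
def bus_cdf(t: float) -> float:
    """CDF of the bus arrival time X, uniform on [0, 30] minutes after 10am:
    F(t) = 0 for t < 0, t/30 on [0, 30), and 1 for t >= 30."""
    if t < 0:
        return 0.0
    if t < 30:
        return t / 30
    return 1.0

# P{arrival between 10:10 and 10:20} = F(20) - F(10) = 1/3
print(bus_cdf(20) - bus_cdf(10))
```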

7 Discrete vs. Continuous Random Variables If a set is in one-to-one correspondence with the positive integers, the set is said to be countable. If the number of values taken on by a random variable is either finite or countable, then the random variable is said to be discrete. The number of heads which appear in 3 flips of a coin is a discrete random variable. If the set of values of a random variable is neither finite nor countable, we say the random variable is continuous. The random variable defined as the time that a bus arrives at a station is an example of a continuous random variable. In Chapter 5, the random variables are discrete, while in Chapter 6, they are continuous.

8 Probability Mass Function
For a discrete random variable X, we define the probability mass function p(a) of X by p(a) = P{X = a}. If X is a discrete random variable taking the values x_1, x_2, …, then p(x_1) + p(x_2) + … = 1.
Example. For our coin flipping example, the plot of p(x_i) vs. x_i shows bars of height p(0) = 0.125, p(1) = 0.375, p(2) = 0.375, p(3) = 0.125.

9 Example of a probability mass function on a countable set
Suppose X is a random variable taking values in the positive integers. We define p(i) = (1/2)^i for i = 1, 2, 3, … Since the p(i) sum to 1, this defines a probability mass function. P(X is odd) = sum of heights of red bars = 1/2 + (1/2)^3 + (1/2)^5 + … = (1/2)/(1 − 1/4) = 2/3, and P(X is even) = sum of heights of blue bars = 1/3.
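The geometric sums on this slide can be checked numerically (truncating the series, since the tail beyond i = 200 is negligible):

```python
# Partial sums of p(i) = (1/2)**i over odd and even i
p_odd = sum(0.5**i for i in range(1, 200, 2))
p_even = sum(0.5**i for i in range(2, 200, 2))
print(p_odd, p_even)  # approximately 2/3 and 1/3
```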

10 Cumulative distribution function of a discrete random variable
The distribution function of a discrete random variable can be expressed as F(a) = Σ p(x) over all values x ≤ a, where p is the probability mass function. If X is a discrete random variable whose possible values are x_1, x_2, x_3, …, where x_1 < x_2 < x_3 < …, then its distribution function is a step function. That is, F is constant on the intervals [x_{i−1}, x_i) and then takes a step (or jump) of size p(x_i) at x_i. (See next slide for an example.)
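This sum can be sketched directly, using the coin-flip pmf from the earlier slides:

```python
# Step-function CDF built from the pmf of Y, the number of heads in 3 flips
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

def cdf(a: float) -> float:
    """F(a) = sum of p(x) over all values x <= a."""
    return sum(p for x, p in pmf.items() if x <= a)

print(cdf(1.5))  # p(0) + p(1) = 0.5; F is flat between the jumps at 1 and 2
```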

11 Random variable Y, number of heads, when 3 coins are tossed
(Figures: the probability mass function and the cumulative distribution function of Y.)

12 Random variable with both discrete and continuous features
Define random variable X as follows: (1) Flip a fair coin. (2) If the coin is H, define X to be a randomly selected value from the interval [0, 1/2]. (3) If the coin is T, define X to be 1. The cdf for X is derived next.
For t < 0, P(X ≤ t) = 0 follows easily.
For 0 ≤ t ≤ 1/2, P(X ≤ t) = P(X ≤ t | coin is H)·P(coin is H) = (2t)·(1/2) = t.
For 1/2 ≤ t < 1, P(X ≤ t) = P(X ≤ 1/2) = 1/2.
For t ≥ 1, P(X ≤ t) = P(X ≤ 1/2) + P(X = 1) = 1/2 + 1/2 = 1.

13 CDF for random variable X from previous slide
Let the cdf for X be F. Then
F(t) = 0 for t < 0,  F(t) = t for 0 ≤ t < 1/2,  F(t) = 1/2 for 1/2 ≤ t < 1,  F(t) = 1 for t ≥ 1.
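A quick Monte Carlo check of this mixed cdf (the sample size and seed are arbitrary choices):

```python
import random

def sample_x() -> float:
    """One draw of X: uniform on [0, 1/2] if the coin is H, else exactly 1."""
    if random.random() < 0.5:           # coin is H
        return random.uniform(0.0, 0.5)
    return 1.0                          # coin is T

random.seed(0)
draws = [sample_x() for _ in range(100_000)]
# Empirical F(0.25), which should be near 0.25,
# and the size of the jump at 1, which should be near P(X = 1) = 1/2
print(sum(d <= 0.25 for d in draws) / len(draws))
print(sum(d == 1.0 for d in draws) / len(draws))
```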

14 Expected value of a discrete random variable
For a discrete random variable X having probability mass function p(x), the expectation or expected value of X, denoted by E(X), is defined by E(X) = Σ x·p(x), the sum running over the possible values x of X. We see that the expected value of X is a weighted average of the possible values that X can take on, each value being weighted by the probability that X assumes it. The expectation of random variable X is also called the mean of X, and the notation µ = E(X) is used.
Example. A single fair die is thrown. What is the expectation of the number of dots showing on the top face of the die? Let X be the number of dots on the top face. Then E(X) = (1 + 2 + 3 + 4 + 5 + 6)·(1/6) = 21/6 = 7/2.
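The die calculation, done exactly with rational arithmetic:

```python
from fractions import Fraction

# E(X) for a fair die: each face k = 1..6 has probability 1/6
expectation = sum(Fraction(k, 6) for k in range(1, 7))
print(expectation)  # 7/2
```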

15 Intuitive idea of expectation of a discrete random variable
The expected value of a random variable is the average value that the random variable takes on. If for some game, E(X) = 0, then the game is called fair.
For random variable X, if half the time X = 0 and the other half of the time X = 10, then the average value of X is E(X) = 5.
For random variable Y, if one-third of the time Y = 6 and two-thirds of the time Y = 15, then the average value of Y is E(Y) = 12.
Let Z be the amount you win in a lottery. If you win a million dollars with probability 10^(−6) and it costs you $2 for a ticket, your expected winnings are E(Z) = 999998·10^(−6) + (−2)·(1 − 10^(−6)) = −1 dollar.

16 Pascal’s Wager—First Use of Expectation to Make a Decision
Suppose we are unsure of God’s existence, so we assign a probability of 1/2 to existence and 1/2 to nonexistence. Let X be the benefit derived from leading a pious life. X is infinite (eternal happiness) if God exists; however, we lose a finite amount (d) of time and treasure devoted to serving God if He doesn’t exist.
E(X) = (1/2)·∞ + (1/2)·(−d) = ∞
Thus, the expected return on piety is positive infinity. Therefore, says Pascal, every reasonable person should follow the laws of God.

17 Expectation of a function of a discrete random variable
Theorem. If X is a discrete random variable that takes on one of the values x_i, i ≥ 1, with respective probabilities p(x_i), then for any real-valued function g, E[g(X)] = Σ g(x_i)·p(x_i).
Corollary. For real numbers a and b, E(aX + b) = a·E(X) + b.
Example. Let X be a random variable which takes the values −1, 0, 1 with probabilities 0.2, 0.5, and 0.3, respectively. Let g(x) = x². We have that g(X) is a random variable which takes on the values 0 and 1 with equal probability. Hence, E(X²) = 0·(0.5) + 1·(0.5) = 0.5. Note that E(X²) ≠ (E(X))², since E(X) = 0.1 and so (E(X))² = 0.01.
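The theorem applied to this example, computing E[g(X)] directly from the pmf of X without finding the distribution of g(X):

```python
# Law of the unconscious statistician: E[g(X)] = sum of g(x_i) * p(x_i),
# for X taking -1, 0, 1 with probabilities 0.2, 0.5, 0.3 and g(x) = x**2
pmf = {-1: 0.2, 0: 0.5, 1: 0.3}
e_g = sum(x**2 * p for x, p in pmf.items())
e_x = sum(x * p for x, p in pmf.items())
print(e_g)  # E(X**2) = 0.5, while (E(X))**2 is about 0.01
```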

18 Law of Unconscious Statistician (Theorem from previous slide)
Example. Let Y = g(X) = 7X − X², and let X be the outcome for a fair die. Then
E(Y) = Σ (7x − x²)·(1/6), x = 1, …, 6, = (6 + 10 + 12 + 12 + 10 + 6)/6 = 56/6 = 28/3.
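The same calculation in exact arithmetic:

```python
from fractions import Fraction

# E(7X - X**2) for a fair die, via the law of the unconscious statistician
e_y = sum(Fraction(7*x - x**2, 6) for x in range(1, 7))
print(e_y)  # 28/3
```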

19 Determining Insurance Premiums
Suppose a 36-year-old man wants to buy $50,000 worth of term life insurance for a 20-year term. Let p_36 be the probability that this man survives 20 more years. For simplicity, assume the man pays premiums for all 20 years. If the yearly premium is C/20, where C is the total of the premiums the man pays, how should the insurance company choose C? Let the income to the insurance company be X. We have X = C with probability p_36 (he survives) and X = C − 50000 with probability 1 − p_36 (the company pays out), so
E(X) = C·p_36 + (C − 50000)·(1 − p_36) = C − 50000·(1 − p_36).
For the company to make money, E(X) > 0, that is, C > 50000·(1 − p_36).
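A small sketch of this pricing rule; the survival probability used below is purely illustrative, not a figure from the source:

```python
def break_even_premium(payout: float, p_survive: float) -> float:
    """Smallest total premium C with E(X) = C - payout * (1 - p_survive) > 0."""
    return payout * (1 - p_survive)

# Illustrative: if p_36 were 0.95, the company would need C > $2,500 in total,
# i.e., a yearly premium above $125
c_min = break_even_premium(50_000, 0.95)
print(c_min, c_min / 20)
```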

20 Variance and standard deviation of a discrete random variable
The variance of a discrete random variable X, denoted by Var(X), is defined by Var(X) = E[(X − µ)²], where µ = E(X). The variance is a measure of the spread of the possible values of X. The quantity σ = √Var(X) is called the standard deviation of X.
Example. Suppose X has value k, k > 0, with probability 0.5 and value −k with probability 0.5. Then E(X) = 0 and Var(X) = E(X²) = k². Also, the standard deviation of X is k.

21 Keno versus Bolita Let B and K be the amount that you win in one play of Bolita and Keno, respectively. (See Example 4.26 in the textbook.) E(B) = –0.25 and E(K) = –0.25 In the long run, your losses are the same with the two games. Var(B) = 55.69 and Var(K) = 1.6875 Based on these variances, we conclude that the risk with Keno is far less than the risk with Bolita.

22 More about variance and standard deviation
Theorem. Var(X) = E(X²) − (E(X))².
Theorem. For constants a and b, Var(aX + b) = a²·Var(X).
Problem. If E(X) = 2 and E(X²) = 13, find the variance of −4X + 12.
Solution. Var(X) = E(X²) − (E(X))² = 13 − 4 = 9, so Var(−4X + 12) = (−4)²·Var(X) = 16·9 = 144.
Definition. E(Xⁿ) is the nth moment of X.
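The scaling theorem Var(aX + b) = a²·Var(X) can be checked empirically on any finite distribution; here the six equally likely die faces serve as an example:

```python
import statistics

# Population variance, treating the list as the entire (uniform) distribution
xs = [1, 2, 3, 4, 5, 6]
a, b = -4, 12
var_x = statistics.pvariance(xs)
var_axb = statistics.pvariance([a * x + b for x in xs])
print(var_x, var_axb)  # the second is a**2 = 16 times the first
```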

23 Standardized Random Variables
Let X be a random variable with mean µ and standard deviation σ. The random variable X* = (X − µ)/σ is called the standardized X. It follows directly that E(X*) = 0 and Var(X*) = 1. Standardization is particularly useful if two or more random variables with different distributions must be compared. Example. By using standardization, we can compare the home run records of Babe Ruth and Barry Bonds.
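A sketch of standardization on a small data set (the numbers below are illustrative, not the actual home run totals discussed on the slide):

```python
import statistics

# Standardize: z = (x - mu) / sigma, which yields mean 0 and (population) stdev 1
xs = [46, 41, 34, 22, 60, 54]
mu = statistics.mean(xs)
sigma = statistics.pstdev(xs)
zs = [(x - mu) / sigma for x in xs]
print(round(statistics.mean(zs), 10), round(statistics.pstdev(zs), 10))
```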
