
1 Probability Distributions: Finite Random Variables

2 Random Variables
A random variable is a numerical value associated with the outcome of an experiment. Examples:
- The number of heads that appear when flipping three coins
- The sum obtained when two fair dice are rolled
In both examples, we are not as interested in the raw outcomes of the experiment as we are in a numerical value describing the experiment.

3 Example—Flipping a Coin 3 Times
Suppose that we flip a coin 3 times and record each flip.
S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}
Let X be a random variable that records the number of heads we get when flipping a coin 3 times.

4 Example (cont)
The possible values our random variable X can assume: X = x where x ∈ {0, 1, 2, 3}.
Notice that the values of X are countable (we can list all possible values) and are whole numbers.
When a random variable can assume only a finite list of values, it is called a finite random variable. And when the values are whole numbers, the random variable is discrete.

5 Probabilities
Just as with events, we can also talk about probabilities of random variables. The probability that a random variable assumes a certain value is written P(X = x).
Notice that X is the random variable for the number of heads and x is the value the variable assumes.

6 Probabilities
We can list all the probabilities for our random variable in a table. The pattern of probabilities for a random variable is called its probability distribution. For a finite discrete random variable, this pattern of probabilities is called the probability mass function (p.m.f.).
X = x:      0    1    2    3
P(X = x):  1/8  3/8  3/8  1/8
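This table can be double-checked by brute-force enumeration of the sample space. A minimal Python sketch (names like `pmf` are mine, not from the slides):

```python
from itertools import product
from fractions import Fraction

# Enumerate all 2^3 equally likely outcomes of three fair-coin flips.
outcomes = list(product("HT", repeat=3))

# X = number of heads; tally P(X = x) for each value x.
pmf = {}
for outcome in outcomes:
    x = outcome.count("H")
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, len(outcomes))

for x in sorted(pmf):
    print(x, pmf[x])  # 0 1/8, 1 3/8, 2 3/8, 3 1/8
```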

7 Probability Mass Function
We consider this table to be a function because each value of the random variable has exactly one probability associated with it. Because of this we use function notation: f_X(x) = P(X = x).
X = x:      0    1    2    3
P(X = x):  1/8  3/8  3/8  1/8

8 Properties of Probability Mass Function
Because the p.m.f. is a function, it has a domain and range like any other function you've seen:
Domain: all whole-number values the random variable can assume
Range: 0 ≤ f_X(x) ≤ 1
Sum: Σ_x f_X(x) = 1
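Both properties are mechanical to verify; a small sketch against the coin example's p.m.f.:

```python
from fractions import Fraction

# The p.m.f. from the coin example, as a dictionary x -> P(X = x).
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

# Range: every probability lies between 0 and 1.
assert all(0 <= p <= 1 for p in pmf.values())
# Sum: the probabilities add up to exactly 1.
assert sum(pmf.values()) == 1
```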

9 Representing the p.m.f.
Because the p.m.f. uses only whole-number values in its domain, we often use a histogram to show the distribution of probabilities pictorially. Here is a histogram for our coin example: bars of height 1/8, 3/8, 3/8, 1/8 over x = 0, 1, 2, 3.

10 Things to notice:
The height of each rectangle corresponds to P(X = x).
The sum of all the heights equals 1.

11 Cumulative Distribution Function
The same probability information is often given in a different form, called the cumulative distribution function (c.d.f.). Like the p.m.f., the c.d.f. is a function, which we denote F_X(x) (upper-case F), with the following properties:
Domain: the set of all real numbers
Range: 0 ≤ F_X(x) ≤ 1
F_X(x) = P(X ≤ x)
As x → ∞, F_X(x) → 1, and as x → −∞, F_X(x) → 0

12 Graphing the c.d.f.
Let's graph the c.d.f. for our coin example. According to our definitions from the previous slide:
Domain: the set of all real numbers
Range: 0 ≤ F_X(x) ≤ 1
F_X(x) = P(X ≤ x)

13 Graphing (cont)
Here's the p.m.f.:
X = x:      0    1    2    3
P(X = x):  1/8  3/8  3/8  1/8
Because the domain of the c.d.f. is the set of all real numbers, any value of x less than zero gives F_X(x) = 0, since there is no way for a flip of three coins to produce fewer than 0 heads. The probability is zero! Also, because the number of heads we can get is always at most 3, F_X(x) = 1 when x ≥ 3.

14 Graphing (cont)
Now we need to look at what happens for the other values.
If 0 ≤ x < 1, then F_X(x) = P(X ≤ x) = P(X = 0) = 1/8
If 1 ≤ x < 2, then F_X(x) = P(X ≤ x) = P(X = 0) + P(X = 1) = 1/8 + 3/8 = 4/8
If 2 ≤ x < 3, then F_X(x) = P(X ≤ x) = P(X = 0) + P(X = 1) + P(X = 2) = 1/8 + 3/8 + 3/8 = 7/8

15 The c.d.f.
All of the previous information is best summarized with a piecewise function:
F_X(x) = 0    for x < 0
F_X(x) = 1/8  for 0 ≤ x < 1
F_X(x) = 4/8  for 1 ≤ x < 2
F_X(x) = 7/8  for 2 ≤ x < 3
F_X(x) = 1    for x ≥ 3
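A quick way to check this piecewise form is to compute F_X directly from the p.m.f.; a sketch (the helper name `cdf` is mine):

```python
from fractions import Fraction

pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def cdf(x):
    """F_X(x) = P(X <= x): add up the p.m.f. over all values <= x."""
    return sum((p for value, p in pmf.items() if value <= x), Fraction(0))

# Agrees with the piecewise function above, including non-integer inputs.
for x in (-1, 0, 0.5, 1, 2.9, 3, 10):
    print(x, cdf(x))
```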

16 The graph of the c.d.f.
[Step graph: F_X(x) is 0 for x < 0, then jumps to 1/8, 4/8, 7/8, and 1 at x = 0, 1, 2, 3.]

17 Things to notice
The graph is a step function. This is typically what you will see for finite discrete random variables.
Domain: the set of all real numbers
Range: 0 ≤ F_X(x) ≤ 1
F_X(x) = P(X ≤ x)
As x → ∞, F_X(x) → 1, and as x → −∞, F_X(x) → 0
At each x-value where there is a jump, the size of the jump tells us P(X = x). Because of this, we can write a p.m.f. from a c.d.f. and vice versa.
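The last point, reading the p.m.f. off the jump sizes, can be sketched in code (assuming the jump locations are known):

```python
from fractions import Fraction

# F_X evaluated at each jump point of the coin example's c.d.f.
F = {0: Fraction(1, 8), 1: Fraction(4, 8), 2: Fraction(7, 8), 3: Fraction(1)}

# P(X = x) is the size of the jump in F_X at x; F_X -> 0 as x -> -infinity.
pmf = {}
prev = Fraction(0)
for x in sorted(F):
    pmf[x] = F[x] - prev
    prev = F[x]

print(pmf)  # jumps of 1/8, 3/8, 3/8, 1/8 recover the p.m.f.
```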

18 Expected Value of a Finite Discrete Random Variable
The expected value of a discrete random variable is E(X) = Σ_x x · P(X = x).
Note that this is the sum, over the rectangles of the p.m.f. histogram, of each height multiplied by the corresponding value of X.
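For the fair-coin example this works out to E(X) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 12/8 = 1.5; a one-line sketch:

```python
from fractions import Fraction

pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

# E(X) = sum over x of x * P(X = x)
expected_value = sum(x * p for x, p in pmf.items())
print(expected_value)  # 3/2, i.e. 1.5 heads on average
```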

19 Example
A box contains four $1 chips, three $5 chips, two $25 chips, and one $100 chip. Let X be the denomination of a chip selected at random. The p.m.f. of X is displayed below.
x:        $1.00  $5.00  $25.00  $100.00
f_X(x):    0.40   0.30    0.20     0.10

20 Questions
What is P(X = 25)?
What is P(X ≤ 25)?
What is P(X ≥ 5)?
Graph the c.d.f.
What is E(X)?

21 Answers
P(X = 25) = 0.20
P(X ≤ 25) = 0.40 + 0.30 + 0.20 = 0.90
P(X ≥ 5) = 0.30 + 0.20 + 0.10 = 0.60
E(X) = $16.90 (computed on slide 23)

22 Answer (c.d.f.)
F_X(x) = 0     for x < 1
F_X(x) = 0.40  for 1 ≤ x < 5
F_X(x) = 0.70  for 5 ≤ x < 25
F_X(x) = 0.90  for 25 ≤ x < 100
F_X(x) = 1     for x ≥ 100

23 Expected Value
x       P(X = x)   x · P(X = x)
$1        0.4         $0.40
$5        0.3         $1.50
$25       0.2         $5.00
$100      0.1        $10.00
Sum       1.0        $16.90
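All of the answers above can be reproduced in a few lines; a sketch:

```python
# The chip example's p.m.f.: denomination -> probability.
pmf = {1: 0.40, 5: 0.30, 25: 0.20, 100: 0.10}

print(pmf[25])                                    # P(X = 25)  -> 0.20
print(sum(p for x, p in pmf.items() if x <= 25))  # P(X <= 25) -> 0.90
print(sum(p for x, p in pmf.items() if x >= 5))   # P(X >= 5)  -> 0.60
print(sum(x * p for x, p in pmf.items()))         # E(X) -> 16.90 (up to float rounding)
```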

24 Bernoulli Random Variables
Bernoulli random variables are a special case of discrete random variables. In a Bernoulli trial there are only two outcomes: success or failure.

25 Bernoulli Random Variable
Let X stand for the number of successes in n Bernoulli trials; X is called a binomial random variable.
The binomial setting:
1. You have n repeated trials of an experiment.
2. On a single trial, there are only two possible outcomes, success or failure.
3. The probability of success is the same from trial to trial.
4. The outcome of each trial is independent of the others.
The expected value of a binomial random variable is E(X) = np, where p is the probability of success. (A sketch follows below.)
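A sketch of the binomial p.m.f. built from the setting above: there are C(n, k) ways to place k successes among n trials, and each arrangement has probability p^k(1−p)^(n−k) by independence (the helper name is mine):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 3, 0.5  # e.g., three fair-coin flips
print([binomial_pmf(k, n, p) for k in range(n + 1)])         # [0.125, 0.375, 0.375, 0.125]
print(sum(k * binomial_pmf(k, n, p) for k in range(n + 1)))  # 1.5, which equals n * p
```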

26 Loaded Coin
Suppose you have a coin that is biased towards heads: on any given flip you get heads about 60% of the time and tails 40% of the time. Let X be the random variable for the number of heads obtained in flipping the coin three times.

27 Loaded Coin
S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}
A "success" in this experiment will be the occurrence of a head and a "failure" will be getting a tail. Because getting a head is now more likely, we need to look at the probability of each outcome in our sample space.

28 Loaded Coin
For example: what is the probability of getting HHH? Because each trial (flip) is independent, we can say that P(HHH) = (0.60)(0.60)(0.60) = 0.216. Similarly, we can compute the probabilities of the other outcomes in our experiment.

29 Loaded Coin
Outcome   Probability
HHH       (0.60)(0.60)(0.60) = 0.216
HHT       (0.60)(0.60)(0.40) = 0.144
HTH       (0.60)(0.40)(0.60) = 0.144
HTT       (0.60)(0.40)(0.40) = 0.096
THH       (0.40)(0.60)(0.60) = 0.144
THT       (0.40)(0.60)(0.40) = 0.096
TTH       (0.40)(0.40)(0.60) = 0.096
TTT       (0.40)(0.40)(0.40) = 0.064

30 Loaded Coin: p.m.f.
X = x:      0      1      2      3
P(X = x):  0.064  0.288  0.432  0.216
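This p.m.f. can be reproduced by weighting each outcome from the table on slide 29 and tallying by the number of heads; a sketch:

```python
from itertools import product

p = 0.6  # P(heads) on a single flip of the loaded coin

# Weight each of the 8 outcomes by its probability, then tally by # of heads.
pmf = {x: 0.0 for x in range(4)}
for outcome in product("HT", repeat=3):
    prob = 1.0
    for flip in outcome:
        prob *= p if flip == "H" else 1 - p
    pmf[outcome.count("H")] += prob

print(pmf)  # {0: 0.064, 1: 0.288, 2: 0.432, 3: 0.216} (up to float rounding)
```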

31 Loaded Coin: Graph of the p.m.f.
[Histogram with bars of height 0.064, 0.288, 0.432, 0.216 over x = 0, 1, 2, 3.]

32 Your Turn!
Graph the c.d.f. of our biased-coin example.
Excel: BINOMDIST
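If you'd rather check your graph in code than in Excel, scipy's binom distribution gives the same cumulative values as BINOMDIST with its last argument set to TRUE (a sketch, assuming scipy is available):

```python
# A Python analogue of Excel's BINOMDIST(x, 3, 0.6, TRUE).
from scipy.stats import binom

n, p = 3, 0.6
for x in range(n + 1):
    print(x, binom.cdf(x, n, p))  # F_X(x) = P(X <= x)
# Expected output (up to float rounding): 0.064, 0.352, 0.784, 1.0
```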

