Section 2.5 Important definition in the text: The definition of the moment generating function (m.g.f.) Definition 2.5-1 If S is the space for a random variable X with p.m.f. f(x), then the moment generating function of X is

M(t) = Σ_{x∈S} e^{tx} f(x), and M(0) = Σ_{x∈S} f(x) = 1;
M'(t) = Σ_{x∈S} x e^{tx} f(x), and M'(0) = Σ_{x∈S} x f(x) = E(X);
M''(t) = Σ_{x∈S} x^2 e^{tx} f(x), and M''(0) = Σ_{x∈S} x^2 f(x) = E(X^2);
M'''(t) = Σ_{x∈S} x^3 e^{tx} f(x), and M'''(0) = Σ_{x∈S} x^3 f(x) = E(X^3);

and in general, M^{(n)}(t) = Σ_{x∈S} x^n e^{tx} f(x), and M^{(n)}(0) = Σ_{x∈S} x^n f(x) = E(X^n).
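A minimal computational sketch of this definition (my addition, assuming sympy is available): build M(t) for a fair six-sided die and read the moments off the derivatives at t = 0.

    import sympy as sp

    t = sp.symbols('t')
    pmf = {x: sp.Rational(1, 6) for x in range(1, 7)}   # f(x) = 1/6 for x = 1, ..., 6

    # M(t) = sum over x in S of e^{tx} f(x)
    M = sum(sp.exp(t * x) * f for x, f in pmf.items())

    print(M.subs(t, 0))                 # M(0)   = 1
    print(sp.diff(M, t).subs(t, 0))     # M'(0)  = E(X)   = 7/2
    print(sp.diff(M, t, 2).subs(t, 0))  # M''(0) = E(X^2) = 91/6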

1. An envelope contains two blue sheets of paper and three green sheets of paper. Two sheets are randomly selected without replacement, and the random variable X is defined to be the number of blue sheets.
(a) Name the type of distribution X has. X has a hypergeometric distribution with N = 5, N_1 = 2, n = 2.
(b) Find the p.m.f. for X. f(x) = 3/10 if x = 0; 3/5 if x = 1; 1/10 if x = 2.
(c) Find the m.g.f. for X. M(t) = (3 + 6e^t + e^{2t}) / 10 for -∞ < t < ∞.
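A quick check of these answers (my addition, standard library only; the names N, N1, n mirror the slide):

    from math import comb
    from fractions import Fraction

    N, N1, n = 5, 2, 2   # population size, blue sheets, draws
    # hypergeometric p.m.f.: f(x) = C(N1, x) C(N - N1, n - x) / C(N, n)
    f = {x: Fraction(comb(N1, x) * comb(N - N1, n - x), comb(N, n)) for x in range(3)}
    print(f)   # {0: 3/10, 1: 3/5, 2: 1/10}, the coefficients of 1, e^t, e^{2t} in M(t)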

2. An envelope contains two blue sheets of paper and three green sheets of paper. Two sheets are randomly selected with replacement, and the random variable X is defined to be the number of blue sheets.
(a) Name the type of distribution X has. X has a b(2, 0.4) distribution.
(b) Find the p.m.f. for X. f(x) = 9/25 if x = 0; 12/25 if x = 1; 4/25 if x = 2.
(c) Find the m.g.f. for X. M(t) = (9 + 12e^t + 4e^{2t}) / 25 for -∞ < t < ∞.

3. Suppose the random variable X has m.g.f. M(t) = (e^{-9t} + e^{4t})^3 / 8. Find P(X < 0).

M(t) = (e^{-9t} + e^{4t})^3 / 8
     = (e^{-27t} + 3e^{-18t}e^{4t} + 3e^{-9t}e^{8t} + e^{12t}) / 8
     = (e^{-27t} + 3e^{-14t} + 3e^{-t} + e^{12t}) / 8
     = (1/8)e^{-27t} + (3/8)e^{-14t} + (3/8)e^{-t} + (1/8)e^{12t} = Σ_x e^{tx} f(x).

The p.m.f. for X must be f(x) = 1/8 if x = -27; 3/8 if x = -14; 3/8 if x = -1; 1/8 if x = 12.

P(X < 0) = 1/8 + 3/8 + 3/8 = 7/8.
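This expand-and-match step is easy to reproduce symbolically; a sketch (my addition, assuming sympy):

    import sympy as sp

    t = sp.symbols('t')
    M = (sp.exp(-9*t) + sp.exp(4*t))**3 / 8
    # expand the cube and combine the exponentials to read off the p.m.f.
    print(sp.powsimp(sp.expand(M)))
    # exp(-27t)/8 + 3exp(-14t)/8 + 3exp(-t)/8 + exp(12t)/8,
    # so f(-27) = f(12) = 1/8, f(-14) = f(-1) = 3/8, and P(X < 0) = 7/8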

4. A random variable X has a b(n, p) distribution. Find the m.g.f. for X, and use the m.g.f. to find E(X) and Var(X).

M(t) = E(e^{tX}) = Σ_{x=0}^{n} e^{tx} C(n, x) p^x (1 - p)^{n-x} = Σ_{x=0}^{n} C(n, x) (pe^t)^x (1 - p)^{n-x} = (pe^t + 1 - p)^n = (1 - p + pe^t)^n for -∞ < t < ∞.

Since M(t) = (1 - p + pe^t)^n, M'(t) = n(1 - p + pe^t)^{n-1} pe^t, and M''(t) = n(1 - p + pe^t)^{n-1} pe^t + n(n - 1)(1 - p + pe^t)^{n-2} p^2 e^{2t}, then M'(0) = np and M''(0) = np + n(n - 1)p^2.

E(X) = μ = np, and Var(X) = σ^2 = E(X^2) - [E(X)]^2 = np + n(n - 1)p^2 - (np)^2 = np(1 - p).
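The differentiation above can be verified symbolically; a sketch (my addition, assuming sympy):

    import sympy as sp

    t, p, n = sp.symbols('t p n', positive=True)
    M = (1 - p + p*sp.exp(t))**n        # binomial m.g.f. derived above

    EX  = sp.diff(M, t).subs(t, 0)      # should reduce to n*p
    EX2 = sp.diff(M, t, 2).subs(t, 0)   # should reduce to n*p + n*(n-1)*p**2
    print(sp.simplify(EX))              # n*p
    print(sp.simplify(EX2 - EX**2))     # n*p*(1 - p), possibly written as -n*p*(p - 1)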

Return to Section 2.4. A Bernoulli experiment is one which must result in one of two mutually exclusive and exhaustive outcomes, often labeled in general as "success" and "failure". The probability of "success" is often denoted as p, and the probability of "failure" is often denoted as q, that is, q = 1 - p. If X is a random variable defined to be one (1) when a success occurs and zero (0) when a failure occurs, then X is said to have a Bernoulli distribution with success probability p.

The p.m.f. of X is f(x) = p^x (1 - p)^{1-x} if x = 0, 1. E(X) = p and Var(X) = p(1 - p). The m.g.f. of X is M(t) = 1 - p + pe^t for -∞ < t < ∞.

A sequence of Bernoulli trials is a sequence of independent Bernoulli experiments where the probability of the outcome labeled "success", often denoted as p, remains the same on each trial; the probability of "failure" is often denoted as q, that is, q = 1 - p. Suppose X_1, X_2, ..., X_n make up a sequence of Bernoulli trials. If X = X_1 + X_2 + ... + X_n, then X is a random variable equal to the number of successes in the sequence of Bernoulli trials, and X is said to have a binomial distribution with success probability p, denoted b(n, p).

The p.m.f. of X is f(x) = [n! / (x!(n - x)!)] p^x (1 - p)^{n-x} if x = 0, 1, ..., n. E(X) = np and Var(X) = np(1 - p). The m.g.f. of X is M(t) = (1 - p + pe^t)^n for -∞ < t < ∞.

Return to Class Exercise #3 in Section 2.4

(c) Consider the random variable Q = "the number of clear marbles when 3 marbles are selected at random with replacement" with p.m.f. f(q). Find f(q), E(Q), and Var(Q).

f(q) = C(3, q) (2/7)^q (5/7)^{3-q} if q = 0, 1, 2, 3.

E(Q) = μ_Q = (3)(2/7) = 6/7, and Var(Q) = σ_Q^2 = (3)(2/7)(5/7) = 30/49.

(d) Consider the random variable V = "the number of clear marbles when 7 marbles are selected at random with replacement" with p.m.f. g(v). Find g(v), E(V), and Var(V).

g(v) = C(7, v) (2/7)^v (5/7)^{7-v} if v = 0, 1, 2, ..., 7.

E(V) = μ_V = (7)(2/7) = 2, and Var(V) = σ_V^2 = (7)(2/7)(5/7) = 10/7.

(e) Consider the random variable W = "the number of colored marbles when 7 marbles are selected at random with replacement" with p.m.f. h(w). Find h(w), E(W), and Var(W). (Note that V + W = 7.)

h(w) = C(7, w) (5/7)^w (2/7)^{7-w} if w = 0, 1, 2, ..., 7.

E(W) = μ_W = (7)(5/7) = 5, and Var(W) = σ_W^2 = (7)(5/7)(2/7) = 10/7.

Return to Section 2.5

5. A random variable X has p.m.f. f(x) = 2(1/3)^x if x = 1, 2, 3, ….
(a) Verify that f(x) is a p.m.f.

Σ_{x=1}^∞ 2(1/3)^x = 2 Σ_{x=1}^∞ (1/3)^x = (2/3) Σ_{x=0}^∞ (1/3)^x = (2/3) · 1/(1 - 1/3) = 1.

(b) Find the m.g.f. for X.

M(t) = E(e^{tX}) = Σ_{x=1}^∞ e^{tx} (2)(1/3)^x = 2 Σ_{x=1}^∞ (e^t/3)^x = 2(e^t/3) Σ_{x=1}^∞ (e^t/3)^{x-1} = 2(e^t/3)[1 + (e^t/3) + (e^t/3)^2 + (e^t/3)^3 + (e^t/3)^4 + …] = 2(e^t/3) · 1/(1 - e^t/3) = 2e^t / (3 - e^t) for t < ln(3).

(c) Use the moment generating function to find E(X) and Var(X).

Since M(t) = 2/(3e^{-t} - 1), M'(t) = 6e^{-t}(3e^{-t} - 1)^{-2}, and M''(t) = -6e^{-t}(3e^{-t} - 1)^{-2} + 36e^{-2t}(3e^{-t} - 1)^{-3}, then M'(0) = 6/4 = 3/2 and M''(0) = -6/4 + 36/8 = 3.

E(X) = μ = 3/2, and Var(X) = σ^2 = E(X^2) - [E(X)]^2 = 3 - (3/2)^2 = 3/4.
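A check of parts (b) and (c) (my addition, assuming sympy): start from the claimed m.g.f. and recover the moments.

    import sympy as sp

    t = sp.symbols('t')
    M = 2*sp.exp(t) / (3 - sp.exp(t))     # claimed m.g.f., valid for t < ln(3)

    print(M.subs(t, 0))                   # 1, as any m.g.f. must give at t = 0
    print(sp.diff(M, t).subs(t, 0))       # E(X) = 3/2
    EX2 = sp.diff(M, t, 2).subs(t, 0)     # E(X^2) = 3
    print(sp.simplify(EX2 - sp.Rational(9, 4)))   # Var(X) = 3/4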

6. Among all the boxes of cereal produced at a certain factory, 8% are underweight. Boxes are randomly selected and weighed, and the following random variables are defined:
X = number of boxes weighed to get the first underweight box
Y = number of boxes weighed to get the third underweight box
V = number of acceptable boxes weighed to get the first underweight box
W = number of acceptable boxes weighed to get the third underweight box
S = number of boxes weighed before the third underweight box

(a) Find each of the following:
P(X = 1) = 0.08, P(X = 2) = (0.92)(0.08), P(X = 3) = (0.92)^2(0.08), P(X = 4) = (0.92)^3(0.08).

(b) Find each of the following:
P(Y = 3) = (0.08)^3, P(Y = 4) = C(3, 2)(0.92)(0.08)^3, P(Y = 5) = C(4, 2)(0.92)^2(0.08)^3, P(Y = 6) = C(5, 2)(0.92)^3(0.08)^3.

(c) Find the p.m.f. of X. f_1(x) = (0.92)^{x-1}(0.08) if x = 1, 2, 3, ….

(d) Find the p.m.f. of Y. f_2(y) = C(y - 1, y - 3)(0.92)^{y-3}(0.08)^3 if y = 3, 4, 5, …; note that C(y - 1, y - 3) is the same as C(y - 1, 2).
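These probabilities can be checked numerically (my addition, assuming scipy); note that scipy's nbinom counts failures before the rth success, so P(Y = y) corresponds to nbinom.pmf(y - r, r, p).

    from scipy.stats import geom, nbinom

    p, r = 0.08, 3
    print(geom.pmf(3, p))             # P(X = 3) = (0.92)^2 (0.08)
    print(nbinom.pmf(4 - r, r, p))    # P(Y = 4) = C(3, 2)(0.92)(0.08)^3
    print(nbinom.pmf(5 - r, r, p))    # P(Y = 5) = C(4, 2)(0.92)^2 (0.08)^3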

Suppose independent Bernoulli trials are performed until the rth success is observed. If the random variable X is defined to be the number of trials to observe the rth success, then the random variable X is said to have a negative binomial distribution; in the special case where r = 1, we say that X has a geometric distribution.

The p.m.f. of X is f(x) = C(x - 1, r - 1) p^r (1 - p)^{x-r} if x = r, r + 1, r + 2, …; note that C(x - 1, x - r) is the same as C(x - 1, r - 1). E(X), Var(X), and the m.g.f. of X are derived in Exercise 8 below.

7. Do Text Exercise: with R(t) = ln[M(t)], show that R'(0) = μ and R''(0) = σ^2.

R(t) = ln[M(t)], so R'(t) = M'(t)/M(t) and R''(t) = {M(t)M''(t) - [M'(t)]^2} / [M(t)]^2.

R'(0) = M'(0)/M(0) = E(X)/1 = E(X) = μ.

R''(0) = {M(0)M''(0) - [M'(0)]^2} / [M(0)]^2 = {E(X^2) - [E(X)]^2} / 1^2 = Var(X) = σ^2.
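A concrete check of this exercise (my addition, assuming sympy), using the b(n, p) m.g.f. from Exercise 4:

    import sympy as sp

    t, p, n = sp.symbols('t p n', positive=True)
    R = sp.log((1 - p + p*sp.exp(t))**n)     # R(t) = ln[M(t)]

    print(sp.simplify(sp.diff(R, t).subs(t, 0)))     # n*p = E(X) = mu
    print(sp.simplify(sp.diff(R, t, 2).subs(t, 0)))  # n*p*(1 - p) = Var(X) = sigma^2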

Look at the handout "Some Well-Known Series":

1/(1 - v) = 1 + v + v^2 + v^3 + v^4 + v^5 + v^6 + … for -1 < v < 1.

Take the antiderivative of both sides to get

-ln(1 - v) = v + (1/2)v^2 + (1/3)v^3 + (1/4)v^4 + (1/5)v^5 + (1/6)v^6 + … for -1 < v < 1.

Let w = 1 - v:

-ln(w) = (1 - w) + (1 - w)^2/2 + (1 - w)^3/3 + (1 - w)^4/4 + (1 - w)^5/5 + (1 - w)^6/6 + … for 0 < w < 2,

ln(w) = (w - 1) - (w - 1)^2/2 + (w - 1)^3/3 - (w - 1)^4/4 + (w - 1)^5/5 - (w - 1)^6/6 + … for 0 < w < 2.

1/(1 - w) = 1 + w + w^2 + w^3 + w^4 + w^5 + w^6 + … for -1 < w < 1.

Take the derivative of both sides to get

1/(1 - w)^2 = 1 + 2w + 3w^2 + 4w^3 + 5w^4 + 6w^5 + 7w^6 + … for -1 < w < 1.

Take the derivative of both sides again to get

2/(1 - w)^3 = 2 + (3)(2)w + (4)(3)w^2 + (5)(4)w^3 + (6)(5)w^4 + (7)(6)w^5 + … for -1 < w < 1.

Divide both sides by 2 to get

1/(1 - w)^3 = 1 + 3w + [(4)(3)/2]w^2 + [(5)(4)/2]w^3 + [(6)(5)/2]w^4 + [(7)(6)/2]w^5 + … for -1 < w < 1.

We can recognize the general pattern to be

1/(1 - w)^r = Σ_{x=r}^∞ C(x - 1, r - 1) w^{x-r} for |w| < 1.
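The pattern can be spot-checked for r = 3 (my addition, assuming sympy): the Taylor coefficients of 1/(1 - w)^3 should match C(x - 1, 2) for x = 3, 4, 5, ….

    import sympy as sp

    w = sp.symbols('w')
    print(sp.series(1/(1 - w)**3, w, 0, 6))
    # 1 + 3w + 6w^2 + 10w^3 + 15w^4 + 21w^5 + O(w^6)
    print([sp.binomial(x - 1, 2) for x in range(3, 9)])
    # [1, 3, 6, 10, 15, 21], the same coefficients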

8. A random variable X has a negative binomial distribution. Find the m.g.f. for X, and use the m.g.f. to find E(X) and Var(X).

M(t) = E(e^{tX}) = Σ_{x=r}^∞ e^{tx} C(x - 1, r - 1) p^r (1 - p)^{x-r} = (pe^t)^r Σ_{x=r}^∞ C(x - 1, r - 1) [(1 - p)e^t]^{x-r} = (pe^t)^r / [1 - (1 - p)e^t]^r for t < -ln(1 - p).

R(t) = ln[M(t)] = r ln(pe^t) - r ln[1 - (1 - p)e^t] = r ln(p) + rt - r ln[1 - (1 - p)e^t].

R'(t) = r + r(1 - p)e^t / [1 - (1 - p)e^t] = r / [1 - (1 - p)e^t], and R''(t) = r(1 - p)e^t / [1 - (1 - p)e^t]^2.

E(X) = μ = R'(0) = r/p, and Var(X) = σ^2 = R''(0) = r(1 - p)/p^2.
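A symbolic check of this derivation (my addition, assuming sympy; simplify may write the results in an equivalent rearranged form):

    import sympy as sp

    t, p, r = sp.symbols('t p r', positive=True)
    M = (p*sp.exp(t))**r / (1 - (1 - p)*sp.exp(t))**r
    R = sp.log(M)                                    # R(t) = ln[M(t)]

    print(sp.simplify(sp.diff(R, t).subs(t, 0)))     # r/p = E(X)
    print(sp.simplify(sp.diff(R, t, 2).subs(t, 0)))  # r*(1 - p)/p**2 = Var(X)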

Suppose independent Bernoulli trials are performed until the rth success is observed. If the random variable X is defined to be the number of trials to observe the rth success, then the random variable X is said to have a negative binomial distribution; in the special case where r = 1, we say that X has a geometric distribution.

The p.m.f. of X is f(x) = C(x - 1, r - 1) p^r (1 - p)^{x-r} if x = r, r + 1, r + 2, …; note that C(x - 1, x - r) is the same as C(x - 1, r - 1). E(X) = r/p and Var(X) = r(1 - p)/p^2. The m.g.f. of X is M(t) = (pe^t)^r / [1 - (1 - p)e^t]^r for t < -ln(1 - p).

Return to Class Exercise #6

(e) Find each of the following:
E(X) = μ_X = 1/0.08 = 25/2 = 12.5, and Var(X) = σ_X^2 = (1)(0.92)/(0.08)^2 = 143.75.
E(Y) = μ_Y = 3/0.08 = 75/2 = 37.5, and Var(Y) = σ_Y^2 = (3)(0.92)/(0.08)^2 = 431.25.
E(V) = μ_V = E(X - 1) = 23/2 = 11.5, and Var(V) = σ_V^2 = Var(X - 1) = Var(X) = 143.75.

E(W) = μ_W = E(Y - 3) = 69/2 = 34.5, and Var(W) = σ_W^2 = Var(Y - 3) = Var(Y) = 1725/4 = 431.25.
E(S) = μ_S = E(Y - 1) = 73/2 = 36.5, and Var(S) = σ_S^2 = Var(Y - 1) = Var(Y) = 1725/4 = 431.25.

(f) Find each of the following:
the p.m.f. of V: f_3(v) = (0.92)^v(0.08) if v = 0, 1, 2, …
the p.m.f. of W: f_4(w) = C(w + 2, w)(0.92)^w(0.08)^3 if w = 0, 1, 2, …; note that C(w + 2, w) is the same as C(w + 2, 2)
the p.m.f. of S: f_5(s) = C(s, s - 2)(0.92)^{s-2}(0.08)^3 if s = 2, 3, 4, …; note that C(s, s - 2) is the same as C(s, 2)

(g) Find the expected number of boxes that must be weighed in order to obtain the third underweight box.
E(Y) = μ_Y = 75/2 = 37.5.

(h) Find the probability that exactly 5 boxes must be weighed in order to find the first underweight box (i.e., the fifth box weighed is the first underweight box).
P(X = 5) = (0.92)^4(0.08) ≈ 0.0573.

(i) Find the probability that at least 5 boxes must be weighed in order to find the first underweight box.
P(X ≥ 5) = P(first 4 boxes weighed are acceptable) = (0.92)^4 ≈ 0.7164.

(j) Find the probability that at most 5 boxes must be weighed in order to find the first underweight box.
P(X ≤ 5) = 1 - P(X ≥ 6) = 1 - P(first 5 boxes weighed are acceptable) = 1 - (0.92)^5 ≈ 0.3409.

(k) Find the probability that exactly 8 boxes must be weighed in order to find the third underweight box (i.e., exactly 5 acceptable boxes are weighed before the third underweight box).
P(Y = 8) = C(7, 2)(0.92)^5(0.08)^3 ≈ 0.0071.

(l) Find the probability that at least 6 boxes must be weighed in order to find the second underweight box (i.e., at least 4 acceptable boxes are weighed before the second underweight box).
P(first 5 boxes are all acceptable or contain exactly 1 underweight box) = P(first 5 boxes are all acceptable) + P(first 5 boxes contain exactly 1 underweight box) = (0.92)^5 + C(5, 1)(0.92)^4(0.08) ≈ 0.9456.

(m) Find the probability that at most 6 boxes must be weighed in order to find the second underweight box (i.e., at most 4 acceptable boxes are weighed before the second underweight box).
1 - P(at least 7 boxes must be weighed to find the 2nd underweight box) = 1 - [(0.92)^6 + C(6, 1)(0.92)^5(0.08)] ≈ 0.0773.
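Parts (h) through (m) reduce to a few lines of arithmetic; a plain-Python check (my addition, standard library only):

    from math import comb

    p, q = 0.08, 0.92
    print(q**4 * p)                            # (h) P(X = 5)  ~ 0.0573
    print(q**4)                                # (i) P(X >= 5) ~ 0.7164
    print(1 - q**5)                            # (j) P(X <= 5) ~ 0.3409
    print(comb(7, 2) * q**5 * p**3)            # (k) P(Y = 8)  ~ 0.0071
    print(q**5 + comb(5, 1) * q**4 * p)        # (l) ~ 0.9456
    print(1 - (q**6 + comb(6, 1) * q**5 * p))  # (m) ~ 0.0773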

9. (a) Find E(X) and Var(X), if the moment generating function of X is
M(t) = [0.64e^t / (1 - 0.36e^t)]^{10} for t < -ln(0.36).
We recognize this as the moment generating function for a negative binomial distribution with p = 0.64 and r = 10.
E(X) = 10/0.64 = 125/8, and Var(X) = (10)(0.36)/(0.64)^2 = 1125/128.

(b) M(t) = (4e^{-t} - 3)^{-1} for t < ln(4/3).
After some algebra, we recognize this as the moment generating function for a geometric distribution with p = 1/4.
E(X) = 4, and Var(X) = 12.

(c) M(t) = (4e^{-5t} - 3)^{-1} for t < (1/5)ln(4/3).
We do not recognize this moment generating function.
M'(t) = 20e^{-5t}(4e^{-5t} - 3)^{-2}, and M''(t) = -100e^{-5t}(4e^{-5t} - 3)^{-2} + 800e^{-10t}(4e^{-5t} - 3)^{-3}.
M'(0) = 20, and M''(0) = -100 + 800 = 700.
E(X) = 20, and Var(X) = 700 - (20)^2 = 300.

(d) M(t) = 4/(4 - e^t) - 1/3 for t < ln(4).
We do not recognize this moment generating function. Write M(t) = 4(4 - e^t)^{-1} - 1/3.
M'(t) = 4e^t(4 - e^t)^{-2}, and M''(t) = 4e^t(4 - e^t)^{-2} + 8e^{2t}(4 - e^t)^{-3}.
M'(0) = 4/9, and M''(0) = 4/9 + 8/27 = 20/27.
E(X) = 4/9, and Var(X) = 20/27 - (4/9)^2 = 44/81.

(e) M(t) = (0.2 + 0.8e^t)^{27} for -∞ < t < ∞.
We recognize this as the moment generating function for a binomial distribution with p = 0.8 and n = 27.
E(X) = (27)(0.8) = 21.6, and Var(X) = (27)(0.8)(0.2) = 4.32.
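Part (d)'s differentiate-at-zero approach is easy to mirror symbolically; a sketch (my addition, assuming sympy):

    import sympy as sp

    t = sp.symbols('t')
    M = 4/(4 - sp.exp(t)) - sp.Rational(1, 3)

    EX  = sp.diff(M, t).subs(t, 0)        # 4/9
    EX2 = sp.diff(M, t, 2).subs(t, 0)     # 20/27
    print(EX, sp.simplify(EX2 - EX**2))   # 4/9, 44/81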

10. (a) Two people, Ms. A and Mr. B, take turns flipping a coin which has probability p of displaying heads, and Ms. A gets to flip first. Suppose the person who gets a head first wins the game. Find the probability that Ms. A wins with a fair coin (i.e., p = 0.5).

P(Ms. A wins) = P(H) + P(TTH) + P(TTTTH) + … = (1/2) + (1/2)^3 + (1/2)^5 + (1/2)^7 + … = (1/2){1 + (1/2)^2 + [(1/2)^2]^2 + [(1/2)^2]^3 + …} = (1/2) · 1/(1 - 1/4) = 2/3.

(b) For what value(s) of p will the game in part (a) be fair (i.e., both Ms. A and Mr. B have an equal chance of winning)?

P(Ms. A wins) = P(H) + P(TTH) + P(TTTTH) + … = p + (1 - p)^2 p + (1 - p)^4 p + (1 - p)^6 p + … = p{1 + (1 - p)^2 + [(1 - p)^2]^2 + [(1 - p)^2]^3 + …} = p / [1 - (1 - p)^2].

For what value(s) of p is p / [1 - (1 - p)^2] = 1/2? Since p / [1 - (1 - p)^2] = p / (2p - p^2) = 1 / (2 - p), the equation requires p = 0, which is not a valid success probability. There are no values of p which make this equation true, and therefore it is not possible for the game to be fair.

(c) Suppose Ms. A must get a head to win, and Mr. B must get a tail to win. For what value(s) of p will this game be fair?

P(Ms. A wins) = P(H) + P(THH) + P(THTHH) + … = p + (1 - p)p^2 + (1 - p)^2 p^3 + (1 - p)^3 p^4 + … = p{1 + (1 - p)p + [(1 - p)p]^2 + [(1 - p)p]^3 + …} = p / [1 - (1 - p)p].

For what value(s) of p is p / [1 - (1 - p)p] = 1/2? This requires 2p = 1 - p + p^2, that is, p^2 - 3p + 1 = 0, so p = (3 - √5)/2 ≈ 0.382 (the other root, (3 + √5)/2, exceeds 1).
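The quadratic can be solved mechanically (my addition, assuming sympy):

    import sympy as sp

    p = sp.symbols('p', positive=True)
    sol = sp.solve(sp.Eq(p / (1 - (1 - p)*p), sp.Rational(1, 2)), p)
    print(sol)   # [3/2 - sqrt(5)/2, 3/2 + sqrt(5)/2]; only (3 - sqrt(5))/2 lies in (0, 1)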

11. (a) A factory produces pieces of candy in equal proportions of eight different colors: red, orange, purple, pink, blue, green, brown, and white. If random pieces of candy are purchased, find the expected number of pieces of candy to obtain at least one of each color.

Define the following random variables:
X_1 = number of purchases to obtain any color
X_2 = number of purchases to obtain a color different from the 1st color obtained
X_3 = number of purchases to obtain a color different from the 1st and 2nd colors obtained
⋮
X_8 = number of purchases to obtain a color different from the 1st, 2nd, 3rd, …, and 7th colors obtained

The expected number of purchases to obtain one candy of each color is E(X_1 + X_2 + X_3 + X_4 + X_5 + X_6 + X_7 + X_8) = E(X_1) + E(X_2) + … + E(X_8).

X_1 has a geometric distribution with p = 1, that is, X_1 = 1.
X_2 has a geometric distribution with p = 7/8.
X_3 has a geometric distribution with p = 6/8 = 3/4.
X_4 has a geometric distribution with p = 5/8.
X_5 has a geometric distribution with p = 4/8 = 1/2.
X_6 has a geometric distribution with p = 3/8.
X_7 has a geometric distribution with p = 2/8 = 1/4.
X_8 has a geometric distribution with p = 1/8.

For k = 1, 2, …, 8, X_k has a geometric distribution with p = (9 - k)/8, and E(X_k) = 8/(9 - k).

E(X_1) + E(X_2) + … + E(X_8) = 8/8 + 8/7 + 8/6 + … + 8/2 + 8/1 = 761/35 ≈ 21.74.
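A one-line check of this sum (my addition, standard library only):

    from fractions import Fraction

    expected = sum(Fraction(8, 9 - k) for k in range(1, 9))   # sum of 8/(9 - k)
    print(expected, float(expected))   # 761/35, about 21.74 purchases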

(b) Sandy has three pieces of candy in her pocket: a red, an orange, and a purple. She is carrying a lot of material and only has one hand free. With her free hand, she reaches into her pocket to select one piece of candy at random. If it is the purple candy, she holds onto it. If it is not purple, she holds the selected piece in her palm while selecting one of the other two, after which she releases the piece of candy in her palm. She repeats this process until she obtains the purple piece. Find the expected number of selections to obtain the purple piece.

X = number of selections to obtain the purple candy. Space of X = {1, 2, 3, …}.

P(X = 1) = 1/3, P(X = 2) = (2/3)(1/2), P(X = 3) = (2/3)(1/2)^2, P(X = 4) = (2/3)(1/2)^3.

The p.m.f. of X is f(x) = 1/3 if x = 1; (2/3)(1/2)^{x-1} if x = 2, 3, 4, ….

M(t) = E(e^{tX}) = (1/3)e^t + Σ_{x=2}^∞ e^{tx}(2/3)(1/2)^{x-1} = (1/3)e^t + (e^{2t}/3) Σ_{x=2}^∞ e^{t(x-2)}(1/2)^{x-2} = (1/3)e^t + (e^{2t}/3)[1 + (e^t/2) + (e^t/2)^2 + (e^t/2)^3 + (e^t/2)^4 + …] = (1/3)e^t + (e^{2t}/3) · 1/(1 - e^t/2) = e^t/3 + 2e^{2t}/(6 - 3e^t) if t < ln(2).

M'(t) = e^t/3 + 4e^{2t}(6 - 3e^t)^{-1} + 6e^{3t}(6 - 3e^t)^{-2}, so E(X) = M'(0) = 1/3 + 4/3 + 6/9 = 7/3.
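A check of E(X) = 7/3 from this m.g.f. (my addition, assuming sympy):

    import sympy as sp

    t = sp.symbols('t')
    M = sp.exp(t)/3 + 2*sp.exp(2*t) / (6 - 3*sp.exp(t))   # valid for t < ln(2)
    print(M.subs(t, 0))                # 1, as required of an m.g.f.
    print(sp.diff(M, t).subs(t, 0))    # E(X) = 7/3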

12. The random variable X has p.m.f. f(x) = 3/4 if x = 0; (1/5)^x if x = 1, 2, 3, …. Verify that f(x) is a p.m.f., and find E(X).

The sum of the probabilities is 3/4 + 1/5 + 1/25 + 1/125 + … = 3/4 + (1/5) · 1/(1 - 1/5) = 3/4 + 1/4 = 1, so f(x) is a p.m.f.

E(X) = (0)(3/4) + (1)(1/5) + (2)(1/25) + (3)(1/125) + … = (1/5) / (1 - 1/5)^2 = 5/16.

Note: The mean could also be found by first finding the m.g.f.
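Both computations can be confirmed with symbolic sums (my addition, assuming sympy):

    import sympy as sp

    x = sp.symbols('x', positive=True, integer=True)
    tail = sp.summation(sp.Rational(1, 5)**x, (x, 1, sp.oo))       # 1/4
    mean = sp.summation(x * sp.Rational(1, 5)**x, (x, 1, sp.oo))   # 5/16
    print(sp.Rational(3, 4) + tail)   # 1, so f(x) is a p.m.f.
    print(mean)                       # E(X) = 5/16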