ENGG 2040C: Probability Models and Applications. Andrej Bogdanov. Spring 2013. 4. Random variables, part one.

Random variable

A discrete random variable assigns a discrete value to every outcome in the sample space.

Example: Toss two coins. The sample space is { HH, HT, TH, TT }; let N = number of Hs.

Probability mass function

The probability mass function (p.m.f.) of a discrete random variable X is the function p(x) = P(X = x).

Example: For the sample space { HH, HT, TH, TT } and N = number of Hs:
p(0) = P(N = 0) = P({ TT }) = 1/4
p(1) = P(N = 1) = P({ HT, TH }) = 1/2
p(2) = P(N = 2) = P({ HH }) = 1/4

Probability mass function

We can describe the p.m.f. by a table or by a chart.

x      0   1   2
p(x)   ¼   ½   ¼
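A p.m.f. like this can be checked by enumerating the sample space directly. The short Python sketch below (not part of the original slides) tabulates N = number of Hs over all two-coin outcomes:

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Sample space: all sequences of two coin tosses, each with probability 1/4
outcomes = list(product("HT", repeat=2))

# Count how many outcomes give each value of N = number of Hs
counts = Counter(o.count("H") for o in outcomes)
pmf = {n: Fraction(c, len(outcomes)) for n, c in counts.items()}
# pmf maps 0 -> 1/4, 1 -> 1/2, 2 -> 1/4
```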

Example

Toss a coin 3 times. A change occurs when a coin toss comes out different from the previous one. Calculate the p.m.f. of the number of changes.
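One way to carry out this calculation (a brute-force sketch, not the slide's worked solution) is to enumerate all 8 toss sequences and count adjacent pairs that differ:

```python
from itertools import product
from collections import Counter
from fractions import Fraction

def changes(seq):
    # A change occurs whenever a toss differs from the previous one
    return sum(a != b for a, b in zip(seq, seq[1:]))

# All 2^3 = 8 equally likely sequences of three tosses
outcomes = list(product("HT", repeat=3))
counts = Counter(changes(o) for o in outcomes)
pmf = {k: Fraction(v, len(outcomes)) for k, v in counts.items()}
# p(0) = 1/4, p(1) = 1/2, p(2) = 1/4
```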

Balls

We draw 3 balls without replacement from an urn containing nine balls: three marked 1, three marked 0, and three marked -1. Let X be the sum of the values on the balls. What is the p.m.f. of X?

Balls

X = sum of values on the 3 balls. Write E_abc for the event that we chose balls of types a, b, c.

P(X = 0) = P(E_000) + P(E_1(-1)0) = (1 + 3×3×3)/C(9, 3) = 28/84
P(X = 1) = P(E_100) + P(E_11(-1)) = (3×3 + 3×3)/C(9, 3) = 18/84
P(X = -1) = P(E_(-1)00) + P(E_(-1)(-1)1) = (3×3 + 3×3)/C(9, 3) = 18/84
P(X = 2) = P(E_110) = 3×3/C(9, 3) = 9/84
P(X = -2) = P(E_(-1)(-1)0) = 3×3/C(9, 3) = 9/84
P(X = 3) = P(E_111) = 1/C(9, 3) = 1/84
P(X = -3) = P(E_(-1)(-1)(-1)) = 1/C(9, 3) = 1/84

Probability mass function

This gives the p.m.f. of the sum of values on the 3 balls. The events "X = x" are disjoint and partition the sample space, so for every p.m.f.

∑_x p(x) = 1
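These fractions can be double-checked by brute force over all C(9, 3) = 84 draws. The sketch below assumes the urn holds three balls each of value 1, 0, and -1:

```python
from itertools import combinations
from collections import Counter
from fractions import Fraction

urn = [1]*3 + [0]*3 + [-1]*3             # three balls of each value
draws = list(combinations(range(9), 3))  # all C(9,3) = 84 equally likely draws

# X = sum of the drawn values; tally each possible sum
counts = Counter(sum(urn[i] for i in d) for d in draws)
pmf = {x: Fraction(c, len(draws)) for x, c in counts.items()}
# e.g. pmf[0] = 28/84 and the values sum to 1
```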

Coupon collection

There are n types of coupons. Every day you get one. You want a coupon of type 1. By when will you get it?

Probability model
Let E_i be the event that you get a type 1 coupon on day i. Since there are n types, we assume P(E_1) = P(E_2) = … = 1/n. We also assume that E_1, E_2, … are independent.

Coupon collection

Let X_1 be the day on which you get coupon 1.

P(X_1 ≤ d) = 1 - P(X_1 > d)
           = 1 - P(E_1^c E_2^c … E_d^c)
           = 1 - P(E_1^c) P(E_2^c) … P(E_d^c)
           = 1 - (1 - 1/n)^d
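This formula can be sanity-checked for small n and d by enumerating all n^d equally likely sequences of daily coupon types (a sketch; the function names and the encoding of type 1 as 0 are my own):

```python
from itertools import product

def p_type1_by(n, d):
    # Closed form: P(X1 <= d) = 1 - (1 - 1/n)^d
    return 1 - (1 - 1/n)**d

def p_type1_by_exact(n, d):
    # Enumerate all n^d day sequences; count those that contain type 1 (coded 0)
    seqs = list(product(range(n), repeat=d))
    return sum(0 in s for s in seqs) / len(seqs)

# With n = 3, d = 4 both give 65/81
```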

Coupon collection

There are n types of coupons. Every day you get one. By when will you get all the coupon types?

Solution
Let X_t be the day on which you get a type t coupon, and let X be the day on which you collect all coupons. Then

(X ≤ d) = (X_1 ≤ d) and (X_2 ≤ d) and … and (X_n ≤ d)
(X > d) = (X_1 > d) ∪ (X_2 > d) ∪ … ∪ (X_n > d)

but these events are not independent!

Coupon collection

We calculate P(X > d) by inclusion-exclusion:

P(X > d) = ∑ P(X_t > d) - ∑ P(X_t > d and X_u > d) + …

By symmetry, P(X_t > d) = P(X_1 > d) = (1 - 1/n)^d.

For the pairwise terms, let F_i = "the day i coupon is not of type 1 or 2". Then

P(X_1 > d and X_2 > d) = P(F_1 … F_d) = P(F_1) … P(F_d) = (1 - 2/n)^d

since the F_i are independent events.

Coupon collection

P(X > d) = ∑ P(X_t > d) - ∑ P(X_t > d and X_u > d) + …

P(X_1 > d) = (1 - 1/n)^d
P(X_1 > d and X_2 > d) = (1 - 2/n)^d
P(X_1 > d and X_2 > d and X_3 > d) = (1 - 3/n)^d

and so on, so

P(X > d) = C(n, 1)(1 - 1/n)^d - C(n, 2)(1 - 2/n)^d + … = ∑_{i=1}^{n} (-1)^{i+1} C(n, i)(1 - i/n)^d
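Here is a small Python sketch of the inclusion-exclusion sum (helper names are my own), checked against brute-force enumeration for a small case:

```python
from itertools import product
from math import comb

def p_not_all_by(n, d):
    # P(X > d) = sum_{i=1}^{n} (-1)^(i+1) C(n,i) (1 - i/n)^d
    return sum((-1)**(i + 1) * comb(n, i) * (1 - i/n)**d
               for i in range(1, n + 1))

def p_not_all_by_exact(n, d):
    # Enumerate all n^d day sequences; count those missing some coupon type
    seqs = list(product(range(n), repeat=d))
    return sum(len(set(s)) < n for s in seqs) / len(seqs)

# With n = 3, d = 5 both give 93/243
```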

Coupon collection

[Chart: P(X ≤ d), the probability of collecting all n coupons by day d, plotted against d for n = 15.]

Coupon collection

[Charts: P(X ≤ d) plotted against d for several values of n (n = 5, 10, 15, …).]

Coupon collection

[Chart: the day on which the probability of collecting all n coupons first exceeds p = 0.5, plotted as a function of n, compared with the function n ln n + n ln(1/(1 - p)).]

Coupon collection

Example: 16 teams with 17 coupons per team gives n = 16 × 17 = 272 coupon types. It takes about 1624 days to collect all coupons (the day by which the chance of success first exceeds one half).
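With n = 272 the inclusion-exclusion sum can be evaluated numerically. The sketch below (my own code, using ordinary floating point and `math.comb`) shows that around day 1624 the probability of having collected everything is close to one half:

```python
from math import comb

def p_all_by(n, d):
    # P(X <= d) = 1 - sum_{i=1}^{n} (-1)^(i+1) C(n,i) (1 - i/n)^d
    return 1 - sum((-1)**(i + 1) * comb(n, i) * (1 - i/n)**d
                   for i in range(1, n + 1))

p = p_all_by(272, 1624)
# p is very close to 0.5, consistent with the 1624-day figure above
```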

Something to think about There are 91 students in ENGG 2040C. Every Tuesday I call 6 students to do problems on the board. There are 11 such Tuesdays. What are the chances you are never called?
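Under the assumed model that each Tuesday's 6 students are chosen uniformly at random, independently across the 11 Tuesdays, the chance a fixed student is never called is easy to compute. This is my own sketch, not the slide's intended answer:

```python
from math import comb

# P(a fixed student is not among the 6 chosen out of 91) on one Tuesday
p_safe_one_week = comb(90, 6) / comb(91, 6)   # equals 85/91

# Assuming independent Tuesdays, multiply across all 11 weeks
p_never_called = p_safe_one_week ** 11        # roughly 0.47
```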

Expected value

The expected value (expectation) of a random variable X with p.m.f. p is

E[X] = ∑_x x p(x)

Example
Toss one coin and let N = number of Hs.

x      0   1
p(x)   ½   ½

E[N] = 0 · ½ + 1 · ½ = ½

Expected value

Example
Toss two coins and let N = number of Hs.

x      0   1   2
p(x)   ¼   ½   ¼

E[N] = 0 · ¼ + 1 · ½ + 2 · ¼ = 1

The expectation is the average value the random variable takes when the experiment is done many times.
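Computing an expectation from a p.m.f. is just a weighted sum; a minimal sketch (helper name is my own):

```python
from fractions import Fraction

def expectation(pmf):
    # E[X] = sum over x of x * p(x)
    return sum(x * p for x, p in pmf.items())

# N = number of Hs in two coin tosses
pmf_N = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
# expectation(pmf_N) gives 1, matching the slide
```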

Expected value

Example
F = face value of a fair 6-sided die.

E[F] = (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5

Russian roulette

Alice and Bob play Russian roulette, taking turns. Let N = number of rounds played. What is E[N]?

Chuck-a-luck

Pick a number from 1 to 6, then roll three dice. If your number doesn't appear, you lose $1. If it appears k times, you win $k.

Chuck-a-luck

Solution
Let P = profit.

n      -1          1               2               3
p(n)   (5/6)³      3(5/6)²(1/6)    3(5/6)(1/6)²    (1/6)³

E[P] = -1 · (5/6)³ + 1 · 3(5/6)²(1/6) + 2 · 3(5/6)(1/6)² + 3 · (1/6)³ = -17/216
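Exact rational arithmetic confirms the -17/216 figure. This sketch assumes the rules as stated (three dice, win $k when your number shows k times, lose $1 otherwise); the function name is my own:

```python
from fractions import Fraction
from math import comb

def chuck_a_luck_expected_profit():
    p = Fraction(1, 6)            # chance your number shows on one die
    total = Fraction(0)
    for k in range(4):            # k = number of dice showing your number
        prob = comb(3, k) * p**k * (1 - p)**(3 - k)
        payoff = -1 if k == 0 else k
        total += payoff * prob
    return total
```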

Utility

Should I come to class this Tuesday? Compare the two actions by expected utility. On a given Tuesday I am called with probability 6/91 and not called with probability 85/91, and each of the four outcomes (come or skip, called or not called) is assigned a utility.

E[C] = 1.37…   (come)
E[S] = 40.66…  (skip)

Average household size

In 2011 the average household in Hong Kong had 2.9 people. Take a random person. What is the average number of people in his/her household?

A: < 2.9    B: 2.9    C: > 2.9

Average household size

Example: two households of size 3 give average household size 3, and the average size of a random person's household is also 3. Two households of sizes 1 and 5 also give average household size 3, but the average size of a random person's household is (1·1 + 5·5)/6 = 4⅓.

Average household size

What is the average household size? The percentage of households of each size (1, 2, 3, 4, 5, 6 or more) is taken from the Hong Kong Annual Digest of Statistics, 2012.

Probability model
The sample space is the set of households of Hong Kong, with equally likely outcomes. Let X = number of people in the household. Weighting each household size by its percentage gives

E[X] = ∑_x x p_X(x) ≈ 2.91

Average household size

Take a random person. What is the average number of people in his/her household?

Probability model
The sample space is the set of people of Hong Kong, with equally likely outcomes. Let Y = number of people in the person's household. Let's find the p.m.f. p_Y(y) = P(Y = y).

Average household size

p_Y(y) = (# people in y-person households) / (# people)
       = y × (# y-person households) / (# people)
       = [y × (# y-person households) / (# households)] / [(# people) / (# households)]
       = y p_X(y) / E[X]

since (# people)/(# households) must equal ∑_y y p_X(y) = E[X], where p_X is the p.m.f. of X.

Average household size

X = number of people in a random household
Y = number of people in the household of a random person

p_Y(y) = y p_X(y) / E[X]

E[Y] = ∑_y y p_Y(y) = (∑_y y² p_X(y)) / E[X]

Using the household-size table, E[Y] ≈ 3.521.

Functions of random variables

We showed E[Y] = E[X²]/E[X].

In general, if X is a random variable and f a function, then Z = f(X) is a random variable with p.m.f.

p_Z(z) = ∑_{x: f(x) = z} p_X(x).
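The size-biasing relation p_Y(y) = y p_X(y)/E[X] and the identity E[Y] = E[X²]/E[X] can be checked on any example p.m.f.; the two-point household distribution below is hypothetical, chosen only to illustrate the computation:

```python
from fractions import Fraction

# Hypothetical p.m.f. of X = household size (not the Hong Kong data):
# half the households have 1 person, half have 5
p_X = {1: Fraction(1, 2), 5: Fraction(1, 2)}

def expectation(pmf):
    return sum(x * p for x, p in pmf.items())

EX = expectation(p_X)                           # E[X] = 3
p_Y = {y: y * p / EX for y, p in p_X.items()}   # size-biased p.m.f.

EY = expectation(p_Y)
EX2 = sum(x * x * p for x, p in p_X.items())
# EY equals E[X^2]/E[X] = 13/3 = 4 1/3
```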

Preview

X = number of people in a random household
Y = number of people in the household of a random person

We showed E[Y] = E[X²]/E[X]. Next time we'll show that for every random variable E[X²] ≥ (E[X])², so E[Y] ≥ E[X]. The two are equal only if all households have the same size.