Lecture 9: Discrete Probability (Section 5.3)


Lecture 9: Discrete Probability

5.3 Bayes' Theorem
We have seen that we can write one conditional probability in terms of the other:
P(F|E) = P(E|F) P(F) / P(E)   (Bayes' Theorem)

5.3 Example: What is the probability that a family with 2 kids has two boys, given that they have at least one boy? (All possibilities are equally likely.)
S: all possibilities: {BB, BG, GB, GG}. E: family has two boys: {BB}. F: family has at least one boy: {BB, BG, GB}. E ∩ F = {BB}.
p(E|F) = p(E ∩ F) / p(F) = (1/4) / (3/4) = 1/3.
Now we compute P(F|E): what is the probability that a family with two boys has at least one boy?
P(F|E) = P(E|F) P(F) / P(E) = (1/3)(3/4) / (1/4) = 1.
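The sample space above is small enough to verify by brute force; a quick sketch in Python (the original slides used MATLAB for demos):

```python
from fractions import Fraction

# Sample space for two children, all outcomes equally likely.
S = ["BB", "BG", "GB", "GG"]
E = {s for s in S if s == "BB"}    # two boys
F = {s for s in S if "B" in s}     # at least one boy

def p(event):
    return Fraction(len(event), len(S))

p_E_given_F = p(E & F) / p(F)              # 1/3
# Bayes' theorem: P(F|E) = P(E|F) P(F) / P(E)
p_F_given_E = p_E_given_F * p(F) / p(E)    # 1
print(p_E_given_F, p_F_given_E)
```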

5.3 Expected Values
The definition of the expected value of a random variable X on a sample space S is:
E(X) = Σ_{s in S} p(s) X(s)
This is equivalent to:
E(X) = Σ_r r · p(X = r)
Example: What is the expected number of heads if we toss a fair coin n times? We know that the distribution for this experiment is the binomial distribution:
p(X = k) = C(n, k) p^k (1-p)^(n-k)

5.3 Therefore we need to compute:
E(X) = Σ_{k=0}^{n} k C(n, k) p^k (1-p)^(n-k) = np
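This sum can be checked numerically; a sketch in Python, assuming (for concreteness) n = 10 and n = 20 with a couple of values of p:

```python
from math import comb

def binomial_expectation(n, p):
    # E(X) = sum over k of k * C(n,k) * p^k * (1-p)^(n-k)
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

# Both agree with the closed form n*p.
print(binomial_expectation(10, 0.5))  # close to 5.0
print(binomial_expectation(20, 0.3))  # close to 6.0
```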

5.3 Expectations are linear.
Theorem: E(X1 + X2) = E(X1) + E(X2) and E(aX + b) = aE(X) + b.
Examples:
1) Expected value of the sum when a pair of dice is rolled. X1 = value of the first die, X2 = value of the second die:
E(X1 + X2) = E(X1) + E(X2) = 2 × (1+2+3+4+5+6)/6 = 7.
2) Expected number of heads when a fair coin is tossed n times (see example on the previous slide). Xi is the outcome of coin toss i; each comes up heads with probability p. By linearity:
E(X1 + ... + Xn) = E(X1) + ... + E(Xn) = np.
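The dice example can be confirmed by enumerating all 36 equally likely outcomes; a sketch in Python:

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely (die1, die2) pairs.
outcomes = list(product(range(1, 7), repeat=2))

E_sum = Fraction(sum(a + b for a, b in outcomes), len(outcomes))
E_die = Fraction(sum(range(1, 7)), 6)   # expectation of a single die: 7/2

print(E_sum, 2 * E_die)   # both equal 7, confirming E(X1+X2) = E(X1)+E(X2)
```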

5.3 More examples: A person checking coats mixed the labels up randomly. When someone collects his coat, he is handed a coat chosen at random from the remaining coats. What is the expected number of correctly returned coats? There are n coats checked in.
Xi = 1 if coat i is correctly returned, and 0 if wrongly returned. Since the labels are randomly permuted, E(Xi) = 1/n, so by linearity
E(X1 + ... + Xn) = n × 1/n = 1 (independent of the number of checked-in coats).
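A seeded Monte Carlo simulation makes the surprising n-independence visible: the average number of matches stays near 1 whether there are 5 coats or 50. This is an illustrative sketch, not part of the original slides:

```python
import random

def average_matches(n, trials=20000, seed=0):
    """Average number of fixed points of a random permutation of n coats."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)   # random relabeling of the coats
        total += sum(perm[i] == i for i in range(n))  # correctly returned
    return total / trials

print(average_matches(5), average_matches(50))  # both close to 1
```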

5.3 Geometric distribution
Q: What is the distribution of waiting times until a tail comes up, when we repeatedly toss a coin that shows a tail with probability p?
A: Possible outcomes: T, HT, HHT, HHHT, HHHHT, ... (infinitely many possibilities).
P(T) = p, P(HT) = (1-p)p, P(HHT) = (1-p)^2 p, ... : the geometric distribution.
Normalization: Σ_{k=1}^∞ (1-p)^(k-1) p = 1.
X(s) = number of tosses up to and including the first tail, so p(X = k) = (1-p)^(k-1) p.

5.3 Geometric Distr. Here is how you can compute the expected value of the waiting time:
E(X) = Σ_{k=1}^∞ k (1-p)^(k-1) p = 1/p.
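Both the normalization and E(X) = 1/p can be checked by truncating the infinite sums at a large k; a sketch in Python, assuming p = 1/2 (the fair coin case):

```python
def geometric_sums(p, kmax=10000):
    # p(X = k) = (1-p)^(k-1) * p for k = 1, 2, 3, ...
    total = sum((1 - p)**(k - 1) * p for k in range(1, kmax + 1))
    expect = sum(k * (1 - p)**(k - 1) * p for k in range(1, kmax + 1))
    return total, expect

total, expect = geometric_sums(0.5)
print(total, expect)   # close to 1 (normalization) and 1/p = 2
```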

5.3 Independence
Definition: Two random variables X(s) and Y(s) on a sample space S are independent if for all r1, r2:
p(X = r1 and Y = r2) = p(X = r1) p(Y = r2).
Examples:
1) A pair of dice is rolled. X1 is the value of the first die, X2 the value of the second die. Are these independent?
P(X1 = r1) = 1/6, P(X2 = r2) = 1/6, and P(X1 = r1 and X2 = r2) = 1/36 = P(X1 = r1) P(X2 = r2): YES, independent.
2) Are X1 and X = X1 + X2 independent?
P(X = 12) = 1/36 and P(X1 = 1) = 1/6, but P(X = 12 and X1 = 1) = 0, which is not the product P(X = 12) P(X1 = 1): NO, not independent.
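Both checks can be done by enumerating the 36 outcomes; a sketch in Python:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))   # all 36 dice pairs

def prob(pred):
    return Fraction(sum(1 for o in outcomes if pred(o)), len(outcomes))

# 1) X1 and X2 are independent (spot check at r1=1, r2=2):
assert prob(lambda o: o[0] == 1 and o[1] == 2) == \
       prob(lambda o: o[0] == 1) * prob(lambda o: o[1] == 2)

# 2) X1 and X = X1 + X2 are not independent:
p_joint = prob(lambda o: sum(o) == 12 and o[0] == 1)               # 0
p_prod = prob(lambda o: sum(o) == 12) * prob(lambda o: o[0] == 1)  # 1/216
print(p_joint, p_prod)
```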

5.3 Independence
Theorem: If two random variables X and Y are independent over a sample space S, then E(XY) = E(X) E(Y). (For the proof, read the book.)
Note 1: The converse is not true: two random variables do not have to be independent for E(XY) = E(X)E(Y) to hold.
Note 2: If two random variables are not independent, E(XY) need not equal E(X)E(Y), although it still might.
Example: X counts the number of heads when a fair coin is tossed twice: P(X = 0) = 1/4 (TT), P(X = 1) = 1/2 (HT, TH), P(X = 2) = 1/4 (HH), so E(X) = 1 × 1/2 + 2 × 1/4 = 1. Y counts the number of tails: E(Y) = 1 as well (by symmetry, switch the roles of H and T). However, P(XY = 0) = 1/2 (HH, TT) and P(XY = 1) = 1/2 (HT, TH), so E(XY) = 0 × 1/2 + 1 × 1/2 = 1/2, which differs from E(X)E(Y) = 1.
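Enumerating the four outcomes HH, HT, TH, TT verifies the counterexample; a sketch in Python:

```python
from itertools import product
from fractions import Fraction

outcomes = ["".join(t) for t in product("HT", repeat=2)]  # HH, HT, TH, TT
q = Fraction(1, 4)   # each outcome is equally likely

E_X = sum(q * s.count("H") for s in outcomes)                 # heads
E_Y = sum(q * s.count("T") for s in outcomes)                 # tails
E_XY = sum(q * s.count("H") * s.count("T") for s in outcomes)

print(E_X, E_Y, E_XY)   # 1, 1, 1/2: so E(XY) != E(X)E(Y) here
```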

5.3 Variance
The average of a random variable tells us nothing about the spread of a probability distribution. Thus we introduce the variance of a probability distribution.
Definition: The variance of a random variable X over a sample space S is given by:
V(X) = Σ_{s in S} p(s) (X(s) - E(X))^2 = E((X - E(X))^2) = E(X^2) - E(X)^2.
The standard deviation σ(X) = sqrt(V(X)) is the width of the distribution.

5.3 Variance
Theorem: For independent random variables the variances add: V(X + Y) = V(X) + V(Y). (Proof in the book.)
Example: We toss 2 fair coins, with Xi(H) = 1 and Xi(T) = 0. What is the standard deviation of X = X1 + X2?
X1 and X2 are independent, so V(X1 + X2) = V(X1) + V(X2) = 2V(X1).
E(X1) = 1/2 and V(X1) = (0 - 1/2)^2 × 1/2 + (1 - 1/2)^2 × 1/2 = 1/4.
So V(X) = 1/2 and STD(X) = sqrt(1/2).

5.3 Variance
What is the variance of the number of successes when n independent Bernoulli trials are performed, each with success probability p?
V(X) = V(X1 + ... + Xn) = nV(X1).
V(X1) = (0 - p)^2 × (1 - p) + (1 - p)^2 × p = p^2(1 - p) + p(1 - p)^2 = p(1 - p).
Therefore V(X) = np(1 - p).
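The formula V(X) = np(1-p) can be checked against a direct computation over the binomial pmf; a sketch in Python, assuming n = 10 and p = 0.3:

```python
from math import comb

def binomial_variance(n, p):
    # pmf of the binomial distribution: P(X = k) = C(n,k) p^k (1-p)^(n-k)
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    mean = sum(k * pk for k, pk in enumerate(pmf))
    # V(X) = sum over k of (k - E(X))^2 * P(X = k)
    return sum((k - mean)**2 * pk for k, pk in enumerate(pmf))

print(binomial_variance(10, 0.3))  # close to n*p*(1-p) = 2.1
```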