C4: DISCRETE RANDOM VARIABLES. CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics (2007). Longin Jan Latecki.


Discrete Random Variables Discrete random variables (RVs) arise from counting and have countable sample spaces; the values that represent the outcomes are usually integers. Random variables are denoted by capital letters. For example, a RV X may be the number of times we flip a coin until heads comes up. The possible values are denoted by lower-case letters: a = 1, a = 2, a = 3, ...

Example from Section 4.1 Let the RV S be the sum of two independent throws of a die. The sample space is Ω = {(ω1, ω2) : ω1, ω2 ∈ {1, 2, ..., 6}} = {(1, 1), (1, 2), ..., (1, 6), (2, 1), ..., (6, 5), (6, 6)}. Hence the RV S is the function S : Ω → R given by S(ω1, ω2) = ω1 + ω2 for (ω1, ω2) ∈ Ω. Let {S = k} = {(ω1, ω2) ∈ Ω : S(ω1, ω2) = k}. We denote the probability of the event {S = k} by P(S = k), although formally we should write P({S = k}) instead of P(S = k). In our example, S attains only the values k = 2, 3, ..., 12 with positive probability. For example, P(S = 2) = P({(1, 1)}) = 1/36 and P(S = 3) = P({(1, 2), (2, 1)}) = 2/36, while P(S = 13) = P(∅) = 0, because 13 is an impossible outcome. Another example of a RV is the function M : Ω → R given by M(ω1, ω2) = max{ω1, ω2} for (ω1, ω2) ∈ Ω.
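To make the mapping S : Ω → R concrete, here is a short Python sketch (our own illustration, not part of the slides) that enumerates the 36 equally likely outcomes and tabulates the pmfs of both S and M:

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # all 36 outcomes (w1, w2)

def pmf(rv):
    """Tabulate P(rv = k) over the 36 equally likely outcomes."""
    counts = {}
    for w in omega:
        k = rv(w)
        counts[k] = counts.get(k, 0) + 1
    return {k: Fraction(c, 36) for k, c in sorted(counts.items())}

S = lambda w: w[0] + w[1]  # sum of the two throws
M = lambda w: max(w)       # maximum of the two throws

print(pmf(S))  # P(S=2) = 1/36, P(S=3) = 2/36, ..., P(S=12) = 1/36
print(pmf(M))  # P(M=k) = (2k-1)/36 for k = 1, ..., 6
```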

Probability Mass Function The probability mass function (pmf) of a discrete random variable X maps each possible value a to its probability: p(a) = P(X = a). As an example we give the pmf p of M from the previous slide; since both throws are at most k exactly when M ≤ k, we get p(k) = P(M = k) = (k^2 − (k−1)^2)/36 = (2k − 1)/36:

a: 1 2 3 4 5 6
p(a): 1/36 3/36 5/36 7/36 9/36 11/36

The sum of the probabilities of all possible values always equals 1.

Example 3.1 (Baron). Consider an experiment of tossing 3 fair coins and counting the number of heads. Let X be the number of heads. Prior to the experiment, its value is not known; all we can say is that X has to be an integer between 0 and 3. We can compute the probabilities: P(X = 0) = P(no heads) = (1/2)^3 = 1/8, P(X = 3) = P(all heads) = 1/8, and P(X = 1) = P(X = 2) = 3 · (1/8) = 3/8, since a single head (or a single tail) can appear in any of the 3 positions. Hence X is a discrete RV with the pmf p(0) = 1/8, p(1) = 3/8, p(2) = 3/8, p(3) = 1/8. We know that X, as a RV, is a function X : Ω → R. What is Ω here?
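A quick way to answer that question, and to verify the pmf, is to enumerate the sample space directly. This Python sketch (our own, not from Baron) represents each outcome as a tuple over {H, T}:

```python
from fractions import Fraction
from itertools import product

omega = list(product("HT", repeat=3))  # {HHH, HHT, ..., TTT}, 8 outcomes
X = lambda w: w.count("H")             # number of heads in the outcome

pmf = {k: Fraction(sum(1 for w in omega if X(w) == k), 8) for k in range(4)}
print(pmf)  # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
```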

Probability Distribution Function The distribution function of a random variable X, also referred to as the cumulative distribution function (CDF), yields the probability that X takes a value less than or equal to a: F(a) = P(X ≤ a). Hence the value of F(a) equals the sum of the probabilities of all possible values less than or equal to a: F(a) = Σ_{ai ≤ a} p(ai). Conversely, if we are given a CDF, we can recover the pmf with the formula p(a) = F(a) − F(a − ε) for some sufficiently small ε > 0.
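Both directions of this relationship are easy to exercise in code. A minimal Python sketch, using the pmf of X from Example 3.1 (the function names are our own, not from the slides):

```python
# pmf of X from Example 3.1 (tossing 3 fair coins)
pmf = {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}

def cdf(a):
    """F(a) = P(X <= a): sum of p(x) over all possible values x <= a."""
    return sum(p for x, p in pmf.items() if x <= a)

# Recover the pmf from the CDF: p(a) = F(a) - F(a - eps)
eps = 0.5  # sufficiently small: no possible value lies in (a - eps, a)
recovered = {a: cdf(a) - cdf(a - eps) for a in pmf}

print(cdf(1.7))   # 0.5, since F jumps only at the integer values 0..3
print(recovered)  # matches the original pmf
```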

Graphs of pmf and CDF [Figure: the probability mass function (left) and the cumulative distribution function (right) of the same discrete random variable.]

Example 3.3 (Baron) (Errors in independent modules). A program consists of two modules. The number of errors X1 in the first module has the pmf P1(x), and the number of errors X2 in the second module has the pmf P2(x), independently of X1, with both pmfs given by a table. Find the pmf and CDF of Y = X1 + X2, the total number of errors. Solution. We break the problem into steps: first determine all possible values of Y, then compute the probability of each value. Clearly, the number of errors Y is an integer that can be as low as 0 + 0 = 0 and as high as 3 + 2 = 5, since P2(3) = 0 means the second module has at most 2 errors. [Table: the resulting pmf and CDF of Y; as a check, the pmf entries sum to 1.]
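The second step of the solution is a discrete convolution. The Python sketch below shows that method; since the slide's table did not survive the transcript, the numeric values of P1 and P2 are hypothetical placeholders (only P2(3) = 0 is taken from the text):

```python
# Hypothetical tables for P1 and P2 (the slide's table is missing);
# P2(3) = 0 matches the remark that the second module has at most 2 errors.
P1 = {0: 0.5, 1: 0.3, 2: 0.1, 3: 0.1}
P2 = {0: 0.7, 1: 0.2, 2: 0.1, 3: 0.0}

# By independence: P(Y = y) = sum over x of P1(x) * P2(y - x)
PY = {}
for x1, p1 in P1.items():
    for x2, p2 in P2.items():
        PY[x1 + x2] = PY.get(x1 + x2, 0.0) + p1 * p2

# CDF of Y by accumulating the pmf; it must end at 1
FY, total = {}, 0.0
for y in sorted(PY):
    total += PY[y]
    FY[y] = total

print(PY)  # pmf of Y = X1 + X2
print(FY)  # CDF of Y; FY[5] == 1.0 up to rounding
```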

Bernoulli Distribution The Bernoulli distribution models an experiment with only two outcomes, success and failure. The parameter p is the probability of success: if X has a Ber(p) distribution, then P(X = 1) = p and P(X = 0) = 1 − p. An example is flipping a coin, where "heads" may be success and "tails" may be failure.

Binomial Distribution The binomial distribution describes the total number of successes in multiple independent Bernoulli trials. The parameter n is the number of trials, and the parameter p is the probability of success in each trial, as in the Bernoulli distribution. If X has a Bin(n, p) distribution, then P(X = k) = C(n, k) p^k (1 − p)^(n−k) is the probability of exactly k successful outcomes in n trials, for k = 0, 1, ..., n.
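As a sketch (not part of the slides), the Bin(10, 1/4) pmf pictured on the next slide can be computed directly from this formula:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) = C(n, k) * p**k * (1 - p)**(n - k) for X ~ Bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# pmf of the Bin(10, 1/4) distribution from the next slide
for k in range(11):
    print(k, round(binom_pmf(k, 10, 0.25), 4))

print(sum(binom_pmf(k, 10, 0.25) for k in range(11)))  # sums to 1.0
```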

[Figure: probability mass function and distribution function of the Bin(10, 1/4) distribution. See Section 4.3.]

Geometric Distribution A geometric distribution describes the number of attempts needed until the first success. The parameter p is the probability of success on each try; the event that the first success occurs on the kth try means that all previous k − 1 tries failed, so P(X = k) = (1 − p)^(k−1) p. An example would be finding the probability that you hit a bullseye with a dart for the first time on your kth toss.

We had a geometric distribution in Ch. 2. Example: if we flip a coin until it lands on heads, the random variable X of the experiment is the number of times the coin needs to be flipped until heads comes up. If the chance of landing on heads is p, the chance of landing on tails is 1 − p. Therefore P(X = 1) = p, P(X = 2) = (1 − p)^1 p, P(X = 3) = (1 − p)^2 p, and in general P(X = n) = (1 − p)^(n−1) p. [Figure: the pmf and CDF for p = 1/4, i.e., of the Geo(1/4) distribution.]
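As a final sketch (again not part of the slides), the derivation can be checked by simulation, comparing empirical frequencies against the Geo(1/4) pmf:

```python
import random

def flips_until_heads(p=0.25):
    """Flip a coin with P(heads) = p until heads; return the flip count."""
    n = 1
    while random.random() >= p:  # each failure has probability 1 - p
        n += 1
    return n

trials = 100_000
counts = {}
for _ in range(trials):
    n = flips_until_heads()
    counts[n] = counts.get(n, 0) + 1

for n in range(1, 6):
    empirical = counts.get(n, 0) / trials
    theoretical = (1 - 0.25) ** (n - 1) * 0.25  # P(X = n) = (1-p)^(n-1) p
    print(n, round(empirical, 4), round(theoretical, 4))
```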