Random Variables

A random variable X is a real-valued function defined on the sample space, X : S → R. For every interval [a, b], the set { s ∈ S : X(s) ∈ [a, b] } is an event.

Example 1. Let S be the sample space of an experiment consisting of tossing two fair dice. Then S = {(1, 1), (1, 2), ..., (6, 6)}. Let X be a random variable defined on S that assigns to each outcome the sum of the dice. Then X((1, 1)) = 2, X((1, 2)) = 3, ..., X((6, 6)) = 12.

Usually, we specify the distribution of a random variable without reference to the probability space.

If X denotes the random variable defined as the sum of two fair dice, then
P{X = 2} = P{(1, 1)} = 1/36
P{X = 3} = P{(1, 2), (2, 1)} = 2/36
P{X = 4} = P{(1, 3), (3, 1), (2, 2)} = 3/36
P{X = 5} = P{(1, 4), (4, 1), (2, 3), (3, 2)} = 4/36
P{X = 6} = P{(1, 5), (5, 1), (2, 4), (4, 2), (3, 3)} = 5/36
P{X = 7} = P{(1, 6), (6, 1), (2, 5), (5, 2), (3, 4), (4, 3)} = 6/36
P{X = 8} = P{(2, 6), (6, 2), (3, 5), (5, 3), (4, 4)} = 5/36
P{X = 9} = P{(3, 6), (6, 3), (4, 5), (5, 4)} = 4/36
P{X = 10} = P{(4, 6), (6, 4), (5, 5)} = 3/36
P{X = 11} = P{(5, 6), (6, 5)} = 2/36
P{X = 12} = P{(6, 6)} = 1/36

The random variable X takes on the values n = 2, ..., 12. Since the events corresponding to different values are mutually exclusive, P(X = 2) + P(X = 3) + ... + P(X = 12) = 1.
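As a quick check (not part of the slides), the table above can be reproduced by enumerating the 36 equally likely outcomes; the following minimal Python sketch is purely illustrative, and the names in it are not from the lecture.

    from fractions import Fraction
    from collections import Counter

    # Sample space: the 36 equally likely outcomes of tossing two fair dice.
    outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

    # The random variable X maps each outcome to the sum of the dice.
    counts = Counter(i + j for i, j in outcomes)

    # pmf: P(X = n) = (number of outcomes whose sum is n) / 36
    pmf = {n: Fraction(c, 36) for n, c in sorted(counts.items())}
    print(pmf)                     # {2: Fraction(1, 36), 3: Fraction(1, 18), ..., 12: Fraction(1, 36)}
    print(sum(pmf.values()) == 1)  # True: the probabilities sum to 1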

Example 2. Let S be the sample space of an experiment consisting of tossing two fair coins. Then S = {(H, H), (H, T), (T, H), (T, T)}. Let Y be a random variable defined on S that assigns to each outcome the number of heads. Then Y takes on the values 0, 1, 2 with
P(Y = 0) = 1/4
P(Y = 1) = 2/4
P(Y = 2) = 1/4
and P(Y = 0) + P(Y = 1) + P(Y = 2) = 1.

Example 3. A die is repeatedly tossed until a six appears. Let X denote the number of tosses required, assuming successive tosses are independent. The random variable X takes on the values 1, 2, ... with respective probabilities
P(X = 1) = 1/6
P(X = 2) = (5/6)(1/6)
P(X = 3) = (5/6)^2 (1/6)
P(X = 4) = (5/6)^3 (1/6)
...
P(X = n) = (5/6)^(n-1) (1/6)
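A short Monte Carlo sketch (illustrative, not from the slides) that estimates these probabilities by simulation and compares them with the formula above; the function name is hypothetical.

    import random

    def tosses_until_six():
        # Toss a fair die until a six appears; return the number of tosses.
        n = 0
        while True:
            n += 1
            if random.randint(1, 6) == 6:
                return n

    trials = 100_000
    samples = [tosses_until_six() for _ in range(trials)]
    for n in range(1, 5):
        empirical = samples.count(n) / trials
        exact = (5 / 6) ** (n - 1) * (1 / 6)
        print(n, round(empirical, 4), round(exact, 4))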

Distribution functions. The distribution function F (also called the cumulative distribution function, cdf) of a random variable X is defined by F(x) = P(X ≤ x), where x is a real number. (Upper case X denotes the random variable; lower case x denotes a real number.)

Properties of distribution functions. F is nondecreasing; F(x) → 0 as x → −∞ and F(x) → 1 as x → ∞; F is right-continuous; and for a < b, P(a < X ≤ b) = F(b) − F(a).

Note that P ( X < x ) does not necessarily equal F ( x ) since F ( x ) includes the probability that X equals x.

Discrete random variables. A discrete random variable is a random variable that takes on only a finite or countably infinite number of values.

Probability mass function. The probability mass function (pmf) of a discrete random variable is defined by p(x) = P(X = x). If X takes on the values x1, x2, ..., then p(xi) ≥ 0 for every i and p(x1) + p(x2) + ... = 1. The cumulative distribution function F is then given by F(x) = Σ p(xi), where the sum runs over all xi ≤ x.

Example. Let X be a random variable with pmf p(2) = 0.25, p(4) = 0.6, and p(6) = 0.15 (the probabilities must sum to 1). Then the cdf F of X is given by
F(x) = 0 for x < 2
F(x) = 0.25 for 2 ≤ x < 4
F(x) = 0.85 for 4 ≤ x < 6
F(x) = 1 for x ≥ 6.
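The same cdf can be built mechanically from the pmf by accumulating probabilities, as in this small Python sketch (assuming, as above, that p(6) = 0.15 so the pmf sums to 1):

    from fractions import Fraction

    # pmf of the example: p(2) = 0.25, p(4) = 0.6, p(6) = 0.15
    pmf = {2: Fraction(1, 4), 4: Fraction(3, 5), 6: Fraction(3, 20)}

    def cdf(x):
        # F(x) = P(X <= x) = sum of p(x_i) over all x_i <= x
        return sum((p for xi, p in pmf.items() if xi <= x), Fraction(0))

    print(cdf(1.9), cdf(2), cdf(5), cdf(6))   # 0 1/4 17/20 1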

The Bernoulli random variable. Let X be a random variable that takes on the values 1 (success) or 0 (failure). Then the pmf of X is given by p(0) = P(X = 0) = 1 − p and p(1) = P(X = 1) = p, where 0 ≤ p ≤ 1 is the probability of “success.” A random variable with this pmf is said to be a Bernoulli random variable.
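As an aside (not on the slides), a Bernoulli(p) variable can be simulated as the indicator of an event of probability p; the sketch below is illustrative only.

    import random

    def bernoulli(p):
        # Returns 1 ("success") with probability p and 0 ("failure") otherwise.
        return 1 if random.random() < p else 0

    p = 0.3
    samples = [bernoulli(p) for _ in range(100_000)]
    print(sum(samples) / len(samples))   # empirical estimate of P(X = 1), close to 0.3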

The Geometric random variable. A random variable that has the following pmf is said to be a geometric random variable with parameter p:
p(n) = P(X = n) = (1 − p)^(n−1) p, for n = 1, 2, ....
Example: A series of independent trials, each having probability p of being a success, is performed until a success occurs. Let X be the number of trials required until the first success; then X is geometric with parameter p. Example 3 above (tossing a die until a six appears) is the case p = 1/6.
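As a consistency check (not from the slides), the geometric pmf sums to 1 over n = 1, 2, ...; the truncated numerical sum below, with p = 1/6 as in Example 3, illustrates this.

    # Geometric pmf: P(X = n) = (1 - p)**(n - 1) * p, n = 1, 2, ...
    p = 1 / 6
    partial_sum = sum((1 - p) ** (n - 1) * p for n in range(1, 500))
    print(partial_sum)   # very close to 1.0 (the full geometric series sums to exactly 1)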

The Binomial random variable. A random variable that has the following pmf is said to be a binomial random variable with parameters (n, p):
p(k) = P(X = k) = C(n, k) p^k (1 − p)^(n−k), for k = 0, 1, ..., n,
where C(n, k) = n!/(k!(n − k)!) is the binomial coefficient.

Example: A series of n independent trials, each having probability p of being a success and 1 − p of being a failure, is performed. Let X be the number of successes in the n trials; then X is binomial with parameters (n, p).
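A minimal Python sketch of the binomial pmf (assuming the standard formula above; the function name is illustrative):

    from math import comb

    def binomial_pmf(k, n, p):
        # P(X = k) = C(n, k) * p**k * (1 - p)**(n - k), for k = 0, 1, ..., n
        return comb(n, k) * p ** k * (1 - p) ** (n - k)

    # Example: number of heads in n = 4 tosses of a fair coin (p = 0.5)
    print([binomial_pmf(k, 4, 0.5) for k in range(5)])      # [0.0625, 0.25, 0.375, 0.25, 0.0625]
    print(sum(binomial_pmf(k, 4, 0.5) for k in range(5)))   # 1.0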

The Poisson random variable. A random variable that has the following pmf is said to be a Poisson random variable with parameter λ > 0:
p(n) = P(X = n) = e^(−λ) λ^n / n!, for n = 0, 1, 2, ....

Example: The number of cars sold per day by a dealer is Poisson with parameter λ = 2. What is the probability of selling no cars today? What is the probability of selling exactly two cars?
Solution: P(X = 0) = e^(−2) ≈ 0.135 and P(X = 2) = e^(−2)(2^2/2!) ≈ 0.271.
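The two probabilities in the example can be checked numerically with a short sketch (assuming the Poisson pmf given above; names are illustrative):

    from math import exp, factorial

    def poisson_pmf(n, lam):
        # P(X = n) = exp(-lam) * lam**n / n!
        return exp(-lam) * lam ** n / factorial(n)

    print(poisson_pmf(0, 2))   # ≈ 0.1353  (no cars sold)
    print(poisson_pmf(2, 2))   # ≈ 0.2707  (exactly two cars sold)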

Continuous random variables. A continuous random variable is a random variable whose set of possible values is uncountable. In particular, we say that