Chapter 3-2 Discrete Random Variables


Chapter 3-2 Discrete Random Variables. Lecturer: 虞台文

Content
- Functions of a Single Discrete Random Variable
- Discrete Random Vectors
- Independent Random Variables
- Multinomial Distributions
- Sums of Independent Variables: Generating Functions
- Functions of Multiple Random Variables

Chapter 3-2 Discrete Random Variables Functions of a Single Discrete Random Variable

The taxi driver's question: after this passenger gets in, how many kilometers will the ride be (X)? X is a random variable.

How much money will I get out of his pocket (Y)? A function of a random variable is itself a random variable: Y = g(X), so Y is also a random variable.

Given Y = g(X), if pX(x) is known, what is pY(y)?

The Problem: Y = g(X) and pX(x) is available; find pY(y).
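As a minimal sketch of this idea (the function g, the pmf values, and the names below are illustrative assumptions, not from the slides): pY(y) is obtained by summing pX(x) over the preimage {x : g(x) = y}.

```python
from collections import defaultdict

def pmf_of_function(p_X, g):
    """Given a discrete pmf p_X as {x: P(X = x)} and a function g,
    return the pmf of Y = g(X) by summing P(X = x) over {x : g(x) = y}."""
    p_Y = defaultdict(float)
    for x, prob in p_X.items():
        p_Y[g(x)] += prob
    return dict(p_Y)

# Illustrative pmf and function (not from the slides):
p_X = {-1: 0.2, 0: 0.3, 1: 0.5}
p_Y = pmf_of_function(p_X, lambda x: x * x)   # Y = X^2
print(p_Y)                                    # {1: 0.7, 0: 0.3}
```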

Example 17: "This bottle costs $10." "This bottle is only $5." "What luck!"

Example 17

Example 18: n = 10, p = 0.2. Pay $100; how many bottles (X3) are obtained? Let Y (X3) denote the number of lucky bottles obtained.
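The slide's computation is not recoverable; as a sketch under the stated parameters (the interpretation of Y as a B(10, 0.2) count of lucky bottles follows the slide, the printed values are my own evaluation):

```python
from math import comb

n, p = 10, 0.2                          # parameters stated on the slide

def binom_pmf(k, n, p):
    """P(Y = k) for Y ~ B(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

for k in range(n + 1):
    print(k, round(binom_pmf(k, n, p), 4))
```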

Chapter 3-2 Discrete Random Variables Discrete Random Vectors

Definition (Random Vectors): A discrete r-dimensional random vector X is a function X: Ω → R^r with a finite or countably infinite image {x1, x2, …}.

Example 19 (illustrated by a figure on the original slides).

Definition (Joint Pmf): Let X = (X1, X2, …, Xr) be a random vector. The joint pmf (jpmf) of X is defined as pX(x) = P(X1 = x1, X2 = x2, …, Xr = xr), where x = (x1, x2, …, xr).

Example 20: There are three cards numbered 1, 2, and 3. Randomly draw two of them without replacement. Let X and Y be the numbers on the 1st and 2nd cards, respectively. Find the jpmf of X and Y.
Since the six ordered pairs (x, y) with x ≠ y are equally likely, pX,Y(x, y) = 1/6 for x ≠ y and 0 for x = y.

Properties of jpmf's: (1) p(x) ≥ 0 for all x ∈ R^r; (2) {x | p(x) ≠ 0} is a finite or countably infinite subset of R^r; (3) Σx p(x) = 1.

Definition (Marginal Probability Mass Functions): Let X = (X1, …, Xi, …, Xr) be an r-dimensional random vector. The ith marginal probability mass function is defined by pXi(xi) = Σ pX(x1, …, xr), where the sum runs over all coordinates other than xi.

Example 21: Find pX(x) and pY(y) for Example 20.
Summing the jpmf over the other variable gives pX(x) = 1/3 for x = 1, 2, 3, and likewise pY(y) = 1/3 for y = 1, 2, 3.
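A small sketch of Examples 20 and 21 (variable names mine): enumerate the equally likely ordered draws, build the joint pmf, and sum out each coordinate to get the marginals.

```python
from itertools import permutations
from collections import defaultdict

# Example 20: draw two of the cards {1, 2, 3} without replacement.
joint = defaultdict(float)
for x, y in permutations([1, 2, 3], 2):   # 6 equally likely ordered pairs
    joint[(x, y)] += 1 / 6

# Example 21: marginals are obtained by summing the jpmf over the other variable.
p_X = defaultdict(float)
p_Y = defaultdict(float)
for (x, y), prob in joint.items():
    p_X[x] += prob
    p_Y[y] += prob

print(dict(p_X))   # each of 1, 2, 3 has probability 1/3
print(dict(p_Y))   # same by symmetry
```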

Example 22 (X and Y count outcomes defined by a figure on the original slides): find pX,Y(x, y), pX(x), pY(y), P(X < 3), and P(X + Y < 4).

Chapter 3-2 Discrete Random Variables Independent Random Variables

Definition: Let X1, X2, …, Xr be r discrete random variables with pmf's pX1, pX2, …, pXr, respectively. These random variables are said to be mutually independent if their jpmf satisfies p(x1, x2, …, xr) = pX1(x1) pX2(x2) ⋯ pXr(xr) for all (x1, x2, …, xr).

Example 23: Toss two dice and let X, Y be the face values of the 1st and 2nd die, respectively. 1. pX,Y(x, y) = ? 2. Are X and Y independent?

Example 23 (solution): pX,Y(x, y) = 1/36 for all x, y ∈ {1, …, 6}, and pX(x) = pY(y) = 1/6; since 1/36 = (1/6)(1/6) for every pair, X and Y are independent.
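A sketch of this independence check (names mine): the joint pmf of the two dice is compared, entry by entry, with the product of the marginals.

```python
from itertools import product

# Joint pmf of two fair dice: 1/36 for every ordered pair.
joint = {(x, y): 1 / 36 for x, y in product(range(1, 7), repeat=2)}

p_X = {x: sum(joint[(x, y)] for y in range(1, 7)) for x in range(1, 7)}
p_Y = {y: sum(joint[(x, y)] for x in range(1, 7)) for y in range(1, 7)}

# X and Y are independent iff the jpmf equals the product of marginals everywhere.
independent = all(abs(joint[(x, y)] - p_X[x] * p_Y[y]) < 1e-12
                  for x, y in joint)
print(independent)   # True
```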

Fact: If X and Y are independent, then for any sets A and B, P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B).

Example 24: Consider Example 23. Find P(X ≤ 2, Y ≥ 4).
Since X and Y are independent, this factors as P(X ≤ 2) P(Y ≥ 4) = (2/6)(3/6) = 1/6.

Fact: for an integer-valued random variable, the cdf determines the pmf: pX(x) = FX(x) − FX(x − 1).

Chapter 3-2 Discrete Random Variables Multinomial Distributions

Generalized Bernoulli Trials: A sequence of n independent trials, each with r distinct outcomes having probabilities p1, p2, …, pr such that p1 + p2 + … + pr = 1.

Multinomial Distributions: Define X = (X1, X2, …, Xr) such that Xi is the number of trials that resulted in the ith outcome. Then X satisfies X1 + X2 + … + Xr = n, and its jpmf is
pX(x1, …, xr) = n! / (x1! x2! ⋯ xr!) · p1^x1 p2^x2 ⋯ pr^xr, for x1 + … + xr = n.

Example 26: If a pair of dice is tossed 6 times, what is the probability of obtaining a total of 7 or 11 twice, a matching pair once, and any other combination three times? Three outcomes: "7 or 11", "match", "others". Let X1 = number of 7's or 11's, X2 = number of matches, X3 = number of others.
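Here p1 = P(total is 7 or 11) = 8/36, p2 = P(matching pair) = 6/36, and p3 = 22/36. The sketch below (the numerical result is my own evaluation of the multinomial jpmf, not taken from the slide) plugs these into the formula above with counts (2, 1, 3).

```python
from fractions import Fraction
from math import comb

def multinomial_pmf(counts, probs):
    """Multinomial jpmf: n!/(x1! ... xr!) * p1^x1 * ... * pr^xr."""
    coef, total = 1, 0
    for x in counts:
        total += x
        coef *= comb(total, x)            # builds n!/(x1!...xr!) one factor at a time
    prob = Fraction(1)
    for x, q in zip(counts, probs):
        prob *= Fraction(q) ** x
    return coef * prob

# Example 26: p1 = 8/36, p2 = 6/36, p3 = 22/36; six tosses with counts (2, 1, 3).
probs = [Fraction(8, 36), Fraction(6, 36), Fraction(22, 36)]
answer = multinomial_pmf([2, 1, 3], probs)
print(answer, float(answer))              # about 0.1127
```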

Chapter 3-2 Discrete Random Variables Sums of Independent Variables: Generating Functions

The Sum of Independent Random Variables: If X and Y are independent, integer-valued random variables, then P(X + Y = z) = Σx pX(x) pY(z − x), the convolution of their pmf's.

Example 27: Let X and Y be two independent random variables, each uniformly distributed over {0, 1, 2, …, n}. Find P(X + Y = z).
Case 1: z ∈ {0, 1, …, n}: P(X + Y = z) = (z + 1)/(n + 1)^2.
Case 2: z ∈ {n + 1, n + 2, …, 2n}: P(X + Y = z) = (2n − z + 1)/(n + 1)^2.
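A sketch that checks Example 27 (the choice n = 4 and the names are mine): convolve the two uniform pmf's directly and compare with the case-by-case formula.

```python
from collections import defaultdict

n = 4                                        # illustrative choice of n
p = {k: 1 / (n + 1) for k in range(n + 1)}   # X, Y uniform on {0, ..., n}

# Convolution: P(X + Y = z) = sum over x of P(X = x) P(Y = z - x).
conv = defaultdict(float)
for x, px in p.items():
    for y, py in p.items():
        conv[x + y] += px * py

# Closed form from Example 27.
for z in range(2 * n + 1):
    closed = (z + 1 if z <= n else 2 * n - z + 1) / (n + 1) ** 2
    assert abs(conv[z] - closed) < 1e-12
print(dict(conv))
```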

Probability Generating Functions (pgf)

Definition (pgf): Let X be a nonnegative integer-valued random variable. Its probability generating function GX(t) is defined as
GX(t) = E[t^X] = Σ (x = 0 to ∞) pX(x) t^x.

Exercise: Compute the pgf's for the following distributions: 1. X ~ B(n, p); 2. Y ~ P(λ); 3. Z ~ G(p); 4. U ~ NB(r, p).

Important Generating Functions
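The table on this slide did not survive the transcript; the sketch below lists the standard closed forms (assuming the conventions that G(p) counts trials up to and including the first success and NB(r, p) counts trials up to the r-th success; if the course used the failures-counting convention the formulas differ) and checks each against a truncated numerical sum.

```python
import math

def pgf_numeric(pmf, t, kmax=100):
    """G_X(t) = E[t^X] = sum over k of P(X = k) * t^k, truncated at kmax."""
    return sum(pmf(k) * t**k for k in range(kmax + 1))

n, p, lam, r, t = 5, 0.3, 2.0, 3, 0.7   # illustrative parameters

cases = {
    # Binomial B(n, p): G(t) = (1 - p + p t)^n
    "binomial": (lambda k: math.comb(n, k) * p**k * (1 - p)**(n - k) if k <= n else 0.0,
                 (1 - p + p * t) ** n),
    # Poisson P(lambda): G(t) = exp(lambda (t - 1))
    "poisson": (lambda k: math.exp(-lam) * lam**k / math.factorial(k),
                math.exp(lam * (t - 1))),
    # Geometric G(p) on {1, 2, ...}: G(t) = p t / (1 - (1 - p) t)
    "geometric": (lambda k: p * (1 - p) ** (k - 1) if k >= 1 else 0.0,
                  p * t / (1 - (1 - p) * t)),
    # Negative binomial NB(r, p), trials up to the r-th success:
    # G(t) = (p t / (1 - (1 - p) t))^r
    "negative binomial": (lambda k: math.comb(k - 1, r - 1) * p**r * (1 - p)**(k - r) if k >= r else 0.0,
                          (p * t / (1 - (1 - p) * t)) ** r),
}

for name, (pmf, closed_form) in cases.items():
    print(name, round(pgf_numeric(pmf, t), 6), round(closed_form, 6))
```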

Theorem 2 (Sums of Independent Random Variables): Let X and Y be two independent, nonnegative integer-valued random variables. Then GX+Y(t) = GX(t) GY(t).

Pf) Let Z = X + Y. Then GZ(t) = E[t^(X+Y)] = E[t^X t^Y] = E[t^X] E[t^Y] = GX(t) GY(t), where the third equality uses the independence of X and Y.

Fact: more generally, if X1, X2, …, Xn are mutually independent, nonnegative integer-valued random variables, then GX1+X2+…+Xn(t) = GX1(t) GX2(t) ⋯ GXn(t).

Example 29: Use pgf's to recompute Example 27 (X, Y independent, each uniformly distributed over {0, 1, 2, …, n}; find P(X + Y = z)).
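A sketch of the idea behind Example 29 (n = 4 is an illustrative choice): represent each pgf by its coefficient list and multiply the polynomials; by Theorem 2 the coefficient of t^z in the product is P(X + Y = z).

```python
n = 4                                    # illustrative choice of n
coeffs = [1 / (n + 1)] * (n + 1)         # pgf of X (and of Y): coefficient of t^k is P(X = k)

# Theorem 2: G_{X+Y}(t) = G_X(t) G_Y(t).  Multiplying the two polynomials
# convolves their coefficient lists, so the coefficient of t^z is P(X + Y = z).
prod = [0.0] * (2 * n + 1)
for i, a in enumerate(coeffs):
    for j, b in enumerate(coeffs):
        prod[i + j] += a * b

for z, pz in enumerate(prod):
    print(z, round(pz, 4))   # matches (z+1)/(n+1)^2 for z <= n, (2n-z+1)/(n+1)^2 otherwise
```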

Theorem 3 (for each expression in the theorem, ask: what does it represent?)
Memorize these results!!! Apply them flexibly when solving problems.
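The formulas of Theorem 3 did not survive the transcript; assuming it states the standard relations GX(1) = 1, GX'(1) = E[X], GX''(1) = E[X(X − 1)], and Var(X) = GX''(1) + GX'(1) − GX'(1)^2, the sketch below (names and the Poisson test case are mine) checks them numerically.

```python
import math

def pgf(pmf, t):
    """Evaluate G_X(t) = sum over k of P(X = k) * t^k."""
    return sum(prob * t**k for k, prob in pmf.items())

def derivative(f, t, h=1e-4):
    """Central finite-difference approximation of f'(t)."""
    return (f(t + h) - f(t - h)) / (2 * h)

# Illustrative test case: X ~ Poisson(2), pmf truncated far into the tail.
lam = 2.0
pmf = {k: math.exp(-lam) * lam**k / math.factorial(k) for k in range(60)}

G = lambda t: pgf(pmf, t)
g1 = derivative(G, 1.0)                              # G'(1)  = E[X]
g2 = derivative(lambda t: derivative(G, t), 1.0)     # G''(1) = E[X(X-1)]

print(round(G(1.0), 6))              # total probability: 1.0
print(round(g1, 3))                  # E[X]   = lam = 2
print(round(g2 + g1 - g1**2, 3))     # Var(X) = lam = 2
```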

Chapter 3-2 Discrete Random Variables Functions of Multiple Random Variables

Functions of Multiple Random Variables: Let X, Y be two random variables with jpmf pX,Y(x, y), and let (U, V) be a 1-1 (hence invertible) function of (X, Y). What is pU,V(u, v)?
Since the map is invertible, pU,V(u, v) = pX,Y(x, y), where (x, y) is the unique preimage of (u, v).
Example: pX,Y(x, y) is known (X in $/month, Y in $/month); find pU,V(u, v).

Example 30: Let X ~ B(n, p1) and Y ~ B(m, p2) be two independent random variables, and let U = X + Y and V = X − Y. Find pU,V(u, v).
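A sketch of Example 30 (the parameter values are illustrative): the map (x, y) → (u, v) = (x + y, x − y) is one-to-one with inverse x = (u + v)/2, y = (u − v)/2, so pU,V(u, v) = pX((u + v)/2) pY((u − v)/2) whenever these preimages are valid integers, and 0 otherwise.

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ B(n, p), 0 outside the support."""
    return comb(n, k) * p**k * (1 - p)**(n - k) if 0 <= k <= n else 0.0

n, m, p1, p2 = 3, 2, 0.4, 0.6     # illustrative parameters

def p_UV(u, v):
    """p_{U,V}(u, v) for U = X + Y, V = X - Y with X ~ B(n, p1), Y ~ B(m, p2) independent."""
    if (u + v) % 2:               # u + v must be even for an integer preimage
        return 0.0
    x, y = (u + v) // 2, (u - v) // 2
    return binom_pmf(x, n, p1) * binom_pmf(y, m, p2)

# Sanity check: the transformed jpmf still sums to 1 over its support.
total = sum(p_UV(u, v) for u in range(n + m + 1) for v in range(-m, n + 1))
print(round(total, 6))   # 1.0
```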