STA347 - week 5: More on Distribution Function


1 STA347 - week 5 More on Distribution Function The distribution of a random variable X can be determined directly from its cumulative distribution function F_X. Theorem: Let X be any random variable with cumulative distribution function F_X, and let B be any subset of the real numbers. Then P(X ∈ B) can be determined solely from the values of F_X(x). Proof:

2 STA347 - week 5 Expectation In the long run, when rolling a die repeatedly, what average result do you expect? In 6,000,000 rolls, expect about 1,000,000 1's, 1,000,000 2's, etc. The average is then (1,000,000·(1 + 2 + 3 + 4 + 5 + 6)) / 6,000,000 = 21/6 = 3.5. For a random variable X, the expectation (or expected value, or mean) of X is the long-run average value of X. Symbols: μ, μ_X, E(X) and EX.
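The long-run-average idea can be checked with a short simulation (a sketch, not part of the original slides; the function name is ours):

```python
import random

def long_run_average(n_rolls, seed=0):
    """Sample mean of n_rolls fair-die rolls; approaches E(X) = 3.5 as n grows."""
    rng = random.Random(seed)
    return sum(rng.randint(1, 6) for _ in range(n_rolls)) / n_rolls

# With many rolls the sample mean settles near 3.5.
avg = long_run_average(100_000)
```

The fixed seed only makes the run reproducible; any large number of rolls gives an average close to 3.5.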

3 STA347 - week 5 Expectation of a Discrete Random Variable For a discrete random variable X with pmf p_X(x), the expectation is E(X) = Σ_x x · p_X(x), whenever the sum converges absolutely.
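The sum Σ_x x · p_X(x) is direct to compute for a finite pmf; a minimal sketch (exact arithmetic via fractions, helper name is ours):

```python
from fractions import Fraction

def expectation(pmf):
    """E(X) = sum over x of x * p(x), for a finite pmf given as {x: p(x)}."""
    return sum(x * p for x, p in pmf.items())

# Fair die: E(X) = (1 + 2 + ... + 6) / 6 = 7/2
die = {x: Fraction(1, 6) for x in range(1, 7)}
e_die = expectation(die)
```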

4 STA347 - week 5 Examples 1) Roll a die; let X = outcome of 1 roll. Then E(X) = 3.5. 2) X ~ Bernoulli(p). Then E(X) = 0·(1 − p) + 1·p = p. 3) X ~ Binomial(n, p). Then E(X) = np. 4) X ~ Geometric(p), with support {1, 2, …}. Then E(X) = 1/p. 5) X ~ Poisson(λ). Then E(X) = λ.
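These closed forms can be sanity-checked by summing the defining series numerically (a sketch with truncated sums; function names are ours):

```python
from math import comb, exp, factorial, isclose

def binom_mean(n, p):
    # E(X) = sum of k * P(X = k); should equal n * p
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

def geom_mean(p, terms=10_000):
    # support {1, 2, ...}, P(X = k) = (1 - p)**(k - 1) * p; should equal 1 / p
    return sum(k * (1 - p)**(k - 1) * p for k in range(1, terms + 1))

def poisson_mean(lam, terms=100):
    # P(X = k) = exp(-lam) * lam**k / k!; should equal lam
    return sum(k * exp(-lam) * lam**k / factorial(k) for k in range(terms))
```

The geometric and Poisson sums are truncated, but the neglected tails are negligible for moderate parameter values.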

5 STA347 - week 5 Expectation of a Continuous Random Variable For a continuous random variable X with density f_X(x), the expectation is E(X) = ∫ x f_X(x) dx, whenever this integral converges absolutely.
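The integral ∫ x f_X(x) dx can be approximated numerically; a minimal midpoint-rule sketch (not from the slides, names are ours):

```python
def expectation_cont(f, a, b, n=10_000):
    """Approximate E(X) = integral of x * f(x) over [a, b] by a midpoint sum."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        total += x * f(x)
    return total * h

# Uniform(0, 1) has density 1 on [0, 1]; its mean should come out near 1/2.
u_mean = expectation_cont(lambda x: 1.0, 0.0, 1.0)
```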

6 STA347 - week 5 Examples 1) X ~ Uniform(a, b). Then E(X) = (a + b)/2. 2) X ~ Exponential(λ). Then E(X) = 1/λ. 3) X is a random variable with a given density. (i) Check that this is a valid density. (ii) Find E(X).

7 STA347 - week 5 4) X ~ Gamma(α, λ). Then E(X) = α/λ. 5) X ~ Beta(α, β). Then E(X) = α/(α + β).

8 STA347 - week 5 Theorem For g: R → R, if X is a discrete random variable then E(g(X)) = Σ_x g(x) p_X(x); if X is a continuous random variable then E(g(X)) = ∫ g(x) f_X(x) dx. Proof:
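The discrete case of this theorem (the "law of the unconscious statistician") says we never need the distribution of g(X) itself; a minimal sketch (helper name is ours):

```python
from fractions import Fraction

def expect_g(pmf, g):
    # E(g(X)) = sum of g(x) * p(x); no need to find the distribution of g(X)
    return sum(g(x) * p for x, p in pmf.items())

# Fair die, g(x) = x^2: E(X^2) = (1 + 4 + 9 + 16 + 25 + 36) / 6 = 91/6
die = {x: Fraction(1, 6) for x in range(1, 7)}
second_moment = expect_g(die, lambda x: x * x)
```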

9 STA347 - week 5 Examples 1. Suppose X ~ Uniform(0, 1) and let Y = g(X) for a given function g; then E(Y) = ∫₀¹ g(x) dx. 2. Suppose X ~ Poisson(λ) and let Y = g(X); then E(Y) = Σ_k g(k) e^{−λ} λ^k / k!.

10 STA347 - week 5 Properties of Expectation For X, Y random variables and a, b constants: E(aX + b) = aE(X) + b. Proof (continuous case): E(aX + bY) = aE(X) + bE(Y). Proof to come… If X is a non-negative random variable, then E(X) = 0 if and only if X = 0 with probability 1. If X is a non-negative random variable, then E(X) ≥ 0. E(a) = a.
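Linearity, E(aX + b) = aE(X) + b, is easy to verify on a concrete pmf (a sketch in exact arithmetic; the constants a = 2, b = 5 are arbitrary):

```python
from fractions import Fraction

die = {x: Fraction(1, 6) for x in range(1, 7)}
a, b = 2, 5   # arbitrary constants for illustration

e_x = sum(x * p for x, p in die.items())               # E(X) = 7/2
e_ax_b = sum((a * x + b) * p for x, p in die.items())  # E(aX + b), computed directly
```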

11 STA347 - week 5 Moments The kth moment of a distribution is E(X^k). We are usually interested in the 1st and 2nd moments (sometimes in the 3rd and 4th). Some second moments: 1. Suppose X ~ Uniform(0, 1); then E(X²) = ∫₀¹ x² dx = 1/3. 2. Suppose X ~ Geometric(p); then E(X²) = (2 − p)/p².

12 STA347 - week 5 Variance The expected value E(X) is a measure of the "center" of a distribution. The variance is a measure of how closely the probability is concentrated around the center µ; it is also called the 2nd central moment. Definition: The variance of a random variable X is Var(X) = E[(X − µ)²]. Claim: Var(X) = E(X²) − (E(X))². Proof: We can use this formula for convenience of calculation. The standard deviation of a random variable X is denoted by σ_X; it is the square root of the variance, i.e. σ_X = √Var(X).

13 STA347 - week 5 Properties of Variance For X, Y random variables and a, b constants: Var(aX + b) = a²Var(X). Proof: Var(aX + bY) = a²Var(X) + b²Var(Y) + 2abE[(X − E(X))(Y − E(Y))]. Proof: Var(X) ≥ 0. Var(X) = 0 if and only if X = E(X) with probability 1. Var(a) = 0.

14 STA347 - week 5 Examples 1. Suppose X ~ Uniform(0, 1); then E(X²) = 1/3 and therefore Var(X) = 1/3 − (1/2)² = 1/12. 2. Suppose X ~ Geometric(p); then E(X²) = (2 − p)/p² and therefore Var(X) = (1 − p)/p². 3. Suppose X ~ Bernoulli(p); then E(X²) = p and therefore Var(X) = p − p² = p(1 − p).
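The Bernoulli and geometric variances above can be checked numerically via the shortcut formula (a sketch with truncated sums for the geometric case; function names are ours):

```python
def bernoulli_var(p):
    # E(X) = p and E(X^2) = p, so Var(X) = p - p**2 = p * (1 - p)
    ex = 0 * (1 - p) + 1 * p
    ex2 = 0 * (1 - p) + 1 * p      # 0^2 = 0 and 1^2 = 1
    return ex2 - ex ** 2

def geometric_var(p, terms=10_000):
    # support {1, 2, ...}; truncated series for E(X) and E(X^2)
    ex = sum(k * (1 - p) ** (k - 1) * p for k in range(1, terms + 1))
    ex2 = sum(k * k * (1 - p) ** (k - 1) * p for k in range(1, terms + 1))
    return ex2 - ex ** 2           # should match (1 - p) / p**2
```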

15 STA347 - week 5 Joint Distribution of Two or More Random Variables Sometimes more than one measurement (r.v.) is taken on each member of the sample space. In such cases several random variables are defined on the same probability space, and we would like to explore their joint distribution. The joint behavior of two random variables X and Y (continuous or discrete) is determined by their joint cumulative distribution function F_{X,Y}(x, y); the n-dimensional case is analogous.

16 STA347 - week 5 Properties of the Joint Distribution Function For random variables X, Y, the function F_{X,Y}: R² → [0, 1] is given by F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y). F_{X,Y}(x, y) is non-decreasing in each variable, i.e. F_{X,Y}(x₁, y₁) ≤ F_{X,Y}(x₂, y₂) if x₁ ≤ x₂ and y₁ ≤ y₂. Also, F_{X,Y}(x, y) → 0 as x, y → −∞ and F_{X,Y}(x, y) → 1 as x, y → +∞.

17 STA347 - week 5 Discrete Case Suppose X, Y are discrete random variables defined on the same probability space. The joint probability mass function of two discrete random variables X and Y is the function p_{X,Y}(x, y) defined for all pairs of real numbers x and y by p_{X,Y}(x, y) = P(X = x, Y = y). For a joint pmf p_{X,Y}(x, y) we must have p_{X,Y}(x, y) ≥ 0 for all values of x, y, and Σ_x Σ_y p_{X,Y}(x, y) = 1.

18 STA347 - week 5 Example for Illustration Toss a coin 3 times. Define X: number of heads on the 1st toss, Y: total number of heads. The sample space is Ω = {TTT, TTH, THT, HTT, THH, HTH, HHT, HHH}. We display the joint distribution of X and Y in the following table:

         Y=0    Y=1    Y=2    Y=3
  X=0    1/8    2/8    1/8     0
  X=1     0     1/8    2/8    1/8

Can we recover the probability mass functions for X and Y from the joint table? To find the probability mass function of X we sum the appropriate rows of the joint table; similarly, to find the mass function of Y we sum the appropriate columns.
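The joint table for this coin example can be built by enumerating the 8 equally likely outcomes, and the marginals recovered by summing rows and columns (a sketch, not part of the slides):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

joint = Counter()
for toss in product("HT", repeat=3):      # 8 equally likely outcomes
    x = 1 if toss[0] == "H" else 0        # X: heads on the 1st toss
    y = toss.count("H")                   # Y: total number of heads
    joint[(x, y)] += Fraction(1, 8)

# Marginals: sum the rows for X, the columns for Y.
p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in range(4)}
```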

19 STA347 - week 5 Marginal Probability Function The marginal probability mass function for X is p_X(x) = Σ_y p_{X,Y}(x, y). The marginal probability mass function for Y is p_Y(y) = Σ_x p_{X,Y}(x, y).

20 STA347 - week 5 The case of several discrete random variables is analogous. If X1, …, Xm are discrete random variables on the same sample space with joint probability function p(x1, …, xm), the marginal probability function for X1 is p_{X1}(x1) = Σ_{x2, …, xm} p(x1, …, xm). The 2-dimensional marginal probability function for X1 and X2 is p_{X1,X2}(x1, x2) = Σ_{x3, …, xm} p(x1, …, xm).

21 STA347 - week 5 Example Roll a die twice. Let X: number of 1's and Y: total of the two dice. There is no simple closed form for the joint mass function of X and Y, so we display the joint distribution in a table. The marginal probability mass functions of X and Y are obtained by summing the rows and columns. Find P(X ≤ 1 and Y ≤ 4).
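Since the 36 ordered rolls are equally likely, P(X ≤ 1 and Y ≤ 4) can be found by direct enumeration (a sketch, not part of the slides):

```python
from fractions import Fraction
from itertools import product

prob = Fraction(0)
for d1, d2 in product(range(1, 7), repeat=2):   # 36 equally likely ordered rolls
    x = (d1 == 1) + (d2 == 1)                   # X: number of 1's
    y = d1 + d2                                 # Y: total of the two dice
    if x <= 1 and y <= 4:
        prob += Fraction(1, 36)

# Qualifying rolls: (1,2), (2,1), (1,3), (3,1), (2,2) -- note (1,1) has X = 2.
```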

22 STA347 - week 5 The Joint Distribution of Two Continuous R.V.'s Definition: Random variables X and Y are (jointly) continuous if there is a non-negative function f_{X,Y}(x, y) such that P((X, Y) ∈ A) = ∬_A f_{X,Y}(x, y) dx dy for any "reasonable" 2-dimensional set A. f_{X,Y}(x, y) is called a joint density function for (X, Y). In particular, if A = {(X, Y): X ≤ x, Y ≤ y}, the joint CDF of X, Y is F_{X,Y}(x, y) = ∫_{−∞}^x ∫_{−∞}^y f_{X,Y}(u, v) dv du. From the Fundamental Theorem of Calculus we have f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x∂y.

23 STA347 - week 5 Properties of a Joint Density Function f_{X,Y}(x, y) ≥ 0 for all (x, y). Its integral over R² is ∬ f_{X,Y}(x, y) dx dy = 1.

24 STA347 - week 5 Example Consider a given bivariate density function. Check that it is a valid density function. Compute P(X > Y).

25 STA347 - week 5 Marginal Densities and Distribution Functions The marginal (cumulative) distribution function of X is F_X(x) = F_{X,Y}(x, ∞). The marginal density of X is then f_X(x) = ∫_{−∞}^∞ f_{X,Y}(x, y) dy. Similarly, the marginal density of Y is f_Y(y) = ∫_{−∞}^∞ f_{X,Y}(x, y) dx.
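Integrating out the other variable can be illustrated numerically. The joint density below is a hypothetical example chosen for illustration (f(x, y) = x + y on the unit square, whose X-marginal works out to f_X(x) = x + 1/2); the function name is ours:

```python
def marginal_density_x(f, x, y_lo, y_hi, n=10_000):
    """Approximate f_X(x) = integral of f(x, y) dy with a midpoint sum."""
    h = (y_hi - y_lo) / n
    return sum(f(x, y_lo + (i + 0.5) * h) for i in range(n)) * h

# Hypothetical joint density on the unit square (an assumption for this sketch).
f = lambda x, y: x + y
fx_at_03 = marginal_density_x(f, 0.3, 0.0, 1.0)   # f_X(0.3) = 0.3 + 0.5
```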

26 STA347 - week 5 Example Consider the following bivariate density function. Check that it is a valid density function. Find the joint CDF of (X, Y) and compute P(X ≤ ½, Y ≤ ½). Compute P(X ≤ 2, Y ≤ ½). Find the marginal densities of X and Y.

27 STA347 - week 5 Generalization to Higher Dimensions Suppose X, Y, Z are jointly continuous random variables with density f(x, y, z). The marginal density of X is given by f_X(x) = ∬ f(x, y, z) dy dz, and the marginal density of (X, Y) is given by f_{X,Y}(x, y) = ∫ f(x, y, z) dz.

28 STA347 - week 5 Example Given the following joint CDF of X, Y: find the joint density of X, Y. Find the marginal densities of X and Y and identify them.

29 STA347 - week 5 Example Consider the following joint density, where λ is a positive parameter. Check that it is a valid density. Find the marginal densities of X and Y and identify them.

30 STA347 - week 5 Independence of Random Variables Recall the definition of a random variable X: a mapping from Ω to R such that {ω: X(ω) ≤ x} is an event for every x ∈ R. By the definition of F, this implies that (X > 1.4) is an event, and in the discrete case (X = 2) is an event. In general, (X ∈ A) is an event for any set A that is formed by taking unions / complements / intersections of intervals from R. Definition: Random variables X and Y are independent if the events (X ∈ A) and (Y ∈ B) are independent for all such sets A and B.

31 STA347 - week 5 Theorem Two discrete random variables X and Y with joint pmf p_{X,Y}(x, y) and marginal mass functions p_X(x) and p_Y(y) are independent if and only if p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x, y. Proof: Question: Back to the example of rolling a die twice — are X and Y independent?
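The factorization criterion is mechanical to check on a finite joint pmf (a sketch; the helper and the two test tables are ours — `coins` is two independent fair-coin indicators, `tosses` is the earlier 3-toss joint table of X and Y, which are dependent):

```python
from fractions import Fraction

def is_independent(joint):
    """Check p(x, y) == p_X(x) * p_Y(y) for every (x, y) pair."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    p_x = {x: sum(joint.get((x, y), Fraction(0)) for y in ys) for x in xs}
    p_y = {y: sum(joint.get((x, y), Fraction(0)) for x in xs) for y in ys}
    return all(joint.get((x, y), Fraction(0)) == p_x[x] * p_y[y]
               for x in xs for y in ys)

coins = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}
tosses = {(0, 0): Fraction(1, 8), (0, 1): Fraction(2, 8), (0, 2): Fraction(1, 8),
          (1, 1): Fraction(1, 8), (1, 2): Fraction(2, 8), (1, 3): Fraction(1, 8)}
```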

32 STA347 - week 5 Theorem Suppose X and Y are jointly continuous random variables. X and Y are independent if and only if the product of densities for X and Y is a joint density for the pair (X, Y), i.e. f_{X,Y}(x, y) = f_X(x) f_Y(y). Proof: If X and Y are independent random variables and Z = g(X), W = h(Y), then Z and W are also independent.

33 STA347 - week 5 Example Suppose X and Y are discrete random variables whose values are the non-negative integers and whose joint probability function is given. Are X and Y independent? What are their marginal distributions? Factorization is enough for independence, but we need to be careful with the constant terms for the factors to be marginal probability functions.

34 STA347 - week 5 Example and Important Comment The joint density for X, Y is given. Are X, Y independent? Independence requires that the set of points where the joint density is positive be the Cartesian product of the sets of points where the marginal densities are positive, i.e. the set of points where f_{X,Y}(x, y) > 0 must be a (possibly infinite) rectangle.

