STA347 - Week 5, Slide 1: More on Distribution Functions

The distribution of a random variable X can be determined directly from its cumulative distribution function F_X.

Theorem: Let X be any random variable with cumulative distribution function F_X, and let B be any (Borel) subset of the real numbers. Then P(X ∈ B) can be determined solely from the values of F_X(x).

Proof idea: for an interval (a, b] we have P(a < X ≤ b) = F_X(b) − F_X(a); a general set B is built from such intervals by countable unions, intersections, and complements, and the corresponding probabilities follow from the probability axioms.

STA347 - Week 5, Slide 2: Expectation

In the long run, rolling a die repeatedly, what average result do you expect? In 6,000,000 rolls we expect about 1,000,000 1's, 1,000,000 2's, etc. The average is therefore

(1,000,000 · 1 + 1,000,000 · 2 + … + 1,000,000 · 6) / 6,000,000 = (1 + 2 + … + 6) / 6 = 3.5.

For a random variable X, the expectation (or expected value, or mean) of X is the expected average value of X in the long run. Notation: μ, μ_X, E(X) and EX.

STA347 - Week 5, Slide 3: Expectation of a Discrete Random Variable

For a discrete random variable X with pmf p_X(x),

E(X) = Σ_x x · p_X(x),

whenever the sum converges absolutely.
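As a concrete illustration (not part of the original slides), here is a minimal Python sketch that evaluates this sum for a pmf stored as a dictionary; the fair-die pmf matches the running example:

```python
# Expected value of a discrete random variable from its pmf.
def expectation(pmf):
    """E(X) = sum over x of x * p_X(x)."""
    return sum(x * p for x, p in pmf.items())

die_pmf = {x: 1/6 for x in range(1, 7)}  # fair six-sided die
print(expectation(die_pmf))  # 3.5 (up to floating-point rounding)
```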

STA347 - Week 5, Slide 4: Examples

1) Roll a die. Let X = outcome on one roll. Then E(X) = 3.5.
2) Bernoulli trials: X = 1 with probability p and X = 0 with probability 1 − p. Then E(X) = p.
3) X ~ Binomial(n, p). Then E(X) = np.
4) X ~ Geometric(p). With the convention that X counts the number of trials up to and including the first success, E(X) = 1/p.
5) X ~ Poisson(λ). Then E(X) = λ.
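These expectations can also be checked by simulation. A minimal sketch (not from the original slides) using numpy; the parameter values n = 10, p = 0.3 and λ = 4.0 are arbitrary choices:

```python
# Compare sample means against the theoretical expectations above.
import numpy as np

rng = np.random.default_rng(0)
size = 1_000_000
n, p, lam = 10, 0.3, 4.0

checks = [
    ("Binomial", rng.binomial(n, p, size), n * p),
    ("Geometric", rng.geometric(p, size), 1 / p),  # trials-until-first-success
    ("Poisson", rng.poisson(lam, size), lam),
]
for name, sample, theory in checks:
    print(f"{name}: sample mean = {sample.mean():.4f}, theory = {theory:.4f}")
```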

STA347 - Week 5, Slide 5: Expectation of a Continuous Random Variable

For a continuous random variable X with density f_X(x),

E(X) = ∫_{−∞}^{∞} x f_X(x) dx,

whenever this integral converges absolutely.

STA347 - Week 5, Slide 6: Examples

1) X ~ Uniform(a, b). Then E(X) = (a + b)/2.
2) X ~ Exponential(λ), with λ as the rate parameter. Then E(X) = 1/λ.
3) X is a random variable with a given density f(x). (i) Check whether f is a valid density. (ii) Find E(X).
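A quick numerical confirmation of the first two examples (a sketch; the parameter values a = 2, b = 5 and rate λ = 1.5 are arbitrary):

```python
# E(X) = integral of x * f(x) dx, evaluated numerically with scipy.
import numpy as np
from scipy.integrate import quad

a, b, lam = 2.0, 5.0, 1.5

# Uniform(a, b): density 1/(b - a) on [a, b]; mean should be (a + b)/2.
mean_unif, _ = quad(lambda x: x / (b - a), a, b)
print(mean_unif, (a + b) / 2)            # both 3.5

# Exponential with rate lam: density lam * exp(-lam * x) on [0, inf);
# mean should be 1/lam.
mean_exp, _ = quad(lambda x: x * lam * np.exp(-lam * x), 0, np.inf)
print(mean_exp, 1 / lam)                 # both ~0.6667
```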

STA347 - Week 5, Slide 7: Examples (continued)

4) X ~ Gamma(α, λ), with λ as the rate parameter. Then E(X) = α/λ.
5) X ~ Beta(α, β). Then E(X) = α/(α + β).

STA347 - Week 5, Slide 8: Theorem

For g: R → R:
If X is a discrete random variable, then E(g(X)) = Σ_x g(x) p_X(x).
If X is a continuous random variable, then E(g(X)) = ∫_{−∞}^{∞} g(x) f_X(x) dx.

Proof sketch (discrete case): let Y = g(X) and group the terms of E(Y) by the value y = g(x):
E(Y) = Σ_y y · P(g(X) = y) = Σ_y y Σ_{x: g(x) = y} p_X(x) = Σ_x g(x) p_X(x).
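A small Python sketch of the discrete case, with g(x) = x² and a fair die (both choices are ours, for illustration only):

```python
# LOTUS for a discrete pmf: E[g(X)] = sum over x of g(x) * p_X(x).
def expectation_of_g(pmf, g):
    return sum(g(x) * p for x, p in pmf.items())

die_pmf = {x: 1/6 for x in range(1, 7)}
print(expectation_of_g(die_pmf, lambda x: x**2))  # E(X^2) = 91/6 ~ 15.1667
```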

STA347 - Week 5, Slide 9: Examples

1. Suppose X ~ Uniform(0, 1) and let Y = g(X); then by the theorem, E(Y) = ∫_0^1 g(x) dx.
2. Suppose X ~ Poisson(λ) and let Y = g(X); then E(Y) = Σ_{k=0}^{∞} g(k) e^{−λ} λ^k / k!.

STA347 - Week 5, Slide 10: Properties of Expectation

For X, Y random variables and a, b constants:
- E(aX + b) = aE(X) + b. Proof (continuous case): E(aX + b) = ∫ (ax + b) f_X(x) dx = a ∫ x f_X(x) dx + b ∫ f_X(x) dx = aE(X) + b.
- E(aX + bY) = aE(X) + bE(Y). Proof to come.
- If X is a non-negative random variable, then E(X) ≥ 0.
- If X is a non-negative random variable, then E(X) = 0 if and only if X = 0 with probability 1.
- E(a) = a.

STA347 - Week 5, Slide 11: Moments

The k-th moment of a distribution is E(X^k). We are usually interested in the 1st and 2nd moments (sometimes in the 3rd and 4th).

Some second moments:
1. Suppose X ~ Uniform(0, 1); then E(X²) = ∫_0^1 x² dx = 1/3.
2. Suppose X ~ Geometric(p), counting trials up to and including the first success; then E(X²) = (2 − p)/p².

STA347 - Week 5, Slide 12: Variance

The expected value of a random variable, E(X), is a measure of the "center" of a distribution. The variance is a measure of how closely concentrated about the center (μ) the probability is. It is also called the 2nd central moment.

Definition: The variance of a random variable X is Var(X) = E[(X − μ)²], where μ = E(X).

Claim: Var(X) = E(X²) − [E(X)]².
Proof: E[(X − μ)²] = E(X² − 2μX + μ²) = E(X²) − 2μE(X) + μ² = E(X²) − μ².
We can use this formula for convenience of calculation.

The standard deviation of a random variable X is denoted by σ_X; it is the square root of the variance, i.e. σ_X = √Var(X).
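A sketch of the shortcut formula in code, again for the fair-die pmf (35/12 is the standard variance of one die roll):

```python
# Var(X) = E(X^2) - [E(X)]^2 for a discrete pmf.
def variance(pmf):
    mean = sum(x * p for x, p in pmf.items())
    second_moment = sum(x**2 * p for x, p in pmf.items())
    return second_moment - mean**2

die_pmf = {x: 1/6 for x in range(1, 7)}
print(variance(die_pmf))  # 35/12 ~ 2.9167
```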

STA347 - Week 5, Slide 13: Properties of Variance

For X, Y random variables and a, b constants:
- Var(aX + b) = a² Var(X). Proof: Var(aX + b) = E[(aX + b − (aμ + b))²] = E[a²(X − μ)²] = a² Var(X).
- Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab E[(X − E(X))(Y − E(Y))].
- Var(X) ≥ 0.
- Var(X) = 0 if and only if X = E(X) with probability 1.
- Var(a) = 0.

STA347 - Week 5, Slide 14: Examples

1. Suppose X ~ Uniform(0, 1); then E(X) = 1/2 and E(X²) = 1/3, and therefore Var(X) = 1/3 − 1/4 = 1/12.
2. Suppose X ~ Geometric(p); then E(X) = 1/p and E(X²) = (2 − p)/p², and therefore Var(X) = (1 − p)/p².
3. Suppose X ~ Bernoulli(p); then E(X) = p and E(X²) = p, and therefore Var(X) = p − p² = p(1 − p).

STA347 - Week 5, Slide 15: Joint Distribution of Two or More Random Variables

Sometimes more than one measurement (r.v.) is taken on each member of the sample space. In cases like this there will be a few random variables defined on the same probability space, and we would like to explore their joint distribution.

The joint behavior of two random variables X and Y (continuous or discrete) is determined by their joint cumulative distribution function F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y).

In the n-dimensional case, F(x_1, …, x_n) = P(X_1 ≤ x_1, …, X_n ≤ x_n).

STA347 - Week 5, Slide 16: Properties of the Joint Distribution Function

For random variables X, Y, the function F_{X,Y}: R² → [0, 1] is given by F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y).

F_{X,Y}(x, y) is non-decreasing in each variable, i.e. F_{X,Y}(x_1, y_1) ≤ F_{X,Y}(x_2, y_2) if x_1 ≤ x_2 and y_1 ≤ y_2. Moreover, F_{X,Y}(x, y) → 0 as x → −∞ or y → −∞, and F_{X,Y}(x, y) → 1 as both x → ∞ and y → ∞.

STA347 - Week 5, Slide 17: Discrete Case

Suppose X, Y are discrete random variables defined on the same probability space. The joint probability mass function of two discrete random variables X and Y is the function p_{X,Y}(x, y) defined for all pairs of real numbers x and y by p_{X,Y}(x, y) = P(X = x, Y = y).

For a joint pmf p_{X,Y}(x, y) we must have: p_{X,Y}(x, y) ≥ 0 for all values of x, y, and Σ_x Σ_y p_{X,Y}(x, y) = 1.

STA347 - Week 5, Slide 18: Example for Illustration

Toss a coin 3 times. Define X: number of heads on the 1st toss, and Y: total number of heads. The sample space is Ω = {TTT, TTH, THT, HTT, THH, HTH, HHT, HHH}, all outcomes equally likely. We display the joint distribution of X and Y in the following table (reconstructed by counting outcomes):

          Y = 0   Y = 1   Y = 2   Y = 3
  X = 0    1/8     2/8     1/8      0
  X = 1     0      1/8     2/8     1/8

Can we recover the probability mass functions of X and Y from the joint table? To find the probability mass function of X we sum the appropriate rows of the table of the joint probability function. Similarly, to find the mass function of Y we sum the appropriate columns.
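The joint table can also be produced by brute-force enumeration of the eight equally likely outcomes; a minimal sketch:

```python
# Joint pmf of X (heads on first toss) and Y (total heads) for 3 tosses.
from itertools import product
from collections import Counter

counts = Counter()
for toss in product("HT", repeat=3):      # 8 equally likely outcomes
    x = 1 if toss[0] == "H" else 0        # X: heads on the first toss
    y = toss.count("H")                   # Y: total number of heads
    counts[(x, y)] += 1

joint = {xy: c / 8 for xy, c in counts.items()}
for (x, y), p in sorted(joint.items()):
    print(f"P(X={x}, Y={y}) = {p}")

# Marginal of X: sum each row of the joint table.
p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
print(p_x)  # {0: 0.5, 1: 0.5}
```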

STA347 - Week 5, Slide 19: Marginal Probability Function

The marginal probability mass function for X is p_X(x) = Σ_y p_{X,Y}(x, y).

The marginal probability mass function for Y is p_Y(y) = Σ_x p_{X,Y}(x, y).

STA347 - Week 5, Slide 20: Several Discrete Random Variables

The case of several discrete random variables is analogous. If X_1, …, X_m are discrete random variables on the same sample space with joint probability function p(x_1, …, x_m), then:

The marginal probability function for X_1 is p_{X_1}(x_1) = Σ_{x_2, …, x_m} p(x_1, x_2, …, x_m).

The 2-dimensional marginal probability function for X_1 and X_2 is p_{X_1,X_2}(x_1, x_2) = Σ_{x_3, …, x_m} p(x_1, x_2, …, x_m).

STA347 - Week 5, Slide 21: Example

Roll a die twice. Let X: number of 1's and Y: total of the two dice. There is no simple closed form for the joint mass function of X and Y, so we display the joint distribution of X and Y in a table (computed by enumeration in the sketch below). The marginal probability mass functions of X and Y are then obtained by summing rows and columns, as before. Exercise: find P(X ≤ 1 and Y ≤ 4).
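A sketch that builds the joint table by enumerating all 36 equally likely ordered pairs, and then evaluates the exercise probability:

```python
# Joint pmf of X (number of 1's) and Y (total) for two die rolls.
from itertools import product
from collections import Counter
from fractions import Fraction

counts = Counter()
for d1, d2 in product(range(1, 7), repeat=2):   # 36 equally likely pairs
    x = (d1 == 1) + (d2 == 1)                   # number of 1's
    y = d1 + d2                                 # total of the two dice
    counts[(x, y)] += 1

joint = {xy: Fraction(c, 36) for xy, c in counts.items()}

# P(X <= 1 and Y <= 4): sum the qualifying cells of the table.
prob = sum(p for (x, y), p in joint.items() if x <= 1 and y <= 4)
print(prob)  # 5/36
```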

STA347 - Week 5, Slide 22: The Joint Distribution of Two Continuous R.V.'s

Definition: Random variables X and Y are (jointly) continuous if there is a non-negative function f_{X,Y}(x, y) such that

P((X, Y) ∈ A) = ∬_A f_{X,Y}(x, y) dx dy

for any "reasonable" 2-dimensional set A. f_{X,Y}(x, y) is called a joint density function for (X, Y).

In particular, if A = {(X, Y): X ≤ x, Y ≤ y}, the joint CDF of X, Y is

F_{X,Y}(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_{X,Y}(u, v) du dv.

From the Fundamental Theorem of Calculus we have f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x∂y.

STA347 - Week 5, Slide 23: Properties of a Joint Density Function

f_{X,Y}(x, y) ≥ 0 for all (x, y), and its integral over R² is ∬_{R²} f_{X,Y}(x, y) dx dy = 1.

STA347 - Week 5, Slide 24: Example

Consider a bivariate density function f_{X,Y}(x, y) (given on the slide). Check whether it is a valid density function, and compute P(X > Y).
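The slide's specific density did not survive transcription, so as a stand-in here is a sketch of the same two computations for a hypothetical density f(x, y) = x + y on the unit square (this choice is an assumption for illustration, not the slide's density):

```python
# Validity check and P(X > Y) for the hypothetical density
# f(x, y) = x + y on [0, 1]^2.
from scipy.integrate import dblquad

f = lambda y, x: x + y   # dblquad integrates over y first, then x

# A valid density must integrate to 1 over its support.
total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)
print(total)             # 1.0

# P(X > Y): integrate f over the region 0 <= y < x <= 1.
p, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: x)
print(p)                 # 0.5, by symmetry of f in x and y
```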

STA347 - Week 5, Slide 25: Marginal Densities and Distribution Functions

The marginal (cumulative) distribution function of X is F_X(x) = P(X ≤ x) = lim_{y→∞} F_{X,Y}(x, y).

The marginal density of X is then f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy. Similarly, the marginal density of Y is f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx.

STA347 - Week 5, Slide 26: Example

Consider a bivariate density function (given on the slide). Check whether it is a valid density function. Find the joint CDF of (X, Y) and compute P(X ≤ ½, Y ≤ ½). Compute P(X ≤ 2, Y ≤ ½). Find the marginal densities of X and Y.

STA347 - Week 5, Slide 27: Generalization to Higher Dimensions

Suppose X, Y, Z are jointly continuous random variables with density f(x, y, z). Then:

The marginal density of X is given by f_X(x) = ∬ f(x, y, z) dy dz.

The marginal density of (X, Y) is given by f_{X,Y}(x, y) = ∫ f(x, y, z) dz.

STA347 - Week 5, Slide 28: Example

Given the joint CDF of X, Y (shown on the slide): find the joint density of X, Y, then find the marginal densities of X and Y and identify them.

STA347 - Week 5, Slide 29: Example

Consider the joint density given on the slide, where λ is a positive parameter. Check whether it is a valid density. Find the marginal densities of X and Y and identify them.

STA347 - Week 5, Slide 30: Independence of Random Variables

Recall the definition of a random variable X: a mapping from Ω to R such that {ω : X(ω) ≤ x} is an event for every x ∈ R. By the definition of F, this implies that (X > 1.4) is an event, and in the discrete case (X = 2) is an event. In general, (X ∈ A) is an event for any set A that is formed by taking unions / complements / intersections of intervals from R.

Definition: Random variables X and Y are independent if the events (X ∈ A) and (Y ∈ B) are independent for all such sets A and B.

STA347 - Week 5, Slide 31: Theorem

Two discrete random variables X and Y with joint pmf p_{X,Y}(x, y) and marginal mass functions p_X(x) and p_Y(y) are independent if and only if p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x, y.

Proof: one direction is immediate from the definition of independence with A = {x} and B = {y}; the converse follows by summing the factorized pmf over the relevant sets.

Question: back to the example of rolling a die two times, are X and Y independent? (The sketch below checks this mechanically.)
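The factorization criterion is easy to test by brute force. A sketch for the two-dice example above (X = number of 1's, Y = total of the two dice):

```python
# Test p_{X,Y}(x, y) == p_X(x) * p_Y(y) for every pair (x, y).
from itertools import product
from collections import Counter
from fractions import Fraction

joint = Counter()
for d1, d2 in product(range(1, 7), repeat=2):
    joint[((d1 == 1) + (d2 == 1), d1 + d2)] += Fraction(1, 36)

p_x, p_y = Counter(), Counter()
for (x, y), p in joint.items():
    p_x[x] += p
    p_y[y] += p

independent = all(joint[(x, y)] == p_x[x] * p_y[y]
                  for x in p_x for y in p_y)
print(independent)  # False: e.g. P(X=2, Y=12) = 0 but p_X(2) * p_Y(12) > 0
```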

STA347 - Week 5, Slide 32: Theorem

Suppose X and Y are jointly continuous random variables. Then X and Y are independent if and only if, given any densities f_X and f_Y for X and Y, their product is a joint density for the pair (X, Y), i.e. f_{X,Y}(x, y) = f_X(x) f_Y(y).

Proof:

If X and Y are independent random variables and Z = g(X), W = h(Y), then Z and W are also independent.

STA347 - Week 5, Slide 33: Example

Suppose X and Y are discrete random variables whose values are the non-negative integers and whose joint probability function is given on the slide. Are X and Y independent? What are their marginal distributions?

Factorization is enough for independence, but we need to be careful with constant terms if the factors are to be the marginal probability functions.

STA347 - Week 5, Slide 34: Example and Important Comment

The joint density for X, Y is given on the slide. Are X, Y independent?

Independence requires that the set of points where the joint density is positive be the Cartesian product of the sets of points where the marginal densities are positive, i.e. the set of points where f_{X,Y}(x, y) > 0 must be a (possibly infinite) rectangle.