Chapter 3 Random vectors and their numerical characteristics

3.1 Random vectors

1. n-dimensional random vectors. n random variables $X_1, X_2, \dots, X_n$ compose an n-dimensional vector $(X_1, X_2, \dots, X_n)$, called an n-dimensional random vector.

2. Joint distribution of a random vector. The function

$$F(x_1, x_2, \dots, x_n) = P(X_1 \le x_1,\ X_2 \le x_2,\ \dots,\ X_n \le x_n)$$

is called the joint distribution function of the random vector $(X_1, X_2, \dots, X_n)$.

Bivariate distribution. Let $(X, Y)$ be a bivariate random vector and $(x, y) \in \mathbb{R}^2$; define

$$F(x, y) = P\{X \le x,\ Y \le y\}$$

the bivariate joint distribution function of $(X, Y)$.

Geometric interpretation: $F(x, y)$ is the probability that the random point $(X, Y)$ falls in the shaded region, i.e. the infinite lower-left quadrant with upper-right corner $(x, y)$.

For $(x_1, y_1), (x_2, y_2) \in \mathbb{R}^2$ with $x_1 < x_2$, $y_1 < y_2$,

$$P\{x_1 < X \le x_2,\ y_1 < Y \le y_2\} = F(x_2, y_2) - F(x_1, y_2) - F(x_2, y_1) + F(x_1, y_1).$$
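The rectangle formula is easy to check numerically. Below is a minimal sketch, not tied to any example in the text: `joint_cdf` is a hypothetical stand-in (independent Exp(1) components, so $F(x, y) = (1 - e^{-x})(1 - e^{-y})$ for $x, y \ge 0$).

```python
import math

def joint_cdf(x, y):
    # Hypothetical stand-in joint CDF: independent Exp(1) components.
    if x < 0 or y < 0:
        return 0.0
    return (1 - math.exp(-x)) * (1 - math.exp(-y))

def rect_prob(F, x1, x2, y1, y2):
    # P{x1 < X <= x2, y1 < Y <= y2}
    # = F(x2,y2) - F(x1,y2) - F(x2,y1) + F(x1,y1)
    return F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)

# For independent Exp(1) components this equals
# (1 - e^{-1})(1 - e^{-2}) ~ 0.5466.
print(rect_prob(joint_cdf, 0.0, 1.0, 0.0, 2.0))
```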

Example: given the joint distribution function $F(x, y)$ of $(X, Y)$, find the probability that $(X, Y)$ falls in a region $G$. (The specific $F$, $G$ and the worked answer appeared as figures on the original slides.)

The joint distribution function $F(x, y)$ has the following properties:

(1) For all $(x, y) \in \mathbb{R}^2$, $0 \le F(x, y) \le 1$, and

$$\lim_{x \to -\infty} F(x, y) = 0, \quad \lim_{y \to -\infty} F(x, y) = 0, \quad \lim_{x, y \to +\infty} F(x, y) = 1.$$

(2) Monotonically non-decreasing in each argument: for any fixed $y \in \mathbb{R}$, $x_1 < x_2$ implies $F(x_1, y) \le F(x_2, y)$; for any fixed $x \in \mathbb{R}$, $y_1 < y_2$ implies $F(x, y_1) \le F(x, y_2)$.

(3) Right-continuous in each argument: for $x \in \mathbb{R}$, $y \in \mathbb{R}$,

$$F(x, y) = F(x + 0, y) = F(x, y + 0).$$

(4) For all $(x_1, y_1), (x_2, y_2) \in \mathbb{R}^2$ with $x_1 < x_2$, $y_1 < y_2$,

$$F(x_2, y_2) - F(x_1, y_2) - F(x_2, y_1) + F(x_1, y_1) \ge 0.$$

Conversely, any real-valued function satisfying the four properties above must be the joint distribution function of some bivariate random vector.

Example 1. Let the joint distribution function of $(X, Y)$ be $F(x, y)$ (given on the original slide as a formula containing constants $A$, $B$, $C$).

1) Find the values of $A$, $B$, $C$.
2) Find $P\{0 < X < 2,\ 0 < Y < 3\}$.

Discrete joint distribution. If both $X$ and $Y$ are discrete random variables and $(X, Y)$ takes values $(x_i, y_j)$, $i, j = 1, 2, \dots$, then $X$ and $Y$ are said to have a discrete joint distribution. The joint distribution law is defined by

$$P\{X = x_i,\ Y = y_j\} = p_{ij}, \quad i, j = 1, 2, \dots,$$

written $(X, Y) \sim P\{X = x_i, Y = y_j\} = p_{ij}$.

The joint distribution can also be specified in the following table:

        Y |  y_1   y_2   ...   y_j   ...
  X       |
  x_1     |  p_11  p_12  ...   p_1j  ...
  x_2     |  p_21  p_22  ...   p_2j  ...
  ...     |  ...
  x_i     |  p_i1  p_i2  ...   p_ij  ...
  ...     |  ...

Characteristics of a joint distribution law:

(1) $p_{ij} \ge 0$, $i, j = 1, 2, \dots$;
(2) $\sum_i \sum_j p_{ij} = 1$.

Example 2. Suppose that there are two red balls and three white balls in a bag. Draw one ball from the bag twice without replacement, and define $X = 1$ if the first ball drawn is red ($X = 0$ otherwise) and $Y = 1$ if the second ball drawn is red ($Y = 0$ otherwise). Find the joint distribution of $(X, Y)$.
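Since the sample space here is small, the joint law can be found by brute-force enumeration. A minimal sketch (assuming, as above, that X and Y indicate a red ball on the first and second draw):

```python
from fractions import Fraction
from itertools import permutations
from collections import Counter

# Bag: 2 red ('R') and 3 white ('W') balls; draw twice without replacement.
balls = ['R', 'R', 'W', 'W', 'W']
joint = Counter()
for first, second in permutations(range(5), 2):  # 20 equally likely ordered draws
    x = 1 if balls[first] == 'R' else 0          # X = 1 iff 1st ball is red
    y = 1 if balls[second] == 'R' else 0         # Y = 1 iff 2nd ball is red
    joint[(x, y)] += Fraction(1, 20)

for xy in sorted(joint):
    print(xy, joint[xy])
# (0,0) 3/10, (0,1) 3/10, (1,0) 3/10, (1,1) 1/10
```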

Continuous joint distributions and density functions

1. Two random variables $(X, Y)$ are said to have a continuous joint distribution if there exists a nonnegative function $f(x, y)$ such that for all $(x, y) \in \mathbb{R}^2$ the distribution function satisfies

$$F(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f(u, v)\, dv\, du,$$

denoted $(X, Y) \sim f(x, y)$, $(x, y) \in \mathbb{R}^2$.

2. Characteristics of $f(x, y)$:

(1) $f(x, y) \ge 0$, $(x, y) \in \mathbb{R}^2$;

(2) $\displaystyle\int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} f(x, y)\, dx\, dy = 1$;

(3) if $f(x, y)$ is continuous at $(x, y) \in \mathbb{R}^2$, then

$$\frac{\partial^2 F(x, y)}{\partial x\, \partial y} = f(x, y).$$

(4) For any region $G \subset \mathbb{R}^2$,

$$P\{(X, Y) \in G\} = \iint_G f(x, y)\, dx\, dy.$$

EX: Let $(X, Y)$ have density $f(x, y)$ (given on the original slide); find $P\{X > Y\}$.

EX: Let $(X, Y)$ have density $f(x, y)$ containing a constant $A$ (given on the original slide). Find: (1) the value of $A$; (2) the value of $F(1, 1)$; (3) the probability that $(X, Y)$ falls in the region $D$: $x \ge 0$, $y \ge 0$, $2x + 3y \le 6$.

Answer: (1) follows from the normalization $\iint_{\mathbb{R}^2} f(x, y)\, dx\, dy = 1$.

3. Bivariate uniform distribution. A bivariate vector $(X, Y)$ is said to follow the uniform distribution on a region $D$ if its density function is

$$f(x, y) = \begin{cases} \dfrac{1}{S_D}, & (x, y) \in D, \\[4pt] 0, & \text{otherwise}, \end{cases}$$

where $S_D$ is the area of $D$. From the definition one can easily verify that if $(X, Y)$ is uniformly distributed on $D$, then for any region $G$,

$$P\{(X, Y) \in G\} = \frac{\operatorname{area}(G \cap D)}{S_D}.$$
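The area-ratio rule lends itself to a quick Monte Carlo check. A sketch under illustrative assumptions (D = unit square, G = {y < 2x}, so the ratio is 3/4):

```python
import random

# D = unit square (area 1), G = {y < 2x}: area(G ∩ D) = 1 - 1/4 = 3/4.
random.seed(1)
n = 200_000
hits = 0
for _ in range(n):
    x, y = random.random(), random.random()
    if y < 2 * x:
        hits += 1
print(hits / n)   # should be close to 0.75
```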

EX: Suppose that $(X, Y)$ is uniformly distributed on a region $D$ (shown as a graph on the original slide). Determine: (1) the density function of $(X, Y)$; (2) $P\{Y < 2X\}$; (3) $F(0.5, 0.5)$.

(2) Two-dimensional normal distribution. Suppose that the density function of $(X, Y)$ is

$$f(x, y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \frac{(x-\mu_1)^2}{\sigma_1^2} - \frac{2\rho(x-\mu_1)(y-\mu_2)}{\sigma_1\sigma_2} + \frac{(y-\mu_2)^2}{\sigma_2^2} \right] \right\},$$

where $\mu_1$, $\mu_2$ are constants and $\sigma_1 > 0$, $\sigma_2 > 0$, $|\rho| < 1$ are also constants. Then $(X, Y)$ is said to follow the two-dimensional normal distribution with parameters $\mu_1, \mu_2, \sigma_1, \sigma_2, \rho$, denoted $(X, Y) \sim N(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$.
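To see the parameters in action, one can sample from this distribution. The sketch below uses NumPy's built-in multivariate normal sampler with illustrative parameter values (not from the text):

```python
import numpy as np

mu1, mu2, sig1, sig2, rho = 1.0, -2.0, 2.0, 0.5, 0.6
mean = [mu1, mu2]
cov = [[sig1 ** 2,           rho * sig1 * sig2],
       [rho * sig1 * sig2,   sig2 ** 2        ]]
rng = np.random.default_rng(0)
xy = rng.multivariate_normal(mean, cov, size=100_000)

print(xy[:, 0].mean(), xy[:, 0].std())        # ~ mu1, sig1 (marginal N(mu1, sig1^2))
print(xy[:, 1].mean(), xy[:, 1].std())        # ~ mu2, sig2
print(np.corrcoef(xy[:, 0], xy[:, 1])[0, 1])  # ~ rho
```

The sample marginals also illustrate the fact, stated in Section 3.2 below, that the marginals of $N(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$ are $N(\mu_1, \sigma_1^2)$ and $N(\mu_2, \sigma_2^2)$.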

The concept of joint distribution generalizes readily to n-dimensional random vectors.

Definition. Suppose that $(X_1, X_2, \dots, X_n)$ is an n-dimensional random vector. If there exists a nonnegative function $f(x_1, x_2, \dots, x_n)$ such that for any $(x_1, x_2, \dots, x_n) \in \mathbb{R}^n$

$$F(x_1, \dots, x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f(t_1, \dots, t_n)\, dt_n \cdots dt_1,$$

then $(X_1, X_2, \dots, X_n)$ is said to follow a continuous n-dimensional distribution with density function $f(x_1, x_2, \dots, x_n)$.

Definition. Suppose that $(X_1, X_2, \dots, X_n)$ takes finitely or countably many points in $\mathbb{R}^n$. Then $(X_1, X_2, \dots, X_n)$ is called a discrete random vector, and

$$P\{X_1 = x_1,\ X_2 = x_2,\ \dots,\ X_n = x_n\}, \quad (x_1, x_2, \dots, x_n) \in \mathbb{R}^n,$$

is its distribution law.

Summary: multidimensional random variables
- Distribution function $F(x, y)$
- Discrete d.f. (joint law $p_{ij}$)
- Continuous d.f. (joint density $f(x, y)$)
- Probability for a region: $P\{(X, Y) \in G\}$

EX: Suppose that $(X, Y)$ has a density function $f(x, y)$ supported on a region $D$ (given on the original slide). Determine: (1) $P\{X \le 0\}$; (2) $P\{X \le 1\}$; (3) $P\{Y \le y_0\}$.

Answer: $P\{X \le 0\} = 0$, ...

3.2 Marginal distribution and independence

Marginal distribution functions for a bivariate vector: define

$$F_X(x) = F(x, +\infty) = \lim_{y \to +\infty} F(x, y) = P\{X \le x\},$$
$$F_Y(y) = F(+\infty, y) = \lim_{x \to +\infty} F(x, y) = P\{Y \le y\},$$

the marginal distribution functions of $(X, Y)$ with respect to X and Y respectively.

Example 1. Suppose that the joint distribution function of $(X, Y)$ is $F(x, y)$ (specified on the original slide). Determine $F_X(x)$ and $F_Y(y)$.

Marginal distribution for a discrete distribution. Suppose that $(X, Y) \sim P\{X = x_i, Y = y_j\} = p_{ij}$, $i, j = 1, 2, \dots$. Define

$$P\{X = x_i\} = p_{i\cdot} = \sum_j p_{ij}, \quad i = 1, 2, \dots,$$
$$P\{Y = y_j\} = p_{\cdot j} = \sum_i p_{ij}, \quad j = 1, 2, \dots,$$

the marginal distribution laws of $(X, Y)$ with respect to X and Y respectively.
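With the joint law stored as a matrix, the marginal laws are just row and column sums. A minimal sketch using the joint law found in Example 2 (the urn draw):

```python
import numpy as np

# Rows index x_i in {0, 1}; columns index y_j in {0, 1}.
p = np.array([[0.3, 0.3],
              [0.3, 0.1]])

p_x = p.sum(axis=1)   # p_i. = sum_j p_ij, marginal law of X
p_y = p.sum(axis=0)   # p_.j = sum_i p_ij, marginal law of Y
print(p_x)            # [0.6 0.4]
print(p_y)            # [0.6 0.4]
```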

Marginal density functions. Suppose that $(X, Y) \sim f(x, y)$, $(x, y) \in \mathbb{R}^2$; define

$$f_X(x) = \int_{-\infty}^{+\infty} f(x, y)\, dy, \quad f_Y(y) = \int_{-\infty}^{+\infty} f(x, y)\, dx,$$

the marginal density functions of $(X, Y)$ with respect to X and Y. One can easily verify that the marginal distributions of $N(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$ are $N(\mu_1, \sigma_1^2)$ and $N(\mu_2, \sigma_2^2)$.

Example 3. Suppose that the joint density function of $(X, Y)$ is $f(x, y)$, containing a constant $c$ (specified on the original slide). Determine: (1) the value of $c$; (2) the marginal density of $(X, Y)$ with respect to X.

Independence of random variables

Definition. X and Y are said to be independent if for any real numbers $a < b$, $c < d$,

$$P\{a < X \le b,\ c < Y \le d\} = P\{a < X \le b\}\, P\{c < Y \le d\},$$

i.e. the event $\{a < X \le b\}$ is independent of $\{c < Y \le d\}$.

Theorem. A sufficient and necessary condition for random variables X and Y to be independent is

$$F(x, y) = F_X(x)\, F_Y(y).$$

Remark.

(1) If $(X, Y)$ is continuously distributed, then a sufficient and necessary condition for X and Y to be independent is $f(x, y) = f_X(x) f_Y(y)$ (almost everywhere).

(2) If $(X, Y)$ is discretely distributed with law $p_{ij} = P\{X = x_i, Y = y_j\}$, $i, j = 1, 2, \dots$, then a sufficient and necessary condition for X and Y to be independent is $p_{ij} = p_{i\cdot}\, p_{\cdot j}$ for all $i, j$.
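For a discrete joint law stored as a matrix, criterion (2) says the matrix must equal the outer product of its marginals. A sketch:

```python
import numpy as np

def is_independent(p, tol=1e-12):
    # X, Y independent  <=>  p_ij = p_i. * p_.j for all i, j.
    p_x = p.sum(axis=1)
    p_y = p.sum(axis=0)
    return np.allclose(p, np.outer(p_x, p_y), atol=tol)

print(is_independent(np.array([[0.3, 0.3], [0.3, 0.1]])))  # False (urn example)
print(is_independent(np.outer([0.6, 0.4], [0.6, 0.4])))    # True by construction
```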

EX: Determine whether the $(X, Y)$ in Example 1, Example 2, and Example 3 above are independent or not.

Example 4. Suppose that the distribution law of $(X, Y)$ is given by the following chart (shown on the original slide) and that X is independent of Y; determine a and b.

§ 3.3 CONDITIONAL DISTRIBUTION

Definition 3.7. Suppose that $(X, Y)$ has a discrete two-dimensional distribution $p_{ij} = P\{X = x_i, Y = y_j\}$. For a given $j$, if $p_{\cdot j} = P\{Y = y_j\} > 0$, then the conditional distribution law of X given $Y = y_j$ is defined as

$$P\{X = x_i \mid Y = y_j\} = \frac{p_{ij}}{p_{\cdot j}}, \quad i = 1, 2, \dots \tag{3.14}$$

It satisfies (1) $P\{X = x_i \mid Y = y_j\} \ge 0$ and (2) $\sum_i P\{X = x_i \mid Y = y_j\} = 1$.
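Formula (3.14) amounts to dividing each column of the joint matrix by its column sum. A sketch on the urn example:

```python
import numpy as np

p = np.array([[0.3, 0.3],
              [0.3, 0.1]])
p_y = p.sum(axis=0)   # p_.j, must be > 0
cond = p / p_y        # column j holds P{X = . | Y = y_j}
print(cond)
# [[0.5  0.75]
#  [0.5  0.25]]   (each column sums to 1)
```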

Definition 3.9. Suppose that for any $\varepsilon > 0$, $P\{y - \varepsilon < Y \le y + \varepsilon\} > 0$ holds. If the limit

$$\lim_{\varepsilon \to 0^+} P\{X \le x \mid y - \varepsilon < Y \le y + \varepsilon\}$$

exists, then it is called the conditional distribution function of X given $Y = y$, denoted $F_{X|Y}(x \mid y)$ or $P\{X \le x \mid Y = y\}$.

Theorem 3.6. Suppose that $(X, Y)$ has a continuous density function $f(x, y)$ and $f_Y(y) > 0$. Then

$$F_{X|Y}(x \mid y) = \int_{-\infty}^{x} \frac{f(u, y)}{f_Y(y)}\, du \tag{3.15}$$

is a continuous distribution function, and its density

$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}$$

is called the conditional density function of X given $Y = y$.

3.4 Functions of random vectors

Functions of discrete random vectors. Suppose that $(X, Y) \sim P\{X = x_i, Y = y_j\} = p_{ij}$, $i, j = 1, 2, \dots$. Then $Z = g(X, Y)$ has law

$$P\{Z = z_k\} = \sum_{g(x_i, y_j) = z_k} p_{ij}, \quad k = 1, 2, \dots,$$

which can be read off the table

  (X, Y)       | (x_1, y_1)    (x_1, y_2)    ...  (x_i, y_j)    ...
  p_ij         | p_11          p_12          ...  p_ij          ...
  Z = g(X, Y)  | g(x_1, y_1)   g(x_1, y_2)   ...  g(x_i, y_j)   ...

by merging columns with equal values of $g(x_i, y_j)$.

EX: Suppose that X and Y are independent and both follow the same 0-1 distribution with law

  X | 0   1
  P | q   p

Determine the distribution laws of (1) $W = X + Y$; (2) $V = \max(X, Y)$; (3) $U = \min(X, Y)$; (4) the joint distribution law of W and V.

Answer: since X and Y are independent, $p_{ij} = P\{X = i\}P\{Y = j\}$, and each function of $(X, Y)$ can be tabulated:

  (X, Y)         | (0, 0)  (0, 1)  (1, 0)  (1, 1)
  p_ij           | q^2     qp      pq      p^2
  W = X + Y      | 0       1       1       2
  V = max(X, Y)  | 0       1       1       1
  U = min(X, Y)  | 0       0       0       1

so, for example, $P\{W = 1\} = 2pq$ and $P\{V = 1\} = 2pq + p^2$. The joint law of $(W, V)$ is obtained by grouping columns by the pair of values $(W, V)$.
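The same tabulation can be done symbolically. The sketch below (using SymPy, purely for illustration) reproduces the laws of W, V, U:

```python
from collections import defaultdict
import sympy as sp

p = sp.Symbol('p')
q = 1 - p
# Independent 0-1 variables: P{X=1} = P{Y=1} = p.
pmf = {(0, 0): q * q, (0, 1): q * p, (1, 0): p * q, (1, 1): p * p}

dist = {'W': defaultdict(int), 'V': defaultdict(int), 'U': defaultdict(int)}
for (x, y), pr in pmf.items():
    dist['W'][x + y] += pr       # W = X + Y
    dist['V'][max(x, y)] += pr   # V = max(X, Y)
    dist['U'][min(x, y)] += pr   # U = min(X, Y)

for name, law in dist.items():
    print(name, {k: sp.expand(v) for k, v in law.items()})
# W: {0: q^2, 1: 2pq, 2: p^2}; V: {0: q^2, 1: 2pq + p^2}; U: {0: q^2 + 2pq, 1: p^2}
```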

Example 3.17 and Example 3.18 concern sums of independent random variables with named distributions; the specific distributions and the resulting laws of the sums were given as formulas on the original slides.

Let $f(x, y)$ be the joint density function of $(X, Y)$. Then the density function of $Z = X + Y$ can be given in the following way:

$$f_Z(z) = \int_{-\infty}^{+\infty} f(x, z - x)\, dx \quad \text{and} \quad f_Z(z) = \int_{-\infty}^{+\infty} f(z - y, y)\, dy.$$

Suppose $f(x, y)$ is the density function of $(X, Y)$, the transformation $(u, v) = g(x, y)$ has continuous partial derivatives, and the inverse transformation $x = x(u, v)$, $y = y(u, v)$ exists. The Jacobian determinant $J$ is defined as follows:

$$J = \frac{\partial(x, y)}{\partial(u, v)} = \begin{vmatrix} \dfrac{\partial x}{\partial u} & \dfrac{\partial x}{\partial v} \\[6pt] \dfrac{\partial y}{\partial u} & \dfrac{\partial y}{\partial v} \end{vmatrix}.$$

Then the joint density function of $(U, V)$ can be determined as

$$f_{U,V}(u, v) = f\bigl(x(u, v),\, y(u, v)\bigr)\, |J|.$$

Example 3.23. Suppose that X is independent of Y, with marginal densities $f_X(x)$ and $f_Y(y)$ respectively. Then the density function of $Z = X + Y$ is given by the convolution

$$f_Z(z) = \int_{-\infty}^{+\infty} f_X(x)\, f_Y(z - x)\, dx.$$

Remark. Under the conditions of Example 3.23, one can equally write

$$f_Z(z) = \int_{-\infty}^{+\infty} f_X(z - y)\, f_Y(y)\, dy.$$
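The convolution integral can be checked numerically. A sketch under illustrative assumptions (X ~ Exp(1) and Y ~ Exp(2) independent, for which the exact density of Z = X + Y is $f_Z(z) = 2(e^{-z} - e^{-2z})$, $z > 0$):

```python
import numpy as np

z = 1.5
n = 100_000
dx = z / n
x = (np.arange(n) + 0.5) * dx   # midpoint rule on [0, z];
                                # the integrand vanishes outside [0, z]
fz = np.sum(np.exp(-x) * 2 * np.exp(-2 * (z - x))) * dx
print(fz)                                 # ~ 0.34669
print(2 * (np.exp(-z) - np.exp(-2 * z)))  # exact value
```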

Numerical characteristics of random vectors

1. Definition. Suppose that the variances of r.v. X and Y exist. The expectation

$$\operatorname{Cov}(X, Y) = E\{[X - E(X)][Y - E(Y)]\} = E(XY) - E(X)E(Y)$$

is called the covariance of X and Y. If $\operatorname{Cov}(X, Y) = 0$, it is said that X and Y are uncorrelated.

What is the difference between the concept of "X and Y are independent" and that of "X and Y are uncorrelated"?

Example 2. Suppose that $(X, Y)$ is uniformly distributed on the disk $D = \{(x, y) : x^2 + y^2 \le 1\}$. Prove that X and Y are uncorrelated but not independent.

Proof. The density is $f(x, y) = 1/\pi$ on $D$ and 0 elsewhere.

By symmetry $E(X) = E(Y) = 0$ and $E(XY) = 0$, so $\operatorname{Cov}(X, Y) = 0$; thus X and Y are uncorrelated. Since the marginal densities are

$$f_X(x) = \frac{2}{\pi}\sqrt{1 - x^2}\ \ (|x| \le 1), \quad f_Y(y) = \frac{2}{\pi}\sqrt{1 - y^2}\ \ (|y| \le 1),$$

we have $f_X(x) f_Y(y) \ne f(x, y)$ on $D$. Thus, X is not independent of Y.
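A simulation makes the distinction tangible: the sample covariance is near zero, yet conditioning on X clearly changes the distribution of Y. A sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(400_000, 2))
pts = pts[(pts ** 2).sum(axis=1) <= 1]   # rejection sampling: uniform on the disk
x, y = pts[:, 0], pts[:, 1]

print(np.mean(x * y) - x.mean() * y.mean())        # sample covariance ~ 0
print(np.mean(np.abs(y) < 0.2))                    # P{|Y| < 0.2} ~ 0.25
print(np.mean(np.abs(y[np.abs(x) > 0.9]) < 0.2))   # much larger: Y depends on X
```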

2. Properties of covariance:

(1) $\operatorname{Cov}(X, Y) = \operatorname{Cov}(Y, X)$;
(2) $\operatorname{Cov}(X, X) = D(X)$; $\operatorname{Cov}(X, c) = 0$ for a constant c;
(3) $\operatorname{Cov}(aX, bY) = ab\operatorname{Cov}(X, Y)$, where a, b are constants.

Proof of (3): $\operatorname{Cov}(aX, bY) = E(aX \cdot bY) - E(aX)E(bY) = abE(XY) - aE(X)\, bE(Y) = ab[E(XY) - E(X)E(Y)] = ab\operatorname{Cov}(X, Y)$.

(4) $\operatorname{Cov}(X + Y, Z) = \operatorname{Cov}(X, Z) + \operatorname{Cov}(Y, Z)$.

Proof: $\operatorname{Cov}(X+Y, Z) = E[(X+Y)Z] - E(X+Y)E(Z) = E(XZ) + E(YZ) - E(X)E(Z) - E(Y)E(Z) = \operatorname{Cov}(X, Z) + \operatorname{Cov}(Y, Z)$.

(5) $D(X + Y) = D(X) + D(Y) + 2\operatorname{Cov}(X, Y)$.

Proof: $D(X+Y) = E\{[(X - EX) + (Y - EY)]^2\} = D(X) + D(Y) + 2E[(X - EX)(Y - EY)]$.

Remark: $D(X - Y) = D[X + (-Y)] = D(X) + D(Y) - 2\operatorname{Cov}(X, Y)$.
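Property (5) holds exactly for sample moments as well, which gives a quick numerical check on deliberately correlated data (illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)   # correlated with x by construction

lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y, bias=True)[0, 1]
print(lhs, rhs)   # identical up to floating-point error
```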

Summary:
- Definition of covariance
- Properties of covariance
- Independent vs. uncorrelated
- Correlation coefficient

Correlation coefficient

Definition. Suppose that r.v. X, Y have finite variances with $DX > 0$, $DY > 0$. Introduce

$$X^* = \frac{X - EX}{\sqrt{DX}},$$

the standardization of X; obviously $EX^* = 0$ and $DX^* = 1$. Then

$$\rho_{XY} = \operatorname{Cov}(X^*, Y^*) = \frac{\operatorname{Cov}(X, Y)}{\sqrt{DX}\,\sqrt{DY}}$$

is called the correlation coefficient of X and Y.

Properties of the correlation coefficient:

(1) $|\rho_{XY}| \le 1$;
(2) $|\rho_{XY}| = 1$ if and only if there exist constants a, b such that $P\{Y = aX + b\} = 1$;
(3) X and Y are uncorrelated $\iff \rho_{XY} = 0$.

Exercise 1. Suppose that $(X, Y)$ is uniformly distributed on $D$: $0 < x < 1$, $0 < y < x$; determine the correlation coefficient of X and Y.

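Carrying out the integrals over the triangle gives $EX = 2/3$, $EY = 1/3$, $DX = DY = 1/18$, $\operatorname{Cov}(X, Y) = 1/36$, hence $\rho_{XY} = 1/2$. A Monte Carlo estimate agrees:

```python
import numpy as np

# (X, Y) uniform on D: 0 < y < x < 1, by rejection from the unit square.
rng = np.random.default_rng(7)
x = rng.uniform(0, 1, 500_000)
y = rng.uniform(0, 1, 500_000)
keep = y < x
print(np.corrcoef(x[keep], y[keep])[0, 1])   # ~ 0.5
```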

What does Example 2 indicate?

Answer:
1) If X and Y are independent, then they are uncorrelated.
2) The converse is false in general: X and Y may be uncorrelated yet not independent.

However, if $(X, Y)$ follows a two-dimensional normal distribution, then "X and Y are independent" is equivalent to "X and Y are uncorrelated".