Chapter 4. Multiple Random Variables (presentation transcript)

Slide 1. Chapter 4. Multiple Random Variables. Ex. 4.1. Select a student's name from an urn: the outcome in S determines several numerical quantities about that student. In some random experiments, a number of different quantities are measured.

Slide 2. 4.1 Vector Random Variables. A vector random variable X is a function that assigns a vector of real numbers to each outcome in S, the sample space of the random experiment.

Slide 3. Event Examples. Consider the two-dimensional random variable X = (X, Y), and find the region of the plane corresponding to given events.

Slide 4. Product Form. We are particularly interested in events that have the product form, e.g., the rectangle {x1 < X <= x2} ∩ {y1 < Y <= y2} (the slide shows this rectangle in the plane with corners at x1, x2, y1, y2).

Slide 5. Product Form. A fundamental problem in modeling a system with a vector random variable is specifying the probability of product-form events. Many events of interest are not of product form; however, non-product-form events can be approximated by unions of disjoint product-form events, e.g., a non-rectangular region approximated by a union of thin rectangles.

Slide 6. 4.2 Pairs of Random Variables. A. Pairs of discrete random variables. Let X = (X, Y) assume values from a countable set {(x_j, y_k): j = 1, 2, ...; k = 1, 2, ...}. The joint pmf of X is p_{X,Y}(x_j, y_k) = P[X = x_j, Y = y_k]; it gives the probability of the occurrence of the pair (x_j, y_k). The probability of any event A is the sum of the pmf over the outcomes in A: P[X in A] = sum of p_{X,Y}(x_j, y_k) over (x_j, y_k) in A. When A = S, this sum equals 1.

Slide 7. Marginal pmf. We are also interested in the probabilities of events involving each of the random variables in isolation. These can be found from the marginal pmfs: p_X(x_j) = P[X = x_j] = sum over k of p_{X,Y}(x_j, y_k), and similarly for p_Y(y_k). In general, knowledge of the marginal pmfs is insufficient to specify the joint pmf.

Slide 8. Ex. 4.6. Loaded dice: a random experiment consists of tossing two loaded dice and noting the pair of numbers (X, Y) facing up. The joint pmf is p(j, k) = 2/42 if j = k and 1/42 if j ≠ k, for j, k = 1, ..., 6:

      k=1   k=2   k=3   k=4   k=5   k=6
j=1   2/42  1/42  1/42  1/42  1/42  1/42
j=2   1/42  2/42  1/42  1/42  1/42  1/42
j=3   1/42  1/42  2/42  1/42  1/42  1/42
j=4   1/42  1/42  1/42  2/42  1/42  1/42
j=5   1/42  1/42  1/42  1/42  2/42  1/42
j=6   1/42  1/42  1/42  1/42  1/42  2/42

The marginal pmfs are uniform: P[X = j] = P[Y = k] = 7/42 = 1/6 (verified numerically below).
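The structure of this table can be checked in a few lines; the sketch below (Python, with the pmf values taken from the table above) verifies normalization and the marginal pmfs.

```python
# Sketch: verify the loaded-dice joint pmf of Ex. 4.6 numerically.
# p(j,k) = 2/42 on the diagonal and 1/42 off it, as in the table above.
import numpy as np

pmf = np.full((6, 6), 1 / 42)
np.fill_diagonal(pmf, 2 / 42)

assert np.isclose(pmf.sum(), 1.0)   # joint pmf sums to 1
print(pmf.sum(axis=1))              # marginal of X: each entry 1/6
print(pmf.sum(axis=0))              # marginal of Y: each entry 1/6
```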

Slide 9. Ex. 4.7. Packetization problem: the number of bytes N in a message has a geometric distribution with parameter 1-p and range S_N = {0, 1, 2, ...}. Suppose that messages are broken into packets of maximum length M bytes. Let Q be the number of full packets and let R be the number of bytes left over. Find the joint pmf and marginal pmfs of Q and R (a worked sketch follows below).
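The joint and marginal pmfs follow directly from the stated model; the sketch below works them out using only the given geometric pmf P[N = k] = (1-p)p^k, k >= 0, with Q = floor(N/M) and R = N mod M.

```latex
% Ex. 4.7 sketch: the event {Q = q, R = r} is exactly {N = qM + r}, so
\[ P[Q=q, R=r] = P[N = qM + r] = (1-p)\,p^{qM+r}, \qquad q \ge 0,\; 0 \le r \le M-1. \]
% Summing the geometric series over the other variable gives the marginals:
\[ P[Q=q] = \sum_{r=0}^{M-1} (1-p)p^{qM+r} = (1-p^{M})(p^{M})^{q}, \qquad
   P[R=r] = \sum_{q=0}^{\infty} (1-p)p^{qM+r} = \frac{(1-p)\,p^{r}}{1-p^{M}}. \]
% The joint pmf is the product of the marginals, so Q and R are independent (cf. Ex. 4.16).
```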

Slide 10. Joint cdf of X and Y. The joint cdf of X and Y is defined as the probability of the product-form event {X <= x} ∩ {Y <= y}: F_{X,Y}(x, y) = P[X <= x, Y <= y]. The marginal cdfs are F_X(x) = F_{X,Y}(x, ∞) and F_Y(y) = F_{X,Y}(∞, y).

Slide 11. Properties of the joint cdf of X and Y: F_{X,Y} is nondecreasing in each argument; F_{X,Y}(x, -∞) = F_{X,Y}(-∞, y) = 0 and F_{X,Y}(∞, ∞) = 1; the probability of a rectangle is P[x1 < X <= x2, y1 < Y <= y2] = F(x2, y2) - F(x2, y1) - F(x1, y2) + F(x1, y1).

Slide 12. (Figure: regions A and B used in computing probabilities from the joint cdf; not preserved in the transcript.)

Slide 13. Joint pdf of two jointly continuous random variables. X and Y are jointly continuous if the probabilities of events involving (X, Y) can be expressed as an integral of a joint pdf: P[(X, Y) in A] = double integral over A of f_{X,Y}(x', y') dx' dy'.

Slide 14. Marginal pdf: obtained by integrating out the variables that are not of interest, f_X(x) = integral over y of f_{X,Y}(x, y) dy, and similarly for f_Y(y). A numerical illustration follows below.
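As a concrete illustration, the sketch below marginalizes a joint pdf numerically; the pdf used (f(x, y) = x + y on the unit square, which integrates to 1) is a hypothetical stand-in for illustration, not one of the textbook's examples.

```python
# Sketch: numerically marginalize a joint pdf by integrating out y.
from scipy.integrate import quad, dblquad

f = lambda x, y: x + y                     # hypothetical joint pdf on [0,1]x[0,1]

total, _ = dblquad(lambda y, x: f(x, y), 0, 1, 0, 1)
print(total)                               # ~1.0: normalization check

def marginal_x(x):
    val, _ = quad(lambda y: f(x, y), 0, 1)  # f_X(x) = integral of f(x,y) dy
    return val

print(marginal_x(0.5))                     # f_X(0.5) = 0.5 + 1/2 = 1.0
```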

Slide 15. Ex. 4.10. A randomly selected point (X, Y) in the unit square has the uniform joint pdf f_{X,Y}(x, y) = 1 for 0 <= x <= 1 and 0 <= y <= 1, and 0 elsewhere.

Slide 16. Ex. 4.11. Find the normalization constant c and the marginal pdfs for the given joint pdf (the formula was not preserved in the transcript; a hedged reconstruction follows below).
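The pdf itself was lost from the transcript. A hedged reconstruction, assuming the version of this example commonly used with this chapter (and consistent with Ex. 4.17's remark that the pdf "appears to factor"), is:

```latex
% Assumed form: f_{X,Y}(x,y) = c\,e^{-x}e^{-y} for 0 \le y \le x < \infty, 0 elsewhere.
% Normalization:
\[ \int_0^\infty\!\!\int_0^x c\,e^{-x}e^{-y}\,dy\,dx
   = c\int_0^\infty e^{-x}\bigl(1-e^{-x}\bigr)\,dx = \tfrac{c}{2} = 1
   \quad\Rightarrow\quad c = 2. \]
% Marginals:
\[ f_X(x) = \int_0^x 2e^{-x}e^{-y}\,dy = 2e^{-x}\bigl(1-e^{-x}\bigr),\ x \ge 0, \qquad
   f_Y(y) = \int_y^\infty 2e^{-x}e^{-y}\,dx = 2e^{-2y},\ y \ge 0. \]
% The integrand factors, but the triangular support couples X and Y (cf. Ex. 4.17).
```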

Slide 17. Ex. 4.12 (the worked example's content was not preserved in the transcript).

Slide 18. Ex. 4.13. The joint pdf of X and Y is the jointly Gaussian pdf (a sketch of the standard form follows below). We say that X and Y are jointly Gaussian. Find the marginal pdfs.
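The slide's pdf was not preserved; a sketch assuming the zero-mean, unit-variance form with correlation coefficient ρ (the form this example conventionally uses) is:

```latex
\[ f_{X,Y}(x,y) = \frac{1}{2\pi\sqrt{1-\rho^2}}
   \exp\!\left( -\frac{x^2 - 2\rho x y + y^2}{2(1-\rho^2)} \right). \]
% Completing the square in y and integrating it out gives a standard normal marginal:
\[ f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}, \]
% and by symmetry f_Y(y) is also N(0,1).
```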

Slide 19. 4.3 Independence of Two Random Variables. X and Y are independent random variables if any event A1 defined in terms of X is independent of any event A2 defined in terms of Y: P[X in A1, Y in A2] = P[X in A1] P[Y in A2]. Suppose X and Y are discrete random variables, and we are interested in the probability of an event A = A1 ∩ A2, where A1 involves only X and A2 involves only Y. One direction is immediate: if X and Y are independent, then A1 and A2 are independent events, and in particular the joint pmf factors as p_{X,Y}(x_j, y_k) = p_X(x_j) p_Y(y_k).

Slide 20. (The converse direction, showing that factorization of the joint pmf implies independence; the equations were not preserved in the transcript.)

Slide 21. In general, X and Y are independent if and only if the joint cdf factors: F_{X,Y}(x, y) = F_X(x) F_Y(y) for all x and y. If X and Y are independent r.v.s, then g(X) and h(Y) are also independent: A and A' are equivalent events and B and B' are equivalent events, since each event defined in terms of g(X) corresponds to an event defined in terms of X, and likewise for h(Y) and Y.

Slide 22. Ex. 4.15 In the loaded-dice experiment of Ex. 4.6, the two numbers shown are not independent (a numerical check follows below). Ex. 4.16 Q and R in Ex. 4.7 are independent. Ex. 4.17 X and Y in Ex. 4.11 are not independent, even though the joint pdf appears to factor.
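A minimal numerical check of Ex. 4.15, reusing the loaded-dice pmf of Ex. 4.6: X and Y are independent iff the joint pmf equals the outer product of the marginals.

```python
# Sketch: independence test for the loaded dice of Ex. 4.6 / Ex. 4.15.
import numpy as np

pmf = np.full((6, 6), 1 / 42)
np.fill_diagonal(pmf, 2 / 42)

px = pmf.sum(axis=1)                        # marginal of X (all 1/6)
py = pmf.sum(axis=0)                        # marginal of Y (all 1/6)
print(np.allclose(pmf, np.outer(px, py)))   # False: X and Y are not independent
```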

Slide 23. 4.4 Conditional Probability and Conditional Expectation. Many random variables of practical interest are not independent; we are often interested in the conditional probability P[Y in A] given X = x. A. If X is discrete, we can obtain the conditional cdf of Y given X = x_k directly: F_Y(y | x_k) = P[Y <= y, X = x_k] / P[X = x_k], for P[X = x_k] > 0. The conditional pdf, if the derivative exists, is f_Y(y | x_k) = d/dy F_Y(y | x_k).

Slide 24. If X and Y are independent, conditioning has no effect: F_Y(y | x_k) = F_Y(y). If X and Y are discrete, p_Y(y_j | x_k) = p_{X,Y}(x_k, y_j) / p_X(x_k); if X and Y are independent, this reduces to p_Y(y_j | x_k) = p_Y(y_j).

Slide 25. B. If X is continuous, P[X = x] = 0, so the conditional cdf of Y given X = x is defined as a limit, yielding F_Y(y | x) = (integral from -∞ to y of f_{X,Y}(x, y') dy') / f_X(x). The conditional pdf is f_Y(y | x) = f_{X,Y}(x, y) / f_X(x).

Slide 26. Theorem on total probability. When X is discrete, P[Y in A] = sum over k of P[Y in A | X = x_k] p_X(x_k); when X is continuous, P[Y in A] = integral of P[Y in A | X = x] f_X(x) dx. Y itself may be discrete or continuous in either case, giving the four combinations noted on the slide.

Slides 27-29. (Worked examples on conditional pmfs and pdfs; the content was not preserved in the transcript.)

Slide 30. Conditional Expectation. The conditional expectation of Y given X = x is E[Y | x] = integral of y f_Y(y | x) dy, or E[Y | x_k] = sum over j of y_j p_Y(y_j | x_k) if X and Y are discrete.

Slide 31. A key consequence is E[Y] = E[E[Y | X]], i.e., E[Y] = integral of E[Y | x] f_X(x) dx (or the corresponding sum when X is discrete); it can be generalized to E[h(Y)] = E[E[h(Y) | X]]. A quick numerical check follows below.
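A Monte Carlo sanity check of E[Y] = E[E[Y|X]], under a hypothetical model chosen only for illustration: X ~ Uniform(0, 1) and Y | X = x ~ Exponential with mean x, so E[Y|X] = X and hence E[Y] = E[X] = 0.5.

```python
# Sketch: verify the smoothing identity E[Y] = E[E[Y|X]] by simulation.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=1_000_000)
y = rng.exponential(scale=x)     # scale is the conditional mean, so E[Y|X] = X

print(y.mean())                  # ~0.5: direct estimate of E[Y]
print(x.mean())                  # ~0.5: estimate of E[E[Y|X]] = E[X]
```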

Slide 32. (Table: a joint pmf over pairs (X, Y) with X in {0, 1, 2, 3} and 0 <= Y <= X; only the entry P[X=0, Y=0] = 0.1 survives in the transcript. The computed results are E[Y] = 1 and E[X] = 2.0.)

Slide 33. (Content not preserved in the transcript.)

Slide 34. Ex. 4.25 Find the mean of Y in Ex. 4.22 using conditional expectation. Ex. 4.26 Find the mean and variance of the number of customer arrivals N during the service time T of a specific customer in Ex. 4.23.

Slide 35. 4.5 Multiple Random Variables. We extend the methods for specifying probabilities of pairs of random variables to the case of n random variables. We say that X1, X2, ..., Xn are jointly continuous random variables if the probability of any event involving them can be expressed as an n-fold integral of a joint pdf.

Slide 36. (Content not preserved in the transcript.)

Slide 37. Example: X1 and X3 are independent zero-mean, unit-variance Gaussian r.v.s. (The remaining details of this example were not preserved.)

Slide 38. Independence: X1, ..., Xn are independent if and only if the joint pmf (or pdf) factors into the product of the n marginals (the slide's equations were not preserved).

Slide 39. 4.6 Functions of Several Random Variables. Quite often we are interested in one or more functions of the random variables involved in an experiment, for example the sum, maximum, or minimum of X1, X2, ..., Xn.

Slide 40. Example 4.31 Z = X + Y. The pdf of Z is given by a superposition integral, f_Z(z) = integral of f_{X,Y}(x, z - x) dx. If X and Y are independent r.v.s, this reduces to the convolution integral f_Z(z) = integral of f_X(x) f_Y(z - x) dx (a discrete analogue is sketched below).
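For discrete r.v.s the same idea becomes a discrete convolution; the sketch below uses two small hypothetical pmfs chosen only for illustration.

```python
# Sketch: pmf of Z = X + Y for independent discrete r.v.s is the convolution
# of the individual pmfs (discrete analogue of the convolution integral above).
import numpy as np

px = np.array([0.2, 0.5, 0.3])   # hypothetical pmf of X on values {0, 1, 2}
py = np.array([0.6, 0.4])        # hypothetical pmf of Y on values {0, 1}

pz = np.convolve(px, py)         # pmf of Z on values {0, 1, 2, 3}
print(pz, pz.sum())              # sums to 1
```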

Slide 41. Example 4.32 Sum of non-independent r.v.s: Z = X + Y, where X and Y are zero-mean, unit-variance Gaussian r.v.s with correlation coefficient ρ.

Slide 42. The sum of these two non-independent Gaussian r.v.s is also a Gaussian r.v., with mean 0 and variance VAR(X) + VAR(Y) + 2 COV(X, Y) = 2(1 + ρ).

Slide 43. Ex. 4.33 A system with standby redundancy: let T1 and T2 be the lifetimes of the two components; they are independent and exponentially distributed with the same mean. The system lifetime is T = T1 + T2, whose pdf is the Erlang density with m = 2 (derived below).
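The Erlang result follows from the convolution integral of slide 40; a short derivation, with λ denoting the common rate of the two exponentials:

```latex
% Convolution of two independent Exponential(\lambda) pdfs (Ex. 4.33):
\[ f_T(t) = \int_0^{t} \lambda e^{-\lambda x}\,\lambda e^{-\lambda(t-x)}\,dx
          = \lambda^2 e^{-\lambda t}\int_0^{t} dx
          = \lambda^2 t\,e^{-\lambda t}, \qquad t \ge 0, \]
% which is the Erlang density with m = 2 stages.
```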

Slide 44. Let Z = g(X, Y). Given Y = y, Z = g(X, y) is a function of the single r.v. X, so we can first find f_Z(z | y) from f_X(x | y) and then find f_Z(z) = integral of f_Z(z | y) f_Y(y) dy. In this way the conditional pdf can be used to find the pdf of a function of several random variables.

Slide 45. Example 4.34 Z = X / Y, where X and Y are independent and exponentially distributed with mean one. Given Y = y, Z = X / y is a scaled version of X.

Slides 46-47. (Continuation of Ex. 4.34; the derivation was not preserved. A sketch follows below.)
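A sketch of the missing computation, using only the assumptions stated on slide 45 (X, Y independent, each exponential with mean 1):

```latex
% Given Y = y, Z = X/y has pdf f_Z(z \mid y) = y\,f_X(yz) = y\,e^{-yz}, z \ge 0, so
\[ f_Z(z) = \int_0^{\infty} f_Z(z \mid y)\,f_Y(y)\,dy
          = \int_0^{\infty} y\,e^{-yz} e^{-y}\,dy
          = \frac{1}{(1+z)^2}, \qquad z \ge 0. \]
% The last step uses \int_0^\infty y e^{-ay}\,dy = 1/a^2 with a = 1 + z.
```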

Slides 48-49. (Figures showing regions of the plane with corner point (z, z), used in deriving cdfs of the form F(z, z); the figures were not preserved.)

Slide 50. Transformation of Random Vectors. For Z = (g1(X), ..., gn(X)), the joint cdf of Z is F_Z(z1, ..., zn) = P[g1(X) <= z1, ..., gn(X) <= zn].

Slide 51. Example 4.35 W = min(X, Y), Z = max(X, Y). The joint cdf takes different forms in the two cases z > w and z < w; a sketch follows below.
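A sketch of the two cases, written in terms of the joint cdf F_{X,Y} of the original pair:

```latex
% Since {W \le w, Z \le z} constrains the min and max of (X, Y):
\[ F_{W,Z}(w,z) =
\begin{cases}
F_{X,Y}(w,z) + F_{X,Y}(z,w) - F_{X,Y}(w,w), & z > w,\\[2pt]
F_{X,Y}(z,z), & z < w.
\end{cases} \]
% For z > w the event is {both \le z} minus {both in (w, z]};
% for z < w the constraint Z \le z already forces W \le w.
```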

Slide 52. pdf of a Linear Transformation. V = aX + bY, W = cX + eY; assume ae - bc ≠ 0, so the transformation is invertible. The equivalent event in the (X, Y) plane leads to f_{V,W}(v, w) = f_{X,Y}(x, y) / |ae - bc|, where (x, y) is the solution of the two linear equations for the given (v, w).

Slides 53-54. (Derivation via the probability dP of an infinitesimal region under the transformation; not preserved in the transcript.)

Slide 55. Example 4.36 X and Y jointly Gaussian. (The specific linear transformation applied was not preserved.)

Slide 56. V and W are independent, zero-mean Gaussian r.v.s (their variances were not preserved in the transcript). See Fig. 4-16: contours of equal value of the joint pdf of X and Y.

Slides 57-58. Pdf of a General Transformation: assume the transformation is invertible (see Fig. 4.17a); the derivation was not preserved.

Slide 59. The Jacobian of the transformation is the determinant J(x, y) = det[∂v/∂x ∂v/∂y; ∂w/∂x ∂w/∂y], and the Jacobian of the inverse transformation is J(v, w). It can be shown that |J(v, w)| = 1 / |J(x, y)|, so f_{V,W}(v, w) = f_{X,Y}(x, y) / |J(x, y)|, evaluated at the inverse image of (v, w).

Slide 60. Example 4.37 X and Y are zero-mean, unit-variance, independent Gaussian r.v.s; the transformation maps (X, Y) to polar-type coordinates (the exact statement was not preserved).

Slide 61. V and W are independent, and V has a Rayleigh pdf. The transformation method can be used even if we are interested in only one function of the random variables, by defining an "auxiliary" r.v. for the second coordinate. A Monte Carlo check follows below.
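A Monte Carlo check that the radius of two independent standard Gaussians is Rayleigh-distributed, using the known Rayleigh moments E[V] = sqrt(π/2) and E[V²] = 2 for this case:

```python
# Sketch: with X, Y independent N(0,1), V = sqrt(X^2 + Y^2) should be Rayleigh.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)

v = np.hypot(x, y)                     # radius
print(v.mean(), np.sqrt(np.pi / 2))    # sample mean vs. Rayleigh mean ~1.2533
print((v**2).mean())                   # ~2: E[V^2] = E[X^2] + E[Y^2]
```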

Slide 62. Ex. 4.38 X is a zero-mean, unit-variance Gaussian r.v.; Y is a chi-square r.v. with n degrees of freedom; X and Y are independent. Find the pdf of the ratio of interest (the exact expression was not preserved; see the hedged note below). Let W = Y be the auxiliary variable; then proceed as before.
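The target expression was lost from the transcript. In the usual version of this example it is the ratio below, which leads to the Student's t pdf; treat this as an assumption, not as the slide's own statement:

```latex
% Assumed target: V = X / \sqrt{Y/n}. Carrying out the transformation with
% auxiliary variable W = Y and integrating out w yields the Student's t pdf
% with n degrees of freedom:
\[ f_V(v) = \frac{\Gamma\!\left(\tfrac{n+1}{2}\right)}
                 {\sqrt{n\pi}\;\Gamma\!\left(\tfrac{n}{2}\right)}
            \left(1 + \frac{v^2}{n}\right)^{-(n+1)/2}. \]
```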

Slides 63-64. (Continuation of Ex. 4.38; the derivation was not preserved.)

Slide 65. 4.7 Expected Value of Functions of Random Variables. For Z = g(X, Y), E[Z] = double integral of g(x, y) f_{X,Y}(x, y) dx dy (with the corresponding double sum in the discrete case). Ex. 4.39 Z = X + Y: E[Z] = E[X] + E[Y], and X and Y need not be independent. In general, the expected value of a sum of n random variables equals the sum of their expected values.

Slide 66. Ex. 4.40 For independent r.v.s X and Y, E[g1(X) g2(Y)] = E[g1(X)] E[g2(Y)]. The jk-th joint moment of X and Y is E[X^j Y^k]; when j = 1 and k = 1, E[XY] is called the correlation of X and Y. If E[XY] = 0, then X and Y are orthogonal.

Slide 67. The jk-th central moment of X and Y is E[(X - E[X])^j (Y - E[Y])^k]. When j = 1 and k = 1 it is the covariance of X and Y: COV(X, Y) = E[(X - E[X])(Y - E[Y])]. Expanding, COV(X, Y) = E[XY - X E[Y] - Y E[X] + E[X] E[Y]] = E[XY] - 2 E[X] E[Y] + E[X] E[Y] = E[XY] - E[X] E[Y]. Ex. 4.41 If X and Y are independent, COV(X, Y) = E[(X - E[X])(Y - E[Y])] = E[X - E[X]] E[Y - E[Y]] = 0.

Slide 68. The correlation coefficient of X and Y is ρ = COV(X, Y) / (σ_X σ_Y). X and Y are uncorrelated if ρ = 0. If X and Y are independent, then COV(X, Y) = 0, so X and Y are uncorrelated. X and Y being uncorrelated, however, does not necessarily imply that they are independent.

Slide 69. That X and Y are uncorrelated does not necessarily imply that they are independent; Ex. 4.42 gives a counterexample (a sketch of the standard version follows below).
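A sketch of the standard counterexample, assuming the usual form of Ex. 4.42 (Θ uniform on (0, 2π), X = cos Θ, Y = sin Θ): the pair is deterministically dependent, since X² + Y² = 1, yet uncorrelated, since E[X] = E[Y] = E[XY] = 0.

```python
# Sketch: uncorrelated but dependent, verified by simulation.
import numpy as np

rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, size=1_000_000)
x, y = np.cos(theta), np.sin(theta)

print(np.corrcoef(x, y)[0, 1])            # ~0: uncorrelated
print(np.max(np.abs(x**2 + y**2 - 1)))    # ~0: X and Y lie on the unit circle
```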

Slide 70. Joint characteristic function: Φ_{X,Y}(ω1, ω2) = E[exp(j(ω1 X + ω2 Y))]. If X and Y are independent r.v.s, Φ_{X,Y}(ω1, ω2) = Φ_X(ω1) Φ_Y(ω2).

Slide 71. If Z = aX + bY, then Φ_Z(ω) = E[exp(jω(aX + bY))] = Φ_{X,Y}(aω, bω).

Slide 72. 4.8 Jointly Gaussian Random Variables. X and Y are said to be jointly Gaussian if their joint pdf is f_{X,Y}(x, y) = (1 / (2π σ1 σ2 sqrt(1 - ρ²))) exp{ -[((x - m1)/σ1)² - 2ρ((x - m1)/σ1)((y - m2)/σ2) + ((y - m2)/σ2)²] / (2(1 - ρ²)) }. Contours of constant pdf are ellipses.

Slide 73. Marginal pdf: X is Gaussian with mean m1 and variance σ1² (and similarly for Y). Conditional pdf: f_Y(y | x) is Gaussian with mean m2 + ρ(σ2/σ1)(x - m1) and variance σ2²(1 - ρ²).

Slide 74. Correlation Coefficient. We now show that the parameter ρ appearing in the jointly Gaussian pdf is indeed the correlation coefficient of X and Y (the derivation was not preserved).

Slide 75. n jointly Gaussian random variables: the joint pdf is f_X(x) = exp{ -(1/2)(x - m)^T K^{-1} (x - m) } / ((2π)^{n/2} |K|^{1/2}), where m is the mean vector and K the covariance matrix. (4.83)

Slide 76. The pdf of the jointly Gaussian random variables is completely specified by the individual means and variances and the pairwise covariances. Ex. 4.46 Verify that (4.83) reduces to the two-dimensional pdf (4.79) when n = 2. Ex. 4.48 (content not preserved).

Slide 77. Linear Transformation of Gaussian Random Variables. Let Y = AX, where X is jointly Gaussian with mean vector m and covariance matrix K. From elementary properties of matrices, the mean vector of Y is n = Am and its covariance matrix is C = A K A^T.

Slide 78. Thus, Y is jointly Gaussian with mean n and covariance matrix C.

Slide 79. If we can find a matrix A such that C = A K A^T is a diagonal matrix, then the components of Y are independent Gaussian random variables.

Slide 80. Ex. 4.49 (content not preserved in the transcript).

Slide 81. Ex. 4.50 X is jointly Gaussian with a given mean vector and covariance matrix (the numerical values were not preserved).

Slide 82. Joint characteristic function of n jointly Gaussian random variables: Φ_X(ω) = exp( j ω^T m - (1/2) ω^T K ω ).

Slide 83. 4.9 Mean Square Estimation. We are interested in estimating the value of an inaccessible random variable Y in terms of an observation of an accessible random variable X. The estimate for Y is given by a function of X, g(X). 1. Estimating the r.v. Y by a constant a so that the mean square error (m.s.e.) E[(Y - a)²] is minimized: the minimizing constant is a* = E[Y].

Slide 84. 2. Estimating Y by g(X) = aX + b: minimize E[(Y - aX - b)²]. Differentiating with respect to b gives b* = E[Y] - a E[X]; differentiating with respect to a then gives a* = COV(X, Y) / VAR(X).

Slide 85. The minimum mean square error (mmse) linear estimator for Y is a* X + b* = ρ σ_Y (X - E[X]) / σ_X + E[Y], where (X - E[X]) / σ_X is the zero-mean, unit-variance version of X. In deriving a*, we obtain the orthogonality condition: the error Y - a*X - b* is orthogonal to the observation, i.e., E[(Y - a*X - b*) X] = 0.

Slide 86. Mean square error of the best linear estimator: E[(Y - a*X - b*)²] = σ_Y² (1 - ρ²).

Slide 87. 3. The best mmse estimator of Y is in general a nonlinear function of X, g(X). The value g(x) that minimizes E[(Y - g(x))² | X = x] is g(x) = E[Y | X = x]. This regression curve is the estimator of Y in terms of X that yields the smallest m.s.e.

Slide 88. Ex. 4.51 Let X be uniformly distributed in (-1, 1) and let Y = X². Find the best linear estimator and the best estimator of Y in terms of X (a numerical sketch follows below). Ex. 4.52 Find the mmse estimator of Y in terms of X when X and Y are jointly Gaussian random variables.
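A Monte Carlo sketch of Ex. 4.51, verifying that the best linear estimator degenerates to the constant E[Y] = 1/3 (since COV(X, Y) = E[X³] = 0) while the regression curve E[Y|X] = X² is exact:

```python
# Sketch: best linear vs. best (regression) estimator for X ~ Uniform(-1,1), Y = X^2.
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, size=1_000_000)
y = x**2

a = np.cov(x, y)[0, 1] / np.var(x)      # ~0: slope of the best linear estimator
b = y.mean() - a * x.mean()             # ~1/3: its intercept
print(a, b)
print(np.mean((y - (a * x + b))**2))    # linear m.s.e. ~ VAR(Y) = 1/5 - 1/9 = 4/45
print(np.mean((y - x**2)**2))           # 0: the regression curve is exact here
```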

