Chapter 4. Multiple Random Variables

In some random experiments, a number of different quantities are measured. Ex. 4.1. Select a student's name from an urn; for the outcome s, several numerical quantities (e.g., the student's height and weight) may be recorded.

4.1 Vector Random Variables

A vector random variable X is a function that assigns a vector of real numbers to each outcome in S, the sample space of the random experiment.

Event Examples. Consider the two-dimensional random variable X = (X, Y). Find the region of the plane corresponding to events such as {X + Y ≤ 10} or {X² + Y² ≤ 100}.

Product Form. We are particularly interested in events that have the product form A = {X in A1} ∩ {Y in A2}; the basic case is the rectangle {x1 < X ≤ x2} ∩ {y1 < Y ≤ y2}, with x1, x2 marked on the horizontal axis and y1, y2 on the vertical axis.

Product Form. A fundamental problem in modeling a system with a vector random variable involves specifying the probability of product-form events. Many events of interest are not of product form; however, non-product-form events can often be approximated by a union of product-form events, e.g., by covering the region with disjoint rectangles.

4.2 Pairs of Random Variables

A. Pairs of discrete random variables.
- Let X = (X, Y) assume values from the set S_X,Y = {(x_j, y_k): j = 1, 2, ..., k = 1, 2, ...}.
- The joint pmf of X is p_X,Y(x_j, y_k) = P[X = x_j, Y = y_k]. It gives the probability of the occurrence of the pair (x_j, y_k).
- The probability of any event A is the sum of the pmf over the outcomes in A: P[A] = Σ_{(x_j, y_k) in A} p_X,Y(x_j, y_k).
- When A = S, the sum of the pmf over all outcomes equals 1.

Marginal pmf. We are also interested in the probabilities of events involving each of the random variables in isolation. These can be found in terms of the marginal pmf's:

 p_X(x_j) = P[X = x_j] = Σ_k p_X,Y(x_j, y_k),  p_Y(y_k) = P[Y = y_k] = Σ_j p_X,Y(x_j, y_k).

In general, knowledge of the marginal pmf's is insufficient to specify the joint pmf.

Ex. 4.6. Loaded dice: A random experiment consists of tossing two loaded dice and noting the pair of numbers (X, Y) facing up. The joint pmf is p(j, k) = 2/42 if j = k and 1/42 if j ≠ k, for j, k in {1, ..., 6}:

       k=1   k=2   k=3   k=4   k=5   k=6
 j=1  2/42  1/42  1/42  1/42  1/42  1/42
 j=2  1/42  2/42  1/42  1/42  1/42  1/42
 j=3  1/42  1/42  2/42  1/42  1/42  1/42
 j=4  1/42  1/42  1/42  2/42  1/42  1/42
 j=5  1/42  1/42  1/42  1/42  2/42  1/42
 j=6  1/42  1/42  1/42  1/42  1/42  2/42

Each row and each column sums to 7/42 = 1/6, so the marginal pmf's are P[X = j] = P[Y = k] = 1/6.
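A quick numerical check of this table (a minimal sketch in Python/numpy, restating the pmf above as a matrix):

    import numpy as np

    # Joint pmf of the loaded dice: 2/42 on the diagonal, 1/42 off it.
    p = np.full((6, 6), 1 / 42)
    np.fill_diagonal(p, 2 / 42)

    print(p.sum())        # 1.0 -- a valid joint pmf
    print(p.sum(axis=1))  # marginal of X: every entry equals 7/42 = 1/6
    print(p.sum(axis=0))  # marginal of Y: likewise 1/6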

Ex. 4.7. Packetization problem: The number of bytes N in a message has a geometric distribution with parameter 1-p and range S_N = {0, 1, 2, ...}. Suppose that messages are broken into packets of maximum length M bytes. Let Q be the number of full packets and let R be the number of bytes left over. Find the joint pmf and the marginal pmf's of Q and R.
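The slide leaves the calculation to the reader; a sketch of the standard derivation, assuming the geometric pmf P[N = n] = (1 - p) p^n for n ≥ 0: Q = q and R = r exactly when N = qM + r, so

\[
P[Q = q,\, R = r] \;=\; P[N = qM + r] \;=\; (1-p)\,p^{qM + r},
\qquad q = 0, 1, \dots,\ \ r = 0, \dots, M-1,
\]

which factors as

\[
P[Q = q,\, R = r] \;=\; \underbrace{(1 - p^M)\,(p^M)^q}_{P[Q = q]} \;\cdot\; \underbrace{\frac{(1-p)\,p^r}{1 - p^M}}_{P[R = r]},
\]

so Q is geometric with parameter p^M, R has a truncated geometric pmf, and Q and R are independent (anticipating Ex. 4.16 below).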

Joint cdf of X and Y. The joint cdf of X and Y is defined as the probability of the product-form event {X ≤ x} ∩ {Y ≤ y}:

 F_X,Y(x, y) = P[X ≤ x, Y ≤ y].

The marginal cdf's are obtained as limits: F_X(x) = F_X,Y(x, ∞), F_Y(y) = F_X,Y(∞, y).

Joint cdf of X and Y: properties. F_X,Y is nondecreasing in each variable; F_X,Y(x, -∞) = F_X,Y(-∞, y) = 0 and F_X,Y(∞, ∞) = 1; and the probability of a rectangle is

 P[x1 < X ≤ x2, y1 < Y ≤ y2] = F_X,Y(x2, y2) - F_X,Y(x2, y1) - F_X,Y(x1, y2) + F_X,Y(x1, y1).

Joint pdf of two jointly continuous random variables. X and Y are jointly continuous if the probabilities of events involving (X, Y) can be expressed as an integral of a joint pdf f_X,Y(x, y):

 P[(X, Y) in A] = ∫∫_A f_X,Y(x', y') dx' dy'.

Marginal pdf: obtained by integrating out the variables that are not of interest:

 f_X(x) = ∫_{-∞}^{∞} f_X,Y(x, y) dy,  f_Y(y) = ∫_{-∞}^{∞} f_X,Y(x, y) dx.

Ex. A randomly selected point (X, Y) in the unit square has the uniform joint pdf: f_X,Y(x, y) = 1 for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and 0 elsewhere.

Ex. Find the normalization constant c and the marginal pdf's for a given joint pdf.
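The slide's particular pdf is not reproduced here; as a hypothetical stand-in with the same flavor, take f(x, y) = c e^(-x) e^(-y) on 0 ≤ y ≤ x < ∞ (an assumed example) and check the steps numerically:

    import numpy as np
    from scipy import integrate

    # Hypothetical joint pdf: f(x, y) = c * exp(-x) * exp(-y) on 0 <= y <= x.
    unnormalized = lambda y, x: np.exp(-x) * np.exp(-y)

    # Normalization: the double integral of f over its support must equal 1.
    total, _ = integrate.dblquad(unnormalized, 0, np.inf, 0, lambda x: x)
    print(1 / total)  # c = 2.0

    # Marginal of X: integrating y out over 0 <= y <= x gives 2 e^(-x)(1 - e^(-x)).
    f_x = lambda x: 2 * np.exp(-x) * (1 - np.exp(-x))
    area, _ = integrate.quad(f_x, 0, np.inf)
    print(area)       # ~1.0, so f_X is a valid pdf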

Ex. The joint pdf of X and Y is

 f_X,Y(x, y) = (1 / (2π (1 - ρ²)^(1/2))) exp( -(x² - 2ρxy + y²) / (2(1 - ρ²)) ).

We say that X and Y are jointly Gaussian (here with zero means and unit variances). Find the marginal pdf's; each works out to the standard N(0, 1) Gaussian pdf.

4.3 Independence of Two Random Variables

X and Y are independent random variables if any event A1 defined in terms of X is independent of any event A2 defined in terms of Y:

 P[X in A1, Y in A2] = P[X in A1] P[Y in A2].

Suppose that X and Y are discrete random variables, and suppose we are interested in the probability of the event A = A1 ∩ A2, where A1 involves only X and A2 involves only Y.

(⇒) If X and Y are independent, then A1 and A2 are independent events. Let A1 = {X = x_j} and A2 = {Y = y_k}; then p_X,Y(x_j, y_k) = P[X = x_j] P[Y = y_k] = p_X(x_j) p_Y(y_k).

(⇐) Conversely, if the joint pmf factors as p_X,Y(x_j, y_k) = p_X(x_j) p_Y(y_k) for all j and k, then X and Y are independent, since the probability of any product-form event then factors as well.

In general, X and Y are independent iff F_X,Y(x, y) = F_X(x) F_Y(y) for all x and y; for jointly continuous r.v.'s this is equivalent to f_X,Y(x, y) = f_X(x) f_Y(y). If X and Y are independent r.v.'s, then g(X) and h(Y) are also independent. # In the proof, A = {g(X) in C} and A' = {X in g⁻¹(C)} are equivalent events, and B = {h(Y) in D} and B' = {Y in h⁻¹(D)} are equivalent events.
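A small numerical illustration of the factorization criterion (a sketch, applied to the packetization pmf derived after Ex. 4.7):

    import numpy as np

    # Joint pmf P[Q = q, R = r] = (1 - p) p^(q*M + r), truncated at q < 200.
    p, M = 0.9, 4
    q = np.arange(200)[:, None]   # packet counts (column index)
    r = np.arange(M)[None, :]     # leftover bytes (row index)
    joint = (1 - p) * p ** (q * M + r)

    pq = joint.sum(axis=1)        # marginal pmf of Q
    pr = joint.sum(axis=0)        # marginal pmf of R

    # Independence: the joint pmf equals the product of the marginals.
    print(np.allclose(joint, pq[:, None] * pr[None, :]))  # True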

Ex. 4.15. In the loaded dice experiment of Ex. 4.6, the tosses are not independent. Ex. 4.16. Q and R in Ex. 4.7 are independent. Ex. 4.17. X and Y of the earlier normalization-constant example are not independent, even though the joint pdf appears to factor: the support is not a product set.

4.4 Conditional Probability and Conditional Expectation

Many random variables of practical interest are not independent: we often want the probability P[Y in A] given that X = x.

A. If X is discrete, we can obtain the conditional cdf of Y given X = x_k:

 F_Y(y | x_k) = P[Y ≤ y, X = x_k] / P[X = x_k], for P[X = x_k] > 0.

The conditional pdf, if the derivative exists, is f_Y(y | x_k) = (d/dy) F_Y(y | x_k).

If X and Y are independent, then F_Y(y | x_k) = F_Y(y). If X and Y are discrete, the conditional pmf is

 p_Y(y_j | x_k) = p_X,Y(x_k, y_j) / p_X(x_k);

if X and Y are independent, p_Y(y_j | x_k) = p_Y(y_j).

B. If X is continuous, P[X = x] = 0, so the conditional cdf of Y given X = x is defined as a limit, giving

 F_Y(y | x) = ( ∫_{-∞}^{y} f_X,Y(x, y') dy' ) / f_X(x),

and the conditional pdf f_Y(y | x) = f_X,Y(x, y) / f_X(x).

Theorem on total probability (each of X and Y may be discrete or continuous):

 P[Y in A] = Σ_k P[Y in A | X = x_k] p_X(x_k)   (X discrete)
 P[Y in A] = ∫_{-∞}^{∞} P[Y in A | X = x] f_X(x) dx   (X continuous)

Conditional Expectation. The conditional expectation of Y given X = x is

 E[Y | x] = ∫_{-∞}^{∞} y f_Y(y | x) dy,

or, if X and Y are discrete,

 E[Y | x_k] = Σ_j y_j p_Y(y_j | x_k).

A key consequence is E[Y] = E[ E[Y | X] ], i.e.

 E[Y] = Σ_k E[Y | x_k] p_X(x_k) (X discrete),  E[Y] = ∫ E[Y | x] f_X(x) dx (X continuous),

and this can be generalized to E[h(Y)] = E[ E[h(Y) | X] ], in particular to the moments E[Y^k].
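A sketch of why E[Y] = E[E[Y | X]] holds in the discrete case (the standard argument):

\begin{align*}
\sum_k E[Y \mid x_k]\, p_X(x_k)
  &= \sum_k \Big( \sum_j y_j\, p_Y(y_j \mid x_k) \Big)\, p_X(x_k) \\
  &= \sum_j y_j \sum_k p_{X,Y}(x_k, y_j)
   = \sum_j y_j\, p_Y(y_j) \;=\; E[Y].
\end{align*}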

Example table: a joint pmf on the pairs (X, Y) taking values (0,0), (1,0), (1,1), (2,0), (2,1), (2,2), (3,0), (3,1), (3,2), (3,3), with P[(X, Y) = (0, 0)] = 0.1. For this pmf, E[Y] = 1 and E[X] = 2.0.

Ex. Find the mean of Y in the preceding example using conditional expectation. Ex. Find the mean and variance of the number of customer arrivals N during the service time T of a specific customer (continuing an earlier example).
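A simulation sketch of the arrivals problem under common modeling assumptions (N given T = t taken as Poisson with mean βt, and T exponential with rate α; these distributional choices are assumptions, since the slide does not restate them):

    import numpy as np

    rng = np.random.default_rng(0)
    alpha, beta = 1.0, 10.0    # assumed service rate and customer-arrival rate

    T = rng.exponential(1 / alpha, size=1_000_000)  # service times
    N = rng.poisson(beta * T)                       # arrivals during each service

    # E[N] = E[E[N|T]] = beta * E[T] = beta / alpha
    print(N.mean())   # ~10.0
    # VAR(N) = E[VAR(N|T)] + VAR(E[N|T]) = beta/alpha + (beta/alpha)^2
    print(N.var())    # ~110.0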

4.5 Multiple Random Variables

Extend the methods for specifying probabilities of pairs of random variables to the case of n random variables. We say that X1, X2, ..., Xn are jointly continuous random variables if the probability of any event involving them can be expressed as an n-fold integral of a joint pdf f_{X1,...,Xn}(x1, ..., xn).

Example: X1 and X3 are independent zero-mean, unit-variance Gaussian r.v.'s.

Independence. X1, ..., Xn are independent iff the joint pmf/cdf/pdf factors into the product of the marginals, e.g.

 f_{X1,...,Xn}(x1, ..., xn) = f_X1(x1) f_X2(x2) ... f_Xn(xn).

4.6 Functions of Several Random Variables

Quite often we are interested in one or more functions of the random variables involved in an experiment; for example, the sum, maximum, or minimum of X1, X2, ..., Xn.

Example 4.31. Z = X + Y. The pdf of Z is given by the superposition integral

 f_Z(z) = ∫_{-∞}^{∞} f_X,Y(x, z - x) dx.

If X and Y are independent r.v.'s, this becomes the convolution integral

 f_Z(z) = ∫_{-∞}^{∞} f_X(x) f_Y(z - x) dx.
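A quick numerical illustration of the convolution (a sketch; the uniform(0, 1) choice and the grid discretization are mine, not from the slide):

    import numpy as np

    # pdf of Z = X + Y for independent X, Y ~ uniform(0, 1):
    # the convolution gives the triangular pdf on (0, 2) with peak 1 at z = 1.
    dx = 0.001
    x = np.arange(0, 1, dx)
    fx = np.ones_like(x)             # uniform pdf on (0, 1)

    fz = np.convolve(fx, fx) * dx    # discretized convolution integral
    z = np.arange(len(fz)) * dx
    print(fz[np.searchsorted(z, 1.0)])   # ~1.0, the peak of the triangle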

Example 4.32. Sum of non-independent r.v.'s: Z = X + Y, where X and Y are zero-mean, unit-variance Gaussian r.v.'s with correlation coefficient ρ.

The sum of these two non-independent Gaussian r.v.'s is also a Gaussian r.v., here with mean 0 and variance VAR(X) + VAR(Y) + 2 COV(X, Y) = 2(1 + ρ).

Ex. 4.33. A system with standby redundancy: let T1 and T2 be the lifetimes of its two components. They are independent and exponentially distributed with the same mean. The system lifetime is T = T1 + T2, and the convolution gives the m = 2 Erlang pdf: with rate λ,

 f_T(t) = λ² t e^(-λt), t ≥ 0.
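A simulation check of the Erlang-2 result (sketch; λ = 1 is an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(1)
    lam = 1.0
    t1 = rng.exponential(1 / lam, size=1_000_000)
    t2 = rng.exponential(1 / lam, size=1_000_000)
    T = t1 + t2   # system lifetime under standby redundancy

    # Erlang-2 with rate lam has mean 2/lam and variance 2/lam^2.
    print(T.mean(), T.var())   # ~2.0, ~2.0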

Let Z = g(X, Y). Given Y = y, Z = g(X, y) is a function of the single r.v. X, so we can first find f_Z(z | y) from f_X(x | y) and then find

 f_Z(z) = ∫ f_Z(z | y) f_Y(y) dy.

In this way the conditional pdf can be used to find the pdf of a function of several random variables.

Example 4.34. Z = X / Y, with X and Y independent and exponentially distributed with mean one. Given Y = y, Z = X / y is a scaled version of X, so f_Z(z | y) = y f_X(yz) and

 f_Z(z) = ∫_0^∞ y f_X(yz) f_Y(y) dy = ∫_0^∞ y e^(-yz) e^(-y) dy = 1 / (1 + z)², z ≥ 0.
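A Monte Carlo check of f_Z(z) = 1/(1 + z)² (sketch):

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.exponential(1.0, size=1_000_000)
    y = rng.exponential(1.0, size=1_000_000)
    z = x / y

    # The cdf is F_Z(z) = 1 - 1/(1 + z), so P[Z <= 1] = 0.5.
    print((z <= 1.0).mean())   # ~0.5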

Transformation of Random Vectors. Let Z1 = g1(X, Y) and Z2 = g2(X, Y). The joint cdf of (Z1, Z2) is

 F_{Z1,Z2}(z1, z2) = P[g1(X, Y) ≤ z1, g2(X, Y) ≤ z2].

Example 4.35. W = min(X, Y), Z = max(X, Y).

If z > w: F_{W,Z}(w, z) = F_X,Y(w, z) + F_X,Y(z, w) - F_X,Y(w, w).
If z < w: F_{W,Z}(w, z) = F_X,Y(z, z).

pdf of Linear Transformation. Consider the linear transformation V = aX + bY, W = cX + eY, and assume ae - bc ≠ 0, so the transformation is invertible. The equivalent event of an infinitesimal rectangle in the (v, w) plane is a parallelogram in the (x, y) plane.

dP = ? The probability of the infinitesimal parallelogram equals that of the corresponding rectangle:

 dP = f_X,Y(x, y) dx dy = f_V,W(v, w) dv dw, with dv dw = |ae - bc| dx dy.

Hence

 f_V,W(v, w) = f_X,Y(x, y) / |ae - bc|,

evaluated at the point (x, y) that solves v = ax + by, w = cx + ey.

Example 4.36. X and Y are jointly Gaussian; a rotation of the (x, y) coordinates is applied to produce V and W.

V and W are independent, zero-mean Gaussian r.v.'s with variances (1 + ρ) and (1 - ρ), respectively. See Fig. 4-16: the contours of equal value of the joint pdf of X and Y are ellipses whose principal axes line up with the (v, w) axes.
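A sketch of this diagonalization for the zero-mean, unit-variance case using a 45° rotation (the specific angle and the value of ρ are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(3)
    rho = 0.7
    cov = np.array([[1.0, rho],
                    [rho, 1.0]])
    xy = rng.multivariate_normal([0.0, 0.0], cov, size=500_000)

    # Rotate by 45 degrees: V = (X + Y)/sqrt(2), W = (X - Y)/sqrt(2).
    v = (xy[:, 0] + xy[:, 1]) / np.sqrt(2)
    w = (xy[:, 0] - xy[:, 1]) / np.sqrt(2)

    # Sample covariance ~ diag(1 + rho, 1 - rho): V and W are uncorrelated,
    # and being jointly Gaussian, therefore independent.
    print(np.cov(v, w))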

pdf of General Transformation. Suppose the transformation V = g1(X, Y), W = g2(X, Y) is invertible (Fig. 4.17a): an infinitesimal rectangle in the (v, w) plane corresponds to a curvilinear parallelogram in the (x, y) plane.

The Jacobian of the transformation is

 J(x, y) = det [ ∂g1/∂x  ∂g1/∂y ; ∂g2/∂x  ∂g2/∂y ],

and J(v, w), the Jacobian of the inverse transformation, is defined analogously. It can be shown that |J(v, w)| = 1 / |J(x, y)|, so that

 f_V,W(v, w) = f_X,Y(x, y) / |J(x, y)| = f_X,Y(x, y) |J(v, w)|,

evaluated at the (x, y) that solves v = g1(x, y), w = g2(x, y).

Example 4.37. X and Y are zero-mean, unit-variance, independent Gaussian r.v.'s. Transform to polar coordinates: V = (X² + Y²)^(1/2), the radius, and W, the angle of the point (X, Y).

V and W are independent: V has the Rayleigh pdf f_V(v) = v e^(-v²/2) for v ≥ 0, and W is uniform over the circle. The transformation method can be used even if we are interested in only one function of the random variables, by defining an "auxiliary" r.v. for the second coordinate.
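A quick check that the radius of two independent standard Gaussians is Rayleigh (sketch):

    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.standard_normal(1_000_000)
    y = rng.standard_normal(1_000_000)
    v = np.hypot(x, y)    # radius
    w = np.arctan2(y, x)  # angle

    # Rayleigh (sigma = 1) has mean sqrt(pi/2) ~ 1.2533; the angle is uniform.
    print(v.mean())                  # ~1.2533
    print(np.corrcoef(v, w)[0, 1])   # ~0: radius and angle uncorrelated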

Ex. 4.38. X: zero-mean, unit-variance Gaussian; Y: chi-square r.v. with n degrees of freedom; X and Y are independent. Find the pdf of V = X / (Y/n)^(1/2). Let W = Y; then (V, W) is an invertible transformation of (X, Y) and the Jacobian method applies.

4.7 Expected Value of Functions of Random Variables

For Z = g(X, Y),

 E[Z] = ∫∫ g(x, y) f_X,Y(x, y) dx dy.

Ex. Z = X + Y: E[X + Y] = E[X] + E[Y]; X and Y need not be independent. In general, the expected value of a sum of n random variables equals the sum of the expected values.

Ex. X and Y independent r.v.'s: then E[g(X) h(Y)] = E[g(X)] E[h(Y)].

The jk-th joint moment of X and Y is E[X^j Y^k]. When j = 1, k = 1, E[XY] is the correlation of X and Y. If E[XY] = 0, then X and Y are orthogonal.

The jk-th central moment of X and Y is E[(X - E[X])^j (Y - E[Y])^k]. When j = 1, k = 1, E[(X - E[X])(Y - E[Y])] = COV(X, Y), the covariance of X and Y:

 COV(X, Y) = E[XY - X E[Y] - Y E[X] + E[X] E[Y]]
           = E[XY] - 2 E[X] E[Y] + E[X] E[Y]
           = E[XY] - E[X] E[Y].

Ex. X, Y independent:

 COV(X, Y) = E[(X - E[X])(Y - E[Y])] = E[X - E[X]] E[Y - E[Y]] = 0.

The correlation coefficient of X and Y is

 ρ_X,Y = COV(X, Y) / (σ_X σ_Y), with -1 ≤ ρ_X,Y ≤ 1.

X and Y are uncorrelated if ρ_X,Y = 0. If X and Y are independent, then COV(X, Y) = 0 and ρ_X,Y = 0, so X and Y are uncorrelated. The converse fails: X, Y uncorrelated does not necessarily imply that X, Y are independent.

Ex. 4.42 gives uncorrelated random variables that are not independent.
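A classic instance of this phenomenon (a sketch; the point-on-a-circle construction is a standard example and is assumed here, since the slide does not reproduce Ex. 4.42):

    import numpy as np

    rng = np.random.default_rng(5)
    theta = rng.uniform(0, 2 * np.pi, size=1_000_000)
    x, y = np.cos(theta), np.sin(theta)   # a random point on the unit circle

    print(np.cov(x, y)[0, 1])             # ~0: X and Y are uncorrelated
    # Yet they are completely dependent: X^2 + Y^2 = 1 always.
    print(np.allclose(x**2 + y**2, 1.0))  # True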

Joint Characteristic Function.

 Φ_X,Y(ω1, ω2) = E[ exp( j(ω1 X + ω2 Y) ) ].

If X and Y are independent r.v.'s, Φ_X,Y(ω1, ω2) = Φ_X(ω1) Φ_Y(ω2).

If Z = aX + bY, then Φ_Z(ω) = E[ exp( jω(aX + bY) ) ] = Φ_X,Y(aω, bω).

4.8 Jointly Gaussian Random Variables

X and Y are said to be jointly Gaussian if their joint pdf has the form

 f_X,Y(x, y) = exp{ -[ ((x - m1)/σ1)² - 2ρ ((x - m1)/σ1)((y - m2)/σ2) + ((y - m2)/σ2)² ] / (2(1 - ρ²)) } / ( 2π σ1 σ2 (1 - ρ²)^(1/2) ).   (4.79)

The contours of constant pdf are ellipses centered at (m1, m2).

Marginal pdf's: X is Gaussian with mean m1 and variance σ1², and Y is Gaussian with mean m2 and variance σ2². Conditional pdf: f_Y(y | x) is Gaussian with conditional mean m2 + ρ (σ2/σ1)(x - m1) and conditional variance σ2² (1 - ρ²).

Correlation Coefficient. We now show that the parameter ρ is indeed the correlation coefficient: conditioning on X,

 E[(X - m1)(Y - m2)] = E[ (X - m1) E[Y - m2 | X] ] = E[ (X - m1) ρ (σ2/σ1)(X - m1) ] = ρ (σ2/σ1) σ1² = ρ σ1 σ2,

so COV(X, Y) / (σ1 σ2) = ρ.

n Jointly Gaussian Random Variables. X = (X1, ..., Xn) with mean vector m and covariance matrix C is jointly Gaussian if

 f_X(x) = exp{ -(1/2) (x - m)^T C^(-1) (x - m) } / ( (2π)^(n/2) |C|^(1/2) ).   (4.83)

The pdf of jointly Gaussian random variables is completely specified by the individual means and variances and the pairwise covariances. Ex. Verify that (4.83) becomes (4.79) when n = 2. Ex. 4.48.

Linear Transformation of Gaussian Random Variables. Let Y = A X, where X is jointly Gaussian and A is an invertible matrix. From elementary properties of matrices and the Jacobian method, f_Y(y) = f_X(A^(-1) y) / |det A|.

Thus Y is jointly Gaussian with mean n = A m and covariance matrix C_Y = A C A^T.

If we can find a matrix A such that A C A^T = Λ, a diagonal matrix, then the components of Y are independent Gaussian r.v.'s. Such an A always exists: take its rows to be orthonormal eigenvectors of C.
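A numerical sketch of this diagonalization using numpy's symmetric eigendecomposition:

    import numpy as np

    rng = np.random.default_rng(6)
    C = np.array([[2.0, 0.8],
                  [0.8, 1.0]])   # covariance matrix of X
    m = np.zeros(2)

    # Rows of A = orthonormal eigenvectors of C  =>  A C A^T is diagonal.
    eigvals, eigvecs = np.linalg.eigh(C)
    A = eigvecs.T

    X = rng.multivariate_normal(m, C, size=500_000)
    Y = X @ A.T                   # apply y = A x to each sample

    print(np.cov(Y.T))            # ~diag(eigvals): components uncorrelated
    print(eigvals)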

Ex. 4.49.

Ex. 4.50. X is jointly Gaussian with a given mean vector and covariance matrix.

Joint Characteristic Function of n jointly Gaussian random variables:

 Φ_X(ω) = exp( j m^T ω - (1/2) ω^T C ω ).

4.9 Mean Square Estimation

We are interested in estimating the value of an inaccessible random variable Y in terms of the observation of an accessible random variable X. The estimate for Y is given by a function of X, g(X).

1. Estimating the r.v. Y by a constant a so that the mean square error (m.s.e.) E[(Y - a)²] is minimized: the minimizing constant is a* = E[Y].

2. Estimating Y by g(X) = aX + b: minimize E[(Y - aX - b)²]. Differentiating w.r.t. a and b and setting the derivatives to zero gives

 a* = COV(X, Y) / VAR(X),  b* = E[Y] - a* E[X].

The minimum mean square error (mmse) linear estimator for Y is

 Y^ = a* X + b* = E[Y] + ρ_X,Y σ_Y (X - E[X]) / σ_X,

i.e., E[Y] plus ρ_X,Y σ_Y times the zero-mean, unit-variance version of X. In deriving a*, we obtain the orthogonality condition: the estimation error is orthogonal to the observation, E[(Y - a* X - b*) X] = 0.

The mean square error of the best linear estimator is

 E[(Y - Y^)²] = σ_Y² (1 - ρ_X,Y²).

3. The best mmse estimator of Y is in general a nonlinear function of X, g(X). The constant that minimizes E[(Y - g(x))² | X = x] is g(x) = E[Y | X = x]. The regression curve g(x) = E[Y | X = x] is the estimator for Y in terms of X that yields the smallest m.s.e.

Ex. Let X be uniformly distributed in (-1, 1) and let Y = X². Find the best linear estimator and the best estimator of Y in terms of X. Ex. Find the mmse estimator of Y in terms of X when X and Y are jointly Gaussian random variables.
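For the first example, COV(X, X²) = E[X³] = 0, so the best linear estimator degenerates to the constant E[Y] = 1/3, while the best estimator E[Y | X = x] = x² has zero error. A simulation sketch:

    import numpy as np

    rng = np.random.default_rng(7)
    x = rng.uniform(-1, 1, size=1_000_000)
    y = x**2

    a = np.cov(x, y)[0, 1] / x.var()   # ~0: X and Y = X^2 are uncorrelated
    b = y.mean() - a * x.mean()        # ~1/3, the best constant

    mse_linear = np.mean((y - (a * x + b))**2)   # ~VAR(Y) = 1/5 - 1/9
    mse_best = np.mean((y - x**2)**2)            # exactly 0
    print(a, b, mse_linear, mse_best)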