Jointly Distributed Random Variables: Multivariate Distributions


Discrete Random Variables

The joint probability function: p(x,y) = P[X = x, Y = y]
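As a quick illustration (a sketch, not part of the original slides), a joint probability function can be stored as a table of probabilities; the support and values below are hypothetical:

```python
# A hypothetical joint probability function p(x, y) = P[X = x, Y = y]
# on a small support (values chosen for illustration only).
p = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# A valid joint probability function is non-negative and sums to 1.
assert all(v >= 0 for v in p.values())
assert abs(sum(p.values()) - 1.0) < 1e-12

# Marginal probability functions: sum out the other variable.
p_X, p_Y = {}, {}
for (x, y), v in p.items():
    p_X[x] = p_X.get(x, 0.0) + v
    p_Y[y] = p_Y.get(y, 0.0) + v
```

Summing a row or column of the table gives the probability function of a single variable, foreshadowing the marginal densities defined later in the deck.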

Continuous Random Variables

Definition: Two random variables X and Y are said to have joint probability density function f(x, y) if f(x, y) ≥ 0, ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1, and P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy for any region A.

If f(x, y) is a joint probability density function, then z = f(x, y) defines a surface over the x–y plane.

Multiple Integration

[Figure: the surface z = f(x, y) above a region A of the x–y plane.]

If the region A = {(x,y) | a ≤ x ≤ b, c ≤ y ≤ d} is a rectangular region with sides parallel to the coordinate axes, then ∫∫_A f(x, y) dx dy = ∫_c^d ∫_a^b f(x, y) dx dy.
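A double integral over a rectangle can be checked numerically with a midpoint rule. This is a sketch in Python, not part of the original slides; the integrand and limits are illustrative assumptions:

```python
def double_integral(f, a, b, c, d, n=400):
    """Midpoint-rule approximation of the double integral of f over
    the rectangle [a, b] x [c, d] (a sketch, not production quadrature)."""
    hx = (b - a) / n
    hy = (d - c) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * hx
        for j in range(n):
            y = c + (j + 0.5) * hy
            total += f(x, y)
    return total * hx * hy

# Example: the integral of x*y over [0, 1] x [0, 2] is (1/2) * (4/2) = 1.
approx = double_integral(lambda x, y: x * y, 0.0, 1.0, 0.0, 2.0)
```

Each term f(x, y) * hx * hy is the infinitesimal volume of one small box under the surface, matching the picture on the following slides.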

To evaluate ∫_c^d ∫_a^b f(x, y) dx dy: first evaluate the inner integral G(y) = ∫_a^b f(x, y) dx, then evaluate the outer integral ∫_c^d G(y) dy.

[Figure: for fixed y, ∫_a^b f(x, y) dx is the area under the surface above the line where y is constant; multiplied by dy it gives the infinitesimal volume of a slab under the surface.]

The same quantity can be calculated by integrating first with respect to y, then x: first evaluate the inner integral H(x) = ∫_c^d f(x, y) dy, then evaluate the outer integral ∫_a^b H(x) dx.

[Figure: for fixed x, ∫_c^d f(x, y) dy is the area under the surface above the line where x is constant; multiplied by dx it gives the infinitesimal volume of a slab under the surface.]

Example: compute the double integral of f(x, y) over a rectangle by first evaluating the inner integral, then the outer integral.

The same quantity can be computed by reversing the order of integration.

Integration over non-rectangular regions

Suppose the region A is defined as A = {(x,y) | a(y) ≤ x ≤ b(y), c ≤ y ≤ d}, bounded on the left and right by the curves x = a(y) and x = b(y). Then ∫∫_A f(x, y) dx dy = ∫_c^d ∫_{a(y)}^{b(y)} f(x, y) dx dy.

If the region A is defined as A = {(x,y) | a ≤ x ≤ b, c(x) ≤ y ≤ d(x)}, bounded below and above by the curves y = c(x) and y = d(x), then ∫∫_A f(x, y) dx dy = ∫_a^b ∫_{c(x)}^{d(x)} f(x, y) dy dx.
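Integration over such a region only changes the limits of the inner sum. A minimal Python sketch (not from the slides; the helper name and the triangle example are assumptions):

```python
def double_integral_type2(f, a, b, c_of_x, d_of_x, n=500):
    """Approximate the integral of f over
    A = {(x, y) | a <= x <= b, c(x) <= y <= d(x)} by midpoint sums."""
    hx = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * hx
        c, d = c_of_x(x), d_of_x(x)
        hy = (d - c) / n  # inner step depends on x through the limits
        inner = sum(f(x, c + (j + 0.5) * hy) for j in range(n)) * hy
        total += inner * hx
    return total

# Example: with f = 1 the integral is the area of the region.
# For the triangle x + y <= 1, x >= 0, y >= 0 the area is 1/2.
area = double_integral_type2(lambda x, y: 1.0, 0.0, 1.0,
                             lambda x: 0.0, lambda x: 1.0 - x)
```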

In general the region A can be partitioned into regions A1, A2, A3, A4, …, each of one of these two types.

Example: compute the volume under f(x, y) = x^2 y + x y^3 over the region A = {(x,y) | x + y ≤ 1, 0 ≤ x, 0 ≤ y}, the triangle with vertices (0, 0), (1, 0) and (0, 1) bounded by the line x + y = 1.

Integrating first with respect to x, then y: for each fixed y in [0, 1], x ranges from 0 to 1 − y (from the point (0, y) to (1 − y, y)), so ∫∫_A f(x, y) dx dy = ∫_0^1 ∫_0^{1−y} (x^2 y + x y^3) dx dy.

The inner integral is ∫_0^{1−y} (x^2 y + x y^3) dx = y(1 − y)^3/3 + y^3 (1 − y)^2/2, and the outer integral is ∫_0^1 [ y(1 − y)^3/3 + y^3 (1 − y)^2/2 ] dy = 1/60 + 1/120 = 1/40.

Now integrating first with respect to y, then x: for each fixed x in [0, 1], y ranges from 0 to 1 − x (from (x, 0) to (x, 1 − x)), so ∫∫_A f(x, y) dx dy = ∫_0^1 ∫_0^{1−x} (x^2 y + x y^3) dy dx.

Hence the inner integral is ∫_0^{1−x} (x^2 y + x y^3) dy = x^2 (1 − x)^2/2 + x(1 − x)^4/4, and ∫_0^1 [ x^2 (1 − x)^2/2 + x(1 − x)^4/4 ] dx = 1/60 + 1/120 = 1/40, agreeing with the first order of integration.
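Both orders of integration can be cross-checked numerically; the exact value of this integral is 1/40 = 0.025. A midpoint-rule sketch in Python (not from the slides):

```python
def f(x, y):
    # Integrand from the example: f(x, y) = x^2 * y + x * y^3.
    return x**2 * y + x * y**3

n = 400

# Order 1: inner integral over x from 0 to 1 - y, outer over y.
hy = 1.0 / n
v1 = 0.0
for j in range(n):
    y = (j + 0.5) * hy
    hx = (1.0 - y) / n
    v1 += sum(f((i + 0.5) * hx, y) for i in range(n)) * hx * hy

# Order 2: inner integral over y from 0 to 1 - x, outer over x.
hx = 1.0 / n
v2 = 0.0
for i in range(n):
    x = (i + 0.5) * hx
    hy = (1.0 - x) / n
    v2 += sum(f(x, (j + 0.5) * hy) for j in range(n)) * hy * hx

# v1 and v2 should both approximate 1/40 = 0.025.
```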

Continuous Random Variables

Definition: Let X and Y denote two random variables with joint probability density function f(x, y). Then the marginal density of X is f_X(x) = ∫_{−∞}^{∞} f(x, y) dy, and the marginal density of Y is f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx.
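A marginal density can be computed numerically by integrating out the other variable. A sketch, assuming the simple joint density f(x, y) = x + y on the unit square (an illustration, not from the slides):

```python
def joint(x, y):
    # Hypothetical joint density on the unit square: f(x, y) = x + y.
    # It is non-negative there and integrates to 1/2 + 1/2 = 1.
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def marginal_X(x, n=2000):
    """f_X(x) = integral of f(x, y) over y, by a midpoint sum on [0, 1]."""
    h = 1.0 / n
    return sum(joint(x, (j + 0.5) * h) for j in range(n)) * h

# Analytically f_X(x) = x + 1/2 on [0, 1]; e.g. f_X(0.25) = 0.75.
fx = marginal_X(0.25)
```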

Definition: Let X and Y denote two random variables with joint probability density function f(x, y) and marginal densities f_X(x), f_Y(y). Then the conditional density of Y given X = x is f_{Y|X}(y|x) = f(x, y) / f_X(x), and the conditional density of X given Y = y is f_{X|Y}(x|y) = f(x, y) / f_Y(y).
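Dividing the joint density by the marginal yields a genuine density in the remaining variable: for each fixed x it integrates to 1. A sketch with the same hypothetical density f(x, y) = x + y on the unit square:

```python
def joint(x, y):
    # Hypothetical joint density on the unit square: f(x, y) = x + y.
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def f_X(x):
    # Marginal of X, computed analytically: integral of (x + y) dy = x + 1/2.
    return x + 0.5

def cond_Y_given_X(y, x):
    """Conditional density f_{Y|X}(y|x) = f(x, y) / f_X(x)."""
    return joint(x, y) / f_X(x)

# Check normalization at x = 0.3: the conditional density of Y
# given X = 0.3 should integrate to 1 over [0, 1].
n = 2000
h = 1.0 / n
total = sum(cond_Y_given_X((j + 0.5) * h, 0.3) for j in range(n)) * h
```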

The bivariate Normal distribution

Let f(x1, x2) = exp{ −Q/2 } / (2π σ1 σ2 √(1 − ρ^2)), where Q = [ ((x1 − μ1)/σ1)^2 − 2ρ ((x1 − μ1)/σ1)((x2 − μ2)/σ2) + ((x2 − μ2)/σ2)^2 ] / (1 − ρ^2), with σ1 > 0, σ2 > 0 and −1 < ρ < 1. This distribution is called the bivariate Normal distribution. The parameters are μ1, μ2, σ1, σ2 and ρ.
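The density above translates directly into code. A sketch (parameter values are arbitrary illustrations); with ρ = 0 the density factors into the product of two univariate Normal densities:

```python
import math

def bvn_pdf(x1, x2, mu1, mu2, s1, s2, rho):
    """Bivariate Normal density with means mu1, mu2, standard
    deviations s1, s2 and correlation rho."""
    z1 = (x1 - mu1) / s1
    z2 = (x2 - mu2) / s2
    q = (z1 * z1 - 2 * rho * z1 * z2 + z2 * z2) / (1 - rho * rho)
    return math.exp(-q / 2) / (2 * math.pi * s1 * s2 * math.sqrt(1 - rho * rho))

def n_pdf(x, mu, s):
    # Univariate Normal density.
    return math.exp(-((x - mu) / s) ** 2 / 2) / (s * math.sqrt(2 * math.pi))

# With rho = 0, f(x1, x2) = f_{x1}(x1) * f_{x2}(x2) (independence).
v = bvn_pdf(0.3, -0.7, 0.0, 0.0, 1.0, 2.0, 0.0)
```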

Surface Plots of the bivariate Normal distribution

Note: f(x1, x2) is constant when the quadratic form in its exponent is constant. This is true when (x1, x2) lies on an ellipse centered at (μ1, μ2).

Marginal and Conditional distributions

Marginal distributions for the Bivariate Normal distribution. Recall the definition of marginal distributions for continuous random variables: f_{x1}(x1) = ∫_{−∞}^{∞} f(x1, x2) dx2 and f_{x2}(x2) = ∫_{−∞}^{∞} f(x1, x2) dx1. It can be shown that in the case of the bivariate normal distribution the marginal distribution of xi is Normal with mean μi and standard deviation σi.

The marginal distribution of x2 is Normal with mean μ2 and standard deviation σ2; that is, f_{x2}(x2) = exp{ −(x2 − μ2)^2 / (2σ2^2) } / (σ2 √(2π)). Proof:

Now write the exponent of the joint density as Q = [ z1^2 − 2ρ z1 z2 + z2^2 ] / (1 − ρ^2), where z1 = (x1 − μ1)/σ1 and z2 = (x2 − μ2)/σ2. Completing the square in z1 gives Q = (z1 − ρ z2)^2 / (1 − ρ^2) + z2^2.

Hence the joint density factors as f(x1, x2) = [ exp{ −z2^2/2 } / (σ2 √(2π)) ] × [ exp{ −(z1 − ρ z2)^2 / (2(1 − ρ^2)) } / (σ1 √(2π(1 − ρ^2))) ].

The second factor, viewed as a function of x1, is the Normal density with mean μ1 + ρ (σ1/σ2)(x2 − μ2) and standard deviation σ1 √(1 − ρ^2), so it integrates to 1 over x1.

Summarizing, f_{x2}(x2) = ∫ f(x1, x2) dx1 = exp{ −(x2 − μ2)^2 / (2σ2^2) } / (σ2 √(2π)).

Thus the marginal distribution of x2 is Normal with mean μ2 and standard deviation σ2. Similarly the marginal distribution of x1 is Normal with mean μ1 and standard deviation σ1.
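This marginal result can be verified numerically: integrating the joint density over x1 should reproduce the Normal(μ2, σ2) density at any x2. A self-contained sketch (parameter values are arbitrary illustrations):

```python
import math

def bvn_pdf(x1, x2, mu1, mu2, s1, s2, rho):
    # Bivariate Normal density (same formula as on the earlier slide).
    z1 = (x1 - mu1) / s1
    z2 = (x2 - mu2) / s2
    q = (z1 * z1 - 2 * rho * z1 * z2 + z2 * z2) / (1 - rho * rho)
    return math.exp(-q / 2) / (2 * math.pi * s1 * s2 * math.sqrt(1 - rho * rho))

def n_pdf(x, mu, s):
    return math.exp(-((x - mu) / s) ** 2 / 2) / (s * math.sqrt(2 * math.pi))

mu1, mu2, s1, s2, rho = 1.0, -2.0, 1.5, 0.5, 0.6
x2 = -1.7

# Integrate the joint density over x1 on a wide interval (midpoint rule).
n = 4000
lo, hi = mu1 - 10 * s1, mu1 + 10 * s1
h = (hi - lo) / n
marg = sum(bvn_pdf(lo + (i + 0.5) * h, x2, mu1, mu2, s1, s2, rho)
           for i in range(n)) * h
# marg should match the Normal(mu2, s2) density evaluated at x2.
```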

Conditional distributions for the Bivariate Normal distribution. Recall the definition of conditional distributions for continuous random variables: f_{x1|x2}(x1|x2) = f(x1, x2) / f_{x2}(x2) and f_{x2|x1}(x2|x1) = f(x1, x2) / f_{x1}(x1). It can be shown that in the case of the bivariate normal distribution the conditional distribution of xi given xj is Normal with mean μi + ρ (σi/σj)(xj − μj) and standard deviation σi √(1 − ρ^2).

Proof

f_{x2|x1}(x2|x1) = f(x1, x2) / f_{x1}(x1), where f_{x1}(x1) = exp{ −(x1 − μ1)^2 / (2σ1^2) } / (σ1 √(2π)). Dividing the joint density by this marginal density and completing the square in the exponent, what remains is a Normal density in x2. Thus the conditional distribution of x2 given x1 is Normal with mean μ2 + ρ (σ2/σ1)(x1 − μ1) and standard deviation σ2 √(1 − ρ^2).
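The conditional result is exact algebra, so the ratio joint/marginal should equal the stated Normal density to machine precision. A self-contained sketch (parameter values are arbitrary illustrations):

```python
import math

def bvn_pdf(x1, x2, mu1, mu2, s1, s2, rho):
    # Bivariate Normal density (same formula as on the earlier slide).
    z1 = (x1 - mu1) / s1
    z2 = (x2 - mu2) / s2
    q = (z1 * z1 - 2 * rho * z1 * z2 + z2 * z2) / (1 - rho * rho)
    return math.exp(-q / 2) / (2 * math.pi * s1 * s2 * math.sqrt(1 - rho * rho))

def n_pdf(x, mu, s):
    return math.exp(-((x - mu) / s) ** 2 / 2) / (s * math.sqrt(2 * math.pi))

mu1, mu2, s1, s2, rho = 0.5, 2.0, 1.0, 3.0, -0.4
x1, x2 = 1.2, 1.0

# Conditional density of x2 given x1: joint divided by marginal of x1.
cond = bvn_pdf(x1, x2, mu1, mu2, s1, s2, rho) / n_pdf(x1, mu1, s1)

# Stated conditional mean and standard deviation.
m = mu2 + rho * (s2 / s1) * (x1 - mu1)
s = s2 * math.sqrt(1 - rho * rho)
# cond should equal n_pdf(x2, m, s).
```

The conditional mean m is the regression line of x2 on x1 pictured on the final slide.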

Bivariate Normal Distribution with marginal distributions

Bivariate Normal Distribution with conditional distribution

(  1,  2 ) x2x2 x1x1 Regression Regression to the mean Major axis of ellipses