3.2 Marginal distribution

Marginal distribution function for a bivariate random vector. Definition (P57): if (X, Y) has joint distribution function F(x, y), then

F_X(x) = F(x, +∞) = lim_{y→+∞} F(x, y) = P{X ≤ x},
F_Y(y) = F(+∞, y) = lim_{x→+∞} F(x, y) = P{Y ≤ y}

are called the marginal cdfs of (X, Y) with respect to X and Y respectively.
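A quick numerical illustration (not from the slides): F_X(x) = F(x, +∞) can be approximated by evaluating the joint cdf at a very large value of y. The bivariate normal below is an arbitrary choice, picked only because both its joint and marginal cdfs are available in scipy for comparison.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# Arbitrary joint distribution: standard bivariate normal with correlation 0.5
joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 1.0]])

x = 0.3
# F_X(x) = F(x, +inf): a large y stands in for +infinity
F_X_approx = joint.cdf([x, 1e3])
# For this joint distribution the marginal of X is N(0, 1)
print(F_X_approx, norm.cdf(x))   # the two values agree closely
```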

Example 1. Suppose that the joint distribution function of (X, Y) is specified by the formula given. Determine F_X(x) and F_Y(y).

Marginal distribution for a discrete distribution. Suppose that (X, Y) has joint pmf P{X = x_i, Y = y_j} = p_{ij}, i, j = 1, 2, … Definition (P57):

P{X = x_i} = p_{i·} = Σ_j p_{ij},  i = 1, 2, …,
P{Y = y_j} = p_{·j} = Σ_i p_{ij},  j = 1, 2, …,

are called the marginal pmfs of (X, Y) with respect to X and Y respectively.
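A small sketch (not from the lecture) of how the marginal pmfs fall out as row and column sums of the joint pmf table; the 2×3 table below is made up purely for illustration.

```python
import numpy as np

# Hypothetical joint pmf table: rows index values of X, columns index values of Y
p = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])
assert np.isclose(p.sum(), 1.0)

p_X = p.sum(axis=1)   # p_{i.}: sum over j, the marginal pmf of X
p_Y = p.sum(axis=0)   # p_{.j}: sum over i, the marginal pmf of Y
print(p_X)            # [0.4 0.6]
print(p_Y)            # [0.35 0.35 0.3]
```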

Marginal density function. Suppose that (X, Y) ~ f(x, y), (x, y) ∈ R². Define (Example 3.4, P59)

f_X(x) = ∫_{−∞}^{+∞} f(x, y) dy,  x ∈ R,

the marginal pdf of (X, Y) with respect to X.

Similarly, the marginal pdf of Y is

f_Y(y) = ∫_{−∞}^{+∞} f(x, y) dx,  y ∈ R.
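The integral definition can be verified numerically. The sketch below is an assumption for illustration only, using the joint pdf f(x, y) = x + y on the unit square (which integrates to 1) and scipy.integrate.quad to integrate out y.

```python
from scipy.integrate import quad

# Hypothetical joint pdf: f(x, y) = x + y on the unit square, 0 elsewhere
def f(x, y):
    return x + y if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

def f_X(x):
    # Marginal pdf of X: integrate the joint pdf over all y (here the support is [0, 1])
    value, _ = quad(lambda y: f(x, y), 0.0, 1.0)
    return value

print(f_X(0.3))   # analytically f_X(x) = x + 1/2, so 0.8
```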

Example 3. Suppose that the joint density function of (X, Y) is specified by the formula given. Determine (1) the value of the constant c; (2) the marginal distribution of (X, Y) with respect to X.
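The slide's formula is not reproduced in this transcript, so the sketch below works a made-up density of the same flavour, f(x, y) = c·e^{−(x+2y)} for x > 0, y > 0, and uses sympy to solve for c and integrate out y.

```python
import sympy as sp

x, y, c = sp.symbols('x y c', positive=True)
f = c * sp.exp(-(x + 2*y))          # hypothetical joint pdf on x > 0, y > 0

# (1) the constant c: the total probability must equal 1
total = sp.integrate(f, (x, 0, sp.oo), (y, 0, sp.oo))
c_value = sp.solve(sp.Eq(total, 1), c)[0]
print(c_value)                       # 2

# (2) marginal pdf of X: integrate the joint pdf over y
f_X = sp.integrate(f.subs(c, c_value), (y, 0, sp.oo))
print(sp.simplify(f_X))              # exp(-x)
```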

(1) Bivariate uniform distribution. The random vector (X, Y) is said to follow the uniform distribution on a bounded region D if its density function is

f(x, y) = 1/A(D) for (x, y) ∈ D, and 0 otherwise,

where A(D) denotes the area of D. By the definition, one can easily find that if (X, Y) is uniformly distributed on D, then for any region G ⊂ D,

P{(X, Y) ∈ G} = A(G)/A(D).

Example 2. Suppose that (X, Y) is uniformly distributed on a region D, shown in the figure on the slide. Determine: (1) the density function of (X, Y); (2) P{Y < 2X}; (3) F(0.5, 0.5).
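The region D shown on the original slide is not reproduced here, so the sketch below simply assumes D is the unit square [0, 1] × [0, 1] to show how such probabilities reduce to area ratios; with that assumption a Monte Carlo estimate is compared against the exact values 3/4 and 1/4.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
# Assumed D = unit square, so X and Y are independent U(0, 1) variables
x = rng.random(n)
y = rng.random(n)

# (1) density: f(x, y) = 1/A(D) = 1 on the unit square, 0 elsewhere
# (2) P{Y < 2X}: area of {y < 2x} inside D divided by A(D); exact value 3/4
print((y < 2 * x).mean())
# (3) F(0.5, 0.5) = P{X <= 0.5, Y <= 0.5}: exact value 1/4
print(((x <= 0.5) & (y <= 0.5)).mean())
```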

(2) Two-dimensional normal distribution. Suppose that the density function of (X, Y) is specified by

f(x, y) = 1 / (2π σ1 σ2 √(1 − ρ²)) · exp{ −1/(2(1 − ρ²)) · [ (x − μ1)²/σ1² − 2ρ(x − μ1)(y − μ2)/(σ1 σ2) + (y − μ2)²/σ2² ] },

where μ1 and μ2 are constants, and σ1 > 0, σ2 > 0, |ρ| < 1 are also constants. Then (X, Y) is said to follow the two-dimensional normal distribution with parameters μ1, μ2, σ1, σ2, ρ, denoted by (X, Y) ~ N(μ1, μ2, σ1², σ2², ρ).
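As a sanity check (not part of the slides), the density formula above can be transcribed directly and compared with scipy's multivariate normal pdf; the parameter values are arbitrary.

```python
import numpy as np
from scipy.stats import multivariate_normal

mu1, mu2, s1, s2, rho = 1.0, -0.5, 2.0, 1.5, 0.3   # arbitrary parameters

def bivariate_normal_pdf(x, y):
    # Direct transcription of the two-dimensional normal density above
    q = ((x - mu1)**2 / s1**2
         - 2 * rho * (x - mu1) * (y - mu2) / (s1 * s2)
         + (y - mu2)**2 / s2**2)
    return np.exp(-q / (2 * (1 - rho**2))) / (2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2))

# scipy parameterizes the same distribution by its mean vector and covariance matrix
cov = [[s1**2, rho * s1 * s2],
       [rho * s1 * s2, s2**2]]
ref = multivariate_normal(mean=[mu1, mu2], cov=cov)
print(bivariate_normal_pdf(0.2, 0.7), ref.pdf([0.2, 0.7]))   # the two values match
```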

Example. The joint pdf of (X, Y) is as specified on the slide. Find the marginal pdfs of X and Y.

Solution. For each case of x, the marginal pdf of X is obtained by integrating the joint pdf over y: f_X(x) = ∫_{−∞}^{+∞} f(x, y) dy, with f_X(x) = 0 for x outside the support.

Likewise, for each case of y, f_Y(y) = ∫_{−∞}^{+∞} f(x, y) dx, with f_Y(y) = 0 for y outside the support.
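The specific density of this example is lost in the transcript, so to show the case-by-case integration the solution refers to, the sketch below works a standard textbook-style density, f(x, y) = 8xy for 0 < y < x < 1 (0 elsewhere), whose marginals come out piecewise.

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 8 * x * y                       # hypothetical joint pdf on 0 < y < x < 1, 0 elsewhere

# When 0 < x < 1: integrate over the y-values allowed by the support, i.e. 0 < y < x
f_X = sp.integrate(f, (y, 0, x))    # 4*x**3; f_X(x) = 0 for x outside (0, 1)

# When 0 < y < 1: integrate over the x-values allowed by the support, i.e. y < x < 1
f_Y = sp.integrate(f, (x, y, 1))    # 4*y - 4*y**3; f_Y(y) = 0 for y outside (0, 1)
print(f_X, sp.factor(f_Y))
```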

Homework: P67: 5, 6.