Multiple Random Variables: Two Discrete Random Variables (Joint pmf, Marginal pmf); Two Continuous Random Variables (Joint Distribution (PDF), Joint Density).


Multiple Random Variables: Two Discrete Random Variables – Joint pmf, Marginal pmf; Two Continuous Random Variables – Joint Distribution (PDF), Joint Density (pdf), Marginal Densities, Independence

Multiple Random Variables. In many experiments the observations are expressible not as a single quantity, but as a family of quantities. For example, to record the height and weight of each person in a community, or the number of people and the total income in a family, we need two numbers.


Two Continuous Random Variables. Let X and Y denote two random variables (r.v.s) based on a probability model. Then each can be described separately through its own distribution function: P(x1 < X ≤ x2) = F_X(x2) − F_X(x1) and P(y1 < Y ≤ y2) = F_Y(y2) − F_Y(y1).

What about the probability that the pair of r.v.s (X, Y) belongs to an arbitrary region D? In other words, how does one estimate, for example, P((X, Y) ∈ D)? Toward this, we define the joint probability distribution function of X and Y to be F_XY(x, y) = P(X ≤ x, Y ≤ y) ≥ 0, where x and y are arbitrary real numbers.
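As a sanity check on this definition, the joint c.d.f. of two independent Uniform(0, 1) variables (a convenient closed-form case, not taken from the slides, for which F_XY(x, y) = x·y) can be estimated by Monte Carlo:

```python
import random

def joint_cdf_mc(x, y, n=200_000, seed=1):
    """Monte Carlo estimate of F_XY(x, y) = P(X <= x, Y <= y)
    for X, Y independent Uniform(0, 1) random variables."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        u, v = rng.random(), rng.random()
        if u <= x and v <= y:
            hits += 1
    return hits / n

# For independent uniforms the joint c.d.f. factors: F_XY(x, y) = x * y.
print(round(joint_cdf_mc(0.5, 0.8), 2))  # close to 0.5 * 0.8 = 0.40
```

The estimate converges to the product x·y precisely because these two variables are independent; the later sections make this factorization the defining test for independence.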


Properties
(i) F_XY(−∞, y) = F_XY(x, −∞) = 0 and F_XY(+∞, +∞) = 1.
(ii) P(x1 < X ≤ x2, Y ≤ y) = F_XY(x2, y) − F_XY(x1, y)   (1)
P(X ≤ x, y1 < Y ≤ y2) = F_XY(x, y2) − F_XY(x, y1)   (2)
To prove (1), we note that for x2 > x1, (X ≤ x2, Y ≤ y) = (X ≤ x1, Y ≤ y) ∪ (x1 < X ≤ x2, Y ≤ y), and the mutually exclusive property of the events on the right side gives F_XY(x2, y) = F_XY(x1, y) + P(x1 < X ≤ x2, Y ≤ y), which proves (1). Similarly (2) follows.
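Applying (1) and (2) in succession gives the rectangle probability P(x1 < X ≤ x2, y1 < Y ≤ y2) = F_XY(x2, y2) − F_XY(x1, y2) − F_XY(x2, y1) + F_XY(x1, y1). A quick numerical check of this four-corner identity, using the closed-form c.d.f. F(x, y) = x·y of two independent uniforms (a convenient choice, not from the slides):

```python
def F(x, y):
    """Joint c.d.f. of two independent Uniform(0, 1) r.v.s,
    clipped to the unit square: F_XY(x, y) = x * y."""
    cx = min(max(x, 0.0), 1.0)
    cy = min(max(y, 0.0), 1.0)
    return cx * cy

def rect_prob(F, x1, x2, y1, y2):
    """P(x1 < X <= x2, y1 < Y <= y2) via the four-corner identity:
    F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)."""
    return F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)

p = rect_prob(F, 0.2, 0.5, 0.1, 0.4)
print(round(p, 6))  # (0.5 - 0.2) * (0.4 - 0.1) = 0.09
```

For independent uniforms the rectangle probability is simply (x2 − x1)(y2 − y1), which the identity reproduces exactly.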


It is useful to know the formula for differentiation under integrals. Let H(x) = ∫_{a(x)}^{b(x)} h(x, y) dy. Then its derivative with respect to x is given by dH(x)/dx = h(x, b(x)) · b′(x) − h(x, a(x)) · a′(x) + ∫_{a(x)}^{b(x)} ∂h(x, y)/∂x dy.
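A quick numerical illustration of this rule, with the (arbitrarily chosen) integrand h(x, y) = x + y and limits a(x) = 0, b(x) = x²:

```python
def H(x, n=10_000):
    """H(x) = integral of h(x, y) = x + y over y in [a(x), b(x)] = [0, x**2],
    computed with a midpoint rule (exact here, since h is linear in y)."""
    a, b = 0.0, x ** 2
    dy = (b - a) / n
    return sum((x + (a + (k + 0.5) * dy)) * dy for k in range(n))

def dH_leibniz(x):
    """dH/dx = h(x, b) b'(x) - h(x, a) a'(x) + integral of dh/dx over [a, b]."""
    a, b = 0.0, x ** 2
    boundary = (x + b) * (2.0 * x) - (x + a) * 0.0  # b'(x) = 2x, a'(x) = 0
    interior = b - a                                # dh/dx = 1 on [a, b]
    return boundary + interior

x = 1.3
central_diff = (H(x + 1e-5) - H(x - 1e-5)) / 2e-5
print(abs(central_diff - dH_leibniz(x)) < 1e-3)  # True
```

Here H(x) = x³ + x⁴/2 in closed form, so both routes should agree with H′(x) = 3x² + 2x³; the formula becomes essential later when differentiating joint c.d.f.s whose integration limits themselves depend on the variable.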

The joint P.D.F. and/or the joint p.d.f. represent complete information about the r.v.s, and their marginal p.d.f.s can be evaluated from the joint p.d.f. However, given the marginals, it will (most often) not be possible to recover the joint p.d.f. Consider the following example.
Example: Given f_XY(x, y) = c (a constant) on the region 0 < x < y < 1, and zero otherwise, obtain the marginal p.d.f.s f_X(x) and f_Y(y).
Solution: The joint p.d.f. is the constant c on the shaded triangular region in the figure (omitted here). To determine that constant c, note that the p.d.f. must integrate to one: ∫_0^1 ∫_0^y c dx dy = c ∫_0^1 y dy = c/2 = 1.

Thus c = 2. Using the marginal p.d.f. equations, f_X(x) = ∫ f_XY(x, y) dy = ∫_x^1 2 dy = 2(1 − x) for 0 < x < 1, and similarly f_Y(y) = ∫_0^y 2 dx = 2y for 0 < y < 1. Clearly, given only f_X(x) and f_Y(y) above, it will not be possible to obtain the original joint p.d.f.
Example: X and Y are said to be jointly normal (Gaussian) distributed if their joint p.d.f. has the following form:
f_XY(x, y) = (1 / (2π σ_X σ_Y √(1 − ρ²))) · exp{ −(1 / (2(1 − ρ²))) · [ (x − μ_X)²/σ_X² − 2ρ(x − μ_X)(y − μ_Y)/(σ_X σ_Y) + (y − μ_Y)²/σ_Y² ] },  −∞ < x, y < ∞, |ρ| < 1.  (Eq. 1)
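The marginals of the triangle example can be cross-checked numerically. A minimal sketch, assuming the density f_XY(x, y) = 2 on 0 < x < y < 1, whose marginals should come out as f_X(x) = 2(1 − x) and f_Y(y) = 2y:

```python
def f_joint(x, y):
    """Joint p.d.f. of the example: constant 2 on the triangle 0 < x < y < 1."""
    return 2.0 if 0.0 < x < y < 1.0 else 0.0

def marginal_x(x, n=20_000):
    """f_X(x) = integral over y of f_XY(x, y), via a midpoint rule on (0, 1)."""
    dy = 1.0 / n
    return sum(f_joint(x, (k + 0.5) * dy) * dy for k in range(n))

def marginal_y(y, n=20_000):
    """f_Y(y) = integral over x of f_XY(x, y), via a midpoint rule on (0, 1)."""
    dx = 1.0 / n
    return sum(f_joint((k + 0.5) * dx, y) * dx for k in range(n))

print(round(marginal_x(0.3), 3))  # close to 2 * (1 - 0.3) = 1.4
print(round(marginal_y(0.3), 3))  # close to 2 * 0.3 = 0.6
```

Note that the two marginals together (a decreasing density and an increasing one) give no hint that the joint density is supported only on a triangle, which is exactly the information-loss point the text is making.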

[Figure: example of a joint normal p.d.f. with ρ = 0.02 and X, Y ~ N(0, 1).]

By direct integration and completing the square, it can be shown that X ~ N(μ_X, σ_X²) and similarly Y ~ N(μ_Y, σ_Y²); that is, the marginals of a jointly normal pair are themselves Gaussian. Following the above notation, we will denote the joint p.d.f. in Eq. 1 as N(μ_X, μ_Y, σ_X², σ_Y², ρ). Once again, knowing the marginals as above alone doesn't tell us everything about the joint p.d.f. given in Eq. 1. As we show below, the only situation where the marginal p.d.f.s can be used to recover the joint p.d.f. is when the random variables are statistically independent. (PILLAI)

Independence of r.v.s. Definition: The random variables X and Y are said to be statistically independent if the events {X ∈ A} and {Y ∈ B} are independent events for any two sets A and B on the x and y axes, respectively. Applying the above definition to the events {X ≤ x} and {Y ≤ y}, we conclude that if the r.v.s X and Y are independent, then P(X ≤ x, Y ≤ y) = P(X ≤ x) · P(Y ≤ y), i.e., F_XY(x, y) = F_X(x) F_Y(y), or equivalently, if X and Y are independent, then we must have f_XY(x, y) = f_X(x) f_Y(y). (Cond. 1)

If X and Y are discrete-type r.v.s, then their independence implies P(X = x_i, Y = y_j) = P(X = x_i) · P(Y = y_j) for all i, j. The previous equations give us the procedure to test for independence: given f_XY(x, y), obtain the marginal p.d.f.s f_X(x) and f_Y(y), and examine whether Cond. 1 is valid. If so, the r.v.s are independent; otherwise they are dependent. Returning to the previous example (the constant p.d.f. on a triangle), we observe by direct verification that f_XY(x, y) ≠ f_X(x) f_Y(y). Hence X and Y are dependent r.v.s in that case.
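The dependence verdict for the triangle example can be seen at a single point. A small check, assuming (as in that example) f_XY(x, y) = 2 on 0 < x < y < 1 with marginals f_X(x) = 2(1 − x) and f_Y(y) = 2y:

```python
def f_joint(x, y):
    """Joint p.d.f. of the triangle example: 2 on 0 < x < y < 1, else 0."""
    return 2.0 if 0.0 < x < y < 1.0 else 0.0

def f_x(x):
    """Marginal p.d.f. of X: 2 * (1 - x) on (0, 1)."""
    return 2.0 * (1.0 - x) if 0.0 < x < 1.0 else 0.0

def f_y(y):
    """Marginal p.d.f. of Y: 2 * y on (0, 1)."""
    return 2.0 * y if 0.0 < y < 1.0 else 0.0

# Independence would require f_XY(x, y) = f_X(x) * f_Y(y) everywhere;
# one point where they disagree is enough to conclude dependence.
x, y = 0.2, 0.6
print(f_joint(x, y))    # 2.0
print(f_x(x) * f_y(y))  # about 1.92, so X and Y are dependent
```

Proving independence requires the factorization to hold at every point, but disproving it takes only one counterexample point, which is what makes this test practical.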

In the case of two jointly Gaussian r.v.s as in Eq. 1 above, X and Y are independent if and only if the fifth parameter ρ is zero, i.e., ρ = 0.

Example: Given f_XY(x, y) = x y² e^(−y) for 0 < x < 1, 0 < y < ∞ (and zero otherwise), determine whether X and Y are independent.
Solution: f_X(x) = ∫_0^∞ x y² e^(−y) dy = 2x for 0 < x < 1. Similarly, f_Y(y) = ∫_0^1 x y² e^(−y) dx = (1/2) y² e^(−y) for 0 < y < ∞. In this case f_XY(x, y) = f_X(x) f_Y(y), and hence X and Y are independent random variables.
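A numerical confirmation of this kind of factorization, sketched for a separable density of the form f_XY(x, y) = x y² e^(−y) on 0 < x < 1, y > 0, whose marginals are f_X(x) = 2x and f_Y(y) = (1/2) y² e^(−y):

```python
import math

def f_joint(x, y):
    """f_XY(x, y) = x * y**2 * exp(-y) on 0 < x < 1, y > 0, else 0."""
    return x * y ** 2 * math.exp(-y) if 0.0 < x < 1.0 and y > 0.0 else 0.0

def f_x(x):
    """Marginal of X: 2x (since the y-integral of y**2 * exp(-y) is 2)."""
    return 2.0 * x if 0.0 < x < 1.0 else 0.0

def f_y(y):
    """Marginal of Y: y**2 * exp(-y) / 2 (since the x-integral of x is 1/2)."""
    return 0.5 * y ** 2 * math.exp(-y) if y > 0.0 else 0.0

# The joint density factors into the product of its marginals everywhere:
x, y = 0.4, 2.5
print(abs(f_joint(x, y) - f_x(x) * f_y(y)) < 1e-12)  # True
```

Because the support is a product of intervals and the density splits into an x-factor times a y-factor, the factorization holds at every point, which is exactly Cond. 1 for independence.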