
Joint Probability Distribution

The joint probability distribution function of X and Y is denoted by $f_{XY}(x,y)$. The marginal probability distribution function of X, $f_X(x)$, is obtained by:

i. summing the probabilities corresponding to each y value at the given x (discrete case): $f_X(x) = \sum_y f_{XY}(x,y)$

ii. integrating out y from the joint pdf (continuous case): $f_X(x) = \int_y f_{XY}(x,y)\,dy$

The conditional probability distribution function of X given Y is denoted by $f_{X|Y}(x|y)$:

$f_{X|Y}(x|y) = f_{X,Y}(x,y)/f_Y(y)$ [We similarly define $f_{Y|X}(y|x)$.]

Random variables X and Y are independent if and only if $f_{X|Y}(x|y) = f_X(x)$ for all x and y.
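
A minimal numerical sketch of these rules, using a made-up 2×3 joint pmf (the probabilities below are purely illustrative, not from the slides):

```python
import numpy as np

# A small, invented joint pmf for discrete X (rows) and Y (columns);
# entries are non-negative and sum to 1.
f_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])

f_x = f_xy.sum(axis=1)   # marginal of X: sum over y for each x
f_y = f_xy.sum(axis=0)   # marginal of Y: sum over x for each y

# Conditional pmf of X given Y = y: f_{X|Y}(x|y) = f_{XY}(x,y) / f_Y(y).
# Broadcasting divides each column of f_xy by the corresponding f_Y(y).
f_x_given_y = f_xy / f_y

# X and Y are independent iff the joint pmf equals the product of marginals.
independent = np.allclose(f_xy, np.outer(f_x, f_y))
print(f_x, f_y, independent)
```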

Joint Probability Distribution

Covariance of two random variables X and Y:

$\operatorname{Cov}(X,Y) \equiv E\{(X-\mu_X)(Y-\mu_Y)\}$
$= E(XY - Y\mu_X - X\mu_Y + \mu_X\mu_Y)$
$= E(XY) - \mu_X E(Y) - \mu_Y E(X) + \mu_X\mu_Y$
$= E(XY) - \mu_X\mu_Y - \mu_X\mu_Y + \mu_X\mu_Y$
$= E(XY) - \mu_X\mu_Y$

The coefficient of correlation between two random variables X and Y:

$\rho(X,Y) \equiv \operatorname{Cov}(X,Y)/(\sigma_X\sigma_Y)$

Two random variables X and Y are uncorrelated if $\rho(X,Y) = 0$, or equivalently if $E(XY) = \mu_X\mu_Y$.
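
These identities are easy to check by simulation. A sketch assuming NumPy, with an arbitrary made-up correlated normal pair (the means, variances, and covariance are illustrative values, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
mean = [1.0, -2.0]
cov = [[4.0, 1.5],
       [1.5, 1.0]]   # Var(X)=4, Var(Y)=1, Cov(X,Y)=1.5 (made-up values)
x, y = rng.multivariate_normal(mean, cov, size=200_000).T

cov_direct = np.mean((x - x.mean()) * (y - y.mean()))  # E{(X-mu_X)(Y-mu_Y)}
cov_shortcut = np.mean(x * y) - x.mean() * y.mean()    # E(XY) - mu_X mu_Y
rho = cov_shortcut / (x.std() * y.std())               # Cov/(sigma_X sigma_Y)

# Both covariance estimates should be close to 1.5, and rho close to
# 1.5 / (2 * 1) = 0.75, up to Monte Carlo error.
print(cov_direct, cov_shortcut, rho)
```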

An important result: Suppose that X and Y are two random variables with means $\mu_X$, $\mu_Y$ and standard deviations $\sigma_X$, $\sigma_Y$ respectively. Then for $Z \equiv aX + bY$:

$E(Z) = a\mu_X + b\mu_Y$

$\operatorname{Var}(Z) = a^2\operatorname{Var}(X) + b^2\operatorname{Var}(Y) + 2ab\operatorname{Cov}(X,Y) = a^2\sigma_X^2 + b^2\sigma_Y^2 + 2ab\,\rho_{XY}\sigma_X\sigma_Y$

$\operatorname{Var}(Z) = a^2\operatorname{Var}(X) + b^2\operatorname{Var}(Y) + 2ab\operatorname{Cov}(X,Y) = a^2\sigma_X^2 + b^2\sigma_Y^2 + 2ab\,\rho_{XY}\sigma_X\sigma_Y$

where $\rho_{XY}$ is the coefficient of correlation between X and Y. If $\rho_{XY} = -1$, then $\operatorname{Var}(Z) = (a\sigma_X - b\sigma_Y)^2$.
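
A quick simulation sketch of the $\rho_{XY} = -1$ case; the coefficients and standard deviations below are arbitrary made-up choices:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 2.0, 3.0              # arbitrary illustrative coefficients
sx, sy, rho = 2.0, 1.0, -1.0

# With rho = -1, Y is a decreasing linear function of X: this choice
# gives sd(Y) = sy and corr(X, Y) = -1 exactly.
x = rng.normal(0.0, sx, 100_000)
y = -(sy / sx) * x
z = a * x + b * y

var_formula = a**2 * sx**2 + b**2 * sy**2 + 2 * a * b * rho * sx * sy
# Empirical variance, the general formula, and the rho = -1 special case
# (a*sx - b*sy)^2 should all agree (here, all equal to 1.0).
print(z.var(), var_formula, (a * sx - b * sy) ** 2)
```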

Joint Probability Distribution

Recall: Two random variables X and Y are independent if their joint p.d.f. $f_{XY}(x,y)$ is the product of the respective marginal p.d.f.s $f_X(x)$ and $f_Y(y)$. That is, $f_{XY}(x,y) = f_X(x)\,f_Y(y)$.

Theorem: Independence of two random variables X and Y implies that they are uncorrelated (but the converse is not always true; for example, if $X \sim \text{Normal}(0,1)$ and $Y = X^2$, then X and Y are uncorrelated yet clearly dependent).

Proof: By independence, $f_{XY}(x,y) = g(x)h(y)$, where $g = f_X$ and $h = f_Y$. Then

$E(XY) = \iint xy\, f_{XY}(x,y)\,dx\,dy$
$= \iint xy\, g(x)h(y)\,dx\,dy$
$= \int \left(\int x\,g(x)\,dx\right) y\,h(y)\,dy$
$= \int \mu_X\, y\,h(y)\,dy$
$= \mu_X \int y\,h(y)\,dy$
$= \mu_X\mu_Y$
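
The failure of the converse can be seen numerically with the X versus $X^2$ example mentioned above (the seed and sample size here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=500_000)
y = x**2   # Y is a deterministic function of X, so clearly dependent

# Cov(X, X^2) = E(X^3) - E(X) E(X^2) = 0 for a distribution symmetric
# about zero, so X and Y are uncorrelated despite being dependent.
print(np.corrcoef(x, y)[0, 1])   # close to 0
```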

The Normal Distribution

A continuous distribution with the pdf:

$f(x) = \dfrac{1}{\sigma_x\sqrt{2\pi}}\, e^{-\frac{1}{2}\left[(x-\mu_x)/\sigma_x\right]^2}$

For the standard normal variable Z:

$f(z) = \dfrac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}z^2}$
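
As a sanity check of the pdf as written, it can be compared against SciPy's implementation (the parameter values below are arbitrary):

```python
import numpy as np
from scipy import stats

mu, sigma = 2.0, 1.5   # arbitrary illustrative parameters
x = np.linspace(-3.0, 7.0, 5)

# The formula from the slide, written out directly.
pdf_manual = (1.0 / (sigma * np.sqrt(2 * np.pi))) \
    * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

print(np.allclose(pdf_manual, stats.norm.pdf(x, mu, sigma)))   # True
```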

Suppose that X and Y are two random variables with means $\mu_X$, $\mu_Y$ and standard deviations $\sigma_X$ and $\sigma_Y$ respectively. Also assume that both X and Y are normally distributed. Then if $W \equiv aX + bY$:

$W \sim \text{Normal}(\mu_W, \sigma_W^2)$ with $\mu_W = a\mu_X + b\mu_Y$ and $\sigma_W^2 = a^2\sigma_X^2 + b^2\sigma_Y^2 + 2ab\,\rho_{XY}\sigma_X\sigma_Y$, where $\rho_{XY}$ is the relevant correlation coefficient.

Message: A linear combination of two or more jointly normally distributed (independent or not) r.v.s has a normal distribution as well. (Joint normality matters here: variables that are each marginally normal but not jointly normal need not have a normal sum.)
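
A simulation sketch of this result for a correlated jointly normal pair; all parameter values are made-up, and SciPy is assumed available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
a, b = 1.0, -2.0
mean = [0.0, 1.0]
cov = [[1.0, 0.6],
       [0.6, 2.0]]   # jointly normal and correlated (illustrative values)
x, y = rng.multivariate_normal(mean, cov, size=100_000).T
w = a * x + b * y

mu_w = a * mean[0] + b * mean[1]
var_w = a**2 * cov[0][0] + b**2 * cov[1][1] + 2 * a * b * cov[0][1]

# Kolmogorov-Smirnov test against the predicted normal: a large p-value
# is consistent with W ~ Normal(mu_w, var_w).
print(stats.kstest(w, 'norm', args=(mu_w, np.sqrt(var_w))))
```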

The $\chi^2$ distribution: Consider $Z \sim \text{Normal}(0,1)$ and let $Y = Z^2$. Then Y has a $\chi^2$ distribution with 1 degree of freedom (d.o.f.). We write this as $Y \sim \chi^2(1)$.

Consider $Z_1, Z_2, \ldots, Z_n$, independent random variables each $\sim \text{Normal}(0,1)$. Then the sum of their squares $\sum Z_i^2$ has a $\chi^2$ distribution with d.o.f. = n. That is, $\sum Z_i^2 \sim \chi^2(n)$.
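
A quick check by simulation (n and the seed are arbitrary choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 5
z = rng.normal(size=(200_000, n))
y = (z**2).sum(axis=1)   # sum of n squared independent N(0,1) variables

# A large KS p-value is consistent with y ~ chi-squared with n d.o.f.
print(stats.kstest(y, 'chi2', args=(n,)))
```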

Consider two independent random variables $X \sim \text{Normal}(0,1)$ and $Y \sim \chi^2(n)$. Then the variable $W \equiv X/\sqrt{Y/n}$ has a t-distribution with d.o.f. = n. That is, $W \sim t(n)$.
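
The same kind of check works for the t construction (again with arbitrary n and seed):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 8
x = rng.normal(size=300_000)                           # N(0,1)
y = stats.chi2.rvs(n, size=300_000, random_state=rng)  # chi2(n), independent of x
w = x / np.sqrt(y / n)

# A large KS p-value is consistent with w ~ t(n).
print(stats.kstest(w, 't', args=(n,)))
```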

An Application: Consider $X \sim \text{Normal}(\mu, \sigma^2)$. Then $\bar{X} \sim \text{Normal}(\mu, \sigma^2/n)$. Consider instead $X \sim (\mu, \sigma^2)$, not necessarily normal. Then $\bar{X} \sim \text{Normal}(\mu, \sigma^2/n)$ approximately, if n is 'large' (CLT). In either case the variable

$W \equiv \dfrac{\bar{X} - \mu}{s/\sqrt{n}}$

has a t-distribution with d.o.f. = n − 1 (exactly in the normal case, approximately otherwise), where $s^2$ is the usual unbiased estimator of $\sigma^2$.
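
A sketch of this statistic on a made-up sample, compared against SciPy's one-sample t statistic (the true mean, spread, and sample size below are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
mu_true = 10.0
sample = rng.normal(mu_true, 3.0, size=20)   # invented sample, n = 20

n = sample.size
t_stat = (sample.mean() - mu_true) / (sample.std(ddof=1) / np.sqrt(n))

# SciPy computes the same statistic, which is t-distributed with n-1 d.o.f.
print(t_stat, stats.ttest_1samp(sample, mu_true).statistic)
```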

Suppose that $X \sim \chi^2(m)$ and $Y \sim \chi^2(n)$, and the variables X and Y are independent. Then the variable $V \equiv (X/m)/(Y/n)$ has an F distribution with numerator d.o.f. = m and denominator d.o.f. = n. That is, $V \sim F(m,n)$.
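
A simulation sketch of the F construction (m, n, and the seed are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
m, n = 4, 9
x = stats.chi2.rvs(m, size=200_000, random_state=rng)
y = stats.chi2.rvs(n, size=200_000, random_state=rng)   # independent of x
v = (x / m) / (y / n)

# A large KS p-value is consistent with v ~ F(m, n).
print(stats.kstest(v, 'f', args=(m, n)))
```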

Suppose that $X \sim \chi^2(1)$ and $Y \sim \chi^2(n)$, and the variables X and Y are independent. Then the variable $V \equiv X/(Y/n)$ has an F distribution with numerator d.o.f. = 1 and denominator d.o.f. = n. That is, $V \sim F(1,n)$. Clearly, if $T \sim t(n)$, then $T^2 \sim F(1,n)$, so V has the distribution of the square of a $t(n)$ variable.
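
This connection between the t and F distributions can be checked exactly through quantiles: the square of the two-sided t critical value equals the one-sided F(1, n) critical value. A sketch with arbitrary n and significance level, assuming SciPy:

```python
from scipy import stats

n = 12        # illustrative denominator d.o.f.
alpha = 0.05

t_crit = stats.t.ppf(1 - alpha / 2, n)   # two-sided t critical value
f_crit = stats.f.ppf(1 - alpha, 1, n)    # one-sided F(1, n) critical value

# Equal, since t(n)^2 has the F(1, n) distribution.
print(t_crit**2, f_crit)
```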