Statistics for Business & Economics


Statistics for Business & Economics Joint Probability Distributions

Learning Objectives Define two discrete random variables Describe two continuous random variables Discuss covariance and correlation Define the bivariate normal distribution Describe linear combinations of random variables Discuss Chebyshev’s inequality As a result of this class, you will be able to ...

Two Discrete Random Variables

Why Two Discrete Random Variables? To study the simultaneous behavior of two random variables. Example: X and Y are random variables representing two dimensions of a part on a production line. What is the probability that X is within the range 2.95 ~ 3.05 and Y is within the range 7.60 ~ 7.80?

Two Discrete Random Variables If X and Y are discrete random variables, the joint distribution of X and Y is a description of the set of points (x, y) in the range of (X, Y) along with the probability of each point. It is sometimes referred to as the bivariate probability distribution or bivariate distribution.

Two Discrete Random Variables The joint probability mass function of X and Y is fXY(x, y) = P(X = x, Y = y), which satisfies fXY(x, y) ≥ 0 for all (x, y) and Σx Σy fXY(x, y) = 1.

Two Discrete Random Variables ---Example A financial company uses X = 1, 2, 3 to represent low, medium, and high income customers respectively. They also use Y = 1, 2, 3, 4 to represent mutual funds, bonds, stocks, and options respectively. The joint probability distribution of X and Y can then be represented by the following table.

Two Discrete Random Variables ---Example [Joint probability table fXY(x, y) for X = 1, 2, 3 (rows) and Y = 1, 2, 3, 4 (columns); the entries, which sum to 1, were lost in extraction.]

Two Discrete Random Variables ---Example [Figure of the joint distribution; lost in extraction, only the axis label Y survived.]

Marginal Probability Distribution (1/2) The marginal probability mass functions of X and Y are fX(x) = P(X = x) = Σ_Rx fXY(x, y) and fY(y) = P(Y = y) = Σ_Ry fXY(x, y),

Marginal Probability Distribution (2/2) where Rx denotes the set of all points in the range of (X, Y) for which X=x and Ry denotes the set of all points in the range of (X, Y) for which Y=y

Mean of Marginal Probability Distribution If the marginal probability distribution of X has the probability mass function fX(x), then E(X) = μX = Σx x fX(x) and V(X) = σX² = Σx (x − μX)² fX(x).

Example(1/3) A financial company uses X = 1, 2, 3 to represent low, medium, and high income customers respectively. They also use Y = 1, 2, 3, 4 to represent mutual funds, bonds, stocks, and options respectively. The joint probability distribution of X and Y can then be represented by the following table.

Example(2/3) [Joint probability table fXY(x, y) for X = 1, 2, 3 and Y = 1, 2, 3, 4; the entries were lost in extraction.]

Example(3/3) Please find the marginal probability mass functions fX(x) and fY(y), and their means and variances.

Solution to Example(1/3)

Solution to Example(2/3)

Solution to Example(3/3) [The joint table extended with the marginal column fX(x) and marginal row fY(y); most entries were lost in extraction. The surviving values suggest fX(2) = 0.5, fX(3) = 0.4, and fY(1) = 0.3.]

Conditional Probability Distribution Given discrete random variables X and Y with joint probability mass function fXY(x, y), the conditional probability mass function of Y given X = x is fY|x(y) = fXY(x, y) / fX(x), for fX(x) > 0.
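A minimal sketch of the definition: divide one row of the joint table by its marginal total. The joint table is hypothetical, since the slide's own entries did not survive extraction.

```python
# Conditional pmf f_{Y|x}(y) = f_XY(x, y) / f_X(x), defined for f_X(x) > 0.
# The joint table is a hypothetical stand-in for the slide's example.
joint = {
    (1, 1): 0.10, (1, 2): 0.05, (1, 3): 0.03, (1, 4): 0.02,
    (2, 1): 0.10, (2, 2): 0.20, (2, 3): 0.10, (2, 4): 0.10,
    (3, 1): 0.10, (3, 2): 0.05, (3, 3): 0.10, (3, 4): 0.05,
}

def conditional_y_given_x(pmf, x):
    fx = sum(p for (xi, _), p in pmf.items() if xi == x)   # f_X(x)
    if fx <= 0:
        raise ValueError("f_X(x) must be positive")
    # keep only the row X = x and renormalize it by f_X(x)
    return {y: p / fx for (xi, y), p in pmf.items() if xi == x}

f_y_given_2 = conditional_y_given_x(joint, 2)
```

Renormalizing by fX(x) is what makes the conditional distribution a genuine pmf: its values sum to 1.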

Properties of Conditional Probability Distribution fY|x(y) ≥ 0 for all y, Σy fY|x(y) = 1, and P(Y = y | X = x) = fY|x(y).

Example(1/3) Same example as above. [Joint probability table fXY(x, y) for X = 1, 2, 3 and Y = 1, 2, 3, 4; the entries were lost in extraction.]

Example(2/3) Computed results. [The joint table extended with the marginal column fX(x) and marginal row fY(y); the entries were lost in extraction.]

Example(3/3) The conditional probability mass function of Y given X = 2 is fY|2(y) = fXY(2, y) / fX(2).

Properties of Conditional Probability Distribution Conditional mean of Y given X = x: E(Y | x) = μY|x = Σy y fY|x(y). Conditional variance of Y given X = x: V(Y | x) = σ²Y|x = Σy (y − μY|x)² fY|x(y).

Example Please find E(X | Y = 1).

Example Please find V(X | Y = 1).

Independence of Two Discrete Random Variables(1/2) For discrete random variables X and Y, if any one of the following properties is true, then the others are also true, and X and Y are independent .

Independence of Two Discrete Random Variables(2/2) (1) fXY(x, y) = fX(x) fY(y) for all x and y. (2) fY|x(y) = fY(y) for all x and y with fX(x) > 0. (3) fX|y(x) = fX(x) for all x and y with fY(y) > 0. (4) P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for any sets A and B in the range of X and Y, respectively.
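Property (1) gives a direct computational check: compare every joint probability with the product of its marginals. The joint table is again a hypothetical stand-in for the slide's example.

```python
# Check discrete independence: f_XY(x, y) = f_X(x) f_Y(y) at every point.
# Note: pairs with probability 0 must also appear in the table for the
# check to be complete.
from collections import defaultdict

joint = {
    (1, 1): 0.10, (1, 2): 0.05, (1, 3): 0.03, (1, 4): 0.02,
    (2, 1): 0.10, (2, 2): 0.20, (2, 3): 0.10, (2, 4): 0.10,
    (3, 1): 0.10, (3, 2): 0.05, (3, 3): 0.10, (3, 4): 0.05,
}

def is_independent(pmf, tol=1e-9):
    fx, fy = defaultdict(float), defaultdict(float)
    for (x, y), p in pmf.items():
        fx[x] += p
        fy[y] += p
    # X, Y independent iff the factorization holds everywhere
    return all(abs(p - fx[x] * fy[y]) <= tol
               for (x, y), p in pmf.items())

dependent = not is_independent(joint)   # here f_XY(1,1) != f_X(1) f_Y(1)
```

For this table fXY(1, 1) = 0.10 while fX(1) fY(1) = 0.2 × 0.3 = 0.06, so the factorization fails and X and Y are dependent.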

Example In the above example, are X and Y independent?

Solution to Example P(X = 1, Y = 1) = 0.1, but the marginals give fX(1) fY(1) ≠ 0.1, so fXY(1, 1) ≠ fX(1) fY(1). Thus X and Y are dependent!

Two Continuous Random Variables

Two Continuous Random Variables A joint probability density function for the continuous random variables X and Y, denoted fXY(x, y), satisfies the following properties: (1) fXY(x, y) ≥ 0 for all x and y; (2) ∫∫ fXY(x, y) dx dy = 1; (3) for any region R of the plane, P((X, Y) ∈ R) = ∫∫_R fXY(x, y) dx dy.

Example Suppose X is the time to failure of a component, and Y is the time to failure of its spare part. The joint probability density function of X and Y is: [expression lost in extraction]. Please find P(X ≦ 1000, Y ≦ 2000).

Solution to Example 0.915 (p.221 in supplement material)

Marginal Probability Distributions If the joint probability density function of continuous random variables X and Y is fXY(x, y), then the marginal probability density functions of X and Y are fX(x) = ∫_Rx fXY(x, y) dy and fY(y) = ∫_Ry fXY(x, y) dx, where Rx denotes the set of all points in the range of (X, Y) for which X = x and Ry denotes the set of all points in the range of (X, Y) for which Y = y.
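A sketch of the marginal-by-integration idea. Since the slide's joint pdf lives only in its supplementary material, the code assumes a hypothetical joint density of independent exponential failure times (the rates LAM and MU are made up), and recovers fX(x) by numerically integrating out y.

```python
import math

LAM, MU = 1 / 1000, 1 / 2000   # hypothetical failure rates (per hour)

def f_xy(x, y):
    # assumed joint pdf: independent exponential failure times
    return LAM * MU * math.exp(-LAM * x - MU * y)

def marginal_x(x, y_max=50000.0, n=20000):
    # f_X(x) = integral over y of f_XY(x, y); composite trapezoid rule
    # over [0, y_max]; the tail beyond y_max is negligible here.
    h = y_max / n
    total = 0.5 * (f_xy(x, 0.0) + f_xy(x, y_max))
    for i in range(1, n):
        total += f_xy(x, i * h)
    return total * h

x0 = 1000.0
approx = marginal_x(x0)
exact = LAM * math.exp(-LAM * x0)   # analytic marginal of an exponential
```

For this separable density the numeric result agrees with the analytic exponential marginal to several digits, which is a useful sanity check on the integration.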

Example From the above example, please find P(Y > 2000).

Solution to Example 0.05 (p.223~p.224)

Conditional Probability Distribution Given continuous random variables X and Y with joint probability density function fXY(x, y), the conditional probability density function of Y given X = x is fY|x(y) = fXY(x, y) / fX(x), for fX(x) > 0.

Properties of Conditional Probability Distribution fY|x(y) ≥ 0 for all y, and ∫ fY|x(y) dy = 1.

Example From the above example, please find the conditional density function of Y given that X = x.

Solution to Example p.225

Properties of Conditional Probability Distribution Conditional mean of Y given X = x: E(Y | x) = μY|x = ∫ y fY|x(y) dy. Conditional variance of Y given X = x: V(Y | x) = σ²Y|x = ∫ (y − μY|x)² fY|x(y) dy.

Independence of Two Continuous Random Variables (1/2) For continuous random variables X and Y, if any one of the following properties is true, then the others are also true, and X and Y are independent .

Independence of Two Continuous Random Variables(2/2) (1) fXY(x, y) = fX(x) fY(y) for all x and y. (2) fY|x(y) = fY(y) for all x and y with fX(x) > 0. (3) fX|y(x) = fX(x) for all x and y with fY(y) > 0. (4) P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for any sets A and B in the range of X and Y, respectively.

Covariance and Correlation

Covariance Expected value of a function of two random variables h(X, Y): E[h(X, Y)] = Σx Σy h(x, y) fXY(x, y) if X and Y are discrete, and E[h(X, Y)] = ∫∫ h(x, y) fXY(x, y) dx dy if X and Y are continuous.

Covariance The covariance between the random variables X and Y, denoted cov(X, Y) or σXY, is σXY = E[(X − μX)(Y − μY)] = E(XY) − μX μY.
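The shortcut form cov(X, Y) = E(XY) − E(X)E(Y) can be computed directly from a joint pmf; the table below is hypothetical, as before.

```python
# Covariance from a joint pmf via the shortcut formula
#   cov(X, Y) = E(XY) - E(X) E(Y).
# The joint table is a hypothetical stand-in for the slide's example.
joint = {
    (1, 1): 0.10, (1, 2): 0.05, (1, 3): 0.03, (1, 4): 0.02,
    (2, 1): 0.10, (2, 2): 0.20, (2, 3): 0.10, (2, 4): 0.10,
    (3, 1): 0.10, (3, 2): 0.05, (3, 3): 0.10, (3, 4): 0.05,
}

ex  = sum(x * p for (x, y), p in joint.items())       # E(X)
ey  = sum(y * p for (x, y), p in joint.items())       # E(Y)
exy = sum(x * y * p for (x, y), p in joint.items())   # E(XY)
cov = exy - ex * ey                                   # sigma_XY
```

A positive covariance, as here, means large values of X tend to occur with large values of Y.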

Example(1/3) A financial company uses X = 1, 2, 3 to represent low, medium, and high income customers respectively. They also use Y = 1, 2, 3, 4 to represent mutual funds, bonds, stocks, and options respectively. The joint probability distribution of X and Y can then be represented by the following table.

Example(2/3) [Joint probability table fXY(x, y) for X = 1, 2, 3 and Y = 1, 2, 3, 4; the entries were lost in extraction.]

Example(3/3) Please find the covariance of X and Y.

Solution to Example

Correlation The correlation between the random variables X and Y, denoted ρXY, is ρXY = cov(X, Y) / (σX σY) = σXY / (σX σY), with −1 ≤ ρXY ≤ 1.
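Correlation rescales the covariance by both standard deviations, which makes it unit-free and confines it to [−1, 1]. A self-contained sketch with the same hypothetical joint table:

```python
import math

# Hypothetical joint pmf (the slide's own table did not survive extraction).
joint = {
    (1, 1): 0.10, (1, 2): 0.05, (1, 3): 0.03, (1, 4): 0.02,
    (2, 1): 0.10, (2, 2): 0.20, (2, 3): 0.10, (2, 4): 0.10,
    (3, 1): 0.10, (3, 2): 0.05, (3, 3): 0.10, (3, 4): 0.05,
}

ex  = sum(x * p for (x, y), p in joint.items())            # E(X)
ey  = sum(y * p for (x, y), p in joint.items())            # E(Y)
exy = sum(x * y * p for (x, y), p in joint.items())        # E(XY)
vx  = sum((x - ex) ** 2 * p for (x, y), p in joint.items())  # V(X)
vy  = sum((y - ey) ** 2 * p for (x, y), p in joint.items())  # V(Y)

cov = exy - ex * ey                              # sigma_XY
rho = cov / (math.sqrt(vx) * math.sqrt(vy))      # rho_XY, in [-1, 1]
```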

Correlation If X and Y are independent random variables, then σXY = 0 and ρXY = 0.

Example From the above example, please find the correlation of X and Y.

Solution to Example

Bivariate Normal Distribution

Bivariate Normal Distribution The probability density function of a bivariate normal distribution, for −∞ < x < ∞ and −∞ < y < ∞, with parameters σX > 0, σY > 0, −∞ < μX < ∞, −∞ < μY < ∞, and −1 < ρ < 1, is fXY(x, y) = 1 / (2π σX σY √(1 − ρ²)) × exp{ −[ (x − μX)²/σX² − 2ρ(x − μX)(y − μY)/(σX σY) + (y − μY)²/σY² ] / [2(1 − ρ²)] }.
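The density above transcribes directly into code; the parameter values in the checks below are illustrative. With ρ = 0 the exponent splits into two separate squares, so the density factors into two univariate normal densities.

```python
import math

def bvn_pdf(x, y, mu_x, mu_y, s_x, s_y, rho):
    # bivariate normal density with means mu, sds s, correlation rho
    zx = (x - mu_x) / s_x
    zy = (y - mu_y) / s_y
    q = (zx * zx - 2.0 * rho * zx * zy + zy * zy) / (2.0 * (1.0 - rho * rho))
    return math.exp(-q) / (2.0 * math.pi * s_x * s_y * math.sqrt(1.0 - rho * rho))

def norm_pdf(z):
    # standard univariate normal density, for the rho = 0 factorization check
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
```

At the mean the standard density equals 1/(2π), and for ρ = 0 it equals the product of the two marginal densities, matching the independence result stated two slides below.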

Bivariate Normal Distribution If X and Y have a bivariate normal distribution with joint probability density fXY(x,y; σX, σY, μX, μY, ρ), then the marginal probability distributions of X and Y are normal with means μX and μY, and standard deviations σX and σY, respectively.

Bivariate Normal Distribution If X and Y have a bivariate normal distribution with joint probability density fXY(x, y; σX, σY, μX, μY, ρ), then the correlation between X and Y is ρXY = ρ.

Bivariate Normal Distribution If X and Y have a bivariate normal distribution with ρ=0, then X and Y are independent.

Linear Combinations of Random Variables

Linear Combinations of Random Variables Given random variables X1, X2, …, Xp and constants c1, c2, …, cp, Y = c1X1 + c2X2 + … + cpXp is a linear combination of X1, X2, …, Xp.

Linear Combinations of Random Variables If Y = c1X1 + c2X2 + … + cpXp, then E(Y) = c1E(X1) + c2E(X2) + … + cpE(Xp).

Linear Combinations of Random Variables If Y = c1X1 + c2X2 + … + cpXp, then V(Y) = c1²V(X1) + c2²V(X2) + … + cp²V(Xp) + 2 ΣΣ_{i<j} ci cj cov(Xi, Xj). Furthermore, if X1, X2, …, Xp are independent, then V(Y) = c1²V(X1) + c2²V(X2) + … + cp²V(Xp).
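The independent case can be verified by brute-force enumeration: build the joint distribution of two independent random variables (Bernoulli(0.5) is an illustrative choice, not from the slides), compute V(Y) directly, and compare it with Σ ci² V(Xi).

```python
from itertools import product

# Y = c1*X1 + c2*X2 with X1, X2 independent Bernoulli(0.5);
# coefficients are illustrative.
c1, c2 = 2.0, 3.0
pmf = {0: 0.5, 1: 0.5}

mean = sum(v * p for v, p in pmf.items())               # E(Xi) = 0.5
var = sum((v - mean) ** 2 * p for v, p in pmf.items())  # V(Xi) = 0.25

ey = ey2 = 0.0
for (x1, p1), (x2, p2) in product(pmf.items(), repeat=2):
    y = c1 * x1 + c2 * x2
    ey += y * p1 * p2        # independence: joint prob is the product
    ey2 += y * y * p1 * p2
v_direct = ey2 - ey ** 2                       # V(Y) by enumeration
v_formula = c1 ** 2 * var + c2 ** 2 * var      # sum of c_i^2 V(X_i)
```

Both routes give the same variance, as the independence result promises; with correlated variables the cross-term 2 c1 c2 cov(X1, X2) would have to be added.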

Linear Combinations of Random Variables If X̄ = (X1 + X2 + … + Xp)/p with E(Xi) = μ for i = 1, 2, …, p, then E(X̄) = μ. Furthermore, if X1, X2, …, Xp are also independent with V(Xi) = σ² for i = 1, 2, …, p, then V(X̄) = σ²/p.

Reproductive Property of the Normal Distribution If X1, X2, …, Xp are independent normal random variables with E(Xi) = μi and V(Xi) = σi² for i = 1, 2, …, p, then Y = c1X1 + c2X2 + … + cpXp is a normal random variable with E(Y) = c1μ1 + c2μ2 + … + cpμp and V(Y) = c1²σ1² + c2²σ2² + … + cp²σp².

Chebyshev’s Inequality

Chebyshev’s Inequality For any random variable X with mean μ and variance σ², P(|X − μ| ≧ cσ) ≦ 1/c², for c > 0.
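The bound is easy to check exactly for any small discrete distribution; a fair six-sided die is used here purely as an illustration.

```python
import math

# Fair die: X uniform on {1, ..., 6}, mu = 3.5, sigma^2 = 35/12.
vals = [1, 2, 3, 4, 5, 6]
mu = sum(vals) / 6
sigma = math.sqrt(sum((v - mu) ** 2 for v in vals) / 6)

def tail(c):
    # exact P(|X - mu| >= c * sigma) for the die
    return sum(1 for v in vals if abs(v - mu) >= c * sigma) / 6

# Chebyshev: the exact tail never exceeds 1/c^2
for c in (1.2, 1.5, 2.0):
    assert tail(c) <= 1 / c ** 2
```

The bound is loose on purpose: it holds for every distribution with finite variance, so for any particular one the true tail is usually much smaller (here tail(2.0) is exactly 0 against a bound of 0.25).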

Example The process of drilling holes in a PCB produces diameters with a standard deviation of 0.01 millimeter. How many diameters must be measured so that the probability is at least 8/9 that the average of the measured diameters is within 0.005 millimeter of the process mean?

Solution to Example(1/2) Let X1, X2, …, Xn be independent random variables denoting the diameters of the n holes. The average measured diameter is X̄ = (X1 + X2 + … + Xn)/n, with E(X̄) = μ and standard deviation σX̄ = σ/√n.

Solution to Example(2/2) By Chebyshev’s inequality, P(|X̄ − μ| ≧ cσX̄) ≦ 1/c². Taking c = 3 gives 1/c² = 1/9, so P(|X̄ − μ| < 3σ/√n) ≧ 8/9. We therefore need 3σ/√n = 0.005, i.e. 3(0.01)/√n = 0.005, so √n = 6 and n = 36.
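The arithmetic above can be sketched as follows; exact fractions avoid the floating-point wobble in quotients like 0.03/0.005.

```python
import math
from fractions import Fraction

sigma = Fraction(1, 100)   # 0.01 mm per-hole standard deviation
eps = Fraction(1, 200)     # want |Xbar - mu| < 0.005 mm
c = 3                      # Chebyshev: 1 - 1/c**2 = 8/9

# Need c * sigma / sqrt(n) <= eps, i.e. n >= (c * sigma / eps)**2.
# Fraction arithmetic is exact, so the ceiling is computed safely.
n = math.ceil((c * sigma / eps) ** 2)
```

With c = 3 the ratio c·σ/ε is exactly 6, so n = 36 measurements suffice.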