Further Topics on Random Variables: Covariance and Correlation

Further Topics on Random Variables: Covariance and Correlation
Berlin Chen
Department of Computer Science & Information Engineering, National Taiwan Normal University
Reference: D. P. Bertsekas and J. N. Tsitsiklis, Introduction to Probability, Section 4.2

Covariance (1/3)
The covariance of two random variables X and Y is defined by
cov(X, Y) = E[(X − E[X])(Y − E[Y])]
An alternative formula is
cov(X, Y) = E[XY] − E[X]E[Y]
A positive or negative covariance indicates that the values of X − E[X] and Y − E[Y] tend to have the same or opposite sign, respectively.
A few other properties:
cov(X, X) = var(X)
cov(X, aY + b) = a · cov(X, Y)
cov(X, Y + Z) = cov(X, Y) + cov(X, Z)
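The two covariance formulas can be checked numerically. Below is a minimal sketch on a small joint pmf (the pmf values are a hypothetical example, not taken from the text):

```python
# Check that the definitional and alternative covariance formulas agree
# on a hypothetical joint pmf over pairs (x, y).
pmf = {(0, 0): 0.4, (1, 1): 0.3, (1, 0): 0.1, (0, 1): 0.2}

EX = sum(x * p for (x, y), p in pmf.items())
EY = sum(y * p for (x, y), p in pmf.items())
EXY = sum(x * y * p for (x, y), p in pmf.items())

# Definition: E[(X - E[X])(Y - E[Y])]
cov_def = sum((x - EX) * (y - EY) * p for (x, y), p in pmf.items())
# Alternative formula: E[XY] - E[X]E[Y]
cov_alt = EXY - EX * EY
# The two formulas agree (here both equal 0.1)
```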

Covariance (2/3)
Note that if X and Y are independent, then E[XY] = E[X]E[Y]. Therefore
cov(X, Y) = E[XY] − E[X]E[Y] = 0
Thus, if X and Y are independent, they are also uncorrelated.
However, the converse is generally not true! (See Example 4.13.)
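Independence implies zero covariance, which can be verified by building a joint pmf as the product of two marginals. A minimal sketch with hypothetical marginal pmfs:

```python
# For independent X and Y, the joint pmf is the product of the marginals,
# and cov(X, Y) = E[XY] - E[X]E[Y] comes out to 0.
px = {0: 0.3, 1: 0.7}   # hypothetical marginal pmf of X
py = {0: 0.6, 2: 0.4}   # hypothetical marginal pmf of Y
pmf = {(x, y): px[x] * py[y] for x in px for y in py}

EX = sum(x * p for (x, y), p in pmf.items())
EY = sum(y * p for (x, y), p in pmf.items())
EXY = sum(x * y * p for (x, y), p in pmf.items())
cov = EXY - EX * EY  # zero up to floating-point rounding
```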

Covariance (3/3)
Example 4.13. The pair of random variables (X, Y) takes the values (1, 0), (0, 1), (−1, 0), and (0, −1), each with probability 1/4.
Thus, the marginal pmfs of X and Y are symmetric around 0, and E[X] = E[Y] = 0.
Furthermore, for all possible value pairs (x, y), either x or y is equal to 0, which implies that XY = 0 and E[XY] = 0. Therefore,
cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] = 0,
and X and Y are uncorrelated.
However, X and Y are not independent since, for example, a nonzero value of X fixes the value of Y to zero.
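Example 4.13 can be verified by direct enumeration of the four equally likely points:

```python
# Example 4.13: (X, Y) is uniform on {(1,0), (0,1), (-1,0), (0,-1)}.
points = [(1, 0), (0, 1), (-1, 0), (0, -1)]
p = 1 / 4

EX = sum(x * p for x, y in points)   # 0 by symmetry
EY = sum(y * p for x, y in points)   # 0 by symmetry
cov = sum((x - EX) * (y - EY) * p for x, y in points)  # = E[XY] = 0

# But X and Y are dependent: P(Y = 0) = 1/2, while P(Y = 0 | X = 1) = 1.
pY0 = sum(p for x, y in points if y == 0)
pY0_given_X1 = (sum(p for x, y in points if x == 1 and y == 0)
                / sum(p for x, y in points if x == 1))
```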

Correlation (1/3)
Also referred to as the "correlation coefficient."
The correlation coefficient ρ(X, Y) of two random variables X and Y is defined as
ρ(X, Y) = cov(X, Y) / sqrt(var(X) var(Y))
(assuming var(X) and var(Y) are positive). It can be shown that −1 ≤ ρ(X, Y) ≤ 1 (see the end-of-chapter problems).
ρ > 0: positively correlated
ρ < 0: negatively correlated
ρ = 0: uncorrelated
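A quick numerical check of the definition and the bound |ρ| ≤ 1, using a hypothetical joint pmf:

```python
import math

# Compute rho(X, Y) = cov(X, Y) / sqrt(var(X) var(Y)) on a hypothetical pmf.
pmf = {(0, 0): 0.4, (1, 1): 0.3, (1, 0): 0.1, (0, 1): 0.2}

EX = sum(x * p for (x, y), p in pmf.items())
EY = sum(y * p for (x, y), p in pmf.items())
varX = sum((x - EX) ** 2 * p for (x, y), p in pmf.items())
varY = sum((y - EY) ** 2 * p for (x, y), p in pmf.items())
cov = sum((x - EX) * (y - EY) * p for (x, y), p in pmf.items())

rho = cov / math.sqrt(varX * varY)
# rho is positive here (X and Y tend to move together) and |rho| <= 1
```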

Correlation (2/3)
It can be shown that ρ(X, Y) = 1 (or −1) if and only if there exists a positive (or negative, respectively) constant c such that
Y − E[Y] = c (X − E[X])
(with probability 1).
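The linear-relationship case can be illustrated numerically; in the sketch below, X and the constants are hypothetical choices, with Y an exact linear function of X:

```python
import math

# If Y = c*X + b with c > 0, then Y - E[Y] = c (X - E[X]) and rho = +1.
xs = [0, 1, 2]          # hypothetical: X uniform on {0, 1, 2}
p = 1 / 3
c, b = 3, 1             # hypothetical positive slope and offset
ys = [c * x + b for x in xs]

EX = sum(x * p for x in xs)
EY = sum(y * p for y in ys)
varX = sum((x - EX) ** 2 * p for x in xs)
varY = sum((y - EY) ** 2 * p for y in ys)
cov = sum((xs[i] - EX) * (ys[i] - EY) * p for i in range(len(xs)))

rho = cov / math.sqrt(varX * varY)  # equals +1 for any c > 0
```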

Correlation (3/3)
Figure 4.11: Examples of positively (a) and negatively (b) correlated random variables.

An Example
Consider n independent tosses of a coin with the probability of a head equal to p. Let X and Y be the numbers of heads and tails, respectively, and let us look at the correlation coefficient of X and Y. Since X + Y = n, we have Y − E[Y] = −(X − E[X]), and therefore ρ(X, Y) = −1.
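The result ρ(X, Y) = −1 can be confirmed by summing over the binomial pmf of X; the values of n and p below are hypothetical:

```python
import math

# Coin example: X = number of heads, Y = number of tails in n tosses,
# so Y = n - X and rho(X, Y) should be exactly -1.
n, q = 5, 0.3  # hypothetical: 5 tosses, P(head) = 0.3

def binom_pmf(k):
    # Binomial(n, q) pmf of X
    return math.comb(n, k) * q ** k * (1 - q) ** (n - k)

EX = sum(k * binom_pmf(k) for k in range(n + 1))
EY = n - EX
varX = sum((k - EX) ** 2 * binom_pmf(k) for k in range(n + 1))
cov = sum((k - EX) * ((n - k) - EY) * binom_pmf(k) for k in range(n + 1))

# var(Y) = var(n - X) = var(X), so:
rho = cov / math.sqrt(varX * varX)  # -1 up to rounding
```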

Variance of the Sum of Random Variables
If X1, X2, …, Xn are random variables with finite variance, we have
var(X1 + X2) = var(X1) + var(X2) + 2 cov(X1, X2)
More generally,
var(X1 + X2 + … + Xn) = sum_i var(Xi) + sum_{(i,j): i ≠ j} cov(Xi, Xj)
See the textbook for the proof of the above formula, and see also Example 4.15 for an illustration of this formula.
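The two-variable version of the formula can be verified directly on a small joint pmf (a hypothetical example):

```python
# Check var(X + Y) = var(X) + var(Y) + 2 cov(X, Y) on a hypothetical pmf.
pmf = {(0, 0): 0.4, (1, 1): 0.3, (1, 0): 0.1, (0, 1): 0.2}

EX = sum(x * p for (x, y), p in pmf.items())
EY = sum(y * p for (x, y), p in pmf.items())
varX = sum((x - EX) ** 2 * p for (x, y), p in pmf.items())
varY = sum((y - EY) ** 2 * p for (x, y), p in pmf.items())
cov = sum((x - EX) * (y - EY) * p for (x, y), p in pmf.items())

# Compute var(X + Y) directly from the pmf of the sum
ES = EX + EY
varS = sum((x + y - ES) ** 2 * p for (x, y), p in pmf.items())
# varS matches varX + varY + 2 * cov
```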

An Example
Example 4.15. Consider the hat problem discussed in Section 2.5, where n people throw their hats in a box and then pick a hat at random. Let us find the variance of X, the number of people who pick their own hat. Using the variance-of-the-sum formula, the calculation in the textbook gives var(X) = 1.
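For small n, the hat-problem answer can be checked by exact enumeration over all permutations; the choice n = 5 below is a hypothetical illustration (the result E[X] = var(X) = 1 holds for any n ≥ 2):

```python
from itertools import permutations
from math import factorial

# Hat problem (Example 4.15): X = number of people who get their own hat
# when n hats are assigned by a uniformly random permutation.
n = 5
total = factorial(n)

# For each permutation, count fixed points (people who get their own hat)
counts = [sum(1 for i in range(n) if perm[i] == i)
          for perm in permutations(range(n))]

EX = sum(counts) / total
varX = sum((c - EX) ** 2 for c in counts) / total
# Exact enumeration agrees with the textbook: E[X] = 1 and var(X) = 1
```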