Definition of Covariance The covariance of X & Y, denoted Cov(X,Y), is the number Cov(X,Y) = E[(X − μX)(Y − μY)], where μX = E(X) and μY = E(Y). Computational Formula: Cov(X,Y) = E(XY) − E(X)E(Y).
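The defining and computational formulas can be checked against each other numerically. A minimal sketch in Python, using a small made-up joint p.m.f. (the distribution below is illustrative, not from the slides):

```python
# Hypothetical joint p.m.f. of (X, Y), chosen only for illustration:
# each (x, y) pair occurs with the given probability.
pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

E_X  = sum(x * p for (x, y), p in pmf.items())
E_Y  = sum(y * p for (x, y), p in pmf.items())
E_XY = sum(x * y * p for (x, y), p in pmf.items())

# Defining formula: Cov(X, Y) = E[(X - mu_X)(Y - mu_Y)]
cov_def = sum((x - E_X) * (y - E_Y) * p for (x, y), p in pmf.items())

# Computational formula: Cov(X, Y) = E(XY) - E(X)E(Y)
cov_comp = E_XY - E_X * E_Y
```

Both expressions give the same number, which is the point of the computational formula: it avoids centering each value first.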

Variance of a Sum Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X,Y). If X and Y are uncorrelated, the cross term vanishes and Var(X + Y) = Var(X) + Var(Y).

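The variance-of-a-sum identity is easy to verify numerically for any small joint distribution. A sketch, with an illustrative (made-up) p.m.f.:

```python
# Illustrative joint p.m.f. of (X, Y), made up for this sketch.
pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def E(f):
    """Expectation of f(X, Y) under the joint p.m.f."""
    return sum(f(x, y) * p for (x, y), p in pmf.items())

var_X  = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
var_Y  = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
cov_XY = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)

# Direct computation of Var(X + Y) ...
var_sum = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2
# ... matches Var(X) + Var(Y) + 2 Cov(X, Y).
```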
Covariance and Independence If X & Y are independent, then Cov(X,Y) = 0. If Cov(X,Y) = 0, it is not necessarily true that X & Y are independent!

The Sign of Covariance If the sign of Cov(X,Y) is positive, above-average values of X tend to be associated with above-average values of Y, and below-average values of X tend to be associated with below-average values of Y. If the sign of Cov(X,Y) is negative, above-average values of X tend to be associated with below-average values of Y, and vice versa. If Cov(X,Y) is zero, no such association exists between X and Y.

Correlation The sign of the covariance has a nice interpretation, but its magnitude is more difficult to interpret. It is easier to interpret the correlation of X and Y. Correlation is a kind of standardized covariance: Corr(X,Y) = Cov(X,Y) / (SD(X) SD(Y)), and −1 ≤ Corr(X,Y) ≤ 1.
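Because correlation is standardized, it is unit-free: shifting or positively rescaling either variable leaves it unchanged, and it always lies in [−1, 1]. A quick numerical sketch (the simulated data below is an arbitrary choice for illustration):

```python
import math
import random

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(10_000)]
ys = [x + random.gauss(0, 1) for x in xs]   # correlated with xs by construction

def corr(a, b):
    """Sample correlation: covariance divided by the product of the SDs."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / n
    sa = math.sqrt(sum((u - ma) ** 2 for u in a) / n)
    sb = math.sqrt(sum((v - mb) ** 2 for v in b) / n)
    return cov / (sa * sb)

r = corr(xs, ys)
# Rescaling and shifting X (here by 5x + 2) changes Cov(X, Y) by a factor
# of 5, but SD(X) also picks up the factor of 5, so the correlation is unchanged.
r_scaled = corr([5 * x + 2 for x in xs], ys)
```

By construction Y = X + noise with equal variances, so the true correlation here is 1/√2 ≈ 0.707, which the sample estimate approaches.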

Conditions for X & Y to be Uncorrelated The following conditions are equivalent:
–Corr(X,Y) = 0
–Cov(X,Y) = 0
–E(XY) = E(X)E(Y)
in which case X and Y are said to be uncorrelated. Independent variables are uncorrelated. Uncorrelated variables are not necessarily independent!

Let (X, Y) have the uniform distribution on the four points (−1,0), (0,1), (0,−1), and (1,0). Show that X and Y are uncorrelated but not independent. What is the variance of X + Y?
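One way to sanity-check the answers to this exercise is to compute everything directly from the four-point distribution; a short Python sketch:

```python
# The four points, each with probability 1/4.
points = [(-1, 0), (0, 1), (0, -1), (1, 0)]

E_X  = sum(x for x, y in points) / 4       # 0 by symmetry
E_Y  = sum(y for x, y in points) / 4       # 0 by symmetry
E_XY = sum(x * y for x, y in points) / 4   # 0, since xy = 0 at every point

cov = E_XY - E_X * E_Y                     # 0: X and Y are uncorrelated

# But X and Y are not independent: P(X=1 and Y=1) = 0,
# while P(X=1) * P(Y=1) = (1/4)(1/4) = 1/16.
p_x1    = sum(1 for x, y in points if x == 1) / 4
p_y1    = sum(1 for x, y in points if y == 1) / 4
p_x1_y1 = sum(1 for x, y in points if x == 1 and y == 1) / 4

# Since Cov(X, Y) = 0, Var(X + Y) = Var(X) + Var(Y) = 1/2 + 1/2 = 1.
var_sum = sum((x + y) ** 2 for x, y in points) / 4 - (E_X + E_Y) ** 2
```

Note that X + Y takes only the values ±1, each with probability 1/2, which gives Var(X + Y) = 1 directly.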

Let T1 and T3 be the times of the first and third arrivals in a Poisson process with rate λ.
–Find Corr(T1, T3).
–What is the variance of T1 + T3?
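This exercise can be checked by Monte Carlo. In a Poisson process, the interarrival times are i.i.d. Exponential(λ), so T1 is the first gap and T3 is the sum of the first three. The rate λ = 2 below is an arbitrary illustrative choice (the slide leaves the rate generic):

```python
import math
import random

random.seed(1)
lam = 2.0        # arrival rate; arbitrary illustrative value
N = 100_000

t1, t3 = [], []
for _ in range(N):
    # Interarrival times are independent Exponential(lam) variables.
    gaps = [random.expovariate(lam) for _ in range(3)]
    t1.append(gaps[0])       # time of first arrival
    t3.append(sum(gaps))     # time of third arrival

def mean(a):
    return sum(a) / len(a)

m1, m3 = mean(t1), mean(t3)
cov = mean([a * b for a, b in zip(t1, t3)]) - m1 * m3
corr = cov / math.sqrt(
    (mean([a * a for a in t1]) - m1 ** 2) * (mean([b * b for b in t3]) - m3 ** 2)
)
var_sum = mean([(a + b) ** 2 for a, b in zip(t1, t3)]) - (m1 + m3) ** 2

# Theory: T3 = T1 + (independent increment), so Cov(T1, T3) = Var(T1) = 1/lam^2,
# hence Corr(T1, T3) = (1/lam^2) / ((1/lam)(sqrt(3)/lam)) = 1/sqrt(3) ~ 0.577,
# and Var(T1 + T3) = Var(T1) + Var(T3) + 2 Cov(T1, T3) = 6/lam^2.
```

Note that the answer Corr(T1, T3) = 1/√3 does not depend on λ, which the standardization argument from the correlation slide predicts.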