Chap 10 More Expectations and Variances Ghahramani 3rd edition

Outline
10.1 Expected values of sums of random variables
10.2 Covariance
10.3 Correlation
10.4 Conditioning on random variables
10.5 Bivariate normal distribution

10.1 Expected values of sums of random variables

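The key fact this section develops is linearity of expectation: for random variables X1, X2, ..., Xn with finite expected values, E(X1 + X2 + ... + Xn) = E(X1) + E(X2) + ... + E(Xn), and more generally E(a1X1 + ... + anXn) = a1E(X1) + ... + anE(Xn), whether or not the Xi are independent. A classic application is the indicator method: write a count as a sum of 0-1 indicator variables and add up their expectations. As a minimal sketch (my own illustration, not taken from the slides), the following simulation checks linearity for the sum of two fair dice.

import random

# Simulation check of E(X + Y) = E(X) + E(Y) for two independent fair dice.
# (Illustrative only; names and numbers are not from the slides.)
n = 100_000
xs = [random.randint(1, 6) for _ in range(n)]
ys = [random.randint(1, 6) for _ in range(n)]
print(sum(xs) / n + sum(ys) / n)               # E(X) + E(Y): about 3.5 + 3.5 = 7
print(sum(x + y for x, y in zip(xs, ys)) / n)  # E(X + Y): also about 7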

10.2 Covariance
Motivation:
Var(aX+bY) = E[((aX+bY) - E(aX+bY))^2]
= E[((aX+bY) - aEX - bEY)^2]
= E[(a(X-EX) + b(Y-EY))^2]
= E[a^2(X-EX)^2 + b^2(Y-EY)^2 + 2ab(X-EX)(Y-EY)]
= a^2Var(X) + b^2Var(Y) + 2abE[(X-EX)(Y-EY)].
The cross term E[(X-EX)(Y-EY)] is what motivates the definition of covariance.

Covariance
Def Let X and Y be jointly distributed r.v.'s; then the covariance of X and Y is defined by
Cov(X, Y) = E[(X-EX)(Y-EY)].
Note that Cov(X, X) = Var(X), and also, by the Cauchy-Schwarz inequality, [Cov(X, Y)]^2 ≤ Var(X)Var(Y).
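
As a small sketch (my own example, not from the slides, using a hypothetical joint p.m.f.), the snippet below computes Cov(X, Y) both from the definition E[(X-EX)(Y-EY)] and from the computational formula E(XY) - EX·EY, and checks that Cov(X, X) = Var(X).

# Hypothetical joint p.m.f. of (X, Y) on {0,1} x {0,1}; the probabilities sum to 1.
pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

EX  = sum(x * p for (x, y), p in pmf.items())
EY  = sum(y * p for (x, y), p in pmf.items())
EXY = sum(x * y * p for (x, y), p in pmf.items())
EX2 = sum(x * x * p for (x, y), p in pmf.items())

cov_def  = sum((x - EX) * (y - EY) * p for (x, y), p in pmf.items())  # definition
cov_comp = EXY - EX * EY                                              # computational formula
cov_xx   = sum((x - EX) ** 2 * p for (x, y), p in pmf.items())        # Cov(X, X)
var_x    = EX2 - EX ** 2

print(cov_def, cov_comp)   # both 0.05 for this p.m.f. (up to floating-point rounding)
print(cov_xx, var_x)       # both 0.25: Cov(X, X) = Var(X)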

Covariance
Thm 10.4 Var(aX+bY) = a^2Var(X) + b^2Var(Y) + 2abCov(X,Y).
In particular, if a = b = 1,
Var(X+Y) = Var(X) + Var(Y) + 2Cov(X,Y).
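
A standard extension (used in the examples below) covers sums of more than two random variables:
Var(X1 + X2 + ... + Xn) = Σi Var(Xi) + 2 Σ_{i<j} Cov(Xi, Xj).
In particular, if X1, ..., Xn are pairwise uncorrelated (for example, independent), the variance of the sum is the sum of the variances.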

Covariance
1. X and Y are positively correlated if Cov(X,Y) > 0.
2. X and Y are negatively correlated if Cov(X,Y) < 0.
3. X and Y are uncorrelated if Cov(X,Y) = 0.

Covariance
If X and Y are independent, then Cov(X,Y) = E(XY) - EX·EY = 0. But the converse is not true.
Ex 10.9 Let X be uniformly distributed over (-1,1) and Y = X^2. Then Cov(X,Y) = E(X^3) - EX·E(X^2) = 0. So X and Y are uncorrelated, yet X and Y are clearly dependent.
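
As a quick numerical sketch of Ex 10.9 (my own check, not from the slides), the simulation below draws X uniformly from (-1, 1), sets Y = X^2, and shows that the sample covariance is essentially 0 even though Y is completely determined by X.

import random

# X ~ Uniform(-1, 1), Y = X^2: uncorrelated but clearly dependent.
n = 200_000
xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
ys = [x * x for x in xs]
mx = sum(xs) / n
my = sum(ys) / n
cov_xy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
print(cov_xy)   # very close to 0, yet knowing X pins down Y exactly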

Covariance Ex 10.12 Let X be the lifetime of an electronic system and Y be the lifetime of one of its components. Suppose that the electronic system fails if the component does (but not necessarily vice versa). Furthermore, suppose that the joint density function of X and Y (in years) is given by

Covariance
(a) Determine the expected value of the remaining lifetime of the component when the system dies.
(b) Find the covariance of X and Y.
Sol:
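
In general, for jointly continuous X and Y with joint density f(x, y), both parts reduce to double integrals: E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy for any suitably integrable g. Here the system cannot outlive the component, so X ≤ Y, and the component's remaining lifetime when the system dies is Y - X; part (a) therefore amounts to computing E(Y - X) = E(Y) - E(X), while part (b) follows from Cov(X, Y) = E(XY) - E(X)E(Y) with E(XY) = ∫∫ xy f(x, y) dx dy.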

Covariance Ex 10.13 Let X be the number of 6’s in n rolls of a fair die. Find Var(X).

Covariance Sol:
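
One standard derivation (a sketch, not necessarily the slide's exact steps): write X = X1 + X2 + ... + Xn, where Xi = 1 if the i-th roll shows a 6 and Xi = 0 otherwise. The Xi are independent Bernoulli random variables with parameter 1/6, so Var(Xi) = (1/6)(5/6) = 5/36 and Cov(Xi, Xj) = 0 for i ≠ j. By the variance-of-a-sum formula above, Var(X) = n(5/36) = 5n/36.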

Covariance Ex 10.15 X ~ B(n,p). Find Var(X). Sol:
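
By the same indicator argument (again a sketch): X = X1 + ... + Xn, where the Xi are independent Bernoulli(p) indicators of success on each trial. Then Var(Xi) = E(Xi^2) - (EXi)^2 = p - p^2 = p(1 - p), the covariances vanish by independence, and Var(X) = np(1 - p).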

Covariance Ex 10.16 X ~ NB(r,p). Find Var(X). Sol:
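
A sketch, under the usual convention that a negative binomial X counts the number of trials needed to obtain the r-th success: write X = X1 + ... + Xr, where Xi is the number of trials from just after the (i-1)-st success up to and including the i-th success. The Xi are independent geometric(p) random variables, each with Var(Xi) = (1 - p)/p^2, so Var(X) = r(1 - p)/p^2.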

10.3 Correlation
Motivation: Suppose that X and Y are measured in centimeters and Cov(X,Y) = 0.15. If we change the measurements to millimeters, then X1 = 10X and Y1 = 10Y, and Cov(X1,Y1) = Cov(10X,10Y) = 100Cov(X,Y) = 15. This shows that Cov(X,Y) is sensitive to the units of measurement.
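
The remedy this section develops is to standardize: the correlation coefficient ρ(X, Y) = Cov(X, Y)/(σX·σY), where σX and σY are the standard deviations of X and Y, always satisfies -1 ≤ ρ(X, Y) ≤ 1 and is unchanged by positive changes of scale, so ρ(10X, 10Y) = ρ(X, Y) in the centimeter/millimeter example. As a small numerical sketch (my own illustration, not from the slides):

import random

# Covariance changes with the units; correlation does not.
def cov(us, vs):
    mu, mv = sum(us) / len(us), sum(vs) / len(vs)
    return sum((u - mu) * (v - mv) for u, v in zip(us, vs)) / len(us)

def corr(us, vs):
    return cov(us, vs) / (cov(us, us) ** 0.5 * cov(vs, vs) ** 0.5)

n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 1) for x in xs]   # Y correlated with X
xs_mm = [10 * x for x in xs]                # the same data rescaled ("millimeters")
ys_mm = [10 * y for y in ys]

print(cov(xs, ys), cov(xs_mm, ys_mm))    # the second is about 100 times the first
print(corr(xs, ys), corr(xs_mm, ys_mm))  # both about 0.71: correlation is unit-free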


Sections 10.4 Conditioning on random variables and 10.5 Bivariate normal distribution are skipped.