Variance and Covariance


Chapter 4.2 Variance and Covariance

The mean, or expected value, of a random variable X is important because it describes the center of the probability distribution. However, the mean alone does not give an adequate description of the shape and variability of the distribution: two distributions can have equal means but different dispersions (variability).

The most important measure of the variability of a random variable X is obtained by letting g(X) = (X – μ)². This measure is referred to as the variance of the random variable X, or the variance of the probability distribution of X. It is denoted by Var(X), by the symbol σ²_X, or simply by σ².

Let X be a random variable with probability distribution f(x) and mean μ. The variance of X is

σ² = E[(X – μ)²] = Σ_x (x – μ)² f(x)   if X is discrete, and

σ² = E[(X – μ)²] = ∫_(-∞)^(∞) (x – μ)² f(x) dx   if X is continuous.

The positive square root of the variance, σ, is called the standard deviation of X.

Let the random variable X represent the number of cars that are used for official business purposes on any given workday. The probability distributions of X for company A and for company B are given in the accompanying tables. Show that the variance of the probability distribution for company B is greater than that of company A.

Clearly, the variance of the number of cars that are used for official business purposes is greater for company B than for company A.
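The comparison can be sketched in Python. Since the slide's actual tables are not reproduced in the transcript, the two distributions below are hypothetical stand-ins chosen to have equal means but different spreads.

```python
def mean_and_variance(f):
    """Mean and variance of a discrete distribution given as {x: P(X = x)}."""
    mu = sum(x * p for x, p in f.items())
    var = sum((x - mu) ** 2 * p for x, p in f.items())
    return mu, var

# Hypothetical distributions: both have mean 2, but B is more dispersed.
f_A = {1: 0.3, 2: 0.4, 3: 0.3}
f_B = {0: 0.2, 1: 0.1, 2: 0.3, 3: 0.3, 4: 0.1}

mu_A, var_A = mean_and_variance(f_A)   # mean 2.0, variance 0.6
mu_B, var_B = mean_and_variance(f_B)   # mean 2.0, variance 1.6
```

Even with identical means, the wider distribution produces the larger variance, which is exactly the point of the example.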

The variance of a random variable X is also given by

σ² = E(X²) – μ².

Let the random variable X represent the number of defective parts for a machine when 3 parts are sampled from a production line and tested. The probability distribution of X is given in the accompanying table. Calculate the variance σ².
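The shortcut σ² = E(X²) – μ² can be checked against the defining formula. The distribution below is a hypothetical example, since the slide's table is not reproduced in the transcript.

```python
f = {0: 0.51, 1: 0.38, 2: 0.10, 3: 0.01}   # hypothetical P(X = x)

mu = sum(x * p for x, p in f.items())                  # E(X)
ex2 = sum(x ** 2 * p for x, p in f.items())            # E(X^2)

var_shortcut = ex2 - mu ** 2                           # E(X^2) - mu^2
var_direct = sum((x - mu) ** 2 * p for x, p in f.items())
# The two formulas agree: var_shortcut == var_direct
```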

The weekly demand for a drinking-water product, in thousands of liters, from a local chain of efficiency stores is a continuous random variable X with the probability density given on the slide. Find the mean and variance of X.
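Because the slide's density is not reproduced in the transcript, the sketch below assumes an illustrative density f(x) = 2(x – 1) on 1 < x < 2 and evaluates the mean and variance by numerical integration (composite Simpson's rule).

```python
def simpson(g, a, b, n=1000):
    """Integrate g over [a, b] with the composite Simpson rule (n must be even)."""
    h = (b - a) / n
    s = g(a) + g(b) + sum((4 if k % 2 else 2) * g(a + k * h) for k in range(1, n))
    return s * h / 3

f = lambda x: 2 * (x - 1)      # assumed density on (1, 2), for illustration only

mu = simpson(lambda x: x * f(x), 1, 2)                 # E(X) = 5/3
var = simpson(lambda x: (x - mu) ** 2 * f(x), 1, 2)    # Var(X) = 1/18
```

Simpson's rule is exact for polynomial integrands of degree three or less, so for this assumed density the numerical results match the analytic ones.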

Let X be a random variable with probability distribution f(x). The variance of the random variable g(X) is

σ²_g(X) = E{[g(X) – μ_g(X)]²} = Σ_x [g(x) – μ_g(X)]² f(x)   if X is discrete, and

σ²_g(X) = E{[g(X) – μ_g(X)]²} = ∫_(-∞)^(∞) [g(x) – μ_g(X)]² f(x) dx   if X is continuous.

Calculate the variance of g(X) = 2X + 3, where X is a random variable with the probability distribution given in the accompanying table.
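A sketch with a hypothetical distribution for X (the slide's table is not shown in the transcript); it also checks the linear-transformation identity Var(2X + 3) = 4 Var(X).

```python
f = {0: 0.25, 1: 0.50, 2: 0.25}            # hypothetical P(X = x)
g = lambda x: 2 * x + 3

mu_g = sum(g(x) * p for x, p in f.items())                    # E[g(X)]
var_g = sum((g(x) - mu_g) ** 2 * p for x, p in f.items())     # Var[g(X)]

mu_x = sum(x * p for x, p in f.items())
var_x = sum((x - mu_x) ** 2 * p for x, p in f.items())
# For g(X) = 2X + 3 the constant drops out: var_g == 4 * var_x
```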

Let X be a random variable with the density function given on the slide. Find the variance of the random variable g(X) = 4X + 3 if it is known that the expected value of g(X) is 8.

Let X and Y be random variables with joint probability distribution f(x, y). The covariance of the random variables X and Y is

σ_XY = E[(X – μ_X)(Y – μ_Y)] = Σ_x Σ_y (x – μ_X)(y – μ_Y) f(x, y)   if X and Y are discrete, and

σ_XY = E[(X – μ_X)(Y – μ_Y)] = ∫∫ (x – μ_X)(y – μ_Y) f(x, y) dx dy   if X and Y are continuous.

σ_XY > 0 indicates a positive correlation; σ_XY < 0 indicates a negative correlation.

The covariance of two random variables X and Y with means μ_X and μ_Y, respectively, is also given by

σ_XY = E(XY) – μ_X μ_Y.
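For a discrete joint distribution, the defining formula and the shortcut give the same covariance; the joint pmf below is hypothetical.

```python
# Hypothetical joint pmf: {(x, y): P(X = x, Y = y)}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())

cov_def = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())
e_xy = sum(x * y * p for (x, y), p in joint.items())
cov_shortcut = e_xy - mu_x * mu_y       # sigma_XY = E(XY) - mu_X * mu_Y
```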

Referring back to the “ballpoint pens” example (see again Lecture 4), find the covariance of X and Y.

The fraction X of male runners and the fraction Y of female runners who compete in marathon races are described by the joint density function given on the slide. Find the covariance of X and Y.

Although the covariance between two random variables does provide information regarding the nature of their relationship, the magnitude of σ_XY does not indicate anything about the strength of the relationship, since σ_XY is not scale-free: its magnitude depends on the units in which X and Y are measured. There is a scale-free version of the covariance, called the correlation coefficient, that is widely used in statistics.

Let X and Y be random variables with covariance σ_XY and standard deviations σ_X and σ_Y, respectively. The correlation coefficient of X and Y is

ρ_XY = σ_XY / (σ_X σ_Y).
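Continuing with a hypothetical discrete joint pmf, the correlation coefficient is simply the covariance rescaled by both standard deviations, which forces it into [–1, 1]:

```python
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}   # hypothetical pmf

mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())
var_x = sum((x - mu_x) ** 2 * p for (x, y), p in joint.items())
var_y = sum((y - mu_y) ** 2 * p for (x, y), p in joint.items())
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

rho = cov / (var_x ** 0.5 * var_y ** 0.5)   # rho_XY = sigma_XY / (sigma_X * sigma_Y)
```

Unlike σ_XY, the value of rho is unchanged if X or Y is rescaled to different units.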

Chapter 4.3 Means and Variances of Linear Combinations of Random Variables

If a and b are constants, then

E(aX + b) = a E(X) + b.

Applying this theorem to the discrete random variable g(X) = 2X – 1, rework the carwash example.
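The theorem can be verified numerically; the distribution below is a hypothetical stand-in for the carwash table, which is not reproduced in the transcript.

```python
f = {1: 0.2, 2: 0.5, 3: 0.3}     # hypothetical P(X = x)
a, b = 2, -1                     # g(X) = 2X - 1

e_x = sum(x * p for x, p in f.items())
e_g_direct = sum((a * x + b) * p for x, p in f.items())   # E(2X - 1) term by term
e_g_theorem = a * e_x + b                                 # a E(X) + b
# Both routes give the same value.
```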

Let X be a random variable with the density function given on the slide. Find the expected value of g(X) = 4X + 3 by using the preceding theorem.

The expected value of the sum or difference of two or more functions of a random variable X is the sum or difference of the expected values of the functions. That is,

E[g(X) ± h(X)] = E[g(X)] ± E[h(X)].

Let X be a random variable with the probability distribution given in the accompanying table. Find the expected value of Y = (X – 1)².
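Expanding Y = (X – 1)² = X² – 2X + 1 and applying linearity of expectation term by term gives E(Y) = E(X²) – 2E(X) + 1; the distribution below is a hypothetical example.

```python
f = {0: 0.3, 1: 0.4, 2: 0.3}    # hypothetical P(X = x)

e_x = sum(x * p for x, p in f.items())
e_x2 = sum(x ** 2 * p for x, p in f.items())

e_y_termwise = e_x2 - 2 * e_x + 1                        # E(X^2) - 2E(X) + 1
e_y_direct = sum((x - 1) ** 2 * p for x, p in f.items()) # E[(X - 1)^2] directly
```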

The weekly demand for a certain drink, in thousands of liters, at a chain of convenience stores is a continuous random variable g(X) = X² + X – 2, where X has the density function given on the slide. Find the expected value of the weekly demand for the drink.

The expected value of the sum or difference of two or more functions of the random variables X and Y is the sum or difference of the expected values of the functions. That is,

E[g(X, Y) ± h(X, Y)] = E[g(X, Y)] ± E[h(X, Y)].

Let X and Y be two independent random variables. Then

E(XY) = E(X) E(Y).
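For independent X and Y the joint pmf factors as f(x, y) = f_X(x) f_Y(y), which is what makes E(XY) = E(X)E(Y); a sketch with hypothetical marginals:

```python
fx = {0: 0.4, 1: 0.6}            # hypothetical marginal of X
fy = {1: 0.3, 2: 0.7}            # hypothetical marginal of Y

# Independence: the joint pmf is the product of the marginals.
joint = {(x, y): px * py for x, px in fx.items() for y, py in fy.items()}

e_x = sum(x * p for x, p in fx.items())
e_y = sum(y * p for y, p in fy.items())
e_xy = sum(x * y * p for (x, y), p in joint.items())
# e_xy == e_x * e_y
```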

In producing gallium-arsenide microchips, it is known that the ratio between gallium and arsenide is independent of producing a high percentage of workable wafers, which are the main components of microchips. Let X denote the ratio of gallium to arsenide and Y denote the percentage of workable microwafers retrieved during a 1-hour period. X and Y are independent random variables with the joint density given on the slide. Illustrate that E(XY) = E(X)E(Y).

Hence, it is shown that E(XY) = E(X)E(Y).

If a and b are constants, then

Var(aX + b) = a² Var(X), that is, σ²_(aX+b) = a² σ²_X.

If X and Y are random variables with joint probability distribution f(x, y), then

σ²_(aX+bY) = a² σ²_X + b² σ²_Y + 2ab σ_XY.
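The linear-combination formula can be checked against a direct computation on a hypothetical discrete joint pmf:

```python
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}   # hypothetical pmf
a, b = 3, -4

mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())
var_x = sum((x - mu_x) ** 2 * p for (x, y), p in joint.items())
var_y = sum((y - mu_y) ** 2 * p for (x, y), p in joint.items())
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

# Formula: Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
var_formula = a ** 2 * var_x + b ** 2 * var_y + 2 * a * b * cov

# Direct computation from the distribution of Z = aX + bY.
mu_z = sum((a * x + b * y) * p for (x, y), p in joint.items())
var_direct = sum((a * x + b * y - mu_z) ** 2 * p for (x, y), p in joint.items())
```

Note that an additive constant (the "+8" or "+5" in the exercises that follow) leaves the variance unchanged.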

If X and Y are random variables with variances σ²_X and σ²_Y and covariance σ_XY = –2, find the variance of the random variable Z = 3X – 4Y + 8.

Let X and Y denote the amounts of two different types of impurities in a batch of a certain chemical product. Suppose that X and Y are independent random variables with variances σ²_X and σ²_Y. Find the variance of the random variable Z = 3X – 2Y + 5.

Chapter 4.4 Chebyshev’s Theorem

As we have already discussed, the variance of a random variable tells us something about the variability of the observations about the mean. If a variable has a small variance or standard deviation, we would expect most of the values to be grouped around the mean, so the probability that the random variable assumes a value within a certain interval about the mean is greater in this case. If we think of probability in terms of area, we would expect a continuous distribution with a small standard deviation to have most of its area close to μ.

Variability of continuous observations about the mean

We can argue the same way for a discrete distribution. A more spread-out area in the probability histogram indicates a more variable distribution of measurements or outcomes.

Variability of discrete observations about the mean

The Russian mathematician P. L. Chebyshev discovered that the fraction of the area between any two values symmetric about the mean is related to the standard deviation.

|Chebyshev’s Theorem| The probability that any random variable X will assume a value within k standard deviations of the mean is at least 1 – 1/k². That is,

P(μ – kσ < X < μ + kσ) ≥ 1 – 1/k².

Chebyshev’s Theorem holds for any distribution of observations and, for this reason, the results are usually weak: the value given by the theorem is a lower bound only. Exact probabilities can be determined only when the probability distribution is known, so the use of Chebyshev’s Theorem is relegated to situations where the form of the distribution is unknown.

Chebyshev’s Theorem and the Normal Distribution
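To see how conservative the bound is, compare it with the exact probability for a normal distribution, for which P(|X – μ| < kσ) = erf(k/√2):

```python
import math

def chebyshev_bound(k):
    """Chebyshev lower bound on P(|X - mu| < k*sigma), valid for ANY distribution."""
    return 1 - 1 / k ** 2

def normal_exact(k):
    """Exact P(|X - mu| < k*sigma) when X is normally distributed."""
    return math.erf(k / math.sqrt(2))

# For k = 2: Chebyshev guarantees at least 0.75, while a normal
# distribution actually places about 0.9545 of its mass in the interval.
```

The gap between 0.75 and 0.9545 illustrates why exact probabilities, when the distribution is known, are always preferred over the Chebyshev bound.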

A random variable X has a mean μ = 8, a variance σ² = 9, and an unknown probability distribution. Find
(a) P(–4 < X < 20);
(b) P(|X – 8| ≥ 6).
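The given numbers (μ = 8, σ² = 9, so σ = 3) lead directly to Chebyshev bounds:

```python
mu, sigma = 8, 9 ** 0.5          # sigma = 3

# (a) P(-4 < X < 20): the interval is mu +/- 12 = mu +/- 4*sigma, so k = 4.
k1 = (20 - mu) / sigma
bound_a = 1 - 1 / k1 ** 2        # P(-4 < X < 20) >= 15/16

# (b) P(|X - 8| >= 6): here 6 = 2*sigma, so k = 2, and the complement
# of Chebyshev's inequality applies.
k2 = 6 / sigma
bound_b = 1 / k2 ** 2            # P(|X - 8| >= 6) <= 1/4
```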

Probability and Statistics Homework 5

1. For the joint probability distribution of the two random variables X and Y given in the accompanying figure, calculate the covariance of X and Y. (Mo.E5.27 p.0172)

2. The photoresist thickness in semiconductor manufacturing has a mean of 10 micrometers and a standard deviation of 1 micrometer. Bound the probability that the thickness is less than 6 micrometers or greater than 14 micrometers. (Mo.S5.25 p05.15)