4.1 Mathematical Expectation


4.1 Mathematical Expectation

Example: Repair costs for a particular machine are represented by the following probability distribution:

x:        $50   $200   $350
P(X = x): 0.3   0.2    0.5

What is the expected value of the repairs? That is, over time, what do we expect repairs to cost on average?

The expected value (or mean) of a probability distribution is the long-run theoretical average.

Expected Value – Repair Costs

μ = mean of the probability distribution.
For discrete variables, μ = E(X) = ∑ x f(x), a weighted average of the possible values.

For our example: E(X) = 50(0.3) + 200(0.2) + 350(0.5) = $230
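As a quick check, here is a minimal Python sketch of this weighted-average calculation (the variable names are illustrative, not from the slides):

```python
# Discrete expected value: E(X) = sum of x * f(x).
costs = [50, 200, 350]     # possible repair costs, in dollars
probs = [0.3, 0.2, 0.5]    # P(X = x)

expected_cost = sum(x * p for x, p in zip(costs, probs))
print(expected_cost)  # 230.0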

Another Example – Investment

By investing in a particular stock, a person can take a profit in a given year of $4000 with probability 0.3 or take a loss of $1000 with probability 0.7. What is the investor's expected gain on the stock?

x:        $4000   −$1000
P(X = x): 0.3     0.7

E(X) = 4000(0.3) − 1000(0.7) = $500

Expected Value – Continuous Variables

For continuous variables, μ = E(X) = ∫ x f(x) dx

Vacuum cleaner example (problem 7, pg. 88): the hours of operation per year (in hundreds of hours) have density

f(x) = x,      0 < x < 1
f(x) = 2 − x,  1 ≤ x < 2
f(x) = 0,      elsewhere

E(X) = ∫₀¹ x·x dx + ∫₁² x(2 − x) dx = 1/3 + 2/3 = 1
     = 1 × 100 = 100 hours of operation annually, on average
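The integral can be verified numerically. A short sketch, assuming SciPy is available:

```python
from scipy.integrate import quad

# E(X) = integral of x * f(x); split the integral at x = 1,
# where the density changes form.
mean = quad(lambda x: x * x, 0, 1)[0] + quad(lambda x: x * (2 - x), 1, 2)[0]
print(mean, mean * 100)  # ~1.0 hundred hours, i.e. ~100 hours per year
```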

Functions of Random Variables

Example 4.4, pg. 111: The probability distribution of X, the number of cars passing through a car wash in one hour on a sunny Friday afternoon, is given by

x:        4     5     6    7    8    9
P(X = x): 1/12  1/12  1/4  1/4  1/6  1/6

Let g(X) = 2X − 1 represent the amount of money (in dollars) paid to the attendant by the manager. What can the attendant expect to earn during this hour on any given sunny Friday afternoon?

E[g(X)] = Σ g(x) f(x) = Σ (2x − 1) f(x)
        = (2·4 − 1)(1/12) + (2·5 − 1)(1/12) + … + (2·9 − 1)(1/6)
        = 7(1/12) + 9(1/12) + … + 17(1/6) = $12.67
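A sketch of the full sum in Python. Note the middle probabilities (1/4, 1/4, 1/6) are taken from the standard textbook version of Example 4.4, since the slide's table is partially garbled:

```python
from fractions import Fraction as F

# pmf of X (cars per hour); the middle probabilities follow the textbook
# version of Example 4.4 (an assumption where the slide table is unclear).
pmf = {4: F(1, 12), 5: F(1, 12), 6: F(1, 4), 7: F(1, 4), 8: F(1, 6), 9: F(1, 6)}

# E[g(X)] = sum of g(x) * f(x), with g(x) = 2x - 1 (attendant's pay in dollars)
expected_pay = sum((2 * x - 1) * p for x, p in pmf.items())
print(expected_pay, float(expected_pay))  # 38/3, approximately 12.67
```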

4.2 Variance of a Random Variable

Recall our example: Repair costs for a particular machine are represented by the following probability distribution:

x:        $50   $200   $350
P(X = x): 0.3   0.2    0.5

What is the variance of the repair cost? That is, how might we quantify the spread of the costs?

Variance – Discrete Variables

For discrete variables, σ² = E[(X − μ)²] = ∑ (x − μ)² f(x) = E(X²) − μ²

Recall, for our example, μ = E(X) = $230.

Preferred (shortcut) calculation:
σ² = E(X²) − μ² = 50²(0.3) + 200²(0.2) + 350²(0.5) − 230² = 17,100

Alternate calculation (from the definition):
σ² = ∑ (x − μ)² f(x) = (50 − 230)²(0.3) + (200 − 230)²(0.2) + (350 − 230)²(0.5) = 17,100

(Since X is measured in dollars, the variance is in squared dollars.)
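Both routes to σ² can be checked with a few lines of Python (a sketch; names are illustrative):

```python
costs = [50, 200, 350]
probs = [0.3, 0.2, 0.5]

mu = sum(x * p for x, p in zip(costs, probs))  # 230.0

# Shortcut: sigma^2 = E(X^2) - mu^2
var_shortcut = sum(x**2 * p for x, p in zip(costs, probs)) - mu**2

# Definition: sigma^2 = E[(X - mu)^2]
var_definition = sum((x - mu)**2 * p for x, p in zip(costs, probs))

print(var_shortcut, var_definition)  # 17100.0  17100.0
```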

Variance – Investment Example

By investing in a particular stock, a person can take a profit in a given year of $4000 with probability 0.3 or take a loss of $1000 with probability 0.7. What are the variance and standard deviation of the investor's gain on the stock?

x:        $4000   −$1000
P(X = x): 0.3     0.7

E(X) = 4000(0.3) − 1000(0.7) = $500
σ² = [∑ x² f(x)] − μ² = (4000)²(0.3) + (−1000)²(0.7) − 500² = 5,250,000
σ = √5,250,000 ≈ $2291.29
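The same shortcut formula applies here; a brief sketch of the calculation:

```python
import math

gains = [4000, -1000]   # dollars
probs = [0.3, 0.7]

mu = sum(x * p for x, p in zip(gains, probs))              # 500.0
var = sum(x**2 * p for x, p in zip(gains, probs)) - mu**2  # 5,250,000
print(var, math.sqrt(var))                                 # 5250000.0  2291.288...
```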

Variance of Continuous Variables

For continuous variables, σ² = E[(X − μ)²] = [∫ x² f(x) dx] − μ²

Recall our vacuum cleaner example (problem 7, pg. 88):

f(x) = x,      0 < x < 1
f(x) = 2 − x,  1 ≤ x < 2
f(x) = 0,      elsewhere

(x in hundreds of hours of operation.)

What is the variance of X? The variable is continuous, so we evaluate the integral:
σ² = [∫ x² f(x) dx] − μ²

Variance Calculations for Continuous Variables

Preferred calculation:
σ² = [∫₀¹ x³ dx + ∫₁² x²(2 − x) dx] − μ²
   = x⁴/4 |₀¹ + (2x³/3 − x⁴/4) |₁² − 1²
   = 0.1667

What is the standard deviation?
σ = √0.1667 ≈ 0.4082 hundred hours, i.e. about 40.8 hours
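A numerical check of this variance, again assuming SciPy is available:

```python
from scipy.integrate import quad

mu = 1.0  # E(X) from the earlier slide, in hundreds of hours

# E(X^2), splitting the integral at x = 1 where the density changes form
e_x2 = quad(lambda x: x**2 * x, 0, 1)[0] + quad(lambda x: x**2 * (2 - x), 1, 2)[0]

variance = e_x2 - mu**2
print(variance, variance**0.5)  # ~0.1667 and ~0.4082 (hundred-hour units)
```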

Covariance / Correlation

A measure of the nature of the association between two variables; it describes a potential linear relationship.
Positive relationship: large values of X tend to occur with large values of Y.
Negative relationship: large values of X tend to occur with small values of Y.
"Manual" calculations are based on the joint probability distribution.
Statistical software is often used to calculate the sample correlation coefficient (r), as in the sketch below.
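A minimal illustration of computing a sample correlation coefficient in Python; the paired data here are hypothetical, chosen only to show a strong positive linear association:

```python
import numpy as np

# Hypothetical paired measurements (illustrative data only, not from the slides)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

r = np.corrcoef(x, y)[0, 1]
print(r)  # close to +1, indicating a strong positive linear association
```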

What if the distribution is unknown?

Chebyshev's theorem: The probability that any random variable X assumes a value within k standard deviations of its mean is at least 1 − 1/k². That is,

P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²

This is a "distribution-free" theorem, so its results are weak; for example, with k = 2 it only guarantees that at least 75% of the probability lies within two standard deviations of the mean. If we believe we know the distribution, we do not use Chebyshev's theorem to characterize variability.
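As a small sanity check, the bound can be verified against the repair-cost distribution from earlier in this lecture (a sketch; the bound is typically loose when the distribution is known):

```python
import math

# Repair-cost distribution from earlier in this lecture
values = [50, 200, 350]
probs = [0.3, 0.2, 0.5]

mu = sum(x * p for x, p in zip(values, probs))
sigma = math.sqrt(sum(x**2 * p for x, p in zip(values, probs)) - mu**2)

k = 2
within = sum(p for x, p in zip(values, probs) if mu - k * sigma < x < mu + k * sigma)
print(within, ">=", 1 - 1 / k**2)  # 1.0 >= 0.75: the bound holds, and is loose here
```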