Probability & Statistics for Engineers & Scientists, by Walpole, Myers, Myers & Ye ~ Chapter 4 Notes
Class notes for ISE 201, San Jose State University, Industrial & Systems Engineering Dept.
Steve Kennedy

Mean of a Set of Observations

Suppose an experiment involves tossing 2 coins. The result is either 0, 1, or 2 heads. Suppose the experiment is repeated 15 times, and suppose that 0 heads is observed 3 times, 1 head 8 times, and 2 heads 4 times. What is the average number of heads flipped?

$\bar{x} = (0+0+0+1+1+1+1+1+1+1+1+2+2+2+2)/15 = ((0)(3) + (1)(8) + (2)(4))/15 = 1.07$

This could also be written as a weighted average,

$\bar{x} = (0)(3/15) + (1)(8/15) + (2)(4/15) = 1.07$

where 3/15, 8/15, etc. are the fractions of times the given number of heads came up. The average is also called the mean.
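
A minimal Python sketch of this arithmetic, using the observation counts from the example (3, 8, and 4):

```python
# Average number of heads over 15 tosses of 2 coins:
# 0 heads seen 3 times, 1 head 8 times, 2 heads 4 times.
counts = {0: 3, 1: 8, 2: 4}
n = sum(counts.values())  # 15 repetitions

# Direct average of all observations...
x_bar = sum(value * count for value, count in counts.items()) / n

# ...equals the weighted average using relative frequencies.
x_bar_weighted = sum(value * (count / n) for value, count in counts.items())

print(x_bar, x_bar_weighted)  # both print 1.0666... ~ 1.07
```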

Mean of a Random Variable

A similar technique, taking the probability of an outcome times the value of the random variable for that outcome, is used to calculate the mean of a random variable. The mean or expected value $\mu$ of a random variable $X$ with probability distribution $f(x)$ is

$\mu = E(X) = \sum_x x f(x)$ if discrete, or

$\mu = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$ if continuous.
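
A quick sketch of the discrete case in Python, assuming $X$ is the number of heads in two fair coin tosses, so $f(0) = 1/4$, $f(1) = 1/2$, $f(2) = 1/4$:

```python
from fractions import Fraction

# pmf of X = number of heads in 2 fair coin tosses
f = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# mu = E(X) = sum over x of x * f(x)
mu = sum(x * p for x, p in f.items())
print(mu)  # 1
```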

Mean of a Random Variable Depending on X

If $X$ is a random variable with distribution $f(x)$, the mean $\mu_{g(X)}$ of the random variable $g(X)$ is

$\mu_{g(X)} = E[g(X)] = \sum_x g(x) f(x)$ if discrete, or

$\mu_{g(X)} = E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx$ if continuous.
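
Continuing the two-coin pmf above, a small sketch of $E[g(X)]$ for an illustrative choice $g(x) = x^2$:

```python
from fractions import Fraction

f = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def expectation(g, f):
    """E[g(X)] = sum over x of g(x) * f(x) for a discrete pmf f."""
    return sum(g(x) * p for x, p in f.items())

print(expectation(lambda x: x, f))      # E[X]   = 1
print(expectation(lambda x: x * x, f))  # E[X^2] = 3/2
```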

Expected Value for a Joint Distribution

If $X$ and $Y$ are random variables with joint probability distribution $f(x,y)$, the mean or expected value $\mu_{g(X,Y)}$ of the random variable $g(X,Y)$ is

$\mu_{g(X,Y)} = E[g(X,Y)] = \sum_x \sum_y g(x,y) f(x,y)$ if discrete, or

$\mu_{g(X,Y)} = E[g(X,Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x,y) f(x,y)\,dy\,dx$ if continuous.

Note that a mean is a single value belonging to a particular function $g(X,Y)$, so it doesn't make sense to talk of "the mean of the distribution $f(x,y)$" by itself.
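
A sketch of the discrete double sum, using a small joint pmf invented purely for illustration:

```python
from fractions import Fraction

# A small joint pmf f(x, y), chosen here just for illustration.
f = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
     (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4)}

def expectation(g, f):
    """E[g(X, Y)] = double sum of g(x, y) * f(x, y)."""
    return sum(g(x, y) * p for (x, y), p in f.items())

print(expectation(lambda x, y: x * y, f))  # E[XY]    = 1/4
print(expectation(lambda x, y: x + y, f))  # E[X + Y] = 1
```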

Variance

What was the variance of a set of observations? The variance $\sigma^2$ of a random variable $X$ with distribution $f(x)$ is

$\sigma^2 = E[(X - \mu)^2] = \sum_x (x - \mu)^2 f(x)$ if discrete, or

$\sigma^2 = E[(X - \mu)^2] = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx$ if continuous.

An equivalent and easier computational formula, also easy to remember, is

$\sigma^2 = E[X^2] - (E[X])^2 = E[X^2] - \mu^2$

"The expected value of $X^2$ minus the expected value of $X$... squared." Derivation from the previous formula is simple: $E[(X - \mu)^2] = E[X^2 - 2\mu X + \mu^2] = E[X^2] - 2\mu E[X] + \mu^2 = E[X^2] - \mu^2$.
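
Both forms of the variance formula can be checked numerically; a sketch with the two-coin pmf:

```python
from fractions import Fraction

f = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
mu = sum(x * p for x, p in f.items())                      # E[X] = 1

var_def = sum((x - mu) ** 2 * p for x, p in f.items())     # E[(X - mu)^2]
var_comp = sum(x ** 2 * p for x, p in f.items()) - mu**2   # E[X^2] - mu^2

print(var_def, var_comp)  # both 1/2
```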

Variance of a Sample

There's also a somewhat similar, better computational formula for $s^2$. What is $s^2$? What was the original formula for the variance of a sample? The formula is

$s^2 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})^2}{n - 1}$

and the computational form is

$s^2 = \frac{\sum_{i=1}^{n} x_i^2 - n\bar{x}^2}{n - 1}$
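
A sketch verifying that the definitional and computational forms of $s^2$ agree, reusing the 15 coin-toss observations from the first slide:

```python
data = [0] * 3 + [1] * 8 + [2] * 4  # the 15 coin-toss observations above
n = len(data)
x_bar = sum(data) / n

# Definitional form
s2_def = sum((x - x_bar) ** 2 for x in data) / (n - 1)

# Computational form: (sum of x_i^2 - n * x_bar^2) / (n - 1)
s2_comp = (sum(x * x for x in data) - n * x_bar ** 2) / (n - 1)

print(s2_def, s2_comp)  # identical values, ~0.495
```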

Covariance

If $X$ and $Y$ are random variables with joint probability distribution $f(x,y)$, the covariance $\sigma_{XY}$ of $X$ and $Y$ is defined as

$\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)]$

The better computational formula for covariance is

$\sigma_{XY} = E(XY) - \mu_X \mu_Y$

Note that although the standard deviation $\sigma$ can't be negative, the covariance $\sigma_{XY}$ can be negative. Covariance will be useful later when looking at the linear relationship between two random variables.
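
A sketch of the computational formula $\sigma_{XY} = E(XY) - \mu_X \mu_Y$ on a small joint pmf with dependence (the probabilities are illustrative):

```python
from fractions import Fraction

# Joint pmf with dependence between X and Y (illustrative values).
f = {(0, 0): Fraction(3, 8), (0, 1): Fraction(1, 8),
     (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8)}

E_X = sum(x * p for (x, y), p in f.items())       # marginal mean of X
E_Y = sum(y * p for (x, y), p in f.items())       # marginal mean of Y
E_XY = sum(x * y * p for (x, y), p in f.items())  # E(XY)

cov = E_XY - E_X * E_Y                            # computational formula
print(cov)  # 1/8 > 0: X and Y tend to move together
```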

Correlation Coefficient

If $X$ and $Y$ are random variables with covariance $\sigma_{XY}$ and standard deviations $\sigma_X$ and $\sigma_Y$ respectively, the correlation coefficient $\rho_{XY}$ is defined as

$\rho_{XY} = \sigma_{XY} / (\sigma_X \sigma_Y)$

Correlation coefficient notes: What are the units of $\rho_{XY}$? (None; the units cancel, so it is dimensionless.) What is the possible range of $\rho_{XY}$? ($-1 \le \rho_{XY} \le 1$.) What is the meaning of the correlation coefficient? If $\rho_{XY} = 1$ or $-1$, then there is an exact linear relationship between $Y$ and $X$ (i.e., $Y = a + bX$). If $\rho_{XY} = 1$, then $b > 0$, and if $\rho_{XY} = -1$, then $b < 0$. We can show this by calculating the correlation of $X$ and $a + bX$: the covariance is $b\sigma_X^2$ and $\sigma_{a+bX} = |b|\sigma_X$, so $\rho = b\sigma_X^2 / (\sigma_X \cdot |b|\sigma_X) = b/|b| = \pm 1$.
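
The $Y = a + bX$ claim can be checked directly; a sketch with the two-coin pmf and an illustrative $b < 0$:

```python
import math
from fractions import Fraction

# pmf of X (the two-coin example) and Y = a + bX with b < 0
f = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
a, b = 5, -3

E_X = sum(x * p for x, p in f.items())         # 1
E_X2 = sum(x * x * p for x, p in f.items())    # 3/2
var_X = E_X2 - E_X ** 2                        # 1/2

# cov(X, a + bX) = E[X(a + bX)] - E[X] * E[a + bX] = b * var(X)
cov = sum(x * (a + b * x) * p for x, p in f.items()) - E_X * (a + b * E_X)

rho = cov / math.sqrt(var_X * (b * b * var_X))  # = b / |b|
print(rho)                                      # -1.0, since b < 0
```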

Linear Combinations of Random Variables

If $a$ and $b$ are constants,

$E(aX + b) = aE(X) + b$

This also holds if $a = 0$ or $b = 0$. If we add or subtract two functions,

$E[g(X) \pm h(X)] = E[g(X)] \pm E[h(X)]$

This is also true for functions of two or more random variables. That is,

$E[g(X,Y) \pm h(X,Y)] = E[g(X,Y)] \pm E[h(X,Y)]$
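
A quick numeric check of both linearity rules, with illustrative constants and functions:

```python
from fractions import Fraction

f = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def E(g):
    """E[g(X)] for the discrete pmf f above."""
    return sum(g(x) * p for x, p in f.items())

a, b = 3, 7
print(E(lambda x: a * x + b), a * E(lambda x: x) + b)  # both 10

g = lambda x: x * x
h = lambda x: 2 * x
print(E(lambda x: g(x) + h(x)), E(g) + E(h))           # both 7/2
```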

Functions of Two or More Random Variables

The expected value of the sum of two random variables is equal to the sum of the expected values:

$E(X \pm Y) = E(X) \pm E(Y)$

The expected value of the product of two independent random variables is equal to the product of the expected values:

$E(XY) = E(X)E(Y)$
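
A sketch of the product rule for two independent fair dice, where the joint pmf factors as $f(x)f(y)$:

```python
from fractions import Fraction

# Two independent fair dice: joint pmf factors as f(x) * f(y).
p = Fraction(1, 6)
E_X = sum(x * p for x in range(1, 7))  # 7/2
E_XY = sum(x * y * p * p
           for x in range(1, 7)
           for y in range(1, 7))       # E(XY)

print(E_XY, E_X * E_X)  # both 49/4: E(XY) = E(X)E(Y) under independence
```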

Variance Relationships

For a random variable $X$ with variance $\sigma^2$,

$\sigma^2_{aX+b} = a^2 \sigma_X^2$

So adding a constant does what? (Nothing: it shifts the distribution but leaves the variance unchanged.) And multiplying by a constant does what? (It scales the variance by the constant squared.) For two random variables $X$ and $Y$,

$\sigma^2_{aX+bY} = a^2 \sigma_X^2 + b^2 \sigma_Y^2 + 2ab\,\sigma_{XY}$

What if $X$ and $Y$ are independent? Then $\sigma_{XY} = 0$. Note that the correlation coefficient is also 0.
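
A simulation sketch of the two-variable rule for independent $X$ and $Y$ (sample sizes and parameters are arbitrary):

```python
import random

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(100_000)]  # var(X) ~ 1
ys = [random.gauss(0, 2) for _ in range(100_000)]  # var(Y) ~ 4, independent of xs

def var(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

a, b = 3, 5
z = [a * x + b * y for x, y in zip(xs, ys)]

# For independent X, Y: var(aX + bY) ~ a^2 var(X) + b^2 var(Y)
print(var(z), a**2 * var(xs) + b**2 * var(ys))  # both close to 9*1 + 25*4 = 109
```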

Chebyshev's Inequality

The probability that any random variable $X$ will assume a value within $k$ standard deviations of the mean is at least $1 - 1/k^2$. That is,

$P(\mu - k\sigma < X < \mu + k\sigma) \ge 1 - 1/k^2$

This theorem is both very general and very weak. Very general, since it holds for any probability distribution. Very weak for the same reason, because it is a worst-case limit that holds for any distribution. If we know the distribution, we can get a better limit by summing or integrating that distribution over the interval directly, so Chebyshev's inequality is only used when the distribution is unknown. Care must be taken, however, not to assume an underlying distribution when the distribution is really unknown.
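
A sketch contrasting Chebyshev's worst-case bound with the actual probability for one known distribution (a normal, chosen for illustration):

```python
import random

random.seed(1)
mu, sigma, k = 0.0, 1.0, 2.0
xs = [random.gauss(mu, sigma) for _ in range(100_000)]

# Empirical fraction within k standard deviations of the mean
within = sum(1 for x in xs if mu - k * sigma < x < mu + k * sigma) / len(xs)

print(within)        # ~0.954 for a normal distribution
print(1 - 1 / k**2)  # 0.75: Chebyshev's guarantee, much weaker
```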