3.1 Expectation


3.1 Expectation

Expectation is the process of averaging when a r.v. is involved. Notation: E[X] is called "the mathematical expectation of X," "the expected value of X," "the mean value of X," or "the statistical average of X."

Example 3.1-1: 90 people are randomly selected and the fractional dollar value of their pocket coins is counted; 8, 12, 28, 22, 15, and 5 people have 18¢, 45¢, 64¢, 72¢, 77¢, and 95¢, respectively. The average is (8·18 + 12·45 + 28·64 + 22·72 + 15·77 + 5·95)/90 ≈ 63.2¢. The ratios (number of people with a given value)/(total number of people) correspond to values of the PDF.

3.1 Expectation: Expected Value of a Random Variable

In the previous example, if X is the discrete r.v. "fractional dollar value of pocket coins," it has six discrete values x_i that occur with probabilities P(x_i), and its expected value is E[X] = Σ_i x_i P(x_i), where x_i is the fractional dollar value and P(x_i) is the ratio of the number of people holding that value to the total number of people.
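A minimal sketch of this computation, using the values and counts given in Example 3.1-1:

```python
# Expected value of the discrete r.v. in Example 3.1-1.
# values: fractional-dollar amounts; counts: how many of the 90 people hold each.
values = [0.18, 0.45, 0.64, 0.72, 0.77, 0.95]
counts = [8, 12, 28, 22, 15, 5]
total = sum(counts)                      # 90 people
probs = [c / total for c in counts]      # the pmf P(x_i) = count_i / 90

E_X = sum(x * p for x, p in zip(values, probs))
print(f"E[X] = {E_X:.4f} dollars")       # about 0.6322, i.e. roughly 63.2 cents
```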

3.1 Expectation

The expected value of any r.v. X is E[X] = ∫_{-∞}^{∞} x f_X(x) dx. If X happens to be discrete with N possible values x_i having probabilities P(x_i) of occurrence, this reduces to E[X] = Σ_{i=1}^{N} x_i P(x_i).

3.1 Expectation Example 3.1-2 Determine the mean value of the following continuous, exponentially distributed r.v.

3.1 Expectation: Expected Value of a Function of a r.v.

The expected value of a real function g(·) of X is E[g(X)] = ∫_{-∞}^{∞} g(x) f_X(x) dx; if X is a discrete r.v., E[g(X)] = Σ_i g(x_i) P(x_i).

Example 3.1-3: A particular random voltage V is a Rayleigh r.v. with a = 0 and b = 5. The power is Y = g(V) = V². Find the average power E[Y].
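A numerical sketch of Example 3.1-3, assuming the Rayleigh density in the text's (a, b) notation, f_V(v) = (2/b)(v − a) e^{−(v−a)²/b} for v ≥ a; with a = 0, b = 5 the average power works out to b = 5 W:

```python
import math

a, b = 0.0, 5.0      # parameters given in Example 3.1-3

def f_V(v):          # Rayleigh density in the text's (a, b) notation (assumed form)
    return (2.0 / b) * (v - a) * math.exp(-((v - a) ** 2) / b) if v >= a else 0.0

# Midpoint Riemann sum for E[Y] = E[V^2]; the tail beyond v = 20 is negligible.
dv = 1e-3
vs = [a + (k + 0.5) * dv for k in range(int(20 / dv))]
avg_power = sum(v * v * f_V(v) * dv for v in vs)
print(avg_power)     # close to 5.0 watts
```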

3.1 Expectation Example 3.1-4 Entropy

3.1 Expectation

Let g(X) be a sum of N functions g_n(X), n = 1, 2, …, N. The expected value of the sum of N functions of a r.v. X equals the sum of the N expected values of the individual functions: E[Σ_n g_n(X)] = Σ_n E[g_n(X)].

Conditional Expected Value: the conditional expected value of X given an event B is E[X | B] = ∫_{-∞}^{∞} x f_X(x | B) dx.

3.1 Expectation: Conditional pdf & mean (rolling a die)

[Figure: the pmf of a fair die assigns probability 1/6 to each face 1–6; after conditioning, each remaining face has probability 1/3.]
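A quick check of the die example. The exact conditioning event is not recoverable from the transcript, so "the outcome is even" is assumed here; any event with three outcomes gives the same 1/3 conditional probabilities:

```python
faces = [1, 2, 3, 4, 5, 6]
pmf = {k: 1 / 6 for k in faces}              # fair die

B = {2, 4, 6}                                # assumed conditioning event
pB = sum(pmf[k] for k in B)                  # P(B) = 1/2
cond_pmf = {k: (pmf[k] / pB if k in B else 0.0) for k in faces}

cond_mean = sum(k * p for k, p in cond_pmf.items())   # E[X | B] = (2+4+6)/3 = 4
print(cond_pmf)      # probability 1/3 on each even face, 0 elsewhere
print(cond_mean)
```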

3.2 Moments: Moments about the Origin

The nth moment, denoted m_n, is the expected value of the function g(X) = X^n, n = 0, 1, …: m_n = E[X^n]. In particular, m_0 = 1, the area under f_X(x), and m_1 = E[X], the expected value of X. The 1st-order moment is very similar to a center of gravity.

3.2 Moments

Consider a stick with a uniform density of 1 kg/m and a length of 4 m, so the total weight is 1 kg/m × 4 m = 4 kg. Where is the center of gravity? If the density is not uniform, then f_X(x) is a function of the density of the stick. [Figure: stick extending from x = 1 to x = 5.]
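The center-of-gravity analogy can be checked numerically. Here the stick is assumed to lie from x = 1 to x = 5 (as in the figure), and the mass density normalized by the total mass plays the role of f_X(x):

```python
# Uniform stick from x = 1 to x = 5 (4 m at 1 kg/m); dividing the mass density
# by the total mass (4 kg) gives a valid pdf f_X(x) = 1/4 on [1, 5].
dx = 1e-4
xs = [1 + (k + 0.5) * dx for k in range(int(4 / dx))]
f = 1 / 4                                 # normalized uniform density
m1 = sum(x * f * dx for x in xs)          # first moment = center of gravity
print(m1)                                 # 3.0, the midpoint of the stick
```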

3.2 Moments: Central Moments

Central moments are moments about the mean value of X: the expected values of the function g(X) = (X − E[X])^n, n = 0, 1, …, denoted μ_n. Here μ_0 = 1, the area of f_X(x), and μ_1 = 0.

3.2 Moments: Variance and Skew

The variance σ_X² is the 2nd central moment, μ_2 = E[(X − E[X])²]. The standard deviation σ_X is the positive square root of the variance and is a measure of the spread of the function f_X(x) about the mean. The variance can be found from knowledge of the 1st and 2nd moments: σ_X² = m_2 − m_1².

3.2 Moments

Example 3.2-1: Let X have the exponential density function. Find the variance of X.
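A numerical sketch of σ² = m₂ − m₁², assuming the exponential density takes the form f_X(x) = (1/b) e^{−(x−a)/b} for x > a (for which the variance is b²); the parameter values here are chosen for illustration, not taken from the slide:

```python
import math

a, b = 0.0, 3.0    # illustrative parameters (an assumption)

def f(x):          # assumed exponential density (1/b) e^{-(x-a)/b}, x > a
    return math.exp(-(x - a) / b) / b if x > a else 0.0

# Midpoint Riemann sums for the first and second moments
dx = 1e-3
xs = [a + (k + 0.5) * dx for k in range(int(60 * b / dx))]
m1 = sum(x * f(x) * dx for x in xs)
m2 = sum(x * x * f(x) * dx for x in xs)
var = m2 - m1 ** 2
print(var)   # close to b**2 = 9
```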

3.2 Moments

The third central moment μ_3 = E[(X − E[X])³] is a measure of the asymmetry of f_X(x) about the mean and is called the skew of the density function. A density that is symmetric about its mean has zero skew. The normalized third central moment μ_3/σ_X³ is known as the skewness of the density function, or, alternatively, as the coefficient of skewness.

3.2 Moments

Example 3.2-2: Continuing Example 3.2-1 (exponential density), compute the skew and the coefficient of skewness.
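A numerical sketch for Example 3.2-2, again assuming the density form (1/b) e^{−(x−a)/b}, x > a; for this density the skew is 2b³ and the coefficient of skewness is 2 regardless of a and b:

```python
import math

a, b = 0.0, 2.0    # illustrative parameters (an assumption)

def f(x):
    return math.exp(-(x - a) / b) / b if x > a else 0.0

dx = 1e-3
xs = [a + (k + 0.5) * dx for k in range(int(80 * b / dx))]
mean = sum(x * f(x) * dx for x in xs)
mu2 = sum((x - mean) ** 2 * f(x) * dx for x in xs)      # variance
mu3 = sum((x - mean) ** 3 * f(x) * dx for x in xs)      # skew
skewness = mu3 / mu2 ** 1.5                             # coefficient of skewness
print(mu3, skewness)   # mu3 close to 2*b**3 = 16, skewness close to 2
```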

3.2 Moments: Chebychev's Inequality

For a r.v. X with mean μ and variance σ², Chebychev's inequality states that P{|X − μ| ≥ ε} ≤ σ²/ε² for any ε > 0.

3.2 Moments: The Weak Law of Large Numbers

If X1, X2, ..., Xn is a sequence of independent r.v.s with identical PDFs and E[Xi] = μ, Var[Xi] = σ², then the sample mean X̄n = (1/n) Σ Xi satisfies P{|X̄n − μ| ≥ ε} ≤ σ²/(nε²), which tends to 0 as n → ∞.

Ex) How many samples should be taken if we want a probability of at least 0.95 that the sample mean will not deviate by more than σ/10 from the true mean?
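Applying Chebychev's inequality to the sample mean answers the question above: Var[X̄n] = σ²/n, so P{|X̄n − μ| ≥ ε} ≤ σ²/(nε²); requiring this to be at most 0.05 with ε = σ/10 gives n ≥ 100/0.05 = 2000. A sketch of that calculation:

```python
import math

# Chebychev bound on the sample mean: Var[mean of n iid samples] = sigma^2 / n,
# so P{|Xbar - mu| >= eps} <= sigma^2 / (n * eps^2).
def chebychev_sample_size(eps_in_sigmas, max_fail_prob):
    # Smallest n with 1 / (n * eps_in_sigmas**2) <= max_fail_prob
    return math.ceil(1.0 / (max_fail_prob * eps_in_sigmas ** 2))

n = chebychev_sample_size(eps_in_sigmas=0.1, max_fail_prob=0.05)
print(n)   # 2000 samples suffice for probability >= 0.95 within sigma/10
```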

3.2 Moments

Example 3.2-3: Find the largest probability that any r.v.'s value differs from its mean by three standard deviations or more, in either direction. By Chebychev's inequality this probability is at most σ²/(3σ)² = 1/9.

3.2 Moments: An alternative form of Chebychev's inequality

Equivalently, P{|X − μ| < ε} ≥ 1 − σ²/ε² for any ε > 0. If the variance of a r.v. X approaches zero, this probability approaches 1 for every ε; i.e., X becomes equal to its mean value.

Markov's Inequality: for a nonnegative r.v. X and any a > 0, P{X ≥ a} ≤ E[X]/a.

3.4 Transformations of a R.V.

We may wish to transform (change) one r.v. X into a new r.v. Y by means of a transformation Y = T(X). The density function fX(x) or distribution function FX(x) of X is known, and the problem is to determine the density function fY(y) or distribution function FY(y) of Y. The problem can be viewed as a "black box" with input X, output Y, and "transfer characteristic" Y = T(X). X may be a discrete, continuous, or mixed r.v.; T may be linear, nonlinear, segmented, staircase, etc.

3.4 Transformations of a R.V.

[Figure: a r.v. and an increasing transformation. Sample points A–E map to X(s) = 1, 2, 3, 4, 5, which the increasing transformation maps to Y values (9, 16, and 25 among them).]

3.4 Transformations of a R.V.: Monotonic Transformations of a Continuous Random Variable

A transformation T is called monotonically increasing if T(x1) < T(x2) for any x1 < x2, and monotonically decreasing if T(x1) > T(x2) for any x1 < x2. Consider first an increasing transformation T that is continuous and differentiable at all values of x for which fX(x) ≠ 0. Let Y have the value y0 corresponding to the value x0 of X shown in the figure, and let T⁻¹ denote the inverse of the transformation T.

3.4 Transformations of a R.V.

Because of the one-to-one correspondence between X and Y, the probability of the event {Y ≤ y0} must equal the probability of the event {X ≤ x0}: FY(y0) = FX(x0) = FX(T⁻¹(y0)). Differentiating both sides w.r.t. y0 gives fY(y0) = fX(T⁻¹(y0)) dT⁻¹(y0)/dy0.

3.4 Transformations of a R.V.

[Figure: a r.v. and a decreasing transformation. Sample points A–E map to X(s) = 1, 2, 3, 4, 5, which the decreasing transformation maps to Y = −1, −4, −9, −16, −25.]

3.4 Transformations of a R.V.

Now consider a decreasing transformation. Here the event {Y ≤ y0} corresponds to {X ≥ x0}, so FY(y0) = 1 − FX(T⁻¹(y0)) and fY(y0) = −fX(T⁻¹(y0)) dT⁻¹(y0)/dy0. Since the derivative is negative for a decreasing T, both cases are covered by fY(y) = fX(T⁻¹(y)) |dT⁻¹(y)/dy|.

3.4 Transformations of a R.V.

Example 3.4-1: Y = T(X) = aX + b, where a ≠ 0 and b are real constants, so X = T⁻¹(Y) = (Y − b)/a and dx/dy = 1/a. If X is a Gaussian r.v., the result shows that a linear transformation of a Gaussian r.v. produces another Gaussian r.v. See also Example 5-1 in Papoulis (p. 124).

Note) a=2, b=1
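A Monte Carlo sanity check of Example 3.4-1 using the noted case a = 2, b = 1. Taking X as a zero-mean, unit-variance Gaussian is an assumption made here; then Y = 2X + 1 should have mean 1 and standard deviation 2:

```python
import random
import statistics

random.seed(1)
a_, b_ = 2.0, 1.0                                       # Y = aX + b with a=2, b=1
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]   # X ~ N(0, 1), assumed
ys = [a_ * x + b_ for x in xs]

print(statistics.mean(ys))    # close to a*0 + b = 1
print(statistics.stdev(ys))   # close to |a|*1 = 2
```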

3.4 Transformations of a R.V.

[Figure: a r.v. and a non-monotonic transformation. Sample points A–E map to X(s) = 1, 2, 3, 4, 5; the transformation maps several of these x values to the same y (the Y values 5, 8, and 9 appear).]

3.4 Transformations of a R.V.: Nonmonotonic Transformations of a Continuous r.v.

A transformation may not be monotonic, in which case there may be more than one interval of values of X corresponding to the event {Y ≤ y0}. For the value y0 shown in the figure, the event {Y ≤ y0} corresponds to the event {X ≤ x1 or x2 ≤ X ≤ x3}. In general, the probability of the event {Y ≤ y0} equals the probability of the event {x values yielding Y ≤ y0}, i.e., {x | T(x) ≤ y0}. (Proof: see p. 130 of Papoulis.)

3.4 Transformations of a R.V.

Example 3.4-2: Find fY(y) for the square-law transformation Y = T(X) = cX², c > 0. (Solution 1) Find the cdf, then differentiate the cdf to obtain the pdf. See also Example 5-2 on p. 125 of Papoulis.

3.4 Transformations of a R.V.

(Solution 2) Use the general formula for nonmonotonic transformations, fY(y) = Σn fX(xn)/|dT(x)/dx| evaluated at x = xn, where the xn are the real roots of y = T(x). For Y = cX², the roots are x = ±√(y/c), giving fY(y) = [fX(√(y/c)) + fX(−√(y/c))]/(2√(cy)) for y > 0.
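The square-law result can be spot-checked by Monte Carlo. Taking X as a standard Gaussian and c = 1 is an assumption for illustration; the empirical probability of a small interval of Y is compared with density × width from the formula fY(y) = [fX(√(y/c)) + fX(−√(y/c))]/(2√(cy)):

```python
import math
import random

random.seed(7)
c = 1.0
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]   # X ~ N(0, 1), assumed
ys = [c * x * x for x in xs]                            # Y = c X^2

def f_X(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def f_Y(y):     # density built from the two real roots x = +/- sqrt(y/c)
    r = math.sqrt(y / c)
    return (f_X(r) + f_X(-r)) / (2 * math.sqrt(c * y))

# Compare the empirical probability of a small interval with f_Y * width
lo, hi = 0.9, 1.1
empirical = sum(lo <= y <= hi for y in ys) / len(ys)
analytic = f_Y(1.0) * (hi - lo)
print(empirical, analytic)    # both close to 0.048
```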

3.4 Transformations of a R.V.: Transformation of a Discrete r.v.

Suppose X is a discrete r.v. and Y = T(X) is a continuous transformation. If the transformation is monotonic, there is a one-to-one correspondence between X and Y, so that the set {yn} corresponds to the set {xn} through the equation yn = T(xn), and the probability P(yn) equals P(xn). If T is not monotonic, the procedure remains valid except that more than one value xn may now correspond to a value yn; in such a case P(yn) equals the sum of the probabilities of the various xn for which yn = T(xn).

3.4 Transformations of a R.V.

Example 3.4-3: Consider a discrete r.v. X having values x = −1, 0, 1, and 2 with respective probabilities 0.1, 0.3, 0.4, and 0.2, and let Y = 2 − X² + X³/3. Find the density of Y. The values of X map to respective values of Y given by y = 2/3, 2, 4/3, and 2/3. The two values x = −1 and x = 2 map to the single value y = 2/3, so the probability of {Y = 2/3} is the sum of the probabilities P{X = −1} and P{X = 2}, i.e., 0.1 + 0.2 = 0.3.
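A direct sketch of Example 3.4-3, accumulating the probabilities of all x values mapping to the same y:

```python
# pmf of X from Example 3.4-3 and the transformation Y = 2 - X^2 + X^3/3
p_X = {-1: 0.1, 0: 0.3, 1: 0.4, 2: 0.2}

def T(x):
    return 2 - x ** 2 + x ** 3 / 3

p_Y = {}
for x, p in p_X.items():
    y = round(T(x), 12)          # round so x = -1 and x = 2 land on the same key (2/3)
    p_Y[y] = p_Y.get(y, 0.0) + p # sum probabilities of all x mapping to this y

print(p_Y)   # {2/3: 0.3, 2: 0.3, 4/3: 0.4}, up to float rounding
```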

3.5 Computer Generation of One Random Variable

How do we generate samples with a distribution we want? Suppose we already have samples following a uniform distribution. The answer is to pass them through a suitable transformation T(x).

3.5 Computer Generation of One Random Variable

We assume initially that T(x) is a monotonically nondecreasing function, so the result FY(y) = FX(T⁻¹(y)) applies. For uniform X, FX(x) = x where 0 < x < 1, so FY(y) = T⁻¹(y); solving for the inverse gives T(x) = FY⁻¹(x). Since any distribution function FY(y) is nondecreasing, its inverse is nondecreasing, and the initial assumption is always satisfied. Thus, given a specified distribution FY(y) for Y, we find the required transformation by solving FY(y) = x for y; this result is T(x).

3.5 Computer Generation of One Random Variable

Example 3.5-1: Find the transformation required to generate the Rayleigh random variable with a = 0, whose distribution function is FY(y) = 1 − e^{−y²/b} for y ≥ 0. Setting FY(y) = x and solving for y, we find y = T(x) = √(−b ln(1 − x)).
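A sketch of the procedure, assuming the Rayleigh CDF FY(y) = 1 − e^{−y²/b} for y ≥ 0 (the a = 0 case), so that the inverse-CDF transformation is T(x) = √(−b ln(1 − x)); for this form E[Y²] = b, which the sample mean square should reproduce:

```python
import math
import random

random.seed(3)
b = 5.0

def T(x):                         # inverse of the Rayleigh CDF (a = 0 assumed)
    return math.sqrt(-b * math.log(1.0 - x))

# Push uniform (0,1) samples through T to get Rayleigh samples
samples = [T(random.random()) for _ in range(100_000)]

mean_sq = sum(y * y for y in samples) / len(samples)
print(mean_sq)    # close to b = 5.0
```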

3.5 Computer Generation of One Random Variable

Example 3.5-2: Find the transformation required to convert a (0,1) uniform r.v. into a r.v. with the arcsine distribution.
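A sketch for Example 3.5-2, assuming the arcsine CDF takes the form FY(y) = 1/2 + (1/π) sin⁻¹(y/a) for −a ≤ y ≤ a; solving FY(y) = x then gives T(x) = a sin(π(x − 1/2)). Under that assumption the generated samples should lie in (−a, a) with mean 0 and mean square a²/2:

```python
import math
import random

random.seed(5)
a = 2.0                                   # half-width of the support (-a, a), chosen here

def T(x):                                 # inverse of the assumed arcsine CDF
    return a * math.sin(math.pi * (x - 0.5))

samples = [T(random.random()) for _ in range(100_000)]
mean = sum(samples) / len(samples)
mean_sq = sum(y * y for y in samples) / len(samples)
print(mean)      # close to 0 (the distribution is symmetric)
print(mean_sq)   # close to a**2 / 2 = 2
```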

3.5 Computer Generation of One Random Variable Example 3.5-3

Homework assignment (programming). Due: next Tuesday.

2-1) Generate 10000 random numbers on [0,1] and plot their density function. (Hint: C functions rand(), srand().)

2-2) Transform the 10000 random numbers of 2-1) using the transform obtained in Example 3.5-1, and plot their density function.
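A sketch of part 2-1 in Python rather than C (the rand()/srand() hint suggests C, but the idea is identical); the "plot" here is a crude text histogram of the estimated density, which should be flat near 1 for uniform samples:

```python
import random

random.seed(0)
N, BINS = 10_000, 10
samples = [random.random() for _ in range(N)]          # uniform on [0, 1)

counts = [0] * BINS
for u in samples:
    counts[min(int(u * BINS), BINS - 1)] += 1          # bin index, clamped

# Estimated density per bin: count / (N * bin_width); should be near 1
for i, c in enumerate(counts):
    density = c / (N * (1.0 / BINS))
    print(f"[{i/BINS:.1f}, {(i+1)/BINS:.1f})  {'#' * (c // 50)}  {density:.2f}")
```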