Lebesgue measure: Lebesgue measure m₀ is a measure on (ℝ, B(ℝ)), i.e., 1. m₀(A) ≥ 0 for every A ∈ B(ℝ), with m₀([a, b]) = b − a; 2. m₀(⋃ₙ Aₙ) = Σₙ m₀(Aₙ) whenever A₁, A₂, … ∈ B(ℝ) are disjoint (countable additivity). It generalizes the concept of length on ℝ.
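A small worked consequence (added here for illustration, not part of the original slide): every singleton has length zero, so by countable additivity even a dense countable set such as the rationals in [0, 1] has Lebesgue measure zero,
\[
m_0(\{a\}) = m_0([a,a]) = a - a = 0, \qquad
m_0(\mathbb{Q} \cap [0,1]) = \sum_{q \in \mathbb{Q} \cap [0,1]} m_0(\{q\}) = 0 .
\]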
Lebesgue integral, example: The example computes the Lebesgue integral of a simple function written in two different ways. In other words, the value of the integral is independent of the representation of the simple function in this example.
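The original worked example did not survive extraction; the following is a minimal substitute making the same point. The simple function below has two representations, and both give the same Lebesgue integral:
\[
f = 2\cdot\mathbf{1}_{[0,1)} + 2\cdot\mathbf{1}_{[1,3)} = 2\cdot\mathbf{1}_{[0,3)}, \qquad
\int_{\mathbb{R}} f\,dm_0 = 2\,m_0([0,1)) + 2\,m_0([1,3)) = 2 + 4 = 6 = 2\,m_0([0,3)) .
\]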
The Lebesgue integral: The Lebesgue integral is defined using Lebesgue measure, in four steps.
- For indicator functions, ∫_ℝ 1_A dm₀ = m₀(A).
- For simple functions f = Σₖ cₖ 1_{Aₖ} (a finite sum with the Aₖ disjoint), ∫_ℝ f dm₀ = Σₖ cₖ m₀(Aₖ).
- For non-negative functions f, ∫_ℝ f dm₀ = sup { ∫_ℝ s dm₀ : s simple, 0 ≤ s ≤ f }, equivalently the limit of the integrals of simple functions increasing to f.
- For general functions, split f = f⁺ − f⁻ and set ∫_ℝ f dm₀ = ∫_ℝ f⁺ dm₀ − ∫_ℝ f⁻ dm₀ (defined when both parts are finite).
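As a rough numerical sketch of the non-negative step (added for illustration; the function lebesgue_integral_nonneg and the grid-based estimate of m₀ are my own choices, not from the slides):

import numpy as np

def lebesgue_integral_nonneg(f, a, b, n=12, grid=1_000_000):
    # Approximate the Lebesgue integral of a nonnegative f over [a, b] using
    # the standard simple-function approximation s_n(x) = floor(2^n f(x)) / 2^n.
    # The Lebesgue measure of each level set of s_n is estimated on a uniform grid.
    x = np.linspace(a, b, grid, endpoint=False)
    dx = (b - a) / grid
    s_n = np.floor((2.0 ** n) * f(x)) / (2.0 ** n)   # simple function with s_n <= f
    # Integral of the simple function = sum over its finitely many values c of
    # c * m0({s_n = c}); on the grid this collapses to sum(s_n) * dx.
    return float(np.sum(s_n) * dx)

# Example: f(x) = x^2 on [0, 1]; the exact value is 1/3.
print(lebesgue_integral_nonneg(lambda t: t ** 2, 0.0, 1.0))   # ~0.3332 (exact value 1/3)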
General probability spaces: (Ω, F, P) is a probability triple. 1. Ω, a nonempty set, called the sample space, which contains all possible outcomes of the random experiment. 2. F, a σ-algebra of subsets of Ω. 3. P, a probability measure on (Ω, F), i.e., a function which assigns to each set A ∈ F a number P(A) ∈ [0, 1] representing the probability that the outcome of the random experiment lies in the set A.
Integration using a general probability measure: Let X be a random variable on (Ω, F, P). 1. Indicator function: for X = 1_A with A ∈ F, ∫_Ω X dP = P(A). 2. Simple function: for X = Σₖ cₖ 1_{Aₖ} with disjoint Aₖ ∈ F, ∫_Ω X dP = Σₖ cₖ P(Aₖ).
Integration of random variables (continued): 3. X is nonnegative: ∫_Ω X dP = limₙ ∫_Ω Xₙ dP, where the Xₙ are simple functions with 0 ≤ Xₙ ↑ X. 4. General X: split X = X⁺ − X⁻ and set ∫_Ω X dP = ∫_Ω X⁺ dP − ∫_Ω X⁻ dP (defined when both parts are finite).
Expectation and other properties: E(X) = ∫_Ω X dP. E(cX) = c E(X) for a constant c, and E(X + Y) = E(X) + E(Y). If X ≤ Y a.s., then E(X) ≤ E(Y). If A and B are disjoint, then ∫_{A∪B} X dP = ∫_A X dP + ∫_B X dP.
Monotone Convergence Theorem: Let Xₙ, n = 1, 2, … be a sequence of random variables converging almost surely to a random variable X, i.e., Xₙ → X a.s. Assume that 0 ≤ X₁ ≤ X₂ ≤ ⋯ a.s. Then limₙ E(Xₙ) = E(X), or equivalently, limₙ ∫_Ω Xₙ dP = ∫_Ω X dP.
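A minimal numerical illustration (added here; the exponential example is my own choice): with X ~ Exp(1) and Xₙ = min(X, n), the Xₙ are nonnegative, increase with n, and converge to X, and their expectations 1 − e⁻ⁿ increase to E(X) = 1.

import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)   # samples of X ~ Exp(1), E(X) = 1

# X_n = min(X, n): nonnegative, increasing in n, and X_n -> X almost surely
for n in [1, 2, 4, 8, 16]:
    x_n = np.minimum(x, n)
    print(n, x_n.mean())    # sample mean approximates E(X_n) = 1 - exp(-n) -> 1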
Probability measure induced by a random variable: X is a random variable on (Ω, F, P). We could define the expectation of X as E(X) = ∫_Ω X dP. This does not look like the familiar old definition … at least to me. A more familiar density formula can be derived from the so-called measure induced by X (the random variable).
Induced measure: For a random variable X on (Ω, F, P) and a Borel set B ∈ B(ℝ), we write {X ∈ B} = {ω ∈ Ω : X(ω) ∈ B} = X⁻¹(B). So {X ∈ B} ∈ F. The induced measure of B is L_X(B) = P{X ∈ B}; it is a measure on (ℝ, B(ℝ)) that assigns to each Borel set the probability that X falls into it. In fact, the induced measure L_X is a probability measure, because L_X(ℝ) = P{X ∈ ℝ} = P(Ω) = 1.
Expectation using density formula: We now have two measures on (ℝ, B(ℝ)): L_X, the induced measure, and m₀, Lebesgue measure. The two measures on (ℝ, B(ℝ)) are connected through a "density" (if it exists) satisfying L_X(A) = ∫_A φ dm₀ for every A ∈ B(ℝ); i.e., under a certain condition (absolute continuity of L_X with respect to m₀), there exists φ s.t. L_X(A) = ∫_A φ(x) dm₀(x), where φ is the Radon–Nikodym derivative of L_X w.r.t. m₀: φ = dL_X/dm₀.
Suppose f is a (Borel-measurable) function on ℝ. Then we have the density formula ∫_ℝ f(x) dL_X(x) = ∫_ℝ f(x) φ(x) dm₀(x), and hence the expectation of f(X): E[f(X)] = ∫_Ω f(X(ω)) dP(ω) = ∫_ℝ f(x) φ(x) dm₀(x). To prove this, we use the "standard machine."
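A quick Monte Carlo check of the density formula (added for illustration; the choices X ~ N(0, 1) and f(x) = x² are mine):

import numpy as np

rng = np.random.default_rng(1)
f = lambda x: x ** 2
phi = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)   # density dL_X/dm0 of N(0, 1)

# left-hand side: sample-average estimate of E[f(X)]
lhs = f(rng.standard_normal(1_000_000)).mean()

# right-hand side: numerical integral of f * phi over [-10, 10]
x = np.linspace(-10.0, 10.0, 2_000_001)
rhs = float(np.sum(f(x) * phi(x)) * (x[1] - x[0]))

print(lhs, rhs)   # both close to 1, the variance of a standard normal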
Standard machine: a four-step procedure for proving identities such as the density formula. Step 1: Start with the assumption that f is an indicator function. Step 2: Then extend to the simple-function case by linearity. Step 3: Construct a sequence of nonnegative simple functions which converges to a nonnegative function f. Use the Monotone Convergence Theorem to pass to the limit in the integral. Step 4: For a general (integrable) function f, first split it into positive and negative parts, and integrate them separately.
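For concreteness, here is Step 1 of the standard machine written out for the density formula (added sketch):
\[
\int_{\Omega} \mathbf{1}_A(X)\, dP = P\{X \in A\} = L_X(A) = \int_A \varphi\, dm_0 = \int_{\mathbb{R}} \mathbf{1}_A(x)\,\varphi(x)\, dm_0(x),
\]
so the formula E[f(X)] = ∫_ℝ f φ dm₀ holds for f = 1_A; Steps 2–4 extend it by linearity, monotone convergence, and the splitting f = f⁺ − f⁻.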
Probability distributions using Lebesgue measure: Uniform distribution on [0, 1]: for (Ω, F) = ([0, 1], B([0, 1])), let L_X(A) = m₀(A) for A ∈ B([0, 1]). Then L_X is a probability measure because m₀([0, 1]) = 1. Standard normal distribution: on (ℝ, B(ℝ)), let L_X(A) = ∫_A (1/√(2π)) e^(−x²/2) dm₀(x). To compute L_X(A), one can also use the Riemann integral, e.g. L_X([a, b]) = ∫_a^b (1/√(2π)) e^(−x²/2) dx.
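The two ways of computing L_X([a, b]) for the standard normal can be checked numerically (illustrative sketch; the interval endpoints are arbitrary choices):

import math
import numpy as np

a, b = -1.0, 2.0

# 1) via the cumulative distribution function Phi(x) = (1 + erf(x / sqrt(2))) / 2
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
print(Phi(b) - Phi(a))                           # ~0.8186

# 2) via a Riemann sum of the density over [a, b]
x = np.linspace(a, b, 1_000_001)
density = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
print(float(np.sum(density) * (x[1] - x[0])))    # ~0.8186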
Independence: Definition 1.8: We say that two sets A and B are independent if P(A ∩ B) = P(A) P(B). Definition 1.9: We say that two σ-algebras G and H are independent if P(A ∩ B) = P(A) P(B) for every A ∈ G and B ∈ H. Definition 1.10: We say that two random variables, X and Y, are independent if the σ-algebras generated by these random variables are independent, i.e., σ(X) and σ(Y) are independent.
Independence of two functions: If two random variables, X and Y, are independent, then the two random variables g(X) and h(Y) are also independent. Proof: At first, recall that σ(g(X)) consists of the sets {g(X) ∈ B} for Borel sets B. For each such set there exists a Borel set C (namely C = g⁻¹(B)) s.t. {g(X) ∈ B} = {X ∈ C} ∈ σ(X). Therefore σ(g(X)) ⊂ σ(X). Similarly, σ(h(Y)) ⊂ σ(Y). Since σ(X) and σ(Y) are independent, we conclude σ(g(X)) and σ(h(Y)) are independent.
Variance, covariance, and correlation: Var(X) = E[(X − E X)²], Cov(X, Y) = E[(X − E X)(Y − E Y)], ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y). If two random variables X and Y are independent, then E(XY) = E(X) E(Y), so Cov(X, Y) = 0 and ρ(X, Y) = 0. More generally, if X and Y are independent, E[g(X) h(Y)] = E[g(X)] E[h(Y)].
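A quick simulation check of these identities (added sketch; the distributions of X and Y and the functions g, h are arbitrary choices):

import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.standard_normal(n)            # X ~ N(0, 1)
y = rng.exponential(1.0, size=n)      # Y ~ Exp(1), independent of X

print(np.mean(x * y), np.mean(x) * np.mean(y))   # approximately equal (both ~0)
print(np.cov(x, y)[0, 1])                        # ~0

g = lambda t: np.sin(t)
h = lambda t: t ** 2
print(np.mean(g(x) * h(y)), np.mean(g(x)) * np.mean(h(y)))   # approximately equal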
[Overview slide: theory vs. application, discrete vs. continuous settings.] Goal: understand the important concept of conditional expectation.
Definition of conditional expectation: Let (Ω, F, P) be a probability triple, G a sub-σ-algebra of F, and X a random variable on (Ω, F, P). The conditional expectation of X given G is a G-measurable random variable Y satisfying the partial-averaging property ∫_A Y dP = ∫_A X dP for every A ∈ G. We write Y = E( X | G ). The conditional expectation always exists if E|X| < ∞. The conditional expectation is unique (up to almost-sure equality).
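A quick sanity check of the definition (added here): for the trivial σ-algebra G = {∅, Ω}, the constant Y = E(X) is G-measurable and satisfies partial averaging,
\[
\int_{\Omega} Y\, dP = E(X)\, P(\Omega) = E(X) = \int_{\Omega} X\, dP, \qquad
\int_{\emptyset} Y\, dP = 0 = \int_{\emptyset} X\, dP,
\]
so E(X | {∅, Ω}) = E(X): conditioning on no information returns the ordinary expectation.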
Illustrative example: a binomial process given by 3 coin tosses. S_k: stock price at time k; each toss moves the price up by a factor u (on H) or down by a factor d (on T). p is the probability of H, q = 1 − p is the probability of T. Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}. F: the σ-algebra of all subsets of Ω. The coin tosses are independent. Filtration: F_0 ⊂ F_1 ⊂ F_2 ⊂ F_3, where F_k contains the events determined by the first k tosses.
Expectation and partial averages: E(X) = ∫_Ω X dP = Σ_{ω∈Ω} X(ω) P(ω), where P(ω) is the probability of the individual outcome ω. We will compute a partial average of X, ∫_A X dP, over sets A in a sub-σ-algebra of F in the 3 coin toss example.
σ(S_2) = {∅, Ω, A_HH, A_HT ∪ A_TH, A_TT, and unions of these sets}, where A_HH = {HHH, HHT}, A_HT ∪ A_TH = {HTH, HTT, THH, THT} and A_TT = {TTH, TTT} are the sets on which S_2 takes its three possible values. Let X = S_3(ω) and compute the partial averages ∫_A X dP for these sets A.
Let Y = (pu + qd) S_2, a σ(S_2)-measurable random variable. A direct computation gives ∫_{A_HH} S_3 dP = ∫_{A_HH} Y dP, and similarly, one can show that the partial-averaging property ∫_A Y dP = ∫_A S_3 dP holds for every set A in σ(S_2), with Y = (pu + qd) S_2. Hence Y = E( S_3 | σ(S_2) ). We also write E( S_3 | S_2 ) instead of E( S_3 | σ(S_2) ).
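The 3-toss computation can be checked by brute-force enumeration (sketch; the numerical values of S0, u, d, p below are illustrative, not from the slides):

from itertools import product

S0, u, d, p = 4.0, 2.0, 0.5, 0.6
q = 1.0 - p

def price(path, k):                    # S_k along a given H/T path
    s = S0
    for toss in path[:k]:
        s *= u if toss == "H" else d
    return s

def prob(path):                        # probability of an individual path
    return p ** path.count("H") * q ** path.count("T")

paths = ["".join(w) for w in product("HT", repeat=3)]

# group the paths into the atoms of sigma(S_2), i.e. by the value of S_2
atoms = {}
for w in paths:
    atoms.setdefault(price(w, 2), []).append(w)

for s2, atom in atoms.items():
    p_atom = sum(prob(w) for w in atom)
    cond_avg = sum(price(w, 3) * prob(w) for w in atom) / p_atom
    print(s2, cond_avg, (p * u + q * d) * s2)   # the last two columns agree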
N coin tosses: Start from S_0, which is deterministic. At each step, the price moves from S_k at time t = k to u S_k (with probability p) or d S_k (with probability q) at time t = k + 1. p is the probability of H, q = 1 − p is the probability of T. Ω: the sample space of all N-toss sequences. F: the σ-algebra of subsets of Ω. The coin tosses are independent. Filtration: F_0 ⊂ F_1 ⊂ ⋯ ⊂ F_N, where F_k contains the events determined by the first k tosses. We compute E( S_{k+1} | F_k ).
First of all, note that, if A ∈ F_k, then membership of a path in A depends only on its first k entries; hence, for a path in A whose (k+1)-th entry is H, there is always a companion path in A that agrees with it except that its (k+1)-th entry is T (and vice versa). Pairing such paths gives ∫_A S_{k+1} dP = ∫_A (pu + qd) S_k dP for every A ∈ F_k.
Let Y = (pu + qd) S_k, an F_k-measurable random variable. By the computation above, Y satisfies the partial-averaging property, so Y is the conditional expectation of S_{k+1} given F_k (denoted by E( S_{k+1} | F_k )).
Conditional expectation: [one-step binomial tree: from S_k at time t = k, the price moves to u S_k with probability p or to d S_k with probability q at time t = k + 1.] In this case, E( S_{k+1} | F_k ) = (pu + qd) S_k becomes a function of S_k, which gives an estimate of S_{k+1} based on the information of S_k.
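The partial-averaging property behind this formula can be verified by enumeration for a small N; it is enough to check it on the atoms of F_k (sets of paths sharing the same first k tosses), since every set in F_k is a union of atoms. A sketch with illustrative parameter values:

from itertools import product

S0, u, d, p, N = 4.0, 2.0, 0.5, 0.6, 4
q = 1.0 - p

def price(path, k):
    s = S0
    for toss in path[:k]:
        s *= u if toss == "H" else d
    return s

def prob(path):
    return p ** path.count("H") * q ** path.count("T")

paths = ["".join(w) for w in product("HT", repeat=N)]

for k in range(N):
    for prefix in ("".join(w) for w in product("HT", repeat=k)):
        atom = [w for w in paths if w.startswith(prefix)]          # an atom of F_k
        lhs = sum(price(w, k + 1) * prob(w) for w in atom)         # int_A S_{k+1} dP
        rhs = sum((p * u + q * d) * price(w, k) * prob(w) for w in atom)
        assert abs(lhs - rhs) < 1e-9
print("partial averaging of S_{k+1} against (pu+qd)S_k verified for k = 0, ...,", N - 1)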