
Lebesgue measure: Lebesgue measure $m_0$ is a measure on $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$, i.e.,
1. $m_0(A) \in [0, \infty]$ for every $A \in \mathcal{B}(\mathbb{R})$;
2. $m_0\left(\bigcup_{n=1}^{\infty} A_n\right) = \sum_{n=1}^{\infty} m_0(A_n)$ whenever $A_1, A_2, \dots$ are disjoint.
It generalizes the concept of length on $\mathbb{R}$: $m_0([a, b]) = b - a$, so for instance $m_0([0,1] \cup [2,4]) = 1 + 2 = 3$.

Lebesgue integral, example: a simple function can be written in more than one way; for instance, $2 \cdot 1_{[0,1)} + 2 \cdot 1_{[1,2]}$ and $2 \cdot 1_{[0,2]}$ define the same function.

Lebesgue integral, example (continued): both representations give the same integral, $2\, m_0([0,1)) + 2\, m_0([1,2]) = 4 = 2\, m_0([0,2])$. In other words, the value of the integral is independent of the representation of the simple function in this example.

The Lebesgue integral: The Lebesgue integral is defined using Lebesgue measure, in four steps:
- For indicator functions, $\int_{\mathbb{R}} 1_A \, dm_0 = m_0(A)$.
- For simple functions, $\int_{\mathbb{R}} \sum_k c_k 1_{A_k} \, dm_0 = \sum_k c_k\, m_0(A_k)$.
- For non-negative functions, $\int_{\mathbb{R}} f \, dm_0 = \lim_{n \to \infty} \int_{\mathbb{R}} f_n \, dm_0$, where the $f_n$ are non-negative simple functions increasing to $f$.
- For general functions, $\int_{\mathbb{R}} f \, dm_0 = \int_{\mathbb{R}} f^+ \, dm_0 - \int_{\mathbb{R}} f^- \, dm_0$, where $f^\pm$ are the positive and negative parts of $f$.
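As a rough numerical illustration of the simple-function step (not part of the original slides), the sketch below approximates $\int_0^1 x^2 \, dm_0$ by slicing the range of $f$ into levels and weighting each level by the estimated measure of the set where $f$ exceeds it; the name lebesgue_integral and the parameters n_levels and n_grid are illustrative.

```python
import numpy as np

def lebesgue_integral(f, a=0.0, b=1.0, n_levels=1_000, n_grid=100_000):
    """Approximate the Lebesgue integral of a non-negative f on [a, b]
    via a simple-function (layer-cake) approximation:
    integral ~ sum over levels t of m0({x : f(x) > t}) * dt."""
    x = np.linspace(a, b, n_grid)   # grid used to estimate set measures
    y = f(x)
    dt = y.max() / n_levels
    # m0({f > t}) is estimated by the fraction of grid points above level t
    return sum(np.mean(y > t) * (b - a) for t in np.arange(0, y.max(), dt)) * dt

print(lebesgue_integral(lambda x: x**2))  # ~ 0.333 = integral of x^2 on [0, 1]
```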

General probability spaces: $(\Omega, \mathcal{F}, P)$ is a probability triple, where
1. $\Omega$ is a nonempty set, called the sample space, which contains all possible outcomes of the random experiment;
2. $\mathcal{F}$ is a $\sigma$-algebra of subsets of $\Omega$;
3. $P$ is a probability measure on $(\Omega, \mathcal{F})$, i.e., a function which assigns to each set $A \in \mathcal{F}$ a number $P(A) \in [0, 1]$ representing the probability that the outcome of the random experiment lies in $A$.

Integration using a general probability measure: Let $X$ be a random variable on $(\Omega, \mathcal{F}, P)$.
1. Indicator function: if $X = 1_A$ with $A \in \mathcal{F}$, then $\int_\Omega X \, dP = P(A)$.
2. Simple function: if $X = \sum_k c_k 1_{A_k}$, then $\int_\Omega X \, dP = \sum_k c_k P(A_k)$.

Integration of random variables (continued):
3. $X$ non-negative: $\int_\Omega X \, dP = \lim_{n \to \infty} \int_\Omega X_n \, dP$, where the $X_n$ are simple functions increasing to $X$.
4. General: $\int_\Omega X \, dP = \int_\Omega X^+ \, dP - \int_\Omega X^- \, dP$.
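On a finite sample space each step of this construction can be carried out exactly; the sketch below (an illustrative toy example, not from the slides) integrates a general random variable by splitting it into positive and negative parts and summing the simple-function pieces.

```python
# A finite probability space: omega -> P({omega}).
P = {"a": 0.2, "b": 0.3, "c": 0.5}
X = {"a": 3.0, "b": -1.0, "c": 2.0}   # a random variable on Omega

def integrate_nonneg(Y, P):
    # Y takes finitely many values, so it is simple: integral = sum c_k * P(A_k)
    return sum(c * sum(p for w, p in P.items() if Y[w] == c)
               for c in set(Y.values()))

Xpos = {w: max(x, 0.0) for w, x in X.items()}    # positive part X+
Xneg = {w: max(-x, 0.0) for w, x in X.items()}   # negative part X-
EX = integrate_nonneg(Xpos, P) - integrate_nonneg(Xneg, P)
print(EX)  # 0.2*3 - 0.3*1 + 0.5*2 = 1.3
```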

Expectation and other properties: $E[X] = \int_\Omega X \, dP$. For a constant $c$, $E[cX] = c\,E[X]$, and $E[X + Y] = E[X] + E[Y]$; if $X \ge 0$ a.s., then $E[X] \ge 0$; and if $A$ and $B$ are disjoint, $\int_{A \cup B} X \, dP = \int_A X \, dP + \int_B X \, dP$.

Monotone Convergence Theorem: Let $X_n$, $n = 1, 2, \dots$ be a sequence of functions converging almost surely to a random variable $X$, i.e., $X_n \to X$ a.s. Assume that $0 \le X_1 \le X_2 \le \cdots$ a.s. Then $\lim_{n \to \infty} \int_\Omega X_n \, dP = \int_\Omega X \, dP$, or equivalently, $\lim_{n \to \infty} E[X_n] = E[X]$.
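A quick Monte Carlo illustration (an added sketch, not from the slides): the truncations $X_n = \min(X, n)$ are non-negative and increase to $X$, so their expectations should climb toward $E[X]$; here $X$ is exponential with mean 1.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(scale=1.0, size=1_000_000)  # E[X] = 1

# X_n = min(X, n) increases to X, so E[X_n] should increase toward 1
for n in [0.5, 1, 2, 4, 8]:
    print(n, np.minimum(X, n).mean())
```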

Probability measure induced by a random variable: $X$ is a random variable on $(\Omega, \mathcal{F}, P)$. We could define the expectation of $X$ as $E[X] = \int_\Omega X \, dP$. This does not look like the familiar old definition, at least to me. A more familiar density formula can be derived from the so-called measure induced by the random variable $X$.

Induced measure: For a random variable $X$ on $(\Omega, \mathcal{F}, P)$ we write $\{X \in B\} = \{\omega \in \Omega : X(\omega) \in B\}$ for $B \in \mathcal{B}(\mathbb{R})$. So the induced measure of $B$, $L_X(B) = P\{X \in B\}$, is a measure on $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$. In fact, the induced measure $L_X$ is a probability measure, because $L_X(\mathbb{R}) = P\{X \in \mathbb{R}\} = P(\Omega) = 1$.

Expectation using a density formula: We now have two measures on $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$: $L_X$, the induced measure, and $m_0$, Lebesgue measure. The two measures are connected through a "density" (if it exists) satisfying $L_X(A) = \int_A \varphi \, dm_0$ for all $A \in \mathcal{B}(\mathbb{R})$; i.e., under a certain condition (absolute continuity of $L_X$ with respect to $m_0$), there exists $\varphi$ such that this holds, where $\varphi$ is the Radon-Nikodym derivative of $L_X$ with respect to $m_0$: $\varphi = dL_X / dm_0$.

Suppose $f$ is a function on $\mathbb{R}$. Then we have the density formula $\int_{\mathbb{R}} f \, dL_X = \int_{\mathbb{R}} f \varphi \, dm_0$, and hence the expectation of $f(X)$: $E[f(X)] = \int_\Omega f(X) \, dP = \int_{\mathbb{R}} f \varphi \, dm_0$. To prove this, we use the "standard machine."
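As a sanity check (an illustrative sketch, not part of the slides), the density formula can be verified numerically for a standard normal $X$ and $f(x) = x^2$: a Monte Carlo estimate of $E[f(X)]$ on the probability space should agree with the quadrature of $f \varphi$ against Lebesgue measure.

```python
import numpy as np
from scipy.integrate import quad

f = lambda x: x**2
phi = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density

# Left side: E[f(X)] estimated by sampling X on the probability space
rng = np.random.default_rng(0)
mc = f(rng.standard_normal(1_000_000)).mean()

# Right side: density formula, integral of f * phi against Lebesgue measure
dens, _ = quad(lambda x: f(x) * phi(x), -np.inf, np.inf)

print(mc, dens)  # both ~ 1.0 (the variance of a standard normal)
```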

Standard machine: to prove $\int_{\mathbb{R}} f \, dL_X = \int_\Omega f(X) \, dP$:
Step 1: Start with the assumption that $f$ is an indicator function.
Step 2: Then extend to the simple-function case.
Step 3: Construct a sequence of non-negative simple functions which converges to a non-negative function $f$, and use the Monotone Convergence Theorem to get the integral.
Step 4: For a general (integrable) function $f$, first split it into positive and negative parts, and integrate them separately.
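Step 1 written out for concreteness (a short worked instance, added here): for an indicator $f = 1_B$ with $B \in \mathcal{B}(\mathbb{R})$,

$$\int_{\mathbb{R}} 1_B \, dL_X = L_X(B) = P\{X \in B\} = \int_\Omega 1_B(X(\omega)) \, dP(\omega),$$

which is exactly the claimed identity for indicators; Step 2 then follows by linearity of the integral.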

Probability distributions using Lebesgue measure:
Uniform distribution on $[0, 1]$: for $\Omega = [0, 1]$ with $\mathcal{F} = \mathcal{B}([0, 1])$, let $P = m_0$ and $X(\omega) = \omega$. Then $L_X$ is a probability measure because $m_0([0, 1]) = 1$.
Standard normal distribution: for $B \in \mathcal{B}(\mathbb{R})$, let $L_X(B) = \int_B \varphi \, dm_0$ with density $\varphi(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$. To compute $L_X(A)$, one can also use the Riemann integral, $L_X(A) = \int_A \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \, dx$.
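For instance (an illustrative check, not from the slides), the measure $L_X([a, b])$ of an interval under the standard normal can be computed either by quadrature of the density or from the distribution function:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

phi = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

a, b = -1.0, 2.0
via_density, _ = quad(phi, a, b)     # L_X([a, b]) = integral of phi over [a, b]
via_cdf = norm.cdf(b) - norm.cdf(a)  # same measure from the distribution function

print(via_density, via_cdf)  # both ~ 0.8186
```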

Independence:
Definition 1.8: We say that two sets $A, B \in \mathcal{F}$ are independent if $P(A \cap B) = P(A)P(B)$.
Definition 1.9: We say that two $\sigma$-algebras $\mathcal{G}, \mathcal{H} \subseteq \mathcal{F}$ are independent if $P(A \cap B) = P(A)P(B)$ for every $A \in \mathcal{G}$ and $B \in \mathcal{H}$.
Definition 1.10: We say that two random variables, $X$ and $Y$, are independent if the $\sigma$-algebras generated by these random variables are independent, i.e., $\sigma(X)$ and $\sigma(Y)$ are independent.

Independence of two functions: If two random variables, $X$ and $Y$, are independent, then the two random variables $g(X)$ and $h(Y)$ are also independent.
Proof: First, recall that $\sigma(g(X)) \subseteq \sigma(X)$: for each $B \in \mathcal{B}(\mathbb{R})$ there exists $C \in \mathcal{B}(\mathbb{R})$ (namely $C = g^{-1}(B)$, for measurable $g$) such that $\{g(X) \in B\} = \{X \in C\}$. Therefore $\sigma(g(X)) \subseteq \sigma(X)$. Similarly, $\sigma(h(Y)) \subseteq \sigma(Y)$. Since $\sigma(X)$ and $\sigma(Y)$ are independent, we conclude that $\sigma(g(X))$ and $\sigma(h(Y))$ are independent.

Variance, covariance, and correlation: $\mathrm{Var}(X) = E[(X - E[X])^2]$, $\mathrm{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])]$, and $\rho_{XY} = \mathrm{Cov}(X, Y) / \sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}$. If two random variables $X$ and $Y$ are independent, $E[XY] = E[X]E[Y]$, so $\mathrm{Cov}(X, Y) = 0$. More generally, if $X$ and $Y$ are independent, $E[g(X)h(Y)] = E[g(X)]\,E[h(Y)]$.
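A quick numerical check (illustrative, not from the slides): for independently sampled $X$ and $Y$, the sample covariance is near zero and $E[g(X)h(Y)] \approx E[g(X)]\,E[h(Y)]$.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal(1_000_000)
Y = rng.exponential(1.0, 1_000_000)  # independent of X by construction

print(np.cov(X, Y)[0, 1])            # ~ 0: covariance of independent X, Y
g, h = np.cos(X), Y**2
print((g * h).mean(), g.mean() * h.mean())  # E[g(X)h(Y)] ~ E[g(X)] E[h(Y)]
```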

Goal (in theory and application, for both discrete and continuous settings): understand the important concept of conditional expectation.

Definition of conditional expectation: Given a probability triple $(\Omega, \mathcal{F}, P)$, a sub-$\sigma$-algebra $\mathcal{G}$ of $\mathcal{F}$, and a random variable $X$ on $(\Omega, \mathcal{F}, P)$, the conditional expectation of $X$ given $\mathcal{G}$ is a $\mathcal{G}$-measurable random variable $Y$ satisfying the partial-averaging property $\int_A Y \, dP = \int_A X \, dP$ for every $A \in \mathcal{G}$. We write $Y = E(X \mid \mathcal{G})$. Conditional expectation always exists if $E|X| < \infty$, and it is unique (up to sets of probability zero).
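On a finite sample space, $E(X \mid \mathcal{G})$ is just the partial average of $X$ over each atom of $\mathcal{G}$; the sketch below (a toy four-point space with illustrative names, not from the slides) computes it and verifies partial averaging on one atom.

```python
# Toy space: four outcomes, G generated by the partition {a, b} | {c, d}.
P = {"a": 0.1, "b": 0.2, "c": 0.3, "d": 0.4}
X = {"a": 1.0, "b": 5.0, "c": 2.0, "d": 10.0}
atoms = [{"a", "b"}, {"c", "d"}]   # atoms of the sub-sigma-algebra G

def cond_exp(X, P, atoms):
    """E(X | G): constant on each atom, equal to the P-weighted average there."""
    Y = {}
    for A in atoms:
        pA = sum(P[w] for w in A)
        avg = sum(X[w] * P[w] for w in A) / pA
        Y.update({w: avg for w in A})
    return Y

Y = cond_exp(X, P, atoms)
# Partial-averaging check on the atom {a, b}: both integrals are 1.1
A = {"a", "b"}
print(sum(Y[w] * P[w] for w in A), sum(X[w] * P[w] for w in A))
```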

Illustrative example: a binomial process given by 3 coin tosses. $S_k$ is the stock price at time $k$; $p$ is the probability of $H$ and $q = 1 - p$ is the probability of $T$; $\Omega = \{HHH, HHT, HTH, HTT, \dots\}$; $\mathcal{F}$ is a $\sigma$-algebra of subsets of $\Omega$; the coin tosses are independent; and $\mathcal{F}_0 \subseteq \mathcal{F}_1 \subseteq \mathcal{F}_2 \subseteq \mathcal{F}_3$ is the filtration generated by the tosses.

Expectation and partial averages: $E[X] = \int_\Omega X \, dP$, where $X$ is a random variable on $(\Omega, \mathcal{F}, P)$. We will compute a partial average of $X$ on a sub-$\sigma$-algebra of $\mathcal{F}$ in the 3-coin-toss example.

Let $X = S_3(\omega)$ and $\mathcal{G} = \sigma(S_2) = \{\emptyset, \Omega, \{HHH, HHT\}, \{HTH, HTT, THH, THT\}, \{TTH, TTT\}$, and unions of these sets$\}$, where the price moves up by the factor $u$ on a head and down by $d$ on a tail, so the three non-trivial generating sets are $\{S_2 = u^2 S_0\}$, $\{S_2 = u d S_0\}$, and $\{S_2 = d^2 S_0\}$.

Let $Y = E(X \mid \sigma(S_2))$, the $\sigma(S_2)$-measurable random variable equal on each atom of $\sigma(S_2)$ to the partial average of $X = S_3$ there. Similarly, one can show that the partial-averaging property $\int_A Y \, dP = \int_A X \, dP$ holds for every set $A$ in $\sigma(S_2)$. We also write $E(X \mid S_2)$ instead of $E(X \mid \sigma(S_2))$.

N coin tosses: Start from $S_0$ (deterministic). Between $t = k$ and $t = k + 1$, the price moves from $S_k$ to $uS_k$ with probability $p$ (the probability of $H$) or to $dS_k$ with probability $q = 1 - p$ (the probability of $T$). $\Omega$ is the sample space of toss sequences and $\mathcal{F}$ a $\sigma$-algebra of subsets of $\Omega$; the coin tosses are independent; $\mathcal{F}_0 \subseteq \mathcal{F}_1 \subseteq \cdots \subseteq \mathcal{F}_N$ is the filtration. We compute $E(S_{k+1} \mid \mathcal{F}_k)$.

First of all, note that if $\omega_{k+1}$ denotes the $(k+1)$-th entry of the toss sequence $\omega$, there is always $S_{k+1}(\omega) = u S_k(\omega)$ when $\omega_{k+1} = H$, and $S_{k+1}(\omega) = d S_k(\omega)$ when $\omega_{k+1} = T$.

Let $Y$ be the $\mathcal{F}_k$-measurable random variable given by the partial averages of $S_{k+1}$ over the atoms of $\mathcal{F}_k$; this is the conditional expectation (denoted by $E(S_{k+1} \mid \mathcal{F}_k)$).

Conditional expectation: between $t = k$ and $t = k + 1$, $S_k$ moves to $uS_k$ with probability $p$ or to $dS_k$ with probability $q$. In this case, $E(S_{k+1} \mid \mathcal{F}_k) = p u S_k + q d S_k = (pu + qd) S_k$ becomes a function of $S_k$, which gives an estimate of $S_{k+1}$ based on the information of $S_k$.
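A brute-force check of this result (an illustrative sketch with made-up parameters, not from the slides): enumerate all toss sequences, compute the partial average of $S_{k+1}$ over each atom of $\mathcal{F}_k$, and compare it with $(pu + qd)S_k$.

```python
from itertools import product

u, d, p, S0, N, k = 2.0, 0.5, 0.6, 4.0, 3, 1  # illustrative parameters
q = 1.0 - p

def S(omega, t):
    """Stock price after the first t tosses of omega."""
    s = S0
    for toss in omega[:t]:
        s *= u if toss == "H" else d
    return s

def prob(omega):
    """P of a full toss sequence: independent tosses."""
    return p ** omega.count("H") * q ** omega.count("T")

# Each atom of F_k collects the sequences sharing the same first k tosses.
for prefix in product("HT", repeat=k):
    atom = [prefix + tail for tail in product("HT", repeat=N - k)]
    pA = sum(prob(w) for w in atom)
    partial_avg = sum(S(w, k + 1) * prob(w) for w in atom) / pA
    print("".join(prefix), partial_avg, (p * u + q * d) * S(prefix, k))
    # both columns agree: E(S_{k+1} | F_k) = (pu + qd) S_k on every atom
```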