CLT and Lévy Stable Processes John Rundle Econophysics PHYS 250

Lecture 12: CLT and Lévy Stable Processes John Rundle Econophysics PHYS 250

Central Limit Theorem https://en.wikipedia.org/wiki/Central_limit_theorem In probability theory, the central limit theorem (CLT) establishes that, for the most commonly studied scenarios, when independent random variables are added, their sum tends toward a normal distribution (commonly known as a bell curve) even if the original variables themselves are not normally distributed. In more precise terms, given certain conditions, the arithmetic mean of a sufficiently large number of iterates of independent random variables, each with a well-defined (finite) expected value and finite variance, will be approximately normally distributed, regardless of the underlying distribution. The theorem is a key concept in probability theory because it implies that probabilistic and statistical methods that work for normal distributions can be applicable to many problems involving other types of distributions.

Central Limit Theorem https://en.wikipedia.org/wiki/Central_limit_theorem To illustrate the meaning of the theorem, suppose that a sample is obtained containing a large number of observations, each observation being randomly generated in a way that does not depend on the values of the other observations, and that the arithmetic average of the observed values is computed. If this procedure is performed many times, the central limit theorem says that the computed values of the average will be distributed according to the normal distribution (commonly known as a "bell curve"). A simple example of this is that if one flips a coin many times, the probability of getting a given number of heads in a series of flips should follow a normal curve, with mean equal to half the total number of flips in each series.
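The coin-flip example can be checked directly by simulation. A minimal sketch (NumPy; the flip count, number of series, and seed are arbitrary choices):

```python
# Simulate many series of 1000 coin flips and check that the number of heads
# per series is approximately normal with mean n/2 and sd sqrt(n)/2.
import numpy as np

rng = np.random.default_rng(0)
n_flips, n_series = 1000, 50_000
heads = rng.binomial(n_flips, 0.5, size=n_series)  # heads in each series

print(heads.mean())  # ≈ 500 = n_flips / 2
print(heads.std())   # ≈ 15.8 = sqrt(n_flips) / 2

# Standardized counts behave like a standard normal: about 95% fall in ±1.96.
z = (heads - n_flips / 2) / (np.sqrt(n_flips) / 2)
print(np.mean(np.abs(z) < 1.96))
```

The histogram of `heads` is discrete (binomial), but at this flip count it is visually indistinguishable from the normal curve the CLT predicts.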

Central Limit Theorem https://en.wikipedia.org/wiki/Central_limit_theorem

Central Limit Theorem http://www.slideshare.net

Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution In probability theory, a distribution or a random variable is said to be stable if a linear combination of two independent copies of the variable has the same distribution, up to location and scale parameters. The stable distribution family is also sometimes referred to as the Lévy alpha-stable distribution, after Paul Lévy, the first mathematician to have studied it. Of the four parameters defining the family, most attention has been focused on the stability parameter, α (see following). Stable distributions have 0 < α ≤ 2, with the upper bound corresponding to the normal distribution, and α = 1 to the Cauchy distribution. The distributions have undefined variance for α < 2, and undefined mean for α ≤ 1. The importance of stable probability distributions is that they are "attractors" for properly normed sums of independent and identically distributed (iid) random variables.
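The undefined mean for α ≤ 1 has a concrete consequence that is easy to demonstrate: averaging Cauchy (α = 1) data does not help, because the mean of n iid standard Cauchy variables is again exactly standard Cauchy. A sketch (sample sizes and seed are arbitrary):

```python
# The spread of 1000 sample means (each over n = 1000 Cauchy draws) matches
# the spread of a single standard Cauchy draw: the IQR stays near 2.
import numpy as np

rng = np.random.default_rng(1)
means = rng.standard_cauchy(size=(1000, 1000)).mean(axis=1)

q25, q75 = np.percentile(means, [25, 75])
print(q75 - q25)  # ≈ 2, the IQR of a standard Cauchy: averaging did not concentrate
```

For a finite-variance distribution the IQR of the means would shrink like 1/sqrt(n); here it does not shrink at all.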

Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution A non-degenerate distribution is a stable distribution if it satisfies the following property: Let X1 and X2 be independent copies of a random variable X. Then X is said to be stable if for any constants a > 0 and b > 0 the random variable aX1 + bX2 has the same distribution as cX + d for some constants c > 0 and d. The distribution is said to be strictly stable if this holds with d = 0. Since the normal distribution, the Cauchy distribution, and the Lévy distribution all have the above property, it follows that they are special cases of stable distributions. Such distributions form a four-parameter family of continuous probability distributions parametrized by location and scale parameters μ and c, respectively, and two shape parameters β and α, roughly corresponding to measures of asymmetry and concentration, respectively (see the figures).
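The defining property can be checked numerically in the Cauchy case, which is strictly stable with α = 1: taking a = b = 1 gives c = a + b = 2 and d = 0. A sketch comparing empirical quantiles (sample size and seed are arbitrary):

```python
# For independent standard Cauchy X1, X2, the sum X1 + X2 should have the
# same distribution as 2X; compare empirical quantiles of the two.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
x1, x2, x = (rng.standard_cauchy(n) for _ in range(3))

q = [10, 25, 50, 75, 90]
print(np.percentile(x1 + x2, q))  # close to the quantiles of 2X below
print(np.percentile(2 * x, q))
```

The same check with a = b = 1 on normal samples would give c = sqrt(2) instead, since for α-stable distributions c^α = a^α + b^α.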

Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution

Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution By the classical central limit theorem, the properly normed sum of a set of random variables, each with finite variance, will tend towards a normal distribution as the number of variables increases. Without the finite variance assumption, the limit may be a stable distribution that is not normal. Mandelbrot referred to such distributions as "stable Paretian distributions", after Vilfredo Pareto. In particular, he referred to those maximally skewed in the positive direction with 1 < α < 2 as "Pareto-Lévy distributions", which he regarded as better descriptions of stock and commodity prices than normal distributions.

Fourier Transforms https://en.wikipedia.org/wiki/Fourier_transform The Fourier transform decomposes a function of time (a signal) into the frequencies that make it up, in a way similar to how a musical chord can be expressed as the frequencies (or pitches) of its constituent notes. The Fourier transform of a function of time is itself a complex-valued function of frequency, whose absolute value represents the amount of that frequency present in the original function, and whose complex argument is the phase offset of the basic sinusoid at that frequency. The Fourier transform is called the frequency domain representation of the original signal.

Fourier Transforms https://en.wikipedia.org/wiki/Fourier_transform The term Fourier transform refers to both the frequency domain representation and the mathematical operation that associates the frequency domain representation to a function of time. The Fourier transform is not limited to functions of time, but in order to have a unified language, the domain of the original function is commonly referred to as the time domain. For many functions of practical interest, one can define an operation that reverses this: the inverse Fourier transformation of a frequency domain representation combines the contributions of all the different frequencies to recover the original function of time.
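The decomposition and its inverse are easy to see numerically. A sketch with a two-tone signal (NumPy's FFT; the sampling rate and tone frequencies are arbitrary choices):

```python
# Decompose a signal made of 50 Hz and 120 Hz tones, read off the two
# dominant frequencies, then recover the signal with the inverse transform.
import numpy as np

fs = 1000                        # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)      # 1 second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)               # frequency-domain representation
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
peaks = sorted(float(f) for f in freqs[np.argsort(np.abs(spectrum))[-2:]])
print(peaks)  # [50.0, 120.0], the two constituent tones

recovered = np.fft.irfft(spectrum, n=len(signal))
print(np.allclose(recovered, signal))  # True: the inverse recovers the signal
```

The magnitude of `spectrum` plays the role of "amount of each frequency present" described above; the phase information is what lets the inverse transform rebuild the original waveform exactly.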

Fourier Transforms https://en.wikipedia.org/wiki/Fourier_transform

Fourier Transforms https://en.wikipedia.org/wiki/Fourier_transform

Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution

Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution

Infinitely Divisible Processes https://en.wikipedia.org/wiki/Infinite_divisibility_(probability) In probability theory, a probability distribution is infinitely divisible if it can be expressed as the probability distribution of the sum of an arbitrary number of independent and identically distributed random variables. The characteristic function of any infinitely divisible distribution is then called an infinitely divisible characteristic function. More rigorously, the probability distribution F is infinitely divisible if, for every positive integer n, there exist n independent identically distributed random variables Xn1, ..., Xnn whose sum Sn = Xn1 + … + Xnn has the distribution F.
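The definition can be illustrated with the Poisson case: a Poisson(λ) variable decomposes, for every n, into a sum of n iid Poisson(λ/n) variables. A sketch (λ = 4, n = 10, and the seed are arbitrary choices):

```python
# Compare draws from Poisson(4) against sums of 10 iid Poisson(0.4) draws:
# both should show mean ≈ variance ≈ 4, as for a Poisson(4) distribution.
import numpy as np

rng = np.random.default_rng(3)
lam, n, m = 4.0, 10, 100_000

direct = rng.poisson(lam, size=m)                       # Poisson(λ)
as_sum = rng.poisson(lam / n, size=(m, n)).sum(axis=1)  # sum of n Poisson(λ/n)

print(direct.mean(), as_sum.mean())  # both ≈ 4
print(direct.var(), as_sum.var())    # both ≈ 4
```

In fact the two samples agree in distribution, not just in mean and variance, because sums of independent Poisson variables are exactly Poisson with the summed rate.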

Infinitely Divisible Processes https://en.wikipedia.org/wiki/Infinite_divisibility_(probability) Every infinitely divisible probability distribution corresponds in a natural way to a Lévy process. A Lévy process is a stochastic process { Lt : t ≥ 0 } with stationary independent increments. Stationary means that for s < t, the probability distribution of Lt − Ls depends only on t − s; independent increments means that the difference Lt − Ls is independent of the corresponding difference on any interval not overlapping with [s, t], and similarly for any finite number of mutually non-overlapping intervals.
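The two increment properties can be checked on a simulated Poisson process, one of the basic Lévy processes; the rate, step size, and horizon below are arbitrary choices:

```python
# Increments of a rate-3 Poisson process over grid steps of length dt = 0.01:
# equal-length increments share one distribution (stationarity), and
# non-overlapping increments are independent (here checked as uncorrelated).
import numpy as np

rng = np.random.default_rng(4)
rate, dt, n = 3.0, 0.01, 200_000
inc = rng.poisson(rate * dt, size=n)  # L_{t+dt} - L_t on the grid
path = np.cumsum(inc)                 # the process L_t itself

print(inc[: n // 2].mean(), inc[n // 2 :].mean())  # both ≈ rate * dt = 0.03
print(np.corrcoef(inc[:-1], inc[1:])[0, 1])        # ≈ 0
print(path[-1])                                    # ≈ rate * n * dt = 6000
```

Zero correlation alone does not prove independence, but for a process built from iid increments it is the simplest numerical signature of the property.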

Infinitely Divisible Processes https://en.wikipedia.org/wiki/Infinite_divisibility_(probability) The Poisson distribution, the negative binomial distribution, the Gamma distribution and the degenerate distribution are examples of infinitely divisible distributions, as are the normal distribution, the Cauchy distribution and all other members of the stable distribution family. The uniform distribution and the binomial distribution are not infinitely divisible, nor is any other distribution with bounded (finite) support. The Student's t-distribution is infinitely divisible, while the distribution of the reciprocal of a random variable having a Student's t-distribution is not.

Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution A generalized central limit theorem Another important property of stable distributions is the role that they play in a generalized central limit theorem. The central limit theorem states that the sum of a number of independent and identically distributed (i.i.d.) random variables with finite variances will tend to a normal distribution as the number of variables grows. A generalization due to Gnedenko and Kolmogorov states that the sum of a number of random variables with symmetric distributions having power-law tails (Paretian tails), decreasing as |x|^(−α−1) where 0 < α < 2 (and therefore having infinite variance), will tend to a stable distribution f(x; α, 0, c, 0) as the number of summands grows. If α > 2, then the sum converges to a stable distribution with stability parameter equal to 2, i.e. a Gaussian distribution.
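The contrast can be sketched by simulation: symmetric Pareto tails with α = 1.5 (infinite variance) versus finite-variance uniforms (sample sizes and seed are arbitrary choices):

```python
# Sums of n heavy-tailed variables normed by n**(1/alpha) keep fat tails;
# sums of n finite-variance variables normed by sqrt(n) do not.
import numpy as np

rng = np.random.default_rng(5)
alpha, n, m = 1.5, 100, 20_000

# Symmetric Pareto: |X| = U**(-1/alpha), so P(|X| > x) = x**(-alpha) for x >= 1.
u = rng.random((m, n))
sign = rng.choice([-1.0, 1.0], size=(m, n))
heavy = (sign * u ** (-1 / alpha)).sum(axis=1) / n ** (1 / alpha)

light = (rng.random((m, n)) - 0.5).sum(axis=1) / np.sqrt(n)

print(np.mean(np.abs(heavy) > 10))  # a few percent survive far out in the tail
print(np.mean(np.abs(light) > 10))  # 0.0: the Gaussian limit has no such tail
```

The persistent tail in `heavy` is the signature of convergence to a non-Gaussian stable law with the same α = 1.5, rather than to a normal distribution.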

Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution

Lévy Stable Processes - Phase Diagram https://en.wikipedia.org/wiki/Stable_distribution From: Martin Sewell, Characterization of Financial Time Series, UCL Research Note RN/11/01, January 20, 2011

Comparison to Data http://finance.martinsewell.com

Comparison to Data http://finance.martinsewell.com "PDF of returns for the Shanghai market data with Δt = 1 (daily returns). This plot is compared to a stable symmetric Lévy distribution using the value α = 1.44 determined from the slope [in a log-log plot of the central peak of the PDF as a function of the time increment]. Two attempts to fit a Gaussian are also shown. The wider Gaussian is chosen to have the same standard deviation as the empirical data. However, the peak in the data is much narrower and higher than this Gaussian, and the tails are fatter. The narrower Gaussian is chosen to fit the central portion; however, the standard deviation is now too small. It can be seen that the data has tails which are much fatter and furthermore have a non-Gaussian functional dependence." Johnson, Jefferies and Hui (2003)