Lecture 2: Basics of Probability in Statistical Simulation and Stochastic Programming
Leonidas Sakalauskas, Institute of Mathematics and Informatics, Vilnius, Lithuania
EURO Working Group on Continuous Optimization

Content
- Random variables and random functions
- Law of large numbers
- Central limit theorem
- Computer simulation of random numbers
- Estimation of multivariate integrals by the Monte-Carlo method

Simple remark
Probability theory provides a library of mathematical probabilistic models.
Statistics tells us how to choose the probabilistic model consistent with the collected data.
Statistical simulation (the Monte-Carlo method) shows how to simulate a random environment on a computer.

Random variable
A random variable X is described by:
- a set of support;
- a probability measure.
The probability measure is described by the distribution function F(x) = P(X ≤ x).

Probability measure
A probability measure has three components:
- continuous;
- discrete (integer);
- singular.

Continuous r.v.
A continuous r.v. is described by a probability density function p(x) ≥ 0 with ∫ p(x) dx = 1. Thus the distribution function is F(x) = ∫_{-∞}^{x} p(y) dy.

Continuous variable
If the probability measure is absolutely continuous, the expected value of a random function f(X) is the integral E f(X) = ∫ f(x) p(x) dx.
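For a concrete check (not part of the original slides; the integrand f(x) = x² and the standard normal density are illustrative assumptions), a short Python sketch evaluates E f(X) as an integral numerically:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# E f(X) for the illustrative choice f(x) = x**2 and X ~ N(0, 1):
# the integral of x**2 * phi(x) over the real line equals 1.
value, abs_error = quad(lambda x: x ** 2 * norm.pdf(x), -np.inf, np.inf)
print(f"E f(X) = {value:.6f} (estimated numerical error {abs_error:.1e})")
```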

Discrete variable
A discrete r.v. is described by mass probabilities: p_i = P(X = x_i), i = 1, 2, ..., with Σ_i p_i = 1.

Discrete variable
If the probability measure is discrete, the expected value of a random function is a sum or series: E f(X) = Σ_i f(x_i) p_i.
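A quick illustrative sketch (the fair die and f(x) = x² are assumed examples, not from the slides) of the discrete expectation as a weighted sum:

```python
# Fair six-sided die as an illustrative discrete r.v.:
# x_i = 1, ..., 6, each with mass probability p_i = 1/6.
values = range(1, 7)
p = 1 / 6

expected_f = sum(x ** 2 * p for x in values)   # E f(X) for f(x) = x**2
print(expected_f)                              # 91/6 = 15.1666...
```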

Singular variable
The probability measure of a singular r.v. is concentrated on a set of zero Lebesgue measure (say, the Cantor set).

Law of Large Numbers (Chebyshev, Kolmogorov)
The sampling average (1/N) Σ_{j=1}^{N} X_j converges to the expectation E X with probability 1 as N → ∞; here X_1, X_2, ..., X_N are independent copies of the r.v. X.

What did we learn?
The integral ∫ f(x) p(x) dx is approximated by the sampling average (1/N) Σ_{j=1}^{N} f(X_j) if the sample size N is large; here X_1, X_2, ..., X_N is a sample of independent copies of the r.v. X, distributed with the density p(x). A minimal sketch of this idea is given below.
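A minimal Python sketch, assuming the illustrative choices f(x) = x² and X ~ N(0, 1), so the exact expectation equals 1:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def f(x):
    # Illustrative random function: for X ~ N(0, 1), E f(X) = E X**2 = 1.
    return x ** 2

for N in (10**2, 10**4, 10**6):
    sample = rng.standard_normal(N)   # independent copies of X with density p(x)
    estimate = f(sample).mean()       # sampling average (1/N) * sum of f(X_j)
    print(f"N = {N:>7d}: estimate = {estimate:.4f} (exact value 1.0)")
```

As N grows, the estimates stabilize around the exact value, which is precisely the LLN statement above.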

Central limit theorem (Gauss, Lindeberg, ...)
√N · ((1/N) Σ_{j=1}^{N} f(X_j) - E f(X)) converges in distribution to the normal law N(0, σ²) as N → ∞; here σ² = Var f(X) is assumed finite.

Berry-Esseen theorem
sup_x | P( √N · ((1/N) Σ_{j=1}^{N} f(X_j) - E f(X)) / σ ≤ x ) - Φ(x) | ≤ C·ρ / (σ³ √N), where Φ(x) is the standard normal distribution function, ρ = E |f(X) - E f(X)|³ is the third absolute central moment, and C is an absolute constant.

What did we learn?
According to the LLN, the sampling average converges to the expectation being estimated as N → ∞. Thus, the CLT can be applied to evaluate the statistical error of the approximation and its validity: the error is of order σ/√N, and an approximate 95% confidence interval is the sampling average ± 1.96·s/√N, where s is the sample standard deviation (see the sketch below).
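Continuing the same illustrative example, a minimal sketch of how the CLT supplies a standard error and an approximate 95% confidence interval for the Monte-Carlo estimate:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def f(x):
    return x ** 2    # illustrative integrand, E f(X) = 1 for X ~ N(0, 1)

N = 10_000
values = f(rng.standard_normal(N))

estimate = values.mean()                       # sampling average
std_error = values.std(ddof=1) / np.sqrt(N)    # CLT-based standard error s / sqrt(N)
lower, upper = estimate - 1.96 * std_error, estimate + 1.96 * std_error

print(f"estimate = {estimate:.4f}, standard error = {std_error:.4f}")
print(f"approximate 95% confidence interval: ({lower:.4f}, {upper:.4f})")
```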

Example
Suppose some event occurred n times in N independent repeated experiments. Then an approximate confidence interval for the probability p of the event is n/N ± 1.96·√( (n/N)(1 - n/N) / N ); here 1.96 is the 0.975 quantile of the standard normal distribution, i.e. the confidence level is 95% (significance level 5%). The normal approximation, and hence the interval, is valid only when the Berry-Esseen error bound is small, which requires N to be sufficiently large.
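A short sketch of this interval; the counts N and n below are made-up illustrative numbers:

```python
import numpy as np

N = 1_000    # number of independent experiments (illustrative)
n = 270      # number of times the event occurred (illustrative)

p_hat = n / N                                           # estimated event probability
half_width = 1.96 * np.sqrt(p_hat * (1 - p_hat) / N)    # 1.96 = 0.975 normal quantile

print(f"estimate = {p_hat:.3f}")
print(f"approximate 95% confidence interval: "
      f"({p_hat - half_width:.3f}, {p_hat + half_width:.3f})")
```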

Statistical integration ... ???
Main idea: play out a large number of random events on a computer and use their average behaviour to compute the integral.

Statistical integration
The multivariate integral ∫ f(x) p(x) dx over x ∈ Rⁿ is estimated by the sampling average (1/N) Σ_{j=1}^{N} f(X_j), where the vectors X_1, ..., X_N are simulated with the density p(x). A multivariate sketch follows below.
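To make the multivariate case concrete, a sketch under illustrative assumptions (two-dimensional integrand x·y, uniform density on the unit square):

```python
import numpy as np

rng = np.random.default_rng(seed=2)

def f(x, y):
    # Illustrative integrand; the exact integral of x*y over the unit square is 0.25.
    return x * y

N = 100_000
x, y = rng.random(N), rng.random(N)   # uniform density on the unit square
estimate = f(x, y).mean()             # sampling average approximates the integral

print(f"Monte-Carlo estimate = {estimate:.4f} (exact value 0.25)")
```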

Statistical simulation and the Monte-Carlo method (Shapiro (1985), etc.)

Simulation of random variables
There are many techniques and methods to simulate r.v. Let the r.v. U be uniformly distributed in the interval (0, 1]. Then the random variable X = F⁻¹(U), where F⁻¹ is the inverse of the cumulative distribution function F, is distributed with the cumulative distribution function F (the inverse transform method; a sketch follows below).
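A minimal sketch of the inverse transform idea, assuming an illustrative target distribution F(x) = 1 - exp(-λx) (exponential), for which F⁻¹(u) = -ln(1 - u)/λ:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
lam = 2.0                    # illustrative rate parameter of the exponential law

N = 100_000
u = rng.random(N)            # U uniform on [0, 1); 1 - u lies in (0, 1]
x = -np.log(1.0 - u) / lam   # X = F^{-1}(U) for F(x) = 1 - exp(-lam * x)

print(f"sample mean = {x.mean():.4f} (exact 1/lam = {1 / lam:.4f})")
print(f"sample variance = {x.var(ddof=1):.4f} (exact 1/lam**2 = {1 / lam ** 2:.4f})")
```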

[Figure: simulation results for sample sizes N = 100 and N = 1000.]

Wrap-up and conclusions
- The expectations of random functions, defined by multivariate integrals, can be approximated by sampling averages according to the LLN, if the sample size is sufficiently large.
- The CLT can be applied to evaluate the reliability and statistical error of this approximation.