Intensive Actuarial Training for Bulgaria January 2007 Lecture 0 – Review on Probability Theory By Michael Sze, PhD, FSA, CFA.

Topics Covered
– Some definitions and properties
– Moment generating functions
– Some common probability distributions
– Conditional probability
– Properties of expectations

Some Definitions and Properties
Cumulative distribution function F(x):
– F is non-decreasing: a < b ⇒ F(a) ≤ F(b)
– lim_{b→∞} F(b) = 1
– lim_{a→−∞} F(a) = 0
– F is right continuous: b_n ↓ b ⇒ lim_{n→∞} F(b_n) = F(b)
Expectation of a discrete r.v.: E[X] = Σ_x x p(x) = μ, where p(x) = P(X = x)
– E[g(X)] = Σ_i g(x_i) p(x_i)
– E[aX + b] = a E[X] + b
– E[X²] = Σ_i x_i² p(x_i)
– Var(X) = E[(X − μ)²] = E[X²] − (E[X])²
– Var(aX + b) = a² Var(X)
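
These identities are easy to check numerically. Here is a minimal Python sketch (the fair-die pmf is an illustrative choice, not from the slides) computing E[X], E[X²], and Var(X) from a discrete pmf:

import numpy as np

# pmf of a fair six-sided die (illustrative example)
x = np.arange(1, 7)
p = np.full(6, 1 / 6)

mean = np.sum(x * p)     # E[X] = sum of x p(x)
ex2 = np.sum(x**2 * p)   # E[X^2]
var = ex2 - mean**2      # Var(X) = E[X^2] - (E[X])^2
print(mean, var)         # 3.5  2.9166...

# scaling rule: Var(aX + b) = a^2 Var(X)
a, b = 2.0, 7.0
print(np.sum((a * x + b) ** 2 * p) - (a * mean + b) ** 2, a**2 * var)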

Moment Generating Functions
Definition: mgf M_X(t) = E[e^(tX)]
Properties:
– There is a 1–1 correspondence between f(x) and M_X(t)
– X, Y independent r.v. ⇒ M_{X+Y}(t) = M_X(t) · M_Y(t)
– X_1, …, X_n independent ⇒ M_{Σ_i X_i}(t) = Π_i M_{X_i}(t)
– The mgf of a mixture density f_1 + f_2 + f_3 is M_{X_1}(t) + M_{X_2}(t) + M_{X_3}(t)
– M′_X(0) = E[X]
– M_X^(n)(0) = E[X^n]
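
The derivative properties can be verified symbolically. A sketch with sympy, using the exponential mgf λ/(λ − t) derived later in these slides:

import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M = lam / (lam - t)                # mgf of an exponential r.v.

EX = sp.diff(M, t).subs(t, 0)      # M'(0)  -> 1/lambda = E[X]
EX2 = sp.diff(M, t, 2).subs(t, 0)  # M''(0) -> 2/lambda^2 = E[X^2]
print(EX, EX2, sp.simplify(EX2 - EX**2))  # variance 1/lambda^2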

Some Common Discrete Probability Distributions
– Binomial random variable (r.v.) with parameters (n, p)
– Poisson r.v. with parameter λ
– Geometric r.v. with parameter p
– Negative binomial r.v. with parameters (r, p)

Some Common Continuous Probability Distributions
– Uniform r.v. on (a, b)
– Normal r.v. with parameters (μ, σ²)
– Exponential r.v. with parameter λ
– Gamma r.v. with parameters (s, λ), s, λ > 0

Binomial r.v. B(n, p)
– n is an integer, 0 ≤ p ≤ 1, q = 1 − p
– Probability of getting i heads in n trials: p(i) = C(n, i) p^i q^(n−i)
– E[X] = np
– Var(X) = npq
– M_X(t) = (p e^t + q)^n
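
A quick numerical sanity check of these formulas with scipy (n, p, and t are arbitrary illustrative values):

import numpy as np
from scipy import stats

n, p = 10, 0.3
q = 1 - p
X = stats.binom(n, p)

print(X.mean(), n * p)     # both 3.0
print(X.var(), n * p * q)  # both 2.1

# mgf check: E[e^(tX)] vs (p e^t + q)^n at t = 0.5
t = 0.5
i = np.arange(n + 1)
print(np.sum(np.exp(t * i) * X.pmf(i)), (p * np.exp(t) + q) ** n)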

Poisson r.v. with parameter λ
– λ > 0 is the expected number of events
– Poisson is a good approximation of the binomial for large n and small p, taking λ ≈ np
– p(i) = P(X = i) = e^(−λ) λ^i / i!
– E[X] = Var(X) = λ
– M_X(t) = exp[λ(e^t − 1)]
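
The binomial-to-Poisson approximation is easy to see numerically (a sketch with arbitrary n and p):

import numpy as np
from scipy import stats

n, p = 1000, 0.003  # large n, small p
lam = n * p         # lambda = 3

i = np.arange(11)
# the two pmfs agree to about 3 decimal places
print(np.max(np.abs(stats.binom.pmf(i, n, p) - stats.poisson.pmf(i, lam))))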

Geometric r.v. with parameter p
– 0 ≤ p ≤ 1 is the probability of success in one trial
– A geometric r.v. counts the number of trials needed to obtain the first success
– p(n) = P(X = n) = q^(n−1) p
– E[X] = 1/p
– Var(X) = q/p²
– M_X(t) = p e^t / (1 − q e^t)
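
scipy's geom uses this same "number of trials" convention, so the formulas, including memorylessness, can be checked directly (a sketch):

from scipy import stats

p = 0.2
q = 1 - p
X = stats.geom(p)

print(X.mean(), 1 / p)    # both 5.0
print(X.var(), q / p**2)  # both 20.0

# memoryless: P(X = n + k | X > n) = P(X = k)
n, k = 3, 4
print(X.pmf(n + k) / X.sf(n), X.pmf(k))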

Negative Binomial r.v. with parameters (r, p)
– p = probability of success in each trial
– r = number of successes wanted
– A negative binomial r.v. counts the number of trials needed to obtain the r-th success
– p(n) = P(X = n) = C(n−1, r−1) q^(n−r) p^r
– E[X] = r/p
– Var(X) = rq/p²
– M_X(t) = [p e^t / (1 − q e^t)]^r
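
One caveat when checking this in scipy: stats.nbinom counts failures before the r-th success, so the slide's trial count is that variable shifted by r (a sketch; r and p arbitrary):

from math import comb
from scipy import stats

r, p = 3, 0.4
q = 1 - p
Y = stats.nbinom(r, p)  # Y = failures before the r-th success; X = Y + r

print(Y.mean() + r, r / p)    # both 7.5
print(Y.var(), r * q / p**2)  # both 11.25 (the shift leaves variance unchanged)

# P(X = n) = C(n-1, r-1) q^(n-r) p^r, e.g. n = 5
n = 5
print(Y.pmf(n - r), comb(n - 1, r - 1) * q ** (n - r) * p**r)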

Uniform r.v. on (a, b)
– f(x) = 1/(b − a) for a < x < b, 0 otherwise
– F(c) = 0 for c ≤ a, (c − a)/(b − a) for a < c < b, and 1 for c ≥ b
– E[X] = (a + b)/2
– Var(X) = (b − a)²/12
– M_X(t) = (e^(tb) − e^(ta)) / [t(b − a)]
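
A short check of the uniform moments (note scipy parametrizes the uniform by loc = a and scale = b − a):

from scipy import stats

a, b = 2.0, 5.0
X = stats.uniform(loc=a, scale=b - a)

print(X.mean(), (a + b) / 2)     # both 3.5
print(X.var(), (b - a)**2 / 12)  # both 0.75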

Normal r.v. with parameters (μ, σ²)
– By the central limit theorem, many r.v. can be approximated by a normal distribution
– f(x) = [1/√(2πσ²)] exp[−(x − μ)²/(2σ²)]
– E[X] = μ
– Var(X) = σ²
– M_X(t) = exp[μt + σ²t²/2]
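
A minimal simulation of the central limit theorem the slide alludes to (sample sizes are arbitrary):

import numpy as np

rng = np.random.default_rng(0)

# sums of 50 uniform(0, 1) draws are approximately normal
n, trials = 50, 100_000
sums = rng.random((trials, n)).sum(axis=1)

print(sums.mean(), n * 0.5)      # ~25.0 = n E[X]
print(sums.var(), n * (1 / 12))  # ~4.17 = n Var(X)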

Exponential r.v. with parameter λ > 0
– An exponential r.v. X gives the waiting time until the next event happens
– X is memoryless: P(X > s + t | X > t) = P(X > s) for all s, t ≥ 0
– f(x) = λ e^(−λx) for x ≥ 0, 0 otherwise
– F(a) = 1 − e^(−λa) for a ≥ 0
– E[X] = 1/λ
– Var(X) = 1/λ²
– M_X(t) = λ/(λ − t) for t < λ
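
The memoryless property can be verified straight from the survival function (a sketch; λ, s, t arbitrary):

from scipy import stats

lam = 0.5
X = stats.expon(scale=1 / lam)  # scipy uses scale = 1/lambda
s, t = 2.0, 3.0

# P(X > s + t | X > t) = P(X > s + t) / P(X > t)
print(X.sf(s + t) / X.sf(t))  # e^(-lambda s) = 0.3678...
print(X.sf(s))                # same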

Gamma r.v. with parameters (s, λ), s, λ > 0
– A gamma r.v. X gives the waiting time until the s-th event happens; for integer s it is the sum of s independent exponential(λ) waiting times
– f(x) = λ e^(−λx) (λx)^(s−1) / Γ(s) for x ≥ 0, 0 otherwise
– Γ(s) = ∫₀^∞ e^(−y) y^(s−1) dy
– Γ(n) = (n − 1)! for positive integers n; in particular Γ(1) = 0! = 1
– E[X] = s/λ
– Var(X) = s/λ²
– M_X(t) = [λ/(λ − t)]^s for t < λ
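
A simulation sketch of the "sum of s exponentials" interpretation (s and λ arbitrary):

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
s, lam = 4, 2.0

# sum s independent exponential(lambda) waiting times, many times over
sums = rng.exponential(scale=1 / lam, size=(100_000, s)).sum(axis=1)

print(sums.mean(), s / lam)    # both ~2.0
print(sums.var(), s / lam**2)  # both ~1.0

# empirical quantile matches the gamma(s, lambda) quantile
G = stats.gamma(a=s, scale=1 / lam)
print(np.quantile(sums, 0.9), G.ppf(0.9))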

Conditional Probability
Definition: for P(F) > 0, P(E|F) = P(EF)/P(F)
Properties:
– For A_1, …, A_n with A_i ∩ A_j = ∅ for all i ≠ j (mutually exclusive) and ∪_i A_i = S (exhaustive): P(B) = Σ_i P(B|A_i) P(A_i)
– Bayes' Theorem: for P(B) > 0, P(A|B) = [P(B|A) · P(A)] / P(B)
– E[X|A] = Σ_i x_i P(x_i|A)
– Law of total expectation: since ∪_i A_i = S, E[X] = Σ_i E[X|A_i] P(A_i)
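
A small numeric illustration of total probability and Bayes' theorem (the disease-test numbers are made up for illustration):

# P(D) = prevalence; P(+|D), P(+|not D) = test behavior
p_d = 0.01
p_pos_d = 0.95
p_pos_nd = 0.05

# total probability: P(+) = P(+|D) P(D) + P(+|not D) P(not D)
p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)

# Bayes: P(D|+) = P(+|D) P(D) / P(+)
print(p_pos, p_pos_d * p_d / p_pos)  # 0.059  ~0.161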

Properties of Expectation
– E[X + Y] = E[X] + E[Y]
– E[Σ_i X_i] = Σ_i E[X_i]
– If X, Y are independent, then E[g(X) h(Y)] = E[g(X)] E[h(Y)]
– Definition: Cov(X, Y) = E[(X − E[X])(Y − E[Y])]
– Cov(X, Y) = Cov(Y, X)
– Cov(X, X) = Var(X)
– Cov(aX, Y) = a Cov(X, Y)
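
A quick numerical check of the covariance properties on simulated data (a sketch):

import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=100_000)
Y = 0.5 * X + rng.normal(size=100_000)  # correlated with X
a = 3.0

def cov(u, v):
    return np.mean((u - u.mean()) * (v - v.mean()))

print(cov(a * X, Y), a * cov(X, Y))  # equal up to sampling noise
print(cov(X, X), X.var())            # Cov(X, X) = Var(X)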

Properties of Expectation (continued)
– Cov(Σ_i X_i, Σ_j Y_j) = Σ_i Σ_j Cov(X_i, Y_j)
– Var(Σ_i X_i) = Σ_i Var(X_i) + Σ_i Σ_{j≠i} Cov(X_i, X_j)
– If S_N = X_1 + … + X_N is a compound process where
  – the X_i are mutually independent,
  – the X_i are independent of N, and
  – the X_i have the same distribution,
then E[S_N] = E[N] E[X] and Var(S_N) = E[N] Var(X) + Var(N) (E[X])²
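
A simulation sketch of the compound-sum formulas, using a Poisson claim count and exponential claim sizes (an illustrative choice, not specified in the slides):

import numpy as np

rng = np.random.default_rng(2)
lam, mu = 5.0, 2.0  # E[N] = Var(N) = lam; E[X] = mu, Var(X) = mu^2
trials = 100_000

N = rng.poisson(lam, size=trials)
S = np.array([rng.exponential(mu, size=n).sum() for n in N])

print(S.mean(), lam * mu)                  # E[S_N] = E[N] E[X], both ~10
print(S.var(), lam * mu**2 + lam * mu**2)  # E[N]Var(X) + Var(N)E[X]^2, ~40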