
Published by Brenda Waithe. Modified over 2 years ago.

Slide 1: Continuous random variables
- Let X be a continuous random variable.
- X takes on values in the real line (-infinity, +infinity), or in some interval (lower bound, upper bound).
- Instead of using P(X = i), use the probability density function f_X(t), or f_X(t) dt.

Slide 2: Cumulative distribution function of a continuous r.v.
- F_X(a) = P(X <= a) = integral of f_X(t) dt from -infinity to a.

Slide 3: Distribution function: properties
- F_X is non-decreasing, with F_X(-infinity) = 0 and F_X(+infinity) = 1.
- Properties of the pdf: f_X(t) >= 0 for all t, and the integral of f_X(t) dt over the whole line equals 1.
- Where F_X is differentiable, f_X(t) = dF_X(t)/dt.

Slide 4: Uniform random variable
- X is uniform on (a, b) if f_X(t) = 1/(b - a) for a < t < b, and 0 otherwise.
- E[X] = (a + b)/2.
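As a quick sanity check of the uniform pdf above, a simulation sketch (the endpoints a, b, the seed, and the sample size are illustrative assumptions, not from the slides):

```python
import random

# Illustrative check of Uniform(a, b): pdf f(t) = 1/(b-a) on (a, b),
# so the mean is (a+b)/2. The endpoints and sample size are assumptions.
random.seed(4)
a, b = 2.0, 5.0
n = 200_000

samples = [random.uniform(a, b) for _ in range(n)]
sample_mean = sum(samples) / n           # should approach (a+b)/2 = 3.5
```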

Slide 5: Exponential distribution
- The exponential distribution is the foundation of most stochastic processes; it is what makes Markov processes tick.
- It is used to describe the duration of something: CPU service, telephone call duration, or anything you want to model as a service time.

Slide 6: Exponential random variable
- X is exponential with parameter λ if f_X(t) = λ e^(-λt) for t >= 0, and 0 otherwise.
- F_X(t) = 1 - e^(-λt) for t >= 0.
- E[X] = 1/λ, Var(X) = 1/λ^2.
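A minimal numerical sketch that the exponential pdf integrates to the CDF above (the values of lam, t, and the step dt are illustrative assumptions):

```python
import math

# Illustrative check that the Exponential(lam) pdf f(t) = lam*e^(-lam*t)
# accumulates to the CDF F(t) = 1 - e^(-lam*t). lam, t, dt are assumptions.
lam, t = 1.5, 0.8
f = lambda s: lam * math.exp(-lam * s)

dt = 1e-5
riemann = sum(f(i * dt) * dt for i in range(int(round(t / dt))))
closed_form = 1 - math.exp(-lam * t)     # CDF from the slide
```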

Slide 7: Link between Poisson and exponential
- If the arrival process is Poisson, the number of arrivals per time unit follows the Poisson distribution with parameter λ.
- => The inter-arrival time is exponentially distributed with mean 1/λ, the average inter-arrival time.
- (Figure: timeline from 0 to T; the gaps between successive arrivals are exponentially distributed with mean 1/λ.)
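The link on this slide can be sketched by simulation: draw exponential inter-arrival times with rate lam and check that both the mean gap (1/λ) and the mean count per unit interval (λ) come out as stated. The rate, sample size, and seed are illustrative assumptions:

```python
import random

# Sketch: exponential inter-arrival times with rate lam produce, on
# average, lam arrivals per unit-length interval. Parameters are
# illustrative assumptions.
random.seed(1)
lam = 2.0
n_events = 200_000

gaps = [random.expovariate(lam) for _ in range(n_events)]
mean_gap = sum(gaps) / n_events          # approaches 1/lam = 0.5

# Bin the arrival times into unit-length intervals and count arrivals.
counts = [0] * int(sum(gaps))
t = 0.0
for g in gaps:
    t += g
    if int(t) < len(counts):
        counts[int(t)] += 1
mean_count = sum(counts) / len(counts)   # approaches lam = 2.0
```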

Slide 8: Proof

Slide 9: Memoryless property
- P(X > s + t | X > s) = P(X > t) for all s, t >= 0.
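The memoryless property can be verified deterministically from the survival function P(X > x) = e^(-λx); the values of lam, s, and t below are illustrative assumptions:

```python
import math

# Deterministic check of the memoryless property for X ~ Exponential(lam):
# P(X > s+t | X > s) = P(X > t). lam, s, t are illustrative values.
lam, s, t = 1.5, 0.7, 1.2
surv = lambda x: math.exp(-lam * x)      # survival function P(X > x)

lhs = surv(s + t) / surv(s)              # P(X > s+t | X > s)
rhs = surv(t)                            # P(X > t)
```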

Slide 10: Example

Slide 11: Hyper-exponential distributions (H_2, H_n)
- H_2: with probability p the sample is drawn from an exponential with rate λ_1, and with probability 1 - p from an exponential with rate λ_2.
- H_n: with probability p_i the sample is drawn from an exponential with rate λ_i (i = 1, ..., n).
- Advantage: allows a more sophisticated representation of a service time while preserving the exponential distribution, and a good chance of analyzing the problem.
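A minimal sampling sketch for H_2 as described above; the branch probability p and the rates lam1, lam2 are illustrative assumptions:

```python
import random

# Sketch of sampling from H2: with probability p draw Exponential(lam1),
# otherwise Exponential(lam2). p, lam1, lam2 are illustrative parameters.
random.seed(2)
p, lam1, lam2 = 0.3, 4.0, 0.5

def sample_h2():
    lam = lam1 if random.random() < p else lam2
    return random.expovariate(lam)

n = 200_000
sample_mean = sum(sample_h2() for _ in range(n)) / n
true_mean = p / lam1 + (1 - p) / lam2    # mixture mean = 0.3/4 + 0.7/0.5
```

The mixture mean p/λ_1 + (1 - p)/λ_2 follows from conditioning on which branch was taken.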

Slide 12: Further properties of the exponential distribution

Slide 13: Further properties of the exponential distribution (cont'd)
- Let X_1, X_2, ..., X_n be independent r.v.'s, where X_i follows an exponential distribution with parameter λ_i, i.e. f_{X_i}(t) = λ_i e^(-λ_i t).
- Define X = min{X_1, X_2, ..., X_n}; X is also exponentially distributed, with parameter λ_1 + λ_2 + ... + λ_n.
- Proof: f_X(t) = ?
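The minimum property can be sketched by simulation: the sample mean of the minimum should approach 1 over the sum of the rates. The rates, seed, and sample size are illustrative assumptions:

```python
import random

# Sketch: the minimum of independent exponentials with rates lam_i is
# exponential with rate sum(lam_i), so its mean is 1/sum(lam_i).
# The rates and sample size are illustrative assumptions.
random.seed(3)
rates = [1.0, 2.0, 3.0]                  # total rate = 6.0
n = 100_000

mins = [min(random.expovariate(r) for r in rates) for _ in range(n)]
mean_min = sum(mins) / n                 # approaches 1/6
```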

Slide 14: Joint distribution functions
- Discrete case: one variable (pmf) P(X = i); joint distribution P(X_1 = i_1, X_2 = i_2, ..., X_n = i_n).
- Continuous case: one variable (pdf) f_X(t); joint distribution f_{X_1, X_2, ..., X_n}(t_1, t_2, ..., t_n).

Slide 15: Independent random variables
- The random variables X_1, X_2 are said to be independent if, for all a, b: P(X_1 <= a, X_2 <= b) = P(X_1 <= a) P(X_2 <= b).
- Example: green die X_1, red die X_2, X_3 = X_1 + X_2. Are X_3 and X_1 dependent or independent?
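The dice example can be settled exactly by enumeration: one counterexample to the product rule is enough to show dependence. This is a direct check of the example on the slide, with exact rational arithmetic:

```python
from fractions import Fraction
from itertools import product

# Exact check of the dice example: X1 (green) and X2 (red) are fair dice,
# X3 = X1 + X2. A single violation of the product rule shows that
# X3 and X1 are dependent.
outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely pairs
p = Fraction(1, 36)

p_x3_12 = sum(p for a, b in outcomes if a + b == 12)             # 1/36
p_x1_1 = Fraction(1, 6)
p_joint = sum(p for a, b in outcomes if a + b == 12 and a == 1)  # 0

dependent = p_joint != p_x3_12 * p_x1_1  # True: X3 = 12 forces X1 = 6
```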

Slide 16: Marginal distribution
- Discrete case: given the joint distribution P(X_1 = i, X_2 = j) for all (i, j) in S_1 x S_2, the marginal is P(X_1 = i) = sum over j of P(X_1 = i, X_2 = j).
- Continuous case: given f_{X_1,X_2}(t_1, t_2) for all t_1, t_2, the marginal is f_{X_1}(t_1) = integral of f_{X_1,X_2}(t_1, t_2) dt_2.

Slide 17: Expectation of a r.v.: the continuous case
- X is a continuous r.v. with probability density function f(x).
- The expected value of X is defined by E[X] = integral of x f(x) dx.
- For g(X), a function of the r.v. X: E[g(X)] = integral of g(x) f(x) dx.
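The two expectation integrals above can be approximated by a Riemann sum; the exponential density, the step dx, and the truncation point are illustrative assumptions:

```python
import math

# Sketch: approximate E[X] and E[X^2] for X ~ Exponential(lam) with a
# Riemann sum over f(x) = lam*e^(-lam*x). lam, dx, and the truncation
# point are illustrative assumptions.
lam = 2.0
f = lambda x: lam * math.exp(-lam * x)

dx, upper = 1e-4, 20.0
e_x = 0.0
e_x2 = 0.0
for i in range(int(upper / dx)):
    x = i * dx
    e_x += x * f(x) * dx                 # E[X]   -> 1/lam   = 0.5
    e_x2 += x * x * f(x) * dx            # E[X^2] -> 2/lam^2 = 0.5
```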

Slide 18: Expectation of a r.v.: the continuous case (cont'd)

Slide 19: Variance, auto-correlation, and covariance

Slide 20: Conditional probability and conditional expectation: discrete r.v.
- X and Y are discrete r.v.'s.
- Conditional probability mass function of X given that Y = y: p_{X|Y}(x|y) = P(X = x, Y = y) / P(Y = y).
- Conditional expectation of X given that Y = y: E[X | Y = y] = sum over x of x p_{X|Y}(x|y).

Slide 21: Conditional probability and expectation: continuous r.v.
- If X and Y have a joint pdf f_{X,Y}(x, y), then the conditional probability density function of X given that Y = y is f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y).
- The conditional expectation of X given that Y = y is E[X | Y = y] = integral of x f_{X|Y}(x|y) dx.

Slide 22: Computing expectations by conditioning
- Denote by E[X|Y] the function of the r.v. Y whose value at Y = y is E[X | Y = y]; E[X|Y] is itself a random variable.
- Property of conditional expectation: E[X] = E[E[X|Y]].  (1)
- If Y is a discrete r.v.: E[X] = sum over y of E[X | Y = y] P(Y = y).  (2)
- If Y is continuous with density f_Y(y): E[X] = integral of E[X | Y = y] f_Y(y) dy.  (3)
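The discrete conditioning identity E[X] = sum over y of E[X | Y = y] P(Y = y) can be checked exactly on a small made-up joint distribution (the coin-and-uniform pair below is an illustrative assumption, not from the slides):

```python
from fractions import Fraction

# Sketch of E[X] = sum_y E[X|Y=y] P(Y=y) on a small discrete pair:
# Y is a fair coin (0/1); given Y = y, X is uniform on {1, ..., 2+y}.
# This joint distribution is illustrative, not from the slides.
p_y = {0: Fraction(1, 2), 1: Fraction(1, 2)}
x_given_y = {0: [1, 2], 1: [1, 2, 3]}

# Direct expectation from the joint pmf P(X=x, Y=y) = P(Y=y)/|support(y)|.
e_x_direct = sum(p_y[y] * Fraction(1, len(xs)) * x
                 for y, xs in x_given_y.items() for x in xs)

# Conditioning: compute E[X | Y = y] first, then average over Y.
e_x_cond = sum(p_y[y] * sum(Fraction(x, len(xs)) for x in xs)
               for y, xs in x_given_y.items())
```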

Slide 23: Proof of the equation when X and Y are discrete
