
1 Computing Probabilities by Conditioning. Let E denote some event. Define a random variable X by X = 1 if E occurs and X = 0 otherwise. Then P(E) = E[X] = E[E[X | Y]], so conditioning on a random variable Y gives P(E) = Σ_y P(E | Y = y) P(Y = y) when Y is discrete, and P(E) = ∫ P(E | Y = y) f_Y(y) dy when Y is continuous.


4 Example 1: Let X and Y be two independent continuous random variables with densities f_X and f_Y. What is P(X < Y)?

5 Conditioning on the value of Y:
P(X < Y) = ∫ P(X < Y | Y = y) f_Y(y) dy

6 Since X and Y are independent:
P(X < Y | Y = y) = P(X < y | Y = y) = P(X < y) = F_X(y)

7 Therefore:
P(X < Y) = ∫ F_X(y) f_Y(y) dy
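The conditioning formula for Example 1 can be checked numerically. In the sketch below (my own example, not from the slides), X ~ Exp(1) and Y ~ Exp(2), where the closed-form answer is P(X < Y) = 1/3; a Riemann-sum evaluation of ∫ F_X(y) f_Y(y) dy is compared with a Monte Carlo estimate:

```python
import math
import random

# Assumed example: X ~ Exp(1), Y ~ Exp(2); closed form P(X < Y) = 1/3.
lam_x, lam_y = 1.0, 2.0

def F_X(y):
    # CDF of Exp(lam_x).
    return 1 - math.exp(-lam_x * y)

def f_Y(y):
    # Density of Exp(lam_y).
    return lam_y * math.exp(-lam_y * y)

# 1) Conditioning formula, evaluated by a simple Riemann sum over [0, 20].
dy = 1e-4
integral = sum(F_X(k * dy) * f_Y(k * dy) * dy for k in range(200000))

# 2) Monte Carlo check.
random.seed(0)
n = 100000
hits = sum(random.expovariate(lam_x) < random.expovariate(lam_y) for _ in range(n))

print(round(integral, 3))  # ≈ 0.333
print(round(hits / n, 2))  # Monte Carlo estimate of the same probability
```

Both estimates should agree with the exact value 1/3 to the displayed precision.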

8 Example 2: Let X and Y be two independent continuous random variables with densities f_X and f_Y. What is the distribution of X + Y?

9 Conditioning on the value of Y:
F_{X+Y}(a) = P(X + Y ≤ a) = ∫ P(X + Y ≤ a | Y = y) f_Y(y) dy

10 Since X and Y are independent:
P(X + Y ≤ a | Y = y) = P(X ≤ a - y | Y = y) = P(X ≤ a - y) = F_X(a - y)

11 Therefore:
F_{X+Y}(a) = ∫ F_X(a - y) f_Y(y) dy

12 Differentiating with respect to a gives the density:
f_{X+Y}(a) = ∫ f_X(a - y) f_Y(y) dy

13 The density f_{X+Y} is called the convolution of f_X and f_Y.
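The convolution formula is easy to check numerically. In the sketch below (my own example), X and Y are independent Uniform(0, 1), so the convolution integral should reproduce the triangular density f_{X+Y}(a) = a on [0, 1] and 2 - a on [1, 2]:

```python
# Assumed example: X, Y ~ Uniform(0, 1), so f_X = f_Y = f_unif below.
def f_unif(x):
    # Density of Uniform(0, 1).
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_sum(a, dy=1e-4):
    # Riemann-sum approximation of f_{X+Y}(a) = ∫ f_X(a - y) f_Y(y) dy;
    # f_Y(y) = 1 on [0, 1], so we integrate f_unif(a - y) over [0, 1].
    return sum(f_unif(a - k * dy) * dy for k in range(int(1 / dy)))

print(round(f_sum(0.5), 2))  # rising edge of the triangle: ≈ 0.5
print(round(f_sum(1.5), 2))  # falling edge: ≈ 0.5
print(round(f_sum(1.0), 2))  # peak: ≈ 1.0
```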

14 Example 3 (Thinning of a Poisson): Suppose X ~ Poisson(λ) and {U_i} are i.i.d. Bernoulli(p), independent of X. What is the distribution of N = U_1 + ... + U_X?

15 Conditioning on X:
P(N = n) = Σ_{m ≥ n} P(N = n | X = m) P(X = m) = Σ_{m ≥ n} C(m, n) p^n (1 - p)^{m-n} e^{-λ} λ^m / m! = e^{-λp} (λp)^n / n!
Hence N ~ Poisson(λp): thinning a Poisson(λ) count with independent Bernoulli(p) marks yields a Poisson(λp) count.
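The thinning result can be verified by simulation. A minimal sketch follows (the parameters λ = 4 and p = 0.25 are my own choice, not from the slides); it checks that the thinned count has the mean and variance of a Poisson(λp) = Poisson(1) random variable:

```python
import math
import random

random.seed(1)
lam, p, trials = 4.0, 0.25, 100000

def poisson(mu):
    # Knuth's method: count uniforms until their product drops below e^{-mu}.
    L, k, prod = math.exp(-mu), 0, 1.0
    while True:
        prod *= random.random()
        if prod < L:
            return k
        k += 1

counts = []
for _ in range(trials):
    x = poisson(lam)                                           # X ~ Poisson(lam)
    counts.append(sum(random.random() < p for _ in range(x)))  # N = U_1 + ... + U_X

mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
# For Poisson(lam * p) = Poisson(1), both mean and variance should be ≈ 1.
print(round(mean, 2), round(var, 2))
```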

16 Stochastic Processes • A stochastic process {X(t), t ∈ T} is a collection of random variables • For each value of t, there is a corresponding random variable X(t) (the state of the system at time t) • When t takes on discrete values (e.g., t = 1, 2, ...), we have a discrete-time stochastic process (the notation X_n, n = 1, 2, ..., is often used instead) • When t takes on continuous values, we have a continuous-time stochastic process

20 • Example 1: X(t) is the number of customers waiting in line at time t to check their luggage at an airline counter (a continuous-time stochastic process) • Example 2: X_n is the number of laptops a computer store sells in week n (a discrete-time stochastic process) • Example 3: X_n = 1 if it rains on the n-th day of the month and X_n = 0 otherwise (also a discrete-time stochastic process)

23 • Example 4: A gambler wins $1 with probability p and loses $1 with probability 1 - p in each round. She starts with $N (0 < N < M) and quits when she reaches either $M or $0. X_n is the amount of money the gambler has after playing n rounds.

24 • P(X_n = i+1 | X_{n-1} = i, X_{n-2} = i-1, ..., X_0 = N) = P(X_n = i+1 | X_{n-1} = i) = p for i ≠ 0, M • P(X_n = i-1 | X_{n-1} = i, X_{n-2} = i-1, ..., X_0 = N) = P(X_n = i-1 | X_{n-1} = i) = 1 - p for i ≠ 0, M • Define P_{i,i+1} = P(X_n = i+1 | X_{n-1} = i) and P_{i,i-1} = P(X_n = i-1 | X_{n-1} = i)

25 • P_{i,i+1} = p and P_{i,i-1} = 1 - p for i ≠ 0, M • P_{0,0} = 1 and P_{M,M} = 1 (0 and M are called absorbing states) • P_{i,j} = 0 otherwise
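The transition matrix for this chain is easy to build explicitly. Below is a sketch with made-up parameters (M = 5, p = 0.4, N = 3; none of these values come from the slides); it also iterates the distribution of X_n to check the classical ruin probability (1 - r^N)/(1 - r^M) with r = (1 - p)/p:

```python
# Hypothetical parameters for illustration.
M, p, N = 5, 0.4, 3

# Build the (M+1) x (M+1) gambler's-ruin transition matrix.
P = [[0.0] * (M + 1) for _ in range(M + 1)]
P[0][0] = 1.0            # $0 is absorbing
P[M][M] = 1.0            # $M is absorbing
for i in range(1, M):
    P[i][i + 1] = p      # win $1
    P[i][i - 1] = 1 - p  # lose $1

# Every row of a transition matrix sums to 1.
print(all(abs(sum(row) - 1.0) < 1e-12 for row in P))  # True

# Iterate mu_n = mu_{n-1} P starting from X_0 = N.
mu = [0.0] * (M + 1)
mu[N] = 1.0
for _ in range(2000):
    mu = [sum(mu[i] * P[i][j] for i in range(M + 1)) for j in range(M + 1)]

# Probability of eventually reaching $M: (1 - r^N) / (1 - r^M), r = (1-p)/p.
r = (1 - p) / p
print(round(mu[M], 4), round((1 - r ** N) / (1 - r ** M), 4))  # 0.3602 0.3602
```

After enough steps essentially all probability mass sits in the two absorbing states, so the mass at state M matches the closed-form ruin formula.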

26 Markov Chains • {X_n : n = 0, 1, 2, ...} is a discrete-time stochastic process • If X_n = i, the process is said to be in state i at time n • {i : i = 0, 1, 2, ...} is the state space • If P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i) = P_{ij}, the process is said to be a Discrete-Time Markov Chain (DTMC) • P_{ij} is the transition probability from state i to state j

27 P = [P_{ij}] is the transition matrix: its (i, j) entry is the one-step transition probability from state i to state j, so every entry is nonnegative and every row sums to 1.

28 • Example 1: The probability that it will rain tomorrow depends only on whether or not it rains today: P(rain tomorrow | rain today) = α and P(rain tomorrow | no rain today) = β for given probabilities α and β • State 0 = rain, State 1 = no rain
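With concrete numbers filled in (α = 0.7 and β = 0.4 are my own placeholder values; the slide does not specify them), the two-state weather chain can be simulated to see its long-run behavior:

```python
import random

# Placeholder probabilities, not from the slides.
alpha = 0.7   # P(rain tomorrow | rain today)
beta = 0.4    # P(rain tomorrow | no rain today)
P = [[alpha, 1 - alpha],  # row 0: today it rains
     [beta, 1 - beta]]    # row 1: today it does not rain

random.seed(2)
state, rainy_days, n = 0, 0, 100000
for _ in range(n):
    # Move to state 0 (rain) with probability P[state][0].
    state = 0 if random.random() < P[state][0] else 1
    rainy_days += (state == 0)

# The long-run fraction of rainy days converges to beta / (1 - alpha + beta).
print(round(rainy_days / n, 2))
```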

29 • Example 2 (random walk): A Markov chain whose state space is 0, ±1, ±2, ..., with P_{i,i+1} = p = 1 - P_{i,i-1} for i = 0, ±1, ±2, ..., where 0 < p < 1, is said to be a random walk.
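A quick simulation sketch of the random walk (p = 0.6 and n = 50 steps are my own choices) checks the elementary moments E[S_n] = n(2p - 1) and Var(S_n) = 4np(1 - p):

```python
import random

random.seed(3)
p, n, trials = 0.6, 50, 20000

finals = []
for _ in range(trials):
    s = 0
    for _ in range(n):
        s += 1 if random.random() < p else -1  # step +1 w.p. p, else -1
    finals.append(s)

mean = sum(finals) / trials
var = sum((x - mean) ** 2 for x in finals) / trials
# Theory: mean = n(2p - 1) = 10, variance = 4np(1 - p) = 48.
print(round(mean, 1), round(var, 1))
```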

30 Defining a DTMC. To define a DTMC, we need to • Specify the states • Demonstrate the Markov property • Obtain the stationary (time-homogeneous) transition probability matrix P

