Lecture 11 – Stochastic Processes


1 Lecture 11 – Stochastic Processes
Topics
Definitions
Review of probability
Realization of a stochastic process
Continuous vs. discrete systems
Examples

2 Basic Definitions
Stochastic process: a system that changes over time in an uncertain manner.
Examples: automated teller machine (ATM); printed circuit board assembly operation; runway activity at an airport.
State: a snapshot of the system at some fixed point in time.
Transition: movement from one state to another.

3 Elements of Probability Theory
Experiment: any situation where the outcome is uncertain.
Sample space, S: all possible outcomes of an experiment (we will call them the “state space”).
Event: any collection of outcomes (points) in the sample space. A collection of events E1, E2, …, En is said to be mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j, i, j = 1, …, n.
Random variable (RV): a function or procedure that assigns a real number to each outcome in the sample space.
Cumulative distribution function (CDF), F(·): the probability distribution function of the random variable X, defined by F(a) = Pr{X ≤ a}.
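As a small illustration of a CDF (an assumed example, not from the slides), let X be the number of heads in two fair coin flips, so F(1) = Pr{X ≤ 1} = 3/4. A minimal sketch:

```python
# Assumed example (not from the slides): CDF of X = number of heads in two fair coin flips.
from fractions import Fraction

# Probability mass function of X.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def cdf(a):
    """F(a) = Pr{X <= a}: sum the pmf over all outcomes x <= a."""
    return sum((p for x, p in pmf.items() if x <= a), Fraction(0))

print(cdf(0), cdf(1), cdf(2))   # 1/4 3/4 1
```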

4 Components of Stochastic Model
Time: either a continuous or a discrete parameter.
State: describes the attributes of the system at some point in time; s = (s1, s2, …, sv). For the ATM example, s = (n).
It is convenient to assign a unique nonnegative integer index to each possible value of the state vector. We call this index X and require that each s corresponds to a unique X. For the ATM example, X = n.
In general, Xt is a random variable.
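A hypothetical sketch (the function name and enumeration order are assumptions, not from the slides) of assigning a unique nonnegative integer index X to each possible value of a state vector:

```python
# Hypothetical sketch: enumerate all values of the state vector s = (s1, ..., sv)
# and assign each a unique nonnegative integer index X.
from itertools import product

def index_states(component_values):
    """component_values[i] is the collection of values component s_i can take."""
    return {s: x for x, s in enumerate(product(*component_values))}

# Example: a state vector with two components, each taking values 0, 1, 2.
state_to_X = index_states([range(3), range(3)])
print(state_to_X[(0, 0)], state_to_X[(2, 2)])   # 0 8
```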

5 Model Components (continued)
Activity: takes some amount of time (its duration) and culminates in an event. For the ATM example → a service completion.
Transition: caused by an event; results in movement from one state to another. For the ATM example (diagram labels): # = state, a = arrival, d = departure.
Stochastic process: a collection of random variables {Xt}, where t ∈ T = {0, 1, 2, . . .}.

6 Realization of the Process
Deterministic process
Time between arrivals: Pr{ta ≤ τ} = 0 for τ < 1 min; = 1 for τ ≥ 1 min. Arrivals occur every minute.
Time for servicing a customer: Pr{ts ≤ τ} = 0 for τ < 0.75 min; = 1 for τ ≥ 0.75 min. Processing takes exactly 0.75 minutes.
Number in system, n (no transient response).
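A minimal sketch (assumed implementation, not from the slides) of this deterministic realization: arrivals exactly every minute, service exactly 0.75 minutes, so the number in system alternates between 1 and 0:

```python
# Sketch of the deterministic realization: arrivals at t = 0, 1, 2, ... minutes,
# each served in exactly 0.75 min. Since 0.75 < 1.0, the system always empties
# before the next arrival, so n(t) alternates between 1 and 0.
def deterministic_realization(horizon=3.0):
    path = []                        # (time, number in system) just after each event
    t = 0.0
    while t < horizon:
        path.append((t, 1))          # arrival: customer enters service
        path.append((t + 0.75, 0))   # departure 0.75 min later: system is empty again
        t += 1.0                     # next arrival exactly one minute later
    return path

for t, n in deterministic_realization():
    print(f"t = {t:4.2f} min  n = {n}")
```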

7 Realization of the Process (continued)
Stochastic process
Time for servicing a customer: Pr{ts ≤ τ} = 0 for τ < 0.75 min; = 0.6 for 0.75 min ≤ τ < 1.5 min; = 1 for τ ≥ 1.5 min. That is, service takes 0.75 min with probability 0.6 and 1.5 min with probability 0.4.
Number in system, n.
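A sketch of the corresponding stochastic realization (an assumed implementation; arrivals are still taken to occur once per minute, as on the previous slide, while the service time is sampled as above):

```python
# Sketch: single server, FIFO, deterministic arrivals at t = 0, 1, 2, ... minutes,
# service time sampled as 0.75 min (prob 0.6) or 1.5 min (prob 0.4).
import random

def stochastic_realization(num_arrivals=10, seed=1):
    rng = random.Random(seed)
    events = []                      # (time, +1) for an arrival, (time, -1) for a departure
    server_free = 0.0                # time at which the server next becomes idle
    for k in range(num_arrivals):
        t_arr = float(k)
        ts = 0.75 if rng.random() < 0.6 else 1.5
        start = max(t_arr, server_free)      # wait in queue if the server is busy
        server_free = start + ts
        events.append((t_arr, +1))
        events.append((server_free, -1))
    # Accumulate the piecewise-constant path of n(t) in time order.
    n, path = 0, []
    for t, delta in sorted(events):
        n += delta
        path.append((t, n))
    return path

print(stochastic_realization(num_arrivals=5))
```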

8 Markovian Property
Given that the present state is known, the conditional probability of the next state is independent of the states prior to the present state.
Present state at time t is i: Xt = i. Next state at time t + 1 is j: Xt+1 = j.
Conditional probability statement of the Markovian property:
Pr{Xt+1 = j | X0 = k0, X1 = k1, …, Xt–1 = kt–1, Xt = i} = Pr{Xt+1 = j | Xt = i}
for t = 0, 1, …, and all possible sequences i, j, k0, k1, …, kt–1.
Interpretation: given the present, the past is irrelevant in determining the future.

9 Transitions for Markov Processes
State space: S = {1, 2, …, m}
Probability of going from state i to state j in one move: pij
State-transition matrix: P = [pij]
Theoretical requirements: 0 ≤ pij ≤ 1 and Σj pij = 1 for each i = 1, …, m.
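A minimal sketch (assumed helper, not part of the lecture) that checks these requirements for a candidate state-transition matrix:

```python
# Check the theoretical requirements: 0 <= p_ij <= 1 and each row sums to 1.
def is_stochastic_matrix(P, tol=1e-9):
    return all(
        all(0.0 <= p <= 1.0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

print(is_stochastic_matrix([[0.6, 0.4], [0.3, 0.7]]))   # True
print(is_stochastic_matrix([[0.6, 0.5], [0.3, 0.7]]))   # False: first row sums to 1.1
```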

10 Discrete-Time Markov Chain
A discrete state space
Markovian property for transitions
One-step transition probabilities, pij, remain constant over time (stationary)
Simple example: state-transition matrix P and state-transition diagram for a small chain (shown as a figure; the entries listed in the transcript are 0.6, 0.3, 0.1, 0.8, 0.2).
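A sketch of simulating a discrete-time Markov chain from its state-transition matrix. The 2-state matrix below is a hypothetical stand-in, since the matrix in the slide figure does not transcribe cleanly:

```python
# Simulate a stationary discrete-time Markov chain: the next state is drawn
# from the row of P corresponding to the current state.
import random

def simulate_dtmc(P, x0=0, steps=10, seed=0):
    rng = random.Random(seed)
    path = [x0]
    for _ in range(steps):
        path.append(rng.choices(range(len(P)), weights=P[path[-1]])[0])
    return path

P = [[0.6, 0.4],    # hypothetical example matrix (not the one from the slide)
     [0.3, 0.7]]
print(simulate_dtmc(P))   # e.g. [0, 1, 1, 0, ...]
```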

11 Game of Craps
Roll 2 dice. Outcomes: win = 7 or 11; lose = 2, 3, 12; point = 4, 5, 6, 8, 9, 10.
If a point is rolled, roll again: win if the point comes up, lose if a 7 comes up; otherwise roll again, and so on.
(There are other possible bets not included here.)
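A sketch (assumed implementation) of one play of the game as described above; simulating many plays estimates the overall win probability, roughly 0.49 for these basic rules:

```python
# One play of craps: win on 7 or 11, lose on 2, 3, or 12; any other first roll
# becomes the point, and rolling continues until the point (win) or a 7 (lose).
import random

def play_craps(rng=random):
    roll = rng.randint(1, 6) + rng.randint(1, 6)
    if roll in (7, 11):
        return "win"
    if roll in (2, 3, 12):
        return "lose"
    point = roll
    while True:
        roll = rng.randint(1, 6) + rng.randint(1, 6)
        if roll == point:
            return "win"
        if roll == 7:
            return "lose"

results = [play_craps() for _ in range(100_000)]
print(results.count("win") / len(results))   # about 0.49
```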

12 State-Transition Network for Craps

13 Transition Matrix for Game of Craps
Sum:   2     3     4     5     6     7     8     9     10    11    12
Prob.: 0.028 0.056 0.083 0.111 0.139 0.167 0.139 0.111 0.083 0.056 0.028

Probability of win on the first roll = Pr{7 or 11} = 0.167 + 0.056 = 0.223
Probability of loss on the first roll = Pr{2, 3, 12} = 0.028 + 0.056 + 0.028 = 0.112
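A sketch (assumed computation, not from the slides) that reproduces the table above from the 36 equally likely outcomes of two dice; 0.223 and 0.112 come from summing the rounded table entries, while the exact values are 8/36 ≈ 0.222 and 4/36 ≈ 0.111:

```python
# Distribution of the sum of two fair dice, plus first-roll win/loss probabilities.
from collections import Counter
from itertools import product

counts = Counter(d1 + d2 for d1, d2 in product(range(1, 7), repeat=2))
prob = {s: c / 36 for s, c in counts.items()}

for s in range(2, 13):
    print(f"Pr{{sum = {s:2d}}} = {prob[s]:.3f}")

print("Pr{win on first roll}  =", prob[7] + prob[11])              # 8/36  ~ 0.222
print("Pr{lose on first roll} =", prob[2] + prob[3] + prob[12])    # 4/36  ~ 0.111
```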

14 Examples of Stochastic Processes
Single-stage assembly process with a single worker, no queue: State = 0, worker is idle; State = 1, worker is busy.
Multistage assembly process with a single worker, no queue: State = 0, worker is idle; State = k, worker is performing operation k, k = 1, …, 5.

15 Examples (continued)
Multistage assembly process with a single worker and a queue (assume 3 stages only, i.e., 3 operations): s = (s1, s2); operations k = 1, 2, 3.

16 Queuing Model with Two Servers
State s = (s1, s2, s3), with i = 1, 2 indexing the two servers. State-transition network (shown as a figure).

17 Series System with No Queues
Component / Notation / Definition
State: s = (s1, s2, s3)
State space: S = {(0,0,0), (1,0,0), …, (0,1,1), (1,1,1)} — all possible binary vectors with 3 components.
Activities: Y = {a, d1, d2, d3}, where a = arrival at operation 1 and dj = completion of operation j, for j = 1, 2, 3.
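A minimal sketch (assumed enumeration, not from the slides) of this state space and activity set:

```python
# The state space S of the 3-operation series system: all binary vectors of length 3.
from itertools import product

S = list(product((0, 1), repeat=3))
Y = ["a", "d1", "d2", "d3"]   # a = arrival at operation 1, dj = completion of operation j

print(len(S))   # 8 states
print(S)        # [(0, 0, 0), (0, 0, 1), ..., (1, 1, 1)]
```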

18 What You Should Know About Stochastic Processes
What a state is. What a realization is (stationary vs. transient). What the difference is between a continuous and discrete-time system. What the common applications are. What a state-transition matrix is.

