Presentation on theme: "Solutions Markov Chains 2"— Presentation transcript:

1 Solutions Markov Chains 2
4) A computer is inspected at the end of every hour. It is found to be either working (up) or failed (down). If the computer is found to be up, the probability of its remaining up for the next hour is 0.90. If it is down, the computer is repaired, which may require more than one hour. Whenever the computer is down (regardless of how long it has been down), the probability of its still being down 1 hour later is 0.35.
a. Construct the one-step transition probability matrix.
b. Find the expected first passage time from i to j for all i, j.
Soln: Let
S = 0 if computer is down
S = 1 if computer is up
Then the one-step transition matrix for part a is sketched below, and the expected first passage times for part b are worked on the following slides.
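The slide's matrix survives only as an image in this transcript; the following is a reconstruction, with the states ordered (0 = down, 1 = up) and the missing up-to-up probability taken to be 0.90, the value consistent with the steady-state probabilities p0 = .13 and p1 = .87 quoted on the later alternative-solution slide:

\[
P = \begin{pmatrix} p_{00} & p_{01} \\ p_{10} & p_{11} \end{pmatrix}
  = \begin{pmatrix} 0.35 & 0.65 \\ 0.10 & 0.90 \end{pmatrix}
\]

Row 0 is the down state (it remains down with probability 0.35); row 1 is the up state.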

2 Solutions Markov Chains 3
4) (cont.) b. Find expected first passage times
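The worked figures for part b are images in the source transcript; the sketch below recomputes the expected first passage times from the reconstructed matrix above (so the 0.90 entry is an assumption), using the standard relations m_ij = 1 + sum over k != j of p_ik * m_kj.

import numpy as np

# Reconstructed one-step transition matrix (0 = down, 1 = up);
# the 0.90 entry is inferred, not visible in the transcript.
P = np.array([[0.35, 0.65],
              [0.10, 0.90]])

n = len(P)
m = np.zeros((n, n))
for j in range(n):
    # m_ij = 1 + sum_{k != j} p_ik m_kj  <=>  (I - P_j) m[:, j] = 1,
    # where P_j is P with column j zeroed out.
    Pj = P.copy()
    Pj[:, j] = 0.0
    m[:, j] = np.linalg.solve(np.eye(n) - Pj, np.ones(n))

print(np.round(m, 3))
# Expected first passage times (hours):
#   m00 = 7.5    m01 ~ 1.538
#   m10 = 10.0   m11 ~ 1.154

Here m10 = 10 hours is the mean time to repair seen from the down state, and m00 = 7.5 and m11 ~ 1.15 hours are the recurrence times that the alternative solution recovers from the steady-state probabilities.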

3 Solutions Markov Chains 8
4) (cont.) Alternative solution to b. Set up equations (1)-(4). Substituting (4) into (1) gives the first of the unknowns, and the remaining values then follow from (4).

4 Solutions Markov Chains 4
4) (cont.) Alternative solution to b. p0 = .13, p1 = .87.
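Presumably these are the steady-state probabilities of the down and up states; under the reconstructed matrix above their exact values are p0 = 2/15 ≈ 0.13 and p1 = 13/15 ≈ 0.87, and the recurrence times of part b follow as their reciprocals:

\[
m_{00} = \frac{1}{p_0} = \frac{15}{2} = 7.5 \text{ hours}, \qquad
m_{11} = \frac{1}{p_1} = \frac{15}{13} \approx 1.15 \text{ hours}.
\]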

5 Solutions Markov Chains 10
5) A manufacturer has a machine that, when operational at the beginning of a day, has a probability of 0.1 of breaking down sometime during the day. When this happens, the repair is done the next day and completed at the end of that day.
a. Formulate the evolution of the status of the machine as a 3-state Markov chain.
b. Find the expected first passage times from i to j.
c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair?
Soln: a. Let
S = 0 if machine running at day's end | running at start
S = 1 if machine down at day's end | running at start
S = 2 if machine running at day's end | down at start

6 Solutions Markov Chains 5
5) (cont.) a.
S = 0 if machine running at day's end | running at start
S = 1 if machine down at day's end | running at start
S = 2 if machine running at day's end | down at start
Continuing in this fashion gives the one-step transition probabilities (reconstructed below).
b. Note: This makes intuitive sense. If the machine has a 10% chance of failing on any given day, then the expected number of days between failures is 10 (m01 = 10).
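The matrix itself appears only as an image; the following reconstruction is a sketch built from the state definitions: a running machine ends the day running with probability 0.9 or down with probability 0.1, and a machine that is down at the start of a day is repaired with certainty by the day's end (rows and columns ordered 0, 1, 2):

\[
P = \begin{pmatrix}
0.9 & 0.1 & 0 \\
0 & 0 & 1 \\
0.9 & 0.1 & 0
\end{pmatrix}
\]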

7 Solutions Markov Chains 6
5) (cont.)

8 Solutions Markov Chains 7
5) (cont.) Back substituting for m02, m10, and m11 gives the remaining first passage times; the values are checked numerically below.
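The algebra on this slide survives only as images; as a check, the sketch below recomputes every expected first passage time from the reconstructed three-state matrix (an assumption carried over from part a), and it reproduces the values quoted on the surrounding slides (m01 = 10, m21 = 10, m22 = 11).

import numpy as np

# Reconstructed matrix: 0 = ran all day, 1 = broke down today,
# 2 = repair completed today (machine was down at the start of the day).
P = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.0, 1.0],
              [0.9, 0.1, 0.0]])

n = len(P)
m = np.zeros((n, n))
for j in range(n):
    # Solve m_ij = 1 + sum_{k != j} p_ik m_kj column by column.
    Pj = P.copy()
    Pj[:, j] = 0.0
    m[:, j] = np.linalg.solve(np.eye(n) - Pj, np.ones(n))

print(np.round(m, 3))
# m00 ~ 1.22   m01 = 10   m02 = 11
# m10 ~ 2.22   m11 = 11   m12 = 1
# m20 ~ 1.22   m21 = 10   m22 = 11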

9 Solutions Markov Chains 8
5) (cont.) c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair? By the Markov property, the 20 breakdown-free days do not change the answer; only the current state matters. If we read this as the expected number of days to breakdown since the last repair, we are asking for m21, in which case m21 = 10 days. If we read this as the expected number of days to breakdown and subsequent repair since the last repair, we are asking for m22 = 11 days. Again, this should make intuitive sense. A machine has a 10% chance of breaking down, so the expected time between failures is 10 days. Since it takes 1 day to repair, the time from repair to repair is 10 + 1 = 11 days.

10 Solutions Markov Chains 9
A military maintenance depot overhauls tanks. There is room for 3 tanks in the facility and one tank in an overflow area. At most 4 tanks can be at the depot at one time. Every morning a tank arrives for an overhaul. If the depot is full, however, it is turned away; no new arrivals occur under these circumstances. On any given day, the following probabilities govern the number of overhauls completed:

No. tanks completed: 0 1 2 3
Prob: .2 .4 .3 .1

These values are independent of the number of tanks in the depot, but obviously no more tanks than are waiting in line at the start of the day can be completed. Develop a Markov chain model for this situation.
Soln: Let S = # tanks in the depot at the start of a day, just after a tank arrival; S = 1, 2, 3, 4 (note: since 1 tank arrives each day, we can never have S = 0).

Start State   Event                    End State   Prob.
4             overhaul 0, 1 declined   4           .2
4             overhaul 1, 1 arrives    4           .4
4             overhaul 2, 1 arrives    3           .3
4             overhaul 3, 1 arrives    2           .1
3             overhaul 0, 1 arrives    4           .2
3             overhaul 1, 1 arrives    3           .4
3             overhaul 2, 1 arrives    2           .3
3             overhaul 3, 1 arrives    1           .1

11 Solutions Markov Chains 10
Start State   Event                     End State   Prob.
4             overhaul 0, 1 declined    4           .2
4             overhaul 1, 1 arrives     4           .4
4             overhaul 2, 1 arrives     3           .3
4             overhaul 3, 1 arrives     2           .1
3             overhaul 0, 1 arrives     4           .2
3             overhaul 1, 1 arrives     3           .4
3             overhaul 2, 1 arrives     2           .3
3             overhaul 3, 1 arrives     1           .1
2             overhaul 0, 1 arrives     3           .2
2             overhaul 1, 1 arrives     2           .4
2             overhaul all, 1 arrives   1           .4
1             overhaul 0, 1 arrives     2           .2
1             overhaul all, 1 arrives   1           .8
The corresponding one-step transition matrix is collected below.
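Collecting the table rows into a matrix (a sketch; no matrix appears in the transcript itself, and the probabilities for the two "overhaul all" rows are the combined tail probabilities .3 + .1 = .4 and .4 + .3 + .1 = .8), with rows and columns ordered S = 1, 2, 3, 4:

\[
P = \begin{pmatrix}
0.8 & 0.2 & 0 & 0 \\
0.4 & 0.4 & 0.2 & 0 \\
0.1 & 0.3 & 0.4 & 0.2 \\
0 & 0.1 & 0.3 & 0.6
\end{pmatrix}
\]

Each row sums to 1, as a quick check.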

12 Solutions Markov Chains 10
Suppose we count the tanks in the depot at the end of the day instead, so S = 0, 1, 2, 3, 4. Then:

Start State   Event                     End State   Prob.
4             1 declined, overhaul 0    4           .2
4             1 declined, overhaul 1    3           .4
4             1 declined, overhaul 2    2           .3
4             1 declined, overhaul 3    1           .1
3             1 arrives, overhaul 0     4           .2
3             1 arrives, overhaul 1     3           .4
3             1 arrives, overhaul 2     2           .3
3             1 arrives, overhaul 3     1           .1
2             1 arrives, overhaul 0     3           .2
2             1 arrives, overhaul 1     2           .4
2             1 arrives, overhaul 2     1           .3
2             1 arrives, overhaul 3     0           .1
1             1 arrives, overhaul 0     2           .2
1             1 arrives, overhaul 1     1           .4
1             1 arrives, overhaul all   0           .4
0             1 arrives, overhaul 0     1           .2
0             1 arrives, overhaul all   0           .8
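For completeness, a sketch of the corresponding transition matrix under this end-of-day count (not shown in the transcript), with rows and columns ordered S = 0, 1, 2, 3, 4; the rows for S = 3 and S = 4 coincide because the arrival is declined only when the depot is already full:

\[
P = \begin{pmatrix}
0.8 & 0.2 & 0 & 0 & 0 \\
0.4 & 0.4 & 0.2 & 0 & 0 \\
0.1 & 0.3 & 0.4 & 0.2 & 0 \\
0 & 0.1 & 0.3 & 0.4 & 0.2 \\
0 & 0.1 & 0.3 & 0.4 & 0.2
\end{pmatrix}
\]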

