1 Absorbing Markov Chains
15 October 2012, Eurecom, Sophia-Antipolis
Thrasyvoulos Spyropoulos / spyropou@eurecom.fr

2
- A mouse is trapped in a maze with 3 rooms and 1 exit (see figure).
- When inside a room with x doors, it chooses any of them with equal probability (1/x).
Q: How long will it take, on average, to exit the maze if it starts at room i?
Q: How long if it starts from a random room?
[Figure: maze with rooms 1, 2, 3 and an exit]

3
- Def: T_i = expected time to leave the maze, starting from room i.
- T_2 = (1/3)*1 + (1/3)*(1 + T_1) + (1/3)*(1 + T_3) = 1 + (1/3)*(T_1 + T_3)
- T_1 = 1 + T_2
- T_3 = 1 + T_2
- Solution: T_2 = 5, T_3 = 6, T_1 = 6.
Q: Could you have guessed it directly?
A: The number of times room 2 is visited before exiting is geometric(1/3), so on average the wrong exit is taken twice (each time costing two steps) and the 3rd time the mouse exits.
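As a quick cross-check of the first-step analysis above, here is a minimal Python/NumPy sketch (the slides themselves mention Matlab later; variable names here are mine) that solves the three linear equations:

```python
import numpy as np

# First-step equations for the maze, written as A @ [T1, T2, T3] = b:
#   T1 - T2               = 1
#   T2 - (1/3)(T1 + T3)   = 1
#   T3 - T2               = 1
A = np.array([[ 1.0, -1.0,  0.0],
              [-1/3,  1.0, -1/3],
              [ 0.0, -1.0,  1.0]])
b = np.ones(3)

T = np.linalg.solve(A, b)
print(T)   # [6. 5. 6.]  ->  T1 = 6, T2 = 5, T3 = 6
```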

4
- A packet must be routed towards the destination over the network above.
- "Hot Potato Routing" works as follows: when a router receives a packet, it picks one of its outgoing links at random (including the incoming link) and sends the packet out immediately.
Q: How long does it take to deliver the packet?
[Figure: network of routers 1-9 and the destination]

5
- First-Step Analysis: we can still apply it!
- But it is a bit more complicated: a 9x9 system of linear equations.
- Not easy to guess the solution either!
- We'll try to model this with a Markov chain.
[Figure: the same network, with the destination marked]

6
- 9 transient states: 1-9
- 1 absorbing state: A
Q: Is this chain irreducible?
A: No!
Q: Hot potato routing delay = expected time to absorption?
[Figure: Markov chain with transient states 1-9, absorbing state A, and the transition probabilities on the edges]

7
- We can define the transition matrix P (10x10).
Q: What is P^(n) as n -> ∞?
A: Every row converges to [0, 0, ..., 1].
Q: How can we get E[T_iA] (the expected time to absorption starting from i)?
Q: How about summing the matrices P^(n) over n?
A: No, the sum goes to infinity!
[Figure: the same 10-state chain]
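The transcript does not reproduce the full 10x10 routing matrix, so as a stand-in the sketch below uses the 4-state maze chain from slides 2-3 (rooms 1-3 transient, "exit" absorbing) to illustrate the claim that every row of P^(n) converges to [0, 0, ..., 1]:

```python
import numpy as np

# Maze chain: states 1, 2, 3, exit (absorbing).
# Rooms 1 and 3 lead only to room 2; room 2 leads to 1, 3 or exit w.p. 1/3 each.
P = np.array([[0.0, 1.0, 0.0, 0.0],
              [1/3, 0.0, 1/3, 1/3],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])

print(np.round(np.linalg.matrix_power(P, 100), 4))
# Every row is (approximately) [0, 0, 0, 1]: absorption is certain,
# so the limit of P^(n) alone cannot give the expected absorption time E[T_iA].
```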

8
- The transition matrix can be written in canonical form: transient states written first, followed by the absorbing ones, giving the block structure P = [[Q, R], [0, I]].
- Calculate P^(n) using the canonical form.
Q: What is Q^n as n -> ∞?
A: It goes to the zero matrix.
Q: Where does the (*) part (the upper-right block) of P^(n) converge to, if there is only one absorbing state?
A: To a vector of all 1s.
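Again using the small maze chain as a stand-in (the routing chain's matrix is not in the transcript), a sketch of the canonical-form split and of the two limits asked about above:

```python
import numpy as np

# Canonical form: transient states (rooms 1-3) first, absorbing state last.
#   P = [[Q, R],
#        [0, I]]
P = np.array([[0.0, 1.0, 0.0, 0.0],
              [1/3, 0.0, 1/3, 1/3],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
Q, R = P[:3, :3], P[:3, 3:]   # transient->transient and transient->absorbing blocks

print(np.round(np.linalg.matrix_power(Q, 100), 4))   # -> the zero matrix
Pn = np.linalg.matrix_power(P, 100)
print(np.round(Pn[:3, 3:], 4))   # the (*) block -> a column of all 1s (one absorbing state)
```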

9
Theorem: The matrix (I - Q) has an inverse.
- N = (I - Q)^(-1) is called the fundamental matrix.
- N = I + Q + Q^2 + ...
- n_ik: the expected number of times the chain is in state k, starting from state i, before being absorbed.
Proof (sketch): (I - Q)(I + Q + ... + Q^n) = I - Q^(n+1) -> I, since Q^n -> 0; so the series converges and equals (I - Q)^(-1).
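A minimal check of the theorem on the maze chain: N = (I - Q)^(-1) agrees with the truncated series I + Q + ... + Q^k, and the entries have the "expected number of visits" meaning (e.g. n_22 = 3, the mean of a geometric(1/3), as argued on slide 3):

```python
import numpy as np

Q = np.array([[0.0, 1.0, 0.0],
              [1/3, 0.0, 1/3],
              [0.0, 1.0, 0.0]])   # transient part of the maze chain

N = np.linalg.inv(np.eye(3) - Q)                                 # fundamental matrix
series = sum(np.linalg.matrix_power(Q, k) for k in range(200))   # I + Q + Q^2 + ...

print(np.round(N, 4))          # [[2. 3. 1.], [1. 3. 1.], [1. 3. 2.]]
print(np.allclose(N, series))  # True
# n_22 = 3: starting in room 2, the mouse visits room 2 three times on average.
```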

10
Theorem:
- Let T_i be the expected number of steps before the chain is absorbed, given that the chain starts in state i,
- and let T be the column vector whose i-th entry is T_i.
- Then T = Nc, where c is a column vector all of whose entries are 1.
Proof:
- Σ_k n_ik adds up all the entries in the i-th row of N
- = the expected number of times the chain is in any of the transient states, for a given starting state i
- = the expected time required before being absorbed.
- Hence T_i = Σ_k n_ik.
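Continuing the maze stand-in, T = Nc just sums the rows of N; this reproduces the answers from slide 3:

```python
import numpy as np

Q = np.array([[0.0, 1.0, 0.0],
              [1/3, 0.0, 1/3],
              [0.0, 1.0, 0.0]])
N = np.linalg.inv(np.eye(3) - Q)

c = np.ones(3)    # column vector of 1s
T = N @ c         # T_i = sum_k n_ik
print(T)          # [6. 5. 6.]  -> matches T_1 = 6, T_2 = 5, T_3 = 6 from slide 3
```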

11
Theorem:
- b_ij: the probability that an absorbing chain will be absorbed in (absorbing) state j, if it starts in (transient) state i.
- B: the t-by-r matrix with entries b_ij.
- Then B = NR, where R is as in the canonical form.
Proof (sketch): to be absorbed in state j at step n+1, the chain must be in some transient state at step n; summing over n gives B = Σ_{n≥0} Q^n R = NR.
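For the maze chain there is a single absorbing state, so B = NR should be a column of 1s (absorption is certain); a small sketch to verify:

```python
import numpy as np

Q = np.array([[0.0, 1.0, 0.0],
              [1/3, 0.0, 1/3],
              [0.0, 1.0, 0.0]])
R = np.array([[0.0], [1/3], [0.0]])   # transient -> absorbing ("exit") probabilities

N = np.linalg.inv(np.eye(3) - Q)
B = N @ R
print(B)   # [[1.], [1.], [1.]]  -> from every start room, absorption at the exit w.p. 1
```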

12
- Use Matlab to get the matrices for the hot-potato routing chain.

Matrix N =
  3.2174  2.6957  2.3478  6.6522  4.5652  4.0000  3.8261  3.2174  2.6087
  1.3478  3.9130  2.9565  4.0435  4.3043  4.0000  2.5217  2.3478  2.1739
  1.1739  2.9565  3.4783  3.5217  3.6522  4.0000  2.2609  2.1739  2.0870
  2.2174  2.6957  2.3478  6.6522  4.5652  4.0000  3.8261  3.2174  2.6087
  1.5217  2.8696  2.4348  4.5652  4.9565  4.0000  2.7826  2.5217  2.2609
  1.0000  2.0000  2.0000  3.0000  3.0000  4.0000  2.0000  2.0000  2.0000
  1.9130  2.5217  2.2609  5.7391  4.1739  4.0000  4.8696  3.9130  2.9565
  1.6087  2.3478  2.1739  4.8261  3.7826  4.0000  3.9130  4.6087  3.3043
  1.3043  2.1739  2.0870  3.9130  3.3913  4.0000  2.9565  3.3043  3.6522

Vector T =
  33.1304  27.6087  25.3043  32.1304  27.9130  21.0000  32.3478  30.5652  26.7826

13
- A wireless path consisting of H hops (links), each with link success probability p.
- A packet is (re-)transmitted up to M times on each link.
- If it still fails after M attempts, it gets retransmitted from the source (end-to-end).
Q: How many transmissions until end-to-end success?
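The transcript ends with this question, so here is a hedged Monte Carlo sketch (not the slides' solution) that estimates the expected number of transmissions under the stated model: per-link success probability p, at most M transmissions per link, and a full restart from the source after M failures on a link:

```python
import random

def transmissions_until_delivery(H, p, M, rng=random.random):
    """Simulate one packet; return the total number of transmissions until end-to-end success."""
    total = 0
    hop = 0                      # number of links already crossed
    while hop < H:
        # up to M attempts on the current link
        for _ in range(M):
            total += 1
            if rng() < p:        # link-level success
                hop += 1
                break
        else:                    # all M attempts failed -> restart from the source
            hop = 0
    return total

# Example (hypothetical parameters): H = 5 hops, p = 0.8, M = 3 attempts per link
samples = [transmissions_until_delivery(5, 0.8, 3) for _ in range(100_000)]
print(sum(samples) / len(samples))   # Monte Carlo estimate of E[# transmissions]
```

The same question can also be answered exactly with an absorbing chain whose states track how many links have been crossed, using the N and T machinery from the previous slides.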

