
1 Eager Markov Chains. Parosh Aziz Abdulla, Noomene Ben Henda, Richard Mayr, Sven Sandberg

2 Informationsteknologi Institutionen för informationsteknologi | Outline: Introduction; Expectation Problem; Algorithm Scheme; Termination Conditions; Subclasses of Markov Chains (with Examples); Conclusion

3 Introduction. Model: infinite-state Markov chains, used to model programs with unreliable channels, randomized algorithms, and similar systems. Interest: conditional expectations, such as the expected execution time or the expected resource usage of a program.

4 Introduction. An infinite-state Markov chain consists of an infinite set of states, a target set, and probability distributions over successor states. [Diagram: an example Markov chain]

5 Introduction. A reward function is defined over the paths that reach the target set. [Diagram: an example reward function]

6 Expectation Problem. Instance: a Markov chain and a reward function. Task: compute or approximate the conditional expectation of the reward function, conditioned on reaching the target set.

7 Expectation Problem. Example: the weighted sum is 0.8*4 + 0.1*(-5) = 2.7; the reachability probability is 0.8 + 0.1 = 0.9; the conditional expectation is 2.7/0.9 = 3.
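The arithmetic of this example can be checked mechanically; a minimal Python sketch, assuming the example's two target paths carry probabilities 0.8 and 0.1 and rewards 4 and -5 (numbers chosen so the conditional expectation comes out to 3, as on the slide):

```python
# Worked check of the conditional-expectation example (assumed numbers:
# two paths into the target set, with probabilities 0.8 and 0.1 and
# rewards 4 and -5, consistent with the slide's result of 3).
paths = [(0.8, 4.0), (0.1, -5.0)]  # (path probability, path reward)

weighted_sum = sum(p * r for p, r in paths)   # 0.8*4 + 0.1*(-5) = 2.7
reach_prob = sum(p for p, _ in paths)         # 0.8 + 0.1 = 0.9
cond_exp = weighted_sum / reach_prob          # 2.7 / 0.9 = 3.0

print(weighted_sum, reach_prob, cond_exp)
```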

8 Expectation Problem. Remark: the problem has in general been studied for finite-state Markov chains. Contribution: an algorithm scheme that computes it for infinite-state Markov chains, together with sufficient conditions for termination.

9 Algorithm Scheme. At each iteration n: compute all paths up to depth n, keep only those ending in the target set, and update the expectation accordingly. [Diagram: path exploration]

10 Algorithm Scheme. Correctness: the algorithm computes/approximates the correct value. Termination: not guaranteed in general; the iterations yield lower bounds but no upper bounds.
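The iteration described on the two slides above can be sketched as a breadth-first unfolding of the chain; a minimal Python sketch, where succ, in_target, and reward are assumed interfaces to the chain (all names here are illustrative, not the paper's notation):

```python
# Sketch of the path-exploration scheme: succ(state) yields
# (successor, probability) pairs, in_target tests membership in the
# target set, reward maps a path ending in the target set to a number.
def approximate_expectation(init, succ, in_target, reward, depth):
    """Explore paths up to the given depth; accumulate the probability
    mass and probability-weighted reward of paths ending in the target.
    Both returned values are lower bounds that grow with the depth."""
    weighted_sum = 0.0   # sum over target paths of P(path) * reward(path)
    reach_prob = 0.0     # total probability of reaching the target
    frontier = [(init, 1.0, (init,))]  # (state, path probability, path)
    for _ in range(depth):
        next_frontier = []
        for state, prob, path in frontier:
            for nxt, p in succ(state):
                extended = path + (nxt,)
                if in_target(nxt):
                    weighted_sum += prob * p * reward(extended)
                    reach_prob += prob * p
                else:
                    next_frontier.append((nxt, prob * p, extended))
        frontier = next_frontier
    return weighted_sum, reach_prob

# Toy chain: from "s", move to target "t" or stay in "s", each with
# probability 1/2; reward = number of steps taken. The weighted sum
# converges to 2 and the reachability probability to 1.
ws, rp = approximate_expectation(
    "s",
    lambda s: [("t", 0.5), ("s", 0.5)] if s == "s" else [],
    lambda s: s == "t",
    lambda path: len(path) - 1,
    60,
)
print(ws, rp, ws / rp)  # conditional expectation approaches 2
```

The toy chain also illustrates the termination issue: the values only ever approach the limit from below, which is why the next slides introduce conditions that bound the remaining tail.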

11 Termination Conditions. Exponentially bounded reward function. Intuition: a limit on how fast the reward function may grow with the path length. Remark: the limit is reasonable; for example, all polynomial functions are exponentially bounded.

12 Termination Conditions. [Plot: the absolute value of the reward, bounded above by an exponential function of the path length]

13 Termination Conditions. Eager Markov chain. Intuition: long paths contribute little to the expectation value. Remark: reasonable; for example, PLCS, PVASS, and NTM all induce eager Markov chains.
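The eagerness condition can be phrased as domination of the tail probabilities by a geometric sequence; a small illustrative Python predicate (the function and variable names are mine, not the paper's):

```python
# Illustrative formalization of eagerness: a chain is eager w.r.t. a
# target set if there exist a and c < 1 such that
#   P(target first reached after more than n steps) <= a * c**n.
def dominated_by_geometric(tails, a, c):
    """Check that tails[n] <= a * c**n for every n."""
    return all(t <= a * c ** n for n, t in enumerate(tails))

# Tail probabilities of a chain that hits its target with prob. 1/2 at
# every step: P(more than n steps) = (1/2)**n.
tails = [0.5 ** n for n in range(20)]
print(dominated_by_geometric(tails, 1.0, 0.5))   # the bound holds
print(dominated_by_geometric(tails, 1.0, 0.4))   # too small a c fails
```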

14 Termination Conditions. [Plot: the probability of reaching the target in more than n steps, bounded above by an exponentially decaying function of n]

15 Termination Conditions. [Formulas combining the two conditions into convergent bounds on the reachability probability, the weighted sum, and the conditional expectation]

16 Subclasses of Markov Chains. [Diagram: eager Markov chains include both Markov chains with a finite eager attractor (induced, e.g., by PLCS) and Markov chains with the bounded coarseness property (induced, e.g., by PVASS and NTM)]

17 Finite Eager Attractor. Attractor: almost surely reached from every state. Finite eager attractor: almost surely reached, and it is unlikely that a run stays "too long" outside of it.

18 Finite Eager Attractor. [Plot: the probability of returning to the attractor in more than n steps, bounded above by an exponentially decaying function of n]

19 Finite Eager Attractor. Does a finite eager attractor imply an eager Markov chain? Reminder: in an eager Markov chain, the probability of reaching the target in more than n steps decays exponentially in n.

20 Finite Eager Attractor. [Diagram: paths of length n that visit the attractor t times]

21 Finite Eager Attractor. Proof idea: identify two sets of paths: paths that visit the attractor often without reaching the target set, and paths that visit the attractor rarely without reaching the target set.

22 Finite Eager Attractor. Paths visiting the attractor rarely: t less than n/c. [Diagram: bounding the probability of these paths]

23 Finite Eager Attractor. Paths visiting the attractor often: t greater than n/c. [Diagram: bounding the probability of these paths]

24 Probabilistic Lossy Channel Systems (PLCS). Motivation: finite-state processes communicating through unbounded and unreliable channels; widely used to model systems with unreliable channels, such as link protocols.

25 PLCS. [Diagram: a PLCS with control locations q0, q1, q2, q3, operations nop, c!a, c!b, c?b, and a channel c containing "aba"; a send step c!a appends a to the channel, a receive step c?b consumes a b from the channel]

26 PLCS. [Diagram: a loss step removes an arbitrary message from the channel]

27 PLCS. A configuration consists of a control location and the content of the channel. Example: [q3, "aba"].

28 PLCS. A PLCS induces a Markov chain: the states are the configurations, and the transitions are loss steps combined with discrete steps.

29 PLCS. Example: [q1, "abb"] goes to [q2, "a"] by losing one of the messages "b" and firing the marked step. Probability: P = Ploss * 2/3.

30 PLCS. Result: each PLCS induces a Markov chain with a finite eager attractor. Proof hint: when the channel content is large enough, losing a message is more likely than not (probability greater than 1/2).
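The proof hint can be illustrated with a toy drift computation, assuming (as a simplification, not the paper's exact model) that each of the m messages currently in the channel is lost independently with probability p_loss per step while at most one new message is appended:

```python
# Toy drift computation behind the proof hint (a simplification of the
# PLCS loss semantics): at most +1 message sent per step, while each of
# the m messages in the channel is lost with probability p_loss.
def expected_drift(m, p_loss):
    """Expected one-step change of the channel length."""
    return 1.0 - p_loss * m

# Once m exceeds 1/p_loss the drift turns negative, so large channel
# contents tend to shrink: the small configurations act as a finite
# attractor, and long excursions away from it are unlikely.
print(expected_drift(50, 0.01))    # positive drift: channel may grow
print(expected_drift(200, 0.01))   # negative drift: channel shrinks
```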

31 Bounded Coarseness. The probability of reaching the target within K steps is bounded from below by a constant b.

32 Bounded Coarseness. Does a boundedly coarse Markov chain imply an eager Markov chain? Reminder: in an eager Markov chain, the probability of reaching the target in more than n steps decays exponentially in n.

33 Bounded Coarseness. [Diagram: time split into blocks of K steps; Pn denotes the probability of avoiding the target for nK steps, and each K-step block reaches the target with probability at least b]
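The block argument yields a one-line tail bound; a Python sketch, under the assumption that successive K-step blocks can be treated as failing independently with probability at most 1 - b:

```python
# If every state reaches the target within K steps with probability at
# least b, then avoiding the target for n steps means failing at least
# n // K consecutive K-step blocks, so the tail decays geometrically:
#   P(target not reached within n steps) <= (1 - b) ** (n // K).
def tail_bound(n_steps, K, b):
    """Upper bound on the probability of avoiding the target."""
    return (1.0 - b) ** (n_steps // K)

print(tail_bound(20, 5, 0.3))   # four blocks of 5 steps: (0.7)**4
```

This geometric decay is exactly the eagerness condition, which is why bounded coarseness places a chain inside the eager class.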

34 Probabilistic Vector Addition Systems with States (PVASS). Motivation: PVASS are generalizations of Petri nets, widely used to model parallel processes, mutual exclusion programs, and similar systems.

35 PVASS. A configuration consists of a control location and the values of the variables x and y. Example: [q1, x=2, y=0]. [Diagram: a PVASS with control locations q0, q1, q2, q3 and operations nop, ++x, ++y, --x, --y]

36 PVASS. A PVASS induces a Markov chain: the states are the configurations, and the transitions are discrete steps.

37 PVASS. Example: [q1,1,1] goes to [q2,1,0] by taking the marked step. Probability: P = 2/3.

38 PVASS. Result: each PVASS induces a Markov chain which has the bounded coarseness property.

39 Noisy Turing Machines (NTM). Motivation: Turing machines augmented with a noise parameter; used to model systems operating in a "hostile" environment.

40 NTM. An NTM is fully described by a Turing machine and a noise parameter. [Diagram: an NTM with control states q1, q2, q3, q4 and a tape containing "ab#b#aab"]

41 NTM. [Diagram: a discrete step, e.g. rewriting the tape from "ab#b#aab" to "bb#b#aab" under the rule a/b]

42 NTM. [Diagram: a noise step, e.g. corrupting the tape from "ab#b#aab" to "#b#b#aab"]

43 NTM. Result: each NTM induces a Markov chain which has the bounded coarseness property.

44 Conclusion. Summary: an algorithm scheme for approximating expectations of reward functions, and sufficient conditions that guarantee termination: an exponentially bounded reward function and an eager Markov chain.

45 Conclusion. Directions for future work: extending the results to Markov decision processes and stochastic games, and finding more concrete applications.

46 Thank you

47 PVASS. An order <= on configurations: same control locations and componentwise ordered values of the variables. Example: [q0,3,4] <= [q0,3,5].

48 PVASS. The probability of each step is greater than 1/10, so the chain is boundedly coarse with parameters K and (1/10)^K. [Diagram: reaching the target set within K iterations]
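The parameters on this slide follow from a simple path-counting observation; a minimal Python sketch, taking the slide's per-step lower bound 1/10 as p_min:

```python
# If each step has probability at least p_min and, from any state, some
# path of length at most K enters the target set, then that single path
# already has probability at least p_min ** K; hence the chain is
# boundedly coarse with parameters K and p_min ** K.
def coarseness_bound(K, p_min=0.1):
    """Lower bound b on the probability of reaching the target in K steps."""
    return p_min ** K

print(coarseness_bound(3))  # with p_min = 1/10: at least 1/1000
```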

