1
Nudge control through information processing
Rick Quax, postdoctoral researcher in Computational Science, University of Amsterdam (EU FP7 projects)

2
Entropy of a coin flip (Shannon's information theory)
The outcome of a fair coin flip carries 1 bit of information, i.e., I need 1 bit to fully describe the outcome.
The outcome of a coin that always lands the same way carries 0 bits of information, i.e., I need 0 bits to describe it, because it is already known beforehand.
In general, H(X) = -Σ_x p(x) log2 p(x); for a coin flip with heads probability p this is H = -p log2(p) - (1-p) log2(1-p).
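The two cases on this slide can be checked directly from the outcome probabilities; this is a minimal sketch added for illustration, not part of the original slides:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum_x p(x) log2 p(x), with 0 log 0 = 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0: a fair coin needs 1 bit to describe
print(entropy([1.0, 0.0]))  # 0.0: a predetermined outcome needs 0 bits
```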

3
Mutual information
[Diagram: a communication channel; the input X = 0 or 1 passes through a transform, noise, and a (non-linear) function, and the receiver infers X from the output Y.]
How much information was transferred? Examples: if the output determines X exactly, 1 bit is transferred (perfect transmission); if the output is independent of X, 0 bits are transferred (no transmission).

4
Mutual information
How much information was transferred? In general, I(X:Y) = H(X) - H(X|Y): the a priori uncertainty H(X) minus the uncertainty H(X|Y) that remains after knowing Y. In direct formula: I(X:Y) = Σ_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ].

5
Mutual information
Worked example, assuming p(X=x) = 0.5: for perfect transmission H(X) = 1 bit and H(X|Y) = 0, so I(X:Y) = 1 bit; if the output is independent of X, H(X|Y) = H(X) = 1 bit, so I(X:Y) = 0 bits.
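The direct formula above can be evaluated numerically. The sketch below is an added illustration (the nested-list representation of the joint distribution is an assumption made here); it reproduces the two example channels with p(X=x) = 0.5:

```python
import math

def mutual_information(joint):
    """I(X:Y) = sum_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ], in bits.
    `joint` is a nested list: joint[x][y] = p(X=x, Y=y)."""
    px = [sum(row) for row in joint]            # marginal p(x)
    py = [sum(col) for col in zip(*joint)]      # marginal p(y)
    return sum(pxy * math.log2(pxy / (px[x] * py[y]))
               for x, row in enumerate(joint)
               for y, pxy in enumerate(row) if pxy > 0)

# Perfect transmission (Y = X) with p(X=x) = 0.5: 1 bit transferred.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# No transmission (Y independent of X): 0 bits transferred.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```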

6
Nudge control: information flow
[Diagram: information flowing through a cascade of variables.]

7
Information integration
[Diagram: a transform with several i.i.d. inputs; the communication channel becomes a computation.]

8
Nudge control using information flow

9
Nudge control: information flow
Assuming p(X=x) = 0.5. The transform acts as a communication channel, i.e., a stochastic causal relation.

10
Information integration
Control has 100% efficiency: all entropy in Z comes from the controller β'.
[Figure comparing two control schemes: one doesn't work, the other works perfectly.]

11
Information flow in networks
[Diagram: a network with nodes A, B, C, D.]
Information dissipation length; information dissipation time. Which is the most influential node in the network?
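As a toy illustration of information dissipating with distance, one can track I(X0:Xk) for a bit relayed along a chain of noisy links. The binary-symmetric-channel model and the flip probability below are assumptions made for this sketch, not the slides' own network model:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mi_after_hops(flip, hops):
    """I(X0:Xk) for a uniform input bit relayed over `hops` binary symmetric
    channels, each flipping the bit independently with probability `flip`."""
    f = 0.0
    for _ in range(hops):
        f = f * (1 - flip) + (1 - f) * flip  # composed end-to-end flip probability
    return 1.0 - h2(f)

for k in (1, 2, 5, 10):
    print(k, mi_after_hops(0.1, k))  # mutual information decays with distance
```

The hop count at which this curve effectively reaches zero plays the role of an information dissipation length in this toy setting.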

12
Relation to causal structure
Causal structure (e.g., a structural equation model, SEM): if I nudge X to X+dX, I can solve for dY, i.e., I know exactly how Y changes due to X.
Information flows: if I nudge X, I can estimate |dY|, i.e., I know the impact magnitude but not its form. This is an approximation that is much easier to obtain.

13
Obstacle: ambiguity
Information flow is not always uniquely identifiable. Information from X flows into C, but does it flow through A or through B (or both)? I(C:A) > 0 and I(C:B) > 0 even if A did not causally influence C at all, due to correlation ('information overlap').
[Diagram: a network with nodes X, A, B, C.]
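A small simulation makes the ambiguity concrete. This is an added sketch; the dependency structure (X drives A and B, C computed from B alone) is an illustrative assumption consistent with the slide, not its exact model. C never reads A, yet I(C:A) comes out large because A and B share the common driver X:

```python
import math
import random

random.seed(0)

def mi_bits(pairs):
    """Plug-in estimate of mutual information (in bits) from (u, v) samples."""
    n = len(pairs)
    pj, pu, pv = {}, {}, {}
    for u, v in pairs:
        pj[(u, v)] = pj.get((u, v), 0) + 1 / n
        pu[u] = pu.get(u, 0) + 1 / n
        pv[v] = pv.get(v, 0) + 1 / n
    return sum(p * math.log2(p / (pu[u] * pv[v])) for (u, v), p in pj.items())

# X drives both A and B; C is computed from B only.
samples = []
for _ in range(10_000):
    x = random.randint(0, 1)
    a, b = x, x      # A and B both copy X, so they are perfectly correlated
    c = b            # C causally depends on B alone
    samples.append((a, b, c))

print(mi_bits([(a, c) for a, b, c in samples]))  # ~1 bit despite no A -> C link
```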

14
Solution: disambiguate by nudging
Nudge A by adding a small perturbation dA. Is I(C:dA) > 0? Is I(C:dB) > 0? After all correlations are resolved, information flow = causality (in magnitude).
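The same kind of toy system (again an illustrative sketch, not the slides' own model, and using the mean response of C as a crude stand-in for I(C:dA)) shows how nudging disambiguates: a nudge dA leaves C unchanged, while a nudge dB shifts C, revealing that the causal path runs through B:

```python
import random

random.seed(1)

def run(nudge_a=0.0, nudge_b=0.0, n=20000):
    """Toy system: X drives A and B; C depends on B only (no A -> C path).
    Returns the sample mean of C as a simple measure of C's response."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(0, 1)
        a = x + random.gauss(0, 0.1) + nudge_a  # nudged copy of X (never read by C)
        b = x + random.gauss(0, 0.1) + nudge_b  # nudged copy of X
        c = b + random.gauss(0, 0.1)            # C listens to B alone
        total += c
    return total / n

base = run()
print(run(nudge_a=0.5) - base)  # ~0.0: C ignores dA
print(run(nudge_b=0.5) - base)  # ~0.5: C responds to dB, so the path is via B
```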

15
Why 'nudging'?
The direct formula of mutual information, I(X:Y) = Σ_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ], is made up of probability densities. If the control is small, the perturbed densities remain close to the originals, so we can assume that the calculated information flows still predict the impact. If the control is too large, the information flow may change completely. But that is acceptable: a single nudge can still have a large effect due to non-linearities (unlike in linear models), and a series of persistent nudges can gradually transform the system to the desired behavior.

16
Conclusion
Control theory requires a causal structure, but obtaining the causal structure requires solving the system dynamics, which is generally impossible for complex systems. We therefore approximate the causal structure by the information-flow structure, and disambiguate with nudge controllers where necessary. Information flow predicts the impact magnitude of a small perturbation (nudge). Applications include ecosystems, medical disorders, and transport systems: low cost, minimal side effects.

17
The Sophocles project receives funding from the EC's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 317534. Thanks! Questions?

18
Information storage
