1 The Coffee Automaton: Quantifying the Rise and Fall of Complexity in Closed Systems Scott Aaronson (MIT) Joint work with Lauren Ouellette and Sean Carroll

2 It all started with a talk Sean Carroll gave last summer on a cruise from Norway to Denmark…



5 Proposed system: the coffee automaton
Our modest goal: understand the rise and fall of complexity quantitatively in some simple model system
An n × n grid (n even), initially in the configuration to the right (half coffee and half cream)
At each time step, choose 2 horizontally or vertically adjacent squares uniformly at random, and swap them if they're colored differently
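A minimal sketch of these dynamics in Python (the array layout, the rejection-sampling choice of adjacent pair, and the top/bottom half-and-half starting split are my own assumptions about details the slide leaves open):

```python
import random

def coffee_step(grid, n):
    """One update: swap a uniformly random adjacent pair if colors differ."""
    # Rejection sampling: a random square plus a random direction gives
    # every ordered adjacent pair equal probability once off-grid picks
    # are rejected, hence every unordered adjacent pair is uniform too.
    while True:
        x, y = random.randrange(n), random.randrange(n)
        dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        x2, y2 = x + dx, y + dy
        if 0 <= x2 < n and 0 <= y2 < n:
            break
    if grid[x][y] != grid[x2][y2]:
        grid[x][y], grid[x2][y2] = grid[x2][y2], grid[x][y]

# Half cream (0) on top, half coffee (1) on the bottom; n even.
n = 8
grid = [[0] * n for _ in range(n // 2)] + [[1] * n for _ in range(n // 2)]
for _ in range(10_000):
    coffee_step(grid, n)
```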


7 Control Experiment: The Non-Interacting Coffee Automaton
The starting configuration is the same, but now we let an arbitrary number of cream particles occupy the same square (and treat the coffee as just an inert background)
Dynamics: each cream particle follows an independent random walk
Intuitively, because of the lack of interaction, complexity should never become large in this system
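A corresponding sketch for the control experiment; the "stay put at the wall" boundary convention is one assumption about a detail the slide leaves open:

```python
import random

def noninteracting_step(counts, n):
    """Each cream particle takes one independent random-walk step; any
    number of particles may share a square (coffee is inert background)."""
    new = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            for _ in range(counts[x][y]):
                dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
                x2, y2 = x + dx, y + dy
                if not (0 <= x2 < n and 0 <= y2 < n):
                    x2, y2 = x, y  # assumed convention: blocked moves stay put
                new[x2][y2] += 1
    return new
```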

8 How to Quantify Complexity? Lots of Approaches in the Santa Fe Community
Fundamental requirement: need to assign a value near 0 to both completely ordered states and completely random states, but assign large values to other states (the "complex" states)
Also, it should be possible to compute or approximate the measure efficiently in cases of interest

9 Warmup: How to Quantify Entropy
Problem: Approximating the Shannon entropy H is SZK-complete!
K(x) = Kolmogorov complexity of x = length of the shortest program that outputs x
Old, well-understood connection between K and H
K(x) is uncomputable: worse than SZK-complete!!
But in some (not all) situations, one can approximate K(x) by G(x) ≈ K(x), where G(x) is the gzip file size of x
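The gzip proxy is easy to try; a sketch using Python's zlib (the same DEFLATE algorithm that gzip uses):

```python
import os
import zlib

def G(x: bytes) -> int:
    """Compressed size of x in bytes: a crude, efficiently computable
    stand-in for the uncomputable K(x)."""
    return len(zlib.compress(x, 9))

print(G(b"0" * 1000))        # ordered: compresses to a few bytes
print(G(os.urandom(1000)))   # random: stays near 1000 bytes
```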

10 Approach 1 to Quantifying Complexity: Coarse-Grained Entropy
Let f(x) be a function that outputs only the "important," macroscopic information in a state x, washing or averaging out the random fluctuations
Then look at H(f(x)) rather than H(x). Intuitively, H(f(x)) should be maximized when there's interesting structure
Advantage of coarse-graining: something physicists do all the time in practice
Disadvantage: seems to some like a human notion: who decides which variables are important or unimportant?
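As a toy illustration (the specific f below, with 10 levels echoing the coarse-graining used later, is my own), H(f(x)) can be estimated from samples with the plug-in entropy estimator:

```python
from collections import Counter
from math import log2

def empirical_entropy(samples):
    """Plug-in Shannon entropy (in bits) of a list of observed values."""
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def f(bits):
    """Toy coarse-graining: keep only the overall fraction of 1s,
    rounded to one of 10 levels, discarding the fluctuations."""
    return round(9 * sum(bits) / len(bits))
```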

11 Approach 2: Causal Complexity (Shalizi et al. 2004)
Given a point (x,t) in a cellular automaton's spacetime history, let P and F be its past and future lightcones respectively
[Figure: spacetime diagram showing the past lightcone P and the future lightcone F of the point (x,t), with time running upward]
Then consider the expected mutual information I(P;F) between the configurations in P and F
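Here "mutual information" is the standard quantity; spelled out:

```latex
I(P;F) \;=\; H(P) + H(F) - H(P,F) \;=\; H(F) - H(F \mid P)
```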

12 Intuition:
If the dynamics are simple, then I(P,F) ≈ 0, since H(P) ≈ H(F) ≈ 0
If the dynamics are random, then I(P,F) ≈ 0, since H(F|P) ≈ H(F)
In intermediate cases, I(P,F) can be large, since the past has nontrivial correlations with the future
Advantages of causal complexity:
Has an operational meaning
Depends only on causal structure, not on arbitrary choices of how to coarse-grain
Disadvantages:
Not a function of the current state only
Requires going arbitrarily far into the past and future
I(P,F) can be large simply because not much is changing

13 Approach 3: Logical Depth (Bennett 1988)
Depth(x) = running time of the shortest program that outputs x
Depth(0^n) ≈ Depth(random string) ≈ n (both are shallow: "print n zeroes" and "print x" each run in linear time)
But there must exist very deep strings, since otherwise Kolmogorov complexity would become computable!
Advantage: connects Santa Fe and computational complexity
Disadvantages:
There are intuitively complex patterns that aren't deep
Its computability properties are terrible

14 Approach 4: Sophistication (Kolmogorov 1983, Koppel 1987)
Given a set S ⊆ {0,1}^n, let K(S) be the length of the shortest program that lists the elements of S
Given x ∈ {0,1}^n, let Soph_c(x) be the minimum of K(S), over all S ⊆ {0,1}^n such that x ∈ S and K(S) + log₂|S| ≤ K(x) + c
Intuitively: in a near-minimal program for x, the smallest number of bits that need to be "code" rather than "random data"
Sophistication is often thought of in terms of a two-part code: [program for S | incompressible index of x in S], with Soph_c(x) = the size of the first part
Soph_c(0^n) = O(1): take S = {0^n}
Soph_c(random string) = O(1): take S = {0,1}^n
On the other hand, one can show that there exist x with Soph_c(x) ≥ n − O(log n)

15 Special Proof Slide for Hebrew U!
Theorem (far from tight; follows Antunes-Fortnow): Let c = o(n); then there exists a string z ∈ {0,1}^n with Soph_c(z) ≥ n/3
Proof: Let A = {x ∈ {0,1}^n : x belongs to some set S ⊆ {0,1}^n with K(S) ≤ n/3 and K(S) + log₂|S| ≤ 2n/3}
Let z be the lexicographically first n-bit string not in A. Such a z must exist by counting: each S with K(S) = ℓ has |S| ≤ 2^(2n/3 − ℓ), and there are at most 2^ℓ such sets, so |A| ≤ (n/3 + 1)·2^(2n/3) < 2^n
K(z) ≤ n/3 + o(n), since among all programs that define a set S with K(S) ≤ n/3, we simply need to specify which one runs for the longest time
Suppose, for contradiction, that Soph_c(z) < n/3. Then there's a set S containing z such that K(S) < n/3 and K(S) + log₂|S| ≤ K(z) + c ≤ n/3 + o(n) ≤ 2n/3. But that means z ∈ A: contradiction.

16 Problem: Soph_c(x) is tiny for typical states x of the coffee automaton!
Why? Because we can let S be the ensemble of sampled states at time t; then x is almost certainly an incompressible element of S
Solution: could use resource-bounded sophistication, e.g., the minimal length of p in a minimal two-part code consisting of (a polynomial-time program p outputting an AC^0 circuit C, an input to C)
Advantage of resource-bounded sophistication: the two-part code picks out a coarse-graining "for free," without our needing to put it in by hand
Disadvantages: hard to compute; approximations to Soph_c^efficient(x) didn't work well in experiments

17 Our Complextropy Measure
Let I = the coffee-cup bitmap (n² bits)
Let C(I) = a coarse-graining of I: each pixel gets colored by the mean of the surrounding L × L block (with, say, L ~ √n), rounded to one of (say) 10 "creaminess" levels
Complextropy := K(C(I))
G(C(I)) ≈ K(C(I)): the gzip file size of C(I), the approximation to complextropy that we're able to compute
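A sketch of the whole pipeline; note that I coarse-grain over disjoint L × L blocks rather than the sliding "surrounding block" of the slide, a simplification of my own:

```python
import zlib

def coarse_grain(I, n, L, levels=10):
    """C(I): replace each disjoint L x L block by its mean cream fraction,
    rounded to one of `levels` creaminess levels."""
    m = n // L
    return [[round((levels - 1)
                   * sum(I[bx * L + i][by * L + j]
                         for i in range(L) for j in range(L))
                   / (L * L))
             for by in range(m)]
            for bx in range(m)]

def complextropy_estimate(I, n, L):
    """G(C(I)): gzip-style compressed size of the coarse-grained image."""
    flat = bytes(v for row in coarse_grain(I, n, L) for v in row)
    return len(zlib.compress(flat, 9))
```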

18 Complextropy's connection to sophistication and two-part codes:
[compressed coarse-grained image | remaining info in image], with K(C(I)) = the size of the first part
Complextropy can be seen as an extremely resource-bounded type of sophistication!
Complextropy's connection to causal complexity: the regions over which we coarse-grain aren't totally arbitrary! They can be derived from the coffee automaton's causal structure

19 The Border Pixels Problem
Even in the non-interacting case, rounding effects cause a random pattern in the coarse-grained image, at the border between the cream and the coffee
This makes K(C(I)) artificially large
Hacky solution: allow rounding by ±1 level to the most common color in each row. That gets rid of the border-pixel artifacts, while hopefully still preserving structure in the interacting case
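The hack as I read it (a sketch; "within ±1 level of the row's most common color" is my interpretation of the slide's "rounding ±1"):

```python
from collections import Counter

def suppress_border_pixels(C):
    """In each row of the coarse-grained image, snap any level within
    +/-1 of the row's most common level to that level, removing the
    random rounding artifacts along the cream/coffee border."""
    out = []
    for row in C:
        mode = Counter(row).most_common(1)[0][0]
        out.append([mode if abs(v - mode) <= 1 else v for v in row])
    return out
```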

20 Behavior of G(I) and G(C(I)) in Interacting Case

21 Behavior of G(I) and G(C(I)) in Non-Interacting Case

22 Qualitative Pattern Doesn't Depend on Compression Program

23 Dependence on the Grid Size n
The maximum entropy G(I) increases like ~n² for an n × n coffee cup
The maximum coarse-grained entropy G(C(I)) increases like ~n (consistent with a coarse-grained image of (n/L)² ≈ n pixels when L ~ √n)

24 Analytic Understanding?
We can give a surprisingly clean proof that K(C(I)) never becomes large in the non-interacting case
Let a_t(x,y) be the number of cream particles at point (x,y) at step t
Claim: E[a_t(x,y)] ≤ 1 for all x, y, t
Proof: true when t = 0; apply induction on t
Now let a_t(B) = Σ_{(x,y)∈B} a_t(x,y) be the number of cream particles in an L × L square B after t steps
Clearly E[a_t(B)] ≤ L² for all t, B, by linearity
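One way to see the induction step (a sketch, under the assumption that the single-particle transition matrix P is doubly stochastic, as it is for the stay-put boundary convention sketched earlier):

```latex
\mathbb{E}[a_{t+1}(v)] \;=\; \sum_{u} P(u,v)\,\mathbb{E}[a_t(u)]
\;\le\; \sum_{u} P(u,v) \;=\; 1 .
```

The inequality uses the inductive hypothesis E[a_t(u)] ≤ 1, and the last equality uses that each column of P sums to 1.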

25 By a Chernoff bound, [equation omitted from transcript]
So by a union bound, [equation omitted], provided [condition omitted]
If the above happens, then by symmetry, each row of C(I) will be a uniform color, depending only on the height of the row and on t
Hence K(C(I)) ≤ log₂ n + log₂ t + O(1)

26 Open Problems
Prove that, in the interacting case, K(C(I)) does indeed become Ω(n) (or even just Ω(log n))
This requires understanding the detailed behavior of a Markov chain prior to mixing; it's not so obvious what tools to use
Maybe the 1D case is a good starting point?
Clarify the relations among coarse-grained entropy, causal complexity, logical depth, and sophistication
Find better methods to approximate entropy and to deal with border-pixel artifacts

27 Long-range ambition: laws that, given any mixing process, let us predict whether or not coarse-grained entropy or other types of complex organization will form on the way to equilibrium
So far…
Theorem: in a "gas" of non-interacting particles, no nontrivial complextropy ever forms
Numerically-supported conjecture: in a "liquid" of mutually-repelling particles, some nontrivial complextropy does form
Effects of gravity / viscosity / other more realistic physics?
