1 Boltzmann, Shannon, and (Missing) Information

2 Second Law of Thermodynamics. Entropy of a gas. Entropy of a message. Information?

3 B.B. (before Boltzmann): Carnot, Kelvin, Clausius (19th c.). Second Law of Thermodynamics: the entropy of an isolated system never decreases. Entropy defined in terms of heat exchange: change in entropy = (heat absorbed)/(absolute temperature) (+ if absorbed, - if emitted). (Molecules unnecessary.)

4 [Figure: heat Q flows from a hot reservoir (T_h) to a cold reservoir (T_c).] Isolated system. Has some structure (ordered). Heat Q extracted from hot, same amount absorbed by cold: energy conserved (1st Law). Entropy of hot decreases by Q/T_h; entropy of cold increases by Q/T_c > Q/T_h (2nd Law). In the fullness of time … no structure (no order): lukewarm.
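A minimal numerical sketch of this bookkeeping (Q, T_h, and T_c are made-up illustrative values):

```python
# Entropy bookkeeping for heat Q flowing from a hot to a cold reservoir.
Q = 100.0    # heat extracted from the hot reservoir (J); illustrative
T_h = 400.0  # hot reservoir temperature (K); illustrative
T_c = 300.0  # cold reservoir temperature (K); illustrative

dS_hot = -Q / T_h    # hot side emits heat: its entropy decreases
dS_cold = +Q / T_c   # cold side absorbs heat: its entropy increases

print(f"dS_hot  = {dS_hot:+.3f} J/K")
print(f"dS_cold = {dS_cold:+.3f} J/K")
print(f"total   = {dS_hot + dS_cold:+.3f} J/K  (>= 0, per the 2nd Law)")
```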

5 Paul’s entropy picture. Sun releases heat Q at high temperature → entropy decreases. Living stuff absorbs heat Q at lower temperature → larger entropy increase. Stuff releases heat q, gets more organized → entropy decreases. Surroundings absorb q, get more disorganized → entropy increases. … Overall, entropy increases.

6 The 2nd Law of Thermodynamics does not forbid the emergence of local complexity (e.g., life, brain, …). The 2nd Law of Thermodynamics does not require the emergence of local complexity (e.g., life, brain, …).

7 Boltzmann (1872): entropy of a dilute gas. N molecules obeying Newtonian physics (time reversible). State of each molecule given by its position and momentum. Molecules may collide, i.e., transfer energy and momentum among each other.

8 Represent the system in a space whose coordinates are positions and momenta (momentum = mv): phase space. [Figure: phase-space axes, momentum vs. position.] Subdivide the space into B bins. p_k = fraction of particles whose positions and momenta are in bin k.

9 The p_k’s change because of motion, collisions, and external forces. Build a histogram of the p_k’s.
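A minimal sketch of this binning with NumPy, using made-up one-dimensional particle data:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000                    # number of molecules (made-up data)
x = rng.uniform(0.0, 1.0, N)  # positions (arbitrary units)
p = rng.normal(0.0, 1.0, N)   # momenta, p = m*v (arbitrary units)

# Subdivide phase space into B bins (here a 10 x 10 grid, so B = 100).
counts, _, _ = np.histogram2d(x, p, bins=10)
p_k = counts.ravel() / N      # p_k = fraction of particles in bin k
assert abs(p_k.sum() - 1.0) < 1e-9
```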

10 All in 1 bin: highly structured, highly ordered → no missing information, no uncertainty. Uniformly distributed: unstructured, disordered, random → maximum uncertainty, maximum missing information. In-between case → intermediate amount of missing information (uncertainty). Any flattening of the histogram (the phase-space landscape) increases uncertainty. Given the p_k’s, how much information do you need to locate a molecule in phase space?

11 Boltzmann: the amount of uncertainty, or missing information, or randomness of the distribution of the p_k’s can be measured by H_B = Σ_k p_k log(p_k).

12 The p_k histogram revisited. All in 1 bin: highly structured, highly ordered. H_B = 0: maximum H_B. Uniformly distributed: unstructured, disordered, random. H_B = -log B: minimum H_B. In-between case → intermediate amount of missing information (uncertainty): in-between value of H_B.
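A quick numerical check of the three cases (a sketch; B = 4 and the in-between histogram are arbitrary choices):

```python
import numpy as np

def H_B(p):
    """Boltzmann's H: sum of p_k * log(p_k), taking 0 * log(0) = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(np.sum(nz * np.log(nz)))

B = 4
print(H_B([1.0, 0.0, 0.0, 0.0]))    #  0.0: all in 1 bin, maximum H_B
print(H_B([1 / B] * B))             # -log(4) ≈ -1.386: uniform, minimum H_B
print(H_B([0.5, 0.3, 0.15, 0.05]))  # ≈ -1.142: in between
```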

13 Boltzmann’s famous H theorem. Define: H_B = Σ_k p_k log(p_k). Assume: molecules obey Newton’s laws of motion. Show: H_B never increases. AHA! -H_B never decreases: it behaves like entropy!! If it looks like a duck … Identify entropy with -H_B: S = -k_B H_B, where k_B is Boltzmann’s constant.
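Boltzmann’s proof runs through collision dynamics, which is beyond a few lines; as a toy illustration only, here is a doubly stochastic mixing rule (averaging each bin with its neighbor), under which H_B provably never increases. It mimics the flattening the theorem describes; it is not Boltzmann’s argument:

```python
import numpy as np

def H_B(p):
    nz = p[p > 0]
    return float(np.sum(nz * np.log(nz)))

rng = np.random.default_rng(1)
p = rng.dirichlet(np.ones(8))   # a random initial histogram over 8 bins
for step in range(6):
    print(f"step {step}: H_B = {H_B(p):.4f}")
    # Toy "collision": average each bin with its neighbor. This map is
    # doubly stochastic, so H_B cannot increase (the histogram flattens).
    p = 0.5 * (p + np.roll(p, 1))
```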

14 New version of the Second Law: the phase-space landscape either does not change or it becomes flatter. It may peak locally (life?) provided it flattens overall.

15 Two “paradoxes”. 1. Reversal (Loschmidt). Irreversible phenomena (2nd Law, arrow of time) emerge from reversible molecular dynamics. (How can this be? Cf. Tony Rothman.)

16 2. Recurrence (Poincaré, Zermelo). Sooner or later, you are back where you started. (So, what does approach to equilibrium mean?) Graphic from: J. P. Crutchfield et al., “Chaos,” Sci. Am., Dec. 1986.

17 Well … 1. Interpret the H theorem probabilistically. Boltzmann’s treatment of collisions is really probabilistic: molecular chaos, coarse-graining, indeterminacy (anticipating quantum mechanics?). Entropy is the probability of a macrostate: is it something that emerges in the transition from the micro to the macro? 2. The Poincaré recurrence time is really very, very long for real systems, longer than the age of the universe, even. Anyhow, entropy does not decrease! … on to Shannon.

18 A.B. (after Boltzmann): Shannon (1949). Entropy of a message. Message encoded in an alphabet of B symbols, e.g., an English sentence (26 letters + space + punctuation), Morse code (dot, dash, space), DNA (A, T, G, C). p_k = fraction of the time that symbol k occurs (~ probability that symbol k occurs).

19 Pick a symbol, any symbol … Shannon’s problem: want a quantity that measures missing information: how much information is needed to establish what the symbol is, or the uncertainty about what the symbol is, or how many yes-no questions need to be asked to establish what the symbol is. Shannon’s answer: H_S = -k Σ_k p_k log(p_k) (a positive number).
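A minimal sketch of computing H_S from a message’s symbol frequencies (natural log and k = 1 are arbitrary unit choices):

```python
import math
from collections import Counter

def shannon_H(message, k=1.0):
    """H_S = -k * sum of p_k * log(p_k), p_k = frequency of symbol k."""
    counts = Counter(message)
    n = len(message)
    return -k * sum((c / n) * math.log(c / n) for c in counts.values())

print(shannon_H("HELLO WORLD"))  # missing information, in nats
```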

20 Morse code example. All dots: p_1 = 1, p_2 = p_3 = 0. Take any symbol: it’s a dot; no uncertainty, no question needed, no missing information, H_S = 0. Equal chance that it’s a dot or a dash: p_1 = p_2 = 1/2, p_3 = 0. Given the p’s, need to ask one question (what question?): one piece of missing information, H_S = log(2) ≈ 0.69. Random: all symbols equally likely, p_1 = p_2 = p_3 = 1/3. Given the p’s, need to ask as many as 2 questions: 2 pieces of missing information, H_S = log(3) ≈ 1.1.
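The three cases checked numerically (the sample strings are made up; any strings with these symbol fractions give the same values):

```python
import math
from collections import Counter

def shannon_H(message):
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

print(shannon_H("." * 12))   # all dots:                    0.0
print(shannon_H(".-" * 6))   # dot or dash, 50/50:          log(2) ≈ 0.69
print(shannon_H(".- " * 4))  # dot, dash, space, 1/3 each:  log(3) ≈ 1.1
```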

21 Two comments: 1. It looks like a duck … but does it quack? There’s no H theorem for Shannon’s H_S. 2. H_S is insensitive to meaning. Shannon: “[The] semantic aspects of communication are irrelevant to the engineering problem.”

22 On H theorems. Q: What did Boltzmann have that Shannon didn’t? A: Newton (or equivalent dynamical rules for the evolution of the p_k’s). Does Shannon have rules for how the p_k’s evolve? In a communications system, the p_k’s may change because of transmission errors. In genetics, is it mutation? Is the result always a flattening of the p_k landscape, or an increase in missing information? Is Shannon’s H_S just a metaphor? What about Maxwell’s demon?

23 On dynamical rules. Is a neuron like a refrigerator? Entropy of fridge decreases. Entropy of signal decreases.

24 The entropy of a refrigerator’s contents may decrease, but it needs electricity. The entropy of the message passing through a neuron may decrease, but it needs nutrients. General Electric designs refrigerators. Who designs neurons?

25 Insensitive to meaning: Morse revisited. X = { … } encodes H E L L O W O R L D. Y = { .- -… … } encodes A B C D E F G H I J M. Same p_k’s, same entropies: same “missing information.”

26 If X and Y are separately scrambled, they still have the same p_k’s and the same “missing information”: the same entropy. The message is in the sequence? What do geneticists say? Is information, as entropy, not a very useful way to characterize the genetic code?
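A sketch of the scrambling point (random.sample is just one way to permute the symbols):

```python
import math
import random
from collections import Counter

def shannon_H(message):
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

msg = "HELLO WORLD"
scrambled = "".join(random.sample(msg, len(msg)))  # same symbols, new order
print(msg, "->", scrambled)
print(math.isclose(shannon_H(msg), shannon_H(scrambled)))  # True: same p_k's
```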

27 Do Boltzmann and Shannon mix? Boltzmann’s entropy of a gas: S_B = -k_B Σ_k p_k log p_k. k_B relates temperature to energy (E ~ k_B T) and relates the temperature of a gas to PV (PV = N k_B T). Shannon’s entropy of a message: S_S = -k Σ_k p_k log p_k. k is some positive constant; there is no reason for it to be k_B. Does S_B + S_S mean anything? Does the sum never decrease? Can an increase in one make up for a decrease in the other?
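A sketch of the constants question: the same dimensionless H dressed in thermodynamic units (k = k_B) versus bits (k = 1/log 2):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
H = math.log(2)     # dimensionless missing information for a 50/50 split

S_thermo = k_B * H        # Boltzmann-style units: joules per kelvin
S_bits = H / math.log(2)  # Shannon-style with k = 1/log(2): bits
print(f"{S_thermo:.3e} J/K  vs  {S_bits:.1f} bit")
```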

28 Maxwell’s demon yet once more. The demon measures the velocity of a molecule by bouncing light on it and absorbing the reflected light; the process transfers energy to the demon and increases the demon’s entropy, making up for the entropy decrease of the gas.

