
1 Information Reading: Complexity: A Guided Tour, Chapter 3

2 Recap: Core disciplines of the science of complexity
- Dynamics: the study of the continually changing structure and behavior of systems
- Information: the study of representation, symbols, and communication
- Computation: the study of how systems process information and act on the results
- Evolution: the study of how systems adapt to constantly changing environments

3 Information
Motivating questions:
- What are “order” and “disorder”? What are the laws governing these quantities?
- How do we define “information”?
- What is the “ontological status” of information?
- How is information signaled between two entities?
- How is information processed to produce “meaning”?

4 Energy, Work, and Entropy
- What is energy?
- What is entropy?
- What are the laws of thermodynamics?
- What is “the arrow of time”?

5 Maxwell’s Demon
James Clerk Maxwell, 1831−1879
See NetLogo simulation

6 Szilard’s solution Leo Szilard, 1898−1964

7 Bennett and Landauer’s solution Charles Bennett, b. 1943 Rolf Landauer, 1927–1999

8 Entropy/Information in Statistical Mechanics
What is “statistical mechanics”? Describe the concepts of “macrostate” and “microstate”.
Ludwig Boltzmann, 1844−1906

9 Entropy/Information in Statistical Mechanics
What is “statistical mechanics”? Describe the concepts of “macrostate” and “microstate”.
Combinatorics of a slot machine. Possible fruits: apple, orange, cherry, pear, lemon.
- Microstates
- Macrostates
Macrostate “three identical fruits”: how many microstates?
Macrostate “exactly one lemon”: how many microstates?
Macrostate “at least 2 cherries”: how many microstates?
(The counts are worked out in the sketch below.)
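A minimal brute-force count of these microstates, as referenced above. This is my own sketch in Python, assuming the standard three-window slot machine; the function name is illustrative, not from the slides.

```python
from itertools import product

FRUITS = ["apple", "orange", "cherry", "pear", "lemon"]

# Every microstate is one ordered assignment of a fruit to each of 3 windows.
MICROSTATES = list(product(FRUITS, repeat=3))  # 5**3 = 125 in total

def count(predicate):
    """Count the microstates belonging to the macrostate the predicate defines."""
    return sum(1 for m in MICROSTATES if predicate(m))

print(count(lambda m: len(set(m)) == 1))        # "three identical fruits" -> 5
print(count(lambda m: m.count("lemon") == 1))   # "exactly one lemon"      -> 48
print(count(lambda m: m.count("cherry") >= 2))  # "at least 2 cherries"    -> 13
```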

10 Boltzmann’s entropy, S = k ln W

11 Boltzmann entropy of a macrostate: natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).

12 Boltzmann’s entropy, S Boltzmann entropy of a macrostate: natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”). Aside: What is a “natural logarithm”?

13 Natural log: the logarithm to base e ≈ 2.71828

14 Intuition about log_b(x)
“How many places (diversity) do I need (when working with b symbols) to represent the number x?”
log_10(100) = 2
log_10(1000) = 3
log_2(16) = 4

15 Intuition about log_b(x)
“How many places (diversity) do I need (when working with b symbols) to represent the number x?”
log_10(100) = 2
log_10(1000) = 3
log_2(16) = 4
log_2(0.5) = -1 ?

16 Quick review of logarithms
Common bases: log_10, ln (the natural log, base e), log_2.
Change of base: log_a(b) = log_10(b) / log_10(a) = log_n(b) / log_n(a) for any base n.
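A quick numeric check of these identities, using Python’s math module (a sketch; any language’s log functions would do):

```python
import math

# Reading log_b(x) as "how many base-b places are needed to represent x":
print(math.log10(100))   # 2.0
print(math.log10(1000))  # 3.0
print(math.log2(16))     # 4.0
print(math.log2(0.5))    # -1.0 (values below 1 take "negative places")

# Change of base: log_a(b) = log_n(b) / log_n(a) for any intermediate base n.
a, b = 3.0, 81.0
print(math.log(b) / math.log(a))  # 4.0, using the natural log as base n
print(math.log(b, a))             # 4.0, math.log's two-argument form
```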

17 Boltzmann’s entropy, S
Boltzmann entropy of a macrostate: natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).
Boltzmann entropy: the more microstates that give rise to a macrostate, the more probable that macrostate is. Thus higher entropy = more probable macrostate.

18 Boltzmann’s entropy, S
Boltzmann entropy of a macrostate: natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).
Boltzmann entropy: the more microstates that give rise to a macrostate, the more probable that macrostate is. Thus higher entropy = more probable macrostate.
Second Law of Thermodynamics (à la Boltzmann): nature tends toward more probable macrostates.
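To connect this back to the slot machine, here is a sketch (my own, reusing the microstate counts from slide 9 and ignoring k) showing that macrostates with more microstates get higher Boltzmann entropy:

```python
import math

# Boltzmann entropy with k dropped: S = ln(W), where W is the number of
# microstates corresponding to the macrostate (counts from slide 9).
for label, W in [("three identical fruits", 5),
                 ("at least 2 cherries", 13),
                 ("exactly one lemon", 48)]:
    print(f"{label}: W = {W}, S = ln(W) = {math.log(W):.3f}")
```

More microstates means larger W, hence larger S, hence a more probable macrostate.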

19 Boltzmann’s entropy, S What does this have to do with the “arrow of time”?

20 Lord Kelvin: Heat Death of the Universe
“The result would inevitably be a state of universal rest and death, if the universe were finite and left to obey existing laws. But it is impossible to conceive a limit to the extent of matter in the universe; and therefore science points rather to an endless progress, through an endless space, of action involving the transformation of potential energy into palpable motion and hence into heat, than to a single finite mechanism, running down like a clock, and stopping for ever.”
From “On the Age of the Sun’s Heat” (1862)

21 Shannon Information / Entropy http://www.khanacademy.org/science/brit-cruise/information-theory

22 Shannon Information / Entropy
Claude Shannon, 1916−2001
What were his motivations for defining/studying information?
What is a “message source”?

23 Shannon Information Vocab

24 Shannon Information Definitions

25 Shannon Information vs. Boltzmann Entropy
Shannon information: H = −Σ_i p_i log_2(p_i). Measured in “bits”. The message source has N “microstates” (or “messages”, e.g., words); p_i is the probability of message i.
Boltzmann entropy: S = k ln W. Measured in units defined by k (often “Joules per Kelvin”).

26 Messages: {Da}. A source with a single possible message: p = 1, so H = 0 bits.

27 Messages: {300 words}. A source with 300 possible messages; if all were equally likely, H = log_2(300) ≈ 8.23 bits.

28 Calculating Shannon Entropy
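A minimal sketch of the calculation, assuming the definition H = −Σ_i p_i log_2(p_i) from slide 25 (the function name is mine, not from the slides):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)) in bits; terms with p_i = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))          # 0.0   -- a one-message source ("Da")
print(shannon_entropy([0.5, 0.5]))     # 1.0   -- a fair coin: one full bit
print(shannon_entropy([1/300] * 300))  # ~8.23 -- 300 equally likely words
```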


