1 Cellular Automata and Amorphous Computing Melanie Mitchell Portland State University and Santa Fe Institute Complex Systems Summer School Friday June 20, 2008 Copyright © 2008 by Melanie Mitchell

2 What are cellular automata? Game of Life
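The Game of Life’s update rule is simple to state: a live cell with two or three live neighbors survives, and a dead cell with exactly three live neighbors becomes alive. A minimal sketch in Python (the function name and set-of-cells representation are my own, not from the slides):

```python
from collections import Counter

def life_step(live):
    """One Game of Life step; `live` is a set of (x, y) live cells."""
    # Count live neighbors for every cell adjacent to some live cell.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 live neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates with period 2:
blinker = {(0, 0), (1, 0), (2, 0)}
```

Applying `life_step` twice to the blinker returns the original configuration.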

3 Review (?) of Computation Theory
–Hilbert’s problems
–Turing machines
–Universal Turing Machines
–Uncomputability of the halting problem

4 Turing machine

5 Example of Turing machine rule set
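The rule set itself appeared as an image in the original slides. As a stand-in, here is a sketch (my own example, not the one shown) of how a Turing machine rule set can be written down and run: a table mapping (state, read symbol) to (symbol to write, head move, next state). This tiny machine inverts a binary string:

```python
# Rule set: (state, read symbol) -> (write symbol, head move, next state).
# This machine scans right, flipping bits, and halts at the first blank.
rules = {
    ('flip', '0'): ('1', +1, 'flip'),
    ('flip', '1'): ('0', +1, 'flip'),
    ('flip', '_'): ('_', -1, 'halt'),
}

def run_tm(tape, rules, state='flip', blank='_', max_steps=10_000):
    """Run a Turing machine and return the final tape, blanks trimmed."""
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == 'halt':
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return ''.join(cells[i] for i in sorted(cells)).strip(blank)
```

For example, `run_tm('1011', rules)` returns `'0100'`.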

6 Two fundamental theorems of computation theory:
1. There exists a universal Turing machine.
2. There is no Turing machine that can solve the halting problem.

7 Interesting problem: Given an initial configuration, can you calculate analytically how many steps Life will run for before it reaches a fixed configuration?

8 Universal Computation in the Game of Life

11 What is the feasibility of using this kind of universal computation in practice?

12 Von Neumann’s self-reproducing automaton

14 Von Neumann’s self-reproducing automaton After his key role in designing the first electronic computers, von Neumann became interested in links between computers and biology John von Neumann 1903–1957

15 Von Neumann’s self-reproducing automaton In the last years of his life, von Neumann worked on the “logic” of self-reproduction and devised the first instance of a self-reproducing “machine” (in software; finally implemented in the 1990s). John von Neumann 1903–1957

17 Von Neumann’s self-reproducing automaton Von Neumann’s design is complicated, but some of its key ideas can be captured by a simpler problem: Design a computer program that will print out a copy of itself.

24 A candidate self-copying program
program copy
 print( “program copy”);
 print( “ print(“program copy”);”);
 print(“ print( “ print(“program copy”);”);”);
“A machine can’t reproduce itself; to do so it would have to contain a description of itself, and that description would have to contain a description of itself, and so on ad infinitum.”

28 Some commands we will need in our programming language
mem – the memory location of the instruction currently being executed
computer memory:
1 program test
2 print(“Hello, world”);
3 print(“Goodbye”);
4 end
5
mem = 2

32 Some commands we will need in our programming language
line(n) – the string of characters in memory location n
computer memory:
1 program test
2 print(“Hello, world”);
3 print(“Goodbye”);
4 end
5
print(line(2)); will print print(“Hello, world”);

36 Some commands we will need in our programming language
loop until condition – loops until the condition is true
x = 0;
loop until x = 4
{
print(“Hello, world”);
x = x+1;
}
Output:
Hello, world
Hello, world
Hello, world
Hello, world

37 A self-copying program
1 program copy
2 L = mem + 1;
3 print(“program copy”);
4 print(“ L = mem + 1;”);
5 loop until line[L] =“end”
6 {
7 print(line[L]);
8 L = L+1;
9 }
10 print(“end”);
11 end

38 A self-copying program: execution trace
When line 2 executes, mem = 2, so L is set to 3. Lines 3 and 4 print the text of the program’s first two lines. The loop on lines 5–9 then prints line[L] and increments L, for L = 3, 4, …, 10, exiting when line[11] = “end”. Finally, line 10 prints “end”. The complete output is an exact copy of the program:
program copy
 L = mem + 1;
 print(“program copy”);
 print(“ L = mem + 1;”);
 loop until line[L] =“end”
 {
 print(line[L]);
 L = L+1;
 }
 print(“end”);
end
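The pseudocode above can be transliterated almost line for line into Python, emulating the toy language’s mem and line(n) with a list that holds the program’s numbered lines (the list and the variable names are my own scaffolding, not part of the slides):

```python
# The pseudocode program's lines, indexed from 1 (index 0 is unused).
lines = [
    None,
    'program copy',
    ' L = mem + 1;',
    ' print("program copy");',
    ' print(" L = mem + 1;");',
    ' loop until line[L] = "end"',
    ' {',
    ' print(line[L]);',
    ' L = L+1;',
    ' }',
    ' print("end");',
    'end',
]

mem = 2                      # line 2 is the instruction being executed
L = mem + 1                  # line 2: L starts at 3
print('program copy')        # line 3: prints line 1's text
print(' L = mem + 1;')       # line 4: prints line 2's text
while lines[L] != 'end':     # lines 5-9: print the text of lines 3..10
    print(lines[L])
    L = L + 1
print('end')                 # line 10: prints the final "end"
```

The output is the eleven lines of the listing: an exact copy of the program being run.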

90 Significance of the self-copying program
The essence of self-copying in this program is to use the same information stored in memory in two ways:
–interpret it as instructions in a computer program
–interpret it as “data” to be used by the instructions in the computer program
This is also a key mechanism in biological self-reproduction:
–DNA = program and data
This principle was formulated by von Neumann in the 1940s, before the details of biological self-reproduction were well understood.
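The same two-uses-of-one-string idea yields a genuine self-printing program in a real language. This classic Python quine (my example, not from the slides) stores a single string and uses it both as code to execute and, via %r, as data describing its own first line:

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running this two-line program prints exactly those two lines: the string s is simultaneously the instructions being executed and the data substituted into them.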

97 Programs and interpreters
Notice that the self-copying program needs an external interpreter—the part of the computer that carries out the instructions.
Rough analogy to biology:
–DNA = program and data
–RNA, ribosomes, and enzymes = interpreter
DNA contains instructions for copying itself, and also for building its own interpreter. Von Neumann’s self-reproducing automaton also did this.

98 What are cellular automata actually used for? Different perspectives:
–CAs are models of physical (or biological or social) systems
–CAs are alternative methods for approximating differential equations
–CAs are devices that can simulate standard computers
–CAs are parallel computers that can perform image processing, random number generation, cryptography, etc.
–CAs are a framework for implementing molecular-scale computation
–CAs are a framework for exploring how “collective computation” might take place in natural systems (and that might be imitated in novel human-made computational systems)

99 Dynamics and Computation in Cellular Automata http://math.hws.edu/xJava/CA/EdgeOfChaosCA1.html

100 Wolfram’s classes for elementary CAs
1. Fixed point
2. Periodic
3. Chaotic
4. “Complex”
–long transients
–universal computation?

101 ECA 110 is a universal computer (Matthew Cook, 2002). Wolfram’s numbering of ECA: 0 1 1 0 1 1 1 0 = 110 in binary. [figure: the rule table]
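Wolfram’s numbering can be unpacked mechanically: bit k of the rule number gives the new cell state for the neighborhood whose three cells, read as a binary number, equal k. A small sketch (function name is mine), usable to run rule 110 or any other elementary CA:

```python
def eca_step(row, rule):
    """One step of an elementary CA on a periodic row of 0/1 cells.
    Bit k of `rule` is the output for the neighborhood (l, c, r)
    with l*4 + c*2 + r == k -- exactly Wolfram's numbering."""
    n = len(row)
    return [(rule >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
            for i in range(n)]

# Check the slide's example: rule 110 is 0 1 1 0 1 1 1 0 in binary,
# i.e. neighborhood 111 -> 0, 110 -> 1, 101 -> 1, 100 -> 0, and so on.
assert 110 == 0b01101110
```

Iterating `eca_step(row, 110)` from a random initial row produces the particle-filled space-time diagrams discussed below.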

103 Outline of A New Kind of Science (Wolfram, 2002) (from MM review, Science, 2002)
Simple programs can produce complex and random-looking behavior
–Complex and random-looking behavior in nature comes from simple programs
Natural systems can be modeled using cellular-automata-like architectures
Cellular automata are a framework for understanding nature
Principle of computational equivalence

104 Principle of Computational Equivalence
1. The ability to support universal computation is very common in nature.
2. Universal computation is an upper limit on the sophistication of computations in nature.
3. Computing processes in nature are almost always equivalent in sophistication.

105 How can we describe information processing in complex systems?

106 A cellular automaton evolved by the genetic algorithm [figure: space-time diagrams labeled “majority on” and “majority off”]

107 Stephen Wolfram’s last problem from “Twenty problems in the theory of cellular automata” (Wolfram, 1985): 20. What higher-level descriptions of information processing in cellular automata can be given? “It seems likely that a radically new approach is needed.”

108 What is needed? 1.How can we characterize patterns as computations? 2.How can we design computations in the language of patterns?

111 1. How can we characterize patterns as computations?
Components of computation in spatially extended systems:
–Storage of information
–Transfer of information
–Integration of information from different spatial locations
First step: What structures in the observed patterns implement these components?
Second step: How do these structures implement the computation?

115 –Transfer of information: moving particles –Integration of information from different spatial locations: particle collisions From http://www.stephenwolfram.com/publications/articles/ca/86-caappendix/16/text.html

117 How to automatically identify information-carrying structures in spatio-temporal patterns? Three proposals:
–Filter by regular languages (Crutchfield and Hanson, 1993; Crutchfield et al., 2002)
–Filter by local statistical complexity (Shalizi et al., 2006)
–Filter by local information measures (Lizier et al., 2007)

118 Filter by regular languages (Crutchfield and Hanson, 1993; Crutchfield et al., 2002) Regular language: Simple-to-describe periodic pattern

119 Examples: CA for performing density classification
–Regular domains: (0)*, (1)*, (01)*
–Particles: spatially localized, temporally periodic boundaries or “defects” between regular domains
[figure: space-time diagram with regular domains filtered out]

120 Rule 54
–Regular domains: (0001)*, (1110)*
[figure: space-time diagram with regular domains filtered out]

129 Filter by local statistical complexity (Shalizi et al., 2006)
Local statistical complexity of site i: the amount of information from the past needed for optimal prediction of the future in the vicinity of site i. How well does the light cone of past influence on site i predict the light cone of future influence of site i? [figure: site i at time t with its past and future light cones]

131 Example: Rule 110, filtered by local statistical complexity [figure: original and filtered space-time diagrams]
Note: This filter requires no prior determination of “regular domains”, but it is more computationally expensive than filtering by regular domains.

138 Filter by local information measures (Lizier et al., 2006) Degree of information storage at site i: Mutual information between state at site i at t and states of site i at k previous time steps. site i at time t Degree of information storage: how predictable is current state from past states? site i k time steps in past Degree of information transfer from site i-j to site i: Average information in the state of the source (site i-j) about next state of destination (site i) Degree of information modification at site i: Degree to which sum of information storage and information transfer do not predict state at site i.

139 Rule 54, positive information storage (from Lizier et al., 2007). Degree of information storage at each site. Positive: information has been stored at this site.
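As a concrete sketch of how the storage measure can be estimated, the following simulates rule 54 and computes local active information storage from plug-in probability counts over the whole space-time diagram. Function names, the history length k=3, and the lattice size are illustrative choices of mine, not the settings used by Lizier et al. (who use much longer histories).

```python
import numpy as np

def simulate_eca(rule, width=60, steps=120, seed=0):
    """Elementary CA with periodic boundaries; returns a (steps, width) array."""
    table = np.array([(rule >> n) & 1 for n in range(8)])
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, width)
    rows = [x]
    for _ in range(steps - 1):
        x = table[(np.roll(x, 1) << 2) | (x << 1) | np.roll(x, -1)]
        rows.append(x)
    return np.array(rows)

def local_storage(ca, k=3):
    """Local active information storage:
    a(i, n) = log2 p(past_k, next) / (p(past_k) p(next)),
    with probabilities estimated by counting over all sites and times."""
    steps, width = ca.shape
    joint, past_c, next_c, samples = {}, {}, {}, []
    for n in range(k, steps):
        for i in range(width):
            past, nxt = tuple(ca[n - k:n, i]), ca[n, i]
            joint[past, nxt] = joint.get((past, nxt), 0) + 1
            past_c[past] = past_c.get(past, 0) + 1
            next_c[nxt] = next_c.get(nxt, 0) + 1
            samples.append((n, i, past, nxt))
    total = len(samples)
    a = np.zeros((steps, width))
    for n, i, past, nxt in samples:
        a[n, i] = np.log2(joint[past, nxt] * total / (past_c[past] * next_c[nxt]))
    return a

ca = simulate_eca(54)
a = local_storage(ca)
# the average of the local values is the (nonnegative) mutual information,
# though individual sites can have negative storage
print(a[3:].mean())
```

Plotting `a` as a space-time image reproduces the kind of filtering shown on this slide: domains light up as high-storage regions.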

140–142 Filter by local information measures (Lizier et al., 2006)
Degree of information storage at site i: mutual information between the state of site i at time t and the states of site i at the k previous time steps.
Degree of information transfer from site i − j to site i: amount of information in the state of the source (site i − j at time t − 1, with j ≤ radius of neighborhood) about the next state of the destination (site i). (How predictable is the state at site i from the previous state at site i − j?)
Degree of information modification at site i: degree to which the sum of information storage and information transfer does not predict the state at site i.

143 Rule 54, positive t(i, j, n+1). Degree of information transfer at each site (for j = −1). Positive: information has been transferred from site i − j to site i.
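The transfer measure can be sketched the same way: local transfer entropy asks how much the source's state adds to predicting the destination's next state beyond the destination's own past. Again the function name, the short history k=3, and the run sizes are my own illustrative choices.

```python
import numpy as np
from collections import Counter

def simulate_eca(rule, width=60, steps=120, seed=0):
    """Elementary CA with periodic boundaries; returns a (steps, width) array."""
    table = np.array([(rule >> n) & 1 for n in range(8)])
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, width)
    rows = [x]
    for _ in range(steps - 1):
        x = table[(np.roll(x, 1) << 2) | (x << 1) | np.roll(x, -1)]
        rows.append(x)
    return np.array(rows)

def local_transfer(ca, j=1, k=3):
    """Local transfer entropy from site i-j to site i:
    t(i, j, n) = log2 p(next | past_k, source) / p(next | past_k)."""
    steps, width = ca.shape
    recs = [(tuple(ca[n - k:n, i]), ca[n, i], ca[n - 1, (i - j) % width])
            for n in range(k, steps) for i in range(width)]
    c_pxs = Counter((p, x, s) for p, x, s in recs)
    c_ps = Counter((p, s) for p, x, s in recs)
    c_px = Counter((p, x) for p, x, s in recs)
    c_p = Counter(p for p, x, s in recs)
    t = np.zeros((steps, width))
    m = 0
    for n in range(k, steps):
        for i in range(width):
            p, x, s = recs[m]; m += 1
            t[n, i] = np.log2(c_pxs[p, x, s] * c_p[p] / (c_ps[p, s] * c_px[p, x]))
    return t

ca = simulate_eca(54)
t = local_transfer(ca, j=1)
# the average is the (nonnegative) transfer entropy; local values can be negative
print(t[3:].mean())
```

Imaging `t` highlights gliders moving in the direction of channel j, which is the effect this slide's figure shows.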

144–148 Filter by local information measures (Lizier et al., 2006)
Degree of information storage at site i: mutual information between the state of site i at time t and the states of site i at the k previous time steps.
Degree of information transfer from site i − j to site i: average information in the state of the source (site i − j at time t − 1) about the next state of the destination (site i).
Degree of information modification at site i: degree to which the sum of information storage and information transfer does not predict the state at site i. (How unpredictable is the state at site i from storage and transfer?)
"Local separable information": positive means no information modification; negative means information modification.

149 Local information modification, rule 54. Negative s(i, n) plotted on top of information transfer. Negative: the state at site i is not well predicted by information storage or transfer.
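Combining the two previous quantities gives the separable information s(i, n): local storage plus local transfer summed over the incoming channels (j = ±1 for a radius-1 CA). The sketch below puts the pieces together; as before, the short history length and run sizes are illustrative choices, not those of Lizier et al.

```python
import numpy as np
from collections import Counter

def simulate_eca(rule, width=60, steps=120, seed=0):
    """Elementary CA with periodic boundaries; returns a (steps, width) array."""
    table = np.array([(rule >> n) & 1 for n in range(8)])
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, width)
    rows = [x]
    for _ in range(steps - 1):
        x = table[(np.roll(x, 1) << 2) | (x << 1) | np.roll(x, -1)]
        rows.append(x)
    return np.array(rows)

def separable_information(ca, k=3):
    """s(i, n) = local storage a(i, n) + local transfer t(i, j, n) for j = +/-1.
    Negative s flags sites that storage and transfer together fail to predict:
    candidate information-modification (collision) events."""
    steps, width = ca.shape
    recs = [(tuple(ca[n - k:n, i]), ca[n, i],
             ca[n - 1, (i - 1) % width], ca[n - 1, (i + 1) % width])
            for n in range(k, steps) for i in range(width)]
    N = len(recs)
    c_px = Counter((p, x) for p, x, l, r in recs)
    c_p = Counter(p for p, x, l, r in recs)
    c_x = Counter(x for p, x, l, r in recs)
    c_pl = Counter((p, l) for p, x, l, r in recs)
    c_pxl = Counter((p, x, l) for p, x, l, r in recs)
    c_pr = Counter((p, r) for p, x, l, r in recs)
    c_pxr = Counter((p, x, r) for p, x, l, r in recs)
    s = np.zeros(N)
    for m, (p, x, l, r) in enumerate(recs):
        a = np.log2(c_px[p, x] * N / (c_p[p] * c_x[x]))                    # storage
        tl = np.log2(c_pxl[p, x, l] * c_p[p] / (c_pl[p, l] * c_px[p, x]))  # from left
        tr = np.log2(c_pxr[p, x, r] * c_p[p] / (c_pr[p, r] * c_px[p, x]))  # from right
        s[m] = a + tl + tr
    return s

s = separable_information(simulate_eca(54))
print("fraction of sites flagged as modification events:", (s < 0).mean())
```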

150 First step: What structures in the observed patterns implement these components?

151

152 Second step: How do these structures implement the computation?

153 A cellular automaton evolved by the genetic algorithm (performance ≈ 80%). Figure labels: majority on, majority off.

154 Particles Regular domains: (0)*, (1)*, (01)* From Crutchfield et al., 2001

155 laws of “particle physics”

156 Hordijk, Crutchfield, and Mitchell, 1996: Models of CA computation in terms of particle kinematics (Note “condensation phase”)

157 Particle model of CA computation
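The particle model treats each domain boundary as a particle moving at constant velocity, so collision events can be located by elementary kinematics. A minimal sketch; the particle positions and velocities in the example are hypothetical, not read off any particular rule's particle catalog:

```python
def collision(x1, v1, x2, v2):
    """Time and site at which two constant-velocity particles meet,
    or None if they never do (x in cells, v in cells per time step)."""
    if v1 == v2:
        return None  # parallel world-lines never intersect
    t = (x2 - x1) / (v1 - v2)
    if t < 0:
        return None  # they would have met in the past, not the future
    return t, x1 + v1 * t

# hypothetical particles launched 12 cells apart, velocities +1 and -3
print(collision(0, 1, 12, -3))  # meet at t = 3.0, cell 3.0
```

Chaining such collision computations, together with a table of which particle each collision produces, is essentially the Hordijk-Crutchfield-Mitchell model of the CA's computation.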

158 CAs evolved for density classification CAs evolved for synchronization

159 generation 17 generation 18

160 First step: What structures in the observed patterns implement these components? Second step: How do these structures implement the computation?

161 How can we design computations in the language of patterns? "Programming" the CA in the language of the high-level structures, and compiling the program down to the level of the CA rule. Open problem!

162 Amorphous Computing Abelson, Sussman et al., MIT

163 Amorphous Computing (Abelson et al., 2000; 2007) Main ideas: –Produce vast quantities of tiny, unreliable computing elements (“particles”) at very low cost –Give them limited wireless communication abilities so each particle can communicate with nearby particles –Spray paint them onto a surface. Spatial arrangement, and thus communication, will be irregular. Processing and communication will be asynchronous. –Have them self-organize into a reliable network that does something useful

164 Some possible applications Smart buildings that sense usage and adjust to save energy Smart bridges that monitor traffic load and structural integrity Smart arrays of microphones for optimizing acoustics Self-assembly of nano-machines

165 One example: Origami-based self-assembly (R. Nagpal et al.) Set-up: –“Spray paint” thousands of MEMS “agents” on a 2D square foldable material. –Agents have no knowledge of global position or interconnect topology. –All agents have identical program. –Agents communicate locally (out to distance r) –Agents run asynchronously –Agents will collectively “fold” material to desired shape.

166 High-level language: Origami Shape Language –Six paper-folding axioms that can be used to construct a large class of flat folded shapes. Low-level language primitives: –gradients –neighborhood query –cell-to-cell contact –polarity inversion –flexible folding
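Of these low-level primitives, the gradient is the workhorse: each particle learns its hop distance from a source by relaying local broadcasts. A minimal sketch under an idealized radius-r communication model, with particles scattered uniformly at random (function and parameter names are my own, not Nagpal's):

```python
import numpy as np
from collections import deque

def hop_gradient(points, source, radius):
    """BFS hop count from `source` over the proximity graph in which two
    particles are neighbors iff they lie within communication `radius`.
    Unreachable particles keep hop count -1."""
    n = len(points)
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    hops = np.full(n, -1)
    hops[source] = 0
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in np.flatnonzero(d2[u] <= radius ** 2):
            if hops[v] < 0:
                hops[v] = hops[u] + 1
                queue.append(v)
    return hops

rng = np.random.default_rng(0)
pts = rng.random((200, 2))  # 200 particles "sprayed" on the unit square
print(hop_gradient(pts, source=0, radius=0.12)[:10])
```

In a real amorphous computer each particle runs this asynchronously, remembering the smallest hop count it has heard plus one; the BFS above is the synchronous idealization of that process.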

167 Folding a cup

168 Origami Shape Language Primitives: points p i, lines L i, regions R i Axioms (Huzita): 1. Given two points p1 and p2, fold a line through them. 2. Given two points p1 and p2, fold p1 onto p2 (creates the crease that perpendicularly bisects the segment p1p2). 3. Given two lines L1 and L2, fold L1 onto L2 (constructs the crease that bisects the angle between L1 and L2).

169 4. Given p1 and L1, fold L1 onto itself through p1 (constructs a crease through p1 perpendicular to L1). 5. Given p1 and p2 and L1, make a fold that places p1 on L1 and passes through p2 (constructs the tangent through p2 to the parabola with focus p1 and directrix L1). 6. Given p1 and p2 and lines L1 and L2, make a fold that places p1 on L1 and p2 on L2 (constructs the common tangent to two parabolas).
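For concreteness, the first two axioms are simple vector constructions. The sketch below represents a line as a point plus a direction vector; that encoding is my own choice for illustration, not OSL's.

```python
def axiom1(p1, p2):
    """Axiom 1: the fold line through p1 and p2 (point + direction)."""
    return p1, (p2[0] - p1[0], p2[1] - p1[1])

def axiom2(p1, p2):
    """Axiom 2: folding p1 onto p2 creases along the perpendicular
    bisector of segment p1p2."""
    mid = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
    # direction: the segment p1p2 rotated by 90 degrees
    return mid, (p1[1] - p2[1], p2[0] - p1[0])

print(axiom2((0, 0), (2, 0)))  # crease through (1.0, 0.0) in direction (0, 2)
```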

170 Primitive operations we’ll need: (create-region p1 L1) (within-region r1 op1...): restricts operations to the given region (execute-fold L1 type landmark-point) Fold types: valley (apical), mountain (basal) apical: puts apical surface on inside basal: puts basal surface on inside Landmark-point: defines new “apical” and “basal” surfaces after the fold by picking out the side of the fold that will reverse polarity (intersect L1 L2): returns the point that is the intersection of the two lines
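A sketch of the (intersect L1 L2) primitive, again representing each line as a point plus a direction vector (an encoding chosen only for illustration):

```python
def intersect(l1, l2):
    """Intersection point of two lines given as (point, direction),
    or None if they are parallel."""
    (p1, d1), (p2, d2) = l1, l2
    cross = lambda a, b: a[0] * b[1] - a[1] * b[0]
    denom = cross(d1, d2)
    if denom == 0:
        return None  # parallel or coincident lines
    t = cross((p2[0] - p1[0], p2[1] - p1[1]), d2) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# line y = 1 meets line x = 1 at (1, 1)
print(intersect(((0, 1), (1, 0)), ((1, 0), (0, 1))))  # (1.0, 1.0)
```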

171 Corner points: c1-c4 Edge lines: e12-e41

172 Low-level agent operations (Nagpal, 2001)

173

174

175

176

177

178

179

180

181 How to implement OSL in low-level cell programs Example

182

183 I will put a link to all my slides on the CSSS wiki (under “Readings” → “Melanie Mitchell”). Thanks for listening!

