Reversing Conway’s Game of Life
Jonathan Goetz


The Rules of the Game
- 0-1 living neighbors: cell dies (underpopulation)
- 2 living neighbors: cell maintains its current state
- 3 living neighbors: cell lives
- 4-8 living neighbors: cell dies (overpopulation)

In reverse:
- Many states share descendants
- Some states have no ancestors ("Garden of Eden" states)
- Some states are their own ancestors
- Information is lost
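The forward rules above, and the information loss they cause, can be sketched in a few lines. This is a minimal illustration, not the presenter's code; the small toroidal grid is an assumption for the example.

```python
# A minimal sketch of one forward step of Conway's Game of Life on a
# small toroidal grid, illustrating that distinct states can share a
# descendant, so the step is not invertible.
def life_step(grid):
    """Advance a 2D list of 0/1 cells by one generation (wrapping edges)."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight wrapped neighbors.
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            # 3 neighbors -> alive; 2 neighbors -> keep state; else dead.
            nxt[r][c] = 1 if n == 3 or (n == 2 and grid[r][c]) else 0
    return nxt

# Two different states (a lone cell in different positions) both die out,
# sharing the empty grid as their descendant: the ancestor is unrecoverable.
a = [[1, 0, 0], [0, 0, 0], [0, 0, 0]]
b = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
assert life_step(a) == life_step(b)
```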

Challenges
Loss of information prevents traditional back-calculation of the initial state. The problem does, however, map naturally onto pattern classification: an MLP with 400 inputs and 400 outputs (one per cell), applied recursively to step back one generation at a time.
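The 400-in/400-out mapping might look like the sketch below. The hidden-layer size, activations, and random weights are illustrative assumptions; the presentation does not specify the architecture beyond the input/output dimensions.

```python
import numpy as np

# Sketch of an MLP with 400 inputs and 400 outputs, one per cell of the
# board. Hidden size and activations are assumptions, not the author's
# actual (trained) network.
rng = np.random.default_rng(0)
n_cells, n_hidden = 400, 256

W1 = rng.normal(scale=0.05, size=(n_cells, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.05, size=(n_hidden, n_cells))
b2 = np.zeros(n_cells)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict_ancestor(board):
    """Map a flattened board to a per-cell 'alive' probability for a guessed predecessor."""
    h = np.tanh(board @ W1 + b1)   # hidden layer
    return sigmoid(h @ W2 + b2)    # 400 outputs in (0, 1)

probs = predict_ancestor(rng.integers(0, 2, n_cells).astype(float))
assert probs.shape == (400,)
```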

Details
The basic structure is a multilayered set of MLPs: each pass rescales its result to the range 0-1 before the same MLP is reapplied, approximating the input layout one generation further back. Initial attempts to create a variable-depth training weighting were not successful.
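The rescale-and-reapply loop described above can be sketched as follows. The `net` function here is a stand-in linear map, purely an assumption; only the structure (rescale to [0, 1] between repeated applications of the same network) reflects the text.

```python
import numpy as np

# Sketch of the recursive scheme: the same network is reapplied to its
# own output, rescaled into [0, 1] between passes. 'net' is a placeholder
# for the trained MLP forward pass.
rng = np.random.default_rng(1)
W = rng.normal(size=(400, 400)) * 0.01

def net(x):
    return x @ W  # placeholder for the trained MLP

def rescale(x):
    """Min-max rescale a vector into the range [0, 1]."""
    lo, hi = x.min(), x.max()
    return (x - lo) / (hi - lo) if hi > lo else np.zeros_like(x)

def step_back(board, generations):
    """Apply the same network once per generation, rescaling between passes."""
    x = board
    for _ in range(generations):
        x = rescale(net(x))
    return x

out = step_back(rng.integers(0, 2, 400).astype(float), 3)
assert out.min() >= 0.0 and out.max() <= 1.0
```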

Status
Initial attempts at rewriting the training process to handle between 1 and 5 generations of difference were unsuccessful. Instead, this is being implemented by converting every multi-generation training data point into multiple single-generation steps.
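The conversion described above can be sketched like this: one k-generation trajectory is decomposed into k single-generation (input, target) pairs, oriented for learning the reverse map. The compact forward stepper and the small seed grid are assumptions for illustration.

```python
# Sketch: decompose a multi-generation trajectory into single-generation
# training pairs (later state -> earlier state) for the reverse model.
def life_step(grid):
    """One forward Life generation on a toroidal grid of 0/1 cells."""
    rows, cols = len(grid), len(grid[0])
    def n(r, c):
        return sum(grid[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))
    return [[1 if n(r, c) == 3 or (n(r, c) == 2 and grid[r][c]) else 0
             for c in range(cols)] for r in range(rows)]

def single_step_pairs(start, generations):
    """Turn one k-generation trajectory into k 1-generation (input, target) pairs."""
    states = [start]
    for _ in range(generations):
        states.append(life_step(states[-1]))
    # Pair each state with its immediate predecessor.
    return [(states[i + 1], states[i]) for i in range(generations)]

seed = [[0, 0, 0], [1, 1, 1], [0, 0, 0]]
pairs = single_step_pairs(seed, 3)
assert len(pairs) == 3
```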