Evolving a Sigma-Pi Network as a Network Simulator by Justin Basilico.

Problem description
- Evolve a neural network that acts as a general network which can be "programmed" by its inputs to act as a variety of different networks.
  - Input: another network and its input.
  - Output: the output of the given network on the input.
- Use a sigma-pi network and evolve its connectivity using a genetic algorithm.
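For context, a sigma-pi unit computes a weighted sum over products of subsets of its inputs. A minimal sketch of this standard definition (the function and names below are illustrative, not from the slides):

```python
def sigma_pi_unit(inputs, conjuncts, weights, bias=0.0):
    """Sigma-pi unit: weighted sum over products of input subsets.

    conjuncts: list of index tuples; each tuple selects the inputs
    whose product forms one weighted term.
    """
    total = bias
    for w, idx in zip(weights, conjuncts):
        prod = 1.0
        for i in idx:
            prod *= inputs[i]
        total += w * prod
    return total

# A unit with terms x0, x1, and the product x0*x1:
y = sigma_pi_unit([2.0, 3.0], [(0,), (1,), (0, 1)], [1.0, 1.0, 0.5])
# y = 1.0*2 + 1.0*3 + 0.5*(2*3) = 8.0
```

The product terms are what let such a unit gate one input by another, which is the property the project relies on to "program" the network with weight values.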

Problem motivation
- If a network can be created to simulate other networks given as input, then perhaps we can build neural networks that act upon other neural networks.
- It would be interesting to see if one network could apply backpropagation to another network.

Previous work
- This problem remains largely unexplored.
- Evolutionary techniques have been applied to networks similar to sigma-pi networks:
  - Janson & Frenzel, Training Product Unit Neural Networks with Genetic Algorithms (1993)
    - Evolved product networks, which are "more powerful" than sigma-pi networks since they allow a variable exponent.

Previous work
- Papers from class that provide some information and inspiration:
  - Plate, Randomly connected sigma-pi neurons can form associator networks (2000)
  - Belew, McInerney, & Schraudolph, Evolving Networks: Using the Genetic Algorithm with Connectionist Learning (1991)
  - Chalmers, The Evolution of Learning: An Experiment in Genetic Connectionism (1990)

Approach (Overview)
- Generate a testing set of 100 random networks to simulate.
- Generate an initial population of chromosomes for the sigma-pi network.
- For each generation, decode each chromosome into a sigma-pi network and use the fitness function to evaluate the network's fitness as a simulator using the testing set.
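The generational loop above can be sketched as a generic skeleton; all names here are illustrative, and selection is simplified to plain truncation rather than the rank-based scheme the slides describe later:

```python
import random

def evolve(pop_size, n_generations, chromosome_len, decode, fitness):
    """Skeleton of the evolutionary loop: random bit-string
    population, evaluate each decoded network, keep the best half.
    Lower fitness is better, per the slides."""
    population = [[random.randint(0, 1) for _ in range(chromosome_len)]
                  for _ in range(pop_size)]
    best = None
    for _ in range(n_generations):
        scored = sorted(population, key=lambda c: fitness(decode(c)))
        best = scored[0]
        # Real selection/crossover/mutation would build the next
        # generation here; truncation is a stand-in for the sketch.
        population = scored[: pop_size // 2] * 2
    return best
```

With `decode` as the identity and `fitness=sum`, this drives chromosomes toward all zeros, which is enough to check the loop's plumbing.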

Approach (Overview)
- First try to simulate single-layer networks:
  - 2 input and 1 output units
  - 2 input and 2 output units
- Then try it on a multi-layer network:
  - 2 input, 2 hidden, and 2 output units

Approach
- Input encoding
  - The simulation network is given the input to the simulated network along with the weight values for the network it is simulating.
  - Generate a fully-connected, feed-forward network with random weights along with a random input, then feed the input through the network to get the output.
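Assuming standard fully-connected feed-forward networks with per-unit bias weights, generating one training pair might look like this (the function names and the flattening order are assumptions, not from the slides):

```python
import random

def make_random_net(n_in, n_hidden, n_out):
    """Random fully-connected feed-forward net as two weight
    matrices; each row starts with the unit's bias weight."""
    w1 = [[random.uniform(-1, 1) for _ in range(n_in + 1)]
          for _ in range(n_hidden)]
    w2 = [[random.uniform(-1, 1) for _ in range(n_hidden + 1)]
          for _ in range(n_out)]
    return w1, w2

def feed_forward(net, x, act=lambda v: v):  # linear units by default
    w1, w2 = net
    h = [act(row[0] + sum(w * xi for w, xi in zip(row[1:], x)))
         for row in w1]
    return [act(row[0] + sum(w * hi for w, hi in zip(row[1:], h)))
            for row in w2]

def make_training_pair(n_in, n_hidden, n_out):
    """Simulator input = the simulated net's input followed by its
    flattened weights; target = that net's output on the input."""
    net = make_random_net(n_in, n_hidden, n_out)
    x = [random.uniform(-1, 1) for _ in range(n_in)]
    flat = [w for layer in net for row in layer for w in row]
    return x + flat, feed_forward(net, x)
```

For a 2-2-2 simulated network this produces a simulator input of length 2 + 6 + 6 = 14 (inputs, then weight layer 1, then weight layer 2), matching the encoding order on the example slide.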

Approach
- Input encoding
  - Example (diagram on the original slide): a 2-2-1 network with inputs input 1 and input 2 plus a bias; first-layer weights w30, w31, w32 (hidden 1) and w40, w41, w42 (hidden 2); second-layer weights w50, w53, w54 into the output. The input encoding concatenates the inputs, then weight layer 1 (w30 ... w42), then weight layer 2 (w50, w53, w54).

Approach
- Target output encoding
  - The output that the randomly weighted network produces on its random input.

Approach
- Chromosome encoding
  - Each chromosome encodes the connectivity (architecture) of the sigma-pi network.
  - To simplify things, network weights are restricted to either 1.0, signifying there is a connection there, or 0.0, signifying that there is not.
  - Initialize each chromosome to a random string of bits.

Approach
- Chromosome encoding:
  - To encode the connectivity of a layer with m units to a layer with n units, use a binary string of length (m + 1) × n; the +1 accounts for the bias unit. (The original slide illustrates this with a small diagram.)
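A minimal sketch of the length formula and the corresponding decoding step; the row-major bit layout is an assumption, not stated on the slides:

```python
def layer_chromosome_length(m, n):
    """Bits needed to encode connectivity from m units (plus one
    bias) to n units: (m + 1) * n."""
    return (m + 1) * n

def decode_layer(bits, m, n):
    """Turn a bit string into an n x (m+1) weight matrix where
    1.0 means a connection is present and 0.0 means it is not."""
    assert len(bits) == layer_chromosome_length(m, n)
    return [[float(bits[r * (m + 1) + c]) for c in range(m + 1)]
            for r in range(n)]
```

For the 2-input, 1-output case this gives a 3-bit chromosome segment per output unit (bias, input 1, input 2).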

Approach
- Genetic algorithm
  - Selection: chromosomes ranked by fitness; probability of selection is based on rank.
  - Crossover: randomly select bits in the chromosome for crossover. (I might add in some sort of functional unit here.)
  - Mutation: each bit in every chromosome has a mutation rate of 0.01.
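The three operators might be sketched as follows, assuming linear rank weighting and uniform crossover; the slides do not specify either detail exactly:

```python
import random

def rank_select(population, fitness):
    """Rank-based selection: lower fitness = better rank = higher
    selection probability (linear ranking)."""
    ranked = sorted(population, key=fitness)
    n = len(ranked)
    weights = [n - i for i in range(n)]  # best chromosome gets weight n
    return random.choices(ranked, weights=weights, k=1)[0]

def uniform_crossover(a, b):
    """Randomly pick each bit from one of the two parents."""
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(chromosome, rate=0.01):
    """Flip each bit independently with the given mutation rate."""
    return [1 - bit if random.random() < rate else bit
            for bit in chromosome]
```

Rank-based selection keeps selection pressure steady even when raw fitness values cluster, which matters here because many architectures produce near-identical error counts.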

Approach
- Fitness function
  - Build a sigma-pi network from the chromosome.
  - Test the sigma-pi network on a testing set of 100 networks.
  - Better chromosomes have smaller fitness values.

Approach
- Fitness function
  - Attempt 1: mean squared error.
    - Problem: evolved networks just always guessed 0.5 because a sigmoid activation function was used.
  - Attempt 2: number of incorrect outputs within a threshold.
    - Problem: we want an optimal solution with as few weights in the network as possible.
  - Attempt 3: use the second function and also factor in the number of 1's in the chromosome.
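A sketch of the third attempt; the threshold and the weight on the connection-count penalty are illustrative values not given on the slides:

```python
def fitness(network_outputs, targets, chromosome,
            threshold=0.1, sparsity_weight=0.01):
    """Attempt 3 (sketch): count outputs that miss their target by
    more than a threshold, plus a small penalty on the number of
    1-bits (i.e. connections) in the chromosome."""
    errors = sum(1 for out, tgt in zip(network_outputs, targets)
                 if abs(out - tgt) > threshold)
    return errors + sparsity_weight * sum(chromosome)
```

The penalty term breaks ties between architectures with equal error counts in favor of the sparser one, which is exactly the preference stated above.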

Results
- So far:
  - Tried to train a backpropagation network to do simulation, but it did not work.
  - Managed to evolve sigma-pi network architectures to simulate simple, one-layer networks with linear units.
  - Still working on simulating networks with two layers and with sigmoid units.

Results
- Network with 2 input, 1 output units and linear activation
  - Population: 100
  - Optimal solution after 24 generations
  - (The slide shows the evolved network diagram: the output fed via weights w30, w31, w32 from the bias, input 1, and input 2.)

Results
- Network with 2 input, 2 output units and linear activation
  - Population: 150
  - Optimal solution after 121 generations (stabilizes after 486)
  - (The slide shows the evolved network diagram: two outputs fed via weights w30, w31, w32 and w40, w41, w42 from the bias, input 1, and input 2.)

Results (so far)
- Network with 2 input, 2 hidden, and 1 output units and linear activation
  - Have not gotten it to work yet.
  - The sigma-pi network has 3 hidden layers.
  - The problem might be the sparse connectivity of the solution, where input weights need to be "saved" for later layers.
  - Potential solution: fix more of the network architecture so that the chromosome is smaller.

Future work
- Expand the evolution parameters to allow wider variation in the evolved networks (network weights, activation functions).
- Try to simulate larger networks.
- Evolve a network that implements backpropagation:
  - Start small with just the delta rule for output and hidden units.
  - Work up to a network that does full backpropagation.

More future work
- Evolve networks that create their own learning rules.
- Use a learning algorithm for training sigma-pi networks rather than evolution.
- Create a sigma-pi network that simulates other sigma-pi networks.