Associative Memory by Recurrent Neural Networks with Delay Elements

Seiji MIYOSHI (Kobe City College of Tech., JAPAN), Hiro-Fumi YANAI (Ibaraki Univ., JAPAN), Masato OKADA (RIKEN BSI, ERATO KDB, JAPAN)
~ miyoshi/

Background
Synapses in real neural systems seem to have delays, so it is very important to analyze associative memory models with delayed synapses. A theoretical, analytical approach is indispensable for research on delayed networks: computer simulation is a powerful method, but there is a limit on the number of neurons it can handle. The Yanai-Kim theory, based on statistical neurodynamics, shows good agreement with computer simulation. However, its computational complexity is O(L⁴t), so treating a network with a large number of delay steps is realistically impossible.

Objective
To derive macroscopic steady-state equations by using the discrete Fourier transformation, and to discuss the storage capacity quantitatively even in the large-L limit (L: length of delay).

Model: Recurrent Neural Network with Delay Elements

The model is defined by the overlap (order parameter), a discrete synchronous updating rule, and correlation learning for sequence processing; the standard forms of these are sketched below.
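As a reference, the following LaTeX sketch gives the forms these three ingredients usually take in the statistical-neurodynamics literature; the exact index conventions are an assumption and may differ in detail from the authors' definitions.

```latex
% Discrete synchronous updating rule: every neuron receives the outputs of
% all neurons through L delay lines (l = 0, 1, ..., L-1) and updates as
x_i^{t+1} = \operatorname{sgn}\!\left( \sum_{l=0}^{L-1} \sum_{j=1}^{N} J_{ij}^{l}\, x_j^{t-l} \right)

% Correlation learning for sequence processing: the connection through delay
% l is trained so that pattern \xi^{\mu-l} evokes the next pattern
% \xi^{\mu+1} (pattern indices taken modulo the sequence length):
J_{ij}^{l} = \frac{1}{N} \sum_{\mu} \xi_i^{\mu+1}\, \xi_j^{\mu-l}

% Overlap: the macroscopic order parameter measuring the similarity between
% the network state at time t and the stored pattern \xi^{\mu}:
m_{\mu}^{t} = \frac{1}{N} \sum_{i=1}^{N} \xi_i^{\mu}\, x_i^{t}
```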

Macrodynamical Equations by Statistical Neurodynamics (Yanai & Kim, 1995; Miyoshi, Yanai & Okada, 2002)
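A schematic of the form such equations usually take, assuming the standard statistical-neurodynamics treatment (the exact expressions are in the cited papers):

```latex
% The signal arriving through the L delay lines is the sum of the last L
% overlaps with the corresponding patterns of the stored sequence, while the
% crosstalk from all other patterns is treated as Gaussian noise of variance
% \sigma_t^2, giving an overlap recursion of the form
m^{t+1} = \operatorname{erf}\!\left(
    \frac{\sum_{l=0}^{L-1} m^{t-l}}{\sqrt{2}\,\sigma_t}
\right)
% \sigma_t^2 obeys a companion recursion that couples all pairs of past time
% steps through the delay lines; evaluating it up to time t costs O(L^4 t).
```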

Initial Condition of the Network
One-step set initial condition: only the states of the neurons are set explicitly; the states of the delay elements are set to zero.
All-steps set initial condition: the states of all neurons and all delay elements are set close to the stored pattern sequence. If they are set to the stored pattern sequence itself, this is the optimum initial condition.
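To make the two initial conditions concrete, here is a minimal simulation sketch in Python/NumPy. It is not the authors' code: the learning-rule index convention and the illustrative sizes (N, L, α) are assumptions, chosen to match the regime shown on the following slides.

```python
import numpy as np

rng = np.random.default_rng(0)

N, L = 500, 3        # neurons, length of delay (illustrative sizes)
alpha = 0.5          # loading rate
p = int(alpha * N)   # number of patterns in the stored sequence

# Random +/-1 pattern sequence xi^0 ... xi^{p-1}, recalled cyclically.
xi = rng.choice([-1, 1], size=(p, N)).astype(float)

# Correlation learning for sequence processing: the connection through
# delay l maps xi^{mu-l} onto xi^{mu+1} (one common convention; pattern
# indices are taken modulo p).
idx = np.arange(p)
J = np.stack([xi[(idx + 1) % p].T @ xi[(idx - l) % p] / N for l in range(L)])

def step(history):
    """One discrete synchronous update; history[l] is the state at time t-l."""
    u = sum(J[l] @ history[l] for l in range(L))
    return np.where(u >= 0.0, 1.0, -1.0)

# All-steps set initial condition: neurons AND delay elements hold stored
# patterns (here the patterns themselves, i.e. the optimum initial condition).
history = [xi[(-l) % p].copy() for l in range(L)]

# One-step set initial condition would instead be:
#   history = [xi[0].copy()] + [np.zeros(N) for _ in range(L - 1)]

for t in range(10):
    x = step(history)
    history = [x] + history[:-1]
    m = xi[(t + 1) % p] @ x / N   # overlap with the next pattern in sequence
    print(f"t = {t + 1:2d}   overlap m = {m:+.3f}")
```

In this sketch, starting from the one-step set condition the delay lines must first fill with meaningful states, so the early overlaps are lower than under the all-steps set condition.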

Dynamical Behaviors of Recall Process (all-steps set initial condition): loading rate α = 0.5, length of delay L = 3. Simulation (N = 2000) vs. theory.

Dynamical Behaviors of Recall Process (all-steps set initial condition): loading rate α = 0.5, length of delay L = 2. Simulation (N = 2000) vs. theory.

Loading Rate α vs. Steady-State Overlap m: simulation (N = 500) vs. theory.

Length of Delay L vs. Critical Loading Rate α_C.

Macrodynamical Equations by Statistical Neurodynamics (Yanai & Kim, 1995; Miyoshi, Yanai & Okada, 2002): good agreement with computer simulation, but the computational complexity is O(L⁴t).

Macroscopic Steady-State Equations: obtained by accounting for the steady state, exploiting parallel symmetry in terms of time steps, and applying the discrete Fourier transformation.
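A rough sketch of the derivation route, assuming it follows the standard pattern: in the steady state every time step looks the same (parallel symmetry), the signal summed over the L delay lines reduces to Lm, and the noise equation then involves only time differences, which the discrete Fourier transformation decouples:

```latex
% Steady state + parallel symmetry in time steps:
%   m^{t} = m, \quad \sigma_t = \sigma \quad \text{for all } t,
% so the overlap equation of the dynamical theory closes as
m = \operatorname{erf}\!\left( \frac{L\,m}{\sqrt{2}\,\sigma} \right)
% The companion equation for \sigma^2 depends only on time differences
% (a convolution), so applying the discrete Fourier transformation
%   \tilde{f}(k) = \sum_{t} f(t)\, e^{-i 2\pi k t / T}
% decouples it into one equation per Fourier mode k, and the resulting
% computational cost does not formally depend on L.
```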

Loading Rate α vs. Steady-State Overlap m: simulation (N = 500) vs. theory.

Loading Rate α vs. Steady-State Overlap.

Storage Capacity of the Delayed Network: the storage capacity grows in proportion to the length of delay, reaching 0.195L in the large-L limit.

Conclusions
The Yanai-Kim theory (macrodynamical equations for the delayed network) is re-derived. → Its computational complexity is O(L⁴t). → It is intractable to discuss macroscopic properties in the large-L limit.
Steady-state equations are derived by using the discrete Fourier transformation. → The computational complexity does not formally depend on L. → The phase transition points agree with those under the optimum initial conditions, that is, with the storage capacities!
The storage capacity is 0.195L in the large-L limit.