High Performance Associative Neural Networks: Overview and Library
Presented at AI06, Quebec City, Canada, June 7-9, 2006
Oleksiy K. Dekhtyarenko (1) and Dmitry O. Gorodnichy (2)
1 - Institute of Mathematical Machines and Systems, Dept. of Neurotechnologies, 42 Glushkov Ave., Kiev, 03187, Ukraine.
2 - Institute for Information Technology, National Research Council of Canada, M-50 Montreal Rd, Ottawa, Ontario, K1A 0R6, Canada.

2 Associative Neural Network Model
The Associative Neural Network (AsNN) is a dynamical nonlinear system capable of processing information via the evolution of its state in a high-dimensional state space.
Features:
- Distributed storage of information: fault tolerance
- Parallel mode of operation: efficient hardware implementation
- Non-iterative learning rules: fast, deterministic training
Conforms to three main principles of neural processing:
1. Non-linear processing
2. Massively distributed collective decision making
3. Synaptic plasticity:
   - to accumulate learning data over time by adjusting synapses
   - to associate receptor to effector (using the synaptic values thus computed)

3 Examples of Practical Applications
- Face recognition from video *
- Electronic nose **
* D. Gorodnichy – Associative Neural Networks as Means for Low-Resolution Video-Based Recognition, IJCNN'05
** A. Reznik, Y. Shirshov, B. Snopok, D. Nowicki, O. Dekhtyarenko & I. Kruglenko – Associative Memories for Chemical Sensing, ICONIP'02

4 Associative Properties: Convergence Process
The network evolves according to the state update rule
$S_j(t+1) = \mathrm{sign}\big(\sum_{i} w_{ji} S_i(t)\big)$,
where $\{V^m\}_{m=1}^{M}$ is the set of memorized patterns.
We want the network to retrieve data by associative similarity (to restore noisy or incomplete input data): starting from a distorted input, the state should converge to the nearest memorized pattern, each $V^m$ being a fixed point, $V^m = \mathrm{sign}(W V^m)$.
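Below is a minimal sketch of this convergence process in C++, assuming bipolar (+1/-1) neurons, synchronous updates, and a dense row-major weight matrix; the function name and data representation are illustrative, not the library's API.

#include <vector>

// Synchronous update rule S(t+1) = sign(W S(t)), iterated until the state
// stops changing (a fixed point / attractor) or maxIter is exhausted.
// 'weights' is a dense n x n matrix in row-major order; states are +1/-1.
std::vector<int> converge(const std::vector<double>& weights,
                          std::vector<int> state,
                          int maxIter = 100)
{
    const std::size_t n = state.size();
    for (int iter = 0; iter < maxIter; ++iter) {
        std::vector<int> next(n);
        bool changed = false;
        for (std::size_t j = 0; j < n; ++j) {
            double sum = 0.0;
            for (std::size_t i = 0; i < n; ++i)
                sum += weights[j * n + i] * state[i];
            next[j] = (sum >= 0.0) ? 1 : -1;   // sign nonlinearity
            changed |= (next[j] != state[j]);
        }
        state.swap(next);
        if (!changed) break;                   // fixed point reached
    }
    return state;
}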

5 Sparse Associative Neural Network
Advantages over the fully-connected model:
- Less memory needed for s/w simulation
- Quicker convergence during s/w simulation
- Fewer and/or more suitable connections for h/w implementation
- Greater biological plausibility
Output of neuron i can affect neuron j ($w_{ji} \neq 0$) if and only if $(i, j) \in A$, where $A$ is the architecture, or connectivity template.
Connection density: $d = |A| / n^2$.
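One simple way to represent such a template, sketched here as an assumption rather than the library's actual data structure, is a boolean mask with a density accessor:

#include <vector>

// Connectivity template A: neuron j may receive input from neuron i
// iff mask[j][i] is true (i.e. w_ji is allowed to be nonzero).
struct Architecture {
    std::vector<std::vector<bool>> mask;   // n x n boolean template

    // Connection density d = |A| / n^2.
    double density() const {
        std::size_t n = mask.size(), count = 0;
        for (const auto& row : mask)
            for (bool allowed : row) count += allowed;
        return n ? static_cast<double>(count) / (n * n) : 0.0;
    }
};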

6 Network Architectures
[Figures: Random Architecture, 1D Cellular Architecture, Small-World Architecture]
Ratings: 1 – the worst, 5 – the best

Architecture        Associative Performance   Memory Consumption   Hardware Friendly
Regular (cellular)  1                         5                    5
Small-World         2                         5                    4
Scale-Free          2                         5                    3
Random              3                         5                    2
Adaptive            4                         5                    2
Fully-Connected     5                         1                    1

Generators for the cellular and small-world templates are sketched below.
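The following illustrative generators build two of these templates using the Architecture mask type sketched above (assumptions: a ring of n neurons, radius < n, and std::rand as a stand-in for a proper RNG):

#include <cstdlib>
#include <vector>

// 1D cellular template: neuron j is connected to all neighbours within
// 'radius' positions on a ring of n neurons.
std::vector<std::vector<bool>> cellular1D(std::size_t n, std::size_t radius)
{
    std::vector<std::vector<bool>> mask(n, std::vector<bool>(n, false));
    for (std::size_t j = 0; j < n; ++j)
        for (std::size_t k = 1; k <= radius; ++k) {
            mask[j][(j + k) % n] = true;           // right neighbour
            mask[j][(j + n - k) % n] = true;       // left neighbour
        }
    return mask;
}

// Small-world template (Watts-Strogatz style): start from the cellular
// template and rewire each existing link to a random source with probability p.
std::vector<std::vector<bool>> smallWorld(std::size_t n, std::size_t radius, double p)
{
    auto mask = cellular1D(n, radius);
    for (std::size_t j = 0; j < n; ++j)
        for (std::size_t i = 0; i < n; ++i)
            if (mask[j][i] && std::rand() / (double)RAND_MAX < p) {
                mask[j][i] = false;
                mask[j][std::rand() % n] = true;   // random long-range link
            }
    return mask;
}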

7 Compare to …
Fully-connected net with n = 24x24 neurons, obtained by tracking and memorizing faces (of 24x24-pixel resolution) from real-life video sequences [Gorodnichy 05].
Notice the visible inherent synaptic structure! This synaptic interdependency is what sparse architectures exploit.

8 Some Learning Algorithms
- Projective
- Hebbian (Perceptron LR): $W = \frac{1}{n}\sum_{m=1}^{M} V^m (V^m)^T$
- Delta Rule
- Pseudo-Inverse: $W = V V^+$ (an incremental sketch follows below)
For sparse networks the weights are masked by the selection operator $S_A$, where $(S_A(W))_{ji} = w_{ji}$ if $(i, j) \in A$ and $0$ otherwise.

Performance Evaluation Criteria:
1. Error correction capability (associativity strength)
2. Capacity
3. Training complexity
4. Memory requirements
5. Execution time: a) in learning and b) in recognition
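As a concrete illustration of the pseudo-inverse rule, here is its standard incremental form, in which W is maintained as the orthogonal projector onto the span of the stored patterns, so that every stored pattern is a fixed point of the update rule. This is a sketch under the earlier dense-matrix assumption, not the library's implementation:

#include <vector>

// Incremental pseudo-inverse (projective) learning, W = V V^+.
// Adds pattern v to the memory held in the dense row-major n x n matrix W.
void storePattern(std::vector<double>& W, const std::vector<double>& v)
{
    const std::size_t n = v.size();
    // u = v - W v : the component of v outside the already-stored subspace.
    std::vector<double> u(n);
    double norm2 = 0.0;
    for (std::size_t j = 0; j < n; ++j) {
        double wv = 0.0;
        for (std::size_t i = 0; i < n; ++i) wv += W[j * n + i] * v[i];
        u[j] = v[j] - wv;
        norm2 += u[j] * u[j];
    }
    if (norm2 < 1e-12) return;   // v already lies in the span: W is unchanged
    // W += u u^T / (u^T u) : extend the projector by the new direction.
    for (std::size_t j = 0; j < n; ++j)
        for (std::size_t i = 0; i < n; ++i)
            W[j * n + i] += u[j] * u[i] / norm2;
}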

9 Comparative Performance Analysis: Networks with Fixed Architectures
[Plots: associative performance and training complexity as a function of the number of stored patterns]
Setup: cellular 1D network of dimension 256 with connection radius 12; randomly generated data vectors.

10 Comparative Performance Analysis: Influence of Architecture
[Plot: associative performance as a function of connection density]
Setup: sparse network of dimension 200; randomly generated data vectors; various ways of selecting the architecture:
- PI WS – PseudoInverse Weight Select: architecture targeting maximum informational capacity per synapse
- PI Random – randomly set sparse architecture with the PseudoInverse learning rule
- PI Cell – cellular architecture with the PseudoInverse learning rule
- PI WS Reverse – architecture constructed using the criterion opposite to that of PI WS

11 Associative Neural Network Library
Publicly available at
- Efficient C++ implementation of full and sparse associative networks
- Includes the non-iterative Pseudo-Inverse LR with the possibility of adding/removing selected vectors to/from memory
- Different learning rules: Projective, Hebbian, Delta Rule, Pseudo-Inverse
- Different architectures: fully-connected, cellular (1D and 2D), random, small-world, adaptive
- Desaturation technique: allows increasing memory capacity up to 100%
- Different update rules: synchronous vs. asynchronous
- Detection of cycles
- Different testing functions: absolute and normalized radius of attraction, capacity
- Associative classifiers: convergence-based, modular
A usage sketch built from the earlier code fragments follows below.
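To make the store/retrieve workflow concrete, the fragment below composes the converge() and storePattern() sketches from slides 4 and 8 into one cycle; it illustrates those earlier assumptions and is not the library's actual interface:

#include <iostream>
#include <vector>

// Declarations of the sketches given earlier (slides 4 and 8).
std::vector<int> converge(const std::vector<double>& weights,
                          std::vector<int> state, int maxIter = 100);
void storePattern(std::vector<double>& W, const std::vector<double>& v);

int main()
{
    const std::size_t n = 64;
    std::vector<double> W(n * n, 0.0);      // empty memory

    // Store one bipolar pattern (all +1, for brevity).
    storePattern(W, std::vector<double>(n, 1.0));

    // Corrupt a few components and let the network restore them.
    std::vector<int> probe(n, 1);
    probe[0] = probe[1] = probe[2] = -1;
    std::vector<int> restored = converge(W, probe);

    std::cout << (restored == std::vector<int>(n, 1) ? "restored\n" : "failed\n");
    return 0;
}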

12 Associative Neural Network Library: Hierarchy of Main Classes