C - ITAcumens.COM. To demonstrate the use of Neural Networks in the field of Character and Pattern Recognition by simulating a neural network.



OBJECTIVE To demonstrate the use of Neural Networks in the field of Character and Pattern Recognition by simulating a neural network that uses the Back-propagation algorithm for alphanumeric recognition, and Bi-directional Associative Memories for pattern recognition in the application of associating names with phone numbers.

REQUIREMENTS
PLATFORM: Windows 9x/XP
LANGUAGE USED: Microsoft VC++
DEVELOPMENT TOOL: Microsoft Visual Studio
GUI DESIGN: Microsoft Foundation Classes
OTHER DESIGN TOOLS: SmartDraw

MODULES
- Character Recognition (alphanumeric)
  - 7-Segment Display
  - Look-up Table
  - Back-Propagation Algorithm
- Pattern Recognition
  - Bi-directional Associative Model

SYNOPSIS The character recognition in this project deals with the identification of alphanumeric characters created through user interaction. To highlight the importance of neural networks in this scenario, two methods of automated character recognition were developed: a back-propagation network and a bidirectional associative memory.

Character Recognition The basic function of this module is to implement neural-network-based alphanumeric recognition. It has three sub-modules. The first uses the pattern generated by the user through a GUI-based 7-segment display. The second uses a 5 x 7 grid drawn in the GUI to capture user input; this pattern is taken as input and the corresponding character is recognized. The third sub-module implements the neural network itself, using a back-propagation network for both letter and numeric recognition. The GUI created for this module allows the user to train the network with specific patterns of input.
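The 5 x 7 grid can be illustrated with a short sketch (plain Python for brevity; the project itself is in VC++). The grid below is a made-up example, roughly the letter "T", flattened row by row into the 35-element input vector the network consumes:

```python
# Illustrative sketch, not the project's code: a 5-wide, 7-tall character
# grid (here a rough "T") flattened into the 35-element input vector.
grid = [
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
]

# Row-major flattening gives one input activation per grid cell.
input_vector = [cell for row in grid for cell in row]
assert len(input_vector) == 35
```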

7 Segment Display This method cannot display letters such as Q, W, R, Y, K, Z, X, V, N, M.
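The limitation follows from the encoding itself. Each character must be expressed as 7 bits, one per segment (a, b, c, d, e, f, g in the conventional order); the table below lists the standard digit patterns as a sketch. Letters like W, M or X need diagonal or doubled strokes that no combination of the seven segments can form:

```python
# Standard 7-segment digit encodings, segment order (a, b, c, d, e, f, g).
# This is an illustrative table, not the project's actual data structure.
SEGMENTS = {
    "0": (1, 1, 1, 1, 1, 1, 0),
    "1": (0, 1, 1, 0, 0, 0, 0),
    "2": (1, 1, 0, 1, 1, 0, 1),
    "3": (1, 1, 1, 1, 0, 0, 1),
    "4": (0, 1, 1, 0, 0, 1, 1),
    "5": (1, 0, 1, 1, 0, 1, 1),
    "6": (1, 0, 1, 1, 1, 1, 1),
    "7": (1, 1, 1, 0, 0, 0, 0),
    "8": (1, 1, 1, 1, 1, 1, 1),
    "9": (1, 1, 1, 1, 0, 1, 1),
}
# Every displayable character is exactly one 7-bit pattern.
assert all(len(bits) == 7 for bits in SEGMENTS.values())
```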

Look Up Table Method This method is simpler and faster, but the user has to create the pattern.
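The look-up approach can be sketched as an exact-match dictionary (the patterns below are shortened, made-up examples, not the project's stored data). Its weakness is visible immediately: a pattern that differs by even one cell from the stored one is not recognized, which is exactly the gap the neural-network module fills:

```python
# Illustrative look-up-table recognizer: the user's pattern is matched
# exactly against stored reference patterns; no learning is involved.
# The patterns and characters here are hypothetical short examples.
LOOKUP = {
    (1, 1, 1, 0, 1): "A",
    (0, 1, 1, 1, 0): "B",
}

def recognise(pattern):
    # Exact match only: any unseen or noisy pattern is rejected.
    return LOOKUP.get(tuple(pattern), "?")
```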

Back Propagation Network For the recognition of letters, a network with 35 input nodes in the input layer, 50 hidden nodes in the hidden layer and 26 output nodes in the output layer is used. For the recognition of numbers the same network is used with 10 output nodes. The output of the back-propagation network is interpreted as a classification decision.
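The 35-50-26 forward pass and the classification decision can be sketched as follows (illustrative Python, not the project's VC++ implementation; the weights here are random, whereas in the project they are learned by back-propagation):

```python
import math
import random

# Sketch of the 35-50-26 feed-forward pass. Weights are random for the
# purposes of the example only.
random.seed(0)
N_IN, N_HID, N_OUT = 35, 50, 26

w_ih = [[random.uniform(-0.5, 0.5) for _ in range(N_IN)] for _ in range(N_HID)]
w_ho = [[random.uniform(-0.5, 0.5) for _ in range(N_HID)] for _ in range(N_OUT)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs):
    # Activations flow input -> hidden -> output through sigmoid units.
    hidden = [sigmoid(sum(w * i for w, i in zip(row, inputs))) for row in w_ih]
    return [sigmoid(sum(w * h for w, h in zip(row, hidden))) for row in w_ho]

def classify(inputs):
    # The classification decision: the output node with the highest
    # activation names the recognized letter (node 0 -> 'A', 25 -> 'Z').
    outputs = forward(inputs)
    return chr(ord("A") + outputs.index(max(outputs)))
```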

BPN for Numeric Recognition

Training the network A back-propagation network typically starts with a random set of weights. The network adjusts its weights each time it sees an input-output pair. Each pair requires two stages: a forward pass and a backward pass. The forward pass involves presenting a sample input to the network and letting activations flow until they reach the output layer.

Training the network In the backward pass, the network's actual output (from the forward pass) is compared with the target output, and error estimates are computed for the output units. The weights connected to the output units are adjusted in order to reduce these errors. The error estimates of the output units are then used to derive error estimates for the hidden layers. Finally, errors are propagated back to the connections stemming from the input units.
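One full forward/backward step can be sketched on a tiny 2-2-1 network (a minimal illustration assuming sigmoid units and plain gradient descent, not the project's code). Repeated presentations of the same input-output pair visibly shrink the error, which is the essence of the training loop described above:

```python
import math
import random

random.seed(1)
LR = 0.5  # learning rate (an assumed value for this sketch)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Random starting weights, as the slide describes.
w_ih = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
w_ho = [random.uniform(-1, 1) for _ in range(2)]

def train_step(x, target):
    # Forward pass: activations flow from inputs to the output layer.
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_ih]
    y = sigmoid(sum(w * hi for w, hi in zip(w_ho, h)))
    # Backward pass: output error first, then hidden-layer error estimates
    # derived from it, exactly as in the text above.
    d_out = (target - y) * y * (1 - y)
    d_hid = [d_out * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
    # Adjust weights to reduce the error.
    for j in range(2):
        w_ho[j] += LR * d_out * h[j]
        for i in range(2):
            w_ih[j][i] += LR * d_hid[j] * x[i]
    return (target - y) ** 2

errors = [train_step([1.0, 0.0], 1.0) for _ in range(200)]
assert errors[-1] < errors[0]  # the error shrinks with training
```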

Alphabet recognition -BPN

Numeric recognition BPN

Pattern Recognition The function of this module is to simulate a bi-directional associative memory (BAM) model for the application of associating names with phone numbers. The module defines the names and phone numbers. When the user enters a name with a wrong spelling, the network takes the given pattern, finds the stored name most closely associated with it, and displays the result. This module demonstrates the use of the BAM model in the field of pattern recognition.

Implementing the BAM network The BAM network consists of two layers: the input (X) layer and the output (Y) layer. The X layer represents the names and the Y layer represents the phone numbers. X layer: 30 units, with 6 bits per character of the name. Y layer: 42 units, with 6 bits per character of the phone number.
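The layer sizes follow from the encoding: 6 bits per character gives 30 units for a 5-character name and 42 units for a 7-character phone number. One plausible encoding is sketched below (the exact bit scheme used in the project is not stated; this version keeps the low 6 bits of each character code and stores them in the bipolar -1/+1 form that BAM theory works with):

```python
# Illustrative encoding sketch: 6 bits per character, bipolar units.
def to_bipolar(text, width):
    text = text.ljust(width)           # pad to the fixed layer width
    units = []
    for ch in text:
        code = ord(ch) & 0b111111      # assumed: keep the low 6 bits
        for bit in range(5, -1, -1):   # most significant bit first
            units.append(1 if (code >> bit) & 1 else -1)
    return units

name_units = to_bipolar("SMITH", 5)     # X layer: 5 chars x 6 bits = 30
phone_units = to_bipolar("5551234", 7)  # Y layer: 7 chars x 6 bits = 42
assert len(name_units) == 30 and len(phone_units) == 42
```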

Process of simulating the network
- Generating the network: allocating sufficient memory for the network layers X and Y.
- Initializing the application: finding the bipolar values for both the names and the phone numbers.
- Calculating the weights: weight = input x output of the particular unit.
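The weight rule stated above is the classic Hebbian outer-product construction: each weight is the product of the corresponding X and Y units, accumulated over every stored (name, phone) pair. A minimal sketch on tiny bipolar vectors (illustrative, not the project's code):

```python
# Outer-product weight calculation: w[i][j] = sum over pairs of x[i]*y[j].
def bam_weights(pairs):
    n, m = len(pairs[0][0]), len(pairs[0][1])
    w = [[0] * m for _ in range(n)]
    for x, y in pairs:
        for i in range(n):
            for j in range(m):
                w[i][j] += x[i] * y[j]
    return w

# Two tiny stored associations (hypothetical bipolar patterns).
pairs = [([1, -1, 1], [1, -1]), ([-1, 1, -1], [-1, 1])]
W = bam_weights(pairs)
```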

Process of simulating the network
- Propagating signals between layers: adjusting the output of the Y layer until the correct association with the elements of the X layer is found.
- Output: the output of the network is the association of the names with the phone numbers.
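The propagation step can be sketched as the standard BAM recall loop: signals bounce between the X and Y layers, each layer thresholding the other's weighted sum, until both stop changing. This is how a misspelled name settles onto the closest stored association (illustrative code continuing the tiny example above, not the project's implementation):

```python
def sign(v):
    return 1 if v >= 0 else -1

def recall(w, x):
    # Bounce signals X -> Y -> X until both layers are stable.
    n, m = len(w), len(w[0])
    y = [0] * m
    while True:
        y_new = [sign(sum(w[i][j] * x[i] for i in range(n))) for j in range(m)]
        x_new = [sign(sum(w[i][j] * y_new[j] for j in range(m))) for i in range(n)]
        if y_new == y and x_new == x:
            return x, y
        x, y = x_new, y_new

# Weights from the outer-product rule for the stored pair ([1,-1,1], [1,-1]).
W = [[2, -2], [-2, 2], [2, -2]]
x_noisy = [1, -1, -1]   # one unit flipped from the stored [1, -1, 1]
x_rec, y_rec = recall(W, x_noisy)
# The network corrects the noisy input and recalls the associated Y pattern.
```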

SCREEN SHOT - Bi-directional Associative Memory