
“Principles of Soft Computing, 2nd Edition” by S.N. Sivanandam & S.N. Deepa. Copyright © 2011 Wiley India Pvt. Ltd. All rights reserved.

CHAPTER 2 ARTIFICIAL NEURAL NETWORKS: AN INTRODUCTION

DEFINITION OF NEURAL NETWORKS
According to the DARPA Neural Network Study (1988, AFCEA International Press, p. 60): “... a neural network is a system composed of many simple processing elements operating in parallel whose function is determined by network structure, connection strengths, and the processing performed at computing elements or nodes.”
According to Haykin (1994, p. 2): “A neural network is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects: knowledge is acquired by the network through a learning process, and interneuron connection strengths known as synaptic weights are used to store the knowledge.”

BRAIN COMPUTATION
The human brain contains about 10 billion nerve cells, or neurons. On average, each neuron is connected to other neurons through approximately 10,000 synapses.

INTERCONNECTIONS IN BRAIN

BIOLOGICAL (MOTOR) NEURON

ARTIFICIAL NEURAL NET
- An information-processing system.
- Neurons process the information.
- The signals are transmitted by means of connection links.
- The links possess an associated weight.
- The output signal is obtained by applying activations to the net input.

MOTIVATION FOR NEURAL NET
- Scientists are challenged to use machines more effectively for tasks currently solved by humans.
- Symbolic rules don't reflect the processes actually used by humans.
- Traditional computing excels in many areas, but not in others.

The major areas of advantage are:
- Massive parallelism
- Distributed representation and computation
- Learning ability
- Generalization ability
- Adaptivity
- Inherent contextual information processing
- Fault tolerance
- Low energy consumption

ARTIFICIAL NEURAL NET
The figure shows a simple artificial neural net with two input neurons (X1, X2) and one output neuron (Y). The interconnection weights are given by W1 and W2.
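For this two-input net, the net input to the output neuron Y is the weighted sum of the inputs, and the output is obtained by applying the activation function to it:

```latex
y_{in} = x_1 w_1 + x_2 w_2, \qquad y = f(y_{in})
```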

ASSOCIATION OF BIOLOGICAL NET WITH ARTIFICIAL NET

PROCESSING OF AN ARTIFICIAL NET
The neuron is the basic information-processing unit of a neural network (NN). It consists of:
1. A set of links, describing the neuron inputs, with weights W1, W2, ..., Wm.
2. An adder function (linear combiner) for computing the weighted sum of the (real-valued) inputs: y_in = x1 w1 + x2 w2 + ... + xm wm.
3. An activation function for limiting the amplitude of the neuron output.
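A minimal Python sketch of this processing unit; the step activation, the example inputs, and the weights below are illustrative assumptions, not values from the text:

```python
def step(net, threshold=0.0):
    """Binary step activation: 1 if the net input reaches the threshold, else 0."""
    return 1 if net >= threshold else 0

def neuron_output(inputs, weights, activation=step):
    """Adder (linear combiner) followed by the activation function."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return activation(net)

# Example with two inputs and illustrative weights
print(neuron_output([1, 0], [0.6, 0.4]))   # net = 0.6, so the neuron fires -> 1
```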

BIAS OF AN ARTIFICIAL NEURON
The bias value is added to the weighted sum ∑ wi xi so that the decision boundary can be shifted away from the origin:
y_in = ∑ wi xi + b, where b is the bias.
(Figure: decision lines x1 - x2 = 1, x1 - x2 = 0 and x1 - x2 = -1 in the (x1, x2) plane.)
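If the weights behind the plotted lines are taken as w1 = 1 and w2 = -1 (an assumption consistent with the three lines shown), the decision boundary y_in = 0 becomes

```latex
w_1 x_1 + w_2 x_2 + b = 0 \;\Longrightarrow\; x_1 - x_2 = -b,
```

so b = 0, -1 and 1 give the three lines in the figure: changing the bias translates the boundary without changing its slope.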

MULTILAYER ARTIFICIAL NEURAL NET
INPUT: records without the class attribute, with normalized attribute values.
INPUT VECTOR: X = {x1, x2, ..., xn}, where n is the number of (non-class) attributes.
INPUT LAYER: there are as many nodes as non-class attributes, i.e., as many as the length of the input vector.
HIDDEN LAYER: the number of nodes in the hidden layer and the number of hidden layers depend on the implementation.
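As a rough illustration of how the layer sizes follow from the data, here is a minimal sketch of a forward pass through one hidden layer; the two-node hidden layer, the random weight range, and the sigmoid activation are assumptions made for the example:

```python
import math
import random

def sigmoid(net):
    """Binary sigmoidal activation, output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-net))

def init_layer(n_in, n_out):
    """One weight vector and one bias per node in the layer."""
    return [{"w": [random.uniform(-0.5, 0.5) for _ in range(n_in)],
             "b": random.uniform(-0.5, 0.5)} for _ in range(n_out)]

def forward(layer, x):
    """Weighted sum plus bias, passed through the activation, for every node."""
    return [sigmoid(sum(w * xi for w, xi in zip(node["w"], x)) + node["b"])
            for node in layer]

x = [0.2, 0.7, 0.1]                  # n = 3 non-class attributes (normalized)
hidden = init_layer(len(x), 2)       # hidden-layer size is an implementation choice
output = init_layer(2, 1)            # one output node
print(forward(output, forward(hidden, x)))
```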

OPERATION OF A NEURAL NET
(Figure: an input vector x = (x0, x1, ..., xn), where x0 is the bias input, is combined with the weight vector w = (w0j, w1j, ..., wnj); the weighted sum ∑ is passed through the activation function f to produce the output y.)

WEIGHT AND BIAS UPDATION
- Per-sample updating: the weights and biases are updated after the presentation of each sample.
- Per-training-set updating (epoch or iteration): the weight and bias increments are accumulated in variables, and the weights and biases are updated after all the samples of the training set have been presented.
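A minimal sketch contrasting the two schemes for a single neuron; the step output and the delta-style correction (alpha * error * input) are illustrative assumptions rather than a rule prescribed by this slide:

```python
def predict(w, b, x):
    """Binary step output of a single neuron."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0

def train_per_sample(samples, w, b, alpha=0.1):
    """Update the weights and bias immediately after each sample."""
    for x, t in samples:
        err = t - predict(w, b, x)
        w = [wi + alpha * err * xi for wi, xi in zip(w, x)]
        b += alpha * err
    return w, b

def train_per_epoch(samples, w, b, alpha=0.1):
    """Accumulate the increments and apply them once, after the whole training set."""
    dw, db = [0.0] * len(w), 0.0
    for x, t in samples:
        err = t - predict(w, b, x)
        dw = [dwi + alpha * err * xi for dwi, xi in zip(dw, x)]
        db += alpha * err
    return [wi + dwi for wi, dwi in zip(w, dw)], b + db
```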

STOPPING CONDITION
- All changes in weights (Δwij) in the previous epoch are below some threshold, or
- the percentage of samples misclassified in the previous epoch is below some threshold, or
- a pre-specified number of epochs has expired.
In practice, several hundred thousand epochs may be required before the weights converge.
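A sketch of how the three conditions might be combined; the threshold values and the train_one_epoch helper are hypothetical:

```python
def should_stop(max_delta_w, misclassified_fraction, epoch,
                delta_threshold=1e-4, error_threshold=0.01, max_epochs=500_000):
    """Stop when weight changes are tiny, the error is low enough,
    or the pre-specified number of epochs has expired."""
    return (max_delta_w < delta_threshold
            or misclassified_fraction < error_threshold
            or epoch >= max_epochs)

# Typical use inside a training loop (train_one_epoch is a hypothetical helper
# returning the largest weight change and the misclassification rate):
# epoch = 0
# while True:
#     max_dw, error = train_one_epoch(net, samples)
#     epoch += 1
#     if should_stop(max_dw, error, epoch):
#         break
```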

BUILDING BLOCKS OF ARTIFICIAL NEURAL NET
- Network architecture (connections between neurons)
- Setting the weights (training)
- Activation function


LAYER PROPERTIES
- Input layer: each input unit may be designated by an attribute value possessed by the instance.
- Hidden layer: not directly observable; provides nonlinearities for the network.
- Output layer: encodes the possible output values.

TRAINING PROCESS
- Supervised training: the network is provided with a series of sample inputs and its outputs are compared with the expected responses.
- Unsupervised training: the most similar input vectors are assigned to the same output unit.
- Reinforcement training: the right answer is not provided, but an indication of whether the output is 'right' or 'wrong' is.

ACTIVATION FUNCTION
- Activation level: discrete or continuous.
- Hard limit function (discrete): binary activation function, bipolar activation function, identity function.
- Sigmoidal activation function (continuous): binary sigmoidal activation function, bipolar sigmoidal activation function.

ACTIVATION FUNCTION
Activation functions: (A) identity, (B) binary step, (C) bipolar step, (D) binary sigmoidal, (E) bipolar sigmoidal, (F) ramp.
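A minimal sketch of these six functions; the threshold theta and the steepness parameter lam are the conventional parameters assumed here:

```python
import math

def identity(x):
    return x

def binary_step(x, theta=0.0):
    """1 above the threshold, 0 otherwise."""
    return 1 if x >= theta else 0

def bipolar_step(x, theta=0.0):
    """+1 above the threshold, -1 otherwise."""
    return 1 if x >= theta else -1

def binary_sigmoid(x, lam=1.0):
    """Ranges over (0, 1); lam controls the steepness."""
    return 1.0 / (1.0 + math.exp(-lam * x))

def bipolar_sigmoid(x, lam=1.0):
    """Ranges over (-1, 1)."""
    return 2.0 / (1.0 + math.exp(-lam * x)) - 1.0

def ramp(x):
    """Linear between 0 and 1, saturating outside that interval."""
    return max(0.0, min(1.0, x))
```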

CONSTRUCTING ANN
- Determine the network properties: network topology, types of connectivity, order of connections, weight range.
- Determine the node properties: activation range.
- Determine the system dynamics: weight initialization scheme, activation-calculating formula, learning rule.

PROBLEM SOLVING
- Select a suitable NN model based on the nature of the problem.
- Construct an NN according to the characteristics of the application domain.
- Train the neural network with the learning procedure of the selected model.
- Use the trained network for making inferences or solving problems.

NEURAL NETWORKS
- A neural network learns by adjusting its weights so as to correctly classify the training data and hence, after the testing phase, to classify unknown data.
- A neural network needs a long time for training.
- A neural network has a high tolerance for noisy and incomplete data.

SALIENT FEATURES OF ANN
- Adaptive learning
- Self-organization
- Real-time operation
- Fault tolerance via redundant information coding
- Massive parallelism
- Learning and generalizing ability
- Distributed representation

McCULLOCH–PITTS NEURON
- Neurons are sparsely and randomly connected.
- The firing state is binary (1 = firing, 0 = not firing).
- All but one of the neurons are excitatory (they tend to increase the voltage of other cells).
- One inhibitory neuron connects to all other neurons; it functions to regulate network activity (preventing too many firings).
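Complementing this network-level description, here is a minimal sketch of a single McCulloch-Pitts unit as it is usually formalized (fixed excitatory weights, a hard threshold, and absolute inhibition); the AND example and its threshold are illustrative assumptions:

```python
def mcculloch_pitts(excitatory, inhibitory, threshold):
    """Fires (returns 1) only if no inhibitory input is active and the
    sum of excitatory inputs reaches the threshold (absolute inhibition)."""
    if any(inhibitory):
        return 0
    return 1 if sum(excitatory) >= threshold else 0

# AND of two binary inputs: threshold 2, no inhibitory inputs
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mcculloch_pitts([x1, x2], [], threshold=2))
```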

LINEAR SEPARABILITY
- Linear separability is the concept wherein the separation of the input space into regions is based on whether the network response is positive or negative.
- Consider a network having a positive response in the first quadrant and a negative response in all other quadrants (the AND function), with either binary or bipolar data; the decision line is then drawn separating the positive-response region from the negative-response region.
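For the bipolar AND function, one choice of weights that realizes such a decision line (an illustrative choice, not the only one) is w1 = w2 = 1 and b = -1, giving the boundary x1 + x2 - 1 = 0. A minimal sketch:

```python
def bipolar_and_response(x1, x2, w1=1, w2=1, b=-1):
    """Positive response above the decision line x1 + x2 - 1 = 0, negative below."""
    net = w1 * x1 + w2 * x2 + b
    return 1 if net > 0 else -1

for x1, x2 in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
    print((x1, x2), "->", bipolar_and_response(x1, x2))
# Only (1, 1) falls on the positive side of the line; the other three are negative.
```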

HEBB NETWORK
Donald Hebb stated in 1949 that, in the brain, learning is performed by changes in the synaptic gap. Hebb explained it thus: “When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”

HEBB LEARNING
- The weights between neurons whose activities are positively correlated are increased.
- Associative memory is produced automatically.
- The Hebb rule can be used for pattern association, pattern categorization, pattern classification, and a range of other areas.
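In the weight-update form commonly used for the Hebb net, the rule is w_i(new) = w_i(old) + x_i * y with b(new) = b(old) + y. A minimal sketch that trains such a net on the bipolar AND function, a standard illustrative example (the choice of task is an assumption, not taken from this slide):

```python
def train_hebb(samples):
    """Hebb rule: w_i(new) = w_i(old) + x_i * y, b(new) = b(old) + y."""
    w, b = [0, 0], 0
    for x, y in samples:
        w = [wi + xi * y for wi, xi in zip(w, x)]
        b += y
    return w, b

# Bipolar AND: target is +1 only when both inputs are +1
and_samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
w, b = train_hebb(and_samples)
print(w, b)   # trained weights and bias
for x, t in and_samples:
    net = sum(wi * xi for wi, xi in zip(w, x)) + b
    print(x, "->", 1 if net > 0 else -1, "(target", t, ")")
```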

FEW APPLICATIONS OF NEURAL NETWORKS