G5BAIM Artificial Intelligence Methods Graham Kendall Neural Networks


G5BAIM Neural Networks Neural Networks AIMA – Chapter 19. Fausett, L., Fundamentals of Neural Networks: Architectures, Algorithms and Applications, 1994. Morton, I.M., An Introduction to Neural Networks (2nd Ed), 1995

G5BAIM Neural Networks Neural Networks McCulloch & Pitts (1943) are generally recognised as the designers of the first neural network. Many of their ideas are still used today (e.g. many simple units combining to give increased computational power, and the idea of a threshold)

G5BAIM Neural Networks Neural Networks Hebb (1949) developed the first learning rule (on the premise that if two neurons are active at the same time, the connection strength between them should be increased)
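Hebb's premise is often written as the update rule Δw_ij = η a_i a_j, where a_i and a_j are the activations of the two connected neurons and η is a learning rate (this formulation is standard textbook notation, not taken from these slides).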

G5BAIM Neural Networks Neural Networks During the 1950s and 60s many researchers worked on the perceptron amidst great excitement. Minsky & Papert's analysis (1969) then saw the death of neural network research for about 15 years. Only in the mid 80s (Parker and LeCun) was interest revived (in fact Werbos had discovered the algorithm in 1974)


G5BAIM Neural Networks Neural Networks We are born with about 100 billion neurons. A neuron may connect to as many as 100,000 other neurons

G5BAIM Neural Networks Neural Networks Signals "move" between neurons electrochemically. The synapses release a chemical transmitter, the sum of which can cause a threshold to be reached, causing the neuron to "fire". Synapses can be inhibitory or excitatory

G5BAIM Neural Networks The First Neural Networks McCulloch and Pitts produced the first neural network in 1943. Many of the principles can still be seen in neural networks of today

G5BAIM Neural Networks The First Neural Networks [Diagram: a unit Y with weighted inputs X1, X2, X3] The activation of a neuron is binary. That is, the neuron either fires (activation of one) or does not fire (activation of zero).

G5BAIM Neural Networks The First Neural Networks For the network shown here, the activation function for unit Y is f(y_in) = 1 if y_in >= θ, else 0, where y_in is the total input signal received and θ is the threshold for Y
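As a minimal sketch of this rule in Python (the function name and structure are my own, not from the slides):

    def mp_neuron(inputs, weights, theta):
        # McCulloch-Pitts unit: fires (returns 1) exactly when the
        # weighted sum of the inputs reaches the threshold theta
        y_in = sum(w * x for w, x in zip(weights, inputs))
        return 1 if y_in >= theta else 0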

G5BAIM Neural Networks The First Neural Networks Neurons in a McCulloch-Pitts network are connected by directed, weighted paths

G5BAIM Neural Networks The First Neural Networks If the weight on a path is positive, the path is excitatory; otherwise it is inhibitory

G5BAIM Neural Networks The First Neural Networks All excitatory connections into a particular neuron have the same weight, although differently weighted connections can be input to different neurons

G5BAIM Neural Networks The First Neural Networks Each neuron has a fixed threshold. If the net input into the neuron is greater than the threshold, the neuron fires

G5BAIM Neural Networks The First Neural Networks The threshold is set such that any non-zero inhibitory input will prevent the neuron from firing

G5BAIM Neural Networks The First Neural Networks It takes one time step for a signal to pass over one connection.

G5BAIM Neural Networks The First Neural Networks AND Function: X1 and X2 each connect to Y with weight 1; Threshold(Y) = 2

G5BAIM Neural Networks The First Neural Networks OR Function: X1 and X2 each connect to Y with weight 2; Threshold(Y) = 2

G5BAIM Neural Networks The First Neural Networks AND NOT Function (X1 AND NOT X2): X1 connects to Y with weight 2 and X2 with a negative (inhibitory) weight; Threshold(Y) = 2

G5BAIM Neural Networks The First Neural Networks XOR Function: X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1) [Diagram: a two-layer network in which X1 and X2 feed intermediate units Z1 and Z2, which feed Y]
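The gates above can be checked as McCulloch-Pitts units in a short runnable sketch (the inhibitory weight of -1 in AND NOT is an assumption consistent with the slides' rules, and the function names are mine):

    def mp_neuron(inputs, weights, theta):
        # fires iff the weighted input sum reaches the threshold theta
        return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

    def AND(x1, x2):      # weights 1, 1; Threshold(Y) = 2
        return mp_neuron([x1, x2], [1, 1], 2)

    def OR(x1, x2):       # weights 2, 2; Threshold(Y) = 2
        return mp_neuron([x1, x2], [2, 2], 2)

    def AND_NOT(x1, x2):  # X1 AND NOT X2: weights 2, -1; Threshold(Y) = 2
        return mp_neuron([x1, x2], [2, -1], 2)

    def XOR(x1, x2):      # two layers: (X1 AND NOT X2) OR (X2 AND NOT X1)
        return OR(AND_NOT(x1, x2), AND_NOT(x2, x1))

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, '->', XOR(x1, x2))   # prints 0, 1, 1, 0

Note that XOR needs the intermediate units Z1 and Z2: no single unit of this kind can compute it, which foreshadows the linear separability discussion later.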

G5BAIM Neural Networks The First Neural Networks If we touch something cold we perceive heat. If we keep touching something cold we will perceive cold. If we touch something hot we will perceive heat

G5BAIM Neural Networks The First Neural Networks To model this we will assume that time is discrete. If cold is applied for one time step then heat will be perceived. If a cold stimulus is applied for two time steps then cold will be perceived. If heat is applied then we should perceive heat

G5BAIM Neural Networks The First Neural Networks [Diagram: inputs X1 (heat) and X2 (cold) feed intermediate units Z1 and Z2 and outputs Y1 (perceive hot) and Y2 (perceive cold)]

G5BAIM Neural Networks The First Neural Networks It takes time for the stimulus (applied at X1 and X2) to make its way to Y1 and Y2, where we perceive either heat or cold. At t(0), we apply a stimulus to X1 and X2. At t(1) we can update Z1, Z2 and Y1. At t(2) we can perceive a stimulus at Y2. At t(2+n) the network is fully functional

G5BAIM Neural Networks The First Neural Networks We want the system to perceive cold if a cold stimulus is applied for two time steps: Y2(t) = X2(t – 2) AND X2(t – 1)

G5BAIM Neural Networks The First Neural Networks We want the system to perceive heat if either a hot stimulus is applied or a cold stimulus is applied (for one time step) and then removed: Y1(t) = [ X1(t – 1) ] OR [ X2(t – 3) AND NOT X2(t – 2) ]

G5BAIM Neural Networks The First Neural Networks The network shows: Y1(t) = X1(t – 1) OR Z1(t – 1); Z1(t – 1) = Z2(t – 2) AND NOT X2(t – 2); Z2(t – 2) = X2(t – 3). Substituting, we get Y1(t) = [ X1(t – 1) ] OR [ X2(t – 3) AND NOT X2(t – 2) ], which is the same as our original requirement
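These update rules can be confirmed with a short simulation (a sketch: the time bookkeeping and names are mine; the logic is exactly the update rules above):

    def simulate(heat, cold, steps=4):
        # heat[t] / cold[t]: stimuli applied at X1 / X2 at time t;
        # stimuli outside the given lists default to 0
        def X1(t): return heat[t] if 0 <= t < len(heat) else 0
        def X2(t): return cold[t] if 0 <= t < len(cold) else 0
        def Z2(t): return X2(t - 1)
        def Z1(t): return 1 if (Z2(t - 1) == 1 and X2(t - 1) == 0) else 0
        def Y1(t): return 1 if (X1(t - 1) == 1 or Z1(t - 1) == 1) else 0   # perceive heat
        def Y2(t): return 1 if (X2(t - 2) == 1 and X2(t - 1) == 1) else 0  # perceive cold
        for t in range(steps):
            print(f't={t}: heat={Y1(t)} cold={Y2(t)}')

    simulate(heat=[0], cold=[1])     # cold for one step: heat perceived at t=3
    simulate(heat=[0], cold=[1, 1])  # cold for two steps: cold perceived at t=2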

G5BAIM Neural Networks The First Neural Networks You can confirm that Y2 works correctly. You can also check that it works on the spreadsheet

G5BAIM Neural Networks Modelling a Neuron a_j: activation value of unit j. w_j,i: weight on the link from unit j to unit i. in_i: weighted sum of inputs to unit i, in_i = Σ_j w_j,i a_j. a_i: activation value of unit i, a_i = g(in_i). g: activation function

G5BAIM Neural Networks Activation Functions Step_t(x) = 1 if x >= t, else 0. Sign(x) = +1 if x >= 0, else –1. Sigmoid(x) = 1/(1 + e^(–x)). Identity(x) = x
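A direct transcription into Python (a sketch; only the standard library is used):

    import math

    def step(x, t=0.0):
        # Step_t(x): 1 once x reaches the threshold t, else 0
        return 1 if x >= t else 0

    def sign(x):
        return 1 if x >= 0 else -1

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def identity(x):
        return x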

G5BAIM Neural Networks Simple Networks

G5BAIM Neural Networks Simple Networks [Diagram: a unit with threshold t = 0.0, inputs x and y, and weights W = 1.5 and W = 1]

G5BAIM Neural Networks Perceptron Synonym for single-layer, feed-forward network. First studied in the 1950s. Other networks were known about, but the perceptron was the only one capable of learning, and thus all research was concentrated in this area

G5BAIM Neural Networks Perceptron A single weight only affects one output, so we can restrict our investigations to a model as shown on the right. Notation can be simpler, i.e. O = Step(Σ_j W_j I_j)

G5BAIM Neural Networks What can perceptrons represent?

G5BAIM Neural Networks What can perceptrons represent? [Plots: the four input points (0,0), (0,1), (1,0), (1,1) for AND and for XOR; a single straight line separates the 1-outputs from the 0-outputs for AND, but no such line exists for XOR] Functions which can be separated in this way are called linearly separable. Only linearly separable functions can be represented by a perceptron

G5BAIM Neural Networks What can perceptrons represent? Linear Separability is also possible in more than 3 dimensions – but it is harder to visualise

G5BAIM Neural Networks Training a perceptron Aim

G5BAIM Neural Networks Training a perceptron [Diagram: a unit with threshold t = 0.0, inputs x and y, and initial weights W = 0.3, W = -0.4, W = 0.5]

G5BAIM Neural Networks Learning
While epoch produces an error
    Present network with next inputs from epoch
    Err = T – O
    If Err <> 0 then
        W_j = W_j + LR * I_j * Err
    End If
End While

G5BAIM Neural Networks Learning Epoch: Presentation of the entire training set to the neural network. In the case of the AND function, an epoch consists of four sets of inputs being presented to the network (i.e. [0,0], [0,1], [1,0], [1,1])

G5BAIM Neural Networks Learning Training Value, T: When we are training a network we not only present it with the input but also with the value that we require the network to produce. For example, if we present the network with [1,1] for the AND function, the training value will be 1

G5BAIM Neural Networks Learning Error, Err: The error value is the amount by which the value output by the network differs from the training value. For example, if we required the network to output 0 and it output a 1, then Err = -1

G5BAIM Neural Networks Learning Output from Neuron, O: The output value from the neuron. I_j: Inputs being presented to the neuron. W_j: Weight from input neuron (I_j) to the output neuron. LR: The learning rate. This dictates how quickly the network converges. It is set by experimentation. It is typically 0.1
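The learning rule, made runnable (a sketch: names follow the slides; treating the threshold as a weight W0 on an extra input fixed at -1 is an assumption, chosen so that the learned W0 matches the axis intercepts quoted on the next slide):

    def train_perceptron(training_set, lr=0.1, max_epochs=1000):
        # training_set: list of (inputs, T) pairs; a bias input of -1 is
        # prepended to each input vector, so w[0] plays the role of W0
        w = [0.0] * (len(training_set[0][0]) + 1)
        for epoch in range(max_epochs):
            error_free = True
            for inputs, target in training_set:
                i = [-1] + list(inputs)
                o = 1 if sum(wj * ij for wj, ij in zip(w, i)) >= 0 else 0
                err = target - o                  # Err = T - O
                if err != 0:                      # If Err <> 0 then
                    error_free = False
                    w = [wj + lr * ij * err for wj, ij in zip(w, i)]
            if error_free:                        # the epoch produced no error
                return w, epoch
        return w, max_epochs                      # never error-free (e.g. for XOR)

    AND_SET = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    print(train_perceptron(AND_SET))   # converges after a few epochs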

G5BAIM Neural Networks Learning [Plots: the decision line over the input points (0,0), (0,1), (1,0), (1,1) after the first epoch and at convergence. Note: the line meets the I1 axis at W0/W1 and the I2 axis at W0/W2]

G5BAIM Artificial Intelligence Methods Graham Kendall End of Neural Networks