An Introduction To The Backpropagation Algorithm


An Introduction To The Backpropagation Algorithm: Who gets the credit?

Basic Neuron Model In A Feedforward Network
- Inputs xi arrive through pre-synaptic connections
- Synaptic efficacy is modeled using real weights wi
- The response of the neuron is a nonlinear function f of its weighted inputs
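A minimal sketch of this neuron model in Python; the slide only requires some nonlinear f, so the sigmoid used here is an assumption:

```python
import math

def neuron_response(x, w):
    """Response of one neuron: a nonlinear function f of its weighted inputs."""
    s = sum(wi * xi for wi, xi in zip(w, x))  # weighted sum of inputs
    return 1.0 / (1.0 + math.exp(-s))         # f: sigmoid (one common choice)

print(neuron_response([0.5, 0.8], [0.1, -0.4]))  # -> ~0.43
```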

Network Topology
[Diagrams: a feedforward network, in which inputs flow forward to outputs, and a feedback network, in which outputs loop back toward the inputs]

Differences In Networks

Feedforward Networks:
- Solutions are known
- Weights are learned
- Evolves in the weight space
- Used for: prediction, classification, function approximation

Feedback Networks:
- Solutions are unknown
- Weights are prescribed
- Evolves in the state space
- Used for: constraint satisfaction, optimization, feature matching

Inputs To Neurons
- Arise from other neurons or from outside the network
- Nodes whose inputs arise outside the network are called input nodes and simply copy values
- An input may excite or inhibit the response of the neuron to which it is applied, depending upon the weight of the connection

Weights
- Represent synaptic efficacy and may be excitatory or inhibitory
- Normally, positive weights are considered excitatory while negative weights are thought of as inhibitory
- Learning is the process of modifying the weights in order to produce a network that performs some function

Output
- The response function is normally nonlinear
- Samples include:
  - Sigmoid: f(x) = 1 / (1 + e^-x)
  - Piecewise linear
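A sketch of the two sample response functions; the exact breakpoints of the piecewise-linear variant are an assumption, since the slide gives only its name:

```python
import math

def sigmoid(x):
    """Smooth, S-shaped response squashed into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def piecewise_linear(x, lo=0.0, hi=1.0):
    """Linear in the middle, saturated at lo and hi (assumed bounds)."""
    return max(lo, min(hi, x))
```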

Backpropagation Preparation
- Training Set: a collection of input-output patterns that are used to train the network
- Testing Set: a collection of input-output patterns that are used to assess network performance
- Learning Rate (η): a scalar parameter, analogous to step size in numerical integration, used to set the rate of adjustments

Network Error
- Total-Sum-Squared-Error (TSSE): TSSE = (1/2) Σp Σj (Tpj - Opj)²
- Root-Mean-Squared-Error (RMSE): RMSE = sqrt(2 · TSSE / (# patterns × # output neurons))
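A direct transcription of these two error measures, assuming the conventional 1/2 factor in TSSE:

```python
import math

def tsse(targets, outputs):
    """Total sum-squared error over all patterns p and output neurons j."""
    return 0.5 * sum((t - o) ** 2
                     for tp, op in zip(targets, outputs)
                     for t, o in zip(tp, op))

def rmse(targets, outputs):
    """Root-mean-squared error derived from TSSE."""
    n_patterns, n_outputs = len(targets), len(targets[0])
    return math.sqrt(2.0 * tsse(targets, outputs) / (n_patterns * n_outputs))
```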

A Pseudo-Code Algorithm
1. Randomly choose the initial weights
2. While error is too large:
   - For each training pattern (presented in random order):
     - Apply the inputs to the network
     - Calculate the output for every neuron from the input layer, through the hidden layer(s), to the output layer
     - Calculate the error at the outputs
     - Use the output error to compute error signals for pre-output layers
     - Use the error signals to compute weight adjustments
     - Apply the weight adjustments
   - Periodically evaluate the network performance
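A minimal runnable rendering of this pseudo-code for a single hidden layer, applied to the exclusive-OR task introduced later in the deck. The layer sizes, initial weight range, sigmoid activation, and fixed epoch count (in place of an error threshold) are assumptions of this sketch:

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def make_layer(n_in, n_out):
    """Rows of n_in weights plus one trailing bias weight, randomly initialized."""
    return [[random.uniform(-0.5, 0.5) for _ in range(n_in + 1)] for _ in range(n_out)]

def forward(layer, inputs):
    """Output of every neuron in a layer (the last weight in each row is the bias)."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + row[-1]) for row in layer]

def train(patterns, n_hidden=2, eta=0.5, epochs=20000):
    n_in, n_out = len(patterns[0][0]), len(patterns[0][1])
    hidden, output = make_layer(n_in, n_hidden), make_layer(n_hidden, n_out)
    for _ in range(epochs):                      # "while error is too large"
        random.shuffle(patterns)                 # present patterns in random order
        for x, t in patterns:
            h = forward(hidden, x)               # input layer -> hidden layer
            o = forward(output, h)               # hidden layer -> output layer
            # error signals at the outputs
            d_out = [(tj - oj) * oj * (1 - oj) for tj, oj in zip(t, o)]
            # backpropagated error signals for the pre-output (hidden) layer
            d_hid = [hj * (1 - hj) * sum(d_out[k] * output[k][j] for k in range(n_out))
                     for j, hj in enumerate(h)]
            # compute and apply weight adjustments
            for j in range(n_out):
                for i in range(n_hidden):
                    output[j][i] += eta * d_out[j] * h[i]
                output[j][-1] += eta * d_out[j]  # bias sees a constant input of 1
            for j in range(n_hidden):
                for i in range(n_in):
                    hidden[j][i] += eta * d_hid[j] * x[i]
                hidden[j][-1] += eta * d_hid[j]
    return hidden, output

xor = [((0.1, 0.1), (0.1,)), ((0.1, 0.9), (0.9,)),
       ((0.9, 0.1), (0.9,)), ((0.9, 0.9), (0.1,))]
hidden, output = train(xor)
for x, t in sorted(xor):
    print(x, t, forward(output, forward(hidden, x)))
```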

Possible Data Structures
- Two-dimensional arrays:
  - Weights (at least for input-to-hidden layer and hidden-to-output layer connections)
  - Weight changes (ΔWij)
- One-dimensional arrays:
  - Neuron layers
  - Cumulative current input
  - Current output
  - Error signal for each neuron
  - Bias weights
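One way these structures might look as Python lists; the layer sizes here are placeholders matching the XOR example:

```python
n_in, n_hid, n_out = 2, 2, 1

# two-dimensional arrays: weights and weight changes
w_hid = [[0.0] * n_in for _ in range(n_hid)]    # input-to-hidden weights
w_out = [[0.0] * n_hid for _ in range(n_out)]   # hidden-to-output weights
dw_hid = [[0.0] * n_in for _ in range(n_hid)]   # weight changes (kept for momentum)
dw_out = [[0.0] * n_hid for _ in range(n_out)]

# one-dimensional arrays, one entry per neuron
net = [0.0] * (n_hid + n_out)    # cumulative current input
out = [0.0] * (n_hid + n_out)    # current output
delta = [0.0] * (n_hid + n_out)  # error signal
bias = [0.0] * (n_hid + n_out)   # bias weights
```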

Apply Inputs From A Pattern
- Apply the value of each input parameter to each input node
- Input nodes compute only the identity function
[Diagram: feedforward network, inputs on the left, outputs on the right]

Calculate Outputs For Each Neuron Based On The Pattern
- The output from neuron j for pattern p is Opj = f(netpj), where netpj = Σk Wjk Opk, k ranges over the input indices, and Wjk is the weight on the connection from input k to neuron j
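This computation for a single neuron, assuming the sigmoid response function implied by the Opj (1 - Opj) factors in the error-signal formulas on the following slides:

```python
import math

def output_for_neuron(weights_j, inputs):
    """Opj = f(netpj) with netpj = sum_k Wjk * Opk and f the sigmoid."""
    net_pj = sum(w_jk * o_pk for w_jk, o_pk in zip(weights_j, inputs))
    return 1.0 / (1.0 + math.exp(-net_pj))
```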

Calculate The Error Signal For Each Output Neuron
- The output neuron error signal δpj is given by δpj = (Tpj - Opj) Opj (1 - Opj)
- Tpj is the target value of output neuron j for pattern p
- Opj is the actual output value of output neuron j for pattern p
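The same formula as a one-liner:

```python
def output_delta(t_pj, o_pj):
    """delta_pj = (Tpj - Opj) * Opj * (1 - Opj); Opj*(1-Opj) is the sigmoid derivative."""
    return (t_pj - o_pj) * o_pj * (1.0 - o_pj)
```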

Calculate The Error Signal For Each Hidden Neuron
- The hidden neuron error signal δpj is given by δpj = Opj (1 - Opj) Σk δpk Wkj
- δpk is the error signal of a post-synaptic neuron k and Wkj is the weight of the connection from hidden neuron j to the post-synaptic neuron k
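And its direct transcription; the argument names are illustrative:

```python
def hidden_delta(o_pj, deltas_next, w_col_j):
    """delta_pj = Opj * (1 - Opj) * sum_k delta_pk * Wkj.

    deltas_next: error signals delta_pk of the post-synaptic neurons k
    w_col_j: weights Wkj from this hidden neuron j to each neuron k
    """
    return o_pj * (1.0 - o_pj) * sum(d_pk * w_kj
                                     for d_pk, w_kj in zip(deltas_next, w_col_j))
```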

Calculate And Apply Weight Adjustments
- Compute weight adjustments ΔWji at time t by ΔWji(t) = η δpj Opi
- Apply weight adjustments according to Wji(t+1) = Wji(t) + ΔWji(t)
- Some add a momentum term α ΔWji(t-1)
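The update rule for one connection, with the optional momentum term (α = 0 recovers the plain rule):

```python
def weight_update(w_ji, dw_prev, eta, delta_pj, o_pi, alpha=0.0):
    """Return (new Wji, current DWji) for the connection from neuron i to neuron j."""
    dw = eta * delta_pj * o_pi + alpha * dw_prev  # DWji(t) = eta*dpj*Opi + alpha*DWji(t-1)
    return w_ji + dw, dw
```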

An Example: Exclusive “OR”
- Training set:
  - ((0.1, 0.1), 0.1)
  - ((0.1, 0.9), 0.9)
  - ((0.9, 0.1), 0.9)
  - ((0.9, 0.9), 0.1)
- Testing set:
  - Use at least 121 pairs equally spaced on the unit square and plot the results
  - Omit the training set (if desired)
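The 121-pair testing set reads as an 11 × 11 grid on the unit square; a sketch of generating it:

```python
# 11 x 11 = 121 points equally spaced on the unit square
grid = [(i / 10.0, j / 10.0) for i in range(11) for j in range(11)]

training_inputs = {(0.1, 0.1), (0.1, 0.9), (0.9, 0.1), (0.9, 0.9)}
testing_set = [p for p in grid if p not in training_inputs]  # omit training set (optional)
```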

An Example (continued): Network Architecture
[Diagram of the network: input nodes on the left, output(s) on the right]

An Example (continued): Network Architecture
[Diagram: the network with the sample input (0.1, 0.9) applied, bias inputs fixed at 1, and target output 0.9]

Feedforward Network Training by Backpropagation: Process Summary
1. Select an architecture
2. Randomly initialize weights
3. While error is too large:
   - Select a training pattern and feedforward to find the actual network output
   - Calculate errors and backpropagate error signals
   - Adjust weights
4. Evaluate performance using the test set

An Example (continued): Network Architecture
[Diagram: the same network annotated with unknown weights (??), the sample input, the target output 0.9, and the actual output (???) to be computed]