Neural Networks Lecture 12: Backpropagation Examples (October 14, 2010)

Presentation transcript:

Slide 1 - Example I: Predicting the Weather

We decide (or experimentally determine) to use a hidden layer with 42 sigmoidal neurons. In summary, our network has:
- 207 input neurons
- 42 hidden neurons
- 4 output neurons
Because of the small output vectors, 42 hidden units may suffice for this application.
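As a rough illustration of this architecture (not code from the lecture), here is a minimal NumPy sketch of a forward pass through a 207-42-4 network with sigmoidal units; the random weights and the input pattern are placeholders.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation used for the sigmoidal neurons
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Weight matrices for the 207-42-4 architecture (bias weights omitted for brevity)
W_hidden = rng.normal(scale=0.1, size=(42, 207))   # hidden layer: 42 x 207
W_output = rng.normal(scale=0.1, size=(4, 42))     # output layer: 4 x 42

def forward(x):
    """Forward pass: 207 inputs -> 42 sigmoidal hidden units -> 4 outputs."""
    h = sigmoid(W_hidden @ x)      # hidden activations, shape (42,)
    y = sigmoid(W_output @ h)      # output activations, shape (4,)
    return y

x = rng.random(207)                # hypothetical encoded weather pattern
print(forward(x))                  # four output values in (0, 1)
```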

Slide 2 - Example I: Predicting the Weather

The next thing we need to do is to collect the training exemplars. First we have to specify what our network is supposed to do: in production mode, the network is fed with the current weather conditions, and its output will be interpreted as the weather forecast for tomorrow. Therefore, in training mode, we have to present the network with exemplars that pair the known weather conditions at a time (t - 24 hrs), used as the input, with the conditions at time t, used as the desired output. So we have to collect a set of historical exemplars with known correct output for every input.
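A small sketch of how such input/target pairs could be assembled, assuming the historical data are already encoded as one 207-dimensional pattern per hour (hourly sampling is discussed two slides later); none of this code is from the original slides.

```python
import numpy as np

def make_exemplars(hourly_patterns, horizon_hours=24):
    """Pair the pattern at time t - 24 h (input) with the pattern at time t (target).

    hourly_patterns: array of shape (T, 207), one encoded weather pattern per hour.
    Returns (inputs, targets), each of shape (T - horizon_hours, 207).
    """
    inputs = hourly_patterns[:-horizon_hours]
    targets = hourly_patterns[horizon_hours:]
    return inputs, targets

# Hypothetical year of hourly data: 8760 patterns of 207 values each
data = np.random.rand(8760, 207)
X, Y = make_exemplars(data)
print(X.shape, Y.shape)   # (8736, 207) (8736, 207)
```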

Slide 3 - Example I: Predicting the Weather

Obviously, if such data are unavailable, we have to start collecting them. The selection of exemplars that we need depends, among other factors, on the amount of variation in the weather at our location. For example, in Honolulu, Hawaii, our exemplars may not have to cover all seasons, because there is little variation in the weather. In Boston, however, we would need to include data from every calendar month because of the dramatic changes in weather across seasons. As we know, some winters in Boston are much harder than others, so it might be a good idea to collect data for several years.

Slide 4 - Example I: Predicting the Weather

And what about the granularity of our exemplar data, i.e., the frequency of measurement? Using one sample per day would be a natural choice, but it would neglect rapid changes in the weather. If we use hourly instantaneous samples, however, we increase the likelihood of conflicts among the exemplars. Therefore, we decide to do the following: we will collect input data every hour, but the corresponding output pattern will be the average of the instantaneous patterns over a 12-hour period. This way we reduce the possibility of conflicts while increasing the amount of training data.
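The slide does not say exactly where the 12-hour averaging window sits relative to the forecast time; the sketch below assumes it is centered on t + 24 h, and the helper name and defaults are hypothetical.

```python
import numpy as np

def make_smoothed_exemplars(hourly_patterns, horizon=24, window=12):
    """Input: pattern at hour t.  Target: average of the instantaneous patterns
    in a 12-hour window around t + 24 h (window placement is an assumption).

    hourly_patterns: NumPy array of shape (T, 207).
    """
    half = window // 2
    T = len(hourly_patterns)
    inputs, targets = [], []
    for t in range(T - horizon - half):
        lo = max(0, t + horizon - half)
        hi = t + horizon + half
        inputs.append(hourly_patterns[t])
        targets.append(hourly_patterns[lo:hi].mean(axis=0))
    return np.array(inputs), np.array(targets)

X, Y = make_smoothed_exemplars(np.random.rand(8760, 207))
print(X.shape, Y.shape)
```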

Slide 5 - Example I: Predicting the Weather

Now we have to train our network. If we use samples in one-hour intervals for one year, we have 8,760 exemplars. Our network has 207 · 42 + 42 · 4 = 8,862 weights, which means that, as a rule of thumb, data from about ten years, i.e., 87,600 exemplars, would be desirable.
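The weight count and the ten-year figure can be checked with a couple of lines (a sketch, not part of the slides):

```python
inputs, hidden, outputs = 207, 42, 4

weights = inputs * hidden + hidden * outputs     # 8694 + 168 connection weights
exemplars_per_year = 365 * 24                    # one sample per hour

print(weights)                    # 8862
print(exemplars_per_year)         # 8760
print(10 * exemplars_per_year)    # 87600 exemplars from ten years of hourly data
```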

Slide 6 - Example I: Predicting the Weather

Since the hold-one-out training method is very time-consuming with such a large number of samples, we decide to use partial-set training instead. The best way to do this would be to acquire a test set (control set), that is, another set of input-output pairs measured on random days and at random times. After training the network with the 87,600 exemplars, we could then use the test set to evaluate the performance of our network.
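The slide asks for a separately measured control set; as a stand-in when no extra measurements are available, the sketch below simply holds out a random fraction of the existing exemplars (this substitution and the function name are assumptions, not the lecture's procedure).

```python
import numpy as np

def split_train_test(X, Y, test_fraction=0.1, seed=0):
    """Hold out a random fraction of the exemplars as a control (test) set."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_test = int(len(X) * test_fraction)
    test, train = idx[:n_test], idx[n_test:]
    return X[train], Y[train], X[test], Y[test]

X = np.random.rand(8760, 207)
Y = np.random.rand(8760, 4)
X_tr, Y_tr, X_te, Y_te = split_train_test(X, Y)
print(X_tr.shape, X_te.shape)   # (7884, 207) (876, 207)
```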

Slide 7 - Example I: Predicting the Weather

Neural network troubleshooting:
- Plot the global error as a function of the training epoch. The error should decrease after every epoch. If it oscillates, do the following tests.
- Try reducing the size of the training set. If the network then converges, a conflict may exist in the exemplars.
- If the network still does not converge, continue pruning the training set until it does converge. Then add exemplars back gradually, thereby detecting the ones that cause conflicts.
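For the first diagnostic, a minimal matplotlib sketch of the error-per-epoch plot; the error values shown are hypothetical and would normally be recorded during training.

```python
import matplotlib.pyplot as plt

# One global error value recorded after each training epoch (hypothetical numbers)
epoch_errors = [12.4, 9.1, 7.8, 7.9, 6.5, 6.6, 5.2]

plt.plot(range(1, len(epoch_errors) + 1), epoch_errors, marker="o")
plt.xlabel("training epoch")
plt.ylabel("global error")
plt.title("Error should decrease every epoch; oscillation indicates a problem")
plt.show()
```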

Slide 8 - Example I: Predicting the Weather

- If this still does not work, look for saturated neurons (extreme weights) in the hidden layer. If you find those, add more hidden-layer neurons, possibly an extra 20%.
- If there are no saturated units and the problems still exist, try lowering the learning parameter η and training longer.
- If the network converges but does not accurately learn the desired function, evaluate the coverage of the training set.
- If the coverage is adequate and the network still does not learn the function precisely, you could refine the pattern representation. For example, you could add a season indicator to the input, helping the network to discriminate between similar inputs that produce very different outputs.
Then you can start predicting the weather!
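As an illustration of the last point, a season indicator could be appended to each input pattern as a one-hot code. This is only a sketch: the four-season encoding, the month-to-season mapping, and the function name are assumptions, not part of the slides.

```python
import numpy as np

def season_one_hot(month):
    """Return a 4-element one-hot vector: winter, spring, summer, fall."""
    season = {12: 0, 1: 0, 2: 0,   3: 1, 4: 1, 5: 1,
               6: 2, 7: 2, 8: 2,   9: 3, 10: 3, 11: 3}[month]
    vec = np.zeros(4)
    vec[season] = 1.0
    return vec

# Append the indicator to a 207-dimensional weather pattern, giving 211 inputs
pattern = np.random.rand(207)
augmented = np.concatenate([pattern, season_one_hot(month=7)])
print(augmented.shape)   # (211,)
```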

Slide 9 - Example II: Face Recognition

Now let us assume that we want to build a network for a computer vision application. More specifically, our network is supposed to recognize faces and face poses. This is an example that has actually been implemented. All information, program code, data, etc., can be found at:

Slide 10 - Example II: Face Recognition

The goal is to classify camera images of the faces of various people in various poses. Images of 20 different people were collected, with up to 32 images per person. The following variables were introduced:
- expression (happy, sad, angry, neutral)
- direction of looking (left, right, straight ahead, up)
- sunglasses (yes or no)
In total, 624 grayscale images were collected, each with a resolution of 30 by 32 pixels and intensity values between 0 and 255.

Slide 11 - Example II: Face Recognition

The network presented here only has the task of determining the face pose (left, right, up, straight) shown in an input image. It uses:
- 960 input units (one for each pixel in the image)
- 3 hidden units
- 4 output neurons (one for each pose)
Each output unit receives an additional input, which is always 1. By varying the weight for this input, the backpropagation algorithm can adjust an offset for the net input signal.
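A minimal NumPy sketch of this 960-3-4 network, with the constant input of 1 to each output unit implemented as an extra weight column; this is not the original course code, and scaling the pixel intensities into [0, 1] is an assumption.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W_hidden = rng.normal(scale=0.1, size=(3, 960))  # 3 hidden units, 960 pixel inputs
W_output = rng.normal(scale=0.1, size=(4, 4))    # 4 outputs x (3 hidden + 1 constant input)

def classify_pose(image):
    """image: flattened 30x32 grayscale image, intensities scaled to [0, 1] (960 values)."""
    h = sigmoid(W_hidden @ image)                # hidden activations, shape (3,)
    h_ext = np.append(h, 1.0)                    # constant input 1 lets backprop learn an offset
    y = sigmoid(W_output @ h_ext)                # outputs, shape (4,): left, right, up, straight
    return np.argmax(y)                          # index of the predicted pose

image = rng.random(960)                          # placeholder for a real scaled image
print(classify_pose(image))
```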

Slide 12 - Example II: Face Recognition

The following diagram visualizes all network weights after 1 epoch and after 100 epochs. Their values are indicated by brightness (ranging from black = -1 to white = 1). Each 30-by-32 matrix represents the weights of one of the three hidden-layer units. Each row of four squares represents the weights of one output neuron (three weights for the signals from the hidden units, and one for the constant signal 1). After training, the network is able to classify 90% of new (non-trained) face images correctly.
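A sketch of how the hidden-unit weight images could be rendered with matplotlib; the weights here are random stand-ins so the snippet runs on its own, whereas the slides show trained weights.

```python
import numpy as np
import matplotlib.pyplot as plt

# Stand-in for the trained hidden-layer weights, shape (3, 960)
W_hidden = np.random.default_rng(0).normal(scale=0.3, size=(3, 960))

for i, w in enumerate(W_hidden):
    plt.subplot(1, 3, i + 1)
    # Reshape each weight vector to the 30x32 pixel layout;
    # brightness encodes the weight value, black = -1, white = +1
    plt.imshow(w.reshape(30, 32), cmap="gray", vmin=-1, vmax=1)
    plt.title(f"hidden unit {i + 1}")
    plt.axis("off")
plt.show()
```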

Slide 13 - Example II: Face Recognition

[Figure: the network weights visualized as brightness images after 1 and 100 training epochs.]

Slide 14 - Example III: Character Recognition