CSSE463: Image Recognition Day 21


Upcoming schedule: Exam covers material through SVMs.

Multilayer feedforward neural nets
Many perceptrons, organized into layers:
- Input (sensory) layer
- Hidden layer(s): two have been proven sufficient to model any arbitrary function
- Output (classification) layer
Powerful! Calculates functions of the input and maps them to the output layer.
Example (from the slide's diagram): sensory inputs x1, x2, x3 (HSV) feed a hidden layer of functions, which feeds classification outputs y1, y2, y3 (apple/orange/banana). Q4
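A minimal numpy sketch of this feedforward computation, not the course's MATLAB demo; the 5-unit hidden layer, sigmoid activations, and random weights are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(x, W1, b1, W2, b2):
    """One pass through a two-layer net: sensory -> hidden -> classification."""
    h = sigmoid(W1 @ x + b1)   # hidden layer: functions of the input
    y = sigmoid(W2 @ h + b2)   # output layer: one unit per class
    return y

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 5, 3            # 3 HSV features in, 3 fruit classes out
W1 = rng.normal(scale=0.5, size=(n_hid, n_in)); b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.5, size=(n_out, n_hid)); b2 = np.zeros(n_out)

x = np.array([0.1, 0.8, 0.6])           # e.g., hue, saturation, value
print(feedforward(x, W1, b1, W2, b2))   # scores for apple/orange/banana
```

With random weights the outputs are meaningless; training (next slides) is what shapes them.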

XOR example: 2 inputs, 1 hidden layer of 5 neurons, 1 output. (The backpropagation sketch two slides below trains exactly this network.)

Backpropagation algorithm
Initialize all weights randomly.
For each labeled example:
- a. Calculate the output using the current network (feedforward).
- b. Update the weights across the network, from output to input, by gradient descent on the error (the generalized delta rule) (feedback).
Repeat until convergence; the learning rate (epsilon) decreases at every iteration.
Matlab does this for you: see matlabNeuralNetDemo.m. Q5
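Below is a from-scratch sketch of backpropagation training the 2-5-1 XOR network from two slides back. The squared-error loss, learning-rate schedule, iteration count, and random seed are assumptions for illustration; in practice, as the slide says, MATLAB's toolbox handles this for you:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

rng = np.random.default_rng(1)
W1 = rng.normal(size=(2, 5))   # input -> hidden (5 neurons)
W2 = rng.normal(size=(5, 1))   # hidden -> output
b1 = np.zeros(5); b2 = np.zeros(1)

eps = 0.5                       # learning rate ("epsilon")
for it in range(20000):
    # a. Calculate output (feedforward)
    H = sigmoid(X @ W1 + b1)    # hidden activations, 4x5
    Y = sigmoid(H @ W2 + b2)    # outputs, 4x1
    # b. Update weights (feedback), output layer first:
    d_out = (T - Y) * Y * (1 - Y)          # output deltas (delta rule)
    d_hid = H * (1 - H) * (d_out @ W2.T)   # deltas backpropagated to hidden
    W2 += eps * H.T @ d_out; b2 += eps * d_out.sum(axis=0)
    W1 += eps * X.T @ d_hid; b1 += eps * d_hid.sum(axis=0)
    eps *= 0.9999               # epsilon decreases at every iteration

# Should approach [0, 1, 1, 0]; results depend on the random initialization.
print(Y.round(3))
```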

Parameters
Most networks are reasonably robust with respect to the learning rate and how the weights are initialized. However, figuring out how to
- normalize your input, and
- determine the architecture of your net
is a black art; you might need to experiment. One hint: re-run the network with different initial weights and different architectures, testing performance each time on a validation set, and pick the best (see the sketch below).
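A sketch of that hint, using scikit-learn as a stand-in for the course's MATLAB toolbox; the candidate architectures, seeds, and synthetic data are placeholders for your own features and labels:

```python
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification

# Stand-in data; replace with your own feature vectors and labels.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

best_net, best_acc = None, -1.0
for hidden in [(5,), (10,), (10, 5)]:        # candidate architectures
    for seed in range(3):                    # different initial weights
        net = MLPClassifier(hidden_layer_sizes=hidden, max_iter=2000,
                            random_state=seed).fit(X_tr, y_tr)
        acc = net.score(X_val, y_val)        # performance on the validation set
        if acc > best_acc:
            best_net, best_acc = net, acc    # keep the best so far

print(best_acc, best_net.hidden_layer_sizes)
```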

References
This is just the tip of the iceberg! See:
- Sonka, pp.
- Laurene Fausett. Fundamentals of Neural Networks. Prentice Hall, 1994. Approachable for beginners.
- C.M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, 1995. Technical reference focused on the art of constructing networks (learning rate, number of hidden layers, etc.).
- Matlab's neural net help is very good.

SVMs vs. Neural Nets
SVM:
- Training can take a long time with large data sets; consider that you'll want to experiment with parameters.
- But the classification runtime and space are O(sd), where s is the number of support vectors and d is the dimensionality of the feature vectors.
- In the worst case, s = the size of the whole training set (like nearest neighbor), but no worse than implementing a neural net with s perceptrons in the hidden layer.
- Empirically shown to generalize well even with relatively small training sets and no domain knowledge.
Neural networks:
- You can tune the architecture. Lots of parameters!
Q3

How does svmfwd compute y1? y1 is just the weighted sum of the contributions of the individual support vectors, plus a bias:

y1 = sum_{i=1}^{numSupVecs} svcoeff_i * exp(-||x - sv_i||^2 / (2*sigma^2)) + bias

where d = the data dimension (e.g., 294) and sigma = the kernel width. numSupVecs, svcoeff (the alphas), and bias are learned during training. Note: looking at which of your training examples are support vectors can be revealing! (Keep this in mind for the sunset detector and the term project.) Classification is a much easier computation than training; you could implement it on a device without MATLAB (e.g., a smartphone) easily. Check out the sample Java code.
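A numpy sketch of that computation; the variable names mirror the slide's (svcoeff, bias, sigma), but this is not the actual svmfwd source, and the toy support vectors and coefficients below are made up:

```python
import numpy as np

def svm_forward(x, sup_vecs, svcoeff, bias, sigma):
    """Decision value: weighted sum of Gaussian-kernel contributions
    from each support vector, plus the bias (all learned in training)."""
    sq_dists = np.sum((sup_vecs - x) ** 2, axis=1)    # ||x - sv_i||^2
    k = np.exp(-sq_dists / (2.0 * sigma ** 2))        # kernel value per SV
    return svcoeff @ k + bias

# Toy example: 3 support vectors in a d = 294-dimensional feature space.
rng = np.random.default_rng(0)
sup_vecs = rng.normal(size=(3, 294))
svcoeff = np.array([0.7, -1.2, 0.5])   # signed weights from training
x = rng.normal(size=294)
print(svm_forward(x, sup_vecs, svcoeff, bias=0.1, sigma=3.0))
```

Note the loop-free cost: one kernel evaluation per support vector, matching the O(sd) classification runtime from the previous slide.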

CSSE463 Roadmap