CSSE463: Image Recognition Day 18

CSSE463: Image Recognition Day 18

Upcoming schedule:
- Exam covers material through neural nets

Perceptron training

Each misclassified sample is used to change the weights "a little bit" so that the classification is better the next time.

Consider inputs in the form x = [x1, x2, ..., xn], with target label y in {+1, -1}.

Algorithm (Hebbian learning):
- Randomize the weights.
- Loop until convergence:
  - If wx + b > 0 and y is -1: wi -= e*xi for all i, and b -= e
  - Else if wx + b < 0 and y is +1: wi += e*xi for all i, and b += e
  - Else it's classified correctly; do nothing.
(Both update cases are equivalent to w += e*y*x, b += e*y.)

e is the learning rate (a parameter that can be tuned).

Demo: perceptronLearning.m is broken.
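A minimal MATLAB sketch of this update loop (illustrative only, not the course's perceptronLearning.m; X, y, e, and maxEpochs are hypothetical inputs):

```matlab
% Minimal perceptron training sketch. X is n-by-d (one sample per row),
% y is n-by-1 with labels in {+1, -1}, e is the learning rate.
function [w, b] = perceptronTrain(X, y, e, maxEpochs)
    [n, d] = size(X);
    w = rand(d, 1) - 0.5;                 % randomize weights
    b = rand() - 0.5;
    for epoch = 1:maxEpochs               % "loop until converge"
        anyWrong = false;
        for i = 1:n
            a = X(i, :) * w + b;          % activation wx + b
            if y(i) * a <= 0              % misclassified (either case)
                w = w + e * y(i) * X(i, :)';  % w += e*y*x covers both branches
                b = b + e * y(i);             % b += e*y
                anyWrong = true;
            end
        end
        if ~anyWrong, break; end          % converged: all samples correct
    end
end
```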

Multilayer feedforward neural nets

- Many perceptrons, organized into layers:
  - Input (sensory) layer
  - Hidden layer(s): 2 are proven sufficient to model any arbitrary function
  - Output (classification) layer
- Powerful! Calculates functions of the input and maps them to the output layer.

[Diagram: inputs x1-x3 (sensory layer, e.g., HSV) -> hidden layer (functions) -> outputs y1-y3 (classification: apple/orange/banana).]

Q4
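As a sketch, here is one forward pass through such a 3-5-3 net with sigmoid units; the weights below are random placeholders standing in for a trained network:

```matlab
% One feedforward pass through a 3-5-3 net with sigmoid activations.
sigmoid = @(a) 1 ./ (1 + exp(-a));
x  = [0.35; 0.80; 0.60];               % e.g., HSV features of one sample
W1 = randn(5, 3); b1 = randn(5, 1);    % input -> hidden (placeholder weights)
W2 = randn(3, 5); b2 = randn(3, 1);    % hidden -> output
h = sigmoid(W1 * x + b1);              % hidden activations ("functions")
y = sigmoid(W2 * h + b2)               % e.g., apple/orange/banana scores
```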

XOR example
- 2 inputs
- 1 hidden layer of 5 neurons
- 1 output
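A sketch of this exact architecture using MATLAB's toolbox (feedforwardnet and train require the Neural Network / Deep Learning Toolbox; this may differ from the course demo):

```matlab
% Train a 2-5-1 net on XOR.
X = [0 0 1 1;
     0 1 0 1];               % 2 inputs, one sample per column
T = [0 1 1 0];               % XOR targets
net = feedforwardnet(5);     % 1 hidden layer of 5 neurons
net = train(net, X, T);      % backpropagation-based training
Y = net(X)                   % should be close to [0 1 1 0]
```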

Backpropagation algorithm

- Initialize all weights randomly.
- For each labeled example, repeat:
  a. Calculate the output using the current network (feedforward).
  b. Update the weights across the network, from output back to input (feedback), using Hebbian learning.
- Iterate until convergence; epsilon decreases at every iteration.

MATLAB does this for you: see matlabNeuralNetDemo.m.
Architecture of the demo: 1 input, 1 hidden layer of 5 neurons, 1 output; each hidden and output neuron has a bias (6 total), and each link has a weight (10 total).

Q5
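A hand-rolled sketch of backprop for the demo's 1-5-1 architecture (10 link weights + 6 biases); this is illustrative, not matlabNeuralNetDemo.m itself, and the toy data is made up:

```matlab
% Backprop on a 1-5-1 sigmoid net, squared-error loss, toy 1-D data.
sigmoid = @(a) 1 ./ (1 + exp(-a));
x = linspace(0, 1, 21); t = 0.5 + 0.4*sin(2*pi*x);  % toy targets in (0,1)
W1 = randn(5, 1); b1 = randn(5, 1);   % initialize all weights randomly
W2 = randn(1, 5); b2 = randn();
e = 0.5;                              % learning rate (epsilon)
for epoch = 1:5000
    for i = 1:numel(x)
        % a. Calculate output (feedforward)
        h = sigmoid(W1 * x(i) + b1);
        y = sigmoid(W2 * h + b2);
        % b. Update weights (feedback), output layer back to input layer
        dOut = (y - t(i)) * y * (1 - y);         % output delta
        dHid = (W2' * dOut) .* h .* (1 - h);     % hidden deltas
        W2 = W2 - e * dOut * h';   b2 = b2 - e * dOut;
        W1 = W1 - e * dHid * x(i); b1 = b1 - e * dHid;
    end
    e = e * 0.999;                    % epsilon decreases at every iteration
end
```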

Parameters

- Most networks are reasonably robust with respect to the learning rate and how the weights are initialized.
- However, figuring out how to normalize your input and how to determine the architecture of your net is a black art. You might need to experiment.
- One hint: re-run the network with different initial weights and different architectures, and test performance each time on a validation set. Pick the best (see the sketch below).
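A sketch of that hint (Xtr/Ttr and Xval/Tval are assumed training/validation splits of your own data, with 1-by-N targets; feedforwardnet and train require the toolbox):

```matlab
% Try several architectures and random initializations; keep whichever
% net does best on a held-out validation set.
bestErr = Inf;
for nHidden = [3 5 10 20]                 % candidate architectures
    for trial = 1:5                       % different random initial weights
        net = feedforwardnet(nHidden);
        net = train(net, Xtr, Ttr);
        err = mean((net(Xval) - Tval).^2);  % validation-set error
        if err < bestErr
            bestErr = err;
            bestNet = net;                % pick the best
        end
    end
end
```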

References

This is just the tip of the iceberg! See:
- Sonka, pp. 404-407.
- Laurene Fausett, Fundamentals of Neural Networks, Prentice Hall, 1994. Approachable for beginners.
- C. M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, 1995. A technical reference focused on the art of constructing networks (learning rate, number of hidden layers, etc.).
- MATLAB's neural net help is very good.

SVMs vs. Neural Nets

SVMs:
- Training can take a long time with large data sets; consider that you'll want to experiment with parameters...
- But the classification runtime and space are O(sd), where s is the number of support vectors and d is the dimensionality of the feature vectors.
- In the worst case, s = the size of the whole training set (like nearest neighbor), but that's no worse than implementing a neural net with s perceptrons in the hidden layer.
- Empirically shown to have good generalizability even with relatively small training sets and no domain knowledge.

Neural networks:
- You can tune the architecture. Lots of parameters!

Last Q's

How does svmfwd compute y1?

y1 is just the weighted sum of the contributions of the individual support vectors, where d = data dimension (e.g., 294) and s = kernel width; numSupVecs, svcoeff (alpha), and bias are learned during training.

Note: looking at which of your training examples end up as support vectors can be revealing! (Keep this in mind for the sunset detector and the term project.)

This is a much easier computation than training, and it was easy to implement on a device without MATLAB (e.g., a smartphone).
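The slide's equation did not survive transcription. From the quantities it names (svcoeff/alpha, bias, kernel width s), the sum is presumably a Gaussian (RBF) kernel expansion, roughly y1 = sum_k svcoeff_k * exp(-||x - sv_k||^2 / (2 s^2)) + bias. A sketch under that assumption (the exact kernel-width convention in the course's svmfwd may differ):

```matlab
% svmfwd-style decision value, assuming a Gaussian RBF kernel.
% x: 1-by-d test vector; supVecs: numSupVecs-by-d; svcoeff: numSupVecs-by-1.
function y1 = svmDecision(x, supVecs, svcoeff, bias, s)
    % squared distance ||x - sv_k||^2 for each support vector
    % (implicit expansion, MATLAB R2016b+)
    sqDist = sum((supVecs - x).^2, 2);
    y1 = svcoeff' * exp(-sqDist / (2 * s^2)) + bias;  % weighted kernel sum
end
```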

CSSE463 Roadmap