CSSE463: Image Recognition Day 18

CSSE463: Image Recognition Day 18
Exam covers material through neural nets

Perceptron training
Each misclassified sample is used to change the weights "a little bit" so that the classification is better the next time.
Consider inputs in the form x = [x1, x2, … xn]; the target label is y = {+1, -1}.
Algorithm (Hebbian learning):
  Randomize the weights.
  Loop until convergence:
    If wx + b > 0 and y is -1: wi -= e*xi for all i; b -= e
    Else if wx + b < 0 and y is +1: wi += e*xi for all i; b += e
    Else: it's classified correctly; do nothing.
e is the learning rate (a parameter that can be tuned).
Demo: perceptronLearning.m is broken.
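
Below is a minimal MATLAB sketch of this training loop on made-up, linearly separable toy data; the data, learning rate, and epoch limit are placeholders, and this is an illustration rather than the course's perceptronLearning.m demo.

    % Toy linearly separable data (placeholder): two 2-D clusters.
    X = [randn(20, 2) + 2; randn(20, 2) - 2];   % one sample per row
    y = [ones(20, 1); -ones(20, 1)];            % labels in {+1, -1}
    e = 0.1;                                    % learning rate (tunable)
    [n, d] = size(X);
    w = 0.01 * randn(d, 1);                     % randomize weights
    b = 0;
    for epoch = 1:1000                          % loop until converged (or give up)
        misclassified = false;
        for i = 1:n
            score = X(i, :) * w + b;
            if score > 0 && y(i) == -1          % false positive: nudge weights down
                w = w - e * X(i, :)';  b = b - e;  misclassified = true;
            elseif score < 0 && y(i) == +1      % false negative: nudge weights up
                w = w + e * X(i, :)';  b = b + e;  misclassified = true;
            end                                 % correct: do nothing
        end
        if ~misclassified, break; end           % no mistakes in a full pass: converged
    end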

Multilayer feedforward neural nets
Many perceptrons, organized into layers:
  Input (sensory) layer
  Hidden layer(s): 2 proven sufficient to model any arbitrary function
  Output (classification) layer
Powerful! Calculates functions of the input and maps them to the output layer.
Example: sensory inputs x1, x2, x3 (HSV) -> hidden layer (functions) -> classification outputs y1, y2, y3 (apple/orange/banana).
Q4
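
To make "calculates functions of the input" concrete, here is a hedged sketch of a single feedforward pass for the example above, assuming sigmoid units, random illustrative weights, and an arbitrary hidden-layer size of 4; a trained network would use learned weights instead.

    % One feedforward pass through a tiny 3-4-3 network (illustrative weights).
    sigmoid = @(z) 1 ./ (1 + exp(-z));
    x  = [0.1; 0.8; 0.7];                  % sensory layer: one HSV feature vector (placeholder)
    W1 = randn(4, 3);  b1 = randn(4, 1);   % input -> hidden weights and biases
    W2 = randn(3, 4);  b2 = randn(3, 1);   % hidden -> output (apple/orange/banana)
    hidden = sigmoid(W1 * x + b1);         % hidden layer computes functions of the input
    output = sigmoid(W2 * hidden + b2);    % one activation per class
    [~, predicted] = max(output);          % pick the strongest output node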

XOR example
  2 inputs
  1 hidden layer of 5 neurons
  1 output
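
A hedged sketch of this XOR network using MATLAB's feedforwardnet (this assumes the Neural Network / Deep Learning Toolbox is installed; results vary with the random initialization):

    % XOR: 2 inputs, 1 hidden layer of 5 neurons, 1 output.
    x = [0 0 1 1;
         0 1 0 1];                     % inputs: one column per sample
    t = [0 1 1 0];                     % targets: XOR of the two inputs
    net = feedforwardnet(5);           % one hidden layer with 5 neurons
    net.divideFcn = 'dividetrain';     % tiny data set: use all 4 samples for training
    net = train(net, x, t);            % the toolbox runs backpropagation for you
    y = net(x)                         % outputs should be close to [0 1 1 0]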

Backpropagation algorithm
Initialize all weights randomly.
For each labeled example:
  a. Calculate the output using the current network (feedforward).
  b. Update the weights across the network, from output to input, by propagating the error backward (feedback).
Repeat, iterating until convergence; epsilon decreases at every iteration.
Matlab does this for you: matlabNeuralNetDemo.m
Architecture of the demo: 1 input, 1 hidden layer of 5 neurons, 1 output (each hidden and output neuron has a bias, so 6 biases; each link has a weight, so 10 weights).
Q5
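
For intuition about what the demo does internally, here is a hedged sketch of one backpropagation step for the demo's 1-5-1 architecture, assuming sigmoid units and a squared-error loss; the actual demo and toolbox may use different activations, losses, and update schedules.

    % One backpropagation step for a 1-5-1 sigmoid network (illustration only).
    sigmoid = @(z) 1 ./ (1 + exp(-z));
    x = 0.3;  t = 0.8;  e = 0.5;            % one training pair and a learning rate (placeholders)
    W1 = randn(5, 1);  b1 = randn(5, 1);    % input -> hidden: 5 weights, 5 biases
    W2 = randn(1, 5);  b2 = randn;          % hidden -> output: 5 weights, 1 bias
    % a. Calculate the output (feedforward)
    a1 = sigmoid(W1 * x + b1);
    a2 = sigmoid(W2 * a1 + b2);
    % b. Update the weights (feedback): push the error back from output to input
    d2 = (a2 - t) * a2 * (1 - a2);          % output-layer error term
    d1 = (W2' * d2) .* a1 .* (1 - a1);      % hidden-layer error terms
    W2 = W2 - e * d2 * a1';   b2 = b2 - e * d2;
    W1 = W1 - e * d1 * x;     b1 = b1 - e * d1;
    % Repeat over all samples until convergence, shrinking e as you go.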

Parameters
Most networks are reasonably robust with respect to the learning rate and how the weights are initialized.
However, figuring out how to normalize your input and how to determine the architecture of your net is a black art. You might need to experiment.
One hint: re-run the network with different initial weights and different architectures, test the performance each time on a validation set, and pick the best.
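
A hedged sketch of that hint, assuming the toolbox's feedforwardnet/train and placeholder sets Xtr/ytr (training) and Xval/yval (validation) with features in columns and 0/1 targets; the candidate hidden-layer sizes are arbitrary.

    % Try several architectures and random initializations; keep the best on validation data.
    bestAcc = -Inf;
    for hiddenSize = [5 10 20]                    % candidate architectures
        for trial = 1:3                           % different random initial weights
            net = feedforwardnet(hiddenSize);
            net = train(net, Xtr, ytr);
            pred = net(Xval) > 0.5;               % threshold a single-output classifier
            acc = mean(pred == (yval > 0.5));     % accuracy on the validation set
            if acc > bestAcc
                bestAcc = acc;  bestNet = net;    % remember the best-performing network
            end
        end
    end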

References
This is just the tip of the iceberg! See:
  Sonka, pp. 404-407.
  Laurene Fausett. Fundamentals of Neural Networks. Prentice Hall, 1994. Approachable for beginners.
  C.M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, 1995. A technical reference focused on the art of constructing networks (learning rate, number of hidden layers, etc.).
  The Matlab neural net help is very good.

Multiclass scene classification
Neural nets: make one node per class in the output layer; take the max.
SVM: train 1 SVM per class, for example:
  1 Sunset vs. all others
  1 Beach vs. all others
  1 Field vs. all others
  …
Assign the class of the max positive output.
Note that this isn't multi-label scene classification (google it).
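
A hedged sketch of the "assign the class of the max output" step; classScore here is a hypothetical helper standing in for whatever routine returns the k-th one-vs-all SVM's output for a feature vector x.

    % One-vs-rest prediction: score x against every class's SVM, then take the max.
    scores = zeros(numClasses, 1);
    for k = 1:numClasses
        scores(k) = classScore(k, x);     % hypothetical: output of the k-th "class vs. all" SVM
    end
    [~, predictedClass] = max(scores);    % most confident positive output wins
    % For a neural net with one output node per class, the analogue is [~, c] = max(net(x)).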

How does the SVM compute its prediction score?
y1 is just the weighted sum of the contributions of the individual support vectors, plus a bias: d = data dimension (e.g., 294), s = kernel width; numSupVecs, svcoeff (alpha), and bias are learned during training.
Note: looking at which of your training examples are support vectors can be revealing! (Keep this in mind for the sunset detector and the term project.)
This is a much easier computation than training, and it was easy to implement on a device without MATLAB (e.g., a smartphone).
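
The equation on the original slide is not reproduced in this transcript; one common form consistent with the symbols above is y1 = sum over i of svcoeff(i) * exp(-||x - supVec_i||^2 / (2*s^2)) + bias. A hedged MATLAB sketch, assuming supVecs is d-by-numSupVecs and svcoeff is numSupVecs-by-1 (your kernel parameterization may differ):

    % RBF-kernel SVM decision value for one feature vector x (d-by-1).
    y1 = bias;
    for i = 1:numSupVecs
        dx = x - supVecs(:, i);                                 % offset from the i-th support vector
        y1 = y1 + svcoeff(i) * exp(-(dx' * dx) / (2 * s^2));    % weighted kernel contribution
    end
    % sign(y1) gives the predicted class; |y1| indicates confidence.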

More SVMs vs. Neural Nets
SVMs: empirically shown to generalize well even with relatively small training sets and no domain knowledge.
The classification runtime and space are O(____), where s is the number of support vectors and d is the dimensionality of the feature vectors.
In the worst case, s = the size of the whole training set (like nearest neighbor), so the runtime is no worse than a neural net with s perceptrons in the hidden layer. But if s is too high, generalization is ________.
Neural networks: you can tune the architecture. Lots of parameters! Training can take a long time with large data sets, and you'll want to experiment with the parameters. But performance can be excellent with larger networks and lots of training data.
Stay tuned for deep nets next time!
Generalization is poor.
Last Q's

CSSE463 Roadmap