CSSE463: Image Recognition Day 13

Slides from: Doug Gray, David Poole

CSSE463: Image Recognition Day 13

Upcoming schedule:
- Lab 4 due Weds, 3:25. Shape1: elongation = 1.632636, C1 = 19.2531, C2 = 5.0393
- Term project ideas due next Monday.
- Exam next Thursday.
- Sunset detector due in ~2.5 weeks.
This week:
- Today: Neural networks
- Tuesday: Support Vector Machine (SVM) introduction and derivation
- Thursday: Project details, SVM demo
- Friday: SVM lab

Mid-term feedback

- Pace good for most
- Enjoyable!
- Please give an overview of labs at the start of lab
- Please add more details in slides
- Remove fruit bowl
Past feedback:
- Assignments lag lectures by ½ - 1 week
- Assignments directly reinforce lecture
- Enjoy it, including lecture
- Interesting material
- Like being able to start labs early

Neural networks

- "Biologically inspired" model of computation
- Can model arbitrary real-valued functions for classification and association between patterns
- Discriminative model: models the decision boundary directly
- Less memory than nearest neighbor
- Fast! Can be parallelized easily for large problems
- We will take a practical approach to classification

Perceptron model

- Computational model of a single neuron
- Inputs, outputs, function and threshold
- Will be connected to form a complete network
Q1,2
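A single perceptron forms a weighted sum of its inputs and applies a threshold. A minimal Matlab sketch of that computation (the weights, bias, and input below are illustrative values, not taken from any lab):

    % Perceptron output: weighted sum of inputs plus bias, then a hard threshold.
    w = [0.7 -0.4];      % one weight per input (illustrative values)
    b = 0.1;             % bias (b = -t, where t is the threshold)
    x = [1 0];           % a single input vector
    a = w*x' + b;        % activation: weighted sum plus bias
    y = double(a > 0);   % threshold: output 1 if activation > 0, else 0
    fprintf('activation = %.2f, output = %d\n', a, y);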

Modeling logic gates

- We'll do OR together. Inputs: x1 = {0,1}, x2 = {0,1}
- We need to pick weights wi and x0 (= -t, the threshold) such that it outputs 0 or 1 appropriately (one working choice is sketched below)
- Quiz: You do AND, NOT, and XOR.
- Note that a single perceptron is limited in what it can classify. What is the limitation?
Q3
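For OR, one choice that works is w1 = w2 = 1 with threshold t = 0.5; these values are just one valid choice, not the only one:

    % OR gate with a single perceptron: w1 = w2 = 1, threshold t = 0.5, so b = -0.5.
    w = [1 1];
    b = -0.5;
    inputs = [0 0; 0 1; 1 0; 1 1];
    for i = 1:size(inputs,1)
        x = inputs(i,:);
        y = double(w*x' + b > 0);    % fires whenever at least one input is 1
        fprintf('OR(%d,%d) = %d\n', x(1), x(2), y);
    end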

Perceptron training

- Each misclassified sample is used to change the weights "a little bit" so that the classification is better the next time.
- Consider inputs in the form x = [x1, x2, ..., xn]; the target label is y = {+1, -1}.
Algorithm (Hebbian learning):
- Randomize weights
- Loop until convergence:
  - If wx + b > 0 and y is -1: wi -= e*xi for all i; b -= e
  - Else if wx + b < 0 and y is +1: wi += e*xi for all i; b += e
  - Else (it's classified correctly): do nothing
- e is the learning rate (a parameter that can be tuned).
Demo: perceptronLearning.m is broken; see the sketch below.
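Since perceptronLearning.m is broken, here is a minimal sketch of the rule above on a toy dataset (the data, learning rate, and iteration cap are made up for illustration):

    % Perceptron training on a toy 2-D dataset with labels in {+1,-1}.
    X = [0 0; 0 1; 1 0; 1 1];       % one sample per row (toy data)
    y = [-1; -1; -1; +1];            % AND-like labels, linearly separable
    e = 0.1;                         % learning rate
    w = rand(1,2) - 0.5;             % randomize weights
    b = 0;
    for pass = 1:1000                % loop until convergence (or give up)
        errors = 0;
        for i = 1:size(X,1)
            a = w*X(i,:)' + b;
            if a > 0 && y(i) == -1               % predicted +, should be -
                w = w - e*X(i,:);  b = b - e;  errors = errors + 1;
            elseif a <= 0 && y(i) == +1          % predicted -, should be + (ties count as errors)
                w = w + e*X(i,:);  b = b + e;  errors = errors + 1;
            end                                   % otherwise correct: do nothing
        end
        if errors == 0, break; end                % converged: a full pass with no mistakes
    end
    fprintf('Converged after %d passes: w = [%.2f %.2f], b = %.2f\n', pass, w(1), w(2), b);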

Multilayer feedforward neural nets

- Many perceptrons, organized into layers:
  - Input (sensory) layer
  - Hidden layer(s): 2 proven sufficient to model any arbitrary function
  - Output (classification) layer
- Powerful! Calculates functions of the input, maps them to the output layer
- Example (diagram): sensory inputs x1, x2, x3 (HSV) → hidden layer (functions) → classification outputs y1, y2, y3 (apple/orange/banana)
Q4

XOR example

- 2 inputs
- 1 hidden layer of 5 neurons
- 1 output
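A sketch of this network using Matlab's neural-network toolbox (feedforwardnet is assumed to be available; the in-class demo may be built differently):

    % XOR with a 2-5-1 feedforward net trained by backpropagation.
    X = [0 0 1 1; 0 1 0 1];          % inputs: one column per sample
    T = [0 1 1 0];                    % XOR targets
    net = feedforwardnet(5);          % one hidden layer of 5 neurons
    net.divideFcn = 'dividetrain';    % use all 4 samples for training (no val/test split)
    net.trainParam.showWindow = false;
    net = train(net, X, T);
    Y = net(X)                        % should be close to [0 1 1 0]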

Backpropagation algorithm

- Initialize all weights randomly
- For each labeled example:
  a. Calculate the output using the current network (feedforward)
  b. Update weights across the network, from output to input, using Hebbian learning (feedback)
- Repeat; iterate until convergence. Epsilon decreases at every iteration.
- Matlab does this for you: matlabNeuralNetDemo.m
- Architecture of the demo: 1 input, 1 hidden layer of 5 neurons, 1 output (each hidden and output neuron has a bias, so 6 biases; each link has a weight, so 10 weights)
Q5
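For reference, here is a minimal sketch of one feedforward/feedback pass for the demo's 1-5-1 architecture (the tanh hidden units, linear output, squared-error loss, and toy sine data are assumptions; matlabNeuralNetDemo.m may differ):

    % One epoch of backpropagation for a 1-5-1 network (illustrative only).
    % Forward pass: h = tanh(w1*x + b1), o = w2'*h + b2.
    x = linspace(0, 2*pi, 20);  t = sin(x);     % toy 1-D regression data
    w1 = randn(5,1);  b1 = randn(5,1);           % 5 input->hidden weights + 5 hidden biases
    w2 = randn(5,1);  b2 = randn(1);             % 5 hidden->output weights + 1 output bias
    e = 0.05;                                    % learning rate (epsilon)
    for i = 1:numel(x)
        % a. Calculate output (feedforward)
        h = tanh(w1*x(i) + b1);                  % 5x1 hidden activations
        o = w2'*h + b2;                          % scalar output
        % b. Update weights, from output to input (feedback)
        d_out = o - t(i);                        % output error (for squared loss)
        d_hid = (w2 * d_out) .* (1 - h.^2);      % error backpropagated through tanh
        w2 = w2 - e * d_out * h;    b2 = b2 - e * d_out;
        w1 = w1 - e * d_hid * x(i); b1 = b1 - e * d_hid;
    end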

Parameters

- Most networks are reasonably robust with respect to the learning rate and how the weights are initialized.
- However, figuring out how to normalize your input and determine the architecture of your net is a black art. You might need to experiment.
- One hint: re-run the network with different initial weights and different architectures, testing performance each time on a validation set. Pick the best (see the sketch below).
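One way to carry out that hint in Matlab (the toy data, hidden sizes, split, and MSE criterion below are arbitrary illustrative choices; again assumes feedforwardnet):

    % Try several architectures; keep the one that does best on a held-out validation set.
    X = rand(3, 200);  T = double(sum(X) > 1.5);    % toy features and labels (illustrative)
    trainIdx = 1:150;  valIdx = 151:200;             % simple train/validation split
    hiddenSizes = [3 5 10 20];
    bestErr = inf;
    for h = hiddenSizes
        net = feedforwardnet(h);                     % weights re-initialized each time
        net.trainParam.showWindow = false;
        net = train(net, X(:,trainIdx), T(trainIdx));
        err = mean((net(X(:,valIdx)) - T(valIdx)).^2);   % error on the validation set
        if err < bestErr
            bestErr = err;  bestH = h;
        end
    end
    fprintf('Best architecture: %d hidden neurons (validation MSE %.4f)\n', bestH, bestErr);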

References

This is just the tip of the iceberg! See:
- Sonka, pp. 404-407.
- Laurene Fausett. Fundamentals of Neural Networks. Prentice Hall, 1994. Approachable for beginners.
- C.M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, 1995. A technical reference focused on the art of constructing networks (learning rate, number of hidden layers, etc.).
- Matlab neural net help.