CSSE463: Image Recognition Day 17

Slides from: Doug Gray, David Poole
Presentation transcript:

CSSE463: Image Recognition Day 17
Upcoming schedule:
- Term project ideas due tonight
- Exam next Monday
- Sunset detector due in ~1.5 weeks
This week's topics: see schedule

Neural networks
- "Biologically inspired" model of computation
- Can model arbitrary real-valued functions, for classification and for association between patterns
- Discriminative model: models the decision boundary directly
- Less memory than nearest neighbor
- Fast! Can be parallelized easily for large problems
- We will take a practical approach to classification

Perceptron model
- Computational model of a single neuron
- Inputs, outputs, a weighting function, and a threshold
- Will be connected to form a complete network
Q1,2
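As a minimal sketch of that model (the names and numbers below are illustrative, not from the course demos): a perceptron forms a weighted sum of its inputs, adds a bias, and applies a hard threshold.

```python
import numpy as np

def perceptron_output(x, w, b):
    """Single perceptron: weighted sum of inputs plus bias, then a hard threshold."""
    activation = np.dot(w, x) + b        # the "function": a weighted sum
    return 1 if activation > 0 else 0    # the threshold: fire or don't

# Example: 3 inputs (e.g., H, S, V features) with arbitrary weights
x = np.array([0.8, 0.2, 0.5])
w = np.array([1.0, -0.5, 0.3])
b = -0.4
print(perceptron_output(x, w, b))        # 1, since 0.8 - 0.1 + 0.15 - 0.4 = 0.45 > 0
```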

Modeling logic gates
- We'll do OR together (one concrete weight choice is sketched below).
- Inputs: x1 = {0,1}, x2 = {0,1}
- We need to pick weights wi and x0 (= -t, the threshold) such that the perceptron outputs 0 or 1 appropriately.
- Quiz: you do AND, NOT, and XOR.
- Note that a single perceptron is limited in what it can classify. What is the limitation?
Q3
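One workable setting for OR (an illustrative sketch, not the only answer): w1 = w2 = 1 with threshold t = 0.5, i.e. x0 = -0.5. A quick check of the truth table:

```python
def perceptron(x1, x2, w1, w2, bias):
    """Returns 1 if w1*x1 + w2*x2 + bias > 0, else 0."""
    return 1 if w1 * x1 + w2 * x2 + bias > 0 else 0

# OR: fire if at least one input is 1 (w1 = w2 = 1, threshold t = 0.5, so bias = -0.5)
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, perceptron(x1, x2, w1=1, w2=1, bias=-0.5))
# prints: 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 1
```

AND and NOT admit similar single-perceptron settings; XOR is the interesting case, which the multilayer slides below return to.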

Perceptron training
- Each misclassified sample is used to change the weights "a little bit" so that the classification is better the next time.
- Consider inputs in the form x = [x1, x2, ..., xn]; the target label is y = {+1, -1}.
Algorithm (Hebbian learning):
- Randomize weights
- Loop until convergence:
  - If wx + b > 0 and y is -1 (a false positive): wi -= e*xi for all i, and b -= e
  - Else if wx + b < 0 and y is +1 (a false negative): wi += e*xi for all i, and b += e
  - Else (it's classified correctly): do nothing
- Equivalently, on every mistake: wi += e*y*xi and b += e*y.
- e is the learning rate (a parameter that can be tuned).
- Demo: perceptronLearning.m is broken.
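A hedged Python sketch of this training loop (the course demo perceptronLearning.m is a MATLAB file, noted above as broken; the names and defaults below are illustrative only):

```python
import numpy as np

def train_perceptron(X, y, e=0.1, max_epochs=100):
    """Perceptron training: nudge weights on each misclassified sample.
    X: (n_samples, n_features) inputs; y: labels in {+1, -1}; e: learning rate."""
    rng = np.random.default_rng(0)
    w = rng.uniform(-1, 1, X.shape[1])     # randomize weights
    b = rng.uniform(-1, 1)
    for _ in range(max_epochs):
        converged = True
        for xi, yi in zip(X, y):
            out = np.dot(w, xi) + b
            if out > 0 and yi == -1:       # fired, but should not have
                w -= e * xi
                b -= e
                converged = False
            elif out <= 0 and yi == +1:    # did not fire, but should have
                w += e * xi
                b += e
                converged = False
            # else: classified correctly, do nothing
        if converged:
            break
    return w, b
```

This only converges if the data are linearly separable, which is exactly the single-perceptron limitation asked about above.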

Multilayer feedforward neural nets
- Many perceptrons, organized into layers:
  - Input (sensory) layer
  - Hidden layer(s): 2 proven sufficient to model any arbitrary function
  - Output (classification) layer
- Powerful! Calculates functions of the input and maps them to the output layer.
- Example (diagram): sensory inputs x1, x2, x3 (HSV) feed a hidden layer of functions, which feeds classification outputs y1, y2, y3 (apple/orange/banana).
Q4
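A minimal forward-pass sketch for a net shaped like the example above (3 sensory inputs, one hidden layer, 3 class outputs). The hidden-layer size, random weights, and sigmoid squashing are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """Forward pass: each layer is a weighted sum plus bias, squashed by a sigmoid."""
    h = sigmoid(W1 @ x + b1)     # sensory -> hidden
    y = sigmoid(W2 @ h + b2)     # hidden -> classification
    return y

rng = np.random.default_rng(0)
x = np.array([0.1, 0.8, 0.6])                            # e.g., H, S, V of a region
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)     # 4 hidden units (arbitrary)
W2, b2 = rng.normal(size=(3, 4)), rng.normal(size=3)     # 3 outputs: apple/orange/banana
print(forward(x, W1, b1, W2, b2))                        # three scores in (0, 1)
```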

XOR example
- 2 inputs
- 1 hidden layer of 5 neurons
- 1 output
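As a sanity check that this architecture can handle XOR, here is a hedged sketch using scikit-learn's MLPClassifier (an assumption for illustration; not the course's MATLAB demo):

```python
from sklearn.neural_network import MLPClassifier

X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]                       # XOR: not linearly separable

# 2 inputs -> 1 hidden layer of 5 neurons -> 1 output
net = MLPClassifier(hidden_layer_sizes=(5,), activation='tanh',
                    solver='lbfgs', max_iter=2000, random_state=1)
net.fit(X, y)
print(net.predict(X))                  # typically [0 1 1 0]; depends on the random init
```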

Backpropagation algorithm
- Initialize all weights randomly
- For each labeled example:
  a. Calculate the output using the current network (feedforward)
  b. Update the weights across the network, from output back to input (feedback)
- Repeat until convergence
- Epsilon (the learning rate) decreases at every iteration
- Matlab does this for you: matlabNeuralNetDemo.m
- Architecture of the demo (diagram): 1 input, 1 hidden layer of 5 neurons, 1 output. Each hidden and output neuron has a bias (so 6 biases), and each link has a weight (so 10 weights).
Q5
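A minimal NumPy sketch of the loop above, on the same 1-5-1 shape as the demo, fitting a toy 1-D function (matlabNeuralNetDemo.m itself is MATLAB and not reproduced here; the target function and hyperparameters are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
# Architecture matching the demo: 1 input -> 5 hidden -> 1 output
W1, b1 = rng.normal(size=(5, 1)), np.zeros(5)   # 5 weights + 5 biases
W2, b2 = rng.normal(size=(1, 5)), np.zeros(1)   # 5 weights + 1 bias

# Toy data: learn y = sin(x) on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 50).reshape(-1, 1)
Y = np.sin(X)

eps = 0.05                                  # learning rate ("epsilon")
for epoch in range(2000):
    for x, y in zip(X, Y):
        # a. Calculate output (feedforward)
        h = np.tanh(W1 @ x + b1)            # hidden activations
        out = W2 @ h + b2                   # linear output unit
        # b. Update weights (feedback): push the error from output back to input
        err = out - y                       # dLoss/dout for squared error
        dW2 = np.outer(err, h); db2 = err
        dh = (W2.T @ err) * (1 - h**2)      # chain rule through tanh
        dW1 = np.outer(dh, x); db1 = dh
        W2 -= eps * dW2; b2 -= eps * db2
        W1 -= eps * dW1; b1 -= eps * db1
    eps *= 0.999                            # epsilon decreases at every iteration

preds = np.array([(W2 @ np.tanh(W1 @ x + b1) + b2)[0] for x in X])
print("final MSE:", np.mean((preds - Y.ravel())**2))   # should be small after training
```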

Parameters
- Most networks are reasonably robust with respect to the learning rate and to how the weights are initialized.
- However, figuring out how to normalize your input and how to determine the architecture of your net is a black art. You might need to experiment.
- One hint: re-run the network with different initial weights and different architectures, and test performance each time on a validation set. Pick the best.
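A hedged sketch of that hint, using scikit-learn as a stand-in (the dataset, candidate architectures, and seeds below are illustrative assumptions, not the course's MATLAB setup):

```python
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification

# Stand-in data; in the course project this would be your image features and labels
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

best = None
for hidden in [(5,), (10,), (20,), (10, 10)]:      # candidate architectures
    for seed in range(3):                          # different random initial weights
        net = MLPClassifier(hidden_layer_sizes=hidden, max_iter=1000, random_state=seed)
        net.fit(X_train, y_train)
        score = net.score(X_val, y_val)            # accuracy on the validation set
        if best is None or score > best[0]:
            best = (score, hidden, seed)

print("best validation accuracy %.3f with architecture %s (seed %d)" % best)
```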