Lecture 27: Recognition Basics. CS4670/5670: Computer Vision, Kavita Bala. Slides from Andrej Karpathy and Fei-Fei Li, http://vision.stanford.edu/teaching/cs231n/
Announcements: PA 3 artifact voting. Vote by Tuesday night.
Today: the image classification pipeline; training, validation, and testing; score functions and loss functions; building up to CNNs for learning (5-6 lectures on deep learning).
Image Classification
Image Classification: Problem
Data-driven approach: collect a database of images with labels, use ML to train an image classifier, and evaluate the classifier on test images.
Train and test: split the dataset between training images and test images. Be careful about inflation of results — evaluating on images seen during training overstates accuracy.
Classifiers: nearest neighbor, kNN, SVM, …
Nearest neighbor classifier. Train: remember all training images and their labels. Predict: find the closest (most similar) training image and output its label as the prediction.
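The train/predict recipe above can be sketched in a few lines of numpy. This is a minimal illustration, not code from the slides; the class and method names are my own, and it uses L1 distance on flattened images as one possible choice of metric.

```python
import numpy as np

class NearestNeighbor:
    """Nearest-neighbor classifier: memorize the training set,
    then predict the label of the single closest training image."""

    def train(self, X, y):
        # X: (N, D) array of flattened training images; y: (N,) labels.
        # "Training" is just remembering the data.
        self.X_train = X
        self.y_train = y

    def predict(self, X):
        # For each test image, find the training image with the
        # smallest L1 distance and return its label.
        y_pred = np.empty(X.shape[0], dtype=self.y_train.dtype)
        for i in range(X.shape[0]):
            distances = np.abs(self.X_train - X[i]).sum(axis=1)
            y_pred[i] = self.y_train[np.argmin(distances)]
        return y_pred
```

Note that all the work happens at prediction time; training stores the data and does nothing else.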
How to find the most similar training image? What is the distance metric?
The choice of distance metric is a hyperparameter.
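Two standard choices of metric are L1 (Manhattan) and L2 (Euclidean) distance between flattened pixel vectors. A minimal sketch, with function names of my own choosing:

```python
import numpy as np

def l1_distance(a, b):
    # L1 / Manhattan distance: sum of absolute pixel differences
    return np.abs(a - b).sum()

def l2_distance(a, b):
    # L2 / Euclidean distance: square root of summed squared differences
    return np.sqrt(((a - b) ** 2).sum())
```

Which metric works better is an empirical question, which is exactly why it is treated as a hyperparameter and chosen on a validation set.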
k-nearest neighbor: find the k closest points in the training data; the labels of those k points "vote" to classify.
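The voting step can be sketched as follows. This is an illustrative implementation, not from the slides; it uses L2 distance and a simple majority vote.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # L2 distances from the query x to every training point
    d = np.sqrt(((X_train - x) ** 2).sum(axis=1))
    # Labels of the k nearest neighbors vote; the majority label wins
    nearest = y_train[np.argsort(d)[:k]]
    return Counter(nearest.tolist()).most_common(1)[0][0]
```

With k = 1 this reduces to the plain nearest-neighbor classifier; larger k smooths out noisy labels at the cost of blurring class boundaries.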
How to pick hyperparameters? Methodology: not just train and test, but train, validate, test. Train the original model; validate to find hyperparameters; test (once, at the end) to understand generalizability.
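As a concrete sketch of this methodology, here is a loop that picks k for a kNN classifier using only the validation split. This is my own illustration (the helper names are not from the slides); labels are assumed to be small non-negative integers so `np.bincount` can tally votes.

```python
import numpy as np

def validation_accuracy(X_tr, y_tr, X_val, y_val, k):
    # Fraction of validation points whose k-NN majority label is correct
    correct = 0
    for x, y in zip(X_val, y_val):
        d = np.abs(X_tr - x).sum(axis=1)      # L1 distances to training set
        nearest = y_tr[np.argsort(d)[:k]]     # labels of the k closest points
        votes = np.bincount(nearest)          # tally the votes
        correct += (votes.argmax() == y)
    return correct / len(y_val)

def pick_k(X_tr, y_tr, X_val, y_val, candidates=(1, 3, 5, 7)):
    # Choose the hyperparameter on the validation set only;
    # the test set is touched exactly once, at the very end.
    return max(candidates,
               key=lambda k: validation_accuracy(X_tr, y_tr, X_val, y_val, k))
```

The point of the split: tuning k on the test set would leak information and inflate the reported accuracy.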
Validation
Cross-validation
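When data is scarce, k-fold cross-validation reuses every point for both training and validation: split the data into folds, let each fold serve once as the validation set while the rest train, and average the accuracies. A minimal sketch (the function signature is my own; `evaluate` stands in for any classifier-evaluation routine, e.g. kNN with a fixed k):

```python
import numpy as np

def cross_validate(X, y, n_folds, evaluate):
    """Average accuracy over n_folds folds.
    evaluate(X_tr, y_tr, X_val, y_val) -> accuracy is caller-supplied."""
    folds_X = np.array_split(X, n_folds)
    folds_y = np.array_split(y, n_folds)
    accs = []
    for i in range(n_folds):
        # Fold i is the validation set; the rest form the training set
        X_val, y_val = folds_X[i], folds_y[i]
        X_tr = np.concatenate(folds_X[:i] + folds_X[i + 1:])
        y_tr = np.concatenate(folds_y[:i] + folds_y[i + 1:])
        accs.append(evaluate(X_tr, y_tr, X_val, y_val))
    return float(np.mean(accs))
```

The averaged score is a less noisy estimate than a single validation split, at the cost of running the evaluation n_folds times.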
CIFAR-10 and NN results
Visualization: L2 distance
Complexity and storage: with N training images and M test images, training is O(1) and testing is O(MN). This is the opposite of what we normally need: slow training is OK, but fast testing is necessary.
Summary. Data-driven approach: train, validate, test; needs labeled data. Classifiers so far: nearest neighbor and kNN (sped up in practice via approximate nearest neighbor, ANN).
Score function
Linear Classifier
Computing scores
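For a linear classifier, the score function maps a flattened image to one score per class, f(x) = Wx + b. A minimal numpy sketch (names are my own):

```python
import numpy as np

def scores(W, b, x):
    # Linear score function f(x) = W x + b
    # W: (C, D) weight matrix, b: (C,) bias vector,
    # x: (D,) flattened image -> returns (C,) class scores
    return W @ x + b
```

The class with the highest score is the prediction; unlike kNN, testing is a single matrix-vector product regardless of training-set size.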
Geometric Interpretation
Interpretation: Template matching
Linear classifiers: find a linear function (hyperplane) that separates the positive and negative examples. Which hyperplane is best?
Support vector machines: find the hyperplane that maximizes the margin between the positive and negative examples. C. Burges, "A Tutorial on Support Vector Machines for Pattern Recognition," Data Mining and Knowledge Discovery, 1998.
Support vector machines: find the hyperplane that maximizes the margin between the positive and negative examples. The support vectors are the training points closest to the hyperplane; for support vectors, w·x_i + b = ±1. The distance between a point x and the hyperplane is |w·x + b| / ||w||, so each support vector lies at distance 1/||w|| and the margin is 2/||w||.
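The distance and margin formulas can be checked numerically. A small sketch, with function names of my own choosing:

```python
import numpy as np

def point_hyperplane_distance(w, b, x):
    # Distance from point x to the hyperplane w.x + b = 0: |w.x + b| / ||w||
    return abs(w @ x + b) / np.linalg.norm(w)

def margin(w):
    # Support vectors satisfy w.x + b = +/-1, so each lies 1/||w||
    # from the hyperplane; the margin is twice that: 2 / ||w||.
    return 2.0 / np.linalg.norm(w)
```

Since the margin is 2/||w||, maximizing the margin is the same as minimizing ||w||, which is the form of the SVM optimization problem.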
Bias Trick
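The bias trick folds the bias vector into the weight matrix: append b as an extra column of W and a constant 1 to x, so the affine map Wx + b becomes a single matrix product. A minimal sketch (names are my own):

```python
import numpy as np

def with_bias_trick(W, b, x):
    # Extend W with b as a last column and x with a trailing 1,
    # so that W_ext @ x_ext equals W @ x + b.
    W_ext = np.hstack([W, b[:, None]])   # (C, D+1)
    x_ext = np.append(x, 1.0)            # (D+1,)
    return W_ext @ x_ext
```

This simplifies notation and implementation: the classifier is just one matrix of parameters, which becomes convenient when stacking layers later.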