CSSE463: Image Recognition Day 11


CSSE463: Image Recognition Day 11
- Keep thinking about term project ideas.
- Interesting data set, use of which won the Marr Prize in computer vision: http://vision.cs.stonybrook.edu/~vicente/sbucaptions/
- Interesting project? From Larry Gates: https://lab.nationalmedals.org/img_processing
- Next 2-3 weeks: pattern recognition
  - Concepts, error types (today)
  - Basic theory and how to use classifiers in MATLAB: support vector machines (SVM), neural networks

Pattern recognition
- Making a decision from data.
- A classification problem: assign a single class label to a data point.
- Can include a special class, "reject", if a sample (a single data point) appears not to belong to any known class, or if it lies on the boundary between classes; otherwise classification is forced.
- How do we find the boundaries between classes?
- There's a large body of theory, applicable to many areas; we focus on the small subset used for vision.
Q1

Baseline: hand-tuned decision boundaries
- You did this based on observations for fruit classification.
- You'll do the same thing in Lab 4 for shapes.
- But what if the features were much more complex?
- We now discuss classifiers that learn class boundaries from labeled training examples.
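As a concrete illustration of a hand-tuned boundary, here is a minimal sketch in Python (not the course's MATLAB). The hue feature and the thresholds 0.10 and 0.22 are invented values someone might pick after eyeballing labeled fruit samples; nothing here is learned from data.

```python
def classify_fruit(hue):
    """Hand-tuned decision boundaries on a single (hypothetical) hue feature.
    The cut points are human-chosen guesses, not learned from data."""
    if hue < 0.10:
        return "apple"       # reddish hues
    elif hue < 0.22:
        return "orange"
    else:
        return "banana"      # yellowish hues

print(classify_fruit(0.05))   # apple
print(classify_fruit(0.15))   # orange
```

With one feature this is easy to tune by hand; the point of the next slides is that with complex, high-dimensional features, hand-picking such cut points stops being feasible.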

Ex: nearest neighbor classifier
- Assumes we have a feature vector for each image.
- Calculate the distance from the new test sample to each labeled training sample; assign the label of the closest training sample.
- Generalize (kNN) by assigning the label held by the majority of the k nearest neighbors. What if there's no majority?
- Boundaries: http://ai6034.mit.edu/fall12/index.php?title=Demonstrations
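The procedure above can be sketched as follows (Python rather than the course's MATLAB), using an invented two-class toy dataset; ties among the k labels are broken arbitrarily by `Counter`:

```python
from collections import Counter
import math

def knn_predict(train, test_vec, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns the majority label among the k nearest training samples
    (Euclidean distance); k=1 is the plain nearest-neighbor rule."""
    dists = sorted((math.dist(x, test_vec), label) for x, label in train)
    k_labels = [label for _, label in dists[:k]]
    return Counter(k_labels).most_common(1)[0][0]

# Toy data: class A clustered near the origin, class B near (5, 5).
train = [((0, 0), "A"), ((0, 1), "A"), ((1, 0), "A"),
         ((5, 5), "B"), ((5, 6), "B"), ((6, 5), "B")]
print(knn_predict(train, (1, 1), k=3))   # A
print(knn_predict(train, (5, 5), k=3))   # B
```

Note the cost: every prediction computes a distance to every training sample, which motivates the nearest-class-mean shortcut on the next slide.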

Look at this to understand nearest neighbor
- http://ai6034.mit.edu/fall12/index.php?title=Demonstrations
- Shows Voronoi diagrams for nearest neighbor classifiers
- Nice intro to SVMs also

Nearest class mean
- Find the mean of each class's training samples; assign a test point the label of the nearest class mean.
- Pro? Con?
- Partial solution (for classes that form several clusters): clustering
- Learning vector quantization (LVQ): tries to find optimal clusters
Q2
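A minimal sketch of the nearest-class-mean rule, again in Python with invented 2-D data:

```python
import math
from collections import defaultdict

def class_means(train):
    """train: list of (feature_vector, label) -> {label: mean vector}."""
    sums = defaultdict(lambda: None)
    counts = defaultdict(int)
    for x, label in train:
        if sums[label] is None:
            sums[label] = list(x)
        else:
            sums[label] = [a + b for a, b in zip(sums[label], x)]
        counts[label] += 1
    return {lab: [s / counts[lab] for s in sums[lab]] for lab in sums}

def nearest_mean_predict(means, test_vec):
    """Label of the class whose mean is closest to the test point."""
    return min(means, key=lambda lab: math.dist(means[lab], test_vec))

train = [((0, 0), "A"), ((0, 2), "A"), ((6, 6), "B"), ((6, 8), "B")]
means = class_means(train)                    # A: [0, 1], B: [6, 7]
print(nearest_mean_predict(means, (1, 1)))    # A
```

This makes the pro/con concrete: test time needs only one distance per class instead of one per training sample, but a single mean summarizes a class that falls in several clusters poorly, which is what the clustering/LVQ refinement addresses.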

Common model of learning machines
- Training: labeled training images → extract features (color, texture) → statistical learning → summary (learned model)
- Testing: test image → extract features (color, texture) → classifier → label

How good is your classifier?
- Example from medicine: disease detection
- Consider the costs of false negatives vs. false positives
- Lots of different error measures

              Detected: Yes       Detected: No         Total
True: Yes     500 (true pos.)     100 (false neg.)       600 actual positive
True: No      200 (false pos.)    10000 (true neg.)    10200 actual negative
Total         700 det. as pos.    10100 det. as neg.   10800

- Accuracy = 10500/10800 = 97%. Is 97% accuracy OK?
- Recall (or true positive rate) = 500/600 = 83%
- Precision = 500/700 = 71%
- False positive rate = 200/10200 = 2%

How good is your classifier?
- Write out definitions of each measure now.
- Examples:
  - Accuracy = 10500/10800 = 97%
  - Recall (or true positive rate) = 500/600 = 83%
  - Precision = 500/700 = 71%
  - False positive rate = 200/10200 = 2%

           Detected: Yes      Detected: No
Has: Yes   500 (true pos.)    100 (false neg.)
Has: No    200 (false pos.)   10000 (true neg.)
Q3a-d
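The four measures can be checked directly from the table's counts (TP = 500, FN = 100, FP = 200, TN = 10000); a sketch in Python:

```python
# Counts from the slide's 2x2 table.
tp, fn, fp, tn = 500, 100, 200, 10000
total = tp + fn + fp + tn                 # 10800 samples

accuracy  = (tp + tn) / total             # 10500/10800 ~ 97%
recall    = tp / (tp + fn)                # 500/600 ~ 83% (true positive rate)
precision = tp / (tp + fp)                # 500/700 ~ 71%
fpr       = fp / (fp + tn)                # 200/10200 ~ 2%

print(f"accuracy={accuracy:.3f} recall={recall:.3f} "
      f"precision={precision:.3f} FPR={fpr:.4f}")
```

Note how accuracy can look excellent (97%) while recall says one in six diseased patients is missed; that is why the slide asks whether 97% accuracy is OK.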

Thresholding real-valued output allows you to trade off TPR and FPR
- Simple example: classes P = positive, N = negative, and a single real-valued output per sample.
- 20 samples, ordered by output value from -3 to 3; true classes in that order:
  N N N P N P N N P N P P N P P P P P P P
- Threshold the output to get a class: label = output > t ? P : N
- Choice of threshold t:
  - If t == 0: TPR = 9/12, FPR = 2/8
  - If t == 1: TPR = 7/12, FPR = 1/8
- Repeat for many values of t, recording the (false positive rate, true positive rate) pair each time.
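A sketch of the computation in Python: the true-class sequence is from the slide, but the numeric scores are invented for illustration (any values with the same ordering give the same rates).

```python
# True classes of the 20 samples, ordered by classifier output.
labels = list("NNNPNPNNPNPPNPPPPPPP")
# Hypothetical output values consistent with that ordering.
scores = [-3.0, -2.6, -2.2, -1.8, -1.4, -1.0, -0.6, -0.3, -0.1,
           0.1,  0.4,  0.8,  1.1,  1.4,  1.7,  2.0,  2.3,  2.6,  2.9, 3.0]

def rates(t):
    """Return (TPR, FPR) when samples with output > t are called positive."""
    tp = sum(y == "P" and s > t for y, s in zip(labels, scores))
    fp = sum(y == "N" and s > t for y, s in zip(labels, scores))
    return tp / labels.count("P"), fp / labels.count("N")

print(rates(0))   # (0.75, 0.25)  i.e. TPR = 9/12, FPR = 2/8
print(rates(1))   # TPR = 7/12, FPR = 1/8
```

Raising t trades true positives away for fewer false positives, which is exactly the trade-off the ROC curve on the next slide records.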

ROC curve
- Receiver operating characteristic
- Useful when you can change a threshold to get different true and false positive rates
- Consider the extremes: a threshold below all outputs gives TPR = FPR = 1; one above all outputs gives TPR = FPR = 0.
- Much more information recorded here than in a single accuracy number!
Q3
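Sweeping the threshold over every observed score traces out the curve as (FPR, TPR) points; a sketch using the same 20-sample example (labels from the previous slide, scores invented):

```python
labels = list("NNNPNPNNPNPPNPPPPPPP")
scores = [-3.0, -2.6, -2.2, -1.8, -1.4, -1.0, -0.6, -0.3, -0.1,
           0.1,  0.4,  0.8,  1.1,  1.4,  1.7,  2.0,  2.3,  2.6,  2.9, 3.0]

def roc_points(labels, scores):
    """(FPR, TPR) pairs from the strictest threshold to the loosest."""
    n_pos = labels.count("P")
    n_neg = labels.count("N")
    points = [(0.0, 0.0)]                       # threshold above every score
    for t in sorted(scores, reverse=True):
        tp = sum(y == "P" and s >= t for y, s in zip(labels, scores))
        fp = sum(y == "N" and s >= t for y, s in zip(labels, scores))
        points.append((fp / n_neg, tp / n_pos))
    return points

pts = roc_points(labels, scores)
print(pts[0], pts[-1])   # (0.0, 0.0) (1.0, 1.0)
```

The endpoints are the two extremes from the slide, and the points in between record the whole TPR/FPR trade-off that a single accuracy number hides.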

Confusion matrices for m > 2 classes (outdoor image example)
- Rows are the true class; columns are the detected class.
- Beach recall: 169/(169+0+2+3+12+14) = 169/200 = 84.5%
- Note the confusion between the mountain and urban classes, due to their features: similar colors and spatial layout.
Q4
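Per-class recall generalizes the two-class case: it is the diagonal entry of a row divided by the row sum. Only the beach row of the matrix is given on the slide, so the sketch below uses just that row, with class order assumed for illustration:

```python
# True-beach images, counted by which class they were detected as
# (assumed order: beach, then five other outdoor classes).
beach_row = [169, 0, 2, 3, 12, 14]

def row_recall(row, true_idx):
    """Recall of one class: correctly-detected count / total in that class."""
    return row[true_idx] / sum(row)

print(f"beach recall: {row_recall(beach_row, 0):.1%}")   # 84.5%
```

The same function applied to every row of a full m×m confusion matrix gives one recall per class, which is how the mountain/urban confusion would show up numerically.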

Why do we need separate training and test sets?
- Exam analogy: you aren't graded on practice questions you've already seen.
- But working on practice questions is helpful… get the analogy? We hope our ability to do well on practice questions helps us on the actual exam.
- Application to nearest-neighbor classifiers: a sample that appears in both sets is trivially its own nearest neighbor, so accuracy measured on training data is misleadingly optimistic.
- Often reserve a 3rd set, a validation set, as well (to tune parameters during training).
Q5-8
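A holdout split can be sketched as follows: a hypothetical labeled dataset is shuffled, then divided into train / validation / test pieces. The 60/20/20 proportions are a common choice for illustration, not a rule from the slides.

```python
import random

# Hypothetical dataset: 100 (feature_vector, label) pairs.
random.seed(463)
data = [((random.random(), random.random()), random.choice("AB"))
        for _ in range(100)]
random.shuffle(data)                     # mix before splitting

n = len(data)
i1, i2 = 6 * n // 10, 8 * n // 10        # 60/20/20 split points
train = data[:i1]                        # fit the classifier
val   = data[i1:i2]                      # tune parameters (e.g., k in kNN)
test  = data[i2:]                        # touched once, for the final estimate

print(len(train), len(val), len(test))   # 60 20 20
```

Keeping the test slice untouched until the very end is the code-level version of the exam analogy: the exam questions stay unseen until exam day.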