CSSE463: Image Recognition Day 11

CSSE463: Image Recognition Day 11

- Start thinking about term project ideas.
- An interesting data set, use of which won the Marr Prize in computer vision: http://vision.cs.stonybrook.edu/~vicente/sbucaptions/
- Interesting project? From Larry: https://lab.nationalmedals.org/img_processing
- Next 1.5 weeks: pattern recognition
  - Concepts, error types (today)
  - Basic theory and how to use classifiers in MATLAB: support vector machines (SVM), neural networks

Pattern recognition

- Making a decision from data.
- A classification problem: assign a single class label to a data point.
- Can include a special class, reject, if a sample appears not to belong to any known class, or if it lies on the boundary between classes; otherwise classification is forced.
- How do we find the boundaries between classes?
- There's tons of theory, applicable to many areas; we focus on the small subset used for vision.

Q1

Baseline: hand-tuned decision boundaries

- You did this based on observations for fruit classification.
- You'll do the same thing in Lab 4 for shapes.
- But what if the features were much more complex?
- We now discuss classifiers that learn class boundaries based on exemplars (e.g., labeled training examples).

Example: nearest neighbor classifier

- Assumes we have a feature vector for each image.
- Calculate the distance from the new test sample to each labeled training sample.
- Assign the label of the closest training sample.
- Generalize by assigning the same label as the majority of the k nearest neighbors. What if there's no majority? (Ties must be broken somehow, e.g., by falling back to the single nearest neighbor.) A sketch follows below.
- Boundaries: http://ai6034.mit.edu/fall12/index.php?title=Demonstrations
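To make this concrete, here is a minimal k-NN sketch in MATLAB (the course's tool). The function and variable names (knnClassify, trainX, and so on) are illustrative, not from the lecture.

% Minimal k-nearest-neighbor classifier (base MATLAB, no toolboxes).
% trainX: N-by-d training feature vectors; trainLabels: N-by-1 labels;
% testX: 1-by-d feature vector for one test sample.
function label = knnClassify(trainX, trainLabels, testX, k)
    % Squared Euclidean distance to every training sample
    d2 = sum((trainX - testX).^2, 2);
    % Indices of the k closest training samples
    [~, idx] = mink(d2, k);
    % Majority vote among their labels (mode breaks ties by
    % returning the smallest tied label)
    label = mode(trainLabels(idx));
end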

Look at this to understand nearest neighbor: http://ai6034.mit.edu/fall12/index.php?title=Demonstrations

- Shows Voronoi diagrams for nearest neighbor classifiers.
- Also a nice intro to SVMs.

Nearest class mean

- Find the class means (from the training samples) and calculate the distance from the test point to each mean; assign the label of the nearest mean (see the sketch below).
- Pro? Con?
- Partial solution (when a single mean per class is too coarse): clustering.
- Learning vector quantization (LVQ): tries to find optimal clusters.

Q2
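A minimal nearest-class-mean sketch in MATLAB, under the same illustrative naming assumptions as the k-NN example:

% Nearest-class-mean classifier.
% trainX: N-by-d features; trainLabels: N-by-1 labels; testX: 1-by-d.
function label = nearestMeanClassify(trainX, trainLabels, testX)
    classes = unique(trainLabels);
    means = zeros(numel(classes), size(trainX, 2));
    for i = 1:numel(classes)
        % Mean feature vector of all training samples in class i
        means(i, :) = mean(trainX(trainLabels == classes(i), :), 1);
    end
    % Assign the label of the nearest class mean
    [~, best] = min(sum((means - testX).^2, 2));
    label = classes(best);
end

One design note: this needs only one distance computation per class at test time (pro), but a single mean represents multi-modal classes poorly (con), which is what clustering and LVQ address.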

Common model of learning machines

- Training (statistical learning): labeled training images → extract features (color, texture) → summary (learned model).
- Testing: test image → extract features (color, texture) → classifier → label.

How good is your classifier?

- Example from medicine: disease detection.
- Consider the costs of false negatives vs. false positives.
- There are lots of different error measures.

                          Detected: Yes       Detected: No
True: Yes (600 total)     500 (true pos.)     100 (false neg.)
True: No (10200 total)    200 (false pos.)    10000 (true neg.)
Total                     700 det. as pos.    10100 det. as neg.

- Accuracy = 10500/10800 = 97%. Is 97% accuracy OK?
- Recall (or true positive rate) = 500/600 = 83%
- Precision = 500/700 = 71%
- False pos. rate = 200/10200 = 2%

How good is your classifier?

- Write out the definitions of each measure now (they are checked in the sketch below):
  - Accuracy = (TP + TN) / total = 10500/10800 = 97%
  - Recall (or true positive rate) = TP / (TP + FN) = 500/600 = 83%
  - Precision = TP / (TP + FP) = 500/700 = 71%
  - False pos. rate = FP / (FP + TN) = 200/10200 = 2%

               Detected: Yes       Detected: No
Has: Yes       500 (true pos.)     100 (false neg.)
Has: No        200 (false pos.)    10000 (true neg.)

Q3a-d
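A quick MATLAB check of these measures, using the counts from the table above:

% Confusion counts from the disease-detection example
TP = 500; FN = 100; FP = 200; TN = 10000;

accuracy  = (TP + TN) / (TP + TN + FP + FN)  % 10500/10800, about 0.97
recall    = TP / (TP + FN)                   % 500/600, about 0.83 (TPR)
precision = TP / (TP + FP)                   % 500/700, about 0.71
fpr       = FP / (FP + TN)                   % 200/10200, about 0.02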

Thresholding real-valued output allows you to trade off TPR and FPR

- Simple example: classes P = positive, N = negative, and a single real-valued output per sample.
- True classes, sorted by output (the output axis runs from -3 to 3):
  N N N P N P N N P N P P N P P P P P P P
- Threshold the output to get a class: label = output > t ? P : N (computed in the sketch below).
- Choice of threshold t:
  - If t == 0: TPR = 9/12, FPR = 2/8
  - If t == 1: TPR = 7/12, FPR = 1/8
- Repeat for many values of t and plot true pos. rate vs. false pos. rate.
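A sketch of this computation in MATLAB. The helper is generic: scores holds the real-valued outputs and isPos marks the samples whose true class is P (the slide doesn't give the exact per-sample output values, so none are assumed here).

% TPR and FPR at a given threshold t.
% scores: real-valued classifier outputs; isPos: logical vector,
% true where the true class is P.
function [tpr, fpr] = ratesAtThreshold(scores, isPos, t)
    pred = scores > t;                        % predicted positive
    tpr  = sum(pred & isPos)  / sum(isPos);   % TP / actual positives
    fpr  = sum(pred & ~isPos) / sum(~isPos);  % FP / actual negatives
end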

ROC curve

- Receiver operating characteristic.
- Useful when you can change a threshold to get different true and false positive rates (the sweep is sketched below).
- Consider the extremes: t = -infinity labels everything P (TPR = FPR = 1); t = +infinity labels everything N (TPR = FPR = 0).
- Much more information is recorded here than in a single accuracy number!

Q3
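Sweeping the threshold with the helper above traces out the ROC curve; a minimal sketch, assuming scores and isPos from the previous example:

% Trace an ROC curve by sweeping t over all observed score values.
thresholds = unique(scores);        % unique() returns sorted values
tprs = zeros(size(thresholds));
fprs = zeros(size(thresholds));
for i = 1:numel(thresholds)
    [tprs(i), fprs(i)] = ratesAtThreshold(scores, isPos, thresholds(i));
end
plot(fprs, tprs, '-o');
xlabel('False pos rate'); ylabel('True pos rate');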

Focus on testing

- Let m = the number of possible class labels.
- Consider m == 2. Example: calculate the distance to the cluster means for 2 classes.
- Pipeline: test image → extract features (color, texture) → dist/prob 1 and dist/prob 2 → decide → class: 1 or 2.

Multiclass problems

- Consider m > 2. Example: calculate the distance to the cluster means for 10 classes.
- Pipeline: test image → extract features (color, texture) → dist/prob 1, dist/prob 2, ..., dist/prob N → decide → class: 1, 2, ..., N.

Confusion matrices for m > 2 (outdoor image example)

- Rows are the true class; columns are the detected class. [The matrix itself appeared as a figure on the slide.]
- Beach recall: 169/(169+0+2+3+12+14) = 169/200 = 84.5% (computed below)
- Note the confusion between the mountain and urban classes due to the features: similar colors and spatial layout.

Q4
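Per-class recall drops out of a confusion matrix in one line of MATLAB. Only the beach row's counts appear on the slide, so the worked example below uses just that row:

% For an m-by-m confusion matrix C (rows = true, columns = detected),
% recall of each class is its diagonal entry over its row sum:
%   recallPerClass = diag(C) ./ sum(C, 2);

% Beach row from the slide:
beachRow    = [169 0 2 3 12 14];
beachRecall = beachRow(1) / sum(beachRow)   % 169/200 = 0.845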

Why do we need separate training and test sets?

- Exam analogy: you aren't evaluated on questions you've already seen.
- But working on practice questions is helpful... get the analogy? We hope our ability to do well on practice questions transfers to the actual exam.
- Application to nearest-neighbor classifiers: a training sample is its own nearest neighbor (distance 0), so accuracy measured on the training set is misleadingly optimistic.
- Often reserve a 3rd set for validation as well (to tune parameters during training). A split sketch follows.

Q5-8
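A minimal MATLAB sketch of a random train/validation/test split. The 60/20/20 proportions and the names X and y are illustrative choices, not from the lecture:

% Randomly split N labeled samples into train / validation / test.
% X: N-by-d feature matrix; y: N-by-1 label vector.
N = size(X, 1);
perm = randperm(N);                         % random order, no repeats
nTrain = round(0.60 * N);
nVal   = round(0.20 * N);
trainIdx = perm(1 : nTrain);
valIdx   = perm(nTrain+1 : nTrain+nVal);
testIdx  = perm(nTrain+nVal+1 : end);
% Fit on (X(trainIdx,:), y(trainIdx)), tune parameters such as k on the
% validation set, and report final accuracy on the test set only.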