CSSE463: Image Recognition Day 11


CSSE463: Image Recognition Day 11. Due: Written assignment 1 tomorrow, 4:00 pm. Start thinking about term project ideas. Lab 4 (shape) tomorrow: feel free to start in advance. Questions? Next 1.5 weeks: pattern recognition: concepts and error types (today), then basic theory and how to use classifiers in Matlab (neural networks and support vector machines, SVMs).

Pattern recognition: making a decision from data. A classification problem: assign a single class label to a data point. Can include a special class, reject, if a sample (a single data point) appears not to belong to any known class, or if it lies on the boundary between classes; otherwise, classification is forced. How are boundaries between classes drawn? There is a large body of theory applicable to many areas; we focus on the small subset used for vision. Q1

Baseline: hand-tuned decision boundaries. You did this based on observations for fruit classification, and you'll do the same thing in Lab 4 for shapes. But what if the features were much more complex? We now discuss classifiers that learn class boundaries from exemplars (i.e., labeled training examples). A hand-tuned baseline is sketched below.
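For concreteness, a hand-tuned rule is just a few fixed comparisons on feature values. A minimal sketch in Python (the labs use Matlab, but the logic carries over); the feature names and threshold values here are hypothetical, chosen only to illustrate the idea:

```python
def classify_fruit(hue, elongation):
    """Hand-tuned decision boundaries: thresholds picked by eye from
    observations of training fruit (all values here are hypothetical)."""
    if hue < 0.10:            # reddish hue band
        return "apple"
    elif elongation > 1.8:    # long, thin shape
        return "banana"
    return "orange"

print(classify_fruit(hue=0.05, elongation=1.0))   # -> apple
```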

Ex: nearest-neighbor classifier. Assumes we have a feature vector for each image. Calculate the distance from the new test sample to each labeled training sample, and assign the label of the closest training sample. Generalize by assigning the label held by the majority of the k nearest neighbors. What if there is no majority?
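A minimal k-nearest-neighbor sketch in Python with NumPy, just to make the algorithm concrete; Euclidean distance and the tie-breaking behavior (first label encountered among tied vote counts) are assumptions, not anything prescribed by the slide:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_test, k=3):
    """Label a test point by majority vote among its k nearest
    training samples, using Euclidean distance."""
    dists = np.linalg.norm(X_train - x_test, axis=1)
    nearest = np.argsort(dists)[:k]        # indices of the k closest samples
    votes = Counter(y_train[nearest])
    return votes.most_common(1)[0][0]      # ties: arbitrary but deterministic

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.8, 0.9])))   # -> 1
```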

Nearest class mean: find each class's mean and calculate the distance to each mean. Pro? Con? (Pro: fast and simple. Con: a single mean represents a class poorly when the class forms multiple separated clusters.) Partial solution: clustering. Learning vector quantization (LVQ) finds optimal cluster centers. (Slide figure: class clusters and a test point.) Q2
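A nearest-class-mean sketch in Python with NumPy, including an optional reject threshold of the kind mentioned two slides back; the threshold value is an arbitrary illustration:

```python
import numpy as np

def nearest_mean_predict(X_train, y_train, x_test, reject_dist=None):
    """Assign the class whose mean is closest to the test point.
    Optionally return 'reject' if even the closest mean is far away."""
    classes = np.unique(y_train)
    means = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(means - x_test, axis=1)
    i = np.argmin(dists)
    if reject_dist is not None and dists[i] > reject_dist:
        return "reject"                    # sample fits no known class well
    return classes[i]

X_train = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9]])
y_train = np.array([0, 0, 1, 1])
print(nearest_mean_predict(X_train, y_train, np.array([0.9, 1.0])))       # -> 1
print(nearest_mean_predict(X_train, y_train, np.array([5.0, 5.0]), 1.0))  # -> reject
```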

Common model of learning machines (slide diagram). Training: labeled training images -> extract features (color, texture) -> statistical learning -> classifier. Testing: test image -> extract features (color, texture) -> classifier -> summary label.

Focus on testing. Let m = the number of possible class labels. Consider m == 2. Example: calculate the distance to the cluster means for 2 classes. (Slide diagram: test image -> extract features (color, texture) -> distance/probability for each of the 2 classes -> decide class: 1 or 2.)

Multiclass problems. Consider m > 2. Example: calculate the distance to the cluster means for 10 classes. (Slide diagram: test image -> extract features (color, texture) -> distance/probability for each of the N classes -> decide class: 1, 2, ..., N.)

How good is your classifier? Example from medicine: disease detection. Consider the costs of false negatives vs. false positives. There are lots of different error measures.

                  Detected: Yes        Detected: No           Total
True: Yes         500 (true pos.)      100 (false neg.)         600
True: No          200 (false pos.)     10000 (true neg.)      10200
Total             700 det. as pos.     10100 det. as neg.     10800

Accuracy = 10500/10800 = 97%. Is 97% accuracy OK?
Recall (or true positive rate) = 500/600 = 83%
Precision = 500/700 = 71%
False positive rate = 200/10200 = 2%

How good is your classifier? Write out definitions of each measure now. Examples (same confusion matrix as the previous slide): Accuracy = 10500/10800 = 97%. Recall (or true positive rate) = 500/600 = 83%. Precision = 500/700 = 71%. False positive rate = 200/10200 = 2%. Q3
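A sketch of these definitions in Python, using the counts from the slides; a real implementation would also guard against zero denominators:

```python
# Counts from the slides' disease-detection example.
tp, fn = 500, 100      # actual positives:  600
fp, tn = 200, 10000    # actual negatives: 10200

accuracy  = (tp + tn) / (tp + tn + fp + fn)   # 10500/10800 ~ 0.97
recall    = tp / (tp + fn)                    # true positive rate: 500/600 ~ 0.83
precision = tp / (tp + fp)                    # 500/700 ~ 0.71
fpr       = fp / (fp + tn)                    # 200/10200 ~ 0.02

print(f"accuracy={accuracy:.3f} recall={recall:.3f} "
      f"precision={precision:.3f} fpr={fpr:.3f}")
```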

What if I have a tunable threshold, t? Simple example: a single real-valued output, thresholded so that label = (output > t) ? P : N. True classes of the samples, sorted by classifier output: N N N P N P N N P N P P N P P P P P P P. If t == 0: TPR = 9/12, FPR = 2/8. If t == 1: TPR = ___, FPR = ___. Repeat for many values of t, plotting each (false positive rate, true positive rate) pair.

ROC curve: receiver operating characteristic. Useful when you can change a threshold to get different true and false positive rates. Consider the extremes. Much more information is recorded here than in any single error measure! A threshold-sweep sketch follows. Q3
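A sketch of the threshold sweep in Python with NumPy, using the P/N sequence from the previous slide; the score values are hypothetical stand-ins, since only their ordering matters for the resulting ROC points:

```python
import numpy as np

# True classes sorted by classifier output (previous slide), P=1, N=0.
labels = np.array([0,0,0,1,0,1,0,0,1,0,1,1,0,1,1,1,1,1,1,1])
scores = np.linspace(-1, 1, len(labels))   # hypothetical, order-preserving scores

P, N = labels.sum(), (1 - labels).sum()    # 12 positives, 8 negatives
for t in np.concatenate(([-np.inf], scores)):   # sweep every cut point
    pred = scores > t                      # label = (output > t) ? P : N
    tpr = (pred & (labels == 1)).sum() / P
    fpr = (pred & (labels == 0)).sum() / N
    print(f"t={t:+.2f}  TPR={tpr:.2f}  FPR={fpr:.2f}")
# Plotting the (FPR, TPR) pairs traces out the ROC curve.
```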

Confusion matrices for m > 2 (outdoor image example). Beach recall: 169/200 = 84.5%. Note the confusion between the mountain and urban classes due to the features: similar colors and spatial layout. (Slide table: rows = true class, columns = detected class.) Q4
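Per-class recall comes directly from the confusion matrix: divide each diagonal entry by its row sum. A sketch in Python with NumPy; the off-diagonal counts below are hypothetical, since the slide's full matrix is not reproduced here (only the beach row total of 200 is known):

```python
import numpy as np

# Hypothetical confusion matrix: rows = true class, cols = detected class.
classes = ["beach", "mountain", "urban"]
cm = np.array([[169,  21,  10],    # beach row sums to 200, as on the slide
               [ 12, 160,  28],    # mountain often confused with urban
               [  8,  25, 167]])   # urban often confused with mountain

recall = cm.diagonal() / cm.sum(axis=1)   # per-class recall
for name, r in zip(classes, recall):
    print(f"{name:9s} recall = {r:.1%}")  # beach -> 84.5%
```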

Why do we need separate training and test sets? Exam analogy: the exam tests you on problems you haven't seen. But working on practice questions is helpful... get the analogy? We hope our ability to do well on practice questions transfers to doing well on the actual exam. Application to nearest-neighbor classifiers: a 1-NN classifier trivially scores 100% on its own training data (each training sample is its own nearest neighbor), so training accuracy says little about generalization; see the sketch below. Often a 3rd set is reserved for validation as well (to tune classifier parameters, such as k). Q5-8
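A minimal sketch in Python with NumPy of why training accuracy misleads for 1-NN: every training sample is its own nearest neighbor, so training accuracy is always 100% regardless of how well the classifier generalizes; the data are synthetic and hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def nn1_predict(X_train, y_train, X):
    """1-NN: each query takes the label of its closest training sample."""
    d = np.linalg.norm(X[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[d.argmin(axis=1)]

# Two overlapping, hypothetical 2-D classes (100 samples each).
X = rng.normal(size=(200, 2)) + np.repeat([[0.0, 0.0], [1.0, 1.0]], 100, axis=0)
y = np.repeat([0, 1], 100)
perm = rng.permutation(200)
train, test = perm[:150], perm[150:]

for name, idx in [("train", train), ("test", test)]:
    acc = (nn1_predict(X[train], y[train], X[idx]) == y[idx]).mean()
    print(f"{name} accuracy = {acc:.2f}")   # train = 1.00; test typically lower
```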