Chapter 4 Pattern Recognition Concepts continued.

Using features
Ex. Recognizing characters. Useful features:
- Area
- Height
- Width
- Number of holes
- Number of strokes
- Centroid
- Best axis (of least inertia)
- Second moments (about the axes of least and most inertia)

Decision tree
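A decision tree tests one feature at a time and follows a branch until it reaches a class label. As a minimal sketch (the hole and stroke counts assumed here are hypothetical, not taken from the slides' figure):

```python
# A tiny hand-built decision tree over two character features:
# number of holes and number of strokes. Feature values are assumed.
def classify_character(num_holes, num_strokes):
    """Each internal node tests one feature; leaves are class labels."""
    if num_holes >= 2:
        return '8'          # e.g. '8' has two holes (lakes)
    if num_holes == 1:
        return '6'          # one hole
    # no holes: distinguish by stroke count
    return 'X' if num_strokes == 2 else 'C'
```

In practice such trees are learned from labeled feature vectors rather than written by hand, but the run-time logic is exactly this cascade of feature tests.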

Classifying using nearest class mean
Some problems are more "fuzzy" and can't be solved using simple decision trees. To classify a candidate, c, compute its distance from all known class means and assign it to the class of the nearest mean.
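The nearest-class-mean rule above can be sketched in a few lines (the class labels and mean vectors are placeholders; any feature dimensionality works):

```python
import math

def nearest_class_mean(candidate, class_means):
    """Assign `candidate` to the class whose mean feature vector is
    closest in Euclidean distance.

    class_means: dict mapping class label -> mean feature vector.
    """
    best_label, best_dist = None, float('inf')
    for label, mean in class_means.items():
        d = math.dist(candidate, mean)  # Euclidean distance
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label
```

Only one stored vector per class is needed, which makes this much cheaper than nearest neighbor, at the cost of assuming each class is well summarized by its mean.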

Classifying using nearest class mean (with good results)

Euclidean distance (and scaled Euclidean distance)
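Both distances named in this slide's title can be written directly from their definitions; in the scaled version each feature difference is divided by that feature's standard deviation so that features with large spread do not dominate:

```python
import math

def euclidean(x, y):
    """Plain Euclidean distance between two feature vectors."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

def scaled_euclidean(x, y, sigma):
    """Euclidean distance with each feature difference scaled by that
    feature's standard deviation sigma[i]."""
    return math.sqrt(sum(((xi - yi) / si) ** 2
                         for xi, yi, si in zip(x, y, sigma)))
```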

Other distance measures

What is a distance metric? A function g with 3 properties:
1. g(x,y) = g(y,x)  (symmetric)
2. g(x,x) = 0, and g(x,y) = 0 implies x = y
3. g(x,y) + g(y,z) >= g(x,z)  (triangle inequality)

Classifying using nearest class mean (with poor results)

Classifying using nearest neighbor
To classify a candidate, c, compute its distance to all members of all known classes and assign it to the class of the nearest element.
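A minimal 1-nearest-neighbor sketch of the rule above (the training set is a list of (feature vector, label) pairs; both are placeholders):

```python
import math

def nearest_neighbor(candidate, labeled_examples):
    """1-NN: return the label of the single closest stored example.

    labeled_examples: list of (feature_vector, class_label) pairs.
    """
    best = min(labeled_examples,
               key=lambda pair: math.dist(candidate, pair[0]))
    return best[1]
```

Compared with nearest class mean, this stores and scans every training example, which handles irregular class shapes but costs more memory and time per query.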

Precision vs. recall
Say we have an image database we wish to query: "Show me all images of tanks."
Precision = # of relevant documents retrieved divided by the total number of documents retrieved.
Precision = TP / (TP+FP)
PPV = positive predictive value = the probability that the target is actually present when the observer says that it is present.

Precision vs. recall
Say we have an image database we wish to query: "Show me all images of tanks."
Recall = # of relevant documents retrieved divided by the total number of relevant documents in the database.
Recall = TP / (TP+FN) = TPF
NPV = negative predictive value = TN / (TN+FN) = the probability that the target is actually absent when the observer says that it is absent.
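The three measures defined on these two slides follow directly from the four confusion-matrix counts; a small helper (count values in the test are illustrative, not from the slides):

```python
def retrieval_measures(tp, fp, tn, fn):
    """Precision (PPV), recall (TPF), and NPV from raw counts."""
    return {
        'precision': tp / (tp + fp),  # PPV
        'recall':    tp / (tp + fn),  # TPF
        'npv':       tn / (tn + fn),
    }
```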

Structural (pattern recognition) techniques
Simple features may not be enough for recognition, so structural techniques also use the relationships between these primitive features.

Same bounding box, holes, strokes, centroid, 2nd moments in row and column directions, and similar major-axis direction.

bay = intrusion of the background

lid = virtual line segment that closes the bay

Structural graphs
A graph G = (V, E), where V is a vertex set and E is an edge set.

Structural graphs
A graph G = (V, E), where V is a vertex set and E is an edge set.
V:
- S = side
- L = lake
- B = bay
E:
- CON = connection of 2 strokes
- ADJ = stroke region is immediately adjacent to a lake or bay region
- ABOVE = 1 hole (lake or bay) lies above another

Structural graph conclusions
Graph-matching techniques can be used for recognition. Or we can count occurrences of relationships and use these counts as a feature vector for statistical PR.
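The second option above, turning relationship counts into a feature vector, can be sketched as follows (the edge tuples and vertex names are hypothetical):

```python
from collections import Counter

def relation_counts(edges, relations=('CON', 'ADJ', 'ABOVE')):
    """Count occurrences of each edge relation in a structural graph.

    edges: list of (relation, vertex1, vertex2) tuples.
    Returns a fixed-order count vector usable as a feature vector
    for statistical pattern recognition.
    """
    counts = Counter(rel for rel, _, _ in edges)
    return [counts[r] for r in relations]
```

The resulting vector can then be fed to any of the statistical classifiers above (nearest class mean, nearest neighbor, decision tree).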

Structural graph homework
Create structural graphs for the following characters: X 8 C 6
Email your answers to me with the subject, "structural."

Confusion matrix

Empirical error rate = % misclassified
Empirical reject rate = % rejected

Empirical error rate = % misclassified = 25/1000 overall (does not include rejects); 5/100 for 9s
Empirical reject rate = % rejected = 7/1000
Can ROC analysis be applied?
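The arithmetic behind these rates is simple; a sketch using the counts from the slide (25 misclassified, 7 rejected, 1000 total), with rejected samples not counted as misclassified:

```python
def empirical_rates(misclassified, rejected, total):
    """Empirical error and reject rates from a labeled test set.

    Rejected samples are not counted among the misclassified.
    """
    error_rate = misclassified / total
    reject_rate = rejected / total
    return error_rate, reject_rate

err, rej = empirical_rates(misclassified=25, rejected=7, total=1000)
```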

Recall ROC analysis
TP = true positive = present and detected
TN = true negative = not present and not detected
FP = false positive = not present but detected
FN = false negative = present but not detected
True positive fraction: TPF = TP / (TP+FN) = true abnormals called abnormal by the observer
False positive fraction: FPF = FP / (FP+TN)
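Each (FPF, TPF) pair is one operating point on the ROC curve; the fractions come straight from the definitions above (the counts in the test are illustrative):

```python
def roc_point(tp, fp, tn, fn):
    """One ROC operating point, returned as (FPF, TPF)."""
    tpf = tp / (tp + fn)  # true positive fraction (recall)
    fpf = fp / (fp + tn)  # false positive fraction
    return fpf, tpf
```

Sweeping a detector's decision threshold and plotting these points traces out the full ROC curve.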

Ex. ROC analysis for the classification of '3'
TP = '3' present and '3' detected
TN = not present and not detected
FP = not present but detected
FN = present but not detected

Skip remainder of Chapter 4