Object Class Recognition Using Discriminative Local Features Gyuri Dorko and Cordelia Schmid.

Introduction This method is a two-step approach to discriminative feature selection for object-part recognition and detection. The first step extracts scale- and affine-invariant local features. The second step generates and trains a model from these features in a weakly supervised manner.

Local Descriptors Detectors: Harris-Laplace, Harris-Affine, Entropy (Kadir & Brady). Descriptor: SIFT (Scale Invariant Feature Transform).
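As a rough illustration of the feature-extraction step, the sketch below uses OpenCV's difference-of-Gaussians SIFT as a stand-in for the Harris-Laplace, Harris-Affine and entropy detectors used in the paper; the detector choice and the file name are assumptions, not the authors' setup.

```python
import cv2  # OpenCV >= 4.4 ships SIFT in the main module

# Stand-in detector: OpenCV's SIFT uses a difference-of-Gaussians detector
# rather than Harris-Laplace/Harris-Affine or the Kadir & Brady entropy
# detector, but it produces the same kind of scale-invariant keypoints
# with 128-dimensional SIFT descriptors.
image = cv2.imread("example.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical image
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(image, None)
print(descriptors.shape)  # (number of keypoints, 128)
```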

Learning This is also a two-step process. Part classifier: EM clustering in the descriptor space. Part selection: ranking by classification likelihood, or ranking by a mutual information criterion.

Learning the part classifiers From the clustering set, positive descriptors are obtained to estimate a Gaussian Mixture Model (GMM), a parametric estimate of the probability distribution of the local descriptors: p(x) = Σ_{i=1..K} p(x | C_i) P(C_i), where K is the number of Gaussian components and each p(x | C_i) is a Gaussian with mean μ_i and covariance Σ_i. The vectors x have dimension 128, corresponding to the dimension of the SIFT descriptors.
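As a minimal sketch of this step (the number of components and the descriptor array are illustrative assumptions, not values from the paper), a Gaussian mixture can be fitted to the 128-dimensional SIFT descriptors with scikit-learn, which runs EM initialized from k-means:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# descriptors: (N, 128) array of SIFT descriptors from the positive images
descriptors = np.load("positive_sift_descriptors.npy")  # hypothetical file

K = 20  # number of Gaussian components (illustrative choice)
gmm = GaussianMixture(
    n_components=K,
    covariance_type="full",
    init_params="kmeans",   # EM initialized from k-means, as in the slides
    max_iter=200,
    random_state=0,
)
gmm.fit(descriptors)
```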

Learning the part classifiers The model parameters μ_i, Σ_i and P(C_i) are computed with the expectation-maximization (EM) algorithm. EM is initialized with the output of the k-means algorithm. At each maximization (M) step the parameters are updated from the current posteriors P(C_i | x_n) over the N training descriptors, following the standard GMM updates: P(C_i) = (1/N) Σ_n P(C_i | x_n); μ_i = Σ_n P(C_i | x_n) x_n / Σ_n P(C_i | x_n); Σ_i = Σ_n P(C_i | x_n) (x_n − μ_i)(x_n − μ_i)ᵀ / Σ_n P(C_i | x_n).
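A sketch of one M-step in NumPy, assuming resp holds the E-step posteriors P(C_i | x_n) (shape (N, K)) and X holds the descriptors (shape (N, 128)); the variable names are mine, not the paper's.

```python
import numpy as np

def m_step(X, resp):
    """One EM maximization step for a Gaussian mixture.

    X:    (N, D) data matrix (here D = 128 SIFT dimensions)
    resp: (N, K) E-step posteriors P(C_i | x_n)
    """
    N, D = X.shape
    Nk = resp.sum(axis=0)                       # effective count per component
    priors = Nk / N                             # P(C_i)
    means = (resp.T @ X) / Nk[:, None]          # mu_i
    covs = np.empty((resp.shape[1], D, D))
    for i in range(resp.shape[1]):
        diff = X - means[i]                     # (N, D)
        covs[i] = (resp[:, i, None] * diff).T @ diff / Nk[i]  # Sigma_i
    return priors, means, covs
```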

Learning the part classifiers The clusters are obtained by assigning each descriptor to its closest component; they typically contain representative object parts or textures. Here we see some characteristic clusters from each database. With the mixture model a boundary is defined for each component to form K part classifiers, each associated with one Gaussian. A test feature y is assigned to the component i* with the highest probability, i* = argmax_i P(C_i | y).
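Continuing the earlier scikit-learn sketch (an assumed implementation, not the authors' code), the MAP assignment of a test descriptor to its part classifier is the argmax over the component posteriors:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def assign_part(gmm: GaussianMixture, y: np.ndarray) -> int:
    """Return the MAP component i* for a single 128-D descriptor y."""
    posteriors = gmm.predict_proba(y.reshape(1, -1))  # P(C_i | y), shape (1, K)
    return int(np.argmax(posteriors))                 # equivalently gmm.predict(...)
```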

Selection The selection ranks the components according to their ability to discriminate between the object class and the background, using one of two criteria. Classification likelihood: favours part classifiers with high true-positive and low false-positive rates. Mutual information: selects part classifiers based on their information content for separating the background from the object class.

Ranking by classification likelihood The ranking is computed over the validation set, where V^(u) contains the unlabeled (potentially positive) descriptors v_j^(u) and V^(n) the negative descriptors v_j^(n). This criterion performs selection by classification rate, so the selected components may have very low recall individually; even though these parts are individually rare, combinations of them provide sufficient recall with excellent precision. (Recall: true positives / (true positives + false negatives).)
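As a loose illustration only (a simplification of my own, not the paper's ranking formula), each component can be scored by how strongly it responds to the unlabeled positive descriptors relative to the negative ones:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def likelihood_scores(gmm: GaussianMixture,
                      V_u: np.ndarray,    # (Nu, 128) unlabeled (potentially positive) descriptors
                      V_n: np.ndarray):   # (Nn, 128) negative (background) descriptors
    """Illustrative per-component discriminability score (not the paper's formula)."""
    pos = gmm.predict_proba(V_u).sum(axis=0)   # posterior mass on the unlabeled set
    neg = gmm.predict_proba(V_n).sum(axis=0)   # posterior mass on the negative set
    return pos / (pos + neg + 1e-12)           # close to 1 => fires mostly on the object class

# ranking = np.argsort(-likelihood_scores(gmm, V_u, V_n))
```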

Ranking by mutual information Best for selecting a few discriminative, general part classifiers. Ranks part classifiers based on their information content for separating the background from the object class. The mutual information of component C_i and object class O is I(C_i; O) = Σ_c Σ_o P(c, o) log ( P(c, o) / (P(c) P(o)) ). The criterion naively assumes that all unlabeled descriptors belong to the object class.
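A small sketch of this computation from a 2×2 contingency table (component fires or not × background or object); the table formulation and the counts are illustrative assumptions, and, following the slide's naive assumption, every unlabeled descriptor is counted as an object descriptor.

```python
import numpy as np

def mutual_information(counts: np.ndarray) -> float:
    """Mutual information (in nats) of a contingency table of counts.

    counts[c, o]: number of descriptors with component indicator c (0/1)
    and class label o (0 = background, 1 = object).
    """
    joint = counts / counts.sum()              # P(c, o)
    pc = joint.sum(axis=1, keepdims=True)      # P(c)
    po = joint.sum(axis=0, keepdims=True)      # P(o)
    nz = joint > 0                             # skip zero cells to avoid log(0)
    return float((joint[nz] * np.log(joint[nz] / (pc @ po)[nz])).sum())

# Example: a part that fires on 40 of 50 object descriptors and 5 of 100 background ones
table = np.array([[95.0, 10.0],   # component does not fire: background, object
                  [ 5.0, 40.0]])  # component fires:         background, object
print(mutual_information(table))
```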

Final feature classifier Based on the ranking, the n part classifiers with the highest rank are chosen and marked as positive. The rest, both the true negatives and the non-discriminative positive ones, are marked as negative. Note that each part classifier is based on a single Gaussian component, so the MAP criterion activates only one part classifier per descriptor.
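Putting the pieces together under the same assumptions as the earlier snippets (the score array and n are illustrative), a descriptor is classified as positive exactly when its MAP component is one of the n top-ranked part classifiers:

```python
import numpy as np

def build_final_classifier(gmm, scores: np.ndarray, n: int):
    """scores: per-component ranking scores (e.g. from likelihood_scores above)."""
    positive_parts = set(np.argsort(-scores)[:n].tolist())  # top-n components

    def classify_descriptor(y: np.ndarray) -> bool:
        i_star = int(gmm.predict(y.reshape(1, -1))[0])  # MAP component
        return i_star in positive_parts                 # positive iff a selected part fires

    return classify_descriptor
```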

Applications Initial step for localization within images: the output is not binary but a ranking of the part classifications. Classification of the presence or absence of an object in an image: this requires an additional criterion, namely how many positively classified descriptors p are needed to mark the presence of an object. The authors use this task because it is easier to compare against other methods.
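A sketch of the image-level decision rule; p and the way descriptors are gathered per image are assumptions for illustration:

```python
def image_contains_object(descriptors, classify_descriptor, p: int) -> bool:
    """descriptors: iterable of 128-D local descriptors extracted from one image."""
    n_positive = sum(classify_descriptor(d) for d in descriptors)
    return n_positive >= p
```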

Experimental Results Figures: precision by detector and ranking; feature selection with increasing n.

Experimental Results Figures: ROC (Receiver Operating Characteristic) curves; true positives at the equal-error rate.

Experimental Results

Figures: selection with the entropy detector; selection results for the different feature detectors.

Thanks!