Gustavo Carneiro The Automatic Design of Feature Spaces for Local Image Descriptors using an Ensemble of Non-linear Feature Extractors.


1 Gustavo Carneiro The Automatic Design of Feature Spaces for Local Image Descriptors using an Ensemble of Non-linear Feature Extractors

2 Set of Matching Problems Visual Object Recognition Visual Class Recognition Wide-Baseline Matching

3 Set of Matching Problems 1- Design a feature space that facilitates certain matching problems SIFT [Lowe, ICCV99] Shape Context [Belongie et al., PAMI02] HOG [Dalal & Triggs, CVPR05]

4 Set of Matching Problems 2- Given a matching problem, and a set of feature spaces, combine them in order to minimize probability of error (mismatches) [Varma & Ray, CVPR07] Target Matching Problem SIFT Shape Context HOG

5 Set of Matching Problems 3- Given a matching problem, find the feature space and respective parameters θ that minimize the probability of error (mismatches) [Hua et al., ICCV07] Target Matching Problem Feature Transform 1 (θ) Feature Transform 1 (θ*) Feature Transform 2 (θ) Feature Transform 2 (θ*)

6 Set of Matching Problems 4- Given future unknown matching problems, find the feature space that minimizes the probability of error (mismatches) Feature Transform Matching Problem 1 Feature Transform Matching Problem 2 Feature Transform Matching Problem 3 Feature Transform Matching Problem 4 Feature Transform Matching Problem 5 Target Matching Problem 1 Target Matching Problem 2

7 The Universal Feature Transform Solve random and simple matching problems The more matching problems solved, the easier it becomes to solve new ones Restriction: problems should have similar feature ranges and similar class statistics

8 (Linear) Distance Metric Learning [Chopra et al., CVPR05; Goldberger et al., NIPS04; Weinberger & Saul, JMLR09] Linear transform: T ∈ R^{d×D} Image patches: x_i, x_j ∈ R^D Distance in T space: d(x_i, x_j) = ‖T x_i − T x_j‖²
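The linear DML distance above can be sketched in a few lines of numpy. The dimensions and the transform `T` here are illustrative stand-ins; in the actual method T would be learned from matching/non-matching patch pairs rather than drawn at random:

```python
import numpy as np

# Hypothetical dimensions: D-dim patches projected to a d-dim feature space.
rng = np.random.default_rng(0)
D, d = 64, 8
T = rng.standard_normal((d, D))  # stand-in for a learned linear transform

def dml_distance(xi, xj, T):
    """Squared Euclidean distance between two patches in the T-space."""
    diff = T @ xi - T @ xj
    return float(diff @ diff)

xi, xj = rng.standard_normal(D), rng.standard_normal(D)
print(dml_distance(xi, xj, T))  # non-negative scalar
```

Because the distance is computed after a single matrix multiply per patch, the projections T·x can be cached once per descriptor and reused across all pair comparisons.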

9 (Non-Linear) Distance Metric Learning [Sugiyama, JMLR07] Rewrite the between-class scatter S^(b) and the within-class scatter S^(w) in terms of pairwise data terms The feature transform is obtained from the generalized eigenvalue problem S^(b) φ = λ S^(w) φ Replacing each dot product with a non-linear kernel function yields the non-linear feature transform
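A minimal sketch of the linear version of this generalized-eigenvalue step on toy 2-D data, using scipy. The scatter matrices follow the standard Fisher definitions; the kernelized variant on the slide would replace the raw data with kernel evaluations, which is not shown here:

```python
import numpy as np
from scipy.linalg import eigh

# Toy data: two Gaussian classes in 2-D (illustrative, not the paper's patches).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

mean = X.mean(axis=0)
Sb = np.zeros((2, 2))  # between-class scatter S^(b)
Sw = np.zeros((2, 2))  # within-class scatter S^(w)
for c in np.unique(y):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)
    Sb += len(Xc) * np.outer(mc - mean, mc - mean)
    Sw += (Xc - mc).T @ (Xc - mc)

# Generalized eigenvalue problem S^(b) phi = lambda S^(w) phi;
# eigh returns eigenvalues in ascending order, so reverse for the
# most discriminative directions first.
vals, vecs = eigh(Sb, Sw)
T = vecs[:, ::-1].T  # rows = eigenvectors, sorted by decreasing eigenvalue
```

Projecting the data onto the leading row of T separates the two class means, which is exactly the property the feature transform is chosen to maximize.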

10 Linear vs Non-linear DML LINEAR NON-LINEAR Points from the same class collapse together, and different classes end up far from each other Points not belonging to any class collapse at the origin

11 Intuition Train several feature transforms – Random matching problems Aggregate distances [Breiman 01] – Threshold-based classifier
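The aggregation step can be sketched as follows. The number of feature spaces `M`, the random transforms, and the threshold `tau` are all hypothetical placeholders for the trained ensemble:

```python
import numpy as np

rng = np.random.default_rng(2)
D, d, M = 32, 4, 25  # patch dim, feature-space dim, number of feature spaces

# Hypothetical ensemble: M independently trained transforms (random here,
# standing in for transforms learned on random matching problems).
transforms = [rng.standard_normal((d, D)) for _ in range(M)]

def aggregated_distance(xi, xj, transforms):
    """Average the per-space distances, bagging-style [Breiman 01]."""
    return float(np.mean([np.linalg.norm(T @ xi - T @ xj) for T in transforms]))

def match(xi, xj, transforms, tau):
    """Threshold-based classifier: declare a match if the aggregated
    distance falls below tau."""
    return aggregated_distance(xi, xj, transforms) < tau
```

Averaging rather than learning a combination is what lets the ensemble be applied to new matching problems without re-training.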

12 Intuition Unknown target problem Random training problem 1 Small dist. Large dist. ROC Aggregated distances

13 Intuition Unknown target problem Random training problem 2 Small dist. Large dist. ROC Aggregated distances

14 Toy Example Combining 100 feature spaces... Error decreases with the number of feature spaces, regardless of the error of each individual space UFT Original NLMSL trained
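A toy simulation of the same effect, under the assumption that each feature space produces a noisy distance estimate with matching pairs centered at 1 and non-matching pairs at 2 (all numbers here are invented for illustration). Averaging more spaces shrinks the variance of the aggregated distance, so the error of the threshold classifier drops even though each individual space is weak:

```python
import numpy as np

rng = np.random.default_rng(3)

def error_rate(n_spaces, n_pairs=5000, sigma=1.0, tau=1.5):
    """Error of a threshold classifier on distances averaged over n_spaces
    noisy feature spaces; sigma makes each single space a weak classifier."""
    d_match = 1 + sigma * rng.standard_normal((n_pairs, n_spaces))
    d_non = 2 + sigma * rng.standard_normal((n_pairs, n_spaces))
    agg_match = d_match.mean(axis=1)  # aggregated distance per pair
    agg_non = d_non.mean(axis=1)
    # Misclassified: matching pair above the threshold, or non-matching below.
    return 0.5 * ((agg_match > tau).mean() + (agg_non < tau).mean())

for m in (1, 10, 100):
    print(m, error_rate(m))
```

The printed error falls steeply as the number of averaged spaces grows, mirroring the trend in the slide's plot.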

15 Experiments Dataset for training [Winder & Brown, CVPR07]: – Backprojecting 3D points to 2D images from scene reconstructions – Variations in scene location, brightness and partial occlusion – Similar pre-processing to [Winder & Brown, CVPR07] – Train: all patch classes from the Trevi & Yosemite datasets – Test: 50K matching and 50K non-matching pairs from the Notre Dame dataset

16 Experiments Using cross validation – 50 training classes for training each feature space – 50 training feature spaces Error decreases with the number of feature spaces, regardless of the error in each space Error @95% TP: UFT 2.28%, SIFT 6.3%

17 Experiments Matching database [Mikolajczyk & Schmid, PAMI05]

18 Conclusion Competitive performance Simple ensemble classifier (can be efficiently implemented) Adapts to new classification problems (no re-training)

19 Linear vs Non-linear DML LINEAR NON-LINEAR 10 runs, 100 points per class Classifier: threshold matching Non-linear: low bias, high variance Linear: high bias, low variance

20 Combining Feature Spaces Breiman’s idea about ensemble classifiers [Breiman 01]: – combine low-bias, high-variance (unstable) classifiers to produce low-bias, low-variance classifiers

