Learning over Sets using Boosted Manifold Principal Angles (BoMPA)
Tae-Kyun Kim, Ognjen Arandjelović, Roberto Cipolla
Engineering Department, University of Cambridge

Introduction
The proposed concepts are illustrated on sets of face images extracted from video:
- Face texture and surface are smooth, constraining face appearance to a manifold.
- Recognition under varying illumination is appreciated as a very difficult problem.

The main contributions of this work:
- A framework for efficient matching of sets of patterns.
- A proposed algorithm centred on the idea of modelling non-linear subspaces.
- A multimodal extension that is more flexible and accurate than previous ones.
- Experimental evaluation.

From Sets to Subspaces
Key problems:
i. Representation of potentially large sets (see Figure 1).
ii. Efficiency in set comparisons.
iii. Robustness to noise, outliers and limited data.

Our approach:
i. Consider the subspaces that the sets are confined to.
ii. Use mixture models to capture non-linearities and intrinsic data dimensionality.
iii. Find the most similar modes of data variation using principal angles.

Figure 1. Face vector sets: samples from two typical face sets used to illustrate the concepts proposed in this paper (top) and the corresponding patterns in the 3D principal component subspaces (bottom). The sets capture appearance changes of the faces of two different individuals as they performed unconstrained head motion in front of a fixed camera.

Principal angles between linear subspaces
Key properties:
- Can be seen as finding nearest neighbours over subspaces (see Figure 2).
- Numerically stable and robust to noise.

Definition: the i-th principal angle θ_i between subspaces U and V satisfies
cos θ_i = max_{u ∈ U} max_{v ∈ V} uᵀv, subject to ‖u‖ = ‖v‖ = 1 and uᵀu_j = vᵀv_j = 0 for all j < i.

Conclusions
- Described a novel method for pattern set-based discrimination.
- Principal angles are a theoretically well justified concept for the comparison of subspaces.
- The proposed algorithms for learning the optimal principal angle weighting, and for fusing global and local manifold behaviour, significantly improve recognition results.

Figure 4.
MSM, BPA and MPA: (left) The first 3 principal vectors between two linear subspaces which MSM incorrectly classifies as corresponding to the same person. Despite the different identities, the most similar modes of variation are very much alike and can be seen to correspond to especially difficult illuminations. (centre) Boosted Principal Angles (BPA), on the other hand, chooses different principal vectors as the most discriminating; these modes of variation are now less similar between the two sets. (right) Modelling the nonlinear manifolds corresponding to the two image sets produces a further improvement: local information is well captured and the principal vectors are now very dissimilar.

Figure 2. Principal vectors in MSM: the first 3 pairs (top and bottom rows) of principal vectors for a comparison of two linear subspaces corresponding to the same (left) and different (right) individuals. In the former case, the most similar modes of pattern variation, represented by the principal vectors, are very much alike in spite of the different illumination conditions used in data acquisition.

Application-optimal principal angle fusion
Different principal angles carry varying amounts of information for discrimination between classes. This varies from application to application, i.e. with the semantics of the sets that represent different classes.

Key ideas:
- Learn how to optimally combine principal angles.
- Employ AdaBoost, with each weak learner based on a single principal angle.
- Train on random draws of in-class and out-of-class subsets.

The proposed learning scheme reveals interesting results on face data; see Figures 3 and 4.

Figure 3. Boosted Principal Angles: (a) A typical set of weights corresponding to weak principal angle-based classifiers, obtained using AdaBoost. This figure confirms our criticism of MSM-based methods for (i) their simplistic fusion of information from different principal angles and (ii) the use of only the first few angles; see Section 1.1.
(b) The average performance of a simple MSM classifier and our boosted variant.

Nonlinear manifolds
Pattern variations within and between sets are often highly nonlinear.

Key ideas:
- Use a mixture of Probabilistic PCA models to capture locally linear variations within a set.
- Define manifold proximity as a weighted combination of the similarity of the global and the most similar local modes of variation.
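Principal angles between two linear subspaces are typically computed via the singular value decomposition of the product of their orthonormal basis matrices. A minimal NumPy sketch of this standard computation (the helper names and the random toy data are illustrative, not from the paper):

```python
import numpy as np

def orthonormal_basis(X, dim):
    """Orthonormal basis of the subspace spanned by the set X (one data
    vector per column), truncated to the leading `dim` PCA directions."""
    Xc = X - X.mean(axis=1, keepdims=True)   # mean-centre the set
    U, _, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :dim]

def principal_angles(U, V):
    """Principal angles (radians, ascending) between subspaces with
    orthonormal bases U, V: singular values of U^T V are cos(theta_i)."""
    s = np.linalg.svd(U.T @ V, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Toy example: two 3-dimensional subspaces of R^100, each estimated from
# a small random "image set" of 20 vectors.
rng = np.random.default_rng(0)
A = orthonormal_basis(rng.standard_normal((100, 20)), dim=3)
B = orthonormal_basis(rng.standard_normal((100, 20)), dim=3)
angles = principal_angles(A, B)
```

The smallest angle measures the similarity of the most similar modes of variation of the two sets, which is what MSM-style methods threshold on.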

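The two key ideas of the method — learned per-angle weights and a global/local similarity fusion — can be sketched as follows. This is a hedged illustration only: the weight values, the fusion coefficient `alpha`, and all function names are placeholders, not quantities from the paper.

```python
import numpy as np

def weighted_angle_similarity(angles, weights):
    """Similarity from principal angles with learned per-angle weights,
    generalising the unweighted use of only the first few cosines."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return float(np.sum(weights * np.cos(angles) ** 2))

def fused_similarity(global_angles, local_angle_sets, weights, alpha=0.5):
    """Weighted combination of the global subspace similarity and the
    best-matching pair of local (mixture-component) subspaces."""
    s_global = weighted_angle_similarity(global_angles, weights)
    s_local = max(weighted_angle_similarity(a, weights)
                  for a in local_angle_sets)
    return alpha * s_global + (1.0 - alpha) * s_local

# Toy example: 3 principal angles per comparison, two local component pairs.
w = [0.5, 0.3, 0.2]                        # placeholder "boosted" weights
g = np.array([0.2, 0.6, 1.1])              # global principal angles (radians)
local_sets = [np.array([0.1, 0.3, 0.9]),
              np.array([0.4, 0.8, 1.2])]
score = fused_similarity(g, local_sets, w, alpha=0.5)
```

Because the weights are normalised and cos² is bounded by [0, 1], the fused score is also in [0, 1]; identical subspaces (all angles zero) give a similarity of 1.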