
Slide 1 (of 19): Fast Visual Object Identification and Categorization
Michael Grabner, Helmut Grabner, Horst Bischof
Institute for Computer Graphics and Vision, Graz University of Technology, Austria
NIPS 2005 Workshop on Interclass Transfer: "why learning to recognize many objects is easier than learning to recognize just one"

Slide 2 (of 19): Agenda
- Motivation
- Approach
- Experimental Illustration
- Results
- Outlook

Slide 3 (of 19): Problem
- How can large-scale object recognition be handled in adequate time?
- How can existing knowledge be used for incremental learning from few examples?
(Image databases: Ferencz, Yale, Buffalo)

Slide 4 (of 19): Identification vs. Categorization
- Categorization: Faces, Writings, Cars, ...
- Identification: "Horst boring", "Joe wondering", "Horst laughing", "Bill's car", "Zip Code 77840"

Slide 5 (of 19): Identification and Categorization
- Example hierarchy: Faces (Horst, Helmut, Joe), Cars (Car 1, Car 2, Car 3, Car 4), Writings (ZIP Codes, Places), with finer levels such as "wondering" and "tired"
- Identification depends on the granularity of categorization

Slide 6 (of 19): Our Approach, the "Object Memory"
- Hierarchical: objects are stored in a hierarchical structure
- Incremental: objects can be added incrementally to the structure
- Fast: identification of objects is done efficiently

Slide 7 (of 19): Features
- Two types of features: Haar-like features (Viola and Jones 2001) and orientation histograms
- Advantages: they encode gradient information (Lowe 2004, Edelman 1997), and their fast computation allows a large number of features to be extracted, which leads to robustness (Porikli 2005, Grabner 2005); see the sketch after this list
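As a rough illustration of why such features are cheap to evaluate, the sketch below computes a two-rectangle Haar-like response from an integral image. The rectangle layout and the function names are illustrative choices, not taken from the talk.

```python
import numpy as np

def integral_image(img):
    """Cumulative sums over rows and columns; any rectangle sum can then
    be read off with at most four array lookups."""
    return img.astype(np.float64).cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] computed from the integral image ii."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0:
        total -= ii[r0 - 1, c1 - 1]
    if c0 > 0:
        total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

def haar_two_rect(ii, r, c, h, w):
    """One possible two-rectangle Haar-like feature: difference between
    the sums of the left and right halves of a window."""
    half = w // 2
    left = rect_sum(ii, r, c, r + h, c + half)
    right = rect_sum(ii, r, c + half, r + h, c + w)
    return left - right
```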

Slide 8 (of 19): Integral Orientation Histogram
F. Porikli, "Integral Histograms: A Fast Way to Extract Histograms in Cartesian Spaces", in Proc. CVPR 2005. A sketch of the idea follows below.
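A minimal sketch of the integral-histogram idea, assuming gradient orientations quantized into a fixed number of bins with magnitude weighting. The function names and the choice of unsigned orientation are assumptions, not details from the paper.

```python
import numpy as np

def integral_orientation_histogram(img, n_bins=8):
    """One integral image per orientation bin: after this, the orientation
    histogram of any rectangle costs four lookups per bin."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)                 # unsigned orientation
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    ih = np.zeros(img.shape + (n_bins,))
    for b in range(n_bins):
        ih[:, :, b] = (mag * (bins == b)).cumsum(axis=0).cumsum(axis=1)
    return ih

def region_histogram(ih, r0, c0, r1, c1):
    """Orientation histogram of img[r0:r1, c0:c1] from the integral data."""
    h = ih[r1 - 1, c1 - 1].copy()
    if r0 > 0:
        h -= ih[r0 - 1, c1 - 1]
    if c0 > 0:
        h -= ih[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        h += ih[r0 - 1, c0 - 1]
    return h
```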

Slide 9 (of 19): Feature Selection
- Goal: distinguish between objects by selecting discriminative features from a feature pool
- Learn a distance function (Ferencz 2005) from pairs of the same object ("same" pairs) and pairs of different objects ("different" pairs); a sketch of building such pairs follows below
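One way to read this slide is that per-feature distances between two image patches are computed and labelled by whether the patches show the same object. The sketch below builds such a labelled pair set; the absolute-difference distance and all names are assumptions. These labelled vectors are what the boosting step on the next slide can select features from.

```python
import numpy as np
from itertools import combinations

def pair_vector(feat_a, feat_b):
    """Per-feature distance between two patches, one entry per feature
    in the pool (absolute difference is an assumed choice)."""
    return np.abs(np.asarray(feat_a, float) - np.asarray(feat_b, float))

def build_pairs(features_by_object):
    """Build labelled training pairs: +1 for two views of the same object,
    -1 for views of different objects. `features_by_object` maps an object
    id to a list of feature vectors (one per training view)."""
    X, y = [], []
    ids = list(features_by_object)
    for o in ids:                                    # "same" pairs
        for a, b in combinations(features_by_object[o], 2):
            X.append(pair_vector(a, b)); y.append(+1)
    for o1, o2 in combinations(ids, 2):              # "different" pairs
        for a in features_by_object[o1]:
            for b in features_by_object[o2]:
                X.append(pair_vector(a, b)); y.append(-1)
    return np.array(X), np.array(y)
```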

Slide 10 (of 19): Boosting for Feature Selection (Viola and Jones 2001)
1. A weak classifier corresponds to a single feature.
2. Perform boosting to select N features.
3. The final strong classifier is a linear combination of the selected weak classifiers; the selected features form the object model.
(A minimal AdaBoost sketch follows below.)
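A minimal discrete AdaBoost sketch in which each weak learner is a threshold on one feature column, so selecting weak learners amounts to selecting features. The exhaustive threshold search and all names are illustrative; labels are assumed to be in {-1, +1}, and no attempt is made at efficiency.

```python
import numpy as np

def boost_select_features(X, y, n_select):
    """Select n_select features by discrete AdaBoost over threshold stumps.
    Returns the chosen feature indices, voting weights, thresholds and
    polarities that together form the strong classifier."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                       # sample weights
    selected, alphas, thresholds, polarities = [], [], [], []
    for _ in range(n_select):
        best = None
        for f in range(d):                        # brute-force stump search
            for thr in np.unique(X[:, f]):
                for pol in (+1, -1):
                    pred = np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, f, thr, pol, pred)
        err, f, thr, pol, pred = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)     # weak-classifier weight
        w *= np.exp(-alpha * y * pred)            # re-weight samples
        w /= w.sum()
        selected.append(f); alphas.append(alpha)
        thresholds.append(thr); polarities.append(pol)
    return selected, alphas, thresholds, polarities

def strong_classify(x, selected, alphas, thresholds, polarities):
    """Linear combination of the selected weak classifiers; the sign is the
    decision and the magnitude can serve as a confidence."""
    return sum(a * (1 if p * (x[f] - t) >= 0 else -1)
               for f, a, t, p in zip(selected, alphas, thresholds, polarities))
```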

Slide 11 (of 19): Building the "Object Memory"
- Initialization: two objects form a single layer
- Adding a novel object: evaluate its samples starting at the highest layer
  - If the sample cannot be modeled by any of the classifiers: add the object to the current layer
  - If the sample can be modeled by one of the classifiers: go deeper; if that classifier has no child, initialize a new layer
- Retrain the current layer to distinguish between these models, and retrain the parents to obtain generic object models in the higher layers
- This generates layers of similar objects and learns to differentiate between these similar objects (an insertion sketch follows below)
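The insertion heuristic described above could look roughly like the sketch below. `accepts` stands in for the (unspecified) test of whether an existing classifier already models the new samples, and the retraining of the touched layer and its parents mentioned on the slide is deliberately left out.

```python
class MemoryNode:
    """One entry of a layer: a boosted object model plus an optional
    child layer of finer (more similar) objects."""
    def __init__(self, model, obj_id):
        self.model = model        # strong classifier for this object/group
        self.obj_id = obj_id
        self.children = []        # deeper layer, initially empty

def add_object(layer, samples, new_node, accepts):
    """Insert a new object following the slide's heuristic. `layer` is a
    list of MemoryNode; `accepts(model, samples)` is an assumed predicate
    telling whether an existing model covers the new samples."""
    for node in layer:
        if accepts(node.model, samples):
            if not node.children:
                # Matching classifier has no child: initialize a new layer
                # below it (the slide leaves its exact contents unspecified).
                node.children = [new_node]
            else:
                # Go deeper and try to place the object in the child layer.
                add_object(node.children, samples, new_node, accepts)
            return
    # No classifier in this layer models the sample: add to the current layer.
    layer.append(new_node)
    # Retraining of this layer (and of the parents, to obtain generic models
    # in higher layers) would happen here; it is omitted in this sketch.
```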

Slide 12 (of 19): Building the "Object Memory", Training
- On-line illustration of training the object memory (MATLAB demo)

Slide 13 (of 19): Identification Process
- Evaluate the sample starting at the highest level
- Multi-path evaluation based on model confidences
- Post-processing (e.g. take the reference model with the highest confidence)
- Note: evaluation is fast thanks to the integral data structures (a traversal sketch follows below)
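Using the MemoryNode structure sketched earlier, a multi-path evaluation could look like the following. `confidence` and the threshold are assumed interfaces, and the post-processing step simply keeps the reference model with the highest confidence, as the slide suggests.

```python
def identify(layer, sample, confidence, threshold=0.0):
    """Descend into every branch whose model confidence exceeds the threshold
    (multi-path evaluation) and collect (obj_id, confidence) hypotheses."""
    hits = []
    for node in layer:
        c = confidence(node.model, sample)
        if c < threshold:
            continue
        if node.children:
            hits.extend(identify(node.children, sample, confidence, threshold))
        else:
            hits.append((node.obj_id, c))
    return hits

# Post-processing: keep the reference model with the highest confidence.
# best_id, best_conf = max(identify(memory, sample, confidence), key=lambda h: h[1])
```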

Slide 14 (of 19): Identification Process, Evaluation
- On-line illustration of evaluating the object memory (MATLAB demo)

Slide 15 (of 19): Experiments, Overview
- Experiment 1: illustration of the approach; 3 categories (Cars, Faces, Writings); training with 6 images per object; model complexity of 30 features
- Experiment 2: performance evaluation on the category Cars; varying number of objects and model complexity

Slide 16 (of 19): Experiment 1 – Trained Object Memory

Slide 17 (of 19): Experiment 2
- Experiment on the Car database (Ferencz); 6 samples for training (constant)
- Recall-precision curves (RPC) obtained by varying the confidence threshold
- Plots: variation of model complexity (30 objects) and variation of the number of objects (15 features)

Slide 18 (of 19): Conclusion and Outlook
- Conclusion: hierarchical structuring of objects by a simple heuristic; incremental addition of novel objects from few examples; fast identification
- Outlook: more objects; fast and efficient retraining (on-line boosting for model updates); detection, tracking, and recognition within one framework, with all tasks performed on the same types of features

Slide 19 (of 19): Thank you for your attention!

