1
Object Recognition using Local Descriptors Javier Ruiz-del-Solar, and Patricio Loncomilla Center for Web Research Universidad de Chile

2
Outline Motivation & Recognition Examples Dimensionality problems Object Recognition using Local Descriptors Matching & Storage of Local Descriptors Conclusions

3
Motivation Object recognition approaches based on local invariant descriptors (features) have become increasingly popular and have developed impressively in recent years. Invariance against: scale, in-plane rotation, partial occlusion, partial distortion, and partial change of viewpoint. The recognition process consists of two stages: 1. scale-invariant local descriptors (features) of the observed scene are computed; 2. these descriptors are matched against descriptors of object prototypes already stored in a model database. These prototypes correspond to images of the objects under different viewing angles.

4
Recognition Examples (1/2)

5
Recognition Examples (2/2)

6
Image Matching Examples (1/2)

7
Image Matching Examples (2/2)

8
Some applications Object retrieval in multimedia databases (e.g. Web) Image retrieval by similarity in multimedia databases Robot self-localization Binocular vision Image alignment and matching Movement compensation …

9
However … there are some problems Dimensionality problems: a given image can produce ~100-1,000 descriptors of 128 components (real values), and the model database can contain up to 1,000-10,000 objects in some special applications => large number of comparisons => long processing times => large database sizes. Main motivation of this talk: to get some ideas about how to compare local descriptors efficiently, and how to store them efficiently.
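As a rough illustration of the scale of the problem, using the slide's upper bounds (the per-object descriptor count is our assumption, not given on the slide):

```python
# Back-of-envelope cost of brute-force descriptor matching.
descriptors_per_image = 1_000      # upper bound per scene image (from the slide)
objects_in_db = 10_000             # upper bound on database size (from the slide)
descriptors_per_object = 1_000     # assumption: similar to a scene image

db_descriptors = objects_in_db * descriptors_per_object
comparisons = descriptors_per_image * db_descriptors   # one brute-force pass
storage_bytes = db_descriptors * 128 * 4               # 128 floats, 4 bytes each

print(f"{comparisons:.1e} distance computations per query image")
print(f"{storage_bytes / 2**30:.2f} GiB of descriptor storage")
```

Ten billion distance computations per query image makes brute force impractical, which is why the talk turns to kd-trees and approximate search.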

10
Recognition Process The recognition process consists of two stages: 1. scale-invariant local descriptors (features) of the observed scene are computed; 2. these descriptors are matched against descriptors of object prototypes already stored in a model database. These prototypes correspond to images of the objects under different viewing angles.

11
Recognition pipeline (block diagram). Online recognition: Input Image -> Interest Points Detection -> Scale Invariant Descriptors (SIFT) Calculation -> SIFT Matching (against the SIFT Database) -> Affine Transform Calculation -> Affine Transform Parameters. Offline Database Creation: Reference Image -> Interest Points Detection -> Scale Invariant Descriptors (SIFT) Calculation -> SIFT Database.

12
(Recognition pipeline diagram repeated, highlighting the Interest Points Detection stage.)

13
Interest Points Detection (1/2) Interest points correspond to maxima of the SDoG (Subsampled Difference of Gaussians) scale-space (x, y, σ). Scale Space / SDoG. Ref: Lowe 1999

14
Interest Points Detection (2/2) Examples of detected interest points. Our improvement: Subpixel location of interest points by a 3D quadratic approximation around the detected interest point in the scale-space.
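The 3D quadratic refinement can be sketched as follows: fit D(x) ≈ D + gᵀx + ½xᵀHx around the detected extremum, with gradient g and Hessian H from finite differences, and take the offset x̂ = -H⁻¹g. This is the standard formulation of sub-pixel refinement; the slide does not give the exact scheme the authors use.

```python
import numpy as np

def subpixel_offset(cube):
    """Sub-pixel/sub-scale refinement of an extremum. `cube` is the 3x3x3
    scale-space neighborhood around the detected maximum (axis order is
    our choice: scale, y, x). Returns the offset x_hat = -H^-1 g."""
    D = cube
    # Central-difference gradient at the center voxel.
    g = np.array([(D[2,1,1] - D[0,1,1]) / 2,
                  (D[1,2,1] - D[1,0,1]) / 2,
                  (D[1,1,2] - D[1,1,0]) / 2])
    # Finite-difference Hessian at the center voxel.
    H = np.empty((3, 3))
    H[0,0] = D[2,1,1] - 2*D[1,1,1] + D[0,1,1]
    H[1,1] = D[1,2,1] - 2*D[1,1,1] + D[1,0,1]
    H[2,2] = D[1,1,2] - 2*D[1,1,1] + D[1,1,0]
    H[0,1] = H[1,0] = (D[2,2,1] - D[2,0,1] - D[0,2,1] + D[0,0,1]) / 4
    H[0,2] = H[2,0] = (D[2,1,2] - D[2,1,0] - D[0,1,2] + D[0,1,0]) / 4
    H[1,2] = H[2,1] = (D[1,2,2] - D[1,2,0] - D[1,0,2] + D[1,0,0]) / 4
    return -np.linalg.solve(H, g)
```

For a true quadratic peak the recovered offset is exact; in practice the offset is clamped and the refinement iterated if it exceeds half a pixel.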

15
(Recognition pipeline diagram repeated, highlighting the Scale Invariant Descriptors (SIFT) Calculation stage.)

16
SIFT Calculation For each obtained keypoint, a descriptor or feature vector that considers the gradient values around the keypoint is computed. These descriptors are called SIFT (Scale-Invariant Feature Transform) descriptors. SIFTs provide invariance to scale and orientation. Ref: Lowe 2004

17
(Recognition pipeline diagram repeated, highlighting the SIFT Matching stage.)

18
SIFT Matching The Euclidean distance between the SIFT descriptors (vectors) is employed.
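A brute-force version of this matching step might look as follows (illustrative only; later slides replace the linear scan with a kd-tree, and the 2-D vectors in the test stand in for 128-D SIFT descriptors):

```python
import numpy as np

def nearest_two(query, database):
    """Return the indices and Euclidean distances of the first and second
    nearest descriptors in `database` (one row per descriptor) to `query`."""
    dists = np.linalg.norm(database - query, axis=1)
    first, second = np.argsort(dists)[:2]
    return first, second, dists[first], dists[second]
```

The second-nearest distance is needed for the acceptance test described later in the deck.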

19
(Recognition pipeline diagram repeated, highlighting the Affine Transform Calculation stage.)

20
Affine Transform Calculation (1/2) Several stages are employed: 1. Object Pose Prediction: in the pose space, a Hough transform is employed to obtain a coarse prediction of the object pose, using each matched keypoint to vote for all object poses consistent with it. A candidate object pose is obtained if at least 3 entries fall in a Hough bin. 2. Affine Transformation Calculation: a least-squares procedure is employed to find an affine transformation that correctly accounts for each obtained pose.
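The least-squares step of stage 2 can be sketched as follows, stacking each match (x, y) -> (x', y') into a linear system for the six affine parameters (a hypothetical minimal implementation, not the authors' code):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping point set `src` to `dst`
    (at least 3 matches). Solves dst ~ A @ p + t for a 2x2 matrix A and
    translation t, in the usual stacked linear form."""
    n = len(src)
    M = np.zeros((2 * n, 6))
    b = np.asarray(dst, float).ravel()
    for i, (x, y) in enumerate(src):
        M[2 * i]     = [x, y, 1, 0, 0, 0]   # row for x'
        M[2 * i + 1] = [0, 0, 0, x, y, 1]   # row for y'
    p, *_ = np.linalg.lstsq(M, b, rcond=None)
    A = p[[0, 1, 3, 4]].reshape(2, 2)
    t = p[[2, 5]]
    return A, t
```

With three exact matches the system is determined; with more, least squares averages out keypoint localization noise.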

21
Affine Transform Calculation (2/2) 3. Affine Transformation Verification Stages: Verification using a probabilistic model (Bayes classifier). Verification based on Geometrical Distortion. Verification based on Spatial Correlation. Verification based on Graphical Correlation. Verification based on the Object Rotation. 4. Transformations Merging based on Geometrical Overlapping. In blue: the verification stages we propose for improving the detection of robot heads.

22
Input Image Reference Images AIBO Head Pose Detection Example

23
Matching & Storage of Local Descriptors Each reference image gives a set of keypoints. Each keypoint has a graphical descriptor, which is a 128-component vector. All the (keypoint, vector) pairs corresponding to a set of reference images are stored in a set T: T = { (x, y, n, v1 v2 … v128)(1), (x, y, n, v1 v2 … v128)(2), (x, y, n, v1 v2 … v128)(3), (x, y, n, v1 v2 … v128)(4), … }

24
Matching & Storage of Local Descriptors In more compact notation, each (keypoint, descriptor) pair of a reference image is written (pi, di), so that T = { (p1, d1), (p2, d2), (p3, d3), (p4, d4), … }

25
Matching & Storage of Local Descriptors In the matching-generation stage, an input image gives another set of keypoints and vectors. For each input descriptor d (at keypoint p), the first and second nearest descriptors in T, d_FIRST and d_SEC, must be found. A pair of nearest descriptors (d, d_FIRST) then gives a pair of matched keypoints (p, p_FIRST).

26
Matching & Storage of Local Descriptors The match is accepted if the ratio between the distance to the first nearest descriptor and the distance to the second nearest descriptor is lower than a given threshold τ, i.e. the match is accepted if: distance(d, d_FIRST) < τ · distance(d, d_SEC). This indicates that there is no possible confusion in the search results.
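In code, this acceptance rule is one line (the threshold value is not given on the slide; 0.8 is the value suggested in Lowe 2004):

```python
def accept_match(d_first, d_second, ratio=0.8):
    """Distance-ratio test: accept a match only when the nearest descriptor
    is clearly closer than the second nearest (ratio from Lowe 2004)."""
    return d_first < ratio * d_second
```

An ambiguous query, whose two best matches are at similar distances, is rejected rather than risk a wrong correspondence.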

27
Storage: Kd-trees A way to store the set T in an ordered fashion is a kd-tree; in this case we use a 128-d tree. As is well known, in this kd-tree the elements are stored in the leaves; the other nodes are divisions of the space in some dimension. Example tree: a root division node (dim 1 > 2), with division nodes (dim 2 > 3) and (dim 2 > 5) below it, and storage (leaf) nodes (1,3), (2,7), (6,5), (8,9). All the vectors with first component greater than 2 are stored on the right side of the root.

28
Storage: Kd-trees Generation of balanced kd-trees: we have a set of vectors a = (a1, a2, …), b = (b1, b2, …), c = (c1, c2, …), d = (d1, d2, …), …, and we calculate the mean and variance of each dimension i.

29
Storage: Kd-trees Tree construction: 1. Select the dimension i_MAX with the largest variance. 2. Order the vectors with respect to the i_MAX dimension. 3. Select the median M in this dimension and create a division node (i_MAX > M): nodes with i_MAX component less than M go to one side, nodes with i_MAX component greater than M to the other. 4. Repeat the process recursively.
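The construction steps above can be sketched as follows (a minimal recursive implementation; the field names and the tie-handling at the median are our choices, not the authors'):

```python
import numpy as np

class Node:
    def __init__(self, dim=None, median=None, left=None, right=None, point=None):
        self.dim, self.median = dim, median      # division-node fields
        self.left, self.right = left, right
        self.point = point                       # leaf field (stored vector)

def build_kdtree(points):
    """Balanced kd-tree as on the slide: split on the largest-variance
    dimension at its median; vectors are stored only in the leaves."""
    points = np.asarray(points, float)
    if len(points) == 1:
        return Node(point=points[0])
    dim = int(np.argmax(points.var(axis=0)))     # largest-variance dimension
    points = points[points[:, dim].argsort()]    # order along that dimension
    mid = len(points) // 2
    median = points[mid - 1, dim]                # left half: values <= median
    return Node(dim=dim, median=median,
                left=build_kdtree(points[:mid]),
                right=build_kdtree(points[mid:]))
```

Because every split sends half the vectors to each side, the tree depth is logarithmic in |T|.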

30
Search Process Search process for the nearest neighbors; two alternatives: 1. Compare almost all the descriptors in T with the given descriptor and return the nearest one, or 2. Compare at most Q nodes and return the nearest of them (compare = calculate the Euclidean distance). The second alternative requires a good search strategy and can fail, but the failure probability is controllable through Q. We choose the second option and use the BBF (Best-Bin-First) algorithm.

31
Search Process: BBF Algorithm
Set:
v: the query vector.
Q: a priority queue ordered by distance to v (initially empty).
r: initially the root of T.
v_FIRST: initially undefined, at infinite distance from v.
ncomp: the number of comparisons, initially zero.
While (!finish):
Search for v in T starting from r, arriving at a leaf c.
Add all the directions not taken during the descent to Q, in order (each division node on the path gives one not-taken direction).
If c is nearer to v than v_FIRST, then v_FIRST = c.
Make r = the first node in Q (the nearest to v); ncomp++.
If distance(r, v) > distance(v_FIRST, v), finish = 1.
If ncomp > ncomp_MAX, finish = 1.
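The loop above can be sketched in a self-contained form (the tuple node representation and minor details are our choices; the priority of a queued branch is its one-dimensional distance to the splitting plane, as is standard for BBF):

```python
import heapq
import numpy as np

# Minimal kd-tree: a leaf is ('leaf', vector); an inner node is
# ('split', dim, median, left_child, right_child).

def bbf_search(root, v, max_comparisons=10):
    """Best-Bin-First nearest-neighbor search: explore leaves in order of
    their branch's distance bound to the query, stopping after at most
    max_comparisons leaf comparisons."""
    v = np.asarray(v, float)
    best, best_dist = None, np.inf
    queue = [(0.0, 0, root)]              # (distance bound, tiebreak, node)
    tiebreak, comparisons = 1, 0
    while queue and comparisons < max_comparisons:
        bound, _, node = heapq.heappop(queue)
        if bound > best_dist:             # no closer leaf can remain
            break
        while node[0] == 'split':         # descend, queueing each
            _, dim, median, left, right = node   # not-taken direction
            diff = v[dim] - median
            node, other = (right, left) if diff > 0 else (left, right)
            heapq.heappush(queue, (abs(diff), tiebreak, other))
            tiebreak += 1
        dist = np.linalg.norm(node[1] - v)  # reached a leaf: one comparison
        comparisons += 1
        if dist < best_dist:
            best, best_dist = node[1], dist
    return best, best_dist
```

On the deck's worked example (query vector (20, 8)), this visits the leaf (9,1000) first, then (20,7), and finishes after two comparisons, as in the slides that follow.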

32
Search Example (1/8) Query vector: (20, 8). Tree: a root division node (dim 1 > 2); its left child is a division node (dim 2 > 3) with leaves (1,3) and (2,7); its right child is a division node (dim 2 > 7) whose left child is the leaf (20,7) and whose right child is a division node (dim 1 > 6) with leaves (5,1500) and (9,1000). At the root: 20 > 2, go right; the not-taken option (the left subtree, at bound 18 in dimension 1) is added to the queue.

33
Search Example (2/8) At the division node (dim 2 > 7): 8 > 7, go right; the not-taken left branch (the leaf (20,7), at bound 1) is queued. Queue: { 18, 1 }. C_MIN: none yet. Comparisons: 0.

34
Search Example (3/8) At the division node (dim 1 > 6): 20 > 6, go right; the not-taken left branch (bound 14) is queued. Queue: { 18, 1, 14 }. C_MIN: none yet. Comparisons: 0.

35
Search Example (4/8) We arrive at the leaf (9,1000), at distance ≈992 from the query; it is stored as the nearest leaf so far, C_MIN. Queue: { 1, 14, 18 }. Comparisons: 1.

36
Search Example (5/8) The distance bound of the best node in the queue (1) is less than the distance to C_MIN (≈992), so a new search is started from the best node in the queue, which is then deleted from the queue.

37
Search Example (6/8) Go down from the queued branch (the left child of the node dim 2 > 7). Queue: { 14, 18 }. Comparisons: 1.

38
Search Example (7/8) We arrive at the leaf (20,7), at distance 1 from the query; it replaces (9,1000) as C_MIN. Comparisons: 2.

39
Search Example (8/8) The distance bound of the best node in the queue (14) is NOT less than the distance to C_MIN (1), so the search finishes. Comparisons: 2.

40
Conclusions BBF + kd-trees: a good trade-off between short search time and high success probability. But perhaps BBF + kd-trees is not the optimal solution; finding a better methodology is very important for massive applications (for example, Web image retrieval).
