1
Distinctive Image Features from Scale-Invariant Keypoints
David G. Lowe, University of British Columbia
2
What do we want from image features?
To be invariant, so that the feature can be found again when the image changes:
- invariance to translation, rotation, and scale changes
- invariance to affine deformations and 3D viewpoint changes
- invariance to illumination changes
To be distinctive, in order to match the right feature!
3
Invariances
Invariance to translation
Invariance to rotation
4
Invariance to scale change
5
How to deal with scale changes?
The image is embedded into a scale-space:
- progressive smoothing of the image to obtain a fine-to-coarse representation
- use of low-pass linear filters
- family of functions L(x,t) = h(x,t) * f(x)
The Gaussian kernel is the only possible kernel [Lindeberg, 94]:
- linear filter
- semi-group structure with respect to the scale parameter: h(x,t1+t2) = h(x,t1) * h(x,t2)
- rotational symmetry
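A minimal sketch of this scale-space construction (assuming NumPy and SciPy; the image, scales, and tolerance below are illustrative choices, not values from the slides):

```python
# Gaussian scale-space L(x, t) = h(x, t) * f(x), built by progressive smoothing.
import numpy as np
from scipy.ndimage import gaussian_filter

def scale_space(image, sigmas):
    """Return the family of progressively smoothed images, one per scale."""
    return [gaussian_filter(image.astype(float), sigma) for sigma in sigmas]

# Semi-group property: smoothing at t1 then t2 equals smoothing at t1 + t2.
# For Gaussians the variances add, so sigma_total = sqrt(sigma1**2 + sigma2**2).
img = np.random.rand(64, 64)                        # stand-in for a real image
twice = gaussian_filter(gaussian_filter(img, 1.0), 2.0)
once = gaussian_filter(img, np.sqrt(1.0**2 + 2.0**2))
assert np.allclose(twice, once, atol=1e-3)          # equal up to truncation error
```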
6
Feature selection
Scale-space peaks in the difference of Gaussians convolved with the image:
- approximation of a Laplacian filter (in 1D, [-1 2 -1])
- bandpass filter
- reconstruction of the initial image is possible from the set of filtered versions
Why select local maxima of this filtered image? I don't know; others have done it successfully [Crowley & Parker, 84] [Mikolajczyk, 02]
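A sketch of this selection step with the same tools: build a difference-of-Gaussians stack and keep pixels that are extrema over space and scale. The scale sampling and the contrast threshold are illustrative, not the exact values from the paper:

```python
# Difference-of-Gaussians keypoint candidates: local extrema across (scale, row, col).
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def dog_extrema(image, sigmas, threshold=0.01):
    img = image.astype(float)
    blurred = [gaussian_filter(img, s) for s in sigmas]
    dog = np.stack([blurred[i + 1] - blurred[i] for i in range(len(blurred) - 1)])
    # A pixel is kept if it is the max (or min) of its 3x3x3 neighbourhood
    # and its response is strong enough.
    is_max = (dog == maximum_filter(dog, size=3)) & (dog > threshold)
    is_min = (dog == minimum_filter(dog, size=3)) & (dog < -threshold)
    scales, rows, cols = np.nonzero(is_max | is_min)
    return list(zip(rows, cols, scales))

# Example: scales sampled geometrically, three intervals per octave.
keypoints = dog_extrema(np.random.rand(128, 128),
                        sigmas=[1.6 * 2 ** (i / 3) for i in range(5)])
```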
7
How many scales should we consider?
Need for a discrete set of downscaled images. Matching between the initial image and a synthetic image obtained by random rotation and random scaling.
8
Examples
9
Descriptor associated to the keypoint
Computation of an intrinsic orientation associated to the keypoint:
- orientation = direction of the gradient
- for better stability: compute a histogram of local orientations and select the peak of this histogram
Each feature is thus associated to a location, a scale, and an orientation.
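A sketch of this orientation assignment, assuming a grayscale NumPy image and a keypoint far enough from the border; the patch radius and the 36 bins (10-degree bins) follow common practice, and the Gaussian weighting used in the paper is omitted:

```python
# Intrinsic orientation: peak of a magnitude-weighted histogram of gradient directions.
import numpy as np

def keypoint_orientation(image, row, col, radius=8, n_bins=36):
    patch = image[row - radius:row + radius, col - radius:col + radius].astype(float)
    dy, dx = np.gradient(patch)                     # gradients along rows and columns
    magnitude = np.hypot(dx, dy)
    direction = np.arctan2(dy, dx)                  # in [-pi, pi]
    hist, edges = np.histogram(direction, bins=n_bins,
                               range=(-np.pi, np.pi), weights=magnitude)
    peak = np.argmax(hist)
    return 0.5 * (edges[peak] + edges[peak + 1])    # centre of the winning bin
```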
10
Descriptor associated to the keypoint (II)
Projection of the gradients of a 4x4 grid of subregions onto 8 orientations: a 128-dimensional vector.
Robustness to affine changes: the descriptor changes gradually when the position of the keypoint changes.
Normalization for invariance to illumination changes.
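A sketch of the descriptor layout (4x4 cells times 8 orientation bins = 128 values), with the final normalization for illumination invariance; rotation of the patch to the keypoint's intrinsic orientation and the Gaussian weighting are omitted for brevity:

```python
# 128-dimensional descriptor from a 16x16 patch around the keypoint.
import numpy as np

def descriptor(image, row, col, patch=16, grid=4, n_bins=8):
    half = patch // 2
    region = image[row - half:row + half, col - half:col + half].astype(float)
    dy, dx = np.gradient(region)
    mag, ori = np.hypot(dx, dy), np.arctan2(dy, dx)
    cell = patch // grid
    vec = []
    for i in range(grid):                           # 4x4 grid of cells
        for j in range(grid):
            m = mag[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            o = ori[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            hist, _ = np.histogram(o, bins=n_bins, range=(-np.pi, np.pi), weights=m)
            vec.extend(hist)                        # 8 bins per cell
    vec = np.array(vec)                             # 4 * 4 * 8 = 128 values
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec          # illumination normalization
```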
11
Testing the features
Sensitivity to affine changes
Distinctiveness of the features (30 degrees rotation, 2% noise)
12
Individual object recognition
13
Recognition algorithm: a voting approach
14
Kd-tree search
A binary search down the tree gives a candidate, but an exhaustive search (backtracking) is needed to guarantee the exact nearest neighbor. This is time-consuming!
15
Kd-tree search - backtracking
16
Kd-tree search - backtracking
17
Kd-tree search - backtracking
Good choice!
18
How often do we get the right match?
Parameters:
- database size: 100,000 points
- "best-bin" search: max. 200 iterations
- "restricted search": max. 480 iterations
Results averaged over 1000 runs.
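A from-scratch sketch of the idea behind "best-bin" (best-bin-first) search: branches of the kd-tree are revisited in order of their distance to the query, and the search stops after a fixed number of node visits, trading exactness for speed. This is an illustration, not the implementation evaluated above:

```python
# Kd-tree with best-bin-first (priority-queue) backtracking, capped at max_visits.
import heapq
import numpy as np

class Node:
    def __init__(self, point, axis, left=None, right=None):
        self.point, self.axis, self.left, self.right = point, axis, left, right

def build(points, depth=0):
    if len(points) == 0:
        return None
    axis = depth % points.shape[1]
    points = points[points[:, axis].argsort()]
    mid = len(points) // 2
    return Node(points[mid], axis,
                build(points[:mid], depth + 1),
                build(points[mid + 1:], depth + 1))

def bbf_nearest(root, query, max_visits=200):
    best, best_dist = None, np.inf
    heap = [(0.0, 0, root)]        # (lower bound on distance to region, tie-break, node)
    visits, counter = 0, 1
    while heap and visits < max_visits:
        bound, _, node = heapq.heappop(heap)
        if node is None or bound >= best_dist:
            continue               # pruned: this branch cannot beat the current best
        visits += 1
        d = np.sum((node.point - query) ** 2)
        if d < best_dist:
            best, best_dist = node.point, d
        diff = query[node.axis] - node.point[node.axis]
        near, far = (node.left, node.right) if diff < 0 else (node.right, node.left)
        heapq.heappush(heap, (bound, counter, near)); counter += 1
        heapq.heappush(heap, (diff ** 2, counter, far)); counter += 1
    return best

points = np.random.rand(1000, 8)   # small stand-in for the 100,000-point database
approx_nn = bbf_nearest(build(points), np.random.rand(8))
```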
19
The voting approach – Hough transform
A model keypoint (x1, y1, s1, θ1) matched to an image keypoint (x2, y2, s2, θ2) predicts the transform:
Δx = x2 - x1
Δy = y2 - y1
Δs = s2 / s1
Δθ = θ2 - θ1
Voting is performed in the space of transform parameters (Δx, Δy, Δs, Δθ).
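A sketch of this voting step, assuming matches are given as pairs of (x, y, scale, orientation) tuples; the bin widths are illustrative, not the ones used by Lowe:

```python
# Hough-style voting: each match votes for one bin in transform space.
import numpy as np
from collections import Counter

def hough_vote(matches, loc_bin=32.0, rot_bin=np.pi / 6):
    """matches: list of ((x1, y1, s1, t1), (x2, y2, s2, t2)) keypoint pairs."""
    votes = Counter()
    for (x1, y1, s1, t1), (x2, y2, s2, t2) in matches:
        dx, dy = x2 - x1, y2 - y1                      # predicted translation
        dtheta = (t2 - t1) % (2 * np.pi)               # predicted rotation
        key = (round(dx / loc_bin), round(dy / loc_bin),
               round(np.log2(s2 / s1)),                # scale binned in octaves
               round(dtheta / rot_bin))
        votes[key] += 1
    return votes.most_common(1)[0] if votes else None  # (bin, number of votes)
```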
20
Last step: compute frame transform
Solve for the affine transform parameters using a least-squares solution.
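A sketch of the least-squares fit: each matched pair (x, y) -> (u, v) contributes two linear equations in the six affine parameters, and numpy's lstsq minimises the squared error over all pairs. Function and variable names are illustrative:

```python
# Least-squares estimate of the affine transform mapping model points to image points.
import numpy as np

def fit_affine(model_pts, image_pts):
    """model_pts, image_pts: (N, 2) arrays of matched (x, y) points, N >= 3."""
    A, rhs = [], []
    for (x, y), (u, v) in zip(model_pts, image_pts):
        A.append([x, y, 0, 0, 1, 0]); rhs.append(u)   # u = a*x + b*y + tx
        A.append([0, 0, x, y, 0, 1]); rhs.append(v)   # v = c*x + d*y + ty
    params, *_ = np.linalg.lstsq(np.array(A, float), np.array(rhs, float), rcond=None)
    a, b, c, d, tx, ty = params
    return np.array([[a, b], [c, d]]), np.array([tx, ty])  # linear part, translation
```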
21
Results : scene recognition
22
Results : occlusions
23
Results : change of scale