
1 General ideas to communicate: show one particular example of localization based on vertical lines; camera projections; an example of using the Jacobian to find the solution of a system of nonlinear equations.

2 Localization for Mobile Robot Using Monocular Vision

3 Vision-based self-localization methods
Self-localization methods for mobile robots:
– Position tracking: encoders, ultrasonic sensors, other local sensors
– Global localization: laser range scanner, vision-based methods
Vision-based methods for indoor applications:
– Stereo vision: directly recovers geometric information, but requires complicated hardware and much processing time
– Omni-directional view: uses a conic mirror, low resolution
– Mono view using landmarks: uses artificial landmarks

4 Background on monocular methods
Related work on monocular methods:
1. Sugihara (1988) did pioneering work on localization using vertical edges.
2. Atiya and Hager (1993) used geometric tolerance to describe observation error.
3. Kosaka and Kak (1992) proposed a model-based monocular vision system with a 3D geometric model.
4. Munoz and Gonzalez (1998) added an optimization procedure.

5 Previous work on monocular methods
Related work on monocular methods:
Talluri and Aggarwal (1996) considered the correspondence problem between a stored 3D model and 2D images in an outdoor urban environment.
Aider et al. (2005) proposed an incremental model-based localization method using view-invariant regions.
Another approach adopts the SIFT (Scale-Invariant Feature Transform) algorithm to compute correspondences between stored SIFT features and the images acquired during navigation.

6 Self-localization based on vertical lines
A self-localization method using vertical lines with mono view will be presented here:
1. Indoor environment: use horizontal and vertical line features (doors, furniture)
2. Find vertical lines, compute pattern vectors
3. Match the lines with the corners of the map
4. Find the position (x, y, θ) from the matched information

7 2. Localization algorithm
Fig. 1 The flowchart of self-localization (blocks: input image; detect line segments; line segments ≥ 3?; matching lines with the map; localization (x, y, θ); uncertainty > T?; map-making and path planning; destination; end)

8 Line feature detection
Vertical Sobel operation
Vertically projected histogram
One-dimensional averaging and thresholding
Local maxima are indexed as feature points
Fig. 2 Projected histogram and a local maximum (axis u; annotations: local maximum, threshold value)
Note: this method does not use edge detection followed by a Hough transform; our method uses Canny and Hough.
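A minimal sketch of this detection step, assuming OpenCV and NumPy; the smoothing width and threshold ratio below are illustrative parameters rather than values from the paper:

```python
import cv2
import numpy as np

def detect_vertical_lines(gray, smooth_width=9, thresh_ratio=0.3):
    # Vertical Sobel operation: the horizontal gradient emphasizes vertical edges.
    edges = np.abs(cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3))

    # Vertically projected histogram: sum edge magnitudes down each column.
    hist = edges.sum(axis=0)

    # One-dimensional averaging (moving-average smoothing).
    kernel = np.ones(smooth_width) / smooth_width
    hist = np.convolve(hist, kernel, mode="same")

    # Thresholding; local maxima are indexed as feature points (column indices u).
    threshold = thresh_ratio * hist.max()
    features = [u for u in range(1, len(hist) - 1)
                if hist[u] > threshold
                and hist[u] >= hist[u - 1] and hist[u] >= hist[u + 1]]
    return features, hist
```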

9 Experimental results using histograms
Fig. 6 Mobile robot
Fig. 7 The procedures of detecting vertical lines: (a) original image, (b) vertical edges, (c) projected histogram, (d) vertical lines

10 Experimental results: sequence of images. Sequence of input images from an experiment with the robot.

11 How it worked: experimental results
Fig. 9 The result of localization in the given map (real vs. predicted positions)
Fig. 10 Errors along the Y axis

12 Correspondence of feature vectors

13 The method uses geometrical information about the line features of the map.
Feature vectors are defined with hue (H) and saturation (S).
Feature vectors of the right and left regions of each line are defined.
Check whether a line meets the floor region:
– connected line, non-connected line
Define the visibility of the regions of a connected line:
– visible region, occluded region
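A hypothetical sketch of how such hue/saturation feature vectors could be computed for the regions to the left and right of a detected vertical line, assuming OpenCV; the region width is an assumed parameter, not a value from the paper:

```python
import cv2
import numpy as np

def line_feature_vectors(bgr, u, region_width=10):
    # Convert to HSV so hue (H) and saturation (S) are directly available.
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    width = hsv.shape[1]
    left = hsv[:, max(u - region_width, 0):u]
    right = hsv[:, u + 1:min(u + 1 + region_width, width)]
    # Feature vector of each region: mean hue and mean saturation.
    f_left = np.array([left[..., 0].mean(), left[..., 1].mean()])
    f_right = np.array([right[..., 0].mean(), right[..., 1].mean()])
    return f_left, f_right
```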

14 2.2 Correspondence using feature vectors (2)
Matching of the feature vectors of lines with the map:
– lines with both regions visible, lines with one region visible, non-contacted lines
The correspondence of neighboring lines is investigated using the lines' geometrical relationships (see the simplified matching sketch after this slide).
Fig. 3 Floor-contacted lines and visible regions. Contacted lines: x1, x2, x3. Non-contacted line: x4. Visible regions: l1, l2, r2, l3, r3, r4. Occluded regions: r1, l4
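A deliberately simplified nearest-neighbour matching sketch: it compares the (H, S) feature vectors only and omits the geometrical consistency check among neighbouring lines described above; the distance threshold is an assumed parameter:

```python
import numpy as np

def match_lines_to_map(line_features, map_features, max_dist=20.0):
    # line_features, map_features: lists of (H, S) vectors for detected lines
    # and for the lines stored in the map, respectively.
    matches = []
    for i, feat in enumerate(line_features):
        dists = [np.linalg.norm(np.asarray(feat) - np.asarray(m)) for m in map_features]
        j = int(np.argmin(dists))
        if dists[j] < max_dist:
            matches.append((i, j))  # detected line i corresponds to map line j
    return matches
```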

15 2.3 Self-localization using vertical lines (1)
The coordinates of the feature points are matched to the camera coordinates of the map.
Fig. 4 Global coordinates and camera coordinates

16 2.3 Self-localization using vertical lines (2)
Fig. 5 Perspective transformation of camera coordinates
Notation: image plane coordinates; camera coordinates; feature points in camera coordinates; features in the image plane; focal length of the camera
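A minimal reconstruction of the perspective relation in standard pinhole notation; the symbols (u, v) for image plane coordinates, (x_c, y_c, z_c) for the camera coordinates of a feature point, and f for the focal length are assumptions, not the slide's original symbols:

\[ u = f\,\frac{x_c}{z_c}, \qquad v = f\,\frac{y_c}{z_c} \]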

17 2.3 Self-localization using vertical lines (3)
What is the relation between camera coordinates and world coordinates? Use a transformation!
Camera coordinates can be transformed to world coordinates by a rigid body transformation T.
The camera coordinates and world coordinates are related by a translation and a rotation.
The transformation T is defined in equations (2) and (3).
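A hedged sketch of the kind of relation equations (2)–(3) express, restricted to the horizontal plane (vertical lines project independently of height); R, t, and the symbols below are assumed standard notation rather than the slide's own:

\[ \mathbf{X}_c = R(\theta)\,(\mathbf{X}_w - \mathbf{t}), \qquad R(\theta) = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}, \]

where X_w is a point in world (global) coordinates, X_c the same point in camera coordinates, t = (x, y) the robot position, and θ its heading; the first camera component lies along the optical axis and the second is the lateral offset.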

18 2.3 Self-localization using vertical lines (4)
Global coordinates are mapped to camera coordinates (4); the perspective transformation is (5).
Combining the perspective transformation (5) with the rigid transformation of the coordinates (4) induces a system of nonlinear equations (6).
We create a system of nonlinear equations for the camera view.
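A hedged reconstruction of what such a system looks like for vertical lines, using the assumed notation from the two sketches above (it is not the slide's own equation). For each matched vertical line i with map position (X_i, Y_i) and observed image column u_i, the unknown pose p = (x, y, θ) must satisfy

\[ f_i(x, y, \theta) = f\,\frac{-\sin\theta\,(X_i - x) + \cos\theta\,(Y_i - y)}{\cos\theta\,(X_i - x) + \sin\theta\,(Y_i - y)} - u_i = 0, \qquad i = 1, \dots, n, \]

and stacking the f_i gives F(p) = 0; with three or more matched lines the pose is determined (or over-determined).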

19 2.3 Self-localization using vertical lines (5)
Jacobian matrix and Newton's method: Newton's method for finding the solution of the nonlinear equations is the iteration (8), starting from a given initial value, where the Jacobian matrix is defined in (7); the f_i come from the system F on the previous slide.
Each iteration gives an update of the camera (robot) position and orientation.
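A minimal runnable sketch of this step, assuming NumPy, the planar convention used in the sketches above, and a finite-difference Jacobian in place of the analytic Jacobian (7); function and parameter names are illustrative:

```python
import numpy as np

def project_line(pose, line_xy, f=500.0):
    # Predicted image column of a vertical line at map position line_xy = (X, Y)
    # seen from pose = (x, y, theta); optical axis along the robot heading (assumed).
    x, y, theta = pose
    dx, dy = line_xy[0] - x, line_xy[1] - y
    z_c = np.cos(theta) * dx + np.sin(theta) * dy    # depth along the optical axis
    x_c = -np.sin(theta) * dx + np.cos(theta) * dy   # lateral offset
    return f * x_c / z_c

def residuals(pose, lines_xy, u_meas, f=500.0):
    return np.array([project_line(pose, L, f) - u for L, u in zip(lines_xy, u_meas)])

def newton_localize(pose0, lines_xy, u_meas, f=500.0, iters=10, eps=1e-6):
    # Newton / Gauss-Newton iteration p_{k+1} = p_k - J^+ F(p_k),
    # with the Jacobian J estimated by finite differences.
    pose = np.array(pose0, dtype=float)
    for _ in range(iters):
        F = residuals(pose, lines_xy, u_meas, f)
        J = np.zeros((len(F), 3))
        for j in range(3):
            d = np.zeros(3)
            d[j] = eps
            J[:, j] = (residuals(pose + d, lines_xy, u_meas, f) - F) / eps
        step = np.linalg.pinv(J) @ F
        pose -= step
        if np.linalg.norm(step) < 1e-8:
            break
    return pose
```

Given three or more matched lines (lines_xy with their measured image columns u_meas) and a rough initial pose, a few iterations refine the estimate to the localization result (x, y, θ).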

20 4. Conclusions
1. A self-localization method using vertical line segments with mono view was presented.
2. Line features are detected from the projected histogram of the edge image.
3. Pattern vectors and their geometrical properties are used to match lines with points of the map.
4. A system of nonlinear equations involving the perspective and rigid transformations of the matched points is induced.
5. Newton's method was used to solve the equations.
6. The proposed algorithm using mono view is simple and applicable to indoor environments.

21 Robot Vision Lab.

22 Localization for Mobile Robot Using Monocular Vision Hyunsik Ahn Jan. 2006 Tongmyong University

23 Robot Vision Lab. 3. Experimental results (1)
Table 1 Real positions and errors
No. | Real X (mm) | Real Y (mm) | Real angle (°) | Error X (mm) | Error Y (mm) | Error angle (°)
1 | 0 | 0 | 0 | 23.79 | 46.13 | 0.04
2 | -160 | 100 | 0 | 32.51 | 41.54 | 1.53
3 | -160 | 200 | 0 | 49.90 | 58.28 | 1.24
4 | -160 | 400 | 0 | 34.54 | 74.82 | 1.39
5 | -160 | 600 | 0 | 37.67 | 61.41 | 1.20
6 | -160 | 800 | 0 | 29.35 | 43.86 | 1.31
7 | -160 | 1000 | 0 | 26.57 | 100.37 | 1.37
8 | -160 | 1200 | 0 | 30.46 | 18.47 | 1.38
9 | -160 | 1400 | 0 | 18.59 | 96.39 | 1.49
10 | -160 | 1600 | 0 | 14.18 | 93.74 | 1.31
11 | -160 | 1800 | 0 | 9.38 | 9.46 | 2.00
12 | 0 | 660 | 0 | 20.21 | 7.84 | 3.32
13 | 0 | 1150 | 0 | 34.61 | 72.98 | 2.29
14 | 0 | 1555 | 0 | 24.36 | 44.78 | 1.83
15 | 0 | 2005 | 0 | 15.66 | 35.60 | 0.26
16 | -650 | 380 | 0 | 50.08 | 88.55 | 0.61
17 | -630 | 660 | 0 | 15.84 | 32.29 | 0.89
18 | -560 | 1045 | -48 | 4.15 | 27.56 | 1.78
19 | -465 | 1300 | -45 | 36.01 | 59.97 | 0.84
20 | -375 | 1465 | -47 | 28.03 | 41.17 | 2.82
21 | -285 | 1740 | -37 | 11.27 | 25.81 | 0.83
22 | -190 | 2125 | -32 | 7.33 | 73.29 | 1.69
23 | -75 | 2435 | -22 | 18.21 | 70.05 | 1.15
24 | -25 | 2715 | -10 | 19.25 | 44.07 | 3.24
25 | 165 | 3034 | -23 | 10.03 | 70.43 | 1.97
26 | 325 | 3455 | -27 | 80.63 | 66.94 | 1.45
27 | 370 | 3915 | -15 | 9.045 | 11.04 | 2.5
2σ of errors | | | | 32.83 | 53.80 | 1.58

