
1 Autonomous Vehicle Competition
Laksono Kurnianggoro 18 July 2014

2 Activities
Recent activities:
Went to Seoul to test the AVC car.
Examined the image moments.
Modified the car tracking method.

3 IECON Registration

4 Image Moments
$m_{ij} = \sum_{x,y} I(x,y)\, x^i y^j$; $\bar{x} = \frac{m_{10}}{m_{00}}$; $\bar{y} = \frac{m_{01}}{m_{00}}$
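A minimal NumPy sketch of the raw-moment formula and centroid above; the function names and the loop-free formulation are illustrative, not taken from the original program.

```python
import numpy as np

def raw_moment(I, i, j):
    """Raw image moment m_ij = sum over (x, y) of I(x, y) * x^i * y^j."""
    h, w = I.shape
    X, Y = np.meshgrid(np.arange(w, dtype=np.float64),
                       np.arange(h, dtype=np.float64))
    return float(np.sum(I * (X ** i) * (Y ** j)))

def centroid(I):
    """Centroid (x_bar, y_bar) = (m10 / m00, m01 / m00)."""
    m00 = raw_moment(I, 0, 0)
    return raw_moment(I, 1, 0) / m00, raw_moment(I, 0, 1) / m00
```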

5 Problem with Car Detection Dataset
Some parts of the dataset are background, which affects the tracking result.
Figures: detected region, detected region (HSV), back-projection image

6 Solution
Retrain the SVM using a better dataset; constructing the new dataset takes time.
OR: Modify the tracking model by using a smaller window size for the histogram computation.
Figures: detected region, region used in histogram computation, result

7 Appendix

8 Part 1: Calibration of Camera and LRF
Relation between a point seen from the camera's and the LRF's point of view:
$P_{LRF} = \Phi P_{cam} + \Delta$ (1)
Where: $\Phi$ is the rotation from camera to LRF and $\Delta$ is the translation from camera to LRF.
The location of $P_{cam}$ (in 3D space) is directly known; however, it lies on the calibration plane (checkerboard).
Figure: calibration system's setting

9 Point to Plane Constraint
Using the camera calibration method, the checkerboard's normal and its distance to the camera can be found.
The constraint: in the camera coordinate system, every point $P$ on the checkerboard is constrained by its point-to-plane distance to an imaginary plane through the camera's position, parallel to the checkerboard.
With $\hat{n}$ the unit normal of the checkerboard and $d$ its distance to the camera, let $N = -\hat{n}\, d$. Since $-\hat{n} \cdot P = d$ for every point on the board,
$N \cdot P = d^2$ (2)
Figure: imaginary plane, parallel to the checkerboard, at the camera's position

10 Formulation of the Calibration System
Main relation: $N \cdot P = d^2$ (2), with $P_{cam} = \Phi^{-1}(P_{LRF} - \Delta)$ from (1), giving:
$N \cdot \Phi^{-1}(P_{LRF} - \Delta) = d^2$ (3)
Simplification: all laser points are located at $Y = 0$, so $P_{LRF}$ can be written in homogeneous form $[X; Z; 1]$.
Formula (3) can then be rewritten as:
$N \cdot H P_{LRF} = d^2$ (4)
where $H = [\,\Phi^{-1}_{(:,1)} \;\; \Phi^{-1}_{(:,3)} \;\; -\Phi^{-1}\Delta\,]$, i.e. the first and third columns of $\Phi^{-1}$ together with $-\Phi^{-1}\Delta$.
Knowing $N$, $d$ and $P_{LRF}$, $H$ can be computed.
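Equation (4) is linear in the nine entries of $H$, so $H$ can be estimated by stacking one equation per laser point over all calibration poses and solving a least-squares system. A minimal NumPy sketch under that assumption (function name and data layout are illustrative):

```python
import numpy as np

def estimate_H(poses):
    """Estimate the 3x3 matrix H from N . H p = d^2 (eq. 4) by linear least squares.

    poses: list of (N, d, points) where N is the scaled plane normal (3,),
           d the plane-to-camera distance, and points an (M, 3) array of
           homogeneous laser points [X, Z, 1] lying on the calibration board.
    """
    A, b = [], []
    for N, d, points in poses:
        for p in points:
            # N^T H p = d^2 is linear in the entries of H:
            # vec(N p^T) . vec(H) = d^2 (row-major vectorization)
            A.append(np.outer(N, p).ravel())
            b.append(d ** 2)
    A, b = np.asarray(A), np.asarray(b)
    h, *_ = np.linalg.lstsq(A, b, rcond=None)
    return h.reshape(3, 3)
```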

11 Decomposing H into Rotation and Translation
Where $H_i$ is the $i^{th}$ column of matrix $H$:
$\Phi = [\,H_1,\; -H_1 \times H_2,\; H_2\,]^\top$
$\Delta = -[\,H_1,\; -H_1 \times H_2,\; H_2\,]^\top H_3$
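A small NumPy sketch of this decomposition; the function name is illustrative, and in practice the columns recovered by least squares may need re-orthonormalization, which is omitted here:

```python
import numpy as np

def decompose_H(H):
    """Recover the rotation Phi and translation Delta from the estimated H."""
    H1, H2, H3 = H[:, 0], H[:, 1], H[:, 2]
    # Phi = [H1, -(H1 x H2), H2]^T : stack the three vectors as rows.
    Phi = np.vstack([H1, -np.cross(H1, H2), H2])
    Delta = -Phi @ H3
    return Phi, Delta
```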

12 Result
Figures: camera's image, laser's visualization

13 Calibration in AVC’s Car

14 Conversion of 3D point to Image Point
Inputs: Points3D, camera matrix $K$, distortion coefficients $kc$.
Normalization: $p_i.x = Points3D_i.x / Points3D_i.z$; $p_i.y = Points3D_i.y / Points3D_i.z$
Distortion radius: $r_i^2 = p_i.x^2 + p_i.y^2$; $r_i^4 = r_i^2\, r_i^2$; $r_i^6 = r_i^4\, r_i^2$, for $1 \le i \le nPoints$
Radial distortion: $cdist_i = 1.0 + kc_0\, r_i^2 + kc_1\, r_i^4 + kc_4\, r_i^6$; $p\_rd_i = p_i \cdot cdist_i$
Tangential distortion: $a1_i = 2\, p_i.x\, p_i.y$; $a2_i = r_i^2 + 2\, p_i.x^2$; $a3_i = r_i^2 + 2\, p_i.y^2$; $\Delta x_i = kc_2\, a1_i + kc_3\, a2_i$; $\Delta y_i = kc_2\, a3_i + kc_3\, a1_i$; $p\_td_i.x = p\_rd_i.x + \Delta x_i$; $p\_td_i.y = p\_rd_i.y + \Delta y_i$
Skew: $\alpha = \gamma / f_x$; $p\_s_i.x = p\_td_i.x + \alpha\, p\_td_i.y$; $p\_s_i.y = p\_td_i.y$
Points on the image: $p2d_i.x = p\_s_i.x\, f_x + c_x$; $p2d_i.y = p\_s_i.y\, f_y + c_y$
$K = \begin{bmatrix} f_x & \gamma & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$
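A runnable NumPy sketch of the same projection pipeline; the function name and array layout are illustrative, and it assumes $kc$ is ordered $[k_1, k_2, p_1, p_2, k_3]$ as on this slide:

```python
import numpy as np

def project_points(points_3d, K, kc):
    """Project 3D points in the camera frame to pixel coordinates using the
    pinhole model with radial/tangential distortion and skew.
    points_3d: (N, 3) array, K: 3x3 camera matrix, kc: 5 distortion coefficients."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    gamma = K[0, 1]

    # Normalize by depth
    x = points_3d[:, 0] / points_3d[:, 2]
    y = points_3d[:, 1] / points_3d[:, 2]

    # Radial distortion
    r2 = x * x + y * y
    cdist = 1.0 + kc[0] * r2 + kc[1] * r2 ** 2 + kc[4] * r2 ** 3
    x_rd, y_rd = x * cdist, y * cdist

    # Tangential distortion
    a1 = 2.0 * x * y
    a2 = r2 + 2.0 * x * x
    a3 = r2 + 2.0 * y * y
    x_td = x_rd + kc[2] * a1 + kc[3] * a2
    y_td = y_rd + kc[2] * a3 + kc[3] * a1

    # Skew, then map to pixel coordinates
    alpha = gamma / fx
    x_s = x_td + alpha * y_td
    return np.stack([x_s * fx + cx, y_td * fy + cy], axis=1)
```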

15 Part 2: Registration Problem
In the dataset, the laser was not registered properly to the camera.
Solution: use several sliding windows (see the sketch below).
Figure: red boxes are the sliding windows, cyan is the detected car.
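A minimal sketch of generating shifted candidate windows around an initial box to compensate for the mis-registration; the step size and number of shifts are assumptions, not values from the original program:

```python
def sliding_windows(box, step=8, n=2):
    """Generate candidate windows (x, y, w, h) shifted around an initial box."""
    x, y, w, h = box
    return [(x + i * step, y + j * step, w, h)
            for i in range(-n, n + 1)
            for j in range(-n, n + 1)]
```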

16 Part 4: Predicting the Bounding Box Position
When the result of car detection is available, the displacement (dx, dy) between the current and the previous box position is computed and used to predict the box position in the next frame.
Figures: frame n and frame n+1, showing the previous box, the current box and the predicted box
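A minimal constant-displacement predictor matching this description; the function name and box layout are illustrative:

```python
def predict_box(prev_box, curr_box):
    """Predict the next bounding box with a constant-displacement model.
    Boxes are (x, y, w, h); dx, dy come from the last two detections."""
    dx = curr_box[0] - prev_box[0]
    dy = curr_box[1] - prev_box[1]
    return (curr_box[0] + dx, curr_box[1] + dy, curr_box[2], curr_box[3])
```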

17 Part 5: Simple Clustering Method
Exploit a property of the LRF: an invalid scan is reported as 0 meters.
If the difference between two consecutive scan readings is more than a threshold, they belong to different clusters (see the sketch below).
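A small Python sketch of this clustering rule; the threshold value is an assumption:

```python
def cluster_scan(ranges, threshold=0.5):
    """Split an LRF scan into clusters of consecutive beams.
    ranges: list of distances in meters (0.0 marks an invalid beam).
    threshold: maximum allowed jump (m) between consecutive valid beams."""
    clusters, current = [], []
    prev = None
    for i, r in enumerate(ranges):
        if r == 0.0:                      # invalid reading: close the current cluster
            if current:
                clusters.append(current)
                current = []
            prev = None
            continue
        if prev is not None and abs(r - prev) > threshold:
            clusters.append(current)      # large jump: start a new cluster
            current = []
        current.append((i, r))
        prev = r
    if current:
        clusters.append(current)
    return clusters
```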

18 Part 6: Tracking the Laser Data
Assumption: the object in the new frame should not have moved far from its position in the previous frame.
Figure: the green line marks the previous position; object candidates are shown.
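A minimal sketch of picking the candidate cluster closest to the previous position; the function name and the maximum allowed displacement are assumptions:

```python
import math

def nearest_cluster(prev_pos, cluster_centers, max_dist=1.0):
    """Pick the cluster center closest to the previous object position,
    rejecting candidates that moved farther than max_dist (meters)."""
    best, best_d = None, max_dist
    for c in cluster_centers:
        d = math.hypot(c[0] - prev_pos[0], c[1] - prev_pos[1])
        if d < best_d:
            best, best_d = c, d
    return best
```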

19 Merging the LRF Tracking and Image Tracking
Flowchart blocks: calibration data; camera; clustering; filtering; registration; candidates; choose car clusters; recognition (SVM); choose the highest-confidence region; LMS tracking (add tracked cluster); detected car regions; image tracking; pick the cluster closest to the predictor; pick the corresponding LRF cluster; is a cluster picked? If yes, update the predictor (laser tracking); if no, use the image detection as the reference (image tracking).

20 Output data from the tracking program
Adding the cluster length to the output.
Output fields: box_x, box_y, box_w, box_h, laser_x, laser_y, length
Figure: visualization of the laser data

21 Mean-Shift Tracking
Model: a histogram is computed from the model patch and normalized.
Observation: each pixel value in the observed image is replaced by its corresponding histogram bin value, producing a back-projection image on which mean-shift is run.
Figures: histogram example, back-projection image example
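A compact sketch of this pipeline using OpenCV's calcBackProject and meanShift; the hue-only histogram, the bin count and the termination criteria are assumptions for illustration:

```python
import cv2

def build_model(frame_bgr, box):
    """Compute a normalized hue histogram of the model patch (box = x, y, w, h)."""
    x, y, w, h = box
    hsv = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [16], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def track(frame_bgr, hist, box):
    """Back-project the model histogram onto the frame and run mean-shift."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    _, new_box = cv2.meanShift(back_proj, box, term_crit)
    return new_box
```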

22 Mean-Shift
Find the center of gravity of the given window, then move the window to that center.
Repeat until convergence.
Figures: iteration #1, iteration #2

23 Mean-Shift in Back-Projection Image
The mass center is computed using image moments:
$\bar{x} = \frac{m_{10}}{m_{00}}$; $\bar{y} = \frac{m_{01}}{m_{00}}$, where $m_{ij} = \sum_{x,y} I(x,y)\, x^i y^j$
Video: illustration of the mean-shift iterations on the back-projection image sequence
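A NumPy sketch of the mean-shift iteration on a back-projection image, using the window moments above to find the mass center; names are illustrative and window clamping at the image border is omitted:

```python
import numpy as np

def mean_shift(back_proj, box, max_iter=10, eps=1.0):
    """Iteratively move the window to the mass center of the back-projection
    values inside it, until the shift is smaller than eps pixels."""
    x, y, w, h = box
    for _ in range(max_iter):
        roi = back_proj[y:y + h, x:x + w].astype(np.float64)
        m00 = roi.sum()
        if m00 == 0:
            break
        cx = (roi.sum(axis=0) * np.arange(w)).sum() / m00   # x_bar = m10 / m00
        cy = (roi.sum(axis=1) * np.arange(h)).sum() / m00   # y_bar = m01 / m00
        nx = int(round(x + cx - w / 2.0))
        ny = int(round(y + cy - h / 2.0))
        if abs(nx - x) < eps and abs(ny - y) < eps:
            break
        x, y = nx, ny
    return (x, y, w, h)
```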

