Presentation transcript:

Lane-Level Localization Using an AVM Camera for an Automated Driving Vehicle in Urban Environments Dongwook Kim, Beomjun Kim, Taeyoung Chung, and Kyongsu Yi IEEE/ASME Transactions on Mechatronics, Vol. 22, No. 1, Feb 2017 Presented by: Yang Yu {yuyang@islab.ulsan.ac.kr} Dec. 2, 2017

Overview A lane-level localization algorithm based on a map-matching method. A digital map improves vehicle localization. Lane detection: around view monitoring (AVM). Features for map matching: lane markings. Sensor fusion: AVM module and vehicle sensors. Position correction: iterative closest point (ICP). Map matching: map and sensor feature data. The covariance of the ICP result is estimated using Haralick's method. Localization filter: extended Kalman filter (EKF).

Structure of Vehicle Localization (1/2)

Structure of Vehicle Localization (2/2)
Macro-level: navigation system (GPS) and proprioceptive sensors.
Road-level: radar and known road information.
Lane-level localization:
AVM: top-view image around the vehicle; direct calculation of the lateral offset; barely affected by other vehicles; more robust to weather and illumination conditions (side lines).
ICP: localizes the vehicle on the digital map; false matching is reduced by the ICP covariance and a validation gate.
EKF: the corrected position obtained by map matching is the observation.

Around View Monitor Four wide-angle cameras mounted on the front, back, left, and right of the vehicle. Range radar. A composite virtual bird's-eye top-view image.

Regions of Interest Assign a line type to each region (lane or stop line, vertical or horizontal line). Lane and stop line detection.

Gaussian Filter The image is filtered with a two-dimensional Gaussian kernel. The vertical direction of image A and the horizontal direction of image B use a smoothing Gaussian; the horizontal direction of image A and the vertical direction of image B use the second derivative of a Gaussian.
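As a rough illustration of this filtering step (not the paper's exact kernels or parameters), the sketch below builds the two directional responses with SciPy's separable Gaussian filter; the sigma values and sign convention are assumptions.

```python
# Minimal sketch of the directional filtering described above.
# Assumes a grayscale AVM image as a NumPy array; sigma values are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def lane_response(avm_image, sigma_smooth=3.0, sigma_ridge=2.0):
    """Highlight vertical lane markings (image A) and horizontal stop lines
    (image B): plain smoothing along the marking, second derivative across it."""
    img = avm_image.astype(np.float32)

    # Image A: smooth vertically (order 0), second derivative horizontally (order 2).
    image_a = gaussian_filter(img, sigma=(sigma_smooth, sigma_ridge), order=(0, 2))

    # Image B: second derivative vertically, smooth horizontally.
    image_b = gaussian_filter(img, sigma=(sigma_ridge, sigma_smooth), order=(2, 0))

    # Bright markings on dark asphalt give a negative second derivative at their
    # center, so negate before thresholding in the next step.
    return -image_a, -image_b
```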

Random Sample Consensus After thresholding, RANSAC fits a model to randomly selected edge points to determine the most likely candidate lines. A second-order polynomial is used to express each line. Results of detecting various lines:
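A minimal sketch of this fitting step, assuming the edge pixels are available as (x, y) points and that "second order" means a quadratic x = a*y^2 + b*y + c over the image rows; the iteration count and inlier tolerance are illustrative, not the paper's values.

```python
# RANSAC sketch for fitting one candidate line as a second-order polynomial
# x = a*y^2 + b*y + c to thresholded edge points (assumption: lanes are
# roughly vertical in the AVM image, so x is modeled as a function of y).
import numpy as np

def ransac_poly2(points, n_iters=200, inlier_tol=2.0, rng=None):
    """points: (N, 2) array of (x, y) edge pixels.
    Returns the best coefficients [a, b, c] and the boolean inlier mask."""
    rng = np.random.default_rng() if rng is None else rng
    x, y = points[:, 0], points[:, 1]
    best_coeffs, best_inliers = None, np.zeros(len(points), dtype=bool)

    for _ in range(n_iters):
        sample = rng.choice(len(points), size=3, replace=False)  # 3 points define a parabola
        if len(np.unique(y[sample])) < 3:
            continue  # degenerate sample (repeated rows), skip
        coeffs = np.polyfit(y[sample], x[sample], deg=2)
        residuals = np.abs(np.polyval(coeffs, y) - x)
        inliers = residuals < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_coeffs, best_inliers = coeffs, inliers

    if best_coeffs is not None:  # refit on all inliers for the final estimate
        best_coeffs = np.polyfit(y[best_inliers], x[best_inliers], deg=2)
    return best_coeffs, best_inliers
```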

Map-Matching Based on ICP A two-dimensional point-to-plane ICP algorithm matches the digital map with the lane data obtained by the AVM. It compares the two point clouds and iteratively finds the transformation (rotation matrix and translation vector) that minimizes an error metric function. The error is minimized along the surface normal. The error metric and the corrected vehicle position are sketched below.
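The error metric itself did not survive the transcript; as a sketch, the standard two-dimensional point-to-plane form (the exact notation on the slide may differ) is

$$E(R, t) = \sum_i \big[ (R\,p_i + t - q_i) \cdot n_i \big]^2$$

where $p_i$ are the lane points detected by the AVM, $q_i$ their matched map points, $n_i$ the map normals, and $(R, t)$ the rigid transform. The corrected vehicle position is then obtained by applying the estimated $(R, t)$ to the predicted position.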

Matching Covariance Estimation (1/2) Haralick's method is used to estimate the ICP covariance. The approximate value of the covariance is given by the expression below. Different shapes of matching data: corridor (left) and U-shape (right).
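The approximate covariance expression is not recoverable from the transcript; the commonly used form of Haralick's propagation (an assumption about the slide's exact notation) is

$$\operatorname{cov}(\hat{x}) \approx \left(\frac{\partial^2 J}{\partial x^2}\right)^{-1} \frac{\partial^2 J}{\partial z\,\partial x}\, \operatorname{cov}(z)\, \left(\frac{\partial^2 J}{\partial z\,\partial x}\right)^{T} \left(\frac{\partial^2 J}{\partial x^2}\right)^{-1}$$

where $J(x, z)$ is the ICP error metric, $x$ the estimated transform parameters, and $z$ the matched point data.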

Matching Covariance Estimation (2/2) Estimated standard deviation of the matching error (red line) and the true error obtained by Monte Carlo simulation (blue dots).

Localization Filter The position from ICP is the observation inside the EKF. A validation gate handles false matching. Prediction uses a vehicle model. State vector: point position (x, y) and orientation (ψ), i.e. x_k = [x_k, y_k, ψ_k]^T.

Extended Kalman Filter (1/3) u_k: external input (velocity and yaw rate). z_k: corrected vehicle position. w_k and v_k: process noise and measurement noise. The process noise is associated with the proprioceptive sensors and is assumed zero-mean and Gaussian. These quantities enter the standard nonlinear state-space model x_k = f(x_{k-1}, u_k) + w_k, z_k = h(x_k) + v_k.

Extended Kalman Filter (2/3) Prediction step. The nominal discrete process model propagates the estimated states from the previous time step: \hat{x}_{k|k-1} = f(\hat{x}_{k-1|k-1}, u_k). The covariance of the predicted state is P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k, where F_k is the Jacobian of f and Q_k is the covariance matrix related to the proprioceptive sensors' noise.

Extended Kalman Filter (3/3) Update step. Vehicle positions are corrected by the measurement update of the EKF: K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1} and \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - h(\hat{x}_{k|k-1})), where R_k is the covariance matrix of the measurement. The covariance of the estimated state becomes P_{k|k} = (I - K_k H_k) P_{k|k-1}.
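A compact sketch of the prediction and measurement update for the [x, y, ψ] state with velocity and yaw-rate input; the kinematic model, Jacobian, and the assumption that the ICP result measures the full pose (H = I) are illustrative, not the paper's exact formulation.

```python
# Minimal EKF sketch for the [x, y, psi] state with velocity / yaw-rate input
# and the ICP-corrected pose as the measurement.
import numpy as np

def ekf_predict(x, P, u, dt, Q):
    """x = [x, y, psi], u = [v, yaw_rate]. Simple kinematic motion model."""
    v, omega = u
    px, py, psi = x
    x_pred = np.array([px + v * dt * np.cos(psi),
                       py + v * dt * np.sin(psi),
                       psi + omega * dt])
    F = np.array([[1.0, 0.0, -v * dt * np.sin(psi)],   # Jacobian of the motion model
                  [0.0, 1.0,  v * dt * np.cos(psi)],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, R):
    """z: ICP-corrected pose [x, y, psi]; direct pose measurement, so H = I."""
    H = np.eye(3)
    innovation = z - H @ x_pred
    innovation[2] = (innovation[2] + np.pi) % (2 * np.pi) - np.pi  # wrap heading
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ innovation
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```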

Validation Gate Prevents fusing the result of a false matching. A threshold decides the acceptability of measurements; only measurements inside the validation gate update the filter. The normalized (Mahalanobis) error of the innovation is compared against a gate chosen for a given confidence level.
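A matching sketch of the gate itself, building on the EKF sketch above and using a chi-square threshold on the normalized innovation; the 95% confidence level and the 3-DOF pose measurement are assumptions.

```python
# Validation-gate sketch: accept the ICP measurement only if its normalized
# (Mahalanobis) error falls inside a chi-square gate.
import numpy as np
from scipy.stats import chi2

def inside_validation_gate(z, x_pred, P_pred, R, confidence=0.95):
    """Returns True if the measurement should be fused,
    False if it is treated as a false matching and skipped."""
    H = np.eye(3)
    innovation = z - H @ x_pred
    innovation[2] = (innovation[2] + np.pi) % (2 * np.pi) - np.pi  # wrap heading
    S = H @ P_pred @ H.T + R
    d2 = innovation @ np.linalg.solve(S, innovation)   # normalized error
    gate = chi2.ppf(confidence, df=len(z))              # ~7.81 for 3 DOF at 95%
    return d2 <= gate
```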

Experimental results (1/7) Lane-level map built with RTK-DGPS (centimeter-level accuracy). Detailed trajectory and travel time for lane keeping and consecutive lane changing.

Experimental results (2/7) Localization result of the first scenario: (a) lateral position error, (b) longitudinal position error, (c) yaw angle error, (d) map-matching history.

Experimental results (3/7) Localization result of the second scenario: (a) lateral position error, (b) longitudinal position error, (c) yaw angle error, (d) map-matching history.

Experimental results (4/7) Operation of map-matching according to the driving maneuvers.

Experimental results (5/7) Localization result of the second scenario.

Experimental results (6/7) Histogram of the measurement residuals over all datasets.

Experimental results (7/7) Comparison with a method based on the front camera. Lateral accuracy is significantly improved.

Conclusion Lane-level localization using AVM cameras with a lane map is presented. Lane detection uses the AVM module. Position correction uses ICP between the detected lanes and the digital map. The localization filter fuses the map-matching result with the vehicle sensors' data. A covariance estimator for ICP and a validation gate are designed into the localization filter.

Thank you for your attention!

Iterative closest point (ICP) It compares the two point clouds and finds the transformation (rotation matrix and translation vector) iteratively by minimizing the sum of the squared distances between corresponding points. The whole ICP algorithm is as follows (a sketch in code follows the list).
1. X is the model cloud set; P is the object cloud set.
2. Initialize P_0 = P, q_0 = [1, 0, 0, 0, 0, 0, 0]^T, k = 0.
3. Find P's closest points Y in X: Y_k = Closest(P_k, X).
4. Calculate the transfer parameters and error: (q_k, d_k) = Q(P_0, Y_k).
5. Coordinate transformation: P_{k+1} = q_k(P_0).
6. If d_{k-1} - d_k < τ (τ > 0), go to step 7; otherwise go to step 3.
7. The final coordinate transformation: P_{n+1} = q_n(P_0).
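A minimal 2-D sketch of these steps, using an SVD-based rigid alignment instead of the quaternion parameterization and brute-force nearest neighbours for clarity; it is an illustration, not the paper's point-to-plane implementation.

```python
# Minimal 2-D point-to-point ICP sketch following the steps above.
# X is the model cloud (M, 2), P the object cloud (N, 2).
import numpy as np

def best_rigid_transform(P, Y):
    """Least-squares rotation R and translation t mapping P onto Y."""
    mu_p, mu_y = P.mean(axis=0), Y.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - mu_p).T @ (Y - mu_y))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_y - R @ mu_p
    return R, t

def icp(X, P, tau=1e-6, max_iters=50):
    """Align object cloud P to model cloud X; returns the accumulated
    rotation, translation, and the transformed cloud."""
    P_k = P.copy()
    prev_err = np.inf
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(max_iters):
        # Step 3: closest point in X for every point of P_k (brute force).
        d2 = ((P_k[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
        Y = X[d2.argmin(axis=1)]
        # Step 4: transform parameters for this iteration.
        R, t = best_rigid_transform(P_k, Y)
        # Step 5: apply the transform and accumulate the total correction.
        P_k = P_k @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        # Step 6: stop once the error change falls below tau.
        err = np.mean(((P_k - Y) ** 2).sum(axis=1))
        if abs(prev_err - err) < tau:
            break
        prev_err = err
    return R_total, t_total, P_k
```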

Particle Filter
Initialization: draw plenty of particles X(t), evenly distributed over the state space.
Prediction: push every particle through the state transition equation to obtain predicted particles.
Correction: evaluate each predicted particle against the measurement and assign it a weight.
Resampling: keep or duplicate particles based on their weights.
Filtering: the resampled particles are propagated through the state transition equation again to predict new particles, repeating from the prediction step. (A sketch in code follows.)
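A generic sketch of these steps for an [x, y, ψ] state; the motion and measurement models are illustrative placeholders, not the paper's.

```python
# Generic particle-filter step matching the list above
# (predict with the state-transition model, weight, resample).
import numpy as np

def particle_filter_step(particles, weights, u, z, dt,
                         motion_noise, meas_std, rng):
    """particles: (N, 3) array of [x, y, psi] hypotheses; u = [v, yaw_rate];
    z = observed [x, y] position (placeholder measurement model)."""
    particles = particles.copy()
    N = len(particles)
    v, omega = u

    # Prediction: push every particle through the state-transition equation.
    particles[:, 0] += v * dt * np.cos(particles[:, 2])
    particles[:, 1] += v * dt * np.sin(particles[:, 2])
    particles[:, 2] += omega * dt
    particles += rng.normal(0.0, motion_noise, size=particles.shape)

    # Correction: weight each particle by a Gaussian measurement likelihood.
    d2 = ((particles[:, :2] - z) ** 2).sum(axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std ** 2)
    weights += 1e-300                      # avoid an all-zero weight vector
    weights /= weights.sum()

    # Resampling: draw particles proportionally to their weights.
    idx = rng.choice(N, size=N, p=weights)
    particles = particles[idx]
    weights = np.full(N, 1.0 / N)
    return particles, weights
```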