Estimating the Driving State of Oncoming Vehicles From a Moving Platform Using Stereo Vision (IEEE Intelligent Transportation Systems, 2009), by Alexander Barth and Uwe Franke. Presented by M.S. student Hee-Jong Hong.

Presentation transcript:

Estimating the Driving State of Oncoming Vehicles From a Moving Platform Using Stereo Vision
Alexander Barth and Uwe Franke, IEEE Intelligent Transportation Systems, 2009
Presented by M.S. student Hee-Jong Hong

Outline: Introduction, Related Works, Proposed Method, Experimental Results, Conclusion

Introduction
Driver-assistance and safety systems:
- Dynamic object detection for driver-assistance systems (DAS)
- Safety systems with dynamic path estimation

Related Works
1. A model-free object representation based on feature groups
2. Fusion with active sensors
3. Track-before-detect
4. Re-detecting an image region labeled as a vehicle

D. Beymer, P. McLauchlan, B. Coifman, and J. Malik, "A real-time computer vision system for measuring traffic parameters," Proc. Comput. Vis. Pattern Recog.
M. Maehlisch, W. Ritter, and K. Dietmayer, "De-cluttering with integrated probabilistic data association for multisensor multitarget ACC vehicle tracking," Proc. IEEE Intell. Veh. Symp.
U. Franke, C. Rabe, H. Badino, and S. Gehrig, "6D-Vision: Fusion of stereo and motion for robust environment perception," Proc. 27th DAGM Symp.
X. Li, X. Yao, Y. Murphey, R. Karlsen, and G. Gerhart, "A real-time vehicle detection and tracking system in outdoor traffic scenes," Proc. 17th Int. Conf. Pattern Recog.

Proposed Method

Object Model
1. Pose: relative orientation and translation to the ego-vehicle
2. Motion state: velocity, acceleration, yaw rate
3. Shape: rigid 3-D point cloud
[Figure: object model illustrating pose, motion state, and shape]
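To make the three components concrete, here is a minimal Python sketch of such an object representation; the class and field names, layouts, and units are illustrative assumptions, not taken from the paper.

    # Illustrative sketch of the object model (pose + motion state + shape).
    # Field names and units are assumptions, not from the paper.
    from dataclasses import dataclass, field
    import numpy as np

    @dataclass
    class TrackedVehicle:
        # Pose: translation and orientation (yaw) relative to the ego-vehicle
        position: np.ndarray = field(default_factory=lambda: np.zeros(3))  # [X, Y, Z] in ego coords (m)
        yaw: float = 0.0            # rad
        # Motion state
        velocity: float = 0.0       # m/s
        acceleration: float = 0.0   # m/s^2
        yaw_rate: float = 0.0       # rad/s
        # Shape: rigid 3-D point cloud in object coordinates
        points: np.ndarray = field(default_factory=lambda: np.zeros((0, 3)))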

Object Tracking: Extended Kalman Filter (EKF)
The EKF is a Kalman filter for nonlinear models: the state-transition model f and the observation model h may be nonlinear, and their Jacobians are used in the discrete-time predict and update equations (see, e.g., the Wikipedia article on the extended Kalman filter).
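For reference, the discrete-time EKF predict and update equations the slide refers to, in their standard textbook form (not copied from the paper):

    \hat{x}_{k|k-1} = f(\hat{x}_{k-1|k-1}, u_k), \qquad
    P_{k|k-1} = F_k P_{k-1|k-1} F_k^\top + Q_k

    \tilde{y}_k = z_k - h(\hat{x}_{k|k-1}), \qquad
    S_k = H_k P_{k|k-1} H_k^\top + R_k, \qquad
    K_k = P_{k|k-1} H_k^\top S_k^{-1}

    \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \tilde{y}_k, \qquad
    P_{k|k} = (I - K_k H_k) P_{k|k-1}

Here F_k is the Jacobian of the system model f evaluated at the previous estimate, and H_k is the Jacobian of the measurement model h evaluated at the predicted state.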

Object Tracking 1: State Vector
The state vector of an object instance contains a reference point in ego-coordinates and a rotation point in object-coordinates. The object origin is ideally defined at the center of the rear axle.
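As an illustration only (the paper's exact state composition may differ), such a state vector could be laid out as

    x = [\, X, Z, \psi, v, \dot{\psi}, a \,]^\top

with (X, Z) the reference point in ego-coordinates on the ground plane, \psi the yaw angle, v the velocity, \dot{\psi} the yaw rate, and a the acceleration; the rotation point and the object points may be appended as further state entries or kept in a separate shape model.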

Object Tracking 2: Dynamic (System) Model
The time-discrete system equation yields the predicted state vector; an object point is transformed by a 3x3 rotation matrix R around the height axis followed by a translation.
Reference: N. Kaempchen, K. Weiss, M. Schaefer, and K. Dietmayer, "IMM object tracking for high dynamic driving maneuvers," Proc. IEEE Intell. Veh. Symp.
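A sketch of the kind of point transformation the slide describes, assuming planar motion with yaw rate \dot{\psi} over a time step \Delta t (illustrative, not the paper's exact system equation):

    R(\Delta\psi) =
    \begin{pmatrix}
      \cos\Delta\psi & 0 & \sin\Delta\psi \\
      0 & 1 & 0 \\
      -\sin\Delta\psi & 0 & \cos\Delta\psi
    \end{pmatrix},
    \qquad \Delta\psi = \dot{\psi}\,\Delta t

    p_{k+1} = R(\Delta\psi)\, p_k + t_k, \qquad
    \psi_{k+1} = \psi_k + \dot{\psi}\,\Delta t, \qquad
    v_{k+1} = v_k + a\,\Delta t

where t_k is the translation induced by the object's velocity during the time step.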

Object Tracking 3: Measurement Model
The measurement equation is nonlinear: a perspective camera model maps the object's feature points to image coordinates, and the Jacobian of the measurement model is used in the EKF update. The feature points are obtained with a KLT feature tracker.
[Figure: feature point tracking using KLT]
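As a generic example of such a nonlinear measurement equation (standard pinhole stereo model with focal lengths f_u, f_v, principal point (u_0, v_0), and baseline b; not necessarily the paper's exact parameterization), a point (X, Y, Z) in camera coordinates projects as

    u = f_u \frac{X}{Z} + u_0, \qquad
    v = f_v \frac{Y}{Z} + v_0, \qquad
    d = \frac{f_u\, b}{Z}

The Jacobian entries follow by differentiation, e.g. \partial u/\partial X = f_u / Z and \partial u/\partial Z = -f_u X / Z^2.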

Kalman Filter Initialization
1. Image-based initialization: the mean velocity vector of the tracked points, the centroid of their 3-D positions, and the initial yaw
2. Radar-based initialization (oncoming vehicles detected up to 200 m): the lateral and longitudinal positions of the radar target, the absolute radar velocity of the object, and the initial yaw
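One natural assumption for how the initial yaw could be derived (not taken from the paper) is the heading of the estimated velocity vector (v_X, v_Z) in ego-coordinates,

    \psi_0 = \operatorname{atan2}(v_X, v_Z)

using the mean velocity of the tracked points in the image-based case, or the absolute radar velocity in the radar-based case.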

Point Model Update
The expected object shape (the positions of the object points) is refined over time using:
1. Maximum-likelihood estimation, with a 3x3 covariance matrix per point at time t
2. A simple average filter
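A minimal sketch of such a simple average filter, assuming p_i^{(t)} is the measured position of object point i at time step t (illustrative only):

    \bar{p}_i^{(t)} = \frac{(t-1)\,\bar{p}_i^{(t-1)} + p_i^{(t)}}{t}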

Experimental Results

Simulation Results: Synthetic Sequence

Real-World Results: Country Road Curve I

Real-World Results: Country Road Curve II

Real-World Results: Oncoming Traffic at Intersections

Real-World Results: Leading Vehicles and Partial Occlusions

Real-World Results: Challenges and Limits

Conclusion

Contributions:
- A new method for image-based real-time tracking (25 Hz at 640x480)
- Experimental results on both synthetic and real-world sequences
- Two different object detection methods for initialization (image-based and radar-based)
- The feature-based object point model requires no a priori knowledge of the object's shape
Weaknesses:
- No specific system block diagram is given
- The rotation point is user-defined
- The estimated shape depends on the outlier-removal algorithm (e.g., the maximum-distance parameter)
- The shape is very sensitive to outliers in the point cloud (because of the yaw estimate)

Thank you!