VISION-AUGMENTED INERTIAL NAVIGATION BY SENSOR FUSION FOR AN AUTONOMOUS ROTORCRAFT VEHICLE
C.L. Bottasso, D. Leonello, Politecnico di Milano
AHS International Specialists' Meeting on Unmanned Rotorcraft, Phoenix, AZ, January 20-22, 2009

Outline
- Introduction and motivation
- Inertial navigation by measurement fusion
- Vision-augmented inertial navigation: stereo projection and vision-based position sensors; vision-based motion sensors; outlier rejection
- Results and applications
- Conclusions and outlook

Rotorcraft UAVs at PoliMI
Low-cost platform for the development and testing of navigation and control strategies (including vision, flight envelope protection, etc.):
- Vehicles: off-the-shelf hobby helicopters
- On-board control hardware based on the PC-104 standard
Bottom-up approach, everything developed in-house:
- Inertial navigation system (this paper)
- Guidance and control algorithms (AHS UAV '07: C.L. Bottasso et al., path planning by motion primitives, adaptive flight control laws)
- Linux-based real-time OS
- Flight simulators
- System identification (estimation of inertia, estimation of aerodynamic model parameters from flight test data)
- Etc.

UAV Control Architecture
Hierarchical three-layer control architecture (Gat 1998):
- Strategic layer: assigns mission objectives (typically relegated to a human operator)
- Tactical layer: generates vehicle guidance information, based on input from the strategic layer and on ambient mapping information
- Reflexive layer: tracks the trajectory generated by the tactical layer; controls, stabilizes and regulates the vehicle
Sensing requirements:
- Sense the vehicle state of motion (to enable planning and tracking)
- Sense the environment (to enable mapping)

Sensing Architecture
Sensing of vehicle motion states: accelerometer, gyro, sonar altimeter, magnetometer, GPS and other sensors feed a sensor fusion algorithm that produces the state estimates.
Sensing of the environment for mapping: stereo cameras and laser scanner feed the ambient map and obstacle/target recognition.
Advantages:
- Improved accuracy/better estimates, especially in proximity of obstacles
- Tolerance to sensor loss (e.g. because of faults, or GPS loss indoors, under vegetation or in urban canyons)
Proposed approach: recruit the vision sensors for improved state estimation (this paper).

Classical Navigation System
Sensor fusion by Kalman-type filtering, to account for measurement and process noise.
States: $x := (v^{E\,T}_B,\ \omega^{B\,T},\ r^{E\,T}_{OB},\ q)^T$
Outputs: $y := (v^{E\,T}_G,\ r^{E\,T}_{OG},\ h,\ m^{B\,T})^T$
Inputs: $u := (a^T_{acc},\ \omega^T_{gyro})^T$
Measures: $z := (v^T_{gps},\ r^T_{gps},\ h_{sonar},\ m^T_{magn})^T$
Process and measurement models:
$\dot{x}(t) = f\big(x(t), u(t), \nu(t)\big)$
$y(t_k) = h\big(x(t_k)\big)$
$z(t_k) = y(t_k) + \mu(t_k)$
State estimate update:
$\hat{x}(t_{k+1}) = \bar{x}(t_{k+1}) + K(t_{k+1})\,\big(z(t_{k+1}) - \bar{y}(t_{k+1})\big)$
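The predict/update cycle of a Kalman-type filter can be sketched in its simplest scalar form. This is an illustrative toy, not the paper's multi-sensor nonlinear filter; the noise variances Q and R are made-up values:

```python
# Minimal scalar Kalman filter sketch: estimate a constant quantity from noisy
# measurements. The paper's filter is the nonlinear multi-sensor vehicle-state
# version; the structure (predict, gain, correct) is the same.
def kalman_step(x_hat, P, z, Q, R):
    # Predict: constant-state model x_{k+1} = x_k, process noise variance Q.
    x_pred = x_hat
    P_pred = P + Q
    # Update: measurement z = x + noise; the gain K weights measurement
    # against prediction according to their variances.
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Fuse a short sequence of noisy measurements of a true value of 5.0.
x_hat, P = 0.0, 1.0
for z in [5.2, 4.9, 5.1, 5.0, 4.8, 5.05]:
    x_hat, P = kalman_step(x_hat, P, z, Q=1e-4, R=0.04)
print(round(x_hat, 2), round(P, 4))
```

The estimate converges toward the true value while the covariance P shrinks, which is the behavior the fusion algorithm relies on when blending sensors of different quality.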

Vision-Based Navigation System
Vision sensors are added to the classical suite: the stereo cameras feed a KLT tracker, whose output passes through an outlier rejection stage before entering the sensor fusion algorithm together with the accelerometer, gyro, sonar altimeter, magnetometer, GPS and other sensors.
Kanade-Lucas-Tomasi (KLT) tracker:
- Tracks feature points in the scene across the stereo cameras and across time steps
- Each tracked point becomes a vision-based motion sensor
- Has its own internal outlier rejection algorithm
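The idea of re-locating a feature point from one image to the next can be illustrated with a brute-force template matcher. This is a toy stand-in: the actual system uses the KLT algorithm, which solves the same matching problem differentially rather than by exhaustive search:

```python
# Toy feature tracker sketch: re-locate a small patch from prev_img in next_img
# by minimizing the sum of squared differences (SSD) over a local search window.
# Images are plain nested lists of intensities.
def track_patch(prev_img, next_img, x, y, size=3, search=2):
    """Return the (x, y) in next_img whose patch best matches the one at (x, y)
    in prev_img, searching within +/- `search` pixels."""
    template = [row[x:x + size] for row in prev_img[y:y + size]]
    best, best_xy = float("inf"), (x, y)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            nx, ny = x + dx, y + dy
            if ny < 0 or nx < 0 or ny + size > len(next_img) or nx + size > len(next_img[0]):
                continue  # candidate patch would fall outside the image
            cand = [row[nx:nx + size] for row in next_img[ny:ny + size]]
            ssd = sum((a - b) ** 2
                      for trow, crow in zip(template, cand)
                      for a, b in zip(trow, crow))
            if ssd < best:
                best, best_xy = ssd, (nx, ny)
    return best_xy
```

Running the same matcher between the left and right images gives tracking across cameras (and hence disparity); running it between times k and k+1 gives tracking across time steps.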

Vision-Based Position Sensor
Feature point projection: $p = \pi(d)$
Stereo disparity, computed with the Kanade-Lucas-Tomasi (KLT) algorithm: $d_p = p_1 - p'_1$
Feature position in the camera frame, by triangulation from the disparity and the camera baseline $b$: $d^C = \dfrac{b}{d_p}\, p^C$
[Figure: stereo geometry with baseline $b$, focal length $f$ and camera frames $C$, $C'$; effect of a one-pixel error on the estimated distance (BumbleBee X3 camera)]
Remark: stereo vision information from low-resolution cameras is noisy and must be handled with care.
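The slide's point about the effect of a one-pixel error can be checked numerically with the classic pinhole-stereo depth relation $Z = b f / d_p$. The baseline and focal length below are assumed example values, not the BumbleBee X3 calibration:

```python
# Stereo depth from disparity, and its sensitivity to a one-pixel error.
# b and f are illustrative values, not the slide's camera calibration.
def depth_from_disparity(b, f, disparity_px):
    # Pinhole stereo: depth Z = baseline * focal_length / disparity.
    return b * f / disparity_px

b, f = 0.24, 800.0  # baseline [m], focal length [px] (assumed)
for d in (40.0, 10.0, 4.0):
    z = depth_from_disparity(b, f, d)
    z_err = depth_from_disparity(b, f, d - 1.0)  # one-pixel disparity error
    print(f"d = {d:5.1f} px  Z = {z:6.2f} m  1 px error -> {z_err - z:+.2f} m")
```

Because depth varies as $1/d_p$, the same one-pixel error grows roughly quadratically with distance, which is why the low-resolution stereo measurements must be treated as noisy.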

Feature Point Tracking
[Figure: left- and right-camera images at times k and k+1, illustrating tracking across cameras and tracking across time steps]

Vision-Based Motion Sensor
Differentiate the vector closure expression for a feature point fixed in the scene:
$\dfrac{d}{dt}\big( r^E + R\,c^B + R\,C\,d^C \big) = 0$
Apparent motion of the feature point on the image plane (motion sensor):
$\dot{p}^C = -M\,C^T \big( R^T v^E_B + \omega^B \times ( c^B + C\,d^C ) \big)$
The expression couples attitude ($R$), linear velocity ($v^E_B$) and angular velocity ($\omega^B$).
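The apparent-motion expression can be transcribed directly with plain-list linear algebra. The projection Jacobian M, the camera mounting rotation C and the attitude R are supplied by the caller; all names are illustrative:

```python
# Sketch of the apparent-motion prediction: given the vehicle's velocity and
# angular rate, predict the image-plane velocity of a tracked feature point,
#   p_dot = -M C^T (R^T v_E + omega_B x (c_B + C d_C))
def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def transpose(M):
    return [list(col) for col in zip(*M)]

def apparent_velocity(M, C, R, v_E, omega_B, c_B, d_C):
    # Lever arm from body origin to the feature, expressed in body axes.
    arm = [cb + cd for cb, cd in zip(c_B, matvec(C, d_C))]
    # Relative velocity seen by the camera, in body axes.
    inner = [a + b for a, b in zip(matvec(transpose(R), v_E), cross(omega_B, arm))]
    # Rotate into camera axes and project onto the image plane.
    return [-x for x in matvec(M, matvec(transpose(C), inner))]
```

With identity attitude and camera rotation, a purely translating vehicle produces the expected image-plane velocity opposite to its motion, which is the sanity check below.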

Vision-Based Motion Sensor (cont.)
1. For all tracked feature points, write the motion sensor equation. This defines a new output of the vehicle states:
$y := (v^{E\,T}_G,\ r^{E\,T}_{OG},\ h,\ m^{B\,T},\ \ldots,\ d(t_{k+1})^{C_{k+1}\,T},\ d(t_{k+1})^{C'_{k+1}\,T},\ \ldots)^T$
2. Measure the apparent motion of each feature point (the measured apparent velocity due to vehicle motion). This defines a new augmented measurement vector:
$z := (v^T_{gps},\ r^T_{gps},\ h_{sonar},\ m^T_{magn},\ \ldots,\ d^T_{vision},\ d'^{\,T}_{vision},\ \ldots)^T$
i.e. GPS, gyro, accelerometer, magnetometer and altimeter readings, plus two vision sensors (left and right cameras) per tracked feature point.
3. Fuse in parallel with all other sensors using Kalman filtering.
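Assembling the augmented measurement vector, with the base sensor readings followed by two vision entries per tracked feature, can be sketched as follows (function and argument names are illustrative, not the paper's code):

```python
# Build the augmented measurement vector z: classical sensor readings first,
# then, per tracked feature point, one entry from each camera (left, right).
def augmented_measurement(v_gps, r_gps, h_sonar, m_magn, features):
    """features: list of (left_meas, right_meas) pairs, one per tracked point."""
    z = list(v_gps) + list(r_gps) + [h_sonar] + list(m_magn)
    for left_meas, right_meas in features:
        z += list(left_meas) + list(right_meas)
    return z
```

The vector length thus grows and shrinks with the number of currently tracked points, which a Kalman-type filter accommodates by resizing the measurement model at each step.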

Outlier Rejection
Outlier: a point which is not fixed with respect to the scene, or a false positive of the KLT tracking algorithm. Outliers give false information on the state of motion, so a way to discard them from the process is needed.
Compare the apparent point velocity predicted from the estimated vehicle motion with the measured apparent velocity: drop a tracked point if the two vectors differ too much in length and direction.
Two-stage rejection:
1. KLT internal
2. Vehicle motion compatibility check
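The motion-compatibility check, dropping a point when the predicted and measured apparent velocities disagree in length or direction, might look like this (the tolerance values are illustrative assumptions):

```python
import math

# Motion-compatibility outlier check sketch: compare the apparent velocity
# predicted from the estimated vehicle motion against the measured one.
# Thresholds are illustrative, not the paper's tuning.
def is_outlier(v_pred, v_meas, len_tol=0.5, ang_tol_deg=30.0):
    n_pred = math.hypot(*v_pred)
    n_meas = math.hypot(*v_meas)
    if n_pred == 0.0 or n_meas == 0.0:
        return n_pred != n_meas  # one still, one moving: incompatible
    # Relative length mismatch.
    if abs(n_pred - n_meas) / max(n_pred, n_meas) > len_tol:
        return True
    # Angle between the two image-plane velocity vectors.
    cos_ang = (v_pred[0] * v_meas[0] + v_pred[1] * v_meas[1]) / (n_pred * n_meas)
    ang = math.degrees(math.acos(max(-1.0, min(1.0, cos_ang))))
    return ang > ang_tol_deg
```

A point on a moving object (or a false KLT match) typically fails one of the two tests, so it never enters the fusion step and cannot corrupt the state estimate.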

Examples: Pirouette Around Point Cloud
- Cloud of about 100 points
- Temporary loss of GPS signal (for 100 s < t < 200 s)
To show conservative results:
- Only ≤ 20 tracked points at each instant of time
- Vision sensors operating at 1 Hz

Examples: Pirouette Around Point Cloud (cont.)
[Figure: state estimates with filter warm-up and begin/end of GPS signal loss marked, for the classical non-vision-based IMU and the vision-based IMU]
Remark: no evident effect of the GPS loss on state estimation for the vision-augmented IMU.

Examples: Flight in a Village
- Scene environment and image acquisition based on the Gazebo simulator
- Rectangular flight path in a village at 2 m of altitude
- Three loops: 1. with GPS; 2. without GPS; 3. with GPS

Examples: Flight in a Village (cont.)
[Figure: velocity and attitude estimates, with filter warm-up and begin/end of GPS signal loss marked]
Some degradation of the velocity estimates without GPS; no evident effect of the GPS loss on attitude estimation.

Conclusions
Proposed a novel inertial navigation system using vision motion sensors; the basic concept was demonstrated in a high-fidelity virtual environment.
Observed facts:
- Improved observability of the vehicle states
- No evident transients during loss and reacquisition of sensor signals
- Higher accuracy when close to objects and for an increasing number of tracked points
- Computational cost compatible with the on-board hardware (PC-104 Pentium III)
Outlook:
- Testing in the field
- Adaptive filtering: better robustness/tuning
- Recruitment of additional sensors (e.g. laser scanner)