Working with the robot_localization Package


Working with the robot_localization Package
Tom Moore
Clearpath Robotics

What is robot_localization?
- General-purpose state estimation package
- No limit on the number of input data sources
- Supported message types for the state estimation nodes:
  - nav_msgs/Odometry
  - geometry_msgs/PoseWithCovarianceStamped
  - geometry_msgs/TwistWithCovarianceStamped
  - sensor_msgs/Imu
- State vector: 15-dimensional (3D position, 3D orientation, their velocities, and linear acceleration)
- Two typical use cases:
  1. Fuse continuous sensor data (e.g., wheel encoder odometry and IMU) to produce a locally accurate state estimate
  2. Fuse continuous data with global pose estimates (e.g., from SLAM) to provide an accurate and complete global state estimate
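The first of these use cases can be sketched as a minimal launch fragment. The topic names and the choice of fused variables below are illustrative assumptions, not taken from the talk:

```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_local">
    <!-- Use case 1: fuse wheel odometry and IMU for a locally
         accurate estimate (topic names are placeholders) -->
    <param name="world_frame" value="odom"/>
    <param name="odom0" value="/wheel/odometry"/>
    <param name="imu0" value="/imu/data"/>
    <!-- Fuse x/y velocity and yaw velocity from the wheel odometry -->
    <rosparam param="odom0_config">
      [false, false, false,
       false, false, false,
       true,  true,  false,
       false, false, true,
       false, false, false]
    </rosparam>
    <!-- Fuse yaw and yaw velocity from the IMU -->
    <rosparam param="imu0_config">
      [false, false, false,
       false, false, true,
       false, false, false,
       false, false, true,
       false, false, false]
    </rosparam>
  </node>
</launch>
```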

Nodes
State estimation nodes:
- ekf_localization_node: implementation of an extended Kalman filter (EKF)
- ukf_localization_node: implementation of an unscented Kalman filter (UKF)
Sensor preprocessing nodes:
- navsat_transform_node: allows users to easily transform geographic coordinates (latitude and longitude) into the robot's world frame (typically map or odom)

robot_localization and the ROS Navigation Stack
[Diagram: one *kf_localization_node fuses continuous data (IMU, wheel encoders, visual odometry, barometer, and so much more); a second fuses global pose sources (amcl, gmapping, and GPS via navsat_transform_node) for the navigation stack.]
Source: http://wiki.ros.org/move_base

Preparing Your Sensor Data: REPs
Fact: you can't spell "prepare" without REP.
Two important REPs:
- REP-103 (http://www.ros.org/reps/rep-0103.html): covers standards for units and basic coordinate frame conventions
- REP-105 (http://www.ros.org/reps/rep-0105.html): covers naming and semantics of the "principal" coordinate frames in ROS
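A common REP-103 pitfall is heading conventions: many compasses report degrees clockwise from north, while REP-103 expects yaw in radians, counterclockwise from east (ENU). A small sketch of the conversion (the function name is illustrative, not part of the package):

```python
import math

def compass_to_enu_yaw(compass_deg):
    """Convert a compass heading (degrees, clockwise from north) to a
    REP-103 yaw (radians, counterclockwise from east), in (-pi, pi]."""
    yaw = math.radians(90.0 - compass_deg)
    # Normalize to (-pi, pi]
    while yaw <= -math.pi:
        yaw += 2.0 * math.pi
    while yaw > math.pi:
        yaw -= 2.0 * math.pi
    return yaw

# North (0 deg) maps to +pi/2; East (90 deg) maps to 0.
```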

Preparing Your Sensor Data: Transforms
[Diagram: tf tree map → odom → base_link]
ekf_localization_node transforms pose data into its world frame (here, odom), and twist data (and IMU data) into base_link.

Preparing Your Sensor Data: Frame IDs
IMU data requires a base_link → imu_link transform; the other supported message types require no additional transforms.

Preparing Your Sensor Data: Covariance
Filter parameters:
- initial_estimate_covariance (P0)
- process_noise_covariance
Measurement covariances come from the input messages:
- nav_msgs/Odometry
- geometry_msgs/PoseWithCovarianceStamped
- geometry_msgs/TwistWithCovarianceStamped
- sensor_msgs/Imu

Configuring *kf_localization_node
Coordinate frame specification:
<param name="map_frame" value="map"/>
<param name="odom_frame" value="odom"/>
<param name="base_link_frame" value="base_link"/>
<param name="world_frame" value="odom"/>

Configuring *kf_localization_node
Input specification:
<param name="odom0" value="/controller/odom"/>
<param name="odom1" value="/some/other/odom"/>
<param name="pose0" value="/altitude"/>
<param name="pose1" value="/some/other/pose"/>
<param name="pose2" value="/yet/another/pose"/>
<param name="twist0" value="/optical_flow"/>
<param name="imu0" value="/imu/left"/>
<param name="imu1" value="/imu/right"/>
<param name="imu2" value="/imu/front"/>
<param name="imu3" value="/imu/back"/>

Configuring *kf_localization_node
Basic input configuration. Each boolean enables fusion of one state variable, in this order:
x, y, z; roll, pitch, yaw; x velocity, y velocity, z velocity; roll velocity, pitch velocity, yaw velocity; x accel., y accel., z accel.

<rosparam param="odom0_config">
  [true,  true,  false,
   false, false, false,
   false, false, true,
   false, false, false,
   false, false, false]
</rosparam>

<rosparam param="odom1_config">
  [true,  true,  true,
   false, false, false,
   false, false, false,
   false, false, false,
   false, false, false]
</rosparam>
<param name="odom1_differential" value="true"/>

<rosparam param="imu0_config">
  [false, false, false,
   true,  true,  true,
   false, false, false,
   true,  true,  true,
   false, false, false]
</rosparam>
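Writing these 15-element boolean vectors by hand is error-prone. A hypothetical helper (not part of robot_localization) that builds one from named state variables might look like this:

```python
# Order of the 15 state variables in a robot_localization config vector.
STATE_ORDER = [
    "x", "y", "z",
    "roll", "pitch", "yaw",
    "vx", "vy", "vz",
    "vroll", "vpitch", "vyaw",
    "ax", "ay", "az",
]

def make_config(*fused):
    """Return a 15-element list of booleans, True for each fused variable."""
    unknown = set(fused) - set(STATE_ORDER)
    if unknown:
        raise ValueError("unknown state variables: %s" % sorted(unknown))
    return [name in fused for name in STATE_ORDER]

# e.g. odometry fusing x, y, and z velocity:
# make_config("x", "y", "vz")
```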

Configuring *kf_localization_node
Covariance specification (P0 and Q):
<rosparam param="initial_estimate_covariance">
  [0.8, 0, ..., 1e-9]
</rosparam>
<rosparam param="process_noise_covariance">
  [0.04, 0, ..., 0.02]
</rosparam>
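Both parameters are full 15×15 matrices, written in the launch file as a flat row-major list of 225 values. A small sketch of generating such a list, with placeholder diagonal values:

```python
def diagonal_covariance(diag, size=15):
    """Return a size x size diagonal matrix as a flat row-major list,
    suitable for a <rosparam> covariance parameter."""
    if len(diag) != size:
        raise ValueError("expected %d diagonal entries, got %d" % (size, len(diag)))
    return [diag[i] if i == j else 0.0
            for i in range(size) for j in range(size)]

# 225 values: 0.8 on the pose block diagonal, 1e-9 on the rest
# (the numbers here are placeholders, not recommendations)
p0 = diagonal_covariance([0.8] * 6 + [1e-9] * 9)
```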

Live Demo

Using navsat_transform_node
What does it do?
- Many robots operate outdoors and make use of GPS receivers
- Problem: getting the data into your robot's world frame
- Solution:
  1. Convert GPS data to UTM coordinates
  2. Use the initial UTM coordinate, EKF/UKF output, and IMU to generate a (static) transform T from the UTM grid to your robot's world frame
  3. Transform all future GPS measurements using T
  4. Feed the output back into the EKF/UKF
Required inputs:
- nav_msgs/Odometry (EKF output, needed for the robot's current pose)
- sensor_msgs/Imu (must have a compass, needed to determine global heading)
- sensor_msgs/NavSatFix (output from your navigation satellite device)
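Steps 2 and 3 amount to a rigid transform between the UTM grid and the world frame. A simplified 2D sketch (not the node's actual implementation) that assumes the robot started at the world origin with zero yaw:

```python
import math

def utm_to_world(px, py, utm_x0, utm_y0, utm_yaw0):
    """Map a UTM point (px, py) into the world frame, given the robot's
    initial UTM position (utm_x0, utm_y0) and initial UTM heading
    utm_yaw0 (radians, from compass + magnetic declination), assuming
    the robot's initial world pose was (0, 0) with zero yaw."""
    dx, dy = px - utm_x0, py - utm_y0
    c, s = math.cos(utm_yaw0), math.sin(utm_yaw0)
    # Rotate the UTM offset by -utm_yaw0 into the world frame
    return (c * dx + s * dy, -s * dx + c * dy)
```

For example, if the robot started facing UTM grid north (utm_yaw0 = pi/2), a point 10 m north of the start maps to (10, 0): directly ahead along the world x axis.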

Using navsat_transform_node
Relevant settings:
<param name="magnetic_declination_radians" value="0"/>
<param name="yaw_offset" value="0"/>
<param name="zero_altitude" value="true"/>
<param name="broadcast_utm_transform" value="true"/>
<param name="publish_filtered_gps" value="true"/>

Using navsat_transform_node
Typical setup:
- An odom-frame *kf_localization_node fuses continuous sensor data and publishes /odometry/filtered (nav_msgs/Odometry)
- navsat_transform_node subscribes to /odometry/filtered, /imu/data (sensor_msgs/Imu), and /gps/fix (sensor_msgs/NavSatFix), and publishes /odometry/gps (nav_msgs/Odometry)
- A map-frame *kf_localization_node fuses /odometry/gps along with the other sensor data
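This typical setup can be sketched as a launch fragment; node names, and any topic wiring beyond the defaults, are illustrative assumptions:

```xml
<launch>
  <!-- Odom-frame EKF: fuses continuous data, publishes /odometry/filtered -->
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom">
    <param name="world_frame" value="odom"/>
    <!-- odom0/imu0 inputs and their config vectors go here -->
  </node>

  <!-- Converts GPS fixes into world-frame odometry (/odometry/gps),
       using the EKF output and IMU heading -->
  <node pkg="robot_localization" type="navsat_transform_node"
        name="navsat_transform"/>

  <!-- Map-frame EKF: fuses the GPS-derived odometry as one more input -->
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_map">
    <param name="world_frame" value="map"/>
    <param name="odom1" value="/odometry/gps"/>
    <!-- plus the same continuous inputs as ekf_odom -->
  </node>
</launch>
```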

robot_localization in the Wild
robot_localization works on a broad range of robots: from this… (ayrbot) …to this (OTTO).

Thank you! Fin
http://wiki.ros.org/robot_localization
tmoore@clearpathrobotics.com