Teaching Assistant: Roi Yehoshua

ROS Navigation Stack
Building a map with ROS
ROS visualization tool (rviz)

The ROS navigation stack is a 2D navigation stack: it takes in information from odometry, sensor streams, and a goal pose, and outputs safe velocity commands that are sent to a mobile base. In other words, the navigation stack can move your robot to another position without problems (such as crashing or getting lost).
ROS Navigation Introductory Video
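As a minimal sketch of the "goal pose in" side of that interface (this assumes the navigation stack's move_base node is already up and running; the frame name and coordinates below are placeholders), a goal can be published from Python:
#!/usr/bin/env python
# Sketch: publish a goal pose to the navigation stack.
# Assumes move_base is running and listening on move_base_simple/goal.
import rospy
from geometry_msgs.msg import PoseStamped

rospy.init_node('send_nav_goal')
pub = rospy.Publisher('move_base_simple/goal', PoseStamped, queue_size=1)
rospy.sleep(1.0)                        # let the connection establish

goal = PoseStamped()
goal.header.frame_id = 'map'            # assumed fixed frame
goal.header.stamp = rospy.Time.now()
goal.pose.position.x = 1.0              # placeholder target coordinates
goal.pose.position.y = 2.0
goal.pose.orientation.w = 1.0           # identity orientation
pub.publish(goal)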

There are three main hardware requirements:
– The navigation stack can only handle differential-drive and holonomic wheeled robots. It can also do certain things with biped robots, such as localization, as long as the robot does not move sideways.
– A planar laser must be mounted on the mobile base of the robot for map building and localization. Alternatively, you can generate something equivalent to laser scans from other sensors (a Kinect, for example).
– Its performance will be best on robots that are nearly square or circular.

Before we can start using the navigation stack, we need to provide the robot with a map of the world.
Different options to create the initial map:
– Get the map from an external source, like the building's floorplan
– Manual navigation of the robot using teleoperation
– Random walk algorithm
– More sophisticated algorithms, e.g., Frontier-Based Exploration, Online Coverage

Simultaneous localization and mapping (SLAM) is a technique used by robots to build up a map of an unknown environment while at the same time keeping track of their current location.
SLAM can be thought of as a chicken-or-egg problem: an unbiased map is needed for localization, while an accurate pose estimate is needed to build that map.

The gmapping package provides laser-based SLAM as a ROS node called slam_gmapping. It takes the laser scans and the odometry and builds a 2D occupancy grid map (OGM).
It updates the map state when the robot moves, or when (after some motion) it has a good estimate of the robot's location and of what the map looks like.

The map is published to a topic called /map.
The message type is nav_msgs/OccupancyGrid.
Occupancy is represented as an integer in the range [0, 100], with:
– 0 meaning completely free
– 100 meaning completely occupied
– the special value -1 for completely unknown
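As a small sketch of reading this topic from Python (the node name and the simple cell counting are just for illustration):
#!/usr/bin/env python
# Sketch: subscribe to /map and summarize the occupancy grid,
# using the value conventions listed above (0 free, 100 occupied, -1 unknown).
import rospy
from nav_msgs.msg import OccupancyGrid

def map_callback(msg):
    free = sum(1 for c in msg.data if c == 0)
    occupied = sum(1 for c in msg.data if c == 100)
    unknown = sum(1 for c in msg.data if c == -1)
    rospy.loginfo('map %dx%d at %.3f m/cell: %d free, %d occupied, %d unknown',
                  msg.info.width, msg.info.height, msg.info.resolution,
                  free, occupied, unknown)

rospy.init_node('map_listener')
rospy.Subscriber('/map', OccupancyGrid, map_callback)
rospy.spin()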

gmapping implements FastSLAM 2.0, a highly efficient particle filtering algorithm that provably converges.
Kalman filter-based algorithms require time quadratic in the number of landmarks, while FastSLAM scales logarithmically with the number of landmarks in the map.
See details in the following paper:
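Roughly, with K landmarks in the map and M particles in the filter, the per-update costs behind this comparison are on the order of K^2 for EKF-SLAM and on the order of M log K for FastSLAM (the logarithmic factor assumes the tree-structured per-particle maps described in the paper).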

Start mapping in a new terminal window:
$ rosrun gmapping slam_gmapping scan:=base_scan

The ROS map_server package provides a map_saver node that allows dynamically generated maps to be saved to file.
Execute the following command in a new terminal:
$ rosrun map_server map_saver [-f mapname]
map_saver retrieves map data and writes it out to map.pgm and map.yaml in the current directory.
– Use the -f option to provide a different base name for the output files.
To see the map, you can open the pgm file with the default Ubuntu image viewer program (eog).

The image describes the occupancy state of each cell of the world in the color of the corresponding pixel. Whiter pixels are free, blacker pixels are occupied, and pixels in between are unknown.
The thresholds that divide the three categories are defined in a YAML file.

Important fields:
– resolution: Resolution of the map, meters/pixel
– origin: The 2-D pose of the lower-left pixel in the map, as (x, y, yaw)
– occupied_thresh: Pixels with occupancy probability greater than this threshold are considered completely occupied.
– free_thresh: Pixels with occupancy probability less than this threshold are considered completely free.
An example map.yaml:
image: map.pgm
resolution:
origin: [ , , ]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
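To illustrate how these thresholds are applied (a sketch only: it assumes PyYAML and Pillow are installed, uses the map_saver default file names from above, and follows the map_server convention p = (255 - pixel) / 255 when negate is 0):
#!/usr/bin/env python
# Sketch: apply the YAML thresholds to the saved PGM image.
# Assumes PyYAML and Pillow are installed and the map_saver default file names.
import yaml
from PIL import Image

with open('map.yaml') as f:
    meta = yaml.safe_load(f)

img = Image.open(meta['image']).convert('L')    # grayscale: 0 = black, 255 = white
pixels = img.load()
width, height = img.size

occupied = free = unknown = 0
for y in range(height):
    for x in range(width):
        # occupancy probability of this pixel (negate: 0 convention)
        p = (255 - pixels[x, y]) / 255.0
        if p > meta['occupied_thresh']:
            occupied += 1
        elif p < meta['free_thresh']:
            free += 1
        else:
            unknown += 1

print('%d occupied, %d free, %d unknown cells' % (occupied, free, unknown))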

You can watch the mapping progress in rviz.
rviz is a ROS 3D visualization tool that lets you see the world from a robot's perspective.
The rviz user guide and tutorials can be found at:
Execute the following command to run rviz:
$ rosrun rviz rviz

If your system uses the Mesa graphics drivers (e.g., for Intel GPUs, or inside a VM), hardware acceleration can cause problems. To get around this, disable it before running rviz:
$ export LIBGL_ALWAYS_SOFTWARE=1
$ rosrun rviz rviz

The first time you open rviz, you will see an empty 3D view.
On the left is the Displays area, which contains a list of the different elements in the world that appear in the middle.
– Right now it just contains Global Options and Grid.
Below the Displays area, we have the Add button, which allows the addition of more display elements.

rviz display types (display name, messages used, description):
– Axes: Displays a set of axes.
– Effort (sensor_msgs/JointStates): Shows the effort being put into each revolute joint of a robot.
– Camera (sensor_msgs/Image, sensor_msgs/CameraInfo): Creates a new rendering window from the perspective of a camera, and overlays the image on top of it.
– Grid: Displays a 2D or 3D grid along a plane.
– Grid Cells (nav_msgs/GridCells): Draws cells from a grid, usually obstacles from a costmap from the navigation stack.
– Image (sensor_msgs/Image): Creates a new rendering window with an image.
– LaserScan (sensor_msgs/LaserScan): Shows data from a laser scan, with different options for rendering modes, accumulation, etc.
– Map (nav_msgs/OccupancyGrid): Displays a map on the ground plane.

rviz display types (continued):
– Markers (visualization_msgs/Marker, visualization_msgs/MarkerArray): Allows programmers to display arbitrary primitive shapes through a topic.
– Path (nav_msgs/Path): Shows a path from the navigation stack.
– Pose (geometry_msgs/PoseStamped): Draws a pose as either an arrow or axes.
– Point Cloud(2) (sensor_msgs/PointCloud, sensor_msgs/PointCloud2): Shows data from a point cloud, with different options for rendering modes, accumulation, etc.
– Odometry (nav_msgs/Odometry): Accumulates odometry poses over time.
– Range (sensor_msgs/Range): Displays cones representing range measurements from sonar or IR range sensors.
– RobotModel: Shows a visual representation of a robot in the correct pose (as defined by the current TF transforms).
– TF: Displays the tf transform hierarchy.

– Click the Add button under Displays and choose the LaserScan display.
– In the LaserScan display properties, change the topic to /base_scan.
– In Global Options, change Fixed Frame to odom.
– To see the robot's position, also add the TF display.
The laser "map" that is built will disappear over time, because rviz can only buffer a finite number of laser scans.

– Add the Map display.
– Set the topic to /map.
Now you will be able to watch the mapping progress in rviz.

You can run rviz using a configuration file that is already defined in the stage_ros package:
$ rosrun rviz rviz -d `rospack find stage_ros`/rviz/stage.rviz

Create a map of the erratic world using your random walker from the previous assignment.
Compare the resulting map to the original willow erratic world map located at /opt/ros/hydro/share/stage_ros/world/willow-full.pgm.
How long did it take the random walker to create an accurate map of the area?
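If you need a starting point, here is a minimal random-walker sketch (it assumes the Stage erratic robot with laser scans on base_scan and velocity commands on cmd_vel, as in the gmapping remapping above; the speed and distance values are arbitrary):
#!/usr/bin/env python
# Sketch: random walker that drives forward and turns away from nearby obstacles.
# Assumptions: laser scans on base_scan, velocity commands on cmd_vel.
import random
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

MIN_RANGE = 0.8   # [m] start turning when an obstacle is closer than this (arbitrary)

class RandomWalker(object):
    def __init__(self):
        self.obstacle = False
        self.turn_dir = 1
        self.pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('base_scan', LaserScan, self.scan_cb)

    def scan_cb(self, scan):
        self.obstacle = min(scan.ranges) < MIN_RANGE

    def run(self):
        rate = rospy.Rate(10)
        while not rospy.is_shutdown():
            cmd = Twist()
            if self.obstacle:
                cmd.angular.z = self.turn_dir * 1.0     # rotate in place away from the obstacle
            else:
                cmd.linear.x = 0.3                      # drive forward
                self.turn_dir = random.choice([-1, 1])  # pick a new turn direction for next time
            self.pub.publish(cmd)
            rate.sleep()

if __name__ == '__main__':
    rospy.init_node('random_walker')
    RandomWalker().run()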