Algorithm Implementation: Safe Landing Zone Identification
Presented by Noah Kuntz

Problem Under Investigation
- UAV flying in unknown terrain (typically a helicopter)
- Map the terrain using vision or LIDAR
- Identify landing sites that are:
  - Hazard free
  - On suitable terrain
  - Large enough to fit the UAV

Papers Reviewed
- "Towards Vision-Based Safe Landing for an Autonomous Helicopter," Pedro J. Garcia-Pardo, Gaurav S. Sukhatme and James F. Montgomery, Robotics and Autonomous Systems, 2001
- "Vision Guided Landing of an Autonomous Helicopter in Hazardous Terrain," Andrew Johnson, James Montgomery and Larry Matthies, International Conference on Robotics and Automation, 2005
- "The JPL Autonomous Helicopter Testbed: A Platform for Planetary Exploration Technology Research and Development," James F. Montgomery, Andrew E. Johnson, Stergios I. Roumeliotis, and Larry H. Matthies, Journal of Field Robotics, 2006
- "Lidar-based Hazard Avoidance for Safe Landing on Mars," Andrew Johnson, Allan Klumpp, James Collier and Aron Wolf, AIAA Journal of Guidance, Control and Dynamics, 2002

Algorithm Selection – Option 1
Source: "Towards Vision-Based Safe Landing for an Autonomous Helicopter"
Strengths:
- Vision requires only a low-weight camera
- Processing power required is not high
Weaknesses:
- Assumes flat underlying terrain, which severely limits practical usage
- Assumes high contrast between obstacles and underlying terrain, risking failure to detect some obstacles
- Could pick a landing spot on top of an obstacle if the obstacle is large enough

Algorithm Selection – Option 2
Sources: "Vision Guided Landing of an Autonomous Helicopter in Hazardous Terrain," "The JPL Autonomous Helicopter Testbed: A Platform for Planetary Exploration Technology Research and Development"
Strengths:
- Vision requires only a low-weight camera
- Considering slope as well as roughness of the underlying terrain produces a robust cost map in most terrain conditions
Weaknesses:
- Requires extensive vision processing

Algorithm Selection – Option 3
Source: "Lidar-based Hazard Avoidance for Safe Landing on Mars"
Strengths:
- LIDAR produces accurate terrain maps at wide angles with moderate processing power
- Considering slope as well as roughness of the underlying terrain produces a robust cost map in most terrain conditions
Weaknesses:
- Most LIDAR systems are heavy

Algorithm Choice – Option 3
Reasoning:
- Most robust in terms of accurately detecting obstacles
- Other than the sensor, identical to the second-best choice, option 2
Mitigation of weaknesses:
- The helicopter must be capable of lifting the sensor's weight

Overview of Algorithm Implementation: Digital Elevation Map
- SICK data is interpreted and flattened
- Pose and position at each SICK scan point are recorded from the UAV autopilot
- The elevation map is generated by correlating the scanned depths with the position and pose at which they were recorded
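
The correlation step can be sketched as follows. This is an illustrative reconstruction, not the presenter's code: the scan format (per-scan rotation `R` and translation `t` from the autopilot, with ray returns already converted to 3-D points in the sensor frame), the grid size, and the cell resolution are all assumptions.

```python
import numpy as np

def build_elevation_map(scans, grid_shape=(50, 50), cell=0.5):
    """Bin posed range returns into a 2-D elevation grid.

    scans: list of (R, t, points) tuples, where R is a 3x3 rotation,
    t a 3-vector, and points an (N, 3) array in the sensor frame.
    """
    elev = np.full(grid_shape, np.nan)
    for R, t, pts in scans:
        world = pts @ R.T + t  # sensor frame -> world frame
        for x, y, z in world:
            # Truncating int() assumes non-negative world coordinates;
            # a real implementation would offset and floor.
            i, j = int(x / cell), int(y / cell)
            if 0 <= i < grid_shape[0] and 0 <= j < grid_shape[1]:
                # Keep the highest return per cell, since obstacle tops
                # are what matter for landing safety.
                if np.isnan(elev[i, j]) or z > elev[i, j]:
                    elev[i, j] = z
    return elev
```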

Overview of Algorithm Implementation: Safe Landing Zone
- The elevation map is analyzed in units the size of the lander footprint, incremented at a fraction of the footprint size
- A plane is fit to each unit using least mean squares
- The slope of each plane is calculated from the center of the region
- Roughness is calculated as the difference between the original map and the fitted map
- The roughness and slope maps are normalized and added to produce the cost map
- The cost map is blurred to prevent landing on a good zone adjacent to a highly unsafe zone
- The landing zone is found in the resulting image as the minimum-cost point

Generated Terrain
- Hypothetical terrain was generated with a graphics program

Landing Zone Algorithm Step 1
- Fit planes to the original data
[Images: Original and Fitted elevation maps]
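
The least-squares plane fit for one footprint-sized window can be sketched as below; the window size and cell spacing are illustrative assumptions.

```python
import numpy as np

def fit_plane(patch, cell=0.5):
    """Least-squares fit of z = a*x + b*y + c to a square elevation patch.

    Returns the coefficients (a, b, c) and the fitted patch.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w] * cell       # metric cell coordinates
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)
    fitted = (A @ coeffs).reshape(h, w)
    return coeffs, fitted
```

For a patch that is itself a perfect plane, the fit recovers the plane exactly and the fitted patch equals the input.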

Landing Zone Algorithm Steps 2 and 3
- Find the slopes of the fitted planes
- Find the roughness based on the difference between the original maps and the fitted planes
[Images: Normalized Slope Cost and Normalized Roughness Cost]
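
Given the fitted plane z = a*x + b*y + c for a patch, these two steps reduce to a couple of lines. Taking roughness as the worst absolute residual is an assumption; the slides only say "the difference between the original map and the fitted map".

```python
import numpy as np

def slope_and_roughness(patch, coeffs, fitted):
    """Slope (degrees) of a fitted plane and roughness of the patch.

    coeffs is (a, b, c) from the plane fit; fitted is the fitted patch.
    """
    a, b, _ = coeffs
    slope = np.degrees(np.arctan(np.hypot(a, b)))  # tilt of the plane normal
    roughness = np.max(np.abs(patch - fitted))     # worst residual (assumed)
    return slope, roughness
```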

Landing Zone Algorithm Step 4
- The cost map is produced by adding the normalized roughness and slope costs, then blurring with a 3x3 Gaussian
[Images: Total Cost and Blurred Cost]
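
A sketch of the normalize-add-blur step, using the standard 3x3 Gaussian approximation [[1,2,1],[2,4,2],[1,2,1]]/16 and edge padding; the exact kernel and border handling used in the original implementation are not stated, so both are assumptions.

```python
import numpy as np

def cost_map(slope_map, rough_map):
    """Normalize the two cost layers to [0, 1], add, and blur 3x3 Gaussian."""
    def norm(m):
        rng = m.max() - m.min()
        return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)

    cost = norm(slope_map) + norm(rough_map)
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 16.0
    padded = np.pad(cost, 1, mode='edge')
    blurred = np.zeros_like(cost)
    for i in range(cost.shape[0]):
        for j in range(cost.shape[1]):
            blurred[i, j] = np.sum(padded[i:i + 3, j:j + 3] * k)
    return blurred
```

The blur spreads the cost of a bad cell into its neighbors, which is what keeps the minimum away from zones adjacent to hazards.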

Landing Zone Algorithm Step 5
- The landing zone is chosen as the minimum-cost point
[Images: Map With Landing Zone and Image With Landing Zone]
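
The final selection is a single argmin over the blurred cost map; ties resolve to the first cell in row-major order, which is an arbitrary but deterministic choice.

```python
import numpy as np

def pick_landing_zone(blurred_cost):
    """Return the (row, col) grid cell with the minimum blurred cost."""
    idx = np.argmin(blurred_cost)
    return np.unravel_index(idx, blurred_cost.shape)
```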

More Examples