Luis Mejias, Srikanth Saripalli, Pascual Campoy and Gaurav Sukhatme.

 Introduction  Related work  Testbed  Visual preprocessing  Control Architectures  Experiments  Conclusion

 Goal:  vision-guided autonomous flying robots  Applications:  law enforcement, search and rescue, inspection, and surveillance  Techniques:  object detection, tracking, inertial navigation, GPS, and nonlinear system modeling

 In this paper:  Two UAVs – Avatar and COLIBRI  Visual tracking => control commands

Related work

 Hummingbird (A. Conway, 1995)  Model-scale  Uses GPS only, with 4 GPS antennas  Precision: 1 cm in position, 1 degree in attitude

 AVATAR (Jun, 1999)  Onboard INS & GPS  Kalman Filter for State Estimation  Simulation

 Vision-guided Helicopter (Amidi, 1996, 1997)  Onboard DSP-based vision processor  Combine GPS and IMU data

 Vision-augmented navigation system (Bosse, 1997)  Uses vision in the loop to control a helicopter  Visual odometer (Amidi, 1998)  A notable vision-based technique used in autonomous helicopters  (Wu, et al., 2005)  Vision is used as an additional sensor and fused with inertial and heading measurements for control

Testbed

 AVATAR  Gas-powered radio-controlled model helicopter  RT-2 DGPS system provides positional accuracy of 2 cm  ISIS-IMU provides rate information to the onboard computer, where it is fused using a 16-state Kalman filter  Ground station: a laptop that sends high-level control commands and differential GPS corrections  Autonomous flight is achieved using a behavior-based control architecture

 COLIBRI  Gas-powered model helicopter  Fitted with an XScale-based flight computer augmented with GPS, an IMU, and a magnetometer, fused with a Kalman filter  VIA mini-ITX 1.25 GHz onboard computer with 512 MB RAM, a wireless interface, and a FireWire color camera  Ground station: a laptop to send high-level control commands and for visualization

Visual preprocessing

 Image segmentation and thresholding  Convert the image to grayscale  Use the value of the target color as the threshold  Segment the image into a binary image where the object of interest is represented by 1s and the background by 0s
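The thresholding step can be sketched in a few lines of NumPy (the tolerance band around the target color value is an assumption; the slides do not give one):

```python
import numpy as np

def segment_target(gray, target_value, tol=10):
    """Binarize a grayscale image: pixels whose intensity lies within
    `tol` of the target color value become 1 (object), all others 0."""
    gray = np.asarray(gray, dtype=np.int16)  # avoid uint8 wrap-around
    return (np.abs(gray - target_value) <= tol).astype(np.uint8)
```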

 Square finding  Find contours (represented by polylines) in the binary image  Apply a point-reduction algorithm to simplify each polyline  Result: simplified squares
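The slides do not name the point-reduction algorithm; a common choice for simplifying contour polylines is Ramer-Douglas-Peucker, sketched here (the tolerance `eps` is a free parameter):

```python
import numpy as np

def rdp(points, eps):
    """Ramer-Douglas-Peucker simplification: recursively drop vertices
    closer than `eps` to the chord between the segment endpoints."""
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points.tolist()
    start, end = points[0], points[-1]
    chord = end - start
    norm = np.hypot(chord[0], chord[1])
    if norm == 0.0:  # degenerate chord: fall back to distance from endpoint
        dists = np.hypot(points[:, 0] - start[0], points[:, 1] - start[1])
    else:  # perpendicular distance of each vertex to the chord
        dists = np.abs(chord[0] * (points[:, 1] - start[1])
                       - chord[1] * (points[:, 0] - start[0])) / norm
    idx = int(np.argmax(dists))
    if dists[idx] > eps:  # keep the farthest vertex, recurse on both halves
        return rdp(points[:idx + 1], eps)[:-1] + rdp(points[idx:], eps)
    return [start.tolist(), end.tolist()]
```

Run on a closed contour, this collapses nearly collinear runs of points into the four corners of a window-like square.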

 Template matching  The user selects a detected window (the target) from the GUI  A patch is extracted around the location of the target  A local search window finds the best match between the target and the detected contours, deciding which window to track
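The matching criterion is not specified in the slides; a plausible sketch uses a sum-of-squared-differences search over a small window around the last known target location (the window size and SSD score are assumptions):

```python
import numpy as np

def best_match(image, template, center, search=8):
    """Exhaustive SSD search in a (2*search+1)^2 window around `center`
    (row, col); returns the top-left corner of the best patch and its score."""
    th, tw = template.shape
    r0, c0 = center
    best_score, best_pos = None, (r0, c0)
    for r in range(r0 - search, r0 + search + 1):
        for c in range(c0 - search, c0 + search + 1):
            if r < 0 or c < 0:
                continue  # window fell off the top/left edge
            patch = image[r:r + th, c:c + tw]
            if patch.shape != template.shape:
                continue  # window fell off the bottom/right edge
            ssd = float(np.sum((patch.astype(float) - template) ** 2))
            if best_score is None or ssd < best_score:
                best_score, best_pos = ssd, (r, c)
    return best_pos, best_score
```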

 Kalman filter  Once a suitable match is found, a Kalman filter tracks the feature positions  Input: x and y coordinates of the features  Output: estimates of these coordinates in the next frame
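A minimal version of such a tracker is a constant-velocity Kalman filter over the feature's pixel coordinates (the state layout and noise levels below are illustrative, not the paper's):

```python
import numpy as np

class FeatureKF:
    """Constant-velocity Kalman filter over an (x, y) feature position;
    the state is [x, y, vx, vy]. Noise levels q, r are placeholders."""
    def __init__(self, x, y, dt=1.0, q=1e-2, r=1.0):
        self.x = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4) * 10.0
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)           # we observe position only
        self.Q = np.eye(4) * q
        self.R = np.eye(2) * r

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]               # predicted (x, y) in the next frame

    def update(self, z):
        y = np.asarray(z, float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Each frame, `predict()` gives the expected feature location (seeding the local match search), and `update()` folds in the measured match position.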

 The user selects the object of interest from the GUI  The location of the object is used to generate the visual references

 Lateral visual reference

 Vertical visual reference
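One way to realize these references is to map the tracked window's pixel offset from the image center to saturated lateral and vertical velocity commands (the gains, saturation limit, and sign conventions here are assumptions, not the paper's values):

```python
def visual_references(u, v, width, height, kx=0.4, kz=0.4, vmax=1.0):
    """Map the tracked window's pixel position (u, v) to lateral (vy_r)
    and vertical (vz_r) velocity references, saturated at +/- vmax."""
    ex = (u - width / 2.0) / (width / 2.0)    # normalized lateral error
    ez = (v - height / 2.0) / (height / 2.0)  # normalized vertical error
    clamp = lambda x: max(-vmax, min(vmax, x))
    return clamp(kx * ex), clamp(kz * ez)
```

A centered target yields zero references; an offset target commands motion that re-centers it.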

Control Architectures

 A hierarchical behavior-based control architecture  The output of the Kalman filter is compared with the desired values to produce an error signal for the controller

 The controller is based on decoupled PID control
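Decoupled here means each axis is closed with its own independent loop; a single-axis PID step might look like this (gains and time step are placeholders, not the paper's values):

```python
class PID:
    """One axis of a decoupled PID loop; each helicopter axis
    (lateral, vertical, heading, ...) gets its own instance."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, error):
        self.integral += error * self.dt
        deriv = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

Each visual-reference error (reference velocity minus measured velocity) would feed one such instance per controlled axis.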

Experiments

 At the Del Valle Urban Search and Rescue Training site in Santa Clarita, California  AVATAR, four trials  First, the helicopter is commanded to fly autonomously to a given GPS waypoint  As soon as it detects the target window, the controller switches from GPS-based to vision-based control

Location of the features in the image

Helicopter position in meters: left figure, vertical axis = easting; right figure, vertical axis = northing

 At ETSII Campus in Madrid, Spain  COLIBRI  Seven experimental trials on two different days

Velocity reference (vy_r) with the helicopter velocity (vy): lateral displacement (east)

Velocity reference (vz_r) with the helicopter velocity (vz): altitude displacement (down)

Helicopter displacements during the entire flight trial

 colibrivideoWeb.wmv

Conclusion

 Demonstrated an approach to visually controlling an autonomous helicopter: a visual algorithm commands the UAV when GPS drops out  Demonstrated experimentally by performing vision-based window-tracking tasks on two different platforms, at different locations and under different conditions

 The topic is interesting  The visual algorithm is shown to be effective in the experiments  But the writing is weak  Poor explanation ▪ of features, templates, and matching  Incomplete explanation of the figures