MAV Optical Navigation Software System April 30, 2012 Tom Fritz, Pamela Warman, Richard Woodham Jr, Justin Clark, Andre DeRoux Sponsor: Dr. Adrian Lauf.


Background – Micro Aerial Vehicles (MAVs)
A subset of Unmanned Aerial Vehicles (UAVs)
– Predator
– Raptor
Very small, maneuverable, and lightweight
MAV categories
– Fixed-wing
– Rotary-wing
– Flapping-wing
Used for homeland security and battlefield applications
– Surveillance
– Reconnaissance

Flapping-Wing MAV

Purpose
Develop an optical navigation software subsystem
– User-selectable destination
– Semi-autonomous operation
– Adaptable to flapping-wing MAVs
– Operates in a closed, static environment (e.g., a classroom with tables and chairs; no moving objects)

Project Concept
Develop a camera-based software navigation system that will eventually run on an MAV.
Current testing and implementation use a wired webcam to prove functionality.

System Requirements and Restrictions
Requirements:
– Create a 3D model of the environment
– Plan a path from the current location to a selected destination
– Communicate real-time navigation output
– Work in any closed, static environment
Restrictions:
– Monocular camera

Hardware Overview

Camera
Restrictions:
– Low resolution
– 30 frames per second
Usage:
– A video is composed of static pictures (frames) streamed together to mimic real-time motion; a simplified example is a flipbook.
– The different calculations are based on the differences from each frame to the next.
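The frame-to-frame comparison described above can be illustrated with a minimal pure-Python sketch (not from the project's code; the tiny 3x3 "frames" and the function name are invented for illustration):

```python
# Illustrative sketch: per-pixel absolute difference between two
# consecutive grayscale frames, stored here as nested lists.
def frame_diff(prev_frame, curr_frame):
    """Absolute per-pixel difference between two frames."""
    return [[abs(c - p) for p, c in zip(prow, crow)]
            for prow, crow in zip(prev_frame, curr_frame)]

frame0 = [[10, 10, 10],
          [10, 90, 10],
          [10, 10, 10]]
frame1 = [[10, 10, 10],
          [10, 10, 90],   # the bright pixel moved one column right
          [10, 10, 10]]

diff = frame_diff(frame0, frame1)
# Non-zero entries mark where the scene changed between frames.
```

In the real system the frames come from the 30 fps webcam stream, and the motion-related modules below build on exactly this kind of frame-to-frame comparison.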

Software Overview

Software Overview Fall 2011

Software Overview Spring 2012

Object Discovery

Goal: Find prominent objects in view
Purpose:
– Initialize the bounding box for the object tracking & recognition module
– Initialize landmarks
Execution:
– “Snake” algorithm
– Blob detection (new)

Object Discovery Algorithm

Blob Detection
Purpose: assist with finding the contours of objects for the snake algorithm
Execution:
– Convert the image to a binary image
– Apply thresholding
– Smooth with a median filter
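The threshold-then-group pipeline above can be sketched in pure Python (the project uses OpenCV's blob facilities; this standalone version with an invented 4x5 image is only to show the idea of binarizing and then collecting connected pixels into bounding boxes):

```python
from collections import deque

def threshold(img, t):
    """Binarize: 1 where the pixel is >= t, else 0."""
    return [[1 if v >= t else 0 for v in row] for row in img]

def find_blobs(binary):
    """Group 4-connected foreground pixels; return one bounding box
    (min_x, min_y, max_x, max_y) per blob, via breadth-first flood fill."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                q = deque([(y, x)])
                seen[y][x] = True
                ys, xs = [], []
                while q:
                    cy, cx = q.popleft()
                    ys.append(cy)
                    xs.append(cx)
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                boxes.append((min(xs), min(ys), max(xs), max(ys)))
    return boxes

img = [[0, 0, 0, 0, 0],
       [0, 9, 9, 0, 0],
       [0, 9, 9, 0, 7],
       [0, 0, 0, 0, 7]]
blobs = find_blobs(threshold(img, 5))  # two separate bright regions
```

These per-frame bounding boxes are what the slide means by "redraws bounding boxes each frame": the grouping is recomputed from scratch, with no identity carried between frames.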

Snake (Active Contour) Actual Image

Threshold Image
Purpose:
– Remove and ignore all colors in the image except those specified (e.g., red, blue, green)
Process:
– Use the OpenCV function InRangeS() to select a color range
– Use the OpenCV functions Smooth() and Erode() to form a blob
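The color-range selection that InRangeS() performs can be sketched without OpenCV (a hedged illustration, not the project's code; the 2x2 image, the (B, G, R) channel order, and the red-ish range bounds are all invented for the example):

```python
def in_range(pixel, lower, upper):
    """True when every channel of the pixel lies within [lower, upper]."""
    return all(lo <= v <= hi for v, lo, hi in zip(pixel, lower, upper))

def threshold_color(img, lower, upper):
    """Per-pixel range test, producing a binary mask (255 = in range)."""
    return [[255 if in_range(px, lower, upper) else 0 for px in row]
            for row in img]

# Pixels as (B, G, R) tuples, following OpenCV's channel order:
img = [[(200, 30, 20), (10, 20, 220)],
       [(190, 40, 30), (200, 35, 25)]]

# Keep only strongly red pixels: low blue, low green, high red.
mask = threshold_color(img, (0, 0, 150), (60, 60, 255))
```

In the real pipeline this mask is then smoothed and eroded so the surviving pixels merge into a clean blob.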

Snake (Active Contour W/Blobs) Threshold Image

Blob Detection
Advantages:
– Very fast
– Accurate
– Can locate multiple objects in view
Disadvantages:
– Does not track objects; it simply redraws bounding boxes each frame
– Does not work well with more difficult objects, only simple single-colored ones

Blob Detection Demonstration

Object Recognition and Tracking

Object Tracking & Recognition
Goal: Create an object model
Purpose: Help with self-location
Execution:
– Tracking: Lucas-Kanade short-term tracker
– Recognition: random-forest machine learning using Haar features
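The Haar features used by the recognition module are rectangle-sum differences, computed cheaply from an integral image. A minimal sketch of that building block (the 2x4 test image and function names are illustrative, not the project's implementation; the random-forest classifier on top is omitted):

```python
def integral_image(img):
    """Summed-area table with a zero top row and left column, so any
    rectangle sum costs four lookups."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            ii[y+1][x+1] = img[y][x] + ii[y][x+1] + ii[y+1][x] - ii[y][x]
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w x h rectangle with top-left corner (x, y)."""
    return ii[y+h][x+w] - ii[y][x+w] - ii[y+h][x] + ii[y][x]

def haar_two_rect(ii, x, y, w, h):
    """Two-rectangle Haar feature: left half minus right half,
    which responds strongly to vertical edges."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

img = [[9, 9, 1, 1],
       [9, 9, 1, 1]]          # bright left half, dark right half
ii = integral_image(img)
f = haar_two_rect(ii, 0, 0, 4, 2)  # large value: strong vertical edge
```

A forest of decision trees over many such features, evaluated at different positions and scales, gives the recognition side; the Lucas-Kanade tracker handles the short-term frame-to-frame side.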

Object Tracking & Recognition

Egomotion Estimation

Goal: Estimate the egomotion of the camera
Purpose:
– Estimate the 3D motion of the camera (“How has the camera moved?”)
– Provide information that can lead to 3D reconstruction
Execution:
– Optical flow: Lucas-Kanade
– Other optical flow methods: Farneback dense optical flow

Egomotion Estimation

Optical Flow
Definition: the pattern of apparent motion of objects, surfaces, and edges in a visual scene
Purpose:
– Calculate the motion of points from frame to frame (“How much has a point in a frame moved since the last frame?”)
– Provide data for 3D reconstruction and egomotion estimation
Future work:
– Use optical flow data for accurate egomotion estimation
– Test calculated values using the egomotion emulator
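The project uses OpenCV's Lucas-Kanade implementation; the core least-squares step it performs per window can be sketched in pure Python. In this hedged example (synthetic frames and all names invented for illustration), the brightness pattern I(x, y) = x·y is shifted one pixel right, and the method recovers that flow by solving a 2x2 system built from image gradients:

```python
def lucas_kanade(prev_frame, curr_frame, xs, ys):
    """Single-window Lucas-Kanade: least-squares flow (u, v) over the
    interior points xs x ys, using central differences for gradients."""
    sxx = sxy = syy = sxt = syt = 0.0
    for y in ys:
        for x in xs:
            ix = (prev_frame[y][x+1] - prev_frame[y][x-1]) / 2.0  # dI/dx
            iy = (prev_frame[y+1][x] - prev_frame[y-1][x]) / 2.0  # dI/dy
            it = curr_frame[y][x] - prev_frame[y][x]              # dI/dt
            sxx += ix * ix; sxy += ix * iy; syy += iy * iy
            sxt += ix * it; syt += iy * it
    # Solve [sxx sxy; sxy syy] [u; v] = [-sxt; -syt] by Cramer's rule.
    det = sxx * syy - sxy * sxy
    u = (-syy * sxt + sxy * syt) / det
    v = (sxy * sxt - sxx * syt) / det
    return u, v

size = 5
prev_frame = [[x * y for x in range(size)] for y in range(size)]
curr_frame = [[(x - 1) * y for x in range(size)] for y in range(size)]  # shifted right 1 px
u, v = lucas_kanade(prev_frame, curr_frame, range(1, 4), range(1, 4))
```

The recovered (u, v) is the per-point motion the slide describes; summarizing these vectors over the whole frame is what feeds the egomotion estimate.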

Optical Flow Example Generated Using Lucas-Kanade Optical Flow Method

3D Reconstruction

Goal: Create a 3D map of the environment
Purpose:
– Determine the 3D structure of the environment
– Provide information that can lead to path planning
Execution:
– Future work needed
– Structure from motion, stereo vision techniques
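One building block shared by the structure-from-motion and stereo techniques named above is recovering depth from how far a point shifts between two views a known distance apart. A minimal pinhole-camera sketch (the focal length, baseline, and pixel values are invented for illustration; the project's actual reconstruction is future work):

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Pinhole relation Z = f * B / d: a point that shifts d pixels
    between two views separated by baseline B lies at depth Z."""
    return f_px * baseline_m / disparity_px

def backproject(x_px, y_px, cx, cy, f_px, z):
    """Lift a pixel (x, y) at known depth z back to a 3D point,
    given the principal point (cx, cy) and focal length in pixels."""
    return ((x_px - cx) * z / f_px, (y_px - cy) * z / f_px, z)

# A 10 px shift seen with a 500 px focal length over a 10 cm baseline:
z = depth_from_disparity(500.0, 0.1, 10.0)
point = backproject(420.0, 240.0, 320.0, 240.0, 500.0, z)
```

With a monocular camera (per the restriction above), the baseline comes from the camera's own motion between frames, which is why the egomotion estimate feeds reconstruction.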


Path Planning
Goal: Plan a path from one location to the next
Purpose:
– Plan a path to a user-specified location
– Use the 3D reconstruction
Execution:
– Future work needed
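Since this module is future work, here is only a hedged sketch of one standard approach: discretize the reconstructed environment into an occupancy grid and run breadth-first search for a shortest obstacle-free route (the grid, coordinates, and function name are invented for illustration):

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search on an occupancy grid (0 = free, 1 = obstacle);
    returns a shortest 4-connected path as (row, col) cells, or None."""
    h, w = len(grid), len(grid[0])
    came_from = {start: None}
    q = deque([start])
    while q:
        cur = q.popleft()
        if cur == goal:
            path = []
            while cur is not None:       # walk parent links back to start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        y, x = cur
        for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
            if 0 <= ny < h and 0 <= nx < w and grid[ny][nx] == 0 \
                    and (ny, nx) not in came_from:
                came_from[(ny, nx)] = cur
                q.append((ny, nx))
    return None                           # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],     # a wall with a gap on the right
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (2, 0))   # route around the wall
```

In the full system the grid cells would come from the 3D reconstruction, the goal from the user-selected destination, and the resulting path would drive the real-time navigation output.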

Test Bed Approach
Purpose:
– Simulate predetermined motion and capture truth data for comparison with the calculated output of the egomotion estimation module (software)
– Provide precise motion for one rotational axis and three translational axes (X, Y, Z)
Hardware of the emulator:
– Lego Mindstorms® kit used to build the emulator
Controller:
– MIT HandyBoard
– Programmed in Interactive C
– LCD for controller feedback

3 Axis Egomotion Emulator

Design Considerations
Cost:
– Lego Mindstorms borrowed: $0
– A small 12” x 13” CNC machine costs about $3k
– An actual MAV is approximately $15k
Input:
– Simulate motion via predetermined paths
– Line following via infrared (IR) sensors
Output:
– IR sensor on each axis of translation; 1 cm resolution (accuracy)
– LCD screen on the controller for real-time feedback

Testing Procedures
Setup:
– Assemble the three sections (X, Y, Z); necessary due to storage and portability
– Power on the controller and connect it to a PC via USB
– Download the appropriate source code to the controller
– Calibrate the IR sensor values in the source code for accurate feedback (IR_tester.ic)
Testing:
– Run the program, collect data from the controller's LCD screen, and compare it with the calculated egomotion estimation data
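The comparison step above amounts to scoring the estimator against the emulator's truth data. One natural metric (a sketch; the sample values and 1 cm-resolution truth readings are invented, and the slides do not specify which error metric the team used) is the root-mean-square error per axis:

```python
import math

def rms_error(truth, estimate):
    """Root-mean-square error between truth positions (from the emulator's
    IR sensors) and positions reported by the egomotion estimator."""
    assert len(truth) == len(estimate)
    se = sum((t - e) ** 2 for t, e in zip(truth, estimate))
    return math.sqrt(se / len(truth))

truth_x = [0.0, 1.0, 2.0, 3.0]   # cm, read from the controller LCD
est_x   = [0.0, 1.1, 1.9, 3.0]   # cm, from the egomotion estimation module
err = rms_error(truth_x, est_x)
```

Repeating this per translation axis (and later for Z-axis rotation) gives a single accuracy figure per test run, which is easy to compare against the emulator's 1 cm sensor resolution.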

Controller

3-Axis Egomotion Emulator

Emulator Accomplishments
Completed:
– XYZ translation
– XYZ IR sensor data acquisition
Future goals:
– Rotation about the Z axis
– Code predetermined paths via IR sensors to test with the software

Questions?
