MAV Optical Navigation Software System April 30, 2012 Tom Fritz, Pamela Warman, Richard Woodham Jr, Justin Clark, Andre DeRoux Sponsor: Dr. Adrian Lauf
Background – Micro Aerial Vehicles (MAVs)
A subset of Unmanned Aerial Vehicles (UAVs)
– Predator
– Raptor
Very small, maneuverable, and lightweight
MAV categories
– Fixed-wing
– Rotary-wing
– Flapping-wing
Used for homeland security & battlefield applications
– Surveillance
– Reconnaissance
Flapping-Wing MAV
Purpose
Develop an optical navigation software subsystem
– User-selectable destination
– Semi-autonomous operation
– Adaptable for flapping-wing MAVs
– Operates in a closed, static environment (e.g., a classroom with tables and chairs; no moving objects)
Project Concept Develop a software navigation system using a camera that will eventually be used on an MAV Current testing and implementation will be using a wired webcam to prove functionality
System Requirements and Restrictions
Requirements:
– Create a 3D model of the environment
– Plan a path from the current location to a selected destination
– Communicate real-time navigation output
– Work in any closed, static environment
Restrictions:
– Monocular camera
Hardware Overview
Camera
Restrictions:
– Low resolution
– 30 frames per second
Usage:
– A video is composed of static pictures (frames) streamed together to mimic real-time motion; a flipbook is a simplified example.
– Our calculations are based on the differences from each frame to the next.
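To illustrate the frame-to-frame differencing idea above, here is a minimal pure-Python sketch. The tiny frames, the threshold value, and the helper name are ours for illustration only; the actual system operates on webcam frames via OpenCV.

```python
def frame_diff(prev, curr, thresh=10):
    """Return a binary mask marking pixels that changed by more than thresh."""
    return [[1 if abs(c - p) > thresh else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

# Two tiny 3x3 grayscale "frames": one pixel brightens between them.
frame1 = [[10, 10, 10],
          [10, 10, 10],
          [10, 10, 10]]
frame2 = [[10, 10, 10],
          [10, 200, 10],
          [10, 10, 10]]

mask = frame_diff(frame1, frame2)
moved = sum(sum(row) for row in mask)  # number of changed pixels -> 1
```

A module downstream can treat the nonzero region of such a mask as candidate motion, which is the intuition behind comparing consecutive frames.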
Software Overview
Software Overview Fall 2011
Software Overview Spring 2012
Object Discovery
Goal: Find prominent objects in view
Purpose:
– Initialize the bounding box for the object tracking & recognition module
– Initialize landmarks
Execution:
– “Snake” algorithm
– Blob detection (new)
Object Discovery Algorithm
Blob Detection
Purpose: To assist in finding the contours of objects for the snake algorithm
Execution:
– Convert the image to a binary image
– Apply thresholding
– Smooth with a median filter
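The binarize-then-smooth steps above can be sketched in pure Python. The 3x3 window, the threshold value, and the toy image are illustrative assumptions; the project uses OpenCV equivalents of these operations.

```python
import statistics

def threshold(img, t=128):
    """Binarize a grayscale image: 1 where the pixel >= t, else 0."""
    return [[1 if px >= t else 0 for px in row] for row in img]

def median3x3(img):
    """Median-smooth a binary image with a 3x3 window (edges left as-is)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = statistics.median(window)
    return out

img = [[0,   0,   0,   0,   0],
       [0, 200, 200, 200,   0],
       [0, 200,  50, 200,   0],   # one dark "noise" pixel inside the blob
       [0, 200, 200, 200,   0],
       [0,   0,   0,   0,   0]]
binary = threshold(img)      # the noise pixel becomes a hole in the blob
smooth = median3x3(binary)   # the median filter closes the hole
```

After smoothing, the blob is a solid connected region, which makes contour extraction for the snake algorithm much more reliable.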
Snake (Active Contour) Actual Image
Threshold Image
Purpose:
– Remove and ignore all colors in the image except the ones specified (e.g., red, blue, green)
Process:
– Use the OpenCV function InRangeS() to select a color range
– Use the OpenCV functions Smooth and Erode to form the blob
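OpenCV's InRangeS() keeps a pixel only if every channel falls within given per-channel bounds. As a hedged sketch of that behavior, here is a pure-Python equivalent; the tiny BGR image and the chosen "red" bounds are our assumptions for illustration.

```python
def in_range(img, lower, upper):
    """Per-pixel, per-channel bounds check, mimicking OpenCV's InRangeS:
    output is 1 where every channel lies within [lower, upper], else 0."""
    return [[1 if all(lo <= ch <= hi
                      for ch, lo, hi in zip(px, lower, upper)) else 0
             for px in row]
            for row in img]

# 2x2 BGR image: two red-ish pixels, two others.
img = [[(0, 0, 220), (200, 200, 200)],
       [(10, 20, 30), (5, 5, 240)]]
# Keep pixels whose blue/green are low and red is high (a "red" range).
mask = in_range(img, lower=(0, 0, 180), upper=(60, 60, 255))  # [[1,0],[0,1]]
```

The resulting binary mask is exactly the kind of image that the Smooth and Erode steps then clean up into a solid blob.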
Snake (Active Contour W/Blobs) Threshold Image
Blob Detection
Advantages:
– Very fast
– Accurate
– Can locate multiple objects in view
Disadvantages:
– Does not track objects; it simply redraws the bounding boxes each frame
– Does not work well with complex objects, only simple single-colored ones
Blob Detection Demonstration
Object Recognition and Tracking
Object Tracking & Recognition
Goal: Create an object model
Purpose: Aid self-localization
Execution:
– Tracking: Lucas-Kanade short-term tracker
– Recognition: Random-forest machine learning using Haar features
Object Tracking & Recognition
Egomotion Estimation
Goal: Estimate the egomotion of the camera
Purpose:
– Estimate the 3D motion of the camera: “How has the camera moved?”
– Provide information that can lead to 3D reconstruction
Execution:
– Optical flow: Lucas-Kanade
– Other optical flow methods: Farneback dense optical flow
Egomotion Estimation
Optical Flow
Definition: The pattern of apparent motion of objects, surfaces, and edges in a visual scene.
Purpose:
– Calculate the motion of points from frame to frame: “How much has a point in a frame moved since the last frame?”
– Provide data for 3D reconstruction and egomotion estimation
Future work:
– Use optical flow data for accurate egomotion estimation
– Test calculated values using the egomotion emulator
Optical Flow Example Generated Using Lucas-Kanade Optical Flow Method
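The core of the Lucas-Kanade method is solving a 2x2 least-squares system built from spatial and temporal image derivatives over a small window. The sketch below shows that single-window computation in pure Python on a synthetic image pair shifted by one pixel; the image, window, and function name are illustrative assumptions, and the real system uses OpenCV's pyramidal implementation.

```python
def lucas_kanade(I1, I2, window):
    """Estimate a single (u, v) flow vector over a window of (x, y) pixel
    coordinates by solving the 2x2 Lucas-Kanade normal equations."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for x, y in window:
        ix = (I1[y][x + 1] - I1[y][x - 1]) / 2.0   # central difference d/dx
        iy = (I1[y + 1][x] - I1[y - 1][x]) / 2.0   # central difference d/dy
        it = I2[y][x] - I1[y][x]                   # temporal difference
        a11 += ix * ix; a12 += ix * iy; a22 += iy * iy
        b1 -= ix * it;  b2 -= iy * it
    det = a11 * a22 - a12 * a12
    u = (a22 * b1 - a12 * b2) / det
    v = (a11 * b2 - a12 * b1) / det
    return u, v

N = 5
I1 = [[x * y for x in range(N)] for y in range(N)]        # I1(x, y) = x*y
I2 = [[(x - 1) * y for x in range(N)] for y in range(N)]  # shifted right by 1
win = [(x, y) for x in (1, 2, 3) for y in (1, 2, 3)]
u, v = lucas_kanade(I1, I2, win)   # recovers the shift: u = 1, v = 0
```

Repeating this at many feature points gives the flow field pictured in the example slide, and those vectors feed the egomotion estimate.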
3D Reconstruction
Goal: Create a 3D map of the environment
Purpose:
– Determine the 3D map of the environment
– Provide information that can lead to path planning
Execution:
– Future work needed
– Structure from motion, stereo vision techniques
Path Planning
Goal: Plan a path from one location to the next
Purpose:
– Plan a path to a user-specified location
– Use the 3D reconstruction
Execution:
– Future work needed
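Path planning is future work, but the intended behavior can be illustrated with a minimal breadth-first search over a 2D occupancy grid. The grid, start/goal cells, and function name below are our assumptions, not the project's design; a real planner would work over the 3D reconstruction.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search on an occupancy grid (0 = free, 1 = obstacle).
    Returns a shortest list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                  # walk predecessors back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A tiny "classroom": a wall with a gap at the bottom.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (0, 2))   # routes around the wall, 7 cells
```

BFS guarantees a shortest path in cell count; a production planner would likely use A* with a distance heuristic for efficiency, but the structure is the same.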
Test Bed Approach
Purpose:
– Simulate predetermined motion and capture truth data for comparison with the output of the egomotion estimation module (software)
– Provide precise motion for one rotational axis and three translational axes (X, Y, Z)
Hardware of emulator:
– Lego Mindstorms® kit used to build the emulator
Controller:
– MIT HandyBoard, programmed in Interactive C
– LCD for controller feedback
3 Axis Egomotion Emulator
Design Considerations
Cost:
– Lego Mindstorms borrowed: $0
– Small 12” x 13” CNC machine costs about $3k
– Actual MAV is approx. $15k
Input:
– Simulate motion via predetermined paths
– Line-follow via infrared (IR) sensors
Output:
– IR sensor on each axis of translation; 1 cm resolution (accuracy)
– LCD screen on controller for real-time feedback
Testing Procedures
Setup:
– Assemble the three sections (X, Y, Z); necessary due to storage and portability
– Power on the controller and connect to a PC via USB
– Download the appropriate source code to the controller
– Calibrate IR sensor values in the source code for accurate feedback (IR_tester.ic)
Testing:
– Run the program, collect data from the controller LCD screen, and compare with the calculated egomotion estimation data
Controller
3-Axis Egomotion Emulator
Emulator Accomplishments
Completed:
– XYZ translation
– XYZ IR sensor data acquisition
Future goals:
– Rotation about the Z axis
– Code predetermined paths via IR sensors to test with the software
Questions?