
1 Artificial Vision-Based Tele-Operation for Lunar Exploration  Students: Aaron Roney, Albert Soto, Brian Kuehner, David Taylor, Bonnie Stern, Nicholas Logan, Stephanie Herd  NASA JSC Mentors: Dr. Bob Savely, Dr. Mike Goza  Project Mentor: Dr. Giovanni Giardini  Project Advisor: Prof. Tamás Kalmár-Nagy

2 Project Members  Members: Nicholas Logan, Stephanie Herd, Aaron Roney, Albert Soto, Bonnie Stern, Brian Kuehner, David Taylor  Majors: Electrical, Computer, Nuclear, Mechanical, and Aerospace Engineering  Class standing: Freshman through Senior

3 Outline  Motivation and Objectives  Ego-Motion Theory  Code Flow  Calibration and Rectification  Hardware  Testing Results  Future Work

4 Motivation  Lunar surface exploration from a human perspective, safely and with low risk  3D environment reconstruction  Self-localization with an artificial vision system

5 Objectives  Vision System  Ego-motion estimation  Environment reconstruction  Visual feedback system for tele-operations  Tele-Operation System  Remote-controlled mobile unit  Hardware and mechanical implementation

6 System overview: Visual System (onboard the vehicle), Ground Station, and Vehicle Hardware, connected over a wireless 802.11 network

7 Ego-Motion Theory

8 3D Reconstruction Theory  A point p projects to pixel (u_left^p, v_left^p) in the left image and (u_right^p, v_right^p) in the right image  It is impossible to compute the 3D coordinates of an object from a single image  Solution: stereo cameras  Disparity computation → 3D reconstruction

9 Environment Reconstruction  Disparity map computation: given two images, it is the collection of per-pixel disparities  Point distances can be calculated from the disparities  The environment can be reconstructed from the disparity map  Figures: left image, right image, disparity map
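The disparity-to-distance relation can be sketched with the standard rectified-stereo pinhole model. The focal length, baseline, and principal point below are illustrative values, not the project's actual calibration:

```python
import numpy as np

# Depth from disparity for a rectified stereo pair (pinhole camera model).
# Assumed parameters (illustrative only): focal length f in pixels,
# baseline B in meters, principal point (cu, cv) in pixels.
f, B = 700.0, 0.12
cu, cv = 320.0, 240.0

def reconstruct_point(u_left, v_left, disparity):
    """Return (X, Y, Z) in the left-camera frame from one pixel and its disparity."""
    Z = f * B / disparity          # depth: larger disparity -> closer point
    X = (u_left - cu) * Z / f      # back-project the pixel offset to meters
    Y = (v_left - cv) * Z / f
    return X, Y, Z

X, Y, Z = reconstruct_point(400.0, 240.0, 14.0)
print(Z)  # Z = f*B/d = 84.0/14 = 6.0 meters
```

Applying this to every pixel of a disparity map yields the 3D point cloud used for environment reconstruction.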

10 Ego-Motion Estimation  Main goal: evaluate the motion (translation and rotation) of the vehicle from sequences of images  Optical flow is related to vehicle movement through the perspective projection equation  A least-squares solution gives the change in position of the vehicle  Figure: optical flow example
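The least-squares step can be sketched for the translation-only case. The flow equations below are one common form derived from the perspective projection equation (rotation neglected); the focal length, feature positions, and depths are synthetic, not the project's data:

```python
import numpy as np

# Least-squares ego-motion from optical flow (translation-only sketch).
# With rotation neglected, the flow of a feature at (u, v) with depth Z is
# linear in the translation T = (Tx, Ty, Tz):
#   u_dot = (-f*Tx + u*Tz) / Z,   v_dot = (-f*Ty + v*Tz) / Z
# Stacking two rows per feature gives A @ T = b, solved by least squares.
rng = np.random.default_rng(0)
f = 700.0                                    # assumed focal length (pixels)
n = 50
uv = rng.uniform(-200, 200, size=(n, 2))     # feature coords relative to center
Z = rng.uniform(2.0, 8.0, size=n)            # feature depths (meters)

T_true = np.array([0.05, 0.0, 0.15])         # "unknown" camera translation

A = np.zeros((2 * n, 3))
for i, ((u, v), z) in enumerate(zip(uv, Z)):
    A[2 * i]     = [-f / z, 0.0, u / z]
    A[2 * i + 1] = [0.0, -f / z, v / z]
b = A @ T_true                               # synthetic, noise-free flow

T_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(T_est)  # ≈ [0.05, 0.0, 0.15]
```

With real, noisy flow the same overdetermined solve averages out measurement error across all tracked features.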

11 Code Flow

12  Onboard: Sony VAIO (Pentium 4) with Logitech QuickCam Deluxe web cameras  Image processing code: calibration parameters → acquire images → rectify images → ego-motion estimation  Results sent to the ground station over the wireless 802.11 network

13 Mobile Unit Detailed Code  Acquire image (T = 0.15 sec): snapshot → image matrix; image parameters: gray scale (640x480), …  Rectify images (T = 0.5 sec): apply the distortion coefficients from the calibration parameters to the image matrix → rectified image matrix; save image  Ego-motion estimation  Output to the ground station over the wireless 802.11 network

14 Ego-Motion Estimation Overview (T = 3 sec)  Inputs: right image, left image, new right image, new left image, calibration parameters  Find features in the right image; track them in the left image  Find features in the new right and new left images; track the right-image features in both  Discard all non-identical points across all images → image feature matrix  Output: displacement vector (X, Y, Z, X-Rot, Y-Rot, Z-Rot), sent over the wireless 802.11 network
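The "discard all non-identical points" step amounts to keeping only the features tracked successfully in every one of the four images, so each row of the feature matrix aligns across views. A sketch with made-up per-image track flags:

```python
import numpy as np

# Hypothetical track status per image: True if feature i was tracked there.
tracked = {
    "right":     np.array([True, True, True, True, True]),
    "left":      np.array([True, True, False, True, True]),
    "new_right": np.array([True, True, True, False, True]),
    "new_left":  np.array([True, False, True, True, True]),
}

# A feature survives only if it was tracked in all four images.
keep = np.logical_and.reduce(list(tracked.values()))
print(np.flatnonzero(keep))  # indices of features common to all views: [0 4]
```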

15 Calibration and Rectification

16  Calibration: uses Matlab tools to determine the image distortion associated with each camera  Rectification: removes that distortion from the images
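The distortion such calibration tools estimate is typically dominated by radial terms. A minimal sketch of the radial model with made-up coefficients k1, k2 (rectification then resamples the image: for each ideal pixel, apply the model to find where it appears in the distorted image):

```python
# Radial lens distortion model on normalized image coordinates.
# k1, k2 are illustrative coefficients, not a real calibration result.
k1, k2 = -0.25, 0.08

def distort(x, y):
    """Map an ideal normalized point to its distorted location."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# The image center is unaffected; off-center points shift radially.
print(distort(0.0, 0.0))   # (0.0, 0.0)
print(distort(0.5, 0.0))   # pulled toward the center since k1 < 0
```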

17 Hardware

18  Mobile Unit: TROPOS router, laptop, web cameras  Base Station: Linksys router, operator computer, command computer  The two are linked by wireless 802.11

19 Improvements Implemented in the System  Improved robustness of the software  Implemented a menu-driven system for the operator using Matlab's network handling protocol  Taking pictures  Running ego-motion  Sending all results to the operator  Graphical display of the optical flow  Reduced crashing  Achieved greater control of the mobile unit

20 Mobile Unit  Vehicle courtesy of Prof. Dezhen Song  Camera support system: 3-DOF mechanical neck  Panoramic rotation  Tilt rotation  Telescopic capability  Controlled height and baseline length  Figure: horizontal view showing the baseline, distances D and L, fields of view FOV 1 and FOV 2, and angle α

21 Testing Results

22 Test Environment (simulated lunar environment)  Light to simulate solar exposure  Black background to eliminate background features  Walls to eliminate stray light and side shadows  Measured displacements

23 Test Setup  25 pictures taken at each location (0, 5, 10, and 15 cm) along the Z direction (perpendicular to the camera focal plane), unidirectional movement  Set 1: 25 images at Z = 0  Set 2: 25 images at Z = 5  Set 3: 25 images at Z = 10  Set 4: 25 images at Z = 15  Distances were measured with a tape measure  The cameras were mounted on a semi-rigid fixture

24 Determining the Number of Features  The standard deviation decreases as more features are used  But the accuracy of the results decreases  100 features were selected  Results for 5 cm displacement: all 100 images used, each set compared to the previous one

25 Ego-Motion: Example  Optical flow, left image  Optical flow, right image  Estimated displacement (cm) and standard deviation versus RANSAC threshold:

RANSAC degree | 5 cm | Std Dev | 10 cm | Std Dev | 15 cm | Std Dev
            5 | 5.31 | 3.65    |  9.43 | 4.39    | 13.00 | 3.55
           15 | 4.86 | 2.09    |  8.30 | 3.40    | 13.65 | 6.39
           30 | 4.35 | 1.66    |  8.21 | 4.03    | 15.27 | 6.01
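The "RANSAC degree" above is the threshold used to reject outlier feature matches before the motion solve. A minimal, generic RANSAC sketch on synthetic 1D data (all values made up) shows how the threshold trades inlier count against fit quality:

```python
import numpy as np

# RANSAC sketch: fit a line y = m*x + c despite gross outliers, as used in
# the project to reject bad feature matches before the least-squares solve.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 100)
y = 2.0 * x + 1.0                          # true model
y[::10] += rng.uniform(20, 30, size=10)    # inject 10 gross outliers

best_inliers = None
for _ in range(200):
    i, j = rng.choice(len(x), size=2, replace=False)  # minimal sample
    if x[i] == x[j]:
        continue
    m = (y[j] - y[i]) / (x[j] - x[i])
    c = y[i] - m * x[i]
    inliers = np.abs(y - (m * x + c)) < 1.0  # residual threshold (the "degree")
    if best_inliers is None or inliers.sum() > best_inliers.sum():
        best_inliers = inliers

# Final least-squares fit on the consensus set only.
A = np.stack([x[best_inliers], np.ones(best_inliers.sum())], axis=1)
m_fit, c_fit = np.linalg.lstsq(A, y[best_inliers], rcond=None)[0]
print(round(m_fit, 3), round(c_fit, 3))  # ≈ 2.0 1.0
```

A tighter threshold keeps fewer, cleaner inliers; a looser one keeps more points but admits noisier matches, matching the trade-off visible in the table.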

26 Problems  Images were not rectified  Possible motion of the cameras between images  No image filtering  Camera mounting is misaligned  Images acquired from the right camera appear blurry

27 Conclusions and Future Work  Demonstrated:  Ego-motion estimation  Environment reconstruction  Vehicle control and movement  System integration  Future developments:  Filtering and improving the results  Increasing the robustness of the vision system  Creating a visual 3D environment map

28 Acknowledgements  Thanks to: Prof. Tamás Kalmár-Nagy, Dr. Giovanni Giardini, Prof. Dezhen Song, Change Young Kim, Magda Lagoudas, Tarek Elgohary, Pedro Davalos

