

PRE-DECISIONAL DRAFT: For Planning and Discussion Purposes Only

Slide 1: Test Plan Review
MSL Focused Technology Instrument Placement Validation
Test Plan for 2D/3D Visual Target Tracking Validation

Won S. Kim, Robert D. Steele, Adnan I. Ansar, Siqi Chen
August 2, 2004

Mars Science Laboratory Project, Jet Propulsion Laboratory

Slide 2: Test Plan Objective and Scope

Test Plan Objective
- Test and validate the 2D/3D visual target tracking technology
- Generate the tracking reliability and error budget model, furnished with experimentally validated numbers

Scope
- Metrology and calibration
  - Total station metrology
  - Mast pan/tilt positioning
  - Mast and body camera calibration
  - Mast calibration
- Purely geometric camera handoff with 2D refinement
- Target tracking
  - Straight path on a flat surface
  - Straight path on a surface with small rocks
  - Winding path on a surface with large rocks
  - Straight-to-target path
  - Hazard avoidance navigation
- Target tracking using MER images

Slide 3: System Description

[Block diagram of the 2D/3D Visual Tracking System; labels:]
- 2D/3D Tracking
- Rover Locomotor
- Navigator
- Rover Pose Estimator (Visual Odometer)
- Optional Camera Handoff
- Active Camera Control
- Normalized Cross-Correlation 2D Tracking
- Scale / Affine Matching
- Target Position (Stereo Vision)
- Arm's reach

Slide 4: Computing Target Approach Accuracy

Without visual target tracking:
- 3-sigma approach error = 22.2 cm using Pancam and visual odometry (2% error)

With visual target tracking:
- 3-sigma approach error = 1.5 cm at R = 1 m distance, using Pancam initially with subsequent camera handoffs to Navcam and Hazcam

Focal length   Field of view   Stereo     Stereo range error   Target approach error      Target approach error (3-sigma),
(1/3" CCD)     angles          baseline   (3-sigma) at 10 m    (3-sigma), 2% nav error    ideal visual tracking and handoff
16 mm          17 x 13 deg     30 cm      9.7 cm               22.2 cm                    1.5 cm
6 mm           49 x 37 deg     20 cm      38.8 cm              43.7 cm                    3.9 cm
2.3 mm         113 x 86 deg    10 cm      202.2 cm             203.2 cm                   10.1 cm
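The table's stereo range error column follows from the standard triangulation error formula, dZ = Z^2 * dd / (b * f_px). This is a minimal sketch, not code from the test plan; the 4.8 mm sensor width for a 1/3" CCD and the 1-pixel (3-sigma) disparity error are assumptions, and `stereo_range_error_m` is a hypothetical helper name.

```python
import math

def stereo_range_error_m(range_m, baseline_m, focal_mm,
                         sensor_width_mm=4.8, image_width_px=1024,
                         disparity_err_px=1.0):
    """Stereo triangulation range error: dZ = Z^2 * dd / (b * f_px),
    where f_px is the focal length in pixels and dd the disparity error."""
    f_px = focal_mm / sensor_width_mm * image_width_px
    return range_m ** 2 * disparity_err_px / (baseline_m * f_px)

# Pancam row of the table: 16 mm lens, 30 cm baseline, 10 m range,
# assuming a 1-pixel (3-sigma) disparity error
err_cm = stereo_range_error_m(10.0, 0.30, 16.0) * 100
print(f"{err_cm:.1f} cm")  # ~9.8 cm, close to the table's 9.7 cm
```

The 6 mm / 20 cm row similarly comes out near the table's 38.8 cm, which suggests the table was generated with a model of this form.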

Slide 5: Baseline Operational Scenario

1. Pancam for 4 m (from 10 m to 6 m); minimum stereo range for Pancam is about 5 m
2. Handoff from Pancam to Navcam
3. Navcam for 4 m (from 6 m to 2 m); viewing angle from Navcam to the target gets steep at about 2 m
4. Handoff from Navcam to Hazcam
5. Hazcam for 1 m (from 2 m to 1 m); within arm's reach
6. Anchor rover and place instrument

Slide 6: Test Variables and Performance Metrics

Experimental test variables:
- Rover motion step size
- Straight flat, rocky, or winding path
- High-texture or low-texture targets
- Lighting conditions
- Software algorithms and configuration
- Software parameter settings

Tracking performance metrics:
- Tracking percentage (success rate)
- Tracking error

Slide 7: Tracking Reliability and Error Budget Model

- Rover locomotion/navigator. Rover motion changes the target image, affecting matching performance:
  - Target image size change
  - Target image roll, pitch, yaw changes
- Rover pose estimator using visual odometry (VO). VO estimation error affects active camera control:
  - Rover pose distance error
  - Rover pose orientation error
- Target position estimation using stereo vision. Stereo triangulation error affects active camera control:
  - Target position error on the image plane
- Active camera control to point the fixed mast at the target (Pancam and Navcam only). Fixed-mast pointing errors:
  - Pan/tilt encoder resolution
  - Pan/tilt backlash
  - Mast calibration accuracy
- 2D target tracking using normalized cross-correlation, scale, and affine matching. The above active camera control with VO and stereo vision determines the target image displacement, which affects:
  - Tracking success percentage
  - Tracking error
- Camera handoff:
  - Handoff success percentage
  - Handoff error

Slide 8: Hypothetical Calculations of Error Budget Model

Terrain:                                    Flat            Small rocks     Large rocks
Approach path:                              straight        straight        winding
Rover motion step size:                     20 cm           20 cm           20 cm or 10 deg

Rover locomotion/navigator
  Size change per frame:                    2% at 10 m and 10% at 2 m (all cases)
  Pitch/yaw changes:                        -- / --         10 deg / --     10 deg / 10 deg

VO rover pose (2%)
  Distance / orientation errors:            0.4 cm / 0.1deg 0.4 cm / 0.2deg 0.4 cm / 0.3deg

Target position error on image plane (stereo triangulation): 1 pixel (all cases)

Pan/tilt (540:1 gear, 16 CPR)
  Encoder resolution, backlash, and mast calibration accuracy: 0.04 deg (all cases)

Overall orientation error for active camera control: 0.1 deg / 0.2 deg / 0.3 deg

Target image displacement between frames
  Pancam (17 deg FOV):                      6 pixels        12 pixels       18 pixels
  Navcam (45 deg FOV):                      2.3 pixels      4.6 pixels      9.2 pixels
  Hazcam (100 deg FOV) with active gaze:    1 pixel         2 pixels        3 pixels

2D target tracking and camera handoff (tracking percentage; error at each step)
  1. Pancam for 4 m (10 m to 6 m):          95%; 2 pixels   90%; 3 pixels   85%; 4 pixels
  2. Handoff from Pancam to Navcam:         1 pixel         1 pixel         1 pixel
  3. Navcam for 4 m (6 m to 2 m):           95%; 3 pixels   90%; 4 pixels   85%; 5 pixels
  4. Handoff from Navcam to Hazcam:         1.5 pixels      1.5 pixels      1.5 pixels
  5. Hazcam for 1 m (2 m to 1 m):           90%; 2 pixels   90%; 2.5 pixels 85%; 3 pixels

Overall single-sol target approach and instrument placement
  Tracking percentage:                      81%             73%             61%
  Pixel error:                              3.0 pixels      3.5 pixels      4.0 pixels
  Placement error:                          1-sigma 2.0 cm  1-sigma 2.4 cm  1-sigma 2.7 cm
                                            3-sigma 6.1 cm  3-sigma 7.1 cm  3-sigma 8.1 cm
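The overall tracking percentages (81% / 73% / 61%) are consistent with simply multiplying the three per-leg tracking success rates, treating the handoffs as always succeeding. That interpretation is an assumption, not stated on the slide; a sketch:

```python
from math import prod

def overall_success(leg_rates):
    """Overall single-sol tracking success as the product of per-leg
    success rates (camera handoffs assumed to always succeed)."""
    return prod(leg_rates)

# Per-leg rates from the table: Pancam leg, Navcam leg, Hazcam leg
print(round(overall_success([0.95, 0.95, 0.90]) * 100))  # 81 (flat terrain)
print(round(overall_success([0.90, 0.90, 0.90]) * 100))  # 73 (small rocks)
print(round(overall_success([0.85, 0.85, 0.85]) * 100))  # 61 (large rocks)
```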

Slide 9: Test Environment

Slide 10: Test Environment

- Rocky8 rover in the Mars Yard
- Leica TCRA 1103 total station: 2 mm accuracy for manual tracking; 3 mm for automatic tracking
- 10x10-dot calibration target board and target stand
- Bricks with reflective-tape targets

Camera specifications:
Camera   Lens manufacturer; focal length   Hor. FOV x Vert. FOV   CCD image resolution   Stereo baseline
Pancam   Fujinon; 16 mm                    17 x 13 deg            1024x768 pixels        30 cm
Navcam   Raymax; 6 mm                      43.5 x 33 deg          1024x768 pixels        20 cm
Hazcam   Computar; 2.3 mm                  113 x 86 deg           640x480 pixels         8.3 cm
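As a consistency check, the Pancam and Navcam fields of view in the table match the pinhole relation FOV = 2*atan(d / 2f) for a 1/3" CCD, taking an assumed active area of about 4.8 mm x 3.6 mm (the Hazcam's 113 deg would imply a larger sensor format). A quick sketch:

```python
import math

def fov_deg(sensor_mm, focal_mm):
    """Full angular field of view of an ideal pinhole camera: 2*atan(d/2f)."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

# Assumed 1/3" CCD active area: ~4.8 mm x 3.6 mm
print(round(fov_deg(4.8, 16.0), 1), round(fov_deg(3.6, 16.0), 1))  # Pancam: ~17.1 x 12.8
print(round(fov_deg(4.8, 6.0), 1), round(fov_deg(3.6, 6.0), 1))    # Navcam: ~43.6 x 33.4
```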

Slide 11: Total Station Metrology

- Prism position data refinement
  - 1 cm error over a 60 cm span for the initial rover pose
  - 1 deg error in initial heading
  - 17 cm error over 10 m
- Prism position repeatability
  - Measure at every quarter turn of the prism stick
- Rover pose reference frame precision

[Figure: total station (TS) setup]
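The three numbers on this slide are mutually consistent under a simple small-angle model, assuming the 1 cm error acts across a 60 cm prism baseline (that geometric reading is an assumption):

```python
import math

# A 1 cm prism-position error across a 60 cm span corresponds to roughly
# a 1-degree heading error, which grows to roughly 17 cm of lateral error
# over a 10 m traverse.
heading_err_deg = math.degrees(math.atan2(0.01, 0.60))
lateral_err_cm = 100 * 10.0 * math.tan(math.radians(heading_err_deg))
print(f"{heading_err_deg:.2f} deg -> {lateral_err_cm:.1f} cm over 10 m")
```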

Slide 12: Mast Pan/Tilt Positioning

- Mast pan/tilt zero-positioning accuracy
  - Visual alignment markings
- Mast pan/tilt positioning repeatability
  - Move back to the zero position from several different initial positions
- Mast pan/tilt positioning control resolution
  - Tiny increments in radians (0.03 deg)

Slide 13: Mast and Body Camera Calibration

- Pancam/Navcam/Hazcam: 8 calibration-target positions each
- Use acaldots (automatic version of ccaldots) and ccaladj

Slide 14: Mast and Body Camera Calibration (continued)

- Stereo camera error ellipsoids
- Bricks with reflective tapes

Slide 15: Mast Calibration

Move the calibration target to different locations with different pan/tilt angles, then:
- Determine the 7 parameters (pan_offset, tilt_offset, x, y, z, thx, thy)
- Generate camera models relative to the masthead
- Determine the mast calibration error
- Determine a reasonably sufficient number of target positions

Slide 16: Purely Geometric Camera Handoff

- Pancam to Navcam (at about 6 m)
- Navcam to Hazcam (at about 2 m)
  - Measure handoff pixel errors
  - Analysis based on stereo error ellipsoids
  - Sensitivity to camera/mast calibration error

[Figure: handoff from Navcam to Hazcam]
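The core operation of a purely geometric handoff is reprojecting the target's stereo-derived 3-D position into the new camera using only the calibrated camera models. The deck uses CAHVOR models; the standard pinhole sketch below is a simplified stand-in, with `reproject` and the toy intrinsics chosen for illustration only.

```python
import numpy as np

def reproject(K, R, t, p_world):
    """Project a 3-D point into a camera with intrinsic matrix K and
    pose (R, t): the essence of a purely geometric camera handoff."""
    p_cam = R @ np.asarray(p_world, dtype=float) + t
    u, v, w = K @ p_cam
    return u / w, v / w

# Toy check: a point 2 m straight ahead of an ideal 640x480 camera
# (f = 500 px, principal point at the image center) lands at the center.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
print(reproject(K, np.eye(3), np.zeros(3), [0.0, 0.0, 2.0]))  # (320.0, 240.0)
```

The residual between this predicted pixel and the target's true appearance in the new camera is exactly the handoff error the slide proposes to measure.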

Slide 17: 2D Refinement

- Corrects purely geometric camera-handoff error by 2D image matching
- The template image for the 2nd camera must be generated from the 1st camera
- Scaled template image: uses the field-of-view-angle ratio of Hazcam to Navcam
- Warped template image: uses the camera models and stereo range maps of Navcam and Hazcam

[Figures: template image reconstructed from Navcam; actual Hazcam image]
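For the scaled-template option, the required resize factor can be estimated from the pixels-per-degree ratio of the two cameras. This small-angle sketch is an interpretation of the slide's "field of view angle ratio" method, not the plan's actual code, and `handoff_template_scale` is a hypothetical name.

```python
def handoff_template_scale(src_width_px, src_fov_deg, dst_width_px, dst_fov_deg):
    """Scale factor for resizing a template cut from the source camera so
    that features span the right number of pixels in the destination
    camera (constant pixels-per-degree approximation)."""
    return (dst_width_px / dst_fov_deg) / (src_width_px / src_fov_deg)

# Navcam (1024 px over 43.5 deg) handing off to Hazcam (640 px over 113 deg),
# using the values from the camera-specification table
s = handoff_template_scale(1024, 43.5, 640, 113.0)
print(round(s, 2))  # ~0.24: the Navcam template shrinks to about quarter scale
```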

Slide 18: 2D Refinement (continued)

Overlay of the Navcam image on the Hazcam image.

Slide 19: Target Tracking along Straight Approach Paths on a Flat Surface

Target tracking validation experiments:
- Furnish the tracking reliability and error budget model with experimentally validated numbers
- If some functionalities are not available, conduct portions of the baseline operations

Rover locomotion/navigator:
- Linear steps from 10 m to 1 m
  - 0.25 m: 36 steps (10, 9.75, 9.5, 9.25, 9, ..., 2, 1.75, 1.5, 1.25, 1)
  - 0.5 m: 18 steps (10, 9.5, 9, ..., 2, 1.5, 1)
  - 1 m: 9 steps (10, 9, ..., 2, 1)
- Percent-change steps from 10 m to 1 m
  - 10% change: 22 steps (10, 9, 8.1, 7.29, 6.56, ..., 1.21, 1.09, 0.99)
  - 20% change: 11 steps (10, 8, 6.4, 5.1, 4.1, ..., 1.7, 1.3, 1.1, 0.9)
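The two step-sequence families can be generated directly; the step counts match the slide. The stopping rule for percent-change steps (continue until the range first drops below 1 m) is inferred from the listed sequences, and the function names are illustrative.

```python
def linear_steps(start_m, stop_m, step_m):
    """Commanded ranges for fixed-length steps, e.g. 0.25 m from 10 m to 1 m."""
    seq = []
    d = float(start_m)
    while d >= stop_m - 1e-9:
        seq.append(round(d, 2))
        d -= step_m
    return seq

def percent_steps(start_m, frac, stop_m=1.0):
    """Commanded ranges when each step covers a fixed fraction of the
    remaining range; stops once the range first drops below stop_m."""
    seq = [float(start_m)]
    while seq[-1] >= stop_m:
        seq.append(seq[-1] * (1 - frac))
    return seq

print(len(linear_steps(10, 1, 0.25)) - 1)  # 36 steps, as on the slide
print(len(percent_steps(10, 0.10)) - 1)    # 22 steps for the 10% sequence
print(len(percent_steps(10, 0.20)) - 1)    # 11 steps for the 20% sequence
```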

Slide 20: Pose Estimator, Stereo Localization, and Mast Pointing

Pose estimator:
- Wheel odometer + IMU + visual odometer
- Measure pose estimation inaccuracy by comparing with total station metrology

Stereo localization:
- Use the stereo error ellipsoid data
- Compare with total station metrology

Mast pan/tilt pointing:
- Select an initial target at different image positions
- Run the tracker one step without rover motion first (zero rover motion excludes pose estimation error)
- After the mast completes the new pointing, measure the discrepancy between the target image position and the image center (512, 384)
- Repeat the above with rover motion (to include pose estimation error)
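The measured pixel discrepancy from the image center can be converted to an approximate angular pointing error. This constant pixels-per-degree, small-angle model is a sketch of the conversion, not the plan's procedure, and `pointing_error_deg` is a hypothetical helper.

```python
def pointing_error_deg(u_px, v_px, fov_h_deg, fov_v_deg,
                       width_px=1024, height_px=768):
    """Approximate pan/tilt pointing error from the tracked target's
    image position, measured against the image center (512, 384)."""
    pan = (u_px - width_px / 2) * fov_h_deg / width_px
    tilt = (v_px - height_px / 2) * fov_v_deg / height_px
    return pan, tilt

# A target found 24 pixels right of center in a Navcam image
# (43.5 x 33 deg FOV, from the camera-specification table)
pan, tilt = pointing_error_deg(536, 384, 43.5, 33.0)
print(f"pan error ~ {pan:.2f} deg, tilt error ~ {tilt:.2f} deg")
```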

Slide 21: Normalized Cross-Correlation and Scale/Affine Matching

- Refer to the 2D Target Tracking Validation Report
- Conduct actual tracking runs, as well as off-line tracking runs using the image sets from the actual tracking runs
  - Use different target selections and image skips
  - Measure the tracking reliability and error
- Open questions:
  - Linear or percent-change step size? What size?
  - Target loss: how to prevent it, or how to detect it?
  - Effect of lighting
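The normalized cross-correlation step at the heart of the 2D tracker can be sketched minimally as an exhaustive template search. This is an illustration of the technique only; the actual tracker presumably adds pyramid search, subpixel refinement, and the scale/affine warps the slide names.

```python
import numpy as np

def ncc(template, window):
    """Normalized cross-correlation score between two equal-size patches."""
    t = template - template.mean()
    w = window - window.mean()
    denom = np.sqrt((t * t).sum() * (w * w).sum())
    return float((t * w).sum() / denom) if denom > 0 else 0.0

def track(template, image):
    """Exhaustive NCC search; returns the (row, col) of the top-left
    corner of the best-matching window."""
    th, tw = template.shape
    best_score, best_rc = -2.0, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            score = ncc(template, image[r:r + th, c:c + tw])
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

# Synthetic check: cut a template out of a random image and find it again
rng = np.random.default_rng(0)
image = rng.random((40, 40))
template = image[12:20, 25:33].copy()
print(track(template, image))  # (12, 25): found where it was cut
```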

Slide 22: Camera Handoff and Hazcam Tracking

Camera handoff:
- Examine handoff errors for the tracking runs under the baseline operational scenario (Pancam to Navcam; Navcam to Hazcam)
- Compare with earlier experimental data tested separately

Hazcam tracking:
- Without active camera control
- Estimate the new target image position based on the rover pose estimator, stereo triangulation, and the camera model
- Examine the initial error of the estimated target image position, and the tracking success rate and error after 2D template matching

Slide 23: Target Tracking Tests with Different Approach Paths

- Target tracking along straight approach paths on a surface with small rocks
- Target tracking along winding approach paths on a surface with large rocks
- Target tracking with a heading-towards-target path
- Target tracking with hazard avoidance navigation
- Target tracking using MER images

Slide 24: Software Modifications Desired for Test Plan

1. Implement exact mast inverse kinematics
   - Camera frames are not in general parallel to the masthead frame
   - The CAHVOR model optical axis does not in general pass through the image center
2. Add parameters to params.txt
   - Window size for VO
   - Window size for tracking
   - Image skip for off-line runs
3. Add two initial search options to the pgm header
   - Camera pose
   - Image position
4. Hazcam tracking without active camera control
5. Camera handoff
   - Pancam to Navcam
   - Navcam to Hazcam
   - 2D refinement
6. Add parameters to drive.txt
   - Different step size for VO??
7. Lost-target detection
8. Stereo triangulation instead of full stereo image processing
9. Clean-up of the output file format