
1 Rover Navigation and Visual Odometry: A New Framework for Exploration Activities
Enrica Zereik, Enrico Simetti, Alessandro Sperindé, Sandro Torelli, Fabio Frassinelli, Davide Ducco and Giuseppe Casalino
GRAAL Lab, DIST, University of Genoa
ICRA Planetary Rover Workshop, Anchorage, Alaska, 3 May 2010

2 Why a Framework?
Develop a software architecture that lets researchers focus on the control algorithm alone, without caring about the underlying physical system:
to deal with the underlying specific hardware platform
to solve problems related to the real-time constraints of control systems
to provide data-unaware communication mechanisms
to be reused for different control systems in several applications

3 Main Objectives
Independence of each control algorithm from the underlying software platform
Minimization of the number of code lines not strictly related to the control algorithm
Standard communication mechanism between control tasks (minimum impact on the algorithm)
Capability of coordination between remote frameworks

4 Abstraction Levels: KAL

5 Abstraction Levels: WF

6 Abstraction Levels: BBS

7 KAL: Kernel Abstraction Layer
WorkFrame Name Server: abstraction of the OS resources and services

8 WF: WorkFrame
System Manager: resource request handling
Sched: Rel Sched can synchronize frameworks
Logger: communication toward the user

9 BBS: BlackBoard System
Inter-task communication and resource access, for both local and remote tasks
Shared BlackBoard for publishing data
Local execution of computations involving BB data
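A minimal sketch of such a blackboard in Python: tasks publish named data, other tasks read it, and computations on BB data run locally on the node holding the board. All names here are illustrative, not the framework's actual API.

```python
from threading import Lock

class BlackBoard:
    """Minimal shared blackboard: tasks publish named data and
    computations involving BB data execute locally on the board."""
    def __init__(self):
        self._data = {}
        self._lock = Lock()

    def publish(self, key, value):
        # any task (local or remote) can publish data under a name
        with self._lock:
            self._data[key] = value

    def read(self, key, default=None):
        with self._lock:
            return self._data.get(key, default)

    def compute(self, func, *keys):
        # local execution of a computation involving BB data
        with self._lock:
            return func(*(self._data[k] for k in keys))

bb = BlackBoard()
bb.publish("pose_x", 1.5)
bb.publish("pose_y", 2.0)
dist_sq = bb.compute(lambda x, y: x * x + y * y, "pose_x", "pose_y")
```

In the real system the board is also reachable by remote tasks over the WF communication layer; the lock here only models the mutually exclusive access.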

10 Framework Hierarchy
Resources, Scheduling
Device I/O
Mutually Exclusive Interprocess Data Sharing (also with remote tasks)
Network Communication
C++ Math Routines

11 Visual Odometry Module
Feature Extraction: in each image of the stereo pair
Stereo Matching: correspondence search
Triangulation: computation of the corresponding 3D points
Tracking in Time: tracking the same features in the following image acquisition
Motion Estimation: estimation of the motion that occurred between the two considered stereo image pairs
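The five stages above produce, for each new stereo pair, a relative motion estimate (R, t); the rover pose is then obtained by chaining these per-frame estimates. A minimal sketch of that accumulation, with toy numbers (not taken from the experiments):

```python
import numpy as np

def compose(R_wc, t_wc, R_delta, t_delta):
    """Accumulate a per-frame motion estimate (R_delta, t_delta)
    into the global pose (R_wc, t_wc): T_new = T_old * T_delta."""
    return R_wc @ R_delta, R_wc @ t_delta + t_wc

# toy run: four identical steps of 1 m forward then a 90-degree yaw,
# which should trace a closed square and return to the start
yaw = np.pi / 2
Rd = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
               [np.sin(yaw),  np.cos(yaw), 0.0],
               [0.0,          0.0,         1.0]])
td = np.array([1.0, 0.0, 0.0])

R, t = np.eye(3), np.zeros(3)
for _ in range(4):
    R, t = compose(R, t, Rd, td)
```

Because each step's (R, t) comes from noisy triangulated points, errors accumulate along the chain, which is why the outlier rejection and maximum-likelihood refinement in the later stages matter.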

12 Visual Odometry Module
Feature Extraction: LoG filtering + SURF (robust descriptors)

13 Visual Odometry Module
Stereo Matching: epipolar constraint, descriptor-based

14 Visual Odometry Module
Triangulation: subject to errors, outliers rejected
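For a rectified stereo pair, triangulation reduces to the disparity relation Z = f·b/d. A sketch with hypothetical camera parameters (focal length, baseline and pixel coordinates are made-up numbers), rejecting non-positive disparities as outliers:

```python
import numpy as np

def triangulate(xl, xr, y, f, b):
    """Triangulate one matched feature from a rectified stereo pair.
    xl, xr: horizontal coords in left/right image (camera-centred, px);
    y: vertical coord (px); f: focal length (px); b: baseline (m)."""
    d = xl - xr                       # disparity
    if d <= 0:
        return None                   # impossible geometry -> reject as outlier
    Z = f * b / d                     # depth from disparity
    X = xl * Z / f                    # back-project through the left camera
    Y = y * Z / f
    return np.array([X, Y, Z])

# hypothetical match: f = 400 px, baseline 0.12 m, disparity 20 px
point = triangulate(xl=110.0, xr=90.0, y=40.0, f=400.0, b=0.12)
```

Small disparity errors translate into large depth errors for far points (Z grows as 1/d), which is the main reason the pipeline rejects outliers before motion estimation.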

15 Visual Odometry Module
Tracking in Time: no external estimation, descriptor-based

16 Visual Odometry Module
Motion Estimation: Least Squares (outlier rejection, initial estimate) + Maximum Likelihood Estimation
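The least-squares step can be sketched with the standard SVD-based closed form for rigid motion between matched 3D point sets (the Horn/Kabsch solution). This is a generic stand-in for the authors' LS-plus-ML pipeline, run here on synthetic data:

```python
import numpy as np

def estimate_motion(P, Q):
    """Least-squares rigid motion (R, t) with Q ~= R @ P + t,
    for 3xN matched 3D point sets (SVD-based closed form)."""
    cP = P.mean(axis=1, keepdims=True)
    cQ = Q.mean(axis=1, keepdims=True)
    H = (Q - cQ) @ (P - cP).T                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # reflection guard
    R = U @ D @ Vt
    t = (cQ - R @ cP).ravel()
    return R, t

# synthetic check: recover a known 30-degree yaw and a translation
ang = np.pi / 6
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([0.3, -0.2, 1.0])
P = np.array([[0.0, 1.0, 0.0, 2.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 3.0]])          # four non-coplanar points
Q = R_true @ P + t_true[:, None]
R_est, t_est = estimate_motion(P, Q)
```

In the actual module this closed-form estimate serves as the initial guess (after outlier rejection), which the maximum-likelihood refinement then improves using the per-point error covariances.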

17 Experimental Setup
Custom GRAAL mobile robot
Tricycle-like structure
Bumblebee2 stereo camera system

18 Preliminary Results

19 Robotic Crew Assistant for Exploration Missions: Vision, Force Control and Coordination Strategies
Enrica Zereik, Andrea Sorbara, Andrea Merlo, Frederic Didot and Giuseppe Casalino
GRAAL Lab, DIST, University of Genoa
European Space Agency
Thales Alenia Space, Italy

20 Eurobot Wet Model

21 Eurobot Ground Prototype
Four-wheeled rover for autonomous navigation
7-d.o.f. arms, one camera on each
Pan/tilt stereo cameras for rover navigation
Pan/tilt stereoscopic head for manipulation
Exchangeable end-effector
JR3 force/torque sensor
Arm cameras

22 EGP - Control Aspects
Coordination: rover and arms, Dynamic Programming-based strategy
Vision: object recognition and centering, ARToolKitPlus and OpenCV support
Force: approaching and actual grasping, contact detection

23 General Control Architecture with Priority Tasks
Dynamic Programming-based coordination of robotic macro-structures
Independent from the specific system configuration
Many different control objectives can be required
Velocity control task requirement, associated cost-to-go, moving platform velocity

24 General Control Architecture with Priority Tasks: the i-th task
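The priority architecture resolves each task in the null space of the higher-priority ones. As an illustrative stand-in for the DP-based distributed scheme (not the authors' formulation), the classical two-task pseudoinverse plus null-space projection can be sketched as:

```python
import numpy as np

def two_task_priority(J1, x1_dot, J2, x2_dot):
    """Classical two-task priority inversion: task 1 is satisfied
    exactly, task 2 only within the null space of task 1."""
    J1p = np.linalg.pinv(J1)
    q1 = J1p @ x1_dot                            # primary-task solution
    N1 = np.eye(J1.shape[1]) - J1p @ J1          # null-space projector of task 1
    q2 = np.linalg.pinv(J2 @ N1) @ (x2_dot - J2 @ q1)
    return q1 + N1 @ q2

# toy 3-d.o.f. system with two compatible velocity tasks
q_dot = two_task_priority(np.array([[1.0, 0.0, 0.0]]), np.array([1.0]),
                          np.array([[0.0, 1.0, 0.0]]), np.array([2.0]))
```

The DP-based approach reaches the same kind of prioritized solution but distributes the computation along the kinematic chain, which is what makes it independent of the specific system configuration.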

25 Backward Phase
Use of relationships; monitoring MM tendency toward

26 Forward Phase, Implicit Priority Change
Remarks:
The risk of MM losses still exists (e.g. if the object must be lifted very high)
If an MM loss is detected, the last resort solution is modulating

27 Implicit Priority Change
Backward phase at platform level

28 Vision-based Recognition of Objects
Marker-based object tracking: reliability, robustness
Occurring problems: lighting conditions, complexity of the captured scene, distance from which the marker is seen
Preliminary thresholding, image cleaning, image zooming

29 Image Processing Chain
Image from camera → auto threshold → image cleaning → image zooming → pose estimator → LPF → to E-GNC
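The chain ends with a low-pass filter on the estimated pose. A minimal first-order (exponential smoothing) sketch, with an assumed smoothing constant rather than the filter actually tuned on the EGP:

```python
import numpy as np

def lpf(prev, new, alpha=0.8):
    """First-order low-pass filter on a pose estimate:
    alpha close to 1 smooths heavily but reacts slowly."""
    return alpha * prev + (1.0 - alpha) * new

pose = np.zeros(3)                         # filtered (x, y, z) estimate
stream = [np.array([1.0, 0.0, 0.0])] * 10  # toy stream of raw estimates
for raw in stream:
    pose = lpf(pose, raw)
```

After n identical samples the filtered value converges as 1 - alpha^n, so the choice of alpha trades pose-estimate noise against tracking lag in the E-GNC loop.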

30 Implicit Priority Change
Plots: angular error (raw, after zooming, after LPF); linear error (raw, after zooming, after LPF)

31 Force-based Approach towards Objects
Direct force control strategy:
Detect a contact with the object to be grasped
Compensate residual errors
Pure force only at the palm level, felt by the JR3 sensor
The contact point must belong to the palm surface, known and constant

32 Force-based Approach towards Objects
Velocity generation:
Contact point estimation
Velocity assigned to the estimated contact point
Compute the velocity reference with respect to the robot end-effector
Remarks: noisy sensor and too long a distance from the palm; initial error very small thanks to vision
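Contact detection from JR3-like force readings can be sketched as a threshold on a noise-filtered force magnitude; the window length and threshold below are illustrative assumptions, not the values used on the EGP:

```python
import numpy as np

def contact_detected(force_samples, threshold=2.0, window=5):
    """Declare palm contact when the moving-average magnitude of the
    last `window` force samples exceeds `threshold` (in newtons);
    averaging suppresses the sensor noise mentioned in the remarks."""
    f = np.asarray(force_samples, dtype=float)
    if len(f) < window:
        return False                       # not enough samples yet
    mags = np.linalg.norm(f[-window:], axis=1)
    return bool(mags.mean() > threshold)

free_motion = [[0.1, 0.0, 0.2]] * 6        # noise-level readings, no contact
touching = free_motion + [[0.0, 0.0, 5.0]] * 5   # sustained push on the palm
```

Filtering before thresholding matters because a single noisy spike would otherwise trigger a false contact and stop the approach prematurely.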

33 Simulation Results

34 Experimental Results
Video 3. EGP Failed Equipment Replacement.avi

35 Conclusions and Future Work
EGP:
Effective and autonomous robotic crew assistant
Marker removal
Potentially, flight model
Planetary Rovers:
Visual Odometry error less than 1%
3D reconstruction of the environment
DEM construction and autonomous navigation

36 References, I
[1] T. Kröger, D. Kubus and F. M. Wahl, “6D Force and Acceleration Sensor Fusion for Compliant Manipulation Control”, IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, October.
[2] B. J. Waibel and H. Kazerooni, “Theory and Experiments on the Stability of Robot Compliance Control”, IEEE Transactions on Robotics and Automation, February 1991, vol. 7, no. 1.
[3] G. Bradski and A. Kaehler, “Learning OpenCV: Computer Vision with the OpenCV Library”, O'Reilly.
[4] C. P. Lu, G. D. Hager and E. Mjolsness, “Fast and Globally Convergent Pose Estimation From Video Images”, IEEE Transactions on Pattern Analysis and Machine Intelligence, June 2000, vol. 22, no. 6.

37 References, II
[5] B. Kainz and M. Streit, “How to Write an Application with Studierstube 4.0”, Technical report, Graz University of Technology.
[6] J. Cai, “Augmented Reality: the Studierstube Project”, Seminar report.
[7] E. Zereik, A. Sorbara, G. Casalino and F. Didot, “Autonomous Dual-Arm Mobile Manipulator Crew Assistant for Surface Operations: Force/Vision-Guided Grasping”, International Conference on Recent Advances in Space Technologies, Istanbul, Turkey, June.
[8] E. Zereik, A. Sorbara, G. Casalino and F. Didot, “Force/Vision-Guided Grasping for an Autonomous Dual-Arm Mobile Manipulator Crew Assistant for Space Exploration Missions”, International Conference on Automation Robotics and Control Systems, Orlando, USA, July.

38 References, III
[9] G. Casalino and A. Turetta, “Coordination and Control of Multiarm Nonholonomic Mobile Manipulators”, in MISTRAL: Methodologies and Integration of Subsystems and Technologies for Robotic Architectures and Locomotion, B. Siciliano, G. Casalino, A. De Luca, C. Melchiorri (eds.), Springer Tracts in Advanced Robotics, Springer-Verlag, April.
[10] G. Casalino, A. Turetta and A. Sorbara, “Dynamic Programming based Computationally Distributed Kinematic Inversion Technique”, Advanced Space Technologies for Robotics and Automation, Noordwijk, The Netherlands, November.
[11] G. Casalino, A. Turetta and A. Sorbara, “DP-Based Distributed Kinematic Inversion for Complex Robotic Systems”, 7th Portuguese Conference on Automatic Control, Lisbon, Portugal, September.
[12] E. Zereik, “Space Robotics Supporting Exploration Missions: Vision, Force Control and Coordination Strategies”, Ph.D. Thesis, University of Genova.

39 References, IV (Visual Odometry)
[1] M. Maurette and E. Baumgartner, “Autonomous Navigation Ability: FIDO Test Results”, 6th ESA Workshop on Advanced Space Technologies for Robotics and Automation, Noordwijk, The Netherlands, November.
[2] M. Maimone, Y. Cheng and L. H. Matthies, “Two Years of Visual Odometry on the Mars Exploration Rovers”, Journal of Field Robotics, March 2007, vol. 24, no. 3.
[3] A. E. Johnson, S. B. Goldberg, Y. Cheng and L. H. Matthies, “Robust and Efficient Stereo Feature Tracking for Visual Odometry”, IEEE International Conference on Robotics and Automation, Pasadena, USA, May.
[4] L. Matthies, “Dynamic Stereo Vision”, Ph.D. Thesis, Carnegie Mellon University.

40 References, V
[5] D. Nistér, O. Naroditsky and J. Bergen, “Visual Odometry”, Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, USA, June.
[6] R. Hartley and A. Zisserman, “Multiple View Geometry in Computer Vision”, Cambridge University Press, March.
[7] E. Trucco and A. Verri, “Introductory Techniques for 3-D Computer Vision”, Prentice Hall.
[8] I. J. Cox, S. L. Hingorani, S. B. Rao and B. M. Maggs, “A Maximum Likelihood Stereo Algorithm”, Journal of Computer Vision and Image Understanding, 1996, vol. 63, no. 3.
[9] M. Fischler and R. Bolles, “Random Sample Consensus: a Paradigm for Model Fitting with Application to Image Analysis and Automated Cartography”, Communications of the Association for Computing Machinery, June 1981, vol. 24.

41 Thank you for your kind attention!

