Robot Vision Control of robot motion from video cmput 615/499 M. Jagersand.

Vision-Based Control (Visual Servoing) Initial Image Desired Image User

Vision-Based Control (Visual Servoing) : Current Image Features : Desired Image Features

Image-Space Error
e = y* − y, where y is the current image feature vector and y* the desired one.
One point: pixel coordinates (u, v). Many points: stack each point's (u, v) into one long feature vector.
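The stacked error above can be sketched numerically; the feature values here are illustrative, and the (u1, v1, u2, v2) stacking order is one common convention.

```python
import numpy as np

# Two tracked points, coordinates stacked as (u1, v1, u2, v2).
# The numbers are illustrative, not from the slides.
y      = np.array([120.0, 80.0, 300.0, 210.0])   # current image features
y_star = np.array([160.0, 95.0, 340.0, 225.0])   # desired image features

error = y_star - y    # one residual per pixel coordinate
```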

Conventional Robotics: Motion command in base coordinates. We focus on the geometric transforms from the base frame to the end-effector (EE).

Problem: Lots of coordinate frames to calibrate
Camera
– Center of projection
– Different models
Robot
– Base frame
– End-effector frame
– Object

Image Formation is Nonlinear Camera frame
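The nonlinearity comes from the perspective division by depth. A minimal sketch, assuming a simple pinhole model; the focal length and principal point values are illustrative:

```python
import numpy as np

def project(X, f=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of a 3D point given in the camera frame.
    The division by the depth z is what makes image formation nonlinear."""
    x, y, z = X
    return np.array([f * x / z + cx, f * y / z + cy])

# The same (x, y) at two different depths:
p1 = project(np.array([0.1, 0.2, 1.0]))
p2 = project(np.array([0.1, 0.2, 2.0]))
```

The pixel motion produced by a given 3D motion depends on the point's depth Z, so the mapping from camera motion to feature motion is only locally linear.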

Camera Motion to Feature Motion Camera frame

Camera Motion to Feature Motion
The interaction matrix is a Jacobian. This derivation is for one point.
Q: How do we extend it to more points?
Q: How many points are needed?
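For one point the interaction matrix takes a standard 2×6 form (normalized image coordinates x, y and depth Z, as in the visual-servoing literature). Stacking one block per point answers the extension question, and three non-collinear points supply the six rows needed for 6-DOF camera motion. A sketch with numpy; the point coordinates are illustrative:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """2x6 interaction matrix (image Jacobian) of one point feature.
    Maps camera velocity (vx, vy, vz, wx, wy, wz) to the feature
    velocity (xdot, ydot) in normalized image coordinates."""
    return np.array([
        [-1.0 / Z,      0.0, x / Z,     x * y, -(1 + x * x),    y],
        [     0.0, -1.0 / Z, y / Z, 1 + y * y,       -x * y,   -x],
    ])

# Extension to many points: stack one 2x6 block per point.
points = [(0.1, 0.0, 1.0), (0.0, 0.2, 1.5), (-0.1, 0.1, 2.0)]
L = np.vstack([interaction_matrix(*p) for p in points])   # 6x6 for 3 points
```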

Problem: Lots of coordinate frames to calibrate
Camera
– Center of projection
– Different models
Robot
– Base frame
– End-effector frame
– Object
Only covered one transform!

Other (easier) solution: Image-based motion control
Motor-visual function: y = f(x). Achieving 3D tasks via 2D image control.
Note: What follows works for one, two, or more cameras, either fixed or mounted on the robot. Here we will use two cameras.

Vision-Based Control Image 1 Image 2

Image-based Visual Servoing
Observed features: y ∈ R^m
Motor joint angles: x ∈ R^n
Local linear model: Δy = J Δx
Visual servoing steps:
1. Solve J Δx = y* − y for the motion Δx
2. Update x ← x + Δx
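The solve-and-update step can be sketched as a single least-squares computation; `servo_step`, the gain `lam`, and the test values are illustrative names, not from the slides:

```python
import numpy as np

def servo_step(J, y, y_star, lam=0.5):
    """One image-based visual-servoing step under the local linear
    model dy = J dx. Returns the damped joint update lam * dx."""
    # 1. Solve J dx = y* - y in the least-squares sense
    dx, *_ = np.linalg.lstsq(J, y_star - y, rcond=None)
    # 2. The caller applies x <- x + lam * dx
    return lam * dx

dx = servo_step(np.eye(2), np.zeros(2), np.array([2.0, 4.0]))
```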

Find J, Method 1: Test movements along basis directions
Remember: J is an unknown m × n matrix.
Assume small test movements ε e_i along each joint basis direction.
Finite difference: column i of J ≈ (y(x + ε e_i) − y(x)) / ε
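Method 1 can be sketched as a finite-difference loop over the joint basis directions; `f` stands in for the real robot-plus-camera system and is an illustrative name:

```python
import numpy as np

def estimate_jacobian_fd(f, x, eps=1e-3):
    """Estimate the m-by-n visual-motor Jacobian by test movements
    along each joint basis direction (finite differences).
    f maps joint angles x (n-vector) to image features y (m-vector)."""
    y0 = f(x)
    J = np.empty((y0.size, x.size))
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        J[:, i] = (f(x + e) - y0) / eps   # column i from a move along basis i
    return J
```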

Find J, Method 2: Secant Constraints
Each measured move gives a constraint along a line: J Δx_k = Δy_k, defining m equations.
Collect n arbitrary, but different, measured pairs (Δx_k, Δy_k).
Solve the stacked system for J.
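Method 2 can be sketched by stacking the secant constraints J Δx_k = Δy_k and solving them in the least-squares sense; the helper name and test data are illustrative:

```python
import numpy as np

def estimate_jacobian_secant(dxs, dys):
    """Solve for J from secant constraints J dx_k = dy_k.
    dxs: n joint moves (n-vectors, need not be axis-aligned);
    dys: the corresponding observed feature moves (m-vectors)."""
    X = np.array(dxs)            # n_moves x n
    Y = np.array(dys)            # n_moves x m
    # Each row gives X[k] @ J.T = Y[k], so solve X @ J.T = Y columnwise.
    Jt, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return Jt.T
```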

Find J, Method 3: Recursive Secant Constraints
Based on the current J and one measured pair (Δx, Δy), adjust J s.t. J Δx = Δy.
Rank-1 (Broyden) update: J ← J + ((Δy − J Δx) Δxᵀ) / (Δxᵀ Δx)
Considered in rotated coordinates, the update is the same as a finite difference for n orthogonal moves.
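The rank-1 update can be sketched directly from the formula; after the update the new Jacobian reproduces the observed move pair exactly:

```python
import numpy as np

def broyden_update(J, dx, dy):
    """Rank-1 (Broyden) secant update: adjust J so that the updated
    Jacobian satisfies J_new @ dx == dy, leaving J unchanged in all
    directions orthogonal to dx."""
    dx = dx.reshape(-1, 1)
    dy = dy.reshape(-1, 1)
    return J + (dy - J @ dx) @ dx.T / float(dx.T @ dx)
```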

Achieving visual goals: Uncalibrated Visual Servoing
1. Solve for motion: J Δx = y* − y
2. Move robot joints: x ← x + Δx
3. Read the actual visual move Δy
4. Update the Jacobian: J ← J + ((Δy − J Δx) Δxᵀ) / (Δxᵀ Δx)
Can we always guarantee when a task is achieved/achievable?
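The four steps combine into the following loop, a minimal sketch assuming a simulated motor-visual function `f` (in a real system only its outputs are observable), an initial Jacobian guess, and an illustrative gain `lam`:

```python
import numpy as np

def uncalibrated_visual_servo(f, x, y_star, J, lam=0.3, iters=50, tol=1e-6):
    """Uncalibrated visual servoing: solve for motion, move the joints,
    read the actual visual move, then Broyden-update the Jacobian."""
    y = f(x)
    for _ in range(iters):
        e = y_star - y
        if np.linalg.norm(e) < tol:
            break
        dx, *_ = np.linalg.lstsq(J, lam * e, rcond=None)     # 1. solve
        x = x + dx                                           # 2. move joints
        y_new = f(x)                                         # 3. read visual move
        dyc = (y_new - y).reshape(-1, 1)
        dxc = dx.reshape(-1, 1)
        J = J + (dyc - J @ dxc) @ dxc.T / float(dxc.T @ dxc) # 4. update J
        y = y_new
    return x, J

# Illustrative stand-in for the robot+camera system:
x_final, J_final = uncalibrated_visual_servo(
    f=lambda x: np.array([2.0 * x[0], 3.0 * x[1]]),
    x=np.zeros(2), y_star=np.array([2.0, 3.0]), J=np.eye(2))
```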

Visually guided motion control. Issues:
1. What tasks can be performed? (Camera models, geometry, visual encodings)
2. How to do vision-guided movement? (Hand-eye transform estimation, feedback, feedforward motion control)
3. How to plan, decompose and perform whole tasks?

How to specify a visual task?

Task and Image Specifications
Task function T(x) = 0 (task-space error); image encoding E(y) = 0 (image-space error).
The encoding should be chosen so that satisfying the image-space error implies the task-space error is satisfied.

Visual specifications
Point-to-point task “error”. Why 16 elements?

Visual specifications 2: Point to Line
Note: y in homogeneous coordinates.
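With homogeneous coordinates, a line through two image points is their cross product, and the point-to-line error is the incidence product l · y. A sketch; the point values are illustrative:

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two homogeneous image points."""
    return np.cross(p, q)

def point_to_line_error(y, l):
    """Incidence product: zero exactly when point y lies on line l."""
    return float(np.dot(l, y))

l = line_through(np.array([0.0, 0.0, 1.0]), np.array([1.0, 1.0, 1.0]))
on  = point_to_line_error(np.array([2.0, 2.0, 1.0]), l)   # on the line
off = point_to_line_error(np.array([2.0, 0.0, 1.0]), l)   # off the line
```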

Parallel Composition example
E_wrench(y) stacks the individual point-to-point and point-to-line error terms into one error vector (plus e.p. checks).

Achieving visual goals: Uncalibrated Visual Servoing
1. Solve for motion: J Δx = y* − y
2. Move robot joints: x ← x + Δx
3. Read the actual visual move Δy
4. Update the Jacobian: J ← J + ((Δy − J Δx) Δxᵀ) / (Δxᵀ Δx)
Can we always guarantee when a task is achieved/achievable?

Task ambiguity
Will the scissors cut the paper in the middle?

Task ambiguity
Will the scissors cut the paper in the middle? NO!

Task Ambiguity
Is the probe contacting the wire?

Task Ambiguity
Is the probe contacting the wire? NO!

Task Ambiguity
Is the probe contacting the wire?

Task Ambiguity
Is the probe contacting the wire? NO!

Task decidability and invariance
A task T(f) = 0 is invariant under a group G of transformations iff for all f ∈ V and all g ∈ G with g(f) ∈ V: T(f) = 0 ⇔ T(g(f)) = 0.
If T(f) = 0 here, T(f) must be 0 there, if T is G-invariant.
(Groups considered: projective, injective, similarity, affine.)

All achievable tasks... are ALL projectively distinguishable coordinate systems:
one point; coincidence; noncoincidence; collinearity; general position; 2-pt. coincidence; 3-pt. coincidence; n-pt. coincidence; 2× 2-pt. coincidence; cross ratio; coplanarity.

Composing possible tasks
Task primitives: T_pp (point coincidence), T_col (collinearity), T_copl (coplanarity), T_cr (cross ratios). Classes: C_inj (injective), C_wk (weak/projective).
Task operators:
AND: T1 ∧ T2 stacks both errors (both must vanish)
OR: T1 ∨ T2 = T1 · T2
NOT: ¬T = 0 if T ≠ 0, 1 otherwise
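Under the convention that a task is satisfied when its error vanishes, the operators can be sketched as follows; this is a reconstruction, since the slide's exact formulas did not survive extraction, and the helper names are illustrative:

```python
import numpy as np

# Assumed convention (reconstruction): task T is "satisfied" at feature
# configuration f when its error T(f) is zero.
def t_and(t1, t2):
    """AND: stack both errors; satisfied only if both vanish."""
    return lambda f: np.hstack([np.atleast_1d(t1(f)), np.atleast_1d(t2(f))])

def t_or(t1, t2):
    """OR (scalar tasks): product vanishes if either factor does."""
    return lambda f: t1(f) * t2(f)

def t_not(t):
    """NOT: satisfied (zero) exactly when t is not satisfied."""
    return lambda f: 0.0 if np.any(np.atleast_1d(t(f)) != 0) else 1.0
```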

Result: Task toolkit
Wrench example (view 1, view 2):
T_wrench(x_1..8) = T_col(x2, x5, x6) ∧ T_col(x4, x7, x8) ∧ T_pp(x1, x5) ∧ T_pp(x3, x7)

Serial Composition: solving whole real tasks
Task primitive/“link”:
1. Acceptable initial (visual) conditions
2. Visual or motor constraints to be maintained
3. Final desired condition
Task = a chain of such links.

“Natural” primitive links
1. Transportation: coarse primitive for large movements; ≤3-DOF control of the object centroid; robust to disturbances.
2. Fine manipulation: for high-precision control of both position and orientation; 6-DOF control based on several object features.

Example: pick-and-place type of movement
3. Alignment??? To match the transport final conditions to the fine-manipulation initial conditions.

More primitives
4. Guarded move: move along some direction until an external constraint (e.g. contact) is satisfied.
5. Open-loop movements: when the object is obscured, or for ballistic fast movements. Note: these can be done based on previously estimated Jacobians.

Solving the puzzle…

Conclusions
Visual servoing: the process of minimizing a visually-specified task by using visual feedback for motion control of a robot.
Is it difficult? Yes.
– Controlling the 6D pose of the end-effector from 2D image features.
– Nonlinear projection, degenerate features, etc.
Is it important? Of course.
– Vision is a versatile sensor.
– Many applications: industrial, health, service, space, humanoids, etc.

What environments and tasks are of interest?
Not the previous examples; those could be solved “structured”.
We want to work in everyday unstructured environments.

Use in space applications
Space station
On-orbit service
Planetary exploration

Power line inspection
Model landscape: Edmonton railroad society.
An electric UAV carrying a camera is flown over the model.
Simulation: a robot arm holds the camera.

Use in Power Line Inspection

Some References
M. Spong, S. Hutchinson, M. Vidyasagar. Robot Modeling and Control. Chapter 12: Vision-Based Control. John Wiley & Sons: USA. (Text)
S. Hutchinson, G. Hager, P. Corke, "A tutorial on visual servo control," IEEE Trans. on Robotics and Automation, October 1996, vol. 12, no. 5. (Tutorial)
F. Chaumette, S. Hutchinson, "Visual Servo Control, Part I: Basic Approaches," and "Part II: Advanced Approaches," IEEE Robotics and Automation Magazine, 13(4):82-90, December 2006, and 14(1), March. (Tutorial)
B. Espiau, F. Chaumette, P. Rives, "A new approach to visual servoing in robotics," IEEE Trans. on Robotics and Automation, June 1992, vol. 8, no. 6. (Image-based)
W. J. Wilson, C. C. Williams Hulls, G. S. Bell, "Relative End-Effector Control Using Cartesian Position Based Visual Servoing," IEEE Trans. on Robotics and Automation, vol. 12, no. 5, October 1996. (Position-based)
E. Malis, F. Chaumette, S. Boudet, "2 1/2 D visual servoing," IEEE Trans. on Robotics and Automation, April 1999, vol. 15, no. 2. (Hybrid)
M. Jägersand, O. Fuentes, R. Nelson, "Experimental Evaluation of Uncalibrated Visual Servoing for Precision Manipulation," Proc. IEEE Intl. Conference on Robotics and Automation (ICRA), Apr. (Uncalibrated, model-free)
F. Chaumette, "Potential problems of stability and convergence in image-based and position-based visual servoing," The Confluence of Vision and Control, LNCS Series, No. 237, Springer-Verlag, 1998.