VISION-BASED TRACKING OF A MOVING OBJECT BY A 2 DOF HELICOPTER MODEL: THE SIMULATION
Chayatat Ratanasawanya
October 30, 2009


OVERVIEW
- Classification of visual-servo systems
- Components of the simulation
- The non-linear model of the helicopter
- The LQR controller
- Perspective projection model & camera calibration
- Coordinate systems
- Determination of the position of the ball
- The simulation: logic and result
- Summary
- Questions/comments

VISUAL-SERVO SYSTEMS TAXONOMY
In 1980, Sanderson and Weiss introduced a taxonomy of visual-servo systems. Two questions:
1. Is the control structure hierarchical, with the vision system providing set-points as input to the robot's joint-level controller, or does the visual controller directly compute the joint-level inputs?
2. Is the error signal defined in 3D (task-space) coordinates or directly in terms of image features?

4 SYSTEM STRUCTURES
- Dynamic position-based look-and-move structure
- Dynamic image-based look-and-move structure

4 SYSTEM STRUCTURES
- Position-based visual servo structure
- Image-based visual servo structure

THE IMAGE JACOBIAN
Used in image-based control. The matrix describes how image feature parameters change with respect to changing manipulator pose.
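For a single normalized point feature, the image Jacobian (interaction matrix) has a standard closed form in the visual-servoing literature; a minimal sketch, where Z is the feature's depth and the camera velocity is ordered [vx, vy, vz, wx, wy, wz]:

```python
import numpy as np

def point_interaction_matrix(x, y, Z):
    """Image Jacobian for a normalized image point (x, y) at depth Z:
    maps camera spatial velocity to image-feature velocity."""
    return np.array([
        [-1.0 / Z, 0.0,       x / Z, x * y,        -(1.0 + x**2), y],
        [0.0,      -1.0 / Z,  y / Z, 1.0 + y**2,   -x * y,        -x],
    ])

# A point on the optical axis (x = y = 0) at 1 m depth: pure
# translation along the optical axis (vz) produces no image motion.
L = point_interaction_matrix(0.0, 0.0, 1.0)
sdot = L @ np.array([0.0, 0.0, 0.5, 0.0, 0.0, 0.0])
```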

THE SIMULATION - AN INTRODUCTION
The system being simulated can be categorized as a dynamic position-based look-and-move system. Components:
- The non-linear model of the helicopter and the joint-level LQR controller (implemented by Quanser)
- Perspective projection model & camera calibration
- Coordinate systems
- Determination of the ball's position in the world frame & in the camera frame

THE NON-LINEAR MODEL OF THE HELICOPTER
A Simulink block provided by Quanser that captures the dynamic equations of the helicopter plant.

THE LQR CONTROLLER
- A controller design technique that works with the state-space representation of a plant/system: given weighting matrices Q and R, calculate the state-feedback gain K that minimizes the quadratic cost.
- Has the same action as a PD or a PID controller.
- In the simulation, it is a joint-level controller.
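The gain comes from the continuous-time algebraic Riccati equation. A minimal sketch in Python with SciPy, using a toy double-integrator plant as a stand-in (the real A and B matrices come from Quanser's linearization of the helicopter):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy double-integrator plant, illustrative only.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.diag([1.0, 0.1])    # state weighting
R = np.array([[1.0]])      # input weighting

P = solve_continuous_are(A, B, Q, R)   # solve the Riccati equation
K = np.linalg.solve(R, B.T @ P)        # optimal gain, u = -K x
```

The resulting closed-loop matrix A - B K is guaranteed stable for a stabilizable (A, B).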

PERSPECTIVE PROJECTION MODEL & CAMERA CALIBRATION
The projection model relates the position of an object in the camera frame to the pixel coordinates of that object's image on the image plane.
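A minimal sketch of the pinhole projection and its inverse; the intrinsics fx, fy, cx, cy below are illustrative placeholders, since the real values come from the camera calibration step:

```python
import numpy as np

def project(point_cam, fx, fy, cx, cy):
    """Project a 3-D point in the camera frame to pixel coordinates
    using the pinhole (perspective) model."""
    X, Y, Z = point_cam
    return fx * X / Z + cx, fy * Y / Z + cy

def back_project(u, v, Z, fx, fy, cx, cy):
    """Inverse projection: recover the camera-frame point from its
    pixel coordinates and a known (or estimated) depth Z."""
    return np.array([(u - cx) * Z / fx, (v - cy) * Z / fy, Z])

# Illustrative intrinsics, not calibrated values.
fx = fy = 700.0
cx, cy = 320.0, 240.0
u, v = project((0.1, -0.05, 2.0), fx, fy, cx, cy)
```

Note that the inverse is only defined up to depth: the pixel alone gives a ray, and Z must be supplied separately.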

COORDINATE SYSTEMS
- The world frame: stationary frame attached to the pivot point.
- The helicopter frame: attached to the helicopter at the pivot point. It moves with the helicopter.
- The camera frame: attached to the camera at the center of projection.
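The three frames can be related by composing two rotations and the camera offset. A sketch under stated assumptions: the helicopter frame shares the world origin at the pivot, the camera axes are aligned with the helicopter axes, and the helicopter rotates by yaw about z then pitch about y (the actual axis and sign conventions depend on the rig); t_cam is a hypothetical camera offset from the pivot, expressed in the helicopter frame:

```python
import numpy as np

def rot_y(theta):  # pitch
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(psi):    # yaw
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def cam_to_world(p_cam, pitch, yaw, t_cam):
    """Map a point from the camera frame to the world frame."""
    R = rot_z(yaw) @ rot_y(pitch)      # helicopter -> world rotation
    return R @ (p_cam + t_cam)

# With zero pitch/yaw, only the camera offset shifts the point.
p_world = cam_to_world(np.array([0.0, 0.0, 2.0]), 0.0, 0.0,
                       np.array([0.1, 0.0, 0.0]))
```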

DETERMINATION OF THE BALL'S POSITION: THE SCENARIO
Initially, a ping-pong ball is right in front of the camera.

DETERMINATION OF THE BALL'S POSITION: THE SCENARIO
The ball is moved to a new position; the helicopter has not moved yet.

DETERMINATION OF THE BALL'S POSITION: THE SCENARIO
The helicopter moves to the new position to align the ball to the camera.

DETERMINATION OF THE BALL'S POSITION: RECAP
- The ball is initially right in front of the camera. We know the pose of the helicopter (pitch θ0 and yaw ψ0).
- The ball is moved. Get the new ball position in the camera frame from the inverse projection model.
- Use the current pose to calculate the new ball position in the world frame.
- Use the ball position in the world frame from the previous step to calculate the desired pose (θd and ψd). Pass these values to the LQR controller.
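The last step of the recap, turning the ball's world-frame position into a desired pose, reduces to two arctangents. A sketch with illustrative axis and sign conventions (the real ones depend on how the world frame is attached at the pivot):

```python
import numpy as np

def desired_pose(p_world):
    """Desired pitch and yaw that point the camera's optical axis at a
    world-frame point; conventions here are illustrative only."""
    x, y, z = p_world
    yaw = np.arctan2(y, x)                 # rotation about vertical axis
    pitch = np.arctan2(z, np.hypot(x, y))  # elevation above horizontal
    return pitch, yaw

# A ball level with the pivot, 45 degrees to the side.
pitch_d, yaw_d = desired_pose(np.array([1.0, 1.0, 0.0]))
```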

PUTTING IT ALL TOGETHER: THE SIMULATION

IMPLEMENTATION
The first step towards implementation has been taken; i.e. locating the ball's centre of gravity in real time.
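The centre of gravity of the ball is just the mean of the foreground pixel coordinates. A minimal NumPy sketch, assuming the camera image has already been segmented into a binary mask (ball_centroid is a hypothetical helper, not part of the presented implementation):

```python
import numpy as np

def ball_centroid(mask):
    """Centre of gravity (pixel coordinates) of a binary ball mask."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None            # no ball pixels found
    return xs.mean(), ys.mean()

# Synthetic 5x5 mask with a 2x2 "ball".
mask = np.zeros((5, 5), dtype=bool)
mask[1:3, 2:4] = True
u_c, v_c = ball_centroid(mask)
```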

SUMMARY
- Visual-servo systems taxonomy
- Components of the simulation
- The non-linear dynamic model
- The controller
- Projection model
- Coordinate systems
- Locating the ball
- The simulation
- First step towards implementation

QUESTIONS/COMMENTS
7th annual UVS Canada conference 2009, Victoria, BC, November 2-5, 2009