Presentation on theme: "Chayatat Ratanasawanya March 16, 2011. Overview Thesis problem The UAV Pose estimation by POSIT Previous work Development of POSIT-based real-time pose."— Presentation transcript:

1 Chayatat Ratanasawanya March 16, 2011

2 Overview
- Thesis problem
- The UAV
- Pose estimation by POSIT
- Previous work
- Development of POSIT-based real-time pose estimation algorithm
- Experimental results
- Questions

3 Thesis problem statement
Develop a flexible human/machine control system to hover a UAV carrying a video camera beside an object of interest, such as a window, for surveillance purposes.
Method: human control via joystick; machine control via visual servoing.
Application: the police could use the system to survey a room from outside a building.

4 The UAV
Q-ball: a 6-DOF quadrotor helicopter which came with Simulink-based real-time controllers.
[Diagram: the helicopter/controller loop. Desired inputs X, Y, Z, and yaw go to the controller; feedback comes from Optitrack (X, Z), the IMU (roll, pitch), a sonar (Y), and a magnetometer (yaw). A camera is mounted on the Q-ball; world-frame axes x, y, z.]

5 POSIT algorithm
Developers: Daniel DeMenthon & Philip David. The algorithm determines the pose of an object relative to the camera from a set of 2D image points.
Inputs: image coordinates of at least 4 non-coplanar feature points, the 3D object coordinates of the same points, and the camera intrinsic parameters (focal length f, principal point cc).
Outputs: the rotation matrix and translation of the object with respect to the camera.

6 Previous work
- Cardboard box target; took still images of the target from various locations in the lab
- Manual feature point identification; object pose was estimated offline
- Target was self-occluded; not a real-time process
[Diagram: object frame axes x, y, z on the box target.]

7 Current work
An image-based control algorithm is being developed; it must be a real-time process:
- UAV pose must be estimated in real time
- The target must not be self-occluded
- Image source: live video
- Image processing has to be fast
- Feature points must be identified automatically

8 Feature points extraction
Pipeline: camera → detect LED → detect window → detect corners → discard unwanted detected feature points.

9 Feature points undistortion?
Distortion coefficients come from camera calibration. Image processing must be fast, with no unnecessary calculations, so the question is whether undistorting the feature points is worth the cost. Approach: evaluate the pose estimated by POSIT from distorted and from undistorted feature-point locations.
[Diagram: video from the camera feeds feature-point extraction; one branch is undistorted by a look-up table, the other is not. Each branch passes through a point-location filter and POSIT with inverse kinematics to give 6-DOF UAV pose estimates, which are compared against the 6-DOF Optitrack pose and the IMU roll and pitch.]
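The look-up-table idea can be sketched as follows: precompute the undistorted location of every pixel once, so per-frame correction is just an array lookup. The simple radial model and coefficients below are illustrative assumptions; the actual table would use the coefficients from the camera calibration.

```python
import numpy as np

def build_undistort_lut(width, height, f, cc, k1, k2, iters=10):
    """Map each distorted pixel (u, v) to its undistorted pixel location.

    Assumes a radial model x_d = x_u * (1 + k1*r^2 + k2*r^4) in normalised
    coordinates; the inverse is found by fixed-point iteration.
    """
    u, v = np.meshgrid(np.arange(width), np.arange(height))
    xd = (u - cc[0]) / f                    # normalised distorted coords
    yd = (v - cc[1]) / f
    xu, yu = xd.copy(), yd.copy()
    for _ in range(iters):
        r2 = xu**2 + yu**2
        factor = 1 + k1 * r2 + k2 * r2**2
        xu, yu = xd / factor, yd / factor   # divide out the distortion
    return np.stack([xu * f + cc[0], yu * f + cc[1]], axis=-1)  # (H, W, 2)

def undistort_point(lut, pt):
    """Per-frame use: round a detected point to a pixel and look it up."""
    u, v = int(round(pt[0])), int(round(pt[1]))
    return lut[v, u]
```

The table costs one pass over the image at startup; after that each feature point is corrected with a single indexing operation, which matches the "no unnecessary calculations" goal.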

10 Experimental setup

11 Experimental setup
The Q-ball was randomly placed in 20 locations in the lab, with a different pose in each location. A live video stream was acquired and the UAV pose estimated with POSIT in real time. The 6-DOF pose estimates, Optitrack readings, and IMU readings were recorded; the Optitrack readings are used as the reference.

12 Results - X
[Plot: X estimates for each test from undistorted points, distorted points, and Optitrack, with standard deviations.]

13 Results - Y
[Plot: Y estimates for each test from undistorted points, distorted points, and Optitrack, with standard deviations.]

14 Results - Z
[Plot: Z estimates for each test from undistorted points, distorted points, and Optitrack, with standard deviations.]

15 Results - Roll
[Plot: roll estimates for each test from undistorted points, distorted points, Optitrack, and the IMU, with standard deviations.]

16 Results - Pitch
[Plot: pitch estimates for each test from undistorted points, distorted points, Optitrack, and the IMU, with standard deviations.]

17 Results - Yaw
[Plot: yaw estimates for each test from undistorted points, distorted points, and Optitrack, with standard deviations.]

18 Mean and SD of error of all 3000 measurements
[Table: mean and SD of the error in each DOF (X, Y, Z in cm; roll, pitch, yaw in degrees) for three comparisons: distorted feature points w.r.t. Optitrack, undistorted feature points w.r.t. Optitrack, and Optitrack w.r.t. IMU. The Optitrack-vs-IMU comparison is N/A for X, Y, Z, and yaw. A second copy of the table excludes tests #3 and #15.]

19 Conclusion
- The POSIT algorithm is an alternative for real-time UAV pose estimation
- The target consists of a white LED and a window, giving 5 non-coplanar feature points: the LED and the 4 window corners
- Pose estimation using undistorted feature points is more accurate than that using distorted points, with a significant improvement along the Z-direction
- Image information may be mapped to positional control inputs via the POSIT algorithm

20 Summary
- Thesis problem & the UAV
- Previous work on POSIT and its drawbacks
- POSIT-based real-time pose estimation algorithm: feature point extraction from live video, undistortion of feature-point image coordinates, feature-point location filtering, and a real-time implementation
- Comparison between the pose estimated by POSIT, the pose from Optitrack, and the 2 attitude angles from the IMU

21 Homogeneous transformation
A homogeneous transformation is a 4x4 matrix which describes how one coordinate frame is related to another, combining a rotation with a translation (dx, dy, dz). It is used to convert the location of a point between two frames, e.g. from frame C to frame A.
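The idea can be sketched in numpy: pack a rotation and a translation into one 4x4 matrix and convert a point from frame C to frame A with a single product. The example rotation and translation values are illustrative.

```python
import numpy as np

def make_transform(R, d):
    """Build the 4x4 homogeneous transform [R d; 0 0 0 1]."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = d
    return T

# Frame C rotated 90 degrees about z relative to frame A,
# with its origin at (1, 2, 3) in frame A.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
A_T_C = make_transform(Rz, [1.0, 2.0, 3.0])

# A point known in frame C, converted to frame A (homogeneous coordinates).
p_C = np.array([1.0, 0.0, 0.0, 1.0])
p_A = A_T_C @ p_C          # -> [1, 3, 3, 1]
```

Chains of frames compose by matrix multiplication, which is what makes this representation convenient for the result calculation later in the talk.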

22 Inverse kinematics
The process of deriving the transformation (rotation and translation) between two frames from a known transformation matrix: the translation is read directly from the matrix, and the rotation angles are recovered via the inverse-kinematics formulas.

23 Inverse kinematics formulas
[Diagram: rotation angles ψ and θ about the frame axes x, y, z.]
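The slide's formulas themselves are not preserved in the transcript. As a hedged stand-in, here is the standard Z-Y-X (yaw-pitch-roll) angle extraction, which plays the same role: recovering rotation angles from a known rotation matrix.

```python
import numpy as np

def rotation_to_euler_zyx(R):
    """Return (roll, pitch, yaw) in radians from a 3x3 rotation matrix,
    assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll) and |pitch| < 90 degrees."""
    pitch = -np.arcsin(R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return roll, pitch, yaw

def euler_zyx_to_rotation(roll, pitch, yaw):
    """Compose R = Rz(yaw) @ Ry(pitch) @ Rx(roll), the forward direction."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx
```

Round-tripping a set of angles through the forward and inverse functions recovers them exactly away from the pitch singularity.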

24 Result calculation
[Diagram: the coordinate frames involved: world frame W, object frame L, Q-ball frame Q, and camera frame C. POSIT supplies the transform C T L, the pose of the object frame with respect to the camera frame.]

25 Result calculation
[Diagram: the known transforms Q T C (camera w.r.t. the Q-ball) and W T L (object w.r.t. the world) are chained with the POSIT output C T L; the translation is read from the resulting matrix and the rotation angles follow from the inverse-kinematics formulas.]
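A sketch of how this frame chaining might look in numpy, assuming the Q-ball pose is obtained as W_T_Q = W_T_L @ inv(C_T_L) @ inv(Q_T_C). The transform names follow the frames on the slide, but the numeric values are illustrative assumptions.

```python
import numpy as np

def make_transform(R, d):
    """Build the 4x4 homogeneous transform [R d; 0 0 0 1]."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = d
    return T

def invert_transform(T):
    """Closed-form inverse of a rigid transform: [R^T  -R^T d; 0 1]."""
    Ti = np.eye(4)
    Ti[:3, :3] = T[:3, :3].T
    Ti[:3, 3] = -T[:3, :3].T @ T[:3, 3]
    return Ti

# Illustrative transforms: camera 10 cm ahead of the Q-ball centre,
# object 2 m in front of the camera (POSIT output), object 3 m from the
# world origin along x. All rotations are identity to keep the arithmetic clear.
Q_T_C = make_transform(np.eye(3), [0.0, 0.0, 0.10])
C_T_L = make_transform(np.eye(3), [0.0, 0.0, 2.0])
W_T_L = make_transform(np.eye(3), [3.0, 0.0, 0.0])

# Chain: (L -> W) o (C -> L) o (Q -> C) maps the Q-ball frame into the world.
W_T_Q = W_T_L @ invert_transform(C_T_L) @ invert_transform(Q_T_C)
position = W_T_Q[:3, 3]    # Q-ball position in the world frame -> [3, 0, -2.1]
```

The rotation block of W_T_Q would then go through the inverse-kinematics formulas from slide 23 to yield the attitude angles.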

