
Slide 1
Chayatat Ratanasawanya
March 16, 2011

Slide 2 – Overview
- Thesis problem
- The UAV
- Pose estimation by POSIT
- Previous work
- Development of a POSIT-based real-time pose estimation algorithm
- Experimental results
- Questions

Slide 3 – Thesis problem statement
Develop a flexible human/machine control system to hover a UAV carrying a video camera beside an object of interest, such as a window, for surveillance purposes.
Method:
- Human control – joystick
- Machine control – visual servoing
Application: police use of the system to survey a room from outside a building.

Slide 4 – The UAV
The Q-ball is a 6-DOF quadrotor helicopter that comes with Simulink-based real-time controllers.
[Block diagram: desired X, Y, Z, and yaw inputs feed the HelicopterController; feedback comes from Optitrack (X, Z), the IMU (roll, pitch), a sonar (Y), and a magnetometer (yaw). A camera is mounted on the Q-ball; world-frame axes x, y, z are shown.]

Slide 5 – POSIT algorithm
Developed by Daniel DeMenthon and Philip David, the algorithm determines the pose of an object relative to the camera from a set of 2D image points.
Inputs: image coordinates of at least 4 non-coplanar feature points, the 3D object coordinates of the same points, and the camera intrinsic parameters (focal length f, principal point cc).
Outputs: the rotation matrix and the translation of the object with respect to the camera.
Reference: http://www.cfar.umd.edu/~daniel/classicPosit.m
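The inputs and outputs listed above can be illustrated in code. Below is a minimal numpy sketch of the classic iterative POSIT loop, assuming image coordinates are already given relative to the principal point and the focal length f is in pixels; it is an illustration of the technique, not the referenced MATLAB implementation.

```python
import numpy as np

def posit(object_pts, image_pts, f, iters=20):
    """Classic POSIT (DeMenthon & David): pose from >= 4 non-coplanar
    points. object_pts: (N, 3) in the object frame, first point is the
    reference; image_pts: (N, 2) in pixels relative to the principal
    point; f: focal length in pixels. Returns (R, t), the rotation and
    translation of the object frame w.r.t. the camera frame."""
    M = object_pts - object_pts[0]       # vectors from the reference point
    A = M[1:]                            # (N-1, 3) object-vector matrix
    B = np.linalg.pinv(A)                # (3, N-1) pseudoinverse
    eps = np.zeros(len(A))               # SOP correction terms, start at 0
    for _ in range(iters):
        # scaled orthographic projection corrected by the current eps
        xp = image_pts[1:, 0] * (1 + eps) - image_pts[0, 0]
        yp = image_pts[1:, 1] * (1 + eps) - image_pts[0, 1]
        I, J = B @ xp, B @ yp
        s = (np.linalg.norm(I) + np.linalg.norm(J)) / 2   # s = f / Z0
        i = I / np.linalg.norm(I)
        j = J / np.linalg.norm(J)
        k = np.cross(i, j)
        eps = (A @ k) * s / f            # eps_i = (M0Mi . k) / Z0
    R = np.vstack((i, j, np.cross(i, j)))
    t = np.array([image_pts[0, 0], image_pts[0, 1], f]) / s
    return R, t
```

With a synthetic target and a known camera pose, the recovered translation converges to the true one after a few iterations.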

Slide 6 – Previous work
- Cardboard-box target (object-frame axes x, y, z)
- Still images of the target taken from various locations in the lab
- Feature points identified manually
- Object pose estimated offline
- Target was self-occluded
- Not a real-time process

Slide 7 – Current work
An image-based control algorithm is being developed, and it must run in real time, so:
- The UAV pose must be estimated in real time
- The target must not be self-occluded
- Image source: live video
- Image processing must be fast
- Feature points must be identified automatically

Slide 8 – Feature point extraction
Pipeline: camera → detect LED → detect window → detect corners → discard unwanted detected feature points.
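The first stages of the pipeline can be illustrated with a toy numpy sketch, assuming the LED is the only bright blob in the frame and the window region has already been segmented into a binary mask; the thresholds and the diagonal-extreme corner trick are illustrative stand-ins, not the thesis implementation.

```python
import numpy as np

def led_centroid(gray, thresh=200):
    """Centroid of the bright LED blob, assuming the LED is the only
    region above `thresh` in the 8-bit grayscale frame."""
    ys, xs = np.nonzero(gray > thresh)
    return xs.mean(), ys.mean()

def window_corners(mask):
    """Four corners of the window region from a binary mask, taken as
    the pixels extreme along the two image diagonals; a cheap stand-in
    for a real corner detector."""
    ys, xs = np.nonzero(mask)
    s, d = xs + ys, xs - ys
    pick = [s.argmin(), d.argmax(), s.argmax(), d.argmin()]  # TL, TR, BR, BL
    return [(int(xs[p]), int(ys[p])) for p in pick]
```

Together these yield the 5 feature points (LED plus 4 corners) that POSIT consumes; any extra detections outside these regions would be discarded.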

Slide 9 – Feature point undistortion?
Distortion coefficients are available from camera calibration, but image processing must be fast, with no unnecessary calculations. Should feature points therefore be undistorted? To decide, the pose estimated by POSIT from distorted feature-point locations is evaluated against the pose estimated from undistorted locations.
[Block diagram: video from the camera → feature-point extraction → undistortion by look-up table (in one branch only) → point-location filter → POSIT and inverse kinematics → 6-DOF UAV pose estimates, compared against the 6-DOF pose from Optitrack and the roll and pitch from the IMU.]
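The look-up-table undistortion step can be sketched as follows, assuming a simple 2-term radial distortion model; the coefficients k1 and k2 and the intrinsics are hypothetical placeholders for the values obtained from the calibration mentioned above.

```python
import numpy as np

def undistort_lut(w, h, f, cx, cy, k1, k2, iters=5):
    """Precompute a look-up table of undistorted pixel locations for a
    w x h image under the radial model x_d = x_u * (1 + k1*r^2 + k2*r^4).
    A fixed-point iteration approximately inverts the model, which is
    adequate for mild distortion."""
    u, v = np.meshgrid(np.arange(w, dtype=float), np.arange(h, dtype=float))
    xd, yd = (u - cx) / f, (v - cy) / f       # distorted normalized coords
    xu, yu = xd.copy(), yd.copy()
    for _ in range(iters):                    # invert the radial model
        r2 = xu * xu + yu * yu
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        xu, yu = xd / scale, yd / scale
    return f * xu + cx, f * yu + cy           # undistorted pixel coords
```

At run time each extracted feature point is then corrected by a single table read, which keeps the per-frame cost negligible.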

Slide 10 – Experimental setup
[Photograph of the experimental setup.]

Slide 11 – Experimental setup
- The Q-ball was placed at 20 random locations in the lab, with a different pose at each location.
- A live video stream was acquired and the UAV pose was estimated with POSIT in real time.
- At each location, 150 6-DOF pose estimates, Optitrack readings, and IMU readings were recorded.
- The Optitrack readings are used as the reference.

Slide 12 – Results: X
Standard deviation of the X estimate (cm):

Test | Undistorted points | Distorted points | Optitrack
1 | 1.006 | 1.1748 | 0.001
2 | 15.8091 | 14.7321 | 0.0014
3 | 101.8169 | 0.9279 | 0.0057
4 | 3.0482 | 3.5745 | 0.0013
5 | 2.0852 | 2.2731 | 0.0015
6 | 5.8156 | 5.688 | 0.0016
7 | 1.5017 | 1.4547 | 0.0014
8 | 5.2479 | 5.4141 | 0.0011
9 | 4.1441 | 4.4484 | 0.0015
10 | 1.2467 | 1.2847 | 0.003
11 | 9.0607 | 9.9454 | 0.0046
12 | 5.5611 | 6.6607 | 0.0067
13 | 4.3397 | 4.0936 | 0.0014
14 | 3.5994 | 3.782 | 0.0016
15 | 38.0258 | 54.0048 | 0.0017
16 | 2.1597 | 2.2568 | 0.0013
17 | 8.0223 | 4.3796 | 0.0014
18 | 0.7739 | 0.8675 | 0.0013
19 | 0.9564 | 1.0413 | 0.0015
20 | 7.2199 | 8.0107 | 0.0016

Slide 13 – Results: Y
Standard deviation of the Y estimate (cm):

Test | Undistorted points | Distorted points | Optitrack
1 | 1.0235 | 1.3126 | 0.0016
2 | 3.2475 | 6.0226 | 0.0016
3 | 65.5548 | 53.7607 | 0.0015
4 | 12.6819 | 12.8114 | 0.0013
5 | 2.3903 | 2.5973 | 0.0015
6 | 6.0252 | 4.0652 | 0.0013
7 | 0.2603 | 0.313 | 0.0013
8 | 0.9041 | 0.8921 | 0.0013
9 | 1.419 | 1.515 | 0.0016
10 | 1.6733 | 1.7693 | 0.0156
11 | 5.2706 | 5.9692 | 0.0023
12 | 5.2413 | 6.273 | 0.0014
13 | 2.1456 | 2.3075 | 0.0033
14 | 0.76 | 0.8236 | 0.0011
15 | 24.5723 | 34.6469 | 0.0017
16 | 1.3318 | 1.5206 | 0.0014
17 | 0.7696 | 0.4422 | 0.0013
18 | 1.5774 | 1.8228 | 0.0014
19 | 0.921 | 0.9587 | 0.0017
20 | 4.578 | 3.3981 | 0.0013

Slide 14 – Results: Z
Standard deviation of the Z estimate (cm):

Test | Undistorted points | Distorted points | Optitrack
1 | 0.3511 | 0.4984 | 0.0007
2 | 2.1178 | 5.2621 | 0.0012
3 | 30.399 | 43.4517 | 0.0012
4 | 3.8319 | 3.7698 | 0.0009
5 | 1.0314 | 1.0843 | 0.0011
6 | 2.078 | 1.6947 | 0.0009
7 | 0.1703 | 0.1464 | 0.0012
8 | 0.429 | 0.4984 | 0.001
9 | 1.9335 | 1.9616 | 0.0007
10 | 0.8854 | 0.9199 | 0.0014
11 | 2.2709 | 1.9648 | 0.0011
12 | 1.671 | 1.7368 | 0.0013
13 | 0.5539 | 0.7699 | 0.0012
14 | 1.4632 | 1.4751 | 0.0008
15 | 21.0649 | 35.0081 | 0.001
16 | 0.4632 | 0.5104 | 0.0009
17 | 3.1985 | 1.6658 | 0.0008
18 | 0.6383 | 0.7831 | 0.0011
19 | 0.8268 | 0.9394 | 0.0011
20 | 1.2445 | 1.3881 | 0.0012

Slide 15 – Results: Roll
Standard deviation of the roll estimate (°):

Test | Undistorted points | Distorted points | Optitrack | IMU
1 | 0.1612 | 0.1823 | 0.002 | 0.1752
2 | 0.4117 | 0.1993 | 0.0018 | 0.1811
3 | 20.6804 | 20.4654 | 0.0026 | 0.1342
4 | 0.3264 | 0.3303 | 0.0014 | 0.1377
5 | 0.3735 | 0.3757 | 0.0018 | 0.1366
6 | 0.2521 | 0.2036 | 0.0018 | 0.1468
7 | 0.1268 | 0.1168 | 0.0011 | 0.1334
8 | 0.1193 | 0.1243 | 0.0014 | 0.1448
9 | 0.2411 | 0.2424 | 0.0018 | 0.1162
10 | 0.1656 | 0.174 | 0.0323 | 0.146
11 | 0.3348 | 0.3453 | 0.0035 | 0.133
12 | 0.3425 | 0.3959 | 0.0015 | 0.1421
13 | 0.3584 | 0.3624 | 0.004 | 0.1479
14 | 0.3513 | 0.3511 | 0.001 | 0.1417
15 | 32.7365 | 49.532 | 0.0024 | 0.1502
16 | 0.1705 | 0.172 | 0.0014 | 0.1497
17 | 0.3527 | 0.1812 | 0.0011 | 0.1535
18 | 0.1606 | 0.1734 | 0.0021 | 0.158
19 | 0.2017 | 0.2068 | 0.0028 | 0.1541
20 | 0.1617 | 0.1547 | 0.0014 | 0.1755

Slide 16 – Results: Pitch
Standard deviation of the pitch estimate (°):

Test | Undistorted points | Distorted points | Optitrack | IMU
1 | 0.2929 | 0.341 | 0.0024 | 0.1713
2 | 0.6308 | 1.3458 | 0.0024 | 0.1899
3 | 30.0484 | 37.8968 | 0.0024 | 0.1712
4 | 4.1615 | 4.0302 | 0.0019 | 0.1835
5 | 0.6895 | 0.7135 | 0.0021 | 0.1761
6 | 1.3932 | 0.8914 | 0.0023 | 0.1527
7 | 0.1101 | 0.1174 | 0.0018 | 0.1631
8 | 0.108 | 0.113 | 0.0021 | 0.186
9 | 0.3816 | 0.3925 | 0.0026 | 0.1635
10 | 0.4045 | 0.4179 | 0.0263 | 0.1641
11 | 1.0534 | 1.112 | 0.0041 | 0.1969
12 | 1.023 | 1.117 | 0.0018 | 0.16
13 | 0.5073 | 0.5361 | 0.0057 | 0.1966
14 | 0.2661 | 0.2695 | 0.0018 | 0.1712
15 | 24.6641 | 37.3222 | 0.0029 | 0.1528
16 | 0.4542 | 0.4817 | 0.0018 | 0.1664
17 | 0.1648 | 0.0821 | 0.0018 | 0.1609
18 | 0.3886 | 0.4266 | 0.0035 | 0.15
19 | 0.3142 | 0.3175 | 0.0027 | 0.197
20 | 0.8514 | 0.6 | 0.002 | 0.1685

Slide 17 – Results: Yaw
Standard deviation of the yaw estimate (°):

Test | Undistorted points | Distorted points | Optitrack
1 | 0.2562 | 0.2708 | 0.0014
2 | 2.313 | 1.8 | 0.0017
3 | 6.73 | 7.9832 | 0.0068
4 | 0.8439 | 1.009 | 0.0011
5 | 0.5683 | 0.604 | 0.0012
6 | 1.3341 | 1.2284 | 0.0011
7 | 0.5728 | 0.5252 | 0.0021
8 | 1.2824 | 1.2913 | 0.0012
9 | 0.9123 | 0.9474 | 0.001
10 | 0.2592 | 0.2629 | 0.0049
11 | 1.7751 | 1.8515 | 0.0027
12 | 1.0973 | 1.2381 | 0.0069
13 | 0.9673 | 0.8913 | 0.0013
14 | 0.9592 | 0.9743 | 0.0012
15 | 5.4126 | 12.7222 | 0.0017
16 | 0.6324 | 0.6399 | 0.0009
17 | 1.9195 | 0.9885 | 0.001
18 | 0.2138 | 0.2222 | 0.0015
19 | 0.3118 | 0.3278 | 0.0009
20 | 1.3624 | 1.4042 | 0.0011

Slide 18 – Mean and SD of error of all 3000 measurements

All 20 tests:

DOF | Distorted points w.r.t. Optitrack (Mean / SD) | Undistorted points w.r.t. Optitrack (Mean / SD) | Optitrack w.r.t. IMU (Mean / SD)
X (cm) | 15.7600 / 25.5053 | 17.6538 / 25.3427 | N/A
Y (cm) | 5.7741 / 15.4753 | 4.6884 / 16.5980 | N/A
Z (cm) | 18.2766 / 12.6632 | 6.1423 / 9.1493 | N/A
Roll (°) | 3.0873 / 13.0284 | 2.0754 / 9.0004 | 1.4342 / 1.1588
Pitch (°) | 2.9703 / 12.4040 | 2.4043 / 8.7745 | 0.7679 / 0.3615
Yaw (°) | 3.6342 / 3.6540 | 4.0707 / 2.4474 | N/A

Excluding tests #3 and #15:

DOF | Distorted points w.r.t. Optitrack (Mean / SD) | Undistorted points w.r.t. Optitrack (Mean / SD) | Optitrack w.r.t. IMU (Mean / SD)
X (cm) | 13.9990 / 10.3494 | 16.5993 / 10.9070 | N/A
Y (cm) | 3.9571 / 4.4958 | 3.3956 / 3.8116 | N/A
Z (cm) | 17.5677 / 8.1526 | 5.7379 / 4.3130 | N/A
Roll (°) | 1.6193 / 1.3389 | 1.2874 / 1.1580 | 1.3289 / 1.0149
Pitch (°) | 1.3662 / 1.3103 | 1.5493 / 1.4160 | 0.7635 / 0.3610
Yaw (°) | 3.5570 / 1.8350 | 4.1395 / 1.8739 | N/A

Slide 19 – Conclusion
- The POSIT algorithm is a viable alternative for real-time UAV pose estimation.
- The target consists of a white LED and a window, giving 5 non-coplanar feature points: the LED and the 4 window corners.
- Pose estimation from undistorted feature points is more accurate than from distorted points, with a significant improvement along the Z direction.
- Image information can be mapped to positional control inputs via the POSIT algorithm.

Slide 20 – Summary
- Thesis problem and the UAV
- Previous work with POSIT and its drawbacks
- POSIT-based real-time pose estimation algorithm:
  - feature-point extraction from live video
  - undistortion of feature-point image coordinates
  - feature-point location filtering
  - real-time operation
- Comparison of the pose estimated by POSIT with the pose from Optitrack and the two attitude angles from the IMU

Slide 21 – Homogeneous transformation
A homogeneous transformation is a matrix that describes how one coordinate frame is related to another; it is used to convert the location of a point between two frames.
[Figure: frames A and C, each with axes x, y, z, offset by a translation (dx, dy, dz).]
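A minimal numpy illustration of the idea, with an example rotation and translation (the 90° rotation and the offset (1, 2, 3) are arbitrary values, not from the thesis):

```python
import numpy as np

def transform(R, d):
    """Build the 4x4 homogeneous transform from rotation R (3x3) and
    translation d (3-vector)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = d
    return T

# Frame C is rotated 90 degrees about z and offset by (1, 2, 3) relative
# to frame A; map a point known in frame C into frame A.
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
A_T_C = transform(Rz, [1.0, 2.0, 3.0])
p_C = np.array([1.0, 0.0, 0.0, 1.0])   # homogeneous point in frame C
p_A = A_T_C @ p_C                      # the same point expressed in frame A
```

The homogeneous form lets rotation and translation be applied with a single matrix product, which is why chained frame conversions reduce to matrix multiplication.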

Slide 22 – Inverse kinematics
The process of deriving the transformation (rotation and translation) between two frames from a known transformation matrix: the translation is read directly from the matrix, and the rotation angles are recovered using the inverse-kinematics formulas.

Slide 23 – Inverse kinematics formulas
[Figure: the inverse-kinematics formulas, with axes x, y, z and angles ψ and θ.]
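One common form of these formulas, for the ZYX (yaw-pitch-roll) convention, can be sketched in numpy; the convention used in the thesis may differ, so treat this as an assumed example rather than the slide's exact formulas.

```python
import numpy as np

def rotation_angles(T):
    """Recover roll, pitch, yaw (radians) from the rotation part of a
    homogeneous transform, assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    and pitch away from +/-90 degrees (gimbal lock)."""
    R = np.asarray(T)[:3, :3]
    pitch = np.arcsin(-R[2, 0])            # R[2,0] = -sin(pitch)
    roll = np.arctan2(R[2, 1], R[2, 2])    # cos(pitch)*sin(roll), cos(pitch)*cos(roll)
    yaw = np.arctan2(R[1, 0], R[0, 0])     # sin(yaw)*cos(pitch), cos(yaw)*cos(pitch)
    return roll, pitch, yaw
```

Round-tripping a rotation built from known angles recovers those angles, which is the property the pose-estimation pipeline relies on.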

Slide 24 – Result calculation
POSIT estimates C_T_L, the pose of the object (target) frame L with respect to the camera frame C.
[Figure: world frame W, object frame L, Q-ball frame Q, and camera frame C, with POSIT producing C_T_L.]

Slide 25 – Result calculation
Combining C_T_L from POSIT with the camera-mount transform Q_T_C and the known target pose W_T_L gives the Q-ball pose in the world frame; its translation is read off directly, and its rotation angles are recovered with the inverse-kinematics formulas.
[Figure: frames W, L, Q, and C, with transforms Q_T_C, C_T_L, and W_T_L.]
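The frame chaining on this slide can be sketched in numpy. The composition W_T_Q = W_T_L · inv(C_T_L) · inv(Q_T_C) is inferred from the frames shown, and the transforms used below are illustrative values, not the thesis calibration.

```python
import numpy as np

def inv_T(T):
    """Invert a rigid homogeneous transform: inv([R | d]) = [R^T | -R^T d]."""
    R, d = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ d
    return Ti

def qball_pose_in_world(W_T_L, C_T_L, Q_T_C):
    """Chain the frames of the slide: W_T_Q = W_T_L . L_T_C . C_T_Q,
    where L_T_C = inv(C_T_L) and C_T_Q = inv(Q_T_C)."""
    return W_T_L @ inv_T(C_T_L) @ inv_T(Q_T_C)
```

With identity rotations the chained translations simply add and subtract, which makes the composition easy to sanity-check.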
