EUCLIDEAN POSITION ESTIMATION OF FEATURES ON A STATIC OBJECT USING A MOVING CALIBRATED CAMERA

Nitendra Nath, David Braganza‡, and Darren Dawson
Department of Electrical and Computer Engineering, Clemson University, Clemson, SC 29634-0915. E-mail: email@example.com

Abstract

A 3D Euclidean position estimator using a single moving calibrated camera, whose position is assumed to be measurable, is developed in this paper to asymptotically recover the structure of a static object. To estimate the structure, an adaptive least-squares estimation strategy is employed, based on a novel prediction error formulation and a Lyapunov stability analysis.

What is Euclidean Position Estimation?

Three-dimensional (3D) reconstruction of an object, in which the Euclidean coordinates of feature points on a moving or fixed object are recovered from a sequence of two-dimensional (2D) images, is known as Euclidean position estimation, or more broadly as Structure from Motion (SFM) or Simultaneous Localization and Mapping (SLAM). It has a significant impact on several applications, such as:
- Autonomous vehicle navigation
- Path planning
- Surveillance

Geometric Model

A geometric relationship is developed between a moving camera and a stationary object. $n$ feature points located on a static object are considered. The 3D coordinates of the $i$-th feature point w.r.t. the camera frame $C$ are denoted by $\bar{m}_i \triangleq [\,x_i \;\; y_i \;\; z_i\,]^T$, $\forall i = 1, \dots, n$.
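As a concrete illustration of how camera-frame coordinates of a feature point map to pixels under the pin-hole model, here is a minimal numpy sketch. All numeric values (intrinsics, poses, and the sample point) are assumed for illustration only and are not taken from the poster.

```python
import numpy as np

# Assumed intrinsic parameters (not from the poster): focal length times
# pixel scale, principal point, and skew angle phi (pi/2 means zero skew).
f_ku = f_kv = 800.0
u0, v0 = 320.0, 240.0
phi = np.pi / 2
A = np.array([[f_ku, -f_ku / np.tan(phi), u0],
              [0.0,    f_kv / np.sin(phi), v0]])   # 2x3 intrinsic matrix

def to_camera_frame(x_f, R_b, x_b, R_c, x_c):
    """Frame chain: m_bar = R_c^T [ R_b^T (x_f - x_b) - x_c ]."""
    return R_c.T @ (R_b.T @ (x_f - x_b) - x_c)

def project(m_bar):
    """Pin-hole model: p = (1/z) A m_bar for m_bar = [x, y, z]."""
    return (A @ m_bar) / m_bar[2]

# Identity poses place the camera at the world origin looking along +z,
# so the world point is also the camera-frame point (assumed setup).
I3, z3 = np.eye(3), np.zeros(3)
m_bar = to_camera_frame(np.array([0.1, -0.05, 2.0]), I3, z3, I3, z3)
p = project(m_bar)   # pixel coordinates [u, v]
```

With these assumed values the point at depth 2 m projects to roughly (360, 220) pixels; the same two functions apply for any measured robot pose $R_b, x_b$ and hand-eye calibration $R_c, x_c$.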
Normalized Euclidean coordinates: $m_i \triangleq \frac{1}{z_i}\bar{m}_i = [\,x_i/z_i \;\; y_i/z_i \;\; 1\,]^T$

Corresponding projected pixel coordinates: $p_i \triangleq [\,u_i \;\; v_i\,]^T$, with $u_i(t), v_i(t) \in \mathbb{R}$

Pin-hole camera model: $p_i = A m_i = \frac{1}{z_i} A \bar{m}_i$, where
$A \triangleq \begin{bmatrix} f k_u & -f k_u \cot\phi & u_0 \\ 0 & \dfrac{f k_v}{\sin\phi} & v_0 \end{bmatrix} \in \mathbb{R}^{2\times 3}$
is the known constant intrinsic calibration matrix of the camera.

The objective of this work is to accurately identify the unknown constant Euclidean coordinates $x_{fi}$ of the feature points relative to the world frame in order to recover the 3D structure of the object.

Geometric relationships between the fixed object, the mechanical system, and the camera:
- $R_b(t) \in SO(3)$, $x_b(t) \in \mathbb{R}^3$: measurable rotation matrix and translation vector from $B$ to $W$
- $R_c \in SO(3)$, $x_c \in \mathbb{R}^3$: known constant rotation matrix and translation vector from $C$ to $B$
- $x_{fi} \in \mathbb{R}^3$: unknown constant feature-point coordinates
- $\bar{m}_i(t) \in \mathbb{R}^3$: Euclidean coordinates in the camera frame

Euclidean Structure Estimation

From the geometric model, the following expression can be obtained:
$\bar{m}_i = R_c^T \left[ R_b^T (x_{fi} - x_b) - x_c \right]$
After utilizing the pin-hole camera model, the pixel coordinates of the $i$-th feature point can be written as:
$p_i = \frac{1}{z_i} A R_c^T \left[ R_b^T (x_{fi} - x_b) - x_c \right]$
with the corresponding depth
$z_i = R_{c3}^T \left[ R_b^T (x_{fi} - x_b) - x_c \right],$
where $R_{c3}^T$ denotes the last row of $R_c^T(t)$. In parameterized form, $p_i(t)$ can be written as
$p_i = \frac{1}{\Pi \Theta_i} W \Theta_i,$
where $W \Theta_i = A R_c^T \left[ R_b^T (x_{fi} - x_b) - x_c \right]$ and $\Pi \Theta_i = z_i = R_{c3}^T \left[ R_b^T (x_{fi} - x_b) - x_c \right]$.
- $\Pi(t) \in \mathbb{R}^{1\times 4}$, $W(t) \in \mathbb{R}^{2\times 4}$: measurable regression matrices
- $\Theta_i \in \mathbb{R}^4$: unknown constant parameter vector

Prediction error for the $i$-th feature point:
$\tilde{p}_i = \frac{1}{\Pi \Theta_i} \left( W - \hat{p}_i \Pi \right) \tilde{\Theta}_i$
Combined prediction error:
$\tilde{p} = B \bar{W}_p \tilde{\Theta}$
- $\bar{W}_p(t) \in \mathbb{R}^{2n\times 4n}$: measurable signal
- $B(t) \in \mathbb{R}^{2n\times 2n}$: auxiliary matrix
- $\tilde{\Theta}(t) \in \mathbb{R}^{4n}$: combined estimation error

The adaptive update law is designed as:
$\dot{\hat{\Theta}} \triangleq \mathrm{Proj}\{ \alpha \Gamma \bar{W}_p^T \tilde{p} \}$
- $\mathrm{Proj}\{\cdot\}$: ensures positiveness of the term $\Pi(t)\hat{\Theta}_i(t)$
- $\alpha(t) \in \mathbb{R}$: a positive scalar function
- $\Gamma(t) \in \mathbb{R}^{4n\times 4n}$: least-squares estimation gain matrix

Simulation Results

- Case 1: No noise added to pixel coordinates
- Case 2: Gaussian noise of variance 200 added to pixel coordinates
- Case 3: Gaussian noise of variance 400 added to pixel coordinates

Distance Estimation Error

| Case   | Object     | Actual distance (cm) | Estimated distance (cm) | Error (cm) | Convergence time (sec) |
|--------|------------|----------------------|-------------------------|------------|------------------------|
| Case 1 | Length I   | 50.0                 | 49.94                   | 0.06       | 0.12                   |
| Case 1 | Length II  | 111.8                | 111.25                  | 0.55       | 0.49                   |
| Case 1 | Length III | 100.0                | 99.86                   | 0.14       | 0.14                   |
| Case 2 | Length I   | 50.0                 | 49.90                   | 0.10       | 0.20                   |
| Case 2 | Length II  | 111.8                | 111.15                  | 0.65       | 0.58                   |
| Case 2 | Length III | 100.0                | 99.74                   | 0.26       | 0.26                   |
| Case 3 | Length I   | 50.0                 | 49.88                   | 0.12       | 0.24                   |
| Case 3 | Length II  | 111.8                | 111.08                  | 0.72       | 0.64                   |
| Case 3 | Length III | 100.0                | 99.65                   | 0.35       | 0.35                   |

Experimental Results

Experimental testbed with camera, robot, and object: a monochrome CCD camera mounted on a PUMA 560 robot observes the object, with a Robot Control PC and a Vision PC synchronized by a 15 Hz trigger.

Object I: Checker-board — Distance Estimation Error

| Object    | Actual distance (cm) | Estimated distance (cm) | Error (cm) | Convergence time (sec) |
|-----------|----------------------|-------------------------|------------|------------------------|
| Length I  | 11.24                | 11.41                   | 0.17       | 37.3                   |
| Length II | 2.81                 | 2.76                    | 0.05       | 33.1                   |
| Length III| 11.24                | 11.56                   | 0.32       | 33.3                   |
| Length IV | 5.62                 | 5.72                    | 0.10       | 32.2                   |
| Length V  | 16.86                | 17.31                   | 0.45       | 37.6                   |
| Length VI | 5.62                 | 5.44                    | 0.18       | 35.4                   |

Object II: Doll-house — Distance Estimation Error

| Object    | Actual distance (cm) | Estimated distance (cm) | Error (cm) | Convergence time (sec) |
|-----------|----------------------|-------------------------|------------|------------------------|
| Length I  | 40.0                 | 41.3                    | 1.3        | 32.2                   |
| Length II | 12.2                 | 12.7                    | 0.5        | 33.4                   |
| Length III| 12.2                 | 11.6                    | 0.6        | 30.1                   |
| Length IV | 13.0                 | 13.4                    | 0.4        | 32.2                   |
| Length V  | 15.0                 | 14.3                    | 0.7        | 34.7                   |
| Length VI | 26.5                 | 27.4                    | 0.9        | 33.5                   |

Object III: Tool-boxes — Distance Estimation Error

| Object    | Actual distance (cm) | Estimated distance (cm) | Error (cm) | Convergence time (sec) |
|-----------|----------------------|-------------------------|------------|------------------------|
| Length I  | 14.7                 | 14.15                   | 0.55       | 37.6                   |
| Length II | 4.2                  | 4.10                    | 0.10       | 39.9                   |
| Length III| 5.0                  | 4.96                    | 0.04       | 36.7                   |
| Length IV | 9.0                  | 8.78                    | 0.22       | 39.8                   |
| Length V  | 9.6                  | 9.44                    | 0.16       | 39.8                   |
| Length VI | 3.8                  | 3.64                    | 0.16       | 38.9                   |

The estimator accurately identifies the Euclidean distances between the features without requiring any information about the object's geometry.

‡ D. Braganza is with OFS, 50 Hall Road, Sturbridge, MA 01566.
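The adaptive update law above is, at its core, a least-squares parameter identifier driven by a prediction error. The following sketch shows a simplified discrete-time recursive least-squares analogue of that idea on a toy linear-in-parameters problem; it omits the projection operator $\mathrm{Proj}\{\cdot\}$ and the auxiliary matrix $B(t)$, and all numeric values (the "unknown" parameter vector, gains, iteration count) are assumptions for illustration, not the authors' settings.

```python
import numpy as np

# Toy analogue: identify a constant parameter vector theta from
# measurements y_k = W_k @ theta, mirroring the linear-in-parameters
# structure p_i = W Theta_i / (Pi Theta_i) after error formulation.
rng = np.random.default_rng(0)
theta_true = np.array([0.5, -1.2, 2.0, 0.3])   # assumed "unknown" parameters

theta_hat = np.zeros(4)          # initial estimate
Gamma = 100.0 * np.eye(4)        # least-squares gain (cf. Gamma(t))

for _ in range(200):
    W = rng.normal(size=(2, 4))         # measurable regression matrix
    y = W @ theta_true                   # measurement (noise-free case)
    p_tilde = y - W @ theta_hat          # prediction error
    # Standard recursive least-squares block update with gain propagation
    S = np.eye(2) + W @ Gamma @ W.T
    K = Gamma @ W.T @ np.linalg.inv(S)
    theta_hat = theta_hat + K @ p_tilde
    Gamma = Gamma - K @ W @ Gamma
```

Under persistent excitation of the regressor (satisfied here by the random `W`), the estimate converges to the true parameters, which is the discrete counterpart of the asymptotic convergence the Lyapunov analysis establishes for the continuous-time estimator.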
The KLT feature tracking algorithm was used to track feature points from one frame to the next.
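In practice a library tracker such as OpenCV's `calcOpticalFlowPyrLK` would be used for this step. As a self-contained illustration of the single-iteration Lucas-Kanade solve at the core of KLT (an analogue, not the authors' implementation), the sketch below recovers a small synthetic translation between two images; the image pattern, window, and shift are all assumed values.

```python
import numpy as np

def lk_step(I0, I1, window):
    """One Lucas-Kanade iteration: estimate the translation d mapping
    I0 to I1 over `window` by solving the 2x2 normal equations G d = b."""
    Iy, Ix = np.gradient(I0)                  # image gradients (y, x order)
    It = I1 - I0                               # temporal difference
    ix, iy, it = (a[window].ravel() for a in (Ix, Iy, It))
    G = np.array([[ix @ ix, ix @ iy],
                  [ix @ iy, iy @ iy]])         # structure tensor sum
    b = -np.array([ix @ it, iy @ it])
    return np.linalg.solve(G, b)               # d = [dx, dy]

# Smooth synthetic image and a copy shifted by (dx, dy) = (0.3, -0.2) px
ys, xs = np.mgrid[0:64, 0:64]
img = lambda dx, dy: np.sin(0.2 * (xs - dx)) * np.cos(0.15 * (ys - dy))
d = lk_step(img(0, 0), img(0.3, -0.2), (slice(16, 48), slice(16, 48)))
```

A full KLT tracker iterates this step in a coarse-to-fine pyramid around each feature; here a single iteration suffices because the synthetic shift is sub-pixel.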