
1 Adam Rachmielowski 615 Project: Real-time monocular vision-based SLAM

2 Overview
–SFM and SLAM
–Extended Kalman filter
–Visual SLAM details
–Results
–Next

3 Estimating structure and motion
Factorization [Tomasi & Kanade '92]
–Batch method
–Efficient
–Originally for affine camera
–Missing data?
–Finite camera [Sturm & Triggs]
W = MX
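As a side note, a minimal sketch of the affine factorization idea behind W = MX, assuming a complete 2F×N measurement matrix W (every point visible in every frame); numpy is used purely for illustration:

    import numpy as np

    def affine_factorization(W):
        """Tomasi-Kanade style factorization of a 2F x N measurement matrix W.

        Assumes no missing data. Returns motion M (2F x 3), structure X (3 x N),
        and the per-row translations t. The factors are only defined up to an
        invertible 3x3 affine transformation A, since W0 = (M A)(A^-1 X).
        """
        # Centre each row to remove the translations (affine camera model)
        t = W.mean(axis=1, keepdims=True)
        W0 = W - t
        # The best rank-3 approximation of W0 comes from a truncated SVD
        U, s, Vt = np.linalg.svd(W0, full_matrices=False)
        S3 = np.diag(np.sqrt(s[:3]))
        M = U[:, :3] @ S3        # camera (motion) factors
        X = S3 @ Vt[:3, :]       # 3D structure
        return M, X, t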

4 Estimating structure and motion
Reconstruction from N views [Hartley & Zisserman '00]
–Multiview geometric entities and algorithms described by Faugeras, Hartley, Zisserman, and others
–Minimize global error with bundle adjustment
–Can be used sequentially
–Upgrade to Euclidean with auto-calibration
x → F → P → X
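For reference, the bundle adjustment step mentioned above minimizes total reprojection error over all cameras P^k and points X_i (the standard formulation from Hartley & Zisserman, not specific to this project):

    min over {P^k, X_i} of Σ_k Σ_i d( x_i^k , P^k X_i )²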

5 SLAM
Simultaneous Localisation And Mapping
Estimate robot's pose and map feature positions
Probabilistic framework maintains
–current estimate
–estimate uncertainty (covariance)
Update based on measurements and model
Many systems use
–odometry and active sensors as measurement devices
–limited motion models

6 Vision-based SLAM
Camera for measurements
Trinocular
–3D measurements by triangulation
–Offline [Ayache, Faugeras '89]
–Real-time with SIFTs [Se, Lowe, Little '01]
Real-time monocular [Chiuso et al. '00]

7 Kalman filter [Swerling '58] [Welch, Bishop '01]
Estimates state of dynamic system
Integrates noisy measurements to give optimal estimate
Noise is Gaussian
First-order Markov process
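The linear-Gaussian model the KF assumes, in the standard notation used below (the slide itself does not spell this out):

    x_k = F x_{k-1} + w_{k-1},   w ~ N(0, Q)
    z_k = H x_k + v_k,           v ~ N(0, R)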

8 KF: key variables
–x̂_k: estimate of state at time k
–P_k: error covariance (estimate uncertainty)
–F: state transition function
–z_k: measurement
–H: state-to-measurement function
–Q, R: noise covariances

9 KF: Two-phase estimation
Predict
–Predicted state
–Predicted covariance
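The predict-phase formulas were not captured in the transcript; the standard forms, in the notation above, are:

    x̂_k⁻ = F x̂_{k-1}
    P_k⁻ = F P_{k-1} Fᵀ + Q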

10 KF: Two-phase estimation
Update
–Innovation
–Innovation covariance
–Kalman gain
–State
–Covariance
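Likewise, the standard update-phase formulas (reconstructed, not taken from the slide images):

    ν_k = z_k − H x̂_k⁻               (innovation)
    S_k = H P_k⁻ Hᵀ + R               (innovation covariance)
    K_k = P_k⁻ Hᵀ S_k⁻¹               (Kalman gain)
    x̂_k = x̂_k⁻ + K_k ν_k             (state update)
    P_k = (I − K_k H) P_k⁻            (covariance update)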

11 EKF: Extended Kalman filter
Allow non-linear functions (F, H)
Apply the functions themselves to the state
Apply their Jacobians to the covariances
Linearize the functions around the current estimate
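A compact statement of the EKF step, writing F, H for the non-linear process and measurement functions and J_F = ∂F/∂x, J_H = ∂H/∂x for their Jacobians evaluated at the current estimate (standard forms, reconstructed here):

    x̂_k⁻ = F(x̂_{k-1}),   P_k⁻ = J_F P_{k-1} J_Fᵀ + Q
    x̂_k  = x̂_k⁻ + K_k ( z_k − H(x̂_k⁻) ),   K_k = P_k⁻ J_Hᵀ ( J_H P_k⁻ J_Hᵀ + R )⁻¹
    P_k  = (I − K_k J_H) P_k⁻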

12 Visual SLAM details [Davison '03]
–State representation x, P
–Process model F (motion)
–Measurement model H (projection)
–State update
–System initialization
–Adding and removing features

13 State representation
Scene structure (feature points)
–Depth from reference image [Azarbayejani, Pentland '95]
–x, y, z coordinates
Camera
–Pose
–Motion

14 State estimate vector
Points y_i
Camera x_v
–6-DOF pose
–Constant-velocity motion model
–Acceleration modeled as noise
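A sketch of the usual layout of this state vector in Davison-style monocular SLAM (the exact parameterization in this project may differ):

    x = ( x_v, y_1, y_2, …, y_n ),    x_v = ( r, q, v, ω )

where r is the camera position, q an orientation quaternion, v linear velocity, and ω angular velocity, giving a 13-parameter camera state plus 3 parameters per point.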

15 Covariance matrix
Covariance blocks
–P_xx: camera params
–P_{y_i y_i}: point i
Off-diagonal blocks represent correlation between estimates
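Written out, the full covariance has the familiar EKF-SLAM block structure:

    P = | P_xx       P_{x y_1}    …  P_{x y_n}   |
        | P_{y_1 x}  P_{y_1 y_1}  …  P_{y_1 y_n} |
        | …                                      |
        | P_{y_n x}  P_{y_n y_1}  …  P_{y_n y_n} |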

16 Process model
Points don't move: y_k = y_{k-1}
Add velocity and acceleration to current camera parameters
Covariance updated using the Jacobian
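A minimal sketch of the constant-velocity camera update over a timestep Δt, treating the unknown accelerations (a, α) as zero-mean Gaussian noise (an assumed form, consistent with models of this kind; not copied from the slide):

    r_k = r_{k-1} + ( v_{k-1} + a Δt ) Δt
    q_k = q_{k-1} × q( ( ω_{k-1} + α Δt ) Δt )
    v_k = v_{k-1} + a Δt
    ω_k = ω_{k-1} + α Δt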

17 Measurement model
H models projection of the predicted points by the predicted camera
Covariance S_i guides the feature-match search
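For each feature i this takes the usual EKF form (reconstructed with the notation above):

    h_i = H( x_v, y_i )                  (predicted image position)
    S_i = J_i P J_iᵀ + R                 (innovation covariance)

where J_i is the Jacobian of the projection with respect to the state and R the image-measurement noise; S_i defines the elliptical search region in the image.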

18 Making measurements / Update
–Project innovation covariance to a search ellipse
–Warp the template based on the camera and point prediction
–If the viewing angle is good, match to get a measurement
–Compute the Kalman gain and update the state and covariance
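A schematic of this per-frame loop; the helper names (predict_feature, viewing_angle_ok, search_ellipse, match_template, ekf_update) are hypothetical stand-ins, not the project's actual code:

    def process_frame(image, state, cov, features):
        """One measurement/update cycle of the visual EKF-SLAM loop (schematic)."""
        measurements = []
        for f in features:
            # Predicted projection h_i, its Jacobian, and innovation covariance S_i
            h_i, J_i, S_i = predict_feature(state, cov, f)
            if not viewing_angle_ok(state, f):
                continue                                  # template would be too distorted
            region = search_ellipse(h_i, S_i)             # e.g. the 3-sigma ellipse in the image
            z_i = match_template(image, f.template, region)
            if z_i is not None:
                measurements.append((f, z_i, h_i, J_i))
        # Kalman gain, then state and covariance update from all matched features
        return ekf_update(state, cov, measurements)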

19 System initialization
Need initial estimate and covariance
–Calibration object
–SFM
Process covariance
–Small: small searches, but can only handle small accelerations
–Large: can handle big accelerations, but need many measurements
Measurement covariance
–Function of matching method (camera resolution)

20 Adding and removing features
Add
–Select a salient feature in the desired region
–Search along the epipolar line
Remove
–If matching repeatedly fails
[Davison '03]

21 Preliminary results
Simulation [implemented with Birkbeck]
–Behaves according to model
–Initial estimate of camera and 4 key points is true value + small amount of noise
–Initial estimate of other points is true value + significant noise
–Initial covariance is scaled identity

22 Simulation

23 Adding points

24 Simulation with visibility

25 Next
Real images (video sequence)
–Feature matching
–Tracking
–SIFTs?
Real-time issues
–Postponement [Davison '01]
Loop closing
–Davison's system automatically corrects if a feature becomes visible and is correctly measured, but…
–Prevent drift by incorporating explicit loop closing [Newman, Ho '05]

26 References


