Motion from image and inertial measurements (additional slides)
Dennis Strelow, Carnegie Mellon University


Motion from image and inertial measurements (additional slides)
Dennis Strelow, Carnegie Mellon University

Outline
- Robust image feature tracking (in detail)
  - Lucas-Kanade and real sequences
  - The "smalls" tracker
- Motion from omnidirectional images

Robust image feature tracking: Lucas-Kanade and real sequences (1)
Combining image and inertial measurements improves our situation, but…
- we still need accurate feature tracking
- some sequences do not come with inertial measurements

Robust image feature tracking: Lucas-Kanade and real sequences (2)
- better feature tracking for improved 6 DOF motion estimation
- remaining results will be image-only

Robust image feature tracking: Lucas-Kanade and real sequences (3)
Lucas-Kanade has been the go-to feature tracker for shape-from-motion:
- minimizes a correlation-like matching error using general minimization
- evaluates the matching error at only a few locations
- subpixel resolution
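
As a rough illustration of the minimization the slide refers to (not the tracker used in the talk), here is a minimal single-level, translation-only Lucas-Kanade step in Python/NumPy. It runs Gauss-Newton on a sum-of-squared-differences matching error and returns a subpixel position; the patch size, iteration count, and tolerance are illustrative, and the patch is assumed to stay inside the image.

```python
import numpy as np

def bilinear_patch(img, cx, cy, half):
    """Sample a (2*half+1)^2 patch centered at subpixel (cx, cy) from a float image."""
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    x, y = cx + xs, cy + ys
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    ax, ay = x - x0, y - y0
    return ((1 - ax) * (1 - ay) * img[y0, x0] +
            ax * (1 - ay) * img[y0, x0 + 1] +
            (1 - ax) * ay * img[y0 + 1, x0] +
            ax * ay * img[y0 + 1, x0 + 1])

def lucas_kanade_point(img0, img1, x, y, half=7, iters=20, tol=1e-3):
    """Track one feature from img0 to img1 (2-D float arrays) by Gauss-Newton
    minimization of the SSD matching error over a translation (dx, dy)."""
    template = bilinear_patch(img0, x, y, half)
    dx = dy = 0.0
    for _ in range(iters):
        patch = bilinear_patch(img1, x + dx, y + dy, half)
        gy, gx = np.gradient(patch)                       # patch image gradients
        err = (patch - template).ravel()                  # current matching error
        J = np.stack([gx.ravel(), gy.ravel()], axis=1)    # Jacobian wrt (dx, dy)
        step, *_ = np.linalg.lstsq(J, -err, rcond=None)   # Gauss-Newton update
        dx, dy = dx + step[0], dy + step[1]
        if np.hypot(*step) < tol:                         # converged to subpixel accuracy
            break
    return x + dx, y + dy
```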

Robust image feature tracking: Lucas-Kanade and real sequences (4)
Additional heuristics used to apply Lucas-Kanade to shape-from-motion:

  task                                                heuristic
  choose features to track                            high image texture
  identify mistracked, occluded, no-longer-visible    convergence, matching error
  handle large motions                                image pyramid
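
For reference, these heuristics map directly onto standard OpenCV calls. The sketch below is an assumed, typical setup rather than the implementation from the talk; the feature count, window size, pyramid depth, and error threshold are illustrative.

```python
import cv2
import numpy as np

def track_frame_pair(prev_gray, next_gray, max_features=300):
    """Lucas-Kanade with the usual shape-from-motion heuristics, for a pair of
    8-bit grayscale frames."""
    # Heuristic 1: choose features to track where image texture is high.
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_features,
                                   qualityLevel=0.01, minDistance=8)
    # Heuristic 3: a pyramidal search handles large motions.
    pts1, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, pts0, None,
        winSize=(15, 15), maxLevel=3,
        criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    # Heuristic 2: drop features that failed to converge or match poorly.
    ok = (status.ravel() == 1) & (err.ravel() < 20.0)   # threshold is illustrative
    return pts0[ok], pts1[ok]
```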

Robust image feature tracking: Lucas-Kanade and real sequences (5)
But Lucas-Kanade performs poorly on many real sequences…

Robust image feature tracking: the "smalls" tracker (1)
"smalls" is a new feature tracker targeted at 6 DOF motion estimation:
- exploits the rigid scene assumption
- eliminates the heuristics normally used with Lucas-Kanade
- SIFT is an enabling technology here

Robust image feature tracking: the "smalls" tracker (2)
First step: epipolar geometry estimation
- use SIFT to establish matches between the two images
- get the 6 DOF camera motion between the two images
- get the epipolar geometry relating the two images
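
A minimal sketch of this step using OpenCV's SIFT and essential-matrix routines; this is an assumed modern re-rendering, not the original code. K denotes the known 3x3 intrinsic matrix, and the ratio-test and RANSAC thresholds are illustrative.

```python
import cv2
import numpy as np

def epipolar_from_sift(img0, img1, K):
    """SIFT matches between two frames, then relative 6 DOF motion and the
    epipolar geometry (fundamental matrix) implied by those matches."""
    sift = cv2.SIFT_create()
    kp0, des0 = sift.detectAndCompute(img0, None)
    kp1, des1 = sift.detectAndCompute(img1, None)
    # Ratio-test matching, as in Lowe's SIFT work.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    raw = matcher.knnMatch(des0, des1, k=2)
    good = [m for m, n in raw if m.distance < 0.8 * n.distance]
    p0 = np.float32([kp0[m.queryIdx].pt for m in good])
    p1 = np.float32([kp1[m.trainIdx].pt for m in good])
    # Robustly estimate the essential matrix, then the relative camera motion.
    E, inliers = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p0, p1, K, mask=inliers)
    # Fundamental matrix gives the epipolar geometry in pixel coordinates.
    F = np.linalg.inv(K).T @ E @ np.linalg.inv(K)
    keep = inliers.ravel() == 1
    return R, t, F, p0[keep], p1[keep]
```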

Robust image feature tracking: the "smalls" tracker (3)

Robust image feature tracking: the "smalls" tracker (4)

Robust image feature tracking: the "smalls" tracker (5)
Second step: track along epipolar lines
- use nearby SIFT matches to get an initial position on the epipolar line
- exploits the rigid scene assumption
- eliminates heuristic: image pyramid
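
One way to realize this step is a one-dimensional matching-error search along the feature's epipolar line, seeded by the SIFT-based guess. The sketch below is only an assumed illustration of that idea: the SSD cost, search radius, and patch size are placeholders, not the talk's actual parameters, and the feature is assumed to lie away from the image border.

```python
import numpy as np

def track_along_epipolar_line(img0, img1, feature, F, guess,
                              search_radius=30, half=7):
    """Search for one feature's match in img1 only along its epipolar line.
    'guess' is an initial position on the line (e.g. interpolated from nearby
    SIFT matches); img0 and img1 are 2-D grayscale arrays."""
    # Epipolar line l = F * x0 in the second image, in ax + by + c = 0 form.
    a, b, c = (F @ np.array([feature[0], feature[1], 1.0])).ravel()
    d = np.array([-b, a]) / np.hypot(a, b)            # unit direction along the line
    x0, y0 = int(round(feature[0])), int(round(feature[1]))
    template = img0[y0 - half:y0 + half + 1, x0 - half:x0 + half + 1].astype(np.float32)

    best, best_err = None, np.inf
    for s in range(-search_radius, search_radius + 1):  # 1-D search along the line
        px, py = (np.asarray(guess) + s * d).round().astype(int)
        patch = img1[py - half:py + half + 1, px - half:px + half + 1].astype(np.float32)
        if patch.shape != template.shape:
            continue                                     # candidate fell off the image
        err = float(np.sum((patch - template) ** 2))     # SSD matching error
        if err < best_err:
            best, best_err = (px, py), err
    return best, best_err
```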

Robust image feature tracking: the "smalls" tracker (6)
Third step: prune features
- geometrically inconsistent features are marked as mistracked and removed
- clumped features are pruned
- eliminates heuristic: detecting mistracked features based on convergence, error
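
A sketch of the pruning step under the same assumptions: tracks are rejected by their point-to-epipolar-line distance, and the survivors are thinned to enforce a minimum spacing. Both thresholds are illustrative, not values from the talk.

```python
import numpy as np

def prune_tracks(p0, p1, F, epi_thresh=2.0, min_spacing=10.0):
    """Drop tracks that violate the epipolar geometry, then thin clumped features.
    p0, p1 are Nx2 arrays of matched pixel coordinates; F is the fundamental matrix."""
    h0 = np.hstack([p0, np.ones((len(p0), 1))])        # homogeneous coordinates
    h1 = np.hstack([p1, np.ones((len(p1), 1))])
    lines = h0 @ F.T                                    # epipolar lines in image 1
    # Point-to-epipolar-line distance in pixels (geometric consistency check).
    dist = np.abs(np.sum(lines * h1, axis=1)) / np.hypot(lines[:, 0], lines[:, 1])
    keep = dist < epi_thresh

    # Greedily enforce a minimum spacing between the surviving features.
    kept = []
    for i in np.flatnonzero(keep):
        if all(np.hypot(*(p1[i] - p1[j])) >= min_spacing for j in kept):
            kept.append(i)
    return p0[kept], p1[kept]
```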

Robust image feature tracking: the "smalls" tracker (7)
Fourth step: extract new features
- spatial image coverage is the main criterion
- required texture is minimal when tracking is restricted to the epipolar lines
- eliminates heuristic: extracting only textured features
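
A sketch of coverage-driven feature extraction, assuming a simple grid occupancy test; the cell size and the choice of cell centers are illustrative, since the slide only states the coverage criterion.

```python
import numpy as np

def extract_for_coverage(img_shape, existing, cell=40):
    """Propose new feature locations so the image is covered spatially, rather
    than picking only the most textured corners: one candidate is added at the
    center of every grid cell that has no existing feature."""
    h, w = img_shape[:2]
    occupied = {(int(x) // cell, int(y) // cell) for x, y in existing}
    new_pts = []
    for gy in range(h // cell):
        for gx in range(w // cell):
            if (gx, gy) not in occupied:
                new_pts.append((gx * cell + cell // 2, gy * cell + cell // 2))
    return np.float32(new_pts)
```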

Robust image feature tracking: the "smalls" tracker (8)

Robust image feature tracking: the "smalls" tracker (9)
left: odometry only; right: images only
- average error: 1.74 m
- maximum error: 5.14 m
- total distance: 230 m

Robust image feature tracking: the "smalls" tracker (10)
Recap:
- exploits the rigid scene assumption and eliminates heuristics
- allows hands-free tracking for real sequences
- can still be defeated by textureless areas or repetitive texture

Outline
- Robust image feature tracking (in detail)
- Motion from omnidirectional images

Motion from omnidirectional images (1)

Motion from omnidirectional images (2)

Motion from omnidirectional images (3)

Motion from omnidirectional images (4)

Motion from omnidirectional images (5)
left: non-rigid camera; right: rigid camera
- squares: ground truth points
- solid: image-only estimates
- dash-dotted: image-and-inertial estimates

Motion from omnidirectional images (6)
In this experiment:
- omni images and conventional images + inertial have roughly the same advantages
But in general:
- inertial has some advantages that omni images alone can't produce
- omni images can be harder to use