Tracking Course web page: vision.cis.udel.edu/~cv April 18, 2003  Lecture 23

Announcements

Reading in Forsyth & Ponce for Monday:
– Chapter on the Kalman filter
– Chapter on the problem of data association
– "Bonus" chapter "Tracking with Non-linear Dynamic Models" (2-2.3) on particle filtering

Outline

Tracking as probabilistic inference
Examples
– Feature tracking
– Snakes
Kalman filter

What is Tracking?

Following a feature or object over a sequence of images
Motion is essentially differential, making frame-to-frame correspondence (relatively) easy
This can be posed as a probabilistic inference problem
– We know something about object shape and dynamics, but we want to estimate the state
– There's also uncertainty due to noise, unpredictability of motion, etc.
(figure from [Hong, 1995])

Tracking Applications

Robotics
– Manipulation, grasping [Hong, 1995]
– Mobility, driving [Taylor et al., 1996]
– Localization [Dellaert et al., 1998]
Surveillance/Activity monitoring
– Street, highway [Koller et al., 1994; Stauffer & Grimson, 1999]
– Aerial [Cohen & Medioni, 1998]
Human-computer interaction
– Expressions, gestures [Kaucic & Blake, 1998; Starner & Pentland, 1996]
– Smart rooms/houses [Shafer et al., 1998; Essa, 1999]

Tracking As Probabilistic Inference

Recall Bayes' rule:
P(X | Z) = P(Z | X) P(X) / P(Z)
For tracking, these random variables have common names:
– X is the state
– Z is the measurement
These are multi-valued and time-indexed, so:
P(X_t | Z_t) ∝ P(Z_t | X_t) P(X_t)

The Notion of State

State X_t is a vector of the parameters we are trying to estimate
– Changing over time
Some possibilities:
– Position: image coordinates, world coordinates (i.e., depth)
– Orientation (2-D or 3-D)
  Rigid "pose" of the entire object
  Joint angle(s) if the object is articulated (e.g., a person's arm)
  Curvature if the object is "bendable"
– Differential quantities like velocity, acceleration, etc.

Example: 2-D position, velocity
State (equation shown on slide)

Measurements

Z_t is what we observe at one moment
– For example, image position, image dimensions, color, etc.
Measurement likelihood P(Z_t | X_t): probability of the measurement given the state
Implicitly contains:
– Measurement prediction function H(X) mapping states to measurements
  E.g., perspective projection
  E.g., removal of velocity terms unobservable in a single image (or perhaps simulating motion blur?)
– Comparison function such that probability is inversely proportional to ||Z_t - H(X_t)|| (one concrete choice is sketched below)
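One common concrete choice for such a comparison function, assumed here for illustration (the slide does not specify it), is a Gaussian penalty on the prediction error:

P(Z_t | X_t) ∝ exp{ -||Z_t - H(X_t)||^2 / (2σ^2) }

where σ reflects the measurement noise; this satisfies the "inversely proportional" requirement and matches the exponential form used on the dynamics slide.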

Example: 2-D position, velocity
State, measurement prediction (equations shown on slide)

Dynamics

The prior probability on the state P(X_t) depends on previous states: P(X_t | X_{t-1}, X_{t-2}, ...)
Dynamics:
– 1st-order: only consider t-1 (Markov property)
  E.g., random walk, constant velocity
– 2nd-order: only use t-1 and t-2
  E.g., changes of direction, periodic motion
  Can be represented as a 1st-order process by doubling the size of the state to "remember" the last value
Implicitly contains:
– State prediction function F(X) mapping the current state to the future
– Comparison function: bigger ||X_t - F(X_{t-1})|| implies a less likely X_t
  E.g., random walk dynamics: P(X_t | X_{t-1}) ∝ exp{ -||X_t - X_{t-1}||^2 }

Example: 2-D position, velocity
State, measurement prediction, state prediction (equations shown on slide; a concrete sketch follows)
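To make the running example concrete, here is a minimal NumPy sketch of a 2-D constant-velocity model; the matrices and time step are illustrative assumptions, since the slide's actual equations are not reproduced in this transcript:

```python
import numpy as np

# State X_t = [x, y, vx, vy]^T: 2-D position plus velocity.
dt = 1.0  # time between frames (assumed)

# State prediction F(X) = F X: constant-velocity dynamics.
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)

# Measurement prediction H(X) = H X: we only observe image position,
# so the velocity terms are dropped.
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)

x = np.array([10.0, 20.0, 1.5, -0.5])  # example state
x_pred = F @ x        # predicted next state
z_pred = H @ x_pred   # predicted measurement (image position)
```

With this H, the predicted measurement simply drops the velocity components, matching the earlier remark that velocity is unobservable in a single image.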

Probabilistic Inference

Want the best estimate of the state given the current measurement z_t and the previous state x_{t-1} (these are fixed)
Use, for example, the MAP criterion:
x_t* = argmax_X P(z_t | X) P(X | x_{t-1})
For a general measurement likelihood and state prior, obtaining the best estimate requires iterative search
– Can confine the search to the region of state space near F(x_{t-1}) for efficiency, since this is where the probability mass is concentrated (a brute-force sketch follows)
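As an illustration of confining the search near F(x_{t-1}), here is a rough sketch of a brute-force MAP search; the function names, the grid, and searching only the first two state components are assumptions made for the example, not part of the lecture:

```python
import numpy as np

def map_estimate(z, x_prev, F, likelihood, prior, radius=3.0, step=0.5):
    """Brute-force MAP search over a grid centered on the predicted state.

    likelihood(z, x) ~ P(z | x); prior(x, x_prev) ~ P(x | x_prev).
    Illustrative only: real trackers use gradient descent or closed forms.
    """
    x_pred = F @ x_prev
    offsets = np.arange(-radius, radius + step, step)
    best_x, best_score = x_pred, -np.inf
    # Search a small neighborhood of the predicted state (here: the first
    # two state components, e.g. image position), since that is where the
    # posterior mass is concentrated.
    for dx in offsets:
        for dy in offsets:
            x = x_pred.copy()
            x[0] += dx
            x[1] += dy
            score = likelihood(z, x) * prior(x, x_prev)
            if score > best_score:
                best_x, best_score = x, score
    return best_x
```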

Feature Tracking

Detect corner-type features
State x_t
– Position of the template image (the originally found corner)
– Optional: velocity, acceleration terms
– Rotation, perspective: for a planar feature, a homography describes the full range of possibilities
Measurement likelihood P(z_t | X): similarity of the match (e.g., SSD/correlation) between the template and z_t, which is a patch of the image
(figure labels: z_t, H(x_t), |z_t - H(x_t)|)

Feature Tracking

Dynamics P(X | x_{t-1}): static or with displacement prediction
Inference is simple: gradient descent on the match function, starting at the predicted feature location (a simple SSD-search sketch follows)
– Can actually do this in one step assuming a small enough displacement
– Image pyramid representation (i.e., Gaussian) can help with larger motions
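A minimal sketch of that match step, assuming a plain SSD search over a small window around the predicted location rather than true gradient descent (all names and the search radius are illustrative):

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences between two equal-sized patches."""
    d = a.astype(float) - b.astype(float)
    return float(np.sum(d * d))

def track_feature(image, template, pred_rc, search_radius=8):
    """Find the template near the predicted location by exhaustive SSD.

    image: 2-D grayscale array; template: small 2-D patch;
    pred_rc: (row, col) predicted top-left corner of the patch.
    Returns the (row, col) with the lowest SSD score.
    """
    th, tw = template.shape
    pr, pc = pred_rc
    best_rc, best_score = pred_rc, np.inf
    for r in range(pr - search_radius, pr + search_radius + 1):
        for c in range(pc - search_radius, pc + search_radius + 1):
            if r < 0 or c < 0 or r + th > image.shape[0] or c + tw > image.shape[1]:
                continue
            score = ssd(image[r:r + th, c:c + tw], template)
            if score < best_score:
                best_rc, best_score = (r, c), score
    return best_rc
```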

Example: Feature Selection & Tracking (from J. Shi & C. Tomasi)
Separately tracked features for a forward-moving camera

Example: Track History (from J. Shi & C. Tomasi)
(figure axes: measurement vs. time)

Example: Feature Tracking courtesy of H. Jin

Snakes

Idea: track contours such as silhouettes and road lines using edge information
Dynamics
– Low-dimensional warp of a shape template [Blake et al., 1993]
  Translation, in-plane rotation, affine, etc.
– Or more general non-rigid deformations of the curve
Measurement likelihood
– Error measure = mean distance from the predicted curve to the nearest Canny edge (a sketch follows below)
– Or integrate the gradient orthogonal to the curve along it
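A rough sketch of the Canny-based error measure, assuming OpenCV for the edge detector and distance transform; the thresholds and the final likelihood form are illustrative choices, not taken from the slides:

```python
import cv2
import numpy as np

def contour_error(image, curve_xy):
    """Mean distance from a predicted curve to the nearest Canny edge.

    image: grayscale uint8 image; curve_xy: (N, 2) array of (x, y) points
    sampled along the predicted contour. Canny thresholds are illustrative.
    """
    edges = cv2.Canny(image, 50, 150)
    # distanceTransform measures distance to the nearest zero pixel,
    # so invert the edge map: edge pixels become 0.
    dist = cv2.distanceTransform(255 - edges, cv2.DIST_L2, 3)
    xs = np.clip(curve_xy[:, 0].astype(int), 0, image.shape[1] - 1)
    ys = np.clip(curve_xy[:, 1].astype(int), 0, image.shape[0] - 1)
    return float(dist[ys, xs].mean())

# A measurement likelihood could then be taken, for example, as
# exp(-contour_error(image, curve)**2), smaller error meaning more likely.
```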

Example: Contour-based Hand Template Tracking courtesy of A. Blake

Example: Non-rigid Contour Tracking courtesy of A. Blake

Kalman Filter

Optimal, closed-form solution when we have:
– Gaussian probability distributions (unimodal)
  Measurement likelihood P(z_t | X)
  State prior P(X | x_{t-1})
– Linear prediction functions (i.e., they can be written as matrix multiplications)
  Measurement prediction function: H(X) = H X
  State prediction function: F(X) = F X
Online version of least-squares estimation
– Instead of having all data points (measurements) at once before fitting (aka "batch"), compute a new estimate as each point comes in
  Remember that a 1st-order model means that only the last estimate and the current measurement are available

Optimal Linear Estimation

Assume: linear system with uncertainties
– State x
– Dynamical (system) model: x_t = F x_{t-1} + ξ
– Measurement model: z_t = H x_t + μ
– ξ, μ denote white, zero-mean Gaussian noise with covariances Q, R respectively, proportional to uncertainty
Want the best state estimate at each instant, plus an indication of its uncertainty P

Kalman Filter Steps
Mean and covariance of the posterior completely describe the distribution (the standard predict/update equations are sketched below)
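The equations themselves appear only as images on the slide; for reference, the standard predict/update cycle, written as a minimal NumPy sketch using the F, H, Q, R of the previous slide, is:

```python
import numpy as np

def kalman_step(x_hat, P, z, F, H, Q, R):
    """One standard Kalman filter cycle: predict, then update with z.

    x_hat, P: previous state estimate and its covariance.
    F, H: state and measurement prediction matrices.
    Q, R: process and measurement noise covariances.
    """
    # Predict
    x_pred = F @ x_hat
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)    # corrected estimate
    P_new = (np.eye(len(x_hat)) - K @ H) @ P_pred
    return x_new, P_new
```

Each call returns the new posterior mean and covariance, which, as the slide notes, completely describe the (Gaussian) posterior.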

Multi-Modal Posteriors

The MAP estimate is just the tallest peak when there are multiple peaks in the posterior
This is fine when one peak dominates, but when they are of comparable heights, we might sometimes pick the wrong one
Committing to just one possibility can lead to mistracking
– Want a wider sense of the posterior distribution to keep track of other good candidate states
(figure: multiple peaks in the measurement likelihood; adapted from [Hong, 1995])

Tracking Complications

Correspondence ambiguity (multi-modal posterior)
– Kalman filter
  Data association techniques: NN, PDAF, JPDAF, MHF
– Particle filters
  Stochastic approximation of distributions
Nonlinear measurement, state prediction functions
– Extended Kalman filter
  Linearize the nonlinear function(s) with a 1st-order Taylor series approximation at each time step (sketched below)
– Particle filters
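For reference, the first-order linearization the EKF performs (standard form, not reproduced from the slides): near the current estimate x, a nonlinear prediction function such as F(X) is approximated by

F(X) ≈ F(x) + J_F(x) (X - x),   where [J_F(x)]_ij = ∂F_i/∂X_j evaluated at X = x

and similarly for H(X). The Jacobians J_F and J_H then stand in for the matrices F and H in the Kalman filter equations at each time step.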