Tracking by Sampling Trackers Junseok Kwon* and Kyoung Mu Lee Computer Vision Lab., Dept. of EECS, Seoul National University, Korea Homepage:

Goal of Visual Tracking  Robustly track the target in real-world scenarios (example frames: Frame #1 and Frame #43)

Bayesian Tracking Approach  Maximum a Posteriori (MAP) estimate (observation cues: intensity, edge)

State Sampling  MAP estimate by Monte Carlo sampling  The state space spans x position, y position, and scale; sampling is guided by the visual tracker
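The MAP-by-sampling idea on this slide can be sketched as below. This is a minimal illustration, not the paper's tracker: the Gaussian observation model, the assumed true state, and all numeric parameters are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observation model: likelihood peaked near an assumed true
# target state (x position, y position, scale).
TRUE_STATE = np.array([120.0, 80.0, 1.0])

def likelihood(state):
    return float(np.exp(-np.sum(((state - TRUE_STATE) / [5.0, 5.0, 0.1]) ** 2)))

def sample_map_state(prev_state, n_samples=500):
    """Approximate the MAP state: draw samples around the previous state
    (a Gaussian motion proposal) and keep the best-scoring one."""
    noise = rng.normal(size=(n_samples, 3)) * [8.0, 8.0, 0.05]
    candidates = prev_state + noise
    scores = np.array([likelihood(s) for s in candidates])
    return candidates[np.argmax(scores)]

estimate = sample_map_state(np.array([118.0, 78.0, 1.0]))
```

With enough samples, the best-scoring candidate approaches the MAP state; the quality of the estimate depends entirely on how well the proposal (the motion model) covers the target's actual movement, which is the motivation for sampling trackers in the following slides.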

Problem of Previous Works  Conventional trackers have difficulty in obtaining good samples.  The tracking environment changes, but a fixed visual tracker cannot reflect the changing tracking environment well.

Our Approach: Tracker Sampling  Sample the tracker itself as well as the state.  Tracker sampling draws Tracker #1, Tracker #2, …, Tracker #M from the tracker space; each tracker then performs state sampling over x position, y position, and scale.

Two Challenges  How is the tracker space defined?  When, and which, tracker should be sampled?

Challenge 1: Tracker Space  No previous work has tried to define a tracker space.  The space is very difficult to design because a visual tracker is hard to describe formally.

Bayesian Tracking Approach  Go back to the Bayesian tracking formulation.  Updating rule: p(x_t | y_1:t) ∝ p(y_t | x_t) ∫ p(x_t | x_t−1) p(x_t−1 | y_1:t−1) dx_t−1

Bayesian Tracking Approach  What are the important ingredients of a visual tracker? 1. Appearance model 2. Motion model 3. State representation type 4. Observation type

Tracker Space  Spanned by the four ingredients: appearance model, motion model, state representation, and observation
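One way to picture a point in this tracker space is as a tuple of one choice per ingredient. The sketch below is purely illustrative; the field names and example values are assumptions, not the paper's notation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tracker:
    """A point in the tracker space: one choice per ingredient."""
    appearance_model: str  # e.g. a sparse-PCA basis id (hypothetical)
    motion_model: str      # e.g. a motion-cluster mean id (hypothetical)
    state_repr: str        # e.g. a fragment layout id (hypothetical)
    observation: str       # e.g. a Gaussian-filter variance id (hypothetical)

# One basic tracker = one combination of the four sampled ingredients.
basic_tracker = Tracker("pc1", "slow_drift", "two_fragments", "sigma_2")
```

Sampling a tracker then amounts to sampling along each of these four axes, which the next slides do one ingredient at a time.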

Challenge 2: Tracker Sampling  When, and which, tracker should be sampled?  Goal: reflect the current tracking environment. Tracker #m is drawn from the tracker space over the four ingredients: appearance model, motion model, state representation type, and observation type.

Reversible Jump MCMC  We use the RJ-MCMC method for tracker sampling.  Add/delete moves update the sets of sampled appearance models, motion models, state representation types, and observation types; their combinations form the sampled basic trackers.
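The add/delete mechanism can be sketched as a single trans-dimensional move on a set of models, accepted with a Metropolis-style ratio. Everything concrete here is assumed for illustration: the model names, their scores, and the scoring function are hypothetical, not the paper's acceptance ratio.

```python
import random

random.seed(7)

# Hypothetical per-model likelihood scores over recent frames.
MODEL_SCORE = {"pc1": 0.9, "pc2": 0.6, "pc3": 0.2, "pc4": 0.1}

def total_score(models):
    return sum(MODEL_SCORE[m] for m in models) + 1e-9  # avoid division by zero

def rjmcmc_step(active, pool, max_models=3):
    """One add/delete move on the set of sampled models, accepted with a
    Metropolis-style ratio (illustrative scoring, not the paper's)."""
    proposal = set(active)
    addable = sorted(pool - proposal)
    if addable and len(proposal) < max_models and (not proposal or random.random() < 0.5):
        proposal.add(random.choice(addable))               # "add" move
    else:
        proposal.discard(random.choice(sorted(proposal)))  # "delete" move
    ratio = min(1.0, total_score(proposal) / total_score(active))
    return proposal if random.random() < ratio else set(active)

active = {"pc1"}
for _ in range(50):
    active = rjmcmc_step(active, set(MODEL_SCORE))
```

Because deletions that lower the total score are usually rejected, high-scoring models tend to survive in the active set, which matches the intuition on the following slides: models are kept when they help on recent frames.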

Sampling of Appearance Model  Make candidates using Sparse Principal Component Analysis (SPCA)*.  The candidates are principal components of the target appearance. * A. d'Aspremont et al. A direct formulation for sparse PCA using semidefinite programming. SIAM Review.

Sampling of Appearance Model  Accept an appearance model with an acceptance ratio.  Our method keeps a limited number of models.

 The accepted model increases the total likelihood score over recent frames when it is adopted as the target reference.

Sampling of Motion Model  Make candidates using K-Harmonic Means clustering (KHM)*.  The candidates are the mean vectors of clusters of motion vectors. * B. Zhang, M. Hsu, and U. Dayal. K-harmonic means - a data clustering algorithm. HP Technical Report, 1999.

Sampling of Motion Model  Accept a motion model with an acceptance ratio.  Our method keeps a limited number of models.

 The accepted model decreases the total clustering error of motion vectors over recent frames when it is set to the mean vector of a cluster.
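The clustering step can be sketched as follows. As a simplification, plain k-means stands in for the K-harmonic means algorithm cited on the slide, and the toy motion data (a slow drift plus occasional fast jumps) is assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def cluster_motion_vectors(vectors, k=2, iters=20):
    """Cluster frame-to-frame motion vectors; the cluster means become
    candidate motion models. Plain k-means as a stand-in for KHM."""
    centers = vectors[rng.choice(len(vectors), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(vectors[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)  # assign each vector to nearest center
        centers = np.array([vectors[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers

# Toy motion data: slow drift plus occasional fast jumps (assumed).
motions = np.vstack([rng.normal([1.0, 0.0], 0.3, size=(30, 2)),
                     rng.normal([8.0, 5.0], 0.5, size=(10, 2))])
models = cluster_motion_vectors(motions)
```

Each resulting center captures one motion regime (e.g. drift vs. abrupt jump), so proposing from the right cluster mean lets the sampled motion model match the target's current behavior.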

Sampling of State Representation  Make candidates using Vertical Projection of Edge (VPE)*.  The candidates describe the target as different combinations of multiple fragments (e.g., Fragment 1, Fragment 2), obtained from the vertical projection of edge intensity over position. * F. Wang, S. Yu, and J. Yang. Robust and efficient fragments-based tracking using mean shift. Int. J. Electron. Commun., 64(7):614–623.

Sampling of State Representation  Accept a state representation type with an acceptance ratio.  Our method keeps a limited number of types.

 The accepted type reduces the total variance of target appearance within each fragment over recent frames.
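The vertical-projection idea can be sketched as below: sum edge magnitudes per column and split the target at a valley of the profile. The toy patch (two bright vertical bars) and the valley heuristic are illustrative assumptions, not the paper's exact VPE procedure.

```python
import numpy as np

def vertical_edge_projection(patch):
    """Column-wise sum of horizontal gradient magnitude; valleys in this
    profile suggest where to split the target into fragments."""
    grad = np.abs(np.diff(patch.astype(float), axis=1))
    return grad.sum(axis=0)

# Toy target patch: two bright vertical bars on a dark background (assumed).
patch = np.zeros((20, 12))
patch[:, 2:4] = 1.0
patch[:, 8:10] = 1.0

profile = vertical_edge_projection(patch)
# Pick the split column inside the low-edge valley between the two bars.
split = int(np.argmin(profile[3:9])) + 3
```

Splitting at such valleys yields fragments whose internal appearance is more homogeneous, which is exactly the low-variance criterion the acceptance test on this slide rewards.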

Sampling of Observation  Make candidates using a Gaussian Filter Bank (GFB)*.  The candidates are the responses of multiple Gaussian filters with different variances. * J. Sullivan, A. Blake, M. Isard, and J. MacCormick. Bayesian object localisation in images. IJCV, 44(2):111–135, 2001.

Sampling of Observation  Accept an observation type with an acceptance ratio.  Our method keeps a limited number of types.

 The accepted type makes foreground observations more similar to each other, and more distinct from the background, over recent frames.
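A filter bank of this kind can be sketched in one dimension as follows; the specific variances and the toy noise signal are assumptions for illustration, not the paper's observation model.

```python
import numpy as np

rng = np.random.default_rng(1)

def gaussian_kernel1d(sigma):
    """Normalized 1-D Gaussian kernel truncated at about 3 sigma."""
    radius = int(3 * np.ceil(sigma))
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def filter_bank_responses(signal, sigmas=(1.0, 2.0, 4.0)):
    """Responses of Gaussian filters with different variances; stacked,
    they form a multi-scale observation of the signal."""
    return np.stack([np.convolve(signal, gaussian_kernel1d(s), mode="same")
                     for s in sigmas])

signal = rng.normal(size=128)            # toy 1-D observation
responses = filter_bank_responses(signal)
```

Larger variances wash out fine detail while smaller ones preserve it, so choosing which filter response to use as the observation trades robustness to appearance noise against discriminability, the trade-off the acceptance test on this slide navigates.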

Overall Procedure  Tracker sampling draws Tracker #1, Tracker #2, …, Tracker #M from the tracker space.  Each tracker performs state sampling over x position, y position, and scale, and the trackers interact with each other.

Qualitative Results Iron-man dataset

Qualitative Results Matrix dataset

Qualitative Results Skating1 dataset

Qualitative Results Soccer dataset

Quantitative Results  Average center location errors in pixels. Methods compared: MC, IVT, MIL, VTD, Ours. Sequences: soccer, skating, animal, shaking, Soccer*, Skating1*, Iron-man, Matrix. IVT: Ross et al. Incremental learning for robust visual tracking. IJCV. MIL: Babenko et al. Visual tracking with online multiple instance learning. CVPR. MC: Khan et al. MCMC-based particle filtering for tracking a variable number of interacting targets. PAMI. VTD: Kwon et al. Visual tracking decomposition. CVPR 2010.

Summary  Visual tracker sampler: a new framework that samples the visual tracker itself as well as the state.  An efficient sampling strategy for sampling the visual tracker.