Real-Time Decentralized Articulated Motion Analysis and Object Tracking From Videos
Wei Qu, Member, IEEE, and Dan Schonfeld, Senior Member, IEEE

OUTLINE
INTRODUCTION
DAOT FRAMEWORK
HAOT FRAMEWORK
EXPERIMENTAL RESULTS

INTRODUCTION Articulated object tracking is challenging: the computational complexity grows exponentially with the number of degrees of freedom of the object, and the parts frequently occlude one another. In this paper, we present two new articulated motion analysis and object tracking approaches, DAOT and HAOT, based on a decentralized framework and its hierarchical extension, respectively.

DECENTRALIZED FRAMEWORK FOR ARTICULATED MOTION ANALYSIS AND OBJECT TRACKING A. Articulated Object Representation: An articulated object can be represented by a graphical model, as shown in Fig. 1.
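As a concrete illustration only (the part names and topology below are assumptions, not the model from Fig. 1), such a representation amounts to an undirected graph whose nodes are the object's parts and whose edges are the physical links between them:

```python
# Illustrative sketch: an articulated object as an undirected graph of parts.
# Part names and topology are assumptions for this example only.
articulated_graph = {
    "torso":           ["head", "left_upper_arm", "right_upper_arm"],
    "head":            ["torso"],
    "left_upper_arm":  ["torso", "left_lower_arm"],
    "left_lower_arm":  ["left_upper_arm"],
    "right_upper_arm": ["torso", "right_lower_arm"],
    "right_lower_arm": ["right_upper_arm"],
}

def neighbors(part):
    """Return the parts directly linked to `part`."""
    return articulated_graph[part]

print(neighbors("torso"))  # ['head', 'left_upper_arm', 'right_upper_arm']
```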

In order to describe the motion of an articulated object, we model the state dynamics with a dynamic graphical model, as shown in Fig. 2.

In order to facilitate the analysis and achieve a real-time implementation, we adopt a decentralized framework. Fig. 3(a) shows the decomposition result for part 3 in Fig. 2.

B. Bayesian Conditional Density Propagation: In this section, we formulate the motion estimation problem: given the observations, we want to determine the underlying object state. Applying the Markov properties yields a recursive propagation of each part's posterior density, conditioned on its own history and on its neighboring parts.
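For reference, a minimal statement of the generic filtering recursion is given below; the decentralized formulation in the paper additionally conditions each part on its neighbors, so this is only the single-part backbone.

```latex
p(x_t \mid y_{1:t}) \;\propto\; p(y_t \mid x_t)
  \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, dx_{t-1}
```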

C. Sequential Monte Carlo Approximation: The basic idea of the SMC approximation is to use a weighted sample set to estimate the posterior density. The importance density q(·) is chosen to factorize across the parts.
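The generic importance-weight update underlying any SMC scheme is recalled below for a single part; the paper's factorized proposal distributes this computation across the decomposed graphical models, so treat this only as the baseline form.

```latex
w_t^{(n)} \;\propto\; w_{t-1}^{(n)}\,
  \frac{p\!\left(y_t \mid x_t^{(n)}\right)\, p\!\left(x_t^{(n)} \mid x_{t-1}^{(n)}\right)}
       {q\!\left(x_t^{(n)} \mid x_{t-1}^{(n)}, y_t\right)}
```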

By substituting (6) and (8) into (7), we obtain a weight update in which an interaction term models the compatibility between the samples of two neighboring parts, and each part's local likelihood acts as a weight on the associated interaction.
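A minimal numerical sketch of that idea is given below, assuming hypothetical Gaussian local likelihoods and a Gaussian pairwise interaction potential; the functions and parameters are illustrative, not the paper's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_likelihood(samples, observation, sigma=5.0):
    """Hypothetical Gaussian image likelihood for one part's samples (N x 2)."""
    d2 = np.sum((samples - observation) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def interaction(samples_i, samples_j, rest_offset, sigma=10.0):
    """Pairwise compatibility between part i and part j samples (N x N)."""
    diff = samples_i[:, None, :] - samples_j[None, :, :] - np.asarray(rest_offset)
    return np.exp(-np.sum(diff ** 2, axis=2) / (2.0 * sigma ** 2))

# Toy data: 100 samples per part in image coordinates.
samples_i = rng.normal([100.0, 80.0], 3.0, size=(100, 2))
samples_j = rng.normal([100.0, 120.0], 3.0, size=(100, 2))
obs_i, obs_j = np.array([101.0, 79.0]), np.array([99.0, 121.0])

# Each neighbor sample contributes to the interaction term weighted by that
# neighbor's local likelihood, mirroring the role described above.
w_j = local_likelihood(samples_j, obs_j)
message_from_j = interaction(samples_i, samples_j, rest_offset=[0.0, -40.0]) @ w_j

weights_i = local_likelihood(samples_i, obs_i) * message_from_j
weights_i /= weights_i.sum()  # normalized sample weights for part i
```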

HIERARCHICAL DECENTRALIZED FRAMEWORK FOR ARTICULATED MOTION ANALYSIS AND OBJECT TRACKING A. Hierarchical Graphical Modeling: We define a group of parts as a unit, so that the object is partitioned into a set of units.
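As a toy illustration of this grouping (the part and unit names are assumptions carried over from the earlier sketch), the hierarchical model simply partitions the part set into units:

```python
# Illustrative only: partition of parts into units for the hierarchical model.
units = {
    "upper_body": ["head", "torso"],
    "left_arm":   ["left_upper_arm", "left_lower_arm"],
    "right_arm":  ["right_upper_arm", "right_lower_arm"],
}
num_units = len(units)  # total number of units
```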

As in DAOT, we adopt a decentralized framework and therefore decompose the graphical model for each part.

B. Hierarchical Bayesian Conditional Density Propagation: As in DAOT, we derive a Bayesian conditional density propagation framework for each decomposed graphical model.

Applying the Markov properties yields the corresponding recursive propagation of the posterior density for each decomposed model.

C. Sequential Monte Carlo Implementation: In HAOT, the importance density q(·) is chosen analogously to DAOT, and the sample weights are updated accordingly.

By substituting (15), (19), and (20) into (21) and approximating the integrals by summations, we obtain a weight-update rule that can be evaluated directly from the samples.

The unit observation likelihood in (22) can be further approximated by the product of all parts' local observation likelihoods within that unit. By first computing every part's local observation likelihood, we avoid having to compute the interunit observation likelihood explicitly.
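In notation chosen here only for illustration (the slide omits the original symbols), this approximation reads:

```latex
p\!\left(y_t^{u} \mid x_t^{u}\right) \;\approx\;
  \prod_{i \in u} p\!\left(y_t^{i} \mid x_t^{i}\right)
```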

D. High-Level Interaction Model: In our experiments, we used a Gaussian mixture model to estimate the interaction density from training data for a walking person.
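A minimal sketch of fitting such a mixture with scikit-learn is shown below, assuming the training data are vectors of relative part configurations (e.g., joint angles) collected from a walking sequence; the feature construction and mixture size are hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical training data: each row is a vector of relative part
# configurations (e.g., joint angles) gathered from a walking sequence.
rng = np.random.default_rng(1)
train_features = rng.normal(size=(500, 4))

# Fit a Gaussian mixture model to the high-level interaction density.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(train_features)

# During tracking, a candidate configuration can be scored under the learned density.
candidate = rng.normal(size=(1, 4))
log_density = gmm.score_samples(candidate)  # log p(candidate)
```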

EXPERIMENTAL RESULTS The tracking performance of the two proposed methods was compared, both qualitatively and quantitatively, with multiple independent trackers (MIT), the joint particle filter (JPF), mean field Monte Carlo (MFMC), and loose-limbed people tracking (LLPT).

Qualitative Tracking Results: The video GIRL contains a girl moving her arms. It has 122 frames and was captured at 25 fps with a resolution of 320 x 240 pixels.

The video 3D-FINGER shows a finger bending into the image plane. It was captured at 15 fps with a resolution of 240 x 180 pixels and has 345 frames.

The sequence WALKING contains a person walking forward inside a classroom. It has 66 frames and was captured at 25 fps with a resolution of 320 x 240 pixels.

The video sequence GYM was captured in a gym from a side view of a person on a walking machine. Compared with the WALKING sequence, this video is much longer (1716 frames) and has a very cluttered background.

Quantitative Performance Analysis and Comparisons: With synthetic data:

In Fig. 10, we compare the RMSE of MIT, JPF and DAOT on the synthetic video.
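For reference, the RMSE reported here is presumably the standard root-mean-square error between the estimated and ground-truth part positions over the T frames of the sequence (the exact definition is not shown in the slides):

```latex
\mathrm{RMSE} = \sqrt{\frac{1}{T} \sum_{t=1}^{T}
  \left\lVert \hat{x}_t - x_t^{\mathrm{gt}} \right\rVert^{2}}
```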

With real data: We compare the tracking accuracy of the different approaches using two measures, the false position rate (FPR) and the false label rate (FLR).

In Table III, we compare both the speed and the accuracy of the different particle-filter-based approaches on the WALKING sequence.