Tracking Multiple Cells By Correspondence Resolution In A Sequential Bayesian Framework
Nilanjan Ray, Gang Dong, Scott T. Acton
C.L. Brown Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, Virginia, USA

Presentation Overview
– The problem
– The framework to solve the problem
– Track mapping
– Simultaneous tracking and detection

The Problem
Can we estimate these paths (the mappings f_t) in a sequential Bayesian framework?
[Figure: cells detected in frames t-2, t-1, and t, with detection sets d_{t-2}, d_{t-1}, d_t and correspondence mappings f_{t-1}, f_t linking consecutive frames.]
d_t: set of detected cells in frame t

Sequential Bayesian Framework
We are interested in estimating x_t given the information (z_1, z_2, …, z_t) ≡ z_{1:t}.

Applying Bayes' rule:
p(x_t | z_{1:t}) ∝ p(z_t | x_t, z_{1:t-1}) p(x_t | z_{1:t-1})

Assumptions:
(1) The measurement z_t is conditionally independent of the past given the current state x_t: p(z_t | x_t, z_{1:t-1}) = p(z_t | x_t).
(2) Given the immediate past state, the current state is conditionally independent of earlier states and measurements: p(x_t | x_{t-1}, z_{1:t-1}) = p(x_t | x_{t-1}).

Incorporating (1) and (2):
p(x_t | z_{1:t}) ∝ p(z_t | x_t) ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_{1:t-1}) dx_{t-1}

Note 1: the dimension of x_t may vary (non-randomly) over t.
Note 2: the dimension of z_t may vary (non-randomly) over t.
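The recursion above is the standard Bayes filter. As an illustration only (not from the slides), here is a minimal Python sketch of one prediction/update step on a hypothetical 1-D discrete state; the grid size, transition matrix, and likelihood are made-up toy choices.

```python
import numpy as np

# Hypothetical 1-D example: the state is a cell's position on a discrete grid.
# transition[i, j] = p(x_t = j | x_{t-1} = i)   (motion prior)
# likelihood(z)[j] = p(z_t = z | x_t = j)       (measurement model)

def bayes_filter_step(prior, transition, likelihood, z):
    """One prediction/update step of the recursion
    p(x_t | z_{1:t}) ∝ p(z_t | x_t) * sum_{x_{t-1}} p(x_t | x_{t-1}) p(x_{t-1} | z_{1:t-1})."""
    predicted = prior @ transition            # Chapman-Kolmogorov prediction
    posterior = predicted * likelihood(z)     # multiply by the measurement likelihood
    return posterior / posterior.sum()        # normalize

# Toy usage: 10 grid cells, random-walk motion, Gaussian-shaped measurement likelihood.
n = 10
transition = 0.6 * np.eye(n) + 0.2 * np.eye(n, k=1) + 0.2 * np.eye(n, k=-1)
transition /= transition.sum(axis=1, keepdims=True)
likelihood = lambda z: np.exp(-0.5 * (np.arange(n) - z) ** 2)

belief = np.full(n, 1.0 / n)                  # uniform initial belief
for z in [3, 4, 4, 5]:                        # simulated noisy position measurements
    belief = bayes_filter_step(belief, transition, likelihood, z)
print(belief.argmax())                        # MAP estimate after the last measurement
```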

Sequential Bayesian Framework…
Sequential MAP estimation: x̂_t = argmax over x_t of p(x_t | z_{1:t}).
Marginal probability distribution:
p(x_t | z_{1:t}) ∝ p(z_t | x_t) ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_{1:t-1}) dx_{t-1},
where p(z_t | x_t) is the likelihood and p(x_t | x_{t-1}) is the motion prior.

Sequential Markov Chain Monte Carlo (MCMC) Computation
If we approximate the posterior density at time t-1 by a set of samples {x_{t-1}^(n)}, n = 1, …, N, then the posterior density at time t becomes
p(x_t | z_{1:t}) ∝ p(z_t | x_t) (1/N) Σ_n p(x_t | x_{t-1}^(n))    (A)

We can generate samples from (A) by the Metropolis-Hastings MCMC algorithm:
(1) Randomly choose a sample x_{t-1}^(n) from the sample set at time t-1.
(2) Generate a candidate sample from the proposal density.
(3) Generate u ~ Uniform(0, 1) and compute the acceptance ratio r.
(4) Keep the current sample if u > r; otherwise accept the candidate.
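A minimal sketch of steps (1)-(4), assuming hypothetical log-densities log_likelihood(x) for p(z_t | x_t) and log_motion(x, x_prev) for p(x_t | x_{t-1}), and a symmetric Gaussian random-walk proposal so the proposal terms cancel in the acceptance ratio; none of these specific choices come from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def sequential_mh(samples_prev, log_likelihood, log_motion, n_iters=1000, step=0.5):
    """Draw samples from (A): p(x_t | z_{1:t}) ∝ p(z_t | x_t) * (1/N) Σ_n p(x_t | x_{t-1}^(n)),
    using Metropolis-Hastings with a symmetric random-walk proposal.
    samples_prev: list of state vectors (numpy arrays) approximating p(x_{t-1} | z_{1:t-1})."""
    def log_target(x):
        # log of the mixture over the previous sample set, plus the measurement log-likelihood
        mix = [log_motion(x, xp) for xp in samples_prev]
        return log_likelihood(x) + np.logaddexp.reduce(mix) - np.log(len(samples_prev))

    x = samples_prev[rng.integers(len(samples_prev))]    # (1) start from a previous sample
    out = []
    for _ in range(n_iters):
        cand = x + step * rng.standard_normal(x.shape)   # (2) symmetric proposal
        log_r = log_target(cand) - log_target(x)         # (3) acceptance ratio (proposal cancels)
        if np.log(rng.uniform()) < log_r:                 # (4) accept or keep the current sample
            x = cand
        out.append(x)
    return np.array(out)
```

In practice the chain at time t is started from the samples at t-1, and the accepted states after burn-in form the sample set carried forward to the next frame.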

Sequential Track Map Estimation and Detection Refinement
We define the track mapping (function) f_t between the detected cell sets of consecutive frames, such that the restricted mapping (function) is one-to-one. We also define a detection refinement mapping over the detected cells. Sequential MCMC is then applied to the resulting posterior over the track map and the detection refinement map.

Sampling Via Reverse Track Map
Consider a reverse track map g_{t-1} that reverses the direction of the correspondence defined by f_{t-1}. One can uniquely construct f_{t-1} from g_{t-1} and vice versa, so the posteriors over the two maps are equivalent, which implies that the sequential MCMC sampling can equally be applied to the reverse track map.
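A small illustration of the "uniquely construct f from g and vice versa" claim; the dict representation of a track map is a hypothetical choice for this sketch, not the paper's data structure.

```python
# Hypothetical representation: a one-to-one track map as a dict mapping
# cell index in frame t-1 -> cell index in frame t.
f_prev = {0: 2, 1: 0, 2: 1}           # forward track map f_{t-1}

def reverse_map(f):
    """Construct the reverse track map g from a one-to-one forward map f."""
    g = {v: k for k, v in f.items()}
    assert len(g) == len(f), "map must be one-to-one for the inverse to exist"
    return g

g_prev = reverse_map(f_prev)           # reverse track map g_{t-1}
assert reverse_map(g_prev) == f_prev   # f is uniquely recovered from g, and vice versa
```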

A Generic Sequential MCMC Algorithm
For ease of sampling, we assume the density factors into simpler terms; each factor is then sampled in turn, as in the following slides for the detection refinement map and the track map.

Sampling for the Detection Refinement Map
– Assume the detection refinement depends only on the current track map, and choose a detection refinement density suited to the cell tracking problem.
– Also assume the measurement depends only on the detection refinement map, and choose a corresponding measurement density.
– The MH ratio for sampling the detection refinement map is formed from these two densities (a hedged sketch follows below).
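A minimal sketch of the resulting MH ratio, assuming hypothetical density functions p_refine(d, f) for the detection refinement density given the track map and p_meas(z, d) for the measurement density, with a symmetric proposal on the refinement map; the concrete densities on the slide are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def mh_ratio_refinement(d_cand, d_curr, f_curr, z, p_refine, p_meas):
    """MH acceptance ratio for a symmetric proposal on the detection refinement map d:
    r = [p(z | d') p(d' | f)] / [p(z | d) p(d | f)]."""
    num = p_meas(z, d_cand) * p_refine(d_cand, f_curr)
    den = p_meas(z, d_curr) * p_refine(d_curr, f_curr)
    return num / den

def mh_step_refinement(d_curr, f_curr, z, propose, p_refine, p_meas):
    """One Metropolis-Hastings step on the detection refinement map."""
    d_cand = propose(d_curr)   # symmetric proposal (assumption for this sketch)
    r = mh_ratio_refinement(d_cand, d_curr, f_curr, z, p_refine, p_meas)
    return d_cand if rng.uniform() < min(1.0, r) else d_curr
```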

Sampling for the Track Map
Here, h(·) denotes the motion model used for the track map; one possible choice is sketched below.
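A hedged sketch of one possible motion model h(·), assuming it scores a cell's displacement between consecutive frames with an isotropic Gaussian; this particular form is an illustration, not necessarily the choice given on the slide.

```python
import numpy as np

def h_gaussian_displacement(pos_t, pos_prev, sigma=5.0):
    """One possible motion model h(.): an isotropic Gaussian density on the
    displacement of a cell between frame t-1 and frame t (pixel units assumed)."""
    d = np.asarray(pos_t, dtype=float) - np.asarray(pos_prev, dtype=float)
    return np.exp(-0.5 * np.dot(d, d) / sigma**2) / (2.0 * np.pi * sigma**2)

# Toy usage: score a candidate correspondence between a cell at (10, 12) in
# frame t-1 and a detection at (13, 15) in frame t.
print(h_gaussian_displacement((13, 15), (10, 12)))
```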

Detection and Track Likelihood
– Detection likelihood for the "leukocyte" video
– Track likelihood for the "leukocyte" video

Tracking Video

Experimental Results

Summary
A single framework:
– No ad hoc combination of detection and tracking
– A variable number of targets is handled automatically, with no ad hoc computation
– Detection and tracking become cooperative, so the performance of each may improve
– No explicit effort is needed to compute the "track-to-measurement" association

Future Plan
– Instead of starting with an initial crude detection of cells, we would like to detect cells dynamically as tracking proceeds.
– Mathematical implication: the dimension of the set d_t becomes a random variable.
– This stochastic dynamic behavior can be modeled within Bayes' rule using a point-process formalism.