Introduction to MCMC and Particle Filters


Today
- Introduction to MCMC
- Particle filters and MCMC
- A simple example of particle filters: ellipse tracking

Introduction to MCMC
- MCMC = Markov Chain Monte Carlo, a sampling technique
- Useful for non-standard distributions that are hard to sample from directly, and in high-dimensional spaces
- Origins in statistical physics in the 1940s; gained popularity in statistics in the late 1980s

Markov chains*
- A series of samples x(1), x(2), … such that each x(i) depends only on x(i-1): p(x(i) | x(i-1), …, x(1)) = T(x(i) | x(i-1))
- Represented using a transition matrix T
- Homogeneous: T is time-invariant
* C. Andrieu et al., “An Introduction to MCMC for Machine Learning”, Mach. Learn., 2003

Markov chains
- Evolution of the marginal distribution: p(x(i)) = Σ_{x(i-1)} T(x(i) | x(i-1)) p(x(i-1))
- Stationary distribution: a p left unchanged by this update
- The chain converges to its stationary distribution if it is irreducible and aperiodic
- Bayes’ theorem

Markov chains
- Detailed balance: p(x(i)) T(x(i-1) | x(i)) = p(x(i-1)) T(x(i) | x(i-1))
- A sufficient condition for stationarity of p
- Interpretation: pair-wise balance of probability-mass transfer between states x(i-1) and x(i)
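Why detailed balance suffices for stationarity can be seen in one line: sum both sides of the condition over x^{(i-1)}, and the transition kernel on the left sums to 1:

```latex
% Detailed balance:
p(x^{(i)})\, T(x^{(i-1)} \mid x^{(i)}) \;=\; p(x^{(i-1)})\, T(x^{(i)} \mid x^{(i-1)})
% Summing over x^{(i-1)} recovers the stationarity condition:
\sum_{x^{(i-1)}} p(x^{(i-1)})\, T(x^{(i)} \mid x^{(i-1)})
\;=\; p(x^{(i)}) \sum_{x^{(i-1)}} T(x^{(i-1)} \mid x^{(i)})
\;=\; p(x^{(i)})
```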

Metropolis-Hastings
- Target distribution p(x): set up a Markov chain whose stationary distribution is p(x)
- Propose x* ~ q(x* | x(i)) (q is easy to sample from)
- Accept x(i+1) = x* with probability A = min{1, [p(x*) q(x(i) | x*)] / [p(x(i)) q(x* | x(i))]}; otherwise x(i+1) = x(i)
- Detailed balance holds by construction, so the resulting chain has the desired stationary distribution
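As a concrete illustration, here is a minimal random-walk Metropolis sampler (our sketch, not from the slides). With a symmetric Gaussian proposal the q-ratio cancels, so the acceptance probability reduces to min(1, p(x*)/p(x)):

```python
import numpy as np

def metropolis_hastings(log_p, x0, n_samples, step=0.5, rng=None):
    """Random-walk Metropolis for an unnormalized log-density log_p.
    The symmetric Gaussian proposal makes q(x|x*)/q(x*|x) = 1."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples,) + x.shape)
    for i in range(n_samples):
        x_prop = x + step * rng.standard_normal(x.shape)
        # Accept/reject in log space to avoid underflow.
        if np.log(rng.uniform()) < log_p(x_prop) - log_p(x):
            x = x_prop
        samples[i] = x
    return samples

# Example: sample a standard normal, log p(x) = -x'x/2 up to a constant.
chain = metropolis_hastings(lambda x: -0.5 * float(x @ x), np.zeros(1), 5000,
                            rng=np.random.default_rng(0))
```

In practice the start of `chain` would be discarded as burn-in and the remainder thinned, as the next slide describes.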

Metropolis-Hastings
- Initial burn-in period: drop the first few samples
- Successive samples are correlated: retain 1 out of every M samples (thinning)
- Monitor the acceptance rate; the choice of proposal distribution q is critical

Monte-Carlo simulations*
- Using N MCMC samples {x(i)}:
- Target density estimation: p(x) ≈ (1/N) Σi δ(x − x(i))
- Expectation: E[f(x)] ≈ (1/N) Σi f(x(i))
- MAP estimation: take the sample x(i) maximizing p(x(i)) (when p is a posterior)
* C. Andrieu et al., “An Introduction to MCMC for Machine Learning”, Mach. Learn., 2003
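A hypothetical illustration of these estimates, using synthetic Gaussian draws in place of real MCMC output (all names and numbers below are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(loc=2.0, scale=1.0, size=10_000)  # stand-in for MCMC samples
log_p = lambda x: -0.5 * (x - 2.0) ** 2                # unnormalized log-density

hist, edges = np.histogram(samples, bins=50, density=True)  # density estimate
expectation = samples.mean()                       # E[x] ~= (1/N) sum_i x(i)
map_estimate = samples[np.argmax(log_p(samples))]  # highest-density sample as MAP proxy
```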

Tracking interacting targets*
- Using particle filters to track multiple interacting targets (ants)
* Khan et al., “MCMC-Based Particle Filtering for Tracking a Variable Number of Interacting Targets”, PAMI, 2005.

Particle filter and MCMC: joint MRF particle filter
- Importance sampling degenerates in high-dimensional spaces: the weights of most particles go to zero
- Instead, MCMC is used to sample particles directly from the posterior distribution

MCMC joint MRF particle filter
- True samples (no weights) at each step; the posterior is the stationary distribution for MCMC
- Proposal density for Metropolis-Hastings (MH): select a target randomly, then sample from that single target’s state proposal density

MCMC joint MRF particle filter
- MCMC-MH iterations are run every time step to obtain particles
- The “one target at a time” proposal has advantages: the acceptance probability is simplified, and only one likelihood evaluation is needed per MH iteration
- Computationally efficient; requires fewer samples compared to SIR
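A much-simplified sketch of one such time step (our construction, not the paper's code). We assume the posterior factorizes per target as prior × likelihood and propose each target's state directly from its predicted prior, so the MH acceptance ratio reduces to a single-target likelihood ratio:

```python
import numpy as np

def mcmc_pf_step(prior_sample, log_lik, n_targets, n_iters, rng):
    """One time step of an MCMC particle filter, one-target-at-a-time moves.
    prior_sample(k): draw target k's state from its predicted prior.
    log_lik(x, k):   log-likelihood of target k being in state x.
    Returns the chain of joint states (one entry per MH iteration)."""
    joint = [prior_sample(k) for k in range(n_targets)]
    chain = []
    for _ in range(n_iters):
        k = int(rng.integers(n_targets))   # pick one target at random
        x_prop = prior_sample(k)           # propose from its predicted prior
        # Only one likelihood evaluation per MH iteration:
        if np.log(rng.uniform()) < log_lik(x_prop, k) - log_lik(joint[k], k):
            joint[k] = x_prop
        chain.append(list(joint))
    return chain
```

After burn-in, the recorded joint states serve as unweighted particles for the next time step.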

Particle filter for pupil (ellipse) tracking
- Pupil center is a feature for eye-gaze estimation
- Track the pupil boundary ellipse
- [Figures: pupil boundary edge points with outliers; the ellipse overlaid on the eye image]

Tracking
- Brute force: detect the ellipse in every video frame with RANSAC; computationally intensive
- Better: detect + track, since the ellipse usually does not change much between adjacent frames
- Principle: detect the ellipse in a frame; predict the ellipse in the next frame; refine the prediction using data from that frame; if the track is lost, re-detect and continue

Particle filter?
- State: ellipse parameters
- Measurements: edge points (the measured data)
- Non-linear dynamics, non-linear measurements

Motion model
- State: ellipse center (x0, y0), orientation θ, semi-axes a and b
- Simple drift with rotation: each parameter perturbed by Gaussian noise
- Could include velocity, acceleration, etc.
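A minimal sketch of this drift-with-rotation model (the noise scales are illustrative assumptions, not values from the slides):

```python
import numpy as np

def propagate(state, rng, sigma_xy=2.0, sigma_theta=0.05, sigma_ab=1.0):
    """Drift the ellipse state (x0, y0, theta, a, b) with Gaussian noise."""
    x0, y0, theta, a, b = state
    return np.array([
        x0 + sigma_xy * rng.standard_normal(),        # center drifts
        y0 + sigma_xy * rng.standard_normal(),
        theta + sigma_theta * rng.standard_normal(),  # small rotation
        max(1e-3, a + sigma_ab * rng.standard_normal()),  # keep axes positive
        max(1e-3, b + sigma_ab * rng.standard_normal()),
    ])
```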

Likelihood
- Exponential in the distance di along the normal at each edge point zi
- di is approximated using the focal bisector distance
- [Figure: edge points z1…z6 with distances d1…d6 to the ellipse boundary]
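This likelihood can be sketched as follows; `boundary_distance` stands in for the focal bisector distance of the next slide, and the scale `lam` is an assumed parameter:

```python
import numpy as np

def log_likelihood(edge_points, boundary_distance, lam=2.0):
    """Product of exponential terms: each edge point z_i contributes
    exp(-d_i / lam), where d_i = boundary_distance(z_i) approximates its
    distance along the normal to the hypothesized ellipse."""
    d = np.array([boundary_distance(z) for z in edge_points])
    return float(np.sum(-d / lam))
```

Points on the hypothesized boundary contribute log-likelihood 0; outliers are penalized in proportion to their distance.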

Focal bisector distance* (FBD)
- Uses the ellipse reflection property: PF′ is a reflection of PF
- Favorable properties: approximates the spatial distance to the ellipse boundary along the normal; no dependence on ellipse size
- [Figure: foci and the focal bisector illustrating the FBD]
* P. L. Rosin, “Analyzing error of fit functions for ellipses”, BMVC 1996.

Implementation details
- Sequential importance re-sampling (SIR)*
- Number of particles: 100
- Weights: likelihood; proposal distribution: mixture of Gaussians
- Expected state is the tracked ellipse (possible to compute a MAP estimate?)
* Khan et al., “MCMC-Based Particle Filtering for Tracking a Variable Number of Interacting Targets”, PAMI, 2005.
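The SIR loop described above can be sketched as follows (our sketch; `propagate` and `log_lik` are placeholders for a motion model and likelihood like those on the earlier slides):

```python
import numpy as np

def sir_step(particles, propagate, log_lik, rng):
    """One sequential importance re-sampling step: predict each particle
    through the motion model, weight by the likelihood, report the expected
    (weighted-mean) state, then resample."""
    pred = np.array([propagate(p, rng) for p in particles])
    logw = np.array([log_lik(p) for p in pred])
    w = np.exp(logw - logw.max())          # stabilized, unnormalized weights
    w /= w.sum()
    expected_state = (w[:, None] * pred).sum(axis=0)  # tracked-ellipse estimate
    idx = rng.choice(len(pred), size=len(pred), p=w)  # multinomial resampling
    return pred[idx], expected_state
```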

Initial results Frame 1: Detect Frame 2: Track Frame 3: Track

Future?
- Incorporate velocity and acceleration into the motion model
- Use a domain-specific motion model: smooth pursuit, saccades, or a combination of them?
- Data association* to reduce the outlier confound
* Forsyth and Ponce, “Computer Vision: A Modern Approach”, Chapter 17.

Thank you!