Rao-Blackwellised Particle Filtering for Dynamic Bayesian Networks
- Arnaud Doucet, Nando de Freitas, Kevin Murphy and Stuart Russell, UAI 2000 -


Outline
 Introduction
 Problem Formulation
 Importance Sampling and Rao-Blackwellisation
 Rao-Blackwellised Particle Filter
 Example
 Conclusion

The famous state estimation algorithms, the Kalman filter and the HMM filter, are only applicable to linear-Gaussian models, and when the state space is large the computational cost becomes prohibitive. Sequential Monte Carlo methods (particle filtering) were introduced (Handschin and Mayne, 1969) to handle general models with large state spaces.

Particle Filtering (PF) = "condensation" = "sequential Monte Carlo" = "survival of the fittest"
 PF can handle any type of probability distribution, nonlinearity, and non-stationarity.
 PFs are powerful sampling-based inference/learning algorithms for DBNs.

Drawback of PF
 Inefficient in high-dimensional spaces (the variance of the estimates becomes very large).
Solution
 Rao-Blackwellisation: sample a subset of the variables, allowing the remainder to be integrated out exactly. By the Rao-Blackwell theorem, the resulting estimates can be shown to have lower variance.

Model: a general state-space model/DBN with hidden variables z_{0:t} and observed variables y_{1:t}. Objective: the joint posterior p(z_{0:t} | y_{1:t}), or the filtering density p(z_t | y_{1:t}). To solve this problem one needs approximation schemes, because the integrals involved are intractable.

Additional assumption in this paper:
 Divide the hidden variables into two groups, z_t = (r_t, x_t).
 The conditional posterior distribution p(x_{0:t} | y_{1:t}, r_{0:t}) is analytically tractable.
 We then only need to focus on estimating p(r_{0:t} | y_{1:t}), which lies in a space of reduced dimension.

Monte Carlo integration: approximate an expectation E[f(z)] = ∫ f(z) p(z | y_{1:t}) dz by the sample average (1/N) Σ_i f(z^(i)), where the z^(i) are N i.i.d. samples drawn from p(z | y_{1:t}).
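As a concrete illustration, here is a minimal Monte Carlo integration sketch in Python (NumPy assumed; the target distribution and test function are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

# Estimate E[f(x)] = integral f(x) p(x) dx with p = N(0, 1) and f(x) = x^2.
# The exact value is Var(x) = 1.
def mc_integrate(f, sampler, n=100_000):
    x = sampler(n)          # draw N i.i.d. samples from p
    return f(x).mean()      # sample average approximates the integral

estimate = mc_integrate(lambda x: x**2, lambda n: rng.standard_normal(n))
print(estimate)  # close to 1.0; the error shrinks as O(1/sqrt(N))
```

The rate O(1/sqrt(N)) is independent of the dimension of x, which is the main appeal of Monte Carlo over deterministic quadrature.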

 But it is generally impossible to sample efficiently from the "target" posterior distribution itself. Importance sampling (an alternative): draw samples from a tractable importance function q(z_{0:t} | y_{1:t}) and correct for the mismatch with the weight function w(z_{0:t}) = p(z_{0:t} | y_{1:t}) / q(z_{0:t} | y_{1:t}).

Point-mass approximation: p̂(dz_{0:t}) = Σ_i w̃^(i) δ(z_{0:t}^(i)), where the normalized importance weights are w̃^(i) = w^(i) / Σ_j w^(j).
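A minimal self-normalized importance sampling sketch (the target, the proposal, and the test function are illustrative assumptions; normalizing the weights means the target density is only needed up to a constant):

```python
import numpy as np

rng = np.random.default_rng(1)

# Target p = N(0, 1), known only up to a constant; proposal q = N(0, 2^2).
# Self-normalized estimate: E_p[f] ~= sum(w_norm_i * f(x_i)).
n = 200_000
x = rng.normal(0.0, 2.0, size=n)              # draw from the proposal q
log_p = -0.5 * x**2                           # unnormalized log target
log_q = -0.5 * (x / 2.0)**2 - np.log(2.0)     # log proposal (shared const dropped)
w = np.exp(log_p - log_q)                     # importance weights w = p/q
w_norm = w / w.sum()                          # normalized weights -> point-mass approx
est = np.sum(w_norm * x**2)                   # estimates E_p[x^2] = 1
print(est)
```

Working in log space before exponentiating, as above, avoids underflow when the weights span many orders of magnitude.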

In the case where x_{0:t} can be marginalized out analytically (conditional on r_{0:t} and y_{1:t}), we only need to sample r_{0:t}.

We can then estimate the quantities of interest with reduced variance, since the analytically marginalized part contributes no Monte Carlo error (Rao-Blackwellisation).
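The variance reduction from conditioning can be demonstrated numerically. This sketch (the model and function names are illustrative assumptions) compares a crude Monte Carlo estimator against its Rao-Blackwellised counterpart on a toy two-variable model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Model: r ~ N(0, 1), x | r ~ N(r, 1).  Estimate E[x^2] (= E[r^2] + 1 = 2).
# Crude: sample both (r, x) and average x^2.
# Rao-Blackwellised: sample r only and average E[x^2 | r] = r^2 + 1 (exact).
def crude(n):
    r = rng.standard_normal(n)
    x = r + rng.standard_normal(n)
    return (x**2).mean()

def rao_blackwellised(n):
    r = rng.standard_normal(n)
    return (r**2 + 1.0).mean()          # inner expectation done analytically

runs = 500
v_crude = np.var([crude(1_000) for _ in range(runs)])
v_rb = np.var([rao_blackwellised(1_000) for _ in range(runs)])
print(v_rb < v_crude)  # conditioning never increases the variance
```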

Sequential Importance Sampling
 Restrict the importance function to the recursive form q(z_{0:t} | y_{1:t}) = q(z_{0:t-1} | y_{1:t-1}) q(z_t | z_{0:t-1}, y_{1:t}).
 We then obtain recursive formulas: w_t = w_{t-1} × (incremental weight), where the "incremental weight" is given by p(y_t | z_t) p(z_t | z_{t-1}) / q(z_t | z_{0:t-1}, y_{1:t}).
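The recursion above can be sketched as code. This toy filter (a 1-D Gaussian random walk with Gaussian observations, an illustrative assumption) uses the prior as importance function, so the incremental weight reduces to the likelihood p(y_t | z_t):

```python
import numpy as np

rng = np.random.default_rng(3)

# Sequential importance sampling on a 1-D Gaussian random walk:
#   x_t = x_{t-1} + v_t,  v_t ~ N(0, q)   (dynamics)
#   y_t = x_t + e_t,      e_t ~ N(0, r)   (observation)
def sis(y, n_particles=2_000, q=0.1, r=0.5):
    x = np.zeros(n_particles)
    log_w = np.zeros(n_particles)
    estimates = []
    for y_t in y:
        x = x + rng.normal(0.0, np.sqrt(q), n_particles)  # propose from the prior
        log_w += -0.5 * (y_t - x)**2 / r                  # incremental weight (log)
        w = np.exp(log_w - log_w.max())                   # stabilize, then normalize
        w /= w.sum()
        estimates.append(np.sum(w * x))                   # filtered mean E[x_t | y_1:t]
    return np.array(estimates)

# Simulate a trajectory and run the filter on its noisy observations.
T = 20
x_true = np.cumsum(rng.normal(0.0, np.sqrt(0.1), T))
y = x_true + rng.normal(0.0, np.sqrt(0.5), T)
est = sis(y)
print(np.abs(est - x_true).mean())
```

Note that pure SIS degenerates as t grows: a few particles accumulate nearly all the weight, which is exactly what the selection (resampling) step below addresses.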

Choice of importance distribution
 The simplest choice is to sample from the prior p(z_t | z_{t-1}), but this can be inefficient, since it ignores the most recent evidence y_t.
 The "optimal" importance distribution, p(z_t | z_{t-1}, y_t), is the one minimizing the variance of the importance weights.

But the optimal distribution is often too expensive to compute. Several deterministic approximations to it have been proposed; see for example (de Freitas 1999, Doucet 1998). Selection step
 Resampling: eliminate samples with low importance weights and multiply samples with high importance weights (e.g., residual sampling, stratified sampling, multinomial sampling).
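One common selection scheme, stratified resampling, can be sketched as follows (the function name and example weights are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Resampling replicates high-weight particles, drops low-weight ones, and
# resets all weights to 1/N.  Stratified resampling draws one uniform per
# stratum [i/N, (i+1)/N), giving lower variance than plain multinomial.
def stratified_resample(weights):
    n = len(weights)
    positions = (rng.random(n) + np.arange(n)) / n   # one draw per stratum
    return np.searchsorted(np.cumsum(weights), positions)

w = np.array([0.01, 0.01, 0.48, 0.02, 0.48])
idx = stratified_resample(w / w.sum())
print(idx)  # dominated by indices 2 and 4, the high-weight particles
```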

Goal: it is possible to simulate the nonlinear parameters and to compute the linear coefficients analytically using Kalman filters. This is because the output of the neural network is linear in those coefficients, one per basis function.
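The Rao-Blackwellised particle filter can be sketched on a simple conditionally linear-Gaussian model (the two-regime toy model below is an illustrative assumption, not the slides' neural-network example): particles sample the discrete part r_t, while a Kalman filter per particle handles the linear-Gaussian part x_t exactly, and the weight is the Kalman marginal likelihood.

```python
import numpy as np

rng = np.random.default_rng(5)

# Model: r_t in {0, 1} is a two-state Markov regime (sampled by particles),
#   x_t = x_{t-1} + drift[r_t] + v_t, v_t ~ N(0, q)   (linear given r_t)
#   y_t = x_t + e_t,                  e_t ~ N(0, rv)
# Each particle carries (r_t, Kalman mean m, Kalman variance P); x is
# integrated out exactly, never sampled.
drift, q, rv = np.array([-0.5, 0.5]), 0.05, 0.2
P_trans = np.array([[0.95, 0.05], [0.05, 0.95]])     # regime transition matrix

def rbpf_step(r, m, P, w, y_t):
    n = len(r)
    r = np.array([rng.choice(2, p=P_trans[ri]) for ri in r])   # sample regimes
    m_pred = m + drift[r]                  # Kalman predict (per particle)
    P_pred = P + q
    S = P_pred + rv                        # innovation variance
    w = w * np.exp(-0.5 * (y_t - m_pred)**2 / S) / np.sqrt(S)  # marginal lik.
    K = P_pred / S                         # Kalman gain
    m = m_pred + K * (y_t - m_pred)        # Kalman update
    P = (1.0 - K) * P_pred
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)       # multinomial resampling
    return r[idx], m[idx], P[idx], np.full(n, 1.0 / n)

n = 500
r = np.zeros(n, dtype=int); m = np.zeros(n); P = np.ones(n); w = np.full(n, 1.0 / n)
for y_t in [0.4, 1.1, 1.5, 2.2, 2.6]:      # observations trending upward
    r, m, P, w = rbpf_step(r, m, P, w, y_t)
print((r == 1).mean())  # most particles land in the upward-drift regime
```

Because only the discrete regime is sampled, the particles live in a much smaller space than a plain particle filter over (r_t, x_t) would need, which is the variance reduction the Rao-Blackwell theorem promises.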

Successful applications
 Conditionally linear-Gaussian state-space models
 Conditionally finite state-space HMMs
Possible extensions
 Dynamic models for counting observations
 Dynamic models with a time-varying unknown covariance matrix for the dynamic noise
 Classes of exponential-family state-space models, etc.