SIS – Sequential Importance Sampling
Advanced Methods in Simulation (096320), Winter 2009
Presented by: Chen Bukay, Ella Pemov, Amit Dvash

Talk Layout
- SIS – Overview and algorithm
- Random walk – SIS simulation
- Nonlinear Filtering – Overview & added value
- Nonlinear Filtering – Simulation

Importance Sampling - General Overview
- Importance sampling is one of the most fundamental variance reduction techniques.
- It can lead to a dramatic variance reduction, particularly when estimating rare-event probabilities.
- Target: the expected performance
  $\ell = \mathbb{E}_f[H(X)] = \int H(x)\,\frac{f(x)}{g(x)}\,g(x)\,dx = \mathbb{E}_g[H(X)\,W(X)]$,
  where $H(x)$ is the sample performance, $f(x)$ is the probability density of $X$, $g(x)$ is the importance sampling density, and $W(x) = f(x)/g(x)$ is the likelihood ratio.
- Likelihood ratio estimator: $\hat{\ell} = \frac{1}{N}\sum_{i=1}^{N} H(X_i)\,W(X_i)$, with $X_1, \dots, X_N$ drawn from $g$.
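To make the idea concrete, here is a minimal Python sketch (our illustration, not from the slides) estimating the rare-event probability $P(X > 4)$ for $X \sim N(0,1)$; the shifted proposal $g = N(4,1)$ is an assumption chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Target: ell = P(X > 4) for X ~ N(0, 1); true value is about 3.17e-5.
# Crude Monte Carlo: almost no samples ever land in the rare region.
x = rng.standard_normal(N)
crude = np.mean(x > 4)

# Importance sampling: draw from g = N(4, 1), which concentrates samples
# in the rare region, then reweight by the likelihood ratio W = f/g.
y = rng.normal(loc=4.0, scale=1.0, size=N)
log_w = -0.5 * y**2 + 0.5 * (y - 4.0) ** 2   # log f(y) - log g(y)
is_est = np.mean((y > 4) * np.exp(log_w))

print(f"crude MC: {crude:.2e}, importance sampling: {is_est:.2e}")
```

With the same sample budget, the importance sampling estimate has far lower relative error because every proposal sample carries information about the rare event.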

SIS - Overview
- Sequential importance sampling, also known as “dynamic importance sampling”.
- Simply means importance sampling that is carried out in a sequential manner.
- Why sequential?
  - It is problematic to sample a multi-dimensional vector all at once.
  - There is dependency between the variables.
  - It is difficult to sample from the joint density $g(x)$ directly.

SIS - Overview
- Assumptions:
  - $X = (X_1, \dots, X_n)$ is decomposable.
  - $g(x)$ can be presented in product form:
    $g(x) = g_1(x_1)\, g_2(x_2 \mid x_1) \cdots g_n(x_n \mid x_1, \dots, x_{n-1})$
  - It is then easy to sample from $g(x)$ sequentially, one component at a time.

SIS – Overview (cont’)
- It is easy to generate sequentially from $g$:
  - Generate $X_1$ from $g_1(x_1)$,
  - Generate $X_2$ from $g_2(x_2 \mid X_1)$,
  - … and so on up to $X_n$; we get $X = (X_1, \dots, X_n) \sim g$.
- Due to the product rule of probability we can write the target density the same way:
  $f(x) = f_1(x_1)\, f_2(x_2 \mid x_1) \cdots f_n(x_n \mid x_1, \dots, x_{n-1})$
- The likelihood function (ratio) therefore factors into one-dimensional terms:
  $W(x) = \frac{f(x)}{g(x)} = \prod_{t=1}^{n} \frac{f_t(x_t \mid x_{1:t-1})}{g_t(x_t \mid x_{1:t-1})}$

SIS – Overview (cont’)
- The likelihood ratio can be updated recursively:
  $\underbrace{W_t}_{\text{likelihood till time } t} = u_t \cdot \underbrace{W_{t-1}}_{\text{likelihood till time } t-1}, \qquad u_t = \frac{f_t(x_t \mid x_{1:t-1})}{g_t(x_t \mid x_{1:t-1})}$

SIS – Overview
- In order to update the likelihood ratio recursively, we need to know how to calculate $f_t(x_t \mid x_{1:t-1})$.
- We know the joint density $f(x)$, but calculating $f_t(x_t \mid x_{1:t-1})$ from it requires integrating over the future components $x_{t+1}, \dots, x_n$ – a hard integral.
- One option to solve this: use a sequence of auxiliary pdfs $f_1(x_1), f_2(x_{1:2}), \dots, f_n(x_{1:n})$ that can be easily evaluated, where each $f_t(x_{1:t})$ is a good approximation to the considered hard integral and $f_n(x_{1:n}) = f(x)$.
- The incremental weight is then easy to calculate:
  $u_t = \frac{f_t(x_{1:t})}{f_{t-1}(x_{1:t-1})\, g_t(x_t \mid x_{1:t-1})}$

SIS – Algorithm
SIS algorithm (sequential):
1. For each $t = 1, \dots, n$, sample $X_t$ from $g_t(x_t \mid x_{1:t-1})$.
2. Compute $W_t = u_t W_{t-1}$, where $W_0 = 1$ and $u_t = \frac{f_t(x_{1:t})}{f_{t-1}(x_{1:t-1})\, g_t(x_t \mid x_{1:t-1})}$.
3. Repeat N times and estimate $\ell$ via $\hat{\ell} = \frac{1}{N}\sum_{i=1}^{N} H(X^{(i)})\, W_n^{(i)}$.

SIS algorithm (dynamic):
1. At time $t$, the $t$-th component arrives.
2. Sample $x_t$ N times (once per trajectory) according to $g_t(x_t \mid x_{1:t-1})$.
3. Calculate $W_t^{(i)} = u_t^{(i)} W_{t-1}^{(i)}$ for $i = 1, \dots, N$.
4. Estimate $\ell$ according to the existing samples $(1, \dots, t)$, for $t = 1, \dots, n$.
The dynamic formulation lends itself to parallel computing across the N trajectories; a generic code sketch of the sequential version follows below.
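As an illustration, a minimal generic Python sketch of the sequential algorithm (our own sketch, not from the slides); `sample_g`, `weight_u`, and `H` are hypothetical user-supplied callables for the proposal $g_t$, the incremental weight $u_t$, and the sample performance:

```python
import numpy as np

def sis_estimate(sample_g, weight_u, H, n, N, seed=0):
    """Generic sequential importance sampling estimator.

    sample_g(t, x_prev, rng) -> x_t   : draws X_t ~ g_t(. | x_1:t-1)
    weight_u(t, x_path)      -> u_t   : incremental likelihood ratio at time t
    H(x_path)                -> float : sample performance of a full trajectory
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(N):
        x, W = [], 1.0
        for t in range(1, n + 1):
            x.append(sample_g(t, x, rng))   # step 1: sample component t
            W *= weight_u(t, x)             # step 2: update likelihood ratio
        total += H(x) * W                   # accumulate H(X) * W(X)
    return total / N                        # step 3: average over N runs
```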

SIS Algorithm - Sequential
[Diagram: the 1st, 2nd, …, N-th sample trajectories are each generated in full; for each trajectory $i$ we calculate $W_n^{(i)}$ and $H(X^{(i)})$, and after the N-th sample we estimate $\hat{\ell}$ by computing the average of $H(X^{(i)})\, W_n^{(i)}$.]

SIS Algorithm - Dynamic
[Diagram: at time $t = 1$, each of the 1st, …, N-th trajectories draws its first component and calculates $W_1^{(i)}$; at time $t = 2$, each trajectory is extended by one component and its likelihood ratio is recalculated; this repeats until time $t = n$, when $\hat{\ell}$ is estimated by computing the average with the existing samples.]

Random Walk
- Problem statement (reminder):
  - Go forward (+1) with probability $p$.
  - Go backward (−1) with probability $q = 1 - p$.
  - $p < q$, so the walk has a drift toward $-\infty$.
- Goal: estimate the rare-event probability of reaching state $K$ (a large number) before state 0 (zero), starting at state $k$ (see the sketch below).
- State line: $0, 1, 2, \dots, k\ (\text{start}), \dots, K$.
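A minimal Python sketch of an SIS estimator for this event (our illustration; the parameter values are assumptions, not taken from the slides). It uses the standard exponential change of measure for a random walk, which simply swaps $p$ and $q$ so that the proposal walk drifts toward $K$:

```python
import numpy as np

def rw_rare_prob_sis(p, k, K, N=10_000, seed=0):
    """Estimate P(hit K before 0 | start at k) for a walk with
    P(+1) = p < q = P(-1), simulating under the swapped (tilted)
    walk with P(+1) = q and reweighting by the likelihood ratio."""
    q = 1.0 - p
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(N):
        x, log_w = k, 0.0
        while 0 < x < K:
            if rng.random() < q:          # proposal: step up w.p. q
                x += 1
                log_w += np.log(p / q)    # f/g contribution of an up step
            else:
                x -= 1
                log_w += np.log(q / p)    # f/g contribution of a down step
        if x == K:                        # the rare event occurred
            total += np.exp(log_w)        # equals (p/q)**(K - k) here
    return total / N

# Illustrative check against the exact gambler's-ruin formula:
est = rw_rare_prob_sis(p=0.4, k=5, K=20)
r = 0.6 / 0.4                             # q/p
exact = (1 - r**5) / (1 - r**20)
print(f"SIS estimate: {est:.3e}, exact: {exact:.3e}")
```

Because every path that reaches $K$ carries the same weight $(p/q)^{K-k}$, this estimator has very low variance compared with crude Monte Carlo.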

Random Walk – Simulation Result

SIS Application: Nonlinear Filtering

State Space Models
[Diagram: a hidden Markov chain $x_1 \to x_2 \to x_3 \to \cdots \to x_T$, where each hidden state $x_t$ emits an observation $y_t$ ($y_1, y_2, y_3, \dots$).]

Dynamic Model
- State equation: $x_t = f(x_{t-1}, w_t)$.
- Measurement (observation) equation: $y_t = h(x_t, v_t)$.
- Together these form a hidden Markov model (HMM).

State Space Models cont’
- The state noise $w_t$ has a known pdf $p_w$, and the measurement noise $v_t$ has a known pdf $p_v$.
- Markov property: $p(x_t \mid x_{1:t-1}) = p(x_t \mid x_{t-1})$, and each observation $y_t$ depends only on the current state $x_t$.

Linear Models – Kalman Filter
- Linear dynamic model: $x_t = A x_{t-1} + w_t$.
- Linear measurement equation: $y_t = H x_t + v_t$.
- $v$, $w$, $x_0$ are Gaussian and independent.
- Under these assumptions the Kalman filter is the optimal estimator (MSE); a code sketch follows below.
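For reference, a minimal Python sketch of one Kalman filter cycle (our illustration, not from the slides; matrix names follow the equations above):

```python
import numpy as np

def kalman_step(m, P, y, A, H, Q, R):
    """One predict/update cycle of the Kalman filter.

    m, P : posterior mean and covariance of x_{t-1}
    y    : new measurement y_t
    A, H : state-transition and measurement matrices
    Q, R : covariances of the process noise w and measurement noise v
    """
    # Predict: propagate the Gaussian through the linear dynamics.
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    # Update: condition on y_t; the posterior stays Gaussian.
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = (np.eye(len(m)) - K @ H) @ P_pred
    return m_new, P_new
```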

General Models
- Assumed models:
  - Motion model: linear or nonlinear state dynamics.
  - Linear or nonlinear measurement equations.
  - $v$, $w$, $x_0$ are independent, but not necessarily Gaussian.

Problem Description
[Diagram: three observers at known, exact locations $(x_a, y_a)$, $(x_b, y_b)$, $(x_c, y_c)$ measure the bearings $\theta_a$, $\theta_b$, $\theta_c$ to a target at an unknown location $(x_e, y_e)$; each bearing defines an LOP – a Line Of Position.]

Bearing Only Measurements
[Diagram: the same geometry, showing the bearings $\theta_a$, $\theta_b$, $\theta_c$ from the observers $(x_a, y_a)$, $(x_b, y_b)$, $(x_c, y_c)$ to the target $(x_e, y_e)$.]

Bearing Only Measurements
- Each observer measures only the bearing to the target:
  $\theta_i = \arctan\!\left(\frac{y_e - y_i}{x_e - x_i}\right) + v_i, \qquad i \in \{a, b, c\},$
  where $v_i$ is the bearing measurement noise. A small code helper for this model follows below.
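A small Python helper for this measurement model (a sketch under assumed conventions; the noise level is an assumption), which the particle filter sketch further below can reuse as its measurement function:

```python
import numpy as np

def bearings(target_xy, observers_xy, sigma_deg=1.0, rng=None):
    """Bearings from each observer to the target, in radians.

    target_xy    : (2,) target position (x_e, y_e)
    observers_xy : (M, 2) known observer positions
    sigma_deg    : bearing noise std in degrees (assumed value)
    """
    d = target_xy - observers_xy              # (M, 2) offsets to the target
    theta = np.arctan2(d[:, 1], d[:, 0])      # noise-free bearings
    if rng is not None:                       # optionally add Gaussian noise
        theta = theta + rng.normal(0.0, np.deg2rad(sigma_deg), size=len(theta))
    return theta
```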

Non-Linear Filtering
- Motivation:
  - Nonlinear dynamic/measurement equations.
  - Noise distributions that are not Gaussian.
- The Kalman filter:
  - Is no longer the optimal estimator (MSE).
  - EKF – linearization of the state-space equations:
    - A suboptimal estimator.
    - Convergence is not guaranteed.

The Bootstrap Filter
- Represents the pdf as a set of random samples (and not as a function).
- The bootstrap filter is a recursive algorithm for propagating and updating these samples.
- Samples are naturally concentrated in regions of high probability.
Reference: “Novel approach to nonlinear/non-Gaussian Bayesian state estimation”, N.J. Gordon, D.J. Salmond & A.F.M. Smith.

Motivation for Having $p(x_k \mid y_{1:k})$
- Given the full posterior, any point estimate can be derived, e.g.:
  - MSE: the posterior mean $\mathbb{E}[x_k \mid y_{1:k}]$.
  - ML: the mode of the posterior.

The Bootstrap Filter – Recursive Calculation of $p(x_k \mid y_{1:k})$
- Assume we know $p(x_{k-1} \mid y_{1:k-1})$.
- Prediction (using the Markov property):
  $p(x_k \mid y_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid y_{1:k-1})\, dx_{k-1}$
- Update (Bayes rule, and $y_k \mid x_k$ independent of $y_{1:k-1}$):
  $p(x_k \mid y_{1:k}) = \frac{p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})}{p(y_k \mid y_{1:k-1})}$

The Importance Sampling

The Bootstrap Filter – Algorithm
1. Initialization: $k = 0$; generate $x_0^i \sim p_{x_0}$, $i = 1, \dots, N$.
2. Measurement update: given $y_k$, calculate the likelihood of each current sample and normalize:
   $q_k^i = \frac{p(y_k \mid x_k^{*i})}{\sum_{j=1}^{N} p(y_k \mid x_k^{*j})}$

The Bootstrap Filter – Algorithm (cont’)
3. Re-sampling: draw N samples from $\{x_k^{*i}\}_{i=1}^{N}$ with replacement, where the probability of choosing the $i$-th particle at stage $k$ is $q_k^i$.
4. Prediction: pass the new samples through the system equation to obtain $\{x_{k+1}^{*i}\}$.
5. Set $k = k + 1$ and return to step 2.
A minimal code sketch of one cycle is given below.
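As an illustration only (a generic sketch, not the authors' code), one bootstrap filter cycle in Python; `likelihood` and `propagate` are hypothetical user-supplied functions implementing $p(y_k \mid x_k)$ and the system equation:

```python
import numpy as np

def bootstrap_step(particles, y, likelihood, propagate, rng):
    """One measurement-update / resample / predict cycle.

    particles  : (N, d) samples x_k^{*i} from the prior p(x_k | y_1:k-1)
    y          : current measurement y_k
    likelihood : likelihood(y, particles) -> (N,) values of p(y_k | x_k^{*i})
    propagate  : propagate(particles, rng) -> (N, d) samples of x_{k+1}
    """
    N = len(particles)
    # 2. Measurement update: weight each particle by its likelihood.
    q = likelihood(y, particles)
    q = q / q.sum()
    # 3. Re-sampling: draw N indices with replacement, probabilities q_k^i.
    idx = rng.choice(N, size=N, p=q)
    # 4. Prediction: pass the chosen samples through the system equation.
    return propagate(particles[idx], rng)
```

For the bearing-only problem, `likelihood` would evaluate a Gaussian density of the measured bearings around those predicted by the `bearings` helper above, and `propagate` would apply a constant-velocity motion model with process noise.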

Simulation Setup
- Target velocity: $V_x = -0.1$ km/sec, $V_y = 0.01$ km/sec.
- Time step: $dt = 300$ sec.
- Measurement (bearing) variance: $\sim 1°$.
- Positions given on the slide: $(141, 141)$ km and $(100, 120)$ km.

[Two sequences of simulation result frames, numbered 1–8 and 1–7; the plots themselves are not recoverable from the transcript.]

Simulations

150 time steps

Backup

Markov Chain
- Markov property:
  - Given the present state, future states are independent of the past states.
  - The present state fully captures all the information that could influence the future evolution of the process.
- The changes of state are called transitions, and the probabilities associated with the various state changes are called transition probabilities, collected in the transition matrix $P = [p_{ij}]$, where $p_{ij} = \Pr(X_{t+1} = j \mid X_t = i)$.
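As a concrete instance (our own illustration, not from the slides), the random walk on $\{0, 1, \dots, K\}$ from the simulation section has a transition matrix with absorbing boundary states:

```latex
P =
\begin{pmatrix}
1      & 0      & 0      & \cdots & 0 \\
q      & 0      & p      & \cdots & 0 \\
0      & q      & 0      & \ddots & \vdots \\
\vdots &        & \ddots & \ddots & p \\
0      & \cdots & 0      & 0      & 1
\end{pmatrix},
\qquad
p_{i,i+1} = p, \quad p_{i,i-1} = q \ \ (0 < i < K), \qquad p_{00} = p_{KK} = 1 .
```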

F(X) - calculations

Where to Put the Markov Property and Bayes Rule
- In the recursive calculation of $p(x_k \mid y_{1:k})$:
  - Bayes rule gives the update $p(x_k \mid y_{1:k}) \propto p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})$.
  - The Markov property, $p(x_k \mid x_{1:k-1}) = p(x_k \mid x_{k-1})$, is used inside the prediction integral.
  - The denominator $p(y_k \mid y_{1:k-1})$ is the normalization constant.

Where to Put the Markov Property and Bayes Rule (cont’)