Particle Filters
Outline
Introduction to particle filters
Recursive Bayesian estimation
Bayesian importance sampling
Sequential importance sampling (SIS)
Sampling importance resampling (SIR)
Improvements to SIR
On-line Markov chain Monte Carlo
Basic particle filter algorithm
Example for robot localization
Conclusions
But what if the distribution in our problem is not Gaussian?
Motivation for particle filters
Key Idea of Particle Filters
The idea: concentrate more samples in the regions where we expect to find the solution.
Motion Model Reminder
The density of samples represents the expected probability of the robot's location.
Global Localization of Robot with Sonar (http://www.cs.washington)
This is the lost robot problem
Particles are used for probability density function approximation
Function Approximation
Particle sets can be used to approximate functions.
The more particles fall into an interval, the higher the probability of that interval.
How to draw samples from a function/distribution?
Importance Sampling Principle
Weight: w = f / g
f is often called the target distribution
g is often called the proposal distribution
Pre-condition: f(x) > 0 implies g(x) > 0
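As an illustration (not from the original slides), here is a minimal Python sketch of this principle: samples are drawn from a proposal g and weighted by f/g to estimate an expectation under the target f. The particular bimodal target and Gaussian proposal are assumptions chosen for the example.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Target density f (assumed bimodal, for illustration only)
def f(x):
    return 0.5 * norm.pdf(x, -2.0, 0.5) + 0.5 * norm.pdf(x, 2.0, 0.5)

# Proposal density g: a wide Gaussian that covers the support of f
def g(x):
    return norm.pdf(x, 0.0, 3.0)

N = 10_000
x = rng.normal(0.0, 3.0, size=N)   # draw samples from the proposal g
w = f(x) / g(x)                    # importance weights w = f / g

# Estimate E_f[x^2] with the self-normalized importance-sampling estimator
estimate = np.sum(w * x**2) / np.sum(w)
print("E_f[x^2] ~", estimate)
```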
Importance sampling: another example of calculating sample weights
How do we formally calculate the value of f/g?
Importance Sampling Formulas for f, g and f/g
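The formulas themselves are not reproduced in the transcript; the standard relations they presumably show are, in LaTeX:

```latex
% Rewrite an expectation under the target f as one under the proposal g,
% then estimate it from samples drawn from g.
\mathbb{E}_f[h(x)] = \int h(x)\, f(x)\, dx
                   = \int h(x)\, \frac{f(x)}{g(x)}\, g(x)\, dx
                   \approx \frac{1}{N} \sum_{i=1}^{N} w^{(i)}\, h\!\left(x^{(i)}\right),
\qquad x^{(i)} \sim g, \qquad w^{(i)} = \frac{f\!\left(x^{(i)}\right)}{g\!\left(x^{(i)}\right)}.
```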
History of Monte Carlo Idea and especially Particle Filters
First attempts – simulations of growing polymers:
M. N. Rosenbluth and A. W. Rosenbluth, “Monte Carlo calculation of the average extension of molecular chains,” Journal of Chemical Physics, vol. 23, no. 2, pp. 356–359, 1956.
First application in signal processing:
N. J. Gordon, D. J. Salmond, and A. F. M. Smith, “Novel approach to nonlinear/non-Gaussian Bayesian state estimation,” IEE Proceedings-F, vol. 140, no. 2, pp. 107–113, 1993.
Books:
A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001.
B. Ristic, S. Arulampalam, and N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
Tutorials:
M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking,” IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.
What is the problem that we want to solve?
The problem is tracking the state of a system as it evolves over time, given sequentially arriving (noisy or ambiguous) observations.
We want to know: the best possible estimate of the hidden variables.
Solution: Sequential Update
Storing and processing all incoming measurements is inconvenient and may be impossible.
Recursive filtering: predict the next state pdf from the current estimate, then update the prediction using sequentially arriving new measurements.
The optimal Bayesian solution recursively calculates the exact posterior density.
These ideas lead to various particle filters.
Particle Filters
Sequential Monte Carlo methods for on-line learning within a Bayesian framework.
Also known as:
Particle filters
Sequential sampling-importance resampling (SIR)
Bootstrap filters
Condensation trackers
Interacting particle approximations
Survival of the fittest
Particle Filter characteristics
Approaches to Particle Filters
METAPHORS
Particle Filters: Sequential and Monte Carlo Properties
Belief is represented by a set of samples, or particles:
Bel(x_t) ≈ { ⟨x_t^{(i)}, w_t^{(i)}⟩ : i = 1, …, N }
where the w_t^{(i)} are nonnegative weights called importance factors.
The updating procedure is sequential importance sampling with re-sampling.
Tracking in 1D: the blue trajectory is the target. The best of 10 particles is shown in red.
A Short, More Formal Introduction to Particle Filters and Monte Carlo Localization
Proximity Sensor Model Reminder
Particle filtering ideas
Recursive Bayesian filter by Monte Carlo sampling.
The idea: represent the posterior density by a set of random particles with associated weights, and compute estimates based on these samples and weights.
[Figure: weighted particles in the sample space approximating the posterior density]
Particle filtering ideas
Particle filters are based on the recursive generation of random measures that approximate the distributions of the unknowns.
Random measures: particles and importance weights.
As new observations become available, the particles and the weights are propagated by exploiting Bayes' theorem.
[Figure: weighted particles in the sample space approximating the posterior density]
Mathematical tools needed for Particle Filters
Recall “law of total probability” and “Bayes’ rule”
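For reference, the two tools in their standard (continuous) forms, which the slide itself does not reproduce:

```latex
% Law of total probability and Bayes' rule
p(x) = \int p(x \mid y)\, p(y)\, dy,
\qquad\qquad
p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)}.
```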
Recursive Bayesian estimation (I)
Recursive filter:
System model: x_k = f_k(x_{k-1}, v_{k-1}), the state evolution driven by process noise v_{k-1}
Measurement model: z_k = h_k(x_k, n_k), the observation corrupted by measurement noise n_k
Information available: the measurements z_{1:k} = {z_1, …, z_k}
Recursive Bayesian estimation (II)
Seek the posterior p(x_{k+i} | z_{1:k}):
i = 0: filtering; i > 0: prediction; i < 0: smoothing.
Prediction (Chapman–Kolmogorov equation):
p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}
since, by the Markov property, p(x_k | x_{k-1}, z_{1:k-1}) = p(x_k | x_{k-1}).
Recursive Bayesian estimation (III)
Update (Bayes' rule):
p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k-1}) / p(z_k | z_{1:k-1})
where the normalizing constant is
p(z_k | z_{1:k-1}) = ∫ p(z_k | x_k) p(x_k | z_{1:k-1}) dx_k
since the measurements are conditionally independent given the state: p(z_k | x_k, z_{1:k-1}) = p(z_k | x_k).
Bayes Filters (second pass)
Estimating the system state from noisy observations.
System state dynamics: x_t ~ p(x_t | x_{t-1})
Observation dynamics: z_t ~ p(z_t | x_t)
We are interested in the belief, or posterior density, p(x_t | z_{1:t}).
From the above we construct the two steps of the Bayes filter
Predict: p(x_t | z_{1:t-1}) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_{1:t-1}) dx_{t-1}
Update: p(x_t | z_{1:t}) ∝ p(z_t | x_t) p(x_t | z_{1:t-1})
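To make the two steps concrete, here is a small Python sketch (not from the slides) that runs the predict/update recursion on a discretized 1-D state space; the Gaussian motion and measurement models and all parameter values are assumptions of this example.

```python
import numpy as np

# Discretized 1-D state space
xs = np.linspace(-10.0, 10.0, 401)
dx = xs[1] - xs[0]

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Initial belief p(x_0): broad Gaussian (an assumption for this demo)
belief = gaussian(xs, 0.0, 5.0)
belief /= belief.sum() * dx

def predict(belief, motion_sigma=1.0):
    # p(x_t | z_{1:t-1}) = integral of p(x_t | x_{t-1}) p(x_{t-1} | z_{1:t-1}) dx_{t-1}
    return np.array([np.sum(gaussian(xj, xs, motion_sigma) * belief) * dx for xj in xs])

def update(pred, z, meas_sigma=0.5):
    # p(x_t | z_{1:t}) is proportional to p(z_t | x_t) p(x_t | z_{1:t-1})
    post = gaussian(z, xs, meas_sigma) * pred
    return post / (post.sum() * dx)

belief = update(predict(belief), z=2.0)
print("posterior mean ~", np.sum(xs * belief) * dx)
```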
Assumptions: Markov Process
The state is Markov: p(x_t | x_{0:t-1}, z_{1:t-1}) = p(x_t | x_{t-1}), and each observation depends only on the current state: p(z_t | x_{0:t}, z_{1:t-1}) = p(z_t | x_t).
Under these assumptions, the predict and update equations above follow.
How to use it? What else to know?
Bayes Filter
Motion model: p(x_t | x_{t-1}, u_{t-1})
Perceptual model: p(z_t | x_t)
Start from: an initial belief Bel(x_0)
Particle Filters: Compare Gaussian and Particle Filters
Example 1: theoretical PDF
Step 0: initialization
Step 1: updating
Example 2: Particle Filter
Step 0: initialization. Each particle has the same weight.
Step 1: updating weights. Weights are proportional to p(z|x).
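A minimal Python sketch of these two steps (illustrative; the uniform initialization range and the Gaussian form assumed for p(z|x) are not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000

# Step 0: initialization -- spread particles over the state space, equal weights
particles = rng.uniform(-10.0, 10.0, size=N)
weights = np.full(N, 1.0 / N)

# Step 1: updating weights -- weights proportional to the likelihood p(z | x)
def likelihood(z, x, sigma=1.0):
    # assumed Gaussian measurement model, for illustration only
    return np.exp(-0.5 * ((z - x) / sigma) ** 2)

z = 2.5                                  # an observed measurement
weights *= likelihood(z, particles)      # w_i proportional to p(z | x_i)
weights /= np.sum(weights)               # normalize so the weights sum to 1
```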
Example 1 (continued)
Step 2: predicting
Step 3: updating
Robot Motion
Example 2: Particle Filter
Step 2: predicting. Predict the new locations of the particles.
Step 3: updating weights. Weights are proportional to p(z|x).
Step 4: predicting. Predict the new locations of the particles.
Particles become more concentrated in the region where the person is more likely to be.
Robot Motion
Compare Particle Filter with Bayes Filter with Known Distribution
[Side-by-side comparison of the updating and predicting steps for Example 1 (theoretical PDF) and Example 2 (particle filter)]
Classical approximations
Analytical methods: extended Kalman filter, Gaussian sums, … (Alspach et al. 1971). These perform poorly in numerous cases of interest.
Numerical methods: point-mass approximations, splines (Bucy 1971, de Figueiro 1974, …). These are very complex to implement and not flexible.
Monte Carlo Localization
Mobile Robot Localization
Each particle is a potential pose of the robot Proposal distribution is the motion model of the robot (prediction step) The observation model is used to compute the importance weight (correction step)
Sample-based Localization (sonar)
Random samples and the pdf (I)
Take p(x) = Gamma(4,1).
Generate some random samples (200 samples).
Plot a histogram and a basic approximation to the pdf.
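A Python version of this experiment might look as follows (200 samples, as described above):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gamma

rng = np.random.default_rng(42)

# p(x) = Gamma(4, 1): draw 200 random samples
samples = rng.gamma(shape=4.0, scale=1.0, size=200)

# Histogram of the samples as a basic approximation to the pdf
xs = np.linspace(0, 15, 300)
plt.hist(samples, bins=20, density=True, alpha=0.5, label="200 samples")
plt.plot(xs, gamma.pdf(xs, a=4.0, scale=1.0), label="Gamma(4,1) pdf")
plt.legend()
plt.show()
```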
Random samples and the pdf (II)
Random samples and the pdf (III)
Importance Sampling
Importance Sampling
Unfortunately it is often not possible to sample directly from the posterior distribution, but we can use importance sampling.
Let p(x) be a pdf from which it is difficult to draw samples.
Let x^{(i)} ~ q(x), i = 1, …, N, be samples that are easily generated from a proposal pdf q, which is called an importance density.
Then an approximation to the density p is given by
p̂(x) ≈ Σ_{i=1}^N w^{(i)} δ(x − x^{(i)}),
where w^{(i)} ∝ p(x^{(i)}) / q(x^{(i)}) are normalized weights (summing to one).
Bayesian Importance Sampling
By drawing samples x_{0:k}^{(i)} from a known, easy-to-sample proposal distribution q(x_{0:k} | z_{1:k}) we obtain
p(x_{0:k} | z_{1:k}) ≈ Σ_{i=1}^N w_k^{(i)} δ(x_{0:k} − x_{0:k}^{(i)}),
where w_k^{(i)} ∝ p(x_{0:k}^{(i)} | z_{1:k}) / q(x_{0:k}^{(i)} | z_{1:k}) are normalized weights.
Sensor Information: Importance Sampling
Sequential Importance Sampling (I)
Factorizing the proposal distribution, and remembering that the state evolution is modeled as a Markov process, we obtain a recursive estimate of the importance weights (written out below). The factorization itself is obtained by recursively applying the same one-step decomposition of the proposal.
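Written out in LaTeX (the standard result; the slide's own equations are not reproduced in the transcript), the factorization and the resulting weight recursion are:

```latex
q(x_{0:k} \mid z_{1:k}) = q(x_{0:k-1} \mid z_{1:k-1})\; q(x_k \mid x_{0:k-1}, z_{1:k})
\quad\Longrightarrow\quad
w_k^{(i)} \propto w_{k-1}^{(i)}\,
  \frac{p(z_k \mid x_k^{(i)})\; p(x_k^{(i)} \mid x_{k-1}^{(i)})}
       {q(x_k^{(i)} \mid x_{k-1}^{(i)}, z_k)}.
```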
Sequential Importance Sampling (SIS) Particle Filter
SIS Particle Filter Algorithm
for i = 1:N
    Draw a particle x_k^{(i)} from the proposal q(x_k | x_{k-1}^{(i)}, z_k)
    Assign it a weight w_k^{(i)} using the recursion above
end
(k is the index over time and i is the particle index)
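A minimal Python sketch of one SIS step, taking the transition prior as the proposal so the weight update reduces to multiplication by the likelihood (the 1-D Gaussian models are assumptions of this example, not part of the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def sis_step(particles, weights, z, motion_sigma=1.0, meas_sigma=0.5):
    """One sequential importance sampling update (no resampling)."""
    # Draw x_k^(i) from the proposal q(x_k | x_{k-1}) = p(x_k | x_{k-1})
    particles = particles + rng.normal(0.0, motion_sigma, size=particles.shape)
    # With this proposal, w_k is proportional to w_{k-1} * p(z_k | x_k)
    weights = weights * np.exp(-0.5 * ((z - particles) / meas_sigma) ** 2)
    return particles, weights / np.sum(weights)

# usage: particles, weights = sis_step(particles, weights, z=1.7)
```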
Rejection Sampling
Rejection Sampling
Let us assume that f(x) < 1 for all x.
Sample x from a uniform distribution.
Sample c from [0, 1].
If f(x) > c, keep the sample; otherwise reject it.
[Figure: the acceptance test illustrated for two candidate points x and x']
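A Python sketch of this procedure, with an illustrative target f that is bounded by 1 on [0, 1] (the particular f is an assumption of the example):

```python
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    # assumed target with f(x) < 1 everywhere (illustration only)
    return 0.9 * np.exp(-0.5 * ((x - 0.5) / 0.15) ** 2)

def rejection_sample(n):
    samples = []
    while len(samples) < n:
        x = rng.uniform(0.0, 1.0)   # sample x from a uniform distribution
        c = rng.uniform(0.0, 1.0)   # sample c from [0, 1]
        if f(x) > c:                # keep the sample, otherwise reject it
            samples.append(x)
    return np.array(samples)

samples = rejection_sample(1000)
```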
Importance Sampling with Resampling: Landmark Detection Example
Distributions
Wanted: samples distributed according to p(x | z1, z2, z3)
This is easy! We can draw samples from p(x | z_l) by adding noise to the detection parameters.
Importance sampling with Resampling
After Resampling
Particle Filter Algorithm
weight = target distribution / proposal distribution
Particle Filter Algorithm
Importance factor for x_t^{(i)}:
w_t^{(i)} = target distribution / proposal distribution ∝ p(z_t | x_t^{(i)})
To generate the particle, draw x_{t-1}^{(i)} from Bel(x_{t-1}) and then draw x_t^{(i)} from p(x_t | x_{t-1}^{(i)}, u_{t-1}).
Particle Filter Algorithm
Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t):
  S_t = ∅, η = 0
  For i = 1 … n:  (generate new samples)
    Sample an index j(i) from the discrete distribution given by the weights w_{t-1}
    Sample x_t^{(i)} from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{(j(i))} and u_{t-1}
    Compute the importance weight w_t^{(i)} = p(z_t | x_t^{(i)})
    Update the normalization factor: η = η + w_t^{(i)}
    Insert ⟨x_t^{(i)}, w_t^{(i)}⟩ into S_t
  For i = 1 … n:  (normalize weights)
    w_t^{(i)} = w_t^{(i)} / η
  Return S_t
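A runnable Python sketch of this algorithm for a simple 1-D tracking problem; the Gaussian motion and measurement models and all numerical values are assumptions of the example, not part of the original slides.

```python
import numpy as np

rng = np.random.default_rng(4)

def particle_filter(particles, weights, u, z,
                    motion_sigma=0.5, meas_sigma=1.0):
    """One SIR update: resample, propagate with the motion model, reweight."""
    n = len(particles)
    # Sample indices j(i) from the discrete distribution given by the old weights
    idx = rng.choice(n, size=n, p=weights)
    # Sample new particles from the motion model p(x_t | x_{t-1}, u_{t-1})
    new_particles = particles[idx] + u + rng.normal(0.0, motion_sigma, size=n)
    # Importance weights proportional to the observation likelihood p(z_t | x_t)
    new_weights = np.exp(-0.5 * ((z - new_particles) / meas_sigma) ** 2)
    new_weights /= np.sum(new_weights)   # normalize
    return new_particles, new_weights

# Tiny usage example: track a state moving +1 per step, observed with noise
particles = rng.uniform(-10.0, 10.0, size=500)
weights = np.full(500, 1.0 / 500)
true_x = 0.0
for t in range(20):
    true_x += 1.0
    z = true_x + rng.normal(0.0, 1.0)
    particles, weights = particle_filter(particles, weights, u=1.0, z=z)
print("estimate:", np.sum(weights * particles), "true:", true_x)
```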
Particle Filter for Localization
Particle Filter in Matlab
Matlab code: truex is a vector of 100 positions to be tracked.
Application: Particle Filter for Localization (Known Map)
Sources: Longin Jan Latecki, Keith Copsey, Paul E. Rybski, Cyrill Stachniss, Sebastian Thrun, Alex Teichman, Michael Pfeiffer, J. Hightower, L. Liao, D. Schulz, G. Borriello, Honggang Zhang, Wolfram Burgard, Dieter Fox, Giorgio Grisetti, Maren Bennewitz, Christian Plagemann, Dirk Haehnel, Mike Montemerlo, Nick Roy, Kai Arras, Patrick Pfaff, Miodrag Bolic, Haris Baltzakis
Perfect Monte Carlo simulation
Recall that random samples are drawn from the posterior distribution, so the posterior is represented using a set of samples, or particles.
It is then easy to approximate expectations of the form
E[g(x_{0:k})] = ∫ g(x_{0:k}) p(x_{0:k} | z_{1:k}) dx_{0:k}
by the sample average (1/N) Σ_{i=1}^N g(x_{0:k}^{(i)}).