Markov Localization & Bayes Filtering


1 Markov Localization & Bayes Filtering
with Kalman Filters, Discrete Filters, and Particle Filters. Slides adapted from Thrun et al., Probabilistic Robotics.

2 Markov Localization The robot doesn’t know where it is. Thus, a reasonable initial belief about its position is a uniform distribution.
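As a sketch, this uniform prior can be set up over a discretized state space (the 100-cell corridor here is an illustrative assumption, not from the slides):

```python
import numpy as np

# Illustrative assumption: a 1-D corridor discretized into 100 grid cells.
n_cells = 100

# With no prior information, the belief over the robot's position is uniform.
belief = np.full(n_cells, 1.0 / n_cells)

# Every cell is equally likely, and the belief sums to (numerically) 1.
```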

3 Markov Localization A sensor reading is made (USE SENSOR MODEL), indicating a door at certain locations (USE MAP). This sensor reading should be integrated with the prior belief to update our belief (USE BAYES).

4 Markov Localization The robot is moving (USE MOTION MODEL), which adds noise.

5 Markov Localization A new sensor reading (USE SENSOR MODEL) indicates a door at certain locations (USE MAP). This sensor reading should be integrated with the prior belief to update our belief (USE BAYES).

6 Markov Localization The robot is moving (USE MOTION MODEL), which adds noise. …

7 Bayes Formula
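The equation for this slide did not survive the transcript; the standard form of Bayes' rule it presents is:

```latex
P(x \mid z) = \frac{P(z \mid x)\,P(x)}{P(z)}
            = \eta\,P(z \mid x)\,P(x),
\qquad \eta = P(z)^{-1}
```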

8 Bayes Rule with Background Knowledge

9 Normalization Algorithm:

10 Recursive Bayesian Updating
Markov assumption: z_n is independent of z_1, …, z_{n-1} if we know x.
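With η_n the normalizer, the recursive update that follows from this Markov assumption is:

```latex
P(x \mid z_1,\ldots,z_n)
  = \eta_n\,P(z_n \mid x)\,P(x \mid z_1,\ldots,z_{n-1})
  = \eta_{1\ldots n}\left[\prod_{i=1}^{n} P(z_i \mid x)\right]P(x)
```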

11 Putting observations and actions together: Bayes Filters
Given: a stream of observations z and action data u, with a sensor model P(z|x), an action model P(x|u,x'), and the prior probability of the system state P(x). Wanted: an estimate of the state x of the dynamical system. The posterior of the state is also called the belief: Bel(x_t) = P(x_t | u_1, z_1, …, u_t, z_t).

12 Graphical Representation and Markov Assumption
Underlying assumptions: static world, independent noise, perfect model (no approximation errors).

13 Bayes Filters z = observation, u = action, x = state
Bel(x_t) = P(x_t | u_1, z_1, …, u_t, z_t)
  = η P(z_t | x_t, u_1, z_1, …, u_t) P(x_t | u_1, z_1, …, u_t)   (Bayes)
  = η P(z_t | x_t) P(x_t | u_1, z_1, …, u_t)   (Markov)
  = η P(z_t | x_t) ∫ P(x_t | u_1, z_1, …, u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, …, u_t) dx_{t-1}   (Total prob.)
  = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, …, z_{t-1}) dx_{t-1}   (Markov)
  = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}

14 Prediction Correction
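The two equations summarized on this slide are, in the usual Bayes filter notation:

```latex
\text{Prediction:}\quad
\overline{Bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\,Bel(x_{t-1})\,dx_{t-1}
\qquad
\text{Correction:}\quad
Bel(x_t) = \eta\,p(z_t \mid x_t)\,\overline{Bel}(x_t)
```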

15 Bayes Filter Algorithm
Algorithm Bayes_filter( Bel(x), d ):
η = 0
If d is a perceptual data item z then
  For all x do
    Bel'(x) = P(z|x) Bel(x)
    η = η + Bel'(x)
  For all x do
    Bel'(x) = η⁻¹ Bel'(x)
Else if d is an action data item u then
  For all x do
    Bel'(x) = ∫ P(x|u,x') Bel(x') dx'
Return Bel'(x)
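A minimal executable sketch of this algorithm over a finite state grid (function and argument names are illustrative; the sensor model is a vector of P(z|x) values for the received z, and the motion model a matrix of P(x|u,x') values for the executed u):

```python
import numpy as np

def bayes_filter(belief, kind, model):
    """One discrete Bayes filter update.

    belief : 1-D array, current Bel(x) over n discrete states
    kind   : "z" for a perceptual data item, "u" for an action data item
    model  : P(z|x) vector if kind == "z"; P(x|u,x') matrix if kind == "u"
    """
    if kind == "z":                      # measurement update (Bayes rule)
        new_belief = model * belief
        new_belief /= new_belief.sum()   # normalization constant eta
    else:                                # motion update (total probability)
        new_belief = model @ belief
    return new_belief
```

For the door example: start uniform over a few cells, multiply in a sensor model that peaks at the door locations, then apply a motion matrix that shifts probability mass along the corridor.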

16 Bayes Filters are Familiar!
Kalman filters Particle filters Hidden Markov models Dynamic Bayesian networks Partially Observable Markov Decision Processes (POMDPs)

17

18 Probabilistic Robotics
Bayes Filter Implementations Gaussian filters

19 Linear Transform of Gaussians
Univariate case

20 Multivariate Gaussians
We stay in the “Gaussian world” as long as we start with Gaussians and perform only linear transformations.

21 Discrete Kalman Filter
Estimates the state x of a discrete-time controlled process governed by the linear stochastic difference equation x_t = A_t x_{t-1} + B_t u_t + ε_t, with a measurement z_t = C_t x_t + δ_t.

22 Linear Gaussian Systems: Initialization
Initial belief is normally distributed: Bel(x_0) = N(x_0; μ_0, Σ_0).

23 Linear Gaussian Systems: Dynamics
Dynamics are a linear function of state and control plus additive Gaussian noise: x_t = A_t x_{t-1} + B_t u_t + ε_t, with ε_t ~ N(0, R_t).

24 Linear Gaussian Systems: Observations
Observations are a linear function of the state plus additive Gaussian noise: z_t = C_t x_t + δ_t, with δ_t ~ N(0, Q_t).

25 Kalman Filter Algorithm
Algorithm Kalman_filter( μ_{t-1}, Σ_{t-1}, u_t, z_t ):
Prediction:
  μ̄_t = A_t μ_{t-1} + B_t u_t
  Σ̄_t = A_t Σ_{t-1} A_t^T + R_t
Correction:
  K_t = Σ̄_t C_t^T (C_t Σ̄_t C_t^T + Q_t)⁻¹
  μ_t = μ̄_t + K_t (z_t − C_t μ̄_t)
  Σ_t = (I − K_t C_t) Σ̄_t
Return μ_t, Σ_t
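The algorithm above transcribes directly into numpy; this is a sketch using the slide's notation (A, B, C for the linear models, R for motion noise, Q for measurement noise), with the function name my own:

```python
import numpy as np

def kalman_filter(mu, Sigma, u, z, A, B, C, R, Q):
    """One predict/correct cycle of the Kalman filter
    (R = motion noise covariance, Q = measurement noise covariance)."""
    # Prediction: push the belief through the linear dynamics
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction: Kalman gain, then update with the measurement residual
    K = Sigma_bar @ C.T @ np.linalg.inv(C @ Sigma_bar @ C.T + Q)
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new
```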

26 Kalman Filter Summary Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2). Optimal for linear Gaussian systems! Most robotics systems are nonlinear!

27 Nonlinear Dynamic Systems
Most realistic robotic problems involve nonlinear functions

28 Linearity Assumption Revisited

29 Non-linear Function

30 EKF Linearization (1)

31 EKF Linearization (2)

32 EKF Linearization (3)

33 EKF Linearization: First Order Taylor Series Expansion
Prediction: g(u_t, x_{t-1}) ≈ g(u_t, μ_{t-1}) + G_t (x_{t-1} − μ_{t-1}), where G_t is the Jacobian of g with respect to x_{t-1}, evaluated at μ_{t-1}.
Correction: h(x_t) ≈ h(μ̄_t) + H_t (x_t − μ̄_t), where H_t is the Jacobian of h with respect to x_t, evaluated at μ̄_t.

34 EKF Algorithm Extended_Kalman_filter( μ_{t-1}, Σ_{t-1}, u_t, z_t ):
Prediction:
  μ̄_t = g(u_t, μ_{t-1})
  Σ̄_t = G_t Σ_{t-1} G_t^T + R_t
Correction:
  K_t = Σ̄_t H_t^T (H_t Σ̄_t H_t^T + Q_t)⁻¹
  μ_t = μ̄_t + K_t (z_t − h(μ̄_t))
  Σ_t = (I − K_t H_t) Σ̄_t
Return μ_t, Σ_t

35 Localization “Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities.” [Cox ’91]
Given: a map of the environment and a sequence of sensor measurements.
Wanted: an estimate of the robot’s position.
Problem classes: position tracking, global localization, the kidnapped robot problem (recovery).

36 Landmark-based Localization

37 EKF Summary Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2). Not optimal! Can diverge if nonlinearities are large! Works surprisingly well even when all assumptions are violated!

38 Kalman Filter-based System
[Arras et al. 98]: laser range-finder and vision; high precision (<1 cm accuracy). [Courtesy of Kai Arras]

39 Multi-Hypothesis Tracking

40 Localization With MHT
The belief is represented by multiple hypotheses, each tracked by a Kalman filter. Additional problems: data association (which observation corresponds to which hypothesis?) and hypothesis management (when to add or delete hypotheses?). There is a huge body of literature on target tracking, motion correspondence, etc.

41 MHT: Implemented System (2)
Courtesy of P. Jensfelt and S. Kristensen

42 Probabilistic Robotics
Bayes Filter Implementations Discrete filters

43 Piecewise Constant

44 Discrete Bayes Filter Algorithm
Algorithm Discrete_Bayes_filter( Bel(x), d ):
η = 0
If d is a perceptual data item z then
  For all x do
    Bel'(x) = P(z|x) Bel(x)
    η = η + Bel'(x)
  For all x do
    Bel'(x) = η⁻¹ Bel'(x)
Else if d is an action data item u then
  For all x do
    Bel'(x) = Σ_{x'} P(x|u,x') Bel(x')
Return Bel'(x)

45 Grid-based Localization

46 Sonars and Occupancy Grid Map

47 Probabilistic Robotics
Bayes Filter Implementations Particle filters

48 Sample-based Localization (sonar)

49 Particle Filters Represent belief by random samples
Estimation of non-Gaussian, nonlinear processes. Also known as: Monte Carlo filter, survival of the fittest, condensation, bootstrap filter, particle filter. Filtering: [Rubin, 88], [Gordon et al., 93], [Kitagawa, 96]. Computer vision: [Isard and Blake, 96, 98]. Dynamic Bayesian networks: [Kanazawa et al., 95].

50 Importance Sampling Weight samples: w = f / g
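A small self-normalized importance-sampling example of w = f / g (the Gaussian target f and proposal g here are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Target f = N(2, 1); proposal g = N(0, 2). g must cover f's support.
xs = rng.normal(0.0, 2.0, size=100_000)                  # sample from g
w = gauss_pdf(xs, 2.0, 1.0) / gauss_pdf(xs, 0.0, 2.0)    # weight samples: w = f / g
w /= w.sum()                                             # self-normalize

estimate = np.sum(w * xs)  # weighted mean approximates E_f[x] = 2
```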

51 Importance Sampling with Resampling: Landmark Detection Example

52 Particle Filters

53 Sensor Information: Importance Sampling

54 Robot Motion

55 Sensor Information: Importance Sampling

56 Robot Motion

57 Particle Filter Algorithm
Algorithm particle_filter( S_{t-1}, u_{t-1}, z_t ):
S_t = ∅, η = 0
For i = 1 … n:   (generate new samples)
  Sample index j(i) from the discrete distribution given by w_{t-1}
  Sample x_t^i from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(i)} and u_{t-1}
  w_t^i = p(z_t | x_t^i)   (compute importance weight)
  η = η + w_t^i   (update normalization factor)
  Insert (x_t^i, w_t^i) into S_t
For i = 1 … n:
  w_t^i = w_t^i / η   (normalize weights)
Return S_t

58 Particle Filter Algorithm
Importance factor for x_t^i: draw x_{t-1}^i from Bel(x_{t-1}), then draw x_t^i from p(x_t | x_{t-1}^i, u_{t-1}). The proposal is thus p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}), the target is η p(z_t | x_t) p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}), so the importance weight is w_t^i = target / proposal ∝ p(z_t | x_t^i).
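A compact sampling/importance-resampling step in this style can be sketched as follows (the function names and the 1-D models in the test are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter(particles, weights, u, z, motion_sample, likelihood):
    """One resample/propagate/weight step of a particle filter.

    particles     : (n,) array of states x_{t-1}^i
    weights       : (n,) normalized importance weights w_{t-1}^i
    motion_sample : draws x_t ~ p(x_t | x_{t-1}, u) for an array of states
    likelihood    : evaluates p(z_t | x_t) for an array of states
    """
    n = len(particles)
    # Sample indices j(i) from the discrete distribution given by the weights
    idx = rng.choice(n, size=n, p=weights)
    # Propagate the chosen particles through the motion model
    new_particles = motion_sample(particles[idx], u)
    # Importance weights from the sensor model, then normalize
    new_weights = likelihood(z, new_particles)
    new_weights /= new_weights.sum()
    return new_particles, new_weights
```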

59 Motion Model Reminder

60 Proximity Sensor Model Reminder
Sonar sensor Laser sensor

61 Initial Distribution

62 After Incorporating Ten Ultrasound Scans

63 After Incorporating 65 Ultrasound Scans

64 Estimated Path

65 Localization for AIBO robots

66 Limitations The approach described so far is able to track the pose of a mobile robot and to globally localize the robot. How can we deal with localization errors (i.e., the kidnapped robot problem)?

67 Approaches Randomly insert samples (the robot can be teleported at any point in time). Insert random samples proportional to the average likelihood of the particles (the robot has been teleported with higher probability when the likelihood of its observations drops).
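The second approach can be sketched as follows; the ratio rule mirrors the augmented-MCL idea of comparing the current average likelihood to its long-term average, but the constants and names here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def inject_random_particles(particles, avg_likelihood, long_term_avg, world_size):
    """Replace a fraction of the particles with uniformly drawn random ones
    when the current average observation likelihood drops below its long-term
    average, i.e. when the robot has likely been "teleported"."""
    frac = max(0.0, 1.0 - avg_likelihood / long_term_avg)
    n_random = int(frac * len(particles))
    if n_random > 0:
        idx = rng.choice(len(particles), size=n_random, replace=False)
        particles[idx] = rng.uniform(0.0, world_size, size=n_random)
    return particles
```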

68 Global Localization

69 Kidnapping the Robot

70 Recovery from Failure

71 Summary Particle filters are an implementation of recursive Bayesian filtering. They represent the posterior by a set of weighted samples. In the context of localization, the particles are propagated according to the motion model and then weighted according to the likelihood of the observations. In a resampling step, new particles are drawn with a probability proportional to the likelihood of the observation.

