Markov Localization & Bayes Filtering, with Kalman Filters, Discrete Filters, and Particle Filters
Slides adapted from Thrun et al., Probabilistic Robotics

Autonomous Robotics, CSCI 6905 / Mech 6905, Section 3
Faculties of Engineering & Computer Science, Dalhousie, Fall 2011/2012 Academic Term
Outline: Introduction, Motion, Perception, Control, Concluding Remarks
LEGO Mindstorms Control Scheme for Autonomous Mobile Robot

The plan:
– Thomas will cover generalized Bayesian filters for localization next week.
– Mae sets up the background for him today by discussing motion and sensor models as well as robot control.
– Mae then follows up on Bayesian filters with a specific example, underwater SLAM.

Markov Localization
The robot doesn't know where it is. Thus, a reasonable initial belief of its position is a uniform distribution.

Markov Localization
A sensor reading is made (USE SENSOR MODEL), indicating a door at certain locations (USE MAP). This sensor reading should be integrated with the prior belief to update our belief (USE BAYES).

Markov Localization
The robot moves (USE MOTION MODEL), which adds noise.

Markov Localization
A new sensor reading (USE SENSOR MODEL) indicates a door at certain locations (USE MAP). This reading is again integrated with the prior belief to update our belief (USE BAYES).

Markov Localization
The robot moves again (USE MOTION MODEL), which adds noise. …
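The door-sensing sequence above can be sketched as 1D grid localization. This is a minimal sketch, assuming a hypothetical 10-cell circular corridor with doors at cells 0, 1, and 5 and illustrative hit/miss likelihoods:

```python
# Minimal 1D Markov localization sketch (hypothetical corridor and likelihoods).
doors = {0, 1, 5}            # map: cells that contain a door
n = 10
belief = [1.0 / n] * n       # uniform prior: the robot is lost

def sense(belief, saw_door, p_hit=0.6, p_miss=0.2):
    """Bayes update: weight each cell by the sensor likelihood, then normalize."""
    likelihood = [(p_hit if (i in doors) == saw_door else p_miss) for i in range(n)]
    posterior = [b * l for b, l in zip(belief, likelihood)]
    eta = sum(posterior)
    return [p / eta for p in posterior]

def move(belief, step=1, p_exact=0.8, p_slip=0.1):
    """Motion update: shift the belief and blur it with motion noise (wrap-around)."""
    new = [0.0] * n
    for i, b in enumerate(belief):
        new[(i + step) % n] += p_exact * b
        new[(i + step - 1) % n] += p_slip * b
        new[(i + step + 1) % n] += p_slip * b
    return new

belief = sense(belief, saw_door=True)   # doors become more likely
belief = move(belief, step=1)           # motion blurs the belief
belief = sense(belief, saw_door=True)   # second door sharpens the estimate
print(max(range(n), key=lambda i: belief[i]))
```

With these numbers the only door-move-door consistent transition is cell 0 to cell 1, so the belief peaks at cell 1.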

Bayes Formula
P(x | z) = P(z | x) P(x) / P(z) = likelihood · prior / evidence
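As a quick numeric check with illustrative numbers (prior and likelihoods are assumptions, not from the slides):

```python
# Bayes' rule with illustrative numbers: posterior = likelihood * prior / evidence.
p_door = 0.3                      # prior P(door)
p_z_door = 0.6                    # likelihood P(z | door)
p_z_not_door = 0.2                # likelihood P(z | not door)

# Evidence via total probability: P(z) = P(z|door)P(door) + P(z|~door)P(~door)
p_z = p_z_door * p_door + p_z_not_door * (1 - p_door)

posterior = p_z_door * p_door / p_z
print(posterior)                  # sensing a door raises 0.3 to 0.5625
```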

Bayes Rule with Background Knowledge
P(x | y, z) = P(y | x, z) P(x | z) / P(y | z)

Normalization
η = 1 / P(z) is a normalizer. Algorithm: for all x, compute aux(x) = P(z | x) P(x); set η = 1 / Σx aux(x); for all x, return P(x | z) = η · aux(x).

Recursive Bayesian Updating
P(x | z_1, …, z_n) = P(z_n | x, z_1, …, z_{n-1}) P(x | z_1, …, z_{n-1}) / P(z_n | z_1, …, z_{n-1})
Markov assumption: z_n is independent of z_1, …, z_{n-1} if we know x, so
P(x | z_1, …, z_n) = η P(z_n | x) P(x | z_1, …, z_{n-1})

Putting Observations and Actions Together: Bayes Filters
Given:
– Stream of observations z and action data u
– Sensor model P(z | x)
– Action model P(x | u, x')
– Prior probability of the system state P(x)
Wanted: estimate of the state X of a dynamical system.
The posterior of the state is also called the belief: Bel(x_t) = P(x_t | u_1, z_1, …, u_t, z_t)

Graphical Representation and Markov Assumption
Underlying assumptions: static world, independent noise, perfect model (no approximation errors).

Bayes Filters (z = observation, u = action, x = state)
Bel(x_t) = P(x_t | u_1, z_1, …, u_t, z_t)
 = η P(z_t | x_t, u_1, z_1, …, u_t) P(x_t | u_1, z_1, …, u_t)   (Bayes)
 = η P(z_t | x_t) P(x_t | u_1, z_1, …, u_t)   (Markov)
 = η P(z_t | x_t) ∫ P(x_t | u_1, z_1, …, u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, …, u_t) dx_{t-1}   (total prob.)
 = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}   (Markov)

Bayes Filters: Prediction and Correction
Prediction: Bel⁻(x_t) = ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}
Correction: Bel(x_t) = η P(z_t | x_t) Bel⁻(x_t)

Bayes Filter Algorithm
1. Algorithm Bayes_filter(Bel(x), d):
2.   η = 0
3.   If d is a perceptual data item z then
4.     For all x do
5.       Bel'(x) = P(z | x) Bel(x)
6.       η = η + Bel'(x)
7.     For all x do
8.       Bel'(x) = η⁻¹ Bel'(x)
9.   Else if d is an action data item u then
10.     For all x do
11.       Bel'(x) = ∫ P(x | u, x') Bel(x') dx'
12. Return Bel'(x)
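An executable sketch of this algorithm over a finite state set, using a hypothetical two-state open/closed-door world with illustrative sensor and motion models:

```python
# Discrete Bayes filter over a finite state space, following the slide's structure.
def bayes_filter(belief, d, kind, sensor_model=None, motion_model=None):
    """belief: dict state -> prob. kind: 'z' (perceptual item) or 'u' (action item)."""
    states = list(belief)
    if kind == "z":                       # correction step
        new = {x: sensor_model(d, x) * belief[x] for x in states}
        eta = sum(new.values())
        return {x: p / eta for x, p in new.items()}
    else:                                 # prediction step
        return {
            x: sum(motion_model(x, d, xp) * belief[xp] for xp in states)
            for x in states
        }

# Illustrative models: a noisy door sensor and a deterministic 'push' action.
sensor = lambda z, x: 0.6 if z == x else 0.4
motion = lambda x, u, xp: (
    (1.0 if x == "open" else 0.0) if u == "push" else (1.0 if x == xp else 0.0)
)

bel = {"open": 0.5, "closed": 0.5}
bel = bayes_filter(bel, "open", "z", sensor_model=sensor)    # sense 'open'
bel = bayes_filter(bel, "push", "u", motion_model=motion)    # push the door
print(bel["open"])
```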

Bayes Filters are Familiar!
Kalman filters, particle filters, hidden Markov models, dynamic Bayesian networks, and partially observable Markov decision processes (POMDPs) are all instances of Bayes filters.


Probabilistic Robotics
Bayes Filter Implementations: Gaussian Filters

Univariate Gaussians and Linear Transforms
If X ~ N(μ, σ²), then Y = aX + b ~ N(aμ + b, a²σ²).

Multivariate Gaussians
We stay in the "Gaussian world" as long as we start with Gaussians and perform only linear transformations: if X ~ N(μ, Σ), then Y = AX + B ~ N(Aμ + B, AΣAᵀ).
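A quick sanity check of the univariate linear-transform rule by sampling (the values a = 2, b = 1, μ = 3, σ = 2 are illustrative):

```python
import random

# Empirically check: X ~ N(mu, sigma^2)  =>  aX + b has mean a*mu + b, var a^2*sigma^2.
random.seed(0)
mu, sigma, a, b = 3.0, 2.0, 2.0, 1.0
ys = [a * random.gauss(mu, sigma) + b for _ in range(200_000)]

mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
print(mean, var)   # close to a*mu + b = 7 and a^2*sigma^2 = 16
```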

Discrete Kalman Filter
Estimates the state x of a discrete-time controlled process governed by the linear stochastic difference equation
 x_t = A_t x_{t-1} + B_t u_t + ε_t
with a measurement
 z_t = C_t x_t + δ_t

Linear Gaussian Systems: Initialization
Initial belief is normally distributed: Bel(x_0) = N(x_0; μ_0, Σ_0)

Linear Gaussian Systems: Dynamics
Dynamics are a linear function of state and control plus additive noise:
 p(x_t | u_t, x_{t-1}) = N(x_t; A_t x_{t-1} + B_t u_t, R_t)

Linear Gaussian Systems: Observations
Observations are a linear function of state plus additive noise:
 p(z_t | x_t) = N(z_t; C_t x_t, Q_t)

Kalman Filter Algorithm
1. Algorithm Kalman_filter(μ_{t-1}, Σ_{t-1}, u_t, z_t):
2.   Prediction:
3.     μ̄_t = A_t μ_{t-1} + B_t u_t
4.     Σ̄_t = A_t Σ_{t-1} A_tᵀ + R_t
5.   Correction:
6.     K_t = Σ̄_t C_tᵀ (C_t Σ̄_t C_tᵀ + Q_t)⁻¹
7.     μ_t = μ̄_t + K_t (z_t − C_t μ̄_t)
8.     Σ_t = (I − K_t C_t) Σ̄_t
9. Return μ_t, Σ_t
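A minimal scalar instance of this algorithm (A = B = C = 1; the noise values r and q are illustrative):

```python
# Scalar Kalman filter step: prediction then correction, as in the slide.
def kalman_step(mu, sigma2, u, z, a=1.0, b=1.0, c=1.0, r=0.5, q=1.0):
    # Prediction
    mu_bar = a * mu + b * u
    sigma2_bar = a * a * sigma2 + r
    # Correction
    k = sigma2_bar * c / (c * c * sigma2_bar + q)   # Kalman gain
    mu_new = mu_bar + k * (z - c * mu_bar)
    sigma2_new = (1 - k * c) * sigma2_bar
    return mu_new, sigma2_new

mu, sigma2 = 0.0, 1.0                               # initial belief N(0, 1)
mu, sigma2 = kalman_step(mu, sigma2, u=1.0, z=1.2)  # move +1, measure 1.2
print(mu, sigma2)
```

The correction pulls the predicted mean 1.0 toward the measurement 1.2 and shrinks the variance below the predicted 1.5.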

Kalman Filter Summary
Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2)
Optimal for linear Gaussian systems!
Most robotics systems are nonlinear!

Nonlinear Dynamic Systems
Most realistic robotic problems involve nonlinear functions: x_t = g(u_t, x_{t-1}), z_t = h(x_t).

Linearity Assumption Revisited

Non-linear Function

EKF Linearization (1)

EKF Linearization (2)

EKF Linearization (3)

EKF Linearization: First-Order Taylor Series Expansion
Prediction: g(u_t, x_{t-1}) ≈ g(u_t, μ_{t-1}) + G_t (x_{t-1} − μ_{t-1}), with Jacobian G_t = ∂g(u_t, μ_{t-1}) / ∂x_{t-1}
Correction: h(x_t) ≈ h(μ̄_t) + H_t (x_t − μ̄_t), with Jacobian H_t = ∂h(μ̄_t) / ∂x_t

EKF Algorithm
1. Algorithm Extended_Kalman_filter(μ_{t-1}, Σ_{t-1}, u_t, z_t):
2.   Prediction:
3.     μ̄_t = g(u_t, μ_{t-1})
4.     Σ̄_t = G_t Σ_{t-1} G_tᵀ + R_t
5.   Correction:
6.     K_t = Σ̄_t H_tᵀ (H_t Σ̄_t H_tᵀ + Q_t)⁻¹
7.     μ_t = μ̄_t + K_t (z_t − h(μ̄_t))
8.     Σ_t = (I − K_t H_t) Σ̄_t
9. Return μ_t, Σ_t
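A scalar sketch of one EKF step under assumed nonlinear models g(u, x) = x + sin(u) and h(x) = x², both hypothetical and chosen only so the Jacobians are easy to read:

```python
import math

# Scalar EKF step with hypothetical nonlinear models:
#   g(u, x) = x + sin(u)   (Jacobian G = 1)
#   h(x)    = x**2         (Jacobian H = 2 * mu_bar)
def ekf_step(mu, sigma2, u, z, r=0.1, q=0.5):
    # Prediction: propagate the mean through g, linearize for the variance.
    mu_bar = mu + math.sin(u)
    G = 1.0
    sigma2_bar = G * sigma2 * G + r
    # Correction: linearize h around the predicted mean.
    H = 2.0 * mu_bar
    k = sigma2_bar * H / (H * sigma2_bar * H + q)
    mu_new = mu_bar + k * (z - mu_bar ** 2)
    sigma2_new = (1 - k * H) * sigma2_bar
    return mu_new, sigma2_new

mu, sigma2 = 1.0, 0.2
mu, sigma2 = ekf_step(mu, sigma2, u=0.0, z=1.1)
print(mu, sigma2)
```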

Localization
Given: a map of the environment and a sequence of sensor measurements.
Wanted: an estimate of the robot's position.
Problem classes: position tracking, global localization, kidnapped robot problem (recovery).
"Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities." [Cox '91]

Landmark-based Localization

EKF Summary
Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2)
Not optimal! Can diverge if nonlinearities are large!
Works surprisingly well even when all assumptions are violated!

Kalman Filter-based System [Arras et al. 98]
Laser range-finder and vision; high precision (<1 cm accuracy). [Courtesy of Kai Arras]

Multi-Hypothesis Tracking

Localization With MHT
Belief is represented by multiple hypotheses; each hypothesis is tracked by a Kalman filter.
Additional problems:
– Data association: which observation corresponds to which hypothesis?
– Hypothesis management: when to add / delete hypotheses?
There is a huge body of literature on target tracking, motion correspondence, etc.

MHT: Implemented System
Courtesy of P. Jensfelt and S. Kristensen

Probabilistic Robotics
Bayes Filter Implementations: Discrete Filters

Piecewise Constant

Discrete Bayes Filter Algorithm
1. Algorithm Discrete_Bayes_filter(Bel(x), d):
2.   η = 0
3.   If d is a perceptual data item z then
4.     For all x do
5.       Bel'(x) = P(z | x) Bel(x)
6.       η = η + Bel'(x)
7.     For all x do
8.       Bel'(x) = η⁻¹ Bel'(x)
9.   Else if d is an action data item u then
10.     For all x do
11.       Bel'(x) = Σ_{x'} P(x | u, x') Bel(x')
12. Return Bel'(x)

Grid-based Localization

Sonars and Occupancy Grid Map

Probabilistic Robotics
Bayes Filter Implementations: Particle Filters

Sample-based Localization (sonar)

Particle Filters
Represent the belief by random samples, enabling estimation of non-Gaussian, nonlinear processes.
Also known as: Monte Carlo filter, survival of the fittest, condensation, bootstrap filter, particle filter.
Filtering: [Rubin, 88], [Gordon et al., 93], [Kitagawa, 96]
Computer vision: [Isard and Blake, 96, 98]
Dynamic Bayesian networks: [Kanazawa et al., 95]

Importance Sampling
Sample from a proposal distribution g, then weight the samples by w = f / g so they represent the target distribution f.

Importance Sampling with Resampling: Landmark Detection Example

Particle Filters

Sensor Information: Importance Sampling

Robot Motion

Sensor Information: Importance Sampling

Robot Motion

Particle Filter Algorithm
1. Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t):
2.   S_t = ∅, η = 0
3.   For i = 1 … n   (generate new samples)
4.     Sample index j(i) from the discrete distribution given by w_{t-1}
5.     Sample x_t^{(i)} from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(i)} and u_{t-1}
6.     w_t^{(i)} = p(z_t | x_t^{(i)})   (compute importance weight)
7.     η = η + w_t^{(i)}   (update normalization factor)
8.     Insert ⟨x_t^{(i)}, w_t^{(i)}⟩ into S_t
9.   For i = 1 … n
10.     w_t^{(i)} = w_t^{(i)} / η   (normalize weights)
11. Return S_t

Particle Filter Algorithm
Draw x_{t-1}^{(i)} from Bel(x_{t-1}); draw x_t^{(i)} from p(x_t | x_{t-1}^{(i)}, u_{t-1}).
Importance factor for x_t^{(i)}:
 w_t^{(i)} = target distribution / proposal distribution
  = η p(z_t | x_t) p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) / [ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) ]
  ∝ p(z_t | x_t)
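A compact 1D instance of the algorithm above, assuming hypothetical Gaussian motion and sensor models; resampling uses the standard-library `random.choices`:

```python
import math
import random

# Minimal 1D particle filter: resample, propagate, weight, normalize.
random.seed(1)

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def particle_filter(particles, weights, u, z, motion_noise=0.1, sensor_noise=0.5):
    n = len(particles)
    # 1. Resample according to the discrete distribution given by the weights.
    ancestors = random.choices(particles, weights=weights, k=n)
    # 2. Sample x_t^i from the motion model p(x_t | x_{t-1}, u).
    new_particles = [x + u + random.gauss(0, motion_noise) for x in ancestors]
    # 3. Importance weight w_t^i = p(z_t | x_t^i), then normalize.
    new_weights = [gauss_pdf(z, x, sensor_noise) for x in new_particles]
    eta = sum(new_weights)
    return new_particles, [w / eta for w in new_weights]

# Uniform prior over a 1D corridor; the robot moves +1 and observes z = 1.0.
particles = [random.uniform(-5, 5) for _ in range(2000)]
weights = [1.0 / 2000] * 2000
particles, weights = particle_filter(particles, weights, u=1.0, z=1.0)
estimate = sum(w * x for w, x in zip(weights, particles))
print(estimate)   # weighted mean concentrates near the observation z = 1.0
```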

Motion Model Reminder

Proximity Sensor Model Reminder (laser sensor, sonar sensor)

Initial Distribution

After Incorporating Ten Ultrasound Scans

After Incorporating 65 Ultrasound Scans

Estimated Path

Localization for AIBO robots

Limitations
The approach described so far is able to track the pose of a mobile robot and to globally localize it. But how can we deal with localization errors (i.e., the kidnapped robot problem)?

Approaches
– Randomly insert samples (the robot can be teleported at any point in time).
– Insert random samples proportional to the average likelihood of the particles (the robot has more likely been teleported when the likelihood of its observations drops).
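The second approach can be sketched as an augmentation of the resampling step, in the spirit of augmented MCL; the decay rates, world bounds, and likelihood values below are illustrative assumptions:

```python
import random

# Sketch of random-sample insertion for kidnapped-robot recovery.
# Keeps short- and long-term averages of the observation likelihood; when the
# short-term average drops well below the long-term one, a fraction of the
# particles is replaced by uniformly drawn random states.
def inject_random(particles, avg_likelihood, w_slow, w_fast,
                  alpha_slow=0.05, alpha_fast=0.5, world=(-10.0, 10.0)):
    w_slow += alpha_slow * (avg_likelihood - w_slow)   # long-term average
    w_fast += alpha_fast * (avg_likelihood - w_fast)   # short-term average
    p_inject = max(0.0, 1.0 - w_fast / w_slow)         # fraction to replace
    n_inject = int(p_inject * len(particles))
    for i in random.sample(range(len(particles)), n_inject):
        particles[i] = random.uniform(*world)          # teleport hypothesis
    return particles, w_slow, w_fast, n_inject

random.seed(0)
particles = [1.0] * 100                # all belief mass in one (wrong) place
w_slow = w_fast = 0.5                  # likelihood has been healthy so far
# Observation likelihood collapses: the robot was probably kidnapped.
particles, w_slow, w_fast, n = inject_random(particles, 0.01, w_slow, w_fast)
print(n)   # a substantial fraction of particles is randomized
```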

Global Localization

Kidnapping the Robot

Recovery from Failure

Summary
Particle filters are an implementation of recursive Bayesian filtering. They represent the posterior by a set of weighted samples. In the context of localization, the particles are propagated according to the motion model and then weighted according to the likelihood of the observations. In a resampling step, new particles are drawn with a probability proportional to the likelihood of the observation.
