
1 Probabilistic Robotics: Bayes Filter Implementations (Gaussian Filters)

2 Markov vs. Kalman Filter Localization
Markov localization: localization starting from any unknown position; recovers from ambiguous situations. However, updating the probability of all positions within the whole state space at any time requires a discrete representation of the space (grid). The required memory and computational power can thus become very large if a fine grid is used.
Kalman filter localization: tracks the robot and is inherently very precise and efficient. However, if the uncertainty of the robot becomes too large (e.g., after a collision with an object), the Kalman filter will fail and the position is definitively lost.

3 Kalman Filter Localization

4 Bayes Filter Reminder
1. Algorithm Bayes_filter(Bel(x), d):
2.   $\eta = 0$
3.   If d is a perceptual data item z then
4.     For all x do
5.       $Bel'(x) = P(z \mid x)\, Bel(x)$
6.       $\eta = \eta + Bel'(x)$
7.     For all x do
8.       $Bel'(x) = \eta^{-1}\, Bel'(x)$
9.   Else if d is an action data item u then
10.     For all x do
11.       $Bel'(x) = \int P(x \mid u, x')\, Bel(x')\, dx'$
12. Return Bel'(x)

5 Bayes Filter Reminder
Prediction: $\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$
Correction: $bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$

6 Kalman Filter
Bayes filter with Gaussians
Developed in the late 1950s
Most relevant Bayes filter variant in practice
Applications range from economics, weather forecasting, and satellite navigation to robotics and many more
The Kalman filter "algorithm" is just a couple of matrix multiplications!

7 Gaussians
Univariate: $p(x) \sim N(\mu, \sigma^2)$:
$p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$
Multivariate: $p(\mathbf{x}) \sim N(\boldsymbol{\mu}, \Sigma)$:
$p(\mathbf{x}) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}\, e^{-\frac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu})}$

8 Gaussians [figures: 1D, 2D, and 3D Gaussian densities; video]

9 Properties of Gaussians
Univariate: if $X \sim N(\mu, \sigma^2)$ and $Y = aX + b$, then $Y \sim N(a\mu + b, a^2\sigma^2)$
Multivariate: if $X \sim N(\mu, \Sigma)$ and $Y = AX + B$, then $Y \sim N(A\mu + B, A\Sigma A^T)$
We stay in the "Gaussian world" as long as we start with Gaussians and perform only linear transformations (a numerical check follows)
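
This closure property is easy to verify numerically. A minimal sketch; the particular values of mu, Sigma, A, and b are arbitrary illustrations:

```python
import numpy as np

# If x ~ N(mu, Sigma), then y = A x + b ~ N(A mu + b, A Sigma A^T).
rng = np.random.default_rng(0)
mu = np.array([1.0, 2.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
b = np.array([0.5, -1.0])

# Closed-form mean and covariance of the transformed Gaussian.
mu_y = A @ mu + b
Sigma_y = A @ Sigma @ A.T

# Monte Carlo check: sample x, transform, compare the moments.
x = rng.multivariate_normal(mu, Sigma, size=100_000)
y = x @ A.T + b
print(mu_y, y.mean(axis=0))              # means agree
print(Sigma_y, np.cov(y, rowvar=False))  # covariances agree
```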

10 Introduction to Kalman Filter (1)
Two measurements, no dynamics
Weighted least squares: minimize the weighted squared error of the estimate
Finding the minimum error: set the derivative to zero
After some calculation and rearrangement, the estimate is an inverse-variance weighted combination of the two measurements
Another way to look at it: a weighted mean, where each measurement is weighted by its certainty (see the sketch below)
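
A minimal sketch of this static fusion rule; the function name and the example numbers are illustrative assumptions:

```python
def fuse(z1, var1, z2, var2):
    """Fuse two noisy measurements of the same quantity by
    inverse-variance weighting (weighted least squares, no dynamics)."""
    x_hat = (var2 * z1 + var1 * z2) / (var1 + var2)   # weighted mean
    var_hat = (var1 * var2) / (var1 + var2)           # <= min(var1, var2)
    return x_hat, var_hat

# The fused estimate is pulled toward the more certain measurement z2,
# and the fused variance is smaller than either input variance.
print(fuse(10.0, 4.0, 12.0, 1.0))   # (11.6, 0.8)
```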

11 Discrete Kalman Filter
Estimates the state x of a discrete-time controlled process that is governed by the linear stochastic difference equation
$x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t$
with a measurement
$z_t = C_t x_t + \delta_t$
$A_t$: matrix (n×n) that describes how the state evolves from t-1 to t without controls or noise
$B_t$: matrix (n×l) that describes how the control $u_t$ changes the state from t-1 to t
$C_t$: matrix (k×n) that describes how to map the state $x_t$ to an observation $z_t$
$\varepsilon_t$, $\delta_t$: random variables representing the process and measurement noise, assumed to be independent and normally distributed with covariances $R_t$ and $Q_t$, respectively
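
To make the model concrete, here is one simulated step of such a system; the constant-velocity model and all numeric values are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example: 1D constant-velocity model, state x = [position, velocity].
dt = 0.1
A = np.array([[1.0, dt],
              [0.0, 1.0]])            # state transition (n x n)
B = np.array([[0.5 * dt**2],
              [dt]])                  # control matrix (n x l), u = acceleration
C = np.array([[1.0, 0.0]])            # we observe position only (k x n)
R = 0.01 * np.eye(2)                  # process noise covariance
Q = np.array([[0.25]])                # measurement noise covariance

x = np.zeros(2)
u = np.array([1.0])
# x_t = A x_{t-1} + B u_t + eps_t
x = A @ x + B @ u + rng.multivariate_normal(np.zeros(2), R)
# z_t = C x_t + delta_t
z = C @ x + rng.multivariate_normal(np.zeros(1), Q)
print(x, z)
```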

12 Kalman Filter Updates in 1D
Prediction: $\bar\mu_t = \mu_{t-1} + u_t$, $\bar\sigma_t^2 = \sigma_{t-1}^2 + \sigma_{act}^2$
Measurement correction: $\mu_t = \bar\mu_t + K_t(z_t - \bar\mu_t)$ with gain $K_t = \bar\sigma_t^2 / (\bar\sigma_t^2 + \sigma_{obs}^2)$, and $\sigma_t^2 = (1 - K_t)\,\bar\sigma_t^2$
It's a weighted mean! (A runnable sketch follows.)
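
The same 1D cycle as code; a minimal sketch where u is the commanded displacement and the variance names match the formulas above:

```python
def kf_1d_predict(mu, var, u, var_act):
    """Prediction: shift the belief by the motion, grow the variance."""
    return mu + u, var + var_act

def kf_1d_correct(mu, var, z, var_obs):
    """Correction: a weighted mean of prediction and measurement."""
    K = var / (var + var_obs)                    # Kalman gain in [0, 1]
    return mu + K * (z - mu), (1.0 - K) * var    # variance always shrinks

mu, var = 0.0, 1.0                               # initial belief
mu, var = kf_1d_predict(mu, var, u=1.0, var_act=0.5)
mu, var = kf_1d_correct(mu, var, z=1.2, var_obs=0.3)
print(mu, var)                                   # estimate pulled toward z
```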

13 Kalman Filter Updates in 1D

14 Kalman Filter Updates in 1D

15 Kalman Filter Updates

16 Linear Gaussian Systems: Initialization
Initial belief is normally distributed:
$bel(x_0) = N(x_0;\, \mu_0, \Sigma_0)$

17 Linear Gaussian Systems: Dynamics
Dynamics are a linear function of state and control plus additive noise:
$x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t$
$p(x_t \mid u_t, x_{t-1}) = N(x_t;\, A_t x_{t-1} + B_t u_t,\, R_t)$

18 Linear Gaussian Systems: Dynamics
$\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$
For Gaussian beliefs this integral evaluates to:
$\bar\mu_t = A_t \mu_{t-1} + B_t u_t$
$\bar\Sigma_t = A_t \Sigma_{t-1} A_t^T + R_t$

19 Linear Gaussian Systems: Observations
Observations are a linear function of the state plus additive noise:
$z_t = C_t x_t + \delta_t$
$p(z_t \mid x_t) = N(z_t;\, C_t x_t,\, Q_t)$

20 Linear Gaussian Systems: Observations
$bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$
For Gaussian beliefs this product evaluates to:
$\mu_t = \bar\mu_t + K_t(z_t - C_t \bar\mu_t)$
$\Sigma_t = (I - K_t C_t)\, \bar\Sigma_t$
with $K_t = \bar\Sigma_t C_t^T (C_t \bar\Sigma_t C_t^T + Q_t)^{-1}$

21 Kalman Filter Algorithm
1. Algorithm Kalman_filter($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$):
2. Prediction:
3.   $\bar\mu_t = A_t \mu_{t-1} + B_t u_t$
4.   $\bar\Sigma_t = A_t \Sigma_{t-1} A_t^T + R_t$
5. Correction:
6.   $K_t = \bar\Sigma_t C_t^T (C_t \bar\Sigma_t C_t^T + Q_t)^{-1}$
7.   $\mu_t = \bar\mu_t + K_t(z_t - C_t \bar\mu_t)$
8.   $\Sigma_t = (I - K_t C_t)\, \bar\Sigma_t$
9. Return $\mu_t$, $\Sigma_t$
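
The "couple of matrix multiplications" claim is easy to see in code. A minimal NumPy sketch of one predict/correct cycle in the slide's notation (R: process noise covariance, Q: measurement noise covariance):

```python
import numpy as np

def kalman_filter(mu, Sigma, u, z, A, B, C, R, Q):
    """One predict/correct cycle of the linear Kalman filter."""
    # Prediction
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction
    S = C @ Sigma_bar @ C.T + Q                # innovation covariance
    K = Sigma_bar @ C.T @ np.linalg.inv(S)     # Kalman gain
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new
```

Chaining this function over a sequence of control/measurement pairs $(u_t, z_t)$ implements the full filter.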

22 Kalman Filter Algorithm

23 Kalman Filter Localization Cycle: Prediction → Observation → Matching → Correction

24 The Prediction-Correction-Cycle: Prediction

25 The Prediction-Correction-Cycle: Correction

26 The Prediction-Correction-Cycle: Prediction and Correction

27 Kalman Filter Summary
Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: $O(k^{2.376} + n^2)$
Optimal for linear Gaussian systems!
Most robotics systems are nonlinear!

28 Nonlinear Dynamic Systems
Most realistic robotic problems involve nonlinear functions:
$x_t = g(u_t, x_{t-1})$
$z_t = h(x_t)$

29 Linearity Assumption Revisited

30 Non-linear Function

31 EKF Linearization (1)

32 EKF Linearization (2)

33 EKF Linearization (3): The linearization quality depends on the uncertainty

34 EKF Linearization: First-Order Taylor Series Expansion
Prediction: $g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + G_t\,(x_{t-1} - \mu_{t-1})$, where $G_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}$
Correction: $h(x_t) \approx h(\bar\mu_t) + H_t\,(x_t - \bar\mu_t)$, where $H_t = \frac{\partial h(\bar\mu_t)}{\partial x_t}$

35 EKF Algorithm
1. Algorithm Extended_Kalman_filter($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$):
2. Prediction:
3.   $\bar\mu_t = g(u_t, \mu_{t-1})$
4.   $\bar\Sigma_t = G_t \Sigma_{t-1} G_t^T + R_t$
5. Correction:
6.   $K_t = \bar\Sigma_t H_t^T (H_t \bar\Sigma_t H_t^T + Q_t)^{-1}$
7.   $\mu_t = \bar\mu_t + K_t(z_t - h(\bar\mu_t))$
8.   $\Sigma_t = (I - K_t H_t)\, \bar\Sigma_t$
9. Return $\mu_t$, $\Sigma_t$
with $G_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}$ and $H_t = \frac{\partial h(\bar\mu_t)}{\partial x_t}$
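
The EKF cycle differs from the linear case only in propagating means through g and h and covariances through their Jacobians. A minimal sketch where the functions g, h and the Jacobian callbacks G, H are supplied by the caller:

```python
import numpy as np

def ekf_step(mu, Sigma, u, z, g, G, h, H, R, Q):
    """One EKF predict/correct cycle; g, h are the nonlinear motion and
    measurement functions, G(u, mu) and H(mu) return their Jacobians."""
    # Prediction: mean through g, covariance through the Jacobian of g.
    mu_bar = g(u, mu)
    G_t = G(u, mu)
    Sigma_bar = G_t @ Sigma @ G_t.T + R
    # Correction: linearize h around the predicted mean.
    H_t = H(mu_bar)
    S = H_t @ Sigma_bar @ H_t.T + Q            # innovation covariance
    K = Sigma_bar @ H_t.T @ np.linalg.inv(S)   # Kalman gain
    mu_new = mu_bar + K @ (z - h(mu_bar))
    Sigma_new = (np.eye(len(mu)) - K @ H_t) @ Sigma_bar
    return mu_new, Sigma_new
```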

36 Kalman Filter for Mobile Robot Localization: Robot Position Prediction
In a first step, the robot's position at time step k+1 is predicted based on its previous location (time step k) and its movement due to the control input u(k):
$\hat{x}(k+1 \mid k) = f(\hat{x}(k \mid k), u(k))$
$P(k+1 \mid k) = \nabla f_x\, P(k \mid k)\, \nabla f_x^T + \nabla f_u\, Q(k)\, \nabla f_u^T$
f: odometry function

37 Kalman Filter for Mobile Robot Localization: Robot Position Prediction Example (Odometry)
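
A common concrete choice for the odometry function f is the differential-drive model sketched below; this is an assumption for illustration (b is the wheelbase, u the measured wheel displacements), not necessarily the exact model on the slide:

```python
import numpy as np

def odometry_predict(pose, u, b):
    """Differential-drive odometry f(x, u): predict the new pose from
    the measured left/right wheel displacements (ds_l, ds_r)."""
    x, y, theta = pose
    ds_l, ds_r = u
    ds = (ds_r + ds_l) / 2.0          # distance traveled by the center
    dtheta = (ds_r - ds_l) / b        # change in heading
    return np.array([
        x + ds * np.cos(theta + dtheta / 2.0),
        y + ds * np.sin(theta + dtheta / 2.0),
        theta + dtheta,
    ])

print(odometry_predict(np.array([0.0, 0.0, 0.0]), (0.10, 0.12), b=0.5))
```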

38 Kalman Filter for Mobile Robot Localization: Observation
The second step is to obtain the observation Z(k+1) (measurements) from the robot's sensors at the new location at time k+1.
The observation usually consists of a set of $n_0$ single observations $z_j(k+1)$ extracted from the different sensor signals. It can represent raw data scans as well as features like lines, doors, or any kind of landmarks.
The parameters of the targets are usually observed in the sensor frame {S}. Therefore the observations have to be transformed to the world frame {W}, or the measurement prediction has to be transformed to the sensor frame {S}. This transformation is specified in the function $h_i$ (seen later), the measurement prediction.

39 Kalman Filter for Mobile Robot Localization: Observation Example
Raw data of laser scanner; extracted lines in model space; sensor (robot) frame; set of discrete line features

40 Kalman Filter for Mobile Robot Localization: Measurement Prediction
In the next step we use the predicted robot position and the map M(k) to generate multiple predicted observations $z_t$ (what you should see).
They have to be transformed into the sensor frame.
We can now define the measurement prediction as the set containing all $n_i$ predicted observations.
The function $h_i$ is mainly the coordinate transformation between the world frame and the sensor frame.
We also need to compute the measurement Jacobian $\nabla h_i$ to predict the uncertainties.

41 Kalman Filter for Mobile Robot Localization: Measurement Prediction Example
For the prediction, only the walls that are in the field of view of the robot are selected. This is done by linking the individual lines to the nodes of the path.

42 Kalman Filter for Mobile Robot Localization: Measurement Prediction Example
The generated measurement predictions have to be transformed to the robot frame {R}. According to the figure on the previous slide, the transformation is given by
$h_i = \begin{pmatrix} \alpha_{t,i} - \theta(k+1 \mid k) \\ r_{t,i} - \big(x(k+1 \mid k)\cos\alpha_{t,i} + y(k+1 \mid k)\sin\alpha_{t,i}\big) \end{pmatrix}$
and its Jacobian by
$\nabla h_i = \begin{pmatrix} 0 & 0 & -1 \\ -\cos\alpha_{t,i} & -\sin\alpha_{t,i} & 0 \end{pmatrix}$
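
The same transformation as code; a minimal sketch assuming the ($\alpha$, r) line parameterization above:

```python
import numpy as np

def predict_line_measurement(pose, line):
    """h_i: express a world-frame line (alpha, r) in the robot frame."""
    x, y, theta = pose
    alpha, r = line
    return np.array([
        alpha - theta,                                # line angle seen by robot
        r - (x * np.cos(alpha) + y * np.sin(alpha)),  # perpendicular distance
    ])

def line_measurement_jacobian(line):
    """Jacobian of h_i with respect to the robot pose (x, y, theta)."""
    alpha, _ = line
    return np.array([
        [0.0, 0.0, -1.0],
        [-np.cos(alpha), -np.sin(alpha), 0.0],
    ])
```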

43 Kalman Filter for Mobile Robot Localization: Matching
Assignment from observations $z_j(k+1)$ (gained by the sensors) to the targets $z_t$ (stored in the map).
For each measurement prediction for which a corresponding observation is found, we calculate the innovation
$v_{ij}(k+1) = z_j(k+1) - \hat{z}_i(k+1)$
and its innovation covariance, found by applying the error propagation law:
$\Sigma_{IN,ij}(k+1) = \nabla h_i\, P(k+1 \mid k)\, \nabla h_i^T + R_j(k+1)$
The validity of the correspondence between measurement and prediction can e.g. be evaluated through the Mahalanobis distance:
$v_{ij}^T(k+1)\, \Sigma_{IN,ij}^{-1}(k+1)\, v_{ij}(k+1) \le g^2$
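
A minimal gating sketch for this matching test; the gate value $g^2 = 9.21$ (roughly the 99% chi-square bound for 2 degrees of freedom) is an illustrative assumption:

```python
import numpy as np

def gate(z_obs, z_pred, H, P_pred, R_obs, g2=9.21):
    """Return (innovation, innovation covariance) if the observation
    passes the Mahalanobis gate, else None. Angle components should be
    wrapped to [-pi, pi] before calling this."""
    v = z_obs - z_pred                       # innovation
    S = H @ P_pred @ H.T + R_obs             # innovation covariance
    d2 = v @ np.linalg.solve(S, v)           # squared Mahalanobis distance
    return (v, S) if d2 <= g2 else None
```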

44 Kalman Filter for Mobile Robot Localization: Matching Example

45 To find correspondences (pairs) of predicted and observed features, we use the Mahalanobis distance
$v_{ij}^T(k+1)\, \Sigma_{IN,ij}^{-1}(k+1)\, v_{ij}(k+1) \le g^2$
with the innovation $v_{ij}$ and innovation covariance $\Sigma_{IN,ij}$ defined on the previous slide.

46 Kalman Filter for Mobile Robot Localization: Estimation (Applying the Kalman Filter)
Kalman filter gain: $K(k+1) = P(k+1 \mid k)\, \nabla h^T\, \Sigma_{IN}^{-1}(k+1)$
Update of the robot's position estimate: $\hat{x}(k+1 \mid k+1) = \hat{x}(k+1 \mid k) + K(k+1)\, v(k+1)$
The associated variance: $P(k+1 \mid k+1) = P(k+1 \mid k) - K(k+1)\, \Sigma_{IN}(k+1)\, K^T(k+1)$

47 Kalman Filter for Mobile Robot Localization: Estimation (1D Case)
For the one-dimensional case, we can show that this estimation corresponds to the one-dimensional Kalman filter presented earlier.

48 Kalman Filter for Mobile Robot Localization: Estimation Example
Kalman filter estimation of the new robot position: by fusing the prediction of the robot position (magenta) with the innovation gained by the measurements (green), we get the updated estimate of the robot position (red).

49 Autonomous Indoor Navigation (Pygmalion EPFL)
Very robust on-the-fly localization
One of the first systems with probabilistic sensor fusion
47 steps, 78 m path length, realistic office environment, conducted 16 times, > 1 km overall distance
Partially difficult surfaces (laser), partially few vertical edges (vision)

50 Localization
Given: a map of the environment and a sequence of sensor measurements.
Wanted: an estimate of the robot's position.
Problem classes: position tracking; global localization; kidnapped robot problem (recovery).
"Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities." [Cox '91]

51 Landmark-based Localization

52 EKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m):
Prediction:
$G_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}$ (Jacobian of g w.r.t. location)
$V_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial u_t}$ (Jacobian of g w.r.t. control)
$M_t$ (motion noise covariance in control space)
$\bar\mu_t = g(u_t, \mu_{t-1})$ (predicted mean)
$\bar\Sigma_t = G_t \Sigma_{t-1} G_t^T + V_t M_t V_t^T$ (predicted covariance)

53 EKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m):
Correction:
$\hat{z}_t = h(\bar\mu_t, m)$ (predicted measurement mean)
$H_t = \frac{\partial h(\bar\mu_t, m)}{\partial x_t}$ (Jacobian of h w.r.t. location)
$S_t = H_t \bar\Sigma_t H_t^T + Q_t$ (predicted measurement covariance)
$K_t = \bar\Sigma_t H_t^T S_t^{-1}$ (Kalman gain)
$\mu_t = \bar\mu_t + K_t(z_t - \hat{z}_t)$ (updated mean)
$\Sigma_t = (I - K_t H_t)\, \bar\Sigma_t$ (updated covariance)

54 EKF Prediction Step

55 EKF Observation Prediction Step

56 EKF Correction Step

57 Estimation Sequence (1)

58 Estimation Sequence (2)

59 Comparison to Ground Truth

60 EKF Summary
Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: $O(k^{2.376} + n^2)$
Not optimal!
Can diverge if nonlinearities are large!
Works surprisingly well even when all assumptions are violated!

61 Linearization via Unscented Transform [figures: EKF vs. UKF]

62 UKF Sigma-Point Estimate (2) [figures: EKF vs. UKF]

63 UKF Sigma-Point Estimate (3) [figures: EKF vs. UKF]

64 Unscented Transform
Sigma points: $\chi^{[0]} = \mu$, $\chi^{[i]} = \mu \pm \left(\sqrt{(n+\lambda)\Sigma}\right)_i$ for $i = 1, \dots, n$
Weights: $w_m^{[0]} = \frac{\lambda}{n+\lambda}$, $w_m^{[i]} = \frac{1}{2(n+\lambda)}$ for $i = 1, \dots, 2n$
Pass the sigma points through the nonlinear function: $y^{[i]} = g(\chi^{[i]})$
Recover mean and covariance: $\mu' = \sum_i w_m^{[i]} y^{[i]}$, $\Sigma' = \sum_i w_c^{[i]} (y^{[i]} - \mu')(y^{[i]} - \mu')^T$
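
A minimal sketch of the transform; note it uses a single weight set for mean and covariance, whereas full UKF implementations use separate covariance weights with parameters alpha, beta, kappa:

```python
import numpy as np

def unscented_transform(mu, Sigma, g, lam=2.0):
    """Propagate N(mu, Sigma) through a nonlinear g via sigma points."""
    n = len(mu)
    L = np.linalg.cholesky((n + lam) * Sigma)     # matrix square root
    # 2n + 1 sigma points: the mean plus symmetric offsets along L's columns.
    points = [mu] + [mu + L[:, i] for i in range(n)] \
                  + [mu - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    w[0] = lam / (n + lam)
    Y = np.array([g(p) for p in points])          # transformed sigma points
    mu_y = w @ Y                                  # recovered mean
    Sigma_y = (w[:, None] * (Y - mu_y)).T @ (Y - mu_y)  # recovered covariance
    return mu_y, Sigma_y

# Example: polar-to-Cartesian conversion, a classic nonlinear test case.
g = lambda p: np.array([p[0] * np.cos(p[1]), p[0] * np.sin(p[1])])
print(unscented_transform(np.array([1.0, 0.5]), 0.01 * np.eye(2), g))
```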

65 UKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m):
Prediction:
$M_t$: motion noise covariance; $Q_t$: measurement noise covariance
Augmented state mean: $\mu_{t-1}^a = (\mu_{t-1}^T\;\; 0^T\;\; 0^T)^T$
Augmented covariance: $\Sigma_{t-1}^a = \mathrm{diag}(\Sigma_{t-1}, M_t, Q_t)$
Sigma points: $\chi_{t-1}^a = \big(\mu_{t-1}^a \;\; \mu_{t-1}^a + \gamma\sqrt{\Sigma_{t-1}^a} \;\; \mu_{t-1}^a - \gamma\sqrt{\Sigma_{t-1}^a}\big)$
Prediction of sigma points: $\bar\chi_t^x = g(u_t + \chi_{t-1}^u, \chi_{t-1}^x)$
Predicted mean: $\bar\mu_t = \sum_i w_m^{[i]}\, \bar\chi_t^{x[i]}$
Predicted covariance: $\bar\Sigma_t = \sum_i w_c^{[i]} (\bar\chi_t^{x[i]} - \bar\mu_t)(\bar\chi_t^{x[i]} - \bar\mu_t)^T$

66 UKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m):
Correction:
Measurement sigma points: $\bar{Z}_t = h(\bar\chi_t^x) + \chi_t^z$
Predicted measurement mean: $\hat{z}_t = \sum_i w_m^{[i]}\, \bar{Z}_t^{[i]}$
Predicted measurement covariance: $S_t = \sum_i w_c^{[i]} (\bar{Z}_t^{[i]} - \hat{z}_t)(\bar{Z}_t^{[i]} - \hat{z}_t)^T$
Cross-covariance: $\bar\Sigma_t^{x,z} = \sum_i w_c^{[i]} (\bar\chi_t^{x[i]} - \bar\mu_t)(\bar{Z}_t^{[i]} - \hat{z}_t)^T$
Kalman gain: $K_t = \bar\Sigma_t^{x,z} S_t^{-1}$
Updated mean: $\mu_t = \bar\mu_t + K_t(z_t - \hat{z}_t)$
Updated covariance: $\Sigma_t = \bar\Sigma_t - K_t S_t K_t^T$

67 EKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m), correction step repeated for comparison with the UKF:
$\hat{z}_t = h(\bar\mu_t, m)$ (predicted measurement mean)
$H_t = \frac{\partial h(\bar\mu_t, m)}{\partial x_t}$ (Jacobian of h w.r.t. location)
$S_t = H_t \bar\Sigma_t H_t^T + Q_t$ (predicted measurement covariance)
$K_t = \bar\Sigma_t H_t^T S_t^{-1}$ (Kalman gain)
$\mu_t = \bar\mu_t + K_t(z_t - \hat{z}_t)$ (updated mean)
$\Sigma_t = (I - K_t H_t)\, \bar\Sigma_t$ (updated covariance)

68 UKF Prediction Step

69 UKF Observation Prediction Step

70 UKF Correction Step

71 EKF Correction Step

72 Estimation Sequence [figures: EKF, PF, UKF]

73 Estimation Sequence [figures: EKF, UKF]

74 Prediction Quality [figures: EKF vs. UKF]

75 UKF Summary
Highly efficient: same asymptotic complexity as the EKF, a constant factor slower in typical practical applications
Better linearization than the EKF: accurate in the first two terms of the Taylor expansion (EKF only the first term)
Derivative-free: no Jacobians needed
Still not optimal!

76 Kalman Filter-based System [Arras et al. 98]
Laser range-finder and vision
High precision (< 1 cm accuracy)
[Courtesy of Kai Arras]

77 Autonomous Indoor Navigation (Pygmalion EPFL)
Very robust on-the-fly localization
One of the first systems with probabilistic sensor fusion
47 steps, 78 m path length, realistic office environment, conducted 16 times, > 1 km overall distance
Partially difficult surfaces (laser), partially few vertical edges (vision)

78 Multi-Hypothesis Tracking

79 Localization with MHT
Belief is represented by multiple hypotheses
Each hypothesis is tracked by a Kalman filter
Additional problems:
Data association: which observation corresponds to which hypothesis?
Hypothesis management: when to add / delete hypotheses?
Huge body of literature on target tracking, motion correspondence, etc.

80 MHT: Implemented System (1) [Jensfelt et al. '00]
Hypotheses are extracted from laser range-finder (LRF) scans
Each hypothesis has a probability of being the correct one
Hypothesis probabilities are computed using Bayes' rule
Hypotheses with low probability are deleted; new candidates are extracted from LRF scans (a sketch of the reweighting step follows)
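
A hedged sketch of the Bayes-rule reweighting and pruning step; this is an illustrative reconstruction, not Jensfelt et al.'s actual bookkeeping, and the pruning threshold is an assumption:

```python
import numpy as np

def update_hypothesis_probs(priors, likelihoods, min_prob=1e-3):
    """Reweight MHT hypotheses with Bayes' rule (posterior proportional
    to likelihood * prior), then prune hypotheses below min_prob."""
    post = np.asarray(priors, float) * np.asarray(likelihoods, float)
    post /= post.sum()                  # normalize to a distribution
    keep = post >= min_prob             # delete low-probability hypotheses
    post = post[keep]
    return post / post.sum(), keep      # renormalize the survivors

probs, keep = update_hypothesis_probs([0.5, 0.3, 0.2], [0.9, 0.1, 0.0001])
print(probs, keep)   # mass concentrates on the hypothesis explaining the scan
```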

81 MHT: Implemented System (2)
[Courtesy of P. Jensfelt and S. Kristensen]

82 MHT: Implemented System (3)
Example run: map and trajectory; number of hypotheses vs. time; $P(H_{best})$
[Courtesy of P. Jensfelt and S. Kristensen]

