
Marginalization and sliding window estimation

1 Marginalization and sliding window estimation
Leo Koppel 13/06/2017

2 Consider a multivariate Gaussian distribution
Marginalization and sliding window| Introduction Consider a multivariate Gaussian distribution. Marginal distribution (Y is marginalized out); marginal distribution (X is marginalized out). “SLAM is tracking a normal distribution through a large state space” —Sibley, “A Sliding Window Filter for SLAM”

3 Basics of Gaussian distributions Marginalization The SLAM problem
Marginalization and sliding window| Overview Basics of Gaussian distributions; Marginalization; The SLAM problem; Applications: sliding window filters; Homework problems

4 Basics of Gaussian distributions
Marginalization and sliding window Part 1 Basics of Gaussian distributions What does covariance mean? What does inverse covariance mean?

5 Consider a Gaussian distribution with zero mean,
Marginalization and sliding window| Gaussian basics Consider a Gaussian distribution with zero mean, where A is the inverse of the covariance K (since the mean is zero, no mean term appears). Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.

6 Marginalization and sliding window| Gaussian basics
An example: y₂ is the outside temperature, y₁ is the temperature inside building 1, and y₃ is the temperature inside building 3. Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.

7 What is the covariance matrix?
Marginalization and sliding window| Gaussian basics What is the covariance matrix? Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.

8 What is the covariance matrix?
Marginalization and sliding window| Gaussian basics What is the covariance matrix? Eventually… Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.

9 What is the inverse covariance matrix?
Marginalization and sliding window| Gaussian basics What is the inverse covariance matrix? Recall the general form of P(y); write it out for this example and solve for A. Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.

10 What is the inverse covariance matrix?
Marginalization and sliding window| Gaussian basics What is the inverse covariance matrix? Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.

11 What is the inverse covariance matrix?
Marginalization and sliding window| Gaussian basics What is the inverse covariance matrix? Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.

12 What does the inverse covariance tell us?
Marginalization and sliding window| Gaussian basics What does the inverse covariance tell us? The meaning of a zero entry: conditional on all the other variables, variables i and j are independent. Note that (K⁻¹)₁₃ = 0 does not mean y₁ and y₃ are uncorrelated. Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.
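This can be checked numerically for the temperature example. The sketch below (a minimal NumPy example; the unit-variance numbers are illustrative assumptions, not Mackay's exact values) builds the covariance for y₁ = y₂ + noise and y₃ = y₂ + noise, and shows that the (1,3) entry of the inverse covariance is zero even though y₁ and y₃ are correlated:

```python
import numpy as np

# Illustrative covariance for y1 = y2 + e1, y3 = y2 + e3, with y2, e1, e3
# independent unit-variance Gaussians (ordering: y1, y2, y3).
K = np.array([[2.0, 1.0, 1.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

A = np.linalg.inv(K)  # inverse covariance (information matrix)

print(K[0, 2])                    # 1.0 -> y1 and y3 are correlated
print(np.isclose(A[0, 2], 0.0))  # True -> independent conditional on y2
```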

13 Marginalization and sliding window| Gaussian basics
From: Mackay, David. “The humble Gaussian distribution,” 2006.

14 Marginalization and sliding window| Gaussian basics
Answer: A and B can be covariance matrices; C and D can be inverse covariance matrices. Look at element (1,3): a zero in the covariance would mean y₁ and y₃ are uncorrelated (false here), while a zero in the inverse covariance means y₁ and y₃ are independent conditional on y₂ (true here). From: Mackay, David. “The humble Gaussian distribution,” 2006.

15 Marginalization Part 2 How do we remove a variable in covariance form?
Marginalization and sliding window Part 2 Marginalization How do we remove a variable in covariance form? How do we remove a variable in information form? What is the Schur complement?

16 Taking away a variable Marginarine
Marginalization and sliding window| Marginalization, covariance form Taking away a variable Yes, this is it! Marginalization! Marginarine Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006. BMK Wikimedia
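In covariance form, marginalizing is just deleting the rows and columns of the removed variable. A minimal sketch with made-up numbers, checked by sampling (the retained variables alone must have exactly the block covariance):

```python
import numpy as np

rng = np.random.default_rng(0)

# An illustrative joint covariance over (y1, y2, y3); we marginalize out y3.
K = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 1.5],
              [0.5, 1.5, 2.0]])

# Marginalization in covariance form: just keep the block for (y1, y2).
K_marg = K[:2, :2]

# Check by sampling: the (y1, y2) components alone have covariance K_marg.
samples = rng.multivariate_normal(np.zeros(3), K, size=200_000)
print(np.cov(samples[:, :2], rowvar=False).round(1))  # ≈ [[4. 1.] [1. 3.]]
```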

17 What about the inverse covariance matrix?
Marginalization and sliding window| Marginalization, information form What about the inverse covariance matrix? It is not so easy if you don’t have the coloured terms! Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.

18 Marginalization and sliding window| Schur complement
Consider a matrix where A and D are square and invertible. We can turn it into a block upper- or lower-triangular matrix: The Schur complement is Combining these, we get a block diagonal matrix: Source: Sibley, Gabe. “A Sliding Window Filter for SLAM.” (2006)

19 Marginalization and sliding window| Schur complement
We can decompose the matrix as Then the inverse is: Source: Sibley, Gabe. “A Sliding Window Filter for SLAM.” (2006)
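The decomposition and the resulting inverse formula can be verified numerically. A small sketch (the random positive-definite matrix and the 3/2 block split are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 5))
M = X @ X.T + 5 * np.eye(5)          # a well-conditioned 5x5 test matrix
A, B = M[:3, :3], M[:3, 3:]          # partition into blocks
C, D = M[3:, :3], M[3:, 3:]

S = A - B @ np.linalg.inv(D) @ C     # Schur complement of D in M

# Block-diagonalization: M = U @ diag(S, D) @ L with unit-triangular U, L.
I3, I2 = np.eye(3), np.eye(2)
U = np.block([[I3, B @ np.linalg.inv(D)], [np.zeros((2, 3)), I2]])
L = np.block([[I3, np.zeros((3, 2))], [np.linalg.inv(D) @ C, I2]])
blk = np.block([[S, np.zeros((3, 2))], [np.zeros((2, 3)), D]])
print(np.allclose(M, U @ blk @ L))   # True

# Consequently the top-left block of M^-1 is the inverse Schur complement.
print(np.allclose(np.linalg.inv(M)[:3, :3], np.linalg.inv(S)))  # True
```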

20 Marginalization and sliding window| Schur complement
Schur derivation #1: show that marginalization of covariance is extracting a block Split the state into two sections, where A is cov(a), B is cov(b), C is cross-terms Write the covariance matrix as Then Substitute for the inverse covariance: Sources: Sibley, Gabe. “A Sliding Window Filter for SLAM.” (2006) Huang, Gary B. “Conditional and marginal distributions of a multivariate Gaussian.” (2010)

21 Marginalization and sliding window| Schur complement
See: Huang, Gary B. “Conditional and marginal distributions of a multivariate Gaussian.” (2010)

22 In the information form, it requires the Schur complement.
Marginalization and sliding window| Schur complement This shows that marginalizing a multivariate Gaussian distribution is as simple as taking a block of the covariance. In the information form, it requires the Schur complement. For a formal proof see: Schön, Thomas B., and Fredrik Lindsten. “Manipulating the multivariate Gaussian density.” Division of Automatic Control, Linköping University, Sweden, Tech. Rep. (2011).

23 Marginalization and sliding window| Schur complement
Schur derivation #2: what is marginalization in information form?
Original covariance | Original information
Marginalized covariance | Marginalized information: ?

24 Marginalization and sliding window| Schur complement
From before, Multiply it out: That Schur was fun!
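The result of this derivation, that the marginalized information matrix is the Schur complement, can be checked against the covariance-form block rule. A minimal sketch with an arbitrary random covariance:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((4, 4))
K = X @ X.T + 4 * np.eye(4)          # covariance over x = (a, b), a = dims 0-1
Lam = np.linalg.inv(K)               # information matrix

# Information-form marginalization of b: Schur complement onto the a-block.
Laa, Lab = Lam[:2, :2], Lam[:2, 2:]
Lba, Lbb = Lam[2:, :2], Lam[2:, 2:]
Lam_marg = Laa - Lab @ np.linalg.inv(Lbb) @ Lba

# Covariance-form marginalization is just the (a, a) block; they must agree.
print(np.allclose(Lam_marg, np.linalg.inv(K[:2, :2])))  # True
```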

25 The marginal distribution P(a) is given by:
Marginalization and sliding window| Marginalization, information form To recap: This matrix represents the probability distribution P(a, b) in information form: The marginal distribution P(a) is given by: For another derivation see: Thrun, Sebastian, and Michael Montemerlo. “The Graph SLAM Algorithm with Applications to Large-Scale Mapping of Urban Structures.” The International Journal of Robotics Research 25.5–6 (2006): 403–429.

26 Let’s apply that to the example:
Marginalization and sliding window| Marginalization, information form Let’s apply that to the example: Adapted from: Mackay, David. “The humble Gaussian distribution,” 2006.

27 Marginalization and sliding window| Covariance and information duality
Marginalization is easy in the covariance form, but hard in the information form. Conditioning is easy in information form, but hard in covariance form. Source: Walter, Matthew, Ryan Eustice, and John Leonard. "A provably consistent method for imposing sparsity in feature-based SLAM information filters." Robotics Research. Springer Berlin Heidelberg,
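This duality can be demonstrated in a few lines: conditioning, which needs a Schur complement in covariance form, is just block extraction in information form. A sketch with illustrative numbers:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((4, 4))
K = X @ X.T + 4 * np.eye(4)          # covariance over (a, b), a = dims 0-1
Lam = np.linalg.inv(K)               # information matrix

# Conditioning a on b: Schur complement in covariance form...
K_cond = K[:2, :2] - K[:2, 2:] @ np.linalg.inv(K[2:, 2:]) @ K[2:, :2]
# ...but simply the (a, a) block in information form.
print(np.allclose(np.linalg.inv(K_cond), Lam[:2, :2]))  # True
```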

28 The SLAM problem Part 3 How is SLAM a least squares problem?
Marginalization and sliding window Part 3 The SLAM problem How is SLAM a least squares problem? Where does marginalization apply?

29 Consider a sequence of robot poses
Marginalization and sliding window| Back to batch Consider a sequence of robot poses, with a motion model, and a measurement model with random noise.

30 Estimate over all m poses:
Marginalization and sliding window| Back to batch Estimate over all m poses: The PDF for the whole path is Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. “A sliding window filter for incremental SLAM.” Unifying perspectives in computational and robot vision. Springer US,

31 Adding the measurement model,
Marginalization and sliding window| Back to batch Adding the measurement model, The PDF for the measurements is Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. “A sliding window filter for incremental SLAM.” Unifying perspectives in computational and robot vision. Springer US,

32 We also have a prior for the first pose
Marginalization and sliding window| Back to batch We also have a prior for the first pose The posterior probability of the system is We can put measurements, motion, and prior together: Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. “A sliding window filter for incremental SLAM.” Unifying perspectives in computational and robot vision. Springer US,

33 Take log of posterior probability:
Marginalization and sliding window| Back to batch Take the log of the posterior probability: this is a non-linear least squares problem. Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. “A sliding window filter for incremental SLAM.” Unifying perspectives in computational and robot vision. Springer US,

34 Marginalization and sliding window| Demo: MAP vs EKF

35 Extend to an actual SLAM problem, with landmarks. A few changes
Marginalization and sliding window| Extending to SLAM Extend to an actual SLAM problem, with landmarks. A few changes Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. “A sliding window filter for incremental SLAM.” Unifying perspectives in computational and robot vision. Springer US,

36 The Gauss-Newton solution: an iterative method
Marginalization and sliding window| SLAM as a NLS problem The Gauss-Newton solution: an iterative method, where G is the Jacobian of g(x). Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. “A sliding window filter for incremental SLAM.” Unifying perspectives in computational and robot vision. Springer US,
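The Gauss-Newton iteration can be sketched on a toy one-parameter problem (the model y = exp(a·t) and its data are illustrative inventions, not the SLAM formulation; the measurement covariance is taken as identity, so Σ⁻¹ drops out of the update):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 20)
y = np.exp(0.7 * t)                        # noiseless data generated with a = 0.7

a = 0.0                                    # initial guess
for _ in range(10):
    r = y - np.exp(a * t)                  # residual
    G = (t * np.exp(a * t)).reshape(-1, 1) # Jacobian of the model w.r.t. a
    delta = np.linalg.solve(G.T @ G, G.T @ r)  # Gauss-Newton normal equations
    a += delta[0]

print(round(a, 4))                         # 0.7
```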

38 Marginalization and sliding window| SLAM as a NLS problem
Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. “A sliding window filter for incremental SLAM.” Unifying perspectives in computational and robot vision. Springer US,

39 Marginalization and sliding window| SLAM marginalization
The same as our simple example (slide 24), except marginalizing out the start of the state instead of the end. Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. "Sliding window filter with application to planetary landing." Journal of Field Robotics 27.5 (2010):

40 Applications Part 4 Sliding Window Filter
Marginalization and sliding window Part 4 Applications Sliding Window Filter Multi-State Constraint Kalman Filter Sliding Window Factor Graphs

41 The Sliding Window Filter algorithm
Marginalization and sliding window| Sliding Window Filter The Sliding Window Filter algorithm Keep a state vector of k poses. On each step: add new pose parameters; remove old parameters; add new landmark parameters; update parameters. Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. "Sliding window filter with application to planetary landing." Journal of Field Robotics 27.5 (2010):

42 The Sliding Window Filter algorithm
Marginalization and sliding window| Sliding Window Filter The Sliding Window Filter algorithm Keep a state vector of k poses. On each step:
1. Add new pose parameters: apply the process model and use Gauss-Newton to update the information matrix.
2. Remove old parameters: if there are more than k poses, marginalize out the oldest pose using the Schur complement; also marginalize out any landmarks no longer visible.
3. Add new landmark parameters.
4. Update parameters: solve the least squares problem with all measurements (Gauss-Newton plus some outlier rejection method).
Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. "Sliding window filter with application to planetary landing." Journal of Field Robotics 27.5 (2010):
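The marginalization step above can be sketched as a generic Schur-complement routine on an information-form Gaussian; this is a minimal illustration with made-up numbers, not Sibley's implementation:

```python
import numpy as np

def marginalize_oldest(Lam, eta, d):
    """Marginalize the first d state dimensions out of an information-form
    Gaussian (information matrix Lam, information vector eta = Lam @ mean)."""
    Laa, Lab = Lam[:d, :d], Lam[:d, d:]
    Laa_inv = np.linalg.inv(Laa)
    Lam_new = Lam[d:, d:] - Lab.T @ Laa_inv @ Lab
    eta_new = eta[d:] - Lab.T @ Laa_inv @ eta[:d]
    return Lam_new, eta_new

# Demo on a hypothetical 3-pose window of scalar poses:
K = np.array([[2.0, 1.0, 0.5],
              [1.0, 2.0, 1.0],
              [0.5, 1.0, 2.0]])
mu = np.array([1.0, 2.0, 3.0])
Lam = np.linalg.inv(K)
eta = Lam @ mu

Lam2, eta2 = marginalize_oldest(Lam, eta, 1)
print(np.allclose(np.linalg.inv(Lam2), K[1:, 1:]))       # True: marginal cov
print(np.allclose(np.linalg.solve(Lam2, eta2), mu[1:]))  # True: marginal mean
```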

43 Marginalization and sliding window| Graphical example
Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. "Sliding window filter with application to planetary landing." Journal of Field Robotics 27.5 (2010):

44 Marginalize out first pose
Marginalization and sliding window| Graphical example with extra graphs [Graphs: the initial system; after marginalizing out the first pose; after marginalizing out the first landmark] Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. "Sliding window filter with application to planetary landing." Journal of Field Robotics 27.5 (2010):

45 Marginalization and sliding window| Effect of window size
Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. "Sliding window filter with application to planetary landing." Journal of Field Robotics 27.5 (2010):

46 The SWF’s purpose is O(1) complexity.
Marginalization and sliding window| SWF results The SWF’s purpose is O(1) complexity. [Plots: state vector size and time per frame (s) vs. frame, for full SLAM (no marginalization) and the SWF (with marginalization)] Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. "Sliding window filter with application to planetary landing." Journal of Field Robotics 27.5 (2010):

47 Sibley argues that optimization is “greatly superior to filtering.”
Marginalization and sliding window| SWF results Sibley argues that optimization is “greatly superior to filtering.” Source: Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. "Sliding window filter with application to planetary landing." Journal of Field Robotics 27.5 (2010):

48 Multi-State Constraint Kalman Filter (MSCKF)
Marginalization and sliding window| MSCKF Multi-State Constraint Kalman Filter (MSCKF): a filtering method (EKF); the goal is pose estimation only (not mapping); keeps a window of camera poses; works in covariance form. Mourikis, A. I., and S. I. Roumeliotis. “A Multi-State Constraint Kalman Filter for Vision-Aided Inertial Navigation.” Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), April 2007.

49 Multi-State Constraint Kalman Filter (MSCKF)
Marginalization and sliding window| MSCKF Multi-State Constraint Kalman Filter (MSCKF) [Diagram comparing the sliding window filter / SLAM state with the MSCKF state]

50 Marginalization and sliding window| MSCKF

51 Sliding-Window Factor Graphs (SWFG)
Marginalization and sliding window| SWFG Sliding-Window Factor Graphs (SWFG): an extension of iSAM2; keeps a map of 3D landmarks; a short-term smoother handles constraints from landmark measurements; a long-term smoother handles loop closure; old states are marginalized out of the graph. Chiu, Han-pang et al. “Robust Vision-Aided Navigation Using Sliding-Window Factor Graphs.” (2013)

52 Interesting three-stage handling of landmarks
Marginalization and sliding window| SWFG Interesting three-stage handling of landmarks:
1. A feature is initialized as a binary factor with its 3D information marginalized out.
2. The landmark is stored as a variable; re-projection factors are added to all poses from which it is observed.
3. The landmark is marginalized out into binary factors.
Chiu, Han-pang et al. “Robust Vision-Aided Navigation Using Sliding-Window Factor Graphs.” (2013)

53 Marginalization and sliding window
Part 5 Homework problems

54 Homework problem #1 (easy)
Marginalization and sliding window| Homework Homework problem #1 (easy) Answer questions 3-5 from “The humble Gaussian distribution” Mackay, David. “The humble Gaussian distribution,” 2006.

56 Homework problem #2 (hard)
Marginalization and sliding window| Homework Homework problem #2 (hard) Consider Example 2 from “The humble Gaussian distribution”: a) Find the covariance and inverse covariance matrix b) Marginalize out y1 in each form Mackay, David. “The humble Gaussian distribution,” 2006.

57 References
Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. “A sliding window filter for incremental SLAM.” Unifying perspectives in computational and robot vision. Springer US.
Sibley, Gabe, Larry Matthies, and Gaurav Sukhatme. “Sliding window filter with application to planetary landing.” Journal of Field Robotics 27.5 (2010).
Schön, Thomas B., and Fredrik Lindsten. “Manipulating the multivariate Gaussian density.” Division of Automatic Control, Linköping University, Sweden, Tech. Rep. (2011).
Mackay, David. “The humble Gaussian distribution,” 2006.
Huang, Gary B. “Conditional and marginal distributions of a multivariate Gaussian.” (2010)
Walter, Matthew, Ryan Eustice, and John Leonard. “A provably consistent method for imposing sparsity in feature-based SLAM information filters.” Robotics Research. Springer Berlin Heidelberg.
Thrun, Sebastian, and Michael Montemerlo. “The Graph SLAM Algorithm with Applications to Large-Scale Mapping of Urban Structures.” The International Journal of Robotics Research 25.5–6 (2006): 403–429.
Mourikis, A. I., and S. I. Roumeliotis. “A Multi-State Constraint Kalman Filter for Vision-Aided Inertial Navigation.” Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), April 2007.
Chiu, Han-pang et al. “Robust Vision-Aided Navigation Using Sliding-Window Factor Graphs.” (2013)

