
1 University of Colorado Boulder ASEN 5070: Statistical Orbit Determination I Fall 2014 Professor Brandon A. Jones Lecture 26: Singular Value Decomposition and Filter Augmentations

2  Homework due Friday  Lecture quiz due Friday  Exam 2 – Friday, November 7 ◦ No homework or lecture quiz due next week

3 Lecture Quiz 5

4  We recommend that those towards the bottom of the scale review Lectures 18-22 ◦ Next exam may include conceptual questions similar to these!

5  Percent Correct: 68.18%  Given infinite precision, the Kalman filter is more accurate than the batch (after one iteration). ◦ True ◦ False

6  Percent Correct: 86.36%  We are estimating the one-dimensional position and velocity of an object with linear dynamics moving away from the origin in the positive X direction. Before receiving any observations, we have a mean and covariance matrix describing the initial state. We observe the instantaneous distance from the origin to the object over time. We also know the variance of the observation error, which is independent of the state estimate probability density function.  In the scenario above, using a Kalman filter yields a solution to Bayes' theorem for updating the state estimate PDF. ◦ True ◦ False

7  In the derivation of the Kalman filter as a solution to the Bayesian estimation problem, what did we assume?

8  Percent Correct: 72.73%  We have measurements with an observation error variance of 4.0. After processing observations in the batch, we expect the standard deviation of the unweighted post-fit residuals to be: ◦ 1 ◦ 2 ◦ 4 ◦ 8

9  Should examine both the pre- and post-fit residuals:

10  In the example problem presented before the lecture, we discussed the analysis of residuals.  We expect the statistics of the post-fit residuals to match the observation error statistics ◦ More on this today and Wednesday

11  Percent Correct: 18.18%  Given an a priori estimated state for position and velocity (mean and covariance matrix), we can update the state with a single observation using the minimum variance estimator because: ◦ The minimum norm estimator is mathematically equivalent to the minimum variance estimator. (41%) ◦ The information matrix is full rank. (64%) ◦ Conceptually, we consider the a priori information as an observation of the true state, thereby yielding more measurements than unknowns. (73%) ◦ The minimum variance estimator actually does not work in the scenario described. (9%)

12  Minimum Variance: Yields the linear, unbiased, minimum variance estimate of the state  Minimum Norm: Adds constraints to the cost function to uniquely determine an estimated state

13  This is analogous to treating the a priori information as an observation of the estimated state at the epoch time

14  Percent Correct: 31.82%  The CKF updates the reference trajectory sequentially, i.e., as observations become available over time. ◦ True ◦ False

15 What are we doing with the reference trajectory at each update?

16 Singular Value Decomposition

17 Singular Value Decomposition-Based Least Squares (not in book)

18  The SVD of any real m×n matrix H is
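The slide's equation is an image and does not appear in the transcript; the standard form it presumably shows is

```latex
H = U S V^T, \qquad U \in \mathbb{R}^{m \times m}, \quad S \in \mathbb{R}^{m \times n}, \quad V \in \mathbb{R}^{n \times n},
```

where U and V are orthogonal and S is diagonal with the singular values s_1 ≥ s_2 ≥ … ≥ s_p ≥ 0, p = min(m, n), on its diagonal.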

19

20  It turns out that we can solve the linear system using the pseudoinverse given by the SVD
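A reconstruction of the standard result (the slide's own equations are not in the transcript): for the linear system H x = y, the pseudoinverse solution is

```latex
\hat{x} = H^{+} y = V S^{+} U^T y,
```

where S^+ ∈ R^{n×m} has 1/s_i on its diagonal for each nonzero singular value s_i and zeros elsewhere.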

21  For the linear system, the solution minimizes the least squares cost function
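For reference, the cost function in question is the usual unweighted least squares criterion (standard form, not copied from the slide):

```latex
J(x) = \tfrac{1}{2}\,(y - H x)^T (y - H x),
```

and among all minimizers, the pseudoinverse solution x̂ = V S⁺ Uᵀ y is the one of minimum norm.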

22  Recall that for the normal equations solution:  This squares the condition number of H!  Instead, the SVD operates directly on H, thereby improving solution accuracy
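The normal equations solution referenced here is presumably x̂ = (Hᵀ H)⁻¹ Hᵀ y; since κ(Hᵀ H) = κ(H)², forming Hᵀ H is what squares the condition number. A small NumPy check of that statement (my own illustration, not course code):

```python
import numpy as np

# A moderately ill-conditioned design matrix (Vandermonde columns: 1, t, t^2, ...).
t = np.linspace(0.0, 1.0, 50)
H = np.vander(t, 8, increasing=True)

print(np.linalg.cond(H))        # kappa(H)
print(np.linalg.cond(H.T @ H))  # approximately kappa(H)**2
```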

23  The covariance matrix P with R the identity matrix is:  Home Practice Exercise: Derive the equation for P above
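With R = I the batch covariance is P = (Hᵀ R⁻¹ H)⁻¹ = (Hᵀ H)⁻¹; written with the SVD factors (assuming H has full column rank), the equation the slide presumably shows is

```latex
P = (H^T H)^{-1} = \left(V S^T S V^T\right)^{-1} = V \,\mathrm{diag}\!\left(s_1^{-2}, \ldots, s_n^{-2}\right) V^T .
```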

24  Solving the LS problem via SVD provides one of the most (if not the most) numerically stable solutions  It is also a square-root method (does not square the condition number of H)  Generating the SVD is more computationally intensive than most methods
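A minimal NumPy sketch of the whole procedure (my own illustration under the assumptions R = I and full column rank; the names are mine, not the course's):

```python
import numpy as np

def svd_least_squares(H, y):
    """Least squares via the SVD of H; assumes R = I and full column rank."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)  # thin SVD: H = U @ diag(s) @ Vt
    x_hat = Vt.T @ ((U.T @ y) / s)   # x_hat = V S^{-1} U^T y
    P = (Vt.T / s**2) @ Vt           # P = V S^{-2} V^T = (H^T H)^{-1}
    return x_hat, P

# Example: fit a quadratic to noisy data.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 100)
H = np.vander(t, 3, increasing=True)  # columns: 1, t, t^2
y = H @ np.array([1.0, -0.5, 0.25]) + rng.normal(scale=2.0, size=t.size)
x_hat, P = svd_least_squares(H, y)
print(x_hat)                          # close to [1.0, -0.5, 0.25]
```

NumPy's np.linalg.lstsq solves the same problem through an SVD-based LAPACK routine and can be used as a cross-check.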

25 Predicted Residuals

26  Previously, we have discussed the pre-fit and post-fit residuals:  How can this change in the context of the CKF?
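As a reminder, in the book's notation (reconstructed here, since the slide's equations are images), with Y_i the observation and G(X_i*, t_i) the value computed from the reference trajectory:

```latex
\text{pre-fit residual:}\;\; y_i = Y_i - G(X_i^{*}, t_i), \qquad
\text{post-fit residual:}\;\; \epsilon_i = y_i - \tilde{H}_i \hat{x}_i .
```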

27  At each measurement time in the CKF, we can take a look at the prediction residual (sometimes called the innovation):  Covariance of the prediction residual (derived in HW 7):
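The standard expressions (a reconstruction, since the slide's equations are not in the transcript) are

```latex
e_k = y_k - \tilde{H}_k \bar{x}_k, \qquad
E\!\left[ e_k e_k^T \right] = \tilde{H}_k \bar{P}_k \tilde{H}_k^T + R_k,
```

where x̄_k and P̄_k are the time-updated (a priori) state deviation and covariance and R_k is the observation noise covariance.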

28  How might we use the prediction residual PDF?

29  Compute the prefit residual variance via:  An observation may be ignored in the filter if (for example):
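One common editing test (a sketch of the idea under my own naming, not the course's exact criterion): compare the prediction residual to its predicted standard deviation and skip the measurement update when it exceeds some multiple of that value, e.g. 3 sigma.

```python
import numpy as np

def accept_observation(y_k, H_k, x_bar, P_bar, R_k, k_sigma=3.0):
    """Scalar-observation editing test: reject if the prediction residual
    exceeds k_sigma predicted standard deviations."""
    e_k = y_k - H_k @ x_bar          # prediction (pre-fit) residual
    var_k = H_k @ P_bar @ H_k + R_k  # predicted residual variance
    return abs(e_k) <= k_sigma * np.sqrt(var_k), e_k
```

With k_sigma = 3, roughly 99.7% of well-modeled (Gaussian) observations pass the test.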

30 Bias Estimation

31  As shown in the homework, processing biased observations yields a biased estimator.  To compensate, we can estimate the bias:
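A common way to set this up (a sketch of the standard approach, not the slide's exact equations): append the bias to the estimated state, model it as a constant, and let it map directly into the measurement,

```latex
X_{aug} = \begin{bmatrix} X \\ b \end{bmatrix}, \qquad \dot{b} = 0, \qquad
Y = G(X, t) + b + \epsilon
\;\;\Rightarrow\;\;
\tilde{H}_{aug} = \begin{bmatrix} \tilde{H} & 1 \end{bmatrix},
```

so the filter estimates the bias along with the state.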

32  What are some example sources of bias in an observation?

33  GPS receiver solutions for Jason-2  Antenna is offset ~1.4 meters from the center of mass (COM)  What could be causing the bias change after 80 hours?

