1 University of Colorado Boulder ASEN 5070 Statistical Orbit Determination I, Fall 2012. Professor George H. Born and Professor Jeffrey S. Parker. Lecture 9: Least Squares Estimate

2 University of Colorado Boulder  Homework 2 Graded ◦ Comments included on D2L ◦ Any questions, talk with us this week ◦ Homework 3 Graded by Thursday EOD  Homework 4 due Thursday ◦ Solutions online now. ◦ The solutions show a lot of information – be sure to demonstrate that you know the answers and are not just copying the solutions!  Homework 5 out today 2

3 University of Colorado Boulder  Problem 1: ◦ Simulate the Stat OD process on a 2-dimensional, point-mass orbit.  Problem 2: ◦ Simple Batch Process exercise.  Problem 3: ◦ Process noisy observations of a spring-mass system. 3

5 University of Colorado Boulder  Review of Stat OD Concepts  Least Squares Estimator  Quiz: Observability ◦ ~10% of the class got 4/4 ◦ Majority of the class got 2/4 ◦ We’ll go over the topic more closely toward the end of the lecture! 5

7 University of Colorado Boulder  We have noisy observations of certain aspects of the system.  We need some way to relate each observation to the trajectory that we’re estimating. (Figure: observed range versus the range computed from the reference trajectory X*; the true range is unknown; ε = O − C is the “residual.”)

8 University of Colorado Boulder  How do we best fit the data? (Figure: three candidate fits through the residuals ε = O − C, labeled “No,” “Not bad,” and “Good.”)

9 University of Colorado Boulder  Lp norm estimators ◦ p = 1: weights all residuals evenly, but the cost function has a discontinuous derivative ◦ p = 2: smooth, but gives outliers extra influence ◦ Interior options (1 < p < 2): optimal for certain scenarios that involve outliers
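
For reference, the Lp cost function being compared here can be written as follows (a standard form; the slide’s own equation was not captured):

    J_p(\hat{\mathbf{x}}) = \sum_{i=1}^{l} \lvert \epsilon_i \rvert^{p},
    \qquad
    p = 2 \;\Rightarrow\; J(\hat{\mathbf{x}}) = \sum_{i=1}^{l} \epsilon_i^{2} \quad \text{(least squares)}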

10 University of Colorado Boulder  A good solution, and one that is easy to code up, is the least-squares solution.

11 University of Colorado Boulder  Linearization  Introduce the state deviation vector  If the reference/nominal trajectory is close to the truth trajectory, then a linear approximation is reasonable. 11
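
A minimal sketch of the linearization in the standard Stat OD notation (the slide’s equations were not captured):

    \mathbf{X}(t) = \mathbf{X}^{*}(t) + \mathbf{x}(t), \qquad
    \dot{\mathbf{X}} = F(\mathbf{X}, t)
    \;\Rightarrow\;
    \dot{\mathbf{x}}(t) \approx A(t)\,\mathbf{x}(t), \qquad
    A(t) = \left[ \frac{\partial F(\mathbf{X}, t)}{\partial \mathbf{X}} \right]^{*}

where x(t) is the state deviation vector and the asterisk denotes evaluation along the reference trajectory.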

12 University of Colorado Boulder  Goal of the Stat OD process: find the state deviation that best fits the observations.  The best fit trajectory is then the reference trajectory plus that estimated deviation, X*(t) + x̂(t). This is what we want.

13 University of Colorado Boulder  How do we map the state deviation vector from one time to another?  The state transition matrix. ◦ It tracks how a perturbation grows/shrinks/changes over time along the trajectory. 13

14 University of Colorado Boulder  The state transition matrix maps a deviation in the state from one epoch to another.  It is constructed via numerical integration, in parallel with the trajectory itself. 14
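
In the standard notation (the slide’s equations were not captured), the STM maps the deviation between epochs and is integrated alongside the state:

    \mathbf{x}(t_i) = \Phi(t_i, t_0)\,\mathbf{x}(t_0), \qquad
    \dot{\Phi}(t, t_0) = A(t)\,\Phi(t, t_0), \qquad
    \Phi(t_0, t_0) = I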

15 University of Colorado Boulder  The “A” Matrix: 15
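
As an illustration (my example, not taken from the slide), for the two-dimensional point-mass dynamics of Problem 1 the A matrix works out to:

    A = \begin{bmatrix} 0_{2\times 2} & I_{2\times 2} \\[4pt] \dfrac{\partial \ddot{\mathbf{r}}}{\partial \mathbf{r}} & 0_{2\times 2} \end{bmatrix},
    \qquad
    \frac{\partial \ddot{\mathbf{r}}}{\partial \mathbf{r}}
    = \frac{\mu}{r^{5}}
      \begin{bmatrix} 3x^{2} - r^{2} & 3xy \\ 3xy & 3y^{2} - r^{2} \end{bmatrix},
    \qquad r = \sqrt{x^{2} + y^{2}}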

19 University of Colorado Boulder  Step 1: Define your state and your dynamics. 19

20 University of Colorado Boulder  Step 1: Define your state and your dynamics. 20
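
For example, for the two-dimensional point-mass orbit of Problem 1, Step 1 might look like the following (a sketch; the slide’s own definitions were not captured, and μ is the gravitational parameter):

    \mathbf{X} = \begin{bmatrix} x \\ y \\ \dot{x} \\ \dot{y} \end{bmatrix}, \qquad
    \dot{\mathbf{X}} = F(\mathbf{X}, t) = \begin{bmatrix} \dot{x} \\ \dot{y} \\ -\mu x / r^{3} \\ -\mu y / r^{3} \end{bmatrix}, \qquad
    r = \sqrt{x^{2} + y^{2}}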

21 University of Colorado Boulder  Step 2: Integrate your state and STM. Use the “reshape” command to stack the STM columns under the state: X0 = [ X; reshape(Phi, 8*8, 1) ]
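
A minimal MATLAB sketch of this pattern, simplified to a 4-element state for the 2D point-mass orbit of Problem 1 (the value of mu, the initial conditions, tolerances, and function name are illustrative, not the class values):

    % Integrate the state and the 4x4 STM together by stacking them in one vector.
    mu    = 398600.4415;                       % km^3/s^2 (Earth value, for illustration)
    X0    = [7000; 0; 0; 7.5];                 % illustrative initial state [x; y; vx; vy]
    Phi0  = eye(4);                            % STM is the identity at the epoch
    Z0    = [X0; reshape(Phi0, 4*4, 1)];       % state on top, STM columns underneath
    opts  = odeset('RelTol', 1e-12, 'AbsTol', 1e-12);
    [t, Z] = ode45(@(t, Z) derivs(t, Z, mu), [0 6000], Z0, opts);

    function dZ = derivs(~, Z, mu)
        x = Z(1); y = Z(2); vx = Z(3); vy = Z(4);
        r   = sqrt(x^2 + y^2);
        Phi = reshape(Z(5:end), 4, 4);         % unpack the current STM
        % A = dF/dX for 2D two-body dynamics, evaluated along the trajectory
        G = mu/r^5 * [3*x^2 - r^2, 3*x*y; 3*x*y, 3*y^2 - r^2];
        A = [zeros(2), eye(2); G, zeros(2)];
        dX   = [vx; vy; -mu*x/r^3; -mu*y/r^3]; % two-body acceleration
        dPhi = A * Phi;                        % STM differential equation
        dZ   = [dX; reshape(dPhi, 4*4, 1)];    % restack for the integrator
    end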

22 University of Colorado Boulder  Step 2: Integrate your state and STM. 22

23 University of Colorado Boulder  Step 2: Integrate your state and STM. ◦ Did we do too much work? Yes, yes, we did. 23

24 University of Colorado Boulder  Step 2: Integrate your state and STM. ◦ Did we do too much work? Yes, yes, we did. We don’t have to integrate the constant parameters in the state!

25 University of Colorado Boulder  Step 2: Integrate your state and STM. ◦ Did we do too much work? Yes, yes, we did. We don’t have to integrate the constant parameters in the state, and we might not have to integrate the corresponding rows of the STM either!

26 University of Colorado Boulder  Step 2: Integrate your state and STM, initializing only the rows of the STM that change over time: Phi0 = eye(6,8); X0 = [ X; reshape(Phi0, 6*8, 1) ]

27 University of Colorado Boulder  Step 2: Integrate your state and STM. ◦ This is important for your final project. ◦ Rather than integrating 19x18 = 342 equations, integrate only the 10x6 = 60 equations that change over time.
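
A short MATLAB sketch of how the full STM can be rebuilt afterward for the 8-element-state example above, assuming the last two state elements are constants (the indexing is illustrative and depends on how the vector was stacked):

    % Zrow: one row of the integrator output, laid out as the 8 state elements
    % followed by the 6x8 block of the STM that was actually integrated.
    Zrow    = Z(end, :);                       % state and reduced STM at the final time
    PhiTop  = reshape(Zrow(9:end), 6, 8);      % the six rows that were integrated
    PhiFull = [PhiTop; zeros(2, 6), eye(2)];   % constant parameters map to themselves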

28 University of Colorado Boulder  Step 3: Process observations.  Step 4: Determine a new best estimate of the state.  Step 5: Iterate/repeat as needed. 28

29 University of Colorado Boulder  How do we map an observation to the trajectory? 29

30 University of Colorado Boulder  The Mapping Matrix 30
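
In the standard notation (the slide’s equations were not captured), the observation model and the mapping matrix are:

    \mathbf{Y}_i = G(\mathbf{X}_i, t_i) + \boldsymbol{\epsilon}_i, \qquad
    \mathbf{y}_i = \tilde{H}_i\,\mathbf{x}_i + \boldsymbol{\epsilon}_i, \qquad
    \tilde{H}_i = \left[ \frac{\partial G(\mathbf{X}, t_i)}{\partial \mathbf{X}} \right]^{*}

where y_i = Y_i − G(X_i*, t_i) is the observation residual (O − C).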

31 University of Colorado Boulder  Our state and measurement mappings. (Figure: observations mapped onto the reference trajectory X*; ε = O − C is the “residual.”)

32 University of Colorado Boulder  If the state has n unknown parameters and there are l observations at different times, then there are a total of n×l unknowns.  If we can instead relate each state to an epoch, then we can reduce the problem to only n unknown state parameters. If the errors are equal to zero, then we can deterministically solve this system using n linearly independent observations.
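
Mapping every observation to the epoch state with the STM (standard notation; the slide’s equations were not captured):

    \mathbf{y}_i = \tilde{H}_i\,\Phi(t_i, t_0)\,\mathbf{x}_0 + \boldsymbol{\epsilon}_i
    \equiv H_i\,\mathbf{x}_0 + \boldsymbol{\epsilon}_i, \qquad i = 1, \dots, l

so the only unknown state is the n-dimensional epoch deviation x_0.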

34 University of Colorado Boulder 34 If the errors are not zero, then we now have n unknown state parameters and l unknown errors. If we have multiple observation types (say p types), then we have m = p×l observations and a total of m unknown observation errors. Total unknowns: m+n Total equations: m Least squares criterion provides a best estimate solution in this scenario.

35 University of Colorado Boulder  Least squares estimate

36 University of Colorado Boulder  Least squares estimate
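
A sketch of the derivation these slides walk through, in the standard form (the slides’ own equations were not captured):

    J(\hat{\mathbf{x}}_0) = \tfrac{1}{2}\,\boldsymbol{\epsilon}^{T}\boldsymbol{\epsilon}
    = \tfrac{1}{2}\,(\mathbf{y} - H\hat{\mathbf{x}}_0)^{T}(\mathbf{y} - H\hat{\mathbf{x}}_0)

    \frac{\partial J}{\partial \hat{\mathbf{x}}_0} = 0
    \;\Rightarrow\;
    (H^{T}H)\,\hat{\mathbf{x}}_0 = H^{T}\mathbf{y}
    \;\Rightarrow\;
    \hat{\mathbf{x}}_0 = (H^{T}H)^{-1}H^{T}\mathbf{y}

where y and H stack the l mapped observations and their H_i matrices.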

37 University of Colorado Boulder  The state deviation vector that minimizes the least-squares cost function:  Some observations of this solution: ◦ What is the k? ◦ Are all data points equally weighted? ◦ Does this solution consider correlated data or statistical information in the observations? ◦ What happens if we acquire more data?

38 University of Colorado Boulder  The state deviation vector that minimizes the least-squares cost function:  Additional Details: ◦ H^T H is called the normal matrix ◦ If H is full rank, then this will be positive definite.  If it’s not, then we don’t have a least squares estimate! ◦ The best estimate of the observation errors is ε̂ = y − H x̂.
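
A minimal MATLAB sketch of these quantities, assuming H and y have already been accumulated (backslash is used instead of an explicit inverse for numerical stability):

    xhat     = (H' * H) \ (H' * y);      % least squares estimate of the state deviation
    epshat   = y - H * xhat;             % best estimate of the observation errors
    fullRank = (rank(H) == size(H, 2));  % H must be full rank for H'*H to be invertible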

40 University of Colorado Boulder The equations of motion, written in component form and then expressed in first-order form (the equations themselves were not captured in the transcript).
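
If this slide shows the spring-mass system of Example 4.2.1 (an assumption on my part; the equations were not captured), the component and first-order forms would look like:

    \ddot{x} + \omega^{2} x = 0, \qquad \omega^{2} = \frac{k_1 + k_2}{m}
    \quad\Longrightarrow\quad
    \frac{d}{dt}\begin{bmatrix} x \\ \dot{x} \end{bmatrix}
    = \begin{bmatrix} 0 & 1 \\ -\omega^{2} & 0 \end{bmatrix}
      \begin{bmatrix} x \\ \dot{x} \end{bmatrix}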

(Slides 41–44: the A and H̃ matrices are worked out for this example; the equations were not captured in the transcript.)

45 University of Colorado Boulder  Notice that all variables end up in either the A matrix or the H̃ matrix, but not necessarily both.

46 University of Colorado Boulder  Which equations of motion do we need to integrate?

47 University of Colorado Boulder  The previous example (4.2.1 from the book) left off at the definitions of the matrices.  The next example takes us through the least squares estimate.

48 University of Colorado Boulder Example of least squares Let (Note that this is a linear system) Assume we wish to estimate

49 University of Colorado Boulder Note that H^T H is always a symmetric matrix

50 University of Colorado Boulder Assume: Then

51 University of Colorado Boulder Assume: Then, i.e. we have chosen perfect observations

52 University of Colorado Boulder  Quick Break  Next subject: observability 52

(Slides 53–64: observability; the derivations and examples were not captured in the transcript.)

65 University of Colorado Boulder Three major shortcomings of the simple least squares solution:
1. Each observation error is weighted equally, even though the accuracy of the observations may differ.
2. The observation errors may be correlated (not independent), and the simple least squares solution makes no allowance for this.
3. The method does not consider that the errors are samples from a random process and makes no attempt to utilize any statistical information.

66 University of Colorado Boulder Approaches that address these shortcomings:
Weighted Least Squares - includes a weighting matrix for the observations
Minimum Variance - considers the statistical characteristics of the measurement errors
Minimum Variance with A Priori Information
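
For reference, the standard forms of these estimators (consistent with the textbook development, though the slides’ own equations were not captured):

    \hat{\mathbf{x}} = (H^{T} W H)^{-1} H^{T} W \mathbf{y}
    \qquad \text{(weighted least squares)}

    \hat{\mathbf{x}} = (H^{T} R^{-1} H)^{-1} H^{T} R^{-1} \mathbf{y}
    \qquad \text{(minimum variance, } R = E[\boldsymbol{\epsilon}\,\boldsymbol{\epsilon}^{T}]\text{)}

    \hat{\mathbf{x}} = (H^{T} R^{-1} H + \bar{P}^{-1})^{-1}\,(H^{T} R^{-1}\mathbf{y} + \bar{P}^{-1}\bar{\mathbf{x}})
    \qquad \text{(minimum variance with a priori } \bar{\mathbf{x}}, \bar{P}\text{)}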

71 University of Colorado Boulder The Weighting Matrix may be chosen by using the RMS of the observation residuals. Compute the RMS of the observation errors for each type of observation

72 University of Colorado Boulder Let i represent the observation type; for two observation types, the weights are built from the residuals of each type (the slide’s equations were not captured). We use the mean square (MS) so that the weighted residuals will be dimensionless. This can enhance the numerical stability of the normal equations.
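
A sketch of that choice for two observation types, with l_i residuals of type i (my notation, not necessarily the slide’s):

    \mathrm{MS}_i = \frac{1}{l_i}\sum_{j=1}^{l_i} \epsilon_{ij}^{2}, \qquad
    W = \begin{bmatrix} \dfrac{1}{\mathrm{MS}_1} I & 0 \\[6pt] 0 & \dfrac{1}{\mathrm{MS}_2} I \end{bmatrix}

so each observation type is weighted by the inverse of its mean-square residual, and each term in the weighted cost function is dimensionless.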

73 University of Colorado Boulder  Homework 2 Graded ◦ Comments included on D2L ◦ Any questions, talk with us this week ◦ Homework 3 Graded by Thursday EOD  Homework 4 due Thursday ◦ Solutions online now. ◦ The solutions show a lot of information – be sure to demonstrate that you know the answers and are not just copying the solutions!  Homework 5 out today 73

