On Optimal Distributed Kalman Filtering in Non-ideal Situations

Presentation on theme: "On Optimal Distributed Kalman Filtering in Non-ideal Situations" – Presentation transcript:

1 On Optimal Distributed Kalman Filtering in Non-ideal Situations
Marc Reinhardt, Benjamin Noack; Intelligent Sensor-Actuator-Systems Laboratory (ISAS), Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany

2 Motivation
Image source: http://www.faz.net/-gup-7kc5e, Wolfgang Eilmes
Speaker notes: No simple task; ISAS drafted the proposal together with the company Hirschmann.

3 Monitoring Proposal: Stability of Cranes
Monitoring of the risk that the crane tips over.
Different sensor types (load sensor, wind sensor, inclination sensor, etc.), battery-powered sensors, wireless communication → preprocessing at the sensors.
Problem Setup: unknown state, interconnected sensors, uncertain measurements; calculate the optimal estimate $\hat{x}_k$ of the state $\boldsymbol{x}_k$.
Speaker notes: Cranes are dynamic, cable breaks must be prevented, and there are no electricity cables; hence battery power plus wireless communication. The goal is reliable communication while minimizing battery consumption and the number of transmissions. Observations are combined at the sensors, and estimates are communicated only when necessary.

4 Estimation Framework
System: linear(-ized) evolution model $\boldsymbol{x}_{k+1} = A_k \boldsymbol{x}_k + \boldsymbol{w}_k$, linear(-ized) measurement models $\boldsymbol{z}_k = H_k \boldsymbol{x}_k + \boldsymbol{v}_k$, independent noise terms $\boldsymbol{w}_k$, $\boldsymbol{v}_k$; prediction and filtering; the centralized optimum is the Kalman filter.
Problem Setup: unknown state, interconnected sensors, uncertain measurements; calculate the optimal estimate $\hat{x}_k$ of the state $\boldsymbol{x}_k$.
LMMSE Estimator: derive a (recursive) linear estimator that minimizes $\operatorname{tr}(C_k)$ with $C_k = \mathrm{E}\{(\hat{x}_k - \boldsymbol{x}_k)(\hat{x}_k - \boldsymbol{x}_k)^{\top}\}$; the estimator $\hat{x}_k$ is a function of the measurements, and the error $\hat{x}_k - \boldsymbol{x}_k$ is to be minimized.
Speaker notes: The abstract, general task is reduced to a tractable mathematical problem. We assume that linear models describing the evolution of the state can be inferred from the real-world problem. Design choices keep the complexity low: only (recursive) linear functions, a stochastic error description, and a quadratic (mean squared) error criterion.

5 Sensor Network
Sensor Network: multiple nodes are connected to one network.
One Phenomenon: the same phenomenon is observed by all sensors (relation to the crane example in the background).
Discrete-time: the processing is done at synchronized discrete time steps.

6 Centralized Processing
Optimal Estimator: centralized processing serves as the reference for sensor network algorithms.
Drawbacks: not scalable, poor reliability, high data traffic.

7 Communication in a Sensor Network
Communication: links are not stable.
Local Processing: measurements must therefore be processed locally.

8 Distributed Sensor Network
Decentralized Estimation: optimize the average estimation quality.
Distributed Estimation: optimize the estimation at the sink only.

9 Illustrating Example
Sensor Network: 100 sensor nodes with limited range and no communication.
Task: reconstruction of the 2D path of an object; estimates are stored every 10th time step, and the data is collected after 100 time steps.

10 Optimal Distributed Kalman Filter
A slightly modified Kalman filter, presented by Koch.
Application to the Example Scenario: the number of nodes as well as the global measurement model is known to all nodes.
Optimal: the results are equivalent to centralized processing.
Transformation of Local Estimates (Globalization): the initial estimates $\hat{x}_k^i, C_k^i$ are globalized for $i \in \{1,\dots,n\}$ via $\bar{x}_k^i = \bar{C}_k (C_k^i)^{-1} \hat{x}_k^i$ with $\bar{C}_k^{-1} = \sum_{i=1}^{n} (C_k^i)^{-1}$.
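A small sketch of this globalization step, assuming the initial local covariances are known to all nodes as the slide requires; the function and variable names are illustrative only:

```python
import numpy as np

def globalize_initial_estimates(x_local, C_local):
    """Globalization: rescale each local estimate with the common global covariance.

    x_local: list of local estimates x_k^i, C_local: list of local covariances C_k^i.
    Returns the globalized estimates and the common matrix C_bar.
    """
    C_bar_inv = sum(np.linalg.inv(Ci) for Ci in C_local)   # C_bar^{-1} = sum_i (C_k^i)^{-1}
    C_bar = np.linalg.inv(C_bar_inv)
    x_bar = [C_bar @ np.linalg.inv(Ci) @ xi for xi, Ci in zip(x_local, C_local)]
    return x_bar, C_bar
```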

11 Optimal Distributed Kalman Filter (2)
Local Filtering (with modified gains):
$\bar{x}_{k|k}^i = \bar{C}_{k|k}\left((\bar{C}_k^i)^{-1}\bar{x}_k^i + (H_k^i)^{\top}(C_k^{v,i})^{-1} z_k^i\right)$
$\bar{C}_{k|k}^{-1} = (\bar{C}_k^i)^{-1} + \frac{1}{n}\sum_{i=1}^{n} (H_k^i)^{\top}(C_k^{v,i})^{-1} H_k^i$
Local Relaxed Prediction (similar to the standard prediction):
$\bar{x}_{k+1}^i = A\,\bar{x}_k^i$, $\bar{C}_{k+1} = A\,\bar{C}_k A^{\top} + n\,C^w$
Fusion at the Data Sink:
$\hat{x}_k = \frac{1}{n}\sum_{i=1}^{n} \bar{x}_k^i$, $C_k = \frac{1}{n}\bar{C}_k$
Note: $\bar{C}_k$ is the same at each node; a local result on its own is NOT a valid estimate!
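Putting the three stages together, an illustrative NumPy sketch of this recursion (not code from the paper) could look as follows; global_info stands for the averaged global measurement information $\frac{1}{n}\sum_j (H_k^j)^{\top}(C_k^{v,j})^{-1} H_k^j$, which every node is assumed to know:

```python
import numpy as np

def dkf_local_filter(x_bar_i, C_bar, z_i, H_i, C_v_i, global_info):
    """Local filtering with modified gains at node i.

    global_info = (1/n) * sum_j H_j^T (C_v^j)^{-1} H_j (global measurement model).
    """
    C_filt = np.linalg.inv(np.linalg.inv(C_bar) + global_info)
    x_filt_i = C_filt @ (np.linalg.inv(C_bar) @ x_bar_i
                         + H_i.T @ np.linalg.inv(C_v_i) @ z_i)
    return x_filt_i, C_filt

def dkf_local_predict(x_i, C_bar, A, C_w, n):
    """Local relaxed prediction: the process noise is scaled by the number of nodes."""
    return A @ x_i, A @ C_bar @ A.T + n * C_w

def dkf_fuse(x_bars, C_bar):
    """Fusion at the data sink: simple average of the globalized estimates."""
    n = len(x_bars)
    return sum(x_bars) / n, C_bar / n
```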

12 Illustrating Example
Sensor Network: 100 sensor nodes (some defect, some working), no communication.
Task: reconstruct the 2D path of an object; the estimate is stored every 10th time step, and the data is collected after 100 time steps.

13 Missing Estimates / Failed Measurements
Example 1: the number of defect nodes is only approximately known; instead of the assumed 20, only 15 sensors are defect, i.e., 85 are functioning.
Average RMSE: x: BLUE 0.123, DKF 3.833; y: BLUE 0.138, DKF 2.687.
Result: the fused estimates are biased and significantly worse than the optimal ones.

14 Distributed Kalman Filter
Local Filtering (with modified gains):
$\bar{x}_{k|k}^i = \bar{C}_{k|k}\left((\bar{C}_k^i)^{-1}\bar{x}_k^i + (H_k^i)^{\top}(C_k^{v,i})^{-1} z_k^i\right)$
$\bar{C}_{k|k}^{-1} = (\bar{C}_k^i)^{-1} + \frac{1}{n}\sum_{i=1}^{n} (H_k^i)^{\top}(C_k^{v,i})^{-1} H_k^i$
Local Relaxed Prediction (similar to the standard prediction):
$\bar{x}_{k+1}^i = A\,\bar{x}_k^i$, $\bar{C}_{k+1} = A\,\bar{C}_k A^{\top} + n\,C^w$
Fusion at the Data Sink:
$\hat{x}_k = \frac{1}{n}\sum_{i=1}^{n} \bar{x}_k^i$, $C_k = \frac{1}{n}\bar{C}_k$
Note: $\bar{C}_k$ is the same at each node.

15 Weak Points of the DKF
The DKF works well when all requirements are fulfilled.
Model Knowledge: the global measurement model is not precisely known.
Defect Nodes: the number of functioning nodes is not known exactly.
Uncertainties: the measurement noise is variable or state dependent.
DKF in Non-ideal Situations: what happens when the assumptions about models, number of nodes, etc. are not met?

16 DKF in Non-ideal Situations
Basic Idea: derive a multiplicative bias-correction matrix $\Delta_k$ with $\mathrm{E}\{\boldsymbol{x}_k\} = \mathrm{E}\{\Delta_k\,\hat{x}_k^f\}$.
Structure: the standard DKF equations (initialization, prediction, filtering) are complemented by the calculation of the correction matrix.

17 DKF in Non-ideal Situations
Correct the Fused Estimate: we assume the globalized estimates to be combined according to $\hat{x}_k^f = \frac{1}{m}\sum_{i=1}^{m} \bar{x}_k^i$ and require $\mathrm{E}\{\boldsymbol{x}_k\} = \mathrm{E}\{\Delta_k\,\frac{1}{m}\sum_{i=1}^{m} \bar{x}_k^i\}$.
Initialization: $\Delta_k = I$.
Prediction: $\Delta_{k+1} = A\,\Delta_k A^{-1}$ and $\bar{x}_{k+1}^i = A\,\bar{x}_k^i$.
Filtering: $\bar{x}_{k|k}^i = \bar{C}_{k|k}\left((\bar{C}_k^i)^{-1}\bar{x}_k^i + (H_k^i)^{\top}(C_k^{v,i})^{-1} z_k^i\right)$ with $\tilde{C}_k^{-1} = \Delta_{k|k-1}\,\bar{C}_{k|k-1}^{-1} + \frac{1}{n}\sum_{i=1}^{m} (H_k^i)^{\top}(C_k^{v,i})^{-1} H_k^i$ and $\Delta_k = \bar{C}_{k|k}\,\tilde{C}_k^{-1}$.
Note: the actually utilized models must be known.
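To make the correction recursion concrete, here is a rough NumPy sketch under the notation reconstructed above; the symbol $\tilde{C}_k$, the argument names, and the bookkeeping of assumed vs. actually utilized models are my reconstruction, not code from the paper:

```python
import numpy as np

def correction_predict(Delta, A):
    """Prediction of the correction matrix (requires a regular state transition matrix A)."""
    return A @ Delta @ np.linalg.inv(A)

def correction_filter(Delta_pred, C_bar_pred, C_bar_filt, H_act, C_v_act, n):
    """Filtering of the correction matrix from the actually utilized measurement models.

    H_act / C_v_act: models of the m nodes that actually contributed at time k;
    n: the number of nodes assumed by the DKF.
    """
    info_act = sum(H.T @ np.linalg.inv(Cv) @ H
                   for H, Cv in zip(H_act, C_v_act)) / n
    C_tilde_inv = Delta_pred @ np.linalg.inv(C_bar_pred) + info_act
    return C_bar_filt @ C_tilde_inv          # Delta_k = C_bar_{k|k} * C_tilde_k^{-1}
```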

18 DKF in Non-ideal Situations
Globally Optimal: globally optimal results when the assumptions are correct ($\Delta = I$); the estimates are unbiased in any case.
Actual MSE: the actual expected error (MSE covariance matrix) can also be derived recursively in closed form.
Requirements: regular state transition matrix, regular correction matrix; the utilized models must be known at the fusion node.
Relaxations: reformulation as the Hypothesizing KF.

19 Missing Estimates
Example 1: the number of defect nodes is only approximately known; instead of the assumed 20, only 15 sensors are defect, i.e., 85 are functioning.
Average RMSE (BLUE / DKF / Extended DKF): x: 0.096 / 3.814 / …; y: 0.115 / 2.645 / …
Result: the fused estimates are unbiased and almost optimal.

20 Full Example
Example 1: the number of defect nodes is only approximately known (30 vs. 24); the measurement range is limited (15), and the measurement quality decreases linearly with the distance to the object according to $C_k^{v,i} = \operatorname{ceil}(\lVert p^i - p_k^x \rVert)\, C_k^v$.
Average RMSE: x: BLUE 1.323, HKF 1.363 (+3 %); y: BLUE 0.932, HKF (+0.7 %).
Result: the fused estimates are unbiased and almost optimal.
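A small sketch of this distance-dependent noise model; the range limit of 15 is taken from the slide, while the function and parameter names are illustrative assumptions:

```python
import numpy as np

def distance_dependent_noise(p_sensor, p_object, C_v, max_range=15.0):
    """Measurement noise that grows linearly with the distance to the object.

    Returns None when the object is outside the sensor range (no measurement).
    """
    d = np.linalg.norm(np.asarray(p_sensor) - np.asarray(p_object))
    if d > max_range:
        return None
    return np.ceil(d) * C_v    # C_k^{v,i} = ceil(||p^i - p_k^x||) * C_k^v
```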

21 Example 2: Hypothesizing Filtering
Use a hypothesis as a substitute for the unknown measurement capacity: centralized KF with per-sensor information $C_k^{z,s} = (H_k^s)^{\top}(C_k^{v,s})^{-1} H_k^s$ and global information $C_k^z = \sum_{s=1}^{S} (H_k^s)^{\top}(C_k^{v,s})^{-1} H_k^s$.
Setup: 40 sensors with distance-dependent measurement uncertainty.
(Figure: estimation error for a badly optimized hypothesis.)
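For illustration, the hypothesized global measurement information could be assembled as below; H_hyp and C_v_hyp are hypothesized sensor models, an assumption of this sketch rather than quantities defined on the slide:

```python
import numpy as np

def hypothesized_information(H_hyp, C_v_hyp):
    """Hypothesis C_k^z about the global measurement capacity, built from assumed sensor models.

    In the hypothesizing filter this sum stands in for the exact (but unknown)
    measurement information of the sensors that actually contribute at time k.
    """
    return sum(H.T @ np.linalg.inv(Cv) @ H for H, Cv in zip(H_hyp, C_v_hyp))
```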

22 Evaluation: Hypothesizing Filtering
Use a hypothesis as a substitute for the unknown measurement capacity.
(Figure: estimation error for a well-optimized hypothesis, averaged over 100 runs.)

23 Applications of Hypothesizing Filtering
Model Knowledge: the global measurement model is not precisely known → Assumption: only an assumption about the model is necessary.
Defect Nodes: the number of functioning nodes is not known exactly → Correction: the number of nodes does not affect the estimation quality.
Uncertainties: the measurement noise is variable or state dependent → Average: only the global uncertainty must be estimated.

24 Summary
Bias Calculation: the globalized estimates of the DKF are biased; we have derived this bias in closed form.
DKF in Non-ideal Situations: the bias leads to bad estimates when the assumptions are not met; we derived a correction matrix that yields an unbiased estimate, and even when the assumptions are not met, the result is almost optimal.
Restrictions: the derivation of the correction matrix requires the utilized models to be available at the fusion center.


