Carnegie Mellon
Kalman and Kalman 50: Distributed and Intermittency
José M. F. Moura, joint work with Soummya Kar
Advanced Network Colloquium, University of Maryland, College Park, MD, November 04, 2011
Acknowledgements: NSF under grants CCF and CCF, and AFOSR grant FA

Carnegie Mellon
Outline
- Brief Historical Comments: From Kolmogorov to Kalman-Bucy
- Filtering Then … Filtering Today
- Consensus: Distributed Averaging in Random Environments
- Distributed Filtering: Consensus + Innovations
  - Random field (parameter) estimation: large scale
  - Intermittency: infrastructure failures, sensor failures
  - Random protocols: gossip
  - Limited resources: quantization
  - Linear parameter estimator: mixed time scale
- Linear Filtering: Intermittency and the Random Riccati Equation
  - Stochastic boundedness
  - Invariant distribution
  - Moderate deviation
- Conclusion

Carnegie Mellon
Outline
- Brief Historical Comments: From Kolmogorov to Kalman-Bucy
- Filtering Then … Filtering Today
- Consensus: Distributed Averaging in Random Environments
- Distributed Filtering: Consensus + Innovations
  - Random field (parameter) estimation: large scale
  - Intermittency: infrastructure failures, sensor failures
  - Random protocols: gossip
  - Limited resources: quantization
  - Linear parameter estimator: mixed time scale
- Linear Filtering: Intermittency and the Random Riccati Equation
  - Stochastic boundedness
  - Invariant distribution
  - Moderate deviation
- Conclusion

Carnegie Mellon
In the 40’s
- Wiener model, Wiener filter, Wiener-Hopf equation (1931; 1942)
- A. N. Kolmogorov, “Interpolation und Extrapolation von stationären zufälligen Folgen,” Bull. Acad. Sci. USSR, 1941
- Dec 1940: anti-aircraft control problem: extract signal from noise
- N. Wiener, “Extrapolation, Interpolation, and Smoothing of Stationary Time Series with Engineering Applications,” 1942; declassified, published by Wiley, NY, 1949

Carnegie Mellon
Norbert Wiener, The Extrapolation, Interpolation and Smoothing of Stationary Time Series with Engineering Applications. [Washington, D.C.: National Defense Research Council,] 1942.

Carnegie Mellon
Kalman 51
R. E. Kalman, “A New Approach to Linear Filtering and Prediction Problems,” Trans. of the ASME-J. of Basic Eng., 82 (Series D): 35-45, March 1960

Carnegie Mellon
Kalman-Bucy 50
R. E. Kalman and R. S. Bucy, “New Results in Linear Filtering and Prediction Theory,” Transactions of the ASME-Journal of Basic Eng., 83 (Series D): 95-108, March 1961

Carnegie Mellon
Outline
- Brief Historical Comments: From Kolmogorov to Kalman-Bucy
- Filtering Then … Filtering Today
- Consensus: Distributed Averaging in Random Environments
- Distributed Filtering: Consensus + Innovations
  - Random field (parameter) estimation: large scale
  - Intermittency: infrastructure failures, sensor failures
  - Random protocols: gossip
  - Limited resources: quantization
  - Linear parameter estimator: mixed time scale
- Linear Filtering: Intermittency and the Random Riccati Equation
  - Stochastic boundedness
  - Invariant distribution
  - Moderate deviation
- Conclusion

Carnegie Mellon
Filtering Then …
- Centralized
- Measurements always available (not lost)
- Optimality: structural conditions (observability/controllability)
- Applications: guidance, chemical plants, noisy images, …
- The recursion cycles through “Prediction”, “Innovations”, and the “Kalman Gain” update (sketched below)
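
A minimal Python sketch of that classical centralized recursion, to make the prediction/innovations/gain structure concrete; the matrices F, H, Q, R and all variable names are illustrative, not taken from the slides.

```python
import numpy as np

def kalman_step(x_hat, P, y, F, H, Q, R):
    """One predict/update cycle of the classical Kalman filter."""
    # Prediction: propagate the state estimate and error covariance
    x_pred = F @ x_hat
    P_pred = F @ P @ F.T + Q
    # Innovations: discrepancy between the measurement and its prediction
    innov = y - H @ x_pred
    S = H @ P_pred @ H.T + R              # innovations covariance
    # Kalman gain: optimal blend of prediction and innovations
    K = P_pred @ H.T @ np.linalg.inv(S)
    # Update
    x_new = x_pred + K @ innov
    P_new = (np.eye(len(x_hat)) - K @ H) @ P_pred
    return x_new, P_new
```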

Carnegie Mellon
Filtering Today: Distributed Solution
- Local communications
  - Agents communicate with neighbors
  - No central collection of data
- Cooperative solution
  - In isolation: myopic view and knowledge
  - Cooperation: better understanding / global knowledge
  - Iterative solution
- Realistic problem: intermittency (structural and random failures)
  - Sensors fail
  - Local communication channels fail
- Limited resources
  - Noisy sensors
  - Noisy communications
  - Limited bandwidth (quantized communications)
- Optimality
  - Asymptotically
  - Convergence rate

Carnegie Mellon
Outline
- Brief Historical Comments: From Kolmogorov to Kalman-Bucy
- Filtering Then … Filtering Today
- Consensus: Distributed Averaging
  - Standard consensus
  - Consensus in random environments
- Distributed Filtering: Consensus + Innovations
  - Random field (parameter) estimation
  - Realistic large scale problem:
    - Intermittency: infrastructure failures, sensor failures
    - Random protocols: gossip
    - Limited resources: quantization
  - Two linear estimators:
    - LU: stochastic approximation
    - GLU: mixed time scale estimator
  - Performance analysis: asymptotics
- Conclusion

Carnegie Mellon
Consensus: Distributed Averaging
- Network of (cooperating) agents updating their beliefs by local averaging with neighbors
- (Distributed) consensus: all agents converge to a common value
- Asymptotic agreement when the network is connected: λ2(L) > 0
DeGroot, JASA 1974; Tsitsiklis 1984; Tsitsiklis, Bertsekas, Athans, IEEE T-AC 1986; Jadbabaie, Lin, Morse, IEEE T-AC 2003
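
The update equation itself appeared as a figure on the slide; below is a minimal sketch of a standard consensus (distributed averaging) iteration, assuming a fixed undirected graph given by its Laplacian L and a step size alpha small enough that I - alpha*L is stable. All names and values are illustrative.

```python
import numpy as np

def consensus(x0, L, alpha=0.1, iters=200):
    """Distributed averaging: each agent repeatedly replaces its value with
    a weighted average of its own and its neighbors' values. Converges to
    the global average when the graph is connected (lambda_2(L) > 0)."""
    W = np.eye(L.shape[0]) - alpha * L     # consensus weight matrix
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = W @ x
    return x

# Path graph on 3 agents: Laplacian and initial beliefs
L = np.array([[1., -1., 0.], [-1., 2., -1.], [0., -1., 1.]])
print(consensus([0.0, 1.0, 5.0], L))       # -> approximately [2., 2., 2.]
```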

Carnegie Mellon
Consensus in Random Environments
- Consensus with random links and communication or quantization noise
- Consensus (reinterpreted): a.s. convergence to an unbiased random variable θ
Xiao, Boyd, Syst. Contr. Lett., 2004; Olfati-Saber, ACC 2005; Kar, Moura, Allerton 2006, T-SP 2010; Jakovetic, Xavier, Moura, T-SP 2010; Boyd, Ghosh, Prabhakar, Shah, T-IT 2006

Carnegie Mellon
Outline
- Brief Historical Comments: From Kolmogorov to Kalman-Bucy
- Filtering Then … Filtering Today
- Consensus: Distributed Averaging in Random Environments
- Distributed Filtering: Consensus + Innovations
  - Random field (parameter) estimation: large scale
  - Intermittency: infrastructure failures, sensor failures
  - Random protocols: gossip
  - Limited resources: quantization
  - Linear parameter estimator: mixed time scale
- Linear Filtering: Intermittency and the Random Riccati Equation
  - Stochastic boundedness
  - Invariant distribution
  - Moderate deviation
- Conclusion

Carnegie Mellon
In/Out Network Time Scale Interactions
- Consensus: in-network dominated interactions; fast communication (cooperation) vs. slow sensing (exogenous, local); time scales ζcomm « ζsensing
- Consensus + innovations: in- and out-of-network balanced interactions; communications and sensing at every time step; time scales ζcomm ~ ζsensing
- Distributed filtering: consensus + innovations

Carnegie Mellon
Filtering: Random Field
- Random field observed by a network of agents: each agent takes a local, partial measurement
- Intermittency: sensors fail at random times
- Structural failures (random links) / random protocol (gossip)
- Quantization / communication noise
- Observation noise: spatially correlated, temporally i.i.d.

Carnegie Mellon
Consensus+Innovations: Generalized Linear Unbiased
- Distributed inference: generalized linear unbiased (GLU) estimator
- Each iteration combines a consensus term (local averaging with neighbors, scaled by consensus weights) and an innovations term (prediction error on the new local observation, scaled by gain/innovations weights): the distributed analogue of the “Prediction” / “Innovations” / “Kalman Gain” cycle (a schematic update is sketched below)
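
A minimal sketch of an update with this consensus + innovations form, with decaying weight sequences alpha(i) for the innovations and beta(i) for consensus. The gains, weight exponents, and example system are illustrative stand-ins, not the exact GLU estimator of the paper.

```python
import numpy as np

def glu_step(x, i, y, H, L, a=0.5, b=0.3, tau=0.6):
    """One consensus+innovations iteration for all N agents.
    x: (N, d) local estimates; y: (N, m) local observations;
    H: (N, m, d) local observation matrices; L: graph Laplacian.
    beta(i) weights consensus, alpha(i) weights innovations;
    beta decays more slowly, so consensus dominates asymptotically."""
    beta = b / (i + 1) ** tau              # consensus weight
    alpha = a / (i + 1)                    # innovations weight
    consensus = -beta * (L @ x)            # pull toward neighbors' estimates
    innov = np.stack([H[n].T @ (y[n] - H[n] @ x[n]) for n in range(len(x))])
    return x + consensus + alpha * innov

# Toy run: 3 agents, 2-dim parameter, each agent sees one linear functional
rng = np.random.default_rng(1)
L = np.array([[1., -1., 0.], [-1., 2., -1.], [0., -1., 1.]])
H = np.stack([[[1.0, 0.0]], [[0.0, 1.0]], [[1.0, 1.0]]])   # shape (3, 1, 2)
theta = np.array([1.0, -2.0])
x = np.zeros((3, 2))
for i in range(5000):
    y = np.stack([H[n] @ theta + 0.1 * rng.standard_normal(1) for n in range(3)])
    x = glu_step(x, i, y, H, L)
print(x)   # each row approaches theta = [1, -2]
```

No single agent can estimate theta alone (each H[n] is rank one), but the sum of H[n].T @ H[n] over agents is full rank and the graph is connected, which is the distributed observability + connectivity condition the slides describe.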

Carnegie Mellon
Consensus+Innovations: Asymptotic Properties
- Properties: asymptotic unbiasedness, consistency, MS convergence, asymptotic normality
- Compare distributed to centralized performance
- Structural conditions:
  - Distributed observability: the matrix G is full rank
  - Distributed connectivity: the network is connected in the mean

Carnegie Mellon
Consensus+Innovations: GLU
- Observation model: linear local observations of the parameter
- Assumptions:
  - Observation noise: i.i.d. in time, spatially correlated
  - Laplacians L(i): i.i.d., independent of the noise
  - Distributed observability + connectedness on average
  - A6: weight sequences (decaying consensus and innovations gains)
- Estimator: the consensus + innovations update above
Soummya Kar, José M. F. Moura, IEEE J. Selected Topics in Signal Processing, Aug. 2011

Carnegie Mellon
Consensus+Innovations: GLU Properties
- A1-A6 hold, generic noise distribution (finite 2nd moment)
- Consistency: every sensor n is consistent
- Asymptotic normality: the asymptotic variance matches that of the centralized estimator
- Efficiency: if, further, the noise is Gaussian, the GLU estimator is asymptotically efficient
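
Schematically, the asymptotic normality claim takes the familiar form below (a reconstruction; the precise asymptotic covariance S and scaling are characterized in the cited paper):

```latex
\sqrt{i+1}\,\bigl(x_n(i)-\theta\bigr)\;\overset{d}{\longrightarrow}\;
\mathcal{N}(0,\,S), \qquad n = 1,\dots,N,
```

with S matching the asymptotic covariance of the centralized estimator (and, for Gaussian noise, the efficient one).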

Carnegie Mellon
Consensus+Innovations: Remarks on Proofs
- Define the estimation error process and find its dynamic equation
- Show the associated functional is a nonnegative supermartingale; it converges a.s., hence is pathwise bounded (this shows consistency)
- Strong convergence rates: study sample paths more critically
- Characterize information flow (consensus): study convergence to the averaged estimate
- Study limiting properties of the averaged estimate:
  - Rate at which the averaged estimate converges to the centralized estimate
  - Properties of the centralized estimator used to show convergence

Carnegie Mellon
Outline
- Intermittency: networked systems, packet loss
- Random Riccati Equation: stochastic boundedness
- Random Riccati Equation: invariant distribution
- Random Riccati Equation: moderate deviation principle
  - Rate of decay of the probability of rare events
  - Scalar numerical example
- Conclusions

Carnegie Mellon
Kalman Filtering with Intermittent Observations
- Model: linear state-space system observed through an unreliable link
- Intermittent observations: at each time step the measurement packet is received or lost at random
- Optimal linear filter (conditioned on the path of observations): the Kalman filter, whose error covariance now evolves by a random Riccati equation
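
A minimal simulation sketch of this setting, assuming i.i.d. Bernoulli(gamma) packet arrivals: when a packet is lost the filter can only predict, so the error covariance follows a random recursion. The system matrices and the value of gamma are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def riccati_step(P, arrived, F, H, Q, R):
    """Covariance update of the Kalman filter with intermittent observations:
    full Riccati step if the packet arrived, prediction-only step if lost."""
    P_pred = F @ P @ F.T + Q
    if not arrived:
        return P_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    return P_pred - K @ H @ P_pred

# Scalar unstable example (F = 1.2); packets arrive with probability gamma
F = np.array([[1.2]]); H = np.array([[1.0]])
Q = np.array([[1.0]]); R = np.array([[1.0]])
gamma, P = 0.7, np.array([[1.0]])
for t in range(50):
    P = riccati_step(P, rng.random() < gamma, F, H, Q, R)
print(P)   # one sample path of the random steady-state covariance
```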

Carnegie Mellon
Outline
- Intermittency: networked systems, packet loss
- Random Riccati Equation: stochastic boundedness
- Random Riccati Equation: invariant distribution
- Random Riccati Equation: moderate deviation principle
  - Rate of decay of the probability of rare events
  - Scalar numerical example
- Conclusions

Carnegie Mellon
Random Riccati Equation (RRE)
- The covariance sequence {P_t} is random: it depends on the random packet-arrival sequence
- Define operators f0(X) (prediction-only step) and f1(X) (full Riccati step) and re-express P_t as a random composition of f0 and f1
[2] S. Kar, B. Sinopoli, and J. M. F. Moura, “Kalman filtering with intermittent observations: weak convergence to a stationary distribution,” IEEE Trans. Automatic Control, Jan. 2012.
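
In the standard form of this construction (a reconstruction consistent with the intermittent-observation literature, not copied from the slide), f0 is the prediction-only (Lyapunov) map and f1 is the full Riccati map:

```latex
f_0(X) = F X F^{\top} + Q, \qquad
f_1(X) = F X F^{\top} + Q
         - F X H^{\top}\left(H X H^{\top} + R\right)^{-1} H X F^{\top},
```

so that P_{t+1} = f_{γ_t}(P_t), where γ_t ∈ {0, 1} indicates whether the packet at time t arrived.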

Carnegie Mellon
Outline
- Intermittency: networked systems, packet loss
- Random Riccati Equation: stochastic boundedness
- Random Riccati Equation: invariant distribution
- Random Riccati Equation: moderate deviation principle
  - Rate of decay of the probability of rare events
  - Scalar numerical example
- Conclusions

Carnegie Mellon
Random Riccati Equation: Invariant Distribution
- Stochastic boundedness: the sequence {P_t} is stochastically bounded as long as the rate of measurements is strictly positive

Carnegie Mellon
Moderate Deviation Principle (MDP)
- Interested in the probability of rare events: as γ → 1, the rare event is that the steady-state covariance stays away from P* (the deterministic Riccati limit)
- The RRE satisfies an MDP at a given scale: Pr(rare event) decays exponentially fast, with a good rate function
- The rate function is computed over strings of the operators f0 and f1, counting the number of loss (f0) steps in each string
Soummya Kar and José M. F. Moura, “Kalman Filtering with Intermittent Observations: Weak Convergence and Moderate Deviations,” IEEE Trans. Automatic Control

Carnegie Mellon
MDP for Random Riccati Equation
(figure: rate of decay of the probability that the steady-state covariance deviates from P*)
Soummya Kar and José M. F. Moura, “Kalman Filtering with Intermittent Observations: Weak Convergence and Moderate Deviations,” IEEE Trans. Automatic Control

Carnegie Mellon
Outline
- Intermittency: networked systems, packet loss
- Random Riccati Equation: stochastic boundedness
- Random Riccati Equation: invariant distribution
- Random Riccati Equation: moderate deviation principle
  - Rate of decay of the probability of rare events
  - Scalar numerical example
- Conclusions

Carnegie Mellon
Support of the Measure
- Example: scalar system
- Scalar Lyapunov/Riccati operators f0, f1
- The support is independent of the initial condition

Carnegie Mellon
Self-Similarity of Support of Invariant Measure
- The support of the invariant measure is “fractal like”
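
One way to visualize this in the scalar case, under illustrative parameters: apply every length-n string of the two operators to the deterministic Riccati fixed point P* and collect the results. This is only a sketch of the idea; the precise construction of the support is in the cited papers.

```python
from functools import reduce
from itertools import product

# Scalar Lyapunov/Riccati operators with illustrative parameters
a2, q, r = 1.44, 1.0, 1.0                  # a^2 = 1.2^2, unit noise, c = 1
f0 = lambda x: a2 * x + q                              # packet lost
f1 = lambda x: a2 * x + q - (a2 * x * x) / (x + r)     # packet received

# Fixed point P* of f1 (the deterministic Riccati limit), by iteration
p_star = 1.0
for _ in range(200):
    p_star = f1(p_star)

# Approximate support: every length-n string of operators applied to P*
n = 12
support = sorted({reduce(lambda x, s: (f1 if s else f0)(x), bits, p_star)
                  for bits in product((0, 1), repeat=n)})
print(len(support), support[:3])   # a clustered, self-similar set of points
```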

Carnegie Mellon
Class A Systems: MDP
- Scalar system
- Define the MDP scale and rate function for this class

Carnegie Mellon
MDP: Scalar Example
- Scalar system
Soummya Kar and José M. F. Moura, “Kalman Filtering with Intermittent Observations: Weak Convergence and Moderate Deviations,” IEEE Trans. Automatic Control

Carnegie Mellon
Outline
- Intermittency: networked systems, packet loss
- Random Riccati Equation: stochastic boundedness
- Random Riccati Equation: invariant distribution
- Random Riccati Equation: moderate deviation principle
  - Rate of decay of the probability of rare events
  - Scalar numerical example
- Conclusions

Carnegie Mellon
Conclusion
- Filtering 50 years after Kalman and Kalman-Bucy:
  - Consensus + innovations: large-scale networks of distributed agents
  - Intermittency: sensors fail; communication links fail
  - Gossip: random protocol
  - Limited power: quantization
  - Observation noise
- Linear estimators:
  - Interleave consensus and innovations
  - Single scale: stochastic approximation
  - Mixed scale: can optimize rate of convergence and limiting covariance
- Structural conditions: distributed observability + mean connectivity
- Asymptotic properties: distributed as good as centralized; unbiased, consistent, asymptotically normal; the mixed-scale estimator converges to the optimal centralized one

Carnegie Mellon
Conclusion
- Intermittency: packet loss
  - Stochastically bounded as long as the rate of measurements is strictly positive
  - Random Riccati equation: the probability measure of the random covariance is invariant to the initial condition
  - The support of the invariant measure is “fractal like”
  - Moderate deviation principle: rate of decay of the probability of “bad” (rare) events as the rate of measurements grows to 1
  - All of these quantities are computable

Carnegie Mellon
Thanks! Questions?