Soft Sensor for Faulty Measurements Detection and Reconstruction in Urban Traffic Department of Adaptive Systems, Institute of Information Theory and Automation, June 2010, Prague

Outline  Problem description  Soft sensors  Gaussian Process models  Soft sensor for faulty measurement detection and reconstruction  Conclusions

Problem description  Traffic crossroads – counting of vehicles  Inductive loops are a popular sensor choice  A sensor failure is devastating for the traffic control system  Goal: failure detection and recovery of the sensor signal

Example of a controlled network (Zličín shopping centre, Prague)  Sensors at crossroads  On failure, the control system has no means to react  Possible solution: a soft sensor for failure detection and signal reconstruction

Soft sensors  Models that estimate a hard-to-measure variable from other measured variables  The term ‘soft sensor’ comes mainly from process engineering  Applications in various engineering fields  Model-driven and data-driven soft sensors  Issues: missing data, outliers, drifting data, data collinearity, different sampling rates, measurement delays

Outline  Problem description  Soft sensors  Gaussian Process models  Soft sensor for faulty measurement detection and reconstruction  Conclusions

 Probabilistic (Bayesian) nonparametric model.  The GP model is determined by: its input/output data (individual data points, not signals) used as learning data (identification data), D = {(x_i, y_i), i = 1, …, N}, and the covariance matrix K with elements K_ij = C(x_i, x_j). GP model

 Covariance function: a functional part and a noise part; stationary/nonstationary, periodic/aperiodic, etc. It expresses prior knowledge about the system properties. A frequent choice is the Gaussian (squared exponential) covariance function, C(x_p, x_q) = v_1 exp[−½ Σ_d w_d (x_pd − x_qd)²] + v_0 δ_pq, which yields »smooth functions »stationary functions Covariance function

 Identification of a GP model = optimisation of the covariance function parameters (hyperparameters) Cost function: the marginal likelihood of the learning data, which is maximised (equivalently, the negative log marginal likelihood is minimised) Hyperparameters
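The identification step above can be sketched in a few lines of numpy (a minimal sketch; the function name and the single-width parameterisation of the squared-exponential covariance are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def neg_log_marginal_likelihood(theta, X, y):
    """Negative log marginal likelihood of a GP with a squared-exponential
    covariance plus noise; theta = (log w, log v1, log v0)."""
    w, v1, v0 = np.exp(theta)                      # positive hyperparameters
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    K = v1 * np.exp(-0.5 * w * d2) + v0 * np.eye(len(X))
    L = np.linalg.cholesky(K)                      # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # alpha = K^-1 y
    return (0.5 * y @ alpha                        # data-fit term
            + np.log(np.diag(L)).sum()             # 0.5 * log |K|
            + 0.5 * len(y) * np.log(2 * np.pi))    # normalisation constant
```

Minimising this function over `theta`, e.g. with `scipy.optimize.minimize`, yields the hyperparameters.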

GP model prediction  Prediction of the output is based on the similarity between the test input and the training inputs  Output: a normal distribution with predicted mean μ(x*) = k(x*)ᵀ K⁻¹ y and prediction variance σ²(x*) = κ(x*) − k(x*)ᵀ K⁻¹ k(x*), where k(x*) is the vector of covariances between the test input and the training inputs and κ(x*) is the autocovariance of the test input

Static illustrative example  Static example: 9 learning points  Prediction  Rare data density  increased variance (higher uncertainty) (Figures: the nonlinear function y = f(x) to be modelled from the learning points; the nonlinear function and the GP model with the μ ± 2σ band; the prediction error e against the double standard deviation of the prediction, 2σ.)
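The prediction equations behind this static example can be sketched as follows (a minimal numpy sketch; the hyperparameter values, the example function, and the helper name `gp_predict` are illustrative assumptions):

```python
import numpy as np

def gp_predict(Xtr, ytr, Xte, w=1.0, v1=1.0, v0=0.01):
    """GP predictive mean and variance with a squared-exponential covariance;
    hyperparameters (w, v1, v0) are assumed already identified."""
    def cov(A, B):
        return v1 * np.exp(-0.5 * w * (A[:, None] - B[None, :]) ** 2)
    K = cov(Xtr, Xtr) + v0 * np.eye(len(Xtr))   # training covariance + noise
    ks = cov(Xte, Xtr)                          # test/training covariances
    mean = ks @ np.linalg.solve(K, ytr)         # mu(x*) = k* K^-1 y
    # sigma^2(x*) = kappa(x*) - k*^T K^-1 k*, with kappa including noise
    var = v1 + v0 - np.einsum('ij,ij->i', ks, np.linalg.solve(K, ks.T).T)
    return mean, var

# nine learning points of a nonlinear function, as in the static example
Xtr = np.linspace(-3, 3, 9)
ytr = np.sin(Xtr)
mean, var = gp_predict(Xtr, ytr, np.array([0.0, 10.0]))
# far from the data (x = 10) the variance grows towards the prior variance
```

The second test point sits far from all learning points, so its predictive variance approaches v1 + v0, illustrating the "rare data density → increased variance" behaviour on the slide.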

GP model attributes (vs. e.g. ANN)  Smaller number of parameters  A measure of confidence in the prediction, depending on the data  Data smoothing  Incorporation of prior knowledge *  Easy to use (engineering practice)  Computational cost increases with the amount of data (O(N³) in the number of training points)   Recent method, still in development  Nonparametric model * (also possible in some other models)

Outline  Problem description  Soft sensors  Gaussian Process models  Soft sensor for faulty measurement detection and reconstruction  Conclusions

The profile of vehicle arrival data

Modelling  One working day used for estimation data  A different working day used for validation data  Validation-based regressor selection  a fourth-order AR model (four delayed output values as regressors)  Gaussian + constant covariance function  Residuals of predictions with the 3σ band
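Constructing the regressors for the fourth-order AR model described above can be sketched as (the helper name `ar_regressors` is an illustrative assumption):

```python
import numpy as np

def ar_regressors(y, order=4):
    """Build the regressor matrix for an AR model: each row holds `order`
    delayed output values (most recent first); the target is the current value."""
    X = np.column_stack([y[order - k - 1 : len(y) - k - 1] for k in range(order)])
    t = y[order:]
    return X, t
```

Applied to the estimation day's vehicle counts, `X` and `t` form the identification data for the GP model; the validation day is processed the same way.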

Estimation

Validation

Proposed algorithm for detecting irregularities and for reconstructing the data with predictions. A sensor fault shows up as longer-lasting outliers.
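The detection-and-reconstruction loop can be sketched as follows (an illustrative sketch, not the authors' implementation; `predict` stands in for the identified GP one-step-ahead predictor returning a mean and a standard deviation):

```python
import numpy as np

def detect_and_reconstruct(y, predict, order=4, nsigma=3.0):
    """Run through the measured series; when a measurement falls outside the
    model's nsigma confidence band it is flagged as faulty and replaced by the
    predicted mean, which is then fed back into the regressor vector.
    `predict(regressors)` must return (mean, std) of the one-step prediction."""
    z = np.array(y, dtype=float)                 # reconstructed series
    faulty = np.zeros(len(y), dtype=bool)
    for i in range(order, len(y)):
        mean, std = predict(z[i - order:i][::-1])   # most recent value first
        if abs(y[i] - mean) > nsigma * std:
            faulty[i] = True
            z[i] = mean                          # reconstruct with the prediction
    return z, faulty
```

Feeding the reconstructed values back as regressors keeps the predictor running through longer-lasting faults instead of propagating the faulty measurements.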

The comparison of MRSE for k-step-ahead predictions. Fitness for purpose of the obtained model (the measure of measurement validity, close-enough prediction, fast calculation, model robustness).

Soft sensor applied on faulty data

Conclusions  Soft sensors are promising for fault detection and signal reconstruction.  GP models cope with excessive noise and outliers, predict without delay, and provide a measure of prediction confidence.  The excessive noise limits the possibility of developing a better predictor.  The traffic sensor problem was successfully solved for working days.