
Advanced Diagnostics Algorithms in Online Field Device Monitoring. Vagan Terziyan (editor). Industrial Ontologies Group, Agora Center, University of Jyväskylä, 2003

Contents
§Introduction: OntoServ.Net – Global Health-Care Environment for Industrial Devices;
§Bayesian Metanetworks for Context-Sensitive Industrial Diagnostics;
§Temporal Industrial Diagnostics with Uncertainty;
§Dynamic Integration of Classification Algorithms for Industrial Diagnostics;
§Industrial Diagnostics with Real-Time Neuro-Fuzzy Systems;
§Conclusion.

Vagan Terziyan Oleksiy Khriyenko Oleksandr Kononenko Andriy Zharko

Web Services for Smart Devices. Smart industrial devices can also be Web Service users. Their embedded agents are able to monitor the state of the device, to communicate, and to exchange data with other agents. There is a good reason to launch special Web Services for such smart industrial devices to provide the necessary online condition monitoring, diagnostics, maintenance support, etc. OntoServ.Net: Semantic Web Enabled Network of Maintenance Services for Smart Devices, Industrial Ontologies Group, Tekes Project Proposal, March 2003.

Global Network of Maintenance Services. OntoServ.Net: Semantic Web Enabled Network of Maintenance Services for Smart Devices, Industrial Ontologies Group, Tekes Project Proposal, March 2003.

Embedded Maintenance Platforms (diagram: maintenance service, service agents, host agent, embedded platform). Based on the online diagnostics, a service agent selected for the specific emergency situation moves to the embedded platform to help the host agent manage it and to carry out the predictive maintenance activities.

OntoServ.Net Challenges
§New group of Web service users – smart industrial devices.
§Internal (embedded) and external (Web-based) agent-enabled service platforms.
§The Mobile Service Component concept supposes that any service component can move, be executed and learn at any platform from the Service Network, including the service requestor's side.
§The Semantic Peer-to-Peer concept for service network management assumes ontology-based decentralized service network management.

Agents in the Semantic Web (an illustrative dialogue): 1. "I feel bad, pressure more than 200, headache, … Who can advise what to do?" 2. "I think you should stop drinking beer for a while." 3. "Wait a bit, I will give you some pills." 4. "Never had such experience. No idea what to do." Agents in the Semantic Web are supposed to understand each other because they will share a common standard, platform, ontology and language.

The Challenge: Global Understanding eNvironment (GUN). How to make entities from our physical world understand each other when necessary? … It's elementary, but not easy: just make agents of them!

GUN Concept. Entities will interoperate through OntoShells, which supplement these entities up to Semantic Web enabled agents. (Dialogue: 1. "I feel bad, temperature 40, pain in stomach, … Who can advise what to do?" 2. "I have some pills for you.")

Semantic Web: Before GUN. Semantic Web applications understand, (re)use, share, integrate, etc. Semantic Web resources.

GUN Concept: all GUN resources understand each other. Real World Object + OntoAdapter + OntoShell = GUN Resource. Real World objects of the new generation have the OntoAdapter inside.

Read Our Recent Reports
§Semantic Web: The Future Starts Today (collection of research papers and presentations of the Industrial Ontologies Group for the period November 2002 - April 2003)
§Semantic Web and Peer-to-Peer: Integration and Interoperability in Industry
§Semantic Web Enabled Web Services: State-of-Art and Challenges
§Distributed Mobile Web Services Based on Semantic Web: Distributed Industrial Product Maintenance System
§Available online in: Industrial Ontologies Group: V. Terziyan, A. Zharko, O. Kononenko, O. Khriyenko

Vagan Terziyan Oleksandra Vitko

Example of Simple Bayesian Network. The basic rules used are the conditional (in)dependence rule, the joint probability rule, the marginalization rule, and the Bayesian rule.
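The four rules were shown on the slide as formulas; in their standard textbook form (a reconstruction, not the slide's exact notation) they read:

```latex
% Conditional independence of X and Y given Z
P(X, Y \mid Z) = P(X \mid Z)\,P(Y \mid Z)
% Joint probability (chain) rule for a Bayesian network
P(X_1,\dots,X_n) = \prod_{i=1}^{n} P\!\left(X_i \mid \mathrm{parents}(X_i)\right)
% Marginalization rule
P(Y) = \sum_{x} P(Y \mid X = x)\,P(X = x)
% Bayesian rule
P(X \mid Y) = \frac{P(Y \mid X)\,P(X)}{P(Y)}
```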

Contextual and Predictive Attributes (figure): the sensor attributes x_1 … x_7 of a machine and its environment are divided into predictive and contextual attributes; the measured quantities include air pressure, dust, humidity, temperature and emission.

Contextual Effect on Conditional Probability (figure). Assume a conditional dependence between predictive attributes x_k and x_r (a causal relation between physical quantities); some contextual attribute x_t may directly affect this conditional dependence between the predictive attributes, but not the attributes themselves.

Contextual Effect on Conditional Probability.
X = {x_1, x_2, …, x_n} – predictive attribute with n values;
Z = {z_1, z_2, …, z_q} – contextual attribute with q values;
P(Y|X) = {p_1(Y|X), p_2(Y|X), …, p_r(Y|X)} – conditional dependence attribute (random variable) between X and Y with r possible values;
P(P(Y|X)|Z) – conditional dependence between attribute Z and attribute P(Y|X).

Contextual Effect on Unconditional Probability (figure). Assume some predictive attribute x_t is a random variable with an appropriate probability distribution over its values; some contextual attribute x_k may directly affect the probability distribution P(X) of the predictive attribute.

Contextual Effect on Unconditional Probability.
X = {x_1, x_2, …, x_n} – predictive attribute with n values;
Z = {z_1, z_2, …, z_q} – contextual attribute with q values, and P(Z) – probability distribution over the values of Z;
P(X) = {p_1(X), p_2(X), …, p_r(X)} – probability distribution attribute for X (a random variable) with r possible values (different possible probability distributions for X), and P(P(X)) – probability distribution over the values of attribute P(X);
P(Y|X) – conditional probability distribution of Y given X;
P(P(X)|Z) – conditional probability distribution for attribute P(X) given Z.

Bayesian Metanetworks for Advanced Diagnostics.
Terziyan V., Vitko O., Probabilistic Metanetworks for Intelligent Data Analysis, Artificial Intelligence, Donetsk Institute of Artificial Intelligence, Vol. 3, 2002, pp.
Terziyan V., Vitko O., Bayesian Metanetwork for Modelling User Preferences in Mobile Environment, In: German Conference on Artificial Intelligence (KI-2003), Hamburg, Germany, September 15-18, 2003.

Two-level Bayesian Metanetwork for managing conditional dependencies

Causal Relation between Conditional Probabilities (figure). The conditional probabilities P(X_n|X_m) and P(X_r|X_k) are treated as random variables with their own possible values (P_1(X_n|X_m), P_2(X_n|X_m), P_3(X_n|X_m); P_1(X_r|X_k), P_2(X_r|X_k)); there might be a causal relationship between the two pairs of conditional probabilities, modelled by P(P(X_r|X_k) | P(X_n|X_m)).

Example of Bayesian Metanetwork. The nodes of the 2nd-level network correspond to the conditional probabilities of the 1st-level network, P(B|A) and P(Y|X). The arc in the 2nd-level network corresponds to the conditional probability P(P(Y|X)|P(B|A)).
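For concreteness, here is a minimal numerical sketch (not from the original slides; all values are illustrative) of how inference in such a two-level metanetwork can be carried out: the contextual attribute Z selects among candidate conditional probability tables for P(Y|X), and P(Y) is obtained by marginalizing over Z, over the choice of table, and over X.

```python
# Minimal sketch (not the authors' implementation): inference in a two-level
# Bayesian metanetwork where the context variable Z selects among candidate
# conditional probability tables P_r(Y|X). All names and numbers are illustrative.
import numpy as np

P_X = np.array([0.7, 0.3])                 # P(X): predictive attribute with 2 values
P_Z = np.array([0.4, 0.6])                 # P(Z): contextual attribute with 2 values

# Two candidate conditional tables P_r(Y|X); rows indexed by x, columns by y
P_YgX = [np.array([[0.9, 0.1], [0.2, 0.8]]),
         np.array([[0.5, 0.5], [0.6, 0.4]])]

# Second-level (contextual) dependence: P(P(Y|X) = r | Z = z)
P_table_given_Z = np.array([[0.8, 0.2],    # z = 0
                            [0.1, 0.9]])   # z = 1

# Marginalize over the context and over the choice of table:
# P(Y) = sum_z P(z) * sum_r P(r|z) * sum_x P_r(Y|x) P(x)
P_Y = np.zeros(2)
for z, pz in enumerate(P_Z):
    for r, table in enumerate(P_YgX):
        P_Y += pz * P_table_given_Z[z, r] * (P_X @ table)

print(P_Y, P_Y.sum())  # a proper probability distribution over Y
```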

Other Cases of Bayesian Metanetwork (1) Unconditional probability distributions associated with nodes of the predictive level network depend on probability distributions associated with nodes of the contextual level network

Other Cases of Bayesian Metanetwork (2) The metanetwork on the contextual level models conditional dependence particularly between unconditional and conditional probabilities of the predictive level

Other Cases of Bayesian Metanetwork (3) The combination of cases 1 and 2

2-level Relevance Bayesian Metanetwork (for modelling relevant feature selection)

Simple Relevance Bayesian Metanetwork. We consider relevance as the probability that a variable is important for the inference of the target attribute in the given context. With such a definition, relevance inherits all the properties of a probability.

Example of 2-level Relevance Bayesian Metanetwork In a relevance network the relevancies are considered as random variables between which the conditional dependencies can be learned.

More Complicated Case of Managing Relevance (1)

More Complicated Case of Managing Relevance (2)

General Case of Managing Relevance (1).
Predictive attributes: X1 with values {x1_1, x1_2, …, x1_nx1}; X2 with values {x2_1, x2_2, …, x2_nx2}; …; XN with values {xn_1, xn_2, …, xn_nxn}.
Target attribute: Y with values {y_1, y_2, …, y_ny}.
Probabilities: P(X1), P(X2), …, P(XN); P(Y|X1, X2, …, XN).
Relevancies: the relevance of X1 is P(relevance(X1) = yes); the relevance of X2 is P(relevance(X2) = yes); …; the relevance of XN is P(relevance(XN) = yes).
Goal: to estimate P(Y).

General Case of Managing Relevance (2)
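The estimation formula itself was presented graphically on this slide. The sketch below is only a hedged illustration of the simplest single-attribute case, reading the relevance psi_X as the probability that X is taken into account and modelling an irrelevant X by a uniform distribution over its values; the exact formula is given in the cited papers and may differ.

```python
# Hedged sketch of the single-attribute case of relevance-based inference:
# with probability psi_X the attribute X is relevant and P(Y) is obtained by
# ordinary marginalization; otherwise X is ignored (its value treated as
# uninformative, modelled here by a uniform distribution over X's values).
# This illustrates the idea only; it is not the exact formula of the papers.
import numpy as np

P_X = np.array([0.7, 0.3])                     # P(X)
P_YgX = np.array([[0.9, 0.1], [0.2, 0.8]])     # P(Y|X), rows indexed by x
psi_X = 0.6                                    # relevance: P(relevance(X) = yes)

uniform_X = np.full(len(P_X), 1.0 / len(P_X))

P_Y_relevant = P_X @ P_YgX          # X used with its actual distribution
P_Y_irrelevant = uniform_X @ P_YgX  # X ignored: uninformative prior over its values

P_Y = psi_X * P_Y_relevant + (1 - psi_X) * P_Y_irrelevant
print(P_Y)   # distribution over Y accounting for the uncertainty in X's relevance
```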

Example of Relevance Metanetwork (figure showing the relevance level controlling the predictive level).

Combined Bayesian Metanetwork. In a combined metanetwork, two controlling (contextual) levels affect the basic level.

Learning Bayesian Metanetworks from Data. §Learning the Bayesian Metanetwork structure (conditional, contextual and relevance (in)dependencies at each level); §Learning the Bayesian Metanetwork parameters (conditional and unconditional probabilities and relevancies at each level). Vitko O., Multilevel Probabilistic Networks for Modelling Complex Information Systems under Uncertainty, Ph.D. Thesis, Kharkov National University of Radioelectronics, June. Supervisor: Terziyan V.

When Bayesian Metanetworks? 1. A Bayesian Metanetwork can be considered a very powerful tool in cases where the structure (or strengths) of the causal relationships between the observed parameters of an object essentially depends on context (e.g. external environment parameters); 2. It can also be considered a useful model for an object whose diagnosis depends on a different set of observed parameters depending on the context.

Vagan Terziyan Vladimir Ryabov

Temporal Diagnostics of Field Devices. The approach to temporal diagnostics uses the algebra of uncertain temporal relations*. Uncertain temporal relations are formalized using a probabilistic representation. Relational networks are composed of uncertain relations between some events (a set of symptoms). A number of relational networks can be combined into a temporal scenario describing some particular course of events (a diagnosis). Later, a newly composed relational network can be compared with existing temporal scenarios, and the probabilities of belonging to each particular scenario are derived. * Ryabov V., Puuronen S., Terziyan V., Representation and Reasoning with Uncertain Temporal Relations, In: A. Kumar and I. Russel (Eds.), Proceedings of the Twelfth International Florida AI Research Society Conference - FLAIRS-99, AAAI Press, California, 1999, pp.

Conceptual Schema for Temporal Diagnostics (figure). Recognition of temporal scenarios: we estimate the probability that a particular relational network belongs to each of the known temporal scenarios. Generating temporal scenarios: we compose a temporal scenario by combining a number of relational networks consisting of the same set of symptoms and possibly different temporal relations between them. Terziyan V., Ryabov V., Abstract Diagnostics Based on Uncertain Temporal Scenarios, International Conference on Computational Intelligence for Modelling Control and Automation CIMCA-2003, Vienna, Austria, February 2003, 6 pp.

Industrial Temporal Diagnostics (conceptual schema): temporal data from an industrial object is turned into a relational network; estimation and learning build up a DB of scenarios, against which recognition of the network yields a diagnosis. Ryabov V., Terziyan V., Industrial Diagnostics Using Algebra of Uncertain Temporal Relations, IASTED International Conference on Artificial Intelligence and Applications, Innsbruck, Austria, February 2003, 6 pp.

Imperfect Relation Between Temporal Point Events: Definition. An imperfect temporal relation between temporal points (Event 1 and Event 2) is given by: P(event 1, before, event 2) = a_1; P(event 1, same time, event 2) = a_2; P(event 1, after, event 2) = a_3. Ryabov V., Handling Imperfect Temporal Relations, Ph.D. Thesis, University of Jyvaskyla, December. Supervisors: Puuronen S., Terziyan V.

Example of Imperfect Relation (figure). An imperfect temporal relation between the temporal points Event 1 and Event 2: P(event 1, before, event 2) = 0.5; P(event 1, same time, event 2) = 0.2; P(event 1, after, event 2) = 0.3.

Operations for Reasoning with Temporal Relations (figure): the three operations are inversion, sum, and composition; for example, composition derives r_{a,c} from r_{a,b} and r_{b,c} for points a, b, c.
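As an illustration (not the exact algebra of the cited FLAIRS-99 paper), the sketch below shows how the three operations could look for uncertain point relations stored as probability vectors over (before, same time, after); ambiguous compositions are split uniformly and the "sum" of two estimates is taken as a normalized element-wise product, both of which are simplifying assumptions.

```python
# Hedged sketch of the three operations on uncertain point relations
# represented as probability vectors over (before, same_time, after).
# The exact definitions are given in the cited paper; this illustration uses
# the crisp point-algebra composition table with ambiguous outcomes split
# uniformly, and fuses two estimates ("sum") by a normalized product.
import numpy as np

B, S, A = 0, 1, 2   # indices: before, same time, after

def inversion(r):
    """R(b, a) from R(a, b): swap 'before' and 'after'."""
    return np.array([r[A], r[S], r[B]])

def summation(r1, r2):
    """Fuse two independent estimates of the same relation (illustrative)."""
    fused = r1 * r2
    return fused / fused.sum()

# Crisp composition table for point relations: sets of possible outcomes.
COMP = {(B, B): [B], (B, S): [B], (B, A): [B, S, A],
        (S, B): [B], (S, S): [S], (S, A): [A],
        (A, B): [B, S, A], (A, S): [A], (A, A): [A]}

def composition(r_ab, r_bc):
    """R(a, c) from R(a, b) and R(b, c), spreading ambiguous mass uniformly."""
    r_ac = np.zeros(3)
    for i in range(3):
        for j in range(3):
            for k in COMP[(i, j)]:
                r_ac[k] += r_ab[i] * r_bc[j] / len(COMP[(i, j)])
    return r_ac

r_ab = np.array([0.5, 0.2, 0.3])
r_bc = np.array([0.7, 0.1, 0.2])
print(inversion(r_ab), summation(r_ab, r_bc), composition(r_ab, r_bc))
```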

Temporal Interval Relations. §The basic interval relations are the thirteen Allen relations: A before (b) B / B after (bi) A; A meets (m) B / B met-by (mi) A; A overlaps (o) B / B overlapped-by (oi) A; A starts (s) B / B started-by (si) A; A during (d) B / B contains (di) A; A finishes (f) B / B finished-by (fi) A; A equals (eq) B / B equals A.

Imperfect Relation Between Temporal Intervals: Definition. An imperfect temporal relation between temporal intervals (interval 1 and interval 2) is given by: P(interval 1, before, interval 2) = a_1; P(interval 1, meets, interval 2) = a_2; P(interval 1, overlaps, interval 2) = a_3; … P(interval 1, equals, interval 2) = a_13.

Industrial Temporal Diagnostics (composing a network of relations): the temporal relations between symptoms observed by the sensors of an industrial object (Sensor 1, Sensor 2, Sensor 3) are estimated and assembled into a relational network representing the particular case.

Industrial Temporal Diagnostics (generating temporal scenarios). Relational networks N1, N2, N3 from Objects A, B, C are combined into the temporal scenario S for Failure X and stored in the DB of scenarios:
1. for i = 1 to n do
2.   for j = i+1 to n do
3.     if (R_1) or … or (R_k) then
4.     begin
5.       for g = 1 to n do
6.         if not (R_g) then Reasoning(, R_g)
7.         // if Reasoning = False then (R_g) = TUR
8.       (R) = ⊕(R_t), where t = 1,…,k
9.     end
10.    else go to line 2

Recognition of Temporal Scenario (figure). Temporal data from the industrial object is turned into a relational network, which is recognized against the DB of scenarios to produce a diagnosis. For the comparison, each imperfect relation is reduced to its balance point: Bal(R_A,B) is computed from the probability values of the thirteen interval relations (ordered b, m, o, fi, di, si, eq, s, d, f, oi, mi, bi) using weights such as w_b = 0, w_eq = 0.5, w_f = 0.75, w_bi = 1.
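A hedged sketch of the balance-point computation follows; the weights shown on the slide (w_b = 0, w_eq = 0.5, w_f = 0.75, w_bi = 1) are consistent with evenly spaced weights i/12 over the ordered Allen relations, which is assumed here for the remaining relations.

```python
# Hedged sketch of the "balance point" of an uncertain interval relation.
# The slide gives w_b = 0, w_eq = 0.5, w_f = 0.75, w_bi = 1 for the relations
# ordered b, m, o, fi, di, si, eq, s, d, f, oi, mi, bi; evenly spaced weights
# i/12 reproduce these values and are assumed for the other relations.
RELATIONS = ["b", "m", "o", "fi", "di", "si", "eq", "s", "d", "f", "oi", "mi", "bi"]
WEIGHTS = {rel: i / 12 for i, rel in enumerate(RELATIONS)}   # w_b = 0 ... w_bi = 1

def balance_point(relation: dict) -> float:
    """Probability-weighted average position of an uncertain relation vector."""
    return sum(p * WEIGHTS[rel] for rel, p in relation.items())

# Example: a relation that is mostly "before" with some "meets" and "overlaps"
r_ab = {"b": 0.6, "m": 0.3, "o": 0.1}
print(balance_point(r_ab))   # a single scalar used to compare relational networks
```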

When Temporal Diagnostics? 1. Temporal diagnostics considers not only a static set of symptoms, but also the time during which they were monitored. This often gives a broader view of the situation, and sometimes only the temporal relations between different symptoms can give a hint towards a precise diagnosis; 2. This approach might be useful, for example, in cases where the appropriate causal relationships between events (symptoms) are not yet known and the only relationships available for study are temporal ones; 3. A combination of Bayesian diagnostics (based on probabilistic causal knowledge) and temporal diagnostics would be quite a powerful diagnostic tool.

Terziyan V., Dynamic Integration of Virtual Predictors, In: L.I. Kuncheva, F. Steimann, C. Haefke, M. Aladjem, V. Novak (Eds), Proceedings of the International ICSC Congress on Computational Intelligence: Methods and Applications - CIMA'2001, Bangor, Wales, UK, June 2001, ICSC Academic Press, Canada/The Netherlands, pp. Vagan Terziyan

The Problem During the past several years, in a variety of application domains, researchers in machine learning, computational learning theory, pattern recognition and statistics have tried to combine efforts to learn how to create and combine an ensemble of classifiers. The primary goal of combining several classifiers is to obtain a more accurate prediction than can be obtained from any single classifier alone.

Approaches to Integrate Multiple Classifiers (taxonomy diagram): integration is divided into selection (global/static vs. local/dynamic) and combination (global voting-type vs. local virtual classifier), with decontextualization as a further option.

Inductive learning with integration of predictors (figure): predictors/classifiers P_1, P_2, …, P_n are learned from sample instances of the learning environment and together produce a prediction y_t.

Virtual Classifier is a group of seven cooperative agents: TC - Team Collector; TM - Training Manager; TP - Team Predictor; TI - Team Integrator; FS - Feature Selector; DE - Distance Evaluator; CL - Classification Processor.

Classification Team: Feature Selector FS - Feature Selector

Feature Selector: finds the minimally sized feature subset that is sufficient for correct classification of the instance, working on the sample instances.

Classification Team: Distance Evaluator DE - Distance Evaluator

Distance between Two Instances with Heterogeneous Attributes (example): d(red, yellow) = 1; d(15°, 25°) = 10° / ((+50°) − (−50°)) = 0.1.

Distance Evaluator: measures the distance between instances based on their numerical or nominal attribute values.
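A minimal sketch of such a heterogeneous distance follows, matching the example on the preceding slide (a nominal mismatch contributes 1, a numeric difference is normalized by the attribute's range); how the per-attribute distances are aggregated into one instance distance (here: Euclidean) is an assumption.

```python
# Hedged sketch of a heterogeneous distance, matching the slide's example:
# nominal attributes -> 0/1 mismatch, numeric attributes -> |a - b| / range.
# Aggregating per-attribute distances with a Euclidean norm is an assumption.
import math

def attribute_distance(a, b, value_range=None):
    if value_range is not None:                    # numeric attribute
        lo, hi = value_range
        return abs(a - b) / (hi - lo)
    return 0.0 if a == b else 1.0                  # nominal attribute

def instance_distance(x, y, ranges):
    """ranges[i] is (min, max) for numeric attributes and None for nominal ones."""
    per_attr = [attribute_distance(a, b, r) for a, b, r in zip(x, y, ranges)]
    return math.sqrt(sum(d * d for d in per_attr))

# The slide's example: colour (nominal) and temperature in [-50°, +50°]
x = ("red", 15.0)
y = ("yellow", 25.0)
print(attribute_distance(x[0], y[0]))                     # 1.0
print(attribute_distance(x[1], y[1], (-50.0, 50.0)))      # 0.1
print(instance_distance(x, y, [None, (-50.0, 50.0)]))     # combined distance
```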

Classification Team: Classification Processor CL - Classification Processor

Classification Processor: predicts the class of a new instance based on its selected features and its location relative to the sample instances, using the Feature Selector and the Distance Evaluator.

Team Instructors: Team Collector TC - Team Collector completes Classification Teams for training

Team Collector completes classification teams (FS_i, DE_j, CL_k) for future training by picking from the available feature selection methods, distance evaluation functions and classification rules.

Team Instructors: Training Manager TM - Training Manager trains all completed teams on sample instances

Training Manager trains all completed teams (FS_i1 DE_j1 CL_k1, FS_i2 DE_j2 CL_k2, …, FS_in DE_jn CL_kn) on the sample instances and records the results as sample metadata.

Team Instructors: Team Predictor TP - Team Predictor predicts weights for every classification team in certain location

Team Predictor predicts the weights of every classification team for a given location (e.g. using a WNN algorithm over the sample metadata).

Team Prediction: Locality assumption. Each team has certain subdomains in the space of instance attributes where it is more reliable than the others. This assumption is supported by the experience that classifiers usually work well not only at certain points of the domain space, but in certain subareas of it [Quinlan, 1993]. If a team does not work well on the instances near a new instance, then it is quite probable that it will not work well on this new instance either.

Team Instructors: Team Integrator TI - Team Integrator produces classification result for a new instance by integrating appropriate outcomes of learned teams

Team Integrator produces the classification result y_t for a new instance by integrating the outcomes y_t1, y_t2, … of the learned classification teams, weighted by the teams' predicted weights in the location of the new instance.

Static Selection of a Classifier §Static selection means that we try all teams on a sample set and, for further classification, select the one that achieved the best classification accuracy over the whole sample set. Thus we select a team only once and then use it to classify all new domain instances.

Dynamic Selection of a Classifier §Dynamic selection means that a team is selected for every new instance separately, depending on where this instance is located. If it has been predicted that a certain team can classify this new instance better than the other teams, then this team is used to classify it. In such a case we say that the new instance belongs to the competence area of that classification team.
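To make the contrast concrete, here is a hedged sketch (all names and data are illustrative) of static versus dynamic team selection, with local team weights estimated by a distance-weighted nearest-neighbour average of each team's past correctness, roughly in the spirit of the WNN-based Team Predictor described above.

```python
# Hedged sketch of static vs. dynamic selection of a classification team.
# Local competence is estimated with a distance-weighted nearest-neighbour
# average of each team's past correctness; all names and data are illustrative.
import numpy as np

def static_selection(team_accuracy_on_sample):
    """Pick once: the team with the best overall accuracy on the sample set."""
    return int(np.argmax(team_accuracy_on_sample))

def dynamic_selection(new_x, sample_X, team_correctness, k=5):
    """Pick per instance: weight each team by its correctness on the k nearest
    sample instances, with weights inversely proportional to distance."""
    d = np.linalg.norm(sample_X - new_x, axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-9)
    local_weight = (team_correctness[nearest] * w[:, None]).sum(axis=0) / w.sum()
    return int(np.argmax(local_weight)), local_weight

# Toy data: 6 sample instances, 3 teams; team_correctness[i, t] = 1 if team t
# classified sample instance i correctly, else 0.
sample_X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
team_correctness = np.array([[1, 0, 0], [1, 0, 1], [1, 0, 0],
                             [0, 1, 0], [0, 1, 1], [0, 1, 0]])

print(static_selection(team_correctness.mean(axis=0)))           # best on average
print(dynamic_selection(np.array([5.5, 5.5]), sample_X, team_correctness))
```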

Conclusion §Knowledge discovery with an ensemble of classifiers is known to be more accurate than with any single classifier alone [e.g. Dietterich, 1997]. §If a classifier essentially consists of a certain feature selection algorithm, distance evaluation function and classification rule, then why not consider these parts as ensembles as well, making the classifier itself more flexible? §We expect that classification teams completed from different feature selection, distance evaluation and classification methods will be more accurate than any ensemble of known classifiers alone, and we focus our research and implementation on this assumption.

Yevgeniy Bodyanskiy Volodymyr Kushnaryov

Online Stochastic Fault Prediction. The Control Systems Research Laboratory, AI Department, Kharkov National University of Radioelectronics (head: Prof. E. Bodyanskiy) carries out research on the development of mathematical and algorithmic support of systems for control, diagnostics, forecasting and emulation:
1. Neural network architectures and real-time algorithms for observation and sensor data processing (smoothing, filtering, prediction) under substantial uncertainty conditions;
2. Neural networks in polyharmonic sequence analysis with unknown non-stationary parameters;
3. Analysis of chaotic time series; adaptive algorithms and neural network architectures for early fault detection and diagnostics of stochastic processes;
4. Adaptive multivariable predictive control algorithms for stochastic systems under various types of constraints;
5. Adaptive neuro-fuzzy control of non-stationary nonlinear systems;
6. Adaptive forecasting of non-stationary nonlinear time series by means of neuro-fuzzy networks;
7. Fast real-time adaptive learning procedures for various types of neural and neuro-fuzzy networks.
Bodyanskiy Y., Vorobyov S., Recurrent Neural Network Detecting Changes in the Properties of Non-Linear Stochastic Sequences, Automation and Remote Control, V. 1, No. 7, 2000, pp.
Bodyanskiy Y., Vorobyov S., Cichocki A., Adaptive Noise Cancellation for Multi-Sensory Signals, Fluctuation and Noise Letters, V. 1, No. 1, 2001, pp.
Bodyanskiy Y., Kolodyazhniy V., Stephan A., An Adaptive Learning Algorithm for a Neuro-Fuzzy Network, In: B. Reusch (ed.), Computational Intelligence: Theory and Applications, Berlin-Heidelberg-New York: Springer, 2001, pp.

Existing Tools. Most existing (neuro-)fuzzy systems used for fault diagnosis or classification are based on offline learning using genetic algorithms or modifications of error backpropagation. When the number of features and possible fault situations is large, tuning the classifying system becomes very time consuming. Moreover, such systems perform very poorly in high dimensions of the input space, so special modifications of the known architectures are required.

Neuro-Fuzzy Fault Diagnostics. Successful application of the neuro-fuzzy synergy to fault diagnosis of complex systems demands the development of an online diagnosing system that learns quickly from examples even with a large amount of data, and maintains high processing speed and high classification accuracy even when the number of features is large.

Challenge: Growing (Learning) Probabilistic Neuro-Fuzzy Network (1). Architecture (figure): an input layer with n inputs, a 1st hidden layer with N neurons, a 2nd hidden layer with (m+1) elements, and an output layer with m divisors. Bodyanskiy Ye., Gorshkov Ye., Kolodyazhniy V., Wernstedt J., Probabilistic Neuro-Fuzzy Network with Non-Conventional Activation Functions, In: Knowledge-Based Intelligent Information & Engineering Systems, Proceedings of the Seventh International Conference KES-2003, 3–5 September, Oxford, United Kingdom, LNAI, Springer-Verlag. Bodyanskiy Ye., Gorshkov Ye., Kolodyazhniy V., Resource-Allocating Probabilistic Neuro-Fuzzy Network, In: Proceedings of the International Conference on Fuzzy Logic and Technology, 10–12 September, Zittau, Germany, 2003.

Challenge: Growing (Learning) Probabilistic Neuro-Fuzzy Network (2) – a unique combination of features:
§Implements fuzzy reasoning and classification (fuzzy classification network);
§Automatically creates neurons based on the training set (growing network);
§Learns the free parameters of the network from the training set (learning network);
§Guarantees high precision of classification based on fast learning (high-performance network);
§Able to work with huge volumes of data under limited computational resources (powerful and economical network);
§Able to work in real time (real-time network).
Tested on real data in comparison with a classical probabilistic neural network.
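For reference, the sketch below implements the classical probabilistic neural network (a Gaussian Parzen-window classifier) that the slide mentions as the comparison baseline; it is not the group's growing probabilistic neuro-fuzzy network, and all data shown are illustrative.

```python
# Minimal sketch of a classical probabilistic neural network (PNN), the
# baseline the slides compare against; NOT the growing probabilistic
# neuro-fuzzy network itself. Gaussian Parzen-window class-conditional scores.
import numpy as np

class PNN:
    def __init__(self, sigma=0.5):
        self.sigma = sigma               # common kernel width (smoothing parameter)

    def fit(self, X, y):
        self.X, self.y = np.asarray(X, float), np.asarray(y)
        self.classes = np.unique(self.y)
        return self

    def predict(self, X_new):
        X_new = np.atleast_2d(np.asarray(X_new, float))
        preds = []
        for x in X_new:
            # Gaussian kernel responses of all pattern neurons to the new input
            k = np.exp(-np.sum((self.X - x) ** 2, axis=1) / (2 * self.sigma ** 2))
            # Summation layer: average response per class; output layer: argmax
            scores = [k[self.y == c].mean() for c in self.classes]
            preds.append(self.classes[int(np.argmax(scores))])
        return np.array(preds)

# Toy usage: two fault classes in a two-dimensional feature space
X = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y = ["normal", "normal", "fault", "fault"]
print(PNN(sigma=0.3).fit(X, y).predict([[0.15, 0.15], [0.85, 0.85]]))
```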

Tests for Neuro-Fuzzy Algorithms. The Industrial Ontologies Group (Kharkov Branch), the Data Mining Research Group and the Control Systems Research Laboratory of the Artificial Intelligence Department of Kharkov National University of Radioelectronics have essential theoretical and practical experience in implementing the neuro-fuzzy approach, and specifically real-time probabilistic neuro-fuzzy systems for simulation, modelling, forecasting, diagnostics, clustering and control. We are interested in cooperation with Metso in that area and are ready to demonstrate the performance of our algorithms on real data taken from any of Metso's products, to compare them with the algorithms existing at Metso.

Inventions we can offer (1) §A method of intelligent preventive or predictive diagnostics and forecasting of the technical condition of industrial equipment, machines, devices, systems, etc. in real time, based on the analysis of non-stationary stochastic signals (e.g. from sensors of temperature, pressure, current, shifting, frequency, energy consumption and other parameters with threshold values). §The method is based on advanced data mining techniques utilizing neuro-fuzzy technologies, and differs from existing tools by a flexible self-organizing network structure and by optimization of computational resources during learning.

Inventions we can offer (2) §A method of intelligent real-time preventive or predictive diagnostics and forecasting of the technical condition of industrial equipment, machines, devices, systems, etc., based on the analysis of signals with non-stationary and non-multiplied periodical components (e.g. from sensors of vibration, noise, frequencies of rotation, current, voltage, etc.). §The method is based on optimization of computational resources during learning through intelligent reduction of the number of signal components being analyzed.

Inventions we can offer (3) §A method and mechanism for optimal control of the dosage and real-time infusion of anti-wear oil additives into industrial machines, based on real-time condition monitoring of the machine.

Summary of problems we can solve §A rather global system for condition monitoring and preventive maintenance based on OntoServ.Net technologies (global, agent-based, ontology-based, Semantic Web services-based, semantic P2P search-based) and on modern and advanced data mining methods and tools, with knowledge creation, warehousing and updating not only during a device's lifetime, but also utilizing (for various maintenance needs) knowledge obtained afterwards from broken-down, worn-out or aged components of the same type, by testing and investigation techniques other than the information taken from living devices' sensors.

Recently Performed Case Studies (1) §Ontology Development for Gas Compressing Equipment Diagnostics Realized by Neural Networks §Available in: Volodymyr Kushnaryov, Semen Simkin

Recently Performed Case Studies (2) §The Use of Ontologies for Faults and State Description of Gas-Transfer Units §Available in: Volodymyr Kushnaryov, Konstantin Tatarnikov

Conclusion §The Industrial Ontologies Research Group (University of Jyvaskyla), which is piloting the OntoServ.Net concept of a global Semantic Web-based system for industrial maintenance, also has powerful branches in Kharkov (e.g. the IOG Kharkov Branch, the Control Systems Research Laboratory, the Data Mining Research Group, etc.) with experts and experience in various challenging areas of data mining and knowledge discovery, online diagnostics, forecasting and control, model learning and integration, etc., which can reasonably be utilized within the ongoing cooperation between Metso and the Industrial Ontologies Group.