
1
Advanced Diagnostics Algorithms in Online Field Device Monitoring Vagan Terziyan (editor) http://www.cs.jyu.fi/ai/Metso_Diagnostics.ppt Industrial Ontologies Group: http://www.cs.jyu.fi/ai/OntoGroup/index.html Industrial Ontologies Group, Agora Center, University of Jyväskylä, 2003

2
Contents §Introduction: OntoServ.Net – Global Health-Care Environment for Industrial Devices; §Bayesian Metanetworks for Context-Sensitive Industrial Diagnostics; §Temporal Industrial Diagnostics with Uncertainty; §Dynamic Integration of Classification Algorithms for Industrial Diagnostics; §Industrial Diagnostics with Real-Time Neuro-Fuzzy Systems; §Conclusion.

3
Vagan Terziyan Oleksiy Khriyenko Oleksandr Kononenko Andriy Zharko

4
Web Services for Smart Devices Smart industrial devices can also be Web Service users. Their embedded agents are able to monitor the state of the device, and to communicate and exchange data with other agents. There is good reason to launch special Web Services for such smart industrial devices to provide the necessary online condition monitoring, diagnostics, maintenance support, etc. OntoServ.Net: Semantic Web Enabled Network of Maintenance Services for Smart Devices, Industrial Ontologies Group, Tekes Project Proposal, March 2003.

5
Global Network of Maintenance Services OntoServ.Net: Semantic Web Enabled Network of Maintenance Services for Smart Devices, Industrial Ontologies Group, Tekes Project Proposal, March 2003,

6
Embedded Maintenance Platforms Service Agents Host Agent Embedded Platform Based on the online diagnostics, a service agent selected for the specific emergency situation moves to the embedded platform to help the host agent manage it and to carry out the predictive maintenance activities. Maintenance Service

7
OntoServ.Net Challenges §New group of Web service users – smart industrial devices. §Internal (embedded) and external (Web-based) agent-enabled service platforms. §The Mobile Service Component concept supposes that any service component can move, be executed and learn at any platform in the Service Network, including the service requestor's side. §The Semantic Peer-to-Peer concept for service network management assumes ontology-based decentralized service network management.

8
Agents in Semantic Web 1. I feel bad, pressure more than 200, headache, … Who can advise what to do? 2. I think you should stop drinking beer for a while. 3. Wait a bit, I will give you some pills. 4. Never had such experience. No idea what to do. Agents in the Semantic Web are supposed to understand each other because they will share a common standard, platform, ontology and language.

9
The Challenge: Global Understanding eNvironment (GUN) How to make entities from our physical world understand each other when necessary?.. … It's elementary! But not easy!! Just make agents out of them!!!

10
GUN Concept Entities will interoperate through OntoShells, which supplement these entities up to Semantic Web enabled agents. 1. I feel bad, temperature 40, pain in stomach, … Who can advise what to do? 2. I have some pills for you.

11
Semantic Web: Before GUN Semantic Web Resources Semantic Web Applications Semantic Web applications understand, (re)use, share, integrate, etc. Semantic Web resources

12
GUN Concept: All GUN resources understand each other. Real World Object + OntoAdapter + OntoShell = GUN Resource. Real World objects of the new generation (OntoAdapter inside).

13
Read Our Recent Reports §Semantic Web: The Future Starts Today (collection of research papers and presentations of Industrial Ontologies Group for the period November 2002 – April 2003) §Semantic Web and Peer-to-Peer: Integration and Interoperability in Industry §Semantic Web Enabled Web Services: State-of-Art and Challenges §Distributed Mobile Web Services Based on Semantic Web: Distributed Industrial Product Maintenance System §Available online at: http://www.cs.jyu.fi/ai/OntoGroup/index.html Industrial Ontologies Group: V. Terziyan, A. Zharko, O. Kononenko, O. Khriyenko

14
Vagan Terziyan Oleksandra Vitko

15
Example of Simple Bayesian Network Conditional (in)dependence rule Joint probability rule Marginalization rule Bayesian rule
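The four rules on this slide can be illustrated with a small numeric sketch; the two-node network X → Y and all probabilities below are hypothetical.

```python
# Hypothetical two-node Bayesian network X -> Y with made-up numbers,
# illustrating the four rules named on the slide.

p_x = {"x1": 0.3, "x2": 0.7}                      # P(X)
p_y_given_x = {                                   # P(Y|X)
    "x1": {"y1": 0.9, "y2": 0.1},
    "x2": {"y1": 0.2, "y2": 0.8},
}

# Joint probability rule: P(X, Y) = P(Y|X) * P(X)
joint = {(x, y): p_y_given_x[x][y] * p_x[x]
         for x in p_x for y in ("y1", "y2")}

# Marginalization rule: P(Y) = sum over x of P(X, Y)
p_y = {y: sum(joint[(x, y)] for x in p_x) for y in ("y1", "y2")}

# Bayesian rule: P(X|Y=y1) = P(y1|X) * P(X) / P(y1)
p_x_given_y1 = {x: joint[(x, "y1")] / p_y["y1"] for x in p_x}

# Conditional (in)dependence rule: in this chain any further variable
# downstream of Y would be screened off from X by Y.
```

Here P(y1) = 0.9·0.3 + 0.2·0.7 = 0.41, and the Bayesian rule renormalizes the joint column for y1 into P(X|y1).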

16
Contextual and Predictive Attributes (Machine, Environment, Sensors) X: x1, x2, x3, x4, x5, x6, x7 – predictive attributes and contextual attributes (air pressure, dust, humidity, temperature, emission).

17
Contextual Effect on Conditional Probability X: x1, x2, …, x7 – predictive and contextual attributes. Assume conditional dependence between predictive attributes xk and xr (a causal relation between physical quantities)… some contextual attribute xt may directly affect the conditional dependence between the predictive attributes, but not the attributes themselves.

18
Contextual Effect on Conditional Probability X = {x1, x2, …, xn} – predictive attribute with n values; Z = {z1, z2, …, zq} – contextual attribute with q values; P(Y|X) = {p1(Y|X), p2(Y|X), …, pr(Y|X)} – conditional dependence attribute (random variable) between X and Y with r possible values; P(P(Y|X)|Z) – conditional dependence between attribute Z and attribute P(Y|X).

19
Contextual Effect on Unconditional Probability Assume some predictive attribute xk is a random variable with an appropriate probability distribution P(X) over its values x1, x2, x3, x4… some contextual attribute xt may directly affect the probability distribution of the predictive attribute.

20
Contextual Effect on Unconditional Probability X = {x1, x2, …, xn} – predictive attribute with n values; Z = {z1, z2, …, zq} – contextual attribute with q values, with P(Z) the probability distribution over values of Z; P(X) = {p1(X), p2(X), …, pr(X)} – probability distribution attribute for X (a random variable) with r possible values (different possible probability distributions for X), with P(P(X)) the probability distribution over values of attribute P(X); P(Y|X) – conditional probability distribution of Y given X; P(P(X)|Z) – conditional probability distribution for attribute P(X) given Z.
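This mechanism can be sketched numerically; all names and numbers below are made up. The context Z selects among candidate distributions for X via P(P(X)|Z), and the effective P(X) is the resulting mixture.

```python
# Illustrative sketch: a contextual attribute Z selects among candidate
# distributions for the predictive attribute X. Hypothetical numbers.

p_z = {"z1": 0.6, "z2": 0.4}                      # P(Z)
candidates = {                                    # possible values of P(X)
    "p1": {"x1": 0.8, "x2": 0.2},
    "p2": {"x1": 0.3, "x2": 0.7},
}
p_px_given_z = {                                  # P(P(X)|Z)
    "z1": {"p1": 0.9, "p2": 0.1},
    "z2": {"p1": 0.2, "p2": 0.8},
}

# Effective P(X): average the candidate distributions over the context
p_x = {x: sum(p_z[z] * p_px_given_z[z][k] * candidates[k][x]
              for z in p_z for k in candidates)
       for x in ("x1", "x2")}
```

With these numbers P(x1) = 0.6·(0.9·0.8 + 0.1·0.3) + 0.4·(0.2·0.8 + 0.8·0.3) = 0.61.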

21
Bayesian Metanetworks for Advanced Diagnostics Terziyan V., Vitko O., Probabilistic Metanetworks for Intelligent Data Analysis, Artificial Intelligence, Donetsk Institute of Artificial Intelligence, Vol. 3, 2002, pp. 188-197. Terziyan V., Vitko O., Bayesian Metanetwork for Modelling User Preferences in Mobile Environment, In: German Conference on Artificial Intelligence (KI-2003), Hamburg, Germany, September 15-18, 2003.

22
Two-level Bayesian Metanetwork for managing conditional dependencies

23
Causal Relation between Conditional Probabilities There might be a causal relationship between two pairs of conditional probabilities: P(Xn|Xm), with possible values {P1(Xn|Xm), P2(Xn|Xm), P3(Xn|Xm)} and distribution P(P(Xn|Xm)), and P(Xr|Xk), with possible values {P1(Xr|Xk), P2(Xr|Xk)} and distribution P(P(Xr|Xk)); the relation itself is P(P(Xr|Xk)|P(Xn|Xm)).

24
Example of Bayesian Metanetwork The nodes of the 2nd-level network correspond to the conditional probabilities of the 1st-level network, P(B|A) and P(Y|X). The arc in the 2nd-level network corresponds to the conditional probability P(P(Y|X)|P(B|A)).
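One way the 2nd-level arc could be realized (an illustrative sketch, not the cited papers' exact algorithm): treat P(P(Y|X)|P(B|A)) as a mixing distribution over candidate 1st-level tables, so that each value of P(B|A) yields an effective table for P(Y|X). All names and numbers are assumptions.

```python
# Hedged sketch of the two-level idea: the 2nd-level arc
# P(P(Y|X) | P(B|A)) picks which conditional table the 1st-level
# network uses. All identifiers and numbers are illustrative.

cpt_candidates = {                        # possible values of P(Y|X)
    "q1": {"x1": {"y1": 0.9, "y2": 0.1}, "x2": {"y1": 0.4, "y2": 0.6}},
    "q2": {"x1": {"y1": 0.5, "y2": 0.5}, "x2": {"y1": 0.1, "y2": 0.9}},
}
# P(P(Y|X) | P(B|A)): for each value of P(B|A), a distribution over
# the candidate tables
p_q_given_pba = {"r1": {"q1": 0.7, "q2": 0.3},
                 "r2": {"q1": 0.2, "q2": 0.8}}

def effective_cpt(pba_value):
    """Mixture of candidate CPTs weighted by the 2nd-level arc."""
    w = p_q_given_pba[pba_value]
    return {x: {y: sum(w[q] * cpt_candidates[q][x][y] for q in w)
                for y in ("y1", "y2")}
            for x in ("x1", "x2")}

cpt = effective_cpt("r1")
```

For the value "r1" of P(B|A), the effective entry P(y1|x1) = 0.7·0.9 + 0.3·0.5 = 0.78, and every row still sums to one.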

25
Other Cases of Bayesian Metanetwork (1) Unconditional probability distributions associated with nodes of the predictive level network depend on probability distributions associated with nodes of the contextual level network

26
Other Cases of Bayesian Metanetwork (2) The metanetwork on the contextual level models conditional dependence particularly between unconditional and conditional probabilities of the predictive level

27
Other Cases of Bayesian Metanetwork (3) The combination of cases 1 and 2

28
2-level Relevance Bayesian Metanetwork (for modelling relevant feature selection)

29
Simple Relevance Bayesian Metanetwork We consider relevance as the probability that a variable is important to the inference of the target attribute in the given context. Under such a definition, relevance inherits all the properties of a probability.

30
Example of 2-level Relevance Bayesian Metanetwork In a relevance network the relevancies are considered as random variables between which the conditional dependencies can be learned.

31
More Complicated Case of Managing Relevance (1)

32
More Complicated Case of Managing Relevance (2)

33
General Case of Managing Relevance (1) Predictive attributes: X1 with values {x1_1, x1_2, …, x1_nx1}; X2 with values {x2_1, x2_2, …, x2_nx2}; … XN with values {xn_1, xn_2, …, xn_nxn}. Target attribute: Y with values {y_1, y_2, …, y_ny}. Probabilities: P(X1), P(X2), …, P(XN); P(Y|X1, X2, …, XN). Relevancies: ψ_X1 = P(Λ(X1) = yes); ψ_X2 = P(Λ(X2) = yes); … ψ_XN = P(Λ(XN) = yes). Goal: to estimate P(Y).
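The estimation goal can be illustrated with a hedged Monte Carlo sketch; the slides' exact closed-form formula is not reproduced here, and the attribute names, numbers and the resampling scheme are all assumptions. The idea: an attribute judged irrelevant is averaged out by sampling it from its prior instead of using the observation.

```python
import random

# Illustrative Monte Carlo sketch (NOT the slides' formula): each
# predictive attribute Xi counts as relevant with probability psi_i;
# an irrelevant attribute is resampled from its prior P(Xi).

random.seed(0)

p_x1 = {"a": 0.5, "b": 0.5}
p_x2 = {"c": 0.3, "d": 0.7}
p_y1_given = {("a", "c"): 0.9, ("a", "d"): 0.6,
              ("b", "c"): 0.4, ("b", "d"): 0.1}
psi = {"x1": 0.8, "x2": 0.5}          # relevance probabilities

def sample(dist):
    r, acc = random.random(), 0.0
    for v, p in dist.items():
        acc += p
        if r <= acc:
            return v
    return v

def estimate_p_y1(observed, n=20000):
    """Estimate P(Y=y1): irrelevant attributes are averaged out by
    resampling from their priors instead of using the observation."""
    total = 0.0
    for _ in range(n):
        x1 = observed["x1"] if random.random() < psi["x1"] else sample(p_x1)
        x2 = observed["x2"] if random.random() < psi["x2"] else sample(p_x2)
        total += p_y1_given[(x1, x2)]
    return total / n

est = estimate_p_y1({"x1": "a", "x2": "c"})
```

With these numbers the exact value is 0.9·0.65·0.9 + 0.9·0.35·0.6 + 0.1·0.65·0.4 + 0.1·0.35·0.1 = 0.745, and the estimate converges to it.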

34
General Case of Managing Relevance (2)

35
Example of Relevance Metanetwork Relevance level Predictive level

36
Combined Bayesian Metanetwork In a combined Metanetwork, two controlling (contextual) levels will affect the basic level.

37
Learning Bayesian Metanetworks from Data §Learning Bayesian Metanetwork structure (conditional, contextual and relevance (in)dependencies at each level); §Learning Bayesian Metanetwork parameters (conditional and unconditional probabilities and relevancies at each level). Vitko O., Multilevel Probabilistic Networks for Modelling Complex Information Systems under Uncertainty, Ph.D. Thesis, Kharkov National University of Radioelectronics, June 2003. Supervisor: Terziyan V.

38
When Bayesian Metanetworks? 1. A Bayesian Metanetwork can be considered a very powerful tool in cases where the structure (or strengths) of causal relationships between the observed parameters of an object essentially depends on context (e.g. external environment parameters); 2. It can also be considered a useful model for an object whose diagnosis depends on a different set of observed parameters depending on the context.

39
Vagan Terziyan Vladimir Ryabov

40
Temporal Diagnostics of Field Devices The approach to temporal diagnostics uses the algebra of uncertain temporal relations*. Uncertain temporal relations are formalized using probabilistic representation. Relational networks are composed of uncertain relations between some events (a set of symptoms). A number of relational networks can be combined into a temporal scenario describing some particular course of events (a diagnosis). Later, a newly composed relational network can be compared with existing temporal scenarios, and the probabilities of it belonging to each particular scenario are derived. * Ryabov V., Puuronen S., Terziyan V., Representation and Reasoning with Uncertain Temporal Relations, In: A. Kumar and I. Russel (Eds.), Proceedings of the Twelfth International Florida AI Research Society Conference - FLAIRS-99, AAAI Press, California, 1999, pp. 449-453.

41
Conceptual Schema for Temporal Diagnostics Temporal scenarios S1, S2, …, Sn. Recognition of temporal scenarios: we estimate the probability that a particular relational network N belongs to the known temporal scenarios. Generating temporal scenarios: we compose a temporal scenario S by combining a number of relational networks N1, N2, N3, N4, N5 consisting of the same set of symptoms and possibly different temporal relations between them. Terziyan V., Ryabov V., Abstract Diagnostics Based on Uncertain Temporal Scenarios, International Conference on Computational Intelligence for Modelling Control and Automation CIMCA2003, Vienna, Austria, 12-14 February 2003, 6 pp.

42
Industrial Temporal Diagnostics (conceptual schema) Industrial object Temporal data Relational network DB of scenarios Estimation Recognition Diagnosis Learning Ryabov V., Terziyan V., Industrial Diagnostics Using Algebra of Uncertain Temporal Relations, IASTED International Conference on Artificial Intelligence and Applications, Innsbruck, Austria, 10-13 February 2003, 6 pp.

43
Imperfect Relation Between Temporal Point Events: Definition An imperfect temporal relation between temporal points (Event 1 and Event 2): P(event 1, before, event 2) = a1; P(event 1, same time, event 2) = a2; P(event 1, after, event 2) = a3. Ryabov V., Handling Imperfect Temporal Relations, Ph.D. Thesis, University of Jyvaskyla, December 2002. Supervisors: Puuronen S., Terziyan V.

44
Example of Imperfect Relation An imperfect temporal relation R(Event 1, Event 2) between temporal points: P(event 1, before, event 2) = 0.5; P(event 1, same time, event 2) = 0.2; P(event 1, after, event 2) = 0.3.

45
Operations for Reasoning with Temporal Relations Three operations over relations between points a, b, c: Inversion; Sum; Composition (r_a,c = r_a,b ∘ r_b,c).
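A sketch of how two of these operations might look for uncertain point relations represented as probability vectors over (before, same, after). The handling of the ambiguous composition entries (spreading mass uniformly) is an illustrative assumption, not necessarily the operation defined in the cited papers.

```python
# Sketch of reasoning with uncertain point relations, represented as
# probability vectors over (before, same, after). Uniform spreading of
# ambiguous compositions is an illustrative assumption.

B, S, A = 0, 1, 2   # before, same-time, after

def invert(r):
    """R(b,a) from R(a,b): swap 'before' and 'after'."""
    return (r[A], r[S], r[B])

# Composition table for crisp point relations: each entry is the set
# of possible relations r(a,c) given r(a,b) and r(b,c).
COMP = {
    (B, B): {B}, (B, S): {B}, (S, B): {B},
    (A, A): {A}, (A, S): {A}, (S, A): {A},
    (S, S): {S},
    (B, A): {B, S, A}, (A, B): {B, S, A},
}

def compose(r_ab, r_bc):
    """r(a,c) as the probabilistic composition of r(a,b) and r(b,c)."""
    out = [0.0, 0.0, 0.0]
    for i in (B, S, A):
        for j in (B, S, A):
            possible = COMP[(i, j)]
            for k in possible:
                out[k] += r_ab[i] * r_bc[j] / len(possible)
    return tuple(out)
```

For example, the slide's relation (0.5, 0.2, 0.3) inverts to (0.3, 0.2, 0.5), and any composition of valid probability vectors again sums to one.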

46
Temporal Interval Relations §The basic interval relations are the thirteen Allen relations: A before (b) B / B after (bi) A; A meets (m) B / B met-by (mi) A; A overlaps (o) B / B overlapped-by (oi) A; A starts (s) B / B started-by (si) A; A during (d) B / B contains (di) A; A finishes (f) B / B finished-by (fi) A; A equals (eq) B / B equals A.

47
Imperfect Relation Between Temporal Intervals: Definition An imperfect temporal relation between temporal intervals (interval 1 and interval 2): P(interval 1, before, interval 2) = a1; P(interval 1, meets, interval 2) = a2; P(interval 1, overlaps, interval 2) = a3; … P(interval 1, equals, interval 2) = a13.

48
Industrial Temporal Diagnostics (composing a network of relations) Sensor 3 Sensor 2 Relational network representing the particular case Industrial object Sensor 1 Estimation of temporal relations between symptoms

49
Industrial Temporal Diagnostics (generating temporal scenarios) Generating the temporal scenario S for Failure X from networks N1, N2, N3 (Objects A, B, C) into the DB of scenarios: 1. for i=1 to n do 2. for j=i+1 to n do 3. if (R1) or … or (Rk) then 4. begin 5. for g=1 to n do 6. if not (Rg) then Reasoning(Rg) 7. // if Reasoning = False then (Rg) = TUR 8. (R) = ⊕(Rt), where t=1,…,k 9. end 10. else go to line 2

50
Recognition of Temporal Scenario Industrial object → Temporal data → Relational network → Estimation → Recognition → Diagnosis; DB of scenarios (Learning). Bal(R_A,B): the balance point of an uncertain relation is its probability-weighted mean over the thirteen relations ordered b, m, o, fi, di, si, eq, s, d, f, oi, mi, bi, with weights such as w_b = 0, w_eq = 0.5, w_f = 0.75, w_bi = 1. Balance points are computed for R_A,B and R_C,D from their probability values.
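Reading the slide's weights as the sequence i/12 over the relation order b, m, o, fi, di, si, eq, s, d, f, oi, mi, bi (which matches the stated w_b = 0, w_eq = 0.5, w_f = 0.75, w_bi = 1; the intermediate values are an interpolation assumption), the balance point could be computed as:

```python
# Balance point of an uncertain Allen-interval relation: weights i/12
# over the slide's relation order (the even spacing is an assumption
# consistent with the stated w_b, w_eq, w_f, w_bi).

ORDER = ["b", "m", "o", "fi", "di", "si", "eq",
         "s", "d", "f", "oi", "mi", "bi"]
WEIGHTS = {r: i / 12 for i, r in enumerate(ORDER)}

def balance_point(relation):
    """Probability-weighted position of an uncertain interval relation,
    where `relation` maps relation names to probabilities."""
    return sum(WEIGHTS[r] * p for r, p in relation.items())
```

A relation concentrated on "before" has balance 0, one concentrated on "after" has balance 1, and a 50/50 mix of the two balances at 0.5.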

51
When Temporal Diagnostics? 1. Temporal diagnostics considers not only a static set of symptoms but also the time during which they were monitored. This often allows a broader view of the situation, and sometimes only considering the temporal relations between different symptoms can give us a hint towards a precise diagnosis; 2. This approach might be useful, for example, in cases when the appropriate causal relationships between events (symptoms) are not yet known and only the temporal relationships are available for study; 3. A combination of Bayesian diagnostics (based on probabilistic causal knowledge) and Temporal Diagnostics would be quite a powerful diagnostic tool.

52
Terziyan V., Dynamic Integration of Virtual Predictors, In: L.I. Kuncheva, F. Steimann, C. Haefke, M. Aladjem, V. Novak (Eds), Proceedings of the International ICSC Congress on Computational Intelligence: Methods and Applications - CIMA'2001, Bangor, Wales, UK, June 19 - 22, 2001, ICSC Academic Press, Canada/The Netherlands, pp. 463-469. Vagan Terziyan

53
The Problem During the past several years, in a variety of application domains, researchers in machine learning, computational learning theory, pattern recognition and statistics have tried to combine efforts to learn how to create and combine an ensemble of classifiers. The primary goal of combining several classifiers is to obtain a more accurate prediction than can be obtained from any single classifier alone.

54
Approaches to Integrate Multiple Classifiers Integrating Multiple Classifiers: Selection – Global (Static) or Local (Dynamic); Combination – Global (Voting-Type) or Local (Virtual Classifier); Decontextualization.

55
Inductive learning with integration of predictors Sample Instances, y_t, Learning Environment, Predictors/Classifiers P1, P2, …, Pn.

56
Virtual Classifier A Virtual Classifier is a group of seven cooperative agents: TC - Team Collector; TM - Training Manager; TP - Team Predictor; TI - Team Integrator; FS - Feature Selector; DE - Distance Evaluator; CL - Classification Processor.

57
Classification Team: Feature Selector FS - Feature Selector

58
Feature Selector: finds the minimally sized feature subset that is sufficient for correct classification of the instance Feature Selector Sample Instances

59
Classification Team: Distance Evaluator DE - Distance Evaluator

60
Distance between Two Instances with Heterogeneous Attributes (example) where: d(red, yellow) = 1; d(15°, 25°) = 10° / ((+50°) − (−50°)) = 0.1
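A possible implementation of such a heterogeneous distance (the details are assumptions consistent with the example: 0/1 overlap for nominal attributes, range normalization for numeric ones, Euclidean combination):

```python
# Heterogeneous distance sketch: nominal attributes contribute a 0/1
# overlap distance, numeric attributes are range-normalized, and the
# per-attribute distances are combined Euclidean-style.

def attribute_distance(a, b, value_range=None):
    """0/1 for nominal values; |a - b| / range for numeric values."""
    if value_range is None:                 # nominal attribute
        return 0.0 if a == b else 1.0
    lo, hi = value_range
    return abs(a - b) / (hi - lo)

def instance_distance(x, y, ranges):
    """Euclidean combination of per-attribute distances."""
    total = sum(attribute_distance(x[i], y[i], ranges[i]) ** 2
                for i in range(len(x)))
    return total ** 0.5
```

This reproduces the slide's example: d(red, yellow) = 1 and d(15°, 25°) = 10/100 = 0.1 over the range −50° to +50°.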

61
Distance Evaluator: measures distance between instances based on their numerical or nominal attribute values Distance Evaluator

62
Classification Team: Classification Processor CL - Classification Processor

63
Classification Processor: predicts the class of a new instance based on its selected features and its location relative to the sample instances Classification Processor Sample Instances Feature Selector Distance Evaluator

64
Team Instructors: Team Collector TC - Team Collector completes Classification Teams for training

65
Team Collector completes classification teams for future training Team Collector FS i DE j CL k Feature Selection methods Distance Evaluation functions Classification rules

66
Team Instructors: Training Manager TM - Training Manager trains all completed teams on sample instances

67
Training Manager trains all completed teams on sample instances Training Manager FS_i1 DE_j1 CL_k1, FS_i2 DE_j2 CL_k2, …, FS_in DE_jn CL_kn Sample Instances, Sample Metadata, Classification Teams

68
Team Instructors: Team Predictor TP - Team Predictor predicts weights for every classification team in certain location

69
Team Predictor predicts weights for every classification team in certain location Team Predictor: e.g. WNN algorithm Sample Metadata Predicted weights of classification teams Location

70
Team Prediction: Locality assumption Each team has certain subdomains in the space of instance attributes where it is more reliable than the others; This assumption is supported by the experience that classifiers usually work well not only at certain points of the domain space, but in certain subareas of the domain space [Quinlan, 1993]; If a team does not work well with the instances near a new instance, then it is quite probable that it will not work well with this new instance either.

71
Team Instructors: Team Integrator TI - Team Integrator produces classification result for a new instance by integrating appropriate outcomes of learned teams

72
Team integrator produces the classification result for a new instance by integrating appropriate outcomes of the learned teams Team Integrator FS_i1 DE_j1 CL_k1, FS_i2 DE_j2 CL_k2, …, FS_in DE_jn CL_kn New instance y_t1, y_t2, …, y_tn → y_t Weights of classification teams in the location of the new instance Classification teams

73
Static Selection of a Classifier §Static selection means that we try all teams on a sample set and, for further classification, select the one that achieved the best classification accuracy on the whole sample set. Thus we select a team only once and then use it to classify all new domain instances.

74
Dynamic Selection of a Classifier §Dynamic selection means that a team is selected for every new instance separately, depending on where this instance is located. If it has been predicted that a certain team can classify this new instance better than the other teams, then that team is used to classify it. In such a case we say that the new instance belongs to the competence area of that classification team.
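The contrast between static and dynamic selection can be sketched as follows; the teams here are plain functions, and estimating local competence from the nearest labeled sample points is an assumption about how the WNN-style weighting could be realized.

```python
# Sketch of static vs. dynamic classifier-team selection.
# Local competence is estimated on the k nearest labeled samples
# (an illustrative choice, not the slides' exact procedure).

def static_select(teams, sample):
    """Pick the single team with the best accuracy on the whole sample."""
    def accuracy(team):
        return sum(team(x) == y for x, y in sample) / len(sample)
    return max(teams, key=accuracy)

def dynamic_select(teams, sample, query, k=3):
    """Pick, per query, the team most accurate on the k nearest samples."""
    near = sorted(sample, key=lambda xy: abs(xy[0] - query))[:k]
    def local_accuracy(team):
        return sum(team(x) == y for x, y in near) / k
    return max(teams, key=local_accuracy)

def team_always_0(x):
    return 0

def team_always_1(x):
    return 1

# 1-D toy data: (feature, label)
sample = [(1, 0), (2, 0), (6, 1), (7, 1), (8, 1)]
```

Statically, `team_always_1` wins (3/5 vs. 2/5 on the whole sample); dynamically, a query near the low-feature region falls into the competence area of `team_always_0` instead.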

75
Conclusion §Knowledge discovery with an ensemble of classifiers is known to be more accurate than with any single classifier alone [e.g. Dietterich, 1997]. §If a classifier in effect consists of a certain feature selection algorithm, distance evaluation function and classification rule, then why not consider these parts as ensembles as well, making the classifier itself more flexible? §We expect that classification teams assembled from different feature selection, distance evaluation, and classification methods will be more accurate than any ensemble of known classifiers alone, and we focus our research and implementation on this assumption.

76
Yevgeniy Bodyanskiy Volodymyr Kushnaryov

77
Online Stochastic Faults Prediction Control Systems Research Laboratory, AI Department, Kharkov National University of Radioelectronics. Head: Prof. E. Bodyanskiy. The laboratory carries out research on the development of mathematical and algorithmic support for systems for control, diagnostics, forecasting and emulation: 1. Neural network architectures and real-time algorithms for observation and sensor data processing (smoothing, filtering, prediction) under substantial uncertainty conditions; 2. Neural networks in polyharmonic sequence analysis with unknown non-stationary parameters; 3. Analysis of chaotic time series; adaptive algorithms and neural network architectures for early fault detection and diagnostics of stochastic processes; 4. Adaptive multivariable predictive control algorithms for stochastic systems under various types of constraints; 5. Adaptive neuro-fuzzy control of non-stationary nonlinear systems; 6. Adaptive forecasting of non-stationary nonlinear time series by means of neuro-fuzzy networks; 7. Fast real-time adaptive learning procedures for various types of neural and neuro-fuzzy networks. Bodyanskiy Y., Vorobyov S., Recurrent Neural Network Detecting Changes in the Properties of Non-Linear Stochastic Sequences, Automation and Remote Control, V. 1, No. 7, 2000, pp. 1113-1124. Bodyanskiy Y., Vorobyov S., Cichocki A., Adaptive Noise Cancellation for Multi-Sensory Signals, Fluctuation and Noise Letters, V. 1, No. 1, 2001, pp. 12-23. Bodyanskiy Y., Kolodyazhniy V., Stephan A., An Adaptive Learning Algorithm for a Neuro-Fuzzy Network, In: B. Reusch (ed.), Computational Intelligence. Theory and Applications, Berlin-Heidelberg-New York: Springer, 2001, pp. 68-75.

78
Existing Tools Most existing (neuro-) fuzzy systems used for fault diagnosis or classification are based on offline learning with the use of genetic algorithms or modifications of the error back propagation. When the number of features and possible fault situations is large, tuning of the classifying system becomes very time consuming. Moreover, such systems perform very poorly in high dimensions of the input space, so special modifications of the known architectures are required.

79
Neuro-Fuzzy Fault Diagnostics Successful application of the neuro-fuzzy synergism to fault diagnosis of complex systems demands development of an online diagnosing system that quickly learns from examples even with a large amount of data, and maintains high processing speed and high classification accuracy when the number of features is large as well.

80
Challenge: Growing (Learning) Probabilistic Neuro-Fuzzy Network (1) input layer, n inputs 1-st hidden layer, N neurons 2-nd hidden layer, (m+1) elements output layer, m divisors Bodyanskiy Ye., Gorshkov Ye., Kolodyazhniy V., Wernstedt J., Probabilistic Neuro-Fuzzy Network with Non-Conventional Activation Functions, In: Knowledge-Based Intelligent Information & Engineering Systems, Proceedings of Seventh International Conference KES2003, 3–5 September, Oxford, United Kingdom, LNAI, Springer-Verlag, 2003. Bodyanskiy Ye., Gorshkov Ye., Kolodyazhniy V. Resource-Allocating Probabilistic Neuro-Fuzzy Network, In: Proceedings of International Conference on Fuzzy Logic and Technology, 10–12 September, Zittau, Germany, 2003.

81
Challenge: Growing (Learning) Probabilistic Neuro-Fuzzy Network (2) §Implements fuzzy reasoning and classification (fuzzy classification network); §Automatically creates neurons based on the training set (growing network); §Learns the free parameters of the network based on the training set (learning network); §Guarantees high classification precision based on fast learning (high-performance network); §Able to perform with huge volumes of data using limited computational resources (powerful and economical network); §Able to work in real time (real-time network). Tested on real data in comparison with the classical probabilistic neural network. A unique combination of features.
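The classical probabilistic neural network used as the comparison baseline here is essentially a Parzen-window classifier; a minimal sketch of that baseline (an assumed illustration, not the growing neuro-fuzzy network itself):

```python
import math

# Minimal Parzen-window probabilistic neural network (PNN) sketch:
# each class's score is the sum of Gaussian kernel activations over
# that class's training points; the query gets the top-scoring class.

def pnn_classify(train, query, sigma=1.0):
    """train: list of (feature_tuple, label); query: feature tuple."""
    scores = {}
    for features, label in train:
        d2 = sum((f - q) ** 2 for f, q in zip(features, query))
        scores[label] = scores.get(label, 0.0) + math.exp(-d2 / (2 * sigma ** 2))
    return max(scores, key=scores.get)
```

With two clusters of training points, queries near each cluster are assigned to that cluster's class; the smoothing parameter sigma plays the role that the learned activation functions refine in the cited neuro-fuzzy variant.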

82
Tests for Neuro-Fuzzy Algorithms The Industrial Ontologies Group (Kharkov Branch), the Data Mining Research Group and the Control Systems Research Laboratory of the Artificial Intelligence Department of Kharkov National University of Radioelectronics have essential theoretical and practical experience in implementing the neuro-fuzzy approach, and specifically Real-Time Probabilistic Neuro-Fuzzy Systems for Simulation, Modeling, Forecasting, Diagnostics, Clustering and Control. We are interested in cooperation with Metso in this area and are ready to demonstrate the performance of our algorithms on real data taken from any of Metso's products, to compare our algorithms with those existing at Metso.

83
Inventions we can offer (1) §Method of intelligent preventive or predictive diagnostics and forecasting of the technical condition of industrial equipment, machines, devices, systems, etc. in real time, based on analysis of non-stationary stochastic signals (e.g. from sensors of temperature, pressure, current, shifting, frequency, energy consumption, and other parameters with threshold values). §The method is based on advanced data mining techniques, which utilize neuro-fuzzy technologies, and differs from existing tools by a flexible self-organizing network structure and by optimization of computational resources while learning.

84
Inventions we can offer (2) §Method of intelligent real-time preventive or predictive diagnostics and forecasting of the technical condition of industrial equipment, machines, devices, systems, etc., based on analysis of signals with non-stationary and non-multiplied periodical components (e.g. from sensors of vibration, noise, frequencies of rotation, current, voltage, etc.). §The method is based on optimization of computational resources while learning, through intelligent reduction of the number of signal components being analyzed.

85
Inventions we can offer (3) §Method and mechanism of optimal control of dosage and real-time infusion of anti-wear oil additives into industrial machines, based on their real-time condition monitoring.

86
Summary of problems we can solve §A rather global system for condition monitoring and preventive maintenance based on OntoServ.Net technologies (global, agent-based, ontology-based, Semantic Web services-based, semantic P2P search-based) and on modern, advanced data-mining methods and tools, with knowledge creation, warehousing and updating not only during a device's lifetime, but also utilizing (for various maintenance needs) knowledge obtained afterwards from broken-down, worn-out or aged components of the same type, through testing and investigation techniques other than information taken from the sensors of living devices.

87
Recently Performed Case Studies (1) §Ontology Development for Gas Compressing Equipment Diagnostics Realized by Neural Networks §Available in: http://www.cs.jyu.fi/ai/OntoGroup/docs/July2003.pdf Volodymyr Kushnaryov Semen Simkin

88
Recently Performed Case Studies (2) §The use of Ontologies for Faults and State Description of Gas- Transfer Units §Available in: http://www.cs.jyu.fi/ai/OntoGroup/docs/July2003.pdf Volodymyr Kushnaryov Konstantin Tatarnikov

90
Conclusion §The Industrial Ontologies Research Group (University of Jyvaskyla), which is piloting the OntoServ.Net concept of a Global Semantic Web Based System for Industrial Maintenance, also has powerful branches in Kharkov (e.g. the IOG Kharkov Branch, the Control Systems Research Laboratory, the Data Mining Research Group, etc.), with experts and experience in various challenging areas (data mining and knowledge discovery, online diagnostics, forecasting and control, model learning and integration, etc.) that can reasonably and successfully be utilized within the ongoing cooperation between Metso and the Industrial Ontologies Group.
