Neuro-fuzzy modeling between minimum and maximum
Claudio Moraga, University of Dortmund, Germany © cm
Universidad Técnica Federico Santa María, Valparaíso, March 2001

Outline
Motivation
Data-driven modeling
The ANFIS system
Compensating systems
Symmetric sums and S-functions
Rules interpretation
Conclusions

Motivation

Data-driven fuzzy modeling
Generating fuzzy rules from examples: method of L.X. Wang and J.M. Mendel
Neuro-fuzzy extraction of rules from examples: using feedforward neural networks with appropriate architecture

"The goal": Fuzzy If-Then Rules ↔ Neural Network

ANFIS-like rule extraction
Let L(x) denote the number of linguistic terms associated with x. The extracted rule base has L(x_1)·L(x_2) rules, but not necessarily as many different conclusions!
If x_1 is T_1j and x_2 is T_2i then conclusion T_jk
(figure: grid partition of the input space spanned by the linguistic terms T_1j of x_1 and T_2i of x_2)
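The L(x_1)·L(x_2) rule count follows from taking every pair of linguistic terms; a minimal sketch of this grid enumeration, where the term names are purely illustrative:

```python
from itertools import product

# Hypothetical linguistic terms for two inputs (names are illustrative only)
terms_x1 = ["low", "medium", "high"]   # L(x1) = 3
terms_x2 = ["small", "large"]          # L(x2) = 2

# ANFIS-like grid partition: one rule premise per pair of terms
rules = [f"if x1 is {t1} and x2 is {t2} then <conclusion>"
         for t1, t2 in product(terms_x1, terms_x2)]

# The rule base has L(x1)·L(x2) = 6 rules
assert len(rules) == len(terms_x1) * len(terms_x2)
```

Several of these rules may still share the same conclusion, as the slide notes; only the premise grid is exhaustive.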

Analysis of ANFIS
Advantages of ANFIS: ANFIS is a very good system to extract numerical models from numerical data. ANFIS allows in principle the extraction of fuzzy rules from numerical data, but this aspect has not been further developed.
Drawbacks of ANFIS: The user has to define a priori the number of linguistic terms to be considered for the variables. The conjunction of premises is based on (differentiable) t-norms, i.e. they are stricter than minimum, and they induce a grid-like partition of the input space.

Evolutionary front-end (figure: evolutionary preprocessing stage acting on the inputs x_1 and x_2)

The golden rule of Soccer for a Coach: "If a player runs (with the ball) 100 m in 10 sec, and he is a dribbling king, and wherever he sees a free spot of the net he shoots the ball, and the transfer fee is reasonable, then go get him!!"

Compensating systems ("The world between min and max")
Combinations of t-norms and t-conorms, e.g.: γ-operators (Zimmermann and Zysno), weighted min and max (Dubois), symmetric sums (Silvert, Dombi)
Generalized average operators, e.g.: ordered weighted average (OWA), weighted ordered weighted average (WOWA), quasi-linear average

Learning the γ-operator with a neural network
Let y_1 = T_11(x_1) and y_2 = T_23(x_2). Then
aggr(y_1, y_2) = γ·t(y_1, y_2) + (1−γ)·t*(y_1, y_2)
with t(y_1, y_2) = y_1·y_2 and t*(y_1, y_2) = y_1 + y_2 − y_1·y_2
(figure: ANFIS fragment computing aggr_γ[T_11(x_1), T_23(x_2)] over the grid of terms T_1j and T_2i)
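A minimal sketch of this scheme, assuming the product t-norm and algebraic sum given above, with plain gradient descent on the squared error standing in for the network's backpropagation step; the training samples are invented for illustration:

```python
def t_norm(y1, y2):       # product t-norm
    return y1 * y2

def t_conorm(y1, y2):     # algebraic (probabilistic) sum, the dual t-conorm
    return y1 + y2 - y1 * y2

def aggr(y1, y2, gamma):  # convex combination from the slide
    return gamma * t_norm(y1, y2) + (1 - gamma) * t_conorm(y1, y2)

# gamma = 1 recovers the t-norm, gamma = 0 the t-conorm
assert aggr(0.4, 0.7, 1.0) == t_norm(0.4, 0.7)
assert aggr(0.4, 0.7, 0.0) == t_conorm(0.4, 0.7)

# Learn gamma from (y1, y2, target) samples by gradient descent
samples = [(0.2, 0.9, 0.5), (0.6, 0.6, 0.55), (0.8, 0.3, 0.45)]
gamma, lr = 0.5, 0.1
for _ in range(200):
    for y1, y2, target in samples:
        err = aggr(y1, y2, gamma) - target
        # d aggr / d gamma = t(y1, y2) - t*(y1, y2)
        gamma -= lr * err * (t_norm(y1, y2) - t_conorm(y1, y2))
        gamma = min(1.0, max(0.0, gamma))  # keep gamma in [0, 1]
```

Since aggr is linear in γ, the single parameter can equally well be trained as one weight inside the ANFIS network itself.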

Generalized weighted operators of Dubois
Let w = [w_1, w_2, ..., w_n] where w_i ∈ [0,1], 1 ≤ i ≤ n. Then for all y_i ∈ [0,1], 1 ≤ i ≤ n, t a t-norm and s its dual t-conorm:
t_w(y_1, ..., y_n) = t( s(y_1, 1−w_1), s(y_2, 1−w_2), ..., s(y_n, 1−w_n) )
s_w(y_1, ..., y_n) = s( t(y_1, w_1), t(y_2, w_2), ..., t(y_n, w_n) )
Example: n = 2; let t be the product and s the algebraic sum. Then:
t_w(y_1, y_2) = (y_1 + (1−w_1) − y_1·(1−w_1))·(y_2 + (1−w_2) − y_2·(1−w_2)) = ((1−w_1) + w_1·y_1)·((1−w_2) + w_2·y_2)
s_w(y_1, y_2) = w_1·y_1 + w_2·y_2 − w_1·w_2·y_1·y_2
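The example can be checked directly; this is a sketch of the weighted operators for the product / algebraic-sum pair, extended to n arguments by iterating the algebraic sum:

```python
def t_w(ys, ws):
    # Weighted t-norm t(s(y_i, 1-w_i), ...), simplified per-factor as on the slide
    result = 1.0
    for y, w in zip(ys, ws):
        result *= (1 - w) + w * y
    return result

def s_w(ys, ws):
    # Weighted t-conorm s(t(y_i, w_i), ...), iterating the algebraic sum
    result = 0.0
    for y, w in zip(ys, ws):
        wy = w * y
        result = result + wy - result * wy
    return result

ys = [0.3, 0.8]
# With all weights equal to 1, the plain product / algebraic sum are recovered
assert abs(t_w(ys, [1, 1]) - 0.3 * 0.8) < 1e-12
assert abs(s_w(ys, [1, 1]) - (0.3 + 0.8 - 0.3 * 0.8)) < 1e-12
# A weight of 0 makes the corresponding argument neutral for both operators
assert abs(t_w(ys, [0, 1]) - 0.8) < 1e-12
assert abs(s_w(ys, [0, 1]) - 0.8) < 1e-12
```

The weight-0 case shows why these operators are attractive for rule extraction: a learned weight near zero effectively deletes a premise from the rule.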

Generalized weighted operators of Dubois in ANFIS
p_sum_w(y_1, y_2) = w_1·y_1 + w_2·y_2 − w_1·w_2·y_1·y_2
prod_w(y_1, y_2) = ((1−w_1) + w_1·y_1)·((1−w_2) + w_2·y_2)
(figure: ANFIS fragment with a weighted product node prod_w(T_1i(x_1), T_2j(x_2)) and a weighted probabilistic-sum node p_sum_w(T_1i(x_1), T_2j(x_2)))

Extended logistic function - Symmetric sum
⊕ represents a symmetric sum operation and gives a non-linear combination of a t-norm and a t-conorm. Moreover ((0,1), ⊕) is an abelian group with identity ½ and inverse 1−(·).
(figure: plot of the extended logistic function between the curves labelled t and s)

Interpretation
(figure: feedforward network with inputs x_1, x_2, ..., x_n, hidden S-function nodes S^(1), S^(2), ..., S^(k) receiving the weighted sums w_1j·x_1 + w_2j·x_2 + ... + w_nj·x_n, and a ⊕ node combining their outputs into S(x))
The weight w_ij affects the slope of S^(j), thus acting as a linguistic modifier.

Neural network / Fuzzy logic interpretation
(figure: the same feedforward network with hidden S-function nodes and a ⊕ output node)
The value of the i-th input represents the value of the i-th premise, and the value of the corresponding conclusion is obtained as the symmetric summation of the degrees of satisfaction of the modified linguistic terms induced by the premises.

Neural network / Fuzzy logic interpretation
Let y_j = S^(j)(w·x) = S^(j)(w_1j·x_1 + w_2j·x_2 + ... + w_nj·x_n). Then:
if x_1 is in S^(j)_{w_1j} and ... and x_n is in S^(j)_{w_nj} then y_j = S^(j)_{w_1j}(x_1) ⊕ ... ⊕ S^(j)_{w_nj}(x_n)
where S^(j)_{w}(x) = S^(j)(w·x) denotes the term S^(j) modified (in slope) by the weight w.
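For the logistic S-function this identity can be verified numerically, since by the construction of ⊕ (defined two slides below as f(f⁻¹(·) + f⁻¹(·))) we have f(u) ⊕ f(v) = f(u + v); a sketch with arbitrary weights and inputs:

```python
import math

def S(x):          # logistic S-function
    return 1.0 / (1.0 + math.exp(-x))

def S_inv(v):      # its inverse (the logit)
    return math.log(v / (1.0 - v))

def sym_sum(a, b): # a ⊕ b = S(S^-1(a) + S^-1(b))
    return S(S_inv(a) + S_inv(b))

# The hidden node S(w1*x1 + w2*x2) equals the symmetric sum of the
# slope-modified terms S(w1*x1) and S(w2*x2)
w1, w2 = 0.7, 1.4
x1, x2 = 0.5, -0.3
lhs = S(w1 * x1 + w2 * x2)
rhs = sym_sum(S(w1 * x1), S(w2 * x2))
assert abs(lhs - rhs) < 1e-12
```

This is exactly why the hidden nodes admit a rule reading: the network's weighted sum followed by the sigmoid is the same operation as symmetrically summing the modified premise memberships.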

f(x) = 1/[1 + e^(−x)]; f(x) = 1/[1 + k^(−x)]
f(x) = 1 − (1/π)·arccot(x)
f(x) = (1/2)·[1 + x/(1 + |x|)]
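Each of these candidates behaves as an S-function in the sense used here: strictly increasing, bounded in (0,1), and crossing (0, ½). A quick numeric check of those properties (writing arccot(x) as π/2 − arctan(x)):

```python
import math

k = 1.5  # free base > 1 for the second candidate
candidates = [
    lambda x: 1.0 / (1.0 + math.exp(-x)),                            # logistic
    lambda x: 1.0 / (1.0 + k ** (-x)),                               # base-k logistic
    lambda x: 1.0 - (1.0 / math.pi) * (math.pi / 2 - math.atan(x)),  # arccot form
    lambda x: 0.5 * (1.0 + x / (1.0 + abs(x))),                      # algebraic sigmoid
]

for f in candidates:
    assert abs(f(0.0) - 0.5) < 1e-12                     # all cross (0, 1/2)
    vals = [f(x) for x in range(-5, 6)]
    assert all(a < b for a, b in zip(vals, vals[1:]))    # strictly increasing
    assert all(0.0 < v < 1.0 for v in vals)              # values stay in (0, 1)
```

The check only samples a finite grid, so it is a sanity test rather than a proof; the monotonicity of each closed form is easy to verify analytically.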

S-Activation Functions
Theorem: A neural network using S-functions as activation functions has the universal approximation property.
Definition: Let f be an S-function, and for all x_1, x_2 ∈ R let f(x_1) = v_x1 and f(x_2) = v_x2. Then
f(x_1) ⊕ f(x_2) =def f( f^(−1)(v_x1) + f^(−1)(v_x2) )
is an aggregation operator (the general form of a symmetric summation) and f is its generating function. cm
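With the logistic function as generator, this definition yields the classical symmetric sum a ⊕ b = a·b/(a·b + (1−a)·(1−b)), and the group properties stated on the earlier slide (identity ½, inverse 1−a) can be checked directly:

```python
import math

def f(x):          # generating S-function (logistic)
    return 1.0 / (1.0 + math.exp(-x))

def f_inv(v):
    return math.log(v / (1.0 - v))

def sym_sum(a, b): # a ⊕ b = f(f^-1(a) + f^-1(b))
    return f(f_inv(a) + f_inv(b))

a, b, c = 0.2, 0.7, 0.9
# Closed form for the logistic generator
assert abs(sym_sum(a, b) - a * b / (a * b + (1 - a) * (1 - b))) < 1e-12
# ((0,1), ⊕) is an abelian group:
assert abs(sym_sum(a, b) - sym_sum(b, a)) < 1e-12                         # commutative
assert abs(sym_sum(sym_sum(a, b), c) - sym_sum(a, sym_sum(b, c))) < 1e-9  # associative
assert abs(sym_sum(a, 0.5) - a) < 1e-12                                   # 1/2 is the identity
assert abs(sym_sum(a, 1 - a) - 0.5) < 1e-12                               # 1-a is the inverse
```

The identity and inverse follow from f⁻¹(½) = 0 and f⁻¹(1−a) = −f⁻¹(a), which hold for any odd-symmetric generator.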

Examples (with f(x) ⊕ f(y) = [1 + k^(−x−y)]^(−1) for the generator f(x) = 1/[1 + k^(−x)])
(figure: surfaces of f(x) ⊕ f(y) for k = 1.5 and k = 5) cm

Conclusions
There are real-world problems of compensating type which cannot be properly modelled with t-norms.
Feedforward neural networks with S-activation functions may be used to extract compensating fuzzy if-then rules, where the premises are combined with a symmetric sum.
The extracted rules explain the role of the hidden nodes of the neural network, i.e. neural networks (of the above class) are no longer "black boxes".
The ANFIS architecture may be extended to allow extracting the γ parameter of the linear combination of a t-norm t and a t-conorm t*, and to learn weighted operators.