1 Neuro-fuzzy modeling between minimum and maximum. Claudio Moraga, University of Dortmund, Germany. Universidad Técnica Federico Santa María, Valparaíso, March 2001

2 Outline
Motivation
Data-driven modeling
The ANFIS system
Compensating systems
Symmetric sums and S-functions
Rule interpretation
Conclusions

3 Motivation

4 Data-driven fuzzy modeling
Generating fuzzy rules from examples: the method of L.X. Wang and J.M. Mendel (sketched below)
Neuro-fuzzy extraction of rules from examples: using feedforward neural networks with an appropriate architecture
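A minimal sketch of the Wang-Mendel idea as commonly described: one candidate rule per training pair (each variable gets the linguistic term with the highest membership), with conflicting rules resolved by keeping the one with the highest degree. The triangular terms, their parameters, and the toy data are illustrative assumptions, not from the slides.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b (illustrative choice)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Three illustrative terms per variable, all defined on the unit interval.
TERMS = {"low": (-0.5, 0.0, 0.5), "medium": (0.0, 0.5, 1.0), "high": (0.5, 1.0, 1.5)}

def best_term(value):
    """Return the term with the highest membership, and that membership."""
    name, params = max(TERMS.items(), key=lambda kv: tri(value, *kv[1]))
    return name, tri(value, *params)

def wang_mendel(samples):
    """samples: list of ((x1, x2), y) pairs -> rule base {premise: (conclusion, degree)}."""
    rulebase = {}
    for (x1, x2), y in samples:
        t1, m1 = best_term(x1)
        t2, m2 = best_term(x2)
        ty, my = best_term(y)
        degree = m1 * m2 * my            # rule degree: product of the memberships
        key = (t1, t2)
        if key not in rulebase or degree > rulebase[key][1]:
            rulebase[key] = (ty, degree)  # keep the strongest of conflicting rules
    return rulebase

data = [((0.1, 0.9), 0.5), ((0.8, 0.2), 0.5), ((0.9, 0.9), 1.0)]
for (t1, t2), (ty, deg) in wang_mendel(data).items():
    print(f"IF x1 is {t1} AND x2 is {t2} THEN y is {ty}  (degree {deg:.2f})")
```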

5 "The goal": fuzzy if-then rules ↔ neural network (diagram)

6 ANFIS-like rule extraction
Let L(x) denote the number of linguistic terms associated with x. The extracted rule base has L(x1)·L(x2) rules, but not necessarily as many different conclusions!
If x1 is T1j and x2 is T2i then ⟨conclusion⟩
(Diagram: grid partition of the input space induced by the terms T1j of x1 and T2i of x2.)
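A tiny illustration of the count above: with L(x1) = 3 and L(x2) = 2 terms, the premise grid has 6 rules, even if several of them share a conclusion. The term names and the conclusion mapping are made up for this example.

```python
from itertools import product

# Illustrative linguistic terms: L(x1) = 3, L(x2) = 2 -> 6 rules in the premise grid.
terms_x1 = ["small", "medium", "large"]
terms_x2 = ["slow", "fast"]

rules = list(product(terms_x1, terms_x2))
print(len(rules))                       # 6 = L(x1) * L(x2)

# Several premises may still share one conclusion (hypothetical mapping):
conclusion = {("small", "slow"): "low", ("small", "fast"): "low",
              ("medium", "slow"): "low", ("medium", "fast"): "high",
              ("large", "slow"): "high", ("large", "fast"): "high"}
print(len(set(conclusion.values())))    # only 2 distinct conclusions
```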

7 Analysis of ANFIS
Advantages of ANFIS:
ANFIS is a very good system for extracting numerical models from numerical data.
ANFIS allows in principle the extraction of fuzzy rules from numerical data, but this aspect has not been developed further.
Drawbacks of ANFIS:
The user has to define a priori the number of linguistic terms to be considered for the variables.
The conjunction of premises is based on (differentiable) t-norms, i.e. they are stricter than minimum, and they induce a grid-like partition of the input space.

8 Evolutionary front-end (diagram over the inputs x1 and x2)

9 The golden rule of Soccer for a Coach: "If a player runs (with the ball) 100 m in 10 sec, and he is a dribbling king, and wherever he sees a free spot of the net he shoots the ball, and the transfer fee is reasonable, then go get him!!"

10

11 Compensating systems ("The world between min and max")
Combinations of t-norms and t-conorms, e.g.:
γ-operators (Zimmermann and Zysno)
Weighted min and max (Dubois)
Symmetric sums (Silvert, Dombi)
Generalized average operators, e.g.:
Ordered weighted average (OWA)
Weighted ordered weighted average (WOWA)
Quasi-linear average

12 Learning the γ-operator with a neural network
Let y1 = T11(x1) and y2 = T23(x2). Then
aggr_γ(y1, y2) = γ·t(y1, y2) + (1-γ)·t*(y1, y2)
with t(y1, y2) = y1·y2 and t*(y1, y2) = y1 + y2 - y1·y2.
(Diagram: an ANFIS-like network whose aggregation node computes aggr_γ[T11(x1), T23(x2)], with γ learned from data.)
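A minimal sketch of how γ could be fitted, assuming the product t-norm and algebraic-sum t-conorm shown on the slide and plain gradient descent on squared error; the toy data, the learning rate, and the target value of γ are invented for illustration.

```python
# Fit the compensation parameter gamma of
#   aggr(y1, y2) = gamma * t(y1, y2) + (1 - gamma) * t_star(y1, y2)
# by gradient descent on squared error (toy data, illustrative only).

t      = lambda y1, y2: y1 * y2              # product t-norm
t_star = lambda y1, y2: y1 + y2 - y1 * y2    # algebraic-sum t-conorm

def aggr(y1, y2, gamma):
    return gamma * t(y1, y2) + (1.0 - gamma) * t_star(y1, y2)

# Toy targets produced with gamma = 0.3, to check that learning recovers it.
data = [(0.2, 0.9), (0.7, 0.4), (0.5, 0.5), (0.9, 0.8), (0.1, 0.3)]
targets = [aggr(a, b, 0.3) for a, b in data]

gamma, lr = 0.5, 0.5
for _ in range(200):
    grad = 0.0
    for (y1, y2), z in zip(data, targets):
        err = aggr(y1, y2, gamma) - z
        grad += 2.0 * err * (t(y1, y2) - t_star(y1, y2))   # d aggr / d gamma
    gamma -= lr * grad / len(data)
    gamma = min(max(gamma, 0.0), 1.0)                      # keep gamma in [0, 1]

print(round(gamma, 3))    # recovers approximately 0.3
```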

13 Generalized weighted operators of Dubois
Let w = [w1, w2, ..., wn] where wi ∈ [0,1], 1 ≤ i ≤ n. Then ∀ yi ∈ [0,1], 1 ≤ i ≤ n, with t a t-norm and s its dual t-conorm:
t_w(y1, ..., yn) = t( s(y1, 1-w1), s(y2, 1-w2), ..., s(yn, 1-wn) )
s_w(y1, ..., yn) = s( t(y1, w1), t(y2, w2), ..., t(yn, wn) )
Example: n=2; let t be the product and s the algebraic sum. Then:
t_w(y1, y2) = (y1 + (1-w1) - y1(1-w1)) · (y2 + (1-w2) - y2(1-w2)) = ((1-w1) + w1y1) · ((1-w2) + w2y2)
s_w(y1, y2) = w1y1 + w2y2 - w1w2y1y2
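A small sketch of these weighted operators, instantiated (as in the slide's example) with the product t-norm and its dual algebraic sum; the function names and test values are my own.

```python
from functools import reduce

prod    = lambda a, b: a * b              # t-norm
alg_sum = lambda a, b: a + b - a * b      # dual t-conorm

def t_w(ys, ws):
    """t_w(y) = t( s(y_i, 1 - w_i) over all i )"""
    return reduce(prod, (alg_sum(y, 1.0 - w) for y, w in zip(ys, ws)))

def s_w(ys, ws):
    """s_w(y) = s( t(y_i, w_i) over all i )"""
    return reduce(alg_sum, (prod(y, w) for y, w in zip(ys, ws)))

# n = 2 check against the closed forms on the slide.
y1, y2, w1, w2 = 0.6, 0.3, 0.9, 0.4
print(t_w([y1, y2], [w1, w2]), ((1 - w1) + w1 * y1) * ((1 - w2) + w2 * y2))
print(s_w([y1, y2], [w1, w2]), w1 * y1 + w2 * y2 - w1 * w2 * y1 * y2)
```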

14 Generalized weighted operators of Dubois in ANFIS
(Diagram: an ANFIS-like network with learnable weights wi, wj computing the weighted conjunction prod_W(T1i(x1), T2j(x2)) and the weighted disjunction p_sum_W(T1i(x1), T2j(x2)).)
p_sum_W(y1, y2) = w1y1 + w2y2 - w1w2y1y2
prod_W(y1, y2) = ((1-w1) + w1y1) · ((1-w2) + w2y2)

15 Extended logistic function - Symmetric sum
⊕ represents a symmetric sum operation and gives a non-linear combination of a t-norm and a t-conorm. Moreover, ((0,1), ⊕) is an abelian group with identity ½ and inverse 1-(·).
(Plot on [0,1]: the symmetric sum lies between the t-norm t and the t-conorm s.)
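A numerical check of the stated group structure, assuming the symmetric sum generated by the standard logistic function f(x) = 1/(1 + e^(-x)), i.e. y1 ⊕ y2 = f(f⁻¹(y1) + f⁻¹(y2)); the test values are arbitrary.

```python
import math

f     = lambda x: 1.0 / (1.0 + math.exp(-x))     # logistic generator
f_inv = lambda y: math.log(y / (1.0 - y))

def ssum(y1, y2):
    """Symmetric sum y1 (+) y2 generated by the logistic function."""
    return f(f_inv(y1) + f_inv(y2))

a, b, c = 0.2, 0.7, 0.9
assert abs(ssum(a, 0.5) - a) < 1e-9                              # 1/2 is the identity
assert abs(ssum(a, 1.0 - a) - 0.5) < 1e-9                        # 1-a is the inverse of a
assert abs(ssum(a, b) - ssum(b, a)) < 1e-9                       # commutativity
assert abs(ssum(ssum(a, b), c) - ssum(a, ssum(b, c))) < 1e-9     # associativity
print(ssum(a, b))    # 0.2 (+) 0.7 is roughly 0.368
```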

16 Interpretation
(Network diagram: inputs x1, x2, ..., xn, weighted by wij, feed sigmoidal hidden nodes S(1), S(2), ..., S(k), whose outputs are aggregated into S(x).)
The weight wij affects the slope of S(j), thus acting as a linguistic modifier.
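A quick illustration of the slope effect, assuming a standard logistic S-function: scaling the argument by a weight larger (smaller) than 1 sharpens (flattens) the transition, which is the linguistic-modifier reading proposed on the slide. The weights and sample points are arbitrary.

```python
import math

S = lambda x: 1.0 / (1.0 + math.exp(-x))    # logistic S-function

# S(w*x): w > 1 gives a steeper (concentrating) curve, w < 1 a flatter (dilating) one.
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"x={x:+.1f}  S(x)={S(x):.3f}  S(3x)={S(3 * x):.3f}  S(0.3x)={S(0.3 * x):.3f}")
```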

17 Neural network / fuzzy logic interpretation
(Same network diagram as on the previous slide.)
The value of the i-th input represents the value of the i-th premise, and the value of the corresponding conclusion is obtained as the symmetric summation of the degrees of satisfaction of the modified linguistic terms induced by the premises.

18 Neural network / fuzzy logic interpretation
(Same network diagram as before.)
Let yj = S(j)(w·x) = S(j)(w1j·x1 + w2j·x2 + ... + wnj·xn).
If x1 is in S(j)_w1j and ... and xn is in S(j)_wnj, then yj = S(j)_w1j(x1) ⊕ ... ⊕ S(j)_wnj(xn), where S(j)_wij denotes S(j) with its slope modified by the weight wij.
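A numerical check of this reading, assuming a logistic S-function and the symmetric sum it generates: the hidden-node value S(j)(Σ wij·xi) coincides with the symmetric sum of the weight-modified memberships S(j)(wij·xi). Inputs and weights below are arbitrary, and no bias term is assumed.

```python
import math
from functools import reduce

S     = lambda x: 1.0 / (1.0 + math.exp(-x))
S_inv = lambda y: math.log(y / (1.0 - y))
ssum  = lambda a, b: S(S_inv(a) + S_inv(b))     # symmetric sum generated by S

x = [0.8, -1.2, 0.4]     # premise values (illustrative)
w = [1.5, 0.7, 2.0]      # weights into hidden node j (illustrative)

via_net   = S(sum(wi * xi for wi, xi in zip(w, x)))               # neural-network view
via_rules = reduce(ssum, (S(wi * xi) for wi, xi in zip(w, x)))    # fuzzy-rule view
print(via_net, via_rules)    # identical up to rounding
```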

19

20 f(x) = 1/[1 + e^(-x)]
f(x) = 1/[1 + k^(-x)]
f(x) = 1 - (1/π)·arccot(x)
f(x) = (1/2)·[1 + x/(1 + |x|)]

21 S-Activation Functions
Theorem: A neural network using S-functions as activation functions has the universal approximation property.
Definition: Let f be an S-function. Moreover, ∀ x1, x2 ∈ R, let f(x1) = v_x1 and f(x2) = v_x2. Then
f(x1) ⊕ f(x2) =def f( f⁻¹(v_x1) + f⁻¹(v_x2) )
is an aggregation operator (the general form of a symmetric summation) and f is its generating function.
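A sketch of this definition with the parametric logistic f(x) = 1/(1 + k^(-x)) as generating function, for which f(x) ⊕ f(y) = 1/(1 + k^(-(x+y))), the closed form used on the next slide; the values k = 5 and k = 1.5 match its examples, and the helper name make_aggregator is my own.

```python
import math

def make_aggregator(k):
    """Return (f, aggregation) for the generator f(x) = 1 / (1 + k**(-x))."""
    f     = lambda x: 1.0 / (1.0 + k ** (-x))
    f_inv = lambda v: math.log(v / (1.0 - v), k)
    return f, (lambda u, v: f(f_inv(u) + f_inv(v)))    # u (+) v = f(f^-1(u) + f^-1(v))

for k in (5.0, 1.5):
    f, agg = make_aggregator(k)
    x, y = 0.7, -0.3
    # The generic definition and the closed form agree.
    print(k, agg(f(x), f(y)), 1.0 / (1.0 + k ** (-(x + y))))
```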

22 Examples: f(x) ⊕ f(y) = [1 + k^(-x-y)]^(-1) (surface plots for k = 5 and k = 1.5)

23 Conclusions
There are real-world problems of compensating type which cannot be properly modelled with t-norms.
Feedforward neural networks with S-activation functions may be used to extract compensating fuzzy if-then rules, where the premises are combined with a symmetric sum.
The extracted rules explain the role of the hidden nodes of the neural network, i.e. neural networks (of the above class) are no longer "black boxes".
The ANFIS architecture may be extended to allow extracting the γ parameter of the linear combination of a t-norm t and a t-conorm t*, and to learn weighted operators.

