Auto-Tuning Fuzzy Granulation for Evolutionary Optimization. Mohsen Davarynejad, Ferdowsi University.

Presentation transcript:

1 Auto-Tuning Fuzzy Granulation for Evolutionary Optimization. Mohsen Davarynejad, Ferdowsi University of Mashhad, davarynejad@kiaeee.org. M.-R. Akbarzadeh-T., Ferdowsi University of Mashhad, akbarzadeh@ieee.org. Carlos A. Coello Coello, CINVESTAV-IPN, ccoello@cs.cinvestav.mx

2 Outline of the presentation Brief introduction to evolutionary algorithms Fitness Approximation Methods Meta-Models in Fitness Evaluations Adaptive Fuzzy Fitness Granulation (AFFG) AFFG with a fuzzy supervisor Numerical Results Summary and Contributions


4 Brief Introduction to Evolutionary Algorithms Generic structure of evolutionary algorithms –Problem representation (encoding) –Selection, recombination and mutation –Fitness evaluations Evolutionary algorithms –Genetic algorithms –Evolution strategies –Genetic programming … Pros and cons –Stochastic, global search –No requirement for derivative information –Need a large number of fitness evaluations –Not well-suited to on-line optimization
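The generic structure above can be sketched as a minimal evolutionary loop. Everything here (binary tournament selection, uniform crossover, Gaussian mutation, the sphere test problem) is an illustrative assumption, not the specific algorithm used in the talk:

```python
import random

def evolve(fitness, n_vars=5, pop_size=20, generations=50,
           mutation_sigma=0.1, seed=0):
    """Minimal evolutionary loop (minimization): tournament selection,
    uniform crossover, Gaussian mutation, one-elite survival."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(n_vars)]
           for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]          # fitness evaluations

        def tournament():
            a, b = rng.sample(range(pop_size), 2)     # binary tournament
            return pop[a] if fits[a] < fits[b] else pop[b]

        children = []
        for _ in range(pop_size - 1):
            p1, p2 = tournament(), tournament()
            child = [x if rng.random() < 0.5 else y   # uniform crossover
                     for x, y in zip(p1, p2)]
            child = [x + rng.gauss(0, mutation_sigma) # Gaussian mutation
                     for x in child]
            children.append(child)
        elite = pop[fits.index(min(fits))]            # keep the best parent
        pop = children + [elite]
    return min(pop, key=fitness)
```

The loop makes pop_size × generations calls to fitness, which is exactly the cost that motivates the approximation methods discussed in the following slides.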

5 Motivations (When is fitness approximation necessary?) No explicit fitness function exists: to define fitness quantitatively Fitness evaluation is highly time-consuming: to reduce computation time Fitness is noisy: to cancel out noise Search for robust solutions: to avoid additional expensive fitness evaluations

6 Outline of the presentation Brief introduction to evolutionary algorithms Fitness Approximation Methods Meta-Models in Fitness Evaluations Adaptive Fuzzy Fitness Granulation (AFFG) AFFG with a fuzzy supervisor Numerical Results Summary and Contributions

7 Fitness Approximation Methods Problem approximation –To replace experiments with simulations –To replace full simulations/models with reduced simulations/models Ad hoc methods –Fitness inheritance (from parents) –Fitness imitation (from siblings) Data-driven functional approximation (Meta-Models) –Polynomials (Response surface methodology) –Neural networks, e.g., multilayer perceptrons (MLPs), RBFNs –Support vector machines (SVMs)
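Fitness inheritance, the simplest of the ad hoc methods above, fits in a few lines. The weighting convention below is an assumption for illustration, not prescribed by the slides:

```python
def inherited_fitness(parent_fits, weights=None):
    """Estimate a child's fitness as a weighted average of its parents'
    already-evaluated fitnesses, instead of calling the expensive
    fitness function. Defaults to the unweighted average."""
    if weights is None:
        weights = [1.0 / len(parent_fits)] * len(parent_fits)
    return sum(w * f for w, f in zip(weights, parent_fits))
```

A common refinement is to weight each parent by its similarity to the child, so the more closely related parent contributes more to the estimate.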


9 Outline of the presentation Brief introduction to evolutionary algorithms Fitness Approximation Methods Meta-Models in Fitness Evaluations Adaptive Fuzzy Fitness Granulation (AFFG) AFFG with a fuzzy supervisor Numerical Results Summary and Contributions

10 Meta-Models in Fitness Evaluations* Using meta-models only carries a risk of false convergence, so a combination of the meta-model with the original fitness function is necessary –Generation-Based Evolution Control –Individual-Based Evolution Control (a random strategy or a best strategy) *Yaochu Jin. “A comprehensive survey of fitness approximation in evolutionary computation”. Soft Computing, 9(1), 3-12, 2005
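The individual-based "best strategy" mentioned above can be sketched as follows; the function names and the minimization convention are assumptions for illustration:

```python
def controlled_evaluation(population, true_f, surrogate_f, n_exact):
    """Individual-based evolution control, 'best strategy': rank the
    population by the surrogate, re-evaluate the n_exact most promising
    individuals with the true fitness function, and keep surrogate
    values for the rest. Returns (individual, fitness, exact?) tuples."""
    ranked = sorted(population, key=surrogate_f)  # minimization
    out = []
    for i, ind in enumerate(ranked):
        if i < n_exact:
            out.append((ind, true_f(ind), True))   # controlled individuals
        else:
            out.append((ind, surrogate_f(ind), False))
    return out
```

The random strategy differs only in picking the n_exact controlled individuals uniformly at random instead of by surrogate rank; generation-based control instead applies the true fitness function to entire generations at fixed or adaptive intervals.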

11 Generation-Based Evolution Control (Diagram: generations t through t+4; entire generations are evaluated either with the original fitness function or with the meta-model.)

12 Individual-Based Evolution Control (Diagram: generations t through t+4; within each generation, some individuals are evaluated with the original fitness function and the rest with the meta-model.)

13 Fitness Approximation Methods (taxonomy): Problem approximation; Meta-Models (Generation-Based, Individual-Based); Ad hoc methods

14 Fitness Approximation Methods (taxonomy): Problem approximation; Meta-Models (Generation-Based, Individual-Based); Ad hoc methods. ADAPTIVE FUZZY FITNESS GRANULATION is an individual-based, meta-model fitness approximation method

15 GRADUATION AND GRANULATION The basic concepts of graduation and granulation form the core of fuzzy logic (FL) and are its principal distinguishing features. More specifically, in fuzzy logic everything is, or is allowed to be, graduated, or equivalently, fuzzy. Furthermore, in fuzzy logic everything is, or is allowed to be, granulated, with a granule being a clump of attribute-values drawn together by indistinguishability, similarity, proximity or functionality. Graduated granulation, or equivalently fuzzy granulation, is a unique feature of fuzzy logic, inspired by the way in which humans deal with complexity and imprecision.* *L. A. Zadeh, www.fuzzieee2007.org/ZadehFUZZ-IEEE2007London.pdf

16 Outline of the presentation Brief introduction to evolutionary algorithms Fitness Approximation Methods Meta-Models in Fitness Evaluations Adaptive Fuzzy Fitness Granulation (AFFG) AFFG with a fuzzy supervisor Numerical Results Summary and Contributions

17 ADAPTIVE FUZZY FITNESS GRANULATION (AFFG) (Block diagram: an individual in phenospace is passed to Fuzzy Similarity Analysis against the granulation pool; if a similar granule is found (Yes), fitness is assigned by FF association; otherwise (No), exact FF evaluation is performed and the granulation table is updated. Either path yields the fitness of the individual.)

18 Fig. 1. Flowchart of the Proposed AFFG Algorithm: Gen := 0; create an initial random population. Select a solution; if it is similar to an existing granule, look up its fitness from the queue of granules, otherwise evaluate its fitness exactly and add it to the queue as a new granule; update the table life function and granule radius. When fitness evaluation of the population is finished, perform reproduction, crossover and mutation; Gen := Gen + 1. Repeat until the termination criterion is satisfied, then report the designed results.

19 AFFG (III) A random parent population is initially created. Here, G is a set of fuzzy granules.

20 AFFG (IV) The average similarity of a new solution to each granule is calculated.
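A common way to realize such a similarity measure is with per-dimension Gaussian membership functions, averaged over the design variables. The exact membership function and width schedule are given in the paper, not reproduced on this slide, so the sketch below is an assumption:

```python
import math

def average_similarity(x, center, sigma):
    """Average Gaussian similarity of solution x to a granule with the
    given center, one membership value per design variable. Returns a
    value in (0, 1]; 1.0 means x coincides with the granule center."""
    return sum(math.exp(-((xi - ci) ** 2) / (sigma ** 2))
               for xi, ci in zip(x, center)) / len(x)
```

A new solution would then inherit the fitness of its most similar granule whenever this average similarity exceeds a threshold, and be evaluated exactly otherwise.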

21 AFFG (V) Since members of the initial populations generally have lower fitness, the granule widths are larger, so fitness is initially assigned more often by estimation/association to the granules.


23 Outline of the presentation Brief introduction to evolutionary algorithms Fitness Approximation Methods Meta-Models in Fitness Evaluations Adaptive Fuzzy Fitness Granulation (AFFG) AFFG with a fuzzy supervisor Numerical Results Summary and Contributions

24 AFFG - FS In order to avoid hand-tuning of parameters, a fuzzy supervisor is employed as an auto-tuning algorithm with three inputs: Number of Design Variables (NDV) Maximum Range of Design Variables (MRDV) Percentage of Completed Generations (PCG)

25 The knowledge base of the above architecture has a large number of rules, and extracting these rules is very difficult. Consequently, a new architecture is proposed in which the controller is separated into two controllers to reduce the complexity of the rule base.

26 AFFG – FS Granules compete for their survival through a life index. The life index is initially set to N and subsequently updated by rewarding the winner (its index increases by M), where M is the life reward of the granule and K is the index of the winning granule for each individual in generation i. Here we set M = 5. At each table update, only the granules with the highest life index are kept; the others are discarded.
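The life-index bookkeeping described above can be sketched as follows; the dictionary layout and parameter names are illustrative assumptions, with M = 5 as on the slide:

```python
def update_granule_pool(pool, winner_index, M=5, max_size=50):
    """Granule competition via a life index: the granule that wins the
    similarity match for an individual is rewarded by M; when the pool
    exceeds max_size, only the granules with the highest life index
    survive. Each granule is a dict with 'center', 'fitness', 'life'."""
    pool[winner_index]['life'] += M           # reward the winning granule
    if len(pool) > max_size:
        pool.sort(key=lambda g: g['life'], reverse=True)
        del pool[max_size:]                   # discard low-life granules
    return pool
```

Capping the pool this way bounds both the memory cost and the per-individual similarity search, which is what keeps the association step cheap relative to an exact fitness evaluation.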

27 Outline of the presentation Brief introduction to evolutionary algorithms Fitness Approximation Methods Meta-Models in Fitness Evaluations Adaptive Fuzzy Fitness Granulation (AFFG) AFFG with a fuzzy supervisor Numerical Results Summary and Contributions

28 List of benchmark test functions (Function / Formula): Griewank, Rastrigin, Ackley
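The slide's formula column did not survive extraction, but these three benchmarks are standard in the literature; the definitions below use their common forms (each has global minimum 0 at the origin), assuming the slide showed the standard variants:

```python
import math

def griewank(x):
    """Griewank: sum-of-squares bowl modulated by a cosine product."""
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1.0

def rastrigin(x):
    """Rastrigin: highly multimodal, regular grid of local minima."""
    return 10.0 * len(x) + sum(v * v - 10.0 * math.cos(2 * math.pi * v)
                               for v in x)

def ackley(x):
    """Ackley: nearly flat outer region with a deep central funnel."""
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / n
    return (-20.0 * math.exp(-0.2 * math.sqrt(s1))
            - math.exp(s2) + 20.0 + math.e)
```

Their many local optima are what make them useful stress tests for a fitness-approximation scheme: a surrogate that smooths away the ripples can easily mislead the search.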

29 Parameters used for AFFG

Function     Parameter values
Griewank     0.0001, 2190
Rastrigin    0.004, 0.15
Ackley       0.02, 0.25

The remaining parameter is considered constant and is equal to 0.9 for all simulation results


33 Effect of the number of granules on the convergence behavior At each table update, only the granules with the highest life index are kept; the others are discarded. The effect of varying the number of granules on the convergence behavior of AFFG and AFFG-FS is studied.


35 Outline of the presentation Brief introduction to evolutionary algorithms Fitness Approximation Methods Meta-Models in Fitness Evaluations Adaptive Fuzzy Fitness Granulation (AFFG) AFFG with a fuzzy supervisor Numerical Results Summary and Contributions

36 Summary Fitness approximation replaces an accurate but costly function evaluation with approximate but cheap function evaluations. Meta-modeling and other fitness approximation techniques have found a wide range of applications. Proper control of meta-models plays a critical role in the success of using them. A small number of individuals in the granule pool can still produce good results.

37 Contribution Why AFFG-FS? –Avoids initial training. –Uses guided association to speed up the search process. –Gradually sets up a model independent of initial training data, to compensate for the lack of sufficient training data and to reach a model with sufficient approximation accuracy. –Avoids the use of the model in design-variable regions unrepresented in the training set. –In order to avoid hand-tuning of parameters, a fuzzy supervisor is employed as an auto-tuning algorithm. Features of AFFG-FS: –Explores the design variable space by means of Fuzzy Similarity Analysis (FSA) to avoid premature convergence. –Dynamically updates the association model with minimal overhead cost.

