Slide 1: Evolutionary Computational Intelligence
Lecture 5a: Overview of Evolutionary Programming
Ferrante Neri, University of Jyväskylä

Slide 2: EP quick overview
Developed: USA in the 1960s
Early names: D. Fogel
Typically applied to:
– traditional EP: machine learning tasks by finite state machines
– contemporary EP: (numerical) optimization
Attributed features:
– very open framework: any representation and mutation operators are OK
– crossbred with ES (contemporary EP)
– consequently: hard to say what "standard" EP is
Special:
– no recombination
– self-adaptation of parameters standard (contemporary EP)

Slide 3: EP technical summary tableau
Representation: Real-valued vectors
Recombination: None
Mutation: Gaussian perturbation
Parent selection: Deterministic
Survivor selection: Probabilistic (μ + μ)
Specialty: Self-adaptation of mutation step sizes (in meta-EP)

Slide 4: Historical EP perspective
EP aimed at achieving intelligence
Intelligence was viewed as adaptive behaviour
Prediction of the environment was considered a prerequisite to adaptive behaviour
Thus: the capability to predict is key to intelligence

Slide 5: Finite State Machine as predictor
Consider the following FSM
Task: predict the next input
Quality: % of predictions where out_i = in_(i+1)
Given initial state C
Input sequence 011101
Leads to output 110111
Quality: 3 out of 5
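A minimal sketch of how this prediction quality could be computed, assuming the FSM is encoded as a dictionary mapping (state, input) to (next_state, output); the function name and encoding are illustrative and not part of the original lecture:

```python
def fsm_prediction_quality(transitions, initial_state, inputs):
    """Score an FSM used as a predictor: the output emitted at step i is taken
    as the prediction of the input at step i+1 (assumed encoding, for illustration)."""
    state = initial_state
    outputs = []
    for symbol in inputs:
        state, out = transitions[(state, symbol)]   # follow the transition, record the output
        outputs.append(out)
    # count how often out_i equals in_(i+1); e.g. 3 correct out of 5 for the slide's sequence
    hits = sum(o == nxt for o, nxt in zip(outputs, inputs[1:]))
    return hits, len(inputs) - 1
```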

Slide 6: Representation
For continuous parameter optimization
Chromosomes consist of two parts:
– Object variables: x1,…,xn
– Mutation step sizes: σ1,…,σn
Full size: ⟨x1,…,xn, σ1,…,σn⟩

Slide 7: Mutation
Chromosomes: ⟨x1,…,xn, σ1,…,σn⟩
σi' = σi · (1 + α · N(0,1))
xi' = xi + σi' · Ni(0,1)
α ≈ 0.2
Boundary rule: σ' < ε0 ⇒ σ' = ε0
Other variants proposed & tried:
– Lognormal scheme as in ES
– Using variance instead of standard deviation
– Mutate σ last
– Other distributions, e.g., Cauchy instead of Gaussian
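A minimal sketch of the meta-EP mutation above, assuming the chromosome is held as two NumPy arrays x and sigma; alpha and eps0 correspond to α and ε0 on the slide, and their default values here are illustrative:

```python
import numpy as np

def ep_mutate(x, sigma, alpha=0.2, eps0=1e-6, rng=None):
    """Self-adaptive meta-EP mutation: update the step sizes, then perturb the object variables."""
    rng = rng or np.random.default_rng()
    sigma_new = sigma * (1.0 + alpha * rng.standard_normal(sigma.shape))
    sigma_new = np.maximum(sigma_new, eps0)      # boundary rule: sigma' < eps0  =>  sigma' = eps0
    x_new = x + sigma_new * rng.standard_normal(x.shape)
    return x_new, sigma_new
```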

Slide 8: Recombination
None
Rationale: one point in the search space stands for a species, not for an individual, and there can be no crossover between species
Much historical debate: "mutation vs. crossover"
The pragmatic approach seems to prevail today

Slide 9: Parent selection
Each individual creates one child by mutation
Thus:
– Deterministic
– Not biased by fitness

Slide 10: Survivor selection
P(t): μ parents, P'(t): μ offspring
Pairwise competitions in round-robin format:
– Each solution x from P(t) ∪ P'(t) is evaluated against q other randomly chosen solutions
– For each comparison, a "win" is assigned if x is better than its opponent
– The μ solutions with the greatest number of wins are retained to be parents of the next generation
Parameter q allows tuning the selection pressure
Typically q = 10
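A minimal sketch of this round-robin survivor selection, assuming minimization, that the union P(t) ∪ P'(t) is passed in as parallel lists of solutions and fitness values, and that q does not exceed the number of other solutions; all names are illustrative:

```python
import numpy as np

def round_robin_survivors(solutions, fitnesses, mu, q=10, rng=None):
    """Each solution meets q random opponents; the mu solutions with the most wins survive."""
    rng = rng or np.random.default_rng()
    n = len(solutions)                          # typically n = 2*mu (parents + offspring)
    wins = np.zeros(n, dtype=int)
    for i in range(n):
        opponents = rng.choice([j for j in range(n) if j != i], size=q, replace=False)
        wins[i] = sum(fitnesses[i] < fitnesses[j] for j in opponents)   # a "win" = better than the opponent
    survivors = np.argsort(-wins)[:mu]          # indices of the mu solutions with the greatest win count
    return [solutions[i] for i in survivors]
```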

Slide 11: Example application: evolving checkers players (Fogel '02)
Neural nets for evaluating future values of moves are evolved
NNs have a fixed structure with 5046 weights; these are evolved, plus one weight for "kings"
Representation:
– vector of 5046 real numbers for object variables (weights)
– vector of 5046 real numbers for σ's
Mutation:
– Gaussian, lognormal scheme with σ-first
– plus a special mechanism for the kings' weight
Population size 15

Slide 12: Example application: evolving checkers players (Fogel '02)
Tournament size q = 5
Programs (with the NN inside) play against other programs; no human trainer or hard-wired intelligence
After 840 generations (6 months!) the best strategy was tested against humans via the Internet
The program earned an "expert class" ranking, outperforming 99.61% of all rated players

Slide 13: Evolutionary Computational Intelligence
Lecture 5b: Differential Evolution

Slide 14: Brief historical overview
The term Differential Evolution was coined in 1994 by Storn and Price (Germany-USA)
Some important investigations have recently been carried out by Lampinen
The only existing book so far was published in 2005

Slide 15: Representation
Differential Evolution in its original implementation is intended for vectors of real numbers
Nevertheless, it can also be employed for integer problems, probably losing some efficiency

Slide 16: Population models
GAs and "comma" ES employ a generational logic: the offspring population entirely replaces the previous population
"Plus" ES considers both parents and offspring and, after sorting them, selects a predetermined number of best-performing individuals
Differential Evolution (DE) employs a steady-state logic (also used in some GAs): the successful offspring immediately "kills" the weakest parent

Slide 17: Initial sampling
A set of vectors is sampled, usually at random within the boundaries of the decision space
These vectors represent the design variables that we wish to optimize
The population size must be at least four
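A minimal sketch of this initial sampling step, assuming box constraints given as lower/upper bound vectors; the function name and arguments are illustrative:

```python
import numpy as np

def de_initial_population(pop_size, lower, upper, rng=None):
    """Sample pop_size vectors uniformly at random within the decision-space boundaries."""
    assert pop_size >= 4, "DE needs at least four individuals to draw x1..x4"
    rng = rng or np.random.default_rng()
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    return rng.uniform(lower, upper, size=(pop_size, lower.size))
```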

Slide 18: Parent selection
Four individuals x1, x2, x3, x4 are selected at random from the population by means of a uniform distribution
As in ES, there is no selection pressure in the choice of the parents undergoing the variation operators (recombination and mutation)

Slide 19: Recombination
A provisional offspring x_offp is generated by:
x_offp = x1 + K(x2 - x3)
where K is a constant value, usually set equal to 0.7

Slide 20: Mutation
With a certain probability, some genes of the provisional offspring are replaced with the corresponding genes of x4
The probability of such a mutation is usually set to 0.3

Slide 21: Survivor selection
The offspring x_off is thus generated. The fitness value of x_off is calculated and, according to a steady-state strategy, if x_off outperforms x4, it replaces x4; if on the contrary f(x_off) > f(x4), no replacement occurs.
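A minimal sketch combining slides 18-21 into a single steady-state DE update, assuming minimization; f is the objective function, and K and p_mut correspond to the constants 0.7 and 0.3 quoted above. The function name and interface are illustrative:

```python
import numpy as np

def de_step(pop, fitness, f, K=0.7, p_mut=0.3, rng=None):
    """One steady-state update: pick four random parents, build the offspring, replace x4 if it is better."""
    rng = rng or np.random.default_rng()
    i1, i2, i3, i4 = rng.choice(len(pop), size=4, replace=False)   # parent selection, no fitness bias
    x1, x2, x3, x4 = pop[i1], pop[i2], pop[i3], pop[i4]
    x_offp = x1 + K * (x2 - x3)                                    # recombination: provisional offspring
    mask = rng.random(x_offp.size) < p_mut                         # mutation: swap some genes with x4's
    x_off = np.where(mask, x4, x_offp)
    f_off = f(x_off)
    if f_off < fitness[i4]:                                        # survivor selection: steady-state replacement
        pop[i4], fitness[i4] = x_off, f_off
    return pop, fitness
```

Repeating such a step until a budget of fitness evaluations is spent reproduces the generation-free loop mentioned on the next slide.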

Slide 22: Observations
The steady-state logic leaves the DE structure without generation loops, since a replacement occurs as soon as a better solution is generated
The exploratory logic of DE has a slight analogy with Nelder-Mead, since the search directions are led by existing solutions; the analogy is rather strong in the two-dimensional case
DE is very promising, but its biggest limitation is the risk of stagnation

Slide 23: Premature convergence / stagnation
These are the main defects in EAs
Premature convergence: occurs when the population no longer contains any differences (one genotype) and the corresponding fitness value is suboptimal (+ strategy)
Stagnation: occurs when, notwithstanding a high diversity, there are no improvements (superfit individual)

Slide 24: Evolutionary Computational Intelligence
Lecture 5c: Handling Multimodality

Slide 25: Motivation 1: Multimodality
Most interesting problems have more than one locally optimal solution.

Slide 26: Motivation 2: Genetic drift
A finite population with global (panmictic) mixing and selection eventually converges around one optimum
We often might want to identify several possible peaks
This can aid global optimisation when a sub-optimum has the largest basin of attraction

Slide 27: Biological motivation 1: Speciation
In nature, different species adapt to occupy different environmental niches, which contain finite resources, so individuals are in competition with each other
Species only reproduce with other members of the same species (mating restriction)
These forces tend to lead to phenotypic homogeneity within species, but differences between species

Slide 28: Biological motivation 2: Punctuated equilibria
The theory that periods of stasis are interrupted by rapid growth when the main population is "invaded" by individuals from a previously spatially isolated group of the same species
The separated sub-populations (demes) often show local adaptations in response to slight changes in their local environments

Slide 29: Implications for evolutionary optimization
Two main approaches to diversity maintenance:
Implicit approaches:
– Impose an equivalent of geographical separation
– Impose an equivalent of speciation
Explicit approaches:
– Make similar individuals compete for resources (fitness)
– Make similar individuals compete with each other for survival

