
1
Evolutionary Algorithms

2
Lehrstuhl für Informatik 2, Gabriella Kókai: Machine Learning
Overview
Motivation
Nature as a Standard
Genetic Algorithms
Genetic Programming
Evolutionary Strategies
Conclusion

3
Motivation
For millions of years, creatures have populated the Earth.
Changes in the biosphere create new environmental conditions again and again.
Populations had to learn to adapt to the new conditions: permanent stepwise development, little stagnation.
Organisms are optimally adapted with respect to their needs.
Nature has its own laws, rules, strategies and mechanisms.
Evolution: a successful, robust mechanism that allows creatures to adapt to environmental conditions over generations.
The goal of evolution is not predefined: optimisation, innovation, creativity.
Selection factors: competition, food supply, enemies, climate, environment; via human beings additionally breeding.

4
Overview
Motivation
Nature as a Standard
Genetic Algorithms
Genetic Programming
Evolutionary Strategies
Conclusion

5
Nature as a Standard - Evolution, Genome
Lamarck's thesis (1809): adaptation, urge to perfection (driven by specific needs), spontaneous creation, heredity of acquired characteristics (somatic induction) -> but there is no such feedback into the genome.
Darwin's thesis (1859): permanent evolution, common descent, multiplication of species, gradual change, natural selection, descent of characteristics with modification.
Basic conditions: overproduction of genetic variants, limitation of resources (competition).
Fitness: suitability; the result of multiple interactions with the selection factors of the environment.

6
Nature as a Standard - Evolution, Genome
Gene: functional unit, a relatively short segment of DNA; information on how to build a protein molecule.
Gene pool: the sum of all genotype variants of a population.
Genotype: all the genes (the genome); structures that contain information, i.e. instructions to define individual characteristics.
Phenotype: the interpretation of the genes, the expression of the genome as individual characteristics; competes with other phenotypes for reproductive success in a specific setting (basic conditions of the environment) => selection filter.

7
Overview
Motivation
Nature as a Standard
Genetic Algorithms
  Classic Algorithm
  Selection
  Representation of Hypotheses
  Genetic Operators
  Procedure of Evolution
  Schema Theorem
  Applications
Genetic Programming
Evolutionary Strategies
Conclusion

8
Genetic Algorithms
John H. Holland 1975; David E. Goldberg 1986
Goal: optimisation, a "generate-and-test beam search"
Variability (heterogeneity of the characteristics, uniqueness, variety)
Differential fitness (the propagation rate depends on the ability to survive in a specific setting and to produce descendants)
Heritable fitness (the genome is passed on as an imperfect copy and by mixing in the descendants)
Dualism genotype/phenotype

9
Classic Algorithm
Coding, structures: representation of hypotheses and individuals
Interpretation function: what does the coding represent?
Fitness function: shall be optimised
Termination criterion: is the optimum approximately reached?
Selection function: which individuals determine the next population?

Initialise: generate randomly n individuals for the initial population P(0)
Evaluate: determine the fitness of all individuals in P(0)
t := 0 (generation 0)
while not terminated:
  Selection: choose individuals stochastically according to their fitness -> P'
  Crossover: create children via the recombination of parental individuals from P'
  Mutation: change randomly the representation of child individuals from P'
  Update: put n randomly picked child individuals from P' into P(t+1)
  t := t + 1 (increment generation)
  Evaluate: determine the fitness of all individuals in P(t)
return the individual with the highest fitness value
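The loop above can be sketched in Python. This is a minimal sketch assuming bit-string individuals, fitness-proportional selection, single-point crossover and bit-flip mutation; the OneMax fitness (`sum`, the number of ones) used in the demo call is an illustrative assumption, not from the slide.

```python
import random

def genetic_algorithm(fitness, length, n=20, p_c=0.7, p_m=0.01,
                      generations=50, seed=0):
    """Minimal classic GA over bit-strings (sketch; assumes even n)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(n)]
    for _ in range(generations):
        # Selection: fitness-proportional (roulette wheel)
        weights = [fitness(ind) for ind in pop]
        parents = rng.choices(pop, weights=weights, k=n)
        children = []
        # Crossover: single-point recombination of parent pairs
        for a, b in zip(parents[::2], parents[1::2]):
            if rng.random() < p_c:
                cut = rng.randrange(1, length)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            children += [a[:], b[:]]
        # Mutation: flip each bit independently with probability p_m
        for ind in children:
            for i in range(length):
                if rng.random() < p_m:
                    ind[i] ^= 1
        pop = children
    return max(pop, key=fitness)

best = genetic_algorithm(sum, length=16)  # OneMax: fitness = number of ones
```

The update step here replaces the whole population with the children; the plus/comma variants discussed later change exactly this step.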

10
Representation of Hypotheses
Coding: representation of the parameters (hypotheses, individuals) to be optimised by structures over a discrete alphabet, mostly bit-strings.
s = ( ); s = (atggcaact) with alphabet A = {a, t, g, c}
Interpretation: a mapping p from the genotypical structure space into the phenotypical space of characteristics and behaviour.
Production system: s = ( ) : IF a1=T & a2=F THEN c=T ; IF a2=T THEN c=F
Triplet -> amino acid

11
Selection
The best-fitting individuals shall build the descendant population; one-sidedness shall be avoided by stochastic selection algorithms.
Fitness-proportional selection (roulette algorithm): an individual is selected with probability proportional to its own fitness, inversely proportional to that of its competitors. Problem: super-individuals may dominate too much.
Rank-based selection: individuals are sorted ascendingly according to their fitness; selection is done by a roulette algorithm based on the position in this ranking.
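Both selection schemes can be sketched as follows; a minimal sketch assuming positive fitness values (the function names are illustrative):

```python
import random

def roulette_select(pop, fitness, k, rng=random):
    """Fitness-proportional selection: pick k individuals with
    probability proportional to their own fitness."""
    weights = [fitness(ind) for ind in pop]
    return rng.choices(pop, weights=weights, k=k)

def rank_select(pop, fitness, k, rng=random):
    """Rank-based selection: sort ascending by fitness, then run the
    roulette on the ranks 1..n instead of the raw fitness values,
    so super-individuals cannot dominate."""
    ranked = sorted(pop, key=fitness)
    ranks = list(range(1, len(pop) + 1))   # worst gets rank 1, best rank n
    return rng.choices(ranked, weights=ranks, k=k)
```

Rank-based selection keeps the selection pressure bounded: the best individual is at most n times as likely to be picked as the worst, regardless of the raw fitness gap.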

12
Genetic Operators
Mutation: each position is changed with a given mutation probability (uniformly distributed random number).
With a discrete alphabet, a maximal mutation distance can limit the variation; this requires defining a distance measure on the alphabet.
Example: cgeehadcdhh --> chdcgadcdfh, mutation distance 2 (lexicographic)
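A sketch of both variants: plain bit-flip mutation, and mutation over a discrete alphabet with a maximal (lexicographic) mutation distance. The clamping at the alphabet ends is an assumption; the slide does not say how boundary symbols are handled.

```python
import random

def bitflip_mutation(bits, p_m, rng=random):
    """Flip each bit independently with mutation probability p_m."""
    return [b ^ 1 if rng.random() < p_m else b for b in bits]

def bounded_mutation(s, alphabet, max_dist, p_m, rng=random):
    """Mutation over a discrete alphabet with a maximal lexicographic
    mutation distance, limiting how far a symbol may move (sketch)."""
    out = []
    for ch in s:
        if rng.random() < p_m:
            i = alphabet.index(ch)
            step = rng.randint(-max_dist, max_dist)
            i = min(max(i + step, 0), len(alphabet) - 1)  # clamp to alphabet
            out.append(alphabet[i])
        else:
            out.append(ch)
    return ''.join(out)
```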

13
Genetic Operators (2)
Multipoint crossover: analogous, e.g. uniform or odd-even. Mask: ( )
Multi-recombination (more than 2 parental chromosomes): random selection of 2 parents as above, or several parents. Mask: ( )

14
Genetic Operators (2)
Recombination (crossover): crossing point(s) are determined randomly or by a fixed mask.
Single-point: crossing point: ( ); mask: ( )
Two-point: crossing points 3, 6:
bbafdeacca --> bbabacacca
edebacbfbb --> edefdebfbb
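Single- and two-point crossover can be written directly on strings; the two-point example reproduces the slide's pair of parents with crossing points 3 and 6:

```python
def single_point(a, b, cut):
    """Single-point crossover at a given crossing point:
    swap the tails of the two parents."""
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def two_point(a, b, c1, c2):
    """Two-point crossover: swap the segment between the crossing points."""
    return (a[:c1] + b[c1:c2] + a[c2:],
            b[:c1] + a[c1:c2] + b[c2:])
```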

15
Genetic Operators (3)
Inversion: mirrored (generally: permuted) re-insertion of the middle part, e.g. fgbbcdadace --> fgbbcdcdaea (permuted)
Deletion: loss of an arbitrary part (intercalary or terminal), e.g. fgbbcdadace --> fgbbcda (terminal)
Duplication: duplication of an arbitrary part, e.g. fgbbcdadace --> fgbbcdadacedace
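The three operators, sketched on strings with explicit segment boundaries i, j (how the segment is chosen is left open on the slide); the deletion and duplication asserts reproduce the slide's examples:

```python
def inversion(s, i, j):
    """Re-insert the middle part s[i:j] mirrored."""
    return s[:i] + s[i:j][::-1] + s[j:]

def deletion(s, i, j):
    """Lose the part s[i:j] (terminal deletion when j == len(s))."""
    return s[:i] + s[j:]

def duplication(s, i, j):
    """Append a copy of the part s[i:j] at the end, as in the
    slide's example."""
    return s + s[i:j]
```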

16
Procedure of Evolution
Example: find the global maximum of a (multi-modal) function
Bit-vectors; interpretation as in the example; evaluation: compute the function value at the interpreted location
5 (3) populations, independent of each other
Strategies: population size, choice of recombination partner, number of descendants, mutation
Plus selection: the next generation is chosen from the parents and the mutated children
Comma selection: chosen from the mutated children only; individuals survive at most one generation
Variants
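The difference between plus and comma selection is only which pool the survivors are drawn from; a minimal sketch (mu is the number of survivors, an assumed parameter name):

```python
def plus_selection(parents, children, fitness, mu):
    """(mu + lambda): the next generation is chosen from the
    parents AND the mutated children."""
    return sorted(parents + children, key=fitness, reverse=True)[:mu]

def comma_selection(parents, children, fitness, mu):
    """(mu, lambda): chosen from the mutated children only;
    every individual survives at most one generation."""
    return sorted(children, key=fitness, reverse=True)[:mu]
```

Plus selection is elitist (the best solution can never be lost); comma selection can escape stagnation because even a very fit parent is discarded after one generation.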

17
Schema Theorem
Schema H: a word over the alphabet A ∪ {*}
Instances: all words which are equal to H at the fixed positions
Order o(H): number of fixed elements
Defining length δ(H): segment length between the outermost fixed positions
E.g. A = {a, b, c}, H = (b, *, c, a, *, *); o(H) = 3, δ(H) = 3
Instances: (b, a, c, a, a, a), (b, b, c, a, b, c), (b, c, c, a, c, a)
Premises: infinitely large population, single-point crossover, point mutation
Which templates survive (stay instances of the schema)? Exponential propagation if:
Selection: more than average fitness
Recombination: short defining length
Mutation: few fixed positions
Building blocks: gene groups responsible for the increased fitness, conglomerated as compactly as possible
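The three survival conditions listed above are summarised by Holland's schema theorem; in the usual notation (the formula itself is not reproduced on the slide):

```latex
E\bigl[m(H, t+1)\bigr] \;\ge\; m(H, t)\,\frac{f(H)}{\bar f}
\left[\, 1 \;-\; p_c\,\frac{\delta(H)}{l-1} \;-\; o(H)\,p_m \right]
```

Here m(H, t) is the number of instances of schema H in generation t, f(H) the average fitness of those instances, f̄ the population's average fitness, l the string length, and p_c, p_m the crossover and mutation probabilities. Above-average f(H), short δ(H) and small o(H) make the bracket close to 1, giving the exponential propagation stated above.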

18
Applications
Can be run easily in parallel
In combination with gradient algorithms (hill-climbing): maximum search after a rough restriction of the search space
Simulation of living cells
Production systems as an extension of expert systems
Planning optimisation (storage, production processes, ...)
Optimal game strategies
Travelling Salesman Problem: the structure contains the indices of the nodes in visiting order; to visit each node exactly once, the genetic operators are modified
Evolution of the structure of neural nets: representation organised in segments depending on the number of output neurons; codes the number of layers, hidden neurons and the corresponding weights
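For the TSP point above: plain bit-string operators would produce tours that visit nodes twice or not at all, so the operators are modified to preserve permutations. A minimal sketch of one such permutation-preserving mutation (swap mutation; the slide does not specify which modified operators were used):

```python
import random

def swap_mutation(tour, rng=random):
    """Exchange two randomly chosen positions, so the result is
    still a valid tour (each node visited exactly once)."""
    i, j = rng.sample(range(len(tour)), 2)
    tour = tour[:]                       # work on a copy
    tour[i], tour[j] = tour[j], tour[i]
    return tour
```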

19
Overview
Motivation
Nature as a Standard
Genetic Algorithms
Genetic Programming
  Representation of the Hypothesis
  Differences to GA
  Applications
Evolutionary Strategies
Conclusion

20
Genetic Programming
John R. Koza, 1989
A further development of the idea of genetic algorithms
Genetic creation and optimisation of computer programs for special problem areas
Representation of the Hypothesis
Differences to GA
Applications

21
Representation of the Hypothesis
Computer program as a tree structure (like a parse tree, LISP syntax)
Combining elements: definitions of terminals and functions
Arithmetic expressions: {PLUS2, MINUS2, MULT2, DIV2}
Functions: {SIN1, COS1, EXP2, LOG2, ...}
Relations, conditional statements: {LESS2, EQUAL2, IF-THEN-ELSE3, ...}
Problem-related: {TURN-LEFT, PICK-UP, MOVE-RANDOM, ...}
Tree structure: IF-THEN-ELSE with the subtrees (LESS A B), (MULT A C), (ADD B C)
LISP syntax: ( IF-THEN-ELSE ( LESS A B ) ( MULT A C ) ( ADD B C ) )
Closed under composition
Complete with respect to the problem to be solved
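Such program trees can be represented as nested tuples and evaluated recursively; a sketch covering only the small subset of the slide's function set that its example uses:

```python
def evaluate(node, env):
    """Evaluate a program tree given as nested tuples (LISP-like
    prefix form) in an environment mapping variables to values."""
    if not isinstance(node, tuple):            # terminal: variable or constant
        return env.get(node, node)
    op, *args = node
    if op == 'IF-THEN-ELSE':                   # lazy: evaluate one branch only
        cond, then, other = args
        return evaluate(then if evaluate(cond, env) else other, env)
    vals = [evaluate(a, env) for a in args]
    if op == 'LESS':
        return vals[0] < vals[1]
    if op == 'MULT':
        return vals[0] * vals[1]
    if op == 'ADD':
        return vals[0] + vals[1]
    raise ValueError(f'unknown function {op}')

# The slide's example: ( IF-THEN-ELSE ( LESS A B ) ( MULT A C ) ( ADD B C ) )
prog = ('IF-THEN-ELSE', ('LESS', 'A', 'B'), ('MULT', 'A', 'C'), ('ADD', 'B', 'C'))
```

Closure under composition is what makes this work: every subtree returns a value that any parent node can consume.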

22
Differences to GA
Recombination: exchange of arbitrarily chosen subtrees; random determination of the crossing points; even with identical terms, mostly new structure pairs; both children survive
Mutation: substitution of a subtree by a newly generated subtree; random selection of a node; substitution by a randomly generated new term which is correctly built out of building blocks
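Subtree crossover on the nested-tuple representation can be sketched as follows; the helper names and the path-based addressing are illustrative assumptions:

```python
import random

def subtrees(node, path=()):
    """Enumerate (path, subtree) pairs of a nested-tuple program tree."""
    yield path, node
    if isinstance(node, tuple):
        for i, child in enumerate(node[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(node, path, new):
    """Return a copy of the tree with the subtree at `path` replaced."""
    if not path:
        return new
    i = path[0]
    return node[:i] + (replace(node[i], path[1:], new),) + node[i + 1:]

def subtree_crossover(a, b, rng=random):
    """Exchange randomly chosen subtrees between two parents;
    both children survive."""
    pa, sa = rng.choice(list(subtrees(a)))
    pb, sb = rng.choice(list(subtrees(b)))
    return replace(a, pa, sb), replace(b, pb, sa)
```

Mutation is the same mechanism with the incoming subtree freshly generated instead of taken from a second parent.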

23
Differences to GA (2)
Recombination, mutation: context-sensitive variation
Selection: matches the algorithmic solution of the given problem; formulated as a fitness value, e.g. a distance measure for numeric problems, or the number of successfully solved / identified cases
Copy operator: copies a GP chromosome unchanged into the next generation
Each genome is modified by only a single operator: selection between the operators
Extension of terminals to symbolic expressions

24
Genetic Programming (Example)
Minimise a Boolean function

25
Genetic Programming (Example Results)
1000 individuals, 1000 steps (8 minutes); starting length 127, resulting lengths 71 and 57

26
Applications
An ant searching for food on a minimal route
Classification of groups belonging together in complex areas, e.g. intertwined spirals
Robots searching for objects and making precisely oriented moves
Robots following walls
A random number generator with a distribution as uniform as possible
Backward docking of a truck with its trailer
Steering a robot arm with two joints to points in a field
Design of electronic circuits for analogue filters

27
Overview
Motivation
Nature as a Standard
Genetic Algorithms
Genetic Programming
Evolutionary Strategies
  Idea, Basic Principles
  Differences to GA
  Applications
Conclusion

28
Evolutionary Strategies
Ingo Rechenberg, 1964 / 1994
Adaptation of the basic mechanisms of natural evolution to technical optimisation problems in the engineering sciences
Root: evolutionary experimental methods, focused on the physical experiment
The results of the (at that time) unorthodox methods could not be analytically founded or reproduced
Idea, basic principles
Differences to GA
Applications

29
Development, Idea
Given: experimental equipment with variable parameters
Mechanical: changing position by pitch and angle
Elastic: outline by bending
Combination of segments of different sizes
Random change of the parameters in a certain area (mostly binomially distributed: small mutations preferred)
Measure the experimental result; if it got worse, take back the changes
Repeat until the optimum is found
Representation: parameters as a real-valued vector
Original experiment: orthogonal pipe redirection with the smallest loss
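The experimental procedure above is exactly a (1+1) evolution strategy; a minimal sketch for minimisation over a real-valued parameter vector (the Gaussian perturbation replaces the binomially distributed change of the physical experiment):

```python
import random

def one_plus_one_es(f, x, sigma=0.1, steps=200, rng=None):
    """(1+1)-ES in the spirit of the experiment: randomly perturb
    the parameters; if the result got worse, take back the change."""
    rng = rng or random.Random(0)
    fx = f(x)
    for _ in range(steps):
        y = [xi + rng.gauss(0, sigma) for xi in x]
        fy = f(y)
        if fy <= fx:          # improvement (or equal): keep the change
            x, fx = y, fy
        # otherwise: discard y, i.e. fall back to the previous parameters
    return x, fx
```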

30
Development, Idea (2)

31
Differences to GA
Algorithmically representable, expandable to populations / several descendants
Representation expanded by strategy parameters: they describe the variance controlling the mutation spread of the corresponding parameter and can be integrated into the optimum search (adaptation of the step size)
Real-valued structures: adaptation of the genetic operators
Mutation: numeric deviation; a Gauss-distributed random number with average 0 and the given variance
Recombination: discrete (randomly copied from one or the other parent chromosome), intermediate (averaging), local (single individuals), global (whole population)
Random selection, no proportionality to the fitness
Surplus of descendants; selection of the best for the succeeding population
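Mutation with self-adapted strategy parameters can be sketched as follows. This is a simplified form of the usual log-normal self-adaptation (a single learning rate `tau` instead of the common global/local pair, an assumption for brevity): each sigma is adapted first, then used as the standard deviation of the Gauss-distributed change of its parameter.

```python
import math
import random

def es_mutation(x, sigmas, rng=None, tau=None):
    """Mutate a real-valued individual together with its strategy
    parameters (simplified log-normal self-adaptation)."""
    rng = rng or random.Random()
    n = len(x)
    tau = tau or 1.0 / math.sqrt(2.0 * math.sqrt(n))
    common = rng.gauss(0, 1)                     # shared across all sigmas
    new_sigmas = [s * math.exp(tau * common + tau * rng.gauss(0, 1))
                  for s in sigmas]
    new_x = [xi + rng.gauss(0, si) for xi, si in zip(x, new_sigmas)]
    return new_x, new_sigmas
```

Because the sigmas are multiplied by a positive log-normal factor, they stay positive and the step size adapts itself during the optimum search, as stated above.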

32
Evolutionary Strategies (Example)
Fluid storage tank: changeable shape, fixed volume, minimal surface

33
Evolutionary Strategies (Example Results)
100 individuals, 100 generations

34
Applications
Pressure output of a two-phase supersonic nozzle: segments with variable diameter
Flow resistance of a jointed plate: 5 joints with 51 engaging levels (0, +, -) each
Rotation-body form with little flow resistance: air plane ... dolphin spindle
Minimal-weight construction of a bow bridge
Bending of a lens for concentration on the focus
Magic square: 30x30 with magic sum
Networks with minimal lengths and a given branching degree

35
Conclusion
Evolutionary algorithms solve optimisation problems.
The standard is natural evolution, which permanently produces new and partly improved organisms which must assert themselves in their environment.
The basis is biological adaptation as a learning procedure of populations of natural organisms.
Hypotheses are interpreted and evaluated by a fitness function.
The hypothesis space is explored by a stochastic search: selection as a fitness-proportional procedure.
New hypotheses arise by recombination and mutation, similar to the chromosomes of organisms.
The representation can use bit-strings / character strings (GA), programs as term and function trees (GP), or real-valued parameter vectors (ES).
The convergence of the algorithms is mostly very good, but not guaranteed.
They also work on complex problems where other algorithms have failed or are not (yet) known.
