Slide 1: An Introduction to Evolutionary Algorithms
Giuseppe Nicosia (nicosia@dmi.unict.it, www.dmi.unict.it/~nicosia)
Department of Mathematics and Computer Science, University of Catania
DMI - Università di Catania, 14/09/05

Slide 2: Evolutionary Optimization
EAs are optimization methods based on an evolutionary metaphor that have proved effective in solving difficult problems.
Aims: introduce the main concepts, techniques and applications in the field of evolutionary computation.

Slide 3: Darwinian Evolution
C. Darwin, On the Origin of Species by Means of Natural Selection, 1859. Four postulates:
1. Individuals within species are variable;
2. Some of the variations are passed on to offspring;
3. In every generation, more offspring are produced than can survive;
4. The survival and reproduction of individuals are not random: the individuals who survive and go on to reproduce, or who reproduce the most, are those with the most favourable variations. They are naturally selected.

Slide 4: Nature of Natural Selection
- Freeman, S. & Herron, J. C., Evolutionary Analysis, 2nd ed., Prentice-Hall, 2001.
- Stearns, S. C. & Hoekstra, R. F., Evolution: An Introduction, Oxford University Press, 2000.
Natural selection acts...
- on individuals, but the consequences occur in the population;
- on individuals, not groups;
- on phenotypes, but evolution consists of changes in the genotype;
- on existing traits, but can produce new traits.
Evolution...
- is backward-looking;
- is not perfect;
- is nonrandom;
- is not progressive.

Slide 5: Why Are We Interested?
The results of evolution are 'creative', 'surprising', 'unexpected', and 'highly adapted' to environmental niches. Can a program create things like this?

Slide 6: Why Are We Interested? (cont'd)
- Unsupervised! No 'conscious' design, no knowledge involved; instead: reproductive fitness.
- But: natural evolution had an extremely long time (3.7 billion years!) and acts in parallel.

Slide 7: Evolutionary Computation
- The study of computational systems which use ideas and get inspiration from natural evolution and other biological systems (artificial ants, artificial cells, artificial immune systems, and so on).
- Evolutionary Algorithms (EAs) are closely linked to AI techniques, especially search techniques. EAs can be regarded as population-based stochastic generate-and-test algorithms.
- Evolutionary Computation (EC) techniques are used in optimization, machine learning and automatic design.

Slide 8: A Simple EA
1. t := 0;
2. Generate initial population P(t) at random;
3. Evaluate the fitness of each individual in P(t);
4. while (not termination condition) do
5.   Select parents Pa(t) from P(t) based on their fitness in P(t);
6.   Apply crossover to create offspring from parents: Pa(t) -> O(t);
7.   Apply mutation to the offspring: O(t) -> O(t);
8.   Evaluate the fitness of each individual in O(t);
9.   Select population P(t+1) from current offspring O(t) and parents P(t);
10.  t := t + 1;
11. end-do
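The loop above can be sketched in Python. This is a minimal illustration, not any particular published algorithm: the fitness function is a hypothetical one-max task (maximize the number of 1-bits), and the operator choices (binary tournament, one-point crossover, elitist truncation) are one arbitrary combination among the alternatives the later slides list.

```python
import random

def one_max(bits):
    # Illustrative fitness: number of 1-bits in the chromosome.
    return sum(bits)

def simple_ea(n_bits=20, pop_size=30, generations=50, p_mut=0.05, seed=0):
    rng = random.Random(seed)
    # Step 2: random initial population of bit-strings.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Step 5: parent selection by binary tournament on fitness.
            a, b = rng.sample(pop, 2)
            return a if one_max(a) >= one_max(b) else b
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = pick(), pick()
            # Step 6: one-point crossover.
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]
            # Step 7: per-gene bit-flip mutation.
            child = [1 - g if rng.random() < p_mut else g for g in child]
            offspring.append(child)
        # Step 9: survivor selection, best pop_size of parents + offspring.
        pop = sorted(pop + offspring, key=one_max, reverse=True)[:pop_size]
    return max(pop, key=one_max)

best = simple_ea()
```

With these settings the run reliably finds chromosomes at or near the all-ones optimum.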

Slide 9: Remarks
- There is no restriction on the fitness (objective) function: it can be non-differentiable or even discontinuous.
- There is no need to know the exact form of the objective function; simulation can be used to derive a fitness value.
- The initial population does not have to be generated randomly: existing knowledge can be used to seed the population.
- The representation does not have to be binary.
- There are alternative options for all steps of the algorithm: genetic operators, selection, termination.

Slide 10: Main Components of EAs
1. Representation of individuals: coding (binary, integer, real, etc.).
2. Evaluation method for individuals: fitness.
3. Initialization procedure for the 1st generation (random or ad hoc).
4. Definition of variation operators (mutation and crossover).
5. Parent (mating) selection mechanism.
6. Survivor (environmental) selection mechanism.
7. Technical parameters (e.g. mutation rates, population size).

Slide 11: Mutation and Crossover
EAs manipulate partial solutions in their search for the overall optimal solution. These partial solutions or 'building blocks' correspond to sub-strings of a trial solution; in our case, local sub-structures within the overall conformation.

Slide 12: Mutation Operators
- Bit-flipping
- Random bit assignment
- Multiple-bit mutation
- Inversely proportional hypermutation
- Mutation rate: the probability of applying mutation (per-chromosome mutation rate vs. per-gene (bit) mutation rate)
- Other operators for non-binary representations
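Two of the operators above can be sketched as follows. The bit-flip version uses a per-gene mutation rate; the hypermutation rate formula is one illustrative way to make the rate inversely proportional to fitness (the scaling constant is an assumption, not from the slides).

```python
import random

def bit_flip_mutation(bits, p_gene, rng):
    # Per-gene mutation: each bit flips independently with probability p_gene.
    return [1 - b if rng.random() < p_gene else b for b in bits]

def hypermutation_rate(fitness, f_max, rate_max=0.5):
    # Inversely proportional hypermutation (sketch): the fitter the
    # individual, the lower its mutation rate; rate_max is illustrative.
    return rate_max * (1.0 - fitness / f_max)

rng = random.Random(1)
child = bit_flip_mutation([0] * 10, p_gene=1.0, rng=rng)  # every bit flips
```

At maximum fitness the hypermutation rate drops to zero, so the best individuals are left untouched while poor ones are heavily mutated.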

Slide 13: Recombination / Crossover Operators
- One-point crossover
- K-point crossover
- Uniform crossover
- Crossover rate: the probability of applying crossover
- Other operators for nonlinear representations
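The three crossover variants can be sketched on list chromosomes (a minimal sketch; each function returns a single child, though in practice both complementary children are often kept):

```python
import random

def one_point(p1, p2, rng):
    # One-point crossover: cut chosen so each parent contributes >= 1 gene.
    cut = rng.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def k_point(p1, p2, k, rng):
    # K-point crossover: alternate parent segments between sorted cut points.
    cuts = sorted(rng.sample(range(1, len(p1)), k))
    child, src, prev = [], (p1, p2), 0
    for i, cut in enumerate(cuts + [len(p1)]):
        child += src[i % 2][prev:cut]
        prev = cut
    return child

def uniform(p1, p2, rng):
    # Uniform crossover: each gene comes from either parent with probability 1/2.
    return [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]

rng = random.Random(0)
c1 = one_point([0] * 8, [1] * 8, rng)
c2 = k_point([0] * 8, [1] * 8, 2, rng)
c3 = uniform([0] * 8, [1] * 8, rng)
```

Crossing an all-0 with an all-1 parent makes the segment structure visible: one-point yields a block of 0s followed by 1s, 2-point a 0-1-0 pattern, and uniform a random mixture.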

Slide 14: Selection
- Roulette wheel selection
- Fitness-proportional selection
- Rank-based selection
- Tournament selection
- Aging operator
- More complex operators
Hot topic, selection pressure: different selection operators produce different behaviour (exploration vs. exploitation).
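Tournament and roulette-wheel selection can be sketched as follows. The tournament size k is the knob for selection pressure mentioned above: k = 1 is random (pure exploration), while large k is strongly exploitative.

```python
import random

def tournament(pop, fitness, k, rng):
    # Tournament selection: sample k individuals, return the fittest.
    # Larger k -> higher selection pressure (exploitation over exploration).
    return max(rng.sample(pop, k), key=fitness)

def roulette(pop, fitness, rng):
    # Roulette wheel (fitness-proportional) selection; assumes all
    # fitness values are positive.
    total = sum(fitness(x) for x in pop)
    r = rng.uniform(0, total)
    acc = 0.0
    for x in pop:
        acc += fitness(x)
        if acc >= r:
            return x
    return pop[-1]

rng = random.Random(0)
pop = list(range(1, 11))  # toy population: fitness of integer x is x itself
winner = tournament(pop, fitness=lambda x: x, k=10, rng=rng)
chosen = roulette(pop, lambda x: x, rng)
```

With k equal to the population size the tournament is deterministic and always returns the best individual.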

Slide 15: Problems: Search Bias!
Some offspring tend to be more likely to be generated than others; this is called a bias. It depends on representation and operators:
- Crossover bias, e.g. one-point crossover vs. uniform crossover
- Mutation bias, e.g. 1-bit-flip vs. k-bit-flip
Search operators are applied to individuals; it is very important to realize the interdependency between operators and the representation of individuals.

Slide 16: 'Optimal' Parameter Tuning
- Experimental tests
- Adaptation based on measured quality
- Self-adaptation based on evolution!

Slide 17: Example (figure-only slide; no transcript text)

Slide 18: Different Historical Branches of EC
"Evolution is the natural way to program."
Thomas Ray (Frontiers of Life, Vol. One: The Origins of Life, R. Dulbecco, D. Baltimore, F. Jacob, R. Levi-Montalcini (eds.), Academic Press, 2001).

Slide 19: Evolutionary Programming (L. Fogel, 1966)
procedure EP {
  t = 0;
  initialize population P(t);
  evaluate P(t);
  until (done) {
    t = t + 1;
    parent_selection P(t);
    mutate P(t);
    evaluate P(t);
    survive P(t);
  }
}
Individuals are represented as finite state machines (FSMs), real-valued vectors, ordered lists, or graphs. All N individuals are selected to be parents and are then mutated, producing N children. These children are evaluated, and N survivors are chosen from the 2N individuals using a probabilistic function based on fitness (individuals with greater fitness have a higher chance of survival). Mutation is based on the representation used and is often adaptive; for example, when using a real-valued vector, each variable within an individual may have an adaptive mutation rate that is normally distributed. No recombination. Applied to phenotypes.
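The adaptive real-valued mutation described above can be sketched as follows. Each variable carries its own Gaussian step size, which is itself perturbed lognormally; the learning-rate formula for tau is one common choice, assumed here for illustration rather than taken from the slide.

```python
import math
import random

def ep_mutate(x, sigmas, rng, tau=None):
    # EP-style self-adaptive Gaussian mutation: every variable x[i]
    # carries its own step size sigmas[i], which is also perturbed.
    n = len(x)
    tau = tau or 1.0 / math.sqrt(2.0 * math.sqrt(n))  # illustrative choice
    new_sigmas = [max(1e-8, s * math.exp(tau * rng.gauss(0, 1)))
                  for s in sigmas]
    new_x = [xi + s * rng.gauss(0, 1) for xi, s in zip(x, new_sigmas)]
    return new_x, new_sigmas

rng = random.Random(42)
child, child_sigmas = ep_mutate([0.0, 0.0, 0.0], [0.1, 0.1, 0.1], rng)
```

Because the step sizes evolve along with the solution, the search can widen its steps in flat regions and narrow them near an optimum without any external schedule.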

Slide 20: Evolution Strategies (Rechenberg & Schwefel, 1973)
procedure ES {
  t = 0;
  initialize population P(t);
  evaluate P(t);
  until (done) {
    t = t + 1;
    parent_selection P(t);
    recombine P(t);
    mutate P(t);
    evaluate P(t);
    survive P(t);
  }
}
ESs typically use real-valued vectors. Individuals are selected uniformly at random to be parents. Pairs of parents produce children via recombination; the number of children created is greater than N. Survival is deterministic:
1. (μ, λ)-ES allows the N best children to survive, replacing the parents with these children.
2. (μ + λ)-ES allows the N best of children and parents together to survive.
Like EP, ES adapts mutation. Unlike EP, recombination does play an important role in ES, especially in adapting mutation.
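The two deterministic survival schemes can be sketched directly (a toy example where fitness of an integer is the integer itself, maximized):

```python
def plus_selection(parents, offspring, mu, fitness):
    # (mu + lambda)-ES: the mu best of parents AND offspring survive.
    # Elitist: a good parent can never be lost.
    return sorted(parents + offspring, key=fitness, reverse=True)[:mu]

def comma_selection(offspring, mu, fitness):
    # (mu, lambda)-ES: the mu best OFFSPRING survive; parents are
    # always discarded (requires lambda >= mu).
    return sorted(offspring, key=fitness, reverse=True)[:mu]

parents = [1, 2, 9]
offspring = [0, 4, 5, 6]
f = lambda x: x
plus_result = plus_selection(parents, offspring, mu=3, fitness=f)    # [9, 6, 5]
comma_result = comma_selection(offspring, mu=3, fitness=f)           # [6, 5, 4]
```

The example shows the practical difference: under plus selection the fit parent 9 survives, while under comma selection it is forgotten, which can help escape outdated self-adapted step sizes at the cost of losing good solutions.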

Slide 21: Genetic Algorithms (John Holland, 1975)
procedure GA {
  t = 0;
  initialize population P(t);
  evaluate P(t);
  until (done) {
    t = t + 1;
    parent_selection P(t);
    recombine P(t);
    mutate P(t);
    evaluate P(t);
    survive P(t);
  }
}
GAs traditionally use a more domain-independent representation, namely bit-strings. Parents are selected according to a probabilistic function based on relative fitness. N children are created via recombination from the N parents. The N children are mutated and survive, replacing the N parents in the population. Crossover is important: it is the primary search operator. Mutation flips bits with some small probability (a background operator). The Schema Theorem is the classical GA theory.

Slide 22: Genetic Programming (John Koza, 1992) 1/2
- Evolved LISP programs; tree representation; widest variety of applications.
- The term is also used by De Garis for the evolution of artificial neural networks.
Each tree (program) is composed of functions and terminals appropriate to the particular problem domain; the set of all functions and terminals is selected a priori in such a way that some of the composed trees yield a solution. The initial population is composed of such trees. The evaluation function assigns a fitness value which evaluates the performance of a tree. The evaluation is based on a preselected set of test cases (fitness cases); in general, the fitness function returns the sum of distances between the correct and obtained results on all test cases.

Slide 23: Genetic Programming (John Koza, 1992) 2/2
procedure GP {
  t = 0;
  initialize population P(t);  /* randomly create an initial population of individual computer programs */
  evaluate P(t);               /* execute each program in the population and assign it a fitness value */
  until (done) {
    t = t + 1;
    parent_selection P(t);     /* select one or two program(s) with a probability based on fitness (with reselection allowed) */
    create P(t);               /* create new programs for the population by applying the following ops with specified probability: */
      reproduction;            /* copy the selected program to the new population */
      crossover;               /* create new offspring programs by recombining randomly chosen parts from two selected programs */
      mutation;                /* create one new offspring program by randomly mutating a randomly chosen part of one selected program */
      architecture-altering ops; /* choose an architecture-altering operation from the available repertoire and create one new offspring program by applying it to the one selected program */
  }
}
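The core GP machinery, trees built from a function set and a terminal set, a recursive evaluator, and subtree crossover, can be sketched in a few lines. The representation (nested tuples) and the tiny function set {+, *} with terminals {x, constants} are illustrative assumptions, not Koza's LISP setup.

```python
import random

# A program is a nested tuple ('op', left, right), or a terminal:
# the variable 'x' or a numeric constant. Function set here: + and *.
def evaluate(tree, x):
    # Recursively evaluate a program tree at input value x.
    if tree == 'x':
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, l, r = tree
    lv, rv = evaluate(l, x), evaluate(r, x)
    return lv + rv if op == '+' else lv * rv

def subtrees(tree, path=()):
    # Yield (path, subtree) for every node; a path is a tuple of child indices.
    yield path, tree
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new):
    # Return a copy of tree with the node at `path` replaced by `new`.
    if not path:
        return new
    t = list(tree)
    t[path[0]] = replace(t[path[0]], path[1:], new)
    return tuple(t)

def crossover(t1, t2, rng):
    # Subtree crossover: graft a random subtree of t2 onto a random point of t1.
    p1, _ = rng.choice(list(subtrees(t1)))
    _, s2 = rng.choice(list(subtrees(t2)))
    return replace(t1, p1, s2)

tree = ('+', ('*', 'x', 'x'), 1)  # the program x*x + 1
val = evaluate(tree, 3)           # 3*3 + 1 = 10
rng = random.Random(0)
child = crossover(tree, ('*', 'x', 2), rng)
```

A fitness function in the slide's sense would sum |evaluate(tree, x_i) - target_i| over the preselected fitness cases.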

Slide 24: Constraint Handling Strategies [Michalewicz, Evolutionary Computation, 4(1), 1996]
1. Repair strategy: whenever an infeasible solution is produced, "repair" it, i.e. find a feasible solution "close" to the infeasible one.
2. Penalize strategy: admit infeasible individuals in the population, but penalize them by adding a suitable term to the energy.
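The penalize strategy can be sketched as follows (a minimal example; the toy constrained problem and the penalty weight are illustrative assumptions):

```python
def penalized_fitness(x, objective, violation, weight=100.0):
    # Penalize strategy: infeasible individuals stay in the population,
    # but a term proportional to the constraint violation is added to
    # the (minimized) objective. The weight is illustrative.
    return objective(x) + weight * violation(x)

# Toy problem: minimize x^2 subject to x >= 1.
obj = lambda x: x * x
viol = lambda x: max(0.0, 1.0 - x)  # amount by which x >= 1 is violated

feasible = penalized_fitness(2.0, obj, viol)    # 4.0, no penalty
infeasible = penalized_fitness(0.0, obj, viol)  # 0.0 + 100 * 1.0 = 100.0
```

Choosing the weight is the delicate part: too small and the search settles on infeasible optima, too large and the feasible region's boundary is explored poorly.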

Slide 25: Representation
Traditionally, GAs use bit strings. In theory, this representation makes the GA more problem-independent. We can also see this as a more genotypic level of representation, since the individual is in some sense encoded in the bit string. Recently, however, the GA community has investigated more phenotypic representations, including vectors of real values, ordered lists, graphs, programs, machines, automata, and neural networks.
The ES and EP communities focus on real-valued vector representations, although the EP community has also used ordered list and finite state automata representations.
Very little has been done in the way of adaptive representations.

Slide 26: Distributed EAs
Because of the inherent natural parallelism within an EA, much recent work has concentrated on the implementation of EAs on parallel machines. Typically, either one processor holds one individual (in SIMD machines) or a subpopulation (in MIMD machines). Clearly, such implementations hold the promise of decreased execution time.
More interesting are the evolutionary effects that can be naturally illustrated with parallel machines, namely speciation, niching, and punctuated equilibria (Belew and Booker, 1991).

Slide 27: Summary
- Different problems require different search operators and selection schemes; there is no universally best one.
- Using real vectors is usually more appropriate than binary strings for function optimisation.
- Many search operators are heuristics-based.
- Domain knowledge can often be incorporated into search operators and representation.

Slide 28: Advantages of EAs
- Widely applicable, also in cases where no (good) problem-specific techniques are available: multimodalities, discontinuities, constraints; noisy objective functions; multiple-criteria decision-making problems.
- No presumptions with respect to the problem space.
- Low development costs, i.e. costs to adapt to new problem spaces.
- The solutions of EAs have straightforward interpretations.
- They can be run interactively (online parameter adjustment).

Slide 29: Drawbacks of EAs
- No guarantee of finding optimal solutions within a finite amount of time.
- No complete theoretical basis (yet), but much progress is being made (X. Yao, J. He, Artificial Intelligence, 145, 2003).
- Parameter tuning is largely based on trial and error (genetic algorithms); solution: self-adaptation (evolution strategies).
- Often computationally expensive; solution: parallelism.

Slide 30: (Some!) Applications of EAs
- Optimization and problem solving (J. Foster, Evolutionary Computation, Nature Reviews Genetics, 2001).
- NP-complete problems (Lemmon & Milinkovitch, PNAS, 99(16), 2002).
- Protein folding (M. Damsbo et al., PNAS, 101(19), 2004; G. Nicosia, V. Cutello, Narzisi, Royal Society Interface, 2005).
- Automated synthesis of analog electrical circuits (J. Koza et al., Genetic Programming III, Morgan Kaufmann, 1999).
- Control & modelling (Assion et al., Science, 282, 1998).

Slide 31: Turing's Three Approaches
Turing proposed three approaches for creating an intelligent computer program. One approach was logic-driven, while a second was knowledge-based. The third approach that Turing specifically identified in 1948 for achieving machine intelligence is "... the genetical or evolutionary search by which a combination of genes is looked for, the criterion being the survival value".
Turing, A. M., Intelligent Machinery, 1948.

Slide 32:
In 1950 Turing described how evolution and natural selection might be used to create an intelligent program: "... we cannot expect to find a good child-machine at first attempt. One must experiment with teaching one such machine and see how well it learns. One can then try another and see if it is better or worse."
Turing, A. M., Computing Machinery and Intelligence, Mind, 1950.

Slide 33: Turing's Third Approach & EAs
There is an obvious connection between this process and evolution, via the identifications:
1. Structure of the child machine = hereditary material;
2. Changes of the child machine = mutations;
3. Judgment of the experimenter = natural selection.
The above features of Turing's third approach to machine intelligence are common to the various forms of evolutionary computation developed over the past four decades.

