Evolutionary Algorithms


1 Evolutionary Algorithms
An Introduction

"[G]enetic algorithms are based on a biological metaphor: They view learning as a competition among a population of evolving candidate problem solutions. A 'fitness' function evaluates each solution to decide whether it will contribute to the next generation of solutions. Then, through operations analogous to gene transfer in sexual reproduction, the algorithm creates a new population of candidate solutions."

Matthias Trapp, Diploma Student in Computer Science, Theoretical Ecology Group, University of Potsdam. Stanislaw Lem Workshop on Evolution, October 2005, Lviv.

2 Agenda
Introduction
Structure of an EA
Genetic Operators
Classification
Implementation
Discussion

3 Introduction As the area of genetic algorithms is very wide, it is not possible to cover everything in these slides. But you should get some idea of what genetic algorithms are and what they can be useful for. Do not expect any sophisticated mathematical theory here.

4 Motivation - The Problem(s)
Global optimization problems:
- The function has many local optima
- The function changes over time
- The function has many parameters → very large search space
Combinatorial problems / data mining
Classical NP-hard problems: TSP, SAT
Noise in the data hinders exact calculations

5 Overview: Application Domains

6 Evolution and Problem Solving
Algorithm = automated problem solver.
Broad scope: Natural Computing, the family of algorithms that mimic natural processes:
- Neural Networks
- Simulated Annealing
- DNA Computing
- Evolutionary Algorithms
Evolution vs. problem solving:
- Environment ↔ Problem
- Individual ↔ Candidate solution
- Fitness ↔ Quality
- Approximation ↔ Optimization

7 Evolutionary Algorithms
EAs are adaptive heuristic search algorithms.
Metaphor: trial and error (a.k.a. generate and test).
EAs are inspired by Darwin's theory of evolution: problems are solved by an evolutionary process that yields a best (fittest) solution, the survivor, from a population of solution candidates.
EAs have been successfully applied to a wide range of problems: aircraft design, routing in communications networks, tracking windshear, game playing, robotics, air traffic control, design, scheduling, machine learning, pattern recognition, job shop scheduling, VLSI circuit layout, strike force allocation, market forecasting, egg price forecasting, design of filters and barriers, data mining, user mining, resource allocation, path planning, theme park tours …
Evolutionary computation (EC) is the field of study devoted to the design, development, and analysis of problem solvers based on natural selection (simulated evolution); the solution is evolved.

8 Characteristics
Evolutionary algorithms differ substantially from more traditional search and optimization methods. The most significant differences are:
- EAs search a set of possible solutions in parallel
- EAs do not require derivative information
- EAs use probabilistic transition rules
- EAs are generally straightforward to apply
- EAs provide a number of potential solutions
- EAs are able to apply self-adaptation
Another useful "hammer"? If yes, then how can that be achieved?

9 Structure of an EA

10 EA Components
- Representation mechanism (definition of individuals)
- Evaluation function (or fitness function)
- Population as container data structure
- Parent/survivor selection mechanism
- Variation operators (recombination, mutation)
- Initialization procedure / termination condition

11 Evolutionary Search (Flow Chart Model)
General schema of an EA as a flow chart. Boxes are instances of datatypes; arrows represent control flow.

12 Evolutionary Search (Pseudo Code)
General schema of an EA in pseudo code:

procedure EA {
    t = 0;
    Initialize(Pop(t));
    Evaluate(Pop(t));
    while (!TerminalCondition(Pop(t))) {
        Parents(t) = ParentSelection(Pop(t));
        Offspring(t) = Recombination(Parents(t));
        Offspring(t) = Mutation(Offspring(t));
        Evaluate(Offspring(t));
        Pop(t+1) = Replace(Pop(t), Offspring(t));
        t = t + 1;
    }
}
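This schema maps almost line for line onto real code. The following self-contained C++ sketch (illustrative only; all names such as Genome, select_parent, and the OneMax fitness are ours, not from the original slides) instantiates the loop for the OneMax toy problem, maximizing the number of 1-bits:

#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

using Genome = std::vector<int>;            // bit-string representation
std::mt19937 rng(42);

// Fitness: number of 1-bits (OneMax toy problem).
int evaluate(const Genome& g) {
    int sum = 0;
    for (int bit : g) sum += bit;
    return sum;
}

Genome random_genome(int n) {
    std::bernoulli_distribution coin(0.5);
    Genome g(n);
    for (int& bit : g) bit = coin(rng) ? 1 : 0;
    return g;
}

// Parent selection: best out of two random individuals (binary tournament).
const Genome& select_parent(const std::vector<Genome>& pop) {
    std::uniform_int_distribution<size_t> pick(0, pop.size() - 1);
    const Genome& a = pop[pick(rng)];
    const Genome& b = pop[pick(rng)];
    return evaluate(a) >= evaluate(b) ? a : b;
}

int main() {
    const int n = 50, pop_size = 100, max_gen = 200;
    std::vector<Genome> pop;
    for (int i = 0; i < pop_size; ++i) pop.push_back(random_genome(n));

    for (int t = 0; t < max_gen; ++t) {
        std::vector<Genome> offspring;
        std::uniform_int_distribution<int> cut(1, n - 1);
        std::bernoulli_distribution mutate(1.0 / n);
        while ((int)offspring.size() < pop_size) {
            // Recombination: 1-point crossover of two selected parents.
            Genome mum = select_parent(pop), dad = select_parent(pop);
            int c = cut(rng);
            Genome child(mum.begin(), mum.begin() + c);
            child.insert(child.end(), dad.begin() + c, dad.end());
            // Mutation: flip each bit with probability 1/n.
            for (int& bit : child) if (mutate(rng)) bit = 1 - bit;
            offspring.push_back(child);
        }
        pop.swap(offspring);                // generational replacement
    }
    auto best = *std::max_element(pop.begin(), pop.end(),
        [](const Genome& a, const Genome& b){ return evaluate(a) < evaluate(b); });
    std::cout << "best fitness: " << evaluate(best) << " of " << n << "\n";
}

The operator choices here (best-of-two parent selection, 1-point crossover, 1/n bit-flip mutation, generational replacement) deliberately mirror the knapsack parameter table on slide 19.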

13 Representation x = E(D(x))
Mapping: problem context → problem-solving space:
- Phenotype space P (candidate solutions, individuals)
- Genotype space G (chromosomes, individuals)
- Encoding E : P → G
- Decoding D : G → P
Encoding: technical representation of individuals:
- GA, binary encoding: (1110110000100) = 7556
- ES, real-valued vectors: (ABDJEIFJDHDIE) || ( )
- EP, finite state machines
- GP, tree of objects (LISP): (IF_THEN_ELSE (> x 0) (SetX (* (* x 3) (- 4 y))) (SetY (+ x (- y 1))))
Binary encoding is not natural for many problems.
Permutation encoding: can be used in ordering problems, such as the travelling salesman problem or task-ordering problems.
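As a minimal sketch of the E/D pair for the binary case (assuming a fixed-width, most-significant-bit-first bit-string; the function names are ours):

#include <cstdint>
#include <iostream>
#include <vector>

// Encoding E : P -> G, phenotype (integer) to genotype (bit-string, MSB first).
std::vector<int> encode(uint32_t x, int bits) {
    std::vector<int> g(bits);
    for (int i = 0; i < bits; ++i)
        g[i] = (x >> (bits - 1 - i)) & 1u;
    return g;
}

// Decoding D : G -> P, bit-string back to integer; D(E(x)) == x.
uint32_t decode(const std::vector<int>& g) {
    uint32_t x = 0;
    for (int bit : g) x = (x << 1) | (uint32_t)bit;
    return x;
}

int main() {
    auto g = encode(7556, 13);              // (1110110000100) = 7556
    std::cout << decode(g) << "\n";         // prints 7556
}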

14 Population P(t) = {x₁ᵗ, ..., xₙᵗ}
Multi-set of genotypes = unit of evolution.
Invariants:
- Population size n: static (common) or dynamic (unusual)
- Non-overlapping (simple GA): the entire population is replaced each generation
- Overlapping (steady-state GA): only a few individuals are replaced each generation
- Sometimes associated with a spatial structure
Diversity: the number of different solutions in P(t).
Multi-population approaches (Pohlheim, 1995).
Two of the most common genetic algorithm implementations are the 'simple' and the 'steady-state' GA. The simple genetic algorithm is described by Goldberg in his 1989 book, Genetic Algorithms in Search, Optimization, and Machine Learning. It is a generational algorithm in which the entire population is replaced each generation. The steady-state genetic algorithm is used by the Genitor program. In this algorithm, only a few individuals are replaced each 'generation'. This type of replacement is often referred to as overlapping populations.
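A short C++ sketch of the two replacement schemes (illustrative; Individual, evaluate, and the stand-in breed pipeline are our own assumptions, and higher fitness is taken to be better):

#include <algorithm>
#include <random>
#include <vector>

std::mt19937 rng(1);

struct Individual { std::vector<int> genome; double fitness; };
using Population = std::vector<Individual>;

double evaluate(const Individual& ind) {    // toy fitness: count of 1-bits
    double s = 0; for (int b : ind.genome) s += b; return s;
}

// Stand-in variation pipeline: copy a random parent and flip one random bit.
Individual breed(const Population& pop) {
    std::uniform_int_distribution<size_t> pick(0, pop.size() - 1);
    Individual child = pop[pick(rng)];
    std::uniform_int_distribution<size_t> pos(0, child.genome.size() - 1);
    size_t i = pos(rng);
    child.genome[i] = 1 - child.genome[i];
    child.fitness = evaluate(child);
    return child;
}

// Non-overlapping (simple GA): offspring replace the entire population.
Population generational_step(const Population& pop) {
    Population next;
    while (next.size() < pop.size()) next.push_back(breed(pop));
    return next;
}

// Overlapping (steady-state GA): one offspring replaces the current worst.
void steady_state_step(Population& pop) {
    Individual child = breed(pop);
    auto worst = std::min_element(pop.begin(), pop.end(),
        [](const Individual& a, const Individual& b){ return a.fitness < b.fitness; });
    if (child.fitness > worst->fitness) *worst = child;
}

In the steady-state variant the population is overlapping: all but one individual survive each step.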

15 Genetic Operators

16 Selection/Sampling Operators
Distinguish between parent selection and survivor selection.
Typically probabilistic; operates on the population level.
Uses the fitness assignment of the solution candidates.
Role: pushing quality improvement.
Generational selection vs. steady-state selection.
Common selection methods:
- Elitist selection
- Roulette-wheel selection
- Tournament selection
- Scaling selection
- Rank selection
- Fitness-proportionate selection
- Hierarchical selection
- Boltzmann selection
- Remainder stochastic sampling
- Stochastic uniform sampling

Elitist selection: the most fit members of each generation are guaranteed to be selected. (Most GAs do not use pure elitism, but instead use a modified form where the single best, or a few of the best, individuals from each generation are copied into the next generation in case nothing better turns up.)

Fitness-proportionate selection: more fit individuals are more likely, but not certain, to be selected.

Roulette-wheel selection: a form of fitness-proportionate selection in which the chance of an individual being selected is proportional to the amount by which its fitness is greater or less than its competitors' fitness. (Conceptually, this can be represented as a game of roulette: each individual gets a slice of the wheel, but more fit ones get larger slices than less fit ones. The wheel is then spun, and whichever individual "owns" the section on which it lands is chosen.)

Scaling selection: as the average fitness of the population increases, the strength of the selective pressure also increases and the fitness function becomes more discriminating. This method can be helpful in making the best selection later on, when all individuals have relatively high fitness and only small differences in fitness distinguish one from another.

Tournament selection: subgroups of individuals are chosen from the larger population, and members of each subgroup compete against each other. Only one individual from each subgroup is chosen to reproduce.

Rank selection: each individual in the population is assigned a numerical rank based on fitness, and selection is based on this ranking rather than on absolute differences in fitness. The advantage of this method is that it can prevent very fit individuals from gaining dominance early at the expense of less fit ones, which would reduce the population's genetic diversity and might hinder attempts to find an acceptable solution.

Generational selection: the offspring of the individuals selected from each generation become the entire next generation. No individuals are retained between generations.

Steady-state selection: the offspring of the individuals selected from each generation go back into the pre-existing gene pool, replacing some of the less fit members of the previous generation. Some individuals are retained between generations.

Hierarchical selection: individuals go through multiple rounds of selection each generation. Lower-level evaluations are faster and less discriminating, while those that survive to higher levels are evaluated more rigorously. The advantage of this method is that it reduces overall computation time by using faster, less selective evaluation to weed out the majority of individuals that show little or no promise, and only subjecting those that survive this initial test to more rigorous, more computationally expensive fitness evaluation.
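Two of the listed methods as a C++ sketch (our own illustrative code, not from the slides; the roulette wheel assumes non-negative fitness values):

#include <numeric>
#include <random>
#include <vector>

std::mt19937 rng(7);

// Tournament selection: pick k random individuals, return index of the fittest.
size_t tournament_select(const std::vector<double>& fitness, int k = 2) {
    std::uniform_int_distribution<size_t> pick(0, fitness.size() - 1);
    size_t best = pick(rng);
    for (int i = 1; i < k; ++i) {
        size_t challenger = pick(rng);
        if (fitness[challenger] > fitness[best]) best = challenger;
    }
    return best;
}

// Roulette-wheel selection: probability proportional to (non-negative) fitness.
size_t roulette_select(const std::vector<double>& fitness) {
    double total = std::accumulate(fitness.begin(), fitness.end(), 0.0);
    std::uniform_real_distribution<double> spin(0.0, total);
    double r = spin(rng), acc = 0.0;
    for (size_t i = 0; i < fitness.size(); ++i) {
        acc += fitness[i];
        if (r <= acc) return i;
    }
    return fitness.size() - 1;              // numerical safety fallback
}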

17 Mutation Operator mᵢ : G → G
Unary operator, always stochastic.
- Bit-strings: bit-flips, e.g. (00101) → (10101)
- Tree: destructive sub-tree mutation, sub-tree/node swap
- List: generative/destructive, node/sequence swap
- Array: destructive element flip/swap
More than one mutation operator can be used during an evolution.
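For the bit-string case, a minimal C++ sketch (assuming a per-gene mutation rate pm, e.g. 1/n; the function name is ours):

#include <random>
#include <vector>

std::mt19937 rng(3);

// Uniform bit-flip mutation: each gene is flipped with probability pm.
void bitflip_mutate(std::vector<int>& genome, double pm) {
    std::bernoulli_distribution flip(pm);
    for (int& bit : genome)
        if (flip(rng)) bit = 1 - bit;       // (00101) may become (10101)
}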

18 Recombination cᵢ : G × … × G → G
Inherits genotype traits; typically stochastic.
Often a binary operator: Offspring = Sex(Mum, Dad)
- Bit-strings: k-point recombination, uniform recombination
- Genetic programming: (seldom used)
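Both bit-string variants as a minimal C++ sketch (illustrative names; both assume parents of equal length of at least two genes):

#include <random>
#include <vector>

std::mt19937 rng(5);
using Genome = std::vector<int>;

// 1-point recombination: child takes mum's prefix and dad's suffix.
Genome one_point_cross(const Genome& mum, const Genome& dad) {
    std::uniform_int_distribution<size_t> cut(1, mum.size() - 1);
    size_t c = cut(rng);
    Genome child(mum.begin(), mum.begin() + c);
    child.insert(child.end(), dad.begin() + c, dad.end());
    return child;
}

// Uniform recombination: each gene comes from mum or dad with equal chance.
Genome uniform_cross(const Genome& mum, const Genome& dad) {
    std::bernoulli_distribution coin(0.5);
    Genome child(mum.size());
    for (size_t i = 0; i < mum.size(); ++i)
        child[i] = coin(rng) ? mum[i] : dad[i];
    return child;
}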

19 EA for Knapsack Problem
A simple example: an EA for the knapsack problem.

Representation               {0,1}^n
Recombination                1-point crossover
Recombination probability    70%
Mutation                     uniform bit-flip
Mutation probability pm      1/n
Parent selection             best out of two random
Survival selection           generational
Population size              500
Number of offspring
Initialization               random
Termination condition        no improvement in last 25 generations
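A hedged sketch of a matching fitness function in C++; the slide does not specify how infeasible (overweight) solutions are handled, so the linear penalty below is purely our assumption:

#include <vector>

// One bit per item: genome[i] == 1 means item i is packed.
double knapsack_fitness(const std::vector<int>& genome,
                        const std::vector<double>& value,
                        const std::vector<double>& weight,
                        double capacity) {
    double v = 0.0, w = 0.0;
    for (size_t i = 0; i < genome.size(); ++i) {
        if (genome[i]) { v += value[i]; w += weight[i]; }
    }
    // Assumed penalty: overweight solutions lose fitness proportionally.
    if (w > capacity) v -= 10.0 * (w - capacity);
    return v;
}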

20 Effects of Genetic Operators
- Selection alone will tend to fill the population with copies of the best individual.
- Selection and crossover operators will tend to cause the algorithm to converge on a good but sub-optimal solution.
- Mutation alone induces a random walk through the search space.
- Selection and mutation create a parallel, noise-tolerant, hill-climbing algorithm.

21 Terminal Conditions
- Discovery of an optimal solution (precision ε > 0)
- Discovery of an optimal or near-optimal solution
- Convergence on a single solution or a set of similar solutions
- A user-specified threshold has been reached
- A maximum number of cycles has been evaluated
- The EA detects that the problem has no feasible solution
Often a disjunction of several such conditions is used.
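Such a disjunction is typically a one-line predicate; a C++ sketch with illustrative fields and thresholds (our own, not from the slides):

// Sketch of a combined termination test: stop as soon as ANY condition holds.
struct TerminationState {
    double best_fitness;         // best fitness in the current population
    double target_fitness;       // known or desired optimum
    double epsilon;              // required precision
    int    generation;           // current cycle
    int    max_generations;      // evaluation budget
    int    stagnant_generations; // generations without improvement
    int    max_stagnation;       // e.g. 25, as in the knapsack example
};

bool should_terminate(const TerminationState& s) {
    return (s.target_fitness - s.best_fitness) <= s.epsilon   // near-optimal found
        || s.generation >= s.max_generations                  // budget exhausted
        || s.stagnant_generations >= s.max_stagnation;        // converged or stuck
}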

22 Classification

23 Classification - Overview
1948, Alan Turing: "genetical or evolutionary search"
After 1950: the idea of simulating evolution to solve engineering and design problems:
- Box, 1957
- Friedberg, 1958
- Bremermann, 1962

24 Genetic Algorithms (GA)
By Holland (1975), USA; concerned with developing robust adaptive systems.
Initially an abstraction of biological evolution.
Uses bit-strings for solution representation.
First EA to use recombination; recombination seen as the main operator.
Very successful for combinatorial optimization problems.

25 Evolutionary Strategies (ES)
By Rechenberg (1973) and Schwefel (1981), Germany.
Parameter optimization of real-valued functions.
Emphasis on mutation.
Selection (μ parents, λ offspring):
- (μ, λ): choose the fittest μ of the λ > μ offspring
- (μ + λ): choose the fittest μ of the λ + μ solutions
Recombination (u, v parent vectors, w child vector): more soon (implementation example).
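As a sketch of typical ES recombination of parent vectors u, v into a child w, here are the two standard variants from the ES literature, discrete and intermediate (the C++ names are ours):

#include <random>
#include <vector>

std::mt19937 rng(11);

// Discrete recombination: w_i is copied from either u_i or v_i.
std::vector<double> discrete_recombine(const std::vector<double>& u,
                                       const std::vector<double>& v) {
    std::bernoulli_distribution coin(0.5);
    std::vector<double> w(u.size());
    for (size_t i = 0; i < u.size(); ++i) w[i] = coin(rng) ? u[i] : v[i];
    return w;
}

// Intermediate recombination: w_i = (u_i + v_i) / 2.
std::vector<double> intermediate_recombine(const std::vector<double>& u,
                                           const std::vector<double>& v) {
    std::vector<double> w(u.size());
    for (size_t i = 0; i < u.size(); ++i) w[i] = 0.5 * (u[i] + v[i]);
    return w;
}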

26 Evolutionary Programming (EP)
By Fogel, Owens, and Walsh (1966), USA.
Application: artificial intelligence.
Initially for the evolution of finite-state machines, using mutation and selection.
Later applied mainly to real-valued functions.
Strong similarity to evolution strategies.
Example: prediction of binary cycles.

27 Genetic Programming (GP)
Koza (1992); developed to simulate special functions.
Application: function fitting f(x).
Uses parse trees of terminals and non-terminals.
Assumptions: completeness, closure.
Problems: variable count, variable types.
Classical problems for genetic programming: gun firing program, water sprinkler system, maze solving program.

28 Implementation and Software
void ga::rank(void)
{
    fitness_struct temp;
    int pos;

    calc_fitness();

    // Insertion sort: order rankings[] by ascending fitness
    // (fitness is minimized here, so rankings[0] ends up as the best solution).
    for (int pass = 1; pass < POP_SIZE; ++pass) {
        temp = rankings[pass];
        pos = pass;
        while ((pos > 0) && (temp.fitness < rankings[pos - 1].fitness)) {
            rankings[pos] = rankings[pos - 1];
            --pos;
        }
        rankings[pos] = temp;
    }

    // Record the best/worst of this generation and update the running extremes.
    best_sol = rankings[0].fitness;
    worst_sol = rankings[POP_SIZE - 1].fitness;
    if (best_sol < best_overall) best_overall = best_sol;
    if (worst_sol > worst_overall) worst_overall = worst_sol;
}

29 Another Simple Example
Search space:
Evolution strategy:
Solution candidate (no encoding necessary):
Fitness function:
Parent selection: elitist
Recombination: 1-point, fixed
Non-overlapping population
→ Hybrid approach: evolution strategy and genetic program

30 Applying Self-Adaptation
Evolution of the evolution: self-adaptation = a specific on-line parameter calibration technique.
Mutation uses a random number drawn from a Gaussian distribution with zero mean and standard deviation σ.
Mutation operator:
Extending the candidate representation:
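A sketch of the standard uncorrelated self-adaptive Gaussian mutation from the ES literature, which matches this description (the learning rate τ = 1/√n and all names are conventional choices of ours, not taken from the slide):

#include <cmath>
#include <random>
#include <vector>

std::mt19937 rng(13);
std::normal_distribution<double> gauss(0.0, 1.0);   // N(0, 1)

// Candidate extended with its own step size: (x_1, ..., x_n, sigma).
struct EsIndividual {
    std::vector<double> x;
    double sigma;                                   // self-adapted mutation strength
};

// Self-adaptation: first mutate sigma, then mutate x using the NEW sigma.
void self_adaptive_mutate(EsIndividual& ind) {
    const double tau = 1.0 / std::sqrt((double)ind.x.size());
    ind.sigma *= std::exp(tau * gauss(rng));        // sigma' = sigma * exp(tau * N(0,1))
    for (double& xi : ind.x)
        xi += ind.sigma * gauss(rng);               // x_i' = x_i + sigma' * N_i(0,1)
}

The key design point: σ is mutated first and the new σ is used to perturb x, so step sizes that produce good offspring are inherited along with the solution.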

31 Working of an EA
Distinct search phases:
- Exploration
- Exploitation
Trade-off between exploration and exploitation: inefficient search vs. a propensity to focus the search too quickly.
Premature convergence ("climbing the wrong hill"): losing diversity → converging to a local optimum; there are well-known techniques to prevent this effect.
"Any-time" behaviour.

32 Multiobjective EA
- Multiobjective GA (MOGA), Fonseca and Fleming (1993)
- Niched Pareto GA (NPGA), Horn and Nafpliotis (1993)
- Nondominated Sorting GA (NSGA-II), Deb et al. (2000)
- Strength Pareto EA (SPEA), Zitzler and Thiele (1998)
- Strength Pareto EA 2 (SPEA2), Zitzler et al. (2001)

33 Parallel Implementation of EA
Subpopulations on MIMD machines (Belew and Booker, 1991): decrease of execution time.
Migration models:
- Unrestricted migration
- Ring migration
- Neighbourhood migration
Global model (worker/farmer).
Diffusion model: handles every individual separately and selects the mating partner in a local neighbourhood, so diffusion of information takes place.
Parallel genetic algorithms were developed to speed up computation by harnessing the power of parallel computers.

34 API Comparison

Name      Language      Licence     Target
PGAPack   Fortran / C   Freeware    All
EO        C++           GNU LGPL
GALib     C++           BSD         Lnx, Win
GAGS      C++
JAGA      Java
JGAP      Java

- GAGS genetic algorithm C++ class library: http://kal-el.ugr.es/GAGS/newGAGS.html
- GAlib: http://lancet.mit.edu/ga/
- Tolkien: http://lcsweb.cs.bath.ac.uk/people/jdrugo/tolkien/

35 Discussion

36 EA Advantages
- Applicable to a wide range of problems
- Useful in areas without good problem-specific techniques
- No explicit assumptions about the search space necessary
- Easy to implement
- Any-time behaviour
"... is a good designer of complex structures that are well adapted to a given environment or task."

37 EA Disadvantages
- Problem representation must be robust, i.e. it must tolerate random changes so that fatal errors or nonsense do not consistently result
- No general guarantee of finding an optimum
- No solid theoretical foundations (yet)
- Parameter tuning: a trial-and-error process (but self-adaptive variants exist in evolution strategies)
- Sometimes high memory requirements
- Implementation: high degree of freedom

38 Summary
"An EA is the second-best algorithm for any problem."
EAs are different from classical algorithms.
Less effort is needed to develop an EA that:
- delivers acceptable solutions,
- in acceptable running time,
- at low cost in manpower and time.
EAs are distributable (Belew and Booker, 1991): subpopulations on MIMD machines, via a network.
EAs are easy to implement.
"In order to make evolutionary computing work well, there must be a programmer that sets the parameters right."

39 Thank You! Questions, Concerns, Comments, Sarcasm, Insults …

40 Sources
- Spears, W. M., De Jong, K. A., Bäck, T., Fogel, D. B., and de Garis, H. (1993). "An Overview of Evolutionary Computation." Proceedings of the European Conference on Machine Learning, vol. 667.
- Eiben, A. E. "Evolutionary computing: the most powerful problem solver in the universe?"
- Michalewicz, Z. (1999). Genetic Algorithms + Data Structures = Evolution Programs. Springer.
- Fogel, L. J., Owens, A. J., and Walsh, M. J. (1966). Artificial Intelligence through Simulated Evolution. Wiley.
- Koza, J. R. (1992). Genetic Programming: On the Programming of Computers by Means of Natural Selection. MIT Press.

