
1 Derivative-Free Optimization: Genetic Algorithms
Dan Simon, Cleveland State University

2 Outline
1. Biological Genetics
2. Genetic Algorithm: A Short History
3. Genetic Algorithm Example: Robot Design
4. Genetic Algorithm Options
5. Genetic Algorithm Example: Ackley Function
6. Continuous Genetic Algorithm
7. GA Code Walk-Through

3 Charles Darwin
Born in England, 1809
"You care for nothing but shooting, dogs, and rat-catching; and you will be a disgrace to yourself and all your family."
Medicine? Theology? Biology?

4 Charles Darwin
H. M. S. Beagle voyage: 1831–1836
Work on The Origin of Species: 1836–…
Paper received from Alfred Wallace: 1858
Joint presentation of both papers: 1858
The Origin of Species published: 1859
"Only" 500 pages; the first printing (1,250 copies) sold out the first day

5 Darwin’s Theory of Natural Selection
Survival of the fittest
Controversial:
– Anti-Christian?
– How are traits passed to children?
Misconceptions:
– Traits of parents could be blended in children
– Acquired traits could be passed to children

6 Gregor Mendel
Born in 1822 in what is now the Czech Republic
Poor farming family
Joined an Augustinian monastery at age 21
Studied botany (pea plants)
Discovered the ideas of genes, heredity, and dominance
His publication (1865) was ignored at the time

7 Genetic Algorithms
Nils Barricelli (mathematician), 1953: artificial life experiments on John von Neumann’s computer at Princeton
1954: “Esempi numerici di processi di evoluzione” (Numerical examples of evolutionary processes)

8 Genetic Algorithms
Alexander Fraser (biologist)
– England, Hong Kong, New Zealand, Scotland, Australia
– 1957: “Simulation of genetic systems by automatic digital computers”
Hans-Joachim Bremermann
– University of Washington, UC Berkeley
– 1958: “The evolution of intelligence”

9 Genetic Algorithms
George Box (statistician)
– Imperial Chemical Industries (England)
– 1957: “Evolutionary operation: A method for increasing industrial productivity”
– “Essentially, all models are wrong, but some are useful” (1987)
George Friedman, UCLA
– 1956: “Selective Feedback Computers for Engineering Synthesis and Nervous System Analogy” (Master’s thesis)

10 GA for Robot Design
Encoding for motor spec:
000 = 5-volt stepper
001 = 9-volt stepper
010 = 12-volt stepper
011 = 24-volt stepper
100 = 5-volt servo
101 = 9-volt servo
110 = 12-volt servo
111 = 24-volt servo
Encoding for power spec:
000 = 12-volt NiCd battery
001 = 24-volt NiCd battery
010 = 12-volt Li-ion battery
011 = 24-volt Li-ion battery
100 = 12-volt solar panel
101 = 24-volt solar panel
110 = 12-volt fusion reactor
111 = 24-volt fusion reactor
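As an illustration of how this encoding could be decoded in MATLAB, here is a minimal sketch (the function name decodeRobot and the cell-array layout are illustrative, not from the slides):

% Hypothetical helper: decode a 6-bit chromosome (3 motor bits + 3 power bits)
% into its phenotype strings, using the lookup tables from the slide.
function [motor, power] = decodeRobot(chrom)
    motors = {'5-volt stepper', '9-volt stepper', '12-volt stepper', ...
              '24-volt stepper', '5-volt servo', '9-volt servo', ...
              '12-volt servo', '24-volt servo'};
    powers = {'12-volt NiCd battery', '24-volt NiCd battery', ...
              '12-volt Li-ion battery', '24-volt Li-ion battery', ...
              '12-volt solar panel', '24-volt solar panel', ...
              '12-volt fusion reactor', '24-volt fusion reactor'};
    % chrom is a 1x6 vector of 0s and 1s; convert each 3-bit gene to a 1-based index
    motorIdx = chrom(1:3) * [4; 2; 1] + 1;
    powerIdx = chrom(4:6) * [4; 2; 1] + 1;
    motor = motors{motorIdx};
    power = powers{powerIdx};
end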

11 GA for Robot Design
Fitness = Range (hrs) + Power (W) - Weight (kg)
Fitness is evaluated by experiment or by simulation
Note that we are combining incompatible units
Randomly create the initial population: each individual is represented by a chromosome with two genes (motor spec and power spec), as sketched below
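A minimal sketch of the random initialization in MATLAB (the population size of 20 is illustrative):

% Random initial population of 6-bit chromosomes.
% Each row is one individual: bits 1-3 = motor gene, bits 4-6 = power gene.
popSize = 20;                        % illustrative population size
pop = randi([0, 1], popSize, 6);     % popSize x 6 matrix of random bits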

12 GA for Robot Design
Individual 1 chromosome = 010 101
Individual 1’s motor genotype is 010, and its motor phenotype is “12-V stepper”
Single-point crossover (crossover point after the second bit):
Two parents:  010101 and 101001
Two children: 011001 and 100101
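A sketch of single-point crossover for bit-string parents, assuming equal-length row vectors (the function name is illustrative):

% Single-point crossover of two equal-length parent row vectors.
function [c1, c2] = crossover(p1, p2)
    n = length(p1);
    k = randi(n - 1);              % crossover point: after element k
    c1 = [p1(1:k), p2(k+1:n)];     % head of p1, tail of p2
    c2 = [p2(1:k), p1(k+1:n)];     % head of p2, tail of p1
end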

13 GA for Robot Design
How do we decide which individuals to mate?
Fitness-proportional selection, AKA roulette-wheel selection
Example: four individuals with fitness values 10, 20, 30, and 40; since the values sum to 100, their selection probabilities are 10%, 20%, 30%, and 40%
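A minimal roulette-wheel selection sketch, assuming nonnegative fitness values (the function name is illustrative):

% Roulette-wheel (fitness-proportional) selection:
% returns the index of one selected individual.
function idx = rouletteSelect(fitness)
    c = cumsum(fitness);               % e.g. [10 30 60 100] for the slide's example
    r = rand * c(end);                 % uniform random point on the wheel
    idx = find(r <= c, 1, 'first');    % first slot covering that point
end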

14 A Simple Genetic Algorithm
Parents ← {randomly generated population}
While not (termination criterion)
    Calculate the fitness of each parent in the population
    Children ← ∅
    While |Children| < |Parents|
        Use fitnesses to select a pair of parents for mating
        Mate the parents to create children c1 and c2
        Children ← Children ∪ {c1, c2}
    Loop
    Randomly mutate some of the children
    Parents ← Children
Next generation
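A minimal MATLAB sketch of this loop for bit-string individuals, reusing the crossover and rouletteSelect sketches above; fitnessFcn is a hypothetical problem-specific fitness function, and all parameter values are illustrative:

popSize = 50; nBits = 6; nGen = 100; pMut = 0.02;
pop = randi([0, 1], popSize, nBits);           % random initial population
for gen = 1:nGen
    fit = fitnessFcn(pop);                     % hypothetical fitness function
    children = zeros(popSize, nBits);
    for k = 1:2:popSize                        % popSize assumed even
        p1 = pop(rouletteSelect(fit), :);      % select a pair of parents
        p2 = pop(rouletteSelect(fit), :);
        [children(k,:), children(k+1,:)] = crossover(p1, p2);
    end
    mask = rand(popSize, nBits) < pMut;        % randomly mutate some children
    children(mask) = 1 - children(mask);       % flip the selected bits
    pop = children;                            % next generation
end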

15 GA Termination Criteria
1. Generation count
2. Fitness threshold
3. Fitness improvement threshold

16 Critical GA Design Parameters
1. Elitism
2. Encoding scheme
3. Fitness function and scaling
4. Population size
5. Selection method (tournament, rank, …)
6. Mutation rate
7. Crossover type
8. Speciation / incest

17 GA Schematic
[Figure: the current generation of bit strings passes through selection, crossover, and mutation to produce the next generation; elitism copies the best individuals directly into the next generation.]

18 Encoding
Binary: neighboring phenotypes can have very dissimilar genotypes, and vice versa (e.g., 3 = 011 and 4 = 100 differ in every bit)
Gray: neighboring phenotypes have similar genotypes
3-bit Gray code sequence: 000, 001, 011, 010, 110, 111, 101, 100
MATLAB plot of an example multimodal fitness function:
x = -5 : 0.1 : 2;
plot(x, x.^4 + 5*x.^3 + 4*x.^2 - 4*x + 1);
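A sketch of converting between binary and Gray code for bit row vectors (function names are illustrative):

% Binary to Gray: keep the top bit, then XOR each bit with its left neighbor.
function g = bin2gray(b)
    g = [b(1), mod(b(1:end-1) + b(2:end), 2)];
end

% Gray to binary: a cumulative XOR (parity of the running sum) recovers the bits.
function b = gray2bin(g)
    b = mod(cumsum(g), 2);
end

For example, bin2gray([0 1 1]) returns [0 1 0], matching the fourth entry of the sequence above.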

19 Gray Codes
Bell Labs researcher Frank Gray introduced the term “reflected binary code” in his 1947 patent application.

20 Ackley Function
[Figure: the two-dimensional Ackley function formula and its surface plot]
Minimization problem; global minimum = 0 (at x = y = 0)
Can be generalized to any number of dimensions (see the sketch below)
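Since the formula image did not survive, here is a sketch of the standard two-dimensional Ackley function, assuming the usual parameter values a = 20, b = 0.2, c = 2π:

% Standard 2-D Ackley function (assumed parameters a = 20, b = 0.2, c = 2*pi).
% Global minimum f(0, 0) = 0.
function f = ackley(x, y)
    f = -20 * exp(-0.2 * sqrt(0.5 * (x.^2 + y.^2))) ...
        - exp(0.5 * (cos(2*pi*x) + cos(2*pi*y))) + 20 + exp(1);
end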

21 Ackley Function
100 Monte Carlo simulations
Population size = 50
Mutation rate = 2%
Crossover probability = 100%
Single-point crossover
Encoding: binary or Gray
Elitism: 0 or 2

22 Ackley Function
[Figure: GA results on the Ackley function]

23 Ackley Function
[Figure: average of 100 Monte Carlo simulations]

24 Ackley Function
[Figure: GA results on the Ackley function]

25 Continuous Genetic Algorithms
Single-point crossover of real-valued chromosomes (crossover point after the first element):
Parents:  [1.23, 4.76, 2.19, 7.63] and [9.73, 1.09, 4.87, 8.28]
Children: [1.23, 1.09, 4.87, 8.28] and [9.73, 4.76, 2.19, 7.63]
Usually, GAs for continuous problems are implemented as continuous GAs rather than by bit-encoding the variables
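Note that the crossover sketch from slide 12 applies unchanged to real-valued chromosomes:

p1 = [1.23, 4.76, 2.19, 7.63];
p2 = [9.73, 1.09, 4.87, 8.28];
[c1, c2] = crossover(p1, p2);   % gives the children above when k = 1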

26 Continuous Genetic Algorithms
Blended crossover:
Select a random number r ∈ [0, 1]
Genotype operation: c = p1 + r(p2 - p1)
The child c lies on the line segment between Parent 1 and Parent 2
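A minimal sketch of blended crossover, applying a single r to the whole chromosome (the function name is illustrative):

% Blended crossover: the child is a random convex combination of the parents.
function c = blendCrossover(p1, p2)
    r = rand;                  % r in [0, 1]
    c = p1 + r * (p2 - p1);    % child lies between the two parents
end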

27 Continuous Genetic Algorithms
Mutation example: suppose x = [9.73, 1.09, 4.87, 8.28], so the problem dimension is 4
Aggressive mutation:
r ← random number ∈ [0, 1]
If r < pm then
    i ← random integer ∈ [1, 4]
    r ← random number ∈ [0, 1]
    x(i) ← xmin + r(xmax - xmin)
end if
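A sketch of this aggressive mutation in MATLAB, where the mutated gene is replaced by a uniformly random value over the search domain (the function name is illustrative):

% Aggressive mutation: with probability pm, replace one randomly
% chosen gene with a uniform random value in [xmin, xmax].
function x = aggressiveMutate(x, pm, xmin, xmax)
    if rand < pm
        i = randi(length(x));                 % pick one gene at random
        x(i) = xmin + rand * (xmax - xmin);   % replace it entirely
    end
end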

28 Continuous Genetic Algorithms
Mutation example: suppose x = [9.73, 1.09, 4.87, 8.28], so the problem dimension is 4
Gentle mutation:
r ← random number ∈ [0, 1]
If r < pm then
    i ← random integer ∈ [1, 4]
    r ← zero-mean Gaussian random number with standard deviation σ
    x(i) ← x(i) + r
end if
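The corresponding gentle mutation sketch, where the gene is perturbed rather than replaced (the function name is illustrative):

% Gentle mutation: with probability pm, perturb one randomly chosen
% gene by a zero-mean Gaussian amount with standard deviation sigma.
function x = gentleMutate(x, pm, sigma)
    if rand < pm
        i = randi(length(x));           % pick one gene at random
        x(i) = x(i) + sigma * randn;    % small Gaussian perturbation
    end
end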

29 Rastrigin Benchmark Function
[Figure: the Rastrigin function formula and its surface plot]
Global minimum f(x) = 0 at xi = 0 for all i
p dimensions
Lots of local minima (see the sketch below)
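Since the formula image did not survive, here is a sketch of the standard p-dimensional Rastrigin function, f(x) = 10p + Σ [xi^2 - 10 cos(2π xi)]:

% Standard Rastrigin function for a p-dimensional row vector x.
% Global minimum f(0) = 0; highly multimodal.
function f = rastrigin(x)
    p = length(x);
    f = 10 * p + sum(x.^2 - 10 * cos(2 * pi * x));
end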

30 Rastrigin Benchmark Function
Population size = 50
Mutation rate = 1%
Crossover probability = 100%
Single-point crossover
Elitism = 2
15 dimensions
See GA.m for the code walk-through

