1 Genetic Algorithm (2015/10/22)
2 Outline: Evolutionary Computation, Genetic Algorithms, Genetic Programming
3 Origins: Charles Darwin, born in England on 1809/02/12; On the Origin of Species (1859). Theory of natural selection: species change as their environment changes; biological evolution proceeds by continuous, gradual change; organisms of the same kind descend from a common ancestor; the fit survive and the unfit are eliminated (natural selection).
4 Evolution Strategies. As early as the 1950s, biologists were using computers to simulate genetic phenomena. Origin: Rechenberg and Schwefel (1965). In 1965, Rechenberg and Schwefel at the Technical University of Berlin, working on real-parameter optimization for model control in fluid mechanics, jointly devised a new computer-based problem-solving method: Evolution Strategies. From the phenomenon of biological evolution, Rechenberg drew the following conclusion: "Evolution optimizes biological processes, and evolution is itself a biological process, so evolution must also optimize itself." This study of the evolution of evolution itself is what is meant by an evolution strategy. The method relies primarily on mutation.
5 Genetic Algorithm. Around the same time, J. H. Holland (University of Michigan), together with his students and colleagues, developed an evolutionary computation framework for "adaptive systems" starting in 1967. In 1968 he proposed the schema theorem, and in 1975 he published his landmark book Adaptation in Natural and Artificial Systems, which laid the theoretical foundation of genetic algorithms and introduced the crossover genetic operator.
6 Evolutionary Programming. Proposed by L. J. Fogel in 1966 and refined by his son D. B. Fogel in 1991. The goal of EP is to achieve intelligent behavior through simulated evolution. L. Fogel wanted a model different from the expert systems of artificial intelligence, one that would remove the dependence on human design and adapt on its own. Starting from an evolutionary viewpoint, he regarded intelligence as a product of natural selection: rather than simulating human reasoning as expert systems do, the system directly evolves the behavior it needs. Genetic Programming: proposed by J. R. Koza in 1992. Genetic Algorithms, Genetic Programming, Evolution Strategies, and Evolutionary Programming together form the four main branches of Evolutionary Computation.
7 Evolutionary Algorithms Genetic Algorithms (GA) Genetic Programming (GP) Evolution Strategies (ES) Evolutionary Programming (EP)
8 Problem solving with evolutionary algorithms. Problem-specific knowledge enters through the coding of solutions, the objective function, and the genetic operators. The genetic search loop: fitness assignment, selection, replication, then recombination (crossover and mutation).
9 ES and EP: in each generation, selection is followed by reproduction via genetic operators.
10 GAs

Simple_Genetic_Algorithm()
{
    initialize population;
    evaluate population;
    while (termination criterion not reached)
    {
        select solutions for the next population (reproduction);
        perform crossover and mutation;
        evaluate population;
    }
}
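As a concrete rendering of this loop, here is a minimal Python sketch. The Gaussian fitness function, 5-bit encoding, and parameter values are illustrative choices borrowed from the worked example later in the deck; the function names are my own, not part of the pseudocode.

```python
import math
import random

def fitness(bits):
    # Decode the 5-bit string to an integer and score it with a
    # Gaussian bump centred at 16 (illustrative objective).
    x = int("".join(map(str, bits)), 2)
    return math.exp(-((x - 16) ** 2) / (2 * 4 ** 2))

def select(population, scores):
    # Roulette-wheel (fitness-proportional) selection.
    return random.choices(population, weights=scores, k=len(population))

def crossover(a, b, pc=0.75):
    # Single-point crossover with probability pc.
    if random.random() < pc:
        point = random.randint(1, len(a) - 1)
        return a[:point] + b[point:], b[:point] + a[point:]
    return a[:], b[:]

def mutate(bits, pm=0.02):
    # Flip each bit independently with probability pm.
    return [1 - g if random.random() < pm else g for g in bits]

def simple_ga(pop_size=4, length=5, generations=50):
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in population]
        parents = select(population, scores)
        next_pop = []
        for i in range(0, pop_size, 2):
            c1, c2 = crossover(parents[i], parents[i + 1])
            next_pop += [mutate(c1), mutate(c2)]
        population = next_pop
    return max(population, key=fitness)

random.seed(1)
print(simple_ga())
```

With such a tiny population the run is noisy, but the best individual after 50 generations typically decodes to a value near 16.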
11 Basics of GAs. A genetic algorithm is a search procedure based on the mechanics of natural selection and genetics. The algorithm starts with a set of solutions (represented by chromosomes) called a population. Solutions from one population are taken and used to form a new population, in the hope that the new population will be better than the old one. Solutions selected to form new solutions (offspring) are chosen according to their fitness: the more suitable they are, the more chances they have to reproduce. This is repeated until some condition (for example, a number of generations, or improvement of the best solution) is satisfied. GAs require two things: survival of the fittest, and variation.
13 Two important genetic operators: crossover and mutation.
14 Single-point crossover

parents:   0000000000000000   1111111111111111
children:  0001111111111111   1110000000000000   (crossover point after bit 3)

Drawback: positional bias. Genes at loci 1 and 2 are almost always inherited together, since few crossover points fall between them.
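A minimal sketch of single-point crossover on bit strings; the function name and the string representation are my own conventions:

```python
import random

def single_point_crossover(a, b, rng=random):
    # Choose one cut point, then swap the tails of the two parents.
    assert len(a) == len(b)
    point = rng.randint(1, len(a) - 1)
    return a[:point] + b[point:], b[:point] + a[point:]

random.seed(0)
c1, c2 = single_point_crossover("0000000000000000", "1111111111111111")
print(c1)
print(c2)
```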
15 Multipoint crossover

parents:   0000000000000000   1111111111111111
children:  0000111110000000   1111000001111111   (crossover points after bits 4 and 9)
16 Uniform crossover. Each gene is taken from either parent with probability 50%; the second child is the bitwise inverse of the first.

parents:   0000000000000000   1111111111111111
children:  0100110010110010   1011001101001101
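A sketch of uniform crossover, again on bit strings (names are my own):

```python
import random

def uniform_crossover(a, b, rng=random):
    # Each position is swapped with probability 1/2; the second
    # child always receives the gene the first child did not take.
    c1, c2 = [], []
    for ga, gb in zip(a, b):
        if rng.random() < 0.5:
            c1.append(ga)
            c2.append(gb)
        else:
            c1.append(gb)
            c2.append(ga)
    return "".join(c1), "".join(c2)

random.seed(0)
c1, c2 = uniform_crossover("0000000000000000", "1111111111111111")
print(c1)
print(c2)
```

With an all-0 and an all-1 parent, the second child is exactly the bitwise inverse of the first, as on the slide.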
17 Point mutation: flip a single bit.

parent: 0100110010110010
child:  0100110010100010   (bit 12 flipped)
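Point mutation is usually applied per bit with a small probability pm; a sketch (function name mine):

```python
import random

def point_mutate(bits, pm, rng=random):
    # Flip each bit independently with probability pm.
    out = []
    for b in bits:
        if rng.random() < pm:
            out.append("0" if b == "1" else "1")
        else:
            out.append(b)
    return "".join(out)

random.seed(0)
print(point_mutate("0100110010110010", pm=1 / 16))
```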
18 Crossover for real-valued variables. The original GA was designed for binary-encoded data. Single arithmetic crossover: given parents (x1, x2, …, xn) and (y1, y2, …, yn), pick a gene k and a weight α, and blend only that gene:

child 1: (x1, x2, …, α·yk + (1−α)·xk, …, xn)
child 2: (y1, y2, …, α·xk + (1−α)·yk, …, yn)

Example: parents (0.5, 1.0, 1.5, 2.0) and (0.2, 0.7, 0.2, 0.7), α = 0.4, at the 3rd gene:
(0.5, 1.0, (0.4)(0.2) + (0.6)(1.5), 2.0) = (0.5, 1.0, 0.98, 2.0)
(0.2, 0.7, (0.4)(1.5) + (0.6)(0.2), 0.7) = (0.2, 0.7, 0.72, 0.7)
19 Simple arithmetic crossover: blend every gene from position k to the end; copy the rest.

parents: (x1, x2, …, xn) and (y1, y2, …, yn)
child 1: (x1, …, x(k−1), α·yk + (1−α)·xk, …, α·yn + (1−α)·xn)
child 2: (y1, …, y(k−1), α·xk + (1−α)·yk, …, α·xn + (1−α)·yn)
20 Discrete crossover: each gene of the child is chosen with uniform probability from one or the other of the parents' chromosomes.
Parents: (0.5, 1.0, 1.5, 2.0), (0.2, 0.7, 0.2, 0.7)
Child: (0.2, 0.7, 1.5, 0.7)
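A sketch of discrete crossover (function name mine):

```python
import random

def discrete_crossover(x, y, rng=random):
    # Each gene of the child is drawn from one parent or the
    # other with equal probability.
    return tuple(xi if rng.random() < 0.5 else yi
                 for xi, yi in zip(x, y))

random.seed(0)
print(discrete_crossover((0.5, 1.0, 1.5, 2.0), (0.2, 0.7, 0.2, 0.7)))
```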
21 Normally distributed mutation. A random shock may be added to each variable; the shock should be normally distributed, N(0, σ). Suppose the shock is N(µ = 0, σ = 0.1) and pm = 1, so each variable is mutated. If the shocks are 0.05, −0.17, −0.03, 0.08, the chromosome (0.2, 0.7, 1.5, 0.7) is mutated to (0.2 + 0.05, 0.7 − 0.17, 1.5 − 0.03, 0.7 + 0.08) = (0.25, 0.53, 1.47, 0.78).
22 A simple GA at work.
Encode: encode solutions to the problem as a set of numbers.
Define a fitness metric: a number that measures a solution's "goodness".
Evolve: improve the population by a process of Darwinian selection favoring the reproduction of fitter solutions.
Initialize the population by randomly generating N genomes, evaluate the fitness of all individuals in the population, and repeat this loop until the solutions are adequate.
23 Find the maximum of f(x) = N(µ = 16, σ = 4). The maximum is at x = 16, but assume we do not know that! (Plot of f(x) against x, with ticks at 4, 8, 12, 16, 20, 24, 28.) Initialization: pc = 0.75, pm = 0.002. Representation: population size n = 4, chromosome length l = 5, encoding 00000 (0) to 11111 (31).
24 Four initial chromosomes: 00100 (4), 01001 (9), 11011 (27), and 11111 (31). Fitness function = f(x).
25
chromosome  decimal value  fitness    selection probability
00100        4             0.001108   0.04425
01001        9             0.021569   0.86145
11011       27             0.002273   0.09078
11111       31             0.000088   0.00351
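The table's numbers can be reproduced directly. A small script, assuming the fitness is the N(16, 4) density and the selection probability is fitness-proportional (roulette wheel):

```python
import math

def f(x, mu=16.0, sigma=4.0):
    # The N(mu, sigma) density used as the fitness function.
    return (math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
            / (sigma * math.sqrt(2 * math.pi)))

population = ["00100", "01001", "11011", "11111"]
values = [int(c, 2) for c in population]          # 4, 9, 27, 31
fits = [f(v) for v in values]
total = sum(fits)
for c, v, w in zip(population, values, fits):
    print(f"{c}  {v:2d}  {w:.6f}  {w / total:.5f}")
```

The printed rows match the table above to the precision shown.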
26 Selection: 01001 and 11011 are selected. Crossover at the second bit:

parents:  01|001   11|011
children: 01|011 (11)   11|001 (25)

No mutation this time.
27 Selection: 01001 and 00100 are selected. No crossover this time. New population: 00100, 01001, 01011 (11), 11001 (25). 11 is closer to 16!
28
chromosome  decimal value  fitness    selection probability
00100        4             0.001108   0.014527
01001        9             0.021569   0.282783
01011       11             0.045662   0.598657
11001       25             0.007932   0.104003
29 Selection. At the 1st generation, 01001 (9) dominated the fitness measure (86%), so it is selected many times and generates many copies, which impairs the GA's search capability (it is easy to get stuck at a local optimum). This is the crowding phenomenon: a trade-off between variability and fitness.
30 Selection (contd). Boltzmann selection: the selection probability of an individual is proportional to exp(f/T), where T is a temperature that is lowered from high to low over the run. At the beginning, the fitness bias is suppressed, so variability is high and the search covers a large space. At the end, the fitness bias is amplified, so the search converges quickly on the best region found.
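A sketch of Boltzmann selection under the proportional-to-exp(f/T) assumption stated above (function name mine), plus a small demonstration of how the temperature changes the selection pressure:

```python
import math
import random

def boltzmann_select(population, fitnesses, T, rng=random):
    # Selection probability proportional to exp(fitness / T).
    weights = [math.exp(fit / T) for fit in fitnesses]
    return rng.choices(population, weights=weights, k=len(population))

# High T: near-uniform selection (exploration).
# Low T: the fittest individual dominates (exploitation).
fits = [0.1, 0.9]
for T in (10.0, 0.05):
    w = [math.exp(fit / T) for fit in fits]
    print([round(x / sum(w), 3) for x in w])
```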
31 Elitism: requires GAs to retain a certain number of the fittest chromosomes from one generation to the next. Ranking: ranks the chromosomes according to their fitness and selects by rank rather than raw fitness; this reduces the crowding problem, but lowers variability. Etc.
32 Advantages of GA. GAs can search spaces of hypotheses containing complex interacting parts, where the impact of each part on overall hypothesis fitness may be difficult to model. GAs are easily parallelized and can take advantage of the decreasing cost of powerful computer hardware.