Biologically Inspired Computing: Selection and Reproduction Schemes This is DWC’s lecture three of `Biologically Inspired Computing’


Reminder You have to read the additional required study material. Generally, lecture k assumes that you have read the additional study material for lectures k-1, k-2, etc.

Today’s additional material This lecture is mainly about the overall flow of control in some simple EA designs, and about possible methods to use for the SELECTION step. The additional material is about operators (crossover and mutation). Another item of additional material is a book chapter on “Selected Applications of Natural Computation”, which discusses various applications of EAs; this will appear soon in the Springer “Handbook of Natural Computation”.

A steady state, mutation-only, replace-worst EA with tournament selection 0. Initialise: generate a population of popsize random solutions and evaluate their fitnesses. 1. Run Select to obtain a parent solution X. 2. With probability mute_rate, mutate a copy of X to obtain a mutant M (otherwise M = X). 3. Evaluate the fitness of M. 4. Let W be the current worst in the population (BTR). If M is not less fit than W, then replace W with M (otherwise do nothing). 5. If a termination condition is met (e.g. we have done 10,000 evaluations) then stop; otherwise go to 1. Select: randomly choose tsize individuals from the population; let X be the one with best fitness (BTR), and return X.
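As a minimal sketch, the loop above might look like this in Python. The bitstring representation, the onemax test fitness (count of 1-bits), and the single-bit-flip mutation are illustrative assumptions, not part of the slide:

```python
import random

def steady_state_ea(fitness, length=20, popsize=30, tsize=3,
                    mute_rate=0.9, max_evals=2000, rng=random):
    # 0. Initialise: a population of popsize random bitstrings, evaluated.
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(popsize)]
    fits = [fitness(x) for x in pop]
    evals = popsize
    while evals < max_evals:
        # 1. Select: best of tsize randomly chosen individuals (tournament).
        contenders = [rng.randrange(popsize) for _ in range(tsize)]
        x = max(contenders, key=lambda i: fits[i])
        # 2. With probability mute_rate, mutate a copy of X (one bit flip).
        m = pop[x][:]
        if rng.random() < mute_rate:
            pos = rng.randrange(length)
            m[pos] = 1 - m[pos]
        # 3. Evaluate the mutant M.
        mf = fitness(m)
        evals += 1
        # 4. Replace the current worst W with M, if M is not less fit.
        w = min(range(popsize), key=lambda i: fits[i])
        if mf >= fits[w]:
            pop[w], fits[w] = m, mf
    # 5. Termination: max_evals evaluations done; report the best fitness.
    return max(fits)

# Illustrative run on onemax (maximise the number of 1-bits).
best = steady_state_ea(sum)
```

Because the worst member is only ever replaced by something no less fit, the best fitness in the population can never decrease.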

Selection Issues (The slide’s figure shows a fitness landscape; the green blobs indicate the initial population.) With very low pressure selection (e.g. random), there is no evolutionary `progress’ at all. With very high pressure (e.g. always selecting the best), the population quickly converges on a single point of the landscape. With a modest level of pressure, it may end up on any of several good regions.

Example: comparing selection methods on a TSP Aberystwyth Brighton Edinburgh Exeter Glasgow Inverness Liverpool London Newcastle Nottingham Oxford Stratford

These data came from:

Generational/elitist, pop = 20, gens = 100: so 2,000 evaluations. Size of search space: 479,001,600 (= 12!).
Random search, best of 2,000: 2,043 minutes
EA: random selection (but…): 1,995 minutes
EA: Roulette Wheel selection: 1,912 minutes
EA: Rank (bias = 1) selection: 1,892 minutes
EA: Rank (bias = 2) selection: 1,733 minutes
EA: Rank (bias = 3) selection: 1,944 minutes
EA: Rank (bias = 5) selection: 2,131 minutes
EA: Tournament (tsize = 2) selection: 1,895 minutes
EA: Tournament (tsize = 3) selection: 1,818 minutes

These are the results of one run each – you should never base conclusions on such data. However, they illustrate the point that selection pressure needs to be neither too low (random) nor too high.

Some Selection Methods The grand old method is Fitness Proportionate Selection, also called Roulette Wheel selection. Suppose there are P individuals with fitnesses f1, f2, …, fP, where higher values mean better fitness. The probability of selecting individual i is simply: p_i = f_i / (f1 + f2 + … + fP). This is equivalent to spinning a roulette wheel with sectors proportional to fitness.
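A sketch of one roulette-wheel spin, assuming (as the formula does) that fitnesses are non-negative and higher is better:

```python
import random

def roulette_select(fitnesses, rng=random):
    # p_i = f_i / (f1 + ... + fP): spin a wheel whose sectors are
    # proportional to fitness (assumes non-negative fitnesses).
    total = sum(fitnesses)
    spin = rng.uniform(0, total)
    cumulative = 0.0
    for i, f in enumerate(fitnesses):
        cumulative += f
        if spin <= cumulative:
            return i
    return len(fitnesses) - 1  # guard against floating-point round-off
```

Over many spins, individual i is returned with frequency approximately f_i / total.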

Problems with Roulette Wheel Selection Having probability of selection directly proportional to fitness has a nice ring to it. It is still used a lot, and is convenient for theoretical analyses, but: What about when we are trying to minimise the `fitness’ value? What about when we may have negative fitness values? We can modify things to sort these problems out easily, but fitprop remains too sensitive to the fine detail of the fitness measure. Suppose we are trying to maximise something, and we have a population of 5 fitnesses: 100, 0.4, 0.3, 0.2, … – the best is 100 times more likely to be selected than all the rest put together! But a slight modification of the fitness calculation might give us 200, 100.4, 100.3, 100.2, … – a much more reasonable situation. The point is: fitprop requires us to be very careful about how we design the fine detail of fitness assignment. Other selection methods are better in this respect, and are more used now.

Tournament Selection Tournament size = t. Repeat t times: choose a random individual from the pop and remember its fitness. Return the best of these t individuals (BTR). This is very tunable, avoids the problems of superfit or superpoor solutions, and is very simple to implement.

Rank Based Selection The fitnesses in the pop are ranked (BTR) from Popsize (fittest) down to 1 (least fit), and the selection probabilities are proportional to rank. There are variants in which the selection probabilities are some other function of the rank.

Rank with low bias Here, selective fitnesses are based on rank^0.5

Rank with high bias Here, selective fitnesses are based on rank^2
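One plausible implementation of rank selection with a bias parameter, assuming the bias acts as an exponent on the rank (consistent with the low-bias and high-bias examples); bias = 1 recovers plain proportional-to-rank selection:

```python
import random

def rank_select(fitnesses, bias=1.0, rng=random):
    # Rank the population from 1 (least fit) up to P (fittest);
    # the selective fitness of rank r is r ** bias.
    order = sorted(range(len(fitnesses)), key=lambda i: fitnesses[i])
    weights = [(r + 1) ** bias for r in range(len(fitnesses))]
    total = sum(weights)
    spin = rng.uniform(0, total)
    cumulative = 0.0
    for pos, w in enumerate(weights):
        cumulative += w
        if spin <= cumulative:
            return order[pos]
    return order[-1]  # guard against floating-point round-off
```

Because only ranks matter, this is insensitive to the fine detail of the fitness values that troubles fitprop.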

Tournament Selection Parameter: tournament size, t. To select a parent, randomly choose t individuals from the population (with replacement), and return the fittest of these t (BTR). What happens to selection pressure as we increase t? What degree of selection pressure is there if t = 10 and popsize = 10,000?
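A sketch of the procedure just described; reading `BTR’ as “break ties randomly” is an assumption here:

```python
import random

def tournament_select(fitnesses, t=3, rng=random):
    # Choose t individuals uniformly at random, with replacement.
    contenders = [rng.randrange(len(fitnesses)) for _ in range(t)]
    # Return the fittest of the t, breaking fitness ties randomly.
    best = max(fitnesses[i] for i in contenders)
    return rng.choice([i for i in contenders if fitnesses[i] == best])
```

As t grows, weaker individuals are less and less likely to survive a tournament, so selection pressure rises with t.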

Truncation selection Applicable only in generational algorithms, where each generation involves replacing most or all of the population. Parameter pcg (ranging from 0 to 100%) Take the best pcg% of the population (BTR); produce the next generation entirely by applying variation operators to these. How does selection pressure vary with pcg ?
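Truncation selection can be sketched as a single deterministic step; returning indices (rather than individuals) is an implementation choice made here:

```python
def truncation_select(fitnesses, pcg=20):
    # Return the indices of the best pcg% of the population; the next
    # generation is then bred entirely from these via variation operators.
    n = max(1, round(len(fitnesses) * pcg / 100))
    ranked = sorted(range(len(fitnesses)),
                    key=lambda i: fitnesses[i], reverse=True)
    return ranked[:n]
```

Selection pressure falls as pcg rises: pcg = 100% means no selection at all, while a very small pcg breeds the whole next generation from a handful of elites.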

Spatially Structured Populations: Local Mating (Collins and Jefferson) The pop is spatially organised (each individual has co-ordinates); LM is a combined selection/replacement strategy. Parameter: w, the length of a random walk. Each step works as follows: 1. Choose a random cell. 2. Take a random walk of length w from that cell. 3. Selected: the fittest individual encountered on the walk. If doing a crossover, then do another random walk from the same cell to get another parent; if doing mutation, we just use the one we already have. Then … 4. The child replaces the individual in the starting cell, if its fitness is >=.
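A minimal sketch of one Local Mating step, mutation-only for simplicity; the toroidal (wrap-around) grid, the bitstring individuals, and the bit-flip operator are illustrative assumptions:

```python
import random

def flip_bit(ind, rng):
    # Hypothetical variation operator: flip one random bit of a copy.
    ind = ind[:]
    pos = rng.randrange(len(ind))
    ind[pos] = 1 - ind[pos]
    return ind

def local_mating_step(grid, fitness, w=4, rng=random):
    # One steady-state step of Local Mating on a 2-D grid of individuals.
    rows, cols = len(grid), len(grid[0])
    # 1. Choose a random cell.
    sr, sc = rng.randrange(rows), rng.randrange(cols)
    # 2-3. Random walk of length w; select the fittest cell encountered.
    br, bc = sr, sc
    r, c = sr, sc
    for _ in range(w):
        dr, dc = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        r, c = (r + dr) % rows, (c + dc) % cols  # wrap-around topology
        if fitness(grid[r][c]) > fitness(grid[br][bc]):
            br, bc = r, c
    # 4. The child replaces the starting cell if it is at least as fit.
    child = flip_bit(grid[br][bc], rng)
    if fitness(child) >= fitness(grid[sr][sc]):
        grid[sr][sc] = child
```

Selection pressure is controlled by w: short walks keep competition local and preserve diversity across the grid; long walks approach panmictic (whole-population) selection.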

Spatially Structured Populations: The ECO Method (Davidor) The pop is spatially organised (each individual has co-ordinates); ECO is another combined selection/replacement strategy. Each individual has a Neighbourhood, consisting of itself and the eight individuals immediately surrounding it (the slide’s figure shows the neighbourhood of one highlighted individual). In ECO, run in a steady state way, each step involves: 1. Choose an individual at random. 2. Run fitness proportionate selection among only the neighbourhood of that individual, selecting a parent. 3. Select parent 2 in the same way. 4. Generate and evaluate a child from these parents (maybe via just crossover, or crossover + mutation – these details are irrelevant to the ECO scheme itself). 5. Use the replace-worst strategy within the neighbourhood to incorporate the child.
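A sketch of one ECO step. The wrap-around neighbourhoods and the uniform crossover are illustrative assumptions, replace-worst is read as “replace the worst if the child is not less fit” (matching the steady-state EA earlier in the lecture), and fitness proportionate selection here assumes strictly positive fitnesses:

```python
import random

def eco_step(grid, fitness, rng=random):
    # One steady-state ECO step on a 2-D grid of individuals.
    rows, cols = len(grid), len(grid[0])
    # 1. Choose an individual at random; its neighbourhood is itself
    #    plus the eight cells immediately surrounding it.
    r, c = rng.randrange(rows), rng.randrange(cols)
    hood = [((r + dr) % rows, (c + dc) % cols)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
    # 2-3. Fitness proportionate selection within the neighbourhood only.
    def fitprop():
        fits = [fitness(grid[i][j]) for i, j in hood]
        spin, cum = rng.uniform(0, sum(fits)), 0.0
        for cell, f in zip(hood, fits):
            cum += f
            if spin <= cum:
                return cell
        return hood[-1]
    (r1, c1), (r2, c2) = fitprop(), fitprop()
    # 4. Generate a child, here via uniform crossover of the two parents.
    child = [a if rng.random() < 0.5 else b
             for a, b in zip(grid[r1][c1], grid[r2][c2])]
    # 5. Replace the worst in the neighbourhood, if the child is no worse.
    wr, wc = min(hood, key=lambda cell: fitness(grid[cell[0]][cell[1]]))
    if fitness(child) >= fitness(grid[wr][wc]):
        grid[wr][wc] = child
```

Because both selection and replacement are confined to small overlapping neighbourhoods, good genes spread only gradually across the grid, which helps maintain diversity.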

(λ,μ) and (λ+μ) schemes A (λ+μ) scheme works as follows (the slide highlights the difference from (λ,μ) in blue): The population size is λ. In each generation, produce μ mutants of the λ population members. This is done by simply randomly selecting a parent from the λ and mutating it, repeating that μ times. Note that μ could be much bigger (or smaller) than λ. Then, the next generation becomes the best λ of the combined set of the current population and the μ children. Is this an elitist strategy?
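One generation of the plus scheme can be sketched as follows, writing the population size as lam and the number of mutants as mu; the bitstring individuals and the single-bit-flip mutation are illustrative assumptions:

```python
import random

def flip_one(ind, rng):
    # Hypothetical mutation operator: flip one random bit of a copy.
    ind = ind[:]
    pos = rng.randrange(len(ind))
    ind[pos] = 1 - ind[pos]
    return ind

def plus_generation(pop, fitness, mu, rng=random):
    # One generation of the plus scheme, with lam = len(pop): produce mu
    # mutants of randomly chosen parents, then keep the best lam of the
    # combined parents-plus-mutants set as the next generation.
    mutants = [flip_one(rng.choice(pop), rng) for _ in range(mu)]
    combined = pop + mutants
    combined.sort(key=fitness, reverse=True)
    return combined[:len(pop)]
```

This answers the slide’s question: because the current population competes with its children for survival, the best solution found so far can never be lost, so the plus scheme is elitist. The comma scheme, which builds the next generation from the children alone, is not.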

That’s it for now The spatially structured population techniques tend to have excellent performance. This is because of their ability to maintain diversity – i.e. they seem much better at maintaining lots of difference within the population, which provides fuel for the evolution to carry on, rather than converging too quickly on a possibly non-ideal answer. Diversity maintenance in general, and more techniques for it, will be discussed in a later lecture.