Evolutionary Computational Intelligence Lecture 6a: Multimodality

Multimodality Most interesting problems have more than one locally optimal solution and our goal is to detect all of them

Multi-Objective Problems (MOPs) A wide range of problems is characterised by the presence of n possibly conflicting objectives: – buying a car: speed vs. price vs. reliability The problem has two parts: – finding a set of good solutions – choosing the best one for the particular application

MOP Car Example I want to buy a car. I would like it to be as cheap as possible (minimize f1) and as comfortable as possible (maximize f2). If I consider the two functions separately I obtain: – min f1 – max f2

MOPs 1: Conventional approaches rely on using a weighting of the objective function values to give a single scalar objective function, f(x) = w1 f1(x) + w2 f2(x) + … + wn fn(x), which can then be optimised; to find other solutions one has to re-optimise with different weights wi.
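
A minimal sketch of this weighted-sum scalarisation, assuming all objectives are written as minimisation (a maximisation objective such as comfort can be negated first); the function and the numbers are illustrative, not from the lecture:

def weighted_sum(objectives, weights):
    # collapse the vector of objective values into a single scalar
    return sum(w * f for w, f in zip(weights, objectives))

# car example: f1 = price, f2 = discomfort (i.e. negated comfort)
candidate = (12000.0, 3.5)
print(weighted_sum(candidate, weights=(0.7, 0.3)))
# other trade-offs require re-optimising with different weights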

MOPs 2: Dominance We say x dominates y if it is at least as good on all criteria and better on at least one. [Figure: objective space with axes f1 and f2, showing the region dominated by x and the Pareto front.]
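
A minimal sketch of the dominance test, assuming minimisation on every objective (names are illustrative):

def dominates(x, y):
    # x dominates y: at least as good everywhere, strictly better somewhere
    return (all(a <= b for a, b in zip(x, y))
            and any(a < b for a, b in zip(x, y)))

print(dominates((1.0, 2.0), (1.0, 3.0)))  # True: equal on f1, better on f2
print(dominates((1.0, 2.0), (0.5, 3.0)))  # False: worse on f1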

Implications for Evolutionary Optimisation Two main approaches to diversity maintenance: Implicit approaches (decision space): – Impose an equivalent of geographical separation – Impose an equivalent of speciation Explicit approaches (fitness): – Make similar individuals compete for resources (fitness) – Make similar individuals compete with each other for survival

Implicit 1: “Island” Model Parallel EAs Periodic migration of individual solutions between populations. [Figure: several EA populations linked by migration paths.]

Island Model EAs: Run multiple populations in parallel, in some kind of communication structure (usually a ring or a torus). After a (usually fixed) number of generations (an epoch), exchange individuals with neighbours. Repeat until the ending criterion is met. Partially inspired by parallel/clustered systems.
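
The loop below is a toy sketch of this scheme on a ring of islands; the inner EA step and the fitness function are stand-ins chosen for illustration, not the lecture's algorithm:

import random

def fitness(x):
    return -(x - 3.0) ** 2                      # toy fitness, maximised at x = 3

def one_generation(pop):
    # toy EA step: mutate each individual, keep the better of parent and child
    return [max(p, p + random.gauss(0.0, 0.1), key=fitness) for p in pop]

def island_model(n_islands=4, pop_size=10, epochs=5, epoch_len=20):
    islands = [[random.uniform(-10.0, 10.0) for _ in range(pop_size)]
               for _ in range(n_islands)]
    for _ in range(epochs):
        for isl in islands:                     # evolve each island for one epoch
            for _ in range(epoch_len):
                isl[:] = one_generation(isl)
        for i, donor in enumerate(islands):     # ring migration after each epoch
            recipient = islands[(i + 1) % n_islands]
            worst = min(range(len(recipient)), key=lambda j: fitness(recipient[j]))
            recipient[worst] = max(donor, key=fitness)
    return max((x for isl in islands for x in isl), key=fitness)

print(island_model())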

Island Model Parameter Setting The idea is simple, but its success depends on a proper parameter setting: the number of “islands”, i.e. the basins of attraction we are considering, must somehow be known, and the population size of each separate island must be set. If some a priori information regarding the fitness landscape is available, the island model can be efficient; otherwise it is likely to fail.

Implicit 2: Diffusion Model Parallel EAs Impose a spatial structure (usually a grid) on a single population. [Figure: grid of individuals highlighting the current individual and its neighbours.]

Diffusion Model EAs Consider each individual to exist at a point on a grid. Selection (hence recombination) and replacement happen using the concept of a neighbourhood, a.k.a. deme. This leads to different parts of the grid searching different parts of the space; good solutions diffuse across the grid over a number of generations.

Diffusion Model Example Assume a rectangular grid, so each individual has 8 immediate neighbours. For each point we can consider a population made up of 9 individuals. One of the 8 remaining points is selected (e.g. by means of a roulette wheel), recombination between the starting and the selected point occurs, and the offspring replaces the current individual if fitter, in a steady-state logic.
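
A sketch of one such step, assuming a toroidal (wrap-around) rectangular grid; the fitness and recombine functions are supplied by the caller, and the roulette wheel assumes positive fitness values:

import random

def moore_neighbours(grid, r, c):
    # the 8 immediate neighbours of cell (r, c); the grid wraps around
    rows, cols = len(grid), len(grid[0])
    return [grid[(r + dr) % rows][(c + dc) % cols]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

def diffusion_step(grid, r, c, fitness, recombine):
    current = grid[r][c]
    deme = moore_neighbours(grid, r, c)    # the 9-individual deme minus current
    mate = random.choices(deme, weights=[fitness(n) for n in deme])[0]  # roulette wheel
    child = recombine(current, mate)
    if fitness(child) > fitness(current):  # steady-state replacement
        grid[r][c] = child

# toy usage with 2D real-valued individuals
grid = [[(random.random(), random.random()) for _ in range(4)] for _ in range(4)]
diffusion_step(grid, 1, 2,
               fitness=lambda ind: ind[0] + ind[1],
               recombine=lambda a, b: ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))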

Implicit 3: Automatic Speciation Recombination is restricted on the basis of the genotypic structure of the solutions, so that it occurs only amongst individuals of the same species: – by comparing the genotypic distance between solutions against a maximum threshold – by adding a “tag” (a genotypic enlargement) that marks the species each individual belongs to In both cases the approach requires a lot of comparisons, and the computational overhead can be very high.
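
A minimal sketch of the tag-based option; the dict representation and field names are assumptions for illustration:

def can_mate(a, b):
    # recombination is allowed only within the same species (equal tags)
    return a["tag"] == b["tag"]

x = {"genes": [0.1, 0.9], "tag": 3}
y = {"genes": [0.4, 0.2], "tag": 3}
print(can_mate(x, y))   # True: same species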

Explicit 1: Fitness Sharing Restricts the number of individuals within a given niche by “sharing” their fitness, so as to allocate individuals to niches in proportion to the niche fitness. One needs to set the size of the niche, σ_share, in either genotype or phenotype space. Run the EA as normal, but after each generation set f'(i) = f(i) / Σj sh(d(i, j)), where sh(d) = 1 − d/σ_share if d < σ_share and 0 otherwise. The meaning of the distance d is representation dependent.
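
A sketch of this adjustment for real-valued genotypes; the Euclidean distance is one representation-dependent choice, and the helper names are illustrative:

import math

def shared_fitness(pop, raw_fitness, sigma_share):
    def sh(d):
        # triangular sharing function: full share at d = 0, none beyond sigma_share
        return 1.0 - d / sigma_share if d < sigma_share else 0.0
    # f'(i) = f(i) / sum_j sh(d(i, j)); the j = i term gives sh(0) = 1,
    # so the denominator is never zero
    return [raw_fitness(x) / sum(sh(math.dist(x, y)) for y in pop) for x in pop]

pop = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)]
print(shared_fitness(pop, raw_fitness=lambda x: 1.0, sigma_share=1.0))
# the two crowded points share their fitness; the isolated one keeps its full value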

Explicit 2: Crowding Attempts to distribute individuals evenly amongst niches. Relies on the assumption that offspring will tend to be close to their parents. Randomly select a couple of parents and produce 2 offspring; each offspring competes for survival in a pairwise tournament with the most similar parent (steady state), i.e. the parent at minimal distance.
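
A sketch of deterministic crowding, one well-known variant of this scheme; the genetic operators and the distance are supplied by the caller, and the toy operators below are assumptions:

import random

def crowding_step(pop, fitness, distance, recombine, mutate):
    i, j = random.sample(range(len(pop)), 2)    # pick a couple of parents
    p1, p2 = pop[i], pop[j]
    c1, c2 = (mutate(c) for c in recombine(p1, p2))
    # pair each offspring with its most similar parent
    if distance(p1, c1) + distance(p2, c2) > distance(p1, c2) + distance(p2, c1):
        c1, c2 = c2, c1
    if fitness(c1) > fitness(p1):               # pairwise survival tournaments
        pop[i] = c1
    if fitness(c2) > fitness(p2):
        pop[j] = c2

pop = [random.uniform(-5.0, 5.0) for _ in range(10)]
crowding_step(pop, fitness=lambda x: -x * x, distance=lambda a, b: abs(a - b),
              recombine=lambda a, b: ((a + b) / 2.0, (a - b) / 2.0),
              mutate=lambda x: x + random.gauss(0.0, 0.1))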

Fitness Sharing vs. Crowding [Figure: the resulting population distributions under fitness sharing and under crowding.]

Multimodality and Constraints In some cases we are not satisfied by finding all the local optima, but only a subset of them having certain properties (e.g. fitness values). In such cases the combination of algorithmic components can be beneficial; a rather efficient and simple option is to properly combine the components in a cascade.

Fast Evolutionary Deterministic Algorithm (2006) FEDA is composed of: – a Quasi Genetic Algorithm (QGA, 2004) – a Fitness Sharing Selection Scheme (FSS) – a Multistart Hooke-Jeeves Algorithm (HJA)

Quasi Genetic Algorithm

FEDA The set of solutions coming from the QGA (usually a large one) is processed by the FSS. We thus obtain a smaller set of points which have good fitness values and are spread out in the decision space. The HJA is then applied to each of those solutions.

Grounding Grid Problem 1

Grounding Grid Problem 2

Grounding System Problem

Evolutionary Computational Intelligence Lecture 6b: Towards Parameter Control

Motivation 1 An EA has many strategy parameters, e.g.: – mutation operator and mutation rate – crossover operator and crossover rate – selection mechanism and selective pressure (e.g. tournament size) – population size Good parameter values facilitate good performance. Q1: How to find good parameter values?

Motivation 2 EA parameters are rigid (constant during a run) BUT an EA is a dynamic, adaptive process THUS optimal parameter values may vary during a run Q2: How to vary parameter values?

Parameter tuning Parameter tuning: the traditional way of testing and comparing different values before the “real” run. Problems: – user mistakes in the settings can be sources of errors or sub-optimal performance – it costs much time – parameters interact: exhaustive search is not practicable – good values may become bad during the run (e.g. population size)

Parameter Setting: Problems A wrong parameter setting can lead to undesirable algorithmic behaviour, since it can cause stagnation or premature convergence: a too large population size leads to stagnation, a too small population size leads to premature convergence. In some “moments” of the evolution I would like to have a large population size (when I need to explore and prevent premature convergence); in other “moments” I would like to have a small one (when I need to exploit the available genotypes).

Parameter control Parameter control: setting values on-line, during the actual run; I would like the algorithm to “decide” by itself how to properly vary the parameter setting over the run. Some popular options for pursuing this aim are: – a predetermined time-varying schedule p = p(t) – using feedback from the search process – encoding parameters in the chromosomes and relying on natural selection (similar to ES self-adaptation)
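
As a minimal sketch of the first option, here is a predetermined schedule p = p(t) that decays a mutation rate over the run; the exponential form and the constants are illustrative assumptions, not a recommended setting:

def mutation_rate(t, t_max, p_start=0.5, p_end=0.01):
    # deterministic exponential decay from p_start (at t = 0) to p_end (at t = t_max)
    return p_start * (p_end / p_start) ** (t / t_max)

for t in (0, 250, 500, 1000):
    print(t, round(mutation_rate(t, t_max=1000), 4))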

Related Problems Problems: – finding the optimal p is hard, and finding the optimal p(t) is harder still – with a user-defined feedback mechanism, how does one “optimize” it? – when would natural selection work for strategy parameters? Provisional answer: in agreement with the No Free Lunch Theorem, an optimal control strategy does not exist. Nevertheless, there are plenty of interesting proposals that can perform very well on some problems. Some of these strategies are very problem oriented, while others are much more robust and thus applicable in a fairly wide spectrum of optimization problems.