Theory Chapter 11
A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing

Overview (reduced w.r.t. book)
Motivations and problems
Holland's Schema Theorem
Markov Chain Models
No Free Lunch?

Why Bother with Theory?
Might provide performance guarantees
– Convergence to the global optimum can be guaranteed, providing certain conditions hold
Might aid better algorithm design
– Increased understanding can be gained about operator interplay etc.
Mathematical models of EAs also inform theoretical biologists
Because you never know…

Problems with Theory?
EAs are vast, complex dynamical systems with many degrees of freedom
The types of problem on which they do well are precisely those that are hard to model
The degree of randomness involved means
– stochastic analysis techniques must be used
– results tend to describe average behaviour
After 100 years of work in theoretical biology, researchers are still using fairly crude models of very simple systems…

Holland's Schema Theorem
A schema (pl. schemata) is a string over a ternary alphabet (0, 1, and # = "don't care") representing a hyperplane within the solution space
– E.g. 0001##1##0#, ##1##0## etc.
Two values can be used to describe schemata:
– the order o(H): the number of defined positions (6 and 2 in the examples above)
– the defining length d(H): the distance between the outermost defined positions (9 and 3 above)

Example Schemata

H     o(H)  d(H)
111    3     2
1##    1     0
#01    2     1
1#0    2     2
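To make the two measures concrete, here is a minimal Python sketch (not part of the original slides; the function names are illustrative) that computes o(H) and d(H) and reproduces the values above:

```python
def order(H):
    """o(H): the number of defined (non-#) positions."""
    return sum(1 for c in H if c != '#')

def defining_length(H):
    """d(H): distance between the outermost defined positions."""
    defined = [i for i, c in enumerate(H) if c != '#']
    return defined[-1] - defined[0] if defined else 0

# The examples from the previous slide: o = 6, 2 and d = 9, 3
print(order("0001##1##0#"), defining_length("0001##1##0#"))  # 6 9
print(order("##1##0##"), defining_length("##1##0##"))        # 2 3
```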

Schema Fitnesses
The true "fitness" of a schema H is taken by averaging over all possible values in the "don't care" positions, but this is effectively sampled by the population, giving an estimated fitness f(H)
With fitness-proportionate selection, the probability of selecting an instance of H is
P_s(instance of H) = n(H,t) · f(H,t) / (μ · ⟨f⟩)
where n(H,t) is the number of instances of H, μ the population size and ⟨f⟩ the mean population fitness; therefore the proportion in the next parent pool is
m'(H,t+1) = m(H,t) · f(H,t) / ⟨f⟩
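A sketch of this selection-only growth equation (variable names are illustrative, not from the slides):

```python
def next_proportion(m_H, f_H, f_bar):
    """m'(H,t+1) = m(H,t) * f(H,t) / <f> under
    fitness-proportionate selection (selection only, no variation)."""
    return m_H * f_H / f_bar

# A schema whose instances are 25% fitter than average grows by 25%:
print(next_proportion(0.1, 1.25, 1.0))  # 0.125
```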

Schema Disruption I
One-point crossover selects a crossover point at random from the l−1 possible points
For a schema with defining length d(H), the random point will fall inside the schema with probability d(H) / (l−1)
If recombination is applied with probability P_c, the survival probability is 1 − P_c · d(H) / (l−1)
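The d(H)/(l−1) term can be checked empirically; a small Monte Carlo sketch (assuming cut points numbered 1..l−1, with illustrative parameter values):

```python
import random

def disruption_prob_estimate(l, d, start, trials=100_000):
    """Estimate how often a random one-point crossover cut falls between
    the outermost defined positions of a schema spanning [start, start+d]."""
    hits = 0
    for _ in range(trials):
        cut = random.randint(1, l - 1)  # cut after position `cut`
        if start <= cut < start + d:    # cut separates the defined ends
            hits += 1
    return hits / trials

# Defining length 3 in a string of length 8: predicted 3/7 ~ 0.43
print(disruption_prob_estimate(l=8, d=3, start=2))
```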

Schema Disruption II
The probability that bit-wise mutation with probability P_m will NOT disrupt the schema is simply the probability that mutation does NOT occur in any of the defining positions:
P_survive(mutation) = (1 − P_m)^o(H) = 1 − o(H) · P_m + terms in P_m² + …
For low mutation rates, this survival probability under mutation approximates to 1 − o(H) · P_m
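A two-line sketch comparing the exact survival probability with its first-order approximation (the P_m and o(H) values are illustrative):

```python
p_m, o_H = 0.01, 5
exact  = (1 - p_m) ** o_H   # mutation misses every defined position
approx = 1 - o_H * p_m      # first-order approximation for small p_m
print(exact, approx)        # ~0.95099 vs 0.95
```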

The Schema Theorem
Put together, the proportion of a schema H in successive generations varies as:
m(H,t+1) ≥ m(H,t) · (f(H,t) / ⟨f⟩) · [1 − P_c · d(H)/(l−1)] · (1 − P_m)^o(H)
The condition for a schema to increase its representation is:
(f(H,t) / ⟨f⟩) · [1 − P_c · d(H)/(l−1)] · (1 − P_m)^o(H) > 1
The inequality is due to convergence affecting crossover disruption; exact versions have been developed
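A minimal sketch of the bound, combining the pieces from the preceding slides (the function name and test values are illustrative):

```python
def schema_growth_bound(m_H, f_H, f_bar, d_H, o_H, l, p_c, p_m):
    """Schema Theorem lower bound:
    m(H,t+1) >= m(H,t) * (f(H,t)/<f>) * (1 - p_c*d(H)/(l-1)) * (1 - p_m)**o(H)
    """
    return (m_H * (f_H / f_bar)
                * (1 - p_c * d_H / (l - 1))
                * (1 - p_m) ** o_H)

# A fit, short, low-order schema is predicted to grow:
bound = schema_growth_bound(m_H=0.1, f_H=1.5, f_bar=1.0,
                            d_H=2, o_H=3, l=20, p_c=0.7, p_m=0.01)
print(bound, bound > 0.1)  # ~0.135, True
```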

Implications 1: Operator Bias
One-point crossover
– less likely to disrupt schemata which have short defining lengths relative to their order, as it will tend to keep adjacent genes together
– this is an example of positional bias
Uniform crossover
– no positional bias, since the choices are independent
– BUT it is far more likely to pick 50% of the bits from each parent, and less likely to pick (say) 90% from one
– this is called distributional bias
Mutation
– also shows distributional bias, but not positional bias
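Distributional bias is easy to see empirically. A sketch (parameter values illustrative): under uniform crossover the number of genes a child inherits from one parent is Binomial(l, 0.5), so near-50/50 splits dominate, whereas one-point crossover makes every split 1..l−1 equally likely.

```python
import random
from collections import Counter

l, trials = 20, 100_000
# Uniform crossover: each gene comes from parent 1 with probability 0.5
counts = Counter(sum(random.random() < 0.5 for _ in range(l))
                 for _ in range(trials))
print(counts[10] / trials)  # ~0.176: a 10/10 split is common
print(counts[18] / trials)  # ~0.0002: an 18/2 split is rare
# One-point crossover would give every split the same probability 1/(l-1)
```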

Operator Biases ctd
Operator bias has been extensively studied empirically by Eshelman and Schaffer, and theoretically by Spears & De Jong
Results emphasise the importance of utilising all available problem-specific knowledge when choosing a representation and operators for a new problem

Implications 2: The Building Block Hypothesis
Closely related to the Schema Theorem is the "Building Block Hypothesis" (Goldberg 1989)
This suggests that genetic algorithms work by discovering and exploiting "building blocks" – groups of closely interacting genes – and then combining these (via crossover) to produce successively larger building blocks until the problem is solved
Has motivated the study of deceptive problems (see the sketch below)
– based on the notion that the lower-order schemata within a partition lead the search in the opposite direction to the global optimum, i.e. for a k-bit partition there are dominant epistatic interactions of order k−1
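A classic example is the k-bit "trap" function, sketched below (an illustrative construction, not taken from the slides): every schema of order below k points the search toward all-zeros, while the global optimum is all-ones.

```python
def trap(bits):
    """k-bit deceptive trap: global optimum at all-ones,
    but the fitness gradient leads toward all-zeros."""
    u, k = sum(bits), len(bits)
    return k if u == k else k - 1 - u

print(trap([1, 1, 1, 1]))  # 4: the global optimum
print(trap([0, 0, 0, 0]))  # 3: the deceptive attractor
print(trap([1, 1, 1, 0]))  # 0: the optimum's neighbours score worst
```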

Criticisms of the Schema Theorem
It presents an inequality that does not take into account the constructive effects of crossover and mutation
– exact versions have been derived
– these have links to Price's theorem in biology
Because the mean population fitness, and the estimated fitness of a schema, will vary from generation to generation, it says nothing about generation t+2 etc.
"Royal Road" problems, constructed to be GA-easy on the basis of the schema theorem, turned out to be better solved by random mutation hill-climbers
BUT it remains a useful conceptual tool and has historical importance

Other Landscape Metrics
As well as epistasis and deception, several other features of search landscapes have been proposed as explanations of what sort of problems will prove hard for GAs:
– fitness-distance correlation
– the number of peaks present in the landscape
– the existence of plateaus
All of these imply a neighbourhood structure on the search space
It must be emphasised that these only hold for one operator
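Fitness-distance correlation, for instance, is simple to estimate by sampling. A sketch using OneMax, where fitness and distance to the optimum are perfectly anti-correlated (the sample size is an illustrative choice):

```python
import random
from statistics import correlation  # Python 3.10+

l = 20
sample = [[random.randint(0, 1) for _ in range(l)] for _ in range(1000)]
fitness  = [sum(x) for x in sample]      # OneMax: count the ones
distance = [l - sum(x) for x in sample]  # Hamming distance to all-ones
print(correlation(fitness, distance))    # -1.0: a GA-easy landscape
```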

Markov Chain Analysis
A system is called a Markov chain if
– it can exist only in one of a finite number of states
– so it can be described by a variable X_t
– the probability of being in any state at time t+1 depends only on the state at time t
Markov chains have been used to provide convergence proofs, e.g. Eiben et al. 1989 (almost sure convergence of GAs):
– IF the space is connected via the variation operators AND selection is elitist AND two "trivial conditions" hold
– THEN P[generation n contains the optimum] → 1 as n → ∞
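A toy version of such an argument can be worked out by hand. The sketch below (parameters illustrative) models an elitist (1+1) EA on 2-bit OneMax as a three-state Markov chain over the current best fitness; because the optimum is absorbing and reachable by mutation, the probability of having found it tends to 1:

```python
p = 0.1  # per-bit mutation probability
# Rows = current best fitness 0, 1, 2; elitism rejects fitness-decreasing moves
T = [
    [(1 - p) ** 2, 2 * p * (1 - p), p * p],  # from 00
    [0.0, 1 - p * (1 - p), p * (1 - p)],     # from 01/10 (move to 00 rejected)
    [0.0, 0.0, 1.0],                         # 11 is absorbing
]

dist = [1.0, 0.0, 0.0]  # start from the all-zeros string
for gen in range(1, 201):
    dist = [sum(dist[i] * T[i][j] for i in range(3)) for j in range(3)]
    if gen in (10, 50, 200):
        print(gen, dist[2])  # P[optimum reached] climbs toward 1
```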

No Free Lunch Theorems
In layman's terms:
– averaged over all problems
– for any performance metric related to the number of distinct points seen
– all non-revisiting black-box algorithms display the same performance
Implications:
– a new black-box algorithm that is good for one problem is probably poor for another
– it makes sense not to use pure "black-box" algorithms
There is lots of ongoing work showing counter-examples
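The theorem can even be verified exhaustively at toy scale. A sketch (the setup is illustrative): average, over all 256 functions f: {0,1,2,3} → {0,1,2,3}, the best value found in two evaluations by two different fixed non-revisiting search orders; the averages coincide exactly.

```python
from itertools import product

def perf(order, f, k=2):
    """Best objective value among the first k distinct points visited."""
    return max(f[x] for x in order[:k])

orders = ([0, 1, 2, 3], [3, 1, 0, 2])  # two non-revisiting visiting orders
for order in orders:
    total = sum(perf(order, f) for f in product(range(4), repeat=4))
    print(order, total / 4 ** 4)       # both print 2.125
```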