An Exemplar-Based Fitness Function with Dynamic Value Based on Sub-Problem Difficulty Steve Rowe June 2006 Srowe -at- cybernet -dot- com Please put "genetic" in the subject if you write.


The GA
Binary chromosome.
Evolutionary Programming (EP) approach:
–100% mutation rate
–0% cross-over (recombination) rate
BUT, a non-EP approach:
–Mutation rate is constant
–Mutation is not restricted to small changes

FSM Encoding
For a state machine with a maximum of N states, on an alphabet with M characters:
–N*(1+M*ceiling(log2(N))) bits are used.
–The genome is divided into N bit fields, one for each state.
–Each state bitfield is divided into a 1-bit flag (isFinal) and M binary numbers, each of which is the number of the state to go to when that character is read.
–The first state is always the start state.
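As a sanity check, the bit count above can be computed directly. A minimal sketch (the function name is mine, not from the slides; it assumes N ≥ 2):

```python
from math import ceil, log2

def genome_bits(n_states, n_symbols):
    """N*(1 + M*ceil(log2(N))): per state, one isFinal bit plus one
    ceil(log2(N))-bit destination number per alphabet symbol."""
    return n_states * (1 + n_symbols * ceil(log2(n_states)))

print(genome_bits(2, 3))   # 8   -- the {a,b,c} example below
print(genome_bits(3, 2))   # 15  -- the 3-state {a,b} example
print(genome_bits(12, 2))  # 108 -- the typical induction size
```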

FSM Encoding Example
2 states max, over the alphabet {a,b,c}:
–Bits = 2 * (1 + 3 * log2(2)) = 2 * (1+3*1) = 8
–Each state's bitfield: a 1-bit isFinal flag, then one 1-bit destination each for a, b, and c (State 0 first, then State 1).

FSM Encoding Example
The 2-state machine over {a,b,c} that accepts strings with an odd number of 'b's:
–State 0 (start, not final): a→0, b→1, c→0; bitfield 0010
–State 1 (final): a→1, b→0, c→1; bitfield 1101
–The machine loops on a and c, and toggles between the two states on b.

FSM Encoding Example
3 states max, over the alphabet {a,b}:
–Bits = 3 * (1 + 2 * ceil(log2(3))) = 3 * (1+2*2) = 15
–Each state's bitfield: a 1-bit isFinal flag, then one 2-bit destination each for a and b (States 0, 1, 2 in order).

FSM Encoding Example
A machine that accepts only strings in (ab)*.
–That is, ε, ab, abab, ababab, etc.
–State 0 (start, final): a→1, b→2
–State 1 (not final): a→2, b→0
–State 2 (dead state, not final): a→2, b→2

Encoding Notes
If the number of states is not a power of 2, it is possible to encode a "broken" FSM where there is a transition to a non-existent state.
–My implementation guarantees unbroken FSMs by taking the destination modulo the number of states.
Typical inductions are on {a,b} with 12 states (108 bits per genome), so the search space is about 3.25×10^32. That's about 10^19 years to search at a million genomes per second.
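The decoding scheme and the modulo fix fit in a few lines. This is a sketch, not the author's implementation; the function names are hypothetical:

```python
from math import ceil, log2

def decode_fsm(genome, n_states, alphabet):
    """Split a bit string into per-state fields: a 1-bit isFinal flag,
    then one ceil(log2(N))-bit destination per alphabet symbol."""
    dest_bits = ceil(log2(n_states))
    field = 1 + len(alphabet) * dest_bits
    final, delta = [], []
    for s in range(n_states):
        bits = genome[s * field:(s + 1) * field]
        final.append(bits[0] == '1')
        row = {}
        for i, ch in enumerate(alphabet):
            d = bits[1 + i * dest_bits:1 + (i + 1) * dest_bits]
            row[ch] = int(d, 2) % n_states  # modulo guarantees an unbroken FSM
        delta.append(row)
    return final, delta

def accepts(genome, n_states, alphabet, string):
    final, delta = decode_fsm(genome, n_states, alphabet)
    state = 0  # state 0 is always the start state
    for ch in string:
        state = delta[state][ch]
    return final[state]

# The odd-number-of-'b's genome from the earlier example:
odd_b = "00101101"
print(accepts(odd_b, 2, "abc", "ab"))    # True
print(accepts(odd_b, 2, "abc", "abba"))  # False
```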

The Fitness Function
How do you test whether a FSM accepts a given regular language?
–RLs are potentially infinite in size.
Answer: Create a representative sample of strings that are in the language, and count the number of those that are accepted.
Oops: A single accepting state that loops to itself on both a and b accepts all strings over {a,b}, yet scores perfectly on such a sample.

The Fitness Function
So we also need a representative sample of strings that are not in the language.
The result is a set of strings, each with a flag indicating whether it is in the language.

The Fitness Function
During evaluation, we run the FSM that corresponds to each (unique) member of the population on each string in the test set.
If the FSM accepts a string that is in the language, or rejects a string that is not, it scores a "correct".
The score for a FSM = correct / total tests.
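The scoring rule is just the fraction of correct classifications. A minimal sketch (names are mine), using a plain predicate in place of a decoded FSM, also shows how a near-miss hypothesis scores well but not perfectly:

```python
def fitness(accepts_fn, test_set):
    """Fraction of exemplars classified correctly: a 'correct' means
    accepting an in-language string or rejecting an out-of-language one.
    test_set is a list of (string, in_language) pairs."""
    correct = sum(1 for s, in_lang in test_set if accepts_fn(s) == in_lang)
    return correct / len(test_set)

# Exemplars for "odd number of 'b's":
tests = [("b", True), ("ab", True), ("", False), ("bb", False), ("a", False)]

# A wrong hypothesis -- "contains at least one 'b'" -- still scores 0.8,
# the kind of attractive local maximum discussed later.
contains_b = lambda s: "b" in s
print(fitness(contains_b, tests))  # 0.8
```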

Initial Results
For the language consisting of strings with both "ab" and "ba" as substrings, a solution is:
–(FSM diagram omitted from the transcript)

Initial Results
But it's easier to see as a state diagram (omitted from the transcript).
Even with high mutation, it takes on average 66,000 generations to converge.

Problems
As with many genetic programming applications, changing 1 bit can turn a successful genome into a complete failure.
–This is not as much of a problem here, because a 1-bit change only alters the isFinal flag for a single state, or the state transition for a single symbol from a single state.
There can be some big, attractive local maxima with the given fitness scheme (e.g., scoring 19 out of 20 when that last test case is nothing like the others).

One Solution
Bigger genomes?
–Theory: It is sometimes easier to build a Rube Goldberg FSM than to build an elegant, compact one.
–So: Allow the maximum number of states to be larger, and convergence seems to happen faster.
–But that's a bad trade, because increasing the number of states also multiplies the size of the search space.

Think the Problem Through
With this fitness paradigm, a local maximum trap happens when some feature of the target language is under-represented in the test set.
A FSM that accepts the language sans that feature will score well, but not perfectly, and may be very different in implementation from one that accepts the correct language.

Missed It by This Much
The machine shown here (diagram omitted from the transcript) accepts strings in b*a+b(a|b)*, which is close to the target, so it scores well. But it doesn't reject "abbbb", which contains "ab" but not "ba".
This is a local maximum that is difficult to leave.

Another Approach
Pick the tests that are often failed and give them more weight.
–Score them higher, OR
–Score others lower.
I use:
–weight = incorrect / number of evaluations
–       = (evaluations − correct) / evaluations
Weight adjustments are made once per generation, so they are not very expensive.
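The per-generation reweighting can be sketched like this (my names, not the author's code; it assumes weights start at 1 and per-case stats are recorded during evaluation):

```python
class TestCase:
    """One exemplar string plus the running stats used to reweight it."""
    def __init__(self, string, in_language):
        self.string = string
        self.in_language = in_language
        self.evaluations = 0  # times any FSM was run on this string
        self.correct = 0      # times the FSM classified it correctly
        self.weight = 1.0

    def record(self, accepted):
        self.evaluations += 1
        if accepted == self.in_language:
            self.correct += 1

    def update_weight(self):
        # weight = incorrect / evaluations = (evaluations - correct) / evaluations
        if self.evaluations:
            self.weight = (self.evaluations - self.correct) / self.evaluations

def weighted_fitness(accepts_fn, test_set):
    """Weighted fraction correct; cases everyone gets right count for little."""
    total = sum(t.weight for t in test_set)
    score = 0.0
    for t in test_set:
        accepted = accepts_fn(t.string)
        t.record(accepted)
        if accepted == t.in_language:
            score += t.weight
    return score / total if total else 0.0

# An always-accepting machine gets "ab" right and "bb" wrong:
cases = [TestCase("ab", True), TestCase("bb", False)]
accept_all = lambda s: True
print(weighted_fitness(accept_all, cases))  # 0.5
for c in cases:
    c.update_weight()  # "ab" was always correct -> weight 0; "bb" -> weight 1
print(weighted_fitness(accept_all, cases))  # 0.0
```

The second call shows the devaluation at work: the case the machine always gets right has become worthless, so its score collapses.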

Implications
The elite in the population will have their fitnesses lowered relative to individuals that got the "rare" problems right.
–This limits the dominance of dynasties, because:
The more members of a population that get a problem right, the less that problem is worth.
The better a member scores, the more copies of it there will be.
Therefore, the very problems that a well-scoring member is getting right are the ones that are being devalued the fastest!

Fitness versus Generation
The fitness of the best individual (blue trace) and the mean population fitness (red trace) are shown versus generation (plot omitted from the transcript).
Circled are the characteristic steady declines in the fitness value of the elite caused by the dynamic fitness function.

Results
Over many different languages and thousands of trials, the dynamic fitness function is superior:
–2 orders of magnitude faster convergence
–3 orders of magnitude smaller variance

Conclusion
I have presented a fitness function based on exemplar data rather than a closed-form function.
I have shown that this fitness function can be used to induce FSMs from example strings in a language.
I have introduced an optimization technique for varying the value of individual test cases within the test set that yields much faster convergence.