CAP6938 Neuroevolution and Developmental Encoding
Evolutionary Computation Theory
Dr. Kenneth Stanley
September 13, 2006

Schema Theory (Holland 1975)
A building block is a set of genes with good values. Schemas are a formalization of building blocks: bit strings with *'s (wildcards). For example, 1****0 denotes all 6-bit strings that begin with 1 and end with 0; with 2 defined bits, it is a schema of order 2. A schema defines a hyperplane of the search space, e.g. 1** is the hyperplane of all strings that begin with 1.
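As a concrete rendering of these definitions, here is a minimal Python sketch (my own illustration, not from the slides) of schema matching, order, and defining length over the alphabet {0, 1, *}:

```python
# Schema utilities for bit strings; '*' is the wildcard symbol.
def matches(schema: str, s: str) -> bool:
    """True if bit string s is an instance of the schema."""
    return all(c == '*' or c == b for c, b in zip(schema, s))

def order(schema: str) -> int:
    """Number of defined (non-wildcard) positions."""
    return sum(c != '*' for c in schema)

def defining_length(schema: str) -> int:
    """Distance between the outermost defined positions."""
    defined = [i for i, c in enumerate(schema) if c != '*']
    return defined[-1] - defined[0] if defined else 0

h = "1****0"
print(matches(h, "101010"))   # True: begins with 1, ends with 0
print(matches(h, "001010"))   # False: begins with 0
print(order(h))               # 2 defined bits -> order 2
print(defining_length(h))     # 5: outermost defined positions are 0 and 5
```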

Schema Fitness
A GA implicitly evaluates fitness for all of its schemas. The average fitness of a schema is the average fitness of all possible instances of it. A GA behaves as if it were really storing these averages.
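The implicit averaging can be made explicit in a small sketch. The population and the onemax fitness function below are illustrative assumptions, not part of the lecture:

```python
# Observed schema fitness: average fitness of the population members
# that happen to be instances of a schema.
def matches(schema, s):
    return all(c == '*' or c == b for c, b in zip(schema, s))

def onemax(s):                       # toy fitness: number of 1 bits
    return s.count('1')

def observed_schema_fitness(schema, population, fitness):
    instances = [s for s in population if matches(schema, s)]
    if not instances:
        return None                  # schema has no instances this generation
    return sum(fitness(s) for s in instances) / len(instances)

population = ["110010", "101101", "000111", "111000"]
print(observed_schema_fitness("1****0", population, onemax))  # 3.0
```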

Schema Theorem on Selection
Idea: calculate the approximate dynamics of increases and decreases in schema instances. Let m(H, t) be the number of instances of H at time t and u(H, t) the observed average fitness of H at time t. The goal is to calculate E[m(H, t+1)]. Using the fact that the number of offspring is proportional to fitness:
E[m(H, t+1)] = m(H, t) * u(H, t) / f_bar(t)
where f_bar(t) is the average fitness of the population. Thus, increases or decreases in the number of instances depend on the schema's average fitness.

Schema Theorem with Crossover and Mutation
The question is what the probability is that schema H survives crossover or mutation. Let d(H) be H's defining length, o(H) its order, and l the string length. The probability that H survives single-point crossover is at least
1 - p_c * d(H) / (l - 1)
where p_c is the crossover probability; the bound is higher for shorter schemas. The probability of surviving mutation is
(1 - p_m)^o(H)
where p_m is the per-bit mutation rate.

Total Schema Theorem
The expected number of instances of schema H, taking selection, crossover, and mutation into account:
E[m(H, t+1)] >= m(H, t) * (u(H, t) / f_bar(t)) * [1 - p_c * d(H) / (l - 1)] * (1 - p_m)^o(H)
Meaning: short, low-order schemas whose average fitness remains above the population mean will increase exponentially in number. Reason: the increase of a non-disrupted schema is proportional to u(H, t) / f_bar(t).
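Putting the selection, crossover-survival, and mutation-survival terms together, here is a sketch of the lower bound using the notation reconstructed above; the numeric values are made up purely for illustration:

```python
def expected_instances(m_H, u_H, f_bar, p_c, p_m, d_H, o_H, l):
    """Lower bound on E[m(H, t+1)] under fitness-proportionate selection,
    single-point crossover, and per-bit mutation."""
    selection = m_H * u_H / f_bar                 # selection term
    crossover = 1.0 - p_c * d_H / (l - 1)         # probability of surviving crossover
    mutation  = (1.0 - p_m) ** o_H                # probability of surviving mutation
    return selection * crossover * mutation

# A short, low-order, above-average schema grows in expectation (~10.79 > 10).
print(expected_instances(m_H=10, u_H=1.2, f_bar=1.0,
                         p_c=0.7, p_m=0.01, d_H=2, o_H=3, l=20))
```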

Building Blocks Hypothesis (Goldberg 1989)
Crossover combines good schemas into equally good or better higher-order schemas. That is, crossover (or mutation) is not just destructive: it is part of the power behind the GA.

Questioning the BBH
Why would separately discovered building blocks be compatible? What about speciation? Hybridization is rare in nature; gradual elaboration is safer. Also, the Schema Theorem and the BBH assume fixed-length genomes.

No Free Lunch Theorem (Wolpert and Macready 1996)
An attack on GAs and "black box" optimization: across all possible problems, no optimization method is better than any other. "Elevated performance over one class of problems is exactly paid for in performance over another class." Implication: your method is not the best. Or is it?
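A toy demonstration of the flavor of this result (my own construction, not from the lecture): enumerate every objective function on a 4-point domain with values in {0, 1}, and compare two non-revisiting black-box searchers by their average best-so-far value after each number of evaluations. Averaged over all functions, the two traces come out identical:

```python
# Minimal No-Free-Lunch demonstration on a tiny search space.
from itertools import product

DOMAIN = [0, 1, 2, 3]

def fixed_order_search(f, budget):
    """Evaluate points in a fixed left-to-right order."""
    values = [f[x] for x in DOMAIN[:budget]]
    return [max(values[:k + 1]) for k in range(budget)]  # best-so-far trace

def adaptive_search(f, budget):
    """A 'clever' searcher: after each evaluation, jump to the unvisited
    point farthest from the best point found so far."""
    visited, values = [0], [f[0]]
    while len(visited) < budget:
        best_x = visited[values.index(max(values))]
        candidates = [x for x in DOMAIN if x not in visited]
        next_x = max(candidates, key=lambda x: abs(x - best_x))
        visited.append(next_x)
        values.append(f[next_x])
    return [max(values[:k + 1]) for k in range(budget)]

budget = len(DOMAIN)
totals = {"fixed": [0.0] * budget, "adaptive": [0.0] * budget}
all_functions = list(product([0, 1], repeat=len(DOMAIN)))  # 16 functions
for f in all_functions:
    for k, v in enumerate(fixed_order_search(f, budget)):
        totals["fixed"][k] += v
    for k, v in enumerate(adaptive_search(f, budget)):
        totals["adaptive"][k] += v

n = len(all_functions)
print("avg best-so-far, fixed:   ", [t / n for t in totals["fixed"]])
print("avg best-so-far, adaptive:", [t / n for t in totals["adaptive"]])
```

Both lines print the same averages: once performance is averaged over every possible objective function, the "clever" adaptive rule buys nothing over blind enumeration.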

Hill Climbing vs. Hill Descending
Isn't hill climbing better overall? No.

Very Bad News
"If an algorithm performs better than random search on some class of problems then it must perform worse than random search on the remaining problems." "One should be wary of trying to generalize [previously obtained] results to other problems." "If the practitioner has knowledge of problem characteristics but does not incorporate them into the optimization algorithm… there are no formal assurances that the algorithm chosen will be at all effective."

Hope is not Lost
An algorithm can be better over a class of problems if it exploits a common property of that class. What is the class of problems known as the "real world"? Characterizing that class has become important.

Function Approximation is a Subclass of Optimization
A function approximator can be estimated, i.e. described in fewer dimensions (parameters) than the final solution. That is not true of optimization in general: f(x) can be as simple or as complex as we want. There may be a limit on the number of bits in f(x), i.e. the size of the memory, but we can use them however we want.

Exploiting Approximation
How can the structure of approximation problems be exploited? Start with simple approximations and complexify them gradually: information about a function can be elaborated, whereas information is not accumulated in general optimization. Neural networks are approximators, and real-world problems are often approximation problems.
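A loose illustration of "start simple, complexify gradually" in a non-neural setting (my example, assuming NumPy; the target function and noise level are made up): refit the same data with progressively more parameters and watch the error shrink.

```python
# Elaborating an approximation: polynomial fits of increasing degree.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = np.sin(3 * x) + 0.05 * rng.standard_normal(50)   # assumed target function

for degree in (1, 3, 5, 7):                          # complexify gradually
    coeffs = np.polyfit(x, y, degree)                # fit with degree+1 parameters
    mse = np.mean((np.polyval(coeffs, x) - y) ** 2)  # residual error on the data
    print(f"degree {degree}: {degree + 1} parameters, MSE {mse:.4f}")
```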

Next Week: Neuroevolution (NE)
Combining EC with neural networks: fixed-topology NE and TWEANNs; the Competing Conventions Problem.
Readings:
Genetic Algorithms and Neural Networks, Darrell Whitley (1995)
Evolving Artificial Neural Networks, Xin Yao (1999)
Genetic set recombination and its application to neural network topology optimisation, N. J. Radcliffe (1993) (skim from Section 4 on, except for 9.2)