Computational Intelligence Winter Term 2011/12 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS 11) Fakultät für Informatik TU Dortmund.


Lecture 10 ▪ G. Rudolph: Computational Intelligence ▪ Winter Term 2011/12

Plan for Today
● Evolutionary Algorithms (EA)
● Optimization Basics
● EA Basics

Optimization Basics

[Figure: a system with input and output. Modelling: infer the system from known input and output; simulation: compute the output from known system and input; optimization: find the input that produces a desired output.]

Optimization Basics

given: objective function f: X → R, feasible region X (a nonempty set)

objective: find a solution with minimal or maximal value!

optimization problem: find x* ∈ X such that f(x*) = min{ f(x) : x ∈ X }

note: max{ f(x) : x ∈ X } = −min{ −f(x) : x ∈ X }

x* is called a global solution, f(x*) the global optimum.

Optimization Basics

local solution x* ∈ X : ∀ x ∈ N(x*): f(x*) ≤ f(x),
where the neighborhood N(x*) is a bounded subset of X.

example: X = R^n, N_ε(x*) = { x ∈ X : ||x − x*||_2 ≤ ε }

If x* is a local solution, then f(x*) is a local optimum / minimum.

remark: evidently, every global solution / optimum is also a local solution / optimum; the reverse is wrong in general!

example: f: [a,b] → R, global solution at x*. [Figure omitted.]

Optimization Basics

What makes optimization difficult? Some causes:
● local optima (is it a global optimum or not?)
● constraints (ill-shaped feasible region)
● non-smoothness (weak causality; strong causality needed!)
● discontinuities (⇒ nondifferentiability, no gradients)
● lack of knowledge about the problem (⇒ black / gray box optimization)

example: f(x) = a_1 x_1 + … + a_n x_n → max! with x_i ∈ {0,1}, a_i ∈ R
⇒ x_i* = 1 if a_i > 0 (easy)
add the constraint g(x) = b_1 x_1 + … + b_n x_n ≤ b ⇒ NP-hard
add a capacity constraint to the TSP ⇒ CVRP ⇒ still harder

Optimization Basics

When to use which optimization method?

mathematical algorithms:
- problem explicitly specified
- problem-specific solver available
- problem well understood
- resources for designing an algorithm affordable
- solution with proven quality required
⇒ don't apply EAs

randomized search heuristics:
- problem given by black / gray box
- no problem-specific solver available
- problem poorly understood
- insufficient resources for designing an algorithm
- solution with satisfactory quality sufficient
⇒ EAs worth a try

Evolutionary Algorithm Basics

idea: use biological evolution as a metaphor and as a pool of inspiration
⇒ interpretation of biological evolution as an iterative method of improvement

feasible solution x ∈ X = S_1 × … × S_n  = chromosome of an individual
multiset of feasible solutions  = population: multiset of individuals
objective function f: X → R  = fitness function

often: X = R^n,  X = B^n = {0,1}^n,  X = P_n = { π : π is a permutation of {1,2,…,n} }
also: combinations like X = R^n × B^p × P^q, or non-Cartesian sets

⇒ the structure of the feasible region / search space defines the representation of an individual

Evolutionary Algorithm Basics

algorithmic skeleton:
1. initialize population
2. evaluation
3. parent selection
4. variation (yields offspring)
5. evaluation (of offspring)
6. survival selection (yields new population)
7. stop? If no, continue at 3; if yes, output the best individual found.

Evolutionary Algorithm Basics

Specific example: (1+1)-EA in B^n for minimizing some f: B^n → R
population size = 1, number of offspring = 1, selects the best from the 1+1 (parent + offspring) individuals

1. initialize X(0) ∈ B^n uniformly at random, set t = 0
2. evaluate f(X(t))
3. select parent: Y = X(t)   (no choice here)
4. variation: flip each bit of Y independently with probability p_m = 1/n
5. evaluate f(Y)
6. selection: if f(Y) ≤ f(X(t)) then X(t+1) = Y, else X(t+1) = X(t)
7. if not stopping, then t = t+1 and continue at (3)
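The loop above can be sketched in Python. As an illustrative objective (not from the slide) we minimize the number of zero bits, i.e. the OneMax problem written as a minimization task:

```python
import random

def one_plus_one_ea(f, n, max_iters, rng):
    """(1+1)-EA on B^n minimizing f, following steps 1-7 above."""
    x = [rng.randint(0, 1) for _ in range(n)]     # 1. initialize uniformly at random
    fx = f(x)                                     # 2. evaluate parent
    for _ in range(max_iters):                    # stopping rule: fixed budget
        # 3.+4. the parent is copied and each bit flips with probability 1/n
        y = [1 - b if rng.random() < 1.0 / n else b for b in x]
        fy = f(y)                                 # 5. evaluate offspring
        if fy <= fx:                              # 6. plus-selection: keep the better one
            x, fx = y, fy
    return x, fx

# illustrative objective: number of zero bits (minimum 0 at the all-ones string)
best, fbest = one_plus_one_ea(lambda v: v.count(0), n=20,
                              max_iters=5000, rng=random.Random(0))
```

The budget of 5000 iterations vastly exceeds the expected ≈ e·n·ln n optimization time of the (1+1)-EA on OneMax for n = 20, so the run reaches the optimum.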

Evolutionary Algorithm Basics

Specific example: (1+1)-EA in R^n for minimizing some f: R^n → R
population size = 1, number of offspring = 1, selects the best from the 1+1 (parent + offspring) individuals

1. initialize X(0) ∈ C ⊂ R^n uniformly at random (C a compact set, i.e. closed and bounded), set t = 0
2. evaluate f(X(t))
3. select parent: Y = X(t)   (no choice here)
4. variation = add a random vector: Y = Y + Z, e.g. Z ∼ N(0, I_n)
5. evaluate f(Y)
6. selection: if f(Y) ≤ f(X(t)) then X(t+1) = Y, else X(t+1) = X(t)
7. if not stopping, then t = t+1 and continue at (3)
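A minimal sketch of the continuous variant, with the sphere function as an illustrative objective and a fixed mutation step size sigma (a simplifying assumption; the slide's Z ∼ N(0, I_n) corresponds to sigma = 1):

```python
import random

def one_plus_one_ea_real(f, x0, sigma, max_iters, rng):
    """(1+1)-EA on R^n minimizing f: additive Gaussian mutation, plus-selection."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iters):
        y = [xi + rng.gauss(0.0, sigma) for xi in x]  # 4. Y = X + Z, Z ~ N(0, sigma^2 I_n)
        fy = f(y)                                     # 5. evaluate offspring
        if fy <= fx:                                  # 6. plus-selection
            x, fx = y, fy
    return x, fx

# illustrative objective: sphere function f(x) = sum_i x_i^2, minimum 0 at the origin
sphere = lambda v: sum(t * t for t in v)
best, fbest = one_plus_one_ea_real(sphere, [1.0, 1.0, 1.0], sigma=0.1,
                                   max_iters=20000, rng=random.Random(1))
```

With a constant sigma the progress stalls once the distance to the optimum drops well below sigma, which is why practical evolution strategies adapt the step size.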

Evolutionary Algorithm Basics

Selection
(a) select parents that generate offspring → selection for reproduction
(b) select individuals that proceed to the next generation → selection for survival

necessary requirements:
- selection steps must not favor worse individuals
- one selection step may be neutral (e.g. select uniformly at random)
- at least one selection step must favor better individuals

typically: selection based only on the fitness values f(x) of the individuals
seldom: additionally based on the individuals' chromosomes x (→ maintain diversity)

Evolutionary Algorithm Basics

Selection methods

population P = (x_1, x_2, …, x_μ) with μ individuals

two approaches:
1. repeatedly select individuals from the population with replacement
2. rank the individuals somehow and choose those with the best ranks (no replacement)

uniform / neutral selection: choose index i with probability 1/μ

fitness-proportional selection: choose index i with probability s_i = f(x_i) / (f(x_1) + … + f(x_μ))

problems:
- f(x) > 0 for all x ∈ X required ⇒ use g(x) = exp( f(x) ) > 0; but then selection is sensitive to additive shifts g(x) = f(x) + c
- almost deterministic if fitness differences are large, almost uniform if they are small
⇒ don't use!
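A minimal roulette-wheel sketch of fitness-proportional selection (names and the example values are illustrative). Note that it presupposes f(x) > 0 for every individual:

```python
import random

def fitness_proportional(fitnesses, rng):
    """Pick index i with probability s_i = f_i / (f_1 + ... + f_mu); requires f_i > 0."""
    total = sum(fitnesses)
    r = rng.random() * total
    acc = 0.0
    for i, fi in enumerate(fitnesses):
        acc += fi
        if r < acc:
            return i
    return len(fitnesses) - 1        # guard against floating-point rounding

rng = random.Random(42)
fitnesses = [10.0, 1.0, 1.0]         # s = (10/12, 1/12, 1/12): almost deterministic
counts = [0, 0, 0]
for _ in range(3000):
    counts[fitness_proportional(fitnesses, rng)] += 1
```

Replacing `fitnesses` by the shifted values `[110.0, 101.0, 101.0]` makes the same three individuals nearly equiprobable, which is exactly the sensitivity to additive shifts criticized above.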

Evolutionary Algorithm Basics

Selection methods

population P = (x_1, x_2, …, x_μ) with μ individuals

rank-proportional selection:
- order the individuals according to their fitness values, assign ranks
- fitness-proportional selection based on ranks
⇒ avoids all problems of fitness-proportional selection
but: the best individual has only a small selection advantage (can be lost!) ⇒ outdated!

k-ary tournament selection:
- draw k individuals uniformly at random (typically with replacement) from P
- choose the individual with the best fitness (break ties at random)
⇒ has all advantages of rank-based selection; the probability that the best individual is not drawn into (and hence does not survive) a single tournament is (1 − 1/μ)^k
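A short sketch of k-ary tournament selection (minimization; ties are broken by draw order here rather than at random, a simplification of the slide's rule):

```python
import random

def tournament(fitnesses, k, rng):
    """k-ary tournament: draw k indices with replacement, return the fittest (minimization)."""
    contenders = [rng.randrange(len(fitnesses)) for _ in range(k)]
    return min(contenders, key=lambda i: fitnesses[i])

rng = random.Random(7)
fitnesses = [3.0, 1.0, 2.0, 5.0]     # minimization: index 1 is the best individual
wins = sum(1 for _ in range(2000) if tournament(fitnesses, k=3, rng=rng) == 1)
```

The best individual wins a tournament exactly when it is drawn at all; with μ = 4 and k = 3 that happens with probability 1 − (1 − 1/4)^3 ≈ 0.58, so `wins` lands near 1160 of 2000.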

Evolutionary Algorithm Basics

Selection methods without replacement

population P = (x_1, x_2, …, x_μ) with μ parents and population Q = (y_1, y_2, …, y_λ) with λ offspring

(μ, λ)-selection, or truncation selection on offspring, or comma-selection:
- rank the offspring according to their fitness
- select the μ offspring with the best ranks
⇒ the best individual may get lost; λ ≥ μ required

(μ + λ)-selection, or truncation selection on parents + offspring, or plus-selection:
- merge the λ offspring and the μ parents
- rank them according to their fitness
- select the μ individuals with the best ranks
⇒ the best individual survives for sure
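Both truncation schemes fit in a few lines; in this sketch the individuals are identified with their fitness values (minimization), an illustrative simplification:

```python
def comma_selection(offspring, f, mu):
    """(mu, lambda)-selection: keep the mu best of the lambda offspring (lambda >= mu)."""
    assert len(offspring) >= mu
    return sorted(offspring, key=f)[:mu]

def plus_selection(parents, offspring, f, mu):
    """(mu + lambda)-selection: keep the mu best of parents and offspring together."""
    return sorted(parents + offspring, key=f)[:mu]

# individuals identified with their fitness values (minimization)
parents, offspring = [4.0, 2.0], [3.0, 5.0, 1.0, 6.0]
ident = lambda x: x
comma = comma_selection(offspring, ident, mu=2)   # best parent 2.0 is lost
plus = plus_selection(parents, offspring, ident, mu=2)  # best individual survives for sure
```

The example shows the slide's point: comma-selection discards the parent 2.0 even though it beats the offspring 3.0, while plus-selection keeps it.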

Evolutionary Algorithm Basics

Selection methods: Elitism

Elitist selection: the best parent is not replaced by a worse individual.
- Intrinsic elitism: the method selects from parents and offspring; the best survives with probability 1.
- Forced elitism: if the best individual has not survived, re-inject it into the population, i.e., replace the worst selected individual by the previously best parent.

method                  P{ select best }   intrinsic elitism
neutral                 < 1                no
fitness proportionate   < 1                no
rank proportionate      < 1                no
k-ary tournament        < 1                no
(μ + λ)                 = 1                yes (selects from parents & offspring)
(μ, λ)                  = 1                no

Evolutionary Algorithm Basics

Variation operators: depend on the representation
mutation → alters a single individual
recombination → creates a single offspring from two or more parents

may be applied
● exclusively (either recombination or mutation), chosen in advance
● exclusively (either recombination or mutation), chosen in a probabilistic manner
● sequentially (typically recombination before mutation) for each offspring
● sequentially (typically recombination before mutation) with some probability

Evolutionary Algorithm Basics

Variation in B^n: Mutation (individuals ∈ {0,1}^n)
a) local → choose an index k ∈ {1, …, n} uniformly at random and flip bit k, i.e., x_k = 1 − x_k
b) global → for each index k ∈ {1, …, n}: flip bit k with probability p_m ∈ (0,1)
c) “nonlocal“ → choose K indices at random and flip the bits at these indices
d) inversion → choose a start index k_s and an end index k_e at random, invert the order of the bits between start and end index
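Sketches of operators a), b), and d) on bit lists (function names are illustrative):

```python
import random

def local_mutation(x, rng):
    """a) flip exactly one uniformly chosen bit: x_k = 1 - x_k."""
    y = list(x)
    k = rng.randrange(len(y))
    y[k] = 1 - y[k]
    return y

def global_mutation(x, pm, rng):
    """b) flip each bit independently with probability pm."""
    return [1 - b if rng.random() < pm else b for b in x]

def inversion(x, rng):
    """d) invert the order of the bits between random start and end indices ks <= ke."""
    ks, ke = sorted(rng.sample(range(len(x)), 2))
    return x[:ks] + x[ks:ke + 1][::-1] + x[ke + 1:]

rng = random.Random(3)
x = [0, 0, 0, 0, 1, 1, 1, 1]
y_local = local_mutation(x, rng)
y_global = global_mutation(x, pm=0.5, rng=rng)
y_inv = inversion(x, rng)
```

Note the structural difference: a) always changes exactly one bit, b) changes a binomially distributed number of bits, and d) only reorders bits, so the number of ones is preserved.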

Evolutionary Algorithm Basics

Variation in B^n: Recombination, two parents (individuals ∈ {0,1}^n)
a) 1-point crossover → draw a cut-point k ∈ {1,…,n−1} uniformly at random; take the first k bits from the 1st parent and the last n−k bits from the 2nd parent
b) K-point crossover → draw K distinct cut-points uniformly at random; take bits 1 to k_1 from the 1st parent, bits k_1+1 to k_2 from the 2nd parent, bits k_2+1 to k_3 from the 1st parent, and so forth
c) uniform crossover → for each index i: take bit i with equal probability from the 1st or the 2nd parent
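Operators a) and c) in a few lines (the two all-equal parents are an illustrative choice that makes the cut visible in the offspring):

```python
import random

def one_point_crossover(p1, p2, rng):
    """a) cut at k in {1,...,n-1}: first k bits from p1, last n-k bits from p2."""
    k = rng.randrange(1, len(p1))
    return p1[:k] + p2[k:]

def uniform_crossover(p1, p2, rng):
    """c) each bit is taken from p1 or p2 with equal probability."""
    return [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]

rng = random.Random(5)
p1, p2 = [0] * 8, [1] * 8
c1 = one_point_crossover(p1, p2, rng)   # a block of 0s followed by a block of 1s
cu = uniform_crossover(p1, p2, rng)     # a random mix of 0s and 1s
```

Because k is drawn from {1,…,n−1}, one-point crossover always inherits at least one bit from each parent.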

Evolutionary Algorithm Basics

Variation in B^n: Recombination, multiparent with ρ = number of parents (individuals ∈ {0,1}^n)

a) diagonal crossover (2 < ρ < n) → choose ρ − 1 distinct cut points and assemble the offspring from chunks taken along the diagonals:

AAAAAAAAAA        ABBBCCDDDD
BBBBBBBBBB   →    BCCCDDAAAA
CCCCCCCCCC        CDDDAABBBB
DDDDDDDDDD        DAAABBCCCC

can generate ρ offspring; otherwise choose the initial chunk at random for a single offspring

b) gene pool crossover (ρ > 2) → for each gene: choose the donating parent uniformly at random

Evolutionary Algorithm Basics

Variation in P_n: Mutation (individuals X = π over (1, …, n))
a) local → 2-swap / 1-translocation
b) global → draw a number K of 2-swaps, apply 2-swaps K times

K is a positive random variable; its distribution may be uniform, binomial, geometric, …; its expectation E[K] and variance V[K] may control the mutation strength.
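A sketch of both permutation mutations; for b) a geometric distribution for K is one illustrative choice among those listed above, with `mean_swaps` = E[K] controlling the strength:

```python
import random

def two_swap(perm, rng):
    """a) local mutation: exchange the entries at two distinct random positions."""
    y = list(perm)
    i, j = rng.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y

def global_perm_mutation(perm, rng, mean_swaps=2.0):
    """b) apply K 2-swaps, K geometrically distributed with E[K] = mean_swaps."""
    k = 1
    while rng.random() < 1.0 - 1.0 / mean_swaps:   # continue w.p. 1 - 1/E[K]
        k += 1
    y = list(perm)
    for _ in range(k):
        y = two_swap(y, rng)
    return y

rng = random.Random(11)
p = list(range(8))
y = two_swap(p, rng)
z = global_perm_mutation(p, rng)
```

Both operators map permutations to permutations, so no repair step is needed, unlike bit-string operators applied naively to permutation encodings.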

Evolutionary Algorithm Basics

Variation in P_n: Recombination, two parents (individuals X = π over (1, …, n))
a) order-based crossover (OB)
b) partially mapped crossover (PMX)
c) cycle crossover (CX)

Evolutionary Algorithm Basics

Variation in P_n: Recombination, multiparent (individuals X = π over (1, …, n))
a) xx crossover
b) xx crossover
c) xx crossover

Evolutionary Algorithm Basics

Variation in R^n: Mutation (individuals X ∈ R^n)

additive: Y = X + Z (Z: n-dimensional random vector), i.e. offspring = parent + mutation

Definition: Let f_Z : R^n → R_+ be the p.d.f. of the random vector Z. The set { x ∈ R^n : f_Z(x) > 0 } is termed the support of Z.

a) local → Z with bounded support
b) nonlocal → Z with unbounded support (most frequently used!)
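The two cases side by side, with uniform noise on [−δ, δ]^n as an illustrative bounded-support choice and a Gaussian as the usual unbounded-support choice:

```python
import random

def mutate_bounded(x, delta, rng):
    """a) local: add Z with bounded support, e.g. uniform on [-delta, delta]^n."""
    return [xi + rng.uniform(-delta, delta) for xi in x]

def mutate_unbounded(x, sigma, rng):
    """b) nonlocal: add Z with unbounded support, e.g. Z ~ N(0, sigma^2 I_n)."""
    return [xi + rng.gauss(0.0, sigma) for xi in x]

rng = random.Random(13)
x = [0.0, 1.0, 2.0]
y_local = mutate_bounded(x, delta=0.5, rng=rng)      # each coordinate moves at most 0.5
y_nonlocal = mutate_unbounded(x, sigma=0.5, rng=rng) # arbitrarily large jumps possible
```

Bounded support confines the offspring to a box around the parent; unbounded support gives every region of R^n positive probability, which is why b) is used most frequently.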

Evolutionary Algorithm Basics

Variation in R^n: Recombination, two parents (individuals X ∈ R^n)
a) all crossover variants adapted from B^n
b) intermediate
c) intermediate (per dimension)
d) discrete
e) simulated binary crossover (SBX)

Evolutionary Algorithm Basics

Variation in R^n: Recombination, multiparent with ρ ≥ 3 parents (individuals X ∈ R^n)
a) intermediate: x' = w_1 x_1 + … + w_ρ x_ρ, where w_i ≥ 0 and w_1 + … + w_ρ = 1 (all points in the convex hull of the parents)
b) intermediate (per dimension)
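A sketch of multiparent intermediate recombination with randomly drawn weights (normalizing uniform draws is one illustrative way to obtain w_i ≥ 0 with Σ w_i = 1):

```python
import random

def intermediate(parents, rng):
    """Multiparent intermediate recombination: random convex combination
    x' = sum_i w_i x_i with w_i >= 0 and sum_i w_i = 1."""
    raw = [rng.random() for _ in parents]
    s = sum(raw)
    w = [r / s for r in raw]                      # normalize to a convex combination
    dims = range(len(parents[0]))
    return [sum(wi * p[d] for wi, p in zip(w, parents)) for d in dims]

rng = random.Random(17)
# rho = 3 parents spanning a triangle in R^2
parents = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
child = intermediate(parents, rng)
```

Since the weights form a convex combination, the child necessarily lies in the triangle spanned by the three parents, i.e. in their convex hull.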

Evolutionary Algorithm Basics

Theorem: Let f: R^n → R be a strictly quasiconvex function. If f(x) = f(y) for some x ≠ y, then every offspring generated by intermediate recombination is better than its parents.
Proof: ■

Evolutionary Algorithm Basics

Theorem: Let f: R^n → R be a differentiable function and f(x) < f(y) for some x ≠ y. If (y − x)′ ∇f(x) < 0, then there is a positive probability that an offspring generated by intermediate recombination is better than both parents.
Proof: ■

Evolutionary Algorithms: Historical Notes

The idea emerged independently several times, around the late 1950s / early 1960s. Three branches / “schools“ are still active today.

● Evolutionary Programming (EP):
Pioneers: Lawrence Fogel, Alvin Owens, Michael Walsh (New York, USA).
Original goal: generate intelligent behavior through simulated evolution.
Approach: evolution of finite state machines predicting symbols.
Later (~1990s) specialized to optimization in R^n by David B. Fogel.

● Genetic Algorithms (GA):
Pioneer: John Holland (Ann Arbor, MI, USA).
Original goal: analysis of adaptive behavior.
Approach: viewing evolution as adaptation; simulated evolution of bit strings.
Applied to optimization tasks by PhD students (Kenneth de Jong, 1975; et al.).

● Evolution Strategies (ES):
Pioneers: Ingo Rechenberg, Hans-Paul Schwefel, Peter Bienert (Berlin, Germany).
Original goal: optimization of complex systems.
Approach: viewing variation/selection as an improvement strategy. First in Z^n, then R^n.

Evolutionary Algorithms: Historical Notes

“Offspring“ from the GA branch:
● Genetic Programming (GP):
Pioneers: Nichael Lynn Cramer (1985), then John Koza (Stanford, USA).
Original goal: evolve programs (parse trees) that must accomplish a certain task.
Approach: GA mechanism transferred to parse trees.
Later: programs as successive statements → Linear GP (e.g. Wolfgang Banzhaf).

Already beginning in the early 1990s, the borders between EP, GA, ES, and GP began to blur...
⇒ common term Evolutionary Algorithm embracing all kinds of approaches
⇒ broadly accepted name for the field: Evolutionary Computation

scientific journals: Evolutionary Computation (MIT Press) since 1993, IEEE Transactions on Evolutionary Computation since 1997, and several more specialized journals started since then.