Slide 1: Computational Intelligence, Winter Term 2013/14 (Lecture 09)
Prof. Dr. Günter Rudolph
Lehrstuhl für Algorithm Engineering (LS 11), Fakultät für Informatik, TU Dortmund

Slide 2: Plan for Today
● Evolutionary Algorithms (EA)
● Optimization Basics
● EA Basics

Slide 3: Optimization Basics
[Diagram: a system maps input to output; the three tasks modelling, simulation, and optimization differ in which of input, system, and output are known (!) and which is sought (?).]

Slide 4: Optimization Basics
given: objective function f: X → R and feasible region X (a nonempty set)
objective: find a solution with minimal or maximal value!
optimization problem: find x* ∈ X such that f(x*) = min{ f(x) : x ∈ X }
note: max{ f(x) : x ∈ X } = −min{ −f(x) : x ∈ X }
x* is called a global solution, f(x*) the global optimum.

Slide 5: Optimization Basics
local solution x* ∈ X : ∀ x ∈ N(x*): f(x*) ≤ f(x)
neighborhood of x* = bounded subset of X; example: X = R^n, N_ε(x*) = { x ∈ X : ||x − x*||_2 ≤ ε }
if x* is a local solution, then f(x*) is a local optimum / minimum
remark: evidently, every global solution / optimum is also a local solution / optimum; the reverse is wrong in general!
[Figure: example of an f: [a,b] → R with several local minima; the global solution is at x*.]

Slide 6: Optimization Basics
What makes optimization difficult? Some causes:
- local optima (is it a global optimum or not?)
- constraints (ill-shaped feasible region)
- non-smoothness (weak causality; strong causality needed!)
- discontinuities (⇒ nondifferentiability, no gradients)
- lack of knowledge about the problem (⇒ black / gray box optimization)
example: f(x) = a_1 x_1 + … + a_n x_n → max! with x_i ∈ {0,1}, a_i ∈ R ⇒ x_i* = 1 iff a_i > 0;
add the constraint g(x) = b_1 x_1 + … + b_n x_n ≤ b ⇒ NP-hard;
add a capacity constraint to the TSP ⇒ CVRP ⇒ still harder.

Slide 7: Optimization Basics
When to use which optimization method?
mathematical algorithms: problem explicitly specified, problem-specific solver available, problem well understood, resources for designing an algorithm affordable, solution with proven quality required ⇒ don't apply EAs
randomized search heuristics: problem given as a black / gray box, no problem-specific solver available, problem poorly understood, insufficient resources for designing an algorithm, solution of satisfactory quality sufficient ⇒ EAs worth a try

Slide 8: Evolutionary Algorithm Basics
idea: use biological evolution as metaphor and as a pool of inspiration
⇒ interpretation of biological evolution as an iterative method of improvement
- feasible solution x ∈ X = S_1 × … × S_n = chromosome of an individual
- multiset of feasible solutions = population: multiset of individuals
- objective function f: X → R = fitness function
often: X = R^n, X = B^n = {0,1}^n, X = P^n = { π : π is a permutation of {1,2,…,n} }
also: combinations like X = R^n × B^p × P^q, or non-Cartesian sets
⇒ the structure of the feasible region / search space defines the representation of an individual

Slide 9: Evolutionary Algorithm Basics
algorithmic skeleton: initialize population → evaluation → repeat { parent selection → variation (yields offspring) → evaluation (of offspring) → survival selection (yields new population) } until the stopping criterion holds → output: best individual found
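The skeleton above is representation-independent. The following is a minimal Python sketch of it, where init_population, evaluate, select_parents, vary, select_survivors, and stop are placeholder names for the problem-specific operator choices discussed on the later slides (all names are assumptions, not part of the original slides).

```python
# A minimal sketch of the algorithmic skeleton above (minimization).
def evolutionary_algorithm(init_population, evaluate, select_parents,
                           vary, select_survivors, stop):
    population = init_population()
    fitness = [evaluate(x) for x in population]
    while not stop(population, fitness):
        parents = select_parents(population, fitness)        # selection for reproduction
        offspring = vary(parents)                            # recombination and/or mutation
        off_fitness = [evaluate(y) for y in offspring]       # evaluation of offspring
        population, fitness = select_survivors(population, fitness,
                                               offspring, off_fitness)  # selection for survival
    best = min(range(len(population)), key=lambda i: fitness[i])
    return population[best], fitness[best]                   # best individual found
```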

Slide 10: Evolutionary Algorithm Basics
Specific example: (1+1)-EA in B^n for minimizing some f: B^n → R
population size = 1, number of offspring = 1, selects the best of the 1+1 individuals (parent and offspring)
1. initialize X^(0) ∈ B^n uniformly at random, set t = 0
2. evaluate f(X^(t))
3. select parent: Y = X^(t)   (no choice here)
4. variation: flip each bit of Y independently with probability p_m = 1/n
5. evaluate f(Y)
6. selection: if f(Y) ≤ f(X^(t)) then X^(t+1) = Y, else X^(t+1) = X^(t)
7. if not stopping, then t = t+1 and continue at (3)
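A minimal Python sketch of this (1+1)-EA follows; the objective function f and the simple iteration budget max_iters are assumptions for illustration, not part of the slide.

```python
import random

# A minimal sketch of the (1+1)-EA on B^n = {0,1}^n for minimizing f,
# following steps 1-7 above.
def one_plus_one_ea_binary(f, n, max_iters=10_000):
    x = [random.randint(0, 1) for _ in range(n)]     # 1. initialize uniformly at random
    fx = f(x)                                        # 2. evaluate parent
    for _ in range(max_iters):                       # 7. simple stopping criterion
        y = [b ^ 1 if random.random() < 1.0 / n else b for b in x]  # 3.+4. flip each bit w.p. 1/n
        fy = f(y)                                    # 5. evaluate offspring
        if fy <= fx:                                 # 6. keep the better of parent and offspring
            x, fx = y, fy
    return x, fx

# usage example: minimize the number of ones in the bit string
best, value = one_plus_one_ea_binary(lambda bits: sum(bits), n=20)
```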

Slide 11: Evolutionary Algorithm Basics
Specific example: (1+1)-EA in R^n for minimizing some f: R^n → R
population size = 1, number of offspring = 1, selects the best of the 1+1 individuals (parent and offspring)
1. initialize X^(0) ∈ C ⊂ R^n uniformly at random (C a compact set, i.e. closed and bounded), set t = 0
2. evaluate f(X^(t))
3. select parent: Y = X^(t)   (no choice here)
4. variation = add a random vector: Y = Y + Z, e.g. Z ∼ N(0, I_n)
5. evaluate f(Y)
6. selection: if f(Y) ≤ f(X^(t)) then X^(t+1) = Y, else X^(t+1) = X^(t)
7. if not stopping, then t = t+1 and continue at (3)
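A minimal Python sketch of the real-valued (1+1)-EA with additive Gaussian mutation; f, the initialization box C = [lower, upper]^n, the step size sigma, and max_iters are illustrative assumptions.

```python
import random

# A minimal sketch of the (1+1)-EA on R^n with additive mutation Z ~ N(0, sigma^2 I_n).
def one_plus_one_ea_real(f, n, lower=-5.0, upper=5.0, sigma=1.0, max_iters=10_000):
    x = [random.uniform(lower, upper) for _ in range(n)]   # 1. initialize uniformly in the compact set C
    fx = f(x)                                              # 2. evaluate parent
    for _ in range(max_iters):
        y = [xi + random.gauss(0.0, sigma) for xi in x]    # 4. Y = X + Z
        fy = f(y)                                          # 5. evaluate offspring
        if fy <= fx:                                       # 6. keep the better of parent and offspring
            x, fx = y, fy
    return x, fx

# usage example: minimize the sphere function
best, value = one_plus_one_ea_real(lambda v: sum(t * t for t in v), n=10)
```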

Slide 12: Evolutionary Algorithm Basics
Selection
(a) select parents that generate offspring → selection for reproduction
(b) select individuals that proceed to the next generation → selection for survival
necessary requirements:
- selection steps must not favor worse individuals
- one selection step may be neutral (e.g. select uniformly at random)
- at least one selection step must favor better individuals
typically: selection is based only on the fitness values f(x) of the individuals
seldom: additionally based on the individuals' chromosomes x (→ to maintain diversity)

Slide 13: Evolutionary Algorithm Basics
Selection methods
population P = (x_1, x_2, …, x_μ) with μ individuals
two approaches:
1. repeatedly select individuals from the population with replacement
2. rank the individuals somehow and choose those with the best ranks (no replacement)
uniform / neutral selection: choose index i with probability 1/μ
fitness-proportional selection: choose index i with probability s_i = f(x_i) / Σ_j f(x_j)
problems:
- f(x) > 0 for all x ∈ X is required ⇒ use g(x) = exp(f(x)) > 0, but the method is already sensitive to additive shifts g(x) = f(x) + c
- almost deterministic for large fitness differences, almost uniform for small differences
⇒ don't use!
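As a small illustration of why the slide warns against it, the sketch below computes fitness-proportional selection probabilities and shows their sensitivity to an additive shift and to an exponential rescaling; the concrete fitness values are illustrative assumptions.

```python
import math
import random

# Fitness-proportional selection probabilities s_i = f(x_i) / sum_j f(x_j)
# (maximization, all fitness values assumed positive).
def fitness_proportional_probs(fitness):
    total = sum(fitness)
    return [fi / total for fi in fitness]

def select_index(probs):
    # draw one index with the given probabilities (selection with replacement)
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

f_values = [1.0, 2.0, 3.0]
print(fitness_proportional_probs(f_values))                                 # ≈ [0.167, 0.333, 0.5]
print(fitness_proportional_probs([fi + 100 for fi in f_values]))            # almost uniform
print(fitness_proportional_probs([math.exp(10 * fi) for fi in f_values]))   # almost deterministic
i = select_index(fitness_proportional_probs(f_values))
```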

Slide 14: Evolutionary Algorithm Basics
Selection methods
population P = (x_1, x_2, …, x_μ) with μ individuals
rank-proportional selection
- order the individuals according to their fitness values and assign ranks
- apply fitness-proportional selection based on the ranks
⇒ avoids all problems of fitness-proportional selection
but: the best individual has only a small selection advantage (it can be lost!) → outdated!
k-ary tournament selection
- draw k individuals uniformly at random (typically with replacement) from P
- choose the individual with the best fitness (break ties at random)
⇒ has all the advantages of rank-based selection; note, however, that the best individual fails to survive with some positive probability.
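A minimal sketch of k-ary tournament selection for minimization; the fitness values in the usage example are illustrative assumptions.

```python
import random

# k-ary tournament selection (minimization): draw k indices uniformly at random with
# replacement and return the index of the best contestant, breaking ties at random.
def tournament_select(fitness, k=2):
    contestants = [random.randrange(len(fitness)) for _ in range(k)]
    best_value = min(fitness[i] for i in contestants)
    winners = [i for i in contestants if fitness[i] == best_value]
    return random.choice(winners)

# usage example: build a mating pool of 10 selected indices
fitness = [3.2, 1.5, 4.8, 1.5]
mating_pool = [tournament_select(fitness, k=3) for _ in range(10)]
```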

Slide 15: Evolutionary Algorithm Basics
Selection methods without replacement
population P = (x_1, x_2, …, x_μ) with μ parents and population Q = (y_1, y_2, …, y_λ) with λ offspring
(μ, λ)-selection (truncation selection on the offspring, or comma-selection):
- rank the offspring according to their fitness
- select the μ offspring with the best ranks
⇒ the best individual may get lost; λ ≥ μ is required
(μ + λ)-selection (truncation selection on parents + offspring, or plus-selection):
- merge the λ offspring and the μ parents
- rank them according to their fitness
- select the μ individuals with the best ranks
⇒ the best individual survives for sure
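Both schemes translate directly into code. The sketch below assumes individuals and their fitness values are passed as parallel lists and returns (individual, fitness) pairs for the survivors.

```python
# Minimal sketches of the two truncation selections (minimization); both rank by fitness,
# they differ only in the pool that is ranked.
def comma_selection(offspring, off_fitness, mu):
    # (mu, lambda)-selection: rank the offspring only; requires lambda >= mu
    ranked = sorted(zip(offspring, off_fitness), key=lambda pair: pair[1])
    return ranked[:mu]

def plus_selection(parents, par_fitness, offspring, off_fitness, mu):
    # (mu + lambda)-selection: rank parents and offspring together, so the best survives for sure
    pool = list(zip(parents, par_fitness)) + list(zip(offspring, off_fitness))
    ranked = sorted(pool, key=lambda pair: pair[1])
    return ranked[:mu]
```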

Slide 16: Evolutionary Algorithm Basics
Selection methods: Elitism
Elitist selection: the best parent is not replaced by a worse individual.
- Intrinsic elitism: the method selects from parents and offspring, and the best survives with probability 1.
- Forced elitism: if the best individual has not survived, it is re-injected into the population, i.e., the worst selected individual is replaced by the previously best parent.

method                   P{ select best }   intrinsic elitism (selects from parents & offspring, best survives w.p. 1)
neutral                  < 1                no
fitness-proportionate    < 1                no
rank-proportionate       < 1                no
k-ary tournament         < 1                no
(μ + λ)                  = 1                yes
(μ, λ)                   = 1                no

Slide 17: Evolutionary Algorithm Basics
Variation operators: depend on the representation
- mutation → alters a single individual
- recombination → creates a single offspring from two or more parents
for each offspring, they may be applied
● exclusively (either recombination or mutation), chosen in advance
● exclusively (either recombination or mutation), chosen in a probabilistic manner
● sequentially (typically recombination before mutation)
● sequentially (typically recombination before mutation), each with some probability
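A minimal sketch of the last application mode (sequential, each operator applied with some probability); the operator arguments and the probabilities pc and pm are assumptions, and individuals are assumed to be lists.

```python
import random

# Sequential application of variation operators: recombination with probability pc,
# then mutation with probability pm, producing one offspring.
def make_offspring(parents, recombine, mutate, pc=0.7, pm=1.0):
    p1, p2 = random.sample(parents, 2)                             # pick two distinct parents
    child = recombine(p1, p2) if random.random() < pc else p1[:]   # recombine with prob. pc
    if random.random() < pm:
        child = mutate(child)                                      # mutate with prob. pm
    return child
```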

Slide 18: Evolutionary Algorithm Basics
Variation in B^n: Mutation (individuals ∈ {0,1}^n)
a) local → choose index k ∈ {1, …, n} uniformly at random and flip bit k, i.e., x_k := 1 − x_k
b) global → for each index k ∈ {1, …, n}: flip bit k with probability p_m ∈ (0,1)
c) "nonlocal" → choose K indices at random and flip the bits at these indices
d) inversion → choose a start index k_s and an end index k_e at random and invert the order of the bits between the start and end index
[Figure: example bit strings illustrating the four mutation types, e.g. (c) with K = 2 and (d) with indices k_s and k_e.]
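Minimal Python sketches of the four bit-string mutations (a)-(d); each returns a new list and leaves the parent unchanged.

```python
import random

def local_mutation(x):                       # a) flip exactly one randomly chosen bit
    y = x[:]
    k = random.randrange(len(y))
    y[k] = 1 - y[k]
    return y

def global_mutation(x, pm):                  # b) flip each bit independently with probability pm
    return [1 - b if random.random() < pm else b for b in x]

def nonlocal_mutation(x, K):                 # c) flip the bits at K randomly chosen distinct indices
    y = x[:]
    for k in random.sample(range(len(y)), K):
        y[k] = 1 - y[k]
    return y

def inversion_mutation(x):                   # d) invert the order of the bits between two indices
    ks, ke = sorted(random.sample(range(len(x)), 2))
    return x[:ks] + x[ks:ke + 1][::-1] + x[ke + 1:]
```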

Slide 19: Evolutionary Algorithm Basics
Variation in B^n: Recombination (two parents), individuals ∈ {0,1}^n
a) 1-point crossover → draw a cut point k ∈ {1, …, n−1} uniformly at random; take the first k bits from the 1st parent and the last n−k bits from the 2nd parent
b) K-point crossover → draw K distinct cut points uniformly at random; take bits 1 to k_1 from the 1st parent, bits k_1+1 to k_2 from the 2nd parent, bits k_2+1 to k_3 from the 1st parent, and so forth
c) uniform crossover → for each index i: choose bit i with equal probability from the 1st or 2nd parent
[Figure: example bit strings illustrating the three crossover types.]
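Minimal Python sketches of the three two-parent crossovers; both parents are assumed to be bit lists of equal length n.

```python
import random

def one_point_crossover(p1, p2):
    k = random.randint(1, len(p1) - 1)            # cut point in {1, ..., n-1}
    return p1[:k] + p2[k:]

def k_point_crossover(p1, p2, K):
    cuts = sorted(random.sample(range(1, len(p1)), K))   # K distinct cut points
    child, pair, start = [], (p1, p2), 0
    for i, cut in enumerate(cuts + [len(p1)]):
        child += pair[i % 2][start:cut]           # alternate parents between cut points
        start = cut
    return child

def uniform_crossover(p1, p2):
    return [a if random.random() < 0.5 else b for a, b in zip(p1, p2)]
```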

Slide 20: Evolutionary Algorithm Basics
Variation in B^n: Recombination (multiparent, ρ = number of parents), individuals ∈ {0,1}^n
a) diagonal crossover (2 < ρ < n) → choose ρ − 1 distinct cut points and select the chunks along the diagonals, e.g. for ρ = 4 parents:
AAAAAAAAAA → ABBBCCDDDD
BBBBBBBBBB → BCCCDDAAAA
CCCCCCCCCC → CDDDAABBBB
DDDDDDDDDD → DAAABBCCCC
can generate ρ offspring; otherwise choose the initial chunk at random for a single offspring
b) gene pool crossover (ρ > 2) → for each gene: choose the donating parent uniformly at random
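Minimal Python sketches of the two multi-parent recombinations; the parents are assumed to be equal-length lists, and the diagonal variant reproduces the rotation pattern of the example above (offspring i takes chunk j from parent (i + j) mod ρ).

```python
import random

def diagonal_crossover(parents):
    # rho - 1 distinct cut points; one offspring per "diagonal" of the parent matrix
    rho, n = len(parents), len(parents[0])
    cuts = [0] + sorted(random.sample(range(1, n), rho - 1)) + [n]
    offspring = []
    for shift in range(rho):
        child = []
        for j in range(rho):
            child += parents[(shift + j) % rho][cuts[j]:cuts[j + 1]]
        offspring.append(child)
    return offspring

def gene_pool_crossover(parents):
    # for each gene, choose the donating parent uniformly at random
    return [random.choice(parents)[i] for i in range(len(parents[0]))]
```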

Slide 21: Evolutionary Algorithm Basics
Variation in P^n: Mutation, individuals X = π(1, …, n), i.e. permutations of (1, …, n)
a) local → 2-swap / 1-translocation
b) global → draw a number K of 2-swaps and apply K 2-swaps
K is a positive random variable; its distribution may be uniform, binomial, geometric, …; its expectation E[K] and variance V[K] may control the mutation strength
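Minimal Python sketches of the 2-swap variants (the 1-translocation is not sketched here); the distribution of K in the global variant, K = 1 + Binomial(n, pm), is an assumption chosen only for illustration.

```python
import random

def two_swap(perm):
    # a) local mutation: exchange the entries at two randomly chosen positions
    y = perm[:]
    i, j = random.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y

def global_permutation_mutation(perm, pm=0.1):
    # b) global mutation: apply a random number K of 2-swaps
    K = 1 + sum(random.random() < pm for _ in range(len(perm)))  # positive random mutation strength
    y = perm
    for _ in range(K):
        y = two_swap(y)
    return y
```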

Slide 22: Evolutionary Algorithm Basics
Variation in P^n: Recombination (two parents), individuals X = π(1, …, n)
a) order-based crossover (OBX)
- select two indices k_1 and k_2 with k_1 ≤ k_2 uniformly at random
- copy genes k_1 to k_2 from the 1st parent to the offspring (keep positions)
- copy the genes from left to right from the 2nd parent, starting after position k_2
b) partially mapped crossover (PMX)
- select two indices k_1 and k_2 with k_1 ≤ k_2 uniformly at random
- copy genes k_1 to k_2 from the 1st parent to the offspring (keep positions)
- copy all genes not already contained in the offspring from the 2nd parent (keep positions)
- from left to right: fill in the remaining genes from the 2nd parent
[Figure: worked example of both operators on two parent permutations.]
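Python sketches of the two operators as described above. They follow the slide's wording rather than other textbook formulations; in particular, the OBX scan order ("from left to right, starting after position k_2") is ambiguous in the transcript, and the sketch reads the 2nd parent left to right while filling the offspring positions cyclically after k_2.

```python
import random

def order_based_crossover(p1, p2):
    n = len(p1)
    k1, k2 = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[k1:k2 + 1] = p1[k1:k2 + 1]                    # segment from 1st parent (positions kept)
    remaining = [g for g in p2 if g not in child]       # genes of 2nd parent, left to right
    positions = [i % n for i in range(k2 + 1, k2 + 1 + n) if child[i % n] is None]
    for pos, gene in zip(positions, remaining):         # fill, starting after position k2
        child[pos] = gene
    return child

def pmx_crossover(p1, p2):
    n = len(p1)
    k1, k2 = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[k1:k2 + 1] = p1[k1:k2 + 1]                    # segment from 1st parent
    for i in range(n):                                  # keep positions of 2nd-parent genes if possible
        if child[i] is None and p2[i] not in child:
            child[i] = p2[i]
    leftovers = [g for g in p2 if g not in child]       # fill remaining slots from left to right
    for i in range(n):
        if child[i] is None:
            child[i] = leftovers.pop(0)
    return child
```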

Slide 23: Evolutionary Algorithm Basics
Variation in R^n: Mutation, individuals X ∈ R^n
additive: Y = X + Z (Z: n-dimensional random vector), i.e. offspring = parent + mutation
Definition: Let f_Z : R^n → R_+ be the p.d.f. of the random vector Z. The set { x ∈ R^n : f_Z(x) > 0 } is termed the support of Z.
a) local → Z with bounded support
b) nonlocal → Z with unbounded support (most frequently used!)
[Figure: one-dimensional densities f_Z centered at the parent, with bounded support for (a) and unbounded support for (b).]
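Minimal Python sketches of the two additive mutation types; the box half-width a and the step size sigma are illustrative assumptions.

```python
import random

def local_mutation_real(x, a=0.1):
    # a) Z uniform on [-a, a]^n: bounded support around the parent
    return [xi + random.uniform(-a, a) for xi in x]

def gaussian_mutation(x, sigma=1.0):
    # b) Z ~ N(0, sigma^2 I_n): unbounded support, every point reachable with positive density
    return [xi + random.gauss(0.0, sigma) for xi in x]
```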

Slide 24: Evolutionary Algorithm Basics
Variation in R^n: Recombination (two parents), individuals X ∈ R^n
a) all crossover variants adapted from B^n
b) intermediate
c) intermediate (per dimension)
d) discrete
e) simulated binary crossover (SBX) → for each dimension, with probability p_c draw the offspring value from a distribution spread around the two parent values
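Minimal Python sketches of variants (b)-(d); the defining formulas were images on the original slide, so the weight chi is an assumption (a fixed value or a uniform draw from [0,1] are common choices), and SBX (e) is not sketched here.

```python
import random

def intermediate(p1, p2, chi=0.5):
    # b) one weight for the whole vector: offspring lies on the segment between the parents
    return [chi * a + (1 - chi) * b for a, b in zip(p1, p2)]

def intermediate_per_dimension(p1, p2):
    # c) an independent weight per dimension: offspring lies in the box spanned by the parents
    child = []
    for a, b in zip(p1, p2):
        u = random.random()
        child.append(u * a + (1 - u) * b)
    return child

def discrete(p1, p2):
    # d) per dimension, copy the value of a randomly chosen parent (uniform crossover on reals)
    return [random.choice(pair) for pair in zip(p1, p2)]
```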

Slide 25: Evolutionary Algorithm Basics
Variation in R^n: Recombination (multiparent), ρ ≥ 3 parents, individuals X ∈ R^n
a) intermediate: a weighted sum of the ρ parents, where the weights are nonnegative and sum to 1 (so the offspring can be any point in the convex hull of the parents)
b) intermediate (per dimension): the same construction applied with independent weights in each dimension
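A minimal Python sketch of variant (a); drawing the weights uniformly and normalizing them is an assumption, chosen only so that the weights are nonnegative and sum to 1.

```python
import random

def multiparent_intermediate(parents):
    # random convex weights place the offspring inside the convex hull of the rho parents
    weights = [random.random() for _ in parents]
    total = sum(weights)
    weights = [w / total for w in weights]               # w_k >= 0 and sum_k w_k = 1
    n = len(parents[0])
    return [sum(weights[k] * parents[k][i] for k in range(len(parents))) for i in range(n)]
```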

Slide 26: Evolutionary Algorithm Basics
Theorem
Let f: R^n → R be a strictly quasiconvex function. If f(x) = f(y) for some x ≠ y, then every offspring generated by intermediate recombination is better than its parents.
Proof: (see the sketch below) ■
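The proof on the slide did not survive the transcript. A hedged sketch of the standard argument, assuming the offspring of intermediate recombination has the form z = χx + (1 − χ)y with χ ∈ (0,1), is:

```latex
% Proof sketch, assuming the offspring is z = \chi x + (1-\chi) y with \chi \in (0,1).
\textbf{Proof sketch.}
Strict quasiconvexity of $f$ means that for all $x \neq y$ and $\chi \in (0,1)$,
$f(\chi x + (1-\chi) y) < \max\{ f(x), f(y) \}$.
Hence, using $f(x) = f(y)$, every offspring $z = \chi x + (1-\chi) y$ satisfies
\[
  f(z) < \max\{ f(x), f(y) \} = f(x) = f(y),
\]
so $z$ is strictly better (for minimization) than both parents. \hfill $\blacksquare$
```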

Slide 27: Evolutionary Algorithm Basics
Theorem
Let f: R^n → R be a differentiable function with f(x) < f(y) for some x ≠ y. If (y − x)' ∇f(x) < 0, then there is a positive probability that an offspring generated by intermediate recombination is better than both parents.
Proof: (see the sketch below) ■
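As above, the proof is missing from the transcript. A hedged sketch, assuming the offspring has the form z = x + χ(y − x) with a recombination weight χ whose distribution assigns positive probability to every interval (0, δ), is:

```latex
% Proof sketch, assuming the offspring is z = x + \chi (y - x) with a recombination weight
% \chi whose distribution assigns positive probability to every interval (0, \delta).
\textbf{Proof sketch.}
Let $d = y - x$. Since $d^{\top} \nabla f(x) < 0$ and $f$ is differentiable, there is a
$\delta \in (0,1)$ such that $f(x + \chi d) < f(x)$ for all $\chi \in (0, \delta)$.
An offspring of intermediate recombination has the form
$z = (1-\chi)\, x + \chi\, y = x + \chi d$; the event $\chi \in (0,\delta)$ has positive
probability, and on this event
\[
  f(z) < f(x) < f(y),
\]
i.e.\ the offspring is better than both parents. \hfill $\blacksquare$
```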