
1 Evolution Strategies for Industrial Applications. Thomas Bäck, November 26, 2004. Managing Director / CTO, NuTech Solutions GmbH / Inc., Martin-Schmeißer-Weg 15, D-44227 Dortmund, baeck@nutechsolutions.de, Tel.: +49 (0)231 / 72 54 63-10, Fax: +49 (0)231 / 72 54 63-29. Natural Computing, Leiden Institute of Advanced Computer Science (LIACS), Niels Bohrweg 1, NL-2333 CA Leiden, baeck@liacs.nl, Tel.: +31 (0)71 527 7108, Fax: +31 (0)71 527 6985.

2 Background I. Daniel Dennett: Biology = Engineering.

3 Background II. Realistic scenario...

4 Background III. Phenotypic evolution... and genotypic.

5 Optimization. Objectives, e.g.: costs (min), quality (max), error (min), stability (max), profit (max), ... Difficulties: high-dimensional; nonlinear, non-quadratic; multimodal; noisy, dynamic, discontinuous. Evolutionary landscapes are like that!

6 Overview. Evolutionary Algorithm Applications: Examples. Evolutionary Algorithms: Some Algorithmic Details (Genetic Algorithms, Evolution Strategies). Some Theory of EAs (Convergence Velocity Issues). Other Examples (Drug Design, Inverse Design of CAs). Summary.

7 Modeling – Simulation – Optimization. (Diagram: Modeling / Data Mining, Simulation, Optimization.)

8 General Aspects. The EA optimizer evaluates candidate solutions against a business process model; the evaluation may come from a simulation, a function, a model built from data, an experiment, or subjective function(s).

9 Examples I: Inflatable Knee Bolster Optimization. FE model components (see MAZDA picture): support plate (FEM #4), load distribution plate (FEM #3), tether (FEM #5), knee bag (FEM #2) with vent hole; straps are defined in the knee bag; deployed knee bag (unit only) has a volume of 14 L. Results: low-cost ES: 0.677; GA (Ford): 0.72; Hooke-Jeeves DoE: 0.88.

10 IKB: Previous Designs.

11 IKB: Problem Statement. Objective: minimize P_total, subject to: left femur load <= 7000, right femur load <= 7000.

12 IKB Results I: Hooke-Jeeves. Quality: 8.888; simulations: 160.

13 IKB Results II: (1+1)-ES. Quality: 7.142; simulations: 122.

14 Optical Coatings: Design Optimization. Nonlinear mixed-integer problem of variable dimensionality. Minimize the deviation from the desired reflection behaviour. Excellent synthesis method; robust and reliable results.

15 Dielectric Filter Design Problem. Dielectric filter design with n = 40 layers assumed; layer thicknesses x_i in [0.01, 10.0]. Quality function: sum of quadratic penalty terms, where the penalty terms are 0 iff the constraints are satisfied. Client: Corning, Inc., Corning, NY.

16 Benchmark Results: Overview of Runs. Factor 2 in quality; factor 10 in effort; reliable, repeatable results.

17 Problem Topology Analysis: An Attempt. Grid evaluation for 2 variables; close to the optimum (from a vector of quality 0.0199); global view (left) vs. local view (right).

18 Examples II: Bridgman Casting Process. 18 speed variables (continuous) for the casting schedule; turbine blade after casting. FE mesh of 1/3 of the geometry: 98,610 nodes, 357,300 tetrahedrons, 92,830 radiation surfaces: a large problem. Run time varies from 16 h 30 min to 32 h (SGI Origin, R12000, 400 MHz); each run treats 38.3 GB of view factors (49 positions)!

19 Examples II: Bridgman Casting Process. Quality comparison of the initial and optimized configurations: initial (DoE), GCM (commercial gradient-based method), evolution strategy; global quality; turbine blade after casting.

20 Examples IV: Traffic Light Control. Generates green times for the next switching schedule. Minimization of total delay / number of stops. Better results (3–5%) and higher flexibility than with traditional controllers. Dynamic optimization, depending on actual traffic (measured by control loops). Client: Dutch Ministry of Traffic, Rotterdam, NL.

21 Examples V: Elevator Control. Minimization of passenger waiting times. Better results (3–5%) and higher flexibility than with traditional controllers. Dynamic optimization, depending on actual traffic. Client: Fujitec Co. Ltd., Osaka, Japan.

22 Examples VI: Metal Stamping Process. Minimization of defects in the produced parts. Optimization of geometric parameters and forces. Fast algorithm; finds very good results. Client: AutoForm Engineering GmbH, Dortmund.

23 Examples VII: Network Routing. Minimization of end-to-end blockings under service constraints. Optimization of routing tables for existing, hard-wired networks. 10%–1000% improvement. Client: SIEMENS AG, München.

24 Examples VIII: Nuclear Reactor Refueling. Minimization of total costs. Creates new fuel assembly reload patterns. Clear improvements (1%–5%) over existing expert solutions. Huge cost savings. Client: SIEMENS AG, München.

25 Two-Phase Nozzle Design (Experimental). Experimental design optimisation: optimise efficiency. Initial design... evolves... final design: 32% improvement in efficiency.

26 Multipoint Airfoil Optimization (1). High lift at start, low drag at cruise; 22 design parameters.

27 Multipoint Airfoil Optimization (2). Find pressure profiles that are a compromise between two given target pressure distributions under two given flow conditions! Pareto set after 1000 simulations; three compromise wing designs.

28 Evolutionary Algorithms: Some Algorithmic Details.

29 Unifying Evolutionary Algorithm.
t := 0;
initialize(P(t));
evaluate(P(t));
while not terminate do
P'(t) := mating_selection(P(t));
P''(t) := variation(P'(t));
evaluate(P''(t));
P(t+1) := environmental_selection(P''(t) ∪ Q);
t := t+1;
od
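The pseudocode above can be turned into a minimal, runnable Python sketch. All concrete choices below are illustrative assumptions, not the slides' specification: Q is taken to be the current parents (a "plus", i.e. elitist, environmental selection), and the demo minimizes the sphere function with a (5+10)-style setup.

```python
import random

def evolutionary_algorithm(init, evaluate, mating_selection, variation,
                           environmental_selection, generations=50):
    """Generic EA skeleton mirroring the slide's pseudocode.

    Individuals are (solution, fitness) pairs; Q is taken to be the
    current parents, i.e. a 'plus' (elitist) environmental selection.
    """
    population = [(x, evaluate(x)) for x in init()]
    for _ in range(generations):
        parents = mating_selection(population)            # P'(t)
        offspring = [variation(x) for x, _ in parents]    # P''(t)
        offspring = [(y, evaluate(y)) for y in offspring]
        population = environmental_selection(offspring + population)
    return population

# Tiny demo: minimize f(x) = sum(x_i^2) over R^3.
random.seed(1)
f = lambda x: sum(v * v for v in x)
pop = evolutionary_algorithm(
    init=lambda: [[random.uniform(-5, 5) for _ in range(3)] for _ in range(5)],
    evaluate=f,
    mating_selection=lambda P: [random.choice(P) for _ in range(10)],
    variation=lambda x: [v + random.gauss(0, 0.5) for v in x],
    environmental_selection=lambda P: sorted(P, key=lambda p: p[1])[:5],
)
best = min(fit for _, fit in pop)
```

Because the environmental selection here is elitist, the best fitness in the population can never get worse from one generation to the next.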

30 Evolutionary Algorithm Taxonomy. Evolution Strategies, Genetic Algorithms, Genetic Programming, Evolutionary Programming, Classifier Systems; many mixed forms: agent-based systems, swarm systems, A-life systems, ...

31 Genetic Algorithms vs. Evolution Strategies. Evolution Strategies: real-valued representation; normally distributed mutations; fixed recombination rate (= 1); deterministic selection; creation of an offspring surplus; self-adaptation of strategy parameters (variance(s), covariances). Genetic Algorithms: binary representation; fixed mutation rate p_m (= 1/n); fixed crossover rate p_c; probabilistic selection; identical population size; no self-adaptation.

32 Genetic Algorithms. Often binary representation. Mutation by bit inversion with probability p_m. Various types of crossover, with probability p_c: k-point crossover, uniform crossover. Probabilistic selection operators: proportional selection, tournament selection. Parent and offspring population sizes identical. Constant strategy parameters.

33 Mutation. Example: 011101010000001 becomes 011100010100001. Mutation by bit inversion with probability p_m; p_m identical for all bits; p_m small (e.g., p_m = 1/l).
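A minimal sketch of this operator in Python; the helper name and the example parent are illustrative, and the default rate 1/l follows the slide's suggestion:

```python
import random

def bit_flip_mutation(bits, p_m=None, rng=random):
    """Flip each bit independently with probability p_m (default 1/l)."""
    if p_m is None:
        p_m = 1.0 / len(bits)
    return [b ^ 1 if rng.random() < p_m else b for b in bits]

parent = [0, 1, 1, 1, 0, 1, 0, 1, 0, 1]
child = bit_flip_mutation(parent)
```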

34 Crossover. Crossover applied with probability p_c, identical for all individuals. k-point crossover: k points chosen randomly. Example: 2-point crossover.
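The k-point scheme can be sketched as follows; the explicit cut points 2 and 5 in the demo mirror the slide's 2-point example, and all names are illustrative:

```python
import random

def k_point_crossover(a, b, points):
    """k-point crossover with given cut points; children alternate
    between segments of parent a and parent b."""
    child1, child2 = [], []
    take_a = True
    prev = 0
    for cut in list(points) + [len(a)]:
        src1, src2 = (a, b) if take_a else (b, a)
        child1 += src1[prev:cut]
        child2 += src2[prev:cut]
        take_a = not take_a
        prev = cut
    return child1, child2

def random_k_point_crossover(a, b, k=2, rng=random):
    """Choose k distinct cut points at random, then recombine."""
    points = sorted(rng.sample(range(1, len(a)), k))
    return k_point_crossover(a, b, points)

# 2-point example with explicit cut points 2 and 5:
c1, c2 = k_point_crossover([0] * 8, [1] * 8, [2, 5])
```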

35 Selection. Fitness-proportional selection: selection probability proportional to fitness. Tournament selection: randomly select q individuals (q much smaller than the population size), copy the best of these q into the next generation, and repeat until the next generation is filled; q is the tournament size (often q = 2).
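Tournament selection as described above can be sketched in Python (fitness-proportional selection is omitted; names and the demo population are illustrative, and contestants are drawn with replacement, an assumption the slide does not settle):

```python
import random

def tournament_selection(population, fitness, mu, q=2, rng=random):
    """mu times: draw q random contestants, copy the best of them."""
    chosen = []
    for _ in range(mu):
        contestants = [rng.randrange(len(population)) for _ in range(q)]
        winner = max(contestants, key=lambda i: fitness(population[i]))
        chosen.append(population[winner])
    return chosen

random.seed(0)
pop = [3, 1, 4, 1, 5, 9, 2, 6]
next_gen = tournament_selection(pop, fitness=lambda x: x, mu=8, q=2)
```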

36 Evolution Strategies. Real-valued representation. Normally distributed mutations. Various types of recombination: discrete (exchange of variables), intermediate (averaging), involving two or more parents. Deterministic selection with offspring surplus (λ > μ): elitist (μ+λ), non-elitist (μ,λ). Self-adaptation of strategy parameters.

37 Mutation. σ-adaptation by means of the 1/5-success rule, or by self-adaptation. Creation of a new solution by adding normally distributed mutations (formula on slide). Convergence speed: about 10·n down to 5·n is possible. More complex / powerful strategies: individual step sizes σ_i; covariances.
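A hedged sketch of a (1+1)-ES with the 1/5-success rule; the measurement window of 10 generations and the adaptation factor 0.85 are common textbook choices (after Schwefel), assumed here rather than taken from the slide:

```python
import random

def one_plus_one_es(f, x, sigma=1.0, generations=200, rng=random):
    """(1+1)-ES with Rechenberg's 1/5-success rule: measure the success
    frequency over 10 generations; if it exceeds 1/5, enlarge sigma,
    if it falls below 1/5, shrink it (factor 0.85, after Schwefel)."""
    fx = f(x)
    successes = 0
    for g in range(1, generations + 1):
        y = [xi + rng.gauss(0.0, sigma) for xi in x]
        fy = f(y)
        if fy < fx:                    # minimization, elitist acceptance
            x, fx = y, fy
            successes += 1
        if g % 10 == 0:
            if successes > 2:          # success rate above 1/5
                sigma /= 0.85
            elif successes < 2:        # success rate below 1/5
                sigma *= 0.85
            successes = 0
    return x, fx

random.seed(42)
sphere = lambda v: sum(c * c for c in v)
x_best, f_best = one_plus_one_es(sphere, [5.0, 5.0, 5.0])
```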

38 Self-Adaptation. Learning while searching: an intelligent method. Different algorithmic approaches, e.g.: pure self-adaptation (mutational step size control, MSC); derandomized step size adaptation; covariance adaptation.

39 Self-Adaptive Mutation. Cases shown: n = 2, nσ = 1, nα = 0; n = 2, nσ = 2, nα = 0; n = 2, nσ = 2, nα = 1 (nσ step sizes, nα rotation angles).

40 Self-Adaptation: Motivation. General search algorithm. Geometric convergence, but arbitrarily slow if σ is wrongly controlled! No deterministic / adaptive scheme for arbitrary functions exists. Self-adaptation: on-line evolution of strategy parameters (step size and direction). Various schemes: Schwefel (one σ, n σ's, covariances); Rechenberg (MSA); Ostermeier, Hansen (derandomized, covariance matrix adaptation); EP variants (meta-EP, Rmeta-EP); Bäck (application to p_m in GAs).

41 Self-Adaptation: Dynamic Sphere. Optimum σ: transition time proportional to n; the optimum σ is learned by self-adaptation.

42 Selection: (μ,λ) and (μ+λ).

43 Possible Selection Operators. (1+1)-strategy: one parent, one offspring. (1,λ)-strategies: one parent, λ offspring; example: (1,10)-strategy; derandomized / self-adaptive / mutative step size control. (μ,λ)-strategies: μ > 1 parents, λ > μ offspring; example: (2,15)-strategy; includes recombination; can overcome local optima. (μ+λ)-strategies: elitist strategies.
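Comma and plus selection differ only in who competes for survival; a minimal sketch under the assumption that fitness is to be minimized (names and the demo values are illustrative):

```python
def comma_selection(offspring, mu, key):
    """(mu, lambda): the next parents come from the offspring only
    (non-elitist; parents are forgotten)."""
    return sorted(offspring, key=key)[:mu]

def plus_selection(parents, offspring, mu, key):
    """(mu + lambda): parents compete with the offspring (elitist)."""
    return sorted(parents + offspring, key=key)[:mu]

# Demo with scalar fitness values (lower is better):
parents = [4.0, 2.0]
offspring = [3.0, 5.0, 6.0, 1.0]
```

With these values, comma selection discards the good parent 2.0, while plus selection keeps it: that is exactly the elitist / non-elitist distinction on the slide.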

44 Advantages of Evolution Strategies. Self-adaptation of strategy parameters. Direct, global optimizers! Faster than GAs! Extremely good solution quality. Very small number of function evaluations. Applicable to dynamic optimization problems, design optimization problems, discrete or mixed-integer problems, and experimental design optimisation. Combine well with meta-modeling techniques.

45 Some Theory of EAs.

46 Robust vs. Fast. Global convergence with probability one: general, but useless for practical purposes. Convergence velocity: local analysis only, for specific functions only.

47 Convergence Velocity Analysis, ES. A convex function ("sphere model"). Simplest case: the (1,λ)-ES. Illustration: a (1,4)-ES.

48 Convergence Velocity Analysis, ES. Order statistics: p_λ(z) denotes the p.d.f. of Z_λ, the best of the λ offspring. Idea: the best offspring has the smallest distance r to the optimum, i.e. the largest gain z'. The progress rate then follows from geometric considerations (formulas on slide).

49 Convergence Velocity Analysis, ES. Using the normalizations σ* = σn/R and φ* = φn/R, one finally gets the dimensionless progress law φ* = c_{1,λ} σ* − (σ*)²/2.

50 Convergence Velocity Analysis, ES. Convergence velocity of the (μ,λ)-ES and the (μ+λ)-ES (formulas on slide).

51 Convergence Velocity Analysis, GA. (1+1)-GA, (1,λ)-GA, (1+λ)-GA for the counting-ones function. Convergence velocity: expected gain in the number of correct bits, with mutation rate p, q = 1 − p, and k_max = l − f_a.

52 Convergence Velocity Analysis, GA. Optimal mutation rate? Absorption times follow from the transition matrix in block form (formulas on slide).

53 Convergence Velocity Analysis, GA. p too large: exponential absorption time. p too small: almost constant. Optimal: O(l ln l).

54 Convergence Velocity Analysis, GA. (1,λ)-GA (k_min = −f_a) and (1+λ)-GA (k_min = 0).

55 Convergence Velocity Analysis, EA. (1,λ)-GA and (1+λ)-GA vs. (1,λ)-ES and (1+λ)-ES. Conclusion: a unifying, search-space-independent theory!?

56 Current Drug Targets: GPCR. http://www.gpcr.org/

57 Goals (in Cooperation with LACDR). CI methods: automatic knowledge extraction from biological databases (fuzzy rules); automatic optimisation of structures (evolution strategies). Exploration for drug discovery and de novo drug design. (Figures: charge distribution on the VdW surface of CGS15943; "fingerprint"; initialisation and final (optimized) structures; a new derivative with good receptor affinity.)

58 Evolutionary DNA-Computing (with IMB). DNA molecule = solution candidate! Potential advantage: more than 10^12 candidate solutions in parallel. Biological operators: cutting, splicing, ligating, amplification, mutation. Current approaches are very limited. Our approach: a suitable NP-complete problem, modern technology, scalability (n > 30).

59 UP of CAs (= Inverse Design of CAs). 1D CAs: earlier work by Mitchell et al., Koza, ... A transition rule assigns a new state to each neighborhood configuration; a rule for a neighborhood of radius r can be expressed by 2^(2r+1) bits, so there are 2^(2^(2r+1)) rules for a binary 1D CA. Example state with neighborhood of radius r = 2: 100001101010100.

60 UP of CAs (rule encoding). Assume r = 1: the rule length is 2³ = 8 bits, e.g. rule 10000110, whose bits correspond to the neighborhoods 000, 001, 010, 011, 100, 101, 110, 111.
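The rule-table encoding can be exercised with a small sketch; the periodic boundary and the neighborhood-to-index mapping are assumptions consistent with the slide's ordering 000 ... 111:

```python
def ca_step(state, rule_bits):
    """One synchronous update of a binary 1D CA with radius r = 1 and
    a periodic boundary.  rule_bits[i] gives the new cell state for
    neighborhood i, with neighborhoods ordered 000, 001, ..., 111."""
    n = len(state)
    out = []
    for i in range(n):
        nb = (state[(i - 1) % n] << 2) | (state[i] << 1) | state[(i + 1) % n]
        out.append(rule_bits[nb])
    return out

rule_bits = [int(c) for c in "10000110"]   # the slide's example rule
```

Under this rule, an all-zero state maps to all ones (neighborhood 000 yields 1), and an all-one state maps back to all zeros (neighborhood 111 yields 0).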

61 Inverse Design of CAs: 1D. Time evolution diagram.

62 Inverse Design of CAs: 1D. Majority problem; particle-based rules; fitness values: 0.76, 0.75, 0.76, 0.73.

63 Inverse Design of CAs: 1D. Rule classes: rules that don't care about the initial state; block-expanding rules; particle-communication-based rules.

64 Inverse Design of CAs: 1D Majority Records. Gacs, Kurdyumov, Levin 1978 (hand-written): 81.6%; Davis 1995 (hand-written): 81.8%; Das 1995 (hand-written): 82.178%; David, Forrest, Koza 1996 (GP): 82.326%.

65 Inverse Design of CAs: 2D. Generalization to 2D (nD) CAs? Von Neumann vs. Moore neighborhood (r = 1); generalization to r > 1 is straightforward. Search space size for a GA: 2^32 rules (von Neumann, 5 cells) vs. 2^512 rules (Moore, 9 cells).

66 Inverse Design of CAs. Learning an AND rule; input boxes are defined; some evolution plots.

67 Inverse Design of CAs. Learning an XOR rule; input boxes are defined; some evolution plots.

68 Inverse Design of CAs. Learning the majority task; 84/169 in a), 85/169 in b); fitness value: 0.715.

69 Inverse Design of CAs. Learning pattern compression tasks.

70 Evolution = Computation? Yes: search and optimization are fundamental problems and tasks in many applications (learning, engineering, ...).

71 Summary. Explicative models based on fuzzy rules. Descriptive models based on, e.g., the Kriging method: few data points necessary, high modeling accuracy; used in product design, quality control, management decision support, prediction and optimization. Optimization based on evolution strategies (and traditional methods): few function evaluations necessary; robust, widely usable, excellent solution quality; self-adaptivity (easy to use!). Patents: US no. 5,826,251; Germany nos. 43 08 083, 44 16 465, 196 40 635.

72 Questions? Thank you very much for your time!

73 Evolutionary Algorithm Applications. Few evaluations: response surface approximations.

74 ES plus Response Surface Approximation. Learning of an RSA; utilization for the selection of promising points. (Plots: original function; approximations from 100, 200, and 400 points.) Bäck et al., Metamodel-Assisted Evolution Strategies, PPSN VII, Granada, Spain (2002).

75 Data Infrastructure for Optimization with RSA. Components: optimizer; aggregation of quality and constraint values; simulator (e.g., CASTS, FLUENT) behind an interface; response surface approximator; database of trial results. The optimizer sends parameter values (within given parameter ranges, from initial values) to the simulator and receives aggregated quality and constraint values.

76 Estimation of the Approximation Error. Criteria for selecting trial points: estimated quality value (from the database); points with high expected quality; points located in unexplored regions of the search space.

77 Method: Kriging Interpolation. Invented for the prediction of gold distribution in African gold mines. Used here as the interpolation method for the RSA. Model assumption: realizations of an n-dimensional random walk. Reconstruction according to the maximum likelihood principle. Error = minimized quadratic error from the maximum likelihood analysis. (Figures: discrete and continuous random walk.)

78 Data-Driven Modeling by Kriging Interpolation. Given: a list of m trial results (parameter vectors with measured quality values). Searched for: an approximate estimation model for unknown points, together with an estimated approximation error.
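A minimal, illustrative Kriging / Gaussian-process interpolation sketch: 1D inputs, a Gaussian kernel, zero mean, no noise. It fixes the kernel width instead of estimating it by maximum likelihood as the slides describe, and it omits the error estimate; all names are assumptions.

```python
import math

def kriging_predict(xs, ys, x, theta=1.0):
    """Simple-Kriging sketch: Gaussian kernel exp(-theta * d^2), zero
    mean, noise-free data.  theta is an assumed fixed kernel width."""
    k = lambda a, b: math.exp(-theta * (a - b) ** 2)
    n = len(xs)
    # Augmented system [K | y]; solve K w = y by Gaussian elimination.
    A = [[k(xs[i], xs[j]) for j in range(n)] + [ys[i]] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (A[r][n] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    # The prediction is a kernel-weighted combination of the data points;
    # with noise-free data it interpolates them exactly.
    return sum(w[i] * k(x, xs[i]) for i in range(n))

xs, ys = [0.0, 1.0, 2.0], [1.0, 2.0, 0.0]
```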

79 Kriging Interpolation: Theory. Stochastic Gaussian process with a covariance function (kernel). Maximum likelihood estimation of the model parameters of the Gaussian process: a 1-dimensional optimization, with a matrix inversion at each iteration.

80 Kriging Interpolation: Theory. Estimation of the prediction and of the estimated error of the prediction (formulas on slide).

81 Kriging Method: Approximation / Error Estimation. (Plots of f(x) over x; MSE: estimated approximation error.)

82 Kriging Method: Airfoil Optimization Example. Comparison of various strategies: fast sequential quadratic programming; pattern search; MC sampling; (2+10)-ES; RSA-supported ES.

83 Evolutionary Algorithm Applications. Noisy objectives: thresholding.

84 Noisy Fitness Functions: Thresholding. Fitness evaluation is disturbed by noise, e.g. by the stochastic distribution of passengers within an elevator system, and in traffic control problems in general. The probability of generating a real improvement is very small. Therefore, introduce an explicit barrier into the (1+1)-ES to distinguish real improvements from overvalued individuals: only accept an offspring if it outperforms the parent by at least a value τ (the threshold).
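The thresholded acceptance rule can be sketched as follows (minimization assumed; the noisy-sphere demo and all parameter values are illustrative, not the slides' settings):

```python
import random

def accept(parent_f, offspring_f, tau):
    """Thresholded (1+1)-ES acceptance for minimization: the offspring
    must outperform the parent by at least tau."""
    return offspring_f <= parent_f - tau

def noisy_one_plus_one_es(f, x, sigma, tau, generations, noise, rng=random):
    """(1+1)-ES on a noisy objective; only threshold-beating offspring survive."""
    fx = f(x) + rng.gauss(0.0, noise)
    for _ in range(generations):
        y = [xi + rng.gauss(0.0, sigma) for xi in x]
        fy = f(y) + rng.gauss(0.0, noise)
        if accept(fx, fy, tau):
            x, fx = y, fy
    return x

random.seed(3)
sphere = lambda v: sum(c * c for c in v)
x_final = noisy_one_plus_one_es(sphere, [5.0, 5.0], sigma=0.5, tau=0.3,
                                generations=300, noise=0.1)
```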

85 Finding the Optimal Threshold. For Gaussian noise, a general optimal threshold exists; for the sphere model it can be expressed in terms of the distance R to the optimum (formulas on slide).

86 Influence of Thresholding (I). Normalized progress rate φ* over normalized mutation strength σ*. Solid lines: optimal threshold; dashed lines: without thresholding; crosses: data measured in ES runs (sphere). Noise strength increases from top to bottom.

87 Influence of Thresholding (II). Normalized progress rate φ* over normalized threshold τ*; noise strength increases from top to bottom.

88 Applications: Elevator Control. Simulation of an elevator group controller takes a long time; instead, use an artificial problem tightly related to the real-world problem: the S-Ring.

89 Application in Elevator Controller: S-Ring. Quality gain q over the distance to the optimum R (0 = greedy, 100 = optimum). Only thresholding leads to a positive quality gain (τ = 0.3); a too-large threshold value does not permit any progress (τ = 1.0).

90 Evolutionary DNA-Computing. Example: Maximum Clique Problem. Problem instance: a graph G = (V, E). Feasible solution: V' ⊆ V such that all vertices in V' are pairwise connected. Objective function: size |V'| of the clique V'. Optimal solution: a clique V' that maximizes |V'|. Example with 8 vertices: {2,3,6,7} is a maximum clique (01100110); {4,5,8} is a clique (00011001).

91 DNA-Computing: Classical Approach. 1: X := randomly generate DNA strands representing all candidates; 2: remove the set Y of all non-cliques from X: C := X − Y; 3: identify the strand with smallest length (largest clique). Based on filtering out the optimal solution. Fails for large n (exponential growth). Applied in the lab for n = 6 (Ouyang et al., 1997); limited to n = 36 (nanomole operations).

92 DNA-Computing: Evolutionary Approach. 1: generate an initial random population P; 2: while not terminate do 3: P := amplify and mutate P; 4: remove the set Y of all non-cliques from P: P := P − Y; 5: P := select shortest DNA strands from P; 6: od. Based on evolving a (near-)optimal solution. Also applicable for large n. Currently tested in the lab (Leiden, IMB).
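The loop above can be simulated in silico. The graph below is a hypothetical 5-vertex example (an assumption, not the slide's 8-vertex graph), and "select shortest strands" becomes "keep the largest cliques", since in the DNA encoding absent vertices lengthen the strand:

```python
import random

def is_clique(members, adj):
    """True iff every pair of selected vertices is connected."""
    return all(adj[u][v] for i, u in enumerate(members) for v in members[i + 1:])

def evolutionary_clique(adj, pop_size=100, generations=40, rng=random):
    """In-silico sketch of the evolutionary DNA loop: amplify and mutate,
    remove all non-cliques, keep the largest cliques."""
    n = len(adj)
    population = [[v] for v in range(n)]        # single vertices are cliques
    for _ in range(generations):
        # Amplification with mutation: each copy tries to add one random vertex.
        copies = [sorted(set(c + [rng.randrange(n)]))
                  for c in population for _ in range(3)]
        population = [c for c in population + copies if is_clique(c, adj)]
        population.sort(key=len, reverse=True)  # 'select shortest strands'
        del population[pop_size:]
    return population[0]

# Hypothetical graph: triangle {0, 1, 2} plus edges 2-3 and 3-4.
edges = {(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)}
adj = [[(min(i, j), max(i, j)) in edges for j in range(5)] for i in range(5)]
random.seed(7)
best = evolutionary_clique(adj)
```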

93 Scalability Issues: Maximum Clique, Simulation Results, (1,λ)-GA (best of 10 runs), for population sizes λ = 10, 100, 1000, 10000:

Problem      n    λ=10  λ=100  λ=1000  λ=10000  Opt.
brock200_1   200  14    17     17      19       21
brock200_2   200  6     9      8       9        12
brock200_3   200  10    11     12      12       15
brock200_4   200  11    12     13      14       17
hamming8-4   256  ---   12     12      16       16
p_hat300-1   300  ---   6      7       7        8
p_hat300-2   300  ---   19     19      20       25

Averages (not shown here) confirm the trends. Theory for large (NOT infinite) population sizes (beyond the c_{1,λ} theory)?

94 Evolutionary Algorithm Applications. Multiple Criteria Decision Making (MCDM).

95 Multi-Criteria Optimization (1). Most problems have more than one aspect to optimise: conflicting criteria! Classical optimization techniques map multiple criteria to one single value, e.g. by a weighted sum. But how can optimal weights be determined? Evolution strategies can directly use the concept of Pareto dominance.

96 Multi-Criteria Optimization (2). Multi-criteria optimization does not mean: deciding on "what is a good compromise" before optimization (e.g. by choosing weighting factors), or finding one single optimal solution. It means: deciding on a compromise after optimization, and finding a set of multiple compromise solutions. Evolutionary multi-criteria optimization means: using the population structure to represent the set of compromise solutions, and using the concept of Pareto dominance.

97 Multi-Criteria Optimization (3). (Plot: criterion 1 vs. criterion 2; weighted-sum solution, alternative solution, theoretical Pareto set.)

98 Pareto Dominance: Definition. Assume two design solutions a and b with F(a) = (f_1(a), ..., f_k(a)) and F(b) = (f_1(b), ..., f_k(b)). If all f_i(a) are better than f_i(b), then a dominates b. If all f_i(b) are better than f_i(a), then b dominates a. If there are i and j such that f_i(a) is better than f_i(b) but f_j(b) is better than f_j(a), then a and b do not dominate each other ("are equal", "are incomparable").
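The definition can be written directly as a predicate. Note that the slide states the strict-in-all form; the sketch below uses the commonly used weak form (at least as good in every criterion, strictly better in at least one), assuming minimization:

```python
def dominates(a, b):
    """a dominates b (minimization): a is at least as good in every
    criterion and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
```

For example, (1, 2) dominates (2, 3), while (1, 3) and (2, 2) are incomparable: each is better in one criterion.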

99 Multipoint Airfoil Optimization (3). Find pressure profiles that are a compromise between two given target pressure distributions under two given flow conditions! (Plots: pressure profile at high-lift and at low-drag flow conditions.)

100 Evolutionary Multi-Criteria Optimization. General idea: after selection, the parent population should contain as many dominating individuals as possible and be evenly distributed along the Pareto front. Two concepts to achieve this: NSGA-II (Deb) / NSES; SPEA2 (Zitzler, Thiele).

101 Nondominated Sorting Evolutionary Algorithm (1). Selecting dominating individuals by nondominated sorting: 1. start with rank 1 and the complete population; 2. in the current population, find the Pareto set; 3. assign the individuals of the Pareto set the same rank; 4. increment the rank; 5. remove the Pareto set from the current population; 6. start over at 2. Individuals with low ranks are preferred!
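Steps 1-6 can be sketched as repeated peeling of Pareto fronts (minimization assumed; the weak form of dominance is used, and all names are illustrative):

```python
def dominates(a, b):
    """Weak Pareto dominance for minimization."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_sort(points):
    """Assign rank 1 to the nondominated front, remove it, assign rank 2
    to the next front, and so on."""
    remaining = dict(enumerate(points))
    ranks = {}
    rank = 1
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(remaining[j], remaining[i])
                            for j in remaining if j != i)]
        for i in front:
            ranks[i] = rank
            del remaining[i]
        rank += 1
    return ranks

points = [(1, 1), (2, 2), (3, 3), (1, 3)]
ranks = nondominated_sort(points)
```

Here (1, 1) alone forms the first front; (2, 2) and (1, 3) are mutually incomparable and share rank 2; (3, 3) is left for rank 3.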

102 Nondominated Sorting Evolutionary Algorithm (2). First Pareto front: rank 1; second Pareto front: rank 2; third Pareto front: rank 3.

103 Nondominated Sorting Evolutionary Algorithm (3). Estimating population density from the distances d_1, d_2 to the nearest neighbours. Prefer individuals with low density!

104 NSEA and Evolution Strategies. Observation: Pareto dominance is expressed as an integer value (the rank); density estimation is expressed as a real value between 0 and 1. Idea: set fitness to rank + density. With increasing density, the fitness increases towards rank + 1; with decreasing density, it decreases towards rank; dominating individuals have low fitness, independent of their density. Use this fitness value in a standard (μ+λ) evolution strategy (the archive is implicit!).

