1 Angers, 10 June 2010 Multi-Objective Optimisation (II) Matthieu Basseur

2 Outline: Motivations, Evolutionary Multiobjective Optimisation, Quality Indicators, Indicator-Based Evolutionary Algorithm, Multiobjective Local Searches, Indicator-Based Local Search, Hypervolume-Based Optimisation, Conclusions and Perspectives

3 Introduction
About me:
- 2001-2005: PhD in Lille (France), supervised by E-G. Talbi
- 2005: research visitor at ETH Zurich (Switzerland), with E. Zitzler
- 2006-2007: Research Assistant at Nottingham University (England)
- Since September 2007: Assistant Professor at the University of Angers
Research interests:
- Main area: multiobjective optimisation
- Metaheuristics for multiobjective optimisation (GAs, local search, memetic algorithms, path relinking, and also exact methods)
- Hybrid and adaptive metaheuristics (cooperation, parallelism)
- Multiobjective optimisation under uncertainty
- Applications (continuous test functions, flow-shop problem, routing problem…)
Motivations: mainly linked to my previous research activities

4 Multiobjective Optimisation (I) …covered by V. Barichard two weeks ago:
- Single-objective optimisation: optimisation problems, resolution approaches
- Multiobjective optimisation problems: description, dominance relation
- Resolution approaches and result evaluation: Pareto dominance based algorithms, comparison of outputs
Today: new trends in multiobjective optimisation

5 Motivations
Efficient optimisation algorithms are often:
- Complex: complex mechanisms (diversification, evaluation…), hybrid algorithms
- Parameter-dependent: numerous parameters with a great influence on the results (set by hand or adaptively), dependent on the size of the problem and on the problem treated
We need generic algorithms that are:
- Simple
- Adaptable to a range of optimisation problems
- Based on a small number of parameters
- …but efficient!
→ Design of generic multiobjective metaheuristics, rather than problem-specific optimisation

6 Outline: Motivations, Evolutionary Multiobjective Optimisation, Quality Indicators, Indicator-Based Evolutionary Algorithm, Multiobjective Local Searches, Indicator-Based Local Search, Hypervolume-Based Optimisation, Conclusions and Perspectives

7 Multiobjective optimisation
[Figure: objective space (f1, f2) with non-dominated and dominated feasible solutions]
Pareto dominance: y dominates z if and only if ∀i ∈ [1, …, n], y_i ≤ z_i and ∃i ∈ [1, …, n], y_i < z_i.
Non-dominated solution: a solution x is non-dominated if there exists no solution which dominates x.
Goal: find a good-quality and well-diversified set of non-dominated solutions.
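To make the dominance relation concrete, here is a minimal Python sketch (my own code, not from the slides) of the dominance test for a minimisation problem; the helper name `dominates` is an assumption.

```python
from typing import Sequence

def dominates(y: Sequence[float], z: Sequence[float]) -> bool:
    """Return True if y Pareto-dominates z (minimisation):
    y_i <= z_i for every objective and y_i < z_i for at least one."""
    assert len(y) == len(z)
    return (all(yi <= zi for yi, zi in zip(y, z))
            and any(yi < zi for yi, zi in zip(y, z)))

# The two solutions of the next slide, a = (4, 7) and b = (8, 5), are incomparable:
print(dominates((4, 7), (8, 5)), dominates((8, 5), (4, 7)))  # False False
```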

8 Multiobjective optimisation
[Figure: objective space (f1, f2) with non-dominated and dominated feasible solutions]
No total order relation exists (unlike the single-objective case): we cannot compare solution a = (4, 7) with solution b = (8, 5).
Resulting questions:
- How to assign a fitness to solutions in an evolutionary algorithm (for selection)?
- How to find good compromise solutions?
- How to evaluate the different outputs obtained by different algorithms?
Goal: find a good-quality and well-diversified set of non-dominated solutions.

9 Evolutionary Multiobjective Optimisation?
Multiobjective optimisation: find a set of compromise solutions. Evolutionary algorithms (EAs): evolve a set of solutions.
→ EAs are naturally well suited to finding multiple efficient solutions in a single simulation run; a tremendous number of multiobjective evolutionary algorithms have been proposed over the last two decades.
[Figure: a population of solutions in objective space (f1, f2)]

10 Multiobjective fitness assignment
Fitness assignment: the central point of (population-based) multiobjective metaheuristics.
Generic population-based search algorithm:
  create initial population P
  repeat
    generate a new solution x
    add x to the population P
    evaluate the fitness of solution x (and update P?)
    delete the worst solution of P
  until the termination criterion is verified
  return P
→ We need to 'rank' solutions.
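The generic loop can be sketched in Python as follows; this is my own illustrative code, and `generate_solution` and `fitness` are hypothetical problem-specific placeholders rather than anything defined in the slides.

```python
import random

def generic_search(generate_solution, fitness, pop_size=20, iterations=1000):
    """Generic population-based search: repeatedly add a new solution and
    delete the worst one according to the fitness (higher fitness = better)."""
    P = [generate_solution() for _ in range(pop_size)]   # initial population
    for _ in range(iterations):                          # termination criterion
        x = generate_solution()   # here a fresh random solution; a real EA would vary members of P
        P.append(x)                                      # add x to the population
        worst = min(P, key=fitness)                      # 'rank' solutions via the fitness
        P.remove(worst)                                  # delete the worst solution
    return P

# Toy usage on 2-objective vectors, ranked by a weighted sum (one possible fitness).
if __name__ == "__main__":
    gen = lambda: (random.random(), random.random())
    fit = lambda s: -(0.5 * s[0] + 0.5 * s[1])           # minimise a weighted sum
    print(len(generic_search(gen, fit, pop_size=10, iterations=200)))
```

The rest of the talk is about what to plug in as `fitness` when the objectives are not aggregated.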

11 Multiobjective fitness assignment
Until the mid-80's: aggregation of the objective functions (e.g. a weighted sum σ1·f1 + σ2·f2, which only reaches solutions on the convex hull of the front).
Now: Pareto dominance based ranking methods (dominance depth, dominance count…).
[Figure: weighted-sum aggregation vs. dominance-based ranking in objective space (f1, f2)]

12 Multiobjective fitness assignment
Dominance depth [Srinivas & Deb 94]
[Figure: objective space (f1, f2); successive non-dominated fronts ranked Rk=1, Rk=2, Rk=3]

13 Multiobjective fitness assignment
Dominance count [Fonseca & Fleming 93]
[Figure: objective space (f1, f2); each solution ranked by the number of solutions dominating it, from Rk=0 (non-dominated) to Rk=7]

14 Multiobjective fitness assignment
Sum of ranks ≈ [Bentley & Wakefield 97]
[Figure: objective space (f1, f2); each solution ranked by the sum of its ranks on the individual objectives, from RK=4 to RK=16]
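As an illustration of these ranking schemes, here is a minimal sketch (my own code, minimisation assumed) of the dominance count and a simplified sum of ranks; tie handling in the original Bentley & Wakefield ranking may differ.

```python
from typing import List, Sequence

def dominates(y, z):
    return all(a <= b for a, b in zip(y, z)) and any(a < b for a, b in zip(y, z))

def dominance_count(pop: List[Sequence[float]]) -> List[int]:
    """Fonseca & Fleming style rank: number of solutions dominating each one
    (0 for non-dominated solutions)."""
    return [sum(dominates(q, p) for q in pop if q is not p) for p in pop]

def sum_of_ranks(pop: List[Sequence[float]]) -> List[int]:
    """Bentley & Wakefield style rank: sum over the objectives of the position
    of the solution when the population is sorted on that objective."""
    ranks = [0] * len(pop)
    for i in range(len(pop[0])):
        order = sorted(range(len(pop)), key=lambda k: pop[k][i])
        for pos, k in enumerate(order):
            ranks[k] += pos + 1
    return ranks

pop = [(1, 9), (3, 7), (4, 4), (8, 2), (6, 6)]
print(dominance_count(pop))   # [0, 0, 0, 0, 1]: only (6, 6) is dominated, by (4, 4)
print(sum_of_ranks(pop))
```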

15 Multiobjective fitness assignment
Drawbacks of Pareto dominance ranking methods:
- Binary value: no quantification of the dominance
- Comparison is difficult if too many Pareto solutions can be generated (a clustering tool must be added)
General goal of multiobjective optimisation: "find a good-quality and well-diversified set of non-dominated solutions". How to achieve this?
- Define indicators able to evaluate a set of solutions
- Optimise the indicator value during the search

16 Outline: Motivations, Evolutionary Multiobjective Optimisation, Quality Indicators, Indicator-Based Evolutionary Algorithm, Multiobjective Local Searches, Indicator-Based Local Search, Hypervolume-Based Optimisation, Conclusions and Perspectives

17 Quality indicators
Useful to compare two (or more) optimisers. How to compare approximation set A against approximation set B?
[Figure: two approximation sets A and B in objective space (f1, f2)]

18 Quality indicators
Definition (quality indicator): an m-ary quality indicator I is a function which assigns to each vector (A1, A2, …, Am) of m approximation sets a real value I(A1, …, Am) [Zitzler 2005].
[Figure: two approximation sets A and B in objective space (f1, f2)]

19 Quality indicators
Definition (quality indicator): an m-ary quality indicator I is a function which assigns to each vector (A1, A2, …, Am) of m approximation sets a real value I(A1, …, Am) [Zitzler 2005].
- Unary indicator: I(P1), …, I(Pm) → compare real values
- Binary indicator: I(P1, P2) → compares two sets directly!
- Comparison of m outputs: use a reference set (e.g. the best known Pareto set) and compare each output against it
Much research on this subject and many indicators: hypervolume indicator, ε-indicator, average best weight combination, distance from reference, error ratio, chi-square-like deviation indicator, spacing, generational distance, maximum Pareto front error, maximum spread, coverage error, Pareto spread… [Zitzler 2005]

20 ε-indicator
Binary epsilon indicator [Zitzler & Kuenzli 04]: Iε(A,B) = the minimal translation to apply to the set A so that every solution of the set B is dominated by at least one solution of A.
[Figure: Iε(A,B) and Iε(B,A) in the normalised objective space (f1, f2)]

21 ε-indicator
Unary version of the binary epsilon indicator: Iε(A) = the minimal translation to apply to the set A so that every solution of a reference set R is dominated by at least one solution of A.
[Figure: Iε(A) with respect to a reference set R in the normalised objective space (f1, f2)]
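A minimal sketch of the additive ε-indicator defined above (my own code, assuming minimisation and an already normalised objective space); the unary version simply takes a reference set R as the second argument.

```python
from typing import List, Sequence

def eps_indicator(A: List[Sequence[float]], B: List[Sequence[float]]) -> float:
    """Smallest eps such that translating every point of A by -eps on every
    objective makes each point of B weakly dominated by some point of A."""
    return max(
        min(max(a_i - b_i for a_i, b_i in zip(a, b)) for a in A)
        for b in B
    )

A = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
B = [(0.5, 4.0), (2.0, 1.5)]
print(eps_indicator(A, B))   # translation needed for A to cover B
print(eps_indicator(B, A))   # and the reverse
```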

22 Hypervolume indicator
Also known as the S-metric or Lebesgue measure: the hypervolume enclosed by approximation A with respect to a reference point Z.
[Figure: approximation set A in objective space (f1, f2)]

23 Hypervolume indicator
Also known as the S-metric or Lebesgue measure: the hypervolume enclosed by approximation A with respect to a reference point Z.
[Figure: approximation set A and the reference point Z in objective space (f1, f2)]

24 Hypervolume indicator
Also known as the S-metric or Lebesgue measure: the hypervolume enclosed by approximation A with respect to a reference point Z, computed in the normalised space: I_HD(A).
[Figure: shaded hypervolume I_HD(A) between approximation A and the reference point Z in the normalised space (f1, f2)]

25 Hypervolume indicator
Hypervolume as a binary indicator [Zitzler & Kuenzli 04]: I_HD(A,B) = the hypervolume enclosed by approximation A and not by approximation B, with respect to a reference point Z.
[Figure: I_HD(A,B) and I_HD(B,A) for two approximation sets in the normalised space (f1, f2), with reference point Z]
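For the bi-objective case, the unary hypervolume and the binary indicator of this slide can be sketched as follows (my own code, minimisation assumed); the binary value is computed here as HV(A ∪ B) - HV(B), i.e. the area dominated by A but not by B.

```python
from typing import List, Tuple

def hypervolume_2d(points: List[Tuple[float, float]], ref: Tuple[float, float]) -> float:
    """Area dominated by `points` and bounded by the reference point `ref`
    (bi-objective minimisation), via a sweep over the points sorted by f1."""
    pts = sorted(p for p in points if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:                         # skip dominated points
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

def ihd_binary(A, B, ref):
    """Hypervolume dominated by A but not by B (binary I_HD of the slide)."""
    return hypervolume_2d(A + B, ref) - hypervolume_2d(B, ref)

A = [(1, 4), (2, 2), (4, 1)]
B = [(1.5, 3.5), (3, 1.5)]
Z = (5, 5)
print(hypervolume_2d(A, Z), ihd_binary(A, B, Z), ihd_binary(B, A, Z))
```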

26 Outline: Motivations, Evolutionary Multiobjective Optimisation, Quality Indicators, Indicator-Based Evolutionary Algorithm, Multiobjective Local Searches, Indicator-Based Local Search, Hypervolume-Based Optimisation, Conclusions and Perspectives

27 IBEA principle
Fitness assignment:
- Define a binary indicator I which allows two solutions to be compared
- When a solution x is added to a population P: compare x against every solution of P using the indicator I to compute the fitness of x; for each solution s of P, update its fitness according to I and x
Selection: delete the solution with the worst fitness value

28 From binary indicator to fitness assignment
[Figure: four panels in objective space (f1, f2) comparing two solutions a and b. With I_HD: if a and b are incomparable, I_HD(a,b) > 0 and I_HD(b,a) > 0; if a dominates b, I_HD(b,a) = -I_HD(a,b) < 0. With Iε: if a and b are incomparable, Iε(a,b) > 0 and Iε(b,a) > 0; if a dominates b, Iε(a,b) > 0 and Iε(b,a) < 0.]

29 From binary indicator to fitness assignment
[Figure: the same four panels as on the previous slide]
Binary indicator value of a population against a single solution: this quantity, written Fitness(x) = I(P\{x}, x) on the later IBMOLS slides, is what serves as the fitness of x.

30 IBEA: Indicator-Based Evolutionary Algorithm [Zitzler & Kuenzli 2004]
- Define a binary indicator I and an initial population P of n solutions
- Generate a set Q of m new solutions using genetic operators
- Select a set R of N solutions from Q ∪ P which minimises I(Q ∪ P, R)
- Repeat until the termination criterion is verified → return R
Advantages:
- Outperforms NSGA-II and SPEA2 on continuous test functions
- Small number of parameters (population size, m, binary indicator)
- No diversity preservation mechanism required
- Can take the decision-maker's preferences into account
But…
- Optimally deleting m solutions from a population is difficult (done greedily in IBEA)
- Evolutionary algorithm convergence is usually slow
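To illustrate how a binary indicator becomes a fitness and a greedy environmental selection, here is a minimal sketch (my own simplification, not the published algorithm): it uses the raw additive aggregation Fitness(x) = Σ_y I({y},{x}) of the later IBMOLS slides, whereas IBEA itself scales the indicator exponentially, F(x) = Σ_y -exp(-I({y},{x})/κ).

```python
from typing import Callable, List, Sequence

Solution = Sequence[float]                         # here: a point in objective space
Indicator = Callable[[Solution, Solution], float]  # binary indicator I(a, b)

def eps(a: Solution, b: Solution) -> float:
    """Additive epsilon indicator between two single solutions (minimisation)."""
    return max(ai - bi for ai, bi in zip(a, b))

def fitness(x: Solution, P: List[Solution], I: Indicator = eps) -> float:
    """Additive aggregation of the indicator of P \\ {x} against x:
    badly dominated solutions receive strongly negative values."""
    return sum(I(y, x) for y in P if y is not x)

def environmental_selection(P: List[Solution], N: int, I: Indicator = eps) -> List[Solution]:
    """Greedily delete the worst-fitness solution until only N remain."""
    P = list(P)
    while len(P) > N:
        worst = min(P, key=lambda x: fitness(x, P, I))
        P = [s for s in P if s is not worst]
    return P

population = [(1, 9), (2, 6), (4, 4), (7, 2), (6, 6), (9, 1)]
print(environmental_selection(population, 4))
```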

31 But…
- Optimally deleting m solutions from a population is difficult (done greedily in IBEA)
- Evolutionary algorithm convergence is usually slow
Local search methods are known to be efficient metaheuristics for single-objective optimisation… can they be applied to multiobjective optimisation?
[Figure: three panels in objective space (f1, f2): cutting m solutions at once, IBEA deleting them one by one, and an ES(n,1)-like scheme where a single solution is deleted]

32 Outline: Motivations, Evolutionary Multiobjective Optimisation, Quality Indicators, Indicator-Based Evolutionary Algorithm, Multiobjective Local Searches, Indicator-Based Local Search, Hypervolume-Based Optimisation, Conclusions and Perspectives

33 Single-objective local search
Evaluate solutions 'around' an initial one and select a solution which is better. An efficient heuristic, easy to understand and to implement. Design choices:
- Neighbourhood (several possible definitions)
- Improvement strategy (first improvement, best improvement)
- Iterated version (random restarts, or another restart strategy)
[Figure: landscape of f(x) over the solution space (x1, x2)]
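For reference, a minimal single-objective first-improvement local search on bit strings (my own toy example; the 1-flip neighbourhood and the OneMax objective are only for illustration):

```python
import random
from typing import List

def neighbours(x: List[int]):
    """1-flip neighbourhood of a bit string."""
    for i in range(len(x)):
        yield x[:i] + [1 - x[i]] + x[i + 1:]

def local_search(x: List[int], f, maximise: bool = True) -> List[int]:
    """First-improvement hill climbing until a local optimum is reached."""
    improved = True
    while improved:
        improved = False
        for y in neighbours(x):
            if (f(y) > f(x)) if maximise else (f(y) < f(x)):
                x, improved = y, True
                break              # first improvement: restart from the new solution
    return x

onemax = lambda x: sum(x)          # toy objective: number of ones
start = [random.randint(0, 1) for _ in range(20)]
print(onemax(local_search(start, onemax)))   # 20, the global optimum of OneMax
```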

34 Multiobjective local search issues
Difficulties resulting from the multiobjective aspect of the problems:
- Initialisation (random?)
- Solution evaluation (aggregation, Pareto, indicator)
- Neighbourhood (related to all objectives?)
- Neighbourhood exploration (partial, first improvement, best improvement)
- Selection strategy (all improvements, dominance…)
- Population size (single solution, fixed or variable size)
- Archive of best known solutions?
- Iteration (re-initialisation)
- Stopping criterion (progress threshold, entire set in local optima?)
- …

35 Multiobjective local search example: PLS
A classical and intuitive dominance-based multiobjective local search [Talbi et al. 2001] [Basseur et al. 2003] [Angel et al. 2004].
[Figure: Pareto Local Search exploring the objective space (f1, f2)]
Different versions: stopping criterion, archive, selection strategy…
Problems: non-dominated solutions are incomparable; the population size is variable (and can be huge).
→ Indicator-Based Multiobjective Local Search!

36 Outline: Motivations, Evolutionary Multiobjective Optimisation, Quality Indicators, Indicator-Based Evolutionary Algorithm, Multiobjective Local Searches, Indicator-Based Local Search, Hypervolume-Based Optimisation, Conclusions and Perspectives

37 Indicator-Based Multiobjective Local Search (IBMOLS)
- Initialisation of the population P of size N
- Fitness assignment: for each x ∈ P, Fitness(x) = I(P\{x}, x)
- Local search step: for all x ∈ P do
    x* ← one random neighbour of x
    Fitness(x*) = I(P, x*)
    for each z ∈ P, update its fitness: Fitness(z) += I(x*, z)
    remove w, the solution with minimal fitness value in P ∪ {x*}
    repeat until all neighbours have been tested, or w ≠ x* (a new solution has been found)
- Stopping criterion: no new non-dominated solution found during an entire local search step → return the set of non-dominated solutions of P
→ Iterated IBMOLS: repeat the process with different initial populations
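One IBMOLS local search step might be sketched as follows (my own code): `objectives` and `neighbour` are hypothetical problem-specific callables, the additive ε-indicator plays the role of I, fitness is recomputed from scratch instead of being updated incrementally, only one random neighbour per member is tried per pass, and "a neighbour was kept" stands in for the slide's "new non-dominated solution found" stopping test.

```python
from typing import Callable, List

def eps(a, b):
    """Pairwise additive epsilon indicator on objective vectors (minimisation)."""
    return max(ai - bi for ai, bi in zip(a, b))

def ibmols_step(P: List, objectives: Callable, neighbour: Callable) -> bool:
    """One pass over the population: for each member, propose one random
    neighbour, insert it, then remove the worst-fitness solution of P ∪ {x*}.
    Returns True if at least one proposed neighbour was kept."""
    def fitness(x, pop):
        fx = objectives(x)
        return sum(eps(objectives(y), fx) for y in pop if y is not x)

    accepted = False
    for x in list(P):
        if not any(s is x for s in P):            # x may already have been removed
            continue
        x_star = neighbour(x)                     # one random neighbour of x
        P.append(x_star)
        w = min(P, key=lambda s: fitness(s, P))   # worst solution of P ∪ {x*}
        P[:] = [s for s in P if s is not w]
        if w is not x_star:
            accepted = True
    return accepted

# Iterated IBMOLS would call ibmols_step until it returns False, archive the
# non-dominated solutions of P, and restart from a new initial population.
```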

38 Parameters: indicators
Binary indicators taken from performance assessment studies: Iε [Zitzler & Kuenzli 04], I_HD [Zitzler & Kuenzli 04].
Comparison with classical dominance-based ranking methods adapted into indicators: I_Ben [Bentley & Wakefield 97], I_Sri [Srinivas & Deb 94], I_Fon [Fonseca & Fleming 93].

39 Parameters: indicators
Binary indicators taken from performance assessment studies: Iε, I_HD [Zitzler & Kuenzli 04].
Comparison with classical dominance-based ranking methods adapted into indicators: I_Ben [Bentley & Wakefield 97], I_Sri [Srinivas & Deb 94], I_Fon [Fonseca & Fleming 93].
[Figure: dominance depth ranking (Rk=1, 2, 3) in objective space (f1, f2), illustrating I_Sri]

40 Parameters: indicators
Binary indicators taken from performance assessment studies: Iε, I_HD [Zitzler & Kuenzli 04].
Comparison with classical dominance-based ranking methods adapted into indicators: I_Ben [Bentley & Wakefield 97], I_Sri [Srinivas & Deb 94], I_Fon [Fonseca & Fleming 93].
[Figure: dominance count ranking (Rk=0 to 7) in objective space (f1, f2), illustrating I_Fon]

41 Parameters: indicators
Binary indicators taken from performance assessment studies: Iε, I_HD [Zitzler & Kuenzli 04].
Comparison with classical dominance-based ranking methods adapted into indicators: I_Ben [Bentley & Wakefield 97], I_Sri [Srinivas & Deb 94], I_Fon [Fonseca & Fleming 93].
[Figure: sum-of-ranks ranking (RK=4 to 16) in objective space (f1, f2), illustrating I_Ben]

42 Parameters: population initialisation
- Rand: generate a set P of n random permutations.
- Cross: apply a classical crossover operator to pairs of solutions selected from the archive A of m non-dominated solutions. If 2n < m, select 2n solutions from A at random; if 2n ≥ m, take all of A and complete with random solutions. Create n new solutions by applying crossover to the 2n selected solutions.
- SA: apply random noise to archived solutions. If n < m, select n solutions from A at random; if n ≥ m, take all of A and complete with random solutions. Create n new solutions by applying random noise (mutations) to the n selected solutions.
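A minimal sketch of the 'SA' strategy for a permutation encoding (my own code; the swap mutation and the 10% default noise rate are assumptions, the latter suggested by the sensitivity results reported later):

```python
import random
from typing import List

def noisy_permutation(perm: List[int], noise_rate: float) -> List[int]:
    """Copy `perm` and apply roughly noise_rate * len(perm) random swaps."""
    p = perm[:]
    for _ in range(max(1, int(noise_rate * len(p)))):
        i, j = random.randrange(len(p)), random.randrange(len(p))
        p[i], p[j] = p[j], p[i]
    return p

def sa_initialisation(archive: List[List[int]], n: int, size: int,
                      noise_rate: float = 0.1) -> List[List[int]]:
    """Build an initial population of n permutations from the archive."""
    if len(archive) >= n:
        seeds = random.sample(archive, n)
    else:                                  # not enough archived solutions:
        seeds = archive[:]                 # take them all, complete randomly
        while len(seeds) < n:
            seeds.append(random.sample(range(size), size))
    return [noisy_permutation(s, noise_rate) for s in seeds]

archive = [random.sample(range(20), 20) for _ in range(3)]
print(len(sa_initialisation(archive, n=8, size=20)))   # 8 perturbed permutations
```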

43 Application: Ring Star problem
Applications in telecommunication network design and transit system planning. Instances with 70 to 300 locations. Two objectives to minimise: the ring cost and the assignment cost.

44 Application: Nurse Scheduling (QMC)
Nurse scheduling problem: timetabling staff (allocating nurses to working shifts) over a period of time, with hard constraints to satisfy.
3 objective functions, each minimising the violation of soft constraints:
- violations of "SingleNight, WeekendSplit, WeekendBalance"
- number of violations of "Coverage"
- penalty for "CoverageBalance"
Problem details:
- Ward of 20 to 30 nurses
- Planning period of 28 days, with 3 shift types: day, evening and night
- Full-time/part-time nurses (e.g. 8h, 15h, 23h, 30h, 40h…)
- Nurse hierarchy, according to qualifications and training
- Coverage demand differs for each shift
- Working regulations to be satisfied (e.g. annual leave)

45 Application: Biobjective Flow-shop problem
- N jobs to schedule on M machines (critical resources); permutation flow shop
- Objectives to minimise: Cmax (maximum completion time) and T (sum of tardiness, or average tardiness)
- Taillard's benchmarks [Taillard 93], extended to the biobjective case
[Figure: Gantt chart of a schedule on machines M1, M2, M3, showing Cmax and the tardiness T]

46 Parameters / performance assessment
- Binary quality indicators: Iε, I_HD [Zitzler & Kuenzli 04]; comparison with classical dominance-based ranking methods adapted into indicators: I_Ben [Bentley & Wakefield 97], I_Fon [Fonseca & Fleming 93], I_Sri [Srinivas & Deb 94]
- Population size: small fixed values (3, 5, 8, 10, 20, 30, 50)
- Population initialisation: random, crossover on solutions of the archive, random noise on archived solutions (simulated annealing)
- 20 runs on each instance, short run times (20 seconds to 20 minutes)
Performance assessment:
- Hypervolume indicator difference of the different sets of non-dominated solutions obtained (with respect to a reference point Z)
- Statistical analysis (Mann-Whitney test)

47 Results: table analysis
- For each algorithm: 20 hypervolume indicator differences computed from the 20 runs; the tables show the average value for each algorithm/instance pair
- Statistical analysis: rank the runs of two different algorithms by their hypervolume difference; the Mann-Whitney test computes the confidence level that the obtained ranking is not due to chance
- Results in bold: algorithm never outperformed by another algorithm with a confidence level greater than 95%
[Figure: example of interleaved rankings of the runs of two algorithms A and B]

48 Results: indicator sensitivity
- Superiority of performance-assessment-based indicators over dominance-based indicators
- Superiority of the epsilon indicator over the hypervolume indicator

49 Results: initialisation strategy sensitivity
- Superiority of the simulated-annealing (random mutation) initialisation
- Optimal noise rate around 10%

50 Results: population size sensitivity
- Best performance obtained with a small population size
- The optimal population size increases with the size of the problem considered

51 Experiments: parameter sensitivity, summary of the best values
Problem     Run time   Pop. size   Indicator      Initialisation
20*5 (1)    20"        3           I_Fon, Iε      SA, Cross
20*5 (2)    20"        3           I_Fon, I_Sri   Cross
20*10 (1)   1'         5           Iε, I_HD       SA
20*10 (2)   1'         8           Iε, I_HD       SA
20*20       2'         8           Iε, I_Fon      SA
50*5        5'         8           Iε, I_Fon      SA
50*10       10'        10          Iε, I_HD       SA
50*20       20'        30          Iε, I_HD       SA, Cross

52 IBMOLS: conclusions
- IBMOLS: a generic indicator-based local search for multiobjective optimisation problems
- Small number of parameters (population size, binary indicator, initialisation function)
- No diversity preservation mechanism required
- Superiority of the Iε binary indicator on different problems
- Parameter sensitivity analysis: performance-assessment-based indicators, small population size, population initialisation by random noise on archived solutions
- Very good overall results obtained (new best-known solutions)
…BUT the hypervolume indicator is known as the most intuitive performance indicator and the only one fully sensitive to the Pareto dominance relation. Why are its results disappointing?

53 Outline: Motivations, Evolutionary Multiobjective Optimisation, Quality Indicators, Indicator-Based Evolutionary Algorithm, Multiobjective Local Searches, Indicator-Based Local Search, Hypervolume-Based Optimisation, Conclusions and Perspectives

54 Hypervolume
The UNARY indicator I_HD does not correspond to the definition of the hypervolume indicator!
[Figure: approximation set and reference point Zref in objective space (f1, f2)]

55 Hypervolume
The UNARY indicator I_HD does not correspond to the definition of the hypervolume indicator!
We would like to compute…
[Figure: the hypervolume dominated by the whole set with respect to Zref]

56 Hypervolume
The UNARY indicator I_HD does not correspond to the definition of the hypervolume indicator!
[Figure: the quantity we would like to compute vs. the quantity we actually compute, with respect to Zref in (f1, f2)]

57 Hypervolume
The UNARY indicator I_HD does not correspond to the definition of the hypervolume indicator!
[Figure: the quantity we would like to compute vs. the quantity we actually compute (continued)]

58 Hypervolume
The UNARY indicator I_HD does not correspond to the definition of the hypervolume indicator!
→ We cannot compute the hypervolume contribution of a solution by comparing only pairs of solutions.
[Figure: the hypervolume contribution we would like to compute, with respect to Zref]

59 Hypervolume-Based Multiobjective Local Search
Same scheme as IBMOLS, with the fitness now based on the exact hypervolume contribution (detailed on the following slides):
- Initialisation of the population P of size N
- Fitness assignment: for each x ∈ P, Fitness(x) = I(P\{x}, x)
- Local search step: for all x ∈ P do
    x* ← one random neighbour of x
    Fitness(x*) = I(P, x*)
    for each z ∈ P, update its fitness: Fitness(z) += I(x*, z)
    remove w, the solution with minimal fitness value in P ∪ {x*}
    repeat until all neighbours have been tested, or w ≠ x* (a new solution has been found)
- Stopping criterion: no new non-dominated solution found during an entire local search step → return the set of non-dominated solutions of P
→ Iterated version: repeat the process with different initial populations

60 Fitness update: algorithm
- Population P[1..N]: fitness already known
- New solution x to evaluate
- The dominance state (dominated or non-dominated) is needed for each solution
[Figure: population and new solution x in objective space (f1, f2)]

61 Fitness update: algorithm
Population P[1..N] with known fitness; new solution x to evaluate; dominance state needed for each solution.
[Figure: the new solution x inserted into the population]

62 Fitness update: algorithm
1st case: x is dominated.
[Figure: x lies in the region dominated by the current population]

63 Fitness update: algorithm
1st case: x is dominated. The fitness of x is equal to the largest dominance area between a solution of P and x.
[Figure: rectangle between x and the solution of P that dominates it the most]

64 Fitness update: algorithm
1st case: x is dominated. The fitness of x is equal to the largest dominance area between a solution of P and x. Delete the dominated solution with the worst fitness value; no further fitness update is needed.
[Figure: the worst dominated solution is removed]

65 Fitness update: algorithm
1st case: x is dominated. Delete the dominated solution with the worst fitness value; no further fitness update is needed.
[Figure: population after deleting the worst dominated solution]

66 Fitness update: algorithm
2nd case: x is non-dominated.
[Figure: x lies outside the region dominated by the population]

67 Fitness update: algorithm
2nd case: x is non-dominated.
[Figure: x and the reference point Zref in objective space (f1, f2)]

68 Fitness update: algorithm
2nd case: x is non-dominated. Update fitnesses: are there newly dominated solutions?
[Figure: solutions that become dominated by x have their fitness updated]

69 Fitness update: algorithm
2nd case: x is non-dominated. Update fitnesses:
- Are there newly dominated solutions?
- Compute the fitness of x from its non-dominated neighbours
[Figure: the new solution x, its non-dominated neighbours y0 and y1, and nearby solutions z0 and z1]

70 Fitness update: algorithm
2nd case: x is non-dominated. Update fitnesses:
- Are there newly dominated solutions?
- Compute the fitness of x from its non-dominated neighbours
- Compute the new fitness of the neighbours of x (using x and their other neighbour, which may have become dominated)
[Figure: x, its neighbours y0, y1 and the nearby solutions z0, z1]

71 Fitness update: algorithm
2nd case: x is non-dominated. The fitness of x and of its neighbours has been updated.
[Figure: population after the fitness update around x]

72 Fitness update: algorithm
2nd case: x is non-dominated. After the fitness update, delete the worst solution w. If w is dominated: no fitness change is needed.
[Figure: the worst solution w is removed]

73 Fitness update: algorithm
2nd case: x is non-dominated. Delete the worst solution w. If w is dominated: no fitness change. If w is non-dominated: update the fitness of the neighbours of w.
[Figure: w is non-dominated; its neighbours must be updated]

74 Fitness update: algorithm
2nd case: x is non-dominated. Delete the worst solution w. If w is non-dominated: update the fitness of the neighbours of w (using their positions and that of w).
[Figure: solutions y0, y1, y2 whose fitness is updated after the removal of w]

75 Fitness update: algorithm
2nd case, summary: insert x, update the fitness of x and of its neighbours, delete the worst solution w, and if w was non-dominated update the fitness of the neighbours of w.
[Figure: final population after the update]
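In the bi-objective case the exact hypervolume contribution that this update maintains has a simple closed form, which is what makes the incremental update cheap; a minimal sketch (my own code, minimisation assumed):

```python
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def hv_contributions(front: List[Point], zref: Point) -> Dict[Point, float]:
    """Exclusive hypervolume contribution of each point of a mutually
    non-dominated bi-objective front: the rectangle bounded by its two
    neighbours (or by zref at the extremities)."""
    pts = sorted(front)                       # ascending f1 => descending f2
    contrib = {}
    for i, (f1, f2) in enumerate(pts):
        right_f1 = pts[i + 1][0] if i + 1 < len(pts) else zref[0]
        upper_f2 = pts[i - 1][1] if i > 0 else zref[1]
        contrib[(f1, f2)] = (right_f1 - f1) * (upper_f2 - f2)
    return contrib

front = [(1, 9), (3, 6), (5, 4), (8, 2)]
print(hv_contributions(front, zref=(10, 10)))
# Deleting the point with the smallest contribution loses the least hypervolume.
```

With Zref = (+∞, +∞), as proposed on the next slide, the extreme points get an infinite contribution and are therefore never deleted.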

76 General observations
- Overall complexity in O(n)!
- Special case when x is non-dominated and at an extremity of the Pareto set: the coordinates of Zref replace the missing neighbour's coordinates
- The reference point Zref needs to be fixed. Solution: Zref = {+∞, +∞}! This allows the extremities of the Pareto set to be kept in the population.
- The algorithm is defined for the bi-objective case only and needs to be extended to the general case. Hypervolume calculation is NP in the number of objective functions, so our algorithm is too. BUT: the multiobjective problems studied mainly deal with 2 objective functions, sometimes 3, and almost never more than 4.

77 Outline: Motivations, Multiobjective Optimisation, Quality Indicators, Indicator-Based Multiobjective Search, Hypervolume-Based Optimisation (Description, Experiments), Conclusions and Perspectives

78 Application: Biobjective Flow-shop problem
- N jobs to schedule on M machines (critical resources); permutation flow shop
- Objectives to minimise: Cmax (maximum completion time) and T (sum of tardiness, or average tardiness)
- Taillard's benchmarks [Taillard 93], extended to the biobjective case
[Figure: Gantt chart of a schedule on machines M1, M2, M3, showing Cmax and the tardiness T]

79 Parameters / performance assessment
- Binary quality indicators: Iε, I_HD [Zitzler & Kuenzli 04] and I_NH, presented previously
- Population size: small fixed values (from 10 to 30)
- Population initialisation: 30% random noise on archived solutions
- 20 runs on each instance, run times from 20 seconds to 60 minutes
Performance assessment:
- Hypervolume indicator difference of the different sets of non-dominated solutions obtained (with respect to a reference point Z)
- Statistical analysis (Mann-Whitney test)

80 Results on the Flow-shop problem

81 Outline: Motivations, Multiobjective Optimisation, Quality Indicators, Indicator-Based Multiobjective Search, Hypervolume-Based Optimisation, Conclusions and Perspectives

82 Conclusions and perspectives
Conclusions:
- Indicator-based multiobjective optimisation: a growing research area; very simple mechanism (no diversity preservation mechanism needed); very efficient (outperforms classical generic methods); a new principle with many research directions still to exploit
- IBEA (Indicator-Based Evolutionary Algorithm): efficient compared with other multiobjective evolutionary algorithms; superiority of the Iε binary indicator on different problems; but, as an evolutionary algorithm, slow convergence
- IBMOLS (Indicator-Based Multiobjective Local Search): combines the advantages of IBEA with the fast convergence of iterated local search algorithms; the hypervolume indicator needed to be improved
- HBMOLS (Hypervolume-Based Multiobjective Local Search): selection based on hypervolume maximisation; greatly outperforms the IBMOLS versions

83 Conclusions and perspectives
Perspectives:
- Application to other multiobjective problems: real-world problems, academic problems, mathematical functions
- Optimisation of more than 2 objectives? Propose adaptations to optimise more than 2 objectives (in progress with Ron-Qiang Zeng); study the limitation of the proposed algorithm in terms of complexity; study the possible use of approximation algorithms
- Study different versions of hypervolume-based selection, mainly for the fitness computation of dominated solutions
- Application of the indicator-based strategy to other search methods: path relinking (in progress with Ron-Qiang Zeng)

84 Global conclusion / research perspectives
Lessons from past research (thanks to E. Zitzler):
- EMO provides information about a problem (search space exploration)
- EMO can help in single-objective scenarios (multiobjectivisation)
- But… MOO is part of the decision-making process (preferences)
- But… MOO for large n is different from n = 2 (high-dimensional objective spaces)
Research perspectives:
- MOO as part of the decision-making process: how can the decision maker and MOO collaborate?
- Uncertainty and robustness
- Expensive objective function evaluations
- Hybridisation: metaheuristics and OR methods (examples)
- Definition of multi-multiobjective problems
- Many-objective optimisation

