Multi-Objective Optimisation (II)
Matthieu Basseur. Angers, 10 June 2010.



Outline
- Motivations
- Evolutionary Multiobjective Optimisation
- Quality indicators
- Indicator-Based Evolutionary Algorithm
- Multiobjective Local Searches
- Indicator-Based Local Search
- Hypervolume-Based Optimisation
- Conclusions and perspectives

Introduction
About me:
- PhD in Lille (France), supervised by E.-G. Talbi
- Research visitor at ETH Zurich (Switzerland, 2005), with E. Zitzler
- Research Assistant at the University of Nottingham (England)
- Since September 2007: Assistant Professor, University of Angers
Research interests:
- Main area: multiobjective optimisation
- Metaheuristics for multiobjective optimisation (GAs, local search, memetic algorithms, path relinking, and also exact methods)
- Hybrid and adaptive metaheuristics (cooperation, parallelism)
- Multiobjective optimisation under uncertainty
- Applications (continuous test functions, flow-shop problem, routing problem…)
Motivations: mainly linked to my previous research activities.

Multiobjective Optimisation (I)
…presented by V. Barichard two weeks ago:
- Single-objective optimisation: optimisation problems, resolution approaches
- Multiobjective optimisation problems: description, dominance relation
- Resolution approaches and result evaluation: Pareto-dominance-based algorithms, comparison of outputs
Today: new trends in multiobjective optimisation.

Motivations
Efficient optimisation algorithms are often:
- Complex: elaborate mechanisms (diversification, evaluation…), hybrid algorithms
- Parameter-dependent: numerous parameters with a great influence on the results (set by hand, or adaptively), dependent on the size of the problem and on the problem treated
We need generic algorithms which are:
- Simple
- Adaptable to a range of optimisation problems
- Driven by a small number of parameters
…but efficient!
=> Design of generic multi-objective metaheuristics, as opposed to problem-specific optimisation.


Multiobjective optimisation
[Figure: objective space (f1, f2) with non-dominated solutions and dominated feasible solutions]
Pareto dominance: y dominates z if and only if ∀i ∈ [1, …, n], y_i ≤ z_i and ∃i ∈ [1, …, n], y_i < z_i.
Non-dominated solution: a solution x is non-dominated if no solution dominating x exists.
Goal: find a good-quality and well-diversified set of non-dominated solutions.

Multiobjective optimisation
No total order relation exists (unlike the single-objective case): we cannot compare solution a = {4, 7} with solution b = {8, 5}.
Resulting specific questions:
- How to assign fitness to solutions in evolutionary algorithms (for selection)?
- How to find good compromise solutions?
- How to evaluate the different outputs obtained by different algorithms?
Goal: find a good-quality and well-diversified set of non-dominated solutions.
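The dominance relation above, and the incomparability of a = {4, 7} and b = {8, 5}, can be checked with a small sketch (minimisation assumed; the function names are mine, not from the slides):

```python
def dominates(y, z):
    """y dominates z iff y_i <= z_i for all i and y_i < z_i for some i."""
    return (all(yi <= zi for yi, zi in zip(y, z))
            and any(yi < zi for yi, zi in zip(y, z)))

def non_dominated(points):
    """Keep the points that no other point of the set dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# The slide's example: a and b are mutually incomparable.
a, b = (4, 7), (8, 5)
print(dominates(a, b), dominates(b, a))  # False False
```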

Evolutionary Multiobjective Optimisation?
Multiobjective optimisation: find a set of compromise solutions. Evolutionary algorithms (EAs): evolve a set of solutions.
=> EAs are naturally well suited to finding multiple efficient solutions in a single simulation run; a tremendous number of multiobjective evolutionary algorithms have been proposed over the last two decades.

Multiobjective fitness assignment
Fitness assignment is the central point of (population-based) multiobjective metaheuristics.
Generic population-based search algorithm:
  create initial population P
  repeat
    generate a new solution x
    add x to the population P
    evaluate the fitness of solution x (and update P?)
    delete the worst solution of P
  until the termination criterion is verified
  return P
=> Need to 'rank' solutions.

Multiobjective fitness assignment
Until the mid 80s: aggregation of the objective functions.
Now: Pareto-dominance-based ranking methods (dominance depth, dominance count…).
[Figure: weighted-sum aggregation (f1 + λf2) reaching the convex hull, vs. Pareto ranking in objective space]

Multiobjective fitness assignment
Dominance depth [Srinivas & Deb 94]: rank 1 for the non-dominated front, rank 2 for the front obtained once rank-1 solutions are removed, and so on.
[Figure: fronts of rank Rk=1, 2, 3 in objective space]

Multiobjective fitness assignment
Counter of dominance [Fonseca & Fleming 93]: the rank of a solution is the number of solutions that dominate it (Rk=0 for non-dominated solutions).
[Figure: solutions labelled Rk=0, 1, 3, 4, 7 in objective space]

Multiobjective fitness assignment
Sum of ranks ≈ [Bentley & Wakefield 97]: sum, over the objectives, of the solution's rank on each objective.
[Figure: solutions labelled with RK values from 4 to 16 in objective space]
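The three dominance-based ranking schemes above can be sketched as follows (minimisation; ties on an objective are broken arbitrarily in the sum-of-ranks variant, which is one reading of the "≈" on the slide):

```python
def dominates(y, z):
    return (all(a <= b for a, b in zip(y, z))
            and any(a < b for a, b in zip(y, z)))

def dominance_depth(points):
    """Srinivas & Deb: rank 1 = non-dominated front; peel it off and repeat."""
    rank, remaining, depth = [0] * len(points), set(range(len(points))), 1
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        for i in front:
            rank[i] = depth
        remaining.difference_update(front)
        depth += 1
    return rank

def dominance_count(points):
    """Fonseca & Fleming: rank = number of solutions dominating x (0 = non-dominated)."""
    return [sum(dominates(q, p) for q in points) for p in points]

def sum_of_ranks(points):
    """Bentley & Wakefield: sum over objectives of the rank on each objective."""
    n, m = len(points), len(points[0])
    ranks = [0] * n
    for k in range(m):
        for r, i in enumerate(sorted(range(n), key=lambda i: points[i][k])):
            ranks[i] += r + 1
    return ranks
```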

Multiobjective fitness assignment
Drawbacks of Pareto-dominance ranking methods:
- Binary value: no quantification of the dominance
- Comparison is difficult when too many Pareto solutions can be generated (a clustering tool must be added)
General goal of multiobjective optimisation: "find a good-quality and well-diversified set of non-dominated solutions". How to achieve this?
- Define indicators able to evaluate a set of solutions
- Optimise the indicator value during the search


Quality indicators
Useful to compare two (or more) optimisers. How to compare set A against set B?
[Figure: two approximation sets A and B in objective space]


Quality indicators
Definition (quality indicator): an m-ary quality indicator I is a function which assigns to each vector (A_1, A_2, …, A_m) of m approximation sets a real value I(A_1, …, A_m) [Zitzler 2005].
- Unary indicator: I(P_1), …, I(P_m) => compare real values.
- Binary indicator: I(P_1, P_2) => compares two sets!
Comparison of m outputs: use a reference set (e.g. the best known Pareto set) and compare each output against it.
Much research exists on this subject, and many indicators: hypervolume indicator, ε-indicator, average best weight combination, distance from reference, error ratio, chi-square-like deviation indicator, spacing, generational distance, maximum Pareto front error, maximum spread, coverage error, Pareto spread… [Zitzler 2005]

ε-indicator
Binary epsilon indicator [Zitzler & Kuenzli 04]: I_ε(A,B) = minimal translation to apply to set A so that every solution in set B is dominated by at least one solution in A (computed in the normalised objective space).
[Figure: I_ε(A,B) and I_ε(B,A) between two fronts, normalised space]

ε-indicator
Unary version of the binary epsilon indicator: I_ε(A) = minimal translation to apply to set A so that every solution of a reference set R is dominated by at least one solution in A.
[Figure: I_ε(A) against a reference set, normalised space]
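A sketch of the additive variant of this indicator (minimisation; the "minimal translation" becomes a max-min-max expression over the two sets; the unary version simply takes the reference set as second argument):

```python
def eps_indicator(A, B):
    """Additive binary epsilon indicator I_eps(A, B), minimisation:
    the smallest eps such that, after subtracting eps from every objective
    of every point of A, each point of B is weakly dominated by some point of A."""
    return max(min(max(ai - bi for ai, bi in zip(a, b)) for a in A) for b in B)

def eps_unary(A, R):
    """Unary version: epsilon indicator of A against a reference set R."""
    return eps_indicator(A, R)
```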

Hypervolume indicator
Also known as the S-metric or Lebesgue measure: the hypervolume enclosed by approximation A according to a reference point Z (computed in the normalised objective space).
[Figure: area I_HD(A) dominated by approximation A, bounded by the reference point Z]

Hypervolume indicator
Hypervolume as a binary indicator [Zitzler & Kuenzli 04]: I_HD(A,B) = hypervolume enclosed by approximation A and not by approximation B, according to a reference point Z.
[Figure: I_HD(A,B) and I_HD(B,A) between two fronts, normalised space]
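In the biobjective case the hypervolume is a sum of rectangle areas, which gives a compact sketch of both the set measure and the binary difference just defined (minimisation; assumes every point strictly dominates the reference point Z):

```python
def hv2d(A, Z):
    """Area dominated by point set A, bounded by reference point Z (minimisation)."""
    area, prev_f2 = 0.0, Z[1]
    for f1, f2 in sorted(A):        # ascending f1; dominated points are skipped
        if f2 < prev_f2:
            area += (Z[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return area

def hv_binary(A, B, Z):
    """I_HD(A, B): hypervolume dominated by A but not by B, w.r.t. Z."""
    return hv2d(list(A) + list(B), Z) - hv2d(list(B), Z)
```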


IBEA principle
Fitness assignment: define a binary indicator I which allows two solutions to be compared.
When a solution x is added to a population P:
- Compare x against every solution in P using indicator I, to compute the fitness of x
- For each solution s in P, update its fitness according to I and x
Selection: delete the solution with the worst fitness value.

From binary indicator to fitness assignment
[Figure: sign of the binary indicators I_HD and I_ε between two solutions a and b, in the dominated case and in the mutually non-dominated case]
Binary indicator value of a population against a single solution: (formula not preserved in the transcript).

Indicator-Based Evolutionary Algorithm
IBEA principles [Zitzler & Kuenzli 2004]:
- Define a binary indicator I and an initial population P of n solutions
- Generate a set Q of m new solutions using genetic operators
- Select a set R of N solutions from Q ∪ P which minimises I(Q ∪ P, R)
- Repeat until the termination criterion is verified, then return R
Advantages:
- Outperforms NSGA-II and SPEA2 on continuous test functions
- Small number of parameters (population size, m, binary indicator)
- No diversity-preservation mechanism required
- Can take the decision-maker's preferences into account
But…
- Optimally deleting m solutions from a population is difficult (done greedily in IBEA)
- Evolutionary algorithm convergence is usually slow
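The greedy truncation can be sketched as follows. This is a simplification: the fitness of x is taken as the plain sum of the pairwise indicator values I(y, x), whereas the original IBEA uses a scaled exponential sum; the ε-indicator between single solutions is used as I:

```python
def I_eps(y, x):
    """Binary additive epsilon indicator between two single solutions (minimisation)."""
    return max(yi - xi for yi, xi in zip(y, x))

def environmental_selection(pop, N):
    """Greedy IBEA-style truncation: repeatedly delete the worst solution,
    where fitness(x) = sum of I_eps(y, x) over the rest of the population
    (higher is better: a large I_eps(y, x) means y is far from dominating x)."""
    pop = list(pop)
    while len(pop) > N:
        fitness = [sum(I_eps(y, x) for y in pop if y is not x) for x in pop]
        pop.pop(fitness.index(min(fitness)))  # delete worst, one by one
    return pop
```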

But…
- Optimally deleting m solutions from a population is difficult (done greedily in IBEA)
- Evolutionary algorithm convergence is usually slow
Local search methods are known to be efficient metaheuristics for single-objective optimisation… can they be applied to multiobjective optimisation?
[Figure: cutting m solutions at once, vs. IBEA's one-by-one deletion, vs. an ES(n,1)-style scheme with a single solution to delete]


Single-objective local search
Evaluate solutions 'around' an initial one, and select a solution which is better. An efficient heuristic, easy to understand and to implement.
- Several neighbourhoods
- Improvement strategy (first, best)
- Iterated version (random restarts, or another strategy)
[Figure: landscape f(x) over the solution space (x1, x2)]
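A minimal first-improvement hill climber in this spirit (the neighbourhood and objective are supplied by the caller; the stopping rule, a run of non-improving samples, is one possible choice among those listed above):

```python
import random

def local_search(x0, neighbours, f, max_no_improve=100):
    """First-improvement hill climbing (minimisation): sample a random
    neighbour, accept it if strictly better, stop after a run of failures."""
    x, fails = x0, 0
    while fails < max_no_improve:
        y = random.choice(neighbours(x))
        if f(y) < f(x):
            x, fails = y, 0
        else:
            fails += 1
    return x
```

For example, minimising f(x) = x² over the integers with neighbours {x-1, x+1} descends to 0.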

Issues in multiobjective local search
Difficulties resulting from the multiobjective nature of the problems:
- Initialisation (random?)
- Solution evaluation (aggregation, Pareto, indicator)
- Neighbourhood (related to all objectives?)
- Neighbourhood exploration (partial, first improvement, best improvement)
- Selection strategy (all improvements, dominance…)
- Population size (single solution, fixed or variable size)
- Archive of best known solutions?
- Iteration (re-initialisation)
- Stopping criterion (progress threshold, entire set in local optima?)
- …

Example of multiobjective local search: PLS
A classical and intuitive dominance-based multiobjective local search [Talbi et al. 2001] [Basseur et al. 2003] [Angel et al. 2004]. Different versions exist: stopping criterion, archive, selection strategy…
Problems:
- Non-dominated solutions are incomparable
- Variable population size (which can grow huge)
=> Indicator-based multiobjective local search!


Indicator-Based Multiobjective Local Search (IBMOLS)
- Initialise the population P of size N
- Fitness assignment: for each x ∈ P, Fitness(x) = I(P\{x}, x)
- Local search step: for all x ∈ P do
  - x* <- one random neighbour of x
  - Fitness(x*) = I(P, x*)
  - For each z ∈ P, update its fitness: Fitness(z) += I(x*, z)
  - Remove w, the solution with minimal fitness value in P ∪ {x*}
  - Repeat until all neighbours are tested, or w ≠ x* (a new solution was found)
- Stopping criterion: no new non-dominated solution found during an entire local search step; return the set of non-dominated solutions of P.
=> Iterated IBMOLS: repeat the process with different initial populations.
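One pass of the local search step can be sketched as follows. This is a simplification of the algorithm above: fitness values are recomputed from scratch rather than updated incrementally, only one random neighbour is drawn per solution (the neighbour operator is supplied by the caller), and the additive ε-indicator plays the role of I:

```python
def I_eps(y, x):
    return max(yi - xi for yi, xi in zip(y, x))

def fitness(P, x):
    """I(P \\ {x}, x): additive-epsilon fitness of x w.r.t. the rest of P."""
    return sum(I_eps(y, x) for y in P if y is not x)

def ibmols_step(P, neighbour):
    """One simplified IBMOLS step; mutates P in place, returns True if some
    new solution entered the population (i.e. a removed w was not its x*)."""
    improved = False
    for x in list(P):
        if x not in P:                  # x may have been removed meanwhile
            continue
        x_new = neighbour(x)
        Q = P + [x_new]
        w = min(Q, key=lambda s: fitness(Q, s))   # worst of P U {x*}
        Q.remove(w)
        P[:] = Q
        if w is not x_new:
            improved = True
    return improved
```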

Parameters: indicators
Binary indicators issued from performance assessment studies:
- I_ε [Zitzler & Kuenzli 04]
- I_HD [Zitzler & Kuenzli 04]
Comparison with classical dominance-based ranking methods, adapted into indicators:
- I_Ben [Bentley & Wakefield 97]
- I_Sri [Srinivas & Deb 94]
- I_Fon [Fonseca & Fleming 93]


Parameters: population initialisation
- Rand: generate a set P of n random permutations.
- Cross: apply a classical crossover operator to pairs of solutions selected from the archive A (of size m) of non-dominated solutions. If 2n < m, randomly select 2n solutions from A; if 2n ≥ m, select all of A and complete with random solutions. Then create n new solutions by applying crossover to the 2n selected solutions.
- SA: random noise applied to archived solutions. If n < m, randomly select n solutions from A; if n ≥ m, select all of A and complete with random solutions. Then create n new solutions by applying random noise (mutations) to the n selected solutions.
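The SA strategy can be sketched as follows; `mutate` and `random_solution` are hypothetical caller-supplied operators standing in for the problem-specific noise and random-permutation generators:

```python
import random

def init_noise(archive, n, mutate, random_solution):
    """'SA' initialisation: build n starting solutions by applying random
    noise (a mutation operator) to solutions drawn from the archive;
    if the archive is too small, pad it with random solutions first."""
    m = len(archive)
    if n < m:
        base = random.sample(archive, n)        # n random archive members
    else:
        base = list(archive) + [random_solution() for _ in range(n - m)]
    return [mutate(s) for s in base]
```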

Application: Ring Star problem
Applications in telecommunication network design and transit systems planning. Instances from 70 to 300 locations. Two objectives to minimise: the ring cost and the assignment cost.

Application: nurse scheduling (QMC)
Nurse scheduling problem: timetabling staff (allocating nurses to working shifts) over a period of time, with hard constraints to satisfy.
3 objective functions, minimising the violation of 3 soft constraints:
- violations of "SingleNight, WeekendSplit, WeekendBalance"
- number of violations of "Coverage"
- penalty for "CoverageBalance"
Problem details:
- Ward of 20 to 30 nurses
- Planning period of 28 days, with 3 shift types: day, evening and night
- Full-time/part-time nurses (e.g. 8h, 15h, 23h, 30h, 40h…)
- Nurse hierarchy, according to qualifications and training
- Coverage demand differs for each shift
- Working regulations to be satisfied (e.g. annual leave)

Application: biobjective flow-shop problem
- N jobs to schedule on M machines (critical resources; permutation flow shop)
- Objectives to minimise: Cmax (maximum completion time) and T (sum of tardiness, or average tardiness)
- Taillard's benchmarks [Taillard 93], extended to the biobjective case
[Figure: Gantt chart of a 3-machine schedule showing Cmax and tardiness]

Parameters / performance assessment
- Binary quality indicators: I_ε, I_HD [Zitzler & Kuenzli 04]; comparison with classical dominance-based ranking methods adapted into indicators: I_Ben [Bentley & Wakefield 97], I_Fon [Fonseca & Fleming 93], I_Sri [Srinivas & Deb 94]
- Population size: small fixed values (3, 5, 8, 10, 20, 30, 50)
- Population initialisation: random, crossover on solutions in the archive, random noise on archived solutions (simulated annealing)
- 20 runs on each instance, short run times (20" to 20')
Performance assessment:
- Hypervolume indicator difference of the different sets of non-dominated solutions obtained
- Statistical analysis (Mann-Whitney test)

Results: table analysis
For each algorithm, 20 hypervolume indicator differences are computed from the 20 runs; the tables show the average value for each algorithm/instance pair.
Statistical analysis:
- Rank the different runs of two algorithms using the hypervolume difference
- Mann-Whitney test: compute the confidence level that the obtained ranking is not due to chance
Results in bold: algorithm never outperformed by another algorithm with a confidence level greater than 95%.

Angers, 10 June Results  Indicator sensitivity  Superiority of performance assessment based indicators over dominance based indicators  Superiority of epsilon indicator over hypervolume indicator

Angers, 10 June Results  Initialisation strategy sensitivity  Superiority of Simulated annealing (random mutations) initialisation  Optimal noise rate around 10%

Angers, 10 June Results  Population size sensitivity  Best performance obtained with small population size  Optimal population size increases with the size of the problem considered

Experiments: parameter sensitivity, summary of the best values

Problem    | Run time | P size | Indicator    | Initialisation
20*5 (1)   | 20"      | 3      | I_Fon, I_ε   | SA, Cross
20*5 (2)   | 20"      | 3      | I_Fon, I_Sri | Cross
20*10 (1)  | 1'       | 5      | I_ε, I_HD    | SA
20*10 (2)  | 1'       | 8      | I_ε, I_HD    | SA
20*20      | 2'       | 8      | I_ε, I_Fon   | SA
50*5       | 5'       | 8      | I_ε, I_Fon   | SA
50*10      | 10'      | 10     | I_ε, I_HD    | SA
50*20      | 20"      | 30     | I_ε, I_HD    | SA, Cross

IBMOLS: conclusions
IBMOLS is a generic indicator-based local search for multiobjective optimisation problems:
- Small number of parameters (population size, binary indicator, initialisation function)
- No diversity-preservation mechanism required
- Superiority of the binary I_ε indicator on different problems
Parameter sensitivity analysis:
- Performance-assessment-based indicators
- Small population sizes
- Population initialisation: random noise on archived solutions
Very good overall results obtained (new best-known solutions)…
BUT the hypervolume indicator is known as the most intuitive performance indicator, and the only one fully sensitive to the Pareto dominance relation. Why are its results disappointing?


Hypervolume
The UNARY indicator I_HD does not correspond to the definition of the hypervolume indicator!
[Figure: what we would like to compute, i.e. each solution's exclusive hypervolume contribution w.r.t. Zref, versus what the pairwise I_HD actually computes]
=> We cannot compute the hypervolume contribution of a solution by comparing only pairs of solutions.

Hypervolume-based multiobjective local search
Same scheme as IBMOLS:
- Initialise the population P of size N
- Fitness assignment: for each x ∈ P, Fitness(x) = I(P\{x}, x)
- Local search step: for all x ∈ P do
  - x* <- one random neighbour of x
  - Fitness(x*) = I(P, x*)
  - For each z ∈ P, update its fitness: Fitness(z) += I(x*, z)
  - Remove w, the solution with minimal fitness value in P ∪ {x*}
  - Repeat until all neighbours are tested, or w ≠ x* (a new solution was found)
- Stopping criterion: no new non-dominated solution found during an entire local search step; return the set of non-dominated solutions of P.
=> Iterated version: repeat the process with different initial populations.

Angers, 10 June Fitness update : algorithm Algorithm Population P[1..N] : known fitness New solution x to evaluate Dominance state needed for each solution f 2 f 1

Angers, 10 June Fitness update : algorithm Algorithm Population P[1..N] : known fitness New solution x to evaluate Dominance state needed for each solution f 2 f 1

Angers, 10 June Fitness update : algorithm Algorithm Population P[1..N] : known fitness New solution x to evaluate Dominance state needed for each solution 1 st case : x is dominated f 2 f 1

Angers, 10 June Fitness update : algorithm Algorithm Population P[1..N] : known fitness New solution x to evaluate Dominance state needed for each solution 1 st case : x is dominated x’ fitness equal to the biggest dominance area between a solution of P and x f 2 f 1

Angers, 10 June Fitness update : algorithm Algorithm Population P[1..N] : known fitness New solution x to evaluate Dominance state needed for each solution 1 st case : x is dominated x’ fitness equal to the biggest dominance area between a solution of P and x Delete the dominated solution with the worst fitness value No more fitness update f 2 f 1

Angers, 10 June Fitness update : algorithm Algorithm Population P[1..N] : known fitness New solution x to evaluate Dominance state needed for each solution 1 st case : x is dominated x’ fitness equal to the biggest dominance area between a solution of P and x Delete the dominated solution with the worst fitness value No more fitness update f 2 f 1

Angers, 10 June Fitness update : algorithm Algorithm Population P[1..N] : known fitness New solution x to evaluate Dominance state needed for each solution 2 nd case : x is non-dominated f 2 f 1

Angers, 10 June Fitness update : algorithm Algorithm Population P[1..N] : known fitness New solution x to evaluate Dominance state needed for each solution 2 nd case : x is non-dominated f 2 f 1 Zref

Angers, 10 June Zref Fitness update : algorithm Algorithm Population P[1..N] : known fitness New solution x to evaluate Dominance state needed for each solution 2 nd case : x is non-dominated Update fitness: New dominated solutions? f 2 f 1 Update fitnesses

Angers, 10 June 2010. Zref fitness update: algorithm.
Population P[1..N]: known fitness. New solution x to evaluate. The dominance state is needed for each solution.
2nd case: x is non-dominated. Update fitness:
- Compute the fitness of x (from its non-dominated neighbours).
- Compute a new fitness for the neighbours of x (from x and the neighbour which is perhaps newly dominated).
- Delete the worst solution w:
  - If w is dominated: no fitness change.
  - If w is non-dominated: update the fitness of the neighbours of w (from them and w).
[Figures: bi-objective space (f1, f2) showing x, its neighbours y0, y1, y2 and the points z0, z1 at each step of the update.]
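The update steps above can be sketched in Python. This is a minimal illustration, not the thesis implementation: it assumes minimisation, takes the fitness of a non-dominated solution to be its exclusive hypervolume contribution, and ignores the dominated-solution case handled on the earlier slides; the names `contribution` and `insert_nondominated` are illustrative.

```python
from bisect import insort

INF = float("inf")  # Zref = (+inf, +inf): extremities get an infinite contribution

def contribution(front, i):
    """Exclusive hypervolume of front[i]. `front` is sorted by increasing f1
    (hence decreasing f2), all points mutually non-dominated, minimisation.
    Missing neighbours are replaced by the Zref coordinates."""
    f1, f2 = front[i]
    f1_next = front[i + 1][0] if i + 1 < len(front) else INF
    f2_prev = front[i - 1][1] if i > 0 else INF
    return (f1_next - f1) * (f2_prev - f2)

def insert_nondominated(front, x):
    """Insert a non-dominated point x, then recompute only the affected
    fitness values: x itself and its two neighbours on the sorted front."""
    insort(front, x)
    i = front.index(x)
    affected = [j for j in (i - 1, i, i + 1) if 0 <= j < len(front)]
    return {front[j]: contribution(front, j) for j in affected}
```

Because only x and its two neighbours on the f1-sorted front are touched, each insertion triggers a constant number of fitness recomputations, consistent with the linear overall complexity claimed on the next slide.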

Angers, 10 June 2010. General observations.
Overall complexity in O(n)!
Special case when x is non-dominated and at an extremity of the Pareto set: the Zref coordinates replace the missing neighbour coordinates. The reference point Zref needs to be fixed. Solution: Zref = {+∞, +∞}! This allows us to keep the extremities of the Pareto set in the population.
The algorithm is defined for the bi-objective case only; it needs to be extended to the general case. Hypervolume computation is NP-hard in the number of objective functions, so our algorithm is too. BUT: the multi-objective problems studied mostly deal with 2 objective functions, sometimes 3, and almost never more than 4.
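The effect of fixing Zref = {+∞, +∞} can be shown with a small sketch (the helper `worst_index` is hypothetical, not the thesis code): extremities of the front receive an infinite hypervolume contribution, so the deletion step can never select them.

```python
INF = float("inf")

def worst_index(front):
    """Index of the smallest hypervolume contribution on a bi-objective front
    sorted by increasing f1 (minimisation). With Zref = (+inf, +inf), the two
    extremities get an infinite contribution and are never chosen for deletion."""
    def contrib(i):
        f1_next = front[i + 1][0] if i + 1 < len(front) else INF
        f2_prev = front[i - 1][1] if i > 0 else INF
        return (f1_next - front[i][0]) * (f2_prev - front[i][1])
    return min(range(len(front)), key=contrib)
```

On the front [(0, 10), (1, 9), (5, 2), (10, 0)], the worst solution is the interior point (1, 9), while both extremities survive.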

Angers, 10 June 2010. Outline: Motivations. Multiobjective optimisation. Quality indicators. Indicator-Based MultiObjective Search. Hypervolume-Based Optimisation: Description, Experiments. Conclusions and perspectives.

Angers, 10 June 2010. Application: bi-objective flow-shop problem.
N jobs to schedule on M machines; critical resources; permutation flow shop.
Objectives to minimise: Cmax, the maximum completion time (makespan), and T, the sum of tardiness (or the average tardiness).
Taillard's benchmarks [Taillard 93], extended to the bi-objective case.
[Figure: Gantt chart on machines M1, M2, M3 illustrating Cmax and T.]
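For concreteness, the two objectives can be evaluated as follows. This is a minimal sketch of a standard permutation flow shop; the names `proc`, `due` and `flowshop_objectives` are illustrative, and the actual Taillard benchmark format differs.

```python
def flowshop_objectives(perm, proc, due):
    """Bi-objective evaluation of a permutation flow shop (minimisation).
    perm: job order; proc[j][m]: processing time of job j on machine m;
    due[j]: due date of job j. Returns (Cmax, total tardiness)."""
    n_machines = len(proc[0])
    completion = [0.0] * n_machines  # completion time of the last job on each machine
    tardiness = 0.0
    for j in perm:
        for m in range(n_machines):
            # a job starts on machine m when both the machine is free
            # and the job has finished on the previous machine
            ready = completion[m - 1] if m > 0 else 0.0
            completion[m] = max(completion[m], ready) + proc[j][m]
        tardiness += max(0.0, completion[-1] - due[j])
    return completion[-1], tardiness
```

Both objectives come out of a single O(N·M) pass over the schedule, which is why each local-search move can be evaluated cheaply.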

Angers, 10 June 2010. Parameters / performance assessment.
Binary quality indicators: Iε and IHD [Zitzler & Kuenzli 04], and the INH indicator presented previously.
Population size: small fixed values (from 10 to 30).
Population initialisation: 30% random noise on archived solutions.
20 runs on each instance; run times from 20 seconds to 60 minutes.
Performance assessment: hypervolume indicator difference between the sets of non-dominated solutions obtained; statistical analysis (Mann-Whitney test).
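As a reminder of how such a binary indicator is computed between two solution sets, here is a short sketch of the additive ε-indicator for minimisation (the function name is illustrative; Iε+(A, B) is the smallest ε such that every point of B is weakly dominated by some point of A shifted by ε in each objective):

```python
def eps_indicator(A, B):
    """Binary additive epsilon indicator I_eps+(A, B), minimisation.
    A and B are lists of objective vectors of equal dimension."""
    return max(
        min(max(a_k - b_k for a_k, b_k in zip(a, b)) for a in A)
        for b in B
    )
```

A value of 0 (or less) means A weakly dominates B; the asymmetry Iε+(A, B) vs Iε+(B, A) is what makes it usable for pairwise fitness assignment.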

Angers, 10 June 2010. Results on the flow-shop problem.

Angers, 10 June 2010. Outline: Motivations. Multiobjective optimisation. Quality indicators. Indicator-Based MultiObjective Search. Hypervolume-Based Optimisation. Conclusions and perspectives.

Angers, 10 June 2010. Conclusions and perspectives.
Conclusions:
Indicator-based multiobjective optimisation: a growing research area. Very simple mechanism (no diversity-preservation mechanism needed). Very efficient (outperforms classical generic methods). A new principle: still many research directions to explore.
IBEA (Indicator-Based Evolutionary Algorithm): efficient compared to other multiobjective evolutionary algorithms; superiority of the Iε binary indicator on different problems; as an evolutionary algorithm, slow convergence.
IBMOLS (Indicator-Based MultiObjective Local Search): combines the advantages of IBEA with the fast convergence of iterated local search algorithms; its use of the hypervolume indicator needed to be improved.
HBMOLS (Hypervolume-Based MultiObjective Local Search): selection based on hypervolume maximisation; greatly outperforms the IBMOLS versions.
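The hypervolume-maximisation selection at the heart of HBMOLS rests on computing the dominated hypervolume of a set. In the bi-objective case this reduces to a simple sweep along the sorted front, sketched below (an illustrative minimisation version, not the thesis code; `zref` is the reference point):

```python
def hypervolume_2d(points, zref):
    """Area dominated by a set of bi-objective points w.r.t. reference point
    zref (minimisation). Dominated points contribute nothing."""
    # keep only non-dominated points: sort by increasing f1,
    # then keep points with strictly decreasing f2
    front = []
    for f1, f2 in sorted(points):
        if not front or f2 < front[-1][1]:
            front.append((f1, f2))
    # sweep left to right, adding each point's rectangular slice
    hv, prev_f2 = 0.0, zref[1]
    for f1, f2 in front:
        hv += (zref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv
```

The sweep runs in O(n log n); it is the exponential growth of exact hypervolume computation with the number of objectives, noted earlier, that makes the general case hard.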

Angers, 10 June 2010. Conclusions and perspectives.
Perspectives:
Application to other multiobjective problems: real-world problems, academic problems, mathematical functions.
Optimisation with more than 2 objectives? Propose adaptations to optimise more than 2 objectives (in progress with Ron-Qiang Zeng); study the limitations of the proposed algorithm in terms of complexity; study the possible use of approximation algorithms.
Study different versions of hypervolume-based selection, mainly for the fitness computation of dominated solutions.
Application of the indicator-based strategy to other search methods: path relinking (in progress with Ron-Qiang Zeng).

Angers, 10 June 2010. Global conclusion / research perspectives.
Lessons from past research (thanks to E. Zitzler): EMO provides information about a problem (search-space exploration). EMO can help in single-objective scenarios (multiobjectivization). But MOO is part of the decision-making process (preferences). But MOO for large n is different from n = 2 (high-dimensional objective spaces).
Research perspectives: MOO as part of the decision-making process, and how the decision maker and MOO can collaborate; uncertainty and robustness; expensive objective-function evaluations; hybridisation of metaheuristics and OR methods (examples); definition of multi-multiobjective problems; many-objective optimisation.