1 DIVERSITY PRESERVING EVOLUTIONARY MULTI-OBJECTIVE SEARCH
Brian Piper¹, Hana Chmielewski², Ranji Ranjithan¹,²
¹Operations Research, ²Civil Engineering, North Carolina State University

2 NEED FOR SOLUTION DIVERSITY
The optimal solution to a modeled system is not necessarily optimal for the real system.
For complex engineering problems, usually modeled system ≈ (but ≠) real system.
A set of alternative solutions (optimal and near-optimal) with maximally different solution characteristics can be useful in decision-making.
Can we systematically search for maximally different solutions that are "good" (near-optimal)?

3 SEARCH FOR DIVERSE SOLUTIONS
Formal search methods for solutions that are diverse, i.e., maximally different in decision space.
Prior bodies of work on methods based on:
- Mathematical programming methods (Brill et al., ...)
- Evolutionary algorithms (Zechman & Ranjithan, ...)
Recent interest in Evolutionary Multi-objective Optimization (EMO) for solution diversity, e.g., Ulrich et al. 2010; Shir et al. 2009, 2010.
Goal: new EMO methods for generating Pareto solutions that are diverse in decision space.

4 PARETO SET & DECISION SPACE DIVERSITY
Finds the Pareto front with efficiency comparable to an algorithm using only fitness criteria.
Grants higher reproductive probability to some slightly less fit solutions with high decision space diversity.
Produces subsequent populations and a final solution set with high fitness and input parameter diversity.
[Figure legend] Red = non-dominated solutions; green = near-non-dominated solutions, but more diverse in decision space.

5 PARETO SET & DECISION SPACE DIVERSITY
The final set of solutions can be a mix of Pareto and near-Pareto solutions.
[Figure legend] Red = non-dominated solutions; green = near-non-dominated solutions, but more diverse in decision space.

6 GENERAL STEPS OF THE ALGORITHMS
Initialize a population (P) of n individuals.
For generation i:
- Evaluate fitness of solutions
- Update the archive (A)
- Select parents from (P ∪ A) considering a Pareto optimality metric in objective space (OS) and a diversity metric in decision space (DS)
- Perform crossover and mutation
- If the stopping criterion is unmet, proceed to generation i + 1
Select the final solution set from the archive.
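A minimal Python sketch of this generic loop, using a toy bi-objective function and purely random parent selection as placeholders for the problem-specific model and the OS/DS selection criteria; the function names are illustrative, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def evaluate(x):
    # toy bi-objective problem standing in for the real engineering model
    return np.array([np.sum(x ** 2), np.sum((x - 1.0) ** 2)])

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def update_archive(archive, pop, objs):
    # keep every solution not dominated by any other evaluated solution
    cand = archive + list(zip(pop, objs))
    return [(x, f) for (x, f) in cand
            if not any(dominates(g, f) for (_, g) in cand if g is not f)]

def select_parents(pop, archive, k):
    # placeholder: uniform random choice from P U A; the actual methods combine
    # a Pareto-optimality metric in OS with a diversity metric in DS
    pool = list(pop) + [x for (x, _) in archive]
    return [pool[i] for i in rng.integers(0, len(pool), size=k)]

def run(n=20, n_var=4, generations=50):
    pop = [rng.uniform(0.0, 1.0, n_var) for _ in range(n)]
    archive = []
    for _ in range(generations):
        objs = [evaluate(x) for x in pop]
        archive = update_archive(archive, pop, objs)
        parents = select_parents(pop, archive, n)
        # arithmetic crossover plus Gaussian mutation
        pop = [np.clip(0.5 * (parents[i] + parents[(i + 1) % n])
                       + rng.normal(0.0, 0.05, n_var), 0.0, 1.0)
               for i in range(n)]
    return archive

print(len(run()), "archived solutions")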

7 "TRIPLE RANK" ALGORITHM SELECTION PROCEDURE  Non-dominance-based Ranking primary criterion: Pareto front rank Secondary criterion: OS hypervolume contribution within each rank Decision space Diversity-based Rankings  distance to closest non-dominated point in the DS sum of the distances to the two closest neighbors in the DS

8 "TRIPLE RANK" ALGORITHM SELECTION PROCEDURE…  A solution may become a parent by:  DOMINATING - being in the current Pareto front EXCELLING over 50% of solutions in the Pareto ranking and at least one of the two diversity rankings      

9 "TRIPLE RANK" ALGORITHM SELECTION PROCEDURE…  A solution may become a parent by: DOMINATING - being in the Pareto front EXCELLING over 50% of solutions in the fitness ranking and at least one of the two diversity rankings REPLACING one of its nearest neighbors already in the parent set if one of its rankings greatly exceeds its neighbor's, and the other two are within a range

10 "TRIPLE RANK" ALGORITHM SELECTION PROCEDURE…  A solution may become a parent by: DOMINATING - being in the Pareto front EXCELLING over 50% of solutions in the fitness ranking and at least one of the two diversity rankings REPLACING one of its nearest neighbors already in the parent set if one of its rankings greatly exceeds its neighbor's, and the other two are within a range SPECIALIZING - performing well in just one of the three rankings

11 "TRIPLE RANK" ALGORITHM ARCHIVING PROCEDURE Non-dominated points are added to a Pareto archive (AP) Near-Pareto optimal solutions with high diversity ranks are added to a diversity archive (AD) The AP is trimmed (< population size) by keeping the non-dominated solutions with the highest diversity ranks within a neighborhood

12 CLUSTER SELECTION ALGORITHM
SELECTION PROCEDURE: Clustering Step
- Assign rank based on a constrained non-dominated sort
- Calculate the hypervolume contribution within each rank
- Cluster solutions in objective space (OS): k-means, hierarchical, etc.
- Calculate the distance to the cluster centroid in decision space (DS)
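A minimal sketch of the clustering step, assuming k-means in objective space (scikit-learn) and taking each solution's distance to the decision-space centroid of its own objective-space cluster; the constrained non-dominated sort and hypervolume contributions are computed elsewhere and omitted. Names and the cluster count are illustrative.

import numpy as np
from sklearn.cluster import KMeans

def clustering_step(F, X, n_clusters=5, seed=0):
    """F: (n, m) objective values; X: (n, d) decision vectors.
    Returns cluster labels (from clustering in OS) and each solution's
    decision-space distance to its own cluster's DS centroid."""
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=seed).fit_predict(F)   # clustering in OS
    d_centroid = np.empty(len(X))
    for c in range(n_clusters):
        members = labels == c
        centroid_ds = X[members].mean(axis=0)            # centroid in DS
        d_centroid[members] = np.linalg.norm(X[members] - centroid_ds, axis=1)
    return labels, d_centroid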

13 CLUSTER SELECTION ALGORITHM
SELECTION PROCEDURE: Binary Tournament Step
A solution may become a parent through R(ank)-H(ypervolume)-C(luster) selection binary tournament:
- The lower Pareto rank wins
- If ranks are equal and the hypervolume difference is not within a threshold, the larger hypervolume contribution wins
- Otherwise, the larger distance to the cluster centroid in decision space wins
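A minimal sketch of the RHC binary tournament, assuming precomputed Pareto ranks, hypervolume contributions, and DS distances to cluster centroids; the threshold value hv_eps is an illustrative assumption, not the authors' setting.

def rhc_tournament(i, j, rank, hv_contrib, d_centroid, hv_eps=1e-3):
    """Binary tournament between solutions i and j.
    rank: Pareto rank (lower is better); hv_contrib: hypervolume contribution;
    d_centroid: DS distance to the solution's own cluster centroid."""
    if rank[i] != rank[j]:
        return i if rank[i] < rank[j] else j                 # lower Pareto rank wins
    if abs(hv_contrib[i] - hv_contrib[j]) > hv_eps:
        return i if hv_contrib[i] > hv_contrib[j] else j     # hypervolume contribution wins
    # otherwise the more DS-diverse solution (farther from its centroid) wins
    return i if d_centroid[i] > d_centroid[j] else j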

14 CLUSTER SELECTION ALGORITHM
ARCHIVING PROCEDURE
- Add non-dominated solutions from the current population to the archive
- A solution is added if it is non-dominated and far, in both DS and OS, from the non-dominated solutions already added
- For each non-dominated solution in the archive, add solutions from its own cluster that are different in DS

15 DOMINATED PROMOTION ALGORITHM
SELECTION PROCEDURE: Hypervolume-based Fitness Assignment
- For each dominated solution, assign the hypervolume of the new non-dominated front obtained when that solution is treated as non-dominated, i.e., after removing all solutions that dominate it
- For each non-dominated solution, assign the hypervolume of the non-dominated set
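A minimal sketch of this fitness assignment for two minimization objectives, assuming a reference point that bounds all solutions; the helper names (hv_2d, nd_front, promotion_fitness) are illustrative.

import numpy as np

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def hv_2d(front, ref):
    """Hypervolume of a 2-objective (minimization) set w.r.t. a reference point
    that is worse than every point in the set."""
    pts = sorted({tuple(p) for p in front})          # sort by f1 ascending
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

def nd_front(F):
    """Indices of non-dominated rows of F (minimization)."""
    return [i for i, f in enumerate(F)
            if not any(dominates(F[j], f) for j in range(len(F)) if j != i)]

def promotion_fitness(F, ref):
    F = np.asarray(F, dtype=float)
    nd = nd_front(F)
    hv_nd = hv_2d(F[nd], ref)
    fit = np.empty(len(F))
    for i, f in enumerate(F):
        if i in nd:
            fit[i] = hv_nd                      # hypervolume of the non-dominated set
        else:
            # promote: remove every solution dominating f, then recompute the front
            keep = [j for j in range(len(F)) if not dominates(F[j], f)]
            sub = F[keep]
            fit[i] = hv_2d(sub[nd_front(sub)], ref)
    return fit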

17 DOMINATED PROMOTION ALGORITHM
SELECTION PROCEDURE: Binary Tournament Step
A solution may become a parent through binary tournament:
- If both solutions are feasible, the larger hypervolume wins
- A feasible solution beats an infeasible one
- If both are infeasible, the minimum sum of infeasibilities wins
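A minimal sketch of this constrained binary tournament, assuming a precomputed hypervolume-based fitness and a per-solution sum of constraint infeasibilities (0 = feasible); the function name is illustrative.

def dp_tournament(i, j, hv_fitness, violation):
    """Binary tournament with constraint handling.
    hv_fitness: hypervolume-based fitness (larger is better);
    violation: sum of infeasibilities (0 means feasible)."""
    feas_i, feas_j = violation[i] == 0, violation[j] == 0
    if feas_i and feas_j:
        return i if hv_fitness[i] > hv_fitness[j] else j   # larger hypervolume wins
    if feas_i != feas_j:
        return i if feas_i else j                          # feasible beats infeasible
    return i if violation[i] < violation[j] else j         # smaller sum of infeasibilities wins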

18 DOMINATED PROMOTION ALGORITHM
ARCHIVING PROCEDURE
- Update the non-dominated solutions
- Within a neighborhood of each non-dominated solution in OS, select solutions that are sufficiently far apart in DS
- Trim the archive set by removing nearest-neighbor solutions in DS
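A minimal sketch of the DS-based trimming step, assuming a greedy rule that repeatedly removes one member of the closest decision-space pair until a size limit is met; the specific drop rule is an assumption for illustration, not the authors' mechanism.

import numpy as np

def trim_archive_ds(X_archive, max_size):
    """Trim an archive of decision vectors by repeatedly dropping one member
    of the closest pair in decision space until the size limit is met."""
    X = list(X_archive)
    while len(X) > max_size:
        A = np.asarray(X)
        dist = np.linalg.norm(A[:, None, :] - A[None, :, :], axis=2)
        np.fill_diagonal(dist, np.inf)
        i, j = np.unravel_index(np.argmin(dist), dist.shape)
        # drop whichever member of the closest pair has the nearer second neighbor
        second_i = np.partition(dist[i], 1)[1]
        second_j = np.partition(dist[j], 1)[1]
        X.pop(i if second_i <= second_j else j)
    return X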

19 FINAL SOLUTION SELECTION
The final solution set is constructed in two different ways by searching the "neighborhood" around each non-dominated solution and selecting the solution with the largest minimum distance in the DS:
1. to the other points in the neighborhood, OR
2. to all other points in the archive
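A minimal sketch of the two variants, assuming the neighborhood membership around each non-dominated solution has already been determined (how that neighborhood is defined is left open here); the names and structure are illustrative.

import numpy as np

def pick_most_isolated(cands, X, reference_idx):
    """Among candidate indices cands, return the one whose minimum
    decision-space distance to the points in reference_idx is largest."""
    best, best_score = None, -np.inf
    for c in cands:
        others = [r for r in reference_idx if r != c]
        score = min((np.linalg.norm(X[c] - X[r]) for r in others), default=np.inf)
        if score > best_score:
            best, best_score = c, score
    return best

def final_solution_set(neighbourhoods, X, archive_idx, variant="archive"):
    """neighbourhoods: one candidate index list per non-dominated solution.
    variant 'neighbourhood' compares against the other points in the same
    neighborhood; variant 'archive' compares against all archived points."""
    chosen = []
    for cands in neighbourhoods:
        ref = cands if variant == "neighbourhood" else archive_idx
        chosen.append(pick_most_isolated(cands, X, ref))
    return chosen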

20 DIVERSITY METRICS
Average pairwise distance of all solutions, normalized by the decision space diameter (Shir et al., 2009):
$D_s = \frac{1}{D_{DS}} \cdot \frac{2}{n(n+1)} \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} d_{ij}$
Average nearest neighbor distance:
$D_1 = \frac{1}{n} \sum_{i=1}^{n} \min\{ d_{ij} : j \neq i \}$
Minimum nearest neighbor distance:
$D_2 = \min\{ d_{ij} : i \neq j \}$
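A short numpy sketch of the three metrics, assuming Euclidean distances and the normalization shown above; the function name is illustrative.

import numpy as np

def diversity_metrics(X, ds_diameter):
    """X: (n, d) decision vectors; ds_diameter: diameter of the decision space."""
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    iu = np.triu_indices(n, k=1)
    # D_s: average pairwise distance, normalized by the DS diameter (slide's normalization)
    d_s = (2.0 / (n * (n + 1))) * dist[iu].sum() / ds_diameter
    # D_1: average nearest-neighbor distance
    np.fill_diagonal(dist, np.inf)
    nearest = dist.min(axis=1)
    d_1 = nearest.mean()
    # D_2: minimum nearest-neighbor distance
    d_2 = nearest.min()
    return d_s, d_1, d_2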

21 TESTING AND COMPARISON
Two test problems: Lamé Superspheres, Omni-Test
GA settings:
- Population size: 200
- Number of generations:
- Simulated binary crossover
- Gaussian mutation
- 30 random trials
Three candidates for the diversity front:
- Best Pareto front found
- Diversity front chosen by nearest neighborhood solution
- Diversity front chosen by nearest archive solution
Performance comparisons based on the hypervolume metric and three diversity metrics.

22 TEST PROBLEM: LAMÉ SUPERSPHERES
n = 4 decision variables, 2 objective functions:
$\min f_1 = (1+r) \cdot \cos^2(x_1)$
$\min f_2 = (1+r) \cdot \sin^2(x_1)$
where $\varepsilon = \frac{1}{n-1} \sum_{i=2}^{n} x_i$, $r = \sin^2(\pi \varepsilon)$,
$x_1 \in [0, \pi/2]$, and $x_i \in [1, 5]$ for $i = 2, \dots, n$.
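A minimal evaluation sketch, reading the meta-variable ε as the average of x_2 … x_n (consistent with the 1/(n−1) factor and the variable bounds above); the function name is illustrative.

import numpy as np

def lame_superspheres(x):
    """x: array of length n with x[0] in [0, pi/2] and x[1:] in [1, 5].
    Returns (f1, f2) to be minimized."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    eps = x[1:].sum() / (n - 1)          # meta-variable from x_2 ... x_n
    r = np.sin(np.pi * eps) ** 2
    f1 = (1.0 + r) * np.cos(x[0]) ** 2
    f2 = (1.0 + r) * np.sin(x[0]) ** 2
    return f1, f2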

23 RESULTS: LAMÉ SUPERSPHERES
Algorithm            | Final solution set selection method | Hypervolume   | Shir diversity metric | Avg. dist. to nearest neighbor | Min. of min. dist. to nearest neighbor
Triple Rank          | Non-dominated                       | 3.206 ± 0.003 | 0.353 ± 0.014 | 0.345 ± 0.059 | 0.033 ± 0.027
Triple Rank          | Archive min. distances              | 3.190 ± 0.009 | 0.375 ± 0.017 | 0.752 ± 0.065 | 0.335 ± 0.080
Triple Rank          | Neighborhood min. distances         | 3.194 ± 0.006 | 0.398 ± 0.017 | 0.592 ± 0.065 | 0.113 ± 0.051
Cluster Selection    | Non-dominated                       | 3.206 ± 0.001 | 0.340 ± 0.025 | 0.309 ± 0.044 | 0.028 ± 0.011
Cluster Selection    | Archive min. distances              | 3.194 ± 0.008 | 0.349 ± 0.017 | 0.591 ± 0.065 | 0.197 ± 0.066
Cluster Selection    | Neighborhood min. distances         | 3.197 ± 0.006 | 0.357 ± 0.018 | 0.478 ± 0.058 | 0.094 ± 0.035
Dominated Promotion  | Non-dominated                       | 3.208 ± 0.001 | 0.363 ± 0.008 | 0.286 ± 0.020 | 0.062 ± 0.008
Dominated Promotion  | Archive min. distances              | 3.198 ± 0.004 | 0.372 ± 0.011 | 0.707 ± 0.047 | 0.289 ± 0.069
Dominated Promotion  | Neighborhood min. distances         | 3.201 ± 0.004 | 0.393 ± 0.008 | 0.462 ± 0.035 | 0.078 ± 0.016
Niching-CMA          |                                     | 3.172 ± 0.037 | 0.412 ± 0.061 |               |
CMA-MO               |                                     | 3.205 ± 0.007 | 0.115 ± 0.019 |               |
NSGA-II              |                                     | 3.203 ± 0.001 | 0.224 ± 0.046 |               |
NSGA-II-Agg.         |                                     | 3.109 ± 0.108 | 0.307 ± 0.049 |               |
Omni-Opt.            |                                     | 2.481 ± 0.375 | 0.029 ± 0.060 |               |
Legend: bold marks the best metric value and robustness (i.e., low standard deviation); red bold marks a best value occurring in an algorithm other than ours, black bold one occurring in our algorithms.

24 RESULTS: LAMÉ SUPERSPHERES
[Boxplot: comparison of the Shir diversity metric on Lamé Superspheres test runs (30 random trials); x-axis: TR-ND, TR-All, TR-Near, CS-ND, CS-All, CS-Near, DP-ND, DP-All, DP-Near, Niching-CMA, CMA-MO, NSGA-II, NSGA-II-Agg., Omni-Opt.]

25 TEST PROBLEM: OMNI-TEST
n = 5 decision variables, 2 objective functions:
$\min f_1 = \sum_{i=1}^{n} \sin(\pi x_i)$
$\min f_2 = \sum_{i=1}^{n} \cos(\pi x_i)$
$x_i \in [0, 6]\ \forall i$
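A minimal evaluation sketch of the Omni-Test objectives; the function name is illustrative.

import numpy as np

def omni_test(x):
    """x: array of length n with each x_i in [0, 6]. Returns (f1, f2) to be minimized."""
    x = np.asarray(x, dtype=float)
    f1 = np.sum(np.sin(np.pi * x))
    f2 = np.sum(np.cos(np.pi * x))
    return f1, f2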

26 RESULTS: OMNI-TEST
Algorithm            | Final solution set selection method | Hypervolume   | Shir diversity metric | Avg. dist. to nearest neighbor | Min. of min. dist. to nearest neighbor
Triple Rank          | Non-dominated                       | ± 0.267       | 0.186 ± 0.070 | 0.182 ± 0.076 | 0.004 ± 0.005
Triple Rank          | Archive min. distances              | ± 0.267       | 0.199 ± 0.073 | 0.268 ± 0.109 | 0.012 ± 0.005
Triple Rank          | Neighborhood min. distances         |               | 0.201 ± 0.073 | 0.256 ± 0.104 | 0.011 ± 0.005
Cluster Selection    | Non-dominated                       | ± 0.122       | 0.109 ± 0.064 | 0.117 ± 0.079 | 0.012 ± 0.004
Cluster Selection    | Archive min. distances              | ± 0.130       | 0.129 ± 0.073 | 0.246 ± 0.162 | 0.033 ± 0.012
Cluster Selection    | Neighborhood min. distances         | ± 0.127       | 0.129 ± 0.071 | 0.212 ± 0.126 | 0.027 ± 0.008
Dominated Promotion  | Non-dominated                       | ± 0.151       | 0.272 ± 0.046 | 0.570 ± 0.159 | ± 0.007
Dominated Promotion  | Archive min. distances              | ± 0.151       | 0.281 ± 0.044 | 0.740 ± 0.192 | 0.050 ± 0.015
Dominated Promotion  | Neighborhood min. distances         |               | 0.281 ± 0.043 | 0.712 ± 0.180 | 0.049 ± 0.015
Niching-CMA          |                                     | 30.27 ± 0.05  | 0.247 ± 0.061 |               |
CMA-MO               |                                     | 30.43 ± 0.002 | 0.042 ± 0.028 |               |
NSGA-II              |                                     | 30.17 ± 0.034 | 0.191 ± 0.085 |               |
NSGA-II-Agg.         |                                     | 29.81 ± 0.2   | 0.207 ± 0.065 |               |
Omni-Opt.            |                                     | 29.72 ± 0.20  | ± 0.002       |               |
Legend: bold marks the best metric value and robustness (i.e., low standard deviation); red bold marks a best value occurring in an algorithm other than ours, black bold one occurring in our algorithms.

27 RESULTS: OMNI-TEST
[Boxplot: comparison of the Shir diversity metric on Omni-Test runs (30 random trials); x-axis: TR-ND, TR-All, TR-Near, CS-ND, CS-All, CS-Near, DP-ND, DP-All, DP-Near, Niching-CMA, CMA-MO, NSGA-II, NSGA-II-Agg., Omni-Opt.]

28 OBSERVATIONS & OUTLOOK
Dominated Promotion algorithms perform consistently well in Pareto optimality and DS diversity metrics.
While most algorithms are robust, some sensitivity to internal parameters is observed. There is a need for improved parameter selection for:
- the relaxation margin for near-optimality
- the neighborhood size in final solution selection
Next steps also include:
- improving the adaptive mechanism for archive trimming and updating
- applying and testing the methods on more test problems and engineering applications

29

30 TEST PROBLEM: LAMÉ SUPERSPHERES
CLUSTER SELECTION PARETO FRONT
[Figure: Lame_cluster_selection (_best and _nearest_all)]

31 TEST PROBLEM: LAMÉ SUPERSPHERES
CLUSTER SELECTION ALL-ARCHIVE NEAREST NEIGHBOR DISTANCE SELECTION
[Figure: Lame_cluster_selection (_nearest_all)]

32 TEST PROBLEM: LAMÉ SUPERSPHERES
CLUSTER SELECTION HYPERVOLUME CONVERGENCE
[Figure: Lame_cluster_selection]

33 TEST PROBLEM: LAMÉ SUPERSPHERES
LAMÉ SUPERSPHERES OBJECTIVE AND DECISION SPACES

34 TEST PROBLEM: OMNI-TEST
CLUSTER SELECTION PARETO FRONT
[Figure: Cluster_selection_best and _nearest_all]

35 TEST PROBLEM: OMNI-TEST
CLUSTER SELECTION ALL-ARCHIVE NEAREST NEIGHBOR DISTANCE SELECTION
[Figure: CLUSTER_SELECTION_best and _nearest_all]

36 TEST PROBLEM: OMNI-TEST
CLUSTER SELECTION HYPERVOLUME CONVERGENCE
[Figure: Cluster selection _best and _nearest_all]

