Non-revisiting Genetic Algorithm with Constant Memory


1 Non-revisiting Genetic Algorithm with Constant Memory
Yang LOU, Shiu Yin YUEN
Department of Electronic Engineering, City University of Hong Kong

2 Contents
Introduction of cNrGA
Two Pruning Strategies
Experimental Results
Conclusions

3 Abbreviations
NrGA – Non-revisiting Genetic Algorithm
cNrGA – NrGA for continuous problems
BSP Tree – Binary Space Partitioning Tree (the archive structure in (c)NrGA)
LRU – Least Recently Used
R – Random
MT – Memory Threshold

4 Brief Intro. to cNrGA
When uniform crossover re-generates an individual that already exists in the search history, a parameter-less adaptive mutation is performed within the corresponding sub-region, so re-evaluation of the same solution is avoided.
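To make the mechanism concrete, here is a minimal Python sketch of the non-revisiting loop; the archive object and its contains / mutate_in_subregion / insert helpers are hypothetical names used only for illustration, not an API from the paper.

    def evaluate_without_revisiting(child, archive, fitness_fn):
        # If crossover re-generated an already-archived solution, replace it
        # with an adaptive mutation inside that solution's sub-region instead
        # of evaluating the duplicate again.
        while archive.contains(child):
            child = archive.mutate_in_subregion(child)
        fitness = fitness_fn(child)   # each distinct solution is evaluated once
        archive.insert(child, fitness)
        return child, fitness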

5 cNrGA
To keep: the use of the entire search history and the parameter-less adaptive mutation.
To overcome: memory usage is directly proportional to the maximum number of function evaluations (MemoryUsage ∝ MaxFEs).
To overcome: the limitation to the conventionally recommended budget of MaxFEs = 40,000 for problems such as the CEC test suite.

6 BSP Tree and Sub-regions
2D example: [upper figure] insertion of nodes (solutions) a and b; [lower figure] insertion of node (solution) c.
A node = {solution, fitness, time stamp, sub-region info, etc.}
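A minimal Python sketch of such a node and of inserting a new solution: the new point descends to the leaf whose sub-region contains it, and that region is split between the old and the new solution. The specific splitting rule below (cut halfway along the dimension of largest difference) is an illustrative assumption, not necessarily the exact rule used in cNrGA.

    class BSPNode:
        # One archive entry: a solution, its fitness, a time stamp, and the
        # axis-aligned sub-region (lower/upper bounds) it currently owns.
        def __init__(self, solution, fitness, stamp, lower, upper):
            self.solution, self.fitness, self.stamp = solution, fitness, stamp
            self.lower, self.upper = list(lower), list(upper)
            self.left = self.right = None      # children created by a split

    def contains(node, x):
        return all(node.lower[d] <= x[d] <= node.upper[d] for d in range(len(x)))

    def insert(root, x, fx, stamp):
        # Descend to the leaf whose sub-region contains x.
        node = root
        while node.left is not None:
            node = node.left if contains(node.left, x) else node.right
        # Split the leaf halfway between its stored solution and x along the
        # dimension where the two differ most (identical solutions are
        # already excluded by the revisit check).
        d = max(range(len(x)), key=lambda i: abs(x[i] - node.solution[i]))
        cut = 0.5 * (x[d] + node.solution[d])
        for sol, fit, st in ((node.solution, node.fitness, node.stamp), (x, fx, stamp)):
            lo, hi = list(node.lower), list(node.upper)
            if sol[d] <= cut:
                hi[d] = cut                    # this child keeps the lower half
            else:
                lo[d] = cut                    # this child keeps the upper half
            child = BSPNode(sol, fit, st, lo, hi)
            if node.left is None:
                node.left = child
            else:
                node.right = child
        return node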

7 Contents
Introduction of cNrGA
Two Pruning Strategies
Experimental Results
Conclusions

8 Strategies to Prune BSP Tree
Basic idea: when the memory threshold (MT) is reached, an element in the archive is pruned before a new one is recorded.
Least Recently Used (LRU) pruning: delete the least recently used (presumably the most useless) information in the archive; a time stamp on each node is therefore necessary.
Random (R) pruning: delete randomly chosen information in the archive.
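A hedged Python sketch of one reading of this step (following slide 9: remove two mutual-sibling leaves and keep the more recently generated of the two in their parent). It assumes each leaf additionally carries parent and sibling pointers, which the node sketch above does not show; the exact bookkeeping in the paper may differ.

    import random

    def prune_pair(leaves, strategy):
        # Called when the archive already holds MT entries, just before a new
        # solution is recorded.  Only leaves whose sibling is also a leaf are
        # considered (the easy case, see slide 10).
        candidates = [n for n in leaves if n.sibling.left is None]
        if strategy == "LRU":
            victim = min(candidates, key=lambda n: n.stamp)   # oldest time stamp
        else:                                                 # "R": uniform random
            victim = random.choice(candidates)
        survivor = max((victim, victim.sibling), key=lambda n: n.stamp)
        parent = victim.parent
        # Both siblings are removed; the parent becomes a leaf again over the
        # merged sub-region and records the more recent of the two solutions.
        parent.solution, parent.fitness, parent.stamp = (
            survivor.solution, survivor.fitness, survivor.stamp)
        parent.left = parent.right = None
        leaves.remove(victim)
        leaves.remove(victim.sibling)
        leaves.append(parent)
        return parent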

9 Comparison of LRU & R
LRU & R: both remove two nodes (mutual siblings) from the BSP tree, and add back the nodes recording the latest generated solutions.
LRU vs. R: they differ only in which historical information is chosen for deletion.

10 It is technically not advisable to prune node a.
It is technically easy to prune node b: delete the historical information of node b and re-allocate the sub-regions of the related nodes.

11 The Novel Operators: cNrGA/CM/LRU & cNrGA/CM/R
LRU & R act as novel parameter-less adaptive mutation operators: they introduce no additional parameters and they enlarge the mutation range (favouring exploration).
LRU: non-uniform; older solutions are more likely to be deleted, so their search sub-regions are expanded.
R: uniform.
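A minimal sketch of that mutation step, reusing the node fields from the earlier sketch; treating "mutation within the sub-region" as uniform sampling of the region is an illustrative assumption.

    import random

    def adaptive_mutation(leaf):
        # Parameter-less adaptive mutation: sample the offspring uniformly
        # inside the leaf's sub-region, so the step size comes from the
        # partition itself rather than from a user-set mutation parameter.
        # After LRU/R pruning the merged sub-regions are larger, which
        # enlarges the mutation range and hence the exploration.
        return [random.uniform(lo, hi) for lo, hi in zip(leaf.lower, leaf.upper)]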

12 Contents
Introduction of cNrGA
Two Pruning Strategies
Experimental Results
Conclusions

13 Experimental Study
Compared algorithms: cNrGA, cNrGA/CM/LRU, and cNrGA/CM/R.
CEC 2013 benchmark functions.
Parameters: population size = 100, crossover rate = 0.5 (NrGA has very few parameters).

14 Experimental Study
Info. Loss = (MaxFEs − MT) / MaxFEs

MaxFEs       40,000   50,000   60,000   100,000
MT           30,000   30,000   30,000    10,000
Info. Loss      25%      40%      50%       90%
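The entries of the table follow directly from the formula; a one-line helper for checking them:

    def info_loss(max_fes, mt):
        # Fraction of the search history that cannot be archived once the
        # memory threshold MT is fixed.
        return (max_fes - mt) / max_fes

    # e.g. info_loss(40000, 30000) == 0.25 and info_loss(100000, 10000) == 0.90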

15 Average Rank (i): 25% Loss (MaxFEs = 40,000, MT = 30,000)

        cNrGA   cNrGA/CM/LRU   cNrGA/CM/R
Rank     1.96           1.86         2.18

16 Average Rank (ii): 40% Loss (MaxFEs = 50,000, MT = 30,000)

        cNrGA   cNrGA/CM/LRU   cNrGA/CM/R
Rank     2.00           2.11         1.89

17 Average Rank (iii): 50% Loss (MaxFEs = 60,000, MT = 30,000)

        cNrGA   cNrGA/CM/LRU   cNrGA/CM/R
Rank     1.93           2.00         2.07

18 Average Rank (iv): 90% Loss (MaxFEs = 100,000, MT = 10,000)

        cNrGA   cNrGA/CM/LRU   cNrGA/CM/R
Rank     2.00           1.93         1.65

19 Contents
Introduction of cNrGA
Two Pruning Strategies
Experimental Results
Conclusions

20 Conclusions
The applicability of cNrGA is widened to situations in which MaxFEs is large;
Least Recently Used (LRU) and Random (R) pruning strategies are proposed to keep the memory usage constant;
Both pruning strategies maintain the good performance of cNrGA.

21 Thanks for your attention!
Q & A

