
Presentation on theme: "- Divided Range Multi-Objective Genetic Algorithms -"— Presentation transcript:

1 The New Model of Parallel Genetic Algorithm in Multi-Objective Optimization Problems
- Divided Range Multi-Objective Genetic Algorithms - ○Tomoyuki Hiroyasu, Mitsunori Miki, Shinya Watanabe, Intelligent Systems Design Laboratory, Doshisha University, Japan. Doshisha Univ., Japan. I am Shinya Watanabe, a graduate student at Doshisha University in Japan. Today I will talk about our study, titled "Parallel Evolution Multi-Criterion Optimization for Block Layout Problems".

2 EMO Background (1)
●Multi-criterion optimization ●Genetic algorithms (e.g. VEGA, MOGA, NPGA, etc.) ・High computation cost → Parallel computing. This is the background of our study. Multi-criterion optimization solved with evolutionary algorithms is often called EMO. There are quite a few studies on EMO, and many of them obtain good results. However, EMO is known to have several problems, one of which is its high computation cost. One of the simplest and most powerful solutions is to run EMO on parallel computers, because evolutionary algorithms have inherent parallelism and PC cluster systems have become very popular these days.

3 Background (2)
●Parallel EMO algorithms: distributed GA model, master-slave model, cellular GA model → Divided Range Multi-Objective Genetic Algorithms (DRMOGA). Now I would like to focus on parallel EMO algorithms. Several parallel models for EMO have been proposed, but there are few studies on the validity of these parallel models. We therefore proposed a new parallel model, the Divided Range Multi-Objective Genetic Algorithm (DRMOGA). This model was applied to several test functions and was found to be effective for continuous multi-objective problems, but it had not yet been applied to discrete problems. Therefore, to examine the effectiveness of DRMOGA, we applied it to discrete problems; here we selected block layout problems.

4 Multi-Criterion Optimization Problems (1)
●Multi-criterion optimization problems (MOPs)
Design variables: X = {x1, x2, …, xn}
Objective functions: F = {f1(x), f2(x), …, fm(x)}
Constraints: Gi(x) < 0 (i = 1, 2, …, k)
(Figure: feasible region in the f1-f2 plane, with Pareto-optimal and weak Pareto-optimal fronts.)
In optimization problems with several objective functions, the problems are called multi-objective or multi-criterion optimization problems (MOPs). In general, MOPs are formulated as above. Usually there are trade-off relations between the objective functions, so the optimum solution is not unique; instead, the concept of the Pareto-optimal solution is used. The figure shows the Pareto-optimal solutions of a two-objective problem: Pareto-optimal solutions are drawn as a red line, and weak Pareto-optimal solutions as a blue line. A weak Pareto-optimal solution is one that attains the minimum value of a single objective function. Since the Pareto-optimal solutions are the rational solutions to an MOP, the first goal of solving an MOP is to obtain them.

5 Multi-Criterion Optimization Problems (2)
・Pareto dominance and the ranking method
Pareto-optimal set: the set of non-inferior individuals in each generation.
Rank = 1 + (number of dominating individuals)
Here I describe how to judge whether an individual is a Pareto solution or not. The figure shows a minimization problem with two objective functions; the red line is the Pareto-optimal front. The blue points are non-inferior to the red points from any point of view. In brief, the Pareto-optimal set is the set of non-inferior individuals in each generation. In MOPs, the concept of ranking is usually used; it extends the concept of Pareto optimality. Fonseca defined the rank of an individual as one plus the number of individuals that dominate it. These ranks are then used as the fitness for selection operators such as roulette selection.
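The dominance check and Fonseca's ranking rule above can be sketched in Python. This is a minimal illustrative sketch for minimization problems; the function names are not from the original talk:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fonseca_rank(objectives):
    """Fonseca's ranking: each individual's rank is 1 plus the number
    of individuals in the population that dominate it (rank 1 = non-inferior)."""
    return [1 + sum(dominates(other, me) for other in objectives if other is not me)
            for me in objectives]
```

Rank-1 individuals are exactly the non-inferior set of the current generation, which is why these ranks can feed directly into a fitness-proportionate operator such as roulette selection.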

6 Genetic Algorithms
Flow: Start (initialization) → Evaluation → Selection → Crossover → Mutation → iterate → End (solution found)
Features: a metaphor of the mechanisms of evolution in nature; stochastic search; multi-point search; high calculation cost
Here I briefly explain genetic algorithms. GAs are optimization methods whose behavior derives from a metaphor of the mechanisms of evolution in nature. Their features are stochastic search and multi-point search. In GAs there are several search points, called individuals, so GAs can be applied both to problems with continuous values and to problems with discrete values. This is the typical flow of a simple GA: evaluation, selection, crossover, and mutation are genetic operations repeated every generation, and after some generations we may obtain a good solution. One disadvantage of GAs is their high calculation cost: they need many iterations and take much time. One solution to this problem is to run GAs on parallel computers.
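The flow above can be sketched as a minimal real-coded GA. This is an illustrative sketch, not the authors' implementation: it uses tournament selection for brevity (the slides use roulette selection), and all names and parameter defaults are assumptions:

```python
import random

def simple_ga(evaluate, n_genes, pop_size=50, generations=100,
              crossover_rate=1.0, mutation_rate=0.05):
    """Minimal GA sketch: evaluation, tournament selection, one-point
    crossover, and per-gene mutation, repeated every generation.
    Requires n_genes >= 2; maximizes the given evaluate() function."""
    pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        fitness = [evaluate(ind) for ind in pop]          # evaluation
        def tournament():
            a, b = random.randrange(pop_size), random.randrange(pop_size)
            return pop[a] if fitness[a] >= fitness[b] else pop[b]
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = tournament(), tournament()           # selection
            if random.random() < crossover_rate:          # crossover
                cut = random.randrange(1, n_genes)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            child = [random.random() if random.random() < mutation_rate else g
                     for g in child]                      # mutation
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=evaluate)
```

The loop makes the high-calculation-cost point concrete: every generation requires pop_size evaluations, which is exactly the part that parallelizes well.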

7 Multi-objective GA (1)
(Figure: the population in the f1-f2 objective space at generations 1, 5, 10, 30, and 50, converging toward the Pareto-optimal front.)
Now I would like to talk about multi-objective GAs. In a multi-objective GA, the population is scattered in the objective space as in this figure. As in a single-objective GA, genetic operations such as evaluation, selection, crossover, and mutation are repeated, and as the generations proceed, the set of individuals moves toward the Pareto-optimal solutions.

8 Multi-objective GA (2)
VEGA: Schaffer (1985)
Ranking: Goldberg (1989)
MOGA: Fonseca (1993)
Ranking + sharing: Srinivas (1994)
VEGA + Pareto-optimum individuals: Tamaki (1995)
Non-Pareto-optimum elimination: Kobayashi (1996)
Others
There has been a good deal of research focused on multi-objective GAs; I would like to briefly mention the leading work. Schaffer developed VEGA, the first study in this category. Goldberg introduced the ranking method, and Fonseca developed MOGA. There is also much other research in this category. Thus, there are several models of multi-objective GA, and they can derive good Pareto-optimal solutions. However, many iterations are needed to calculate the values of the objective functions and the constraints, which leads to a high calculation cost. One solution to this problem is to run GAs on parallel computers. In particular, multi-objective GAs need more memory: when there are many objective functions, many search points are necessary.

9 Parallelization of Genetic Algorithms
Distributed GA model: island model (free topology)
Master-slave model: global parallelization (only the evaluation is done in parallel)
Cellular GA model: neighborhood model (mainly grid topology)

10 DGA model
(Figure: the populations of Island 1 and Island 2 searching the same region of the f1-f2 objective space.)
・Cannot perform an efficient search
・Needs a large population size in each island
Now I would like to talk about parallel EMO. Whether for single- or multi-objective GAs, most parallel GAs are distributed GAs (DGAs). In this model, the population is divided into several subpopulations, and an SGA is performed on each subpopulation; sometimes individuals are exchanged by the migration operation. As several researchers have shown, this model can obtain better solutions than an SGA. We also proposed a new parallel model for multi-objective GAs.

11 Divided Range Multi-Objective GA (1)
1st: The individuals are sorted by the value of the focused objective function.
2nd: N/m individuals are chosen in sequence.
3rd: An SGA is performed on each subpopulation.
4th: After some generations, the procedure returns to the first step.
(Figure: the population divided into Division 1 and Division 2 by the value of f1 in the f1-f2 plane.)
This is called the Divided Range Multi-Objective GA (DRMOGA). DRMOGA is one of the divided-population models: the population is divided into subpopulations. The figure shows the concept of DRMOGA with two objective functions; the individuals are divided into two groups by the value of the focused objective function f1. The algorithm follows the steps above. As a result, there are m subpopulations. The most important point is that the search domain is different in each subpopulation.
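The division step above (sort by the focused objective, then split into m sub-populations of N/m individuals) can be sketched as follows; this is an illustrative sketch, and the function name is not from the original talk:

```python
def divide_range(population, objectives, m, focus=0):
    """DRMOGA division step (sketch): sort the merged population by the
    focused objective, then split it into m contiguous sub-populations of
    N/m individuals each. An SGA is then run on each sub-population."""
    order = sorted(range(len(population)), key=lambda i: objectives[i][focus])
    size = len(population) // m
    return [[population[i] for i in order[k * size:(k + 1) * size]]
            for k in range(m)]
```

Because the split is by objective value rather than at random, each sub-population searches a different slice of the objective range, which is the key difference from the plain island model.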

12 Divided Range Multi-Objective GA (2)
(Figure: in the DGA (island model), the subpopulations overlap in the f1-f2 space; in DRMOGA, each subpopulation covers a different range.)
This figure compares the search of DGA and DRMOGA. As you can see, the subpopulations of a DGA search the same feasible domain, so an efficient search cannot be performed. On the other hand, each subpopulation of DRMOGA is determined by the value of the focused objective function, so the search areas of the subpopulations do not overlap, and DRMOGA can maintain high diversity.

13 Configuration of GA (1)
Expression of genes: vector, e.g. a1 = {0.02, 10.03, ・・・, 7.52}
Crossover: Center Neighborhood Crossover
Selection: rank-1 selection with sharing / roulette selection / roulette selection + sharing
Mutation: none
Terminal condition: when the movement of the Pareto frontier is very small
Now I would like to explain the configuration of the GA. This figure shows a coding example. In the block packing method, a chromosome has two kinds of information: the block number and the direction of the block. The table shows the genetic operations we selected. For selection, we chose a Pareto preservation strategy, in which all rank-1 individuals are preserved. For crossover, the PMX method is used; PMX was originally developed for traveling salesman problems. For mutation, a 2-bit substitution method is used, in which two arbitrary bits are selected and substituted.

14 Configuration of GA (2)
・Center Neighborhood Crossover (CNX)
1st: N+1 parent individuals are selected randomly.
2nd: The center of gravity is derived as rg = (1/(n+1)) Σ ri.
3rd: A new individual is generated as rchild = rg + Σ ti ei, with σi = α|ri - rg|.
(Figure: parents, nominees, their center of gravity, and the generated child in design space.)
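One possible reading of the CNX equations above is sketched below. The interpretation that each ti is a normal deviate with standard deviation σi = α|ri - rg| and that ei is the unit vector from the center of gravity toward parent i is an assumption, not confirmed by the slides:

```python
import random

def cnx(parents, alpha=3.0):
    """Center Neighborhood Crossover (sketch): take n+1 parents, compute
    their center of gravity rg = (1/(n+1)) * sum(ri), then perturb rg along
    each (ri - rg) direction by a normal deviate ti with standard deviation
    sigma_i = alpha * |ri - rg| (assumed interpretation)."""
    n_genes = len(parents[0])
    k = len(parents)
    # center of gravity of the parents
    rg = [sum(p[j] for p in parents) / k for j in range(n_genes)]
    child = rg[:]
    for p in parents:
        diff = [p[j] - rg[j] for j in range(n_genes)]
        dist = sum(d * d for d in diff) ** 0.5
        if dist == 0:
            continue                              # parent coincides with rg
        t = random.gauss(0.0, alpha * dist)       # sigma_i = alpha * |ri - rg|
        unit = [d / dist for d in diff]           # e_i: unit vector toward parent i
        child = [c + t * u for c, u in zip(child, unit)]
    return child
```

With this reading, α controls how far the child can stray from the parents' centroid, which matches the α = 3 vs. α = 6 distributions shown on the next slide.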

15 Configuration of GA (3)
・Normal distribution (figure: the perturbation distributions for α = 3 and α = 6)

16 Parameter (1)
Application models: SGA, DGA, DRMOGA

GA parameter                          SGA    DGA / DRMOGA
Population size
Crossover rate                        1.0    1.0
Mutation rate
Migration interval (sort interval)    -      5
Migration rate                        -      0.2

We applied SGA, DGA, and DRMOGA to the block layout problems. To investigate the characteristics of the three models, we used two layout problems, with thirteen and twenty-seven blocks. The parameters of the GA are shown in this table.

17 Parameter (2)
Cases:
Case     α    Selection method
Case1    3    Pareto-optimal (rank-1) preservation
Case2    6    Pareto-optimal (rank-1) preservation
Case3    3    Only roulette selection
Case4    6    Only roulette selection
Case5    3    Roulette selection with sharing
Case6    6    Roulette selection with sharing

18 Metrics - Evaluation methods -
・Number of Pareto-optimum individuals
・Error E (smaller values are better, E > 0)
・Cover rate C (index of diversity, 0 < C < 1)
・Generations (smaller values are better)

19 Cover rate
(Figure: the range of each objective is divided into 10 intervals; the obtained solutions fall into 8 of the intervals for f1 and 9 of the intervals for f2.)
cover rate(f1) = 8/10 = 0.8
cover rate(f2) = 9/10 = 0.9
Cover rate = (0.8 + 0.9) / 2 = 0.85
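The cover-rate computation illustrated above (divide each objective's range into intervals, count the intervals containing at least one solution, and average over objectives) might be sketched as follows. The equal-width binning and the default use of observed min/max as bounds are assumptions:

```python
def cover_rate(solutions, n_bins=10, bounds=None):
    """Cover rate sketch: for each objective, divide its range into n_bins
    equal intervals and count the fraction of intervals containing at least
    one solution; the overall cover rate is the average over objectives.
    `bounds` is an optional list of (lo, hi) per objective; by default the
    observed min/max of the solutions is used (an assumption)."""
    m = len(solutions[0])
    rates = []
    for j in range(m):
        vals = [s[j] for s in solutions]
        lo, hi = bounds[j] if bounds else (min(vals), max(vals))
        width = (hi - lo) / n_bins or 1.0   # avoid division by zero
        filled = {min(int((v - lo) / width), n_bins - 1) for v in vals}
        rates.append(len(filled) / n_bins)
    return sum(rates) / m
```

With the slide's example (8 of 10 intervals filled for f1, 9 of 10 for f2), this returns 0.85.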

20 Cluster system for calculation
Spec. of cluster (5 nodes):
Processor: Pentium II (Deschutes)
Clock: (MHz)
# Processors: 1 × 5
Main memory: 128 MB × 5
Network: Fast Ethernet (100 Mbps)
Communication: TCP/IP, MPICH 1.1.2
OS: Linux
Compiler: gcc (egcs)
The numerical examples were performed on a PC cluster system; this is its specification.

21 Numerical Examples
Test functions from Tamaki et al. (1995), Van Veldhuizen and Lamont (1999), and K. Deb (1999).

22 Example 1 (objective functions and constraints shown as figures on the slide)

23 Example 2 (objective functions and constraints shown on the slide; three objectives f1, f2, f3)

24 Example 3 (objective functions f1, f2 shown on the slide; plots of f1 and f2 against x1)

25 Example 4
Objective functions f1, f2, with g(x2, ・・・, xN) = 1 + 10(N - 1) + Σ(・・・)

26 Results (Example 1): Pareto fronts in the f1-f2 plane obtained by DGA (Case 5) and DRMOGA (Case 5).

27 Results (Example 1)

Model    Case     Solutions  Error  Cover rate  Generations
Simple   Case1    436        0.00   1.00        799
Simple   Case2    382        0.03   1.00        1000
Simple   Case3    471        0.00   1.00        35
Simple   Case4    444        0.00   1.00        367
Simple   Case5    461        0.00   1.00        39
Simple   Case6    330        0.00   1.00        1000
Island   Case1    436        0.01   1.00        43
Island   Case2    438        0.01   1.00        59
Island   Case3    423        0.01   1.00        273
Island   Case4    435        0.01   1.00        44
Island   Case5    431        0.01   1.00        66
Island   Case6    404        0.01   1.00        927
DR       Case1    500        0.00   1.00        40
DR       Case2    500        0.00   1.00        48
DR       Case3    494        0.00   1.00        105
DR       Case4    494        0.00   1.00        548
DR       Case5    495        0.00   1.00        199
DR       Case6    494        0.00   1.00        814

28 Results (Example 2): solutions obtained by DGA (Case 1) and DRMOGA (Case 1).

29 Results (Example 2): solutions in the x1-x2 design space for DGA (Case 1) and DRMOGA (Case 1).

30 Results (Example 2)

Model    Case     Solutions  Cover rate  Generations
Simple   Case1    500        0.75        15
Simple   Case2    500        0.74        18
Simple   Case3    491        0.51        19
Simple   Case4    485        0.50        30
Simple   Case5    316        0.48        19
Simple   Case6    207        0.47        198
Island   Case1    428        0.79        19
Island   Case2    426        0.79        36
Island   Case3    434        0.76        22
Island   Case4    403        0.77        55
Island   Case5    6          0.04        1000
Island   Case6    125        0.43        943
DR       Case1    386        0.95        44
DR       Case2    330        0.96        256
DR       Case3    429        0.92        82
DR       Case4    255        0.85        277
DR       Case5    337        0.88        66
DR       Case6    90         0.53        117

31 Results (Example 3): Pareto fronts in the f1-f2 plane for DGA (Case 4) and DRMOGA (Case 4).

32 Results (Example 3): f1 plotted against x1 for DGA (Case 4) and DRMOGA (Case 4).

33 Results (Example 3)

Model    Case     Solutions  Error  Cover rate  Generations
Simple   Case1    500        7.21   0.41        394
Simple   Case2    456        5.92   0.32        612
Simple   Case3    469        3.75   0.14        1000
Simple   Case4    374        2.37   0.47        1000
Simple   Case5    482        3.84   0.48        1000
Simple   Case6    423        2.60   0.43        1000
Island   Case1    345        6.41   0.46        570
Island   Case2    322        5.88   0.48        919
Island   Case3    301        3.70   0.22        1000
Island   Case4    220        2.60   0.35        1000
Island   Case5    283        3.39   0.43        1000
Island   Case6    240        2.41   0.31        1000
DR       Case1    412        6.87   0.38        533
DR       Case2    363        5.38   0.28        774
DR       Case3    425        4.53   0.40        780
DR       Case4    293        0.01   0.99        1000
DR       Case5    393        3.92   0.41        692
DR       Case6    254        0.14   0.94        971

34 Results (Example 3): DRMOGA (Case 4) with object sharing and with plan sharing.

35 Results (Example 3): f(x1) plotted against x1 for DRMOGA (Case 4) with object sharing and with plan sharing.

36 Results (Example 4): Pareto fronts in the f1-f2 plane for DGA (Case 6) and DRMOGA (Case 6).

37 Results (Example 4): f1 plotted against x1 for DGA (Case 6) and DRMOGA (Case 6).

38 Results (Example 4)

Model    Case     Solutions  Error  Cover rate  Generations
Simple   Case1    500        1.70   0.31        209
Simple   Case2    500        1.71   0.38        358
Simple   Case3    470        0.32   0.22        1000
Simple   Case4    477        0.03   0.40        1000
Simple   Case5    492        0.34   0.58        855
Simple   Case6    493        0.08   0.60        899
Island   Case1    385        1.89   0.40        333
Island   Case2    409        1.75   0.53        403
Island   Case3    304        0.31   0.33        1000
Island   Case4    361        0.24   0.46        1000
Island   Case5    376        0.27   0.60        1000
Island   Case6    365        0.25   0.60        1000
DR       Case1    494        1.93   0.37        212
DR       Case2    457        3.10   0.34        54
DR       Case3    460        0.39   0.30        262
DR       Case4    451        0.03   0.52        387
DR       Case5    442        0.39   0.47        291
DR       Case6    402        0.07   0.61        654

39 Conclusion
In this study, we introduced a new genetic algorithm model for multi-objective optimization problems: the Divided Range Multi-Objective Genetic Algorithm (DRMOGA). DRMOGA
- is a model well suited to parallel processing;
- can derive solutions in a short time;
- can derive solutions with high accuracy;
- can sometimes derive better solutions than the single and island models.


41 Results of the 27-block case (SGA)
And these are the results of DGA and SGA. They often obtained truly weak Pareto solutions.

