
Island-Based GA for Optimization
University of Guelph, School of Engineering
Hooman Homayounfar, March 2003

Outline
- Dynamic Optimization Problems
- Current Techniques and Limitations
- Advanced GA
- IGA for Optimization
- Implementation
- Results Analysis and Conclusion
- Future Work

Optimization Problems
- Optimization: In the real world, many problems (e.g. the Traveling Salesman Problem, playing chess) have numerous possible solutions. Finding the optimum solution, the one with the minimum cost, is the main goal of optimization. In most cases, searching the entire solution space is practically impossible.
- Classification of optimization problems:
  . Static: constraints remain fixed during the computation and afterwards.
  . Dynamic: constraints vary during the computation or after the optimum solution has been found.
A minimal TSP cost evaluation is sketched below as a concrete example.
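The slides contain no code, so the following minimal Python sketch (all names hypothetical, not from the original work) shows the kind of cost function an optimizer minimizes for the TSP: the total length of a closed tour over a distance matrix.

```python
# Minimal TSP cost evaluation (illustrative sketch, not from the slides).
# A tour is a permutation of city indices; its cost is the closed-tour length.

def tour_cost(tour, dist):
    """Total length of the closed tour under distance matrix `dist`."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

# Example: four cities arranged so the perimeter tour has cost 4.
dist = [
    [0, 1, 2, 1],
    [1, 0, 1, 2],
    [2, 1, 0, 1],
    [1, 2, 1, 0],
]
print(tour_cost([0, 1, 2, 3], dist))  # -> 4
```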

Fig 1: Static and dynamic optimization (in the static case a problem is solved once and the solution is used; in the dynamic case a change in the problem triggers a correction of the solution).

Dynamic Optimization
Definition: The problem's constraints and elements change after the problem has been solved.
Goal: To find the new optimum solution in the best way (the worst way is to solve the problem from scratch).
Current techniques:
- Using memory: storing the history of each peak for further exploration.
- Editing the solution: modifying the last optimum solution.
- GA (adaptive mutation): increasing the mutation rate after each change.
- Multi-population GA: keeping track of each peak with a sub-population (i.e. an island).

Optimization Problems: Applications
- Vehicle routing
- Goods delivery
- Large-scale scheduling and transportation (e.g. army logistics)
Characteristics of dynamic optimization environments:
- Elements and conditions change over time.
- The optimum solution changes over time.
- Computation time is high.
Example: the Traveling Salesman Problem (TSP) for goods delivery.

Heuristic Techniques for Optimization: A Classification
- Traditional techniques (e.g. Tabu Search, Simulated Annealing, ...): exploiting and tuning a single solution.
- Evolutionary algorithms (e.g. Genetic Algorithms): exploring the search space and blending solutions.
- Hybrid algorithms: exploring good solutions and tuning them to find the optimum.
- Learning algorithms (e.g. Neural Networks, Reinforcement Learning): learning how to generate good solutions.

Limitations and Challenging Issues
- Local optima and premature convergence are always a problem; in other words, the optimum solution is not guaranteed.
- Optimization of NP-hard problems on huge benchmarks is very complex and time consuming.
- The dynamic nature of the problems increases the complexity.
- Each technique has strengths and weak points, and usually performs well only on specific benchmarks. There is no comprehensive technique that can solve most problems desirably.

Genetic Algorithms: Strengths and Drawbacks
GA: inspired by natural genetics, a GA evolves a generation of chromosomes (i.e. candidate solutions) through selection, crossover, mutation, and replacement to produce excellent genomes (i.e. solutions). A minimal sketch of one such generation follows.
Fig 2: Genetic Algorithm (generations 1 to m of n chromosomes each, evolved by selection, crossover, mutation, and replacement).
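As a hedged illustration of the loop sketched in Fig 2, here is one GA generation in Python; the fitness function, operators, and parameter values are hypothetical placeholders, not those used in the original work.

```python
import random

# One GA generation over bitstring chromosomes (illustrative sketch only).

def fitness(chrom):
    return sum(chrom)  # placeholder objective: maximize the number of 1s

def tournament(pop, k=3):
    """Pick the fittest of k randomly sampled chromosomes."""
    return max(random.sample(pop, k), key=fitness)

def one_point_xover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.01):
    return [1 - g if random.random() < rate else g for g in chrom]

def next_generation(pop):
    best = max(pop, key=fitness)  # elitism: keep the best chromosome
    children = [mutate(one_point_xover(tournament(pop), tournament(pop)))
                for _ in range(len(pop) - 1)]
    return [best] + children

pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for _ in range(50):
    pop = next_generation(pop)
print(max(fitness(c) for c in pop))
```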

Why GA for Optimization?
GA is:
- able to cover the solution space widely;
- easy to hybridize with other algorithms (e.g. local search);
- flexible and suitable for dynamic environments.
Limitations of basic GA:
- still no guarantee of the optimum solution (i.e. premature convergence);
- high computation time.

Advanced GA
Adaptive GA: automatically adjusting the GA operators according to the evaluation of the chromosomes in each generation. A sketch of one such adjustment rule follows.
Fig 3: Adaptive GA (initialization; evolution of individuals; evaluation of the convergence rate; GA parameter adjustment; next generation; final solution).
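A minimal sketch of the adaptive idea, assuming (hypothetically) that the mutation rate is raised when population diversity drops, in the spirit of the adaptive-mutation technique mentioned earlier; the thresholds and bounds are assumptions.

```python
# Adaptive mutation-rate rule (illustrative; all thresholds are assumptions).

def diversity(pop):
    """Fraction of gene positions that are not unanimous across the population."""
    n = len(pop[0])
    varied = sum(1 for i in range(n) if len({c[i] for c in pop}) > 1)
    return varied / n

def adjust_mutation_rate(rate, div, low=0.05, high=0.25,
                         min_rate=0.005, max_rate=0.2):
    """Raise the mutation rate when diversity is low (risk of premature
    convergence); lower it when diversity is already high."""
    if div < low:
        rate *= 2.0
    elif div > high:
        rate *= 0.5
    return min(max(rate, min_rate), max_rate)
```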

Advanced GA (Cont.)
Parallel GA:
- independent or dependent multi-population GA
- synchronous or asynchronous PGA
- peer-to-peer (P2P) or master-slave sub-populations
Fig 4: Parallel GA (the problem is distributed over sub-populations 1 to n, which together yield the best solution).

Advanced GA (Cont.)
Hybrid GA: using a greedy algorithm (e.g. local search) to improve the quality of individuals in each generation; a sketch of such an improvement step follows.
Fig 5: Hybrid GA (initialization; evolution of individuals by GA; exploitation by heuristic search; evaluation of individuals; next generation; final solution).
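For the TSP, a common local-search step in hybrid GAs is 2-opt; the original slides do not name the exact move, so this is a hedged sketch, reusing the hypothetical tour_cost helper defined earlier.

```python
# One pass of 2-opt improvement (a common local-search step in hybrid GAs;
# the slides do not specify the move, so this choice is an assumption).

def two_opt_pass(tour, dist):
    """Reverse the first segment found whose reversal shortens the tour."""
    n = len(tour)
    base = tour_cost(tour, dist)
    for i in range(n - 1):
        for j in range(i + 2, n):
            candidate = tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
            if tour_cost(candidate, dist) < base:
                return candidate  # first improving move found
    return tour  # locally optimal under 2-opt
```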

Advanced GA (Cont.)
Multi-level GA: splitting the problem into small sub-problems and merging the sub-solutions.
Fig 6: Multi-level GA (the original problem is clustered into sub-problems 1 to 3, each assigned to a sub-population under a master population (PGA); the sub-solutions are merged into the final solution).

IGA for Optimization
What is IGA (Island-based GA)? IGA is a multi-population GA in which chromosomes can migrate between the islands (sub-populations).
Fig 7: Island-based GA (islands 1 to n connected by migration).

IGA for Optimization
IGA (Island-based GA) characteristics:
- Customized multi-population (i.e. islands)
- Synchronized and P2P migration (i.e. ring topology)
- Adaptive operators:
  . local operators (mutation, crossover, and hybrid rate)
  . global operators (migration rate, migration period)
- Selectable hybrid (e.g. GA+LS, GA+TS, GA+SA)
- Two crossover methods used dynamically (i.e. one-point and two-point), as sketched below
- Auto-controlling the occurrence of each chromosome to prevent saturation of the population
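The one-point/two-point crossover choice can be illustrated as follows. Note that for permutation encodings such as TSP tours, an order-preserving variant (e.g. order crossover) with repair would be required; this generic sketch omits that, and the 50/50 selection rule is an assumption.

```python
import random

# Generic one-point vs. two-point crossover (illustrative only; for TSP
# permutations an order-preserving variant such as OX would be needed).

def crossover(a, b, two_point):
    if two_point:
        i, j = sorted(random.sample(range(1, len(a)), 2))
        return a[:i] + b[i:j] + a[j:]
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

# Dynamic choice of method, here a simple coin flip.
child = crossover([0, 1, 2, 3, 4, 5], [5, 4, 3, 2, 1, 0],
                  two_point=random.random() < 0.5)
```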

Fig 8: Periodic injection of remote chromosomes prevents common convergence (tour cost versus generation number for populations 1 and 2, each with and without remote injection; injection starts and stops periodically).

Fig 9: IGA main algorithm: (1) start and initialize the global variables; (2) read the benchmark and calculate the costs; (3) generate the islands; (4) send the global variables to each island; (5) run the islands in parallel; (6) receive the best solution so far from each island; (7) once the last island has sent its results, show the results and stop.

Fig 10: IGA algorithm for an island: initialize the population; crossover and mutation; local search; migration (send/receive chromosomes); selection; new population; evaluation of the population and parameter adjustment; send the best individual to the controller. A sketch of this loop follows.
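A hedged Python sketch of this per-island loop with ring migration: `evolve` and `fitness` stand in for the crossover/mutation/local-search and evaluation steps sketched earlier, the two lists model the send/receive queues of Fig 10, and the migration rate is an assumed value.

```python
# Per-island generation with ring migration (illustrative sketch; `evolve`
# and `fitness` are the hypothetical helpers sketched earlier, and the two
# lists model the send/receive queues of Fig 10).

def island_step(pop, evolve, fitness, inbox, outbox, migration_rate=0.05):
    pop = evolve(pop)                          # crossover, mutation, local search
    k = max(1, int(migration_rate * len(pop)))
    ranked = sorted(pop, key=fitness, reverse=True)  # best first
    outbox.extend(ranked[:k])                  # emigrate copies of the best
    immigrants, inbox[:] = inbox[:k], inbox[k:]      # receive pending migrants
    if immigrants:                             # immigrants replace the worst
        pop = ranked[:len(pop) - len(immigrants)] + immigrants
    return pop

# Ring topology: island i's outbox serves as island (i + 1) % n's inbox.
```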

Advantages of IGA
- Due to the multi-population characteristic of IGA, the possibility of getting stuck in local optima is lower in IGA than in a single-population GA.
- To lower the computation time, each island may reside on its own machine.
- Periodic migration of chromosomes between the islands lowers the chance of premature convergence.
- Adaptive operators improve the performance.
- Using a multi-method algorithm (i.e. a hybrid) takes advantage of the strengths of different search techniques.
- Each island can use different operator values (population size, mutation rate, etc.). This increases the diversity of the chromosomes and decreases the similarity of the islands.
- PGAs are more flexible when dealing with dynamic environments, and IGA has better performance (i.e. in terms of quality of results) than a regular PGA.

Dynamic Benchmark Generator
To simulate a dynamic environment, a dynamic benchmark generator is needed. In a dynamic TSP, two types of change can happen:
- a change in the distance between two cities;
- a change in the number of cities (adding or removing a city).
In this work the first type is implemented by the dynamic benchmark generator; the second type is left as future work. A sketch of such a generator follows.
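A minimal sketch of the generator, under the assumption (consistent with the slides) that it periodically rescales randomly chosen entries of the distance matrix; the perturbation magnitude and the number of changed distances are hypothetical, while the period of 100 generations matches the changes shown in Fig 11.

```python
import random

# Dynamic TSP benchmark generator (sketch): every `period` generations,
# rescale a few randomly chosen inter-city distances. The magnitudes and
# the number of changes are assumptions, not values from the original work.

def perturb_distances(dist, n_changes=3, max_factor=2.0):
    n = len(dist)
    for _ in range(n_changes):
        i, j = random.sample(range(n), 2)
        factor = random.uniform(1.0 / max_factor, max_factor)
        dist[i][j] = dist[j][i] = dist[i][j] * factor  # keep the matrix symmetric

def maybe_perturb(dist, generation, period=100):
    if generation > 0 and generation % period == 0:
        perturb_distances(dist)
```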

Fig 11: Convergence in static (a) and dynamic (b) environments (dynamic changes occur at generations 100 and 200).

Fig 12: Sharp changes in a dynamic environment

Implementation and Results So Far
- Using the TSP as the benchmark.
- Evaluating and tuning the GA operators on static benchmarks, including:
  . local operators: mutation and crossover rates;
  . hybrid operators: method and rates;
  . global operators: migration rate and period, and the number of islands.
- Creating a "dynamic benchmark generator" that can periodically change the distances between the cities.
- Observing the system's reaction (best fitness) to the dynamic changes.

Implementation and Results So Far (Cont.)
- Generalizing the optimum operator values from the static to the dynamic environment.
- Evaluating the performance of the algorithm by a factor (i.e. average improvement in cost) that has consistent values, in addition to the "best cost", which is random.
- A visualized output for evaluating the algorithm.
- Evaluation of the adaptive parameters.

Evaluation of IGA
Two comparisons have been made to evaluate IGA:
- a comparison of pure and hybrid IGA (quality and computation time), to determine the preferred algorithm;
- a comparison of IGA with traditional search methods, in terms of quality of results and computation time, to evaluate IGA's performance.

Table 1: Comparison of pure IGA and hybrid IGA (number of runs = 5) on the att48 and berlin52 benchmarks, reporting for each method (pure, hybrid) the known optimum, the best cost found, the accuracy, and the run time in seconds.

Fig 13: Comparison between pure and hybrid IGA

Comparison Between IGA and Other Methods
Current heuristic methods:
- Local Search (LS): a greedy algorithm that takes the first best improving change to the solution.
- Simulated Annealing (SA): an algorithm that combines stochastic simulation with an annealing (i.e. cooling) schedule of declining temperature.
- Tabu Search (TS): an algorithm similar to LS, plus a memory used to avoid repeating moves.
A sketch of the SA acceptance rule, the key difference from LS, follows.
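For concreteness, here is the acceptance rule that distinguishes SA from greedy local search, in a minimal hedged sketch; the geometric cooling schedule is an assumption, not taken from the original slides.

```python
import math
import random

# SA acceptance rule (sketch): worse moves are accepted with a probability
# that shrinks as the temperature declines.

def accept(delta_cost, temperature):
    if delta_cost <= 0:          # improving moves are always accepted
        return True
    return random.random() < math.exp(-delta_cost / temperature)

def cool(temperature, alpha=0.95):
    """Geometric cooling (the schedule is an assumed choice)."""
    return alpha * temperature
```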

Fig 14: A comparison among the different search techniques

Fig 15: Processing-time comparison of the search methods

Fig 16: Evaluation of the number of islands in terms of CPU time (IGA)

Fig 17: Evaluation of the population size in IGA

Fig 18: Evaluation of the population size in IGA (2)

Results Analysis and Conclusion
- Multi-population GAs, including IGA, have better performance than single-population GAs.
- Using a hill-climbing method (i.e. local search) with GA (hybrid GA) improves the results considerably.
- Migration of chromosomes lowers the chance of premature convergence.
- IGA can handle dynamic optimization problems better than a plain (single-population) GA.
- Optimum values for the migration parameters (i.e. rate and period) and for the number of islands can be obtained for each benchmark.

Results Analysis and Conclusion (Cont.)
- Variable crossover (one/two-point) is better than fixed crossover.
- The independence of the islands, combined with cooperation among them, handles changes in a dynamic benchmark better.
- IGA performs better than traditional search methods (e.g. Local Search, Tabu Search, Simulated Annealing) in terms of efficiency (i.e. quality of the results at a reasonable CPU time).
- Migration in IGA helps to handle large benchmarks better.

Future Work
The results are still far from ideal, and more research is needed to overcome the current limitations in optimization. Some of the efforts that could be made in this work are:
- distributed IGA for faster results on huge benchmarks;
- solving the TSP with a variable number of cities, to be more realistic;
- using AI (e.g. reinforcement learning and self-training) to improve the results;
- optimizing the IGA itself with different techniques (e.g. different migration topologies);
- research on algorithms other than GA for dynamic optimization;
- further work on adaptive algorithms;
- using multi-agent technology in IGA to overcome current limitations.