1
Generalized Nets, Ant Colony Optimization Algorithms and Genetic Algorithms
Vassia Atanassova, Stefka Fidanova, Ivan Popchev, Panagiotis Chountas
8th IMACS Seminar on Monte Carlo Methods, August 29–September 2, 2011, Borovets, Bulgaria

Introduction
Metaheuristics are increasingly popular in research and industry:
–they mimic natural metaphors to solve complex optimization problems
–they are efficient and effective, delivering satisfactory solutions to large and complex problems in a reasonable time
Some of the most successful metaheuristics:
–Genetic Algorithms
–Ant Colony Optimization
2
Generalized Nets
An extension of Petri nets and their modifications
An apparatus for the description of parallel processes
Static structure:
–Transitions
–Places
Dynamic structure:
–Tokens
–Predicate index matrices
Memory
Time
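The slide above lists the structural elements of a Generalized Net only in outline. As a rough illustration, the sketch below (Python, with names chosen by us rather than taken from the GN literature) shows one way those elements can be represented as data: places holding tokens, tokens carrying a history of characteristics, and transitions whose predicate index matrix decides where each token moves.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List, Tuple

@dataclass
class Token:
    """A GN token keeps the whole history of characteristics it obtains (memory)."""
    characteristics: List[Any] = field(default_factory=list)

    def add_characteristic(self, value: Any) -> None:
        self.characteristics.append(value)

@dataclass
class Place:
    """A place simply holds tokens."""
    name: str
    tokens: List[Token] = field(default_factory=list)

@dataclass
class Transition:
    """A transition moves tokens from input places to output places.

    The predicate index matrix maps (input place, output place) pairs to
    boolean predicates evaluated on the token being transferred."""
    inputs: List[Place]
    outputs: List[Place]
    predicates: Dict[Tuple[str, str], Callable[[Token], bool]]

    def fire(self) -> None:
        for src in self.inputs:
            while src.tokens:
                token = src.tokens.pop()
                for dst in self.outputs:
                    pred = self.predicates.get((src.name, dst.name))
                    if pred is not None and pred(token):
                        dst.tokens.append(token)
                        break
```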
3
Genetic Algorithms
A parallel global search technique that emulates natural genetic operators
GAs are stochastic search methods for exploring a complex problem space in order to find optimal solutions using minimal information
Population of individuals (tentative solutions)
Fitness function (an individual's suitability to the problem)
Operators: selection, crossover and mutation
Stop criterion (number of iterations, or finding a suitable individual)
Convergence towards a global solution
No problem-specific information is required, which makes GAs flexible and adaptable
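The slide lists the ingredients of a GA without showing how they fit together. Below is a minimal sketch of that loop, assuming a bit-string encoding, tournament selection, one-point crossover and bit-flip mutation; these concrete operator choices and all parameter values are our own illustrative assumptions, not taken from the slides.

```python
import random

def genetic_algorithm(fitness, n_bits, pop_size=50, generations=200,
                      crossover_rate=0.9, mutation_rate=0.01):
    """Minimal GA: a population of bit strings evolved by selection,
    crossover and mutation until the iteration budget is spent."""
    population = [[random.randint(0, 1) for _ in range(n_bits)]
                  for _ in range(pop_size)]
    best = max(population, key=fitness)

    for _ in range(generations):
        # Tournament selection: the fitter of two random individuals survives.
        def select():
            a, b = random.sample(population, 2)
            return a if fitness(a) >= fitness(b) else b

        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            if random.random() < crossover_rate:          # one-point crossover
                cut = random.randint(1, n_bits - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for child in (c1, c2):                        # bit-flip mutation
                children.append([1 - g if random.random() < mutation_rate else g
                                 for g in child])
        population = children[:pop_size]
        best = max(population + [best], key=fitness)
    return best

# Usage example: maximise the number of ones in a 20-bit string.
print(genetic_algorithm(sum, 20))
```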
4
GN models of GAs
1. GA search procedure in terms of GNs
–The GN model simultaneously evaluates several fitness functions, ranks the individuals by each of them, and chooses the fitness function best suited to the problem
2. Selection and tuning of GA operators
–The GN model can test different groups of the defined genetic operators and choose the most appropriate combination among them
–The developed GN executes a genetic algorithm and tunes the genetic operators, as well as the fitness function, with respect to the considered problem
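One of the two roles listed above for GN models is choosing the most appropriate combination of genetic operators for a problem. A very direct (non-GN) stand-in for that idea is sketched below: run the GA once per candidate operator combination and keep the combination whose best individual scores highest. The function names, and the assumption that a parameterised GA runner is available, are ours.

```python
import itertools

def tune_operators(run_ga, fitness, selections, crossovers, mutations):
    """Try every combination of the supplied selection, crossover and
    mutation operators, and return the combination whose GA run produced
    the fittest individual; a brute-force stand-in for the operator-tuning
    role the GN model plays."""
    best_combo, best_score = None, float("-inf")
    for combo in itertools.product(selections, crossovers, mutations):
        individual = run_ga(*combo)      # one GA run with this operator combination
        score = fitness(individual)
        if score > best_score:
            best_combo, best_score = combo, score
    return best_combo, best_score
```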
5
Ant Colony Optimization
ACO is a relatively new metaheuristic inspired by the social behaviour of ants in nature. It finds good solutions for optimization problems with restrictive constraints.
Low-level interaction between single agents results in complex behaviour of the whole ant colony:
–shortest path from the food source to the formicary
–communication via pheromone (distributed numerical information), which the ants use to probabilistically construct solutions
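The slide describes ACO only through its metaphor. The sketch below shows the probabilistic solution construction and the pheromone update on a tour problem given as a distance matrix; the parameter names (alpha, beta, rho) follow common ACO conventions, and the concrete values are illustrative assumptions rather than anything prescribed by the presentation.

```python
import random

def aco_shortest_tour(dist, n_ants=20, iterations=100,
                      alpha=1.0, beta=2.0, rho=0.5):
    """Basic ant system for a tour over all nodes of a distance matrix
    (assumes strictly positive off-diagonal distances).

    Each ant builds a tour probabilistically, biased by pheromone (alpha)
    and inverse distance (beta); pheromone then evaporates (rho) and is
    reinforced along the best tour found so far."""
    n = len(dist)
    pheromone = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")

    def tour_length(tour):
        return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

    for _ in range(iterations):
        for _ in range(n_ants):
            tour = [random.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                candidates = [j for j in range(n) if j not in tour]
                weights = [pheromone[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in candidates]
                tour.append(random.choices(candidates, weights=weights)[0])
            length = tour_length(tour)
            if length < best_len:
                best_tour, best_len = tour, length
        # Evaporation plus reinforcement along the best tour so far.
        pheromone = [[(1 - rho) * p for p in row] for row in pheromone]
        for i in range(n):
            a, b = best_tour[i], best_tour[(i + 1) % n]
            pheromone[a][b] += 1.0 / best_len
            pheromone[b][a] += 1.0 / best_len
    return best_tour, best_len
```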
6
GN models of ACO
ACO search procedure in terms of GNs:
–A GN describing the ACO algorithm was constructed.
–On this basis, opportunities arose for modifying and improving the ACO algorithm.
–GN models realizing the new, modified versions of ACO were built.
–Test samples showed that these modifications, resulting from the application of GNs, yield better results with respect to execution time.
7
GN for hybrid ACO/GA
Usually metaheuristics are combined with a local search procedure or an exact method. Our idea is to combine two metaheuristics.
–The GA starts with a population that is already closer to the optimal solution. After a number of iterations the GA sometimes stagnates and the population stops improving.
–Next, the GA solutions are provided as input to the ACO algorithm and the pheromone is updated accordingly.
–ACO is run with the updated pheromone, and a new population for the GA is thus generated.
Any ACO / GA version can be used, depending on the problem being solved.
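A rough sketch of the hybrid scheme described above, assuming we already have an ACO routine that accepts a pheromone matrix seeded from solutions and a GA routine that runs until stagnation; the three callables are placeholders for whichever ACO and GA versions are chosen for the problem, not part of the original presentation.

```python
def hybrid_aco_ga(run_aco, run_ga, seed_pheromone, rounds=10):
    """Alternate the two metaheuristics: ACO proposes a population, the GA
    improves it until it stagnates, and the GA's solutions then reshape the
    pheromone used by the next ACO run."""
    pheromone = None                    # first ACO run uses its default trails
    best = None
    for _ in range(rounds):
        population = run_aco(pheromone)            # ACO generates a GA population
        population, best = run_ga(population, best)    # GA refines it until stagnation
        pheromone = seed_pheromone(population)     # GA solutions bias the next ACO run
    return best
```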
8
Constructing the GN model
We describe ACO and GA with GNs (G_ACO and G_GA, respectively), and using them we build a GN describing the hybrid ACO/GA algorithm. The problem is coded in G_proc. Both G_ACO and G_GA have one input place and one output place.
9
Constructing the GN model
Let the token of GN G_proc enter place l_1 of the GN with the initial characteristic "current problem description (graph of the problem, problem constraints, etc.)",
where W_GA,2 = "a next iteration is necessary" and W_GA,3 = ¬W_GA,2, with ¬P denoting the negation of predicate P.
10
Constructing the GN model
The tokens from places l_2 or l_5 enter place i_ACO without a new characteristic. A token transfers through GN G_ACO and, on leaving it through place o_ACO, obtains the characteristic "current solutions of the ACO algorithm (population generations)".
11
Constructing the GN model
where W_ACO,4 = "the end-condition is satisfied" and W_ACO,5 = ¬W_ACO,4.
When the truth-value of W_ACO,4 is "true", the token enters place l_4 with the characteristic "representation of the current solutions (populations) in a form appropriate for the GA". Otherwise, it enters place l_5 without a new characteristic.
12
Constructing the GN model
The token from place l_4 enters place i_GA with the characteristic "current population (solutions) of the GA".
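Slides 9 to 12 describe the token's movement through the hybrid GN one predicate at a time. Read together, they amount roughly to the control flow sketched below, where w_aco_4 ("the end-condition is satisfied") and w_ga_2 ("a next iteration is necessary") stand for the transition predicates; the function names are ours, and the sub-nets G_ACO and G_GA are represented simply as calls, so this is an interpretation rather than a transcription of the GN.

```python
def hybrid_gn_token_flow(problem, g_aco, g_ga, w_aco_4, w_ga_2):
    """Follow the token of the hybrid GN: from l_1 (problem description),
    repeatedly through G_ACO until its end-condition holds (l_4), then
    through G_GA, and back to G_ACO via l_2 while a next iteration is needed."""
    characteristic = problem            # token in l_1: current problem description
    while True:
        # i_ACO -> G_ACO -> o_ACO: obtain "current solutions of the ACO algorithm"
        solutions = g_aco(characteristic)
        while not w_aco_4(solutions):   # end-condition not yet satisfied: back via l_5
            solutions = g_aco(solutions)
        # l_4: solutions represented in a form appropriate for the GA -> i_GA
        population = g_ga(solutions)    # "current population (solutions) of the GA"
        if not w_ga_2(population):      # W_GA,3 = ¬W_GA,2: no further iteration
            return population
        characteristic = population     # via l_2, back to i_ACO for a next iteration
```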
13
Thank you for your attention!
Vassia Atanassova, Stefka Fidanova, Ivan Popchev, Panagiotis Chountas
Acknowledgements: Grants DID-02-29 "Modeling Processes with Fixed Development Rules" and DTK-02-44 "Effective Monte Carlo Methods for Large-Scale Scientific Problems" of the National Science Fund of Bulgaria, and Grant JP 100372 of the Royal Society, UK
8th IMACS Seminar on Monte Carlo Methods, August 29–September 2, 2011, Borovets, Bulgaria