Metaheuristic Methods and Their Applications

Outline
- Optimization Problems
- Strategies for Solving NP-hard Optimization Problems
- What is a Metaheuristic Method?
- Trajectory Methods
- Population-Based Methods
- The Applications of Metaheuristic Methods

Optimization Problems
- Computer Science: Traveling Salesman Problem, Maximum Clique Problem
- Operational Research (OR): Flow Shop Scheduling Problem (processing orders), P-Median Problem (location selection)
Many optimization problems are NP-hard.


An example: the Traveling Salesman Problem (TSP). A solution is a sequence (tour) of the cities, and different sequences give different tour lengths (one of the example tours has length 63). There are (n-1)! tours in total.
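To make the (n-1)! growth concrete, here is a minimal brute-force sketch that enumerates every tour of a small, made-up five-city instance (the distance matrix D is illustrative, not taken from the slides) and reports the shortest one; beyond roughly a dozen cities this approach becomes hopeless.

```python
from itertools import permutations

# Hypothetical symmetric distance matrix for five cities (illustrative values only).
D = [
    [0, 12, 10, 19, 8],
    [12, 0, 3, 7, 2],
    [10, 3, 0, 6, 20],
    [19, 7, 6, 0, 4],
    [8, 2, 20, 4, 0],
]

def tour_length(tour):
    """Length of the closed tour, returning from the last city to the first."""
    return sum(D[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

# Fix city 0 as the start and enumerate the (n-1)! orderings of the remaining cities.
best = min(((0,) + p for p in permutations(range(1, len(D)))), key=tour_length)
print(best, tour_length(best))
```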

Strategies for Solving NP-hard Optimization Problems
- Branch-and-Bound (B&B): finds an exact solution.
- Approximation Algorithms: for example, there is an approximation algorithm for the metric TSP (Christofides' algorithm) that finds a tour of length ≤ 1.5 × (optimal tour length) in O(n³) time.
- Heuristic Methods: deterministic.
- Metaheuristic Methods: heuristic + randomization.

What is a Metaheuristic Method? Meta: in an upper level. Heuristic: to find. A metaheuristic is formally defined as an iterative generation process which guides a subordinate heuristic by intelligently combining different concepts for exploring and exploiting the search space; learning strategies are used to structure information in order to find near-optimal solutions efficiently.

Fundamental Properties of Metaheuristics Metaheuristics are strategies that “guide” the search process. The goal is to efficiently explore the search space in order to find (near-)optimal solutions. Techniques which constitute metaheuristic algorithms range from simple local search procedures to complex learning processes. Metaheuristic algorithms are approximate and usually non-deterministic.

Fundamental Properties of Metaheuristics (cont.) They may incorporate mechanisms to avoid getting trapped in confined areas of the search space. The basic concepts of metaheuristics permit an abstract level description. Metaheuristics are not problem-specific. Metaheuristics may make use of domain-specific knowledge in the form of heuristics that are controlled by the upper level strategy. Today’s more advanced metaheuristics use search experience (embodied in some form of memory) to guide the search.

Trajectory Methods
1. Basic Local Search: Iterative Improvement. The step Improve(N(S)) can use (1) first improvement, (2) best improvement, or (3) an intermediate option.
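A minimal sketch of iterative improvement, assuming a 2-swap neighborhood over permutations; the strategy argument switches between first-improvement and best-improvement, and the toy cost function in the usage lines is only an illustration.

```python
def neighbors(tour):
    """2-swap neighborhood: every tour obtained by exchanging two positions."""
    for i in range(len(tour) - 1):
        for j in range(i + 1, len(tour)):
            t = list(tour)
            t[i], t[j] = t[j], t[i]
            yield tuple(t)

def local_search(start, cost, strategy="best"):
    """Iterative improvement: repeatedly move to an improving neighbor.
    strategy='first' takes the first improving neighbor found;
    strategy='best' scans the whole neighborhood and takes the best one."""
    current = start
    improved = True
    while improved:
        improved = False
        chosen = current
        for cand in neighbors(current):
            if cost(cand) < cost(chosen):
                chosen = cand
                improved = True
                if strategy == "first":
                    break
        current = chosen
    return current

# Toy usage: sort a permutation by minimizing its number of inversions.
cost = lambda t: sum(1 for i in range(len(t)) for j in range(i + 1, len(t)) if t[i] > t[j])
print(local_search((4, 2, 0, 3, 1), cost, strategy="first"))
```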

Trajectory Methods (figure: a search trajectory moving through successive solutions x1, x2, x3, x4, x5)

2. Simulated Annealing – Kirkpatrick (Science, 1983). A temperature parameter T controls how readily worsening moves are accepted; the method can be viewed as a random walk combined with iterative improvement.
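A compact sketch of simulated annealing, assuming geometric cooling (T ← α·T) and a Gaussian perturbation as the neighbor move; the starting temperature, cooling rate, and the 1-D test function are illustrative choices, not parameters from the slides.

```python
import math
import random

def simulated_annealing(start, cost, neighbor, t0=10.0, alpha=0.95, iters=5000):
    """Simulated annealing: a worsening move of size delta is accepted with
    probability exp(-delta / T); the temperature is cooled geometrically."""
    current, best = start, start
    t = t0
    for _ in range(iters):
        cand = neighbor(current)
        delta = cost(cand) - cost(current)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= alpha          # geometric cooling schedule (assumed)
    return best

# Example: minimize a simple 1-D function by perturbing x with Gaussian noise.
print(simulated_annealing(0.0, lambda x: (x - 3) ** 2,
                          lambda x: x + random.gauss(0, 0.5)))
```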

3. Tabu Search – Glover 1986
- Tabu list
- Tabu tenure: the length of the tabu list
- Aspiration condition
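A sketch of tabu search on permutations, assuming a swap-move neighborhood: recently used swaps stay on the tabu list for `tenure` iterations, and the aspiration condition lets a tabu move through when it yields a new overall best solution. The iteration count and tenure are illustrative.

```python
from collections import deque
import itertools

def tabu_search(start, cost, iters=200, tenure=7):
    """Tabu search with a swap-move neighborhood, a fixed-tenure tabu list,
    and an aspiration condition that overrides the tabu status."""
    current, best = list(start), list(start)
    tabu = deque(maxlen=tenure)                 # tabu list of recent swap moves
    for _ in range(iters):
        best_move, best_cand, best_c = None, None, float("inf")
        for i, j in itertools.combinations(range(len(current)), 2):
            cand = current[:]
            cand[i], cand[j] = cand[j], cand[i]
            c = cost(cand)
            aspirated = c < cost(best)          # aspiration condition
            if ((i, j) not in tabu or aspirated) and c < best_c:
                best_move, best_cand, best_c = (i, j), cand, c
        if best_cand is None:
            break
        current = best_cand
        tabu.append(best_move)
        if best_c < cost(best):
            best = best_cand
    return best

# Toy usage: minimize the number of inversions of a permutation.
cost = lambda t: sum(1 for i in range(len(t)) for j in range(i + 1, len(t)) if t[i] > t[j])
print(tabu_search([4, 2, 0, 3, 1], cost))
```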


Population-Based Methods
1. Genetic Algorithm – Holland 1975
- Coding of a solution: chromosome
- Fitness function: related to the objective function
- Initial population

An example: TSP (Traveling Salesman Problem). A solution is encoded as a sequence of the cities; different sequences give different tour lengths (e.g., 63).

Genetic Algorithm operators: Reproduction (Selection), Crossover, Mutation.

Example 1: Maximize f(x) = x², where x is an integer and 0 ≤ x ≤ 31.

1. Coding of a solution: a five-bit binary integer.
2. Fitness function: F(x) = f(x) = x².
3. Initial population: randomly generated.

Reproduction: roulette-wheel selection (each chromosome is selected with probability proportional to its fitness).


Crossover: pairs of selected chromosomes exchange the substrings after a randomly chosen crossover point.

Mutation: each copied bit is flipped with a small mutation probability Pm, so the expected number of mutated bits per generation is (number of bits transferred) × Pm = 0.02 bits in this example.
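Putting the worked example together, here is a sketch of a GA that maximizes f(x) = x² over five-bit strings using roulette-wheel reproduction, one-point crossover, and bit-flip mutation; the population size, generation count, crossover rate, and the use of Pm = 0.02 per bit are illustrative assumptions rather than values fixed by the slides.

```python
import random

BITS, POP, GENS, PC, PM = 5, 4, 20, 0.9, 0.02   # assumed parameter choices

def fitness(chrom):
    """F(x) = x^2, where x is the integer encoded by the five-bit string."""
    return int(chrom, 2) ** 2

def roulette(pop):
    """Roulette-wheel (fitness-proportionate) selection of one parent."""
    total = sum(fitness(c) for c in pop)
    r = random.uniform(0, total)
    acc = 0.0
    for c in pop:
        acc += fitness(c)
        if acc >= r:
            return c
    return pop[-1]

def crossover(a, b):
    """One-point crossover applied with probability PC."""
    if random.random() < PC:
        p = random.randint(1, BITS - 1)
        return a[:p] + b[p:], b[:p] + a[p:]
    return a, b

def mutate(c):
    """Flip each bit independently with probability PM."""
    return "".join(bit if random.random() > PM else str(1 - int(bit)) for bit in c)

pop = ["".join(random.choice("01") for _ in range(BITS)) for _ in range(POP)]
for _ in range(GENS):
    nxt = []
    while len(nxt) < POP:
        c1, c2 = crossover(roulette(pop), roulette(pop))
        nxt += [mutate(c1), mutate(c2)]
    pop = nxt[:POP]
print(max(pop, key=fitness), max(int(c, 2) for c in pop))
```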

2. Ant Colony Optimization – Dorigo 1992. The central ingredient is pheromone (scent) laid down on edges and followed by the artificial ants.

Initialization
Loop
    Loop
        Each ant applies a state transition rule to incrementally build a solution and applies a local updating rule to the pheromone
    Until every ant has built a complete solution
    A global pheromone updating rule is applied
Until End_Condition

Example: TSP. A simple heuristic is the greedy method: from the current node, always choose the shortest outgoing edge. Resulting solution: the tour 1-5-4-3-2.
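A sketch of that greedy (nearest-neighbor) construction; the small distance matrix in the usage lines is made up for illustration and is not the slide's instance.

```python
def nearest_neighbor_tour(dist, start=0):
    """Greedy construction: from the current city, always take the shortest
    outgoing edge to a not-yet-visited city."""
    tour, unvisited = [start], set(range(len(dist))) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist[tour[-1]][c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Illustrative 5-city distance matrix (made up).
D = [[0, 2, 9, 10, 7], [2, 0, 6, 4, 3], [9, 6, 0, 8, 5], [10, 4, 8, 0, 1], [7, 3, 5, 1, 0]]
print(nearest_neighbor_tour(D, start=0))   # [0, 1, 4, 3, 2]
```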

Each ant builds a solution using a step-by-step constructive decision policy. How does ant k choose the next city s to visit from city r? In the standard random-proportional rule, the choice probability is
p_k(r, s) = [τ(r,s)] · [η(r,s)]^β / Σ_{u ∈ J_k(r)} [τ(r,u)] · [η(r,u)]^β,  for s ∈ J_k(r),
where η(r,s) = 1/δ(r,s), δ(r,s) is a distance measure associated with edge (r,s), τ(r,s) is the pheromone on that edge, and J_k(r) is the set of cities ant k has not yet visited.

Local update of pheromone (applied as each ant crosses edge (r,s)): τ(r,s) ← (1 − ρ)·τ(r,s) + ρ·τ0, where τ0 is the initial pheromone level. Global update of pheromone (applied after all ants have finished): τ(r,s) ← (1 − α)·τ(r,s) + α·Δτ(r,s), where Δτ(r,s) = 1/L_gb if edge (r,s) belongs to the globally best tour of length L_gb, and 0 otherwise.
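A compact Ant Colony System-style sketch for the TSP that follows the loop above: pheromone-biased tour construction via the random-proportional rule, the local update after each step, and the global update on the globally best tour. The parameters β, ρ, α, the ant count, the τ0 estimate, and the random test instance are illustrative choices.

```python
import random

def aco_tsp(dist, n_ants=10, iters=100, beta=2.0, rho=0.1, alpha=0.1):
    """ACS-style sketch: pheromone-biased construction, a local pheromone
    update after every step, and a global update that reinforces the edges
    of the globally best tour."""
    n = len(dist)
    avg = sum(map(sum, dist)) / (n * (n - 1))        # average edge length
    tau0 = 1.0 / (n * avg)                           # rough initial pheromone level
    tau = [[tau0] * n for _ in range(n)]
    length = lambda t: sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))
    best = list(range(n))
    for _ in range(iters):
        for _ in range(n_ants):
            tour = [random.randrange(n)]
            while len(tour) < n:
                r = tour[-1]
                cand = [s for s in range(n) if s not in tour]
                # state transition rule: favour strong pheromone and short edges
                w = [tau[r][s] * (1.0 / dist[r][s]) ** beta for s in cand]
                s = random.choices(cand, weights=w)[0]
                tau[r][s] = (1 - rho) * tau[r][s] + rho * tau0   # local update
                tour.append(s)
            if length(tour) < length(best):
                best = tour
        for i in range(n):                           # global update on best tour
            r, s = best[i], best[(i + 1) % n]
            tau[r][s] = (1 - alpha) * tau[r][s] + alpha / length(best)
    return best, length(best)

# Illustrative instance: random points in the unit square with Euclidean distances.
pts = [(random.random(), random.random()) for _ in range(8)]
D = [[((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 for b in pts] for a in pts]
print(aco_tsp(D))
```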

Particle Swarm Optimization (Kennedy and Eberhart 1995)

The population is initialized by assigning random positions and velocities; the potential solutions are then flown through hyperspace. Each particle keeps track of its best (highest-fitness) position so far, called pBest; the best position found by any particle in the population is called gBest. At each time step, each particle stochastically accelerates toward its pBest and gBest.

Particle Swarm Optimization Process
1. Initialize the population in hyperspace, including positions and velocities.
2. Evaluate the fitness of each individual particle.
3. Modify velocities based on the personal best and global best.
4. Terminate if some condition is met.
5. Go to step 2.

Initialization: positions and velocities.

Modify velocities based on the personal best and global best (figure: the particle at position x_t is pulled by its velocity v toward pBest and gBest, arriving at the new position x_{t+1}).

Particle Swarm Optimization: modification of velocities and positions
v_{t+1} = v_t + c1 · rand() · (pBest − x_t) + c2 · rand() · (gBest − x_t)
x_{t+1} = x_t + v_{t+1}
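A direct sketch of these update rules, minimizing the sphere function as an assumed test problem; the swarm size, iteration count, and c1 = c2 = 2 are illustrative. Note that most modern PSO variants also add an inertia weight or velocity clamping, which the basic formulation above omits.

```python
import random

def pso(cost, dim=2, n_particles=20, iters=200, c1=2.0, c2=2.0, bound=5.0):
    """PSO using the slide's update rules:
    v <- v + c1*rand()*(pBest - x) + c2*rand()*(gBest - x);  x <- x + v."""
    X = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_particles)]
    V = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [x[:] for x in X]                 # each particle's best position so far
    gbest = min(pbest, key=cost)[:]           # best position found by the swarm
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] += (c1 * random.random() * (pbest[i][d] - X[i][d])
                            + c2 * random.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            if cost(X[i]) < cost(pbest[i]):
                pbest[i] = X[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest

# Example: minimize the sphere function f(x) = sum(x_d^2).
print(pso(lambda x: sum(v * v for v in x)))
```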

The Applications of Metaheuristics
1. Solving NP-hard optimization problems: Traveling Salesman Problem, Maximum Clique Problem, Flow Shop Scheduling Problem (processing orders), P-Median Problem (location selection).
2. Search problems in many applications: feature selection in pattern recognition, automatic clustering, machine learning (e.g., neural network training).

For NP-hard optimization problems and complicated search problems, metaheuristic methods are good choices: they are more efficient than branch-and-bound and obtain better-quality solutions than simple heuristic methods. Open issues include the hybridization of metaheuristics and how to make the search more systematic, more controllable, and scalable in performance.