Metaheuristics

Metaheuristics
Meta-: Greek for "upper level" methods. Heuristics: from the Greek heuriskein, the art of discovering new strategies to solve problems.
Exact and approximate methods:
- Exact: mathematical programming (LP, IP, NLP, DP)
- Approximate: heuristics and metaheuristics
Metaheuristics are used for:
- Combinatorial optimization problems: a general class of IP problems with discrete decision variables and a finite solution space. The objective function and constraints may also be non-linear. Solvers use relaxation techniques to prune the search space.
- Constraint programming problems: used for timetabling and scheduling problems. Solvers use constraint propagation techniques that reduce the variable domains. Declaration of variables is much more compact.

Need for Metaheuristics
- What if the objective function can only be simulated, and there is no (or only an inaccurate) mathematical model connecting the variables? Mathematical and constraint programming require exact formulations, so they cannot be applied in that case.
- Large solution spaces. Ex: TSP with 5 cities has 120 tours (5!), 10 cities give 10! ≈ 3.6 million, and 75 cities give about 2.5×10^109 tours. Ex: parallel machine job scheduling: processing n jobs on m identical machines gives m^n possible assignments.
- Complexity: time and space. Time complexity is the number of steps required to solve a problem of size n. We look for the worst-case asymptotic bound on the step count (not an exact count); asymptotic behavior is the limiting behavior as n grows large.
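The solution-space counts quoted above can be checked directly. A minimal sketch (the function names are illustrative, not from the slides):

```python
import math

def tsp_solution_count(n_cities):
    # Counting every permutation of the cities as a distinct tour.
    return math.factorial(n_cities)

def parallel_machine_solution_count(n_jobs, m_machines):
    # Each of the n jobs can go on any of the m machines: m**n assignments.
    return m_machines ** n_jobs

print(tsp_solution_count(5))                  # 120
print(tsp_solution_count(10))                 # 3628800
print(parallel_machine_solution_count(10, 3)) # 59049
```

Even for modest sizes these counts rule out exhaustive enumeration, which is exactly the gap metaheuristics fill.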

Complexity of Algorithms
An algorithm has complexity f(n) = O(g(n)) if there exist positive constants n0 and c such that for all n > n0, f(n) <= c·g(n). In this case f(n) is upper bounded by g(n).
P class (polynomial time): if g(n) is a polynomial g(n) = ak·n^k + ... + a1·n + a0, the corresponding algorithm has O(n^k) complexity.
Ex: shortest-path algorithms such as Dijkstra's on n nodes. In the worst case each of the n nodes has n neighbors, and the algorithm needs no more than n^2 steps (an upper bound) to find the solution. The complexity is O(n^2), a quadratic polynomial. Other examples: minimum spanning tree, max-flow networks, etc. These problems are tractable.
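The definition can be made concrete with a worked example. The constants below (c = 4, n0 = 5) are one valid choice, not the only one:

```python
# Claim: f(n) = 3n^2 + 5n + 2 is O(n^2).
# Witness constants: c = 4, n0 = 5, since 3n^2 + 5n + 2 <= 4n^2
# whenever n^2 >= 5n + 2, which holds for all n >= 6.
def f(n):
    return 3 * n * n + 5 * n + 2

def g(n):
    return n * n

c, n0 = 4, 5
assert all(f(n) <= c * g(n) for n in range(n0 + 1, 1000))
print("f(n) <= c*g(n) verified for n =", n0 + 1, "..", 999)
```

Note the definition only constrains behavior beyond n0; small n may violate the bound (here n = 5 does: f(5) = 52 > 4·25).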

Complexity of Algorithms
Exponential time: O(c^n), where c is a positive constant > 1.
Ex: growth of an O(2^n) running time with input size n:
  n=10: 0.001 s | n=20: 1 s | n=30: 17.9 min | n=40: 12.7 days | n=50: 35.7 years
NP class (non-deterministic polynomial time): the solutions for these problems are intractable.
NP-hard problems (the more generic name) need exponential time to find optimal solutions. Most real-world optimization problems fit this category; no provably efficient algorithms are known. Metaheuristics can help solve this class of problems near-optimally. NP-hard problems need not even be decision problems (e.g. subset-sum problems).
NP-complete problems are in the NP class: a solution to an NP-complete problem can be verified in polynomial time, but finding a solution is very difficult. They are also NP-hard, and are the hardest among NP-hard problems; they are typically decision problems.
Ex: TSP; scheduling problems such as flow-shop, job-shop, and n-job/m-machine scheduling; quadratic assignment and location problems; grouping problems such as data clustering, graph partitioning, routing, and set covering; knapsack; and so on.
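The running times in the table above follow from a simple assumption of 10^6 steps per second; a quick sketch to reproduce them:

```python
# Time to execute 2^n steps at an assumed rate of 10^6 steps/second.
def seconds(n):
    return 2 ** n / 1e6

print(f"n=10: {seconds(10):.3f} s")                    # 0.001 s
print(f"n=20: {seconds(20):.1f} s")                    # 1.0 s
print(f"n=30: {seconds(30) / 60:.1f} min")             # 17.9 min
print(f"n=40: {seconds(40) / 86400:.1f} days")         # 12.7 days
print(f"n=50: {seconds(50) / (86400 * 365):.1f} years")# 35.7 years
```

Each +10 in n multiplies the time by 2^10 ≈ 1000, which is why exponential algorithms stop being usable so abruptly.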

Metaheuristics
The idea: search the solution space directly. No mathematical models, only a set of algorithmic steps; an iterative method.
Trajectory-based methods (single-solution methods): the search process is characterized by a trajectory in the search space. It can be viewed as the evolution of a single solution in (discrete) time of a dynamical system. Ex: tabu search, simulated annealing, iterated local search, variable neighborhood search, guided local search.
Population-based methods: every step of the search process maintains a population, a set of solutions. It can be viewed as the evolution of a set of solutions in (discrete) time of a dynamical system. Ex: genetic algorithms; swarm intelligence such as ant colony and bee colony optimization; scatter search.
Hybrid methods. Parallel metaheuristics: parallel and distributed computing, with independent and cooperative search.
You will learn these techniques through several examples.
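A trajectory-based method at its simplest is a local-search loop: one solution evolves step by step until no neighbor improves it. A minimal sketch (the function names and the toy problem are illustrative):

```python
def local_search(initial, neighbors, cost, max_iters=1000):
    # Trajectory-based search: repeatedly move to any improving neighbor.
    current = initial
    for _ in range(max_iters):
        improved = False
        for cand in neighbors(current):
            if cost(cand) < cost(current):
                current, improved = cand, True
                break
        if not improved:  # no neighbor is better: a local optimum
            break
    return current

# Toy problem: minimize f(x) = (x - 7)^2 over the integers,
# with neighborhood N(x) = {x - 1, x + 1}.
result = local_search(0, lambda x: [x - 1, x + 1], lambda x: (x - 7) ** 2)
print(result)  # 7
```

Population-based methods replace `current` with a set of solutions and update the whole set each iteration; the trajectory/population distinction in the slide is exactly this difference in state.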

History of Metaheuristics
1965: first evolution strategy
1975: first genetic algorithms
1983: simulated annealing
1986: tabu search
1991: ant colony optimization
1997: variable neighborhood search
2000+: parallel and distributed computing in metaheuristics

Metaheuristics
The idea: search the solution space directly. No mathematical models, only a set of algorithmic steps; an iterative method. Find a feasible solution and improve it; a greedy solution may be a good starting point.
Goal: find a near-optimal solution in a given bounded time. Applied to combinatorial and constraint optimization problems.
Diversification and intensification are the two search strategies in metaheuristics, and one must strike a balance between them: too much of either yields poor solutions. Remember that you have only a limited amount of time to search while also looking for a good-quality solution: a quality vs. time tradeoff. For design decisions with a high cost of investment, focus on high-quality solutions and take more time; for control/operational decisions, where quick and frequent decisions are taken (Ex: scheduling), look for good-enough solutions in very limited time.
Trajectory-based methods are heavy on intensification, while population-based methods are heavy on diversification.
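Simulated annealing is a compact illustration of the diversification/intensification balance described above: at high temperature it accepts worsening moves (diversify), and as the temperature cools it becomes nearly greedy (intensify). A sketch under assumed parameter values (t0, cooling rate, step count are illustrative):

```python
import math
import random

def simulated_annealing(x0, neighbor, cost, t0=10.0, cooling=0.95, steps=2000):
    # High temperature: worse moves often accepted (diversification).
    # Low temperature: only improving moves survive (intensification).
    random.seed(42)  # fixed seed for reproducibility
    x, t = x0, t0
    best = x
    for _ in range(steps):
        y = neighbor(x)
        delta = cost(y) - cost(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = y
        if cost(x) < cost(best):
            best = x
        t *= cooling  # cooling schedule shifts the balance over time
    return best

# Toy problem: minimize (x - 3)^2 over the integers, starting far away.
best = simulated_annealing(50, lambda x: x + random.choice([-1, 1]),
                           lambda x: (x - 3) ** 2)
print(best)
```

The cooling schedule is the knob the slide is talking about: cool too fast and the search intensifies prematurely around a poor region; cool too slowly and the time budget is spent diversifying.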

Classifications in Metaheuristics
- Nature-inspired (swarm intelligence, from biology) vs. non-nature-inspired (simulated annealing, from physics)
- Memory usage (tabu search) vs. memoryless methods (local search, simulated annealing)
- Deterministic (tabu search, local search) vs. stochastic (GA, SA) metaheuristics. Deterministic: the same initial solution leads to the same final solution after several search steps. Stochastic: the same initial solution leads to different final solutions, due to random rules in the algorithm.
- Population-based (manipulates a whole population of solutions: exploration/diversification) vs. single-solution-based search (manipulates a single solution: exploitation/intensification)
- Iterative vs. greedy. Iterative: start with one or more complete solutions and transform them at each iteration. Greedy: start with an empty solution and add decision variables until a complete solution is obtained.
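The iterative vs. greedy distinction is easiest to see in code. A greedy construction for TSP (nearest-neighbor, a standard constructive heuristic; the distance matrix is made-up) builds a tour from an empty solution one decision at a time:

```python
def nearest_neighbor_tour(dist, start=0):
    # Greedy construction: start with an empty tour and repeatedly
    # append the nearest still-unvisited city.
    n = len(dist)
    tour, unvisited = [start], set(range(n)) - {start}
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda c: dist[last][c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# 4-city symmetric distance matrix (illustrative numbers)
D = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
print(nearest_neighbor_tour(D))  # [0, 1, 3, 2]
```

An iterative method would instead take this complete tour as its starting solution and transform it (e.g. by swapping cities) at each step, which is why the slide suggests a greedy solution as a starting point.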

When to Use Metaheuristics
If one can use exact methods, do not use metaheuristics. Use them for:
- P-class problems with a large number of solutions, where polynomial-time algorithms are known but too expensive to run
- Dynamic, real-time optimization, where metaheuristics can reduce search time and a good-enough solution suffices
- Difficult NP-hard problems, even of moderate size
- Problems where the objective function is a black box, i.e. it is often simulated and there is no (or only an inaccurate) mathematical formulation for f(x)
In the black-box setting, the loop is: the metaheuristic proposes a solution x, the black-box objective function returns f(x), and a quality metric assesses f(x) to guide the next proposal.
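The black-box loop described above can be sketched with the simplest possible search strategy, random search, which touches the objective only through evaluation calls (the function and bounds here are stand-ins, not from the slides):

```python
import random

def black_box(x):
    # Stand-in for a simulation: the optimizer sees only input -> quality,
    # never this formula.
    return (x[0] - 1) ** 2 + (x[1] + 2) ** 2

def random_search(f, dim, low, high, evals=5000):
    random.seed(0)  # fixed seed for reproducibility
    best_x, best_f = None, float("inf")
    for _ in range(evals):
        x = [random.uniform(low, high) for _ in range(dim)]
        fx = f(x)  # the ONLY interaction with the black box
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

x, fx = random_search(black_box, dim=2, low=-5, high=5)
print(round(fx, 2))
```

Any metaheuristic from the earlier slides can be dropped in place of `random_search` unchanged, because none of them require a mathematical model of f.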

Evaluation of Results
There is no single best metaheuristic approach for every problem. No proof of optimality exists, and you do not need one. A heuristic approach designed for problem A does not always apply to problem B; it is context-dependent.
Questions: is the metaheuristic algorithm well designed, and does it behave well? No common agreement exists in the literature. What we look for during implementation:
- Easy to apply to hard problems
- Intensify: should be able to find a local optimum
- Diversify: should be able to explore the solution space widely
- A greedy solution is a good starting point for an iterative search

Evaluation of Results
Evaluation criteria include:
- Ability to find good/optimal solutions. There is no way to prove optimality short of exhaustively enumerating all solutions, which is infeasible.
- Comparison with optimal solutions (if available). Test the heuristic on small problems before scaling it up.
- Comparison with known lower/upper bounds (perhaps a target was already set)
- Comparison with other metaheuristic methods
- Reliability and stability over several runs: average gap between the solution and the (local) optimum, plus statistical tools (standard deviation of solutions, confidence intervals, ANOVA)
- Reasonable computational time: usually between a few seconds and a few hours
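The "several runs" criteria above reduce to simple summary statistics over the best objective values of independent runs. A sketch with made-up run data and an assumed known bound:

```python
import statistics

# Best objective values from 10 independent runs (illustrative numbers)
runs = [101.2, 100.8, 103.5, 100.9, 102.1, 101.7, 100.5, 102.8, 101.3, 101.0]
optimum = 100.0  # assumed known lower bound / target for this toy instance

mean = statistics.mean(runs)
std = statistics.stdev(runs)               # sample standard deviation
gap = 100 * (mean - optimum) / optimum     # average % gap to the bound

print(f"mean={mean:.2f}  std={std:.2f}  gap={gap:.2f}%")
```

A small standard deviation relative to the gap indicates a stable (reliable) method; ANOVA and confidence intervals extend the same idea when comparing several metaheuristics on the same instances.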

Definitions
The neighborhood of a solution s is denoted N(s).
Local minimum: s' is a local minimum if for all s in N(s'), f(s') <= f(s). It is a strict local minimum if f(s') < f(s) for all s in N(s'), s != s'.
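The definition translates directly into a check, which is also the stopping test of the local search seen earlier (names here are illustrative):

```python
def is_local_minimum(s, neighborhood, f, strict=False):
    # s is a local minimum if f(s) <= f(t) for every t in N(s);
    # a strict local minimum requires f(s) < f(t).
    if strict:
        return all(f(s) < f(t) for t in neighborhood(s))
    return all(f(s) <= f(t) for t in neighborhood(s))

# Integer line with N(x) = {x - 1, x + 1} and f(x) = (x - 4)^2
N = lambda x: [x - 1, x + 1]
f = lambda x: (x - 4) ** 2
print(is_local_minimum(4, N, f))  # True
print(is_local_minimum(3, N, f))  # False
```

Note the definition depends on the chosen neighborhood: enlarging N(s) can turn a local minimum into a non-minimum, which is the idea behind variable neighborhood search.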

Main Concepts in Implementing a Metaheuristic Algorithm
Representation of solutions:
- Vector of binary values (0/1): knapsack, 0/1 IP problems
- Vector of discrete values: location and assignment problems
- Vector of continuous values on the real line: continuous/parameter optimization
- Permutation: sequencing, scheduling, TSP
Objective function.
Constraint handling:
- Reject strategies: keep only feasible solutions and automatically reject infeasible ones.
- Penalizing strategies: infeasible solutions are considered, and a penalty function is added to the objective function when an infeasible solution is included during the search. For example, in a search sequence st, st+1, st+2, the solutions st and st+2 may be feasible while st+1 is infeasible, with f(st+2) better than f(st); passing through st+1 was worthwhile. The penalized objective function is f'(s) = f(s) + l·c(s), where c(s) is the cost of the infeasibility of s and l is a weight.
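The penalized objective f'(s) = f(s) + l·c(s) can be sketched on a tiny knapsack-style example (binary-vector representation; the item values, weights, and weight l are illustrative):

```python
def penalized_objective(f, c, l):
    # Builds f'(s) = f(s) + l * c(s); c(s) is 0 for feasible s,
    # so feasible solutions are unaffected.
    return lambda s: f(s) + l * c(s)

# Toy knapsack: maximize value (so minimize -value), capacity 10.
values, weights, capacity = [6, 5, 4], [5, 4, 3], 10

def f(s):  # s is a 0/1 vector over the items
    return -sum(v for v, bit in zip(values, s) if bit)

def c(s):  # violation cost: amount by which the weight limit is exceeded
    return max(0, sum(w for w, bit in zip(weights, s) if bit) - capacity)

fp = penalized_objective(f, c, l=100)
print(fp([1, 1, 0]))  # feasible (weight 9):   -11
print(fp([1, 1, 1]))  # infeasible (weight 12): -15 + 100*2 = 185
```

The weight l tunes how freely the search may pass through infeasible solutions: a small l lets it cut across infeasible regions (as in the st, st+1, st+2 sequence above), while a large l approximates the reject strategy.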

Metaheuristics
Next class: single-solution metaheuristics.