IE 607 Heuristic Optimization: Simulated Annealing


1 IE 607 Heuristic Optimization Simulated Annealing

2 Origins and Inspiration from Natural Systems: Metropolis et al. (Los Alamos National Lab), “Equation of State Calculations by Fast Computing Machines,” Journal of Chemical Physics, 1953.

3 In annealing, a material is heated to a high-energy state in which state changes are frequent (disordered). It is then cooled gradually (slowly) to a low-energy state in which state changes are rare (approximately thermodynamic equilibrium, the “frozen” state). For example, large crystals can be grown by very slow cooling, but if fast cooling (quenching) is used, the material will contain many imperfections (glass).

4 Metropolis’s paper describes a Monte Carlo simulation of interacting molecules. Each state change with energy change ΔE is accepted with probability:
 – If ΔE ≤ 0 (E decreases), P = 1
 – If ΔE > 0 (E increases), P = exp(−ΔE / (kT))
 where T = temperature (Kelvin) and k = Boltzmann’s constant, k > 0.
 To summarize the behavior: for large T, P → 1 (most increases accepted); for small T, P → 0 (most increases rejected).
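The acceptance rule above can be sketched in Python. This is a minimal illustration, folding Boltzmann’s constant k into the temperature, as is usual in optimization:

```python
import math
import random

def accept(delta_e, t, rng=random.random):
    """Metropolis criterion: always accept an energy decrease; accept an
    increase of delta_e with probability exp(-delta_e / t)."""
    if delta_e <= 0:
        return True
    return rng() < math.exp(-delta_e / t)

# At high temperature almost any uphill move is accepted;
# at low temperature uphill moves are almost always rejected.
p_hot = math.exp(-1.0 / 100.0)   # close to 1
p_cold = math.exp(-1.0 / 0.01)   # vanishingly small
```

Passing `rng` explicitly makes the rule easy to test deterministically.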

5 SA for Optimization: Kirkpatrick, Gelatt and Vecchi, “Optimization by Simulated Annealing,” Science, 1983. Applied the Metropolis algorithm to optimization problems – the spin glass analogy, computer placement and wiring, and the traveling salesman problem, all difficult, classic combinatorial problems.

6 Thermodynamic Simulation vs Combinatorial Optimization
 Simulation        → Optimization
 System states     → Feasible solutions
 Energy            → Cost (objective)
 Change of state   → Neighboring solution (move)
 Temperature       → Control parameter
 Frozen state      → Final solution

7 Spin Glasses Analogy: Spin glasses display a phenomenon called frustration, in which opposing forces cause magnetic moments to line up in different directions, resulting in an energy function with several low-energy states → in an optimization problem there are likewise often several local minima whose cost function values are close to the optimum.

8 Canonical Procedure of SA – Notation:
 T_0 = starting temperature
 T_F = final temperature (T_0 > T_F ≥ 0)
 T_t = temperature at stage t
 m = number of temperature stages
 n(t) = number of moves at stage t (total number of moves = n · m)
 σ = move operator

9 Canonical Procedure of SA – Notation (cont.):
 x_0 = initial solution
 x_i = solution i
 x_F = final solution
 f(x_i) = objective function value of x_i
 α = cooling parameter

10 Procedure (Minimization)
 Select x_0, T_0, m, n, α
 Set x_1 = x_0, T_1 = T_0, x_F = x_0
 for (t = 1,…,m) {
   for (i = 1,…,n) {
     x_TEMP = σ(x_i)
     if f(x_TEMP) ≤ f(x_i), x_{i+1} = x_TEMP
     else if U(0,1) < exp(−(f(x_TEMP) − f(x_i)) / T_t), x_{i+1} = x_TEMP
     else x_{i+1} = x_i
     if f(x_{i+1}) ≤ f(x_F), x_F = x_{i+1}
   }
   T_{t+1} = α T_t
 }
 return x_F
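A minimal, self-contained Python sketch of the canonical procedure above. The objective, move operator, and parameter values in the usage example are illustrative choices, not part of the slides:

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=10.0, alpha=0.9,
                        m=50, n=100, seed=0):
    """Canonical SA for minimization: m temperature stages, n moves per
    stage, geometric cooling T <- alpha * T. `neighbor` plays the role of
    the move operator sigma; the default parameters are illustrative."""
    rng = random.Random(seed)
    x, t = x0, t0
    best, f_best = x0, f(x0)
    for _ in range(m):
        for _ in range(n):
            x_tmp = neighbor(x, rng)
            delta = f(x_tmp) - f(x)
            # Metropolis acceptance: downhill always, uphill with prob exp(-delta/t)
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                x = x_tmp
            if f(x) < f_best:          # track the best solution seen (x_F)
                best, f_best = x, f(x)
        t *= alpha
    return best, f_best

# Usage: minimize f(x) = x^2 over the reals with a Gaussian move.
best, f_best = simulated_annealing(
    lambda x: x * x, x0=8.0,
    neighbor=lambda x, rng: x + rng.gauss(0.0, 1.0))
```

Tracking `best` separately from the current state `x` implements the slide’s `x_F`: the walk may later accept worse states, but the best visited solution is never lost.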

11 Combinatorial Example: TSP

12 Continuous Example: 2 Dimensional Multimodal Function
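As an illustration of the continuous case, the following sketch runs a plain SA loop on an assumed 2-D multimodal test function; the function, starting point, and all parameter values are invented for this example:

```python
import math
import random

def f(x, y):
    """Illustrative 2-D multimodal function: a quadratic bowl with cosine
    ripples, many local minima, global minimum f(0, 0) = 0."""
    return x * x + y * y - 4 * math.cos(3 * x) - 4 * math.cos(3 * y) + 8.0

rng = random.Random(42)
x, y = 3.0, -3.0                 # start far from the global minimum
t = 5.0                          # initial temperature (illustrative)
best = f(x, y)
while t > 1e-3:
    for _ in range(200):         # moves per temperature stage
        nx, ny = x + rng.gauss(0, 0.5), y + rng.gauss(0, 0.5)
        delta = f(nx, ny) - f(x, y)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, y = nx, ny
            best = min(best, f(x, y))
    t *= 0.9                     # geometric cooling
```

At high temperature the walk crosses the cosine barriers freely; as t falls it settles into one basin, ideally the global one.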

13 SA: Theory
 Assumption: any solution can be reached from any other solution.
 1. Single T – homogeneous Markov chain
  - constant T
  - global optimization does not depend on the initial solution
  - essentially relies on the law of large numbers

14 2. Multiple T
  - a sequence of homogeneous Markov chains, one at each T (Aarts and van Laarhoven)
   → the number of moves at each T must grow at least quadratically with the size of the search space
   → the running time of SA with such guarantees of optimality is exponential!
  - a single non-homogeneous Markov chain (Hajek)
   → with the cooling schedule T_k = c / log(1 + k), where k = number of moves, SA converges to the global optimum if and only if c ≥ the depth of the deepest non-global local minimum

15 2. Multiple T (cont.)
  - geometric cooling, T_{t+1} = α T_t, with α usually between 0.8 and 0.99, is successful in practice
  - Lundy & Mees: T_{k+1} = T_k / (1 + β T_k) for a small β > 0
  - Aarts & Korst: adaptive decrements based on the statistics of costs observed at each temperature
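The cooling schedules named above can be sketched as follows. The Aarts & Korst adaptive rule is omitted since it depends on observed cost statistics; the α and β values are illustrative, not prescribed by the slides:

```python
import math

def geometric(t, alpha=0.9):
    """T_{t+1} = alpha * T_t (the schedule used in the canonical procedure)."""
    return alpha * t

def lundy_mees(t, beta=0.01):
    """Lundy & Mees: T_{k+1} = T_k / (1 + beta * T_k), one move per T."""
    return t / (1.0 + beta * t)

def logarithmic(c, k):
    """Hajek: T_k = c / log(1 + k); guarantees convergence for large enough c."""
    return c / math.log(1.0 + k)

# Compare how fast each recurrence cools from T = 10 over 100 steps.
t_geo = t_lm = 10.0
for _ in range(100):
    t_geo = geometric(t_geo)
    t_lm = lundy_mees(t_lm)
# With these parameters, geometric cooling drops far faster than Lundy-Mees,
# and both drop far faster than the logarithmic schedule.
```

The logarithmic schedule is the one with a convergence guarantee, but it cools so slowly that practical implementations almost always use the faster heuristic schedules.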

16 Variations of SA – Initial starting point
  - construct a “good” initial solution (pre-processing)
   → the search can then be started at a lower temperature
   → saves a substantial amount of time (compared to a totally random start)
  - multi-start: run SA from several initial solutions

17 Variations of SA – Move
  - change the neighborhood during the search (usually contract it)
   → e.g. in the TSP, restricted 2-opt neighborhoods allow only points “close” to one another to be swapped
  - non-random move selection
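A sketch of the 2-opt move with an optional “closeness” restriction, assuming tours are represented as Python lists of city indices; the `max_span` parameter is a hypothetical way to contract the neighborhood as described above:

```python
import random

def two_opt_move(tour, rng, max_span=None):
    """2-opt neighborhood move for the TSP: pick two cut points and reverse
    the segment between them. If max_span is given, the second cut point is
    restricted to lie "close" to the first (a contracted neighborhood).
    Returns a new tour; the input is not modified."""
    n = len(tour)
    i = rng.randrange(n - 1)
    if max_span is None:
        j = rng.randrange(i + 1, n)
    else:
        j = rng.randrange(i + 1, min(i + 1 + max_span, n))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

rng = random.Random(1)
tour = list(range(8))
new_tour = two_opt_move(tour, rng)
restricted = two_opt_move(tour, random.Random(2), max_span=2)
```

Every 2-opt move keeps the tour a permutation of the cities; only the visiting order of one segment changes.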

18 Variations of SA – Annealing Schedule
  - constant T: the temperature must be high enough to climb out of local optima, yet cool enough to ensure that these local optima are actually visited
   → problem-specific & instance-specific
  - 1 move per T (lower the temperature slightly after every move)

19 Variations of SA – Annealing Schedule (cont.)
  - change the schedule during the search (including possible reheating), based on:
   → # of moves accepted
   → # of moves attempted
   → entropy, specific heat, or the variance of solutions at T

20 Variations of SA – Acceptance Probability
  - Johnson et al.: approximate exp(−ΔE/T), e.g. by 1 − ΔE/T, to avoid costly exponential evaluations
  - Brandimarte et al.: one must also decide whether to accept moves with an energy change of 0

21 Variations of SA – Stopping Criteria
  - total moves attempted
  - no improvement over n attempts
  - no accepted moves over m attempts
  - minimum temperature reached

22 Variations of SA – Finish with Local Search
  - post-processing
   → apply a descent algorithm to the final annealing solution to ensure that at least a local minimum has been found
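The post-processing descent can be sketched as follows; the toy objective and integer neighborhood are invented for illustration:

```python
def descend(f, x, neighbors):
    """Move to an improving neighbor until none exists, so the returned
    solution is guaranteed to be a local minimum of f over the neighborhood."""
    improved = True
    while improved:
        improved = False
        for y in neighbors(x):
            if f(y) < f(x):
                x, improved = y, True
    return x

f = lambda x: (x - 7) ** 2            # toy objective over the integers
neighbors = lambda x: [x - 1, x + 1]  # toy neighborhood
x_local = descend(f, 3, neighbors)    # walks 3 -> 4 -> ... -> 7
```

Applied to the final annealing solution, this costs little extra work and removes the chance of reporting a solution that a trivial move could still improve.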

23 Variations of SA – Parallel Implementations
  - multiple processors proceed with different random number streams at a given T
   → the best result across all processors is chosen for the new temperature
  - alternatively, if one processor finds a neighbor to accept
   → convey it to all other processors and continue the search from there

24 Applications of SA
  - Graph partitioning & graph coloring
  - Traveling salesman problem
  - Quadratic assignment problem
  - VLSI and computer design
  - Scheduling
  - Image processing
  - Layout design
  - many more…
