Introduction to Simulated Annealing

Introduction to Simulated Annealing: Study Guide for ES205, Xiaocang Lin & Yu-Chi Ho, August 22, 2000.

Introduction to Simulated Annealing

Difficulty in Searching Global Optima

[Figure: a cost landscape annotated with the starting point, the descent direction, local minima, the global minimum, and the barrier that blocks local search.]

Local search techniques, such as the steepest descent method, are very good at finding local optima. Difficulties arise, however, when the global optimum differs from the local optima. Since every immediate neighbor of a local optimum has a worse performance value, local search cannot proceed once it is trapped at a local optimum. We need some mechanism that helps the search escape this trap, and simulated annealing is one such method.

Intuition of Simulated Annealing

Origin: the annealing process of heated solids.

Intuition: by allowing occasional ascents in the search process, we might be able to escape the trap of local minima.

The name simulated annealing comes from simulating the annealing of heated solids. "In condensed matter physics, annealing denotes a physical process in which a solid in a heat bath is heated up by increasing the temperature of the heat bath to a maximum value at which all particles of the solid randomly arrange themselves in the liquid phase, followed by cooling through slowly lowering the temperature of the heat bath. In this way, all particles arrange themselves in the low energy ground state of a corresponding lattice." (quoted from Simulated Annealing: Theory and Applications)

In solving combinatorial optimization problems, we make an analogy to this process. The basic idea is that by occasionally allowing the search to proceed in an unfavorable direction, we might escape the trap of local optima and reach the global optimum.

Consequences of the Occasional Ascents

Desired effect: helps escape the local optima.
Adverse effect: might pass the global optimum after reaching it.

Allowing occasional ascent steps is a double-edged sword. On one hand, it lets the algorithm proceed beyond local optima, as desired. On the other hand, the search process may pass through the global optimum and miss it. To preserve the desired effect while limiting the adverse one, we need a careful scheme for controlling the acceptance of occasional ascents; this scheme is the heart of simulated annealing.

Control of Annealing Process

Acceptance of a search step (Metropolis criterion): let Δ be the performance change along the search direction.

Always accept a descending step, i.e. a step with Δ ≤ 0.
Accept an ascending step (Δ > 0) only if it passes a random test: accept when exp(-Δ/T) > random[0, 1).

If the step is in a favorable direction, Δ ≤ 0, we always accept it. Otherwise, the step is accepted with probability exp(-Δ/T), where T is a control parameter.
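The acceptance rule can be written as a small helper; this is a minimal sketch, and the function name and the use of a uniform draw on [0, 1) are illustrative choices, not from the slides:

```python
import math
import random

def metropolis_accept(delta, T):
    """Metropolis criterion: always accept a descending step (delta <= 0);
    accept an ascending step with probability exp(-delta / T)."""
    if delta <= 0:
        return True
    # random.random() is uniform on [0, 1), matching random[0, 1) above
    return math.exp(-delta / T) > random.random()
```

Note that for a fixed ascent Δ > 0, lowering T makes acceptance exponentially less likely, which is exactly the knob the cooling schedule turns.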

Control of Annealing Process

Cooling schedule: T, the annealing temperature, is the parameter that controls how frequently ascending steps are accepted. We gradually reduce the temperature T(k).

The parameter T plays a pivotal role in the acceptance criterion: the smaller T is, the less likely an unfavorable step is to be accepted. To obtain a desirable result, T is gradually reduced as the algorithm proceeds. At the beginning, T is large, so the search can easily escape the traps of local optima; later on, reducing T allows the algorithm to converge. It is common practice to let the algorithm proceed for a certain number of steps, L(k), at each temperature T(k). The choice of the sequence of parameters {T(k), L(k)} is called the cooling schedule.

Simulated Annealing Algorithm

0) k = 0.
1) Take a search step (i → j) with performance difference Δ.
2) If Δ ≤ 0, accept; else if exp(-Δ/T(k)) > random[0, 1), accept.
3) Repeat 1) and 2) for L(k) steps.
4) k = k + 1.
5) Repeat 1) through 4) until the stopping criterion is met.

This is an abstract description of the simulated annealing algorithm. With a proper selection of parameters, it is proven to converge to a global optimum with probability 1.
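The numbered steps above can be sketched in Python. The toy objective, the neighborhood move (a uniform perturbation), and all parameter values below are illustrative assumptions, not part of the slides:

```python
import math
import random

def simulated_annealing(f, x0, T0=10.0, alpha=0.95, L=50, T_min=1e-3, step=0.5):
    """Minimize f by simulated annealing, starting from x0.

    T0: initial temperature, alpha: cooling rate,
    L: number of steps at each temperature, T_min: stopping temperature.
    """
    x, T = x0, T0
    best = x
    while T > T_min:                                # 5) stopping criterion
        for _ in range(L):                          # 3) L(k) steps per temperature
            y = x + random.uniform(-step, step)     # 1) search step i -> j
            delta = f(y) - f(x)                     #    performance difference
            if delta <= 0 or math.exp(-delta / T) > random.random():
                x = y                               # 2) Metropolis acceptance
                if f(x) < f(best):
                    best = x                        # remember the best point seen
        T *= alpha                                  # 4) k = k + 1: cool down
    return best

# Toy multimodal objective; the deepest basin is near x = -0.31
f = lambda x: x * x + 3.0 * math.sin(5.0 * x)
random.seed(1)  # stochastic algorithm: fix the seed for a reproducible run
best = simulated_annealing(f, x0=2.0)
```

Tracking the best point seen is a common practical addition, since the final state of the chain need not be the best one visited during the run.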

Implementation of Simulated Annealing

Select a local search scheme and determine the cooling schedule. For example:

Set L = n, the number of variables in the problem.
Set T(0) such that exp(-Δ/T(0)) ≈ 1 for a typical ascent Δ.
Set T(k+1) = αT(k), where α is a constant smaller than but close to 1.

The implementation of the simulated annealing algorithm is problem dependent. First, a proper local search scheme must be chosen. For continuous problems, steepest descent is often used; for discrete problems, a neighborhood structure is defined, and the search usually proceeds by randomly selecting one of the neighbors of the current design. Similarly, there is no fixed formula for determining the cooling schedule. As a guideline: the number of search steps at each temperature should be sufficient, usually correlated with the size of the problem; the initial temperature should be high enough to allow frequent acceptance; and the temperature should not decrease too fast. The schedule above is one particular example.
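The example schedule above can be sketched as follows. Reading exp(-Δ/T(0)) ≈ 1 as "a typical ascent Δ should be accepted with probability close to 1" is an assumed interpretation used to solve for T(0); the helper names and the 0.99 target are illustrative:

```python
import math

def geometric_schedule(T0, alpha, T_min):
    """Yield temperatures T(k+1) = alpha * T(k) until T_min is reached."""
    T = T0
    while T > T_min:
        yield T
        T *= alpha

def initial_temperature(typical_delta, accept_prob=0.99):
    """Choose T(0) so that exp(-typical_delta / T(0)) == accept_prob,
    i.e. an average ascent of typical_delta is accepted with high probability."""
    return -typical_delta / math.log(accept_prob)

T0 = initial_temperature(typical_delta=1.0)        # roughly 99.5 for these inputs
temps = list(geometric_schedule(T0, alpha=0.95, T_min=1.0))
```

With α close to 1 the temperature decreases slowly, matching the guideline that cooling should not be too fast.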

Implementation of Simulated Annealing

Understand the result:

This is a stochastic algorithm; the outcome may differ from trial to trial.
Convergence to the global optimum is realized only in an asymptotic sense.

Finally, we would like to emphasize how to interpret the algorithm's output. Simulated annealing is a stochastic algorithm: because random variables are used, the outcomes of different trials may vary even for exactly the same cooling schedule. Moreover, simulated annealing converges to the global optimum only as the number of iterations tends to infinity.


Multi-Objective Simulated Annealing

[Flowchart: update the nondominated set and update the archive.]


Sanghamitra Bandyopadhyay, Sriparna Saha, Ujjwal Maulik, and Kalyanmoy Deb, "A Simulated Annealing-Based Multiobjective Optimization Algorithm: AMOSA," IEEE Transactions on Evolutionary Computation, vol. 12, no. 3, June 2008, pp. 269-283.

After the clusters are obtained, the member of each cluster whose average distance to the other members is minimal is taken as the representative member of the cluster.
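This representative (the medoid under the chosen distance) can be computed as follows; the function names, the distance function, and the 1-D example cluster are illustrative assumptions:

```python
def representative_member(cluster, dist):
    """Return the member whose average distance to the other members is minimal."""
    def avg_dist(m):
        others = [p for p in cluster if p is not m]  # exclude the member itself
        if not others:
            return 0.0                               # singleton cluster
        return sum(dist(m, p) for p in others) / len(others)
    return min(cluster, key=avg_dist)

# 1-D example with absolute-difference distance
cluster = [0.0, 1.0, 1.1, 1.2, 9.0]
rep = representative_member(cluster, dist=lambda a, b: abs(a - b))
# rep is 1.1, the member closest on average to all the others
```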


HL: the maximum size of the archive on termination; equal to the maximum number of nondominated solutions required by the user.
SL: the maximum size to which the archive may be filled before clustering is used to reduce its size to HL (SL > HL).
Tmax: maximum (initial) temperature.
Tmin: minimal (final) temperature.
iter: number of iterations at each temperature.
α: the cooling rate in SA.

Case 1:

Case 2:

Case 3: