Introduction to Simulated Annealing


1 Introduction to Simulated Annealing

2 Difficulty in Searching Global Optima
[Figure: performance landscape showing the starting point, the descent direction, a local minimum, the barrier to local search, and the global minimum.]
Local search techniques, such as the steepest descent method, are very good at finding local optima. However, difficulties arise when the global optimum differs from the local ones. Since all the immediate neighbors of a local optimum are worse than it in performance value, local search cannot proceed once trapped at a local optimum. We need some mechanism that can help us escape the trap of local optima, and simulated annealing is one such method.

3 Intuition of Simulated Annealing
Origin: the annealing process of heated solids.
Intuition: by allowing occasional ascents in the search process, we might be able to escape the trap of local minima.
The name "simulated annealing" originates from the simulation of the annealing process of heated solids. "In condensed matter physics, annealing denotes a physical process in which a solid in a heat bath is heated up by increasing the temperature of the heat bath to a maximum value at which all particles of the solid randomly arrange themselves in the liquid phase, followed by cooling through slowly lowering the temperature of the heat bath. In this way, all particles arrange themselves in the low energy ground state of a corresponding lattice." (quoted from Simulated Annealing: Theory and Applications) In solving combinatorial optimization problems, we make an analogy to this process: by occasionally allowing the search to proceed in an unfavorable direction, we might escape the trap of local optima and reach the global optimum.

4 Consequences of the Occasional Ascents
Desired effect: helps the search escape local optima.
Adverse effect: the search might pass the global optimum after reaching it.
Like a double-edged sword, allowing occasional ascent steps has two consequences. On one hand, it fulfills our desire to let the algorithm proceed beyond local optima. On the other hand, we might miss the global optimum by letting the search process pass through it. To keep the desired effect while reducing the adverse one, we need a sophisticated scheme to control the acceptance of occasional ascents; this scheme is the heart of simulated annealing.

5 Control of Annealing Process
Acceptance of a search step (Metropolis criterion): assume the performance change in the search direction is Δ.
Always accept a descending step, i.e., Δ ≤ 0.
Accept an ascending step (Δ > 0) only if it passes a random test: exp(-Δ/T) > random[0,1).
Suppose that a step in the search direction produces a difference of Δ in the performance value. The acceptance criterion, often referred to as the Metropolis criterion, is as follows: if the direction is favorable, i.e., Δ ≤ 0, we always accept the step; otherwise, the step is accepted with probability exp(-Δ/T), where T is a parameter.
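A minimal sketch of this acceptance test in Python (the function name and the rng parameter are illustrative assumptions, not from the slides):

```python
import math
import random

def metropolis_accept(delta, T, rng=random):
    """Metropolis criterion: always accept a descending step (delta <= 0);
    accept an ascending step (delta > 0) with probability exp(-delta / T)."""
    if delta <= 0:
        return True
    return math.exp(-delta / T) > rng.random()
```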

6 Control of Annealing Process
Cooling schedule: T, the annealing temperature, is the parameter that controls the frequency of acceptance of ascending steps. We gradually reduce the temperature T(k). At each temperature, the search is allowed to proceed for a certain number of steps, L(k); the choice of the parameter sequence {T(k), L(k)} is called the cooling schedule.
Clearly, the parameter T plays a pivotal role in the acceptance criterion: the smaller T is, the less likely an unfavorable step will be accepted. To obtain a desirable result, T is gradually reduced as the algorithm proceeds. At the beginning, T is large, so the search can easily escape the trap of local optima; later on, reducing T allows the algorithm to converge.

7 Simulated Annealing Algorithm
0) k = 0;
1) Search (i → j), with performance difference Δ;
2) If Δ ≤ 0 then accept, else if exp(-Δ/T(k)) > random[0,1) then accept;
3) Repeat 1) and 2) for L(k) steps;
4) k = k + 1;
5) Repeat 1)–4) until the stopping criterion is met.
This is an abstract description of a simulated annealing algorithm. With a proper selection of parameters, it is proven to converge to a global optimum with probability 1.
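A minimal sketch of this loop in Python, assuming user-supplied neighbor, cost, schedule, and stop callbacks (all hypothetical interfaces, not taken from the slides):

```python
import math
import random

def simulated_annealing(initial, neighbor, cost, schedule, stop):
    """Abstract SA loop following steps 0)-5) above.

    neighbor(i) proposes a candidate design j; cost(x) is the performance
    value to be minimized; schedule(k) returns (T(k), L(k)); stop(k, x)
    is the stopping criterion.
    """
    x, fx, k = initial, cost(initial), 0       # 0) k = 0
    while not stop(k, x):                      # 5) repeat until stopping criterion
        T, L = schedule(k)
        for _ in range(L):                     # 3) repeat 1) and 2) for L(k) steps
            candidate = neighbor(x)            # 1) search step i -> j
            fc = cost(candidate)
            delta = fc - fx                    # performance difference
            if delta <= 0 or math.exp(-delta / T) > random.random():  # 2) Metropolis test
                x, fx = candidate, fc
        k += 1                                 # 4) k = k + 1
    return x
```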

8 Implementation of Simulated Annealing
Select a local search scheme. Determine the cooling schedule. For example:
Set L = n, the number of variables in the problem.
Set T(0) such that exp(-Δ/T(0)) ≈ 1.
Set T(k+1) = αT(k), where α is a constant smaller than, but close to, 1.
The implementation of a simulated annealing algorithm is problem dependent. First, a proper local search scheme must be chosen. For continuous problems, steepest descent is often used; for discrete problems, a neighborhood structure is defined, and the search usually proceeds by randomly selecting one of the neighbors of the current design. Similarly, there is no fixed formula for determining the cooling schedule. As a guideline, the number of search steps at each temperature should be sufficient (usually correlated with the size of the problem), the initial temperature should be high enough to allow frequent acceptance, and the decrease of the temperature should not be too fast. The choice above is one particular example of a cooling schedule; a sketch of it in code follows.
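One way to set up this example schedule in Python (the concrete numbers, including the typical ascent size, are illustrative assumptions):

```python
n = 10                      # number of variables in the problem
L = n                       # steps per temperature: L = n
delta_typical = 1.0         # assumed typical size of an ascent step
T0 = 100.0 * delta_typical  # chosen so that exp(-delta / T0) ≈ 1 for typical deltas
alpha = 0.95                # constant smaller than, but close to, 1

def schedule(k):
    """Geometric cooling: T(k) = alpha**k * T(0), with L(k) = n for all k."""
    return T0 * alpha**k, L
```

This schedule function can be passed directly to the simulated_annealing sketch above.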

9 Implementation of Simulated Annealing
Understand the result: this is a stochastic algorithm, so the outcome may differ from trial to trial, and convergence to the global optimum is realized only in the asymptotic sense.
Finally, we would like to emphasize the interpretation of the algorithm's output. Simulated annealing is a stochastic algorithm: because random variables are used, the outcome of different trials may vary even for exactly the same cooling schedule. Moreover, simulated annealing converges to the global optimum only as the number of iterations goes to infinity.

12 Update archive

13 Multi-Objective Simulated Annealing

14 Update Nondominated Set

15 Multi-Objective

16 Sanghamitra Bandyopadhyay, Sriparna Saha, Ujjwal Maulik, and Kalyanmoy Deb, "A Simulated Annealing-Based Multiobjective Optimization Algorithm: AMOSA," IEEE Transactions on Evolutionary Computation, vol. 12, no. 3, June 2008.

17 After the clusters are obtained, the member of each cluster whose average distance to the other members is minimal is taken as the representative member of the cluster.
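A small sketch of this selection in Python, assuming a distance metric over solutions (both representative and distance are hypothetical names, not from the paper):

```python
def representative(cluster, distance):
    """Return the member whose average distance to the other members is minimal."""
    if len(cluster) == 1:
        return cluster[0]
    def avg_dist(member):
        others = [x for x in cluster if x is not member]
        return sum(distance(member, x) for x in others) / len(others)
    return min(cluster, key=avg_dist)
```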

18 Update Nondominated Set

19 HL: The maximum size of the Archive on termination. This is equal to the maximum number of nondominated solutions required by the user.
SL: The maximum size to which the Archive may be filled before clustering is used to reduce its size to HL (SL > HL).
Tmax: The maximum (initial) temperature.
Tmin: The minimum (final) temperature.
iter: The number of iterations at each temperature.
α: The cooling rate in SA.
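Gathered into a configuration object, these parameters might look like the following sketch (the class and field names are assumptions for illustration, not from the AMOSA paper):

```python
from dataclasses import dataclass

@dataclass
class AMOSAParams:
    HL: int       # max archive size on termination (nondominated solutions kept)
    SL: int       # archive size that triggers clustering down to HL (SL > HL)
    T_max: float  # maximum (initial) temperature
    T_min: float  # minimum (final) temperature
    iters: int    # number of iterations at each temperature
    alpha: float  # cooling rate
```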

21 Case 1:

22 Case 2:

23 Case 3:
