Introduction to Simulated Annealing Study Guide for ES205 Xiaocang Lin & Yu-Chi Ho August 22, 2000.


2 Difficulty in Searching Global Optima [Figure: a cost landscape showing the starting point, the descent direction, local minima, the global minimum, and a barrier to local search]

3 Possible Solutions Solution 1: Randomly select another starting point whenever trapped in a local optimum, i.e., shake the picture horizontally.

4 Possible Solutions Solution 2: In addition to Solution 1, allow the search to occasionally move in an ascending direction, i.e., shake the picture vertically.

5 Intuition of Simulated Annealing Origin: the annealing process of heated solids. Intuition: by allowing occasional ascents during the search, we may be able to escape the trap of local minima.

6 Consequences of the Occasional Ascents Desired effect: helps the search escape local optima. Adverse effect: the search might pass the global optimum after reaching it.

7 Control of Annealing Process Acceptance of a search step (Metropolis Criterion): Assume the performance change in the search direction is Δ. Always accept a descending step, i.e., Δ ≤ 0. Accept an ascending step (Δ > 0) only if it passes a random test: exp(-Δ/T) > random[0,1).
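The Metropolis criterion above can be sketched as a small helper function. Here Δ is the performance change of a candidate step in a minimization problem; the function name is illustrative:

```python
import math
import random

def metropolis_accept(delta, T, rng=random):
    """Metropolis criterion: always accept a descending step (delta <= 0);
    accept an ascending step (delta > 0) only if exp(-delta/T) beats a
    uniform random draw from [0, 1)."""
    if delta <= 0:
        return True
    return math.exp(-delta / T) > rng.random()
```

At high temperature T, exp(-Δ/T) is close to 1 and almost any ascending step is accepted; as T shrinks, ascents become increasingly rare.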

8 Control of Annealing Process Cooling Schedule:  T, the annealing temperature, is the parameter that controls the frequency of acceptance of ascending steps.  We gradually reduce the temperature T(k).  At each temperature, the search is allowed to proceed for a certain number of steps, L(k).  The choice of parameters {T(k), L(k)} is called the cooling schedule.

9 Simulated Annealing Algorithm
0) k = 0;
1) Search (i → j), with performance difference Δ;
2) If Δ ≤ 0 then accept; else if exp(-Δ/T(k)) > random[0,1) then accept;
3) Repeat 1) and 2) for L(k) steps;
4) k = k + 1;
5) Repeat 1)–4) until the stopping criterion is met.
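Steps 0)–5) can be assembled into a generic loop. The `cost` and `neighbor` functions, the geometric cooling `T *= alpha`, and the `T_min` stopping rule are illustrative assumptions, not prescribed by the slide:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, T0=10.0, alpha=0.95,
                        L=100, T_min=1e-3, rng=random):
    """Generic SA loop following steps 0)-5):
    cost     -- objective to minimize
    neighbor -- returns a random neighboring design
    x0       -- initial design
    T0, alpha, L, T_min -- an assumed geometric cooling schedule."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    T = T0
    while T > T_min:                      # 5) stopping criterion
        for _ in range(L):                # 3) L(k) steps per temperature
            y = neighbor(x)               # 1) search i -> j
            delta = cost(y) - fx          #    performance difference
            # 2) Metropolis criterion
            if delta <= 0 or math.exp(-delta / T) > rng.random():
                x, fx = y, fx + delta
                if fx < fbest:
                    best, fbest = x, fx
        T *= alpha                        # 4) k = k + 1, cool down
    return best, fbest
```

For example, minimizing f(x) = x² over the integers with neighbors x ± 1 drives the search from a distant start toward 0, with occasional accepted ascents early on.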

10 Implementation of Simulated Annealing  Select a local search scheme: define a neighborhood structure; sample a new design from the neighboring designs of the current design; accept the new design based on the Metropolis Criterion.

11 An Example A Traveling Salesman Problem [Figure: five cities a, b, c, d, e] Visit each city once and only once with minimal overall traveling distance.

12 An Example Possible solutions:
1: abcde   2: abced   3: abdce   4: abdec
5: abecd   6: abedc   7: acbde   8: acbed
9: acdbe  10: acebd  11: adbce  12: adcbe
Choose a neighborhood structure. Example 1: an arbitrary sequence.
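The 12 tours listed above are the (5-1)!/2 = 12 distinct cycles through five cities: fixing city a as the start and keeping only one direction of each cycle removes the symmetric duplicates. A sketch (the canonical-form rule, second city before last city, is an assumption that happens to reproduce the slide's list):

```python
from itertools import permutations

def unique_tours(cities):
    """Enumerate distinct tours: fix the first city, and of each tour's
    two traversal directions keep the one whose second city sorts
    before its last city."""
    first, rest = cities[0], cities[1:]
    return [first + "".join(p)
            for p in permutations(rest)
            if p[0] < p[-1]]

tours = unique_tours("abcde")
```

This yields exactly the 12 designs on the slide, in the same order.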

13 An Example Example 2: two designs are neighbors iff two adjacent cities in one design appear in the opposite order in the other design.

14 An Example Sample a neighboring design. Suppose the current design is 1 under Example 2. Then we select one of its neighbors (designs 2, 3, and 7), each with probability 1/3.
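The adjacent-swap neighborhood of Example 2, and uniform sampling from it, can be sketched as follows (keeping the first city fixed, as the slide's list of designs implies):

```python
import random

def neighbors(tour):
    """Example 2 neighborhood: tours that differ by swapping one pair
    of adjacent cities (the first city stays fixed)."""
    result = []
    for i in range(1, len(tour) - 1):
        t = list(tour)
        t[i], t[i + 1] = t[i + 1], t[i]
        result.append("".join(t))
    return result

def sample_neighbor(tour, rng=random):
    """Pick one neighbor uniformly; a 5-city tour has 3 neighbors,
    so each is chosen with probability 1/3."""
    return rng.choice(neighbors(tour))
```

For design 1 (abcde) this gives abced, abdce, and acbde, i.e., designs 2, 3, and 7 from the earlier list.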

15 Implementation of Simulated Annealing  Determine the cooling schedule. For example: Set L = n, the number of variables in the problem. Set T(0) such that exp(-Δ/T(0)) ≈ 1. Set T(k+1) = αT(k), where α is a constant smaller than but close to 1.
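The geometric schedule above can be sketched as a generator. The target acceptance value 0.99 used to pick T(0) is an assumed concrete choice for "exp(-Δ/T(0)) ≈ 1":

```python
import math

def geometric_schedule(delta_max, alpha=0.95, T_min=1e-3):
    """Cooling schedule from the slide: choose T(0) so that
    exp(-delta_max / T(0)) is close to 1 (here 0.99), then cool
    geometrically with T(k+1) = alpha * T(k)."""
    T = delta_max / -math.log(0.99)   # solves exp(-delta_max/T0) = 0.99
    while T > T_min:
        yield T
        T *= alpha
```

Here `delta_max` stands for a typical (or worst-case) ascent size Δ, so that virtually all steps are accepted at the start of the run.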

16 Implementation of Simulated Annealing  Understand the result: This is a stochastic algorithm, so the outcome may differ between trials. Convergence to the global optimum is guaranteed only in the asymptotic sense. Convergence is slow: to ensure convergence, T(k) should decrease no faster than T(k) = c / log(1 + k) (Hajek, 1988).
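The logarithmic rate just mentioned can be written down directly. Per Hajek's result (cited in the references), convergence in probability holds when the constant c is at least the depth of the deepest non-global local minimum; the function below merely tabulates the schedule:

```python
import math

def hajek_schedule(c, k_max):
    """Logarithmic cooling T(k) = c / log(1 + k) for k = 1..k_max.
    With c large enough (Hajek, 1988), SA converges in probability,
    but the temperature decays very slowly."""
    return [c / math.log(1 + k) for k in range(1, k_max + 1)]
```

Comparing this with the geometric schedule makes the practical trade-off visible: the logarithmic schedule guarantees asymptotic convergence but cools far too slowly for most applications, which is why geometric cooling is used in practice despite its weaker guarantees.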

17 Reference:
Laarhoven, P. J. M. van, and E. H. L. Aarts, Simulated Annealing: Theory and Applications, Kluwer Academic Publishers, 1987.
Zhigljavsky, A. A., Theory of Global Random Search, Kluwer Academic Publishers, 1991.
Hajek, B., "Cooling Schedules for Optimal Annealing," Mathematics of Operations Research, Vol. 13, pp. 311-329, 1988.