Simulated Annealing Methods Matthew Kelly April 12, 2011.



What is Annealing? Slow cooling of a heated substance allows its atoms to align, producing a stronger structure with minimum energy. Accelerating the cooling process produces a structure with more energy and fewer optimally aligned atoms.

How Does Simulated Annealing Relate to Monte Carlo? If the system is “cooled” quickly, we obtain a local but not global minimum. Allowing the system to cool slowly takes more time but finds the global minimum, i.e. the structure with minimum energy. The method can be applied to the Travelling Salesman Problem, which is NP-hard – it has many local minima.

Performing Simulated Annealing Use the Metropolis algorithm with: – f – objective function – x – system state – T – control variable with an annealing schedule (analogous to temperature), which is gradually reduced – ∆x – random step procedure. The step procedure should remain consistently efficient, even when moving along a narrow valley or approaching a minimum.
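The Metropolis rule and the annealing loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the names `metropolis_accept` and `anneal` and the geometric cooling factor are assumptions for the example.

```python
import math
import random

def metropolis_accept(delta_f, T):
    """Metropolis criterion: always accept a downhill step (delta_f <= 0);
    accept an uphill step with probability exp(-delta_f / T)."""
    if delta_f <= 0:
        return True
    return random.random() < math.exp(-delta_f / T)

def anneal(f, x0, step, T0=1.0, cooling=0.9, iters_per_T=100, n_stages=50):
    """Generic simulated-annealing loop: propose random steps via `step`,
    accept with the Metropolis criterion, and reduce T after each stage."""
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    T = T0
    for _ in range(n_stages):
        for _ in range(iters_per_T):
            x_new = step(x)
            f_new = f(x_new)
            if metropolis_accept(f_new - fx, T):
                x, fx = x_new, f_new
                if fx < best_f:
                    best_x, best_f = x, fx
        T *= cooling  # gradually reduce the "temperature"
    return best_x, best_f
```

For example, `anneal(lambda x: (x - 3.0) ** 2, 0.0, lambda x: x + random.uniform(-1, 1))` drives the state toward the minimum at x = 3.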

Simulated Annealing as Applied to Travelling Salesman
1. Number the cities 1 to n, identifying point n+1 with point 1 (the tour is closed).
2. Generate a new path by either: removing a link between two cities and reversing their order of traversal, or removing a segment of the path and reinserting it between two other randomly chosen cities.
3. Generate random rearrangements to estimate the range of ΔE.
4. Choose an initial T much larger than the largest ΔE.
5. Iteratively decrease T by 10% after holding it constant for 100·N reconfigurations or 10·N successful reconfigurations, whichever comes first.
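The steps above can be sketched in code. This is an illustrative implementation using only the path-reversal move; the function names and the factor of 10 for the initial temperature are assumptions for the example.

```python
import math
import random

def tour_length(cities, tour):
    """Total closed-tour length; city n+1 is identified with city 1."""
    n = len(tour)
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % n]])
               for i in range(n))

def reverse_move(tour):
    """Pick a segment of the tour and reverse its order of traversal."""
    new = tour[:]
    i, j = sorted(random.sample(range(len(tour)), 2))
    new[i:j + 1] = reversed(new[i:j + 1])
    return new

def anneal_tsp(cities, n_temps=100, cooling=0.9):
    tour = list(range(len(cities)))
    E = tour_length(cities, tour)
    # Sample random moves to estimate the range of DeltaE,
    # then choose an initial T much larger than the largest DeltaE.
    deltas = [abs(tour_length(cities, reverse_move(tour)) - E)
              for _ in range(100)]
    T = 10 * (max(deltas) or 1.0)  # floor guards against all-zero deltas
    n = len(cities)
    for _ in range(n_temps):
        successes = 0
        for _ in range(100 * n):  # hold T for at most 100*N configurations
            cand = reverse_move(tour)
            dE = tour_length(cities, cand) - E
            if dE <= 0 or random.random() < math.exp(-dE / T):
                tour, E = cand, E + dE
                successes += 1
            if successes >= 10 * n:  # ... or 10*N successful ones
                break
        T *= cooling  # decrease T by 10% per stage
    return tour, E
```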

A Caveat To show the robustness of the procedure, add a penalty for “crossing the Mississippi”, i.e. a line separating the points into two sets. Each point is given an attribute μ, with μ = −1 for points left of the line and μ = +1 for points right of it. The energy is then modified to enforce a penalty for links that cross the line. (Figures: no penalty; penalty enforced; bonus for crossing.)

Simulated Annealing for Continuous Minimization Use the Downhill Simplex Method – store D + 1 vertices, where D is the dimensionality of the function – move via reflections, expansions and contractions. To add thermal fluctuations: – add a positive, logarithmically distributed random variable proportional to T to each stored vertex value – then subtract a similar variable from the trial point’s value. At large T, the simplex expands to the region reachable at that temperature and performs a stochastic tumbling, Brownian (random and erratic) motion. Slowly reduce T and repeat. This results in exploration of a region independent of its narrowness and orientation. Like Metropolis, the method always accepts a true downhill step and sometimes accepts an uphill one.
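The thermal trick can be illustrated in isolation. A positive, logarithmically distributed deviate proportional to T is −T·ln(u) for uniform u; the function name `fluctuate` is an assumption for this sketch:

```python
import math
import random

def fluctuate(f_val, T, stored=True):
    """Thermal fluctuation for simplex annealing: add a positive,
    logarithmically distributed deviate proportional to T to each stored
    vertex value (stored=True), and subtract a similar deviate from the
    trial point's value (stored=False).  Moves are then accepted by
    comparing the fluctuated values, so at T = 0 this reduces to the
    plain downhill simplex method."""
    deviate = -T * math.log(random.random())  # positive, scales with T
    return f_val + deviate if stored else f_val - deviate
```

Because stored values are pushed up and trial values pushed down, an uphill trial point is sometimes accepted anyway, exactly as in the Metropolis rule.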

How Slowly Should T Be Decreased? Various schedules exist:
1. T_{i+1} = (1 − ϵ)·T_i after every m moves; determine ϵ and m experimentally.
2. T_{i+1} = T_0·(1 − k/K)^α, where k is the number of moves so far, K the total move budget, and α a constant depending on the distribution of relative minima in the data.
3. After every m moves, set T = β(f_1 − f_b), where β is a constant, f_1 is the smallest function value currently in the simplex, and f_b is the best function value ever encountered. Never reduce T by more than some fraction γ at a time.
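The three schedules above can be written out directly; the function names are assumptions for this sketch:

```python
def geometric(T, eps=0.1):
    """Schedule 1: T_{i+1} = (1 - eps) * T_i, applied after every m moves."""
    return (1 - eps) * T

def power_law(T0, k, K, alpha=2.0):
    """Schedule 2: T = T0 * (1 - k/K)**alpha, where k moves of a
    budget of K have been made so far."""
    return T0 * (1 - k / K) ** alpha

def adaptive(f1, fb, beta=1.0):
    """Schedule 3: T = beta * (f1 - fb), where f1 is the smallest value
    currently in the simplex and fb the best value ever seen; in practice
    the reduction is also capped at some fraction gamma per step."""
    return beta * (f1 - fb)
```

Note that schedule 3 couples the temperature to measured progress, while the first two decrease T on a fixed clock regardless of how the search is going.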

Advantages of this Method Not easily fooled by the quick payoff of falling into an unfavorable local minimum. Changes that cause large energy differences are sorted out while T is large; finer details are settled once T is reduced. – e.g. in the travelling salesman problem with the river penalty, the large cost λ forces the river to be crossed the minimum number of times (2) while T is large, without regard to the detailed ordering of the paths; that ordering is then refined as T is reduced. The method is well grounded in the analogy with thermodynamics, which gives it practical use.

Disadvantages of this Method The rate at which T should be reduced is not clearly defined; three potential schedules are given, none of which works 100% of the time. The claim of “effectively solving the traveling salesman problem” does not change the fact that it is an NP-hard problem that can still take a long time to solve; removing the time characteristic from the method makes the reference to NP (solvable in nondeterministic polynomial time) meaningless. The scheme of restarting from the best-ever point appears to violate the Metropolis property of being able to reach the global minimum regardless of the starting point.