Stochastic Relaxation, Simulated Annealing, Global Minimizers

Different types of relaxation
- Variable-by-variable relaxation – strict minimization
- Changing a small subset of variables simultaneously – window strict minimization relaxation
- Stochastic relaxation – may increase the energy; should be followed by strict minimization

Complex landscape of E(X)

How to escape local minima?
- First go uphill; then you may land in a lower basin
- To go uphill, increases in E(x) must be allowed
- Add stochasticity: allow E(x) to increase with a probability governed by an external temperature-like parameter T

The Metropolis Algorithm (Kirkpatrick et al. 1983)
Assume x_old is the current state. Define x_new to be a neighboring state and delE = E(x_new) - E(x_old). Then:
- If delE < 0, replace x_old by x_new
- Else accept x_new with probability P(x_new) = exp(-delE/T), and keep x_old with probability P(x_old) = 1 - P(x_new)
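The acceptance rule above can be sketched directly. A minimal version, assuming energies are plain floats supplied by the caller:

```python
import math
import random

def metropolis_step(E_old, E_new, T):
    """One Metropolis acceptance decision.

    A move that lowers the energy is always accepted; an uphill move
    is accepted with probability exp(-delE / T).
    Returns True if the new state should be accepted.
    """
    delE = E_new - E_old
    if delE < 0:
        return True
    # At T = 0 only strict descent is allowed (guards against division by zero).
    if T <= 0:
        return False
    return random.random() < math.exp(-delE / T)
```

Note that at high T the exponent is close to 0, so nearly every uphill move is accepted; as T shrinks the rule approaches strict minimization.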

The probability to accept an increasing energy move

The Metropolis Algorithm
- As T → 0, for delE > 0: P(x_new) → 0
- At T = 0: strict minimization
- High T randomizes the configuration away from the minimum
- Low T cannot escape local minima
- Starting from a high T, the slower T is decreased, the lower the E(x) that is achieved
- The slow reduction in T allows the material to reach a more ordered configuration: it increases the size of its crystals and reduces their defects
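The full annealing loop combines the Metropolis rule with a cooling schedule. A sketch, where `energy`, `neighbor`, and the schedule parameters are placeholders the caller supplies (not from the slides):

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, T0, alpha=0.95, sweeps=1000):
    """Minimal simulated-annealing loop.

    Starts hot at T0 and cools geometrically: T <- alpha * T each sweep.
    Slower cooling (alpha closer to 1) gives the system more chances
    to escape local minima before the dynamics freeze.
    """
    x, T = x0, T0
    best_x, best_E = x, energy(x)
    for _ in range(sweeps):
        x_new = neighbor(x)
        delE = energy(x_new) - energy(x)
        # Metropolis rule: always accept downhill, sometimes accept uphill.
        if delE < 0 or (T > 0 and random.random() < math.exp(-delE / T)):
            x = x_new
        if energy(x) < best_E:
            best_x, best_E = x, energy(x)
        T *= alpha  # geometric cooling schedule
    return best_x, best_E
```

Tracking `best_x` separately matters because the walk can wander uphill after visiting a good state.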

Fast cooling – amorphous solid

Slow cooling - crystalline solid

SA for the 2D Ising model
E = -Σ_<ij> s_i·s_j, where i and j are nearest neighbors.
Flipping a spin with 3 aligned neighbors and 1 opposite one: E_old = -2, E_new = 2
delE = E_new - E_old = 4 > 0, so P(E_new) = exp(-4/T)
Setting P(E_new) = 0.3 gives T = -4/ln 0.3 ≈ 3.3
Reduce T by a factor alpha, 0 < alpha < 1: T_n+1 = alpha·T_n
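The two computations on this slide, the energy change of a single spin flip and the temperature that yields a target acceptance probability, can be sketched as follows (function names and the periodic-boundary choice are illustrative, not from the slides):

```python
import math

def flip_delta_E(spins, i, j):
    """Energy change from flipping spin (i, j) in a 2D Ising grid with
    E = -sum over nearest-neighbor pairs of s_i * s_j (periodic boundaries).

    Only the 4 bonds touching (i, j) change sign, so
    delE = 2 * s_ij * (sum of the 4 neighboring spins).
    """
    n = len(spins)
    s = spins[i][j]
    neighbors = (spins[(i - 1) % n][j] + spins[(i + 1) % n][j] +
                 spins[i][(j - 1) % n] + spins[i][(j + 1) % n])
    return 2 * s * neighbors

def T_for_acceptance(delE, p):
    """Temperature at which an uphill move of size delE is accepted
    with probability p: solve exp(-delE / T) = p for T."""
    return -delE / math.log(p)
```

For the slide's numbers, `T_for_acceptance(4, 0.3)` reproduces T ≈ 3.3.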

Exc#7: SA for the 2D Ising model (see Exc#1)
Consider the following cases:
1. For h_1 = h_2 = 0, set a stripe of width 3, 6, or 12 with opposite sign
2. For h_1 = -0.1, h_2 = 0.4, set -1 at h_1 and +1 at h_2
3. Repeat 2. with 2 squares of 8x8 plus spins with h_2 = 0.4, located apart from each other
Calculate T_0 to allow 10% flips of a spin surrounded by 4 neighbors of the same sign.
Use faster / slower cooling schedules.
a. What was the starting T_0 and E in each case?
b. How was T_0 decreased? How many sweeps were employed?
c. What was the final configuration? Was the global minimum achievable? If not, try a different T_0.
d. Is it harder to flip a wider stripe?
e. Is it harder to flip 2 squares than just one?
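The T_0 calibration asked for here follows the same pattern as the slide above: flipping a spin aligned with all 4 neighbors costs delE = 8 (assuming the field terms are zero for this calibration, which the exercise does not state explicitly):

```python
import math

# delE for flipping a spin aligned with all 4 nearest neighbors
# in E = -sum_<ij> s_i * s_j: each of the 4 bonds goes from -1 to +1.
delE = 8

# Choose T0 so that such a flip is accepted 10% of the time:
# exp(-delE / T0) = 0.1  =>  T0 = -delE / ln(0.1)
T0 = -delE / math.log(0.1)
print(round(T0, 2))  # about 3.47
```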