Simulated Annealing. Local Search. Cost function. Solution space


Simulated Annealing

Hill climbing

Simulated Annealing Local Search Cost function? Solution space

Annealing
Annealing is a thermal process for obtaining low-energy states of a solid in a heat bath. The process consists of two steps: increase the temperature of the heat bath to a maximum value at which the solid melts, then carefully decrease the temperature of the heat bath until the particles arrange themselves in the ground state of the solid. The ground state is a minimum-energy state of the solid, and it is obtained only if the maximum temperature is high enough and the cooling is done slowly.

Simulated Annealing
To apply simulated annealing for optimization we require the following: a successor function that returns a close neighboring solution given the current one (this acts as the disturbance for the particles of the system), and a target function to optimize that depends on the current state of the system (this acts as the energy of the system). The search is started from a randomized state. In a loop we move to neighboring states, always accepting the moves that decrease the energy, while accepting bad moves only according to a probability distribution that depends on the temperature of the system.

Simulated Annealing
Decrease the temperature slowly, accepting fewer bad moves at each temperature level, until at very low temperatures the algorithm becomes a greedy hill-climbing algorithm. The distribution used to decide whether we accept a bad move is known as the Boltzmann distribution. This distribution is well known in solid-state physics and plays a central role in simulated annealing:

    P(γ) = (1/Z) · exp(−E_γ / T)

where γ is the current configuration of the system, E_γ is the energy associated with it, and Z is a normalization constant.
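
The acceptance rule implied by the Boltzmann distribution can be sketched in a few lines of Python; the function name and the example temperatures below are illustrative, not from the slides:

```python
import math
import random

def accept_move(delta_e, temperature):
    """Metropolis criterion: always accept improving moves; accept a
    worsening move with probability exp(-delta_e / T)."""
    if delta_e <= 0:          # good move: energy does not increase
        return True
    return random.random() < math.exp(-delta_e / temperature)

# The same bad move (delta_e = 1) is accepted far more often when hot.
p_hot  = math.exp(-1.0 / 10.0)   # acceptance probability at T = 10
p_cold = math.exp(-1.0 / 0.1)    # acceptance probability at T = 0.1
```

As the temperature drops, `p_cold` becomes vanishingly small, which is exactly the transition toward greedy hill climbing described above.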

Annealing Process
Raising the temperature up to a very high level (the melting temperature, for example), the atoms have a higher energy state and a high possibility to rearrange the crystalline structure. Cooling down slowly, the atoms have a lower and lower energy state and a smaller and smaller possibility to rearrange the crystalline structure.

Statistical Mechanics vs. Combinatorial Optimization
State {r_i}: a configuration, i.e. a set of atomic positions
Weight exp(−E({r_i}) / k_B·T): the Boltzmann distribution
E({r_i}): energy of the configuration
k_B: Boltzmann constant
T: temperature
What happens in the low-temperature limit?

Analogy
Physical system ↔ Optimization problem
State (configuration) ↔ Solution
Energy ↔ Cost function
Ground state ↔ Optimal solution
Rapid quenching ↔ Iterative improvement
Careful annealing ↔ Simulated annealing

Simulated Annealing Analogy
Metal ↔ Problem
Energy state ↔ Cost function
Temperature ↔ Control parameter
A completely ordered crystalline structure ↔ The optimal solution for the problem
A global optimal solution can be achieved as long as the cooling process is slow enough.

Other issues related to simulated annealing
1. A global optimal solution is possible, but a near-optimal solution is what is practical.
2. Parameter tuning; see Aarts, E. and Korst, J. (1989). Simulated Annealing and Boltzmann Machines. John Wiley & Sons.
3. Not easy to parallelize, but parallel implementations exist.
4. The quality of the random number generator is important.

Analogy
Slowly cool down a heated solid, so that all particles arrange themselves in the ground energy state. At each temperature, wait until the solid reaches its thermal equilibrium. Probability of being in a state with energy E:

    Pr{ E = E } = (1/Z(T)) · exp(−E / (k_B·T))

E: energy; T: temperature; k_B: Boltzmann constant; Z(T): normalization factor (temperature dependent)

Simulated Annealing
The same algorithm can be used for combinatorial optimization problems: the energy E corresponds to the cost function C, and the temperature T corresponds to the control parameter c.

    Pr{ configuration = i } = (1/Q(c)) · exp(−C(i) / c)

C: cost; c: control parameter; Q(c): normalization factor (not important)

Components of Simulated Annealing
Definition of a solution
Search mechanism, i.e. the definition of a neighborhood
Cost function

Control Parameters
Two questions: how to define equilibrium, and how to calculate the new temperature for the next step.
1. Definition of equilibrium
   - Equilibrium is reached when we cannot obtain any significant improvement after a certain number of loops, or
   - a constant number of loops is assumed to be enough to reach equilibrium.
2. Annealing schedule (i.e. how to reduce the temperature)
   - Subtract a constant value: T = T − T_d
   - Use a constant scale factor: T = T · R_d
   A scale factor usually achieves better performance.
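
The two annealing schedules above can be sketched as small Python generators; the function names and example parameter values are illustrative:

```python
def linear_schedule(t_max, t_min, t_d):
    """Subtract a constant T_d at each step: T <- T - T_d."""
    t = t_max
    while t >= t_min:
        yield t
        t -= t_d

def geometric_schedule(t_max, t_min, r_d):
    """Multiply by a constant factor R_d at each step: T <- T * R_d."""
    t = t_max
    while t >= t_min:
        yield t
        t *= r_d

linear = list(linear_schedule(100.0, 1.0, 10.0))       # 100, 90, ..., 10
geometric = list(geometric_schedule(100.0, 1.0, 0.9))  # 100, 90, 81, ...
```

Note that for the same start and end temperatures, the geometric schedule spends many more steps at low temperatures, which is one reason a scale factor usually performs better.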

Control Parameters: Temperature
Temperature determination is artificial, without physical significance.
Initial temperature: selected so high that it leads to an 80-90% acceptance rate.
Final temperature:
1. A constant value, e.g., based on the total number of solutions searched, or reached when there is no improvement during an entire Metropolis loop.
2. Reached when the acceptance rate falls below a given (small) value.
These choices are problem specific and may need to be tuned.

Simulated Annealing: the code

    create random initial solution γ
    E_old = cost(γ);
    for (temp = temp_max; temp >= temp_min; temp = next_temp(temp)) {
        for (i = 0; i < i_max; i++) {
            successor_func(γ);            // randomized move to a neighboring solution
            E_new = cost(γ);
            delta = E_new - E_old;
            if (delta > 0) {                                // bad move
                if (random() >= exp(-delta / (k * temp)))
                    undo_func(γ);                           // rejected bad move
                else
                    E_old = E_new;                          // accepted bad move
            } else {
                E_old = E_new;                              // always accept good moves
            }
        }
    }
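
A runnable Python transcription of this pseudocode might look as follows; the parameter defaults and the x² usage example are illustrative assumptions, not from the slides:

```python
import math
import random

def simulated_annealing(state, cost, successor, undo,
                        temp_max=10.0, temp_min=0.01,
                        cooling=0.9, i_max=100, k=1.0):
    """Python sketch of the pseudocode above; `successor` perturbs the
    state in place and `undo` reverts the last (rejected) move."""
    e_old = cost(state)
    temp = temp_max
    while temp >= temp_min:
        for _ in range(i_max):
            successor(state)                      # randomized move
            e_new = cost(state)
            delta = e_new - e_old
            if delta > 0:                         # bad move
                if random.random() >= math.exp(-delta / (k * temp)):
                    undo(state)                   # rejected bad move
                else:
                    e_old = e_new                 # accepted bad move
            else:
                e_old = e_new                     # always accept good moves
        temp *= cooling                           # next_temp: geometric cooling
    return state, e_old

# Illustrative usage: minimize f(x) = x^2 starting from x = 10.
random.seed(42)
x = [10.0]                                        # mutable single-element state
last = [x[0]]
def cost(s): return s[0] ** 2
def successor(s):
    last[0] = s[0]
    s[0] += random.uniform(-1.0, 1.0)
def undo(s): s[0] = last[0]
best, energy = simulated_annealing(x, cost, successor, undo)
```

The state is wrapped in a list so the successor and undo functions can modify it in place, mirroring the in-place `successor_func`/`undo_func` of the pseudocode.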

Simulated Annealing Acceptance criterion and cooling schedule

Practical Issues with simulated annealing
The cost function must be carefully developed; it has to be fractal and smooth. The energy function on the left would work with SA, while the one on the right would fail.

Practical Issues with simulated annealing
The cost function should be fast, since it is going to be called millions of times. Ideally we only compute the delta produced by the modification instead of traversing the whole state, but this depends on the application.

Practical Issues with simulated annealing
Asymptotically, simulated annealing converges to a globally optimal solution. In practice, the convergence of the algorithm depends on the cooling schedule. There are suggestions in the literature about the cooling schedule, but it still requires a lot of testing and usually depends on the application.

Practical Issues with simulated annealing
Start at a temperature where 50% of bad moves are accepted. Each cooling step reduces the temperature by 10%. The number of iterations at each temperature should be enough to attempt to move each element of the state 1-10 times. The final temperature should not accept bad moves; this step is known as the quenching step.
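
The starting temperature can be calibrated from a sample of bad moves: if the mean cost increase of a worsening move is Δ̄, then T₀ = −Δ̄ / ln(p) accepts a typical bad move with probability roughly p. A minimal sketch, with a hypothetical function name and illustrative sample values:

```python
import math

def initial_temperature(bad_deltas, p_accept=0.5):
    """Pick T0 so a typical bad move (mean cost increase) is accepted
    with probability ~p_accept: exp(-mean_delta / T0) = p_accept."""
    mean_delta = sum(bad_deltas) / len(bad_deltas)
    return -mean_delta / math.log(p_accept)

# Cost increases sampled from random worsening moves (illustrative values).
deltas = [2.0, 3.0, 1.0, 4.0, 2.0]
t0 = initial_temperature(deltas, p_accept=0.5)
```

In practice one performs a short random walk over the solution space, records the positive deltas, and feeds them to such a calibration before the actual annealing run.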

Applications
Basic problems: traveling salesman, graph partitioning, matching problems, graph coloring, scheduling.
Engineering: VLSI design (placement, routing, array logic minimization, layout), facilities layout, image processing, code design in information theory.

Example of Simulated Annealing: Traveling Salesman Problem (TSP)
Given 6 cities and the traveling cost between any two cities, a salesman needs to start from city 1, travel through all the other cities, and then return to city 1. Minimize the total traveling cost.

Example: SA for the traveling salesman
Solution representation: an integer list, e.g., (1,4,2,3,6,5).
Search mechanism: swap any two integers (except for the first one), e.g., (1,4,2,3,6,5) → (1,4,3,2,6,5).
Cost function: the total traveling cost of the tour, i.e., the sum of the costs between consecutive cities in the list, plus the cost of returning from the last city to city 1.

Example: SA for the traveling salesman
Temperature:
1. Initial temperature determination
   - Set the initial temperature so that around 80% of bad moves are accepted.
   - Determine an acceptable value for (C_new − C_old).
2. Final temperature determination (stop criterion): based on the solution-space coverage rate.
Annealing schedule (i.e. how to reduce the temperature): subtract a constant value (T = T − T_d), or use a scale factor, for instance making the new value 90% of the previous one, depending on the solution-space coverage rate.
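
The representation, swap move, and acceptance rule of this example can be sketched in Python; the distance matrix and parameter values below are illustrative, not the 6-city instance from the slides:

```python
import math
import random

def tour_cost(tour, dist):
    """Total cost of a closed tour (returns to the starting city)."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def anneal_tsp(dist, t0=100.0, t_min=0.01, cooling=0.9, moves_per_temp=200):
    n = len(dist)
    tour = list(range(n))                        # city 0 fixed in first position
    cost = tour_cost(tour, dist)
    t = t0
    while t >= t_min:
        for _ in range(moves_per_temp):
            i, j = random.sample(range(1, n), 2)     # swap, never the first city
            tour[i], tour[j] = tour[j], tour[i]
            delta = tour_cost(tour, dist) - cost
            if delta <= 0 or random.random() < math.exp(-delta / t):
                cost += delta                        # accept the swap
            else:
                tour[i], tour[j] = tour[j], tour[i]  # undo rejected swap
        t *= cooling
    return tour, cost

# Illustrative instance: 6 cities on a line, distance = |a - b|.
random.seed(1)
points = [0, 1, 2, 3, 4, 5]
dist = [[abs(a - b) for b in points] for a in points]
tour, cost = anneal_tsp(dist)
```

Recomputing the full tour cost after every swap is the simple option shown here; as noted earlier, a faster variant would compute only the delta of the two changed edge pairs.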

Simulated Annealing
The process of annealing can be simulated with the Metropolis algorithm, which is based on Monte Carlo techniques. We can apply this algorithm to generate solutions to combinatorial optimization problems by assuming an analogy between them and physical many-particle systems, with the following equivalences: solutions in the problem are equivalent to states in a physical system, and the cost of a solution is equivalent to the energy of a state.

Simulated Annealing Algorithm

    Initialize: initial solution x, highest temperature T_h, coolest temperature T_l
    T = T_h
    while T > T_l:
        while not in equilibrium:
            search for a new solution x'
            accept or reject x' according to the Metropolis criterion
        decrease the temperature T